Get your research project reviewed by The Red Team: this seems like a good idea!

Ruben Arslan writes:

A colleague recently asked me to be a neutral arbiter on his Red Team challenge. He picked me because I was skeptical of his research plans at a conference and because I recently put out a bug bounty program for my blog, preprints, and publications (where people get paid if they find programming errors in my scientific code).

I’m writing to you of course because I’m hoping you’ll find the challenge interesting enough to share with your readers, so that we can recruit some of the critical voices from your commentariat. Unfortunately, it’s time-sensitive (they are recruiting until May 14th) and I know you have a long backlog on the blog.

OK, OK, I’ll post it now . . .

Arslan continues:

The Red Team approach is a bit different from my bounty program. Their challenge recruits five people, each given a $200 stipend, to examine the data, code, and manuscript. Each critical error they find yields a donation to charity, but the investigation is limited to about a month. I have to arbitrate what is and isn't critical (we set out some guidelines beforehand).

I [Arslan] am very curious to see how this goes. I have had only small submissions to my bug bounty program, but I have not put out many highly visible publications since starting the program, and I don't pay a stipend for people to take a look. Maybe the Red Team approach yields a more focused effort. In addition, he will know how many people have actually looked, whereas I probably only hear from people who find errors.

My own interest in this comes from my work as a reviewer and supervisor, where I often find errors, especially if people share their data-cleaning scripts and not just their modelling scripts, but it also comes from my own work. When I write software, I have some best practices to rely on and still make tons of mistakes. I'm trying to import these best practices into my scientific code. I've especially tried to come up with ways to improve after recently correcting a published paper twice because someone found coding errors during a reanalysis (I might send you that debate too, since you blogged the paper; it was about menstrual cycles and is part of the aftermath of dealing with the problems you've written about so often).

Here’s some text from the blog post introducing the challenge:

We are looking for five individuals to join “The Red Team”. Unlike traditional peer review, this Red Team will receive financial incentives to identify problems. Each Red Team member will receive a $200 stipend to find problems, including (but not limited to) errors in the experimental design, materials, code, analyses, logic, and writing. In addition to these stipends, we will donate $100 to a GiveWell top-ranked charity (maximum total donations: $2,000) for every new “critical problem” detected by a Red Team member. Defining a “critical problem” is subjective, but a neutral arbiter—Ruben Arslan—will make these decisions transparently. At the end of the challenge, we will release: (1) the names of the Red Team members (if they wish to be identified), (2) a summary of the Red Team’s feedback, (3) how much each Red Team member raised for charity, and (4) the authors’ responses to the Red Team’s feedback.

Daniël [Lakens] has also written a commentary about the importance of recruiting good critics, especially now for fast-track pandemic research (although I still think Anne Scheel's blog post on our 100% CI blog made the point even clearer).

OK, go for it! Seems a lot better than traditional peer review; the incentives are better aligned, etc. Too bad Perspectives on Psychological Science didn’t decide to do this when they were spreading lies about people.

This “red team” thing could be the wave of the future. For one thing, it seems scalable. Here are some potential objections, along with refutations:

– You need to find five people who will review your paper—but for most topics that are interesting enough to publish on in the first place, you should be able to find five such people. If not, your project must be pretty damn narrow.

– You need to find up to $3000 to pay your red team members and cover the possible charitable donations (five $200 stipends plus up to $2,000 in donations). $3000 is a lot; not everyone has $3000. But I think the approach would also work with smaller payments. Also, journal refereeing isn’t free! Three referee reports, plus the time of an editor and an associate editor . . . put it all together, and the equivalent cost could be well over $1000. For projects that are grant funded, the red team budget could be incorporated into the funding plan. And for unfunded projects, you could find people like Alexey Guzey or Ulrich Schimmack who might “red team” your paper for free—if you’re lucky!