Three MIT students decided to punk a computer-science conference that published less-than-thoroughly vetted articles by presenting completely bogus research. They developed a program, SCIgen, that allowed them to produce random, “nonsensical computer-science papers, complete with realistic-looking graphs, figures, and citations.”
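(For a rough sense of how such a generator can work: SCIgen’s authors describe it as built on a hand-written context-free grammar. The sketch below is a minimal toy version of that idea, using a grammar invented here for illustration rather than SCIgen’s actual rules; it expands a start symbol by repeatedly choosing random productions until only words remain.)

```python
import random

# Toy sketch of a grammar-driven generator in the spirit of SCIgen.
# The rules below are invented for illustration; SCIgen's real grammar
# is hand-written and far larger, covering entire papers.
GRAMMAR = {
    "SENTENCE": [["We", "VERB", "that", "NOUN_PHRASE", "is", "ADJ", "."]],
    "VERB": [["demonstrate"], ["argue"], ["confirm"]],
    "NOUN_PHRASE": [["the", "ADJ", "NOUN"], ["our", "NOUN"]],
    "ADJ": [["scalable"], ["stochastic"], ["Bayesian"]],
    "NOUN": [["framework"], ["heuristic"], ["methodology"]],
}

def expand(symbol: str) -> str:
    """Recursively expand a symbol into a string of terminal words."""
    if symbol not in GRAMMAR:            # terminal: return the word itself
        return symbol
    production = random.choice(GRAMMAR[symbol])
    return " ".join(expand(s) for s in production)

if __name__ == "__main__":
    # Prints something like: "We confirm that the Bayesian framework is scalable."
    print(expand("SENTENCE").replace(" .", "."))
```

Scale the same trick up to rules for abstracts, sections, figures, and citations, and the output starts to look plausible at a glance.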
By drawing attention to “predatory publishers,” the students accomplished their goal, but their program went on to do far more. Their website still gets many visitors, and the team receives emails from researchers who have used SCIgen to successfully submit papers to other conferences.
“Our initial intention was simply to get back at these people who were spamming us and to maybe make people more cognizant of these practices,” says [Jeremy] Stribling. “We accomplished our goal way better than we expected to.”
This is one of the ways in which more content is not better. Perhaps the publishers in question are so far out of their league to begin with that they wouldn’t recognize legitimate research if they took the time to read it. On the other hand, maybe this is a sign that some scientific fields have advanced to the point of being indistinguishable from magic, and how do you analyze magic?