Over the past three years, I tried to correct an influential but deeply flawed study on corporate sustainability. The authors ignored me, the journal refused to act, the scholarly community looked the other way, and two universities disregarded evidence of research misconduct -- even after the authors admitted publishing a misleading report. I now believe our systems for curating trustworthy science need reform -- and renewal. I offer some suggestions. Auden Schendler Mike Barnett Mike Toffel Robert Eccles Michael Lenox Alex Edmans Tom Gosling Tom Lyon Florian Heeb Florian Berg Lisa Sachs Bill Baue Duncan Austin Luca Berchicci Ken Pucker Tariq Fancy Brent Goldfarb David Kirsch Nilanjana Dutt Asli Arikan Robert Bloomfield Eddie Riedl Naomi Oreskes Dinah A. Koehler, PhD Erwin Danneels Jason Mitchell
Unfortunately, these types of interactions seem increasingly common. We need the prestige journals to take this more seriously. See our recent post here: https://www.linkedin.com/posts/douglas-sheil-55330a9a_academicpublishing-scientificintegrity-openscience-activity-7418940767680749568-55L-?utm_source=share&utm_medium=member_android&rcm=ACoAABUMCo8BrLwsbmfOYxByVavYPlYIkGcQH9s
A good read. Are some fields more prone to this than others? If I had to guess, I'd say it's rarer in the humanities.
I admire your persistence! You are doing a huge service. At the very least, the community needs to know that there are people who care deeply about scientific integrity and will press for clarification when there's compelling evidence it has been violated. Personally, I am slightly worn down by my own such pursuit. Need more encouragement from you Andrew King
Thank you for your tireless (and often thankless) efforts on behalf of sound science. In the social media era, where every "influencer" and owner of a platform can spout truth-free opinions, it's more important than ever that the academic community uphold high standards.
This kind of unethical behavior undermines public trust in science. The scientific community bemoans the loss of public trust in science, but we fail to acknowledge our own contributions to this loss. See also: https://www.strukturelle.ch/en/post/when-scientific-institutions-allow-unethical-behavior-by-scientists-why-should-the-public-trust-in
Wow, there's so much that is important here. First, yes, "self-regulation" is an oxymoron, in science as in business and governance. In science, peer review is supposed to play the role of regulation, but it clearly fails, particularly when colleagues turn a blind eye and editors refuse to take complaints seriously. Second, this is consistent with my findings (and others') that universities do not take research integrity seriously enough. We have lots of examples. See one here: https://www.science.org/doi/10.1126/science.aec4187
Well at least Bob Eccles hasn’t blocked you, yet. Thanks for continuing to speak out.
The AI slop problem has now reached scientific research publication as well. I just saw an article about false references and other problems with submissions. And peer reviewers are now leaning on AI tools for assistance. I thought this was an interesting experiment with AI generation alternating between image and text description: it seems that after a number of iterations there is a "regression to the mean," with more and more information removed as the series of conversions progresses. https://theconversation.com/ai-induced-cultural-stagnation-is-no-longer-speculation-its-already-happening-272488
On the subject of establishing trust: My new rule for content on this slopified feed is simple: see an "AI" image, assume "AI" thinking. So I don't click. I know that not everyone using slop images used AI to cognitively offload their thinking, but with the overwhelming amount of slop content out there now, the correlation is, in my humble experience, high enough to skip the click.