The big idea: should we get rid of the scientific paper?

From The Guardian:

When was the last time you saw a scientific paper? A physical one, I mean. An older academic in my previous university department used to keep all his scientific journals in recycled cornflakes boxes. On entering his office, you’d be greeted by a wall of Kellogg’s roosters, occupying shelf upon shelf, on packets containing various issues of Journal of Experimental Psychology, Psychophysiology, Journal of Neuropsychology, and the like. It was an odd sight, but there was method to it: if you didn’t keep your journals organised, how could you be expected to find the particular paper you were looking for?

The time for cornflakes boxes has passed: now we have the internet. Having been printed on paper since the very first scientific journal was inaugurated in 1665, the overwhelming majority of research is now submitted, reviewed and read online. During the pandemic, it was often devoured on social media, an essential part of the unfolding story of Covid-19. Hard copies of journals are increasingly viewed as curiosities – or not viewed at all.

But although the internet has transformed the way we read it, the overall system for how we publish science remains largely unchanged. We still have scientific papers; we still send them off to peer reviewers; we still have editors who give the ultimate thumbs up or down as to whether a paper is published in their journal.

This system comes with big problems. Chief among them is the issue of publication bias: reviewers and editors are more likely to give a scientific paper a good write-up and publish it in their journal if it reports positive or exciting results. So scientists go to great lengths to hype up their studies, lean on their analyses so they produce “better” results, and sometimes even commit fraud in order to impress those all-important gatekeepers. This drastically distorts our view of what really went on.

There are some possible fixes that change the way journals work. Maybe the decision to publish could be made based only on the methodology of a study, rather than on its results (this is already happening to a modest extent in a few journals). Maybe scientists could just publish all their research by default, and journals would curate, rather than decide, which results get out into the world. But maybe we could go a step further, and get rid of scientific papers altogether.

Scientists are obsessed with papers – specifically, with having more papers published under their name, extending the crucial “publications” section of their CV. So it might sound outrageous to suggest we could do without them. But that obsession is the problem. Paradoxically, the sacred status of a published, peer-reviewed paper makes it harder to get the contents of those papers right.

Consider the messy reality of scientific research. Studies almost always throw up weird, unexpected numbers that complicate any simple interpretation. But a traditional paper – word count and all – pretty well forces you to dumb things down. If what you’re working towards is a big, milestone goal of a published paper, the temptation is ever-present to file away a few of the jagged edges of your results, to help “tell a better story”. Many scientists admit, in surveys, to doing just that – making their results into unambiguous, attractive-looking papers, but distorting the science along the way.

And consider corrections. We know that scientific papers regularly contain errors. One algorithm that ran through thousands of psychology papers found that, at worst, more than 50% had one specific statistical error, and more than 15% had an error serious enough to overturn the results. With papers, correcting this kind of mistake is a slog: you have to write in to the journal, get the attention of the busy editor, and get them to issue a new, short paper that formally details the correction. Many scientists who request corrections find themselves stonewalled or otherwise ignored by journals. Imagine the number of errors that litter the scientific literature that haven’t been corrected because to do so is just too much hassle.

Finally, consider data. Back in the day, sharing the raw data that formed the basis of a paper with that paper’s readers was more or less impossible. Now it can be done in a few clicks, by uploading the data to an open repository. And yet, we act as if we live in the world of yesteryear: papers still hardly ever have the data attached, preventing reviewers and readers from seeing the full picture.

The solution to all these problems is the same as the answer to “How do I organise my journals if I don’t use cornflakes boxes?” Use the internet. We can change papers into mini-websites (sometimes called “notebooks”) that openly report the results of a given study. Not only does this give everyone a view of the full process from data to analysis to write-up – the dataset would be appended to the website along with all the statistical code used to analyse it, and anyone could reproduce the full analysis and check they get the same numbers – but any corrections could be made swiftly and efficiently, with the date and time of all updates publicly logged.

This would be a major improvement on the status quo, where the analysis and writing of papers goes on entirely in private, with scientists then choosing on a whim whether to make their results public. Sure, throwing sunlight on the whole process might reveal ambiguities or hard-to-explain contradictions in the results – but that’s how science really is. There are also other potential benefits of this hi-tech way of publishing science: for example, if you were running a long-term study on the climate or on child development, it would be a breeze to add in new data as it appears.

Link to the rest at The Guardian
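
(An aside on the error-detection point above: automated checks of that kind typically recompute a p-value from the reported test statistic and degrees of freedom and flag mismatches. A toy sketch of such a check, with invented numbers and assuming Python with the scipy package, might look like this.)

```python
# Toy version of the consistency check alluded to in the article: recompute
# the p-value implied by a reported t statistic and its degrees of freedom,
# and flag entries whose stated p-value doesn't match.
# The "reported" numbers below are invented for illustration.
from scipy import stats

reported = [
    # (t statistic, degrees of freedom, p-value as stated in the paper)
    (2.31, 28, 0.028),
    (1.95, 40, 0.020),   # inconsistent on purpose
]

for t_value, df, stated_p in reported:
    recomputed = 2 * stats.t.sf(abs(t_value), df)   # two-sided p-value
    flag = "OK" if abs(recomputed - stated_p) < 0.005 else "CHECK"
    print(f"t({df}) = {t_value}: stated p = {stated_p}, "
          f"recomputed p = {recomputed:.3f} [{flag}]")
```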

As PG has mentioned on a couple of prior occasions here, the current mode of publishing scientific and academic papers is extraordinarily profitable for those who do the publishing. The authors, and the other academics and scientists who review the articles, don’t get paid anything.

The universities and large research institutions that financially support the academics and scientists receive nothing from the sale of the research through journals. Worse, those same employers pay dearly for subscriptions to the journals so that their employees can read what other academics and scientists have published.

The publishers of the journals, commonly private companies owned by excessively wealthy European families who are heirs to hereditary fortunes, receive and keep every bit of the shamefully large subscription revenue from the publication of other people’s work.

PG suggests that virtually any alternative would have to be better than the current, badly outdated system.

24 thoughts on “The big idea: should we get rid of the scientific paper?”

  1. Not sure all the ‘sciences’ should be treated equally in the pursuit of academic credentials.

    The hard sciences – mathematics, physics, chemistry, engineering – have much more objective standards than, say, psychology.

    Anything that uses the infinitely variable human as a subject is fraught with possibilities for fraud, simply because the subjects have consciousness, and that interferes with objectivity.

    Don’t throw out the baby with the bubbles.

    • I noticed years ago that the hand-wringing about the scientific reproducibility crisis turns out, once we get down to specifics, to be concentrated in the fields that are less tempting to the mathematically inclined: fields where both the author and the peer reviewers took a couple of stats classes in grad school. This is why they are going back and teaching about p-hacking and why it is a Bad Thing. I don’t believe that most p-hackers were consciously engaging in shenanigans so much as they lacked the training to understand the issue.

  2. PG is far too generous to the industry here (note that, as genteely appropriate, PG did not disparage one of his former employers by noting that another part of that former employer is one of the worst offenders). And it is a separate industry from trade publishing: A plurality of scientific journal articles, measured by stated circulation of the journals and guesstimated from various “scholarly impact” measures, are published under a stealth vanity-publishing model. In some fields, it’s not a plurality — it’s the default and/or essentially mandatory for the “prestigious” journals.

    The vanity publishing aspect is hidden in what are called “page charges” — authors (or, rather, their research grants, and of course there’s a line item for this in grant applications, even if only internally, so the public is paying for it if the grants are government-funded) are charged a fee, typically several hundred dollars, based upon the number of pages in the final, published piece. The range of page charges (whatever they’re called) is vast, and also adjusted by use and type of both graphics and tabular materials; as you can imagine, this affects some fields and some types of articles more than it does others.

    Further, the journals rely on outside referees to perform prepublication peer review.* This can be a thankless job, involving a range of writing skills and research skills that has to be seen to be believed.** It can be a rewarding job, with the opportunity to see new research and help sharpen presentation of it. It’s also an unpaid job, or at best provided a token honorarium. (And of course department chairs demand that tenure candidates take part in the system, but don’t provide any administrative support or time for doing so.)

    Now throw on top of this that academic publishers ordinarily demand either copyright transfer or treatment as work made for hire. Some are more amenable to this than others; ironically, a couple of entertainment-law journals (one in NYC, and it’s one that cutting-edge practitioners and those with scholarly interest in IP follow) are among the least reasonable. (Whether characterizing this material as work made for hire is proper when it is not separately and specifically commissioned prior to its creation is a… knottier problem.) And what that means is that one cannot ask the author for reprint permission; one must ask that publisher.***

    All of that said, the concept of peer-reviewed journals as a means of ensuring the validated extension and dissemination of knowledge is a least-bad attempt to create a uniform system with comprehensible expectations, while simultaneously encouraging appropriate risk-taking. Unfortunately, that uniform system has been thoroughly subverted by those whose interests are anything but consistent with that purpose. And that’s before getting into individual-instance misuse of the system, such as (former) Dr A___ W___ (the quackish/quasi-academic progenitor of “vaccines cause autism!” who is not named here for Reasons — he’d lose, and has lost, London-based libel suits for virtually identical descriptions) and (former) Prof D___ S___.

    * Disclosure: This shark served as such a referee (as they call it) for certain Oxford University Press publications until a change in editor and general-for-the-publisher editorial policy a few years back. This shark actually enjoyed the opportunity to learn from some of the submissions in off-the-beaten-track areas of and related to intellectual property law, and even used a foreign doctrine learned about therein to assist a US client regarding an unrelated transaction in that nation. And although this was not in “the sciences,” the editorial and publication system was structured identically across OUP.

    ** Further disclosure: Once upon a time, this shark was the undergraduate tasked with doing the initial writeups for a chemistry/biochemistry/photochemistry research group he was part of, so he’s been familiar with these problems for a few decades… particularly with ESL postdocs whose education was largely outside the US. However good that education actually was, however brilliant the postdocs actually were, however innovative the research, US/UK/German academic publishing expectations were well outside their frames of reference.

    *** Do not invite this shark to discuss contracting and implementation practices regarding “permissions” unless you’ve got a couple of days to spare. Stating this in language appropriate for this family-friendly publication, the accounting practices (on both sides — requesting publications and granting publications) are at best well below the standard expected of ordinary business accounting, and seldom approach that “best.” This shark used to be in-house and spent substantial time on the dark side of the editorial desk; he not only knows where the bodies are buried, he’s still got the used spades and pickaxes in a storage unit.

    • Tony, that does not look like it has mechanisms built into it for prepublication semi/double-blind peer review. (Maybe I’m missing something, I’m doing tech support this afternoon and just fiddling around a bit.) Whatever system is adopted for academic publication, it will need to not just be less… ok, I’m going to say it… coopted, subverted, and fraudulent. It will also need to be transparent enough about built-in prepublication review mechanisms that old fuddy-duddies my age (who didn’t watch Wargames, look over at our own computer setups and soldering scars, and sneer — and I’m afraid that’s the vast majority of senior academics my age no matter what field we’re talking about) will accept as at least as bureaucratically sound as the current one.

      Look, senior academics didn’t get there just by being brilliant in their respective fields — they got there navigating bureaucracies, and failing to respect that means they won’t respect any replacement system. There simply has to be at least some semblance of a secret handshake involved…

      • Project Jupyter isn’t meant to be a replacement for the scientific paper system; it’s meant to help in data analysis and showing results. Maybe you could incorporate its technology (or similar) as part of a larger system (where Jupyter could display the results, with version tracking, authentication, approval mechanisms, etc.), but I don’t have the knowledge or interest to say more about that.
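
        For what it’s worth, here is a rough sketch of the kind of glue such a “larger system” might start from: re-executing a notebook from a clean kernel and logging a hash and timestamp of the result. It assumes the nbformat and nbclient packages; the file names are hypothetical, and it says nothing about the authentication or approval pieces.

```python
# Re-run a Jupyter notebook end to end and record a content hash plus a
# timestamp: the sort of minimal audit record a larger publication system
# could build version tracking and approval workflows on top of.
# File names ("analysis.ipynb", "audit_log.jsonl") are placeholders.
import hashlib
import json
from datetime import datetime, timezone

import nbformat
from nbclient import NotebookClient

nb = nbformat.read("analysis.ipynb", as_version=4)

# Execute every cell from a fresh kernel so the stored outputs are reproducible.
NotebookClient(nb, timeout=600).execute()

executed = nbformat.writes(nb)
record = {
    "sha256": hashlib.sha256(executed.encode("utf-8")).hexdigest(),
    "executed_at": datetime.now(timezone.utc).isoformat(),
}

with open("analysis_executed.ipynb", "w", encoding="utf-8") as f:
    f.write(executed)
with open("audit_log.jsonl", "a", encoding="utf-8") as f:
    f.write(json.dumps(record) + "\n")
```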

  3. Once again the article title is not what I was thinking.

    I thought this was about the format of scientific papers like you find on arXiv.org.

    The format looks like a handy way to lay out world building, technology, history, etc…, for my stuff.

    It would be nice to have everything laid out like that on letter-size pages, hole-punched and in my binders, or simply as a PDF to look at as needed.

    I just asked the question of google and found this paper.

    A Template and Suggestions for Writing Easy-to-Read Research Articles
    https://arxiv.org/abs/1907.12204

    I had Adobe convert the paper to Word, so now I have a clean example of fonts and format.

    I went through and read the Guardian article and they mention the Jupyter notebooks that Tony mentioned. I definitely need to look into that system and see if it is useful as a writing tool.

    Thanks…

    BTW, there is no mention in the article of the ways submitting papers and “peer review” can be massively misused to suppress competitors. Read the book “Green Earth” by Kim Stanley Robinson. On one hand the story is a “dream[1]” for “Global Warming” believers, while blatantly showing how the various approval committees in the NSF abuse their position to further their own interests and those of the companies they are associated with.

    The book is a great example, on many levels, and I am still harvesting useful stuff from it.

    [1] The whole story is a “dream” because the story starts when the Arctic Ocean is already clear of ice. In reality, billions would already be dead at that point from a return to an Ice Age[2] in the Northern hemisphere.

    But I digress…

    [2] We have known about this since 1958, and it has been ignored for “reasons”[3].

    The Coming Ice Age
    https://harpers.org/archive/1958/09/the-coming-ice-age/

    [3] An interesting point. The article appears and disappears over the years. I will go to link to the article and find it has vanished; then, like magic, it shows up again in full.

    That’s evocative.

    • NIVEN and POURNELLE played with that scenario in FALLEN ANGELS back in 1991.

      https://www.baen.com/fallen-angels.html

      They accelerated the return of the glaciers for story purposes and the story is more of a romp, a love letter to old school SF fen from before the field got too big. It may yet prove prophetic.

      • What’s interesting about Fallen Angels is that I didn’t realize it was a parody when I first read it. It seemed all too real. Looking at the wiki page now, they go into detail about the parody, and it makes sense.

        Fallen Angels (Niven, Pournelle, and Flynn novel)
        https://en.wikipedia.org/wiki/Fallen_Angels_(Niven,_Pournelle,_and_Flynn_novel)

        In that light, Green Earth seems like a parody as well. I wonder if that was intentional.

        – Parody is a way of discussing something important while surrounding it with “shiny” to distract the people easily distracted, while giving the people paying attention something real.

        Starting on April Fools[1], I began seeing an Image/Seed that I wrote down, and day-by-day have slowly added to since. I did not know what was going on, but did not hurry the process. I slowly began to realize that it was a Story of “subtext”. That the horrific events being discovered were just to keep people turning the pages, while the actual story in the “subtext” was about “Showing” the “adults in the room” stepping up and simply doing the work.

        – Anyone who was trying to sensationalize events, to turn them into a “circus”, was quietly pushed out.

        I finally realized that what I was seeing was the “boys in the basement” handing me a Story in response to the bizarre collapse of mainstream media that is coming to light now in the “Real World tm”.

        “Subtext” is a way for me to comment as the writer without blatantly commenting. If people don’t get the “subtext” they will still follow the mystery of all the dead bodies that were found buried out in the middle of nowhere.

        I’ll try to finish writing the book this month. I’ve been distracted a bit finding a bunch of books that I missed, adding to the stuff that “Evil Geniuses” talked about. They have been there all this time, and I totally missed them.

        Just like the “boys in the basement” are showing me where all the bodies are buried, these books are doing the same.

        [1] My High Holy Day[2]

        [2] There is no fool like an April Fool.

        • FALLEN ANGELS is a triple-layer story, which is only fair since it has three authors. 🙂

          On the surface it is an adventure SF story in the “if this goes on…” vein.
          The second layer is precisely the highlighting of the authoritarian (“don’t ask questions, just believe what we tell you”) tendencies of the political extremists who actually share an underlying common cause in their hate of rationality and its expression through technology.
          And the third layer is the character study of STEMers embodied in the fen assisting the titular fallen angels, who turn out to be the true heroes of the narrative by finding ways to outsmart the authoritarians.

          The true genius of the story is how the three layers reinforce each other to present a straight-faced narrative that hides a gentle parody that in turn hides a dead-serious cautionary tale. The story still sounds plausible today even though the ’70s “the ice age is coming” hysteria has been replaced by the “the world is going to burn” meme.

          I expect that in another 20 years, after technology does its thing and the climate hysteria starts to abate, the book will remain readable and enjoyable even by those unaware of the tuckerizations.

          BTW, the single-stage rocket of the plot is a not-too-disguised DELTA CLIPPER from the ’90s DARPA and NASA reusable spaceship program that Pournelle was involved in. (It proved rapid reusability of rockets was doable but was ignored until SpaceX forced the world to accept it.) The program was killed by politicians and old-space naysayers who insist, to this day, that Single Stage To Orbit is impossible. (Never mind that SpaceX’s Starship upper stage is capable of making orbit on its own, albeit with a minimal payload. Say two stranded spacers. 😉 )

          One thing about Niven-Pournelle: no duds. Even their “lesser” works hold up fine decades later.

          • You mentioned the anthology Before the Golden Age, edited by Asimov, in an earlier thread.

            I remember reading that many times in the 70s when it first came out, and it has sat on my reference shelf since. I started reading the book again a few days ago, and I was instantly arguing with the Good Doctor.

            The Man Who Evolved

            Asimov’s complaint was that the cosmic rays would carry too much energy and destroy the guy, yet the story clearly mentions that the destructive aspects were filtered out, leaving only the mutation part.

            The Jameson Satellite

            “Just because you can’t build a rocket with a radium drive doesn’t mean Jameson didn’t do it. It launched on automatic, achieved orbit and stayed in orbit for 40 million years, fending off the occasional meteor with its radium repulser. You can’t argue with success.” – me

            “It’s just a story.” – Asimov

            “You could say the same thing about Foundation Trilogy.” – me

            Asimov let his “arrogance” about what he “knew” get in the way of Story. That’s why he wrote more non-fiction than he did fiction. His 11-year-old self would not have approved of his older self.

            I need to read the book again, many times this year, and argue with his commentary. There is much to learn here.

            Thanks…

            • Asimov leaned towards hard SF. Not as much as Clarke, but a lot more than Heinlein. Too presentist for his own good. The best example is his DAVID “LUCKY” STARR series, whose books, while still fine reads, are as dated as Burroughs when it comes to the planets of the solar system.

              He had similar problems with his projections of Earth population and culture in CAVES OF STEEL.

              On the flip side, he did far better in the stories that were decoupled from the present; other worlds, far in the future. And in THE GODS THEMSELVES, where he intentionally left his habits behind.

              Basically he too often let his inner academic overrule the author side.

              (And on Foundation he let critics make him change the direction of the series. As far as I’m concerned FOUNDATION starts and ends with the trilogy and the rest is apocrypha: The Second Foundation won and saw the plan to completion. And the Second Empire became something like the worlds of Kingsbury’s PSYCHOHISTORICAL CRISIS.)

              https://www.amazon.in/Psychohistorical-Crisis-Donald-Kingsbury/dp/0312861028

              Asimov is an essential part of the history of the field but in the end modern SF is Heinlein’s: story first, science second, everything else far behind.

              • “Story first” is as it should be. Hard SF has the problem of letting “fact” dominate, even though “fact” can change at any moment. We used to have nine planets, but now we have eight in our solar system. Let the story come first so that it’s still good even when the facts change.

                • Part of the issue with Asimov is that, as BEFORE THE GOLDEN AGE highlights, his formative years were all pre-Heinlein, high-concept “scientifiction”, and his own background was academic science.

                  Heinlein by contrast started out as an aeronautical engineer and served in the navy. With that background (engineers *use* science for practical goals), his writings perforce had to be goal-oriented. Science shapes his stories, but it isn’t the whole story; that’s true of every one of them, from the very beginning. He’s the one to follow.

                  Hard science has its uses and very good writers can produce great works in that vein (Niven and Forward stand out), but they have to be very, very careful to avoid the presentist trap. Few do.

              • The reply buttons are getting sparse again, so:

                Felix,

                David “Lucky” Starr

                Love the series. It’s on my reference shelf. I need to read it again.

                The only reason the books “fail” is that the science in the books doesn’t match our dull, boring Copy Solar System. Asimov was essentially writing about a Copy Solar System filled with living worlds. That’s the kind of Copy Solar System that I like.

                – There are millions of Copy Solar Systems in our Copy Milky Way, so everything is possible, in Story.

                Jamie,

                Yes, “Story first”. Don’t let “facts” get in the way of “Story”.

                I remember a series on PBS, in the ’80s I think, where they pointed out that modern science was a house of cards, built on unexamined “Facts”. They had a visual of that house of cards and how questioning even one of the bottom row of cards would cause it all to come crashing down.

                I can’t remember the name of that series. I would love to find it again. They stopped questioning “science” sometime in the mid-90s, and now only promote the revealed word of “scientism”.

                That’s so dull.

                • Proper science deals with hypothesis and theory, not “accepted consensus” and unquestionable revelation. The latter is religion, not science.

                  In proper science everything and everybody is on probation. Which is why it is news every time a new and different experiment verifies the validity of Relativity. And why the recent discrepancies in the Hubble Constant and the mass of the W boson are important. Unlike in other fields, the researchers aren’t tweaking the data to make it fit the required prediction. On the contrary, all proper researchers hope to *break* the “accepted consensus” to open the door to newer, more advanced knowledge. And have a shot at a Nobel Prize. 😉

    • As @TonyT mentioned, Project Jupyter’s main focus isn’t scientific papers so much as data analysis: keeping the original data sets intact and making the transformations of the data (cleaning, standardizing, etc.) reproducible on the way to the paper. It’s more of a complement to reference in the paper, for people who want to dive in fully, than a way to produce the paper itself, except for some very data-intensive papers.
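
      A minimal sketch of what that looks like in practice, assuming a hypothetical CSV of reaction-time measurements and made-up column names (an illustration of the idea, not anyone’s actual pipeline):

```python
# Sketch of a reproducible notebook cell: the raw file is read but never
# overwritten, and every step from raw data to the analysis-ready table is
# spelled out so anyone can re-run it and get the same numbers.
# "raw_measurements.csv" and the column names are hypothetical.
import pandas as pd

raw = pd.read_csv("raw_measurements.csv")        # original data, kept as-is

clean = (
    raw.dropna(subset=["reaction_time_ms"])      # drop incomplete trials
       .query("reaction_time_ms > 0")            # remove impossible values
       .copy()
)

# Standardize the outcome so the downstream analysis is unit-free.
rt = clean["reaction_time_ms"]
clean["reaction_time_z"] = (rt - rt.mean()) / rt.std()

# Write only a derived copy; the raw file stays untouched.
clean.to_csv("clean_measurements.csv", index=False)
print(clean["reaction_time_z"].describe())
```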

      I think for your purposes you would find systems for organizing notes or projects more useful: ones with version control and multiple output formats, something like building your own Scrivener that is flexible enough to adapt to your workflow. But those systems are not intuitive if you don’t have a programming mindset or background (at least that is my impression), and they don’t work out of the box; you need to study the system and your own workflow, and adapt or take the useful parts.

      What I’m talking about are things like Notable.app, Obsidian.md, and numerous other systems. They usually work with plain text files formatted with markdown syntax or something similar, instead of doc or pdf files, because version control systems work much better with plain text. You can generate pdf/word/html/epub… with them (or not, but normally at least PDF export is available), but the internal files are kept as plain text.
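
      To make the “plain text in, formatted output out” idea concrete, here is a tiny sketch assuming the third-party Python “markdown” package (pip install markdown); the note content is made up, and PDF or EPUB export would normally go through other tools such as pandoc:

```python
# Convert a plain-text note written in markdown into HTML.
# The note text is invented for illustration; any .md file would do.
import markdown  # third-party package: pip install markdown

note = """# Chapter 3 worldbuilding

The **Jovian relay** handles all traffic past the Belt.

- founded: 2107
- crew: 14
"""

html = markdown.markdown(note)  # returns an HTML string

with open("worldbuilding.html", "w", encoding="utf-8") as f:
    f.write(html)

print(html)
```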

      • I mentioned Jupyter because, with the internet, there’s no reason to simply graft the same old system onto new technology. It makes more sense, and should lead to better results, to take advantage of the internet by, for example, including all your data and calculations for others to see (unless you have something to hide…).

      • Thanks, Ana. I’ll check out Notable and Obsidian as well.

        I’m always trying to find something clean that can keep the documents together, yet I end up doing things by hand. I’ve been burned so many times by finding great software that is then discontinued, and I have to salvage what is possible.

        I’ve been hand-coding simple html, and using what the browser can do with bookmarks, to keep work files together. I keep things as simple as possible to avoid things suddenly not working.

        The most interesting system that I’ve come across is:

        TiddlyWiki
        https://en.wikipedia.org/wiki/TiddlyWiki

        It is a flat file database with the code built into the file. I stopped using it years ago because I wasn’t sure if it would still be supported over time, yet now looking at the wiki page it might be interesting to try it again.

        Thanks…

        • Grumble, grumble, grumble.

          I tried TiddlyWiki again, and it looks great, but is a vast disappointment. You have to get an add-on for the specific browser you are using to be able to even “save”. And if the browser updates, making the add-on no longer work, you are stuck until a volunteer writes a new add-on.

          Beautiful concept, but for want of a nail a kingdom was lost.

          • The advantage of the systems I proposed is that they use simple text files, so if one system is discontinued you can move to another. Markdown is just a notation system: special symbols added to the text to mark bold, italics, headings, image placements, and so on. A lot of the text editors used for coding can interpret markdown files and generate a preview.

            • Downloaded both Notable and Obsidian to play with each.

              The files can be opened with my usual text editors, and the folder system is very close to what I do by hand. The only flaw that I see so far in Notable is that it does not have a word count, whereas Obsidian does, yet I can still work with that.

              The possibilities of what they can do are interesting. Working with both, and by hand, adds capability to my process.

              I may even use them to act on my impulse to write a book on “writing”. That’s dangerous.

              Thanks…

Comments are closed.