As the year 2023 draws to a close, now is a good time to reflect on the health of scientific research.
For the optimists among us, the state of scientific research is strong.
Revolutionary mRNA vaccine development won the Nobel Prize this year; we’ve seen major advances in nuclear fusion research, and the FDA just approved a CRISPR-based gene therapy for sickle cell patients.
But there are vocal critics who maintain that scientific research reached a plateau long ago, and instead of a series of major breakthroughs, what we have now is a series of modest, incremental advances in science.
Among these critics are Peter Thiel, the Palantir co-founder and chairman (also a PayPal cofounder and frenemy of Elon Musk), and the economist Tyler Cowen, both of whom have made the media rounds arguing we need to do more to make science revolutionary again.
Several articles published over the last year in Nature argue this very point – that science has entered a period of declining disruptiveness.
- ‘Disruptive’ science has declined — and no one knows why
- Is science really getting less disruptive — and does it matter if it is?
- Papers and patents are becoming less disruptive over time
The researchers in the first article analyzed scientific papers and patents over time and noted that since the 1970s, major breakthroughs in science have been few and far between.
However, by and large, they were not able to identify a specific reason for this trend.
We have some thoughts on this.
One place to look is whether basic research funding, such as grants from the NIH, has fallen off in recent decades.
A commonly held assumption is that NIH funding has been cut back significantly in recent decades. However, the Congressional Research Service and the NIH maintain that funding levels have held steady or slightly increased when adjusted for inflation in laboratory costs using the Biomedical Research and Development Price Index (BRDPI).
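To make the inflation adjustment concrete, here is a minimal sketch of how a nominal funding figure can be re-expressed in constant dollars using a price index like the BRDPI. The index values and the $30B figure below are hypothetical placeholders for illustration, not actual BRDPI or NIH data.

```python
# Illustrative sketch: comparing funding levels in "real" terms by
# deflating nominal dollars with a price index (e.g., the BRDPI).
# All numbers below are hypothetical, for illustration only.

def real_funding(nominal: float, index_then: float, index_now: float) -> float:
    """Re-express a past nominal amount in later-year dollars via a price index."""
    return nominal * (index_now / index_then)

# Hypothetical example: a $30B budget when the index stood at 100,
# restated for a later year when the index has risen to 125.
adjusted = real_funding(30.0, 100.0, 125.0)
print(round(adjusted, 1))  # 37.5 (billions, in later-year dollars)
```

In other words, a nominal budget that grew from $30B to anything less than $37.5B over that (hypothetical) index rise would actually represent a cut in real purchasing power.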
Another widely held belief is that it’s very hard for a first-time principal investigator (PI) to get an NIH grant, which could influence innovation outcomes. However, according to the NIH’s own data, the percentage of awards given to first-time applicants has actually risen by a few percentage points.
So why do so many people feel that funding has become harder to get? One possibility is that the NIH is offering bigger grant packages to fewer teams of applicants.
Another potential concern that needs to be investigated is whether the grants being awarded skew toward projects that have a known start and endpoint that help justify the funding.
While this approach has a laudable goal of justifying “value for money,” it might also have the unintended consequence of encouraging researchers to focus more on limited specific problems they know they can solve in advance rather than take on riskier basic research projects which might lead either to a major breakthrough or nothing at all.
Another aspect of the NIH claim that grant funding has kept up with inflation is whether it has also kept up with population growth: the potential pool of scientific researchers has grown along with the population over the past few decades.
Now let’s turn to some concerns about academic research at colleges and universities as things have changed significantly for them in recent years.
The concept of “publish or perish” has been with us for decades. But more recently, the number of papers published by researchers has increased significantly, particularly among those derided as “research paper super spreaders,” some of whom have published as many as 70 papers per year, far exceeding the more typical 4 or 5 papers published annually.
Why the overall increase in the number of scientific publications?
It could be due to a combination of the need to apply for more funding as well as a way to increase job security.
The career of a research faculty member has changed a lot from the 1970s to the 2020s. Back then, ambitious researchers sought out tenure-track professorships, which offered academic freedom and funding resources. Today, however, guaranteed tenure-track professorships have been phased out at many institutions, leaving researchers on the hook to find their own funding.
In many cases, this involves working more closely with corporate entities who underwrite their funding; however, these corporate ties could influence the trajectory of scientific research over time, possibly to the detriment of basic research.
We will come back to the topic of corporate research in a moment, but first, we need to talk about the changed nature of sharing scientific knowledge at academic institutions.
In her new autobiography, this year’s co-winner of the Nobel Prize in Physiology or Medicine, Katalin Karikó, wistfully recalled that back in the day when she needed a DNA sample for her mRNA research, a colleague from another institution happily supplied it to her.
Unfortunately, the era of freely sharing scientific research data and resources is long gone. Younger scientists might be surprised to know that in the 1950s and 1960s, research at American public universities was put into the public domain without a second thought – after all, the taxpayers had funded the research, and the commonly held belief was that it should benefit the scientific community overall.
Unfortunately, this is no longer the case.
Most of today’s academic institutions see scientific advancement as a financial opportunity.
To this end, they have created academic technology transfer offices (sometimes called business development departments) that carefully manage and maintain the intellectual property rights of discoveries made by their research labs, with the intent of licensing them through patents. (A related problem for researchers is the rise of patent litigation and so-called “patent trolls” that gobble up research patents for financial gain, to the detriment of innovation in general.)
We mentioned above that we would return to the topic of corporate research.
It would be worth investigating how yesterday’s giants in corporate-funded research, such as the world-famous Bell Labs – which in its heyday brought forth major inventions, from the transistor to the laser – compare with today’s in-house research labs or independent research facilities, such as SRI or the Battelle Institute. Few of today’s “lean and mean” corporations have the resources to fund long-term research and development in-house.
Historical events have also had a major impact on the progress of scientific research.
World War II was a period of intense scientific research and innovation. The development of the nuclear bomb, which pushed physics, materials science, laboratory design, and experimentation to the limit, is the primary example. But researchers during the period also developed the jet engine, the Norden bombsight, advanced radar systems, and rocket-based weapons systems, as well as the foundations for modern programmable computers, such as the Colossus, which was used to break encrypted German Lorenz teleprinter transmissions.
The subsequent Cold War with the Soviet Union led to massive investment in military research and development as the Pentagon sought to close the perceived “missile gap” with the Soviets. Even as Western civilian jet passenger aircraft entered service, things changed again in 1957 when the Soviet Union launched Sputnik, the first satellite to orbit the Earth, causing America to fundamentally question whether its students were falling behind in math and science.
Sputnik also set in motion the space race, which became a national mission when JFK vowed in May 1961 (and again in his famous September 1962 Rice University speech) to send a man to the moon by the end of the decade.
The massive number of scientific problems that needed to be solved to bring astronauts safely to the surface of the moon and back home led to massive increases in our understanding of astrophysics, rocketry, computer miniaturization, and satellite telecommunications. (It also famously popularized Tang, the orange-flavored powdered drink.)
With the Cold War winding down after 1989, governments around the world sought to enjoy the so-called peace dividend, hoping to spend less on military programs and more on humanitarian ones. This was the era of both the personal computer and the supercomputer, as well as the early stages of network computing (funded by DARPA, the Pentagon’s research agency), which ultimately led to today’s Internet, one of the most transformative innovations of all time.
This brings us roughly to the year 2000.
As we get closer to the present day, it becomes harder to assess which of today’s current scientific developments will be considered the breakout discoveries that opened new doors to major advancements in the future.
To make that assessment, we will have to wait till the year 2050 or later to get a better perspective.
Nevertheless, even the most ardent critics of current scientific progress have to admit there have been some major advances in the last 20 years that undermine the argument that scientific research has become more incremental and less revolutionary.
For example, electronic miniaturization has transformed our lives over the last two decades.
Short of performing medical scans like the show’s fictional tricorder, the iPhone we hold in our hands today performs nearly all the functions of the communicator from the 1960s Star Trek science fiction TV series. These advances in mobile communications have also opened up the entire world to instantaneous Internet-based communication.
In biological research, we now have the complete sequence of the human genome, and work is underway to understand its variants.
Advances in artificial intelligence and machine learning have led to the prediction of over 214 million protein structures, taking us one step closer to understanding complex disease mechanisms, such as those causing cancer, which remain elusive to date.
Materials science is also producing major breakthroughs. One of the big stories of the last decade is the dramatic improvement in solar cells, which in many markets now generate electricity more cheaply than fossil fuels. Intensive research in battery technology is also delivering higher energy density battery cells for electric vehicles at lower cost.
Of course, some areas of research appear to have hit the proverbial wall. While there have been major advancements in fusion research (see our recent article), it still looks like it is a long way from becoming a practical large-scale technology.
Another area of concern is maintaining Moore’s Law, i.e., the observation that the number of transistors on a chip – and with it computing capability – doubles roughly every two years. With microelectronic circuitry approaching atomic scale, we may need to transition to other technologies, such as quantum computing, but we are still awaiting a breakthrough in that area.
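The exponential growth Moore’s Law describes can be sketched in a few lines. The starting transistor count below is a hypothetical round number chosen for illustration; the two-year doubling period is the commonly cited rule of thumb.

```python
# A minimal sketch of the exponential growth described by Moore's Law:
# transistor counts doubling roughly every two years.

def moores_law(initial_count: int, years: float, doubling_period: float = 2.0) -> float:
    """Projected transistor count after `years`, doubling every `doubling_period` years."""
    return initial_count * 2 ** (years / doubling_period)

# Over 20 years, ten doublings multiply the count by 2**10 = 1024,
# so a hypothetical 1-million-transistor chip grows to about a billion.
print(int(moores_law(1_000_000, 20)))  # 1024000000
```

That thousandfold growth per two decades is exactly why even a modest slowdown in the doubling period compounds into a dramatic shortfall in expected computing capability.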
And finally, breathtaking advances in artificial intelligence, such as the breakout star of 2023, ChatGPT, may lead us to even faster scientific advances by allowing researchers to streamline literature searches and surface useful connections and insights that advance scientific knowledge.
It’s still up for debate whether the large language model (LLM) approach used by ChatGPT will lead to the Holy Grail of artificial general intelligence (AGI) or not, but if it does, future scientists will look back on the 2020s as a true golden era in scientific research.
We here at Formaspace wish you a happy and prosperous 2024.
Formaspace is Your Laboratory Research Partner
If you can imagine it, we can build it, at our Austin, Texas factory headquarters.
Talk to your Formaspace Sales Representative or Strategic Dealer Partner today to learn more about how we can work together to make your next construction project or remodel a success.