Is today’s science where the software industry was in the 70s?

Mar 3, 2014

Today there are millions of apps that you can download to your smartphone anywhere in the world and start using in seconds. Imagine a totally different world, though, one where in order to use an app you had to buy a new phone each time, or spend weeks developing it yourself. That was more or less the state of the computing world up until the 1980s.

There’s a recent blog post called “The Myth of the Fall” about the history of open source software. In it, the argument is made that software up through the 70s and early 80s was all proprietary, and that this was because differing architectures made it impossible for a program to be compatible with more than one computer – what we call “portability.” It was as if, in order to use the same software, you first had to translate it into the specific language that a particular computer could “speak.” Reusing programs, one of the key benefits of open source, just wasn’t practical. Progress was cumbersome.

It took decades for dedicated individuals to engineer the hardware architectures needed for software portability, and just as long for communities to adopt the common standards to make it possible. Once that was established, software such as the Unix operating system and its open source descendants could flourish – and now the world runs on open source, and you can download countless apps to your phone. (There are some nice parallels to the billions of years of evolution it took to get beyond single-celled organisms, and to the multicellular explosion once it finally happened.)

Could it be that science today, and scholarly publishing in particular, is where software was in the 1970s? Software had to battle incompatible computer hardware in the 70s; in science we have cultural practices and business models that limit the reuse and dissemination of scholarly knowledge. Paywall publishers continue to lobby against Open Access, much as IBM in the 1970s tried to convince us that personal computers weren’t needed – until Apple showed us what could be done with them. The result is that we aren’t making as much progress as we otherwise could. Academic publishing today is like the non-portable software we had 30 years ago. That’s slowly changing, though.

The Open Access movement today is to scientific progress what Open Source software was to the information technology explosion that began in the 1980s. And just as common hardware architectures greased the wheels for open source to thrive, today’s emerging reproducibility and data-availability initiatives hint at a budding scientific explosion. Government and funding agency mandates for Open Access are also greasing those wheels, making science more “portable” and accessible to a larger audience.

Our mission at PeerJ is to efficiently publish the world’s knowledge so that we can tackle this century’s greatest challenges. Achieving this will take a modern update to how we think about doing science and publishing. Because of that mission we don’t view PeerJ as a publisher in the traditional sense, and you’ll continue to see us marching to the beat of a 21st-century drum. Our job is to grease the wheels of science by getting scholarly knowledge (of any size) out there as far and as cheaply as possible – with a few carrots thrown in for motivation. The software industry is a model we can all look to with confidence for what is possible when “Open” is embraced.
 
