the future of content

Throughout human history, the ability to share information, and ultimately ideas, has propelled the human race forward. For millennia the ability to read and write was the distinguishing factor between the educated and the common. For the first 5,000 years of recorded history, each copy of a written work was made painstakingly by hand, which made every copy rare and extremely valuable. Because reproduction was so expensive, only the writings esteemed to be of the greatest worth were ever reproduced, and works were written with the view and hope that they would survive lifetimes.

In the 1400s Gutenberg’s invention of the printing press forever changed the world. For the first time in history a machine, not a human, did the reproducing for us. The early printing press was still extremely manual, requiring the typesetter to place every letter on every page. While the cost of reproduction fell, it remained high enough that only the most valuable works were reproduced.

Over the next few centuries the machines became more and more powerful. Thanks to the industrial revolution, they progressed to the point where rapid, large-scale reproduction was possible, leading to the birth of newspapers. The next major leap in human communication came through radio and later television, which enabled true real-time broadcasting without the need to distribute a physical medium: the receiver saw and heard the information as it happened.

While huge strides were made in the ability to receive information, through all of these revolutions the cost to reproduce and broadcast remained prohibitive to all but the most powerful. An elite group of people continued to dictate what everyone else had access to through news corporations, book publishers, and television and radio stations.

In the late 1900s the Internet emerged. Machines had progressed to the point where the cost to reproduce and broadcast had reached nearly zero. For the first time in human history the right to broadcast was extended to anyone.

How did we, the human race, respond to this? Initially, content was static, and producing it still carried a cost. That cost had dropped so significantly that virtually anyone could publish something, but it was still high enough that people published only what they deemed of value. This huge proliferation of published content created a massive problem: how to catalog and ultimately find information. A number of search engines emerged, and ultimately Google solved the problem of indexing and connecting all of this published information.

Over the next few years the cost decreased further and the ability to create dynamic content emerged. Services like LiveJournal, Blogger and Flickr democratized publishing even more. They reduced the technical knowledge needed to publish on the Internet and greatly reduced the cost. To publish, you no longer needed to provide your own infrastructure or domain name. These services would gladly provide all of that for you in exchange for certain rights, including the right to reproduce your content as they wished and the right to inject their own advertising into it.

In an effort to reproduce these services’ accessibility for non-technical publishers while letting authors retain the rights to their content, open source applications like Drupal and WordPress emerged. Their arrival was heralded and adoption exploded. While they did reduce the technical knowledge required to publish, they retained many of the infrastructure challenges and costs.

One could not retell the history of content without mentioning Wikipedia. Its singular mission …

In 2004 the web remained dominated by content of value. People “published” things on the Internet that they believed were worth publishing. Google served as our portal to the web. We still turned to traditional media outlets for our news, but now also to new media outlets, largely in the form of bloggers. In many regards this was the moment when humanity achieved the ideal state: anyone who desired to publish something of worth could write it, and a worldwide audience could receive it. Information on virtually any topic was available to anyone, anywhere.

Up to this apex, all publishing was built on the belief that the number of authors would be a small fraction of the number of readers. This one-to-many relationship continued to motivate people to produce content of value even as the cost of reproduction decreased.

Facebook, Myspace and Twitter emerged around 2005. Building on the foundation of the blogging and photo services, they took things further. Not content with anyone being able to publish, they sought to make everyone an author, and they succeeded. The intended audience for most content became your small group of “friends”. Content became disposable and trite, having little or no value. So little is the perceived value of this content that billions of authors have no problem waiving most of their rights to it in favor of the companies that host these services. Few people care that Facebook alone controls all publishing rights to the content given to it, and that it alone determines which of your friends’ feeds your content will be inserted into. To reiterate: Facebook decides which of your friends actually see the content you created and published. Twitter suffers the inverse problem: so much content is created that only a tiny fraction of your followers will ever see any of it.

Even more telling is that we have now outsourced the creation of content to machines. Conservative estimates have placed the percentage of active users on Twitter that are robots at around 10%, while other estimates suggest that nearly half of followers on Twitter are robots.

In 2016 we have reached a saturation point: content of no value has overwhelmed us like a tsunami of rubbish.