Three reasons for the Big Oblivion

1. The internet's carbon emissions

3.4 billion people are online and generate data traffic. Data traffic consumes energy, and that energy consumption causes carbon emissions.
Driven by a growing online population, high-definition media, and ever faster connections, the volume of data traffic doubles every 18 months. If traffic grows over the coming 10 years as it has over the past 10, its volume will be more than 50 times higher than today (ten years is about 6.7 doubling periods, i.e. roughly a factor of 100).
The internet already accounts for about 3% of global carbon emissions. Figuratively speaking, we will soon run out of breath…
That critical point is too close to count on new technologies such as unlimited energy sources or electricity-free data transfer; both are far down the line.
Here's what that means: in the very near future, ecological and economic pressures will force us to massively reduce data.
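A minimal back-of-the-envelope check of that projection (a sketch of the arithmetic, not from the article's sources):

```python
# If traffic doubles every 18 months, how much does it grow in 10 years?
DOUBLING_PERIOD_MONTHS = 18
HORIZON_MONTHS = 10 * 12

doublings = HORIZON_MONTHS / DOUBLING_PERIOD_MONTHS
growth_factor = 2 ** doublings

print(f"Doublings in 10 years: {doublings:.1f}")  # ~6.7
print(f"Growth factor: {growth_factor:.0f}x")     # ~102x, comfortably "more than 50 times"
```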

Facts about data traffic and carbon emissions

Over the past ten years, the volume of data traffic has doubled every 18 months. In ten years' time it will be more than 50 times higher than today.
In 2016 the internet already accounted for about 3% of global carbon emissions.

Volume of data traffic:

– an article in The Telegraph

– the increasing volume of data traffic in numbers, with a more detailed treatment here.

Carbon emissions:

– International Energy Agency Statistics

– EDGAR (Emissions Database for Global Atmospheric Research) world total

Carbon emissions of the internet:

– an impressive website: internetlivestats.com (the internet's carbon emissions are shown at the bottom right)

– an article by Marcus Hurst that cites figures from different sources.

In 2016 the internet caused more than 1 billion tons of CO2; total annual global emissions are about 35 billion tons.
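A trivial consistency check, using only the two figures just cited:

```python
# Internet emissions vs. total global emissions, 2016 figures as cited above.
internet_co2_tons = 1e9  # > 1 billion tons caused by the internet
global_co2_tons = 35e9   # ~35 billion tons emitted globally per year

share = internet_co2_tons / global_co2_tons
print(f"Internet share of global CO2 emissions: {share:.1%}")  # ~2.9%, matching the ~3% above
```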


2. Unpaid bills

CDs, DVDs, and hard drives are considered old technology; today our data lives “in the cloud” or “on the internet”.

What happens to a cloud account when its holder (a person or an institution) ceases to exist? After a couple of unpaid bills, the account is closed and its contents are deleted.

Blogs, an instant mirror of our time, will never become valuable sources for future historians. The proceedings of scientific congresses and the research results of institutes will be irretrievably lost. Crucial know-how will vanish.


3. Accuracy vs. falsity

Less than 1% of ancient texts have been preserved. Publishing took real effort back then, and yet after 2,000 years we can still reconstruct the ancient world and its way of thinking.

Publishing today, by contrast, is as simple as a post, a tweet, a share, or a video upload. It is no longer just scientists and newspapers that publish; today everybody can.

And because pseudo-science has such mass appeal, stories about aliens and crop circles spread far more widely than scientific papers. That’s why a Google search for “crop circle” yields 20 million results, whereas “moon landing” yields only 6 million. Even worse, 5 million of those 6 million hits are “moon landing hoax” stories…


Consider the ESA probe GAIA, which is producing the most detailed maps of our galaxy. And yet the size of its entire data output over 5 years in outer space is dwarfed by 1 day of internet traffic (precisely: 5 years of data collection equal 0.03% of one day's internet data traffic).
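To make that scale concrete, here is a short sketch; only the 0.03% ratio comes from the text above, while the daily-traffic figure is an assumption for illustration (Cisco estimated global IP traffic in 2016 at roughly 3 exabytes per day):

```python
# How does GAIA's 5-year output compare to one day of internet traffic?
DAILY_TRAFFIC_EB = 3.0  # assumed daily internet traffic in exabytes (illustrative)
GAIA_SHARE = 0.0003     # 5 years of GAIA data = 0.03% of one day's traffic (from the text)

gaia_total_pb = DAILY_TRAFFIC_EB * GAIA_SHARE * 1000  # exabytes -> petabytes
print(f"GAIA's 5-year data output: ~{gaia_total_pb:.1f} PB")    # ~0.9 PB
print(f"One day of traffic is ~{1 / GAIA_SHARE:,.0f}x larger")  # ~3,333x
```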


The things we put great effort into and spend great sums on (spaceflight, medicine, research, …) drown in oceans of trivial, irrelevant content and redundant bullshit (in Harry Frankfurt's sense of the word).


Here’s why that’s a problem: it is very likely that, 2,000 years from now, a random surviving fragment of today's data will paint a completely distorted picture of our time.

