Take a look at this graphic from IDC. It estimates the total number of bits in the world over time. There is a lot going on here: it shows that enterprise data is certainly growing, but does not comprise the majority of data; sensor data and social media data far outstrip it.
It also shows that a huge fraction of all data contains uncertainty. This has dramatic implications for old-school programmers: programming absolutely must continue to adopt new approaches to handle uncertain input data, particularly for emerging cognitive applications. The traditional excuse of classifying ANY input errors as “garbage in” just won’t cut it any more.
But my favorite part of this graphic is the axes; forget the curves (how often does that happen?). The x-axis shows time, with 2015 on the far right. The y-axis shows the number of bits in the world. For the chemists among you, 10 to the 23rd is essentially Avogadro’s number (6.02 × 10^23), which is the number of molecules in a “mole.” What does this mean? Imagine you’re holding a tablespoon filled with water. You’re holding roughly a mole of water molecules. The chart above implies that this year, 2015, there will be one bit of data for EVERY molecule of H2O in that tablespoon. To me, that is nothing short of INCREDIBLE and AWESOME. When I was growing up, I remember trying to imagine how we would ever have such a gi-nor-mous number of macroscopic things. Well, here we are, and in my lifetime. I’m moved.
So I hereby officially declare
2015 as the “mole of bits” year
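If you want to sanity-check the tablespoon arithmetic, here’s a quick back-of-the-envelope sketch in Python. The tablespoon volume and water density are my own rough assumptions, not figures from the IDC chart:

```python
# Back-of-the-envelope check: how many H2O molecules fit in a tablespoon?
# Assumed constants (approximate, not from the IDC chart):
AVOGADRO = 6.022e23          # molecules per mole
WATER_MOLAR_MASS_G = 18.0    # grams per mole of H2O
WATER_DENSITY_G_PER_ML = 1.0 # grams per milliliter, near room temperature
TABLESPOON_ML = 14.8         # US tablespoon, approximately

grams = TABLESPOON_ML * WATER_DENSITY_G_PER_ML
moles = grams / WATER_MOLAR_MASS_G
molecules = moles * AVOGADRO

print(f"A tablespoon of water holds about {molecules:.2e} molecules")
```

That works out to roughly 5 × 10^23 molecules, the same order of magnitude as the bit count the chart shows for 2015, so “a mole of bits” holds up as a back-of-the-envelope claim.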