While the concept of big data is relatively new, the origins of large data sets go back to the 1960s, with the first data centers and the development of the relational database. By 2005, it had become clear just how much data users were generating through social media and other online services; at that time, there were just over 130 exabytes of data in the world. By 2020, it's estimated that there will be 44 zettabytes, a figure on pace to roughly quadruple by 2025. With the emergence of the Internet of Things (IoT), even more objects and devices are connected to the internet, gathering data on customer usage patterns and product performance, for example. Soon the spread of artificial intelligence (AI) and machine learning will produce still more data.

Before investing in big data, companies should weigh three of its defining characteristics: the volume of data, the velocity at which it arrives or can be acted on, and the variety of its types. And because we live in a world of questionable information sourcing, two more components have emerged over the last few years: the value and veracity of data. (Fake news, anyone?)

With many companies already earmarking big data IT upgrades as major investments in the coming year, the time has come to recognize the benefits of a smart data strategy. Such a strategy is not only about using the results to improve accuracy and make better long-term decisions; it is also about reducing administrative costs and improving critical reporting processes.

Finally, consider that global revenue from the storage and monitoring of big data is projected to reach $203 billion by the year 2020, with an estimated 440,000 big-data-related job roles in the US alone and only 300,000 skilled professionals to fill them. Collecting data alone will not help corporations make smarter decisions, but the sooner they prepare for their own inevitable tsunami of data, the better.