Deep learning for massive amounts of data
In this exabyte-scale era, data are growing at an exponential rate. Organizations and researchers analyze this growth in many ways and for many different purposes. According to a survey by the International Data Corporation (IDC), the Internet processes approximately 2 PB of data every day [51]. In 2006, the volume of digital data was around 0.18 ZB; by 2011 it had increased to 1.8 ZB. It was expected to reach 10 ZB by 2015, and by 2020 the worldwide volume was projected to reach approximately 30 to 35 ZB. The timeline of this data mountain is shown in Figure 2.1. These immense amounts of data in the digital world are formally termed big data.
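As a quick sanity check on the "exponential rate" claim, a worked calculation using only the figures quoted above gives the implied compound annual growth rate between 2006 and 2011:
\[
  r = \left(\frac{V_{2011}}{V_{2006}}\right)^{1/5} - 1
    = \left(\frac{1.8\ \mathrm{ZB}}{0.18\ \mathrm{ZB}}\right)^{1/5} - 1
    = 10^{1/5} - 1 \approx 0.585,
\]
that is, roughly 58.5% growth per year, or a tenfold increase over the five-year span.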
"The world of Big Data is on fire" | ||
--The Economist, Sept 2011 |
Figure 2.1: Growth of digital data volume over a span of around 20 years
Facebook stores almost 21 PB of data in 200 million objects [52], whereas Jaguar ORNL...