The global online population has reached over 2 billion users, or about 30% of the world's population. The average Internet user spends about 16 hours on the Internet every month, while that figure doubles for users in the United States. This equates to approximately 35 billion hours of online user time per month, the equivalent of almost 4 million years of aggregate user time every month.
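As a back-of-the-envelope check, the conversion from aggregate monthly hours to aggregate years can be sketched as follows; the 35-billion-hour figure comes from the text, and the 8,760 hours-per-year factor is the standard non-leap-year value.

```python
# Convert the quoted ~35 billion hours of monthly online time into years.
total_hours_per_month = 35e9     # text's estimate of aggregate online hours/month
hours_per_year = 24 * 365        # 8,760 hours in a non-leap year

aggregate_years = total_hours_per_month / hours_per_year
print(f"{aggregate_years / 1e6:.1f} million years per month")  # ~4.0 million
```

This confirms that roughly 35 billion hours per month corresponds to the "almost 4 million years" figure cited above.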
For most people, browsing the Internet is not a computationally intensive activity, and considering the constantly evolving hardware profile of an average user, the processing potential of their hardware is generally underutilized. This leaves many idle computation cycles that could be harnessed for constructive purposes.
In addition, there are people, entities, and organizations that benefit heavily from distributed computing. For example, some users run high-performance computing applications that perform computationally intensive tasks in fields such as quantum mechanics, weather forecasting, climate research, oil and gas exploration, molecular modeling, and physical simulations. Other users deal with big data applications, whose data sets are so large and complex that they cannot be processed using conventional techniques. For these applications, users need to capture, curate, store, search, share, transfer, analyze, and visualize data. They also need to find correlations in data to identify business trends, assess research quality, prevent disease, link legal citations, combat crime, and determine real-time roadway traffic conditions. Still others use distributed computing for animation rendering, cryptographic searches, bioinformatics, facial recognition, climate modeling, genetic algorithms, and many other applications.
Most users of distributed computation have a choice to either build on-premises data centers or use private and public cloud computing services such as Amazon Web Services (AWS). These solutions require physical data centers to perform the computations, whether those data centers are on-premises systems or operated in the cloud on the premises of third parties. Most data centers run at or near maximum capacity regardless of demand, wasting about 90% of the electricity pulled off the grid. Combined with this underutilization, the energy wasted in these data centers is about thirty times the amount used for actual computation. Worldwide, data centers use about 30 GW of electricity, roughly equivalent to the output of 30 nuclear power plants. Thus, the costs associated with these solutions are dominated by administration, which raises the price of the associated distributed computing services.
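The scale of the quoted power figures can be sanity-checked with a short sketch. The 30 GW worldwide draw comes from the text; the assumption that one nuclear plant produces roughly 1 GW is an illustrative figure not stated in the source.

```python
# Scale check for the data-center power figures above.
power_gw = 30            # worldwide data-center electricity draw quoted in the text
gw_per_plant = 1.0       # assumed output of a typical nuclear plant (illustrative)

equivalent_plants = power_gw / gw_per_plant
annual_twh = power_gw * 24 * 365 / 1000   # continuous GW draw -> TWh per year
print(f"~{equivalent_plants:.0f} plants, ~{annual_twh:.0f} TWh/year")
```

Under the 1 GW-per-plant assumption, 30 GW indeed maps to about 30 plants, and sustaining that draw year-round amounts to roughly 263 TWh of electricity annually.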
Another element of the online infrastructure that has helped ease the delivery and distribution of information is the content delivery network (“CDN”). CDNs serve content to end users with the goals of high availability and high performance. They serve many different types of content for a large fraction of the Internet, including web objects (text, graphics, and scripts), downloadable objects (media files, software, and documents), applications (e-commerce and portals), live streaming media, on-demand streaming media, and social networks. CDN nodes are usually geographically diversified, often across multiple backbones. These services help reduce bandwidth costs, improve page load times, and increase the global availability of content. The number of nodes and servers making up a CDN varies with the architecture; some reach thousands of nodes with tens of thousands of servers across many remote points of presence (PoPs).