Zettabyte Era

The Zettabyte Era or Zettabyte Zone [1] is a period of human and computer science history that started in the mid-2010s. The precise starting date depends on whether it is defined as the point when global IP traffic first exceeded one zettabyte, which happened in 2016, or when the amount of digital data in the world first exceeded one zettabyte, which happened in 2012. A zettabyte is a multiple of the unit byte that measures digital storage, and it is equivalent to 1,000,000,000,000,000,000,000 (10²¹) bytes. [2]

According to Cisco Systems, an American multinational technology conglomerate, global IP traffic reached an estimated 1.2 zettabytes (an average of 96 exabytes (EB) per month) in 2016. Global IP traffic refers to all digital data that passes over an IP network, which includes, but is not limited to, the public Internet. The largest contributor to the growth of IP traffic is video traffic (including online streaming services like Netflix and YouTube). [3] [4]

The Zettabyte Era can also be understood as an age of growth of all forms of digital data that exist in the world, which includes the public Internet but also all other forms of digital data, such as stored data from security cameras or voice data from cell-phone calls. [5] Under this second definition of the Zettabyte Era, it was estimated that in 2012 upwards of 1 zettabyte of data existed in the world and that by 2020 there would be more than 40 zettabytes of data in the world at large. [6]

The Zettabyte Era presents difficulties for data centers, which must keep up with the explosion of data consumption, creation and replication. [7] In 2015, the Internet and all its components consumed 2% of total global power, so energy efficiency with regard to data centers has become a central problem in the Zettabyte Era. [8]

IDC forecasts that the amount of data generated each year will grow to 175 zettabytes by 2025. [9] [10] It further estimates that a total of 22 zettabytes of digital storage will be shipped across all storage media types between 2018 and 2025, with nearly 59 percent of this capacity being provided by the hard drive industry. [11]

The zettabyte

A zettabyte is a digital unit of measurement. One zettabyte is equal to one sextillion bytes, or 10²¹ (1,000,000,000,000,000,000,000) bytes; equivalently, one zettabyte is equal to a trillion gigabytes. [4] [2] To put this into perspective, consider that "if each terabyte in a zettabyte were a kilometre, it would be equivalent to 1,300 round trips to the moon and back (768,800 kilometers)". [4] As former Google CEO Eric Schmidt put it, from the very beginning of humanity to the year 2003, an estimated 5 exabytes of information was created, [12] which corresponds to 0.5% of a zettabyte. In 2013, that amount of information (5 exabytes) took only two days to create, and that pace is continuously growing. [12]
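These conversions can be checked with a few lines of arithmetic. The sketch below assumes decimal (SI) prefixes, under which a zettabyte is exactly 10²¹ bytes, rather than the binary (IEC) prefixes:

```python
# Decimal (SI) unit sizes in bytes: 1 GB = 10^9, 1 TB = 10^12, 1 EB = 10^18, 1 ZB = 10^21.
GIGABYTE = 10**9
TERABYTE = 10**12
EXABYTE = 10**18
ZETTABYTE = 10**21

# One zettabyte expressed in smaller units.
print(ZETTABYTE // GIGABYTE)   # 1,000,000,000,000 -> a trillion gigabytes
print(ZETTABYTE // TERABYTE)   # 1,000,000,000 -> a billion terabytes

# Schmidt's 5 exabytes as a fraction of a zettabyte.
print(5 * EXABYTE / ZETTABYTE)  # 0.005, i.e. 0.5% of a zettabyte
```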

Definitions

The concept of the Zettabyte Era can be separated into two distinct categories:

  1. In terms of IP traffic: This first definition refers to the total amount of data that traverses global IP networks such as the public Internet. In Canada, for example, data downloaded by residential Internet subscribers grew by an average of 50.4% from 2011 to 2016. [13] According to this definition, the Zettabyte Era began in 2016, when global IP traffic surpassed one zettabyte, estimated to have reached roughly 1.2 zettabytes. [3]
  2. In terms of all forms of digital data: In this second definition, the Zettabyte Era refers to the total amount of all the digital data that exists in any form, from digital films to transponders that record highway usage to SMS text messages. [5] According to this definition, the Zettabyte Era began in 2012, when the amount of digital data in the world surpassed one zettabyte. [6]

In 2016, Cisco Systems stated that the Zettabyte Era had become a reality when global IP traffic reached an estimated 1.2 zettabytes. Cisco also provided predictions of future global IP traffic in its report The Zettabyte Era: Trends and Analysis, which uses current and past global IP traffic statistics to forecast trends between 2016 and 2021. Several of the report's predictions for 2021 are discussed in the sections below. [3]

Factors that led to the Zettabyte Era

Many factors brought about the rise of the Zettabyte Era. Increases in video streaming, mobile phone usage, broadband speeds and data center storage all contributed to the rise (and continuation) of data consumption, creation and replication. [3] [14] [15]

Increased video streaming

A large and ever-growing consumption of multimedia, including video streaming, on the Internet has contributed to the rise of the Zettabyte Era. [16] In 2011 it was estimated that video streaming services accounted for roughly 25%–40% of IP traffic. [17] Since then, video IP traffic has nearly doubled to an estimated 73% of total IP traffic. Furthermore, Cisco has predicted that this trend will continue, estimating that by 2021, 82% of total IP traffic will come from video. [3]

The amount of data used by video streaming services depends on the quality of the video. Android Central breaks down how much data is used (on a smartphone) for different video resolutions. According to their findings, video at between 240p and 320p resolution uses roughly 0.3 GB per hour. Standard-definition video, at a resolution of 480p, uses approximately 0.7 GB per hour. High-definition video, which varies between 720p and 2K resolution, uses about 0.9 GB (720p), 1.5 GB (1080p) and 3 GB (2K) per hour. Finally, 4K video, known as ultra-high-definition video, uses about 7.2 GB per hour. [18]
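For a sense of scale, the per-hour figures above can be converted into rough monthly totals. The sketch below reuses the Android Central estimates quoted in this paragraph; the assumption of two hours of viewing per day is arbitrary and not from the source:

```python
# Approximate data used per hour of streaming video, in gigabytes,
# based on the Android Central estimates quoted above.
GB_PER_HOUR = {
    "240p-320p": 0.3,
    "480p (SD)": 0.7,
    "720p (HD)": 0.9,
    "1080p (HD)": 1.5,
    "2K": 3.0,
    "4K (UHD)": 7.2,
}

HOURS_PER_DAY = 2      # assumed viewing time, not a figure from the source
DAYS_PER_MONTH = 30

for resolution, gb_per_hour in GB_PER_HOUR.items():
    monthly_gb = gb_per_hour * HOURS_PER_DAY * DAYS_PER_MONTH
    print(f"{resolution}: about {monthly_gb:.0f} GB per month")
```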

Netflix and YouTube top the list of the most globally streamed video services online. In 2016, Netflix represented 32.72% of all video streaming IP traffic, while YouTube represented 17.31%. Amazon Prime Video takes third place, with 4.14% of global data usage. [19]

Netflix

Currently, Netflix is the largest video streaming service in the world, accessible in over 200 countries and with more than 80 million subscribers. [20] Streaming high-definition video content through Netflix uses roughly 3 GB of data per hour, while standard definition uses around 1 GB per hour. [21] In North America, during peak bandwidth consumption hours (around 8 PM), Netflix uses about 40% of total network bandwidth. [22] This vast volume of data marks an unparalleled period in time and is one of the major contributing factors that has led the world into the Zettabyte Era. [3]

YouTube

YouTube is another major video streaming (and video uploading) service, [23] whose data consumption rate across both fixed and mobile networks remains quite large. [24] In 2016, the service accounted for about 20% of total Internet traffic and 40% of mobile traffic. As of 2018, 300 hours of YouTube video content were uploaded every minute. [25] [26]

Increased wireless and mobile traffic

The usage of mobile technologies to access IP networks has resulted in an increase in overall IP traffic in the Zettabyte Era. In 2016, the majority of devices that moved IP traffic and other data streams were hard-wired devices. Since then, wireless and mobile traffic have increased and are predicted to continue to increase rapidly. Cisco predicts that by the year 2021, wired devices will account for 37% of total traffic while the remaining 63% will be accounted for through wireless and mobile devices. Furthermore, smartphone traffic is expected to surpass PC traffic by 2021; PCs are predicted to account for 25% of total traffic, down from 46% in 2016, whereas smartphone traffic is expected to increase from 13% to 33%. [3]

According to the Organisation for Economic Co-operation and Development (OECD), mobile broadband penetration rates are ever-growing. Between June 2016 and December 2016, the average mobile broadband penetration rate across all OECD countries increased by 4.43%. Poland had the largest increase, at 21.55%, while Latvia's penetration rate declined by 5.71%. The OECD calculated that there were 1.27 billion mobile broadband subscriptions in total in 2016; 1.14 billion of these subscriptions had both voice and data included in the plan. [14]

Increased broadband speeds

Broadband connects Internet users to the Internet, so the speed of the broadband connection is directly correlated with IP traffic: the greater the broadband speed, the more traffic can traverse IP networks. Cisco estimates that broadband speeds will double by 2021. In 2016, global fixed broadband speeds averaged 27.5 Mbit/s, and they are expected to reach 53 Mbit/s by 2021. [3] According to Akamai, between the fourth quarter of 2016 and the first quarter of 2017, the average global fixed broadband speed was 7.2 Mbit/s. South Korea topped the list of broadband speeds; in that period broadband speeds increased by 9.3%. [27]
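The correlation between line speed and potential traffic can be illustrated with a simple upper-bound calculation. The sketch below assumes a connection saturated around the clock, which no real subscriber approaches, so the results are theoretical ceilings rather than observed usage:

```python
# Upper bound on monthly traffic for a fixed broadband line,
# assuming the link runs at full speed 24 hours a day.
def max_monthly_traffic_tb(speed_mbit_s: float) -> float:
    seconds_per_month = 30 * 24 * 3600
    bits = speed_mbit_s * 1e6 * seconds_per_month
    return bits / 8 / 1e12          # bits -> bytes -> terabytes (decimal)

print(max_monthly_traffic_tb(27.5))  # 2016 global average speed: ~8.9 TB/month ceiling
print(max_monthly_traffic_tb(53.0))  # 2021 forecast speed:       ~17.2 TB/month ceiling
```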

High-bandwidth applications need significantly higher broadband speeds. Certain broadband technologies, including fiber-to-the-home (FTTH), high-speed digital subscriber line (DSL) and cable broadband, are paving the way for increased broadband speeds. [3] FTTH can offer broadband speeds that are ten times (or even a hundred times) faster than DSL or cable. [28]

Internet service providers in the Zettabyte Era

The Zettabyte Era has affected Internet service providers (ISPs), with data flowing in from all directions. Congestion occurs when too much data flows in and the quality of service (QoS) weakens. [29] In China, some ISPs store and handle exabytes of data. [6] Certain ISPs have responded by implementing so-called network management practices in an attempt to accommodate the never-ending data surge of Internet subscribers on their networks. Furthermore, the technologies being implemented by ISPs across their networks are evolving to address the increase in data flow. [30]

Network management practices have brought about debates relating to net neutrality in terms of fair access to all content on the Internet. [30] According to The European Consumer Organisation, network neutrality can be understood as an aim that "all Internet should be treated equally, without discrimination or interference. When this is the case, users enjoy the freedom to access the content, services, and applications of their choice, using any device they choose". [31]

According to the Canadian Radio-television and Telecommunications Commission (CRTC) Telecom Regulatory Policy 2009-657, there are two forms of Internet traffic management practices (ITMPs) in Canada: economic practices, such as data caps, and technical practices, such as bandwidth throttling and blocking. According to the CRTC, ISPs put the technical practices in place to address and solve congestion issues in their networks; however, the CRTC states that ISPs are not to employ ITMPs for preferential or unjustly discriminatory reasons. [32]

In the United States, however, during the Obama administration, the Federal Communications Commission's (FCC) order 15-24 (the 2015 Open Internet Order) put three bright-line rules in place to protect net neutrality: no blocking, no throttling, no paid prioritization. [33] On 14 December 2017, the FCC voted 3–2 to remove these rules, allowing ISPs to block, throttle and give fast-lane access to content on their networks. [34]

In an attempt to aid ISPs in dealing with the large data flows of the Zettabyte Era, in 2008 Cisco unveiled a new router, the Aggregation Services Router (ASR) 9000, which at the time was claimed to offer six times the speed of comparable routers. In one second the ASR 9000 router would, in theory, be able to process and distribute 1.2 million hours of DVD traffic. [35] By 2011, with the coming of the Zettabyte Era, Cisco had continued work on the ASR 9000, which could by then handle 96 terabits per second, up significantly from the 6.4 terabits per second it could handle in 2008. [36]

Data centers

Energy consumption

Data centers attempt to accommodate the ever-growing rate at which data is produced, distributed, and stored. Data centers are large facilities used by enterprises to store immense datasets on servers. [37] In 2014 it was estimated that in the U.S. alone there were roughly 3 million data centers, [38] ranging from small centers located in office buildings to large complexes of their own. [39] Increasingly, data centers store more data than end-user devices do. It was predicted that by 2020, 61% of total data would be stored via cloud applications (data centers), in contrast to 2010, when 62% of data was stored on end-user devices. An increase in data centers for data storage coincides with an increase in data centers' energy consumption. [40]

In 2014, data centers in the U.S. accounted for roughly 1.8% of total electricity consumption, which equates to 70 billion kWh. Between 2010 and 2014, electricity consumption by data centers increased by 4%, and this upward trend of 4% was predicted to continue through 2014–2020. [41] In 2011, energy consumption from all data centers equated to roughly 1.1% to 1.5% of total global energy consumption. [42] Information and communication technologies, including data centers, are responsible for creating large quantities of CO2 emissions. [43]
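As a rough consistency check, the two figures above (70 billion kWh and 1.8%) can be used to back out the implied total U.S. electricity consumption:

```python
# Back out the total implied by "70 billion kWh is roughly 1.8% of U.S. electricity use".
data_center_kwh = 70e9   # 70 billion kWh used by U.S. data centers in 2014 (cited above)
share_of_total = 0.018   # 1.8% of total U.S. electricity consumption (cited above)

implied_total_twh = data_center_kwh / share_of_total / 1e9  # 1 TWh = 10^9 kWh
print(f"Implied total U.S. consumption: about {implied_total_twh:,.0f} TWh")  # ~3,889 TWh
```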

Google's green initiatives

Data centers do not use energy only to power their servers. In fact, most data centers spend about half of their energy on non-computing functions such as cooling and power conversion. Google's data centers have been able to reduce non-computing costs to 12%. [44] Furthermore, as of 2016, Google uses its artificial intelligence unit, DeepMind, to manage the amount of electricity used to cool its data centers, which has reduced those costs by roughly 40%. [45] Google claims that its data centers use 50% less energy than ordinary data centers. [46]

According to Google's Senior Vice President of Technical Infrastructure, Urs Hölzle, Google's data centers (as well as its offices) would reach 100% renewable energy for the company's global operations by the end of 2017. Google plans to accomplish this milestone by buying enough wind and solar electricity to account for all the electricity its operations consume globally. The reason for these green initiatives is to address climate change and Google's carbon footprint. Furthermore, these green initiatives have become cheaper, with the cost of wind energy falling by 60% and that of solar energy by 80%. [46]

In order to improve a data center's energy efficiency, reduce costs and lower its impact on the environment, Google recommends five best practices for data centers to implement: [47]

  1. Measure the Power Usage Effectiveness (PUE), the ratio of total facility energy to the energy used by IT equipment, which the industry uses to track how much of a data center's energy goes to non-computing functions (a short worked example follows this list).
  2. Using well-designed containment methods, try to stop cold and hot air from mixing. Also, use blanking plates for empty slots on the rack and eliminate hot spots.
  3. Keep the aisle temperatures cold for energy savings.
  4. Use free cooling methods to cool data centers, including a large thermal reservoir or evaporating water.
  5. Eliminate as many power conversion steps as possible to lower power distribution losses.
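The PUE mentioned in item 1 is simply total facility energy divided by the energy consumed by the IT equipment alone. The sketch below uses illustrative meter readings; the numbers are hypothetical and not taken from Google or the cited source:

```python
# Power Usage Effectiveness (PUE): total facility energy divided by IT equipment energy.
# A PUE of 1.0 would mean every kilowatt-hour goes to computing; real facilities are higher.
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    return total_facility_kwh / it_equipment_kwh

# Illustrative readings for one month (hypothetical values).
total_kwh = 1_200_000   # servers + cooling + lighting + power conversion losses
it_kwh = 1_000_000      # servers, storage and network gear only

print(f"PUE = {pue(total_kwh, it_kwh):.2f}")   # 1.20 -> 20% overhead on top of the IT load
```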

The Open Compute Project

In 2010, Facebook launched a new data center designed in a way that made it 38% more efficient and 24% less expensive to build and run than the average data center. This development led to the formation of the Open Compute Project (OCP) in 2011. [48] [49] OCP members collaborate to build new hardware that is more efficient, economical and sustainable in an age when data is ever-growing. [49] The OCP is currently working on several projects, including one focusing specifically on data centers. This project aims to guide the way new data centers are built, and also to help existing data centers improve thermal and electrical efficiency and maximize mechanical performance. The OCP's data center project focuses on five areas: facility power, facility operations, layout and design, facility cooling, and facility monitoring and control. [50]

References

  1. "Zipping Past the Zettabyte Era: What's Next for the Internet?". Now. Powered by Northrop Grumman. Retrieved 5 July 2020.
  2. TechTerms. "Zettabyte". Retrieved 12 January 2018.
  3. Cisco Systems. "The Zettabyte Era: Trends and Analysis". Retrieved 12 October 2017.
  4. Barnett, Jr., Thomas (9 September 2016). "The Zettabyte Era Officially Begins (How Much is That?)". Cisco Blogs. Retrieved 4 December 2017.
  5. Gantz, John; Reinsel, David (December 2012). "The Digital Universe in 2020: Big Data, Bigger Digital Shadows, and Biggest Growth in the Far East" (PDF): 1. Retrieved 4 December 2017.
  6. Xu, Zhi-Wei (March 2014). "Cloud-Sea Computing Systems: Towards Thousand-Fold Improvement in Performance per Watt for the Coming Zettabyte Era". Journal of Computer Science and Technology. 29 (2): 177–181. doi:10.1007/s11390-014-1420-2. S2CID 14818321.
  7. Arista. "The zettabyte era is here. Is your datacenter ready?" (PDF). Retrieved 12 January 2018.
  8. Newman, Kim (2014). "ChipScaleReview". Retrieved 4 December 2017.
  9. Tom Coughlin (27 November 2018). "175 Zettabytes By 2025". Forbes. Retrieved 11 May 2020.
  10. IDC Corporate (21 January 2020). "Global DataSphere Forecast". Archived from the original on 6 December 2021. Retrieved 20 February 2021.
  11. Tom Coughlin (27 November 2018). "175 Zettabytes By 2025". Forbes. Retrieved 11 May 2020.
  12. Vance, Jeff (25 June 2013). "Big Data Analytics Overview". Datamation. Retrieved 22 October 2017.
  13. CRTC. "Communications Monitoring Report 2016: Telecommunications sector overview". Canadian Radio-television and Telecommunications Commission. Archived from the original on 6 November 2017. Retrieved 22 October 2017.
  14. OECD. "OECD Broadband Portal". Organisation for Economic Co-operation and Development. Retrieved 4 December 2017.
  15. Chéramy, Séverine; Clermidy, Fabien; Simon, Gilles; Leduc, Patrick. "The "active-interposer" concept for high-performance chip-to-chip connections". p. 35.
  16. Althoff, Tim; Borth, Damian; Hees, Jörn; Dengel, Andreas (2014). "Analysis and Forecasting of Trending Topics in Online Media Streams". arXiv: 1405.7452 [cs.SI].
  17. Rao, Ashwin; Legout, Arnaud; Lim, Yeon-sup; Towsley, Don; et al. (2011). Written at Tokyo, Japan. Network characteristics of video streaming traffic (PDF). CONEXT 2011. New York, New York, USA: ACM Press. doi:10.1145/2079296.2079321. ISBN   978-1-4503-1041-3.
  18. Hildenbrand, Jerry. "How much mobile data does streaming media use?". Android Central. Retrieved 4 December 2017.
  19. Nickelsburg, Monica (22 June 2016). "Study: Amazon Video is now the third-largest streaming service, behind Netflix and YouTube". Geek Wire. Retrieved 4 December 2017.
  20. Ingram, Mathew. "Netflix Is Winning the Streaming Race—But for How Long?". Fortune. Retrieved 4 December 2017.
  21. Netflix. "How can I control how much data Netflix uses?". Retrieved 4 December 2017.
  22. Pariag, David; Brecht, Tim (2017). "Application Bandwidth and Flow Rates from 3 Trillion Flows Across 45 Carrier Networks" (PDF): 4–5. Retrieved 4 December 2017.
  23. Orsolic, Irena; Pevec, Dario; Suznjevic, Mirko; Skorin-Kapov, Lea (2016). "YouTube QoE Estimation Based on the Analysis of Encrypted Network Traffic Using Machine Learning" (PDF). Retrieved 4 December 2017.
  24. Summers, Jim; Brecht, Tim; Eager, Derek; Gutarin, Alex (2016). "Characterizing the Workload of a Netflix Streaming Video Server" (PDF). Retrieved 4 December 2017.
  25. Chen, Steve; Hurley, Chad; Karim, Jawed. "37 Mind Blowing YouTube Facts, Figures and Statistics – 2018". FortuneLords. Retrieved 20 March 2018.
  26. Brown, Kim (20 January 2019). "Mind Blowing YouTube Facts, Figures And Statistics [Very Interesting!!!]". Top Ten Notch. Retrieved 16 April 2023.
  27. Akamai. "akamai's [state of the internet] – Q1 2017 report" (PDF). Akamai. Retrieved 7 December 2017.
  28. FCC (23 June 2014). "Types of Broadband Connections". Federal Communications Commission. Retrieved 7 December 2017.
  29. Al-Bahadili, Hussein (2012). Simulation in Computer Network Design and Modeling: Use and Analysis. Information Science Reference. p. 282. ISBN 9781466601925. Retrieved 5 December 2017.
  30. Dutta, Soumitra; Bilbao-Osorio, Beñat. "The Global Information Technology Report 2012" (PDF). Insead. S2CID 154025136. Retrieved 5 December 2017.
  31. The European Consumer Organization. "Factsheet: The EU's Net Neutrality Rules" (PDF). Retrieved 5 December 2017.
  32. CRTC (21 October 2009). "Telecom Regulatory Policy CRTC 2009-657". Retrieved 5 December 2017.
  33. FCC. "REPORT AND ORDER ON REMAND, DECLARATORY RULING, AND ORDER" (PDF). Retrieved 5 December 2017.
  34. Castro, Alex (14 December 2017). "The FCC just killed net neutrality". The Verge. Retrieved 31 January 2018.
  35. Hamblen, Matt (11 November 2008). "Cisco unveils a router for the 'Zettabyte Era'". Computer World. Retrieved 5 December 2017.
  36. Zawacki, Neil. "Cisco Introduces a Simpler Way to Build Internet for Zettabyte Era". Compare Business Products. Retrieved 5 December 2017.
  37. Stroud, Forrest (30 May 2014). "Data Center". Webopedia. Retrieved 5 December 2017.
  38. Tikoff Vargas, Maria. "10 Facts to Know About Data Centers". Office of Energy Efficiency & Renewable Energy. Retrieved 6 December 2017.
  39. Freedman, Andrew (30 September 2014). "There Are Now 3 Million Data Centers in the U.S., and Climbing". Mashable. Retrieved 20 March 2018.
  40. Hormann, Peter; Campbell, Leith (2014). "Data Storage Energy Efficiency in the Zettabyte Era" (PDF). Australian Journal of Telecommunications and the Digital Economy: 51.1–52.2. Retrieved 5 December 2017.
  41. Koomey, Jonathan; Masanet, Eric; Horner, Nathaniel; Azevedo, Inês; Lintner, William. "United States Data Center Energy Usage Report". Berkeley National Laboratory. Retrieved 5 December 2017.
  42. Rong, Huigui; Zhang, Haomin; Xiao, Sheng; Li, Canbing; Hu, Chunhua (2016). "Optimizing energy consumption for data centers" (PDF). Renewable and Sustainable Energy Reviews. 58: 674–691. doi:10.1016/j.rser.2015.12.283. Archived from the original (PDF) on 1 February 2018. Retrieved 5 December 2017.
  43. "5 things to know about cloud computing and GHG emissions". carbometrix.com. 22 March 2022. Retrieved 27 September 2023.
  44. Google: Data Centers. "Efficiency: How we do it". Google. Retrieved 6 December 2017.
  45. Vincent, James (21 July 2016). "Google uses DeepMind AI to cut data center energy bills". The Verge. Retrieved 20 March 2018.
  46. Hölzle, Urs. "100% renewable is just the beginning". Google. Retrieved 6 December 2017.
  47. Google: Data Centers. "Efficiency: How others can do it". Google. Retrieved 6 December 2017.
  48. Facebook: Sustainability. "Building the most efficient data centers on Earth". Facebook. Retrieved 6 December 2017.
  49. Open Compute Project: About. "About OCP". Open Compute Project. Retrieved 6 December 2017.
  50. Open Compute Project. "Datacenter". Open Compute Project. Archived from the original on 3 December 2017. Retrieved 8 December 2017.

Further reading