Data furnace

The data furnace is a method of heating residential homes or offices by running computers in them and using the considerable waste heat they release. [1] Data furnaces can in theory be cheaper than housing the same computers in large data centers: although electricity costs more in residential areas than in industrial zones, the difference can be offset by charging the homeowner for the heat the servers give off. Some large companies that store and process thousands of gigabytes of data believe data furnaces could be cheaper because they carry little to no overhead. The overhead cost of a traditional data center runs up to around $400 per server, [2] versus around $10 per server for a home data furnace. Individuals had already begun using computers as a heat source by 2011. [2]
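The overhead comparison above is simple arithmetic. As a back-of-the-envelope sketch using the per-server figures cited here (the fleet size and all Python names are hypothetical illustrations, not any real system):

```python
# Rough per-server overhead figures cited in the article (USD).
DATA_CENTER_OVERHEAD = 400.0   # traditional data center, per server
DATA_FURNACE_OVERHEAD = 10.0   # home data furnace, per server

def overhead_savings(num_servers: int) -> float:
    """Overhead saved by hosting servers as data furnaces
    instead of in a traditional data center."""
    return num_servers * (DATA_CENTER_OVERHEAD - DATA_FURNACE_OVERHEAD)

print(overhead_savings(1000))  # 390000.0
```

For a hypothetical fleet of 1,000 servers, the cited figures imply roughly $390,000 less overhead, before accounting for the higher residential electricity price the article notes.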

Usefulness

The first kind of data furnace (DF) would be a low-cost seasonal DF. It would use the home's existing broadband connection to perform delay-tolerant jobs such as content indexing or processing large sets of scientific data, [2] and its servers would power on and compute only when the house needs heat. The second kind would be a low-bandwidth neighborhood DF. Because it runs at all times it can provide faster turnaround on computations, but continuous operation raises the risk of overheating; vents to the outside can be added to the server racks to shed unneeded heat. The third option would be an eco-friendly urban DF. Like the second, it runs year-round and can vent excess heat outside, which would let service providers expand into urban areas more quickly, so long as their applications scale to the number of servers. Year-round operation creates a new challenge, however: the electricity cost of running the servers cannot be offset by billing homeowners for heat, since outside the heating season they will use little to none.
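The seasonal, thermostat-driven DF can be sketched as a tiny scheduler: delay-tolerant jobs queue up, and one runs only when the room falls below the setpoint. All class and method names below are illustrative assumptions, not a real API:

```python
from dataclasses import dataclass, field
from collections import deque

@dataclass
class SeasonalDataFurnace:
    """Sketch of a seasonal data furnace: delay-tolerant batch jobs
    run only while the house is calling for heat."""
    setpoint_c: float = 20.0               # desired room temperature
    queue: deque = field(default_factory=deque)

    def submit(self, job):
        self.queue.append(job)             # delay-tolerant: jobs can wait

    def tick(self, room_temp_c: float):
        """Run one queued job only when the thermostat calls for heat."""
        if room_temp_c < self.setpoint_c and self.queue:
            job = self.queue.popleft()
            return job()                   # the waste heat of this work warms the room
        return None                        # warm enough (or idle): stay off
```

A job such as a content-indexing batch would be submitted once and executed whenever the next cold period arrives, which is why only delay-tolerant workloads suit this first kind of DF.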

Energy requirements

For a data furnace heating water, the water must reach at least 56 °C (133 °F) to prevent the growth of pathogens, while limiting the risk of skin burns. For space-heating radiators, a temperature of 50–60 °C (122–140 °F) is suitable for a radiator with embedded processors, as long as the heating surface is large enough to dissipate the heat. [3]
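The requirement that the heating surface be large enough to dissipate the processors' heat can be illustrated with Newton's law of cooling, Q = h·A·ΔT. The convection coefficient below is an assumed typical value for free convection in air, not a measured figure, and the function is an illustration rather than a design tool:

```python
def radiator_area_m2(power_w: float, surface_temp_c: float,
                     room_temp_c: float = 20.0, h: float = 10.0) -> float:
    """Rough radiator surface area (m^2) needed to dissipate a given
    processor power by convection, using Q = h * A * dT.
    h is assumed ~10 W/(m^2 K), within the usual 5-25 range for
    free convection in air."""
    delta_t = surface_temp_c - room_temp_c
    if delta_t <= 0:
        raise ValueError("surface must be warmer than the room")
    return power_w / (h * delta_t)

# A radiator dissipating 500 W at 55 degC into a 20 degC room:
print(round(radiator_area_m2(500, 55.0), 2))  # 1.43
```

The example suggests why the cited 50–60 °C range pairs with a "significant" surface: at modest surface temperatures, each half-kilowatt of processor heat needs on the order of a square metre of radiator under these assumptions.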

Security

There are security concerns about these servers, since they would sit unmonitored on private property. Unlike traditional data centers, which are monitored constantly, a data furnace should be treated as a highly insecure environment for data storage. For the best security, each server would need a device to detect and prevent tampering. Furthermore, all data on these servers would have to be encrypted so that no one except the person requesting the data has access to it. [4]
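The encrypt-before-storing idea can be sketched as an encrypt-then-MAC scheme: data is encrypted under a key the homeowner never holds, and any tampering with the stored blob is detected on decryption. This is a hand-rolled, standard-library-only illustration of the principle, not production cryptography; a real deployment would use a vetted authenticated cipher such as AES-GCM:

```python
import hashlib
import hmac
import secrets

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Illustrative SHA-256 counter-mode keystream (NOT production crypto)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """Encrypt-then-MAC: only the key holder can read or verify the data."""
    nonce = secrets.token_bytes(16)
    ciphertext = bytes(a ^ b for a, b in
                       zip(plaintext, _keystream(key, nonce, len(plaintext))))
    tag = hmac.new(key, nonce + ciphertext, hashlib.sha256).digest()
    return nonce + ciphertext + tag

def decrypt(key: bytes, blob: bytes) -> bytes:
    """Verify the MAC before decrypting; raise if the blob was tampered with."""
    nonce, ciphertext, tag = blob[:16], blob[16:-32], blob[-32:]
    expected = hmac.new(key, nonce + ciphertext, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("tampering detected")
    return bytes(a ^ b for a, b in
                 zip(ciphertext, _keystream(key, nonce, len(ciphertext))))
```

Under this model, a server in someone's basement stores only opaque, integrity-protected blobs, so a tampered or stolen disk reveals nothing and forged data is rejected.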

Applications

A few companies around the world are commercialising this concept. The German company Cloud&Heat offers hot water heated by a distributed data center installed on customers' premises. [5] The French company Qarnot computing has developed a radiator that heats with embedded processors and sells the resulting computing power. [6]

Major tech companies like Microsoft and Google have begun exploring the integration of data furnaces into their sustainability initiatives to reduce the environmental impact of their data centers. Microsoft's 2024 Environmental Sustainability Report highlights ongoing efforts to incorporate innovative solutions, including data furnaces, to achieve carbon neutrality and improve energy efficiency across its operations. [7]

Related Research Articles

Thin client: Low-performance computer optimized for remote server access

In computer networking, a thin client, sometimes called slim client or lean client, is a simple (low-performance) computer that has been optimized for establishing a remote connection with a server-based computing environment. They are sometimes known as network computers, or in their simplest form as zero clients. The server does most of the work, which can include launching software programs, performing calculations, and storing data. This contrasts with a rich client or a conventional personal computer; the former is also intended for working in a client–server model but has significant local processing power, while the latter aims to perform its function mostly locally.

Server farm: Collection of computer servers

A server farm or server cluster is a collection of computer servers, usually maintained by an organization to supply server functionality far beyond the capability of a single machine. They often consist of thousands of computers which require a large amount of power to run and to keep cool. At the optimum performance level, a server farm has enormous financial and environmental costs. They often include backup servers that can take over the functions of primary servers that may fail. Server farms are typically collocated with the network switches and/or routers that enable communication between different parts of the cluster and the cluster's users. Server "farmers" typically mount computers, routers, power supplies and related electronics on 19-inch racks in a server room or data center.

Data center: Building or room used to house computer servers and related equipment

A data center is a building, a dedicated space within a building, or a group of buildings used to house computer systems and associated components, such as telecommunications and storage systems.

Central heating: Type of heating system

A central heating system provides warmth to a number of spaces within a building from one main source of heat. It is a component of heating, ventilation, and air conditioning systems, which can both cool and warm interior spaces.

Green computing, green IT, or ICT sustainability, is the study and practice of environmentally sustainable computing or IT.

Utility computing, or computer utility, is a service provisioning model in which a service provider makes computing resources and infrastructure management available to the customer as needed, and charges them for specific usage rather than a flat rate. Like other types of on-demand computing, the utility model seeks to maximize the efficient use of resources and/or minimize associated costs. Utility is the packaging of system resources, such as computation, storage and services, as a metered service. This model has the advantage of a low or no initial cost to acquire computer resources; instead, resources are essentially rented.

Software multitenancy is a software architecture in which a single instance of software runs on a server and serves multiple tenants. Systems designed in such manner are "shared". A tenant is a group of users who share a common access with specific privileges to the software instance. With a multitenant architecture, a software application is designed to provide every tenant a dedicated share of the instance—including its data, configuration, user management, tenant individual functionality and non-functional properties. Multitenancy contrasts with multi-instance architectures, where separate software instances operate on behalf of different tenants.

Renewable heat is an application of renewable energy referring to the generation of heat from renewable sources; for example, feeding radiators with water warmed by focused solar radiation rather than by a fossil fuel boiler. Renewable heat technologies include renewable biofuels, solar heating, geothermal heating, heat pumps and heat exchangers. Insulation is almost always an important factor in how renewable heating is implemented.

Cloud storage is a model of computer data storage in which data, said to be on "the cloud", is stored remotely in logical pools and is accessible to users over a network, typically the Internet. The physical storage spans multiple servers, and the physical environment is typically owned and managed by a cloud computing provider. These cloud storage providers are responsible for keeping the data available and accessible, and the physical environment secured, protected, and running. People and organizations buy or lease storage capacity from the providers to store user, organization, or application data.

Infrastructure as a service (IaaS) is a cloud computing service model where a cloud services vendor provides computing resources such as storage, network, servers, and virtualization. This service frees users from maintaining their own data center, but they must install and maintain the operating system and application software. IaaS provides users high-level APIs to control details of underlying network infrastructure such as backup, data partitioning, scaling, security and physical computing resources. Services can be scaled on-demand by the user. According to the Internet Engineering Task Force (IETF), such infrastructure is the most basic cloud-service model. IaaS can be hosted in a public cloud, a private cloud, or a hybrid cloud.

Server room: Room containing computer servers

A server room is a room, usually air-conditioned, devoted to the continuous operation of computer servers. An entire building or station devoted to this purpose is a data center.

Virtualization: Methods for dividing computing resources

In computing, virtualization (v12n) is a series of technologies that allows dividing of physical computing resources into a series of virtual machines, operating systems, processes or containers.

Rackspace Cloud: Cloud computing platform

The Rackspace Cloud is a set of cloud computing products and services billed on a utility computing basis from the US-based company Rackspace. Offerings include Cloud Storage, virtual private server, load balancers, databases, backup, and monitoring.

Cloud computing: Form of shared internet-based computing

"Cloud computing is a paradigm for enabling network access to a scalable and elastic pool of shareable physical or virtual resources with self-service provisioning and administration on-demand." according to ISO.

Dynamic Infrastructure is an information technology concept related to the design of data centers, whereby the underlying hardware and software can respond dynamically and more efficiently to changing levels of demand. In other words, data center assets such as storage and processing power can be provisioned to meet surges in user's needs. The concept has also been referred to as Infrastructure 2.0 and Next Generation Data Center.

Wyse: American computing system manufacturer

Wyse Technology, Inc., or simply Wyse, was an independent American manufacturer of cloud computing systems. Wyse is best remembered for its video terminal line introduced in the 1980s, which competed with the market-leading Digital. It also had a successful line of IBM PC compatible workstations in the mid-to-late 1980s, but starting late in the decade, Wyse was outcompeted by companies such as its eventual parent, Dell. Current products include thin client hardware and software as well as desktop virtualization solutions, along with cloud-software-supporting desktop computers, laptops, and mobile devices. Dell Cloud Client Computing is partnered with IT vendors such as Citrix, IBM, Microsoft, and VMware.

Aquasar: Supercomputer system from IBM Research

Aquasar is a supercomputer prototype created by IBM Labs in collaboration with ETH Zurich in Zürich, Switzerland and ETH Lausanne in Lausanne, Switzerland. While most supercomputers use air as their coolant of choice, Aquasar uses hot water to achieve its high computing efficiency. Alongside the hot-water coolant, an air-cooled section is included so that the cooling efficiency of the two approaches can be compared; the comparison could later be used to help improve the hot-water coolant's performance. The research program was initially described as the "direct use of waste heat from liquid-cooled supercomputers: the path to energy saving, emission-free high performance computers and data centers." The waste heat produced by the cooling system can be recycled into the building's heating system, potentially saving money. Beginning in 2009, the three-year collaborative project was introduced and developed in the interest of saving energy and being environmentally safe while delivering top-tier performance.

Data processing unit: Programmable electronic component

A data processing unit (DPU) is a programmable computer processor that tightly integrates a general-purpose CPU with network interface hardware. Sometimes they are called "IPUs" or "SmartNICs". They can be used in place of traditional NICs to relieve the main CPU of complex networking responsibilities and other "infrastructural" duties; although their features vary, they may be used to perform encryption/decryption, serve as a firewall, handle TCP/IP, process HTTP requests, or even function as a hypervisor or storage controller. These devices can be attractive to cloud computing providers whose servers might otherwise spend a significant amount of CPU time on these tasks, cutting into the cycles they can provide to guests.

Immersion cooling: IT cooling practice

Immersion cooling is an IT cooling practice by which servers are completely or partially immersed in a dielectric fluid that has significantly higher thermal conductivity than air. Heat is removed from the system by putting the coolant in direct contact with hot components, and circulating the heated liquid through heat exchangers. This practice is highly effective as liquid coolants can absorb more heat from the system than air. Immersion cooling has many benefits, including but not limited to: sustainability, performance, reliability, and cost.

Ampere Computing: American fabless semiconductor company

Ampere Computing LLC is an American fabless semiconductor company based in Santa Clara, California that develops processors for servers operating in large scale environments. It was founded in 2017 by Renée James.

References

  1. Jie Liu; Michel Goraczko; Sean James; Christian Belady; Jiakang Lu; Kamin Whitehouse (June 2011). "The Data Furnace: Heating Up with Cloud Computing" (PDF). Microsoft Research. Retrieved 30 December 2013.
  2. Stross, Randall (2011-11-26). "Data Furnaces Could Bring Heat to Homes". The New York Times. Retrieved 30 December 2011.
  3. Pulley, Adam (28 January 2015). "How Hot Does A Data Furnace Heating System Need To Be?" (PDF). Green Processing. Archived from the original (PDF) on 6 July 2018. Retrieved 6 July 2018.
  4. "Worried About Being Cold this Winter? How About Installing a Data Furnace?". 2011-11-28. Retrieved 30 December 2011.
  5. "Germans get free heating from the cloud". DatacenterDynamics. Retrieved 2018-07-06.
  6. Velkova, Julia (1 December 2016). "Data that warms: Waste heat, infrastructural convergence and the computation traffic commodity". Big Data & Society. 3 (2): 205395171668414. doi:10.1177/2053951716684144.
  7. "Microsoft Environmental Sustainability Report 2024". Microsoft On the Issues. 15 May 2024. https://blogs.microsoft.com/on-the-issues/2024/05/15/microsoft-environmental-sustainability-report-2024/