A server room is a room, usually air-conditioned, devoted to the continuous operation of computer servers. An entire building or station devoted to this purpose is a data center.
The computers in server rooms are usually headless systems that can be operated remotely via KVM switch or remote administration software, such as Secure Shell, VNC, and remote desktop. [1] [2] [3] [4] [5]
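As an illustration of such remote administration, the following minimal Python sketch runs a single command on a headless server over Secure Shell. It assumes the third-party paramiko library; the host name, user name and key path are hypothetical placeholders, not values taken from this article.

```python
# Minimal sketch: run one command on a headless server over SSH.
# Host, user and key path below are placeholders.
import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect("server01.example.org", username="admin",
               key_filename="/path/to/private_key")

# Execute a command remotely and read its output.
stdin, stdout, stderr = client.exec_command("uptime")
print(stdout.read().decode())

client.close()
```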
Climate is one of the factors that affects the energy consumption and environmental impact of a server room. In areas where the climate favours cooling and renewable electricity is abundant, the environmental impact is more moderate. Thus, countries with favourable conditions such as Canada, [6] Finland, [7] Sweden, [8] and Switzerland [9] are trying to attract companies to site server rooms there.
Building a server or computer room requires detailed attention to five main design considerations: [10]
Computer or server room location is the first consideration, even before the layout of the room's contents. Most designers agree that, where possible, the computer room should not be built where one of its walls is an exterior wall of the building: exterior walls are often damp and can contain water pipes that could burst and drench the equipment. Avoiding exterior windows removes a security risk and the possibility of breakages. Avoiding top floors and basements reduces the risk of roof leaks and of flooding respectively. Lastly, a server room benefits from a central location because of the horizontal cabling that runs from it to devices in other rooms. If a centralized computer room is not feasible, server closets on each floor may be an option: computer, network and telephone equipment is housed in a closet on each floor it serves, with the closets stacked vertically above one another.
In addition to the hazards of exterior walls, designers need to evaluate any potential sources of interference in proximity to the computer room. Designing such a room means keeping clear of radio transmitters and electrical interference from power plants or lift rooms, etc.
Other physical design considerations range from room size, door sizes and access ramps (to get equipment in and out) to cable organization, physical security and maintenance access.
Computer equipment generates heat and is sensitive to heat, humidity, and dust; server rooms must also meet very high resilience and failover requirements. Maintaining a stable temperature and humidity within tight tolerances is critical to IT system reliability.
In most server rooms "close control air conditioning" systems, also known as PAC (precision air conditioning) systems, are installed. These systems control temperature, humidity and particle filtration within tight tolerances 24 hours a day and can be remotely monitored. They can have built-in automatic alerts when conditions within the server room move outside defined tolerances.
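As a rough illustration of the alerting such systems provide, the sketch below polls temperature and humidity readings and flags values outside a tolerance band. The sensor-reading functions and the tolerance figures are placeholders, not the interface or set points of any particular PAC product.

```python
# Sketch of out-of-tolerance alerting for a server room.
# Sensor functions and tolerance bands are illustrative placeholders.
import time

TEMP_RANGE_C = (18.0, 27.0)        # illustrative tolerance band, not a standard
HUMIDITY_RANGE_PCT = (40.0, 60.0)  # illustrative relative-humidity band

def read_temperature_c() -> float:
    """Placeholder: a real installation would query the PAC unit or a sensor."""
    return 22.5

def read_relative_humidity() -> float:
    """Placeholder: a real installation would query the PAC unit or a sensor."""
    return 50.0

def send_alert(message: str) -> None:
    """Placeholder: in practice an e-mail, SNMP trap or pager notification."""
    print("ALERT:", message)

def check_once() -> None:
    temp, rh = read_temperature_c(), read_relative_humidity()
    if not TEMP_RANGE_C[0] <= temp <= TEMP_RANGE_C[1]:
        send_alert(f"temperature {temp:.1f} C outside {TEMP_RANGE_C}")
    if not HUMIDITY_RANGE_PCT[0] <= rh <= HUMIDITY_RANGE_PCT[1]:
        send_alert(f"relative humidity {rh:.0f}% outside {HUMIDITY_RANGE_PCT}")

if __name__ == "__main__":
    while True:
        check_once()
        time.sleep(60)   # poll once a minute
```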
Air conditioning designs for computer or server rooms vary with the particular design constraints, but they generally take one of two forms: "up-flow" and "down-flow" configurations.
An up-flow unit draws air into the front of the air handling unit (AHU), cools it over the heat exchanger, then distributes the cooled air out through the top or through ductwork. This configuration is well suited to retro-fitted computer rooms where raised floors are of inadequate depth or do not exist at all.
A down-flow unit typically draws air into the top of the air handling unit, cools it over the heat exchanger, then discharges it from the bottom into the floor void. The conditioned air reaches the server room through strategically placed floor grilles and passes on to the equipment racks. These systems are well suited to new buildings whose design can include raised floors deep enough to serve as supply ducting to the computer racks.
Hot aisle / cold aisle configurations reverse the orientation of every other row of racks, so that two rows face each other across one aisle and turn their backs to the next. This prevents the hot exhaust of one row of racks being drawn into the cooling intake of an adjacent row. Air conditioning ducts or vents are located between the facing fronts, since most equipment vents front to rear. A drawback of an unenclosed hot aisle / cold aisle configuration is a significant amount of uncontrolled, or bypass, mixing of hot and cold air outside the equipment. [11]
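The alternating orientation can be pictured with the small sketch below, which prints a schematic of rack rows whose fronts share cold aisles and whose rears share hot aisles; the row count is arbitrary and the diagram is only illustrative.

```python
# Schematic sketch of a hot aisle / cold aisle layout: every other row of
# racks is reversed so fronts face fronts across a cold aisle (cooled air
# supplied) and rears face rears across a hot aisle (exhaust collected).
def print_layout(rows: int = 4) -> None:
    print("COLD AISLE  <- cooled air supplied (vents/grilles)")
    for i in range(rows):
        if i % 2 == 0:
            print(f"[front ... rack row {i + 1} ... rear ]")
            print("HOT AISLE   -> exhaust air returned")
        else:
            print(f"[rear  ... rack row {i + 1} ... front]")
            print("COLD AISLE  <- cooled air supplied (vents/grilles)")

if __name__ == "__main__":
    print_layout()
```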
In an aisle containment configuration, one of the aisles is enclosed with walls, a ceiling and access doors to create an enclosed space. Aisle containment prevents bypass mixing of hot and cold air, forcing all cold-to-hot air transfer to happen inside the equipment. Careful attention is paid to avoiding open rack slots or other airflow leaks, so that the rack fronts form a continuous wall of the contained aisle. [12]
The adoption of liquid cooling technologies has allowed for highly efficient server room designs. Where liquid cooling is applied, a server room can largely dispense with energy-consuming air conditioning: the heat is captured in liquid and can be rejected by a simple, efficient dry cooler.
Another advantage of liquid cooling is the potential for heat reuse. Server rooms are gradually becoming part of building heating systems, either integrated into the same rooms or connected to a building's utility spaces through a water circuit, so that the heating installation can draw on server heat before falling back on other heat sources. Temperature-chaining principles are slowly being adopted to reach temperature levels high enough for such reuse scenarios.
The fire protection system's main goal should be to detect and signal a fire in its early stages, then bring it under control without disrupting the flow of business and without threatening the personnel in the facility. Server room fire suppression technology has been around for as long as there have been server rooms. Traditionally, most computer rooms used Halon gas, but this has been shown to be environmentally harmful (ozone depleting) and unsafe for humans. Modern computer rooms use combinations of inert gases such as nitrogen, argon and carbon dioxide. Other solutions include clean chemical agents such as FM-200, and hypoxic air systems that keep oxygen levels low. To limit fire spreading along data cabling, organizations also use plenum-rated cable insulated with FEP, a fluoropolymer that resists ignition and limits flame and smoke spread. [13]
The demands of server rooms are constantly changing as organizations evolve and grow and as technology changes. An essential part of computer room design is future proofing so that new requirements can be accommodated with minimal effort. As computing requirements grow, so will a server room's power and cooling requirements. As a rough guide, for every additional 100 kW of equipment installed, a further 30 kW of power is required to cool it. As a result, air conditioning designs will need to have scalability designed in from the outset.
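The rule of thumb quoted above amounts to a cooling overhead of roughly 30% of the IT load. A minimal worked example in Python, using only that figure:

```python
# Worked example of the rough guide above: cooling power is taken to be
# about 30% of the IT equipment load it removes.
COOLING_FRACTION = 0.30   # 30 kW of cooling per 100 kW of equipment

def total_power_kw(it_load_kw: float) -> float:
    """Total electrical demand (IT load plus cooling) under the rule of thumb."""
    return it_load_kw * (1 + COOLING_FRACTION)

if __name__ == "__main__":
    for it_kw in (100, 250, 500):
        cooling_kw = it_kw * COOLING_FRACTION
        print(f"{it_kw} kW of equipment -> about {total_power_kw(it_kw):.0f} kW total "
              f"({cooling_kw:.0f} kW of that for cooling)")
```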
The choice of racks in a server room is usually the prime factor when determining space. Many organisations use telco racks or enclosed cabinets to make the most of the space they have. Today, with servers that are one-rack-unit (1U) high and new blade servers, a single 19- or 23-inch rack can accommodate anywhere from 42 to hundreds of servers.
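A quick back-of-the-envelope reading of these figures, in Python; the blade-enclosure height and blade count are illustrative assumptions, not taken from any particular product:

```python
# A full-height rack commonly offers 42 rack units (U) of mounting space.
RACK_HEIGHT_U = 42

# With 1U servers, one server occupies one rack unit.
print(RACK_HEIGHT_U, "1U servers per rack (one per rack unit)")   # -> 42

# Blade servers pack several machines into one enclosure. The figures
# below are illustrative assumptions only.
CHASSIS_HEIGHT_U = 10          # assumed enclosure height
BLADES_PER_CHASSIS = 32        # assumed blade count per enclosure
chassis_per_rack = RACK_HEIGHT_U // CHASSIS_HEIGHT_U               # -> 4
print(chassis_per_rack * BLADES_PER_CHASSIS, "blade servers per rack")  # -> 128
```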
If the computer systems in a server room are mission critical, removing single points of failure and common-mode failures may be of high importance. [14] The level of redundancy required is determined by factors such as whether the organisation can tolerate a brief interruption while failover systems activate, or whether failover must be seamless with no business impact. Beyond redundant computer hardware, the main considerations are failover provisioning for power supplies and for cooling.
A 19-inch rack is a standardized frame or enclosure for mounting multiple electronic equipment modules. Each module has a front panel that is 19 inches (482.6 mm) wide. The 19 inch dimension includes the edges or ears that protrude from each side of the equipment, allowing the module to be fastened to the rack frame with screws or bolts. Common uses include computer servers, telecommunications equipment and networking hardware, audiovisual production gear, music production equipment, and scientific equipment.
A colocation center or "carrier hotel", is a type of data centre where equipment, space, and bandwidth are available for rental to retail customers. Colocation facilities provide space, power, cooling, and physical security for the server, storage, and networking equipment of other firms and also connect them to a variety of telecommunications and network service providers with a minimum of cost and complexity.
A server farm or server cluster is a collection of computer servers, usually maintained by an organization to supply server functionality far beyond the capability of a single machine. Server farms often consist of thousands of computers, which require a large amount of power to run and to keep cool; even at optimal performance they carry enormous financial and environmental costs. They often include backup servers that can take over the functions of primary servers that fail. Server farms are typically co-located with the network switches and/or routers that enable communication between the different parts of the cluster and the cluster's users. Server "farmers" typically mount the computers, routers, power supplies and related electronics on 19-inch racks in a server room or data center.
A data center or data centre is a building, a dedicated space within a building, or a group of buildings used to house computer systems and associated components, such as telecommunications and storage systems.
A central heating system provides warmth to a number of spaces within a building from one main source of heat. It is a component of heating, ventilation, and air conditioning systems, which can both cool and warm interior spaces.
Failover is switching to a redundant or standby computer server, system, hardware component or network upon the failure or abnormal termination of the previously active application, server, system, hardware component, or network in a computer network. Failover and switchover are essentially the same operation, except that failover is automatic and usually operates without warning, while switchover requires human intervention.
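A minimal sketch of the automatic case described above: a monitor checks the active server and promotes the standby when the check fails repeatedly. The health-check and promotion functions and the host names are hypothetical placeholders, not the API of any particular failover product.

```python
# Sketch of automatic failover: promote the standby after repeated failed
# health checks on the active server. All names below are placeholders.
import time

PRIMARY, STANDBY = "server-a.example", "server-b.example"
FAILURES_BEFORE_FAILOVER = 3

def check_health(host: str) -> bool:
    """Placeholder health check; in practice a ping, TCP connect or app probe."""
    return True

def promote(host: str) -> None:
    """Placeholder for redirecting traffic (e.g. moving a virtual IP) to host."""
    print(f"failing over: {host} is now active")

def monitor() -> None:
    active, standby = PRIMARY, STANDBY
    failures = 0
    while True:
        if check_health(active):
            failures = 0
        else:
            failures += 1
            if failures >= FAILURES_BEFORE_FAILOVER:
                promote(standby)                   # automatic, no human step
                active, standby = standby, active
                failures = 0
        time.sleep(5)                              # check every few seconds

if __name__ == "__main__":
    monitor()
```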
Computer cooling is required to remove the waste heat produced by computer components, to keep components within permissible operating temperature limits. Components that are susceptible to temporary malfunction or permanent failure if overheated include integrated circuits such as central processing units (CPUs), chipsets, graphics cards, hard disk drives, and solid state drives.
A blade server is a stripped-down server computer with a modular design optimized to minimize the use of physical space and energy. Blade servers have many components removed to save space, minimize power consumption and other considerations, while still having all the functional components to be considered a computer. Unlike a rack-mount server, a blade server fits inside a blade enclosure, which can hold multiple blade servers, providing services such as power, cooling, networking, various interconnects and management. Together, blades and the blade enclosure form a blade system, which may itself be rack-mounted. Different blade providers have differing principles regarding what to include in the blade itself, and in the blade system as a whole.
A raised floor provides an elevated structural floor above a solid substrate to create a hidden void for the passage of mechanical and electrical services. Raised floors are widely used in modern office buildings, and in specialized areas such as command centers, Information technology data centers and computer rooms, where there is a requirement to route mechanical services and cables, wiring, and electrical supply. Such flooring can be installed at varying heights from 2 inches (51 mm) to heights above 4 feet (1.2 m) to suit services that may be accommodated beneath. Additional structural support and lighting are often provided when a floor is raised enough for a person to crawl or even walk beneath.
A technical room or equipment room is a room where technical equipment has been installed, for example for controlling a building's climate, electricity, water and wastewater. The equipment can include electric panels, central heating, heat network, machinery for ventilation systems, air conditioning, various types of pumps and boilers, as well as telecommunications equipment. It can serve one or more housing units or buildings.
All electronic devices and circuitry generate excess heat and thus require thermal management to improve reliability and prevent premature failure. If there are no other energy interactions, the heat output equals the electrical power input. There are several techniques for cooling, including various styles of heat sinks, thermoelectric coolers, forced-air systems and fans, heat pipes, and others. In extremely low ambient temperatures, it may actually be necessary to heat the electronic components to achieve satisfactory operation.
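A short worked example of that equality: a server drawing 500 W of electrical power must have 500 W of heat removed. The airflow estimate below uses the standard sensible-heat relation Q = m·cp·ΔT with typical air properties; the 12 K temperature rise across the server is an assumed figure for illustration only.

```python
# Heat output equals electrical power input when no other energy leaves the
# device; estimate the airflow needed to carry that heat away.
POWER_IN_W = 500.0    # electrical input = heat to remove (W)
CP_AIR = 1005.0       # specific heat of air, J/(kg*K)
RHO_AIR = 1.2         # air density near room conditions, kg/m^3
DELTA_T = 12.0        # assumed air temperature rise across the server, K

mass_flow = POWER_IN_W / (CP_AIR * DELTA_T)   # kg/s of air needed
volume_flow = mass_flow / RHO_AIR             # m^3/s

print(f"heat to remove: {POWER_IN_W:.0f} W")
print(f"required airflow: {mass_flow * 1000:.1f} g/s "
      f"= {volume_flow * 1000:.0f} L/s")      # roughly 41.5 g/s, about 35 L/s
```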
A fan coil unit (FCU), also known as a Vertical Fan Coil Unit (VFC), is a device consisting of a heat exchanger (coil) and a fan. FCUs are commonly used in HVAC systems of residential, commercial, and industrial buildings that use ducted split air conditioning or central plant cooling. FCUs are typically connected to ductwork and a thermostat to regulate the temperature of one or more spaces and to assist the main air handling unit for each space if used with chillers. The thermostat controls the fan speed and/or the flow of water or refrigerant to the heat exchanger using a control valve.
Liebert Corporation was a global manufacturer of power, precision cooling and infrastructure management systems for mainframe computers, server racks, and critical process systems, headquartered in Westerville, Ohio. Founded in 1965, the company employed more than 1,800 people across 12 manufacturing plants worldwide. Since 2016, Liebert has been a subsidiary of Vertiv.
HVAC is a major sub-discipline of mechanical engineering. The goal of HVAC design is to balance indoor environmental comfort with other factors such as installation cost, ease of maintenance, and energy efficiency. The discipline of HVAC includes a large number of specialized terms and acronyms.
Underfloor air distribution (UFAD) is an air distribution strategy for providing ventilation and space conditioning in buildings as part of the design of a HVAC system. UFAD systems use an underfloor supply plenum located between the structural concrete slab and a raised floor system to supply conditioned air to supply outlets, located at or near floor level within the occupied space. Air returns from the room at ceiling level or the maximum allowable height above the occupied zone.
The HP Performance Optimized Datacenter (POD) is a range of three modular data centers manufactured by HP.
A register is a grille with moving parts, capable of being opened and closed and the air flow directed, which is part of a building's heating, ventilation, and air conditioning (HVAC) system. The placement and size of registers is critical to HVAC efficiency. Register dampers are also important, and can serve a safety function.
Immersion cooling is an IT cooling practice by which complete servers are immersed in a dielectric, electrically non-conductive fluid that has significantly higher thermal conductivity than air. Heat is removed from a system by putting the coolant in direct contact with hot components, and circulating the heated liquid through heat exchangers. This practice is highly effective because liquid coolants can absorb more heat from the system, and are more easily circulated through the system, than air. Immersion cooling has many benefits, including but not limited to: sustainability, performance, reliability and cost.
Close coupled cooling is a recent generation of cooling systems used particularly in data centers. The goal of close coupled cooling is to bring heat transfer as close as possible to its source: the equipment rack. Moving the air conditioner closer to the equipment rack gives a more precise delivery of inlet air and a more immediate capture of exhaust air.
A green data center, or sustainable data center, is a service facility which utilizes energy-efficient technologies. It avoids obsolete systems and takes advantage of newer, more efficient technologies.