

Main article: "Modular data center

Modularity and flexibility are key elements in allowing for a data center to grow and change over time. Data center modules are pre-engineered, standardized building blocks that can be easily configured and moved as needed.[32]

A modular data center may consist of data center equipment contained within shipping containers or similar portable containers.[33] But it can also be described as a design style in which components of the data center are prefabricated and standardized so that they can be constructed, moved or added to quickly as needs change.[34]

Environmental control

Main article: Data center environmental control

The physical environment of a data center is rigorously controlled. Air conditioning is used to control the temperature and humidity in the data center. ASHRAE's "Thermal Guidelines for Data Processing Environments"[35] recommends a temperature range of 18–27 °C (64–81 °F), a dew point range of −9 to 15 °C (16 to 59 °F), and ideal relative humidity of 60%, with an allowable range of 40% to 60% for data center environments.[36] The temperature in a data center will naturally rise because the electrical power used heats the air. Unless the heat is removed, the ambient temperature will rise, resulting in electronic equipment malfunction. By controlling the air temperature, the server components at the board level are kept within the manufacturer's specified temperature/humidity range. Air conditioning systems help control humidity by cooling the return space air below the dew point. Too much humidity, and water may begin to condense on internal components. Too little humidity can result in static electricity discharge problems that damage components, so in a dry atmosphere ancillary humidification systems may add water vapor. Subterranean data centers may keep computer equipment cool while expending less energy than conventional designs.
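
The recommended envelope above lends itself to simple automated checks. The following Python sketch is not from the article: the threshold values simply restate the ASHRAE figures quoted above, and the sensor field names are hypothetical.

```python
# Illustrative check of measured conditions against the ASHRAE recommended
# ranges quoted in the text. Field names and sample readings are assumptions.

RECOMMENDED = {
    "temperature_c": (18.0, 27.0),           # 18-27 degrees C
    "dew_point_c": (-9.0, 15.0),             # -9 to 15 degrees C
    "relative_humidity_pct": (40.0, 60.0),   # allowable RH range
}

def out_of_range(readings):
    """Return the readings that fall outside the recommended envelope."""
    violations = {}
    for name, (low, high) in RECOMMENDED.items():
        value = readings.get(name)
        if value is not None and not (low <= value <= high):
            violations[name] = value
    return violations

print(out_of_range({"temperature_c": 29.5, "dew_point_c": 10.0,
                    "relative_humidity_pct": 55.0}))
# -> {'temperature_c': 29.5}
```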

Modern data centers try to use economizer cooling, where they use outside air to keep the data center cool. At least one data center (located in Upstate New York) cools servers using outside air during the winter, doing without chillers or air conditioners and creating potential energy savings in the millions.[37] Indirect air cooling is increasingly being deployed in data centers globally; it has the advantage of more efficient cooling, which lowers power consumption costs. Many newly constructed data centers also use indirect evaporative cooling (IDEC) units, as well as other environmental features such as sea water, to minimize the amount of energy needed to cool the space.

Telcordia GR-2930, NEBS: Raised Floor Generic Requirements for Network and Data Centers, presents generic engineering requirements for raised floors that fall within the strict NEBS guidelines.

There are many types of commercially available floors that offer a wide range of structural strength and loading capabilities, depending on component construction and the materials used. The general types of raised floors include stringer, stringerless, and structural platforms, all of which are discussed in detail in GR-2930 and summarized below.

Data centers typically have raised flooring made up of 60 cm (2 ft) removable square tiles. The trend is toward 80–100 cm (31–39 in) underfloor voids to provide better and more uniform air distribution. These floors provide a plenum for air to circulate below the floor, as part of the air conditioning system, as well as space for power cabling.

Metal whiskers

Raised floors and other metal structures such as cable trays and ventilation ducts have caused many problems with zinc whiskers in the past, and these are likely still present in many data centers. Whiskers occur when microscopic metallic filaments form on metals such as zinc or tin that protect many metal structures and electronic components from corrosion. Maintenance on a raised floor or installation of cable can dislodge the whiskers, which enter the airflow and may short-circuit server components or power supplies, sometimes through a high-current metal vapor plasma arc. This phenomenon is not unique to data centers; it has also caused catastrophic failures of satellites and military hardware.[38]

Electrical power

""
""
A bank of batteries in a large data center, used to provide power until diesel generators can start

Backup power consists of one or more uninterruptible power supplies, battery banks, and/or diesel or gas turbine generators.[39]

To prevent "single points of failure, all elements of the electrical systems, including backup systems, are typically fully duplicated, and critical servers are connected to both the "A-side" and "B-side" power feeds. This arrangement is often made to achieve "N+1 redundancy in the systems. "Static transfer switches are sometimes used to ensure instantaneous switchover from one supply to the other in the event of a power failure.

Low-voltage cable routing

Data cabling is typically routed through overhead cable trays in modern data centers, but some[who?] still recommend under-floor cabling for security reasons, and so that cooling systems can be added above the racks if necessary. Smaller or less expensive data centers without raised flooring may use anti-static tiles for a flooring surface. Computer cabinets are often organized into a hot aisle arrangement to maximize airflow efficiency.

Fire protection

""
""
"FM200 Fire Suppression Tanks

Data centers feature fire protection systems, including passive and active design elements, as well as fire prevention programs in operations. Smoke detectors are usually installed to provide early warning of a fire at its incipient stage. This allows investigation, interruption of power, and manual fire suppression using hand-held fire extinguishers before the fire grows to a large size. An active fire protection system, such as a fire sprinkler system or a clean agent gaseous fire suppression system, is often provided to control a full-scale fire if it develops. High-sensitivity smoke detectors, such as aspirating smoke detectors, which activate clean agent gaseous fire suppression systems, respond earlier than fire sprinklers.

Passive fire protection elements include the installation of fire walls around the data center, so a fire can be restricted to a portion of the facility for a limited time in the event of the failure of the active fire protection systems. Fire wall penetrations into the server room, such as cable penetrations, coolant line penetrations and air ducts, must be provided with fire-rated penetration assemblies, such as fire stopping.

Security

Physical security also plays a large role in data centers. Physical access to the site is usually restricted to selected personnel, with controls including a layered security system often starting with fencing, bollards and mantraps.[40] Video camera surveillance and permanent security guards are almost always present if the data center is large or contains sensitive information on any of the systems within. The use of fingerprint-recognition mantraps is becoming commonplace.

Energy use

Main article: IT energy management

Energy use is a central issue for data centers. Power draw ranges from a few kW for a rack of servers in a closet to several tens of MW for large facilities. Some facilities have power densities more than 100 times that of a typical office building.[41] For higher power density facilities, electricity costs are a dominant operating expense and account for over 10% of the total cost of ownership (TCO) of a data center.[42] By 2012 the cost of power for the data center was expected to exceed the cost of the original capital investment.[43]
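
A back-of-the-envelope calculation shows why electricity dominates operating expense. The IT load, tariff, and PUE figures below are assumptions chosen only for illustration.

```python
# Hypothetical annual electricity cost for a mid-sized facility.
it_load_kw = 1_000          # assumed IT equipment draw
pue = 2.0                   # average US facility, per the text
tariff_per_kwh = 0.10       # assumed electricity price, USD

facility_kw = it_load_kw * pue
annual_kwh = facility_kw * 24 * 365
annual_cost = annual_kwh * tariff_per_kwh
print(f"Annual electricity cost: ${annual_cost:,.0f}")  # roughly $1.75M per year
```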

Greenhouse gas emissions

In 2007 the entire information and communication technologies (ICT) sector was estimated to be responsible for roughly 2% of global carbon emissions, with data centers accounting for 14% of the ICT footprint.[44] The US EPA estimates that servers and data centers were responsible for up to 1.5% of total US electricity consumption,[45] or roughly 0.5% of US GHG emissions,[46] for 2007. Given a business-as-usual scenario, greenhouse gas emissions from data centers are projected to more than double from 2007 levels by 2020.[44]

Siting is one of the factors that affect the energy consumption and environmental effects of a data center. In areas where the climate favors cooling and plenty of renewable electricity is available, the environmental effects will be more moderate. Countries with such favorable conditions, including Canada,[47] Finland,[48] Sweden,[49] Norway[50] and Switzerland,[51] are therefore trying to attract cloud computing data centers.

An 18-month investigation by scholars at Rice University's Baker Institute for Public Policy in Houston and the Institute for Sustainable and Applied Infodynamics in Singapore concluded that data center-related emissions will more than triple by 2020.[52]

Energy efficiency

The most commonly used metric to determine the energy efficiency of a data center is power usage effectiveness (PUE). This simple ratio is the total power entering the data center divided by the power used by the IT equipment.

Total facility power consists of power used by IT equipment plus any overhead power consumed by anything that is not considered a computing or data communication device (i.e. cooling, lighting, etc.). An ideal PUE is 1.0 for the hypothetical situation of zero overhead power. The average data center in the US has a PUE of 2.0,[45] meaning that the facility uses two watts of total power (overhead + IT equipment) for every watt delivered to IT equipment. State-of-the-art data center energy efficiency is estimated to be roughly 1.2.[53] Some large data center operators like Microsoft and Yahoo! have published projections of PUE for facilities in development; Google publishes quarterly actual efficiency performance from data centers in operation.[54]
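
As a minimal sketch of the ratio just described, the following computes PUE from hypothetical meter readings; 2.0 corresponds to the reported US average and 1.2 to the state-of-the-art estimate.

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness = total facility power / IT equipment power."""
    return total_facility_kw / it_equipment_kw

print(pue(total_facility_kw=2000.0, it_equipment_kw=1000.0))  # 2.0, reported US average
print(pue(total_facility_kw=1200.0, it_equipment_kw=1000.0))  # 1.2, state of the art
```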

The "U.S. Environmental Protection Agency has an "Energy Star rating for standalone or large data centers. To qualify for the ecolabel, a data center must be within the top quartile of energy efficiency of all reported facilities.[55]

The European Union also has a similar initiative: the EU Code of Conduct for Data Centres.[56]

Energy use analysis

Often, the first step toward curbing energy use in a data center is to understand how energy is being used. Multiple types of analysis exist to measure data center energy use. Aspects measured include not just energy used by IT equipment itself, but also energy used by the data center facility equipment, such as chillers and fans.[57]

Power and cooling analysis

Power is the largest recurring cost to the user of a data center.[58] A power and cooling analysis, also referred to as a thermal assessment, measures the relative temperatures in specific areas as well as the capacity of the cooling systems to handle specific ambient temperatures.[59] A power and cooling analysis can help to identify hot spots, over-cooled areas that can handle greater power use density, the breakpoint of equipment loading, the effectiveness of a raised-floor strategy, and optimal equipment positioning (such as AC units) to balance temperatures across the data center. Power cooling density is a measure of how much square footage the center can cool at maximum capacity.[60]
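
A thermal assessment of this kind boils down to a few simple quantities. The sketch below computes a cooling density figure and flags hot and over-cooled zones against the ASHRAE limits quoted earlier; all zone names, temperatures, and capacities are hypothetical.

```python
# Cooling density: watts of cooling per square foot of raised floor.
cooling_capacity_kw = 500.0
raised_floor_sqft = 10_000.0
print(f"Cooling density: {cooling_capacity_kw * 1000 / raised_floor_sqft:.0f} W/sq ft")

# Hot-spot and over-cooled-zone identification from spot temperature readings.
zone_temps_c = {"A1": 22.1, "A2": 24.0, "B1": 29.3, "B2": 18.5}
hot_spots = [z for z, t in zone_temps_c.items() if t > 27.0]    # above recommended max
over_cooled = [z for z, t in zone_temps_c.items() if t < 20.0]  # candidates for denser loading
print("Hot spots:", hot_spots, "Over-cooled:", over_cooled)
```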

Energy efficiency analysis

An energy efficiency analysis measures the energy use of data center IT and facilities equipment. A typical energy efficiency analysis measures factors such as a data center's power use effectiveness (PUE) against industry standards, identifies mechanical and electrical sources of inefficiency, and identifies air-management metrics.[61]

Computational fluid dynamics (CFD) analysis

Main article: Computational fluid dynamics

This type of analysis uses sophisticated tools and techniques to understand the unique thermal conditions present in each data center, predicting the temperature, airflow, and pressure behavior of the facility to assess performance and energy consumption, using numerical modeling.[62] By predicting the effects of these environmental conditions, CFD analysis can be used to predict the impact of high-density racks mixed with low-density racks[63] and the onward impact on cooling resources, poor infrastructure management practices, and AC failure or AC shutdown for scheduled maintenance.

Thermal zone mapping

Thermal zone mapping uses sensors and computer modeling to create a three-dimensional image of the hot and cool zones in a data center.[64]

This information can help to identify optimal positioning of data center equipment. For example, critical servers might be placed in a cool zone that is serviced by redundant AC units.
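
Conceptually, a thermal zone map is sensor readings indexed by position. The toy sketch below shows how such a map could be queried for the coolest zone when placing critical servers; the grid coordinates and temperatures are invented for illustration.

```python
# Sensor readings keyed by a coarse (row, aisle, height) grid position, in degrees C.
readings = {
    (0, 0, 0): 21.5, (0, 1, 0): 23.0,
    (1, 0, 1): 26.4, (1, 1, 1): 19.8,
}

coolest_zone = min(readings, key=readings.get)
print("Place critical servers near grid cell", coolest_zone,
      "at", readings[coolest_zone], "degrees C")
```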

Green data centers

""
""
This water-cooled data center in the Port of Strasbourg, France claims the attribute green.

Data centers use a lot of power, consumed by two main usages: the power required to run the actual equipment and the power required to cool the equipment. The first category is addressed by designing computers and storage systems that are increasingly power-efficient.[2] To bring down cooling costs, data center designers try to use natural ways to cool the equipment. Many data centers are located near good fiber connectivity, power grid connections and concentrations of people to manage the equipment, but there are also circumstances where a data center can be miles away from its users and does not need much local management. Examples of this are the "mass" data centers such as those of Google or Facebook: these are built around many standardized servers and storage arrays, and the actual users of the systems are located all around the world. After the initial build of a data center, the staff numbers required to keep it running are often relatively low, especially for data centers that provide mass storage or computing power and do not need to be near population centers. Data centers in arctic locations, where outside air provides all cooling, are becoming more popular as cooling and electricity are the two main variable cost components.[65]

Network infrastructure

""
""
An example of "rack mounted" servers

Communications in data centers today are most often based on networks running the IP protocol suite. Data centers contain a set of routers and switches that transport traffic between the servers and to the outside world. Redundancy of the Internet connection is often provided by using two or more upstream service providers (see Multihoming).

Some of the servers at the data center are used for running the basic Internet and intranet services needed by internal users in the organization, e.g., e-mail servers, proxy servers, and DNS servers.

Network security elements are also usually deployed: firewalls, VPN gateways, intrusion detection systems, etc. Also common are monitoring systems for the network and some of the applications. Additional off-site monitoring systems are also typical, in case of a failure of communications inside the data center.

Data center infrastructure management

"Data center infrastructure management (DCIM) is the integration of information technology (IT) and facility management disciplines to centralize monitoring, management and intelligent capacity planning of a data center's critical systems. Achieved through the implementation of specialized software, hardware and sensors, DCIM enables common, real-time monitoring and management platform for all interdependent systems across IT and facility infrastructures.

Depending on the type of implementation, DCIM products can help data center managers identify and eliminate sources of risk to increase availability of critical IT systems. DCIM products also can be used to identify interdependencies between facility and IT infrastructures to alert the facility manager to gaps in system redundancy, and provide dynamic, holistic benchmarks on power consumption and efficiency to measure the effectiveness of "green IT" initiatives.

It is important to measure and understand data center efficiency metrics. A lot of the discussion in this area has focused on energy issues, but other metrics beyond PUE can give a more detailed picture of data center operations. Server, storage, and staff utilization metrics can contribute to a more complete view of an enterprise data center. In many cases, disk capacity goes unused, and in many instances organizations run their servers at 20% utilization or less.[66] More effective automation tools can also increase the number of servers or virtual machines that a single administrator can handle.
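
The utilization metrics mentioned above can be derived from a simple inventory. The sketch below computes average CPU utilization, the share of disk capacity in use, and consolidation candidates at or below the 20% threshold cited; the sample servers and figures are made up.

```python
servers = [
    {"name": "app-01", "cpu_util_pct": 12, "disk_used_tb": 0.8, "disk_total_tb": 4.0},
    {"name": "app-02", "cpu_util_pct": 55, "disk_used_tb": 2.5, "disk_total_tb": 4.0},
    {"name": "db-01",  "cpu_util_pct": 18, "disk_used_tb": 1.0, "disk_total_tb": 8.0},
]

avg_cpu = sum(s["cpu_util_pct"] for s in servers) / len(servers)
disk_used = sum(s["disk_used_tb"] for s in servers)
disk_total = sum(s["disk_total_tb"] for s in servers)
underused = [s["name"] for s in servers if s["cpu_util_pct"] <= 20]

print(f"Average CPU utilization: {avg_cpu:.0f}%")
print(f"Disk capacity in use: {disk_used / disk_total:.0%}")
print("Consolidation candidates:", underused)
```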

DCIM providers are increasingly linking with computational fluid dynamics providers to predict complex airflow patterns in the data center. The CFD component is necessary to quantify the impact of planned future changes on cooling resilience, capacity and efficiency.[67]

Managing the capacity of a data center

""
""
Capacity of a data center: life cycle

Several parameters may limit the capacity of a data center. For long-term usage, the main limitations are available area and then available power. In the first stage of its life cycle, a data center will see its occupied space grow more rapidly than its consumed energy. With the constant densification of new IT technologies, the need for energy becomes dominant, equaling and then overtaking the need for area (the second and third phases of the cycle). The development and multiplication of connected objects and the growing needs for storage and data processing lead data centers to grow ever more rapidly. It is therefore important to define a data center strategy before being cornered: the decision, design and building cycle lasts several years, so it is imperative to begin this strategic consideration once the data center reaches about 50% of its power capacity. Maximum occupation of a data center should be stabilized around 85%, whether in power or in occupied area. Resources managed in this way leave a rotation zone for hardware replacement and allow the temporary cohabitation of old and new generations of equipment. If this limit is exceeded for a long period, it becomes impossible to carry out hardware replacements, which invariably leads to smothering the information system. The data center is a resource of the information system in its own right, with its own constraints of time and management (a life span of around 25 years), and it therefore needs to be taken into account in medium-term information system planning (3 to 5 years).
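
The 50% planning trigger and 85% occupation ceiling described above translate directly into a simple capacity check, sketched below with hypothetical facility figures.

```python
def capacity_status(used_kw: float, total_kw: float) -> str:
    """Classify a facility against the thresholds described in the text."""
    ratio = used_kw / total_kw
    if ratio >= 0.85:
        return "at ceiling: no headroom left for hardware-replacement rotation"
    if ratio >= 0.50:
        return "start planning the next data center (multi-year build cycle)"
    return "within comfortable capacity"

print(capacity_status(used_kw=1_300, total_kw=2_000))  # 65% -> start planning
```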

Applications

The main purpose of a data center is to run the IT systems and applications that handle the core business and operational data of the organization. Such systems may be proprietary and developed internally by the organization, or bought from enterprise software vendors. Common applications of this kind are ERP and CRM systems.

A data center may be concerned with just operations architecture, or it may provide other services as well.

Often these applications will be composed of multiple hosts, each running a single component. Common components of such applications are databases, file servers, application servers, middleware, and various others.

Data centers are also used for off-site backups. Companies may subscribe to backup services provided by a data center, often in conjunction with backup tapes. Backups can be taken off servers locally onto tapes. However, tapes stored on site pose a security threat and are also susceptible to fire and flooding. Larger companies may also send their backups off site for added security; this can be done by backing up to a data center. Encrypted backups can be sent over the Internet to another data center where they can be stored securely.
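
As a hedged sketch of the last point, the example below encrypts a backup payload before it is shipped to a remote data center. It assumes the third-party Python cryptography package, which the article does not name; the payload is a stand-in for a real backup archive.

```python
from cryptography.fernet import Fernet  # third-party package, an assumption

key = Fernet.generate_key()        # store this key separately from the backup
cipher = Fernet(key)

payload = b"example backup contents"      # stand-in for a real backup archive
token = cipher.encrypt(payload)           # ciphertext safe to send off site
assert cipher.decrypt(token) == payload   # unreadable without the key
```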

For quick deployment or disaster recovery, several large hardware vendors have developed mobile/modular solutions that can be installed and made operational in a very short time.

""
""
A modular data center connected to the power grid at a utility substation

US wholesale and retail colocation providers

According to data provided in the third quarter of 2013 by Synergy Research Group, "the scale of the wholesale colocation market in the United States is very significant relative to the retail market, with Q3 wholesale revenues reaching almost $700 million. Digital Realty Trust is the wholesale market leader, followed at a distance by DuPont Fabros." Synergy Research also described the US colocation market as the most mature and well-developed in the world, based on revenue and the continued adoption of cloud infrastructure services.

Estimates from Synergy Research Group's Q3 2013 data.[78]

Rank  Company name              US market share
1     Various providers         34%
2     Equinix                   18%
3     CenturyLink-Savvis        8%
4     SunGard                   5%
5     AT&T                      5%
6     Verizon                   5%
7     Telx                      4%
8     CyrusOne                  4%
9     Level 3 Communications    3%
10    Internap                  2%

See also

References

  1. ^ James Glanz (September 22, 2012). "Power, Pollution and the Internet". The New York Times. Retrieved 2012-09-25. 
  2. ^ a b "Power Management Techniques for Data Centers: A Survey", 2014.
  3. ^ TIA-942 Telecommunications Infrastructure Standard for Data Centers
  4. ^ "Archived copy". Archived from the original on 2011-11-06. Retrieved 2011-11-07. 
  5. ^ GR-3160, NEBS Requirements for Telecommunications Data Center Equipment and Spaces
  6. ^ Kasacavage, Victor (2002). Complete book of remote access: connectivity and security. The Auerbach Best Practices Series. CRC Press. p. 227. ISBN 0-8493-1253-1. 
  7. ^ Burkey, Roxanne E.; Breakfield, Charles V. (2000). Designing a total data solution: technology, implementation and deployment. Auerbach Best Practices. CRC Press. p. 24. ISBN 0-8493-0893-3. 
  8. ^ a b Mukhar, Nicholas. "HP Updates Data Center Transformation Solutions," August 17, 2011 [1]
  9. ^ "Sperling, Ed. "Next-Generation Data Centers," Forbes, March 15. 2010". Forbes.com. Retrieved 2013-08-30. 
  10. ^ Niccolai, James. "Data Centers Turn to Outsourcing to Meet Capacity Needs," CIO.com, May 10, 2011 [2]
  11. ^ Tang, Helen. "Three Signs it's time to transform your data center," August 3, 2010, Data Center Knowledge [3]
  12. ^ a b Miller, Rich. "Complexity: Growing Data Center Challenge," Data Center Knowledge, May 16, 2007 [4]
  13. ^ Sims, David. "Carousel's Expert Walks Through Major Benefits of Virtualization," TMC Net, July 6, 2010 [5]
  14. ^ Delahunty, Stephen. "The New urgency for Server Virtualization," InformationWeek, August 15, 2011. [6]
  15. ^ "HVD: the cloud's silver lining" (PDF). Intrinsic Technology. Retrieved 2012-08-30. 
  16. ^ Miller, Rich. "Gartner: Virtualization Disrupts Server Vendors," Data Center Knowledge, December 2, 2008 [7]
  17. ^ Ritter, Ted. Nemertes Research, "Securing the Data-Center Transformation Aligning Security and Data-Center Dynamics," [8]
  18. ^ "TELECOMMUNICATIONS INFRASTRUCTURE STANDARD FOR DATA CENTERS". ihs.com. 2005-04-12. Retrieved 2017-02-28. 
  19. ^ A document from the Uptime Institute describing the different tiers (click through the download page) "Data Center Site Infrastructure Tier Standard: Topology". Uptime Institute. 2010-02-13. Archived from the original (PDF) on 2010-06-13. Retrieved 2010-02-13. 
  20. ^ The rating guidelines from the Uptime Institute "Data Center Site Infrastructure Tier Standard: Topology" (PDF). Uptime Institute. 2010-02-13. Archived from the original (PDF) on 2009-10-07. Retrieved 2010-02-13. 
  21. ^ "Uptime Institute - Tier Certification". uptimeinstitute.com. Retrieved 2014-08-27. 
  22. ^ "Google Container Datacenter Tour (video)". 
  23. ^ "Walking the talk: Microsoft builds first major container-based data center". Archived from the original on 2008-06-12. Retrieved 2008-09-22. 
  24. ^ Cherry, Edith. "Architectural Programming: Introduction", Whole Building Design Guide, Sept. 2, 2009
  25. ^ Mullins, Robert. "Romonet Offers Predictive Modelling Tool For Data Center Planning", Network Computing, June 29, 2011 [9]
  26. ^ a b Jew, Jonathan. "BICSI Data Center Standard: A Resource for Today's Data Center Operators and Designers," BICSI News Magazine, May/June 2010, page 28. [10]
  27. ^ Data Center Energy Management: Best Practices Checklist: Mechanical, Lawrence Berkeley National Laboratory [11]
  28. ^ Clark, Jeff. "Hedging Your Data Center Power", The Data Center Journal, Oct. 5, 2011. [12]
  29. ^ Jew, Jonathan. "BICSI Data Center Standard: A Resource for Today's Data Center Operators and Designers," BICSI News Magazine, May/June 2010, page 30. [13]
  30. ^ Clark, Jeffrey. "The Price of Data Center Availability—How much availability do you need?", Oct. 12, 2011, The Data Center Journal [14]
  31. ^ Tucci, Linda. "Five tips on selecting a data center location", May 7, 2008, SearchCIO.com [15]
  32. ^ Niles, Susan. "Standardization and Modularity in Data Center Physical Infrastructure," 2011, Schneider Electric, page 4. [16]
  33. ^ Pitchaikani, Bala. "Strategies for the Containerized Data Center," DataCenterKnowledge.com, Sept. 8, 2011. [17]
  34. ^ Niccolai, James. "HP says prefab data center cuts costs in half," InfoWorld, July 27, 2010. [18]
  35. ^ ASHRAE Technical Committee 9.9, Mission Critical Facilities, Technology Spaces and Electronic Equipment (2012). Thermal Guidelines for Data Processing Environments (3 ed.). American Society of Heating, Refrigerating and Air-Conditioning Engineers. ISBN 978-1936504-33-6. 
  36. ^ ServersCheck. "Best Practices for data center monitoring and server room monitoring". Retrieved 2016-10-07. 
  37. ^ "tw telecom and NYSERDA Announce Co-location Expansion". Reuters. 2009-09-14. 
  38. ^ "NASA - metal whiskers research". NASA. Retrieved 2011-08-01. 
  39. ^ Detailed explanation of UPS topologies "EVALUATING THE ECONOMIC IMPACT OF UPS TECHNOLOGY" (PDF). Archived from the original (PDF) on 2010-11-22. 
  40. ^ Sarah D. Scalet (2005-11-01). "19 Ways to Build Physical Security Into a Data Center". Csoonline.com. Retrieved 2013-08-30. 
  41. ^ "Data Center Energy Consumption Trends". U.S. Department of Energy. Retrieved 2010-06-10. 
  42. ^ J Koomey, C. Belady, M. Patterson, A. Santos, K.D. Lange. Assessing Trends Over Time in Performance, Costs, and Energy Use for Servers Released on the web August 17th, 2009.
  43. ^ "Quick Start Guide to Increase Data Center Energy Efficiency" (PDF). U.S. Department of Energy. Archived from the original (PDF) on 2010-11-22. Retrieved 2010-06-10. 
  44. ^ a b "Smart 2020: Enabling the low carbon economy in the information age" (PDF). The Climate Group for the Global e-Sustainability Initiative. Archived from the original (PDF) on 2011-07-28. Retrieved 2008-05-11. 
  45. ^ a b "Report to Congress on Server and Data Center Energy Efficiency" (PDF). U.S. Environmental Protection Agency ENERGY STAR Program. 
  46. ^ A calculation of data center electricity burden cited in the Report to Congress on Server and Data Center Energy Efficiency and electricity generation contributions to green house gas emissions published by the EPA in the Greenhouse Gas Emissions Inventory Report. Retrieved 2010-06-08.
  47. ^ Canada Called Prime Real Estate for Massive Data Computers - Globe & Mail Retrieved June 29, 2011.
  48. ^ Finland - First Choice for Siting Your Cloud Computing Data Center.. Retrieved 4 August 2010.
  49. ^ "Stockholm sets sights on data center customers". Archived from the original on 19 August 2010. Retrieved 4 August 2010. 
  50. ^ In a world of rapidly increasing carbon emissions from the ICT industry, Norway offers a sustainable solution Retrieved 1 March 2016.
  51. ^ Swiss Carbon-Neutral Servers Hit the Cloud.. Retrieved 4 August 2010.
  52. ^ Katrice R. Jalbuena (October 15, 2010). "Green business news.". EcoSeed. Archived from the original on 2016-06-18. Retrieved 2010-11-11. 
  53. ^ "Data Center Energy Forecast" (PDF). Silicon Valley Leadership Group. 
  54. ^ "Efficiency: How we do it – Data centers". Google. Retrieved 2015-01-19. 
  55. ^ Commentary on introduction of Energy Star for Data Centers "Introducing EPA ENERGY STAR for Data Centers". Jack Pouchet. 2010-09-27. Archived from the original (Web site) on 2010-09-25. Retrieved 2010-09-27. 
  56. ^ "EU Code of Conduct for Data Centres". iet.jrc.ec.europa.eu. Retrieved 2013-08-30. 
  57. ^ Sweeney, Jim. "Reducing Data Center Power and Energy Consumption: Saving Money and 'Going Green,' " GTSI Solutions, pages 2–3. [19]
  58. ^ Cosmano, Joe (2009), Choosing a Data Center (PDF), Disaster Recovery Journal, retrieved 2012-07-21 
  59. ^ Needle, David. "HP's Green Data Center Portfolio Keeps Growing," InternetNews, July 25, 2007. [20]
  60. ^ Inc. staff (2010), How to Choose a Data Center, retrieved 2012-07-21 
  61. ^ Siranosian, Kathryn. "HP Shows Companies How to Integrate Energy Management and Carbon Reduction," TriplePundit, April 5, 2011. [21]
  62. ^ Bullock, Michael. "Computation Fluid Dynamics - Hot topic at Data Center World," Transitional Data Services, March 18, 2010. [22] Archived January 3, 2012, at the Wayback Machine.
  63. ^ Bouley, Dennis (editor). "Impact of Virtualization on Data Center Physical Infrastructure," The Green grid, 2010. [23]
  64. ^ Fontecchio, Mark. "HP Thermal Zone Mapping plots data center hot spots," SearchDataCenter, July 25, 2007. [24]
  65. ^ "Fjord-cooled DC in Norway claims to be greenest". Retrieved 23 December 2011. 
  66. ^ "Measuring Data Center Efficiency: Easier Said Than Done". Dell.com. Archived from the original on 2010-10-27. Retrieved 2012-06-25. 
  67. ^ "Computational-Fluid-Dynamic (CFD) Analysis | Gartner IT Glossary". gartner.com. Retrieved 2014-08-27. 
  68. ^ "Info and video about Cisco's solution". Datacentreknowledge. May 15, 2007. Archived from the original on 2008-05-19. Retrieved 2008-05-11. 
  69. ^ "Technical specs of Sun's Blackbox". Archived from the original on 2008-05-13. Retrieved 2008-05-11. 
  70. ^ English Wikipedia article on Sun's modular datacentre.
  71. ^ Kidger, Daniel. "Mobull Plug and Boot Datacenter". Bull. Retrieved 2011-05-24. 
  72. ^ "HP Performance Optimized Datacenter (POD) 20c and 40c - Product Overview". H18004.www1.hp.com. Retrieved 2013-08-30. 
  73. ^ "Huawei's Container Data Center Solution". Huawei. Retrieved 2014-05-17. 
  74. ^ Kraemer, Brian (June 11, 2008). "IBM's Project Big Green Takes Second Step". ChannelWeb. Archived from the original on 2008-06-11. Retrieved 2008-05-11. 
  75. ^ "Modular/Container Data Centers Procurement Guide: Optimizing for Energy Efficiency and Quick Deployment" (PDF). Archived from the original (PDF) on 2013-05-31. Retrieved 2013-08-30. 
  76. ^ Slessman, George (May 7, 2013), System and method of providing computer resources, retrieved 2016-02-24 
  77. ^ "Modular Data Center Firm IO to Split Into Two Companies". Data Center Knowledge. Retrieved 2016-02-24. 
  78. ^ Synergy Research Group, Reno, NV. "Mature US Colocation Market Led by Equinix and CenturyLink-Savvis | Synergy Research Group". srgresearch.com. Retrieved 2014-08-27. 

External links
