In the past decade, companies in many industries have discovered the benefits of colocation. A purpose-built data center provides a safe and stable physical environment for a company’s critical computing systems, with sufficient power, cooling, and connectivity to guarantee server uptime and availability.
A new data center in the United States is generating electricity for its servers entirely from renewable sources by converting biogas from a sewage treatment plant into electricity, with water as a byproduct. Siemens implemented the pilot project, which recently went into operation, together with Microsoft and FuelCell Energy. The data center is not connected to the public power grid. Siemens developed and installed intelligent control and monitoring technology for the plant, as well as energy management software, so that the servers can be reliably supplied with electricity at all times. The partners intend to demonstrate that, with intelligent hardware and software, even critical installations such as data centers can run reliably on alternative energy sources.
An ASHRAE study has concluded that data centers can reduce their environmental impact by relaxing their control over humidity. Guidelines published this year will recommend a wider range of safe humidity levels, as well as allowing data centers to run warmer.
IT equipment is more robust than most users realize, and the influential industry body ASHRAE has argued that data center operators waste energy by cooling their facilities more than necessary. This year the body plans to do the same for humidity.
Cooling optimization in a data center offers a significant opportunity to reduce operating costs: according to Lawrence Berkeley National Laboratory, cooling systems consume 34 percent of the power used by a data center.
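To put that 34 percent figure in context, a back-of-the-envelope estimate of the annual cooling bill can be sketched as follows. The facility size and electricity price here are hypothetical example values, not figures from the source.

```python
# Rough annual cooling cost, using the LBNL figure that cooling
# accounts for ~34% of total data center power draw.
# facility_kw and price_per_kwh are illustrative assumptions.

def annual_cooling_cost(facility_kw, cooling_fraction=0.34, price_per_kwh=0.10):
    """Yearly cost (USD) of the cooling share of facility power."""
    cooling_kw = facility_kw * cooling_fraction
    hours_per_year = 24 * 365
    return cooling_kw * hours_per_year * price_per_kwh

# A hypothetical 1 MW facility: 340 kW of cooling load, year-round.
print(round(annual_cooling_cost(facility_kw=1000)))
```

Even a modest efficiency gain against a number of this size explains why cooling optimization draws so much attention from operators.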
Because of the need to maintain five-nines (99.999 percent) reliability, data centers are too often overdesigned in cooling capacity, and operating budgets pay the price for this practice.
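The "five nines" target is easier to appreciate as a downtime budget. A minimal arithmetic sketch, using only the availability percentage from the text:

```python
# Downtime budget implied by an availability target such as
# "five nines" (99.999%). Pure arithmetic; no external inputs.

def downtime_minutes_per_year(availability_pct):
    """Minutes of downtime per year permitted at the given availability."""
    minutes_per_year = 365 * 24 * 60
    return minutes_per_year * (1 - availability_pct / 100)

# 99.999% availability leaves only about 5.26 minutes of downtime a year.
print(round(downtime_minutes_per_year(99.999), 2))
```

A budget of roughly five minutes per year leaves no room for a cooling failure, which is why operators tend to err on the side of excess capacity.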
Want to see a data center professional get hot under the collar? Suggest that maybe his data center cooling isn't all that great. Data center professionals view cooling as both a point of pride and a source of frustration: each is acutely aware that it is a constant struggle, and that even the slightest tweak can make a huge difference, both in saving the planet and in saving the organization money.
Ideally, hydronic pipe should not be affected by the fluid it carries, and the fluid should not be affected by the pipe. Sadly, many HVAC systems use a compromise material, carbon steel, for hydronic piping. Water oxidizes steel, causing pitting and scale build-up and eventually leading to pipe failure. These effects are mitigated by using closed systems where possible, along with chemical and pH treatment.
The data center is changing. We have new methods of cooling and optimizing the facility, and even green energy through next-generation geothermal technologies. What goes inside the data center, and into the rack, has been changing as well. New platforms built around consolidation, server technology, and cloud computing are all affecting how we process and utilize resources.
Providing a rare look inside its data center operations, Google recently posted a video describing its data center in Berkeley County, South Carolina, including descriptions of the facility’s cooling system and security measures.
The company announced it would build the South Carolina data center in 2007. Including an expansion project in 2013, Google’s total investment in the site amounts to $1.2 billion.
Keystone NAP has delivered the first modular “KeyBlock” unit to its data center on the border between Pennsylvania and New Jersey. The modules were co-developed with Schneider Electric and are configured to each individual customer’s power, cooling, and network connectivity needs.
Cabinets are the foundation of the data center's physical infrastructure, used throughout the life cycle of the facility. The IT equipment that runs the applications is contained within them, the cabling that connects the equipment to the users and the LAN/SANs is terminated and managed in them, power is distributed within them, and cooling is channeled through them. They are also the most visible infrastructure element, and how they look and fit together is often an indicator of how well a data center is run and managed.