Telehouse’s Telehouse West in London Docklands
Demand for anonymous-looking buildings like this one, which store the vast amounts of personal data we hold in “the cloud”, is growing rapidly. Tom Ravenscroft explains what it takes to build a data centre.
They may not talk about it, but some of the UK’s largest contractors, including Mace, ISG and Skanska, are working on building data centres. Because of concerns about revealing information that could compromise their physical security, as well as client confidentiality agreements, these developments are rarely seen in the press. However, these anonymous buildings play a vital part in our lives.
Data centres are the physical representations of the internet: the place where all the data that each of us produces and receives every day – whether sending emails, updating our Facebook status or uploading digital images to “the cloud” – is physically stored.
Our increased reliance on computers and digital devices means that, as a nation, we are creating more data than ever before. This, in turn, is leading to continued growth in the construction of facilities to store this data.
“With the amount of data more than doubling every two years, corporate storage is becoming more of an issue every day,” says Steve Webb, chief information officer at Ark Data Centres, which designs, builds and operates data centres in the UK.
It is a little-known but nevertheless expanding sector of the construction market. Commercial real estate consultant CBRE reports that in London alone 75,000 sq m of data centres have been built over the past five years. It predicts the “market will remain healthy in the short-to-medium term”.
“During the last recession we were continuously busy,” says Malcolm Howe, critical systems partner at engineering consultancy Cundall. “It’s always been a good sector to be in and now we are seeing a lot of work coming through.”
Data centres are extremely functional buildings built to provide a secure, stable and controlled environment to house racks of servers. The large-scale facilities of global technology giants, such as Facebook’s “Arctic Circle” data centre in Lulea, Sweden, or Google’s €75m facility in Dublin, are well-known examples.
However, owner-occupied data centres represent only a small proportion of the market. Most of the UK’s storage is based in co-location sites run by hosting providers such as Telecity Group, Global Switch, Digital Realty and Virtus – sites that, because of their sensitive nature, clients do not like to promote. And, unlike many building types, data centres are not designed to advertise their function.
Andy Almond, director at Pick Everard, an architecture practice that has designed several data centres, says: “Often the client is looking for an anonymous building. They are designed to blend in.”
This means that a data centre will typically have an aesthetic similar to that of a warehouse, although the main difference externally will be its increased height – server racks are typically 2 metres high – along with the presence of multiple air-handling plants that may be visible on the building’s roof.
“Money is spent on protecting the data, not the building,” explains Almond. “The envelope of the building is not that expensive – it is essentially a big shed. The cost comes from the servicing.”
It’s what’s inside that counts
While the exteriors are rarely visually or architecturally appealing, what it takes to keep the servers secure and running is the reason these buildings are so interesting internally.
Nick Card, operations director at Skanska’s M&E contracting arm, SRW Engineering Services, explains: “The building of data centres is generally not complicated. However, the services contained within them are.”
The two key functions of the services are to provide electrical infrastructure resilience for the servers and to ensure they are cooled correctly. In a typical data centre, about 70% of the building will be “white space” for servers. The remaining 30% contains the plant needed to power and cool them.
“Servers need to be located in a highly controlled environment to reduce the risk of server failure,” continues Card. “Loss of servers represents a risk to all businesses and we have all read about the damage done to various brands when they fall over.
Telecity Group’s Joule House data centre in Manchester
“The combination of power and cooling, along with highly specified security and fire-detection systems, makes data centres highly services orientated, with the services being a key and fundamental part of the facility.”
Even a minute of downtime can lead to huge financial losses, so resilience is the primary design concern. There are four tiers of data centre resilience, as defined by the Uptime Institute consortium. Tier 4 is the most robust: it is designed to be fault tolerant, so maintenance – and even a single equipment failure – should not interrupt service.
However, increased reliability comes at a cost, and Tier 3 data centres are seen as the most cost-effective solution for most businesses. Diesel generators and uninterruptible power supply (UPS) units, served by large banks of batteries, are on standby to prevent any interruption in the supply of power to the servers in the event of a mains power failure. Some sites are provided with dual incoming grid power supplies, each from a separate source, as an added layer of resilience.
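To put those tiers in context, the short sketch below converts annual availability into expected downtime. The percentages are the availability figures commonly quoted for the Uptime Institute tiers, not numbers taken from this article, so treat the output as indicative only.

# Illustrative only: converts the availability figures commonly quoted for
# the Uptime Institute tiers into expected downtime per year. The
# percentages are assumptions for the sake of the example, not figures
# from the article.

MINUTES_PER_YEAR = 365 * 24 * 60

tier_availability = {
    "Tier 1": 99.671,
    "Tier 2": 99.741,
    "Tier 3": 99.982,
    "Tier 4": 99.995,
}

for tier, percent in tier_availability.items():
    downtime = MINUTES_PER_YEAR * (1 - percent / 100)
    print(f"{tier}: {percent}% availability = roughly {downtime:.0f} minutes of downtime a year")

On these assumptions, Tier 3 allows roughly an hour and a half of downtime a year, while Tier 4 allows under half an hour – which is why the extra resilience comes at such a cost.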
Cool customers
The servers generate a huge amount of heat, so continuous cooling is needed to keep them within the allowed temperature tolerances. As well as facing penalties for loss of power, the operators of commercial data centres will be penalised if the temperature, which is monitored by sensors, deviates beyond the agreed boundaries.
Any break in the cooling could also lead to “thermal runaway” – where a rise in temperature changes the conditions in a way that leads to a further rise – forcing the servers to shut down.
Like all areas associated with IT, the design of data centres is evolving rapidly as operators and users aim to improve the sustainability credentials of what is an extremely power-hungry building. In an old data centre, cooling can use 30%-40% of the power, so it is in this area that most energy-saving advances have been made in the past five years.
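Power usage effectiveness (PUE) – total facility power divided by the power that actually reaches the IT equipment – is not mentioned above, but it is the standard way of expressing that overhead. The sketch below uses assumed figures (a 1MW IT load, cooling at 35% of total power, 10% for other overheads) purely to show how a large cooling share pushes the ratio up; none of the numbers come from the article.

# A rough, assumption-based illustration of how the cooling share feeds
# into PUE (power usage effectiveness = total facility power / IT power).
# None of these numbers come from the article.

it_load_kw = 1000.0          # assumed server (IT) load
cooling_share = 0.35         # cooling taking ~35% of total facility power
other_overhead_share = 0.10  # assumed UPS losses, lighting, etc.

# total = IT + cooling_share * total + other_overhead_share * total
total_kw = it_load_kw / (1 - cooling_share - other_overhead_share)
pue = total_kw / it_load_kw

print(f"Total facility load: {total_kw:.0f} kW")
print(f"PUE: {pue:.2f}")

On the same assumptions, cutting the cooling share from 35% to 20% brings the PUE down from about 1.82 to about 1.43, which is where much of the recent energy saving has been found.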
"With the amount of data more than doubling every two years, corporate storage is becoming more of an issue."
Steve Webb, Ark Data Centres
A change in the industry-standard design guidance has helped: the technical standards set out by the American Society of Heating, Refrigerating, and Air-Conditioning Engineers (ASHRAE) have increased the range of acceptable air temperatures for data centres.
“Ten years ago server inlet temperatures were often maintained within half a degree of 23°C. Now the range is between 16°C and 27°C,” says Cundall’s Howe. “This broadening of temperature range allows a free cooling air strategy to be developed.” Free cooling systems use low external temperatures to chill water and have the potential to save huge amounts of power in data centres.
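A minimal sketch of that principle follows. The 27°C upper inlet limit comes from the range Howe quotes; the 5°C “approach” between outdoor air and the supply it can produce is an assumed figure, and real control systems are considerably more sophisticated than a simple threshold.

# Minimal free-cooling sketch. INLET_MAX_C is the upper end of the inlet
# range quoted above; APPROACH_C is an assumed temperature gain across the
# heat exchanger. Real plant controls blend modes rather than switching.

INLET_MAX_C = 27.0   # allowable server inlet temperature (upper bound)
APPROACH_C = 5.0     # assumed approach temperature of the free-cooling coil

def cooling_mode(outdoor_temp_c: float) -> str:
    """Choose a cooling mode from the outdoor air temperature."""
    if outdoor_temp_c + APPROACH_C <= INLET_MAX_C:
        return "free cooling"        # outside air alone can hold the inlet temperature
    return "mechanical cooling"      # chillers needed to stay within the agreed range

for temp in (5.0, 18.0, 25.0):
    print(f"{temp:.0f}°C outside -> {cooling_mode(temp)}")

In a climate like the UK’s, most hours of the year sit below that kind of threshold, which is where the potential savings come from.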
Another innovation in the sector is the advancement of modular construction. ISG has been pioneering in this area and, with Gardner DC Solutions and BBlur Architecture, has developed a modular concept called DATA a RAY. This claims to offer up to 10,000 sq m of Tier 3 space in less than 12 months – twice as fast as a traditional build. However, as power is often the key determining factor in data centre construction times, the advantages of modular construction have so far not been fully realised and most data centres continue to be traditionally built.
Although you may not have realised it while walking or driving around the country, you have almost certainly passed a data centre. Perhaps it was a warehouse-style building covered in air-cooling units, or what you took for a B1 office and light industrial development on a secondary business park.
But those anonymous buildings might be where your emails are stored, and they are almost certainly a growth market for contractors and consultants in the years to come.
Zinc whiskers: what 10 out of 10 builders of data centres need to know
These tiny particles of metal can be a hidden hazard in data centres, as Tim Brown explains.
What are zinc whiskers?
Zinc whiskers were first recorded in the 1940s by Bell Laboratories and have been researched extensively by NASA.
They are tiny filaments of zinc that can grow on any galvanised surface, including electroplated floor tiles and cable management systems. They are usually a few millimetres long and a few microns in diameter, but their rate of growth can vary.
What causes zinc whiskers?
The zinc used in the galvanising process is the catalyst for zinc whiskers to grow. Research suggests stress initiates growth. This may be residual stress from the galvanising process, mechanical stress from cutting, bending or fitting the galvanised materials on site, or thermal stress from the environment in which the galvanised materials have been installed.
Why are they a concern?
After growing on galvanised surfaces, zinc whiskers can detach and float around the data centre. They may be tiny, but they are conductive and can carry tens of milliamperes before melting. So there is a potential for zinc whiskers to bridge tightly spaced electrical conductors and cause electrical shorts, leading to nuisance glitches or damage to sensitive hardware.
How serious is the risk?
As the whiskers melt following contact with an energy source, it is difficult to assess how many faults they have caused. Much of the discussion on zinc whiskers has identified cable management systems as the main source, so Unitrunk commissioned research by Colin Gagg of the Open University’s Materials Engineering Group. He concluded that any galvanised surface used in a data centre carries a risk of zinc whiskers.
How can the risk be reduced?
The most commonly specified finishes for cable management systems in interior installations are electro-zinc and pre-galvanised. These are often specified for data centre environments but will not prevent zinc whiskers. Raising the specification to hot-dip galvanised cable trays is an option, but we believe this may simply delay the problem.
Alternatively, some form of encapsulation – such as a powder coating – is often recommended to seal in the zinc surface. However, if the encapsulation is breached – by cutting the cable management on site or because of abrasion during or after installation, for example – the risk is likely to return.
The third option – which we recommend – is to specify a stainless steel cable management system.
Tim Brown is national sales manager at cable management specialist Unitrunk.