Have you ever been inside a data centre? Or wondered what they’re like?
Well, we’ve got two of them at our Manchester facility, so let us show you around…
What is a data centre?
A data centre is quite simply a big building full of computing and networking equipment. These centralised locations collect, store, process and distribute data. That data could range from basic web hosting, e-commerce, email and file sharing, to CRM and ERP systems, comms and collaboration services. They have existed in one form or another since the advent of computers.
Data centres are incredibly secure. Data carries a value and cyber crime is big business, but the buck doesn’t stop with digital security. In fact, it doesn’t even start with it. The physical security of a data centre is paramount, keeping data safe, compliant and accessible 24 hours a day, 365 days a year. Perimeter fencing, secure gates, CCTV, biometric access controls, virtual tripwires, mantraps and tremor sensors allow authorised personnel to enter a data centre, while any would-be invaders are stopped in their tracks. We’ve invested over £1 million in our Security and Operations Control Centre alone - meaning our data centre is the only one in the world with an NSI Gold Approved BS5979 security centre on-site.
Power infrastructure is the backbone of a data centre. Without power, a data centre is rendered completely useless. For a data centre to remain up and running on a permanent basis, redundant power supply is critical. Electricity enters a data centre via a National Grid feed, while UPS (Uninterruptible Power Supply) systems provide short-term power if the input power source fails, and also protect critical components against voltage spikes. Redundancy through on-site substations, transformers, backup generators and on-site battery storage means that even in the case of a region-wide power outage, data centres stay up and running.
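The layered fall-back described above can be sketched as simple priority logic. This is a minimal, hypothetical illustration - the function name and inputs are our own inventions, not a real power-management API:

```python
# Minimal sketch of layered power redundancy (hypothetical names).
# Sources are tried in priority order: grid feed, then UPS batteries
# (which bridge the short gap while a generator starts), then generator.

def select_power_source(grid_ok, ups_charge_pct, generator_ok):
    """Return which layer should carry the load right now."""
    if grid_ok:
        return "grid"
    if ups_charge_pct > 0:
        # UPS provides short-term power the moment the grid feed fails
        return "ups"
    if generator_ok:
        return "generator"
    return "outage"  # every layer of redundancy exhausted

print(select_power_source(True, 100, True))   # normal operation: grid
print(select_power_source(False, 80, True))   # grid down: UPS bridges the gap
print(select_power_source(False, 0, True))    # UPS drained: generator carries load
```

The point of the sketch is that no single failure takes the load offline - each layer only matters when everything above it has already failed.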
Data centre cooling ensures that a data centre facility remains at the ideal operating temperature. The optimum temperature for the equipment housed in a data centre facility is 18-22 °C, but all those servers running 24/7 generate some serious heat. So cooling is required. Hot and cold aisles are created by exhausting the warm air from the backs of the servers into dedicated aisles - the hot aisles. That air is then drawn off, cooled by air conditioning, and pumped back into the aisles facing the fronts of the servers - the cold aisles. This means the air the servers draw in is at the optimum temperature for best performance and life expectancy.
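The cold-aisle control idea boils down to keeping the supply air inside that 18-22 °C band. Here's a toy sketch of the decision - the thresholds come from the paragraph above, but the function is a hypothetical illustration, not a real building-management-system API:

```python
# Sketch of a cold-aisle setpoint check, using the 18-22 °C band
# mentioned above. Purely illustrative - not a real BMS interface.

COLD_AISLE_MIN = 18.0  # °C, lower bound of the optimum band
COLD_AISLE_MAX = 22.0  # °C, upper bound of the optimum band

def cooling_action(cold_aisle_temp_c):
    """Decide whether the air conditioning should work harder or ease off."""
    if cold_aisle_temp_c > COLD_AISLE_MAX:
        return "increase cooling"
    if cold_aisle_temp_c < COLD_AISLE_MIN:
        return "reduce cooling"
    return "hold"

print(cooling_action(23.5))  # too warm at the server intakes
print(cooling_action(20.0))  # inside the optimum band
```

Real facilities layer far more onto this (airflow containment, humidity, staged CRAC units), but the containment-plus-setpoint loop is the core of the hot/cold aisle design.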
Robust data centre connectivity is vital for data centre customers, and demand for high capacity, resilient connections between data centres has never been higher. At TeleData we’re carrier-neutral and offer access to all the major network providers using multiple ducts and fibre entry points into our facility. This offers the customer complete flexibility and a choice from a wide range of carriers competing on performance, service and price.
Data centres house a serious amount of power and equipment, which of course poses a fire risk. So fire detection and suppression systems are critical. VESDA (Very Early Smoke Detection Apparatus) and gas (Argonite or FM200) suppression mean that the chances of a fire actually happening are incredibly slim. Back in November, our early alarm system actually triggered due to minute traces of bonfire smoke particles being drawn into the DC through our air conditioning system. That’s just how accurate and sensitive these systems are. Luckily the traces our systems identified weren’t high enough to trigger the gas, just our internal alarms!
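The bonfire story illustrates a two-stage threshold: a very low smoke reading raises an internal alarm long before the much higher level that would release suppression gas. A rough sketch of that staging - the numbers here are made up for illustration, not real VESDA settings:

```python
# Two-stage smoke response sketch. A tiny trace trips the early alarm;
# only a far higher reading releases gas. Thresholds are illustrative
# placeholders, not real detector configuration values.

ALARM_THRESHOLD = 0.005    # hypothetical level for an early internal alarm
SUPPRESS_THRESHOLD = 0.20  # hypothetical level for gas release

def smoke_response(smoke_level):
    """Map a smoke reading to the action the system would take."""
    if smoke_level >= SUPPRESS_THRESHOLD:
        return "release gas"
    if smoke_level >= ALARM_THRESHOLD:
        return "internal alarm"
    return "normal"

print(smoke_response(0.01))   # trace smoke: alarm only, like the bonfire incident
print(smoke_response(0.001))  # clean air: nothing happens
```

The wide gap between the two thresholds is the point: you get very early warning without the gas dumping every time a bonfire is lit down the road.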