In February 2013, after over 10 years as a hosting provider, Bytemark made the decision to invest £1.2 million into building our own data centre. Here’s the complete guide to how we made it happen.
Peter and I are proud to announce that we’ve completed the purchase of a building for Bytemark. The name, YO26, is taken from the area of York where the data centre will be located. It will more than double our current hosting capacity, and provide a larger base for the company’s operations.
YO26 takes Bytemark’s mission of hand-built hosting to the next level. We’ve built our own servers for years, and rolled much of our own software. And there are so many advantages to a managed hosting provider owning its own premises:
- We’re free from price hikes imposed by data centre operators (and so are our customers).
- We can time building maintenance to minimise risk to our customers, whereas in a shared data centre we have to cope with maintenance being imposed upon us.
- We can offer colocation across Manchester and York.
- Our home-grown cloud hosting platform will gain a second installation, allowing self-service customers to benefit from that resilience.
Here are the original specifications we worked to when designing the data centre:
- 80 racks in cold-aisle contained pods
- At least two power feeds for each rack
- A 1MVA supply substation
- A 550kVA backup diesel generator
- An Airedale cooling system
- Two fibre paths to connect YO26 to the rest of our network
The fibre paths will be diversely routed to, and within, the building. One runs down the railway to our PoP in London at 10Gbps. The other will go via Leeds to our existing Manchester sites, completing a national ring that will be running at 10Gbps by the end of 2013.
All of these features can scale with demand.
How did we build a data centre?
We worked with data centre design experts Sudlows to turn this empty room into our state-of-the-art data centre space. There were five key stages we went through to get there:
Laying the groundwork
The first few weeks involved getting the building, and surrounding land, in a position to store servers.
Indoors, all the internal walls were built and plastered, and work began on cabling and containment in what would become the floor void. Light fixtures were fitted, and fire suppression pipework and gas vessels were installed.
Outside, the land was first cleared. Shuttering was prepared, ready to pour concrete for the generator plinth. Then trenches were dug for laying the ductwork for our fibre connectivity.
Flooring, Fans and Fibre
The next stage started with raising the floor throughout the main data hall. This hid all of the cabling that had previously been installed.
Entry points for fibre were also created: holes were cored through the concrete to allow the ducting in. The two entry points are on different walls of the building.
Then, last but not least, the extract fans were installed. During the mostly cool Yorkshire weather these will exhaust the hot air from the building, allowing the air handling units to pump lovely fresh cool air under the floor. During unexpected tropical spells, dampers will seal the extractors and allow the air to be recirculated and cooled by regular air conditioners.
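The damper behaviour described above amounts to a simple control rule. Here’s a minimal sketch of it; the function name and structure are our illustration, not Bytemark’s actual control system, and the 16°C threshold is the free-cooling limit quoted later in this article:

```python
# Hypothetical sketch of the free-cooling decision described above.
# The 16°C threshold comes from the article; everything else is assumed.
FREE_COOLING_LIMIT_C = 16.0

def cooling_mode(outside_temp_c: float) -> str:
    """Choose between free cooling and mechanical (recirculated) cooling."""
    if outside_temp_c < FREE_COOLING_LIMIT_C:
        # Dampers open: extract fans exhaust hot air while the air
        # handling units pump fresh cool air under the raised floor.
        return "free-cooling"
    # Dampers closed: air is recirculated and cooled by air conditioners.
    return "recirculate"

print(cooling_mode(9.5))   # prints "free-cooling" (typical Yorkshire day)
print(cooling_mode(24.0))  # prints "recirculate" (rare tropical spell)
```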
The Cooling System
Carrying on from the installation of the extractors, the next big project focused on finishing the cooling system.
A large lorry delivered a pair of Airedale Smartcool units. These will give us our initial 90kW of cooling capacity, in an N+1 configuration. We also received two Airedale Easicool units – these make up a separate cooling system for our UPS room.
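In an N+1 configuration, one unit more than the load requires is installed, so any single unit can fail without losing capacity. A pair of units giving 90kW N+1 implies each unit is rated at roughly 90kW on its own; that per-unit figure is our inference, not stated in the article:

```python
# N+1 redundancy sketch: usable capacity is what remains with the
# single spare unit out of service. Per-unit rating is inferred.
unit_capacity_kw = 90.0   # assumed rating of one Smartcool unit
units_installed = 2       # the pair delivered

usable_capacity_kw = (units_installed - 1) * unit_capacity_kw
print(usable_capacity_kw)  # 90.0 — full load survives one unit failing
```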
As outlined above, our complete system allows us to avoid using energy-hungry air conditioning until the outside temperature reaches 16°C. According to our calculations, that was only 15% of the hours last year!
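As a back-of-the-envelope check of that figure, the calculation is just the fraction of hours at or above the threshold. The hourly temperatures below are fabricated for illustration; only the 16°C threshold and the 15% result come from the article:

```python
# Rough illustration of the "15% of hours" calculation, using a
# made-up sample of hourly outside temperatures (not real weather data).
THRESHOLD_C = 16.0

def mechanical_cooling_fraction(hourly_temps_c):
    """Fraction of hours when outside air is too warm for free cooling."""
    warm_hours = sum(1 for t in hourly_temps_c if t >= THRESHOLD_C)
    return warm_hours / len(hourly_temps_c)

# Example: 100 sample hours, 15 of them at or above the threshold.
sample = [12.0] * 85 + [18.0] * 15
print(mechanical_cooling_fraction(sample))  # 0.15
```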
Bring in the generator
One of the final steps was to bring in the diesel generator to supply our backup power source. This was easier said than done. Our data centre is on the edge of a business park without any access from two sides – so in came the crane!
The generator can provide power off the grid for 20 hours at our initial full load without the tank needing a refill. We’ll add a further tank as data centre load requires, though we won’t need its full capacity for a few months at least.
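The runtime figure falls out of tank capacity divided by fuel consumption. The tank size and burn rate below are invented for illustration; only the 20-hour runtime at initial full load comes from the article:

```python
# Hypothetical figures for a generator runtime calculation.
# Neither input value is from the article; only the 20-hour result is.
tank_litres = 2000.0           # assumed tank capacity
burn_litres_per_hour = 100.0   # assumed consumption at initial full load

runtime_hours = tank_litres / burn_litres_per_hour
print(runtime_hours)  # 20.0 — matching the quoted runtime
```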
There’s also space behind the data centre for two more generators if we should need them in the future.
Once all of the components were ready, we had some final efficiency tests to run, and then our first pods were ready to house our customers’ servers!
So there you have it, a step-by-step guide to how we built our wholly-owned data centre. Learn more about how the data centre has developed since 2013 on our dedicated YO26 page.
If you’re interested in seeing more images from the build, I’ve also put together a photo tour.