Monday 30 December 2019

Air V Liquid - Part 4 - Ecosystems

Following on from the previous articles, and over a year late, we're now going to look at the relative costs of providing an ecosystem for IT equipment: essentially the rationale for data centres.

I think it's important to recognise that the delivery of IT systems used to be very different from the way we do it today, and that history has a bearing on data centre ecosystem architectures.
Back in the day, businesses used a central mainframe and dumb terminals. Mainframes were heavy bits of kit, and I can remember installations where floors were strengthened to take the weight. Rooms in buildings were set aside specifically for IT equipment, cooling solutions were installed and Bob's your uncle, you had a computer room.

These were normally over-provisioned to allow for expansion, and I've personally been asked to build a new room that needed to cover the existing kit plus 100% expansion.
Well, that's all very well, but 100% of what exactly? Floor space, power density, network capacity, cooling capacity? Normally everything was doubled up, just to cover ourselves, but it was never going to be enough. Why? Because IT was getting smaller, more equipment was needed, power densities rose and more network was needed. So these rooms soon became not fit for purpose, for a variety of reasons: insufficient cooling capacity, not enough power, and in some cases not enough space.

So IT managers faced a dilemma: without visibility of future IT needs, it became impossible to provide expansion space without either spending a great deal of capital on future proofing (with the risk of getting it completely wrong) or failing to meet the business requirements.
I've seen row upon row of racks, all empty, because the business decided to use blade servers, which of course have a higher power density than standard servers. There wasn't sufficient power available, so power was taken from other racks, rendering them useless. This of course leads to hot spots, because you're concentrating your IT (a blade chassis is about 7.5kW) into an area that was designed for a standard 2kW rack.

Today, businesses have options other than keeping their IT on premise: they can use colocation facilities or cloud services. They will still need a room on premise to provide network access to the colocation/cloud services, and they may have some on-site compute (those services that can't go into the cloud for reasons such as latency or data transfer rates).

All we've done, though, is transfer the problem of the ecosystem to someone else. Now it's the colocation provider that has to think about capacity in terms of space, power and cooling, and the thing is, they are always behind the curve. They are reactive rather than proactive: they respond to customers' requirements in a building that was designed in the past, with the past's interpretation of power, space and cooling requirements, and that leads to the same problems, i.e. a lack of power, problems with cooling, and the risk of having empty racks.

It's understandable, though. If you are a colocation or hosting provider, you don't have a crystal ball to see into the future, so you have to deal with what you know, or take a gamble on what the future looks like.

The future, to them, is very much like the past, insofar as if 99% of systems are designed for air cooling then an air cooling infrastructure is what they will build.

Hence, the market is dominated by air cooled systems, and so we should build for air.

Building for air means a raised floor (perhaps), CRAC/H units, pipework, and chillers or external units, in whatever flavour you desire. You have to provide an infrastructure for what the market needs, and at the present time that is air.

But it doesn't have to be that way...

The data centre of the "future" is very much like the data centre of today, given that we are building them today (as discussed with my friend and colleague Mark Acton). However, what would the data centre of the future look like if we did adopt some of the more outlandish suggestions coming out of academia and some design consultancies, and what if we decided to adopt more liquid cooled options?

In November I attended the DCD London event, where not one but two immersed liquid cooled solutions were on show, both using the single immersion technique. This is where the server is immersed in a bath full of an engineered dielectric (electrically non-conductive) liquid; the heat generated by the servers is carried by the liquid to the top of the bath and transferred via a heat exchanger to an external water circuit, which is in turn connected to an external dry cooler, with the heat vented into the atmosphere. Compared with an air cooled solution, some of the capital plant items, namely the raised floor (baths don't need one) and the CRAC/H units, are moot, so the capex and opex costs will be lower.
But we can go one step further and earn revenue, potentially reducing our costs even further. How? Simple: the heat rejected by the system is warmer, and in a medium where it can be captured better than air, so it can be directed to provide, or offset, energy use elsewhere, such as hot water or heating locally (within the building), or passed to a low temperature district heating system for use over a wider area. There are some commercial aspects that need to be ironed out with this approach, such as contractual agreements, cost and service levels.

This approach, where waste heat is used to offset energy requirements elsewhere, is a fundamental aspect of Green Data Centres, and from our research it appears that liquid immersed systems can contribute. We're not the only ones thinking this.

The whole concept of data centres as engaged players in the energy transition towards the decarbonisation of society is within the remit of the EU funded CATALYST project.

So, in terms of capital and operating costs of air v liquid, where do we stand?

There are in effect three types of cooling for data centres. The first uses a chilled (or cold) water loop: this basically transfers the air cycle heat to liquid in the CRAC unit, which is then pumped to a chiller where the retained heat is dissipated into the atmosphere.

The second is evaporative cooling. Wikipedia provides a good description of how evaporative cooling works, and this is the text:

"An evaporative cooler (also swamp cooler, swamp box, desert cooler and wet air cooler) is a device that cools air through the evaporation of water. Evaporative cooling differs from typical air conditioning systems, which use vapor-compression or absorption refrigeration cycles. Evaporative cooling uses the fact that water will absorb a relatively large amount of heat in order to evaporate (that is, it has a large enthalpy of vaporization). The temperature of dry air can be dropped significantly through the phase transition of liquid water to water vapor (evaporation). This can cool air using much less energy than refrigeration. In extremely dry climates, evaporative cooling of air has the added benefit of conditioning the air with more moisture for the comfort of building occupants.
The cooling potential for evaporative cooling is dependent on the wet-bulb depression, the difference between dry-bulb temperature and wet-bulb temperature (see relative humidity). In arid climates, evaporative cooling can reduce energy consumption and total equipment for conditioning as an alternative to compressor-based cooling. In climates not considered arid, indirect evaporative cooling can still take advantage of the evaporative cooling process without increasing humidity. Passive evaporative cooling strategies can offer the same benefits of mechanical evaporative cooling systems without the complexity of equipment and ductwork."
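The quoted "wet-bulb depression" gives a simple rule of thumb: a direct evaporative cooler can only close a fraction of the gap between the dry-bulb and wet-bulb temperatures. A minimal sketch, where the 85% effectiveness figure is an illustrative assumption (real units vary), might look like this:

```python
def evap_supply_temp(dry_bulb_c: float, wet_bulb_c: float,
                     effectiveness: float = 0.85) -> float:
    """Approximate supply temperature from a direct evaporative cooler.

    The cooler closes a fraction ('effectiveness', assumed here) of the
    wet-bulb depression, i.e. dry-bulb minus wet-bulb temperature.
    """
    depression = dry_bulb_c - wet_bulb_c
    return dry_bulb_c - effectiveness * depression

# A hot, dry day: 35C dry-bulb, 20C wet-bulb, so a 15C depression.
print(evap_supply_temp(35.0, 20.0))  # 35 - 0.85 * 15 = 22.25
```

The same arithmetic shows why the technique suits arid climates: with a small depression (humid air) there is little cooling to be had.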

Some social media and search engine hyperscalers use this type of cooling technology.

The third is emerging liquid technologies, which include "liquid to chip", cold plate and immersive.

Liquid to chip and cold plate in effect extend the chilled water loops into the rack, and in the case of liquid to chip, into the server.

Immersed technologies however are a very different kettle of fish.

This is where a server is actually immersed in a dielectric (electrically non-conductive) fluid in either a single mode (direct bath) or dual mode (the server is encased in a blade type enclosure filled with the dielectric fluid and installed into a chassis with the liquid cooling loops).

The heat is transferred to the fluid and then, via a heat exchanger, to water, and then to a dry cooler or other mode of use. These are the waste heat reuse scenarios often discussed: heating office areas, residential heating, swimming pools and greenhouses.

An air cooled data centre needs the following:

Raised floor (not always)
CRAC/H units
Chiller (or dry cooler, or other method of rejecting heat)
Power train (HV/LV boards, PDUs)

In an immersed liquid data centre, you reduce some of these elements as follows:

Raised floor: we don't need to pump air under the floor, but you might still want to run power and network cables under it (although we're seeing a lot of overhead cable routes now, so maybe not!)

CRAC/H units are not required.
Chillers are not required, although if you don't have an easily available user for your waste heat, you might want to include a dry cooler for summer running.
Power train: most immersed units are already equipped with full 2N power, and only need a standard connection.
A UPS would still be required, but as you're only going to need it for power and not cooling, you can downsize it.
Batteries: again, you can reduce the number of batteries needed.

All in all, we think that moving to a fully immersed solution could save around 50% of a standard data centre's build costs. Couple that with reduced operating costs and your data centre is already saving lots of money; factor in the CATALYST project and you may even begin to make money from selling that waste heat and providing grid services.
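To make the operating-cost side of that argument concrete, here is a rough Python sketch of an annual position with heat-reuse revenue. The 8,760-hour year is real arithmetic; the PUE, prices and heat capture fraction below are illustrative assumptions for the sketch, not figures from this article:

```python
def annual_position(it_load_kw: float,
                    pue: float,
                    electricity_cost_per_kwh: float = 0.12,
                    heat_capture_fraction: float = 0.7,
                    heat_price_per_kwh: float = 0.03) -> dict:
    """Rough annual energy cost vs waste-heat revenue (all inputs assumed)."""
    hours_per_year = 8760
    it_energy_kwh = it_load_kw * hours_per_year
    total_energy_kwh = it_energy_kwh * pue          # facility overhead via PUE
    energy_cost = total_energy_kwh * electricity_cost_per_kwh
    # Revenue from selling a fraction of the IT heat into a heat network.
    heat_revenue = it_energy_kwh * heat_capture_fraction * heat_price_per_kwh
    return {"energy_cost": energy_cost,
            "heat_revenue": heat_revenue,
            "net_cost": energy_cost - heat_revenue}

# 100kW of IT at an assumed PUE of 1.1
print(annual_position(100.0, 1.1))
```

Even with these made-up numbers the shape of the argument holds: the lower the PUE and the higher the captured heat fraction, the faster the net cost falls.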

We genuinely believe that in the future ALL data centres will use immersed technologies and be integrated with smart grids, and that the CATALYST project will do EXACTLY what it says on the tin!

That's the Air v Liquid skirmish put to bed. We've been a strong supporter of the technology since 2010, when we saw the first immersed demo unit from ICEOTOPE, and since then we've been following and writing about this technology in a number of articles, one of which was an update to the original article. I recall Martin from Asperitas telling me that I would need to update it sooner rather than 2021, and I think he's right, so look out for that update to an update!!

Friday 20 December 2019

Carbon3IT 2019 Update and 2020 Forecast

This year has been AMAZING!

Absolutely, bloody amazing, I said and I quote from last year "I'm not going to gaze into my crystal ball at this time, except to say that 2019 is going to be a VERY interesting year." and so it proved.
Did I say AMAZING!, it was and at the risk of repeating myself, it was AMAZING!

So, why was it amazing? Well, I'm going to follow our usual format of a month by month commentary so here goes....

So, January 2019 saw us visiting a new client's premises to begin work on a whitepaper on IST, which was published by them in Q2 and will feature in a forthcoming edition of a European DC-related publication, as well as a feature on the CATALYST project, more on that later! We also attended in Amsterdam again, followed by a quick visit to Brussels to speak at the ICT Footprint event. At the end of the month I went to Boden, Lulea, Sweden to visit the DC that was the topic of a recent post. They also WON an award at the DCD Global Awards, the.....

We also had a few meetings with a client for ISO50001 certification, more on that later as well!

February is usually a fairly quiet month, but we had a few calls for potential future projects, most of them planned to start in the new financial year, and I spoke at the ENTress event in Wolverhampton on climate resilient infrastructure, specifically DCs, citing the example of the City of Lancaster after Storm Desmond in 2015. We, as SFL, also put in a funding round, and we had loads of meetings to decide approach, content and finances. Sadly we didn't get through to the next round, but we did gain very valuable experience.

March saw us visit the DCW event in London, which proved to be a turning point as we met up with Vicki from Green IT Amsterdam for a handover of the CATALYST project; basically, we are now the in-house data centre consultants for Green IT Amsterdam working on the CATALYST project, more info here

April saw us visit Oslo for the Data Centre Forum, where I spoke about the CATALYST project, the second of what turned out to be numerous trips to the Nordic region in 2019.
We also went to Zurich for a GRIG meeting, this is Green IT Global and consists of a number of organisations based in the UK, Netherlands, France, Switzerland and Finland promoting the use of sustainable ICT. We also had a couple of meetings with a few clients.

Early May saw our MD getting excited as his team, Charlton, got into a playoff position in League 1, eventually winning against Sunderland at Wembley. Oh, and a visit to Helsinki to speak at the Data Centre Forum on CATALYST and the EUCOC.

We love June, it might be because we get to go to Monaco for the DataCloud Europe event. As usual we were invited as the guest of the EU-JRC, and this year we were partly funded by our friends at Rentaload, if you call going back to a villa high up in the mountains above Monaco and staying in a dorm funding! It's always a good event for us and we picked up 3 new clients! A planned visit to the Netherlands was cancelled due to a family illness.

In July, our MD took his first visit to Poland and his third to India (Bangalore) to speak about CATALYST.

August is usually pretty quiet due to the holiday season, and we all went to Amsterdam for a mini break. We did have a meeting with Green IT Amsterdam as well, and we did a lot of work on the NHSD GP IT Futures project.

In September, which was probably the busiest month this year, I visited Lincoln to finalise the coolDC design and build CEEDA award, which also saw success at the DCD Global Awards, winning the...

This was closely followed by a visit to Valencia to visit another DC for a CEEDA assessment, another GOLD, and a visit to another one of their facilities is scheduled for Q1/2 2020.

Mid month it was off to Manchester for the DCA Data Centre Retransformation event, where we held our 2nd CATALYST project Green Data Centres - Stakeholder Group meeting and presented at the main event on the CATALYST project per se.

The Ops Director was also on the NEBOSH Health and Safety Course, a pretty intensive health and safety course, on behalf of an existing client. She followed this up with a visit to their site for a review (required for the assessment part of the course).

I also spoke at Solar and Storage Live about ICT energy and DC's with Tim Chambers and Emma Fryer from coolDC and techUK respectively.

The week after was my epic euro tour, where I visited 4 countries in 10 days: first to Amsterdam for a Green IT Amsterdam participants meeting, then an epic railway journey to Copenhagen to speak at DCF, then another train to Stockholm to assist with the Green IT Amsterdam study trip for Dutch authorities, the guys responsible for DC planning approvals (you may recall that Amsterdam has a DC construction ban in place at the present time!). Finally, a quick flight to Brussels for the CATALYST project preparation and EU review meeting (we passed!), and my last train journey was back to London for a BSI TCT7/3/1 meeting.

We always tend to schedule the EUCOC Annual Best Practices meeting around this time of year, and this year it was scheduled for early October. This was followed by a visit to London for IP Expo.

November is conference season, 3 this year! The month started with the DCD Converged event at London's Old Billingsgate, where our MD had 2 speaking engagements, the first to promote a new modular UPS solution and the 2nd on the first 10 yrs of the CEEDA programme.
Our ISO50001 client decided to amend the process and concentrate on their ESOS qualification due to some internal issues; this was completed in December (before the due date). We've completed all the policies, processes and procedures, and will review where we go from here in the new year.
Our MD had to leave DCD London early and make his way to Amsterdam to speak at the DC Innovation Day organised by Mercer and Saval, followed by a Green IT Amsterdam team meeting; the following week was all about CATALYST.

The week after, I made my way over to Dublin for Data Centres Ireland, where we had arranged the "heat" track as part of our CATALYST work, followed by the 3rd Green Data Centres - Stakeholder Group meeting.

The following week we attended the Data Center Forum in Stockholm, where our MD both ran a room and delivered another presentation on CATALYST.

December is nearly always quite quiet, but I love going to the DCD Awards, and this year was excellent. I was very pleased to see 2 projects we've been involved in get the recognition they deserve (see above).
This was followed by a visit to Birmingham City University to begin preparations for a new Data Centre Module to be included in the Computing and Networking degrees; we are very honoured to be part of this project. It starts in January, and we do have an option for a limited number of "industry experts" to join us as guest lecturers on specific subjects. We'll be in touch, but please contact us on the email below if you'd like to join. There are also places for some people who work in the industry on specific areas and want to get an overall picture, a data centre 101 as it were; again, get in touch, but this will be extremely limited.

We then went to Amsterdam to meet with two clients as part of our work with Green IT Amsterdam and picked up 3 new pieces of work!

We also had a call with a potential partner on a new H2020 project, more on that in the new year.

As I stated earlier, we think 2020 is going to be very interesting indeed, and as we have no idea how it's all going to pan out POLITICALLY, we're going to keep our powder dry for the time being.

But, that said, we will continue to offer the following services:

EU Code of Conduct for Data Centres Review and Preparation
CEEDA assessments (with our DCD partners)
ISO Management Systems for ISO9001/14001/22301/27001 and 50001
Data Centre Audits (with our M and E partners)
Data Centre Training (on site, and tailored to your requirements)
Data Centre Support Services - Compliance
Health and Safety Services

Special Services: if you have a problem that needs solving, let us know. Through our wide network of consultants, supply chains and operators we've probably come across the problem before, and therefore may be able to help.

So far, we already have a number of assignments scheduled for Q1/2/3 and 4, but we'll always find time and space to add some more.

Finally, we'd like to wish all our customers, suppliers and industry colleagues the very best wishes for 2020.

PS Last year we said that our next blog post would be the 4th in our series "Air v Liquid". This, ahem, was delayed; we hope that this article, on the cost side of things, will be published early in the new year, honest!

As always, until next time.

If you need to get in touch with us, please use the following:
@carbon3it (Twitter/Skype)

Sunday 24 November 2019

Boden - The Arctic Circle

Last Jan/Feb I spent some time in Boden, Lulea, Sweden to visit the Boden One Data Centre. This is an EU funded project and you can find out all about it here, as well as the RI:SE data centre laboratories, which you can find out about here.

I'm giving a presentation at the Data Center Forum in Stockholm later this week, more info here, so I thought it was time I got around to posting this article, which has been 10 months in preparation!

This next section was written upon my return from the coldest place I'd ever been to. It was minus 37, yes, you read that correctly, -37, which is to be expected when you're about 100yds inside the Arctic Circle!

Boden One DC

Before I write about my impressions of the Boden One DC I have to present some background, essentially, "The ultimate goal for the project Boden Type DC One is to build the prototype of the most energy and cost efficient data centre in the world in Boden, Sweden."

The project seeks to be "Resource efficient throughout its lifecycle" and goes on to state that
"Data centres consist of two distinct elements; IT-equipment and the surrounding service envelope. The innovation of Boden Type DC One is a new, holistic design approach for the latter. The goal is to bring a novel combination of existing techniques and locations to the market. This combination does not exist and has never been tested.

The unique solution offers a sustainable datacentre building which is energy and resource efficient throughout its lifecycle, cheaper to build and operate and brings jobs and knowledge to remote places in large parts of Europe. The cornerstones of the concept Boden Type DC One are:
  • Efficient fresh air cooling.
  • Modular building design.
  • Renewable energy.
  • Favourable location.
The 600kW prototype facility will be a living lab and demonstration site, as it will be tested by providers and end-users in a real operation environment with all aspects of its operations measured. With the prototype, the project stakeholders will be able to:
  • Validate that the datacenter concept meets the energy efficiency, financial reliability, and other targets in real operational environments.
  • Validate and improve the design software tools for modeling and simulating the operation of the facility and cooling equipment.
  • Demonstrate through accurate simulation that the prototype can be replicated in other European sites with less favourable conditions.
The name Boden Type Data Centre One is naturally created for the first Type DC in the location of Boden, in the north of Sweden. Close to the clean and high quality electricity supply by renewable energy source, ideal climate and service infrastructure."

These are laudable aspirations and goals and well in keeping with Carbon3IT's own view on data centres for the future.

However, and this is not a criticism, more of an observation: there is no reference to existing data centre design and build standards, nor has any classification in terms of an Uptime Institute Tier or EN50600 Class been applied to the site. In my opinion, and contrary views are welcome, at best this site and its current infrastructure would be classed as a Tier 1/Class 1 facility, but it is none the poorer for being classed as such; it is, after all, a research project.

So, when I visited, this EU funded project was still a work in progress: the building was 98% complete and undergoing some final AC commissioning during my visit, and only 1 of the 3 pods had any active IT equipment installed. The configuration is 3 pods, which will be fitted with an OCP stack (this is the active pod), an HPC cluster and finally a blockchain (cryptocurrency) mining operation.

It does not have a UPS or a generator, but there is space for a UPS within the power room, and the use of a generator is a moot point when you have a hydropower station and biogas plant literally metres away. The connection to the biogas plant is still under discussion, and information on the hydropower station can be found here.
It should also be noted that the site is also literally meters away from

The DC uses 100% free cooling. There is a cold air corridor which feeds ambient air into the eco-cooling equipment; this air is then (depending on season) tempered with hot return air in winter to provide a range of inlet temperatures for the IT equipment, or in summer passed straight through to the IT equipment, with the hot air being vented out to a vaulted roof and then outside.
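That winter tempering is a simple mixing calculation. As a hedged sketch (an energy balance that ignores humidity and fan heat, with made-up example temperatures rather than Boden's actual set points):

```python
def return_air_fraction(ambient_c: float, return_c: float,
                        target_supply_c: float) -> float:
    """Fraction of hot return air to blend with ambient air so the
    mixed supply hits a target IT inlet temperature.

    Simple linear mixing: assumes equal specific heat for both streams
    and ignores humidity and fan heat.
    """
    if not ambient_c < target_supply_c < return_c:
        raise ValueError("target must lie between ambient and return temps")
    return (target_supply_c - ambient_c) / (return_c - ambient_c)

# -20C ambient, 35C return air, 18C target inlet (illustrative values)
print(round(return_air_fraction(-20.0, 35.0, 18.0), 3))  # 0.691
```

At -37C outside, the control system would need an even larger recirculated fraction, which is why the tempering loop matters as much as the cold air corridor itself.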

This next section was written today (24/11/2019), and the site is now fully operational and generating some "sweet" data which you can view here; the current PUE is on the top left of the page (the reason I posted the link is that this is live, real time data and I didn't want to "date" this post).

This is very impressive, however it has to be tempered with the fact that this is an unmanned test facility with some of the latest equipment, heavily optimised by the research team, and as stated earlier it would be classed under both the UTI Tier Topology and the EN50600 Classification as a Tier/Class 1 site.

Yes, it is clearly a VERY energy efficient site: the ratio of total energy consumption to IT energy consumption is around the 1.007 PUE mark.
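PUE is exactly that ratio, so the arithmetic is worth making explicit. A one-line sketch (the example numbers are chosen to reproduce the 1.007 figure, not taken from the live feed):

```python
def pue(total_facility_kwh: float, it_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT energy (>= 1.0)."""
    if it_kwh <= 0:
        raise ValueError("IT energy must be positive")
    return total_facility_kwh / it_kwh

# e.g. 100,700 kWh total for 100,000 kWh of IT load
print(round(pue(100_700, 100_000), 3))  # 1.007
```

A PUE of 1.007 means only 0.7% of the energy is overhead (cooling, distribution losses), against the 1.5-2.0 typical of conventional air cooled facilities.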

So, some observations: the cold aisle temperatures in Pods 1 & 3 appear to be below the ASHRAE recommended range, at around 13.2 degrees in Pod 1, the OCP pod, and 13.6 in Pod 3, the ASICs pod. I'll make a mental note to ask the RI:SE team in Stockholm later this week why this is the case, but it raises an interesting question about supply temperatures.

It is interesting that there is no UPS or generator. Could, or would, a commercial DC operator consider this? I doubt it, although the prudent application of the EU Code of Conduct for Data Centres (Energy Efficiency) best practices, applied to an organisation in relation to the "actual" mission critical IT functions via BP 4.3.x, may yield an interesting view.

At best, the Boden One is a "hyper-edge" facility and could be deployed to link "edge" sites in a hub/spoke configuration.

My conclusion is that this research has to be done, and the mission parameters are laudable:

Efficient fresh air cooling. Modular building design. Renewable energy. Favourable location.

However, DCs cannot always take advantage of fresh air cooling, on-site or local renewable energy or favourable locations, and I'm sure that many in the community will be dismissive of the project and its results, stating that it's not real world. That is missing the point.

If we strip some of the historical rhetoric out of DC design and educate users, we could build DCs using this research, build them cheaply, use less energy and save time deploying compute to remote locations. If we have to give up some uptime and resilience, is that necessarily a bad thing?

I'll be available at the Data Center Forum on the 28th November in Stockholm to defend my corner, I'll look forward to some robust discussions.

7th December 2019 - Update
FANTASTIC news from the recent DCD Global Awards held in London on the 5th December: Boden won an award, the...

Well Done!


Energy Efficiency - The 5th Fuel

Saw this BBC Article
ages ago and wrote an article but forgot to post it. However, I thought it was worthy of further discussion as it has some interesting insights, so I decided to post it in full below. Our comments are in italics.

"Installing a single low-energy LED bulb may make a trivial contribution to cutting the carbon emissions that are overheating the planet.
But if millions choose LEDs, then with a twist of the collective wrist, their efforts will make a small but significant dent in the UK's energy demand.
Studies show making products more efficient has - along with other factors - already been slightly more effective than renewable energy in cutting CO2 emissions.
The difference is that glamorous renewables grab the headlines.
The "Cinderella" field of energy efficiency, however, is often ignored or derided. 

Yes, unfortunately, I've been to many data centres where the basics of energy efficiency are often ignored or derided, hopefully this article may save me some time in the future!

Who says this?

The new analysis of government figures comes from the environmental analysis website Carbon Brief.
Its author says EU product standards on light bulbs, fridges, vacuum cleaners and other appliances have played a substantial part in reducing energy demand.
Provisional calculations show that electricity generation in the UK peaked around 2005. But generation per person is now back down to the level of 1984 (around 5 megawatt hours per capita).


How much carbon has been reduced?

It’s widely known that the great switch from coal power to renewables has helped the UK meet ambitions to cut carbon emissions.
The report says the use of renewables reduced fossil fuel energy by the equivalent of 95 terawatt hours (TWh) between 2005 and now. And last year renewables supplied a record 33% share of UK electricity generation.
But in the meantime, humble energy efficiency has contributed to cutting energy demand by 103 TWh. In other words, in the carbon-cutting contest, efficiency has won – so far. And what’s more, efficiency is uncontroversial, unlike wind and solar.

Yes, efficiency is avoided cost, it also means that new plant doesn't need to be built

What role has industry played?

The energy efficiency story doesn’t just apply to households. There have been major strides amongst firms, too. Big supermarkets have worked hard to improve the performance of their lighting and refrigeration.
And because firms and individuals are using less energy, that has offset the rise in energy prices. So whilst the prices have gone up, often bills have gone down.
The issue is complicated, though. Other factors have to be taken into account, such as energy imported via cables from mainland Europe, population growth and shifts from old energy-intensive industries.

UK data centres represent a small amount of the total overall energy consumption, or so they would have us believe. Here at Carbon3IT we think the actual amount that can be attributed to the sector is somewhat higher; that said, we include all IT energy and think it's around 12% of total consumption. Let us know if you think differently, and we can discuss.

Should 'Cinderella' efficiency be allowed to shine?

Simon Evans from Carbon Brief told BBC News: “Although the picture is complex it’s clear that energy efficiency has played a huge role in helping the UK to decarbonise – and I don’t think it’s got the recognition it should have.
"Say you change from a B or C-rated fridge to an A++ rated fridge. That can halve your energy use from the appliance, so it’s pretty significant.”
The UK government has consistently said it champions energy efficiency, but campaigners say it could do more. The UN’s climate body also supports energy efficiency as a major policy objective, although the issue features little in media coverage.
But supporters of efficiency argue that ratcheting up efficiency standards for everything from planes and cars to computer displays and freezers offers the best-value carbon reductions without the pain of confronting the public with restrictions on their lifestyle choices.
Joanne Wade from the Association for Conservation of Energy told BBC News: “I haven’t seen these figures before but I’m not surprised.
“The huge improvement in energy efficiency tends to be completely ignored. People haven’t noticed it because if efficiency improves, they are still able to have the energy services that they want. I suppose I should reluctantly agree that the fact that no-one notices it is part of its appeal.”
Scientists will be keen to point out that government-imposed energy efficiency is just one of a host of cures needed to tackle the multi-faceted problem of an overheating planet.

We totally agree! The use of the EUCOC can reduce energy consumption in data centres by 25-50%, and possibly more if you adopt some of the latest technologies and optional best practices.

If you want any assistance in reducing your ICT energy consumption, let us know by emailing "" or by following and then messaging us on our twitter feed @carbon3it 

Sunday 20 October 2019

Wow, so long since our last post!

Well, they say that the devil makes work for idle hands, so you can assume that we've been very busy, seeing as it's over 9 months since we last posted, so we need to give you an update!

In our last post we spoke a little bit about Brexit. Well, as expected, it's turned out to be rather more complex than most people thought: the current situation is that the UK is being run by the very people who promoted it, and even they can't find a way through. We reserve further comment until the situation becomes a little clearer.

So, what have we been up to?

It makes sense to list our current projects (where we are not bound by NDAs!)

FT NHSD GP IT Futures Project

Towards the end of last year we were asked to review the list of data centre standards applicable to NHS hosting for various projects. We consolidated this list down from over 500 individual points to a set of international standards, principally the EN50600 series of data centre design, build and operate standards developed by industry professionals over the last 7 years, plus a few others.
As a result of this successful project we were asked to assist in the market engagement and assurance element of a new NHSD tender for GP IT Hosting Services; this work continues.


CATALYST Project

The EU-funded CATALYST project aspires to turn data centres into flexible multi-energy hubs which can sustain investments in renewable energy sources and energy efficiency. Leveraging the results of past projects, CATALYST will adapt, scale up, deploy and validate an innovative technological and business framework that enables data centres to offer a range of mutualised energy flexibility services to both electricity and heat grids, while simultaneously increasing the resilience of their own energy supply.
We have been retained as in-house consultants to assist Green IT Amsterdam in delivering their elements of the project, which are: to play an active role in defining innovative business models, to design the impact creation strategy, and to lead the establishment of the Green Data Centres Stakeholder Group, so as to share information on environmentally sustainable data centres and foster knowledge and experience exchange. In addition, we will be responsible for assessing the CATALYST framework and introducing the CATALYST Green DC Assessment toolkit, which will offer support services to data centre operators and developers, certification bodies and other stakeholders to identify, measure and assess relevant KPIs and metrics in a modular, structured and organised way.
This project continues and is scheduled to end in September 2020.
As part of this project, our MD has been travelling widely across the EU and beyond, and expects to do a lot of travel over the next 11 months. So far this year he has been to Oslo, Amsterdam, Helsinki, Poznan, Copenhagen, Stockholm, Brussels and Bangalore, India on behalf of the project, and has trips scheduled to Milan, Dublin and Stockholm for the rest of 2019, with occasional trips to Amsterdam for project updates. Next year's calendar has yet to be scheduled, but keep an eye on the @catalyst-dc and @carbon3it Twitter feeds for news. We'd like to invite all followers to our next CATALYST Green Data Centres Stakeholder Group event, taking place in Dublin, Ireland on the 20th November; full details and registration can be found on the page, where other information on the project can be found.


CEEDA

It's been a relatively quiet year for CEEDA, with only 2 assessments taking place: the coolDC Lincoln Discovery Centre, which was the "operate" element of a CEEDA design and operate assessment, and one in Spain (which is covered by an NDA). Both are in the final stages of reporting, and all will be revealed at the DCD London event in early November; you can register for the event here
We'll have 2 speaking slots at this year's conference; one has yet to be finalised and the other is on "10 years of CEEDA: Recommendations and Insights from the field", so register to attend!
We have 3 re-assessments taking place in 2020, some held over from 2019 due to operational issues; these are in Germany, Gibraltar and Scotland, plus a new client in Saudi Arabia. We understand from DCD that there will be a greater push on the CEEDA product moving forward. This is largely due to the climate emergency and the dual aspect of ICT globally: first, mitigating the impact of ICT on the environment itself, and second, using ICT systems to monitor, predict and mitigate other aspects of the emergency. We are working very closely with DCD on this topic.
We've also been invited to judge this year's DCD Awards and we are currently looking through the submissions for our particular category; you'll have to wait until the 5th December to hear the final award winners!

We are delivering some ad-hoc training on "energy and cost management in data centres". This is an old course no longer carried by the original supplier, but our client has specifically asked us to deliver it, based on our knowledge and connections in the industry, and also because we'll be visiting their DC and pointing out where potential energy savings can be made. We'll report back once we've delivered the training (late Oct).

We provide assistance in the development of ISO management systems, specifically in the data centre field, namely ISO9001, 14001, 22301, 27001, 45001 & 50001.
Our Operations Director has recently completed the NEBOSH General Certificate in Health and Safety and is awaiting the results.
We have been providing assistance to a local DC for ISO50001/ESOS.

We continue our work with the global standards bodies: we have recently attended the BSI TCT7/3 group, been seconded to the TCT7/3/1 group working on a revamped EN50600-4-X standard, and presented to the JCG - GDC TC215 committee on the CATALYST project (as part of our work!)

Speaking Engagements
We've spoken at a number of events this year on behalf of the CATALYST project and the EUCOC, mostly at events in Oslo, Helsinki and Copenhagen, and we are also scheduled for Stockholm in late November.
You'll also see us at DCD London, DC Ireland and potentially in Copenhagen and Berlin, but we're in the early stages and nothing is confirmed yet.

So, in summary: we said in our last post that we felt this year was going to be special, and so it has proved.

We continue to provide the following services, and one of our major tasks over the coming months is to undertake a serious revamp of our website, so look out for that!

EU Code of Conduct for Data Centres Review and Preparation
CEEDA assessments (with our DCD partners)
ISO Management Systems for ISO9001/14001/22301/27001/45001 and 50001
Data Centre Audits (with our M&E partners)
Data Centre Training (on site, and tailored to your requirements)
Data Centre Support Services - Compliance

Special Services: if you have a problem that needs solving, let us know. Through our wide network of consultants, supply chains and operators, we've probably come across the problem before and may therefore be able to help.

Until next time (and we promise it won't be 10 months!!)