This is going to be the first in a series of articles on
liquid cooling in the data centre environment, with a view to sparking some
debate.
The use of liquid cooled technologies in the data centre has been akin to the debate about nuclear fusion, insofar as it is always 5-10 years from adoption.
We've been tracking the use of liquid cooled technologies for some time now, and we are party to some developments that have been taking place recently. It is my belief that we are going to see some serious disruption in the sector sooner rather than later.
We'll be looking at a unique way to implement liquid cooled solutions in the next few posts, but before that we need to understand the direction of travel outside of the DC space.
Customers are increasingly using cloud services, and it appears that organisations still buying physical equipment are only doing so because "that's the way we've always done IT", because they are using specialist applications that cannot be provided via cloud services, or because they are wedded to some out-of-date procurement process. This is supported by the fact that conventional off-the-shelf server sales are in decline, whilst specialist cloud-enabled servers are on the up but are being purchased by cloud vendors and the hyperscalers (albeit that they are designing and building their own servers, which by and large are not suitable for conventional on-premise deployments). There is also the Open Compute Project, which is mostly being used for pilots via development teams.
So, with cloud servers, the customer, in fact any customer, has no say in the type of physical hardware deployed to provide that SaaS, PaaS or IaaS solution, and that is the right approach. IT is increasingly becoming a utility, just like electricity, water and gas. As a customer of these utilities I have no desire to know how my electricity reaches my house; all I want is that when I flick the light switch or turn on the TV, the light comes on and I can watch "Game of Thrones". I certainly do not care which power station generated the electricity, or how many pylons, transformers and cables the "juice" passed through to get to my plug/switch.
For digital services, I access a "service" on my smartphone, tablet, desktop etc. Via a broadband or wifi connection, I access the "internet" and am then routed through various buildings containing network transmission equipment to the physical server(s) that the "service" resides upon (which can be located anywhere on the planet). Nothing, apart from my own access device, is my asset; I merely "pay" for the use of it. And I don't pay directly: I pay my ISP a monthly fee for access, and the supplier directly for the service that I am accessing, and somehow they pay everybody else.
So, in essence, digital services are (to me) an app I select on my digital device; information is then sent and received over digital infrastructure, either fixed, such as a broadband connection, or via a wifi network. I have no knowledge of, nor do I care, how the information is sent or received, merely that it is.
What does this have to do with the use of air or liquid in a data centre?
Well, it's actually quite important, but before we get there we need to cover the basics of data centres, and we'll be doing that in the next post.