Top Data Center Trends and Predictions to Watch for in 2020
Opinions below have been contributed by Bill Kleyman (Switch), Bob Bolz (Aquila, Inc.), Ian Seaton (Independent Consultant, formerly CPI), Lars Strong (Upsite Technologies), and Mark Acton (Independent Consultant/EkkoSense).
Introduction and closing remarks written by Bill Kleyman.
The really amazing part about 2020 is that a lot of people in our industry, at least to some extent, have already familiarized themselves with many of the common buzz terms that surround us. Cloud, security, data center, IoT, and others have all become mainstay terms and phrases. And many realize that, beyond buzz terms, these are very real trends with very real impacts on our industry. With that being said, in my years of research, writing reports, speaking on stage, and continuing to push this industry forward, never has there been a time as exciting as this new decade. A lot is happening, and it will absolutely impact what you do every day. Whether you’re a consumer or a builder of a digital tomorrow, things are about to get even more connected. None of this is scary, but you do need to take note of these trends and, at the least, be prepared for what’s to come.
5G and Edge are real and you should be ready for these solutions. Imagine a 4-lane highway going into a major city, but 25 years ago. There aren’t that many cars, people are generally happy, and traffic moves well. Fast-forward to today, where there are a lot more cars, but no one put in more lanes. Simple as it is, this is what happens on our networks, both cellular and Internet. Cisco’s mobile report shows that in 2016, 4G already carried 69 percent of total mobile traffic and represented the largest share of mobile data traffic by network type, and it will continue to grow faster than other networks to represent 79 percent of all mobile data traffic by 2021. This is why 5G connectivity, with its very high bandwidth (100 Mbps) and ultra-low latency (1 millisecond), is expected to drive very high traffic volumes. Basically, 5G is your new set of lanes, capable of handling more traffic at higher speeds with fewer bumps along the way. Further, the introduction of 5G will accelerate the trend of edge data center networks extending their reach closer to end users than ever before. As far as edge is concerned, think of it simply as a method of processing data and services as close to you, the user and consumer, as possible. Let me give you an example. If you’re lucky enough to have a Tesla, you probably want those cameras and that AI engine to process what’s in front of you pretty quickly, right? Otherwise, you’d have to wait for those images and data points to reach a remote data center and come back to you with insights and results. As you can imagine, in driving, a split second can make all the difference. That’s what edge does: it allows you to process critical data points as close to the source as possible, which matters enormously for latency-sensitive applications. Be sure to take some notes and research both 5G and edge systems and how they’ll impact your part of the world.
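To make the latency argument concrete, here is a minimal back-of-the-envelope sketch. The distances, compute times, and overhead figures are illustrative assumptions, not measurements; the point is simply that propagation delay dominates once the data center is far away.

```python
# Illustrative round-trip latency budget: edge site vs. remote data center.
# All figures below are assumed, round numbers for the sake of the example.

SPEED_IN_FIBER_KM_PER_MS = 200  # light travels roughly 200 km per ms in fiber

def round_trip_ms(distance_km: float, processing_ms: float,
                  network_overhead_ms: float = 5.0) -> float:
    """Propagation there and back, plus processing and fixed network overhead."""
    propagation = 2 * distance_km / SPEED_IN_FIBER_KM_PER_MS
    return propagation + processing_ms + network_overhead_ms

# An edge node 10 km away vs. a regional data center 1,500 km away,
# each assumed to spend 10 ms of compute time on the request.
print(f"edge:   {round_trip_ms(10, 10):.1f} ms")    # ~15 ms
print(f"remote: {round_trip_ms(1500, 10):.1f} ms")  # ~30 ms
```

Doubling the round trip may not matter for a web page, but for a vehicle moving at highway speed those extra milliseconds translate directly into stopping distance.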
Conversely, Edge, IoT, autonomous vehicles, 5G, and smart cities will continue to be granted significant marketing and PR budgets, but significant incremental changes or major milestones may be unlikely during 2020 alone. The investment and scale of deployment required for both 5G and smart cities will inhibit rollout and ensure that significant steps forward in these areas are unlikely during 2020. Edge, in its purest sense, is merely a deployment strategy rather than a new technology. Similarly, IoT is not new; rather, it is the scale at which remote devices and sensors are being deployed, and the data they generate, that is accelerating change. What will become increasingly pressing during 2020 is the combination of the data becoming available and the increasingly sophisticated analytical techniques being applied to multiple data sets, meaning that the information now accessible is far greater than the sum of its disparate parts. This will become increasingly apparent during 2020, with resulting social and legal challenges around privacy and information rights as governments and legislation struggle to keep pace with technological advancement.
The most significant talking points are likely to come from the fields of AI and machine learning operating on increasingly large data sets, effectively creating accurate detail from information previously hidden or impossible to collect. The extremely rapid progress in this area is only likely to accelerate in 2020. The obvious benefits created by progress here will inevitably also create increasing levels of concern about how these technologies are applied, who can use the information generated, and the attendant privacy implications. The recent cease-and-desist letter from Twitter to Clearview AI over its use of facial recognition technology is a good example of a trend that is likely to continue throughout 2020 and beyond. During 2020, governments globally will increasingly attempt to address the issues created by developments in AI and ML, alongside the increasing amounts of personally identifiable data available, without stifling the benefits that these technologies can bring. An increasingly difficult path to tread.
As a result of much of this, the convergence of HPC and hyperscale data centers will continue. AI, machine learning, IoT, media distribution, fintech, and other data-intensive operations have pushed cloud data centers toward providing higher-bandwidth, lower-latency connectivity. This will continue to be the case as we move closer to smart city implementations, where fast data movement will be key to providing the software infrastructure required to support these new compute paradigms.
Google’s claims of quantum supremacy in late 2019 have been seen by some as a little fanciful; however, it does seem possible that quantum computing could take the step of becoming commercially viable in late 2020. Quantum computing is not set to replace our current digital computing, but for some specific problems it promises to radically reduce the time to reach a solution. In this respect it could significantly augment, and probably revolutionize, computing in some areas once commercially available. We appear to be on the cusp of this revolution.
The global data center industry will waste over $18 billion in 2020 due to inefficient cooling practices and poor airflow management. According to the Forbes Technology Council, global data center energy use in 2020 should be around 728 billion kWh. According to the latest Uptime Institute data center survey, the recent history of significant PUE improvements began flattening out a couple of years ago, and we should expect the global average in 2020 to be just north of 1.50. With total global data center energy estimated at 728 billion kWh, at a 1.5 PUE, around 243 billion kWh will be used for something other than IT equipment. Significantly lower PUEs are realistically achievable, demonstrated in part by legislation in Shanghai mandating that new data centers be designed to operate under a 1.30 PUE. Given that the objective of any such legislation is to reduce data center energy use but not necessarily data center construction, we can safely assume the powers that be have been adequately educated on what passes both the PR and business viability tests. The Shanghai target more than likely assumes some level of free cooling, since 70% of the year there is below 76°F dry bulb and 75% is below 72°F wet bulb. Such free cooling assumptions are instructive about electricity dollars left on the table, considering respondents to the 2019 iteration of Vertiv’s Data Center 2025 survey reported that nearly as many facilities will rely on mechanical cooling by 2025 as on air cooling and liquid cooling combined. Interestingly, those ratios have changed by less than the statistical margin of error since the original survey in 2014. In addition, a half dozen years ago, Dell promoted the idea that its new servers could be deployed in 90% of U.S. data centers without mechanical cooling. If we assume that maybe 5% of these sub-2.0 PUEs constitutes electrical conversion losses, then we are still left with a mechanical load component of 0.45 on average, two thirds of which results from poor airflow management and inadequate implementation of free cooling. That waste constitutes around 140 billion kWh of the non-IT global data center energy use. At a global average of $0.13 per kWh, our industry will spend roughly $18.2 billion on inefficient or unnecessary data center cooling in 2020. Astute readers will notice that this is a dramatic increase over last year’s forecast for 2019, but I would remind those readers that last year’s forecast excluded any use of free cooling, which reflects neither actual nor legislated behavior in many regions.
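For readers who want to check the math, here is the chain of arithmetic behind that figure as a short Python sketch. All inputs are the estimates quoted in the paragraph above, not independent data:

```python
# Reconstructing the cooling-waste estimate from the figures cited above.

total_energy_kwh = 728e9   # estimated global data center energy use, 2020
pue = 1.5                  # assumed global average PUE
conversion_loss = 0.05     # assumed share of PUE from electrical conversion losses
avoidable_share = 2 / 3    # share of mechanical load attributed to poor airflow
price_per_kwh = 0.13       # assumed global average electricity price, USD

it_energy = total_energy_kwh / pue                  # ~485 billion kWh to IT gear
non_it_energy = total_energy_kwh - it_energy        # ~243 billion kWh overhead
mechanical_load = (pue - 1 - conversion_loss) * it_energy  # MLC of 0.45
waste_kwh = avoidable_share * mechanical_load       # ~146 billion kWh

print(f"non-IT energy:   {non_it_energy / 1e9:.0f} billion kWh")
print(f"avoidable waste: {waste_kwh / 1e9:.0f} billion kWh")
print(f"cost:            ${waste_kwh * price_per_kwh / 1e9:.1f} billion")
# Note: the text rounds the waste down to ~140 billion kWh before pricing,
# which is how it arrives at the slightly lower $18.2 billion figure.
```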
Data center liquid cooling will begin to establish some traction in 2020. A significant milestone has been reached that should lift actual activity on data center liquid cooling much closer to the level of conversation on the subject over the past more-than-a-few years. That conversation has remained fractured enough to make it nearly ubiquitous at any forum for data center discussion, from conference seminar to virtual thread. After all, when “liquid cooling” can refer to anything from server immersion to rear door heat exchangers to row-based DX cooling units to tower-fed liquid-to-liquid heat exchangers to undersea containers, we can pretty safely say it has arrived and has not yet arrived at the same time. For my purposes here, liquid cooling refers to either liquid immersion or direct contact liquid cooling. Adoption has been incremental and sporadic. While some liquid cooling providers have invested in growth or courted investors, others have de-emphasized their data center initiatives. Similarly, while some hyperscale players have waded into the water (Google and Alibaba, notably), others have put in a toe only to retreat to dry land (Microsoft). Nascent partnerships between liquid cooling developers and ICT OEMs or major HVAC vendors promise to continue moving the needle. The most interesting development, however, has been the release of the Open Compute Project’s “Immersion Requirements Document.” There is nothing like an industry standard to chip away at barriers to adoption. A case in point comes from the electronic components industry, where I spent some time early in my career. Surface mount technology had been developed around 1960, but it never really got off the ground until the 1980s, when standards for both product packaging and delivery systems were released by Western Electric, JEDEC, IEC, and IEEE. While the OCP document does not purport to be an industry standard, since it is focused on integration with other pre-existing or pending “open” platforms and does not make significant application distinctions between single-phase and two-phase dielectrics, it does organize a significant body of requirements, ranging from safety to spill management to material data. Of particular note is a relatively complete list of performance comparison metrics, so one liquid cooling solution can be compared both to other liquid cooling solutions and to standard air cooling solutions for efficiency and effectiveness (a minimal sketch of that kind of comparison follows below). There is also a long list of liquid specification requirements – not numbers with which to comply but numbers to be documented, plus hard minimum dielectric requirements. This document is an important development, but not quite a game-changer for opening the floodgates, since the Open Compute Project, while gaining traction, does not yet pack an ANSI wallop. For example, when I first started working on hot and cold air containment in the data center, adoption crept along a very slight slope, despite documentation by ASHRAE TC9.9 and BICSI. However, when NFPA 76 blessed containment designs and established some important material and deployment requirements, the floodgates were released, and today containment is ubiquitous. Likewise, liquid cooling will continue gaining traction, but if a more universally recognized standards body takes up the lead the OCP has established, we will finally see the much-anticipated bump.
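As an illustration of the apples-to-apples comparison such metrics enable, here is a minimal sketch using partial PUE (pPUE), a standard way to score one cooling subsystem in isolation. The kilowatt figures are invented for the example and do not come from the OCP document:

```python
# Hypothetical comparison of cooling approaches via partial PUE:
# pPUE = (IT energy + cooling energy) / IT energy, per deployment zone.

def partial_pue(it_kw: float, cooling_kw: float) -> float:
    return (it_kw + cooling_kw) / it_kw

# Invented figures for a 100 kW row: air cooling with CRAH fans and chillers
# vs. direct-contact liquid cooling with pumps and a dry cooler.
air = partial_pue(it_kw=100, cooling_kw=35)     # 1.35
liquid = partial_pue(it_kw=100, cooling_kw=8)   # 1.08
print(f"air-cooled pPUE:    {air:.2f}")
print(f"liquid-cooled pPUE: {liquid:.2f}")
```

The value of a shared requirements document is exactly this: everyone computing the same ratio over the same boundaries, so vendor claims become directly comparable.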
With that being said, the power consumption of CPUs, GPUs, and FPGAs has risen to such a level that data center managers’ resistance to liquid cooling is dropping for such systems. This is, however, selective: lower-power systems have remained air cooled in spite of the very real economies achievable. Without change, the power consumed by IT and data centers will exceed 4% of the power generated worldwide. In the future, companies will be forced to adopt the highest-efficiency systems just to stay in business, driven by legislation and public opinion around curbing global warming and environmental degradation.
Data center design must revolve around efficiency. That means don’t just build for space – make sure you can fill those buckets. It’s not just about square footage anymore. Leaders in the hyperscale and data center space are building solutions designed around efficiency, density, and maximum uptime. This means rethinking how you deploy servers and storage ecosystems, and even how you build data centers in general. Converged systems coupled with dense rack solutions are allowing companies to vastly reduce the amount of space they need while still helping their business grow. To that end, in 2020 and beyond, be sure to review your own designs so you can meet the demands of a fast-moving industry. This may very well be an opportunity to move to a partner, or to design a more optimal data center ecosystem.
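A quick way to see whether you can actually “fill those buckets” is to check which constraint binds first at your design density: floor positions or critical power. A minimal sketch, with entirely invented inputs:

```python
# Back-of-the-envelope check on whether space or power limits a design.
# All inputs below are invented, illustrative numbers.

facility_power_kw = 2000   # critical power available for IT
floor_positions = 250      # rack positions the floor plan allows
rack_density_kw = 12       # average design density per rack

racks_by_power = facility_power_kw // rack_density_kw
racks_deployable = min(floor_positions, racks_by_power)
binding = "power" if racks_by_power < floor_positions else "space"
print(f"deployable racks: {racks_deployable} (limited by {binding})")
# At 12 kW/rack this floor is power-bound: 166 racks, leaving 84 positions empty.
```

Running this across a few candidate densities shows why building for space alone wastes capital: beyond a certain density, empty floor is what you are paying for.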
Genuine data center standards are becoming available and will represent a significant and beneficial change for the data center sector. The initial release of the first set of documents for ISO/IEC 22237 is likely in 2020. Currently this series is available only at the Technical Specification (TS) stage; however, work is ongoing to develop these documents into a full ISO/IEC standard. This will be a first in the global data center standards arena: for the first time we will have agreed international standards for many aspects of the data center, including construction, power delivery, environmental control, cabling, and security. As an ISO standard, it will align with other commonly applied ISO standards such as ISO 9000, 50000, 14000, and 27000.
Data-driven data centers are a real thing. Look for smart sensors, data aggregation, and even VR/AR in the data center. I remember putting on a VR helmet and looking at a row of racks. And you know what? It was amazing. The detail was unbelievable, and I could manipulate the entire environment, seeing exactly what was in the rack and where it was placed. Is this the future of data center monitoring and management? Maybe. Future data centers will be smart and connected, to the point that they’ll be both predictive and prescriptive in how they operate. Data-driven solutions help surface patterns and insights that we, as humans, could never see without help. I’m actually most excited about this one. Upcoming industry reports (AFCOM and InformationWeek) indicate further investments in smart technologies, and the AFCOM State of the Data Center report goes on to state that many leaders are seeing ‘smart become the new normal’ for leading data center organizations. Be sure to pay attention to this trend and quickly learn where you can adopt smart technologies.
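At its simplest, “data-driven” means letting the telemetry flag what a human walking the floor would miss. Here is a deliberately minimal sketch of that idea, with made-up rack inlet temperature readings; real deployments would use far richer models, but the shape is the same:

```python
# A minimal sketch of the "data-driven" idea: flag rack inlet temperature
# readings that drift beyond a rolling baseline. Readings are made up.

from statistics import mean, stdev

readings_c = [22.1, 22.3, 22.0, 22.4, 22.2, 22.5, 22.3, 26.8]  # last one drifts
window = readings_c[:-1]                  # baseline from earlier samples
baseline, spread = mean(window), stdev(window)

latest = readings_c[-1]
if abs(latest - baseline) > 3 * spread:   # simple three-sigma threshold
    print(f"ALERT: inlet temp {latest}°C deviates from baseline {baseline:.1f}°C")
```

Aggregate thousands of these streams, add history, and the same pattern becomes predictive (when will this CRAC fail?) and eventually prescriptive (what setpoint should it move to now?).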
Security goes both ways… that is, physical and logical security. Make sure the bad guys can’t deploy ransomware, and that they can’t simply walk through the front door. Some of the most critical threats covered in recent reports I wrote were ransomware and data breaches, outside human threats, advanced persistent threats or targeted attacks, insider human threats, and loss of PII (personally identifiable information). In talking with a few of the respondents, it became clear that there is real concern around both physical attacks and those that come from the cyber world. The future of security will absolutely continue to account for everything the bad guys are throwing at our data centers: exploits, DDoS attacks, targeted vulnerabilities, and much more. But there’s a very real physical threat as well. Just because it’s not being reported doesn’t mean it’s not happening. Whether an incident is malicious or not, you need to focus on the physical security of your infrastructure. The AFCOM report indicated that about half of respondents had experienced some type of physical security incident. That’s a lot. Know that you’re a target, and stay vigilant and prepared.
Blockchain development may well see increasingly useful commercial tools becoming available during 2020. Operating as a public distributed ledger, blockchain is likely to revolutionize both B2B and B2C transactions globally, in the form of smart contracts and fundamental changes to financial operations. There is significant investment and research going into blockchains for specific purposes and sectors that will become increasingly commercially viable. Thus far the promise has not been fulfilled; however, the technology is maturing, and 2020 could well bring the step change that has been promised for some years.
There must be advancements made in recruiting young talent. Their values are evolving, and it may be harder to get good people than you think. One thing I’ve learned over the years is that the best managers adapt to your skill set to manage you as effectively as possible. A major challenge for millennials was working with an older generation that simply followed the book on management. I never enjoyed those kinds of environments. Luckily for me, I had a series of amazing leaders who took the time to teach me through hands-on experiences and helped me learn a lot of complex concepts along the way. I’m forever grateful to those managers. Now, we face a new generation. Generation Z is poised to be quite different from millennials and is already making waves. For example, the millennial generation is often seen as ‘job-hoppers,’ constantly looking for a better opportunity to grow and make more money. Don’t expect the same from Generation Z. A study by Indeed.com shows that the younger workforce is much more interested in “future-proof” jobs and job stability. Remember, this generation, like many others, grew up around a lot of instability. Many were very young, but they still remember 9/11, and they did live through the Great Recession. For them, stability in jobs (mainly in value-based industries like healthcare, life sciences, and tech) is really important. The point is that you need to understand your young employees and how they define their value system. Do your research and help this future generation grow and learn. Remember, the way they learn and succeed may be quite different from how you got ahead. Be ready to adapt.
The shape of the data center market will continue its girdled inverse boa constrictor squeeze. I like this image of a boa constrictor with its mouth full and dragging a carcass, though the observation is not likely served by too deep a dive into the metaphor’s details. Nevertheless, this is what our data center world is starting to look like, and it will only become more pronounced in the coming year. Specifically, as previously noted, growth in our industry will continue to be focused on both the edge and hyperscale facilities, leading to less attention paid to the historic enterprise data center. And with the eruption of IoT in the consumer and government arenas – which turn out to be relatively small potatoes compared to looming industrial requirements – we hear the noise about micro data centers and cell tower modules and all sorts of self-contained processing and storage hubs. The noise is turning into real proliferation. The only reason activity on the ground may not appear to match the hype is that there are so many very different ways to occupy the edge that our desire to locate and point at one thing is frustrated. And I would not necessarily call this a migration to the edge so much as the creation of new occupants for the edge. If there is, in fact, any migration, it will continue to be in the other direction, as enterprise workloads move into managed services, cloud services, and colocation spaces, and these quasi-virtual spaces are physically located in hyperscale facilities owned by a small group of players attracting investments from mainstream sources not typically associated with the data center world – follow the money, as they say. That investment pattern, combined with massive energy contracts like this past year’s 70+ MW contract in Virginia, points to the health of the bulge at this end of the boa de-constrictor as well.
There’s enough here alone to make your head spin. And none of this is slowing down. To really keep pace with what’s happening, I suggest you do a few things: follow a few good blogs, read a few good tweets, and participate in an industry event or two. Sure, some of the topics get repeated, but that’s really because they’re important for you to note. The best way to get ahead in a fast-paced industry is to not navigate the seas of technology alone. Work with good partners and ask lots of questions; this is the best way to both learn and share your knowledge. Most of all, don’t be afraid to try new things. As scary as some new solutions might be, remember something important: to become a disruptive entity, you yourself might need to be disrupted. Don’t be afraid to step out of your comfort zone and learn something new. The good news is that you don’t have to do this alone. Now, go make 2020 an amazing year and a great start to a new decade.