Artificial Intelligence

Artificial intelligence runs more data centers, but still won’t relieve technology staffing woes


These are lucrative times for technology suppliers, but with the tsunami of technology cascading into the market, end-user customers are increasingly overwhelmed with it all. Artificial intelligence to the rescue? Hold that thought.  

Photo: Joe McKendrick

That’s the word from a recent Uptime Institute survey showing that technology vendors and data center operators are seeing almost nonstop expansion. For the third year in a row, about 80% say customer spending is at or above normal levels. Half expect strong capital spending growth in the next three to five years. About one in three predicts some deceleration in growth rates, but only one in six expects spending to flatten or shrink.

Good news, but where’s all that new stuff going to go? Forecasting data center capacity remains a dark art — and for the fourth year running, suppliers say it is the biggest issue their clients face. Managing a combination of different data center environments, typically some mix of on-premises, colocation (both retail and wholesale), and cloud, rose to become the second-largest challenge for operators.

In addition, nearly half (47%) of data center operators report difficulty finding qualified candidates, and 38% of technology providers say staffing shortages will inhibit growth. AI is expected to be adopted more widely in the next five years, but it will not alleviate the staffing shortage. One approach is to open up recruiting to a more diverse pool of candidates, and most vendors (88%) expect that more diverse staff will be recruited in the next three to five years.

Both technology suppliers and data center operators share the view that AI will be increasingly used to improve facility operations. “AI techniques continue to advance, and the pandemic forced many data center operators to revisit their investments in remote monitoring and related software, driving interest in the technology,” state the survey’s authors, Rhonda Ascierto and Jacqueline Davis of the Uptime Institute. A growing proportion of suppliers (89% in 2021, up from 70% in 2019) agree that AI will be widely used in data centers to improve efficiency and availability within the next five years.

Only about a third believe AI will reduce data center employment relative to workloads within the same timeframe. Just 23% of operators expect that the technology will reduce operational staffing levels within the next five years. “Operators are taking a measured approach toward the realities of deploying AI-driven software in mission-critical functions,” Ascierto and Davis state. “Improvements from the technology and its ability to lower staffing levels through automation are typically the result of an iterative, human-driven process. Developing trust in a system takes time.”

The Uptime analysts predict that nearly 2.3 million full-time employees will be needed globally by 2025 to support the design, build and operation of data center infrastructure, up from an estimated two million in 2020. Demand for workers “is compounded by the expected retirement of many existing workers,” the authors state.

The top challenge for customers is consistent: forecasting data center capacity. “For some operators, this means dealing with runaway demand at their own on-premises data centers, while for others it is anticipating reduced demand as more work moves to third-party infrastructure,” according to Ascierto and Davis. “For most, the challenge is more nuanced than capacity sizing alone. It is about where different workloads should run based on requirements for cost, resiliency, compliance, and other factors.”

Edge computing has also edged its way into technology design and capacity planning. Suppliers’ confidence in the near-term growth of the edge has risen, with 60% agreeing most of their customers will own small edge data centers within five years, up from 48% a year ago. Most, nearly three-quarters, say they are making changes to their products or services in response to the edge opportunity. “Redesigns and changes may be necessary for edge deployments because edge computing is often sited in remote or new locations with minimal to no staff presence,” the Uptime authors state. 

“In the past decade, availability zones have become the de facto way for hyperscale operators to maintain always-on service, but the approach is no longer limited to hyperscalers,” they add. “Enterprises are beginning to deploy private cloud workloads across racks in at least three colocation sites that are sufficiently near each other to ensure low latency, but far enough apart to avoid a localized disruption at one site affecting workloads at another. By replicating and diverting workloads among sites, the need for fault tolerance at a single data center can be reduced, at least in theory.”
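To make that pattern concrete, here is a minimal sketch in Python of how replicating a workload across three nearby sites and diverting it away from a failed one might work. The site names, health flags and scheduler logic are hypothetical illustrations, not anything drawn from the Uptime report; real deployments would rely on orchestration and replication tooling rather than application code.

```python
from dataclasses import dataclass, field

@dataclass
class Site:
    """A single colocation site holding replicas of workloads."""
    name: str
    healthy: bool = True
    workloads: set = field(default_factory=set)

class MultiSiteScheduler:
    """Replicates each workload to every site, serves it from the first
    healthy site, and diverts to another replica if that site fails."""

    def __init__(self, sites):
        self.sites = {s.name: s for s in sites}

    def deploy(self, workload: str):
        # Replicate the workload to all sites (three or more in the pattern above).
        for site in self.sites.values():
            site.workloads.add(workload)

    def serve_from(self, workload: str) -> str:
        # Prefer the first healthy site holding a replica; divert on failure.
        for site in self.sites.values():
            if site.healthy and workload in site.workloads:
                return site.name
        raise RuntimeError(f"no healthy replica for {workload}")

if __name__ == "__main__":
    # Hypothetical nearby colocation sites: low latency, separate failure domains.
    sites = [Site("colo-a"), Site("colo-b"), Site("colo-c")]
    scheduler = MultiSiteScheduler(sites)
    scheduler.deploy("billing-db")

    print(scheduler.serve_from("billing-db"))   # served from colo-a
    sites[0].healthy = False                    # simulate a localized disruption
    print(scheduler.serve_from("billing-db"))   # diverted to colo-b
```

The point of the sketch is the trade-off the authors describe: because every workload has live replicas elsewhere, no single site in the trio needs to be engineered for full fault tolerance on its own.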

There are two reasons for this diversification: increased dependence on IT services, and architectural complexity. “More critical work is running in data centers than ever before, and any big component failures can cascade, making recovery difficult and expensive,” Ascierto and Davis point out.

There is growing awareness about IT’s role in sustainability, the survey also shows. As more large data centers (20 MW or greater) are established in the next five years — something anticipated by suppliers — there will be increased pressure on the sector to improve its environmental footprint. Almost three-quarters of suppliers expect that most facilities will have carbon reduction goals by 2022. Nearly four in five suppliers have established targets of their own to reduce the environmental impact of their products or services. The majority of these suppliers (80%) have redesigned their data center products/services to improve their environmental footprint in the past two years. 


