Five years ago, artificial intelligence (AI) implementation was relatively rare. But today, the business world is awash with machine learning and AI experimentation. Data has become a vital part of almost every business operation, and everyone is looking for ways to harvest data for insight and business results. At the heart of this transformation are data centers. New infrastructure is being devoted to AI projects, and the surge of data is demanding more intelligent operations and management. As we look at the rise of AI in the data center, here are some defining trends:
1) The Rise of Customized AI Chips
AI demands enormous computational power, and scaling general-purpose chips to meet it would be prohibitively expensive and inefficient. AI chips are specialized silicon designed to perform complex mathematical and computational tasks more efficiently. With most AI use cases today being very narrow, AI chips can be tailored to a specific task such as pattern recognition, natural language processing, network security, robotics, or automation.
As AI continues to mature, capabilities will expand and the cost of implementation will go down. AI will take on more use cases and be embedded in more devices. This trend will advance even further as RISC-V and other open-source technologies lower the barriers to purpose-built "building blocks" that can focus on efficiency, performance, and scalability like never before.
2) The Move Towards “Auto” Everything
For IT teams, the quest for greater efficiency is never-ending. To stay afloat amid the explosion of data and the complexity of diverse workloads, automation is essential for success.
On one hand, automation relieves pressure on IT staff and frees their time for more important projects. But automation is also key to helping AI take on more functions in the data center by removing tasks that rely on close human interaction. In the words of our Big Data Analytics Platform senior director: "Touch it twice? Automate it."
Automation is what will help data centers make the journey to AI and move from being reactive to preventative, and ultimately, predictive.
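As a toy illustration of the "touch it twice, automate it" heuristic, here is a minimal sketch (all names are hypothetical, not part of any actual Western Digital tooling) of a tracker that flags any manual task performed more than once as an automation candidate:

```python
from collections import Counter

class TaskTracker:
    """Flags manual tasks as automation candidates once they recur.

    A toy sketch of the "touch it twice, automate it" heuristic:
    any task performed at least `threshold` times is worth automating.
    """

    def __init__(self, threshold=2):
        self.threshold = threshold
        self.counts = Counter()

    def record(self, task_name):
        """Record one manual execution of a task."""
        self.counts[task_name] += 1

    def automation_candidates(self):
        """Return tasks touched at least `threshold` times, sorted by name."""
        return sorted(t for t, n in self.counts.items() if n >= self.threshold)

tracker = TaskTracker()
for task in ["rotate-logs", "reset-switch-port", "rotate-logs", "patch-host"]:
    tracker.record(task)

print(tracker.automation_candidates())  # only the task touched twice
```

A real pipeline would pull task events from a ticketing or runbook system rather than a hard-coded list, but the decision rule is the same.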
3) We All Need Standards
More and more devices are going to see embedded intelligence. And while we often view the flow of data as a linear path from the endpoint device to the edge and on to the core/cloud, the reality is that we are moving toward an era of intelligence in everything. Devices operate alongside other devices in an ecosystem, and they need to be able to "speak" to one another. An easy example is autonomous vehicles, which will need a common "language" to communicate regardless of the car manufacturer, and it goes beyond the vehicles themselves: the safety of autonomous driving depends on an ecosystem of smart traffic signals, roadside units, pedestrian alerts, and more. Standardization and interoperability are key, and they will make an AI/ML ecosystem easier to integrate and deploy at the edge.
4) The Data Scientist Turns Virtual
There are simply not enough data scientists in the world to support the growth of machine learning workloads. It may sound paradoxical, but AI can help to manage AI.
By expanding existing tools and building a self-service platform, AI technology can be made accessible to more people in the business. Whether it's software engineers, subject matter experts, or even doctors, given a few homegrown skills and support, more stakeholders should be able to generate predictive, AI-based analysis. To some extent, anyone in an organization should be able to fulfill the baseline role of the data scientist.
At Western Digital, we’ve built the Big Data Analytics Platform as a key enabling platform that can host a multitude of data and analytics environments. The open platform approach enables data scientists to create repeatable and scalable solutions, and more stakeholders can take advantage of its self-service architecture.
5) AI Permeates Data Center Operations
As data grows and applications become more complex and diverse, the data center is desperate for efficiency improvements. Some go as far as to say that without AI, many data centers will not be economically or operationally viable. AI tools will assist by improving resource and service provisioning, optimizing cooling and power, and detecting more cyber threats. As with most things AI, the goal is to find the optimal workflow with humans, automating intelligence where needed, and driving business strategy through IT adeptness. The most successful data centers will strategically deploy and pair human and AI capabilities across most operations, and deploy smarter, highly efficient, and flexible infrastructure.
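To make the shift from reactive to predictive concrete, here is a minimal, hypothetical sketch (a naive stand-in, not any vendor's actual tooling) of the kind of anomaly detection an AI-driven monitoring tool might run over cooling-sensor data, using a simple z-score check:

```python
import statistics

def flag_anomalies(readings, threshold=2.0):
    """Return indices of readings more than `threshold` population
    standard deviations from the mean -- a toy stand-in for the
    anomaly detection a data center monitoring tool might apply
    to temperature or power telemetry."""
    mean = statistics.mean(readings)
    stdev = statistics.pstdev(readings)
    if stdev == 0:
        return []  # all readings identical; nothing stands out
    return [i for i, r in enumerate(readings)
            if abs(r - mean) / stdev > threshold]

# Inlet temperatures (degrees C) from a hypothetical rack sensor; the
# spike at index 5 should be flagged before the hardware throttles.
temps = [22.1, 22.3, 22.0, 22.2, 22.1, 31.5, 22.2, 22.0]
print(flag_anomalies(temps))
```

Production systems would use far richer models (seasonality, multivariate sensors, learned baselines), but the principle is the same: surface the outlier before it becomes an outage.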
AI in the Data Center
AI in the data center is not a fantasy of human-like robots. It's a viable technology penetrating every market and every vertical, improving processes, unearthing insights, and powering many apps and features we use today. For us, too, AI has become business-critical. It plays a vital role in global manufacturing processes across 17 factories. These manufacturing applications target diverse areas such as a Digital Twin to simulate fabrication processes, image analysis for rapid fault detection, predictive maintenance to increase equipment uptime, and predictive adaptive test to accelerate test time, alongside many other applications.
AI is a business discipline. It often requires experimentation beyond your comfort zone, but it is a cornerstone technology that should be part of futureproofing your business strategy and your data center.