The Future of AI is Bright

March 8, 2018
In this week’s Voices of the Industry column, Marc Cram, Director of Sales for Server Technology, explores applications for the evolving world of AI, including the variety of software tools designed to find hidden patterns and correlations between elements of large data sets.

Marc Cram, Director of Sales, Server Technology

You may have heard the term “artificial intelligence” mentioned in the press in the past year. It has been a topic of much conversation regarding its potential impact on both the job market and the people who are seeking employment. When you hear the term AI, do you think of HAL from 2001: A Space Odyssey? Or maybe the Cyberdyne Systems Model 101 Series 800 (the Terminator)? Or does something more benign, like Apple’s Siri voice assistant, come to mind?

According to dictionary.com, one definition for AI is “the capacity of a computer to perform operations analogous to learning and decision making in humans, as by an expert system, a program for CAD or CAM, or a program for the perception and recognition of shapes in computer vision systems.”

Originally proposed as an “Imitation Game,” the Turing Test, as described by Alan Turing, is a test of a machine’s ability to exhibit intelligent behavior through conversation equivalent to, or indistinguishable from, that of a human. In the game, a human being and a computer would be interrogated under conditions where the interrogator would not know which was which, the communications being entirely by textual messages. Turing argued that if the interrogator could not distinguish between them by questioning, then it would be unreasonable not to call the computer intelligent, because we judge other people’s intelligence from external observation in just this way.
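
To make that setup concrete, here is a toy sketch in Python of the blind, text-only exchange Turing described. The canned-response “machine” and the interactive “human” below are illustrative stand-ins, not anything from Turing’s paper.

```python
# A toy illustration of the imitation game: the interrogator exchanges text
# only, and the two respondents sit behind anonymous labels chosen at random.
import random

def machine_reply(question: str) -> str:
    # Stand-in "machine": a trivial canned-response bot.
    return "That's an interesting question. Could you rephrase it?"

def human_reply(question: str) -> str:
    # Stand-in "human": answers are typed at the keyboard.
    return input(f"(hidden human) {question}\n> ")

def imitation_game(rounds: int = 3) -> None:
    # Randomly assign the hidden labels so the interrogator cannot tell
    # which channel belongs to the computer.
    respondents = {"A": machine_reply, "B": human_reply}
    if random.random() < 0.5:
        respondents = {"A": human_reply, "B": machine_reply}

    for _ in range(rounds):
        question = input("Interrogator, ask a question: ")
        for label, respond in respondents.items():
            print(f"{label}: {respond(question)}")

    guess = input("Which respondent is the computer, A or B? ").strip().upper()
    if respondents.get(guess) is machine_reply:
        print("Correct: you identified the machine.")
    else:
        print("Wrong: the machine passed this round.")

if __name__ == "__main__":
    imitation_game()
```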

By 2020, 30% of data centers that fail to implement AI and machine learning will cease to be operationally & economically viable. — Gartner, Dec 2017

Commonly used terms for describing various types of artificial intelligence include machine learning (both supervised and unsupervised), expert systems, knowledge-based systems, neural networks, fuzzy logic, genetic algorithms, case-based reasoning, natural-language processing (NLP), and intelligent agents. High-profile applications of AI include Siri, Google Now, Alexa, Cortana, driverless vehicles, and a wide variety of software tools designed to find hidden patterns and correlations between elements of large data sets.
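
As a concrete illustration of that last category, the short Python sketch below uses unsupervised machine learning (k-means clustering, assuming scikit-learn is installed) and a simple correlation matrix to surface structure in a synthetic data set. The data and parameter choices are illustrative only, not drawn from any real workload.

```python
# Finding hidden patterns and correlations in a data set: a minimal sketch.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Synthetic "large data set": two hidden groups of observations,
# each described by four measured features.
group_a = rng.normal(loc=0.0, scale=1.0, size=(500, 4))
group_b = rng.normal(loc=5.0, scale=1.0, size=(500, 4))
data = np.vstack([group_a, group_b])

# Unsupervised machine learning: k-means recovers the hidden grouping
# without ever being shown labels.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(data)
print("Cluster sizes:", np.bincount(labels))

# Correlations between elements (features) of the data set.
print("Feature correlation matrix:")
print(np.round(np.corrcoef(data, rowvar=False), 2))
```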

In the near term, a lot of new infrastructure in the data center is being devoted to enabling AI software applications to run. NVIDIA GPUs are powering many of these installations, but there are numerous other platforms coming to challenge NVIDIA’s supremacy in the field. At the August 2017 Hot Chips conference, Amazon, Baidu, and Microsoft all detailed projects that utilize Field Programmable Gate Arrays (FPGAs) as computational accelerators for machine learning applications. Baidu detailed the XPU, which targets compute-intensive, rule-based workloads. Microsoft’s Project Brainwave, a scalable acceleration platform for deep learning, provides real-time responses for cloud-based AI services. Amazon announced a Xilinx-based FPGA application that is accessible through EC2. For its part, Intel acquired Altera, an FPGA manufacturer that competes with Xilinx, and is touting its FPGAs as being well-suited for AI applications.
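
For a sense of how accelerators are used from the software side, here is a minimal PyTorch sketch that runs a small deep-learning inference workload on a GPU when one is available and falls back to the CPU otherwise. It is a generic GPU example that assumes PyTorch is installed; it is not any of the vendor-specific FPGA toolchains mentioned above, and the model and input sizes are illustrative.

```python
# Minimal accelerator-aware inference sketch with PyTorch.
import torch
import torch.nn as nn

# Use a GPU if one is present; otherwise run on the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# A tiny feed-forward network standing in for a deep-learning inference workload.
model = nn.Sequential(
    nn.Linear(1024, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
).to(device)

batch = torch.randn(64, 1024, device=device)  # illustrative input batch
with torch.no_grad():
    scores = model(batch)

print(f"Ran inference on {device}; output shape: {tuple(scores.shape)}")
```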

In 2016, IDC forecast that by 2020, cognitive systems and artificial intelligence would be adopted across a broad range of industries, driving worldwide revenues from about $8B in 2016 to more than $47B in 2020, for a CAGR of 55.1%. Hearing the siren song of opportunity, numerous venture capital funds have invested in countless AI-related startups. Companies worldwide have begun embedding and deploying AI into almost every kind of enterprise application or process. Google’s DeepMind AI was turned inward to look at the company’s data center operations and reduced the cooling bill by 40%. Google says that customers running in the Google cloud will improve their own energy efficiency as a result.
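
For readers who want to check the growth arithmetic, the snippet below computes the compound annual growth rate implied by IDC’s rounded endpoints (about $8B in 2016 to more than $47B in 2020). It lands near 55.7%, close to the 55.1% IDC reports; the small gap comes from rounding the dollar figures.

```python
# Compound annual growth rate (CAGR) implied by IDC's rounded forecast:
# roughly $8B in 2016 growing to more than $47B in 2020.
start_revenue = 8.0   # $B, 2016 (rounded)
end_revenue = 47.0    # $B, 2020 (rounded)
years = 2020 - 2016

cagr = (end_revenue / start_revenue) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # about 55.7% with these rounded inputs
```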

Future applications for AI may include processing IoT data for Smart Cities, providing first- and second-level technical support for products, research and development of new genetic therapies, supervising the elderly who live at home alone, and lowering the cost of insurance by more accurately diagnosing illnesses and their causes. The future of AI is bright indeed.

Marc Cram is Director of Sales for Server Technology, a brand of Legrand. Contact him at [email protected] or connect with Marc on LinkedIn.

About the Author

Voices of the Industry

Our Voice of the Industry feature showcases guest articles on thought leadership from sponsors of Data Center Frontier. For more information, see our Voices of the Industry description and guidelines.
