How AI Will Inhabit Our Data-Driven Future

July 20, 2018
Artificial intelligence (AI) is being deployed in everything from manufacturing to medicine. Its adoption will likely boost demand for cloud and data center services, as companies provision storage and computing power to analyze growing volumes of data.

We continue with “Data Driven,” a series of articles examining the volume of data generated by emerging technologies. Today, Kayla Matthews examines the impact of artificial intelligence, and how its adoption will shape the world of IT hardware and data centers. 

Artificial intelligence (AI) is no longer a topic that only exists in science-fiction novels and blockbuster movies that make people dream about the future. It’s used in sectors ranging from manufacturing to medicine.

Plus, today’s society is largely data-driven. Personal gadgets like smartphones and fitness trackers collect data in households, while businesses depend on collected data and specialized analytics platforms to make smarter decisions about their operations.

How does AI fit into this already data-rich world, and how will it continue to influence it? As adoption of AI accelerates, it will likely boost demand for cloud computing and data center capacity, as companies create large data troves and provision computing power to analyze that data.

As an example, Amazon Web Services says use of AI tools on its platform has surged 250 percent over the last year. “Machine learning is experiencing a renaissance in the cloud,” said Dr. Matt Wood, GM of Artificial Intelligence at Amazon Web Services, at this week’s AWS Summit in New York. “We can store as much data as we want, and pull down as much compute as we need. Most of the restraints (on AI adoption) have melted away in the cloud.”

Business Adoption Will Bring More Storage of AI Data

A study that polled more than 1,000 IT buyers in North America and Europe about their plans for 2018 found that respondents expected a 30 percent increase in AI adoption during the year. It’s also worth noting that 13 percent of those surveyed said they had already started using AI in their companies.

Those statistics strongly suggest that business users will drive much of the demand for effective ways to store AI data.

Storage technology for AI workloads must meet some minimum requirements: ample capacity that’s always available, error checking in the software array that runs the storage system, automation capabilities, ease of management and extremely fast performance.

All-flash systems offer those benefits, making them well suited to AI storage needs. In random I/O performance, they can be roughly a thousand times faster than spinning-disk arrays.

Storage and Processing Needs Gradually Increase

Another factor to keep in mind is that many AI applications will need more robust storage and processing technologies as time passes. Retrieval cannot become sluggish as new data accumulates; performance must remain consistently fast.

Deep learning, for example, is a subset of AI whose algorithms become more accurate as their training datasets grow. Many existing storage systems were built to hold large quantities of data, not to deliver it quickly to its destinations. AI, however, needs storage that can feed data to the algorithms that consume it; that delivery is what makes intelligent technologies work.
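
To make that point concrete, here is a minimal Python sketch of a storage-to-training pipeline. The function read_record() is a hypothetical stand-in for whatever read the storage layer performs; if it is slow, the algorithms sit idle.

# A minimal sketch of a storage-to-training data pipeline. read_record()
# is a hypothetical stand-in for the storage read that becomes the
# bottleneck when the storage layer cannot keep the algorithms fed.

import random

def read_record(storage, index):
    # In a real system this would be a disk, flash or network read.
    return storage[index]

def batches(storage, batch_size=32):
    # Shuffle record indices, then stream fixed-size batches to the consumer.
    indices = list(range(len(storage)))
    random.shuffle(indices)
    for start in range(0, len(indices), batch_size):
        yield [read_record(storage, i) for i in indices[start:start + batch_size]]

# Toy "dataset" standing in for records held on a storage array.
dataset = [{"features": [i, i * 2], "label": i % 2} for i in range(1000)]

for batch in batches(dataset):
    pass  # each iteration is where a training step would consume the batch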

Deep learning alone drove a 15 percent increase in data computing needs from 2015 to 2017. As people use AI more and increasingly realize its potential, they’ll demand storage and processing solutions that can handle the substantial amounts of data AI generates. Content must be accessible right away for AI technologies to work as they should.

Details of Current and Future Data-Intensive AI Technologies

Projections indicate that the AI market will reach $36.8 billion by 2025. Analysts say its value and range of use cases will only grow as developers figure out how to make the associated technologies ever more capable.

Speaking of people, AI will likely be one of the primary technologies that help them get where they need to go, specifically inside autonomous cars.

Drive.ai is a company that launched a pilot program in Frisco, Texas, to transport residents in self-driving cars. There will initially be a safety driver behind the wheel, but not forever. The company’s autonomous vehicle app uses deep learning neural networks: AI algorithms that learn by passing data through layers of connected nodes to find patterns.
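
For readers curious what a neural network “finding patterns in data” looks like at the smallest possible scale, the sketch below trains a tiny network on the XOR pattern in plain NumPy. It is illustrative only and bears no relation to the scale or design of Drive.ai’s production models.

# A tiny neural network learning XOR, written in plain NumPy. The
# "connected networks" described above are these weight matrices;
# training nudges them until the pattern in the data emerges.

import numpy as np

rng = np.random.default_rng(0)

# XOR: a classic pattern a single linear layer cannot learn.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of weights and biases connecting input to output.
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(10000):
    hidden = sigmoid(X @ W1 + b1)                      # forward pass
    output = sigmoid(hidden @ W2 + b2)
    grad_out = (output - y) * output * (1 - output)    # backpropagation
    grad_hid = grad_out @ W2.T * hidden * (1 - hidden)
    W2 -= 0.5 * hidden.T @ grad_out
    b2 -= 0.5 * grad_out.sum(axis=0)
    W1 -= 0.5 * X.T @ grad_hid
    b1 -= 0.5 * grad_hid.sum(axis=0)

print(output.round(2).ravel())  # approaches [0, 1, 1, 0] as the pattern is learned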

In Drive.ai’s case, those algorithms allow the cars to make decisions based on what they see in real-world environments. Supporters think this kind of technology is ideal for the job because it learns in ways similar to how humans do. However, it’s still too early to say whether Drive.ai will achieve the success it needs in the marketplace.

Regardless of Drive.ai’s future, autonomous driving will undoubtedly be one of the forces that spur the search for feasible AI data storage and retrieval options. Intel estimates that a single autonomous car will generate and take in approximately 40 terabytes of data over eight hours of driving.
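
Taking Intel’s figure at face value, a quick back-of-the-envelope calculation shows the sustained data rate that volume implies:

# Back-of-the-envelope check on Intel's estimate: 40 terabytes over an
# eight-hour driving day implies a sustained rate of roughly 1.4 GB/s.

tb_per_day = 40
hours = 8
gb_per_second = tb_per_day * 1000 / (hours * 3600)
print(f"{gb_per_second:.2f} GB/s sustained")  # ~1.39 GB/s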

The smart home technology market will also play a notable role in contributing to AI data. Researchers project that segment to reach $150.6 billion globally by 2023 and believe Europe will see the fastest growth over the period, due in part to a U.K. mandate that energy suppliers offer smart meters to every home by 2020.

AI and Smart Home Technology

The smart home market encompasses countless other gadgets, too. Smart speakers answer people’s questions, control compatible smart lights and thermostats, and help users buy things online just by speaking.

AI home security cameras recognize the faces of familiar people and work with apps that let users check on their homes from wherever they are. Many also allow reviewing stored footage, such as when a person wants to see everything that happened at home during a week away on vacation.
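
As a rough sketch of the per-frame detection step such cameras perform, the snippet below runs OpenCV’s bundled face detector over a single image. The filename is hypothetical, and recognizing whose face it is would require a separate recognition model layered on top.

# A sketch of the detection step behind an AI security camera, using
# OpenCV's bundled Haar-cascade face detector. The image file is
# hypothetical; identifying *whose* face appears needs a further model.

import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

frame = cv2.imread("doorbell_frame.jpg")  # hypothetical camera frame
if frame is None:
    raise SystemExit("No frame found; point this at a real image to try it")

gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
print(f"{len(faces)} face(s) detected in this frame")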

Also, an Israel-based AI research company called Cortica entered a partnership with Best Group, a company in India. Cortica hopes to use AI algorithms to analyze hundreds of terabytes’ worth of data from CCTV cameras in public places like shopping malls, sports stadiums and city centers.

A demonstration of AI facial recognition technology from Cortica, an Israeli security company. (Photo: Cortica)

Security technology that recognizes faces and license plates is already in use, but Cortica’s goes beyond those capabilities, attempting to spot the behavioral anomalies that can precede illegal activity.
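
Cortica’s actual methods are proprietary, but behavioral anomaly detection as a general class of technique can be sketched with off-the-shelf tools. The example below invents simple motion features and uses scikit-learn’s IsolationForest to flag movement that deviates from a learned baseline:

# Behavioral anomaly detection in the general sense described above,
# sketched with scikit-learn's IsolationForest on invented motion
# features. This only illustrates the class of technique, not Cortica's.

import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Hypothetical per-person features from video: [walking speed in m/s,
# direction changes per minute] for ordinary foot traffic.
normal_traffic = rng.normal(loc=[1.4, 2.0], scale=[0.3, 0.8], size=(500, 2))
model = IsolationForest(contamination=0.01, random_state=0).fit(normal_traffic)

# An ordinary walker versus a sudden erratic sprint through the crowd.
observations = np.array([[1.5, 2.1], [6.0, 14.0]])
print(model.predict(observations))  # 1 = normal, -1 = flagged as anomalous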

When building its algorithms, Cortica’s developers studied a segment of a rat’s brain kept alive outside the body, connecting electrodes so they could observe how the cortex responded to certain stimuli.

Because of that approach, the developers say that if the technology makes a mistake, they can trace what happened back to individual files or processes, allowing for targeted retraining.

Cortica’s technology is still in its early stages. But because the system retains all that data and lets people search back through it, it’s easy to envision the massive AI data needs Cortica alone would have if its technology reaches mainstream adoption.

AI’s Progress Depends on Highly Capable Data Solutions

Broadly speaking, AI applications must first be trained with tremendous amounts of data before they can function. Many then keep learning through use, such as when household members set a smart thermostat to the same temperature week after week.
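
A minimal sketch of that kind of learning through use, assuming a toy model that simply averages past setpoints rather than anything a real thermostat ships, might look like this:

# A toy "learning through use" model: remember the temperatures a
# household sets at each weekday and hour, and use the average as the
# learned schedule. Real products use richer models; the idea is the same.

from collections import defaultdict

history = defaultdict(list)  # (weekday, hour) -> setpoints observed so far

def record_setting(weekday, hour, temperature):
    history[(weekday, hour)].append(temperature)

def learned_setpoint(weekday, hour, default=20.0):
    readings = history[(weekday, hour)]
    return sum(readings) / len(readings) if readings else default

# The household sets 21 degrees C every Monday at 7 a.m. for three weeks...
for _ in range(3):
    record_setting("Mon", 7, 21.0)

# ...and the thermostat can now apply that setting on its own.
print(learned_setpoint("Mon", 7))  # 21.0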

At almost every stage of those processes, AI stores data and needs to retrieve it quickly. By contrast, some traditional storage systems offer adequate space but fall short on speedy retrieval.

An eight-rack pod of Google’s liquid-cooled TPU version 3 servers for artificial intelligence workloads. (Image: Google)

In addition to faster storage, AI requires more powerful hardware to process data, train algorithms and make real-time decisions in applications like autonomous driving. New hardware for AI workloads packs more computing power into each piece of equipment, boosting the power density (the amount of electricity used by servers and storage in a rack or cabinet) and the accompanying heat. The trend is challenging traditional practices in data center cooling and prompting data center operators to adopt new strategies and designs.
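
To see why rising power density strains cooling, consider a back-of-the-envelope comparison using assumed figures: eight AI servers drawing 3 kW each in one rack, against a traditional rack averaging about 5 kW in total.

# Back-of-the-envelope rack power density, using assumed figures:
# eight AI servers at 3 kW each versus a traditional rack near 5 kW total.

servers_per_rack = 8
kw_per_server = 3.0                # assumed draw for a GPU/accelerator server
ai_rack_kw = servers_per_rack * kw_per_server

traditional_rack_kw = 5.0          # rough figure for a conventional rack
print(f"AI rack: {ai_rack_kw:.0f} kW vs. traditional: {traditional_rack_kw:.0f} kW, "
      f"about {ai_rack_kw / traditional_rack_kw:.0f}x the heat to remove")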

In one example of this trend, Google is using liquid cooling for its latest artificial intelligence hardware, as the heat generated by its new Tensor Processing Units (TPUs) has exceeded the limits of its previous data center cooling solutions.

AI presents new challenges, as well as opportunities, for tech experts to figure out how to handle the associated data in ways that facilitate the technology instead of hindering it.

About the Author

Kayla Matthews

Kayla Matthews is a tech journalist and blogger, whose work has appeared on websites such as VentureBeat, MakeUseOf, VICE’s Motherboard, Gear Diary, Inc.com, The Huffington Post, CloudTweaks, and others. Drawing from her interests in technology and its applications to daily life, Matthews writes about the intersection of technology and productivity.
