
Data Center Frontier

Charting the future of data centers and cloud computing.


New Chips, Software Shift Workloads From Cloud to Mobile Devices

By Rich Miller - July 10, 2017


Microsoft's Sam George discusses the company's Azure IoT Edge platform in a presentation at the IoT World conference in Santa Clara. (Photo: Rich Miller)


Our things are getting smarter and more powerful, bringing the computing power of the cloud into devices in our pockets. The trend is enabled by advances in hardware and software, as startups and cloud platforms alike seek to capitalize on the disruptive changes in the technology landscape.

The power of these new chips and devices will help shape America’s evolving IT infrastructure, moving more workloads and tasks to the very edge of the network. The Internet of Things (IoT) and artificial intelligence (AI) are bringing intelligence to mobile devices and industrial equipment, shifting computing power and algorithms to devices like smartphones and tablets, as well as appliances on factory floors and in hospitals.

“You can now put more computational ability at the very edge of the network, essentially making it as if the cloud is in your back pocket,” said Ed Chan, Senior VP for Technology Strategy at Verizon, at a recent conference. “That’s kind of how we envision the way that 5G (next-generation connectivity) is going to change the world.”

The evolution of edge devices and “fog computing” – processing power near the perimeter of the network – will play a role in the geography of the data center industry, helping to deliver capacity to billions of devices and sensors.

This trend is expected to play a leading role in the Internet of Things, but is also emerging as a key strategy in artificial intelligence, providing the ability to run neural networks on smartphones. The capabilities of these devices will ripple beyond the fog layer, impacting the path of data traffic and location of workloads.

Want a Neural Network With that Smartphone?

New technologies like the Internet of Things, artificial intelligence (AI), autonomous vehicles and virtual reality will require data storage and computing power to become highly distributed.

A key element of all these technologies is analytics – the ability to process Big Data and extract value for businesses and consumers. This crunching of big data has historically been performed in the data center. As mobile devices add processing power and algorithms become more efficient, some analytics jobs are shifting to devices on the edge.

“Everything becomes a data center, because it has to,” said Scott Noteboom, founder and CEO of LitBit. “The majority of data center calculations and analytics will take place on the devices themselves.”

This has led to some impressive new hardware capabilities for mobile devices. At the recent O’Reilly Artificial Intelligence conference, startup Aipoly showed off a smartphone app that can run a convolutional neural network, a data-intensive AI process widely used for image recognition.

“Rather than running this in the cloud or large servers, we’re starting to run it on the mobile device,” said Aipoly co-founder Albert Rizzoli. “We wanted a system that could run in real time.”


Aipoly co-founder Albert Rizzoli demonstrates how the company’s app enables visually-impaired persons to use their smartphone camera for object recognition. Rizzoli presented at the O’Reilly AI conference in New York. (Photo: Rich Miller)

Aipoly allows the visually-impaired to use their smartphone as a sensor and guide. Users point their phone, and the Aipoly app recognizes objects and people and provides an audio description. It can recognize friends, identify products on a grocery shelf, and discern colors and shapes.

The Aipoly app requires real-time execution to be effective. Sending images to the cloud and back would take about two seconds, while Aipoly’s on-device technology can deliver results in 250 milliseconds – roughly one-eighth the time. Aipoly has developed a deep learning engine that runs AI algorithms more efficiently.
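The arithmetic behind that speedup is worth making explicit. A minimal sketch, using the round figures quoted above, compares the two latency budgets and what they mean for how often results can refresh:

```python
# Illustrative latency budget, using the article's round numbers:
# a cloud round trip (~2 s) versus on-device inference (~250 ms).

CLOUD_ROUND_TRIP_MS = 2000   # capture -> upload -> cloud inference -> download
ON_DEVICE_MS = 250           # capture -> local inference

def frames_per_second(latency_ms: float) -> float:
    """Upper bound on how often results can refresh at a given latency."""
    return 1000.0 / latency_ms

speedup = CLOUD_ROUND_TRIP_MS / ON_DEVICE_MS
print(f"Speedup: {speedup:.0f}x")                                   # Speedup: 8x
print(f"Cloud:  {frames_per_second(CLOUD_ROUND_TRIP_MS):.1f}/sec")  # 0.5/sec
print(f"Device: {frames_per_second(ON_DEVICE_MS):.1f}/sec")         # 4.0/sec
```

At half a result per second, a cloud round trip cannot keep up with a user sweeping a camera across a room; four results per second starts to feel interactive.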

“Cameras are no longer just for photography,” said Rizzoli. “They’re becoming sensors. AI on low power devices is commoditized. You can essentially have a Pokedex in your hand serving as a digital assistant.”

What New Technology Means to the Data Center

At first glance, the improved processing power of mobile devices could be perceived as a threat to data centers, representing a paradigm shift that frees users from the need to process data in huge server farms. The advancements in device-based processing and “fog computing” – data-crunching that occurs on near-edge appliances and servers – bear close watching, given their potential to create a more distributed computing architecture.

Over the years, pundits have warned that new processing technology could hurt the data center business – most notably Wall Street analyst Jim Cramer, who in 2009 warned his viewers to sell Equinix because new Intel chips “would make the data center obsolete.” Stock in Equinix, which was $77 at the time, has since risen to $422 a share. That may explain why Cramer has experienced a conversion, and now regularly features data center stocks (especially Equinix).

In truth, this is a trend that has been in place for years, in a variety of formats. Chips and memory have been getting smaller and more powerful for decades. Virtualization and containerization have emerged to offer more efficient use of resources. Moore’s Law has not slowed the growth of the data center industry, even through major shifts to mobile and cloud platforms, illustrating Jevons Paradox (efficiency prompts more usage, not less).

Edge or Fog?

Leading hardware vendors and cloud service providers are deploying products to help customers run some AI analytics on devices or on-premises appliances. In many cases, these capabilities provide an initial layer of analysis, reducing the amount of data that must be sent across the network for further analytics.

“The cloud was going to solve all our problems, but now there’s too much data to send,” said Sastry Malladi, Chief Technology Officer of Foghorn Systems, which makes software for small-footprint devices. “How do you run software to ingest and process data on devices? We used to call that embedded computing.”

The language around distributed computing can be a jumble of terms that evolve, intersect and overlap. Here at Data Center Frontier, we use the term edge computing when writing about network distribution. As we’ve noted, “edge” can be defined in a number of ways, spanning everything from regional data centers to micro data centers at telecom towers to the endpoints themselves.

Fog computing is a popular term for those focused on the Internet of Things. The IoT will produce an enormous number of connected devices – between 20 billion and 50 billion, according to various estimates. Some of these devices, like autonomous cars and some factory equipment, will generate huge volumes of data that require real-time action.

The term fog computing has been embraced by Cisco Systems to describe a layer of computing between endpoints and data centers. Cisco has teamed with Intel, Microsoft, Dell, ARM and Princeton University to form the OpenFog Consortium, which seeks to accelerate distributed computing through open architecture and reference frameworks.

In some instances, fog computing is distributing decision-making, helping customers operate in real time with minimal latency.

“The idea of doing everything in the cloud will not work,” said Philippe Fremont, VP of Technical Marketing at Avnet, which makes embedded components for IoT applications. “A lot of our suppliers are moving intelligence to the edge.”

Getting Mythic With Processing in Memory

The shift to a distributed network is driven by innovation in both hardware and software. An example is Mythic, an Austin-based startup making hardware to power AI on devices.

“Neural networks are incredibly powerful, and you’re putting them inside devices with resource constraints,” said Michael Henry, the CEO of Mythic, in a presentation at the O’Reilly AI conference. “When we do these intelligent tasks, we want to do them at the edge. But you can’t put a GPU in a Fitbit.”

How do you put more power in a device without generating heat and draining the battery? Mythic’s solution is to do processing in flash memory. “We don’t have processors,” said Henry. “We use the memory to do the processing. We are able to do matrix math inside a flash memory array. We’re making silicon for inference.”
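To see why matrix math is the operation worth accelerating: inference in a neural network is dominated by matrix-vector multiplies. A pure-Python sketch (the weights and inputs are illustrative, not Mythic’s design):

```python
# Inference in a neural network is dominated by matrix-vector multiplies.
# Mythic's pitch is to perform this operation inside a flash memory array;
# the weights and inputs below are illustrative only.

def layer(weights, inputs):
    """One layer of inference: each output is a weighted sum of the inputs."""
    return [sum(w * x for w, x in zip(row, inputs)) for row in weights]

weights = [[0.2, -0.5, 0.1],   # in Mythic's scheme, these values live in flash cells
           [0.7,  0.3, -0.4]]
inputs = [1.0, 0.5, 2.0]

outputs = layer(weights, inputs)
print([round(v, 2) for v in outputs])  # [0.15, 0.05]
```

Because the weights are fixed after training, storing them in non-volatile memory and computing where they already sit avoids the costly data movement that drains a battery.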

There are several types of AI computing workloads. In training, the network learns a new capability from existing data. In inference, the system applies its capabilities to new data, using its training to identify patterns and perform tasks, usually much more quickly than humans could.
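That split can be shown in a few lines. A toy sketch, assuming a single perceptron learning the logical AND function (nothing here is specific to any vendor’s hardware):

```python
# Minimal sketch of the training/inference split: a perceptron learns
# the logical AND function. Data and hyperparameters are illustrative.

def predict(weights, bias, x):
    """Inference: apply learned parameters to new input."""
    return 1 if sum(w * xi for w, xi in zip(weights, x)) + bias > 0 else 0

def train(samples, epochs=20, lr=0.1):
    """Training: adjust parameters from labeled examples."""
    weights, bias = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, target in samples:
            error = target - predict(weights, bias, x)
            weights = [w + lr * error * xi for w, xi in zip(weights, x)]
            bias += lr * error
    return weights, bias

# Training happens once, typically in the data center...
samples = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
weights, bias = train(samples)

# ...while inference runs repeatedly, potentially on an edge device.
print([predict(weights, bias, x) for x, _ in samples])  # [0, 0, 0, 1]
```

The expensive, iterative loop is `train`; the cheap, repeated call is `predict`, and it is the `predict` side that chips like Mythic’s target.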

“Once you have your algorithm trained, that’s where we come in,” says Henry. “New hardware will let us run all sorts of algorithms at the source of the data.”

Mythic CEO Mike Henry discusses the company’s hardware during a presentation at the O’Reilly AI conference in New York. (Photo: Rich Miller)

ARM Holdings is taking a similar approach, positioning its new chips to power AI processing on these edge devices.

“A cloud-centric approach is not an optimal long-term solution if we want to make the life-changing potential of AI ubiquitous and closer to the user for real-time inference and greater privacy,” writes Nandan Nayampally on the ARM blog. “ARM has a responsibility to rearchitect the compute experience for AI and other human-like compute experiences. To do this, we need to enable faster, more efficient and secure distributed intelligence between computing at the edge of the network and into the cloud.”

Another hardware startup focused on this space is Nebbiolo Technologies, which in February launched a fog computing platform featuring its FogNode hardware device and a fog software suite. The Nebbiolo suite is being used by robotics vendor KUKA to create a cloud-to-fog system to securely manage industrial robots.

The major hardware incumbents are also focusing more power on edge devices. Last year Intel acquired Movidius, a startup developing low-power coprocessors to provide computer vision for drones and virtual reality devices. Intel has marketed the E3900 series of its mobile Atom processor for IoT and fog computing applications. NVIDIA also has a contender in its Jetson platform, which is designed to bring GPU-accelerated parallel processing to mobile devices.

Software Suites Bring Cloud to the Edge

Cloud computing providers are deploying software to support this distributed computing. Microsoft has introduced Azure IoT Edge, a platform enabling cloud intelligence to run on IoT devices.

“At Microsoft we think there’s going to be a balance between the cloud and the IoT,” said Sam George, Director of Azure IoT at Microsoft. “Not all data will be sent to the cloud. With edge you’re taking logic you might have run in the cloud and now you’re deploying that logic down to the edge.”

Azure IoT Edge makes that logic available in Docker-style containers, allowing devices to act locally based on the data they generate, while also taking advantage of the cloud to configure and manage them.
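The pattern such a module implements can be sketched in a few lines. This is a hypothetical illustration of edge-side filtering, not the actual Azure IoT Edge SDK; the device names and threshold are invented:

```python
# Hypothetical edge-module logic (not the Azure IoT Edge SDK): evaluate
# each sensor reading locally and forward only anomalies to the cloud.

TEMP_LIMIT_C = 80.0  # assumed alert threshold

def process_reading(reading):
    """Act locally; return a message only when the cloud needs to know."""
    if reading["temp_c"] > TEMP_LIMIT_C:
        return {"device": reading["device"], "alert": "overheat",
                "temp_c": reading["temp_c"]}
    return None  # handled at the edge, nothing sent upstream

readings = [
    {"device": "press-01", "temp_c": 61.2},
    {"device": "press-02", "temp_c": 84.7},
    {"device": "press-03", "temp_c": 58.9},
]
to_cloud = [m for r in readings if (m := process_reading(r))]
print(to_cloud)  # only press-02's overheat alert leaves the factory floor
```

Packaging logic like this in a container lets the cloud update the threshold or swap the whole module remotely, while the filtering itself runs on the device.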

At the recent IoT World conference, George discussed how IoT Edge has extended the edge to the factory floor for Sandvik Coromant, which makes manufacturing tools and machining solutions. The Azure IoT Edge suite collects and analyzes data from sensors embedded in tools across the shop floor, monitoring every aspect of their performance as well as any bottlenecks in the overall manufacturing supply chain. Sandvik Coromant uses that analysis to recommend optimizations to the manufacturing process and to create a predictive maintenance schedule designed to help avoid unscheduled shutdowns.

By moving the logic from the cloud to the fog, Azure IoT Edge allows Sandvik Coromant to reduce its round-trip response time from 2 seconds to 100 milliseconds.

Amazon Web Services has just rolled out AWS Greengrass, an IoT service that enables developers to create “serverless” code in the cloud using AWS Lambda functions and deploy it to devices for local execution of applications. AWS Greengrass can be programmed to filter device data and only transmit necessary information back to the cloud.
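A sketch of that filtering idea, written in the standard Lambda handler shape (the event payload, field names, and thresholds are illustrative assumptions, not a real Greengrass deployment):

```python
# Greengrass-style filter sketched as a standard Lambda handler.
# The event payload and field names are illustrative assumptions.

def handler(event, context=None):
    """Run locally on the device; forward only out-of-range telemetry."""
    readings = event.get("readings", [])
    anomalies = [r for r in readings
                 if not (r["min_ok"] <= r["value"] <= r["max_ok"])]
    # In a real deployment, the forwarded list (or an explicit publish)
    # would be the only data transmitted back to the cloud.
    return {"forwarded": anomalies,
            "suppressed": len(readings) - len(anomalies)}

event = {"readings": [
    {"sensor": "vibration", "value": 0.4, "min_ok": 0.0, "max_ok": 1.0},
    {"sensor": "current",   "value": 9.8, "min_ok": 2.0, "max_ok": 8.0},
]}
print(handler(event))  # forwards only the out-of-range current reading
```

The same function body can run unchanged in the cloud or on the device, which is the point of the Lambda-based model: the placement of the logic, not the logic itself, is what changes.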


Tagged With: Artificial Intelligence, Edge Computing, Fog


About Rich Miller

I write about the places where the Internet lives, telling the story of data centers and the people who build them. I founded Data Center Knowledge, the data center industry's leading news site. Now I'm exploring the future of cloud computing at Data Center Frontier.

Comments

  1. Matt Trifiro says

    July 11, 2017 at 10:54 am

    The proliferation of increasingly powerful edge devices won’t displace cloud computing. It will actually increase demand for more (and more localized) cloud. More devices, data, and compute power in the field will create new use cases that will drive the re-architecture of cloud as an N-tier architecture, where a lot of the new growth will be at the true edge, one hop from the device. These two trends — more powerful devices and the cloud extending to the edge — will be what enables new classes of applications from autonomous driving to AR-driven telemedicine.

    While somewhat counterintuitive that more and more powerful devices will drive the need for more cloud, it can be partially explained by Jevons Paradox: https://en.wikipedia.org/wiki/Jevons_paradox

Copyright Data Center Frontier LLC © 2022