Hugging Face Gets Hyperscalers and AI Vendors to Work (and Invest) Together

Sept. 19, 2023
Hugging Face, which defines itself as an AI community building the future, has significant support from cloud hyperscalers and the AI hardware community.

When we think of the leading cloud hyperscalers, Google, AWS, and Microsoft, we most often paint them as giants locked in a fierce battle for market domination, unwilling to take any action that does not directly benefit their own efforts.

If only for putting a dent in that image, Hugging Face is worth a look as the place to start your AI exploration and investment.

What is Hugging Face?

Hugging Face provides tools for building machine learning applications. It is a repository of freely accessible open source AI models and data sets.

The company makes its money by offering features such as direct access to computing resources and customer support for the development of natural language processing (NLP) applications and large language models (LLMs). Currently, there are over 300,000 AI models, 100,000 applications, and 50,000 data sets available to its customers and community.

Uniquely, those models include contributions from hyperscalers and others who are in direct competition with each other, such as Google, Microsoft, and AWS.

Still in the start-up stage, Hugging Face had raised over $160 million across five rounds of funding from a variety of venture capital funds and angel investors to get its community-based AI development model (much like GitHub's) off the ground.

That investment more than doubled in their most recent round of funding, completed last month. This latest round raised $235 million, primarily from an interesting group of technology companies, including Amazon, AMD, Google, IBM, Intel, Nvidia, and Salesforce.

When you look at the models open-sourced on the Hugging Face site, you will see that all of these companies have supported Hugging Face's efforts, with some of these competitors contributing hundreds of open-source models and data sets to the community.

Why use Hugging Face?

The platform allows users and developers to upload and share their models and projects.

Hugging Face has a collection of software tools that they call libraries, which users can utilize to accelerate the development of their work by evaluating model performance, cleaning up selected data sets, and taking advantage of the open source code that is provided on the site.
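As a rough sketch of how a developer might start exploring those resources programmatically, the snippet below uses the `huggingface_hub` client library to search the Hub; the task filter and sort order shown are illustrative choices for this example, not recommendations.

```python
# A minimal sketch of browsing the Hugging Face Hub with the huggingface_hub
# client library. The "text-classification" filter and download-count sort
# are illustrative assumptions, not an endorsement of any particular task.
from huggingface_hub import HfApi

api = HfApi()

# Fetch the five most-downloaded text-classification models on the Hub
models = list(api.list_models(filter="text-classification",
                              sort="downloads",
                              limit=5))

for m in models:
    print(m.id)
```

The same `HfApi` client exposes analogous calls for browsing data sets, which is typically the next step before pulling anything into a training or evaluation workflow.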

Users can avail themselves of the commercial side of Hugging Face by paying for access to compute and storage locally (within the Hugging Face platform), or continue using their existing cloud services by making use of the integration with Azure, Google Cloud, and AWS.

There is also an educational side to Hugging Face via their Classrooms app, which offers free resources, teaching materials and support for teachers and students.

This capability does a good job of highlighting the collaborative aspects of the site. Students and teachers can work together on the models, datasets, and demos that are hosted within the shared classroom space.

Classrooms appears to be an extension of Spaces, the feature that allows users to create and deploy ML demos quickly.

Spaces can be used to demonstrate, showcase, and share projects for anyone from a conference audience to involved stakeholders who wish to collaborate on the project. By default, Spaces are private, so you are not automatically sharing your work with the world.

Given that Hugging Face can be accessed at no cost, it is a pretty straightforward way for your developers to evaluate ML projects and build demos for LLM and other AI-focused projects.

If your business has already embraced the open-source development model, it is unlikely you will find a simpler way to start integrating AI development into your projects.


About the Author

David Chernicoff

David Chernicoff is an experienced technologist and editorial content creator who sees the connections between technology and business, figures out how to get the most from both, and explains the needs of business to IT and of IT to business.


