Hugging Face Gets Hyperscalers and AI Vendors to Work (and Invest) Together
When we think of the leading cloud hyperscalers, Google, AWS, and Microsoft, we most often paint them as giants locked in a fierce battle for market domination, unwilling to take any action that does not directly benefit their own efforts.
If only because it puts a dent in that image, Hugging Face is worth a look as a place to start your AI exploration and investment.
What is Hugging Face?
Hugging Face provides tools for building machine learning applications. It is a repository of freely accessible, open-source AI models and data sets.
The company makes its money by offering paid features such as direct access to computing resources and customer support for developing natural language processing (NLP) models and large language models (LLMs). Currently, there are over 300,000 AI models, 100,000 applications, and 50,000 data sets available to its customers and community.
Uniquely, those models include contributions from hyperscalers and other companies that are in direct competition with each other, such as Google, Microsoft, and AWS.
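You can browse those contributions programmatically with the huggingface_hub client library. Here is a minimal sketch; the author filter "google" and the sort and limit values are only illustrative choices:

```python
from huggingface_hub import list_models

# List a few of the most-downloaded models published under the "google"
# account on the Hub ("google" is just an illustrative author filter).
for model in list_models(author="google", sort="downloads", limit=5):
    print(model.id)
```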
Still in the start-up stage, Hugging Face had raised over $160 million in five rounds of funding from a variety of venture capital funds and angel investors to get their community-based (much like GitHub) AI development model off the ground.
That investment more than doubled in their most recent round of funding, completed last month. This latest round raised $235 million, primarily from an interesting group of technology companies, including Amazon, AMD, Google, IBM, Intel, Nvidia, and Salesforce.
Browse the models open-sourced on the Hugging Face site and you will see that all of these companies support Hugging Face's efforts; some of these competitors have contributed hundreds of open-source models and data sets to the community.
Why use Hugging Face?
The platform allows users and developers to upload and share their models and projects.
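Uploading a model or project is done through the same huggingface_hub library. The sketch below assumes you have already authenticated, and the repository name and local folder are placeholders, not real artifacts:

```python
from huggingface_hub import HfApi

api = HfApi()  # assumes prior authentication, e.g. via `huggingface-cli login`

# "your-username/my-demo-model" and ./model are placeholders for your own
# repository name and local directory of model files.
api.create_repo("your-username/my-demo-model", exist_ok=True)
api.upload_folder(
    folder_path="./model",                  # local directory with weights/config
    repo_id="your-username/my-demo-model",
)
```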
Hugging Face also maintains a collection of software tools it calls libraries, which users can draw on to accelerate their work: evaluating model performance, preparing and cleaning data sets, and building on the open-source code provided on the site.
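As a rough illustration of how those libraries fit together, the sketch below scores a pretrained sentiment model against a small slice of a public data set. The specific model and data set are examples only, not recommendations:

```python
from datasets import load_dataset    # Hugging Face `datasets` library
from transformers import pipeline    # Hugging Face `transformers` library
import evaluate                      # Hugging Face `evaluate` library

# Load a small slice of the IMDB reviews data set and a default sentiment model
data = load_dataset("imdb", split="test[:100]")
classifier = pipeline("sentiment-analysis")

# Map the model's POSITIVE/NEGATIVE labels onto the data set's 1/0 labels
predictions = [
    1 if classifier(text[:512])[0]["label"] == "POSITIVE" else 0
    for text in data["text"]
]

accuracy = evaluate.load("accuracy")
print(accuracy.compute(predictions=predictions, references=data["label"]))
```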
Users can take advantage of the commercial side of Hugging Face by paying for compute and storage within the Hugging Face platform, or they can keep using their existing cloud services through the integrations with Azure, Google Cloud, and AWS.
There is also an educational side to Hugging Face via their Classrooms app, which offers free resources, teaching materials and support for teachers and students.
This capability does a good job of highlighting the collaborative aspects of the site. Students and teachers can work together on the models, datasets, and demos that are hosted within the shared classroom space.
Classrooms appears to be an extension of Spaces, the feature that allows users to create and deploy ML demos quickly.
Spaces can be used to demonstrate, showcase, and share projects for anyone from a conference audience to involved stakeholders who wish to collaborate on the project. By default, Spaces are private, so you are not automatically sharing your work with the world.
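Spaces support several app frameworks. As a minimal sketch, assuming a Gradio-based Space, the app.py below wraps a summarization pipeline in a simple web demo; the model is the library's default, used here only for illustration:

```python
# app.py for a minimal Gradio-based Space
import gradio as gr
from transformers import pipeline

summarizer = pipeline("summarization")  # downloads a default model from the Hub

def summarize(text: str) -> str:
    # Truncate long inputs so the demo stays responsive
    return summarizer(text[:1000])[0]["summary_text"]

demo = gr.Interface(fn=summarize, inputs="text", outputs="text",
                    title="Summarization demo")

if __name__ == "__main__":
    demo.launch()
```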
Given that Hugging Face can be accessed at no cost, it is a straightforward way for your developers to evaluate ML models and build demos for LLMs and other AI-focused projects.
If your business has already embraced the open-source development model, it is unlikely you will find a simpler way to start integrating AI development into your projects.