Hugging Face Shares $10 Million Worth of Computing Power to Help Beat the Big AI Companies

Hugging Face, one of the biggest names in machine learning, is committing $10 million in free shared GPUs to help developers build new AI technologies. The aim is to help small developers, academics, and startups counteract the centralization of AI advances.

“We are fortunate to be able to invest in the community,” Clem Delangue, CEO of Hugging Face, told The Verge. Delangue said the investment was possible because Hugging Face is “profitable, or close to profitable,” and recently raised $235 million in funding, valuing the company at $4.5 billion.

Delangue is concerned about the ability of AI startups to compete with the tech giants. The most significant advances in artificial intelligence — like GPT-4, the algorithms behind Google Search, and Tesla’s Full Self-Driving system — remain hidden inside big tech companies. Not only are these companies financially incentivized to keep their models proprietary, but with billions of dollars at their disposal for computing resources, they can compound those advantages and stay ahead of the competition, making it nearly impossible for startups to keep up.

“If you end up with a few organizations that dominate too much, it will be harder to fight against that later.”

Hugging Face aims to make cutting-edge AI technologies accessible to everyone, not just the tech giants. I spoke with Delangue during Google I/O, the tech giant’s flagship conference, where Google executives unveiled numerous AI features for its proprietary products and even a family of open source models called Gemma. For Delangue, the proprietary approach is not the future he envisions.

“If you go the open source route, you get to a world where most companies, most organizations, most nonprofits, policymakers, and regulators can actually do AI as well. It’s a much more decentralized path without too great a concentration of power, which I think is a better world,” Delangue said.

How it works

Access to computing power poses a significant challenge when building large language models, often favoring companies like OpenAI and Anthropic that contract with cloud providers for substantial computing resources. Hugging Face wants to level the playing field by donating shared GPUs to the community through a new program called ZeroGPU.

The shared GPUs can be accessed by multiple users or applications simultaneously, eliminating the need for each user or application to have a dedicated GPU. ZeroGPU will be available via Hugging Face’s Spaces, a platform for publishing and hosting apps, where over 300,000 AI demos have been created to date on CPUs or paid GPUs, according to the company.

“It is very difficult to get enough GPUs from the major cloud providers”

Access to the shared GPUs is determined by usage: if some GPU capacity is not being actively used, it becomes available for someone else. This makes the GPUs cost-effective, energy-efficient, and well suited to widespread use. ZeroGPU runs on Nvidia A100 GPUs, which offer about half the processing speed of the popular and more expensive H100.
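The usage-based sharing described above can be sketched as a toy scheduler: a GPU is held only while a task actually runs, and freed capacity goes straight to the next waiting user. This is purely illustrative — ZeroGPU’s actual implementation is not described in detail here, and all names below are invented:

```python
from collections import deque


class SharedGPUPool:
    """Toy model of usage-based GPU sharing (not Hugging Face's code):
    a GPU is held only for the duration of a task, then returned."""

    def __init__(self, num_gpus):
        self.free = deque(range(num_gpus))  # idle GPU ids
        self.waiting = deque()              # tasks queued for a GPU

    def submit(self, task_name):
        # Run immediately if a GPU is idle; otherwise queue the task.
        if self.free:
            return f"{task_name} running on GPU {self.free.popleft()}"
        self.waiting.append(task_name)
        return f"{task_name} queued"

    def release(self, gpu_id):
        # Freed capacity is handed directly to the next waiting task.
        if self.waiting:
            return f"{self.waiting.popleft()} running on GPU {gpu_id}"
        self.free.append(gpu_id)
        return f"GPU {gpu_id} idle"


pool = SharedGPUPool(num_gpus=1)
print(pool.submit("demo_a"))  # demo_a running on GPU 0
print(pool.submit("demo_b"))  # demo_b queued
print(pool.release(0))        # demo_b running on GPU 0
```

The point of the sketch is the contrast with long-term reservations: no task pays for the GPU while it sits idle, so one physical device can serve many intermittent demos.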

“It’s very difficult to get enough GPUs from the major cloud providers, and the way to get them – which is a high barrier to entry – is to commit to very large quantities over long periods of time,” Delangue said.

Typically, a company would turn to a cloud provider such as Amazon Web Services to secure GPU resources for one or more years. This arrangement disadvantages small businesses, independent developers, and academics who build at a small scale and cannot predict whether their projects will take off. Regardless of usage, they still have to pay for the GPUs.

“Also, knowing how many GPUs and what budget you need is a nightmare,” Delangue said.

Open source AI is catching up

As AI advances rapidly behind closed doors, Hugging Face’s goal is to empower people to develop more AI technology in the open.

“If you end up with a few organizations that dominate too much, it will be harder to fight against that later,” Delangue said.

Andrew Reed, a machine learning engineer at Hugging Face, has even built an app that visualizes the progress of proprietary and open source LLMs over time using LMSYS Chatbot Arena results, showing that the gap between the two is steadily narrowing.

According to the company, over 35,000 variations of Meta’s open source AI model Llama have been shared on Hugging Face since Meta’s first release a year ago, ranging from “quantized and merged models to specialized models in biology and Mandarin.”

“AI should not be in the hands of a few. With this commitment to open source developers, we’re excited to see what everyone comes up with next in the spirit of collaboration and transparency,” Delangue said in a press release.
