Hugging Face is growing fast, with users creating new AI repositories every 10 seconds
The AI platform Hugging Face has hit a major milestone, surpassing one million freely available AI models. Co-founder Clément Delangue sees this as a sign that specialized AI solutions are becoming more prevalent.
Hugging Face’s one million public models include well-known examples like Llama, Gemma, Phi, Flux, Mistral and Stable Diffusion, as well as “999,984 others,” Delangue said.
He believes this variety shows that specialized models, optimized for specific use cases, domains, languages, and hardware, often outperform a single, all-purpose model.
Founded in 2016 as a chatbot company, Hugging Face has evolved into a leading platform for machine learning and AI. It’s best known for its open-source Transformers library, which offers pre-trained models for natural language processing tasks.
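As a rough illustration of what the Transformers library offers, the sketch below loads one of the Hub’s public pre-trained checkpoints for a sentiment-analysis task. The specific model ID is just one example checkpoint; any compatible model from the platform’s catalog could be swapped in.

```python
# Minimal sketch: using the Transformers library to load a pre-trained
# model from the Hugging Face Hub. The model ID below is an example
# public checkpoint, not the only option.
from transformers import pipeline

# Weights are downloaded from the Hub on first use and cached locally.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

result = classifier("Hugging Face now hosts over one million public models.")
print(result)  # a list with one dict containing a label and a score
```

The same `pipeline` interface covers other natural language processing tasks, such as translation or summarization, by changing the task name and model.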
A new repository every 10 seconds
Delangue says Hugging Face also hosts nearly as many private models, accessible only to individual organizations. These allow companies to develop AI systems tailored to their specific needs.
A new repository (model, dataset, or space) is created on Hugging Face every 10 seconds. Delangue predicts that eventually there will be as many AI models as there are code repositories.