Microsoft has developed a new class of small language models (SLMs) called Phi-3, which offer many of the capabilities of large language models (LLMs) but are smaller and trained on less data. The first model in the family, Phi-3-mini, measures 3.8 billion parameters and outperforms models twice its size on benchmarks that evaluate language, coding and math capabilities.
LLMs typically require significant computing resources to operate. SLMs, by contrast, are not designed for the in-depth knowledge retrieval LLMs provide, but they are well suited to simpler tasks and can run offline. That makes them useful for organizations building applications that run locally on a device rather than in the cloud.
Microsoft is making Phi-3-mini available in the Microsoft Azure AI Model Catalog. It will also be available on Hugging Face, on Ollama and as an NVIDIA NIM microservice. Additional models in the Phi-3 family, such as Phi-3-small and Phi-3-medium, will be released soon, according to a company blog post.