Microsoft unveils Phi-3 family of compact language models

Microsoft has announced the Phi-3 family of open small language models (SLMs), touting them as the most capable and cost-effective models of their size available. According to Microsoft, the training approach developed by its researchers allows the Phi-3 models to outperform larger models on language, coding, and math benchmarks.

"What we're going to start to see is not a shift from large to small, but a shift from a singular category of models to a portfolio of models where customers get...

Stability AI unveils 12B parameter Stable LM 2 model and updated 1.6B variant

Stability AI has introduced the latest additions to its Stable LM 2 language model series: a 12 billion parameter base model and an instruction-tuned variant. These models were trained on an impressive two trillion tokens across seven languages: English, Spanish, German, Italian, French, Portuguese, and Dutch.

The 12 billion parameter model aims to strike a balance between strong performance, efficiency, memory requirements, and speed. It follows the established framework of...
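To put the memory side of that trade-off in perspective, a 12-billion-parameter model needs roughly 24 GB just to hold its weights in 16-bit precision, which is why precision and quantization choices matter at this size. The back-of-the-envelope arithmetic below is illustrative only, not an official Stability AI figure:

```python
# Rough weight-memory estimate for a 12B-parameter model at different precisions.
# Illustrative only; real usage adds activations, KV cache, and framework overhead.

def weight_memory_gb(num_params: float, bytes_per_param: float) -> float:
    """Gigabytes needed to store the raw weights alone."""
    return num_params * bytes_per_param / 1e9

params_12b = 12e9
for label, bytes_per_param in [("fp32", 4), ("fp16/bf16", 2), ("int8", 1), ("4-bit", 0.5)]:
    print(f"{label:>9}: ~{weight_memory_gb(params_12b, bytes_per_param):.0f} GB")

# Approximate output:
#      fp32: ~48 GB
# fp16/bf16: ~24 GB
#      int8: ~12 GB
#     4-bit: ~6 GB
```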