TECH & AI
PHI-4 STATISTICS
5.6B
Parameters in the Phi-4-multimodal model, fewer than in most competing multimodal systems
6.14%
Word error rate on the Hugging Face OpenASR leaderboard, setting a new record for the benchmark
128,000
Maximum token sequence length supported by the Phi-4-mini model, allowing it to process lengthy documents
Their compact size enables deployment on edge devices, private on-premises servers or the cloud – wherever latency, privacy or compliance demands it.
SLMs empower organisations to choose the best-fit location for every AI workload, freeing them from the bandwidth and privacy limits that can compromise cloud-only approaches.
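To make that deployment flexibility concrete, the following is a minimal sketch of running a compact model entirely on local hardware with the Hugging Face Transformers library; the checkpoint name, prompt and generation settings are illustrative assumptions rather than details taken from this article.

from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-4-mini-instruct"  # assumed checkpoint name on the Hugging Face Hub

# Load the tokenizer and weights; device_map="auto" places the model on whatever
# hardware is available (a GPU, or the CPU of an on-premises or edge machine).
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "List three reasons to run a language model on an edge device."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Generate a short completion locally; no data leaves the machine.
outputs = model.generate(**inputs, max_new_tokens=120)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

The same script would run unchanged against a private on-premises server or a cloud GPU instance, which is the choice of location the passage above describes.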
Phi-4: Microsoft’s latest SLM
“The energy intensity of advanced cloud and AI services has driven us to accelerate our efforts to drive efficiencies and energy reductions,”