
What are SLMs?

While large language models can wield hundreds of billions or even trillions of parameters, small language models (SLMs) usually operate in the range of a few million up to about 10 billion parameters, and so require significantly less memory, processing power and storage.
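As a rough illustration of why parameter count translates so directly into hardware demands, the short Python sketch below estimates the memory needed just to hold a model's weights. The model sizes and the two-bytes-per-parameter assumption (16-bit precision) are illustrative, not figures from this article.

# Back-of-the-envelope memory footprint for model weights alone,
# assuming 16-bit (2-byte) parameters; all figures are illustrative.
def weight_memory_gb(parameters: float, bytes_per_param: int = 2) -> float:
    return parameters * bytes_per_param / 1e9

for name, params in [("3B SLM", 3e9), ("10B SLM", 10e9), ("1T LLM", 1e12)]:
    print(f"{name}: ~{weight_memory_gb(params):,.0f} GB of weights")
# Prints roughly: 3B SLM: ~6 GB, 10B SLM: ~20 GB, 1T LLM: ~2,000 GB

The weights are only part of the story (inference also needs memory for activations and caches), but the gap between a 10-billion-parameter SLM and a trillion-parameter LLM is already two orders of magnitude.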
Technically, SLMs use the same transformer architecture as their larger siblings, but optimisation techniques such as knowledge distillation, pruning and quantisation allow them to retain high task-specific performance at a fraction of the resource cost. By using domain-specific training datasets, SLMs can excel at focused tasks – like company-specific email summarisation or call centre enquiry resolution – rather than the general-purpose omniscience claimed for LLMs.
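To make one of those techniques concrete, here is a minimal sketch of post-training quantisation using PyTorch's built-in dynamic quantisation. The toy model and its dimensions are hypothetical stand-ins, not anything from the article; real SLMs apply the same idea across full transformer stacks.

import torch
from torch import nn

# Hypothetical stand-in for one feed-forward block of a small
# transformer; the layer sizes are illustrative only.
model = nn.Sequential(
    nn.Linear(512, 2048),
    nn.ReLU(),
    nn.Linear(2048, 512),
)

# Dynamic quantisation rewrites the Linear layers so their weights
# are stored as 8-bit integers rather than 32-bit floats, roughly
# quartering their memory footprint at modest accuracy cost.
quantised = torch.ao.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

print(quantised)  # the Linear layers now appear as DynamicQuantizedLinear

Pruning (removing low-importance weights) and knowledge distillation (training a small "student" model to mimic a large "teacher") follow the same philosophy: spend a little accuracy to save a lot of compute.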
What are LLMs?

LLMs, such as GPT-4 and Gemini, are expansive neural networks trained on vast datasets encompassing much of the digitised world's text. With up to trillions of learnable parameters, LLMs can exhibit remarkable fluency in language, reasoning, summarisation, code and more. Their strengths lie in adaptability and breadth – an LLM can handle everything from legal document analysis to poetry.
However, this scale carries costs. Training and operating LLMs demand immense computational power, orchestration across specialised hardware (GPUs, TPUs) and, for most users, a continuous internet connection to cloud-hosted services. This not only increases financial outlays, but also amplifies carbon footprints.