Helsinki-based Silo AI calls the new large language model Viking 7B.
Evaluations indicate best-in-class performance across the Nordic languages without compromising its English output.
"With the Viking model family, we reaffirm our commitment to Europe's digital sovereignty," said Peter Sarlin, CEO of Silo AI.

To fill the data gap, Silo applies a variety of techniques. One is optimising model architectures for pre-training.
Another incorporates translated pairs of high- and low-resource languages.

Several of the techniques use a cross-lingual signal, which enhances the connections between languages.
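Silo hasn't published the exact recipe, but a common way to create such a cross-lingual signal is to place translated sentence pairs side by side in the pre-training data, so the model learns alignments between a high-resource and a low-resource language from ordinary next-token prediction. The Python sketch below is purely illustrative: the parallel corpus, the make_crosslingual_documents helper, and the formatting choices are assumptions, not Silo's implementation.

```python
# A minimal sketch (not Silo AI's actual pipeline) of how translated
# sentence pairs can be folded into pre-training data so that a model
# sees a cross-lingual signal. All data below is illustrative.

import random

# Hypothetical parallel corpus: English paired with a lower-resource
# Nordic language (here Finnish).
parallel_pairs = [
    ("The weather is cold today.", "Sää on tänään kylmä."),
    ("The library opens at nine.", "Kirjasto aukeaa yhdeksältä."),
    ("She is reading a book.", "Hän lukee kirjaa."),
]

def make_crosslingual_documents(pairs, seed=0):
    """Turn translation pairs into plain-text documents for pre-training.

    Placing the source and target sentences next to each other lets the
    model pick up alignments between the languages from ordinary
    next-token prediction, without a separate translation objective.
    """
    rng = random.Random(seed)
    docs = []
    for en, fi in pairs:
        # Randomise the order so neither language always comes first.
        if rng.random() < 0.5:
            docs.append(f"{en}\n{fi}")
        else:
            docs.append(f"{fi}\n{en}")
    return docs

if __name__ == "__main__":
    for doc in make_crosslingual_documents(parallel_pairs):
        print(doc, end="\n\n")
```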
New parameters
The 7-billion-parameter Viking is the first release from a model family announced last month.
Silo also plans to launch 13B and 33B versions.
Checkpoints for both these LLMs were released today.
As the parameter counts grow, the models will improve their understanding of prompts and their capacity for nuanced outputs.
But they will also demand greater computational resources, driving up costs and energy consumption.
With resources under control and performance proven, Silo now plans to integrate every EU language.
"We consider multilingual LLMs to constitute a part of Europe's digital infrastructure," Sarlin said.
Story by Thomas Macaulay
Thomas is the managing editor of TNW.
He leads our coverage of European tech and oversees our talented team of writers.
Away from work, he enjoys playing chess (badly) and the guitar (even worse).