The 7B model, for example, is far less accurate than the 70B version.

B stands for billion and indicates the number of parameters in the model.
All three models are fine-tuned versions of Meta's Llama-2 large language model.
ClimateGPT was trained on data from the Dutch startup Erasmus.AI's planetary-scale corpora.
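For readers who want to experiment, fine-tuned Llama-2 derivatives like these can typically be loaded with the open-source Hugging Face transformers library. The snippet below is a minimal sketch rather than an official example: the repository name eci-io/climategpt-7b, the prompt, and the generation settings are all assumptions made for illustration, so check the ECI website for the actual model name and access terms.

    # Minimal sketch: loading a Llama-2-derived checkpoint such as ClimateGPT
    # with Hugging Face transformers. The repo id below is an assumption, not
    # confirmed by this article.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "eci-io/climategpt-7b"  # assumed name; "7b" = ~7 billion parameters

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)

    # Ask a climate question and decode the generated answer
    prompt = "What are the main drivers of sea-level rise?"
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=200)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))

Smaller checkpoints like the 7B are far cheaper to run locally but, as noted above, less accurate than the 70B version.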

Meanwhile, AppTek, a startup specialising in machine learning, fine-tuned ClimateGPT through a series of tests and benchmarks.
Thanks to software from AppTek, ClimateGPT is available in 20 languages.
Its creators also say that it was trained and is hosted entirely on renewable energy.

You can also request access to it on the ECI website.
Its answer was detailed, thorough, and hit on many of the points addressed in our article.
Importantly, it was backed up by academic sources.

This is part of ECI's mission to make the development of AI as transparent and accessible as possible.
Story by Sion Geschwindt
Sion is a freelance science and technology reporter, specialising in climate and energy.