AI model Poro sets new milestones for multilingual LLMs in Europe

Helsinki-based Silo AI has completed the training of the Poro model — a new milestone in its mission to create large language models (LLMs) for low-resource languages. Named after the Finnish word for “reindeer,” Poro is the first in a family of open-source multilingual LLMs. The startup is building the models alongside the University of Turku and the EU’s High Performance Language Technologies (HPLT) project. Poro is a 34.2-billion-parameter model designed to process English, Finnish, and code, and it has been trained on a dataset of 1 trillion tokens. “What we are proving with Poro is that we can build…

This story continues at The Next Web