Artificial intelligence is constantly advancing, and efforts to promote digital independence and language diversity have taken a significant step forward with the creation of Viking, a cutting-edge language model. Developed by Silo AI, Europe’s largest private AI lab, in collaboration with the TurkuNLP research group at the University of Turku and HPLT, Viking is designed to cater to Nordic languages, as well as English and programming languages. The release of its first checkpoints marks a crucial milestone towards making European language models more accessible.
Viking is the successor to Silo AI's earlier LLM, Poro. Its architecture has been modernized and its language support expanded to cover Danish, English, Finnish, Norwegian, Icelandic, and Swedish, along with several programming languages. Viking comes in three sizes – Viking 7B, 13B, and 33B – reflecting Silo AI's commitment to strengthening European digital infrastructure, promoting linguistic diversity, and ensuring that technology brings people closer together rather than creating cultural barriers.
A Technological and Cultural Bridge
The development of Viking is emblematic of Europe’s broader strategy to strengthen its digital sovereignty while embracing linguistic diversity. By leveraging the latest advancements in multilingual LLMs, Silo AI and TurkuNLP aim to create tools that are linguistically inclusive and culturally sensitive. Viking is specifically designed to perform exceptionally well in languages that typically receive less attention in the global AI race, thus providing a platform for innovation and communication within and beyond the Nordic region.
Performance
Viking's performance is impressive, especially on low-resource languages. In initial evaluations after training on its first 1 trillion tokens, Viking outperforms other open models on these languages while remaining strong in English and programming languages. This result underlines the effectiveness of Viking's multilingual training approach and its ability to handle the intricacies of multilingualism.
Architecture
Viking's architecture draws on proven designs such as Llama 2, incorporating flash attention, rotary position embeddings, and grouped-query attention. It supports a sequence length of up to 4,000 tokens and is trained on a 2-trillion-token dataset spanning the Nordic languages, English, and a variety of programming languages – a significant step towards a truly multilingual and multifunctional LLM.
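To make the architecture concrete, the minimal sketch below loads a Viking checkpoint with the Hugging Face transformers library, inspects the Llama-style configuration fields behind rotary embeddings and grouped-query attention, and runs a short Finnish generation. The repository id LumiOpen/Viking-7B and the config attribute names are assumptions based on how Llama-style checkpoints are typically published, not details confirmed by the announcement.

```python
# Minimal sketch: loading a Viking checkpoint with Hugging Face transformers.
# The repository id "LumiOpen/Viking-7B" is an assumption; check Silo AI's
# official listings for the exact model name.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "LumiOpen/Viking-7B"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # half precision to reduce memory use
    device_map="auto",            # spread layers across available GPUs
)

# For a Llama-style config, these fields reflect the choices described above:
# the ~4k context window and grouped-query attention
# (num_key_value_heads < num_attention_heads).
print(model.config.max_position_embeddings)
print(model.config.num_attention_heads, model.config.num_key_value_heads)

# Quick multilingual generation check with a Finnish prompt.
prompt = "Suomen pääkaupunki on"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```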
Open Access and Ethical Considerations
Viking is released under the Apache 2.0 License, making it freely available for both commercial and research use. While the released research checkpoints are useful for academic and industrial work, Silo AI recommends further training and fine-tuning before deploying the model in production environments.
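Because Silo AI advises further training and fine-tuning before production use, the sketch below shows one common way to do that: parameter-efficient LoRA fine-tuning with the peft and transformers libraries. The model id, dataset, and hyperparameters are illustrative assumptions, not Silo AI's recommended recipe.

```python
# Minimal LoRA fine-tuning sketch for a Viking research checkpoint.
# Model id, dataset, and hyperparameters are placeholders for illustration.
import torch
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

model_id = "LumiOpen/Viking-7B"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
tokenizer.pad_token = tokenizer.eos_token

model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

# Attach small LoRA adapters to the attention projections instead of
# updating all base weights.
lora = LoraConfig(r=16, lora_alpha=32, target_modules=["q_proj", "v_proj"],
                  task_type="CAUSAL_LM")
model = get_peft_model(model, lora)

# Any domain or instruction dataset works here; wikitext is just a stand-in.
data = load_dataset("wikitext", "wikitext-2-raw-v1", split="train[:1%]")
data = data.map(lambda ex: tokenizer(ex["text"], truncation=True, max_length=512),
                batched=True, remove_columns=data.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="viking-lora",
                           per_device_train_batch_size=1,
                           gradient_accumulation_steps=8,
                           num_train_epochs=1,
                           learning_rate=2e-4,
                           bf16=True,
                           logging_steps=10),
    train_dataset=data,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
model.save_pretrained("viking-lora-adapter")  # saves only the small adapter weights
```

Saving only the adapter keeps the fine-tuned artifact small and leaves the base checkpoint untouched, which is convenient when iterating on domain data before any production deployment.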
Key Takeaways
Democratizing Multilingualism: Viking represents a significant effort to bring linguistic diversity to the forefront of AI development, particularly for Nordic languages.
Technological Innovation: With state-of-the-art architecture and extensive training data, Viking showcases superior performance in understanding and generating multilingual content.
Cultural Sensitivity: The model’s development emphasizes the importance of creating technology that respects and incorporates local values and cultures.
Open Access: Viking is freely available under the Apache 2.0 License, encouraging innovation across various sectors by providing access to a high-quality, multilingual LLM.
European Digital Sovereignty: The release of Viking aligns with Europe’s strategy to enhance its digital infrastructure and autonomy, positioning it as a leader in the global AI landscape.