AMD Unleashes OLMo, Its 1B-Parameter Language Models, for Open-Source AI Research

AMD has announced its first open-source language models, OLMo, featuring 1 billion parameters. This initiative aims to give researchers and developers the tools to build specialized AI solutions tailored to specific industry needs. By open-sourcing these models, AMD hopes to encourage innovation and customization in AI technology and help meet the growing demand for such solutions across various sectors.

The AMD OLMo models are pre-trained on 1.3 trillion tokens using AMD Instinct MI250 GPUs across 16 nodes. The release includes three checkpoints representing different stages of training, and the models undergo two-phase supervised fine-tuning (SFT) followed by Direct Preference Optimization (DPO) alignment to enhance reasoning and chat capabilities.
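The article does not detail AMD's alignment recipe, but the DPO step mentioned above optimizes a well-known preference objective. As a rough illustration (not AMD's actual training code), here is a minimal sketch of the per-pair DPO loss, where each argument is the summed log-probability of a response under the policy being trained or the frozen reference model:

```python
import math

def dpo_loss(policy_chosen_logp, policy_rejected_logp,
             ref_chosen_logp, ref_rejected_logp, beta=0.1):
    """Direct Preference Optimization loss for one preference pair.

    beta controls how strongly the policy is allowed to drift
    away from the reference model.
    """
    # Implicit reward margins relative to the reference model
    chosen_margin = policy_chosen_logp - ref_chosen_logp
    rejected_margin = policy_rejected_logp - ref_rejected_logp
    # The loss shrinks as the policy prefers the chosen response
    # more strongly (relative to the reference) than the rejected one
    logits = beta * (chosen_margin - rejected_margin)
    return -math.log(1.0 / (1.0 + math.exp(-logits)))  # -log(sigmoid(logits))
```

With no preference signal (all log-probs equal) the loss is log 2; as the policy raises the chosen response's likelihood relative to the rejected one, the loss falls toward zero.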

In benchmarking tests, the OLMo models have shown competitive performance against other open-source models of similar size, such as TinyLlama and MobiLlama. AMD's decision to open-source the OLMo models underscores its commitment to the AI community. By providing access to the training data, model weights, and code, AMD aims to foster innovation and collaboration in AI research while showcasing the capabilities of its hardware, such as the Ryzen AI processors.

