Apple has introduced OpenELM, a family of open-source large language models (LLMs) that can run entirely on a single device, with no need for cloud servers.
The OpenELM family consists of eight models in two categories: four pre-trained models and four instruction-tuned models, with parameter counts ranging from 270 million to 3 billion.
Two new AI releases by Apple today:
🧚♀️ OpenELM, a set of small (270M-3B) efficient language models. Weights on the Hub:
Pretrained: https://t.co/m0KJhDbc2o
Instruct: https://t.co/sph96ZmBYj
👷♀️ CoreNet, a training library used to train OpenELM: https://t.co/bMJ5UXHQ9f
— Pedro Cuenca (@pcuenq) April 24, 2024
Apple offers its OpenELM models under a “sample code license,” which allows commercial use and modification. However, the company stresses that the models are made available without safety guarantees: users should be aware of the potential for inaccurate, harmful, or biased outputs.
With this move, Apple joins Microsoft, Google, and Samsung in their efforts to make generative AI models run on PCs and smartphones.
The OpenELM models are designed for text-related tasks, in line with Apple’s reported plans to introduce on-device AI features this year. They are small enough to run on commodity laptops and even some smartphones, bringing AI to a wider range of users. The models use a layer-wise scaling strategy that allocates parameters non-uniformly across the transformer’s layers, improving accuracy while remaining compute-efficient.
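To make the idea concrete, here is a minimal sketch of layer-wise scaling: rather than giving every transformer layer the same number of attention heads and the same feed-forward width, per-layer values are interpolated across the model’s depth. The function name and the specific numbers below are illustrative assumptions, not Apple’s actual configuration.

```python
# Illustrative sketch of layer-wise scaling (values are assumptions, not
# OpenELM's real settings): early layers get fewer heads and a smaller
# feed-forward multiplier, later layers get more, instead of a uniform width.

def layerwise_scaling(num_layers: int,
                      min_heads: int = 4, max_heads: int = 20,
                      min_ffn_mult: float = 0.5, max_ffn_mult: float = 4.0):
    """Return (num_heads, ffn_multiplier) for each layer index."""
    configs = []
    for i in range(num_layers):
        t = i / max(num_layers - 1, 1)  # 0.0 at the first layer, 1.0 at the last
        heads = round(min_heads + t * (max_heads - min_heads))
        ffn_mult = min_ffn_mult + t * (max_ffn_mult - min_ffn_mult)
        configs.append((heads, ffn_mult))
    return configs

if __name__ == "__main__":
    for idx, (heads, mult) in enumerate(layerwise_scaling(num_layers=16)):
        print(f"layer {idx:2d}: {heads:2d} heads, FFN multiplier {mult:.2f}")
```

The upshot is that the parameter budget is spent where it helps most, which is how a 270M- or 3B-parameter model can stay competitive while fitting on consumer hardware.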
According to the documentation on Hugging Face, Apple pre-trained the models using its new CoreNet library. Results posted on the Hugging Face model hub indicate that the OpenELM models perform well across a range of natural language processing tasks, including text classification and sentiment analysis.
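For readers who want to try the models, the checkpoints on the Hugging Face Hub can be loaded with the transformers library. The snippet below is a sketch under a few assumptions: the repository name follows the pattern shown on the Hub, the model needs trust_remote_code=True, and, per the model card at the time of writing, tokenization relies on the gated Llama 2 tokenizer; verify these details on the model pages before use.

```python
# Sketch: loading an OpenELM checkpoint from the Hugging Face Hub.
# Repo id, tokenizer choice, and generation settings are assumptions
# based on the public model cards; check the Hub pages before relying on them.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "apple/OpenELM-270M-Instruct"   # assumed repo id; 450M, 1_1B, and 3B variants also exist
tokenizer_id = "meta-llama/Llama-2-7b-hf"  # model card points to the Llama 2 tokenizer (gated access)

tokenizer = AutoTokenizer.from_pretrained(tokenizer_id)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

prompt = "On-device language models are useful because"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```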
The largest model, OpenELM-3B, posted strong results on several benchmarks, underscoring the potential of Apple’s OpenELM models to deliver capable AI on-device without relying on cloud infrastructure.
By opening these models, Apple is broadening access to AI technology, enabling developers to build applications and services that were previously tied to cloud-based infrastructure. As a result, we can expect a new wave of AI-powered apps and services that are more responsive, secure, and tailored to the capabilities of our mobile devices.