Guide for Running Apple OpenELM on Your Local Machine

Younes
3 min read · May 5, 2024
Photo by Sumudu Mohottige on Unsplash

Apple recently released OpenELM (Open-source Efficient Language Models), a family of eight open-source language models: four parameter sizes, each available as a pretrained and an instruction-tuned variant. What sets them apart is that they are designed to run directly on-device rather than relying on cloud servers. In this brief guide, we'll demonstrate how to run and use them.

While I’m a fan of Ollama, which makes it easy to run most open-source LLMs locally…
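
One common way to try the OpenELM checkpoints locally is through Hugging Face transformers. The sketch below is a minimal example, assuming the apple/OpenELM-270M-Instruct checkpoint from the Hub; OpenELM ships its own modeling code (hence trust_remote_code=True) and reuses the Llama 2 tokenizer, whose repo is gated and requires accepting Meta’s license while logged in to Hugging Face.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Smallest instruction-tuned OpenELM checkpoint (chosen here for illustration).
model_id = "apple/OpenELM-270M-Instruct"

# OpenELM publishes custom modeling code on the Hub, so trust_remote_code is required.
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# OpenELM reuses the Llama 2 tokenizer; the meta-llama repo is gated,
# so accept its license and authenticate with `huggingface-cli login` first.
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")

prompt = "Once upon a time there was"
inputs = tokenizer(prompt, return_tensors="pt")

# Greedy decoding, capped at 64 new tokens, with gradients disabled for inference.
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=64, do_sample=False)

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

The larger variants (450M, 1.1B, and 3B, with and without the -Instruct suffix) load the same way; only the model_id changes.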
