LM Studio acquired Locally AI, a mobile app for running large language models on iPhone, iPad, and Mac, the company announced on April 8, 2026. The acquisition brings Locally AI creator Adrien Grondin onto the LM Studio team to lead native AI experiences across devices.
What Happened
LM Studio, the desktop application that lets users run open-source AI models like Llama, Gemma, Qwen, and DeepSeek on macOS, Windows, and Linux, has acquired Locally AI, a mobile-first app for running local AI models on Apple devices.
Adrien Grondin, who built Locally AI, joins LM Studio as part of the deal to lead native AI development across the LM Studio product line. LM Studio described the move as part of a broader push into mobile and cross-device AI access.
"By bringing Locally AI into the LM family, we are doubling down on our mission of making AI accessible and useful to you, across your devices, wherever you go," the company wrote in the announcement on its blog.
Why It Matters
Running AI models locally has been a desktop story for the past two years. LM Studio built a strong following among creators and developers who want privacy, offline access, and no API costs. But mobile has remained a gap: most AI apps on iOS either fall back to cloud inference or rely on Apple's own on-device frameworks, without support for open models.
Locally AI was one of the few apps targeting that gap directly. The acquisition signals that LM Studio intends to own the full local AI stack across platforms, not just the desktop layer. For creators who already use LM Studio on their Mac or PC, continuity with a mobile companion that runs the same models is a meaningful change to how local AI fits into a daily workflow.
If you have been looking at how running models locally fits into your creative or production workflow, the Creator's Guide to Running AI Locally in 2026 covers the current landscape in depth.
Key Details
- Locally AI supports iPhone, iPad, and Mac
- Adrien Grondin joins LM Studio to lead native AI experiences across devices
- LM Studio currently supports Llama, Gemma, Qwen, and DeepSeek on macOS, Windows, and Linux
- LM Studio is actively hiring application developers and system engineers
- The company plans "new ways to use your models and agents seamlessly across your own devices"
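For creators who want to script against their local models today, LM Studio's desktop app already exposes an OpenAI-compatible local server (by default at `http://localhost:1234/v1`). A minimal sketch of calling it from Python is below; the model name is a placeholder for whatever model you have loaded, and the helper names are our own, not part of any LM Studio SDK.

```python
import json
import urllib.request

# LM Studio's local server speaks the OpenAI chat-completions format.
# Default address; configurable in the app's server settings.
LM_STUDIO_URL = "http://localhost:1234/v1/chat/completions"


def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }


def ask_local_model(model: str, prompt: str) -> str:
    """POST the prompt to the local server and return the reply text."""
    payload = json.dumps(build_chat_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        LM_STUDIO_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


# Usage (requires LM Studio's server running with a model loaded):
# reply = ask_local_model("llama-3.2-3b-instruct", "Say hello in one sentence.")
```

Because the endpoint follows the OpenAI wire format, the same snippet works unchanged with any OpenAI-compatible client library pointed at localhost, which is what makes a future shared runtime across desktop and mobile plausible.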
What to Do Next
If you use LM Studio on desktop, the Locally AI app is available now on the App Store. The integration roadmap between LM Studio and Locally AI has not been detailed yet, but Grondin's presence on the team suggests the two products will converge rather than run separately.
Developers interested in the hiring push can find open roles on the LM Studio website. The company is looking for engineers who want to work on the local AI runtime and native application layer.