Osaurus Lets Mac Users Run Local and Cloud AI Models While Keeping Data On-Device

Osaurus, an open-source Mac application that lets users switch between local and cloud-based AI models while keeping their files, memory, and tools on their own hardware, has surpassed 112,000 downloads since launching roughly a year ago, according to the company’s website.

The app was announced in 2026 and supports a wide range of AI models, including MiniMax M2.5, Gemma 4, Qwen3.6, GPT-OSS, Llama, and DeepSeek V4, as well as Apple’s on-device foundation models and Liquid AI’s LFM family. For cloud connectivity, it links to providers including OpenAI, Anthropic, Google Gemini, and xAI’s Grok. It also ships with more than 20 native plugins covering Mail, Calendar, Browser, Git, Filesystem, and other tools, and has recently added voice capabilities.

Osaurus was founded by Terence Pae, a former software engineer at Tesla and Netflix, together with co-founder Sam Yoo. Pae said the project grew out of an earlier desktop AI app called Dinoki, whose users questioned why they should pay for the app when they still had to pay for cloud token usage. That feedback pushed Pae toward building a locally run AI system. The founders are currently participating in Alliance, a startup accelerator based in New York.

Osaurus functions as what the company calls a “harness”: a control layer that connects different AI models and workflows through a single interface. Unlike comparable tools aimed at developers, it targets general consumers and runs AI operations inside a hardware-isolated virtual sandbox to limit security exposure.

Running local AI models still requires significant hardware: at least 64 GB of RAM for basic use, and around 128 GB for larger models such as DeepSeek V4. Pae acknowledged that local AI remains resource-intensive but said the efficiency of local models has improved substantially over the past year.

The team is considering expanding Osaurus to business customers in sectors such as legal and healthcare, where on-device processing could address data privacy concerns. Pae also suggested that broader adoption of local AI could reduce dependence on cloud data centers over time.

Source: TechCrunch

This article was generated by AI and cites original sources.