A Swift app for the Mac built specifically for MLX LLMs

If you want to try out local large language models, there is now a new macOS app to test. It specializes in MLX LLMs optimized for Apple Silicon.


Pico AI Homelab on the Mac: chat directly in the app or via the browser, as desired.

(Image: Starling Protocol Inc)


Experiments with large language models (LLMs) directly on the home computer have been all the rage since the hype surrounding DeepSeek. Apple Silicon Macs are well suited for this, depending on their RAM configuration and processor variant. Apple has also released MLX, a special framework for accelerating machine learning (ML) on ARM Macs. The "Array Framework for Apple Silicon" is designed to make machine learning particularly efficient on the latest Macs. A new free app, available in the Mac App Store, now specializes in models that use this technology.

Pico AI Homelab comes from the provider Starling Protocol Inc and requires macOS 15 (Sequoia) or later. Similar to LM Studio, which runs as an Electron app, it makes trying out different models very easy. Over 300 different models are reportedly available. In addition to the distilled version of DeepSeek R1 in various sizes, there are also Mistral, Meta Llama, Alibaba Qwen, Google Gemma, and Microsoft Phi in different sizes and variants. These are each adapted for MLX, which should make them more performant than GGUF models.


Pico AI Homelab is compatible with Ollama and also uses its API. This means that alternative chat apps such as Open WebUI, MindMac, or Ollamac can also be used with the application. In general, Pico AI Homelab runs as a local HTTP server (localhost), so chats can also be conducted in the browser. The entire system runs offline and no data is sent to the Internet. Pico AI Homelab does not collect any user information itself.
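Because the app speaks the Ollama API, any Ollama client code should work against it. The following sketch queries such a server with Python's standard library; the port (Ollama's default, 11434) and the model tag are assumptions not confirmed by the article, so adjust them to your local setup.

```python
import json
import urllib.request

# Ollama's default address; since Pico AI Homelab exposes an
# Ollama-compatible API, it should answer the same routes.
# Port and model tag are assumptions -- adjust to your setup.
BASE_URL = "http://localhost:11434"
MODEL = "deepseek-r1:8b"  # hypothetical model tag

def build_request(prompt: str, model: str = MODEL) -> urllib.request.Request:
    """Construct a POST to the Ollama-style /api/generate endpoint."""
    payload = {
        "model": model,
        "prompt": prompt,
        "stream": False,  # request one JSON response instead of a chunk stream
    }
    return urllib.request.Request(
        f"{BASE_URL}/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def ask(prompt: str) -> str:
    """Send the prompt to the local server and return the generated text."""
    with urllib.request.urlopen(build_request(prompt)) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    try:
        print(ask("Why is the sky blue?"))
    except OSError:
        # No server running on this machine
        print("No local server reachable -- start Pico AI Homelab first.")
```

The same request shape is what GUI clients such as Open WebUI or Ollamac send under the hood, which is why they can be pointed at the app without changes.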

Pico AI Homelab runs on all Apple Silicon Macs from the M1 upwards. At least 16 GB of RAM is required, and 32 GB or more is extremely useful for larger language models. Command line skills are not required to use Pico AI Homelab. "Thanks to the one-click guided installation, even beginners can get started quickly, while experienced users benefit from flexible customization options," write the creators.
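The RAM requirement follows from simple arithmetic: a model's weights alone occupy roughly its parameter count times the bits per weight, before counting the KV cache, activations, and the rest of macOS. The estimate below is a back-of-envelope sketch, not figures from the app; 4-bit quantization is assumed because it is common for MLX model conversions.

```python
def model_memory_gb(params_billion: float, bits_per_weight: int) -> float:
    """Rough weight-only memory footprint in GiB: parameters x bits,
    ignoring KV cache, activations, and runtime overhead."""
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 1024**3

# Illustrative sizes at 4-bit quantization (assumed, commonly used for MLX):
for size in (7, 14, 32, 70):
    print(f"{size}B params @ 4-bit ~ {model_memory_gb(size, 4):.1f} GiB")
```

By this estimate a 7B model fits comfortably in 16 GB, a 32B model needs most of a 32 GB machine, and a 70B model exceeds it, which is why larger RAM configurations pay off.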

The app is currently completely free, though it is unclear whether this will change later. Unlike with cloud AI providers, there are no API costs or subscription fees. However, you shouldn't expect too much from local LLMs: due to the significantly lower computing power compared to server models, the outputs are poorer and hallucinations are more frequent. Still, local models are definitely fun to "play" with.


(bsc)


This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.