LM Studio
Desktop app for discovering, downloading, and running local LLMs. Clean chat UI, OpenAI-compatible API server, and automatic GPU detection. MLX engine optimized for Apple Silicon.
LM Studio is the desktop application that made local LLMs feel like a finished product. It provides a clean chat interface, model discovery on Hugging Face, one-click downloads, and an OpenAI-compatible API server for integrating local models into your own apps. Since v0.4, the architecture decouples the GUI from inference via a headless daemon called llmster, so the inference engine can run independently of the app. On macOS it uses the MLX engine, optimized for Apple Silicon with fast vision-input handling; on Windows and Linux it runs on llama.cpp with GGUF models (GGML is the legacy format). Automatic GPU detection and optimization means you don't need to configure hardware manually. For developers who want a visual interface for managing local models rather than a CLI, LM Studio is the most polished option available.
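Because the server speaks the OpenAI chat-completions protocol, existing OpenAI client code can target it by swapping the base URL (LM Studio's default is `http://localhost:1234/v1`). A minimal sketch using only the Python standard library; the model name is illustrative, and actually sending the request assumes the server is running with a model loaded:

```python
import json
import urllib.request

# LM Studio's local server exposes an OpenAI-compatible endpoint.
# Default base URL (configurable in the app's Developer tab):
BASE_URL = "http://localhost:1234/v1"

def build_chat_request(model: str, user_message: str) -> urllib.request.Request:
    """Construct an OpenAI-style chat-completions request for the local server."""
    payload = {
        "model": model,  # name of a model loaded in LM Studio (illustrative)
        "messages": [{"role": "user", "content": user_message}],
        "temperature": 0.7,
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Sending it (requires LM Studio's server to be running):
# with urllib.request.urlopen(build_chat_request("llama-3.2-3b-instruct", "Hi")) as resp:
#     reply = json.load(resp)["choices"][0]["message"]["content"]
```

The same shape works with the official `openai` Python package by passing `base_url="http://localhost:1234/v1"` and any placeholder API key, since the local server does not require authentication.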
Similar Tools
Jan
Open-source ChatGPT alternative that runs 100% offline. Desktop app with local models, cloud API connections, custom assistants, and MCP integration. AGPLv3 licensed.
Ollama
The easiest way to run LLMs locally. One command to pull and run any model. OpenAI-compatible API. 52M+ monthly downloads. Supports GGUF, Safetensors, and custom Modelfiles.
GPT4All
Private local AI chatbot by Nomic. 250K+ monthly users, 65K GitHub stars. LocalDocs feature lets you chat with your own files. Runs on Windows, macOS, and Linux.
LocalAI
Open-source OpenAI API replacement. Runs LLMs, vision, voice, image, and video models on any hardware - no GPU required. 35+ backends. Distributed mode for scaling.
Get started with LM Studio
Try LM Studio
Related Guides
Claude Code Setup Guide
Configure Claude Code for maximum productivity -- CLAUDE.md, sub-agents, MCP servers, and autonomous workflows.
AI Agents
Run AI Models Locally with Ollama and LM Studio
Install Ollama and LM Studio, pull your first model, and run AI locally for coding, chat, and automation - with zero cloud dependency.
Getting Started
Building Your First MCP Server
Step-by-step guide to building an MCP server in TypeScript - from project setup to tool definitions, resource handling, testing, and deployment.
AI Agents
Related Posts

Mercury 2: The LLM That Doesn't Generate Like an LLM
Inception Labs shipped the first reasoning model built on diffusion instead of autoregressive generation. Over 1,000 tok...

Claude Skills: A technical deep dive into Anthropic's new approach to AI context management
A comprehensive look at Claude Skills: modular, persistent task modules that shatter AI's memory constraints and enable p...

GPT-5: OpenAI's Most Capable Model
GPT-5 introduces a fundamentally different approach to inference. Instead of forcing developers to manually configure re...
