Local AI Chat
Phillip Lindley
Free
About Local AI Chat
Run small LLMs privately on your iPhone or Mac – no internet required.
This app lets you chat with local AI models directly on an iPhone 15 or later, or a Mac with an Apple M-series chip, without relying on cloud-based processing. With a privacy-first approach, all AI computation happens entirely on-device, so your conversations stay private.
Privacy First:
• All processing is done locally on your iPhone or Mac
• No data is sent to external servers
• No tracking, no ads, no analytics
Features
- On-Device AI Chat – No internet needed, full privacy
- Fast & Efficient – Optimized LLM inference with Core ML / MLC LLM (see the sketch after this list)
- Supports DeepSeek R1 (1.5B) – A powerful yet lightweight model
- Read-aloud responses
- Cat Mode (early access)
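For the curious, here is a minimal sketch of what on-device inference via the Core ML path can look like. The resource name "DeepSeek-R1-1_5B" and the helper function are hypothetical, and the app may use MLC LLM's runtime instead; its actual model packaging, tokenizer, and decoding loop are not shown.

import Foundation
import CoreML

// Minimal sketch: load a Core ML model that ships inside the app bundle.
// Everything runs on the device's Neural Engine / GPU / CPU; no network is involved.
func loadLocalModel() throws -> MLModel {
    let config = MLModelConfiguration()
    config.computeUnits = .all  // allow Neural Engine, GPU, and CPU

    // Hypothetical compiled-model name; the real bundle layout is not public.
    guard let url = Bundle.main.url(forResource: "DeepSeek-R1-1_5B",
                                    withExtension: "mlmodelc") else {
        throw CocoaError(.fileNoSuchFile)  // the model must be bundled with the app
    }
    return try MLModel(contentsOf: url, configuration: config)
}

// A chat turn would tokenize the prompt, call model.prediction(from:) in a loop,
// and feed each generated token back in until an end-of-sequence token appears.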