Local AI
2024-04-30
I’m a big believer in “Local AI”, i.e. running an AI model directly on your phone or other device instead of accessing it through the cloud.
Why?
- 🔌 Availability: works offline, with no dependence on internet connectivity.
- 🔒 Privacy: sensitive data never leaves your device.
- ⚡️ “Speed”: processing on the device removes network latency, but a smaller device is also slower at loading and running a large language model, so it cuts both ways.
- ☘️ Environment: less power usage, less CO₂.
As an example, compare:
- Humane AI Pin (cloud): https://www.youtube.com/watch?v=_w1vv7_dU2Y
- Rabbit R1 (local): https://www.youtube.com/watch?v=aqb3u5mXcl4
Apple is working on its “Ajax” model, and OpenAI is working on its own hardware. Exciting times ahead!
If you want to try Local AI on an iPhone, check out Private LLM.
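If you’d rather see the idea in code, here’s a minimal sketch on a laptop using llama-cpp-python (apps like Private LLM wrap the same concept); the model path is a placeholder for whatever GGUF file you’ve downloaded:

```python
from llama_cpp import Llama

# Load a model from local disk; placeholder path, use any GGUF model you have.
llm = Llama(
    model_path="./models/some-small-model.Q4_K_M.gguf",
    n_ctx=2048,     # context window size
    verbose=False,
)

# Inference runs entirely on-device: no network calls, nothing leaves the machine.
out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "In one sentence, why run AI locally?"}],
    max_tokens=64,
)
print(out["choices"][0]["message"]["content"])
```

Pull the network cable first if you like: it works just the same, which is rather the point.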