Local AI

2024-04-30

I’m a big believer in “Local AI”, i.e. running an AI model embedded on your phone or other device, instead of accessing it through the cloud.

Why?

  • 🔌 Availability: Works offline, with no dependence on internet connectivity.
  • 🔒 Privacy: Sensitive data never leaves your device.
  • ⚡️ “Speed”: Lower latency, since requests never travel over the network, but a small device is also slower at loading and running a large language model.
  • ☘️ Environment: less power usage, less CO₂.
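To make “running an AI model embedded on your device” concrete, here is a minimal sketch using the llama-cpp-python bindings to run a quantized model entirely locally. The model filename is a placeholder; any downloaded GGUF file would work:

```python
# Minimal sketch: run a quantized LLM entirely on-device with llama-cpp-python.
# Assumes a GGUF model file has already been downloaded; the path is a placeholder.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/example-7b.Q4_K_M.gguf",  # placeholder filename
    n_ctx=2048,  # context window size
)

# Inference happens locally: no network request, no data leaving the machine.
result = llm("Why run AI models on-device?", max_tokens=64)
print(result["choices"][0]["text"])
```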

Apple is working on its “Ajax” model, and OpenAI is working on its own hardware. Exciting times ahead!

If you want to try Local AI on an iPhone, try Private LLM.