Local AI needs to be the norm
Key takeaways
- One of the current trends in modern software is for developers to slap an API call to OpenAI or Anthropic into their app to power its features.
- This laziness is creating a generation of software that is fragile, privacy-invading, and fundamentally broken.
- We need to return to a habit of building software where our local devices do the work.
One of the current trends in modern software is for developers to slap an API call to OpenAI or Anthropic into their app to power its features. Reasonable people can quibble over whether those features actually bring value to users, but what I want to discuss is the fundamental concept of taking on a dependency on a cloud-hosted AI model for applications.
This laziness is creating a generation of software that is fragile, privacy-invading, and fundamentally broken. We are building applications that stop working the moment the server crashes or a credit card expires.
We need to return to a habit of building software where our local devices do the work. The silicon in our pocket is mind-bogglingly faster than what was available a decade ago. It has a dedicated Neural Engine sitting there, mostly idle, while we wait for a JSON response from a server farm in Virginia. That’s ridiculous.
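To make the alternative concrete, here is a minimal sketch of what a local-first dependency can look like: talking to a model served on your own machine through Ollama's HTTP API, which by default listens on localhost:11434. The model name `llama3` and the prompt are placeholders; this assumes you have Ollama (or any compatible local server) installed and running, and it uses only the standard library, so there is no cloud credential to expire.

```python
import json
import urllib.request

# Assumed local endpoint: Ollama's default generate API on this machine.
LOCAL_URL = "http://localhost:11434/api/generate"


def build_request(prompt: str, model: str = "llama3") -> urllib.request.Request:
    """Build a request to the local model server -- no API key, no cloud."""
    payload = json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode("utf-8")
    return urllib.request.Request(
        LOCAL_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )


def ask_local(prompt: str) -> str:
    """Send the prompt to the local server and return the generated text."""
    with urllib.request.urlopen(build_request(prompt)) as resp:
        return json.loads(resp.read())["response"]
```

If the network cable is unplugged, this still works; if the vendor's billing system has a bad day, this still works. That is the whole point.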