Running Local LLMs

It's been a while since I've had a topic in technology that I felt strongly enough about to write on. After some experimentation with Ollama and testing out local LLMs, the spark returned. When this blog was originally written, in August 2025, the novelty of ChatGPT had worn off. What most people don't know about LLMs is that the ChatGPT experience of mid-2023 can now be had on your desktop or laptop computer! Why is this important? As LLMs get better and better, running local language models will become more and more viable, and doing so increases privacy for those who use and find value in LLMs. It's my strong opinion that a niche has opened in the AI space for those who can understand and find use cases for local LLMs.

My unique contribution to this technology shift is to show that, with modest hardware, individuals can still access a GPT-4-level experience as it existed circa mid-2023. Additionally, I'll show what I observed in my home lab with various size...