LLM

Posted July 31, 2025
The New LLM Server is Here!

As you might know, I have a server affectionately known as HAL that serves as the backbone for everything that goes on in my home and in my development. I...

Continue Reading
Posted November 12, 2024
Open-source LLMs: Uncensored & secure AI locally with RAG

I have done a little bit of everything discussed in this course: local LLMs, RAG, function calling, etc. I recently installed Ollama on my server and...

Continue Reading