Dev Resources

Ollama

Ollama lets you run and manage large language models (Llama 2, Mistral, and others) locally through a simple CLI and HTTP API. It handles model downloading, GPU/CPU execution, and prompt management out of the box, which makes it a good fit for integrating LLMs into local apps or dev workflows.

Available for macOS, Linux, and Windows.
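As a minimal sketch of calling the local API from Python: this assumes Ollama is running on its default port (11434) and that a model such as `mistral` has already been pulled with `ollama pull mistral`. Only the standard library is used; the model name and prompt here are illustrative.

```python
import json
from urllib import request

# Default endpoint for Ollama's local generate API
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    # stream=False asks for a single JSON response
    # instead of a stream of partial chunks
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    # POST the JSON payload and return the model's text response
    data = json.dumps(build_payload(model, prompt)).encode()
    req = request.Request(
        OLLAMA_URL,
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

The same request works from the CLI with `curl`, and `ollama run mistral` gives an interactive session without any code at all.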
