Category Archives: Ollama

Enable debug mode for Ollama on Linux/macOS

Simply run it as follows: Note: you might want to export another variable to tell the ollama serve command where the models are saved, in case you receive a 404 saying the model is not available; this happened to me for some …
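The excerpt omits the original commands, but based on Ollama's documented `OLLAMA_DEBUG` and `OLLAMA_MODELS` environment variables, the setup presumably looks something like this (the models path below is just the Linux default, a placeholder for wherever your models actually live):

```shell
# Enable verbose debug logging for a foreground ollama serve.
# OLLAMA_DEBUG and OLLAMA_MODELS are documented Ollama variables;
# the path below is the default Linux location, used here as an example.
export OLLAMA_DEBUG=1

# Optional: point the server at the directory where your models are saved,
# in case it returns a 404 for a model you know you have already pulled:
export OLLAMA_MODELS=/usr/share/ollama/.ollama/models

ollama serve
```

With `OLLAMA_DEBUG=1`, the server prints detailed request and model-loading logs to the terminal, which makes the 404 case much easier to diagnose.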


Set Ollama default context size on Linux

Just as simply as this: Then add the following lines (to set the context size to 64k tokens): Save and exit, then restart Ollama: And you're done. Note: tested on Ubuntu 24.04 with Ollama 0.15. That's it, Enjoy!
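The code blocks for these steps are missing from the excerpt. A sketch of how this is typically done on a systemd-based Linux install: `OLLAMA_CONTEXT_LENGTH` is Ollama's documented variable for the default context size, but the exact override below is my reconstruction, not necessarily the original post's:

```shell
# Open a drop-in override file for the Ollama systemd unit:
sudo systemctl edit ollama.service

# In the editor, add the following lines (65536 = 64k tokens):
#   [Service]
#   Environment="OLLAMA_CONTEXT_LENGTH=65536"

# Save and exit, then restart Ollama:
sudo systemctl restart ollama
```

Using `systemctl edit` keeps the override in `/etc/systemd/system/ollama.service.d/`, so it survives package upgrades that replace the main unit file.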
