To run and chat with Llama 3.1 or any other model:
ollama run llama3.1
This starts the Ollama server at the address below (the model is downloaded automatically if it is not already available locally):
http://localhost:11434
No configuration is needed; the server just has to keep running in the background.
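While the server is running in the background, other programs can talk to it over HTTP. As a minimal sketch, the snippet below builds and sends a request to Ollama's /api/generate endpoint at the default address; the model name and prompt are placeholders you would replace with your own.

```python
import json
import urllib.request

# Default local endpoint exposed by a running `ollama` server.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for Ollama's /api/generate endpoint."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for one complete JSON response, not a stream
    }).encode("utf-8")
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )

def generate(model: str, prompt: str) -> str:
    """Send the prompt and return the model's reply (server must be running)."""
    with urllib.request.urlopen(build_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]
```

For example, `generate("llama3.1", "Why is the sky blue?")` returns the model's answer as a string, assuming `ollama run llama3.1` (or the background server) is active.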