# Ollama
## Install

```sh
curl -fsSL https://ollama.com/install.sh | sh
```

## Run

```sh
ollama run llama3
# Prompt away :D
# quit the interactive prompt with /bye
```

## Use ollama in Python scripts

```sh
pip install ollama
```

You can then use it with this Python code:
```python
import ollama

stream = ollama.chat(
    model='mistral',  # we can choose another one, such as llama2 or llama3
    messages=[{'role': 'user', 'content': 'Your prompt here'}],
    stream=True,
)

# Print the output as it streams in
for chunk in stream:
    print(chunk['message']['content'], end='', flush=True)
```
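If you want to collect the streamed reply into a single string instead of printing it chunk by chunk, you can join the pieces. The sketch below mocks the stream with a generator (`fake_stream` is a stand-in for `ollama.chat(..., stream=True)`, so it runs without a local Ollama server); each chunk uses the same `{'message': {'content': ...}}` shape as in the loop above.

```python
# fake_stream stands in for ollama.chat(..., stream=True); each yielded
# chunk mimics the dict shape used in the streaming loop above.
def fake_stream():
    for part in ['Hello', ', ', 'world!']:
        yield {'message': {'content': part}}

def collect(stream):
    # Join every chunk's text into one complete reply string
    return ''.join(chunk['message']['content'] for chunk in stream)

print(collect(fake_stream()))  # -> Hello, world!
```

With a real Ollama server running, you would pass the generator returned by `ollama.chat(...)` to `collect` instead of `fake_stream()`.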