ollama run llama3
# Prompt away :D
# To quit the prompt, type:
/bye
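Ollama also exposes a local REST API (on http://localhost:11434 by default while the server is running), so you can script it over plain HTTP before reaching for any client library. Here is a minimal sketch using the requests package; the endpoint and payload shape follow Ollama's /api/chat API, and the model name is just an example of one you have already pulled:

import requests

# Ask the local Ollama server for a single, non-streamed chat completion.
# Assumes the Ollama server is running on the default port.
response = requests.post(
    'http://localhost:11434/api/chat',
    json={
        'model': 'llama3',
        'messages': [{'role': 'user', 'content': 'Why is the sky blue?'}],
        'stream': False,  # return one JSON object instead of a token stream
    },
)
print(response.json()['message']['content'])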
Use Ollama in Python scripts
First, install the official Python client:
pip install ollama
You can then use it from a Python script like this:
import ollama
# Request a streamed chat completion
stream = ollama.chat(
    model='mistral',  # you can choose another model such as llama2 or llama3
    messages=[{'role': 'user', 'content': 'Your prompt here'}],
    stream=True,
)
# Print the streamed output chunk by chunk as it arrives
for chunk in stream:
    print(chunk['message']['content'], end='', flush=True)
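If you don't need token-by-token output, you can drop stream=True and read the whole reply in one go. A short sketch using the same ollama package, with the model name as an example:

import ollama

# One-shot call: blocks until generation finishes, then returns the full response
response = ollama.chat(
    model='mistral',
    messages=[{'role': 'user', 'content': 'Your prompt here'}],
)
print(response['message']['content'])

Streaming is mostly a latency choice: generation takes the same time either way, but streaming lets you show output as soon as the first tokens arrive.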