```shell
ollama run llama3
# Prompt away :D
# Quit the prompt with /bye
/bye
```
## Use Ollama in Python scripts
```shell
pip install ollama
```
You can then use it from Python like this:
```python
import ollama

stream = ollama.chat(
    model='mistral',  # we can choose another one such as llama2 or llama3
    messages=[{'role': 'user', 'content': 'Your prompt here'}],
    stream=True,
)

# Print the output
for chunk in stream:
    print(chunk['message']['content'], end='', flush=True)
```