# Integration

How to integrate the running Ollama service into your application.
## Use LangChain

Install the `langchain-ollama` package, then:

```python
from langchain_ollama.llms import OllamaLLM

llm = OllamaLLM(model="llama3.1")
print(llm.invoke("Why is the sky blue?"))
```
## Use the Ollama Python package

See ollama-python on GitHub.
```python
import ollama

response = ollama.chat(model='llama3.1', messages=[
    {
        'role': 'user',
        'content': 'Why is the sky blue?',
    },
])
print(response['message']['content'])
```
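If you would rather avoid an extra dependency, the same chat request can be made against Ollama's `/api/chat` endpoint with only the Python standard library. This is a minimal sketch assuming the default service address `http://localhost:11434` and that `llama3.1` has already been pulled; the helper names are illustrative, not part of any library:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # default Ollama address

def build_chat_payload(model, prompt, stream=False):
    """Build the JSON body for a single-turn /api/chat request."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": stream,
    }

def chat(model, prompt):
    """POST the chat request and return the assistant's reply text."""
    body = json.dumps(build_chat_payload(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]

# With the service running, a call would look like:
# reply = chat("llama3.1", "Why is the sky blue?")
```

Setting `"stream": False` makes the server return one complete JSON object instead of a stream of chunks, which keeps the parsing simple.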
## Use cURL

Based on the Ollama docs:

```shell
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.1",
  "prompt": "Why is the sky blue?"
}'
```
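Note that `/api/generate` streams its reply by default, as one JSON object per line, each carrying a `response` fragment and a final chunk with `"done": true`. A minimal sketch of reassembling such a stream with the Python standard library (the sample lines below are illustrative, not captured from a real model):

```python
import json

# Illustrative NDJSON chunks as streamed by /api/generate.
sample_stream = [
    '{"model":"llama3.1","response":"The sky ","done":false}',
    '{"model":"llama3.1","response":"is blue.","done":false}',
    '{"model":"llama3.1","response":"","done":true}',
]

def reassemble(lines):
    """Concatenate the "response" fragments from an NDJSON stream."""
    return "".join(json.loads(line)["response"] for line in lines)

print(reassemble(sample_stream))  # The sky is blue.
```

Alternatively, add `"stream": false` to the request body to receive a single JSON object instead.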