The initial versions of the Ollama Python and JavaScript libraries are now available.
Both libraries make it possible to integrate new and existing apps with Ollama in a few lines of code, and share the features and feel of the Ollama REST API.
Getting Started
Python
pip install ollama
import ollama
response = ollama.chat(model='llama2', messages=[
  {
    'role': 'user',
    'content': 'Why is the sky blue?',
  },
])
print(response['message']['content'])
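Because the library shares the shape of the Ollama REST API, the same request can be made with a plain HTTP call. Here is a minimal sketch using the requests package, assuming a local Ollama server on the default port (11434):

import requests

# The /api/chat endpoint wrapped by ollama.chat; 'stream': False asks for a
# single JSON response instead of a stream of chunks.
response = requests.post(
  'http://localhost:11434/api/chat',
  json={
    'model': 'llama2',
    'messages': [{'role': 'user', 'content': 'Why is the sky blue?'}],
    'stream': False,
  },
)
print(response.json()['message']['content'])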
JavaScript
npm install ollama
import ollama from 'ollama'
const response = await ollama.chat({
  model: 'llama2',
  messages: [{ role: 'user', content: 'Why is the sky blue?' }],
})
console.log(response.message.content)
Use cases
Both libraries support Ollama’s full set of features. Here are some examples in Python:
Streaming
messages = [{'role': 'user', 'content': 'Why is the sky blue?'}]
for chunk in ollama.chat(model='mistral', messages=messages, stream=True):
  print(chunk['message']['content'], end='', flush=True)
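If the full reply is also needed once streaming finishes, the chunks can be accumulated as they arrive; a small sketch:

full_reply = ''
for chunk in ollama.chat(model='mistral', messages=messages, stream=True):
  part = chunk['message']['content']
  print(part, end='', flush=True)
  full_reply += part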
Multi-modal
with open('image.png', 'rb') as file:
  response = ollama.chat(
    model='llava',
    messages=[
      {
        'role': 'user',
        'content': 'What is strange about this image?',
        'images': [file.read()],
      },
    ],
  )
print(response['message']['content'])
Text Completion
result = ollama.generate(
model="stable-code",
prompt="// A c function to reverse a string\n",
)
print(result['response'])
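Streaming works the same way for text completion; a minimal sketch reusing the stable-code model from above:

# Stream the completion instead of waiting for the full response
for chunk in ollama.generate(
  model='stable-code',
  prompt='// A C function to reverse a string\n',
  stream=True,
):
  print(chunk['response'], end='', flush=True)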
Creating custom models
modelfile=""'
FROM llama2
SYSTEM You are mario from super mario bros.
'''
ollama.create(model="example", modelfile=modelfile)
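Once created, the custom model can be chatted with like any other model, for example:

response = ollama.chat(model='example', messages=[
  {'role': 'user', 'content': 'Hello! Who are you?'},
])
print(response['message']['content'])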
Custom client
from ollama import Client

ollama = Client(host='my.ollama.host')
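The client exposes the same methods as the module-level functions, and there is also an asynchronous variant. A minimal sketch, assuming the custom host above is reachable:

import asyncio
from ollama import AsyncClient

async def main():
  # Same chat call as before, but non-blocking
  client = AsyncClient(host='my.ollama.host')
  response = await client.chat(model='llama2', messages=[
    {'role': 'user', 'content': 'Why is the sky blue?'},
  ])
  print(response['message']['content'])

asyncio.run(main())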
More examples are available in the GitHub repositories for the Python and JavaScript libraries.
New GitHub handle
These libraries, and the main Ollama repository, now live in a new GitHub organization: ollama! Thank you to all the amazing community members who maintain libraries for interacting with Ollama from Dart, Swift, C#, Java, PHP, Rust, and more; a full list is available here. Please don’t hesitate to make a pull request to add a library you’ve built or enjoy using.
Special thank you to Saul, the original author of ollama-js, and everyone else who has worked on community libraries to make Ollama more accessible from different programming languages.