Ollama & Open WebUI User Guide at UiS

Hosted Endpoints

Ollama API: https://ollama.ux.uis.no

🚀 Accessing the Ollama API

🔧 Prerequisites

Make sure you can reach the hosted endpoint. Listing the models available on the server is a quick connectivity check:

curl https://ollama.ux.uis.no/api/tags

🧪 Example: Generate a Completion

Use curl to send a prompt to the model:

curl https://ollama.ux.uis.no/api/generate -d '{ 
    "model": "llama3.3", 
    "prompt": "What is the capital of Norway?", 
    "stream": false 
}'
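
With "stream": false, the server returns the complete answer in a single JSON object, with the generated text in its "response" field. Leaving "stream" at its default (true) instead returns the answer incrementally as a series of JSON objects.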

Full documentation for using the API is available in the official Ollama API reference: https://github.com/ollama/ollama/blob/main/docs/api.md

๐Ÿ Generate a Completion with Python

🔧 Prerequisites
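
The examples below use the official Ollama Python client, available on PyPI:

pip install ollama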

Code example for generating a single completion:

from ollama import Client

# Set up client to be used for generating completions
client = Client(host="https://ollama.ux.uis.no")

response = client.generate(
    model="llama3.3",
    prompt="What is the capital of Norway?",
    options={
        "temperature": 0.7,  # optional: controls randomness
        "num_predict": 100,  # optional: limits the length of the response
    },
)
print(response["response"])
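
For longer answers you may prefer to stream the output as it is generated rather than waiting for the full response. The following is a minimal sketch of streaming with the same client, assuming llama3.3 is available on the server (the prompt is just an illustrative placeholder):

# Stream the completion chunk by chunk as the model produces it
for chunk in client.generate(
    model="llama3.3",
    prompt="Write one sentence about Stavanger.",
    stream=True,
):
    # Each chunk carries a piece of the generated text in its "response" field
    print(chunk["response"], end="", flush=True)
print()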