
Installing Ollama and running llama 3.1 8b

Popularity: 761 / 2024-08-09 16:22:41

conda create -n ollama python=3.11 -y
conda activate ollama
curl -fsSL https://ollama.com/install.sh | sh
ollama run songfy/llama3.1:8b

It's as simple as that, and it's up and running.

We can interact with it on the command line.


Of course, we can also access it through the HTTP API.

curl http://localhost:11434/api/generate -d '{
  "model": "songfy/llama3.1:8b",
  "prompt":"Why is the sky blue?"
}'
curl http://localhost:11434/api/chat -d '{
  "model": "songfy/llama3.1:8b",
  "messages": [
    { "role": "user", "content": "why is the sky blue?" }
  ]
}'
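The /api/generate endpoint streams its answer back as one JSON object per line, each carrying a piece of the text in its "response" field. A minimal sketch of stitching those chunks back together (the sample lines below are made up for illustration):

```python
import json

def join_stream(lines):
    # Concatenate the "response" field of each streamed NDJSON chunk
    # into the full generated answer, skipping blank lines.
    return "".join(json.loads(line)["response"] for line in lines if line.strip())

# Hypothetical sample of what the server streams back:
sample = [
    '{"model":"songfy/llama3.1:8b","response":"The sky","done":false}',
    '{"model":"songfy/llama3.1:8b","response":" is blue.","done":true}',
]
print(join_stream(sample))  # The sky is blue.
```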

Install open-webui

Edit the ollama systemd unit and add an Environment entry so it listens on all interfaces:

vim /etc/systemd/system/ollama.service

########## Contents ###########################################################
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
Environment="OLLAMA_HOST=0.0.0.0:11434"
ExecStart=/usr/local/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=/root/anaconda3/envs/ollama/bin:/root/anaconda3/condabin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/root/bin"

[Install]
WantedBy=default.target

Then reload systemd and restart the service:

systemctl daemon-reload
systemctl enable ollama
systemctl restart ollama
Then start open-webui with Docker:

docker run -d -p 8801:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main

Here 8801 is one of our externally open ports.


Once started, we can access it from outside at ip:8801.

We'll see a signup page.


Since it's a private service, just sign up and log in.


Then you can start using it.


How about that? Not bad, right?

Generating an OpenAI-compatible API

Port Forwarding.

cat > /etc/yum.repos.d/nux-misc.repo << EOF
[nux-misc]
name=Nux Misc
baseurl=http://li.nux.ro/download/nux/misc/el7/x86_64/
enabled=0
gpgcheck=1
gpgkey=http://li.nux.ro/download/nux/RPM-GPG-KEY-nux.ro
EOF
yum -y --enablerepo=nux-misc install redir

redir --lport=8802 --caddr=0.0.0.0 --cport=11434
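If you'd rather not install redir, the same forwarding can be sketched in pure Python with sockets. A rough sketch, not production-grade (port numbers match the redir command above):

```python
import socket
import threading

def pipe(src, dst):
    # Copy bytes one direction until the peer closes, then shut both ends.
    try:
        while chunk := src.recv(4096):
            dst.sendall(chunk)
    except OSError:
        pass
    finally:
        src.close()
        dst.close()

def serve(lport, caddr, cport):
    # Accept connections on lport and splice each client to caddr:cport,
    # mirroring: redir --lport=8802 --caddr=... --cport=11434
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("0.0.0.0", lport))
    srv.listen()
    while True:
        client, _ = srv.accept()
        upstream = socket.create_connection((caddr, cport))
        threading.Thread(target=pipe, args=(client, upstream), daemon=True).start()
        threading.Thread(target=pipe, args=(upstream, client), daemon=True).start()

# Forward external 8802 to the local ollama port (runs forever):
# serve(8802, "127.0.0.1", 11434)
```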

This lets you call it from Python with the OpenAI client:

from openai import OpenAI

client = OpenAI(
    base_url='http://{ip}:{port}/v1/',
    api_key='ollama',  # api_key is required here but is ignored by ollama
)

completion = client.chat.completions.create(
    model="songfy/llama3.1:8b",
    messages=[
        {"role": "user", "content": "Write a c++ quick sort code"}
    ],
)

print(completion.choices[0].message.content)

It returns:

```cpp
#include <iostream>

void swap(int &a, int &b) {
    int temp = a;
    a = b;
    b = temp;
}

// Forward declaration: partition is defined below but used here
int partition(int arr[], int left, int right);

void quickSort(int arr[], int left, int right) {
    if (left < right) {
        int pivotIndex = partition(arr, left, right);
        
        // Recursively sort subarrays
        quickSort(arr, left, pivotIndex - 1);
        quickSort(arr, pivotIndex + 1, right);
    }
}

int partition(int arr[], int left, int right) {
    int pivot = arr[right];
    int i = left - 1;

    for (int j = left; j < right; j++) {
        if (arr[j] <= pivot) {
            i++;
            swap(arr[i], arr[j]);
        }
    }

    swap(arr[i + 1], arr[right]);

    return i + 1;
}

void printArray(int arr[], int size) {
    for (int i = 0; i < size; i++)
        std::cout << arr[i] << " ";
    std::cout << "\n";
}

// Example usage
int main() {
    int arr[] = {5, 2, 8, 1, 9};
    int n = sizeof(arr) / sizeof(arr[0]);

    quickSort(arr, 0, n - 1);

    printArray(arr, n);
    return 0;
}
```
Output:

`1 2 5 8 9`

Quick sort partitions the array into two parts around a pivot, then recursively applies the same operation to each subarray. The `partition()` function rearranges the elements around the pivot value: on each pass it moves values less than or equal to the pivot to the left and larger values to the right, then returns the pivot's final position.
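For comparison, the same Lomuto partition scheme as the generated C++ can be sketched in Python:

```python
def partition(arr, left, right):
    # Lomuto partition: values <= pivot end up left of the returned index.
    pivot = arr[right]
    i = left - 1
    for j in range(left, right):
        if arr[j] <= pivot:
            i += 1
            arr[i], arr[j] = arr[j], arr[i]
    arr[i + 1], arr[right] = arr[right], arr[i + 1]
    return i + 1

def quick_sort(arr, left=0, right=None):
    # Sort arr in place by recursing on each side of the pivot.
    if right is None:
        right = len(arr) - 1
    if left < right:
        p = partition(arr, left, right)
        quick_sort(arr, left, p - 1)
        quick_sort(arr, p + 1, right)

data = [5, 2, 8, 1, 9]
quick_sort(data)
print(data)  # [1, 2, 5, 8, 9]
```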