> Examples here run `ollama serve` and `ollama run` in the background to enable concise demonstrations from a single script or shell. However, official examples also show that these commands can be run in separate shells on the same node instead.
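
For illustration, a minimal single-script sketch of this pattern is shown below. The model name `llama3.2` and the prompt are placeholders chosen for this example only:

```bash
# Start the Ollama server in the background and remember its PID.
ollama serve &
SERVE_PID=$!

# Wait until the API answers on its default port (11434).
until curl -s http://localhost:11434/ > /dev/null; do
    sleep 1
done

# Send a single prompt; the model is pulled automatically if it is missing.
ollama run llama3.2 "Say hello in one sentence."

# Stop the server again when the demonstration is finished.
kill "$SERVE_PID"
```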
### 1.1. Running Ollama with the official container (recommended)
An Ollama container will be centrally provided on our HPC system **very soon**. However, for now, let's assume we have created one with the following command:
```bash
...
sbatch submit_job_container.sh
```
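
The contents of `submit_job_container.sh` are not shown in this excerpt. As a purely hypothetical sketch, assuming Apptainer, a GPU node, and a local image named `ollama.sif` (all of which may differ on your system), such a batch script could look roughly like this:

```bash
#!/bin/bash
#SBATCH --job-name=ollama-demo
#SBATCH --gres=gpu:1          # adjust to your site's partitions and policies
#SBATCH --time=00:30:00
#SBATCH --output=ollama-%j.log

CONTAINER=ollama.sif          # placeholder; use the centrally provided image

# Start the server inside the container in the background, then query it.
# Apptainer shares the host network, so the client call below can reach it.
apptainer exec --nv "$CONTAINER" ollama serve &
sleep 10                      # crude wait; a readiness loop as above is nicer

apptainer exec --nv "$CONTAINER" ollama run llama3.2 "Say hello in one sentence."
```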
### 1.2. Downloading and running Ollama manually
Before being able to execute Ollama and run the examples, you need to download Ollama and make it available to the upcoming workflow steps.
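
A sketch of that manual download, assuming the official Linux tarball and an installation into your home directory (verify the URL and adjust paths for your system):

```bash
# Download the official Linux release tarball into a user-writable location.
mkdir -p "$HOME/ollama"
curl -L https://ollama.com/download/ollama-linux-amd64.tgz \
     -o "$HOME/ollama/ollama-linux-amd64.tgz"

# Unpack it; the archive contains bin/ and lib/ directories.
tar -C "$HOME/ollama" -xzf "$HOME/ollama/ollama-linux-amd64.tgz"

# Make the binary available to the upcoming workflow steps.
export PATH="$HOME/ollama/bin:$PATH"
ollama --version
```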