From b74d2903886a90dd8001dbbbcc331951ff21c7c3 Mon Sep 17 00:00:00 2001
From: Jannis Klinkenberg <j.klinkenberg@itc.rwth-aachen.de>
Date: Thu, 5 Jun 2025 15:11:49 +0200
Subject: [PATCH] Edit README.md

---
 machine-and-deep-learning/ollama/README.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/machine-and-deep-learning/ollama/README.md b/machine-and-deep-learning/ollama/README.md
index 393646d..dc43035 100644
--- a/machine-and-deep-learning/ollama/README.md
+++ b/machine-and-deep-learning/ollama/README.md
@@ -34,7 +34,7 @@ pip install ollama
 > [!NOTE]
 > Examples here run `ollama serve` and `ollama run` in the background to enable concise demonstrations from a single script or shell. However, official examples also show that these commands can be run in separate shells on the same node instead.
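 
 The pattern from that note boils down to something like the following sketch (an illustration only, assuming Ollama is already on your `PATH`; the model name `llama3.1` is just an example):
 ```bash
 # start the Ollama server in the background and remember its PID
 ollama serve > ollama_serve.log 2>&1 &
 SERVE_PID=$!
 
 # give the server a moment to start listening before sending requests
 sleep 10
 
 # run a model against the background server (pulls the model on first use,
 # then blocks until the reply is printed)
 ollama run llama3.1 "Why is the sky blue?"
 
 # shut the server down again
 kill $SERVE_PID
 ```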
 
-## 1.1. Running Ollama with the official container (recommended)
+### 1.1. Running Ollama with the official container (recommended)
 
 An Ollama container will be centrally provided on our HPC system **very soon**. However, for now let's assume we created one with the following command:
 ```bash
@@ -56,7 +56,7 @@ zsh submit_job_container.sh
 sbatch submit_job_container.sh
 ```
 
-## 1.2. Downloading and running Ollama manually
+### 1.2. Downloading and running Ollama manually
 
 Before being able to execute Ollama and run the examples, you need to download Ollama and make it available to the subsequent workflow steps.
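 
 One way this could look is sketched below (an assumption-laden example: it uses the official Linux x86_64 tarball from the Ollama download page and a user-writable install directory; adjust names and paths to your environment):
 ```bash
 # choose a user-writable installation directory (example path)
 OLLAMA_DIR=$HOME/ollama
 mkdir -p "$OLLAMA_DIR"
 
 # download and unpack the official Linux release tarball
 # (expected to contain bin/ and lib/ subdirectories)
 curl -L https://ollama.com/download/ollama-linux-amd64.tgz -o /tmp/ollama-linux-amd64.tgz
 tar -C "$OLLAMA_DIR" -xzf /tmp/ollama-linux-amd64.tgz
 
 # make the ollama binary visible to the following workflow steps
 export PATH="$OLLAMA_DIR/bin:$PATH"
 ollama --version
 ```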
 
-- 
GitLab