Commit 6092c356 authored by Jannis Klinkenberg

removed obsolete files

parent ac9b61e3
@@ -16,14 +16,17 @@ Please find more information to Ollama in the following links:
To demonstrate how to use Ollama with the `ollama-python` library, you first need to create a Python virtual environment. Run the following command **ONCE**:
```bash
# Specify the Ollama root directory
export OLLAMA_ROOT_DIR=${HOME}/ollama
# set further relative path variables
source set_paths.sh
# create the venv
module load Python
mkdir -p ${OLLAMA_ROOT_DIR}
python -m venv ${OLLAMA_VENV_DIR}
source ${OLLAMA_VENV_DIR}/bin/activate
pip install ollama
```
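With the venv in place, a first client script is only a few lines. The following sketch writes a minimal chat script for the `ollama-python` library; `llama3` is a placeholder model name (use whichever model you have pulled), and the script assumes an Ollama server is already reachable on its default port:

```bash
# write a minimal chat script for the ollama-python client
# ("llama3" is a placeholder model name; the script assumes a
#  running Ollama server on the default port)
cat > chat_example.py <<'EOF'
import ollama

response = ollama.chat(
    model="llama3",
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
)
print(response["message"]["content"])
EOF
# run it from the activated venv:
# python chat_example.py
```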
## 1. Running Ollama
@@ -32,7 +35,18 @@ zsh create_venv.sh
## 1.1. Running Ollama with the official container
An Ollama container will be centrally provided on our HPC system **very soon**. However, for now, let's assume we have created one with the following command:
```bash
# Specify the Ollama root directory
export OLLAMA_ROOT_DIR=${HOME}/ollama
# set further relative path variables
source set_paths.sh
# build Ollama apptainer container
apptainer build ${OLLAMA_COINTAINER_IMAGE} docker://ollama/ollama
```
Afterwards, you can start using the examples right away, either in your current shell or by submitting a batch job to run them on a backend node:
```bash
# run in current active shell
zsh submit_job_container.sh
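The contents of `submit_job_container.sh` are not shown in this diff. Under the assumptions above, the container-based run would boil down to something like the following sketch of a job-script fragment; the `--nv` flag and the short wait are assumptions, not taken from the repository:

```bash
#!/usr/bin/zsh
# sketch only: how the container-based workflow could start the server
export OLLAMA_ROOT_DIR=${HOME}/ollama
source set_paths.sh
# start the Ollama server inside the container (--nv exposes the GPU)
apptainer exec --nv ${OLLAMA_COINTAINER_IMAGE} ollama serve &
# give the server a moment to come up before clients connect
sleep 5
# ... run your ollama-python examples against the server here ...
```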
@@ -47,14 +61,15 @@ Before being able to execute Ollama and run the examples, you need to download
Execute the following instructions **ONCE** to download Ollama:
```bash
# Specify the Ollama root directory
export OLLAMA_ROOT_DIR=${HOME}/ollama
# set further relative path variables
source set_paths.sh
# create required directory and download Ollama binaries
mkdir -p ${OLLAMA_INSTALL_DIR} && cd ${OLLAMA_INSTALL_DIR}
curl -L https://ollama.com/download/ollama-linux-amd64.tgz -o ollama-linux-amd64.tgz
tar -xzf ollama-linux-amd64.tgz
```
Now you can execute the examples, either in the current shell or by submitting a batch job that runs the examples on a backend node:
...
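Once the binaries are extracted and `set_paths.sh` has put them on the `PATH`, a typical first session could look like the following sketch; `llama3` is again a placeholder model name, and the short wait is an assumption:

```bash
export OLLAMA_ROOT_DIR=${HOME}/ollama
source set_paths.sh              # puts ${OLLAMA_INSTALL_DIR}/bin on the PATH
ollama serve &                   # start the server in the background
sleep 5                          # wait briefly until it accepts connections
ollama pull llama3               # download a model ONCE
ollama run llama3 "Hello there"  # quick smoke test from the CLI
```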
#!/usr/bin/zsh
# create required directory
mkdir -p ${OLLAMA_ROOT_DIR}
# create Python virtual environment
module load Python
python -m venv ${OLLAMA_VENV_DIR}
# activate the environment
source ${OLLAMA_VENV_DIR}/bin/activate
# install the ollama-python library
pip install ollama
\ No newline at end of file
#!/usr/bin/zsh
# create required directory and download Ollama binaries
mkdir -p ${OLLAMA_INSTALL_DIR} && cd ${OLLAMA_INSTALL_DIR}
curl -L https://ollama.com/download/ollama-linux-amd64.tgz -o ollama-linux-amd64.tgz
tar -xzf ollama-linux-amd64.tgz
\ No newline at end of file
@@ -7,7 +7,7 @@ export OLLAMA_INSTALL_DIR=${OLLAMA_ROOT_DIR}/install
export OLLAMA_VENV_DIR=${OLLAMA_ROOT_DIR}/venv_ollama
# path to Ollama container image
export OLLAMA_COINTAINER_IMAGE=${OLLAMA_ROOT_DIR}/ollama.sif
# extend path to make it executable in the shell
export PATH="${OLLAMA_INSTALL_DIR}/bin:${PATH}"
\ No newline at end of file
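The hunk above only shows part of `set_paths.sh`. Piecing together the lines visible in this diff, the script roughly amounts to the following; the default for `OLLAMA_ROOT_DIR` is an assumption, since that line is truncated out of the hunk:

```bash
# rough reconstruction of set_paths.sh from the visible diff lines
# (the OLLAMA_ROOT_DIR fallback is an assumption)
export OLLAMA_ROOT_DIR=${OLLAMA_ROOT_DIR:-${HOME}/ollama}
# installation and venv locations, relative to the root directory
export OLLAMA_INSTALL_DIR=${OLLAMA_ROOT_DIR}/install
export OLLAMA_VENV_DIR=${OLLAMA_ROOT_DIR}/venv_ollama
# path to Ollama container image
export OLLAMA_COINTAINER_IMAGE=${OLLAMA_ROOT_DIR}/ollama.sif
# extend PATH so the ollama binary is executable in the shell
export PATH="${OLLAMA_INSTALL_DIR}/bin:${PATH}"
```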
@@ -14,10 +14,9 @@
### Load modules or software
############################################################
# Specify the Ollama root directory
export OLLAMA_ROOT_DIR=${HOME}/ollama
# set dependent paths
source set_paths.sh
# load Python and activate venv
...
@@ -14,10 +14,9 @@
### Load modules or software
############################################################
# Specify the Ollama root directory
export OLLAMA_ROOT_DIR=${HOME}/ollama
# set dependent paths
source set_paths.sh
# load Python and activate venv
...