diff --git a/machine-and-deep-learning/ollama/README.md b/machine-and-deep-learning/ollama/README.md
index eaec1b6e7ba5774509a73601e40c0c043d4e28e2..8e489e95c7907e9ed9a9258f8610c23ea025b19d 100644
--- a/machine-and-deep-learning/ollama/README.md
+++ b/machine-and-deep-learning/ollama/README.md
@@ -1,6 +1,6 @@
-# Running temporary Large Language Models (LLMs) with Ollama
+# Getting Started with LLM Inference Using Ollama
 
-This directory outlines two distinct scenarios and approaches, differing in the method of running the base Ollama server and the LLM:
+This directory outlines two distinct approaches, differing in how the base Ollama server and the Large Language Model (LLM) are run:
 1. An approach utilizing the official Ollama container image, which encompasses the entire software stack and necessary binaries to operate Ollama.
 2. An approach involving manual setup of Ollama within your user directories, requiring you to download binaries and modify paths accordingly.
 
diff --git a/machine-and-deep-learning/vllm/README.md b/machine-and-deep-learning/vllm/README.md
index fba8867c4c75a7e470dfbb57a6cac9ed15b9ab44..6a39176bdc99e87e77af6ab9f6718c4a8eacd0b3 100644
--- a/machine-and-deep-learning/vllm/README.md
+++ b/machine-and-deep-learning/vllm/README.md
@@ -1,6 +1,6 @@
-# Running temporary Large Language Models (LLMs) with vLLM
+# Getting Started with LLM Inference Using vLLM
 
-This directory outlines how to run LLMs via vLLM, either with a predefined Apptainer container image or with a virtual environment where vLLM is installed. Interaction with LLMs happens through the `vllm` Python package.
+This directory outlines how to run Large Language Models (LLMs) and perform inference via vLLM, either with a predefined Apptainer container image or with a virtual environment in which vLLM is installed. Interaction with LLMs happens through the `vllm` Python package.
 
 You can find additional information and examples on vLLM under https://docs.vllm.ai/en/latest/