This guide covers all the steps needed to install and configure TuRTLe for API-based inference and local Docker evaluation.
- For cluster/HPC setup with vLLM and Singularity, please see LOCAL_INFERENCE.md
- Python 3.10 or higher
- uv
- Docker (for local API evaluation only)
Install uv:

On macOS and Linux:

```bash
curl -LsSf https://astral.sh/uv/install.sh | sh
```

On Windows:

```powershell
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
```

Or using pip:

```bash
pip install uv
```

Clone the repository:

```bash
git clone --recursive https://github.com/HPAI-BSC/TuRTLe.git
cd TuRTLe
```

Initialize the project and install dependencies:

```bash
uv init
uv add -r requirements.txt
```

This will create a virtual environment and install all required packages from requirements.txt.
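Once the environment exists, project commands can be run through uv so the correct interpreter is picked up. A sketch of a pre-flight check, assuming the steps above completed in the project root:

```shell
# uv creates .venv/ in the project root; check for it before launching anything
if [ -d .venv ]; then
    echo "environment ready"
else
    echo "run 'uv init && uv add -r requirements.txt' first"
fi

# Commands can then be run inside the environment without activating it:
#   uv run python --version
```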
A recent version of Docker CE (Community Edition) is required to run local evaluations with the EDA tools.
Install Docker CE from https://docs.docker.com/get-docker/
Add your user to the docker group so you can run Docker without sudo:

```bash
# Add the current user to the docker group
sudo usermod -aG docker $USER

# Log out and back in (or run `newgrp docker`) for the group change to take
# effect, then verify that Docker works without sudo:
docker run --rm hello-world
```

Pull the TuRTLe evaluation Docker image, which contains all the EDA tools:

```bash
docker pull ggcr0/turtle-eval:2.3.4
```

- See the main README.md for quick-start API inference and evaluation
- For cluster/HPC setup with vLLM and Singularity, please see LOCAL_INFERENCE.md
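Since the `usermod` change above only takes effect in a new login session, a quick group-membership check can save a confusing permission error later. A minimal sketch:

```shell
# Prints whether the current session already has docker group membership
if id -nG | tr ' ' '\n' | grep -qx docker; then
    echo "docker group active in this session"
else
    echo "docker group not active yet; log out and back in (or run 'newgrp docker')"
fi
```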