Project:LLMQueryService POC
This page describes how to install the proof-of-concept LLM-based query service.
Try it here: http://130.73.240.230/ (only from the ZIB network or VPN)
Using an OpenStack VM
- Create a new instance (for Debian, use version 12 or newer)
- Install the necessary packages:
- apt-get update
- apt-get install git python3-pip python3-venv
- Clone the repository and follow the rest of the manual ( https://git.zib.de/bzfconra/mardi_llm_bottest )
- Install Ollama ( https://ollama.com/download/linux )
- Check the logs: journalctl -u ollama -f
- You might need SSH port forwarding to reach the web interface from your machine
- Example (forwards local port 8000 to the app listening on remote port 8501): ssh -L 8000:127.0.0.1:8501 -i OPENSTACK_KEY_FILE.pem debian@130.73.240.230
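The setup steps above can be condensed into a sketch for a fresh Debian 12 instance. The repository URL and the Ollama install script are taken from the links above; the virtual-environment name and the presence of a requirements.txt in the repository are assumptions.

```shell
# Run as root (or via sudo) on the new Debian 12 instance.
apt-get update
apt-get install -y git python3-pip python3-venv

# Clone the bot repository and set up an isolated Python environment.
git clone https://git.zib.de/bzfconra/mardi_llm_bottest
cd mardi_llm_bottest
python3 -m venv .venv              # venv name is an assumption
. .venv/bin/activate
pip install -r requirements.txt    # assumes the repo ships a requirements file

# Install Ollama with the official Linux script, then watch its logs.
curl -fsSL https://ollama.com/install.sh | sh
journalctl -u ollama -f
```

After that, follow the repository's manual for starting the service itself.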
The ZIB LLM Server
- Reachable only from within the ZIB network
- To see the installed models: curl https://SERVERNAME/api/tags | jq
- To install a new model: curl https://SERVERNAME/api/pull -d '{"name": "qwen2.5:0.5b"}' | jq
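As a sketch, the same two endpoints can also be called from Python instead of curl. SERVERNAME is the same placeholder as above; the JSON shape (a top-level "models" list of objects with a "name" field) matches Ollama's /api/tags response, and /api/pull streams one JSON status object per line.

```python
import json
import urllib.request

def model_names(payload: dict) -> list[str]:
    """Extract model names from an Ollama /api/tags response body."""
    return [m["name"] for m in payload.get("models", [])]

def list_models(base_url: str) -> list[str]:
    """GET /api/tags and return the names of the installed models."""
    with urllib.request.urlopen(f"{base_url}/api/tags") as resp:
        return model_names(json.load(resp))

def pull_model(base_url: str, name: str) -> None:
    """POST /api/pull to install a model; print the streamed status lines."""
    req = urllib.request.Request(
        f"{base_url}/api/pull",
        data=json.dumps({"name": name}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        for line in resp:  # the pull endpoint streams JSON objects, one per line
            print(json.loads(line).get("status", ""))

# Usage (only works from inside the ZIB network; SERVERNAME is a placeholder):
# print(list_models("https://SERVERNAME"))
# pull_model("https://SERVERNAME", "qwen2.5:0.5b")
```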