LLM_Inferenz_Server_1/09_stop_openwebui.sh
herzogflorian f4fdaab732 Add Open WebUI integration and enhance Streamlit app
- Add Open WebUI scripts (06-09) for server-hosted ChatGPT-like interface
  connected to the vLLM backend on port 7081
- Add context window management to chat (auto-trim, token counter, progress bar)
- Add terminal output panel to file editor for running Python/LaTeX files
- Update README with Open WebUI setup, architecture diagram, and troubleshooting
- Update STUDENT_GUIDE with step-by-step Open WebUI login instructions

Made-with: Cursor
2026-03-02 18:48:51 +01:00


#!/usr/bin/env bash
# ------------------------------------------------------------------
# 09_stop_openwebui.sh
# Gracefully stops the background Open WebUI server.
# ------------------------------------------------------------------
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
PID_FILE="${SCRIPT_DIR}/logs/openwebui.pid"
if [ ! -f "$PID_FILE" ]; then
  echo "No PID file found. Open WebUI may not be running."
  exit 0
fi

SERVER_PID=$(cat "$PID_FILE")

if kill -0 "$SERVER_PID" 2>/dev/null; then
  echo "Stopping Open WebUI (PID: ${SERVER_PID})..."
  kill "$SERVER_PID"
  sleep 2
  if kill -0 "$SERVER_PID" 2>/dev/null; then
    echo "Process still alive, sending SIGKILL..."
    kill -9 "$SERVER_PID"
  fi
  echo "Open WebUI stopped."
else
  echo "Open WebUI process (PID: ${SERVER_PID}) is not running."
fi

rm -f "$PID_FILE"
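
This stop script relies on a companion start script (presumably one of the 06–08 scripts from the same commit) having written `logs/openwebui.pid`. A minimal sketch of the PID-file side of such a start script, under that assumption — here `sleep 60` stands in for the real `open-webui serve` command, which is not shown in this file:

```shell
#!/usr/bin/env bash
# Sketch (assumption): how a start script could record the background
# server's PID so that 09_stop_openwebui.sh can find and stop it later.
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
mkdir -p "${SCRIPT_DIR}/logs"

# Placeholder for the real server command, e.g. "open-webui serve" on port 7081.
sleep 60 > "${SCRIPT_DIR}/logs/openwebui.log" 2>&1 &

# $! holds the PID of the most recent background job.
echo $! > "${SCRIPT_DIR}/logs/openwebui.pid"
```

With the PID recorded this way, `kill -0` in the stop script can cheaply test whether the process is still alive before escalating from SIGTERM to SIGKILL.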