# Ollama bot

After deploy:

## WebUI for Ollama

* http://localhost:8888
* use it to install models such as llama2 or llama3 (https://ollama.com/library); a scripted alternative is sketched at the end of this section

## Frontend

* simple FE: http://localhost:5000/

## Backend

* http://localhost:5000/openapi/swagger
* http://localhost/backend/openapi/swagger
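
If you prefer to script model installs instead of clicking through the WebUI, a minimal sketch is below. It assumes the Ollama server itself is reachable on its default port 11434 (that port is not part of this README; adjust it to whatever your deployment maps) and uses Ollama's `/api/pull` endpoint.

```python
# Minimal sketch: pull models through the Ollama REST API instead of the WebUI.
# Assumption (not stated in this README): the Ollama server is exposed on its
# default port 11434; change OLLAMA_URL if your deployment maps it elsewhere.
import requests

OLLAMA_URL = "http://localhost:11434"  # assumed default Ollama API port


def pull_model(model: str) -> None:
    """Ask the Ollama server to download a model from https://ollama.com/library."""
    resp = requests.post(
        f"{OLLAMA_URL}/api/pull",
        # "model" is the field in current Ollama API docs; older releases used "name".
        json={"model": model, "stream": False},  # non-streaming: wait for completion
        timeout=600,  # model downloads can take a while
    )
    resp.raise_for_status()
    print(model, "->", resp.json().get("status", "unknown"))


if __name__ == "__main__":
    for m in ("llama2", "llama3"):
        pull_model(m)
```

Run it once after the stack is up; the pulled models should then appear in the WebUI's model list as well.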