Ollama bot
After deployment:
WebUI for Ollama:
- http://localhost:8888
- use it to install models such as llama2 or llama3 (https://ollama.com/library)
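Models can also be installed without the WebUI by talking to the Ollama server directly. The sketch below is an illustration, not this app's code: it assumes the Ollama server itself (not the WebUI) is reachable on its default port 11434; adjust the base URL to match your deployment.

```javascript
// Sketch: pull a model through Ollama's /api/pull endpoint.
// Host/port are an assumption (Ollama's default); change as needed.
async function pullModel(model, base = "http://localhost:11434") {
  const res = await fetch(`${base}/api/pull`, {
    method: "POST",
    body: JSON.stringify({ name: model, stream: false }),
  });
  return res.json(); // resolves once the pull has finished (stream: false)
}

// Example (requires a running Ollama server):
// pullModel("llama3").then(console.log);
```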
Frontend:
- simple frontend: http://localhost:5000/
Backend:
Description
An app for creating LLM chatbots via system prompts, RAG resources, and external APIs.
https://chatbot.tobiasweise.dev
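The core idea behind a system-prompt chatbot can be sketched in a few lines: the system prompt is prepended to the conversation before each request to the model. The function and field names below follow Ollama's /api/chat message format and are illustrative assumptions, not this app's actual implementation.

```javascript
// Sketch: build an Ollama /api/chat request body where a system prompt
// defines the bot's behavior. Names and defaults are illustrative.
function buildChatRequest(systemPrompt, history, userMessage, model = "llama3") {
  return {
    model,
    stream: false,
    messages: [
      { role: "system", content: systemPrompt }, // bot persona / instructions
      ...history,                                // prior user/assistant turns
      { role: "user", content: userMessage },    // the new message
    ],
  };
}

const req = buildChatRequest("You are a helpful support bot.", [], "Hi!");
console.log(req.messages.length); // system + user message
```

RAG resources and external-API results fit the same shape: their output is injected as additional context messages before the user turn.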
Languages
- JavaScript: 85.8%
- HTML: 10.2%
- CSS: 2.5%
- Python: 1.1%
- TypeScript: 0.3%