mAICookbook is a personal cookbook assistant designed to help you explore Greek recipes interactively. The project uses a simple client-server architecture.
The server is a FastAPI service that hosts a retrieval-augmented generation (RAG) system built with LlamaIndex and Meltemi, a powerful Greek LLM. Its knowledge base is a Greek Recipes Dataset, a rich collection of traditional and modern Greek recipes.
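For orientation, below is a minimal sketch of how such a pipeline is typically wired with LlamaIndex and exposed through FastAPI. The module paths, the `OpenAILike` wrapper for the Meltemi endpoint, the embedding model, the `data/recipes` folder, and the `/chat` route are assumptions for illustration, not the project's actual code.

```python
# Sketch only: a small RAG pipeline over a recipes folder, served via FastAPI.
# Model names, paths, and the OpenAI-compatible Meltemi endpoint are assumptions.
import os

from dotenv import load_dotenv
from fastapi import FastAPI
from llama_index.core import Settings, SimpleDirectoryReader, VectorStoreIndex
from llama_index.embeddings.huggingface import HuggingFaceEmbedding
from llama_index.llms.openai_like import OpenAILike
from pydantic import BaseModel

load_dotenv()  # picks up API_BASE / API_KEY from the .env file described below

# Point LlamaIndex at the Meltemi endpoint (assumed to be OpenAI-compatible).
Settings.llm = OpenAILike(
    model="meltemi-7b-instruct",  # hypothetical model name
    api_base=os.environ["API_BASE"],
    api_key=os.environ["API_KEY"],
    is_chat_model=True,
)
# A multilingual embedding model is assumed for Greek recipe text.
Settings.embed_model = HuggingFaceEmbedding(
    model_name="intfloat/multilingual-e5-small"
)

# Build an in-memory vector index over the recipe documents.
documents = SimpleDirectoryReader("data/recipes").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine(similarity_top_k=3)

# Expose the query engine through a FastAPI endpoint (route name is hypothetical).
app = FastAPI()

class ChatRequest(BaseModel):
    message: str

@app.post("/chat")
def chat(req: ChatRequest):
    response = query_engine.query(req.message)
    return {"answer": str(response)}
```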
The client is a simple Streamlit chatbot UI that allows users to interact with the RAG system and explore recipes.
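As a rough illustration, a chat UI like this can be built in a few lines of Streamlit. The server URL and `/chat` endpoint below match the hypothetical sketch above and are assumptions, not necessarily the project's actual API.

```python
# Sketch only: a minimal Streamlit chat client that forwards messages to the
# FastAPI server. The /chat route and localhost URL are assumptions.
import requests
import streamlit as st

st.title("mAICookbook")

# Keep the conversation in session state so it survives Streamlit reruns.
if "messages" not in st.session_state:
    st.session_state.messages = []

# Replay the conversation so far.
for msg in st.session_state.messages:
    with st.chat_message(msg["role"]):
        st.markdown(msg["content"])

# Read a new user message and ask the server for an answer.
if prompt := st.chat_input("Ask about a Greek recipe"):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.markdown(prompt)

    reply = requests.post(
        "http://localhost:8000/chat", json={"message": prompt}, timeout=120
    ).json()["answer"]

    st.session_state.messages.append({"role": "assistant", "content": reply})
    with st.chat_message("assistant"):
        st.markdown(reply)
```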
Disclaimer: The server requires a machine with at least 8 GB of GPU VRAM to run the models successfully.
- Clone the repository
git clone https://github.com/infosciassoc/Meltemi-Workshop-Team-02.git mAICookbook
cd mAICookbook
- Create and activate a virtual environment using conda
conda create -n maicookbook python=3.11
conda activate maicookbook
- Install Dependencies
pip install -r requirements.txt
- Set up Environment Variables
Create a .env file in the project root and add the following:
API_BASE=<your-meltemi-api-url>
API_KEY=<your-meltemi-api-key>
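If the server loads these with python-dotenv (an assumption; the project may read its configuration differently), a quick check like the one below confirms the values are visible to Python before starting the server.

```python
# Sanity check that the .env values are readable.
# Assumes python-dotenv is installed; adjust if the project loads config differently.
import os
from dotenv import load_dotenv

load_dotenv()
print("API_BASE:", os.getenv("API_BASE"))
print("API_KEY set:", bool(os.getenv("API_KEY")))  # avoid printing the key itself
```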
- Run the FastAPI server
uvicorn server:app --reload
- Run the Streamlit UI
streamlit run app.py
- Open the Streamlit app in your browser at http://localhost:8501.
- Start a conversation with the chatbot.
- Previous conversations are listed in the sidebar on the left.