EcoDoc Sense is a web application that analyses software architecture documents for compliance with green software patterns. It generates a sustainability report from the developer's software architecture document, highlighting which green software patterns are adhered to and which are overlooked. Additionally, it suggests which green patterns should be implemented in the software.
Built the main RAG (Retrieval-Augmented Generation) pipeline for the project and implemented the RESTful service with FastAPI. The code lives in the "Rag" folder.
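For orientation, the core retrieve-then-generate flow of a RAG pipeline can be sketched as below. This is a minimal illustrative sketch, not the project's actual code: the corpus snippets, 3-dimensional embeddings, and function names are all hypothetical (real embeddings come from the embedding model).

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def retrieve(query_emb, corpus, top_k=2):
    """Return the top_k snippets most similar to the query embedding."""
    ranked = sorted(corpus,
                    key=lambda d: cosine_similarity(query_emb, d["embedding"]),
                    reverse=True)
    return [d["text"] for d in ranked[:top_k]]

def build_prompt(question, snippets):
    """Assemble the augmented prompt that is sent to the LLM."""
    context = "\n".join(f"- {s}" for s in snippets)
    return f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"

# Toy corpus with hypothetical 3-dimensional embeddings.
corpus = [
    {"text": "Use caching to reduce redundant computation.", "embedding": [0.9, 0.1, 0.0]},
    {"text": "Scale down idle services.", "embedding": [0.1, 0.8, 0.1]},
]
snippets = retrieve([0.85, 0.2, 0.0], corpus, top_k=1)
prompt = build_prompt("Which green pattern applies?", snippets)
```

The retrieved context is then passed, together with the question, to the fine-tuned model for answer generation.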
Built a user-friendly web interface with the React framework.
The scripts for fine-tuning the models are included in the "Finetuning" folder.
Ollama is used as the local LLM launcher in the project.
Install Docker Desktop on your machine.
Use the command below to build the Docker images and run them locally.
# build the docker image and run
docker-compose up --build
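A docker-compose.yml for this kind of setup typically defines the backend and frontend as separate services. The sketch below is an assumption about the general shape, not the project's actual file; the service names, build contexts, and port mappings are hypothetical.

```yaml
services:
  backend:              # FastAPI RAG service (hypothetical build context "./Rag")
    build: ./Rag
    ports:
      - "8000:8000"
  frontend:             # React UI (hypothetical build context "./frontend")
    build: ./frontend
    ports:
      - "3000:3000"
    depends_on:
      - backend
```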
Install Ollama on your local machine. Then pull the necessary models.
# pull Llama2 for embeddings generation
ollama pull llama2
# pull LLaVA to extract images from the user's document
ollama pull llava
Download the fine-tuned model for final answer generation here.
Then copy the Modelfile from the project root directory to the directory where you store the fine-tuned model.
After that, create a model instance from the Modelfile.
# create the model instance
ollama create fineTunedModel -f /root/Modelfile
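Once the model instance exists, it can be queried through Ollama's local HTTP API (served on port 11434 by default). Below is a minimal sketch of building such a request in Python; the prompt is illustrative, and actually sending the request of course requires Ollama to be running.

```python
import json

# Ollama's default generation endpoint on the local machine.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model, prompt):
    """Build the JSON payload for a non-streaming Ollama generation call."""
    return {
        "model": model,    # e.g. the instance created with `ollama create`
        "prompt": prompt,
        "stream": False,   # request a single JSON response instead of a stream
    }

payload = build_generate_request("fineTunedModel", "List one green software pattern.")
body = json.dumps(payload).encode("utf-8")

# To actually send it (requires Ollama running):
#   import urllib.request
#   req = urllib.request.Request(OLLAMA_URL, data=body,
#                                headers={"Content-Type": "application/json"})
#   answer = json.loads(urllib.request.urlopen(req).read())["response"]
```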
# change directory
cd .\Rag\
# install dependencies
pip install -r .\requirements.txt
# start server
python .\api.py
# change directory
cd .\frontend\
# install dependencies
npm install
# start server
npm start
EcoDoc Sense is now available at http://localhost:3000/
A detailed discussion and analysis of our prompt templates can be found in Generation Phase - Prompt Engineering.pdf, located in the project's root directory.