Mugen-Builders/coprocessor-llama-agent
EVM Linux Coprocessor as an AI Agent
Cartesi Coprocessor AI Agent powered by EigenLayer cryptoeconomic security


Table of Contents

  • Prerequisites
  • Running
  • Demo

Prerequisites

  1. Install Docker Desktop for your operating system.

    If you are not using Docker Desktop, enable Docker RISC-V support by running:

     docker run --privileged --rm tonistiigi/binfmt --install all
  2. Download and install the latest version of Node.js.

  3. Cartesi CLI is an easy-to-use tool to build and deploy your dApps. To install it, run:

    npm i -g @cartesi/cli
  4. Install the Cartesi Coprocessor CLI
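At the time of writing the Coprocessor CLI is distributed as a Rust crate, so one way to install it, assuming a Rust toolchain with cargo is available (the crate name follows the Cartesi documentation and may change), is:

```shell
# Assumes cargo is installed; crate name per the Cartesi Coprocessor docs
cargo install cartesi-coprocessor

# Sanity-check that the binary is on your PATH
cartesi-coprocessor --help
```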

Running

  1. Start the devnet coprocessor infrastructure + Llama.cpp server:

Caution

In the devnet environment, this project has only been tested on Linux x86-64; macOS users may encounter issues.

git clone https://github.com/zippiehq/cartesi-coprocessor
cd cartesi-coprocessor
docker compose -f docker-compose-devnet.yaml --profile llm up --build
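Once the containers are up, you can sanity-check that the devnet chain is answering; the port matches the RPC_URL used in the .env example below, and the JSON-RPC method is standard, though how long the stack takes to become healthy will vary:

```shell
# The devnet chain should answer JSON-RPC on localhost:8545 once the stack is healthy
curl -s -X POST http://localhost:8545 \
  -H 'Content-Type: application/json' \
  -d '{"jsonrpc":"2.0","id":1,"method":"eth_chainId","params":[]}'
```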
  2. Build and publish the application:
cd coprocessor
cartesi-coprocessor publish --network devnet
  3. Deploy the LlamaAgent.sol contract:

Warning

Before deploying the contract, create a .env file like this:

RPC_URL=http://localhost:8545
PRIVATE_KEY="0xac0974bec39a17e36ba4a6b4d238ff944bacb478cbed5efcae784d7bf4f2ff80" # default Anvil devnet account #0
MACHINE_HASH=""
TASK_ISSUER_ADDRESS=""
  • You can get the machine hash by running cartesi hash inside the coprocessor folder;
  • You can get the task issuer address for the devnet environment by running cartesi-coprocessor address-book;
cd contracts
forge install
make agent
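For reference, `make agent` presumably wraps a standard Foundry deployment. A rough, untested sketch of an equivalent `forge create` call follows; the contract path and constructor-argument order are assumptions, not taken from the Makefile, so check the real recipe before relying on it:

```shell
# Hypothetical equivalent of `make agent` -- check the Makefile for the real recipe
source .env
forge create src/LlamaAgent.sol:LlamaAgent \
  --rpc-url "$RPC_URL" \
  --private-key "$PRIVATE_KEY" \
  --constructor-args "$TASK_ISSUER_ADDRESS" "$MACHINE_HASH"
```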
  4. Run the frontend:

Warning

Before running the frontend, update the .env.local file with the address of the deployed LlamaAgent contract:

NEXT_PUBLIC_PROJECT_ID="e47c5026ed6cf8c2b219df99a94f60f4"
NEXT_PUBLIC_COPROCESSOR_ADAPTER=""
cd frontend
npm install
npm run dev

Note

Although this README provides instructions for the devnet environment, this application can also be deployed on testnet and hosted on infrastructure provided by Cartesi, which even has a Llama.cpp server available for communication via GIO. Follow the instructions provided here.
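Assuming the CLI flow mirrors the devnet one, targeting the hosted testnet should mostly be a matter of changing the --network flag; the exact network name accepted by the CLI is an assumption, so verify it against the Cartesi documentation:

```shell
# Hypothetical testnet publish; confirm the supported network names with the CLI docs
cd coprocessor
cartesi-coprocessor publish --network testnet
```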

Demo

llama_agent.mp4