[ICLR 2025] Chain-of-Action: Faithful and Multimodal Question Answering through Large Language Models

Chain-of-Actions (CoA)

This is the code for the paper Chain-of-Action: Faithful and Multimodal Question Answering through Large Language Models (ICLR 2025). You can use this repository to reproduce the results reported in the paper.

(Figure: Chain of Action)

Contents

- Environmental Setup
- Baseline

Environmental Setup

You can set up the experimental environment by running the following commands:

$ cd baselines/src
$ pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu118
$ pip3 install -r requirements.txt
$ export PYTHONPATH=$PYTHONPATH:$PWD
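After installing, a quick sanity check can confirm the environment is usable. This is a minimal sketch that only assumes the pip installs above; it reports the installed PyTorch version and whether CUDA is visible:

```shell
# Print the torch version and CUDA availability; falls back to a
# message if the install step has not been run yet.
python3 - <<'PY'
try:
    import torch
    print("torch", torch.__version__, "cuda available:", torch.cuda.is_available())
except ImportError:
    print("torch not installed yet")
PY
```

If CUDA shows as unavailable on a GPU machine, re-check that the cu118 index URL above matches your installed CUDA driver.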

Datasets

Download the datasets and reference logs from:

- https://github.com/kojima-takeshi188/zero_shot_cot/tree/main/dataset
- https://github.com/kojima-takeshi188/zero_shot_cot/tree/main/log

Instructions

You can run any baseline with the code provided, such as auto_cot.py, few_shot.py, sc.py, and so on.

You will need your own OpenAI API key and Google Search API key; both are required by the baseline code.
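Concretely, a baseline run might look like the sketch below. The environment variable names and the `--dataset` flag are assumptions, not taken from the repo; check the argument parsing in the baseline scripts for the exact names your checkout expects:

```shell
# Hypothetical variable names; replace the placeholders with real credentials.
export OPENAI_API_KEY="YOUR_OPENAI_KEY"
export GOOGLE_API_KEY="YOUR_GOOGLE_SEARCH_KEY"

# Then launch a baseline from baselines/src, for example:
#   python3 few_shot.py --dataset gsm8k    # dataset flag is an assumption
```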
