PyTorch

Quick Run

  1. Set up the environment with Conda:
./setup_env.sh
  2. Set up the MS-COCO dataset:
./setup_datasets.sh
  3. Get the model weights:
./setup_model.sh
  4. Run the script directly:
./run_tq.sh

HowTo

  1. First, download the DETR model weights from facebookresearch/detr to ${BIN_DIR}:

    • DETR-DC5-R101 for batch size 1:
    $ wget -P ${BIN_DIR} https://dl.fbaipublicfiles.com/detr/detr-r101-dc5-a2e86def.pth
    
    • DETR-R101 for batch size 2:
    $ wget -P ${BIN_DIR} https://dl.fbaipublicfiles.com/detr/detr-r101-2c7b67e5.pth
    
  2. Then, download the backbone (ResNet101) weights from torchvision to ${BIN_DIR}. Note that torchvision provides two pretrained ResNet101 checkpoints; this workload uses the one below:

$ wget -P ${BIN_DIR} https://download.pytorch.org/models/resnet101-63fe2227.pth
  3. Combine the DETR weights and the backbone weights (a rough sketch of what this combination might do internally follows these commands):

    • DETR-DC5-R101 for batch size 1:
    $ cd sources/with_dilation_aka_bsz1/
    $ python detr.py --mode create --detr-path ${BIN_DIR}/detr-r101-dc5-a2e86def.pth --backbone-path ${BIN_DIR}/resnet101-63fe2227.pth --model-path ${BIN_DIR}/DETR_ResNet101_BSZ1.pth
    
    • DETR-R101 for batch size 2:
    $ cd sources/without_dilation_aka_bsz2/
    $ python detr.py --mode create --detr-path ${BIN_DIR}/detr-r101-2c7b67e5.pth --backbone-path ${BIN_DIR}/resnet101-63fe2227.pth --model-path ${BIN_DIR}/DETR_ResNet101_BSZ2.pth
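    The actual merge logic lives in this repository's detr.py (--mode create); purely as an illustration, a combination of this kind usually loads both checkpoints and overwrites the backbone entries of the DETR state dict with the torchvision weights. The key names below ("model", "backbone.0.body.") are assumptions for the sketch, not a description of the script:

    # Hypothetical sketch of combining DETR and torchvision backbone weights.
    # The checkpoint layout ("model" key, "backbone.0.body." prefix) is assumed;
    # the real behavior is defined by detr.py --mode create.
    import torch

    detr_ckpt = torch.load("detr-r101-dc5-a2e86def.pth", map_location="cpu")
    backbone_sd = torch.load("resnet101-63fe2227.pth", map_location="cpu")

    model_sd = detr_ckpt["model"]  # DETR checkpoints are assumed to store weights under "model"
    for name, tensor in backbone_sd.items():
        # Map torchvision names (e.g. "layer1.0.conv1.weight") onto the assumed backbone prefix.
        key = "backbone.0.body." + name
        if key in model_sd:
            model_sd[key] = tensor

    torch.save({"model": model_sd}, "DETR_ResNet101_BSZ1.pth")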
    
  4. Assume the MS-COCO validation dataset is extracted and placed in ${DATA_DIR}. We expect the directory structure to be the following (a quick layout check follows the tree):

path/to/coco/
  annotations/  # annotation json files
  val2017/      # val images
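One quick way to confirm this layout before running anything is to index the validation annotations with pycocotools. The annotation filename instances_val2017.json is the standard MS-COCO name and is assumed here:

# Minimal layout check for ${DATA_DIR}; the annotation filename is the
# standard MS-COCO name (instances_val2017.json) and is assumed here.
import os
from pycocotools.coco import COCO

data_dir = os.environ.get("DATA_DIR", "path/to/coco")
ann_file = os.path.join(data_dir, "annotations", "instances_val2017.json")
img_dir = os.path.join(data_dir, "val2017")

assert os.path.isdir(img_dir), f"missing image directory: {img_dir}"
coco = COCO(ann_file)  # raises if the annotation file is absent or corrupt
print(f"{len(coco.getImgIds())} validation images indexed")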

Result Files Format

All results are saved in JSON format.

  • ${RES_NAME}.main.* - records the main results:
    1. origin_image_size - list()
    2. batch_image_size - list()
    3. result
      • image_id - int
      • category_id - int
      • bbox - list()
      • score - float
  • ${RES_NAME}.main_extended.* - records extended main results for convenient evaluation with pycocotools (see the evaluation sketch after this list);
  • ${RES_NAME}.time.* - records the performance results:
    1. inference_time - float
    2. postprocess_time - float
    3. preprocess_time - float
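Assuming ${RES_NAME}.main_extended.* follows the standard COCO detection results format (a list of {image_id, category_id, bbox, score} records, which is what the fields above suggest), it can be scored with pycocotools roughly as follows; the file paths are placeholders:

# Rough evaluation sketch; assumes the extended results file is a standard
# COCO-format detection list, which is what its documented fields suggest.
from pycocotools.coco import COCO
from pycocotools.cocoeval import COCOeval

coco_gt = COCO("path/to/coco/annotations/instances_val2017.json")
coco_dt = coco_gt.loadRes("RES_NAME.main_extended.json")  # placeholder filename

evaluator = COCOeval(coco_gt, coco_dt, iouType="bbox")
evaluator.evaluate()
evaluator.accumulate()
evaluator.summarize()  # prints the standard AP/AR table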

Suppose one wants to run the workload 100 times with batch size 1 (DETR-DC5-R101) or 2 (DETR-R101); the following commands will save all results in ${RES_DIR}, with every file named using ${RES_NAME} (a sketch for summarizing the timing output follows these commands):

  • DETR-DC5-R101 for batch size 1:
$ python run.py --run-number 100 --batch-size 1 --dataset-path ${DATA_DIR} --results-basepath ${RES_DIR}/${RES_NAME} --local --model-path ${BIN_DIR}/DETR_ResNet101_BSZ1.pth
  • DETR-R101 for batch size 2:
$ python run.py --run-number 100 --batch-size 2 --dataset-path ${DATA_DIR} --results-basepath ${RES_DIR}/${RES_NAME} --local --model-path ${BIN_DIR}/DETR_ResNet101_BSZ2.pth
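The exact JSON layout of ${RES_NAME}.time.* is defined by run.py; assuming it is a list of per-run records with the three keys documented above, a simple timing summary could look like this:

# Summarize per-run timing results; assumes ${RES_NAME}.time.* is a JSON list
# of records with the documented keys (the actual layout is defined by run.py).
import json
import statistics

with open("RES_NAME.time.json") as f:  # placeholder filename
    runs = json.load(f)

for key in ("preprocess_time", "inference_time", "postprocess_time"):
    values = [run[key] for run in runs]
    print(f"{key}: mean={statistics.mean(values):.4f}s  median={statistics.median(values):.4f}s")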
  5. Now you are in the sources/with*bsz*/ directory; first change back into the main directory:
$ cd ../../
  6. Archive all the files:

    • DETR-DC5-R101 for batch size 1:
    $ ./archive.sh BSZ1
    
    • DETR-R101 for batch size 2:
    $ ./archive.sh BSZ2
    
  7. All serve-mode settings, such as the inference URL (inference_address) and the model settings (models), are defined in ts_config_[bsz1|bsz2].properties, which serve.sh reads (a quick health check of the started server follows these commands):

    • DETR-DC5-R101 for batch size 1:
    $ ./serve.sh start BSZ1
    
    • DETR-R101 for batch size 2:
    $ ./serve.sh start BSZ2
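    Once the server is started, a request to TorchServe's standard /ping endpoint on the configured inference_address can confirm it is up. The base URL below is a placeholder for whatever ts_config_*.properties sets:

    # Health check against TorchServe's standard /ping endpoint.
    # The base URL is a placeholder for the configured inference_address.
    import requests

    resp = requests.get("http://127.0.0.1:8080/ping", timeout=5)
    print(resp.status_code, resp.text)  # a healthy server reports {"status": "Healthy"}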
    
  8. In another terminal, change into the sources directory. Suppose inference_address is set to ${URL} and the model name is ${MODEL_NAME}; then run the following command (a minimal endpoint sanity check follows these commands):

    • DETR-DC5-R101 for batch size 1:
    $ cd sources
    $ python run.py --run-number 100 --batch-size 1 --dataset-path ${EXT_DIR} --results-basepath ${RES_DIR}/${RES_NAME} --server-url ${URL}/predictions/${MODEL_NAME}
    
    • DETR-R101 for batch size 2:
    $ cd sources
    $ python run.py --run-number 100 --batch-size 2 --dataset-path ${EXT_DIR} --results-basepath ${RES_DIR}/${RES_NAME} --server-url ${URL}/predictions/${MODEL_NAME}
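    Before the full run, a single request against ${URL}/predictions/${MODEL_NAME} can confirm that the endpoint responds. TorchServe's prediction API accepts a binary image body with its default vision handlers; whether the handler packaged by archive.sh expects the same payload is an assumption here:

    # Single-request sanity check against the serving endpoint.
    # Assumes the handler accepts a raw image body, as TorchServe's default
    # vision handlers do; the custom handler in this workload may differ.
    import os
    import requests

    url = os.environ["URL"] + "/predictions/" + os.environ["MODEL_NAME"]
    with open("sample.jpg", "rb") as f:  # any MS-COCO val2017 image
        resp = requests.post(url, data=f.read(), timeout=30)
    print(resp.status_code)
    print(resp.text[:500])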
    
  9. Once the command has finished, stop the server:

    • DETR-DC5-R101 for batch size 1:
    $ ./serve.sh stop BSZ1
    
    • DETR-R101 for batch size 2:
    $ ./serve.sh stop BSZ2