# sequences-transformer

Sequence Relation Classification with Transformers.

## Installation

As prerequisites, you need Python 3.6+, PyTorch 1.0.0+, and TensorFlow 2.0.0-rc1 installed.

Clone the repository, ideally into a Python virtual environment. All dependencies can then be installed via:

```bash
pip install -r requirements.txt
```
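
If you want a virtual environment, a minimal setup with the standard `venv` module (an assumed workflow; the repository does not prescribe one) looks like this:

```bash
# Create and activate a virtual environment, then install dependencies.
# Any environment manager (conda, virtualenv, ...) works equally well.
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
```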

## Fine-tuning BERT for Sequence Relation Classification

### Data

The data is split into training, development, and test sets, read from the directory passed via `--data_dir` (here, `data/`).
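
The exact file names the trainer expects depend on its data processor; a plausible layout (an assumption, not verified against `sequences-trainer.py`) is:

```
data/
├── train.tsv   # training split
├── dev.tsv     # development split
└── test.tsv    # held-out test split
```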

Run the following command to fine-tune a BERT-Base model on a sequence classification task:

```bash
python sequences-trainer.py \
  --model_type bert \
  --model_name_or_path bert-base-uncased \
  --task_name seq-classification \
  --do_train --do_eval \
  --data_dir data/ \
  --max_seq_length 20 --per_gpu_train_batch_size 4 \
  --learning_rate 2e-5 --num_train_epochs 20.0 \
  --output_dir gens/ \
  --eval_all_checkpoints \
  --overwrite_output_dir \
  --tokenizer_name bert-base-uncased \
  --do_lower_case
```
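
After training, `gens/` holds the fine-tuned model and tokenizer files. Below is a minimal inference sketch using the Hugging Face `transformers` API; the input sequences are placeholders, and loading from `gens/` assumes the trainer saves checkpoints in the standard `save_pretrained` format:

```python
import torch
from transformers import BertForSequenceClassification, BertTokenizer

# Load the fine-tuned checkpoint and tokenizer saved by the trainer
# (assumes the standard save_pretrained layout in gens/).
tokenizer = BertTokenizer.from_pretrained("gens/")
model = BertForSequenceClassification.from_pretrained("gens/")
model.eval()

# Encode a pair of sequences, mirroring the training setup
# (max_seq_length 20, lower-cased input).
inputs = tokenizer.encode_plus(
    "first sequence", "second sequence",
    max_length=20, return_tensors="pt",
)

with torch.no_grad():
    logits = model(**inputs)[0]  # tuple output: logits come first

print(logits.argmax(dim=-1).item())  # predicted relation label id
```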

## License & Credits

This project is licensed under the Apache License, Version 2.0; see the LICENSE.md file for details.

The code is based on the original implementations provided by [Hugging Face Transformers](https://github.com/huggingface/transformers).