
# SSLL

## Introduction


SSLL (Semi-Supervised Lifelong Language Learning) was built by the Conversational AI Team at Alibaba DAMO Academy.

The corresponding paper, "Semi-Supervised Lifelong Language Learning", was published in Findings of EMNLP 2022.

## SSLL Implementation


### Requirements

```sh
pip install -r requirements.txt
```

### Dataset

The datasets used in the experiments follow LAMOL.
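
For orientation only, the sketch below shows how one task's examples could be read, assuming the SQuAD-style JSON layout that LAMOL converts its tasks into (the `data`/`paragraphs`/`qas` nesting and the `data/squad/train.json` path are assumptions for illustration, not guaranteed by this repository):

```python
# Hypothetical loader sketch. Assumes LAMOL-style (SQuAD-format) JSON:
# {"data": [{"paragraphs": [{"context": ..., "qas": [...]}]}]}.
import json

def load_task(path):
    """Yield (context, question, answer) triples from a SQuAD-format file."""
    with open(path) as f:
        dataset = json.load(f)
    for article in dataset["data"]:
        for paragraph in article["paragraphs"]:
            context = paragraph["context"]
            for qa in paragraph["qas"]:
                # Take the first reference answer; such files may list several.
                answer = qa["answers"][0]["text"] if qa["answers"] else ""
                yield context, qa["question"], answer

if __name__ == "__main__":
    # The path is illustrative; place the LAMOL-prepared data wherever
    # the training scripts expect it.
    for context, question, answer in load_task("data/squad/train.json"):
        print(question, "->", answer)
        break
```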

### Model Training and Evaluation

```sh
sh scripts/lltrain.sh
```

The files required for training are under the `unifymodel` folder.
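
Conceptually, lifelong training visits the tasks one after another in the order given by a file such as `decanlp_order.txt`. The sketch below mimics what `scripts/lltrain.sh` presumably automates by invoking the trainer once per task; the `--tasks` flag is a hypothetical placeholder, so check the script for the arguments `unitrain.py` actually accepts:

```python
# Hypothetical driver sketch: trains tasks sequentially, which is what
# distinguishes lifelong learning from joint multi-task training.
import subprocess

def read_task_order(path="decanlp_order.txt"):
    """Read one task name per line, skipping blank lines."""
    with open(path) as f:
        return [line.strip() for line in f if line.strip()]

def main():
    for task in read_task_order():
        # Assumes unitrain.py reloads the checkpoint saved after the
        # previous task, so knowledge carries over between calls.
        subprocess.run(["python", "unitrain.py", "--tasks", task], check=True)

if __name__ == "__main__":
    main()
```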

## Citation


If you use our code or find SSLL useful for your work, please cite our paper as:
```bibtex
@inproceedings{zhao2022semi,
  title={Semi-Supervised Lifelong Language Learning},
  author={Zhao, Yingxiu and Zheng, Yinhe and Yu, Bowen and Tian, Zhiliang and Lee, Dongkyu and Sun, Jian and Li, Yongbin and Zhang, Nevin L.},
  booktitle={Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing: Findings},
  year={2022}
}
```

LAMOL citation:

```bibtex
@inproceedings{sun2019lamol,
  title={LAMOL: LAnguage MOdeling for Lifelong Language Learning},
  author={Sun, Fan-Keng and Ho, Cheng-Hao and Lee, Hung-Yi},
  booktitle={International Conference on Learning Representations},
  year={2020}
}
```