# PCLL
## Introduction
-----
PCLL (Prompt-Conditioned Lifelong Learning) is built by the Conversational AI Team, Alibaba DAMO Academy.
The corresponding paper has been published at the EMNLP 2022 main conference: "***Prompt Conditioned VAE: Enhancing Generative Replay for Lifelong Learning in Task-Oriented Dialogue***".
## PCLL Implementation
-----
### Requirements
```bash
pip install -r requirements.txt
```
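Optionally, you can install the dependencies inside an isolated environment first. A minimal sketch (the virtual-environment step is an assumption and not part of the original instructions):
```bash
# Optional: create and activate a virtual environment before installing
# (this step is an assumption, not required by the repository itself)
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
```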
### Dataset
Put the required datasets for intent detection and slot filling tasks under the `DATA` directory.
The dataset processing scripts are also contained in `DATA`. You can read those files for more details.
### Model Training and Evaluation
```bash
sh scripts/intent_all_train.sh # for lifelong intent detection task
sh scripts/slot_all_train.sh # for lifelong slot filling task
```
The files required for training are `lltrain.py`, `mycvae/model.py`, and `mycvae/trainer.py`.
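If you want to keep a log of a long training run, one possible way to launch it is sketched below (the background execution and log file name are assumptions, not part of the provided scripts):
```bash
# Run lifelong intent detection training in the background and follow its log
# (the log file name is an assumption, not produced by the scripts themselves)
nohup sh scripts/intent_all_train.sh > intent_train.log 2>&1 &
tail -f intent_train.log
```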
## Citation
-----
If you use our code or find PCLL useful for your work, please cite our paper:
```bibtex
@inproceedings{zhao2022cvae,
title={Prompt Conditioned VAE: Enhancing Generative Replay for Lifelong Learning in Task-Oriented Dialogue},
author={Zhao, Yingxiu and Zheng, Yinhe and Tian, Zhiliang and Gao, Chang and Yu, Bowen and Yu, Haiyang and Li, Yongbin and Sun, Jian and Zhang, Nevin L.},
booktitle={Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing},
year={2022},
}
```