ACL Paper Draft
Abstract
Introduction
Related Work
Toolkit for Prompt Compression
Methods
Selective Context
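Selective Context (Li et al., 2023) scores lexical units by their self-information under a small causal language model (e.g., GPT-2) and deletes the least informative ones, so the prompt shrinks while the most surprising content survives. A minimal usage sketch, assuming the `selective-context` package and the README-style interface from the authors' repository:

```python
from selective_context import SelectiveContext

# Self-information is computed with a small causal LM; GPT-2 is the usual choice.
sc = SelectiveContext(model_type="gpt2", lang="en")

text = "Large language models perform in-context learning from long prompts. ..."
compressed, dropped = sc(text, reduce_ratio=0.35)  # drop ~35% of lexical units
print(compressed)  # the pruned prompt
print(dropped)     # the low self-information spans that were removed
```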
LLMLingua
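LLMLingua (Jiang et al., 2023) works coarse-to-fine: a budget controller distributes the token budget across instruction, demonstrations, and question, and an iterative token-level step then drops tokens that a small language model finds highly predictable. A sketch with the `llmlingua` package; the constructor downloads a LLaMA-family scoring model by default, and exact defaults may differ across releases:

```python
from llmlingua import PromptCompressor

compressor = PromptCompressor()  # small LM used to score tokens; GPU recommended

demonstrations = ["Q: 2+2? A: 4", "Q: 3+5? A: 8"]  # toy examples to compress
result = compressor.compress_prompt(
    demonstrations,
    instruction="Answer the arithmetic question.",
    question="Q: 7+6?",
    target_token=60,  # overall budget the controller distributes
)
print(result["compressed_prompt"])
print(result["origin_tokens"], "->", result["compressed_tokens"])
```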
LongLLMLingua
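LongLLMLingua extends LLMLingua to long-context retrieval settings: documents are ranked by question-aware importance, compressed at ratios that depend on that rank, and reordered so relevant material escapes the lost-in-the-middle position. The same `llmlingua` package exposes this through extra arguments; the ones below follow the project README, though flags can change between versions:

```python
from llmlingua import PromptCompressor

compressor = PromptCompressor()
documents = ["passage one ...", "passage two ...", "passage three ..."]

result = compressor.compress_prompt(
    documents,                              # retrieved passages
    question="Which passage answers the user?",
    target_token=500,
    rank_method="longllmlingua",            # question-aware document ranking
    condition_in_question="after",
    reorder_context="sort",                 # counter the lost-in-the-middle effect
    dynamic_context_compression_ratio=0.3,  # compress low-ranked documents harder
)
print(result["compressed_prompt"])
```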
SCRL
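SCRL (Ghalandari et al., 2022) fine-tunes a pretrained encoder as a token-level keep/drop policy with policy-gradient RL; the reward combines fluency, faithfulness to the source, and a length budget, so no parallel compression data is needed. The sketch below only illustrates that idea with a generic token-classification head; it is not the authors' codebase, and all names are illustrative:

```python
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

# Illustrative keep/drop labeler in the spirit of SCRL (the paper uses a
# DistilRoBERTa-style encoder); this is not the official scrl package API.
tok = AutoTokenizer.from_pretrained("distilroberta-base")
policy = AutoModelForTokenClassification.from_pretrained(
    "distilroberta-base", num_labels=2  # label 1 = keep, label 0 = drop
)

def compress(sentence: str) -> str:
    enc = tok(sentence, return_tensors="pt")
    with torch.no_grad():
        p_keep = policy(**enc).logits.softmax(-1)[0, :, 1]  # P(keep) per token
    kept_ids = enc["input_ids"][0][p_keep > 0.5]
    return tok.decode(kept_ids, skip_special_tokens=True)

# Training sketch: sample keep-masks from the policy, score each compression
# with the reward (LM fluency + source similarity + length penalty), then
# apply REINFORCE, i.e. minimize -reward * log_prob(sampled mask).
```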
Keep It Simple
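Keep It Simple (Laban et al., 2021) targets paragraph-level simplification rather than deletion: a GPT-2 generator is optimized without parallel data against rewards for fluency, salience, and simplicity. A usage sketch, assuming the publicly released `philippelaban/keep_it_simple` checkpoint and the paragraph-then-BOS prompt format described on its model card:

```python
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tok = GPT2Tokenizer.from_pretrained("philippelaban/keep_it_simple")
model = GPT2LMHeadModel.from_pretrained("philippelaban/keep_it_simple")

paragraph = "A recent study reported that prompt compression can accelerate inference."
# The model card separates the source paragraph from the generation with BOS.
input_ids = torch.LongTensor([tok.encode(paragraph) + [tok.bos_token_id]])
out = model.generate(input_ids, max_length=input_ids.shape[1] + 120, num_beams=4)
print(tok.decode(out[0, input_ids.shape[1]:], skip_special_tokens=True))
```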
Metrics
BLEU
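BLEU is modified n-gram precision with a brevity penalty; in a compression toolkit it is typically used to check that the answer produced from a compressed prompt still matches a reference (or the answer from the uncompressed prompt). A quick check with NLTK:

```python
from nltk.translate.bleu_score import SmoothingFunction, sentence_bleu

reference = "the budget controller allocates tokens across demonstrations".split()
candidate = "the budget controller assigns tokens across demonstrations".split()

# Smoothing avoids zero scores when some higher-order n-grams never match.
smooth = SmoothingFunction().method1
print(f"BLEU: {sentence_bleu([reference], candidate, smoothing_function=smooth):.3f}")
```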
ROUGE
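ROUGE is the recall-oriented counterpart: ROUGE-1/2 count overlapping n-grams and ROUGE-L uses the longest common subsequence, which makes it the usual choice for summarization-style outputs. With Google's `rouge-score` package:

```python
from rouge_score import rouge_scorer

scorer = rouge_scorer.RougeScorer(["rouge1", "rouge2", "rougeL"], use_stemmer=True)
scores = scorer.score(
    "the cat sat on the mat",         # reference comes first
    "the cat is sitting on the mat",  # prediction second
)
print(scores["rougeL"].fmeasure)
```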
BERTScore
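BERTScore replaces exact n-gram matching with greedy cosine matching of contextual token embeddings, yielding precision, recall, and F1 scores that tolerate paraphrases. With the `bert-score` package:

```python
from bert_score import score

cands = ["the model prunes tokens the small LM finds predictable"]
refs = ["tokens that the small language model finds predictable are removed"]
P, R, F1 = score(cands, refs, lang="en", verbose=False)
print(f"BERTScore F1: {F1.mean().item():.3f}")
```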
Datasets
Modular Interfaces
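Wrapping heterogeneous compressors behind one interface is what makes a toolkit plug-and-play. The toolkit's actual class and method names are not shown here, so the following is a hypothetical design sketch: an abstract `Compressor` base class with one adapter per method:

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class CompressionResult:
    compressed_prompt: str
    origin_tokens: int
    compressed_tokens: int

class Compressor(ABC):
    """Hypothetical shared interface; names are illustrative, not the toolkit's API."""

    @abstractmethod
    def compress(self, prompt: str, ratio: float = 0.5) -> CompressionResult:
        ...

class LLMLinguaCompressor(Compressor):
    def __init__(self) -> None:
        from llmlingua import PromptCompressor
        self._inner = PromptCompressor()

    def compress(self, prompt: str, ratio: float = 0.5) -> CompressionResult:
        # Recent llmlingua releases accept `rate`; older ones used `ratio`.
        out = self._inner.compress_prompt([prompt], rate=ratio)
        return CompressionResult(
            out["compressed_prompt"], out["origin_tokens"], out["compressed_tokens"]
        )
```

Each method (Selective Context, SCRL, and so on) then becomes a small adapter, and the evaluation code below only ever sees the `Compressor` interface.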
Evaluation
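Evaluation then reduces to one loop: compress the prompt, query a downstream LLM, and score the answer against a reference with the metrics above. A minimal sketch reusing the hypothetical `Compressor` interface; `generate` stands in for whatever LLM client is available:

```python
from rouge_score import rouge_scorer

def generate(prompt: str) -> str:
    raise NotImplementedError("plug in your LLM client here")  # placeholder

scorer = rouge_scorer.RougeScorer(["rougeL"], use_stemmer=True)

def evaluate(samples, compressor):
    """samples: iterable of (prompt, reference) pairs."""
    f1s = []
    for prompt, reference in samples:
        short = compressor.compress(prompt, ratio=0.5).compressed_prompt
        answer = generate(short)
        f1s.append(scorer.score(reference, answer)["rougeL"].fmeasure)
    return sum(f1s) / len(f1s)
```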
Future Work
Conclusion
References
- Li, Yucheng et al. “Compressing Context to Enhance Inference Efficiency of Large Language Models.” Conference on Empirical Methods in Natural Language Processing (2023).
- Jiang, Huiqiang et al. “LLMLingua: Compressing Prompts for Accelerated Inference of Large Language Models.” Conference on Empirical Methods in Natural Language Processing (2023).
- Jiang, Huiqiang et al. “LongLLMLingua: Accelerating and Enhancing LLMs in Long Context Scenarios via Prompt Compression.” arXiv preprint arXiv:2310.06839 (2023).
- Ghalandari, Demian Gholipour et al. “Efficient Unsupervised Sentence Compression by Fine-tuning Transformers with Reinforcement Learning.” arXiv preprint arXiv:2205.08221 (2022).
- Laban, Philippe et al. “Keep It Simple: Unsupervised Simplification of Multi-Paragraph Text.” Annual Meeting of the Association for Computational Linguistics (ACL-IJCNLP 2021).