ACL script

Abstract

Introduction

Related Work

Toolkit for Prompt Compression

Methods

Selective Context
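
Selective Context (Li et al., 2023) prunes redundant lexical units by scoring them with self-information, the negative log-probability a small causal LM assigns to each unit, and dropping the lowest-scoring ones up to a target ratio. Below is a minimal token-level sketch, assuming GPT-2 via Hugging Face `transformers`; the paper operates on phrase and sentence units, and the percentile cutoff here is a simplification:

```python
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tok = GPT2TokenizerFast.from_pretrained("gpt2")
lm = GPT2LMHeadModel.from_pretrained("gpt2").eval()

def self_information(text: str) -> list[tuple[str, float]]:
    """Score each token with -log P(token | preceding tokens)."""
    ids = tok(text, return_tensors="pt").input_ids
    with torch.no_grad():
        logits = lm(ids).logits
    logp = torch.log_softmax(logits[0, :-1], dim=-1)
    scores = -logp.gather(1, ids[0, 1:, None]).squeeze(1)
    # The first token has no left context, so it is never scored.
    return list(zip(tok.convert_ids_to_tokens(ids[0, 1:]), scores.tolist()))

def compress(text: str, keep_ratio: float = 0.5) -> str:
    """Keep the most informative keep_ratio fraction of tokens."""
    scored = self_information(text)
    cutoff = sorted(s for _, s in scored)[int(len(scored) * (1 - keep_ratio))]
    kept = [t for t, s in scored if s >= cutoff]
    return tok.convert_tokens_to_string(kept)
```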

LLMLingua
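
LLMLingua (Jiang et al., 2023) compresses coarse-to-fine: a budget controller first allocates token budgets across demonstrations, instruction, and question, then an iterative token-level step removes tokens a small LM finds predictable. Microsoft ships a reference implementation as the `llmlingua` package; the call below follows its documented `PromptCompressor` interface at the time of writing, and argument names may differ across versions:

```python
from llmlingua import PromptCompressor

compressor = PromptCompressor()  # downloads a small scoring LM by default

context_docs = ["<demonstration 1>", "<demonstration 2>"]
result = compressor.compress_prompt(
    context_docs,
    instruction="Answer the question using the context.",
    question="What does LLMLingua compress?",
    target_token=200,  # overall budget handed to the budget controller
)
print(result["compressed_prompt"], result["ratio"])
```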

LongLLMLingua
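
LongLLMLingua (Jiang et al., 2023) adapts this to long contexts with question-aware coarse-grained (document-level) and fine-grained (token-level) compression, plus document reordering to counter the lost-in-the-middle effect. In the same `llmlingua` package this appears to be exposed through extra arguments to `compress_prompt`; the names below are taken from its README and are not guaranteed stable:

```python
# Reuses the PromptCompressor instance from the LLMLingua example above.
result = compressor.compress_prompt(
    context_docs,
    question="What does LLMLingua compress?",
    rank_method="longllmlingua",           # question-aware document ranking
    reorder_context="sort",                # mitigate lost-in-the-middle
    dynamic_context_compression_ratio=0.3,
    target_token=200,
)
```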

SCRL
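
SCRL (Ghalandari et al., 2022) frames sentence compression as token-level sequence labeling: a transformer encoder predicts keep/drop for each token and is fine-tuned with reinforcement learning against rewards such as fluency and faithfulness, so no supervised compression pairs are needed. A schematic of the inference step; `"scrl-checkpoint"` is a placeholder, not a real model id:

```python
import torch
from transformers import AutoModelForTokenClassification, AutoTokenizer

tok = AutoTokenizer.from_pretrained("scrl-checkpoint")        # placeholder
model = AutoModelForTokenClassification.from_pretrained("scrl-checkpoint")

def compress(sentence: str) -> str:
    """Keep the tokens the labeler classifies as 'keep' (label 1)."""
    enc = tok(sentence, return_tensors="pt")
    with torch.no_grad():
        probs = torch.softmax(model(**enc).logits, dim=-1)
    keep = probs[0, :, 1] > 0.5
    return tok.decode(enc.input_ids[0][keep], skip_special_tokens=True)
```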

Keep It Simple
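
Keep It Simple (Laban et al., 2021) trains a generator for unsupervised multi-paragraph simplification with reinforcement learning. Schematically, the reward multiplies component scores for fluency, salience, and simplicity, so a candidate must satisfy all three at once:

$$R(p, c) = f_{\text{fluency}}(c)\cdot f_{\text{salience}}(p, c)\cdot f_{\text{simplicity}}(p, c)$$

where $p$ is the source paragraph and $c$ a candidate simplification.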

Metrics

BLEU
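
BLEU scores a candidate by modified n-gram precision against one or more references, with a brevity penalty so that very short outputs cannot win on precision alone:

$$\mathrm{BLEU} = \mathrm{BP}\cdot\exp\Big(\sum_{n=1}^{N} w_n \log p_n\Big),\qquad \mathrm{BP} = \begin{cases}1 & c > r\\ e^{\,1-r/c} & c \le r\end{cases}$$

where $p_n$ is the modified $n$-gram precision, $w_n$ the (typically uniform) weights, $c$ the candidate length, and $r$ the reference length.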

ROUGE
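
ROUGE-N is the recall-oriented counterpart: it counts the fraction of reference n-grams that the candidate recovers (ROUGE-L instead uses the longest common subsequence):

$$\mathrm{ROUGE\text{-}N} = \frac{\sum_{S\in\text{refs}}\;\sum_{g_n\in S}\mathrm{Count}_{\text{match}}(g_n)}{\sum_{S\in\text{refs}}\;\sum_{g_n\in S}\mathrm{Count}(g_n)}$$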

BERTScore
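
BERTScore replaces n-gram overlap with greedy matching of contextual token embeddings. Recall, for instance, matches every reference token $x_i$ to its most similar candidate token $\hat{x}_j$ by cosine similarity:

$$R_{\mathrm{BERT}} = \frac{1}{|x|}\sum_{x_i\in x}\;\max_{\hat{x}_j\in\hat{x}} \mathbf{x}_i^{\top}\hat{\mathbf{x}}_j$$

Precision is defined symmetrically and F1 is their harmonic mean; the embeddings are pre-normalized, so the dot product is a cosine similarity.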

Datasets

Modular Interfaces
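
Wrapping these heterogeneous methods behind one interface reduces each of them to a single compress call. The sketch below is hypothetical; every name in it is illustrative rather than the toolkit's actual API:

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class CompressionResult:
    compressed_prompt: str
    original_tokens: int
    compressed_tokens: int

class BaseCompressor(ABC):
    """Hypothetical shared interface over the methods above."""

    @abstractmethod
    def compress(self, prompt: str, ratio: float = 0.5) -> CompressionResult:
        ...

class SelectiveContextCompressor(BaseCompressor):
    def compress(self, prompt: str, ratio: float = 0.5) -> CompressionResult:
        # score units by self-information, drop the lowest (1 - ratio)
        ...
```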

Evaluation
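
With a shared interface, evaluation reduces to running each compressor over a dataset and scoring the downstream outputs against references. A sketch using Hugging Face's `evaluate` wrappers, assuming the metric ids `"bleu"`, `"rouge"`, and `"bertscore"` resolve as published on the Hub:

```python
import evaluate

bleu = evaluate.load("bleu")
rouge = evaluate.load("rouge")
bertscore = evaluate.load("bertscore")

preds = ["the cat sat on the mat"]
refs = ["a cat was sitting on the mat"]

print(bleu.compute(predictions=preds, references=[refs]))
print(rouge.compute(predictions=preds, references=refs))
print(bertscore.compute(predictions=preds, references=refs, lang="en"))
```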

Future Work

Conclusion

References

  1. Li, Yucheng, et al. “Compressing Context to Enhance Inference Efficiency of Large Language Models.” Proceedings of EMNLP 2023.

  2. Jiang, Huiqiang, et al. “LLMLingua: Compressing Prompts for Accelerated Inference of Large Language Models.” Proceedings of EMNLP 2023.

  3. Jiang, Huiqiang, et al. “LongLLMLingua: Accelerating and Enhancing LLMs in Long Context Scenarios via Prompt Compression.” arXiv:2310.06839 (2023).

  4. Ghalandari, Demian Gholipour, et al. “Efficient Unsupervised Sentence Compression by Fine-tuning Transformers with Reinforcement Learning.” arXiv:2205.08221 (2022).

  5. Laban, Philippe, et al. “Keep It Simple: Unsupervised Simplification of Multi-Paragraph Text.” Proceedings of ACL-IJCNLP 2021.



