THE BAT: Thoughts Hierarchical Enhancement Beyond Arbitrary Text Style Transfer

Biqing Zeng*, Junjie Liang, Yining Hua, Ruizhe Li, Huimin Deng, Yihao Peng, Ruitang Wang

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Published conference contribution

Abstract

In the era of frozen pre-trained model weights and prompt-based fine-tuning of large language models (LLMs), we find that when LLMs are used with standard prompt templates for text style transfer (TST), without constraints on the semantic space or sufficient context information, the model may generate text that deviates from the target style. We propose a set of new prompt templates, integrated into a novel framework for arbitrary text style transfer, that balances transfer strength and fluency to enhance the accuracy of large language models on TST. Achieving an impressive 94.0% accuracy in transfer strength with GPT-4, our framework demonstrates strong performance; it also enables GPT-3.5-Turbo to surpass GPT-4 used with the standard prompt. Additionally, to address the unreliability of existing TST metrics, we propose a novel prompt for TST evaluation that integrates scores for transfer strength, content retention, and fluency into a single score. We use this prompt to re-evaluate previous TST models and highlight the significant progress of our framework. Finally, we observe score fluctuations when using LLMs for text evaluation and propose an approach that requires LLMs to provide explanations, which improves evaluation stability by over 13% compared to prompts without this requirement.
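The abstract describes a single-score evaluation prompt that asks the model to explain its judgement before producing aspect scores. As a minimal sketch of what such a prompt might look like, assuming the OpenAI Python client and an unweighted mean as the aggregation (the prompt wording, scoring scale, and weighting below are illustrative assumptions, not the authors' templates):

```python
# Hypothetical sketch of a single-score TST evaluation prompt: the model is
# asked to explain its judgement first, then score transfer strength,
# content retention, and fluency, which are folded into one number.
import json
from openai import OpenAI  # assumes the official openai>=1.0 client

client = OpenAI()  # reads OPENAI_API_KEY from the environment

EVAL_PROMPT = """You are evaluating a text style transfer output.
Source text: {source}
Target style: {style}
Transferred text: {output}

First, briefly explain how well the transferred text (a) matches the target
style, (b) preserves the source content, and (c) reads fluently.
Then return JSON: {{"explanation": "...", "transfer_strength": 1-5,
"content_retention": 1-5, "fluency": 1-5}}."""


def evaluate(source: str, style: str, output: str, model: str = "gpt-3.5-turbo") -> float:
    """Return a single score in [1, 5] by averaging the three aspect scores."""
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user",
                   "content": EVAL_PROMPT.format(source=source, style=style, output=output)}],
        temperature=0,  # deterministic decoding; a common extra measure, not from the paper
    )
    scores = json.loads(response.choices[0].message.content)
    # Unweighted mean as the aggregate; the paper's actual combination may differ.
    return (scores["transfer_strength"] + scores["content_retention"] + scores["fluency"]) / 3
```

Requesting the explanation before the numeric scores mirrors the stability idea from the abstract; everything else (scale, aggregation, JSON format) is a placeholder design choice.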

Original language: English
Title of host publication: Advanced Intelligent Computing Technology and Applications - 20th International Conference, ICIC 2024, Proceedings
Editors: De-Shuang Huang, Zhanjun Si, Chuanlei Zhang
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 376-388
Number of pages: 13
ISBN (Print): 9789819756711
DOIs
Publication status: Published - 2024
Event: 20th International Conference on Intelligent Computing, ICIC 2024 - Tianjin, China
Duration: 5 Aug 2024 – 8 Aug 2024

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 14878 LNAI
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 20th International Conference on Intelligent Computing, ICIC 2024
Country/Territory: China
City: Tianjin
Period: 5/08/24 – 8/08/24

Keywords

  • Large Language Model
  • LLM-guided evaluation
  • Prompt Learning
  • Text Generation
  • Text Style Transfer
