
Empowering Multi-step Reasoning across Languages via Program-Aided Language Models

Leonardo Ranaldi; Giulia Pucci; Barry Haddow; Alexandra Birch
2024-01-01

Abstract

In-context learning methods are commonly employed as inference strategies, where Large Language Models (LLMs) are elicited to solve a task by leveraging provided demonstrations without requiring parameter updates. Among these approaches are reasoning methods, exemplified by Chain-of-Thought (CoT) and Program-Aided Language Models (PAL), which encourage LLMs to generate intermediate reasoning steps, leading to improved accuracy. Despite their success, the ability to deliver multi-step reasoning remains limited to a single language, making it challenging to generalize to other languages and hindering global accessibility. In this work, we propose Cross-lingual Program-Aided Language Models (Cross-PAL), a method for aligning reasoning programs across languages. Our method delivers programs as intermediate reasoning steps in different languages through a double-step cross-lingual prompting mechanism inspired by the Program-Aided approach. Moreover, we introduce Self-consistent Cross-PAL (SCross-PAL) to ensemble different reasoning paths across languages. Our experimental evaluations show that Cross-PAL outperforms existing methods, reducing the number of interactions and achieving state-of-the-art performance.
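The self-consistency step described above (SCross-PAL) can be illustrated with a toy sketch. The function name, the input format, and the convention that each PAL-style program stores its result in a variable named `answer` are illustrative assumptions, not the paper's implementation; the sketch only shows the general idea of executing candidate reasoning programs obtained from prompts in different languages and majority-voting over their outputs.

```python
from collections import Counter

def scross_pal_vote(programs_by_language):
    """Hypothetical sketch of self-consistency ensembling: run candidate
    reasoning programs sampled from prompts in different languages, then
    return the majority answer among those that execute successfully."""
    answers = []
    for lang, program in programs_by_language:
        scope = {}
        try:
            # PAL-style convention (assumed): the program assigns `answer`.
            exec(program, scope)
            answers.append(scope["answer"])
        except Exception:
            continue  # discard reasoning paths that fail to execute
    if not answers:
        return None
    # Majority vote across all surviving cross-lingual reasoning paths.
    return Counter(answers).most_common(1)[0][0]

# Toy example: the same word problem solved via prompts in three languages.
candidates = [
    ("en", "answer = 12 * 3"),
    ("it", "answer = 12 * 3"),
    ("de", "answer = 12 + 3"),  # one faulty reasoning path
]
print(scross_pal_vote(candidates))  # → 36
```

The faulty German-prompted path is outvoted by the two agreeing paths, which is the intended effect of ensembling across languages.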
Venue: 2024 Conference on Empirical Methods in Natural Language Processing (EMNLP 2024)
Location: USA
Year: 2024
Conference: The 2024 Conference on Empirical Methods in Natural Language Processing
Publisher: Association for Computational Linguistics
Relevance: International
Scientific sector: IINF-05/A - Information processing systems
Language: English
Type: Conference paper
Ranaldi, L., Pucci, G., Haddow, B., Birch, A. (2024). Empowering Multi-step Reasoning across Languages via Program-Aided Language Models. In EMNLP 2024 - 2024 Conference on Empirical Methods in Natural Language Processing, Proceedings of the Conference (pp.12171-12187). Association for Computational Linguistics (ACL) [10.18653/v1/2024.emnlp-main.678].
Ranaldi, L.; Pucci, G.; Haddow, B.; Birch, A.
Files in this record:
There are no files associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2108/424363
Citations
  • PubMed Central: ND
  • Scopus: 1
  • Web of Science: ND