Mathematics • Vol 12 • No 22
DecoStrat: Leveraging the Capabilities of Language Models in D2T Generation via Decoding Framework
November 2024 • Elias Lemuye Jimale, Wenyu Chen, Mugahed A. Al-antari, Yeong Hyeon Gu, Victor Kwaku Agbesi, Wasif Feroze
Current language models have achieved remarkable success in natural language processing (NLP) tasks. Nonetheless, individual decoding methods struggle to realize the full potential of these models, largely because no decoding framework exists that integrates language models with diverse decoding methods. We introduce DecoStrat, which bridges the gap between language modeling and the decoding process in data-to-text (D2T) generation. By leveraging language models, DecoStrat facilitates the exploration of alternative decoding meth…