Retrieving Comparative Arguments using Deep Pre-trained Language Models and NLU Notebook for the Touché Lab on Argument Retrieval at CLEF 2020

Viktoriia Chekalina, Alexander Panchenko

Research output: Contribution to journal › Conference article › peer-review

Abstract

In this paper, we present our submission to the CLEF-2020 shared task on Comparative Argument Retrieval. We propose several approaches based on state-of-the-art NLP techniques such as Seq2Seq, Transformer, and BERT embeddings. In addition to these models, we use features that describe the comparative structure and comparability of the text. For a set of given topics, we retrieve the corresponding responses and rank them using these approaches. The presented solutions could help improve the handling of comparative queries in information retrieval and dialogue systems.
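To illustrate the kind of BERT-embedding-based ranking the abstract refers to, the sketch below scores candidate passages for a comparative topic by cosine similarity of mean-pooled BERT representations. This is a minimal illustration, not the authors' actual pipeline: the model name (`bert-base-uncased`), the mean-pooling strategy, and the example topic and candidates are assumptions for demonstration only.

```python
# Illustrative sketch (assumed setup, not the paper's exact method):
# rank candidate passages by cosine similarity of mean-pooled BERT embeddings.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def embed(texts):
    """Mean-pool the last hidden states over non-padding tokens."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state          # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1).float()   # (B, T, 1)
    summed = (hidden * mask).sum(dim=1)
    counts = mask.sum(dim=1).clamp(min=1e-9)
    return summed / counts                                 # (B, H)

# Hypothetical comparative topic and candidate answers, for illustration only.
topic = "Which is better for web development, Python or PHP?"
candidates = [
    "Python has a cleaner syntax and a large ecosystem of web frameworks.",
    "PHP powers a large share of existing websites and is easy to deploy.",
    "Bananas are a good source of potassium.",
]

topic_vec = embed([topic])
cand_vecs = embed(candidates)
scores = torch.nn.functional.cosine_similarity(topic_vec, cand_vecs)
for score, text in sorted(zip(scores.tolist(), candidates), reverse=True):
    print(f"{score:.3f}  {text}")
```

In such a setup, the embedding-based score could be combined with features capturing comparative structure (e.g., the presence of comparison markers) to produce the final ranking, as the abstract suggests.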

Original language: English
Journal: CEUR Workshop Proceedings
Volume: 2696
Publication status: Published - 2020
Event: 11th Conference and Labs of the Evaluation Forum, CLEF 2020 - Thessaloniki, Greece
Duration: 22 Sep 2020 – 25 Sep 2020
