Zanzotto, F.M., Pennacchiotti, M., Moschitti, A. (2009). A machine learning approach to textual entailment recognition. Natural Language Engineering, 15(4), 551-582. [10.1017/S1351324909990143]
A Machine learning approach to textual entailment recognition
ZANZOTTO, FABIO MASSIMO; MOSCHITTI, ALESSANDRO
2009-01-01
Abstract
Designing models for learning textual entailment recognizers from annotated examples is not an easy task, as it requires modeling the semantic relations and interactions between the two text fragments of a pair. In this paper, we approach the problem by first introducing the class of pair feature spaces, which allow supervised machine learning algorithms to derive first-order rewrite rules from annotated examples. In particular, we propose syntactic and shallow semantic feature spaces, and compare them to standard ones. Extensive experiments demonstrate that our proposed spaces learn first-order derivations, while standard ones are not expressive enough to do so.
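As an illustration of the supervised setup the abstract describes, the sketch below casts entailment recognition as binary classification over (text, hypothesis) pairs. It is a minimal sketch only: it uses toy lexical-overlap features and an off-the-shelf SVM, and does not reproduce the pair feature spaces, syntactic/shallow semantic representations, or first-order rewrite rules proposed in the article. All data, feature choices, and function names are illustrative assumptions.

```python
# Minimal sketch (not the paper's method): supervised classification over
# (text, hypothesis) pairs with simple lexical-overlap features.
from sklearn.svm import SVC

def pair_features(text, hypothesis):
    """Map a (text, hypothesis) pair to a small illustrative feature vector."""
    t_tokens = set(text.lower().split())
    h_tokens = set(hypothesis.lower().split())
    overlap = len(t_tokens & h_tokens)
    return [
        overlap / max(len(h_tokens), 1),     # fraction of hypothesis words covered by the text
        overlap / max(len(t_tokens), 1),     # fraction of text words reused in the hypothesis
        abs(len(t_tokens) - len(h_tokens)),  # length difference between the two fragments
    ]

# Toy annotated examples: 1 = entailment, 0 = no entailment (hypothetical data).
pairs = [
    ("A man is playing a guitar on stage", "A man plays an instrument", 1),
    ("A dog sleeps on the couch", "A cat chases a mouse", 0),
    ("The company acquired the startup in 2008", "The startup was bought", 1),
    ("Children are swimming in the pool", "The pool is empty", 0),
]

X = [pair_features(t, h) for t, h, _ in pairs]
y = [label for _, _, label in pairs]

clf = SVC(kernel="linear").fit(X, y)
print(clf.predict([pair_features("The senate approved the bill", "The bill passed")]))
```

In the article itself, the feature spaces are built over syntactic and shallow semantic structures of the two fragments rather than surface overlap, which is what allows the learner to derive first-order rewrite rules; the code above only mirrors the overall pair-classification framing.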
| File | Size | Format |
|---|---|---|
| 2009_JNLE_ZanzottoPennacchiottiMoschitti.pdf (open access; description: main article; license: publisher's copyright) | 303.41 kB | Adobe PDF |
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.