
Croce, D., Moschitti, A., Basili, R. (2011). Semantic convolution kernels over dependency trees: Smoothed partial tree kernel. In International Conference on Information and Knowledge Management, Proceedings (pp. 2013-2016) [10.1145/2063576.2063878].

Semantic convolution kernels over dependency trees: Smoothed partial tree kernel

Croce, Danilo; Moschitti, Alessandro; Basili, Roberto
2011-10-01

Abstract

In recent years, natural language processing techniques have been increasingly used in IR. Among others, syntactic and semantic parsing are effective methods for the design of complex applications such as question answering and sentiment analysis. Unfortunately, extracting feature representations suitable for machine learning algorithms from linguistic structures is typically difficult. In this paper, we describe one of the most advanced technologies for the automatic engineering of syntactic and semantic patterns. The method combines convolution dependency tree kernels with lexical similarities. It can efficiently and effectively measure the similarity between dependency structures whose lexical nodes are partly or completely different. Its use in powerful algorithms such as Support Vector Machines (SVMs) allows for the fast design of accurate automatic systems. We report experiments on question classification, which show an unprecedented result, i.e., a 41% error reduction over the former state of the art, along with an analysis of the appealing properties of the approach. © 2011 ACM.
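To make the idea concrete, below is a minimal, illustrative sketch of a smoothed tree kernel: structural matching over dependency trees in which the hard lexical match between nodes is replaced by a continuous word-similarity score, so that trees with partly or completely different words can still receive a non-zero kernel value. The toy word vectors, the position-based child alignment, and all function names are assumptions made for brevity; this is not the exact Smoothed Partial Tree Kernel recursion defined in the paper.

```python
# Simplified smoothed tree kernel: convolution over node pairs, with the
# node match smoothed by a lexical (cosine) similarity instead of exact
# string equality. Illustrative sketch only.

from dataclasses import dataclass, field
from math import sqrt
from typing import Dict, List


@dataclass
class Node:
    word: str
    children: List["Node"] = field(default_factory=list)


# Toy vectors standing in for a distributional word space (assumption).
VECTORS: Dict[str, List[float]] = {
    "buy":      [0.90, 0.10, 0.00],
    "purchase": [0.85, 0.15, 0.05],
    "book":     [0.10, 0.80, 0.20],
    "novel":    [0.15, 0.75, 0.25],
    "John":     [0.00, 0.10, 0.90],
    "Mary":     [0.05, 0.10, 0.85],
}


def lexical_similarity(w1: str, w2: str) -> float:
    """Cosine similarity between word vectors; falls back to exact match."""
    v1, v2 = VECTORS.get(w1), VECTORS.get(w2)
    if v1 is None or v2 is None:
        return 1.0 if w1 == w2 else 0.0
    dot = sum(a * b for a, b in zip(v1, v2))
    norm = sqrt(sum(a * a for a in v1)) * sqrt(sum(b * b for b in v2))
    return dot / norm if norm else 0.0


def delta(n1: Node, n2: Node, lam: float = 0.4) -> float:
    """Contribution of a node pair: smoothed lexical match times a decayed,
    position-aligned match of the children (simplification of partial matching)."""
    sim = lexical_similarity(n1.word, n2.word)
    if sim == 0.0:
        return 0.0
    child_score = 1.0
    for c1, c2 in zip(n1.children, n2.children):
        child_score *= (1.0 + delta(c1, c2, lam))
    return lam * sim * child_score


def smoothed_tree_kernel(t1: Node, t2: Node) -> float:
    """Convolution form: sum delta over all node pairs of the two trees."""
    def nodes(t: Node) -> List[Node]:
        return [t] + [m for c in t.children for m in nodes(c)]
    return sum(delta(a, b) for a in nodes(t1) for b in nodes(t2))


if __name__ == "__main__":
    # "John buy book" vs. "Mary purchase novel": no shared words, yet the
    # kernel assigns a non-zero similarity thanks to the smoothed lexical match.
    t1 = Node("buy", [Node("John"), Node("book")])
    t2 = Node("purchase", [Node("Mary"), Node("novel")])
    print(round(smoothed_tree_kernel(t1, t2), 4))
```

A kernel score of this kind can be supplied to an SVM, e.g. as a precomputed kernel matrix, which is the setting the abstract describes for the question classification experiments.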
20th ACM Conference on Information and Knowledge Management, CIKM'11
Glasgow, United Kingdom
2011
Special Interest Group on Information Retrieval (ACM SIGIR)
International relevance
October 2011
Sector ING-INF/05 - Information Processing Systems
Sector INF/01 - Computer Science
English
kernel methods; natural language processing; question answering; support vector machines; syntactic semantic structures; Business, Management and Accounting (all); Decision Sciences (all)
Conference contribution
Croce, D; Moschitti, A; Basili, R
Files in this record:
File: CIKM-SPTK2011_crux_material.pdf
Access: authorized users only
License: publisher's copyright
Size: 183.42 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2108/124188
Citations
  • Scopus 11