Dependency-Based Construction of Semantic Space Models
- 1 June 2007
- research article
- Published by MIT Press in Computational Linguistics
- Vol. 33 (2), 161-199
- https://doi.org/10.1162/coli.2007.33.2.161
Abstract
Traditionally, vector-based semantic space models use word co-occurrence counts from large corpora to represent lexical meaning. In this article we present a novel framework for constructing semantic spaces that takes syntactic relations into account. We introduce a formalization for this class of models, which allows linguistic knowledge to guide the construction process. We evaluate our framework on a range of tasks relevant for cognitive science and natural language processing: semantic priming, synonymy detection, and word sense disambiguation. In all cases, our framework obtains results that are comparable or superior to the state of the art.
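The framework replaces linear word windows with syntactic context: co-occurrence counts are collected over dependency relations linking a target word to its context words, rather than over raw adjacency. As a rough illustration only (this is not the authors' formalization, and the tiny hand-written triple corpus below stands in for real dependency-parser output), the following Python sketch counts (relation, context word) features for each word and compares the resulting vectors with cosine similarity.

```python
from collections import defaultdict
import math

# Toy dependency-parsed corpus: (head, relation, dependent) triples per sentence.
# Invented for illustration; in practice these would come from a dependency parser.
parsed_corpus = [
    [("drink", "dobj", "coffee"), ("drink", "nsubj", "man")],
    [("drink", "dobj", "tea"), ("drink", "nsubj", "woman")],
    [("read", "dobj", "book"), ("read", "nsubj", "woman")],
]

def build_space(corpus):
    """Accumulate (relation, context word) co-occurrence counts per target word."""
    space = defaultdict(lambda: defaultdict(float))
    for sentence in corpus:
        for head, rel, dep in sentence:
            # Each dependency edge contributes a feature in both directions:
            # the head sees the dependent through rel, the dependent sees the
            # head through the inverse relation.
            space[head][(rel, dep)] += 1.0
            space[dep][(rel + "^-1", head)] += 1.0
    return space

def cosine(v1, v2):
    """Cosine similarity between two sparse feature vectors."""
    shared = set(v1) & set(v2)
    dot = sum(v1[f] * v2[f] for f in shared)
    norm1 = math.sqrt(sum(x * x for x in v1.values()))
    norm2 = math.sqrt(sum(x * x for x in v2.values()))
    return dot / (norm1 * norm2) if norm1 and norm2 else 0.0

space = build_space(parsed_corpus)
print(cosine(space["coffee"], space["tea"]))   # high: both are objects of "drink"
print(cosine(space["coffee"], space["book"]))  # zero: no shared dependency contexts
```

In the full framework, which dependency paths count as context and how they are weighted are themselves parameters of the model, which is what allows linguistic knowledge to guide the construction process.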