Title: SemVec: semantic features word vectors based deep learning for improved text classification
Authors: Odeh, Firas 
Taweel, Adel
Keywords: Convolutional neural networks;Artificial intelligence;Text classification;Deep learning (Machine learning);Neural networks (Computer science);Feature engineering;Natural language processing (Computer science);Machine learning;Word embeddings
Issue Date: 2018
Publisher: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Abstract: Semantic word representation is a core building block in many deep learning systems. Most word representation techniques are based on word angles/distances, word analogies, and statistical information. However, popular models ignore word morphology by representing each word with a distinct vector, which limits their ability to represent rare words in languages with large vocabularies. This paper proposes a dynamic model, named SemVec, for representing words as a vector of both domain and semantic features. Based on the problem domain, semantic features can be added or removed to generate a word representation enriched with domain knowledge. The proposed method is evaluated on adverse drug events (ADR) tweet/text classification. Results show that SemVec improves the precision of ADR detection by 15.28% over other state-of-the-art deep learning methods, with a comparable recall score.
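The abstract describes augmenting a base word embedding with domain-specific semantic features that can be added or removed per problem domain. A minimal sketch of that idea is below; the feature names, vocabulary, and embeddings are illustrative assumptions, not the authors' actual SemVec implementation.

```python
import numpy as np

# Toy base embeddings; in practice these would come from a pretrained
# model such as word2vec or GloVe (assumption, not specified here).
base_embeddings = {
    "ibuprofen": np.array([0.2, -0.1, 0.4]),
    "headache": np.array([0.5, 0.3, -0.2]),
}

# Hypothetical domain/semantic features: each maps a word to a 0/1 flag.
# Features can be added or removed depending on the target domain,
# mirroring the "dynamic" representation the abstract describes.
semantic_features = {
    "is_drug": lambda w: float(w in {"ibuprofen", "aspirin"}),
    "is_symptom": lambda w: float(w in {"headache", "nausea"}),
}

def semvec(word):
    """Concatenate the base embedding with semantic feature flags."""
    base = base_embeddings[word]
    feats = np.array([f(word) for f in semantic_features.values()])
    return np.concatenate([base, feats])

print(semvec("ibuprofen"))  # 3 base dims + 2 feature flags = 5 dims
```

The resulting enriched vectors would then feed a downstream classifier (the paper uses a convolutional neural network for ADR text classification).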
DOI: 10.1007/978-3-030-04070-3_35
Appears in Collections:Fulltext Publications

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.