A Neural Layered Model for Nested Named Entity Recognition

Meizhi Ju, Makoto Miwa, Sophia Ananiadou

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Entity mentions embedded in longer entity mentions are referred to as nested entities.
Most named entity recognition (NER) systems deal only with the outermost flat entities and ignore the inner nested ones, and so fail to capture finer-grained semantic information in the underlying texts. To address this issue, we propose a novel neural model that identifies nested entities by dynamically stacking flat NER layers. Each flat NER layer is based on the state-of-the-art flat NER model: it captures sequential context representations with a bidirectional long short-term memory (LSTM) layer and feeds them to a cascaded CRF layer. Our model merges the output of the LSTM layer in the current flat NER layer to build new representations for detected entities and subsequently feeds them into the next flat NER layer. This allows our model to extract outer entities by taking full advantage of the information encoded in their corresponding inner entities, in an inside-to-outside way. Our model dynamically stacks flat NER layers until no further outer entities are extracted. Extensive evaluation shows that our dynamic model outperforms state-of-the-art feature-based systems on nested NER, achieving 74.7% and 72.2% F-score on the GENIA and ACE2005 datasets, respectively.
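The inside-to-outside stacking described above can be illustrated with a minimal sketch. The function names (`merge_entity_regions`, `layered_decode`) and the plain-Python representation handling are illustrative assumptions, not the authors' implementation; a real flat NER layer would be a BiLSTM+CRF rather than the stand-in callable used here.

```python
# Hedged sketch of the dynamic layered decoding loop. A real flat NER
# layer would be a BiLSTM+CRF; here it is an arbitrary callable that
# returns entity spans so the control flow can be shown.

def merge_entity_regions(reprs, spans):
    """Average the representations inside each detected entity span
    (end index exclusive), so each inner entity becomes a single unit
    in the sequence fed to the next flat NER layer."""
    merged, i = [], 0
    while i < len(reprs):
        span = next((s for s in spans if s[0] == i), None)
        if span:
            start, end = span
            avg = [sum(dims) / (end - start) for dims in zip(*reprs[start:end])]
            merged.append(avg)
            i = end
        else:
            merged.append(reprs[i])
            i += 1
    return merged

def layered_decode(token_reprs, flat_ner_layer):
    """Stack flat NER layers dynamically: keep applying the flat layer
    and merging detected entities until no new entities are found."""
    all_entities, depth, reprs = [], 0, token_reprs
    while True:
        spans = flat_ner_layer(reprs, depth)
        if not spans:
            break  # termination: this layer detected no outer entities
        all_entities.append(spans)
        reprs = merge_entity_regions(reprs, spans)
        depth += 1
    return all_entities
```

For example, with a stand-in layer that detects one entity at depth 0 and nothing afterwards, `layered_decode` runs exactly two flat layers and stops, returning the spans found per depth.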
Original language: English
Title of host publication: Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers)
Publisher: Association for Computational Linguistics
Pages: 1446-1459
Number of pages: 14
Publication status: Published - 1 Jun 2018
