
Graph long short term memory for syntactic relationship discovery

Christopher Brian Quirk, Kristina Nikolova Toutanova, Wen-tau Yih, Hoifung Poon, and Nanyun Peng, 2016.





Bib Entry

@misc{quirk2016graph,
  title = {Graph long short term memory for syntactic relationship discovery},
  author = {Quirk, Christopher Brian and Toutanova, Kristina Nikolova and Yih, Wen-tau and Poon, Hoifung and Peng, Nanyun},
  year = {2016}
}
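The entry above only names the technique, so as a rough, hypothetical sketch of the general idea in the title — an LSTM whose recurrence follows graph edges (e.g., syntactic dependency arcs) rather than a linear chain — the step below computes each node's gates from its input and the mean of its neighbors' hidden states. This is an illustration under assumed dimensions and gating choices, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GraphLSTMCell:
    """Toy graph-structured LSTM step: gates are driven by each node's
    input and the average hidden state of its graph neighbors."""

    def __init__(self, input_dim, hidden_dim):
        # One input weight, recurrent weight, and bias per gate:
        # i = input, f = forget, o = output, c = cell candidate.
        scale = 0.1
        self.W = {g: rng.normal(0, scale, (input_dim, hidden_dim)) for g in "ifoc"}
        self.U = {g: rng.normal(0, scale, (hidden_dim, hidden_dim)) for g in "ifoc"}
        self.b = {g: np.zeros(hidden_dim) for g in "ifoc"}

    def step(self, x, h, c, adjacency):
        # x: (n_nodes, input_dim); h, c: (n_nodes, hidden_dim)
        # adjacency: (n_nodes, n_nodes) 0/1 matrix of graph edges.
        deg = adjacency.sum(axis=1, keepdims=True).clip(min=1)
        h_nbr = adjacency @ h / deg  # mean over neighbor hidden states

        def gate(g, activation):
            return activation(x @ self.W[g] + h_nbr @ self.U[g] + self.b[g])

        i, f, o = gate("i", sigmoid), gate("f", sigmoid), gate("o", sigmoid)
        c_tilde = gate("c", np.tanh)
        c_new = f * c + i * c_tilde
        h_new = o * np.tanh(c_new)
        return h_new, c_new

# Tiny 3-node chain graph 0-1-2 (symmetric adjacency), one update step.
adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]], dtype=float)
cell = GraphLSTMCell(input_dim=4, hidden_dim=8)
x = rng.normal(size=(3, 4))
h = np.zeros((3, 8))
c = np.zeros((3, 8))
h1, c1 = cell.step(x, h, c, adj)
```

In the cross-sentence setting described by the related TACL paper, the adjacency would come from dependency parses and inter-sentence links; here it is a stand-in chain graph.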

Related Publications

  1. Discourse-level Relation Extraction via Graph Pooling

    I.-Hung Hsu, Xiao Guo, Premkumar Natarajan, and Nanyun Peng, in The Thirty-Sixth AAAI Conference On Artificial Intelligence Workshop on Deep Learning on Graphs: Method and Applications (DLG-AAAI), 2022.
🏆 Best Paper Award
    @inproceedings{hsu2021discourse,
      title = {Discourse-level Relation Extraction via Graph Pooling},
      author = {Hsu, I-Hung and Guo, Xiao and Natarajan, Premkumar and Peng, Nanyun},
      booktitle = {The Thirty-Sixth AAAI Conference On Artificial Intelligence Workshop on Deep Learning on Graphs: Method and Applications (DLG-AAAI)},
      year = {2022}
    }
    
  2. Cross-sentence N-ary Relation Extraction with Graph LSTMs

Nanyun Peng, Hoifung Poon, Chris Quirk, Kristina Toutanova, and Wen-tau Yih, Transactions of the Association for Computational Linguistics, 2017.
    @article{peng2017cross,
      title = {Cross-sentence N-ary Relation Extraction with Graph LSTMs},
      author = {Peng, Nanyun and Poon, Hoifung and Quirk, Chris and Toutanova, Kristina and Yih, Wen-tau},
  journal = {Transactions of the Association for Computational Linguistics},
      year = {2017}
    }
    