Daniel Gildea
Strictly Breadth-First AMR Parsing
AMR parsing is the task of automatically mapping a sentence to an AMR semantic graph. We focus on the breadth-first strategy for this task, which was proposed recently and achieved better performance than other strategies. However, current m…
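As a rough illustration of what a breadth-first ordering of an AMR-like graph looks like (a sketch only, assuming a plain adjacency-list representation, not this paper's parser):

```python
# Illustrative only: a breadth-first ordering of an AMR-like graph's nodes,
# assuming a simple adjacency-list representation (not the paper's parser).
from collections import deque

def bfs_order(graph, root):
    """Return the nodes of `graph` in breadth-first order starting from `root`."""
    order, seen, queue = [], {root}, deque([root])
    while queue:
        node = queue.popleft()
        order.append(node)
        for child in graph.get(node, []):
            if child not in seen:
                seen.add(child)
                queue.append(child)
    return order

# Toy AMR-like graph for "The boy wants to go":
amr = {"want-01": ["boy", "go-01"], "go-01": ["boy"], "boy": []}
print(bfs_order(amr, "want-01"))  # ['want-01', 'boy', 'go-01']
```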
Hierarchical Context Tagging for Utterance Rewriting
Utterance rewriting aims to recover coreferences and omitted information from the latest turn of a multi-turn dialogue. Recently, methods that tag rather than linearly generate sequences have proven stronger in both in- and out-of-domain r…
Rewarding Semantic Similarity under Optimized Alignments for AMR-to-Text Generation
A common way to combat exposure bias is by applying scores from evaluation metrics as rewards in reinforcement learning (RL). Metrics leveraging contextualized embeddings appear more flexible than their n-gram matching counterparts and thu…
Sequence-to-sequence AMR Parsing with Ancestor Information
AMR parsing is the task of automatically mapping a sentence to an AMR semantic graph. The difficulty comes from generating the complex graph structure. The previous state-of-the-art method translates the AMR graph into a sequence, then dire…
Latent Tree Decomposition Parsers for AMR-to-Text Generation
Graph encoders in AMR-to-text generation models often rely on neighborhood convolutions or global vertex attention. While these approaches apply to general graphs, AMRs may be amenable to encoders that target their tree-like structure. By …
Tree Decomposition Attention for AMR-to-Text Generation
Text generation from AMR requires mapping a semantic graph to a string that it annotates. Transformer-based graph encoders, however, poorly capture vertex dependencies that may benefit sequence prediction. To impose order on an encoder, we…
AWLCO: All-Window Length Co-Occurrence
Analyzing patterns in a sequence of events has applications in text analysis, computer programming, and genomics research. In this paper, we consider the all-window-length analysis model which analyzes a sequence of events with respect to …
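For intuition, a naive all-window-length co-occurrence count might look like the sketch below; the quadratic loop is purely illustrative and presumably far slower than the method the paper proposes. The events `a` and `b` and the toy sequence are hypothetical.

```python
# A naive baseline for all-window-length co-occurrence counting (for intuition
# only, not the paper's algorithm). For each window length w, count the sliding
# windows that contain both events a and b.
def naive_awlco(seq, a, b):
    n = len(seq)
    counts = {}
    for w in range(1, n + 1):
        c = 0
        for i in range(n - w + 1):
            window = seq[i:i + w]
            if a in window and b in window:
                c += 1
        counts[w] = c
    return counts

print(naive_awlco(list("abcab"), "a", "b"))  # {1: 0, 2: 2, 3: 3, 4: 2, 5: 1}
```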
Outside Computation with Superior Functions
We show that a general algorithm for efficient computation of outside values under the minimum of superior functions framework proposed by Knuth (1977) would yield a sub-exponential time algorithm for SAT, violating the Strong Exponential …
Efficient Outside Computation
Weighted deduction systems provide a framework for describing parsing algorithms that can be used with a variety of operations for combining the values of partial derivations. For some operations, inside values can be computed efficiently,…
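As background for what "inside values" are in this setting, here is a minimal sketch of inside computation for a toy grammar under the probability semiring; outside values, the paper's focus, are propagated in the reverse direction over the same items. The toy grammar and chart layout are assumptions for illustration, not the paper's formalism.

```python
# Minimal sketch of inside values for a toy PCFG in Chomsky normal form under
# the probability (+, *) semiring. Outside values would be propagated in the
# reverse direction over the same chart items.
from collections import defaultdict

# Hypothetical toy grammar: S -> A B (1.0), A -> 'a' (1.0), B -> 'b' (1.0)
binary = {("A", "B"): [("S", 1.0)]}
lexical = {"a": [("A", 1.0)], "b": [("B", 1.0)]}

def inside(words):
    n = len(words)
    chart = defaultdict(float)  # chart[(i, j, X)] = inside value of X over words[i:j]
    for i, w in enumerate(words):
        for lhs, p in lexical.get(w, []):
            chart[(i, i + 1, lhs)] += p
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):
                for (x, y), rules in binary.items():
                    left, right = chart[(i, k, x)], chart[(k, j, y)]
                    if left and right:
                        for lhs, p in rules:
                            chart[(i, j, lhs)] += p * left * right
    return chart[(0, n, "S")]

print(inside(["a", "b"]))  # 1.0
```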
Unsupervised Bilingual Lexicon Induction Across Writing Systems
Recent embedding-based methods in unsupervised bilingual lexicon induction have shown good results, but generally have not leveraged orthographic (spelling) information, which can be helpful for pairs of related languages. This work augmen…
Generalized Shortest-Paths Encoders for AMR-to-Text Generation
For text generation from semantic graphs, past neural models encoded input structure via gated convolutions along graph edges. Although these operations provide local context, the distance messages can travel is bounded by the number of en…
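To make the idea of path-based distance concrete, here is a small sketch that computes all-pairs shortest-path lengths on an unweighted AMR-like graph by repeated BFS; an encoder could consume such distances, but this is an illustration, not the paper's model.

```python
# Illustration only: all-pairs shortest-path lengths on an undirected,
# unweighted AMR-like graph via repeated BFS (not the paper's encoder).
from collections import deque

def all_pairs_distances(graph):
    dist = {}
    for source in graph:
        d = {source: 0}
        queue = deque([source])
        while queue:
            u = queue.popleft()
            for v in graph[u]:
                if v not in d:
                    d[v] = d[u] + 1
                    queue.append(v)
        dist[source] = d
    return dist

g = {"want-01": {"boy", "go-01"}, "go-01": {"want-01", "boy"}, "boy": {"want-01", "go-01"}}
print(all_pairs_distances(g)["boy"]["go-01"])  # 1
```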
Tensors over Semirings for Latent-Variable Weighted Logic Programs
Semiring parsing is an elegant framework for describing parsers by using semiring weighted logic programs. In this paper we present a generalization of this concept: latent-variable semiring parsing. With our framework, any semiring weight…
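A minimal sketch of the general idea, assuming numpy and a hypothetical latent-state dimension K: scalar rule weights become K-by-K matrices, with elementwise addition combining alternative derivations and matrix multiplication chaining sub-derivations. This only illustrates latent-variable weights, not the paper's formal construction.

```python
# Minimal numpy sketch of tensor-valued weights: "plus" is elementwise addition
# and "times" is matrix multiplication over an assumed latent-state dimension K.
import numpy as np

K = 2  # number of latent states (assumed for illustration)

def plus(a, b):
    return a + b        # combine alternative derivations

def times(a, b):
    return a @ b        # chain sub-derivations across latent states

w1 = np.full((K, K), 0.5)   # toy rule weights
w2 = np.eye(K) * 0.3

print(plus(times(w1, w2), times(w2, w1)))
```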
AMR-to-Text Generation with Cache Transition Systems
Text generation from AMR involves emitting sentences that reflect the meaning of their AMR annotations. Neural sequence-to-sequence models have successfully been used to decode strings from flattened graphs (e.g., using depth-first or rand…
Leveraging Dependency Forest for Neural Medical Relation Extraction
Medical relation extraction discovers relations between entity mentions in text, such as research articles. For this task, dependency syntax has been recognized as a crucial source of features. Yet in the medical domain, 1-best parse trees…
SemBleu: A Robust Metric for AMR Parsing Evaluation
Evaluating AMR parsing accuracy involves comparing pairs of AMR graphs. The major evaluation metric, SMATCH (Cai and Knight, 2013), searches for one-to-one mappings between the nodes of two AMRs with a greedy hill-climbing algorithm, which…
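A toy sketch of the greedy hill-climbing idea described above (swap pairs in a one-to-one node mapping while the number of matched triples improves); the helper names and the tiny AMRs are hypothetical, and this is not the official SMATCH implementation.

```python
# Toy sketch of SMATCH-style greedy hill climbing (not the official SMATCH code):
# start from an arbitrary one-to-one node mapping between two AMRs and swap
# mapped nodes while the number of matched relation triples improves.
import itertools

def matched_triples(mapping, triples1, triples2):
    t2 = set(triples2)
    return sum((mapping.get(h), r, mapping.get(t)) in t2 for h, r, t in triples1)

def hill_climb(nodes1, nodes2, triples1, triples2):
    mapping = dict(zip(nodes1, nodes2))  # arbitrary initial one-to-one mapping
    best = matched_triples(mapping, triples1, triples2)
    improved = True
    while improved:
        improved = False
        for a, b in itertools.combinations(nodes1, 2):
            mapping[a], mapping[b] = mapping[b], mapping[a]      # try a swap
            score = matched_triples(mapping, triples1, triples2)
            if score > best:
                best, improved = score, True
            else:
                mapping[a], mapping[b] = mapping[b], mapping[a]  # undo
    return mapping, best

n1, n2 = ["w", "b"], ["x", "y"]
t1, t2 = [("w", "ARG0", "b")], [("x", "ARG0", "y")]
print(hill_climb(n1, n2, t1, t2))  # ({'w': 'x', 'b': 'y'}, 1)
```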
Predicting TED Talk Ratings from Language and Prosody
We use the largest open repository of public speaking---TED Talks---to predict the ratings of the online viewers. Our dataset contains over 2200 TED Talk transcripts (comprising over 200 thousand sentences), audio features and the associated…
A Causality-Guided Prediction of the TED Talk Ratings from the Speech-Transcripts using Neural Networks
Automated prediction of public speaking performance enables novel systems for tutoring public speaking skills. We use the largest open repository---TED Talks---to predict the ratings provided by the online viewers. The dataset contains ove…
Ordered Tree Decomposition for HRG Rule Extraction
We present algorithms for extracting Hyperedge Replacement Grammar (HRG) rules from a graph along with a vertex order. Our algorithms are based on finding a tree decomposition of smallest width, relative to the vertex order, and then extra…
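The notion of width relative to a vertex order can be illustrated with the standard elimination game, sketched below under the assumption of an undirected graph given as an edge list; this is a textbook construction, not the paper's extraction algorithm.

```python
# Standard elimination game (illustration only): eliminate vertices in the given
# order, connecting each vertex's remaining neighbors into a clique; the largest
# neighborhood encountered at elimination time is the width induced by the order.
def induced_width(edges, order):
    adj = {v: set() for v in order}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    width = 0
    for v in order:
        neighbors = adj[v]
        width = max(width, len(neighbors))
        for u in neighbors:                 # fill in edges among remaining neighbors
            adj[u] |= neighbors - {u, v}
            adj[u].discard(v)
        del adj[v]
    return width

print(induced_width([(1, 2), (2, 3), (3, 4), (4, 1)], [1, 2, 3, 4]))  # 2
```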
Neural Transition-based Syntactic Linearization
The task of linearization is to find a grammatical order given a set of words. Traditional models use statistical methods. Syntactic linearization systems, which generate a sentence along with its syntactic tree, have shown state-of-the-ar…
Exploring Graph-structured Passage Representation for Multi-hop Reading Comprehension with Graph Neural Networks
Multi-hop reading comprehension focuses on one type of factoid question, where a system needs to properly integrate multiple pieces of evidence to correctly answer a question. Previous work approximates global evidence with local coreferen…
N-ary Relation Extraction using Graph State LSTM
Cross-sentence n-ary relation extraction detects relations among n entities across multiple sentences. Typical methods formulate an input as a document graph, integrating various intra-sentential and inter-sentential dependenc…
Feature-Based Decipherment for Machine Translation
Orthographic similarities across languages provide a strong signal for unsupervised probabilistic transduction (decipherment) for closely related language pairs. The existing decipherment models, however, are not well suited for exploiting…
A Graph-to-Sequence Model for AMR-to-Text Generation
The problem of AMR-to-text generation is to recover a text representing the same meaning as an input AMR graph. The current state-of-the-art method uses a sequence-to-sequence model, leveraging LSTM for encoding a linearized AMR structure.…
AMR Parsing With Cache Transition Systems
In this paper, we present a transition system that generalizes transition-based dependency parsing techniques to generate AMR graphs rather than tree structures. In addition to a buffer and a stack, we use a fixed-size cache, and allow the …
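A toy configuration with a buffer, a stack, and a fixed-size cache, to make the machinery concrete; the action names and their behavior here are illustrative assumptions, not the paper's actual transition inventory.

```python
# Toy configuration for a cache transition system: a buffer of input tokens, a
# stack, and a fixed-size cache. The "shift" and "push_to_cache" moves are
# illustrative assumptions, not the paper's transition set.
from collections import deque

class CacheConfig:
    def __init__(self, tokens, cache_size=3):
        self.buffer = deque(tokens)        # remaining input
        self.stack = []                    # working stack
        self.cache = [None] * cache_size   # fixed-size cache of vertices

    def shift(self):
        """Move the next buffer token onto the stack."""
        self.stack.append(self.buffer.popleft())

    def push_to_cache(self, slot):
        """Place the stack top into a cache slot, returning the old occupant to the stack."""
        evicted = self.cache[slot]
        self.cache[slot] = self.stack.pop()
        if evicted is not None:
            self.stack.append(evicted)

config = CacheConfig(["the", "boy", "wants"])
config.shift()
config.push_to_cache(0)
print(config.cache)  # ['the', None, None]
```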
Leveraging Context Information for Natural Question Generation
Linfeng Song, Zhiguo Wang, Wael Hamza, Yue Zhang, Daniel Gildea. Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers). 2018.
N-ary Relation Extraction using Graph-State LSTM
Cross-sentence n-ary relation extraction detects relations among n entities across multiple sentences. Typical methods formulate an input as a document graph, integrating various intra-sentential and inter-sentential dependencies. The curr…
The ACL Anthology: Current State and Future Directions
The Association for Computational Linguistics' Anthology is the open-source archive and main source of the scientific literature of computational linguistics and natural language processing. The ACL Anthology is currently maintained exclus…