Learning with Partially Ordered Representations
Jane Chandlee, R. Eyraud, Jeffrey Heinz, Adam Jardine, Jonathan Rawski · 2019 · Open Access · DOI: https://doi.org/10.18653/v1/w19-5708 · OA: W2952070529
This paper examines the characterization and learning of grammars defined with enriched representational models. Model-theoretic approaches to formal language theory traditionally assume that each position in a string belongs to exactly one unary relation. We consider unconventional string models in which positions can have multiple, shared properties, which are arguably useful in many applications. We show that the structures given by these models are partially ordered, and we present a learning algorithm that exploits this ordering relation to effectively prune the hypothesis space. We prove that this learning algorithm, which takes positive examples as input, finds the most general grammar that covers the data.
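As a rough illustration of the idea (a minimal sketch, not the algorithm presented in the paper), the Python code below represents each string position as a set of properties, defines a generality (subsumption) order over width-k structures, and runs a toy positive-only learner that forbids the most general structures absent from the data, using the partial order to prune every refinement of a structure that is already forbidden. All names here (subsumes, occurs_in, learn_forbidden), the property alphabet in the usage example, and the forbidden-substructure interpretation of the grammar are illustrative assumptions, not the paper's definitions.

from itertools import combinations, product

def nonempty_subsets(alphabet):
    """All non-empty property bundles over the alphabet, smallest first."""
    props = sorted(alphabet)
    return [frozenset(c) for r in range(1, len(props) + 1)
            for c in combinations(props, r)]

def subsumes(general, specific):
    """True if `general` is at least as general as `specific`: same width,
    and each position of `general` demands only a subset of the properties
    demanded by the corresponding position of `specific`."""
    return (len(general) == len(specific)
            and all(g <= s for g, s in zip(general, specific)))

def occurs_in(structure, word):
    """A structure occurs in a word model (a tuple of property sets, one per
    position) if it subsumes some contiguous factor of that word."""
    k = len(structure)
    return any(subsumes(structure, tuple(word[i:i + k]))
               for i in range(len(word) - k + 1))

def learn_forbidden(sample, k, alphabet):
    """Toy positive-only learner over width-k structures: visit candidates
    from most general to most specific, forbid any candidate that never
    occurs in the sample, and use the partial order to prune every
    refinement of an already-forbidden structure."""
    bundles = nonempty_subsets(alphabet)
    candidates = sorted(product(bundles, repeat=k),
                        key=lambda s: sum(len(p) for p in s))
    grammar = []
    for cand in candidates:
        if any(subsumes(banned, cand) for banned in grammar):
            continue  # a more general structure is already forbidden
        if not any(occurs_in(cand, word) for word in sample):
            grammar.append(cand)
    return grammar

# Usage: positions carry bundles of properties rather than a single symbol.
V, C, STRESS = "vowel", "consonant", "stress"
raw = [
    [{C}, {V, STRESS}, {C}, {V}],        # a CVCV word with initial stress
    [{C}, {V, STRESS}, {C}, {V}, {C}],   # a CVCVC word with initial stress
]
sample = [tuple(frozenset(p) for p in w) for w in raw]
print(learn_forbidden(sample, 2, {V, C, STRESS}))

Because a structure that demands fewer properties occurs in every word its refinements occur in, forbidding only the most general absent structures is enough to exclude all of their refinements; this is the sense in which the ordering relation prunes the hypothesis space in the sketch above.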