Are Transformers a Modern Version of ELIZA? Observations on French Object Verb Agreement
Related Concepts
Transformer
Computer science
Sentence
Agreement
Natural language processing
Verb
Artificial intelligence
Heuristics
Artificial neural network
Object (grammar)
Parsing
Linguistics
Bingzhi Li, Guillaume Wisniewski, Benoît Crabbé
2021 · Open Access
DOI: https://doi.org/10.48550/arxiv.2109.10133
OA: W3213202614
Many recent works have demonstrated that unsupervised sentence representations of neural networks encode syntactic information by observing that neural language models are able to predict the agreement between a verb and its subject. We take a critical look at this line of research by showing that it is possible to achieve high accuracy on this agreement task with simple surface heuristics, indicating a possible flaw in our assessment of neural networks' syntactic ability. Our fine-grained analyses of results on long-range French object-verb agreement show that, contrary to LSTMs, Transformers are able to capture a non-trivial amount of grammatical structure.
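The "simple surface heuristics" mentioned above can be made concrete with a toy baseline. The sketch below is hypothetical and not taken from the paper; the function name and the toy data are purely illustrative. It predicts the number of a French past participle by copying the number of the closest preceding noun, a linear cue that succeeds on many short sentences but fails on the long-range object-verb dependencies the authors analyze.

```python
# Hypothetical sketch of a surface heuristic baseline for number agreement
# (not the paper's method): copy the number of the most recent noun to the
# left of the target verb. Function name and toy data are illustrative only.

def most_recent_noun_heuristic(tokens, target_index):
    """Return 'sg' or 'pl' for the verb at target_index by scanning left
    for the closest token tagged as a noun and copying its number."""
    for token in reversed(tokens[:target_index]):
        if token["pos"] == "NOUN":
            return token["number"]
    return "sg"  # fall back to singular if no noun precedes the verb

# Toy example: "les pommes que le garçon a mangées"
# ("the apples that the boy ate"; the participle agrees with "pommes", plural)
sentence = [
    {"form": "les",     "pos": "DET",  "number": "pl"},
    {"form": "pommes",  "pos": "NOUN", "number": "pl"},
    {"form": "que",     "pos": "PRON", "number": "pl"},
    {"form": "le",      "pos": "DET",  "number": "sg"},
    {"form": "garçon",  "pos": "NOUN", "number": "sg"},
    {"form": "a",       "pos": "AUX",  "number": "sg"},
    {"form": "mangées", "pos": "VERB", "number": "pl"},  # gold label: plural
]

prediction = most_recent_noun_heuristic(sentence, target_index=6)
print(prediction)  # 'sg' -- the heuristic follows the intervening noun and fails
```

On this example the heuristic agrees with the intervening attractor "garçon" rather than the true controller "pommes", which is exactly the kind of long-range case a fine-grained evaluation can use to separate surface strategies from genuine structural knowledge.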