How Fast can BERT Learn Simple Natural Language Inference?
Yi‐Chung Lin, Keh‐Yih Su
2021 · Open Access
DOI: https://doi.org/10.18653/v1/2021.eacl-main.51
This paper empirically studies whether BERT can really learn to conduct natural language inference (NLI) without exploiting hidden dataset biases, and, if so, how efficiently it can learn. This is done by creating a simple entailment-judgment task that involves only binary predicates in plain English. The results show that BERT learns this task very slowly. However, learning efficiency improves greatly (the required training data shrinks by a factor of 1,500) when task-related features are added. This suggests that domain knowledge greatly helps when conducting NLI with neural networks.
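The abstract does not spell out how the entailment cases are constructed, so the sketch below is purely illustrative of the kind of task described: premise-hypothesis pairs over binary predicates in plain English, where a hypothesis is entailed only if it restates a premise fact. The names (ENTITIES, PREDICATES, make_fact, generate_example) are invented for this sketch and are not the authors' code.

```python
import random

# Hypothetical toy generator for binary-predicate entailment pairs.
# Not the paper's released code; an illustration of the task setup only.

ENTITIES = ["Alice", "Bob", "Carol", "David"]
PREDICATES = ["likes", "knows", "follows"]


def make_fact(subj: str, pred: str, obj: str) -> str:
    """Render a single binary-predicate fact in plain English."""
    return f"{subj} {pred} {obj}."


def generate_example(rng: random.Random):
    """Build a two-fact premise and a hypothesis; the label is
    'entailment' iff the hypothesis restates one of the premise facts."""
    s1, o1, s2, o2 = rng.sample(ENTITIES, 4)
    p1, p2 = rng.sample(PREDICATES, 2)
    facts = [make_fact(s1, p1, o1), make_fact(s2, p2, o2)]
    premise = " ".join(facts)
    if rng.random() < 0.5:
        hypothesis = rng.choice(facts)      # stated fact -> entailed
        label = "entailment"
    else:
        hypothesis = make_fact(o1, p1, s1)  # arguments swapped -> not entailed
        label = "not_entailment"
    return premise, hypothesis, label


if __name__ == "__main__":
    rng = random.Random(0)
    for _ in range(3):
        print(generate_example(rng))
```

A controlled toy task like this keeps the inference step free of dataset artifacts, which is what makes it possible to measure how much data BERT actually needs to learn the entailment judgment itself.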
Concepts
Computer science
Inference
Artificial intelligence
Task (project management)
Simple (philosophy)
Natural language processing
Natural language
Process (computing)
Logical consequence
Textual entailment
Domain (mathematical analysis)
Binary number
Natural (archaeology)
Artificial neural network
Machine learning
Programming language
Arithmetic
History
Mathematics
Epistemology
Philosophy
Mathematical analysis
Archaeology
Management
Economics
Metadata
- Type: article
- Language: en
- Landing Page: https://doi.org/10.18653/v1/2021.eacl-main.51, https://aclanthology.org/2021.eacl-main.51.pdf
- OA Status: gold
- Cited By: 8
- References: 43
- Related Works: 10
- OpenAlex ID: https://openalex.org/W3154482395