Mark Johnson
Locus coeruleus tonic upregulation increases selectivity to inconspicuous auditory information in autistic compared to non-autistic individuals: a combined pupillometry and electroencephalography study
Background: Sensory processing requires selectivity to salient sensory input. Many autistic individuals report different sensory processing, which has been associated with altered sensory selectivity. The locus-coeruleus norepinephrine (LC-…
Physically-constrained evapotranspiration models with machine learning parameterization outperform pure machine learning: Critical role of domain knowledge
Physics-informed machine learning techniques have emerged to tackle challenges inherent in pure machine learning (ML) approaches. One such technique, the hybrid approach, has been introduced to estimate terrestrial evapotranspiration (ET),…
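A minimal sketch of the hybrid idea described above, under assumptions of my own: a standard physical evapotranspiration formulation (here a Penman-Monteith-style equation) is kept intact, and a machine-learning regressor only supplies a hard-to-prescribe parameter (surface resistance). The variable names, the use of scikit-learn, and the training data are illustrative, not the paper's implementation.

```python
# Hybrid physics + ML sketch: the ML model parameterizes r_s, the physics does the rest.
from sklearn.ensemble import GradientBoostingRegressor

def penman_monteith(Rn, G, delta, gamma, rho_a, cp, vpd, r_a, r_s):
    """Latent heat flux (W m^-2) from a Penman-Monteith-style equation."""
    return (delta * (Rn - G) + rho_a * cp * vpd / r_a) / (delta + gamma * (1.0 + r_s / r_a))

# ML parameterization: learn surface resistance r_s from environmental drivers
# (e.g. radiation, temperature, humidity, leaf area index) instead of prescribing it.
rs_model = GradientBoostingRegressor()
# rs_model.fit(X_drivers, rs_inferred_from_flux_towers)   # hypothetical training data

def hybrid_et(met, drivers):
    r_s = rs_model.predict(drivers)            # learned parameter
    return penman_monteith(met["Rn"], met["G"], met["delta"], met["gamma"],
                           met["rho_a"], met["cp"], met["vpd"], met["r_a"], r_s)
```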
Mastering the Craft of Data Synthesis for CodeLLMs
Large language models (LLMs) have shown impressive performance in code understanding and generation, making coding tasks a key focus for researchers due to their practical applications and value as a testbed for LLM evaluation. Data…
A Language-agnostic Model of Child Language Acquisition
This work reimplements a recent semantic bootstrapping child-language acquisition model, which was originally designed for English, and trains it to learn a new language: Hebrew. The model learns from pairs of utterances and logical forms …
A Multilingual Model of Child Language Acquisition
Sources of Hallucination by Large Language Models on Inference Tasks
Large Language Models (LLMs) are claimed to be capable of Natural Language Inference (NLI), necessary for applied tasks like question answering and summarization. We present a series of behavioral studies on several LLM families (LLaMA, GP…
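A minimal sketch of the kind of behavioral NLI probe the abstract describes: present premise/hypothesis pairs to a generative LLM and record its yes/no judgements. The prompt wording and the `generate` callable are placeholders of my own, not the paper's actual setup.

```python
# Behavioral NLI probe: ask the model whether the premise entails the hypothesis.
def nli_probe(generate, pairs):
    """`generate(prompt) -> str` wraps whichever LLM is being studied."""
    results = []
    for premise, hypothesis in pairs:
        prompt = (f"Premise: {premise}\n"
                  f"Hypothesis: {hypothesis}\n"
                  "Does the premise entail the hypothesis? Answer yes or no:")
        answer = generate(prompt).strip().lower()
        results.append((premise, hypothesis, answer.startswith("yes")))
    return results

# Example items probing both directions of an inference, useful for spotting
# frequency- or memorization-driven "hallucinated" entailments.
pairs = [("The dog chased the cat.", "The cat was chased by the dog."),
         ("The cat was chased by the dog.", "The dog chased the cat.")]
```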
Language
Language is a system for communicating with other people using sounds, symbols, and words to express a meaning, idea, or thought. This chapter starts with the question of whether language is “biologically special”. This issue refers to the…
Opening Doors to Recovery
Trial Registration: ClinicalTrials.gov identifier: NCT04612777.
Smoothing Entailment Graphs with Language Models
Nick McKenna, Tianyi Li, Mark Johnson, Mark Steedman. Proceedings of the 13th International Joint Conference on Natural Language Processing and the 3rd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics…
Impressions of the GDMC AI Settlement Generation Challenge in Minecraft
The GDMC AI settlement generation challenge is a procedural content generation (PCG) competition about producing an algorithm that can create a settlement in the game Minecraft. In contrast to the majority of AI competitions, the GDMC entr…
Disfluency detection using a noisy channel model and deep neural language model
Although speech recognition technology has improved considerably in recent years, current systems still output simply a sequence of words without any useful information about the location of disfluencies. On the other hand, such informatio…
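A minimal sketch of the noisy-channel formulation named in the title: each candidate "cleaned" string X is scored by a channel model P(Y | X), explaining how the disfluent observation Y could arise from X, combined with a language-model prior P(X). The two scoring callables below are placeholders standing in for the paper's channel model and deep neural language model.

```python
# Noisy-channel disfluency detection: pick the fluent candidate that best
# explains the disfluent observation under channel + LM scores (log domain).
def best_fluent(candidates, disfluent, channel_logprob, lm_logprob, lam=1.0):
    """Return argmax_X  log P(Y | X) + lam * log P(X)."""
    return max(candidates,
               key=lambda x: channel_logprob(disfluent, x) + lam * lm_logprob(x))

# e.g. disfluent = "I want a flight to Boston uh I mean to Denver"
#      candidates might include "I want a flight to Denver"
```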
NMDC 2021 Plenary and Invited Speakers
Incorporating Temporal Information in Entailment Graph Mining
We present a novel method for injecting temporality into entailment graphs to address the problem of spurious entailments, which may arise from similar but temporally distinct events involving the same pair of entities. We focus on the spo…
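A minimal sketch of the general idea of injecting temporality into entailment graph mining: only count two predicates as distributional evidence for entailment when they hold of the same entity pair within the same time window, so temporally distinct events involving that pair no longer collide. The window size and data layout are assumptions for illustration, not the paper's method.

```python
# Time-windowed co-occurrence counting over extracted (predicate, subject, object, time) tuples.
from collections import defaultdict
from itertools import combinations

def cooccurrence_counts(triples, window_days=30):
    """triples: iterable of (predicate, subject, obj, day_index)."""
    buckets = defaultdict(set)          # (entity pair, time window) -> predicates seen
    for pred, subj, obj, day in triples:
        buckets[((subj, obj), day // window_days)].add(pred)
    counts = defaultdict(int)
    for preds in buckets.values():
        for p, q in combinations(sorted(preds), 2):
            counts[(p, q)] += 1         # evidence that p and q may be entailment-related
    return counts
```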
Neural Rule-Execution Tracking Machine For Transformer-Based Text Generation
Sequence-to-Sequence (S2S) neural text generation models, especially the pre-trained ones (e.g., BART and T5), have exhibited compelling performance on various natural language generation tasks. However, the black-box nature of these model…
ECOL-R: Encouraging Copying in Novel Object Captioning with Reinforcement Learning
Novel Object Captioning is a zero-shot Image Captioning task requiring describing objects not seen in the training captions, but for which information is available from external object detectors. The key challenge is to select and describe…
Blindness to Modality Helps Entailment Graph Mining
Understanding linguistic modality is widely seen as important for downstream tasks such as Question Answering and Knowledge Graph Population. Entailment Graph learning might also be expected to benefit from attention to modality. We build …
Mention Flags (MF): Constraining Transformer-based Text Generators
Yufei Wang, Ian Wood, Stephen Wan, Mark Dras, Mark Johnson. Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long …
Open-Domain Contextual Link Prediction and its Complementarity with Entailment Graphs
An open-domain knowledge graph (KG) has entities as nodes and natural language relations as edges, and is constructed by extracting (subject, relation, object) triples from text. The task of open-domain link prediction is to infer missing …
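A minimal sketch of the setup the abstract describes: entities as nodes, natural-language relations as edges built from extracted triples, with link prediction framed as ranking candidate relations for an entity pair. The example triples and the `score` callable are illustrative placeholders, not the paper's model.

```python
# Open-domain KG as (subject, relation, object) triples, plus a ranking-style link-prediction query.
from collections import defaultdict

triples = [("Marie Curie", "was born in", "Warsaw"),
           ("Marie Curie", "won", "the Nobel Prize")]

graph = defaultdict(set)
for subj, rel, obj in triples:
    graph[(subj, obj)].add(rel)         # edges are natural-language relations

def predict_missing(subj, obj, candidate_relations, score):
    """Rank candidate relations for an entity pair with no observed edge.

    `score(subj, rel, obj) -> float` stands in for whichever contextual
    link-prediction model is used."""
    return sorted(candidate_relations,
                  key=lambda rel: score(subj, rel, obj), reverse=True)
```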
Integrating Lexical Information into Entity Neighbourhood Representations for Relation Prediction
Relation prediction informed from a combination of text corpora and curated knowledge bases, combining knowledge graph completion with relation extraction, is a relatively little studied task. A system that can perform this task has the ab…
Detecting and Exorcising Statistical Demons from Language Models with Anti-Models of Negative Data
It's been said that "Language Models are Unsupervised Multitask Learners." Indeed, self-supervised language models trained on "positive" examples of English text generalize in desirable ways to many natural language tasks. But if such mode…
Animal Record Management Using an Embedded RFID-Based System
The current paper describes the design and implementation of a Radio Frequency Identification (RFID) system that uses an embedded microchip in conjunction with an RFID reader for t…
Tablets For Timely Design Documentation
One of the biggest challenges we have experienced in supervising digital systems senio…
Gene Sequence Inspired VHDL Plagiarism Screening
Session 3232. Mark C. Johnson, Curtis Watson, Shawn Davidson, Douglas Eschbach, Purdue…
Saccade dysmetria indicates attenuated visual exploration in autism spectrum disorder
The results leading to this publication have received funding from the Innovative Medicines Initiative 2 Joint Undertaking under grant agreement No 777394 for the project AIMS-2-TRIALS. This Joint Undertaking receives support from the Euro…
Proceedings of the 5th Workshop on Representation Learning for NLP
Existing models for cross-domain named entity recognition (NER) rely on large unlabeled corpora or labeled NER training data in the target domain. However, collecting data for low-resource target domains is not only expensive but also time-c…