Wilfred Ng
Multi-Item-Query Attention for Stable Sequential Recommendation
The inherent instability and noise in user interaction data challenge sequential recommendation systems. Prevailing masked attention models, relying on a single query from the most recent item, are sensitive to this noise, reducing predict…
MermaidFlow: Redefining Agentic Workflow Generation via Safety-Constrained Evolutionary Programming
Despite the promise of autonomous agentic reasoning, existing workflow generation methods frequently produce fragile, unexecutable plans due to unconstrained LLM-driven construction. We introduce MermaidFlow, a framework that redefines the…
Hierarchical Inference for Multidimensional Expressive Qualities in Classical Piano Music: A Novel Approach to Human-Music-Technology Interaction
This paper investigates a hierarchical inference approach for classifying expressive qualities in classical piano performances using .mid files. Through seven iterative experiments, we refine classification methods, examining the impact of…
An Empirical Revisiting of Linguistic Knowledge Fusion in Language Understanding Tasks
Though linguistic knowledge emerges during large-scale language model pretraining, recent work attempts to explicitly incorporate human-defined linguistic priors into task-specific fine-tuning. Infusing language models with syntactic or sem…
Improving Event Representation via Simultaneous Weakly Supervised Contrastive Learning and Clustering
Representations of events described in text are important for various tasks. In this work, we present SWCC: a Simultaneous Weakly supervised Contrastive learning and Clustering framework for event representation learning. SWCC learns event…
CoCoLM: Complex Commonsense Enhanced Language Model with Discourse Relations
Large-scale pre-trained language models have demonstrated strong knowledge representation ability. However, recent studies suggest that even though these giant models contain rich simple commonsense knowledge (e.g., bird can fly and fish c…
CoCoLM: COmplex COmmonsense Enhanced Language Model
Large-scale pre-trained language models have demonstrated strong knowledge representation ability. However, recent studies suggest that even though these giant models contain rich simple commonsense knowledge (e.g., bird can fly and fish …
XDM: Improving Sequential Deep Matching with Unclicked User Behaviors for Recommender System
Deep learning-based sequential recommender systems have recently attracted increasing attention from both academia and industry. Most industrial Embedding-Based Retrieval (EBR) systems for recommendation share similar ideas with sequ…
When Hearst Is not Enough: Improving Hypernymy Detection from Corpus with Distributional Models
We address hypernymy detection, i.e., whether an is-a relationship exists between words (x, y), with the help of large textual corpora. Most conventional approaches to this task have been categorized as either pattern-based or distribut…
Enriching Large-Scale Eventuality Knowledge Graph with Entailment Relations
Computational and cognitive studies suggest that the abstraction of eventualities (activities, states, and events) is crucial for humans to understand daily eventualities. In this paper, we propose a scalable approach to model the entailme…
Multiplex Word Embeddings for Selectional Preference Acquisition
Conventional word embeddings represent words with fixed vectors, which are usually trained based on co-occurrence patterns among words. In doing so, however, the power of such representations is limited, where the same word might be functi…
Hypernymy Detection for Low-Resource Languages via Meta Learning
Hypernymy detection, a.k.a. lexical entailment, is a fundamental sub-task of many natural language understanding tasks. Previous explorations mostly focus on monolingual hypernymy detection in high-resource languages, e.g., English, but fe…
SDM: Sequential Deep Matching Model for Online Large-scale Recommender System
Capturing users' precise preferences is a fundamental problem in large-scale recommender systems. Currently, item-based Collaborative Filtering (CF) methods are common matching approaches in industry. However, they are not effective to mode…
Quegel: A General-Purpose Query-Centric Framework for Querying Big Graphs
Pioneered by Google's Pregel, many distributed systems have been developed for large-scale graph analytics. These systems expose the user-friendly "think like a vertex" programming interface to users, and exhibit good horizontal scalabilit…