Dake Bu
Provably Transformers Harness Multi-Concept Word Semantics for Efficient In-Context Learning
Transformer-based large language models (LLMs) have displayed remarkable creative prowess and emergent capabilities. Existing empirical studies have revealed a strong connection between these LLMs' impressive emergent abilities and their…
Provably Neural Active Learning Succeeds via Prioritizing Perplexing Samples
Neural network-based active learning (NAL) is a cost-effective data selection technique that uses neural networks to select and train on a small subset of samples. While existing work successfully develops various effective or theory-j…