Baris Askin
Language Model Planning from an Information Theoretic Perspective
The extent to which decoder-only language models (LMs) engage in planning, that is, organizing intermediate computations to support coherent long-range generation, remains an open and important question, with implications for interpretabil…
Ravan: Multi-Head Low-Rank Adaptation for Federated Fine-Tuning
Large language models (LLMs) have not yet effectively leveraged the vast amounts of edge-device data, and federated learning (FL) offers a promising paradigm to collaboratively fine-tune LLMs without transferring private edge data to the c…
Federated Communication-Efficient Multi-Objective Optimization
We study a federated version of multi-objective optimization (MOO), where a single model is trained to optimize multiple objective functions. MOO has been extensively studied in the centralized setting but is less explored in federated or …
FedAST: Federated Asynchronous Simultaneous Training
Federated Learning (FL) enables edge devices or clients to collaboratively train machine learning (ML) models without sharing their private data. Much of the existing work in FL focuses on efficiently learning a model for a single task. In…