MTA: A Merge-then-Adapt Framework for Personalized Large Language Model
2025 · Open Access · DOI: https://doi.org/10.48550/arxiv.2511.20072
Personalized Large Language Models (PLLMs) aim to align model outputs with individual user preferences, a crucial capability for user-centric applications. However, the prevalent approach of fine-tuning a separate module for each user faces two major limitations: (1) storage costs scale linearly with the number of users, rendering the method unscalable; and (2) fine-tuning a static model from scratch often yields suboptimal performance for users with sparse data. To address these challenges, we propose MTA, a Merge-then-Adapt framework for PLLMs. MTA comprises three key stages. First, we construct a shared Meta-LoRA Bank by selecting anchor users and pre-training meta-personalization traits within meta-LoRA modules. Second, to ensure scalability and enable dynamic personalization combination beyond static models, we introduce an Adaptive LoRA Fusion stage. This stage retrieves and dynamically merges the most relevant anchor meta-LoRAs to synthesize a user-specific one, thereby eliminating the need for user-specific storage and supporting more flexible personalization. Third, we propose a LoRA Stacking for Few-Shot Personalization stage, which applies an additional ultra-low-rank, lightweight LoRA module on top of the merged LoRA. Fine-tuning this module enables effective personalization under few-shot settings. Extensive experiments on the LaMP benchmark demonstrate that our approach outperforms existing SOTA methods across multiple tasks.
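The abstract outlines two mechanisms that a short sketch can make concrete: merging retrieved anchor meta-LoRAs into a single user-specific LoRA, and stacking an ultra-low-rank trainable LoRA on top for few-shot adaptation. The PyTorch sketch below is a hypothetical illustration of that idea; the module names, ranks, and softmax fusion weights are assumptions for exposition, not the authors' implementation.

```python
# Hypothetical sketch of the Merge-then-Adapt idea from the abstract:
# fuse retrieved anchor meta-LoRAs, then stack a tiny trainable LoRA.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MergedLoRALinear(nn.Module):
    def __init__(self, base: nn.Linear, anchor_loras, sim_scores, stack_rank=2):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad_(False)          # backbone stays frozen

        # Adaptive LoRA Fusion (assumed form): weight each retrieved anchor
        # meta-LoRA by the user's similarity score and merge the deltas.
        w = F.softmax(torch.tensor(sim_scores, dtype=torch.float32), dim=0)
        delta = sum(wi * (B @ A) for wi, (A, B) in zip(w, anchor_loras))
        self.register_buffer("merged_delta", delta)   # no per-user LoRA stored

        # LoRA Stacking: an extra ultra-low-rank adapter is the only trainable
        # part, fine-tuned on the user's few-shot data.
        d_out, d_in = base.weight.shape
        self.stack_A = nn.Parameter(torch.randn(stack_rank, d_in) * 0.01)
        self.stack_B = nn.Parameter(torch.zeros(d_out, stack_rank))

    def forward(self, x):
        w_eff = self.base.weight + self.merged_delta + self.stack_B @ self.stack_A
        return F.linear(x, w_eff, self.base.bias)

# Toy usage: three rank-8 anchor meta-LoRAs on a 64->64 projection.
base = nn.Linear(64, 64)
anchors = [(torch.randn(8, 64) * 0.01, torch.randn(64, 8) * 0.01) for _ in range(3)]
layer = MergedLoRALinear(base, anchors, sim_scores=[0.9, 0.4, 0.1])
out = layer(torch.randn(2, 64))   # only stack_A / stack_B receive gradients
```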
- Type: preprint
- Landing Page: http://arxiv.org/abs/2511.20072
- PDF URL: https://arxiv.org/pdf/2511.20072
- OA Status: green
- OpenAlex ID: https://openalex.org/W4416776690
Raw OpenAlex JSON
- OpenAlex ID: https://openalex.org/W4416776690 (canonical identifier for this work in OpenAlex)
- DOI: https://doi.org/10.48550/arxiv.2511.20072 (Digital Object Identifier)
- Title: MTA: A Merge-then-Adapt Framework for Personalized Large Language Model (work title)
- Type: preprint (OpenAlex work type)
- Publication year: 2025 (year of publication)
- Publication date: 2025-11-25 (full publication date if available)
- Authors: Xiaopeng Li, Yuanjin Zheng, Wanyu Wang, Wentao Zhang, Yiqi Wang, Xuetao Wei, Xiangyu Zhao (list of authors in order)
- Landing page: https://arxiv.org/abs/2511.20072 (publisher landing page)
- PDF URL: https://arxiv.org/pdf/2511.20072 (direct link to full text PDF)
- Open access: Yes (whether a free full text is available)
- OA status: green (open access status per OpenAlex)
- OA URL: https://arxiv.org/pdf/2511.20072 (direct OA link when available)
- Cited by: 0 (total citation count in OpenAlex)
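The fields above (and the full payload below) are a flattened view of the OpenAlex work record. A record like this can be fetched directly from the public OpenAlex API; the minimal sketch below uses Python's `requests` and assumes network access. The printed field names follow the OpenAlex schema shown on this page.

```python
# Minimal sketch: fetch the OpenAlex record for this work and print
# a few of the fields listed above. Requires the `requests` package.
import requests

work_id = "W4416776690"
resp = requests.get(f"https://api.openalex.org/works/{work_id}", timeout=30)
resp.raise_for_status()
work = resp.json()

print(work["display_name"])                 # title
print(work["publication_date"])             # 2025-11-25
print(work["open_access"]["oa_status"])     # green
print(work["best_oa_location"]["pdf_url"])  # arXiv PDF link
print([a["author"]["display_name"] for a in work["authorships"]])
```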
Full payload
| Field | Value |
|---|---|
| id | https://openalex.org/W4416776690 |
| doi | https://doi.org/10.48550/arxiv.2511.20072 |
| ids.doi | https://doi.org/10.48550/arxiv.2511.20072 |
| ids.openalex | https://openalex.org/W4416776690 |
| fwci | |
| type | preprint |
| title | MTA: A Merge-then-Adapt Framework for Personalized Large Language Model |
| biblio.issue | |
| biblio.volume | |
| biblio.last_page | |
| biblio.first_page | |
| is_xpac | False |
| apc_list | |
| apc_paid | |
| language | |
| locations[0].id | pmh:oai:arXiv.org:2511.20072 |
| locations[0].is_oa | True |
| locations[0].source.id | https://openalex.org/S4306400194 |
| locations[0].source.issn | |
| locations[0].source.type | repository |
| locations[0].source.is_oa | True |
| locations[0].source.issn_l | |
| locations[0].source.is_core | False |
| locations[0].source.is_in_doaj | False |
| locations[0].source.display_name | arXiv (Cornell University) |
| locations[0].source.host_organization | https://openalex.org/I205783295 |
| locations[0].source.host_organization_name | Cornell University |
| locations[0].source.host_organization_lineage | https://openalex.org/I205783295 |
| locations[0].license | |
| locations[0].pdf_url | https://arxiv.org/pdf/2511.20072 |
| locations[0].version | submittedVersion |
| locations[0].raw_type | text |
| locations[0].license_id | |
| locations[0].is_accepted | False |
| locations[0].is_published | False |
| locations[0].raw_source_name | |
| locations[0].landing_page_url | http://arxiv.org/abs/2511.20072 |
| locations[1].id | doi:10.48550/arxiv.2511.20072 |
| locations[1].is_oa | True |
| locations[1].source.id | https://openalex.org/S4306400194 |
| locations[1].source.issn | |
| locations[1].source.type | repository |
| locations[1].source.is_oa | True |
| locations[1].source.issn_l | |
| locations[1].source.is_core | False |
| locations[1].source.is_in_doaj | False |
| locations[1].source.display_name | arXiv (Cornell University) |
| locations[1].source.host_organization | https://openalex.org/I205783295 |
| locations[1].source.host_organization_name | Cornell University |
| locations[1].source.host_organization_lineage | https://openalex.org/I205783295 |
| locations[1].license | |
| locations[1].pdf_url | |
| locations[1].version | |
| locations[1].raw_type | article |
| locations[1].license_id | |
| locations[1].is_accepted | False |
| locations[1].is_published | |
| locations[1].raw_source_name | |
| locations[1].landing_page_url | https://doi.org/10.48550/arxiv.2511.20072 |
| indexed_in | arxiv, datacite |
| authorships[0].author.id | https://openalex.org/A5103920696 |
| authorships[0].author.orcid | https://orcid.org/0009-0008-8695-5591 |
| authorships[0].author.display_name | Xiaopeng Li |
| authorships[0].author_position | first |
| authorships[0].raw_author_name | Li, Xiaopeng |
| authorships[0].is_corresponding | False |
| authorships[1].author.id | https://openalex.org/A5016044907 |
| authorships[1].author.orcid | https://orcid.org/0000-0002-5768-367X |
| authorships[1].author.display_name | Yuanjin Zheng |
| authorships[1].author_position | middle |
| authorships[1].raw_author_name | Zheng, Yuanjin |
| authorships[1].is_corresponding | False |
| authorships[2].author.id | https://openalex.org/A5059573675 |
| authorships[2].author.orcid | https://orcid.org/0000-0001-5976-0707 |
| authorships[2].author.display_name | Wanyu Wang |
| authorships[2].author_position | middle |
| authorships[2].raw_author_name | Wang, Wanyu |
| authorships[2].is_corresponding | False |
| authorships[3].author.id | https://openalex.org/A5008772211 |
| authorships[3].author.orcid | https://orcid.org/0000-0002-7532-5550 |
| authorships[3].author.display_name | Wentao Zhang |
| authorships[3].author_position | middle |
| authorships[3].raw_author_name | zhang, wenlin |
| authorships[3].is_corresponding | False |
| authorships[4].author.id | https://openalex.org/A5061701176 |
| authorships[4].author.orcid | https://orcid.org/0000-0001-9594-1919 |
| authorships[4].author.display_name | Yiqi Wang |
| authorships[4].author_position | middle |
| authorships[4].raw_author_name | Wang, Yiqi |
| authorships[4].is_corresponding | False |
| authorships[5].author.id | https://openalex.org/A5003379167 |
| authorships[5].author.orcid | https://orcid.org/0000-0002-4450-2251 |
| authorships[5].author.display_name | Xuetao Wei |
| authorships[5].author_position | middle |
| authorships[5].raw_author_name | Wei, Xuetao |
| authorships[5].is_corresponding | False |
| authorships[6].author.id | https://openalex.org/A5100645854 |
| authorships[6].author.orcid | https://orcid.org/0000-0003-2926-4416 |
| authorships[6].author.display_name | Xiangyu Zhao |
| authorships[6].author_position | middle |
| authorships[6].raw_author_name | Zhao, Xiangyu |
| authorships[6].is_corresponding | False |
| has_content.pdf | False |
| has_content.grobid_xml | False |
| is_paratext | False |
| open_access.is_oa | True |
| open_access.oa_url | https://arxiv.org/pdf/2511.20072 |
| open_access.oa_status | green |
| open_access.any_repository_has_fulltext | False |
| created_date | 2025-11-28T00:00:00 |
| display_name | MTA: A Merge-then-Adapt Framework for Personalized Large Language Model |
| has_fulltext | False |
| is_retracted | False |
| updated_date | 2025-12-01T00:07:19.613710 |
| primary_topic | |
| cited_by_count | 0 |
| locations_count | 2 |
| best_oa_location.id | pmh:oai:arXiv.org:2511.20072 |
| best_oa_location.is_oa | True |
| best_oa_location.source.id | https://openalex.org/S4306400194 |
| best_oa_location.source.issn | |
| best_oa_location.source.type | repository |
| best_oa_location.source.is_oa | True |
| best_oa_location.source.issn_l | |
| best_oa_location.source.is_core | False |
| best_oa_location.source.is_in_doaj | False |
| best_oa_location.source.display_name | arXiv (Cornell University) |
| best_oa_location.source.host_organization | https://openalex.org/I205783295 |
| best_oa_location.source.host_organization_name | Cornell University |
| best_oa_location.source.host_organization_lineage | https://openalex.org/I205783295 |
| best_oa_location.license | |
| best_oa_location.pdf_url | https://arxiv.org/pdf/2511.20072 |
| best_oa_location.version | submittedVersion |
| best_oa_location.raw_type | text |
| best_oa_location.license_id | |
| best_oa_location.is_accepted | False |
| best_oa_location.is_published | False |
| best_oa_location.raw_source_name | |
| best_oa_location.landing_page_url | http://arxiv.org/abs/2511.20072 |
| primary_location.id | pmh:oai:arXiv.org:2511.20072 |
| primary_location.is_oa | True |
| primary_location.source.id | https://openalex.org/S4306400194 |
| primary_location.source.issn | |
| primary_location.source.type | repository |
| primary_location.source.is_oa | True |
| primary_location.source.issn_l | |
| primary_location.source.is_core | False |
| primary_location.source.is_in_doaj | False |
| primary_location.source.display_name | arXiv (Cornell University) |
| primary_location.source.host_organization | https://openalex.org/I205783295 |
| primary_location.source.host_organization_name | Cornell University |
| primary_location.source.host_organization_lineage | https://openalex.org/I205783295 |
| primary_location.license | |
| primary_location.pdf_url | https://arxiv.org/pdf/2511.20072 |
| primary_location.version | submittedVersion |
| primary_location.raw_type | text |
| primary_location.license_id | |
| primary_location.is_accepted | False |
| primary_location.is_published | False |
| primary_location.raw_source_name | |
| primary_location.landing_page_url | http://arxiv.org/abs/2511.20072 |
| publication_date | 2025-11-25 |
| publication_year | 2025 |
| referenced_works_count | 0 |
| abstract_inverted_index | (word-to-position index of the abstract; full text shown above) |
| cited_by_percentile_year | |
| countries_distinct_count | 0 |
| institutions_distinct_count | 7 |
| citation_normalized_percentile |
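The abstract_inverted_index field (collapsed in the payload table above) maps each word of the abstract to the positions at which it occurs; OpenAlex stores abstracts in this form, and the plain text can be rebuilt by sorting words by position. A minimal sketch:

```python
# Rebuild plain abstract text from an OpenAlex abstract_inverted_index,
# which maps each word to the list of positions where it appears.
def invert_abstract(inverted_index: dict[str, list[int]]) -> str:
    positions = {}
    for word, idxs in inverted_index.items():
        for i in idxs:
            positions[i] = word
    return " ".join(positions[i] for i in sorted(positions))

# Tiny example with the same structure as the collapsed field above.
sample = {"Personalized": [0], "Large": [1], "Language": [2], "Models": [3], "(PLLMs)": [4]}
print(invert_abstract(sample))   # -> "Personalized Large Language Models (PLLMs)"
```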