Adapting Self-Supervised Representations as a Latent Space for Efficient Generation
2025 · Open Access
· DOI: https://doi.org/10.48550/arxiv.2510.14630
We introduce Representation Tokenizer (RepTok), a generative modeling framework that represents an image using a single continuous latent token obtained from self-supervised vision transformers. Building on a pre-trained SSL encoder, we fine-tune only the semantic token embedding and pair it with a generative decoder trained jointly using a standard flow matching objective. This adaptation enriches the token with low-level, reconstruction-relevant details, enabling faithful image reconstruction. To preserve the favorable geometry of the original SSL space, we add a cosine-similarity loss that regularizes the adapted token, ensuring the latent space remains smooth and suitable for generation. Our single-token formulation resolves spatial redundancies of 2D latent spaces and significantly reduces training costs. Despite its simplicity and efficiency, RepTok achieves competitive results on class-conditional ImageNet generation and naturally extends to text-to-image synthesis, reaching competitive zero-shot performance on MS-COCO under extremely limited training budgets. Our findings highlight the potential of fine-tuned SSL representations as compact and effective latent spaces for efficient generative modeling.
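The abstract outlines the full training recipe: a frozen SSL encoder supplies the semantic token, only the token embedding is adapted, and a generative decoder is trained jointly with a flow-matching objective plus a cosine-similarity regularizer. The sketch below is one minimal PyTorch reading of that objective; the module names, tensor shapes, the linear probability path, and the loss weight are illustrative assumptions, not the authors' released code, and the flow-matching step is written in image space only for concreteness.

```python
# Hypothetical sketch of a RepTok-style training objective, pieced together
# from the abstract. frozen_encoder, adapted_encoder, decoder and lambda_cos
# are assumed names/interfaces, not the paper's implementation.
import torch
import torch.nn.functional as F

def training_loss(images, frozen_encoder, adapted_encoder, decoder, lambda_cos=0.1):
    # Reference semantic token from the frozen, pre-trained SSL encoder.
    with torch.no_grad():
        z_ref = frozen_encoder(images)          # (B, D) original SSL token
    # Adapted token: only the semantic token embedding is fine-tuned.
    z = adapted_encoder(images)                 # (B, D) reconstruction-aware token

    # Standard flow matching along a linear path between noise x0 and data x1.
    x1 = images
    x0 = torch.randn_like(x1)
    t = torch.rand(x1.size(0), device=x1.device).view(-1, 1, 1, 1)
    x_t = (1.0 - t) * x0 + t * x1               # interpolated sample
    v_target = x1 - x0                          # target velocity field
    v_pred = decoder(x_t, t.flatten(), z)       # decoder conditioned on the token
    loss_fm = F.mse_loss(v_pred, v_target)

    # Cosine-similarity regularizer keeps the adapted token close to the
    # original SSL geometry so the latent space stays smooth for generation.
    loss_cos = (1.0 - F.cosine_similarity(z, z_ref, dim=-1)).mean()

    return loss_fm + lambda_cos * loss_cos
```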
- Type: preprint
- Landing page: http://arxiv.org/abs/2510.14630
- PDF: https://arxiv.org/pdf/2510.14630
- OA status: green
- OpenAlex ID: https://openalex.org/W4416146193
Raw OpenAlex JSON
- OpenAlex ID: https://openalex.org/W4416146193 (canonical identifier for this work in OpenAlex)
- DOI: https://doi.org/10.48550/arxiv.2510.14630 (Digital Object Identifier)
- Title: Adapting Self-Supervised Representations as a Latent Space for Efficient Generation (work title)
- Type: preprint (OpenAlex work type)
- Publication year: 2025 (year of publication)
- Publication date: 2025-10-16 (full publication date if available)
- Authors: Timy Phan, Josh Susskind, Björn Ommer (list of authors in order)
- Landing page: https://arxiv.org/abs/2510.14630 (publisher landing page)
- PDF URL: https://arxiv.org/pdf/2510.14630 (direct link to full-text PDF)
- Open access: Yes (whether a free full text is available)
- OA status: green (open access status per OpenAlex)
- OA URL: https://arxiv.org/pdf/2510.14630 (direct OA link when available)
- Cited by: 0 (total citation count in OpenAlex)
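The fields above and the full payload below mirror what the public OpenAlex works endpoint returns for this record. As a point of reference, a minimal fetch (assuming the `requests` library is available; the endpoint and field names are standard OpenAlex, but the snippet itself is illustrative and not part of this page) looks like:

```python
import requests

# Retrieve the raw OpenAlex record for this work; the JSON mirrors the
# "Full payload" table below.
resp = requests.get("https://api.openalex.org/works/W4416146193", timeout=30)
resp.raise_for_status()
work = resp.json()

print(work["title"])                       # "Adapting Self-Supervised Representations ..."
print(work["open_access"]["oa_status"])    # "green"
```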
Full payload
| Field | Value |
|---|---|
| id | https://openalex.org/W4416146193 |
| doi | https://doi.org/10.48550/arxiv.2510.14630 |
| ids.doi | https://doi.org/10.48550/arxiv.2510.14630 |
| ids.openalex | https://openalex.org/W4416146193 |
| fwci | |
| type | preprint |
| title | Adapting Self-Supervised Representations as a Latent Space for Efficient Generation |
| biblio.issue | |
| biblio.volume | |
| biblio.last_page | |
| biblio.first_page | |
| is_xpac | False |
| apc_list | |
| apc_paid | |
| language | |
| locations[0].id | pmh:oai:arXiv.org:2510.14630 |
| locations[0].is_oa | True |
| locations[0].source.id | https://openalex.org/S4306400194 |
| locations[0].source.issn | |
| locations[0].source.type | repository |
| locations[0].source.is_oa | True |
| locations[0].source.issn_l | |
| locations[0].source.is_core | False |
| locations[0].source.is_in_doaj | False |
| locations[0].source.display_name | arXiv (Cornell University) |
| locations[0].source.host_organization | https://openalex.org/I205783295 |
| locations[0].source.host_organization_name | Cornell University |
| locations[0].source.host_organization_lineage | https://openalex.org/I205783295 |
| locations[0].license | cc-by |
| locations[0].pdf_url | https://arxiv.org/pdf/2510.14630 |
| locations[0].version | submittedVersion |
| locations[0].raw_type | text |
| locations[0].license_id | https://openalex.org/licenses/cc-by |
| locations[0].is_accepted | False |
| locations[0].is_published | False |
| locations[0].raw_source_name | |
| locations[0].landing_page_url | http://arxiv.org/abs/2510.14630 |
| locations[1].id | doi:10.48550/arxiv.2510.14630 |
| locations[1].is_oa | True |
| locations[1].source.id | https://openalex.org/S4306400194 |
| locations[1].source.issn | |
| locations[1].source.type | repository |
| locations[1].source.is_oa | True |
| locations[1].source.issn_l | |
| locations[1].source.is_core | False |
| locations[1].source.is_in_doaj | False |
| locations[1].source.display_name | arXiv (Cornell University) |
| locations[1].source.host_organization | https://openalex.org/I205783295 |
| locations[1].source.host_organization_name | Cornell University |
| locations[1].source.host_organization_lineage | https://openalex.org/I205783295 |
| locations[1].license | cc-by |
| locations[1].pdf_url | |
| locations[1].version | |
| locations[1].raw_type | article |
| locations[1].license_id | https://openalex.org/licenses/cc-by |
| locations[1].is_accepted | False |
| locations[1].is_published | |
| locations[1].raw_source_name | |
| locations[1].landing_page_url | https://doi.org/10.48550/arxiv.2510.14630 |
| indexed_in | arxiv, datacite |
| authorships[0].author.id | https://openalex.org/A5050152685 |
| authorships[0].author.orcid | https://orcid.org/0000-0002-0446-9610 |
| authorships[0].author.display_name | Timy Phan |
| authorships[0].author_position | last |
| authorships[0].raw_author_name | Phan, Timy |
| authorships[0].is_corresponding | False |
| authorships[1].author.id | https://openalex.org/A5043808400 |
| authorships[1].author.orcid | |
| authorships[1].author.display_name | Josh Susskind |
| authorships[1].author_position | middle |
| authorships[1].raw_author_name | Susskind, Josh |
| authorships[1].is_corresponding | False |
| authorships[2].author.id | https://openalex.org/A5084415727 |
| authorships[2].author.orcid | https://orcid.org/0000-0003-0766-120X |
| authorships[2].author.display_name | Björn Ommer |
| authorships[2].author_position | middle |
| authorships[2].raw_author_name | Ommer, Björn |
| authorships[2].is_corresponding | False |
| has_content.pdf | True |
| has_content.grobid_xml | False |
| is_paratext | False |
| open_access.is_oa | True |
| open_access.oa_url | https://arxiv.org/pdf/2510.14630 |
| open_access.oa_status | green |
| open_access.any_repository_has_fulltext | False |
| created_date | 2025-10-18T00:00:00 |
| display_name | Adapting Self-Supervised Representations as a Latent Space for Efficient Generation |
| has_fulltext | False |
| is_retracted | False |
| updated_date | 2025-11-28T06:21:04.676242 |
| primary_topic | |
| cited_by_count | 0 |
| locations_count | 2 |
| best_oa_location.id | pmh:oai:arXiv.org:2510.14630 |
| best_oa_location.is_oa | True |
| best_oa_location.source.id | https://openalex.org/S4306400194 |
| best_oa_location.source.issn | |
| best_oa_location.source.type | repository |
| best_oa_location.source.is_oa | True |
| best_oa_location.source.issn_l | |
| best_oa_location.source.is_core | False |
| best_oa_location.source.is_in_doaj | False |
| best_oa_location.source.display_name | arXiv (Cornell University) |
| best_oa_location.source.host_organization | https://openalex.org/I205783295 |
| best_oa_location.source.host_organization_name | Cornell University |
| best_oa_location.source.host_organization_lineage | https://openalex.org/I205783295 |
| best_oa_location.license | cc-by |
| best_oa_location.pdf_url | https://arxiv.org/pdf/2510.14630 |
| best_oa_location.version | submittedVersion |
| best_oa_location.raw_type | text |
| best_oa_location.license_id | https://openalex.org/licenses/cc-by |
| best_oa_location.is_accepted | False |
| best_oa_location.is_published | False |
| best_oa_location.raw_source_name | |
| best_oa_location.landing_page_url | http://arxiv.org/abs/2510.14630 |
| primary_location.id | pmh:oai:arXiv.org:2510.14630 |
| primary_location.is_oa | True |
| primary_location.source.id | https://openalex.org/S4306400194 |
| primary_location.source.issn | |
| primary_location.source.type | repository |
| primary_location.source.is_oa | True |
| primary_location.source.issn_l | |
| primary_location.source.is_core | False |
| primary_location.source.is_in_doaj | False |
| primary_location.source.display_name | arXiv (Cornell University) |
| primary_location.source.host_organization | https://openalex.org/I205783295 |
| primary_location.source.host_organization_name | Cornell University |
| primary_location.source.host_organization_lineage | https://openalex.org/I205783295 |
| primary_location.license | cc-by |
| primary_location.pdf_url | https://arxiv.org/pdf/2510.14630 |
| primary_location.version | submittedVersion |
| primary_location.raw_type | text |
| primary_location.license_id | https://openalex.org/licenses/cc-by |
| primary_location.is_accepted | False |
| primary_location.is_published | False |
| primary_location.raw_source_name | |
| primary_location.landing_page_url | http://arxiv.org/abs/2510.14630 |
| publication_date | 2025-10-16 |
| publication_year | 2025 |
| referenced_works_count | 0 |
| abstract_inverted_index.a | 5, 14, 26, 41, 47, 77 |
| abstract_inverted_index.2D | 102 |
| abstract_inverted_index.To | 65 |
| abstract_inverted_index.We | 0 |
| abstract_inverted_index.an | 11 |
| abstract_inverted_index.as | 149 |
| abstract_inverted_index.it | 39 |
| abstract_inverted_index.of | 70, 101, 145 |
| abstract_inverted_index.on | 25, 119, 133 |
| abstract_inverted_index.to | 126 |
| abstract_inverted_index.we | 30, 75 |
| abstract_inverted_index.Our | 95, 140 |
| abstract_inverted_index.SSL | 28, 73, 147 |
| abstract_inverted_index.add | 76 |
| abstract_inverted_index.and | 37, 91, 105, 113, 123, 151 |
| abstract_inverted_index.for | 93, 155 |
| abstract_inverted_index.its | 111 |
| abstract_inverted_index.the | 33, 55, 67, 71, 82, 86, 143 |
| abstract_inverted_index.This | 52 |
| abstract_inverted_index.flow | 49 |
| abstract_inverted_index.from | 20 |
| abstract_inverted_index.loss | 79 |
| abstract_inverted_index.only | 32 |
| abstract_inverted_index.pair | 38 |
| abstract_inverted_index.that | 9, 80 |
| abstract_inverted_index.with | 40, 57 |
| abstract_inverted_index.image | 12, 63 |
| abstract_inverted_index.space | 88 |
| abstract_inverted_index.token | 18, 35, 56 |
| abstract_inverted_index.under | 135 |
| abstract_inverted_index.using | 13, 46 |
| abstract_inverted_index.RepTok | 115 |
| abstract_inverted_index.costs. | 109 |
| abstract_inverted_index.latent | 17, 87, 103, 153 |
| abstract_inverted_index.single | 15 |
| abstract_inverted_index.smooth | 90 |
| abstract_inverted_index.space, | 74 |
| abstract_inverted_index.spaces | 104, 154 |
| abstract_inverted_index.token, | 84 |
| abstract_inverted_index.vision | 22 |
| abstract_inverted_index.Despite | 110 |
| abstract_inverted_index.MS-COCO | 134 |
| abstract_inverted_index.adapted | 83 |
| abstract_inverted_index.compact | 150 |
| abstract_inverted_index.decoder | 43 |
| abstract_inverted_index.extends | 125 |
| abstract_inverted_index.jointly | 45 |
| abstract_inverted_index.limited | 137 |
| abstract_inverted_index.reduces | 107 |
| abstract_inverted_index.remains | 89 |
| abstract_inverted_index.results | 118 |
| abstract_inverted_index.spatial | 99 |
| abstract_inverted_index.trained | 44 |
| abstract_inverted_index.Building | 24 |
| abstract_inverted_index.ImageNet | 121 |
| abstract_inverted_index.achieves | 116 |
| abstract_inverted_index.budgets. | 139 |
| abstract_inverted_index.details, | 60 |
| abstract_inverted_index.enabling | 61 |
| abstract_inverted_index.encoder, | 29 |
| abstract_inverted_index.enriches | 54 |
| abstract_inverted_index.ensuring | 85 |
| abstract_inverted_index.faithful | 62 |
| abstract_inverted_index.findings | 141 |
| abstract_inverted_index.geometry | 69 |
| abstract_inverted_index.matching | 50 |
| abstract_inverted_index.modeling | 7 |
| abstract_inverted_index.obtained | 19 |
| abstract_inverted_index.original | 72 |
| abstract_inverted_index.preserve | 66 |
| abstract_inverted_index.reaching | 129 |
| abstract_inverted_index.resolves | 98 |
| abstract_inverted_index.semantic | 34 |
| abstract_inverted_index.standard | 48 |
| abstract_inverted_index.suitable | 92 |
| abstract_inverted_index.training | 108, 138 |
| abstract_inverted_index.(RepTok), | 4 |
| abstract_inverted_index.Tokenizer | 3 |
| abstract_inverted_index.effective | 152 |
| abstract_inverted_index.efficient | 156 |
| abstract_inverted_index.embedding | 36 |
| abstract_inverted_index.extremely | 136 |
| abstract_inverted_index.favorable | 68 |
| abstract_inverted_index.fine-tune | 31 |
| abstract_inverted_index.framework | 8 |
| abstract_inverted_index.highlight | 142 |
| abstract_inverted_index.introduce | 1 |
| abstract_inverted_index.modeling. | 158 |
| abstract_inverted_index.naturally | 124 |
| abstract_inverted_index.potential | 144 |
| abstract_inverted_index.zero-shot | 131 |
| abstract_inverted_index.adaptation | 53 |
| abstract_inverted_index.continuous | 16 |
| abstract_inverted_index.fine-tuned | 146 |
| abstract_inverted_index.generation | 122 |
| abstract_inverted_index.generative | 6, 42, 157 |
| abstract_inverted_index.low-level, | 58 |
| abstract_inverted_index.objective. | 51 |
| abstract_inverted_index.represents | 10 |
| abstract_inverted_index.simplicity | 112 |
| abstract_inverted_index.synthesis, | 128 |
| abstract_inverted_index.competitive | 117, 130 |
| abstract_inverted_index.efficiency, | 114 |
| abstract_inverted_index.formulation | 97 |
| abstract_inverted_index.generation. | 94 |
| abstract_inverted_index.performance | 132 |
| abstract_inverted_index.pre-trained | 27 |
| abstract_inverted_index.regularizes | 81 |
| abstract_inverted_index.redundancies | 100 |
| abstract_inverted_index.single-token | 96 |
| abstract_inverted_index.significantly | 106 |
| abstract_inverted_index.text-to-image | 127 |
| abstract_inverted_index.transformers. | 23 |
| abstract_inverted_index.Representation | 2 |
| abstract_inverted_index.reconstruction. | 64 |
| abstract_inverted_index.representations | 148 |
| abstract_inverted_index.self-supervised | 21 |
| abstract_inverted_index.class-conditional | 120 |
| abstract_inverted_index.cosine-similarity | 78 |
| abstract_inverted_index.reconstruction-relevant | 59 |
| cited_by_percentile_year | |
| countries_distinct_count | 0 |
| institutions_distinct_count | 3 |
| citation_normalized_percentile |
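The `abstract_inverted_index.*` rows above store the abstract as word-to-position mappings rather than plain text. A small sketch (assuming the index has been loaded as a Python dict, for example from the `work` record fetched earlier) reconstructs the readable abstract:

```python
def rebuild_abstract(inverted_index: dict) -> str:
    """Recover plain text from an OpenAlex-style inverted index."""
    # Each key is a word; each value lists the positions where it occurs.
    positions = {}
    for word, idxs in inverted_index.items():
        for i in idxs:
            positions[i] = word
    # Sorting by position and joining restores the original word order.
    return " ".join(positions[i] for i in sorted(positions))

# Example (using the record fetched from the OpenAlex API):
# abstract_text = rebuild_abstract(work["abstract_inverted_index"])
```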