Light the Way: An Enhanced Generative Adversarial Network Framework for Night-to-Day Image Translation With Improved Quality
2024 · Open Access
· DOI: https://doi.org/10.1109/access.2024.3491792
Driving at night introduces considerable challenges due to reduced visibility, making it essential to explore techniques that enhance road information for drivers. To this end, this research presents a technique that addresses the visibility constraints of night-time driving by converting night-time road images into day-time images using a supervised Generative Adversarial Network (GAN) model, NtD-GAN. Since paired images are required to train supervised GAN models, the research first introduces a novel approach for generating paired night-day datasets, as it is practically infeasible to collect such image pairs in a natural setting owing to dynamic traffic environments. An innovative generator network architecture is proposed for the NtD-GAN. Furthermore, a new approach is proposed for generating and loading initial weights to expedite NtD-GAN training. This initial weight assignment yielded faster convergence of the NtD-GAN, with significant improvements in Inception Score (IS) by 17.3%, in Structural Similarity Index (SSIM) by 5.5%, and in Naturalness Image Quality Evaluator (NIQE) by 10.3%. Moreover, a perceptual loss is introduced into the NtD-GAN training loss function to increase the visual quality of the reconstructed images. The experimental results also demonstrated a 0.23% increase in IS, a 0.07% reduction in Fréchet Inception Distance (FID), a 2.2% increase in SSIM, and a 7% reduction in Blind Referenceless Image Spatial Quality Evaluator (BRISQUE) compared to the NtD-GAN trained without perceptual loss. Comparison with benchmark models likewise showed significant improvement: relative to N2D-GAN, NtD-GAN reduced FID by 14.6%, improved SSIM by 3.4%, improved Peak Signal-to-Noise Ratio (PSNR) by 1.39 dB, and reduced BRISQUE by 0.8%. The implementation of the NtD-GAN model is available at https://github.com/isurushanaka/paired-N2D.
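The perceptual loss mentioned in the abstract compares images in a deep feature space rather than pixel space. A minimal NumPy sketch of the idea, using a hypothetical frozen random-projection feature extractor `phi` in place of the pretrained CNN (e.g. VGG) that such losses normally use; the weighting in `total_loss` is illustrative, not the paper's:

```python
import numpy as np

def phi(img: np.ndarray) -> np.ndarray:
    """Stand-in feature extractor (hypothetical): a fixed random
    projection of 4x4 patches. Real perceptual losses use activations
    from a pretrained CNN such as VGG instead."""
    rng = np.random.default_rng(0)          # fixed seed -> frozen "weights"
    h, w = img.shape
    patches = img.reshape(h // 4, 4, w // 4, 4).transpose(0, 2, 1, 3)
    patches = patches.reshape(-1, 16)       # one row per 4x4 patch
    W = rng.standard_normal((16, 8))        # frozen projection matrix
    return patches @ W

def perceptual_loss(generated: np.ndarray, target: np.ndarray) -> float:
    """Mean squared error between the feature maps of the two images."""
    return float(np.mean((phi(generated) - phi(target)) ** 2))

def total_loss(adv_loss: float, generated, target, lam: float = 10.0) -> float:
    """Generator objective: adversarial term plus weighted perceptual term."""
    return adv_loss + lam * perceptual_loss(generated, target)

day = np.random.default_rng(1).random((16, 16))   # toy "day" image
fake = day + 0.1                                  # slightly-off reconstruction
print(perceptual_loss(day, day))                  # identical images -> 0.0
print(perceptual_loss(fake, day) > 0)             # mismatch -> positive loss
```

Swapping `phi` for a pretrained feature network is what makes the loss "perceptual": errors are penalized where they change semantically meaningful features, not just raw pixels.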
Related Topics
- Type: article
- Language: en
- Landing Page: https://doi.org/10.1109/access.2024.3491792
- OA Status: gold
- Cited By: 2
- References: 90
- Related Works: 10
- OpenAlex ID: https://openalex.org/W4404056951
Raw OpenAlex JSON
- OpenAlex ID: https://openalex.org/W4404056951 (canonical identifier for this work in OpenAlex)
- DOI: https://doi.org/10.1109/access.2024.3491792 (Digital Object Identifier)
- Title: Light the Way: An Enhanced Generative Adversarial Network Framework for Night-to-Day Image Translation With Improved Quality
- Type: article (OpenAlex work type)
- Language: en (primary language)
- Publication year: 2024
- Publication date: 2024-01-01
- Authors: H. K. I. S. Lakmal, Maheshi B. Dissanayake, Supavadee Aramvith (in order)
- Landing page: https://doi.org/10.1109/access.2024.3491792 (publisher landing page)
- Open access: Yes (a free full text is available)
- OA status: gold (per OpenAlex)
- OA URL: https://doi.org/10.1109/access.2024.3491792 (direct OA link)
- Concepts: Image translation, Computer science, Translation (biology), Adversarial system, Image quality, Quality (philosophy), Image (mathematics), Generative adversarial network, Artificial intelligence, Generative grammar, Speech recognition, Computer vision, Chemistry, Philosophy, Biochemistry, Epistemology, Messenger RNA, Gene (top concepts attached by OpenAlex)
- Cited by: 2 (total citation count in OpenAlex)
- Citations by year (recent): 2025: 2
- References (count): 90 (works referenced by this work)
- Related works (count): 10 (works algorithmically related by OpenAlex)
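The summary fields above come straight from the work's JSON record, which the OpenAlex REST API serves at https://api.openalex.org/works/W4404056951. A minimal sketch of extracting the same summary from such a payload; `summarize_work` is an illustrative helper of ours, and the inline dict is a trimmed stand-in for the real response:

```python
# Extract the key summary fields from an OpenAlex work payload.
# `summarize_work` is not part of any library; the `sample` dict is a
# trimmed stand-in for the JSON the API actually returns.

def summarize_work(work: dict) -> dict:
    """Pull out the fields shown on this page from a work record."""
    oa = work.get("open_access", {})
    return {
        "title": work.get("display_name"),
        "year": work.get("publication_year"),
        "doi": work.get("doi"),
        "authors": [a["author"]["display_name"] for a in work.get("authorships", [])],
        "oa_status": oa.get("oa_status"),
        "cited_by": work.get("cited_by_count"),
        "references": work.get("referenced_works_count"),
    }

sample = {
    "display_name": ("Light the Way: An Enhanced Generative Adversarial "
                     "Network Framework for Night-to-Day Image Translation "
                     "With Improved Quality"),
    "publication_year": 2024,
    "doi": "https://doi.org/10.1109/access.2024.3491792",
    "authorships": [
        {"author": {"display_name": "H. K. I. S. Lakmal"}},
        {"author": {"display_name": "Maheshi B. Dissanayake"}},
        {"author": {"display_name": "Supavadee Aramvith"}},
    ],
    "open_access": {"is_oa": True, "oa_status": "gold"},
    "cited_by_count": 2,
    "referenced_works_count": 90,
}

print(summarize_work(sample)["oa_status"])   # gold
```

Fetching the live record is then one HTTP GET against the URL above; the `.get(..., default)` accesses keep the helper tolerant of fields OpenAlex omits for some works.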
Full payload
| id | https://openalex.org/W4404056951 |
|---|---|
| doi | https://doi.org/10.1109/access.2024.3491792 |
| ids.doi | https://doi.org/10.1109/access.2024.3491792 |
| ids.openalex | https://openalex.org/W4404056951 |
| fwci | 1.06031512 |
| type | article |
| title | Light the Way: An Enhanced Generative Adversarial Network Framework for Night-to-Day Image Translation With Improved Quality |
| biblio.issue | |
| biblio.volume | 12 |
| biblio.last_page | 165978 |
| biblio.first_page | 165963 |
| topics[0].id | https://openalex.org/T10775 |
| topics[0].field.id | https://openalex.org/fields/17 |
| topics[0].field.display_name | Computer Science |
| topics[0].score | 0.9399999976158142 |
| topics[0].domain.id | https://openalex.org/domains/3 |
| topics[0].domain.display_name | Physical Sciences |
| topics[0].subfield.id | https://openalex.org/subfields/1707 |
| topics[0].subfield.display_name | Computer Vision and Pattern Recognition |
| topics[0].display_name | Generative Adversarial Networks and Image Synthesis |
| topics[1].id | https://openalex.org/T12357 |
| topics[1].field.id | https://openalex.org/fields/17 |
| topics[1].field.display_name | Computer Science |
| topics[1].score | 0.913100004196167 |
| topics[1].domain.id | https://openalex.org/domains/3 |
| topics[1].domain.display_name | Physical Sciences |
| topics[1].subfield.id | https://openalex.org/subfields/1707 |
| topics[1].subfield.display_name | Computer Vision and Pattern Recognition |
| topics[1].display_name | Digital Media Forensic Detection |
| is_xpac | False |
| apc_list.value | 1850 |
| apc_list.currency | USD |
| apc_list.value_usd | 1850 |
| apc_paid.value | 1850 |
| apc_paid.currency | USD |
| apc_paid.value_usd | 1850 |
| concepts[0].id | https://openalex.org/C2779757391 |
| concepts[0].level | 3 |
| concepts[0].score | 0.7483774423599243 |
| concepts[0].wikidata | https://www.wikidata.org/wiki/Q6002292 |
| concepts[0].display_name | Image translation |
| concepts[1].id | https://openalex.org/C41008148 |
| concepts[1].level | 0 |
| concepts[1].score | 0.7287360429763794 |
| concepts[1].wikidata | https://www.wikidata.org/wiki/Q21198 |
| concepts[1].display_name | Computer science |
| concepts[2].id | https://openalex.org/C149364088 |
| concepts[2].level | 4 |
| concepts[2].score | 0.7099883556365967 |
| concepts[2].wikidata | https://www.wikidata.org/wiki/Q185917 |
| concepts[2].display_name | Translation (biology) |
| concepts[3].id | https://openalex.org/C37736160 |
| concepts[3].level | 2 |
| concepts[3].score | 0.7064083814620972 |
| concepts[3].wikidata | https://www.wikidata.org/wiki/Q1801315 |
| concepts[3].display_name | Adversarial system |
| concepts[4].id | https://openalex.org/C55020928 |
| concepts[4].level | 3 |
| concepts[4].score | 0.597517728805542 |
| concepts[4].wikidata | https://www.wikidata.org/wiki/Q3813865 |
| concepts[4].display_name | Image quality |
| concepts[5].id | https://openalex.org/C2779530757 |
| concepts[5].level | 2 |
| concepts[5].score | 0.5652937889099121 |
| concepts[5].wikidata | https://www.wikidata.org/wiki/Q1207505 |
| concepts[5].display_name | Quality (philosophy) |
| concepts[6].id | https://openalex.org/C115961682 |
| concepts[6].level | 2 |
| concepts[6].score | 0.5477530360221863 |
| concepts[6].wikidata | https://www.wikidata.org/wiki/Q860623 |
| concepts[6].display_name | Image (mathematics) |
| concepts[7].id | https://openalex.org/C2988773926 |
| concepts[7].level | 3 |
| concepts[7].score | 0.5328387022018433 |
| concepts[7].wikidata | https://www.wikidata.org/wiki/Q25104379 |
| concepts[7].display_name | Generative adversarial network |
| concepts[8].id | https://openalex.org/C154945302 |
| concepts[8].level | 1 |
| concepts[8].score | 0.4994087219238281 |
| concepts[8].wikidata | https://www.wikidata.org/wiki/Q11660 |
| concepts[8].display_name | Artificial intelligence |
| concepts[9].id | https://openalex.org/C39890363 |
| concepts[9].level | 2 |
| concepts[9].score | 0.45206722617149353 |
| concepts[9].wikidata | https://www.wikidata.org/wiki/Q36108 |
| concepts[9].display_name | Generative grammar |
| concepts[10].id | https://openalex.org/C28490314 |
| concepts[10].level | 1 |
| concepts[10].score | 0.33087438344955444 |
| concepts[10].wikidata | https://www.wikidata.org/wiki/Q189436 |
| concepts[10].display_name | Speech recognition |
| concepts[11].id | https://openalex.org/C31972630 |
| concepts[11].level | 1 |
| concepts[11].score | 0.3293047249317169 |
| concepts[11].wikidata | https://www.wikidata.org/wiki/Q844240 |
| concepts[11].display_name | Computer vision |
| concepts[12].id | https://openalex.org/C185592680 |
| concepts[12].level | 0 |
| concepts[12].score | 0.0 |
| concepts[12].wikidata | https://www.wikidata.org/wiki/Q2329 |
| concepts[12].display_name | Chemistry |
| concepts[13].id | https://openalex.org/C138885662 |
| concepts[13].level | 0 |
| concepts[13].score | 0.0 |
| concepts[13].wikidata | https://www.wikidata.org/wiki/Q5891 |
| concepts[13].display_name | Philosophy |
| concepts[14].id | https://openalex.org/C55493867 |
| concepts[14].level | 1 |
| concepts[14].score | 0.0 |
| concepts[14].wikidata | https://www.wikidata.org/wiki/Q7094 |
| concepts[14].display_name | Biochemistry |
| concepts[15].id | https://openalex.org/C111472728 |
| concepts[15].level | 1 |
| concepts[15].score | 0.0 |
| concepts[15].wikidata | https://www.wikidata.org/wiki/Q9471 |
| concepts[15].display_name | Epistemology |
| concepts[16].id | https://openalex.org/C105580179 |
| concepts[16].level | 3 |
| concepts[16].score | 0.0 |
| concepts[16].wikidata | https://www.wikidata.org/wiki/Q188928 |
| concepts[16].display_name | Messenger RNA |
| concepts[17].id | https://openalex.org/C104317684 |
| concepts[17].level | 2 |
| concepts[17].score | 0.0 |
| concepts[17].wikidata | https://www.wikidata.org/wiki/Q7187 |
| concepts[17].display_name | Gene |
| keywords[0].id | https://openalex.org/keywords/image-translation |
| keywords[0].score | 0.7483774423599243 |
| keywords[0].display_name | Image translation |
| keywords[1].id | https://openalex.org/keywords/computer-science |
| keywords[1].score | 0.7287360429763794 |
| keywords[1].display_name | Computer science |
| keywords[2].id | https://openalex.org/keywords/translation |
| keywords[2].score | 0.7099883556365967 |
| keywords[2].display_name | Translation (biology) |
| keywords[3].id | https://openalex.org/keywords/adversarial-system |
| keywords[3].score | 0.7064083814620972 |
| keywords[3].display_name | Adversarial system |
| keywords[4].id | https://openalex.org/keywords/image-quality |
| keywords[4].score | 0.597517728805542 |
| keywords[4].display_name | Image quality |
| keywords[5].id | https://openalex.org/keywords/quality |
| keywords[5].score | 0.5652937889099121 |
| keywords[5].display_name | Quality (philosophy) |
| keywords[6].id | https://openalex.org/keywords/image |
| keywords[6].score | 0.5477530360221863 |
| keywords[6].display_name | Image (mathematics) |
| keywords[7].id | https://openalex.org/keywords/generative-adversarial-network |
| keywords[7].score | 0.5328387022018433 |
| keywords[7].display_name | Generative adversarial network |
| keywords[8].id | https://openalex.org/keywords/artificial-intelligence |
| keywords[8].score | 0.4994087219238281 |
| keywords[8].display_name | Artificial intelligence |
| keywords[9].id | https://openalex.org/keywords/generative-grammar |
| keywords[9].score | 0.45206722617149353 |
| keywords[9].display_name | Generative grammar |
| keywords[10].id | https://openalex.org/keywords/speech-recognition |
| keywords[10].score | 0.33087438344955444 |
| keywords[10].display_name | Speech recognition |
| keywords[11].id | https://openalex.org/keywords/computer-vision |
| keywords[11].score | 0.3293047249317169 |
| keywords[11].display_name | Computer vision |
| language | en |
| locations[0].id | doi:10.1109/access.2024.3491792 |
| locations[0].is_oa | True |
| locations[0].source.id | https://openalex.org/S2485537415 |
| locations[0].source.issn | 2169-3536 |
| locations[0].source.type | journal |
| locations[0].source.is_oa | True |
| locations[0].source.issn_l | 2169-3536 |
| locations[0].source.is_core | True |
| locations[0].source.is_in_doaj | True |
| locations[0].source.display_name | IEEE Access |
| locations[0].source.host_organization | https://openalex.org/P4310319808 |
| locations[0].source.host_organization_name | Institute of Electrical and Electronics Engineers |
| locations[0].source.host_organization_lineage | https://openalex.org/P4310319808 |
| locations[0].source.host_organization_lineage_names | Institute of Electrical and Electronics Engineers |
| locations[0].license | cc-by |
| locations[0].pdf_url | |
| locations[0].version | publishedVersion |
| locations[0].raw_type | journal-article |
| locations[0].license_id | https://openalex.org/licenses/cc-by |
| locations[0].is_accepted | True |
| locations[0].is_published | True |
| locations[0].raw_source_name | IEEE Access |
| locations[0].landing_page_url | https://doi.org/10.1109/access.2024.3491792 |
| locations[1].id | pmh:oai:doaj.org/article:5892d681c6df49a494095fcd3a27f518 |
| locations[1].is_oa | False |
| locations[1].source.id | https://openalex.org/S4306401280 |
| locations[1].source.issn | |
| locations[1].source.type | repository |
| locations[1].source.is_oa | False |
| locations[1].source.issn_l | |
| locations[1].source.is_core | False |
| locations[1].source.is_in_doaj | False |
| locations[1].source.display_name | DOAJ (DOAJ: Directory of Open Access Journals) |
| locations[1].source.host_organization | |
| locations[1].source.host_organization_name | |
| locations[1].license | |
| locations[1].pdf_url | |
| locations[1].version | submittedVersion |
| locations[1].raw_type | article |
| locations[1].license_id | |
| locations[1].is_accepted | False |
| locations[1].is_published | False |
| locations[1].raw_source_name | IEEE Access, Vol 12, Pp 165963-165978 (2024) |
| locations[1].landing_page_url | https://doaj.org/article/5892d681c6df49a494095fcd3a27f518 |
| indexed_in | crossref, doaj |
| authorships[0].author.id | https://openalex.org/A5059298268 |
| authorships[0].author.orcid | https://orcid.org/0000-0001-6363-2097 |
| authorships[0].author.display_name | H. K. I. S. Lakmal |
| authorships[0].countries | LK |
| authorships[0].affiliations[0].institution_ids | https://openalex.org/I111199411 |
| authorships[0].affiliations[0].raw_affiliation_string | Department of Electrical and Electronic Engineering, University of Peradeniya, Peradeniya, Sri Lanka |
| authorships[0].institutions[0].id | https://openalex.org/I111199411 |
| authorships[0].institutions[0].ror | https://ror.org/025h79t26 |
| authorships[0].institutions[0].type | education |
| authorships[0].institutions[0].lineage | https://openalex.org/I111199411 |
| authorships[0].institutions[0].country_code | LK |
| authorships[0].institutions[0].display_name | University of Peradeniya |
| authorships[0].author_position | first |
| authorships[0].raw_author_name | H.K.I.S. Lakmal |
| authorships[0].is_corresponding | False |
| authorships[0].raw_affiliation_strings | Department of Electrical and Electronic Engineering, University of Peradeniya, Peradeniya, Sri Lanka |
| authorships[1].author.id | https://openalex.org/A5017191477 |
| authorships[1].author.orcid | https://orcid.org/0000-0001-5209-5441 |
| authorships[1].author.display_name | Maheshi B. Dissanayake |
| authorships[1].countries | LK |
| authorships[1].affiliations[0].institution_ids | https://openalex.org/I111199411 |
| authorships[1].affiliations[0].raw_affiliation_string | Department of Electrical and Electronic Engineering, University of Peradeniya, Peradeniya, Sri Lanka |
| authorships[1].institutions[0].id | https://openalex.org/I111199411 |
| authorships[1].institutions[0].ror | https://ror.org/025h79t26 |
| authorships[1].institutions[0].type | education |
| authorships[1].institutions[0].lineage | https://openalex.org/I111199411 |
| authorships[1].institutions[0].country_code | LK |
| authorships[1].institutions[0].display_name | University of Peradeniya |
| authorships[1].author_position | middle |
| authorships[1].raw_author_name | Maheshi B. Dissanayake |
| authorships[1].is_corresponding | False |
| authorships[1].raw_affiliation_strings | Department of Electrical and Electronic Engineering, University of Peradeniya, Peradeniya, Sri Lanka |
| authorships[2].author.id | https://openalex.org/A5069698375 |
| authorships[2].author.orcid | https://orcid.org/0000-0001-9840-3171 |
| authorships[2].author.display_name | Supavadee Aramvith |
| authorships[2].countries | TH |
| authorships[2].affiliations[0].institution_ids | https://openalex.org/I158708052 |
| authorships[2].affiliations[0].raw_affiliation_string | Multimedia Data Analytics and Processing Research Unit, Department of Electrical Engineering, Faculty of Engineering, Chulalongkorn University, Bangkok, Thailand |
| authorships[2].institutions[0].id | https://openalex.org/I158708052 |
| authorships[2].institutions[0].ror | https://ror.org/028wp3y58 |
| authorships[2].institutions[0].type | education |
| authorships[2].institutions[0].lineage | https://openalex.org/I158708052 |
| authorships[2].institutions[0].country_code | TH |
| authorships[2].institutions[0].display_name | Chulalongkorn University |
| authorships[2].author_position | last |
| authorships[2].raw_author_name | Supavadee Aramvith |
| authorships[2].is_corresponding | False |
| authorships[2].raw_affiliation_strings | Multimedia Data Analytics and Processing Research Unit, Department of Electrical Engineering, Faculty of Engineering, Chulalongkorn University, Bangkok, Thailand |
| has_content.pdf | False |
| has_content.grobid_xml | False |
| is_paratext | False |
| open_access.is_oa | True |
| open_access.oa_url | https://doi.org/10.1109/access.2024.3491792 |
| open_access.oa_status | gold |
| open_access.any_repository_has_fulltext | False |
| created_date | 2025-10-10T00:00:00 |
| display_name | Light the Way: An Enhanced Generative Adversarial Network Framework for Night-to-Day Image Translation With Improved Quality |
| has_fulltext | False |
| is_retracted | False |
| updated_date | 2025-11-06T03:46:38.306776 |
| primary_topic.id | https://openalex.org/T10775 |
| primary_topic.field.id | https://openalex.org/fields/17 |
| primary_topic.field.display_name | Computer Science |
| primary_topic.score | 0.9399999976158142 |
| primary_topic.domain.id | https://openalex.org/domains/3 |
| primary_topic.domain.display_name | Physical Sciences |
| primary_topic.subfield.id | https://openalex.org/subfields/1707 |
| primary_topic.subfield.display_name | Computer Vision and Pattern Recognition |
| primary_topic.display_name | Generative Adversarial Networks and Image Synthesis |
| related_works | https://openalex.org/W2888032422, https://openalex.org/W2996316059, https://openalex.org/W4385421777, https://openalex.org/W4377980832, https://openalex.org/W2905311601, https://openalex.org/W2897769091, https://openalex.org/W2845413374, https://openalex.org/W3005996785, https://openalex.org/W4297411772, https://openalex.org/W4235873501 |
| cited_by_count | 2 |
| counts_by_year[0].year | 2025 |
| counts_by_year[0].cited_by_count | 2 |
| locations_count | 2 |
| best_oa_location.id | doi:10.1109/access.2024.3491792 |
| best_oa_location.is_oa | True |
| best_oa_location.source.id | https://openalex.org/S2485537415 |
| best_oa_location.source.issn | 2169-3536 |
| best_oa_location.source.type | journal |
| best_oa_location.source.is_oa | True |
| best_oa_location.source.issn_l | 2169-3536 |
| best_oa_location.source.is_core | True |
| best_oa_location.source.is_in_doaj | True |
| best_oa_location.source.display_name | IEEE Access |
| best_oa_location.source.host_organization | https://openalex.org/P4310319808 |
| best_oa_location.source.host_organization_name | Institute of Electrical and Electronics Engineers |
| best_oa_location.source.host_organization_lineage | https://openalex.org/P4310319808 |
| best_oa_location.source.host_organization_lineage_names | Institute of Electrical and Electronics Engineers |
| best_oa_location.license | cc-by |
| best_oa_location.pdf_url | |
| best_oa_location.version | publishedVersion |
| best_oa_location.raw_type | journal-article |
| best_oa_location.license_id | https://openalex.org/licenses/cc-by |
| best_oa_location.is_accepted | True |
| best_oa_location.is_published | True |
| best_oa_location.raw_source_name | IEEE Access |
| best_oa_location.landing_page_url | https://doi.org/10.1109/access.2024.3491792 |
| primary_location.id | doi:10.1109/access.2024.3491792 |
| primary_location.is_oa | True |
| primary_location.source.id | https://openalex.org/S2485537415 |
| primary_location.source.issn | 2169-3536 |
| primary_location.source.type | journal |
| primary_location.source.is_oa | True |
| primary_location.source.issn_l | 2169-3536 |
| primary_location.source.is_core | True |
| primary_location.source.is_in_doaj | True |
| primary_location.source.display_name | IEEE Access |
| primary_location.source.host_organization | https://openalex.org/P4310319808 |
| primary_location.source.host_organization_name | Institute of Electrical and Electronics Engineers |
| primary_location.source.host_organization_lineage | https://openalex.org/P4310319808 |
| primary_location.source.host_organization_lineage_names | Institute of Electrical and Electronics Engineers |
| primary_location.license | cc-by |
| primary_location.pdf_url | |
| primary_location.version | publishedVersion |
| primary_location.raw_type | journal-article |
| primary_location.license_id | https://openalex.org/licenses/cc-by |
| primary_location.is_accepted | True |
| primary_location.is_published | True |
| primary_location.raw_source_name | IEEE Access |
| primary_location.landing_page_url | https://doi.org/10.1109/access.2024.3491792 |
| publication_date | 2024-01-01 |
| publication_year | 2024 |
| referenced_works | https://openalex.org/W6754258843, https://openalex.org/W2954360087, https://openalex.org/W3023089346, https://openalex.org/W2963403405, https://openalex.org/W6751871898, https://openalex.org/W6754249846, https://openalex.org/W2962903125, https://openalex.org/W2972043148, https://openalex.org/W2963420272, https://openalex.org/W6745992979, https://openalex.org/W2796286534, https://openalex.org/W2962793481, https://openalex.org/W2963444790, https://openalex.org/W6752009473, https://openalex.org/W2963723198, https://openalex.org/W3196616669, https://openalex.org/W6734074887, https://openalex.org/W3159890710, https://openalex.org/W6755654156, https://openalex.org/W2957407072, https://openalex.org/W2963073614, https://openalex.org/W6754475721, https://openalex.org/W3185390150, https://openalex.org/W3096044019, https://openalex.org/W3096831136, https://openalex.org/W6678815747, https://openalex.org/W2963800363, https://openalex.org/W2962753688, https://openalex.org/W2953096069, https://openalex.org/W2962974533, https://openalex.org/W3034723751, https://openalex.org/W3180675665, https://openalex.org/W3179041635, https://openalex.org/W4312323561, https://openalex.org/W6730095352, https://openalex.org/W2883376126, https://openalex.org/W6739366716, https://openalex.org/W2964019233, https://openalex.org/W3034305572, https://openalex.org/W3196878101, https://openalex.org/W6748733227, https://openalex.org/W2476548250, https://openalex.org/W4221112813, https://openalex.org/W2974687854, https://openalex.org/W4293518983, https://openalex.org/W2977464309, https://openalex.org/W2970048031, https://openalex.org/W3110086937, https://openalex.org/W3156473972, https://openalex.org/W3195471471, https://openalex.org/W2969121634, https://openalex.org/W6843576508, https://openalex.org/W2907859765, https://openalex.org/W4313524862, https://openalex.org/W4312989585, https://openalex.org/W4390505076, https://openalex.org/W3035037798, https://openalex.org/W3173268697, https://openalex.org/W3108316907, https://openalex.org/W4399920061, https://openalex.org/W4401461698, https://openalex.org/W4399765324, https://openalex.org/W4387587622, https://openalex.org/W3213467918, https://openalex.org/W4381490500, https://openalex.org/W4283649772, https://openalex.org/W4386883680, https://openalex.org/W2558027072, https://openalex.org/W3035564946, https://openalex.org/W3108910236, https://openalex.org/W3116324887, https://openalex.org/W2973837922, https://openalex.org/W2339754110, https://openalex.org/W2194775991, https://openalex.org/W2572730214, https://openalex.org/W1677182931, https://openalex.org/W2535388113, https://openalex.org/W2963470893, https://openalex.org/W6637373629, https://openalex.org/W6683590716, https://openalex.org/W2475287302, https://openalex.org/W6718379498, https://openalex.org/W2183341477, https://openalex.org/W6765779288, https://openalex.org/W2133665775, https://openalex.org/W2102166818, https://openalex.org/W1982471090, https://openalex.org/W1964859077, https://openalex.org/W2340897893, https://openalex.org/W3207649350 |
| referenced_works_count | 90 |
| abstract_inverted_index | (word-to-position index of the abstract; the full abstract text is reproduced at the top of this page) |
| cited_by_percentile_year.max | 97 |
| cited_by_percentile_year.min | 95 |
| countries_distinct_count | 2 |
| institutions_distinct_count | 3 |
| citation_normalized_percentile.value | 0.73408699 |
| citation_normalized_percentile.is_in_top_1_percent | False |
| citation_normalized_percentile.is_in_top_10_percent | False |
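OpenAlex stores abstracts as an inverted index (each word maps to the list of positions where it occurs) rather than as plain text, which is why the payload carries an abstract_inverted_index field. A minimal sketch of reconstructing readable text from such a mapping:

```python
def rebuild_abstract(inverted_index: dict[str, list[int]]) -> str:
    """Invert an OpenAlex-style abstract_inverted_index back into text:
    place each word at every position it occupies, then join in order."""
    positions = {}
    for word, idxs in inverted_index.items():
        for i in idxs:
            positions[i] = word
    return " ".join(positions[i] for i in sorted(positions))

# Tiny index using the first few entries of this record's real field:
idx = {
    "Driving": [0], "at": [1], "night": [2],
    "introduces": [3], "considerable": [4], "challenges": [5],
}
print(rebuild_abstract(idx))
# → Driving at night introduces considerable challenges
```

Words that occur more than once (e.g. "the" with many positions) are handled naturally, since every position in each word's list is filled before the final sort-and-join.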