AgroSense: An Integrated Deep Learning System for Crop Recommendation via Soil Image Analysis and Nutrient Profiling
· 2025
· Open Access
· DOI: https://doi.org/10.48550/arxiv.2509.01344
Meeting the increasing global demand for food security and sustainable farming requires intelligent crop recommendation systems that operate in real time. Traditional soil analysis techniques are often slow, labor-intensive, and not suitable for on-field decision-making. To address these limitations, we introduce AgroSense, a deep-learning framework that integrates soil image classification and nutrient profiling to produce accurate and contextually relevant crop recommendations. AgroSense comprises two main components: a Soil Classification Module, which leverages ResNet-18, EfficientNet-B0, and Vision Transformer architectures to categorize soil types from images; and a Crop Recommendation Module, which employs a Multi-Layer Perceptron, XGBoost, LightGBM, and TabNet to analyze structured soil data, including nutrient levels, pH, and rainfall. We curated a multimodal dataset of 10,000 paired samples drawn from publicly available Kaggle repositories, approximately 50,000 soil images across seven classes, and 25,000 nutrient profiles for experimental evaluation. The fused model achieves 98.0% accuracy, with a precision of 97.8%, a recall of 97.7%, and an F1-score of 96.75%, while RMSE and MAE drop to 0.32 and 0.27, respectively. Ablation studies underscore the critical role of multimodal coupling, and statistical validation via t-tests and ANOVA confirms the significance of our improvements. AgroSense offers a practical, scalable solution for real-time decision support in precision agriculture and paves the way for future lightweight multimodal AI systems in resource-constrained environments.
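As described above, the system couples an image branch (ResNet-18, EfficientNet-B0, or a Vision Transformer) with a tabular branch over nutrient levels, pH, and rainfall, and fuses the two for the final crop prediction. The PyTorch sketch below illustrates one way such an image + tabular coupling can be wired; the layer sizes, feature count, crop count, and concatenation-based fusion are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
from torchvision import models

# Illustrative sketch only: a ResNet-18 image branch plus an MLP over tabular
# soil features (nutrient levels, pH, rainfall, ...), fused by concatenation.
# Feature count, crop count, and layer sizes are assumptions, not the paper's design.
class SoilCropModel(nn.Module):
    def __init__(self, num_tabular_features: int = 7, num_crops: int = 22):
        super().__init__()
        backbone = models.resnet18(weights=None)   # soil-image branch
        backbone.fc = nn.Identity()                # expose the 512-d image embedding
        self.image_branch = backbone
        self.tabular_branch = nn.Sequential(       # nutrient/pH/rainfall branch
            nn.Linear(num_tabular_features, 64),
            nn.ReLU(),
            nn.Linear(64, 64),
            nn.ReLU(),
        )
        self.head = nn.Sequential(                 # fused crop classifier
            nn.Linear(512 + 64, 128),
            nn.ReLU(),
            nn.Dropout(0.3),
            nn.Linear(128, num_crops),
        )

    def forward(self, image: torch.Tensor, tabular: torch.Tensor) -> torch.Tensor:
        img_feat = self.image_branch(image)        # (B, 512)
        tab_feat = self.tabular_branch(tabular)    # (B, 64)
        return self.head(torch.cat([img_feat, tab_feat], dim=1))

# Example forward pass on dummy data.
model = SoilCropModel()
logits = model(torch.randn(2, 3, 224, 224), torch.randn(2, 7))
print(logits.shape)  # torch.Size([2, 22])
```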
- Type: preprint
- Language: en
- Landing Page: http://arxiv.org/abs/2509.01344
- PDF: https://arxiv.org/pdf/2509.01344
- OA Status: green
- OpenAlex ID: https://openalex.org/W4416691297
Raw OpenAlex JSON
- OpenAlex ID: https://openalex.org/W4416691297 (canonical identifier for this work in OpenAlex)
- DOI: https://doi.org/10.48550/arxiv.2509.01344 (Digital Object Identifier)
- Title: AgroSense: An Integrated Deep Learning System for Crop Recommendation via Soil Image Analysis and Nutrient Profiling (work title)
- Type: preprint (OpenAlex work type)
- Language: en (primary language)
- Publication year: 2025
- Publication date: 2025-09-01
- Authors: Vinay Pandey, Rakesh Das, Diptaparna Biswas (in order)
- Landing page: https://arxiv.org/abs/2509.01344 (publisher landing page)
- PDF URL: https://arxiv.org/pdf/2509.01344 (direct link to the full-text PDF)
- Open access: Yes (a free full text is available)
- OA status: green (open access status per OpenAlex)
- OA URL: https://arxiv.org/pdf/2509.01344 (direct OA link)
- Cited by: 0 (total citation count in OpenAlex)
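The identifiers above correspond to a record that can be retrieved from the public OpenAlex REST API. A minimal sketch, assuming the `requests` library and the `https://api.openalex.org/works/{id}` endpoint; the email in the `mailto` parameter is a placeholder.

```python
import requests

# Minimal sketch: fetch this work's record from the OpenAlex REST API.
# The "mailto" parameter is an optional courtesy identifier for the polite pool.
OPENALEX_ID = "W4416691297"
url = f"https://api.openalex.org/works/{OPENALEX_ID}"

resp = requests.get(url, params={"mailto": "you@example.com"}, timeout=30)
resp.raise_for_status()
work = resp.json()

print(work["display_name"])                      # title
print(work["doi"])                               # DOI URL
print(work["open_access"]["oa_url"])             # direct OA link
print([a["author"]["display_name"] for a in work["authorships"]])
```

The printed fields (`display_name`, `doi`, `open_access.oa_url`, `authorships`) match the keys shown in the full payload below.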
Full payload
| Field | Value |
|---|---|
| id | https://openalex.org/W4416691297 |
| doi | https://doi.org/10.48550/arxiv.2509.01344 |
| ids.doi | https://doi.org/10.48550/arxiv.2509.01344 |
| ids.openalex | https://openalex.org/W4416691297 |
| fwci | |
| type | preprint |
| title | AgroSense: An Integrated Deep Learning System for Crop Recommendation via Soil Image Analysis and Nutrient Profiling |
| biblio.issue | |
| biblio.volume | |
| biblio.last_page | |
| biblio.first_page | |
| is_xpac | False |
| apc_list | |
| apc_paid | |
| language | en |
| locations[0].id | pmh:oai:arXiv.org:2509.01344 |
| locations[0].is_oa | True |
| locations[0].source.id | https://openalex.org/S4306400194 |
| locations[0].source.issn | |
| locations[0].source.type | repository |
| locations[0].source.is_oa | True |
| locations[0].source.issn_l | |
| locations[0].source.is_core | False |
| locations[0].source.is_in_doaj | False |
| locations[0].source.display_name | arXiv (Cornell University) |
| locations[0].source.host_organization | https://openalex.org/I205783295 |
| locations[0].source.host_organization_name | Cornell University |
| locations[0].source.host_organization_lineage | https://openalex.org/I205783295 |
| locations[0].license | |
| locations[0].pdf_url | https://arxiv.org/pdf/2509.01344 |
| locations[0].version | submittedVersion |
| locations[0].raw_type | text |
| locations[0].license_id | |
| locations[0].is_accepted | False |
| locations[0].is_published | False |
| locations[0].raw_source_name | |
| locations[0].landing_page_url | http://arxiv.org/abs/2509.01344 |
| locations[1].id | doi:10.48550/arxiv.2509.01344 |
| locations[1].is_oa | True |
| locations[1].source.id | https://openalex.org/S4306400194 |
| locations[1].source.issn | |
| locations[1].source.type | repository |
| locations[1].source.is_oa | True |
| locations[1].source.issn_l | |
| locations[1].source.is_core | False |
| locations[1].source.is_in_doaj | False |
| locations[1].source.display_name | arXiv (Cornell University) |
| locations[1].source.host_organization | https://openalex.org/I205783295 |
| locations[1].source.host_organization_name | Cornell University |
| locations[1].source.host_organization_lineage | https://openalex.org/I205783295 |
| locations[1].license | cc-by |
| locations[1].pdf_url | |
| locations[1].version | |
| locations[1].raw_type | article |
| locations[1].license_id | https://openalex.org/licenses/cc-by |
| locations[1].is_accepted | False |
| locations[1].is_published | |
| locations[1].raw_source_name | |
| locations[1].landing_page_url | https://doi.org/10.48550/arxiv.2509.01344 |
| indexed_in | arxiv, datacite |
| authorships[0].author.id | https://openalex.org/A5103231223 |
| authorships[0].author.orcid | https://orcid.org/0009-0003-8855-1287 |
| authorships[0].author.display_name | Vinay Pandey |
| authorships[0].author_position | first |
| authorships[0].raw_author_name | Pandey, Vishal |
| authorships[0].is_corresponding | False |
| authorships[1].author.id | https://openalex.org/A5054072292 |
| authorships[1].author.orcid | https://orcid.org/0000-0002-7186-942X |
| authorships[1].author.display_name | Rakesh Das |
| authorships[1].author_position | middle |
| authorships[1].raw_author_name | Das, Ranjita |
| authorships[1].is_corresponding | False |
| authorships[2].author.id | https://openalex.org/A5107925292 |
| authorships[2].author.orcid | |
| authorships[2].author.display_name | Diptaparna Biswas |
| authorships[2].author_position | last |
| authorships[2].raw_author_name | Biswas, Debasmita |
| authorships[2].is_corresponding | False |
| has_content.pdf | False |
| has_content.grobid_xml | False |
| is_paratext | False |
| open_access.is_oa | True |
| open_access.oa_url | https://arxiv.org/pdf/2509.01344 |
| open_access.oa_status | green |
| open_access.any_repository_has_fulltext | False |
| created_date | 2025-10-10T00:00:00 |
| display_name | AgroSense: An Integrated Deep Learning System for Crop Recommendation via Soil Image Analysis and Nutrient Profiling |
| has_fulltext | False |
| is_retracted | False |
| updated_date | 2025-11-28T20:46:46.080685 |
| primary_topic | |
| cited_by_count | 0 |
| locations_count | 2 |
| best_oa_location.id | pmh:oai:arXiv.org:2509.01344 |
| best_oa_location.is_oa | True |
| best_oa_location.source.id | https://openalex.org/S4306400194 |
| best_oa_location.source.issn | |
| best_oa_location.source.type | repository |
| best_oa_location.source.is_oa | True |
| best_oa_location.source.issn_l | |
| best_oa_location.source.is_core | False |
| best_oa_location.source.is_in_doaj | False |
| best_oa_location.source.display_name | arXiv (Cornell University) |
| best_oa_location.source.host_organization | https://openalex.org/I205783295 |
| best_oa_location.source.host_organization_name | Cornell University |
| best_oa_location.source.host_organization_lineage | https://openalex.org/I205783295 |
| best_oa_location.license | |
| best_oa_location.pdf_url | https://arxiv.org/pdf/2509.01344 |
| best_oa_location.version | submittedVersion |
| best_oa_location.raw_type | text |
| best_oa_location.license_id | |
| best_oa_location.is_accepted | False |
| best_oa_location.is_published | False |
| best_oa_location.raw_source_name | |
| best_oa_location.landing_page_url | http://arxiv.org/abs/2509.01344 |
| primary_location.id | pmh:oai:arXiv.org:2509.01344 |
| primary_location.is_oa | True |
| primary_location.source.id | https://openalex.org/S4306400194 |
| primary_location.source.issn | |
| primary_location.source.type | repository |
| primary_location.source.is_oa | True |
| primary_location.source.issn_l | |
| primary_location.source.is_core | False |
| primary_location.source.is_in_doaj | False |
| primary_location.source.display_name | arXiv (Cornell University) |
| primary_location.source.host_organization | https://openalex.org/I205783295 |
| primary_location.source.host_organization_name | Cornell University |
| primary_location.source.host_organization_lineage | https://openalex.org/I205783295 |
| primary_location.license | |
| primary_location.pdf_url | https://arxiv.org/pdf/2509.01344 |
| primary_location.version | submittedVersion |
| primary_location.raw_type | text |
| primary_location.license_id | |
| primary_location.is_accepted | False |
| primary_location.is_published | False |
| primary_location.raw_source_name | |
| primary_location.landing_page_url | http://arxiv.org/abs/2509.01344 |
| publication_date | 2025-09-01 |
| publication_year | 2025 |
| referenced_works_count | 0 |
| abstract_inverted_index | word → position mapping of the full abstract (duplicates the abstract shown at the top of this page; see the reconstruction sketch after this table) |
| cited_by_percentile_year | |
| countries_distinct_count | 0 |
| institutions_distinct_count | 3 |
| citation_normalized_percentile |
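The `abstract_inverted_index` field summarized in the table stores the abstract as a mapping from each word to the list of positions where it occurs. Below is a minimal sketch of reconstructing plain text from such a mapping; the `sample` dictionary is only a five-word excerpt taken from the payload above, not the full field.

```python
# Minimal sketch: rebuild an abstract from an OpenAlex-style inverted index,
# i.e. a dict mapping each word to the list of positions where it occurs.
def reconstruct_abstract(inverted_index: dict[str, list[int]]) -> str:
    # Place every word at each of its positions, then join in position order.
    positions: dict[int, str] = {}
    for word, idxs in inverted_index.items():
        for idx in idxs:
            positions[idx] = word
    return " ".join(positions[i] for i in sorted(positions))

# Excerpt of the index from the payload above (positions 0-4 only).
sample = {
    "Meeting": [0],
    "the": [1],
    "increasing": [2],
    "global": [3],
    "demand": [4],
}
print(reconstruct_abstract(sample))  # -> "Meeting the increasing global demand"
```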