
Description: AGIcam is an affordable, solar-powered, Raspberry Pi-based IoT sensor system designed for high-frequency, plot-level data acquisition. Data from the AGIcam system were used to develop time-series models of wheat yield using machine learning (Random Forest) and deep learning (Long Short-Term Memory, LSTM). This dataset provides the raw and preprocessed inputs used in those analyses, and accompanies the manuscript “AGIcam: An Open-Source IoT-Based Camera System for Automated In-Field Phenotyping and Yield Prediction.” The dataset includes synchronized vegetation index (VI) metrics, weather data, and related agronomic trait data, along with sample images collected with the AGIcam system during spring and winter wheat field trials in the 2022 growing season.

Dataset Content: Vegetation Index, Weather, and Agronomic Trait Data (Spring_wheat_data.csv and Winter_wheat_data.csv). The AGIcam dataset includes synchronized vegetation indices, weather measurements, and agronomic trait data. Each row in the dataset represents a single VI measurement extracted from AGIcam images, paired with weather parameters recorded at the corresponding time point. Plot-level heading date and grain yield (kg/ha) are included for downstream phenotyping and modeling applications.
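Because each CSV row pairs one VI measurement with weather readings at the same time point, a common first step is to load the file and pivot the VI values into one column per index. The sketch below uses pandas on a few synthetic stand-in rows that mimic the documented schema; replace the inline text with `pd.read_csv("Spring_wheat_data.csv", ...)` to work with the real file.

```python
import io
import pandas as pd

# Synthetic stand-in rows mimicking Spring_wheat_data.csv (made-up values)
csv_text = """date,timepoint,wheat,sensor,variety,rep_var,rep_pic,vi,max,mean,median,std,p95,p90,p85,avg_air_temp,humidity,avg_soil_temp_8_in,precip,solar_rad,heading_date,yield_kg/ha
2022-06-01,10,spring,AGIcam8,VAR1,1,1,NDVI,0.91,0.78,0.80,0.05,0.89,0.87,0.85,21.4,55.2,17.8,0.0,612.0,2022-06-15,4200
2022-06-01,14,spring,AGIcam8,VAR1,1,1,NDVI,0.90,0.76,0.78,0.06,0.88,0.86,0.84,25.1,42.7,18.3,0.0,801.0,2022-06-15,4200
2022-06-02,10,spring,AGIcam8,VAR1,1,1,GNDVI,0.85,0.70,0.71,0.04,0.83,0.81,0.79,20.2,60.1,17.5,1.2,540.0,2022-06-15,4200
"""

df = pd.read_csv(io.StringIO(csv_text), parse_dates=["date", "heading_date"])

# Pivot so each vegetation index becomes its own column of daily mean values,
# giving one time series per VI for downstream modeling
daily_vi = df.groupby(["date", "vi"])["mean"].mean().unstack("vi")
print(daily_vi)
```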
Information on column names in the CSV files:
- date: Date of data acquisition
- timepoint: Hour of image capture
- wheat: Wheat type (spring/winter)
- sensor: AGIcam sensor ID
- variety: Genotype identifier
- rep_var: Variety replicate number
- rep_pic: Image replicate number
- vi: Vegetation index name
- max: Maximum VI value
- mean: Mean VI value
- median: Median VI value
- std: Standard deviation of VI values
- p95 / p90 / p85: 95th, 90th, and 85th percentile VI values
- avg_air_temp: Average air temperature (°C)
- humidity: Relative humidity (%)
- avg_soil_temp_8_in: Soil temperature at 8-inch depth (°C)
- precip: Precipitation (mm)
- solar_rad: Solar radiation (W/m²)
- heading_date: Date of heading stage
- yield_kg/ha: Grain yield (kg per hectare)

Sample Images (AGIcam8.zip and AGIcam17.zip): Selected RGB and NoIR images captured from AGIcam sensors to demonstrate image quality and time resolution.

Data Collection and Processing:
- Sensors: AGIcam units (RGB and NoIR cameras)
- Weather: ATMOS 41 sensor station
- Location: Spring and winter wheat field trials in Pullman, Washington State
- Image Frequency: 3 captures per day per sensor

Use Case: This dataset can be used for:
- High-frequency VI and weather data modeling
- Time-series analysis of crop growth
- Machine/deep learning yield prediction
- Sensor fusion in breeding programs

Acknowledgments: This study was funded by the United States Department of Agriculture (USDA) National Institute of Food and Agriculture (NIFA) competitive project (accession number 1028108), Hatch project (accession number 1014919), and Washington State University’s College of Agricultural, Human, and Natural Resource Sciences’ Emerging Research Issues competitive grant opportunity (ERI-20-04). The authors would like to thank Dr. Milton Valencia Ortiz and Kingsley Charles Umani for their support during the field camera installation.

Contact Information: For further inquiries regarding this dataset, please contact Sindhuja Sankaran (s.sankaran@wsu.edu).
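The accompanying manuscript fits Random Forest and LSTM models to these inputs. The sketch below shows the general shape of the Random Forest step with scikit-learn, using synthetic features laid out like the documented VI and weather columns; it is an illustration of the technique, not the authors' exact pipeline, feature set, or hyperparameters.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic plot-level features shaped like the CSV schema:
# mean VI, p95 VI, avg_air_temp (C), humidity (%), solar_rad (W/m^2)
n_plots = 200
X = np.column_stack([
    rng.uniform(0.4, 0.9, n_plots),   # mean
    rng.uniform(0.5, 0.95, n_plots),  # p95
    rng.uniform(10, 30, n_plots),     # avg_air_temp
    rng.uniform(30, 90, n_plots),     # humidity
    rng.uniform(200, 900, n_plots),   # solar_rad
])
# Synthetic yield (kg/ha), loosely driven by mean VI plus noise
y = 3000 + 4000 * X[:, 0] + rng.normal(0, 200, n_plots)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
r2 = model.score(X_test, y_test)
print(f"R^2 on held-out plots: {r2:.2f}")
```

With the real data, the features would instead come from the per-plot VI summaries (mean, p95, etc.) and the paired weather columns, with yield_kg/ha as the target.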
Internet of Things, Time Series Data, Plant breeding
