---------------------------------------------
# Data for Prince Czarnecki et al. 2021. Real-time automated classification of sky conditions using deep learning and edge computing. Remote Sensing 13(19):3859 https://doi.org/10.3390/rs13193859

Preferred citation: Czarnecki, Joby M. P.; Samiappan, Sathish; Wasson, Louis L.; McCraine, C. Daniel (2021). "Mississippi Sky Conditions". Mississippi State University Institutional Repository. Dataset. https://doi.org/xxxxx.

Corresponding Author: Joby Czarnecki, Mississippi State University, joby.czarnecki@msstate.edu

License: CC-BY-NC
---------------------------------------------

## Summary

Data were collected using consumer-grade trail cameras installed across Mississippi (USA) in 2019 and 2020 from March through September. Cameras were angled to collect an oblique, unobstructed view of the sky. Cameras were placed in time-lapse mode and set to collect one image every hour. Approximately 12,000 images were usable for analysis.

Our intent in this work was first to compare deep learning approaches for classifying sky conditions with regard to cloud shadows in agricultural fields using a visible-spectrum camera. Sky conditions, and specifically shadowing from clouds, are critical determinants of the quality of images that can be obtained from low-altitude sensing platforms. Radiometric quality of remotely sensed imagery is crucial for precision agriculture applications because estimates of plant health rely on that underlying quality. We then developed an artificial-intelligence-based edge computing system to fully automate the classification process.

Important points:
- All 12,000 images were labelled by a single human reviewer prior to network training. This labelling dictates the division of images within this dataset.
- The overall task for the human reviewer was to view each image and indicate whether the sky condition was favorable for high-quality image collection with a UAS. Each image was labelled according to the likelihood of (1) good image quality expected or (2) degraded image quality expected in the subsequent UAS operation, based on the opinion of this single reviewer.
- We further quantified the interrater reliability of the reviewer with an intraclass correlation coefficient (ICC(A,1)) (McGraw and Wong 1996) using a blinded, random sample of 102 images (Walter et al. 1998).
- This data labelling may not conform to your individual needs, and you may wish to re-label these data. A sketch for checking agreement between your labels and the provided labels follows this list.
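If you re-label a subset of images and want to report agreement in the same terms, the sketch below shows one way to compute ICC(A,1) (two-way random effects, absolute agreement, single measures) following McGraw and Wong (1996). This is not the authors' code; the function name and the two-rater binary example are illustrative, and the calculation assumes a complete ratings matrix with images as rows and raters as columns.

```python
# Illustrative sketch: ICC(A,1) per McGraw & Wong (1996) for a complete
# ratings matrix of shape (n_images, n_raters), e.g. the provided labels
# alongside your own re-labels coded as 1 = good, 0 = degraded.
import numpy as np

def icc_a1(ratings):
    """Two-way random effects, absolute agreement, single-measures ICC(A,1)."""
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape                       # n images, k raters
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)           # per-image means
    col_means = ratings.mean(axis=0)           # per-rater means

    ss_total = ((ratings - grand) ** 2).sum()
    ss_rows = k * ((row_means - grand) ** 2).sum()
    ss_cols = n * ((col_means - grand) ** 2).sum()
    ss_err = ss_total - ss_rows - ss_cols

    ms_r = ss_rows / (n - 1)                   # between-image mean square
    ms_c = ss_cols / (k - 1)                   # between-rater mean square
    ms_e = ss_err / ((n - 1) * (k - 1))        # residual mean square

    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + (k / n) * (ms_c - ms_e))

# Hypothetical example: five images rated by two raters
print(icc_a1([[1, 1], [0, 0], [1, 0], [1, 1], [0, 0]]))  # ~0.67
```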
---------------------------------------------

## Files and Folders

dataset.zip - contains two folders holding the images which comprise this dataset
readme.txt (this file) - contains summary information, project specifics, and data specifications
filelist.csv - contains a complete list of files included in this dataset

Folders in dataset.zip:
GoodImageQuality - contains all images deemed by the human reviewer as sky conditions which allow for good image quality with UAV flight (2541 images)
DegradedImageQuality - contains all images deemed by the human reviewer as sky conditions which do not allow for good image quality with UAV flight (9393 images)

Image metadata for all images:
Bits per Sample - 24
Color Space - sRGB
EXIF Version - 0220
File Extension - JPG
Format - Joint Photographic Experts Group
Height - 2080
Resolution - 72 dpi
Width - 3744
Camera Manufacturer - Prometheus
Camera Model - BTC5HDP
F-Stop - f/2.4
Focal Length - 42 mm
Metering - center weighted average
Flash - No
ISO - variable

---------------------------------------------

## Materials & Methods

This study utilized Strike Force HD Pro 18 MP consumer-grade trail cameras manufactured by Browning Trail Cameras (Birmingham, Alabama, USA). Cameras were installed at sites across Mississippi (USA) in 2019 and were re-deployed at a subset of sites for the 2020 growing season. These sites were in or near the following cities (* indicates 2020): Aberdeen*, Brooksville*, Caledonia*, Clarksdale, Greenwood, Kosciusko, Leland, Oxford, Starkville*, Rolling Fork, and Verona*. This study was conducted from March through September to align with the crop growing season in the region.

Cameras were angled to collect an oblique, unobstructed view of the sky. Cameras were placed in time-lapse mode and set to collect one image every hour. Images are stored natively on the camera in a proprietary format. Only hourly images collected from 9 AM to 3 PM were utilized; these were extracted as jpg image files. Images collected outside these hours or outside the hourly time-lapse schedule were discarded. Approximately 12,000 images were usable for analysis. Time, date, and location stamps are located at the bottom of each individual image. Times are rounded to the nearest hour where necessary.

The labelling workflow was as follows. Using the available set of hourly trail camera images, each image was visually inspected for the presence of clouds. For the first level of classification, if any amount of cloud presence, however faint, was detected, the image was marked as having clouds. For the second level of classification, the human reviewer assigned each image determined to have clouds in the first classification to a binary class based on whether the sky condition was suitable for UAS collection of high-quality imagery. Our human reviewer has advanced degrees in operational meteorology, long-standing expertise in remote sensing, and specifically five years of experience with unmanned aerial systems-based imaging for agriculture.

---------------------------------------------

## Citations

McGraw, K. O., & Wong, S. P. (1996). Forming inferences about some intraclass correlation coefficients. Psychological Methods, 1(1), 30.

Walter, S., Eliasziw, M., & Donner, A. (1998). Sample size and optimal designs for reliability studies. Statistics in Medicine, 17(1), 101–110.
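---------------------------------------------

## Example Usage

The snippet below is a minimal, illustrative sketch of indexing the two class folders after extracting dataset.zip; the extraction directory name ("dataset") and the integer label coding are assumptions for the example, not part of the dataset itself.

```python
# Illustrative sketch: index the two class folders from an extracted dataset.zip.
from pathlib import Path

DATASET_DIR = Path("dataset")            # assumed folder produced by unzipping dataset.zip
CLASSES = ["GoodImageQuality", "DegradedImageQuality"]

samples = []                             # (image path, integer label) pairs
for label, class_name in enumerate(CLASSES):
    for image_path in sorted((DATASET_DIR / class_name).glob("*.JPG")):
        samples.append((image_path, label))

good = sum(1 for _, y in samples if y == 0)
print(f"{len(samples)} images ({good} good, {len(samples) - good} degraded)")
```

Note the class imbalance (2541 good versus 9393 degraded images) when constructing training and validation splits.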