Seed Localization System Suite with CNNs for Seed Spacing Estimation, Population Estimation and Doubles
R. Harsha Chepally, A. Sharda
Kansas State University

Proper seed placement during planting is critical to achieving uniform emergence, which optimizes the crop for maximum yield potential. Currently, the standard way to determine planter performance is to manually measure plant spacing and seeding depth. However, this process is both cost- and labor-intensive and prone to human error. Therefore, this study aimed to develop a seed localization system (SLS) to measure seed spacing and seeding depth and to provide the geo-location of each planted seed. The system consisted of a high-speed camera, a light section sensor, and a survey-grade real-time kinematic (RTK) global positioning system (GPS) unit. Images were acquired using a Basler (acA1920-40gc) color camera, which provided 1920x1200 px resolution (2.3 MP), 42 frames per second (fps), and Gigabit Ethernet (GigE) connectivity. A 10 m GigE cable (2000028341) was selected to connect the camera (RJ45) to the data acquisition system (RJ45). The camera body was paired with a ruggedized 12.5 mm fixed-focal-length Kowa (LM12HC-V) lens. Exposure time was set at 300 microseconds (μs). To ensure the high-speed camera captured overlapping side-by-side frames, it was configured to transmit and record at 40 fps over Ethernet. The SLS mount, when installed on the extension bracket, provided a consistent distance of 5.6 inches between the lens face and the bottom of the seed trench. This constant distance between lens and target surface meant users did not need to re-focus the cameras when planting depth changed. The lighting system is one of the most critical components for capturing good images with a clearly visible subject in the frame. For the initial setup, a floodlamp (AT419929, John Deere, USA) was mounted on a round-base magnet installed on the side of the extension bracket to illuminate the area within the camera's field of view (FOV).
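As a rough check on the 40 fps setting, the ground distance traveled between consecutive exposures can be compared against the camera's along-track field of view: frames overlap only if the planter moves less than one FOV length per frame. The sketch below illustrates this calculation; the 150 mm along-track FOV is an assumed value for illustration, not a measured parameter from this study.

```python
# Ground travel per frame vs. along-track field of view (FOV):
# consecutive frames overlap only when the planter advances less
# than one FOV length between exposures.

MPH_TO_M_PER_S = 0.44704  # exact conversion factor

def travel_per_frame_m(speed_mph: float, fps: float) -> float:
    """Ground distance covered between consecutive frames (meters)."""
    return speed_mph * MPH_TO_M_PER_S / fps

def overlap_fraction(speed_mph: float, fps: float, fov_along_track_m: float) -> float:
    """Fraction of the frame shared with the previous frame (0 = no overlap)."""
    return max(0.0, 1.0 - travel_per_frame_m(speed_mph, fps) / fov_along_track_m)

if __name__ == "__main__":
    fov = 0.15  # assumed 150 mm along-track FOV at the 5.6 in working distance
    for mph in (6.0, 8.0):
        step_mm = travel_per_frame_m(mph, 40.0) * 1000
        print(f"{mph} mph @ 40 fps: {step_mm:.1f} mm/frame, "
              f"overlap {overlap_fraction(mph, 40.0, fov):.0%}")
```

At the assumed FOV, even the 8 mph case leaves roughly 40% frame-to-frame overlap, consistent with the abstract's choice of 40 fps for both planting speeds.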
The data acquisition system comprised an NI cDAQ-9174 chassis, an NI C Series module (NI-9203), and a fanless PC (LEC-2580P-711A) to capture simultaneous data from four cameras, the laser line scanner, and the GPS unit mounted on four row units (rows 1, 4, 9, and 12 of a CNH 12-row planter). The fanless industrial PC featured an Intel® 6th Gen (Skylake) Core™ i7-6600U CPU, 2x RJ45 and 4x PoE ports, and 16 GB of memory. The software was written on the National Instruments LabVIEW platform. The convolutional neural network used was RetinaNet, trained on a ResNet-18 backbone with a feature pyramid network, which allows higher-level features to learn from and reason about lower-level features. The image analysis showed that the seed classification algorithm using RetinaNet predicted 99.4% of seeds for both populations at 6 mph, while the prediction rate decreased to 95.3% at 8 mph with the 30k population and 92.8% at 8 mph with the 35k population. These results indicated that seed prediction in the image was very accurate at the 6 mph planting speed. Similar results were observed for predicted mean seed spacing, which was within 5% of the validation data. The results indicated that this system could be extended to real-time usage given higher computational capability.
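The spacing and doubles metrics the system reports can be illustrated with a small sketch: given along-row seed positions (such as the SLS derives from geo-located detections), compute the nominal spacing implied by the target population, the mean measured spacing, and a doubles count. The nominal-spacing formula and the half-of-nominal doubles threshold are common planter-evaluation conventions assumed here, not parameters taken from this study, and the seed positions below are hypothetical.

```python
# Nominal seed spacing from a target population, plus mean spacing and
# doubles count from a list of along-row seed positions (inches).

SQ_IN_PER_ACRE = 6_272_640  # 43,560 sq ft x 144 sq in/sq ft

def nominal_spacing_in(population_per_acre: float, row_width_in: float) -> float:
    """Target in-row seed spacing (inches) for a given population."""
    return SQ_IN_PER_ACRE / (population_per_acre * row_width_in)

def spacing_stats(positions_in, nominal_in):
    """Mean gap between consecutive seeds and count of 'doubles'
    (gaps shorter than half the nominal spacing)."""
    gaps = [b - a for a, b in zip(positions_in, positions_in[1:])]
    doubles = sum(1 for g in gaps if g < 0.5 * nominal_in)
    return sum(gaps) / len(gaps), doubles

if __name__ == "__main__":
    nominal = nominal_spacing_in(30_000, 30.0)   # ~6.97 in for 30k on 30-in rows
    seeds = [0.0, 7.1, 13.9, 15.2, 22.0, 29.1]   # hypothetical detections
    mean_gap, n_doubles = spacing_stats(seeds, nominal)
    print(f"nominal {nominal:.2f} in, mean {mean_gap:.2f} in, doubles {n_doubles}")
```

In the hypothetical row above, the 1.3 in gap between the third and fourth seeds falls well under half the ~6.97 in nominal spacing and is flagged as a double, which also pulls the mean measured spacing below nominal.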

Keywords: seed spacing, high-speed camera, planter, convolutional neural network