Research Article

Social Navigation in Warehouse Logistics Based on Artificial Intelligence and RGB-D

Year 2025, Volume: 59 Issue: 2, 301 - 322, 16.04.2025
https://doi.org/10.51551/verimlilik.1523828

Abstract

Purpose: Ensuring both human safety and transportation efficiency simultaneously during the navigation of autonomous mobile robots (AMRs) in warehouse logistics is a challenging problem due to dynamic environments and diverse obstacles. In this study, a social navigation approach based on artificial intelligence was developed to optimize these two critical factors.
Methodology: RGB images from an Intel RealSense D455 depth camera mounted on the PIXER AMR were fed to a YOLOv8-based model to detect humans and reach trucks (RTs). For human detection, the YOLOv8 model was trained on 4746 labelled images for 362 epochs; for RT detection, on 4193 labelled images for 450 epochs. Each dataset was split into 60% training, 20% testing, and 20% validation subsets. The camera's depth stream was used to measure the distance to detected objects (see the first sketch following the abstract).
Findings: For objects detected with at least 80% confidence, the midpoint of the bounding box was identified and its distance was calculated with the depth camera. When a human was detected within 2 meters, the robot's max_speed was reduced to 80%; when an RT was detected within 6 meters, a new path was planned (see the second sketch following the abstract).
Originality: This study provides a novel integration of social navigation and deep learning to address the dual challenge of ensuring safety and efficiency in AMR navigation, contributing to advancements in warehouse logistics.
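
The following is a minimal sketch, not the authors' implementation, of the detection-plus-distance step described in the Methodology: a custom-trained YOLOv8 model run on the RGB stream, with the RealSense depth frame sampled at each bounding-box midpoint. The weight file name, stream resolution, and the helper detections_with_distance are assumptions for illustration; the 0.80 confidence threshold mirrors the abstract.

```python
# Minimal sketch (assumed, not the authors' code): YOLOv8 detection on the
# RealSense D455 RGB stream plus a depth read-out at each bounding-box midpoint.
import numpy as np
import pyrealsense2 as rs
from ultralytics import YOLO

model = YOLO("warehouse_yolov8.pt")      # hypothetical custom weights (human / reach-truck classes)

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
pipeline.start(config)
align = rs.align(rs.stream.color)        # align depth pixels to the RGB image

def detections_with_distance(conf_threshold=0.80):
    """Yield (class_name, confidence, distance_m) for detections above the threshold."""
    frames = align.process(pipeline.wait_for_frames())
    color_frame = frames.get_color_frame()
    depth_frame = frames.get_depth_frame()
    image = np.asanyarray(color_frame.get_data())
    for box in model(image, verbose=False)[0].boxes:
        conf = float(box.conf[0])
        if conf < conf_threshold:                        # keep only high-confidence detections
            continue
        x1, y1, x2, y2 = box.xyxy[0].tolist()
        cx, cy = int((x1 + x2) / 2), int((y1 + y2) / 2)  # bounding-box midpoint
        distance_m = depth_frame.get_distance(cx, cy)    # metres from the depth frame
        yield model.names[int(box.cls[0])], conf, distance_m
```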
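
A second sketch illustrates the decision rules stated in the Findings: reduce the speed limit when a person is within 2 m and request a new path when a reach truck is within 6 m. The callbacks set_max_speed and replan_path, the class names, and the nominal speed value are hypothetical placeholders for the PIXER AMR's own speed-limit and planner interfaces.

```python
# Minimal sketch (assumed) of the navigation rules stated in the Findings.
# set_max_speed and replan_path are hypothetical hooks into the AMR's speed-limit
# and path-planner interfaces; class names and the nominal speed are placeholders.
HUMAN_SLOWDOWN_DIST_M = 2.0   # person closer than 2 m -> limit max_speed to 80%
RT_REPLAN_DIST_M = 6.0        # reach truck closer than 6 m -> request a new path
NOMINAL_MAX_SPEED = 1.0       # m/s, assumed nominal value

def apply_social_rules(detections, set_max_speed, replan_path):
    """detections: iterable of (class_name, confidence, distance_m) tuples."""
    speed = NOMINAL_MAX_SPEED
    for name, _conf, dist in detections:
        if name == "human" and dist <= HUMAN_SLOWDOWN_DIST_M:
            speed = 0.8 * NOMINAL_MAX_SPEED
        elif name == "reach_truck" and dist <= RT_REPLAN_DIST_M:
            replan_path()                 # ask the planner for a route around the RT
    set_max_speed(speed)
```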

Ethical Statement

It was declared by the author(s) that scientific and ethical principles have been followed in this study and all the sources used have been properly cited.

Supporting Institution

This study was supported by TÜBİTAK within the scope of Project No. 3231501.

Project Number

TUBITAK - 3231501

References

  • Beyer, L., Hermans, A., Linder, T., Arras, K.O. and Leibe, B. (2018). “Deep Person Detection in 2D Range Data”, arXiv preprint arXiv:1804.02463.
  • Clavero, C., Patricio, M.A., García, J. and Molina, J.M. (2024). “DMZoomNet: Improving Object Detection Using Distance Information in Intralogistics Environments”, IEEE Transactions on Industrial Informatics, 20 (7), 9163-9171. https://doi.org/10.1109/TII.2024.3381795
  • Cognominal, M., Patronymic, K. and Wańkowicz, A. (2021). “Evolving Field of Autonomous Mobile Robotics: Technological Advances and Applications”, Fusion of Multidisciplinary Research An International Journal, 2(2), 189-200.
  • Daza, M., Barrios-Aranibar, D., Diaz-Amado, J., Cardinale, Y. and Vilasboas, J. (2021). “An Approach of Social Navigation Based on Proxemics for Crowded Environments of Humans and Robots”, Micromachines, 12(2), 193. https://doi.org/10.3390/mi12020193
  • Fragapane, G., Ivanov, D., Peron, M., Sgarbossa, F. and Strandhagen, J.O. (2022). “Increasing Flexibility and Productivity in Industry 4.0 Production Networks with Autonomous Mobile Robots and Smart Intralogistics”, Annals of Operations Research, 308(1), 125-143. https://doi.org/10.1007/s10479-020-03526-7
  • Francis, A., Pérez-d'Arpino, C., Li, C., Xia, F., Alahi, A., Alami, R. and Martín-Martín, R. (2023). “Principles and Guidelines for Evaluating Social Robot Navigation Algorithms”, arXiv preprint arXiv:2306.16740.
  • Gürevin, B., Gül, R., Eğri, S., Gültürk, F., Yıldız, M., Çalışkan, F. and Pehlivan, İ. (2023). “A Novel Method Determining the Size and Angle of an Object Using a Depth Camera Without Reference”, Academic Platform Journal of Engineering and Smart Systems, 11(2), 41-46. https://doi.org/10.21541/apjess.1297168
  • Gürevin, B., Yıldız, M., Gültürk, F., Pehlivan, İ., Çalışkan, F., Boru, B. and Yıldız, M.Z. (2024). “A Novel Control and Monitoring Interface Design for ROS Based Mobile Robots”, Düzce Üniversitesi Bilim ve Teknoloji Dergisi, 12(1), 496-509. https://doi.org/10.29130/dubited.1214278
  • Intel RealSense. (2024). https://www.intelrealsense.com/depth-camera-d455/, (Accessed: 25.07.2024).
  • Jia, D., Hermans, A. and Leibe, B. (2020). “DR-SPAAM: A Spatial-Attention and Auto-Regressive Model for Person Detection in 2D Range Data”, 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 10270-10277.
  • Kenk, M.A., Hassaballah, M. and Brethé, J.F. (2019). “Human-Aware Robot Navigation in Logistics Warehouses”, Proceedings of the 16th International Conference on Informatics in Control, Automation and Robotics (ICINCO 2019), 371-378.
  • Kim, C. E., Oghaz, M.M.D., Fajtl, J., Argyriou, V. and Remagnino, P. (2018). “A Comparison of Embedded Deep Learning Methods for Person Detection”, arXiv preprint arXiv:1812.03451.
  • Mavrogiannis, C., Baldini, F., Wang, A., Zhao, D., Trautman, P., Steinfeld, A. and Oh, J. (2023). “Core Challenges of Social Robot Navigation: A Survey”, ACM Transactions on Human-Robot Interaction, 12(3), 1-39. https://doi.org/10.1145/3583741
  • Mayershofer, C., Holm, D.M., Molter, B. and Fottner, J. (2020). “Loco: Logistics Objects in Context”, 19th IEEE International Conference on Machine Learning and Applications (ICMLA), 612-617.
  • Önal, O. and Dandıl, E. (2021). “Object Detection for Safe Working Environments Using YOLOv4 Deep Learning Model”, Avrupa Bilim ve Teknoloji Dergisi, (26), 343-351. https://doi.org/10.31590/ejosat.951733
  • Reis, D., Kupec, J., Hong, J. and Daoudi, A. (2023). “Real-Time Flying Object Detection with YOLOv8”, arXiv preprint arXiv:2305.09972.
  • Salmerón-Garci, J., Inigo-Blasco, P., Di, F. and Cagigas-Muniz, D. (2015). “A Tradeoff Analysis of a Cloud-Based Robot Navigation Assistant Using Stereo Image Processing”, IEEE Transactions on Automation Science and Engineering, 12(2), 444-454. https://doi.org/10.1109/TASE.2015.2403593
  • Trakadas, P., Simoens, P., Gkonis, P., Sarakis, L., Angelopoulos, A., Ramallo-González, A. P., Skarmeta, A., Trochoutsos, C., Calvo, D., Pariente, T., Chintamani, K., Fernandez, I., Irigaray, A.A., Parreira, J.X., Petrali, P., Leligou, N. and Karkazis, P. (2020). “An Artificial Intelligence-Based Collaboration Approach in Industrial IoT Manufacturing: Key Concepts, Architectural Extensions and Potential Applications”, Sensors, 20(19), 5480. https://doi.org/10.3390/s20195480
  • Truong, X.T. and Ngo, T.D. (2016). “Dynamic Social Zone Based Mobile Robot Navigation for Human Comfortable Safety in Social Environments”, International Journal of Social Robotics, 8, 663-684. https://doi.org/10.1007/s12369-016-0352-0
  • Ultralytics. (2024a). https://docs.ultralytics.com/tr/models/yolov8/, (Accessed: 25.07.2024).
  • Ultralytics. (2024b). https://github.com/ultralytics/ultralytics/tree/main, (Accessed: 08.10.2024).
  • Wang, F. and Li, X. (2023). “Research on Warehouse Object Detection for Mobile Robot Based on YOLOv5”, 2023 International Conference on Image Processing, Computer Vision and Machine Learning (ICICML), 1196-1200.
  • Zhu, K. and Zhang, T. (2021). “Deep Reinforcement Learning Based Mobile Robot Navigation: A Review”, Tsinghua Science and Technology, 26(5), 674-691.


Details

Primary Language English
Subjects Logistics
Journal Section Research Article
Authors

Bilal Gürevin 0000-0003-4035-2759

Hilal Öztemel 0009-0003-9942-8241

Burhan Turgut Ulutürk 0009-0008-5983-5447

Emre Sebat 0009-0003-2445-8622

Project Number TUBITAK - 3231501
Publication Date April 16, 2025
Submission Date July 30, 2024
Acceptance Date December 30, 2024
Published in Issue Year 2025 Volume: 59 Issue: 2

Cite

APA Gürevin, B., Öztemel, H., Ulutürk, B. T., & Sebat, E. (2025). Social Navigation in Warehouse Logistics Based on Artificial Intelligence and RGB-D. Verimlilik Dergisi, 59(2), 301-322. https://doi.org/10.51551/verimlilik.1523828


Journal of Productivity is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License (CC BY-NC 4.0)