Real-Time Obstacle Detection for Unmanned Surface Vehicle Maneuver

(1) * Anik Nur Handayani (Universitas Negeri Malang, Indonesia)
(2) Ferina Ayu Pusparani (Universitas Negeri Malang, Indonesia)
(3) Dyah Lestari (Universitas Negeri Malang, Indonesia)
(4) I Made Wirawan (Universitas Negeri Malang, Indonesia)
(5) Aji Prasetya Wibawa (Universitas Negeri Malang, Indonesia)
(6) Osamu Fukuda (Saga University, Japan)
*corresponding author

Abstract


The rapid advancement and increasing demand for Unmanned Surface Vehicle (USV) technology have drawn considerable attention in various sectors, including commercial, research, and military, particularly in marine and shallow-water applications. USVs have the potential to revolutionize monitoring systems in remote areas while reducing labor costs. One critical requirement for USVs is the ability to integrate Guidance, Navigation, and Control (GNC) technology, enabling self-reliant operation without constant human oversight. However, current USV studies rely on traditional color-detection methods, which are inadequate for detecting objects under unstable lighting conditions. This study addresses the challenge of enabling Autonomous Surface Vehicles (ASVs) to operate with minimal human intervention by enhancing their object detection and classification capabilities. In dynamic environments, such as water surfaces, accurate and rapid object recognition is essential. To achieve this, we focus on the implementation of deep learning algorithms, including the YOLO algorithm, to empower USVs with informed navigation decision-making capabilities. Our research contributes to the field of robotics by designing an affordable USV prototype capable of independent operation, characterized by precise object detection and classification. By bridging the gap between advanced visualization techniques and autonomous USV technology, we envision practical applications in remote monitoring and marine operations. This paper presents the initial phase of our research, emphasizing the significance of deep learning algorithms for enhancing USV navigation and decision-making under dynamic environmental conditions; the YOLOv4-tiny image processing algorithm achieves a mAP of 99.51%, an IoU of 87.80%, and an error value of 0.1542.
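The IoU (Intersection over Union) figure reported above is the standard overlap measure between a predicted bounding box and its ground-truth box. As an illustration only, here is a minimal, self-contained sketch of how such a score is computed; the function name `iou` and the (x1, y1, x2, y2) corner-coordinate box format are our assumptions, not the authors' evaluation code:

```python
def iou(box_a, box_b):
    """Intersection over Union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    # Corners of the intersection rectangle.
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    # Clamp to zero when the boxes do not overlap.
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```

A detection is typically counted as correct when its IoU with a ground-truth box exceeds a threshold (often 0.5); mAP then averages precision over recall levels and classes under that criterion.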


Keywords


Real-Time; Obstacle Detection; Unmanned Surface Vehicle; Maneuver

   

DOI

https://doi.org/10.31763/ijrcs.v3i4.1147
      





Copyright (c) 2023 Anik Nur Handayani, Ferina Ayu Pusparani, Dyah Lestari, I Made Wirawan, Aji Prasetya Wibawa, Osamu Fukuda

Creative Commons License
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

 



International Journal of Robotics and Control Systems
e-ISSN: 2775-2658
Website: https://pubs2.ascee.org/index.php/IJRCS
Email: ijrcs@ascee.org
Organized by: Association for Scientific Computing Electronics and Engineering (ASCEE), Peneliti Teknologi Teknik Indonesia, Department of Electrical Engineering, Universitas Ahmad Dahlan, and Kuliah Teknik Elektro
Published by: Association for Scientific Computing Electronics and Engineering (ASCEE)
Office: Jalan Janti, Karangjambe 130B, Banguntapan, Bantul, Daerah Istimewa Yogyakarta, Indonesia