[1] L. Zhang, C. Li, and H. Sun, “Object detection/tracking toward underwater photographs by remotely operated vehicles (ROVs),” Future Generation Computer Systems, vol. 126, pp. 163–168, Jan. 2022, doi: 10.1016/j.future.2021.07.011.
[2] F. Peng, Z. Miao, F. Li, and Z. Li, “S-FPN: A shortcut feature pyramid network for sea cucumber detection in underwater images,” Expert Syst Appl, vol. 182, 2021, doi: 10.1016/j.eswa.2021.115306.
[3] M. S. Asyraf, I. S. Isa, M. I. F. Marzuki, S. N. Sulaiman, and C. C. Hung, “CNN-based YOLOv3 Comparison for Underwater Object Detection,” Journal of Electrical & Electronic Systems Research, vol. 18, no. 1, Apr. 2021, doi: 10.24191/jeesr.v18i1.005.
[4] P. Jiao, X. Ye, C. Zhang, W. Li, and H. Wang, “Vision-based real-time marine and offshore structural health monitoring system using underwater robots,” Computer-Aided Civil and Infrastructure Engineering, 2023.
[5] A. Mahavarkar, R. Kadwadkar, S. Maurya, and S. Raveendran, “Underwater Object Detection using Tensorflow,” ITM Web of Conferences, vol. 32, 2020, doi: 10.1051/itmconf/20203203037.
[6] C. Fu et al., “Rethinking general underwater object detection: Datasets, challenges, and solutions,” Neurocomputing, vol. 517, 2023, doi: 10.1016/j.neucom.2022.10.039.
[7] H. T. Nguyen, E. H. Lee, C. H. Bae, and S. Lee, “Multiple object detection based on clustering and deep learning methods,” Sensors (Switzerland), vol. 20, no. 16, 2020, doi: 10.3390/s20164424.
[8] Z. Wang, H. Chen, H. Qin, and Q. Chen, “Self-Supervised Pre-Training Joint Framework: Assisting Lightweight Detection Network for Underwater Object Detection,” J Mar Sci Eng, vol. 11, no. 3, 2023, doi: 10.3390/jmse11030604.
[9] S. Xu, M. Zhang, W. Song, H. Mei, Q. He, and A. Liotta, “A systematic review and analysis of deep learning-based underwater object detection,” Neurocomputing, vol. 527, 2023, doi: 10.1016/j.neucom.2023.01.056.
[10] X. Yang, S. Samsudin, Y. Wang, Y. Yuan, T. F. T. Kamalden, and S. S. N. bin Yaakob, “Application of Target Detection Method Based on Convolutional Neural Network in Sustainable Outdoor Education,” Sustainability (Switzerland), vol. 15, no. 3, 2023, doi: 10.3390/su15032542.
[11] A. Mathias, S. Dhanalakshmi, R. Kumar, and R. Narayanamoorthi, “Deep neural network driven automated underwater object detection,” Computers, Materials and Continua, vol. 70, no. 3, 2022, doi: 10.32604/cmc.2022.021168.
[12] S. Fayaz, S. A. Parah, and G. J. Qureshi, “Underwater object detection: architectures and algorithms – a comprehensive review,” Multimed Tools Appl, vol. 81, no. 15, pp. 20871–20916, Jun. 2022, doi: 10.1007/s11042-022-12502-1.
[13] S. Zhao, J. Zheng, S. Sun, and L. Zhang, “An Improved YOLO Algorithm for Fast and Accurate Underwater Object Detection,” Symmetry (Basel), vol. 14, no. 8, p. 1669, Aug. 2022, doi: 10.3390/sym14081669.
[14] F. Han, J. Yao, H. Zhu, and C. Wang, “Underwater Image Processing and Object Detection Based on Deep CNN Method,” J Sens, vol. 2020, pp. 1–20, May 2020, doi: 10.1155/2020/6707328.
[15] H. Yang, P. Liu, Y. Hu, and J. Fu, “Research on underwater object recognition based on YOLOv3,” Microsystem Technologies, vol. 27, no. 4, pp. 1837–1844, Apr. 2021, doi: 10.1007/s00542-019-04694-8.
[16] G. Coro and M. Bjerregaard Walsh, “An intelligent and cost-effective remote underwater video device for fish size monitoring,” Ecol Inform, vol. 63, 2021, doi: 10.1016/j.ecoinf.2021.101311.
[17] X. Chen, M. Yuan, C. Fan, X. Chen, Y. Li, and H. Wang, “Research on an Underwater Object Detection Network Based on Dual-Branch Feature Extraction,” Electronics (Switzerland), vol. 12, no. 16, 2023, doi: 10.3390/electronics12163413.
[18] J. Simon et al., “Using automated video analysis to study fish escapement through escape panels in active fishing gears: Application to the effect of net colour,” Mar Policy, vol. 116, 2020, doi: 10.1016/j.marpol.2019.103785.
[19] M. Zhang, S. Xu, W. Song, Q. He, and Q. Wei, “Lightweight underwater object detection based on yolo v4 and multi-scale attentional feature fusion,” Remote Sens (Basel), vol. 13, no. 22, 2021, doi: 10.3390/rs13224706.