[1]World Health Organization (WHO). (2023). Global status report on road safety 2023. Retrieved from https://www.who.int/
[2]Console and Associates. (2024). Blind spot car accident injuries. Retrieved from https://natlawreview.com/
[3]More, S., Mulla, A. C., Argade, S. G., Raskar, M., Sakhare, P., & Jadhav, S. P. (2024). Advanced Driver Assistance Systems (ADAS) feature in Modern Autonomous Vehicle. 2024 3rd International Conference for Innovation in Technology (INOCON), 1-5. https://doi.org/10.1109/INOCON60754.2024.10511531
[4]Lin, B.-F., et al. (2012). Integrating appearance and edge features for sedan vehicle detection in the blind-spot area. IEEE Transactions on Intelligent Transportation Systems, 13(2), 737-747. https://doi.org/10.1109/TITS.2011.2182649
[5]Boxu, Z., Cheng, Z., Libin, Z., & Meixia, L. (2023). Optimization of vehicle blind zone monitoring (BSD) evaluation scheme based on image processing and radar ranging algorithm. 2023 IEEE 2nd International Conference on Electrical Engineering, Big Data and Algorithms (EEBDA), 1657-1662. https://doi.org/10.1109/EEBDA56825.2023.10090708
[6]Nilsson, J., Ödblom, A. C. E., Fredriksson, J., Zafar, A., & Ahmed, F. (2010). Performance evaluation method for mobile computer vision systems using augmented reality. 2010 IEEE Virtual Reality Conference (VR), 19-22. https://doi.org/10.1109/VR.2010.5444821
[7]PJ Campanaro Attorney at Law. (2023). Types of car accidents caused by blind spots. Retrieved from https://www.csralawyer.com/
[8]Yuan, B., Chen, Y.-A., & Ye, S. (2018). A lightweight augmented reality system to see-through cars. 2018 7th International Congress on Advanced Applied Informatics (IIAI-AAI), 855-860. https://doi.org/10.1109/IIAI-AAI.2018.00174
[9]Rameau, F., Ha, H., Joo, K., Choi, J., Park, K., & Kweon, I. S. (2016). A real-time augmented reality system to see-through cars. IEEE Transactions on Visualization and Computer Graphics, 22(11), 2395-2404. https://doi.org/10.1109/TVCG.2016.2593768
[10]Park, B.-J., Yoon, C., Lee, J.-W., & Kim, K.-H. (2015). Augmented reality based on driving situation awareness in vehicle. 2015 17th International Conference on Advanced Communication Technology (ICACT), 593-595. https://doi.org/10.1109/ICACT.2015.7224865
[11]Gomes, P., Olaverri-Monreal, C., & Ferreira, M. (2012). Making vehicles transparent through V2V video streaming. IEEE Transactions on Intelligent Transportation Systems, 13(2), 930-938. https://doi.org/10.1109/TITS.2012.2188289
[12]Li, H., & Nashashibi, F. (2011). Multi-vehicle cooperative perception and augmented reality for driver assistance: A possibility to ‘see’ through front vehicle. 2011 14th International IEEE Conference on Intelligent Transportation Systems (ITSC), 242-247. https://doi.org/10.1109/ITSC.2011.6083061
[13]Borsoi, R. A., & Costa, G. H. (2018). On the performance and implementation of parallax free video see-through displays. IEEE Transactions on Visualization and Computer Graphics, 24(6), 2011-2022. https://doi.org/10.1109/TVCG.2017.2705184
[14]Olaverri-Monreal, C., Gomes, P., Fernandes, R., Vieira, F., & Ferreira, M. (2010). The See-Through System: A VANET-enabled assistant for overtaking maneuvers. 2010 IEEE Intelligent Vehicles Symposium, 123-128. https://doi.org/10.1109/IVS.2010.5548020
[15]Gomes, P., Vieira, F., & Ferreira, M. (2012). The See-Through System: From implementation to test-drive. 2012 IEEE Vehicular Networking Conference (VNC), 40-47. https://doi.org/10.1109/VNC.2012.6407443
[16]Wang, Z., Jin, Q., & Wu, B. (2022). Design of a vision blind spot detection system based on depth camera. 2022 IEEE International Conference on Dependable, Autonomic and Secure Computing, International Conference on Pervasive Intelligence and Computing, International Conference on Cloud and Big Data Computing, International Conference on Cyber Science and Technology Congress (DASC/PiCom/CBDCom/CyberSciTech), 1-5. https://doi.org/10.1109/DASC/PiCom/CBDCom/Cy55231.2022.9927963
[17]Xia, Z., Gong, J., Long, Y., Ren, W., Wang, J., & Lan, H. (2022). Research on traffic accident detection based on vehicle perspective. 2022 4th International Conference on Robotics and Computer Vision (ICRCV), 223-227. https://doi.org/10.1109/ICRCV55858.2022.9953179
[18]Park, B.-J., Yoon, C., Lee, J.-W., & Kim, K.-H. (2015). Augmented reality based on driving situation awareness in vehicle. 2015 17th International Conference on Advanced Communication Technology (ICACT), 593-595. https://doi.org/10.1109/ICACT.2015.7224865
[19]CommonWealth Magazine (天下雜誌). Retrieved from https://www.cw.com.tw/article/5122137
[20]GeeksforGeeks. Differences between TCP and UDP. Retrieved from https://www.geeksforgeeks.org/differences-between-tcp-and-udp/
[21]Redmon, J., Divvala, S., Girshick, R., & Farhadi, A. (2016). You only look once: Unified, real-time object detection. 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 779-788. https://doi.org/10.1109/CVPR.2016.91
[22]Bakirci, M., & Bayraktar, I. (2024). YOLOv9-enabled vehicle detection for urban security and forensics applications. 2024 12th International Symposium on Digital Forensics and Security (ISDFS), 1-6. https://doi.org/10.1109/ISDFS60797.2024.10527304
[23]Nusari, A. N. M., Ozbek, I. Y., & Oral, E. A. (2024). Automatic vehicle accident detection and classification from images: A comparison of YOLOv9 and YOLO-NAS algorithms. 2024 32nd Signal Processing and Communications Applications Conference (SIU), 1-4. https://doi.org/10.1109/SIU61531.2024.10600761
[24]Shen, Y., & Yan, W. Q. (2018). Blind spot monitoring using deep learning. 2018 International Conference on Image and Vision Computing New Zealand (IVCNZ), 1-5. https://doi.org/10.1109/IVCNZ.2018.8634716
[25]Good, W. F., Maitz, G. S., & Gur, D. (1994). Joint photographic experts group (JPEG) compatible data compression of mammograms. Journal of Digital Imaging, 7(3), 123-132.
[26]夏漢軒 (2022). Research on a multi-roadside-unit autonomous driving system based on a ROS 2.0 distributed communication architecture combined with wireless ad hoc networks [Master's thesis, National Formosa University]. National Digital Library of Theses and Dissertations in Taiwan. https://hdl.handle.net/11296/nsj747
[27]Chandan, G., Jain, A., Jain, H., & Mohana. (2018). Real time object detection and tracking using deep learning and OpenCV. 2018 International Conference on Inventive Research in Computing Applications (ICIRCA), 1305-1308. https://doi.org/10.1109/ICIRCA.2018.8597266