Please use this identifier to cite or link to this item:
https://nccur.lib.nccu.edu.tw/handle/140.119/152578
Title: 閃爍燈光引導下的無人機區域搜索、目標定位與自主降落技術研究 (Research on Unmanned Aerial Vehicle Area Search, Localization, and Autonomous Landing Techniques Guided by Flashing Lights)
Authors: Chen, Wei-Chung (陳暐中)
Contributors: Liu, Jyi-Shane (劉吉軒); Chen, Wei-Chung (陳暐中)
Keywords: Intelligent UAV; Area surveillance; Area coverage path planning; Fixed-frequency flashing light detection; Image processing; Human-machine collaboration; Human-machine interaction interface
Date: 2024
Issue Date: 2024-08-05 12:47:02 (UTC+8)
Abstract: With the rapid development of drone technology across today's industries, its applications now span many fields, particularly mountain search and rescue. Drones can survey disaster areas quickly and flexibly, and one of their greatest advantages is the ability to reach places that are difficult for people to access, such as high mountains and steep terrain, which effectively reduces the risk to search-and-rescue personnel. Compared with manned aircraft, drones are cheaper to manufacture, more maneuverable, and greatly reduce the risk of pilot casualties, so they now play an important role in many tasks that traditionally relied on human labor.

In the past, drone visual guidance and positioning tasks mainly relied on special landing markers such as H-shaped markers, QR codes, or AprilTags. These markers effectively assist drones in rapid positioning, but since they must be clearly recognizable from high altitude, they usually have to be relatively large. Such large markers are not easy to carry, and expecting victims to carry them for positioning in emergencies is impractical, especially in mountain rescue scenarios. Therefore, this study proposes using a flashing flashlight as an alternative: a fixed-frequency flashing light guides the drone to assist rescuers in search operations and, when necessary, to land precisely next to the victim to deliver urgently needed supplies such as medical equipment, food, and water. Compared with traditional markers, this method is both more portable and more practical in emergencies. The study also integrates a human-machine interface that gives ground-station operators real-time access to the drone's visual feed, target-detection assistance, and flight status. This integrated system improves the immediacy and accuracy of operations and strengthens safety monitoring during flight; in an emergency, operators can quickly intervene and take control of the drone, improving its responsiveness. Such a human-machine interaction interface enhances the safety and reliability of drones and expands their application potential across multiple fields.
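As a rough illustration of the fixed-frequency flashing-light cue described in the abstract, the sketch below estimates the dominant blink frequency of the brightest image region from per-frame brightness samples. This is not the thesis's actual detection pipeline; the 4 Hz target rate, the region-of-interest heuristic, and the thresholds are assumptions made purely for demonstration.

```python
# Minimal sketch: decide whether the brightest image region blinks at a
# fixed target frequency. All constants below are illustrative assumptions.
import cv2
import numpy as np

TARGET_HZ = 4.0      # assumed flashlight blink rate (not from the thesis)
TOLERANCE_HZ = 0.5   # accepted deviation from the target frequency
WINDOW_FRAMES = 120  # about 4 seconds of video at 30 fps

def dominant_frequency(brightness, fps):
    """Return the strongest non-DC frequency (Hz) in a brightness time series."""
    signal = np.asarray(brightness, dtype=np.float64)
    signal = signal - signal.mean()              # remove the DC component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    return freqs[np.argmax(spectrum[1:]) + 1]    # skip the 0 Hz bin

def is_flashing_target(frames, fps):
    """Check whether the brightest image region blinks near TARGET_HZ."""
    brightness = []
    for frame in frames:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        blurred = cv2.GaussianBlur(gray, (11, 11), 0)
        # Use the brightest blurred pixel as a crude light-source candidate.
        _, _, _, (x, y) = cv2.minMaxLoc(blurred)
        roi = gray[max(0, y - 5):y + 6, max(0, x - 5):x + 6]
        brightness.append(float(roi.mean()))
    if len(brightness) < WINDOW_FRAMES:
        return False                             # not enough samples yet
    freq = dominant_frequency(brightness[-WINDOW_FRAMES:], fps)
    return abs(freq - TARGET_HZ) <= TOLERANCE_HZ
```

For the frequency estimate to be meaningful, the camera frame rate must exceed twice the blink rate (the Nyquist criterion); a 4 Hz flashlight is comfortably within that limit for a 30 fps camera.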
Description: Master's thesis, National Chengchi University, Department of Computer Science, 111753225
Source URI: http://thesis.lib.nccu.edu.tw/record/#G0111753225
Data Type: thesis
Appears in Collections: [Department of Computer Science] Theses
Files in This Item:
File: 322501.pdf | Size: 9980 KB | Format: Adobe PDF
All items in 政大典藏 are protected by copyright, with all rights reserved.