    Please use this identifier to cite or link to this item: https://nccur.lib.nccu.edu.tw/handle/140.119/149647


    Title: 基於視覺導航之無人機自主降落韌性提升
    Robustness Enhancement on Visually Guided UAV Autonomous Landing
    Authors: 蔣明憲
    Chiang, Min-Hsien
    Contributors: 劉吉軒
    Liu, Jyi-Shane
    蔣明憲
    Chiang, Min-Hsien
    Keywords: 無人機
    四軸無人機
    自主精準降落
    降落韌性
    降落方法
    基於視覺的引導系統
    電腦視覺
    決策控制
    降落標記設計
UAV
    Quadcopter
    Autonomous Precision Landing
    Landing Robustness
    Landing Strategy
Vision-based Guidance Systems
    Computer Vision
    Decision Control
    Landing Marker Design
    Date: 2023
    Issue Date: 2024-02-01 11:40:49 (UTC+8)
Abstract: 近年來,由於軟硬體架構的革新和大環境的變化,飛行無人機已成為研究的焦點。它具備高機動性和可滯空兩種特性,不論是用於軍事用途,如無人化遠距偵查和執行特定軍事任務,或者是商業上的應用,如空中巡檢和影像獲取,都受到廣泛關注。過去三年疫情的影響,零接觸概念開始備受重視,無人機的發展也逐漸成為焦點之一。自主降落作為飛行中的最後環節,在無人機智慧化中扮演著關鍵的角色,這項技術在過去十年中受到廣泛研究。特別是當無人機降落於各種環境時,視情況需要整合視覺追蹤、軌跡預測、路徑規劃以及動力算法等技術,以完成降落任務。考慮到在現實場景中可能遇到的多變情況,降落韌性提昇是一項值得深入研究的重要課題。
鑒於過去多數論文的實驗停留在相對理想的環境下進行,如虛擬及室內環境,而與現實場景存在一定落差,本研究旨在提升無人機在現實中的降落韌性,據此設計出在高空、不同風場以及目標被遮蔽的情況下仍能穩定追蹤目標並降落的方法。
為達成上述韌性目標,我們設計出一套基於視覺的自主降落系統,涵蓋降落標記設計、降落流程設計、飛行邏輯設計、降落方法設計以及硬體底層的飛行控制,並在視覺處理上整合多種演算法互相驗證,以提升導航韌性。除此之外,透過將視覺反饋與路徑規劃進行整合,成功設計出一個極具韌性的新型降落方法。
In recent years, owing to innovations in hardware and software architecture and changes in the broader environment, unmanned aerial vehicles (UAVs) have become a focal point of research. They possess two key characteristics, high maneuverability and the ability to loiter in the air, which make them attractive for a wide range of applications: military uses such as unmanned long-range reconnaissance and the execution of specific missions, as well as commercial applications like aerial inspection and image capture. The COVID-19 pandemic of the past three years has also placed strong emphasis on contactless operations, further driving UAV development. Autonomous landing, as the final phase of a flight, plays a crucial role in UAV intelligence and has been extensively researched over the past decade. It integrates techniques such as visual tracking, trajectory prediction, path planning, and control algorithms to accomplish the landing task, especially when UAVs must land in diverse and unpredictable environments. Given the variable conditions that UAVs may encounter in real-world scenarios, enhancing landing resilience is an important and worthwhile subject of in-depth research.
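The abstract names control algorithms as one of the techniques integrated into the landing pipeline but, being an abstract, gives no code. As a rough, hypothetical illustration of that step, the sketch below turns a tracked marker's pixel offset from the image center into a horizontal velocity command with a PID loop; the gains, sign convention, and camera model are assumptions for illustration, not values from the thesis.

```python
# Hypothetical sketch of the "control algorithms" step: a PID loop maps a
# landing marker's pixel offset from the image center into a velocity
# command. Gains and the camera model are illustrative assumptions.

class PID:
    """Minimal PID controller."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev = None

    def step(self, error, dt):
        self.integral += error * dt
        deriv = 0.0 if self.prev is None else (error - self.prev) / dt
        self.prev = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv


def track_marker(offset_px, pid, dt=0.05):
    """Velocity command (m/s) that moves the UAV toward the marker;
    the sign convention is an assumption for this sketch."""
    return -pid.step(offset_px, dt)


# Toy closed loop: each command shifts the marker offset through a crude
# 400 pixels-per-meter camera model; the offset should decay toward zero.
pid = PID(kp=0.02, ki=0.0, kd=0.0005)
offset = 120.0  # marker starts 120 px right of image center
for _ in range(200):
    v = track_marker(offset, pid)
    offset += v * 0.05 * 400
```

In a real system the same structure would feed the flight controller's velocity setpoint interface rather than a simulated camera model.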
Most previous studies have conducted their experiments in relatively ideal environments, such as virtual or indoor settings, which may not accurately reflect real-world scenarios. This study therefore aims to enhance the landing resilience of UAVs in real-world conditions: to design a method that enables a UAV to stably track and land on a target even in challenging situations, including high altitudes, varying wind conditions, and scenarios where the target is partially obscured.
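One generic way to keep tracking when the target is briefly obscured, as in the scenarios above, is to coast on a motion prediction while detection drops out. The constant-velocity fallback below is a sketch of that idea only; it is not the thesis's algorithm, and every name and threshold in it is hypothetical.

```python
class MarkerTracker:
    """Track a landing marker's image position; when detection drops out
    (occlusion), coast on the last estimated velocity for a bounded number
    of steps. A generic constant-velocity fallback, not the thesis's method."""

    def __init__(self, max_coast_steps=10):
        self.pos = None          # last known (x, y) in pixels
        self.vel = (0.0, 0.0)    # estimated pixel velocity per step
        self.coast = 0
        self.max_coast = max_coast_steps

    def update(self, detection):
        """detection is (x, y), or None when the marker is occluded.
        Returns the best position estimate, or None if the track is lost."""
        if detection is not None:
            if self.pos is not None:
                self.vel = (detection[0] - self.pos[0],
                            detection[1] - self.pos[1])
            self.pos = detection
            self.coast = 0
            return self.pos
        # Occluded: predict forward at constant velocity, up to a limit.
        if self.pos is None or self.coast >= self.max_coast:
            return None
        self.coast += 1
        self.pos = (self.pos[0] + self.vel[0], self.pos[1] + self.vel[1])
        return self.pos


# Marker drifts right at 5 px per frame, then is occluded for 3 frames:
t = MarkerTracker()
est = None
for obs in [(100, 50), (105, 50), (110, 50), None, None, None]:
    est = t.update(obs)
```

Bounding the coast time matters: without it, a long occlusion would let the predicted position drift arbitrarily far from the true marker.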
To achieve these resilience goals, we designed a vision-based autonomous landing system that encompasses landing marker design, landing procedure design, flight logic design, landing method design, and low-level hardware flight control. In addition, we integrated multiple visual-processing algorithms that cross-validate one another to enhance navigation resilience. Furthermore, by integrating visual feedback with path planning, we developed a highly resilient novel landing method.
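The flight-logic and landing-procedure components described above are typically organized as a staged state machine. The sketch below shows one plausible shape for such logic; the states, thresholds, and transitions are illustrative assumptions, not the thesis's actual design.

```python
# Hypothetical flight-logic sketch of a staged landing procedure:
# SEARCH for the marker, ALIGN over it, DESCEND while it stays centered,
# and LAND near the ground. States and thresholds are assumptions.

def landing_step(state, marker_visible, offset_px, altitude_m,
                 align_px=30, land_alt=0.3):
    if state == "SEARCH":
        return "ALIGN" if marker_visible else "SEARCH"
    if state == "ALIGN":
        if not marker_visible:
            return "SEARCH"
        return "DESCEND" if abs(offset_px) < align_px else "ALIGN"
    if state == "DESCEND":
        if not marker_visible or abs(offset_px) >= align_px:
            return "ALIGN"       # re-center before continuing down
        return "LAND" if altitude_m <= land_alt else "DESCEND"
    return "LAND"                # terminal state


# Walk one nominal flight through the machine:
s = "SEARCH"
trace = []
for visible, off, alt in [(False, 0, 10), (True, 80, 10), (True, 20, 10),
                          (True, 15, 5), (True, 10, 0.2), (True, 5, 0.1)]:
    s = landing_step(s, visible, off, alt)
    trace.append(s)
```

Falling back from DESCEND to ALIGN whenever the marker drifts off-center or disappears is one simple way such logic contributes to landing resilience.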
Description: Master's thesis
National Chengchi University
Department of Computer Science
110753208
Source URI: http://thesis.lib.nccu.edu.tw/record/#G0110753208
Data Type: thesis
Appears in Collections: [Department of Computer Science] Theses

    Files in This Item:

File: 320801.pdf (53953 Kb, Adobe PDF)

