    Please use this identifier to cite or link to this item: https://nccur.lib.nccu.edu.tw/handle/140.119/133895


    Title: 基於視覺導航之自主無人機環繞檢測
    Autonomous UAV Surround Inspection based on Visual Navigation
    Authors: 張為超
    Chang, Wei-Chao
    Contributors: 劉吉軒
    Liu, Jyi-Shane
    張為超
    Chang, Wei-Chao
    Keywords: 無人機
    SLAM
    行為樹
    建築物檢視
    UAV
    SLAM
    Behavior tree
    Building inspection
    Date: 2020
    Issue Date: 2021-02-01 14:10:48 (UTC+8)
Abstract: Unlike aerial imaging over open areas, aerial inspection of man-made structures requires more complex navigation from the drone. The drone needs to move toward the target object in a controlled manner to acquire close-up views of the structure surface, while at the same time avoiding collisions with the target for its own safety. In this paper, we present an autonomous inspection task for man-made structures based on visual navigation. We use SLAM as the basis for visual positioning, focus in particular on orbital (surround) inspection around a cylindrical, pole-like building, and test our method in actual flights.
Our technical contribution has two main aspects. First, we present our work as a relatively complete mission: starting from take-off, the drone autonomously identifies the target building, sets the path for the surround inspection, performs real-time correction while orbiting, and returns home after completing the user-specified number of orbits. Second, we use a behavior tree as the control architecture to integrate all functional components, improving the overall stability and feasibility of the system, which we develop on a low-cost micro drone. Real-world experiments show that the drone can perform the surround inspection task with a certain success rate and can acquire complete imagery of the target building for structural inspection.
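The mission flow summarized in the abstract (take off, identify the target, plan and fly the orbit with real-time correction, return home) maps naturally onto a behavior tree. Below is a minimal, self-contained Python sketch of that idea; the class names and mission-step stubs are hypothetical illustrations rather than the thesis's implementation, and a real system would replace the stubs with SLAM-based positioning and flight-control calls (for example via the bebop_autonomy driver cited in reference [32]).

# Minimal behavior-tree sketch of the mission flow described in the abstract.
# All names below are hypothetical illustrations, not the thesis code; the real
# system would replace the stubbed actions with SLAM-based positioning and
# flight-control commands.

from enum import Enum


class Status(Enum):
    SUCCESS = 0
    FAILURE = 1
    RUNNING = 2


class Sequence:
    """Composite node: ticks children in order and stops at the first
    child that is not SUCCESS (classic behavior-tree sequence semantics)."""
    def __init__(self, children):
        self.children = children

    def tick(self):
        for child in self.children:
            status = child.tick()
            if status != Status.SUCCESS:
                return status
        return Status.SUCCESS


class Action:
    """Leaf node wrapping a callable that returns a Status."""
    def __init__(self, name, fn):
        self.name = name
        self.fn = fn

    def tick(self):
        status = self.fn()
        print(f"{self.name}: {status.name}")
        return status


# Stubbed mission steps (placeholders for the real perception / control calls).
def take_off():              return Status.SUCCESS
def identify_target():       return Status.SUCCESS   # e.g., detect the cylindrical target
def plan_orbit_path():       return Status.SUCCESS   # set waypoints around the target
def orbit_with_correction(): return Status.SUCCESS   # correct pose from SLAM while orbiting
def return_home():           return Status.SUCCESS

mission = Sequence([
    Action("take_off", take_off),
    Action("identify_target", identify_target),
    Action("plan_orbit_path", plan_orbit_path),
    Action("orbit_with_correction", orbit_with_correction),
    Action("return_home", return_home),
])

if __name__ == "__main__":
    # Tick the tree from the root until the whole mission succeeds or fails;
    # long-running steps would return RUNNING until they complete.
    result = mission.tick()
    while result == Status.RUNNING:
        result = mission.tick()
    print("mission result:", result.name)

In an actual deployment the tree would be ticked at a fixed control rate, which is what lets the behavior-tree architecture integrate the functional components (target detection, SLAM-based correction, flight commands) while keeping the mission logic modular and easy to inspect.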
    Reference: [1] International Civil Aviation Organization (ICAO). Unmanned Aircraft Systems (UAS); ICAO: Montreal, QC, Canada, 2011.
[2] C. Stöcker, A. Eltner, and P. Karrasch, "Measuring gullies by synergetic application of UAV and close range photogrammetry—A case study from Andalusia, Spain", Catena, vol. 132, pp. 1-11, 2015.
[3] C. Yuan, Y. M. Zhang and Z. X. Liu, "A survey on technologies for automatic forest fire monitoring, detection and fighting using unmanned aerial vehicles and remote sensing techniques", Canadian Journal of Forest Research, published online 12 Mar. 2015.
    [4] H. Aasen, E. Honkavaara, A. Lucieer, and P. Zarco-Tejada, “Quantitative remote sensing at ultra-high resolution with UAV spectroscopy: A review of sensor technology, measurement procedures, and data correction workflows,” Remote Sens., vol. 10, no. 7, p. 1091, 2018.
    [5] M. Israel, "A UAV-based roe deer fawn detection system", Proc. Int. Conf. Unmanned Aerial Veh. Geomatics (UAV-g), vol. 38, pp. 1-5, 2011.
    [6] M. N. Gillins, D. T. Gillins and C. Parrish, "Cost-effective bridge safety inspections using unmanned aircraft systems (UAS)", Geotechnical and Structural Engineering Congress, 2016.
[7] M. Asim, D. N. Ehsan, and K. Rafique, ‘‘Probable causal factors in UAV accidents based on human factor analysis and classification system,’’ in Proc. 27th Int. Congr. Aeronaut. Sci., vol. 1905, p. 5, 2005.
[8] N. Hallermann and G. Morgenthal, "Visual inspection strategies for large bridges using unmanned aerial vehicles (UAV)", Proc. of 7th IABMAS International Conference on Bridge Maintenance Safety and Management, pp. 661-667, 2014.
    [9] S. Omari, P. Gohl, M. Burri, M. Achtelik and R. Siegwart, "Visual industrial inspection using aerial robots", Proceedings of CARPI, 2014.
[10] Y. Song, S. Nuske and S. Scherer, "A multi-sensor fusion MAV state estimation from long-range stereo, IMU, GPS and barometric sensors", Sensors, vol. 17, no. 1, 2017.
    [11] S. Ullman, "The interpretation of structure from motion", Proc. R. Soc. London, vol. B203, pp. 405-426, 1979.
    [12] J. Engel, V. Koltun and D. Cremers, "Direct sparse odometry", IEEE Trans. Pattern Anal. Mach. Intell., vol. 40, no. 3, pp. 611-625, Mar. 2018.
    [13] J. Engel, T. Schöps and D. Cremers, "LSD-SLAM: Large-scale direct monocular SLAM", Proc. Eur. Conf. Comput. Vision, pp. 834-849, Sep. 2014.
    [14] A. Buyval, I. Afanasyev and E. Magid, "Comparative analysis of ros-based monocular slam methods for indoor navigation", International Conference on Machine Vision (ICMV 2016), vol. 10341, pp. 103411K, 2017.
    [15] R. Mur-Artal, J. M. M. Montiel and J. D. Tardós, "ORB-SLAM: A versatile and accurate monocular SLAM system", IEEE Trans. Robot., vol. 31, no. 5, pp. 1147-1163, Oct. 2015.
    [16] M. Filipenko and I. Afanasyev, "Comparison of various slam systems for mobile robot in an indoor environment", International Conference on Intelligent Systems, Sep. 2018.
[17] V. De Araujo, A. P. G. S. Almeida, C. T. Miranda, and F. De Barros Vidal, “A parallel hierarchical finite state machine approach to UAV control for search and rescue tasks,” in Proceedings of the 11th International Conference on Informatics in Control, Automation and Robotics (ICINCO '14), pp. 410–415, Sep. 2014.
[18] M. Colledanchise and P. Ögren, "How behavior trees modularize hybrid control systems and generalize sequential behavior compositions, the subsumption architecture, and decision trees", IEEE Trans. Robot., vol. 33, no. 2, pp. 372-389, Apr. 2017.
[19] M. Sakuma, Y. Kobayashi, T. Emaru and A. Ravankar, "Mapping of pier substructure using UAV", IEEE/SICE International Symposium on System Integration, 2016.
[20] P. Shanthakumar, K. Yu, M. Singh, J. Orevillo, E. Bianchi, M. Hebdon, et al., "View planning and navigation algorithms for autonomous bridge inspection with UAVs", International Symposium on Experimental Robotics, pp. 201-210, 2018.
[21] A. Al-Kaff, F. M. Moreno, L. J. San José, F. García, D. Martín, A. De La Escalera, et al., "VBII-UAV: Vision-based infrastructure inspection-UAV", World Conference on Information Systems and Technologies WorldCist'17, pp. 221-231, 2017.
    [22] F. Kendoul, "Survey of advances in guidance, navigation, and control of unmanned rotorcraft systems," Journal of Field Robotics, vol. 29, no. 2, pp. 315-378, Mar. 2012.
    [23] I. Sa, S. Hrabar and P. Corke, "Outdoor flight testing of a pole inspection UAV incorporating high-speed vision", Springer Tracts Adv. Robot., vol. 105, pp. 107-121, Dec. 2015.
[24] S. A. K. Tareen and Z. Saleem, “A comparative analysis of SIFT, SURF, KAZE, AKAZE, ORB, and BRISK,” in 2018 International Conference on Computing, Mathematics and Engineering Technologies (iCoMET), pp. 1–10, March 2018.
[25] M. A. Fischler and R. C. Bolles, “Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography,” Commun. ACM, vol. 24, pp. 381-395, June 1981.
[26] G. Shi, X. Xu, and Y. Dai, ‘‘SIFT feature point matching based on improved RANSAC algorithm,’’ in Proc. 5th Int. Conf. Intell. Hum.-Mach. Syst. Cybern., vol. 1, pp. 474–477, Aug. 2013.
    [27] H. Strasdat, J. M. M. Montiel and A. J. Davison, "Scale drift-aware large scale monocular SLAM", Proc. Robot.: Sci. Syst., Jun. 2010.
[28] S. Choi, J. Park and W. Yu, "Resolving scale ambiguity for monocular visual odometry", IEEE International Conference on Ubiquitous Robots and Ambient Intelligence, pp. 604-608, 2013.
    [29] J. Heinly, E. Dunn and J.-M. Frahm, "Comparative evaluation of binary features", European Conf. Comput. Vision, pp. 759-773, 2012.
    [30] A. Sujiwo et al., "Robust and accurate monocular vision-based localization in outdoor environments of real-world robot challenge", J. Robot. Mechatronics, vol. 29, no. 4, pp. 685-696, 2017.
    [31] Parrot Drones SAS (n.d.). Retrieved October 4, 2020, from https://support.parrot.com/global/support/products
[32] Bebop_autonomy. (n.d.). Retrieved October 4, 2020, from https://bebop-autonomy.readthedocs.io/en/latest.
[33] I. Abdel-Qader, O. Abudayyeh, and M. E. Kelly, “Analysis of edge-detection techniques for crack identification in bridges,” J. Comput. Civil Eng., vol. 17, no. 4, pp. 255–263, Oct. 2003.
Description: Master's thesis
National Chengchi University
Department of Computer Science
107753035
    Source URI: http://thesis.lib.nccu.edu.tw/record/#G0107753035
    Data Type: thesis
    DOI: 10.6814/NCCU202100038
Appears in Collections: [Department of Computer Science] Theses

    Files in This Item:

303501.pdf (1825 KB, Adobe PDF)

