National Chengchi University Institutional Repository (NCCUR): Item 140.119/60249
    Please use this identifier to cite or link to this item: https://nccur.lib.nccu.edu.tw/handle/140.119/60249


    Title: 強健式視覺追蹤應用於擴增實境之研究
    Robust visual tracking for augmented reality
    Authors: 王瑞鴻
    Wang, Ruei Hong
    Contributors: 何瑁鎧
    Hor, Maw Kae
    王瑞鴻
    Wang, Ruei Hong
    Keywords: Augmented reality
    visual tracking
    stereo vision
    rigid body motion
    Date: 2010
    Issue Date: 2013-09-04 17:07:59 (UTC+8)
    Abstract: Visual tracking is one of the most important research topics in traditional computer vision, and many computer vision applications cannot be realized without it. The rapid success of augmented reality in recent years has likewise relied on advances in visual tracking: by tracking objects in the real scene, an augmented reality system can render virtual objects on top of them to achieve the desired application.
    The accuracy of visual tracking is easily affected by external factors such as object displacement, rotation, and scaling, as well as by illumination conditions. In this thesis, we designed a new set of fiducial markers to serve as tracking reference points. The markers reduce the errors induced by illumination changes as well as by object displacement, rotation, and scaling, and they can be located correctly even against complicated backgrounds, which increases tracking accuracy. Instead of tracking with a single camera in 2D image space, we used stereo vision techniques to track objects with 3D geometric information. We then exploited the rigidity of the tracked objects, searching for features that undergo the same rotation and displacement, and combined this with random sample consensus (RANSAC) to estimate the best rigid motion model and achieve robust tracking.
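    The thesis text itself is not reproduced on this page, so the following is only a minimal sketch of the motion-estimation step the abstract describes: given corresponding 3D marker positions in two frames (as triangulated by the stereo stage), fit the rigid rotation and translation with the SVD method of Arun et al. [2] inside a RANSAC loop [11]. All names and parameters below are illustrative, not taken from the thesis.

    import numpy as np

    def fit_rigid_motion(P, Q):
        # Least-squares R, t with Q ~ R @ P + t, via the SVD method of [2].
        # P, Q: (N, 3) arrays of corresponding 3D points (N >= 3, non-collinear).
        cp, cq = P.mean(axis=0), Q.mean(axis=0)
        H = (P - cp).T @ (Q - cq)                    # 3x3 cross-covariance matrix
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # reflection guard
        R = Vt.T @ D @ U.T
        return R, cq - R @ cp

    def ransac_rigid_motion(P, Q, iters=200, tol=0.01, seed=0):
        # RANSAC [11]: fit on minimal 3-point samples, keep the model with
        # the most inliers, then refit on all inliers of the best model.
        rng = np.random.default_rng(seed)
        best = None
        for _ in range(iters):
            idx = rng.choice(len(P), size=3, replace=False)
            R, t = fit_rigid_motion(P[idx], Q[idx])
            inliers = np.linalg.norm(Q - (P @ R.T + t), axis=1) < tol
            if best is None or inliers.sum() > best.sum():
                best = inliers
        return fit_rigid_motion(P[best], Q[best])

    Tracking then amounts to updating the object's pose with the estimated (R, t) each frame; correspondences whose residual exceeds tol are treated as outliers, such as mismatched features.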
    Moreover, from user-supplied video we can extract specific information and, through modeling techniques, generate virtual objects that are displayed on the user's interface (or on the tracked objects). These virtual objects convey information beyond what is visible in the real world, achieving navigation and guidance (or augmented reality) effects.
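    The page does not say how the overlay is rendered. As a hedged illustration only: once the pose (R, t) of the tracked object is known, drawing a virtual 3D point on the image reduces to a pinhole projection with the camera intrinsic matrix K (the kind of calibration produced by Bouguet's toolbox [30]). The numbers below are made up for the example.

    import numpy as np

    def project_point(X, R, t, K):
        # Project a 3D point X (object/world frame) to pixel coordinates
        # using the tracked pose (R, t) and the intrinsic matrix K.
        u, v, w = K @ (R @ X + t)       # camera frame, then homogeneous pixels
        return np.array([u / w, v / w])

    K = np.array([[800.0,   0.0, 320.0],    # illustrative focal length and
                  [  0.0, 800.0, 240.0],    # principal point, not calibrated values
                  [  0.0,   0.0,   1.0]])
    R, t = np.eye(3), np.zeros(3)
    print(project_point(np.array([0.1, 0.0, 2.0]), R, t, K))   # [360. 240.]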
    The experimental results show that our method recognizes markers quickly, resists illumination changes, and positions targets accurately, making it suitable for augmented reality applications. The markers we designed are also small, which makes them convenient for navigation and guidance applications.
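    The reference list cites ZNCC-based template matching [7], which is one standard way to obtain the illumination resistance the results describe: subtracting each patch's mean and normalizing by its energy cancels gain and offset changes in brightness. A minimal NumPy version, offered only as an illustration:

    import numpy as np

    def zncc(a, b, eps=1e-9):
        # Zero-mean normalized cross-correlation of two equal-size patches;
        # returns a score in [-1, 1], invariant to affine brightness changes.
        a = a - a.mean()
        b = b - b.mean()
        return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + eps))

    patch = np.random.rand(16, 16)
    print(zncc(patch, 2.5 * patch + 7.0))   # ~1.0 despite gain and offset change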
    Reference: [1] M. Adcock, M. Hutchins, and C. Gunn, "Augmented reality haptics: using ARToolKit for display of haptic applications," Proceedings of 2nd IEEE International Augmented Reality Toolkit Workshop, pp. 1-2, 2003.
    [2] K. S. Arun, T. S. Huang, and S. D. Blostein, "Least-Squares Fitting of Two 3-D Point Sets," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. PAMI-9, pp. 698-700, 1987.
    [3] R. T. Azuma, "A survey of augmented reality," Presence-Teleoperators and Virtual Environments, vol. 6, pp. 355-385, 1997.
    [4] H. Bay, T. Tuytelaars, and L. Van Gool, "SURF: Speeded Up Robust Features," Proceedings of European Conference on Computer Vision, pp. 404-417, 2006.
    [5] W. Broll, I. Lindt, I. Herbst, J. Ohlenburg, A. K. Braun, and R. Wetzel, "Toward Next-Gen Mobile AR Games," Journal of IEEE Computer Graphics and Applications, vol. 28, pp. 40-48, 2008.
    [6] V. F. da Camara Neto, D. Balbino de Mesquita, R. F. Garcia, and M. F. M. Campos, "On the Design and Evaluation of a Precise Scalable Fiducial Marker Framework," Conference on 23rd SIBGRAPI Graphics, Patterns and Images, pp. 216-223, 2010.
    [7] L. Di Stefano, S. Mattoccia, and F. Tombari, "ZNCC-based template matching using bounded partial correlation," Pattern Recognition Letters, vol. 26, pp. 2129-2134, 2005.
    [8] M. Fiala, "ARTag, a fiducial marker system using digital techniques," Conference on Computer Vision and Pattern Recognition, vol. 2, pp. 590-596, 2005.
    [9] M. Fiala, "Designing Highly Reliable Fiducial Markers," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 32, pp. 1317-1324, 2010.
    [10] J. Fischer, M. Eichler, and D. Bartz, "A hybrid tracking method for surgical augmented reality," Computers and Graphics, vol. 31, pp. 39-52, 2007.
    [11] M. A. Fischler and R. C. Bolles, "Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography," Communications of the ACM, vol. 24, pp. 381-395, 1981.
    [12] D. Flohr and J. Fischer, "A lightweight ID-based extension for marker tracking systems," Proceedings of Eurographics Symposium on Virtual Environments, pp. 59-64, 2007.
    [13] Y. Genc, S. Riedel, F. Souvannavong, C. Akinlar, and N. Navab, "Marker-less tracking for AR: a learning-based approach," Proceedings of International Symposium on Mixed and Augmented Reality, pp. 295-304, 2002.
    [14] H. Grabner, J. Matas, L. Van Gool, and P. Cattin, "Tracking the invisible: Learning where the object might be," IEEE Conference on Computer Vision and Pattern Recognition, pp. 1285-1292, 2010.
    [15] G. D. Hager and P. N. Belhumeur, "Real-time tracking of image regions with changes in geometry and illumination," Proceedings of Computer Vision and Pattern Recognition, pp. 403-410, 1996.
    [16] C. Harris and M. Stephens, "A Combined Corner and Edge Detection," Proceedings of The Fourth Alvey Vision Conference, pp. 147-151, 1988.
    [17] H. Kato and M. Billinghurst, "Marker tracking and HMD calibration for a video-based augmented reality conferencing system," Proceedings of the 2nd International Workshop on Augmented Reality, pp. 85-94, 1999.
    [18] D. Lowe, "Distinctive image features from scale-invariant keypoints," International Journal of Computer Vision, vol. 60, pp. 91-110, 2004.
    [19] M. Diaz, M. Alencastre-Miranda, L. Munoz-Gomez, and I. Rudomin, "Multi-User Networked Interactive Augmented Reality Card Game," Proceedings of International Conference on Cyberworlds, pp. 177-182, 2006.
    [20] W. Piekarski and B. H. Thomas, "Using ARToolKit for 3D hand position tracking in mobile outdoor environments," Proceedings of The First IEEE International Augmented Reality Toolkit Workshop, p. 2, 2002.
    [21] Q. Zhao, S. Brennan, and H. Tao, "Differential EMD Tracking," Proceedings of International Conference on Computer Vision, pp. 1-8, 2007.
    [22] G. Reitmayr and T. W. Drummond, "Going out: robust model-based tracking for outdoor augmented reality," Proceedings of International Symposium on Mixed and Augmented Reality, pp. 109-118, 2006.
    [23] T. Sielhorst, M. Feuerstein, and N. Navab, "Advanced Medical Displays: A Literature Review of Augmented Reality," Journal of Display Technology, vol. 4, pp. 451-467, 2008.
    [24] G. Silveira and E. Malis, "Real-time Visual Tracking under Arbitrary Illumination Changes," Proceedings of Computer Vision and Pattern Recognition, pp. 1-6, 2007.
    [25] T. Lee and T. Hollerer, "Handy AR: Markerless Inspection of Augmented Reality Objects Using Fingertip Tracking," Proceedings of the 2007 11th IEEE International Symposium on Wearable Computers, pp. 1-8, 2007.
    [26] T. Lee and T. Hollerer, "Multithreaded Hybrid Feature Tracking for Markerless Augmented Reality," IEEE Transactions on Visualization and Computer Graphics, vol. 15, pp. 355-368, 2009.
    [27] D. Wagner and D. Schmalstieg, "ARToolKitPlus for pose tracking on mobile devices," Proceedings of 12th Computer Vision Winter Workshop, pp. 6-8, 2007.
    [28] Z. Xiang, S. Fronz, and N. Navab, "Visual marker detection and decoding in AR systems: a comparative study," Proceedings of the 1st International Symposium on Mixed and Augmented Reality, pp. 97-106, 2002.
    [29] S. You and U. Neumann, "Mobile Augmented Reality for Enhancing E-Learning and E-Business," International Conference on Internet Technology and Applications, pp. 1-4, 2010.
    [30] J.-Y. Bouguet. Camera Calibration Toolbox for Matlab. www.vision.caltech.edu/bouguetj/calib_doc/
    Description: Master's thesis
    National Chengchi University
    Department of Computer Science
    Student ID: 98753014
    Academic year: 99
    Source URI: http://thesis.lib.nccu.edu.tw/record/#G0098753014
    Data Type: thesis
    Appears in Collections: [Department of Computer Science] Theses

    Files in This Item:

    File          Size      Format
    301401.pdf    6361 KB   Adobe PDF


    All items in 政大典藏 (the NCCU Repository) are protected by copyright, with all rights reserved.



    Copyright Announcement
    1. The digital content of this website is part of the National Chengchi University Institutional Repository. It is provided free of charge for academic research and public education. Please use it in a proper and reasonable manner that respects the rights of copyright owners; for commercial use, please obtain authorization from the copyright owner in advance.

    2. Every effort has been made in building this website to avoid infringing the rights of copyright owners. If you believe that any material on the website nevertheless infringes copyright, please notify the site maintainers (nccur@nccu.edu.tw); the work will be removed immediately and your claim investigated.