National Chengchi University Institutional Repository (NCCUR): Item 140.119/95269


Please use this permanent URL to cite or link to this item: https://nccur.lib.nccu.edu.tw/handle/140.119/95269


Title: Dense Image Matching from Multi-View Images (從多視角影像萃取密集影像對應)
Author: Tsai, Jui Yang (蔡瑞陽)
Contributors: Hor, Maw Kae (何瑁鎧); Tsai, Jui Yang (蔡瑞陽)
Keywords: image matching; multi-view images; epipolar transfer; three-dimensional model reconstruction
Date: 2009
Uploaded: 2016-05-09 15:29:12 (UTC+8)
Abstract: In the construction of three-dimensional models, the selection and refinement of point correspondences play a crucial role: the accuracy of the correspondences determines the quality of the resulting model. This thesis proposes a new approach that uses epipolar transfer to filter visible images and refine corresponding points in multi-view images. Initial correspondences are selected in one of two ways: by constructing three-dimensional patches with Furukawa's method and then rotating and translating them, or by simply moving the corresponding points within the two-dimensional images. The proposed epipolar transfer method then relocates each correspondence to a more appropriate position. Next, for every three-dimensional point, epipolar transfer is applied again to each of its visible images to check whether the corresponding point there is appropriately located, and a threshold is used to filter out unsuitable correspondences. Through this epipolar-geometry-based refinement and screening, the most accurate correspondence locations are sought. Experimental comparisons show that the proposed refinement improves correspondence accuracy by nearly 15 percent.
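The epipolar transfer at the heart of this method has a compact closed form. The following is a minimal sketch, not the thesis's implementation: it assumes the fundamental matrices F13 (view 1 to view 3) and F23 (view 2 to view 3) are known from calibration, and all function names and the 2-pixel threshold are illustrative. A correspondence (x1, x2) in views 1 and 2 transfers into view 3 as the intersection of the epipolar lines F13·x1 and F23·x2.

    import numpy as np

    def epipolar_line(F, x):
        # Epipolar line l = F @ x in the target view, scaled so that
        # |l . y| is the point-to-line distance in pixels.
        l = F @ x
        return l / np.linalg.norm(l[:2])

    def transfer_point(F13, F23, x1, x2):
        # Transfer the correspondence (x1 in view 1, x2 in view 2) into
        # view 3: the transferred point is the intersection (cross product
        # in homogeneous coordinates) of the two epipolar lines.
        l13 = epipolar_line(F13, x1)
        l23 = epipolar_line(F23, x2)
        x3 = np.cross(l13, l23)
        if abs(x3[2]) < 1e-9:
            return None              # near-parallel lines: transfer degenerates
        return x3 / x3[2]            # back to inhomogeneous form (x, y, 1)

    def is_consistent(F13, F23, x1, x2, x3_observed, threshold=2.0):
        # Visible-image check in the spirit of the abstract: keep the point
        # observed in view 3 (homogeneous, z = 1) only if it lies within
        # `threshold` pixels of the transferred location.
        x3 = transfer_point(F13, F23, x1, x2)
        if x3 is None:
            return False
        return float(np.linalg.norm((x3 - x3_observed)[:2])) <= threshold

Filtering each visible image by the distance between its observed correspondence and the transferred point mirrors the threshold test described above; the degenerate case, where the two epipolar lines (nearly) coincide, has to be handled separately.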
Table of contents:
Chapter 1 Introduction
1.1 Motivation and Objectives
1.2 Problem Description
1.3 System Architecture and Workflow
1.4 Contributions of This Thesis
1.5 Thesis Organization
Chapter 2 Related Work
Chapter 3 Background
3.1 Epipolar Geometry
3.2 Projective Geometry and 3D Coordinates
3.3 Zero-Mean Normalized Cross-Correlation (see the ZNCC sketch after this outline)
3.4 Multi-View Images
3.5 Epipolar Transfer
3.6 Color Models
Chapter 4 Selecting Corresponding Points
4.1 Selecting Correspondences via 3D Patches
4.1.1 Feature Point Selection
4.1.2 Constructing Initial 3D Patches
4.1.3 Optimizing 3D Patches
4.2 Selecting Correspondences by Moving Points in the Image Plane
Chapter 5 Filtering and Refining Corresponding Points
5.1 Epipolar Transfer
5.1.1 Selecting More Plausible Correspondences
5.1.2 Filtering and Refining Correspondences
5.2 Mutual Support Constraint
5.3 Ordering Constraint
5.4 Expansion and Filtering
Chapter 6 Experimental Results
6.1 Selecting Sparse Correspondences
6.2 Filtering and Refining Correspondences
6.3 Tests on Planar Objects
6.4 Comparison of Different Color Models
6.5 Expansion and Model Reconstruction
Chapter 7 Conclusions
7.1 Conclusions
7.2 Future Work
References
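Section 3.3 of the outline refers to zero-mean normalized cross-correlation (ZNCC), a standard similarity measure for comparing image patches around candidate correspondences. A minimal sketch follows; it is illustrative rather than the thesis code, and the function name is an assumption:

    import numpy as np

    def zncc(patch_a, patch_b):
        # Zero-mean normalized cross-correlation of two same-sized patches.
        # Returns a score in [-1, 1]; 1 means identical up to an affine
        # brightness/contrast change, which makes ZNCC robust to exposure
        # differences between views.
        a = patch_a.astype(np.float64).ravel()
        b = patch_b.astype(np.float64).ravel()
        a -= a.mean()
        b -= b.mean()
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        if denom == 0.0:
            return 0.0               # textureless patch: treat as uninformative
        return float(a @ b / denom)

Rejecting candidates whose ZNCC score against the reference patch falls below a chosen threshold is one natural form of the correspondence filtering outlined in Chapter 5.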
References: [1] Y. Furukawa and J. Ponce, "Accurate, Dense, and Robust Multi-View Stereopsis", IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 1-8, 2007.
[2] Y. Furukawa and J. Ponce, "Accurate Camera Calibration from Multi-View Stereo and Bundle Adjustment", IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 1-8, 2008.
[3] Y. Liu, X. Cao, Q. Dai and W. Xu, "Continuous Depth Estimation for Multi-view Stereo", IEEE Conference on Computer Vision and Pattern Recognition (CVPR), June 2009.
[4] D. G. Lowe, "Distinctive Image Features from Scale-Invariant Keypoints", International Journal of Computer Vision, Vol. 60, No. 2, pp. 91-110, 2004.
[5] M. Lhuillier and L. Quan, "Match Propagation for Image-based Modeling and Rendering", IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 24, No. 8, pp. 1140-1146, 2002.
[6] A. Yuille and T. Poggio, "A Generalized Ordering Constraint for Stereo Correspondence", AI Memo 777, MIT AI Lab, 1984.
[7] C.-Y. Tang, H.-L. Chou, Y.-L. Wu and Y.-H. Ding, "Robust Fundamental Matrix Estimation Using Coplanar Constraints", International Journal of Pattern Recognition and Artificial Intelligence, Vol. 24, No. 4, 2008.
[8] S. M. Seitz, B. Curless, J. Diebel, D. Scharstein and R. Szeliski, "A Comparison and Evaluation of Multi-View Stereo Reconstruction Algorithms", IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Vol. 1, pp. 519-528, 2006.
[9] R. Hartley and A. Zisserman, Multiple View Geometry in Computer Vision, Cambridge University Press, 2003.
[10] Jeng-Jiun T., "Robust Refinement Methods for Camera Calibration and 3D Reconstruction from Multiple Images", Journal of Visual Communication and Image Representation, 2009.
[11] Jui-Yang T., "Generation of Dense Image Matching Using Epipolar Geometry", International Display Manufacturing Conference & 3D Systems and Applications, 2009.
[12] Kun-Shin W., "Refinement of 3D Models Reconstructed from Visual Hull", Conference on Computer Vision, Graphics and Image Processing, 2009.
[13] 蔡政君, "Camera Calibration and 3D Model Reconstruction Using Bundle Adjustment and Multiple Images" (in Chinese), Master's thesis, Department of Computer Science, National Chengchi University, 2009.
[14] 洪莘逸, "Synthesizing Novel View Images Using Multi-View Images" (in Chinese), Master's thesis, Department of Information Management, Huafan University, 2007.
[15] 詹凱軒, "A Study on Automatic Reconstruction of Building Models from Terrestrial LiDAR Data" (in Chinese), Master's thesis, Department of Computer Science, National Chengchi University, 2007.
Description: Master's thesis
National Chengchi University
Department of Computer Science
96753015
Source: http://thesis.lib.nccu.edu.tw/record/#G0096753015
Data type: thesis
Appears in collections: [Department of Computer Science] Theses

Files in this item:

File: index.html (0 Kb, HTML, 2296 views)


All items in NCCUR are protected by original copyright.



Copyright Announcement
1. The digital content of this website is part of the National Chengchi University Institutional Repository. It provides free access for academic research and public education on a non-commercial basis. Please use the content in a proper and reasonable manner and respect the rights of copyright owners. For commercial use, please obtain authorization from the copyright owner in advance.

2. The NCCU Institutional Repository is maintained with care to protect the interests of copyright owners. If you believe that any material on the website infringes copyright, please contact our staff (nccur@nccu.edu.tw). We will remove the work from the repository and investigate your claim.