    Please use this identifier to cite or link to this item: https://nccur.lib.nccu.edu.tw/handle/140.119/126581


    Title: 3D互動敘事中以穿戴式裝置與虛擬角色互動之機制設計
    Using Wearable Devices to Interact with Virtual Agents in 3D Interactive Storytelling
    Authors: 王玟璇
    Wang, Wen-Hsuan
    Contributors: 李蔡彥
    Li, Tsai-Yen
    王玟璇
    Wang, Wen-Hsuan
    Keywords: Interactive storytelling
    Virtual reality
    Wearable devices
    Computer animation
    Date: 2019
    Issue Date: 2019-10-03 17:17:57 (UTC+8)
    Abstract:
    In recent years, more and more industries and companies have devoted themselves to the development of virtual reality applications such as job training and entertainment. However, most of them use traditional user interfaces such as buttons or predefined action options to interact with virtual agents. When a player has chosen her movement, the responses from NPCs are usually fixed canned animations, voice, or text outputs.
    We believe this kind of interaction does not allow players to immerse themselves in a virtual world easily. Instead, we propose using wearable devices to capture the player's gestures and use her natural movements as inputs. In addition, we make the animation module of the virtual characters parameterizable in order to deliver appropriate, flexible, and diversified responses. The player can thus experience different story plots and perceive responsive animation feedback when interacting with the virtual world.
    We have implemented an interactive storytelling system that captures and interprets the user's body actions through wearable devices. The system decides how to play the player character's animation accordingly, and the storyline is adjusted whenever an NPC interaction is triggered, leading to different story experiences. We conducted a user study comparing a traditional controller with the wearable devices as two input media. The participants evaluated the system by filling in questionnaires and were interviewed after the experiment. The results show that the interaction methods we designed are intuitive and easy to use compared to the controller, and that users are willing to play with the system multiple times, which confirms the replay value of our interactive storytelling system.
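    The interaction loop the abstract describes — a recognized gesture driving both a parameterized animation and a possible story branch — could be sketched roughly as follows. This is a minimal illustrative sketch only; all names, parameters, and mappings (`AnimationParams`, `parameterize`, `advance_story`, the gesture and state labels) are assumptions, not the thesis's actual implementation.

    ```python
    from dataclasses import dataclass

    @dataclass
    class AnimationParams:
        speed: float      # playback speed of the base clip
        amplitude: float  # how far limbs deviate from the neutral pose

    def parameterize(gesture: str, intensity: float) -> AnimationParams:
        """Map a recognized gesture to animation parameters, so one base
        clip can be rendered with varied speed and amplitude instead of
        playing a single canned animation."""
        base = {"wave": (1.0, 0.5), "point": (0.8, 0.3), "bow": (0.6, 0.8)}
        speed, amp = base.get(gesture, (1.0, 0.2))
        return AnimationParams(speed * intensity, amp * intensity)

    def advance_story(state: str, gesture: str) -> str:
        """Branch the storyline when a gesture triggers an NPC event;
        otherwise the story state is left unchanged."""
        transitions = {
            ("intro", "wave"): "npc_greets",
            ("intro", "bow"): "npc_bows_back",
            ("npc_greets", "point"): "npc_follows",
        }
        return transitions.get((state, gesture), state)

    # A waved greeting plays a faster, larger animation and moves the story on.
    params = parameterize("wave", 1.2)
    state = advance_story("intro", "wave")
    ```

    The point of the sketch is the decoupling: the same gesture feeds two independent consumers (animation parameterization and story branching), which is what lets identical characters and scenes yield different plot developments and animation feedback.
    
    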
    Description: Master's thesis
    National Chengchi University
    Department of Computer Science
    105753004
    Source URI: http://thesis.lib.nccu.edu.tw/record/#G0105753004
    Data Type: thesis
    DOI: 10.6814/NCCU201901172
    Appears in Collections: [Department of Computer Science] Theses

    Files in This Item:

    File: 300401.pdf (3179 KB, Adobe PDF)


    All items in the NCCU Institutional Repository (政大典藏) are protected by copyright, with all rights reserved.



    Copyright Announcement
    1. The digital content of this website is part of the National Chengchi University Institutional Repository. It provides free access for academic research, public education, and other non-commercial uses. Please use the content in a proper and reasonable manner and respect the rights of copyright owners. For commercial use, please obtain authorization from the copyright owner in advance.

    2. This website has been built with care to avoid infringing on the rights of copyright owners. If you believe that any material on the website nevertheless infringes copyright, please contact our staff (nccur@nccu.edu.tw). We will immediately remove the work from the repository and investigate your claim.
    DSpace Software Copyright © 2002-2004 MIT & Hewlett-Packard / Enhanced by NTU Library IR team.