政大機構典藏-National Chengchi University Institutional Repository(NCCUR):Item 140.119/79205
    Please use this identifier to cite or link to this item: https://nccur.lib.nccu.edu.tw/handle/140.119/79205


    Title: 以體感方式參與敘事的3D互動敘事系統
    Participating in Narratives with Motion-sensing Technologies in a 3D Interactive Storytelling System
    Authors: 楊奇珍
    Yang, Chi Chen
    Contributors: 李蔡彥
    Li, Tsai Yen
    楊奇珍
    Yang, Chi Chen
    Keywords: 互動數位敘事
    情境感知
    人機介面
    Interactive Storytelling
    Context-Aware
    Human-Computer Interaction
    Date: 2015
    Issue Date: 2015-11-02 14:49:50 (UTC+8)
    Abstract: In most existing interactive digital storytelling studies, users interact with the story through mouse or keyboard input; current systems rarely let users participate in the performance itself. In our study, we use a motion-sensing system to capture the user's hand gestures while guiding the user to choose among story lines, and we record the user's entire performance. The purpose of this study is to design a storyboard suitable for interactive storytelling and to let a user participate in the performance of a 3D interactive story through I/O devices such as Leap Motion, Oculus Rift, and a microphone. The system provides a guided human-computer interface in which gesture icons serve as references for selecting the desired story path.
    The user participates in the interactive performance from a first-person perspective, and the story line develops differently depending on the hand motions performed. We also record the user's voice during the performance so that the whole story can be reviewed afterwards, creating value for derivative works. Our system gives users more room to perform, enhances their immersion in the story, and lets them view the story from different perspectives, making it more engaging.
    We conducted an experiment to verify the value of the system. The results show that the 3D interactive stories produced by our system exhibit narrativity, interactivity, imagination, and immersion. Users found the works created with our system very interesting and tried out different story lines, confirming the replay value of our system.
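    The abstract describes a story whose path branches according to the hand gesture a user performs. As a minimal illustrative sketch (not taken from the thesis), this could be modeled as a scene graph keyed by gesture labels; the scene names, gesture labels, and replay function below are hypothetical stand-ins for the actual Leap Motion pipeline:

    ```python
    # Hypothetical branching story graph: each scene maps a recognized
    # gesture label to the next scene. Unrecognized gestures keep the
    # user in the current scene (the "guided" fallback).
    STORY = {
        "intro":          {"wave": "greet_villager", "point": "enter_forest"},
        "greet_villager": {"grab": "accept_quest", "wave": "intro"},
        "enter_forest":   {"grab": "pick_herb", "point": "find_cave"},
    }

    def next_scene(current, gesture):
        """Return the next scene for a gesture, or stay put if unrecognized."""
        return STORY.get(current, {}).get(gesture, current)

    def play(gestures, start="intro"):
        """Replay a recorded gesture sequence and return the visited path,
        mirroring the post-performance review the abstract describes."""
        path = [start]
        scene = start
        for g in gestures:
            scene = next_scene(scene, g)
            path.append(scene)
        return path

    print(play(["point", "grab"]))  # ['intro', 'enter_forest', 'pick_herb']
    ```

    Recording the gesture sequence (plus the voice track) is what makes the performance replayable afterwards: the same list of gesture labels deterministically reproduces the story path.
    
    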
    Description: Master's thesis
    National Chengchi University
    Department of Computer Science
    101753005
    Source URI: http://thesis.lib.nccu.edu.tw/record/#G0101753005
    Data Type: thesis
    Appears in Collections: [Department of Computer Science] Theses

    Files in This Item:

    File          Size     Format
    300501.pdf    2765 KB  Adobe PDF
    300502.pdf    2765 KB  Adobe PDF


    All items in 政大典藏 are protected by copyright, with all rights reserved.

