Please use this identifier to cite or link to this item: https://nccur.lib.nccu.edu.tw/handle/140.119/159084
Title: | The Influence of Anthropomorphism on Social Distance: A Bibliometric Study |
Authors: | 陳皓惇 Chen, Hao-Dun |
Contributors: | 梁定澎 Liang, Ting-Peng; 彭志宏 Peng, Chih-Hung; 陳皓惇 Chen, Hao-Dun |
Keywords: | Anthropomorphism; Digital Assistant; Social Distance; Social Presence; Bibliometric Analysis |
Date: | 2025 |
Issue Date: | 2025-09-01 15:02:32 (UTC+8) |
Abstract: | In recent years, with the rapid advancement of artificial intelligence, digital assistants and chatbots have become increasingly prevalent. The anthropomorphic design of these agents has drawn scholarly attention for its influence on user perception and behavior. Anthropomorphism not only enhances user engagement and trust but may also reshape the perceived social distance between humans and machines. This study applies bibliometric methods to examine research trends related to anthropomorphized digital assistants and social distance from 2000 to 2025. A total of 1,000 articles were collected from the Scopus and Web of Science databases. Analyses include publication trends, author-keyword co-occurrence, citation and co-citation analysis, and bibliographic coupling. The results reveal a significant increase in related research, especially in psychology, computer science, and marketing. Keyword analysis identifies core topics such as social presence, trust, and human-computer interaction. Visualizations further uncover three main research clusters, focusing on anthropomorphic strategies, user experience design, and technology acceptance models. This study outlines the current research landscape and its interdisciplinary integration potential, aiming to support future investigations in AI design and human-machine interaction. |
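The abstract describes the bibliometric workflow (publication trends, keyword co-occurrence, citation and co-citation analysis, bibliographic coupling) only at a high level. The following Python snippet is a minimal illustrative sketch, not taken from the thesis: it uses invented sample records in place of actual Scopus / Web of Science exports to show how an author-keyword co-occurrence count, of the kind typically fed into clustering and visualization tools such as VOSviewer, can be built.

```python
# Minimal sketch (not from the thesis): counting author-keyword co-occurrence.
# The sample records below are illustrative; real input would be the keyword
# fields exported from Scopus / Web of Science, one set per article.
from collections import Counter
from itertools import combinations

records = [
    {"anthropomorphism", "social presence", "trust"},
    {"chatbot", "anthropomorphism", "human-computer interaction"},
    {"social presence", "trust", "digital assistant"},
]

# Count how often each pair of keywords appears together in the same article.
pair_counts = Counter()
for keywords in records:
    for a, b in combinations(sorted(keywords), 2):
        pair_counts[(a, b)] += 1

# Pairs co-occurring at least twice would form the edges of a co-occurrence
# network; the threshold of 2 is an assumption for this toy example.
for (a, b), n in pair_counts.most_common():
    if n >= 2:
        print(f"{a} -- {b}: {n}")
```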
Description: | Master's thesis, National Chengchi University, Department of Management Information Systems, 108356010 |
Source URI: | http://thesis.lib.nccu.edu.tw/record/#G0108356010 |
Data Type: | thesis |
Appears in Collections: | [Department of Management Information Systems] Theses |
Files in This Item: | 601001.pdf | 8947Kb | Adobe PDF |