National Chengchi University Institutional Repository (NCCUR): Item 140.119/145809
    Please use this persistent URL to cite or link to this item: https://nccur.lib.nccu.edu.tw/handle/140.119/145809


    Title: 談話型AI如何扭曲與塑形全新的使用者體驗?
    How do biases in conversational artificial intelligence distort and shape a new user experience?
    Author: Saquet, Thémis (舒天宓)
    Advisor: Chuang, Howard (莊皓鈞)
    Keywords: Bias; Conversational Artificial Intelligence; Conversational Marketing; User Experience
    Date: 2023
    Upload time: 2023-07-06 16:34:53 (UTC+8)
    Abstract: In our post-COVID-19 societies, more and more consumers rely on conversational AI such as voice assistants and chatbots for all kinds of tasks, from asking about the weather to holding personal conversations. Companies have seized on this demand, continuing to develop their conversational AI to create an ever-better user experience. However, the racial and gender biases embedded in these AIs distort the original user experience, sometimes creating a new one depending on when the bias is introduced. We analyze the effect of these different biases on the user experience and how they can distort it, particularly depending on the moment at which these biases appear. To do so, we examine the biases of voice assistants and interactive social chatbots through a case study comparing XiaoIce and Microsoft Tay. We analyze the emergence of these biases and their effects using the three-stage framework for artificial intelligence in marketing. The main conclusions are that racial biases, embedded mainly because of insufficiently diverse data and engineering teams, and gender biases tend to reinforce the structural inequalities that affect our societies. By reinforcing these inequalities, they negatively affect the user experience in terms of accessibility, representation, and the experience conveyed by the use of the product.
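    The accessibility disparity the abstract describes (a speech recognizer performing worse for some speaker groups, as studied in Koenecke et al., reference 15) is typically quantified with word error rate (WER) per group. The following sketch is not from the thesis; the group labels and transcripts are hypothetical, and it only illustrates how such a per-group comparison can be computed.

    ```python
    def word_error_rate(reference: str, hypothesis: str) -> float:
        """WER: word-level edit distance divided by reference length."""
        ref, hyp = reference.split(), hypothesis.split()
        # dp[i][j] = edit distance between ref[:i] and hyp[:j]
        dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
        for i in range(len(ref) + 1):
            dp[i][0] = i
        for j in range(len(hyp) + 1):
            dp[0][j] = j
        for i in range(1, len(ref) + 1):
            for j in range(1, len(hyp) + 1):
                cost = 0 if ref[i - 1] == hyp[j - 1] else 1
                dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                               dp[i][j - 1] + 1,         # insertion
                               dp[i - 1][j - 1] + cost)  # substitution
        return dp[len(ref)][len(hyp)] / max(len(ref), 1)

    # Hypothetical recognizer output for the same spoken sentence,
    # grouped by speaker demographic (labels are illustrative only).
    samples = {
        "group_a": [("turn on the kitchen lights", "turn on the kitchen lights")],
        "group_b": [("turn on the kitchen lights", "turn on the chicken lights")],
    }

    for group, pairs in samples.items():
        wers = [word_error_rate(ref, hyp) for ref, hyp in pairs]
        print(group, sum(wers) / len(wers))
    ```

    A systematic gap in per-group average WER, computed this way over a large sample, is the kind of evidence behind the accessibility claim above.
    
    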
    References:
    1. De Cosmo, L. (2022) Google Engineer claims AI chatbot is sentient: Why that matters, Scientific American.
    Available at: https://www.scientificamerican.com/article/google-engineer-claims-ai-chatbot-is-sentient-why-that-matters/
    2. Pourquoi l'IA Conversationnelle ? IBM.
    Available at: https://www.ibm.com/fr-fr/topics/conversational-ai#:~:text=L`intelligence%20artificielle%20(IA),auxquelles%20les%20utilisateurs%20peuvent%20parler
    3. Le fonctionnement des assistants vocaux en 5 étapes (2020) CNIL. Commission nationale de l'informatique et des libertés.
    Available at: https://www.cnil.fr/fr/le-fonctionnement-des-assistants-vocaux-en-5-etapes#:~:text=Un%20assistant%20vocal%20est%20un,la%20requ%C3%AAte%20d`un%20utilisateur
    4. Qu'est-ce qu'un biais cognitif ? Définition Biais cognitif (2023) USABILIS.
    Available at: https://www.usabilis.com/definition-biais-cognitifs/
    5. Spradlin, L.K. (2012) Diversity matters: Understanding diversity in schools. Belmont, CA: Wadsworth, Cengage Learning.
    6. Braswell, P. (2022) This is the difference between racism and racial bias, Fast Company. Available at: https://www.fastcompany.com/90796690/this-is-the-difference-between-racism-and-racial-bias
    7. Eliminating gender bias in conversational AI: Strategies for fair and Inclusive Conversations (2023) Aivo.
    Available at: https://www.aivo.co/blog/eliminating-gender-bias-in-conversational-ai-strategies-for-fair-and-inclusive-conversations
    8. User experience (2020) What is User Experience? | Definition and Overview.
    Available at: https://www.productplan.com/glossary/user-experience/
    9. What is user experience (UX) design? (2022) The Interaction Design Foundation. Interaction Design Foundation.
    Available at: https://www.interaction-design.org/literature/topics/ux-design
    10. Huang, M.-H. and Rust, R.T. (2020) A strategic framework for artificial intelligence in marketing - journal of the Academy of Marketing Science, SpringerLink. Springer US. Available at: https://link.springer.com/article/10.1007/s11747-020-00749-9
    11. Edison Research (2020) The smart audio report 2020 from NPR and Edison Research, Edison Research. Edison Research http://www.edisonresearch.com/wp-content/uploads/2014/06/edison-logo-300x137.jpg.
    Available at: https://www.edisonresearch.com/the-smart-audio-report-2020-from-npr-and-edison-research/
    12. Transcript: Ai from above (EP3) (2022) The Internet Health Report 2022.
    Available at: https://2022.internethealthreport.org/transcript-ai-from-above-ep3/
    13. Harwell, D. (2018) The accent gap: How Amazon's and Google's smart speakers leave certain voices behind, The Washington Post. WP Company.
    Available at: https://www.washingtonpost.com/graphics/2018/business/alexa-does-not-understand-your-accent/
    14. Sidnell, J. (2019). African American Vernacular English. [online] Hawaii.edu.
    Available at: https://www.hawaii.edu/satocenter/langnet/definitions/aave.html.
    15. Koenecke, A. et al. (2020) Racial disparities in automated speech recognition | PNAS, Proceedings of the National Academy of Sciences, PNAS.
    Available at: https://www.pnas.org/doi/10.1073/pnas.1915768117
    16. Online resources for African American language (no date) CORAAL | Online Resources for African American Language.
    Available at: https://oraal.uoregon.edu/coraal
    17. Lloreda, C.L. (2020). Speech Recognition Tech Is Yet Another Example of Bias.
    Scientific American.
    Available at: https://www.scientificamerican.com/article/speech-recognition-tech-is-yet-another-example-of-bias/

    18. Cantone, J.A., Martinez, L.N., Willis-Esqueda, C. and Miller, T. (2019). Sounding guilty: How accent bias affects juror judgments of culpability. Journal of Ethnicity in Criminal Justice, 17(3), pp.228–253.
    doi:https://doi.org/10.1080/15377938.2019.1623963

    19. Baquiran, C.L.C. and Nicoladis, E. (2019). A Doctor’s Foreign Accent Affects Perceptions of Competence. Health Communication, 35(6), pp.726–730.
    doi:https://doi.org/10.1080/10410236.2019.1584779

    20. Baraniuk, C. (2022). Why your voice assistant might be sexist. [online] www.bbc.com.
    Available at: https://www.bbc.com/future/article/20220614-why-your-voice-assistant-might-be-sexist

    21. CNN (n.d.). Why computer voices are mostly female. [online] CNN.
    Available at: https://edition.cnn.com/2011/10/21/tech/innovation/female-computer-voices/

    22. Gizmodo. (n.d.). No, Women’s Voices Are Not Easier to Understand Than Men’s Voices. [online]
    Available at: https://gizmodo.com/no-siri-is-not-female-because-womens-voices-are-easier-1683901643

    23. LEVVVEL (2023). How many copies did Halo sell? — 2023 statistics. [online] levvvel.com.
    Available at: https://levvvel.com/halo-statistics/#:~:text=The%20Halo%20series%20has%20sold%2081%20million%20copies.&text=The%20series%20went%20from%2065

    24. Franceinfo. (2020). Nouveau monde. Pourquoi les prénoms de Siri et d’Alexa pour des assistants vocaux ? [online]
    Available at: https://www.francetvinfo.fr/replay-radio/nouveau-monde/nouveau-monde-pourquoi-les-prenoms-de-siri-et-d-alexa-pour-des-assistants-vocaux_4045705.html

    25. Gizmodo. (2015). Why Is My Digital Assistant So Creepy? [online]
    Available at: https://gizmodo.com/why-is-my-digital-assistant-so-creepy-1682216423

    26. World Economic Forum. (n.d.). Hey Siri, you’re sexist, finds UN report on gendered technology. [online]
    Available at: https://www.weforum.org/agenda/2019/05/hey-siri-youre-sexist-finds-u-n-report-on-gendered-technology/

    27. www.ibm.com. (n.d.). Qu’est-ce qu’un agent conversationnel ? | IBM. [online]
    Available at: https://www.ibm.com/fr-fr/topics/chatbots

    28., 29. & 30. Salesforce.com. (n.d.). State of the Connected Customer Report. [online]
    Available at: https://www.salesforce.com/resources/research-reports/state-of-the-connected-customer/.

    31. Drift. (2021). 2021 State of Conversational Marketing. [online]
    Available at: https://www.drift.com/books-reports/conversational-marketing-trends/#state-of-convo-marketing.

    32. Weizenbaum, J. (1966). ELIZA---a computer program for the study of natural language communication between man and machine. Communications of the ACM, [online] 9(1), pp.36–45.
    doi:https://doi.org/10.1145/365153.365168

    33. Colby, K.M., Weber, S. and Hilf, F.D. (1971). Artificial Paranoia. Artificial Intelligence, 2(1), pp.1–25.
    doi:https://doi.org/10.1016/0004-3702(71)90002-6

    34. Wallace, R.S. (2007). The Anatomy of A.L.I.C.E. Parsing the Turing Test, [online] pp.181–210.
    doi:https://doi.org/10.1007/978-1-4020-6710-5_13

    35. Zhou, L., Gao, J., Li, D. and Shum, H.-Y. (2020). The Design and Implementation of XiaoIce, an Empathetic Social Chatbot. Computational Linguistics, 46(1), pp.53–93.
    doi:https://doi.org/10.1162/coli_a_00368

    36. Quach, K. (n.d.). Microsoft chatbots: Sweet XiaoIce vs foul-mouthed Tay. [online] www.theregister.com.
    Available at: https://www.theregister.com/2016/09/29/microsofts_chatbots_show_cultural_differences_between_the_east_and_west/

    37. Microsoft chatbot is taught to swear on Twitter. (2016). BBC News. [online] 24 Mar.
    Available at: https://www.bbc.com/news/technology-35890188.

    38. Hunt, E. (2018). Tay, Microsoft’s AI chatbot, gets a crash course in racism from Twitter.
    [online] The Guardian.
    Available at: https://www.theguardian.com/technology/2016/mar/24/tay-microsofts-ai-chatbot-gets-a-crash-course-in-racism-from-twitter.

    39. Explains, K. (2020). Remembering Microsoft’s Chatbot disaster. [online] Medium.
    Available at: https://uxplanet.org/remembering-microsofts-chatbot-disaster-3a49d4a6331f

    40. Editor (2016). XiaoIce Vs. Tay: Two A.I. Chatbots, Two Different Outcomes. [online] Sampi.co.
    Available at: https://sampi.co/chinese-chatbot-xiaoice-vs-tay/

    41 & 42. Miller, K.W., Wolf, M.J. and Grodzinsky, F.S. (2017). Why We Should Have Seen That Coming. ORBIT Journal, 1(2).
    doi:https://doi.org/10.29297/orbit.v1i2.49

    43. Leetaru, K. (n.d.). How Twitter Corrupted Microsoft’s Tay: A Crash Course In the Dangers Of AI In The Real World. [online] Forbes.
    Available at: https://www.forbes.com/sites/kalevleetaru/2016/03/24/how-twitter-corrupted-microsofts-tay-a-crash-course-in-the-dangers-of-ai-in-the-real-world/?sh=ffeebf926d28

    44. UNESCO (2021). Problème de fille : briser les préjugés dans l’IA. [online] UNESCO.
    Available at: https://fr.unesco.org/girltrouble

    45. Zendesk. (n.d.). 7 ways to reduce bias in conversational AI. [online]
    Available at: https://www.zendesk.tw/blog/7-ways-reduce-bias-conversational-ai/#georedirect

    46. SoundHound. (n.d.). How to Overcome Cultural Bias in Voice AI Design. [online]
    Available at: https://www.soundhound.com/resources/how-to-overcome-cultural-bias-in-voice-ai-design/
    47. Dotan, R. (2023) Example of gender bias in ChatGPT, Ravit Dotan. Available at: https://www.ravitdotan.com/post/example-of-gender-bias-in-chatgpt
    48. Shafik and Case (eds.) (2022) Advances in manufacturing technology XXXV, IOS Press. Available at: https://www.iospress.com/catalog/books/advances-in-manufacturing-technology-xxxv
    49. Kotek, H. (2023) Doctors can’t get pregnant and other gender biases in ChatGPT. Available at: https://hkotek.com/blog/gender-bias-in-chatgpt/

    Additional resources

    1. Wheeler, D.R. (2014). Why are Cortana and Siri female? [online] CNN.
    Available at: https://edition.cnn.com/2014/04/04/opinion/wheeler-cortana-siri/

    2. Ohlheiser, A. (n.d.). Trolls turned Tay, Microsoft’s fun millennial AI bot, into a genocidal maniac. [online] Washington Post.
    Available at: https://www.washingtonpost.com/news/the-intersect/wp/2016/03/24/the-internet-turned-tay-microsofts-fun-millennial-ai-bot-into-a-genocidal-maniac/

    3. Mathur, V., Stavrakas, Y. and Singh, S. (2016). Intelligence analysis of Tay Twitter bot. [online] ResearchGate.
    Available at: https://www.researchgate.net/publication/316727714_Intelligence_analysis_of_Tay_Twitter_bot

    4. Nouri, S. (n.d.). Council Post: The Role Of Bias In Artificial Intelligence. [online] Forbes.
    Available at: https://www.forbes.com/sites/forbestechcouncil/2021/02/04/the-role-of-bias-in-artificial-intelligence/?sh=3d43f9f7579d

    5. UN News. (2019). Are robots sexist? UN report shows gender bias in talking digital tech. [online]
    Available at: https://news.un.org/en/story/2019/05/1038691

    6. West, M., Kraut, R. and Chew, H.E. (2019). I’d blush if I could: closing gender divides in digital skills through education.
    Available at: https://unesdoc.unesco.org/ark:/48223/pf0000367416.page=1
    Description: Master's thesis
    National Chengchi University
    International Master of Business Administration Program (IMBA)
    111933056
    Source: http://thesis.lib.nccu.edu.tw/record/#G0111933056
    Type: thesis
    Appears in Collections: [International MBA Program (IMBA)] Theses

    Files in this item:

    File: 305601.pdf | Size: 2641 KB | Format: Adobe PDF | Views: 2166 | View/Open


    All items in the NCCU Institutional Repository are protected by copyright.



    Copyright Announcement
    1. The digital content of this website is part of the National Chengchi University Institutional Repository. It provides free access to academic research and public education for non-commercial use. Please utilize it in a proper and reasonable manner and respect the rights of copyright owners. For commercial use, please obtain authorization from the copyright owner in advance.

    2. NCCU Institutional Repository is made to protect the interests of copyright owners. If you believe that any material on the website infringes copyright, please contact our staff (nccur@nccu.edu.tw). We will remove the work from the repository and investigate your claim.