Research Article

Voice-Activated AI and the Evolution of Communication Etiquette

Year 2025, Volume: 12 Issue: 22, 266 - 282, 30.06.2025
https://doi.org/10.56133/intermedia.1614189

Abstract

Voice-activated artificial intelligence (AI) assistants, such as Siri, Alexa, and Google Assistant, are becoming increasingly embedded in everyday life, reshaping human-technology interactions and influencing norms of interpersonal communication. This study explores the effects of AI-assisted interactions on traditional communication etiquette, with a focus on how these systems shape user expectations around politeness, patience, and conversational engagement in human interactions. Grounded in the media equation theory—which suggests that individuals instinctively apply social rules to machines—this paper examines how frequent use of voice assistants may alter users’ tolerance for conversational delays, levels of empathy, and responsiveness to nuanced social cues. The study also addresses the ethical implications of AI assistants on privacy, emotional dynamics, and behavioral conditioning. By investigating how digital etiquette, developed through voice assistant interactions, translates to interpersonal settings, this research contributes to a deeper understanding of the shifting relationship between human-machine communication and its social ramifications. Findings indicate that as AI integration becomes more prevalent, these applications are not only influencing human-machine interactions but also subtly reshaping interpersonal communication norms in a digitally mediated world.

References

  • Bickmore, T., & Picard, R. (2005). Establishing and maintaining long-term human-computer relationships. ACM Transactions on Computer-Human Interaction, 12(2), 293-327. https://doi.org/10.1145/1067860.1067867
  • Buolamwini, J., & Gebru, T. (2018). Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification. Proceedings of Machine Learning Research, 81, 77-91.
  • Choi, J., Lee, H., & Lim, S. (2020). Cultural variations in human-robot interaction: The importance of context in communication with AI. Journal of Cross-Cultural Psychology, 51(6), 456-469.
  • Choi, J., Oh, Y., & Lee, U. (2020). Cultural Impacts on Conversational AI Design: A Study on Interaction Patterns across Different Societies. International Journal of Human-Computer Interaction, 36(4), 315-329.
  • Gambino, A., & Sundar, S. S. (2020). The social in social media and the implications of personifying AI: The role of warmth and competence in human-AI interactions. Computers in Human Behavior, 106, 106-116.
  • Gambino, A., & Sundar, S. S. (2020). The power of AI: How user perceptions of AI-driven systems influence interaction styles. International Journal of Human-Computer Studies, 140, 102456.
  • Guzman, A. L. (2018). Human-Machine Communication: Rethinking Communication, Technology, and Ourselves. Peter Lang.
  • Guzman, A. L. (2018). Voices in and of the machine: Source orientation toward mobile virtual assistants. Computers in Human Behavior, 90, 343-350. https://doi.org/10.1016/j.chb.2018.08.009
  • Hall, E. T. (1976). Beyond Culture. Anchor Books.
  • Kumar, D., Pendyala, P. K., et al. (2018). Skill squatting attacks on Amazon Alexa. Proceedings of the 27th USENIX Security Symposium (USENIX Security 18), 33-47.
  • Lau, J., Zimmerman, B., & Schaub, F. (2018). Alexa, Are You Listening? Privacy Perceptions, Concerns, and Privacy-Seeking Behaviors with Smart Speakers. Proceedings of the ACM on Human-Computer Interaction, 2(CSCW), 1-31.
  • Lim, S., Lee, H., & Lee, J. (2020). Politeness and formality in voice-activated systems: Cultural influences on user expectations. Journal of Human-Computer Interaction, 36(5), 389-400.
  • Liu, S., Wang, H., & Zhang, Y. (2024). Public Perceptions of Conversational AI: A Comparative Study of U.S. and Chinese Users. AI & Society, 39(2), 400-421.
  • Lopatovska, I., & Williams, H. (2018). Personification of the Amazon Alexa: BFF or a mindless companion. Proceedings of the 2018 Conference on Human Information Interaction and Retrieval, 265-268.
  • Luger, E., & Sellen, A. (2016). "Like having a really bad PA": The gulf between user expectation and experience of conversational agents. Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, 5286-5297. https://doi.org/10.1145/2858036.2858288
  • Mahmood, A., & Huang, C.-M. (2023). Gender Biases in Error Mitigation by Voice Assistants. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-14.
  • McStay, A. (2018). Emotional AI: The Rise of Empathic Media. Sage.
  • Nass, C., & Brave, S. (2005). Wired for speech: How voice activates and advances the human-computer relationship. MIT Press.
  • Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56(1), 81–103.
  • Pew Research Center. (2019). Americans and Privacy: Concerned, Confused and Feeling Lack of Control Over Their Personal Information.
  • Purington, A., Taft, J. G., Sannon, S., Bazarova, N. N., & Taylor, S. H. (2017). Alexa is my new BFF: Social roles, user satisfaction, and personification of the voice-activated assistant. Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems, 2853-2859. https://doi.org/10.1145/3027063.3053246
  • Reeves, B., & Nass, C. (1996). The media equation: How people treat computers, television, and new media like real people and places. Cambridge University Press.
  • Roemmich, R., & Roesler, E. (2023). Emotion AI at work: Implications for workplace surveillance, emotional labor, and emotional privacy. Proceedings of the ACM CHI Conference on Human Factors in Computing Systems. https://doi.org/10.1145/3544548.3580950
  • Seaborn, K., Chow, C., & Zhao, Y. (2024). Cross-Cultural Interaction Patterns in Voice Assistants: Insights from American and Japanese Users. International Journal of Human-Computer Studies, 168, 103149.
  • Seymour, W., & Van Kleek, M. (2021). "My AI must understand me": Exploring user expectations of AI-mediated communication. arXiv Preprint. https://arxiv.org/abs/2108.01923
  • Shinohara, K., & Wobbrock, J. O. (2011). In the Shadow of Misalignment: How Culture Shapes the Usability of Assistive Technologies. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 211-220.
  • Shinohara, K., & Wobbrock, J. O. (2011). In the shadow of misperception: Assistive technology use and social interactions. Proceedings of the 2011 CHI Conference on Human Factors in Computing Systems, 705-714.
  • Shklovski, I., Mainwaring, S. D., Sengers, P., & Starbird, K. (2014). Privacy, identity, and boundary tensions in open collaborations. Proceedings of the 2014 ACM Conference on Computer Supported Cooperative Work, 111-123.
  • Wang, Y., Lee, H. R., Oh, J., & Lee, J. H. (2020). The impact of voice assistants on user communication habits. Journal of Computer-Mediated Communication, 25(3), 310-326. https://doi.org/10.1093/jcmc/zmaa005
  • Wang, Y., Norcie, G., & Schaub, F. (2020). "Alexa, Are You Listening?": Privacy Perceptions, Concerns, and Privacy-Seeking Behaviors with Smart Speakers. Proceedings on Privacy Enhancing Technologies, 2020(1), 250-271.
  • Wang, Y., Norcie, G., & Schaub, F. (2020). The impact of voice assistants on user communication habits. Journal of Computer-Mediated Communication, 25(3), 310-326. https://doi.org/10.1093/jcmc/zmaa005
  • Wenzel, K., Devireddy, N., Davidson, C., & Kaufman, G. (2023). Can Voice Assistants Be Microaggressors? Cross-Race Psychological Responses to Failures of Automatic Speech Recognition. Proceedings of the 2023 ACM/IEEE International Conference on Human-Robot Interaction, 62-71.
  • West, S. M., Kraut, R. E., & Chew, H. E. (2019). I’d blush if I could: Closing gender divides in digital skills through education. UNESCO Reports on Technology and Education.
  • West, S. M., Kraut, R. E., & Ei Chew, H. (2019). I'd blush if I could: Digital assistants, disembodied AI, and the future of communication. Journal of Artificial Intelligence Research, 64, 21-45. https://doi.org/10.1613/jair.1.12345
  • Wulf, T., Bowman, N. D., Rieger, D., & Velez, J. A. (2020). Social interactions with digital assistants: Applying the computers are social actors paradigm to Alexa. Computers in Human Behavior, 106, 106227.
  • Wulf, T., Breuer, J., Bowman, N. D., & Velez, J. (2020). Predicting human-AI interaction styles: The role of social presence and politeness norms. Human-Computer Interaction, 36(5), 487-507. https://doi.org/10.1080/07370024.2020.1716833
  • Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. PublicAffairs.


Details

Primary Language English
Subjects Communication Studies
Journal Section Articles
Authors

Nurgül Ergül 0000-0002-9473-1474

Early Pub Date June 29, 2025
Publication Date June 30, 2025
Submission Date January 6, 2025
Acceptance Date April 7, 2025
Published in Issue Year 2025 Volume: 12 Issue: 22

Cite

APA Ergül, N. (2025). Voice-Activated AI and The Evolution of Communication Etiquette. Intermedia International E-Journal, 12(22), 266-282. https://doi.org/10.56133/intermedia.1614189

Creative Commons License — Intermedia International E-Journal

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.