    AI Writing Assistants Guilty of ‘Cultural Stereotyping and Language Homogenization’



Image: An employee getting frustrated in front of his laptop. (Prostock-studio/Envato Elements)

eWEEK content and product recommendations are editorially independent. We may make money when you click on links to our partners.

    AI-powered writing tools promise to democratize communication, helping people write faster, more clearly, and with greater confidence. But as these AI tools go global, a growing body of research warns they may be reshaping cultural identity in subtle but significant ways.

    AI writing tools homogenize global voices

    A new study from Cornell has identified an unexpected effect from the international reach of AI assistants: They homogenize language, making billions of users in the Global South sound more like Americans.

    In the study, participants from the US and India who used an AI writing assistant produced more similar writing than those who wrote without one. Indian participants also spent more time editing the AI’s suggestions to better reflect their cultural context, which ultimately reduced the tool’s overall productivity benefits.     

    Cultural stereotyping through predictive suggestions

    “This is one of the first studies, if not the first, to show that the use of AI in writing could lead to cultural stereotyping and language homogenization,” said Aditya Vashistha, assistant professor of information science and the senior author of the study. 

    “People start writing similarly to others, and that’s not what we want,” Vashistha added. “One of the beautiful things about the world is the diversity that we have.”

    How the Cornell study about AI writing assistants was designed

The Cornell study recruited 118 participants, roughly half from the US and half from India. Participants were asked to write about cultural topics; in each country, half wrote independently and half used an AI writing assistant.

    Indian participants using the AI writing assistant accepted about 25% of the tool’s suggestions, while American writers accepted roughly 19%. However, Indians were far more likely to modify the suggestions to fit their cultural writing style, making the tool much less helpful.

    Western norms embedded in AI defaults         

For example, when participants wrote about their favorite food and holiday, the AI assistant suggested distinctly American answers, such as pizza and Christmas. And when writing about their favorite actors, Indian participants who typed "S" were offered Shaquille O'Neal or Scarlett Johansson rather than Bollywood star Shah Rukh Khan.

The likely reason for this Western bias is that AI assistants like ChatGPT are powered by large language models (LLMs) developed by US tech companies. These tools are now used worldwide, including across the Global South, which is home to roughly 85% of the world's population.

Rising concerns about 'AI colonialism'

    Researchers suggest that Indian users are now facing “AI colonialism,” with the bias of these assistants presenting Western culture as superior. This has the potential to change not only the way non-Western users write but also how they think.  

    “These technologies obviously bring a lot of value into people’s lives,” said Paromita Agarwal, a co-author of the study. “But for that value to be equitable and for these products to do well in these markets, tech companies need to focus on cultural aspects, rather than just language aspects.”

    TechnologyAdvice contributing writer Michael Kurko wrote this article.
