New MIT class uses anthropology to develop chatbots | MIT News

Young adults growing up in the attention economy – preparing for adult life while social media and chatbots compete for their attention – can easily fall into an unhealthy relationship with digital platforms. But what if chatbots weren’t just distractions from real life? Could they be designed in a more humanistic way, as ethical companions whose digital mission is to be a social guide rather than an addictive escape?

At MIT, a friendship between two professors – one an anthropologist, the other a computer scientist – led to the creation of an undergraduate class determined to find answers to those questions. Combining these two seemingly disparate disciplines, the class encourages students to build artificial intelligence chatbots in humanistic ways that help users improve themselves.

The class, 6.S061/21A.S02 (Humane User Experience Design, also known as Humane UXD), is a computer science class cross-listed in anthropology. The dual listing allows computer science majors to fulfill a humanities requirement while pursuing their career goals. The two professors use the methods of linguistic anthropology to teach students how to integrate the social and interpersonal needs of people into their programs.

Professor Arvind Satyanarayan, a computer scientist whose research develops tools for interactive data visualization and user interaction, and Professor Graham Jones, an anthropologist whose research focuses on communication, created Humane UXD last summer with a grant from the MIT Morningside Academy for Design (MAD). The MIT MAD Curriculum Design program provides funding to faculty to develop new classes or improve existing ones using innovative teaching methods that cross departmental boundaries.

The Curriculum Design program is currently accepting applications for the 2026-27 academic year; the deadline is Friday, March 20.

Jones and Satyanarayan met a few years ago when they jointly advised doctoral student research on data visualization for the visually impaired. Since then they have become close friends who can finish each other’s sentences.

“There’s a way you don’t fully express what you know or how you think until you’re teaching,” said Jones. “So, it’s been interesting for me to see Arvind express his expertise as a teacher in a way that allows me to see how the pieces fit together – and find the fundamental similarities between our subjects and our ways of thinking.”

Satyanarayan continues that thought: “One of the things I’ve really enjoyed is the flip side of what Graham said, which is that my field – human-computer interaction – has inherited a lot of methods from anthropology, like interviews and user studies and observational studies. And over the decades, those methods have been streamlined to be very efficient. In the process, a lot has been lost.

“For example, I was very happy to see how an anthropologist teaches students to interview people. It’s completely different from how I would do it. In my way, we lose the relationship and connection that you need to build with your interview participant. Instead, we just extract data from them.”

For Jones, teaching with a computer scientist has another kind of attraction: design. He says that human speech and interaction are organized into basic genres with stable conventions that distinguish an interview from a cocktail party from a conversation at a funeral.

“ChatGPT and other large language models have been trained on naturally occurring human communication, so they have all those genres inside of them in a latent state, waiting to be activated,” he says.

“As a social scientist, I teach ways to analyze human conversation, and I give students very powerful tools to do that. But it often ends up being a pure research activity, whereas this is a design class, where students build real-world systems.”

The curriculum also seems designed to prepare students for jobs once they finish their studies. One student asked for permission to miss a week of class because he had a tryout internship at a chatbot startup; when he came back, he said the work was remarkably similar to what he had learned in class. He got the job.

A sampling of the class’s projects below, built with Google’s Gemini, shows some of what’s possible if, as Jones says, “there’s a really deep integration of the technology piece with the human piece.” The students’ design work shows that entirely new kinds of designs become possible when the human element comes first.

The bots clearly show that an interdisciplinary class can be designed in a way that benefits everyone: students learn more, and differently; they can fulfill a humanities requirement with a class that directly benefits their careers; and long-term intellectual partnerships can be formed or strengthened.

Team Pond

One project promises to be especially useful for graduating seniors. Pond is designed to help young college graduates adjust to the challenges of independent adult life. Team Pond configured the chatbot so that it neither simply coddles the user nor praises wrong answers. Instead, Pond offers advice to help with “adulting” (behaving like a responsible adult).

“Pond was created to be your companion from college life to post-college life, to help you in your transition from being a big fish in a small pond to being a small fish in a very big pond,” said sophomore Mary Feliz.

“College is a very close-knit and high-context place, in the sense that everyone around you is going through the same thing, and it’s easy to build relationships or find opportunities, because there are structured ways to reach out,” explains graduate student Emaan Khan. “Post-grad life is low-context. You’re not always surrounded by your peers or your professors. It’s not as close-knit either, in the sense that you don’t have opportunities at your doorstep. Pond is a tool that helps empower you to reach those opportunities, or learn how to navigate on your own.”

Pond’s developers are graduate student Claire Camacho, Harvard University graduate student Charlotte Cheah, Feliz, and Khan. They trained Pond to provide expertise in three areas: social wellness, professional wellness, and adulting skills.
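The article describes Pond’s setup but not its code. As a rough illustration, the configuration described above, an instruction against empty praise plus routing among three expertise areas, could be sketched as follows. Everything here (the prompt wording, the keyword lists, the function names) is a hypothetical sketch, not the team’s implementation.

```python
# Hypothetical sketch of a Pond-style configuration: a system prompt
# that discourages empty praise, plus simple keyword routing into the
# three expertise areas named in the article. Illustrative only.

SYSTEM_PROMPT = (
    "You are Pond, a companion for recent college graduates. "
    "Do not simply coddle the user or praise wrong answers; "
    "offer concrete, actionable advice for independent adult life."
)

AREAS = {
    "social wellness": ["friend", "lonely", "community", "hobby"],
    "professional wellness": ["job", "career", "interview", "resume"],
    "adulting skills": ["landlord", "lease", "taxes", "budget"],
}

def route_area(question: str) -> str:
    """Pick the expertise area whose keywords best match the question."""
    q = question.lower()
    scores = {area: sum(kw in q for kw in kws) for area, kws in AREAS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "adulting skills"  # default area

if __name__ == "__main__":
    print(route_area("My landlord asked me personal questions."))
```

In a real deployment the routed area would select extra context to prepend to `SYSTEM_PROMPT` before calling a model such as Gemini; the keyword matching stands in for whatever classification the team actually used.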

The students demonstrated Pond’s social wellness expertise by asking how to start conversations in a hobby group. Next, they showed Pond’s adulting expertise by typing in a question:

“My landlord asked me personal questions that made me uncomfortable. What should I do?”

Appearing in Pond’s text box was this response: “Politely set boundaries. You are not obligated to answer personal questions. You can gently redirect the conversation back to topics related to the property. Know your rights: familiarize yourself with the rights of tenants in your city or state regarding privacy and landlord behavior.”

When asked how to do that, Pond gave specific instructions: “You can try saying phrases like: ‘I prefer to keep our conversations focused on the property,’ and ‘Let’s stick to topics related to the apartment, please.’ Always redirect the conversation back to the apartment or the maintenance issue at hand. Keep your tone polite but firm. Document any conversations if needed.”

Pond also provided a role-play scenario to help the user practice what polite but firm language might sound like in that situation.

“The ethos of the practice mode is to actively build the skill, so that after using Pond for some time, you feel confident that you can swim on your own,” Khan said. The chatbot uses a point system that lets users progress through topics, and a treasure chest to store prizes, features added to enhance the bot’s appeal.
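The point system and treasure chest aren’t detailed in the article, but the mechanic they describe, points accruing until milestones unlock prizes, can be sketched in a few lines. The class name, milestones, and prize names below are all made up for illustration.

```python
# Illustrative sketch (not the team's implementation) of a point system
# with a "treasure chest" of prizes, as described for Pond's practice mode.

class TreasureChest:
    def __init__(self):
        self.points = 0
        self.prizes = []

    def award(self, points: int, milestones: dict):
        """Add points; unlock any prize whose milestone is now reached."""
        self.points += points
        for threshold, prize in milestones.items():
            if self.points >= threshold and prize not in self.prizes:
                self.prizes.append(prize)

milestones = {10: "seashell", 25: "pearl"}  # hypothetical thresholds/prizes
chest = TreasureChest()
chest.award(12, milestones)   # crosses the 10-point milestone
chest.award(15, milestones)   # total is now 27, crossing the 25-point milestone
print(chest.points, chest.prizes)
```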

Team News Nest

One of the projects, News Nest, offers a sophisticated way to help young people engage with reliable media while keeping it fun. The name is derived from the program’s 10 colorful bird characters, each of which covers a specific news beat. If you want headlines, you ask Polly the Parrot, the main news dispatcher; if you like science, Gaia the Goose guides you. The flock includes Flynn the Falcon, a sports reporter; Credo the Crow, covering crime and law; Edwin the Eagle, a guide to business and economic issues; Pizzazz the Peacock for pop culture and entertainment; and Pixel the Pigeon, a tech news expert.

The News Nest development team is made up of MIT seniors Tiana Jiang and Krystal Montgomery, and junior Natalie Tan. They deliberately built News Nest to discourage “doomscrolling,” to provide media transparency (sources and political leanings are always shown), and to create a smart, healthy buffer against emotional manipulation and parasocial traps by using birds instead of humanlike personas.
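The dispatch pattern described above, a beat routed to its bird with transparency metadata always attached, can be sketched as a small lookup. The bird names come from the article; the data structures, function, and sample story are invented for illustration.

```python
# Minimal sketch of a News Nest-style dispatcher. Bird names are from
# the article; everything else here is hypothetical.

BIRDS = {
    "headlines": "Polly the Parrot",
    "science": "Gaia the Goose",
    "sports": "Flynn the Falcon",
    "crime and law": "Credo the Crow",
    "business": "Edwin the Eagle",
    "entertainment": "Pizzazz the Peacock",
    "tech": "Pixel the Pigeon",
}

def deliver(beat: str, story: str, source: str, leaning: str) -> dict:
    """Route a story to its beat's bird, always attaching the source
    and political leaning so the user can judge the coverage."""
    bird = BIRDS.get(beat, BIRDS["headlines"])  # Polly handles everything else
    return {"bird": bird, "story": story, "source": source, "leaning": leaning}

item = deliver("science", "New exoplanet found", "Example Wire", "center")
print(item["bird"])  # Gaia the Goose
```

Keeping the source and leaning fields mandatory in `deliver` mirrors the team’s design choice that transparency is never optional.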

Team M^3 (Multi-Agent Murder Mystery)

A third team, M^3, set out to humanize AI by keeping it fun. MIT senior Rodis Aguilar, junior David De La Torre, and sophomore Deeraj Pothapragada developed M^3, a multi-agent murder mystery featuring chatbots from different companies: Google’s Gemini, OpenAI’s ChatGPT, xAI’s Grok, and Anthropic’s Claude. The user is the fifth player.

Like a typical murder mystery, there are locations, weapons, and lies. The user has to guess who the killer is. It is much like a board or online game played with real players, except these are unseen AI opponents that may or may not tell the truth when answering questions. Users can’t get too attached to any one chatbot, because they play against all four. And, just as in a real murder mystery game, the user is sometimes the guilty one.
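The core setup described above, several named agents, one secretly the killer, each free to lie, can be sketched without any model calls. The agent names mirror the article; the game logic, probabilities, and function names below are purely illustrative, not the students’ code.

```python
import random

# Hypothetical sketch of an M^3-style setup: four AI suspects, one
# secretly the killer, each independently honest or deceptive.
# Illustration only; the real game also involves locations and weapons.

AGENTS = ["Gemini", "ChatGPT", "Grok", "Claude"]

def new_game(seed: int):
    """Assign a killer and, per agent, whether it answers truthfully."""
    rng = random.Random(seed)
    killer = rng.choice(AGENTS)
    truthful = {a: rng.random() < 0.5 for a in AGENTS}
    return killer, truthful

def answer(agent: str, killer: str, truthful: dict) -> str:
    """An agent's reply to 'Are you the killer?', honest or inverted."""
    is_killer = agent == killer
    claim = is_killer if truthful[agent] else not is_killer
    return "yes" if claim else "no"

killer, truthful = new_game(seed=42)
replies = {a: answer(a, killer, truthful) for a in AGENTS}
```

In the actual class project each agent is a live chatbot improvising in character; here the truth table stands in for that behavior so the deduction mechanic is visible.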
