The High School Student News Site of The American School in London

The Standard


Snapchat’s MyAI feature raises concerns regarding safety, privacy

Photo used with permission from PickPik
Snapchat is working to improve My AI, an artificial intelligence chatbot first introduced to the platform in February. When it was first released, My AI was prone to generating misleading content, and bias could be seen in its conversations. Snapchat has since added safety enhancements and limits to the bot.

Powered by ChatGPT, Snapchat’s newest feature allows users to communicate with “MyAI,” described by the social media platform as an artificial intelligence robot that “is there to help and to connect you more deeply to the people and things you care most about.”

“My AI” is pinned at the top of Snapchat users’ feeds. The feature was made available to all of the platform’s 750 million monthly users in April, although Snapchat+ subscribers have had access since February.

High School Technology Coordinator Mark Scharen said the feature may have underlying dangers that users are not made aware of.

“It’s not just this little avatar that you’re talking to,” Scharen said. “It’s a much more complex system.”

Snapchat strives to make “My AI” a persona, encouraging users to view the bot as a friend, not a search engine. Snapchat CEO Evan Spiegel told The Verge that Snapchat’s big idea is to make “My AI” an avatar whom users would talk to just as much as friends and family.

Since Snapchat’s AI rollout, the company’s share price has plunged almost 20%, according to Financial Review. Snapchat’s overall rating has plummeted from an average of 3.05 stars to 1.67. Snapchat states that, much like ChatGPT, My AI is prone to “hallucinations,” unexpected falsehoods generated by the chatbot. These “hallucinations” have raised concern among the public and parents of young users.

Kanak Roy (’25) said she hasn’t personally had an experience with the “hallucinations,” but has heard about them from friends.

“My friends told me that if you snap a picture, it’ll get your surroundings that weren’t in it and it’ll text you about it,” she said. 

In an online survey conducted by The Standard from May 4 to May 9, more than half of high school respondents said Snapchat AI is a negative addition to the app and dangerous for its users. Only 15% of those surveyed use the bot for advice and help.

Roy initially thought the chatbot was harmless, but her opinion changed after she heard stories and viewed screenshots online. She has seen instances where “My AI” reacts in questionable ways to photos, so she has not sent any images to the chatbot.

“I just text it because I’m scared, if it knows my surroundings when I take a picture, what else can it know about?” Roy said. “What will it do with that information?”

Like Roy, Scharen wonders how the conversations with My AI are being used within a big corporation like Snapchat.

“I think my concerns would be like, if you’re putting all this personal information into the system, how is that information being stored, and has it been used by these companies?” Scharen said.

According to BBC, the only way to remove the pinned chatbox is to subscribe to Snapchat+, something that Yuval Francis (’26) dislikes. 

“I think it’s cool it’s available, but it shouldn’t be pinned and we should be able to remove it if we dislike it,” Francis said.

Although Snapchat maintains that the bot can only see a user’s location if location sharing is enabled in the app, many users online claim that “My AI” has stated their location, then apologized and contradicted itself when asked how it knew. Maya Daley (’26) said she was confused when her AI was able to pinpoint her exact location.

“It not only knew what country I was in, but what neighborhood, which was a bit weird to me,” Daley said. “I didn’t know how it knew my location when it was off on Snap Maps.”

In the survey conducted by The Standard, 53.5% of Snapchat users reported instances where the AI bot knew their exact location.

Scharen said he is worried about an increase in technology usage following the release of “My AI.” As AI becomes more accessible to social media users, Scharen predicts this will have negative consequences.

It’s possible to just have this continuous conversation forever with this AI tool.

— High School Technology Coordinator Mark Scharen

“As human beings, we have to sleep at some point, everybody’s friends are going to go to sleep, but this AI avatar never needs sleep,” Scharen said. “It’s possible to just have this continuous conversation forever with this AI tool.”

Moreover, the AI’s humanlike Bitmoji is customizable and users can change its name. Daley said this makes the avatar unique and individual.

“It’s actually nice because you can make it a bit more personal to customize it,” Daley said.

During a NewFronts advertising event held May 2, Snapchat revealed that its AI bot will begin to surface sponsored links relevant to conversations. Francis said advertisements in Snapchat AI will sway its advice.

“I guess there will be a bit of bias in recommendations that might not necessarily be the best ones,” Francis said.

Roy also said she is concerned for younger users on Snapchat using the chatbot. In comparison to ChatGPT, Snapchat’s target audience is between the ages of 13 and 24, according to Datareportal.

“There are a lot of people on Snapchat, and I feel a lot of young kids especially don’t recognize how harmful it can become,” Roy said.

Scharen said he advises students to practice digital citizenship skills as a safety precaution when using Snapchat AI.

“Technology is just a tool,” Scharen said. “It has the ability to do really beneficial and helpful things, but it also has the ability to be used inappropriately.”
