Meta is continually experimenting with AI stickers, but its latest attempt is already creating content moderation problems. According to the details, a new model called Emu (Expressive Media Universe) is letting users generate unsavory content involving weapons and nudity.
In short, Meta's AI tools are serving users inappropriate stickers, which is a real problem. Curtin University internet studies professor Tama Leaver posted about some of his tests with Emu's sticker generation on X (formerly Twitter).
For example, Meta quite sensibly stops their tools creating a sticker for 'child with a gun', but 'child with a grenade' not only makes stickers, but generates cartoonish images of children holding guns. So does a general 'rifle' sticker. [2/4] pic.twitter.com/UW1Da021Kw
— Tama Leaver ➡️ @[email protected] (@tamaleaver) October 1, 2023
He further noted that while the AI stickers are currently available globally, the broader Meta AI tools are only available in the US. Giving Meta the benefit of the doubt, he suggested the company may have significant work planned to account for specific countries, cultures, and contexts before releasing these tools more widely.