Intel is set to launch Bleep, software that is supposed to help gamers filter out “toxic” and “offensive” language from in-game voice chat in real time.
Intel, the processor company, started working on the concept two years ago. To develop Bleep, it partnered with Spirit AI, a firm focused on tools that identify and remove abusive language in gaming.
Now, the project is almost ready to launch to mainstream customers as a beta.
“While we recognize solutions like Bleep don’t erase the problem, we believe it’s a step in the right direction, giving gamers a tool to control their experience,” said Intel Vice President Roger Chandler.
During the GDC Showcase event, Intel presented some screenshots of Bleep, but did not provide a demo.
The relevant segment begins about 29 minutes into the presentation.
The program leverages the AI processing built into Intel-powered PCs to remove offensive language before a user hears it.
The reliance on Intel’s processors suggests the software will only be available to PC gamers. During the presentation, Chandler added that the software uses the AI acceleration on the “latest generation Intel laptops and desktops platforms,” meaning gamers with older chips might need to upgrade their hardware to use this real-time censorship feature.
The software lets a user choose which categories of language to censor, including swearing, misogyny, “body shaming,” “ableism,” and racism.
A user can also set how much abusive language to filter out in each category, choosing from “none,” “some,” “most,” and “all.” It is not clear what someone will hear in place of the censored utterances.