In The Future of Voice AI interview series, I ask my guests three questions:
- What problems do you currently see in Enterprise Voice AI?
- How does your company solve these problems?
- What solutions do you envision in the next 5 years?
This episode’s guest is Mike Pappas, CEO and Co-Founder of Modulate.
Modulate develops technology that analyzes voice chat for key behaviors, including toxicity, pro-sociality, and other relevant signals.
Modulate builds prosocial voice technology that combats online toxicity and elevates the health and safety of online communities. ToxMod, Modulate’s proactive voice moderation platform, empowers community teams to make informed decisions and protect their users from harassment, toxic behavior, and more insidious harms, without having to rely on user reports. ToxMod is proven to reduce exposure to severe toxicity by 50% or more, decreasing churn and increasing player retention and engagement. Modulate’s advanced machine learning frameworks have helped customers like Activision, Riot Games, Rec Room, Schell Games, and many more protect tens of millions of users against online toxicity and ensure safe and inclusive spaces for everyone.
Takeaways
- Modulate pivoted from voice-changing technology to voice moderation, driven by the need for safe, engaging social and gaming spaces.
- Modulate’s ToxMod tool analyzes voice chat in real time to detect toxicity in gaming.
- Its triaging system layers AI models to identify harmful behavior while minimizing privacy risks and costs.
- Voice moderation must handle the nuances of language and emotion, going beyond simple word detection to assess intent.
- Most toxicity detection happens on-device, reducing the need to process every conversation.
- The gaming industry’s shift toward social experiences highlighted the importance of moderation in keeping players coming back.
- In turn, developer priorities have shifted toward safety and community engagement rather than just gameplay mechanics.
- In Call of Duty, ToxMod led to a 10% monthly reduction in repeat offenses, and reducing toxic behavior was found to boost user retention by 25%.
- Privacy-preserving technology is essential as more platforms adopt real-time monitoring.
- Modulate is starting to work with games to detect and reward positive behavior.
- Voice AI is reshaping how developers collect feedback, giving them real-time insights into user sentiment and product impact.
- Modulate plans to extend its voice moderation to other social applications, like dating apps, to maintain safe and respectful interactions.
- Modulate sees the gig economy as a potential expansion area for ToxMod, where call centers are decentralized and workers lack voice-protection tools.
- Mike believes there’s a place for humans talking to bots, but that people deeply value talking to other humans.
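The layered triage idea from the takeaways (a cheap first-pass screen that runs close to the user, escalating only suspicious clips to heavier intent analysis) can be sketched roughly as follows. This is a minimal illustration of the general pattern, not ToxMod's actual implementation: every function, score, and threshold here is a hypothetical stand-in.

```python
# Illustrative sketch of a tiered voice-moderation triage pipeline.
# All stage logic and thresholds are hypothetical, not ToxMod internals.
from dataclasses import dataclass


@dataclass
class Clip:
    clip_id: str
    transcript: str  # stand-in for audio features


def cheap_on_device_score(clip: Clip) -> float:
    """Stage 1: lightweight on-device screen (toy keyword heuristic).
    Most clips stop here, so most conversations are never sent upstream."""
    flagged_terms = {"idiot", "trash"}  # toy keyword list
    words = clip.transcript.lower().split()
    hits = sum(w in flagged_terms for w in words)
    return min(1.0, hits / max(len(words), 1) * 5)


def heavy_server_score(clip: Clip) -> float:
    """Stage 2: heavier model assessing intent and emotion (stubbed here)."""
    return 0.9 if "going to hurt" in clip.transcript.lower() else 0.2


def triage(clip: Clip, screen_threshold=0.3, escalate_threshold=0.7) -> str:
    """Run the cheap screen first; only flagged clips reach the heavy model."""
    if cheap_on_device_score(clip) < screen_threshold:
        return "ignore"  # the vast majority of conversations
    if heavy_server_score(clip) >= escalate_threshold:
        return "escalate_to_moderator"
    return "monitor"
```

For example, `triage(Clip("c1", "good game everyone"))` returns `"ignore"` without ever invoking the heavier model, which is how a tiered design keeps both cost and privacy exposure low.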