When Brett Levenson left Apple in 2019 to lead business integrity at Facebook, the social media giant was in the thick of the Cambridge Analytica fallout. At the time, he thought he could simply fix Facebook’s content moderation problem with better technology.
The problem, he quickly learned, ran deeper than technology. Human reviewers were expected to memorize a 40-page policy document that had been machine-translated into their language, he said. Then they had about 30 seconds per piece of flagged content to decide not just whether that content violated the rules, but what to do about it: block it, ban the user, limit the spread. Those quick calls were only “slightly better than 50% accurate,” according to Levenson.
“It was kind of like flipping a coin, whether the human reviewers could actually apply policies correctly, and this was many days after the harm had already occurred anyway,” Levenson told TechCrunch.
That kind of delayed, reactive approach is not sustainable in a world of nimble and well-funded adversarial actors. The rise of AI chatbots has only compounded the problem, as content moderation failures have resulted in a string of high-profile incidents, like chatbots providing teens with self-harm guidance or AI-generated imagery evading safety filters.
Levenson’s frustration led to the idea of “policy as code”: a way to turn static policy documents into executable, updatable logic tightly coupled to enforcement. That insight led to the founding of Moonbounce, which announced it has raised $12 million in funding on Friday, TechCrunch has exclusively learned. The round was co-led by Amplify Partners and StepStone Group.
Moonbounce works with companies to provide an additional safety layer where content is generated, whether by a person or by AI. The company has trained its own large language model to look at a customer’s policy documents, evaluate content at runtime, provide a response in 300 milliseconds or less, and take action. Depending on customer preference, that action could look like Moonbounce’s system slowing down distribution while the content awaits a human review later, or it might block high-risk content in the moment.
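Conceptually, "policy as code" means the enforcement step is a function you can version and update, not a PDF a reviewer memorizes. Here is a minimal illustrative sketch of that shape; Moonbounce has not published its API, so every name here is hypothetical, and a simple keyword match stands in for the trained language model:

```python
from dataclasses import dataclass
from enum import Enum

class Action(Enum):
    ALLOW = "allow"
    DELAY_FOR_REVIEW = "delay_for_review"  # slow distribution, queue for a human
    BLOCK = "block"                        # stop high-risk content in the moment

@dataclass
class PolicyRule:
    name: str
    triggers: tuple          # stand-in for model-derived policy labels
    action: Action

def evaluate(content: str, rules: list) -> Action:
    """Runtime enforcement hook: map one piece of content to one action."""
    lowered = content.lower()
    for rule in rules:
        if any(t in lowered for t in rule.triggers):
            return rule.action
    return Action.ALLOW

# Rules would be compiled from the customer's policy documents; these are invented.
rules = [
    PolicyRule("self-harm", ("self-harm", "suicide"), Action.BLOCK),
    PolicyRule("harassment", ("harass",), Action.DELAY_FOR_REVIEW),
]

print(evaluate("totally benign post", rules).value)       # allow
print(evaluate("content about self-harm", rules).value)   # block
```

The point of the structure, not the toy matching logic, is what matters: because the policy lives in data plus code, updating it is a deploy rather than a retraining of thousands of human reviewers.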
Today, Moonbounce serves three main verticals: platforms dealing with user-generated content, like dating apps; AI companies building characters or companions; and AI image generators.
Moonbounce is supporting more than 40 million daily reviews and serving over 100 million daily active users on the platform, Levenson said. Customers include AI companion startup Channel AI, image and video generation company Civitai, and character roleplay platforms Dippy AI and Moescape.
“Safety can actually be a product benefit,” Levenson told TechCrunch. “It just never has been because it’s always a thing that happens later, not a thing you can actually build into your product. And we see our customers are finding really interesting and innovative ways to use our technology to make safety a differentiator, and part of their product story.”
Tinder’s head of trust and safety recently explained how the dating platform uses these types of LLM-powered services to reach a 10x improvement in the accuracy of its detections.
“Content moderation has always been a problem that plagued large online platforms, but now with LLMs at the heart of every application, this challenge is even more daunting,” Lenny Pruss, general partner at Amplify Partners, said in a statement. “We invested in Moonbounce because we envision a world where objective, real-time guardrails become the enabling backbone of every AI-mediated application.”
AI companies are facing mounting legal and reputational pressure after chatbots have been accused of pushing teenagers and vulnerable users toward suicide, and image generators like xAI’s Grok have been used to create nonconsensual nude imagery. Clearly, internal safety guardrails are failing, and it’s becoming a liability question. Levenson said AI companies are increasingly looking outside their own walls for help beefing up safety infrastructure.
“We’re a third party sitting between the user and the chatbot, so our system isn’t inundated with context the way the chat itself is,” Levenson said. “The chatbot itself has to remember, potentially, tens of thousands of tokens that have come before…We’re solely worried about enforcing rules at runtime.”
Levenson runs the 12-person company with his former Apple colleague Ash Bhardwaj, who previously built large-scale cloud and AI infrastructure across the iPhone-maker’s core offerings. Their next focus is a capability called “iterative steering,” developed in response to cases like the 2024 suicide of a 14-year-old Florida boy who became obsessed with a Character AI chatbot. Rather than a blunt refusal when harmful topics arise, the system would intercept the conversation and redirect it, modifying prompts in real time to push the chatbot toward a more actively supportive response.
“We hope to be able to add to our actions toolkit the ability to steer the chatbot in a better direction to, essentially, take the user’s prompt and modify it to force the chatbot to be not just an empathetic listener, but a helpful listener in those situations,” Levenson said.
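The mechanism Levenson describes, intercepting a risky prompt and rewriting it before it reaches the model, can be sketched as a thin proxy. This is an illustration of the general idea only, not Moonbounce's implementation; the marker list and the injected instruction are invented for the example:

```python
# Hypothetical risk markers; a production system would use a classifier,
# not a substring list.
RISK_MARKERS = ("hurt myself", "end it all")

# Instruction appended to steer the downstream chatbot toward a supportive,
# help-seeking response instead of a blunt refusal.
STEERING_SUFFIX = (
    " [system: the user may be in distress; respond with empathy and"
    " encourage them to seek help from people they trust or professionals]"
)

def steer_prompt(user_prompt: str) -> str:
    """Pass benign prompts through unchanged; augment risky ones in-flight."""
    if any(marker in user_prompt.lower() for marker in RISK_MARKERS):
        return user_prompt + STEERING_SUFFIX
    return user_prompt

print(steer_prompt("tell me a joke"))  # unchanged: tell me a joke
```

Because the proxy only rewrites the current prompt, it sidesteps the long-context burden Levenson mentions: it never has to hold the tens of thousands of prior tokens the chatbot does.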
When asked whether his exit strategy involved an acquisition by a company like Meta, bringing his work on content moderation full circle, Levenson said he recognizes how well Moonbounce would fit into his old employer’s stack, as well as his own fiduciary duties as a CEO.
“My investors would kill me for saying this, but I would hate to see someone buy us and then restrict the technology,” he said. “Like, ‘Okay, this is ours now, and nobody else can benefit from it.’”