
""On the creative side, I really leave it up to the teams," he said. "I have found that creative teams will use tools that make their job easier when it makes their job easier, and any top-down mandate that 'Thou must use a certain tool'...is not really a path to success. I look at the teams, and we make tools available, and I kind of let it organically percolate.""
""It's now at a scale where you can't really moderate the safety of those with just people alone," Phil said. "The volume is too high. So we have AI that we use to make sure that the conversation and topics that are happening, and for protected child accounts and other things and who gets to talk to those accounts to those people, is locked down by parents or guardians who are setting those controls.""
On the creative side, teams are free to decide whether to use AI tools; available tools are left to organically percolate rather than being mandated, since creative teams tend to adopt tools that genuinely make their jobs easier. AI is employed heavily for security and moderation because the scale of interactions exceeds what human moderators can handle alone: AI systems help monitor conversations, enforce parental controls, and restrict who can contact protected child accounts. There are no production-side mandates requiring AI usage; instead, AI is viewed as a way to increase the pace of creativity and enable more experimental projects.
Read at WGB