
Executives can poke fun at the challenges of adopting AI in an era when companies are looking everywhere for a business win using the latest tools. So when Varonis field CTO Brian Vecci quipped that "every copilot pilot gets stuck in pilot" at a Fortune Brainstorm Tech panel on safeguarding innovation this month, there were plenty of chuckles from the audience. But the joke also underscored a serious problem: companies eager to deploy generative AI tools often slam into the same wall of data security fears.
Scott Holcomb, U.S. enterprise trust AI leader at Deloitte, agreed that both internally and for his clients, "we've absolutely had to put guardrails in place" in terms of what people can and cannot do when using AI tools. For example, the amount of data that Microsoft Copilot has on individuals and organizations is "immense," he explained. "We were not comfortable with that, so we had to work our way through that with Microsoft, but we absolutely had to do a lot of training for our staff in terms of what you can and can't do with client data, too."
Yet leaders like Keith Na, SVP of technology and data at Cargill, cautioned that swinging too far the other way, shutting down experimentation altogether, can be just as dangerous. What organizations need, he said, is a culture of curiosity: a willingness to let engineers break, test, and learn in safe spaces. "I think a lot of technologists go into our profession to solve badass problems together," he said. "And I think over time we're isolating our [teams]."
Data security fears frequently stall generative AI pilots, creating tension between rapid innovation and protection of the underlying data. Organizations must secure the data feeding AI tools before they can safely capture productivity and other benefits. Many firms have implemented guardrails and extensive staff training to limit what employees can do with client data in AI tools. Conversely, eliminating experimentation entirely risks stifling problem-solving and collaboration. A balanced approach pairs safe spaces where engineers can break, test, and learn with policies and controls that enable innovation without exposing sensitive information. Breaking down barriers between teams supports curiosity and accelerates responsible adoption.
Read at Fortune