Leonid Igolnik - Can We Trust AI-Generated Code? Maybe We've Been Asking the Wrong Question.

Updated: Sep 12

Leonid Igolnik is a long-time engineering leader and operator who has spent his career building and scaling enterprise software systems. He is currently Executive Vice President of Engineering at Clari, where he leads the teams behind the company’s revenue platform. Before Clari, he ran engineering at companies including SignalFx, Oracle’s Taleo, and CA Technologies, all at massive scale and with teams distributed across the globe.


Leonid is also a General Partner at IA Seed Ventures (IASV), where he backs early-stage founders building the next generation of B2B software. He is especially passionate about developer-first products, observability, and the gritty, behind-the-scenes work of building infrastructure that lasts. Whether he is talking about org design, on-call culture, or the traps of abstracting too early, Leonid brings a no-nonsense, engineer-first perspective grounded in years of real-world experience.



Can We Trust AI-Generated Code? Maybe We've Been Asking the Wrong Question.


No one trusts AI-generated code. It looks right. It sounds confident. But does it actually do what we expect?


Having AI test its own work doesn’t help. If we can’t trust it to write the code, why would we trust it to write the tests after the fact? That’s not verification; it’s an echo chamber.

That leaves us manually checking everything. The safest bet is to assume it’s wrong and review every line yourself, which doesn’t exactly scream “productivity boost.”


So what’s the alternative?


Maybe we’ve been looking at this the wrong way. AI might be trustworthy, but only if we rethink how we guide it. What if there were a way to make sure it understands intent before it writes a single line of code? A way to catch mistakes before they happen instead of fixing them afterward?


In this session, an excited AI developer advocate and a cynical senior engineering manager take the stage to debate whether AI-driven development is finally ready for prime time or just another way to get things wrong.


