
In this episode of the Real World Serverless podcast, Patrick Debois joins host Yan Cui for a wide-ranging conversation about the origins of DevOps, the rise of platform engineering, and the growing intersection of AI with software delivery practices. Patrick reflects on how his definition of DevOps has evolved over 14 years – from an extension of agile focused on bridging the dev-ops divide, to a broader philosophy of overcoming the friction created by organizational silos, whether those silos exist between development, operations, security, finance, or any other group.
The discussion turns to platform engineering and how it fits within the DevOps narrative. Patrick describes platform teams as feature teams whose product happens to be infrastructure and developer tooling. He emphasizes that the key to successful platform engineering lies in treating internal teams as customers, gathering feedback at scale, and resisting the temptation to build in isolation. The conversation draws parallels between platform teams and SaaS providers, highlighting the importance of picking battles and aiming for broad enablement rather than trying to satisfy every team individually.
A significant portion of the episode explores how generative AI is changing the daily work of engineers. Patrick outlines several dimensions of impact: AI as a co-pilot for development tasks, AI as a time-saver for summarizing pull requests or triaging bugs, AI as an advisor enforcing company standards and governance, and the emerging concept of AI agents that can execute multi-step tasks autonomously. He stresses that these capabilities apply across the entire software delivery lifecycle, not just coding.
Patrick also addresses the reverse angle – bringing DevOps practices to the delivery of GenAI applications. He highlights testing as the biggest challenge, noting that traditional exact-match testing breaks down with non-deterministic LLM outputs. New approaches include using helper models to verify relevance, grammar checking as a form of linting, and running known test datasets as production health checks. He points to emerging tools for observability, semantic caching, and guardrails that protect against prompt injection and data leakage.
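To make the helper-model idea concrete, here is a minimal sketch of an "LLM as judge" relevance check used as a test assertion instead of an exact string comparison. It assumes the OpenAI Python SDK and an `OPENAI_API_KEY` in the environment; the model name, 0–5 rubric, and threshold are illustrative choices, not details from the episode.

```python
# Hedged sketch: a second model grades relevance so tests don't rely on exact matches.
# Assumes the OpenAI Python SDK (`pip install openai`) and OPENAI_API_KEY set.
from openai import OpenAI

client = OpenAI()

def relevance_score(question: str, answer: str) -> int:
    """Ask a helper model to rate how well `answer` addresses `question` (0-5)."""
    judge_prompt = (
        "Rate from 0 (irrelevant) to 5 (fully relevant) how well the ANSWER "
        "addresses the QUESTION. Reply with a single digit.\n\n"
        f"QUESTION: {question}\nANSWER: {answer}"
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[{"role": "user", "content": judge_prompt}],
    )
    return int(response.choices[0].message.content.strip())

# In a test suite, assert on the score rather than comparing strings exactly:
# assert relevance_score("Which regions do we deploy to?", generated_answer) >= 4
```

The same pattern can run against a known test dataset on a schedule, which is how it doubles as the production health check mentioned above.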
The episode concludes with practical advice for engineers who want to stay relevant in the AI era. Patrick encourages everyone to experiment with AI through personal projects, learn about embeddings and vector databases, and recognize that much of the AI engineering work is fundamentally about integration – a skill that DevOps practitioners already possess.
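As a starting point for that advice, here is a minimal sketch of embeddings plus a nearest-neighbour lookup, assuming the OpenAI Python SDK and NumPy; in a real setup the vectors would live in a vector database rather than an in-memory list, and the model name and documents are purely illustrative.

```python
# Hedged sketch: embed a few documents, then find the one closest to a query.
# Assumes the OpenAI Python SDK and NumPy; a vector database would replace the list.
import numpy as np
from openai import OpenAI

client = OpenAI()

def embed(text: str) -> np.ndarray:
    """Turn text into a vector so semantic similarity becomes a numeric comparison."""
    result = client.embeddings.create(model="text-embedding-3-small", input=text)
    return np.array(result.data[0].embedding)

docs = [
    "Deployments go through the platform team's shared pipeline.",
    "On-call rotations are documented in the runbook repository.",
]
doc_vectors = [embed(d) for d in docs]

query_vec = embed("How do I deploy my service?")

# Cosine similarity: the highest-scoring document is the most semantically relevant.
scores = [
    float(query_vec @ v / (np.linalg.norm(query_vec) * np.linalg.norm(v)))
    for v in doc_vectors
]
print(docs[int(np.argmax(scores))])
```

Most of the work here is wiring services together rather than training models, which is the integration skill Patrick argues DevOps practitioners already have.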
Watch on YouTube — available on the jedi4ever channel
This summary was generated using AI based on the auto-generated transcript.