Introducing the AI Pulse Podcast
by Max Vermeir, Senior Director of AI Strategy
I believe that once we achieve the right implementation of generative AI, it will bring about significant changes. But we need to ask ourselves how we get there. First and foremost, we need accurate models.
Welcome to the AI Pulse Podcast, a platform we’ve created to share knowledge and spark conversation about a wide range of topics related to artificial intelligence and intelligent automation. In this series, we’ll bring on industry leaders in their fields to chat about what they’re seeing in the market. We’ll pick their brains about how those trends will influence their decision making in the future and ask them, “What is the next big thing that we should take into account?”
I’m your host, Max Vermeir, Senior Director of AI Strategy at ABBYY. I’ve been passionate about technology from a young age, when I first started digging into the inner workings of computers and electronic devices. That’s why I can’t wait to share my latest insights into the technology market with you in this monthly podcast.
Already, we have an exciting lineup of guests planned for the first few episodes. We’ll discuss the ethical and legal implications of AI, specifically generative AI, with AI ethics evangelist Andrew Pery. We’ll also sit down with Bruce Orcutt, SVP of Product Marketing at ABBYY, to discuss his view of the current market and how it’s dramatically changed in the last 12 months. Then we’ll do a technical deep dive into cutting-edge AI solutions with Andrew Zyuzin, ABBYY’s VP of Product Management for Intelligent Document Processing.
This first episode dives into a topic that I believe is still at the peak of the hype cycle: large language models (LLMs). Are they really the future? That’s the question I keep getting asked these days, and that’s what I want to dig into today.
You can listen to the full episode above, and subscribe to The Intelligent Enterprise so you won’t miss new episodes, published here monthly.