The chatter around artificial general intelligence (AGI) may dominate headlines coming from Silicon Valley companies like OpenAI, Meta and xAI, but for enterprise leaders on the ground, the focus is squarely on practical applications and measurable results. At VentureBeat's recent Transform 2025 event in San Francisco, a clear picture emerged: the era of real, deployed agentic AI is here, it is accelerating, and it's already reshaping how businesses operate.
Companies like Intuit, Capital One, LinkedIn, Stanford University and Highmark Health are quietly putting AI agents into production, tackling concrete problems and seeing tangible returns. Here are the four biggest takeaways from the event for technical decision-makers.
1. AI agents are moving into production, faster than anyone realized
Enterprises are now deploying AI agents in customer-facing applications, and the trend is accelerating at a breakneck pace. A recent VentureBeat survey of 2,000 industry professionals, conducted just before VB Transform, revealed that 68% of enterprise companies (those with 1,000+ employees) had already adopted agentic AI, a figure that seemed high at the time. (In fact, I worried it was too high to be credible, so when I presented the survey results on the event stage, I cautioned that the high adoption rate might reflect VentureBeat's particular readership.)
However, new data validates this rapid shift. A KPMG survey released on June 26, a day after our event, shows that 33% of organizations are now deploying AI agents, a surprising threefold increase from just 11% in the previous two quarters. This market shift validates the trend VentureBeat first identified just weeks ago in its pre-Transform survey.
This acceleration is being fueled by tangible results. Ashan Willy, CEO of New Relic, noted a staggering 30% quarter-over-quarter growth in the monitoring of AI applications by its customers, largely driven by those customers' move to adopt agents. Companies are deploying AI agents to help customers automate workflows they need help with. Intuit, for instance, has deployed invoice generation and reminder agents in its QuickBooks software. The result? Businesses using the feature are getting paid five days faster and are 10% more likely to be paid in full.
Even non-developers are feeling the shift. Scott White, product lead for Anthropic's Claude AI product, described how he, despite not being a professional programmer, is now building production-ready software features himself. "This wasn't possible six months ago," he explained, highlighting the power of tools like Claude Code. Similarly, OpenAI's head of product for its API platform, Olivier Godement, detailed how customers like Stripe and Box are using its Agents SDK to build out multi-agent systems.
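Neither speaker walked through code on stage, but for readers who want a concrete sense of what "multi-agent" means here, the sketch below follows the general pattern from the Agents SDK's published quickstart: specialist agents plus a triage agent that hands requests off to them. The agent names and instructions are illustrative placeholders, not anything Stripe, Box or OpenAI showed at Transform.

```python
# Minimal multi-agent sketch in the style of the OpenAI Agents SDK
# quickstart (pip install openai-agents; requires OPENAI_API_KEY).
# Agent names and instructions are hypothetical examples.
from agents import Agent, Runner

# Two specialist agents for narrow tasks.
billing_agent = Agent(
    name="Billing agent",
    instructions="Answer questions about invoices and payments.",
)
support_agent = Agent(
    name="Support agent",
    instructions="Help troubleshoot product issues step by step.",
)

# A triage agent that routes each request to the right specialist.
triage_agent = Agent(
    name="Triage agent",
    instructions="Decide whether the request is about billing or support, then hand off.",
    handoffs=[billing_agent, support_agent],
)

# Run the system on a single user request and print the final answer.
result = Runner.run_sync(triage_agent, "My last invoice was charged twice.")
print(result.final_output)
```

The appeal of the pattern is that each agent stays small and testable, while the triage layer decides which one should own a given request.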
2. The hyperscaler race has no clear winner as multi-cloud, multi-model reigns
The days of betting on a single large language model (LLM) provider are over. A consistent theme throughout Transform 2025 was the move toward a multi-model and multi-cloud strategy. Enterprises want the flexibility to choose the best tool for the job, whether that's a powerful proprietary model or a fine-tuned open-source alternative.
As Armand Ruiz, VP of AI Platform at IBM, explained, the company's development of a model gateway, which routes applications to whichever LLM is the most performant for the specific use case, was a direct response to customer demand. IBM started by offering enterprise customers its own open-source models, then added open-source support, and finally realized it needed to support all models. This desire for flexibility was echoed by XD Huang, the CTO of Zoom, who described his company's three-tiered model approach: supporting proprietary models, offering its own fine-tuned model and allowing customers to create their own fine-tuned versions.
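IBM didn't detail its gateway's internals on stage, so the sketch below is a hypothetical illustration of the general pattern rather than IBM's implementation: a thin routing layer that picks a backend model per request from a quality-versus-cost table, so applications never hard-code a single provider. The model names, scores and routing rule are all assumptions made for illustration.

```python
# Hypothetical model-gateway routing layer. Not IBM's implementation;
# backend names, prices and quality scores are illustrative only.
from dataclasses import dataclass

@dataclass
class ModelBackend:
    name: str                  # a proprietary or open-source model endpoint
    cost_per_1k_tokens: float  # rough price used for routing decisions
    quality_score: float       # 0-1 task-quality estimate from internal evals

BACKENDS = [
    ModelBackend("large-proprietary-model", cost_per_1k_tokens=0.0100, quality_score=0.95),
    ModelBackend("fine-tuned-open-model", cost_per_1k_tokens=0.0008, quality_score=0.80),
]

def route(task_complexity: float) -> ModelBackend:
    """Pick the cheapest backend whose quality clears the bar for this task.

    task_complexity is a 0-1 estimate; harder tasks demand higher quality.
    """
    eligible = [b for b in BACKENDS if b.quality_score >= task_complexity]
    if not eligible:
        # Fall back to the highest-quality backend if nothing qualifies.
        return max(BACKENDS, key=lambda b: b.quality_score)
    return min(eligible, key=lambda b: b.cost_per_1k_tokens)

# A routine summarization call goes to the cheaper model,
# while a complex reasoning call is routed to the stronger one.
print(route(0.5).name)   # fine-tuned-open-model
print(route(0.9).name)   # large-proprietary-model
```

The point of the design is that cost and quality trade-offs live in one routing layer, so swapping in a new model means updating a table, not rewriting applications.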
This multi-model trend is creating a powerful but constrained ecosystem, in which GPUs and the power needed to generate tokens are in limited supply. As Dylan Patel of SemiAnalysis and fellow panelists Jonathan Ross of Groq and Sean Lie of Cerebras pointed out, this puts pressure on the profitability of many companies that simply buy more tokens whenever they're available, instead of locking in profits as the cost of those tokens continues to fall. Enterprises are getting smarter about how they use different models for different tasks to optimize for both cost and performance, and that may often mean not just relying on Nvidia chips but turning to far more customized hardware, a shift also echoed in a VB Transform session led by Solidigm around the emergence of customized memory and storage solutions for AI.
3. Enterprises are focused on solving real problems, not chasing AGI
While tech leaders like Elon Musk, Mark Zuckerberg and Sam Altman are talking about the dawn of superintelligence, enterprise practitioners are rolling up their sleeves and solving immediate business challenges. The conversations at Transform were refreshingly grounded in reality.
Take Highmark Health, the nation's third-largest integrated health insurance and provider company. Its chief data officer, Richard Clarke, said it is using LLMs for practical applications like multilingual communication to better serve its diverse customer base, and for streamlining medical claims. In other words, leveraging the technology to deliver better services today. Similarly, Capital One is building teams of agents that mirror the capabilities of the company, with specific agents for tasks like risk evaluation and auditing, including helping its car dealership clients connect customers with the right loans.
The travel industry is also seeing a pragmatic shift. CTOs from Expedia and Kayak discussed how they're adapting to new search paradigms enabled by LLMs. Customers can now search for a hotel with an "infinity pool" on ChatGPT, and travel platforms need to incorporate that level of natural language discovery to stay competitive. The focus is on the customer, not the technology for its own sake.
4. The future of AI teams is small, nimble and empowered
The age of AI agents is also transforming how teams are structured. The consensus is that small, agile "squads" of three to four engineers are most effective. Varun Mohan, CEO of Windsurf, maker of a fast-growing agentic IDE, kicked off the event by arguing that this small team structure allows for rapid testing of product hypotheses and avoids the slowdown that plagues larger groups.
This shift means that "everyone is a builder" and, increasingly, "everyone is a manager" of AI agents. As speakers from GitHub and Atlassian noted, engineers are now learning to manage fleets of agents. The skills required are evolving, with a greater emphasis on clear communication and strategic thinking to guide these autonomous systems.
This nimbleness is supported by a growing acceptance of sandboxed development. Andrew Ng, a leading voice in AI, advised attendees to leave safety, governance and observability to the end of the development cycle. While this may seem counterintuitive for large enterprises, the idea is to foster rapid innovation within a controlled environment to prove value quickly. This sentiment was reflected in our survey, which found that 10% of organizations adopting AI have no dedicated AI safety team, suggesting a willingness to prioritize speed in these early stages.
Together, these takeaways paint a clear picture of an enterprise AI landscape that is maturing rapidly, moving from broad experimentation to focused, value-driven execution. The conversations at Transform 2025 confirmed that companies are deploying AI agents today, even if they've had to learn tough lessons along the way. Many have already gone through one or two big pivots since first trying out generative AI a year or two ago, so it pays to get started early.
For a more conversational dive into these themes and further analysis from the event, you can listen to the full discussion I had with independent AI developer Sam Witteveen on our latest podcast below. We've also just uploaded the main-stage talks from VB Transform here. And our full coverage of articles from the event is here.
Listen to the VB Transform takeaways podcast with Matt Marshall and Sam Witteveen here:
Editor's note: As a thank-you to our readers, we've opened up early-bird registration for VB Transform 2026 at just $200. This is where AI ambition meets operational reality, and you're going to want to be in the room. Reserve your spot now.