Every AI conversation I hear right now is about individual productivity. The 10x engineer. The marketer who writes twice as many campaigns. The salesperson who personalizes outreach at scale. These stories are real, but they’re missing the point.

The bottleneck at most companies isn’t how fast any one person works. It’s how information moves between people. Too many handoffs between narrowly scoped roles. Information siloed in department-specific tools. Meetings that exist to share context that should be searchable. Approval chains that serialize naturally parallel work. AI can accelerate individual output all day long, and you’ll still be slow if the transit layer is broken.

I learned this running technology at a healthcare company where every department touches AI. The biggest gains came from making information move without friction, not from making individuals faster.

There’s a version of the AI productivity story that CTOs tell at conferences. “My engineers are seeing 10x productivity gains from AI coding tools.” And then, a few minutes later: “But we haven’t actually shipped more, because the deploy process, the review cycle, the rollout, the marketing, the compliance checks, all of that still takes the same amount of time.” The individual got faster. The system didn’t.

The same pattern shows up everywhere. A salesperson can research a prospect in minutes instead of hours, but the CRM doesn’t connect to the territory data, so they still spend time manually cross-referencing. An operations manager can draft an SOP in ten minutes with voice-to-text, but getting three departments to review and approve it takes two weeks. A data analyst can run a complex query in seconds, but explaining the results to a non-technical stakeholder requires a meeting, a slide deck, and a follow-up meeting. Production got cheaper. Coordination didn’t.

The friction patterns I see across departments are remarkably consistent regardless of company size.
Tool silos are the most expensive. Every department has its own system. Revenue lives in the CRM. Support lives in the ticketing system. Clinical ops lives in a proprietary platform. Finance lives in spreadsheets and an ERP. The information in each system is valuable. The fact that these systems don’t talk to each other is the problem. “Which customers had support issues that correlate with churn risk?” requires manually pulling data from three systems and cross-referencing in a spreadsheet. That’s a plumbing problem, not an analysis problem.

Handoff chains compound it. Context lost at every transition, reformatted and re-explained until the original need is unrecognizable. Meetings that exist solely to transfer information from one person to another. Approval workflows that serialize work because oversight was designed for a world where every step needed human attention. The friction is structural, and it’s consistent across every company I’ve advised.

The highest-leverage AI applications I’ve built are the ones that remove transit friction between teams, not the ones that make individuals faster.

We built an internal agent that any team member can query from Slack. Ask a business question in plain language, get an answer grounded in actual company data. Revenue asks about objection patterns. Operations asks about ticket trends. Leadership reviews customer health without attending every account meeting. The question that used to require an email to the data team, a dashboard request, and a three-day turnaround now takes thirty seconds.

We enforce transcription by default on every meeting. Six months of transcribed meetings creates a searchable institutional memory that no wiki can replicate, because it captures the reasoning behind decisions, not just the decisions themselves. New hires search prior decisions instead of asking five people what happened. Managers review meetings they couldn’t attend without a debrief. The compound effect is significant.
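The Slack agent described above can be sketched in miniature. This is a toy, not our implementation: the `WAREHOUSE` dict stands in for a real central data store, the topics and figures are invented, and in production an LLM would translate the question into a warehouse query rather than a string match.

```python
# Minimal sketch of a plain-language query agent grounded in company data.
# WAREHOUSE is a stand-in for a real central data layer; all topics and
# answers below are hypothetical examples.

WAREHOUSE = {
    "objection patterns": "Top objections this quarter: pricing, then integration effort.",
    "ticket trends": "Support tickets are trending down month over month; billing dominates.",
    "customer health": "Three accounts are flagged at-risk, concentrated in mid-market.",
}

def answer(question: str) -> str:
    """Return a grounded answer, or say explicitly that no data backs one."""
    q = question.lower()
    for topic, grounded in WAREHOUSE.items():
        if topic in q:
            return grounded
    return "No grounded data found for that question."

print(answer("What are our current objection patterns?"))
print(answer("What will the weather do to sales?"))
```

The important property is the fallback: when nothing in the data layer matches, the agent says so instead of improvising.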
We built knowledge agents that handle the questions people ask repeatedly. “How does the billing system handle this edge case?” “What’s the SOP for provider onboarding?” A well-grounded agent handles the majority of these, and the human expert only gets involved for genuinely novel cases.

The key word in all of this is “curated.” A knowledge base that’s reviewed, categorized, and version-controlled produces grounded answers citing internal sources. A raw document dump produces hallucinated answers that sound right and aren’t. Garbage in, garbage out isn’t just a data engineering truism. It’s the single most important principle in applied AI.

None of this works without a data strategy. I keep coming back to this point because it’s the prerequisite most companies skip. If your tools don’t feed into a centralized data layer, your AI applications will be limited to individual-level productivity. Useful, but not what changes a company. The real leverage comes when AI can reason across systems: when the agent answering a revenue question can also see the support ticket history, the meeting transcript from last week, and the financial data that contextualizes the whole picture.

Building that data layer isn’t glamorous: ETL pipelines, schema documentation, data quality checks, the slow work of making sure every system feeds into a central warehouse with consistent naming conventions. We invested in this early, before AI was the obvious use case, because we needed reliable reporting. That investment turned out to be the foundation everything else sits on. Every month of centralized data you collect becomes context you can use later. And it’s nearly impossible to backfill. Start collecting now, even if you don’t know what you’ll do with it yet.

If the bottleneck is information transit, the organizational response isn’t to hire more specialists. Hire fewer people with broader scope and give them AI tools that cover the breadth.
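The “curated” principle above can be made concrete with a small sketch: the agent answers only when it can cite a reviewed internal document, and escalates otherwise. Every document name, review flag, and passage here is a hypothetical example, and the word-overlap matching is a crude stand-in for real retrieval.

```python
# Sketch of a curated knowledge base: answer only from reviewed documents,
# always with a citation, and escalate when no reviewed source matches.
# All names and contents are hypothetical.

KNOWLEDGE_BASE = [
    {"doc": "billing-edge-cases-v3", "reviewed": True,
     "text": "Partial refunds on annual plans are prorated to the day."},
    {"doc": "provider-onboarding-sop-v7", "reviewed": True,
     "text": "Provider onboarding requires license verification before account creation."},
    {"doc": "old-notes-dump", "reviewed": False,
     "text": "Refunds are probably handled manually, check with finance."},
]

def grounded_answer(query: str) -> str:
    """Answer with a citation from reviewed docs only; never from raw dumps."""
    terms = {w for w in query.lower().split() if len(w) > 3}
    for entry in KNOWLEDGE_BASE:
        words = {w for w in entry["text"].lower().split() if len(w) > 3}
        if entry["reviewed"] and terms & words:
            return f'{entry["text"]} [source: {entry["doc"]}]'
    return "No reviewed source covers that; escalating to a human expert."

print(grounded_answer("How are refunds handled on annual plans?"))
print(grounded_answer("Who won the game last night?"))
```

Note that the unreviewed dump is never consulted, even when its text overlaps the question: that exclusion is the whole point of curation.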
A generalist who understands sales, operations, and product can use an AI agent to pull context from all three domains and synthesize a recommendation. A specialist who only understands one domain will produce faster output within that domain but will still need to coordinate with two other teams to turn it into action.

The lean team advantage is real, but it’s about information flow, not work ethic. A small team where everyone has visibility into the whole business, supported by AI tools that make cross-domain queries trivial, will outpace a large team where each person only sees their slice. You want people who are curious across functions, comfortable with ambiguity, and willing to learn new tools quarterly. People who ask “why does it work this way?” rather than “that’s not my department.”

The token math is compelling on its own. AI output is fifty to two hundred times cheaper than human output at raw throughput. But the ROI that actually changes a company lives in the coordination costs you eliminate. Every meeting you don’t need to schedule. Every handoff you don’t need to manage. Every question that gets answered in thirty seconds instead of three days.

Start with the data layer. Every month of centralized, searchable context you build now is context you can’t backfill later. The meeting that wasn’t transcribed, the sales call that wasn’t logged, the support ticket that wasn’t categorized: that information is gone. And in six months, when you want an agent that can reason across your entire business, you’ll wish you’d started collecting today.
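The unglamorous “consistent naming conventions” work behind that data layer looks roughly like this: each source system exports its own field names, and a mapping normalizes them into one shared schema before loading. Every field name, source, and record here is a hypothetical example, not a real schema.

```python
# Sketch of schema normalization for a central data layer: rename each
# tool's fields to canonical warehouse names so records link across systems.
# All field names and records are hypothetical.

FIELD_MAP = {
    "crm":     {"acct_id": "customer_id", "rep": "owner", "arr_usd": "revenue"},
    "support": {"account": "customer_id", "assignee": "owner", "sev": "severity"},
}

def normalize(source: str, record: dict) -> dict:
    """Rename source-specific fields to the warehouse's canonical names."""
    mapping = FIELD_MAP[source]
    return {mapping.get(field, field): value for field, value in record.items()}

warehouse = [
    normalize("crm", {"acct_id": "A-102", "rep": "dana", "arr_usd": 48000}),
    normalize("support", {"account": "A-102", "assignee": "lee", "sev": "high"}),
]

# One canonical key now links records across systems, which is the
# precondition for any agent that reasons over the CRM and the ticketing
# system at once.
linked = [row for row in warehouse if row["customer_id"] == "A-102"]
print(linked)
```

Once every system lands in the warehouse under shared names, the cross-system questions earlier in the piece (“which customers had support issues that correlate with churn risk?”) become a single query instead of a three-spreadsheet exercise.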