Humans on the Support Team in 2026: Something to Solve For?
When the customer has access to a tool that knows the product better than you do...
The Support Team's duties are reactive to the needs of the user. Those needs shift on a daily, weekly, and, as we're now experiencing, generational basis. In the mid '90s, Microsoft's user base exploded by the millions. That's a lot of human bandwidth dedicated to closing the "Where is my 'ANY' key?" tickets. How do you solve for an inundated Support Team? Customer self-enablement. Microsoft began publishing FAQs so customers could help themselves. This now-industry-standard practice shifted the Support Team's responsibilities from "answer every user question" to "answer every question users can't answer themselves." Sound familiar? AI is doing it again.
The difference is that, according to Gartner, by 2029 (an extremely conservative estimate IMO), agentic AI will resolve 80% of incoming customer queries without requiring any human bandwidth at all.
So if your product's AI assistant handles a majority of tickets now...
The embedded AI assistant in your SaaS product is servicing a new generation of technically inclined users who instinctively self-troubleshoot. Ergo, nobody is reaching out for a post-sale product walkthrough, a UI navigation issue, or a "how do I...". Touch-and-go interactions are extinct. So your ticket queue hasn't just gotten shorter: the substance is more demanding, the resolutions are more complex, and your touches per ticket are more numerous.
So that brings us to...
Pillar One: Perceived Accountability
When an AI assistant tells your dissatisfied client, "We understand how important this is to you" ... who is "We"? Is it the company dev who plugged in your API Key, authorizing the $0.00015–$0.0003 charge per 1,000 tokens outputting a "We Care" pattern-matched statement? It certainly isn't the AI Assistant. It takes a Being capable of the Conscious Human Experience to take accountability. We haven't yet been able to simulate the conscious human experience, and so neither can we simulate accountability. In 2024, Canadian courts held Air Canada liable after its AI chatbot promised to honor a non-existent refund policy. Legislatively, it appears AI accountability is following the same logic as animal ownership. The difference is that problematic pets can be dealt with in a way meaningful to the Conscious Human Experience: one that satisfies the wronged party and, ostensibly, demands accountability from the owner. In other words, while both are separate-but-associated entities, a pet is an accountability-extension while an AI Assistant is merely an accountability-proxy. The subtext here is that you cannot hold an AI Assistant personally accountable, in any substantive way, to a dissatisfied customer.
On an organizational level, a failure of duties needs to be attributed to someone in a meaningful way that ensures better future outcomes. On a customer-relationship level, the customer needs to feel that a failure of duties will carry meaningful consequences. AI Assistants, incapable of simulating the Conscious Human Experience, break this attribution of accountability on every level.
Pillar Two: Throughput vs. Situational Awareness
Your AI Assistant can process more tickets than a team of 1,000 Human Support Agents. It can also organize them by priority and urgency. It can also roadmap and triage these issues for the appropriate teams and even write documentation along the way.
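That kind of mechanical triage is genuinely easy to automate. A minimal sketch of the idea, purely illustrative (the severity labels, ticket fields, and team names here are my own invention, not anyone's real routing system):

```python
from dataclasses import dataclass, field
import heapq

# Toy severity ranking; a real assistant would classify this from ticket text.
SEVERITY = {"outage": 0, "bug": 1, "how-to": 2}

@dataclass(order=True)
class Ticket:
    priority: int                       # lower number = more urgent
    summary: str = field(compare=False) # excluded from ordering
    team: str = field(compare=False)

queue = [
    Ticket(SEVERITY["how-to"], "Where did my Save button go?", "docs"),
    Ticket(SEVERITY["outage"], "SSO login failing for all users", "platform"),
    Ticket(SEVERITY["bug"], "Export to CSV drops headers", "product"),
]
heapq.heapify(queue)  # most urgent ticket floats to the top

while queue:
    t = heapq.heappop(queue)  # outage first, how-to last
    print(f"route to {t.team}: {t.summary}")
```

Sorting by a severity integer is the easy part, which is exactly the point: what the sketch can't encode is everything in the next paragraphs.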
I'll leave it up to you to think about all the different ways you've had food delivered to your table.
Your AI Agent hasn't been to your most recent weekly stand-up. It also doesn't know which Product Engineer is the backup SME on the new feature that just soft-launched to a handful of your Enterprise SLA clients comprising 80% of your MRR, or what tone is appropriate in a new communication with the customer's developer in charge of product integration before a critical release.
There's a difference between achieving a satisfying experience by smoothing the interstitial frictions of collaboration... VS... "appropriate individuals have been alerted -> tasks complete -> initiating next unavoidable collaboration event."
Though... in another universe... Initiating: NPS SCORE MAXIMIZATION PROTOCOL
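For anyone the protocol leaves behind: NPS, the Net Promoter Score, comes from asking customers "how likely are you to recommend us?" on a 0–10 scale, then subtracting the percentage of detractors (0–6) from the percentage of promoters (9–10). A minimal sketch, mine, not any particular survey vendor's implementation:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6).

    Ranges from -100 (all detractors) to +100 (all promoters).
    Passives (7-8) count toward the total but neither side.
    """
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

print(nps([10, 9, 9, 8, 7, 6, 3]))  # 3 promoters, 2 detractors of 7 -> 14
```

Trivially computable; the hard part is what moves it, which brings us to the last pillar.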
Pillar Three: Human Growth and Want
A human Support Agent is a collector of peripheral data. Remember those junior developer jobs that have ostensibly disappeared? Well, those humans are on your Support Team now (always have been). They're trend-noticers and roadmap-projectors; these are your next Product Engineers and Product Managers, whose context windows don't truncate and whose sessions have no token limit.
The context that doesn't explicitly pass through a matrix of keywords and patterns may not be optimal for ticket resolution, but it's the kind that leads to innovation. An AI Agent will never have experienced wanting a solution and then hitting the friction of achieving it. No friction -> no want -> no innovation.
Consider: no AI would have suggested replacing the "Save" button with incremental autosave and a change-history tab, now a commonly adopted industry standard across giants and startups alike: Microsoft 365, Google Workspace, Builder.io. Granted, it may not be optimal for everyone; I've received a few tickets asking, "Where did my Save button go?" Until the AI can juggle philosophies like utilitarianism and deontology, apply that lens to a save button, and take into account what your best friend thinks, innovation is for the guy or gal who spent a year on the Support Team.
The Bar Isn't Lower, It's Different
The AI Assistant didn't replace your Support Team. It freed up your team's bandwidth.
So what's left? Empathy, relationships, and want. These are the non-simulatable, non-optimizable, non-automatable aspects of the Conscious Human Experience that make humans capable of holding the accountability that builds mutual trust between your company and your customers. And raising those NPS scores. 🙂
AI is a tool for optimizing ticket resolution. The humans on the Support Team now have a new and powerful tool to optimize for the best possible Conscious Human Experience of your customers.