The demos are impressive. The limits are real. If you only ever read the launch posts and the LinkedIn carousels, you would think AI is roughly two software updates away from doing your job. It isn't. Here are four things it still can't do in 2026, and why knowing them makes you better at using it, not worse.
1. Know what happened today
An AI model is trained on data with a cut-off date. After that date, it knows nothing — unless it has a tool that can search the web, and even then it only knows what the search returns and how it interprets those results. Ask it about a news story from this morning and you will often get either a confident-sounding hallucination or a refusal.
Most general assistants now have search baked in, which helps. But "knows what happened today" is still patchier than people assume. If your work depends on current facts — prices, schedules, who is in what role — verify them. Don't trust the first draft.
2. Remember you long-term
Within a single conversation, AI can hold a lot of context. Across conversations, almost nothing. Some tools have memory features that store a handful of facts about you. None of them remember the way a colleague remembers — the running joke, the last project, the thing you said you'd circle back on.
This means you have to bring the context every time. Treat it less like a person you've worked with for years and more like a brilliant temp who started this morning.
3. Have genuine opinions
AI doesn't believe anything. It produces output that sounds like opinions because that is what was in its training data. Ask it to "give its honest take" and it will simulate a take, often a balanced one, often the one it predicts you want.
This isn't a flaw to be fixed. It is what AI is. Useful for stress-testing your own thinking. Not useful as a substitute for someone who actually has skin in the game.
If you want a real opinion, ask a real person who knows what they think and why.
4. Do truly original work
AI is extraordinary at remixing. It struggles to invent. The output is always somewhere in the cloud of patterns it learned from. Sometimes that cloud is enormous and the recombination feels new. But the genuinely strange, the leap that nobody saw coming, the thing that doesn't fit any existing category — that mostly comes from humans, in part because humans have stakes and bodies and time pressure: constraints AI does not have.
If you are doing creative work that depends on novelty, AI is a sparring partner. Don't expect it to be the artist.
Knowing the limits is the skill
Powerful, but not magic. The people who use AI well are the ones who know exactly where the cliff edge is — and stop walking before they fall off it.
What would you add to the list?
The most useful limits to talk about are the ones people hit in real work. Bring yours to the next Kent AI Meet Up — it's where these honest conversations happen.
See upcoming meet-ups →