> Do you believe wrappers can still be durable businesses?
Definitely not. Keep in mind the actual cost of using LLMs is currently heavily subsidised by VC money. Providers are fighting among themselves for market dominance, so we're paying favourable rates until the dust settles. At some point that money will dry up. Is your app still sustainable if API costs increase?
> If you’re building with LLMs, where do you think the real moat comes from?
It really depends on what you're building. Are you a chat wrapper with some additional system prompts? You will most likely be swallowed up eventually, maybe not even by the LLM providers themselves, but by a generic builder-type app. If your only value on top of an LLM is speciality prompts, you have no moat. Enjoy some returns now while it lasts; nothing wrong with that either.
Does your app actually solve a real-world problem, and then use LLMs in very specific features or ways to improve efficiency or automate parts of it? I'd say that's the beginning of a moat. The data you collect from solving that problem lends itself to improved LLM efficiency and accuracy, which is where you can start building a real advantage. As long as that data stays proprietary, you have a bit of a leg up.
Obviously, LLMs will keep learning over time, but don't be certain they are going to get smarter. There's a phenomenon called model collapse: as LLM-generated content becomes more widespread, models increasingly train on their own AI-generated output, which degrades quality. This reinforces the need for, and value of, having your own data. It's already becoming really hard to navigate the internet, including Reddit, without wading through tons of AI slop, and a lot of that is going to be fed back into LLMs as training data.