It’s less likely that we’re at the limit and more likely that the companies producing AI/LLM tools are aiming for profitability rather than research or next-gen implementation. Consider a currently popular, well-known use case for AI: taking a fast-food order. Current estimates put a single GPT-4 query somewhere in the realm of 20–40 cents. Not exactly profitable if you’re trying to sell a $1.50 hamburger. But if you can pare these same mostly-capable models down so that electricity and hardware costs are negligible, you suddenly open yourself up to a world of profitability.
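A rough back-of-envelope using the figures above makes the margin problem concrete (the number of queries per order is an assumption for illustration, since a real ordering dialogue takes multiple turns):

```python
# Back-of-envelope: LLM inference cost vs. a $1.50 menu item.
# The 20-40 cent per-query range is from the estimate above;
# QUERIES_PER_ORDER is an assumed value for a multi-turn dialogue.
ITEM_PRICE = 1.50
QUERIES_PER_ORDER = 5  # assumed turns in one ordering conversation

for cost_per_query in (0.20, 0.40):
    order_cost = cost_per_query * QUERIES_PER_ORDER
    share = order_cost / ITEM_PRICE
    print(f"${cost_per_query:.2f}/query -> ${order_cost:.2f}/order "
          f"({share:.0%} of the item price)")
```

Even at the low end of the estimate, inference alone eats a large share of the hamburger’s sticker price, which is why shrinking per-query cost matters more commercially than pushing capability further.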