Beware the AI hype cycle – managing expectations in the age of ChatGPT
Artificial intelligence (AI) is not new. In fiction and in reality, the prospect of machines demonstrating human-like learning, behaviour and decision-making has been heavily invested in and prophesied over for decades.
But there can be no denying that 2023 has greatly heightened the level of hype surrounding this field of technology. And hype is the operative word.
ChatGPT has brought AI further into the mainstream, giving millions of people a taste of what the tech can offer. In turn, the media – traditional, digital and social alike – has been awash with interesting use cases and speculation of how disruptive AI might be across virtually all sectors and job roles.
Putting aside one’s own interests in AI and ChatGPT, such discussions have merit. Experimenting with technologies and considering how they could, or should, be applied to solve real-world problems is an essential part of innovation. However, it is equally important that expectations are tempered; we must consider where we are in the journey.
Beware of AI hype
It should be apparent to anyone working in technology that AI – or, more specifically, large language models (LLMs) – is squarely in the most hyped phase of the hype cycle.
Coined by Gartner, the hype cycle describes the pattern of excitement, disillusionment and eventual maturity that new, disruptive technologies tend to follow. Put another way, it is the rather predictable phases that new tech will pass through on its way to becoming more mainstream.
It starts with the “trigger” phase, where a new idea or technology first emerges and people start to get excited about its potential. Then, as expectations rise, it enters the “peak of inflated expectations” phase, where people start to expect the technology to revolutionise everything overnight.
Then, as the limitations and challenges of the technology become more apparent, it inevitably enters the “trough of disillusionment” phase, where people start to lose faith in its potential. But over time, with continued development and refinement, the technology eventually reaches the “plateau of productivity” phase, where it becomes a stable, useful tool – consumers or businesses understand, adopt and frequently use the technology.
A long road to travel
AI broadly, as noted above, is further along its journey through the hype cycle. But large language models (LLMs) and natural language processing (NLP), as subsets of this wider technological field, are still in their relative infancy.
Right now, I would argue that we are at the “peak of inflated expectations” – and we may not yet have reached the summit, with discourse around all things AI still raging.
There is no question that AI, LLMs and NLP have the potential to transform the way we work, live and interact with each other. However, we are still a long way from realising that potential in any meaningful way, and there remain major technical and ethical challenges to overcome.
The hype around AI has already led to unrealistic expectations and over-promising from certain companies and individuals in the tech industry. There is no doubt that there have been some incredible advances in NLP and machine learning recently, but it is important to temper expectations and recognise that AI is still a work in progress.
Further, we must be able to better distinguish the opportunists – the individuals and organisations that over-promise what AI can offer – from those who are truly innovating in this space. The former should not be allowed to overshadow the latter. There is undoubtedly a huge number of disreputable AI evangelists on social media, and many tech startups seeking venture capitalists’ money with questionable claims around their AI credentials.
At Channel, we are carefully experimenting with AI while keeping the hype cycle in mind. We can see the huge potential of AI, but we are also realistic about its limitations and the challenges that still need to be addressed before it becomes mainstream.
Kristian Wilson is the chief technology officer at Channel.