Gartner recently declared that generative AI has reached a “Peak of Inflated Expectations.” Let’s explore the substance behind the hype.
What’s at stake:
Tech companies are making myriad claims that their solutions process transformer algorithms better than the competition's, but no standard benchmark for transformer engines yet exists to test those claims.
The burgeoning trend toward generative AI has flipped the whole AI world on its head, or so it seems.
Large Language Models (LLMs), as seen in ChatGPT, are mostly limited to language modeling and text generation. But transformers – the overarching deep-learning architecture that underlies LLMs and other generative AI applications – offer a model useful for data streams ranging from text, speech and images to 3D and video, or virtually any sensory data.
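That modality-agnostic quality comes from the transformer's core operation, self-attention, which works on any sequence of embedding vectors regardless of what they encode. A minimal sketch (in NumPy, with illustrative shapes chosen here for demonstration) of scaled dot-product attention:

```python
# Minimal sketch of scaled dot-product attention, the core of the
# transformer architecture. The mechanism only sees sequences of
# embedding vectors, so the same code applies whether the "tokens"
# came from words, audio frames, image patches, or video clips.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q, K, V: (seq_len, d) arrays of query/key/value vectors."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                   # pairwise similarity
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over each row
    return weights @ V                              # weighted mix of values

# Four "tokens" of dimension 8 -- could be word, patch, or frame embeddings.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(x, x, x)         # self-attention
print(out.shape)  # (4, 8): one context-aware vector per token
```

Real transformer engines stack many such attention layers (with learned projections for Q, K and V), but the heavy computation is the same matrix arithmetic shown above, which is what the competing accelerator claims are ultimately about.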