Reports of the imminent release of the next iteration of a neural network machine learning model created by OpenAI, the San Francisco-based company that counts billionaire Elon Musk among its co-founders, have sparked a buzz in the artificial intelligence community. The linguistic capability demonstrated by GPT-3, launched in 2020, means there is great curiosity around the upcoming model. From writing books to creating computer code and poetry, GPT-3 provided a glimpse of what AI could be made to achieve. With GPT-4 expected to improve on the previous version, here’s a look at how far artificial intelligence is considered to have come in matching human capabilities.
What is GPT-3?
How about this for a satirical piece in imitation of writer Jerome K Jerome of ‘Three Men In A Boat’ fame: “It is a curious fact that the last remaining form of social life in which the people of London are still interested is Twitter. I was struck by this curious fact when I went on one of my periodical holidays to the sea-side, and found the whole place twittering like a starling-cage. I called it an anomaly, and it is.” Before commenting on the style, form, and parallels with the language of the English humorist, it should be pointed out that this paragraph (and the whole six-page story it opens) was written by GPT-3.
Short for ‘Generative Pre-trained Transformer’, GPT-3 is the third generation of a model that was trained to generate text using data collected by crawling the Internet. GPT-3 can study anything structured like a language and can then perform tasks centred on that language. For example, it can be trained to write press releases and tweets as well as computer code. In this sense, such AI is described as a predictive language model and performs so-called natural language processing (NLP) tasks. In other words, “it is an algorithmic structure designed to take one piece of language (an input) and transform it into what it predicts is the most useful following piece of language for the user,” writes author Bernard Marr in Forbes.
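Marr’s description, taking an input and predicting the most useful next piece of language, can be sketched with a toy model. The snippet below is purely illustrative and nothing like GPT-3’s actual transformer architecture: it simply counts word bigrams in a tiny made-up corpus and then predicts the word that most often follows a given input word.

```python
from collections import Counter, defaultdict

# Toy illustration (not OpenAI's model): a bigram "language model" that,
# like GPT in miniature, takes input text and predicts the next word.
def train_bigram(corpus: str):
    """Count which word follows which across the training text."""
    words = corpus.lower().split()
    follows = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        follows[prev][nxt] += 1
    return follows

def predict_next(model, word: str) -> str:
    """Return the continuation seen most often in training."""
    candidates = model.get(word.lower())
    if not candidates:
        return "<unknown>"
    return candidates.most_common(1)[0][0]

corpus = (
    "the model reads text and the model predicts the next word "
    "and the next word follows the text"
)
model = train_bigram(corpus)
print(predict_next(model, "and"))  # "the" is the only word seen after "and"
```

A real GPT model does the same job at vastly larger scale, predicting over whole vocabularies with billions of learned parameters rather than raw counts.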
GPT-3 has not been released to the public, and reports in October last year indicated that access to it had been granted to selected experts, who have since provided an overview of the tasks it can accomplish.
How will GPT-4 be different?
Before GPT-3, there were GPT-2 and GPT-1, which were released by OpenAI in 2019 and 2018, respectively. But they were like nascent steps leading up to the launch of GPT-3. Where GPT-2 had 1.5 billion parameters, GPT-3 had 175 billion, making it the largest artificial neural network ever created and 10 times more powerful than the model it dethroned – the Turing NLG model created by Microsoft, which had 17 billion parameters.
An artificial neural network (ANN) is a system designed to mimic the functioning of the brain; it enables “computer programs to recognize patterns and solve common problems in the fields of AI, machine learning, and deep learning,” explains IBM.
“Pre-trained,” in the context of GPT and other NLP programs like it, means that such models have been fed huge amounts of data so that they can work out the rules of language, variations in the meaning of words, and so on. Once such a model has been trained, it can generate output from a basic prompt. For example, for the Jerome K Jerome story about Twitter, the user trying out GPT-3 said, “All I seeded was the title, the author’s name and the first ‘It’, the rest is done by #gpt3”.
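That “seed a prompt, let the model do the rest” workflow can be mimicked in miniature. The sketch below is a hypothetical toy, not OpenAI’s API: it “pre-trains” bigram counts on a single sentence and then greedily extends a one-word prompt, one predicted word at a time.

```python
from collections import Counter, defaultdict

# Hypothetical miniature of "pre-train, then prompt": real GPT models learn
# from huge web crawls; here the "training data" is a single sentence.
def pretrain(text: str):
    """Build a table of which word follows which in the training text."""
    table = defaultdict(Counter)
    words = text.split()
    for a, b in zip(words, words[1:]):
        table[a][b] += 1
    return table

def complete(table, prompt: str, length: int = 4) -> str:
    """Greedily extend the prompt one predicted word at a time."""
    out = prompt.split()
    for _ in range(length):
        options = table.get(out[-1])
        if not options:
            break
        out.append(options.most_common(1)[0][0])
    return " ".join(out)

table = pretrain("it is a curious fact and it is a curious fact indeed")
print(complete(table, "it"))  # "it is a curious fact"
```

GPT-3 does the equivalent with a learned probability distribution over its entire vocabulary, which is why a short seed can grow into a six-page story.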
Now we come to GPT-4. Reports indicate that, in keeping with OpenAI’s pattern of releasing a new version every year, it may soon make a version available for expert testing. It has thus been suggested that GPT-4 could be released early next year or in 2023, and it is widely predicted that it would be a game-changer.
A Towards Data Science (TDS) report said that GPT-4 could have 100 trillion parameters, making it “five hundred times” larger than GPT-3. “The brain has around 80-100 billion neurons (GPT-3’s order of magnitude) and around 100 trillion synapses. GPT-4 will have as many parameters as the brain has synapses. The sheer size of such a neural network could entail qualitative leaps from GPT-3 we can only imagine,” it added.
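The “five hundred times” figure is easy to check against the two reported counts, 175 billion parameters for GPT-3 and the rumoured 100 trillion for GPT-4:

```python
gpt3_params = 175e9    # GPT-3: 175 billion parameters (reported)
gpt4_rumored = 100e12  # rumoured GPT-4 figure: 100 trillion (per TDS)

ratio = gpt4_rumored / gpt3_params
print(round(ratio))  # 571, i.e. roughly "five hundred times" larger
```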
The TDS report also said that GPT-4 “is unlikely to be just a language model,” referring to a December 2020 article by Ilya Sutskever, the chief scientist of OpenAI, in which he said that in 2021 “language models will start to become aware of the visual world”.
However, Sam Altman, CEO of OpenAI, has been reported as saying GPT-4 will not be bigger than GPT-3 but will use more compute resources.
So how close are we to an AI as good as human intelligence?
The stated objective of OpenAI is to achieve artificial general intelligence, that is, an AI that has the same intelligence that a normal human being is supposed to have. It is something that sounds a lot simpler than it actually is. As OpenAI itself notes, “AI systems today have impressive but narrow capabilities. It seems that we’ll keep whittling away at their constraints, and in the extreme case they will reach human performance on virtually every intellectual task.”
But it adds that “it’s hard to fathom how much human-level AI could benefit society, and it’s equally hard to imagine how much it could damage society if built or used incorrectly”. To that end, it says its stated goal is to “advance digital intelligence in the way that is most likely to benefit humanity as a whole”.
As to how long that will take, OpenAI says “it’s hard to predict when human-level AI might come within reach”. Nor is it certain that imitating human thought patterns will be the best way to crack the AI code; as the organization notes of AI’s history, “the solution to each task turned out to be much less general than people hoped”.
But it has focused on deep learning because the strategy was found to have “produced outstanding results on pattern recognition problems, such as recognizing objects in images, machine translation and speech recognition”, which it said has now provided a glimpse of “what it might be like for computers to be creative, to dream, and to experience the world”.
However, that said, GPT-3 isn’t exactly the perfect text-writing tool that humans can completely rely on. As Altman himself put it, “The GPT-3 hype is way too much… AI is going to change the world, but GPT-3 is just a very early glimpse.” Its output has, in fact, been described as ‘gibberish’ when it is asked to produce something longer or more complex.
But while it is suggested that future iterations of such AI systems would improve on the flaws of previous generations, not everyone is convinced. TDS quotes Stuart Russell, professor of computer science at the University of California at Berkeley and AI pioneer, as saying that “focusing on raw computing power completely misses the point… We don’t know how to make a machine truly intelligent – even if it were the size of the universe”. That is, deep learning alone may not be enough to achieve human-level intelligence.
But it is nonetheless an approach that some of the biggest names in tech are pursuing. For example, Microsoft, which is also an investor in OpenAI, has taken this route. It published a sort of explainer, “generated by the Turing-NLG language model itself”, which says that “massive deep learning language models… with billions of parameters learned from essentially all the text published on the Internet, have improved the state of the art on nearly every downstream Natural Language Processing (NLP) task, including question answering, conversational agents, and document understanding, among others”.
All of this means that scientists are not yet ruling out human-like AI, although many say, as a report by consulting firm McKinsey points out, that artificial general intelligence is still far from being a reality. But the report adds that “many academics and researchers maintain that there is at least a chance that human-level artificial intelligence could be achieved in the next decade”.
“Understanding human-level AI will be a profound scientific achievement (and economic boon) and may well happen by 2030 (25% chance), or by 2040 (50% chance) – or never (10% chance),” Richard Sutton, professor of computer science at the University of Alberta, reportedly said.