Meet Orca, Microsoft’s new 13 billion parameter AI model that can imitate GPT-4

Microsoft, in partnership with OpenAI, has been steadily implementing AI capabilities in its products and services and also building smaller case-specific models. Microsoft Research unveiled a new AI model called Orca, which learns by imitating large language models. According to the research paper, Orca is designed to overcome the limitations of smaller models by imitating the reasoning processes of large foundation models like GPT-4
- International - June 20, 2023


Microsoft recently announced the development of a new language model, named Orca, with 13 billion parameters. Orca is designed to generate fluent, human-like text and to imitate the reasoning of much larger models such as GPT-4.

Orca is built on Vicuna, an open-source model, rather than on the GPT architecture, and at 13 billion parameters it is far smaller than OpenAI's GPT-3, which has 175 billion. Despite its smaller size, Orca represents a significant achievement in the field of natural language processing.

One of Orca's key features is its ability to generate text that is not only grammatically correct but also semantically coherent and contextually appropriate: its output is meant to be free of errors and to make sense for the topic or situation at hand.

To achieve this level of performance, Orca was fine-tuned on a large and diverse set of instruction-following data, with responses drawn from more capable teacher models, improving its ability to generate text that is relevant and informative.

Field of language modeling

While Orca is a major advance in the field of language modeling, it also raises concerns about the potential impact of such models on society. One concern is that these models may be used to generate large amounts of fake news and propaganda, which could have serious consequences for democracy and public discourse.

Another concern is that language models like Orca may contribute to the displacement of human workers in certain industries, such as content creation and journalism. As these models become more advanced, they may be able to perform a wider range of tasks that were previously the domain of human workers.

Despite these concerns, Orca represents a major advance in the field of natural language processing and has the potential to revolutionize multiple industries, from healthcare and education to finance and entertainment. As technology continues to evolve, it will be important to carefully consider its potential impacts and take steps to ensure that it is used in ways that benefit society as a whole.

Microsoft, in partnership with OpenAI

Language models like Orca can be optimized for specific tasks and trained using large language models like GPT-4. Due to its smaller size, Orca requires fewer computing resources to run and operate. Researchers can optimize their models according to their requirements and independently run them without relying on a large data center.

According to the research paper, Orca, a 13-billion-parameter AI model, can imitate and learn from large language models like GPT-4 and is based on Vicuna. With the help of GPT-4, which is rumored to have over one trillion parameters, it can learn explanations, step-by-step thought processes, and other complex instructions.
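The imitation-learning setup described above can be sketched roughly as follows. This is a simplified illustration of "explanation tuning" as described in the paper, not Microsoft's actual pipeline; all function names are hypothetical placeholders.

```python
# Rough sketch: a large teacher model (e.g. GPT-4) is prompted with a
# system instruction asking for step-by-step reasoning, and the smaller
# student model is later fine-tuned on the resulting triples.
# `teacher_generate` is a hypothetical stand-in for a teacher-model API call.

def build_imitation_example(query, teacher_generate):
    """Create one (system, user, assistant) training triple from a teacher."""
    system_instruction = (
        "You are a helpful assistant. Think step by step and "
        "justify your answer before giving it."
    )
    explanation = teacher_generate(system=system_instruction, user=query)
    return {
        "system": system_instruction,
        "user": query,
        "assistant": explanation,  # rich, step-by-step target for the student
    }

def build_dataset(queries, teacher_generate):
    # Large-scale, diverse imitation data: one explained answer per query.
    return [build_imitation_example(q, teacher_generate) for q in queries]
```

A 13-billion-parameter student such as Orca would then be fine-tuned with a standard language-modeling loss on the `assistant` field, conditioned on the `system` and `user` fields.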

Microsoft is utilizing large-scale and diverse imitation data to promote progressive learning with Orca, which has already surpassed Vicuna by more than 100% on complex zero-shot reasoning benchmarks like Big-Bench Hard (BBH). The new model is also claimed to outperform conventional instruction-tuned models by 42% on AGIEval.
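It is worth being clear about what these percentages mean: they are relative improvements over a baseline's score, so a "100% improvement" means the score doubled. A minimal sketch, using made-up illustrative scores rather than the paper's actual numbers:

```python
def relative_improvement(new_score, baseline_score):
    """Percentage improvement of new_score over baseline_score."""
    return 100.0 * (new_score - baseline_score) / baseline_score

# Illustrative (made-up) scores: a 100% relative improvement means the
# new model's benchmark score is double the baseline's.
baseline = 24.0   # hypothetical Vicuna score on a BBH-style benchmark
new = 48.0        # hypothetical Orca score on the same benchmark
print(relative_improvement(new, baseline))  # 100.0
```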

In terms of reasoning, despite being a smaller model, Orca is said to be on par with ChatGPT on benchmarks like BBH. Additionally, it demonstrates competitive performance on academic examinations such as SAT, LSAT, GRE, and GMAT, although it falls behind GPT-4.

The Microsoft research team states that Orca can learn from step-by-step explanations created by humans and by more advanced language models, and that its skills and capabilities are expected to keep improving.