New Microsoft Orca AI model can learn from and mimic GPT-4; here is what you get

Meet Orca, Microsoft’s new AI model that can mimic larger AI models while sidestepping the formidable challenges posed by large-scale data handling and task variety.

By: SHAURYA TOMER
| Updated on: Jun 21 2023, 15:53 IST
Orca is an open-source model that can mimic other LLMs such as ChatGPT. (Pexels)

Like Google and many other companies, Microsoft is investing heavily in AI. Its multiyear, multibillion-dollar investment in OpenAI, the maker of ChatGPT, is just one example of the vision set by CEO Satya Nadella. While Large Language Models (LLMs) like ChatGPT and Google Bard have vast capabilities, their sheer size demands large computing resources, which limits where and how they can be deployed. To counter this, Microsoft has introduced Orca, a 13-billion-parameter model that learns to imitate the reasoning process of Large Foundation Models (LFMs).

Meet Orca

Unlike ChatGPT, Microsoft Orca is a smaller AI model, developed and tailored for specific use cases. According to a Microsoft research paper, Orca learns from the rich signals generated by GPT-4, which reportedly has around one trillion parameters, including explanation traces, intricate instructions, and detailed step-by-step thought processes, while sidestepping the formidable challenges posed by large-scale data handling and task variety. Due to its smaller size, Orca does not require large, dedicated computing resources. As a result, it can be optimized and tailored for specific applications without the need for a large-scale data center.
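To make the idea of learning from "explanation traces" concrete, here is a minimal, purely illustrative sketch of what one such training example could look like. The field names, the task, and the wording of the response are assumptions made for illustration; they are not taken from Microsoft's actual Orca dataset or code.

```python
# Hypothetical sketch of an "explanation tuning" training example: the smaller
# student model is fine-tuned on the teacher's step-by-step explanation rather
# than on a short answer alone. All field names and content here are illustrative.

example = {
    # System message nudging the teacher (e.g. GPT-4) to reveal its reasoning
    "system": "You are a helpful assistant. Think step by step and justify your answer.",
    # Task drawn from a large instruction collection
    "instruction": "If a train travels 120 km in 2 hours, what is its average speed?",
    # Teacher response containing the explanation trace the student imitates
    "response": (
        "Average speed is distance divided by time. "
        "The train covers 120 km in 2 hours, so 120 / 2 = 60. "
        "Its average speed is 60 km/h."
    ),
}

# In supervised fine-tuning, the student would be trained to reproduce `response`
# given `system` + `instruction` as the prompt.
prompt = f"{example['system']}\n\n{example['instruction']}"
target = example["response"]
print(prompt, "\n---\n", target)
```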

One of the most notable aspects of this AI model is its open-source nature. Unlike the privately owned ChatGPT and Google Bard, Orca follows an open-source framework, meaning the public can contribute to the development and improvement of the model. By harnessing community contributions, it can take on the proprietary models built by large tech companies.

While it is built on the foundations of Vicuna, another instruction-tuned model, Orca outperforms it by 100 percent on complex zero-shot reasoning benchmarks such as Big-Bench Hard (BBH) and by 42 percent on AGIEval.
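For readers unfamiliar with how such figures are reported, the percentages describe Orca's relative improvement over Vicuna's score on each benchmark. The snippet below is a minimal sketch of that arithmetic using made-up placeholder scores, not actual benchmark results.

```python
# What "surpasses by 100 percent / 42 percent" means here: the relative
# improvement of Orca's benchmark score over Vicuna's baseline score.
# The scores below are placeholders purely to illustrate the arithmetic.

def relative_improvement(new_score: float, baseline: float) -> float:
    """Percentage improvement of new_score over baseline."""
    return (new_score - baseline) / baseline * 100

vicuna_bbh = 20.0   # placeholder baseline score
orca_bbh = 40.0     # placeholder Orca score

print(f"{relative_improvement(orca_bbh, vicuna_bbh):.0f}% improvement")  # 100% improvement
```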

A ChatGPT rival

According to the research paper, Orca not only surpasses other instruction-tuned models but also performs on par with OpenAI's ChatGPT on the BBH benchmark, despite its smaller size. Moreover, it displays academic prowess in competitive exams such as the LSAT, GRE, and GMAT in zero-shot settings without chain-of-thought (CoT) prompting, although it still trails behind GPT-4.

Microsoft's research team claims that Orca can learn through step-by-step explanations, from both human experts and other Large Language Models (LLMs), to further improve its capabilities and skills.

First Published Date: 21 Jun, 14:55 IST