Microsoft's collaboration with OpenAI has driven significant strides in integrating artificial intelligence (AI) capabilities into Microsoft's products and services. The partnership has also spurred the development of more specialized AI models. One such model, Orca, was recently unveiled by Microsoft Research. Orca is designed to learn by imitating the reasoning processes of large language models such as GPT-4.
Unlike its larger counterparts, Orca addresses the limitations of small models by imitating the reasoning of massive foundation models like GPT-4. This approach lets a compact model be optimized for specific tasks while drawing on the knowledge of far more expansive models. Orca's smaller size also means it requires far fewer computing resources to operate, so researchers can run it independently without relying on a large data center.
The research paper introducing Orca reports that the 13-billion-parameter model can successfully imitate and learn from larger language models like GPT-4. Built on the foundations of Vicuna, Orca learns to comprehend explanations, reason step by step, and follow complex instructions with the assistance of GPT-4, a model rumored to possess over one trillion parameters.
To foster progressive learning, Microsoft trains Orca on extensive and diverse imitation data. With this approach, Orca has already surpassed Vicuna by more than 100% on complex zero-shot reasoning benchmarks like Big-Bench Hard (BBH). The model is also reported to outperform conventional instruction-tuned AI models by 42% on AGIEval, a benchmark suite built from human-centric examinations.
Despite its smaller size, Orca is on par with ChatGPT in reasoning, as evidenced by its performance on benchmarks like BBH. Orca also shows competitive results on academic examinations such as the SAT, LSAT, GRE, and GMAT, although it still falls short of GPT-4.
The Microsoft research team behind Orca asserts that the model can learn from step-by-step explanations generated by both humans and more advanced language models. As a result, Orca is expected to continually enhance its skills and capabilities over time.
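The idea of learning from step-by-step explanations can be illustrated with a minimal sketch of how such imitation-tuning data might be assembled: a teacher model is prompted to explain its reasoning, and each explanation becomes the training target for the smaller student model. The function and field names below are illustrative assumptions, not Orca's actual pipeline or API.

```python
# Sketch of "explanation tuning" data assembly: pair an instruction with a
# teacher model's step-by-step explanation, which the student model is then
# fine-tuned to reproduce. All names here are hypothetical.

SYSTEM_PROMPT = "You are a helpful assistant. Explain your answer step by step."

def build_training_record(instruction: str, teacher_explanation: str) -> dict:
    """Build one imitation-tuning example.

    The target is the teacher's full explanation, not just the final
    answer, so the student learns the reasoning trace as well.
    """
    return {
        "system": SYSTEM_PROMPT,
        "instruction": instruction,
        "target": teacher_explanation,  # teacher output serves as the label
    }

# Example record built from a hypothetical teacher response.
record = build_training_record(
    "Is 17 a prime number?",
    "17 has no divisors between 2 and 4 (its square root is about 4.1), "
    "so 17 is prime. Answer: yes.",
)
```

A fine-tuning loop would then maximize the likelihood of `target` given `system` and `instruction`, which is how a 13B student can absorb reasoning patterns from a much larger teacher.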
Microsoft’s partnership with OpenAI and the introduction of Orca exemplify the ongoing efforts to advance the field of AI. By combining the strengths of large-scale models with more specialized counterparts, Microsoft aims to empower researchers and developers to create innovative applications that can effectively reason, comprehend complex instructions, and address a variety of tasks.