Orca: The Model Few Saw Coming

Orca is the first model slated to be open-sourced that comes genuinely close to ChatGPT, and it is just 13B parameters (small enough to run on a laptop). Microsoft's 51-page report was released only 48 hours ago, but I have gone through all of it, and I bring in relevant insights from 5 other papers. By imitating the step-by-step reasoning and explanations of GPT-4 (with GPT-3.5 as a teaching assistant), and by training on diverse tasks with an order of magnitude more examples, we get Orca. I will showcase it on a dozen benchmarks, go through in detail how it works and why, and end with comments from Sam Altman and Ilya Sutskever on whether open source will catch up...