Apple quietly released an open source multimodal LLM in October

With little fanfare, researchers from Apple and Columbia University released an open source multimodal LLM, called Ferret, in October 2023. At the time, the release — which included the code and weights, but for research use only, not under a commercial license — did not receive much attention. But now that may be changing: with open source models from Mistral making recent headlines and Google’s Gemini model coming to the Pixel Pro and eventually to Android, there has been increased chatter about the potential for local LLMs to power small devices.

That chatter increased recently because Apple announced a key breakthrough in deploying LLMs on iPhones: the company released two new research papers introducing techniques for 3D avatars and efficient language model inference. The advancements were hailed as potentially enabling more immersive visual experiences and allowing complex AI systems to run on consumer devices such as the iPhone and iPad.

Many in the AI community who belatedly noticed the Ferret release celebrated Apple’s unexpected entry into the open source LLM landscape, especially since Apple has traditionally been known as a “walled garden.”

This morning, Bart de Witte, who runs a European non-profit focused on open source AI in medicine, posted on X: “I somehow missed this,” he wrote. “Apple joined the open source AI community in October. Ferret’s introduction is a testament to Apple’s commitment to impactful AI research, solidifying its place as a leader in the multimodal AI space…ps: I’m looking forward to the day when Local Large Language Models (LLLMs) run on my iPhone as an integrated service of a re-designed iOS.”

Tristan Behrens, a German AI music artist and advisor, also weighed in. “Well Christmas is tomorrow,” he wrote on LinkedIn. “But did you realize that Apple (yes Apple!) recently released a Multimodal Large Language Model? Including code and weights?”

And tech blogger and VentureBeat contributor Ben Dickson wrote on LinkedIn: “What is the AI development in 2023 that you expected the least? For me, it was Apple releasing open source LLMs (albeit with a non-commercial license).” Apple, he added, “has long been the paragon of closed systems, walled garden development, secrecy, bulletproof NDAs, releasing zero details, and patenting every single bit of their products.”

But in retrospect, he continued, “it makes sense for Apple (like Meta) to enter the LLM market with open source models. To compete with models like ChatGPT, you either need to be a hyperscaler or be in partnership with one. Apple might have a lot of resources, but its infrastructure is not built to serve LLMs at scale. The alternative would be to become dependent on a cloud provider like Microsoft or Google (both bitter enemies) or start releasing its own open source models, a la Meta.”

Interestingly, the news about Apple’s open source and local ML developments comes as both Anthropic and OpenAI are reportedly negotiating massive new funding raises for their proprietary LLM development efforts. Reuters reported on Wednesday that Anthropic is in discussions to raise $750 million from Menlo Ventures, while Bloomberg reported yesterday that OpenAI is “in early discussions to raise a fresh round of funding at a valuation at or above $100 billion.”
