Apple's Secret AI Roadmap

A Closer Look



It's interesting how Apple has quietly acquired the most AI startups since 2017, despite not emphasizing AI in their product launches.

Apple has been researching generative AI for years, with a focus on conversational AI similar to models like ChatGPT.

In 2023, Tim Cook disclosed Apple's ongoing efforts in generative AI during an investor call.

By 2023, Apple had purchased 21 AI startups, surpassing Microsoft (12), Meta (11), and Google (8).

This strategic move is amplified by a research paper and the debut of MLX, a machine learning framework designed specifically for Apple devices.

Apple is pulling back the curtain on its AI roadmap.

Laying the Foundation: Running LLMs on iPhones?

Apple has a history of leveraging hardware advancements to enhance both in-house and external technologies.

Running LLMs on mobile devices is a challenge.

LLMs are memory hogs, packing billions of parameters that demand large amounts of DRAM.

Apple confronted the challenge in a research paper that sets the stage for running LLMs on mobile devices.

Instead of relying solely on conventional DRAM, Apple has opted for a unique solution: storing model weights in flash memory and selectively loading only the essential ones based on activation sparsity.

In simpler terms, only the parts of the model that are actually activated get loaded into memory, conserving DRAM and enhancing overall efficiency.

By reading larger data chunks from flash memory, they've managed to run models twice the size of available DRAM, with 4-5 times faster performance on the CPU and a 20-25 times speedup on the GPU.
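The core idea can be sketched in a few lines. This is a toy illustration of sparsity-based selective loading, not Apple's actual implementation: the list standing in for flash and the `selective_load` helper are hypothetical, chosen only to show how activation sparsity shrinks the working set that must sit in fast memory.

```python
# Toy sketch: keep the full weight matrix in "slow" storage (flash),
# and copy into "fast" memory (DRAM) only the rows whose activations
# are nonzero. Names here are illustrative, not Apple's API.

def selective_load(flash_weights, activations, threshold=0.0):
    """Return only the weight rows whose activation magnitude
    exceeds the threshold -- the small working set held in DRAM."""
    active_rows = [i for i, a in enumerate(activations) if abs(a) > threshold]
    return {i: flash_weights[i] for i in active_rows}

# Six weight rows, but only neurons 1 and 4 fire on this input,
# so only two rows ever leave "flash".
W = [[0.1] * 4 for _ in range(6)]
acts = [0.0, 1.3, 0.0, 0.0, -0.7, 0.0]
working_set = selective_load(W, acts)
print(sorted(working_set))  # only the active rows are loaded
```

The memory win comes from the sparsity of `acts`: the denser the activations, the less this strategy helps, which is why the paper pairs it with reading flash in large sequential chunks.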

The researchers stated: "We believe as LLMs continue to grow in size and complexity, approaches like this work will be essential for harnessing their full potential in a wide range of devices and applications."

Building the Framework: MLX Unveiled

As Apple advances along its roadmap, it introduces a crucial milestone: MLX, a machine learning framework tailored for Apple devices.

This framework not only simplifies the landscape for developers but also signifies a pivotal shift towards an AI-centric focus in future operating systems.

Python and C++ APIs serve as the building blocks, resembling familiar tools like NumPy and PyTorch.

MLX introduces three fundamental concepts:

Samples: individual data instances.

Buffers: indexable containers of samples.

Streams: iterable sequences of samples.

The MLX-data library provides primitives for working with these concepts.
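These three concepts map onto familiar Python ideas. Here is a rough, pure-Python sketch of the distinction; the class names mirror MLX's terminology, but the actual MLX Data library provides optimized implementations with a different API:

```python
# Illustrative sketch of the three MLX Data concepts in plain Python.
# (The real library ships optimized versions; these names are not its API.)

# Sample: one data instance, conventionally a dict of fields.
sample = {"image": [0.1, 0.2, 0.3], "label": 1}

class Buffer:
    """Indexable, finite container of samples (random access)."""
    def __init__(self, samples):
        self._samples = list(samples)

    def __len__(self):
        return len(self._samples)

    def __getitem__(self, i):
        return self._samples[i]

class Stream:
    """Iterable, potentially unbounded sequence of samples
    (consumed in order, no random access)."""
    def __init__(self, iterable):
        self._iterable = iterable

    def __iter__(self):
        return iter(self._iterable)

buf = Buffer([{"label": i} for i in range(3)])      # can ask len(buf), buf[2]
stream = Stream({"label": s["label"] * 10} for s in buf)  # can only iterate
print(len(buf), [s["label"] for s in stream])
```

The Buffer/Stream split matters for pipelines: buffers support shuffling and random access, while streams cover data too large (or too live) to index.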

While the framework aims to simplify machine learning tasks on Apple devices, its implications extend beyond mere functionality.

For developers building products on iOS, iPadOS, or macOS, staying informed about this development is essential.

This design choice reflects a commitment to a more data-centric approach within the Apple AI ecosystem.

Apple is not merely constructing tools for their domain but is contributing to the broader landscape of deep learning frameworks.

Siri AI

In the latest phase of Apple's journey into AI, the iOS 17.4 code reveals the integration of large language models into its operating system.

The focus is on leveraging generative AI technology and large language models to enhance Siri's intelligence, enabling it to perform complex tasks more effectively based on voice commands.

The SiriSummarization framework, nestled in the iOS 17.4 code, hints at potential AI integration in the Messages app.

This represents a crucial intersection in Apple's journey, where they aim to make AI an integral part of everyday user interactions.

System prompts like "please summarize" and "please answer this question" underscore their effort to create a versatile and user-friendly system.
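Those fixed prompt strings suggest a simple template mechanism. The sketch below is purely hypothetical: the `PROMPTS` table and `build_request` helper are invented names showing how such system prompts could wrap user text before an on-device model call, not anything from Apple's code.

```python
# Hypothetical sketch of routing a request through a fixed system prompt,
# as the "please summarize" strings in the iOS 17.4 code hint at.
# Nothing here is Apple's API; `run_local_llm` is a stand-in.

PROMPTS = {
    "summarize": "please summarize",
    "answer": "please answer this question",
}

def build_request(task, user_text):
    """Prepend the fixed system prompt for the given task."""
    return f"{PROMPTS[task]}\n\n{user_text}"

def run_local_llm(request):
    # Placeholder: a real implementation would invoke the on-device model.
    return f"[model output for: {request.splitlines()[0]}]"

req = build_request("summarize", "Long message thread ...")
print(run_local_llm(req))
```

Keeping the prompts as a small fixed table, rather than free-form instructions, is one way to make an on-device model's behavior predictable across system features.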

To be clear, the technology isn't actively deployed on iPhones yet.

It is uncertain when it will be released to the public.

iOS 18: Apple Doing AI?

The central aim is to imbue iOS with advanced language processing capabilities, mirroring the overarching trend of enriching user interactions through sophisticated AI.

The iOS 17.4 code reveals their testing of four distinct AI models, including Apple's proprietary "Ajax" model and variations of AjaxGPT.

Apple adopts a dual strategy: creating its own model, Ajax, while evaluating its performance against established solutions like ChatGPT and FLAN-T5.

The integration of on-device large language models, developed by Apple researchers, promises significant speed improvements compared to standard models, aligning with computer science pioneer Alan Kay's philosophy that people serious about software should make their own hardware.

In 2020, Apple strengthened its on-device processing capabilities through the acquisition of Xnor.ai for a sum of $200 million.

This move allows Apple to control both hardware and software components, enabling devices to perform tasks like facial recognition, natural language processing, and augmented reality on-device, for both performance and privacy reasons.

From laying the foundation with the flash memory approach to actively exploring large language models, Apple's roadmap signifies a deliberate integration of AI into its ecosystem, setting the stage for a future where AI seamlessly weaves into the fabric of Apple users' daily experiences.

It's not just a roadmap; it's a journey toward an AI-infused Apple universe.


💭 How does the active exploration of large language model integration align with Apple's strategy to maintain a leading position in the ever-evolving landscape of AI?

Follow us on LinkedIn, where we talk weekly about generative AI and its impact on business.
