Despite keeping a relatively low profile on the topic, Apple has been rapidly innovating with artificial intelligence (AI) in 2023, and it has big plans for 2024. AI is reinventing how people interact with software, and Apple's business is certainly affected by the innovation and widespread disruption taking place. In this analysis, I want to take a closer look at Apple's 2023 AI advances as well as its strategy for 2024 and beyond. This is a pivotal moment for the firm to cement its place in the AI era as other technology companies, and non-tech companies alike, race to do the same.
Apple’s AI Models and Frameworks
In 2023, Apple released MLX, a development framework that lets developers build machine learning (ML) models optimized for Apple Silicon, the company's in-house line of Mac chips. It was partly inspired by well-known open-source ML frameworks, including PyTorch, JAX, and ArrayFire. MLX ships alongside MLX Data, a companion data-loading library. This is a critical step in helping Apple's developer community build AI applications on top of its hardware.
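To ground this, here is a minimal sketch of what building a model with MLX's Python API looks like: a tiny two-layer network and a single forward pass. The layer sizes and dummy input are arbitrary choices for illustration, not anything Apple ships.

```python
# Minimal MLX sketch: a tiny two-layer network and one forward pass.
# Assumes the open-source `mlx` Python package on an Apple Silicon Mac.
import mlx.core as mx
import mlx.nn as nn


class TinyNet(nn.Module):
    """Two-layer MLP; the sizes are arbitrary and for illustration only."""

    def __init__(self, in_dims: int, hidden: int, out_dims: int):
        super().__init__()
        self.fc1 = nn.Linear(in_dims, hidden)
        self.fc2 = nn.Linear(hidden, out_dims)

    def __call__(self, x):
        return self.fc2(nn.relu(self.fc1(x)))


model = TinyNet(in_dims=8, hidden=32, out_dims=2)
x = mx.random.normal((4, 8))  # batch of 4 dummy inputs
y = model(x)
mx.eval(y)                    # MLX evaluates lazily; force the computation
print(y.shape)                # 4 x 2
```

The API will feel familiar to anyone coming from PyTorch or JAX, which is the point: lowering the switching cost for developers who want their models to run well on Apple hardware.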
Apple is also reportedly working on its own generative AI model, code-named "Ajax," its answer to OpenAI's GPT-n series. At a reported 200 billion parameters, this Apple large language model (LLM) would be core to the company's AI strategy moving forward and is likely to be comparable in performance to OpenAI's recent models.
Ajax would allow the company to power many new AI integrations and features across its ecosystem of devices and applications without relying on third-party model providers such as Microsoft, OpenAI, or Google. Self-reliance and an integrated technology stack have always been Apple's modus operandi, and its approach to AI so far has been no different.
According to The Information and Apple analyst Jeff Pu, the company has also been building out AI servers over the past year and will accelerate those efforts in 2024. One analyst estimate puts Apple's spending on servers in 2023 alone at roughly $620 million. This server buildout will be critical to training its own 200-billion-parameter model and pursuing its own AI research.
Apple’s Research on AI
In parallel, Apple has been pursuing research in edge AI, specifically on-device AI. There are many benefits to moving model inference (the computation that runs each time you prompt a generative AI model) closer to the edge: relying entirely on the cloud brings latency, privacy concerns, accessibility limitations, high costs, and more. For a company that controls a major percentage of the edge devices on our desks and in our pockets, there is a lot of value to be captured in bringing AI computation on-device; a conceptual sketch of that device-versus-cloud tradeoff follows the list below. To understand edge AI further, you can check out past coverage on Acceleration Economy:
- How Edge Computing Can Help Solve Generative AI Cybersecurity, Privacy Concerns
- With Its Hybrid Focus, Qualcomm Advances On-Device and Edge AI
- CIO Perspective: Why Some Use Cases Benefit From Shifting AI Processing to the Edge
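To make the tradeoff concrete, here is a hypothetical routing sketch: an application tries a small on-device model first, for latency and privacy, and falls back to a hosted model only when the local one can't serve the request. Every name here (run_local_model, call_cloud_api) is an invented placeholder, not a real Apple or third-party API.

```python
# Hypothetical routing logic for hybrid edge/cloud inference.
# All functions are illustrative placeholders, not real APIs.
from dataclasses import dataclass


@dataclass
class InferenceResult:
    text: str
    source: str  # "device" or "cloud"


def run_local_model(prompt: str) -> str | None:
    """Pretend on-device model: in this toy heuristic it only handles short prompts."""
    if len(prompt) < 200:
        return f"[on-device answer to: {prompt[:40]}]"
    return None


def call_cloud_api(prompt: str) -> str:
    """Placeholder for a network call to a larger hosted model."""
    return f"[cloud answer to: {prompt[:40]}]"


def answer(prompt: str) -> InferenceResult:
    # Prefer the device: lower latency, no network dependency, data stays local.
    local = run_local_model(prompt)
    if local is not None:
        return InferenceResult(local, source="device")
    # Fall back to the cloud for requests the small model can't serve.
    return InferenceResult(call_cloud_api(prompt), source="cloud")


print(answer("Summarize my unread messages."))
```

The more work the device can absorb, the less a company pays in cloud compute and the less user data has to leave the phone, which is exactly the value Apple is positioned to capture.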
As for Apple's own research, its researchers have published a paper ("LLM in a Flash") describing a method for running large models on-device by keeping model weights in flash memory and loading only the parameters needed at each step into RAM, which mobile phones have in limited supply. Flash is where your files and apps are stored, and it is a much larger memory pool than RAM. Bringing AI computation on-device this way opens new possibilities: a better Siri, more advanced chatbots, real-time language translation, better AI-driven features in photography and augmented reality, and more.
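The underlying idea can be sketched with a memory-mapped weight file: the full matrix stays on flash, and only the rows needed for a given step are read into RAM. The NumPy memmap sketch below is a toy stand-in with invented sizes and a fake sparsity predictor; it illustrates the access pattern, not the paper's actual implementation.

```python
# Toy illustration of flash-resident weights: keep a large weight matrix in a
# file on disk (standing in for flash) and read only the rows needed per step.
# Sizes and the "active rows" selection are invented for illustration.
import numpy as np

ROWS, COLS = 20_000, 1_024  # a matrix we would rather not hold fully in RAM

# One-time setup: write the matrix to disk (a real model would ship this file).
disk = np.memmap("weights.bin", dtype=np.float16, mode="w+", shape=(ROWS, COLS))
disk[:] = np.random.default_rng(0).standard_normal((ROWS, COLS)).astype(np.float16)
disk.flush()

# Inference time: map the file read-only; rows are read from disk only when touched.
weights = np.memmap("weights.bin", dtype=np.float16, mode="r", shape=(ROWS, COLS))


def partial_matvec(x: np.ndarray, active_rows: np.ndarray) -> np.ndarray:
    """Compute output entries only for `active_rows`, reading just those rows."""
    out = np.zeros(ROWS, dtype=np.float32)
    out[active_rows] = weights[active_rows].astype(np.float32) @ x
    return out


x = np.random.default_rng(1).standard_normal(COLS).astype(np.float32)
active = np.random.default_rng(2).choice(ROWS, size=500, replace=False)  # stand-in for a sparsity predictor
y = partial_matvec(x, active)
print(y.shape, np.count_nonzero(y))  # (20000,) with roughly 500 nonzero entries
```

A phone with a few gigabytes of RAM can't hold a multi-billion-parameter model outright, so paging weights in from the much larger flash pool, and touching only the parameters a given token actually needs, is what makes on-device LLMs plausible.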
Voice Assistants
Siri, in particular, has been on thin ice since the release of ChatGPT and the latter’s voice functionality. I’ve had full conversations with ChatGPT while cooking over the holidays. Despite the competition, Apple has a unique opportunity to deliver a truly integrated voice assistant experience. With the right privacy and control measures in place, I can see an AI-based Siri helping users manage their day more holistically and conversationally, accessing applications, messages, and so on, like Jarvis in the Iron Man movies.
Final Thoughts
Without a doubt, Apple faces strong competition. The company has a distinct chance of "losing" the generative AI race and surrendering market share to companies with the best AI technology and experiences, such as Microsoft or Google. This goes far beyond voice assistants: The biggest tech suppliers are spending billions across the AI stack, from hardware to talent to frameworks to software and more.
AI will inevitably weave itself into every consumer's interaction with software, from using apps to taking photos and browsing the web. Apple needs to build AI into iOS at a deep level and invest across the development stack to remain competitive and maintain control across the AI supply chain. I'm excited to see what it has in store for 2024 and beyond.