Bloomberg's Apple Intelligence report highlights a company at odds with its core values, complete with comparisons to Grok.

There's an interesting report this morning from Mark Gurman and Drake Bennett at Bloomberg on Apple Intelligence.

The story leads with a description of John Giannandrea being "poached" from Google in 2018. That shake-up hasn't resulted in much substantial output so far. Over the past year or so Apple has mounted a marketing blitz for its answer to AI, Apple Intelligence, but the push was premature.

Some writing features were added to Apple platforms, and in other areas Siri can kinda phone home to ChatGPT. But the initiative has so far fallen behind rivals, and even come to rely on them.

Yet the article makes me wonder if that's a bad thing. The more I read of Bloomberg's story, the more I feel like the very core of Apple's positioning over the past few years is at stake. And a lot of it stems from Apple growing into the everything-company. Take this quote for example:

The company’s base of 2.35 billion active devices gives it access to more data—web searches, personal interests, communications and more—than many of its competitors. But Apple is much stricter than Google, Meta and OpenAI about allowing its AI researchers access to customer data. Its commitment to privacy also extends to the personal data of noncustomers: Applebot, the web crawler that scrapes data for Siri, Spotlight and other Apple search features, allows websites to easily opt out of letting their data be used to improve Apple Intelligence. Many have done just that.

The article almost plays this off as being a bad thing. Apple's App Tracking Transparency privacy move reportedly stripped Meta of US$10 billion in revenue as of 2022, and Apple competes viciously with Google in other key areas. But the article continues:

All this has left Apple’s researchers more heavily reliant on datasets it licenses from third parties and on so-called synthetic data—artificial data created expressly to train AI. “There are a thousand noes for everything in this area, and you have to fight through the privacy police to get anything done,” says a person familiar with Apple’s AI and software development work. An executive who takes a similar view says, “Look at Grok from X—they’re going to keep getting better because they have all the X data. What’s Apple going to train on?”

There's an expectation that publicly listed businesses must do everything possible to grow, and this is an interesting example. Apple is a hardware and software company. It makes a big mobile OS and a pretty big desktop OS.

So it should sound alarm bells that an executive is pointing to Elon Musk's "white genocide"-believing AI Grok as a baseline for LLMs. It should also sound alarm bells that Apple, the company that shipped iPods with a "Don't Steal Music" sticker, has employees who want to train AI on user data rather than just relying on licensed datasets.

Privacy initiatives have undoubtedly helped Apple maintain a huge market share worldwide. Yet Apple Privacy as a department probably doesn't even have a line on Apple's balance sheet. And as the practice of scraping every bit of value possible out of Apple's users and partners continues, it's almost as if growth at all costs is the goal, even if it hurts Apple's brand and bottom line in other areas.

Apple has prided itself on its privacy and ethics, with billboards and commercials, for a decade at least. In 2015 the company famously went up against law enforcement in an attempt to maintain privacy for its users:

New Court Filing Reveals Apple Faces 12 Other Requests to Break Into Locked iPhones
Apple attorney Marc Zwillinger listed the requests in a newly unsealed response to an order from a magistrate judge in a Brooklyn federal court.
The other requests are listed in a newly unsealed court brief filed by Apple attorney Marc Zwillinger in response to an order from a magistrate judge in a Brooklyn federal court. That case involves a government request to search an Apple iPhone 5s for evidence about a suspect’s possession or sale of methamphetamine.
Apple has refused to extract data from the phone, even though it could (because the phone was running on an older operating system), arguing in court that it was “being forced to become an agent of law enforcement.”

Since then Apple has become the privacy company, though of course this is still partly marketing. While Apple shouts from the rooftops about its privacy initiatives - some of which are great - it still isn't perfect. One of the key insights from Google's landmark monopoly ruling is that Apple talks a big game about privacy while taking Google money to sell out its users.

Back in 2023, when the antitrust case went to trial, it was revealed in court that in 2021 Google paid companies more than $26 billion for search placement deals. In 2022, Google paid Apple a whopping $20 billion to secure itself as the default search engine in the company's Safari browser, the DOJ said.

Which leads me back to this Bloomberg report. The question for Apple going forward will seemingly revolve around the apparent urgency of LLMs and "AI".

But is Apple Intelligence worth risking the multi-billion-dollar iPhone business? Is Apple willing to risk everything it apparently stands for just because of AI hype? Is it worth sinking to the lows of xAI and Meta when Apple famously doesn't compete in the search space?

Apparently yes, and Tim Cook - the business mind at Apple who has no product sense - has been moving fast to shake things up. Despite a more reserved approach from Giannandrea, the kind you would expect from Apple, the company has been moving products out from under him.

Giannandrea retains oversight of AI research, the development and improvement of large language models, the AI analysts, and some infrastructure teams. Insiders say that some Apple executives have discussed the idea of shrinking Giannandrea’s role still further or of him being put on a path to retirement (he’s 60), but that Federighi and others have concerns that if he leaves, the prized researchers and engineers he brought in would follow him out the door.

Is there any reason why Apple couldn't just be a great platform for other AI apps? It makes the hardware and software for the iPhone, and it could simply introduce better APIs for offloading some tasks to LLMs that a user chooses to install. Why not offer up the largely useless Apple Intelligence-infused Siri as an integration point for whatever AI app a user prefers?
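As a purely illustrative sketch of what that could look like, here's a hypothetical Swift protocol - none of these names exist in any Apple SDK, and this is my assumption of the shape such an API might take - where a third-party AI app registers itself as a "model provider" the user can pick system-wide, much like choosing a default browser:

import Foundation

// Hypothetical sketch only: nothing here is a real Apple API.
// The idea is that a third-party AI app could register a "model provider"
// the way apps can register as a default browser or mail client, and Siri
// or other system surfaces could route requests to it.

/// A single request handed off to the user's chosen AI app.
struct AssistantRequest {
    let prompt: String
    /// On-device context the user has explicitly allowed to be shared.
    let allowedContext: [String: String]
}

/// The response returned to the system.
struct AssistantResponse {
    let text: String
}

/// A third-party AI app would conform to this to become a selectable provider.
protocol AssistantModelProvider {
    /// Human-readable name shown in Settings when picking a default provider.
    var displayName: String { get }

    /// Handle a request routed from Siri or another system feature.
    func respond(to request: AssistantRequest) async throws -> AssistantResponse
}

// Example conformance a hypothetical ChatGPT-style app might ship.
struct ExampleProvider: AssistantModelProvider {
    let displayName = "Example AI"

    func respond(to request: AssistantRequest) async throws -> AssistantResponse {
        // A real app would call its own on-device model or backend here.
        AssistantResponse(text: "Echo: \(request.prompt)")
    }
}

The appeal of something like this is that Apple would keep control of what context leaves the device (the explicit allowedContext), while the actual model competition happens between the apps users install.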

Part of the Google trial has centered on the fact that Apple simply doesn't compete in Search, even though that is a lucrative industry. Apple also doesn't compete in cloud computing, the business that keeps Microsoft afloat. So why must it compete in AI?

Or is it more important to ignore these privacy intrusions and move further away from the company that once stood, as Steve Jobs would always say, at the intersection of liberal arts and technology?
