Samsung Follows Apple's AI Strategy With Perplexity-Powered Bixby


While Apple is moving toward a multi-model AI strategy for Siri, Samsung appears to be following a similar path by integrating Perplexity's AI into its Bixby assistant for the upcoming Galaxy S26.

According to the well-regarded tipster @chunvn8888 on X (Twitter), the arrangement would see Samsung's Bixby continuing to handle basic on-device requests such as toggling settings and activating system features, while Perplexity's models would take on queries that involve complex reasoning and generative tasks.

Currently, Apple Intelligence blends on-device models for relatively simple tasks with OpenAI's ChatGPT for more complex reasoning and generation. A Samsung-Perplexity pairing would replicate this split. However, Samsung already maintains a deep AI partnership with Google, and newer Galaxy devices ship with Gemini-powered features embedded directly into its custom One UI interface. Choosing Perplexity suggests Samsung is widening its model pool rather than consolidating around Google's ecosystem.

Meanwhile, Apple is preparing a broader expansion of its own multi-model strategy. A recent Bloomberg report claims the more capable version of Siri under development will rely on Google's Gemini for advanced tasks such as summarisation and multi-step planning, with Apple's own models continuing to power select features.

The more advanced Siri was originally meant to arrive with iOS 18, but Apple is said to have recognized the need to rework the assistant's core architecture, which has pushed the release to a software update expected early next year.

Confirmed: Bixby will have Perplexity integrated into it. Basic tasks will be handled by Bixby while complicated, more thinking tasks will be backed by Perplexity. Just like how ChatGPT x Apple Intelligence works. Debut during the S26 series Unpacked very likely. — Semi-retired-ing (@chunvn8888) November 24, 2025

Despite its reliance on external partners, Apple is said to be continuing to invest heavily in its internal LLM roadmap, and is reportedly developing a cloud-based complex reasoning model that could be ready as soon as 2026.

As a result of the update, ‌Siri‌ will be able to answer more complex queries and complete more complicated tasks in and between apps. It will be closer in function to Claude and ChatGPT, though Apple is not planning a dedicated chatbot app.
