The iPhone X’s new neural engine exemplifies Apple’s approach to AI
Artificial intelligence in your hand, not in the cloud
Apple’s new iPhone X is billed as “the future of the smartphone,” with new facial recognition and augmented reality features presented as the credentials to back up this claim. But these features wouldn’t be half as slick without a little bit of hidden futurism tucked away in the phone’s new A11 Bionic chip: Apple’s new “neural engine.”
The neural engine is actually a pair of processing cores dedicated to handling “specific machine learning algorithms.” These algorithms are what power various advanced features on the iPhone, including Face ID, Animoji, and augmented reality apps. According to Apple’s press materials, the neural engine performs “up to 600 billion operations per second” to help speed AI tasks (although that figure is hard to put in context; operations per second is never the sole indicator of performance).
What’s clear about the neural engine is that it’s typical of Apple’s approach to artificial intelligence. AI has become increasingly central to smartphones, powering everything from speech recognition to tiny software tweaks. But to date, AI features on mobile devices have mostly been powered by the cloud. This spares your phone’s battery by not taxing its processor, but it’s less convenient (you need an internet connection for it to work) and less secure (your personal data is sent off to far-away servers).
Apple’s approach is typical of the company’s ethos: it’s focused on doing AI on your device instead. We saw this back in June 2016, when the company introduced “differential privacy” (using statistical methods to mask users’ identity when collecting their data), and at WWDC this year when it unveiled its new Core ML API. The “neural engine” is just a continuation of the same theme. By having hardware on the phone itself that’s dedicated to AI processing, Apple sends less data off-device and better protects users’ privacy.
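To make the privacy idea concrete, here is a minimal Kotlin sketch of randomized response, the textbook local differential-privacy mechanism. This is not Apple’s actual implementation (Apple’s system relies on more elaborate, sketch-based mechanisms), but it illustrates the principle: each device perturbs its own answer before anything leaves the phone, and a server can still estimate aggregate statistics without learning any individual’s true answer.

```kotlin
import kotlin.random.Random

// Randomized response: each user perturbs a private yes/no bit locally
// before reporting it, so the raw answer never leaves the device.
fun perturb(truth: Boolean): Boolean =
    if (Random.nextBoolean()) truth          // with probability 1/2, answer honestly
    else Random.nextBoolean()                // otherwise, answer with a random coin flip

// The server only sees noisy reports, yet can still recover the aggregate:
// P(report = true) = 0.5 * trueRate + 0.25, so trueRate ≈ 2 * (observed - 0.25).
fun estimateTrueRate(reports: List<Boolean>): Double {
    val observed = reports.count { it }.toDouble() / reports.size
    return (2 * (observed - 0.25)).coerceIn(0.0, 1.0)
}

fun main() {
    val trueRate = 0.3                       // hypothetical real population rate
    val reports = List(100_000) { perturb(Random.nextDouble() < trueRate) }
    println("estimated rate ≈ ${estimateTrueRate(reports)}")  // prints roughly 0.3
}
```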
The iPhone maker isn’t the only company pursuing this approach. Chinese tech giant Huawei put a similar “Neural Processing Unit” in its Kirin 970 system-on-chip, saying it can handle tasks like image recognition 20 times faster than a regular CPU. Google has developed its own method of on-device AI, called “federated learning,” and has hinted that it too is working on mobile chips for machine learning. ARM has reconfigured its chip designs to favor artificial intelligence, and chipmaker Qualcomm says it’s only a matter of time before it, too, launches its own mobile AI chips.
So although the iPhone X’s neural engine is typical of Apple’s approach to AI, it isn’t just the company’s particular quirk. It’s the future of the whole mobile industry.
(Photo: Reuters)
The Tensor G3 chip in Google’s flagship Pixel 8 series may trail Apple’s A17 Pro and Qualcomm’s Snapdragon 8 Gen 3 in performance, but it unexpectedly hides one leading spec: it is the first smartphone chip to support AV1 video encoding. Developers point out, however, that this advantage is almost impossible to exploit in practice.
Well-known Android developer Mishaal Rahman discovered, by digging through code, that the Tensor G3 supports AV1 video encoding at up to 4K 60fps, something its competitors lack. Awkwardly, Google has not been able to capitalize on this advantage, because no app offers AV1 encoding support, not even the camera app exclusive to Pixel phones.
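The capability itself is easy to check at runtime. The following Kotlin sketch (assuming API level 29 or higher; the function name findHardwareAv1Encoder is ours) walks Android’s MediaCodecList looking for a hardware AV1 encoder and asks whether it claims 4K 60fps support, which is the spec reported for Tensor G3. This is not necessarily how Rahman made the discovery, just the standard way an app would query it.

```kotlin
import android.media.MediaCodecList
import android.media.MediaFormat

// Walk the device's codec list looking for a hardware AV1 *encoder*
// (decoders are common; an encoder is what Tensor G3 is reported to add),
// then ask whether it claims 4K 60fps support.
fun findHardwareAv1Encoder(): String? {
    val infos = MediaCodecList(MediaCodecList.REGULAR_CODECS).codecInfos
    for (info in infos) {
        if (!info.isEncoder || !info.isHardwareAccelerated) continue
        for (type in info.supportedTypes) {
            if (!type.equals(MediaFormat.MIMETYPE_VIDEO_AV1, ignoreCase = true)) continue
            val video = info.getCapabilitiesForType(type).videoCapabilities
            val fourK60 = video.areSizeAndRateSupported(3840, 2160, 60.0)
            println("${info.name}: hardware AV1 encoder, 4K60 supported = $fourK60")
            return info.name
        }
    }
    return null  // no hardware AV1 encoder exposed on this device
}
```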
Many phones today support AV1 decoding, so playing back AV1 video is not a problem, but none of them support AV1 encoding, meaning that apart from the Pixel 8 they cannot record video in the AV1 format. Compared with the most widely used H.264 format, AV1 achieves higher compression, shrinking 4K video files so they take up less storage without sacrificing image quality, which makes it especially attractive for users who like to document their lives on video.
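To actually record AV1, an app would have to drive an AV1 encoder itself through MediaCodec, since no camera app currently does. Below is a hedged Kotlin sketch of the configuration step only; the 4K 60fps and bitrate values are illustrative, and the calls will simply throw on a device whose SoC exposes no AV1 encoder.

```kotlin
import android.media.MediaCodec
import android.media.MediaCodecInfo
import android.media.MediaFormat

// Configure a 4K60 AV1 encoder via MediaCodec. On devices without an AV1
// encoder, createEncoderByType() or configure() will throw.
fun createAv1Encoder(): MediaCodec {
    val format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AV1, 3840, 2160).apply {
        setInteger(MediaFormat.KEY_FRAME_RATE, 60)
        setInteger(MediaFormat.KEY_BIT_RATE, 20_000_000)           // illustrative bitrate
        setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1)
        setInteger(MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface)  // frames arrive via a Surface
    }
    val encoder = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AV1)
    encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE)
    // A camera pipeline would then draw into encoder.createInputSurface()
    // and mux the encoded output into a container such as MP4.
    return encoder
}
```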
Barring surprises, this year’s Tensor G4 chip is expected to retain AV1 encoding support; what matters is how Google expands software support so users can actually feel the benefit of the spec. It is also worth watching how the Tensor G4 pairs with the Gemini AI model to deliver more convenient software features.