
Apple’s other king has been quietly deployed on your iPhone for 4 years - Part 3

Si Gyeongmin

Nov 22, 2021

If you have an iPhone or iPad with a Bionic chip, you can turn on VoiceOver in Settings and then open the Magnifier app; Magnifier will speak aloud the content, people, objects, and scenes captured by the lens. It can even recognize the weather. This is an accessibility feature for visually impaired people, but iPhones older than the A11 chip cannot do it.
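To get a feel for the kind of work involved, here is a minimal sketch using Apple's Vision framework. This is not Apple's actual Magnifier code, just the same family of on-device image classification that the Neural Engine accelerates; the confidence threshold is an arbitrary choice.

```swift
import Vision

// Sketch: label what the camera captured using Vision's built-in
// image classifier, the kind of on-device inference behind features
// like spoken scene descriptions.
func describeScene(in image: CGImage) {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    do {
        try handler.perform([request])
        // Keep only reasonably confident labels (0.3 is an arbitrary cutoff).
        let labels = (request.results ?? [])
            .filter { $0.confidence > 0.3 }
            .map { $0.identifier }
        print("Scene may contain: \(labels.joined(separator: ", "))")
    } catch {
        print("Classification failed: \(error)")
    }
}
```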


In fact, the CPU and GPU are also capable of running this function, but far less efficiently; the phone heats up almost as soon as you use it. Professional work calls for a professional "worker." From the A11 to the A15, the Neural Engine has grown from 2 cores to 16 cores, and its computing power has climbed from 600 billion operations per second to 15.8 trillion operations per second (15.8 trillion ÷ 600 billion ≈ 26, a 26-fold increase).
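Core ML makes this division of labor explicit: an app can tell the framework which compute units are allowed to run a model. A minimal sketch, assuming a hypothetical compiled model named SceneClassifier.mlmodelc bundled with the app:

```swift
import CoreML

// Sketch: steer inference toward the Neural Engine with Core ML.
// "SceneClassifier.mlmodelc" is a hypothetical compiled model.
let config = MLModelConfiguration()
config.computeUnits = .all        // allow CPU, GPU, and Neural Engine
// config.computeUnits = .cpuOnly // force the CPU: works, but slower and hotter

let modelURL = Bundle.main.url(forResource: "SceneClassifier",
                               withExtension: "mlmodelc")!
let model = try MLModel(contentsOf: modelURL, configuration: config)
```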


Some people say Apple is willing to do all this for a very small number of visually impaired users. Oh, how touching.


But in fact, this is just a good deed Apple did along the way while laying out its AR strategy.


At present, although the Neural Engine powers other features, they are of little practical use to ordinary people; if every one of them were cut, we would hardly notice. The only reasonable explanation for Apple's push to keep upgrading its performance is that Apple is planning for the future.


AR needs to blend virtual information into the real world, and to do that the device must first understand what appears in the camera's view.

