The chip company is working on solutions to bring machine learning to consumer devices and popularize the technology
Artificial intelligence can seem reserved for high-performance computing, and the more sophisticated its disciplines, such as machine learning, the more pronounced that impression becomes. Little by little, however, the technology is reaching more and more users through new devices and new networks.
To perform artificial intelligence tasks today, all you need is a good Internet connection, some computing capacity and access to an AI system in the cloud. Some device manufacturers have gone further, equipping their products with chips optimized for AI workloads. This is the case of the NPUs (neural processing units) already included in high-end smartphones from Apple, Huawei and Samsung.
But the truth is that artificial intelligence still has a way to go before it reaches everyone. For chip designer ARM, this is something that can be achieved, and the company is working on it, according to its vice president of machine learning, Steve Roddy, in an interview.
The company’s vision for the future is to create NPUs not just for high-end mobile devices, but for everyone. In this way, the processing power these chips offer, especially in fields such as object recognition through machine vision, will be available in mid-range and even entry-level handsets.
Obviously, this takes time. ARM is working on optimized processors that can bring machine learning to consumer devices. These chips would complement CPUs (central processing units) and GPUs (graphics processing units).
AI for everyone
As a branch of artificial intelligence, machine learning offers great capabilities in the consumer market as well. Giving mass-market devices access to this technology would also improve features that currently rely on the cloud.
Speech recognition is one example of how machine learning can be useful in a consumer device. Natural language processing is a resource-intensive task, and the local processor of a smartphone or computer is normally not prepared to handle the unpredictability of such requests.
A processor optimized for machine learning would, for example, improve how voice translators work without an Internet connection; quite useful when you are in a foreign country where neither your language nor your data plan lets you communicate.
Images: Free-Photos, SplitShire