Samsung unveils on-device AI tech that utilizes fewer transistors and less energy.

Samsung has announced a new NPU technology that makes on-device AI faster, more energy-efficient, and smaller in chip area. Thanks to Quantization Interval Learning (QIL), it is possible to create 4-bit neural networks that maintain the accuracy of a 32-bit network.
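The basic idea behind low-bit quantization can be shown with a minimal sketch. This is not Samsung's actual QIL algorithm (in QIL the quantization intervals are learned during training); the `quantize_4bit` function and its fixed `interval` argument are purely illustrative:

```python
import numpy as np

def quantize_4bit(weights, interval):
    """Uniformly quantize weights to 16 levels (4 bits) within ±interval.

    In QIL the interval would be a learned parameter; here it is fixed
    for illustration only.
    """
    clipped = np.clip(weights, -interval, interval)
    # Map each weight to an integer code 0..15, then back to a real value
    codes = np.round((clipped + interval) / (2 * interval) * 15)
    return codes / 15 * (2 * interval) - interval

w = np.array([0.03, -0.41, 0.88, -1.7])
wq = quantize_4bit(w, interval=1.0)  # each value snaps to one of 16 levels
```

Each 4-bit weight needs only 16 distinct values instead of the billions representable in 32-bit floating point, which is why the arithmetic circuits that process them can be so much smaller.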

Using fewer bits considerably decreases both the number of computations and the hardware that carries them out – Samsung says it can reach the same results 8x faster while using 40x to 120x fewer transistors. This will result in faster, lower-power NPUs that can still perform familiar functions such as object recognition and biometric authentication (fingerprint, iris and face recognition), and can also handle more complex tasks on the device itself.

Of course, on-device AI is a privacy boon, but it has other advantages over cloud-based AI as well. Connecting to a cloud server, for instance, means the phone's modem draws power, and the Internet link adds latency. Thanks to the high-speed, low-latency operation of QIL-based NPUs, Samsung anticipates uses in areas such as self-driving cars (where fast, accurate object recognition is essential) and virtual reality (perhaps something along the lines of Nvidia's DLSS, which uses deep learning to enhance image quality at a low cost to the GPU).

Samsung will bring this technology, among other applications, to mobile chipsets “in the near future.”
