
IBM boosts AI training with new 8-bit training model


IBM presents a new 8-bit precision approach intended to extend AI computing power to the edge. The company aims to establish a new industry standard that speeds up AI training by a factor of two to four. In addition, Big Blue is working on an analog chip that doubles the accuracy while consuming 33 times less energy than a comparable digital architecture.

In recent years, AI hardware has taken significant steps forward. AI computing power has increased 2.5 times every year since dedicated AI hardware first appeared in 2009. GPUs are essential components for deep learning today, but according to IBM, we are gradually reaching the limits of what GPUs and software can do.

IBM claims that next-generation AI applications require faster response times and must handle larger AI workloads and multimodal data across multiple streams. IBM Research is working on hardware that can scale with AI.

8-bit

Big Blue is working on a reduced-precision approach. Initially, models were trained with 32-bit precision, but a few years ago 16-bit became the norm, because the reduction showed no visible loss in accuracy. Today, 16-bit training and 8-bit inference are the industry standard.
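To make the idea of reduced precision concrete, here is a minimal sketch (not IBM's actual method) of symmetric 8-bit quantization: floating-point values are mapped to integers in [-127, 127] with a shared scale factor, trading a small, bounded rounding error for a much smaller memory and compute footprint.

```python
def quantize_int8(values):
    """Map floats to int8 range [-127, 127] using one shared scale."""
    scale = max(abs(v) for v in values) / 127.0 or 1.0  # avoid divide-by-zero
    q = [max(-127, min(127, round(v / scale))) for v in values]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float values from quantized integers."""
    return [x * scale for x in q]

# Example: quantize a few weights and measure the round-trip error.
weights = [0.82, -1.5, 0.003, 0.41]
q, s = quantize_int8(weights)
restored = dequantize(q, s)
# Each restored value differs from the original by at most scale / 2.
```

The challenge IBM refers to is that while this loss of precision is harmless for inference, gradient updates during training are often smaller than the rounding error, which is why pushing training itself down to 8 bits is hard.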

In a paper, IBM describes the challenges it faced in pushing training precision below 16 bits. According to IBM, the new 8-bit approach preserves full model accuracy across the main AI dataset categories: image, voice and text.

The new approach trains two to four times faster than the 16-bit systems in use today. IBM expects 8-bit training to become a new industry standard in the coming years.

When combined with IBM's own dataflow architecture, a single chip can be used for both training and inference across a wide range of workloads and networks, large and small. This opens the door to energy-efficient AI computation at the edge.

This news article was automatically translated from Dutch to give Techzine.eu a head start. All news articles after September 1, 2019 are written in native English and NOT translated. All our background stories are written in native English as well. For more information read our launch article.