
AI hardware accelerators are some of the most potent tools available for scientific discovery. However, they are not energy efficient. Now IBM Research claims it has identified a new way of tackling this problem.

During the IEEE CAS/EDS AI Compute Symposium, IBM Research unveiled new partnerships and technologies intended to give businesses the power to run large-scale AI workloads in hybrid clouds.

The claim is that energy-efficient AI hardware accelerators can increase computing power by orders of magnitude without taxing the available energy resources.

Collaborations and partnerships

IBM announced several initiatives, including a collaboration with Red Hat that makes IBM Digital AI Cores compatible with Red Hat OpenShift and its related ecosystem. IBM Research added that its toolkit for Analog AI cores would be made open source.
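The open-sourced toolkit for Analog AI cores is presumably IBM's Analog Hardware Acceleration Kit (aihwkit), which exposes analog layers through a PyTorch-style interface. The snippet below is a minimal sketch, modeled on the toolkit's published getting-started example, of training a single analog layer; the exact package name and API are assumptions and may differ across versions.

```python
# Minimal sketch, assuming the toolkit is aihwkit (pip install aihwkit)
# and a PyTorch-style API as in its getting-started example.
from torch import Tensor
from torch.nn.functional import mse_loss

from aihwkit.nn import AnalogLinear   # fully connected layer backed by analog tiles
from aihwkit.optim import AnalogSGD   # SGD variant aware of analog parameters

# Toy inputs and targets.
x = Tensor([[0.1, 0.2, 0.4, 0.3], [0.2, 0.1, 0.1, 0.3]])
y = Tensor([[1.0, 0.5], [0.7, 0.3]])

# A single analog fully connected layer: 4 inputs, 2 outputs.
model = AnalogLinear(4, 2)

opt = AnalogSGD(model.parameters(), lr=0.1)
opt.regroup_param_groups(model)        # associate the analog layer with the optimizer

for epoch in range(100):
    opt.zero_grad()
    loss = mse_loss(model(x), y)
    loss.backward()
    opt.step()
```

The idea the toolkit illustrates is that an analog layer can stand in for its digital counterpart in an otherwise ordinary training loop, with the analog-aware optimizer handling the device-level weight updates.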

The AI Hardware Center will also team up with Synopsys to work on new and enhanced AI chip architectures.

IBM also claimed that it had tackled the memory bandwidth bottleneck by committing resources to the infrastructure needed to accelerate the development of new chip packaging.

Changing problem-solving forever

According to IBM Research, AI will help scientists reach new heights, but the large data-processing workloads behind those discoveries will demand compute power, memory, and bandwidth at unprecedented levels.

The firm said that, through its work with Red Hat, Synopsys, and its other partners, advances in AI hardware and their integration with hybrid cloud management software will power the models and methods that may change how we solve problems forever.

That is a big claim, and it will not be easy to deliver on. However, IBM is a behemoth in the industry, and we can only hope they are right.