Tag: AI Inference

Here you will find all the articles with the tag: AI Inference.

IBM debuts Telum chips, built for AI inferencing workloads

IBM unveiled its first chip with AI inferencing acceleration, named Telum. The chip will allow users to conduct tasks like fraud detection while a transaction is in progress. The chip contains eight processor cores with a “deep super-scalar out-of-order instruction pipeline, running with more ...

3 years ago
Ampere to buy OnSpecta for AI inference workloads acceleration

Chipmaker Ampere announced on Wednesday that it plans to acquire OnSpecta, a startup that makes software used to accelerate AI inference workloads at the edge and in the cloud. The terms of the deal have not been made public. OnSpecta was founded in 2017 and is headquartered in Redwood City, Cal...

3 years ago
Nvidia’s TensorRT 8 is here to boost AI inference

Nvidia is accelerating artificial intelligence with the launch of the next generation of its TensorRT software. On Tuesday, Nvidia released the eighth iteration of its popular AI software, used for high-performance deep learning inference. TensorRT 8 combines a deep learning optimizer fitted with a r...

3 years ago