
Nvidia claims new record in real-time conversational AI

Nvidia says it has achieved new records and breakthroughs in artificial intelligence (AI) for language understanding. These should enable real-time conversational AI in a range of software applications.

According to Nvidia, conversational AI is especially important for companies that want to build chatbots and virtual assistants for conversations with real people. Ideally, such an AI demonstrates a human level of language understanding.

The industry therefore uses increasingly large language models. The problem, however, is that these models are also harder to train and deploy, writes Silicon Angle. Nvidia claims a number of breakthroughs on that front.

For example, the company has reduced the training time for one of the most advanced AI language models, Bidirectional Encoder Representations from Transformers (BERT), from several days to just 53 minutes.

Nvidia’s systems were also able to cut AI inference time to about two milliseconds, well within the latency budget for the rapid exchanges people are used to in conversation.

In addition, Nvidia claims a new world record, set by training BERT-Base. Training that normally takes weeks was now completed within an hour, made possible by optimised software and the DGX SuperPOD system.

Nvidia’s TensorRT platform set a world record for BERT inference with a latency of only two milliseconds, well under the ten-millisecond threshold generally cited for real-time conversational applications.
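To illustrate what such a latency budget means in practice, here is a minimal sketch of timing a single inference call against the ten-millisecond threshold mentioned above. The model function is a hypothetical stand-in, not Nvidia's TensorRT pipeline; only the budget figure comes from the article.

```python
import time

# Latency budget from the article: real-time conversational AI should
# respond within roughly 10 milliseconds per inference.
LATENCY_BUDGET_MS = 10.0

def dummy_language_model(query: str) -> str:
    """Hypothetical stand-in; a real BERT-style model would run here."""
    return f"answer to: {query}"

def measure_latency_ms(model, query: str) -> float:
    """Time a single inference call and return the latency in milliseconds."""
    start = time.perf_counter()
    model(query)
    return (time.perf_counter() - start) * 1000.0

latency = measure_latency_ms(dummy_language_model, "What is conversational AI?")
print(f"latency: {latency:.3f} ms, within budget: {latency < LATENCY_BUDGET_MS}")
```

In a real deployment the timed call would wrap the full tokenize-infer-decode path, since all of it must fit inside the budget.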

Next wave of conversational AI

Nvidia wants to use the breakthroughs to drive the next wave of conversational AI. Bryan Catanzaro, vice president of applied deep learning research at Nvidia, says the company has already made progress in that area.

For example, Nvidia works closely with Microsoft to enable more accurate search results in Bing. Together, Bing and Nvidia optimised BERT inference using Nvidia’s GPUs and parts of the Azure AI infrastructure.

This led to the largest improvement in search ranking quality that Bing has implemented in the past year, says Catanzaro.

This news article was automatically translated from Dutch to give Techzine.eu a head start. All news articles after September 1, 2019 are written in native English and NOT translated. All our background stories are written in native English as well. For more information read our launch article.