Microsoft announced on Wednesday that its researchers have developed a neural network to improve Bing. According to the statement, the 135-billion-parameter network will enhance search results for users.
This neural network is the largest ‘universal’ AI model the company has in production. It is also one of the most complex AI models ever detailed publicly.
To date, the largest neural network detailed publicly is OpenAI’s GPT-3, a natural language processing model with 175 billion parameters. Parameters are the numerical values a model learns during training; together they determine how the AI tackles the challenges it’s given.
At their core, parameters are what let a neural network work out the best way to compute something. Generally, the more parameters an AI has, the more capacity it has to carry out the tasks it’s developed to perform.
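To make the idea concrete, here is a minimal sketch — not Microsoft’s model, with invented numbers — of what “parameters” look like in code: the weights and bias of a single artificial neuron.

```python
def neuron(inputs, weights, bias):
    """A single artificial neuron: a weighted sum of its inputs plus a bias."""
    return sum(x * w for x, w in zip(inputs, weights)) + bias

# The weights and the bias are the parameters -- the values that training adjusts.
weights = [0.2, -0.5, 0.1]  # invented values for illustration
bias = 0.3
parameter_count = len(weights) + 1  # 3 weights + 1 bias = 4 parameters

print(parameter_count)
print(neuron([1.0, 0.0, 2.0], weights, bias))
```

A 135-billion-parameter model like MEB works on the same principle, just with vastly more weights spread across many layers.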
The researchers are calling their neural network MEB, short for “Make Every feature Binary.” The network analyzes Bing queries and helps identify the most relevant pages from around the web. MEB doesn’t do all this alone but shares the work with other machine learning (ML) algorithms included in the search engine.
MEB is running in production for all Bing searches, in every region and language, making it the largest universal model Microsoft serves. According to the company, it demonstrates a remarkable ability to memorize facts.
The neural network takes a distinctive approach to determining whether a page is relevant to a user’s query.
Neural networks make decisions by weighing several variables, known as features, about the data they’re processing. For example, a revenue-prediction AI might use daily average store sales as one feature when creating quarterly revenue forecasts.
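As a rough sketch of how a model weighs features — the feature names, weights, and figures here are all invented for illustration — a simple linear forecaster could look like this:

```python
def predict(features, weights, bias):
    """Weigh each feature by its learned importance and sum the results."""
    return sum(f * w for f, w in zip(features, weights)) + bias

# Hypothetical features for one store:
# [daily average sales ($), promotions run, foot-traffic index]
features = [12_500.0, 4.0, 0.8]

# Invented weights: daily average sales dominates the quarterly forecast.
weights = [88.0, 1_500.0, 20_000.0]
bias = 5_000.0

quarterly_forecast = predict(features, weights, bias)
print(quarterly_forecast)
```

A real model would learn these weights from historical data rather than having them set by hand; MEB applies the same weighting idea to features drawn from search queries and web pages.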
Other aspects of MEB are also distinctive and could help Bing gain an edge on Google’s dominant search engine.