Microsoft has updated its terms of use for consumers. A new AI Services paragraph states, among other things, that the tech giant may store conversation data for later use.

In the new AI Services paragraph of its consumer fine print, effective Sept. 30, Microsoft addresses how it handles difficult issues surrounding generative AI. For example, it explains what the company does with the information end users provide to the Bing chatbot.

Five new policies

To this end, Microsoft introduces five new policies. The first covers reverse engineering: the tech giant’s AI services may not be used to discover the underlying components of its models, algorithms and systems.

The second, on extracting data, states that unless explicitly permitted, users may not employ web scraping, web harvesting or web data extraction methods to extract data from Microsoft’s AI services.

Nor are users permitted to use the AI services, or data extracted from them, to create, train or improve other AI models. Users of Microsoft’s AI offerings are additionally responsible for any claims third parties may raise regarding their use of these applications.

Storing interactions

A critical part of the new terms, however, concerns the tech giant’s use of data and content. Microsoft states that it processes and stores all inputs that users provide to its services, such as prompts for Bing Chat. Responses generated by the services are stored as well.

The reason is that Microsoft wants to use this information to monitor the services and prevent malicious use and harmful outcomes. The tech giant does not specify a retention period for this particular data.

Also read: Bing Chat no longer avoids competitors’ mobile browsers