In three-quarters of organizations, ChatGPT and other generative AI tools are either already banned for work use or subject to a ban that is in the works. Most indicate that these restrictions will remain in place for a long time, or even permanently.
The figures come from a global survey conducted by BlackBerry among 2,000 IT executives. Because it ran in June and July 2023, it offers a current picture of how organizations view these tools.
Organizations see too many risks
Two concerns drive the bans. Eight in ten organizations worry about unsecured apps and the cybersecurity risks they pose. Sixty-seven percent cite privacy concerns and do not want corporate data to leak into AI training data.
There does seem to be a willingness to use generative AI tools in one area of the organization, however: cybersecurity. 81 percent see the value of this, though it is worth keeping in mind that the questions were posed to IT professionals. Just over half of those surveyed are also convinced that the tools increase efficiency, stimulate innovation and fuel creativity.
Ready for the business market
More and more providers of generative AI tools are nevertheless gearing up to conquer the business market. OpenAI, for example, is working on ChatGPT Business, which may address some of these concerns: by default, it will not use customer data to train OpenAI's models, and it will offer more options for managing end users.
Generative AI tools are also increasingly being integrated into platforms that companies already use. Salesforce, for example, offers AI Cloud, and Zoom can generate a meeting summary with a built-in generative AI tool. This evolution will make it increasingly difficult for companies to keep generative AI out of the workplace in the future.