
Microsoft, Google make moves amid regulators' AI scrutiny

Microsoft is stepping down from the OpenAI board. Google has reportedly disbanded its machine learning privacy team. Both vendors face investigations.



Microsoft and Google have made some major decisions internally amid pressure from regulators.

Microsoft earlier this week dropped its seat as an observer on the board of ChatGPT creator OpenAI, in which it has invested $13 billion.

Meanwhile, Apple has also decided not to join the AI vendor's board after previously considering it, and Google reportedly disbanded its machine learning privacy team.

The moves come as U.K. and European antitrust regulators look into the Microsoft-OpenAI alliance. In the U.S., the Federal Trade Commission is investigating big tech companies' investments in OpenAI and another independent GenAI vendor, Anthropic.

Microsoft steps away from OpenAI board

Microsoft voluntarily gave up its observer seat on OpenAI's board after eight months. The move also came after the cloud provider reached a 20 million euro settlement to resolve an antitrust complaint from a European trade group over Microsoft's cloud computing licensing practices.

The U.S. and European investigations into the activities of Microsoft, Google, Amazon and OpenAI are proceeding as GenAI's explosive growth outpaces that of other technologies in recent decades.

Microsoft's departure from the OpenAI board appears to be a signal to regulators that it is not exercising undue control over OpenAI and that its relationship with the AI vendor does not need scrutiny, said Michael Bennett, AI law and policy adviser at Northeastern University.


"It certainly looks to be a move that will make claims of undue influence and anti-competitive influence weaker," Bennett said. "This move seems like it will make it likelier that the regulators will probably not act as aggressively as they might ... if Microsoft was still on the board."

However, if Microsoft seeks to evade government scrutiny, openly giving up a board seat might have little consequence, said Futurum Group analyst David Nicholson.

"No one is going to be dissuaded by this," Nicholson said. "There's nothing that can happen to avoid the kind of scrutiny that's going to occur."

The sheer size of Microsoft's investment -- in money, time, and cloud and compute resources -- and its tight relationship with OpenAI mean the two companies are bound to be under the government's eye, he said.

While the move appears most likely an attempt to avoid scrutiny, Microsoft also has its sights on the market, Nicholson added.

"It's a bit of a gesture to the marketplace that Microsoft is going to continue a sort of coopetition strategy where at certain areas on the margins, they will partner with other purveyors of LLM technology," he said. "They're not only going to exist in sort of a walled garden environment."

Microsoft has also invested in OpenAI competitor Mistral.

Not operating in a so-called walled garden, or an environment in which other vendors and potential partners are locked out, is beneficial as Microsoft continues to battle Google, AWS, Meta and others for the coveted position of leader of the GenAI market.

Google's perilous move

Google's risky move to disband its machine learning privacy team, even as the company faces an FTC antitrust investigation, could be due to competitive pressure. But it could also be the right decision as more government entities pass laws regulating AI, Bennett said.

"It's no deep secret that many perceive these teams to be both essential to responsible development of the technologies, and at the same time internally, a de facto potential hurdle to the introduction of some of some of the new technologies," he said.

In an arena as intensely competitive as the GenAI market, disbanding the team is a move that could help Google release innovative technologies by removing internal obstacles.

On the other hand, Google may not need its privacy team any more as government bodies in the U.S. and Europe increasingly regulate AI, Bennett added.

So, cloud providers like Google may feel that privacy compliance is more of a legal issue than a responsible AI one. The tech giant may decide to outsource such legal and ethical matters to outside firms instead of relying on internal teams, Bennett said.

However, both Google's and Microsoft's moves show how important it is for AI vendors to be careful amid the flurry of investigations, Nicholson said.

"Like moths to a flame, government regulators are going to be drawn to everything associated with AI," he said. "Companies are right to be concerned, and they're right to be fastidious about crossing their T's and dotting their I's."

Neither Microsoft nor Google immediately responded to requests for comment.

Esther Ajao is a TechTarget Editorial news writer and podcast host covering artificial intelligence software and systems.
