In short, Sundar Pichai is apparently rattled by OpenAI's ChatGPT and is preparing Google to deal with the perceived threat.
According to an internal memo seen by the New York Times, Pichai has "upended the work of numerous groups within the company to respond to the threat that ChatGPT poses," and is recruiting staff from other divisions to address the danger OpenAI presents to Google's business plans. Inside the Chocolate Factory, it's said to be considered a "code red."
The question is whether Google's core product, search, could be displaced by AI systems that deliver more accurate results, and that's a big if, for now at least.
“No company is invincible, all are vulnerable,” said University of Washington professor Margaret O’Mara. “For companies that have had extraordinary success doing one thing that defines the market, it’s hard to have a second act with something completely different.”
The report suggests Google will make a series of AI announcements in May to address the growing threat to the search giant's business model. We'll see whether those amount to working products or just Google playing catch-up.
Google has dominated the search market for 20 years, and anything that threatens the highly lucrative business — which accounts for around 90% of Alphabet’s profits — is something Sundar might well have to fear.
ArtStation cracks down on anti-AI art protests
The ongoing fight between human artists and ArtStation, the Epic Games-owned site that displays the images and, it is claimed, exploits the data for artificial intelligence purposes, has escalated a notch.
Last week, many users of the site protested the uncredited use of their images to train AI art-generation models. The fear is that ArtStation is allowing AI trainers to harvest legitimate human labor and not only generate art, but potentially drive artists out of business. In response, artists began posting "AI is stealing" banners on their profile pages.
Now ArtStation appears to have lowered the boom and banned such subversive creations. "To keep the site usable, we moderate posts that violate our Terms of Service," the site said on Twitter.
“We understand the concerns about AI and its impact on the industry. We will share more about improvements to give users more control over what they see and how they use ArtStation in the near future.”
In other words: back to work, you creative types. This one is likely to play out for a while.
A US senator closes the door to AI on his way out
Outgoing Sen. Rob Portman (R-OH) introduced the Face Accountability, Clarity, and Effectiveness of Technology (FACE IT) Act to Congress, calling for much tighter controls on the US federal government's use of AI-based facial recognition technology.
The bill would require the National Institute of Standards and Technology (NIST) to establish minimum acceptable accuracy standards for facial recognition technology, and would ensure citizens cannot be identified solely by such systems. It would also require that a human authority must authorize the use of these systems.
“Facial recognition technology can be used to help protect our communities, but I’m concerned about the potential for abuse,” said Portman, who leaves Congress in January.
“I am proud to introduce the FACE IT Act because, given the civil liberties implications of the federal government’s use of facial recognition technology, we need to pass legislation to establish rules for the use of this technology. We need to make sure that federal law enforcement and other agencies have the tools to do their jobs well, but it’s critical that we set rules for those tools.”
He also introduced the Stopping Unlawful Negative Machine Impacts through National Evaluation Act, which “would clarify that existing civil rights laws apply to decisions made by AI systems as if those decisions were made by humans.”
The proposed laws, which seem unlikely to make it onto the statute books given the turbulent state of Congress, appear to be primarily a matter of publicity, and perhaps groundwork for a future career as a lobbyist, rather than an attempt to put in place solid policy.