Google has suspended engineer Blake Lemoine over his comments about a product developed by the search engine giant. Lemoine claimed that LaMDA, an AI chatbot developed by the company, has become sentient. He also claimed that the chatbot thinks and responds like a human being, but the company refuted his claims. Google immediately placed him on paid leave for violating the company's confidentiality policy.
According to Google spokesperson Brian Gabriel, the company has reviewed Lemoine's claims and found that the evidence does not support them. He added that Lemoine had been put on administrative leave. Gabriel said that hundreds of researchers and engineers have interacted with the Language Model for Dialogue Applications, aka LaMDA, and that he is not aware of anyone else making such wide-ranging assertions or anthropomorphizing the system.
Gabriel also said that some companies in the AI segment are considering the long-term possibility of sentient AI, but that it doesn't make sense to do so by anthropomorphizing conversational tools that aren't sentient. He added that systems like LaMDA work by imitating the types of exchanges found in millions of sentences of human conversation, which enables them to speak to even fantastical topics.
According to news reports, Lemoine believed that LaMDA is a person who has rights and might well have a soul, even though it is an internal system for building chatbots that mimic speech. The Mountain View-based software engineer said his interactions with LaMDA led him to conclude that the chatbot has the right to be asked for consent before experiments are run on it.
He confirmed that Google placed him on paid administrative leave on June 6 for violating the company's confidentiality policies. Although he hopes to keep his job, there is a chance that he could be fired. Lemoine said he has no intention of harming the company, and that LaMDA has been incredibly consistent in its communications about what it wants.
Lemoine's mistake was revealing information about a product that is not yet available to the public, which put him in breach of his NDA and confidentiality obligations. Companies rarely tolerate NDA violations, so his chances of getting the job back are bleak and he will likely have to find work with another company. Lemoine holds a Master's in Computer Science and a PhD in Computer Science & Engineering from the University of Louisiana.