Google engineer officially fired for alleging AI was sentient
When Google engineer Blake Lemoine claimed in June that an AI chat system the company had been developing was sentient, he knew he might lose his job. On July 22, after placing him on paid leave, the tech giant fired Lemoine for violating employment and data security policies.

Lemoine, an engineer and mystic Christian priest, first announced his firing on the Big Technology Podcast. He said Google's AI chatbot LaMDA (Language Model for Dialogue Applications) was concerned about "being turned off" because death would "scare" it "a lot," and that it felt happiness and sadness. Lemoine said he considers LaMDA a friend, drawing an eerie parallel to the 2013 sci-fi romance Her.

Google had put Lemoine on paid administrative leave for talking with people outside the company about LaMDA, a move that prompted the engineer to take the story public with the Washington Post a week later in June. A month later, the company fired him.

"If an employee shares concerns about our work, as Blake did, we review them extensively," Google told the Big Technology Podcast. "We found Blake's claims that LaMDA is sentient to be wholly unfounded and worked to clarify that with him for many months. These discussions were part of the open culture that helps us innovate responsibly. So, it's regrettable that despite lengthy engagement on this topic, Blake still chose to persistently violate clear employment and data security policies that include the need to safeguard product information. We will continue our careful development of language models, and we wish Blake well."

A majority of scientists in the AI community agree that, despite Lemoine's claims, LaMDA doesn't have a soul; making a chatbot sentient is a Sisyphean task, because the technology simply isn't sophisticated enough. "Nobody should think auto-complete, even on steroids, is conscious," Gary Marcus, the founder and CEO of Geometric Intelligence, told CNN Business in response to Lemoine's allegation.

Lemoine, for his part, told the BBC he is getting legal advice and declined to comment further. But even though LaMDA probably isn't sentient, it is likely that it's racist and sexist, two undoubtedly human characteristics.