
IF Insights: By firing its ethical AI team, Microsoft enters into the zone of vulnerability

Microsoft has also reportedly removed the waitlist for its AI-powered 'Bing Chat', as the chatbot will now come with GPT-4 features

The economic slowdown in the tech sector since 2022 has resulted in layoffs on a scale never seen before. Businesses in the US shed 77,770 employees in February 2023 alone, although that figure was down from 102,943 a month earlier.

The tech sector has announced 63,216 layoffs so far, up 33,705% from the 187 cuts disclosed in February 2022, and tech companies alone account for 35% of all job cuts announced in 2023. Now Microsoft has reportedly laid off its ethical AI team.

Microsoft Goes Aggressive On The AI Front

The firings were part of the larger round of job cuts (10,000 positions, to be precise) that the tech company had announced earlier. Microsoft is now operating without a dedicated team to ensure its AI principles are closely tied to product design. The team, which reportedly had some 30 employees in 2020 and had been cut down to seven by the end of 2022, was also working on identifying the risks posed by Microsoft’s adoption of OpenAI’s technology across products such as the Bing search engine.

The news comes as GPT-4, backed by the tech giant, hits the market. The product relies on a ‘large multimodal model’ that can accept text inputs of up to 25,000 words, compared with ChatGPT’s 3,000, and is trained to be safer and more factual than the previous generation.
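For developers, the new model is reached through OpenAI’s existing chat completions API. The snippet below is a minimal, illustrative sketch only, assuming the 2023-era openai Python client, a valid API key and an account granted access to the ‘gpt-4’ model name; the file name and prompt are placeholders, not anything prescribed by OpenAI or Microsoft. It simply sends a long document with a question in a single request, the kind of workload the larger input window is meant to handle.

    # Minimal sketch: querying GPT-4 via OpenAI's chat completions API.
    # Assumes the 2023-era openai Python client (openai==0.27.x) and
    # that the account has been granted access to the "gpt-4" model.
    import openai

    openai.api_key = "YOUR_API_KEY"  # placeholder; a real key is required

    # Hypothetical long input meant to exercise the larger input window
    long_document = open("annual_report.txt").read()

    response = openai.ChatCompletion.create(
        model="gpt-4",
        messages=[
            {"role": "system", "content": "You are a careful, factual analyst."},
            {"role": "user",
             "content": "Summarise the key risks in this text:\n\n" + long_document},
        ],
        temperature=0.2,  # keep the output conservative rather than creative
    )

    print(response["choices"][0]["message"]["content"])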

Microsoft has also reportedly removed the waitlist users previously had to join to try its AI-powered ‘Bing Chat’, as the chatbot now comes with GPT-4 features.

Sam Altman-led OpenAI has done its best to promote GPT-4 as a safer bet than the older-generation GPT-3. The company even handed an early version of the new product to third-party researchers at the Alignment Research Center to validate its safety parameters, amid recent reports of the ChatGPT-powered Bing being hostile to its users.

Revisiting Bing Horror

The firing of Microsoft’s ethical AI team comes at a time when its AI-powered Bing search engine has earned infamy for giving users wrong information and for intimidating them during longer chats.

Marvin von Hagen, a student at the University of Munich and a former Tesla analytics intern, engaged in a conversation with Bing in February 2023 in which the AI-powered chatbot threatened to report his IP address and server location to the authorities, hand over information on his hacking activities, block his access to Bing Chat and flag him as a potential cybercriminal. The chatbot even threatened to expose the student’s personal information in public.

Then, on February 17, an Associated Press reporter got drawn into a two-hour-long conversation with Bing, in which the chatbot denied the factual errors it had made and even turned hostile, comparing the reporter to Adolf Hitler and Joseph Stalin.

Bing has also reportedly refused to admit to getting a timeline wrong in a chat result, insisting that the current year was 2022, and, in defence of its search records, even ‘advised’ users to fix their smartphones.

New York Times technology columnist Kevin Roose also wrote about a conversation with Bing in which the chatbot ‘advised’ him to leave his spouse.

Microsoft stated that Bing had a tendency to ‘get confused’ during long chats. It also said it was quadrupling the amount of grounding data used by the AI-powered chatbot and would let users reset the search context after ‘dissatisfactory’ responses. In response to the chatbot going rogue, it imposed limits of five chat turns per session and 50 chats per day, later raising the daily cap to 60.
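Those caps amount to little more than simple counters. The sketch below is purely a hypothetical illustration of how a five-turn session limit and a 60-chat daily quota could be enforced; the class and variable names are assumptions for illustration and have nothing to do with Microsoft’s actual implementation.

    # Hypothetical illustration of per-session and per-day chat caps.
    # Not Microsoft's code; the limits mirror the figures reported above.
    MAX_TURNS_PER_SESSION = 5
    MAX_CHATS_PER_DAY = 60

    class ChatLimiter:
        def __init__(self):
            self.chats_today = 0
            self.turns_in_session = 0

        def start_session(self):
            if self.chats_today >= MAX_CHATS_PER_DAY:
                raise RuntimeError("Daily chat limit reached; try again tomorrow.")
            self.chats_today += 1
            self.turns_in_session = 0  # fresh context for the new chat

        def next_turn(self):
            if self.turns_in_session >= MAX_TURNS_PER_SESSION:
                # Force a context reset instead of letting the conversation drift.
                raise RuntimeError("Turn limit reached; please start a new topic.")
            self.turns_in_session += 1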

Conclusion

So far, 2023 has been about tech giants tussling to outsmart each other and stretching the capabilities of AI to the maximum. Even if one takes OpenAI’s claims that GPT-4 is far better than ChatGPT at face value, the news of Bing going haywire is still fresh in the tech community’s memory, and with Microsoft now firing its ethical AI team, a question mark hangs over the firm’s ability to troubleshoot should such problems crop up again.
