Microsoft's AI team has made a huge blunder by exposing 38 TB of the company's private data. A report has revealed that this data contained confidential information such as passwords for Microsoft services, secret keys, and over 30,000 internal Microsoft Teams messages from more than 350 Microsoft employees.
According to a report by cloud security firm Wiz, the tech giant's AI team published a collection of training data consisting of open-source code and AI models used for image recognition. Visitors to the GitHub repository were given an Azure storage link from which these models could be downloaded.
Things went south because the link provided by the Microsoft AI team gave visitors access not just to view the contents of the Azure storage account, but full control over it, including the ability to upload, overwrite, or delete files.
Further, the report traced the root of the problem to a Shared Access Signature (SAS) token, an Azure feature that generates special links for accessing Azure Storage data. Such links can typically be restricted to specific permissions and expiry times, but this particular link was configured to grant full access.
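The scoping idea behind SAS tokens can be sketched as follows. This is a minimal, illustrative analogue in Python, not Azure's actual token format or SDK: a token encodes the permissions and expiry it grants and is signed with a secret key, so a narrowly scoped token cannot be used for operations it does not name. The key, permission letters, and helper names here are all hypothetical.

```python
import hashlib
import hmac
import time

# Hypothetical account key standing in for an Azure storage account key.
SECRET_KEY = b"storage-account-key"

def mint_sas(permissions: str, expiry: int) -> str:
    """Create a token granting `permissions` (e.g. 'r' or 'rwd') until `expiry`."""
    payload = f"{permissions}|{expiry}"
    sig = hmac.new(SECRET_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}|{sig}"

def check_sas(token: str, wanted: str, now: int) -> bool:
    """Allow an operation only if the token is authentic, unexpired, and scoped for it."""
    permissions, expiry, sig = token.rsplit("|", 2)
    payload = f"{permissions}|{expiry}"
    expected = hmac.new(SECRET_KEY, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False  # forged or tampered token
    if now > int(expiry):
        return False  # expired
    return all(op in permissions for op in wanted)

# A narrowly scoped token: read-only, valid for one hour.
scoped = mint_sas("r", int(time.time()) + 3600)
# The leaked link behaved more like this: read/write/delete, long-lived.
overbroad = mint_sas("rwd", int(time.time()) + 10 * 365 * 24 * 3600)

print(check_sas(scoped, "w", int(time.time())))     # read-only token cannot write
print(check_sas(overbroad, "w", int(time.time())))  # full-access token can
```

The sketch shows why the misconfiguration mattered: the damage was not the link itself but the permissions and lifetime baked into it when it was minted.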
The report also mentioned that Microsoft's private data had been exposed since 2020, but researchers only informed the company on June 22, 2023. The Shared Access Signature token was reportedly revoked within two days, and the problem was fully resolved by Microsoft in August, according to the report.
Microsoft, for its part, assured users that no customer data was exposed in the incident.