Microsoft AI researchers accidentally leaked 38TB of data online, including passwords, secret keys, and more

Cybersecurity is a big concern in 2023 for both individuals and enterprises, and it seems even big technology companies aren’t safe from the prying eyes of hackers and cybercriminals. In a shocking development, almost 38TB of data was leaked online, albeit accidentally. While companies today have numerous data protection policies and safeguards against external threat actors in place, this data leak allegedly involved Microsoft’s own employees. Here’s what happened.

In a blog post, Microsoft announced that researchers at cloud security firm Wiz discovered that Microsoft’s AI division researchers accidentally leaked 38TB of data while contributing to a GitHub repository involving the development of open-source AI models. Microsoft has emphasized that no customer data or any other internal service was put at risk, and no customer action is required. However, thousands of internal Microsoft Teams messages, secret keys, passwords for Microsoft’s services, and other data were involved in the big data leak.

How did the leak occur?

According to a Coordinated Vulnerability Disclosure (CVD) report by Wiz, the data leak involved a Microsoft employee who accidentally shared a URL for a blob store while contributing to a public GitHub repository on the development of open-source AI models. This URL included a Shared Access Signature (SAS) token, a Microsoft Azure feature that grants access to an internal storage account. “Like other secrets, SAS tokens should be created and managed properly”, Microsoft said.

While SAS links are generally scoped to only a select number of files, this link was configured in such a manner that it gave access to the entire account. It also granted “full control” permissions, allowing anyone with the link to edit the contents of the entire account, instead of just allowing read-only access. Access to the internal storage account was inadvertently included in the blob URL; the account contained backups of the workstation profiles of two former Microsoft employees, including their passwords, as well as thousands of Teams messages exchanged with their colleagues.
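To see why scope matters here, it helps to know that a SAS token is essentially an HMAC-SHA256 signature (computed with the storage account key) over a “string-to-sign” that encodes the permissions, expiry, and resource being shared. The sketch below is a deliberately simplified illustration of that signing scheme, not Azure’s actual string-to-sign format (the real one has many more fields); the function names and parameters are hypothetical. The point it demonstrates is that a narrowly scoped, read-only, short-lived token is built from the same primitive as an over-permissive one, and the safety comes entirely from what the issuer chooses to sign.

```python
import base64
import hashlib
import hmac
from datetime import datetime, timedelta, timezone


def sign_sas(account_key_b64: str, string_to_sign: str) -> str:
    """Base64-encoded HMAC-SHA256 over the string-to-sign:
    the cryptographic primitive behind Azure SAS signatures."""
    key = base64.b64decode(account_key_b64)
    digest = hmac.new(key, string_to_sign.encode("utf-8"),
                      hashlib.sha256).digest()
    return base64.b64encode(digest).decode()


def make_scoped_sas(account_key_b64: str,
                    resource_path: str,
                    permissions: str = "r",
                    valid_hours: int = 1) -> str:
    """Build a toy SAS query string scoped to one blob.

    Simplified for illustration: Azure's real string-to-sign
    includes additional fields (service version, protocol,
    IP range, etc.).  permissions="r" means read-only; the
    leaked token was effectively account-wide full control.
    """
    expiry = (datetime.now(timezone.utc)
              + timedelta(hours=valid_hours)).strftime("%Y-%m-%dT%H:%M:%SZ")
    string_to_sign = "\n".join([permissions, expiry, resource_path])
    sig = sign_sas(account_key_b64, string_to_sign)
    # sr=b marks the token as scoped to a single blob, not the account
    return f"?sp={permissions}&se={expiry}&sr=b&sig={sig}"
```

In this framing, the misconfiguration described above corresponds to signing a token with account-wide scope and write permissions and no meaningful expiry, where a single-blob, read-only, one-hour token would have sufficed.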

The Wiz research team was able to access this account with the SAS token, and the massive security issue was then reported to the Microsoft Security Response Center (MSRC). Following this, all external access to the storage account was revoked.

Microsoft said, “Additional investigation then took place to understand any potential impact to our customers and/or business continuity. Our investigation concluded that there was no risk to customers as a result of this exposure.”
