Microsoft Employees Accidentally Leaked 38TB of Private Company Data

Microsoft employees accidentally leaked 38TB of private company data, including internal Teams chats. Cybersecurity researchers recently found a Microsoft Azure storage account left wide open, with no security controls in place.

Microsoft Employees Leaked Private Data

Cybersecurity researchers from Wiz discovered a vast, unsecured Microsoft Azure cloud storage account hosting sensitive information on hundreds of users, including private keys and passwords.

The storage account in question reportedly belonged to Microsoft researchers working on artificial intelligence (AI). The good news is that access was locked down before any lurking hackers could get their hands on the data.

As Wiz's researchers explained, they were investigating accidental exposure of cloud-hosted data when they found a Microsoft GitHub repository containing open-source code for AI image-recognition models. The models were hosted at an Azure Storage URL, but due to human error the same storage also held data that no one outside Microsoft should have been able to access.

Content of the Leaked Data

That data amounted to 38 terabytes of information, including backups of two Microsoft employees' computers, passwords to Microsoft services, and over 30,000 Teams chat messages exchanged by Microsoft employees. The storage account itself was not directly accessible, the researchers explained; instead, Microsoft's AI team had generated a shared access signature (SAS) token that granted far too many permissions. As TechCrunch explains, SAS tokens let Azure users generate shareable links to data in an Azure Storage account.
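The danger with SAS tokens is that the permissions, scope, and expiry are baked into the link itself at signing time. The following is a simplified, illustrative sketch of that signing idea only (it is not Azure's actual string-to-sign format, and the account key and paths are made up; real tokens are produced with Azure's SDK or portal):

```python
import base64
import hashlib
import hmac

# Hypothetical base64-encoded account key, for illustration only.
ACCOUNT_KEY = base64.b64encode(b"example-account-key").decode()

def sign_sas(permissions: str, expiry: str, resource: str, key_b64: str) -> str:
    """Sign a simplified SAS-style string-to-sign with HMAC-SHA256.

    Azure's real string-to-sign includes more fields; the point here is
    that permissions, expiry, and scope are bound into the signature, so
    whatever scope the issuer chose travels with the shared link.
    """
    string_to_sign = "\n".join([permissions, expiry, resource])
    key = base64.b64decode(key_b64)
    sig = hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    return base64.b64encode(sig).decode()

# A narrowly scoped, read-only token for one file...
narrow = sign_sas("r", "2023-06-30T00:00:00Z", "/container/model.ckpt", ACCOUNT_KEY)
# ...versus a broad read/write token for the whole account: the failure
# mode in this incident, where the token exposed far more than intended.
broad = sign_sas("rw", "2051-10-06T00:00:00Z", "/", ACCOUNT_KEY)
```

Because the signature covers the scope, a recipient cannot narrow or widen it afterwards; the only safeguards are issuing the token with minimal permissions and a short expiry in the first place.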

Microsoft Was Notified Of the Findings on June 22

Wiz notified Microsoft of its findings on June 22, and the SAS token was revoked two days later. It then took the company almost three weeks to complete a thorough investigation, after which it concluded that the data had not been accessed by any unauthorized third parties, TechCrunch reported.

To help prevent this from happening again, Microsoft expanded GitHub's secret scanning service, which monitors all public open-source code changes for credentials and other secrets exposed in plaintext.
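Conceptually, secret scanning boils down to matching known credential patterns against committed text. A toy sketch of the idea (the pattern names and regexes below are invented for illustration; GitHub's real service uses provider-registered patterns plus validity checks):

```python
import re

# Illustrative-only patterns, not GitHub's actual detection rules.
PATTERNS = {
    "azure_sas": re.compile(r"sig=[A-Za-z0-9%+/=]{16,}"),
    "generic_password": re.compile(r"password\s*=\s*\S+", re.IGNORECASE),
}

def scan(text: str) -> list[str]:
    """Return the names of secret patterns found in a diff or file."""
    return [name for name, pattern in PATTERNS.items() if pattern.search(text)]

# A SAS signature accidentally committed in a connection string:
hits = scan("url = 'https://acct.blob.core.windows.net/c?sig=abc123DEF456ghi789JK'")
```

In practice such scanners run on every push to public repositories, so a credential pasted into source code is flagged (and often revoked by the provider) shortly after it lands.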

Unsecured Databases Are a Common Occurrence

Unfortunately, unsecured databases are a common occurrence. Earlier this year, a relatively popular Android voice chat app called OyeTalk made the same mistake. The app used Google's Firebase mobile application development platform, which also offers cloud-hosted databases, and according to researchers from Cybernews, OyeTalk's Firebase instance was not password-protected, meaning its contents were visible to anyone.
