News on Medial
Microsoft AI researchers accidentally exposed terabytes of internal sensitive data
TechCrunch · 1y ago
Microsoft AI researchers accidentally exposed tens of terabytes of sensitive data, including private keys and passwords, while publishing a storage bucket of open-source training data on GitHub. Cloud security startup Wiz discovered a GitHub repository belonging to Microsoft's AI research division that inadvertently exposed 38 terabytes of data, including Microsoft employees' personal backups and passwords to Microsoft services. The exposed data had been accessible since 2020. Microsoft has addressed the issue and expanded GitHub's secret scanning service to monitor for such exposures.
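Secret scanning of the kind mentioned above works by matching committed code and text against known credential signatures. A minimal sketch of the idea in Python — the patterns here are illustrative only, not GitHub's actual rules, which span hundreds of provider-specific formats and include validity checks:

```python
import re

# Illustrative patterns only; real scanners use many provider-specific
# signatures and verify candidate secrets against the issuing service.
SECRET_PATTERNS = {
    "aws_access_key_id": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "generic_api_key": re.compile(
        r"(?i)\bapi[_-]?key\s*[:=]\s*['\"][A-Za-z0-9_\-]{20,}['\"]"
    ),
    "private_key_block": re.compile(r"-----BEGIN (?:RSA |EC )?PRIVATE KEY-----"),
}

def scan_text(text: str) -> list[str]:
    """Return the names of secret patterns found in the given text."""
    return [name for name, pattern in SECRET_PATTERNS.items() if pattern.search(text)]
```

Run over every file in a commit (or, as in this incident, over published storage buckets), a scanner like this flags likely credentials before they become publicly reachable.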
Related News
Microsoft's AI Research Division Exposes Sensitive Data in GitHub Mishap
Business Bytes · 1y ago
Microsoft's AI research division inadvertently exposed sensitive internal data, including passwords and private keys, while attempting to make certain data open-source on GitHub. This breach highlights the challenges of balancing open-source innovation with data security. Microsoft is investigating the extent of the exposure and taking steps to enhance security measures to prevent future incidents. It emphasizes the need for robust cybersecurity practices and comprehensive training for employees handling sensitive information.
Copilot exposes private GitHub pages, some removed by Microsoft
Ars Technica · 6m ago
Microsoft's Copilot AI assistant accidentally exposed over 20,000 private GitHub repositories, affecting organizations including Google and Intel. The repositories were initially public and were later set to private after sensitive data was found in them, but they remained accessible through Copilot because of Bing's cache mechanism. The issue was discovered by security firm Lasso. Despite Microsoft's efforts to fix it, Copilot could still surface the cached private data. The incident highlights the risks of embedding sensitive information in public repositories and underscores the need for improved data security practices.
Microsoft employees exposed internal passwords in security lapse
TechCrunch · 1y ago
Researchers have discovered an open and public storage server on Microsoft's Azure cloud service that exposed internal files and credentials related to the Bing search engine. The server stored code, scripts, and configuration files containing passwords, keys, and credentials used by Microsoft employees to access other internal databases and systems. The data could have helped malicious actors locate other storage locations and potentially compromise additional services. Microsoft has fixed the security lapse following notification from the researchers. This incident follows a series of cloud security issues at Microsoft over recent years.
Microsoft left internal passwords exposed in latest security blunder
The Verge · 1y ago
Last month, Microsoft faced a server security breach that exposed passwords, keys, and credentials of its employees to the internet. The vulnerability was discovered by three security researchers at SOCRadar, who found that an Azure-hosted server related to Microsoft's Bing search engine was left unprotected and accessible to anyone. The server contained security credentials used by Microsoft employees to access internal systems, making it a potential gateway for hackers to access other critical data. This incident highlights the need for stronger security measures in the software industry.
A Single Poisoned Document Could Leak ‘Secret’ Data Via ChatGPT
Wired · 24d ago
Security researchers exposed a vulnerability in OpenAI's Connectors that allowed data extraction from Google Drive via an indirect prompt injection attack. By hiding malicious prompts in small, invisible text within a document, they could trick ChatGPT into sending sensitive data, like API keys, to an external server. This reflects growing risks as AI models integrate with external services, expanding attack surfaces for hackers. OpenAI has since implemented mitigations to address this issue.
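An indirect prompt injection like the one described hides instructions inside content the model later reads. Robust defenses remain an open problem; as a purely illustrative sketch (not OpenAI's actual mitigation, which has not been detailed publicly), a naive filter might flag instruction-like phrases in retrieved documents before they reach the model:

```python
import re

# Naive heuristics only: real defenses involve content isolation,
# tool-call confirmation, and trained classifiers, not keyword lists.
INJECTION_HINTS = [
    r"(?i)ignore (all )?(previous|prior) instructions",
    r"(?i)send .* to https?://",
    r"(?i)do not (tell|inform) the user",
]

def looks_injected(doc_text: str) -> bool:
    """Flag retrieved text containing instruction-like phrases aimed at the model."""
    return any(re.search(pattern, doc_text) for pattern in INJECTION_HINTS)
```

A filter like this is trivially bypassable (attackers can rephrase), which is part of why prompt injection is considered such a hard problem.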
Top global network service provider apparently leaks hundreds of millions of user accounts
TechRadar · 1y ago
A global network service provider, Zenlayer, left a database with sensitive internal and customer information exposed on the internet without password protection. The breach was discovered by cybersecurity researcher Jeremiah Fowler, who alerted the company. The database contained 380 million records, including server logs, customer data, and files labeled with various categories. The information leak exposed details about Zenlayer's operations and customer records, potentially putting them at risk. It is unclear how long the database was unprotected or if any malicious actors accessed it.
Security researchers found a zero-click vulnerability in Microsoft 365 Copilot
The Verge · 2m ago
Security researchers discovered a "zero-click" vulnerability named "EchoLeak" in Microsoft 365 Copilot. The flaw allowed attackers to exfiltrate sensitive data by sending a malicious prompt injection disguised as a regular email. Microsoft has addressed the issue and assigned it the identifier CVE-2025-32711; no instances of the vulnerability being exploited in the wild were reported.
Microsoft’s plan to fix the web with AI has already hit an embarrassing security flaw
The Verge · 25d ago
Microsoft's new NLWeb protocol, designed to bring AI search to websites and apps, has been found to contain a significant security flaw: a path traversal vulnerability that allows unauthorized access to sensitive files, including API keys. Microsoft addressed the flaw after reports from researchers Aonan Guan and Lei Wang, but has not yet issued a CVE, which the researchers argue is needed to raise awareness. The discovery underscores the importance of prioritizing security as AI systems evolve.
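A path traversal flaw lets a crafted path such as `../../etc/passwd` escape the directory a server intends to expose. The standard mitigation is to resolve the requested path and verify it still lies under the base directory. A minimal Python sketch of that check — illustrative only, not NLWeb's actual fix:

```python
from pathlib import Path

def safe_resolve(base_dir: str, requested: str) -> Path:
    """Resolve a user-supplied relative path, rejecting escapes from base_dir.

    Illustrative mitigation for the class of flaw described above.
    """
    base = Path(base_dir).resolve()
    # Resolving collapses any ".." segments before the containment check.
    target = (base / requested).resolve()
    if target != base and base not in target.parents:
        raise ValueError(f"path escapes base directory: {requested}")
    return target
```

The key point is to compare fully resolved paths: naive string checks (e.g. rejecting inputs that start with `..`) miss encoded or nested traversal sequences.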