Microsoft launched a bug bounty program offering rewards up to $15,000 for finding vulnerabilities in AI systems, aiming to improve AI safety through external security testing.

The initial scope of the program will cover the AI-powered features in Bing, including Bing Chat, Bing Image Creator, and Bing integrations in Microsoft Edge, the Microsoft Start app, and Skype.

The company highlighted the new bounty program in a presentation at the BlueHat security conference. The program aims to incentivize security researchers to find bugs and flaws in Microsoft's AI products before malicious actors can exploit them.

Microsoft states in an announcement:

“As shared in our bounty year in review blog post last month, we are constantly growing, iterating, and evolving our bounty programs to help Microsoft customers stay ahead of the curve in the ever-changing security landscape and emerging technologies.”

Microsoft’s Bounty Program Expands to Include AI

Microsoft’s new bounty program is an extension of an existing program, which has awarded over $13 million to researchers. It comes after the company recently updated its vulnerability severity ratings for AI systems and held an AI security research challenge.

According to the bounty program’s terms, eligible vulnerabilities must meet Microsoft’s criticality thresholds, be previously unreported, and include clear, reproducible steps.

Submissions will be judged on technical severity as well as the quality of the report.

Bounty payments range from $2,000 for moderate-severity flaws up to $15,000 for critical vulnerabilities. Higher rewards are possible at Microsoft's discretion for issues with significant customer impact.

How To Participate

Researchers interested in participating can submit vulnerabilities through the Microsoft Security Response Center portal.

Microsoft advises researchers to hunt ethically by using test accounts and avoiding exposure of customer data or disruption of service.

The program's scope is limited to technical vulnerabilities in the AI-powered Bing experiences. Certain actions are prohibited, including accessing data that doesn't belong to you, exploiting server-side issues beyond what is needed for a proof of concept, and running automated tests that generate significant traffic.

In Summary

Microsoft’s AI bug bounty program signals a broader industry focus on identifying and responsibly disclosing vulnerabilities in AI systems before they can be exploited.

While limited to Bing’s AI features, the bounties may expand later as Microsoft builds out and secures more AI capabilities.


Featured Image: Andrii Yalanskyi/Shutterstock


