Microsoft launched a bug bounty program offering rewards of up to $15,000 for finding vulnerabilities in AI systems, aiming to improve AI security through external security testing.
The initial scope of the program covers the AI-powered experiences in Bing, including Bing Chat, Bing Image Creator, and Bing integrations in Microsoft Edge, the Microsoft Start app, and Skype.
The company highlighted the new bounty program in a presentation at the BlueHat security conference. It aims to incentivize security researchers to find bugs and flaws in Microsoft's AI products before malicious actors can exploit them.
Microsoft states in an announcement:
“As shared in our bounty year in review blog post last month, we are constantly growing, iterating, and evolving our bounty programs to help Microsoft customers stay ahead of the curve in the ever-changing security landscape and emerging technologies.”
Microsoft’s Bounty Program Expands To Include AI
Microsoft’s new bounty program is an extension of an existing program, which has awarded over $13 million to researchers. It comes after the company recently updated its vulnerability severity classifications for AI systems and held an AI security research challenge.
According to the bounty program’s terms, eligible vulnerabilities must meet Microsoft’s criticality thresholds, be previously unreported, and include clear, reproducible steps.
Submissions will be judged on technical severity as well as the quality of the report.
The minimum bounty payout is $2,000 for a moderate-severity flaw, ranging up to $15,000 for critical vulnerabilities. Higher rewards are possible at Microsoft’s discretion for issues with significant customer impact.
How To Participate
Researchers interested in participating can submit vulnerabilities through the Microsoft Security Response Center portal.
Microsoft advises ethical bounty hunting using test accounts while avoiding customer data exposure or denial of service.
The program’s scope is limited to technical vulnerabilities in the AI-powered Bing experiences. Some actions aren’t allowed, such as accessing data that doesn’t belong to you, exploiting server-side issues beyond demonstrating proof of concept, and running automated tests that generate a lot of traffic.
In Summary
Microsoft’s AI bug bounty program signals a broader industry focus on identifying and responsibly disclosing vulnerabilities in AI systems before they can be exploited.
While limited to Bing’s AI features, the bounties may expand later as Microsoft builds out and secures more AI capabilities.
Featured Image: Andrii Yalanskyi/Shutterstock