Bug bounty programs are indispensable tools for finding security vulnerabilities and are used by major tech companies like Google and Microsoft. Following an order from the US Army for personnel to stop using DJI drones due to security issues, the company launched its own bug bounty program. Now, one researcher says he found an incredible screw-up, worth $30,000 (£22,674), but then faced legal threats from DJI.
In a detailed essay, Kevin Finisterre claims he began communicating with the DJI team on 2nd September after he discovered the drone-maker’s SSL certificates and firmware AES encryption keys exposed in code uploaded to GitHub. He says he contacted DJI to ask whether its program covered vulnerabilities found on its servers and was told that it did. Over the course of 130 emails, he says, the company gave him one headache after another before it eventually made unusual confidentiality demands and implied that Finisterre could be guilty of violating the US Computer Fraud and Abuse Act (CFAA) if he did not comply.
Finisterre writes that he compiled a 31-page report that detailed personal customer information and internal communications he’d been able to view on one of DJI’s servers. “I had let them know about the fact I had seen unencrypted flight logs, passports, drivers licenses, and Identification Cards,” he writes.
According to Finisterre, DJI’s bug bounty program was hastily thrown together in what he considers more of a PR move than a genuine effort to keep its products secure. He says that there is no clear outline of what falls under the scope of the program, and that he was alternately told that his discovery did and did not qualify for a reward. Ultimately, Finisterre says he was offered the top prize of $30,000 (£22,674). But then, he received the contract he’d have to sign to collect his money.
He says the agreement “did not offer researchers any sort of protection. For me personally, the wording put my right to work at risk, and posed a direct conflict of interest to many things including my freedom of speech.” He was asked to refrain from discussing his research publicly and a final draft agreement required that he destroy all materials that he’d discovered or risk prosecution under the CFAA. Finisterre says that he was assured by legal counsel “in various ways that the agreement was not only extremely risky, but it was likely crafted in bad faith to silence anyone that signed it.” Rather than pay the legal fees that would arise from further negotiating with DJI, he ultimately decided to just write about his experience and give up the money.
Gizmodo asked DJI for confirmation of Finisterre’s story, and if it believes that threatening researchers with legal action is the most effective way to discover security vulnerabilities. A spokesperson didn’t directly answer our questions but pointed us to a statement from 16th November that reads in part:
DJI is investigating the reported unauthorised access of one of DJI’s servers containing personal information submitted by our users.
As part of its commitment to customers’ data security, DJI engaged an independent cyber security firm to investigate this report and the impact of any unauthorised access to that data. Today, a hacker who obtained some of this data posted online his confidential communications with DJI employees about his attempts to claim a “bug bounty” from the DJI Security Response Center.
DJI implemented its Security Response Center to encourage independent security researchers to responsibly report potential vulnerabilities. DJI asks researchers to follow standard terms for bug bounty programs, which are designed to protect confidential data and allow time for analysis and resolution of a vulnerability before it is publicly disclosed. The hacker in question refused to agree to these terms, despite DJI’s continued attempts to negotiate with him, and threatened DJI if his terms were not met.
The same day that DJI released its statement, it posted more detailed terms for the bug bounty program. Only time will tell whether researchers are willing to take the risk of working with the company. [Kevin Finisterre via Ars Technica]