On the Month of Apple Bugs, Backdoor Drama, and Why Security Researchers Need Exceptional Ethics
Being on the road this week, I missed the latest drama at the Month of Apple Bugs, pointed out in this post by Chris Pepper. (One thing Chris doesn't mention is that the backdoor was only included in a pre-release version of the exploit, not in the released proof-of-concept code.) I read LMH's response and explanation, spoke with him directly, and feel he has unfortunately damaged the reputation of an already controversial project.

Basically, LMH noticed some individuals scanning the directories where he posted exploit code samples before the accompanying blog entry was posted; there were no public pointers to these files. He placed the backdoor code in a file at that location, which the people scanning his server picked up and, in some cases, executed. His goal was to identify these individuals and prevent further looting of his server. I think it was a bad idea, but the backdoor code was not in any released version of the exploit. Basically it was a wacky arms race, with the individuals downloading the code poking around someone else's server, and LMH mounting a vigilante-style response.

As I've said many times, I really don't like dropping 0day exploit code; it damages the innocent more than the guilty. But if you are going to drop code as part of full disclosure, it should be to prove the concept and allow people to test and evaluate the vulnerability. Your code should never make the situation worse, no matter what kind of point you're trying to make. Putting in a backdoor to track and expose who downloads the code, released or not, is never the right way to go. It's unethical.

LMH responded on his blog: "The disclaimer is clear enough, and if they go around downloading and voluntarily executing random code (read, a exploit), it's certainly their responsibility to set up a properly isolated environment. Otherwise you're total jackass (although, why would you 'worry if the bugs are fake'?)."
Yes, you should set up a test environment before messing with any exploit code. I blew four hours this weekend fixing my Metasploit copy (I still can't get msfweb running) and creating virtual targets to test a new exploit someone sent me. Then again, most people can't legally virtualize OS X, so testing anything requires buying a spare Mac.

That said, I don't think anyone ever has the right to place backdoor code in anything. If you want to track who is downloading or leaking pre-release code, check your audit logs or take other actions on your own end. You have no right to do anything malicious to someone else's system, even when they aren't playing nice.

This did nothing but hurt the project. Apple security is a serious issue that needs real debate, but games like this destroy credibility and marginalize the individuals involved. Vendors generally dislike security researchers as it is; giving them any opening, no matter how small, makes things that much harder on the research community. It allows PR departments to make a legitimate researcher look like nothing more than a malicious criminal, demonizing them in the press even when they play strictly by the book. I don't consider LMH malicious at all, and after a few conversations I have no doubt his goal is to improve security, but I do disagree with his methods on this particular project. Security researchers need exceptional ethics to withstand vendor attempts at marginalization, or all their work goes for naught.