“The Norwegian Government hacked my startup!”
At least, that’s how the conversation started, in a WhatsApp message early on a Friday evening in October, when a concerned startup founder contacted me to ask for help.
My initial reaction was that this couldn’t be the case; it must be some kind of email hoax designed to scare her into handing over some bitcoins, and it wouldn’t take me long to convince her it was nothing to worry about. But the more I listened, the more it became clear that there was something rather more serious afoot.
It all began with an investigation into smartwatch security by the Norwegian Consumer Council (NCC), specifically looking at devices intended for children. They paid some security researchers to spend a little time looking for flaws in the most popular products on the market. When the researchers duly found some, the NCC notified the vendors of the flaws and instructed them that fixes would be required. Initially, the NCC told vendors it wouldn’t go to the media unless its requests for fixes were ignored, but no actual deadlines were given.
So far so good, right? Vendors get a free security test, and consumers get their privacy bolstered by the government. Win-win?
Well, not for long. Just 30 days later, things went from win-win to lose-lose. With no warning whatsoever to the vendors, the NCC decided to simply release the information it had to the media. Security weaknesses in kids’ smartwatches make for pretty good headlines, so the story was scooped up by numerous news outlets including the Daily Mail and BBC News. That left our unfortunate startup founder with a media ****storm to deal with, and she immediately lost a number of key contracts with retailers as a result.
You might think that’s a good outcome: serves them right for selling an insecure product, right? But the difficulty is that there’s no guidebook for business owners, there are no regulations as to exactly what tests should be performed on products, and these founders are entrepreneurs, not cyber security experts.
In this case, the founder already thought she had done the right thing. It wasn’t that she had been sloppy about security. She has children of her own and cares deeply about the security of her customers, so she had signed up to a crowdsourced ‘bug bounty’ platform, with the promise that security researchers would proactively find flaws for her. Unfortunately, she simply didn’t understand the difference between that and a rigorous, structured security audit.
Fortunately though, Colleen, founder of TechSixtyFour (the EU distributor for the Gator smartwatch), didn’t make the same mistake twice. She realised the best way to deal with this situation was head-on, and engaged Intruder to perform a full security audit of the latest watch and its associated mobile application. Needless to say, we found a number of further issues that even the NCC had not, and got straight to work helping the manufacturer fix them.
She also signed up to our continuous security monitoring service for the cloud servers that the watches communicate with, so that even after the security audit was over, we would keep checking that no new weaknesses were introduced. That’s a level of security a vast number of companies in the world would struggle to match. In fact, the PCI Data Security Standard, which is responsible for making sure all our credit cards are safe, only mandates quarterly vulnerability scans. Well, no wonder our credit cards keep turning up on the internet in droves!
You might think all this is actually a good outcome, but in the cyber security industry what the NCC did is called irresponsible disclosure, and it’s often more harmful than it is helpful. Here’s why:
- By going to the media highlighting active issues with children’s smartwatches, the NCC was publicising these weaknesses for anyone to go and exploit before any fixes were available. If it really believed these issues were serious security concerns, it would not have released the information until the vendors had a fair chance to remediate them. At best the NCC’s approach is hypocritical; at worst, it is dangerous.
- All companies make mistakes in their software, and those mistakes take time to fix. Even “Project Zero” (a team of some of the best security researchers in the world, created by Google to take sloppy software developers to task) decided to cast aside the previous norm of waiting until the vendor had fixed issues before disclosing them to the public. But even they give vendors 90 days to fix their flaws, and even that has been deemed a controversially short timescale in the industry.
So why would the NCC do such an unreasonable thing? Well, we can only speculate, but it probably has a lot more to do with hunting headlines than with protecting consumers.
And what about the retailers? The knee-jerk reaction of dropping the supplier doesn’t solve any problems long term. Clearly they have no robust process in place to ensure the products they put on their shelves are in any way secure, and the fact that they drop contracts with suppliers only when those suppliers get caught out in the media says everything about the general state of security in these internet-enabled products (now more commonly called IoT devices).
Instead of attempting to assassinate individual businesses on a whack-a-mole basis, the Norwegian Consumer Council should be working with organisations like the IoT Security Foundation, and with other government departments, to establish security seal-of-approval schemes, define standards for responsible retailers to follow, or provide consumer education on what security features to look for in products.
And instead of dropping contracts with suppliers after the fact, retailers should be working on programmes to ensure robust levels of testing have been performed before these products are allowed on shelves.
As for the Gator smartwatch, my view is that with the fixes now in place, your child is safer with this watch than without it. Realistically, your average nasty individual would not go to the lengths that we did to (for example) jam the communication between a child’s smartwatch and their parent. Arguing otherwise is like saying a cheap bike lock increases the risk of bike theft because it can be cut with a simple cable-cutter. You’re still better off having it than not.
In fact, this whole situation reminds me of Theo Paphitis famously tearing apart a Trunki in one episode of Dragon’s Den. It’s not that anyone would actually do that, but the fact it is possible plays on people’s minds more than it should.
That’s not to say there weren’t some really serious concerns with the way this watch was originally implemented, it was simply way too easy to tear this particular Trunki apart.
What saddens me, though, is that the situation is now starting to resemble a witch-hunt, with the NCC recommending that the Gator product be removed from shelves, despite the herculean effort that Colleen and the Gator manufacturer have made to address the most serious issues in an incredibly short timescale. They should now be held up as an example to all other IoT manufacturers. Fixing anything in 30 days can be hard work, but the manufacturer put in a superhuman effort to resolve some major architectural issues, not just apply simple patches. In fact, they closed down issues with a speed that would put Oracle and Microsoft to shame, and they are continuing to do more.
For now, it’s clear that there is still plenty of work to be done on the security of IoT devices. Gator won’t be the last product to be caught out. But I don’t think we’ll get there by over-emphasising risks to provoke a media reaction. Those are the kind of hyperbolised scare tactics that previously confined security professionals to office basements and kept us a million miles away from the board agenda. Times have changed; major hacks are in the headlines almost daily, and we no longer need scare tactics to get attention. The real question is whether we can be mature enough as an industry to make progress without harassing businesses with unrealistic deadlines, and without unnecessarily scaring the public into making ill-informed purchasing decisions.
Thanks to Patrick Craston and Daniel Andrew.