Because I'm all about the "good enough."

Tuesday, December 8, 2015

A matter of taste.

I've figured it out: The word "cyber" is like garlic.

For most palates, just a bit of cyber in anything is enough. It makes it all a bit more interesting.

Some people love cyber so much that they put it in everything, in massive amounts (chicken with 40 cloves of cyber, for example). Others are so sensitive to cyber that they can't stand the faintest whiff of it.

If you've been raised in a culture that uses cyber a lot, you won't realize how it comes across to those who haven't grown up with it. People will pull away from you with horrified or disgusted looks on their faces and you won't know why. When you've been steeping in cyber, you don't notice the smell any more.

There's even a certain part of the United States that just loves its cyber. It puts on a regular cyber festival, where you can get cyber flavor in everything. I've never been to it myself, but I can tell you right now that I will never accept cyber-ice cream.

Some cultures love cyber, and some don't, but if you're part of a couple and only one of you has ingested cyber that day, you're going to have compatibility problems later on that evening.

And one final thought: if you feed your toddler too many spinach pierogi with cyber, she's going to be exhaling that stench for days until it clears out of her little body. Trust me on this one.




Wednesday, November 25, 2015

Why the airplane analogy doesn't fly.

Don't get me wrong — I love Trey Ford. He is one of the most inspiring infosec pros I know. He's smart, creative, full of mind-blowing ideas, and has energy to spare. And I love his talk at SecTor about what we can learn about information sharing from the aviation industry.

There's just one problem: aviation isn't all that comparable to cybersecurity.

Imagine that instead of flying the plane herself, a pilot had to convince all the passengers, on EVERY flight, to do the flying together. And many of them aren't good at it and don't care; they just want to sleep or watch videos or whatever.

The passengers change all the time, so you can't keep them educated on what to do. Depending on the size of the plane, there may be tens or hundreds of thousands of passengers helping with the flying. Instead of a finite maintenance crew that's under the direct control of the airline, there are dozens or thousands of different crews from third-party companies, all doing their bits (or not).

The aircraft types range into the thousands, dating back to Kitty Hawk and up to the newest models, and most of them have at least some custom alterations that can be changed between flights, so the various manufacturers won't take responsibility for anything they didn't add. Remember, too, that each of those alterations was probably made for a good reason — or at least, a reason that was good at the time. (That's a huge part of what we don't know about breaches today: we sometimes know the chain of events and contributing vulnerabilities, but we make rash judgments about why they happened without knowing the full story.)

The airlines all have different ideas on how they should equip their planes, so some pilots have one of everything new and shiny, and others have to make do with duct tape and bags of pretzels. (And some airlines are just now thinking that maybe having a dedicated pilot is a good idea.)

Oh, and did I mention? The weather is actively trying to disrupt your flight, usually in a way that you won't notice until it's too late. (Although you still have to worry about hacktivist storm cells that want you to look bad.)

All of these differences highlight our challenge in security: because everything is so complicated, so flexible, and so NOT under our individual control, we can easily blame someone else for their breaches because they did things so differently. And the pilot is nominally in charge, so that's where we concentrate the attention, but even the pilot can't get the toddler in 32B to stop screaming and fly straight. I'm not even going to mention the armchair aviation enthusiasts who sit near the runway with binoculars and lasers and provide "helpful" critique. (Oops, I guess that slipped out.)

So how can we still make use of what we've learned from information sharing in aviation? As Trey says, we can at least collect data now in a way that we may be able and willing to share later. If only we had a black box that collected vital information about a breach in a way that didn't expose the inner workings of the business, or those custom-built additions. If only we could sanitize the data in a way that communicated the important lessons ("don't combine these tray tables with that boarding process, and especially don't add a pilot over 6 feet tall without upgrading the landing gear") but defanged our industry's reflexive attempts at a certain kind of blame ("how stupid was that? We'd never do that!").
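
To make that wish a bit more concrete: here's a minimal sketch of what "black box" sanitization might look like, using a whitelist of lesson-bearing fields and dropping anything that identifies the victim or exposes internal architecture. Every field name below is invented for illustration.

```python
# A toy sketch of "black box" sanitization: keep only the fields that
# carry a transferable lesson; drop anything that names the victim or
# reveals internal architecture. All field names are hypothetical.

SHAREABLE_FIELDS = {"attack_vector", "contributing_controls",
                    "dwell_time_days", "lesson"}

def sanitize_incident(record: dict) -> dict:
    """Return only the shareable subset of a raw incident record."""
    return {k: v for k, v in record.items() if k in SHAREABLE_FIELDS}

raw = {
    "org_name": "Acme Airlines",              # identifying: dropped
    "custom_auth_module": "legacy_sso_v2",    # internal detail: dropped
    "attack_vector": "phished vendor credentials",
    "contributing_controls": ["no MFA on the vendor portal"],
    "dwell_time_days": 42,
    "lesson": "don't give third parties single-factor access",
}

print(sanitize_incident(raw))
```

The whitelist is the important design choice: anything not explicitly approved for sharing stays private by default, which is what would let an organization contribute data without exposing its inner workings or those custom-built additions.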

When I consider all this, sometimes I despair that we'll ever figure it out. But with positive thinkers like Trey, we may just have a chance.





Monday, September 7, 2015

When your risk profile is different.

Ready for some (more) unfounded speculation?

Both people and organizations tend to want to keep their data within a circle of trust; it's why there has been (and continues to be) resistance to putting sensitive data in the cloud. It's a function of human nature to keep things close -- which is why people still keep files on their desktops or laptops, use USB drives, and run servers at home. You keep your treasures in an environment that you know best, and where you feel you have the most control over them.

According to the Washington Post, President Bill Clinton already had a personal email server at home; Hillary Clinton had used a server during her first presidential campaign in 2008, and that same server was then set up for her at home when she took the Secretary of State post.

Besides this controversy with her home email server (and yes, I commented on that on CNN, but they must not have liked most of what I had to say), I noticed the other day that apparently Caroline Kennedy had also been using personal email for State Department business. This suggests to me that the two of them may have had a common reason for doing this, one that hasn't been highlighted so far:

They both have a very different risk profile from most public officials.

When you're a celebrity -- independent of the position you currently hold -- your threat modeling has to include just about everyone. Any friends you have, any staff members you hire, could turn on you at any time for some perceived advantage. Now, Hillary could have known that the State Department was bad at securing its own systems, but I don't think that was it. I think she just couldn't trust staffers who worked for the agency and not for her personally. Any of them might try to access her email for political or personal reasons -- and let's face it: she's spent many, many years being embattled. The same would go for Caroline Kennedy, as well as anyone else who was famous before they took office.

In other words, their threat model holds colleagues to be a higher risk than hackers.

If you think this is surprising, you haven't been inside the minds of most non-security people. They have seen and experienced far more threats on a personal level than they ever have from The Notorious A.P.T., so they will defend against the threat they believe in more.
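
To put that in concrete terms: most informal threat models boil down to ranking threats by perceived likelihood times impact, and if your lived experience makes insiders feel likely and the APT feel abstract, the ranking follows. A toy sketch, with every threat and number invented purely for illustration:

```python
# Toy risk ranking: score = perceived_likelihood * impact (each 1-10).
# The entries and numbers are invented to illustrate the point, not to
# reflect any real assessment.

threats = {
    "disloyal staffer reads my email": {"likelihood": 8, "impact": 7},
    "opponent leaks something embarrassing": {"likelihood": 6, "impact": 6},
    "nation-state APT intrusion": {"likelihood": 2, "impact": 9},
    "flood takes out the basement server": {"likelihood": 3, "impact": 5},
}

for name, t in sorted(threats.items(),
                      key=lambda kv: kv[1]["likelihood"] * kv[1]["impact"],
                      reverse=True):
    print(f"{t['likelihood'] * t['impact']:>3}  {name}")

# The insider outranks the APT for as long as perceived likelihood does.
```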

None of us really knows how secure the server ended up being (although it looks like Hurricane Sandy caused natural disasters to become a more prominent part of the threat model, which is why they finally moved it to a provider with an actual data center), so I can't comment on that. Nor am I in any position to comment on the legal or classification issues, since those seem to be changing depending on who's got the microphone at any given time. But from a threat modeling perspective, I can absolutely understand why people want to hold their staff close and their data closer.

Oh, and by the way: if you can't view things from other people's perspectives, you're not going to be very good at threat modeling.


Saturday, May 16, 2015

Lessons in grown-up security.

Okay, so for the sake of those who can't say anything, I feel I have to say something.

Remember how much you hate people talking about things they don't understand? So do I. And let's face it: if you're not on the inside of an organization, you don't know 100% of what's going on there. Oftentimes it's less than 50%. And if it has to do with security, the percentage can drop as low as 10%.

The hysteria around Chris Roberts supposedly hacking a plane and "making it go sideways" has reached an all-time high. Which isn't to say it couldn't go higher, because media. But let's go through the versions here:

There's what he told people he did.
There's what they interpreted from what he said.
There's what he thought he did.
There's what he actually did.

Then there's the usual Telephone game of people misinterpreting, misreporting, and deliberately twisting all those things when they hear them second- and third-hand.

But one fact remains: there are people who actually know what's possible to do, and they ain't talking. Nor will they. Even if Roberts was talking complete bullshit, nobody on the inside is going to step forward and say it publicly. So in this case, silence does not equal assent.

We don't know whether the aircraft manufacturer already has experts doing pentesting, and they don't need any more, thankyouverymuch. Just because they're ignoring your reports doesn't mean they don't already know about what you think you're trying to say. They don't actually owe you an answer: "No, you didn't really get through, but if you had done THIS instead ..." Just because you decide to walk onto the court, it doesn't mean you get to be a player.

We don't know why United decided to come out with a bug bounty program, although it's mighty responsible of them NOT to encourage randoms to try hacking the avionics. Those who are complaining that avionics are missing from the bug bounty program's scope are completely clueless in that regard, and have probably never been personally responsible for anything more consequential than a runaway shopping cart.

There may be no truth at all to what the FBI claims Roberts did, and they're just prosecuting him because letting him go free would send the wrong message to other juvenile delinquents out there.

The bottom line is, if you're not actively working WITH the company whose technology you're researching, then you're an adversary. So don't be surprised if they treat you like one. United has every right to say to Roberts, "You didn't actually do anything harmful, but you're a dick, so stay off our airplanes."

You can be a security researcher, but in the immortal, wise words of @wilw: Don't be a dick.

Friday, April 24, 2015

Achievement unlocked?

This week was Hell Week for analysts, otherwise known as Meet All The People, Inspect All The Things, otherwise known as the RSA Conference. Everything was going as expected: I made it through all the speaking engagements (at least one a day this time), spent a little time on the expo floor making a video with the awesome @j4vv4d, did the press interviews, and kissed all the hands and shook all the babies in 30-minute meeting slots.

I was heading over to the Security Bloggers' Meetup, wearing some really spectacular (if you'll pardon the pun) blinking-LED sunglasses that Javvad had given me, and I decided to leave them on for the short walk across the street to Jillian's; I figured they would look good in the dark bar.

All of a sudden, some male conference-goer walks by me, and in passing, he tells me, "There's a switch on the earpiece of the glasses, probably on the right, and you can turn them off that way so they won't run down the battery."

WTaF. Is this guy really mansplaining to me HOW TO OPERATE MY OWN SUNGLASSES?

Yes. Yes, he was.

Now, this is only the most harmless of micro-aggressions compared to what other women go through ("I want to talk to an engineer, not a booth lady"), but what most people don't understand is why we don't take people's heads off at the time. It's simple: you're so stunned, you don't think of the right words until much later. Imagine someone comes up to you out of the blue and says, "Hey buddy, you're wearing socks, we're going to have to ask you to leave." Completely on automatic, you might say, "Oh, okay, sorry about that," and start moving before the rest of your brain finishes processing the "What?" And many of us are trained to be polite first and foremost, so it's a reflex that has to be overcome.

So I said to the guy, "THANK YOU FOR EXPLAINING THAT TO ME. I WOULD NEVER HAVE FIGURED IT OUT BY MYSELF."  (Blogger doesn't have a sarcasm font, but imagine my saying it in one.) And now I'm sure that this Derpasaurus Rex took that completely seriously and thought I was really thanking him. So I should have done better, but it did take a few more minutes for the incredulity to drain away, and then it was too late.

What causes this level of pea-brained sexism to happen? I don't normally encounter it, or at least not so that I'd notice. I'm neither young nor pretty, but I was wearing a skirt at the time, which I don't normally do. What thought process goes on to make someone decide that a middle-aged mother of two, minding her own business, urgently needs sunglasses instructions?

The best I can come up with is this: the guy was truly bothered by the sight of someone wearing blinking sunglasses (on top of the head) in daylight.

"That's wasteful. Oh, it's a woman. She must not know how to turn them off."

And it would never have occurred to him to go through the same thought process if it had been a man. He would have assumed the man had a good reason for leaving them turned on, and it might still have bothered him in some Derpy Engineer Syndrome fashion, but he would have let it go.

Anyway, that was the one surreal moment from the conference this week. I think I'll put away the skirt for next year.


Tuesday, January 27, 2015

Looking logically at legislation.

There's a lot of fuss around the recent White House proposal to amend the Computer Fraud and Abuse Act, and some level-headed analysis of it. There's also a lot of defensive and emotional reaction to it ("ZOMG we're going to be illegal!").

First of all, everyone take a deep breath. The reason why proposed changes are made public is to invite comment. This is a really good time to step up and give constructive feedback, not just say how much it sucks (although a large enough uproar will be taken into account anyway). Try assuming that nobody is "out to get you" -- assume that they're just trying to do the right thing, as you would want them to do for you. Put yourself in their shoes: if you had to figure out how to protect citizens and infrastructure against criminal "cyber" activity, and do it legally, how would you do it?

There's another really important point here, beyond "if you don't like it, suggest something more reasonable." Jen Ellis talks about the challenge of doing just that in her great post. And I agree with Jen that an intent-based approach may be the most likely avenue to pursue, although proving intent can be difficult. I'm looking forward to seeing concrete suggestions from others. As I've pointed out before, writing robust legislation or administrative rules is a lot like writing secure code: you have to check for all the use and abuse cases, plan for future additions, and make it all stand on top of legacy code that has been around for decades and isn't likely to change. We have plenty of security people who should be able to do this.
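
To make the analogy concrete, here's a minimal sketch of what "checking the use and abuse cases" looks like in code: the legitimate case and the hostile ones enumerated side by side, which is exactly the discipline the legislation needs. Every rule and name below is hypothetical.

```python
import re

# A toy validator written the way the post says legislation should be
# written: the intended use case and the known abuse cases are handled
# explicitly, side by side. All rules here are hypothetical.

HANDLE = re.compile(r"[a-z][a-z0-9_]{2,31}")   # use case: short lowercase handle
RESERVED = {"admin", "root", "system"}          # abuse case: impersonating operators

def validate_username(name: str) -> bool:
    """Accept the intended use case; reject the abuse cases explicitly."""
    if not HANDLE.fullmatch(name):
        return False
    if name in RESERVED:
        return False
    return True

# Use and abuse cases checked together, like test articles in a statute:
assert validate_username("good_enough_42")      # intended use
assert not validate_username("root")            # privilege impersonation
assert not validate_username("a; DROP TABLE")   # injection attempt
assert not validate_username("")                # degenerate input
```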

If they can't -- if there's no way to distinguish between security researchers and criminals in a way that allows us to prosecute the latter without hurting the former -- then maybe that's a sign that some people should rethink their vocations. (It also explains why society at large can't tell the difference, and doesn't like security researchers.) After a certain point, it's irrational to insist on your right to take actions just like a criminal, force other people to figure out the difference, and not suffer any consequences. If you want to continue to do what you're doing, step up and help solve the real problem.