Big Data In, Big Threat Out

Doctor Lydia Robles was confused to the point of immobilization. She kept going over the results again and again, and all the data pointed to an outbreak. A serious outbreak. An outbreak that had the potential to become a pandemic. When she made the recommendation to relocate critical medical resources and supplement vaccine stockpiles in the New York first responders' command centers, she had no idea that DHS would stick its nose into it and decide to move the entire southwest vaccine stockpile to New York.

But someone did.

She didn’t think her work was important enough to attract the attention of any surveillance programs, but based on what she’d heard from the Snowden leaks, she guessed that the bar for what got watched by the NSA wasn’t very high anymore. Someone must have looked “over her shoulder” and decided that it was worse than she had initially thought. But still, someone should have talked to her. There was something not quite right about it. Something that a big data analysis couldn’t see, but something that her human intuition told her was just a bit off.

With a precision that rivaled an Army Ranger strike force, as soon as the trucks were unloaded in New York, the pathogen started to make itself felt in San Diego. The only correlating event that they could trace so far was that many of the people clogging the emergency rooms and morgues had been at a Chargers game the day before. Estimates said that it would take 18 hours just to get the vaccine stockpile back to that part of the country and another 12 hours to get into position to administer it. This meant that they were already two days behind a virus that made HPAIV look tame. That was translating into another problem: where to put the bodies.

This thing was spreading fast. As fast as the model said it was supposed to spread in New York. But the data in New York was clearly wrong. Someone had salted the New York data in order to make the CDC think there was a problem. This was intentional and it was an attack. But how and who? And even more curious, why?

Whoever they were, they had significant resources and the patience of Job. Lydia wondered how many computers had been secretly hacked to create the thousands upon thousands of web accesses and tweets over the past few months that were required to influence her analysis.

One thing was sure: Big Data turned a corner this week. A very dangerous and deadly corner. When she crawled out of bed this morning, she had no idea that her passion for data analysis would be the cause of so much misery and death. Garbage in, garbage out was the old rule. Not anymore! Now, garbage in, Big Garbage out! Lydia knew that her job would never be the same again. Big Data as a weapon. Shit. She was going to have to figure out a way to add a pedigree to her data in the future.

That is, assuming there was a future and she still had a job in it after this was all over. Or worse, if she was even still alive by then.

So I ask you, what went wrong here? Sometimes failures aren’t technical in nature. Yes, the technology can seem to fail us, as it might have here. But even the best technology can’t save us from poor reasoning and sloppy processes. Sometimes the assumptions we make are at the root of the failure we’re trying to analyze. And sometimes, as was the case here, the very behavior of the underlying technology can be used as a delivery method. The tech was just a bit player here. Also consider that the behavior of the people, and the processes they employed, were targeted in very subtle ways.

And finally, what could have been done differently?


A New Way to Relate

Although the subject of information security is a very serious one, sometimes we take ourselves a bit too seriously. When I wrote my first book and put it under a publisher’s nose, she said that she loved it! To quote her, “I love your writing style! It’s engaging, fun, and easy to follow. Too bad we wouldn’t be able to sell a single book.”

She was referring to the fact that I was aiming the book at the executive layer. My intent was to write a book that made security concepts easy for executives to place within the context of business requirements. I was trying to make it easier for them to understand what it was we were trying to do and why it was important so they could make better decisions.

I was also trying to make it easier for them to spot a “poser” so as to reduce the amount of bullshit that was flooding into our solution space. Instead of waiting for the next disaster, I was hoping I could get them ahead of the curve. Unfortunately, it seemed that Robert Ludlum and Tom Clancy were a tad more popular than your garden-variety security geek.

So be it.

With that bit of advice tucked in a warm dark place, I wrote Endpoint Security. (A fantastic read by the way!) Alas, I’m still waiting for the groupies.

So, instead of droning on about solutions, vulnerabilities, risk, and how bad things are, I’m going to try a different path. I’m still going to ask hard questions, but I’m going to do it by telling some stories about how these things will affect our lives, and possibly our futures. I’m going to put myself in the place of the people who will be affected by our failures and tell their stories: how it impacted their lives and the people around them.

In short, I’m trying to get people to relate. We can build a better solution if everyone can see the possible future outcomes of not caring about it now.

So I give you our first story, about how big data can go very, very wrong.