Episode 06: Security & Things

Security and IoT: Ten Thousand Feet Podcast Episode 6

In this episode:

There are inherent security risks as we invite ‘things’ onto our networks and into our homes.

As a business, we’re charged with managing the devices on our network and making sure they only have access to the right data at the right time for the right reasons.

But as consumers, we are often fueled by excitement, going through installation steps and loving the new way we can set our temperature, unlock our doors and automate our homes. We may not pay attention to the recommended two-factor authentication or the buried privacy setting that protects us from hackers and exposing our private lives to these devices.

IoT experts Jim VanderMey of Vervint and Alex Jantz of AWS discuss where the security responsibility lies: with the business or with the consumer.

For more information on IoT security, here is the book Jim recommends at the end of the episode:

Cyber-Assurance for the Internet of Things by Tyson T. Brooks

Enjoy!

This episode is sponsored by:

Dell EMC

This podcast content was created prior to our rebrand and may contain references to our previous name (OST) and brand elements. Although our brand has changed, the information shared continues to be relevant and valuable.


Episode Transcript

Lizzie Williams: Hey, everybody, on this episode of Ten Thousand Feet, we are joined by Jim VanderMey, our co-founder and Chief Innovation Officer here at OST, as well as Alex Jantz, who's a solution architect on our connected products team. We are gonna be spending the episode talking about the importance of securing connected products: the responsibility of the user and the responsibility of the product development team. So, let's get into it.

Today, we’re gonna be talking about security. I think specifically in products, is that right?  

Alex: Yup, that’s where we want to go to today.   

Lizzie: That's where your head's at.


Jim: That's what we're about.

Lizzie: Okay. Well, Alex, you kinda gave us a little example earlier today. Why don't you kick us off and tell us what's on your mind in regards to product security?

Alex: Yeah. I read a news article about a situation where someone's Nest camera had been hacked by a white hat hacker. It was this anonymous group, I think in Calgary, that had hacked the camera and was waiting for the person to walk by so the hacker could use the audio on the camera to announce his presence.

Lizzie: Oh gosh.   

Alex: And this guy ended up recording his interaction with the hacker. It was really interesting, right? So, he's talking with the hacker: "Who are you?" And the guy is like, "Yeah, I'm a white hat hacker and I got into your system by using some exposed passwords." So the guy asks, "Well, what have you seen?" And he's like, "Well, everything that your Nest camera can see and hear, I can see and hear." So he goes, "How many of these have you hacked?" He's like, "Oh, about 500."

Lizzie: Oh my gosh.   

Alex: And then the guy goes on, "Well, I'm a realtor and I give these out as gifts when I sell a house; I give one to the people I sell the house to." So, he's kinda concerned for his own brand, right?

Lizzie: Right, absolutely.  

Alex: So, he's concerned about all of these that he has given out. And Google responded to this situation and said, well, this particular realtor hadn't turned on two-factor authentication, so he wasn't following their intended security practices, and so it was kind of on him that he had gotten hacked. But so many have been hacked at this point that it's sort of a brand or reputation issue. So it comes back to who has ownership of these things and what happens when you put IoT into your products, and vice versa: what happens when you put an IoT product in your house? Those are areas the story really dives into. It's pretty cool.

Jim: Well, and there are other examples. The story that you recounted was broadcast on NPR yesterday, but I remember a story from about a year ago where one of the robotic vacuums got hacked and someone was able to control the camera on it in a similar fashion. And as a result, they could not only have a camera active in your home, but they could move it to a location in your home and have it behave like it's just your robotic vacuum.

Alex: That’s creepy.   

Lizzie: Yeah, very creepy.   

Jim: So, that creepy factor then comes in. And I think that does get to the heart of the security issue, because in the Nest example they didn't turn on two-factor authentication. They used passwords that were common and of low strength. Whose responsibility is it to secure the home Internet of Things? Is it you as a consumer, or is it up to the manufacturer to create something so secure that it forces you into using good security practices? You mentioned two-factor authentication; what is two-factor authentication?

Alex: It’s something you know and then something that you’ve been given or something you are, right? So, the two-factor in most cases would be you get a text message or some email link that allows you to validate not only that you know your password but then a secondary link of communication that you would not necessarily know in the moment.   
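For listeners who want to see what that second factor looks like in code, here is a minimal sketch of the time-based one-time password (TOTP) scheme that most authenticator apps use, written in Python with only the standard library. The demo secret and the one-step drift window are illustrative assumptions, not details from the episode or from any particular product.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at=None, digits=6, step=30):
    """Compute a time-based one-time password (RFC 6238, SHA-1, 30-second steps)."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if at is None else at) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

def verify(secret_b32, submitted, step=30, drift_steps=1):
    """Accept the code for the current time step, tolerating a little clock drift."""
    now = time.time()
    return any(
        hmac.compare_digest(totp(secret_b32, at=now + i * step), submitted)
        for i in range(-drift_steps, drift_steps + 1)
    )

# Illustrative only: a real product would store a per-user secret server-side.
DEMO_SECRET = "JBSWY3DPEHPK3PXP"
print(totp(DEMO_SECRET))                       # the code an authenticator app would show right now
print(verify(DEMO_SECRET, totp(DEMO_SECRET)))  # True
```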

Jim: Right. So, having access to your phone, having access to a physical device. For example, if you had to use an ID badge plus a PIN; an ATM card plus a PIN is an example of two-factor authentication. So, what do you think is the responsibility for an organization that's making a product in this space as it takes on security as a domain expertise?

Alex: So if you were to kind of flip that and ask what the implications are of skirting the deep dive on security itself: it would be brand implications, questions about whether or not you wanna own the product, and then whether that's an attack vector for a larger infiltration of your network, right? So, it is sort of this Trojan horse: once you've hacked the weakest link, are you on their network? And if you are, what else can you get access to?

Jim: Right. And I recall one of our customers saying early in the product development process, "Our first goal is to not be on the front page of the Wall Street Journal because our product was used in a large-scale bot attack on the web," right? And so security was designed into the product from the ground up.

Alex: So, what does that look like from the ground up? You know you could put your crypto hat on and really kinda get into the deep dive of it. But at the multiple layers of it, what do you see as like considerations from product and tech there?  

Jim: Oh, that's a great question, because we have so many constituencies that we're serving at the same time. Nest is trying to create a home ecosystem, for example, but it's an ecosystem that's made up of thermostats, and they've given out access to those thermostats. Utility companies have been delegated authority so they can move the thermostat up and down depending on energy demand. In Southern California there's a partnership between Nest and some of the utility companies, and in other parts of the country as well, where the utility company can turn your air conditioning down on peak days in the summer when they know you're not home. So now you have an issue of what we call delegated authority: the manufacturer has access to the data because they have to have it, so they can give you that great application on the iPad or on your phone to see what your energy consumption is; the utility has access; and you have ultimate control when you're standing in front of your thermostat or your device. So, when you're trying to create this complex ecosystem of interrelationships, how do you deal with security? And how do you as a consumer know that you have ultimate control over what is and is not shared? Even with yesterday's examples of the Facebook data sharing, consumers don't know how their data is being used, and that leads to issues of trust.

Alex: So, it comes back to a feeling, right? Like it gets surfaced. It hits a peak and it gets surfaced and then there’s sort of this moment where people feel rather than like — it’s not technical in that moment, right?   

Jim: Right.   

Alex: Something technical led up to that moment but in that moment, it’s kind of out of the hands of tech. It’s now ultimately back to brand and recognition and emotion of what we feel about this product whether it’s secure or not, right?  

Jim: Yeah, I think the last line in the NPR interview was that this real estate agent wasn't going to give out Nests anymore; he actually disabled his camera and he's gonna count on his dog for his security, because he didn't trust the device any longer for what its core value proposition was. This thing that I bought to make me feel secure is suddenly making me feel insecure.

Alex: That's actually a really good point. So, the relationship with the product was inverted, right? You said it best: it was promoted to be a security blanket in some ways, and it ends up being the thing that actually makes you feel least secure.

Jim: Yeah, when you have a stranger speaking to you from your security camera, that’s a really creepy moment. That’s a really creepy moment, and it’s a loss of control. But it’s not only a loss of control but it’s a loss of control for things I didn’t even know were a risk.   

Alex: So, let's break down the Nest camera for a moment as a product. There are attack vectors all through the ecosystem there, but from a feature enablement standpoint, it's a camera, and a camera can be this one-way tool where you are able to see what's happening. But they turned it into a two-way system with a speaker in it so that you can announce to the people outside; I think the intent is that you can say who's at the door, that sort of thing. That creates a new kind of vector, or a new reason to use that product, right? On the flip side, it was also something this hacker used to announce his presence. So, back to that: what features are we enabling, how are we locking those things down, how do you get access to this one-way view of the lens, and how are we tracking access to that product? Google could probably have seen if somebody from a new, external IP address had been accessing this camera outside of a normal routine, right? The same if somebody from that external IP starts announcing or listening for information on that camera. So two-factor authentication probably could have helped prevent this from happening, but there were also features Nest could have added to that product that would have alleviated or helped surface that sort of malicious activity.
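As a rough illustration of the access tracking Alex is describing, here is a minimal sketch that flags sessions coming from an IP address a device has not seen recently while performing a sensitive action. The event fields, the 30-day window, and the list of sensitive actions are assumptions made up for this example; it is not a description of how Nest or Google actually monitor access.

```python
from collections import defaultdict
from datetime import datetime, timedelta

WINDOW = timedelta(days=30)  # assumed lookback window for "normal" access

# device_id -> {source_ip -> last time that IP accessed the device}
seen = defaultdict(dict)

def record_and_flag(device_id, source_ip, action, when=None):
    """Return True if this access looks anomalous: a source IP with no recent
    history for this device, performing a sensitive action such as streaming
    audio or speaking through the camera."""
    when = when or datetime.utcnow()
    history = seen[device_id]
    last_seen = history.get(source_ip)
    history[source_ip] = when

    is_new_ip = last_seen is None or (when - last_seen) > WINDOW
    is_sensitive = action in {"speak", "listen", "stream"}
    return is_new_ip and is_sensitive

# Usage: the owner's usual address vs. an unfamiliar one using the speaker.
print(record_and_flag("cam-123", "203.0.113.7", "stream"))   # True: first sight of this IP
print(record_and_flag("cam-123", "203.0.113.7", "stream"))   # False: now within the window
print(record_and_flag("cam-123", "198.51.100.9", "speak"))   # True: new IP using the speaker
```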

Jim: Yes, but sometimes security works against ease of use, because people like an elegant, easy-to-use product. And if security puts a moat and a drawbridge and a wall between me and my intended use of the product every single time I interact with it, it loses some of its desirability. It's not easy. And when we're designing connected products, we're oftentimes thinking about that ease of use as one of the highest-order values.

Alex: So, this probably isn’t a fair question, but is security a trade-off?  

Jim: I think security is a trade-off. But it's a trade-off that we make as product designers and builders on behalf of the consumers. That's not a trade-off that the consumer is making for themselves. So, how do we push that decision to the forefront in our design activities so that consumers feel they're in control and also feel that they understand the trade-off they need to make? I'm willfully choosing not to use two-factor authentication; are you exposing to me, then, that I'm potentially putting my device at risk of being hacked by a third party?

Alex: Then it becomes a design solution, right?   

Jim: Yes.   

Alex: So, if you choose to make that compromise as a consumer, the design could surface it in such a way that you’ve made a decision that this is a trade-off, right, as the consumer?  

Jim: Yes.   

Alex: That kind of puts it back out of the core of brand, right?   

Jim: Yeah. I think it's interesting, though, that we force this sophistication onto the consumer: the level of education, the level of understanding, the level of intuitive knowledge about how technology works now requires a very informed consumer to be able to use these products, and that means you're limiting yourself to a hobbyist market in some regards. It's not necessarily gonna get broad market acceptance. Imagine if TVs had security questions on them before you could use them, exposing the fact that, oh, by the way, all your data about your Netflix viewing and the channels you watch is out there, and that the camera on your TV that you use for FaceTiming your grandchild could potentially be hacked by a third party. How do I expose those kinds of decisions? I'm not sure we've developed that sophistication in our consumers yet to be able to do that.

Alex: Yeah. I think there's been a drive toward feature enablement, and now we're sort of paying the price for some of these enabled features that have been on the market for some time but are probably playing catch-up on whether they are secure or not.

Jim: Remember a few years ago when Wired magazine hacked a vehicle on the road?

Alex: Yeah.   

Jim: And the attack vector for that was the IP communication used within the car and the ability to connect to the car remotely, and they managed to hack that. It was fixed with a firmware update on the car, but, yeah, how would you feel if all of a sudden someone took over the vehicle you're driving?

Alex: So, if you were to flip that back — let’s do a deep dive. So, you did a firmware update on your car–  

Jim: Yes, I did.  

Alex: It’s a Grand Cherokee?  

Jim: Yes, a Jeep Grand Cherokee.  

Alex: And we were sort of like overlaying that with the idea that Tesla does these overnight updates that sort of roll out and people are so excited about, right? Like they just happen and all of a sudden you’ve got a new feature. I think the one right now is romance mode.   

Lizzie: Oh, what’s that?   

Alex: Yeah, go check it out.   

Lizzie: I don’t have a Tesla.   

Jim: Another one is the Santa Claus mode. So, that’s another one that’s out right now as a seasonal.   

Alex: So, what was your experience on the Grand Cherokee firmware update?   

Jim: Oh, my word. First of all, I just had my car in for service and one of the things they did was a software update on the car, because it was part of a factory recall for the software. But the first time I had a software update, I actually got the USB stick to insert and I did it manually myself, just to see if I could, in tech terms, brick my car. So, I was pulling the USB stick out in the middle of the firmware update. I was turning the car around. I was trying all these different things to see what would happen, and it took a while. It was a fairly intimidating process to do a firmware update on my car. But I was interested in what the user experience was in that moment. And that idea that if I'm entering the space where I'm working with smart devices, now I have to be thinking about how the software is updated. And I'm wondering if there's gonna end up being space in the home ecosystem for a whole new class of security services around the detection of threats inside your home from a third-party provider. Think about how we have security services companies that put alarms in place: you turn your alarm on, they monitor it continuously, and they notify you via a phone call or text message that an alert has just gone off at your home. Is this okay? Is this not okay? We're not yet at the point where there are security services like that for your home network, but I'm wondering if we're gonna start seeing that kind of response, because the consumer can't secure all of these devices. There's nothing like the Underwriters Laboratories seal of approval, the one that says these things are safe to plug into physical electrical outlets, for the security of connected devices.

Alex: And what is the recall path in the case of a compromise, right?  

Jim: Yeah, and so we’re gonna probably see a new class of consumer services that will arise for those that don’t want to manage it themselves.  

Alex: Oh, that's really cool. So, you start to think about extending brand, right? Take a security company like ADT, right?

Jim: Yeah.  

Alex: And if they were to move laterally and say, we can secure your home network: you put a router in and tell us what devices you have installed, and we'll go ahead and monitor for intrusion. They could use that story we kicked off with, about somebody intruding on the house by way of a smart device, but they're still in your house. ADT could say, we're preventing physical and virtual access.

Jim: Yeah, or we're preventing the compromising of electronic locks. Some researchers at the University of Michigan a few years ago showed that they could hack the interaction between a smartphone and an electronic lock; they were actually able to load fake PINs into the lock, enabling codes to open it that the owner never validated. So they were basically able, like an electronic locksmith, to duplicate your key, and you didn't even know your key had been duplicated. So, you start looking at these attack vectors, and when you start sharing these stories, one of the immediate reactions is, "I don't want any of these things in my house." "I want to go back to a physical key." "I wanna go back to a dog instead of an alarm system." "I wanna go back to a disconnected engagement with my thermostat." But then we start asking, well, what is the benefit versus the cost? We all have to go through this cost-benefit analysis and see whether we're willing to accept it. But we haven't surfaced any of that from a brand standpoint. We don't explain that well to our consumers.

Alex: That's a really cool way to spell it out, right? There's going to be sort of a spectrum of comfort level based on how you feel about the cost-benefit of each product you're picking up, but it's probably not something you're really explicitly thinking about. It's the next cool thing in some ways.

Jim: Yeah.   

Alex: But that's the seat I'm in, right, as a tech evangelist. I suppose if I were in a different world, I would maybe be more skeptical of these IoT devices.

Jim: Do you make that decision when you’re buying something? Do you think about security?   

Alex: Oh, no. Nope. It’s pure excitement when we buy.   

Jim: Alright. So, just think about that: as knowledgeable and savvy as you are, you don't think about security.

Alex: Yes, I think as a consumer, we've been trained that there's this feature, it's got excitement, it's something we wanna try out, right? Even if it just moves the notch up a little bit, right? I set my thermostat, and it could've been a programmable thermostat, but it's a smart thermostat. It's essentially the same thermostat for my use case; somebody else is getting a lot more data out of it. And so, I think on the spectrum, somebody is probably getting more than their fair share out of my smart thermostat, but when I picked it up, I was just ecstatic that I could use something like that.

Jim: Absolutely, it was cool. I mean, the user interface is beautiful, and our design team had some involvement in one of the iterations on the Nest product family. It is a fantastic consumer-oriented device and it's a beautiful device. I like the way it looks on the wall better than the old thermostat.

Alex: Right.   

Jim: But there's excitement over the features, and our implicit trust of the brand assumes that they took care of the security. I think that's a really important lesson: we believe that if we bought it from a credible brand, whether an iRobot vacuum, a Whirlpool appliance, or a Nest security system, and especially when the core value is around security in the home, that trusted brand took care of security.

Lizzie: That’s really interesting because I recently got a smart thermostat and I had asked some of our colleagues like what brand should I get? Some people gave me some off-brand ones that they’re like, hey, this does the exact same thing. It’s whatever X brand. I’m like, never heard of it, don’t trust it. I’m like I’m either going Nest or Ecobee because those are two that I knew were in a lot of homes and so I did inherently trust them more even though I’m sure some of the other products were just as safe and secure or not, right?   

Jim: And Honeywell would be a good third choice. If you look at Consumer Reports, if you look at Underwriters Laboratories, there has not yet been this consumer-facing discussion about how secure the products I'm buying are. We used to say, for example with a lock, you could tell based upon the construction and the cost. The sophistication of the lock, the strength of the plate: you knew the difference between a cheap lock and a good lock. You could pick it up at Home Depot and you could just tell from the heft. There's nothing intuitive that tells you that one device is secure and another is not. For the people who have listened this long to the podcast and are technical in orientation and like discussions of higher-order logic, which is probably a pretty narrow audience.

[laughter]  

Jim: Tyson Brooks edited the book Cyber-Assurance for the Internet of Things, and this is definitely not a consumer-oriented book. But he, along with a set of researchers at Syracuse University, took the principles of security by design that were used by the Air Force in developing combat drones and applied them to consumer devices. They used the thermostat as their test case throughout the entire book. That's a really solid book for an engineer who wants to understand how to design security from the ground up. And when you think about systems that you really want to be secure, I can't think of anything that would have a higher degree of sensitivity than an Air Force combat drone. This information has only become available in the last couple of years; the book was published within the last 18 months. But there's some really strong literature out there that can help us as engineers design products that are secure, because I think that putting the onus on the consumer is not fair.

Alex: Yeah, and I guess kinda circling back, when we help the consumer make choices about setting up accounts, it should be worked through as a design activity, right? So, when you're setting up your account: next, next, next, next, done, right? That's sort of the hope for the consumer, trying to make it as seamless as possible. If two-factor authentication is really important to your brand, you're gonna wanna figure out how you design that back in, how you prompt for it again. Even if it's not set up on the initial round, the hope is that you can–

Jim: Remind them–  

Alex: –remind them or feed that back through.   

Jim: So, in closing, I think it's maybe three-fold. We have to design products to be secure for the life cycle of the product, not just from the day it's installed. We have to raise the issues of security to our end consumer so they're making conscious and informed choices at an appropriate level, which is an issue of good design. And we as consumers need to be thoughtful about security as a purchasing criterion, even when the industry isn't mature enough to present that criterion to us in a really efficacious fashion. So, it's an interesting conversation that we ran into. I think we probably have a part two of this conversation coming up.

Lizzie: Yeah. I think we learned a lot about security. I love your final three points, Jim. I think those will hit the nail on the head for a lot of our listeners. And if you do have interest in this, the book that Jim mentioned was Cyber-Assurance for the Internet of Things. You said that's definitely for a technical audience though, right?

Jim: Yes, I triple [inaudible] it’s not like–  

Lizzie: [chuckles] Nice. Alex, any final things that you think as a consumer we need to be thinking about?  

Alex: I think Jim summed it up well.   

Lizzie: Awesome. Well, thank you, Jim and Alex, so much for coming and sharing some information and stories about product security. I think it will make me think through some things differently as I go home today and check on my Nest.   

[laughter]  

Jim: Lizzie, don’t use the default password.   

Lizzie: Oh, so password is not a good password, or — okay. Got it. Well, thank you both for joining us. Thank you, listeners, for tuning in. Have a good day.   

OST. Changing how the world connects, together. For more information, go to ostusa.com/podcast.