Automated Voice: It's Monday, October 1st, 2007 and you're watching ARCast.TV.
Ron Jacobs: Welcome back to ARCast.TV from Slovenia. That's right; today we're once again going back to that little corner of Europe. The beautiful little country of Slovenia with some very smart guys. We had a nice little chat. You know those kinds of chats you have at a conference when you're sitting around at lunch or in the lounge? You start talking "tech". It's really geeky talk.

On this episode we were talking about security and all kinds of interesting things like root kits: how do you think about securing at an application level or at a network level? What kinds of things might you want to do? Things to think about.

It's a very casual chat. It was great fun, only this one was done in front of an audience on the stage and we recorded the whole thing. That's what we do on ARCast.TV, just record interesting conversations.

Today we're going back to Slovenia and Portorož. Let's listen.

[switches to recording]

[applause]
Ron: Thank you. Thank you. Welcome to ARCast live from Slovenia. This is great to see such a group here. This is my first time in Slovenia and I'm delighted. I didn't even know you guys were here. This is a great place. I've got to get the name of this place right, is it Porte Rouge?
Micah Piclair: Portorož.
Ron: OK, so I've got to get that.
Micah: You're close.
Ron: I'm working on it. Trying to get my Anglo pronunciation out of the way.

I'm delighted to be joined today by three guests. I'm going to let you guys go down and introduce yourselves. Tell us your name, where you're from, and what you do. Let's start here.
Micah: My name is Micah Piclair. I'm a security MVP. I mostly work as a security consultant on Microsoft networks, for customers that mostly use Microsoft in their environments. I'm also a Slovenian user group lead. That takes most of my time.
Ron: OK, great. Let's go on. Micah, you were talking about security. I did a session on security today also.
Micah: What did you do? What did you talk about?
Ron: I did one that was at an application level, about some of the things people who are architecting and designing applications need to think about. We looked at some principles and process-oriented things, like how you do a threat model and that sort of stuff. What were you talking about?
Micah: One of my favorite sessions this year was when I was talking about root kits. It was a really fun session to do.
Ron: Root kits, OK.
Micah: That's always fun.
Ron: Yes, those are scary.

[laughter]
Ron: For those people who don't know, what is a root kit?
Micah: A root kit is a piece of software that hides on your system, and it hides so well that it's usually really hard to detect. It can steal your information and send it to someone, usually the author of the root kit or someone who ordered it. You can actually order them on the Internet.
Ron: That's just sick.

[laughter]
Ron: The idea is that if I'm writing a root kit, what I'm going to do is hook the OS at a very low level in the kernel.
Micah: As low as possible.
Ron: Then I'm going to hide myself, so if you try to list all the processes, I'm not going to show up in that list. I might even hide my presence on the disk, so if you look in the directory, you're not going to see a file that's like "My Root Kit" or anything like that, right? You're going to hide in all those ways.
Micah: As much as possible, in more than one way. You can hide processes, hide registry values, and hide yourself from, like you said, listings of the file system; TCP ports, UDP ports, any communication. You can assign your root kit its own MAC address or its own IP address, and all of that can be totally hidden.
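[Editor's note: as a rough, purely illustrative sketch of the hiding Micah describes, the toy Python below mimics what a root-kit-hooked process-enumeration call does: return everything except the entries the attacker wants hidden. The process table and names are invented; this is not real root kit or kernel code.]

```python
# Toy illustration only: a real root kit hooks a low-level OS API and filters
# its own entries out of the results before they reach tools like Task Manager.
# The same idea, shown with a fake in-memory process table.

REAL_PROCESSES = [
    {"pid": 4, "name": "System"},
    {"pid": 812, "name": "explorer.exe"},
    {"pid": 1337, "name": "evil_rootkit.exe"},  # the entry the attacker wants hidden
]

def list_processes_unhooked():
    """What an honest enumeration API returns."""
    return list(REAL_PROCESSES)

def list_processes_hooked(hidden_names=frozenset({"evil_rootkit.exe"})):
    """What the same call returns once a root kit filters the results."""
    return [p for p in REAL_PROCESSES if p["name"] not in hidden_names]

if __name__ == "__main__":
    print("Normally visible:  ", [p["name"] for p in list_processes_unhooked()])
    print("With hook in place:", [p["name"] for p in list_processes_hooked()])
```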
Ron: Wow. So you know, this is so frightening because people start to realize: "I can have one of these in my system right now and not even know it."
Micah: My opinion is that we are not aware enough of these threats, and we maybe underestimate them most of the time, especially in our home environments.
Ron: Yeah.
Micah: And we will install just about anything on them, whatever we need at the time, especially if we are in a hurry. It's hard to protect yourself from this. You have to use defense in depth; you have to protect yourself in multiple layers if you want to be at least a bit secure from threats like root kits.
Man 2: A frightening thing is that you can actually buy a legal product and get a root kit with it.
Micah: Software or hardware.
Man 2: Sony did.
Ron: Yeah, that is a famous one.
Micah: Sony or Symantec.
Ron: Yeah, Sony put one on some [interrupting each other] CDs as a way to catch people who were swapping music around. They got very bad press for that.
Man 2: It was frightening.
Micah: Symantec did a very similar thing. It was not as widely publicized, but it did the same thing. They were warned about it and they stopped hiding stuff on the hard drive.
Ron: Wow.
Micah: You can actually get infected by using legal equipment. Let's say Apple sent out to the market iPods infected with a virus; it could be just the same with a root kit.
Ron: Yes, that's pretty frightening. Now in my session about security I told what has been a very interesting story in the U.S. I don't know if you guys have heard much about it over here, but there was this company, a retail chain called TJ Maxx, that had its network compromised by hackers who were using wireless equipment.

So they actually just went near a store and they were able to get the WiFi signal, and the store was just using WEP encryption. I didn't realize this, but WEP encryption apparently is pretty weak. There are tools that can break WEP within about a minute.

They used one of these tools and broke the encryption and then started sniffing packets on this store network. So that was one layer of vulnerability, but then the applications that the store network used were passing usernames and passwords in clear text on the wire, so now that they were sniffing packets they had a username and password.

Now they were able to get into the database and start stealing credit card numbers. There were over 40 million transactions stolen. Now analysts are saying that these may cost as much as $50 per stolen record in the long run, and that's millions of dollars that they are liable for.
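[Editor's note: a minimal sketch of fixing one layer of the story Ron just told: never send credentials in clear text, so that even if the Wi-Fi encryption underneath is broken, a sniffer sees only ciphertext. The host, port, and login format below are hypothetical; the only point is that the socket is wrapped in TLS before anything sensitive goes on the wire.]

```python
import socket
import ssl

def send_credentials(host: str, port: int, username: str, password: str) -> None:
    # create_default_context() verifies the server's certificate and host name.
    context = ssl.create_default_context()
    with socket.create_connection((host, port)) as raw:
        with context.wrap_socket(raw, server_hostname=host) as tls:
            # Everything written here is encrypted on the wire.
            tls.sendall(f"LOGIN {username} {password}\n".encode())

# Hypothetical usage:
# send_credentials("pos.example.internal", 8443, "store42", "s3cret")
```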
Man 2: They don't even know how many were stolen, right?
Micah: And how long this was going on?
Ron: Yeah, one thing they know for sure is that a lot of the records that were stolen resulted in people using those credit card numbers, making bad charges, and now the banks want to sue TJ Maxx to recover all this. So, it's pretty shocking when you think a lot of people have said: "Oh, our network is secure. We have a firewall."
Micah: Can you prove it?

Everyone: [laughing]
Micah: Can you prove that your network is secure? How do you measure that?
Ron: Yeah. That's a very difficult thing to do, isn't it? So, I mean do you consult with people about securing their wireless?
Micah: I try to do as much security as possible at different levels: on the wireless, on the LAN, and on the server itself, let's say NTFS permissions; also on the wire, when we are talking about [indecipherable] encryption or IPsec, and segmenting devices. It's as in depth as possible; even physical security is important. If I have physical access to your server, I can do just about anything with it.
Ron: Yeah.
Micah: At least kick it.
Ron: Yeah. I have to tell you, I read this story about a bank whose CEO wanted to see if the bank was secure. He hired a company to do penetration testing and he didn't tell anybody, so nobody was expecting this.

The penetration testing people, what they did was they sent a woman into the bank and she pretended to be very wealthy. She said, "I want to see the bank manager, I am thinking about opening an account here." So the bank manager came and she got his business card, and she said, "I'm a very demanding customer, are you sure you can handle this? I am going to put a lot of money in your bank." He said, "Oh, yes, yes, we can take care of you. It's going to be fine."

And while she was there, she had his card, and she also noticed the copy machine that was sitting nearby, and like most modern copy machines it was connected to the network. So she leaves. Then she has her partner call up the bank. He says, "I'm from this copier company and we have noticed it is time to do scheduled maintenance on your copier, and we would like to schedule an appointment. Your manager Mr. So-and-so said it was OK and we could come in any time." So they said, "Oh, well yes, OK, come in."

So he got a shirt with the logo of the company, comes in, sets up his laptop on the copy machine, plugs into the network, starts the sniffer, gets the password, writes it down, and then puts it on a sticky note under the copy machine. Later he sends an email to the CEO saying, "Look under the copy machine." And there it was.
Micah: And you can actually steal documents as they come to the copy machine, capture those. And new copy machines have hard drives in them, so you can steal that hard drive, which can have data on it and images of your documents.
Ron: Oh, yeah.
Micah: You just swap it.
Ron: So, who would think, "We need to protect the copy machine!" [laughs] You have got to protect these things.
Micah: I want to get that hard drive.
Ron: Yeah [laughter]. Well in fact this whole issue about hard drives is becoming very troubling. I have one of these 80 GB little hard drives that you attach to your laptops, right?

What we are seeing now is happening a lot. Somebody will go into their company. They'll get an extract of a bunch of data from a database, a database that is in a nice secure server room, well protected, physically secure. They suck all the data out of it, put it on a little hard drive like that, totally unencrypted, not protected. They take it out, and it gets stolen out of their car or something like that. Then their company is on the news about millions of consumer records stolen from this little hard drive. How do you stop that?
Micah: It's hard, in my experience. I went to a customer where I had to leave all my USB devices, all my USB hard drives, at the gate. So I left everything, except I didn't have to leave my iPod there; I could take that inside.
Man 2: We had an interesting incident this year. A laptop belonging to an assistant to the president was stolen and was missing for something like three weeks. After the police got the man who supposedly stole it and got the laptop back, they just stated, "OK, no doubt, it was compromised."

[laughter]
Micah: So it's secure.
Ron: Of course they know. They know this, right? It seemed to me that you're never going to be able to stop all of the little ways in which people can bring in devices. You can get very small USB keys that you can hide very easily and bring them in. You're never going to be able to stop all of that. Is there maybe a better way to restrict the ability of people to take all of the data out of that wonderfully secure database and then put it on these little devices?
Micah: I don't think the devices are the problem. I think that user education is a problem, that's the first thing.
Ron: Yes.
Micah: Even if I can't take the data out on USB, there are other ways. Do I have access to the Internet while I'm connected? I can open a Gmail account, or whatever account, and upload two gigs of their data. I don't have any two-gig USB device, except my iPod.
Ron: Yes.
Micah: So I can do it that way. I can just mount my Gmail account space as a virtual hard drive on my computer and upload it there via SSL. So any IDS, IPS, or any other device won't even tell the difference between legitimate access to Gmail and stealing data.
Man 2: That's right.
Micah: It's really hard. USB devices aren't the only problem; so is any other way that I can steal data.
Ron: So the bottom line it comes to is this: when you have confidential data in a database, anytime you let that data out of the database, for any reason, you have to be able to restrict it to a legitimate user and audit their access, when they accessed it and what they accessed. So you can know what people are looking at and restrict what they shouldn't be able to see.
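[Editor's note: a rough sketch of the "restrict and audit" idea Ron describes: every read of confidential records goes through one chokepoint that checks authorization and records who read what and when. The role names, logger setup, and fetch_records() helper are illustrative assumptions, not any particular product's API.]

```python
import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO)
audit_log = logging.getLogger("audit")

AUTHORIZED_ROLES = {"fraud_analyst", "billing_supervisor"}

def read_card_records(user, role, customer_ids):
    # Deny by default, and leave an audit trail either way.
    if role not in AUTHORIZED_ROLES:
        audit_log.warning("DENIED: %s (%s) tried to read %d records", user, role, len(customer_ids))
        raise PermissionError("not authorized for card data")
    audit_log.info("%s (%s) read %s at %s",
                   user, role, customer_ids, datetime.now(timezone.utc).isoformat())
    return fetch_records(customer_ids)

def fetch_records(customer_ids):
    # Stand-in for the real, tightly scoped database query.
    return [{"customer_id": c, "card": "************1234"} for c in customer_ids]
```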
Micah: There should be some sort of encryption used whenever someone takes data out of the data center, no matter what form the transfer takes. Whether it's on a USB device or on a laptop, data should always be encrypted by some means.
Ron: I find that so many people have a "fortress" mentality: "We have a firewall. It's this wonderful wall around us, and inside we don't need to worry about security because we have the wall." I've talked to many, many customers, and that's their attitude. How do you talk to people about that?
Micah: I try to explain to them, especially if they have multiple sites, that I will not attack them at their main site where they invest all of their money into security with cameras or security guards. I will attack them at their smallest office in the country. Perhaps the office where there's only one person sitting, probably doing nothing.
Ron: Yes.
Micah: I'm going to go there. Steal the domain controller from that site, get access to the domain controller at that site, or connect my laptop at that site pretending to be a technician who came to service the printer or the server at that location. Networks are usually configured as one flat network. Once I'm connected at that location, I have access to just about anything, which is wrong.
Ron: Yes.
Micah: In most cases, I shouldn't have access from this remote location to all of the servers in the main office. I should only have access to the services that I actually need.
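[Editor's note: a toy illustration of the segmentation Micah is arguing for: instead of one flat network where a branch office can reach everything, access is written down as explicit source-to-service rules and anything not listed is denied. The subnets and service names are invented for the example.]

```python
from ipaddress import ip_address, ip_network

# (branch subnet, service it is allowed to reach) - everything else is denied.
ALLOW_RULES = [
    ("10.50.0.0/24", "branch-file-share"),
    ("10.50.0.0/24", "central-print-queue"),
    # Deliberately absent: ("10.50.0.0/24", "hq-domain-controller")
]

def is_allowed(source_ip: str, service: str) -> bool:
    src = ip_address(source_ip)
    return any(src in ip_network(net) and service == svc for net, svc in ALLOW_RULES)

print(is_allowed("10.50.0.17", "branch-file-share"))     # True
print(is_allowed("10.50.0.17", "hq-domain-controller"))  # False: default deny
```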
Ron: Wow! I think a lot of people just go, "That just sounds like a big headache and a lot of cost", but when you consider the cost of a single break-in, especially if you're a bank or a financial institution; something that's a regulated industry, it's very, very expensive. Far more than what it would have cost to have done it right.
Micah: Then again, we talked about stealing data on a USB drive. As long as I can walk out of any office with a server under my arm, pretending to be a serviceman, we have a huge problem. We have a bigger problem than USB drives or anything else.

Yes, cost is a problem. There is always an investment involved. Now you have to weigh it. There are a few methods of calculating how much you can invest, and how much it is actually worth, before you go into any security measures.
Ron: Yeah.
Man 3: This also actually boils down to the design perspective, right? There is no legitimate use for an application that allows anybody access to all the credit card records. Why would you have an application like that, or functionality like that inside the application? Or why would you expose a SQL Server table that includes those records to anyone? There is no legitimate architectural argument that that's a good thing.
Ron: Well, in fact you brought up a very good point. Why keep this information? I'm constantly amazed by this. In the United States, we have a thing called the social security number, which was never really intended to be a universal identifier. Personally I think that every child should get a GUID assigned to them when they are born, something unique. We should tattoo it on them or something.

But anyway, if you have my social security number and my date of birth, you can get pretty much anything that belongs to me. You can go out and apply for loans in my name. You can open a bank account in my name. It's terrible.
Man 3: Our version of the social security number actually has the date of birth incorporated.
Ron: Yeah? OK. Great.
Man 3: You don't even need the date.
Ron: So here is the thing. I go to see a doctor. In the US we don't have national health care or anything like that. You have to pay the doctor, or if you have insurance, that pays. But because of that, every doctor wants to know your financial information. They want to know that you are going to pay. When you are filling out that paperwork, they ask you for your social security number and your date of birth.

I'm talking about one piece of paper that has the two really bad pieces of information about me. Where is it sitting? It is in the file cabinet behind the receptionist's desk in the unsecured area of the doctor's office. Any random person can walk in and grab this.

First off, I stopped putting it in. I don't write that number down, and they say, "Oh, you left this number off." I am not giving you that number, OK? I am not going to tell you. Then I ask them why. Why do you want to have this information? Because actually it's a liability for you to have it.

So I think when you are architecting the system instead of saying like, "oh, people give us a credit card number when they buy something. Why don't we just keep that?" No, no, don't keep it. You shouldn't keep it because it's a big liability. What if those are stolen? The banks will come after you because you were keeping that number.
Micah: True. You can actually buy stolen numbers for about five hundred dollars apiece, and they actually work. You just have to see if it pays off for you: is it worth the five hundred dollars, and how much can you then spend on that card?
Man 3: It also boils down to the client's experience. Because all the major sites offer you the functionality of storing your credit card numbers, and it's good.
Ron: Because people buy more.
Man 3: Sure. Yeah, you have it there. You don't have to type it in every time. You just select it; they just show the stars and the correct credit card gets selected. So if you are a commerce site, or some kind of shop that operates with credit cards, what you want to do is expose the same kind of functionality to your users, because otherwise they're just going to say, "This is a terrible buying experience," or "This is a terrible experience after all." That's why most people actually store them.
Ron: If you do have to store them, SQL Server 2005 has great capabilities here; we can encrypt these things. There are, of course, many other ways you can encrypt them. Like you said, if you're going to store them, they have to be stored encrypted in such a way that not just any old employee in the company can open up that table and see all of those numbers. That would be a very, very bad thing.
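[Editor's note: the panel is talking about SQL Server 2005's built-in encryption; as a language-neutral illustration of the same idea, the sketch below encrypts card numbers in the application before they ever reach a table, so anyone browsing the rows sees only ciphertext. It uses the third-party Python "cryptography" package (Fernet); the key handling shown is deliberately simplistic, and as the next exchange points out, the real work is keeping that key away from the data.]

```python
from cryptography.fernet import Fernet

# In practice the key comes from a key vault or HSM and is never stored
# alongside the data; generating it inline here is just for the demo.
key = Fernet.generate_key()
cipher = Fernet(key)

def encrypt_card(card_number: str) -> bytes:
    return cipher.encrypt(card_number.encode())

def decrypt_card(token: bytes) -> str:
    return cipher.decrypt(token).decode()

stored = encrypt_card("4111111111111111")  # what actually lands in the table
print(stored)                              # opaque ciphertext to anyone browsing the table
print(decrypt_card(stored))                # only code holding the key can recover the number
```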
Micah: Also, the application shouldn't be designed in a way that shows you all of the numbers. Why? You don't need to see all of that.
Man 3: You still have to protect the keys for the encryption, not leave them lying around on the hard drive or anything. You still have to protect the backup tapes and everything else. That's also a problem.
Ron: That's a very good point. I was talking to somebody at breakfast today about this. A lot of times, architects and developers are thinking solely about the code, how we develop the system, and its runtime behavior. We're not thinking about what it means to operate the system in a secure way. So key management is a good example of that, as is the management of the backup data. All of these kinds of things are really important.

I completely agree, in fact, the exciting thing is, we're on the verge, within three to five years is my prediction, of being able to do credit card transactions without the numbers being involved.

With the same technology that we put in place for Windows CardSpace, they're actually working on a way where I can do a transaction with an e-commerce site. The e-commerce site knows what it needs to know about me. Then my system sends a secure message to the bank, and the bank issues an authorization code that is encrypted with the e-commerce site's public key. The site can see that the bank authorized my transaction. I can't see it, so I can't fake it. They get what they need.

They know the bank likes me, they know me, they know what I want, but they don't have my credit card number which is a really powerful solution to this problem. I can't wait until we get there because, really, I'm fed up with this whole identity theft thing.
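[Editor's note: a rough sketch of the pattern Ron is predicting, not the actual CardSpace or payment-network protocol: the bank approves a payment and seals the authorization with the merchant's public key, the customer only relays the opaque blob, and the merchant decrypts it with its private key, so no card number is exchanged and the customer cannot forge an approval. Uses the third-party Python "cryptography" package; the amount and reference number are made up.]

```python
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Merchant key pair; in reality the public half would come from the merchant's certificate.
merchant_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
merchant_public = merchant_private.public_key()

def bank_authorize(amount: str) -> bytes:
    """Bank side: approve the payment and seal the result so only the merchant can read it."""
    return merchant_public.encrypt(f"APPROVED {amount} ref=12345".encode(), OAEP)

# The customer relays this blob to the merchant without being able to read or reuse it.
blob = bank_authorize("EUR 49.90")

# Merchant side: open the bank's answer with the private key.
print(merchant_private.decrypt(blob, OAEP).decode())
```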
Man 3: Still, that's not the only problem. We were talking about stealing data from a site where I bring in a USB key, my laptop, or my hard drive. It's not only about stealing or taking out something from an environment. It's also what I can bring in. How about if I bring in an access point and hide it somewhere? How long would it take for the company to notice it?
Ron: Yes.
Man 3: I can just use it. I know the WEP keys, I know all of the other keys. I can hide the SSIDs and everything. I can sit outside of the company and connect to my access point inside the company, sniff, and listen to everything.
Ron: I can't believe I have never thought of that. You are a devious person.

[laughter]
Ron: You would think of that.
Man 3: No, I'm very good at my job.

[laughter]
Ron: I'm just too innocent. I would never imagine these kinds of evil things.

[audio switches]
Ron: See, I told you that was fun. We got a nice casual chat and we laughed a lot. I love to learn and I love to talk to people who know stuff. Every time I do this, I come away learning something.

That whole thing about a wireless access point just made me think, "Wow, how devious." I'm sure people have done this, where they go into a place they work, they secretly stash a wireless access point up in the ceiling or something. Come back after hours and, well, maybe people don't do that. But, it's just a small example of the kind of things that devious people will do.

There's just no end of it. I think one thing I come away with from this is; defense in depth, my friends. That is the principle we have to assume.

Hey, we'll see you next time on ARCast.TV.
Automated Voice: ARCast.TV is a production of the Microsoft Architecture Strategy team.