DNSSEC now deployed in root DNS

July 19, 2010

DNSSEC has been enabled at the root zone, VeriSign says.

DNSSEC applies digital signatures to DNS for added security and authenticity.

The company has been working with others for years on this new technology.

Ken Silva, the company’s chief technology officer, explains what DNSSEC is and how VeriSign has been preparing for it.

KS: DNS is the directory service of the Internet, if you will. That’s what converts the name that you type to an IP address, which is actually how the packets find one another.

By digitally signing the answers to those queries, we’re ensuring that they haven’t been forged or corrupted in any way on the way to the intended recipient. When you type in a website that you would like to go to, the answer that you get from our servers, or from anyone else’s who chooses to do DNSSEC, will be a digitally signed answer that can be authenticated.
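To see what the authenticated answers Silva describes look like in practice, here is a minimal sketch using the third-party dnspython library and a public validating resolver (both illustrative choices on our part, not part of VeriSign’s tooling). A validating resolver sets the AD flag only when an answer’s signatures check out:

```python
# Minimal sketch: ask for an A record with the DNSSEC (DO) flag set and
# check whether the validating resolver authenticated the answer.
import dns.flags
import dns.resolver

resolver = dns.resolver.Resolver()
resolver.nameservers = ["8.8.8.8"]        # a public validating resolver
resolver.use_edns(0, dns.flags.DO, 1232)  # DO bit: "send me DNSSEC records"

answer = resolver.resolve("verisign.com", "A")

# The AD (Authenticated Data) flag is set only when the answer's RRSIG
# signatures chain back to a key the resolver trusts.
authenticated = bool(answer.response.flags & dns.flags.AD)
print(answer[0].address, "authenticated:", authenticated)
```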

FCB: So, is this something that is going to mitigate all DNS attacks?

KS: It’s certainly not going to [do that]; in fact, it might even create some new ones. Packets are more complicated now. They’re larger.

So that’s one aspect of it — it complicates things a little bit, but from a security perspective, DNS, up until today, literally had no security associated with it whatsoever, other than a serial number [the transaction ID] that had to be attached to the message. That was pretty easily forged, and there have been some publicized papers on how that could be done trivially.

So, what this does is prevent a foreign answer, or a forged answer, from reaching the intended recipient and directing them to a different location than where they should be going.
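Some back-of-the-envelope math shows why the forgeable transaction ID Silva mentions was such thin protection: it is only 16 bits, so an attacker blindly spraying forged responses has decent odds of a hit. (The sketch below is our illustration; the numbers are not from the interview.)

```python
# Rough odds of forging a classic (pre-DNSSEC) DNS answer by guessing the
# 16-bit transaction ID: each spoofed packet is an independent 1-in-65,536
# guess, so the chance of at least one hit grows quickly with volume.
ID_SPACE = 2 ** 16

for forged in (1, 1_000, 10_000, 65_536):
    p_hit = 1 - (1 - 1 / ID_SPACE) ** forged
    print(f"{forged:>6} forged packets -> {p_hit:6.1%} chance of a hit")
```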

FCB: In the blogosphere, we’ve been reading that some are comparing this to IPv6 in terms of an Internet standard that was a long time coming and that a lot are still struggling to cope with. Do you think that’s a fair comparison?

KS: I think it is. They have both been adopted, or at least ratified as standards, for a number of years.

In comparing DNSSEC to IPv6 . . . they’re both complicated changes to the infrastructure. There hasn’t really been a significant change in DNS in the last 25 years. Same thing with the IP address system. Even a subtle change is risky, because a lot of equipment relies on those systems performing the way they do today. Any change could potentially cause disruption in pockets that we’re not even aware of — [if] someone has an old router in an old building someplace that doesn’t understand IPv6, or doesn’t understand DNSSEC. . . . Those things create problems.

So, we’ve spent the last couple of years working with equipment vendors to find out which versions need to be patched and which pieces of hardware on the Internet infrastructure don’t work with DNSSEC. We’ve been working through those issues, and we’ve set up an interoperability lab. We’ve had a number of vendors bring equipment in, including the home routers that people have. Initially, we don’t expect a large uptake of DNSSEC.

It’s not for everyone, and not everyone will implement it right out of the gate. What we’re doing today is putting the underpinnings in place so that the rest of the Internet can implement DNSSEC as they see fit and as soon as they want to.

Until now, if they had chosen to implement DNSSEC, most of the underpinnings weren’t there for them to be able to do it. It had to be sort of cobbled together.

FCB: We do a lot of reporting about [the federal government’s] TIC — the Trusted Internet Connections policy. Is DNSSEC going to affect that at all, and has the federal government started preparing for DNSSEC?

KS: Oh, they sure have. The U.S. government actually started the process of rolling out DNSSEC within the .gov domain two years ago.

Now, that has taken longer than they would have liked, and one of the reasons is that, in the process of implementing it, they’re starting to see that it’s not as easy to implement as traditional DNS.

It’s a bit more complicated in practice than it looks on paper.

That’s why you do read in the blogosphere people saying, ‘This has taken far too long to implement.’ It’s taken as long as it has because we’re talking about a fundamental change to the way the Internet does domain names today, for those who choose to use DNSSEC. The great thing about the way it’s being implemented is that not everyone has to implement DNSSEC overnight.

We’re not causing this tidal wave of everyone having to rush to their routers to make changes.

What we’re doing is enabling the fundamental core piece at the root: distributing keys and signing the root’s answers from the very top. There’s a very select few who run those services today, and they’ve all been working together on this unified rollout.
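That “signing from the very top” is what makes a chain of trust possible: the root zone publishes its own keys, and it vouches for each signed top-level domain by publishing a DS (delegation signer) record, a hash of that TLD’s key. A small sketch of what those records look like, again using the third-party dnspython library as an illustrative choice:

```python
# Sketch: fetch the root zone's public keys (the global trust anchor) and
# the DS record the root publishes for a signed TLD such as .gov.
import dns.flags
import dns.resolver

resolver = dns.resolver.Resolver()
resolver.use_edns(0, dns.flags.DO, 1232)  # ask for DNSSEC records

# The root's DNSKEY set is what validators are configured to trust.
for key in resolver.resolve(".", "DNSKEY"):
    print("root DNSKEY, flags:", key.flags)

# The root signs a DS record for each secured child zone; that hash of the
# TLD's key is the link that lets trust chain downward from the root.
for ds in resolver.resolve("gov.", "DS"):
    print("DS for .gov:", ds)
```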

FCB: You’ve talked a little bit about what VeriSign’s been doing with DNSSEC. Could you go into a little more detail? Have you been helping the federal government? And, we understand you have a panel coming up — if you could touch on that, too.

KS: We have been working with virtually everyone who runs DNS infrastructure, and that includes the federal government. We’ve been working with ISPs.

We’ve been working with equipment manufacturers, such as Cisco, and software vendors, like Microsoft. So, we’ve been working with all of them to share our findings, where we see potential problems and lessons learned. We’ve published . . . a couple of white papers and have talked to [the media] whose listeners and readers are involved in the infrastructure, to try to educate them on what the potential pitfalls could be and what the benefits are. . . .

At Black Hat, we’ve got a panel together, which will include myself, Dan Kaminsky . . . and others, that will discuss what led up to this change, how the change is going to make a difference and what some of the potential risks are of trying to do it too early if you’re not actually ready for it — what actually could happen if you haven’t done everything you need to do to get ready.


VeriSign uses cloud to protect against DDoS attacks

December 15, 2009

Listen to the second half of our interview with VeriSign’s Adam Geller and Nick Piazzola.


Today we continue our conversation with VeriSign’s Adam Geller, vice president of Enterprise & Government Authentication, and Nick Piazzola, vice president of Government Programs.

On alleviating security concerns

Nick Piazzola: We provide PKI for federal agencies as a service. They have already defined security standards for us in the form of policy.

For example, the PKI has to follow the same certification and accreditation requirements that you have for a federal agency.

Also, we have to get annual external audits to demonstrate to the federal government that we are operating our services in compliance with their standards.

So, for some of these security services, we are doing exactly what the federal agencies have to do for their own systems.

That’s intended to alleviate the concern that a federal agency might have about outsourcing their services to somebody who’s got a security service in the cloud like VeriSign’s.

Adam Geller: We’ve been having this classic debate with people — and we’ve been having it for a dozen years already. It’s not really new to us when we have these kinds of discussions.

I do think that there is a softening of stance. Probably a lot of what it comes down to is, and this may sound funny, that while there are standards out there (there are even government guidelines and recommendations to use managed services in certain areas, including PKI), a lot of it, ultimately, comes down to personalities and the people who are in positions within agencies or enterprises. At the end of the day, people still have a little bit of a religious feeling about it: do I want to outsource services, or do I want to tightly control them?

I think the biggest hold-up or hang-up about it is still probably just related to legacy.

But . . . almost anywhere you can look at an application or use case, you can now find an example of it being done as a cloud service in a very secure way for a major organization.

You can also find somebody who will do the opposite and say — I’m a major organization and I refuse to use it.

But, all the proof points are starting to line up — and they are there — with major organizations making these decisions.

On a VeriSign specialty for the federal government

NP: We provide a major piece of the DNS service for .com and .net.

We recognized years ago that we weren’t going to be able to provide the 100% availability that was required for that unless we could do something to mitigate and protect against distributed denial-of-service (DDoS) attacks.

What we did was go ahead and develop that capability to protect our DNS and all of the services that we provide in our data centers.

Then, we’ve recently taken that service and made it available as an in-the-cloud offering for federal agencies.

Now, federal agencies can buy from VeriSign the same services that we use to protect ourselves for mitigating distributed [denial-of-service] attacks.


The growing popularity of cloud

December 14, 2009

Listen to the first half of our interview with VeriSign’s Adam Geller and Nick Piazzola.


Fed Cloud Blog has been bringing you interviews with various companies that are helping federal agencies move into the cloud.

Today and tomorrow we’ll talk with two representatives of VeriSign, a company that specializes in protecting data.

Adam Geller is their vice president of Enterprise & Government Authentication and Nick Piazzola is their vice president of Government Programs.

The two sit down with FCB to discuss the cloud.

On the definition of cloud computing

Nick Piazzola: I think our definition of cloud computing fits in the generalized definition that NIST has put forth about a shared set of digital computing resources — but, for us, it’s more specific in that we provide applications for specific shared computing services.

Adam Geller: And I would just add, from a cloud computing perspective, I think that it’s sort of a combination of a lot of different initiatives — or projects — that have caught on in the past: grid computing, utility computing, software-as-a-service. They sort of all bundle up, but the characteristics that I would apply to cloud computing are terms like on demand, scalable, virtualized, leveraging existing service infrastructure — and generally it’s going to take something, package it up and offer it as an Internet-facing service.

On why cloud has seemingly taken off in the past year

NP: I would say that, from VeriSign’s perspective of what we’ve been providing, we’ve been doing cloud computing for many years.

Actually, all of the services that we provide for the federal government today are some form of network-based, shared application services.

So, from our perspective, we’ve been doing cloud computing in that definition since our inception.

I think it’s in vogue now for a variety of reasons, including the fact that the feds themselves now have, as part of the President’s initiative, common services . . . [and] shared services, and they’ve labeled that cloud computing.

AG: Certainly at VeriSign we’ve been doing it since our inception, but I think what’s changed is that VeriSign and other people who’ve done cloud or service-based offerings in the past have been a little bit more point-solution based. We’re a cloud authentication solution, which is good, but it’s narrow and it’s for a specific use case.

I think what’s fundamentally changed the attitude around cloud computing is that there were a couple of killer apps, like Salesforce and some other solutions out there, that were full-on applications — or, really, full solutions provided for somebody, not just a point component, [but] fully provided as a service. I think that’s what’s sort of changed people’s perspective.

Not only can you [now] get a point service, like an alarm monitoring service or a managed security service or authentication, like what we’re doing; you can all of a sudden take a full-on application from start to finish and offer it as a service. That’s being commonly accepted now.

I think that’s what’s gotten everybody to push and realize there are lots of things that can be provided for in the cloud, but, of course, this also opens up lots of interesting security questions, which is where we focus.

What the CIO should consider before moving to the cloud

NP: I think the obvious thing is that the kinds of security concerns you would have about a service in the cloud are at least the same as they would be if you provided that service on your own, hosting your own application. So, you need all the same kinds of things: authentication, confidentiality for your data, protection of privacy.

In addition, you have to worry about the reliability and availability of the service, [especially] if it’s going to support mission-critical applications.

What you have to recognize as a federal agency is that it may be possible that that cloud service could do that better than if you were doing it yourself.

I think that’s the thing that management needs to understand. If they pick the right application and the right service provider, they’ll actually get better availability and security than if they did it themselves.

AG: I think this is a really interesting part — or an interesting fork in the road.

You’ve got the early adopters — people quickly adopting services. But, again, since we focus on security, what we’re hearing and seeing from the cloud providers and the people who want to use them is that enterprises (and government also) have built up internal compliance programs over a number of years to meet regulations and guidelines. . . . They built their own compliance programs internally, and they’ve had control over all of their touch points.

What we’re seeing with people who have moved to cloud services [is that] that’s sort of lagging a little bit.

So, as an example, you may have a great password management policy for your internal applications: you can document it, you can be audited on it, and people can feel comfortable that you’re living up to a certain policy.

When you change over to a cloud service, all of a sudden you may not have as many options, or the ability to synchronize the two. There’s a very interesting market emerging for services that can bridge the enterprise or agency infrastructure to the cloud, so that you’re not replicating everything and having to maintain, let’s say, password policies or account creation and teardown in two separate locations.

I think this is going to become a really interesting area as the auditors catch up. I think it’s a new thing to audit at that level of detail. Certainly, auditing cloud providers isn’t new — people have done that in the past — but now it’s getting very specific, because there’s a lot of data moving over, and I think it’s going to open some eyes when the policies that people worked so hard to set internally aren’t necessarily easily applicable to the cloud.

That gap is going to have to close.
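Geller’s bridging point is, at bottom, a synchronization problem: the agency’s directory and the cloud provider’s account store have to agree, or compliance drifts. Here is a toy sketch of the reconciliation step (all of the data and names are hypothetical; a real bridge would speak a directory or provisioning protocol rather than compare sets in memory):

```python
# Toy illustration (hypothetical data): reconcile account creation and
# teardown between an internal directory and a cloud service so the two
# never drift apart. A real bridge would use a directory/provisioning
# protocol instead of in-memory sets.
internal_directory = {"alice", "bob", "carol"}   # authoritative source
cloud_accounts = {"alice", "dave"}               # state at the cloud provider

to_provision = internal_directory - cloud_accounts  # new hires missing in cloud
to_tear_down = cloud_accounts - internal_directory  # departed users still live

print("provision in cloud:", sorted(to_provision))  # ['bob', 'carol']
print("tear down in cloud:", sorted(to_tear_down))  # ['dave']
```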

Tomorrow: advice on closing that gap, as well as how VeriSign is helping federal agencies.