The Department of Homeland Security plans to issue the first task order under GSA’s infrastructure-as-a-service blanket purchase agreement.
Under the contract, DHS will move its public websites to a public cloud, according to Federal News Radio reporter Jason Miller, who attended the conference. It will start with the websites for FEMA and U.S. Citizenship and Immigration Services (USCIS).
According to the draft RFQ, “The primary purpose of this acquisition is to establish the consolidated and integrated web service delivery capability for development/test, staging/pre-production and production platform that will streamline the migration, implementation and support of current and future DHS public-facing websites to the public cloud…In addition, this includes all work associated with Web hosting, virtual machines, storage and migration as well as application services to support DHS public websites.”
The agency expects bids to be due by May 26.
Read Jason’s story for more on what Richard Spires, CIO at DHS, had to say about his agency’s future use of the public cloud.
Anyone who attended the recent ACT-IAC Executive Leadership Conference in Williamsburg, Va., can tell you that plenty of news about the future of cloud computing in government was announced and discussed at the event.
This includes the news that the General Services Administration has issued a notice on FedBizOpps.gov for e-mail-as-a-service under the software-as-a-service platform.
Katie Lewin, director of GSA’s Cloud Computing Program, told those in attendance at the event the agency is also considering a platform-as-a-service offering. Lewin said the agency is working with the U.S. Geological Survey among others to create a geospatial information platform in the cloud.
“It is the natural candidate for cloud computing because you have massive amounts of geospatial data stored all over government,” Dave McClure told Federal News Radio’s Jason Miller. McClure is GSA’s associate administrator for Citizen Services and Innovative Technologies and will also be speaking at Monday’s industry day. “If we can create a platform that would allow it to be stored securely and for common use, and leverage that across the entire government, I think we could see some unbelievable cost savings in the geospatial areas.”
Lewin said an RFP for the geospatial platform could come out later in 2011.
Federal Chief Information Officer Vivek Kundra also had news about the draft FedRAMP specifications and requirements. The requirements document will be out soon and can be found on both FedBizOpps.gov and the CIO Council website when it’s released.
FedRAMP is a voluntary, governmentwide approach that lets agencies submit cloud-based services for cybersecurity certification and accreditation (C&A) once, so they can be trusted and reused many times.
Fed Cloud Blog sat down with Dave McClure, GSA’s associate administrator for Citizen Services and Innovative Technologies, who talked with members of industry about the RFQ and the new contract at an ACT-IAC event at the end of April.
“I personally feel like we have to make sure we do solid outreach with industry to make sure that our instruments that we’re putting out for cloud services are in line with the way that they think we should be offering them. That was the purpose of the dialogue with industry. We did talk a little bit about the reasons for canceling the prior infrastructure-as-a-service RFQ. I just wanted to emphasize with them that we felt like the market had changed quite a bit since the initial offering, which had started up almost 12 months ago. Vendor engagement, vendor market offerings and vendor understanding of cloud has certainly matured quite a bit in the last 12 months, and the same thing has occurred on the agency side.”
“The infrastructure-as-a-service offering was put out previously [and] was done in very close approximation to the software-as-a-service announcement, and the whole launching of the Apps.gov website. We knew this after the launch, but a valuable lesson that we learned was that there was great confusion in industry about which announcement covered what. There was confusion as to what they needed to reply to to get on schedule for the infrastructure, what they needed to do to get on schedule and get up on the apps.gov storefront for software. We don’t have that problem [now]. The website is up, people understand the processes, so I think we’ve eliminated what was then a very confusing period for just announcing the storefront and announcing an infrastructure BPA all very, very much at the same time.”
This time around, McClure says several things will be different.
“We’re raising the security level to the moderate level. I think that’s where the public sector in general is headed — greater security in these cloud provisioning agreements. So, we’ve raised this up to the moderate level. I think that’s a significant improvement and difference from the prior RFQ. We also are making it much easier and clearer to map the industry offerings to the contract line items in this BPA instrument that we’re using. There was some confusion about how specific services and prices for some of the industry offerings mapped to the contract line items in this BPA. We’ve gone back and actually cleaned that up and had conversations with industry on how that mapping process can work very effectively. So I think that will also create a much better instrument than what we had before. The third big difference is that things that are awarded off of this instrument will be candidates that will go into the FedRAMP centralized C&A approval process. I think that will make a difference, as well — knowing that your product or service will actually go through one C&A and then be usable across the entire government.”
Hear the first part of our interview with David Chen.
You might not equate the two right off the bat — cloud computing and service-oriented architecture (SOA) — but they’re similar in terms of how their implementation evolved.
That’s according to David Chen, lead of the technology consulting practice for Accenture Health and Public Service, who is our guest this week to talk about the cloud.
Defining the cloud
It’s one of those things, I think, where if you ask five different people, you get five different answers.
We at Accenture define cloud as a type of computing that allows companies and agencies and businesses to access technology-based services via the Internet.
But, there are a lot of exceptions to that.
There are also private clouds.
You can always find some instance of cloud that doesn’t exactly meet the [standard] definition; so, we like to look at some of the characteristics of cloud computing, and really those are — very rapid acquisition of services, very low to no capital investment, low operating costs and, usually, variable pricing tied directly to consumption — pay-by-the-drink types of models.
We also see cloud services as being programmatically or automatically controlled, and able to be accessed in an on-demand fashion [while] giving you the illusion of infinite capacity.
I would add that there are different types of cloud, and we really have four that we highlight: process-as-a-service (getting a business process through a cloud service); application-as-a-service (such as SalesForce.com); platform-as-a-service (using the cloud as a development environment); and infrastructure-as-a-service (where you are buying compute power or storage).
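Chen’s “pay by the drink” pricing is, at bottom, simple metered arithmetic. Here is a minimal sketch with invented unit rates (not any real provider’s price list) showing how a bill scales directly with consumption:

```python
# Hypothetical per-unit rates, for illustration only -- not real provider pricing.
RATES = {
    "compute_hours": 0.10,     # dollars per instance-hour
    "storage_gb_month": 0.05,  # dollars per gigabyte-month stored
    "bandwidth_gb": 0.08,      # dollars per gigabyte transferred
}

def monthly_bill(usage):
    """Consumption-based billing: cost is metered usage times unit rate,
    with no up-front capital investment."""
    return sum(RATES[item] * quantity for item, quantity in usage.items())

# Two small servers running all month, plus modest storage and traffic.
usage = {"compute_hours": 2 * 24 * 30, "storage_gb_month": 100, "bandwidth_gb": 50}
print(monthly_bill(usage))  # prints 153.0
```

Stop the servers and the compute line drops to zero, which is exactly the contrast with fixed capital spending that the interview draws.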
Working with SalesForce
We are a partner with SalesForce.com and have been for many years. We help clients integrate SalesForce.com applications. We have not created our own applications per se. We do have some cloud-like business models and offerings . . . [such as] process-as-a-service, where we do the post-processing for tickets for many of the major airlines.
Cloud and SOA = similar?
Cloud’s basic fundamentals have been around for quite a while, especially on the software level. SalesForce.com has been around since 1999 — [that’s] over ten years in terms of an offering; however, there has been a recent convergence and maturation of technologies on the data center side.
So, if you look at the ability to virtualize and share within a data center, the ability to automatically provision, and then the increased bandwidth and robustness of the Internet, that combination has enabled some incredibly powerful price points and times to market.
So, instead of paying hundreds of thousands of dollars up front for compute power, you can now buy small units very cheaply, without the lead time to build that capacity yourself, while still handling big spikes in volume.
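The point about spikes can be made with back-of-the-envelope numbers. This sketch uses purely hypothetical figures (a $0.10 server-hour and a one-day spike to ten times baseline) to illustrate the trade-off Chen describes between sizing for the peak and buying small units on demand:

```python
# All numbers are hypothetical, chosen only to illustrate the trade-off.
HOURLY_RATE = 0.10   # assumed cost per server-hour
HOURS = 720          # hours in a 30-day month

# Demand profile: a baseline of 10 servers, with a one-day spike to 100.
baseline, peak, spike_hours = 10, 100, 24

# Fixed provisioning: buy enough capacity for the peak and run it all month.
fixed_cost = peak * HOURS * HOURLY_RATE

# Elastic provisioning: pay only for the servers each hour actually needs.
elastic_cost = (baseline * (HOURS - spike_hours) + peak * spike_hours) * HOURLY_RATE

print(fixed_cost, elastic_cost)
```

Under these assumptions the elastic bill comes to roughly an eighth of the fixed one, even though both configurations survive the same spike.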
Some of the new technologies have really added to the power of cloud, as well as some of the offerings now that Amazon and Google and other public providers have.
That has really enabled this new value proposition that is very hard to ignore.
I definitely agree that we’ve seen that spike in the last year or two. It is getting a lot of hype and it’s one of those things, also, like SOA before it — or Web technologies — where every vendor is trying to call what they have ‘cloud’, because of the hype that it’s getting.
But the underlying business case you can get now is, I would say, very different from what you could get five or 10 years ago, due to some of these newer technologies.