NSA uses cloud to modernize agency

May 23, 2011

Lonny Anderson, the National Security Agency’s chief information officer, says finding efficiencies in IT is one of his biggest priorities. And, he says, his agency is using the cloud to help with that.

“I don’t want to say we’re the leader in cloud use across the IC [Intelligence Community], but if we’re not, we’re right up there. We’re reliant on it.”

Anderson tells Federal News Radio’s Jason Miller that NSA currently uses three clouds: a utility cloud for virtualization, a storage-as-a-service cloud, and a data cloud. All three are private but include some open-source components.

When asked whether an agency like NSA would ever consider using a public cloud, Anderson says, “For unclassified networks, the public cloud would be fine. The challenge for us is, of course, that we do a lot of things in the classified world.”

But, he says, the agency is looking outside its own walls for ideas. “One of the areas we’re going to look at in the future is how to take advantage of all of those developers that are out there, not in government, but across industry and in universities. To the extent we can, we’ll try to open source and ask for help.”

Anderson also updated Federal News Radio on the agency’s data center consolidation efforts. NSA’s new data center in Utah will be used by the Intelligence Community to support the Comprehensive National Cybersecurity Initiative. He says the new facility will help NSA fight cybersecurity threats and provide technical assistance to the Department of Homeland Security.

“That data center will give us, the IC, the ability to take advantage of new technologies as they come on board and design a data center in accordance with our future needs.”

Listen to the full interview.


Census implementing virtualization-first policy

May 13, 2011

Brian McGrath, the chief information officer at the Census Bureau, says his agency is about to implement a “virtualization-first” policy.

He tells Federal News Radio’s Jason Miller, “All new applications will be serviced via a virtualized guest as opposed to a bare-metal deployment of hardware. Unless there is a compelling engineering or architecture reason to do so, we see significant opportunities for cost savings.”

McGrath says this new policy will allow the Census Bureau to transition its data centers “from one of only servicing the Census Bureau to being positioned to service and provide compute and store resources, and cooling resources, and secure resources for other government agencies.”

In fact, McGrath says the International Trade Administration, a sister agency of the Census Bureau, will close one of its data centers in June and move into a Census data center.

The Census Bureau is also in the process of closing some of its own data centers. By the end of 2011, it will close six that were stood up for the 2010 census.

McGrath tells Miller the Census Bureau is also in the process of testing the use of the Internet to collect data as part of its preparation for the 2020 count. By the end of this year, the Bureau will have conducted 60 American Community Surveys online.

As for the storage of all that data, McGrath says that’s staying in the Bureau’s private cloud.

“Any of our sensitive data, personal identifiable information, or Title 13 or Title 26 data, which we are stewards of, will remain in our private cloud. At this point, we’re not seeing any capacity issues. We’re obviously always focused and concerned about IT security to ensure that we could manage the flow of information, store the information, and secure it in such a way as to protect the identity and integrity of the information we are currently capturing.”

Listen to Jason’s interview with McGrath.

Does cloud computing give you a headache?

April 21, 2011

The Fed Cloud Blog told you earlier this week about a recent survey of federal IT professionals by InformationWeek. The survey showed 58 percent of respondents are either already using cloud computing or plan to be using it within the next 12 months.

Federal News Radio wanted some more information on the survey, so we asked John Foley, the editor of InformationWeek Government, to join us on In Depth with Francis Rose. Foley says dealing with the administration’s cloud-first policy is giving some IT professionals a headache. Listen to the full interview here.

Francis also spoke to Bob Otto this week about how to consolidate data centers effectively. (Data center consolidation is another of the administration’s top IT priorities.)

Otto, a former chief information officer at the U.S. Postal Service, offered some advice on how cloud computing fits into data center consolidation.

Navy, DHS, State make strides in the cloud

January 18, 2011

When it comes to reducing costs and wasteful spending at agencies, IT managers are being leaned on heavily to get the job done.

The Navy is moving ahead with its technology efficiency and consolidation initiative by putting the brakes on spending for new servers, server upgrades and data centers.

“We are reevaluating what all of our organizations want to do and why they want to do it, and is it consistent with our overall IT efficiency,” said Janice Haith, director of assessment and compliance for the Navy’s Information Dominance Directorate.

“Server purchases up to date may not have been efficient. They may not have bought servers that were sufficiently robust to handle virtualization. We need to do that. That may mean we have to buy some additional servers that can be virtualized, and some of our servers today are not in that state.”

Federal News Radio’s Jared Serbu reports that the Navy has set some targets for virtualization as well. It directs each of the Navy’s 23 Echelon II organizations – the commands in the organizational chart directly below the office of the Chief of Naval Operations – to develop plans to increase virtualization by 40 to 80 percent and server utilization by 50 to 80 percent.

Various civilian agencies are also making strides. At a recent AFCEA-Bethesda breakfast panel, the State Department said its goal is to reduce the number of data centers in the United States from 11 to 2 over the next few years.

Cindy Cassil, the agency’s director of systems integration in the CIO office, says part of the way her agency will do that is by getting buy-in from business owners by offering services on a private cloud.

“Right now we are offering infrastructure-as-a-service,” Cassil said. “We are trying to work around the political issue about people still wanting to maintain their applications. The IT staffs are very powerful. They really advise the business they need to be involved. Right now, I would say we have 99.9 percent cooperation with our business side because they really like our model at this point. We offer the platform and the storage, and it’s free to them if they come in and virtualize.”

DHS’s Deputy CIO Margie Graves also spoke at the event. Graves said her agency is creating a test and development environment similar to one developed by the Defense Information Systems Agency.

Federal News Radio’s Jason Miller reports that her office wants to make it easier for DHS components to do rapid application development in a cloud environment. DHS is also working on two other cloud test and development environments: one using IBM’s WebSphere and one for open source.

White House adopts ‘cloud first’ policy

December 21, 2010

The White House is adopting a “cloud first” policy and plans to reconfigure IT by consolidating federal data centers and applications.

Federal CIO Vivek Kundra recently unveiled the plan, which calls for creating a Data Center Consolidation Task Force and closing at least 800 of the federal government’s 2,100 data centers over the next five years.

The Data Center Consolidation Task Force will also serve as a “community of practice” for agency CIOs and data center program managers so they can share best practices, Kundra says.

The government’s “cloud first” strategy will revolve around using commercial cloud technologies, launching private government clouds, and using regional clouds with state and local governments.

In the next six months, Kundra says his office will create a strategy to accelerate the adoption of safe and secure cloud computing across the government.

Each agency will also be responsible for identifying three “must move” services and creating a plan for migrating them to cloud solutions. One of those services must be moved within a year, and the other two within 18 months.

The migration plans will also include adoption targets, execution risks, major milestones, required resources and a plan for legacy services once the cloud services are up and running.

Virtualization key for future of Bureau of the Public Debt

December 20, 2010

IT managers from large agencies and small agencies alike are looking at virtualization to help them save money and streamline processes.

On this week’s Ask the CIO program, Kim McCoy, chief information officer at the Bureau of the Public Debt, discussed her agency’s plans in this arena.

McCoy told Federal News Radio her agency, along with its sister agency, the Financial Management Service, will be consolidating their data centers from a total of five down to two that will service both.

“Since we do provide services to other government agencies, we’re looking at the possibility of how to build the most cost-effective standardized hosting infrastructure – whether you want to call that a cloud or not – but how do we have an infrastructure that allows us to host applications for the federal sector very quickly, meeting all of our security requirements in a cost-competitive fashion.”

McCoy said the agency is also moving towards a virtual desktop, even though she isn’t fond of the term.

“The primary driver for that is that we know we need to move towards a telework environment not only to maintain the staff we have today but to expand our staff while keeping our overhead low and maintaining our current level of physical office space.”

McCoy said the agency will begin making progress in this area within the year with the hopes of rolling something out within the next few years.

Listen to Ask the CIO with Kim McCoy.

OMB releases details of cloud-first policy for agencies

December 12, 2010

The use of cloud computing is one of the major aspects of the administration’s “25 Point Implementation Plan to Reform Federal Information Technology Management.” OMB recently announced a cloud-first policy for agency IT programs moving forward.

Last week, OMB released more details on its cloud computing plans for agencies.

Federal News Radio’s Jason Miller reports that OMB wants every agency to identify three “must move” technologies that will go to the cloud by March, move at least one of them by the end of 2011, and move the other two within 18 months.

The report from Kundra says he will “publish a strategy to accelerate the safe and secure adoption of cloud computing across the government” within the next six months.

In addition, the implementation plan says the General Services Administration will use requirements developed by the Software-as-a-Service E-mail Working Group to “stand up government-wide contract vehicles for cloud-based email solutions. GSA will also begin a similar process specifically designed for other back-end, cloud-based solutions.” This will happen within the next 12 months.

The plan also requires agencies to reduce the number of data centers by at least 800 – more than a third of the 2,100-plus they reported earlier this year – by 2015. OMB said agencies must “designate a data center program manager…who will be 100 percent dedicated to and accountable for driving change.” It also will “launch a data center consolidation task force of data center program managers, sustainability officers and facilities managers” to review the progress and ensure agencies are on similar paths. OMB will create a public dashboard to track agency progress.

Industry is also responding to the new cloud computing requirements for agencies. Read Jason Miller’s report.