Cloud lessons from OpenStack 2011

May 9, 2011

Have you heard of OpenStack? It’s an open source cloud computing project from NASA and a group of tech companies.

According to the OpenStack website, “All of the code for OpenStack is freely available under the Apache 2.0 license. Anyone can run it, build on it, or submit changes back to the project. We strongly believe that an open development model is the only way to foster badly-needed cloud standards, remove the fear of proprietary lock-in for cloud customers, and create a large ecosystem that spans cloud providers.”

The second OpenStack Design Summit was held late last month, and the Web is buzzing with lessons learned and takeaways from the event.

As one attendee, writing for Computerworld, noted, “[Cloud] computing continues to morph as providers and users gain more experience with the domain…the cost structures associated with traditional computing environments in the face of scale growth make existing infrastructure approaches obsolete.”

The idea behind OpenStack and the conference is to provide an environment where cloud vendors can share some parts of their cloud technology and learn from each other. According to Network Computing, “project organizers said open source software will allow businesses to use the same cloud platform in their own environment or with an external cloud service provider.”


NASA cloud pioneer leaves agency

March 17, 2011

Chris Kemp, NASA’s chief technology officer for IT, is leaving the agency. Kemp announced his resignation on his agency blog saying, “As budgets kept getting cut and continuing resolutions from Congress continued to make funding unavailable, I saw my vision for the future slowly slip further from my grasp.”

Kemp was one of the pioneers behind NASA’s Nebula cloud while serving as chief information officer at NASA Ames, the position he held before coming to Headquarters. Kemp also helped launch Apps.gov, which helps agencies buy cloud computing services, while working at Ames.

Kemp has also been a big supporter of open source computing. InformationWeek reports, “the agency donated code from Nebula to OpenStack, an open-source cloud computing project, and at the end of the month, just after his departure, it will host its first-ever open-source summit.”

Kemp has worked for NASA for five years.


This week in the cloud

February 22, 2011

It was a big week for cloud computing on Federal News Radio – from NASA’s Nebula cloud to more information on OMB’s Cloud Computing Strategy. Plus, we learned what budget cuts might mean for the cloud…

NASA demos open source cloud computing
Hear how NASA is pairing its Nebula platform with open source cloud initiatives to pave the way for governmentwide transitions between private and public clouds. We hear from NASA Chief Technology Officer Chris Kemp.

OMB strategy lays a path to the cloud
We told you last week about the Office of Management and Budget’s new Federal Cloud Computing Strategy. During the Cloud/Gov 2011 conference last week, Sanjeev Bhagowalia, the associate administrator in the General Services Administration’s Office of Citizen Services and Innovative Technologies, gave more details on the strategy and where agencies need to go next when it comes to the cloud.

GSA gathers input to improve acquisition process
Mary Davie told an audience at last week’s Federal Networks conference that one of GSA’s priorities is helping agencies adopt a cloud-first policy. She said 12 cloud-related blanket purchase agreements will be ready for agencies in March and GSA will launch FedRAMP in October.

IT opportunities exist despite budget doom and gloom
Cloud computing is one of the areas that will most likely see growth over the next few years despite spending reductions, according to a new Federal IT Services Industry Outlook report.


Cloud computing tips from NASA’s Chris Kemp

January 3, 2011

NASA’s Nebula Cloud Computing Platform has gotten a lot of attention from agencies looking to move into the cloud. Chris Kemp, NASA’s chief technology officer for IT, spoke with Federal News Radio about his agency’s use of the cloud and how that use has morphed over the years.

Initially, the Nebula cloud was developed to get NASA’s thousands of public-facing websites all on the same platform. However, the agency also realized it could use the cloud as an infrastructure-as-a-service offering for compute and storage. Kemp said this is the area that has really gained traction at the agency.
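
For readers who haven’t worked with infrastructure-as-a-service directly, here is a minimal sketch of what requesting compute from an OpenStack-style cloud can look like, using the python-novaclient library. The credentials, names and endpoint below are hypothetical placeholders, not Nebula’s actual configuration, and client API details vary by version:

    # Minimal sketch: provisioning a compute instance on an
    # OpenStack-style IaaS cloud via python-novaclient.
    # All credentials and names below are hypothetical placeholders.
    from novaclient import client

    nova = client.Client("2",                    # compute API version
                         "demo_user",            # username (placeholder)
                         "demo_password",        # password (placeholder)
                         "demo_project",         # project (placeholder)
                         "http://cloud.example.gov:5000/v2.0")

    flavor = nova.flavors.find(name="m1.small")  # pick an instance size
    image = nova.images.find(name="base-image")  # pick a machine image
    server = nova.servers.create(name="www-node", image=image, flavor=flavor)
    print(server.id)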

Using the cloud this way has also helped NASA with efficiency. Kemp said typical infrastructure utilization is about 20 percent, while Nebula allows NASA to run at 80 to 90 percent.
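
To see what that gap means in practice, here is a back-of-the-envelope calculation. The utilization figures are Kemp’s; the workload size is an assumed round number chosen to make the arithmetic easy:

    # Illustrating the capacity difference between ~20% and ~85% utilization.
    # The workload size is assumed; the utilization figures are the
    # ones quoted in the article.
    workload = 100.0                        # units of sustained demand (assumed)
    capacity_at_20 = workload / 0.20        # capacity needed at 20% utilization
    capacity_at_85 = workload / 0.85        # capacity needed at ~85% utilization
    print(capacity_at_20)                   # 500.0
    print(capacity_at_85)                   # ~117.6
    print(capacity_at_20 / capacity_at_85)  # ~4.25x less provisioned hardware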

To help the agency understand all of the various cloud platforms available (both internally at NASA and externally in the private sector), Kemp said NASA has developed a cloud service office.

“By having expertise in all these areas, as we have a new application, we can consult with them and send them to the right place. Architect [a] solution that involves cloud but that appropriately uses these different technologies because every cloud has different characteristics.”

Kemp recommends all agencies form a group like this as cloud becomes an even bigger focus in 2011 and beyond.

Listen to the full interview with NASA’s Chris Kemp.


NASA Goddard CIO discusses cloud as a priority

November 15, 2010

Adrian Gardner has been the chief information officer at NASA Goddard Space Flight Center for 10 months. Federal News Radio’s Jason Miller spoke with Gardner recently about his priorities for the agency.

Gardner says one of his goals is to understand the needs of the scientific community.

If you look at computing here at Goddard and the vision I have — all the way from looking at the desktop and possibly virtualizing the desktop; then almost to a virtualized data center; then looking at cloud and cloud computing and how that plays in; and then looking at high-performance computing and high-end computing — that whole range now represents what the scientist probably would need to actually fulfill their daily tasks.

What we’re trying to do as an organization in concert with [NASA] Ames and Linda [Cureton] and her staff at headquarters, looking at both cloud and other desktop capabilities, is really trying to understand across the board what are the attributes that make each one of those environments the right environment for that particular task. And then, beginning to educate the scientists and engineers, as well as our own staffs, about what kind of things should we run in those various environments in such a way as they’d be optimized in those environments.

We’re very much looking at that whole range of compute, drawing a circle around that, and saying, ‘that then is scientific mission computing for Goddard and potentially for NASA as a construct.’

What should vendors, especially those interested in cloud computing, know about working with his office? Gardner says he’s creating a new position that will deal with emerging technologies like cloud and social media. More info on this as we get it!

Hear Jason Miller’s entire interview with Gardner.


NASA discusses challenges of data center consolidation

November 3, 2010

It’s been more than nine months since the Federal Data Center Consolidation Initiative began with the intent to reduce energy usage, lower IT costs and improve security. And some federal agencies are discovering that it’s difficult to reduce spending without putting some money up front.

NASA is just one of the federal agencies trying to streamline its IT practices and improve efficiency. Chris Kemp, NASA’s chief technology officer for IT, says it’s been a challenge to consolidate because the agency uses many different types of IT infrastructure.

Kemp says aggregating data wouldn’t be such a daunting task if the agency used uniform server types and sizes and other standardized equipment. The virtualization transition will take several years, and Kemp says the IT department is researching how it can achieve maximum savings.

NASA is also working with a company that provides a software-as-a-service model: an appliance installed on the network collects information and sends it back to an analytical engine in the provider’s own data center.
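
That architecture, a small collector on the customer’s network shipping readings to a hosted analytical engine, is a common software-as-a-service pattern. Here is a minimal sketch of the reporting loop; the endpoint, payload fields and interval are all hypothetical, since the article doesn’t name the company or its product:

    # Minimal sketch of a network appliance reporting to a remote
    # analytical engine. Endpoint, payload and interval are hypothetical.
    import json
    import time
    import urllib.request

    ANALYTICS_URL = "https://analytics.example.com/ingest"  # placeholder

    def collect_sample():
        # Stand-in for whatever the appliance actually measures locally.
        return {"host": "appliance-01", "ts": time.time(), "cpu_pct": 42.0}

    def ship(sample):
        # POST one JSON-encoded reading to the remote analytical engine.
        req = urllib.request.Request(
            ANALYTICS_URL,
            data=json.dumps(sample).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return resp.status

    while True:
        ship(collect_sample())
        time.sleep(60)  # report once a minute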

Federal agencies submitted their data center consolidation plans to the Office of Management and Budget two months ago. OMB is going through the plans and hopes they’ll be put into action next year.


Cloud news roundup: Know your rights in the cloud

July 23, 2010

Today in your end-of-the-week cloud news roundup:

  • Confused about the cloud? Have no idea where to go next? Worried about what will happen once you make the move? Have no fear. The Cloud Computing Survivor’s Guide for Government is coming up, and you can still register. It’s hosted by David S. Linthicum, CTO of Bick Group. Learn more about how you might be able to make the cloud work for you.
  • What can cloud really offer you? CIO.com’s Bernard Golden says there are two kinds of agility you can find in the cloud that don’t exist elsewhere. Read more of his post as he discusses engineering resource availability and business response to changing conditions or opportunity.
  • You should know your rights — and responsibilities — in the cloud. Gartner has created a Bill of Cloud Computing Rights. ChannelWeb reports that the firm put together a Global IT Council for Cloud Services, which came up with the idea for a set of key rights to govern computing in the cloud. The firm hopes it will help not only IT customers and clients, but also developers, vendors and other stakeholders.
  • And NASA has joined forces with Rackspace Hosting, a cloud services provider. Daily Finance reports the agency is currently working on its Nebula software platform, which should be available by the end of this year. Rackspace has already contributed its Cloud Files system to the OpenStack code base. Many in the industry think NASA’s code will serve as a best-of-breed technology.