Cloud lessons from OpenStack 2011

May 9, 2011

Have you heard of OpenStack? It’s an open source cloud computing project from NASA and a group of tech companies.

According to the OpenStack website, “All of the code for OpenStack is freely available under the Apache 2.0 license. Anyone can run it, build on it, or submit changes back to the project. We strongly believe that an open development model is the only way to foster badly-needed cloud standards, remove the fear of proprietary lock-in for cloud customers, and create a large ecosystem that spans cloud providers.”

The second OpenStack Design Summit was held late last month and the Web is buzzing with lessons learned and takeaways from the event.

As one attendee noted, writing for Computerworld, “[Cloud] computing continues to morph as providers and users gain more experience with the domain…the cost structures associated with traditional computing environments in the face of scale growth make existing infrastructure approaches obsolete.”

The idea behind OpenStack and the conference is to provide an environment where cloud vendors can share some parts of their cloud technology and learn from each other. According to Network Computing, “project organizers said open source software will allow businesses to use the same cloud platform in their own environment or with an external cloud service provider.”


NASA cloud pioneer leaves agency

March 17, 2011

Chris Kemp, NASA’s chief technology officer for IT, is leaving the agency. Kemp announced his resignation on his agency blog saying, “As budgets kept getting cut and continuing resolutions from Congress continued to make funding unavailable, I saw my vision for the future slowly slip further from my grasp.”

Kemp was one of the pioneers behind NASA’s Nebula cloud while serving as chief information officer at NASA Ames, the position he held before coming to Headquarters. While at Ames, Kemp also helped launch Apps.gov, which helps agencies buy cloud computing services.

Kemp has also been a big supporter of open source computing. Information Week reports, “the agency donated code from Nebula to OpenStack, an open-source cloud computing project, and at the end of the month, just after his departure, it will host its first-ever open-source summit.”

Kemp has worked for NASA for five years.


This week in the cloud

February 22, 2011

It was a big week for cloud computing on Federal News Radio – from NASA’s Nebula cloud to more information on OMB’s Cloud Computing Strategy. Plus, we learned what budget cuts might mean for the cloud…

NASA demos open source cloud computing
Hear how NASA is pairing its Nebula platform with open source cloud initiatives to pave the way for governmentwide transitions between private and public cloud computing. We hear from NASA Chief Technology Officer Chris Kemp.

OMB strategy lays a path to the cloud
We told you last week about the Office of Management and Budget’s new Federal Cloud Computing Strategy. During the Cloud/Gov 2011 conference last week, Sanjeev Bhagowalia, the associate administrator in the General Services Administration’s Office of Citizen Services and Innovative Technologies, gave more details on the strategy and where agencies need to go next when it comes to the cloud.

GSA gathers input to improve acquisition process
Mary Davie told an audience at last week’s Federal Networks conference that one of GSA’s priorities is helping agencies adopt a cloud-first policy. She said 12 cloud-related blanket purchase agreements will be ready for agencies in March and GSA will launch FedRAMP in October.

IT opportunities exist despite budget doom and gloom
Cloud computing is one of the areas that will most likely see growth over the next few years despite spending reductions, according to a new Federal IT Services Industry Outlook report.


Cloud computing tips from NASA’s Chris Kemp

January 3, 2011

NASA’s Nebula Cloud Computing Platform has gotten a lot of attention from agencies looking to move into the cloud. Chris Kemp, the Chief Technology Officer for IT at NASA, spoke with Federal News Radio about his agency’s use of the cloud and how that has morphed over the years.

Initially, the Nebula cloud was developed to get NASA’s thousands of public-facing websites all on the same platform. However, the agency also realized it could use the cloud as an infrastructure-as-a-service offering for compute and storage. Kemp said this is the area that has really gained traction at the agency.

Using the cloud in this way has also helped NASA with efficiency. Kemp said typical infrastructure utilization is 20 percent, but Nebula allows NASA to run at 80-90 percent utilization.
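To put those numbers in perspective, here is a quick back-of-the-envelope sketch in Python. Only the utilization figures come from Kemp’s remarks; the workload size is a made-up value for illustration.

    # How many servers does the same workload need at different
    # utilization levels? The 20% and 80-90% figures are from the
    # interview; the workload size is hypothetical.

    workload = 200  # total demand, in "fully busy server" units

    for utilization in (0.20, 0.80, 0.90):
        servers = workload / utilization
        print(f"At {utilization:.0%} utilization: {servers:.0f} servers")

    # Prints:
    # At 20% utilization: 1000 servers
    # At 80% utilization: 250 servers
    # At 90% utilization: 222 servers

In other words, raising utilization from 20 percent to 80 percent lets the same pool of hardware carry roughly four times the work.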

To help understand all of the various cloud platforms available (both internally at NASA and externally in the private sector), Kemp said NASA has developed a cloud service office.  

“By having expertise in all these areas, as we have a new application, we can consult with them and send them to the right place. [We can] architect [a] solution that involves cloud but that appropriately uses these different technologies, because every cloud has different characteristics.”

Kemp recommends all agencies form a group like this as cloud becomes an even bigger focus in 2011 and beyond.

Listen to the full interview with NASA’s Chris Kemp.


NASA Goddard CIO discusses cloud as a priority

November 15, 2010

Adrian Gardner has been the chief information officer at NASA Goddard Space Flight Center for 10 months. Federal News Radio’s Jason Miller spoke with Gardner recently about his priorities for the agency.

Gardner says one of his goals is to understand the needs of the scientific community.

If you look at computing here at Goddard and the vision I have — all the way from looking at the desktop and possibly virtualizing the desktop; then almost to a virtualized data center; then looking at cloud and cloud computing and how that plays in; and then looking at high-performance computing and high-end computing — that whole range now represents what the scientist probably would need to actually fulfill their daily tasks.

What we’re trying to do as an organization in concert with [NASA] Ames and Linda [Cureton] and her staff at headquarters, looking at both cloud and other desktop capabilities, is really trying to understand across the board what are the attributes that make each one of those environments the right environment for that particular task. And then, beginning to educate the scientists and engineers, as well as our own staffs, about what kind of things should we run in those various environments in such a way as they’d be optimized in those environments.

We’re very much looking at that whole range of compute, drawing a circle around that, and saying, ‘that then is scientific mission computing for Goddard and potentially for NASA as a construct.’

As for what vendors, especially those interested in cloud computing, should know about working with his office: Gardner says he’s creating a new position that will deal with emerging technologies like cloud and social media. More info on this as we get it!

Hear Jason Miller’s entire interview with Gardner by clicking the audio link above.


NASA discusses challenges of data center consolidation

November 3, 2010

It’s been more than nine months since the Federal Data Center Consolidation Initiative began with the intent to reduce energy usage, lower IT costs and improve security. And some federal agencies are discovering that it’s difficult to cut costs without spending some money upfront.

NASA is just one of the federal agencies trying to streamline their IT practices and improve efficiency. Chris Kemp, NASA’s chief technology officer for IT, says it’s been a challenge to consolidate because the agency uses many different types of IT infrastructure.

Kemp says aggregating data wouldn’t be such a daunting task if the agency used servers of a uniform type and size and other standardized equipment. The virtualization transition will take several years, and Kemp says the IT department is researching how to achieve maximum savings.

NASA is also working with a company that provides a software-as-a-service model: an appliance installed on the network collects information and sends it back to an analytical engine in the company’s own data center.
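The article doesn’t name the vendor or its interfaces, but the pattern it describes is a simple one: a local agent gathers data and ships it to a remote analytics service. Here is a minimal, hypothetical sketch; the endpoint URL, field names and sample values are all invented for illustration.

    import json
    import time
    import urllib.request

    # Hypothetical ingest endpoint for the remote analytical engine.
    ANALYTICS_ENDPOINT = "https://analytics.example.com/ingest"

    def collect_metrics() -> dict:
        """Gather a few sample data points from the local appliance."""
        return {
            "host": "appliance-01",   # placeholder identifier
            "timestamp": time.time(),
            "cpu_load": 0.42,         # a real agent would measure this
            "disk_used_gb": 118,
        }

    def ship(metrics: dict) -> None:
        """POST one batch of metrics back to the analytical engine."""
        body = json.dumps(metrics).encode("utf-8")
        req = urllib.request.Request(
            ANALYTICS_ENDPOINT,
            data=body,
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req)

    if __name__ == "__main__":
        ship(collect_metrics())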

Federal agencies submitted their data center consolidation plans to the Office of Management and Budget two months ago. OMB is going through the plans and hopes they’ll be put into action next year.


Cloud news roundup: Know your rights in the cloud

July 23, 2010

Today in your end-of-the-week cloud news roundup:

  • Confused about the cloud? Have no idea where to go next? Worried about what will happen once you make the move? Have no fear. The Cloud Computing Survivor’s Guide for Government is coming up, and you can still register. It’s hosted by David S. Linthicum, CTO of Bick Group. Learn more about how you might be able to make the cloud work for you.
  • What can cloud really offer you? CIO.com’s Bernard Golden says there are two kinds of agility you can find in the cloud that don’t exist elsewhere. Read more of his post as he discusses engineering resource availability and business response to changing conditions or opportunity.
  • You should know your rights — and responsibilities — in the cloud. Gartner has created a Bill of Cloud Computing Rights. ChannelWeb reports that the firm put together a Global IT Council for Cloud Services, which came up with the idea for a set of key rights to govern computing in the cloud. They say they hope it will help not only IT customers and clients, but also developers, vendors and other stakeholders.
  • And NASA has joined forces with Rackspace Hosting, a cloud services provider. Daily Finance reports the agency is currently working on its Nebula software platform, which should be available by the end of this year. Rackspace has already contributed its Cloud Files system to the OpenStack code base. Many in the industry think NASA’s code will serve as a best-of-breed technology.

Role of federal CIO, CTO influences agencies on cloud

June 15, 2010

As you probably know, the General Services Administration is planning to move the entire agency’s email system to the cloud.

Federal News Radio has been telling you that this is not the first agency to make the move; the Interior Department has already consolidated 12 different systems and moved 80,000 users to the cloud.

From this news, it seems like cloud is no longer just a buzzword — it’s becoming part of the new business of government.

David Link is President and CEO of ScienceLogic, which conducted a survey of federal IT managers and workers earlier this year at FOSE.

Link says one of the many trends the survey showed is that cloud computing seems to be here to stay because of the immense presence of federal CIO Vivek Kundra and federal CTO Aneesh Chopra.

“This year is the first time that we’ve had a federal CIO, a federal CTO over all of government IT. One of the questions we asked is — has this new role impacted your IT operations? Actually, 56 percent of the people that responded said it absolutely had impacted, and over 30 percent said they were seeing a major impact. Only about 20 percent said it was business as usual, so I think what that means is that the mandates from the top down actually are active, they’re very visible, the word’s getting down to people and engineers and operators that are working in the trenches. That’s a great, positive movement. It’s a great story going forward — that a new role in the government can actually impact the people who [are] literally . . . doing the job each and every day.”

He also notes that there is a direct connection between cloud and virtualization, which is helping agencies adopt cloud.

“What we saw early on with virtualization [in] the first year of the survey is that a few people had thought it was a key initiative and/or they had projects in place. This last year the adoption has moved up from major hype to adoption — 80 percent of the respondents this year said they had virtualization initiatives. Frankly, virtualization is at the heart of cloud, because it’s all about shared and pooled resources where you can leverage a resource pool really effectively and have the agility that cloud offers where you can stand up IT resources very quickly. Virtualization is really one of the heart and soul key components of cloud offerings.”

It is slow going, however: the survey showed that adoption of cloud is still relatively low, but interest is high. Link says, in his opinion, this isn’t a plateau or fad, and likens the government’s response to cloud to its earlier response to IPv6.

“From the very top, Vivek Kundra’s really a thought leader on the cloud . . . with NASA’s initiatives and FedRAMP setting standards on cloud initiatives, they’ve really got a lot of people focused on this. As the largest buyer of IT in the world, where the government goes, vendors are going to go. What I see is, they’re really being smart about the approach. They’re trying to figure out where outsourcing to the cloud makes sense — where is it smart? Where can you get the advantages that the nimbleness and scale of the cloud brings straight to government IT operations?”

But what about the money? Will agencies see future funding for cloud computing initiatives? Link says many agencies were helped in the past by the American Recovery and Reinvestment Act, and now agency heads and IT managers are looking at spending differently.

“Some of the huge projects that are multi-year, large awards may not be going as fast because they tend to take a long time, but I think what you’re seeing from a government IT perspective is more of a surgical approach to [solve problems]. There’s a huge initiative where Vivek Kundra has said, by the end of the year, he wants all agencies to put together and put forth their data center consolidation strategy and plan. Data center consolidation is really about figuring out how to collapse and provide more shared services, which is really going to drive adoption of the cloud and virtualization and these core technologies even faster because they’re a key linchpin to getting there.”


NASA JPL develops own cloud ‘brokering’ system

June 9, 2010

And now we wrap up our conversation about NASA’s JPL moving toward cloud computing.

In our final segment, guests Tom Soderstrom, IT CTO at NASA JPL, and Khawaja Shams, senior solution architect at NASA JPL, give us their final thoughts on the benefits of cloud.

TS: I would say there’s a couple of [benefits]. One is, in our industry we look at something we call the technology readiness level. It starts very early with an abstract idea — level 1 — and then when it’s operational, it becomes level 9. Now . . . we’re thinking about the cloud readiness level, so we’re getting JPL up the curve on this cloud readiness level, and we [had] a JPL cloud day — the first in a series. . . . Our overall goal is to run an application and the storage and the computing wherever it’s most appropriate.

So, the cloud for us gives us a new avenue, a new tier of options.

We’ll have our internal data centers with private clouds, we’ll use [a] community cloud . . . and then the ultimate goal is to [use] a public cloud. We have data in Amazon and Microsoft. We also have data in Google’s cloud.

To do that, we need some kind of cloud brokering, and we went out to industry and tried to buy it, frankly, but it doesn’t exist yet, so we’re creating it. We call it the Cloud Application Suitability Matrix — CASM — and that’s the set of questions that gives a score and assesses in which cloud this particular application is the most suitable to run. We think that’s going to be a big trend — this cloud brokering, if you will.

The partnering part, I can’t stress enough how important it is for all of us in government and the private sector to just get started — to try it — because you learn a lot.

One unanticipated consequence is, of course, there’s a lot of excitement about the cloud, so you’re making connections and you’re making partnerships that otherwise would have taken a lot longer. We have very good relationships with lots of vendors and agencies.

The last piece, I would say, is . . . the CIO at JPL came up with this idea of replacing the procurement screen with a provisioning screen. That kind of says it all. We’re trying to give self-service to the users of IT so that they can get the computing they need when they need it, and turn it off when they need it, so we can spend less money on IT and more money on science.

The whole effort is to keep it real, and we did that from the very beginning and it’s proven very effective. It’s not an IT benefit, it’s a business of the institution benefit.

KS: One thing I’d like to add is, I know that a lot of institutions are very wary of the cloud because of security.

At JPL, instead of avoiding the cloud because of security problems, we are trying to address the security problems and trying to create best practices and secure ways to use the cloud without actually compromising the privacy or integrity of our data.

Our mission developers are working very closely with our office of the CIO and the IT security teams to make sure that we can leverage the benefits of the cloud without compromising our security.

TS: We think that the cloud could be more secure than what we do today, because it becomes, in many ways, more uniform so you can react to threats much more quickly and you can segment off things like denial of service attacks and keep going in a different part of the cloud. We have worked very closely with key vendors and cloud security teams . . . and the biggest obstacle, I would say, is going to come from the auditing function.

The auditing function needs to figure out how an application that used to run on one server in one data center now could [run] on multiple servers in multiple data centers. How do you audit that to make sure it’s secure? Until we can do that, we probably can’t go live with anything substantial.

So we’re working very closely with vendors and the auditors to facilitate that, be an early explorer and help industry in that area.
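Soderstrom describes CASM as a set of questions that produces a suitability score for each candidate cloud. JPL hasn’t published the actual questions or weights, so the sketch below is a hypothetical illustration of that style of weighted scoring, with invented criteria, weights and ratings.

    # Hypothetical CASM-style scoring matrix. The criteria, weights and
    # ratings below are invented for illustration; JPL's actual
    # questionnaire is not public.

    # How well each cloud tier satisfies each criterion, on a 0-5 scale.
    CLOUD_RATINGS = {
        "private cloud":   {"data_sensitivity": 5, "elasticity": 2, "cost": 2},
        "community cloud": {"data_sensitivity": 4, "elasticity": 3, "cost": 3},
        "public cloud":    {"data_sensitivity": 2, "elasticity": 5, "cost": 5},
    }

    def rank_clouds(app_weights):
        """Rank clouds by the weighted sum of their criterion ratings."""
        totals = {
            cloud: sum(app_weights[c] * r for c, r in ratings.items())
            for cloud, ratings in CLOUD_RATINGS.items()
        }
        return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

    # Example: a public-outreach app with open data that must scale on demand.
    weights = {"data_sensitivity": 0.1, "elasticity": 0.5, "cost": 0.4}
    for cloud, total in rank_clouds(weights):
        print(f"{cloud}: {total:.2f}")
    # public cloud: 4.70 / community cloud: 3.10 / private cloud: 2.30

An application holding sensitive data would weight data_sensitivity heavily and land on the private cloud instead, which is the brokering behavior Soderstrom describes.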


NASA JPL crowdsources with cloud

June 7, 2010

We continue our conversation about NASA’s JPL moving into the cloud.

Today, we start off discussing how President Barack Obama’s Open Government Initiative is influencing cloud at agencies — and whether or not cloud is helping JPL to comply.

Tom Soderstrom, the IT CTO at NASA JPL, and Khawaja Shams, senior solution architect at NASA JPL, tell us more.

TS: Essentially you can divide what we do in two ways. One is, it’s good for the mission. It makes us do better science. The other one is about communicating that to the public and getting the public excited. Our stockholders are the public. If the public wants to know more about space and science . . . it will go through the budget.

We’re very pleased to see that it’s a cloud and we’re big supporters of data.gov. We think it’s a fantastic idea — [where] you can get the data out at less cost and much more easily to the scientists and the public. So, we came up with this term . . . of citizen scientists. If they could get access to the data much more easily and quicker, they could maybe even help turn the wheels of science.

We worked with Microsoft using their Azure cloud on [a project] called Be a Martian. Citizens are able to do anything from tagging images online to creating programs that tag the images online. It’s a contest and . . . it’s been very successful. It’s a way of crowdsourcing and we took the images — 250,000 images from the Mars rovers Spirit and Opportunity — put them in the Azure cloud and gave the citizens access to it. It’s been terrific. We did the same thing at EclipseCon.

KS: EclipseCon is a developer conference. Roughly a thousand or so developers worldwide attend each year in Santa Clara, and we held a contest there called the e4 Rover contest, where we allowed developers to drive a mini robot around a Martian landscape that we had put together. We used this as an opportunity for public outreach, as well as to get developers to build interfaces to command the robot and view the telemetry coming back from the robot.

In order to run this contest, we needed a lot of infrastructure that we didn’t want to just go buy for this one week contest. So, we ran the entire contest on the Amazon cloud and leveraged a lot of the services that are very common to companies like Amazon and Microsoft and Google and we were able to get these for free and very quickly — services like load balancing . . . [and] getting computers running in multiple data centers, and services like the delivery of images to the operators that were, in this case, the developers. This project was actually quite successful and it made venues like Slashdot and Digg.

We ended up getting a lot of open source code back that we can go ahead and directly use to make very useable interfaces.

TS: What surprised us a little bit was the quality of the code that these developers came up with during the conference. It was 24 by 7 and Khawaja was there manning it, and the ways of lighting the road that the developers came up with were quite ingenious, including one on an iPhone. So, the crowdsourcing works both ways and we are quite excited about it.