Cloud tool helped Florida with 2010 Census count

July 13, 2010

The 2010 Census is over, but many are still talking about the new technologies that were used for the count this time around.

Gail Thomas, vice president for state and local government at Microsoft, explains that she and others at the company worked on a project called MyFloridaCensus, which helped the state of Florida count citizens during the Census.

She tells Fed Cloud Blog all about the goal of the tool.

GT: We were excited to partner with the state of Florida on this cloud solution from Microsoft, which is all about enhancing the efforts around ensuring that they had a complete count of residents during the 2010 Census.

Really, it was meant to sort of supplement the door-to-door canvassing. . . . This solution is a final push to be able to count state residents and it’s through a website called myfloridacensus.gov. It’s a web-based application which allows . . . citizens to provide dynamic feedback and visual representations of that feedback. It’s really an innovative way to capture census information and to really have citizens take part in gathering that information.

It’s an opportunity for citizens to speak, as well, on what could help improve their community.

FCB: Is this [a place] where people can comment on a blog or take pictures? Give our listeners a little bit more information on that front.

GT: For starters, it’s hosted on Windows Azure, which is our cloud platform. It runs using Silverlight for cross-browser compatibility, as well. With the support of a Bing maps interface . . . It allows visitors to really share their experiences, provide feedback. [It is] an opportunity for social, user generated experiences around the census information count that they’re doing. [There] are visual representations in terms of providing feedback, so it really is sort of an interactive type of a program.

FCB: You mentioned Azure — and we know, talking with other people from Microsoft Federal . . . that it recently launched. Define Azure — what is it — and talk a little about cloud computing. How has it evolved, and what’s Microsoft doing in the world of the cloud with Azure right now?

GT: Our strategy around the cloud and hosted offerings is really broad and comprehensive.

We provide software in the cloud, which is software delivered as a service — things like your traditional email and collaboration type activities that people can do in the cloud.

Then, we provide infrastructure in the cloud and platform in the cloud, and Azure is our platform in the cloud, which allows people to develop applications in the cloud. It is language agnostic, so it's completely open. . . . [Azure] gives applications that are hosted up there the ability to expand and contract their use of server space, and it helps reduce the overall cost of those solutions when they're not in peak use.

So, MyFloridaCensus, for example — there’s obviously a peak usage time that will die down. . . . It really is very cost effective as far as a solution goes and really provides a lot of flexibility. Another good example of that is we’ve got a solution called Hey Gov, which was built from the Miami 311 solution, which is a 311 online solution for citizens to be able to record non-emergency type incidents, and see exactly what the status of that incident is online.
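The elasticity Thomas describes — expanding during peak usage and contracting when traffic dies down — can be sketched as a toy autoscaling rule. This is a hypothetical illustration of the idea, not Azure's actual scaling API; the thresholds and capacity figures are assumptions:

```python
# Toy autoscaler: choose an instance count from recent request load.
# Capacity and limits below are assumed values for illustration only.

REQUESTS_PER_INSTANCE = 1000  # assumed capacity of one server instance
MIN_INSTANCES = 1
MAX_INSTANCES = 20

def desired_instances(requests_per_minute: int) -> int:
    """Return how many instances are needed to serve the current load."""
    needed = -(-requests_per_minute // REQUESTS_PER_INSTANCE)  # ceiling division
    return max(MIN_INSTANCES, min(MAX_INSTANCES, needed))

# Peak usage (e.g. a census deadline week) vs. a quiet period:
print(desired_instances(15_000))  # peak load -> 15 instances
print(desired_instances(200))     # off-peak -> contracts to the minimum, 1
```

Because the application only pays for the instances it runs, scaling down during the quiet periods is where the cost savings come from.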

FCB: So, are you doing anything like this or looking to do anything like this with other states or other local governments?

GT: Absolutely. . . . This solution that we worked on with Miami 311 now has a broader applicability — Hey Gov — and we’re starting to talk to a number of cities that are interested in doing similar things with a 311 system in the cloud, as well.


NASA JPL crowdsources with cloud

June 7, 2010

We continue our conversation about NASA’s JPL moving into the cloud.

Today, we start off discussing how President Barack Obama’s Open Government Initiative is influencing cloud at agencies — and whether or not cloud is helping JPL to comply.

Tom Soderstrom, IT CTO at NASA JPL, and Khawaja Shams, senior solution architect at NASA JPL, tell us more.

TS: Essentially you can divide what we do in two ways. One is, it’s good for the mission. It makes us do better science. The other one is about communicating that to the public and getting the public excited. Our stockholders are the public. If the public wants to know more about space and science . . . it will go through the budget.

We’re very pleased to see that it’s a cloud and we’re big supporters of data.gov. We think it’s a fantastic idea — [where] you can get the data out at less cost and much more easily to the scientists and the public. So, we came up with this term . . . of citizen scientists. If they could get access to the data much more easily and quicker, they could maybe even help turn the wheels of science.

We worked with Microsoft using their Azure cloud on [a project] called Be a Martian. Citizens are able to do anything from tagging images online to creating programs that tag the images online. It’s a contest and . . . it’s been very successful. It’s a way of crowdsourcing and we took the images — 250,000 images from the Mars rovers Spirit and Opportunity — put them in the Azure cloud and gave the citizens access to it. It’s been terrific. We did the same thing at EclipseCon.
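The crowdsourced tagging Soderstrom describes can be sketched as a simple tally: many citizens submit tags for the same image, and the most frequent tags become the consensus labels. This is a minimal illustration of the crowdsourcing idea, not the actual Be a Martian implementation:

```python
from collections import Counter

def consensus_tags(submissions, top_n=3):
    """Aggregate crowdsourced tags for one image; return the most common."""
    counts = Counter(tag.lower() for tags in submissions for tag in tags)
    return [tag for tag, _ in counts.most_common(top_n)]

# Three citizens tag the same Mars rover image:
votes = [
    ["crater", "dust"],
    ["crater", "rock"],
    ["crater", "dust", "horizon"],
]
print(consensus_tags(votes))  # ['crater', 'dust', ...]
```

With 250,000 images, the appeal of hosting the tally in the cloud is that the store and the tagging front end scale with however many citizens show up.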

KS: EclipseCon is a developer conference. Roughly a thousand or so developers worldwide attend each year in Santa Clara, and we held a contest there called the e4 Rover contest where we allowed developers to drive a mini robot around a Martian landscape that we had put together. We used this as an opportunity for public outreach, as well as to get developers to build interfaces to command the robot and view the telemetry that is coming back from the robot.

In order to run this contest, we needed a lot of infrastructure that we didn’t want to just go buy for this one week contest. So, we ran the entire contest on the Amazon cloud and leveraged a lot of the services that are very common to companies like Amazon and Microsoft and Google and we were able to get these for free and very quickly — services like load balancing . . . [and] getting computers running in multiple data centers, and services like the delivery of images to the operators that were, in this case, the developers. This project was actually quite successful and it made venues like Slashdot and Digg.
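One of the commodity services Shams mentions, load balancing, can be illustrated with a minimal round-robin dispatcher that spreads requests evenly across servers. This is a toy sketch of the concept; cloud providers offer load balancing as a managed service rather than something you write yourself:

```python
import itertools

class RoundRobinBalancer:
    """Distribute incoming requests evenly across a pool of servers."""

    def __init__(self, servers):
        self._cycle = itertools.cycle(servers)

    def route(self, request):
        """Pick the next server in rotation for this request."""
        server = next(self._cycle)
        return server, request

# Hypothetical servers in two data centers:
lb = RoundRobinBalancer(["east-1", "east-2", "west-1"])
for i in range(4):
    server, _ = lb.route(f"GET /image/{i}")
    print(server)  # cycles: east-1, east-2, west-1, east-1
```

The point of renting this instead of buying hardware is exactly the one-week-contest scenario: the infrastructure exists only as long as the event does.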

We ended up getting a lot of open source code back that we can go ahead and directly use to make very useable interfaces.

TS: What surprised us a little bit was the quality of the code that these developers came up with during the conference. It was 24 by 7 and Khawaja was there manning it, and the ways of lighting the road that the developers came up with were quite ingenious, including one on an iPhone. So, the crowdsourcing works both ways and we are quite excited about it.


FTC looking at the cloud — and other news

January 11, 2010

It looks like the Federal Trade Commission (FTC) is checking out cloud computing.

Information Week reports that the FTC is looking at data and security implications when it comes to cloud computing for consumers. The action is in response to a Federal Communications Commission (FCC) Notice of Inquiry about how broadband and data portability relate to the cloud and privacy.

FCB has contacted the FTC (say that three times fast) — and hopefully we’ll be able to tell you more about this soon.

Vint Cerf, the man often considered the father of the Internet, says there should be standards when it comes to cloud computing. Computerworld reports that he talked about the various clouds (offered by companies such as Microsoft and Google) and the fact that no interoperability exists between them. He likened the current cloud situation to the lack of communication and familiarity among computer networks in 1973. In addition to interoperability, he discussed security and other topics during his presentation.

What’s scarier than concerns about security, compliance, privacy, and management? Hype. That’s according to Bob Evans of Information Week, who writes that 2010 might well be the year of the cloud, but that could pose some serious problems if proper planning isn’t done.

Hype isn’t stopping smaller companies, however. BCS reports that more small and medium sized businesses will embrace the cloud this year. A study conducted by Easynet Connect found that the number of SMBs planning to adopt cloud technology within the next five years rose from 47 percent at the end of 2008 to nearly three-quarters of respondents at the end of last year. It also found that the number of SMBs planning to switch to cloud computing in the next 12 months rose 28 percentage points to 50 percent of those questioned.

Speaking of what’s new in 2010, Microsoft has announced that its Azure platform will be available on Feb. 1. This is Microsoft’s first foray into cloud computing since the launch of Microsoft Online Services.

Dell and Infobird have signed a memorandum of understanding (MOU) to provide cloud solutions in China. The MOU will bring cloud-based call center solutions to customers in China. The two companies are currently negotiating more definitive agreements for the cooperation.


The inevitability of cloud in a mixed environment

December 23, 2009

Hear the second part of our interview with David Chen.


Today we continue our chat with David Chen, lead of the technology consulting practice for Accenture Health and Public Service.

Security in the cloud: Risks and benefits

Security is definitely a very valid concern.

If you look at how companies are able to offer some of the economies of the cloud, it’s because they have shared infrastructure and they’re able to shift unused compute power from one application to another and move that around.

By the nature of things being shared, that poses a security concern, especially in the federal government, where there’s sensitive and classified information — and there’s also the need for certifications and accreditations of certain environments.

So, the first message there is to be conscious of that. I think IT managers need to choose very carefully what applications are appropriate to host on the cloud, given the current state.

Then, the next thing you’ll see is a lot of the cloud providers are working toward a hybrid model, where they will have computing infrastructure dedicated toward one agency or one organization and have a cloud within that.

Now, you won’t get quite the same economies when you do that.

We also see a lot of agencies starting to implement private clouds, where they use a lot of the same technologies internally and get some of the same advantages to address some of those security concerns.

I would say, though, also, on the flip side, the cloud can actually give you some benefits with security.

One is that you keep some of your applications that might be more public facing away from your highly sensitized internal applications.

So, if somebody breaks into your public-facing Web page, for example, and it’s on the cloud, the intruder cannot get to other systems that would otherwise be on the same network.

We’ve seen that happen to some of the agencies — where [hackers] got into one system and then, all of a sudden, could get into other systems that were much more sensitive, because they were on the same network.

By moving things out to the cloud, you can avoid that problem and also the cloud can help you with things like denial of service attacks because of the ability to shut off and turn on new servers and other compute infrastructure quickly.
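Chen's isolation argument can be sketched as a reachability check over network segments: a compromised host can only pivot to hosts in its own segment. The segment names and hosts below are invented for illustration; real segmentation is enforced with firewalls and network configuration, not application code:

```python
# Toy model of network segmentation: systems grouped into segments,
# and an intruder on one host can only reach hosts in the same segment.
# All names here are hypothetical examples.

SEGMENTS = {
    "cloud-public": {"public-website", "press-portal"},
    "internal": {"hr-database", "payroll"},
}

def reachable_from(compromised_host):
    """Return the hosts an intruder on `compromised_host` could pivot to."""
    for hosts in SEGMENTS.values():
        if compromised_host in hosts:
            return hosts - {compromised_host}
    return set()

# Breaking into the cloud-hosted site exposes only its own segment:
print(reachable_from("public-website"))  # the sensitive internal hosts stay out of reach
```

Had the public website shared a segment with the HR database — the situation Chen says some agencies were in — the same break-in would have exposed the sensitive systems too.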

Accenture and the cloud

We help agencies and companies operate in the cloud and with the cloud.

We help them with their cloud strategy; we help them with the management of their infrastructure, including both cloud and non-cloud environments; and then we also will partner with cloud companies and really leverage their capital investments in the infrastructure.

So, as an example, we announced a partnership with Microsoft on their Azure cloud offering, with the idea that we would be a primary systems integrator helping our clients use that offering and figure out how to implement it and help them with that integration.

What is the Accenture Cloud Computing Accelerator?

The name may be a little misleading — it’s not something to make the cloud go faster, but . . . It helps an agency or a company formulate their cloud computing strategy.

We help them, in a very short time period — usually four weeks or less — look to see which business applications could be migrated to the cloud, how cloud fits in with their overall strategy and then how to both transition into that . . . as well as long-term — how their environment might look or should look when they start integrating both their cloud and traditional computing environment.

Wrapping it up

I think what I would say in terms of [the topics about] — everyone’s struggling with how quickly to move into the cloud — and is it real — and is it secure enough?

What we’ve seen time and time again is that, when we look at internal compute enterprise environments butting heads with the Internet, the Internet always wins.

So, we see cloud as something that is inevitable long-term. What I would say is that most IT managers should start looking at the cloud [and] figuring out how it plays in and understand that it’s still early on and the technology is maturing and it’s not going to be a fit for everything — but start to look and see what is a good fit.

There are some incredible economies that should be — and can be — taken advantage of now. Then, also, as I have mentioned several times, really making sure that there is a holistic strategy.

It’s not just about cloud computing.

It’s not just about traditional, but we see, in the next several years, that everyone is going to have a mixed environment.

Cloud still has to be managed just like other systems out there need to be managed in terms of system monitoring and everything else.

So, it’s going to be very important for agencies to look at migrating and evolving their management structures to be able to handle both cloud and non-cloud in a mixed environment.

Fed Cloud Blog will return next week with more posts. Have a great holiday in the meantime!


Looking at the cloud as the next generation of IT

November 30, 2009

Listen to the first part of our interview with Susie Adams.


With the recent release of Microsoft’s Azure, Fed Cloud Blog decided to talk with Susie Adams, CTO for Microsoft’s Federal Business, about cloud computing, the platform and where she thinks federal agencies can most benefit in terms of the cloud.

In the first part of our interview with her, she talks about what cloud computing actually is — and why it’s nothing new, really.

Susie Adams: So, if you think about the definition of cloud computing — we can take the definition that NIST has given — what they’ve done is really split it into three different layers. So, you have infrastructure as a service, platform as a service and software as a service — in a very similar way to how we think of multi-tier applications. At the infrastructure layer, you have your physical ping, power and pipe, and then on top of that you have your operating system layer.

We see it as a natural evolution that, in moving to the cloud, you also need a cloud operating system, and that’s what Windows Azure is for Microsoft. So, it is Windows Server 2008 with our hypervisor virtualization technology, with some engineering modifications to be able to handle the elasticity that’s needed to scale cloud apps up and down.

Fed Cloud Blog: One of the things that we talk about is security. Tell me a little about the challenges of securing something in the cloud versus [legacy] desktops, laptops, what have you.

SA: I like to talk about security first off by saying that, when most people talk about the cloud, they think of it as some brand new technology that just kind of sprung up out of nowhere. Really, it’s just the fifth generation of IT. It’s a natural evolution, actually, for the IT industry.

With that said, all the learnings, especially from a security perspective, that we’ve garnered over the years . . . all those same security best practices still apply in the cloud. And then the obvious thing that people are concerned about is the lack of control that they feel they have when they give their data, for example, to a service provider like Microsoft. Is my data safe? Will it be kept private? Is it secure?

So, when we think about those things from a security perspective, most people immediately go to security controls and best practices. From Microsoft’s perspective, we really think about it from two different standpoints.

One, it’s not just about the physical infrastructure and the controls that you have in place; it’s all about the people, process, and technology. You can have the best controls in place and be utilizing best practices, but if you don’t have a rigorous process in place to evaluate and audit that on a regular basis, and continuously improve that process . . . you’re not going to be adhering to the best security practices.

The second thing is all about transparency and trust. When you think about, for example, email moving to the cloud, federal agencies have been using systems integrators for years to host their email systems. Some in data centers that are owned by the government, and some in data centers that are owned by systems integrators. When we look at how that traditionally has worked, the government agencies have been able to go in at any point in time and actually look at or audit a particular service running on a server.

The cloud model’s a little different because it’s a multi-tenant environment, and in that environment, it’s almost cost prohibitive to allow every single customer to come in. From our perspective, for you to trust us with your data, we have to be extremely transparent about the security processes that we have in place, the security controls that we’re compliant with. . . . What we like to do is make sure that our customers understand which security controls we’re compliant with and we allow them to actually look at the audit logs and things of that nature.

FCB: All of this talk about cloud computing and moving into the cloud is really making people change their mindsets. Do you think that that’s still a challenge, though, that some people [are stuck in the past]?

SA: Absolutely we do see it as a challenge. . . . For obvious reasons, as the IT industry has grown up, people have learned that if they have physical control over the applications and the systems those applications are running on, they have a little bit more confidence in the reliability of that service. Especially if you think of the administrators today and the stress that they’re under, for example, just to make sure that mail is up 99.9 percent of the time — they’re very reluctant to say, I’m just going to give all of this to the service provider — especially since we’re just in the infancy of cloud computing.

The interesting piece here, I think, is that from a procurement standpoint we’re still kind of stuck in the legacy mindset. I’ve talked with several CXO-level executives in multiple agencies, and when we talk about — let’s take email and move it to the cloud — agencies don’t want a contract just for email. [They want] what I call the everything-but-the-kitchen-sink model, where it’s email, desktop management, collaboration software, and a number of other helpdesk [options].

So, when we say — we can provide email to you, Federal Agency, they immediately go — I want everything else, too. What about my PII? Today, the way cloud services work, that’s not traditionally something that a Microsoft or a Google or an Amazon offers. We actually offer services in one of those three layers of the cloud stack.

I think that that is also going to be a challenge. It’s the [feeling of] the lack of control — and the way they procure software today.

On Wednesday, we’ll bring you more of our conversation with Susie Adams, when she will discuss the possible cost-savings — and pitfalls — of the cloud.