Census implementing virtualization-first policy

May 13, 2011

Brian McGrath, the chief information officer at the Census Bureau, says his agency is about to implement a “virtualization-first” policy.

He tells Federal News Radio’s Jason Miller, “All new applications will be serviced via a virtualized guest as opposed to a bare-metal deployment of hardware. Unless there is a compelling engineering or architecture reason to do so, we see significant opportunities for cost savings.”
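
In practice, a virtualization-first policy is a default with an exception path: new workloads go to a virtualized guest unless a documented engineering or architecture reason justifies bare metal. The sketch below is purely illustrative; the class and function names are hypothetical, not anything the Census Bureau has published.

```python
# A minimal, hypothetical sketch of a "virtualization-first" intake check.
# Names and fields are illustrative; they are not Census Bureau systems.
from dataclasses import dataclass

@dataclass
class DeploymentRequest:
    app_name: str
    needs_bare_metal: bool              # e.g., specialized hardware requirements
    exception_justification: str = ""   # documented engineering/architecture rationale

def choose_platform(req: DeploymentRequest) -> str:
    """Default every new application to a virtualized guest; allow bare metal
    only when a documented engineering exception accompanies the request."""
    if req.needs_bare_metal and req.exception_justification:
        return "bare-metal (approved exception)"
    return "virtualized guest"

print(choose_platform(DeploymentRequest("survey-intake", needs_bare_metal=False)))
# -> virtualized guest
```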

McGrath says this new policy will allow the Census Bureau to transition its data centers “from one of only servicing the Census Bureau to being positioned to service and provide compute and store resources, and cooling resources, and secure resources for other government agencies.”

In fact, McGrath says the International Trade Administration, a sister organization of the Census Bureau, will close one of its data centers in June and move into a Census data center.

The Census Bureau is also in the process of closing some of its own data centers. By the end of 2011, it will close six that were stood up for the 2010 census.

McGrath tells Miller the Census Bureau is also in the process of testing the use of the Internet to collect data as part of its preparation for the 2020 count. By the end of this year, the Bureau will have conducted 60 American Community Surveys online.

As for the storage of all that data, McGrath says that’s staying in the Bureau’s private cloud.

“Any of our sensitive data, personally identifiable information, or Title 13 or Title 26 data, which we are stewards of, will remain in our private cloud. At this point, we’re not seeing any capacity issues. We’re obviously always focused and concerned about IT security to ensure that we could manage the flow of information, store the information, and secure it in such a way as to protect the identity and integrity of the information we are currently capturing.”

Listen to Jason’s interview with McGrath.


Navy, DHS, State make strides in the cloud

January 18, 2011

When it comes to reducing costs and wasteful spending at agencies, IT managers are being leaned on heavily to get the job done.

The Navy is moving ahead with its technology efficiency and consolidation initiative by putting the brakes on spending for new servers, server upgrades and data centers.

“We are reevaluating what all of our organizations want to do and why they want to do it, and is it consistent with our overall IT efficiency,” said Janice Haith, director of assessment and compliance for the Navy’s Information Dominance Directorate.

“Server purchases up to date may not have been efficient. They may not have bought servers that were sufficiently robust to handle virtualization. We need to do that. That may mean we have to buy some additional servers that can be virtualized, and some of our servers today are not in that state.”

Federal News Radio’s Jared Serbu reports, the Navy set some targets for virtualization as well. It directs each of the Navy’s 23 Echelon II organizations – the commands in the organizational chart directly below the office of the Chief of Naval Operations – to develop plans to increase virtualization by 40 to 80 percent, and server utilization by 50 to 80 percent.

Various civilian agencies are also making strides. At a recent AFCEA-Bethesda breakfast panel, the State Department said its goal is to reduce the number of data centers in the United States from 11 to 2 over the next few years.

Cindy Cassil, the agency’s director of systems integration in the CIO office, says part of the way her agency will do that is by getting buy-in from business owners by offering services on a private cloud.

“Right now we are offering infrastructure-as-a-service,” Cassil said. “We are trying to work around the political issue about people still wanting to maintain their applications. The IT staffs are very powerful. They really advise the business they need to be involved. Right now, I would say we have 99.9 percent cooperation with our business side because they really like our model at this point. We offer the platform and the storage, and it’s free to them if they come in and virtualize.”

DHS’s Deputy CIO Margie Graves also spoke at the event. Graves said her agency is creating a test and development environment similar to one developed by the Defense Information Systems Agency.

Federal News Radio’s Jason Miller reports, her office wants to make it easier for DHS components to do rapid application development in a cloud environment. DHS also is working on two other cloud test and development environments: one using IBM’s WebSphere and one for open source.


Virtualization key for future of Bureau of the Public Debt

December 20, 2010

IT managers from large agencies and small agencies alike are looking at virtualization to help them save money and streamline processes.

On this week’s Ask the CIO program, Kim McCoy, chief information officer at the Bureau of the Public Debt, discussed her agency’s plans in this arena.

McCoy told Federal News Radio her agency, along with its sister agency, the Financial Management Service, will be consolidating their data centers from a total of five down to two that will service both.

“Since we do provide services to other government agencies, we’re looking at the possibility of how to build the most cost-effective standardized hosting infrastructure, whether you want to call that a cloud or not, but how do we have an infrastructure that allows us to host applications for the federal sector very quickly, meeting all of our security requirements in a cost-competitive fashion.”

McCoy said the agency is also moving towards a virtual desktop, even though she isn’t fond of the term.

“The primary driver for that is that we know we need to move towards a telework environment not only to maintain the staff we have today but to expand our staff while keeping our overhead low and maintaining our current level of physical office space.”

McCoy said the agency will begin making progress in this area within the year with the hopes of rolling something out within the next few years.

Listen to Ask the CIO with Kim McCoy.


Make decisions before you decide to move to cloud

August 9, 2010

This week, we bring you a special treat — an extended conversation with Mark White, a principal with Deloitte Consulting LLP who works with both the firm’s Federal and Technology practices. He is also the CIO of Deloitte Consulting.

FCB talked with him at length about cloud computing and a variety of issues that are currently facing the industry.

He started by explaining what Deloitte does for the federal government in terms of cloud.

MW: “In talking with our federal clients about cloud, there are two or three different stages in which we find ourselves. There’s still the stage of trying to put some structure around what cloud means, how to actually come to a common understanding and an actionable conversation about it. That’s sort of the first [challenge] — understanding structure frameworks that allow a common basis of understanding and definition and that can allow you to have conversations and make plans that lead to a decisive outcome.

That, a year ago, was sort of the most common part of it. Now, with all of the attention that’s been spent and all of the time that’s been spent, that, while it still goes on, is probably not the most significant part.

The next phase, if you will — or generation of it — is in analyzing strategies and evaluating and helping make plans — plans to do analysis, plans to adopt, in certain cases, plans to expand — that is becoming the most common of the three phases of understanding, planning — and then the third phase, which is going to be actual implementation. That is the third phase — implementation of certain aspects.

It’s interesting. One of the points — and, in fact, one of our fundamental planks in the platform about cloud is that it’s technologically evolutionary. The impact on the mission can be revolutionary. So, when I say implementation is beginning to be an area in which we work more with our federal clients, that’s speaking specifically to those things that were originally described as so called ‘cloud’.

When I think about the fact that the technology aspects of cloud are essentially evolutionary in nature, they’re the next logical generation of the technologies and techniques and methods and disciplines we’ve been applying for data center consolidation, virtualization and operations automation.

So, having said that, we have been — and continue — to help our clients with implementation of those technology disciplines and capabilities and tools. It’s those that would, out of the box or from the get-go, have described it as a cloud implementation. That actually is beginning to increase.

FCB: With all of these changes happening — and I know different organizations sometimes have different definitions for cloud — but going based on what you just told me, what are you doing in terms of security? When I talk to agencies themselves, they say, ‘We’re really excited to take this next step, but we’ve got all this data that we don’t want getting out there.’ Talk a little bit about the security aspect and maybe alleviate some of those concerns.

MW: In order to have a common definition of cloud, there are two steps to set the table, if you will. The first step is — what are the characteristics of the mission problem that you’re trying to solve, or perhaps the technology solution you’re proposing? And do those characteristics imply or outline a cloud solution?

We use the five characteristics that NIST has put forward, and if you look around, you’ll see slight variations on a theme, but I think those are perfectly reasonable. . . . So, if your mission problem or your technology solution embodies or implies or needs all five of those, clearly we need to have a conversation about cloud. If it requires fewer than five — maybe three — then perhaps we ought to talk about a more mature technology — utility computing or managed services or even plain old outsourcing.

That’s the first part of having a cloud conversation — what are the characteristics of the problem or solution?

The second part of having a cloud conversation is three dimensions of the answer. The first dimension is the capability, or what kind of cloud: infrastructure-as-a-service, platform-as-a-service, software-as-a-service, or business process-as-a-service. The second is, what source? Is it a public cloud? A private cloud? A hybrid cloud? A community cloud, which actually obviously GSA defined in that RFI coming up on two years ago now. . . . And then the third, and this may not be quite as familiar because it doesn’t get talked about as much, but we think it’s really important, is — what is the business model?

There are four layers. Layer one is — the business model is, ‘I want to be a cloud service subscriber’. Layer three is, ‘I want to be a cloud service provider. I want to make money by providing cloud services in the marketplace’. Layer four is, ‘I want to be a cloud service enabler. I produce technologies or skills or capabilities that allow the cloud service providers to do their job’. And then layer two is a cloud service broker.

So, dimensioning a cloud conversation first — what are the five characteristics and do you really need cloud? Then, the three dimensions — what kind of service, what source of service and what business model? And, if you will tell me what we’re talking about, then we can have an actionable conversation — we can conclude with action. So, you might say to me, ‘I want to be a subscriber of a public cloud infrastructure,’ at which point we can have a very meaningful conversation about the obstacles and the enablers and the challenges and the benefits, one of which obstacles, by the way, is the data security and privacy issue.
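
White’s framing lends itself to a small decision structure: check whether the problem really needs all five NIST characteristics, then pin down the three dimensions. The sketch below is our own illustration rather than a Deloitte artifact; the characteristic list follows NIST’s published definition, and the enum and class names are invented.

```python
# Illustrative encoding of the "cloud conversation" described above. The enum
# values follow White's framing; the structure and names are our own sketch.
from dataclasses import dataclass
from enum import Enum

# NIST's five essential characteristics of cloud computing.
NIST_CHARACTERISTICS = {
    "on-demand self-service",
    "broad network access",
    "resource pooling",
    "rapid elasticity",
    "measured service",
}

class Capability(Enum):        # dimension 1: what kind of cloud
    IAAS = "infrastructure-as-a-service"
    PAAS = "platform-as-a-service"
    SAAS = "software-as-a-service"
    BPAAS = "business-process-as-a-service"

class Source(Enum):            # dimension 2: what source
    PUBLIC = "public"
    PRIVATE = "private"
    HYBRID = "hybrid"
    COMMUNITY = "community"

class BusinessModel(Enum):     # dimension 3: White's four business-model layers
    SUBSCRIBER = 1
    BROKER = 2
    PROVIDER = 3
    ENABLER = 4

@dataclass
class CloudConversation:
    required_characteristics: set
    capability: Capability
    source: Source
    business_model: BusinessModel

    def cloud_is_warranted(self) -> bool:
        # White's rule of thumb: if the problem needs all five characteristics,
        # have the cloud conversation; if it needs fewer, consider utility
        # computing, managed services, or plain outsourcing instead.
        return self.required_characteristics >= NIST_CHARACTERISTICS

# Example: "I want to be a subscriber of a public cloud infrastructure."
case = CloudConversation(NIST_CHARACTERISTICS, Capability.IAAS,
                         Source.PUBLIC, BusinessModel.SUBSCRIBER)
print(case.cloud_is_warranted())  # -> True
```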

Coming up — details about privacy!


Comparing cloud use in the U.S. and Europe

August 5, 2010

Who’s using cloud more — the U.S. or Europe? What are the biggest concerns when it comes to security on both sides of the Atlantic? Should you be developing a cloud strategy now, or should you wait until next year?

These are some of the questions that the Ponemon Institute and CA Technologies posed in a recent survey of IT professionals.

Today we talk with Larry Ponemon, chairman and founder of the Ponemon Institute, and Lena Leverti, vice president of products at CA Technologies, who explain their results for us.

LP: In our experience, there are a whole bunch of interesting security topics, but what seems to rise to the top of the security heap in terms of risk and potential problems is, in fact, the cloud computing environment, which is very quickly becoming the standard for organizations — not just small and medium sized companies — but much, much larger companies, as well.

LL: One of the key things is that, as companies start adopting cloud, they’re basically giving up some of the control that they have. When the technology is within their own organization, they control it directly, so one of the biggest hurdles that’s viewed around cloud adoption is definitely security.

FCB: Who did you survey and why did you pick that group or groups?

LP: Well, the appropriate groups for this study are folks in the IT community and, more specifically, people who know something about information security. When you do a study like this, you quickly find that people wear many hats, and so many of the respondents were IT practitioners, but every respondent at least touched some aspect of information security, including network security systems, and a whole bunch of other related areas of expertise. This study is not just the U.S. only; [it] was also conducted in tandem with a group of practitioners in Europe, as well. I think that actually generated some interesting differences between the two groups.

LL: There were about 600 folks that responded to the survey.

FCB: What were some of the key findings?

LP: Probably one of the most interesting and important findings is that the respondents — these IT practitioners in both the U.S. and Europe — basically don’t have confidence that their organization has the ability to secure data and applications that are presently deployed to the cloud. So, they basically see some very significant security risks that exist today and maybe loom large on the horizon. We also found that IT practitioners in the U.S. and Europe hold relatively similar views on the reasons why cloud computing is so fashionable and so popular and so important, because it’s really about cost savings, and it’s also about speed to deploying new applications. So, even though we may say, ‘gosh, there’s a huge security risk,’ the reality is that cost and speed to deployment are probably much more important to end users.

LL: And one of the biggest challenges that came out in the survey results was that half of the respondents basically said that they’re not aware of all of the computing resources deployed via the cloud in their organization today. So, if you’re not aware of it, you really can’t secure it.

FCB: One of the things that I noticed first and foremost is the fact that you define cloud computing. When you were talking to people in the U.S. and Europe, did you notice that there was maybe a difference in the definition of cloud computing?

LP: We expected that there would be differences, and, in fact, the perception of cloud computing and what a cloud computing environment is was pretty consistent — more consistent than our . . . expectation. But I will say that, in both the U.S. and Europe, there’s confusion about private clouds and what these really mean. Is a private cloud a more secure version of a public cloud? Or, is it just simply on-premise computing where you’re using extensive virtualization? So, if there is any confusion in the marketplace, it’s probably around the private cloud environment. But, public clouds are generally well understood and the definitions are generally agreed upon.

FCB: Speaking of differences across the pond, did you find any differences between who’s using cloud in the U.S. versus who’s using cloud in Europe, especially in terms of government entities?

LL: We did. Some of the [respondents] are, in fact from the public sector and public organizations, and it is clear that public sector organizations are using cloud computing resources, perhaps not to the same extent as commercial organizations, but definitely the trend is that the government is, in fact, a very large — and potentially larger — user of cloud computing resources, because obviously it’s about cost, and governments . . . are trying to control them. One way to do that is to make sure that [they are] using the most efficient technology. But, it does create that security risk. We did see some differences in the rates of deployment between the U.S. and Europe and, in fact, the rates of deployment in the U.S. are higher than Europe, generally speaking. That’s not just for software-as-a-service, but it’s also for platform services and infrastructure services.

FCB: Did you find any causation — why that might be — or did you just look at the numbers in terms of use?

LP: We tried to figure out why there were some differences between U.S. and European companies in terms of their deployment patterns. We think that, in the U.S., probably, cloud computing is just slightly more popular, and some of the providers — especially software-as-a-service — the big providers like Amazon, Google and SalesForce.com — they probably have a larger base of customers in the U.S. But, I think that difference is small and will probably be non-existent within the next 18 to 24 months.

FCB: Let’s talk a little bit more about security, because I noticed that you not only talked about cloud security and public cloud versus private cloud, but the responsibility for security — did you find any differences between who’s responsible for IT security in a U.S. organization versus in Europe? Or, is it kind of the same?

LL: With regards to the study results, it’s definitely shared, and the reality is, it has to be shared. Basically, when you look at the responsibilities for this type of an environment, there’s the provider themselves that has some level of responsibility and accountability, [and] the owner of the information is going to be held accountable regardless of any SLA in any type of agreement with the provider. At the end of the day, if a credit card provider puts their data in the hands of a partner, they’re still going to be held accountable, and history shows that’s definitely happened. So, the responsibility is shared with IT, with the security folks, as well as the business line owner, which I think was a definite key finding in the study itself. The business owner also has a stake in this — and then, of course, the cloud provider.

FCB: What’s next? Is a report coming out of this study? What should we take from all of this data that you’ve put together?

LL: The study that we did was two-fold: it was for the consumers of cloud services, as well as the providers of cloud services. So, the study that we released was the first portion of that — for the consumers. We’ll be releasing the results of the study from the providers’ perspective, and then identifying some of the contrasts and so forth between the two.

FCB: Any wrap-up comments?

LP: We actually do believe that this issue of cloud computing from a security perspective is certainly not going away. The good news is that there are security technologies being developed and deployed that substantially reduce the risk created by the change from on-premises to cloud computing environments. So, it’s not all that bleak. There may be solutions in the future that will make that risk really negligible.

LL: Cloud security is definitely one of the areas that is viewed as high priority and, today, is viewed as a high-risk area. I believe that technologies over the next year or so will definitely close the gaps [and] reduce the risks. One of the key things that organizations and agencies can do today is clearly define a cloud security policy. Whether or not it’s part of the broader security policy, I think it’s very important to specify, from a cloud perspective, whether that policy applies in full or what the additional requirements and mandates for cloud security are. That will help close that gap faster and reduce the risk significantly — just by creating awareness.


FAA’s Air Traffic Organization considers much with cloud pilots

August 3, 2010

When looking at the cloud, there’s a lot to consider.

Today we discuss the myriad concerns about cloud with Steve Cooper of the Federal Aviation Administration’s Air Traffic Organization.

He tells us what’s going on in terms of cloud computing and Web 2.0 at his office, and why he’s taking a look in the first place.

SC: First of all, we’ve got a small tiger team inside of our organization focused on cloud computing. We’re looking at [it] a little bit differently. We’re after demonstrating and proving the viability of cloud computing as a way to more quickly stand up capability.

Today, if an internal customer comes to us and says, ‘Hey, we’ve got this application that we’ve been developing and we now need it hosted in one of our data centers,’ — well, that’s us. We host all this stuff. We run our data centers. But, in order to do that, we actually have to plan ahead. In some cases, we’ve got to plan 3 to 6 months ahead. We have to procure servers. Even with our virtualized environment, we still may have to provide servers. We may have to make sure that we’ve got any kind of specialized equipment necessary for the well-being and life cycle support of that application. That lead time is 3 to 6 months. You’ve got to get it through procurement hoops. You’ve got to make sure that you’ve got the right types of skill sets on board, and so forth and so on. I’m pretty sure that your audience will identify with that type of scenario in government today.

Here’s the tremendous advantage of cloud computing. Suppose that I can change that 3 to 6 month time frame to 3 to 6 days, or hours. Look at the phenomenal capability that we now bring in providing the benefit of whatever that investment was. Look how much more rapidly we can bring that capability online for the air traffic organization. That’s the exciting part of cloud computing. And if I can do that safely, if I can do it in a way where I may not have the full skill set to do something I want to do, that’s what I’m after. So, our pilot [programs] — they’re not real sexy but they demonstrate the viability.

We’re looking at bringing up collaboration capability in the cloud. That’s area number one — and that really is social media. That’s some of the new stuff we talk about with Web 2.0. What do we do with YouTube? What do we do with Facebook? What do we do with MySpace? What do we do with something like LinkedIn for networking and collaboration? What do we do with video conferencing? What do we do with individual Web conferencing on everybody’s desktop or laptop? How do we do all that stuff, and how do we do that in a way that makes it available instantaneously to any and all of us?

Area number two — we’re looking at actually moving our software development environment into the cloud. We’ve got tremendous challenges in replicating our production environments, and we can’t always economically afford all the different types of servers and equipment to fully replicate our production environment. But, if we could give that to a third party and, again, do this within all compliance rules and regulations, why wouldn’t I do that? Particularly if I can also do it in a more cost-effective manner.

[The] third area we’re looking at [is] email archiving in the cloud. Although the cost of unit storage has come down tremendously, our storage demands are growing at a phenomenal rate. The FAA and the Air Traffic Organization produce terabytes and petabytes of data. The cost of providing that in our own physical environment is increasing as an absolute percentage of our total investment in IT. Well, heck, if I can do that more efficiently or more cost-effectively by going to a cloud provider, I’d love to do it.

FCB: One of the things we’re seeing is that cloud is such a big trend, but some people are hesitant to jump in [and] some people are jumping in, like you guys are, with both your feet — [but] the reason I hear, most of the time, [for the hesitation] is security, security, security. It sounds to me like you found a way to deal with those security concerns, but also make it beneficial for the ATO.

SC: Listen, I wish I had. Those folks are absolutely right. You’re absolutely right — and, no, we don’t have a solution to the security challenges. But, what we’re doing is, when I use the words, ‘We’re looking at cloud computing to determine if it’s viable,’ that absolutely includes the [cybersecurity] aspects of cloud computing. So, as we’re reaching out to some partners externally in industry to partner with us — information security, cybersecurity absolutely is part of these proofs of concept that we’re doing.

Hear more with Steve Cooper on Ask the CIO.


For learning and development teams, cloud is nothing new

July 29, 2010

What does the emergence of cloud computing really mean for your organization?

One expert says that, despite the buzz, cloud computing should be looked at as nothing new when it comes to learning and development (L&D).

Billy Biggs is director of learning strategies at General Physics Corporation and recently wrote a white paper about how L&D should handle cloud. He tells FCB more about why this really can be business as usual, but starts out by explaining that many are still wary because, well, the definition of cloud remained unsettled for a relatively long time.

BB: I think the difficulty around trying to define what cloud computing [is stems from the fact that] it’s still evolving. The technology is still evolving and, up until late last year, you really had no authoritative body providing one specific definition.

[Once] NIST released its definition late last year, it became a little easier for folks to get their hands around it, but to complicate matters, I think that the different cloud offerings that are available to consumers — whether it’s software-as-a-service, infrastructure-as-a-service or platform-as-a-service — with all the complicated ecosystems of vendors and partners and approaches, really complicate what cloud computing means to a lot of different people.

FCB: In your paper, you talk about [being] behind the firewall and third party, externally hosted services. How is cloud different and is cloud even necessarily better?

BB: I’m not sure it’s necessarily better at this point, but it’s certainly different. I would kind of describe it as that shiny and new technology or toy that’s out there and available.

Behind the firewall and third party hosting models are now considered more traditional models, where there are usually constraints on user populations, data or content.

Cloud is a little bit different in that it’s more of a pay-as-you-go model. It’s sold on demand. There are really no restrictions on data or content consumption by users.

I like to use the example of buying a car, versus, perhaps, having a taxi available at your discretion. So, in a traditional hosting model, you would have a situation where you would make a purchase, just like going out to buy a car, and you could do anything you wanted to that car — or that application, whether it’s customized or integrated with other systems.

Cloud’s completely different. You pay as you go. For example, if you use the application a great deal one month, your costs are going to be a lot higher, like a taxi would if you used it a couple of days a week, versus a couple of times a day. It’s kind of the same concept.
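
The car-versus-taxi comparison is, at bottom, a break-even calculation. Here is a minimal sketch of that arithmetic; every price and usage figure in it is invented purely for illustration.

```python
# Rough break-even sketch of the car-vs-taxi (buy-vs-pay-as-you-go) analogy.
# Every price and usage figure below is invented purely for illustration.
def owned_cost(upfront: float, monthly_ops: float, months: int) -> float:
    """Traditional hosting: a large purchase up front plus steady operating cost."""
    return upfront + monthly_ops * months

def payg_cost(rate_per_hour: float, hours_per_month: float, months: int) -> float:
    """Cloud: no purchase; cost tracks how heavily the service is actually used."""
    return rate_per_hour * hours_per_month * months

months = 36
owned = owned_cost(upfront=20_000, monthly_ops=300, months=months)
light = payg_cost(rate_per_hour=2.00, hours_per_month=100, months=months)
heavy = payg_cost(rate_per_hour=2.00, hours_per_month=700, months=months)

print(f"owned: ${owned:,.0f}  light pay-as-you-go: ${light:,.0f}  heavy: ${heavy:,.0f}")
# In this made-up example, occasional use favors pay-as-you-go,
# while near-constant heavy use favors owning.
```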

FCB: Let’s talk a little bit about deciding whether cloud is appropriate. Using email in the cloud — I think a lot of people have been doing that for years, but what about infrastructure-as-a-service, or some of these bigger moves? What questions should IT professionals be asking themselves before making the move to cloud?

BB: So, software-as-a-service, or email in the cloud, [has] been out for a couple of years now [and is] pretty easy to understand. Infrastructure-as-a-service is especially compelling when vendors who are at capacity in their private clouds are able to tap into additional public clouds for additional resources, whether it’s processing, storage, network or security resources. It’s those types of components that really fall under the infrastructure-as-a-service offering. As far as whether it’s appropriate or not, I think holistically you have to understand, just like any other technology, what business problems the cloud will ultimately solve.

If it’s SaaS or platform or infrastructure — what is the business problem, and is the cloud offering [going] to solve that? I would suggest walking through any successful evaluation criteria or process that an organization has used in the past to evaluate whether an enterprise migration to a cloud offering is going to work, whether that starts with a business case or evaluating case studies, best practices and lessons learned.

The main question I would ask as a potential vendor going to a cloud offering is: what does the support model look like from the cloud service provider? You’re changing the dynamics a little bit in that you’re not going to have the same administrative access to the application or infrastructure that you once did, and thus you could be exposed to outages or other service interruptions.

At the end of it, I think that the best approach is to create a delta checklist, to walk through every aspect of the application or infrastructure, and understand exactly how different the cloud model is going to be from the traditional hosting model. The areas I would focus on are access, identity management, integration with other applications, and, obviously, security is always a big consideration.
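
Earlier in that answer, Biggs points to private-to-public bursting as the compelling infrastructure-as-a-service case. A minimal sketch of that placement decision, with hypothetical pool names and capacity numbers, might look like this:

```python
# Minimal sketch of the private-to-public "bursting" pattern described above:
# prefer the private cloud, and overflow to a public cloud only when the
# private pool is at capacity. Classes, names and numbers are hypothetical.
from dataclasses import dataclass

@dataclass
class Pool:
    name: str
    capacity_units: int
    used_units: int = 0

    def can_fit(self, units: int) -> bool:
        return self.used_units + units <= self.capacity_units

def place_workload(units: int, private: Pool, public: Pool) -> str:
    """Return where the workload landed, bursting to public only on overflow."""
    if private.can_fit(units):
        private.used_units += units
        return private.name
    if public.can_fit(units):
        public.used_units += units
        return f"{public.name} (burst)"
    raise RuntimeError("no capacity available in either pool")

private = Pool("private-cloud", capacity_units=100, used_units=95)
public = Pool("public-cloud", capacity_units=10_000)
print(place_workload(20, private, public))  # -> public-cloud (burst)
```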

FCB: I do a lot of interviews and people talk about why you should move to the cloud, but in your experience, is there ever an argument for not making the move?

BB: Yes, I definitely think, just as there are reasons to consider going to the cloud, there are reasons to consider not going or, at this point, holding off. There have been concerns around the lack of customization in cloud solutions, specifically related to SaaS, and the lack of configurability of these applications — they’re more locked down, in some instances. A lot of folks don’t like to go with the forced upgrades or quarterly releases. So, that’s a consideration.

If you have a complex governance model, or change management process, quarterly releases may be just too much to keep up with. And then [there is] all the data integration. If you have a lot of other systems talking to each other, that presents specific challenges in a SaaS environment. Number one has got to be security, though.

There are still security concerns out there related to PII and sensitive data and where that data actually resides. Contingency planning and batch recovery options always come into play.

I think you’re going to have security concerns until NIST finishes its roadmap and standards on cloud computing, giving the cloud vendors some kind of chance to walk through some kind of certification or an accreditation process or program that will help ease security concerns with the community as a whole.

FCB: Final question about virtualization — what role has [it] played when it comes to cloud?

BB: Well, I think . . . without it, you’re not going to see the most mature models of cloud that exist now. The tools out there allow applications to become self-contained units. So, you’re kind of rolling the database or the operating system — all of that together to be self-contained. Therefore, applications and infrastructure are now considered independent of one another, which is a huge leap from traditional hosting models, now that you can have multiple applications run on VMs on the same physical server. Now IT departments and hosting providers can essentially provision new VMs as demand for system resources increases without significant hardware purchases, thus creating more IT agility, if you will.
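
The consolidation and elasticity Biggs describes can be pictured as a simple packing loop: place each new VM on a host that still has headroom, and add a host only when nothing fits. The sketch below is our own illustration; the host sizes and names are made up.

```python
# Illustrative sketch of the consolidation-and-elasticity point above: several
# VMs share one physical host, and new capacity is added only when demand
# outgrows the hosts already in service. All names and sizes are made up.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Host:
    name: str
    cpu_capacity: int
    cpu_used: int = 0
    vms: List[str] = field(default_factory=list)

def provision_vm(hosts: List[Host], vm_name: str, cpu_needed: int) -> str:
    """Pack the VM onto an existing host when possible; stand up a new host
    only when nothing fits (the agility gain over one-app-per-server)."""
    for host in hosts:
        if host.cpu_used + cpu_needed <= host.cpu_capacity:
            host.vms.append(vm_name)
            host.cpu_used += cpu_needed
            return f"{vm_name} placed on existing {host.name}"
    new_host = Host(name=f"host-{len(hosts) + 1}", cpu_capacity=16,
                    cpu_used=cpu_needed, vms=[vm_name])
    hosts.append(new_host)
    return f"{vm_name} required new {new_host.name}"

hosts = [Host("host-1", cpu_capacity=16)]
for i in range(1, 6):
    print(provision_vm(hosts, f"app-vm-{i}", cpu_needed=4))
# The first four VMs share host-1; the fifth triggers a new host.
```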

FCB: Anything I missed that you might want to add?

BB: I think if you’ve been paying attention to what’s been going on at the Microsoft Worldwide Partner Conference, Microsoft’s message is essentially that they’re all in on their cloud offering and are pushing that out via their partner networks. So, I don’t think cloud is going anywhere. I think it’s here to stay. With that said, I think that organizations don’t need to be afraid to take a kind of wait-and-see attitude. If they’re pressured to go to cloud to reduce costs, a wait-and-see right now for the next six to 12 months is not necessarily a bad approach. I think there’s going to be a new variety of cloud offerings and hybrids available in the next few years.

If you are pushing to go to cloud, I would just suggest the organization . . . follow the same successful process that you’ve used to evaluate any major technology investments in the past.


Checking out the cloud at the Centers for Medicare and Medicaid

July 21, 2010

The Centers for Medicare and Medicaid Services is being required to modernize its technology infrastructure.

Julie Boughn is chief information officer at CMS and tells Fed Cloud Blog how her office is using some cutting-edge technologies, including cloud computing.

“I love cloud. When you talk about software-as-a-service or even platform-as-a-service, the potential is actually astronomical, especially around things that maybe are lower risk and aren’t our core business, but have never been able to get funded. To be able to do projects like those. . . . We don’t have to develop a whole system, or host [some programs] in one of our data centers because we can get the whole service that we need, build a business process around it, and it can be up in a week, as opposed to what typically would take months to get through a regular IT investment life cycle.

“What remains to be seen for me is where . . . we start to draw the line. [For example] Medicare fee-for-service claims processing. It’s hard for me to imagine that happening in the cloud, but that could be just because I’m being old and stodgy. I hope that’s not the only reason, but it’s just hard for me to imagine that being there. Our backup — backup is the thing that happens in the cloud for a lot of . . . personal users. I’ve heard of some big companies that use cloud for backup and recovery. That one is even hard for me, too, and that’s mostly because of our scale and size. But, I’m very open-minded . . . and where it makes sense I’m going to be gung-ho supportive of it.”

Hear Boughn talk more about modernization and virtualization at CMS on Ask the CIO.