EU project to fix security holes in cloud computing

November 30, 2010

Is cloud computing a safe and private way to access data? There’s no doubt about it: security is a major concern when it comes to cloud computing.

But now, the European Union is working on a three-year project to create technical fixes and policies designed to make cloud computing a more secure option for businesses and people alike.

Cloud computing brings a range of questions about security and privacy — everything from technical to regulatory and legal aspects.

The European Union is spending $10.1 million – or 7.5 million euros – on the project, known as “Trustworthy Clouds,” or TClouds.

IBM is leading the project, but more than a dozen other companies and research organizations in Europe will be involved.

Research from TClouds will be published in scientific journals and organizers say proposed standards could one day be incorporated into software. However, most people won’t see the effects of the project in everyday life for the next 10 years – or more.

TClouds will also research better privacy protocols for transferring data securely and seamlessly between two different cloud computing service providers.

Developing security standards and creating open APIs – or application programming interfaces – are also objectives of TClouds.
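To make the interoperability goal concrete, here is a minimal Python sketch of what a provider-neutral storage interface might look like. Everything in it — the CloudStore contract, the InMemoryStore stand-in and the migrate helper — is hypothetical and purely illustrative; it is not the API TClouds is developing, nor any real provider’s SDK.

```python
# Illustrative only: a provider-neutral storage interface of the kind an
# open cloud API might standardize. Class and method names are hypothetical,
# not part of TClouds or any real provider SDK.
from abc import ABC, abstractmethod


class CloudStore(ABC):
    """Minimal storage contract that any participating provider could implement."""

    @abstractmethod
    def get(self, key: str) -> bytes: ...

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...


class InMemoryStore(CloudStore):
    """Stand-in for a real provider backend, used here for demonstration."""

    def __init__(self):
        self._objects: dict[str, bytes] = {}

    def get(self, key: str) -> bytes:
        return self._objects[key]

    def put(self, key: str, data: bytes) -> None:
        self._objects[key] = data


def migrate(source: CloudStore, target: CloudStore, keys: list[str]) -> None:
    """Copy objects between providers through the shared interface only."""
    for key in keys:
        target.put(key, source.get(key))


if __name__ == "__main__":
    old_provider, new_provider = InMemoryStore(), InMemoryStore()
    old_provider.put("report.txt", b"quarterly figures")
    migrate(old_provider, new_provider, ["report.txt"])
    print(new_provider.get("report.txt"))  # b'quarterly figures'
```

The appeal of an open interface like this is that moving data from one provider to another only touches the shared contract, never provider-specific code — the kind of seamless transfer between clouds the project is targeting.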


Friday cloud news round up

July 16, 2010

Today, we bring you your weekly cloud news round up.

  • As Microsoft continued a full-court press to get its partner companies to sell cloud-computing services, some of them were still scratching their heads over whether Microsoft’s advances in cloud computing could end up biting into a chunk of their own businesses. The Seattle Times reports that questions came as Microsoft announced new products and sales-support programs at its Worldwide Partner Conference this week that are aimed at helping partners make the jump. About 13,000 people representing companies that resell, build on and sell services based on Microsoft products are attending the conference at the Washington Convention Center.
  • Analyst firm Gartner published a set of guidelines intended to ease relationships between cloud vendors and users. As cloud computing becomes more pervasive, the ecosystem (including vendors and analysts) is seeking ways to align expectations among relevant parties, ZDNet reports. Gartner specified “six rights and one responsibility of service customers that will help providers and consumers establish and maintain successful business relationships.”
  • Navatar Group, a global Salesforce.com partner and value-added reseller, has introduced free cloud computing CRM for financial services firms. Officials with Navatar Group said the company is now providing free CRM for eight months to help financial firms prepare for the expected rebound in the worldwide financial markets, TMCNet reports. Company officials said this would let financial firms get pre-built software-as-a-service for their business up and running within a day or two. Firms will also get the underlying Force.com seats from Salesforce.com free as part of this promotion.
  • Information technology company IBM on Thursday announced a new IBM Cloud Computing Competence Centre in Ehningen, Germany, home to IBM’s largest data centre in Europe, TradingMarket.com reports. The new facility will host a range of technology platforms and optimised service delivery processes and, according to the company, will provide a broad range of cloud solutions and services to clients locally and internationally.

Capitol Hill looks at cloud

July 8, 2010

And now, your weekly cloud news round up!

  • Federal agencies are moving into the cloud, but what does the legislative branch think of this? Rep. Edolphus Towns (D-N.Y.), chairman of the House Oversight and Government Reform Committee, recently held a hearing on the issue. Federal CIO Vivek Kundra and Greg Wilshusen, director of Information Security Issues at the GAO, both testified at the hearing. Fed Cloud Blog will bring you details next week.
  • IBM has reached a deal with the European Union on cloud computing. PCWorld reports that a consortium will be established to research new cloud-computing models to reduce the cost of hosting and maintaining Internet-based services.
  • Microsoft is cutting hundreds of jobs worldwide as it plans to create new cloud-focused positions down the road. TechFlash is reporting that many of the cuts are going to happen in marketing departments. Some, but not all, employees are expected to apply for other jobs within the company.
  • Cloud-based email is becoming the new battleground, apparently. Newsweek reports that companies such as Microsoft and Google are increasingly pushing their customers to move email services into the cloud.
  • What should you ask your cloud provider, now that you have decided to make the move? ReadWriteCloud recently posted 12 questions that all IT managers should posit — or, at least, consider — before hiring a service provider. Check them out!

What’s in a whitepaper? A lot, it turns out!

November 4, 2009

Cloud computing is working its way into the lexicon of the IT world, but many organizations have yet to fully move into the cloud.

The reasons vary, especially when it comes to federal agencies. Some are concerned with cost, while others are wary about security.

A number of whitepapers have recently been released regarding federal cloud computing and IT infrastructure. We examine them all and pull out the essential information just for you.

Less Is More

The Federal Enterprise Architecture (FEA) is certainly not new to anyone in the IT sphere, but different challenges arise each fiscal year because agencies are being asked to do more with less.

That is, IT managers are supposed to work to reduce the duplication of services while sharing information across government without letting costs go up.

Whew. I’m tired already.

In a recent whitepaper, Enhancing the Efficiency of Your IT Infrastructure, William Clark, CTO at CA, outlines how some agencies have already started doing this.

“Government agencies have already taken steps to reduce IT costs. Many agencies have begun deploying shared services and service centers to support human resources, payroll, financial services and more. They are consolidating data centers to reduce their numbers and associated costs. Agencies are even reducing the numbers of Internet connections to simplify security.”

In addition, Clark says most IT organizations today spend over 70 percent of their resources on maintaining existing systems.

Therefore, if an agency or organization can figure out how to use existing resources more efficiently, it can free up dollars for new projects.

This is where e-Gov comes in.

In another whitepaper, Make Your IT Organization More Effective, Clark says:

As a result of E-Gov government-to-government initiatives, many government agencies have begun to buy managed services such as human resources, financial management, payroll, and so on from other agencies instead of executing them within their own operations.

In this paper, Clark outlines another reason consolidation and automation might be beneficial.

Youth.

He asserts that younger workers are often more ambitious and therefore won’t be content performing mundane maintenance tasks on legacy systems.

Automated IT management systems improve IT resource utilization and can help reduce your hiring requirements. Installing automated solutions also makes recruitment easier. By utilizing the latest management applications, you can create better jobs and support opportunities for staff, allowing them to play more strategic roles. In this way, automation can improve both knowledge retention and recruiting efforts.

Thus, on careful examination, cloud computing could help an organization or agency do more with less.

Back in the Day

Cloud computing is by no means a new-fangled idea. (According to Wikipedia, The Cloud borrows from telephony, if you can believe it.)

But it is currently being used in newer, more expanded ways. As we said earlier, more people, organizations and agencies are operating within it, which means it’s becoming more popular.

In addition, the concept of virtualization has changed.

A recent whitepaper from IBM, Seeding the Clouds: Key Infrastructure Elements for Cloud Computing, outlines the metamorphoses.

In the 1990s, the concept of virtualization was expanded beyond virtual servers to higher levels of abstraction—first the virtual platform, including storage and network resources, and subsequently the virtual application, which has no specific underlying infrastructure. Utility computing offered clusters as virtual platforms for computing with a metered business model. More recently software as a service (SaaS) has raised the level of virtualization to the application, with a business model of charging not by the resources consumed but by the value of the application to subscribers.

As technologies like virtualization continue to grow, cloud computing will probably become more widely used.

Enter the Cloud

In his whitepaper, The Perfect Storm for Enterprise-Class Clouds, Dr. Michael Salsburg, Distinguished Engineer at Unisys, examines what he calls the ‘perfect storm’ that has created the ideal atmosphere for cloud computing.

One of the main features – and many would say benefits – of cloud computing concerns the flexibility of services offered.

Indeed, the “killer apps” for cloud computing have been Software as a Service (SaaS) applications. The expectation is that, without installing a “fat” application, the user can access the service immediately through any device that supports a ubiquitous web browser. So, simply stated, the cloud is basically a processing plant for services that can be consumed worldwide by other software or end users.

Because there is no way to predict who will need certain services or when they will be needed, real-time infrastructures (RTIs) developed. RTIs led to the creation of real-time enterprises (RTEs), which can respond in real time to a sudden change in business conditions.

The capabilities that are necessary to provide an RTI are relatively new and have caused this “perfect storm” of technological development in the last ten years so that clouds could appear. Whether the development of RTIs caused the formation of clouds – or the other way around – is a debate for another day. But there is no doubt that the “perfect storm” includes the emergence of some key technologies: server virtualization, extreme automation and service-oriented architectures.

The last element – service-oriented architecture – is perhaps most important to the enhancement of cloud computing.

The key is that the infrastructure is architected to deliver and manage services, which is at the heart of software as a service. Because the infrastructure’s elasticity is driven by the need to provide quality of service, services need to be identified, monitored and governed. Emerging technologies are starting to appear that manage the “product” being delivered by clouds – services.
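To illustrate the “identified, monitored and governed” idea in the passage above, here is a small, hypothetical Python sketch in which each service registers itself and exposes simple health metrics to a governance layer. The names (ServiceRecord, Registry) and the metrics chosen are assumptions made for illustration, not anything taken from the Unisys paper.

```python
# Hypothetical sketch of "identify, monitor, govern": each service registers
# itself and reports simple health metrics that a governance layer can poll.
# Names and metrics are illustrative, not from any real product.
import time
from dataclasses import dataclass, field


@dataclass
class ServiceRecord:
    name: str
    started_at: float = field(default_factory=time.time)
    requests_served: int = 0

    def health(self) -> dict:
        """Report the kind of metrics a quality-of-service monitor might track."""
        return {
            "name": self.name,
            "uptime_seconds": round(time.time() - self.started_at, 1),
            "requests_served": self.requests_served,
        }


class Registry:
    """Toy governance layer: knows which services exist and how they are doing."""

    def __init__(self):
        self._services: dict[str, ServiceRecord] = {}

    def register(self, name: str) -> ServiceRecord:
        record = ServiceRecord(name)
        self._services[name] = record
        return record

    def report(self) -> list[dict]:
        return [svc.health() for svc in self._services.values()]


if __name__ == "__main__":
    registry = Registry()
    payroll = registry.register("payroll-api")
    payroll.requests_served += 3
    print(registry.report())
```

Metrics like these are what allow an elastic infrastructure to scale a service up or down to maintain quality of service while using only the resources it needs.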

And, again, we are back to efficiency. The business model of a cloud allows for the more efficient use of existing resources.

IBM’s whitepaper says this helps IT organizations to repurpose resources that might be going unused or have been forgotten.

Clouds provide request-driven, dynamic allocation of computing resources for a mix of workloads on a massively scalable, heterogeneous and virtualized infrastructure. The value of a fully automated provisioning process that is security compliant and automatically customized to user’s needs results in:

  • Significantly reduced time to introduce technologies and innovations;
  • Cost savings in labor for designing, procuring and building hardware and software platforms;
  • Cost savings by avoiding human error in the configuration of security, networks and the software provisioning process;
  • Cost elimination through greater use and reuse of existing resources, resulting in better efficiency.
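As a rough illustration of the request-driven provisioning the IBM quote describes, the following Python sketch allocates capacity from a shared pool in response to a workload request. It is a toy model under assumed names (ProvisionRequest, CapacityPool); real clouds do this with schedulers, machine images and templates rather than a dictionary of counters.

```python
# A minimal sketch of request-driven provisioning, assuming a simple shared
# capacity pool. Every name here is hypothetical rather than drawn from the
# IBM whitepaper.
from dataclasses import dataclass


@dataclass
class ProvisionRequest:
    workload: str
    cpus: int
    memory_gb: int


class CapacityPool:
    """Shared, virtualized capacity that requests draw from and return to."""

    def __init__(self, cpus: int, memory_gb: int):
        self.free_cpus = cpus
        self.free_memory_gb = memory_gb

    def provision(self, req: ProvisionRequest) -> dict:
        """Allocate resources for a workload if the pool can cover the request."""
        if req.cpus > self.free_cpus or req.memory_gb > self.free_memory_gb:
            raise RuntimeError(f"insufficient capacity for {req.workload}")
        self.free_cpus -= req.cpus
        self.free_memory_gb -= req.memory_gb
        # In a real system this step would also apply the security baseline and
        # network configuration automatically, which is where the "avoiding
        # human error" savings cited above come from.
        return {"workload": req.workload, "cpus": req.cpus, "memory_gb": req.memory_gb}

    def release(self, allocation: dict) -> None:
        """Return capacity to the pool so other workloads can reuse it."""
        self.free_cpus += allocation["cpus"]
        self.free_memory_gb += allocation["memory_gb"]


if __name__ == "__main__":
    pool = CapacityPool(cpus=64, memory_gb=256)
    vm = pool.provision(ProvisionRequest("hr-portal", cpus=4, memory_gb=16))
    print(vm, pool.free_cpus, pool.free_memory_gb)
    pool.release(vm)
```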

Show Me the Money

This brings us to the budget at your agency.

According to a recent whitepaper by Ted Alford and Gwen Morton of Booz Allen Hamilton, The Economics of Cloud Computing, the President’s budget for FY10 includes $75.8 billion in IT spending.

While this might sound like a lot, and is a 7 percent increase from FY09, the times are a-changin’.

The FY11 IT budget is projected to be almost $88 billion – and it’s clear that the government can’t sustain such a spending trajectory.

There are three groups, according to the whitepaper, that encompass what’s going on in the federal government right now: public cloud adopters, hybrid cloud adopters, and private cloud adopters.

Each group is moving at its own pace for a variety of reasons, but the paper suggests that the costs of moving to the cloud ideally should be wrapped in with an agency’s data center budget.

Data centers capture the most significant portion of the costs associated with moving IT infrastructure to the cloud. However, agencies publicly report only their “consolidated” IT infrastructure expenditures, which include end-user support systems (e.g., desktops, laptops) and telecommunications. Additional spending on application-specific IT infrastructure is typically rolled up into individual IT investments.

Throughout the paper, the authors explain how they developed their own data profile that they feel best represents actual agency IT spending.

Their conclusion is that, while some agencies such as the Defense Information Systems Agency (DISA) are already well on their way to implementing cloud computing, many agencies will have to wait one to two years because of the way budgets are developed and implemented.

The timeframe for reprogramming IT funding to support cloud migrations is likely to be at least 1 – 2 years, given that agencies formulate budgets 18 months before receiving appropriations. Specifically, IT investment requests are developed each spring and submitted to OMB in September, along with an agency’s program budget request, for the following government fiscal year (GFY). OMB reviews agency submissions in the fall and can implement funding changes via passback decisions (generally in late November) before submitting the President’s budget to Congress in February. Theoretically, the earliest opportunity for OMB to push agencies to revise their IT budgets to support a transition to the cloud will be fall 2009; however, agencies typically only have about 1 month to incorporate changes to their IT portfolios during passback. To give GSA and OMB time to develop more detailed guidance, as well as necessary procurement mechanisms and vehicles, it is more likely that OMB will direct or encourage agencies to plan for cloud migrations during the FY 12 budget cycle (starting in the spring of 2010).

In short, it won’t simply be up to agency IT managers to push for cloud computing.

Certain agencies within the federal government, such as OMB, GSA and NIST, will have to continue to monitor the situation and give IT managers clear guidance concerning all aspects of migration to the cloud.