
Tuesday, March 5, 2013

Microsoft issues free tool it says will cut Azure costs


If it works as advertised, a new free tool called MetricsHub can help cut the cost of using Microsoft's Azure cloud services by automatically making the service run more efficiently and by increasing and decreasing Azure resources as customer demand fluctuates.
Microsoft bought the company, which it also helped finance through its Microsoft Accelerator program for startups, and is making its software available as part of Azure services.
MetricsHub is a service that monitors customers' Azure systems and makes changes based on rules set by customers. Alternatively, customers can switch on a standard template of best-practices rules recommended by MetricsHub.
The service gathers data from each customer's Azure resources from one of two sources: Windows Azure diagnostics or a MetricsHub agent installed within the customer's Azure infrastructure.
If customers use the first option, the gathered data is kept within the customer's Azure storage block, typically costing 50 cents per instance per month, MetricsHub says on its website. With this option, MetricsHub gathers only data that Azure makes available to any third-party developer via its APIs.
If customers use the second option, they install a MetricsHub agent in their Azure cloud, which requires redeploying each instance of the customer's applications. This option allows gathering of more information. "Collecting data with this method gives MetricsHub deeper integration with your system, resulting in more informed decisions," MetricsHub says.
Specifically, the agent gathers metrics about virtual machines and websites, something the APIs don't reveal.
MetricsHub analyzes the data and, using a feature called ActiveScale, adds or drops application instances to match actual demand. MetricsHub also produces a report that breaks down Azure bills to explain exactly what is being paid for.
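MetricsHub hasn't published the exact rules behind ActiveScale, but the basic pattern of threshold-driven scaling is easy to sketch. The short Python below is a minimal illustration of such a rule; the metric names and thresholds are assumptions chosen for illustration, not anything taken from MetricsHub's or Azure's APIs.

# Minimal sketch of a threshold-based scaling rule of the kind MetricsHub
# applies. Metric names and thresholds are hypothetical placeholders.

def decide_instance_count(avg_cpu_percent: float,
                          current_instances: int,
                          min_instances: int = 2,
                          max_instances: int = 20) -> int:
    """Return the instance count a simple rule would request."""
    if avg_cpu_percent > 75 and current_instances < max_instances:
        return current_instances + 1          # scale out under load
    if avg_cpu_percent < 25 and current_instances > min_instances:
        return current_instances - 1          # scale in when idle, cutting cost
    return current_instances                  # otherwise leave the deployment alone


# Example: a deployment of 4 instances averaging 82% CPU would grow to 5.
print(decide_instance_count(avg_cpu_percent=82, current_instances=4))  # -> 5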
MetricsHub can be used as a monitoring platform as well, providing health and performance monitoring and sending email, SMS or pager alerts when it detects potential problems.
In announcing that it had bought MetricsHub, Microsoft acknowledges that it's difficult, without tools, to gather and analyze the data needed to scale applications up and down efficiently. MetricsHub makes those decisions and acts on them, says Bob Kelly, a Microsoft corporate vice president for strategy and business development, in his blog. "It also ensures customers are only paying for what they need and maximizing the services they're using," he says. "We think it's going to save customers time, money and headaches."
Customers can opt in to using MetricsHub via the Azure Store. Any paying customers will be converted to the free version.
Tim Greene covers Microsoft for Network World and writes the Mostly Microsoft blog. Reach him at tgreene@nww.com and follow him on Twitter @Tim_Greene.

Friday, January 18, 2013

6 Hidden Costs of Cloud and How to Avoid Them


It should come as no surprise at this point that organizations of all sizes are flocking to the cloud with high hopes of reducing CapEx, making OpEx more predictable, enhancing scalability, making management easier and improving disaster preparedness. In fact, here in the opening weeks of 2013, a new study by Symantec finds that 94 percent of enterprises are at least discussing cloud or cloud services, up from 75 percent a year ago. But Symantec also reports that companies that rush into cloud deployments inevitably encounter a host of hidden costs.
ReRez conducted Symantec's Avoiding the Hidden Costs of Cloud 2013 Survey from September to October 2012, gathering responses from 3,236 organizations in 29 countries--1,358 of the responses came from smaller and midsize businesses, while 1,878 came from larger enterprises.
"This is a broad, robust survey," says Dave Elliott, senior product marketing manager for Global Cloud Marketing at Symantec. "It was in planning for nine months and took two months to implement. What we found is that organizations have, in fact, actually embraced the cloud. Organizations have said, 'Yes, the cloud is a real thing. We're there.'"
But ReRez and Symantec also found that the path to the cloud is often a rocky one.
"There were a bunch of hidden costs or second-order issues that organizations are facing when they move to the cloud," Elliott says. "In their rush to the cloud, they perhaps haven't thought through all the implications of it. These second-order issues are significant, they're real, but frankly they're easy to overcome with just a little bit of planning."
The most common hidden costs are tied to rogue cloud use, complex backup and recovery, inefficient storage, compliance and eDiscovery issues, and data-in-transit issues, according to the study.
Rogue Cloud Implementations
The survey found that 77 percent of businesses saw rogue cloud deployments last year--implementations of public cloud applications by business groups that are not managed by IT or integrated into the company's IT infrastructure. It is more common among enterprises, 83 percent of which saw rogue cloud deployments within the last year. Among the SMB respondents, 70 percent said they experienced rogue cloud deployments within the last year.
"It's not getting any better," Elliott says. "In fact, it may be getting worse. Seventy-nine percent think it's going to stay as bad as it is or get worse."
And those rogue cloud deployments often lead to issues. The survey found that 40 percent of organizations that reported rogue cloud issues experienced the exposure of confidential information as a result. More than 25 percent said they faced account takeover issues, defacement of Web properties or stolen goods or services as a result.
"By taking control of cloud deployments, companies can seize advantage of the flexibility and cost savings associated with the cloud, while minimizing the data control and security risks linked with rogue cloud use," says Francis deSouza, group president of Enterprise Products and Services at Symantec.
Cloud Backup and Recovery Issues
The survey also found that cloud complicates backup and recovery.
"Organizations are rushing to move to the cloud, but they don't think through how important backup and recovery is," Elliott says. "Sixty-one percent of respondents use three or more solutions to back up physical, virtual and cloud data. That's just really inefficient."
It leads to increased risk and training costs, he says. In addition, 43 percent of organizations say they have "lost" cloud data (47 percent of enterprises and 36 percent of SMBs) and had to recover from backups. Elliott clarifies that "lost" could mean actually lost, but it could also mean deleted or even lost or damaged by the cloud service provider. To make matters worse, 68 percent of organizations reported recovery failures when attempting recovery of data in the cloud.
That includes data that may have been recovered eventually, but not in time to meet a particular need. Twenty-two percent of organizations report that it can take three or more days to recover from a catastrophic loss of data in the cloud.
Inefficient Cloud Storage
The simplicity of provisioning storage in the cloud leads to another hidden cost, according to Elliott. One of the reasons organizations love cloud storage is that they pay only for what they use, in theory anyway. But that's true only if you work to maintain efficiency. While most organizations strive to maintain a storage utilization rate above 50 percent, cloud storage utilization is much lower: a mere 17 percent on average. Enterprises do a little better here with an average utilization rate of 26 percent, while SMBs only manage a "shockingly" low 7 percent average utilization. The problem is compounded by the fact that about half of organizations admit that little if any of their cloud data is deduplicated.
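A rough back-of-the-envelope calculation shows why those utilization figures hurt. The per-gigabyte price below is purely an assumed placeholder for illustration, not a figure from the Symantec survey.

# Illustrative cost of low cloud-storage utilization. The $0.10/GB-month price
# is an assumed placeholder, not a figure from the Symantec survey.
price_per_gb_month = 0.10
paid_for_gb = 100_000            # 100 TB of cloud storage being paid for

for label, utilization in [("survey average", 0.17), ("enterprise", 0.26), ("SMB", 0.07)]:
    useful_gb = paid_for_gb * utilization
    monthly_bill = paid_for_gb * price_per_gb_month
    print(f"{label}: paying ${monthly_bill:,.0f}/month for {useful_gb:,.0f} GB of useful data")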
Compliance and eDiscovery Concerns
"Organizations are concerned about meeting their compliance obligations when it comes to data in the cloud," Elliott says. "Even more so, they're concerned about proving compliance as they more to the cloud. Twenty-three percent of respondents have been fined for privacy violations in the cloud. That tells me that this is a bigger problem than most people have recognized. As more and more data moves to the cloud globally, there's more and more regulation about how that data needs to be managed. As you move to the cloud, you really need to think about compliance in the context of the overall organization."
The survey found that 49 percent of organizations were concerned about meeting compliance requirements and 53 percent were concerned about being able to prove they have met cloud compliance requirements.
Organizations are also struggling with eDiscovery when it comes to the cloud. The survey found that more than one-third of organizations have had an eDiscovery request for cloud data and two-thirds of that group missed their deadline, leading to fines and legal risks.
"Forty-one percent weren't ever able to find the data," Elliott says. "Taken together, those create significant liability."
Data in Transit Issues
Managing the exploding number of SSL certificates held by organizations is already a struggle today, and the cloud is compounding the problem, according to the study. Assets in the cloud require SSL certificates to protect the data--personal information, financial information, business transactions and other online interactions--in transit.
"The cornerstone of cloud transactions is SSL encryption," Elliott says. "You have to be able to manage your SSL certificates in an efficient way. Only about 27 percent of organizations say managed SSL certificates related to the cloud is easy. Many think it's highly complex. And 40 percent say they're not sure their cloud-partner's certificates meet or comply with their own internal corporate standards."
4 Steps to Avoid Hidden Cloud Costs
While the hidden costs of cloud deployments may be plentiful, Elliott says the good news is those hidden costs are easy to overcome with a bit of planning. He recommends four simple steps IT can take to avoid the hidden costs of the cloud:
1. Focus policies on information and people, not technologies or platforms. Cloud technologies and platforms are evolving at a rapid pace, Elliott says, and too much policy focus on technologies and platforms can lead to getting left behind. By focusing policies on information and people, you'll stay nimble regardless of the technology or platform you use.
2. Educate, monitor and enforce policies. "There is an education process here," Elliott says. "Like anything else, it takes time to mature. You need to monitor their performance and have mechanisms in place to enforce your policies."
3. Embrace tools that are platform agnostic. Platform-specific tools increase the cost of migrating to a new platform when necessary.
4. Deduplicate data in the cloud. "You're paying for the storage you use," Elliott says. "Deduplicate and you use less storage, reducing your overall cost."
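Deduplication itself is conceptually simple: store each unique chunk of data once and keep references for the repeats. The toy content-hash sketch below illustrates the idea; it is not any particular vendor's implementation.

# Toy content-addressed deduplication: identical chunks are stored once,
# so repeated data stops inflating the storage bill. Purely illustrative.
import hashlib

class DedupStore:
    def __init__(self):
        self.chunks = {}          # sha256 digest -> chunk bytes

    def put(self, data: bytes) -> str:
        digest = hashlib.sha256(data).hexdigest()
        self.chunks.setdefault(digest, data)   # store only if unseen
        return digest                          # caller keeps the reference

    def stored_bytes(self) -> int:
        return sum(len(c) for c in self.chunks.values())

store = DedupStore()
for _ in range(10):
    store.put(b"same weekly full backup" * 1000)   # 10 identical copies
print(store.stored_bytes())   # bytes kept: one copy, not ten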
Thor Olavsrud covers IT Security, Big Data, Open Source, Microsoft Tools and Servers for CIO.com. Follow Thor on Twitter @ThorOlavsrud. Follow everything from CIO.com on Twitter @CIOonline and on Facebook. Email Thor at tolavsrud@cio.com


Rackspace follows in Facebook's footsteps, plans to build own servers


At the Open Compute Foundation summit held this week, Rackspace announced that it is following in the footsteps of founding OCF member Facebook and will build its own compute and storage servers based on the open standards the foundation has lobbied for.
Rackspace plans to roll out the new OCF-based servers in its newest east coast data center, which it hopes to open in the first half of this year, says company COO Mark Roenigk.
At the OCF event, Facebook announced a new model that allows companies to build their own customized hardware using commodity hardware pieces that all comply with OCF standards. This building-block approach lets organizations buy less expensive commodity equipment instead of proprietary hardware from the likes of Dell, HP and Cisco. Roenigk says ultimately that will create efficiencies for Rackspace, including having hardware that is customized specifically to the company's needs.
Facebook founded the OCF and other members include Google, Goldman Sachs, eBay and Intel. EMC announced at the foundation's conference this week in Santa Clara that it has joined the OCF.
Rackspace released the details of the servers it hopes to assemble, which include three separate types and a rack for holding them. The Wiwynn server design, code named Winterfell, will include a 3-sled chassis with 2x16 core CPUs, 256 GiB RAM and 2x10 Gigabit Ethernet.
Roenigk says being a part of the OCF reinforces Rackspace's philosophy of being an open-standards, open-platform company. Rackspace last summer announced that its newest cloud data centers would be powered completely by OpenStack, the open source cloud computing platform. Having competitors on the same hardware platform is good for the entire industry, Roenigk argues: It lowers costs and allows companies to differentiate on services and support. "We want to out-serve our competitors on product and service, not out-geek them on hardware and technology," he says.
While the OCF has thus far focused its efforts on compute and storage hardware, Roenigk says there have been discussions around extending the scope of OCF to include networking components.
Network World staff writer Brandon Butler covers cloud computing and social collaboration. He can be reached at BButler@nww.com and found on Twitter at @BButlerNWW.

Saturday, December 29, 2012

Cloud Sherpas funding, acquisition points to hot SaaS market


Atlanta-based consultancy Cloud Sherpas today announced $40 million in new funding and the acquisition of CloudTrigger, another consulting firm, based in California.
Cloud Sherpas helps customers migrate workloads to the cloud, specifically Salesforce.com's CRM cloud and Google's cloud. Since its founding in 2008, Cloud Sherpas says it has grown to 3,000 customers and expects to surpass $100 million in revenue in 2013. The company has 350 employees at offices in Atlanta, Brisbane, Chicago, Manila, New York, San Francisco, San Diego, Sydney and Wellington.
The latest $40 million funding round doubles the venture capital financing the company has received to date. Investors include Columbia Capital, Delta-V Capital and new investor Greenspring Associates. The funding announcement points to the continued interest venture capitalists have in enterprise-geared startups and emerging technologies, including cloud computing and specifically software as a service (SaaS). Cloud Sherpas has had three previous rounds of funding totaling about $23 million, including a $20 million round in March.
CloudTrigger is the eighth consulting practice the company has purchased since 2007. CloudTrigger helps customers migrate to Salesforce.com and is the maker of G2Maps, a geographic mapping analytics and visualization app available on the Salesforce AppExchange.
Network World staff writer Brandon Butler covers cloud computing and social collaboration. He can be reached at BButler@nww.com and found on Twitter at @BButlerNWW.

Tuesday, November 13, 2012

FileLocker offers 25GB free end-to-end encrypted cloud storage


If recent cloud storage breaches have you wondering whether the web can be trusted with your files, a new service launched Tuesday called FileLocker, which provides encrypted cloud storage, may give you new faith.
FileLocker, a folder sync and collaboration service, claims to provide end-to-end military-level encryption of files stored on its servers. That means that data is encrypted at its source (on your desktop), in transit (256-bit SSL), and in the cloud. Typically, web storage services offered to consumers don't encrypt data at all three of those stages.
Although primarily targeted at small and home businesses, the service, which offers 25GB of free online storage for up to five users, can be a good deal for consumers, too.
(See Related: How to encrypt your cloud storage for free)
Paid accounts start at $5 per user per month, with unlimited storage space for a minimum of 5 users and a maximum of 10. Accounts include an app for the desktop and for mobile devices running iOS or Android, as well as administrative and reporting tools.
According to the company, files are protected before leaving a device with a personal passphrase known only to you. You can designate a folder and simply drag and drop files into it; they're sent to the cloud over a 256-bit SSL connection, encrypted again and stored in the FileLocker cloud.
The service should be particularly attractive to collaborators because all versions of files are kept for an unlimited amount of time. FileLocker, brought to you by the makers of SOS Online Backup, allows you to designate any folder on a PC as a synced folder. Simply drag and drop the files you want to share or store online.
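FileLocker hasn't published the details of its scheme, but the general pattern, deriving a key from a personal passphrase and encrypting locally before anything touches the network, can be sketched with the widely used third-party cryptography package. The code below is a generic illustration of that pattern, with made-up names; it is not FileLocker's actual scheme or API.

# Sketch of client-side ("encrypt before upload") protection using the
# third-party `cryptography` package (pip install cryptography).
import base64
import os
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives.hashes import SHA256
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

def key_from_passphrase(passphrase: str, salt: bytes) -> bytes:
    kdf = PBKDF2HMAC(algorithm=SHA256(), length=32, salt=salt, iterations=600_000)
    return base64.urlsafe_b64encode(kdf.derive(passphrase.encode()))

salt = os.urandom(16)                       # stored alongside the ciphertext
f = Fernet(key_from_passphrase("correct horse battery staple", salt))

ciphertext = f.encrypt(b"quarterly-report.xlsx contents")   # done on the client
# ...ciphertext is what travels over SSL and sits in the provider's cloud...
print(f.decrypt(ciphertext))                # only the passphrase holder can do this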
Crowded Cloud Storage Space
With so many cloud storage services popping up on an almost daily basis, consumers may be reluctant to subscribe to yet another one -- even one that offers end-to-end encryption. Those consumers might be interested in Ensafer, now in public beta.
Unlike FileLocker, Ensafer is designed to work with existing popular storage services, such as Dropbox, iCloud and Google Drive. It, too, encrypts files locally on a consumer's device, and they can be decrypted only there.
Of course, if you're a hands-on type of person, you can encrypt your files manually before sending them to the cloud with a program like BoxCryptor or TrueCrypt.

Saturday, November 10, 2012

Google gives cloud-based database a performance boost


Google's Cloud SQL database has gained more storage, faster reads and writes, and now offers users the choice of running their instances in data centers based in either the U.S. or Europe.
The performance upgrade allows enterprises to run bigger, faster MySQL databases on Google's cloud, Joe Faith, product manager for Google Cloud SQL, wrote in a blog post on Thursday.
Faith and his team have increased the available storage on Cloud SQL by a factor of ten to 100 GB, according to the blog post. Faster reads and writes are also possible thanks to instances with more memory and optional asynchronous replication, it said.
The maximum amount of RAM is now 16GB, quadrupling the amount of data users can cache to increase read speeds.
Asynchronous replication results in faster writes to the database, because the system doesn't have to wait for the replication to finish. However, users might lose their latest updates in the event of a data center failure within a few seconds of updating the database, according to Google's FAQ.
Besides improving performance, Google now allows Premier customers to choose whether they want to store data and run their Cloud SQL database instances in U.S. or European data centers.
Google's update comes just two days after Amazon Web Services announced two new instance types for its Relational Database Service: the Extra Large DB Instance and the Medium DB Instance, which have 15GB and 3.75GB of memory, respectively. Both of them can be used to run SQL Server and Oracle's database, while the medium instance can also be used to run MySQL.
Amazon has also reduced prices by up to 14 percent in the US East (Northern Virginia) and US West (Oregon) regions.
For example, a standard deployment of a large instance now costs US$0.365 per hour, which is 5 cents cheaper than what Amazon used to charge.
Similar to Amazon's existing free tier, Google has introduced a new trial offer for Cloud SQL. Users get to test one instance with "a small amount of RAM" and 500MB of storage until June 1 next year.
Users who want Cloud SQL with 16GB of memory pay either $46.84 per day, which includes 10GB of storage and 32 million requests, or $3.08 per hour plus $0.10 for every one million requests and $0.24 per month for 1GB of storage.
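Whether the packaged daily rate or the metered rate is cheaper depends on usage. The quick comparison below uses the published figures quoted above; the request volume and storage size are example inputs matching what the package includes.

# Compare Google Cloud SQL's two published 16GB pricing options for one day.
requests_per_day = 32_000_000
storage_gb = 10

package_per_day = 46.84                      # includes 10GB storage and 32M requests

per_use = (3.08 * 24                         # hourly instance charge
           + 0.10 * requests_per_day / 1_000_000
           + 0.24 * storage_gb / 30)         # monthly storage pro-rated per day

print(f"package: ${package_per_day:.2f}  per-use: ${per_use:.2f}")
# At this volume the metered option (~$77) costs more than the $46.84 package,
# so the package wins for sustained, heavily used instances.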
Send news tips and comments to mikael_ricknas@idg.com

Friday, November 9, 2012

Amazon, Microsoft and Google targeted by cloud provider Joyent


Joyent may be the biggest cloud provider you haven't heard of.
According to the pure-play infrastructure as a service (IaaS) provider -- which was founded in 2004 and is headquartered in San Francisco -- it is a top 5 vendor of cloud-based virtual machines in the world, a stat that's backed up by Gartner. That means it's rubbing elbows with the big names of cloud computing -- Amazon Web Services, Rackspace, Microsoft and Google.
"They're the most interesting cloud company that few people talk about," says George Reese, CTO of enStratus, a company that consults with enterprises on cloud strategies and helps business deployapplications to the cloud. "When we talk to people we get questions about AWS, Rackspace, HP, and when we mention Joyent, they're like, 'Who?'"
Perhaps users should start paying attention, though. The company this week released Joyent7, the latest version of its cloud management platform named SmartOS, which it says enhances capabilities for hybrid cloud deployments between a customer data center and Joyent's cloud.
Company founder and CTO Jason Hoffman is aiming for the fences with his company, openly stating that he's looking to take on the Amazons, Googles and Microsofts of the world.
Does he have a shot?
Joyent's differentiator, Hoffman says, is its integrated stack. SmartOS is not just an operating system, but also a networking fabric and hypervisor -- it uses KVM. He describes it as analogous to a large-scale storage area network (SAN), with an integrated network between compute and data layers that run virtual machines directly on it. "We completely collapse the model into a single hardware design," he says. By doing this, new customers are easily onboarded to the cloud, with each new customer site added to Joyent's network being the equivalent of adding another availability zone in AWS's system.
Hoffman says Joyent is cheaper and offers more compute for the buck compared to AWS. A pricing comparison chart on the company's website shows that Joyent prices are between 6% and 29% less compared to prices of similarly sized VM instance types in AWS's cloud.
Reese, the cloud consultant, says Joyent seems to have a dedicated user base, but it is still a niche play in the market. "They don't have a ton of features, but the features they do have perform really well," Reese says. VMs come up fast and are predictable and reliable, he says, based on testing he's done within enStratus for customers using Joyent's cloud.
Joyent seems optimized for customers that run large, complex, cloud-native apps in Joyent's cloud, apps for which developers want high visibility and highly reliable performance, Reese says. The focus on its core features leaves some wanting, though. Joyent doesn't have a database-as-a-service feature, for example, nor does it have nearly the breadth of services offered by AWS or Rackspace. Ultimately, that could make it a challenge for Joyent to bite significantly into Amazon's or Rackspace's dominant market share.
Joyent is continuing to develop its products and company, though. The release of Joyent7 is about enabling "seamless hybrid cloud," Hoffman says. The new OS furthers LDAP integration and adds a catalog of APIs, specifically around workflow management, image management and security groups.
In addition to announcing Joyent7, the company also appointed a new CEO, Henry Wasik, formerly president and CEO of Force10 Networks, to lead the company.
Hoffman likes his chances of going up against the gorillas of the industry. "If someone really wants to take on AWS," which Hoffman clearly states he wants to do, "you have to be multi-region, multi-AZ from the get-go." If a provider takes a pure-hardware approach, he says, it would cost half a billion dollars to set up. "We're in a space where, as a private company, we're partnering with a top-three chip maker [Intel], we have our own technology stack end-to-end and we've raised hundreds of millions of dollars." The company announced its latest $85 million funding round in January.
Gartner says it will be an uphill climb for Joyent, though, especially when it's competing with companies that have much greater resources they can devote to R&D. "Joyent is focused on developing its own technology, which creates long-term challenges in competing against providers with greater development resources," Gartner says. If Joyent remains a niche provider, Reese believes it has a chance to carve out a chunk of the market and serve it well. It's an open question if a company like Joyent can scale up to the size of some of the major cloud providers in the market, though.
Network World staff writer Brandon Butler covers cloud computing and social collaboration. He can be reached at BButler@nww.com and found on Twitter at @BButlerNWW.

Oracle buys Instantis for project portfolio management software


Oracle on Thursday said it has agreed to acquire PPM (project portfolio management) software vendor Instantis, in a move that will build upon its past acquisition of Primavera. Terms of the deal, which is expected to be completed this year, were not disclosed.
Instantis has both on-premises and cloud-based software, which will be combined with Primavera as well as Oracle's next-generation Fusion Applications, according to a statement. All told, the software "will provide the ability to manage, track and report on enterprise strategies - from capital construction and maintenance, to manufacturing, IT, new product development, Lean Six Sigma, and other corporate initiatives," Oracle said.
Instantis' main product is called EnterpriseTrack, which incorporates dashboards and reports that can be shared "at any phase of the project life cycle from ideas to proposals to project execution to metrics and results," according to its website. The company's software also has a native social networking platform called EnterpriseStream, as well as an integration framework for tying EnterpriseTrack to other systems.
Oracle "plans to continue to invest in Instantis' technology, evolving the solutions organically and deepening the integration capabilities with Oracle technology," according to a FAQ document released Thursday.
As with all of its acquisitions, Oracle will also gain further footholds in enterprise accounts, giving its sales representatives opportunities to cross-sell and up-sell other products to Instantis users, which include Ingram Micro, DuPont, Credit Suisse and Xerox.
Oracle's competitors in the PPM market include CA Technologies, IBM and a number of smaller vendors.
Chris Kanaracus covers enterprise software and general technology breaking news for The IDG News Service. Chris' email address is Chris_Kanaracus@idg.com

The cloud as data-center extension


A year after Oregon's Multnomah County deployed an on-premises portfolio management application, the two IT staffers dedicated to it resigned. Other staff struggled to maintain the specialized server environment. Left with no other option to guarantee support of the mission-critical tool, the county leapt into the cloud.
"All of our IT projects are tracked through Planview," says Staci Cenis, IT project manager for Multnomah County, which includes Portland. "We use it for time accountability and planning. Monitoring scheduled and unscheduled maintenance shows us when staff will be free to take on another project."
Initially the county had two dedicated Planview administrators, Cenis explains. But over a period of around three months in 2009, both left their jobs at the county, "leaving us with no coverage," Cenis says. "We didn't have anyone on staff that had been trained on the configuration of our Planview instance or understood the technical pieces of the jobs that run within the tool to update the tables," among other things.
Cenis hadn't considered the cloud before that issue, but agreed to abandon the in-house software in favor of Planview's software-as-a-service (SaaS) offering after assessing the costs. Training other IT staffers on server, storage, backup administration, recovery and upgrades alone would have compounded the on-premises software expenses, Cenis says.
Nowadays, with the infrastructure and application administration offloaded to the cloud, IT can handle most configuration, testing and disaster recovery concerns during a regularly scheduled monthly call. "I wish we had gone with the cloud from the start because it has alleviated a significant burden," Cenis says, especially in the area of software upgrades.
Each upgrade handled by the application provider instead of her team, she estimates, adds numerous hours back into her resource pool. "What would have taken us days if not weeks to troubleshoot is generally answered and fixed within a day or two," she adds. At the same time, users can access the latest software version within a month or two of its release.
Multnomah County's embrace of the cloud is one of five models becoming more common today, according to Anne Thomas Manes, vice president and distinguished analyst at Gartner.
Gartner categorizes them as follows:
Replace, as Multnomah County did by ripping out infrastructure and going with SaaS;
Re-host, where IT still manages the software, but it is hosted on external infrastructure such as Amazon, HP or Rackspace public or private cloud servers;
Refactor, where some simple changes are made to the application to take advantage of platform-as-a-service;
Revise, where code or data frameworks have to be adapted for PaaS;
Rebuild, where developers and IT scrap application code and start over using PaaS.
"Not a lot of companies rebuild or do a lot of major modifications to migrate an application to the cloud. Instead, they either replace, re-host or refactor," Manes says.
Primarily, enterprises view the cloud as an escape hatch for an overworked, out-of-space data center. "If you're faced with the prospect of building a new data center, which costs billions of dollars, it certainly saves money to take a bunch of less critical applications and toss them into the cloud," Manes says.
Problems in paradise?
However, since first observing the cloud frenzy years ago, Manes recognizes companies have taken their lumps. "Many business leaders were so eager to get to the cloud that they didn't get IT involved to institute proper redundancy or legal to execute proper agreements," she says. Such oversights have left them vulnerable technologically and monetarily to outages and other issues.
Companies that moved applications and data to the public cloud early on also didn't always plan for outages with traditional measures such as load balancing. "Even if an outage is centralized in one part of the country, it can have a cascading effect, and if it lasts more than a day can cause a real problem for businesses," she says.
Tips for getting to the cloud
Know what should go where: If you require a more controlled environment for your data, consider building a hybrid cloud using internal servers and shared dedicated cloud infrastructure. Doing so enables you to track where data lives without having to manage a sprawling data center.
Understand your licensing: Some companies are unwittingly getting double-charged by software companies and service providers for application, operating system and other licensing. Double-check your contracts and if yours doesn't include cloud architecture, then renegotiate with your vendors. Consult with your cloud provider because it might have an in-place deal with software makers. Also, as NASA's JPL advises, make sure to involve your legal team in all service agreements.
Stay involved: Sending your applications to the cloud might free up infrastructure and administrators, but IT still has to keep a close eye on critical elements such as security, integration, configurations, updates and disaster recovery. Multnomah County regularly meets with its SaaS provider to ensure proper communication and support levels.
Missing something? Don't be afraid to ask: Cloud providers are eager to please and want your business. Inform your cloud providers when a feature or functionality is absent from your service or platform. If you need load balancing, a provider probably will support that for you without much additional cost.
Seek support: You can offload cloud management to a third party if it is too onerous for your in-house team. For instance, some cloud providers will handle round-the-clock technical support of environments hosted in the Amazon cloud.
-- Sandra Gittlen
But Dave Woods, senior process manager at business intelligence service SNL Financial, disagrees. SNL Financial aggregates and analyzes publicly available data from around the world for its clients. Despite having a sizeable internal data center, the company's homegrown legacy workflow management application was testing its limits.
"Our data center was full" with both internal and customer-facing applications and databases, Woods says. The company didn't do a full-on analysis to find out whether it was server space or cooling or other limitations -- or all of the above -- but at some point it became clear that they were running out of capacity, and cloud software became attractive.
Though he briefly considered rebuilding the application and building out the data center, the costs, timeframe and instability of the code dissuaded him. "The legacy application lacked the design and flexibility we needed to improve our processes," Woods says. The goal, in other words, was not just to rehost the application but to do some serious workflow process improvement as well.
To accomplish this, SNL Financial adopted Appian's cloud-based business process management system. Although the annual licensing cost was similar to the on-premises software the firm had been using, the clincher was avoiding the $70,000 in hardware costs that would have been needed to update the application at the time. (SNL has since built a "spectacular new onsite data center," Woods says, so it's no longer an issue.)
SNL Financial is expanding its workflow processes to more than 500 banks in Asia, with Woods crediting the cloud for allowing this type of scalability and geographic reach. "We wouldn't have been able to improve our legacy workflow in this way. There was a much longer IT development life cycle to contend with. Also, the application wouldn't have had as much capability," he says.
"These platforms are mission-critical to us, not a side project," Woods explains. "They affect our business engine at our core and they have to enable us to fulfill our timeline guarantees to our customers," he says.
The processes Woods refers to are those involving collecting, auditing and reviewing data and news for specific industries -- the information that SNL sells to clients, in other words.
That's not to say there haven't been some bumps on the road to the cloud. Woods says that while IT was brought in at the start of the decision-making, his process-improvement team missed the mark on making sure IT was fully informed. "We found that no matter how much we thought we were doing a good job communicating with IT and networking, over-communication is the order of the day," he says.
Building up trust in the cloud
NASA's Jet Propulsion Laboratory (JPL) has a similar stick-to-it attitude with the cloud. With more than 100 terabytes spread across 10 different services, JPL's trust in the cloud built up over time.
Its first foray was in 2009, when reality sunk in that the 30-day Mars Exploration Rover (MER) mission would last far longer than originally thought, and demand far more resources than the internal data center could handle. (MER is still sending data back to Earth.)
"All of our IT systems had filled up. We either needed to build new IT systems internally or move to the cloud," says Tom Soderstrom, CTO.
Soderstrom and his team of technicians and developers used Microsoft's then-nascent Azure platform to host its "Be a Martian" outreach program. Immediately, JPL saw the benefits of the elasticity of the cloud, which can spin up resources in line with user demand.
In fact, outreach has proven a fertile playground for JPL's cloud efforts, such as using Google Apps as the foundation for its "Postcard from Mars" program for schoolchildren. Soderstrom calls the platform ideal because it enables an outside-the-firewall partnership with developers at the University of California, San Diego.
External developers are simply authorized in Google -- by JPL's IT group -- to work on the project. "If we used the internal data center, we would have had to issue them accounts and machines, get them badged by JPL, and have them go into schools to install and manage the application code," Soderstrom says. "The cloud approach is less expensive and more effective."
JPL also taps Amazon Web Services for various projects, including its contest for EclipseCon, the annual meeting of the Eclipse open-source community. "All testing, coding and scoring is done in Amazon's cloud so our internal data centers don't have to take the hit," he says.
The cloud benefits internal projects, too, including processing data from the Mars missions. To tile 180,000 images sent from Mars, the data center would have to spin servers around the clock for 15 days or more. JPL would have to foot the cost of that infrastructure and spend time on provisioning specifications down to the type of power plug required.
In contrast, the same process took less than five hours using the Amazon cloud and cost about $200, according to Soderstrom.
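The economics are easy to appreciate with a rough model: many machines for a few hours instead of a few machines for weeks. The numbers below are illustrative assumptions fitted to the figures Soderstrom cites, not JPL's actual job parameters.

# Rough model of the tiling job: the same total compute, delivered either by a
# small in-house pool over weeks or a large burst of cloud instances for hours.
# Per-image time, pool sizes and instance price are illustrative assumptions.
images = 180_000
minutes_per_image = 0.5
total_machine_hours = images * minutes_per_image / 60   # 1,500 machine-hours

in_house_servers = 4
cloud_instances = 300
price_per_instance_hour = 0.13

print("in-house:", total_machine_hours / in_house_servers / 24, "days")     # ~15.6 days
print("cloud:   ", total_machine_hours / cloud_instances, "hours,",
      "$%.0f" % (total_machine_hours * price_per_instance_hour))            # ~5 h, ~$195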
As cloud use grows in popularity and criticality, JPL continues to beef up its cloud-based disaster recovery/business continuity, using multiple geographic zones from a single service provider as well as multiple vendors. "We always have failover for everything and consider it as insurance," he says. For the summer Mars landing, JPL instituted a double-failover system. "All cloud vendors are going to have outages; you just have to determine how much failover is required to endure it," he says.
For its data on Amazon, JPL switched on load balancers to move data between zones as necessary. "Previously, network engineers would have been needed to do that kind of planning; now app developers can put in these measures themselves via point and click," Soderstrom says.
Self-service provisioning
There have been hiccups along the way, such as trying to match the application to the cloud service. "Cloud services used to be a relationship between a provider and a business leader with a credit card," Soderstrom says. Now, "we make sure IT is involved at every level," he explains.
To accomplish this, JPL has standardized its cloud provisioning overall, creating an online form that business leaders and developers fill out about their project. Based on pre-set templates created by IT, their plain-English answers to questions such as "are you going to need scalability?" and "where is your customer and where is your data?" guide which cloud service and the level of resources they will need.
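A template-driven form like the one JPL describes boils down to mapping plain-English answers onto pre-set deployment profiles. The sketch below is a generic illustration of that idea, with made-up profile names and rules rather than JPL's actual templates.

# Generic sketch of template-driven provisioning: plain-English answers from a
# project intake form are mapped to pre-set deployment profiles defined by IT.
# Profile names and rules are made up for illustration.

def pick_profile(needs_scalability: bool, data_location: str, sensitive_data: bool) -> dict:
    if sensitive_data:
        return {"profile": "internal-private-cloud", "instances": 2, "region": "on-prem"}
    if needs_scalability:
        return {"profile": "public-cloud-autoscale", "instances": "2-50",
                "region": "us" if data_location == "us" else "eu"}
    return {"profile": "public-cloud-small", "instances": 1,
            "region": "us" if data_location == "us" else "eu"}

# Example intake answers: "Are you going to need scalability?" -> yes,
# "Where is your customer and where is your data?" -> US, no sensitive data.
print(pick_profile(needs_scalability=True, data_location="us", sensitive_data=False))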
The move to self-service provisioning has meant retraining system administrators to be knowledgeable about cloud-use cases. Also, IT security staffers serve as consultants for the cloud environment, vetting and hardening operating system and application builds.
Though this sounds like a complicated evolution, Soderstrom says the technical challenges presented by the cloud have been easy compared with the legal ones. Legal is front and center in all negotiations to ensure appropriate licensing, procurement and compliance deals are struck and adhered to.
In all its cloud contracts, JPL includes language about owning the data. In case of service shutdown, a dispute or other agreement termination, the provider must ship all data back on disks, with NASA picking up the labor tab.
Overall, though, Soderstrom says he is glad he made the leap. "Cloud is changing the entire computing landscape and I'm very comfortable with it. Nothing has been this revolutionary since the PC or the Internet."




Evolving security standards a challenge for cloud computing, expert says


ORLANDO -- Any enterprise looking to use cloud computing services will also be digging into what laws and regulations might hold in terms of security and privacy of data stored in the cloud. At the Cloud Security Alliance Congress in Orlando this week, discussion centered on two important regulatory frameworks now being put in place in Europe and the U.S.
The European Union, with its more than two dozen countries, has had a patchwork of data-privacy laws that each country created to adhere to the general directive set by the EU many years ago. But now there's a slow but steady march toward approving a single data-privacy regulation scheme for EU members.
These proposed rules published by the EU earlier this year may not become law until 2016 or later as they involve approval by the European Parliament, said Margaret Eisenhauer, an Atlanta-based attorney with expertise in data-privacy law.
Europe, especially countries such as Germany, already takes a stricter approach to data protection than the U.S., with databases holding individuals' personal information having to be registered with government authorities, and rules on where exactly data can be transmitted. "European law is based on the protection of privacy as a fundamental human right," Eisenhauer said.
The benefit of the proposed EU regulation is that EU countries will, in theory, present a uniform approach instead of a patchwork of rules. The so-called "Article 29 Working Party Opinion" on the proposed law specifically addresses the use of cloud computing, and it presents cloud providers and users with a long list of security-control requirements.
In addition, cloud providers must offer "transparency" about their operations, something some are reluctant to do today, Eisenhauer said.
The proposed regulations also allude to how cloud-based computing contracts should be established. Among many requirements, "you have to state where the data will be processed," Eisenhauer said, plus where it will be accessed from. Customers have the right to "visit their data," she said, which means providers must be able to show the customer the physical and logical storage of it.
Some ideas could become the norm for Europe, such as the concept of the "right to be forgotten," which recognizes that individuals have a right not to be tracked across the Internet, as is often done through cookies today. This "privacy by default" concept means that Web browsers used in Europe, for example, will likely be required to ship with their newer "do not track" capabilities turned on by default. In Europe, "there are real concerns about behavioral targeting," said Eisenhauer.
Some European legal concepts suggest that even the use of deep-packet inspection, a core technology in many security products today for watching for signs of malicious activity on the network, could be frowned on under European law, and companies will need to be mindful of how deep-packet inspection is deployed, said Eisenhauer. Even today, using security information and event management (SIEM) tools to monitor employee network usage is something that does not easily conform to European ideas of data privacy.
The proposed EU data-privacy rules require reporting data breaches very quickly to governments and their data-privacy authorities as well as to the individuals affected. The regulation also points to possible fines for failing to comply, starting at 2% of a company's annual worldwide revenue.
However, Eisenhauer adds that Europe's government data-privacy regulators encourage direct communication with cloud-service providers and their customers about any issues that come up, and are far more eager to resolve problems than to mete out punishments.
Many companies, including HP, which is a member of the CSA, are tracking these kinds of regulatory requirements from all across the world that impact the cloud.
"You will have to answer to auditors and regulatory regimes," said Andrzej Kawalec, HP's global technology officer at HP Enterprise Security Solutions. This means that there can't be "monolithic data centers" all subscribing to one mode of operation, but ones tailored to meet compliance in Europe, Asia and North America.
In Switzerland, for example, which is not part of the EU, "the Swiss think the data should remain in Switzerland," he said. But "everyone is getting a lot more stringent" on security and data protection, Kawalec said. Some ideas, such as Europe's notion that even the user's IP address represents a piece of personally identifiable information, are not necessarily the norm in the U.S.
In the U.S., there is also a significant regulatory change afoot related to cloud computing and security, and it is arising out of the federal government's so-called FedRAMP program unveiled earlier this year.
FedRAMP is intended to get cloud-service providers (CSP) that serve government agencies accredited for specific security practices over the next two years. Although no CSP is yet certified, according to Chris Simpson, CEO at consultancy Bright Moon Security, who spoke on the topic at the CSA Congress this week, the goal is to get CSPs on board by assuring through third-party assessments that their cloud environments conform to specific security guidelines.
These include practices for incident response in the cloud, forensics in a highly dynamic environment, threat detection and analysis in a multi-tenant environment, and continuous monitoring for remediation, among other things. One FedRAMP idea is that service providers must be prepared to report security incidents of many types to US-CERT and to the government agency that might be impacted. The agency would report to US-CERT as well, said Simpson.
If CSPs can't meet the FedRAMP guidelines, they won't be able to provide services to government agencies, said Simpson. Once certified in FedRAMP, though, they'll have a path to contracting for all federal agencies. But if a security incident or data breach occurs that is seen as negligence, that might be cause "to pull that authorization," Simpson concluded.
Ellen Messmer is senior editor at Network World, an IDG publication and website, where she covers news and technology trends related to information security. Twitter: MessmerE. E-mail: emessmer@nww.com.

VMware releases micro version of Cloud Foundry PaaS


Everything in the cloud seems to be getting bigger or smaller. VMware today went the small route, releasing a micro version of the company's popular open source platform as a service (PaaS), Cloud Foundry.
The claim to fame for Micro Cloud Foundry is that it can be deployed on a single virtual machine. In a blog post announcing the new version, VMware says it's ideal for developers who want to launch an application that's still under development to test it out, for example.
Cloud providers seem to be constantly tweaking their offerings in an effort to expand their product portfolio and the easiest ways to do that are to take existing products and either give them added capacity, or shrink them down into smaller, bite-sized chunks. VMware took the latter approach with today's release.
In contrast, Amazon Web Services recently announced two new types of virtual machine instances for its cloud, both of which are high input/output versions of its popular Elastic Compute Cloud (EC2) offering. At the time, independent analyst Paul Burns noted that adding capacity to existing products not only allows businesses like Amazon to have more products, but it allows customers to have instance types that more closely align with their computing needs.
In that aspect, creating a micro instance of Cloud Foundry seems like a natural move. As a PaaS, Cloud Foundry is used by developers as a cloud-based tool for creating and deploying applications. Traditionally these PaaS deployments live on large cloud environments made up of multiple virtual machines. But a micro instance, like the one released by VMware today, gives another tool for a developer to more easily test and play around with Cloud Foundry on a single machine.
VMware says Micro Cloud Foundry will have all the same features and functionality of the regular Cloud Foundry, the only limitation will be the power of the single VM that it runs on. In addition to announcing the micro version today, VMware also announced new features that will come with the Micro Cloud Foundry release. These include support for standalone apps, and enhanced support for various programming languages, including Ruby, Java and Node.js.
VMware says it will continue to update Micro Cloud Foundry on the same release cycle as its parent product, and promised to continue to improve Micro Cloud Foundry by further improving automation of tasks within the PaaS.
Cloud Foundry is used by a variety of cloud computing companies as a PaaS extension to their infrastructure-as-a-service offerings. Piston Cloud Computing, which sells an OpenStack-based cloud platform, for example, uses Cloud Foundry to provide customers with a PaaS.
Network World staff writer Brandon Butler covers cloud computing and social collaboration. He can be reached at BButler@nww.com and found on Twitter at @BButlerNWW.