Reducing Risk with Encryption for Multi-Tenant Environments

One of the biggest hurdles to cloud adoption is undeniably security. In particular, public cloud services are often under scrutiny as to whether a multi-tenant environment is actually secure. Let’s face it, running production workloads in virtualized environments is still a relatively new practice, which means security was rarely designed in from the start.

As more business critical resources become virtualized, there is an increasing need to ensure the right security controls are in place. Until recently, multi-tenant encryption solutions weren’t particularly effective. Key management was one of the main sticking points: because VMs can move across multiple physical servers, encryption keys have to follow them, which places far more demanding requirements on how keys are managed.

AFORE Solutions Inc., a cloud security and solution provider, recently announced the release of their CloudLink™ 2.0 with Secure Virtual Storage Appliance, the first solution that enables cloud-based DR solutions to meet key regulatory and compliance requirements. This appliance provides a storage repository that can be accessed by VMs hosted in the cloud. Most encryption today is applied through storage gateway methods, which means data is only encrypted as it is sent to the cloud. CloudLink™ Secure VSA encrypts and protects data at all times, which is particularly important in highly regulated industries. The keys are managed by the enterprise, and encryption keys can be controlled through Active Directory integration.

CloudLink™ Secure VSA has already proven itself in Amazon VPC™ (Virtual Private Cloud), VMware vCloud™ Director environments and CA AppLogic based clouds. The main reason for the success is that organizations want to take advantage of the many benefits of the cloud model. If a provider can offer compliant environments, there is an immediate advantage.

Disk encryption is one of the key security controls used in enterprises to reduce the threat of data loss. The same methodology applies to cloud environments where you need to reduce the risk of unauthorized access as much as possible. Having the ability to encrypt individual VMs means an additional (and significant) layer of security to help protect your business critical resources.
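
To make the idea concrete, here is a minimal sketch (in Python, using the cryptography package) of client-side encryption where the key never leaves the enterprise, so the cloud provider only ever stores ciphertext. This illustrates the general pattern rather than CloudLink’s actual implementation, and the key-management step is simplified to a locally generated key.

```python
# Minimal sketch: encrypt data client-side before it ever reaches cloud storage,
# with the key held by the enterprise rather than the cloud provider.
# Illustrates the general "encrypted at all times" idea, not any vendor's product.
from cryptography.fernet import Fernet

# In practice the key would live in an enterprise-controlled key manager
# (for example one tied to the corporate directory); here we just generate one.
enterprise_key = Fernet.generate_key()
cipher = Fernet(enterprise_key)

def write_to_cloud_storage(plaintext: bytes) -> bytes:
    """Encrypt before upload, so the provider only ever sees ciphertext."""
    return cipher.encrypt(plaintext)

def read_from_cloud_storage(ciphertext: bytes) -> bytes:
    """Decrypt after download, inside the enterprise's trust boundary."""
    return cipher.decrypt(ciphertext)

record = b"patient-id=1234; diagnosis=..."
stored = write_to_cloud_storage(record)          # what the cloud actually holds
assert read_from_cloud_storage(stored) == record # the enterprise can still read it
```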

Big Data Hosting

When it comes to building new cloud services, there is a large opportunity for new services built around Big Data.

So when you look at consulting firms that provide application development and integration services, are there opportunities for them to leverage Big Data in their service portfolio?

For enterprise consultancies, cloud platforms and tools such as enterprise integration frameworks and development software can be used to provide line-of-business, e-commerce and system integration services. And the nice thing about Big Data is that it helps provide opportunities to build financially sustainable services.

While there is a large market for cloud computing and mobility solutions, one of the services where consultants are seeing real traction is Big Data analytics, especially in the Healthcare and Automotive verticals. The nice thing is that these engagements are usually lengthy and comprehensive, which is important for consulting firms that survive on stable, long-term projects to raise cash flow and utilization rates and, in turn, lower the overall cost of sales.

So what kind of tools can be used to supplement these types of engagements? Aside from the obvious choice of Hadoop, there are tools such as HBase and Hive to get at the data, and then data visualization tools for analysis. The visualization part can be done by commercial tools such as MicroStrategy and Tableau, or open source tools such as those from Pentaho.
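
As a rough illustration of how those pieces fit together, here is a hedged sketch of the kind of query an analytics engagement might run against Hive before handing the result off to a visualization tool. The host, database, table and column names are made up for illustration, and the PyHive and pandas packages are assumed to be available.

```python
# Sketch of a Hive analytics query feeding a downstream visualization tool.
# Host, database, table and column names are hypothetical.
from pyhive import hive
import pandas as pd

conn = hive.Connection(host="hadoop-edge.example.com", port=10000,
                       username="analyst", database="healthcare")

# Aggregate readmission counts by facility; the resulting frame could be
# exported to Tableau, MicroStrategy or a Pentaho dashboard for analysis.
query = """
    SELECT facility_id, COUNT(*) AS readmissions
    FROM patient_visits
    WHERE visit_type = 'readmission'
    GROUP BY facility_id
    ORDER BY readmissions DESC
    LIMIT 20
"""

cursor = conn.cursor()
cursor.execute(query)
rows = cursor.fetchall()
columns = [col[0] for col in cursor.description]
df = pd.DataFrame(rows, columns=columns)
print(df.head())
```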

There is a huge emerging Big Data industry with real, growing revenue opportunities. Right now we are still at the hype stage, but once the market moves past it, there will be a significant need for Big Data consultancy firms.

Public GovCloud Computing – Assessing the requirements

As more and more large enterprises publicly adopt cloud, and service providers figure out the role they play in the market, one of the biggest potential cloud adopters has been watching for signs that the market can support one of the most public-facing cloud deployments of all.

Yes, Governments have started to look at both private and public cloud offerings as potential solutions to address the changing needs of internal and external applications and processes. From a provider perspective, Governments are looking for services both for computing power and storage, as well as applications such as collaboration, CRM and email that can be used for public facing tools.

A big force behind cloud adoption in the US comes from the White House, which has given agencies the mandate to prioritize cloud applications and services when building new projects. It is part of a larger pool of initiatives to develop standards for cloud adoption and to create policies for new technologies such as BYOD and Big Data, while also addressing the security concerns that come along with cloud.

Luckily, cloud isn’t that hard a sell to government agencies, as they already understand the benefits that come from cloud, such as cost savings, flexibility and streamlined business processes, traits that private sector companies are already enjoying. But federal agencies are mainly looking to cloud to deliver self-service, so that they can become more responsive and agile in their infrastructure approach when taking on new projects.

As one can imagine, due to the differences between agencies, it is impossible to create a one-model-fits-all approach when delivering services for the Federal Government. The information between agencies can vary significantly from public consumption to top secret. But even so, security isn’t necessarily the top reason for agencies to drag their feet when it comes to adoption. It’s the providers themselves.

The way business has historically been done between providers and government agencies involves lots of contracts that are often written with private sector companies in mind. Unfortunately, these contracts might not meet the unique security and compliance requirements of each individual agency. This factor alone means the adoption process requires a lot of sitting down with providers to map out the right terms and conditions: agencies need enough flexibility in the offerings that they are not locked in if they need to change providers and, more importantly, that they can take everything with them when they leave. The result is actually good for the cloud market as a whole, as the concerns raised in these meetings can be translated into better cloud SLAs for all customers, whether they are in the public or private sector.

GovCloud 2.0 apps for citizen service

When it comes to embracing social media and applications, many businesses seem to question the benefit.

True, if you are a consumer goods manufacturer, there is inherent benefit in associating yourself with social media and consumer applications, but what about for Government agencies? Does it make sense for Governments, especially municipal, to look at creating both internal and external applications?

One of the more recent examples of municipalities that have embraced social applications is Riverside, California. They created an application, Riverside 311, to be used on smartphones as a way to better interact with residents.

The original project stemmed from the need to find better ways to communicate with residents aside from the 33,000 calls they receive every month reporting everything from potholes to downed trees. They simply did not have the internal infrastructure and staff to support the service and answer inquiries, and so the idea of having residents submit requests through a smartphone application just made sense.

As a result, they have seen a steady increase in reports generated through the application, which has led to some great side benefits, such as residents being able to photograph crimes like graffiti, helping police match taggers and get vandalism cleaned up more effectively than before. It has also empowered residents to help clean up their city and make it safer.

Creating public-facing applications is really an easy way to help governments gain more interaction with their citizens. It gives citizens and residents the ability to interact more efficiently with Government agencies, and rewards these agencies with a wealth of information such as GPS locations to help reduce process times.
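
As a purely hypothetical sketch of why that GPS data matters, here is what a 311-style service request might look like as a structured payload. This is not the Riverside 311 app’s actual API; it just illustrates how a location fix and a photo reference give an agency what it needs to route and close reports quickly.

```python
# Hypothetical 311-style service request payload (illustrative field names only).
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class ServiceRequest:
    category: str          # e.g. "pothole", "graffiti", "downed tree"
    description: str
    latitude: float        # captured automatically from the phone's GPS
    longitude: float
    photo_url: str         # uploaded image, useful for graffiti/vandalism matching
    reported_at: str

request = ServiceRequest(
    category="graffiti",
    description="Tag on the underpass wall at Main St.",
    latitude=33.9533,
    longitude=-117.3962,
    photo_url="https://example.org/uploads/abc123.jpg",
    reported_at=datetime.now(timezone.utc).isoformat(),
)

# The mobile app would POST this JSON to the city's request-intake endpoint.
print(json.dumps(asdict(request), indent=2))
```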

The key issue is that the application has to provide perceived value to users, or it really is a waste of effort. Users want applications that save them time and effort in accessing commonly used services such as locating government offices, finding public services, or even just looking for local information. The easier it is to use, the more likely users will choose the application versus standard web surfing.

Luckily there are a lot of great Canadian developers who are helping Governments and agencies develop application platforms to support this next-generation method of interaction.

These companies understand the unique requirements of developing for multiple platforms, and how to properly integrate them with back-end processes. The advantage of outsourcing is that these organizations can provide better guidance on scope and anticipated costs, along with a governance structure that involves the right stakeholders.

Cloud Computing as economic driver of job creation

A very exciting report published by the Sand Hill Group and sponsored by SAP recently came out about the impact cloud has on the economy and the workforce.

When it comes to cloud, most people think about companies with direct cloud investments such as hosting or hardware. While the statistics are US based, there are some great data points that show why Canada is poised to benefit from cloud adoption.

Companies selling cloud services are poised to pump $20 Billion annually (yes, you read that right) into the US economy for the next five years. This is on top of an estimated 472,000 jobs, both in the US and internationally, that the industry is expected to create.

Attached to this, the venture capital firms that help fund cloud stand to make upwards of $30 Billion in the next five years and create another 213,000 jobs in the US alone.

And even better news is that the government is also on the receiving end of the cloud benefit wagon. It is estimated that cloud computing could save US businesses as much as $625 Billion over five years, and if you do the math, that is a lot that can be reinvested to create even more jobs and auxiliary businesses to support the cloud landscape.
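
For anyone who does want to do the math, here is a quick back-of-the-envelope sketch using only the figures quoted above.

```python
# Back-of-the-envelope math on the report's figures (all amounts in billions of USD).
vendor_investment = 20 * 5      # $20B per year for five years = $100B
vc_returns = 30                 # projected VC returns over the same period
business_savings = 625          # projected savings for US businesses over five years

jobs_from_vendors = 472_000
jobs_from_vc = 213_000

print(f"Vendor investment over five years: ${vendor_investment}B")
print(f"Average business savings per year: ${business_savings / 5:.0f}B")
print(f"Jobs cited across both estimates: {jobs_from_vendors + jobs_from_vc:,}")
```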

This is really related to the fact that cloud is going to create almost five times the jobs that the current IT sector supports. So imagine the impact that this type of trend could have on governments globally. Aside from cutting costs, creating more jobs and making significant profits, cloud supports a large ecosystem of innovative organizations and startups. It’s huge; cloud really is going to change things for the better. With Canada’s track record of innovation, this is where we need to start investing.

The full report can be read at http://www.news-sap.com/files/Job-Growth-in-the-Forecast-012712.pdf

Cloud Identity management for e-Healthcare

Right now the Canadian government is working on creating a strategy for cloud adoption within various government agencies.

There are a lot of great benefits to centralizing and updating public services, but the minute you move to a distributed type of environment, there are suddenly many more security factors that must be taken into account.

Several government agencies around the world have already started transitioning to new models that allow for greater access to information and better portals for the public, all while bringing better internal IT structure with more flexible infrastructure, streamlined applications and cost savings. But one factor affected by this type of restructuring is how outlying services are impacted, and that needs to be kept in mind in any shift in strategy. Let me explain.

As part of the move towards more centralized services, a lot of the responsibility for security is being pushed down to the individual agencies, particularly in the healthcare industry. While governments are building new portals that provide single access to a wide variety of services, it is the end users (doctors’ offices, hospitals, health agencies) that are tasked with securing access to these repositories. The key problem is that this is, in my opinion, the wrong group to leave responsible for security.

Think about the end users: what is their main focus? Usually it is serving customers, whether as a healthcare practitioner, a records administrator, and so on. There is a good chance that these folks (with the exception of the IT staff in larger organizations such as hospitals) didn’t go to school to study IT or, even more importantly, security. Yet they are often tasked with helping maintain the security posture of a large network and rarely receive any training. Think about the security risks that this brings.

Think about this for a second. When you visit your healthcare practitioner, often you wait in an examination room where their computer is sitting unattended. Imagine just the risks associated with this. Firstly, there is a good chance that the doctor remains logged in at all times, so you have access to any systems attached to his computer (including portals, local information, etc). Imagine the types of patient information immediately accessible just through that.

Secondly, attached to the computer is often a prescription printer (which was part of updating patient prescription processing), meaning that in theory you could print out all kinds of prescriptions. It doesn’t sound like a big deal, but think of the high number of deaths associated with negative drug interactions every year. And what about the US, where you can obtain marijuana with a prescription? This just invites all kinds of organized crime into the equation.

I could go on quite a bit about other risks, but most of them come back to identity management. Without the proper controls in place, it is almost impossible to track who is accessing the systems at every level, or to ensure that access rights are kept up to date (currently employed vs. no longer employed, and so on). Leaving the responsibility at the healthcare practitioner level is not the answer; better system integration and unified access and identity management are going to be the cornerstone of systems that are both fully secure and accessible.
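
To make "unified access management" a bit more concrete, here is a minimal sketch of a centralized, role-based access check that also takes employment status into account. The roles, permissions and user names are purely illustrative, not drawn from any particular health system.

```python
# Minimal sketch of centralized, role-based access checks that account for
# employment status: the kind of control that is hard to enforce when security
# is left to each individual clinic or office. All names and roles are illustrative.
from dataclasses import dataclass

ROLE_PERMISSIONS = {
    "physician":     {"read_chart", "write_chart", "print_prescription"},
    "records_admin": {"read_chart"},
    "front_desk":    {"read_schedule"},
}

@dataclass
class User:
    username: str
    role: str
    currently_employed: bool

def is_authorized(user: User, permission: str) -> bool:
    """Deny by default: access requires active employment and an explicit grant."""
    if not user.currently_employed:
        return False
    return permission in ROLE_PERMISSIONS.get(user.role, set())

dr_lee = User("dlee", "physician", currently_employed=True)
former_admin = User("jsmith", "records_admin", currently_employed=False)

print(is_authorized(dr_lee, "print_prescription"))   # True
print(is_authorized(former_admin, "read_chart"))     # False: access revoked centrally
```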

Canada’s opportunity to be a Cloud leader

KPMG released some news yesterday that Canada has a 14.9% cost competitiveness advantage over the US in the digital industries sector. This is due to the “lowest effective corporate income tax rates”. So I have to ask the question, why aren’t we seeing more tech companies setting up base here in Canada, especially with cloud looming?

It’s not for lack of qualified employees. There is a huge workforce available for this sector, especially if you look at Burlington or Waterloo, Ontario. Additionally, we have pretty relaxed immigration laws that allow businesses to start up here in Canada and recruit from a global candidate pool.

The climate in Canada is also a boon for cloud. If you look at organizations like Rack Force, they figured out that Canada’s cool climate is a perfect place to host a data centre. It’s the cornerstone of their “green” advantage. We are also largely insulated from many of the natural disasters that unfortunately hit other parts of the world.

Thirdly, we have a ton of organizations in Canada that can benefit from cloud services. Canada is primarily made up of mid-market customers who are struggling to figure out what to do with cloud and how to transform their business. There is tons of blue sky for the creation of services targeted towards this market.

Additionally, we also have some of the strictest privacy guidelines, which makes outsourcing to another country a nightmare for all organizations. What better reason to see Canadian expansion (as we see with some of the bigger cloud enterprises) or new business startups? Your market is already captive, so setting up new services that mirror successful ones from other countries is a great opportunity.

The key roadblock that seems to come up in this type of conversation is that Canada is lagging behind in technology adoption. It’s true, we Canadians are pretty conservative when it comes to trying something new. However, we are also the country that spawned hugely innovative organizations such as RIM and Nortel (think patents). So there is a great opportunity for Canada to become a cloud leader; we just need the right ecosystem, and that will come from tech corporations investing in local talent.

The Canadian Market for Cloud Brokerage


I’ve written before about Cloud Brokerage, and its importance to the evolving cloud landscape. It’s clear that cloud brokerage is something that will not only be an important part of the cloud model moving forward but something that Canada is perfectly positioned to lead on a global scale.

Why? Because the unique dynamic of the Canadian telecommunications market is coupled with our ability to work with a wide range of vendors across both security and IT services, all under one of the strictest regulatory environments in the world. But why cloud brokerage?

Gartner has been touting the need for cloud brokers for years, primarily to aid the cloud market in three key areas: cloud service aggregation, arbitrage and intermediation. As the Canadian market starts to transition to a cloud model (either hybrid or public), there will be an abrupt gap in the internal resources available to aid with the transition.

Right now the mere idea of adopting cloud services is in the early stages within organizations, with many leaders left trying to understand how they are going to find the internal resources to aid in the initiative. Unless there is a large in-house deployment of virtualization and the resources to support it, the idea of virtualizing and moving internal systems into an off-site location is a huge undertaking and the knowledge curve can be daunting. So naturally, the organization is going to start to look at its vendor ecosystem for support.

What if a provider could offer the organization a one-stop shop for the entire project, and act as the project manager? This is exactly the value a cloud provider could offer as a cloud broker.

The role of the cloud broker in this case would be three-fold. First, the cloud broker would research the marketplace and provide a fine-tuned list of services that meet the customer’s objectives, ranging from regulatory requirements to location and disaster recovery services (check out the disaster recovery case), through to applications and security offerings, and deliver them on a single platform.

The cloud broker could also layer their own in-house services such as networking or unified communications, and offer all services on a single monthly bill, with the “one throat to choke” model for customers. The evolution of the cloud brokerage model would see these players becoming the regulators of the market as they would control the demand and service requirements as the largest distributors of these services. If a service does not meet the standards of the cloud broker, the service could be substituted with another service without service interruption to the end customer.
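
As a hedged sketch of those aggregation and substitution roles, here is what a broker’s service-selection step might look like: keep a catalogue of equivalent offerings, filter them by the customer’s compliance requirements, and substitute an alternative when a provider falls away. Provider names, certifications and prices are hypothetical.

```python
# Sketch of a broker's aggregation/substitution logic; all catalogue data is hypothetical.
from dataclasses import dataclass, field

@dataclass
class Offering:
    provider: str
    service: str                         # e.g. "backup", "dr", "email"
    certifications: set = field(default_factory=set)
    monthly_cost: float = 0.0
    available: bool = True

CATALOGUE = [
    Offering("ProviderA", "dr", {"SOC2", "data-residency-CA"}, 950.0),
    Offering("ProviderB", "dr", {"SOC2"}, 700.0),
    Offering("ProviderC", "dr", {"SOC2", "data-residency-CA"}, 1100.0),
]

def select_offering(service: str, required_certs: set) -> Offering:
    """Pick the cheapest available offering that meets the customer's requirements."""
    candidates = [o for o in CATALOGUE
                  if o.service == service
                  and o.available
                  and required_certs <= o.certifications]
    if not candidates:
        raise LookupError(f"No compliant provider for {service}")
    return min(candidates, key=lambda o: o.monthly_cost)

# Initial selection, then transparent substitution when a provider drops out.
choice = select_offering("dr", {"SOC2", "data-residency-CA"})
print(choice.provider)                                                 # ProviderA
choice.available = False
print(select_offering("dr", {"SOC2", "data-residency-CA"}).provider)   # ProviderC
```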

The key markets for cloud brokerage in Canada would include large, heavily regulated organizations and entities including government services, financial institutions, education and manufacturing. These also happen to be the key customers of the large telecommunications companies in Canada, who are the best positioned to transition from offering standard telco services to evolving into a full-fledged cloud service broker. The end customers benefit from a single point of contact and engagement for the entire project, while the broker is able to create a high-margin MRR based business leveraging services that might already exist in-house in different product portfolios.

So where do we begin this transition? We are currently still in the early stages of adopting a cloud brokerage model in Canada, but there are some great examples of international organizations successfully leveraging this model including RightScale and Cloudkick, and the clear alignment of some of the larger global telecommunication organizations starting to move to a similar model. For Canada, the first steps are already taking place, with providers aligning themselves with key cloud services and offering the services bundled with in-house products ranging from IP services and security to unified communications. I expect over the next 2-5 years, we will see greater adoption of the cloud brokerage model, and the ability to start offering these services on a larger scale to Canadian and global organizations.