Securing The Cloud

The cloud presents a variety of unique challenges for data security, including some from the inside.

Popularity

Cloud computing offers on-demand network access that is ubiquitous and convenient, with a pool of configurable computing resources such as shared networks, servers, storage, applications, and services. What makes this so attractive is these services can be provisioned and adapted to the load, with minimal management or service provider intervention.

Cloud computing takes advantage of a distributed and highly scalable architecture, where data and applications are downloaded from the cloud or run directly from it. While that enhances availability, scalability, collaboration, and agility, it also increases the risk of security breaches — especially from the inside.

“Cloud computing takes much further a trend that has been going on for some time, which is the element of insider threats,” says Simon Blake-Wilson, vice president of products and marketing at Rambus' Cryptography Research Division. He adds that insider threats are now one of the top priorities in the security arena. “But with the cloud, they are also harder to control.”

Traditionally, the enterprise had a very hard shell that made it difficult for outsiders to get in. With a distributed model, multiple non-associated tenants, a vast array of applications, and no definable walls, it is much easier to get on the inside. Edward Snowden’s highly publicized breach is a good example of that. So there is an awareness that cloud server farms need to secure the perimeter from the inside, as well as the outside.

One of the major differences between walled networks and the cloud is the involvement of third parties, such as ISPs, Web services providers and applications suppliers. Third parties play a huge role in cloud computing. This is partly due to the cloud’s reliance on the Internet, and partly due to technologies such as service-oriented architectures (SOAs), as well as virtualization, which adds layers that were not part of the traditional local-host model.

In addition, the cloud offers everything-as-a-service (XaaS), including software, platforms and infrastructure. Each of those has different security requirements that can be affected by dependencies across the others. Third parties leverage programs and applications that aren’t necessarily part of the local enterprise’s security perimeter, and they are heavily involved in the various “as-a-service” platforms, as well. All of this adds a level of risk that is very difficult to encapsulate.

Cloud drilldown
And then there is the Internet itself. The cloud is 100% dependent on Internet connectivity, which poses some well-known and well-publicized risks for users. But those risks increase with a distributed architecture, more varied access and more diversified applications, which combine to create a wider attack surface with a multitude of potential attack vectors that most private, non-cloud networks do not have.

Moreover, public or shared clouds offer infrastructure as a service via the Internet. Because they are multi-tenant, individual clients have little or no control over the underlying technology infrastructure. Providers have to be able to accommodate everything, including legacy, analog, digital and virtual systems, because client requirements vary widely.

Private clouds, or internal clouds, are somewhat less vulnerable. While the model is the same as the public cloud, they can narrow the application infrastructure, which is deployed over a company’s own infrastructure or a hosted datacenter. Because private clouds are company- or organization-specific, they can offer advanced security and more granular access control, along with fault-tolerant solutions that are not practical, or in many cases even possible, in a public cloud. This is because with a private cloud the owner can determine and control the resources. Hence, multi-tenancy, and the considerations that come along with it, are irrelevant.

A third platform that is emerging is the hybrid cloud. This model uses an integrated approach to combine the best elements of both public and private clouds, incorporating customized rule sets and policies that administer the underlying infrastructure. It can accommodate the demands of both public and private cloud activities and tasks, which are simply allocated to the external or internal cloud platforms as the clients require. However, some security issues between the two still exist.

Overall, the value of the data stored in a cloud server farm is greater than the data stored on any one particular client. That means the security concerns of the cloud multiply as the data on the servers grows.

Cloud challenges
Because of the cloud's wide attack surface, providers must do more than just secure the border.

“More and more cloud enterprises are thinking about augmenting the hard shell,” said Rambus’ Blake-Wilson. “That includes the reinforced perimeter, with protection from insider threats, as well.”

A good example of this is the resurgence of the water hole attack, which has been around for a few years. If a multi-tenant cloud database is poorly designed, or if the application infrastructure is sloppy, a flaw in another tenant’s application has the potential to open a door that a hacker can use to compromise all data across that database. This type of attack permits the hacker to compromise many users of these trusted Web applications simply by commandeering their Web browsers. Water hole attacks are not limited to clouds, but because public and hybrid cloud platforms have such a diverse landscape, and are multi-tenant, it is easier for attackers to set them up in these kinds of architectures.

The water hole is an indirect type of attack and is often used to infect victims from a specific industry or business segment, as opposed to an individual company. Attackers might target companies that develop code or security products and use the cloud, then try to infect user machines that have access to the target network. If successful in a weak cloud infrastructure, they can glean a mountain of data across multiple targets.

Hardware vulnerabilities
The virtual machine (VM) is ubiquitous within the cloud, and is a fertile playground for the hacker, because VMs are shared by different tenants due to resource pooling. So in the cloud, one company’s application potentially could be running on the same hardware as its competitor’s application. A breach from outside, or from within, could allow one company to gain access to another company’s competitive data.

Furthermore, virtual networks expand interconnectivity of VMs, which presents the potential for cross pollution. “In such a case, there needs to be assurance that the physical hardware is able to, effectively, isolate the hardware all around — from the other clients to the cloud provider,” says Blake-Wilson.

That’s the role of hypervisors, which were developed to manage virtual machines across virtual networks. Compared with dedicated physical channels, hypervisors greatly improve resource efficiency. But virtual network technologies and platforms also can increase the possibility of attacks, such as sniffing and spoofing.

The hypervisor is a low-level software package that is responsible for the control and monitor functions of the VMs. It controls the client operating systems using a virtual operating platform, which manages OS execution on the clients. The weakness comes in because multiple instances of multiple operating systems usually share these virtualized hardware resources. That is the first place there can be a security flaw. So the philosophy is to keep the virtual machine monitor (VMM) code as simple and small as possible to reduce the risk of vulnerabilities. The VMM is also responsible for the virtual machines’ isolation. Any compromise of the VMM means its VMs potentially can be compromised, as well.

Another potential vulnerability is that virtualization introduces the ability to migrate VMs across physical servers. This is commonly done for fault tolerance and load balancing, but the same capability is also a weakness. An attack on the migration module in the VMM can transfer a compromised VM to a clean server.

VM migration also exposes the entire network to the compromised VM. This can compromise the network’s data integrity and confidentiality. If that compromised VM is migrated to another host (also running a VMM), then that VMM becomes compromised, as well. The bottom line is that if any VM is hacked, all of the machines in that network become vulnerable.

Everyone is watching
Industrial competitors, as well as rogue states, are all in the spy game. Because the data is now housed in the cloud, governments and competitors can bypass the company they want to spy on and go directly to the cloud provider, overtly or covertly. A multi-tenant environment makes it all that much easier.

Finally, while not directly in the security line of fire, there are other challenges such as how to deal with multiple geographies. Laws vary from the smallest municipality to the largest nation. That means that the cloud provider must comply with an assortment of laws and regulations.

But laws don’t necessarily mean liability. Moving to the cloud doesn’t mean the cloud provider assumes all liability. It may secure the cloud, but the individual entity is still responsible for its own data. And the requirement to secure that can vary across borders.

The best way to secure the cloud.
“There are many types of security solutions,” said Ted Marena, director of FPGA/SoC marketing at Microsemi. “But the most prominent is the public/private key code scenario.”

Public/private keys are an elegant solution to a difficult problem. The entire security ecosystem can benefit from this platform—especially with the IoE, where so many autonomous messages and communications are flying around.

The public/private key scenario works very well for cloud applications. “Today, there are cloud-based services, called certificate authorities, that provide keys,” Marena explains. “One applies to the service and obtains keys from these known and trusted certificate authorities.” Simply put, cloud servers implement a public key environment and hand out private keys to clients. The basic key process is shown in Figure 1.

Fig. 1: The key verification process. Source: Microsemi
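
One common variant of that enrollment flow has the client generate its own key pair and ask the certificate authority to certify only the public half. The sketch below illustrates that client side in Python using the third-party cryptography package; the library choice and the hostname are assumptions for the example, not anything Microsemi specifies.

```python
# Sketch: generate a key pair and a certificate signing request (CSR) to submit
# to a certificate authority. Assumes the third-party "cryptography" package;
# the identity below is hypothetical.
from cryptography import x509
from cryptography.x509.oid import NameOID
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import rsa

# The private key never leaves the client; only the CSR (public key plus identity)
# is sent to the CA for signing.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

csr = (
    x509.CertificateSigningRequestBuilder()
    .subject_name(x509.Name([
        x509.NameAttribute(NameOID.COMMON_NAME, u"client.example.com"),  # hypothetical
    ]))
    .sign(private_key, hashes.SHA256())
)

# PEM-encoded CSR, ready to send to the trusted certificate authority.
print(csr.public_bytes(serialization.Encoding.PEM).decode())
```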

The public key infrastructure takes symmetric key authentication to the next level. It was designed to address the weakness in symmetric key encryption, which requires sophisticated mechanisms to ensure the secret key is securely distributed to both parties.

Symmetrical keys present the proverbial Catch-22 of how to securely communicate a shared key before secure communication can be initiated. Symmetrical keys also can have trust problems. At the same time, symmetrical keys are much faster to run in hardware, and where resources are limited, which will be the case with many IoE devices, they are often the better solution.
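
To make the symmetric case concrete, here is a minimal sketch of shared-key encryption with AES-GCM in Python, using the third-party cryptography package as an assumed library. Both parties must already hold the same key, which is exactly the distribution problem described above.

```python
# Sketch: symmetric (shared-key) encryption with AES-GCM.
# Both sides must already possess the same key, which is the distribution problem.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # must be shared securely out of band
aesgcm = AESGCM(key)

nonce = os.urandom(12)                      # unique per message; never reuse with the same key
ciphertext = aesgcm.encrypt(nonce, b"sensitive telemetry from an IoE device", None)

# The receiver, holding the same key and the nonce, recovers the plaintext.
plaintext = aesgcm.decrypt(nonce, ciphertext, None)
assert plaintext == b"sensitive telemetry from an IoE device"
```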

Asymmetrical keys also are applicable to the more sophisticated IoE devices. With asymmetric encryption there are two related keys, referred to as a key pair. In this scenario, a public key is made available to anyone who might want to send a message. There is also a second key, which is private and kept secret. The public key is used to encrypt the message while the private key is used to decrypt it. This eliminates the need to secure the key transmission medium and permits unrestricted distribution of the public key to anyone who will be part of the communication link between parties, while protecting the sensitive data.

How it works.
To communicate secure information between users, an encrypted message is sent using the receiver’s public key. To decrypt the message, the receiver applies a private key. One of the nice things about asymmetric key pairs is that, unlike shared keys, only the recipient can decrypt a message. Even the sender cannot decrypt the message once it has been encrypted. Private keys are never distributed. Therefore, any attacker or eavesdropper will not be able to intercept the key that will be used to decrypt the message.
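
A minimal sketch of that flow, again assuming Python and the cryptography package: the sender encrypts with the receiver's public key, and only the matching private key can decrypt.

```python
# Sketch: asymmetric encryption. Encrypt with the receiver's public key,
# decrypt with the receiver's private key, which is never distributed.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# Receiver generates a key pair and publishes only the public half.
receiver_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
receiver_public = receiver_private.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Sender: anyone with the public key can encrypt, but cannot decrypt afterward.
ciphertext = receiver_public.encrypt(b"wire transfer instructions", oaep)

# Receiver: only the holder of the private key can recover the message.
plaintext = receiver_private.decrypt(ciphertext, oaep)
assert plaintext == b"wire transfer instructions"
```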

Another way to secure cloud data is to break up the key. For example, a client in the cloud has some very sensitive data they want to keep secure. “Let’s say I am the president of a bank,” says Pankaj Rohatgi, director of engineering at Rambus. “I can break my key into five pieces, for example, and give a piece to each vice president, or some other officer of this bank.”

That secures the key in a much tighter fashion and prevents an insider attack by a trusted individual who also may have access to the key. It provides a great deal of separation for the client, as well as adding a layer of very tight security, which is why this approach is beginning to gain traction in cloud services.
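
Rohatgi describes the idea in general terms; the sketch below shows the simplest possible variant, an n-of-n XOR split in plain Python in which every share is required to reconstruct the key. A production service would more likely use a k-of-n threshold scheme such as Shamir's secret sharing, but the principle is the same.

```python
# Sketch: n-of-n key splitting by XOR. All shares are needed to rebuild the key,
# and any subset smaller than n reveals nothing about it.
import secrets
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_key(key: bytes, n: int) -> list[bytes]:
    """Split `key` into n shares, e.g. one per bank officer."""
    random_shares = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
    last_share = reduce(xor_bytes, random_shares, key)  # key ^ s1 ^ ... ^ s(n-1)
    return random_shares + [last_share]

def recombine(shares: list[bytes]) -> bytes:
    """XOR all shares back together to recover the original key."""
    return reduce(xor_bytes, shares)

master_key = secrets.token_bytes(32)
shares = split_key(master_key, 5)          # five pieces, one per officer
assert recombine(shares) == master_key
```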

Missive
With non-cloud architectures an attack on one server site limits the damage to the data on that site. With the cloud, that limitation disappears.

The most notable challenge in cloud security is the cloud’s large and varied attack surface. Multiple clients, multiple apps, multiple platforms, all housed under one cloud umbrella make securing the package a challenge across so many different vectors.

Hardware cryptography is the first line of defense. Implementing effective key exchange strategies is, perhaps, the best technology to date. That can work, regardless of the application or platform.
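
As one illustration of such a key-exchange strategy, the sketch below derives a shared symmetric key over X25519 Diffie-Hellman; the Python cryptography package and the labels used are assumptions for the example, not something the article prescribes.

```python
# Sketch: X25519 Diffie-Hellman key exchange followed by HKDF key derivation.
# Each side keeps its private key; only public keys cross the network.
from cryptography.hazmat.primitives.asymmetric import x25519
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

client_private = x25519.X25519PrivateKey.generate()
server_private = x25519.X25519PrivateKey.generate()

# Each side combines its own private key with the peer's public key
# and arrives at the same shared secret.
client_shared = client_private.exchange(server_private.public_key())
server_shared = server_private.exchange(client_private.public_key())
assert client_shared == server_shared

# Derive a fixed-length symmetric session key from the raw shared secret.
session_key = HKDF(algorithm=hashes.SHA256(), length=32,
                   salt=None, info=b"cloud-session").derive(client_shared)
```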

Still, as the cloud becomes the norm, and massive databases are housed across multiple clouds, challenges such as performance, authentication, and cryptography will become the focal points. Approaches such as using a hardware security module (HSM) still apply for highly sensitive data, but there is much work to be done to make data stored in the cloud less vulnerable to increasingly sophisticated attacks.
