Virtualization Security in Cloud Computing

A novel architecture design that aims to secure virtualization in cloud environments

2011 ended with the popularization of an idea: bringing VMs (virtual machines) onto the cloud. Recent years have seen great advancements in both cloud computing and virtualization. On one hand, cloud computing, at its most basic, is the ability to pool resources and offer them as Software as a Service, Infrastructure as a Service, and Platform as a Service. On the other hand, virtual machines give cloud resources their agility, flexibility, and scalability by allowing vendors to copy, move, and manipulate VMs at will. The term virtual machine essentially describes partitioning the resources of a single physical computer into several independent computers within it; VMware and VirtualBox are commonly used virtualization products on the desktop. Cloud computing, in effect, is many computers presenting themselves as one computing environment, so a cloud naturally contains many virtualized systems to maximize its resources.

Keeping this information in mind, we can now look into the security issues that arise within a cloud computing scenario. As more and more organizations follow the "Into the Cloud" concept, malicious hackers keep finding ways to get their hands on valuable information by manipulating safeguards and breaching the security layers (if any) of cloud environments. One issue is that cloud computing is not as transparent as it claims to be: the service user has no clue about how their information is processed and stored, and cannot directly control the flow of data storage and processing. The service provider, in turn, is usually not aware of the details of the services running in their environment. Possible attacks on the cloud computing environment can thus be classified into:

  1. Resource attacks: Manipulating the available resources to mount a large-scale botnet attack. These attacks target either cloud providers or service providers.
  2. Data attacks: Unauthorized modification of sensitive data at nodes, or configuration changes that enable a sniffing attack via a specific device. These attacks target cloud providers, service providers, and service users alike.
  3. Denial of Service attacks: Creating a new virtual machine is not difficult, so spinning up rogue VMs and allocating huge amounts of space to them can deny service to legitimate providers when they try to create a new VM on the cloud. This kind of attack is generally called virtual machine sprawl.
  4. Backdoor: Another threat in a virtualized cloud environment is the use of backdoor VMs that leak sensitive information and destroy data privacy. Virtualization also means that anyone with access to a VM's host disk files can take a snapshot or make an illegal copy of the whole system, which can lead to corporate espionage and piracy of legitimate products.

With so many obvious security issues (a lot more can be added to the list), we need to enumerate some steps that can be used to secure virtualization in cloud computing.

The most neglected aspect of security in many organizations is physical security. A skilled social engineer can take advantage of weak physical security policies that an organization has put in place, so it's important to have a consistent, context-aware security policy when it comes to controlling access to a data center. Traffic between the virtual machines also needs to be monitored closely, using at least a few standard monitoring tools.
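
As a starting point for that inter-VM monitoring, here is a minimal sketch that logs traffic crossing the host's virtual bridge. It assumes the guests share a Linux bridge named virbr0 (a hypothetical interface name) and that the Scapy library is installed and the script runs with root privileges; a real deployment would feed this into a proper monitoring tool rather than print.

```python
# Minimal sketch, not a production monitor: log source and destination of
# IP packets crossing the virtual bridge that connects the guest VMs.
from scapy.all import IP, sniff

def log_packet(pkt):
    if IP in pkt:
        print(f"{pkt[IP].src} -> {pkt[IP].dst} len={len(pkt)}")

# Capture 100 packets on the assumed bridge interface "virbr0".
sniff(iface="virbr0", prn=log_packet, count=100)
```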

After thoroughly enhancing physical security, it's time to check security on the inside. A well-configured gateway should be able to enforce security whenever a virtual machine is reconfigured, migrated, or added; this helps prevent VM sprawl and rogue VMs. Another approach that can help enhance internal security is the use of third-party validation checks, performed in accordance with security standards.
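
A minimal sketch of such a gateway-side check is shown below. The VM lifecycle event feed, tenant names, and per-tenant quota are all assumptions for illustration; a real gateway would also validate images, networks, and requester identity before letting an operation reach the hypervisor.

```python
# Hypothetical gateway policy check: reject VM creations that exceed a
# per-tenant quota, one simple guard against VM sprawl and rogue VMs.
from collections import defaultdict

MAX_VMS_PER_TENANT = 20            # assumed quota, purely illustrative
vm_counts = defaultdict(int)

def authorize(event):
    """Allow or reject a VM lifecycle event before it reaches the hypervisor."""
    tenant, action = event["tenant"], event["action"]
    if action == "create":
        if vm_counts[tenant] >= MAX_VMS_PER_TENANT:
            return False           # likely sprawl or a rogue VM factory
        vm_counts[tenant] += 1
    elif action in ("destroy", "migrate-out"):
        vm_counts[tenant] = max(0, vm_counts[tenant] - 1)
    return True

# The 21st create request from the same tenant is rejected.
decisions = [authorize({"tenant": "acme", "action": "create"}) for _ in range(21)]
print(decisions[-1])               # False
```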

In the above figure, we see that the service provider and the cloud provider work together and are bound by a Service Level Agreement. The cloud is used to run various instances, and the service end users pay for each use of the cloud. The following section describes an approach that can be used to check the integrity of virtual systems running inside the cloud.

Checking virtual systems for integrity increases the capability to monitor and secure the environment. One of the primary focuses of this integrity check should be seamless integration with existing virtual systems like VMware and VirtualBox. This leads to file integrity checking and increased protection against data loss within VMs. Combining agentless anti-malware, intrusion detection, and intrusion prevention in one single virtual appliance (rather than isolated point security solutions) would contribute greatly toward VM integrity checks, reducing operational overhead while adding almost no footprint.
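
To illustrate the file integrity piece, the sketch below records a trusted baseline of SHA-256 digests for VM disk images and reports any later drift. The image directory and the .qcow2 extension are assumptions; a real appliance would also monitor files inside the guests and protect the baseline itself.

```python
# Minimal host-side file integrity sketch: baseline and verify VM disk images.
import hashlib
import json
from pathlib import Path

IMAGE_DIR = Path("/var/lib/libvirt/images")   # assumed location of VM disk images
BASELINE = Path("baseline.json")

def digest(path, chunk=1 << 20):
    h = hashlib.sha256()
    with open(path, "rb") as fh:
        while block := fh.read(chunk):
            h.update(block)
    return h.hexdigest()

def snapshot():
    # Record a trusted baseline of SHA-256 digests for every disk image.
    BASELINE.write_text(json.dumps({p.name: digest(p) for p in IMAGE_DIR.glob("*.qcow2")}))

def verify():
    # Re-hash each image and flag any drift from the trusted baseline.
    baseline = json.loads(BASELINE.read_text())
    for name, expected in baseline.items():
        if digest(IMAGE_DIR / name) != expected:
            print(f"integrity violation: {name}")

# e.g., run snapshot() once from trusted media, then verify() on a schedule.
```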

A server on a cloud may be used to deploy web applications, in which case an OWASP Top Ten vulnerability check should be performed. Data on a cloud should be encrypted with suitable encryption and data-protection algorithms; using these, we can also check the integrity of the user or system profile trying to access disk files on the VMs, and profiles lacking such protections can be treated as potentially compromised by malware. Working with a ratio of one user to one machine also greatly reduces risk on virtual computing platforms. To enhance security even further, once a particular environment has been used, it's best to sanitize the system (reload it) and destroy all residual data. Using incoming IP addresses to determine scope on Windows-based machines, and SSH configuration settings on Linux machines, helps maintain a secure one-to-one connection.
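
The article doesn't name a specific algorithm, so the sketch below uses Fernet from the Python cryptography library (AES-CBC plus an HMAC) purely as an assumed example of encrypting data at rest; the failed-decryption path doubles as an integrity check on the stored record.

```python
# Minimal sketch of encrypting a record at rest; Fernet is an assumed choice,
# not one named in the article. Keep the key in a key manager, not on the VM.
from cryptography.fernet import Fernet, InvalidToken

key = Fernet.generate_key()
f = Fernet(key)

token = f.encrypt(b"sensitive tenant record")
assert f.decrypt(token) == b"sensitive tenant record"

# Any tampering with the stored token breaks the check and raises InvalidToken.
tampered = b"x" + token[1:]        # corrupt the token's first byte
try:
    f.decrypt(tampered)
except InvalidToken:
    print("integrity violation detected")
```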

Lightweight Directory Access Protocol (LDAP) and Cloud Computing
LDAP is, as the name suggests, a lightweight version of DAP (Directory Access Protocol). It helps locate organizations, individuals, and other resources such as files and devices over a network. In a cloud environment, manual tasks are automated using a concept known as virtual system patterns, which enable fast and repeatable deployment of systems. Dedicated LDAP servers are not typically necessary, but LDAP services have to be considered when designing an efficient virtual system pattern. Extending LDAP servers into cloud management carries existing security policies over to the cloud infrastructure and also allows users to remotely manage and operate within the infrastructure.

Various security aspects to be considered:

  1. Granular access control
  2. Role-based access control

The directory synchronization client (DSC) is a client-resident application. Only one instance of the DSC can run at a time; multiple instances may lead to inconsistencies in the data being updated. When a user is added or removed, the DSC propagates the information on its next scheduled update. Clients then have the option to merge data from multiple DSCs and synchronize. For web security, clients on the network don't need to register separately, provided the DSC in use is set up for NTLM identification.
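
To make the role-based access control idea concrete, here is a minimal sketch that checks group membership in an LDAP directory using the Python ldap3 library. The server address, bind credentials, base DN, and the cloud-admins group are all assumptions for illustration, not values from the article.

```python
# Minimal sketch: grant a "cloud admin" role only to members of an assumed group.
from ldap3 import ALL, Connection, Server

server = Server("ldap.example.com", get_info=ALL)          # assumed directory
conn = Connection(server, user="cn=svc,dc=example,dc=com",
                  password="change-me", auto_bind=True)

def is_cloud_admin(uid):
    # In real code, escape uid first (ldap3.utils.conv.escape_filter_chars)
    # to avoid LDAP filter injection.
    conn.search("dc=example,dc=com",
                f"(&(uid={uid})(memberOf=cn=cloud-admins,ou=groups,dc=example,dc=com))",
                attributes=["cn"])
    return bool(conn.entries)

print(is_cloud_admin("alice"))
```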

Host-Side Architecture for Securing Virtualization in a Cloud Environment
The security model described here is a purely host-side architecture that can be placed in a cloud system "as is" without changing any other aspect of the cloud. The system assumes that the attacker may be present in any form within the guest VM, and that the host system itself is trustworthy. It is also asynchronous in nature, which makes it easier to hide from an attacker and prevents timing-analysis attacks from detecting it. When a guest system is placed in the network, it's susceptible to various kinds of attacks such as viruses, code injection (in web applications), and buffer overflows. Other lesser-known attacks on clouds include DoS, keystroke analysis, and traffic-rate estimation. In addition, an exploitation framework like Metasploit can easily attack a buffer overflow vulnerability and compromise the entire environment.

This approach monitors key components, on the assumption that the principal attacks will target the kernel and the middleware, so integrity checks are in place for these modules. Overall, the system checks for malicious modifications of kernel components. The design takes into consideration attacks from outside the cloud as well as from sibling virtual machines. In the above figure, the dotted lines stand for monitoring data and the red lines symbolize malicious data. The system is completely transparent to the guest VMs, since the architecture is entirely host-integrated.

The implementation of this system starts with attaching a few modules to the hosts. The following are the modules along with their functions:

Interceptor: The first module that all host traffic encounters. The interceptor doesn't block any traffic, so the presence of a third-party security system shouldn't be detected by an attacker, and the attacker's activities can be logged in greater detail. This feature also allows the system to be made more intelligent. The module is responsible for monitoring suspicious guest activities and also plays a role in replacing or restoring affected modules in case of an attack.

Warning Recorder: The result of the interceptor's analysis is directly sent to this module. Here a warning pool is created for security checks. The warnings generated are prioritized for future reference.

Evaluator and Hasher: This module performs security checks based on the priorities of the warning pool created by the warning recorder. An increased number of warnings leads to a security alert.

Actuator: The actuator makes the final decision on whether to issue a security alert, after receiving confirmation from the evaluator, hasher, and warning recorder.
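
To tie the four modules together, here is a minimal sketch of how such a pipeline could be wired. Every class, method name, and threshold is illustrative and assumed, not taken from the system's actual implementation.

```python
# Hypothetical wiring of the described pipeline: interceptor -> warning
# recorder -> evaluator/hasher -> actuator.
class Interceptor:
    def observe(self, event):
        # Pass traffic through untouched; only emit a warning for suspicious events.
        if event.get("suspicious"):
            return {"event": event, "severity": event.get("severity", 1)}
        return None

class WarningRecorder:
    def __init__(self):
        self.pool = []

    def record(self, warning):
        # Keep the warning pool ordered by priority for the evaluator.
        self.pool.append(warning)
        self.pool.sort(key=lambda w: w["severity"], reverse=True)

class EvaluatorAndHasher:
    def evaluate(self, pool, threshold=3):
        # A real implementation would re-hash monitored kernel and middleware
        # components here; the sketch simply escalates once enough warnings pile up.
        return len(pool) >= threshold

class Actuator:
    def act(self, alert):
        # Final decision point: raise the security alert.
        if alert:
            print("SECURITY ALERT: possible kernel/middleware tampering")

interceptor, recorder = Interceptor(), WarningRecorder()
evaluator, actuator = EvaluatorAndHasher(), Actuator()

for event in [{"suspicious": True, "severity": 2}] * 3:
    warning = interceptor.observe(event)
    if warning:
        recorder.record(warning)

actuator.act(evaluator.evaluate(recorder.pool))
```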

This system analyzes memory footprints and checks for both abnormal memory usage and abnormal connection attempts; this kind of detection of malicious activity is called anomaly-based detection. Once a system is compromised, the malware tries to infect other systems in the network until the entire unit is controlled by the attacker. Targets of this type of attack also include command-and-control servers, as in the case of botnets. In either case, there is an increase in memory activity and in connection attempts originating from a single point in the environment.
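
A minimal host-side sketch of that anomaly check, using the psutil library, is shown below; the thresholds are assumed values for illustration, whereas a real detector would baseline per-VM behavior over time instead of using fixed limits.

```python
# Minimal anomaly-based check: abnormal memory usage and connection bursts.
import psutil

MEM_THRESHOLD = 90.0      # percent of RAM in use, an assumed threshold
CONN_THRESHOLD = 500      # simultaneous established TCP sessions, also assumed

def check_host():
    findings = []
    mem = psutil.virtual_memory().percent
    established = [c for c in psutil.net_connections(kind="tcp")
                   if c.status == psutil.CONN_ESTABLISHED]
    if mem > MEM_THRESHOLD:
        findings.append(f"abnormal memory usage: {mem:.1f}%")
    if len(established) > CONN_THRESHOLD:
        findings.append(f"connection burst: {len(established)} TCP sessions")
    return findings

for finding in check_host():
    print(finding)
```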

Another key strategy used by attackers is to hide the processes they are running from the system's process list, typically through a dynamic data attack that manipulates the structures the process display relies on. The modules of this protection system perform periodic checks of the kernel scheduler; by scanning the scheduler's structures directly, the system can detect such hidden processes and nullify the attack.
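
A related, simpler cross-view check can be run on a Linux host: compare the PIDs visible in /proc with what the ps tool reports. This is my own illustrative variant, not the kernel-scheduler scan the article describes, and short-lived processes can cause false positives.

```python
# Minimal cross-view check for hidden processes on a Linux host.
import subprocess
from pathlib import Path

def pids_from_proc():
    # Every numeric directory under /proc corresponds to a running process.
    return {int(p.name) for p in Path("/proc").iterdir() if p.name.isdigit()}

def pids_from_ps():
    out = subprocess.run(["ps", "-eo", "pid"], capture_output=True, text=True).stdout
    return {int(tok) for tok in out.split() if tok.isdigit()}

# A PID visible in /proc but absent from ps output may be hidden by a rootkit
# hooking userland tools; processes that exit between the two snapshots can
# also appear here, so treat results as leads rather than proof.
hidden = pids_from_proc() - pids_from_ps()
print("possibly hidden PIDs:", sorted(hidden))
```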

Current Implementation
This approach has been followed by two of the main open source cloud distributions, namely Eucalyptus and OpenECP. In both implementations, the system remains transparent to the guest VMs and the modules are attached to the key components of the architecture.

Performance Evaluation
The system claims to impose virtually no CPU overhead (since it is asynchronous), but it has shown some overhead on I/O operations. This is attributed to the constant file integrity checks and the analysis done by the warning recorder.

In this article, we have seen a novel architecture design that aims to secure virtualization in cloud environments. The architecture is purely host-integrated and remains transparent to the guest VMs; it assumes the host is trustworthy and that attacks originate from the guests. The rule of thumb in security says that anything and everything can be penetrated given enough time and patience, but an intelligent security consultant can make things difficult for an attacker by integrating transparent systems that remain invisible under normal scenarios and take time for attackers to detect.


Shathabheesha is a security researcher for InfoSec Institute. InfoSec Institute is an IT security training company that offers popular VMware boot camp training.
