By Jonathan Lewis
September 30, 2013 07:45 AM EDT
Identity and access management (IAM) solutions provide governance and visibility capabilities that enable organizations to provision and control access to their applications, cloud infrastructure, servers and both structured and unstructured data. Enterprise IAM deployments are generally effective in managing the identities assigned to interactive, human users. However, a typical enterprise often has a far greater number of identities assigned to the automated processes that drive much of the computing in large-scale data centers. As enterprises adopt more and more process automation, the number of non-human identities continues to grow while the number of identities assigned to human users remains relatively flat or even declines. The net result is that enterprise IAM deployments ignore the much larger set of identities that actually perform most of the enterprise's computing functions.
The vast majority of the identities enabling machine-to-machine (M2M) processes use Secure Shell for authentication and authorization, and to provide a secure encrypted channel for M2M data transfers. For example, an automated process that retrieves server log data requires an authenticated and authorized connection to each server, plus a secure channel to move the log data to a centralized processing application. Secure Shell is ideal for these functions because:
- Public key authentication, supported natively by Secure Shell, enables the process to present its credentials without requiring an interactive user to log in via username and password, or via any other interactive authentication process.
- Public key authentication also protects the login credentials themselves. The private Secure Shell user key is never sent over the network.
- Secure Shell provides facilities to define and limit what functions a process may perform under a Secure Shell authorization. This meets the "need to know, need to do" criteria of basic IAM governance.
- Finally, Secure Shell provides for confidentiality of data in transit. Communications over a Secure Shell channel are encrypted.
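The authorization limits described above are typically expressed as options on the matching entry in the server's authorized_keys file. As an illustrative sketch (the source address, script path and key material below are placeholders, not values from this article), an entry that confines a log-collection process to one command from one host might look like:

```
# authorized_keys entry for a log-collection process (illustrative only;
# the host address, command path and key blob are placeholders)
from="10.0.5.12",command="/opt/scripts/collect-logs.sh",no-pty,no-port-forwarding,no-agent-forwarding ssh-ed25519 AAAAC3Nza... logcollector@batch01
```

With options like these, even a stolen copy of the private key can only run the named command from the named host, which is exactly the "need to do" restriction described above.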
In spite of these advantages, there are significant gaps in IAM governance of identities that use Secure Shell. Typically, the provisioning of these identities is decentralized: identities may be assigned by application developers, application owners and process owners. This often leads to a lack of proper control and oversight over the creation of identities and their authorizations. Without central management and visibility, enterprises cannot be sure how many Secure Shell identities have been created, what these identities are authorized to do and which authorizations are in fact no longer needed. The scope and nature of this problem are not theoretical. The typical enterprise server has between 8 and 100 Secure Shell authorizations (i.e., public Secure Shell user keys). This adds up. A large enterprise may have over one million keys, which in turn establish an even greater number of unmanaged M2M trust relationships.
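Gaining that visibility starts with something as simple as enumerating authorizations host by host. The following is a minimal Python sketch under stated assumptions: it inspects a single authorized_keys file, and its "unrestricted" heuristic (an entry with no options field at all) is a deliberate simplification of what a real discovery tool would check.

```python
def count_authorizations(authorized_keys_path):
    """Count key entries in one authorized_keys file and flag entries
    that carry no restricting options (no from=, command=, etc.)."""
    total, unrestricted = 0, 0
    with open(authorized_keys_path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith('#'):
                continue  # skip blanks and comments
            total += 1
            # Heuristic: a line that *starts* with the key type has no
            # options field, hence no from=/command= restrictions.
            if line.startswith(('ssh-', 'ecdsa-')):
                unrestricted += 1
    return total, unrestricted
```

Running this across every account on every server (and aggregating the results centrally) is what turns an anecdote like "8 to 100 authorizations per server" into an actual inventory.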
The Challenge of Ubiquitous Encryption
While many in IT security use Secure Shell to securely access remote servers, most are surprised to discover that M2M communication makes up the majority - in some cases over 90% - of all Secure Shell traffic on their network. The vast majority of Secure Shell trust relationships provide access to production servers and carry high-value payloads, including credit card information, healthcare records, national secrets, intellectual property and other highly critical information.
Shockingly, access to M2M encrypted channels via Secure Shell, which uses keys to authenticate a non-human user, almost always lacks proper IAM controls, creating a huge risk and compliance issue for most enterprises. Any interactive user who has the proper credentials - in the case of Secure Shell, a simple copy of the key file - can hijack these uncontrolled M2M networks. This means that, in many cases, the most valuable information in the enterprise has the least protection from unauthorized access.
Most large organizations have between 100,000 and well over a million of these keys in their network environments. Even though these keys grant access to critical systems and servers, many have never been changed. Even more incredibly, many organizations have no process for approving and enforcing who can grant permanent access to servers using these keys. One study at a large bank, with over one million keys in use, found that 10 percent of these keys granted unlimited administrative ("root") access to production servers - a grave security risk.
The lack of security controls - coupled with the high value of the data they protect - has made Secure Shell keys a target for hackers. A recent IBM X-Force study found that most attacks against Linux/Unix servers use stolen or lost Secure Shell keys as a threat vector. Because many keys are deployed in one-to-many relationships, a single breach involving a compromised key could have a cascading effect across a large swath of the network environment.
In an ironic twist, the very function that blinds prying eyes from spying on sensitive data in transit also prevents systems administrators from seeing whether information is being accessed improperly using a stolen Secure Shell key. All data-in-transit encryption, including Secure Shell, blinds layered security defense systems to malicious activity originating from hackers, trusted insiders, business partners and outsourced IT. This means that unless the enterprise has deployed encrypted channel monitoring, security operations and forensics teams cannot see what is happening in the encrypted network. Encrypted channel monitoring enables security intelligence and DLP solutions to inspect, store and - if need be - stop traffic to make sure hackers or malicious insiders cannot use Secure Shell encryption to spirit away information in an undetectable and untraceable manner. This way, the network administrator can track what a user is doing inside the encrypted channel, without exposing the data in the clear during transmission.
Evolving Standards to Include Other Authentication Methods
Moving to protect themselves against hacker attacks and to satisfy security compliance mandates, many enterprises are bolstering interactive user authentication methods, including enforcing password strength, requiring periodic password changes and implementing two-factor authentication. These methods are designed to confound hacker attempts to access interactive accounts through brute-force attacks, lost or stolen passwords, or spoofed credentials. These approaches are now considered best practices and are enshrined in compliance requirements such as PCI, HIPAA, FISMA and SOX.
Currently, compliance bodies are updating their regulations to specifically include methods of authentication beyond user names and passwords - such as certificates and keys - in their regulatory language. This means that auditors will be required to flag instances where Secure Shell-based access is not being properly controlled. This is a natural progression for compliance mandates, arriving at a time when the market is beginning to recognize that strong standards are required to ensure the safety of the enterprise's most critical business information.
To provide the highest levels of security and accountability, it is in the organization's best interest to research, design and deploy an IAM strategy that includes processes designed specifically for M2M communications. A comprehensive, best practices-based IAM program that includes provisions for Secure Shell-based M2M security must address both the provisioning and intelligence aspects of IAM across large, complex and heterogeneous environments.
Best-practices-based Secure Shell key management enables strong authentication practices, including:
- Restricting root access to servers so that only the key manager can provision or revoke keys
- Automated key creation, rotation and removal
- Discovery and continuous monitoring of trust relationships and unauthorized key deployments and removals
- Enforcing proper key type and size, and the version of the Secure Shell protocol in use
- Controlling where each key can be used from and what commands can be executed using the key
- Monitoring traffic in encrypted channels
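Several of these practices can be automated by working directly with the public key material, whose wire format is defined in the Secure Shell standards (RFC 4253). The sketch below is illustrative rather than a complete policy engine - the policy thresholds and the set of handled algorithms are assumptions - but it shows how key type and effective size can be checked straight from the base64 blob found in an authorized_keys entry:

```python
import base64
import struct

def _read_string(buf, off):
    """Read one length-prefixed string from SSH wire format (RFC 4253)."""
    (n,) = struct.unpack_from('>I', buf, off)
    return buf[off + 4:off + 4 + n], off + 4 + n

def key_algorithm_and_bits(blob_b64):
    """Return (algorithm, effective key size in bits) for a base64-encoded
    public key blob, as found in an authorized_keys entry."""
    blob = base64.b64decode(blob_b64)
    name, off = _read_string(blob, 0)
    algo = name.decode()
    if algo == 'ssh-ed25519':
        return algo, 256                   # fixed-size curve
    if algo == 'ssh-rsa':
        _e, off = _read_string(blob, off)  # public exponent (mpint)
        n, off = _read_string(blob, off)   # modulus (mpint)
        return algo, int.from_bytes(n, 'big').bit_length()
    return algo, None                      # other types not handled here

def violates_policy(algo, bits, min_rsa_bits=2048):
    """Illustrative policy: RSA below min_rsa_bits, and any algorithm
    other than ssh-rsa or ssh-ed25519, fails the check."""
    if algo == 'ssh-rsa':
        return bits < min_rsa_bits
    return algo != 'ssh-ed25519'
```

A compliance scan would apply a check like this to every key found during discovery, flagging weak or non-standard keys for rotation.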
In an environment where ever-increasing numbers of users, devices and machines are connected to the Internet and the company network, ensuring that the enterprise's IAM strategy includes strong Secure Shell access controls in M2M communications is mission-critical. While ubiquitous encryption offers clear network security benefits, left unmanaged it can present a significant threat to the business. IT security, compliance and audit professionals must begin the process of addressing Secure Shell access control and governance issues. The absence of such controls creates security vulnerabilities and can cause an organization to run afoul of compliance mandates, resulting in the risk of fines and other liabilities. By critically examining the organization's Secure Shell environment, IT teams can reveal and address the M2M access control issues that lie beneath the tip of the iceberg.