There is a real cost associated with how far you allow non-human traffic to penetrate the data center – and it’s not just in soft security risks.
A factor often ignored in contact cost analysis is technology. Call centers, marketing, and other business functions generally rely upon key performance indicators that include cost per contact – a cost associated with every e-mail, call, or web transaction that occurs. This is measured against the cost to acquire customers as a measure of effectiveness.
These analyses generally include labor and associated business costs, but fail to include the undeniably important technology costs associated with each contact. Given consumers' increasing desire to leverage web and mobile contact options, it seems prudent to consider technology costs as part of the bigger picture. Luckily, estimates put the technology-related costs of contact at a mere 2.6-5.9% of the total cost of contact (Strategic Contact, “COST STRUCTURE AND DISTRIBUTION IN TODAY’S CONTACT CENTERS”).
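The 2.6-5.9% technology share cited above can be turned into a per-contact dollar figure with simple arithmetic. The sketch below assumes a hypothetical $5.00 blended cost per contact (the source does not give one); only the percentage range comes from the Strategic Contact estimate.

```python
# Back-of-the-envelope: technology portion of cost per contact.
# TOTAL_COST_PER_CONTACT is a hypothetical figure for illustration;
# the 2.6-5.9% share is the Strategic Contact estimate cited above.

TOTAL_COST_PER_CONTACT = 5.00   # hypothetical blended cost (labor + tech)
TECH_SHARE_LOW, TECH_SHARE_HIGH = 0.026, 0.059

tech_cost_low = TOTAL_COST_PER_CONTACT * TECH_SHARE_LOW
tech_cost_high = TOTAL_COST_PER_CONTACT * TECH_SHARE_HIGH

print(f"Technology cost per contact: ${tech_cost_low:.3f} - ${tech_cost_high:.3f}")
```

Even at the high end the technology share is small per contact, which is exactly why it only becomes visible in aggregate, across thousands of transactions a day.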
Recent statistics regarding the type of traffic that may end up being factored into such analysis suggest that number could be reduced, which in turn would bring the cost of contact down even lower. That’s because a significant portion of “visitors” are not even human, and thus consume resources that cost real dollars and inflate the total cost of contact for an organization.
Not surprisingly, different industries and sites show different rates of non-human traffic:
- In January 2010, total page views climbed to 876 million, and the number of "bots" or spiders climbed to 586 million. The percentage of "bots" and spiders climbed from 57 percent in 2009 to 67 percent this year.
- In total, 66,806,000 page requests (MIME type text/html only!) per day are considered crawler requests, out of 477,009,000 external requests, which is 14.0%.
- Anecdotal reading in forums around the web supports a percentage somewhere in between, with 33% and 50% being the most common numbers cited by admins when asked what percentage of their traffic was attributable to bots and/or spiders.
Similarly, the cost per transaction may be higher or lower depending on the industry and the maturity and complexity of the infrastructure in place. What’s important to take away is not the specific numbers in this example, but the recognition that the deeper traffic flows into the data center, the more processing and handling must occur. Each process, each component, each flow through the data center incurs a real cost to the organization. After all, the network and its bandwidth are not free, and this is increasingly evident with cloud computing resources, which are far more transparent to the business about breaking out the infrastructure costs associated with that processing.
This means the sooner you detect and filter out “non-human” traffic – traffic that has little to no value to the organization and thus only serves to increase costs – the better.
What that boils down to is that when illegitimate traffic is detected and rejected matters, not only to improving operational security posture but to containing business costs. The earlier traffic is marked as “bad” and rejected, the better it is for both IT and the business. Consider that a relatively average-sized business sees, according to Google Analytics benchmarks from 2011, about 2,873 web transactions per day. Using Incapsula’s estimate that 51% of those transactions come from illegitimate, non-human visitors, each with a Tower Group-estimated cost of about $0.17, an average business wastes roughly $249 a day – or between $68 and $327 a day across the lower (14%) and higher (67%) non-human traffic rates cited above. While a daily figure may be easily shrugged off, the nearly $25,000 a year at the low end and $119,000 at the high end it adds up to is not. That’s a significant chunk of change, especially for a small or medium-sized organization. Larger organizations with higher transaction rates would necessarily see that cost increase, to numbers unlikely to bring a smile to the face of the CFO or CTO.
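The waste arithmetic above can be sketched directly. All inputs are the figures cited in the text: 2,873 transactions per day (Google Analytics benchmarks, 2011), $0.17 per transaction (Tower Group), and the 14%, 51%, and 67% non-human traffic rates from the statistics earlier in the section.

```python
# Sketch of the daily/annual waste arithmetic from the text.
TRANSACTIONS_PER_DAY = 2873   # Google Analytics benchmarks, 2011
COST_PER_TRANSACTION = 0.17   # Tower Group estimate, dollars

def daily_waste(bot_rate):
    """Dollars per day spent processing non-human transactions."""
    return TRANSACTIONS_PER_DAY * bot_rate * COST_PER_TRANSACTION

# 14% (Wikimedia crawler stat), 51% (Incapsula), 67% (2010 bot stat)
for rate in (0.14, 0.51, 0.67):
    per_day = daily_waste(rate)
    print(f"{rate:.0%} bots: ${per_day:,.0f}/day, ${per_day * 365:,.0f}/year")
```

Running this reproduces the figures in the text: roughly $68 to $327 a day, or about $25,000 to $119,000 a year.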
Luckily, the application delivery tier is strategically positioned to both detect and reject such traffic as it enters the data center. That means a shallow penetration profile that dramatically reduces the cost of processing illegitimate transactions. The application delivery tier is the first tier in the data center, comprising application delivery and access services at the perimeter of the network. The application delivery controller is often designated as the first point of contact for end-user requests – both human and not – because of its role in virtualizing the applications it delivers and secures. This limits the number of infrastructure components through which illegitimate traffic must pass. When traditional network security services – such as those associated with a firewall – and access management are consolidated in the application delivery tier, the cost of processing these requests drops further: they are immediately recognized and rejected as invalid without flowing through additional components.
While there are still costs associated with this processing – it can’t be entirely eliminated, after all – it is greatly reduced because it occurs as close to the point of ingress as possible and eliminates traversal through the entire application infrastructure chain – from virtualization platforms to web, application, and database systems.
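The early-rejection idea described above can be illustrated with a toy filter that turns away non-human traffic at the point of ingress, before it consumes any deeper infrastructure. This is a simplified stand-in: real application delivery controllers use far richer signals (behavioral analysis, rate limiting, challenge-response) than a user-agent check, and the marker list here is hypothetical.

```python
# Toy illustration: reject non-human traffic at the edge so it never
# reaches web, application, or database tiers. Marker list is a
# simplified stand-in for real bot-detection signals.

KNOWN_BOT_MARKERS = ("bot", "crawler", "spider", "scraper")

def is_non_human(user_agent: str) -> bool:
    """Crude classification based on common bot user-agent markers."""
    ua = user_agent.lower()
    return any(marker in ua for marker in KNOWN_BOT_MARKERS)

def ingress_filter(request_headers: dict) -> int:
    """Return an HTTP status: 403 rejects the request at the edge;
    200 means it may continue deeper into the data center."""
    if is_non_human(request_headers.get("User-Agent", "")):
        return 403  # rejected before incurring downstream processing cost
    return 200

# Usage:
print(ingress_filter({"User-Agent": "Googlebot/2.1"}))   # 403
print(ingress_filter({"User-Agent": "Mozilla/5.0"}))     # 200
```

The point of the sketch is placement, not sophistication: the same check performed at the perimeter costs one component's worth of processing, while the same check performed at the application tier has already paid for every hop in between.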
Much has been made in the past few years of the need to align IT with the business, but rarely can we point to such a clear case of IT directly impacting the bottom line as in the realm of controlling transactional costs. Whether through earlier detection and prevention, consolidation, or both, IT can certainly add value by reducing costs through the implementation of a comprehensive application delivery tier.