
Software Inventory Control Systems

Once you have a complete discovery, find applications performing a duplicate activity and consolidate them

This article focuses on Software Inventory Control Systems (SICS). Recently, Bill Rogers (NH DoIT Commissioner) and Peter Hastings (NH DoIT Director) asked me to investigate SICS and provide a recommendation. When I started this research I knew nothing about SICS and didn't understand their value, so I ate, slept, and breathed these tools for a period of time until I did. I got a good education from some smart folks: individuals in DoIT who were familiar with these products, research on the Internet, testimonials from other users, many demos, and inquiries into what other states were using.

Presently, I am the administrator of SCM AllFusion Harvest, a process-based Software Configuration Management (SCM) tool for managing application source code. I manage 181 applications housed in SCM AllFusion Harvest and support 122 users of the product. The development tools we currently use are PowerBuilder PBV8 and PBV11; Visual Studio 2003, 2005, and 2008; Eclipse; and Java.

As the Software Configuration Manager, I administer the source code management tool. This covers the entire infrastructure of the development environment: developing the life cycles; providing best practices, procedures, processes, and documentation; maintaining build machines; and training all the developers on proper source code management using the development tool in our environment.

Before I share my findings on Software Inventory Control Systems, let me start by saying I am not an expert on these tools. I make no claim of that honor. I've only been looking at these tools for a few months. So, with that said, if there are others out there who have their own studies, I would love to see your findings. Everyone's perspective is unique to their needs. There's a lot to learn. Also, if you have questions about SICS, at the end of this article I provide three good resources for you to consider.

The first thing I found out about these tools was that not a lot of folks were doing Software Inventory Control on a formalized level. In many inquiries I found a lot of Excel spreadsheet tracking, Access databases, and the like. It seemed to me that many were only recently discovering the value on a more serious level, in great part because the economy is driving cost savings anywhere they can be found. This is yet another place to save on licensing, potentially consolidate like applications, and preclude potential fines. The economy is driving this effort across the industry right now. Everyone is trying to figure out how to squeeze a watermelon through a garden hose.

I learned quickly that these tools are a discipline unto themselves, not unlike SCM, SQA, database administration, or development: learning, maintaining, reporting, mentoring, upgrading, enhancing, modifying, manual input, testing, sustainment, and more. Running these tools well is a full-time job. In my opinion they require a dedicated staff person who is constantly fine-tuning your environment. Most environments are very dynamic, and that requires attention to detail.

If kept up-to-date they run effectively, and I believe at the end of the day there is a cost benefit. This is a tool that just makes sense: without a full and complete inventory, what are your metrics based on? Any organization that started this seriously some time ago is ahead of the curve. It truly is an enormous task.

In many organizations applications are underutilized or overutilized; you pay for far too much, or you don't have enough. Knowing what you have is tied directly to your bottom line. Being able to discover every asset on every computer you own helps you pay only for what you use and use everything you paid for. It also makes audits easier and heads off potential financial penalties.

Once you have a complete discovery, you can find applications performing a duplicate activity and consolidate them, where that scenario exists, and remove applications and licensing costs that you don't use or need. There are some very good suites out there that offer Patch Management, Power Management, Asset Inventory Control, and Software Usage Analysis - all important in driving down cost.
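To make the reconciliation idea concrete, here is a minimal sketch of the bookkeeping involved, assuming hypothetical discovery records and license counts; none of the record formats come from the products discussed in this article:

```python
# Hypothetical sketch: reconcile discovered installs against purchased
# licenses and flag candidate applications for consolidation. The data
# stands in for output from a discovery agent and a licensing system.
from collections import defaultdict

discovered = [  # (machine, application, functional category)
    ("PC-001", "DiagramToolA", "diagramming"),
    ("PC-002", "DiagramToolB", "diagramming"),
    ("PC-003", "DiagramToolA", "diagramming"),
]
licenses_owned = {"DiagramToolA": 5, "DiagramToolB": 1}

installs = defaultdict(int)
by_category = defaultdict(set)
for machine, app, category in discovered:
    installs[app] += 1
    by_category[category].add(app)

# Licensing position: over- or under-licensed per application.
for app, count in installs.items():
    owned = licenses_owned.get(app, 0)
    status = "OK" if count <= owned else f"UNDER-LICENSED by {count - owned}"
    print(f"{app}: {count} installed / {owned} owned -> {status}")

# Consolidation candidates: more than one product doing the same job.
for category, apps in by_category.items():
    if len(apps) > 1:
        print(f"Consolidation candidate ({category}): {sorted(apps)}")
```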

These tools are quite robust; you can see the RAM on a PC, the slots, how much RAM is in each slot, USB detection, and alerts when new applications are installed or uninstalled. Being able to query every machine's agent and get real-time reports is a powerful addition to your arsenal.
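For a sense of how readily available this raw data is, here is a minimal sketch of the kind of per-machine query an inventory agent performs. It assumes a Windows machine and the third-party Python wmi package (pip install wmi); real SICS agents use their own collectors and protocols:

```python
# Minimal sketch of an agent-style local query using WMI on Windows.
# This only illustrates that RAM-per-slot and installed-software data
# are readily available; a real agent would report these to a server.
import wmi

c = wmi.WMI()  # connect to the local machine

# RAM: one entry per populated slot.
for module in c.Win32_PhysicalMemory():
    mb = int(module.Capacity) // (1024 * 1024)
    print(f"Slot {module.DeviceLocator}: {mb} MB")

# Installed software (MSI-installed products only).
for product in c.Win32_Product():
    print(f"{product.Name} {product.Version}")
```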

As they say knowledge is power and these tools bring power with them. Once I developed a basic understanding of the tools and their capabilities, I had to define what it was that I felt constituted a good tool for my research and our environment. I looked at products across a broad spectrum and thought that for our environment we needed an Enterprise product that could be centralized, have a client agent, integrate into other products and be able to generate reports to reconcile discoveries with financials.

I started by looking at the tools we had in-house to see whether any could be customized. If we can develop our own product in-house, I feel that is better for support and modifications: when you personally own the development, you do better work and get a better product. I determined quickly that most of the tools we had in-house were not really designed for software usage analysis and didn't have agents associated with them; making them fit would require far more work than tools that ship with this functionality in the box.

Next, I looked at open source products. I liked many of them and would feel comfortable using any one of them in a small organization looking to save money. Some of the tools were free up to the first 25 units; beyond that there was a cost, but you also added functionality. The disappointing detail about some of the free open source tools is that there is absolutely no support - many were very clear on this fact. These tools are complicated and do so much that I would choose the Enterprise level just for the added support, to minimize any downtime. These tools are all great as long as they are stable products; otherwise you spend your time patching them and chasing performance issues, which costs productive time for the end user.

I ruled out open source products in an Enterprise environment. I know that will upset some folks, but I just didn't feel they were robust enough and I didn't like the lack of support. I was immediately comfortable with the scalability and functionality of the workhorses. The enterprise-wide tools that I explored provided great discovery into precisely what software assets you are using and allowed you to reconcile this discovery with financials. However, I have not used these tools in any environment on the ground and only witnessed demos.

I can't speak from real-life experience as to whether these tools live up to their big names. However, I feel comfortable saying that the corporate enterprise tools I investigated come from vendors whose other products are good quality as well. All the top vendors I reviewed knew their stuff, and any of my final choices would help you discover all your software assets.

I also explored tools that other states were using or were in the process of implementing. One state was just implementing an ITAM tool, but it was too soon for any results or lessons learned; the second state I spoke with was using an asset management tool but not doing software usage analysis. I suspect that a few years from now these tools will be standard in every business.

When I consider tools I am reviewing, I think generally about standardization - tools that can be implemented across all platforms to fill a broader need; centralization - creating central repositories for important assets so that data assets aren't scattered everywhere; and interoperability - capitalizing on other tools in your arsenal that you can integrate. Metrics acquired from an integrated tool are more accurate than anything received from manual user input. For example, we use CA SCM in our shop and integrate it with at least five other development tools (PowerBuilder, .NET, Eclipse, Java, etc.). It's not always seamless and not without bumps, but it accomplishes the task of centralizing the assets and standardizing SCM by securing and controlling those assets with one tool.

In my research I found that Software Inventory Control Systems were no different from other products. If you can leverage tools you already have through integration, metrics could be generated to provide a clear picture of your entire environment. These SICS tools seemed to work best with a suite that encompasses power management, patch management, asset inventory, software usage analysis, and more. There are several suites available that bundle the whole package up for you. It really is just a great way to go if you can spend the money.

After completing my research I decided on the following five products, which I thought were serious tools to consider:

  1. AssetExplorer ITAM
  2. CA Client Automation/Inventory Control
  3. Kaseya IT Management
  4. IBM Software Usage Analysis/Endpoint Manager
  5. SofTrack Inventory Control

I watched demos of all of these products and thought that any one of them would work very well in most mid-to-large customers' environments. Whether you purchase the suite or the individual products/modules, these are professional tools meant for creative, dynamic, rich work environments. They all cover the same ground, each a little different from the others but with the same end result: discover everything and report on it so that ultimately you can reconcile the acquired data with financials and hopefully realize a cost saving that at least pays for the tool.

The best way to know which product works for you is to "date the product": install it, set it up, beat it up, and see if it delivers. These products are maturing all the time; whether these companies like it or not, they are going to have to keep investing in these tool sets as end users demand more sophistication and the economy drives these products. They are only going to get better.

All of these products and the companies representing them were very professional and knowledgeable. I thank them all for providing information about their products. All of these products are very sophisticated. These are not toys. It takes smart people to run these tools. I joke but there is a lot to know about these products and I just scratched the surface.

Table 1 shows the comparison I used to track a few of the products I was reviewing. The discovery agent is important - it reports on activities that occur in your environment, and those alerts can ultimately save you money.

One useful thing when looking for the right tool is to leverage the expertise you may already have in-house on a particular product. As I mentioned earlier, these tools are best bought as a suite to address all facets of the business, but they can be bought alone as well. My point is this: if you already own several products in a suite, it makes sense to look at the entire suite, and if you have individuals already knowledgeable on a specific module in the suite, you have expertise you can leverage. This doesn't work for all products, of course. Check to see whether there are lessons learned from other users in the community or within your own organization.

When looking for the right tool for your environment some of the main features you really want to have available are as follows:

  1. A good database behind you: Storage, retrieval, and reporting are essential, and that is all about a good database. I like Oracle myself, but SQL Server is also a good choice (a minimal sketch follows this list).
  2. User-friendly GUI: Without a product that is easy to use and implement, it could take time before you see results. Wizards with canned processes can streamline this activity and have you up and running quickly. You don't want a product so sophisticated that no one can run it - a mistake some products make, though not the ones I looked at for this research. I felt that all the products I reviewed took the front-end administrator/user into consideration.
  3. Real-time reporting: You want to be able to provide on-demand reports for status accounting and financial reconciliation.
  4. Manual input: You want to keep track of artifacts that aren't typically discovered - BlackBerry devices, iPods, in-house development applications, etc.
  5. Remote capability: You want access to a machine for patch management, remote discovery, and power management, should a suite be purchased.
  6. USB detection: Detecting activity on USB devices will help prevent viruses if a policy can be set up to scan a device upon insertion.
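To ground points 1, 3, and 4 above, here is a minimal sketch of a database-backed inventory with manual input and an on-demand report. It is illustrative only: SQLite stands in for the Oracle or SQL Server back end a real deployment would use, and the schema and data are invented.

```python
# Illustrative sketch: a tiny inventory database combining
# agent-discovered rows with manual entries, plus an on-demand report.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE assets (
    name      TEXT NOT NULL,
    machine   TEXT NOT NULL,
    source    TEXT NOT NULL,      -- 'agent' or 'manual'
    last_used TEXT                -- ISO date, NULL if never seen in use
)""")

# Agent-discovered rows plus a manual entry for an undiscoverable asset.
db.executemany("INSERT INTO assets VALUES (?, ?, ?, ?)", [
    ("OfficeSuite", "PC-001", "agent", "2012-03-01"),
    ("OfficeSuite", "PC-002", "agent", None),
    ("BlackBerry handheld", "n/a", "manual", None),
])

# On-demand report: installs never seen in use are reclaim candidates.
for name, idle in db.execute(
        "SELECT name, COUNT(*) FROM assets "
        "WHERE last_used IS NULL AND source = 'agent' GROUP BY name"):
    print(f"{name}: {idle} install(s) with no recorded use")
```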

Much of the additional information below on SICS Toolsets is derived directly from the following corporations' literature to ensure accuracy of product information:

CA Inventory Control
It's an enterprise tool that is quite scalable and can run on either an Oracle or SQL Server database - your preference. It includes BusinessObjects XI (BOXI) with its own universe for all your reporting needs. It can be used for all your non-IT assets as well (NY Sports Club uses it for gym equipment, for example, and SLED clients use it to track office furniture). You can enter data manually, via flat file, via bar code scanner, or using an automated discovery tool.

Asset Inventory and Discovery

  • Automated discovery, inventory, configuration, usage and reporting.

Asset Intelligence

  • Advanced asset analytics for informed decision-making.

Continuous and Active Discovery

  • Enables reconciliation of newly discovered assets with inventoried assets.

Comprehensive Inventory Across Heterogeneous Platforms

  • Hardware inventory
  • Software inventory
  • Software usage monitoring
  • Configuration information
  • System utilization
  • History of changes to systems
  • Extensive reporting system

Automate Daily Client Inventory Management Tasks

  • Create proven, reliable and repeatable processes
  • Employ process workflows

Seamless Integration

  • Unified technology architecture, single management console
  • Common management database

Centralized and Remote Management Capabilities

  • Reduce desk-side visits

The Kaseya Advantage

  • Agent-based: Simple, scalable, and lightweight
  • Fully integrated: Single pane of glass operation
  • Firewall friendly: No open ports
  • Fully encrypted: End-to-end protection
  • Secure and reliable: Web-based, sniff-proof and SSL-protected
  • Flexible and scalable: Easy setup, easy configuration

Kaseya Accreditations and Certifications
Kaseya has been rigorously tested and is pending an award of the Common Criteria Evaluation Assurance Certification and FIPS 140-2 Validation. Many federal and state agencies, including the Department of Defense, NASA, and the Department of Transportation use Kaseya solutions to improve service levels - yet still safeguard the integrity and availability of sensitive agency information.

It eliminates waste, sets policies, keeps systems secure, closes vulnerabilities, and keeps machines current. This ultimately has a positive influence on your end users and gives the IT staff the ability to maintain control of those systems.

Kaseya is a server/agent architecture. The agent itself is a 1.2 MB installer and, once installed, consumes 5 MB with very little bandwidth consumption. Kaseya has customers managing thousands and tens of thousands of machines with no performance issues. The communication from the agent to the server is 100% secure (military-grade encryption). The server can reside on premise (virtualized) or be hosted in a cloud environment.

AssetExplorer ITAM
Manage software assets and get complete software information with AssetExplorer. Ensure compliance by keeping a checklist of compliant, under-licensed, and over-licensed software.

AssetExplorer helps keep up-to-date information on all your assets by periodically scanning software, hardware, and other ownership information. Track and manage any workstation or network device - for example, Windows, Linux, Mac, AIX, and Solaris machines, printers, routers, and switches - in your domain or network. It offers both agent-based and agentless scanning.

AssetExplorer scans your network and automatically discovers all software installed on each workstation, so Asset Managers can easily maintain the compliance checklist described above.

Software Usage Tracking
It gets complete information on how often installed software is used and automatically classifies it as Frequently Used, Occasionally Used, or Rarely Used.

Based on Software Usage Tracking reports, Asset Managers can make informed decisions when purchasing or renewing software licenses (see Figure 1).
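As a rough illustration of how such a classification might work, here is a tiny sketch; the thresholds are invented for the example and are not AssetExplorer's actual rules:

```python
# Invented thresholds to illustrate frequency-based classification;
# not the product's actual logic.
def classify_usage(launches_per_month: int) -> str:
    if launches_per_month >= 10:
        return "Frequently Used"
    if launches_per_month >= 1:
        return "Occasionally Used"
    return "Rarely Used"

for app, launches in [("OfficeSuite", 22), ("DiagramToolB", 2), ("OldReportTool", 0)]:
    print(f"{app}: {classify_usage(launches)}")
```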

AssetExplorer also:

  • Scans your network and automatically discovers all software installed on every workstation in your organization
  • Helps you understand how many installations of each application exist on your network
  • Focuses on Software License Management by classifying all paid software as "Managed"
  • Groups all your scanned software as freeware, open source, shareware, prohibited, or excluded using software types
  • Tracks and uninstalls prohibited software that users have installed without your knowledge, using the Remote Control functionality (see Figure 2)

SofTrack's Smart Inventory
SofTrack's Smart Inventory provides automatic application recognition, including where each application is installed and how frequently it is used. It enables you to maximize your application budget and save money by renewing only the application licenses you are actually using.

Many software applications are licensed per installation (node-locked, single-user) rather than by actual usage. Simply put, when you install an application on a workstation it's counted as being in-use regardless of whether you actually use the application.

SofTrack's Smart Inventory provides automatic reports of:

  • What applications are installed
  • Where applications are installed
  • When applications are being used

SofTrack's Smart Inventory enables you to determine the what, where, and when of applications in your environment automatically. It determines what applications are installed on your workstations without requiring any prior knowledge of what is installed on each machine.

SofTrack's Smart Inventory will reveal computer-by-computer where applications are installed including version, publisher and, where available, the application serial number and installation key.

SofTrack's Smart Inventory detects when each application is used and does not rely on the Windows Add/Remove Programs "last used" data. SofTrack's local agent technology collects the following data per application per workstation (a sketch of how such data can drive license reclamation follows this list):

  • Computer where used
  • Number of times launched (run)
  • Most recent time launched
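As an illustration of what this per-workstation launch data makes possible, here is a small sketch that flags installs with no recent use as candidates for license reclamation. The record layout, dates, and cutoff are invented for the example; they are not SofTrack's actual output or logic.

```python
# Sketch: flag node-locked installs with no recent launches as
# candidates for license reclamation (invented record layout).
from datetime import date

CUTOFF_DAYS = 90
TODAY = date(2012, 6, 1)  # fixed "today" so the example is reproducible

records = [  # (computer, app, times launched, most recent launch)
    ("PC-001", "CADTool", 40, date(2012, 5, 28)),
    ("PC-002", "CADTool", 1, date(2011, 11, 2)),
    ("PC-003", "CADTool", 0, None),
]

for computer, app, runs, last in records:
    idle = last is None or (TODAY - last).days > CUTOFF_DAYS
    if idle:
        print(f"{app} on {computer}: {runs} launch(es), "
              f"last {last or 'never'} -> reclaim candidate")
```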

IBM Software Usage Analysis
The Software Usage Analysis tool is an application used to identify underutilized software, track software usage patterns and trends, and detect overused software licenses to maintain compliance with license agreements.

The IBM Tivoli Endpoint Manager solution, built on BigFix technology, provides real-time visibility and control through a single infrastructure, single agent, and single console for systems lifecycle management, endpoint security, and compliance. This approach enables you to securely manage your global IT infrastructure quickly and accurately, resulting in improved governance, control, visibility, and business agility.

IBM Tivoli Endpoint Manager solution leverages a multilayered technology platform that acts as the central nervous system of a global IT infrastructure. A dynamic, content-driven messaging and management system, this innovative technology distributes the work of managing IT infrastructures out to the managed devices. As a result, IBM Tivoli Endpoint Manager is able to operate in real time and deliver the scalability and performance that organizations demand. In addition, the single, multipurpose agent controls multiple services regardless of where the endpoint roams, optimizing the user experience and minimizing system impact. Specifically, the capabilities delivered via the IBM Tivoli Endpoint Manager unified management platform include:

  • Asset discovery and inventory
  • Software distribution
  • Patch management
  • Power management
  • OS deployment
  • Remote control
  • Software use analysis
  • Security configuration management
  • Vulnerability management
  • Anti-virus and anti-malware client management
  • Network self quarantine

IBM Tivoli Endpoint Manager for Lifecycle Management
IBM Tivoli Endpoint Manager for Lifecycle Management streamlines IT operational tasks with a lightweight, single-agent, single-console framework. The suite includes the following major capabilities:

  • Asset Discovery provides visibility into all IP-enabled devices on your network
  • Asset Inventory offers deep inspection capabilities to report on hundreds of hardware and software attributes out-of-the-box, across platforms, in real time, at massive scale
  • Patch Management collapses patch and update time scales, reduces staff workloads, cuts cost, and increases the effectiveness of patch processes
  • Remote Control configures and manages Microsoft Windows Remote Desktop and Remote Assistance functionality, along with Tivoli Remote Control components for initiating and managing sessions remotely
  • Software Distribution automates and adds new levels of assurance to the software deployment process
  • OS Deployment provides bare-metal imaging and provisioning for operating systems
  • Software Use Analysis helps enterprises identify underused software to manage costs and overused licenses to stay on the right side of software license agreements

Our Experience
I cannot report on our experience with a tool of this nature at this time. However, should we procure and implement one, I will report back on our discoveries. We develop and maintain more than 181 applications in-house at New Hampshire's Department of Information Technology. The applications are used extensively in our welfare and health services delivery agencies - for example, for child-care licensing and for managing adult and elderly care. Throughout the state the applications are used by hundreds of users.

Please feel free to email your questions to me at

More Stories By Al Soucy

Al Soucy is software configuration manager at the State of New Hampshire's Department of Information Technology (DoIT). In that role Al manages software configuration for dozens of PowerBuilder applications as well as applications written in Java, .NET, and COBOL (yes, COBOL). Al plays bass guitar, acoustic guitar, electric rhythm/lead guitar, drums, mandolin, keyboard; he sings lead and back up vocals and he has released 8 CDs.

