|By Al Soucy||
|September 9, 2011 10:45 AM EDT||
This article will focus on Software Inventory Control Systems (SICS). Recently, I was asked by Bill Rogers (NH DoIT Commissioner) and Peter Hastings (NH DoIT Director) to take a look at SICS, investigate them, and provide a recommendation. When I started this research I knew nothing about SICS and didn't understand their value. I ate, slept, and breathed these tools for a period of time so that I could understand their value. I got a good education from several sources: individuals in DoIT who were familiar with these products, research on the Internet, testimonials from other users, many demos, and inquiries into what other states were using.
Presently, I am the administrator of SCM AllFusion Harvest, a process-based Software Configuration Management (SCM) tool for managing application source code. I manage 181 applications housed in SCM AllFusion Harvest and support the 122 users of the product. The development tools we currently use are PowerBuilder PBV8 and PBV11; Visual Studio 2003, 2005 and 2008; Eclipse; and Java.
As the Software Configuration Manager, I administer the source code management tool. This covers the entire infrastructure of the development environment: developing the life cycles; providing best practices, procedures, processes, and documentation; maintaining build machines; and training all the developers on proper source code management using the development tool in our environment.
Before I share my findings on Software Inventory Control Systems, let me start by saying I am not an expert on these tools. I make no claim of that honor. I've only been looking at these tools for a few months. So, with that said, if there are others out there who have their own studies, I would love to see your findings. Everyone's perspective is unique to their needs. There's a lot to learn. Also, if you have questions about SICS, at the end of this article I provide three good resources for you to consider.
The first thing I found out about these tools was that not a lot of folks were doing Software Inventory Control on a formalized level. In many inquiries I found a lot of Excel spreadsheet tracking, Access databases, and the like. It seemed to me as though many were only recently discovering the value on a more serious level, in great part because the economy is driving cost savings anywhere they can be found. This is yet another place to save on licensing, potentially consolidating like applications and precluding potential fines. The economy is actually driving this effort across the industry right now. Everyone is trying to figure out how to squeeze a watermelon through a garden hose.
I learned quickly that these tools are a discipline unto themselves, not unlike SCM, SQA, database administration, or development. They involve learning, maintaining, reporting, mentoring, upgrading, enhancing, modifying, manual input, testing, sustainment, and more. Running these tools well is a full-time job. In my opinion they require a dedicated staff person who is constantly fine-tuning your environment. Most environments are very dynamic, and that requires attention to detail.
If kept up-to-date they run effectively, and I believe at the end of the day there is a cost benefit. This is a tool that just makes sense: without a full and complete inventory, what are your metrics based on? Any organization that started this seriously some time ago is ahead of the curve. It truly is an enormous task.
In many organizations applications are underutilized or overutilized; you pay far too much, or you don't have enough. Knowing what you have is tied directly to your bottom line. Being able to discover all your assets on every computer will help you pay only for what you use and use everything you paid for. That is good for auditing and avoids potential financial penalties.
Once you have a complete discovery, you can possibly find applications performing duplicate activity and consolidate them, if that scenario exists, and remove applications and licensing costs that you either don't use or don't need. There are some really good suites out there that offer Patch Management, Power Management, Asset Inventory Control and Software Usage Analysis - all important in driving down cost.
These tools are quite robust: you can see the RAM on a PC, the slots, how much RAM is in each slot, USB detection, and alerts when new applications are installed or uninstalled. Having the ability to query every machine's agent and get real-time reports is a powerful tool to add to your arsenal.
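The install/uninstall alerting described above boils down to diffing successive inventory snapshots reported by each machine's agent. Here is a minimal sketch in Python; the snapshot format and function name are invented for illustration, since real products implement this inside their own agent/server pipeline:

```python
# Hedged sketch: alerting on installs/uninstalls by diffing two
# inventory snapshots from the same machine's agent. The application
# names and snapshot shape are illustrative, not any vendor's format.
def diff_inventory(previous, current):
    """Return (installed, uninstalled) application sets for one machine."""
    prev, curr = set(previous), set(current)
    return curr - prev, prev - curr

yesterday = ["PowerBuilder 11", "Eclipse 3.6"]
today = ["PowerBuilder 11", "Eclipse 3.6", "Visual Studio 2008"]

installed, uninstalled = diff_inventory(yesterday, today)
print(sorted(installed), sorted(uninstalled))
```

An agent-based product runs a loop like this on every reporting cycle, raising an alert whenever either set is non-empty.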
As they say knowledge is power and these tools bring power with them. Once I developed a basic understanding of the tools and their capabilities, I had to define what it was that I felt constituted a good tool for my research and our environment. I looked at products across a broad spectrum and thought that for our environment we needed an Enterprise product that could be centralized, have a client agent, integrate into other products and be able to generate reports to reconcile discoveries with financials.
I started by looking at the tools we had in-house first to see if we had one that could potentially be customized. If we can develop our own product in house, I feel that is better for support and modifications. When you personally own the development, you do better work and get a better product. I determined quickly that most of the tools we had in-house were not really designed for software usage analysis and didn't have agents associated with them; customizing them would require a lot more work than using tools where this functionality comes in the box.
Next, I looked at open source products. I liked many of the tools and would have felt comfortable using any one of them in a small organization looking to save money. Some of the tools I looked at were free up to the first 25 units; beyond that there was a cost, but you also added functionality. The disappointing detail about some of the free open source tools is that there is absolutely "No Support." Many were very clear on this fact. These tools are complicated and do so much that I would consider the Enterprise level just for the added support, to minimize any downtime if possible. These tools are all great as long as they are stable products. Otherwise you spend your time patching them and experiencing performance issues, which costs productive time for the end user.
I ruled out open source products in an Enterprise environment. I know that will upset some folks, but I just didn't feel they were robust enough and I didn't like the lack of support. I was immediately comfortable with the scalability and functionality of the workhorses. The enterprise-wide tools that I explored provided great discovery into precisely what software assets you are using and allowed you to reconcile this discovery with financials. However, I have not used these tools in any environment on the ground and only witnessed demos.
I can't speak from real-life experience as to whether these tools live up to their big names. However, I feel comfortable in saying that the corporate enterprise vendors I investigated sell good-quality products across their portfolios. All the top vendors I reviewed knew their stuff. Any of my final choices would help you discover all your software assets.
I also explored tools that other states were using or were in the process of implementing. One state was just in the process of implementing an ITAM tool, but it was too soon for any results or lessons learned; the second one I spoke with was using an asset management tool but not performing software usage analysis. I suspect that a few years from now these tools will be standard in every business.
When I consider tools that I am reviewing, I think generally about three things: standardization - tools that can be implemented across all platforms to fill a broader need; centralization - creating central repositories for important assets so that data assets aren't scattered everywhere; and interoperability - capitalizing on other tools in your arsenal that you can integrate together. Metrics acquired from an integrated tool are more accurate than any received from manual input by a user. For example, we use CA SCM in our shop and integrate this product with at least five other development tools (PowerBuilder, .NET, Eclipse, Java, etc.). It is not always seamless and not without bumps, but it accomplishes the task of centralizing the assets and standardizing SCM by securing and controlling those assets with one tool.
In my research I found that Software Inventory Control Systems were no different from other products. If you can leverage tools you already have through integration, metrics could be generated to provide a clear picture of your entire environment. These SICS tools seemed to work best with a suite that encompasses power management, patch management, asset inventory, software usage analysis, and more. There are several suites available that bundle the whole package up for you. It really is just a great way to go if you can spend the money.
After completing my findings I decided on the following five products, which I thought were serious tools to consider:
- AssetExplorer ITAM
- CA Client Automation/Inventory Control
- Kaseya IT Management
- IBM Software Usage Analysis/Endpoint Manager
- SofTrack Inventory Control
I watched demos of all of these products and thought that any one of them would work very well in most mid-to-large customers' environments. Whether you purchase the suite or the individual products/modules, these are professional tools meant for creative, dynamic, rich work environments. They all cover the same ground, each a little different from the others but with the same end result: discover everything and report on it, so that ultimately you can reconcile the data acquired with financials and hopefully realize a cost saving that at least pays for the tool.
The best way to know which product works for you is to "date the product." Install it, set it up, beat it up, and see if it delivers. These products are maturing all the time; whether these companies like it or not, they are going to have to invest in these tool sets as a whole as end users demand more sophistication and the economy drives demand. They are only going to get better.
All of these products and the companies representing them were very professional and knowledgeable. I thank them all for providing information about their products. All of these products are very sophisticated. These are not toys. It takes smart people to run these tools. I joke but there is a lot to know about these products and I just scratched the surface.
Table 1 shows a comparison of a few of the products I was reviewing. The recovery agent is important - it reports on activities that occur in your environment, and those alerts can ultimately save you money.
One thing I think is useful when looking for the right tool is to try and leverage the expertise you already may have in-house on a particular product. As I mentioned earlier these tools are best bought as a suite to address all facets of business but can be bought alone as well. However, my point is this - if you already own several products in a suite for some functionality, it makes sense to look at the entire suite. If you have individuals already knowledgeable on a specific module in the suite, you have some expertise you can leverage. This doesn't work for all products of course. Check to see if there are lessons learned from other users in the community or within your own organization.
When looking for the right tool for your environment some of the main features you really want to have available are as follows:
- A good database behind you: Storage, retrieval and reporting are essential, and that is all about a good database. I like Oracle myself, but SQL Server is also a good choice.
- User-friendly GUI: Without a product that is easy to use and implement, it could take time before you see results. Wizards with canned processes can streamline this activity and have you up and running quickly. You don't want a product so sophisticated that no one can run it; some products make that mistake. That was not the case in the ones I looked at for this research. I felt that all the products I reviewed took the front-end administrator/user into consideration.
- Real time reporting: You want to be able to provide reports on-demand for status accounting purposes and financial reconciliation.
- Manual input: You want to keep track of artifacts that aren't typically discovered - BlackBerrys, iPods, in-house development applications, etc.
- Remote capability: You want access to a machine for patch management, remote discovery, and power management, should a suite be purchased.
- USB detection: Detecting activity on USB devices will help prevent viruses if a policy can be set up to scan a device upon insertion.
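The database, real-time reporting, and manual-input features above can be made concrete with a small sketch using Python's built-in sqlite3. The schema, column names, and sample rows are all invented for illustration; a production SICS would sit on a far richer Oracle or SQL Server schema:

```python
import sqlite3

# Hypothetical minimal inventory schema -- illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE asset (
        hostname     TEXT,
        app_name     TEXT,
        version      TEXT,
        last_used    TEXT,              -- ISO date, NULL if never observed
        manual_entry INTEGER DEFAULT 0  -- 1 for undiscoverable items (handhelds, etc.)
    );
""")

rows = [
    ("ws-001",    "PowerBuilder",        "11.0", "2011-08-30", 0),
    ("ws-002",    "PowerBuilder",        "11.0", None,         0),
    ("ws-002",    "Visual Studio",       "2008", "2011-09-01", 0),
    ("mobile-07", "BlackBerry handheld", "-",    None,         1),  # manual input
]
conn.executemany("INSERT INTO asset VALUES (?,?,?,?,?)", rows)

# On-demand report: installation counts per application.
report = conn.execute(
    "SELECT app_name, COUNT(*) FROM asset GROUP BY app_name ORDER BY app_name"
).fetchall()
for app, count in report:
    print(app, count)
```

The same table supports both automated discovery rows and manually entered artifacts, which is what lets one report reconcile the whole environment against financials.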
Much of the additional information below on SICS Toolsets is derived directly from the following corporations' literature to ensure accuracy of product information:
CA Inventory Control
It's an enterprise tool that is quite scalable and can run on either an Oracle or SQL Server database - your preference. It includes BusinessObjects XI (BOXI) with its own universe for all your reporting needs. It can be used for all your non-IT assets as well (NY Sports Club uses it for their gym equipment, for example, and SLED clients use it to track office furniture). You can enter data manually, via flat file, via bar code scanner, or using an automated discovery tool.
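As a hedged illustration of the flat-file entry method, here is a tiny Python sketch that reads asset records from a CSV flat file. The column layout is invented for illustration; CA's actual import format will differ:

```python
import csv
import io

# Hypothetical flat-file asset import -- the columns are invented,
# echoing the non-IT asset examples (gym equipment, office furniture).
flat_file = io.StringIO(
    "asset_tag,description,location\n"
    "NH-0042,Treadmill,Concord gym\n"
    "NH-0043,Office chair,Concord annex\n"
)
assets = list(csv.DictReader(flat_file))
print(len(assets), assets[0]["description"])
```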
Asset Inventory and Discovery
- Automated discovery, inventory, configuration, usage and reporting.
- Advanced asset analytics for informed decision-making.
Continuous and Active Discovery
- Enables reconciliation of newly discovered assets with inventoried assets.
Comprehensive Inventory Across Heterogeneous Platforms
- Hardware inventory
- Software inventory
- Software usage monitoring
- Configuration information
- System utilization
- History of changes to systems
- Extensive reporting system
Automate Daily Client Inventory Management Tasks
- Create proven, reliable and repeatable processes
- Employ process workflows
- Unified technology architecture, single management console
- Common management database
Centralized and Remote Management Capabilities
- Reduce desk-side visits
The Kaseya Advantage
- Agent-based: Simple, scalable, and lightweight
- Fully integrated: Single pane of glass operation
- Firewall friendly: No open ports
- Fully encrypted: End-to-end protection
- Secure and reliable: Web-based, sniff-proof and SSL-protected
- Flexible and scalable: Easy setup, easy configuration
Kaseya Accreditations and Certifications
Kaseya has been rigorously tested and is pending an award of the Common Criteria Evaluation Assurance Certification and FIPS 140-2 Validation. Many federal and state agencies, including the Department of Defense, NASA, and the Department of Transportation use Kaseya solutions to improve service levels - yet still safeguard the integrity and availability of sensitive agency information.
It eliminates waste, sets policies, keeps systems secure, eliminates vulnerabilities and keeps those machines current. This ultimately has a positive influence on your end users and gives the IT staff the ability to maintain control of those systems.
Kaseya is a server/agent architecture. The agent itself is a 1.2 MB installer, and once installed it consumes 5 MB with very little bandwidth consumption. Kaseya has customers managing thousands and tens of thousands of machines with no performance issues. The communication from the agent to the server is 100% secure (military-grade encryption). The server can reside on premises (virtualized), or can be hosted in a cloud environment.
AssetExplorer ITAM
Manage software assets and get complete software information with AssetExplorer. Ensure compliance by keeping a checklist of compliant, under-licensed and over-licensed software.
AssetExplorer helps keep up-to-date information on all your assets by periodically scanning software, hardware and other ownership information. Track and manage any workstation or network device - for example, Windows, Linux, Mac, AIX and Solaris machines, printers, routers, and switches - in your domain or network. It offers both agent-based and agentless scanning.
AssetExplorer scans your network and automatically discovers all software available on each workstation. Asset Managers can easily ensure compliance by keeping a checklist of compliant, under-licensed and over-licensed software.
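The compliant / under-licensed / over-licensed checklist boils down to comparing installed counts against purchased licenses. A minimal Python sketch, with the category names taken from the text and all counts invented for illustration:

```python
# Hedged sketch of a license-compliance checklist. The application
# names and (installed, licensed) counts are illustrative only.
def license_status(installed, licensed):
    if installed > licensed:
        return "under-licensed"  # more copies deployed than purchased: audit risk
    if installed < licensed:
        return "over-licensed"   # shelfware: paying for unused seats
    return "compliant"

inventory = {
    "PowerBuilder":  (122, 122),
    "Visual Studio": (95, 80),
    "Eclipse":       (40, 60),
}
checklist = {app: license_status(i, l) for app, (i, l) in inventory.items()}
print(checklist)
```

Under-licensed titles are where the potential fines live; over-licensed titles are where the cost savings live.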
Software Usage Tracking
Get complete information on how often installed software is used; applications are automatically classified as Frequently Used, Occasionally Used and Rarely Used.
Based on Software Usage Tracking reports, Asset Managers can make informed decisions when purchasing or renewing software licenses (see Figure 1).
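The Frequently / Occasionally / Rarely Used classification can be sketched as a simple rule on days since last launch. The 30- and 90-day cutoffs below are assumptions for illustration; AssetExplorer's actual thresholds may differ:

```python
from datetime import date

# Hedged sketch of usage-frequency classification. The cutoff values
# are assumptions, not the product's documented thresholds.
def classify_usage(last_used, today=date(2011, 9, 9)):
    if last_used is None:
        return "Rarely Used"        # never observed launching
    days = (today - last_used).days
    if days <= 30:
        return "Frequently Used"
    if days <= 90:
        return "Occasionally Used"
    return "Rarely Used"

print(classify_usage(date(2011, 9, 1)))  # launched 8 days ago
print(classify_usage(date(2011, 7, 1)))  # launched 70 days ago
print(classify_usage(None))              # never launched
```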
- Scans your network and automatically discovers all software installed on every workstation in your organization.
- Helps you understand how many installations of each software title exist on your network.
- Focuses on Software License Management by classifying all paid software as "Managed."
- Groups all your scanned software as freeware, open source, shareware, prohibited, or excluded using software types.
- Tracks and uninstalls prohibited software that users have installed without your knowledge, using the Remote Control functionality (see Figure 2).
SofTrack's Smart Inventory
SofTrack's Smart Inventory provides automatic application recognition including where installed and frequency of use. It enables you to maximize your application budget and save money when you renew only the application licenses you are actually using.
Many software applications are licensed per installation (node-locked, single-user) rather than by actual usage. Simply put, when you install an application on a workstation it's counted as being in-use regardless of whether you actually use the application.
SofTrack's Smart Inventory provides automatic reports of:
- What applications are installed
- Where applications are installed
- When applications are being used
SofTrack's Smart Inventory enables you to automatically determine the what, where and when of the applications in your environment - no prior knowledge of what is installed on each machine is required.
SofTrack's Smart Inventory will reveal computer-by-computer where applications are installed including version, publisher and, where available, the application serial number and installation key.
SofTrack's Smart Inventory will detect when each application is used and does not rely upon Windows Add/Remove Programs "last used" data. SofTrack's local agent technology will collect the following data per application per workstation:
- Computer where used
- Number of times launched (run)
- Most recent time launched
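The per-application, per-workstation data listed above (computer where used, launch count, most recent launch) can be sketched as a simple aggregation over launch events. The event format here is invented for illustration; SofTrack's agent collects this internally:

```python
from collections import defaultdict

# Hedged sketch of per-application, per-workstation usage records.
# The (computer, application, timestamp) event tuples are illustrative.
launch_events = [
    ("ws-001", "PowerBuilder", "2011-09-01T09:15"),
    ("ws-001", "PowerBuilder", "2011-09-08T14:02"),
    ("ws-002", "Eclipse",      "2011-09-05T10:30"),
]

usage = defaultdict(lambda: {"launches": 0, "last_launched": ""})
for computer, app, stamp in launch_events:
    record = usage[(computer, app)]
    record["launches"] += 1
    # ISO-8601 strings compare chronologically, so max() keeps the most recent.
    record["last_launched"] = max(record["last_launched"], stamp)

print(usage[("ws-001", "PowerBuilder")])
```

Tracking actual launches, rather than relying on "last used" metadata, is what lets renewal decisions be based on usage instead of installation counts.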
IBM Software Usage Analysis
The Software Usage Analysis tool is an application used to identify under-utilized software, track software usage patterns and trends, and detect over-used software licenses to maintain compliance with license agreements.
IBM Tivoli Endpoint Manager solution, built on the BigFix technology, provides real-time visibility and control through a single infrastructure, single agent, and single console for systems lifecycle management, endpoint security, and compliance. This approach enables you to securely manage your global IT infrastructures quickly and accurately, resulting in improved governance, control, visibility, and business agility.
IBM Tivoli Endpoint Manager solution leverages a multilayered technology platform that acts as the central nervous system of a global IT infrastructure. A dynamic, content-driven messaging and management system, this innovative technology distributes the work of managing IT infrastructures out to the managed devices. As a result, IBM Tivoli Endpoint Manager is able to operate in real time and deliver the scalability and performance that organizations demand. In addition, the single, multipurpose agent controls multiple services regardless of where the endpoint roams, optimizing the user experience and minimizing system impact. Specifically, the capabilities delivered via the IBM Tivoli Endpoint Manager unified management platform include:
- Asset discovery and inventory
- Software distribution
- Patch management
- Power management
- OS deployment
- Remote control
- Software use analysis
- Security configuration management
- Vulnerability management
- Anti-virus and anti-malware client management
- Network self quarantine
IBM Tivoli Endpoint Manager for Lifecycle Management
IBM Tivoli Endpoint Manager for Lifecycle Management streamlines IT operational tasks with a lightweight, single-agent, single-console framework. The suite includes the following major capabilities:
- Asset Discovery provides visibility into all IP-enabled devices on your network
- Asset Inventory offers deep inspection capabilities to report on hundreds of hardware and software attributes out-of-the-box, across platforms, in real time, at massive scale
- Patch Management collapses patch and update time scales, reduces staff workloads, cuts cost, and increases the effectiveness of patch processes
- Remote Control configures and manages:
- Microsoft Windows Remote Desktop and Remote Assistance functionality
- Tivoli Remote Control components to initiate and manage sessions remotely
- Software Distribution automates and adds new levels of assurance to the software deployment process
- OS Deployment provides bare-metal imaging and provisioning for operating systems
- Software Use Analysis helps enterprises identify under used software to manage costs and over used licenses to stay on the right side of software license agreements
I cannot report on our experience with a tool of this nature at this time. However, should we procure one for our use and implement it I will report back in the future on our discoveries. We develop and maintain more than 181 applications in-house at New Hampshire's Department of Information Technology. The applications are used extensively in our welfare and health services delivery agencies. Example applications are for child-care licensing and managing adult and elderly care. Throughout the state the applications are used by hundreds of users.
Please feel free to send your questions to me via http://alsoucy.sys-con.com/