|By Al Soucy||
|September 9, 2011 10:45 AM EDT||
This article focuses on Software Inventory Control Systems (SICS). Recently, Bill Rogers (NH DoIT Commissioner) and Peter Hastings (NH DoIT Director) asked me to take a look at SICS, investigate them, and provide a recommendation. When I started this research I knew nothing about SICS and didn't understand their value, so I ate, slept, and breathed these tools for a period of time until I did. I got a good education along the way: from individuals in DoIT who were familiar with these products, from Internet research, from testimonials of other users, from many demos, and from inquiries into what other states were using.
Presently, I am the administrator of SCM AllFusion Harvest, a process-based Software Configuration Management (SCM) tool for managing application source code. I manage 181 applications housed in SCM AllFusion Harvest and support the 122 users of the product. The development tools we currently use are PowerBuilder PBV8 and PBV11; Visual Studio 2003, 2005, and 2008; Eclipse; and Java.
As the Software Configuration Manager, I administer the source code management tool. This covers the entire development infrastructure: developing the life cycles; providing best practices, procedures, processes, and documentation; maintaining build machines; and training all the developers on proper source code management within our development environment.
Before I share my findings on Software Inventory Control Systems, let me start by saying I am not an expert on these tools. I make no claim of that honor. I've only been looking at these tools for a few months. So, with that said, if there are others out there who have their own studies, I would love to see your findings. Everyone's perspective is unique to their needs. There's a lot to learn. Also, if you have questions about SICS, at the end of this article I provide three good resources for you to consider.
The first thing I found out about these tools was that not many folks were doing Software Inventory Control on a formalized level. In many of my inquiries I found a lot of Excel spreadsheet tracking, Access databases, and the like. It seemed to me that many organizations were only recently discovering the value of these tools on a more serious level, in great part because the economy is driving cost savings anywhere they can be found. This is yet another place to save on licensing, potentially consolidate like applications, and preclude potential fines. The economy is driving this effort across the industry right now; everyone is trying to figure out how to squeeze a watermelon through a garden hose.
I learned quickly that these tools are a discipline unto themselves, not unlike SCM, SQA, or database administration: learning, maintaining, reporting, mentoring, upgrading, enhancing, modifying, manual input, testing, sustainment, and more. Running these tools well is a full-time job. In my opinion they require a dedicated staff person who is constantly fine-tuning the environment. Most environments are very dynamic, and that requires attention to detail.
If kept up-to-date they run effectively, and I believe that at the end of the day there is a cost benefit. This is a tool that just makes sense: without a full and complete inventory, what are your metrics based on? Any organization that started this seriously some time ago is ahead of the curve. It truly is an enormous task.
In many organizations applications are underutilized or overutilized; you pay far too much, or you don't have enough. Knowing what you have is tied directly to your bottom line. Being able to discover every asset on every computer helps you pay for only what you use and use only what you paid for. That is good for auditing, and it heads off potential financial penalties.
Once you have a complete discovery, you can look for applications performing duplicate activity and consolidate them, if that scenario exists, and remove applications and licensing costs that you either don't use or don't need. There are some really good suites out there that combine Patch Management, Power Management, Asset Inventory Control, and Software Usage Analysis - all important in driving down cost.
These tools are quite robust: you can see the RAM on a PC, the slots, how much RAM is in each slot, USB detection, and alerts when new applications are installed or uninstalled. The ability to query every machine's agent and get real-time reports is a powerful addition to your arsenal.
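As a rough illustration of the kind of real-time data an agent might report back to a central server, here is a minimal Python sketch. The field names and payload shape are my own invention for illustration, not any vendor's actual schema:

```python
import json
import os
import platform
import socket

def build_inventory_report():
    """Collect a few basic facts about this machine, as a discovery
    agent might, and package them for transmission to a central server."""
    report = {
        "hostname": socket.gethostname(),
        "os": platform.system(),
        "os_release": platform.release(),
        "cpu_count": os.cpu_count(),
        # A real agent would also enumerate RAM slots, USB events, and
        # installed applications via platform-specific APIs.
    }
    return json.dumps(report)

print(build_inventory_report())
```

A production agent would sit behind this kind of collection routine, forwarding reports on a schedule or in response to events such as an application install.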
As they say, knowledge is power, and these tools bring power with them. Once I developed a basic understanding of the tools and their capabilities, I had to define what constituted a good tool for my research and our environment. I looked at products across a broad spectrum and concluded that our environment needed an enterprise product that could be centralized, have a client agent, integrate with other products, and generate reports to reconcile discoveries with financials.
I started by looking at the tools we already had in-house, to see whether one could be customized. If we can develop our own product in-house, I feel that is better for support and modifications: when you personally own the development, you do better work and get a better product. I quickly determined that most of the tools we had in-house were not really designed for software usage analysis and didn't have agents associated with them; adapting them would require far more work than using tools that already ship with this functionality in the box.
Next, I looked at open source products. I liked many of them and would feel comfortable using any one of them in a small organization looking to save money. Some of the tools I looked at were free up to the first 25 units; beyond that there was a cost, but you also added functionality. The disappointing detail about some of the free open source tools is that there is absolutely no support; many were very clear on this fact. These tools are complicated and do so much that I would choose the enterprise level just for the added support, to minimize any downtime. These tools are all great as long as they are stable products; otherwise you spend your time patching them and dealing with performance issues, which costs productive time for the end user.
I ruled out open source products for an enterprise environment. I know that will upset some folks, but I just didn't feel they were robust enough, and I didn't like the lack of support. I was immediately comfortable with the scalability and functionality of the workhorses: the enterprise-wide tools I explored provided great discovery into precisely what software assets you are using and let you reconcile that discovery with financials. However, I have not used these tools on the ground in any environment; I have only witnessed demos.
So I can't speak from real-life experience as to whether these tools live up to their big names. However, I feel comfortable saying that the other products these corporations sell are good-quality products as well. All the top vendors I reviewed knew their stuff, and any of my final choices would help you discover all your software assets.
I also explored tools that other states were using or were in the process of implementing. One state was just implementing an ITAM tool, but it was too soon for any results or lessons learned; a second state I spoke with was using an asset management tool but not doing software usage analysis. I suspect that a few years from now these tools will be standard in every business.
When I review tools, I think generally about three things: standardization - tools that can be implemented across all platforms to fill a broader need; centralization - creating central repositories for important assets so that data assets aren't scattered everywhere; and interoperability - capitalizing on other tools in your arsenal that you can integrate. The metrics you acquire from an integrated tool are more accurate than anything received from manual user input. For example, we use CA SCM in our shop and integrate it with at least five other development tools (PowerBuilder, .NET, Eclipse, Java, etc.). It is not always seamless and not without bumps, but it accomplishes the task of centralizing the assets and standardizing SCM by securing and controlling those assets through one tool.
In my research I found that Software Inventory Control Systems are no different from other products: if you can leverage tools you already have through integration, metrics can be generated to provide a clear picture of your entire environment. These SICS tools seem to work best within a suite that encompasses power management, patch management, asset inventory, software usage analysis, and more. There are several suites available that bundle the whole package up for you, and that really is a great way to go if you can spend the money.
After completing my research I decided on the following five products, which I thought were serious tools to consider:
- AssetExplorer ITAM
- CA Client Automation/Inventory Control
- Kaseya IT Management
- IBM Software Usage Analysis/End point Manager
- SofTrack Inventory Control
I watched demos of all of these products and thought that any one of them would work very well in most mid-to-large customers' environments. Whether you purchase the suite or the individual products/modules, these are professional tools meant for creative, dynamic, rich work environments. They all cover the same ground, each a little different from the others but with the same end result: discover everything and report on it, so that ultimately you can reconcile the data acquired with financials and hopefully realize a cost saving that at least pays for the tool.
The best way to know which product works for you is to "date the product": install it, set it up, beat it up, and see if it delivers. These products are maturing all the time. Whether these companies like it or not, they are going to have to keep investing in these tool sets as end users demand more sophistication and the economy drives demand. They are only going to get better.
All of these products, and the companies representing them, were very professional and knowledgeable, and I thank them all for providing information about their products. These products are very sophisticated - they are not toys, and it takes smart people to run them. I joke, but there is a lot to know about these products, and I have just scratched the surface.
Table 1 shows the comparison I used for a few of the products I was reviewing. The discovery agent is important - it reports on activities that occur in your environment, and those alerts can ultimately save you money.
One thing I think is useful when looking for the right tool is to try to leverage the expertise you may already have in-house on a particular product. As I mentioned earlier, these tools are best bought as a suite to address all facets of the business, but they can be bought alone as well. My point is this: if you already own several products in a suite, it makes sense to look at the entire suite, and if you have individuals already knowledgeable on a specific module in that suite, you have expertise you can leverage. This doesn't work for all products, of course. Check for lessons learned from other users in the community or within your own organization.
When looking for the right tool for your environment some of the main features you really want to have available are as follows:
- A good database behind you: Storage, retrieval, and reporting are essential, and that is all about a good database. I like Oracle myself, but SQL Server is also a good choice.
- User-friendly GUI: Without a product that is easy to use and implement, it could take time before you see results. Wizards with canned processes can streamline this activity and have you up and running quickly. You don't want a product so sophisticated that no one can run it - a mistake some products make, though not the ones I looked at for this research. I felt that all the products I reviewed took the front-end administrator/user into consideration.
- Real-time reporting: You want to be able to provide reports on demand for status accounting purposes and financial reconciliation.
- Manual input: You want to keep track of artifacts that aren't typically discovered - BlackBerry devices, iPods, in-house development applications, etc.
- Remote capability: You want access to a machine for patch management, remote discovery, and, should a suite be purchased, power management.
- USB detection: Detecting activity on USB devices helps prevent viruses if a policy can be set up to scan a device upon insertion.
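To make the database and on-demand reporting points concrete, here is a small, hypothetical sketch in Python using SQLite. The table and column names are my own; a production SICS would use Oracle or SQL Server as discussed above, with a far richer schema:

```python
import sqlite3

def demo_inventory_report():
    """Store a few discovered installations and pull an on-demand report."""
    # In-memory database purely for illustration.
    con = sqlite3.connect(":memory:")
    con.execute(
        "CREATE TABLE installs (machine TEXT, app TEXT, version TEXT)"
    )
    con.executemany(
        "INSERT INTO installs VALUES (?, ?, ?)",
        [
            ("pc-01", "PowerBuilder", "11"),
            ("pc-02", "PowerBuilder", "11"),
            ("pc-02", "Visual Studio", "2008"),
        ],
    )
    # On-demand report: how many machines carry each application?
    rows = con.execute(
        "SELECT app, COUNT(DISTINCT machine) FROM installs "
        "GROUP BY app ORDER BY app"
    ).fetchall()
    con.close()
    return rows

print(demo_inventory_report())
# [('PowerBuilder', 2), ('Visual Studio', 1)]
```

Reports like this per-application machine count are the raw material you would later reconcile against license entitlements and financials.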
Much of the additional information below on SICS Toolsets is derived directly from the following corporations' literature to ensure accuracy of product information:
CA Inventory Control
It's an enterprise tool that is quite scalable and can run on either an Oracle or SQL Server database - your preference. It includes BusinessObjects XI (BOXI) with its own universe for all your reporting needs. It can be used for all your non-IT assets as well (NY Sports Club uses it for gym equipment, for example, and SLED clients use it to track office furniture). You can enter data manually, via flat file, via bar code scanner, or using an automated discovery tool.
Asset Inventory and Discovery
- Automated discovery, inventory, configuration, usage and reporting.
- Advanced asset analytics for informed decision-making.
Continuous and Active Discovery
- Enables reconciliation of newly discovered assets with inventoried assets.
Comprehensive Inventory Across Heterogeneous Platforms
- Hardware inventory
- Software inventory
- Software usage monitoring
- Configuration information
- System utilization
- History of changes to systems
- Extensive reporting system
Automate Daily Client Inventory Management Tasks
- Create proven, reliable and repeatable processes
- Employ process workflows
- Unified technology architecture, single management console
- Common management database
Centralized and Remote Management Capabilities
- Reduce desk-side visits
The Kaseya Advantage
- Agent-based: Simple, scalable, and lightweight
- Fully integrated: Single pane of glass operation
- Firewall friendly: No open ports
- Fully encrypted: End-to-end protection
- Secure and reliable: Web-based, sniff-proof and SSL-protected
- Flexible and scalable: Easy setup, easy configuration
Kaseya Accreditations and Certifications
Kaseya has been rigorously tested and is pending an award of the Common Criteria Evaluation Assurance Certification and FIPS 140-2 Validation. Many federal and state agencies, including the Department of Defense, NASA, and the Department of Transportation use Kaseya solutions to improve service levels - yet still safeguard the integrity and availability of sensitive agency information.
It eliminates waste, sets policies, keeps systems secure, removes vulnerabilities, and keeps machines current. This ultimately has a positive influence on your end users and gives the IT staff the ability to maintain control of those systems.
Kaseya is a server/agent architecture. The agent itself is a 1.2 MB installer; once installed it consumes 5 MB and very little bandwidth. Kaseya has customers managing thousands and tens of thousands of machines with no performance issues. The communication from the agent to the servers is secured with military-grade encryption. The server can reside on premises (virtualized) or be hosted in a cloud environment.
AssetExplorer ITAM
Manage software assets and get complete software information with AssetExplorer. Ensure compliance by keeping a checklist of compliant, under-licensed, and over-licensed software.
AssetExplorer helps keep up-to-date information on all your assets by periodically scanning software, hardware, and other ownership information. Track and manage any workstation or network device - Windows, Linux, Mac, AIX, and Solaris machines, printers, routers, and switches - in your domain or network. It offers both agent-based and agentless scanning.
AssetExplorer scans your network and automatically discovers all software installed on each workstation. Asset managers can easily ensure compliance by keeping a checklist of compliant, under-licensed, and over-licensed software.
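The compliance checklist idea can be sketched as a simple comparison of installed counts against licensed seat counts. This is a toy model of my own, not AssetExplorer's actual logic:

```python
def license_status(installed, licensed):
    """Classify one title the way the article describes:
    compliant, under-licensed, or over-licensed."""
    if installed > licensed:
        return "under-licensed"   # more installs than seats owned
    if installed < licensed:
        return "over-licensed"    # paying for unused seats
    return "compliant"

def compliance_checklist(inventory, entitlements):
    """inventory: {app: installed count}; entitlements: {app: seats owned}."""
    return {
        app: license_status(inventory.get(app, 0), entitlements.get(app, 0))
        for app in set(inventory) | set(entitlements)
    }

# Example: 12 installs against 10 seats flags AppA as under-licensed.
print(compliance_checklist({"AppA": 12, "AppB": 3},
                           {"AppA": 10, "AppB": 3, "AppC": 5}))
```

Note how a title that exists only in the entitlements (AppC above) surfaces as over-licensed - exactly the "paying for what you don't use" case discussed earlier.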
Software Usage Tracking
AssetExplorer gathers complete information on how often installed software is used and automatically classifies it as Frequently Used, Occasionally Used, or Rarely Used.
Based on Software Usage Tracking reports, asset managers can make informed decisions when purchasing or renewing software licenses (see Figure 1).
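A toy version of that three-bucket classification might look like the following. The launch-count thresholds are arbitrary assumptions of mine, not AssetExplorer's:

```python
def classify_usage(launches_per_month):
    """Bucket an application by how often it was launched in a month."""
    if launches_per_month >= 10:
        return "Frequently Used"
    if launches_per_month >= 2:
        return "Occasionally Used"
    return "Rarely Used"

# An app launched 15 times a month is clearly in daily-driver territory;
# one launched once is a renewal candidate to question.
print(classify_usage(15))  # Frequently Used
print(classify_usage(0))   # Rarely Used
```

The real value is in the follow-up: feeding the "Rarely Used" bucket into the license-renewal conversation.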
AssetExplorer scans your network and automatically discovers all software installed on all the workstations in your organization. It helps you understand how many installations of each piece of software exist on your network. It focuses on Software License Management by classifying all paid software as "Managed." You can group all your scanned software as freeware, open source, shareware, prohibited, or excluded using software types, and you can track and uninstall any prohibited software that users have installed without your knowledge using the Remote Control functionality (see Figure 2).
SofTrack's Smart Inventory
SofTrack's Smart Inventory provides automatic application recognition including where installed and frequency of use. It enables you to maximize your application budget and save money when you renew only the application licenses you are actually using.
Many software applications are licensed per installation (node-locked, single-user) rather than by actual usage. Simply put, when you install an application on a workstation it's counted as being in-use regardless of whether you actually use the application.
SofTrack's Smart Inventory provides automatic reports of:
- What applications are installed
- Where applications are installed
- When applications are being used
SofTrack's Smart Inventory enables you to automatically determine the what, where, and when of applications in your environment, with no prior knowledge of what is installed on each machine.
SofTrack's Smart Inventory will reveal, computer by computer, where applications are installed, including version, publisher, and, where available, the application serial number and installation key.
SofTrack's Smart Inventory will detect when each application is used, without relying on the Windows Add/Remove Programs "last used" data. SofTrack's local agent technology collects the following data per application per workstation:
- Computer where used
- Number of times launched (run)
- Most recent time launched
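The per-application, per-workstation data described above could be aggregated from raw launch events roughly like this. The event format is my own assumption for illustration, not SofTrack's:

```python
def summarize_launches(events):
    """events: iterable of (computer, app, timestamp) launch records.
    Returns {(computer, app): {"launches": n, "last_launched": ts}},
    mirroring the three fields the article lists."""
    summary = {}
    for computer, app, ts in events:
        rec = summary.setdefault(
            (computer, app), {"launches": 0, "last_launched": ts}
        )
        rec["launches"] += 1
        # ISO-8601 date strings compare correctly as plain strings.
        rec["last_launched"] = max(rec["last_launched"], ts)
    return summary

events = [
    ("pc-01", "AppA", "2011-09-01"),
    ("pc-01", "AppA", "2011-09-05"),
    ("pc-02", "AppA", "2011-09-03"),
]
print(summarize_launches(events))
```

An agent would stream such events continuously; the summary is what lands in the reports.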
IBM Software Usage Analysis
The Software Usage Analysis tool identifies under-utilized software, tracks software usage patterns and trends, and detects over-used software licenses to maintain compliance with license agreements.
IBM Tivoli Endpoint Manager solution, built on the BigFix technology, provides real-time visibility and control through a single infrastructure, single agent, and single console for systems lifecycle management, endpoint security, and compliance. This approach enables you to securely manage your global IT infrastructures quickly and accurately, resulting in improved governance, control, visibility, and business agility.
IBM Tivoli Endpoint Manager solution leverages a multilayered technology platform that acts as the central nervous system of a global IT infrastructure. A dynamic, content-driven messaging and management system, this innovative technology distributes the work of managing IT infrastructures out to the managed devices. As a result, IBM Tivoli Endpoint Manager is able to operate in real time and deliver the scalability and performance that organizations demand. In addition, the single, multipurpose agent controls multiple services regardless of where the endpoint roams, optimizing the user experience and minimizing system impact. Specifically, the capabilities delivered via the IBM Tivoli Endpoint Manager unified management platform include:
- Asset discovery and inventory
- Software distribution
- Patch management
- Power management
- OS deployment
- Remote control
- Software use analysis
- Security configuration management
- Vulnerability management
- Anti-virus and anti-malware client management
- Network self-quarantine
IBM Tivoli Endpoint Manager for Lifecycle Management
IBM Tivoli Endpoint Manager for Lifecycle Management streamlines IT operational tasks with a lightweight, single-agent, single-console framework. The suite includes the following major capabilities:
- Asset Discovery provides visibility into all IP-enabled devices on your network
- Asset Inventory offers deep inspection capabilities to report on hundreds of hardware and software attributes out-of-the-box, across platforms, in real time, at massive scale
- Patch Management collapses patch and update time scales, reduces staff workloads, cuts cost, and increases the effectiveness of patch processes
- Remote Control configures and manages:
- Microsoft Windows Remote Desktop and Remote Assistance functionality
- Tivoli Remote Control components to initiate and manage sessions remotely
- Software Distribution automates and adds new levels of assurance to the software deployment process
- OS Deployment provides bare-metal imaging and provisioning for operating systems
- Software Use Analysis helps enterprises identify underused software to manage costs, and overused licenses to stay on the right side of software license agreements
I cannot report on our experience with a tool of this nature at this time. However, should we procure and implement one, I will report back on our discoveries. We develop and maintain more than 181 applications in-house at New Hampshire's Department of Information Technology. The applications are used extensively in our welfare and health services delivery agencies - for example, for child-care licensing and for managing adult and elderly care. Throughout the state the applications are used by hundreds of users.
Please feel free to email your questions to me at http://alsoucy.sys-con.com/