An End-to-End Solution

Can one company fulfill all of the requirements of an organization?

I was asked by Mr. Peter Hastings (NH DoIT Commissioner) about my understanding and knowledge of "End-to-End Solutions." I have witnessed these solutions before, but I wanted to find a good definition. So I began my research and found this one: a single supplier or company that provides all of your hardware and software needs in a suite of tools that meets all of the customer's requirements, with no other suppliers involved. I think this is a good definition and it makes sense to me.

I'm sure many other definitions exist and this one is simplistic in nature, but I see it as a baseline definition. One could certainly branch off from it to encompass a variety of explanations, but let's stick with this one for this article.

This idea has been around for some time now; slowly but surely, as the years go on, it seems to fade away and pop back up, over and over. I have been watching and observing for a long time, have witnessed the process and implementation of these suites over the years, and have developed my own opinions about all-encompassing tools. So this has come up before in my career.

Commissioner Hastings wanted to understand, at a high level, whether there was any benefit for our development teams across the enterprise of State government in integrating an entire suite of tools to achieve a more seamless Software Development Life Cycle (SDLC). If that were possible, there could be savings realized through shorter development life cycles: rather than an application taking three years to complete, trim a year and the State saves money.

Theoretically this is a great idea and makes good sense, but in reality the only shops that can take on an idea of this magnitude are shops with a lot of money, a lot of staff, and a lot of in-house expertise. Inherent in the words "End-to-End Solution" I hear big, large, enormous, all-encompassing, and there is a lot of overhead in this idea right out of the gate.

I prefaced my dialog with the Commissioner by saying that I would be happy to write about this subject, but that not everyone would agree with my analysis and conclusions. I explained that this would only be my perspective and that there are varying experiences that would serve as a counterpoint to my thoughts on this subject. The Commissioner is quite open to varying opinions and really just wanted to understand the subject at a high level more than anything else. So I can only tell you what I have seen and witnessed over the years and share my thoughts.

The premise is that one company can fulfill all of the requirements of an organization. Can this be true? Can one supplier provide this much and help an organization realize cost savings? In my opinion, I have not seen it be true to date. It's a big white unicorn, although you have to believe in a unicorn once in a while. Every company has its specialty: the things it does well, and the things it tries to do well and fails at.

We are all microcosms of the wider world, so we are all "End-to-End Solutions" in some way. Do we as humans do it all well? I think not. We all have things we do well and things we don't, and the same goes for companies. So the premise that a company can deliver it all is, to me, not real but fantasy. But let's take it one step further and play devil's advocate for a moment: suppose a company really could deliver an entire "End-to-End Solution," a suite of tools that delivered it all.

These are the next questions to ask: Can the organization it's being delivered to really manage it all? Does it have the staff to support the toolsets in the suite? Does it have the expertise required to implement them? Does it have the time it takes to learn and support them? Can it really manage an entire suite of toolsets successfully?

Consider even one tool in the suite; let's take the SCM tool, since I know that tool well. You have the software configuration administration piece, you have the release management piece performing compiles, you have the defect tracking piece managing deficiencies, modifications, and enhancements against the application, and then you have code reviews. Theoretically that could be three to four people with these skills, and that's just one toolset in the suite. Do you already have all of this expertise in house, in one person? SCM is not an easy skill to come by to begin with.
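
To make the release management piece a little more concrete, here is a minimal sketch of the kind of glue a release engineer ends up writing: run the compile, label the build, and record the result somewhere the defect tracking and reporting pieces can see it. It's written in Python purely for illustration, and the build command, file names, and label format are all hypothetical; a real SCM suite would supply its own build and labeling mechanisms.

    #!/usr/bin/env python3
    # Illustrative release-management step: compile, label, record.
    # The build command, file names, and label format are hypothetical;
    # a real SCM suite supplies its own build and labeling tools.
    import datetime
    import json
    import subprocess
    import sys

    def run_build(build_command):
        # Run the compile step and report whether it succeeded.
        result = subprocess.run(build_command, shell=True)
        return result.returncode == 0

    def record_build(label, succeeded, log_path="build_history.json"):
        # Append a build record so defect tracking/reporting can reference it.
        entry = {
            "label": label,
            "succeeded": succeeded,
            "timestamp": datetime.datetime.now().isoformat(),
        }
        try:
            with open(log_path) as f:
                history = json.load(f)
        except FileNotFoundError:
            history = []
        history.append(entry)
        with open(log_path, "w") as f:
            json.dump(history, f, indent=2)

    if __name__ == "__main__":
        label = "REL-" + datetime.date.today().strftime("%Y%m%d")
        # Substitute your shop's real compile command for this placeholder.
        ok = run_build("echo compiling the application")
        record_build(label, ok)
        sys.exit(0 if ok else 1)

Even this tiny sketch implies somebody who understands the build, the labeling convention, and the reporting side, which is the point: that is one small corner of one toolset.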

Which tools will your organization do well with, and which ones do you already have knowledge of? Several toolsets in the suite may get worked because there is in-house expertise on them but not on the others, and you may have to hire or train staff on the rest, provided you have the staff to train. In any case it usually turns out that you don't have all that it takes, and the implementation fails. Once the implementation fails, the buy-in from management fades because no one wants to be associated with a failing project, the grassroots folks really working the products on the ground are too stressed from the implementation, the experiment is over, and a lot of money is wasted that could have been targeted in a more productive manner.

The problem really is that when you get a lot of folks in a room, the sell is on, the atmosphere is charged with enthusiasm, and a lot of folks see potential savings not yet realized, they become swept away in the sell. When this happens it usually ends up costing an organization a lot of unintended money, and at some point the question is always asked: what did we do?

Ultimately, I agreed to write on this subject because I'm bugged about "End-to-End Solutions." I'm bugged about the sell. I'm bugged about the hype and the promise that I don't see how anyone could deliver. For example, I have recording software that is more automated than anything I have seen before, but I still have to know how to run the tool, record with it, mix, add effects, edit the recording, and master to a stereo track, even though the tool is very automated.

The same would be the case for the "End-to-End Solution": there is the Testing tool, the QA tool, the Project Management tool, the Requirements Gathering tool, the SCM tool, the Defect Tracking tool, and whatever other tool or plug-in you add to the suite. I'm sure the suites can be customized to include only what you need for your organization, which makes more sense.

The bottom line is that's a lot of expertise, staff, time, money, and resources. The tools have to be learned; you might have to hire additional folks to support them, and others will have to be cross-trained in the event that your expertise goes away. What organization can really afford this level of overhead other than a large organization with a chunk of capital?

I think one of the big mistakes organizations make is that when these sells are occurring, there is often no expertise from the organization in the room to ask the tough questions: What training is the supplier offering? What is the technical support like? Are you using a proprietary database? If the database goes down, who supports it?

It's typically managers in the room, who are not even going to do the work on the ground or support these behemoths. I'm also bugged by the suites themselves, which typically do a good job in one area and fall down in others, yet now you have an entire suite of tools to make work. For example, I have seen companies make a great SCM tool but a terrible Defect Tracking tool, or a great Project Management tool but a terrible SCM tool.

The idea that one supplier can do it all is, in my opinion, flawed. It's just like when I'm recording music, since that's the analogy I understand best: as automated as any mix may be, whether it's "click here for rock, jazz, easy listening," the user still has to do something to get to that place, and if it's not the right place, the user must understand how to get to the right one. Human intervention is required. Truly automated tools are a myth in my opinion.

Now, a lot of folks are going to disagree with me, because they are selling these monsters to people who can't support them, implement them, maintain them, provide technical support on them, or mentor others on them. It really requires an army, depending on how big the organization is, and that brings up my first example.

Every organization wants to save money, and organizations often find themselves getting sold on this pot of gold: if everything could be integrated across the enterprise of the business, however big or small, we could save so much money. And here's the thing, as I said earlier.

A large organization can take the impact of a completely integrated software environment like this because it has the staff. If it has the staff, then it possibly has the expertise, or at least has staff that can initially be trained by the seller. However, when the seller leaves, there must be competent staff who can truly maintain the tools and champion them to create the standard that can actually produce the desired result and cost savings.

For most organizations, the additional staff, training, and expertise would more than likely be a great unintended expense. Good for you and your organization if you have expertise in the suite's toolsets in house, but typically that is not the case.

Often you have to go outside your space to find help: maybe someone who knows QA, which is hard to find these days, or SCM, which is even harder, or testers, and so on. If you have a suite of five toolsets, you must have a champion for each of those toolsets in order to introduce them successfully in an organization.

Now, if you try this in State government you're even more likely to fail, because you never have the staff, expertise, time, or money. I have seen this attempted at least twice in my tenure.

Also, something organizations don't take into consideration when taking on suites like this is that it usually requires a culture that accepts change, having its cheese moved, and understanding the chain of command. It typically requires a culture shift, and most folks don't like change, including me. However, it was once said to me many years ago that change is stability; it's the one constant. So I try to believe in that.

So consider for one moment an organization taking on a tool suite of this nature, the "End-to-End Solution." Once you have hired staff, have your trained expertise in house, and your staffing is solidly in place, you now need your leadership to convincingly direct the staff to implement and accept the new methodology. This industry is always in flux, moves fast, and is never static, but change is still difficult to implement.

When everyone gets on the bus and you have institutionalized a new culture with new toolsets, and they are embraced, then savings can be realized, because anyone who then draws outside the lines changes the integrity of the outcome and a new outcome emerges.

I've told you what you need and that larger organizations are more successful at this than smaller ones because they have the staffing, expertise, money and time on their side.

Let me provide one example of what I have seen and the way I think this can be done successfully, even in smaller organizations. Many attempts fail, and the reason is simple: it's too big to eat. I mean these implementations are monstrous. For starters, they take multiple administrators, one for every toolset in the suite. One person should never be running the whole suite; each tool should have its own administrator, because otherwise it's a security risk. You never want one person holding all the keys to the kingdom anyway.

There are multiple layers of understanding and expertise, and it takes a lot of time to get up and running. Now, plenty of folks will disagree with me, and that's fine; they are the ones selling these suites, and they are going to have a different opinion.

As I have learned in life anyway, one-third of the people are going to like you, one-third are going to hate you, and one-third don't care.

So you purchase the suite anyway, and the first toolset you'll tackle is the testing suite. Now you need a body who understands the testing tool: how to write the scripts that get populated into the tool to run the tests, and those scripts are constantly changing based on new functionality, modifications, deficiencies, and enhancements against the applications under development. If you don't have a good scriptwriter for testing, you are already in trouble.
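
To give a sense of what that scriptwriting looks like, here is a minimal sketch of the kind of test script that has to be written and then constantly revised as the application changes. I've used Python's standard unittest module purely for illustration; a commercial testing suite has its own script format, and the function under test here is a hypothetical placeholder for application logic.

    # Minimal illustrative test script; a commercial testing suite would
    # use its own scripting format. calculate_total() stands in for
    # hypothetical application logic under test.
    import unittest

    def calculate_total(line_items, tax_rate):
        # Stand-in for the application code being tested.
        subtotal = sum(qty * price for qty, price in line_items)
        return round(subtotal * (1 + tax_rate), 2)

    class CalculateTotalTests(unittest.TestCase):
        def test_single_item_with_tax(self):
            self.assertEqual(calculate_total([(2, 10.00)], 0.05), 21.00)

        def test_no_items_returns_zero(self):
            self.assertEqual(calculate_total([], 0.05), 0.00)

        # Every new enhancement or defect fix against the application
        # means cases like these have to be added or revised.

    if __name__ == "__main__":
        unittest.main()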

Now you are going to tackle the Project Management tool. Who really knows this skill? I've never met anyone who actually used a tool like this, understood it, and implemented it well. That's just my experience; I'm sure they are out there, but to me this is a rare expertise. Then you move on to the QA tool, the SCM tool, the Defect Tracking tool, and so on. There could be any combination of toolsets. My point is that you have to eat the tools one at a time. It's extremely hard to standardize any one tool across an organization, never mind a suite of toolsets. However, if you take a small bite, digest that, then take the next one and so on, you'll be more successful.

I feel scalability is the answer and actually makes more sense. If you can customize the suite to include only what you have expertise in, and build upon that knowledge slowly, I feel you can be more successful. For example, start with the expertise you have in house; let's say you have SCM and Defect Tracking knowledge.

Introduce those tools and let users in your organization get comfortable with that standardization. Then, once that is stable, introduce Project Management, then Testing, then Requirements Gathering, and so on. It doesn't have to be in this order, clearly; it can be in whatever order you want, but if you go slowly you won't break the bank and the environment will evolve. I always tell folks this is an evolution, not a revolution.

Just look at world events: when there is a revolution and the old government is out of office, typically no one knows how to introduce a new government successfully. Even when they do, they try to do the whole thing at once and it fails, because it usually took years to get into the situation you are in and it's going to take time to get out of it and change it. So an evolution is always the better approach, in my opinion: slow progress over time, gathering steam as one tool is introduced after another until it's all in place.

Should an organization take on this kind of exercise, you have to put competent people on it, because competent people can mentor others, and that makes the implementation more successful at the end of the day. Every person has strengths and weaknesses, so you're going to want to put folks on the job who have a proven track record of pushing change through. Not an easy thing to do.

You want folks running this who have experience with rejection, and you need leadership that is 100%, unfalteringly, driving the process. One exception to the process and the entire suite will fail.

If you were smart, while you were implementing the toolset you have strength in you would also have been learning the next toolset in the suite, so you are ready to move forward with that implementation the moment the first tool is completely in place. This way there is no pause before the next learning curve.

There is a lot of work that goes into these suites, and it's not just staffing a tool like this and finding the expertise to implement the various pieces. It also includes upgrading the tools, applying bug fixes, maintaining the OS, and supporting backups. The list really does go on.

One thing to look for when examining the SCM toolset in these suites is that it has a generic repository perspective. This really is the answer for the SCM piece of the puzzle. You need repositories, which is a glorified name for folders that will house any type of asset in the organization. You can have an "End-to-End Solution" for one development tool and one project, but most development environments contain multiple tools (Java, COBOL, PowerBuilder, etc.). Otherwise you'll have to introduce multiple toolsets, with twice the overhead and no way to measure across the enterprise. That's a waste of money in the final analysis.

You need to make sure these repositories can support any development platform an organization has in house or across the enterprise, depending on how your organization is structured. SCM tools in a suite of this nature that support only one platform or another are limited in scope and are not flexible enough to be called standards, and standardizing and centralizing is how you really save money.
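
As a rough illustration of what I mean by a generic repository, here is a small sketch where the repository is just a cataloged, versioned store of files that doesn't care whether the asset is Java, COBOL, PowerBuilder, or a Word document. It's Python and entirely hypothetical; real SCM products expose their own interfaces, and the class, methods, and file names here are made up.

    # Illustrative sketch of a "generic repository": it catalogs and
    # versions files of any type rather than being tied to one language.
    # The class, methods, and file names are hypothetical, not a real SCM API.
    import hashlib
    import shutil
    from pathlib import Path

    class GenericRepository:
        def __init__(self, root):
            self.root = Path(root)
            self.root.mkdir(parents=True, exist_ok=True)

        def check_in(self, source_file, label):
            # Store a copy of any asset (Java, COBOL, PowerBuilder, docs...)
            # under a content hash plus a human-readable label.
            source = Path(source_file)
            digest = hashlib.sha256(source.read_bytes()).hexdigest()[:12]
            target = self.root / label / (digest + "_" + source.name)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(source, target)
            return target

    # Example usage with hypothetical file names:
    # repo = GenericRepository("scm_repo")
    # repo.check_in("Invoice.java", "billing-app")
    # repo.check_in("PAYROLL.cbl", "payroll-app")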

I don't know of too many organizations that develop in only one application language and nothing else, and if you're serious about SCM you have to take into account every type of software asset and intellectual property you have in the enterprise.

I'm not saying there are no exclusive Java, Visual Studio, or PowerBuilder shops; they exist. What I am saying is that large organizations, like States for example, typically have a lot of legacy applications developed in many languages, and an SCM tool in your "End-to-End Solution" suite that covers only one language or another could never be a standard. Where would you house the rest of your source assets, intellectual property, and mission-critical assets: in a second SCM tool, or a third?

If that is the case, there goes the whole benefit of reporting, managing, requirements gathering, and testing on all of your assets in the enterprise, because you won't have everything in one repository but in multiple repositories. Then the "End-to-End Solution" idea is a moot point; the benefit is lost. Now you will need more expertise, more money, more staff, more time, and more resources, and you're wasting more of everything while losing the value of the "End-to-End Solution" anyway. There is no value in supporting just one language; the metrics will be quite limited, and remember, the purpose of these suites in the first place is to standardize, centralize, and realize cost savings across an entire organization, not just one project.

Listen to how this makes no sense: we spent all of the money and bought an "End-to-End Solution" just for the Java team, or the Visual Studio team, or the PowerBuilder team. What about all the other types of development you are performing? Don't you want to save money in those areas of application development as well? For the cost of these suites I would hope so; anything else would be a waste of time, money, and resources for any organization.

There is also the disaster recovery aspect: would you secure one set of assets and not another? For example, would I only care about my PowerBuilder source assets and not my Visual Studio assets? If I have a repository perspective that can only handle one type of asset, then after a catastrophic event I can only recover one set of assets and not the others. So when looking at these suites and considering disaster recovery, make sure the SCM toolset has a generic repository that can secure and control any type of source asset in your organization. It doesn't have to be great at it, but it does have to work.
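
As a simple illustration of that point, here is a sketch of a disaster recovery pass that sweeps every asset in the repository into one dated archive, regardless of what language or tool produced it. Again it's Python with hypothetical paths, not any particular product's backup feature.

    # Illustrative disaster-recovery sweep: archive everything in the
    # repository regardless of asset type. All paths are hypothetical.
    import datetime
    import tarfile
    from pathlib import Path

    def backup_repository(repo_root, backup_dir):
        # Archive every asset under repo_root (Java, COBOL, PowerBuilder,
        # documents...) into a single dated tarball.
        repo_root = Path(repo_root)
        backup_dir = Path(backup_dir)
        backup_dir.mkdir(parents=True, exist_ok=True)
        stamp = datetime.date.today().isoformat()
        archive = backup_dir / ("scm_repo_" + stamp + ".tar.gz")
        with tarfile.open(str(archive), "w:gz") as tar:
            tar.add(str(repo_root), arcname=repo_root.name)
        return archive

    # Example usage with hypothetical paths:
    # backup_repository("scm_repo", "offsite_backups")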

Summary
There are a lot of pitfalls to these "End-to-End Solutions." Make sure you have competent folks in the room who can ask the hard questions. If the managers are embarrassed by the hard questions at the dog and pony show, they are going to be even more embarrassed when they buy the suites and begin to lose money for the organization, the implementation fails, they lose their jobs or have to move on to distance themselves from the failure, and the organization has to absorb the cost.

You have to make sure you have the staff, the time, the money, and the expertise, and that you choose suites that can support every type of source asset in your enterprise. Make sure leadership convincingly communicates the change. Take small bites, digest them, and make the cultural shifts your organization needs before moving on to the next toolsets in the suite. It's an evolution, not a revolution, so think it through and don't get caught up in the sell. Buying on impulse will only cost you more money later.

My background is that I have been working in Software Configuration Management (SCM) for a few years now, and I am presently the SCM administrator at the State of NH, where we use a process-based SCM tool to manage application source code, intellectual property, and mission-critical assets. In this capacity I also secure these assets for disaster recovery purposes. I manage 400-plus applications housed in SCM and support 400 users of the product. The development tools we currently use are PowerBuilder PBV8, PBV11, and PBV12; Visual Studio 2003, 2005, 2008, 2010, and 2012; Eclipse, Juno, RAD, Mule, ColdFusion, Java, COBOL, and VB.

As the Software Configuration Manager, I provide the administration of the source code management tool. This includes the entire infrastructure of the development environment: developing life cycles; providing best practices, procedures, processes, and documentation; defect tracking; disaster recovery; maintaining build machines; and training all developers on proper source code management using the development tools in our environment.

Risks of End-to-End Solutions

  • Not enough staff to support the toolsets
  • Not enough expertise to understand the toolsets with great clarity (Testing, SCM, Defect Tracking, QA, Requirements Gathering, Project Management)
  • Loss of expertise to other employment opportunities
  • No cross-training of expertise or transfer of knowledge
  • No solid implementation team
  • The experts needed to analyze these tools are typically not invited to the dog and pony show
  • Not enough capital when problems arise

About the Author

Al Soucy is software configuration manager at the State of New Hampshire's Department of Information Technology (DoIT). In that role Al manages software configuration for dozens of PowerBuilder applications as well as applications written in Java, .NET, and COBOL (yes, COBOL). Al plays bass guitar, acoustic guitar, electric rhythm/lead guitar, drums, mandolin, and keyboard; he sings lead and backup vocals; and he has released 8 CDs.
