
Five Surefire Ways to Make Your Enterprise DevOps Initiative Fail Miserably

By Dalibor Siroky

2014 was the year that DevOps arrived in large enterprises, and we're seeing several large corporations moving to more agile software delivery and more frequent releases. We've helped many companies move toward DevOps practices using Plutora to provide necessary transparency to manage risk and track agile software development efforts, and over the past few years we've come to understand what makes a DevOps initiative succeed and what makes it fail.

In this post we focus on some of the factors that can contribute to a failed DevOps initiative - one that ends up either abandoned or recreating the same messy bureaucracy it was supposed to replace.

Failure #1: Fail to define what DevOps means to your organization

The term DevOps is not well defined, and you'd be hard pressed to get the same definition of "DevOps" from everyone you ask in your enterprise. Developers in your organization may equate DevOps with a specific approach to software builds and the use of popular tools such as Chef, Puppet, Jenkins, Git, and Docker. IT management might see DevOps as a continuation of existing processes with an emphasis on faster time to market and lightweight release procedures.

Without a common definition of the term you'll have teams arguing over what is DevOps and what is not. If your software releases still involve a change management tool such as BMC's Remedy, is it really DevOps? If it takes an hour to deploy a QA build, is it really DevOps? The reality of Enterprise DevOps is that every organization's answers to these questions will vary. Enterprise DevOps is a compromise between self-service, rapid-fire agility and the ability to manage the risks that accompany mission-critical builds.

Some of our customers are launching rockets and running economies; "DevOps" means something very different to these clients than it does to a startup.

Before you start introducing technology and process, make sure to define a baseline for your DevOps initiative:

  1. What is your definition of DevOps?
  2. What are you trying to accomplish? And, most importantly,
  3. Where do you draw the boundaries between DevOps and more structured approaches to IT service management?

Fail to do this and you'll see teams arguing over what DevOps "is."

Failure #2: Focus on tools and techniques, forget about people.

The mistake many enterprises make is to elevate technology as the primary driver of DevOps at the expense of the difficult yet necessary processes that ensure quality software that meets customer expectations. It isn't enough to hire a few release engineers and give them a bunch of VMs and permission to install Jenkins and Puppet. You can't just hire a bunch of "DevOps people," put them in a room, and step away expecting them to work magic and make everything more efficient.

Instead, what you need to ensure in any DevOps initiative is that you are taking human-driven processes into account. You need to align existing teams such as quality assurance and release management with DevOps initiatives to avoid the common mistake of failing to adapt an existing enterprise release management process to a new approach to software delivery and service management.

You can always create more environments by throwing servers at the problem, but it's unrealistic to expect your teams to scale overnight. Occasionally, your QA team needs to sleep.

We've seen a number of companies adopt DevOps and move to faster, more frequent releases driven by the needs of individual projects, only to realize that an increased cadence of software delivery can lead to QA and release management burnout. If you are introducing more automation to speed time to market, make sure you also think about the impact any DevOps initiative is going to have on people.
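To make that impact concrete, here's a back-of-the-envelope sketch of how release cadence multiplies directly into QA hours. Every number in it is hypothetical; plug in your own.

```python
# Hypothetical back-of-the-envelope check: does a faster release cadence
# still fit inside the QA hours the team can actually supply?

RELEASES_PER_MONTH = 12            # e.g. moving from monthly toward weekly releases
QA_HOURS_PER_RELEASE = 30          # regression, sign-off, environment checks
QA_TEAM_SIZE = 4
PRODUCTIVE_HOURS_PER_PERSON = 100  # per month, after meetings and support work

demand = RELEASES_PER_MONTH * QA_HOURS_PER_RELEASE
capacity = QA_TEAM_SIZE * PRODUCTIVE_HOURS_PER_PERSON

print(f"QA demand:   {demand} hours/month")
print(f"QA capacity: {capacity} hours/month")
if demand > capacity:
    print("Cadence exceeds QA capacity -- automate more tests or slow down.")
else:
    print(f"Headroom: {capacity - demand} hours/month")
```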

Failure #3: Ignore governance entirely.

A lot of the rhetoric in favor of DevOps is rooted in the idea that developers need to take a proactive approach to "route around" change-averse administrators. DevOps in the enterprise tends to emerge from one or two groups deciding to stage a revolution against an ineffective IT organization, and a number of us have participated in these transitions over the last decade. You work for a company with huge systems and intractable releases that take months, and, eventually, you just lose patience with the process and one or two teams decide that they are going to break the mold and move quickly.

The first DevOps teams were breaking the rules by design. They were independent teams "free to innovate" outside of centralized IT structures. This approach works in smaller startups, but it is a non-starter in most enterprises.

An enterprise without common standards for software architecture, release management, and environment management isn't an enterprise at all - it's an awful mess.

An organization with dozens of independent teams creating novel continuous deployment pipelines sounds good in a work of DevOps fiction, but it never works in practice. When we work with the largest companies in the industry, they want to enable teams to work faster, but they also understand that DevOps isn't about reducing the number of governance gates. On the contrary, if anything, DevOps enables more effective, more frequent governance gates if you use a tool like Plutora to shine a light on complex release orchestration challenges.
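As a sketch of what a lightweight, frequently applied governance gate can look like inside an automated pipeline (the function and field names below are illustrative assumptions, not any particular product's API):

```python
# Minimal sketch of a governance gate evaluated on every release candidate,
# rather than once per quarterly release. Names and thresholds are hypothetical.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ReleaseCandidate:
    version: str
    test_pass_rate: float           # 0.0 - 1.0, taken from the CI run
    open_critical_defects: int
    approvals: List[str] = field(default_factory=list)  # e.g. ["qa", "change_mgmt"]

REQUIRED_APPROVALS = {"qa", "change_mgmt"}

def gate(candidate: ReleaseCandidate) -> bool:
    """Return True only if the candidate may proceed to production."""
    checks = [
        candidate.test_pass_rate >= 0.98,
        candidate.open_critical_defects == 0,
        REQUIRED_APPROVALS.issubset(candidate.approvals),
    ]
    return all(checks)

rc = ReleaseCandidate("2.4.1", test_pass_rate=0.99,
                      open_critical_defects=0, approvals=["qa", "change_mgmt"])
print("Promote to production" if gate(rc) else "Blocked at governance gate")
```

Because the gate is codified and cheap to run, it can be applied to every release candidate instead of becoming a bottleneck at the end of a long cycle.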

Failure #4: Fail to account for risk.

More frequent releases, self-service provisioning of infrastructure, infrastructure automation, continuous delivery pipelines: all of these common factors of DevOps initiatives lead to faster time to market, but at the tail end of a release process the business risks remain unchanged. Changes to production-facing systems still require rigorous change management, and when multiple teams feel empowered to push to production every week (or every day) you still need some release management function tracking conflicts and risk.
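The kind of conflict that release management function watches for can be sketched in a few lines. The releases below are invented; a real implementation would read its data from whatever release calendar your organization keeps.

```python
# Illustrative sketch: flag releases from different teams that overlap in time
# and touch the same shared system. All data here is invented.
from datetime import date
from itertools import combinations

releases = [
    {"team": "payments",  "systems": {"api-gateway", "billing-db"},
     "start": date(2015, 3, 2), "end": date(2015, 3, 6)},
    {"team": "mobile",    "systems": {"api-gateway"},
     "start": date(2015, 3, 5), "end": date(2015, 3, 7)},
    {"team": "reporting", "systems": {"warehouse"},
     "start": date(2015, 3, 9), "end": date(2015, 3, 10)},
]

def conflicts(a, b):
    """Return the shared systems if two release windows overlap, else an empty set."""
    overlap_in_time = a["start"] <= b["end"] and b["start"] <= a["end"]
    return (a["systems"] & b["systems"]) if overlap_in_time else set()

for a, b in combinations(releases, 2):
    shared = conflicts(a, b)
    if shared:
        print(f"Conflict: {a['team']} and {b['team']} both touch {shared} "
              f"in overlapping release windows")
```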

Many companies dive head first into DevOps without a full understanding of how DevOps will affect the risks associated with software releases. When a company transitions from a slower, ITIL-focused process to a more agile, DevOps-focused reality, release managers are often expected to "wing it" toward the end of the release cycle. They are asked to put governance gates atop a fast-moving, constantly evolving process driven by development teams eager to release.

When organizations adopt DevOps they often lose the built-in "checks and balances" that came with ITIL. Software can be delivered faster, but the enterprise still requires governance gates.

With Plutora you can let application development teams move quickly, you can allow teams to conduct multiple simultaneous releases, and at the tail end of the process you can integrate with the existing ITIL tools that operations expects to have in place to track and manage risk. If you just throw developers at your production servers, you'll learn first-hand how reliable production is when no one factors risk into decisions about production.

Failure #5: Run DevOps without metrics.

DevOps teams start out eager to make large changes to an enterprise's infrastructure and release process, but they also tend to bite off a bit more than they can chew. The smart enterprise understands that no initiative can interfere with ongoing software development and release management, and it will manage the gradual transition to DevOps tools and techniques over several quarters. While your DevOps teams will want to reinvent your release process overnight, your release and IT managers should define metrics to evaluate whether DevOps initiatives are a success.

Enterprises expect to see hard data to back up staffing and infrastructure decisions. If you are invested in the success of a DevOps initiative make sure that you are collecting statistics that justify your investment.

If you stand up a DevOps team, ask them to define roles and responsibilities. Hold them accountable for bringing greater efficiency to the organization. Regularly check in with developers and system administrators who are not on the team and objectively assess the results. Keep track of release and environment metrics with Plutora for teams that are involved with DevOps and teams that are not, and use the data Plutora provides to make informed decisions about dialing particular initiatives from your DevOps teams up or down.
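A minimal sketch of that kind of comparison, assuming you can export release records as simple dictionaries (the field names are hypothetical, not an actual Plutora export format):

```python
# Sketch: compare deployment frequency and change failure rate between teams
# on and off the DevOps initiative. The records and field names are hypothetical.
from collections import defaultdict

releases = [
    {"team": "checkout (DevOps)", "deployed": True, "caused_incident": False},
    {"team": "checkout (DevOps)", "deployed": True, "caused_incident": False},
    {"team": "checkout (DevOps)", "deployed": True, "caused_incident": True},
    {"team": "ledger (legacy)",   "deployed": True, "caused_incident": False},
    {"team": "ledger (legacy)",   "deployed": True, "caused_incident": True},
]

by_team = defaultdict(list)
for r in releases:
    by_team[r["team"]].append(r)

for team, recs in by_team.items():
    deploys = sum(r["deployed"] for r in recs)
    failures = sum(r["caused_incident"] for r in recs)
    print(f"{team}: {deploys} deploys this period, "
          f"change failure rate {failures / deploys:.0%}")
```

Even a crude report like this gives release and IT managers hard numbers to decide whether an initiative deserves more investment or needs to be reined in.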

More Stories By Plutora Blog

Plutora provides Enterprise Release and Test Environment Management SaaS solutions aligning process, technology, and information to solve release orchestration challenges for the enterprise.

Plutora’s SaaS solution enables organizations to model release management and test environment management activities as a bridge between agile project teams and an enterprise’s ITSM initiatives. Using Plutora, you can orchestrate parallel releases from several independent DevOps groups all while giving your executives as well as change management specialists insight into overall risk.

Supporting the largest releases for the largest organizations throughout North America, EMEA, and Asia Pacific, Plutora provides proof that large companies can adopt DevOps while managing the risks that come with wider adoption of self-service and agile software development in the enterprise. Aligning process, technology, and information to solve increasingly complex release orchestration challenges, this Gartner “Cool Vendor in IT DevOps” upgrades enterprise release management from spreadsheets, meetings, and email to an integrated dashboard, giving release managers insight and control over large software releases.
