Microservices and the Revival of Software Design
By JP Morgenthal

Microservices embody both engineering and operational elements within their design.

Back in February 2017, Andrew Clay Shafer of Pivotal tweeted the following: “seriously tho, the whole software industry is stuck on deployment when we desperately need architecture and telemetry.” Intrigue in 140 characters. What I hear Andrew saying is, “we’re jumping to step 5 before we’ve successfully completed steps 1-4.”

I have to agree with Mr. Shafer that the IT industry seems to have a fascination with the part of the process that releases new capabilities into production. I personally hear the words “continuous delivery” (CD) at least three or four times a day. However, CD is the result of multiple iterations on a delivery pipeline to remove bottlenecks and to consistently optimize and improve. Enterprises that believe they can get this right on the first attempt are at very high risk of failure.

I believe a key driver enabling organizations to make this leap is the emergence of the microservice. Because microservices embody both engineering and operational elements within their design, it’s possible for businesses to focus solely on the operationalization of a microservice without requiring any engineering effort. For example, operations can take an existing application component, package it up in a container (e.g., Docker), and deploy ten instances of that container.
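To make that concrete, here is a minimal sketch of that “repackage and redeploy” step, assuming an existing Java component already built as app.jar; the base image, names, and tag are illustrative, and nothing here changes the component itself:

# Dockerfile: wrap the existing, unchanged component in a container image
FROM eclipse-temurin:17-jre
COPY app.jar /opt/app/app.jar
ENTRYPOINT ["java", "-jar", "/opt/app/app.jar"]

# Build the image once, then run ten identical instances of it
docker build -t legacy-component:1.0 .
for i in $(seq 1 10); do docker run -d --name legacy-component-$i legacy-component:1.0; done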

This now suffices in some businesses as a microservice. However, it’s not really a microservice; it’s just a repackaged application component that is now easier to manage and deploy. I don’t mean to belittle this effort. The ability to automate management and deployment of existing applications goes a long way toward reducing IT overhead. However, to Andrew’s point, it’s missing the architecture and telemetry.

Microservices: A Key Element of DevOps Strategy
The last few decades of software engineering have heavily emphasized design without equal consideration for the operationalization of the software being developed. As a visualization, perhaps you have seen the meme of the young girl with a house on fire in the background, captioned “it worked fine in development.” That’s because it’s easy to make a distributed application work in a properly engineered development environment; it’s when we release it into production (the wild?) that we begin to see its flaws.

The response to this trend has been the emergence of DevOps and a focus on adopting lean principles that facilitate rapid and continuous deployment of small numbers of changes. The rationale is sound: limit the number of variables changing at once, push to production through an automated, highly repeatable, and thoroughly tested process, and lower overall risk.

However, to Mr. Shafer’s point, there is a bit of extremism that has swung the pendulum too far toward operationalization before the art of design has been mastered. I believe the result can be summarized as moving garbage into production faster. I also believe microservices are an industry response to the need for more design in what we are putting into this pipeline.

All distributed computing archetypes recognize deployment architecture in some way, shape, or form, but microservices are really the first of these archetypes to address packaging and distribution as a first-class attribute of the design. Isolation, decentralized data management, and implicit tolerance for failure are central design goals for microservices. The key, however, is to design microservices in a way that amplifies the business value of the entity, fulfilling the second part of Mr. Shafer’s statement.
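As a rough illustration of those design goals, the sketch below (plain Java; the service names, URL, and fallback value are hypothetical) bounds a call to a downstream service with a timeout and falls back to locally owned data, so a failing dependency degrades the capability instead of breaking it:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.time.Duration;

public class PricingClient {
    // fail fast on connection attempts instead of letting requests queue up
    private final HttpClient http = HttpClient.newBuilder()
            .connectTimeout(Duration.ofSeconds(1))
            .build();

    public String currentPrice(String sku) {
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://pricing-service/prices/" + sku)) // hypothetical downstream service
                .timeout(Duration.ofMillis(500))                         // bound the wait on each call
                .build();
        try {
            return http.send(request, HttpResponse.BodyHandlers.ofString()).body();
        } catch (Exception e) {
            return cachedPrice(sku); // tolerate the failure: degrade to locally owned data
        }
    }

    private String cachedPrice(String sku) {
        // stand-in for this service's own datastore (decentralized data management)
        return "9.99";
    }
}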

In the past we have taught architects to think in terms of component design. But components are part of something larger and have no value outside of that larger entity. For example, a gear is a component of an engine. The engine needs the gear to operate, but the gear has little value outside of the engine. We must now teach architects to think in terms of designing services, which have innate value without being part of something larger but can still participate in a larger context, increasing the value of the whole. Hence, a service can be part of a larger application, or it can simply deliver on a single goal.

Microservices, then, are an integral element of any DevOps strategy. The microservice is a common model that application development and operations can share, and it clearly establishes deployment boundaries and pathways. For engineering, the microservice represents a bounded set of business logic and data supporting a business capability. For operations, it represents the unit of deployment for that business capability.
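A minimal sketch of how one deployable unit can serve both views, using only the JDK and illustrative names: a business endpoint backed by data the service owns (the engineering view) and a health endpoint exposing basic telemetry (the operations view):

import com.sun.net.httpserver.HttpExchange;
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class InventoryService {
    // data owned by this service alone, supporting a single business capability
    private static final Map<String, Integer> stock = new ConcurrentHashMap<>(Map.of("sku-1", 12));

    public static void main(String[] args) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
        // engineering view: the bounded business capability, e.g. GET /stock/sku-1
        server.createContext("/stock", ex -> respond(ex, String.valueOf(stock.getOrDefault(lastSegment(ex), 0))));
        // operations view: telemetry for the deployed unit, e.g. GET /health
        server.createContext("/health", ex -> respond(ex, "{\"status\":\"UP\"}"));
        server.start();
    }

    private static String lastSegment(HttpExchange ex) {
        String path = ex.getRequestURI().getPath();
        return path.substring(path.lastIndexOf('/') + 1);
    }

    private static void respond(HttpExchange ex, String body) throws IOException {
        byte[] bytes = body.getBytes(StandardCharsets.UTF_8);
        ex.sendResponseHeaders(200, bytes.length);
        try (OutputStream out = ex.getResponseBody()) {
            out.write(bytes);
        }
    }
}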

The post Microservices and the Revival of Software Design appeared first on XebiaLabs.
