What ‘Software-Defined’ Really Means

In a recent Cortex, I bemoaned the fact that as buzzwords go, Digital Transformation is excessively vague. There is yet another buzzword of our times that is suffering the same fate: Software-Defined.

Rare though buzz-adjectives may be among the pantheon of buzz-nouns and the occasional buzz-verb, Software-Defined (SD) has become remarkably pervasive. In fact, it ties together many different, quite disparate concepts into what has become a vague mishmash.

It's time to bring some clarity into the big picture of SD - what it is, and perhaps even more importantly, what it is not.

The Many Uses of Software-Defined
The most concrete use of the SD adjective is perhaps in the phrase Software-Defined Networking (SDN). SDN separates network equipment's control plane (where routing instructions and other metadata go) from the data plane (where the data being routed go), and then shifts the entire control plane to centralized software.
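
To make that split concrete, here is a minimal Python sketch of the pattern SDN follows. Everything in it is invented for illustration - real controllers speak protocols such as OpenFlow - but the division of labor is the same: switches only forward according to tables they are handed, while a central controller computes those tables.

```python
# Illustrative sketch of SDN's control/data plane split (all names invented).

class Switch:
    """Data plane: forwards packets using a flow table it does not compute."""
    def __init__(self, name):
        self.name = name
        self.flow_table = {}  # destination -> output port, set by controller

    def install_flow(self, dst, out_port):
        self.flow_table[dst] = out_port

    def forward(self, packet):
        port = self.flow_table.get(packet["dst"])
        print(f"{self.name}: sending {packet['dst']} out port {port}")


class Controller:
    """Control plane: centrally computes routes and pushes them to switches."""
    def __init__(self, switches):
        self.switches = {sw.name: sw for sw in switches}

    def apply_routes(self, routes):
        # routes: {switch_name: {destination: output_port}}
        for name, table in routes.items():
            for dst, port in table.items():
                self.switches[name].install_flow(dst, port)


s1, s2 = Switch("s1"), Switch("s2")
Controller([s1, s2]).apply_routes({"s1": {"10.0.0.2": 2}, "s2": {"10.0.0.2": 1}})
s1.forward({"dst": "10.0.0.2"})  # s1: sending 10.0.0.2 out port 2
```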

The network, however, is only the beginning. We have SD infrastructure (SDI), SD data centers (SDDCs), SD wide-area networking (SD-WAN), and more. Each of these approaches follows the lead of SDN, shifting control of various pieces of hardware (or virtualized hardware) to centralized, software-based management and configuration applications.

SDI (which includes SDN), in fact, is at the core of cloud computing. Clearly, there would be no way to scale a cloud data center if people had to run from server to server making changes.

Furthermore, Network Functions Virtualization (NFV) from the telco world also falls under the SD banner. With NFV, telco service providers shift all control to software, so that the underlying hardware is entirely generic. No more dedicated switches, routers, or specialized telco gear - all the hardware consists of generic, white-label boxes.

Software-Defined: Beyond the Network
While the network-centric context of SD in corporate networks, cloud data centers, and telco infrastructure forms the home base of the SD movement, SDI is also an essential enabler of continuous integration and continuous delivery (CI/CD), core elements of DevOps.

In order to achieve the velocity that CI/CD promise, the ops part of the story must be SD. Instead of ops people managing servers individually, the DevOps team must be able to deploy and manage software automatically via centralized software control. In other words, the immutable infrastructure principle behind DevOps is nothing more than SDI.
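
As a sketch of what immutable means in practice - assuming a hypothetical deployment API, not any real tool - note that the deploy step below never modifies a running server; it builds a fresh artifact and replaces the fleet wholesale.

```python
# Hedged sketch of immutable infrastructure: deployments replace servers
# rather than patching them. build_image and deploy are hypothetical names.

def build_image(version):
    # In practice: bake a container or machine image from source plus config.
    return {"image": f"app:{version}"}

def deploy(image, instance_count=3):
    # Stand up a brand-new set of instances from the image; the old fleet
    # is torn down, never edited in place.
    return [{"id": i, "image": image["image"]} for i in range(instance_count)]

fleet = deploy(build_image("1.0"))

# To change anything - config, code, OS patches - rebuild and redeploy.
# Nobody logs into a running server to make the change by hand.
fleet = deploy(build_image("1.1"))
print(fleet)
```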

In fact, now that virtualization has matured, all the infrastructure from hypervisors down to bare metal is SD.

At the application level, however, the SD story gets more complicated.

Using software to automate the tasks involved in deploying software is nothing new. Developers have been using runbooks for years - scripts that tell various parts of the environment to execute a series of tasks in a particular sequence.
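
A runbook in this sense can be as simple as the following toy script; every step here is a stand-in, since real runbooks call out to shells, APIs, and configuration tools.

```python
# Toy runbook: imperative automation that executes tasks in a fixed order.
# Each step is a placeholder for a real operation (shell command, API call).

def stop_app_server():    print("stopping app server")
def backup_database():    print("backing up database")
def deploy_new_build():   print("deploying new build")
def start_app_server():   print("restarting app server")

RUNBOOK = [stop_app_server, backup_database, deploy_new_build, start_app_server]

for step in RUNBOOK:
    step()  # the script, not the platform, knows *how* to reach the goal
```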

As DevOps has matured, the notion of the mundane runbook has taken on new life, as DevOps vendors automate increasingly broad swaths of the software development lifecycle (SDLC) with ‘recipes’ or other scripting approaches.

As applications and the environments they run in get more complicated, however, the world of DevOps automation finds itself in a Catch-22: the automation scripts or recipes themselves become increasingly complex software applications in their own right, and thus must go through an SDLC of their own, with all the testing and governance that go along with it.

As a result, we're back to square one, manually creating, managing, deploying, and versioning software.

Does Software-Defined Mean Declarative?
To address this Catch-22, some DevOps tools take a declarative approach. Instead of scripting the environment step by step, the declarative approach lets the user describe the desired behavior; the tool then interprets that description and takes the actions necessary to implement it, out of sight of the user.
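
The heart of the declarative approach is a reconciliation loop like the sketch below - a drastic simplification of what tools in the Terraform or Kubernetes mold do, with invented data structures: the user declares the desired state, and the tool computes and applies the difference.

```python
# Minimal declarative reconciliation sketch (all structures illustrative):
# the user states *what* should exist; the tool works out *how*.

desired = {"web": 3, "worker": 2}   # declared by the user
current = {"web": 1, "db": 1}       # observed in the environment

def reconcile(desired, current):
    for svc, want in desired.items():
        have = current.get(svc, 0)
        if want > have:
            print(f"start {want - have} instance(s) of {svc}")
        elif want < have:
            print(f"stop {have - want} instance(s) of {svc}")
    for svc in current.keys() - desired.keys():
        print(f"tear down {svc}")   # present but not declared, so remove

reconcile(desired, current)
# start 2 instance(s) of web
# start 2 instance(s) of worker
# tear down db
```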

In fact, in many contexts, when most vendors say SD, they really mean that they take a declarative approach, separating configuration from the underlying implementation. There's more to SD behavior than simply following a declarative approach, however.

For example, HTML (like markup languages in general) is declarative. And while we could certainly hand-code a web page by pecking out HTML, we're far more likely to use a visual tool for that purpose.

When we build a web site using such a tool, we're essentially working with models. The model is a visual, configurable representation of the page that the tool can convert into HTML for browsers to render into the page itself for users to view.

In this example, therefore, we have three different ways of thinking about the page: as a visual model, independent of any particular technology implementation of the page; as the HTML markup for that page; and as the action of the browser itself, an application purpose-built to render HTML into visual pages.
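
A toy version of that pipeline, with an invented model structure, makes the three representations tangible: the dictionary is the technology-independent model, the generated string is the HTML markup, and the browser's rendering is the third step, outside the code.

```python
# Illustrative model-to-markup conversion: a platform-independent page
# model (a plain dict here) generated into platform-specific HTML.

page_model = {
    "title": "Home",
    "elements": [
        {"type": "heading", "text": "Welcome"},
        {"type": "paragraph", "text": "Hello, world."},
    ],
}

TAGS = {"heading": "h1", "paragraph": "p"}  # model concept -> HTML tag

def to_html(model):
    body = "\n".join(
        f"  <{TAGS[el['type']]}>{el['text']}</{TAGS[el['type']]}>"
        for el in model["elements"]
    )
    return (f"<html>\n<head><title>{model['title']}</title></head>\n"
            f"<body>\n{body}\n</body>\n</html>")

print(to_html(page_model))  # the markup a browser would render
```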

Architects and other shrewd readers will recognize the pattern above as being an instance of Model-Driven Architecture (MDA), or its common implementation, Model-Driven Development (MDD).

Does Software-Defined Mean Model-Driven?
MDA is an Object Management Group (OMG) standard for creating metamodels that represent platform-independent models (our visual model, above) and platform-specific models (the HTML markup in the example), as well as an abstracted approach for turning the former into the latter.

Models, especially visual ones, are in broad use today, but MDA and MDD's best days are behind them. The reason: they didn't deal as well with change as MDA's creators had hoped.

In the MDD world, a developer might build a (platform-independent) model of an application in a model-driven tool and then push a button and out would pop the (platform-specific) source code that represented the working application.

However, if developers wanted to subsequently make a change, they would either need to change the model and regenerate and redeploy all the code (an onerous and time-consuming task), or tweak the auto-generated code itself, thus making it inconsistent with the model.

Round-trip tooling that would take tweaked code and automatically update the model - the holy grail of MDD - has proven impractical.

If we combine some of the principles from MDD with the declarative approach, however, we finally see some light at the end of the tunnel. Instead of the code-generating approach of MDA, reminiscent of the CASE tools of yore, the platform-specific representation of a declarative model consists of a metadata representation of a configuration.

In practice, tools that take this approach create such metadata representations in JSON, XML, or a domain-specific language appropriate to the task at hand. Developers occasionally have reason to view such metadata, but rarely if ever have call to monkey with it directly.
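
For instance - a hypothetical configuration, serialized with Python's standard json module - the declarative model might persist as metadata like this; when a change happens in the model, the metadata is regenerated rather than hand-edited.

```python
# Sketch: the model's platform-specific representation is plain metadata.
import json

model = {"service": "checkout", "replicas": 3, "tls_enabled": True}

metadata = json.dumps(model, indent=2)   # what the tool stores and deploys
print(metadata)

# Users change the model (e.g., in a visual tool); the platform then
# regenerates the metadata - developers rarely touch it directly.
model["replicas"] = 5
print(json.dumps(model, indent=2))
```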

Instead, users - who need not be developers - simply make changes in the model, typically via direct interaction with icons or other visual elements, or by selecting appropriate configurations. The underlying platform takes care of the rest.

The Intellyx Take
The round-trip code-generation vision of MDD proved unworkable, but the visual model to declarative metadata representation to immutable deployment vision is in essence what SD is all about.

The secret to making this approach practical, and thus the key to understanding why SD approaches have become so prevalent, is the word immutable.

Once we get an SD approach right, we no longer have to touch the deployed technology whatsoever. Instead, to make a change, update the model and redeploy.

The most important takeaway from this Cortex: this core SD pattern is fully generalizable. It works with networks, data centers, DevOps-based deployments, and as I'll cover in part two, it's also at the core of the Low-Code/No-Code movement.

It's no wonder, therefore, that Software-Defined Everything (SDX) is rising to the top of the buzzword heap - but SDX is no mere buzzword. It describes the central technological principles behind Agile Digital Transformation.

Copyright © Intellyx LLC. Intellyx publishes the Agile Digital Transformation Roadmap poster, advises companies on their digital transformation initiatives, and helps vendors communicate their agility stories. As of the time of writing, none of the organizations mentioned in this article are Intellyx customers. Image credit: Tim Adams.

More Stories By Jason Bloomberg

Jason Bloomberg is a leading IT industry analyst, Forbes contributor, keynote speaker, and globally recognized expert on multiple disruptive trends in enterprise technology and digital transformation. He is ranked #5 on Onalytica’s list of top Digital Transformation influencers for 2018 and #15 on Jax’s list of top DevOps influencers for 2017, the only person to appear on both lists.

As founder and president of Agile Digital Transformation analyst firm Intellyx, he advises, writes, and speaks on a diverse set of topics, including digital transformation, artificial intelligence, cloud computing, devops, big data/analytics, cybersecurity, blockchain/bitcoin/cryptocurrency, no-code/low-code platforms and tools, organizational transformation, internet of things, enterprise architecture, SD-WAN/SDX, mainframes, hybrid IT, and legacy transformation, among other topics.

Mr. Bloomberg’s articles in Forbes are often viewed by more than 100,000 readers. During his career, he has published over 1,200 articles (over 200 for Forbes alone), spoken at over 400 conferences and webinars, and he has been quoted in the press and blogosphere over 2,000 times.

Mr. Bloomberg is the author or coauthor of four books: The Agile Architecture Revolution (Wiley, 2013), Service Orient or Be Doomed! How Service Orientation Will Change Your Business (Wiley, 2006), XML and Web Services Unleashed (SAMS Publishing, 2002), and Web Page Scripting Techniques (Hayden Books, 1996). His next book, Agile Digital Transformation, is due within the next year.

At SOA-focused industry analyst firm ZapThink from 2001 to 2013, Mr. Bloomberg created and delivered the Licensed ZapThink Architect (LZA) Service-Oriented Architecture (SOA) course and associated credential, certifying over 1,700 professionals worldwide. He is one of the original Managing Partners of ZapThink LLC, which was acquired by Dovel Technologies in 2011.

Prior to ZapThink, Mr. Bloomberg built a diverse background in eBusiness technology management and industry analysis, including serving as a senior analyst in IDC’s eBusiness Advisory group, as well as holding eBusiness management positions at USWeb/CKS (later marchFIRST) and WaveBend Solutions (now Hitachi Consulting), and several software and web development positions.
