Parallel Worlds

In the last few years the focus in computing has gradually moved away from raw technology to settle on the total cost of ownership (TCO) of a solution. What makes up the TCO? That's hard to say, and everyone has a different answer, usually depending on what they find easiest to fix. Most people agree that the TCO isn't simply the sum of the prices of the parts that make up the system, although that is where it starts. A much greater cost arises from supporting the system in context.

A popular approach to reducing TCO has been to centralize the administration of individual systems and/or the client desktop, yet that's only part of the answer. It's good to keep travel to a minimum, but what actually causes the administration to be needed? The answer, of course, is change, although not change on its own. Change in isolation would necessitate work only on the change itself. But we all know that making a change in one part of a system creates support needs throughout the system.

The typical computer system is often heading toward "entropy death," in which ordered simplicity has tended toward interconnected complexity. While a cure for the symptom may be central administration, the actual disease mandates avoidance of the complex network of dependencies in the first place. It's this that Java technology and XML start to address, by eliminating the automatic codependency of systems, software and data.

A New World
The need for much of the support and administration comes from the web of dependencies woven by the software in our computers. To bring back the simplicity, we need to cut the dependencies. Where are they? There are several categories:

  • Software to platform
  • Software to data
  • Software to software
  • Platform to platform

Cutting the cord of these dependencies isn't easy, but the new world of computing that's been developing over the last decade is finally coming to maturity and making it possible.

Let's first consider the computing model we've been living with. When computing was new, the choices were easy to make. I could pick any one from the limited range of computers, write software to run on it and create file formats to store the data in. Trouble was, the software and data would work only on that kind of computer, so when a different kind was used I had to use different software, or if I used different software on the same system I couldn't use the same data and had to learn a new user interface.

Many of the problems were solved by two standardization steps: everyone agreed to use the IBM PC and everyone used DOS and then Windows. A degree of simplicity came back. As time went on, though, it became clear that there was still plenty of scope for complexity to creep in. In particular, agreeing on the platform didn't break the platform dependency of the software; it just meant it was all codependent. And when an update came along, everything broke! In addition, there was no standardization beyond the power of monopoly in the world of the data. Just as the software depended on a particular level of the platform, so the data related to a particular level of a particular brand of software. A complex web of dependency was woven, in which a change at any point led to instability and perhaps failure in the whole web.

The greatest enemy of computing is the creation of unintentional codependencies. As computer solutions are built, they involve relationships among software, hardware, platforms, development tools, and so on. Each is connected to every other by unseen connecting threads of codependency. Over time, the cost of owning any solution is proportional to the number of dependencies among the parts. But by the unintentional creation of many codependencies, the cost rises in an exponential rather than a linear way. The result is that the addition of further codependent elements increases the lifetime cost disproportionately. The point at which this begins to apply is the race point, and the condition beyond the race point is termed entropy death (see Figure 1). The inevitability of entropy death is set well before the race point by the act of choosing a system philosophy prone to codependency, the unwitting reliance of one part of a system on another, possibly mediated by some other element. The most common unwitting codependency is between software and the operating system it predicates.

This isn't to say that all codependencies can or should be avoided; some are inevitable. But in modern system specification and design they should be identified and justified in the same way as any other cost driver, taking into account not only the direct cost but also the lifetime cost inherited by connection to the dependency network. In general, software needs to be insulated from the environment in which it is used. In some situations use of native interfaces and binaries is unavoidable, but in these cases a platform-neutral "wrapper" around the native code is almost always valuable.
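The wrapper idea above can be sketched in a few lines of Java. This is a minimal illustration with invented names (SystemInfo, PortableSystemInfo are not from any real library): the rest of the solution codes only against the interface, so a native-backed implementation could later be swapped in behind the factory without any caller changing.

```java
// A minimal sketch of a platform-neutral wrapper. All names here are
// illustrative; the point is the shape, not the specific API.
public class WrapperSketch {

    // The only type the rest of the solution is allowed to see.
    interface SystemInfo {
        String osName();
    }

    // Pure-Java implementation: works everywhere the VM runs.
    static class PortableSystemInfo implements SystemInfo {
        public String osName() {
            return System.getProperty("os.name", "unknown");
        }
    }

    // Factory: the single point where a native-backed implementation
    // could instead be selected; callers never know which they received.
    static SystemInfo systemInfo() {
        return new PortableSystemInfo();
    }

    public static void main(String[] args) {
        System.out.println("Running on: " + systemInfo().osName());
    }
}
```

If native code ever becomes unavoidable, only the factory and one new implementation class change; the dependency stops there instead of rippling through the system.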

For example, consider the apocryphal case of a company that's used the macro language of an office suite as the basis for an office automation system.

One day, installation of another piece of software (unknowingly) updates one of the DLL files used by the suite. Result? One of the macros no longer works. They finally manage to get it working again, but the new version needs an updated version of the spreadsheet program. To get that they have to install a whole new level of the office suite. Now none of the macros work! They crawl through them, updating and fixing, but among the other things the fixes mandate is a new version of the database driver. Sadly, that needs the latest version of the database to work. So they upgrade the database and...well, you can guess the rest.

The New Foundation
The problem is caused by the transmission of the impact of change from subsystem to subsystem. The integrated computing foundation currently in use in most systems acts as a transmission medium, allowing change in one place to have an impact elsewhere.

How can we escape this trap? The key is to disconnect data from software from platforms, and to use standards-based choices so that version-to-version variations of the implementation have the smallest possible effect. By doing this we isolate changes from the transmission medium (the underlying platform) and prevent the impact of change from causing shockwaves of cost: we add the insulating layer mentioned earlier. What would be an optimal base of standards? The technology domains (see Figure 2) such a foundation (see Figure 3) would have to cover are:

  • Network protocols holding systems together and providing access
  • Delivery model that brings the solution to the audience that needs it
  • Programming model by which the solutions are created
  • Data structuring model for the information the solutions consume
  • Security model that allows the right audience access to the right data and solution

Much of the change in the computer industry over the last decade has involved the rediscovery of technology ideas and their establishment as standards within that model. The mappings are:

  • Network: TCP/IP, which has now become so widespread that it's no longer a topic of conversation.
  • Delivery: Web-model stateless client/server computing is the chosen delivery mechanism of a growing majority of business computer users. Rather than creating stateful clients that need costly maintenance and support, state is maintained at the server and "loaned" to the client.
  • Program: Only four years from release, Java technology has established itself as the standard for new software in a vast number of enterprises, not least because its JavaBeans architecture allows component-based development to be used in earnest. This isn't to say that all code needs to be written in the Java language; it's platform-neutral Java bytecode binary programs that win. Where these aren't feasible, at the very least a wrapper of Java technology to insulate the rest of the solution from native code is essential.
  • Data: Apparently new to the scene, XML is actually simplified SGML: 80% of the function for 20% of the complexity. Uptake throughout the computer industry has been huge, and it shows every sign of dominating data formatting in the future.
  • Security: By removing the need to send full key information "in the clear," public key-based security systems are already dominant, especially on the Web.
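The public-key principle in that last item can be seen in miniature with the JDK's own security API. In the sketch below the class name and message are invented for illustration, while "DSA" and "SHA1withDSA" are standard JDK algorithm identifiers; the private key never travels, yet anyone holding the public key can verify the signature.

```java
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.Signature;

// Illustrative sketch: sign with a private key that is never shared,
// verify with only the public key - no secret is sent "in the clear".
public class SignatureSketch {

    static boolean signAndVerify(String message) throws Exception {
        // Generate a key pair; only the public half is ever distributed.
        KeyPairGenerator gen = KeyPairGenerator.getInstance("DSA");
        gen.initialize(1024);
        KeyPair pair = gen.generateKeyPair();

        byte[] data = message.getBytes("UTF-8");

        // Sign with the private key...
        Signature signer = Signature.getInstance("SHA1withDSA");
        signer.initSign(pair.getPrivate());
        signer.update(data);
        byte[] sig = signer.sign();

        // ...verify with the public key alone.
        Signature verifier = Signature.getInstance("SHA1withDSA");
        verifier.initVerify(pair.getPublic());
        verifier.update(data);
        return verifier.verify(sig);
    }

    public static void main(String[] args) throws Exception {
        System.out.println("signature valid: " + signAndVerify("order #1234"));
    }
}
```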

From Technologies to Audiences
Alongside the agreement of the standards for the new world of computing has been a shift in the requirements for business solutions. In the past each solution would be built with only the requesting customer in mind.

The focus was on who was using the solution and where they were; hence terms like intranet, extranet and Internet. But progress has meant that the focus now is much more on modeling the data and defining the relationship of the user to the data. There has been an inversion in the approach to computing solutions, and the focus has switched from technologies and systems to information and audiences. Today, defining a new solution involves defining the relationship that an audience has with a body of information. In most cases a given body of information will have multiple audiences. Thus, for an online shop, when customers view information, only their particular data is accessible to them, and it's presented in a way to suit them; when customer service staff from the vendor view the same information, both the scope and the presentation differ. It's the transition to a solutions-and-audiences view that presents the greatest challenge in IT today. But users can proceed with confidence since all of the technologies in the "new" tradition are in fact mature and proven, so the transition is one of emphasis and strategy rather than a leap into unknown technology.
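The audience/information inversion described above can be sketched as code. Everything here is invented for illustration: a single body of order data, with the scope each audience sees decided by its relationship to the data rather than by building a separate system per audience.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative sketch: one body of information, multiple audiences.
// Scope (and, in a real system, presentation) follows the audience's
// relationship to the data, not the delivering technology.
public class AudienceSketch {

    enum Audience { CUSTOMER, SERVICE_STAFF }

    // One body of information: an order record.
    static final Map<String, String> ORDER = new LinkedHashMap<>();
    static {
        ORDER.put("item", "widget");
        ORDER.put("price", "9.99");
        ORDER.put("internalMargin", "42%");   // staff-only field
    }

    // Each audience gets a different scope onto the same data.
    static Map<String, String> view(Audience who) {
        Map<String, String> scoped = new LinkedHashMap<>(ORDER);
        if (who == Audience.CUSTOMER) {
            scoped.remove("internalMargin");  // customers never see internals
        }
        return scoped;
    }

    public static void main(String[] args) {
        System.out.println("customer view: " + view(Audience.CUSTOMER));
        System.out.println("staff view:    " + view(Audience.SERVICE_STAFF));
    }
}
```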

Parallel Worlds
The fact that all five of the foundation technologies are well understood also offers another benefit. For many users migration to the new world of e-business is something evolutionary rather than revolutionary. They can take the first steps without scrapping the investment they've already made. This new world is thus a parallel world rather than an alternate one.

So why, after all that, will Java technology and XML succeed? There are several reasons:

  • Proven technology: All five segments of the "new" foundation are based on the oldest, best-established ideas in the industry: TCP/IP, "dumb" terminals, virtual machines, markup languages, public key systems, all proven by decades of experience.

  • User driven: In the final analysis, the move to the new foundation is driven by the needs and desires of the marketplace rather than by the fiat of any one vendor or even of a consortium. As the cost of computer technology becomes more of a focus, and the costs driven by the upgrade arms race toward entropy death become more and more obvious, the demand for the new foundation grows greater and greater.
  • Vendor supported: All five technologies form the basis of almost all vendors' new solutions. Vendors choosing an alternate at any point increasingly discover the market questioning their choice and suspecting an attempt at proprietary lock-in.
  • Vendor neutral: All five technologies are beyond the control of any one vendor, so investments are protected from the risk of vendor lock-in as well as from the design choices of any one vendor starting an upgrade race. The only possible exceptions are Java technology and public key systems, and it's worth taking time to consider why neither is a problem in this context.
  • Platform neutral: All five technologies are independent of each other and of the platforms on which they're used. Thus they can all be implemented anywhere, insulating the systems that depend on them from codependency.

Java Technology: Public Property?
Can a technology apparently developed and controlled by a single vendor be considered open? It all depends on the attitudes and actions of the vendor and the time scales involved. In the case of the five domains in the new computing foundation, control has passed from the originator to "the mind of the market." For example, although the core ideas of public key systems are owned by one company, the industry has been willing to base almost all encryption and digital signatures on that technology because of a combination of the power of the technology and the attitude of the owners of the core patents.

In the same way, Java technology has become public property that is currently protected by the owner of the core technology. A move to standards body control would, however, be very positive. Standards bodies work better as museums than as factories; the parts of Java technology that are clearly established, such as the bytecode specification and the language, should be moved to the control of a suitable standards body as soon as practical; the third quarter of 2000, when the core patents of public key technology pass into the open, would be a good target. As long as the move toward full, externally controlled standardization continues apace, there probably won't be a problem. What's more, that ownership is nowhere near as firm as in the case of public key systems. If the whole industry chose to implement Java technology differently, there would be almost no recourse. But that doesn't happen, because any company seen to violate the value of Java technology is shunned by the market. The fact that the base of standardization in Java technology is actually the binary format of bytecodes rather than the language is of course a big help. Thus, if we feel safe basing key parts of the computing infrastructure on public key systems, there's all the more reason to feel safe using Java technology. However, we should feel concerned if ever the technology owner puts brand equity before technology. Asserting, for example, that Java will never be made an independently managed standard would be a gross betrayal of trust.

The key issue that should occupy us is not how to cut the cost of administration and support, but how to reduce toward elimination the amount of administration and support that's needed. To reflect this changed concept, and to move beyond the notions that sometimes turn consideration of TCO into TCP (total cost of purchase), we should perhaps use a modified term to express the issue at hand: lifetime cost of ownership.

The core assertion of this article is that the primary decision factor for new computer systems should be the cost of owning the system over its entire life, the LCO: software, network, and client and server hardware, complete with development, deployment, administration, management of the impact of change during the life cycle and migration to replacement systems at sundown. The core proposal of this article is that this factor be controlled by minimizing the network of codependent complexity that these various elements create. To achieve this, a change of system philosophy rather than an instant change of technology is proposed. By basing future developments on a firm foundation of standards, entropy death can be avoided. And this is the reason Java and XML technologies will succeed, cool though the technologies themselves may be!

More Stories By Simon Phipps

Simon Phipps, Sun's Chief Open Source Officer, is a technology futurist and a well-known computer industry insider. At various times he has programmed mainframes, Windows and the Web; he was involved in OSI standards in the '80s, in the earliest commercial collaborative conferencing software in the early '90s, in introducing Java and XML to IBM, and most recently in launching Sun's blogging site, blogs.sun.com. He lives in the UK, is based at Sun's Menlo Park campus in California and can be contacted via http://www.webmink.net.

