The Two-Dimensional Legacy of GUIs

"There are striking similarities between cinema and GUI software"

Ted Nelson, inventor of, among other things, hypertext, once lamented that software development today is at the same evolutionary stage filmmaking was at 100 years ago. Back in the 1900s, when the technology of film production was in its earliest stages, the cameraman was the person in charge because he was the one who understood the technology and could make it function correctly. The audience's sheer fascination with the magic of film was enough to captivate them and hold their attention while the silent, blurred subjects grinned and gawked directly into the lens. Much has changed in the last hundred years, though, and movie directors are now the ones in charge of making a film. It is they who decide every camera angle, every pan and zoom, every focus switch, and every light level in order to create the finished product. Their goal is to captivate viewers, hold their attention, and suspend disbelief for 90 minutes or more. Cinema is a branch of the arts whose end product is a thing of aesthetic quality.

There are striking similarities between cinema and GUI software, the most obvious of which is the fact that both interact with their users through a screen. It was developers at the Xerox Palo Alto Research Center (PARC) in the 1970s who realized that the eye is the highest-bandwidth connection to the brain, and that interacting with the user in the most efficient and natural way required using the terminal as more than just a way of receiving and displaying formatted text output. As a result of this epiphany, the first WYSIWYG editor was created and the concept of the GUI was born. Scroll bars, push buttons, pop-up menus, check boxes, and all the other ways of having the user interact with information were conceived 30 years ago. Each time a new whiz-bang PC operating system is released, or a software package is created that will "revolutionize the way we work," I am always disappointed to peel back the hype and find that most of the effort occurred under the hood, leaving the front end largely unchanged save for a token rearranging of controls. Yesterday's dinner is re-served with fresh salad dressing and a quick 30 seconds in the microwave to make it smell fresh.

The problem is a deep one and can be traced back to the original Xerox developers who, although they were undeniably creative geniuses who shaped most of modern computing, also created the legacy that now holds us back. Their task back then was to create applications that supported laser printers; they needed a way for the user to see what the finished output would be before wasting toner and paper. WYSIWYG is more aptly WYSIWYP: "what you see is what you print." The scrollbar, for example, was designed to let the user cope with a viewing area smaller than the underlying page, not as a way of navigating through large lists of back-end data. All of our computing metaphors - such as copy and paste, applications on a desktop, folders with files, trash cans, and so forth - come from a paper-based view of the world. Ted Nelson writes that "like the fish that is unaware of water, computer users are blind to the 2D tyranny of paper." The problem occurs because everything we do in GUI software is predicated on the fact that we're using the computer as though it were a page of information.

What needs to be done then to break out of the goldfish bowl we're all currently swimming around in? We need to create applications that push the boundaries of software toward cinema, where the user experience is all that matters and our job isn't to be latter-day cameramen just arranging precanned widgets and controls designed to help some laser printer engineers.

Part of the problem lies in our psyche, who we are, and where we came from. In Hackers and Painters (www.paulgraham.com/hp.html), Paul Graham draws a parallel between the different people involved in creating a piece of art. One is the painter, who has the inspiration, the talent, and the creative ability to produce the finished piece; to do this he knows how to apply paints in the right way, the right combination, and the right technique to produce a work of art. The other is the engineer at the paint factory. He knows how chemicals can be mixed together to produce different products: paints that dry on the canvas and not in the tin, different colors, mediums, and so forth. The two disciplines are clearly different, and neither the artist nor the engineer would be able to ply the other's trade. When the paint factory hires new recruits, it approaches the chemical engineering department for graduates, and when the art gallery needs new material, it visits the art department. Software should be the same: for a GUI application, the people required are those who understand how to communicate with users through a computer screen and how to convey information. These people are artists, not engineers. They are people who understand how to direct a movie, not those whose only skill is loading film into a camera and switching it on and off.

When this is understood it can be used to good effect, for example in the computer gaming industry where the production process has more in common with a movie studio than a traditional software house. Likewise, the plethora of special effects that occur in films, not to mention the raw technology underpinning CGI cartoon movies, makes the boundary between the two disciplines even more blurred. Computer games companies hire, and put in charge of production, creative talent and artists, while the engineer's job is relegated to cutting the code, pointing the camera, and making the canvas come to life in the image of the design team's ideas and vision.

What we need to do for desktop software is learn from the games programmers, recognize that the screen can do more than rehash a 30-year-old paper presentation metaphor to the user, and see if we can inject more art, and less science, into the applications we produce.

More Stories By Joe Winchester

Joe Winchester, Editor-in-Chief of Java Developer's Journal, was formerly JDJ's longtime Desktop Technologies Editor and is a software developer working on development tools for IBM in Hursley, UK.



Most Recent Comments
David Marshall 12/11/06 09:56:09 PM EST

Reliable speech recognition and artificial intelligence would go a long way toward making interfaces more usable. How many times have I had to move around looking for files and open them with the correct app, when I wish I could just say "edit the web.xml file in my garbanzo web app folder" or "restart Apache"?
Why am I ever looking at XML files anyway? These things are meant to be read only by parsers.

This kind of stuff is so tedious, especially if you have RSI. It mystifies me to see that so many people are returning to essentially a 1970s operating system (Linux) or Unix on the Mac. Now they can issue commands like ls -l to see all their files, or cat xxx to look at a file in a terminal window. grep, awk, sed. This is for hackers and only hackers. The fact that these things still exist is not a testament to their power. No one has broken out of the box. I don't think MS Vista is a move forward, but it sounds like they at least entertain some of these ideas. Unfortunately, they seem to be taken with the idea that you can walk into your house and have your computer figure out what song to blast from your entertainment center. That much I can handle. But give me a computer that I can ask "What's 34.48 times 94?"