Software Archeology: What Is It and Why Should Java Developers Care?

The Java language is very mature, and most new Java projects don't start from scratch

The process of Software Archeology can save a significant amount of work, or in many cases rework. So what challenges do developers face when asked to take on these kinds of projects?

  • What have I just inherited?
  • What pieces should be saved?
  • Where are the scary sections of the code?
  • What kind of development team created this?
  • Where are the performance hot spots I should worry about?
  • What's missing that will most likely cause me significant problems downstream in the development process?
The overall approach is broken into a six-step process. By the time a team has finished and reviewed what is there and what is not, this process can go a long way toward defining the go-forward development strategy. The six steps are:
  • Visualization: a visual representation of the application's design.
  • Design Violations: an understanding of the health of the object model.
  • Style Violations: an understanding of the state the code is currently in.
  • Business Logic Review: the ability to test the existing source.
  • Performance Review: an understanding of where the bottlenecks are in the source code.
  • Documentation: an assessment of whether the code has adequate documentation for people to understand what they're working on.
Most developers regard these steps as YAP (Yet Another Process), but in reality many of them should already be part of a developer's daily routine, so the process shouldn't be overwhelming. The next question is: can these tasks be done by hand? From a purely technical point of view, the answer would have to be yes, but in today's world of shorter timelines and elevated user expectations, the time needed to do this by hand is unacceptable.

So if it really can't be done by hand, what tools are needed to get the job done? Let's break the process down step by step and look at the tools that could be used to complete each task. Some advanced IDEs include all of these tools, and there are open source tools that can handle parts of the job.

Visualization is the first step to understanding what kind of code the developer will be working with. It always amazes me how many developers have never looked at a visualization of the code they've written. Many times key architecture issues can be discovered just by looking at an object diagram of the system. Things like relationships between objects and depth of inheritance can be a real eye-opener. The old adage is true: a picture can be worth 1,000 lines of code. When thinking about visualization in an object-oriented language like Java, UML diagrams seem to be widely used and understood. Being able to reverse-engineer the code into a class diagram is the first tool that's needed. Later in the process it will be important to reverse-engineer methods into sequence or communication diagrams for a better understanding of extremely complex classes and methods.
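
To make the value of a diagram concrete, here's a small, entirely hypothetical fragment (the Payment, Order, and LineItem classes are invented for illustration). The inheritance depth and composition in it are easy to overlook while scanning source files, but they jump out immediately once the code is reverse-engineered into a UML class diagram.

```java
// Hypothetical classes illustrating the relationships a reverse-engineered
// class diagram makes obvious: inheritance depth and composition.

// A deep inheritance chain that is easy to miss in source, obvious in UML.
abstract class Payment { abstract void process(); }
abstract class CardPayment extends Payment { }
class CreditCardPayment extends CardPayment {
    @Override void process() { /* charge the card */ }
}

// Composition: an Order "has" LineItems and an associated Payment.
class LineItem {
    final String sku;
    final int quantity;
    LineItem(String sku, int quantity) { this.sku = sku; this.quantity = quantity; }
}

class Order {
    private final java.util.List<LineItem> items = new java.util.ArrayList<>();
    private Payment payment; // association drawn as an arrow in the diagram

    void add(LineItem item) { items.add(item); }
    void pay(Payment payment) { this.payment = payment; payment.process(); }
}
```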

Once the system has been visualized and reviewed, the next step is reviewing it from a design-violation standpoint. This can be done using static code metrics. Metrics give the developer or team a way to check the health of the object design. Basic measures like lines of code (LOC) and the ever-important cyclomatic complexity (CC) can give the reviewer a lot of information.
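
To make cyclomatic complexity concrete, here's a hedged sketch using an invented ShippingCalculator class. CC is typically counted as one plus the number of decision points in a method, which is exactly what a metrics tool automates across thousands of methods.

```java
// Hypothetical example: cyclomatic complexity (CC) counted by hand.
// CC = 1 + number of decision points (if / else if, &&, ||, case labels, loops).
public class ShippingCalculator {

    // Decision points: four if / else-if branches + one && = 5, so CC = 6.
    // Most metrics tools start flagging methods once CC climbs past roughly 10.
    public double shippingCost(double weightKg, boolean express, String country) {
        double cost = 5.0;
        if (weightKg > 20) {              // +1
            cost += 15.0;
        } else if (weightKg > 5) {        // +1
            cost += 7.5;
        }
        if (express && weightKg < 30) {   // +2 (the if and the &&)
            cost *= 1.8;
        }
        if (!"US".equals(country)) {      // +1
            cost += 10.0;
        }
        return cost;
    }
}
```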

Many developers have no idea how big or small the application they're working on is, or where its most complex parts are located. Using a select number of metrics, developers can pinpoint "trouble" areas; these should be marked for further review, because they are usually the areas that will be asked to change. Further analysis can also be done on methods flagged as overly complex by generating sequence diagrams. These diagrams offer a condensed graphical representation and make it much easier for developers and management to understand the task of updating or changing the methods. Another valuable metric is JUnit test coverage (JUC). In many cases inherited code arrives with few or no JUnit tests, which should raise major concerns about making changes to the system. The biggest concern becomes how to ensure that changes or fixes are correct and don't break other parts of the system. By using the information generated by the metrics tools, developers get a better understanding of what has been inherited and some of the complications around the product.
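
Where JUnit coverage turns out to be low or non-existent, one pragmatic first move is a characterization test: pin down what the inherited code does today so later changes can be verified against known behavior. The sketch below assumes JUnit 4 and reuses the hypothetical ShippingCalculator from the metrics example.

```java
import org.junit.Test;
import static org.junit.Assert.assertEquals;

// A sketch of a "characterization" test: it records what the inherited code
// does today, so later refactoring can be checked against known behavior.
public class ShippingCalculatorCharacterizationTest {

    @Test
    public void heavyExpressInternationalOrder_keepsCurrentBehavior() {
        ShippingCalculator calculator = new ShippingCalculator();
        // The expected value was captured by running the existing code once,
        // not derived from a specification.
        double cost = calculator.shippingCost(25.0, true, "DE");
        assertEquals(46.0, cost, 0.001);
    }
}
```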

Style violations help complete the picture of the inherited code. Many developers argue that static code audits should be run first, and that's true from a new-project perspective. However, when inheriting massive amounts of code, running metrics first usually yields more information about object health. Once the health of the object design is determined and points to the areas of the code that need significant work, the audits can further refine that knowledge.

Static code audits include all kinds of rule checks that look for code consistency, standards compliance, and bad practices. Audit tools like ours include 200+ audits and will help in understanding the complexity of the application under review. Advanced audit tools include rules for finding things like god classes, god methods, feature envy, and shotgun surgery. These advanced audits actually use some of the metrics to give reviewers more information. Take god methods, for example: a god method gets called from everywhere, meaning that from an object design standpoint it has too much responsibility, so changing that one method could have a dramatic effect on the entire system. Or look at feature envy: a method that works mostly with another class's data rather than its own, and that should probably be refactored into the class it envies. When estimating the time to give a particular enhancement, or determining what kind of code has been inherited, this kind of low-level understanding is worth a lot.
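
As a hedged illustration of what such an audit is pointing at, the hypothetical Account and TellerService classes below show feature envy and its usual fix: move the logic that's envious of another class's data into that class.

```java
// Hypothetical illustration of "feature envy" and its usual fix.

class Account {
    private final double balance;
    private final double overdraftLimit;

    Account(double balance, double overdraftLimit) {
        this.balance = balance;
        this.overdraftLimit = overdraftLimit;
    }

    double getBalance() { return balance; }
    double getOverdraftLimit() { return overdraftLimit; }

    // After refactoring: the logic lives with the data it uses.
    boolean canWithdraw(double amount) {
        return amount <= balance + overdraftLimit;
    }
}

class TellerService {
    // Before refactoring, an audit would flag this method as feature envy:
    // it only manipulates Account's data through getters.
    boolean canWithdrawEnvious(Account account, double amount) {
        return amount <= account.getBalance() + account.getOverdraftLimit();
    }

    // After refactoring, the teller simply delegates.
    boolean canWithdraw(Account account, double amount) {
        return account.canWithdraw(amount);
    }
}
```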

Business logic review focuses on the testability of an application. Using advanced metrics, the amount of testing in place can be determined in a few minutes. Inheriting a large amount of code and finding that no unit tests exist for it will have a dramatic effect on estimates for enhancements, or make developers realize they have no way to verify that changes to the system are correct. The tools needed for testing business logic should include a code coverage product and an integrated unit testing product such as JUnit. Having one of the two is okay, but having both opens a lot of new testing possibilities. First, by running the unit tests with a code coverage tool, the code being tested can be verified. Code coverage can also be used when the advanced audit tools discussed above aren't available, and a good code coverage tool will show all of the classes and methods included in a test run. An advanced audit like shotgun surgery will highlight a method that has a lot of dependencies, but using unit testing and code coverage together ensures that changes to these kinds of methods can be fully tested and verified. Another advantage of a code coverage tool shows up in QA, which can run the product test scripts with code coverage turned on. This tells them two things: whether the test script is complete and whether there's test coverage for all of the application's code. The good thing about this piece of Software Archeology is that it usually can only get better: by adding tests, the end result should be a better-running system.
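
Here's a minimal sketch of that pairing, assuming JUnit 4 and a coverage agent such as JaCoCo attached to the test run (the class and interface names are invented): the test exercises a dependency-heavy method through a stubbed collaborator, and the coverage report then shows exactly which branches the test reached.

```java
import org.junit.Test;
import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertTrue;

// Sketch only: OrderValidator and InventoryGateway are hypothetical names.
// Run the test class with a coverage agent attached (for example JaCoCo's
// Java agent) so the report shows which branches were actually exercised.
public class OrderValidatorTest {

    // Hypothetical collaborator the inherited code depends on.
    interface InventoryGateway {
        int unitsInStock(String sku);
    }

    // Hypothetical inherited class under test, simplified for the sketch.
    static class OrderValidator {
        private final InventoryGateway inventory;
        OrderValidator(InventoryGateway inventory) { this.inventory = inventory; }

        boolean canFulfill(String sku, int requested) {
            return requested > 0 && inventory.unitsInStock(sku) >= requested;
        }
    }

    @Test
    public void ordersWithinStockAreAccepted() {
        OrderValidator validator = new OrderValidator(sku -> 10); // stubbed dependency
        assertTrue(validator.canFulfill("ABC-1", 3));
        assertFalse(validator.canFulfill("ABC-1", 11));
    }
}
```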

A good profiler is key to the performance review. Using the tools and results from the business logic review, performance issues can be uncovered and fixed. A useful rule of thumb is that only around 5% of the code causes most performance issues, so having a handle on where the code is complex makes ongoing maintenance faster and easier.
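
A real profiler provides call trees and hot-spot breakdowns that hand-rolled timing can't, but as a rough sketch of the idea, the hypothetical harness below times a suspected hot spot so a fix can be confirmed with numbers rather than intuition.

```java
// Rough, hypothetical sketch: confirm a suspected hot spot with wall-clock
// timing before and after a change. A real profiler (sampling or
// instrumenting) is still the right tool for finding the 5% in the first place.
public class HotSpotTimer {

    public static void main(String[] args) {
        long start = System.nanoTime();
        String result = buildReport(10_000);
        long elapsedMs = (System.nanoTime() - start) / 1_000_000;
        System.out.println("buildReport took " + elapsedMs + " ms, length=" + result.length());
    }

    // Suspected hot spot: repeated String concatenation in a loop is O(n^2);
    // a StringBuilder would be the obvious fix once the cost is confirmed.
    static String buildReport(int lines) {
        String report = "";
        for (int i = 0; i < lines; i++) {
            report += "line " + i + "\n";
        }
        return report;
    }
}
```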

The last step is documentation. Doing all this work is valuable for the developer, reviewer, or team trying to understand the system, and it would be a shame if that work weren't captured and used going forward. An automatic documentation generator saves time, reduces overhead, and helps keep the documentation up-to-date. This makes it easier for new members joining the team, or for the application to be handed off to another team.
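
In Java the natural capture point is Javadoc, which the standard javadoc tool or any modern IDE can turn into browsable HTML. The fragment below is a small sketch, reusing the hypothetical ShippingCalculator, of the kind of comments worth adding as archeology findings accumulate.

```java
/**
 * Calculates shipping costs for the order subsystem.
 *
 * <p>Recovered behavior (Software Archeology notes): the pricing rules were
 * reverse-engineered from the inherited code and pinned down by
 * characterization tests; see {@link #shippingCost(double, boolean, String)}.</p>
 */
public class ShippingCalculator {

    /**
     * Returns the shipping cost for a single package.
     *
     * @param weightKg package weight in kilograms
     * @param express  whether express delivery was requested
     * @param country  ISO country code of the destination
     * @return the calculated cost in USD
     */
    public double shippingCost(double weightKg, boolean express, String country) {
        // ... existing pricing logic ...
        return 0.0; // placeholder body for the documentation sketch
    }
}
```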

The ideas around Software Archeology are fairly straightforward. This article took the approach of inheriting a large amount of code and then being responsible for it; other expeditions into the code could turn up useful design patterns, great algorithms to reuse, or major things to avoid. We all know that software is an asset, and using Software Archeology can ensure we get the most out of that investment.

More Stories By Mike Rozlog

Mike Rozlog is with Embarcadero Technologies. In this role, he is focused on ensuring the family of Delphi developer products being created by Embarcadero meets the expectations of developers around the world. Much of his time is dedicated to discussing and explaining the technical and business aspects of Embarcadero’s products and services to analysts and other audiences worldwide. Mike was formerly with CodeGear, a developer tools group that was acquired by Embarcadero in 2008. Previously, he spent more than eight years working for Borland in a number of positions, including a primary role as Chief Technical Architect. A reputed author, Mike has been published numerous times. His latest collaboration is Mastering JBuilder from John Wiley & Sons, Inc.
