Software Archeology: What Is It and Why Should Java Developers Care?

The Java language is very mature, and most new Java projects aren't started from scratch

The process of Software Archeology can save a significant amount of work or, in many cases, rework. So what challenges do developers face when asked to take on these kinds of projects?

  • What have I just inherited?
  • What pieces should be saved?
  • Where are the scary sections of the code?
  • What kind of development team created this?
  • Where are the performance spots I should worry about?
  • What's missing that will most likely cause me significant problems downstream in the development process?
The overall approach breaks down into a six-step process. By the time a team has finished reviewing what is there and what is not, this process can go a long way toward defining the go-forward development strategy. The six steps are:
  • Visualization: a visual representation of the application's design.
  • Design Violations: an understanding of the health of the object model.
  • Style Violations: an understanding of the state the code is currently in.
  • Business Logic Review: the ability to test the existing source.
  • Performance Review: where are the bottlenecks in the source code?
  • Documentation: does the code have adequate documentation for people to understand what they're working on?
Most developers regard these steps as YAP (Yet Another Process), but in reality many of them should already be part of a developer's daily routine, so adopting them shouldn't be overwhelming. The next question is: can these tasks be done by hand? From a purely technical point of view, the answer would have to be yes, but in today's world of shorter timelines and elevated user expectations, the time needed to do this by hand is unacceptable.

So if it really can't be done by hand, what tools do I need to get the job done? Let's break down the process step-by-step and look at the tools that could be used to complete the task. Some advanced IDEs exist that include all of these tools and there are open source-based tools that may be able to do some parts of the job.

Visualization is the first step to understanding what kind of code the developer will be working with. It always amazes me how many developers have never looked at a visualization of the code they've written. Many times key architecture issues can be discovered just by looking at an object diagram of the system. Things like relationships between objects and depth of inheritance can be a real eye-opener. The old adage is true: a picture can be worth 1,000 lines of code. When thinking about visualization in an object-oriented language like Java, UML diagrams seem to be widely used and understood. Being able to reverse-engineer the code into a class diagram is the first tool that's needed. Later in the process it will be important to be able to reverse-engineer methods into sequence or communication diagrams for a better understanding of extremely complex classes and methods.
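Reverse-engineering tools build those class diagrams from class metadata, and the same raw facts are reachable through Java reflection. As a minimal, purely illustrative sketch (the `Person` and `Address` classes are invented for the example, and no real tool works this simply), the following prints one class's superclass and field associations as text, which is essentially the skeleton of a UML class box:

```java
import java.lang.reflect.Field;

// Toy sketch: walk a class's metadata the way a reverse-engineering
// tool would, emitting its generalization (superclass) and its
// associations (field types) in plain text.
public class ClassSketch {
    static class Address { String city; }
    static class Person { String name; Address home; }

    static String describe(Class<?> c) {
        StringBuilder sb = new StringBuilder();
        sb.append(c.getSimpleName())
          .append(" extends ")
          .append(c.getSuperclass().getSimpleName())
          .append("\n");
        for (Field f : c.getDeclaredFields()) {
            // Each field is an association to another type.
            sb.append("  --> ").append(f.getType().getSimpleName())
              .append(" ").append(f.getName()).append("\n");
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.print(describe(Person.class));
    }
}
```

Even this crude text form surfaces the relationships (Person holds an Address) that a class diagram would draw as lines between boxes.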

Once visualization of the system is done and reviewed, the next step is reviewing the system from a design violation standpoint. This can be done by using static code metrics. Using metrics gives the developer or team a way to check the health of the object design. Basic system knowledge like lines of code (LOC) or the ever-important cyclomatic complexity (CC) can give a lot of information to the reviewer.
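Cyclomatic complexity is, roughly, one plus the number of decision points in a method. Real metrics tools compute it from the parsed syntax tree; as a hedged illustration of the idea only, this sketch approximates it with a crude keyword scan over a method body supplied as a string (the keyword list and the sample body are invented for the example):

```java
// Crude cyclomatic-complexity estimate: CC ~= 1 + number of branch
// points. A real tool parses the code; this substring scan is only
// meant to show what the metric counts.
public class CyclomaticSketch {
    static final String[] BRANCHES = {
        "if", "for", "while", "case", "catch", "&&", "||", "?"
    };

    static int estimate(String body) {
        int cc = 1; // a method with no branches has complexity 1
        for (String b : BRANCHES) {
            int idx = 0;
            while ((idx = body.indexOf(b, idx)) != -1) {
                cc++;
                idx += b.length();
            }
        }
        return cc;
    }

    public static void main(String[] args) {
        String body = "if (x > 0) { for (int i = 0; i < x; i++) total += i; }";
        // One "if" plus one "for" gives an estimated CC of 3.
        System.out.println("Estimated CC: " + estimate(body));
    }
}
```

A method scoring, say, above 10 on this metric is a reasonable candidate for the "trouble area" list discussed below.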

Many developers have no idea how big or small the application they're working on is, or where its most complex parts are located. Using a select number of metrics, developers can pinpoint "trouble" areas; these should be marked for further review, because those are normally the areas most likely to need modification. Methods flagged as overly complex can be analyzed further by generating sequence diagrams, which offer a condensed graphical representation and make the task of updating or changing those methods much easier for developers and management to understand. Another valuable metric is JUnit test coverage (JUC). Inherited code often arrives with low or non-existent JUnit test coverage, which should raise major concerns about making changes to the system: the biggest question becomes how to ensure that changes or fixes are correct and don't break other parts of the system. The information generated by metrics tools gives developers a better understanding of what has been inherited and the complications that come with it.

Style violations help complete the picture of the inherited code. Many developers argue that static code audits should be run first, and this is true from a new project perspective. However, when inheriting massive amounts of code, running metrics first usually gives more object health-based information. Once the health of the object design is determined and can point to various areas of the code that need significant work, the audits can further refine that knowledge.

Static code audits include all kinds of rule checks that look for code consistency, standards compliance, and bad practices. Audit tools like ours include 200+ audits and help in understanding the complexity of the application under review. Advanced audit tools include rules for finding things like god classes, god methods, feature envy, and shotgun surgery; these advanced audits actually use some of the metrics to give reviewers more information. Take god methods, for example: a god method is one that gets called from everywhere, meaning that from an object-design standpoint it has too much responsibility, so changing that one method could have a dramatic effect on the entire system. Feature envy is nearly the opposite: a method more interested in the data of another class than in its own, which should probably be refactored into the class it envies. When estimating the amount of time to give a particular enhancement, or determining what kind of code has been inherited, this kind of low-level understanding is worth a lot.
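As a toy illustration of how such an audit might work (the call graph, method names, and threshold here are invented, not taken from any real audit tool), this sketch flags methods invoked from many distinct callers, the symptom the audit associates with god methods:

```java
import java.util.*;

// Hypothetical god-method audit: given a call graph mapping each
// caller to the methods it calls, flag any method with many distinct
// inbound callers. Real tools derive the graph from the compiled code.
public class GodMethodAudit {
    static Set<String> flag(Map<String, List<String>> callGraph, int threshold) {
        Map<String, Integer> inbound = new HashMap<>();
        for (Map.Entry<String, List<String>> e : callGraph.entrySet()) {
            // Count each caller once per callee, even if it calls it twice.
            for (String callee : new HashSet<>(e.getValue())) {
                inbound.merge(callee, 1, Integer::sum);
            }
        }
        Set<String> flagged = new TreeSet<>();
        for (Map.Entry<String, Integer> e : inbound.entrySet()) {
            if (e.getValue() >= threshold) flagged.add(e.getKey());
        }
        return flagged;
    }

    public static void main(String[] args) {
        Map<String, List<String>> graph = new HashMap<>();
        graph.put("OrderService.place", List.of("Util.log"));
        graph.put("UserService.save", List.of("Util.log"));
        graph.put("ReportJob.run", List.of("Util.log", "Db.query"));
        // Util.log has three distinct callers, so it gets flagged.
        System.out.println(flag(graph, 3));
    }
}
```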

Business logic review focuses on the testability of an application. Using advanced metrics, the amount of test coverage available can be determined in a few minutes. Inheriting a large amount of code and finding that no unit tests exist for it will have a dramatic effect on estimates for enhancements, and may make the developers realize they have no way to verify that changes to the system are correct. The tools needed for testing business logic should include a code coverage product and an integrated unit-testing product like JUnit. Having one of the two is okay, but having both opens a lot of new testing possibilities. First, by running the unit tests with a code coverage tool, you can verify that the code you intend to test is actually exercised. Code coverage also helps when the advanced audit tools discussed above aren't available, since a good coverage tool will show all classes and methods included in a test run. An advanced audit like shotgun surgery will highlight a method that has a lot of dependencies, but using unit testing and code coverage together ensures that changes to such methods can be fully tested and verified. Code coverage also pays off in QA. Running the product's test scripts with coverage turned on tells the team two things: whether the test script is complete, and whether there is test coverage for all of the application's code. The good thing about this piece of Software Archeology is that it usually can only get better; by adding tests, the end result should be a better-running system.
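A common first move when no tests exist is a characterization test: pin down what the inherited code does today, not what you think it should do, so that later changes can be verified against that baseline. The sketch below shows the shape of such a test in plain, self-contained Java asserts; in practice it would be a JUnit test class, and the `discountedPrice` method and its rules are invented stand-ins for some inherited business logic:

```java
// Hypothetical characterization test for an inherited, undocumented
// pricing method. The assertions record current behavior as observed,
// forming a safety net before any refactoring begins.
public class DiscountCharacterization {
    // Stand-in for inherited business logic whose rules are undocumented.
    static double discountedPrice(double price, int quantity) {
        if (quantity >= 10) return price * 0.90;
        if (quantity >= 5)  return price * 0.95;
        return price;
    }

    static void assertClose(double actual, double expected) {
        if (Math.abs(actual - expected) > 1e-9)
            throw new AssertionError(actual + " != " + expected);
    }

    public static void main(String[] args) {
        // Record what the code does today, not what we think it should do.
        assertClose(discountedPrice(100.0, 1), 100.0);
        assertClose(discountedPrice(100.0, 5), 95.0);
        assertClose(discountedPrice(100.0, 10), 90.0);
        System.out.println("All characterization tests passed");
    }
}
```

Run under a coverage tool, a suite like this also reveals which branches of the inherited logic are still unexercised.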

A good profiler is key to the performance review. Using the tools and results from the business logic review, performance issues can be uncovered and fixed. A useful rule of thumb is that a small fraction of the code, often cited at around 5%, causes most performance issues, so knowing where the complex code lives makes ongoing maintenance faster and easier.
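Before reaching for a full profiler, a coarse timing harness can confirm whether a suspect method is actually hot. This is only a rough sketch (the workload method is invented, and `System.nanoTime` measurements like this are noisy compared to a real profiler's sampling):

```java
// Minimal timing harness: wrap a suspect call in nanoTime measurements
// to get a first, coarse signal before profiling properly.
public class HotspotCheck {
    // Invented stand-in for a method flagged as complex by the metrics.
    static long sumOfSquares(int n) {
        long total = 0;
        for (int i = 0; i < n; i++) total += (long) i * i;
        return total;
    }

    public static void main(String[] args) {
        long start = System.nanoTime();
        long result = sumOfSquares(1_000_000);
        long elapsedMicros = (System.nanoTime() - start) / 1_000;
        System.out.println("result=" + result + " took " + elapsedMicros + "us");
    }
}
```

If the crude numbers confirm a hotspot, a real profiler can then attribute the time to specific call paths.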

The last step is documentation. All of this work is valuable to the developer, reviewer, or team trying to understand the system, and it should be captured and used going forward. An automatic documentation generator saves time, reduces overhead, and helps keep the documentation up to date. This will make it easier for new members joining the team, or for the application to be handed to another team.

The ideas around Software Archeology are fairly straightforward; this article took an approach of inheriting a large amount of code and then being responsible for that code. Other expeditions into the code could produce useful design patterns, great algorithms to reuse, or major things to avoid. We all know that software is an asset so using Software Archeology can ensure we get the most out of that investment.

More Stories By Mike Rozlog

Mike Rozlog is with Embarcadero Technologies. In this role, he is focused on ensuring the family of Delphi developer products being created by Embarcadero meets the expectations of developers around the world. Much of his time is dedicated to discussing and explaining the technical and business aspects of Embarcadero’s products and services to analysts and other audiences worldwide. Mike was formerly with CodeGear, a developer tools group that was acquired by Embarcadero in 2008. Previously, he spent more than eight years working for Borland in a number of positions, including a primary role as Chief Technical Architect. A reputed author, Mike has been published numerous times. His latest collaboration is Mastering JBuilder from John Wiley & Sons, Inc.
