

Java IoT: Article

Managing a Standardized Build Process Outside of the Eclipse IDE

Point-and-click solutions won't cut it

Building objects in the Eclipse IDE is simple - it's a point-and-click solution. However, as applications built on the Eclipse platform mature, the need for building outside of the IDE increases. This need can be driven by a development team adopting agile development techniques, where builds are executed based on a file "check-in" action in an SCM tool. It can also be driven by IT governance, where a scheduled and audited production build is required. Moving from builds managed inside the Eclipse platform to builds managed outside of it can be a big task in itself. Don't hesitate to make this jump. It's a jump you'll find you can't do without. The sooner you get out of your point-and-click build process, the sooner your application will begin to mature.

Defining the build process should not be taken lightly. Auditability and traceability of built objects are becoming increasingly important due to IT compliance mandates. This means that your builds must become more traceable than a point-and-click process. Don't make the mistake of addressing the issue of building outside of the Eclipse IDE long after the application has grown to an unmanageable size. Delaying the inevitable only results in a poorly managed, unplanned, ad hoc build process that isn't sustainable, can't meet IT compliance, and involves expensive hidden costs in maintenance and fixes.

There are three ways of addressing the build process outside the Eclipse IDE. The most common method is to manually develop and maintain Ant/XML scripts. These scripts use Ant Tasks from the Apache Foundation to act as a wrapper to the Java compiler. The second method is to write scripts that call Eclipse in what's called a "headless" mode. A script that executes the build in headless mode acts in the same way as the point-and-click process inside the Eclipse IDE, but does the build from a script. And finally, the preferred method is to use a commercial build tool that can automate the creation of the scripts. Commercial tools that minimize the use of manual scripts establish a more repeatable and traceable build workflow, the ultimate goal of any solid development process.

If you don't have the luxury of a commercial build tool that can create a solid reusable build framework, you must create a manual build process that's as standardized as possible. A "manual" build process refers to any scripted build process that has to be maintained manually. Even if you execute your manual scripts through a job scheduling build management tool, your builds are still manual because you must manually maintain the build logic contained in the scripts.

When writing the manual build process, your choices come down to writing Ant/XML scripts to perform the build or using the Eclipse headless mode option. Establishing a repeatable, traceable build requires that you concisely define how the build executes and with what source code. Headless mode removes that level of control and provides little more functionality than the point-and-click process inside Eclipse: when running a headless mode script, you're still relying on the Eclipse IDE to control the build. For this reason, defining the build process in Ant/XML is recommended if a commercial tool isn't available.

If you review any Ant/XML script, it may appear that the process of converting Java source into Java jars is complicated. This isn't necessarily true. Ant/XML files execute serially, top to bottom, so everything must be precisely coded in a particular order. That's why XML build scripts can be very large and difficult to debug. There are some suggested standards for writing XML scripts, but they're not always followed. At a minimum, XML build scripts should follow a basic flow with pre-processing and post-processing steps that are consistent for every XML script created, planned or unplanned.
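The basic flow described above can be sketched as a minimal Ant script skeleton. The target, property, and directory names here are illustrative, not prescriptive:

```xml
<project name="myapp" default="package" basedir=".">
    <!-- Pre-processing: properties defined once, up front -->
    <property name="src.dir"   value="src"/>
    <property name="build.dir" value="build"/>
    <property name="dist.dir"  value="dist"/>

    <target name="init">
        <mkdir dir="${build.dir}"/>
        <mkdir dir="${dist.dir}"/>
    </target>

    <!-- Compile step -->
    <target name="compile" depends="init">
        <javac srcdir="${src.dir}" destdir="${build.dir}"/>
    </target>

    <!-- Post-processing: package the built classes -->
    <target name="package" depends="compile">
        <jar destfile="${dist.dir}/myapp.jar" basedir="${build.dir}"/>
    </target>
</project>
```

Every script that follows this shape reads the same way, which is what makes the process auditable.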

Pre-Processing Steps
Pre-processing steps establish the environment in which the subsequent tasks will execute. The point of these steps is to get the source code and variables organized and do overall housekeeping before compilation starts. A common mistake is to repeat these housekeeping steps before each call to the Java compiler or each Ant Task. By organizing the pre-processing steps at the beginning of the XML script, the build process becomes clearer and easier to follow. It also reduces redundancy and results in an improved, more efficient process. These are the recommended pre-processing steps:

Setting the CLASSPATH
The CLASSPATH identifies which Java classes are going to be used to resolve inter-class dependencies. The Java compiler searches the CLASSPATH on a first-found basis and uses a file as soon as it's been located. The CLASSPATH can include jar and zip files as well as directories.

When setting the CLASSPATH it's important to make sure of two things. First, the CLASSPATH should be defined only once in your process. Second, only the jar files and class directories actually used should be referenced in the CLASSPATH. Don't reference unused jars: the Java compiler will do more work than needed, slowing down your build substantially. Specifying only the jars in use also makes for quick dependency identification. There's nothing worse than attempting to trace jar file dependencies only to find that a large number of the jar files aren't needed. This can add substantial time to debugging your builds.

The use of wild cards is always a topic of debate. Wild cards can eliminate typing in your script, but in the end may cause your build process to include more objects than needed. List each jar file in the CLASSPATH explicitly to prevent any vagueness. This is particularly critical when exposing your build details for IT compliance mandates. Wild cards aren't traceable.

Setting the CLASSPATH should be the first task in the Ant XML script.
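A sketch of an explicit CLASSPATH defined once near the top of the script and referenced by id everywhere else; the jar names are placeholders for your project's actual dependencies:

```xml
<!-- Declared once; every javac call references it via refid -->
<path id="build.classpath">
    <pathelement location="lib/log4j-1.2.17.jar"/>
    <pathelement location="lib/commons-lang-2.6.jar"/>
    <pathelement location="${build.dir}/classes"/>
</path>

<target name="compile" depends="init">
    <javac srcdir="src" destdir="${build.dir}/classes">
        <classpath refid="build.classpath"/>
    </javac>
</target>
```

Each jar is listed explicitly, with no wildcards, so the dependency list itself becomes part of the audit trail.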

Copying and Renaming Files
There are cases in which source code, jar and property files have to be copied from one location to another for the compiler to find the file or put it in the archive correctly. Do yourself a favor and minimize the use of copying and renaming of files. It makes it extremely difficult to trace the archive contents back to the original source. Copying files around also creates a more "magical" build process. IT compliance mandates want a clear view into your process. No "magic" is needed.

Instead of copying or renaming the files, put the files in the correct location from the start. Don't use your Ant/XML script to clean up a mistake in file organization. This may involve updating your project directory structure and making your Source Code Management tool more efficient. As an alternative to copying and renaming files, the use of the Ant Task "Zip" and its attributes, such as "dir" and "prefix" can handle getting source from one location and putting it in the archive at a different location.

This sample XML code from the Apache Ant Manual demonstrates using the Ant Task "Zip" to take one source location (htdocs/manual) and put it in another location (docs/user-guide):

<zip destfile="${dist}/manual.zip">
     <zipfileset dir="htdocs/manual" prefix="docs/user-guide"/>
     <zipgroupfileset dir="." includes="examples*.zip"/>
</zip>

Compile Step
The compile step is, of course, the heart of your process. It will become the largest section of XML script. The important point to remember when defining this portion of your script is the management of dependencies.

Dependency References
With Ant, you can explicitly define the dependencies between tasks. For example, the JAR task can be made dependent on the JAVAC task. Ant also lets you establish multiple task dependencies. Don't be seduced by this seemingly convenient feature. While it seems useful, it can be burdensome: when tracing the execution order of the various Ant tasks in an Ant/XML script, it's much easier to follow a dependency chain in which each task has only one dependency instead of several. For example:

Scenario 1
JAR Task depends on JAVAC Task
JAVAC Task depends on the COPY Task
COPY Task depends on the INITIALIZATION Task

Scenario 2
JAR Task depends on the JAVAC Task and COPY Task
JAVAC Task depends on the COPY Task and INITIALIZATION Task
COPY Task depends on the INITIALIZATION Task

As you can see, Scenario 2 contains redundant task dependencies. For example, the JAR Task's dependency on the COPY Task isn't needed, since the COPY Task is already referenced higher up in the dependency hierarchy, via the JAVAC Task.

There will be cases when you want to have multiple task dependencies as in the creation of a war file. In this case, multiple task dependencies may be needed to ensure that all of the jars are created before the war. But each jar should have just one task dependency, that being the JAVAC task.
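Scenario 1's single-dependency chain translates into Ant targets like the following; the target and directory names mirror the scenario above and are illustrative:

```xml
<target name="initialization">
    <mkdir dir="${build.dir}"/>
</target>

<target name="copy" depends="initialization">
    <copy todir="${build.dir}/resources">
        <fileset dir="resources"/>
    </copy>
</target>

<target name="javac" depends="copy">
    <javac srcdir="src" destdir="${build.dir}/classes"/>
</target>

<!-- One dependency only; copy and initialization run transitively -->
<target name="jar" depends="javac">
    <jar destfile="${dist.dir}/myapp.jar" basedir="${build.dir}/classes"/>
</target>
```

Reading the jar target, there is exactly one place to look to find what runs before it, and the rest of the chain unfolds one link at a time.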

Identifying Source Code
Finding your source can seem easy at first, but as applications get bigger there's a greater chance that wrong or obsolete code gets included. Using wildcards in the Ant/XML script is an easy way to minimize typing, but for wildcards to be effective, the source files have to be organized in a proper Java package directory structure.
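With source laid out in a directory tree that mirrors the packages, a wildcard include stays manageable; the package root shown here is an assumption for illustration:

```xml
<javac srcdir="src" destdir="${build.dir}/classes">
    <!-- The wildcard is safe only because the directory tree
         mirrors the package structure exactly -->
    <include name="com/example/**/*.java"/>
</javac>
```

If stray or obsolete files can live under the same tree, the wildcard will sweep them in, which is why the package structure has to be kept clean.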

The best way to manage source is to define an efficient package directory structure. So you must move beyond your unique needs and address package names at a more global level in your organization. It's best to make sure a corporate Java package structure is agreed on and used. As part of that structure, keep the package names simple. Really long package names can cause problems with the path-length limits of the Windows operating system: Java compiles on Windows have been known to stop working when the 254-character limit is exceeded. To make this problem even peskier, the script may work on one user's machine but break on another's. This is due to the build directory root name being added to the package names. For instance, one person may build in c:\mybuilds while another builds in d:\onlinedata\j2ee\development\code. The difference in directory name can make or break the build by pushing the path past the 254-character limit.

Another attribute that affects which source code is compiled is the excludes attribute of the JAVAC Ant Task. It's best to remove older, obsolete code from the Source Code Management tool and from the file system instead of using the excludes attribute. Most SCM tools allow renaming or removing an item without losing all of its history, and they allow comments that create a level of traceability on why a piece of code is no longer required. Having this information in the SCM tool makes for easier access than having it hidden in a comment in the Ant XML.

JAVAC -sourcepath
The native command-line Java compiler (javac.exe) has an interesting flag called -sourcepath that takes a concatenated list of directories in which to find source code. It works on a first-found basis: once a source file has been located, the directory search stops. There are two advantages to using this parameter. First, not all of the code has to be in the current build directory; source code can be found in multiple locations. The build process then only has to check out the changed code and can find the remaining code in a previous full checkout, which speeds the overall build by minimizing the files to be checked out. Second, if the JAVAC command is given a Java file as a parameter, it will use the -sourcepath directories to find any additional source referenced by the original file and compile that too. This allows just the changed source to be passed as parameters to JAVAC, and JAVAC will figure out the remaining dependencies for you.
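The same behavior is available in Ant through the javac task's sourcepath attribute. A sketch, with directory names that are assumptions for illustration:

```xml
<!-- Compile only the files under changed-src; any other sources they
     reference are resolved, first-found, from the previous full checkout -->
<javac srcdir="changed-src"
       destdir="${build.dir}/classes"
       sourcepath="changed-src:full-checkout/src"/>
```

Only the changed files need to be checked out fresh; the full-checkout directory serves as the read-only backstop for everything else.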

More Stories By Steve Taylor

Steve Taylor is an experienced senior developer, bringing 17 years of expertise with client/server and mainframe application development and system integration. Prior to founding Catalyst Systems Corporation, he served as a lead technical consultant responsible for the successful implementation of applications into the production environment. Steve received his BS in computer science/mathematics from the University of Illinois-CU.
