
Programming with Java's I/O Streams - Part 1


Most programs use data in one form or another - as input, output or both. The sources of input and output can vary from a local file to a network socket, a database, memory or another program. Even the type of data can vary, from objects and characters to multimedia and more.

The APIs Java provides for reading and writing streams of data have been part of the core Java Development Kit since version 1.0, but they're often overshadowed by the better known JavaBeans, JFC, RMI, JDBC and others. However, input and output streams are the backbone of the Java APIs, and understanding them is not only crucial but can also make programming with them a lot of fun.

In this article we'll cover the fundamentals of I/O streams by looking at the various stream classes and covering the concept of stream chaining. Next month we'll look at some example uses of I/O streams.

To bring data into a program, a Java program opens a stream to a data source - such as a file or remote socket - and reads the information serially. On the flip side a program can open a stream to a data source and write to it in serial fashion. Whether you're reading from a file or from a socket, the concept of serially reading from, and writing to, different data sources is the same. For that very reason, once you understand the top-level classes (java.io.Reader, java.io.Writer), the remaining classes are a breeze to work with.

Character versus Byte Streams
Prior to JDK 1.1, the input and output classes (mostly found in the java.io package) supported only 8-bit "byte" streams. In JDK 1.1 the concept of 16-bit Unicode "character" streams was introduced. While the byte streams were supported via the java.io.InputStream and java.io.OutputStream classes and their subclasses, character streams are implemented by the java.io.Reader and java.io.Writer classes and their subclasses.

Most of the functionality available for byte streams is also provided for character streams. The methods for character streams generally accept "char" parameters, while byte streams - you guessed it - work with "byte" data types. The names of the methods in both sets of classes are almost identical except for the suffix: character-stream classes end with the suffix Reader or Writer, and byte-stream classes end with the suffix InputStream or OutputStream. For example, to read files using character streams, you'd use the java.io.FileReader class; to read using byte streams you'd use java.io.FileInputStream.

Unless you're working with binary data such as image and sound files, you should use readers and writers to read and write information for the following three reasons:
1. They can handle any character in the Unicode character set (the byte streams are limited to ISO Latin-1 8-bit bytes).
2. Programs that use character streams are easier to internationalize because they're not dependent upon a specific character encoding.
3. Character streams use buffering techniques internally and are therefore potentially much more efficient than byte streams.

To bridge the gap between the byte and character-stream classes, Java provides the java.io.InputStreamReader and java.io.OutputStreamWriter classes. The only purpose of these classes is to convert between byte data and character data according to a specified (or the platform-default) encoding. For example, the static data member "in" in the "System" class is essentially a handle to the Standard Input (stdin) device. If you wanted to "wrap" this inside the java.io.BufferedReader class that works with character streams, you'd use the InputStreamReader class as follows:

BufferedReader in = new BufferedReader(new InputStreamReader(System.in));
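This bridge can be exercised without a console. The following self-contained sketch (the class name, helper method and UTF-8 choice are mine, not from the article) wraps an in-memory byte stream exactly the way you'd wrap System.in, decoding with an explicit charset rather than the platform default:

```java
import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

public class BridgeDemo {
    // Wrap any byte-based InputStream in a character-based BufferedReader.
    // An explicit charset makes the byte-to-character conversion predictable.
    static BufferedReader toReader(InputStream byteStream) {
        return new BufferedReader(
                new InputStreamReader(byteStream, StandardCharsets.UTF_8));
    }

    public static void main(String[] args) throws IOException {
        // An in-memory byte stream stands in for System.in so the
        // example is self-contained.
        InputStream bytes = new ByteArrayInputStream(
                "héllo\n".getBytes(StandardCharsets.UTF_8));
        BufferedReader in = toReader(bytes);
        System.out.println(in.readLine());  // prints: héllo
    }
}
```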

For JDK 1.0 Versions
If you can't use JDK 1.1, perhaps because you're developing applets for older browsers, simply use the byte-stream versions, which work just as well. Although I haven't discussed these versions much, from a developer's perspective they work almost identically to the character versions except, of course, that they accept byte data types rather than character data types.

The Various Stream Classes
Top-Level Classes: java.io.Reader and java.io.Writer
Reader and Writer are the abstract parent classes for the character-stream classes in the java.io package. As discussed above, Reader classes are used to read 16-bit character streams and Writer classes are used to write to them. The methods for reading from and writing to streams, found in these and their descendant classes (discussed in the next section), are:

int read()
int read(char cbuf[])
int read(char cbuf[], int offset, int length)
void write(int c)
void write(char cbuf[])
void write(char cbuf[], int offset, int length)

Listing 1 demonstrates how the read and write methods can be used. The program is similar to the MS-DOS type and Unix cat commands; that is, it displays the contents of a file. The following code fragment from Listing 1 opens the input and output streams:

FileReader fr = new FileReader(args[0]);
PrintWriter pw = new PrintWriter(System.out, true);

The program then reads the input file and displays its contents until it hits the end-of-file condition (-1), as shown here:

while ((read = fr.read(c)) != -1)
pw.write(c, 0, read);

I used the "(char cbuf[])" version of the read method because reading a single character at a time can be approximately five times slower than reading a chunk (an array) at a time.
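Listing 1 itself isn't reproduced here, but a minimal sketch of such a type/cat-style program built around those same fragments might look like this (the copy() helper is my own framing, not the article's):

```java
import java.io.FileReader;
import java.io.IOException;
import java.io.PrintWriter;
import java.io.Reader;
import java.io.Writer;

public class Cat {
    // Copy everything from a Reader to a Writer in array-sized chunks,
    // exactly as the read/write fragment above does.
    static void copy(Reader in, Writer out) throws IOException {
        char[] c = new char[4096];
        int read;
        while ((read = in.read(c)) != -1) {
            out.write(c, 0, read);
        }
        out.flush();
    }

    public static void main(String[] args) throws IOException {
        FileReader fr = new FileReader(args[0]);          // input: the named file
        PrintWriter pw = new PrintWriter(System.out, true); // output: stdout
        copy(fr, pw);
        fr.close();
    }
}
```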

Other notable methods in the top-level classes include skip(long), mark(int), reset(), available(), ready() and flush().

  • skip(), as the name implies, allows you to skip over characters.
  • mark() and reset() provide a bookmarking feature that allows you to read ahead in a stream to inspect the upcoming data but not necessarily process it. Not all streams support "marking". To determine whether a stream supports it, use the markSupported() method.
  • InputStream.available() tells you how many bytes can be read before the next read() will block. Reader.ready() is similar, except that it only indicates whether the stream is ready to be read, not how many characters are available.
  • The flush() method simply writes out any buffered characters (or bytes) to the destination (e.g., file, socket).
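The mark()/reset() bookmarking described above is easiest to see in a small read-ahead example. The peek() helper below is hypothetical (not from the article); it relies on BufferedReader, which does support marking:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;

public class MarkResetDemo {
    // Look at the next character without consuming it: mark the current
    // position, read ahead, then reset back to the mark.
    static char peek(Reader r) throws IOException {
        if (!r.markSupported()) {
            throw new IOException("stream does not support mark()");
        }
        r.mark(1);            // mark is valid for a 1-character read-ahead
        int c = r.read();     // read ahead to inspect the upcoming data
        r.reset();            // rewind to the marked position
        return (char) c;
    }

    public static void main(String[] args) throws IOException {
        Reader r = new BufferedReader(new StringReader("java"));
        System.out.println(peek(r));          // prints: j
        System.out.println((char) r.read());  // prints: j -- peek() consumed nothing
    }
}
```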

Specialized Descendant Stream Classes
Several specialized stream classes subclass the Reader and Writer classes to provide additional functionality. For example, BufferedReader provides not only buffered reading for efficiency but also convenience methods such as readLine() to read a line of input.
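As a quick illustration of what readLine() buys you over the raw read() methods, here is a small line-counting sketch (the countLines() helper is my own, not from the article):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;

public class LineCount {
    // readLine() returns one line at a time (without its terminator)
    // and null at end of stream -- no manual char-array handling needed.
    static int countLines(Reader source) throws IOException {
        BufferedReader br = new BufferedReader(source);
        int lines = 0;
        while (br.readLine() != null) {
            lines++;
        }
        return lines;
    }

    public static void main(String[] args) throws IOException {
        System.out.println(countLines(new StringReader("one\ntwo\nthree")));  // prints: 3
    }
}
```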

The class hierarchy shown in Listing 4 portrays a few of the specialized classes found in the java.io package. This hierarchy merely demonstrates how stream classes extend their parent classes (e.g., LineNumberReader) to add more specialized functionality. Tables 1, 2 and 3 provide a more comprehensive list of the various descendant classes found in java.io and other packages, along with a brief description of each class.

These descendant classes are divided into two categories: those that read from or write to "data sinks", and those that perform some sort of processing on the data (this distinction is merely to group the classes into two logical sections; you don't have to know one way or the other when using them).

Listings 2 and 3 don't contain the complete list for the table because I intentionally skipped the "byte" counterparts to the "char"-based classes and a few others (please refer to the JDK API reference guide for a complete list).

Stream Chaining
One of the best features of the I/O stream classes is that they're designed to work together via stream chaining.

Stream chaining is the concept of "connecting" several stream classes together to get the data in the form required. Each class performs a specific task on the data and forwards it to the next class in the chain. Stream chaining can be very handy. For example, in our own 100% Pure Java backup software, BackOnline, we chain several stream classes to compress, encrypt, transmit, receive and finally store the data in a remote file.

Figure 1 portrays the chaining of three classes to convert raw data into compressed and encrypted data, which is stored in a local file. The data is written to GZIPOutputStream, which compresses the input data and sends it to CryptOutputStream. CryptOutputStream encrypts the data prior to forwarding it to FileOutputStream, which writes it out to a file. The result is a file that contains encrypted and compressed data.

The source for the stream chaining shown in Figure 1 would look something like the code seen here:

FileOutputStream fos = new FileOutputStream("myfile.out");
CryptOutputStream cos = new CryptOutputStream(fos);
GZIPOutputStream gos = new GZIPOutputStream(cos);

or simply:

GZIPOutputStream gos = new GZIPOutputStream(
    new CryptOutputStream(new FileOutputStream("myfile.out")));
To write to chained streams, simply call the write() method on the outermost class, as shown here:

gos.write(data, 0, data.length);
Similarly, when closing chained streams, you need only close the outermost stream class, since the close() call automatically trickles through all the chained classes; in our example above we would simply call the close() method on the GZIPOutputStream class.
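CryptOutputStream is the authors' own class, so the round-trip sketch below chains only the standard GZIPOutputStream over an in-memory byte-array sink; the principle is the same as in Figure 1 - write to, and close, only the outermost stream:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class ChainDemo {
    // Compress through a two-class chain: GZIPOutputStream -> ByteArrayOutputStream.
    static byte[] compress(byte[] data) throws IOException {
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        GZIPOutputStream gos = new GZIPOutputStream(sink);
        gos.write(data, 0, data.length);  // write only to the outermost class
        gos.close();                      // close() trickles through the chain
        return sink.toByteArray();
    }

    // The reverse chain for reading: ByteArrayInputStream -> GZIPInputStream.
    static byte[] decompress(byte[] gz) throws IOException {
        GZIPInputStream gis = new GZIPInputStream(new ByteArrayInputStream(gz));
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[4096];
        int n;
        while ((n = gis.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] original = "stream chaining".getBytes(StandardCharsets.UTF_8);
        byte[] back = decompress(compress(original));
        System.out.println(new String(back, StandardCharsets.UTF_8));  // prints: stream chaining
    }
}
```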

In this article we reviewed the basic concepts of Java's I/O streams, which should give you a good understanding of how to program with them. Be sure to tune in next month, when we'll complete this article by looking at lots of source code to get a feel for the various uses of I/O streams, such as files, databases, sockets, archives and much more.

About the Author

Anil Hemrajani is the author of the book Agile Java Development with Spring, Hibernate and Eclipse. He is the founder of Isavix Corporation (now InScope Solutions, Inc.), a successful IT services company, and DeveloperHub.com (formerly isavix.net), an award-winning online developer community that grew to over 100,000 registered members. He has twenty years of experience in the Information Technology community, working with several Fortune 100 companies as well as smaller organizations. He has published numerous articles in well-known trade journals, presented around the world, and received or been nominated for several awards. Anil can be reached via his web site, VisualPatterns.com.

