Building a Custom Monitor with Pentaho Kettle

The Monitis open API and the available collection of open source monitoring and management scripts offer a nice way to build monitoring solutions for your systems. Still, there are many cases when you need a specific monitor and do not have, or do not want to spend, much time on coding. That is why we present a very simple and easy way of building custom monitors with the Pentaho Data Integration suite.

Pentaho Data Integration (PDI), also known as Kettle, is a free, open source ETL (Extraction, Transformation and Loading) tool. Along with powerful data extract, transform and load capabilities, Kettle provides an intuitive and rich graphical design environment called Spoon. Spoon is a fast and easy way to build applications without writing code: its drag-and-drop interface lets you graphically construct transformations and jobs.

To get started with Kettle, we recommend the following resources: a tutorial that helps with installation and introduces Spoon, the PDI user guide, and a brief introduction to Kettle components.

In this article we present a very simple way of building a custom monitor using Spoon. Moreover, our goal today is to monitor business performance data, as opposed to the usual system or application monitoring. In fact, the monitored data can be any information extracted from your database that needs to be shared and/or monitored. We will build a monitor that, based on an SQL query, traces a test table Orders, randomly populated with data, by order status. The number of orders grouped by current status (In Process, On Hold, Shipped and Cancelled) will act as the metrics for our custom monitor.
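
Since the metrics are just grouped counts, the query the Table Input step would run can be sketched in plain SQL. Below is a minimal, self-contained illustration using Python's sqlite3; the column name `status` and the sample data are assumptions, as the article only names the Orders table:

```python
import sqlite3

# Build a small in-memory test table similar to the article's Orders table.
# The column name "status" is an assumption; the article only names the table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Orders (id INTEGER PRIMARY KEY, status TEXT)")
sample = ["In Process", "Shipped", "Shipped", "On Hold", "Cancelled", "Shipped"]
conn.executemany("INSERT INTO Orders (status) VALUES (?)", [(s,) for s in sample])

# The kind of query a Table Input step would run to produce the metrics.
rows = conn.execute(
    "SELECT status, COUNT(*) AS cnt FROM Orders GROUP BY status ORDER BY status"
).fetchall()
metrics = dict(rows)
print(metrics)  # -> {'Cancelled': 1, 'In Process': 1, 'On Hold': 1, 'Shipped': 3}
```

Each row of the result becomes one metric value for the custom monitor.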

To start, please have a look at the Monitis API documentation. To create a custom monitor, we need to implement the steps described below:

1. Authentication – using the Monitis API key and secret key (both available from your Monitis account under Tools -> API), we obtain an authentication token that will be used later for creating the monitor and posting data.

For that, a transformation was created using the steps listed below:

- a step that provides the API URL, API key, secret key and other request parameters for the API calls
- an HTTP request step that obtains the authentication token
- a Json Input step that parses the result of the authentication token request
- a step that selects the needed parameter (the token) to be used later

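
The authentication call this transformation performs can also be sketched outside of Spoon. The snippet below only assembles the request URL; the endpoint and the parameter names (`action=authToken`, `apikey`, `secretkey`) are assumptions to be checked against the Monitis API documentation:

```python
from urllib.parse import urlencode

# Hypothetical endpoint and parameter names -- verify against the Monitis
# API documentation; this only illustrates assembling the call.
API_URL = "http://www.monitis.com/api"

def build_auth_request(api_key, secret_key):
    """Build the URL of a GET request for an authentication token."""
    query = urlencode({
        "action": "authToken",   # assumed action name
        "apikey": api_key,
        "secretkey": secret_key,
    })
    return f"{API_URL}?{query}"

url = build_auth_request("MY_API_KEY", "MY_SECRET_KEY")
print(url)
# Parsing the JSON response to extract the token is the Json Input step's job.
```
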
After testing, we make small changes to convert the created transformation into a sub-transformation: we simply add Input and Output Specification steps as the start and end steps, and remove the API key and secret key from the parameters. This information will instead be provided by the main transformations as input to the authentication sub-transformation. We have, in fact, created a building block that can be reused in other transformations without any changes.

2. Creating the monitor

Here, Data Grid steps are used to provide the necessary input information:

- the API key and secret key (user data), as input for the authentication sub-transformation
- the monitor parameters
- the metric descriptions

A User Defined Java Expression step and a Group By step construct the parameter list for the create-monitor API call.

All the parameters are grouped by the Join Rows "Add Monitor Param" step, and the result serves as input for the Add Monitor HTTP Post request. A Write to Log step reports the transformation execution results; its Data field is the ID of the created monitor and will be used in the next transformation.
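
The parameter list assembled for the Add Monitor call can be sketched as follows. This only illustrates building the POST body; the endpoint, the action name and the parameter names are assumptions, and the authoritative values should be taken from the Monitis API documentation:

```python
from urllib.parse import urlencode

# Hypothetical endpoint -- verify against the Monitis API documentation.
API_URL = "http://www.monitis.com/customMonitorApi"

def build_add_monitor_request(auth_token, name, tag, result_params):
    """Assemble the POST body for an addMonitor-style API call."""
    params = {
        "action": "addMonitor",         # assumed action name
        "authToken": auth_token,        # token from the authentication step
        "name": name,                   # monitor name
        "tag": tag,                     # monitor group/tag
        "resultParams": result_params,  # metric descriptions (assumed format)
    }
    return API_URL, urlencode(params)

url, body = build_add_monitor_request(
    "TOKEN123", "Orders by status", "business",
    "inprocess:In Process:orders:2;onhold:On Hold:orders:2",
)
print(body)
```

The monitor ID returned by this call is what the Write to Log step surfaces in its Data field.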

3. Posting metric results to the custom monitor

As input here, along with the user data (the API and secret keys), we have the custom monitor ID returned by the create-monitor transformation, and a Table Input step that retrieves the necessary information from the database.

An HTTP Post step then executes the API call that posts the monitor data.
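
A rough sketch of the request this HTTP Post step sends; the parameter names, the results format and the endpoint are assumptions, so consult the Monitis API documentation for the authoritative call:

```python
import time
from urllib.parse import urlencode

# Hypothetical endpoint -- verify against the Monitis API documentation.
API_URL = "http://www.monitis.com/customMonitorApi"

def build_add_result_request(auth_token, monitor_id, metrics):
    """Assemble the POST body for an addResult-style call.

    `metrics` maps metric names to values, e.g. the grouped order counts."""
    results = ";".join(f"{k}:{v}" for k, v in metrics.items())  # assumed format
    params = {
        "action": "addResult",                 # assumed action name
        "authToken": auth_token,
        "monitorId": monitor_id,               # ID from the create-monitor step
        "checktime": int(time.time() * 1000),  # timestamp in ms, an assumption
        "results": results,
    }
    return API_URL, urlencode(params)

url, body = build_add_result_request("TOKEN123", 42, {"shipped": 3, "onhold": 1})
print(body)
```
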


4. Creating a job

The only thing left is to create a simple job that runs the transformation for posting the metric results.

After testing, you can use any scheduler to run the created job with Pentaho Kitchen, a standalone command-line tool for executing jobs.
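
A minimal sketch of wrapping the Kitchen invocation for a scheduler; the job file name is hypothetical, and `-file`/`-level` are the Kitchen command-line options for selecting the job and the log verbosity:

```python
import subprocess

# Hypothetical job file name; "-file=" points Kitchen at a .kjb job file
# and "-level=" sets the logging verbosity.
cmd = ["./kitchen.sh", "-file=post_metrics.kjb", "-level=Basic"]
print(" ".join(cmd))

# A scheduler (cron, Windows Task Scheduler, ...) would then run:
# subprocess.run(cmd, check=True)
```
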

And here we can see our custom monitor on the Monitis dashboard.

Using these simple transformations as a basis, you can create monitors by just changing the input parameters and the SQL query in the Table Input step that retrieves the metric data. Moreover, instead of a Table Input step, any other Input, Utility, Lookup or Scripting transformation step can serve as the source of the monitored data. That lets you access relational and NoSQL databases, log files, or data input of any format (CSV, JSON, XML, YAML, Excel, plain text, ...); base a monitor on script execution, Java classes or shell/process output, or on HTTP, REST and WSDL requests; or fetch data from a Google Analytics account. Feel free to explore the rich collection of Spoon transformation steps.



More Stories By Hovhannes Avoyan

Hovhannes Avoyan is the CEO of PicsArt, Inc.
