
Data Virtualization at Northern Trust: A Case Study

New integration approach accelerates revenue

Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility is the first book published on the topic of data virtualization. Along with an overview of data virtualization and its advantages, it presents ten case studies of organizations that have adopted data virtualization to significantly improve business decision making, decrease time-to-solution and reduce costs. This article describes data virtualization adoption at one of the enterprises profiled, Northern Trust.

Organization Background
Northern Trust is a leading provider of innovative fiduciary, investment management, private banking, wealth management and worldwide trust and custody services. Clients include corporations, institutions, families and individuals in more than 40 countries.

Based in Chicago, Illinois, the company has over 13,000 employees. At the end of 2010, the company had assets under custody of $4.1 trillion, assets under management of $643.6 billion and banking assets of $83.8 billion. Annual revenues in 2010 were almost $3.7 billion.

For this case study, we interviewed Leonard J. Hardy, Senior Vice President, Operations and Technology. Hardy is a member of Northern Trust's Rapid Solutions group, which provides a solution architecture and consulting facility to other Northern Trust application development groups.

Hardy also manages the company's Integration Competency Center (ICC), which helps other application areas figure out the best way to integrate data. The ICC supports several tools for this, including data virtualization, ETL and data integration best practices. Hardy participated in this case study project in his ICC role.

The Business Problem
An important line of business for Northern Trust is providing outsourced investment management operations for corporate customers, typically investment management firms and banks.

These institutional customers continue to provide front-office investment management functions for their end clients, but outsource all or some of the middle-office and back-office functions to Northern Trust. By contracting these functions out to Northern Trust, the institution does not have to invest in the resources, assets and skills necessary to provide the functions internally. In return, Northern Trust provides guaranteed levels of quality, service, resilience and value-to-cost criteria and management.

The Northern Trust line of business responsible for investment management outsourcing is called Investment Operations Outsourcing (IOO).

The business problem for IOO was that it had a large number of new institutional customers in the outsourcing pipeline and simply could not bring them on board fast enough.

As Hardy stated, "If we could get customers on board more quickly, we would realize several key benefits. First, revenue would increase. Second, we could go down the pipeline faster. We had customers already signed up for outsourcing who were waiting to be converted and we had to find a way to do that faster. That, in turn, would allow us to open up the pipeline and take advantage of huge new business opportunities in the marketplace."

The Technical Problem
Existing IOO client reporting capabilities had evolved within a traditional systems infrastructure and were both inefficient and inflexible, thereby extending the time it took to set up the end client reporting function for each customer.

For example, value-added client data was stored in multiple separate databases, including legacy, mainframe systems. In addition, some of the business logic to interpret the data structures and create reports was embedded in stored procedure code. The only way to access the data and interpret it correctly was to go through the stored procedures. There was no data warehouse or data mart that consolidated all this data and business logic for reporting purposes.

As a result, according to Hardy, "the old way of reporting required technology intervention for every single customer implementation. As you can imagine, each institution has its own formats, labels, graphics and fonts. A technology expert proficient in our development tools had to get involved and make changes to our reporting process for every customer. The time-to-market on that was very long. We needed the ability to bring on new customers without having to make technology changes for each one."

Two Options Evaluated
IOO considered two solution architectures to meet their business and technical challenges.

Physical Consolidation: One solution for IOO was to physically integrate the data using ETL and data warehousing technology. But that would have required pulling all of the business logic out of the stored procedures and consolidating all of the data into a single, sensibly structured database, a massive project that could take up to a year and a half.

Data Virtualization: Instead, IOO took a more innovative approach and decided to virtually integrate the data using the Composite Data Virtualization Platform as a virtual data warehouse. Placing Composite in the middle between the client reporting tool and the back-end data sources would enable IOO to quickly abstract the data into the data virtualization layer, reuse the existing business logic to access the data and generate all of the data needed for end client reporting. This approach required just seven months.

The Data Virtualization Solution - Architecture
Northern Trust's IOO data virtualization solution is a virtual data warehouse implemented with the Composite Data Virtualization Platform combined with a new client reporting front end. The major architecture components include:

Data sources: There are several physical data stores in the IOO environment.

  • Transaction data includes all of the detailed transactions for each end-client account: buys, sells, interest income, dividends, etc.
  • Valuation data includes, for each end-client account, how much it is worth today, how much it was worth yesterday, etc., based on the transaction data.
  • Enterprise asset data is a reference file that contains data about each asset in the client accounts, such as ticker symbol and investment category (equity, bond, money market, etc.). This data comes from a number of different third-party providers.
  • Performance data includes very complicated investment performance calculations that permit performance comparisons between accounts, such as one account against accounts of similar value or with similar investment strategies.
  • Transaction, valuation and enterprise asset data is stored in IBM DB2 and accessed via COBOL stored procedures.
  • Performance data is in Oracle and accessed via web services that call C code.

Data virtualization layer: The virtual data warehouse built using the Composite Data Virtualization Platform abstracts the data in the IOO data stores and creates a unified, virtual view that makes them appear as a single physical data store.

When the reporting platform issues a query to retrieve data from one or more data stores, Composite queries the data by calling the appropriate stored procedures and/or web services, combines the data (for example, associates all of the transactions that make up a valuation and the important performance data that applies to that valuation) and delivers it to the reporting platform.
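This federation pattern, in which a single view call fans out to stored procedures and web services and merges the results, can be sketched as follows. This is an illustrative sketch only: Composite's actual federation engine is proprietary, and every function and field name below (fetch_transactions, value_today, etc.) is a hypothetical stand-in, not a real API.

```python
# Hypothetical stand-ins for the IOO back-end access paths described above.

def fetch_transactions(account_id):
    """Stand-in for the COBOL stored procedure returning DB2 transaction rows."""
    return [
        {"account": account_id, "type": "buy", "amount": 1000.0},
        {"account": account_id, "type": "dividend", "amount": 25.0},
    ]

def fetch_valuation(account_id):
    """Stand-in for the stored procedure returning the account valuation."""
    return {"account": account_id, "value_today": 10250.0}

def fetch_performance(account_id):
    """Stand-in for the web service (wrapping C code) returning performance data."""
    return {"account": account_id, "ytd_return_pct": 4.2}

def unified_account_view(account_id):
    """Combine the three sources into the single view the reporting tool queries,
    the way the virtualization layer presents one logical data store."""
    valuation = fetch_valuation(account_id)
    return {
        **valuation,
        "transactions": fetch_transactions(account_id),
        "performance": fetch_performance(account_id),
    }

view = unified_account_view("ACCT-001")
```

The point of the pattern is that the consumer calls one function against one logical schema; which stored procedure or service supplies each piece is an internal detail of the virtualization layer.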

Consuming applications: The primary consuming application is the client reporting platform. This is the toolset that IOO's internal business/operations partners now use to build, format and produce the reports for IOO's customers and their clients.

Another benefit of the reporting tool is that it supports a data dictionary and business views of the data that directly map to views within the data virtualization layer. These business views, or building blocks, conform the data into reusable components that the business analysts can use to create custom reports.

A key point here is that the solution has enabled IOO to transfer the time-consuming report customization work that previously had to be done for each new institutional customer by very high-priced and very busy technical resources (applications programmers) to IOO's operations partners (business analysts).

As Hardy summarized it, "We have been successful in our goal of taking the technology out of the outsourcing equation. As long as we supply the right data through Composite, the reporting tool handles all formatting, graphing and other functionality that we used to have to hard code within the application program. One reason we had such a big bottleneck in implementing outsourcing customers was the fact that we didn't have enough programmers in our application development group to handle the customization requirements. Business analysts in our operations group can now easily process the building blocks necessary to customize reports."

By combining the capabilities of data virtualization and the new reporting tool, IOO has been able to navigate through and abstract a very complex set of layers of translation to connect the raw IOO data structures to business metadata and finished customer reports.

The Data Virtualization Solution - Best Practices
Hardy offered some best practices and lessons learned based on his experience working with IOO.

Start with a focused project: Hardy stated that a focused initial effort is a key to success. "Don't start with a project to make all enterprise data available via a service." To date, IOO has completed four institutional customer implementations with over ten more in progress or planned.

Centralize support for data virtualization and development: Giving ICC responsibility for the data virtualization technology provided economies of scale and enabled the company to accelerate up the best practices learning curve in effectively implementing the technology in business solutions.

Understand the data: According to Hardy, data virtualization is no different than any other development or physical data integration tool. "The design and architecture of the solution is where you need to spend time. The old carpenter's adage of 'measure twice and cut once' applies here. Don't just dive in and start coding views. Make sure you really understand the data. This is the key to success in an effort like this."

Educate and support the business: Hardy said it is also important to allocate time to consult with the business and make sure they understand the data. "In our case, they are the ones designing the reports but they don't necessarily have in-depth knowledge about the data in its raw form. So we need to help them understand the raw data and make sure that when the data gets to the end tool, people understand what they are looking at."

Manage business expectations: Although data virtualization is a capability that can get the job done faster, the organization still has to go through the same steps to implement the solution: analysis, design and architecture. "In our case, we could say it saved us a year and a half, but it did take seven months. So it will take time no matter how you do it."

Summary of Benefits
Hardy described the major benefits Northern Trust has achieved with the new IOO client reporting architecture based on data virtualization.

Faster time-to-market increases customer satisfaction and revenue: Data virtualization has cut the time it takes to implement a new outsourcing customer by 50%. Moving customers through the pipeline faster improves overall customer satisfaction and gives IOO the capacity to bring in even more business and revenue.

Single point of access simplifies work and cuts costs: The fact that the reporting tool, or any consuming application, has a single data access point regardless of the format, access method (SQL, web service, stored procedure, etc.) or data source significantly reduces application complexity as well as development and maintenance time and cost.

Reuse benefits grow with adoption: The ability to create reusable data services that can be shared by all applications also reduces costs.

Long run flexibility is improved: Northern Trust is embarking on a three-year project to replace all of its underlying data stores with modern data warehouses and data marts. By putting data virtualization in the middle, thereby decoupling the back-end data from the consuming applications, IOO will not have to change anything that has been defined in the reporting tool when it swaps out the back-end data stores. It will only need to change the views to consume the new data sources.
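The decoupling benefit described above can be sketched as follows: consumers are written against a stable view interface, and only the adapter behind the view changes when a back-end store is replaced. All class and method names here are illustrative assumptions, not Composite APIs.

```python
# Sketch of view-based decoupling, under the assumption that each back end
# exposes the same logical operation through a different physical path.

class LegacyStoredProcSource:
    """Stands in for the old path: a DB2/COBOL stored procedure call."""
    def get_valuation(self, account_id):
        return {"account": account_id, "value": 10250.0}

class WarehouseSource:
    """Stands in for the new path: a query against the future data warehouse."""
    def get_valuation(self, account_id):
        return {"account": account_id, "value": 10250.0}

class ValuationView:
    """The stable view the reporting tool consumes; it never changes when the
    back end is swapped, only the source adapter it is configured with does."""
    def __init__(self, source):
        self._source = source

    def valuation(self, account_id):
        return self._source.get_valuation(account_id)

# Swapping the back-end store requires no change in the consuming application:
old_view = ValuationView(LegacyStoredProcSource())
new_view = ValuationView(WarehouseSource())
```

Because the reporting tool only ever sees ValuationView, the three-year data store replacement touches the adapters inside the virtualization layer, not the reports defined on top of it.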

•   •   •

Editor's Note: Robert Eve is the co-author, along with Judith R. Davis, of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility, the first book published on the topic of data virtualization. The complete Northern Trust case study, along with nine other enterprise case studies, is available in the book.

More Stories By Robert Eve

Robert Eve is the EVP of Marketing at Composite Software and co-author of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility. Bob's experience includes executive-level roles at leading enterprise software companies such as Mercury Interactive, PeopleSoft, and Oracle. Bob holds a Master of Science from the Massachusetts Institute of Technology and a Bachelor of Science from the University of California at Berkeley.


