
Data Virtualization at Northern Trust: A Case Study

New integration approach accelerates revenue

Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility is the first book published on the topic of data virtualization. Along with an overview of data virtualization and its advantages, it presents ten case studies of organizations that have adopted data virtualization to significantly improve business decision making, decrease time-to-solution and reduce costs. This article describes data virtualization adoption at one of the enterprises profiled, Northern Trust.

Organization Background
Northern Trust is a leading provider of innovative fiduciary, investment management, private banking, wealth management and worldwide trust and custody services. Clients include corporations, institutions, families and individuals in more than 40 countries.

Based in Chicago, Illinois, the company has over 13,000 employees. At the end of 2010, the company had assets under custody of $4.1 trillion, assets under management of $643.6 billion and banking assets of $83.8 billion. Annual revenues in 2010 were almost $3.7 billion.

For this case study, we interviewed Leonard J. Hardy, Senior Vice President, Operations and Technology. Hardy is a member of Northern Trust's Rapid Solutions group, which provides a solution architecture and consulting facility to other Northern Trust application development groups.

Hardy also manages the company's Integration Competency Center (ICC), which helps other application areas figure out the best way to integrate data. The ICC supports several tools for this, including data virtualization, ETL and data integration best practices. Hardy participated in this case study project in his ICC role.

The Business Problem
An important line of business for Northern Trust is providing outsourced investment management operations for corporate customers, typically investment management firms and banks.

These institutional customers continue to provide front-office investment management functions for their end clients, but outsource all or some of the middle-office and back-office functions to Northern Trust. By contracting these functions out to Northern Trust, the institution does not have to invest in the resources, assets and skills necessary to provide the functions internally. In return, Northern Trust provides guaranteed levels of quality, service, resilience and value-to-cost criteria and management.

The Northern Trust line of business responsible for investment management outsourcing is called Investment Operations Outsourcing (IOO).

IOO's business problem was that it had a large number of new institutional customers in the outsourcing pipeline and simply could not implement them fast enough.

As Hardy stated, "If we could get customers on board more quickly, we would realize several key benefits. First, revenue would increase. Second, we could go down the pipeline faster. We had customers already signed up for outsourcing who were waiting to be converted and we had to find a way to do that faster. That, in turn, would allow us to open up the pipeline and take advantage of huge new business opportunities in the marketplace."

The Technical Problem
Existing IOO client reporting capabilities had evolved within a traditional systems infrastructure and were both inefficient and inflexible, thereby extending the time it took to set up the end client reporting function for each customer.

For example, value-added client data was stored in multiple separate databases, including legacy, mainframe systems. In addition, some of the business logic to interpret the data structures and create reports was embedded in stored procedure code. The only way to access the data and interpret it correctly was to go through the stored procedures. There was no data warehouse or data mart that consolidated all this data and business logic for reporting purposes.

As a result, according to Hardy, "the old way of reporting required technology intervention for every single customer implementation. As you can imagine, each institution has its own formats, labels, graphics and fonts. A technology expert proficient in our development tools had to get involved and make changes to our reporting process for every customer. The time-to-market on that was very long. We needed the ability to bring on new customers without having to make technology changes for each one."

Two Options Evaluated
IOO considered two solution architectures to meet their business and technical challenges.

Physical Consolidation: One solution for IOO was to physically integrate the data using ETL and data warehousing technology. But that would have required pulling all of the business logic out of the stored procedures, consolidating all of the data into one database and making the structure coherent, a massive project that could take up to a year and a half.

Data Virtualization: Instead, IOO took a more innovative approach and decided to virtually integrate the data using the Composite Data Virtualization Platform as a virtual data warehouse. Placing Composite in the middle between the client reporting tool and the back-end data sources would enable IOO to quickly abstract the data into the data virtualization layer, reuse the existing business logic to access the data and generate all of the data needed for end client reporting. This approach required just seven months.

The Data Virtualization Solution - Architecture
Northern Trust's IOO data virtualization solution is a virtual data warehouse implemented with the Composite Data Virtualization Platform combined with a new client reporting front end. The major architecture components include:

Data sources: There are several physical data stores in the IOO environment.

  • Transaction data includes all of the detailed transactions for each end-client account: buys, sells, interest income, dividends, etc.
  • Valuation data includes, for each end-client account, how much it is worth today, how much it was worth yesterday, etc., based on the transaction data.
  • Enterprise asset data is a reference file that contains data about each asset in the client accounts, such as ticker symbol and investment category (equity, bond, money market, etc.). This data comes from a number of different third-party providers.
  • Performance data includes very complicated investment performance calculations that permit performance comparisons between accounts, such as one account against accounts of similar value or with similar investment strategies.
  • Transaction, valuation and enterprise asset data is stored in IBM DB2 and accessed via COBOL stored procedures.
  • Performance data is in Oracle and accessed via web services that call C code.

Data virtualization layer: The virtual data warehouse built using the Composite Data Virtualization Platform abstracts the data in the IOO data stores and creates a unified, virtual view that makes them appear as a single physical data store.

When the reporting platform issues a query to retrieve data from one or more data stores, Composite queries the data by calling the appropriate stored procedures and/or web services, combines the data (for example, associates all of the transactions that make up a valuation and the important performance data that applies to that valuation) and delivers it to the reporting platform.
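The federation pattern described above can be sketched in a few lines. This is a hypothetical illustration only: the function names, data shapes and join key are invented for clarity, and the actual Composite platform is configured declaratively rather than hand-coded like this.

```python
def fetch_valuations(account_id):
    """Stand-in for a DB2 stored-procedure call returning valuation rows."""
    return [{"account": account_id, "asof": "2010-12-31", "value": 1_250_000.0}]

def fetch_performance(account_id):
    """Stand-in for a web-service call (wrapping C code) returning performance rows."""
    return [{"account": account_id, "asof": "2010-12-31", "return_pct": 7.4}]

def virtual_view(account_id):
    """Join the two sources on (account, asof), as a virtual data warehouse would,
    so the consumer sees one unified result set instead of two systems."""
    perf_by_key = {(p["account"], p["asof"]): p for p in fetch_performance(account_id)}
    rows = []
    for v in fetch_valuations(account_id):
        p = perf_by_key.get((v["account"], v["asof"]), {})
        rows.append({**v, "return_pct": p.get("return_pct")})
    return rows

print(virtual_view("ACCT-001"))
```

The point of the sketch is the shape of the solution: each back-end keeps its native access method, and only the virtualization layer knows how to call and combine them.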

Consuming applications: The primary consuming application is the client reporting platform. This is the toolset that IOO's internal business/operations partners now use to build, format and produce the reports for IOO's customers and their clients.

Another benefit of the reporting tool is that it supports a data dictionary and business views of the data that directly map to views within the data virtualization layer. These business views, or building blocks, conform the data into reusable components that the business analysts can use to create custom reports.
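The "building blocks" idea can be illustrated with a minimal sketch: small, reusable view components that a business analyst composes into a custom report without new programming. All names and the sample data below are invented for illustration and do not reflect the actual IOO data dictionary.

```python
# Reusable building blocks: each conforms the data one step further.
def equity_only(rows):
    return [r for r in rows if r.get("category") == "equity"]

def by_currency(rows, ccy):
    return [r for r in rows if r.get("currency") == ccy]

def total_value(rows):
    return sum(r["value"] for r in rows)

holdings = [
    {"asset": "AAPL",   "category": "equity",       "currency": "USD", "value": 500.0},
    {"asset": "T-Bill", "category": "money market", "currency": "USD", "value": 300.0},
    {"asset": "BMW",    "category": "equity",       "currency": "EUR", "value": 200.0},
]

# A custom report composed from the same reusable blocks:
print(total_value(by_currency(equity_only(holdings), "USD")))  # → 500.0
```

Because each block is independent, the same components serve many customer-specific reports; customization becomes composition rather than coding.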

A key point here is that the solution has enabled IOO to transfer the time-consuming report customization work that previously had to be done for each new institutional customer by very high-priced and very busy technical resources (applications programmers) to IOO's operations partners (business analysts).

As Hardy summarized it, "We have been successful in our goal of taking the technology out of the outsourcing equation. As long as we supply the right data through Composite, the reporting tool handles all formatting, graphing and other functionality that we used to have to hard code within the application program. One reason we had such a big bottleneck in implementing outsourcing customers was the fact that we didn't have enough programmers in our application development group to handle the customization requirements. Business analysts in our operations group can now easily process the building blocks necessary to customize reports."

By combining the capabilities of data virtualization and the new reporting tool, IOO has been able to navigate through and abstract a very complex set of layers of translation to connect the raw IOO data structures to business metadata and finished customer reports.

The Data Virtualization Solution - Best Practices
Hardy offered some best practices and lessons learned based on his experience working with IOO.

Start with a focused project: Hardy stated that a focused initial effort is a key to success. "Don't start with a project to make all enterprise data available via a service." To date, IOO has completed four institutional customer implementations with over ten more in progress or planned.

Centralize support for data virtualization and development: Giving ICC responsibility for the data virtualization technology provided economies of scale and enabled the company to accelerate up the best practices learning curve in effectively implementing the technology in business solutions.

Understand the data: According to Hardy, data virtualization is no different than any other development or physical data integration tool. "The design and architecture of the solution is where you need to spend time. The old carpenter's adage of 'measure twice and cut once' applies here. Don't just dive in and start coding views. Make sure you really understand the data. This is the key to success in an effort like this."

Educate and support the business: Hardy said it is also important to allocate time to consult with the business and make sure they understand the data. "In our case, they are the ones designing the reports but they don't necessarily have in-depth knowledge about the data in its raw form. So we need to help them understand the raw data and make sure that when the data gets to the end tool, people understand what they are looking at."

Manage business expectations: Although data virtualization is a capability that can get the job done faster, the organization still has to go through the same steps to implement the solution: analysis, design and architecture. "In our case, we could say it saved us a year and a half, but it did take seven months. So it will take time no matter how you do it."

Summary of Benefits
Hardy described the major benefits Northern Trust has achieved with the new IOO client reporting architecture based on data virtualization.

Faster time-to-market increases customer satisfaction and revenue: Data virtualization has reduced the time it takes to implement a new outsourcing customer by 50%. Moving customers through the pipeline faster improves overall customer satisfaction and gives IOO the capacity to bring in even more business and revenue.

Single point of access simplifies work and cuts costs: The fact that the reporting tool, or any consuming application, has a single data access point regardless of the format, access method (SQL, web service, stored procedure, etc.) or data source significantly reduces application complexity as well as development and maintenance time and cost.

Reuse benefits grow with adoption: The ability to create reusable data services that can be shared by all applications also reduces costs.

Long run flexibility is improved: Northern Trust is embarking on a three-year project to replace all of its underlying data stores with modern data warehouses and data marts. By putting data virtualization in the middle, thereby decoupling the back-end data from the consuming applications, IOO will not have to change anything that has been defined in the reporting tool when it swaps out the back-end data stores. It will only need to change the views to consume the new data sources.
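This decoupling benefit can be sketched as an interface that the consuming side codes against, so back-end implementations can be swapped without touching the consumer. The class and method names below are hypothetical stand-ins, not Northern Trust's actual components.

```python
from typing import Protocol

class ValuationSource(Protocol):
    """The contract the reporting side depends on; sources are interchangeable."""
    def valuations(self, account: str) -> list: ...

class LegacyStoredProcSource:
    def valuations(self, account):
        return [{"account": account, "value": 100.0}]  # stand-in for a DB2 stored-proc call

class NewWarehouseSource:
    def valuations(self, account):
        return [{"account": account, "value": 100.0}]  # stand-in for a future data mart query

def report_total(source: ValuationSource, account: str) -> float:
    # The consumer never changes when the back-end source is swapped out.
    return sum(r["value"] for r in source.valuations(account))

assert report_total(LegacyStoredProcSource(), "A1") == report_total(NewWarehouseSource(), "A1")
```

In the IOO architecture, the virtualization layer's views play the role of this interface: only the view definitions change when the underlying stores are replaced.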

•   •   •

Editor's Note: Robert Eve is the co-author, along with Judith R. Davis, of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility, the first book published on the topic of data virtualization. The complete Northern Trust case study, along with nine other enterprise case studies, is available in the book.

More Stories By Robert Eve

Robert Eve is the EVP of Marketing at Composite Software, the data virtualization gold standard and co-author of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility. Bob's experience includes executive level roles at leading enterprise software companies such as Mercury Interactive, PeopleSoft, and Oracle. Bob holds a Masters of Science from the Massachusetts Institute of Technology and a Bachelor of Science from the University of California at Berkeley.
