By Ash Parikh
March 23, 2012 07:45 AM EDT
In a recent article, CIO.com said that analytics and BI will be the top technology priorities for CIOs in 2012, based on a Gartner Inc. survey of IT executives. However, look back a few years and the reports tell the same story: BI was a top priority then as well. Although we have fast-forwarded many years, the priorities haven't really changed. BI is still top of mind.
Granted, the amount of data that needs to be processed is growing by the day, and the need for businesses to have timely insight into things that matter is becoming more immediate. But wasn't this the case earlier as well? Businesses have always had this mindset - hence the continuous drive for growth and innovation.
What's new? Nothing, on the face of it. Except that, with all things being equal, the fundamental problem - or shall I say problems - seems to have taken a backseat yet again. We seem to keep talking about the symptoms instead of treating the issue at hand. According to a recent report by Gleanster, LLC, the biggest challenges for enabling BI agility are:
- Breaking down data / departmental silos
- Integrating with applications (e.g., CRM), operations and other platforms
- Achieving acceptable data quality
The report also points out that the metrics most commonly used by businesses are time-to-decision or time-to-response to information requests; information access (comprehensiveness, accuracy, and consistency); and volume and quality of actionable insights. These, in essence, are the fundamental requirements that need to be fulfilled to the hilt in order to enable BI agility.
For those in the know, this is not something BI tools can address on their own. A recent blog by Forrester Research, Inc., states that traditional BI approaches often fall short because BI hasn't fully empowered information workers, who still largely depend on IT, and because BI platforms, tools, and applications aren't agile enough. Now that we have this background in place, I can start my analysis.
Based on what we are seeing in some ongoing polls, without the underpinnings of a self-service-driven agile data integration strategy in place, BI agility will continue to remain a pipe dream. Yes, of late, data virtualization has emerged as an agile data integration approach that can enable BI agility. But since not all solutions are created equal, let's see how the proposed solution addresses the challenges we discussed.
As I always say, the devil's in the details. Data virtualization built on data federation does one thing and only one thing very well - it accesses and merges data from several different data sources, in real-time, without physical data movement. It can turn many data silos into one and integrate with applications. But how about data quality? Is federated data truly ready for consumption? All I hear is silence.
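To make that "one thing" concrete, here is a minimal sketch - not any vendor's API, with stub sources and invented field names - of what federating rows at query time, with no staged copy, amounts to:

```python
# Toy sketch of data federation: rows are pulled from each source at query
# time and merged into a single virtual view -- no physical copy is staged.
# The "sources" below are in-memory stubs standing in for live systems.

def crm_source():
    # Stand-in for a live CRM query
    return [{"cust_id": 1, "name": "Acme Corp"},
            {"cust_id": 2, "name": "Globex"}]

def billing_source():
    # Stand-in for a live billing-system query
    return [{"cust_id": 1, "balance": 1200.0},
            {"cust_id": 2, "balance": 0.0}]

def federated_view():
    # The join happens in flight, per request
    billing = {r["cust_id"]: r for r in billing_source()}
    return [{**c, **billing.get(c["cust_id"], {})} for c in crm_source()]

for row in federated_view():
    print(row)
```

The virtual view is always as fresh as its sources - which is precisely why any quality problem in the sources surfaces, untreated, in the result.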
A BI tool won't do anything to improve data quality as it simply assumes the availability of the most current and accurate data. What happens if there are inaccuracies and inconsistencies after federating data across various systems in real time? A more fundamental question - what if you cannot effectively analyze and profile the federated data in the first place? Well, you need further processing.
Did you read the fine print? I think it just said, deal with it. Or worse yet, I have also heard the excuse that BI tools do not expect consistent and accurate data. Very convenient, wouldn't you say? Bottom line: you not only lose the time advantage you gained by not moving the data physically, but you now have to deal with quality and consistency on a reactive basis. So much for an agile data integration approach.
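What "analyze and profile the federated data" means in practice can be sketched minimally - the rows and field names here are invented - as a scan of the in-flight result for missing values and cross-source inconsistencies before anyone consumes it:

```python
# Minimal profiling sketch: scan federated rows for missing values and
# value-level inconsistencies before the data reaches a BI tool.
rows = [
    {"cust_id": 1, "email": "ops@acme.example",  "country": "US"},
    {"cust_id": 2, "email": None,                "country": "us"},
    {"cust_id": 3, "email": "it@globex.example", "country": None},
]

def profile(rows):
    fields = {k for r in rows for k in r}
    report = {}
    for f in fields:
        values = [r.get(f) for r in rows]
        report[f] = {
            "null_count": sum(v is None for v in values),
            "distinct": len({v for v in values if v is not None}),
        }
    return report

print(profile(rows))
```

Even this toy scan flags the problem: "US" and "us" count as two distinct values, which is exactly the kind of inconsistency federation alone passes straight through.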
We discussed quality and consistency. Now, how about the role of business users? Shouldn't the analyst define business entities, analyze and identify issues with the data, create rules to correct inaccuracies and inconsistencies, and then play a key part in making sure the federated data is as requested? Ask any BI professional, business users know the data the best. Data federation does little to get them involved.
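Getting the analyst involved means their rules - however they are captured in a real tool - end up applied to the federated rows in flight. A hedged sketch, with made-up rule names and fields:

```python
# Hedged sketch: business-defined cleansing rules applied to federated
# rows in flight. Rule names and fields are illustrative only.
RULES = [
    ("normalize_country",
     lambda r: {**r, "country": (r.get("country") or "").upper() or None}),
    ("strip_name",
     lambda r: {**r, "name": (r.get("name") or "").strip() or None}),
]

def apply_rules(row):
    # Run every business rule, in order, over one federated row
    for _name, rule in RULES:
        row = rule(row)
    return row

print(apply_rules({"name": " Acme Corp ", "country": "us"}))
# -> {'name': 'Acme Corp', 'country': 'US'}
```

The point is who owns RULES: the business user who knows the data, not a developer reverse-engineering requirements after the fact.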
Next, let's talk about the role of IT. Is it just about prioritizing a backlog of growing requests, building out the solution, testing it and then deploying it? Shouldn't IT interact with the analyst instantly and throughout the process? This is critical to IT building exactly what the business wanted. Without self-service, agility can't be ensured. However, data federation has typically been a coding-heavy IT tool.
Although data federation has been around for a long time, it hasn't gone too far. Data virtualization built on data federation seems to be a case of doing the same thing again, and expecting a different answer. Federating data across many diverse data sources, in real time, without physical data movement, is what I call, par for the course. To enable BI agility, you need to go beyond looking under the hood.
Data virtualization built on data federation cannot profile both data sources and logic, apply complex data quality rules and advanced data transformations to federated data while it is in flight, involve the business user early and often, or reuse the virtual views not just for BI tools, portals and composite applications but also for batch. It looks like we have a choice to make.
The choices are manual coding, further processing using other tools, and custom solutions. Really! Is this truly a choice you have the luxury or the extra budget to make? Are you going to sign up for a solution that promises agility and then leaves a major portion of the task to you or to another tool? What's even more dangerous is that the lack of critical functionality is simply passed off as a good-to-have.
The Gartner Magic Quadrant for Data Integration Tools, October 27, 2011, says it well - it's "the ability to switch seamlessly and transparently between delivery modes (bulk/batch vs. granular real-time vs. federation) with minimal rework." Data virtualization thus needs to be built on data integration to truly enable BI agility. Having said that, I believe the days of a one-trick pony are numbered.
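What "minimal rework" looks like can be sketched simply - function names and stub sources here are illustrative, not any product's API: the same transformation logic serves both the real-time federated path and the bulk/batch path, so switching delivery modes never means rewriting the logic.

```python
# Illustrative sketch of mode-independent integration logic: one transform
# is reused by a real-time (per-request) path and a bulk/batch path.
def transform(row):
    # The shared business logic -- written once, reused in every mode
    return {"cust_id": row["cust_id"], "revenue_usd": round(row["revenue"], 2)}

def serve_realtime(fetch_one, cust_id):
    # Federated/real-time delivery: transform a single row on demand
    return transform(fetch_one(cust_id))

def run_batch(fetch_all):
    # Bulk/batch delivery: the same transform over the full extract
    return [transform(r) for r in fetch_all()]

# Stub standing in for both live and bulk access to the same system
data = {1: {"cust_id": 1, "revenue": 99.999},
        2: {"cust_id": 2, "revenue": 10.0}}
print(serve_realtime(lambda i: data[i], 1))
print(run_batch(lambda: list(data.values())))
```

When the logic and the delivery mode are decoupled like this, federation becomes just one of several interchangeable delivery options rather than the whole product.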
• • •
Don't forget to join me at Informatica World 2012, May 15-18 in Las Vegas, to learn the tips, tricks and best practices for using the Informatica Platform to maximize your return on big data, and get the scoop on the R&D innovations in our next release, Informatica 9.5. For more information and to register, visit www.informaticaworld.com.