
Google Chrome and Business Intelligence in the Cloud

Panorama this week released the latest version of its gadget for Google Docs

What will Business Intelligence be like in the future? A "BI in the cloud" architecture is only going to be feasible when most of your source data already lives in the cloud - perhaps in something like SQL Server Data Services, Amazon SimpleDB or Google BigTable, or possibly in a hosted app.

The big news last week was of course Google's announcement of Chrome. And as several of the more informed bloggers noted (e.g. Nick Carr, Tim McCoy), the point of Chrome is to be not so much a browser as a platform for online applications, leading to a world where there is no obvious distinction between online and offline applications.

When I think about applications I think about Business Intelligence applications, and of course thinking about online BI applications and Google led me straight to Panorama - which, incidentally, released the latest version of its gadget for Google Docs this week.

Now, I'll be honest and say that I've had a play with it and it is very slow and there are a few bugs still around. But it's a beta, and I'm told that it's running on a test server and performance will be better once it is released, and anyway it's only part of a wider client tool story (outlined and analysed nicely by Nigel Pendse here) which starts in the full Novaview client and involves the ability to publish views into Google Docs for a wider audience and for collaboration.

I guess it's a step towards the long-promised future where the desktop PC will have withered away into nothing more than a machine to run a browser on, and all our BI apps and all our data will be accessible over the web.

This all makes me wonder what BI will be like in the future...Time for some wild, half-formed speculation:

  • Starting at the back, the first objection raised to a purely 'BI in the cloud' architecture is that you've got to upload your data to it somehow. Do you fancy trying to push what you load into your data warehouse every day up to some kind of web service? I thought not. So I think a 'BI in the cloud' architecture is only going to be feasible when most of your source data lives in the cloud already, possibly in something like SQL Server Data Services, Amazon SimpleDB or Google BigTable, or possibly in a hosted app. This requirement puts us a long way into the future already, although for smaller data volumes and one-off analyses perhaps it's not so much of an issue.

  • You also need your organization to accept the idea of storing its most valuable data in someone else's data center. Now I'm not saying this as a kind of "why don't those Luddites hurry up and accept this cool new thing"-type comment, because there are some very valid objections to be made to the idea of cloud computing at the moment, like: can I guarantee good service levels? Will the vendor I choose go bust, or get bought, or otherwise disappear in a year or two? What are the legal implications of moving data to the cloud and possibly across borders? It will be a while before there are good answers to these questions and even when there are, there's going to be a lot of inertia that needs to be overcome.

    The analogy most commonly used to describe the brave new world of cloud computing is with the utility industry: you should be able to treat IT like electricity or water - a service you can plug into whenever you want, and assume will be there when you need it (see, for example, "The Big Switch").

    As far as data goes, though, I think a better analogy is with the development of the banking industry. At the moment we treat data in the same way that a medieval lord treated his money: everyone has their own equivalent of a big strong wooden box in the castle where the gold is kept, in the form of their own data centre. Nowadays the advantages of keeping money in the bank are clear - why worry about thieves breaking in and stealing your gold in the night, why go to the effort of moving all those heavy bags of gold around yourself, when it's much safer and easier to manage and move money about when it's in the bank? We may never physically see the money we possess but we know where it is and we can get at it when we need it. And I think the same attitude will be taken of data in the long run, but it does need a leap of faith to get there (how many people still keep money hidden in a jam jar in a kitchen cupboard?).

  • Once your data's in the cloud, you're going to want to load it into a hosted data warehouse of some kind, and I don't think that's too much to imagine given the cloud databases already mentioned. But how to load and transform it? Not so much of an issue if you're doing ELT, but for ETL you'd need a whole bunch of new hosted ETL services to do this. I see Informatica has one in Informatica On Demand; I'm sure there are others. (There's a rough sketch of the ELT pattern after this list.)

  • You're also going to want some kind of analytical engine on top - Analysis Services in the cloud, anyone? Maybe not quite yet, but companies like Vertica and Kognitio are pushing into this area already; the architecture of this new generation of shared-nothing MPP databases surely lends itself well to the cloud model: if you need better performance you just reach for your credit card and buy a new node.

  • You then want to expose it to applications which can consume this data, and in my opinion the best way of doing this is of course through an OLAP/XMLA layer. In the case of Vertica you can already put Mondrian on top of it, so you can already have this if you want it, but I suspect that you'd have to invest as much time and money to make the OLAP layer scale as you had invested to make the underlying database scale, otherwise it would end up being a bottleneck. What's the use of having a high-performance database if your OLAP tool can't turn an MDX query, especially one with lots of calculations, into an efficient set of SQL queries and perform the calculations as fast as possible? Think of all the work that has gone into AS2008 to improve the performance of MDX calculations - the performance improvements compared to AS2005 are massive in some cases, and the AS team haven't even tackled the problem of parallelism in the formula engine yet (and I'm not sure if they even want to, or if it's a good idea). Again, there's been a lot of buzz recently about the implementation of MapReduce by Aster and Greenplum to perform parallel processing within the data warehouse; although it aims to solve a slightly different set of problems, it nonetheless shows that the problem is being thought about (there's a toy sketch of the map/reduce pattern after this list).

  • Then it's on to the client itself. Let's not talk about great improvements in usability and functionality, because I'm sure badly designed software will be as common in the future as it is today. It's going to be delivered over the web via whatever the browser has evolved into, and will certainly use whatever modish technologies are the equivalent of today's Silverlight, Flash, AJAX etc. But will it be a stand-alone, specialised BI client tool, or will there just be BI features in online spreadsheets (or whatever online spreadsheets have evolved into)? Undoubtedly there will be good examples of both, but I think the latter will prevail. It's true even today that users prefer to work with their data in Excel - that's where they eventually want their data to end up; the trend would move even faster if MS pulled their finger out and put some serious BI features in Excel...

    In the short term this raises an interesting question though: do you release a product which, like Panorama's gadget, works with the current generation of clunky online apps in the hope that you can grow with them? Or do you, like Good Data and Birst (which I just heard about yesterday, and will be taking a closer look at soon), create your own complete, self-contained BI environment which gives a much better experience now but which could end up being an online dead-end? It all depends on how quickly the likes of Google and Microsoft (which is supposedly going to be revealing more about its online services platform soon) can deliver usable online apps; they have the deep pockets to be able to finance these apps for a few releases while they grow into something people want to use, but can smaller companies like Panorama survive long enough to reap the rewards? Panorama has a traditional BI business that could certainly keep it afloat, although one wonders whether they are angling to be acquired by Google.
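
As an aside on the load-and-transform point above, here's a minimal, purely illustrative sketch of the ELT pattern: the raw extract is loaded into a staging table as-is, and the transformation then runs as SQL inside the database itself, next to the data. Python's sqlite3 module stands in for whatever hosted warehouse you might actually be using, and the table and column names are made up for the example.

    import csv
    import io
    import sqlite3

    # Stand-in for a connection to a hosted cloud warehouse
    conn = sqlite3.connect(":memory:")

    # Extract + Load: push the raw source rows into a staging table untouched
    raw_extract = io.StringIO(
        "order_id,country,amount\n"
        "1,UK,120.50\n"
        "2,DE,80.00\n"
        "3,UK,42.10\n"
    )
    conn.execute("CREATE TABLE staging_sales (order_id INT, country TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO staging_sales VALUES (?, ?, ?)",
        [(int(r["order_id"]), r["country"], float(r["amount"]))
         for r in csv.DictReader(raw_extract)],
    )

    # Transform: the 'T' happens inside the warehouse as plain SQL,
    # not in a separate ETL server sitting outside the cloud
    conn.execute("""
        CREATE TABLE fact_sales_by_country AS
        SELECT country, COUNT(*) AS order_count, SUM(amount) AS total_amount
        FROM staging_sales
        GROUP BY country
    """)

    for row in conn.execute("SELECT * FROM fact_sales_by_country ORDER BY country"):
        print(row)

The SQL itself isn't the point; the point is that with ELT the heavy lifting stays next to the data, so there's nothing extra to push up from inside the firewall.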
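
And on the MapReduce point above: the toy below is just my own assumption-laden illustration (not how Aster or Greenplum actually implement it) of the map/reduce split that those engines parallelise across nodes - partial aggregates computed independently per chunk of data, then merged. Python's multiprocessing stands in for the nodes, and all the names are invented for the example.

    from functools import reduce
    from multiprocessing import Pool

    # Toy fact rows: (country, sales amount). In an MPP warehouse each chunk
    # would live on a different node; here we just split one list four ways.
    ROWS = [("UK", 120.5), ("DE", 80.0), ("UK", 42.1), ("FR", 10.0), ("DE", 5.5)] * 1000
    CHUNKS = [ROWS[i::4] for i in range(4)]

    def map_phase(chunk):
        """Runs locally against one chunk: partial sums per country."""
        partial = {}
        for country, amount in chunk:
            partial[country] = partial.get(country, 0.0) + amount
        return partial

    def reduce_phase(left, right):
        """Merges the partial results coming back from each 'node'."""
        for country, amount in right.items():
            left[country] = left.get(country, 0.0) + amount
        return left

    if __name__ == "__main__":
        with Pool(4) as pool:
            partials = pool.map(map_phase, CHUNKS)   # map step: one task per chunk
        totals = reduce(reduce_phase, partials, {})  # reduce step: merge partials
        print(totals)

The attraction for the shared-nothing MPP model is the shape of the work: the map phase scales out just by adding chunks (nodes), and only the small partial results ever travel to the reduce step.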

So there we go, just a few thoughts I had. Anyone got any comments? I like a good discussion!


More Stories By Chris Webb

Chris Webb is an IT Consultant based in London, UK.


Most Recent Comments
pfelix 09/15/08 11:14:28 AM EDT

Great article. Lots of good points are made here. "Cloud" computing makes a lot of sense and will undoubtedly be accepted by many organizations in the future. Currently the BI SaaS offering is in a very early stage of development, but it also offers a lot of useful features. Anyone with a spreadsheet who wants to do analysis in a collaborative manner can accomplish this more easily than ever before. Panorama's new Flash gadget, which is available to both Google Docs users and iGoogle users, can be leveraged by linking it to existing OLAP data sources in only a matter of minutes. As the article points out, there are challenges. Uploading transactional databases to the "cloud" is not a very realistic strategy. However, it is realistic to upload reconciled data, which can be a much smaller set of data while still offering significant analysis abilities. Another common objection to BI SaaS is security. The data used in BI analysis is typically some of the most confidential and critical to an organization's success. Pushing this data to a third party is not something to take lightly. However, with today's highly redundant data centers and encryption techniques it is likely that BI data in the cloud will commonly be more secure than it would be in self-maintained IT infrastructure. This is an exciting paradigm shift that the BI industry, and the software industry in general, is going through. It will be very interesting to watch this transition.
