|By Keith Donald||
|December 8, 2004 12:00 AM EST||
By now you've probably either heard about or read the analyst report from the Burton Group entitled "J2EE in Jeopardy."
In summary, the claim is that J2EE as a standard is in danger due to several market forces, and the report offers recommendations for end users and vendors alike:
- Market commoditization: Open source players like Apache, JBoss, and ObjectWeb are commoditizing the platform, making it harder for vendors to profit from J2EE server licenses. If vendors can't make money on J2EE, they won't want to continue to invest in the specification.
- "Disruptive" technologies: In the last year the complexity of J2EE's programming model - EJB - has been challenged. Simpler, more productive models have emerged from within the open source community and have garnered widespread acceptance in a short time. Credited innovators here include the Spring Framework and Hibernate ORM.
- End users should turn over responsibility for generic J2EE infrastructure to proven providers in the open source world. You shouldn't abandon J2EE, but should consider some of the "alternative" frameworks in the open source community.
- Vendors should focus on building a "J2EE super platform." Basically, innovate in areas where open source hasn't already preempted competition through commoditization.
On the one hand, it's great to see the work of the Spring, Hibernate, Apache, and other quality open source teams getting endorsement and credibility. The cat is out of the proverbial bag: open source is a force of innovation to be respected and certainly not underestimated. Outsourcing infrastructure to proven open source providers is a very effective strategy for companies looking to deliver working software better/faster/cheaper. Complementing quality open source offerings with strategic commercial products and services is an effective model for infrastructure providers looking to further penetrate the market.
On the other hand, I feel the casual reader might come away with a bad spin on what's happening here. I can see it now - a corporate manager faced with a major technology investment decision happens across this article (or others like it) and concludes J2EE is in chaos. Or worse, concludes the open source community is out building flavor-of-the-month "alternative frameworks" that "reinvent the wheel" because the "standard" platform doesn't cut it. Not exactly the impression we want to make to grow the enterprise Java market.
This is where our community must step in and set that manager straight. First, J2EE is not in a state of chaos. There are simply more good choices for J2EE infrastructure than ever before, and from what I've experienced, there are many more J2EE success stories. Second, these "alternative" frameworks absolutely do not "reinvent the wheel in open source." They all build on standard J2EE services to improve developer productivity; they are not replacements for the platform.
Indeed, Spring and Hibernate - the leading so-called "alternative" frameworks - are challenging the J2EE programming model while embracing the J2EE technology platform. This is a critical distinction. With Spring particularly, you get the power and maturity of the J2EE stack with a simplicity comparable to that of the .NET programming model. And you get it all with less cost and considerably more business leverage (choice).
How is this possible? When you step back and look at J2EE, there is a lot to it. J2EE consists of:
- A standard set of enterprise services addressing typical infrastructure needs, including transaction management (JTA), dynamic user content (servlets/JSP/JSF), database access (JDBC), service lookup (JNDI), asynchronous messaging (JMS), management (JMX), and remoting (RMI/Web services).
- A standard programming model for tapping into the power of the above services, gluing the individual pieces together into a consistent software delivery platform.
Enter the "alternative" frameworks. Spring, in particular, has given our community a framework that not only makes it easier to tap into the power of J2EE, but captures best practices on what services to use when given your business requirements. The result? More developers, architects, and managers are getting smarter about the infrastructure they need for the given job at hand. Developer productivity is up.
For over a year I've personally leveraged the Spring Framework as the base architecture for my development projects. I now treat J2EE infrastructure as a separate concern, one fully decoupled from the business logic ("core meat") of my application. Spring gives me the power to choose which deployment environment and technologies are most appropriate given the complexity of the domain problem at hand. That puts me in command - if all I need is a Web container to power a Web app with a single data source, a solution like Tomcat is the most cost-effective choice. For middleware-intensive applications that require messaging, global transactions, and remoting, a higher-end application server is worth the investment. In all cases, my programming model stays simple and consistent, grounded in my customer's problem domain.
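To make that decoupling concrete, here is a minimal sketch of the style Spring encourages. All class names are hypothetical, invented purely for illustration; the wiring is done by hand in plain Java, which is exactly what a Spring bean definition would do externally in a real deployment.

```java
// The business objects are POJOs: no container APIs, no framework imports.
// All names here are hypothetical, for illustration only.
interface TaxCalculator {
    double taxFor(double amount);
}

class FlatTaxCalculator implements TaxCalculator {
    private final double rate;
    FlatTaxCalculator(double rate) { this.rate = rate; }
    public double taxFor(double amount) { return amount * rate; }
}

class InvoiceService {
    private TaxCalculator taxCalculator;

    // Setter injection: the same hook a Spring <property> element drives.
    public void setTaxCalculator(TaxCalculator taxCalculator) {
        this.taxCalculator = taxCalculator;
    }

    public double totalFor(double subtotal) {
        return subtotal + taxCalculator.taxFor(subtotal);
    }
}

public class WiringDemo {
    public static void main(String[] args) {
        // Wiring by hand, as a unit test or standalone run would do it.
        // In production, a Spring context would perform the equivalent
        // wiring from configuration, leaving both classes untouched.
        InvoiceService service = new InvoiceService();
        service.setTaxCalculator(new FlatTaxCalculator(0.25));
        System.out.println(service.totalFor(200.0)); // prints 250.0
    }
}
```

Because the business class never looks up its collaborators itself, the same code runs under a test harness, standalone on Tomcat, or inside a full application server.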
I can't say it enough: J2EE is better than ever - for the consumer. I read success story after success story from developers working on projects with products like Spring and Hibernate. They're leveraging them in all kinds of environments and application servers to support demands at all scales. Today is a great time to be developing enterprise applications in Java. It's a great time to be a consumer. We've got the technology, the platform, and the community - it's only going to get better. Who can stop us now?
|Sean Warburton 01/24/05 03:27:30 PM EST|
|Edgar Dollin 12/25/04 09:57:29 AM EST|
The only thing that one can gather from this much dialogue is that the current state of the art for open source / java development frameworks is that they are inadequate and need improvement.
I know Struts is a joke and will go away. Hibernate looks like a good tool, but the lack of standardization and of any real outside understanding of what it is holds it back. JSF is an attempt by the tool makers to force developers into paying for tools, and a belated attempt to compete with .NET. Microsoft, which has it close to correct, is held in contempt with good reason.
Doesn't look like there is going to be consensus on this issue for quite some time. The problem is that there are so many good answers and so many people with working solutions. Let's hope that the 'Right' solution survives the melee.
|Doug Smith 12/17/04 04:39:51 PM EST|
Elevator pitch? Maybe on the Petronas Towers elevators, stopping at each floor for a minute. ;-)
|Bill Watson 12/17/04 02:51:21 PM EST|
This seems to be coming up more and more frequently - ColdFusion developers being asked to defend ColdFusion against a planned move to Java and J2EE. And so, in case you end up in this situation, this is what you need to know.
For starters, any suggestion of "we need to stop using ColdFusion because we are going to use Java" demonstrates a complete lack of understanding of what exactly ColdFusion is. So, let's start with a brief explanation of the ColdFusion-Java relationship.
Applications written in ColdFusion (as of ColdFusion MX) are pure Java. Or, expressed slightly differently, ColdFusion runs on a J2EE server (either embedded, or one of your choosing) running a Sun-verified Java application (the ColdFusion engine), executing Java bytecode (compiled from your CFML source code). In other words, CFML (the code you write) is a developer-time consideration, not a run-time consideration. There is no CFML at runtime; at runtime you are executing pure Java, no more or less so than had you written the application in straight Java. Your ColdFusion application is a Java application; if you deploy a ColdFusion application what you have deployed is Java. It's as simple as that.
This means that the assertion that ColdFusion and Java are somehow mutually exclusive is just flat out incorrect. But what about the CFML code you write? Isn't that ColdFusion specific and not pure Java? And isn't that an issue? I don't think so. There is an entire industry of Java add-ons out there - tools, tags, language extensions, and more - and Java shops use these (as they should; after all, why reinvent the wheel?). If your Java code leverages third-party add-ons for reporting, or back-end integration, or charting, or ... does that make your code any less Java? Nope, not at all.
Experienced developers know that starting from step one is expensive and seldom makes sense, regardless of the language and platform. Experienced developers have toolboxes at their disposal, stuff they can leverage and reuse to be as productive as possible. Experienced developers write modular applications, separating logic and processing and presentation into tiers, allowing these to evolve independently of each other, even allowing them to be inserted or removed independently.
For Java developers, one of these tools should be ColdFusion. After all, why write dozens of lines of Java code to connect to a database when a single tag can accomplish the exact same thing (likely using the same code internally)? And why write lots of code to send an SMTP message using JavaMail APIs when a single tag can do it for you (again, using those same APIs)? You can think of ColdFusion as a bunch of prewritten Java code, stuff you can use so as to hit the ground running. And that makes your app no less Java than if you had done all the work manually.
However, some may counter that CFML is proprietary, and that the fact that you need to pay for an engine to execute your code somehow makes it non-Java. I have actually heard this from customers. So is this a valid issue? Again, I don't think so. For starters, paid does not equal proprietary. After all, these same customers do not balk at spending big bucks on their J2EE servers (and management tools and professional services and more). Furthermore, there are indeed third-party CFML engines out there. I am not going to comment on how good they are and how viable an alternative they are - that's irrelevant. What is relevant is that they exist, and that means that CFML is not single-vendor or proprietary.
Great, so ColdFusion simplifies Java development, and ColdFusion applications are no less Java than applications written in low-level Java directly. But simplicity and abstractions require sacrificing power, right? Wrong! ColdFusion applications can (and should) leverage Java; Java APIs, Java classes, JavaBeans, JSP tags, you name it, ColdFusion can leverage it because ColdFusion itself is Java. It's that simple.
So, ColdFusion or Java? The answer should be yes, ColdFusion is Java, and Java development can benefit from ColdFusion. This is not an either/or proposition, it's a "you can have it all so why the heck would you want to do it any other way?" proposition.
The ASP versus ColdFusion discussion used to come up regularly. But not anymore. Now that Microsoft has essentially abandoned any future development on classic ASP, replacing it with ASP.NET, few organizations are embarking on brand new ASP deployments. But having said that, if you do need to defend ColdFusion against ASP, here's what you need to know.
For starters, ASP capabilities are a subset of those of ColdFusion. Or put differently, ColdFusion can do anything that ASP can do, and a whole lot more too. The reverse is not true. Sure, ASP can be extended (using COM objects) to do just about anything that ColdFusion can do, but that's just it, you need to extend ASP - it's your responsibility to do so. Simple things that ColdFusion developers take for granted, like being able to generate an e-mail message, or process an uploaded file, or generate a business chart, none of those are native ASP functionality.
And this is not mere boasting, this is important, because it's the way to head off the "but ASP is free" argument. Sure, ASP is free for starters, but buy all the add-on bits you need to make it functionally equivalent to ColdFusion (even ColdFusion Standard, and even ColdFusion 3 or 4!) and you'll end up paying far more than ColdFusion costs. Sure, ASP is cheaper initially, but you get what you pay for. Or rather, you don't get what you don't pay for. And when you do pay for it, you'll end up paying a whole lot more.
And that's just looking at initial costs. ASP development is also far more time consuming than ColdFusion development. Even if you're comfortable in the languages used, you'll still have to write lots more code to get the job done. Even the execution of simple SQL statements is far more complex in ASP - one tag versus lots of lines of ugly code. More code = longer development time = costs more. Plus, more code = more complex ongoing maintenance = costs even more.
At the risk of sounding like an MBA, when you look at the total cost of ownership, ASP is not the cheaper option at all. Oh, and on top of all that, ASP is proprietary, a single vendor solution, and you're married to Windows boxes (no Linux, no Unix, no portability).
Maybe this is why, as already stated, most ColdFusion servers run on Windows, Windows boxes that likely already have ASP installed. Why? Because hundreds of thousands of developers have figured out that free can be far too expensive.
J2EE and .NET are remarkably alike, both in terms of objectives and the range of their various components and systems. Of course, applications and application development with the two platforms are not alike at all; everything from tools to languages to methodologies are different. At their respective cores, both .NET and J2EE provide the building blocks and technologies needed to build applications. Security abstractions, database support, back-end integration, system level services, transactions and messaging, run-time services, and more are all provided by the platforms themselves. Both J2EE and .NET provide "safe" environments in which applications run (the JVM and CLR respectively); both J2EE and .NET support the use of different languages within these environments (although this potential has been realized to a greater degree in .NET); both have a scripting solution designed for Web applications (JSP or ColdFusion for J2EE, ASP.NET for .NET); and both are incredibly powerful and capable.
Many organizations are going through a J2EE or .NET discussion, usually independent of any discussion about ColdFusion. And there are pros and cons to both options. J2EE wins when vendor independence, openness, and portability are a priority. .NET wins when it comes to tools, a better client experience, or simply a commitment to the Microsoft way (there is more to it than that, but that's an entire column unto itself).
However, as many are discovering, J2EE versus .NET is not always an either/or proposition. In fact, lots of organizations are discovering that they need both, and that the future is decidedly heterogeneous. This is especially true for larger organizations where there's room for both, and interoperability (primarily via SOAP) makes this a workable option.
If an organization has made the strategic decision to bet its future solely on Microsoft and .NET, then they probably should use ASP.NET. Sure, ColdFusion can coexist and interoperate with the .NET world, but ASP.NET will likely be the preferred option. For organizations going the J2EE route, well, I've covered that one already. But for most organizations, ColdFusion remains compelling, leveraging the worlds of J2EE natively and .NET via SOAP. In fact, some organizations have discovered that ColdFusion is the simplest way to create client applications that talk to both J2EE and .NET back ends, if that is needed.
So, ColdFusion or ASP.NET? That depends on what your IT future looks like. And unless the future is Microsoft and Windows only, ColdFusion remains an important cog in the IT engine.
PHP is also script based. Pages contain code that is processed by the PHP engine. The PHP engine itself is open source, and the PHP language uses a syntax borrowed from C, Java, and Perl (the latter is important, as PHP is particularly popular with former Perl developers). PHP runs on all sorts of systems, and uses functions and APIs for all sorts of processing.
PHP's stated goal (as per the official PHP FAQ) is to "allow Web developers to write dynamically generated pages quickly." Ironically, developers with both CFML and PHP experience will tell you that while PHP development may be quicker than using C or Java, it does not come close to that of ColdFusion and CFML.
There is no refuting PHP's power; PHP developers have an impressive and rich set of functions and language extensions available, and that is definitely part of PHP's appeal. But this power comes at a cost. It takes a lot more PHP code to do what simple CFML tags do, and as explained before, more code = greater cost. If you truly need that power, then PHP may indeed be an option worth considering (although CFML+Java would likely give you the same power and more). But if what you really need is to "write dynamically generated pages quickly," then PHP leaves much to be desired.
One of the most compelling arguments for PHP is its cost: there is none. But, as explained earlier, the cost of software is only one part of the total cost of application development, arguably the least significant cost in the long run. It is for this reason that despite being incredibly popular with developers and consulting shops, corporate users have been slow to adopt PHP as a development platform (which in turn is why the PHP versus ColdFusion discussion comes up so infrequently).
The bottom line is if you really need the power and flexibility of PHP and can justify the longer development cycles and more code to manage and maintain it, then PHP should seriously be considered. But if time to completion and total cost of ownership are an issue, then ColdFusion wins hands down.
|Alexander 12/10/04 11:02:45 AM EST|
Keith, "nonsense" may indeed be too strong a word. In most cases it is a good technical decision to have POJOs that don't call container APIs. But that is neither a special feature of IoC containers nor does it make your system and development organisation as a whole less dependent on the container. It just moves the dependency to a different place. It's a pattern that everyone has been free to use within EJB and I have been using it ever since. To have complex business rules or algorithms directly call into EJB APIs so you can't test them independently or use them in other kinds of systems is just bad design. It's trivial not to do that. To throw a standardised architecture overboard and instead introduce a whole new non-standard framework just to support this particular pattern, is in my view a disproportianate overreaction.
Regarding backward compatibility of open source: Well, there's a mixed picture I would say. There are indeed a lot of open source packages that have a good history there. But the most prominent example of open source, Linux, is just horrendous in that respect. Frankly, it's ridiculous that after so many years of debate about the fragmentation of Unix and many attempts to correct it, along comes a Unix clone that itself fragments into countless incompatible distributions.
Just as it is a good technical decision to not link business rules to framework APIs, it is an equally good decision to separate specification from implementation. Where is the Spring specification? Where is the Hibernate spec? Which vendors or open source groups or standards organisations support these specs?
|Doug Smith 12/10/04 08:59:49 AM EST|
I love to read these discussions. As a solo practitioner, I have found POJO, Servlet/JSP and JDBC to be good enough to build the scale of applications my clients need. I am intrigued and frustrated by the abundance of excellent frameworks to use, but have no one to discuss the pros and cons with. I tend to agree with the open standards POV.
My 20 years of IBM midrange (System/38, AS/400, iSeries) has taught me the value to the client (customer, not PC) of having a framework that persists over time. Every consultant going into a midrange shop knew exactly how to use the editor, screen designer, database, interactive sessions, batch sessions, and so on. Yes, it was a very limited architecture, but sufficient for the era, and surprisingly capable today as a database machine or JVM host.
How much good does it really do if every Java shop has a unique architecture du jour, creating a barrier for consultant productivity on Day One? Let a thousand blossoms bloom, but let reason prevail. POJO will be with us ten years from now. Will your favorite framework still be used?
|Keith Donald 12/10/04 08:54:18 AM EST|
Your point about the costs to train a development organization is a good one.
But my points are not nonsense. There are real savings to be had in minimizing dependencies on infrastructure APIs in domain-specific java code. There is real leverage in abstracting dependencies on a particular service like Hibernate behind a logical layer of data access interfaces, for example. Not only does it produce cleaner, more maintainable code, it makes it easier to replace any one layer without impacting the others.
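To make that layering concrete, here is a minimal sketch (all names hypothetical): the business code sees only a data access interface, so an in-memory implementation for tests and a Hibernate- or JDBC-backed one for production are interchangeable without touching the business logic.

```java
import java.util.HashMap;
import java.util.Map;

// The logical data access layer: business code depends only on this.
// All names here are hypothetical, for illustration only.
interface AccountDao {
    double balanceOf(String accountId);
    void store(String accountId, double balance);
}

// In-memory implementation, handy for unit tests and standalone runs.
// A Hibernate- or JDBC-backed class would implement the same interface.
class InMemoryAccountDao implements AccountDao {
    private final Map<String, Double> rows = new HashMap<>();
    public double balanceOf(String id) { return rows.getOrDefault(id, 0.0); }
    public void store(String id, double balance) { rows.put(id, balance); }
}

// Business logic: pure Java, unaware of the persistence technology.
class TransferService {
    private final AccountDao dao;
    TransferService(AccountDao dao) { this.dao = dao; }

    public void transfer(String from, String to, double amount) {
        dao.store(from, dao.balanceOf(from) - amount);
        dao.store(to, dao.balanceOf(to) + amount);
    }
}

public class DaoLayerDemo {
    public static void main(String[] args) {
        AccountDao dao = new InMemoryAccountDao();
        dao.store("alice", 100.0);
        dao.store("bob", 10.0);

        // Swapping in a Hibernate-backed AccountDao here would leave
        // TransferService untouched, which is the point of the abstraction.
        TransferService service = new TransferService(dao);
        service.transfer("alice", "bob", 40.0);

        System.out.println(dao.balanceOf("alice")); // prints 60.0
        System.out.println(dao.balanceOf("bob"));   // prints 50.0
    }
}
```

The replaceability claim falls out directly: each layer can change independently because the only contract between them is the interface.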
My point is Spring and Hibernate make it easier to write cleaner code that is consistently simple, easy to test, and cohesive (focused on domain problems.) On my development team, not everyone has to know everything about Spring or Hibernate -- only a few key developers need to know them inside and out, and the others still get their jobs done. They can do this because Spring makes it easy for developers to specialize. And you don't have to ask me, ask the many who have already built Spring-powered apps: Spring promotes well-layered applications that are structured into manageable, independently testable components.
Re: backwards compatibility. Overall I think open source has a good record of backwards compatibility. Would you agree? There are certainly examples of where specifications don't. I know for certain Spring will continue to be backwards compatible - one, because our community is large and committed to that, and second, because Spring is backed by a commercial interest. But again, the risk is mitigated when most of my code is _not_ dependent on Spring. If it is dependent on Spring, for example, because I'm using the Web MVC framework, it's a choice I make because it's a good value-add solution in which my team - given their skills and knowledge - can be most productive. I completely agree with you on that point: you have to take into account the skills of your team and the cost to retrain, and weigh that against the benefits acquired, for any technology (Spring is no exception). It's my opinion that, because Spring helps my developers write better code, and integrates well with a number of standards-based and 'good' open source/commercial solutions (as an effective technology integration platform), the benefits are real and quick (perhaps broad, perhaps piecemeal - both strategies work) and outweigh the initial training costs.
Re: fashion, Spring is my base moving forward. I've come to trust it as a stable foundation whose architects work hard to integrate the best (and more volatile) of what's out there, hedging my bets on infrastructure by providing higher-layer abstractions that give me more leverage. EJB 3.0 will come and we'll offer support for it, just like we have support for EJB 2.1 now. The programming model we see today in Spring will become more pervasive, and ultimately that's very good for Java. It's going to open up the productivity door for a lot more developers to solve a whole array of interesting problems faster.
|Alexander 12/10/04 05:32:56 AM EST|
You say: "POJOs configured and decorated by Spring and persisted by Hibernate are done so without being coupled with those respective APIs (or any infrastructure API for that matter)."
I'm sorry, but that's complete nonsense. The question is not if POJOs are coupled to anything. The question is what do developers have to do to make the application work? What knowledge do they need? And the answer is, they have to know the Spring architecture, all the Spring configuration stuff and the Hibernate OR-mappings. This _IS_ the API. The API is everything developers have to know to make the system work. So the question is what developers become coupled to, not what POJOs are coupled to.
It's no consolation when you say, with Spring, you only have to know the parts that you need. For one, it's not true, because there is always the general architecture of the framework that you have to know. And then of course a framework is there to be used. If I don't need any of the features of Spring, I don't need Spring in the first place. If I need a particular feature, I get coupled to it. There's no amount of smoke screens that can make me believe otherwise. And Spring and Hibernate are products, not specifications. So, if I get coupled to a particular product, why would I not choose .NET then? Or the Oracle DBMS, which is a platform in its own right? Both have much better tool support than Spring or Hibernate. Both are very productive. Both are supported by vendors that won't go away soon and are committed to some degree of backwards compatibility. And both cost much less than retraining a whole development organisation to the most fashionable open source framework once every two years.
|Alexander 12/10/04 04:55:35 AM EST|
I think your distinction between technology platform and programming model is a red herring. For a development organisation, there is only one thing that counts and that is the API. An EJB/CMP developer cannot use Spring/Hibernate without retraining and a year of experience. Neither Spring nor Hibernate are standards. There is a huge number of similar frameworks and an even greater number of MVC frameworks on top of that. Let's face it, what we see is fragmentation. If that turns out to be a good thing or a bad thing depends on the ability of all involved to build a new standardisation process that works better than the JCP. It's not enough to say, hey look we have so much choice isn't that fantastic? It would be fantastic if what we were talking about wasn't a platform but a range of cheese in the supermarket.
|Jay Toee 12/10/04 02:08:40 AM EST|
I really love the persistence model of EJBs and specifically CMP2. OpenSource frameworks are often used in conjunction with OpenSource app servers. One reason for more frameworks popping up, may be that JBoss' CMP engine can't handle that much traffic, so the alternatives may be more appealing.
|Keith Donald 12/09/04 11:18:27 PM EST|
One note re: open frameworks like Spring and Hibernate 'not being standard'. With these two, the question becomes: what is there to standardize? POJOs configured and decorated by Spring and persisted by Hibernate are done so without being coupled with those respective APIs (or any infrastructure API for that matter). Indeed, POJOS stay decoupled from infrastructure, which makes them easily adaptable and runnable in different environments (from test, to standalone, to enterprise.) This really does give you the ability to make infrastructure a choice not a mandate, and makes it easy to adapt and scale up as requirements change. This model is the future, no doubt about it.
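As a concrete illustration of that decoupling (class name hypothetical): a domain object that imports nothing from Spring, Hibernate, or J2EE can be exercised with nothing but the JDK, and the very same class can later be mapped and wired externally in an enterprise deployment.

```java
// A plain domain object: no Spring, no Hibernate, no J2EE imports.
// Hibernate would persist it via external mapping metadata; Spring
// would wire it into services via configuration. The class itself
// stays pure, so it runs identically in test, standalone, or
// enterprise environments. The name is hypothetical, for illustration.
public class Order {
    private final String product;
    private final int quantity;
    private final double unitPrice;

    public Order(String product, int quantity, double unitPrice) {
        this.product = product;
        this.quantity = quantity;
        this.unitPrice = unitPrice;
    }

    public String getProduct() { return product; }

    public double total() {
        return quantity * unitPrice;
    }

    public static void main(String[] args) {
        // Exercising the object with nothing but the JDK.
        Order order = new Order("widget", 3, 2.5);
        System.out.println(order.total()); // prints 7.5
    }
}
```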
|Harrison Li 12/09/04 09:15:30 PM EST|
I'd just like to add some comments about company profits and J2EE. In the early stage of J2EE, application server vendors like BEA, etc. were the major force pushing the J2EE standard. Now things have changed. Big business application vendors such as SAP have started to play that role. The reason is very simple: those big companies have invested significantly in J2EE technology by developing a new generation of business application products. If you were familiar with SAP, you would have seen what vast changes have happened in product development on top of J2EE in the last 3-4 years. SAP, IBM, etc. have to continue to push J2EE technology forward because J2EE has become the foundation of the company's products. I don't worry about the future of J2EE technology. Instead, I do worry about how those "pure" J2EE application server vendors could survive if they continue to be server vendors only. Thanks.
|Dave 12/09/04 08:26:54 PM EST|
I like the programming models that open source projects like Spring and Hibernate give you.
The problem with frameworks like these, however, is that open source is good, but open *standards* are better. I'd be much more likely to use Spring if there were a Spring JSR, and I'd much rather take my pick from 20+ open source and commercial implementations of JDO than tie my application to the proprietary Hibernate API.
Look at Struts for instance - great framework, but in time it's going to go the way of the dodo, replaced completely by JSF, because JSF is a *standard*, and there are multiple open source and commercial vendors I can go to for my JSF implementation. The current Hibernate API will be legacy code too once everyone moves to EJB 3 / JDO.
Open source is great, but open standards are much better...
|David 12/09/04 07:41:38 PM EST|
J2EE has simply gotten so bloated and complex that few can really grasp it all. And since J2EE is the foundation, that's not good. Our applications are complicated enough without all of that baggage. The simpler frameworks, including simpler J2EE subsets such as JSP/servlet + JDBC, are enough for many applications. They can always add JavaMail if they do email, or RMI if they need remote invocations, and they can even venture into JMS should that become necessary. However, most of those are not needed by a large number of applications, and the simpler frameworks will help put a stop to managers who choose J2EE when the simpler stuff will make it happen faster and cheaper.
May. 29, 2015 01:00 PM EDT Reads: 1,463
Thanks to widespread Internet adoption and more than 10 billion connected devices around the world, companies became more excited than ever about the Internet of Things in 2014. Add in the hype around Google Glass and the Nest Thermostat, and nearly every business, including those from traditionally low-tech industries, wanted in. But despite the buzz, some very real business questions emerged – mainly, not if a device can be connected, or even when, but why? Why does connecting to the cloud create greater value for the user? Why do connected features improve the overall experience? And why do...
May. 29, 2015 12:42 PM EDT Reads: 410
SYS-CON Events announced today that O'Reilly Media has been named “Media Sponsor” of SYS-CON's 16th International Cloud Expo®, which will take place on June 9–11, 2015, at the Javits Center in New York City, NY. O'Reilly Media spreads the knowledge of innovators through its books, online services, magazines, and conferences. Since 1978, O'Reilly Media has been a chronicler and catalyst of cutting-edge development, homing in on the technology trends that really matter and spurring their adoption by amplifying "faint signals" from the alpha geeks who are creating the future. An active participa...
May. 29, 2015 12:30 PM EDT Reads: 788
SYS-CON Events announced today that BMC will exhibit at SYS-CON's 16th International Cloud Expo®, which will take place on June 9-11, 2015, at the Javits Center in New York City, NY. BMC delivers software solutions that help IT transform digital enterprises for the ultimate competitive business advantage. BMC has worked with thousands of leading companies to create and deliver powerful IT management services. From mainframe to cloud to mobile, BMC pairs high-speed digital innovation with robust IT industrialization – allowing customers to provide amazing user experiences with optimized IT per...
May. 29, 2015 12:15 PM EDT Reads: 945
Imagine a world where targeting, attribution, and analytics are just as intrinsic to the physical world as they currently are to display advertising. Advances in technologies and changes in consumer behavior have opened the door to a whole new category of personalized marketing experience based on direct interactions with products. The products themselves now have a voice. What will they say? Who will control it? And what does it take for brands to win in this new world? In his session at @ThingsExpo, Zack Bennett, Vice President of Customer Success at EVRYTHNG, will answer these questions a...
May. 29, 2015 12:13 PM EDT Reads: 418
The 4th International Internet of @ThingsExpo, co-located with the 17th International Cloud Expo - to be held November 3-5, 2015, at the Santa Clara Convention Center in Santa Clara, CA - announces that its Call for Papers is open. The Internet of Things (IoT) is the biggest idea since the creation of the Worldwide Web more than 20 years ago.
May. 29, 2015 12:00 PM EDT Reads: 1,769
An entirely new security model is needed for the Internet of Things, or is it? Can we save some old and tested controls for this new and different environment? In his session at @ThingsExpo, New York's at the Javits Center, Davi Ottenheimer, EMC Senior Director of Trust, reviewed hands-on lessons with IoT devices and reveal a new risk balance you might not expect. Davi Ottenheimer, EMC Senior Director of Trust, has more than nineteen years' experience managing global security operations and assessments, including a decade of leading incident response and digital forensics. He is co-author of t...
May. 29, 2015 11:00 AM EDT Reads: 5,711
The Internet of Things is a misnomer. That implies that everything is on the Internet, and that simply should not be - especially for things that are blurring the line between medical devices that stimulate like a pacemaker and quantified self-sensors like a pedometer or pulse tracker. The mesh of things that we manage must be segmented into zones of trust for sensing data, transmitting data, receiving command and control administrative changes, and peer-to-peer mesh messaging. In his session at @ThingsExpo, Ryan Bagnulo, Solution Architect / Software Engineer at SOA Software, focused on desi...
May. 29, 2015 11:00 AM EDT Reads: 4,031
The multi-trillion economic opportunity around the "Internet of Things" (IoT) is emerging as the hottest topic for investors in 2015. As we connect the physical world with information technology, data from actions, processes and the environment can increase sales, improve efficiencies, automate daily activities and minimize risk. In his session at @ThingsExpo, Ed Maguire, Senior Analyst at CLSA Americas, will describe what is new and different about IoT, explore financial, technological and real-world impact across consumer and business use cases. Why now? Significant corporate and venture...
May. 29, 2015 10:50 AM EDT Reads: 479
While great strides have been made relative to the video aspects of remote collaboration, audio technology has basically stagnated. Typically all audio is mixed to a single monaural stream and emanates from a single point, such as a speakerphone or a speaker associated with a video monitor. This leads to confusion and lack of understanding among participants especially regarding who is actually speaking. Spatial teleconferencing introduces the concept of acoustic spatial separation between conference participants in three dimensional space. This has been shown to significantly improve comprehe...
May. 29, 2015 10:00 AM EDT Reads: 3,201
Today’s enterprise is being driven by disruptive competitive and human capital requirements to provide enterprise application access through not only desktops, but also mobile devices. To retrofit existing programs across all these devices using traditional programming methods is very costly and time consuming – often prohibitively so. In his session at @ThingsExpo, Jesse Shiah, CEO, President, and Co-Founder of AgilePoint Inc., discussed how you can create applications that run on all mobile devices as well as laptops and desktops using a visual drag-and-drop application – and eForms-buildi...
May. 29, 2015 10:00 AM EDT Reads: 5,471
There will be 150 billion connected devices by 2020. New digital businesses have already disrupted value chains across every industry. APIs are at the center of the digital business. You need to understand what assets you have that can be exposed digitally, what their digital value chain is, and how to create an effective business model around that value chain to compete in this economy. No enterprise can be complacent and not engage in the digital economy. Learn how to be the disruptor and not the disruptee.
May. 29, 2015 09:45 AM EDT Reads: 681
The enterprise market will drive IoT device adoption over the next five years. In his session at @ThingsExpo, John Greenough, an analyst at BI Intelligence, division of Business Insider, will analyze how companies will adopt IoT products and the associated cost of adopting those products. John Greenough is the lead analyst covering the Internet of Things for BI Intelligence- Business Insider’s paid research service. Numerous IoT companies have cited his analysis of the IoT. Prior to joining BI Intelligence, he worked analyzing bank technology for Corporate Insight and The Clearing House Pay...
May. 29, 2015 09:36 AM EDT Reads: 464