



J2EE: A Standard in Jeopardy?


By now you've probably either heard about or read the analyst report from the Burton Group entitled "J2EE in Jeopardy."

In summary, the report claims that J2EE as a standard is in danger due to several market forces:

  1. Market commoditization: Open source players like Apache, JBoss, and ObjectWeb are commoditizing the platform, making it harder for vendors to profit from J2EE server licenses. If vendors can't make money on J2EE, they won't want to continue to invest in the specification.
  2. "Disruptive" technologies: In the last year the complexity of J2EE's programming model - EJB - has been challenged. Simpler, more productive models have emerged from within the open source community and have garnered widespread acceptance in a short time. Credited innovators here include the Spring Framework and Hibernate ORM.
The report goes on to make recommendations to mitigate risks for end users and vendors alike. In summary:
  1. End users should turn over responsibility for generic J2EE infrastructure to proven providers in the open source world. You shouldn't abandon J2EE, but should consider some of the "alternative" frameworks in the open source community.
  2. Vendors should focus on building a "J2EE super platform." Basically, innovate in areas where open source hasn't already preempted competition through commoditization.
As a Spring Framework user/developer and a strong believer in our value proposition and development philosophy, I came away from all of this with mixed feelings.

On the one hand, it's great to see the work of the Spring, Hibernate, Apache, and other quality open source teams getting endorsement and credibility. The cat is out of the proverbial bag: open source is a force of innovation to be respected and certainly not underestimated. Outsourcing infrastructure to proven open source providers is a very effective strategy for companies looking to deliver working software better/faster/cheaper. Complementing quality open source offerings with strategic commercial products and services is an effective model for infrastructure providers looking to further penetrate the market.

On the other hand, I feel the casual reader might come away with a bad spin on what's happening here. I can see it now - a corporate manager faced with a major technology investment decision happens across this article (or others like it) and concludes J2EE is in chaos. Or worse, concludes the open source community is out building flavor-of-the-month "alternative frameworks" that "reinvent the wheel" because the "standard" platform doesn't cut it. Not exactly the impression we want to make to grow the enterprise Java market.

This is where our community must step in and set that manager straight. First, J2EE is not in a state of chaos. There are simply more good choices for J2EE infrastructure than ever before, and from what I've experienced, there are many more J2EE success stories. Second, these "alternative" frameworks absolutely do not "reinvent the wheel in open source." They all build on standard J2EE services to improve developer productivity; they are not replacements for the platform.

Indeed, Spring and Hibernate - the leading so-called "alternative" frameworks - are challenging the J2EE programming model while embracing the J2EE technology platform. This is a critical distinction. With Spring particularly, you get the power and maturity of the J2EE stack with a simplicity comparable to that of the .NET programming model. And you get it all with less cost and considerably more business leverage (choice).

How is this possible? When you step back and look at J2EE, there is a lot to it. J2EE consists of:

  1. A standard set of enterprise services addressing typical infrastructure needs, including transaction management (JTA), dynamic user content (servlets/JSP/JSF), database access (JDBC), service lookup (JNDI), asynchronous messaging (JMS), management (JMX), and remoting (RMI/Web services).
  2. A standard programming model for tapping into the power of the above services, gluing the individual pieces together into a consistent software delivery platform.
We love the enterprise services. Nobody can match the power and maturity of the J2EE technology stack. However, today's pragmatic developer isn't so keen on the EJB programming model that builds on these services. It's complex, invasive, dated, and overengineered for most developer needs. It's not consistent. I have to jump through hoops just to run my business logic in different environments, for example. I incur the costs of JTA when all my application needs is a single database. The list of unnecessary costs goes on.
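To make the invasiveness complaint concrete, here is a sketch (the class and its pricing rule are invented for illustration, not taken from any real project) of business logic written as a plain Java class, the style the "alternative" frameworks encourage: no EJB home or remote interfaces, no container callbacks, no JNDI lookups.

```java
import java.util.HashMap;
import java.util.Map;

// A business rule as a plain Java class: no EJB interfaces, no JNDI lookups,
// no container plumbing. It runs identically in a unit test, a servlet
// container, or a full application server.
class PricingService {
    private final Map<String, Double> catalog = new HashMap<>();

    void addProduct(String sku, double unitPrice) {
        catalog.put(sku, unitPrice);
    }

    // The actual domain logic: a simple quantity discount.
    double priceFor(String sku, int quantity) {
        double total = catalog.getOrDefault(sku, 0.0) * quantity;
        return quantity >= 10 ? total * 0.9 : total;
    }
}
```

Because nothing here references infrastructure, the same class can back a servlet today and a message-driven entry point tomorrow without being rewritten.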

Enter the "alternative" frameworks. Spring, in particular, has given our community a framework that not only makes it easier to tap into the power of J2EE, but captures best practices on what services to use when given your business requirements. The result? More developers, architects, and managers are getting smarter about the infrastructure they need for the given job at hand. Developer productivity is up.

For over a year I've personally leveraged the Spring Framework as the base architecture for my development projects. I now treat J2EE infrastructure as a separate concern, one fully decoupled from the business logic ("core meat") of my application. Spring gives me the power to choose which deployment environment and technologies are most appropriate given the complexity of the domain problem at hand. That puts me in command - if all I need is a Web container to power a Web app with a single data source, a solution like Tomcat is the most cost-effective choice. For middleware-intensive applications that require messaging, global transactions, and remoting, a higher-end application server is worth the investment. In all cases, my programming model stays simple and consistent, grounded in my customer's problem domain.
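Treating infrastructure as a separate concern can be sketched in plain Java (all names here are hypothetical, invented for illustration): the business class depends only on an interface, so whether the implementation behind it is an in-memory stub, plain JDBC against a Tomcat-managed DataSource, or a full ORM becomes a deployment decision rather than a code change.

```java
import java.util.ArrayList;
import java.util.List;

// The data-access contract the business layer depends on.
interface CustomerRepository {
    void save(String name);
    List<String> findAll();
}

// An in-memory implementation: enough for unit tests or a simple deployment.
// A JDBC- or Hibernate-backed class could implement the same interface.
class InMemoryCustomerRepository implements CustomerRepository {
    private final List<String> customers = new ArrayList<>();
    public void save(String name) { customers.add(name); }
    public List<String> findAll() { return new ArrayList<>(customers); }
}

// Business logic receives its repository from the outside (constructor
// injection); it never performs a JNDI lookup or imports a container API.
class CustomerService {
    private final CustomerRepository repository;

    CustomerService(CustomerRepository repository) {
        this.repository = repository;
    }

    void register(String name) {
        if (name == null || name.trim().isEmpty()) {
            throw new IllegalArgumentException("name required");
        }
        repository.save(name.trim());
    }

    int customerCount() {
        return repository.findAll().size();
    }
}
```

In a Spring deployment the container would construct the service and hand it the production repository; in a test, you construct it yourself with a stub.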

I can't say it enough: J2EE is better than ever - for the consumer. I read success story after success story from developers working on projects with products like Spring and Hibernate. They're leveraging them in all kinds of environments and application servers to support demands on all scales. Today is a great time to be developing enterprise applications in Java. It's a great time to be a consumer. We've got the technology, the platform, and the community - it's only going to get better. Who can stop us now?

More Stories By Keith Donald

Keith Donald is a software consultant, mentor, and Java application developer for Interface21, Limited. He's been a believer in The Spring Framework since he started leveraging it in real projects in July 2003.

Comments (14)


Most Recent Comments
Sean Warburton 01/24/05 03:27:30 PM EST

Edgar Dollin 12/25/04 09:57:29 AM EST


The only thing one can gather from this much dialogue is that the current state of the art in open source Java development frameworks is inadequate and needs improvement.

I know Struts is a joke and will go away. Hibernate looks like a good tool, but the lack of standardization and of a real outside understanding of what it is holds it back. JSF is an attempt by the tool makers to force developers into paying for tools, and a belated attempt to compete with .NET and Microsoft, who has it close to correct and is held in contempt with good reason.

Doesn't look like there is going to be consensus on this issue for quite some time. The problem is that there are so many good answers and so many people with working solutions. Let's hope that the 'right' solution survives the melee.

Doug Smith 12/17/04 04:39:51 PM EST

Elevator pitch? Maybe on the Petronas Towers elevators, stopping at each floor for a minute. ;-)

Bill Watson 12/17/04 02:51:21 PM EST

This seems to be coming up more and more frequently - ColdFusion developers being asked to defend ColdFusion against a planned move to Java and J2EE. And so, in case you end up in this situation, this is what you need to know.

For starters, any suggestion of "we need to stop using ColdFusion because we are going to use Java" demonstrates a complete lack of understanding of what exactly ColdFusion is. So, let's start with a brief explanation of the ColdFusion-Java relationship.

Applications written in ColdFusion (as of ColdFusion MX) are pure Java. Or, expressed slightly differently, ColdFusion runs on a J2EE server (either embedded, or one of your choosing) running a Sun-verified Java application (the ColdFusion engine), executing Java bytecode (compiled from your CFML source code). In other words, CFML (the code you write) is a developer-time consideration, not a run-time consideration. There is no CFML at runtime; at runtime you are executing pure Java, no more or less so than had you written the application in straight Java. Your ColdFusion application is a Java application; if you deploy a ColdFusion application what you have deployed is Java. It's as simple as that.

This means that the assertion that ColdFusion and Java are somehow mutually exclusive is just flat out incorrect. But what about the CFML code you write? Isn't that ColdFusion specific and not pure Java? And isn't that an issue? I don't think so. There is an entire industry of Java add-ons out there - tools, tags, language extensions, and more - and Java shops use these (as they should; after all, why reinvent the wheel?). If your Java code leverages third-party add-ons for reporting, or back-end integration, or charting, or ... does that make your code any less Java? Nope, not at all.

Experienced developers know that starting from step one is expensive and seldom makes sense, regardless of the language and platform. Experienced developers have toolboxes at their disposal, stuff they can leverage and reuse to be as productive as possible. Experienced developers write modular applications, separating logic and processing and presentation into tiers, allowing these to evolve independently of each other, even allowing them to be inserted or removed independently.

For Java developers, one of these tools should be ColdFusion. After all, why write dozens of lines of Java code to connect to a database when a single tag can accomplish the exact same thing (likely using the same code internally)? And why write lots of code to send an SMTP message using JavaMail APIs when a single tag can do it for you (again, using those same APIs)? You can think of ColdFusion as a bunch of prewritten Java code, stuff you can use so as to hit the ground running. And that makes your app no less Java than if you had done all the work manually.
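For comparison, this is roughly the plain-JDBC ceremony that a single &lt;cfquery&gt; tag wraps up: obtain a statement, bind parameters, walk the result set, and release the resources (the table and column names here are made up for illustration).

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;

// The hand-written JDBC equivalent of one query tag: prepare, bind, iterate,
// and clean up (try-with-resources closes the statement and result set).
class EmployeeQuery {
    static final String SQL = "SELECT name FROM employees WHERE dept = ?";

    static List<String> namesInDept(Connection conn, String dept) throws SQLException {
        List<String> names = new ArrayList<>();
        try (PreparedStatement ps = conn.prepareStatement(SQL)) {
            ps.setString(1, dept);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    names.add(rs.getString("name"));
                }
            }
        }
        return names;
    }
}
```

Multiply this by every query, mail message, and file upload in an application and the productivity argument makes itself.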

However, some may counter that CFML is proprietary, and that the fact that you need to pay for an engine to execute your code somehow makes it non-Java. I have actually heard this from customers. So is this a valid issue? Again, I don't think so. For starters, paid does not equal proprietary. After all, these same customers do not balk at spending big bucks on their J2EE servers (and management tools and professional services and more). Furthermore, there are indeed third-party CFML engines out there. I am not going to comment on how good they are and how viable an alternative they are - that's irrelevant. What is relevant is that they exist, and that means that CFML is not single-vendor or proprietary.

Great, so ColdFusion simplifies Java development, and ColdFusion applications are no less Java than applications written in low-level Java directly. But simplicity and abstractions require sacrificing power, right? Wrong! ColdFusion applications can (and should) leverage Java; Java APIs, Java classes, JavaBeans, JSP tags, you name it, ColdFusion can leverage it because ColdFusion itself is Java. It's that simple.

So, ColdFusion or Java? The answer should be yes, ColdFusion is Java, and Java development can benefit from ColdFusion. This is not an either/or proposition, it's a "you can have it all so why the heck would you want to do it any other way?" proposition.

... ASP
Microsoft ASP has been an important player in Web application scripting since, well, since a year or so after the introduction of ColdFusion. From an application development process viewpoint, ColdFusion and ASP are not that different. Both are script based, both are very page centric, both embed server-side processing and client-side presentation code in source files, and both are implemented as HTTP server add-ons (ASP via ISAPI, ColdFusion via ISAPI and more). ASP and ColdFusion can coexist, and indeed, as most ColdFusion deployments are on Windows, the likelihood of ASP and ColdFusion coexisting (even if ASP is not used) is very high.

The ASP versus ColdFusion discussion used to come up regularly. But not anymore. Now that Microsoft has essentially abandoned any future development on classic ASP, replacing it with ASP.NET, few organizations are embarking on brand new ASP deployments. But having said that, if you do need to defend ColdFusion against ASP, here's what you need to know.

For starters, ASP capabilities are a subset of those of ColdFusion. Or put differently, ColdFusion can do anything that ASP can do, and a whole lot more too. The reverse is not true. Sure, ASP can be extended (using COM objects) to do just about anything that ColdFusion can do, but that's just it, you need to extend ASP - it's your responsibility to do so. Simple things that ColdFusion developers take for granted, like being able to generate an e-mail message, or process an uploaded file, or generate a business chart, none of those are native ASP functionality.

And this is not mere boasting, this is important, because it's the way to head off the "but ASP is free" argument. Sure, ASP is free for starters, but buy all the add-on bits you need to make it functionally equivalent to ColdFusion (even ColdFusion Standard, and even ColdFusion 3 or 4!) and you'll end up paying far more than ColdFusion costs. Sure, ASP is cheaper initially, but you get what you pay for. Or rather, you don't get what you don't pay for. And when you do pay for it, you'll end up paying a whole lot more.

And that's just looking at initial costs. ASP development is also far more time consuming than ColdFusion development. Even if you're comfortable in the languages used, you'll still have to write lots more code to get the job done. Even the execution of simple SQL statements is far more complex in ASP - one tag versus lots of lines of ugly code. More code = longer development time = costs more. Plus, more code = more complex ongoing maintenance = costs even more.

At the risk of sounding like an MBA, when you look at the total cost of ownership, ASP is not the cheaper option at all. Oh, and on top of all that, ASP is proprietary, a single vendor solution, and you're married to Windows boxes (no Linux, no Unix, no portability).

Maybe this is why, as already stated, most ColdFusion servers run on Windows - Windows boxes that likely already have ASP installed. Why? Because hundreds of thousands of developers have figured out that free can be far too expensive.

... ASP.NET
Comparing ASP.NET to ColdFusion is difficult. Actually, it's not just difficult, it's simply incorrect, and not an apples-to-apples comparison. In order to defend ColdFusion against a "we are moving to ASP.NET" claim, you (and whoever is involved in the decision making) need to take a big step back. Why? Simple, because ASP.NET is part of Microsoft's .NET solution, and ASP.NET apps take advantage of the .NET Framework and infrastructure, just like ColdFusion apps take advantage of J2EE. In other words, deciding between ColdFusion and ASP.NET (and indeed, defending ColdFusion against ASP.NET) first requires a .NET versus J2EE discussion.

J2EE and .NET are remarkably alike, both in terms of objectives and the range of their various components and systems. Of course, applications and application development with the two platforms are not alike at all; everything from tools to languages to methodologies is different. At their respective cores, both .NET and J2EE provide the building blocks and technologies needed to build applications. Security abstractions, database support, back-end integration, system level services, transactions and messaging, run-time services, and more are all provided by the platforms themselves. Both J2EE and .NET provide "safe" environments in which applications run (the JVM and CLR respectively); both J2EE and .NET support the use of different languages within these environments (although this potential has been realized to a greater degree in .NET); both have a scripting solution designed for Web applications (JSP or ColdFusion for J2EE, ASP.NET for .NET); and both are incredibly powerful and capable.

Many organizations are going through a J2EE or .NET discussion, usually independent of any discussion about ColdFusion. And there are pros and cons to both options. J2EE wins when vendor independence, openness, and portability are a priority. .NET wins when it comes to tools, a better client experience, or simply a commitment to the Microsoft way (there is more to it than that, but that's an entire column unto itself).

However, as many are discovering, J2EE versus .NET is not always an either/or proposition. In fact, lots of organizations are discovering that they need both, and that the future is decidedly heterogeneous. This is especially true for larger organizations where there's room for both, and interoperability (primarily via SOAP) makes this a workable option.

If an organization has made the strategic decision to bet its future solely on Microsoft and .NET, then they probably should use ASP.NET. Sure, ColdFusion can coexist and interoperate with the .NET world, but ASP.NET will likely be the preferred option. For organizations going the J2EE route, well, I've covered that one already. But for most organizations, ColdFusion remains compelling, leveraging the worlds of J2EE natively and .NET via SOAP. In fact, some organizations have discovered that ColdFusion is the simplest way to create client applications that talk to both J2EE and .NET back ends, if that is needed.

So, ColdFusion or ASP.NET? That depends on what your IT future looks like. And unless the future is Microsoft and Windows only, ColdFusion remains an important cog in the IT engine.

... PHP
PHP is not one that comes up often; there is not a significant overlap between PHP developers and ColdFusion developers. But, in the interests of presenting the complete story, here is what you need to know.

PHP is also script based. Pages contain code that is processed by the PHP engine. The PHP engine itself is open source, and the PHP language uses a syntax borrowed from C, Java, and Perl (the latter is important, as PHP is particularly popular with former Perl developers). PHP runs on all sorts of systems, and uses functions and APIs for all sorts of processing.

PHP's stated goal (as per the official PHP FAQ) is to "allow Web developers to write dynamically generated pages quickly." Ironically, developers with both CFML and PHP experience will tell you that while PHP development may be quicker than using C or Java, it does not come close to that of ColdFusion and CFML.

There is no refuting PHP's power; PHP developers have an impressive and rich set of functions and language extensions available, and that is definitely part of PHP's appeal. But this power comes at a cost. It takes a lot more PHP code to do what simple CFML tags do, and as explained before, more code = greater cost. If you truly need that power, then PHP may indeed be an option worth considering (although CFML+Java would likely give you the same power and more). But if what you really need is to "write dynamically generated pages quickly," then PHP leaves much to be desired.

One of the most compelling arguments for PHP is its cost: there is none. But, as explained earlier, the cost of software is only one part of the total cost of application development, arguably the least significant cost in the long run. It is for this reason that despite being incredibly popular with developers and consulting shops, corporate users have been slow to adopt PHP as a development platform (which in turn is why the PHP versus ColdFusion discussion comes up so infrequently).

The bottom line is if you really need the power and flexibility of PHP and can justify the longer development cycles and more code to manage and maintain it, then PHP should seriously be considered. But if time to completion and total cost of ownership are an issue, then ColdFusion wins hands down.

And there you have it, the elevator-pitch arguments needed to defend ColdFusion, if the need so arises. Of course, the ultimate defense is results, and delivering results is something ColdFusion developers have proven themselves to be incredibly good at.

Alexander 12/10/04 11:02:45 AM EST

Keith, "nonsense" may indeed be too strong a word. In most cases it is a good technical decision to have POJOs that don't call container APIs. But that is neither a special feature of IoC containers nor does it make your system and development organisation as a whole less dependent on the container. It just moves the dependency to a different place. It's a pattern that everyone has been free to use within EJB and I have been using it ever since. To have complex business rules or algorithms directly call into EJB APIs so you can't test them independently or use them in other kinds of systems is just bad design. It's trivial not to do that. To throw a standardised architecture overboard and instead introduce a whole new non-standard framework just to support this particular pattern is, in my view, a disproportionate overreaction.

Regarding backward compatibility of open source: Well, there's a mixed picture, I would say. There are indeed a lot of open source packages that have a good history there. But the most prominent example of open source, Linux, is just horrendous in that respect. Frankly, it's ridiculous that after so many years of debate about the fragmentation of Unix, and many attempts to correct it, along comes a Unix clone that itself fragments into countless incompatible distributions.

Just as it is a good technical decision to not link business rules to framework APIs, it is an equally good decision to separate specification from implementation. Where is the Spring specification? Where is the Hibernate spec? Which vendors or open source groups or standards organisations support these specs?

Doug Smith 12/10/04 08:59:49 AM EST

I love to read these discussions. As a solo practitioner, I have found POJO, Servlet/JSP and JDBC to be good enough to build the scale of applications my clients need. I am intrigued and frustrated by the abundance of excellent frameworks to use, but have no one to discuss the pros and cons with. I tend to agree with the open standards POV.

My 20 years of IBM midrange (System/38, AS/400, iSeries) has taught me the value to the client (customer, not PC) of having a framework that persists over time. Every consultant going into a midrange shop knew exactly how to use the editor, screen designer, database, interactive sessions, batch sessions, and so on. Yes, it was a very limited architecture, but sufficient for the era, and surprisingly capable today as a database machine or JVM host.

How much good does it really do if every Java shop has a unique architecture du jour, creating a barrier for consultant productivity on Day One? Let a thousand blossoms bloom, but let reason prevail. POJO will be with us ten years from now. Will your favorite framework still be used?

Keith Donald 12/10/04 08:54:18 AM EST

Your point about the costs to train a development organization is a good one.

But my points are not nonsense. There are real savings to be had in minimizing dependencies on infrastructure APIs in domain-specific Java code. There is real leverage in abstracting dependencies on a particular service like Hibernate behind a logical layer of data access interfaces, for example. Not only does it produce cleaner, more maintainable code, it makes it easier to replace any one layer without impacting the others.

My point is Spring and Hibernate make it easier to write cleaner code that is consistently simple, easy to test, and cohesive (focused on domain problems). On my development team, not everyone has to know everything about Spring or Hibernate - only a few key developers need to know them inside and out, and the others still get their jobs done. They can do this because Spring makes it easy for developers to specialize. And you don't have to ask me, ask the many who have already built Spring-powered apps: Spring promotes well-layered applications that are structured into manageable, independently testable components.

Re: backwards compatibility. Overall I think open source has a good record of backwards compatibility. Would you agree? There are certainly examples where specifications don't. I know for certain Spring will continue to be backwards compatible - first because our community is large and committed to that, and second because Spring is backed by a commercial interest. But again, the risk is mitigated when most of my code is _not_ dependent on Spring. If it is dependent on Spring - for example, because I'm using the Web MVC framework - it's a choice I make because it's a good value-add solution in which my team, given their skills and knowledge, can be most productive. I completely agree with you on that point: you have to take into account the skills of your team and the cost to retrain, and weigh that against the benefits acquired, for any technology (Spring is no exception). It's my opinion that, because Spring helps my developers write better code and integrates well with a number of standards-based and good open source/commercial solutions (as an effective technology integration platform), the benefits are real and quick (perhaps broad, perhaps piecemeal - both strategies work) and outweigh the initial training costs.

Re: fashion. Spring is my base moving forward. I've come to trust it as a stable foundation whose architects work hard to integrate the best and most volatile of what's out there, hedging my bets on infrastructure by providing higher-layer abstractions that give me more leverage. EJB 3.0 will come and we'll offer support for it, just as we support EJB 2.1 now. The programming model we see today in Spring will become more pervasive, and ultimately that's very good for Java. It's going to open the productivity door for a lot more developers to solve a whole array of interesting problems faster.

Alexander 12/10/04 05:32:56 AM EST

You say: "POJOs configured and decorated by Spring and persisted by Hibernate are done so without being coupled with those respective APIs (or any infrastructure API for that matter)."

I'm sorry, but that's complete nonsense. The question is not if POJOs are coupled to anything. The question is what do developers have to do to make the application work? What knowledge do they need? And the answer is, they have to know the Spring architecture, all the Spring configuration stuff and the Hibernate OR-mappings. This _IS_ the API. The API is everything developers have to know to make the system work. So the question is what developers become coupled to, not what POJOs are coupled to.

It's no consolation when you say, with Spring, you only have to know the parts that you need. For one, it's not true, because there is always the general architecture of the framework that you have to know. And then of course a framework is there to be used. If I don't need any of the features of Spring, I don't need Spring in the first place. If I need a particular feature, I get coupled to it. There's no amount of smoke screens that can make me believe otherwise. And Spring and Hibernate are products, not specifications. So, if I get coupled to a particular product, why would I not choose .NET then? Or the Oracle DBMS, which is a platform in its own right? Both have much better tool support than Spring or Hibernate. Both are very productive. Both are supported by vendors that won't go away soon and are committed to some degree of backwards compatibility. And both cost much less than retraining a whole development organisation to the most fashionable open source framework once every two years.

Alexander 12/10/04 04:55:35 AM EST

I think your distinction between technology platform and programming model is a red herring. For a development organisation, there is only one thing that counts and that is the API. An EJB/CMP developer cannot use Spring/Hibernate without retraining and a year of experience. Neither Spring nor Hibernate are standards. There is a huge number of similar frameworks and an even greater number of MVC frameworks on top of that. Let's face it, what we see is fragmentation. If that turns out to be a good thing or a bad thing depends on the ability of all involved to build a new standardisation process that works better than the JCP. It's not enough to say, hey look we have so much choice isn't that fantastic? It would be fantastic if what we were talking about wasn't a platform but a range of cheese in the supermarket.

Jay Toee 12/10/04 02:08:40 AM EST

I really love the persistence model of EJBs, and specifically CMP2. Open source frameworks are often used in conjunction with open source app servers. One reason more frameworks are popping up may be that JBoss' CMP engine can't handle that much traffic, so the alternatives may be more appealing.

Keith Donald 12/09/04 11:18:27 PM EST

One note re: open frameworks like Spring and Hibernate 'not being standard'. With these two, the question becomes: what is there to standardize? POJOs configured and decorated by Spring and persisted by Hibernate are done so without being coupled to those respective APIs (or any infrastructure API, for that matter). Indeed, POJOs stay decoupled from infrastructure, which makes them easily adaptable and runnable in different environments (from test, to standalone, to enterprise). This really does give you the ability to make infrastructure a choice, not a mandate, and makes it easy to adapt and scale up as requirements change. This model is the future, no doubt about it.
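As a rough illustration of that external configuration (the bean names and application classes below are hypothetical; LocalSessionFactoryBean is the actual Spring 1.x support class), the wiring lives in a Spring XML file rather than in the POJOs themselves:

```xml
<!-- Hypothetical Spring 1.x wiring: collaborators are injected here, so the
     Java classes themselves import nothing from Spring or Hibernate. -->
<beans>
  <!-- Hibernate session factory, built from a DataSource defined elsewhere. -->
  <bean id="sessionFactory"
        class="org.springframework.orm.hibernate.LocalSessionFactoryBean">
    <property name="dataSource"><ref bean="dataSource"/></property>
  </bean>

  <!-- A Hibernate-backed repository (hypothetical application class). -->
  <bean id="orderRepository" class="example.HibernateOrderRepository">
    <property name="sessionFactory"><ref bean="sessionFactory"/></property>
  </bean>

  <!-- The plain-Java service, handed its repository by the container. -->
  <bean id="orderService" class="example.OrderService">
    <property name="orderRepository"><ref bean="orderRepository"/></property>
  </bean>
</beans>
```

Swap the repository bean for a JDBC-backed one and the service class never changes - which is the whole point of the argument above.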


Harrison Li 12/09/04 09:15:30 PM EST

I'd just like to add some comments about company profits and J2EE. In the early stage of J2EE, application server vendors like BEA were the major force pushing the J2EE standard. Now things have changed: big business application vendors such as SAP have started to play that role. The reason is very simple - those big companies have invested significantly in J2EE technology by developing a new generation of business application products. If you were familiar with SAP, you would have seen the vast changes that have happened in product development on top of J2EE in the last 3-4 years. SAP, IBM, etc. have to continue to push J2EE technology forward because J2EE has become the foundation of their products. I don't worry about the future of J2EE technology. Instead, I do worry about how those "pure" J2EE application server vendors could survive if they continue to be server vendors only. Thanks.

Dave 12/09/04 08:26:54 PM EST

I like the programming models that open source projects like Spring and Hibernate give you.

The problem with frameworks like these, however, is that open source is good, but open *standards* are better. I'd be much more likely to use Spring if there were a Spring JSR, and I'd much rather take my pick from 20+ open source and commercial implementations of JDO than tie my application to the proprietary Hibernate API.

Look at Struts, for instance - great framework, but in time it's going to go the way of the dodo, replaced completely by JSF, because JSF is a *standard*, and there are multiple open source and commercial vendors I can go to for my JSF implementation. The current Hibernate API will be legacy code too once everyone goes to EJB 3 / JDO.

Open source is great, but open standards are much better...

David 12/09/04 07:41:38 PM EST

J2EE has simply gotten so bloated and complex that few can really grasp it all. And since J2EE is the foundation, that's not good. Our applications are complicated enough without all of that baggage. The simpler frameworks, including simpler J2EE subsets such as JSP/servlet + JDBC, are enough for many applications. Teams can always add JavaMail if they do email, or RMI if they need remote invocation, and they can even venture into JMS should that become necessary. However, most of those are not needed by a large number of applications, and the simpler frameworks will help put a stop to managers who choose full J2EE when the simpler stuff will make it happen faster and cheaper.
