|By Keith Donald|
|December 8, 2004 12:00 AM EST|
By now you've probably either heard about or read the analyst report from the Burton Group entitled "J2EE in Jeopardy."
In summary, the claim is that J2EE as a standard is in danger due to several market forces, and the report offers recommendations for end users and vendors alike:
- Market commoditization: Open source players like Apache, JBoss, and ObjectWeb are commoditizing the platform, making it harder for vendors to profit from J2EE server licenses. If vendors can't make money on J2EE, they won't want to continue to invest in the specification.
- "Disruptive" technologies: In the last year the complexity of J2EE's programming model - EJB - has been challenged. Simpler, more productive models have emerged from within the open source community and have garnered widespread acceptance in a short time. Credited innovators here include the Spring Framework and Hibernate ORM.
- End users should turn over responsibility for generic J2EE infrastructure to proven providers in the open source world. You shouldn't abandon J2EE, but should consider some of the "alternative" frameworks in the open source community.
- Vendors should focus on building a "J2EE super platform." Basically, innovate in areas where open source hasn't already preempted competition through commoditization.
On the one hand, it's great to see the work of the Spring, Hibernate, Apache, and other quality open source teams getting endorsement and credibility. The cat is out of the proverbial bag: open source is a force of innovation to be respected and certainly not underestimated. Outsourcing infrastructure to proven open source providers is a very effective strategy for companies looking to deliver working software better/faster/cheaper. Complementing quality open source offerings with strategic commercial products and services is an effective model for infrastructure providers looking to further penetrate the market.
On the other hand, I feel the casual reader might come away with a bad spin on what's happening here. I can see it now - a corporate manager faced with a major technology investment decision happens across this article (or others like it) and concludes J2EE is in chaos. Or worse, concludes the open source community is out building flavor-of-the-month "alternative frameworks" that "reinvent the wheel" because the "standard" platform doesn't cut it. Not exactly the impression we want to make to grow the enterprise Java market.
This is where our community must step in and set that manager straight. First, J2EE is not in a state of chaos. There are simply more good choices for J2EE infrastructure than ever before, and from what I've experienced, there are many more J2EE success stories. Second, these "alternative" frameworks absolutely do not "reinvent the wheel in open source." They all build on standard J2EE services to improve developer productivity; they are not replacements for the platform.
Indeed, Spring and Hibernate - the leading so-called "alternative" frameworks - are challenging the J2EE programming model while embracing the J2EE technology platform. This is a critical distinction. With Spring particularly, you get the power and maturity of the J2EE stack with a simplicity comparable to that of the .NET programming model. And you get it all with less cost and considerably more business leverage (choice).
How is this possible? When you step back and look at J2EE, there is a lot to it. J2EE consists of:
- A standard set of enterprise services addressing typical infrastructure needs, including transaction management (JTA), dynamic user content (servlets/JSP/JSF), database access (JDBC), service lookup (JNDI), asynchronous messaging (JMS), management (JMX), and remoting (RMI/Web services).
- A standard programming model for tapping into the power of the above services, gluing the individual pieces together into a consistent software delivery platform.
Enter the "alternative" frameworks. Spring, in particular, has given our community a framework that not only makes it easier to tap into the power of J2EE, but captures best practices on what services to use when given your business requirements. The result? More developers, architects, and managers are getting smarter about the infrastructure they need for the given job at hand. Developer productivity is up.
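To make that concrete, much of Spring's leverage comes from the template-and-callback idiom: the framework owns the acquire/use/release ceremony around a resource, and the developer supplies only the interesting work. The following is a hypothetical plain-Java sketch of that idiom - no Spring classes, all names invented - not Spring's actual API:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of the template-and-callback idiom that helpers
// like Spring's JdbcTemplate are built on: the template owns resource
// acquisition, cleanup, and error handling; callers supply a callback.
public class TemplateSketch {

    // Stand-in for an expensive managed resource (think JDBC Connection).
    static class Resource {
        final List<String> log;
        Resource(List<String> log) { this.log = log; log.add("open"); }
        String read() { log.add("read"); return "data"; }
        void close() { log.add("close"); }
    }

    interface Callback<T> {
        T doWith(Resource resource);
    }

    // The template: acquire, delegate, and always release. The try/finally
    // boilerplate is written once here, not repeated in every caller.
    static <T> T execute(List<String> log, Callback<T> action) {
        Resource r = new Resource(log);
        try {
            return action.doWith(r);
        } finally {
            r.close();
        }
    }

    public static void main(String[] args) {
        List<String> log = new ArrayList<>();
        String result = execute(log, r -> r.read().toUpperCase());
        System.out.println(result); // DATA
        System.out.println(log);    // [open, read, close]
    }
}
```

Spring's JdbcTemplate, HibernateTemplate, and JmsTemplate all follow this shape, which is why duplicated try/finally blocks and leaked connections largely disappear from application code.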
For over a year I've personally leveraged the Spring Framework as the base architecture for my development projects. I now treat J2EE infrastructure as a separate concern, one fully decoupled from the business logic ("core meat") of my application. Spring gives me the power to choose which deployment environment and technologies are most appropriate given the complexity of the domain problem at hand. That puts me in command - if all I need is a Web container to power a Web app with a single data source, a solution like Tomcat is the most cost-effective choice. For middleware-intensive applications that require messaging, global transactions, and remoting, a higher-end application server is worth the investment. In all cases, my programming model stays simple and consistent, grounded in my customer's problem domain.
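The decoupling described above can be sketched in a few lines of plain Java. The names below are invented for illustration, and the wiring is done by hand where Spring would do it from external configuration, but the shape is the point: business logic depends only on an interface it owns, never on container or persistence APIs.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch (names invented) of the separation Spring
// encourages: the business POJO knows nothing about JDBC, Hibernate,
// JNDI, or any container - only about an interface of its own.
public class DecouplingSketch {

    // Infrastructure-facing contract. A production wiring might back this
    // with JDBC or Hibernate; Spring would inject that choice from
    // configuration rather than from code.
    interface RateRepository {
        double rateFor(String customer);
    }

    // Pure business logic: a plain Java object, trivially testable
    // anywhere a JVM runs - no application server required.
    static class InvoiceService {
        private final RateRepository rates;
        InvoiceService(RateRepository rates) { this.rates = rates; }
        double invoice(String customer, int hours) {
            return rates.rateFor(customer) * hours;
        }
    }

    public static void main(String[] args) {
        // Standalone/test environment: an in-memory stub stands in for
        // the real data source.
        Map<String, Double> table = new HashMap<>();
        table.put("acme", 75.0);
        InvoiceService service = new InvoiceService(table::get);
        System.out.println(service.invoice("acme", 10)); // 750.0
    }
}
```

Swapping Tomcat for a full application server, or an in-memory stub for Hibernate, touches only the wiring - `InvoiceService` itself never changes.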
I can't say it enough, J2EE is better than ever - for the consumer. I read success story after success story from developers working on projects with products like Spring and Hibernate. They're leveraging them in all kinds of environments and application servers to support demands on all scales. Today is a great time to be developing enterprise applications in Java. It's a great time to be a consumer. We've got the technology, the platform, and the community - it's only going to get better. Who can stop us now?
|Sean Warburton 01/24/05 03:27:30 PM EST|
|Edgar Dollin 12/25/04 09:57:29 AM EST|
The only thing one can gather from this much dialogue is that the current state of the art in open source Java development frameworks is inadequate and needs improvement.
I know Struts is a joke and will go away. Hibernate looks like a good tool, but the lack of standardization and of any real outside understanding of what it is holds it back. JSF is an attempt by the tool makers to force developers into paying for tools, and a belated attempt to compete with .NET; and Microsoft, who has it close to correct, is held in contempt with good reason.
Doesn't look like there is going to be consensus on this issue for quite some time. The problem is that there are so many good answers and so many people with working solutions. Let's hope that the 'Right' solution survives the melee.
|Doug Smith 12/17/04 04:39:51 PM EST|
Elevator pitch? Maybe on the Petronas Towers elevators, stopping at each floor for a minute. ;-)
|Bill Watson 12/17/04 02:51:21 PM EST|
This seems to be coming up more and more frequently - ColdFusion developers being asked to defend ColdFusion against a planned move to Java and J2EE. And so, in case you end up in this situation, this is what you need to know.
For starters, any suggestion of "we need to stop using ColdFusion because we are going to use Java" demonstrates a complete lack of understanding of what exactly ColdFusion is. So, let's start with a brief explanation of the ColdFusion-Java relationship.
Applications written in ColdFusion (as of ColdFusion MX) are pure Java. Or, expressed slightly differently, ColdFusion runs on a J2EE server (either embedded, or one of your choosing) running a Sun-verified Java application (the ColdFusion engine), executing Java bytecode (compiled from your CFML source code). In other words, CFML (the code you write) is a developer-time consideration, not a run-time consideration. There is no CFML at runtime; at runtime you are executing pure Java, no more or less so than had you written the application in straight Java. Your ColdFusion application is a Java application; if you deploy a ColdFusion application what you have deployed is Java. It's as simple as that.
This means that the assertion that ColdFusion and Java are somehow mutually exclusive is just flat out incorrect. But what about the CFML code you write? Isn't that ColdFusion specific and not pure Java? And isn't that an issue? I don't think so. There is an entire industry of Java add-ons out there - tools, tags, language extensions, and more - and Java shops use these (as they should; after all, why reinvent the wheel?). If your Java code leverages third-party add-ons for reporting, or back-end integration, or charting, or ... does that make your code any less Java? Nope, not at all.
Experienced developers know that starting from step one is expensive and seldom makes sense, regardless of the language and platform. Experienced developers have toolboxes at their disposal, stuff they can leverage and reuse to be as productive as possible. Experienced developers write modular applications, separating logic and processing and presentation into tiers, allowing these to evolve independently of each other, even allowing them to be inserted or removed independently.
For Java developers, one of these tools should be ColdFusion. After all, why write dozens of lines of Java code to connect to a database when a single tag can accomplish the exact same thing (likely using the same code internally)? And why write lots of code to send an SMTP message using JavaMail APIs when a single tag can do it for you (again, using those same APIs)? You can think of ColdFusion as a bunch of prewritten Java code, stuff you can use so as to hit the ground running. And that makes your app no less Java than if you had done all the work manually.
However, some may counter that CFML is proprietary, and that the fact that you need to pay for an engine to execute your code somehow makes it non-Java. I have actually heard this from customers. So is this a valid issue? Again, I don't think so. For starters, paid does not equal proprietary. After all, these same customers do not balk at spending big bucks on their J2EE servers (and management tools and professional services and more). Furthermore, there are indeed third-party CFML engines out there. I am not going to comment on how good they are and how viable an alternative they are - that's irrelevant. What is relevant is that they exist, and that means that CFML is neither single-vendor nor proprietary.
Great, so ColdFusion simplifies Java development, and ColdFusion applications are no less Java than applications written in low-level Java directly. But simplicity and abstractions require sacrificing power, right? Wrong! ColdFusion applications can (and should) leverage Java; Java APIs, Java classes, JavaBeans, JSP tags, you name it, ColdFusion can leverage it because ColdFusion itself is Java. It's that simple.
So, ColdFusion or Java? The answer should be yes, ColdFusion is Java, and Java development can benefit from ColdFusion. This is not an either/or proposition, it's a "you can have it all so why the heck would you want to do it any other way?" proposition.
The ASP versus ColdFusion discussion used to come up regularly. But not anymore. Now that Microsoft has essentially abandoned any future development on classic ASP, replacing it with ASP.NET, few organizations are embarking on brand new ASP deployments. But having said that, if you do need to defend ColdFusion against ASP, here's what you need to know.
For starters, ASP capabilities are a subset of those of ColdFusion. Or put differently, ColdFusion can do anything that ASP can do, and a whole lot more too. The reverse is not true. Sure, ASP can be extended (using COM objects) to do just about anything that ColdFusion can do, but that's just it, you need to extend ASP - it's your responsibility to do so. Simple things that ColdFusion developers take for granted, like being able to generate an e-mail message, or process an uploaded file, or generate a business chart, none of those are native ASP functionality.
And this is not mere boasting, this is important, because it's the way to head off the "but ASP is free" argument. Sure, ASP is free for starters, but buy all the add-on bits you need to make it functionally equivalent to ColdFusion (even ColdFusion Standard, and even ColdFusion 3 or 4!) and you'll end up paying far more than ColdFusion costs. Sure, ASP is cheaper initially, but you get what you pay for. Or rather, you don't get what you don't pay for. And when you do pay for it, you'll end up paying a whole lot more.
And that's just looking at initial costs. ASP development is also far more time consuming than ColdFusion development. Even if you're comfortable in the languages used, you'll still have to write lots more code to get the job done. Even the execution of simple SQL statements is far more complex in ASP - one tag versus lots of lines of ugly code. More code = longer development time = costs more. Plus, more code = more complex ongoing maintenance = costs even more.
At the risk of sounding like an MBA, when you look at the total cost of ownership, ASP is not the cheaper option at all. Oh, and on top of all that, ASP is proprietary, a single vendor solution, and you're married to Windows boxes (no Linux, no Unix, no portability).
Maybe this is why, as already stated, most ColdFusion servers run on Windows, Windows boxes that likely already have ASP installed. Why? Because hundreds of thousands of developers have figured out that free can be far too expensive.
J2EE and .NET are remarkably alike, both in terms of objectives and the range of their various components and systems. Of course, applications and application development with the two platforms are not alike at all; everything from tools to languages to methodologies is different. At their respective cores, both .NET and J2EE provide the building blocks and technologies needed to build applications. Security abstractions, database support, back-end integration, system level services, transactions and messaging, run-time services, and more are all provided by the platforms themselves. Both J2EE and .NET provide "safe" environments in which applications run (the JVM and CLR respectively); both J2EE and .NET support the use of different languages within these environments (although this potential has been realized to a greater degree in .NET); both have a scripting solution designed for Web applications (JSP or ColdFusion for J2EE, ASP.NET for .NET); and both are incredibly powerful and capable.
Many organizations are going through a J2EE or .NET discussion, usually independent of any discussion about ColdFusion. And there are pros and cons to both options. J2EE wins when vendor independence, openness, and portability are a priority. .NET wins when it comes to tools, a better client experience, or simply a commitment to the Microsoft way (there is more to it than that, but that's an entire column unto itself).
However, as many are discovering, J2EE versus .NET is not always an either/or proposition. In fact, lots of organizations are discovering that they need both, and that the future is decidedly heterogeneous. This is especially true for larger organizations where there's room for both, and interoperability (primarily via SOAP) makes this a workable option.
If an organization has made the strategic decision to bet its future solely on Microsoft and .NET, then they probably should use ASP.NET. Sure, ColdFusion can coexist and interoperate with the .NET world, but ASP.NET will likely be the preferred option. For organizations going the J2EE route, well, I've covered that one already. But for most organizations, ColdFusion remains compelling, leveraging the worlds of J2EE natively and .NET via SOAP. In fact, some organizations have discovered that ColdFusion is the simplest way to create client applications that talk to both J2EE and .NET back ends, if that is needed.
So, ColdFusion or ASP.NET? That depends on what your IT future looks like. And unless the future is Microsoft and Windows only, ColdFusion remains an important cog in the IT engine.
PHP is also script based. Pages contain code that is processed by the PHP engine. The PHP engine itself is open source, and the PHP language uses a syntax borrowed from C, Java, and Perl (the latter is important, as PHP is particularly popular with former Perl developers). PHP runs on all sorts of systems, and uses functions and APIs for all sorts of processing.
PHP's stated goal (as per the official PHP FAQ) is to "allow Web developers to write dynamically generated pages quickly." Ironically, developers with both CFML and PHP experience will tell you that while PHP development may be quicker than using C or Java, it does not come close to that of ColdFusion and CFML.
There is no refuting PHP's power; PHP developers have an impressive and rich set of functions and language extensions available, and that is definitely part of PHP's appeal. But this power comes at a cost. It takes a lot more PHP code to do what simple CFML tags do, and as explained before, more code = greater cost. If you truly need that power, then PHP may indeed be an option worth considering (although CFML+Java would likely give you the same power and more). But if what you really need is to "write dynamically generated pages quickly," then PHP leaves much to be desired.
One of the most compelling arguments for PHP is its cost: there is none. But, as explained earlier, the cost of software is only one part of the total cost of application development, arguably the least significant cost in the long run. It is for this reason that despite being incredibly popular with developers and consulting shops, corporate users have been slow to adopt PHP as a development platform (which in turn is why the PHP versus ColdFusion discussion comes up so infrequently).
The bottom line is if you really need the power and flexibility of PHP and can justify the longer development cycles and more code to manage and maintain it, then PHP should seriously be considered. But if time to completion and total cost of ownership are an issue, then ColdFusion wins hands down.
|Alexander 12/10/04 11:02:45 AM EST|
Keith, "nonsense" may indeed be too strong a word. In most cases it is a good technical decision to have POJOs that don't call container APIs. But that is neither a special feature of IoC containers nor does it make your system and development organisation as a whole less dependent on the container. It just moves the dependency to a different place. It's a pattern that everyone has been free to use within EJB and I have been using it ever since. To have complex business rules or algorithms directly call into EJB APIs so you can't test them independently or use them in other kinds of systems is just bad design. It's trivial not to do that. To throw a standardised architecture overboard and instead introduce a whole new non-standard framework just to support this particular pattern is, in my view, a disproportionate overreaction.
Regarding backward compatibility of open source: Well, there's a mixed picture I would say. There are indeed a lot of open source packages that have a good history there. But the most prominent example of open source, Linux, is just horrendous in that respect. Frankly, it's ridiculous that after so many years of debate about the fragmentation of Unix, and many attempts to correct it, along comes a Unix clone that itself fragments into countless incompatible distributions.
Just as it is a good technical decision to not link business rules to framework APIs, it is an equally good decision to separate specification from implementation. Where is the Spring specification? Where is the Hibernate spec? Which vendors or open source groups or standards organisations support these specs?
|Doug Smith 12/10/04 08:59:49 AM EST|
I love to read these discussions. As a solo practitioner, I have found POJO, Servlet/JSP and JDBC to be good enough to build the scale of applications my clients need. I am intrigued and frustrated by the abundance of excellent frameworks to use, but have no one to discuss the pros and cons with. I tend to agree with the open standards POV.
My 20 years of IBM midrange (System/38, AS/400, iSeries) has taught me the value to the client (customer, not PC) of having a framework that persists over time. Every consultant going into a midrange shop knew exactly how to use the editor, screen designer, database, interactive sessions, batch sessions, and so on. Yes, it was a very limited architecture, but sufficient for the era, and surprisingly capable today as a database machine or JVM host.
How much good does it really do if every Java shop has a unique architecture du jour, creating a barrier for consultant productivity on Day One? Let a thousand blossoms bloom, but let reason prevail. POJO will be with us ten years from now. Will your favorite framework still be used?
|Keith Donald 12/10/04 08:54:18 AM EST|
Your point about the costs to train a development organization is a good one.
But my points are not nonsense. There are real savings to be had in minimizing dependencies on infrastructure APIs in domain-specific java code. There is real leverage in abstracting dependencies on a particular service like Hibernate behind a logical layer of data access interfaces, for example. Not only does it produce cleaner, more maintainable code, it makes it easier to replace any one layer without impacting the others.
My point is Spring and Hibernate make it easier to write cleaner code that is consistently simple, easy to test, and cohesive (focused on domain problems.) On my development team, not everyone has to know everything about Spring or Hibernate -- only a few key developers need to know them inside and out, and the others still get their jobs done. They can do this because Spring makes it easy for developers to specialize. And you don't have to ask me, ask the many who have already built Spring-powered apps: Spring promotes well-layered applications that are structured into manageable, independently testable components.
Re: backwards compatibility. Overall I think open source has a good record of backwards compatibility. Would you agree? There are certainly examples where specifications don't. I know for certain Spring will continue to be backwards compatible - one, because our community is large and committed to that, and two, because Spring is backed by a commercial interest. But again, the risk is mitigated when most of my code is _not_ dependent on Spring. If it is dependent on Spring - for example, because I'm using the Web MVC framework - it's a choice I make because it's a good value-add solution in which my team, given their skills and knowledge, can be most productive. I completely agree with you on that point: you have to take into account the skills of your team and the cost to retrain, and weigh that against the benefits acquired, for any technology (Spring is no exception). It's my opinion that because Spring helps my developers write better code, and integrates well with a number of standards-based and good open source/commercial solutions (as an effective technology integration platform), the benefits are real and quick (perhaps broad, perhaps piecemeal - both strategies work) and outweigh the initial training costs.
Re: fashion, Spring is my base moving forward. I've come to trust it as a stable foundation whose architects work hard to integrate the best and more volatile of what's out there, hedging my bets on infrastructure by providing higher-layer abstractions that give me more leverage. EJB 3.0 will come and we'll offer support for it, just like we have support for EJB 2.1 now. The programming model we see today in Spring will become more pervasive, and ultimately that's very good for Java. It's going to open up the productivity door for a lot more developers to solve a whole array of interesting problems faster.
|Alexander 12/10/04 05:32:56 AM EST|
You say: "POJOs configured and decorated by Spring and persisted by Hibernate are done so without being coupled with those respective APIs (or any infrastructure API for that matter)."
I'm sorry, but that's complete nonsense. The question is not if POJOs are coupled to anything. The question is what do developers have to do to make the application work? What knowledge do they need? And the answer is, they have to know the Spring architecture, all the Spring configuration stuff and the Hibernate OR-mappings. This _IS_ the API. The API is everything developers have to know to make the system work. So the question is what developers become coupled to, not what POJOs are coupled to.
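For readers who haven't seen it, the wiring in question looks something like the following Spring 1.x-style XML fragment (bean names and classes invented for illustration). Whether you call it an API or configuration, it is knowledge the team must carry:

```xml
<beans>
  <!-- Hypothetical wiring: the container, not the code, decides which
       implementation each bean receives. This file, plus the Hibernate
       mapping files, is part of what a team has to learn. -->
  <bean id="invoiceRepository" class="example.HibernateInvoiceRepository">
    <property name="sessionFactory"><ref bean="sessionFactory"/></property>
  </bean>
  <bean id="invoiceService" class="example.InvoiceService">
    <property name="repository"><ref bean="invoiceRepository"/></property>
  </bean>
</beans>
```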
It's no consolation when you say, with Spring, you only have to know the parts that you need. For one, it's not true, because there is always the general architecture of the framework that you have to know. And then of course a framework is there to be used. If I don't need any of the features of Spring, I don't need Spring in the first place. If I need a particular feature, I get coupled to it. There's no amount of smoke screens that can make me believe otherwise. And Spring and Hibernate are products, not specifications. So, if I get coupled to a particular product, why would I not choose .NET then? Or the Oracle DBMS, which is a platform in its own right? Both have much better tool support than Spring or Hibernate. Both are very productive. Both are supported by vendors that won't go away soon and are committed to some degree of backwards compatibility. And both cost much less than retraining a whole development organisation to the most fashionable open source framework once every two years.
|Alexander 12/10/04 04:55:35 AM EST|
I think your distinction between technology platform and programming model is a red herring. For a development organisation, there is only one thing that counts and that is the API. An EJB/CMP developer cannot use Spring/Hibernate without retraining and a year of experience. Neither Spring nor Hibernate are standards. There is a huge number of similar frameworks and an even greater number of MVC frameworks on top of that. Let's face it, what we see is fragmentation. If that turns out to be a good thing or a bad thing depends on the ability of all involved to build a new standardisation process that works better than the JCP. It's not enough to say, hey look we have so much choice isn't that fantastic? It would be fantastic if what we were talking about wasn't a platform but a range of cheese in the supermarket.
|Jay Toee 12/10/04 02:08:40 AM EST|
I really love the persistence model of EJBs and specifically CMP2. OpenSource frameworks are often used in conjunction with OpenSource app servers. One reason for more frameworks popping up, may be that JBoss' CMP engine can't handle that much traffic, so the alternatives may be more appealing.
|Keith Donald 12/09/04 11:18:27 PM EST|
One note re: open frameworks like Spring and Hibernate 'not being standard'. With these two, the question becomes: what is there to standardize? POJOs configured and decorated by Spring and persisted by Hibernate are done so without being coupled with those respective APIs (or any infrastructure API for that matter). Indeed, POJOs stay decoupled from infrastructure, which makes them easily adaptable and runnable in different environments (from test, to standalone, to enterprise). This really does give you the ability to make infrastructure a choice, not a mandate, and makes it easy to adapt and scale up as requirements change. This model is the future, no doubt about it.
|Harrison Li 12/09/04 09:15:30 PM EST|
I'd just like to add some comments about company profits and J2EE. In the early stage of J2EE, application server vendors like BEA were the major force pushing the J2EE standard. Now things have changed: big business application vendors such as SAP have started to play that role. The reason is very simple - those big companies have invested significantly in J2EE technology by developing a new generation of business application products. If you were familiar with SAP, you would have seen what vast changes have happened in product development on top of J2EE in the last 3-4 years. SAP, IBM, etc. have to continue to push J2EE technology forward because J2EE has become the foundation of their products. I don't worry about the future of J2EE technology. Instead, I do worry about how those "pure" J2EE application server vendors could survive if they continue to be server vendors only. Thanks.
|Dave 12/09/04 08:26:54 PM EST|
I like the programming models that open source projects like Spring and Hibernate give you.
The problem with frameworks like these, however, is that open source is good, but open *standards* are better. I'd be much more likely to use Spring if there were a Spring JSR, and I'd much rather take my pick from 20+ open source and commercial implementations of JDO than tie my application to the proprietary Hibernate API.
Look at Struts for instance - great framework, but in time it's going to go the way of the dodo, replaced completely by JSF, because JSF is a *standard*, and there are multiple open source and commercial vendors I can go to for my JSF implementation. The current Hibernate API will be legacy code too once everyone goes to EJB 3 / JDO.
Open source is great, but open standards are much better...
|David 12/09/04 07:41:38 PM EST|
J2EE has simply gotten so bloated and complex that few can really grasp it all. And since J2EE is the foundation, that's not good. Our applications are complicated enough without all of that baggage. The simpler frameworks, including simpler J2EE subsets such as JSP/servlet + JDBC, are enough for many applications. They can always add JavaMail if they do email, or RMI if they need remote invocations, and they can even venture into JMS should that become necessary. However, most of those are not needed by a large number of applications, and the simpler frameworks will help put a stop to managers who choose J2EE when the simpler stuff will make it happen faster and cheaper.
Nov. 21, 2014 08:00 PM EST Reads: 1,386
Technology is enabling a new approach to collecting and using data. This approach, commonly referred to as the "Internet of Things" (IoT), enables businesses to use real-time data from all sorts of things including machines, devices and sensors to make better decisions, improve customer service, and lower the risk in the creation of new revenue opportunities. In his General Session at Internet of @ThingsExpo, Dave Wagstaff, Vice President and Chief Architect at BSQUARE Corporation, discuss the real benefits to focus on, how to understand the requirements of a successful solution, the flow of ...
Nov. 21, 2014 08:00 PM EST Reads: 1,441
"BSQUARE is in the business of selling software solutions for smart connected devices. It's obvious that IoT has moved from being a technology to being a fundamental part of business, and in the last 18 months people have said let's figure out how to do it and let's put some focus on it, " explained Dave Wagstaff, VP & Chief Architect, at BSQUARE Corporation, in this SYS-CON.tv interview at @ThingsExpo, held Nov 4-6, 2014, at the Santa Clara Convention Center in Santa Clara, CA.
Nov. 21, 2014 07:00 PM EST Reads: 1,291
Focused on this fast-growing market’s needs, Vitesse Semiconductor Corporation (Nasdaq: VTSS), a leading provider of IC solutions to advance "Ethernet Everywhere" in Carrier, Enterprise and Internet of Things (IoT) networks, introduced its IStaX™ software (VSC6815SDK), a robust protocol stack to simplify deployment and management of Industrial-IoT network applications such as Industrial Ethernet switching, surveillance, video distribution, LCD signage, intelligent sensors, and metering equipment. Leveraging technologies proven in the Carrier and Enterprise markets, IStaX is designed to work ac...
Nov. 20, 2014 09:15 PM EST Reads: 1,379
C-Labs LLC, a leading provider of remote and mobile access for the Internet of Things (IoT), announced the appointment of John Traynor to the position of chief operating officer. Previously a strategic advisor to the firm, Mr. Traynor will now oversee sales, marketing, finance, and operations. Mr. Traynor is based out of the C-Labs office in Redmond, Washington. He reports to Chris Muench, Chief Executive Officer. Mr. Traynor brings valuable business leadership and technology industry expertise to C-Labs. With over 30 years' experience in the high-tech sector, John Traynor has held numerous...
Nov. 20, 2014 06:00 PM EST Reads: 1,341
Bit6 today issued a challenge to the technology community implementing Web Real Time Communication (WebRTC). To leap beyond WebRTC’s significant limitations and fully leverage its underlying value to accelerate innovation, application developers need to consider the entire communications ecosystem.
Nov. 20, 2014 04:45 PM EST Reads: 1,136
The 3rd International @ThingsExpo, co-located with the 16th International Cloud Expo - to be held June 9-11, 2015, at the Javits Center in New York City, NY - announces that it is now accepting Keynote Proposals. The Internet of Things (IoT) is the most profound change in personal and enterprise IT since the creation of the Worldwide Web more than 20 years ago. All major researchers estimate there will be tens of billions devices - computers, smartphones, tablets, and sensors - connected to the Internet by 2020. This number will continue to grow at a rapid pace for the next several decades.
Nov. 20, 2014 01:00 PM EST Reads: 1,584
The Internet of Things is not new. Historically, smart businesses have used its basic concept of leveraging data to drive better decision making and have capitalized on those insights to realize additional revenue opportunities. So, what has changed to make the Internet of Things one of the hottest topics in tech? In his session at @ThingsExpo, Chris Gray, Director, Embedded and Internet of Things, discussed the underlying factors that are driving the economics of intelligent systems. Discover how hardware commoditization, the ubiquitous nature of connectivity, and the emergence of Big Data a...
Nov. 20, 2014 12:30 PM EST Reads: 1,793
Almost everyone sees the potential of Internet of Things but how can businesses truly unlock that potential. The key will be in the ability to discover business insight in the midst of an ocean of Big Data generated from billions of embedded devices via Systems of Discover. Businesses will also need to ensure that they can sustain that insight by leveraging the cloud for global reach, scale and elasticity.
Nov. 18, 2014 09:00 PM EST Reads: 2,018
SYS-CON Events announced today that IDenticard will exhibit at SYS-CON's 16th International Cloud Expo®, which will take place on June 9-11, 2015, at the Javits Center in New York City, NY. IDenticard™ is the security division of Brady Corp (NYSE: BRC), a $1.5 billion manufacturer of identification products. We have small-company values with the strength and stability of a major corporation. IDenticard offers local sales, support and service to our customers across the United States and Canada. Our partner network encompasses some 300 of the world's leading systems integrators and security s...
Nov. 18, 2014 08:15 PM EST Reads: 1,571
IoT is still a vague buzzword for many people. In his session at @ThingsExpo, Mike Kavis, Vice President & Principal Cloud Architect at Cloud Technology Partners, discussed the business value of IoT that goes far beyond the general public's perception that IoT is all about wearables and home consumer services. He also discussed how IoT is perceived by investors and how venture capitalist access this space. Other topics discussed were barriers to success, what is new, what is old, and what the future may hold. Mike Kavis is Vice President & Principal Cloud Architect at Cloud Technology Pa...
Nov. 18, 2014 01:30 PM EST Reads: 2,013
Cloud Expo 2014 TV commercials will feature @ThingsExpo, which was launched in June, 2014 at New York City's Javits Center as the largest 'Internet of Things' event in the world. The next @ThingsExpo will take place November 4-6, 2014, at the Santa Clara Convention Center, in Santa Clara, California. Since its launch in 2008, Cloud Expo TV commercials have been aired and CNBC, Fox News Network, and Bloomberg TV. Please enjoy our 2014 commercial.
Nov. 13, 2014 05:00 AM EST Reads: 3,543