By Keith Donald
December 8, 2004 12:00 AM EST
By now you've probably either heard about or read the analyst report from the Burton Group entitled "J2EE in Jeopardy."
In summary, the claim is that J2EE as a standard is in danger due to several market forces, and the report pairs those forces with recommendations:
- Market commoditization: Open source players like Apache, JBoss, and ObjectWeb are commoditizing the platform, making it harder for vendors to profit from J2EE server licenses. If vendors can't make money on J2EE, they won't want to continue to invest in the specification.
- "Disruptive" technologies: In the last year the complexity of J2EE's programming model - EJB - has been challenged. Simpler, more productive models have emerged from within the open source community and have garnered widespread acceptance in a short time. Credited innovators here include the Spring Framework and Hibernate ORM.
- End users should turn over responsibility for generic J2EE infrastructure to proven providers in the open source world. You shouldn't abandon J2EE, but should consider some of the "alternative" frameworks in the open source community.
- Vendors should focus on building a "J2EE super platform." Basically, innovate in areas where open source hasn't already preempted competition through commoditization.
On the one hand, it's great to see the work of the Spring, Hibernate, Apache, and other quality open source teams getting endorsement and credibility. The cat is out of the proverbial bag: open source is a force of innovation to be respected and certainly not underestimated. Outsourcing infrastructure to proven open source providers is a very effective strategy for companies looking to deliver working software better/faster/cheaper. Complementing quality open source offerings with strategic commercial products and services is an effective model for infrastructure providers looking to further penetrate the market.
On the other hand, I feel the casual reader might come away with a bad spin on what's happening here. I can see it now - a corporate manager faced with a major technology investment decision happens across this article (or others like it) and concludes J2EE is in chaos. Or worse, concludes the open source community is out building flavor-of-the-month "alternative frameworks" that "reinvent the wheel" because the "standard" platform doesn't cut it. Not exactly the impression we want to make to grow the enterprise Java market.
This is where our community must step in and set that manager straight. First, J2EE is not in a state of chaos. There are simply more good choices for J2EE infrastructure than ever before, and from what I've experienced, there are many more J2EE success stories. Second, these "alternative" frameworks absolutely do not "reinvent the wheel in open source." They all build on standard J2EE services to improve developer productivity; they are not replacements for the platform.
Indeed, Spring and Hibernate - the leading so-called "alternative" frameworks - are challenging the J2EE programming model while embracing the J2EE technology platform. This is a critical distinction. With Spring particularly, you get the power and maturity of the J2EE stack with a simplicity comparable to that of the .NET programming model. And you get it all with less cost and considerably more business leverage (choice).
How is this possible? When you step back and look at J2EE, there is a lot to it. J2EE consists of:
- A standard set of enterprise services addressing typical infrastructure needs, including transaction management (JTA), dynamic user content (servlets/JSP/JSF), database access (JDBC), service lookup (JNDI), asynchronous messaging (JMS), management (JMX), and remoting (RMI/Web services).
- A standard programming model for tapping into the power of the above services, gluing the individual pieces together into a consistent software delivery platform.
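Several of the services in that list are usable from a plain JVM with nothing but the JDK. As a minimal sketch (the class and object names below are invented for illustration), here is JMX, the management service from the list above, registering and querying an MBean without any application server:

```java
import java.lang.management.ManagementFactory;
import javax.management.MBeanServer;
import javax.management.ObjectName;

public class JmxDemo {
    // JMX standard-MBean convention: the management interface is
    // named after the implementing class, with an "MBean" suffix.
    public interface CounterMBean {
        int getCount();
        void increment();
    }

    public static class Counter implements CounterMBean {
        private int count;
        public int getCount() { return count; }
        public void increment() { count++; }
    }

    public static void main(String[] args) throws Exception {
        // Every JVM ships a platform MBeanServer; no server product required.
        MBeanServer server = ManagementFactory.getPlatformMBeanServer();
        ObjectName name = new ObjectName("demo:type=Counter");
        server.registerMBean(new Counter(), name);

        // Invoke an operation and read an attribute through the JMX API.
        server.invoke(name, "increment", null, null);
        // Attribute name "Count" derives from getCount() per JavaBeans rules.
        System.out.println(server.getAttribute(name, "Count")); // prints 1
    }
}
```

Higher-end services on the list, such as JTA and JMS, do need a container to back them, which is exactly the deployment trade-off discussed below.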
Enter the "alternative" frameworks. Spring, in particular, has given our community a framework that not only makes it easier to tap into the power of J2EE, but captures best practices on what services to use when given your business requirements. The result? More developers, architects, and managers are getting smarter about the infrastructure they need for the given job at hand. Developer productivity is up.
For over a year I've personally leveraged the Spring Framework as the base architecture for my development projects. I now treat J2EE infrastructure as a separate concern, one fully decoupled from the business logic ("core meat") of my application. Spring gives me the power to choose which deployment environment and technologies are most appropriate given the complexity of the domain problem at hand. That puts me in command - if all I need is a Web container to power a Web app with a single data source, a solution like Tomcat is the most cost-effective choice. For middleware-intensive applications that require messaging, global transactions, and remoting, a higher-end application server is worth the investment. In all cases, my programming model stays simple and consistent, grounded in my customer's problem domain.
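A rough sketch of that separation in plain Java (every name here is hypothetical, and simple constructor wiring stands in for Spring's external configuration): the business POJO depends only on a small interface, so the infrastructure behind it - in-memory, JDBC, or Hibernate - becomes a deployment decision.

```java
import java.util.HashMap;
import java.util.Map;

public class TransferDemo {
    // The business logic depends only on this small interface,
    // never on JDBC, Hibernate, or any container API.
    public interface AccountRepository {
        int balanceOf(String accountId);
        void store(String accountId, int balance);
    }

    // Pure business logic: a POJO that runs unchanged in a unit test,
    // standalone, or inside an application server.
    public static class TransferService {
        private final AccountRepository accounts;
        public TransferService(AccountRepository accounts) { this.accounts = accounts; }

        public void transfer(String from, String to, int amount) {
            accounts.store(from, accounts.balanceOf(from) - amount);
            accounts.store(to, accounts.balanceOf(to) + amount);
        }
    }

    // One possible wiring: an in-memory repository, useful for tests or a
    // standalone run. A JDBC- or Hibernate-backed implementation would be
    // substituted by configuration, not by editing the service.
    public static class InMemoryAccountRepository implements AccountRepository {
        private final Map<String, Integer> balances = new HashMap<>();
        public int balanceOf(String id) { return balances.getOrDefault(id, 0); }
        public void store(String id, int balance) { balances.put(id, balance); }
    }

    public static void main(String[] args) {
        AccountRepository repo = new InMemoryAccountRepository();
        repo.store("checking", 100);
        repo.store("savings", 0);
        new TransferService(repo).transfer("checking", "savings", 40);
        System.out.println(repo.balanceOf("checking")); // prints 60
        System.out.println(repo.balanceOf("savings"));  // prints 40
    }
}
```

Swapping in a different repository implementation touches only the wiring, never TransferService itself.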
I can't say it enough, J2EE is better than ever - for the consumer. I read success story after success story from developers working on projects with products like Spring and Hibernate. They're leveraging them in all kinds of environments and application servers to support demands on all scales. Today is a great time to be developing enterprise applications in Java. It's a great time to be a consumer. We've got the technology, the platform, and the community - it's only going to get better. Who can stop us now?
|Sean Warburton 01/24/05 03:27:30 PM EST|
|Edgar Dollin 12/25/04 09:57:29 AM EST|
The only thing that one can gather from this much dialogue is that the current crop of open source Java development frameworks is inadequate and needs improvement.
I know Struts is a joke and will go away. Hibernate looks like a good tool, but the lack of standardization and of any real outside understanding of what it is holds it back. JSF is an attempt by the tool makers to force developers into paying for tools, and a belated attempt to compete with .NET; meanwhile Microsoft, who has it close to correct, is held in contempt with good reason.
Doesn't look like there is going to be consensus on this issue for quite some time. The problem is that there are so many good answers and so many people with working solutions. Let's hope that the 'Right' solution survives the melee.
|Doug Smith 12/17/04 04:39:51 PM EST|
Elevator pitch? Maybe on the Petronas Towers elevators, stopping at each floor for a minute. ;-)
|Bill Watson 12/17/04 02:51:21 PM EST|
This seems to be coming up more and more frequently - ColdFusion developers being asked to defend ColdFusion against a planned move to Java and J2EE. And so, in case you end up in this situation, this is what you need to know.
For starters, any suggestion of "we need to stop using ColdFusion because we are going to use Java" demonstrates a complete lack of understanding of what exactly ColdFusion is. So, let's start with a brief explanation of the ColdFusion-Java relationship.
Applications written in ColdFusion (as of ColdFusion MX) are pure Java. Or, expressed slightly differently, ColdFusion runs on a J2EE server (either embedded, or one of your choosing) running a Sun-verified Java application (the ColdFusion engine), executing Java bytecode (compiled from your CFML source code). In other words, CFML (the code you write) is a developer-time consideration, not a run-time consideration. There is no CFML at runtime; at runtime you are executing pure Java, no more or less so than had you written the application in straight Java. Your ColdFusion application is a Java application; if you deploy a ColdFusion application what you have deployed is Java. It's as simple as that.
This means that the assertion that ColdFusion and Java are somehow mutually exclusive is just flat out incorrect. But what about the CFML code you write? Isn't that ColdFusion specific and not pure Java? And isn't that an issue? I don't think so. There is an entire industry of Java add-ons out there - tools, tags, language extensions, and more - and Java shops use these (as they should; after all, why reinvent the wheel?). If your Java code leverages third-party add-ons for reporting, or back-end integration, or charting, or ... does that make your code any less Java? Nope, not at all.
Experienced developers know that starting from step one is expensive and seldom makes sense, regardless of the language and platform. Experienced developers have toolboxes at their disposal, stuff they can leverage and reuse to be as productive as possible. Experienced developers write modular applications, separating logic and processing and presentation into tiers, allowing these to evolve independently of each other, even allowing them to be inserted or removed independently.
For Java developers, one of these tools should be ColdFusion. After all, why write dozens of lines of Java code to connect to a database when a single tag can accomplish the exact same thing (likely using the same code internally)? And why write lots of code to send an SMTP message using JavaMail APIs when a single tag can do it for you (again, using those same APIs)? You can think of ColdFusion as a bunch of prewritten Java code, stuff you can use so as to hit the ground running. And that makes your app no less Java than if you had done all the work manually.
However, some may counter that CFML is proprietary, and that the fact that you need to pay for an engine to execute your code somehow makes it non-Java. I have actually heard this from customers. So is this a valid issue? Again, I don't think so. For starters, paid does not equal proprietary. After all, these same customers do not balk at spending big bucks on their J2EE servers (and management tools and professional services and more). Furthermore, there are indeed third-party CFML engines out there. I am not going to comment on how good they are and how viable an alternative they are - that's irrelevant. What is relevant is that they exist, and that means that CFML is not a single-vendor, proprietary language.
Great, so ColdFusion simplifies Java development, and ColdFusion applications are no less Java than applications written in low-level Java directly. But simplicity and abstractions require sacrificing power, right? Wrong! ColdFusion applications can (and should) leverage Java: Java APIs, Java classes, JavaBeans, JSP tags - you name it, ColdFusion can leverage it, because ColdFusion itself is Java. It's that simple.
So, ColdFusion or Java? The answer should be yes, ColdFusion is Java, and Java development can benefit from ColdFusion. This is not an either/or proposition, it's a "you can have it all so why the heck would you want to do it any other way?" proposition.
The ASP versus ColdFusion discussion used to come up regularly. But not anymore. Now that Microsoft has essentially abandoned any future development on classic ASP, replacing it with ASP.NET, few organizations are embarking on brand new ASP deployments. But having said that, if you do need to defend ColdFusion against ASP, here's what you need to know.
For starters, ASP capabilities are a subset of those of ColdFusion. Or put differently, ColdFusion can do anything that ASP can do, and a whole lot more too. The reverse is not true. Sure, ASP can be extended (using COM objects) to do just about anything that ColdFusion can do, but that's just it, you need to extend ASP - it's your responsibility to do so. Simple things that ColdFusion developers take for granted, like being able to generate an e-mail message, or process an uploaded file, or generate a business chart, none of those are native ASP functionality.
And this is not mere boasting, this is important, because it's the way to head off the "but ASP is free" argument. Sure, ASP is free for starters, but buy all the add-on bits you need to make it functionally equivalent to ColdFusion (even ColdFusion Standard, and even ColdFusion 3 or 4!) and you'll end up paying far more than ColdFusion costs. Sure, ASP is cheaper initially, but you get what you pay for. Or rather, you don't get what you don't pay for. And when you do pay for it, you'll end up paying a whole lot more.
And that's just looking at initial costs. ASP development is also far more time consuming than ColdFusion development. Even if you're comfortable in the languages used, you'll still have to write lots more code to get the job done. Even the execution of simple SQL statements is far more complex in ASP - one tag versus lots of lines of ugly code. More code = longer development time = costs more. Plus, more code = more complex ongoing maintenance = costs even more.
At the risk of sounding like an MBA, when you look at the total cost of ownership, ASP is not the cheaper option at all. Oh, and on top of all that, ASP is proprietary, a single vendor solution, and you're married to Windows boxes (no Linux, no Unix, no portability).
Maybe this is why most ColdFusion servers run on Windows - Windows boxes that likely already have ASP installed. Why? Because hundreds of thousands of developers have figured out that free can be far too expensive.
J2EE and .NET are remarkably alike, both in terms of objectives and the range of their various components and systems. Of course, applications and application development with the two platforms are not alike at all; everything from tools to languages to methodologies is different. At their respective cores, both .NET and J2EE provide the building blocks and technologies needed to build applications. Security abstractions, database support, back-end integration, system level services, transactions and messaging, run-time services, and more are all provided by the platforms themselves. Both J2EE and .NET provide "safe" environments in which applications run (the JVM and CLR respectively); both J2EE and .NET support the use of different languages within these environments (although this potential has been realized to a greater degree in .NET); both have a scripting solution designed for Web applications (JSP or ColdFusion for J2EE, ASP.NET for .NET); and both are incredibly powerful and capable.
Many organizations are going through a J2EE or .NET discussion, usually independent of any discussion about ColdFusion. And there are pros and cons to both options. J2EE wins when vendor independence, openness, and portability are a priority. .NET wins when it comes to tools, a better client experience, or simply a commitment to the Microsoft way (there is more to it than that, but that's an entire column unto itself).
However, as many are discovering, J2EE versus .NET is not always an either/or proposition. In fact, lots of organizations are discovering that they need both, and that the future is decidedly heterogeneous. This is especially true for larger organizations where there's room for both, and interoperability (primarily via SOAP) makes this a workable option.
If an organization has made the strategic decision to bet its future solely on Microsoft and .NET, then they probably should use ASP.NET. Sure, ColdFusion can coexist and interoperate with the .NET world, but ASP.NET will likely be the preferred option. For organizations going the J2EE route, well, I've covered that one already. But for most organizations, ColdFusion remains compelling, leveraging the worlds of J2EE natively and .NET via SOAP. In fact, some organizations have discovered that ColdFusion is the simplest way to create client applications that talk to both J2EE and .NET back ends, if that is needed.
So, ColdFusion or ASP.NET? That depends on what your IT future looks like. And unless the future is Microsoft and Windows only, ColdFusion remains an important cog in the IT engine.
PHP is also script based. Pages contain code that is processed by the PHP engine. The PHP engine itself is open source, and the PHP language uses a syntax borrowed from C, Java, and Perl (the latter is important, as PHP is particularly popular with former Perl developers). PHP runs on all sorts of systems, and uses functions and APIs for all sorts of processing.
PHP's stated goal (as per the official PHP FAQ) is to "allow Web developers to write dynamically generated pages quickly." Ironically, developers with both CFML and PHP experience will tell you that while PHP development may be quicker than using C or Java, it does not come close to the speed of ColdFusion and CFML development.
There is no refuting PHP's power; PHP developers have an impressive and rich set of functions and language extensions available, and that is definitely part of PHP's appeal. But this power comes at a cost. It takes a lot more PHP code to do what simple CFML tags do, and as explained before, more code = greater cost. If you truly need that power, then PHP may indeed be an option worth considering (although CFML+Java would likely give you the same power and more). But if what you really need is to "write dynamically generated pages quickly," then PHP leaves much to be desired.
One of the most compelling arguments for PHP is its cost: there is none. But, as explained earlier, the cost of software is only one part of the total cost of application development, arguably the least significant cost in the long run. It is for this reason that despite being incredibly popular with developers and consulting shops, corporate users have been slow to adopt PHP as a development platform (which in turn is why the PHP versus ColdFusion discussion comes up so infrequently).
The bottom line is if you really need the power and flexibility of PHP and can justify the longer development cycles and more code to manage and maintain it, then PHP should seriously be considered. But if time to completion and total cost of ownership are an issue, then ColdFusion wins hands down.
|Alexander 12/10/04 11:02:45 AM EST|
Keith, "nonsense" may indeed be too strong a word. In most cases it is a good technical decision to have POJOs that don't call container APIs. But that is neither a special feature of IoC containers nor does it make your system and development organisation as a whole less dependent on the container. It just moves the dependency to a different place. It's a pattern that everyone has been free to use within EJB and I have been using it ever since. To have complex business rules or algorithms directly call into EJB APIs so you can't test them independently or use them in other kinds of systems is just bad design. It's trivial not to do that. To throw a standardised architecture overboard and instead introduce a whole new non-standard framework just to support this particular pattern is, in my view, a disproportionate overreaction.
Regarding backward compatibility of open source: Well, there's a mixed picture I would say. There are indeed a lot of open source packages that have a good history there. But the most prominent example of open source, Linux, is just horrendous in that respect. Frankly, it's ridiculous that after so many years of debate about the fragmentation of Unix and many attempts to correct it, along comes a Unix clone that itself fragments into countless incompatible distributions.
Just as it is a good technical decision to not link business rules to framework APIs, it is an equally good decision to separate specification from implementation. Where is the Spring specification? Where is the Hibernate spec? Which vendors or open source groups or standards organisations support these specs?
|Doug Smith 12/10/04 08:59:49 AM EST|
I love to read these discussions. As a solo practitioner, I have found POJO, Servlet/JSP and JDBC to be good enough to build the scale of applications my clients need. I am intrigued and frustrated by the abundance of excellent frameworks to use, but have no one to discuss the pros and cons with. I tend to agree with the open standards POV.
My 20 years of IBM midrange (System/38, AS/400, iSeries) has taught me the value to the client (customer, not PC) of having a framework that persists over time. Every consultant going into a midrange shop knew exactly how to use the editor, screen designer, database, interactive sessions, batch sessions, and so on. Yes, it was a very limited architecture, but sufficient for the era, and surprisingly capable today as a database machine or JVM host.
How much good does it really do if every Java shop has a unique architecture du jour, creating a barrier for consultant productivity on Day One? Let a thousand blossoms bloom, but let reason prevail. POJO will be with us ten years from now. Will your favorite framework still be used?
|Keith Donald 12/10/04 08:54:18 AM EST|
Your point about the costs to train a development organization is a good one.
But my points are not nonsense. There are real savings to be had in minimizing dependencies on infrastructure APIs in domain-specific java code. There is real leverage in abstracting dependencies on a particular service like Hibernate behind a logical layer of data access interfaces, for example. Not only does it produce cleaner, more maintainable code, it makes it easier to replace any one layer without impacting the others.
My point is Spring and Hibernate make it easier to write cleaner code that is consistently simple, easy to test, and cohesive (focused on domain problems). On my development team, not everyone has to know everything about Spring or Hibernate -- only a few key developers need to know them inside and out, and the others still get their jobs done. They can do this because Spring makes it easy for developers to specialize. And you don't have to ask me, ask the many who have already built Spring-powered apps: Spring promotes well-layered applications that are structured into manageable, independently testable components.
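To make "independently testable" concrete, here is a minimal, hypothetical sketch (all names invented; the stub is hand-rolled, not a Spring facility) of exercising one layer with a stand-in for the layer beneath it:

```java
public class OrderServiceTest {
    // The boundary between layers: business code sees only this interface.
    public interface PriceCatalog { int priceOf(String sku); }

    // The layer under test: pure business logic behind the interface.
    public static class OrderService {
        private final PriceCatalog catalog;
        public OrderService(PriceCatalog catalog) { this.catalog = catalog; }
        public int totalFor(String sku, int quantity) {
            return catalog.priceOf(sku) * quantity;
        }
    }

    public static void main(String[] args) {
        // A hand-rolled stub stands in for the real (e.g. database-backed)
        // catalog; no container or database is started.
        PriceCatalog stub = sku -> "widget".equals(sku) ? 5 : 0;
        OrderService service = new OrderService(stub);
        System.out.println(service.totalFor("widget", 3)); // prints 15
    }
}
```

Nothing heavyweight needs to start for this check to run, which is what keeps the test feedback loop fast.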
Re: backwards compatibility. Overall I think open source has a good record of backwards compatibility. Would you agree? There are certainly examples of where specifications don't. I know for certain Spring will continue to be backwards compatible - first because our community is large and committed to that, and second because Spring is backed by a commercial interest. But again, the risk is mitigated when most of my code is _not_ dependent on Spring. If it is dependent on Spring, for example, because I'm using the Web MVC framework, it's a choice I make because it's a good value-add solution in which my team -- given their skills and knowledge -- can be most productive. I completely agree with you on that point: you have to take into account the skills of your team and the cost to retrain, and weigh that against the benefits acquired, for any technology (Spring is no exception). It's my opinion that because Spring helps my developers write better code, and integrates well with a number of standards-based and 'good' open source/commercial solutions (as an effective technology integration platform), the benefits are real and quick (perhaps broad, perhaps piecemeal - both strategies work) and outweigh the initial training costs.
Re: fashion, Spring is my base moving forward. I've come to trust it as a stable foundation whose architects work hard to integrate the best and more volatile of what's out there, hedging my bets on infrastructure by providing higher-layer abstractions that give me more leverage. EJB 3.0 will come and we'll offer support for it, just like we have support for EJB 2.1 now. The programming model we see today in Spring will become more pervasive, and ultimately that's very good for Java. It's going to open up the productivity door for a lot more developers to solve a whole array of interesting problems faster.
|Alexander 12/10/04 05:32:56 AM EST|
You say: "POJOs configured and decorated by Spring and persisted by Hibernate are done so without being coupled with those respective APIs (or any infrastructure API for that matter)."
I'm sorry, but that's complete nonsense. The question is not if POJOs are coupled to anything. The question is what do developers have to do to make the application work? What knowledge do they need? And the answer is, they have to know the Spring architecture, all the Spring configuration stuff and the Hibernate OR-mappings. This _IS_ the API. The API is everything developers have to know to make the system work. So the question is what developers become coupled to, not what POJOs are coupled to.
It's no consolation when you say, with Spring, you only have to know the parts that you need. For one, it's not true, because there is always the general architecture of the framework that you have to know. And then of course a framework is there to be used. If I don't need any of the features of Spring, I don't need Spring in the first place. If I need a particular feature, I get coupled to it. There's no amount of smoke screens that can make me believe otherwise. And Spring and Hibernate are products, not specifications. So, if I get coupled to a particular product, why would I not choose .NET then? Or the Oracle DBMS, which is a platform in its own right? Both have much better tool support than Spring or Hibernate. Both are very productive. Both are supported by vendors that won't go away soon and are committed to some degree of backwards compatibility. And both cost much less than retraining a whole development organisation to the most fashionable open source framework once every two years.
|Alexander 12/10/04 04:55:35 AM EST|
I think your distinction between technology platform and programming model is a red herring. For a development organisation, there is only one thing that counts and that is the API. An EJB/CMP developer cannot use Spring/Hibernate without retraining and a year of experience. Neither Spring nor Hibernate are standards. There is a huge number of similar frameworks and an even greater number of MVC frameworks on top of that. Let's face it, what we see is fragmentation. If that turns out to be a good thing or a bad thing depends on the ability of all involved to build a new standardisation process that works better than the JCP. It's not enough to say, hey look we have so much choice isn't that fantastic? It would be fantastic if what we were talking about wasn't a platform but a range of cheese in the supermarket.
|Jay Toee 12/10/04 02:08:40 AM EST|
I really love the persistence model of EJBs and specifically CMP2. OpenSource frameworks are often used in conjunction with OpenSource app servers. One reason for more frameworks popping up, may be that JBoss' CMP engine can't handle that much traffic, so the alternatives may be more appealing.
|Keith Donald 12/09/04 11:18:27 PM EST|
One note re: open frameworks like Spring and Hibernate 'not being standard'. With these two, the question becomes: what is there to standardize? POJOs configured and decorated by Spring and persisted by Hibernate are done so without being coupled with those respective APIs (or any infrastructure API for that matter). Indeed, POJOs stay decoupled from infrastructure, which makes them easily adaptable and runnable in different environments (from test, to standalone, to enterprise). This really does give you the ability to make infrastructure a choice not a mandate, and makes it easy to adapt and scale up as requirements change. This model is the future, no doubt about it.
|Harrison Li 12/09/04 09:15:30 PM EST|
I'd just like to add some comments about company profits and J2EE. In the early stage of J2EE, application server vendors like BEA were the major force pushing the J2EE standard. Now things have changed: big business application vendors such as SAP have started to play that role. The reason is very simple - those big companies have invested significantly in J2EE technology by developing a new generation of business application products. If you were familiar with SAP, you would have seen what vast changes have happened in product development on top of J2EE in the last 3-4 years. SAP, IBM, etc. have to continue to push J2EE technology forward because J2EE has become the foundation of their products. I don't worry about the future of J2EE technology. Instead, I do worry about how those "pure" J2EE application server vendors could survive if they continue to be server vendors only. Thanks.
|Dave 12/09/04 08:26:54 PM EST|
I like the programming models that open source projects like Spring and Hibernate give you.
The problem with frameworks like these, however, is that open source is good, but open *standards* are better. I'd be much more likely to use Spring if there were a Spring JSR, and I'd much rather take my pick from 20+ open source and commercial implementations of JDO than tie my application to the proprietary Hibernate API.
Look at Struts for instance - a great framework, but in time it's going to go the way of the dodo, replaced completely by JSF, because JSF is a *standard*, and there are multiple open source and commercial vendors I can go to for my JSF implementation. The current Hibernate API will be legacy code too once everyone goes to EJB 3 / JDO.
Open source is great, but open standards are much better...
|David 12/09/04 07:41:38 PM EST|
J2EE has simply gotten so bloated and complex that few can really grasp it all. And since J2EE is the foundation, that's not good. Our applications are complicated enough without all of that baggage. The simpler frameworks, including simpler J2EE subsets such as JSP/servlet + JDBC, are enough for many applications. They can always add JavaMail if they do email, or RMI if they need remote invocations, and they can even venture into JMS should that become necessary. However, most of those are not needed by a large number of applications, and the simpler frameworks will help put a stop to managers who choose J2EE when the simpler stuff will make it happen faster and cheaper.
Aug. 26, 2016 04:30 PM EDT Reads: 2,319
The 19th International Cloud Expo has announced that its Call for Papers is open. Cloud Expo, to be held November 1-3, 2016, at the Santa Clara Convention Center in Santa Clara, CA, brings together Cloud Computing, Big Data, Internet of Things, DevOps, Digital Transformation, Microservices and WebRTC to one location. With cloud computing driving a higher percentage of enterprise IT budgets every year, it becomes increasingly important to plant your flag in this fast-expanding business opportuni...
Aug. 26, 2016 04:00 PM EDT Reads: 3,940
There is growing need for data-driven applications and the need for digital platforms to build these apps. In his session at 19th Cloud Expo, Muddu Sudhakar, VP and GM of Security & IoT at Splunk, will cover different PaaS solutions and Big Data platforms that are available to build applications. In addition, AI and machine learning are creating new requirements that developers need in the building of next-gen apps. The next-generation digital platforms have some of the past platform needs a...
Aug. 26, 2016 03:15 PM EDT Reads: 459
Pulzze Systems was happy to participate in such a premier event and thankful to be receiving the winning investment and global network support from G-Startup Worldwide. It is an exciting time for Pulzze to showcase the effectiveness of innovative technologies and enable them to make the world smarter and better. The reputable contest is held to identify promising startups around the globe that are assured to change the world through their innovative products and disruptive technologies. There w...
Aug. 26, 2016 11:30 AM EDT Reads: 606
Internet of @ThingsExpo, taking place November 1-3, 2016, at the Santa Clara Convention Center in Santa Clara, CA, is co-located with 19th Cloud Expo and will feature technical sessions from a rock star conference faculty and the leading industry players in the world. The Internet of Things (IoT) is the most profound change in personal and enterprise IT since the creation of the Worldwide Web more than 20 years ago. All major researchers estimate there will be tens of billions devices - comp...
Aug. 26, 2016 10:15 AM EDT Reads: 3,592
Data is the fuel that drives the machine learning algorithmic engines and ultimately provides the business value. In his session at Cloud Expo, Ed Featherston, a director and senior enterprise architect at Collaborative Consulting, will discuss the key considerations around quality, volume, timeliness, and pedigree that must be dealt with in order to properly fuel that engine.
Aug. 26, 2016 09:45 AM EDT Reads: 1,859
19th Cloud Expo, taking place November 1-3, 2016, at the Santa Clara Convention Center in Santa Clara, CA, will feature technical sessions from a rock star conference faculty and the leading industry players in the world. Cloud computing is now being embraced by a majority of enterprises of all sizes. Yesterday's debate about public vs. private has transformed into the reality of hybrid cloud: a recent survey shows that 74% of enterprises have a hybrid cloud strategy. Meanwhile, 94% of enterpri...
Aug. 26, 2016 05:00 AM EDT Reads: 3,057
I wanted to gather all of my Internet of Things (IOT) blogs into a single blog (that I could later use with my University of San Francisco (USF) Big Data “MBA” course). However as I started to pull these blogs together, I realized that my IOT discussion lacked a vision; it lacked an end point towards which an organization could drive their IOT envisioning, proof of value, app dev, data engineering and data science efforts. And I think that the IOT end point is really quite simple…
Aug. 25, 2016 11:45 PM EDT Reads: 2,319
Identity is in everything and customers are looking to their providers to ensure the security of their identities, transactions and data. With the increased reliance on cloud-based services, service providers must build security and trust into their offerings, adding value to customers and improving the user experience. Making identity, security and privacy easy for customers provides a unique advantage over the competition.
Aug. 25, 2016 09:15 PM EDT Reads: 2,239
Is the ongoing quest for agility in the data center forcing you to evaluate how to be a part of infrastructure automation efforts? As organizations evolve toward bimodal IT operations, they are embracing new service delivery models and leveraging virtualization to increase infrastructure agility. Therefore, the network must evolve in parallel to become equally agile. Read this essential piece of Gartner research for recommendations on achieving greater agility.
Aug. 25, 2016 05:15 PM EDT Reads: 777
SYS-CON Events announced today that Venafi, the Immune System for the Internet™ and the leading provider of Next Generation Trust Protection, will exhibit at @DevOpsSummit at 19th International Cloud Expo, which will take place on November 1–3, 2016, at the Santa Clara Convention Center in Santa Clara, CA. Venafi is the Immune System for the Internet™ that protects the foundation of all cybersecurity – cryptographic keys and digital certificates – so they can’t be misused by bad guys in attacks...
Aug. 25, 2016 01:00 PM EDT Reads: 2,636
For basic one-to-one voice or video calling solutions, WebRTC has proven to be a very powerful technology. Although WebRTC’s core functionality is to provide secure, real-time p2p media streaming, leveraging native platform features and server-side components brings up new communication capabilities for web and native mobile applications, allowing for advanced multi-user use cases such as video broadcasting, conferencing, and media recording.
Aug. 25, 2016 08:45 AM EDT Reads: 2,161