
Oracle Database In-Memory Powers the Real-Time Enterprise

Larry Ellison Unveils Breakthrough Technology, Which Turns the Promise of Real-Time Into a Reality

REDWOOD SHORES, CA -- (Marketwired) -- 06/10/14 -- Oracle (NYSE: ORCL)

News Summary

In today's fast-paced, hyper-connected, and mobile/social world, businesses demand instantaneous information and responsiveness. In this environment, businesses must be able to move as fast as their customers, be they B2B or B2C, to deliver the experience those customers demand.

For years, technology companies have been talking about the "real-time" enterprise. And for years, that's all those vendors delivered -- talk -- because they didn't have the necessary range of world-class technologies to deliver on the real-time promise. But today, Oracle is changing that paradigm, because only Oracle can bring together for customers optimized in-memory capabilities across applications, middleware, databases, and systems. Oracle Database In-Memory transparently extends the power of Oracle Database 12c to enable organizations to discover business insights in real time while simultaneously increasing transactional performance. With Oracle Database In-Memory, users can get immediate answers to business questions that previously took hours to obtain, and can deliver a faster, better experience to both their internal and external constituents.

Oracle Database In-Memory delivers leading-edge in-memory performance without restricting functionality or accepting compromises, complexity, and risk. Deploying Oracle Database In-Memory with virtually any existing Oracle Database-compatible application is as easy as flipping a switch -- no application changes are required. It is fully integrated with Oracle Database's renowned scale-up, scale-out, storage tiering, availability, and security technologies, making it the most industrial-strength offering in the industry.
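In Oracle's documented 12c syntax, the "flip a switch" deployment described above amounts to setting one initialization parameter and marking tables for in-memory population. A minimal sketch -- the table name and memory size are illustrative, not taken from the release:

```sql
-- Reserve part of the SGA for the in-memory column store
-- (takes effect after a database restart; 2G is an illustrative figure).
ALTER SYSTEM SET inmemory_size = 2G SCOPE=SPFILE;

-- Mark an existing table for in-memory population. No application or
-- schema changes are needed; queries against SALES run unmodified.
ALTER TABLE sales INMEMORY;
```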

At a special event at Oracle's headquarters, CEO Larry Ellison described how the ability to combine real-time data analysis with sub-second transactions on existing applications enables organizations to become Real-Time Enterprises that quickly make data-driven decisions, respond instantly to customers' demands, and continuously optimize key processes.

What Customers are Saying

  • "As a consumer Internet pioneer and innovator, Yahoo is always at the leading edge of big data and database technology to deliver a responsive, seamless consumer experience. We joined Oracle's beta program to understand how memory optimization could sharpen our big data processing," said Sudhi Vijayakumar, Yahoo's Principal Oracle Database Architect. "Full support for Oracle Real Application Clusters' scale-out capabilities means Oracle Database In-Memory can be used even on our largest data warehouses."

News Facts

  • Oracle Database In-Memory enables customers to accelerate database performance by orders of magnitude for analytics, data warehousing, and reporting while also speeding up online transaction processing (OLTP).
  • An innovative, dual-format in-memory architecture combines the best of row format and column format to simultaneously deliver fast analytics and efficient OLTP.
  • Oracle Database In-Memory allows any existing Oracle Database-compatible application to automatically and transparently take advantage of columnar in-memory processing, without additional programming or application changes.
  • Oracle Database In-Memory demonstrated from 100x to more than 1000x speedup for enterprise application modules in performance tests, including Oracle E-Business Suite, Oracle's JD Edwards, Oracle's PeopleSoft, Oracle's Siebel, and Oracle Fusion Applications.
  • The ability to combine real-time data analysis with sub-second transactions on existing applications enables organizations to become Real-Time Enterprises that quickly make data-driven decisions, respond instantly to customers' demands, and continuously optimize all key processes.
  • Oracle Database In-Memory has undergone extensive validation testing by hundreds of end-users, ISV partners, and Oracle Applications teams over the past nine months.
  • Oracle Database In-Memory is scheduled for general availability in July and can be used with all hardware platforms on which Oracle Database 12c is supported.
  • Oracle PartnerNetwork (OPN) is also announcing that Oracle Database 12c Ready certification will soon include Oracle Database In-Memory.
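The dual-format architecture and transparency claims above can be made concrete with Oracle's documented per-table controls; a hedged sketch (the SALES table is illustrative, the clause names and the V$IM_SEGMENTS view are from Oracle 12c documentation):

```sql
-- Tune the columnar copy per table: compression level and population
-- priority are independent of the unchanged row-format data used for OLTP.
ALTER TABLE sales INMEMORY MEMCOMPRESS FOR QUERY LOW PRIORITY HIGH;

-- Check which segments have been populated into the column store.
SELECT segment_name, populate_status, bytes_not_populated
  FROM v$im_segments;
```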

Software and Hardware Engineered for the Real-Time Enterprise

  • Building on years of innovations and maturity, Oracle Database In-Memory inherits all Oracle Database capabilities including:
    • Maximum Availability Architecture to protect against data loss and downtime.
    • Industry leading security technologies.
    • Scalability to meet any requirement via scale-up on large SMP servers, scale-out across a cluster of servers, and storage-tiering, to cost effectively run databases of any size -- whether petabyte-scale data warehouses, big data processing or database clouds.
  • Rich programmability: Java, R, Big Data, PHP, Python, Node.js, REST, Ruby, etc.
    • Full data type support: relational, objects, XML, text, spatial, and new integrated JSON support.
  • Oracle Engineered Systems are the ideal complement to Oracle Database In-Memory:
    • Oracle Engineered Systems, including Oracle Exadata Database Machine and Oracle SuperCluster, are optimized for Oracle Database In-Memory, featuring large memory capacity, extreme performance, and high availability while tiering less active data to flash and disk to deliver outstanding cost effectiveness.
  • In-Memory fault tolerance on Oracle Engineered Systems optionally duplicates in-memory data across nodes, enabling queries to instantly use a copy of in-memory data if a server fails. New Direct-to-Wire InfiniBand accelerates in-memory scale-out.
  • Oracle's M6-32 Big Memory Machine is the most powerful scale-up platform for Oracle Database In-Memory, providing up to 32 terabytes of DRAM and 3 terabytes/second of memory bandwidth for maximum in-memory performance.
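The "new integrated JSON support" listed among the data types refers to Oracle Database 12c's ability to store JSON in ordinary columns and query it with SQL/JSON operators. A brief sketch using documented 12c syntax (the DEVICES table and its data are illustrative):

```sql
-- JSON documents live in a standard VARCHAR2/CLOB column; the IS JSON
-- constraint ensures only well-formed JSON is stored.
CREATE TABLE devices (
  id  NUMBER PRIMARY KEY,
  doc VARCHAR2(4000) CONSTRAINT devices_doc_json CHECK (doc IS JSON)
);

-- SQL/JSON operators query the documents alongside relational data.
SELECT JSON_VALUE(doc, '$.name') AS device_name
  FROM devices
 WHERE JSON_EXISTS(doc, '$.sensors');
```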

Supporting Quotes

  • "We are delighted that our MicroStrategy Analytics Platform is among the first third-party applications to be certified with Oracle Database In-Memory," explained Paul Zolfaghari, President, MicroStrategy Incorporated. "Our participation in Oracle's beta program and integration with Oracle Database In-Memory builds on our long-standing relationship with Oracle, underscoring the importance of working together to optimize our platforms to extend the advanced functionality and speed performance improvements to our joint customers."
  • "Oracle is the only vendor in the industry to embrace in-memory computing from applications to middleware to database to systems, enabling businesses to maximize profitability by accelerating operations, quickly discovering new growth opportunities and making smarter, real-time decisions," said Andrew Mendelsohn, Executive Vice President, Database Server Technologies, Oracle. "Oracle Database 12c In-Memory uniquely delivers unprecedented performance for virtually all workloads with 100 percent application transparency and no data migration. Plus all the high availability, scalability, and security that customers have come to expect from the Oracle Database are fully preserved."
  • "Oracle Applications provide the foundation for our customers' mission-critical business operations, including sales, financials, supply chain and human resources. By raising the bar on speed, Oracle Database In-Memory enables customers to compound the value of their existing applications by deriving new insights and business opportunities faster," said Steve Miranda, Executive Vice President of Application Development, Oracle.

About Oracle
Oracle engineers hardware and software to work together in the cloud and in your data center. For more information about Oracle (NYSE: ORCL), visit www.oracle.com.

Trademarks
Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be trademarks of their respective owners.

Safe Harbor
The above is intended to outline our general product direction. It is intended for information purposes only, and may not be incorporated into any contract. It is not a commitment to deliver any material, code, or functionality, and should not be relied upon in making purchasing decisions. The development, release, and timing of any features or functionality described for Oracle's products remains at the sole discretion of Oracle Corporation.

PDF Attachment Available: http://media.marketwire.com/attachments/201406/76191_DBIM_ComparChart_Vert.pdf

Image Available: http://www2.marketwire.com/mw/frame_mw?attachid=2613727

Contact Info

Letty Ledbetter
Oracle
+1.650.506.8071
Email Contact

Teri Whitaker
Oracle
+1.650.506.9914
Email Contact

