What makes open source enterprise-ready?

“If you go back to the past, about 10 years ago, you would see that a lot of CIOs believed there was something bad about open source. In general, we now see a diminishing fear of open source in the market.” – Mike Olson, Board Chairman and Chief Strategy Officer, Cloudera
Mr. Olson uses that word, fear, intentionally. Executives used to argue that open source software was not professionally developed and was not built for companies. That is flat-out no longer true. These days, open source software is regulatory compliant and gives CIOs the chance to take full advantage of the pace of innovation.

As such, it is worth discussing the four reasons why open source software will slowly but surely come to rule the enterprise software market.

  1. Compliance

Enterprises will argue that open source software is not regulatory compliant, not secure, and therefore not safe to use.

As Mike Olson noted, we can point to very large-scale, very secure implementations of the open source platform in mission-critical applications, compliant with rigorous regulatory requirements.

Cloudera offers the first and currently only Hadoop platform to have passed PCI Data Security Standard certification, allowing personally identifiable data to be stored worldwide in compliance with regulatory requirements.

Mr. Olson emphasized that there is no reason at all for CIOs to be concerned about security and compliance. They should worry less about open source and how it is developed, and more about the enterprise requirements they place on their data management platforms.

  2. Avoiding Single Vendor Proprietary Lock-in

If there is one thing that CIOs detest, it is single-vendor proprietary lock-in. They want to be able to choose the vendor they work with, while continuously taking advantage of the pace of innovation that a global open source developer community drives.

Nowadays, CIOs demand open source solutions, because open source tooling allows them to change software when something better is available.

  3. Platform Software Needs to be Open Source

Mike Olson has been active in the data management space since the early ’80s. He has seen the market grow and the way that software development has changed over the years.

In the last 10 years, no meaningful proprietary platform software (database, operating system) has emerged. “I am confident that the law of physics now is that it is only possible to successfully launch platform software if it is open source.”

What used to be proprietary becomes open source. This is a trend across every single category: databases, operating systems, middleware. Think about it: JBoss for middleware, Linux for operating systems, MySQL, Postgres, and Hadoop for data management.

So, does that mean no more proprietary software? No, not quite. Open source offers great opportunity, but there is certainly room for proprietary software, especially as an important driver of innovation.

While open source communities have been great at building platform software, they have generally not been as good at building business applications. Think of great analytics or ERP products: these are typically proprietary applications built on top of open source platforms.

  4. Open Source Enables New Use Cases

Hadoop will not be the one database to rule them all. In some cases, existing workloads from other systems can move onto it; that is possible. Global 8000 enterprises have typically been data-driven for years; these are companies that have been using data well, relying on solid data warehouse, OLTP, and other systems.

Large enterprises have been using dashboards and reports for many, many years. There is now an opportunity for them to not only report reactively on historical and current data, but also to become predictive.

In this process, Hadoop is enabling great new use cases: predicting what is going to happen next, and determining how organizations need to change their behavior to take advantage of these opportunities.

Hadoop was designed to handle these advanced analytic and large-scale data processing workloads. Integrating with existing systems is important, so organizations are not forced to move all of their existing infrastructure.
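
To make that integration concrete, here is a minimal sketch of copying a table out of an existing relational warehouse and landing it on the Hadoop cluster as Parquet files, using PySpark's JDBC reader. The hostname, database, table, and credentials are placeholders, and the appropriate JDBC driver would need to be available on the cluster.

    # Hypothetical ingest sketch: copy one table from an existing PostgreSQL
    # warehouse into HDFS as Parquet, where Hadoop-side engines can query it.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("ingest-existing-warehouse").getOrCreate()

    sales = (
        spark.read.format("jdbc")
        .option("url", "jdbc:postgresql://warehouse-db:5432/analytics")  # placeholder URL
        .option("dbtable", "sales")                                      # placeholder table
        .option("user", "reporting")
        .option("password", "***")                                       # placeholder credential
        .load()
    )

    # Land the data on the shared store; the existing OLTP/warehouse system stays untouched.
    sales.write.mode("overwrite").parquet("hdfs:///warehouse/sales_parquet")

    spark.stop()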

The Future of Hadoop

Hadoop has now been available for around eight years, during which the advancements within the platform have been enormous. What will happen with it in the future?

Well, the future will be determined by the further development of technologies and applications within the ecosystem.

Platform Developments

At the beginning of Cloudera’s life, Hadoop was really just a storage layer, HDFS, and a processing and compute layer, MapReduce. This offered only one single way to work with data. A lot of innovation has happened since then, so these days when we talk about Hadoop, we mean a collection of processing and analytics capabilities on a shared store.
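
As an illustration of that original model, a classic word-count job consists of a small mapper and a small reducer run over files in HDFS, for example via Hadoop Streaming. The scripts below are a sketch; the input and output paths and the jar name are placeholders. HDFS provides the storage layer and MapReduce the computation, which were the only two layers available at the time.

    # mapper.py - reads raw text lines from stdin and emits "word<TAB>1" pairs
    import sys

    for line in sys.stdin:
        for word in line.strip().split():
            print(f"{word}\t1")

    # reducer.py - receives the mapper output sorted by word and sums the counts
    import sys

    current_word, current_count = None, 0
    for line in sys.stdin:
        word, count = line.rstrip("\n").split("\t", 1)
        if word != current_word:
            if current_word is not None:
                print(f"{current_word}\t{current_count}")
            current_word, current_count = word, 0
        current_count += int(count)
    if current_word is not None:
        print(f"{current_word}\t{current_count}")

    # Submitted, for example, with Hadoop Streaming (paths are placeholders):
    #   hadoop jar hadoop-streaming.jar \
    #       -input /data/books -output /data/wordcounts \
    #       -mapper mapper.py -reducer reducer.py \
    #       -file mapper.py -file reducer.py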

Examples are plentiful: HBase, which has taken a substantial share of the NoSQL workload market; Cloudera Impala (now Apache Impala), an open source, massive-scale analytic data processing engine; and Cloudera Search, which has been built on Lucene and Solr technology.

We have seen innovation like Apache Spark in the market; who can predict what the next spark to emerge in the Hadoop ecosystem will be?
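
To see how much the programming model has changed, here is a rough sketch of the same word count expressed against the newer Spark engine (PySpark), reading from and writing back to the same shared HDFS store; the paths are again placeholders.

    # Word count with PySpark: split lines into words, group by word, and count.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("wordcount-spark").getOrCreate()

    lines = spark.read.text("hdfs:///data/books")  # single string column named "value"
    counts = (
        lines.select(F.explode(F.split(F.col("value"), r"\s+")).alias("word"))
        .where(F.col("word") != "")
        .groupBy("word")
        .count()
    )
    counts.write.mode("overwrite").parquet("hdfs:///data/wordcounts_spark")

    spark.stop()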

At present, Cloudera sees similar innovation at the storage layer, and this excites Mr. Olson a lot: HBase and HDFS are both available for data storage, and Apache Kudu is now a fast-growing project that addresses a new kind of workload in that market.

Ecosystem

One last thing to keep in mind is the innovation coming from applications layered on top of Hadoop. For Cloudera, it is very important to encourage partners to provide services, applications, and hardware that make it easier for customers to consume the platform.

“We see partners building solutions ranging from mobile telephony systems, to cybersecurity or new analytics and reporting solutions. Cloudera and Hadoop themselves don’t offer these kinds of solutions. The role of Cloudera is to offer suppliers a stable and highly scalable platform to run these applications on top of.”
