How The Linux Foundation's ODPi Initiative is Advancing Apache Hadoop and Big Data
Tuesday, March 15, 2016
The ODPi initiative focuses on promoting and advancing the state of Apache Hadoop and Big Data technologies for the enterprise. It is a collaborative project of the Linux Foundation, which hosts a number of collaborative software projects and provides the organizational, promotional, and technical infrastructure needed to make those projects successful.
We recently visited with John Mertic, Senior Program Manager for ODPi, to discuss the goals of the ODPi initiative.
ADM: ODPi was recently formed in late 2015. Why was it created and what do you hope to accomplish?
Mertic: The rapid influx of digital information available to enterprises has resulted in a Big Data ecosystem that is challenged and slowed by fragmented, duplicated efforts. ODPi was created as a shared industry effort to accelerate the adoption of Apache Hadoop and related Big Data technologies with the goal of making it easier to rapidly develop applications through the integration and standardization of a common reference platform called the ODPi Core.
ODPi enables the larger community around Hadoop, namely distro vendors, application vendors, and end-users, to collaborate in a vendor-neutral setting. This model has worked with countless open source technologies experiencing rapid growth. We know it can increase adoption and open up opportunities for innovation on top of an already strong Hadoop community.
ADM: Describe the problem ODPi solves, what value does it bring to the market?
Mertic: Many vendors have focused on productizing Apache Hadoop as a distribution, which has led to inconsistency and has made it challenging for application vendors and end-users to embrace Hadoop as a technology. ODPi is integrating a variety of upstream Apache projects, working across the Apache ecosystem to create a downstream reference platform on top of which new Big Data solutions can be built.
By providing a common runtime, long-term support, reference implementations and test suites, ODPi Core removes cost and complexity and accelerates the development of Big Data solutions.
ODPi will bring value to the market by:
- Standardizing the commodity work of the components of a Hadoop distribution.
- Providing a common platform against which to certify apps, reducing the complexities of interoperability.
- Ensuring a level of compatibility and standardization across distribution and application offerings for management and integration.
ADM: What are the consequences of today's fragmented big data ecosystem?
Mertic: Fragmentation in the big data ecosystem has slowed innovation in the industry and inhibits enterprises from developing data-driven applications. The industry needs more open source-based big data technologies and standards so application developers and enterprises are able to more easily build data-driven applications.
ADM: Tell us a little bit more about your release cycles and the progress you're making?
Mertic: ODPi has a twice-yearly release cycle that happens in March and September. Since ODPi came under the Linux Foundation as a Collaborative Project in September 2015, we have experienced significant momentum, nearly doubling our membership since we launched.
The planned ODPi Certification Program is also underway. The goal of the ODPi Certification Programs will be to ensure consistency and compatibility across the Big Data ecosystem. This March we will release a final specification document and test guidelines enabling a Hadoop distribution to self-certify against the ODPi Runtime.
ADM: How many members and contributors do you have? How big is the community?
Mertic: Our community has been growing consistently since inception. Currently, we have more than 35 maintainers from 25 companies.
ADM: Talk more about your members and why they organized ODPi.
Mertic: ODPi’s membership includes heavyweight brands across Hadoop software providers, including EMC, Hortonworks, IBM, Pivotal, and VMware; service providers like AltiScale; advanced ISVs like CapGemini, Infosys, and SAS; and, finally, leading Hadoop consumers like General Electric, a large international telco, and a major automotive OEM.
The companies came together with the combined goal of creating operational efficiencies across the entire Big Data ecosystem. Our members clearly have relevant expertise and investment in big data. They are also able to look at big data challenges from a variety of angles, balancing the views of consumers of the technologies with those of providers.
ADM: For mobile developers, what are the key benefits of ODPi?
Mertic: For advanced use cases involving mobile apps and software to evolve, an industry standard for certification and testing is important. It will make it easier for non-traditional Hadoop developers to participate in this opportunity.
ADM: With the Hadoop and big data ecosystem, what are the main opportunities and challenges?
Mertic: Now 10 years old, Hadoop has become a mature technology that serves hyperscale environments and is able to handle widely varying amounts and types of data. It’s a proven and popular platform among developers requiring a technology that can power large, complex applications. Yet Hadoop components and Hadoop distros are innovating very quickly and in many different ways. This diversity, while healthy in many respects, also slows big data ecosystem development and limits adoption.