Intel and others commit to building open generative AI tools for the enterprise

Key Points:

  • OPEA project aims to develop open, modular generative AI systems
  • Focus on interoperable AI toolchains, including retrieval-augmented generation (RAG) pipelines
  • Evaluation criteria include performance, features, trustworthiness, and enterprise readiness


The Linux Foundation, in collaboration with industry leaders such as Cloudera and Intel, has introduced the Open Platform for Enterprise AI (OPEA) project. Focused on developing open, composable generative AI systems, OPEA aims to create robust and scalable AI solutions by leveraging open source innovation. Members of OPEA include Intel, IBM-owned Red Hat, and other prominent enterprise players.


One key area of interest within OPEA is the implementation of retrieval-augmented generation (RAG) pipelines in generative AI applications. RAG lets AI models consult external data sources beyond their training data, grounding their responses and improving their ability to execute tasks. The approach has drawn attention for its potential to extend a model's effective knowledge base and improve performance across a range of use cases.
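The core RAG loop described above can be sketched in a few lines. This is a minimal illustration, not an OPEA reference implementation: the document store, the word-overlap scoring (a stand-in for the vector search a production pipeline would use), and the prompt template are all assumptions for the example.

```python
def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
    """Rank documents by naive word overlap with the query.

    A real RAG pipeline would use embeddings and a vector index;
    overlap scoring keeps this sketch self-contained.
    """
    q_words = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]


def build_prompt(query: str, documents: list[str]) -> str:
    """Augment the model prompt with retrieved context before generation."""
    context = "\n".join(f"- {d}" for d in retrieve(query, documents))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"


docs = [
    "OPEA promotes open, composable generative AI systems.",
    "The cafeteria menu changes every Tuesday.",
    "RAG pipelines fetch external data to ground model responses.",
]
prompt = build_prompt("How do RAG pipelines ground AI responses?", docs)
```

The augmented prompt, rather than the bare question, is what gets sent to the language model, which is how the model's answer can draw on data it was never trained on.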


Intel, a key contributor to OPEA, emphasized the need for standardization in components for enterprises adopting RAG solutions. OPEA’s mission includes collaborating with the industry to establish standards across frameworks, architecture blueprints, and reference solutions to facilitate interoperability and accelerate product deployment in the market.


To evaluate generative AI systems, OPEA has proposed a comprehensive rubric focusing on performance, features, trustworthiness, and enterprise-grade readiness. This grading system will enable thorough assessments of AI deployments based on real-world benchmarks and quality assurance measures.


Building on the rubric, OPEA plans to engage with the open source community to offer testing and comprehensive grading of generative AI solutions. Intel has already contributed reference implementations for generative AI applications, including a chatbot, a document summarizer, and a code generator, each optimized for specific hardware configurations.


While OPEA members are deeply involved in developing enterprise generative AI tools, the question remains whether these vendors will collaborate effectively to create cross-compatible solutions under OPEA. The potential benefits for customers lie in the flexibility to choose from a variety of vendors based on their requirements. However, challenges such as vendor lock-in could hinder the goal of achieving true interoperability and collaboration within the AI ecosystem.


©2024 The Horizon