
IBM CEO champions genuine open source for enterprise generative AI as new initiatives are unveiled at Think 2024

Merima Hadžić

IBM is bolstering its generative AI initiatives with a new suite of technologies and partnerships, unveiled at the Think 2024 conference. Building on a legacy in AI that predates the current generative AI boom by decades, IBM continues to innovate and expand its capabilities in the field.

IBM’s AI history spans several decades, with significant advancements leading up to the modern era. At Think 2023, IBM launched its Watsonx generative AI product platform, setting the stage for its current AI endeavors. The Watsonx platform has since become the cornerstone of IBM’s generative AI strategy, offering enterprise-grade models, robust governance, and comprehensive tooling. This year, IBM is making a significant leap by releasing several of its Granite models as open source. These models, ranging from 3 billion to 34 billion parameters, are designed for both code and language tasks. Additionally, IBM is integrating Mistral AI models, including Mistral Large, into its platform.

One of the most common applications of generative AI is the development of assistants. IBM is keen to enhance support for this use case with the introduction of new Watsonx assistants. IBM CEO Arvind Krishna emphasized the transformative potential of AI, likening its impact to that of the steam engine or the internet.

The Importance of Open Source for IBM Granite Enterprise AI
IBM first introduced its Granite models in September 2023 and has continually expanded their capabilities since. Among these models is a 20 billion parameter base code model that powers IBM’s Watsonx Code Assistant for Z service, which helps organizations modernize COBOL applications.

At Think 2024, IBM is taking a significant step by releasing a group of its most advanced Granite models under the open-source Apache 2.0 license. While many vendors claim to offer open models, few actually provide them under an Open Source Initiative (OSI)-approved license. IBM asserts that, alongside Mistral, Granite is among the few highly performant large language model (LLM) families available under a genuine open-source license such as Apache 2.0.
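
For developers who want to try the newly opened models, the weights can be loaded with standard tooling. The following is a minimal sketch in Python, assuming the Granite code models are published under the ibm-granite organization on Hugging Face and that the transformers library (plus PyTorch) is installed; the exact model ID shown here is illustrative.

from transformers import AutoModelForCausalLM, AutoTokenizer

# Illustrative model ID; assumes the open Granite code models are hosted
# under the ibm-granite organization on Hugging Face.
model_id = "ibm-granite/granite-3b-code-instruct"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Ask the code model for a small completion and print it.
prompt = "Write a Python function that checks whether a string is a palindrome."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))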

In response to a question from VentureBeat, Krishna underscored the critical importance of true open source for enterprises. He noted that many so-called open licenses are not genuinely open source, serving more as marketing tools. Krishna highlighted that genuine open source is essential for fostering contributions and technological growth.

Expanding Watsonx Assistants for Enterprise AI Advancement
While large language models (LLMs) are pivotal for enterprise generative AI, the concept of an AI assistant is equally crucial. AI assistants, referred to as copilots by companies like Microsoft and Salesforce, offer a more consumable approach for many organizations. During a media roundtable, Rob Thomas, IBM’s Senior Vice President and Chief Commercial Officer, explained that AI assistants provide a packaged approach for enterprises to deploy AI in production.

At Think 2024, IBM is unveiling three new assistants:

Watsonx Code Assistant for Java: This assistant helps developers write Java application code, leveraging IBM’s extensive experience with Java.
Watsonx Assistant for Z: Designed to assist organizations in managing IBM Z mainframe environments.
Watsonx Orchestrate: Enables enterprises to build their own custom assistants.

Integrating Retrieval Augmented Generation (RAG) and InstructLab Technology
One prevalent deployment pattern for enterprise generative AI today is Retrieval Augmented Generation (RAG). RAG grounds assistants and generative AI chatbots in up-to-date enterprise information that the LLMs were not originally trained on. At the core of RAG is a vector database, or vector support within an existing database. While IBM recognizes the importance of RAG and vector databases, it is not building its own vector database; instead, it integrates with many existing options, ensuring the capability is present on the platform without IBM owning it.
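
To make the pattern concrete, the sketch below shows the retrieval step in Python. It is a toy stand-in, not IBM's implementation: TF-IDF vectors and cosine similarity take the place of a learned embedding model and a real vector database, and the final LLM call is left as a placeholder.

import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# A toy "knowledge base" of enterprise documents the model was never trained on.
documents = [
    "Expense reports must be filed within 30 days of travel.",
    "VPN certificates are rotated at the start of every quarter.",
    "New assistants are configured by the platform engineering team.",
]

# Stand-in for an embedding model plus vector database: TF-IDF vectors held in memory.
vectorizer = TfidfVectorizer()
doc_vectors = vectorizer.fit_transform(documents)

def retrieve(query, k=2):
    """Return the k documents most similar to the query."""
    query_vector = vectorizer.transform([query])
    scores = cosine_similarity(query_vector, doc_vectors)[0]
    top_indices = np.argsort(scores)[::-1][:k]
    return [documents[i] for i in top_indices]

query = "How long do I have to file an expense report?"
context = "\n".join(retrieve(query))

# Augment the prompt with the retrieved context before calling whichever LLM is in use.
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
print(prompt)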

IBM also sees a promising future for InstructLab technology, recently announced by its Red Hat business unit. InstructLab facilitates the continuous improvement of models through an optimized approach, providing a robust framework for advancing AI capabilities.

The Impact of Generative AI on Employment and Society
IBM, like much of the enterprise IT industry, is optimistic about the potential of generative AI. However, the company is also cognizant of the societal and employment impacts. During the roundtable, Krishna addressed concerns about employment, stating that increased productivity typically leads to more business and, consequently, higher employment. He noted that demographic trends in most countries point to a declining workforce, making AI capabilities crucial for maintaining quality of life and economic growth.

Key Takeaways and Future Directions
Granite Models: Open-source release under Apache license.
Watsonx Assistants: New assistants for Java and Z, plus Orchestrate for building custom assistants.
RAG and Vector Databases: Integration with existing solutions.
InstructLab Technology: Continuous model improvement.

IBM’s announcements at Think 2024 mark a significant milestone in its generative AI journey. By embracing open-source principles and expanding its suite of AI assistants, IBM aims to solidify its position as a leader in the enterprise AI landscape. The company’s strategic partnerships and technological innovations are set to drive the next wave of AI advancements, with a focus on real-world applications and societal benefits.
