News

IBM Offers Meta’s Llama 3 on watsonx

IBM announced the availability of Meta Llama 3 — the next generation of Meta’s open large language model — on its watsonx AI and data platform. The addition expands the watsonx.ai model library, helping enterprises innovate with IBM’s in-house Granite series of models as well as models from leading providers such as Meta.

The addition of Llama 3 builds on IBM’s collaboration with Meta to advance open innovation for AI. The two companies launched the AI Alliance — a group of leading organizations across industry, startups, academia, research and government — late last year, and it has since grown to more than 80 members and collaborators.

Furthermore, experts from IBM Consulting and IBM Client Engineering have engaged with hundreds of enterprises to apply Llama models to targeted enterprise pilots and use cases. For example, IBM helped build a content engine for the Recording Academy — the non-profit that hosts the GRAMMYs — by training Llama 2 on the Recording Academy’s proprietary, trusted data to produce digital content consistent with the brand’s standards and tone of voice.

According to Meta, the Llama 3 release features pretrained and instruction-fine-tuned language models at 8B and 70B parameter counts that support a broad range of use cases, including summarization, classification, information extraction, and content-grounded question answering. The 8B model is designed for faster training and edge devices, while the 70B model targets content creation, conversational AI, language understanding, research and development, and enterprise applications. Meta has stated that Llama 3 demonstrates improved performance over Llama 2 in its internal testing. In the coming months, Meta expects to introduce new capabilities, additional model sizes, and enhanced performance, along with the Llama 3 research paper.

IBM also hosts Code Llama 34B — a task-specific model for code generation and translation — on watsonx. IBM offers Llama models for both SaaS and on-premises deployment, giving clients the choice and flexibility to scale AI with their own data across a broader set of enterprise use cases.
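
For developers, the sketch below shows how a hosted Llama 3 model might be queried for content-grounded question answering through IBM’s ibm-watsonx-ai Python SDK. The model ID, endpoint URL, credential fields, and generation parameters are illustrative assumptions, not values confirmed in this announcement.

```python
# Minimal sketch: querying a hosted Llama 3 model on watsonx.ai via the
# ibm-watsonx-ai Python SDK. The model ID, endpoint URL, and generation
# parameters below are assumptions for illustration; check IBM's watsonx.ai
# documentation for the exact values available in your account and region.
from ibm_watsonx_ai import Credentials
from ibm_watsonx_ai.foundation_models import ModelInference

credentials = Credentials(
    url="https://us-south.ml.cloud.ibm.com",   # assumed regional endpoint
    api_key="YOUR_IBM_CLOUD_API_KEY",
)

model = ModelInference(
    model_id="meta-llama/llama-3-70b-instruct",  # assumed watsonx model ID
    credentials=credentials,
    project_id="YOUR_PROJECT_ID",
    params={"max_new_tokens": 200, "temperature": 0.2},
)

# Content-grounded question answering: the model is asked to answer only
# from the supplied passage.
prompt = (
    "Answer the question using only the passage below.\n\n"
    "Passage: The Recording Academy is the non-profit organization that "
    "hosts the GRAMMY Awards.\n"
    "Question: Who hosts the GRAMMY Awards?\nAnswer:"
)

print(model.generate_text(prompt=prompt))
```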