📄️ Open Source Inferencing
The Open Source Inferencing feature lets users run real-time inference with powerful, community-driven AI models without vendor lock-in. By integrating open source models and tools, developers and data scientists gain flexibility, transparency, and control over their inferencing pipelines, enabling customization and collaboration.
📄️ User guide
This guide walks you through the platform’s LLM Tools section so you can use pre-trained open source models for inference: selecting a model, submitting your query data, and interpreting the results.