LLM Services Available Directly from Created Resources

AIOS provides an environment where LLMs can be used immediately, without installing or configuring a separate LLM service, from resources created through Samsung Cloud Platform's Virtual Server, GPU Server, and Kubernetes Engine services. Because AIOS runs on Samsung Cloud Platform's internal infrastructure, data leakage is minimized and private cloud-level security is provided.

Overview

Service Architecture

User App. Dev. (Conversational AI, Language Translation, Text / Code Generation, Text / Data Summarization) → Samsung Cloud Platform
  • Virtual Server → AIOS
  • GPU Server → AIOS
  • Kubernetes Engine → AIOS
AIOS: AIOS LLM (Llama Family Model, Reasoning Model, Security Guard Model)

Key Features

  • Provides an AIOS LLM endpoint
    1. An LLM private endpoint is automatically provided on the detail screen of created Virtual Server, GPU Server, and Kubernetes Engine resources
  • Provides visualized reports
    1. Check the number of calls and token usage by service type, resource, and model
    2. View overall call statistics by LLM model
  • OpenAI / LangChain SDK compatibility
    1. Compatible with the OpenAI and LangChain SDKs, allowing easy integration with existing development environments and frameworks (see the sketch after this list)
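
Because the AIOS endpoint is compatible with the OpenAI SDK, an existing OpenAI client can typically be reused by overriding its base URL. The following is a minimal sketch, assuming a placeholder endpoint URL, model name, and API-key handling; the actual values come from the private endpoint and model list shown in the resource's detail screen.

  # Minimal sketch: point the OpenAI Python SDK at the AIOS private endpoint.
  # The URL, model name, and api_key value below are placeholders, not AIOS specifics.
  from openai import OpenAI

  client = OpenAI(
      base_url="http://<aios-private-endpoint>/v1",  # endpoint shown in the resource detail screen
      api_key="unused-placeholder",                  # key handling depends on your environment
  )

  response = client.chat.completions.create(
      model="llama-3",  # placeholder; choose a model from the AIOS LLM list (e.g. a Llama family model)
      messages=[{"role": "user", "content": "Summarize this incident report in three bullet points."}],
  )
  print(response.choices[0].message.content)

With LangChain, the same pattern usually applies by passing the endpoint as the base URL when constructing its OpenAI-compatible chat model.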
