FabriX Simple Package delivers the latest AI technologies, such as RAG and Code Interpreter, through an intuitive user interface. Users can choose from among the multiple LLMs available.
Users can easily register corporate data and assets and use them to generate answers optimized for their organization. Through integration with Data Ops, which collects and stores data, a service built on extensive company-wide data and assets can be tailored to specific needs.
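The retrieval-augmented (RAG) flow described above, in which registered internal data grounds the LLM's answer, can be sketched as follows. This is a minimal illustration only: the function names are hypothetical, the keyword-overlap ranking stands in for the real retrieval engine, and an actual deployment would use Data Ops or a vector store plus an LLM client.

```python
# Minimal RAG sketch (hypothetical names; not the actual FabriX API).
def retrieve(query: str, documents: list[str], top_k: int = 2) -> list[str]:
    """Rank registered internal documents by naive keyword overlap."""
    q_terms = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(q_terms & set(d.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def build_prompt(query: str, documents: list[str]) -> str:
    """Ground the LLM's answer in the retrieved internal data."""
    context = "\n".join(retrieve(query, documents))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"
```

The resulting prompt would then be sent to whichever of the available LLMs the user has selected.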
FabriX Simple Package includes a filtering feature that detects and blocks harmful content, taking action on profanity and other unnecessary words or phrases in a query. In particular, the LLM-based AI filter prevents leaks of data such as confidential programming code, user information, and logs. Embedded with enhanced security features, the service also protects users' personal data and operates in a secure environment.
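The rule-based side of this filtering can be illustrated with a short sketch. The block patterns below are purely illustrative placeholders, not the actual FabriX rule set, and a real filter would combine such rules with the LLM-based AI filter described above.

```python
import re

# Hypothetical rule-based prompt filter; the patterns are examples only.
BLOCK_PATTERNS = [
    re.compile(r"\b\d{6}-\d{7}\b"),              # resident-ID-like numbers
    re.compile(r"(?i)\bpassword\s*[:=]\s*\S+"),  # inline credentials
]

def filter_query(query: str) -> str:
    """Reject a query that matches any blocked pattern; otherwise pass it through."""
    for pattern in BLOCK_PATTERNS:
        if pattern.search(query):
            raise ValueError("query blocked by rule-based filter")
    return query
```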
FabriX Simple Package integrates easily with other SCP products, including Data Ops and Elasticsearch, both of which collect and refine internal data, and FabriX, which efficiently shares and manages internal data assets in an integrated manner. It is also highly scalable, supporting new technologies such as additional LLMs, Orchestrator, and industry-oriented plug-ins.
- Provide LLM-based question answering
- Generate answers based on internal data
- Code Interpreter: Execute code written in programming languages and provide feedback on the results
- Allow access to authorized users through single sign-on (SSO)
- Provide an API endpoint for copilot chat services developed by customers (To be provided in 2024)
- Cognitive search: Offer highly accurate responses and information based on internal data transformed into knowledge
- Rule-based filter: Provide prompt filtering designed to prevent the leakage of sensitive data and block politically incorrect queries
- AI-based filter: Prevent breaches of sensitive data, including programming code, user information, and log data
- Provide Prompt Library/Studio and Orchestrator (To be provided in 2024)
- Enterprise-oriented LLM using LLaMA 2
- External LLM provided in a stable environment: Azure OpenAI
- Expansion of enterprise-oriented LLMs (To be provided in 2024)
- Expanding multi-modal LLMs (To be provided in 2024)
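The Code Interpreter feature listed above, which executes code and returns feedback on the results, can be sketched as a simple run-and-capture loop. The function name and return shape are assumptions for illustration; a production service would execute the snippet inside a sandboxed environment rather than a bare subprocess.

```python
import subprocess
import sys

# Sketch of a Code Interpreter step (illustrative, not the FabriX internals):
# run a snippet in a child interpreter and return its output as feedback.
def run_snippet(code: str, timeout: float = 5.0) -> dict:
    """Execute a Python snippet and capture stdout/stderr for feedback."""
    result = subprocess.run(
        [sys.executable, "-c", code],
        capture_output=True, text=True, timeout=timeout,
    )
    return {
        "stdout": result.stdout,
        "stderr": result.stderr,
        "ok": result.returncode == 0,
    }
```

The captured output (or error) is what the service would surface back to the user as feedback on the executed code.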