Pragatix Components Overview

[Diagram: BusinessGPT system - Private AI]

[Diagram: BusinessGPT system - AI Firewall]

Components overview

Dashboard / Ingestor Windows Server

Dashboard

A .NET website providing the end-user and admin UI.

The Dashboard manages synced content items, permissions, collection building, chat interfaces, and settings at the user and site levels.

Ingestor service

The Ingestor is a service with connectors to various data sources. It pulls content and permissions and sends them to the Gateway embedding queue.
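The flow above can be sketched as follows. This is an illustrative outline only: the class and field names, and the in-process queue standing in for the Gateway embedding queue, are assumptions, not the actual Pragatix implementation.

```python
import json
import queue
from dataclasses import dataclass, field

@dataclass
class ContentItem:
    source: str          # connector name, e.g. "sharepoint" (illustrative)
    item_id: str
    body: str
    # permissions are pulled together with the content
    allowed_users: list = field(default_factory=list)

# Stand-in for the Gateway embedding queue; a real deployment would use
# a message broker rather than an in-process queue.
embedding_queue = queue.Queue()

def ingest(connector_name, items):
    """Pull content + permissions from a connector and enqueue for embedding."""
    for raw in items:
        item = ContentItem(
            source=connector_name,
            item_id=raw["id"],
            body=raw["body"],
            allowed_users=raw.get("permissions", []),
        )
        # Serialize before sending to the queue
        embedding_queue.put(json.dumps(item.__dict__))
    return embedding_queue.qsize()
```

The key point is that permissions travel with each content item, so downstream components can enforce access control at query time.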

Firewall Service

A service that classifies and analyzes data at rest against policies.
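A minimal sketch of what policy-based classification of data at rest might look like. The policy names and regular expressions below are illustrative examples, not the product's actual rules.

```python
import re

# Example classification policies (illustrative only)
POLICIES = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def classify(text):
    """Return the sorted policy labels that match the given text."""
    return sorted(name for name, pattern in POLICIES.items() if pattern.search(text))
```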

Firewall API

A web service exposing an API used in real time for inspection and classification.

Gateway Linux Server / Containers

Content loader

The loader extracts the text from the content items and removes unnecessary parts, such as signatures and disclaimers in emails.

Embedding service

Gets data from the Loader, splits it into chunks, and transforms the chunked content into vectors.

Each file type requires a different loader to extract the text content. We will use the Unstructured library for most file types.

We will endeavor to keep related paragraphs and sections together in a single chunk, with some surrounding text before and after for context.
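The chunking approach described above can be sketched as follows. The chunk size and overlap values are assumptions for illustration; the actual limits would depend on the embedding model's input size.

```python
# Illustrative chunking sketch: keep whole paragraphs together where
# possible, and carry a tail of the previous chunk into the next one
# as leading context. Sizes are assumed, not the product's settings.
def chunk_text(text, max_chars=500, overlap=100):
    paragraphs = [p.strip() for p in text.split("\n\n") if p.strip()]
    chunks, current = [], ""
    for para in paragraphs:
        if current and len(current) + len(para) + 2 > max_chars:
            chunks.append(current)
            # carry the tail of the previous chunk as context
            current = current[-overlap:] + "\n\n" + para
        else:
            current = (current + "\n\n" + para) if current else para
    if current:
        chunks.append(current)
    return chunks
```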

The embedding vectors are stored in a vector DB; we use Postgres as the vector DB (via the pgvector extension).

Embedding AI model

For the private AI, Pragatix uses intfloat/multilingual-e5-large.
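One practical detail of the e5 model family: inputs must be prefixed with "query: " or "passage: " depending on their role, per the model card. The helper below prepares texts accordingly; loading the model itself (e.g. via sentence-transformers) is omitted.

```python
# The multilingual-e5 models expect role prefixes on their inputs:
# "query: " for search queries and "passage: " for indexed content.
def prepare_for_e5(texts, kind):
    """Prefix texts for embedding with an e5-family model."""
    if kind not in ("query", "passage"):
        raise ValueError("kind must be 'query' or 'passage'")
    return [f"{kind}: {t}" for t in texts]
```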

Vector DB

The Vector DB stores the embeddings as vectors, along with the chunked text and its metadata.

The embedding vectors' binary content is also stored here.
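A possible pgvector schema for this store is sketched below. The table and column names are illustrative assumptions; the dimension 1024 matches the output size of intfloat/multilingual-e5-large, and `<=>` is pgvector's cosine-distance operator.

```python
# Illustrative DDL for a pgvector-backed chunk store (names assumed).
CREATE_TABLE = """
CREATE EXTENSION IF NOT EXISTS vector;
CREATE TABLE IF NOT EXISTS chunks (
    id        BIGSERIAL PRIMARY KEY,
    item_id   TEXT NOT NULL,          -- source content item
    chunk     TEXT NOT NULL,          -- the chunked text
    metadata  JSONB,                  -- permissions, source, etc.
    embedding VECTOR(1024)            -- multilingual-e5-large output size
);
"""

# Nearest-neighbour search by cosine distance (pgvector's <=> operator).
SEARCH = """
SELECT chunk, metadata
FROM chunks
ORDER BY embedding <=> %(query_vec)s
LIMIT %(k)s;
"""
```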

Insights Engine

This business-logic component processes questions, searches for relevant organizational content, and interacts with the AI models.

The API receives the query and selects the best algorithm to generate the answer.
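The overall retrieve-then-generate flow can be sketched as below. The `embed`, `search`, and `llm` callables are stand-ins for the components described above, not the actual Pragatix interfaces, and the prompt wording is an assumption.

```python
# Hedged sketch of the Insights Engine flow: embed the question, retrieve
# the nearest chunks from the vector DB, then ask the LLM to answer using
# those chunks as context.
def answer(question, embed, search, llm, k=5):
    query_vec = embed(question)        # vectorize the question
    chunks = search(query_vec, k)      # nearest chunks from the vector DB
    context = "\n\n".join(chunks)
    prompt = (
        f"Answer using only this context:\n{context}\n\n"
        f"Question: {question}"
    )
    return llm(prompt)
```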

Local LLM Linux Server / Container

Customers can choose between OpenAI's hosted LLM models, such as GPT-5, and self-hosted open models such as OpenAI's GPT-OSS 120B, which is free for commercial use. See Supported Self-Hosted LLM Models.

Token classification model

An AI model used for specific data classification tasks, such as PII detection.

Bastion Proxy

A network proxy supporting HTTP and WebSockets, needed for analyzing AI service traffic.

 

SaaS Firewall Data flow description

Once a user is configured to use the Firewall via a PAC file, traffic to AI service URLs is directed to the Bastion Proxy in the appropriate region.

The Proxy then decrypts the traffic and forwards it to the Firewall API service hosted in the same region.

The Firewall API stores the data in the Firewall Auditing table within the database hosted in the same region, under the relevant account.

The account is identified based on the IP address or domain. In an upcoming version, it will be determined by the Proxy port to which the traffic is directed.

If Real-time Inspection is set to true, the Firewall will check data classification and guardrail policies against both the prompt and the response, depending on the Firewall's settings.
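The inspection step can be sketched as a simple hook applied to both sides of the exchange. The `check` callable is a stand-in for the Firewall API's classification and guardrail evaluation, and the return shape is an assumption.

```python
# Illustrative real-time inspection hook: when enabled, both the prompt
# and the model's response are checked against policies.
def inspect_exchange(prompt, response, check, realtime=True):
    """Return policy violations found in the prompt and the response."""
    if not realtime:
        return {"prompt": [], "response": []}
    return {"prompt": check(prompt), "response": check(response)}
```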