Vault is the consolidated retrieval and generation layer of the Intelligine platform. Customer proprietary documents, databases, and application programming interfaces are ingested into a private knowledge graph, and every natural language answer is produced by a foundation model trained from scratch on the customer corpus and formally handed over to the customer engineering organization on the 30th calendar day of the engagement.
Vault constructs a tenant-scoped knowledge graph in which every document, named entity, relationship between entities, and access control policy is explicitly modeled. The customer private model is trained against that graph rather than against a flat collection of embedded passages, so the model reasons over the structure of the customer organization, not only over the surface text of customer documents.
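The tenant-scoped graph described above could be sketched as a simple in-memory model. This is an illustrative sketch only; the class and field names (TenantGraph, Document, Entity, Relationship) are assumptions, not the actual Vault data model.

```python
# Hypothetical sketch of a tenant-scoped knowledge graph: documents, entities,
# relationships, and the access control policies that gate them, all scoped to
# a single customer tenant. Names are illustrative, not the Vault API.
from dataclasses import dataclass, field


@dataclass(frozen=True)
class Document:
    doc_id: str
    title: str
    last_revised: str  # ISO-8601 timestamp of the source's last revision


@dataclass(frozen=True)
class Entity:
    entity_id: str
    name: str
    source_doc: str  # doc_id the entity was extracted from


@dataclass(frozen=True)
class Relationship:
    subject: str    # entity_id
    predicate: str  # e.g. "reports_to", "defined_in"
    obj: str        # entity_id
    source_doc: str


@dataclass
class TenantGraph:
    """One isolated graph per customer tenant."""
    tenant_id: str
    documents: dict = field(default_factory=dict)
    entities: dict = field(default_factory=dict)
    relationships: list = field(default_factory=list)
    acl: dict = field(default_factory=dict)  # doc_id -> set of allowed groups

    def add_document(self, doc: Document, allowed_groups: set) -> None:
        self.documents[doc.doc_id] = doc
        self.acl[doc.doc_id] = set(allowed_groups)

    def add_entity(self, ent: Entity) -> None:
        self.entities[ent.entity_id] = ent

    def relate(self, rel: Relationship) -> None:
        self.relationships.append(rel)
```

Modeling access control policies as first-class graph data, rather than as an external filter, is what lets query-time traversal and permissions share one source of truth.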
When a customer user submits a natural language question, Vault traverses the customer knowledge graph using the customer private model and returns an answer bound to citations that name the source document, the relevant paragraph, and the timestamp of the source's last revision. Vault also respects the customer's existing access control list configuration, so each user observes only the subset of the graph the customer has already permitted them to see.
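The query-time behavior above, access control filtering plus citation binding, could be sketched as follows. This is a minimal sketch under stated assumptions: the function name, the passage dictionary shape, and the group-based access control list are all illustrative, and graph traversal itself is elided.

```python
# Hypothetical sketch of query-time access control and citation binding.
# `passages` stands in for whatever the graph traversal retrieved; `acl`
# maps doc_id -> set of groups permitted to read that document.

def answer_with_citations(question, passages, user_groups, acl):
    """Return only passages the user may see, each bound to a citation
    naming the source document, paragraph, and last-revised timestamp.
    (Retrieval/traversal for `question` is elided in this sketch.)"""
    visible = [
        p for p in passages
        if acl.get(p["doc_id"], set()) & set(user_groups)
    ]
    return [
        {
            "text": p["text"],
            "citation": {
                "document": p["doc_id"],
                "paragraph": p["paragraph"],
                "last_revised": p["last_revised"],
            },
        }
        for p in visible
    ]
```

Filtering before answer generation, rather than after, is the property that keeps a user from ever observing graph regions outside their existing permissions.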
1. Discovery: confirmation of customer data sources, engagement domain, and success criteria, with an architecture review attended by the customer chief information security officer and chief information officer.
2. Data preparation: cleaning and labeling of customer proprietary datasets and construction of the tenant-scoped customer knowledge graph, with a documented audit trail covering every document that enters the corpus.
3. Training: foundation model training and subsequent fine-tuning against the customer knowledge graph, executed on graphical processing units inside the customer cloud account.
4. Handover: trained model weights, supporting runtime, embedding model, customer-specific tokenizer, and operating runbook delivered to the customer engineering organization on the 30th calendar day of the engagement.
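The per-document audit trail from the data preparation step could be sketched as an append-only log keyed by content hash. The function name, entry fields, and use of SHA-256 are assumptions for illustration, not the audited mechanism Vault actually ships.

```python
# Hypothetical sketch of a per-document ingestion audit trail: one
# append-only entry per document entering the corpus, keyed by a
# content hash so later tampering is detectable. Names are illustrative.
import hashlib
from datetime import datetime, timezone


def record_ingestion(audit_log, doc_id, content, source_system, operator):
    """Append one immutable audit entry for a document entering the corpus."""
    entry = {
        "doc_id": doc_id,
        "sha256": hashlib.sha256(content.encode("utf-8")).hexdigest(),
        "source_system": source_system,
        "operator": operator,
        "ingested_at": datetime.now(timezone.utc).isoformat(),
    }
    audit_log.append(entry)
    return entry
```

Hashing the content at ingestion time means the trail can later prove exactly which bytes entered the corpus, not merely that a document with a given name did.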
All ingestion, graph construction, training, and inference operations are performed inside the customer virtual private cloud, on-premises data center, or air-gapped environment, and customer data never crosses the customer-defined network boundary.
Every factual statement returned by the customer private model is traced back to a specific source document and a specific edge in the customer knowledge graph, and unsourced output is filtered out of the response before it reaches the customer user.
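The unsourced-output filter described above could be sketched as a final pass over candidate statements before the response is returned. This is a sketch under stated assumptions: the statement shape and the idea of citing a graph edge per statement are illustrative, not the Vault implementation.

```python
# Hypothetical sketch of the grounding filter: any statement that cannot
# be traced to an existing edge in the knowledge graph is dropped before
# the response reaches the user. Names are illustrative.

def filter_unsourced(statements, known_edges):
    """Keep only statements whose cited edge exists in the knowledge graph."""
    edge_set = set(known_edges)
    return [s for s in statements if s.get("edge") in edge_set]
```

The filter is deliberately strict: a statement with no cited edge at all is treated the same as one citing a nonexistent edge, and both are removed.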
The customer retains full ownership of the trained model weights, the embedding model, the customer-specific tokenizer, and the customer knowledge graph, with no recurring application programming interface dependency back to Intelligine infrastructure.
A discovery call between customer technology leadership and an Intelligine architect produces a written scope for the Vault engagement within two conversations, including a documented mapping between customer source systems and the proposed tenant-scoped knowledge graph.