AI governance
Covers AI Use Policies, training-data clauses, output ownership, and the frameworks (ISO 42001, NIST AI RMF) that procurement teams are mapping contracts to.
AI Use Policy
An organization's internal policy governing employee use of AI tools, including approved tools, restricted data, output review, and incident handling.
Also: AI Acceptable Use Policy · AI Policy
ISO/IEC 42001
The 2023 international standard for AI management systems, the first ISO certification scheme designed specifically for organizations that develop or deploy AI.
Also: ISO 42001 · AI Management System
NIST AI Risk Management Framework
The voluntary US framework for managing AI risk, organized around four functions: Govern, Map, Measure, and Manage.
Also: NIST AI RMF · AI RMF
Output Ownership
The contractual question of who owns the outputs an AI system generates from a user's prompts: the user, the vendor, or some shared arrangement.
Also: output ownership · AI output IP · AI output rights
Prompt Confidentiality
The contractual obligation that an AI vendor will treat the user's prompts as confidential information, not training data, and not visible to other users.
Also: prompt confidentiality · input confidentiality · prompt privacy
Training-Data Clause
The contract provision that addresses whether and how a vendor can use customer data, prompts, or outputs to train, fine-tune, or improve its AI models.
Also: training data clause · no-training clause · model training opt-out
Train this into your team’s playbook.
The corporate training program turns these terms into the operational discipline your in-house team brings to every negotiation.