
ModelOp Enterprise Governance


Bridging Standard Metrics & Enterprise Needs

ModelOp Center provides a comprehensive suite of out-of-the-box (OOTB) monitors. However, enterprise requirements often demand unique calculations. This guide introduces the capabilities of custom Python-based monitors.

The Developer Value

Write standard Python code using Data Science libraries and have it automatically integrated into an enterprise-grade governance platform. No complex API integrations required.

The Governance Value

No matter how complex the model metric, the "evidence" is always captured in a standardized, auditable format for risk and compliance.

Monitors by Model Modality

Custom monitors can be built to supplement any of these OOTB categories based on your domain.

Data Science Metrics Catalog

Each metric category below includes an algorithmic definition. These definitions guide standard and custom monitor creation.

1. Generative AI / NLP Validations

Secures conversational agents and generative models against hallucinations, toxic output, and data leakage.
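As a minimal illustration of this category, the sketch below flags generated responses that leak email addresses. The function name and pattern are hypothetical; production leakage and toxicity validations would use dedicated NLP toolkits rather than a single regex.

```python
import re

# Hypothetical helper: flag generated responses that leak email addresses.
EMAIL_PATTERN = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def pii_leakage_rate(responses):
    """Fraction of generated responses containing an email-like string."""
    if not responses:
        return 0.0
    flagged = sum(1 for text in responses if EMAIL_PATTERN.search(text))
    return flagged / len(responses)

responses = [
    "Your order has shipped.",
    "Contact jane.doe@example.com for a refund.",
]
print(pii_leakage_rate(responses))  # 0.5
```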

2. Ethical Fairness & Bias

Evaluates model behavior disparities against protected classes to ensure regulatory compliance.
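A common disparity metric in this category is the disparate impact ratio. The sketch below is a minimal, self-contained version; the function name and inputs are illustrative, not the exact signature of any ModelOp monitor.

```python
def disparate_impact(outcomes, groups, protected, reference):
    """Ratio of favorable-outcome rates: protected group vs. reference group.

    outcomes: list of 1 (favorable) / 0 (unfavorable) predictions
    groups:   parallel list of group labels
    A common rule of thumb flags ratios below 0.8 (the "four-fifths rule").
    """
    def rate(group):
        selected = [o for o, g in zip(outcomes, groups) if g == group]
        return sum(selected) / len(selected)
    return rate(protected) / rate(reference)

outcomes = [1, 0, 1, 1, 0, 1, 1, 1]
groups = ["A", "A", "A", "A", "B", "B", "B", "B"]
print(disparate_impact(outcomes, groups, protected="A", reference="B"))  # 1.0
```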

3. Regression & Credit Risk

Assesses continuous prediction errors and rank-ordering capabilities, crucial for financial/credit models.

4. Classification Performance

Evaluates discrete prediction models. Requires schemas mapping score_column and label_column.
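Given the mapped score_column and label_column, the core classification metrics reduce to counting agreements at a decision threshold. The following is a minimal sketch with an assumed 0.5 threshold, not the platform's actual monitor code:

```python
def classification_metrics(scores, labels, threshold=0.5):
    """Compute accuracy, precision, and recall from raw scores and true labels."""
    preds = [1 if s >= threshold else 0 for s in scores]
    tp = sum(1 for p, l in zip(preds, labels) if p == 1 and l == 1)
    fp = sum(1 for p, l in zip(preds, labels) if p == 1 and l == 0)
    fn = sum(1 for p, l in zip(preds, labels) if p == 0 and l == 1)
    correct = sum(1 for p, l in zip(preds, labels) if p == l)
    return {
        "accuracy": correct / len(labels),
        "precision": tp / (tp + fp) if tp + fp else 0.0,
        "recall": tp / (tp + fn) if tp + fn else 0.0,
    }

print(classification_metrics([0.9, 0.2, 0.7, 0.4], [1, 0, 0, 1]))
```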

5. Data & Concept Drift

Detects shifts in input distributions over time by comparing baseline data against a sample slice.
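A widely used drift statistic for this comparison is the Population Stability Index (PSI). The sketch below bins the baseline range, compares bin proportions against the sample slice, and is illustrative only (the binning strategy and epsilon smoothing are assumptions):

```python
import math

def psi(baseline, sample, bins=5):
    """Population Stability Index between a baseline and a current sample slice."""
    lo, hi = min(baseline), max(baseline)
    edges = [lo + (hi - lo) * i / bins for i in range(1, bins)]

    def proportions(values):
        counts = [0] * bins
        for v in values:
            idx = sum(v > e for e in edges)  # bin index via edge comparisons
            counts[idx] += 1
        # Small epsilon avoids log(0) for empty bins.
        return [(c + 1e-6) / (len(values) + 1e-6 * bins) for c in counts]

    b, s = proportions(baseline), proportions(sample)
    return sum((sb - bb) * math.log(sb / bb) for bb, sb in zip(b, s))

baseline = [i / 100 for i in range(100)]
print(psi(baseline, baseline))  # 0.0: identical distributions show no drift
```

A common heuristic reads PSI below 0.1 as stable, 0.1 to 0.25 as moderate shift, and above 0.25 as significant drift.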

Enterprise Monitor Selection Pathway

Selecting the right AI monitor requires mapping the technical model type directly to enterprise business outcomes: navigate from your raw data state to the specific ModelOp monitors required.


Disclaimer: This is an enablement tool intended for guidance and is not to be used for final or automated business decision making.

Key Concepts: Execution Architectures

The execution architectures below cover both standard monitoring and agentic LLM patterns.

Execution Process


  1. Process initiated (via UI/API)
  2. Metrics job created (via MLC)
  3. Job sent to Runtime
  4. Runtime loads datasets & code
  5. Runtime executes Python source
  6. Output yielded as JSON
  7. Model Test Result attached
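Steps 4 through 7 above can be sketched in plain Python. This is an illustrative approximation of what the runtime does, not ModelOp's actual implementation: load the input asset, invoke the monitor's metrics() generator, and serialize the yielded dictionaries as the JSON test result.

```python
import csv
import io
import json

def example_metrics(records):
    """A stand-in monitor entry point: yields one JSON-serializable dict."""
    values = [float(r["score"]) for r in records]
    yield {"record_count": len(records), "mean_score": sum(values) / len(values)}

# Step 4: runtime loads the dataset (a CSV asset, simulated in-memory here).
raw = "score\n0.2\n0.4\n0.6\n"
records = list(csv.DictReader(io.StringIO(raw)))

# Steps 5-6: execute the Python source and capture yielded output as JSON.
result = {}
for payload in example_metrics(records):
    result.update(payload)
test_result_json = json.dumps(result)  # Step 7: attached as the Model Test Result
print(test_result_json)
```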

Artifact Explorer & Generator

A custom monitor is defined by specific files in a Git repository. Explore the required structure and generate contextual Data Science code boilerplates.

Required Files

custom_metrics.py

Primary Model Source Code

metadata.json

Monitor Classification Metadata

required_assets.json

Input Data Definitions

custom_metrics.py

The algorithmic brain of the monitor: the Python file whose metrics() entry point holds the Data Science logic for your use case.
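A minimal skeleton for custom_metrics.py might look like the following. The function name metrics() matches the entry point the platform scans for; the input signature and the comment annotation shown are assumptions to verify against your ModelOp deployment.

```python
import pandas as pd

# modelop.metrics  (annotation convention may vary by ModelOp version)
def metrics(df: pd.DataFrame):
    """Entry point: receives the mapped data asset, yields a metrics dict."""
    residuals = df["label"] - df["prediction"]
    yield {
        "row_count": int(len(df)),
        "mean_absolute_error": float(residuals.abs().mean()),
    }

# Local smoke test before pushing to Git:
df = pd.DataFrame({"label": [1.0, 2.0, 3.0], "prediction": [1.5, 2.0, 2.0]})
print(next(metrics(df)))
```

Keeping the yielded values as plain ints, floats, and strings ensures the output serializes cleanly into the JSON Model Test Result.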

Onboarding Roadmap

Follow this interactive guide to promote your custom monitor from a local IDE script into a production-ready ModelOp asset.

1. Connect Git Repository

Import your custom code via the ModelOp UI. Navigate to Monitors > Add Monitor and select "Git".

  • Provide the Repository URL and target Branch.
  • Assign an Access Group to control viewing permissions.
  • The system automatically scans for the metrics() entry point.
2. Freeze a Snapshot

Snapshots create an immutable version of your monitor code linked to a specific commit.

This guarantees production stability, ensuring that subsequent commits to the Git branch don't silently alter or break actively scheduled tests.

3. Map Data & Execute

Attach the monitor snapshot to your Business Model's "Monitoring" tab.

  • The UI will prompt you to map specific data assets as defined in your required_assets.json.
  • Click Play to launch the Job and produce your metrics.