# Metadata Specification

## Overview
The testing-api uses a standardized metadata schema to ensure that every evaluation run is traceable, reproducible, and verifiable. In the context of the Supervised AI platform, metadata serves as the bridge between raw model outputs and the historical context of the training and testing environment.
## Metadata Schema
All metadata is passed as a structured JSON object within the request body of the testing endpoints. This allows the platform to track model lineage and performance across different iterations.
### Core Components
The metadata object is divided into three primary categories:
- Model Identity: Specifics about the model version and architecture.
- Dataset Context: Information regarding the data subset used for the evaluation.
- Execution Environment: Environmental variables and hardware specifications.
### Field Definitions
| Field | Type | Required | Description |
| :--- | :--- | :--- | :--- |
| model_id | String (UUID) | Yes | The unique identifier for the model registered in the Supervised AI platform. |
| model_version | String | Yes | Semantic versioning or git hash of the model weights/architecture. |
| dataset_id | String | Yes | The unique identifier for the dataset used for testing. |
| subset | Enum | Yes | The specific data split: training, validation, or testing. |
| hyperparameters | Object | No | A key-value map of parameters used during the inference or testing phase (e.g., temperature, top_p). |
| tags | Array<String> | No | Custom labels for filtering results (e.g., ["production-candidate", "experimental"]). |
| timestamp | String | No | ISO 8601 timestamp of when the evaluation was executed. |
## Usage Example
When submitting a test result or starting an evaluation job, the metadata object should be structured as follows:
```json
{
  "metadata": {
    "model_id": "ai-mod-99283-x",
    "model_version": "2.4.1-rc",
    "dataset_id": "ds-imagenet-v4",
    "subset": "testing",
    "hyperparameters": {
      "batch_size": 32,
      "precision": "fp16",
      "device": "cuda"
    },
    "tags": ["sprint-12", "baseline-comparison"],
    "timestamp": "2023-10-27T10:00:00Z"
  }
}
```
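As a sketch, this payload could be submitted from a TypeScript client as shown below. The endpoint path `/v1/evaluations` and the `startEvaluation` helper are illustrative assumptions, not part of this specification:

```typescript
// Hypothetical sketch: POST the metadata payload to a testing endpoint.
// The "/v1/evaluations" path is assumed for illustration only.
const payload = {
  metadata: {
    model_id: "ai-mod-99283-x",
    model_version: "2.4.1-rc",
    dataset_id: "ds-imagenet-v4",
    subset: "testing",
    hyperparameters: { batch_size: 32, precision: "fp16", device: "cuda" },
    tags: ["sprint-12", "baseline-comparison"],
    timestamp: new Date().toISOString(),
  },
};

// Sends the metadata as the JSON request body, per the schema above.
async function startEvaluation(baseUrl: string): Promise<Response> {
  return fetch(`${baseUrl}/v1/evaluations`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  });
}
```

A rejected promise or a non-2xx response should be handled by the caller; error semantics are covered under Data Validation below.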
## Type Definitions

For developers using TypeScript or other strongly typed languages, the metadata object maps to the following interface:
```typescript
interface ModelMetadata {
  model_id: string;
  model_version: string;
  dataset_id: string;
  subset: 'training' | 'validation' | 'testing';
  hyperparameters?: Record<string, string | number | boolean>;
  tags?: string[];
  timestamp?: string; // ISO 8601 format
}
```
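A brief usage sketch (the interface is repeated so the snippet stands alone): assigning an object literal to `ModelMetadata` lets the compiler enforce the required fields and reject an invalid `subset` value at build time:

```typescript
interface ModelMetadata {
  model_id: string;
  model_version: string;
  dataset_id: string;
  subset: 'training' | 'validation' | 'testing';
  hyperparameters?: Record<string, string | number | boolean>;
  tags?: string[];
  timestamp?: string; // ISO 8601 format
}

// Compiles: all required fields are present and subset is a valid literal.
// A value like subset: "holdout" would be a compile-time error.
const meta: ModelMetadata = {
  model_id: "ai-mod-99283-x",
  model_version: "2.4.1-rc",
  dataset_id: "ds-imagenet-v4",
  subset: "validation",
  tags: ["experimental"],
};
```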
## Data Validation
The API performs strict validation on the metadata object to ensure consistency in the Supervised AI dashboard:
- **Format:** The `timestamp` must follow the ISO 8601 standard.
- **Enumeration:** The `subset` field only accepts the three predefined strings listed above.
- **Size:** The `hyperparameters` object is limited to 50 key-value pairs to ensure optimal indexing performance.
Failure to provide the required `model_id` or `dataset_id` will result in a `400 Bad Request` error, as these fields are critical for the platform's supervised model tracking capabilities.
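The rules above can be pre-checked on the client before submission to catch a `400` early. The following is an illustrative sketch, not an official SDK helper; the `validateMetadata` function name is hypothetical, and `Date.parse` is used as a lenient stand-in for strict ISO 8601 validation:

```typescript
interface ModelMetadata {
  model_id: string;
  model_version: string;
  dataset_id: string;
  subset: 'training' | 'validation' | 'testing';
  hyperparameters?: Record<string, string | number | boolean>;
  tags?: string[];
  timestamp?: string; // ISO 8601 format
}

// Mirrors the documented server-side checks: required IDs, the subset
// enumeration, the 50-pair hyperparameters limit, and timestamp parseability.
function validateMetadata(meta: ModelMetadata): string[] {
  const errors: string[] = [];
  if (!meta.model_id) errors.push("model_id is required");
  if (!meta.dataset_id) errors.push("dataset_id is required");
  if (!["training", "validation", "testing"].includes(meta.subset)) {
    errors.push("subset must be training, validation, or testing");
  }
  if (meta.hyperparameters && Object.keys(meta.hyperparameters).length > 50) {
    errors.push("hyperparameters is limited to 50 key-value pairs");
  }
  if (meta.timestamp !== undefined && Number.isNaN(Date.parse(meta.timestamp))) {
    errors.push("timestamp must be a valid ISO 8601 date-time");
  }
  return errors;
}
```

An empty result means the payload satisfies the documented checks; any returned strings describe the fields the server would reject.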