# Validation Protocols

## Overview

The Validation Protocols for the `testing-api` ensure that third-party AI integrations adhere to the structural and functional requirements of the Supervised AI platform. These automated scripts verify schema compliance, endpoint availability, and response integrity.
## Prerequisites

Before running the validation suite, ensure your environment meets the following requirements:

- **Environment:** Node.js v16.x or higher / Python 3.9+ (depending on your implementation).
- **API Key:** A valid `SUPERVISED_AI_TEST_KEY` configured in your environment variables.
- **Target URL:** The endpoint of the third-party integration you are validating.
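As a quick sanity check before invoking the suite, you can confirm the key is present in your environment (a minimal sketch; only the variable name `SUPERVISED_AI_TEST_KEY` comes from the prerequisites above):

```python
import os

def check_test_key() -> str:
    """Return the configured test key, or raise a clear error if it is missing."""
    key = os.environ.get("SUPERVISED_AI_TEST_KEY")
    if not key:
        raise RuntimeError(
            "SUPERVISED_AI_TEST_KEY is not set; export it before running the suite."
        )
    return key
```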
## Running the Validation Suite

To validate an integration, use the provided CLI utility. This script performs a handshake and executes a series of test cases against your integration's public interface.

### Standard Validation Command

```bash
# Execute validation against a local or remote endpoint
npm run validate -- --target="https://your-integration-api.com/v1" --schema="default"
```
### Parameters

| Parameter | Type | Description |
| :--- | :--- | :--- |
| `--target` | string | The base URL of the third-party integration. |
| `--schema` | string | The API schema version to validate against (e.g., `v1`, `v2`). |
| `--verbose` | boolean | (Optional) Outputs detailed request/response logs for debugging. |
## Protocol Stages

The automated script executes the following stages in sequence. A failure at any stage halts the process.

### 1. Connectivity & Authentication

The script sends a GET request to the `/health` or `/status` endpoint of the integration.

- **Expected Status:** `200 OK`
- **Required Headers:** `Content-Type: application/json`
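The stage-one check amounts to inspecting the status code and content type of the health response. A minimal sketch (illustrative only, not the actual validator; the `HealthResponse` shape is a stand-in for whatever HTTP client you use):

```python
from dataclasses import dataclass

@dataclass
class HealthResponse:
    """A simplified view of the two fields the connectivity stage inspects."""
    status_code: int
    headers: dict

def passes_connectivity_check(resp: HealthResponse) -> bool:
    """Stage 1 passes when the endpoint returns 200 OK with a JSON content type."""
    content_type = resp.headers.get("Content-Type", "")
    return resp.status_code == 200 and content_type.startswith("application/json")
```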
### 2. Schema Compliance

The validator sends standardized payloads to your processing endpoints and verifies that the response structure matches the `testing-api` specification.

**Example Request Payload:**

```json
{
  "task_id": "val_78910",
  "input_data": {
    "type": "text",
    "content": "Sample validation string"
  },
  "config": {
    "temperature": 0.5
  }
}
```
**Validation Criteria:**

- **Response Format:** Must be valid JSON.
- **Required Fields:** `status`, `output`, `metadata`.
- **Data Types:** `output` must be a string or object, depending on the task type.
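These criteria can be expressed as a small check (an illustrative sketch under the field names listed above; the authoritative rules live in the `testing-api` specification):

```python
import json

REQUIRED_FIELDS = {"status", "output", "metadata"}

def validate_response(raw: str) -> list:
    """Return a list of schema problems; an empty list means the response passes."""
    try:
        body = json.loads(raw)
    except json.JSONDecodeError:
        return ["response is not valid JSON"]
    problems = []
    missing = REQUIRED_FIELDS - body.keys()
    if missing:
        problems.append(f"missing required fields: {sorted(missing)}")
    # `output` must be a string or an object, depending on the task type.
    if "output" in body and not isinstance(body["output"], (str, dict)):
        problems.append("`output` must be a string or object")
    return problems
```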
### 3. Latency Benchmarking

The script measures the round-trip time (RTT) for three consecutive requests.

- **Warning Threshold:** > 2000 ms
- **Failure Threshold:** > 5000 ms (timeout)
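A timing loop of this shape could look like the following (a sketch only; `send_request` is a placeholder for whatever call your integration exposes, and the thresholds mirror the values above):

```python
import time

WARNING_MS = 2000
FAILURE_MS = 5000

def benchmark_rtt(send_request, attempts: int = 3) -> str:
    """Time `attempts` consecutive calls and classify the slowest round trip."""
    worst_ms = 0.0
    for _ in range(attempts):
        start = time.perf_counter()
        send_request()
        elapsed_ms = (time.perf_counter() - start) * 1000
        worst_ms = max(worst_ms, elapsed_ms)
    if worst_ms > FAILURE_MS:
        return "failure"
    if worst_ms > WARNING_MS:
        return "warning"
    return "pass"
```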
## Error Handling

When validation fails, the script returns a non-zero exit code and a JSON error report detailing the mismatch.

| Error Code | Meaning | Resolution |
| :--- | :--- | :--- |
| `ERR_SCHEMA_MISMATCH` | Response fields do not match the API structure. | Check the `testing-api` schema definitions. |
| `ERR_AUTH_FAILURE` | The integration rejected the test API key. | Verify your middleware correctly processes the `Authorization` header. |
| `ERR_TIMEOUT` | The integration took too long to respond. | Optimize model cold-start times or check network configuration. |
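Tooling that consumes the error report can mirror the table above as a lookup (illustrative only; the report's exact field names are not specified here, only the error codes):

```python
# Resolutions transcribed from the error-handling table above.
RESOLUTIONS = {
    "ERR_SCHEMA_MISMATCH": "Check the testing-api schema definitions.",
    "ERR_AUTH_FAILURE": "Verify your middleware correctly processes the Authorization header.",
    "ERR_TIMEOUT": "Optimize model cold-start times or check network configuration.",
}

def suggest_resolution(error_code: str) -> str:
    """Map a validator error code to the suggested resolution from the table."""
    return RESOLUTIONS.get(error_code, "Unknown error code; consult the full JSON report.")
```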
## Integrating with CI/CD

To automate these protocols within your deployment pipeline, include the validation script as a post-deployment step:

```yaml
# Example GitHub Actions step
- name: Validate Integration
  run: |
    npm install
    npm run validate -- --target=${{ secrets.STAGING_URL }}
  env:
    SUPERVISED_AI_TEST_KEY: ${{ secrets.TEST_KEY }}
```