SoCMate uses LLMs for entity extraction, query generation, evidence assessment, and report generation. The platform supports multiple LLM providers and lets you hot-swap models without redeployment.
All model management endpoints require the admin role. Changes to the model configuration take effect immediately for new investigations.
How It Works
SoCMate’s model gateway routes requests to the configured LLM provider. This decouples the platform from any specific provider and allows you to switch models at any time.
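As a rough illustration of this decoupling, the gateway can be thought of as a lookup from a model's display name to its provider configuration. The sketch below is hypothetical (the registry contents and the `route` helper are not part of SoCMate's API); it uses the two Azure models shown later on this page:

```python
# Minimal sketch of a model-gateway routing table (hypothetical;
# not SoCMate's actual implementation).
REGISTRY = {
    "gpt-5.3":    {"provider": "azure", "model_id": "gpt-5.3"},
    "gpt-5-mini": {"provider": "azure", "model_id": "gpt-5-mini"},
}

def route(model_name: str) -> dict:
    """Resolve a configured model name to its provider settings."""
    try:
        return REGISTRY[model_name]
    except KeyError:
        raise ValueError(f"unknown model: {model_name}") from None
```

Because callers only reference the display name, swapping the underlying provider entry changes routing for all new requests without touching any caller.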
Listing Models
View all models currently configured:
curl -X GET https://api.socmate.yourcompany.com/api/admin/models \
-H "Authorization: Bearer <admin_token>"
Response:
{
  "models": [
    {
      "model_name": "gpt-5.3",
      "model_info": {
        "id": "model_abc123",
        "provider": "azure",
        "model": "gpt-5.3"
      }
    },
    {
      "model_name": "gpt-5-mini",
      "model_info": {
        "id": "model_def456",
        "provider": "azure",
        "model": "gpt-5-mini"
      }
    }
  ],
  "total": 2
}
Adding a Model
Add a new LLM model:
curl -X POST https://api.socmate.yourcompany.com/api/admin/models \
-H "Authorization: Bearer <admin_token>" \
-H "Content-Type: application/json" \
-d '{
  "model_name": "gpt-5.3",
  "provider": "azure",
  "model_id": "gpt-5.3",
  "api_base": "https://your-openai-resource.openai.azure.com",
  "api_key": "your-azure-openai-api-key",
  "api_version": "2025-01-01-preview"
}'
Response:
{
"message": "Model added successfully"
}
Model Configuration Fields
| Field | Required | Description |
|---|---|---|
| model_name | Yes | Display name used in SoCMate (e.g., gpt-5.3) |
| provider | Yes | LLM provider: azure, openai, anthropic, etc. |
| model_id | Yes | Provider-specific model identifier |
| api_base | No | Provider API base URL (required for Azure OpenAI) |
| api_key | Yes | API key for the provider |
| api_version | No | API version (required for Azure OpenAI) |
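The requirements in the table can be checked client-side before calling the add endpoint. This is a sketch, not part of the SoCMate API: `missing_fields` is a hypothetical helper, and it assumes (per the table) that api_base and api_version become required only when the provider is azure:

```python
# Hypothetical pre-flight validation mirroring the field table above.
ALWAYS_REQUIRED = {"model_name", "provider", "model_id", "api_key"}
AZURE_ONLY = {"api_base", "api_version"}  # required only for Azure OpenAI

def missing_fields(config: dict) -> set:
    """Return the required fields that are absent or empty in config."""
    required = set(ALWAYS_REQUIRED)
    if config.get("provider") == "azure":
        required |= AZURE_ONLY
    return required - {k for k, v in config.items() if v}
```

Running the check before the POST surfaces configuration mistakes locally instead of as API errors.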
Testing a Model
Send a test prompt to verify the model is working:
curl -X POST https://api.socmate.yourcompany.com/api/admin/models/gpt-5.3/test \
-H "Authorization: Bearer <admin_token>" \
-H "Content-Type: application/json" \
-d '{
  "prompt": "Say hello in one word."
}'
Response:
{
  "model": "gpt-5.3",
  "response": "Hello.",
  "latency_ms": 320,
  "tokens": {
    "prompt": 6,
    "completion": 2,
    "total": 8
  }
}
If no prompt is provided, the default test prompt "Say hello in one word." is used.
Setting the Default Model
The default model is used for all investigation stages unless a specific model is configured per task.
View Current Configuration
curl -X GET https://api.socmate.yourcompany.com/api/admin/models/config \
-H "Authorization: Bearer <admin_token>"
Response:
{
  "default_model": "gpt-5.3",
  "available_models": ["gpt-5.3", "gpt-5-mini", "kimi-k2.5", "grok-4.1"],
  "updated_at": "2026-03-27T10:00:00Z",
  "updated_by": "admin@example.com"
}
Update Default Model
curl -X PUT https://api.socmate.yourcompany.com/api/admin/models/config \
-H "Authorization: Bearer <admin_token>" \
-H "Content-Type: application/json" \
-d '{
  "default_model": "gpt-5-mini"
}'
The model name must match an existing model. The change takes effect immediately for new investigations.
Changing the default model affects all new investigations across the platform. Test the model thoroughly before setting it as default. In-progress investigations continue using the model they started with.
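The pinning behavior described above can be pictured as each investigation capturing the default model at creation time. A minimal sketch under that assumed semantics (the `Gateway` class is illustrative, not SoCMate code):

```python
# Illustrative sketch of model pinning: an investigation records the
# default model when it starts and keeps it even if the platform
# default later changes. Hypothetical; not SoCMate's implementation.
class Gateway:
    def __init__(self, default_model: str):
        self.default_model = default_model

    def start_investigation(self) -> dict:
        # Pin the current default at creation time.
        return {"model": self.default_model, "status": "in_progress"}

gw = Gateway("gpt-5.3")
inv = gw.start_investigation()
gw.default_model = "gpt-5-mini"  # admin updates the platform default
# inv still carries the model it started with
```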
Deleting a Model
Remove a model:
curl -X DELETE https://api.socmate.yourcompany.com/api/admin/models/{model_id} \
-H "Authorization: Bearer <admin_token>"
Do not delete the model that is currently set as the default. Set a different default model first, then delete the old one.
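A small client-side guard can enforce this ordering before issuing the DELETE. The helper below is hypothetical, a sketch of the rule rather than anything the API itself enforces:

```python
def can_delete(model_name: str, default_model: str, configured: set) -> bool:
    """Allow deletion only for configured models that are not the default."""
    return model_name in configured and model_name != default_model
```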
Health Check
Check the model gateway health:
curl -X GET https://api.socmate.yourcompany.com/api/admin/models/health \
-H "Authorization: Bearer <admin_token>"
Returns the health status of the model gateway and its configured models.
Supported Providers
SoCMate supports multiple LLM providers:
| Provider | Configuration Notes |
|---|---|
| Azure OpenAI | Requires api_base and api_version. Most common for enterprise deployments. |
| OpenAI | Standard OpenAI API. Requires api_key only. |
| Anthropic | Claude models. Requires api_key only. |
| Google | Gemini models via Vertex AI or AI Studio. |
| Custom | Any OpenAI-compatible API endpoint via api_base. |
Best Practices
- Test before defaulting — Always run the test endpoint on a new model before setting it as the platform default
- Monitor quality — Different models produce different quality queries and reports; evaluate investigation output after switching models
- Keep a fallback — Maintain at least two configured models so you can switch quickly if one provider has issues
- Use appropriate models — Larger models (GPT-5.3) produce better investigation reports; smaller models (GPT-5-mini) are faster and cheaper for simpler tasks
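The keep-a-fallback practice can be automated with a small selection routine over gateway health results. A sketch under assumed inputs (the health-status strings and the `pick_model` helper are hypothetical, not part of SoCMate):

```python
def pick_model(preferred: list, health: dict) -> str:
    """Return the first model in preference order that reports healthy."""
    for name in preferred:
        if health.get(name) == "healthy":
            return name
    raise RuntimeError("no healthy model configured")
```

With two models configured, an on-call admin (or a script polling the health endpoint) can fall back to the secondary model in one PUT to the config endpoint.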