SoCMate uses large language models (LLMs) for entity extraction, query generation, evidence assessment, and report generation. The platform supports multiple LLM providers and enables hot-swapping models without redeployment.
All model management endpoints require the admin role. Changes to the model configuration take effect immediately for new investigations.

How It Works

SoCMate’s model gateway routes requests to the configured LLM provider. This decouples the platform from any specific provider and allows you to switch models at any time.
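The routing idea can be pictured with a minimal sketch: a registry maps a display name to its provider configuration, so callers never hard-code a provider. The names and structure here are illustrative assumptions, not SoCMate's actual implementation.

```python
# Illustrative gateway-style routing: resolve a model_name to its
# provider configuration. Swapping a model means editing the registry,
# not the calling code.
REGISTRY = {
    "gpt-5.3":    {"provider": "azure", "model_id": "gpt-5.3"},
    "gpt-5-mini": {"provider": "azure", "model_id": "gpt-5-mini"},
}

def route(model_name: str) -> dict:
    """Resolve a display name to its provider config, or raise."""
    try:
        return REGISTRY[model_name]
    except KeyError:
        raise ValueError(f"model not configured: {model_name}")
```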

Listing Models

View all models currently configured:
curl -X GET https://api.socmate.yourcompany.com/api/admin/models \
  -H "Authorization: Bearer <admin_token>"
Response:
{
  "models": [
    {
      "model_name": "gpt-5.3",
      "model_info": {
        "id": "model_abc123",
        "provider": "azure",
        "model": "gpt-5.3"
      }
    },
    {
      "model_name": "gpt-5-mini",
      "model_info": {
        "id": "model_def456",
        "provider": "azure",
        "model": "gpt-5-mini"
      }
    }
  ],
  "total": 2
}
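A client script can index this response by display name; a minimal Python sketch, using the sample payload above (parsing only, no network call):

```python
# Turn the /api/admin/models response into a model_name -> internal id map.
# The payload below mirrors the sample response shown above.
def index_models(payload: dict) -> dict:
    return {m["model_name"]: m["model_info"]["id"] for m in payload["models"]}

sample = {
    "models": [
        {"model_name": "gpt-5.3",
         "model_info": {"id": "model_abc123", "provider": "azure", "model": "gpt-5.3"}},
        {"model_name": "gpt-5-mini",
         "model_info": {"id": "model_def456", "provider": "azure", "model": "gpt-5-mini"}},
    ],
    "total": 2,
}
```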

Adding a Model

Add a new LLM model:
curl -X POST https://api.socmate.yourcompany.com/api/admin/models \
  -H "Authorization: Bearer <admin_token>" \
  -H "Content-Type: application/json" \
  -d '{
    "model_name": "gpt-5.3",
    "provider": "azure",
    "model_id": "gpt-5.3",
    "api_base": "https://your-openai-resource.openai.azure.com",
    "api_key": "your-azure-openai-api-key",
    "api_version": "2025-01-01-preview"
  }'
Response:
{
  "message": "Model added successfully"
}

Model Configuration Fields

Field         Required  Description
model_name    Yes       Display name used in SoCMate (e.g., gpt-5.3)
provider      Yes       LLM provider: azure, openai, anthropic, etc.
model_id      Yes       Provider-specific model identifier
api_base      No        Provider API base URL (required for Azure OpenAI)
api_key       Yes       API key for the provider
api_version   No        API version (required for Azure OpenAI)
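The rules in the table can be checked client-side before calling the add endpoint. A hedged sketch; the only provider-specific rule encoded is the Azure OpenAI one stated in the table:

```python
# Validate an add-model payload against the field rules above.
# api_base and api_version are optional in general but required
# when provider is "azure" (Azure OpenAI).
ALWAYS_REQUIRED = {"model_name", "provider", "model_id", "api_key"}

def missing_fields(payload: dict) -> list:
    """Return sorted missing field names; an empty list means valid."""
    missing = [f for f in ALWAYS_REQUIRED if not payload.get(f)]
    if payload.get("provider") == "azure":
        missing += [f for f in ("api_base", "api_version") if not payload.get(f)]
    return sorted(missing)
```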

Testing a Model

Send a test prompt to verify the model is working:
curl -X POST https://api.socmate.yourcompany.com/api/admin/models/gpt-5.3/test \
  -H "Authorization: Bearer <admin_token>" \
  -H "Content-Type: application/json" \
  -d '{
    "prompt": "Say hello in one word."
  }'
Response:
{
  "model": "gpt-5.3",
  "response": "Hello.",
  "latency_ms": 320,
  "tokens": {
    "prompt": 6,
    "completion": 2,
    "total": 8
  }
}
If no prompt is provided, the default test prompt "Say hello in one word." is used.
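When scripting against the test endpoint, the token counts in the response can be sanity-checked; a small sketch using the sample response above:

```python
# Check that a test-endpoint response is internally consistent:
# total tokens should equal prompt + completion tokens.
def tokens_consistent(resp: dict) -> bool:
    t = resp["tokens"]
    return t["total"] == t["prompt"] + t["completion"]

sample = {"model": "gpt-5.3", "response": "Hello.", "latency_ms": 320,
          "tokens": {"prompt": 6, "completion": 2, "total": 8}}
```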

Setting the Default Model

The default model is used for all investigation stages unless a specific model is configured per task.

View Current Configuration

curl -X GET https://api.socmate.yourcompany.com/api/admin/models/config \
  -H "Authorization: Bearer <admin_token>"
Response:
{
  "default_model": "gpt-5.3",
  "available_models": ["gpt-5.3", "gpt-5-mini", "kimi-k2.5", "grok-4.1"],
  "updated_at": "2026-03-27T10:00:00Z",
  "updated_by": "admin@example.com"
}

Update Default Model

curl -X PUT https://api.socmate.yourcompany.com/api/admin/models/config \
  -H "Authorization: Bearer <admin_token>" \
  -H "Content-Type: application/json" \
  -d '{
    "default_model": "gpt-5-mini"
  }'
The model name must match an existing model. The change takes effect immediately for new investigations.
Changing the default model affects all new investigations across the platform. Test the model thoroughly before setting it as default. In-progress investigations continue using the model they started with.
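The validate-then-switch guidance above can be scripted. An illustrative sketch that checks a candidate against the config endpoint's available_models list before building the PUT body (the helper name is an assumption, not part of SoCMate):

```python
import json

def build_default_update(candidate: str, config: dict) -> str:
    """Return the PUT /api/admin/models/config body, refusing unknown models.

    `config` is the GET /api/admin/models/config response.
    """
    if candidate not in config["available_models"]:
        raise ValueError(f"{candidate} is not a configured model")
    return json.dumps({"default_model": candidate})
```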

Deleting a Model

Remove a model:
curl -X DELETE https://api.socmate.yourcompany.com/api/admin/models/{model_id} \
  -H "Authorization: Bearer <admin_token>"
Do not delete the model that is currently set as the default. Set a different default model first, then delete the old one.
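The warning above can be enforced in automation with a small guard; an illustrative sketch that assumes the config response shape shown earlier:

```python
def safe_to_delete(model_name: str, config: dict) -> bool:
    """True only if the model is not the current platform default.

    `config` is the GET /api/admin/models/config response.
    """
    return model_name != config["default_model"]
```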

Health Check

Check the model gateway health:
curl -X GET https://api.socmate.yourcompany.com/api/admin/models/health \
  -H "Authorization: Bearer <admin_token>"
Returns the health status of the model gateway and its configured models.
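The health response schema is not documented here. Assuming it reports a per-model status map (an assumption, not the documented shape), a monitoring script might flag unhealthy models like this:

```python
# Assumed response shape (not documented above):
#   {"status": "...", "models": {"<model_name>": "healthy" | "unhealthy"}}
def unhealthy_models(health: dict) -> list:
    """Return the sorted names of models not reporting 'healthy'."""
    return sorted(name for name, status in health.get("models", {}).items()
                  if status != "healthy")
```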

Supported Providers

SoCMate supports multiple LLM providers:
Provider       Configuration Notes
Azure OpenAI   Requires api_base and api_version. Most common for enterprise deployments.
OpenAI         Standard OpenAI API. Requires api_key only.
Anthropic      Claude models. Requires api_key only.
Google         Gemini models via Vertex AI or AI Studio.
Custom         Any OpenAI-compatible API endpoint via api_base.

Best Practices

  • Test before defaulting — Always run the test endpoint on a new model before setting it as the platform default
  • Monitor quality — Different models produce different quality queries and reports; evaluate investigation output after switching models
  • Keep a fallback — Maintain at least two configured models so you can switch quickly if one provider has issues
  • Use appropriate models — Larger models (GPT-5.3) produce better investigation reports; smaller models (GPT-5-mini) are faster and cheaper for simpler tasks