
Utilizing External Provider Models

Introduction

In Layar 1.17, the ability to use external providers such as Azure, OpenAI, and Anthropic was added. This guide covers how to configure these providers if you do not have the hardware required to serve a model yourself.

ALL_EXTERNAL_MODEL_INFO

This is a layar.config parameter that must be provided if you want to use an external provider. The parameter is a JSON dictionary with the following fields.

{
  model_name: the API name of the model in the external provider e.g. "certaraai-dev-oai-deployment-gpt4o",
  model_provider: the type of external provider ("azure","openai", or "anthropic"),
  model_base: the common name for the model e.g. "gpt-4o",
  total_max_len: total context length for the model,
  max_generation_tokens: maximum length of generated tokens for the model,
  disallowed_generation_args: usually this is just [] (i.e. nothing), but some models require specific generation arguments to be filtered out of the general model call process
}
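
For instance, a fully populated entry for gpt-4o served directly through OpenAI (using the values listed in the OpenAI section below) would look like this:

ALL_EXTERNAL_MODEL_INFO: {
  "model_name":"gpt-4o",
  "model_provider":"openai",
  "model_base":"gpt-4o",
  "total_max_len":128000,
  "max_generation_tokens":16384,
  "disallowed_generation_args":[]
}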

OpenAI

To use OpenAI you must also provide your OpenAI API key in the OPENAI_KEY parameter in the layar.config file.

OPENAI_KEY: YOUR_API_KEY

ALL_EXTERNAL_MODEL_INFO: {
  "model_name":"gpt-4o",
  "model_provider":"openai",
  "model_base":"gpt-4o"
}

The possible model_name values are as follows.

{"model_name":"gpt-4.1",
 "model_provider":"openai",
 "model_base":"gpt-4.1",
 "total_max_len":1047576,
 "max_generation_tokens":32768,
 "disallowed_generation_args":[]},
{"model_name":"gpt-4o",
 "model_provider":"openai",
 "model_base":"gpt-4o",
 "total_max_len":128000,
 "max_generation_tokens":16384,
 "disallowed_generation_args":[]},
{"model_name":"gpt-5",
 "model_provider":"openai",
 "model_base":"gpt-5",
 "total_max_len":400000,
 "max_generation_tokens":128000,
 "disallowed_generation_args":["logprobs","temperature","top_p","frequency_penalty"]},
{"model_name":"gpt-5-nano",
 "model_provider":"openai",
 "model_base":"gpt-5",
 "total_max_len":400000,
 "max_generation_tokens":128000,
 "disallowed_generation_args":["logprobs","temperature","top_p","frequency_penalty"]},
{"model_name":"gpt-5-mini",
 "model_provider":"openai",
 "model_base":"gpt-5",
 "total_max_len":400000,
 "max_generation_tokens":128000,
 "disallowed_generation_args":["logprobs","temperature","top_p","frequency_penalty"]},
{"model_name":"o3",
 "model_provider":"openai",
 "model_base":"o3",
 "total_max_len":200000,
 "max_generation_tokens":100000,
 "disallowed_generation_args":["logprobs","temperature","top_p","frequency_penalty"]},
{"model_name":"o4-mini",
 "model_provider":"openai",
 "model_base":"o4-mini",
 "total_max_len":200000,
 "max_generation_tokens":100000,
 "disallowed_generation_args":["logprobs","temperature","top_p","frequency_penalty"]}
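
Putting it together, a layar.config that points Layar at gpt-5 (values copied from the entry above; the API key is a placeholder) would contain:

OPENAI_KEY: YOUR_API_KEY

ALL_EXTERNAL_MODEL_INFO: {
  "model_name":"gpt-5",
  "model_provider":"openai",
  "model_base":"gpt-5",
  "total_max_len":400000,
  "max_generation_tokens":128000,
  "disallowed_generation_args":["logprobs","temperature","top_p","frequency_penalty"]
}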

Azure

The Azure setup is very similar; it requires an API key to be given in the AZURE_KEY parameter. You must also provide the AZURE_ENDPOINT parameter, which is the URL of your Azure environment, along with ALL_EXTERNAL_MODEL_INFO.

AZURE_ENDPOINT: the endpoint for the client's Azure environment, e.g. https://certaraai-dev-oai.openai.azure.com/
AZURE_KEY: YOUR_API_KEY

ALL_EXTERNAL_MODEL_INFO: {
  "model_name":"certaraai-dev-oai-deployment-gpt4o",
  "model_provider":"azure",
  "model_base":"gpt-4o"
}

model_name and model_base will be different in the Azure environment: model_name is the name of your Azure deployment, while model_base identifies the underlying model. The possible model_base values are the same as the OpenAI values.
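
For example, assuming an Azure deployment of gpt-4o named certaraai-dev-oai-deployment-gpt4o (the deployment name and endpoint are illustrative; the token limits are taken from the gpt-4o entry in the OpenAI section), a full Azure configuration would look like this:

AZURE_ENDPOINT: https://certaraai-dev-oai.openai.azure.com/
AZURE_KEY: YOUR_API_KEY

ALL_EXTERNAL_MODEL_INFO: {
  "model_name":"certaraai-dev-oai-deployment-gpt4o",
  "model_provider":"azure",
  "model_base":"gpt-4o",
  "total_max_len":128000,
  "max_generation_tokens":16384,
  "disallowed_generation_args":[]
}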

Anthropic

Anthropic requires an ANTHROPIC_KEY and ALL_EXTERNAL_MODEL_INFO.

ANTHROPIC_KEY: YOUR_API_KEY

ALL_EXTERNAL_MODEL_INFO: {
  "model_name":"claude-3-haiku-20240307",
  "model_provider":"anthropic",
  "model_base":"claude"
}

The following models can be used.

{ "model_name": "claude-3-7-sonnet-20250219",
  "model_provider": "anthropic",
  "model_base": "claude",
  "total_max_len": 200000,
  "max_generation_tokens": 64000,
  "disallowed_generation_args": [ ] },
{ "model_name": "claude-sonnet-4-20250514",
  "model_provider": "anthropic",
  "model_base": "claude",
  "total_max_len": 200000,
  "max_generation_tokens": 64000,
  "disallowed_generation_args": [ ] },
{ "model_name": "claude-3-5-haiku-20241022",
  "model_provider": "anthropic",
  "model_base": "claude",
  "total_max_len": 200000,
  "max_generation_tokens": 8192,
  "disallowed_generation_args": [ ] },
{ "model_name": "claude-opus-4-20250514",
  "model_provider": "anthropic",
  "model_base": "claude",
  "total_max_len": 200000,
  "max_generation_tokens": 32000,
  "disallowed_generation_args": [ ] },
{ "model_name": "claude-opus-4-1-20250805",
  "model_provider": "anthropic",
  "model_base": "claude",
  "total_max_len": 200000,
  "max_generation_tokens": 32000,
  "disallowed_generation_args": [ "temperature" ] }