This endpoint supports various OpenAI (OAI) API operations, such as completions (v1/completions and v1/chat/completions), embeddings, and more.
For all requests that require the "model" parameter in the request body, the value must match one of the available models returned by the /oai/v1/models endpoint.
For more details on each specific endpoint, see https://platform.openai.com/docs/api-reference.
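As a minimal sketch, the flow of listing the available models and then issuing a chat completion might look like the following. The host, token, and the exact /oai/v1/chat/completions path are assumptions (inferred from the /oai/v1/models path above); substitute your own Layar deployment and credentials.

```python
import requests

# Placeholder host and token -- substitute your own Layar deployment and credentials.
BASE_URL = "https://your-layar-host"
HEADERS = {"Authorization": "Bearer <your-layar-token>"}

# List the available models; any "model" value used in later requests
# must match one of the ids returned here.
models = requests.get(f"{BASE_URL}/oai/v1/models", headers=HEADERS).json()
model_id = models["data"][0]["id"]

# Issue a chat completion using the OpenAI-compatible request format.
resp = requests.post(
    f"{BASE_URL}/oai/v1/chat/completions",
    headers=HEADERS,
    json={
        "model": model_id,
        "messages": [{"role": "user", "content": "Hello"}],
    },
)
print(resp.json()["choices"][0]["message"]["content"])
```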
Notes
Requests are not proxied to, or served by, OpenAI. This endpoint simply uses the same request and response formats defined in the OpenAI API specification. As a result, requests do not require any special or different authentication.
There are slight differences in the response payloads between Layar and OAI. For example, if requested, OAI returns usage info to track costs per token; we do not return this field or track tokens in this way. There are some other OAI-specific fields that refer to other pieces of the OAI stack; we do not mimic these.
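As a sketch of how this plays out in practice, the standard OpenAI Python client can be pointed at the Layar endpoint, with the usage field treated as optional rather than assumed. The base_url, api_key, and model id below are placeholders, not values defined by Layar.

```python
from openai import OpenAI

# Point the standard OpenAI client at Layar instead of api.openai.com.
# base_url, api_key, and the model id are placeholders for your own values.
client = OpenAI(
    base_url="https://your-layar-host/oai/v1",
    api_key="<your-layar-token>",
)

completion = client.chat.completions.create(
    model="<model-id-from-/oai/v1/models>",
    messages=[{"role": "user", "content": "Hello"}],
)

# Layar does not return token usage, so treat the field as optional
# rather than assuming it is populated the way OAI populates it.
if completion.usage is None:
    print("no usage info returned")
else:
    print(completion.usage)
```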
The intended use for this functionality is primarily to support the completions, embeddings, and models endpoints, making it easier to use Layar in workflows that were previously built against OAI.
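For instance, an existing embeddings workflow can typically be redirected by changing only the URL and credentials. The /oai/v1/embeddings path and the placeholder values below are assumptions based on the /oai/v1/models path above.

```python
import requests

# Embeddings use the same OpenAI-compatible request/response shape.
# Host, token, and model id are placeholders.
resp = requests.post(
    "https://your-layar-host/oai/v1/embeddings",
    headers={"Authorization": "Bearer <your-layar-token>"},
    json={"model": "<embedding-model-id>", "input": "Text to embed"},
)
vector = resp.json()["data"][0]["embedding"]
print(f"embedding dimension: {len(vector)}")
```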