
# Serverless.ChatRender

## Overview

### Available Operations

* render

## render

Given a list of messages forming a conversation, the API renders them into the final prompt text that will be sent to the model.
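To make the idea of "rendering" concrete, here is a minimal sketch of how a chat template turns a message list into prompt text, using the Llama 3.1 header/`<|eot_id|>` format as an illustration. The exact template the API applies depends on the model, so treat the special tokens below as an assumption rather than the service's actual output.

```python
def render_llama31(messages):
    """Sketch of a Llama 3.1-style chat template (illustrative only)."""
    prompt = "<|begin_of_text|>"
    for msg in messages:
        # Each turn is wrapped in role headers and terminated with <|eot_id|>.
        prompt += (
            f"<|start_header_id|>{msg['role']}<|end_header_id|>\n\n"
            f"{msg['content']}<|eot_id|>"
        )
    # A trailing assistant header cues the model to generate the next turn.
    prompt += "<|start_header_id|>assistant<|end_header_id|>\n\n"
    return prompt

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
]
print(render_llama31(messages))
```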

### Example Usage

```python
import os

from friendli import SyncFriendli

with SyncFriendli(
    token=os.getenv("FRIENDLI_TOKEN", ""),
) as friendli:
    res = friendli.serverless.chat_render.render(
        model="meta-llama-3.1-8b-instruct",
        messages=[
            {
                "role": "system",
                "content": "You are a helpful assistant.",
            },
            {
                "role": "user",
                "content": "Hello!",
            },
        ],
    )

    # Handle response
    print(res)
```

### Parameters

| Parameter              | Type                        | Required | Description                                                                                                                                                                                          | Example                                                                                                                   |
| ---------------------- | --------------------------- | -------- | ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------- |
| `model`                | *str*                       | ✔️       | Code of the model to use. See available model list.                                                                                                                                                  | `meta-llama-3.1-8b-instruct`                                                                                              |
| `messages`             | List[models.Message]        | ✔️       | A list of messages comprising the conversation so far.                                                                                                                                               | `[{"role": "system", "content": "You are a helpful assistant."}, {"role": "user", "content": "Hello!"}]`                  |
| `x_friendli_team`      | OptionalNullable[str]       |          | ID of team to run requests as (optional parameter).                                                                                                                                                  |                                                                                                                           |
| `chat_template_kwargs` | Dict[str, Any]              |          | Additional keyword arguments supplied to the template renderer. These parameters will be available for use within the chat template.                                                                 |                                                                                                                           |
| `tools`                | List[models.Tool]           |          | A list of tools the model may call. Use this to provide a list of functions the model may generate JSON inputs for. When `tools` is specified, the `min_tokens` and `response_format` fields are unsupported. |                                                                                                                           |
| `retries`              | Optional[utils.RetryConfig] |          | Configuration to override the default retry behavior of the client.                                                                                                                                  |                                                                                                                           |
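For the `tools` parameter, a plain-dict sketch of a function tool in the widely used OpenAI-style shape is shown below. Whether `models.Tool` accepts this exact dict form is an assumption here, as is the `get_weather` function itself; check the SDK's `models` module for the authoritative fields.

```python
# Hypothetical function-tool definition (OpenAI-style shape, for illustration).
get_weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",  # hypothetical tool name
        "description": "Get the current weather for a city.",
        "parameters": {
            # JSON Schema describing the arguments the model may generate.
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name."},
            },
            "required": ["city"],
        },
    },
}

tools = [get_weather_tool]
print(tools[0]["function"]["name"])
```

A list like this would be passed as `tools=tools` in the `render` call; remember that `min_tokens` and `response_format` are unsupported when `tools` is specified.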

### Response

**models.ServerlessChatRenderSuccess**

### Errors

| Error Type      | Status Code | Content Type |
| --------------- | ----------- | ------------ |
| models.SDKError | 4XX, 5XX    | \*/\*        |