LLM Call

This function lets you call an LLM with a system prompt, a user prompt, and an optional set of data parameters. The call is asynchronous: save the requestId from the response and use the status endpoint to check the status of the request and retrieve the output (a polling sketch is shown after the example response below).


API Reference:

POST /llm/llm_call

Function Signature

Python

client = Masterpiecex()
client.llms.call(**kwargs) -> GenerateResponseObject

Node

const client = new Masterpiecex();
client.llms.call(body, options?): Promise<GenerateResponseObject>

Parameters

| Python | Node | Description |
| --- | --- | --- |
| system_prompt: str | systemPrompt: string | The system prompt to use for the LLM call |
| user_prompt: str | userPrompt: string | The user prompt to use for the LLM call |
| data_parms: Optional[DataParms] { max_tokens: Optional[float], temperature: Optional[float] } | dataParms?: DataParms { maxTokens?: number, temperature?: number } | The data parameters to use for the LLM call. Both are optional. max_tokens - the maximum number of tokens to generate for the LLM call. temperature - the sampling temperature, between 0 and 2. Higher values like 0.8 make the output more random, while lower values like 0.2 make it more focused and deterministic. |

Returns

GenerateResponseObject

Example

Python

import os
from mpx_genai_sdk import Masterpiecex

client = Masterpiecex(
    bearer_token=os.environ.get("MPX_SDK_BEARER_TOKEN"),  # This is the default and can be omitted
)
generate_response_object = client.llms.call(
    system_prompt="You are a helpful assistant",
    user_prompt="Tell me three things I didn't know",
    data_parms={
        "temperature": 0.5,
        "max_tokens": 1000,
    },
)
print(generate_response_object.request_id)

Node

import Masterpiecex from 'mpx-genai-sdk';

const client = new Masterpiecex({
  bearerToken: process.env['MPX_SDK_BEARER_TOKEN'], // This is the default and can be omitted
});

const generateResponseObject = await client.llms.call({
  systemPrompt: 'You are a helpful assistant',
  userPrompt: "Tell me three things I didn't know",
  dataParms: {
    maxTokens: 1000,
    temperature: 0.3
  }
});

console.log(generateResponseObject.requestId);

Example Response

Python

print(response.outputs.output)  # The response from the LLM call
print(response.balance)  # Remaining credits available on the account
print(response.request_id)  # Used to check the status, e.g. client.status.retrieve(request_id)
print(response.status)  # Current status of the request - typically "pending" on initial submission

Node

console.log(response.outputs.output); // The response from the LLM call
console.log(response.balance); // Remaining credits available on the account
console.log(response.requestId); // Used to check the status, e.g. client.status.retrieve(requestId)
console.log(response.status); // Current status of the request - typically "pending" on initial submission

{
    "requestId": "xxxxx",
    "status": "complete",
    "processingTime_s": 5.777,
    "outputs": {
        "output": "While I can't know exactly what you do or don't know, I can share three interesting and less commonly known facts that might be new to you:\n\n1. **Banana Plant Movement**: Banana plants are technically herbs and not trees. They grow from a root structure called a corm. What's fascinating is that banana plants can actually walk over time. As the corm grows, new shoots can emerge several feet away from where the original plant was, effectively making the plant seem like it is walking.\n\n2. **Octopus Hearts**: Octopuses have three hearts. Two of the hearts pump blood to the gills, where it picks up oxygen, and the third heart pumps the oxygenated blood to the rest of the body. Interestingly, when an octopus swims, the heart that delivers blood to the rest of the body actually stops beating, which is why they prefer crawling over swimming—it's less tiring for them.\n\n3. **Longest Word in English**: The longest word"
    }
}
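
Checking Request Status

Because the call returns while the request may still be processing, you typically poll the status endpoint until it completes. Below is a minimal Python sketch of that loop using client.status.retrieve (referenced in the example above); it assumes the status object exposes the same status and outputs.output fields shown in the example response, which is an assumption rather than a guarantee.

import os
import time

from mpx_genai_sdk import Masterpiecex

client = Masterpiecex(
    bearer_token=os.environ.get("MPX_SDK_BEARER_TOKEN"),
)

response = client.llms.call(
    system_prompt="You are a helpful assistant",
    user_prompt="Tell me three things I didn't know",
)

# Poll the status endpoint until the request is no longer pending.
# The field names (status, outputs.output) are assumed to mirror the
# example response shown above.
status = client.status.retrieve(response.request_id)
while status.status == "pending":
    time.sleep(2)  # wait between polls to avoid hammering the API
    status = client.status.retrieve(response.request_id)

if status.status == "complete":
    print(status.outputs.output)  # the LLM's reply
else:
    print(f"Request ended with status: {status.status}")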