
generate-reply-api

ai-research-agent / agents/generate-reply-api

Generate

CHAT_MODELS

const CHAT_MODELS: object;

Default models for each chat provider, plus per-provider lists of available models (anthropic, groq, openai, together, xai)

Type declaration

  anthropic: object[]
  defaults: { anthropic: string; groq: string; openai: string; together: string; xai: string; }
  groq: object[]
  openai: object[]
  together: object[]
  xai: object[]
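To make the shape above concrete, here is an illustrative sketch of how a CHAT_MODELS object of this type could be used to resolve a model name. The model IDs and names below are placeholders, not the real contents of CHAT_MODELS.

```javascript
// Placeholder data matching the documented shape; not the library's real model list.
const CHAT_MODELS = {
  defaults: {
    anthropic: "example-anthropic-model",
    groq: "example-groq-model",
    openai: "example-openai-model",
    together: "example-together-model",
    xai: "example-xai-model",
  },
  groq: [
    { id: "1", model: "example-groq-model", name: "Example Groq Model" },
  ],
  // anthropic, openai, together, and xai hold similar ChatModel arrays.
};

// Resolve a model name: use the caller's choice, else the provider's default.
function resolveModel(provider, model) {
  return model ?? CHAT_MODELS.defaults[provider];
}
```

This mirrors the behavior documented for options.model below: an explicit model wins, otherwise the provider's default entry is used.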


generateLanguageModelReply()

function generateLanguageModelReply(query, options): Promise<{
content: string;
error: string;
}>

Generates a reply using the specified AI provider and model. For Google (Gemini), see the Google Vertex docs and API key setup.

This function utilizes transformer-based language models, which process input in the following stages:

  1. Input Embedding: Converts input text into numerical vectors.
  2. Positional Encoding: Adds position information to maintain word order.
  3. Multi-Head Attention: Processes relationships between words in parallel.
  4. Feed-Forward Networks: Further processes the attention output.
  5. Layer Normalization & Residual Connections: Stabilizes learning and prevents vanishing gradients.
  6. Output Layer: Generates probabilities for next tokens.
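The multi-head attention step (3) can be sketched for a single head as scaled dot-product attention. This is an illustrative toy, not the library's implementation: real models use learned query/key/value projections, many heads, and batched GPU math.

```javascript
// Numerically stable softmax over an array of scores.
function softmax(xs) {
  const max = Math.max(...xs);
  const exps = xs.map((x) => Math.exp(x - max));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map((e) => e / sum);
}

const dot = (a, b) => a.reduce((s, x, i) => s + x * b[i], 0);

// Q, K, V: arrays of d-dimensional vectors, one per token.
function attention(Q, K, V) {
  const d = K[0].length;
  return Q.map((q) => {
    // Similarity of this query to every key, scaled by sqrt(d).
    const weights = softmax(K.map((k) => dot(q, k) / Math.sqrt(d)));
    // Weighted sum of the value vectors.
    return V[0].map((_, j) => weights.reduce((s, w, i) => s + w * V[i][j], 0));
  });
}
```

Each output token is a weighted mix of all value vectors, with weights set by how strongly its query matches each key.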

Parameters

  query (string | any[]): User's input query text string or LangChain messages array.

  options ({ apiKey: string; model?: string; provider: string; temperature?: number; }): Options.

  options.apiKey (string): API key for the specified provider.

  options.model? (string): Optional model name. If not provided, uses the provider's default model.

  options.provider (string): LLM provider: groq, openai, anthropic, together, xai, google.

  options.temperature? (number): Temperature controls the overall confidence of the model's scores (the logits) by dividing them before the softmax. Values below 1.0 sharpen the distribution, widening the relative gaps between tokens (more deterministic); values above 1.0 flatten it, shrinking those gaps (less deterministic). At 1.0 the scores are unchanged, giving the original distribution the model was trained to optimize for.
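The effect of temperature can be shown with a small sketch of logits divided by the temperature before a softmax. This illustrates the general mechanism, not this library's internals.

```javascript
// Softmax over logits scaled by temperature.
function softmaxWithTemperature(logits, temperature = 1.0) {
  const scaled = logits.map((x) => x / temperature);
  const max = Math.max(...scaled);
  const exps = scaled.map((x) => Math.exp(x - max));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map((e) => e / sum);
}

const logits = [2.0, 1.0, 0.5];
console.log(softmaxWithTemperature(logits, 0.5)); // sharper: top token dominates
console.log(softmaxWithTemperature(logits, 1.0)); // the trained distribution
console.log(softmaxWithTemperature(logits, 2.0)); // flatter: more diverse sampling
```

Lowering the temperature concentrates probability on the highest-scoring token; raising it spreads probability across alternatives.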

Returns

Promise<{ content: string; error: string; }>

Generated response content, along with an error message if the request failed

Author

AI Research Contributors

Example

const response = await generateLanguageModelReply(
  "Explain neural networks",
  { provider: "groq", apiKey: "your-api-key" }
);

Other

ChatModel

Properties

id
id: string;

The internal ID of the model

model
model: string;

The model name

name
name: string;

The display name of the model


ChatModels

Properties

defaults
defaults: Object;

The default models for the chat providers

groq
groq: ChatModel[];

List of models available for Groq