IInferenceParams

Namespace: LLama.Abstractions

The parameters used for inference.

public interface IInferenceParams

Attributes NullableContextAttribute

Properties

TokensKeep

The number of tokens to keep from the initial prompt.

public abstract int TokensKeep { get; set; }

Property Value

Int32

MaxTokens

The number of new tokens to predict (n_predict). Set to -1 to generate indefinitely until the response is complete.

public abstract int MaxTokens { get; set; }

Property Value

Int32

AntiPrompts

Sequences where the model will stop generating further tokens.

public abstract IReadOnlyList<string> AntiPrompts { get; set; }

Property Value

IReadOnlyList<String>

SamplingPipeline

Set a custom sampling pipeline to use.

public abstract ISamplingPipeline SamplingPipeline { get; set; }

Property Value

ISamplingPipeline

DecodeSpecialTokens

If true, special tokens will be decoded to text in the output. If false, they will be invisible.

public abstract bool DecodeSpecialTokens { get; set; }

Property Value

Boolean
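
A minimal sketch of how these properties are typically set together, assuming the library's concrete `InferenceParams` class (in `LLama.Common`) implements this interface; property values here are illustrative, not recommendations:

```csharp
using System.Collections.Generic;
using LLama.Common; // assumed concrete implementation of IInferenceParams

var inferenceParams = new InferenceParams
{
    TokensKeep = 64,                            // retain the first 64 prompt tokens when the context overflows
    MaxTokens = 256,                            // stop after 256 new tokens (-1 = generate until complete)
    AntiPrompts = new List<string> { "User:" }, // stop generating when the model emits "User:"
    DecodeSpecialTokens = false                 // keep special tokens invisible in the output text
};
```

The resulting object can then be passed to an executor's inference call; `SamplingPipeline` is left at its default here, but a custom `ISamplingPipeline` can be assigned to control sampling behavior.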
