# IInferenceParams

Namespace: LLama.Abstractions

The parameters used for inference.

```csharp
public interface IInferenceParams
```

Attributes: NullableContextAttribute
## Properties
### TokensKeep

Number of tokens to keep from the initial prompt.

```csharp
int TokensKeep { get; set; }
```

#### Property Value

Int32
### MaxTokens

How many new tokens to predict (`n_predict`). Set to -1 to generate indefinitely until the response completes.

```csharp
int MaxTokens { get; set; }
```

#### Property Value

Int32
### AntiPrompts

Sequences where the model will stop generating further tokens.

```csharp
IReadOnlyList<string> AntiPrompts { get; set; }
```

#### Property Value

IReadOnlyList&lt;String&gt;
### SamplingPipeline

Set a custom sampling pipeline to use.

```csharp
ISamplingPipeline SamplingPipeline { get; set; }
```

#### Property Value

ISamplingPipeline
### DecodeSpecialTokens

If true, special tokens will be converted to their text representation in the output. If false, they will be invisible.

```csharp
bool DecodeSpecialTokens { get; set; }
```

#### Property Value

Boolean