ConversationExtensions
Namespace: LLama.Batched
Extension methods for Conversation
Inheritance Object → ConversationExtensions
Attributes NullableContextAttribute, NullableAttribute, ExtensionAttribute
Methods
Sample(Conversation, SafeLLamaSamplerChainHandle, Int32)
Sample a token from this conversation using the given sampler chain
Parameters
conversation
Conversation
Conversation to sample from
sampler
SafeLLamaSamplerChainHandle
The sampler chain to sample with
offset
Int32
Offset, from the end of the conversation, of the logits to sample from; see Conversation.GetSampleIndex(Int32) for more details
Returns
Sample(Conversation, ISamplingPipeline, Int32)
Sample a token from this conversation using the given sampling pipeline
Parameters
conversation
Conversation
Conversation to sample from
sampler
ISamplingPipeline
The sampling pipeline to sample with
offset
Int32
Offset, from the end of the conversation, of the logits to sample from; see Conversation.GetSampleIndex(Int32) for more details
Returns
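A minimal generation-loop sketch showing both Sample overloads in context. This assumes a BatchedExecutor and a prompted Conversation already exist and have been inferred at least once so that logits are available; `executor` and `conversation` are placeholder names, not part of this API.

```csharp
using LLama.Batched;
using LLama.Sampling;

// Assumed from earlier setup (hypothetical names):
//   BatchedExecutor executor;
//   Conversation conversation;  // already prompted and inferred

var sampler = new DefaultSamplingPipeline();

// Sample the next token from the logits at the very end of the
// conversation (offset 0). The SafeLLamaSamplerChainHandle overload
// is used the same way with a raw llama.cpp sampler chain.
var token = conversation.Sample(sampler, offset: 0);

// Feed the sampled token back in to continue generation.
conversation.Prompt(token);
```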
Rewind(Conversation, Int32)
Rewind a Conversation back to an earlier state by removing tokens from the end
Parameters
conversation
Conversation
The conversation to rewind
tokens
Int32
The number of tokens to rewind
Exceptions
ArgumentOutOfRangeException
Thrown if the tokens parameter is larger than TokenCount
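A short sketch of using Rewind to discard the most recent tokens, e.g. to undo the last generated answer. `conversation` and `lastAnswerTokenCount` are assumed to exist from earlier code; they are illustrative names only.

```csharp
// Undo the last answer by removing its tokens from the end.
// Rewinding by more than TokenCount throws ArgumentOutOfRangeException,
// so clamp when the count is not known to be safe:
var safeCount = Math.Min(lastAnswerTokenCount, conversation.TokenCount);
conversation.Rewind(safeCount);
```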
ShiftLeft(Conversation, Int32, Int32)
Shift all tokens over to the left, removing "count" tokens from the start and shifting everything over. Leaves "keep" tokens at the start completely untouched. This can be used to free up space when the context gets full, keeping the prompt at the start intact.
Parameters
conversation
Conversation
The conversation to shift
count
Int32
The number of tokens to remove, shifting everything after them over by that amount
keep
Int32
The number of tokens at the start which should not be shifted
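A sketch of using ShiftLeft to free context space while preserving the initial prompt, per the description above. `conversation` and `promptTokenCount` are assumed to come from earlier setup and are illustrative names.

```csharp
// Keep the original prompt untouched at the start...
int keep = promptTokenCount;

// ...and drop roughly half of the remaining tokens, shifting the
// rest left to make room for further generation.
int count = (conversation.TokenCount - keep) / 2;

conversation.ShiftLeft(count, keep);
```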