Conversation
Namespace: LLama.Batched
A single conversation thread that can be prompted (adding tokens from the user) or inferred (extracting a token from the LLM)
public sealed class Conversation : System.IDisposable
Inheritance Object → Conversation
Implements IDisposable
Properties
Executor
The executor to which this conversation belongs
public BatchedExecutor Executor { get; }
Property Value
BatchedExecutor
ConversationId
Unique ID for this conversation
public LLamaSeqId ConversationId { get; }
Property Value
LLamaSeqId
TokenCount
Total number of tokens in this conversation; this cannot exceed the context length.
public int TokenCount { get; }
Property Value
Int32
IsDisposed
Indicates if this conversation has been disposed; nothing can be done with a disposed conversation
public bool IsDisposed { get; }
Property Value
Boolean
RequiresInference
Indicates if this conversation is waiting for inference to be run on the executor. "Prompt" and "Sample" cannot be called when this is true.
public bool RequiresInference { get; }
Property Value
Boolean
RequiresSampling
Indicates if this conversation should be sampled.
public bool RequiresSampling { get; }
Property Value
Boolean
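As a minimal sketch, the two flags above drive the prompt → infer → sample cycle. The `executor.Create()` factory call and the prompt text below are illustrative assumptions, not part of this page:

```csharp
// Illustrative cycle; assumes an existing BatchedExecutor `executor`.
var conversation = executor.Create();

conversation.Prompt("Hello");
// RequiresInference is now true: Prompt and Sample would throw until
// the executor has processed the pending batch.

await executor.Infer();
// RequiresSampling is now true: logits for this conversation are ready.

var logits = conversation.Sample();
```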
Methods
Finalize()
Finalizer for Conversation
protected void Finalize()
Dispose()
End this conversation, freeing all resources used by it
public void Dispose()
Exceptions
Fork()
Create a copy of the current conversation
public Conversation Fork()
Returns
Conversation
Exceptions
Remarks:
The copy shares internal state with the original, so it consumes very little extra memory.
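As a sketch, forking lets several continuations diverge from one shared prefix and be evaluated together in a single batch (the prompt strings below are illustrative):

```csharp
// Fork: the copy shares the conversation's KV cache state so far.
var other = conversation.Fork();

// The two branches can now diverge.
conversation.Prompt(" Paris");
other.Prompt(" a city");

// Both branches are evaluated together in the next batch.
await executor.Infer();
```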
Sample()
Get the logits from this conversation, ready for sampling
public Span<float> Sample()
Returns
Span&lt;float&gt;
Exceptions
CannotSampleRequiresPromptException
Thrown if this conversation was not prompted before the previous call to Infer()
CannotSampleRequiresInferenceException
Thrown if Infer() must be called on the executor
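The returned logits can be turned into a token by any sampling strategy; a greedy (argmax) pick is sketched below. The cast from an integer index to LLamaToken is an assumption of this sketch:

```csharp
Span<float> logits = conversation.Sample();

// Greedy selection: pick the highest-scoring token id.
var best = 0;
for (var i = 1; i < logits.Length; i++)
    if (logits[i] > logits[best])
        best = i;

// Feed the chosen token back in and run inference again.
conversation.Prompt((LLamaToken)best);
await executor.Infer();
```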
Prompt(String)
Add tokens to this conversation
public void Prompt(string input)
Parameters
input
String
Prompt(List<LLamaToken>)
Add tokens to this conversation
public void Prompt(List<LLamaToken> tokens)
Parameters
tokens
List<LLamaToken>
Exceptions
AlreadyPromptedConversationException
Prompt(ReadOnlySpan<LLamaToken>)
Add tokens to this conversation
public void Prompt(ReadOnlySpan<LLamaToken> tokens)
Parameters
tokens
ReadOnlySpan<LLamaToken>
Exceptions
AlreadyPromptedConversationException
Prompt(LLamaToken)
Add a single token to this conversation
public void Prompt(LLamaToken token)
Parameters
token
LLamaToken
Exceptions
AlreadyPromptedConversationException
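For example, input can be tokenized once and passed through the span overload. The `Tokenize` call and its location on the executor's context are assumptions of this sketch:

```csharp
// Tokenize with the executor's context, then prompt with the tokens.
LLamaToken[] tokens = executor.Context.Tokenize("The quick brown fox");
conversation.Prompt(tokens.AsSpan());   // ReadOnlySpan<LLamaToken> overload
```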
Modify(ModifyKvCache)
Directly modify the KV cache of this conversation
public void Modify(ModifyKvCache modifier)
Parameters
modifier
ModifyKvCache
Exceptions
CannotModifyWhileRequiresInferenceException
Thrown if this method is called while Conversation.RequiresInference == true
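A sketch of a typical use: rewinding the conversation by removing the last few tokens from the KV cache. The delegate shape (an end position plus a cache accessor with a `Remove` method) is an assumption of this sketch:

```csharp
// Rewind this conversation by `n` tokens (delegate signature assumed).
const int n = 2;
conversation.Modify((end, kv) =>
{
    // Remove the last `n` cached positions...
    kv.Remove(end - n, n);

    // ...and report the new end position of the conversation.
    return end - n;
});
```

Note that, per the exception above, this must not be called while RequiresInference is true.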