Conversation

Namespace: LLama.Batched

A single conversation thread that can be prompted (adding tokens from the user) or inferred (extracting a token from the LLM).

public sealed class Conversation : System.IDisposable

Inheritance Object → Conversation
Implements IDisposable

Properties

Executor

The executor which this conversation belongs to

public BatchedExecutor Executor { get; }

Property Value

BatchedExecutor

ConversationId

Unique ID for this conversation

public LLamaSeqId ConversationId { get; }

Property Value

LLamaSeqId

TokenCount

Total number of tokens in this conversation; cannot exceed the context length.

public int TokenCount { get; }

Property Value

Int32

IsDisposed

Indicates whether this conversation has been disposed; nothing can be done with a disposed conversation.

public bool IsDisposed { get; }

Property Value

Boolean

RequiresInference

Indicates whether this conversation is waiting for inference to be run on the executor. Prompt() and Sample() cannot be called while this is true.

public bool RequiresInference { get; }

Property Value

Boolean

RequiresSampling

Indicates that this conversation should be sampled.

public bool RequiresSampling { get; }

Property Value

Boolean

Methods

Finalize()

Finalizer for Conversation

protected void Finalize()

Dispose()

End this conversation, freeing all resources used by it

public void Dispose()

Exceptions

ObjectDisposedException

Fork()

Create a copy of the current conversation

public Conversation Fork()

Returns

Conversation

Exceptions

ObjectDisposedException

Remarks:

The copy shares internal state, so consumes very little extra memory.
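Forking is useful for speculative or tree-structured generation: both copies can then be prompted and sampled independently. A minimal, hedged sketch (assuming `conversation` is an existing Conversation on a BatchedExecutor, and `tokenA`/`tokenB` are LLamaToken values produced elsewhere):

```csharp
// Both branches start from the same KV cache state and diverge from here.
// Because internal state is shared, the fork consumes very little extra memory.
using Conversation branch = conversation.Fork();

branch.Prompt(tokenA);       // explore one continuation...
conversation.Prompt(tokenB); // ...while the original takes another

await executor.Infer();      // a single inference pass advances both branches
```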

Sample()

Get the logits from this conversation, ready for sampling

public Span<float> Sample()

Returns

Span<Single>

Exceptions

ObjectDisposedException

CannotSampleRequiresPromptException
Thrown if this conversation was not prompted before the previous call to Infer()

CannotSampleRequiresInferenceException
Thrown if Infer() must be called on the executor
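Together with the Prompt overloads below and the RequiresInference/RequiresSampling flags above, Sample() forms a prompt → infer → sample loop. A minimal sketch (model loading and tokenization are elided; `executor` is assumed to be an already-constructed BatchedExecutor, and `ArgMax` is a hypothetical helper that picks a token from the logits):

```csharp
// Hedged sketch of the batched-inference loop, assuming `tokens` is a
// tokenized prompt (List<LLamaToken>).
using Conversation conversation = executor.Create();

// Queue the prompt tokens; no inference happens yet.
conversation.Prompt(tokens);

for (var i = 0; i < 32; i++)
{
    // Run inference for every conversation queued on the executor.
    await executor.Infer();

    // Logits for this conversation's last token, ready for sampling.
    Span<float> logits = conversation.Sample();

    // Pick a token from the logits (greedy argmax shown here for brevity),
    // then feed it back in to continue generation.
    LLamaToken next = ArgMax(logits); // hypothetical helper, not part of this API
    conversation.Prompt(next);
}
```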

Prompt(String)

Add tokens to this conversation

public void Prompt(string input)

Parameters

input String

Prompt(List<LLamaToken>)

Add tokens to this conversation

public void Prompt(List<LLamaToken> tokens)

Parameters

tokens List<LLamaToken>

Exceptions

ObjectDisposedException

AlreadyPromptedConversationException

Prompt(ReadOnlySpan<LLamaToken>)

Add tokens to this conversation

public void Prompt(ReadOnlySpan<LLamaToken> tokens)

Parameters

tokens ReadOnlySpan<LLamaToken>

Exceptions

ObjectDisposedException

AlreadyPromptedConversationException

Prompt(LLamaToken)

Add a single token to this conversation

public void Prompt(LLamaToken token)

Parameters

token LLamaToken

Exceptions

ObjectDisposedException

AlreadyPromptedConversationException

Modify(ModifyKvCache)

Directly modify the KV cache of this conversation

public void Modify(ModifyKvCache modifier)

Parameters

modifier ModifyKvCache

Exceptions

CannotModifyWhileRequiresInferenceException
Thrown if this method is called while Conversation.RequiresInference == true
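The exact shape of the ModifyKvCache delegate is not shown on this page; the sketch below assumes it receives the current end position and a KV-cache accessor and returns the new end position, which matches its typical use for operations such as rewinding a conversation. Treat the accessor method name as an assumption:

```csharp
// Assumed delegate shape: (end position, cache accessor) => new end position.
// Rewind this conversation by 8 tokens, discarding the most recent entries,
// so the next Prompt() continues from the earlier position.
conversation.Modify((end, kv) =>
{
    kv.Remove(end - 8, 8); // assumed accessor method: drop 8 positions from the cache
    return end - 8;        // report the new logical end of the conversation
});
```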