

LLamaContext

Namespace: LLama

A llama_context, which holds all the context required to interact with a model

public sealed class LLamaContext : System.IDisposable

Inheritance Object → LLamaContext
Implements IDisposable
Attributes NullableContextAttribute, NullableAttribute

Properties

ContextSize

Total number of tokens in the context

public uint ContextSize { get; }

Property Value

UInt32

EmbeddingSize

Dimension of embedding vectors

public int EmbeddingSize { get; }

Property Value

Int32

Params

The context params set for this context

public IContextParams Params { get; }

Property Value

IContextParams

NativeHandle

The native handle, which is passed to the native APIs

public SafeLLamaContextHandle NativeHandle { get; }

Property Value

SafeLLamaContextHandle

Remarks:

Be careful how you use this!

Encoding

The encoding set for this model, used to handle text input.

public Encoding Encoding { get; }

Property Value

Encoding

GenerationThreads

Get or set the number of threads to use for generation

public int GenerationThreads { get; set; }

Property Value

Int32

BatchThreads

Get or set the number of threads to use for batch processing

public int BatchThreads { get; set; }

Property Value

Int32

BatchSize

Get the maximum batch size for this context

public uint BatchSize { get; }

Property Value

UInt32

Vocab

Get the special tokens for the model associated with this context

public Vocabulary Vocab { get; }

Property Value

Vocabulary

Constructors

LLamaContext(LLamaWeights, IContextParams, ILogger)

Create a new LLamaContext for the given LLamaWeights

public LLamaContext(LLamaWeights model, IContextParams params, ILogger logger)

Parameters

model LLamaWeights

params IContextParams

logger ILogger

Exceptions

ObjectDisposedException
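A minimal sketch of constructing a context, assuming the LLamaSharp package is referenced. The model path is a placeholder, and it is assumed here that ModelParams implements IContextParams and that the logger argument may be omitted or passed as null; check the version of the library you are using.

```csharp
using LLama;
using LLama.Common;

// "model.gguf" is a hypothetical placeholder path.
var parameters = new ModelParams("model.gguf")
{
    ContextSize = 2048,
};

// Load the model weights once; they can be shared between contexts.
using var weights = LLamaWeights.LoadFromFile(parameters);

// Create a context over those weights (logger assumed optional here).
using var context = new LLamaContext(weights, parameters);
```

Both the weights and the context implement IDisposable, so `using` declarations ensure the native resources are released deterministically.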

Methods

Tokenize(String, Boolean, Boolean)

Tokenize a string.

public LLamaToken[] Tokenize(string text, bool addBos, bool special)

Parameters

text String

addBos Boolean
Whether to add a BOS (beginning-of-sequence) token to the text.

special Boolean
Allow tokenizing special and/or control tokens which otherwise are not exposed and treated as plaintext.

Returns

LLamaToken[]
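A short usage sketch for Tokenize, assuming a `context` constructed as above. The prompt string is arbitrary.

```csharp
// Tokenize a prompt, prepending the BOS token and treating any
// special/control token text as plaintext.
LLamaToken[] tokens = context.Tokenize("Hello, world!", addBos: true, special: false);
Console.WriteLine($"Prompt tokenized into {tokens.Length} tokens.");
```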

DeTokenize(IReadOnlyList<LLamaToken>)

Caution

Use a StreamingTokenDecoder instead


Detokenize the tokens to text.

public string DeTokenize(IReadOnlyList<LLamaToken> tokens)

Parameters

tokens IReadOnlyList<LLamaToken>

Returns

String
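Since DeTokenize is deprecated in favour of StreamingTokenDecoder (see the caution above), a sketch of the replacement may be useful. The `AddRange` and `Read` member names are assumptions based on the decoder's streaming design; verify them against your library version.

```csharp
// A StreamingTokenDecoder accumulates tokens and handles multi-byte
// characters that can be split across token boundaries, which a naive
// per-token detokenize cannot do correctly.
var decoder = new StreamingTokenDecoder(context);
decoder.AddRange(tokens);
string text = decoder.Read();
```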

SaveState(String)

Save the state to the specified path.

public void SaveState(string filename)

Parameters

filename String

SaveState(String, LLamaSeqId)

Save the state of a particular sequence to the specified path.

public void SaveState(string filename, LLamaSeqId sequence)

Parameters

filename String

sequence LLamaSeqId

GetState()

Get the state data as an opaque handle, which can be loaded later using LLamaContext.LoadState(State)

public State GetState()

Returns

State

Remarks:

Use LLamaContext.SaveState(String) if you intend to save this state to disk.

GetState(LLamaSeqId)

Get the state data as an opaque handle, which can be loaded later using LLamaContext.LoadState(SequenceState, LLamaSeqId)

public SequenceState GetState(LLamaSeqId sequence)

Parameters

sequence LLamaSeqId

Returns

SequenceState

Remarks:

Use LLamaContext.SaveState(String, LLamaSeqId) if you intend to save this state to disk.

LoadState(String)

Load the state from the specified path.

public void LoadState(string filename)

Parameters

filename String

LoadState(String, LLamaSeqId)

Load the state from the specified path into a particular sequence.

public void LoadState(string filename, LLamaSeqId sequence)

Parameters

filename String

sequence LLamaSeqId

LoadState(State)

Load the state from memory.

public void LoadState(State state)

Parameters

state State

LoadState(SequenceState, LLamaSeqId)

Load the state from memory into a particular sequence

public void LoadState(SequenceState state, LLamaSeqId sequence)

Parameters

state SequenceState

sequence LLamaSeqId
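The state methods above pair up into two round-trips: disk-based (SaveState/LoadState with a path) and in-memory (GetState/LoadState with a handle). A sketch of both, assuming an existing `context`; "state.bin" is a placeholder filename.

```csharp
// Disk round-trip: persist the full context state, restore it later.
context.SaveState("state.bin");
context.LoadState("state.bin");

// In-memory round-trip. Per the remarks above, prefer SaveState if the
// state is ultimately destined for disk; GetState holds it in memory.
using var state = context.GetState();
context.LoadState(state);
```

The per-sequence overloads taking a LLamaSeqId work the same way but snapshot or restore only one sequence within the context.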

Encode(LLamaBatch)

public EncodeResult Encode(LLamaBatch batch)

Parameters

batch LLamaBatch

Returns

EncodeResult

EncodeAsync(LLamaBatch, CancellationToken)

public Task<EncodeResult> EncodeAsync(LLamaBatch batch, CancellationToken cancellationToken)

Parameters

batch LLamaBatch

cancellationToken CancellationToken

Returns

Task<EncodeResult>

Decode(LLamaBatch)

public DecodeResult Decode(LLamaBatch batch)

Parameters

batch LLamaBatch

Returns

DecodeResult

DecodeAsync(LLamaBatch, CancellationToken)

public Task<DecodeResult> DecodeAsync(LLamaBatch batch, CancellationToken cancellationToken)

Parameters

batch LLamaBatch

cancellationToken CancellationToken

Returns

Task<DecodeResult>
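A sketch of decoding a tokenized prompt with DecodeAsync. The LLamaBatch.Add signature used here (token, position, sequence, logits flag) and LLamaSeqId.Zero are assumptions about the batch API; requesting logits only for the last token is a common pattern when the next step is sampling.

```csharp
// Build a batch from the tokenized prompt: one entry per token, all in
// sequence 0, with logits requested only for the final position.
var batch = new LLamaBatch();
for (var i = 0; i < tokens.Length; i++)
    batch.Add(tokens[i], i, LLamaSeqId.Zero, logits: i == tokens.Length - 1);

// Run the forward pass and check the result before sampling.
DecodeResult result = await context.DecodeAsync(batch, CancellationToken.None);
if (result != DecodeResult.Ok)
    throw new InvalidOperationException($"Decode failed: {result}");
```

The synchronous Decode overloads behave the same way but block the calling thread; the LLamaBatchEmbeddings overloads accept embedding vectors in place of tokens.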

Decode(LLamaBatchEmbeddings)

public DecodeResult Decode(LLamaBatchEmbeddings batch)

Parameters

batch LLamaBatchEmbeddings

Returns

DecodeResult

DecodeAsync(LLamaBatchEmbeddings, CancellationToken)

public Task<DecodeResult> DecodeAsync(LLamaBatchEmbeddings batch, CancellationToken cancellationToken)

Parameters

batch LLamaBatchEmbeddings

cancellationToken CancellationToken

Returns

Task<DecodeResult>

DecodeAsync(List<LLamaToken>, LLamaSeqId, LLamaBatch, Int32)

public Task<ValueTuple<DecodeResult, int, int>> DecodeAsync(List<LLamaToken> tokens, LLamaSeqId id, LLamaBatch batch, int n_past)

Parameters

tokens List<LLamaToken>

id LLamaSeqId

batch LLamaBatch

n_past Int32

Returns

Task<ValueTuple<DecodeResult, Int32, Int32>>
A tuple containing the decode result, the number of tokens that have not yet been decoded, and the total number of tokens that have been decoded.

Dispose()

public void Dispose()
