LLamaContext
Namespace: LLama
A llama_context, which holds all the context required to interact with a model
Inheritance Object → LLamaContext
Implements IDisposable
Attributes NullableContextAttribute, NullableAttribute
Properties
ContextSize
Total number of tokens in the context
EmbeddingSize
Dimension of embedding vectors
Params
The context params set for this context
NativeHandle
The native handle, which is passed to the native APIs
Remarks:
Be careful how you use this!
Encoding
The encoding set for this context, used to handle text input.
GenerationThreads
Get or set the number of threads to use for generation
BatchThreads
Get or set the number of threads to use for batch processing
BatchSize
Get the maximum batch size for this context
Vocab
Get the special tokens for the model associated with this context
Constructors
LLamaContext(LLamaWeights, IContextParams, ILogger)
Create a new LLamaContext for the given LLamaWeights
Parameters
model LLamaWeights
params IContextParams
logger ILogger
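A minimal sketch of constructing a context from loaded weights. The model path and parameter values here are placeholders, and ModelParams is assumed to be an IContextParams implementation from LLama.Common:

```csharp
using LLama;
using LLama.Common;

// Placeholder path to a GGUF model file
var parameters = new ModelParams("model.gguf")
{
    ContextSize = 2048,
};

// Load the weights once, then create a context over them
using var weights = LLamaWeights.LoadFromFile(parameters);
using var context = new LLamaContext(weights, parameters);
```

The logger argument is optional; multiple contexts can be created over the same LLamaWeights instance.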
Methods
Tokenize(String, Boolean, Boolean)
Tokenize a string.
Parameters
text String
addBos Boolean
Whether to prepend a BOS (beginning-of-sequence) token to the text.
special Boolean
Allow tokenizing special and/or control tokens which otherwise are not exposed and treated as plaintext.
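A short usage sketch, assuming `context` is an existing LLamaContext:

```csharp
// Tokenize a prompt, prepending BOS and treating special tokens as plain text
LLamaToken[] tokens = context.Tokenize("Hello, world!", addBos: true, special: false);
Console.WriteLine($"Token count: {tokens.Length}");
```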
DeTokenize(IReadOnlyList<LLamaToken>)
Caution
Use a StreamingTokenDecoder instead
Detokenize the tokens to text.
Parameters
tokens IReadOnlyList<LLamaToken>
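Per the caution above, StreamingTokenDecoder is the preferred way to turn tokens back into text. A sketch, assuming `context` is an existing LLamaContext and `tokens` is a sequence of LLamaToken:

```csharp
// Accumulate tokens into the decoder, then read the decoded text out
var decoder = new StreamingTokenDecoder(context);
foreach (var token in tokens)
    decoder.Add(token);
string text = decoder.Read();
```

Unlike a one-shot detokenize, the streaming decoder handles tokens whose bytes form partial UTF-8 sequences across token boundaries.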
SaveState(String)
Save the state to the specified path.
Parameters
filename String
SaveState(String, LLamaSeqId)
Save the state of a particular sequence to the specified path.
Parameters
filename String
sequence LLamaSeqId
GetState()
Get the state data as an opaque handle, which can be loaded later using LLamaContext.LoadState(State)
Remarks:
Use LLamaContext.SaveState(String) if you intend to save this state to disk.
GetState(LLamaSeqId)
Get the state data for a particular sequence as an opaque handle, which can be loaded later using LLamaContext.LoadState(SequenceState, LLamaSeqId)
Parameters
sequence LLamaSeqId
Remarks:
Use LLamaContext.SaveState(String, LLamaSeqId) if you intend to save this state to disk.
LoadState(String)
Load the state from the specified path.
Parameters
filename String
LoadState(String, LLamaSeqId)
Load the state from the specified path into a particular sequence.
Parameters
filename String
sequence LLamaSeqId
LoadState(State)
Load the state from memory.
Parameters
state State
LoadState(SequenceState, LLamaSeqId)
Load the state from memory into a particular sequence
Parameters
state SequenceState
sequence LLamaSeqId
Encode(LLamaBatch)
Parameters
batch LLamaBatch
EncodeAsync(LLamaBatch, CancellationToken)
Parameters
batch LLamaBatch
cancellationToken CancellationToken
Decode(LLamaBatch)
Parameters
batch LLamaBatch
DecodeAsync(LLamaBatch, CancellationToken)
Parameters
batch LLamaBatch
cancellationToken CancellationToken
Decode(LLamaBatchEmbeddings)
Parameters
batch LLamaBatchEmbeddings
DecodeAsync(LLamaBatchEmbeddings, CancellationToken)
Parameters
batch LLamaBatchEmbeddings
cancellationToken CancellationToken
DecodeAsync(List<LLamaToken>, LLamaSeqId, LLamaBatch, Int32)
Parameters
tokens List<LLamaToken>
id LLamaSeqId
batch LLamaBatch
n_past Int32
Returns
Task<ValueTuple<DecodeResult, Int32, Int32>>
A tuple containing the decode result, the number of tokens that have not yet been decoded, and the total number of tokens that have been decoded.
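A sketch of feeding a prompt through the batched decode path. It assumes `context` is an existing LLamaContext, `tokens` is a tokenized prompt, and that DecodeAsync(LLamaBatch, CancellationToken) resolves to a DecodeResult:

```csharp
// Build a batch from the prompt tokens on sequence zero
var batch = new LLamaBatch();
for (var i = 0; i < tokens.Length; i++)
{
    // Only request logits for the final token of the prompt
    batch.Add(tokens[i], i, LLamaSeqId.Zero, logits: i == tokens.Length - 1);
}

var result = await context.DecodeAsync(batch);
if (result != DecodeResult.Ok)
    throw new InvalidOperationException($"Decode failed: {result}");
```

Requesting logits only for the last position avoids computing output probabilities for prompt tokens whose continuations are already known.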
Dispose()