# LLamaSharp
## LLama

- AntipromptProcessor
- ChatSession
- InstructExecutor
- InteractiveExecutor
- LLamaContext
- LLamaEmbedder
- LLamaQuantizer
- LLamaReranker
- LLamaTemplate
- LLamaTransforms
- LLamaWeights
- LLavaWeights
- SessionState
- StatefulExecutorBase
- StatelessExecutor
- StreamingTokenDecoder
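These are the core types most applications touch first. The sketch below shows one plausible way they fit together (load weights, create a context, wrap it in an InteractiveExecutor, and drive a ChatSession). Property and method names follow recent LLamaSharp releases, but signatures vary between versions, so treat the details as assumptions rather than a definitive quick-start; the model path is a placeholder.

```csharp
using System;
using System.Collections.Generic;
using LLama;
using LLama.Common;

// Load the model weights once; contexts and executors are created from them.
var parameters = new ModelParams("path/to/model.gguf")   // placeholder path
{
    ContextSize = 4096,
    GpuLayerCount = 32
};
using var weights = LLamaWeights.LoadFromFile(parameters);
using var context = weights.CreateContext(parameters);

// An InteractiveExecutor keeps conversational state between calls.
var executor = new InteractiveExecutor(context);
var session = new ChatSession(executor);

var inferenceParams = new InferenceParams
{
    MaxTokens = 256,
    AntiPrompts = new List<string> { "User:" }
};

// Stream the assistant's reply as it is generated.
await foreach (var text in session.ChatAsync(
    new ChatHistory.Message(AuthorRole.User, "Hello, what can you do?"),
    inferenceParams))
{
    Console.Write(text);
}
```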
## LLama.Abstractions

- IContextParams
- IHistoryTransform
- IInferenceParams
- ILLamaExecutor
- ILLamaParams
- IModelParams
- INativeLibrary
- INativeLibrarySelectingPolicy
- ITextStreamTransform
- ITextTransform
- LLamaExecutorExtensions
- MetadataOverride
- MetadataOverrideConverter
- TensorBufferOverride
- TensorSplitsCollection
- TensorSplitsCollectionConverter
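These interfaces let calling code stay independent of the concrete executor or parameter type. As a small hedged sketch, a helper that accepts any ILLamaExecutor and collects its streamed output; `InferAsync` is the interface's streaming entry point in recent releases, though its optional parameters may differ in older versions.

```csharp
using System.Text;
using System.Threading.Tasks;
using LLama.Abstractions;

static class ExecutorHelpers
{
    // Works with InteractiveExecutor, InstructExecutor, StatelessExecutor,
    // or any custom ILLamaExecutor implementation.
    public static async Task<string> CompleteAsync(
        ILLamaExecutor executor, string prompt, IInferenceParams inferenceParams)
    {
        var builder = new StringBuilder();
        await foreach (var piece in executor.InferAsync(prompt, inferenceParams))
            builder.Append(piece);
        return builder.ToString();
    }
}
```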
## LLama.Batched

- AlreadyPromptedConversationException
- BatchedExecutor
- CannotModifyWhileRequiresInferenceException
- CannotSampleRequiresInferenceException
- CannotSampleRequiresPromptException
- CannotSaveWhileRequiresInferenceException
- Conversation
- ConversationExtensions
- ExperimentalBatchedExecutorException
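BatchedExecutor evaluates several Conversation objects in a single llama.cpp batch, which is what enables fan-out patterns such as forking one prompt into parallel completions. The sketch below follows the general flow (create a conversation, prompt it, call Infer, sample, then prompt the sampled token back in); the sampling call in particular has changed between releases, so the exact signatures here are assumptions.

```csharp
using System;
using LLama;
using LLama.Batched;
using LLama.Common;
using LLama.Sampling;

var parameters = new ModelParams("path/to/model.gguf");   // placeholder path
using var weights = LLamaWeights.LoadFromFile(parameters);
using var executor = new BatchedExecutor(weights, parameters);

using var conversation = executor.Create();
conversation.Prompt(executor.Context.Tokenize("Once upon a time"));

var sampler = new DefaultSamplingPipeline();
var decoder = new StreamingTokenDecoder(executor.Context);

for (var i = 0; i < 64; i++)
{
    // Run one batched decode step for every conversation that needs inference.
    await executor.Infer();

    // Sample the next token for this conversation (assumed signature; varies by version).
    var token = sampler.Sample(executor.Context.NativeHandle, conversation.GetSampleIndex());
    decoder.Add(token);
    conversation.Prompt(token);
}

Console.WriteLine(decoder.Read());
```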
## LLama.Common

- AuthorRole
- ChatHistory
- FixedSizeQueue<T>
- InferenceParams
- MirostatType
- ModelParams
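ModelParams and InferenceParams are the two main configuration objects, and ChatHistory carries the messages a ChatSession manages. A small hedged sketch of seeding a session with prior messages; the member names match recent releases but may differ in older ones.

```csharp
using LLama.Common;

// Seed a conversation before handing it to a ChatSession,
// e.g. to inject a system prompt or restore saved history.
var history = new ChatHistory();
history.AddMessage(AuthorRole.System, "You are a concise assistant.");
history.AddMessage(AuthorRole.User, "Summarise what LLamaSharp does.");

// ChatSession accepts the initial history alongside an executor:
// var session = new ChatSession(executor, history);
```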
## LLama.Exceptions

- GetLogitsInvalidIndexException
- LLamaDecodeError
- LoadWeightsFailedException
- MissingTemplateException
- RuntimeError
- TemplateNotFoundException
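Most of these exceptions surface during model loading or decoding. A minimal hedged example of handling a failed weight load; the exception type is listed above, but no additional properties beyond the standard Exception members are assumed here.

```csharp
using System;
using LLama;
using LLama.Common;
using LLama.Exceptions;

try
{
    var parameters = new ModelParams("path/to/model.gguf");   // placeholder path
    using var weights = LLamaWeights.LoadFromFile(parameters);
}
catch (LoadWeightsFailedException ex)
{
    // Typically a bad path, an incompatible GGUF file, or insufficient memory.
    Console.Error.WriteLine($"Failed to load model: {ex.Message}");
}
```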
## LLama.Extensions

- IContextParamsExtensions
- IModelParamsExtensions
- SpanNormalizationExtensions
## LLama.Native

- AvxLevel
- DecodeResult
- DefaultNativeLibrarySelectingPolicy
- EncodeResult
- GGMLType
- GPUSplitMode
- ICustomSampler
- LLamaAttentionType
- LLamaBatch
- LLamaBatchEmbeddings
- LLamaChatMessage
- LLamaContextParams
- LLamaFtype
- LLamaKvCacheViewSafeHandle
- LLamaLogitBias
- LLamaLogLevel
- LLamaModelKvOverrideType
- LLamaModelMetadataOverride
- LLamaModelParams
- LLamaModelQuantizeParams
- LLamaModelTensorBufferOverride
- LLamaNativeBatch
- LLamaPerfContextTimings
- LLamaPoolingType
- LLamaPos
- LLamaRopeType
- LLamaSamplerChainParams
- LLamaSamplingTimings
- LLamaSeqId
- LLamaToken
- LLamaTokenAttr
- LLamaTokenData
- LLamaTokenDataArray
- LLamaTokenDataArrayNative
- LLamaVocabType
- LLavaImageEmbed
- LoraAdapter
- NativeApi
- NativeLibraryConfig
- NativeLibraryConfigContainer
- NativeLibraryFromPath
- NativeLibraryMetadata
- NativeLibraryName
- NativeLibraryWithAvx
- NativeLibraryWithCuda
- NativeLibraryWithMacOrFallback
- NativeLibraryWithVulkan
- NativeLogConfig
- RopeScalingType
- SafeLLamaContextHandle
- SafeLLamaHandleBase
- SafeLlamaModelHandle
- SafeLLamaSamplerChainHandle
- SafeLlavaImageEmbedHandle
- SafeLlavaModelHandle
- SystemInfo
- UnknownNativeLibrary
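NativeLibraryConfig controls which native llama.cpp binary is selected (CUDA, Vulkan, AVX level, or a custom path) and must be configured before any other LLamaSharp call touches the native API. The sketch below is an assumption-level sketch: recent releases expose the configuration through NativeLibraryConfig.All (older ones used NativeLibraryConfig.Instance), and the exact fluent methods may differ between versions.

```csharp
using System;
using LLama.Native;

// Configure native library selection before loading any model or context.
NativeLibraryConfig.All
    .WithCuda()                // prefer a CUDA build when available
    .WithAvx(AvxLevel.Avx2)    // otherwise use an AVX2 CPU build
    .WithAutoFallback();       // let the loader try weaker variants if needed

// Optionally route native llama.cpp logging into the application's own sink.
NativeLogConfig.llama_log_set((level, message) => Console.Write($"[{level}] {message}"));
```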
## LLama.Sampling

- BaseSamplingPipeline
- DefaultSamplingPipeline
- Grammar
- GreedySamplingPipeline
- ISamplingPipeline
- ISamplingPipelineExtensions
- PromptTemplateTransformer
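Sampling pipelines decide how the next token is chosen from the model's logits. A hedged sketch of plugging a customised DefaultSamplingPipeline into InferenceParams; the property names (Temperature, TopP, Seed, SamplingPipeline) match recent releases but are assumptions for older ones.

```csharp
using LLama.Common;
using LLama.Sampling;

var pipeline = new DefaultSamplingPipeline
{
    Temperature = 0.6f,   // lower values make output more deterministic
    TopP = 0.9f,          // nucleus sampling cutoff
    Seed = 42             // fixed seed for reproducible sampling
};

var inferenceParams = new InferenceParams
{
    MaxTokens = 256,
    SamplingPipeline = pipeline
};

// Pass inferenceParams to any executor's InferAsync or ChatSession.ChatAsync call.
```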