Class: ContextChatEngine
ContextChatEngine uses the Index to get the appropriate context for each query. The context is stored in the system prompt, and the chat history is preserved, ideally allowing the appropriate context to be surfaced for each query.
Extends
Implements
Constructors
new ContextChatEngine()
new ContextChatEngine(init): ContextChatEngine
Parameters
• init
• init.chatHistory?: ChatMessage[]
• init.chatModel?: LLM<object, object>
• init.contextRole?: MessageType
• init.contextSystemPrompt?: ContextSystemPrompt
• init.nodePostprocessors?: BaseNodePostprocessor[]
• init.retriever: BaseRetriever
• init.systemPrompt?: string
Returns
ContextChatEngine
Overrides
Defined in
packages/core/chat-engine/dist/index.d.ts:51
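As an illustration of how the constructor options fit together, here is a minimal sketch that builds a ContextChatEngine from a retriever. It assumes the top-level llamaindex package exports ContextChatEngine, Document, OpenAI, and VectorStoreIndex (newer releases may expose OpenAI via @llamaindex/openai instead), and that an OpenAI API key is available in the environment for both embedding and chat. The document text, model name, and prompts are illustrative.

```ts
import {
  ContextChatEngine,
  Document,
  OpenAI,
  VectorStoreIndex,
} from "llamaindex";

// Build a small index so we have a retriever to hand to the chat engine.
const index = await VectorStoreIndex.fromDocuments([
  new Document({ text: "LlamaIndexTS is a data framework for LLM apps." }),
]);

const chatEngine = new ContextChatEngine({
  retriever: index.asRetriever(),
  chatModel: new OpenAI({ model: "gpt-4o-mini" }), // optional; otherwise the default LLM is used
  systemPrompt: "Answer using only the retrieved context.",
});

// Each call retrieves context for the message, places it in the system prompt,
// and records the exchange in the engine's memory.
const response = await chatEngine.chat({
  message: "What is LlamaIndexTS?",
});
console.log(response.toString()); // exact response shape varies by version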
Properties
chatModel
chatModel: LLM<object, object>
Defined in
packages/core/chat-engine/dist/index.d.ts:46
contextGenerator
contextGenerator: ContextGenerator & PromptMixin
Defined in
packages/core/chat-engine/dist/index.d.ts:48
memory
memory: BaseMemory<object>
Defined in
packages/core/chat-engine/dist/index.d.ts:47
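The memory property accumulates the chat history between calls, while contextGenerator retrieves and formats the context block injected into the system prompt for each message. The hedged sketch below shows a streaming call against the engine constructed above; it assumes a recent llamaindex release where chat({ stream: true }) yields chunks exposing a delta string, which may differ across versions.

```ts
// Streaming variant: chunks arrive as they are generated, while the engine's
// memory keeps accumulating the conversation for follow-up questions.
const stream = await chatEngine.chat({
  message: "Summarize the retrieved context in one sentence.",
  stream: true,
});

for await (const chunk of stream) {
  process.stdout.write(chunk.delta); // per-chunk text; field name may vary by version
}
```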