Lm Chat Ollama

Node: lmChatOllama · Full type: @n8n/n8n-nodes-langchain.lmChatOllama · Version: 1

Ergonomic keyword available: LLM ollama — see NFLOW.md for shorter syntax.

Credentials

ollamaApi (alias: @ollama)

CREDENTIAL @ollama = ollamaApi "My Lm Chat Ollama"

Parameters

| Parameter | Type | Default | Details |
| --- | --- | --- | --- |
| model | options | "llama3.2" | required |
| options | collection | {} | keys: temperature, topK, topP, frequencyPenalty, keepAlive, lowVram, ... |

options children:

| Parameter | Type | Default | Details |
| --- | --- | --- | --- |
| temperature | number | 0.7 | |
| topK | number | -1 | |
| topP | number | 1 | |
| frequencyPenalty | number | 0 | |
| keepAlive | string | "5m" | |
| lowVram | boolean | false | |
| mainGpu | number | 0 | |
| numBatch | number | 512 | |
| numCtx | number | 2048 | |
| numGpu | number | -1 | |
| numPredict | number | -1 | |
| numThread | number | 0 | |
| penalizeNewline | boolean | true | |
| presencePenalty | number | 0 | |
| repeatPenalty | number | 1 | |
| useMLock | boolean | false | |
| useMMap | boolean | true | |
| vocabOnly | boolean | false | |
| format | options | "default" | one of: default, json |
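For orientation, these camelCase node parameters correspond to snake_case fields in Ollama's own REST API (`POST /api/chat`): most land in the request's `options` object (e.g. `topK` → `top_k`, `numCtx` → `num_ctx`), while `keepAlive` and `format` are top-level request fields. The sketch below builds such a payload directly; it is an illustration of the underlying Ollama API, not the node's internal implementation, and the helper name `build_ollama_chat_payload` is invented for this example.

```python
import json

# camelCase node parameter -> snake_case Ollama API option
KEY_MAP = {
    "temperature": "temperature",
    "topK": "top_k",
    "topP": "top_p",
    "frequencyPenalty": "frequency_penalty",
    "presencePenalty": "presence_penalty",
    "repeatPenalty": "repeat_penalty",
    "lowVram": "low_vram",
    "mainGpu": "main_gpu",
    "numBatch": "num_batch",
    "numCtx": "num_ctx",
    "numGpu": "num_gpu",
    "numPredict": "num_predict",
    "numThread": "num_thread",
    "penalizeNewline": "penalize_newline",
    "useMLock": "use_mlock",
    "useMMap": "use_mmap",
    "vocabOnly": "vocab_only",
}

def build_ollama_chat_payload(model="llama3.2", messages=None, **params):
    """Build a request body for Ollama's POST /api/chat endpoint (hypothetical helper)."""
    payload = {"model": model, "messages": messages or [], "stream": False}
    # keepAlive and format are top-level request fields, not "options"
    if "keepAlive" in params:
        payload["keep_alive"] = params.pop("keepAlive")
    if params.pop("format", "default") == "json":
        payload["format"] = "json"
    payload["options"] = {KEY_MAP[k]: v for k, v in params.items() if k in KEY_MAP}
    return payload

payload = build_ollama_chat_payload(
    messages=[{"role": "user", "content": "Hello"}],
    temperature=0.7, topK=-1, numCtx=2048, keepAlive="5m",
)
print(json.dumps(payload, indent=2))
```

Sending this body to `http://localhost:11434/api/chat` (the default Ollama endpoint) would exercise the same parameters the node exposes.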

Example

NODE "lmChatOllama" @ollama AS "Lm Chat Ollama"
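Putting the credential declaration and node instantiation together, a complete snippet might look like the following (assuming an NFLOW file allows both statements in sequence; see NFLOW.md for the authoritative syntax):

```
CREDENTIAL @ollama = ollamaApi "My Lm Chat Ollama"
NODE "lmChatOllama" @ollama AS "Lm Chat Ollama"
```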