Lm Ollama¶
Node: lmOllama · Full type: @n8n/n8n-nodes-langchain.lmOllama · Version: 1
Credentials¶
ollamaApi (alias: @ollama)
Parameters¶
| Parameter | Type | Default | Details |
|---|---|---|---|
| model | options | "llama3.2" | required |
| options | collection | {} | keys: temperature, topK, topP, frequencyPenalty, keepAlive, lowVram, ... |
options children:
| Parameter | Type | Default | Details |
|---|---|---|---|
| temperature | number | 0.7 | |
| topK | number | -1 | |
| topP | number | 1 | |
| frequencyPenalty | number | 0 | |
| keepAlive | string | "5m" | |
| lowVram | boolean | false | |
| mainGpu | number | 0 | |
| numBatch | number | 512 | |
| numCtx | number | 2048 | |
| numGpu | number | -1 | |
| numPredict | number | -1 | |
| numThread | number | 0 | |
| penalizeNewline | boolean | true | |
| presencePenalty | number | 0 | |
| repeatPenalty | number | 1 | |
| useMLock | boolean | false | |
| useMMap | boolean | true | |
| vocabOnly | boolean | false | |
| format | options | "default" | default, json |
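A minimal sketch of how this node's parameters might look in an exported n8n workflow JSON, using the defaults above. The credential name ("Ollama account") and the particular subset of options shown are illustrative assumptions, not a complete or authoritative node definition:

```json
{
  "type": "@n8n/n8n-nodes-langchain.lmOllama",
  "typeVersion": 1,
  "credentials": {
    "ollamaApi": { "name": "Ollama account" }
  },
  "parameters": {
    "model": "llama3.2",
    "options": {
      "temperature": 0.7,
      "numCtx": 2048,
      "keepAlive": "5m",
      "format": "default"
    }
  }
}
```

Any option omitted from the `options` collection falls back to its default from the table above.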