AI Agent node throws "No prompt specified" when Guardrails node is connected upstream #27417

@sinehypernova-0718

Description

Bug Description

In AI Agent node versions prior to v3.1, there was a dedicated prompt type called "Connected Guardrails Node" (guardrails) that automatically looked for the guardrailsInput field output by the Guardrails node. This option was removed in v3.1 without providing users a clear migration path.

As a result, connecting a Guardrails node to an AI Agent node in v3.1+ results in "No prompt specified" with no actionable guidance, even though the Guardrails node correctly outputs guardrailsInput.

Users have no way to discover that they need to:

  1. Set prompt type to "Define below"
  2. Manually enter {{ $json.guardrailsInput }} as the expression
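The workaround above depends on the shape of the item the Guardrails node emits. A minimal sketch (illustrative TypeScript only, not n8n internals) of what `{{ $json.guardrailsInput }}` resolves against:

```typescript
// Illustrative item shape; the field name comes from the Guardrails node output.
const item = {
  json: {
    guardrailsInput: 'Hello, please help me with something',
  },
};

// The "Define below" expression {{ $json.guardrailsInput }} evaluates to:
const prompt = item.json.guardrailsInput;
console.log(prompt); // "Hello, please help me with something"
```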

Related issue: #27342

Canvas

(screenshot)

AI Agent Node

(screenshot)

Guardrails Node

(screenshot)

To Reproduce

  1. Create workflow:
    Manual Trigger → Guardrails → AI Agent → any Chat Model
  2. Guardrails node: Operation = Sanitize, any text
  3. AI Agent node: Leave "Source for Prompt" on default (auto)
  4. Run workflow
  5. Error: "No prompt specified"
    Expected to find the prompt in an input field called 'chatInput' (this is what the chat trigger node outputs). To use something else, change the 'Prompt' parameter

Even though Guardrails node correctly outputs:
guardrailsInput: "Hello, please help me with something"

Workflow JSON:

{
  "nodes": [
    {
      "parameters": {},
      "type": "n8n-nodes-base.manualTrigger",
      "typeVersion": 1,
      "position": [
        16,
        160
      ],
      "id": "78609a7f-3ea1-4859-80ff-43f793dc1cf2",
      "name": "When clicking 'Execute workflow'"
    },
    {
      "parameters": {
        "operation": "sanitize",
        "text": "Hello, please help me with something",
        "guardrails": {}
      },
      "type": "@n8n/n8n-nodes-langchain.guardrails",
      "typeVersion": 2,
      "position": [
        240,
        160
      ],
      "id": "d080417d-7011-4e7f-b460-e07b004b65db",
      "name": "Guardrails"
    },
    {
      "parameters": {
        "options": {}
      },
      "type": "@n8n/n8n-nodes-langchain.agent",
      "typeVersion": 3.1,
      "position": [
        464,
        160
      ],
      "id": "667d13f1-ce2e-49b0-9f33-21d4aa1a0637",
      "name": "AI Agent"
    },
    {
      "parameters": {
        "model": "whisper-large-v3-turbo",
        "options": {}
      },
      "type": "@n8n/n8n-nodes-langchain.lmChatGroq",
      "typeVersion": 1,
      "position": [
        536,
        384
      ],
      "id": "a3087974-2501-4f6f-b7e2-0655dab93745",
      "name": "Groq Chat Model",
      "credentials": {
        "groqApi": {
          "id": "zACUN8ImshjS5kQk",
          "name": "Groq account"
        }
      }
    }
  ],
  "connections": {
    "When clicking 'Execute workflow'": {
      "main": [
        [
          {
            "node": "Guardrails",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Guardrails": {
      "main": [
        [
          {
            "node": "AI Agent",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Groq Chat Model": {
      "ai_languageModel": [
        [
          {
            "node": "AI Agent",
            "type": "ai_languageModel",
            "index": 0
          }
        ]
      ]
    }
  },
  "pinData": {},
  "meta": {
    "templateCredsSetupCompleted": true,
    "instanceId": "8e0eced79741f2656ae8ff2e289a4b26d81862cd0d9a61033474dd063bce3974"
  }
}

Expected behavior

Either:

  1. The AI Agent node in auto mode should also check for
    guardrailsInput as a fallback when chatInput is null, or
  2. The "Define below" option description should explicitly mention
    that Guardrails node users should use {{ $json.guardrailsInput }}.

Users should not have to discover this through trial and error.
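A hedged sketch of option 1, the auto-mode fallback, in TypeScript. The function and field names here are illustrative (loosely modeled on the `getPromptInputByType` helper that appears in the stack trace), not the actual n8n implementation:

```typescript
// Sketch only: illustrative types and names, not the real n8n helper.
type INodeJson = Record<string, unknown>;

function getPromptAuto(json: INodeJson): string {
  // Current auto-mode behavior: only 'chatInput' is considered.
  const chatInput = json['chatInput'];
  if (typeof chatInput === 'string' && chatInput.trim() !== '') {
    return chatInput;
  }

  // Proposed fallback: also accept the Guardrails node's
  // 'guardrailsInput' field when 'chatInput' is absent or empty.
  const guardrailsInput = json['guardrailsInput'];
  if (typeof guardrailsInput === 'string' && guardrailsInput.trim() !== '') {
    return guardrailsInput;
  }

  throw new Error('No prompt specified');
}
```

With this fallback, `getPromptAuto({ guardrailsInput: 'Hello' })` would return `'Hello'` instead of throwing, while a present `chatInput` would still take precedence.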

Debug Info

Node type

@n8n/n8n-nodes-langchain.agent

Node version

3.1 (Latest)

n8n version

2.12.3 (Self Hosted)

Time

3/23/2026, 3:56:58 PM

Stack trace

NodeOperationError: No prompt specified
    at getPromptInputByType (/usr/local/lib/node_modules/n8n/node_modules/.pnpm/@n8n+n8n-nodes-langchain@file+packages+@n8n+nodes-langchain_6626676b36b9e45b1aba3b7dc84f924c/node_modules/@n8n/n8n-nodes-langchain/utils/helpers.ts:38:10)
    at prepareItemContext (/usr/local/lib/node_modules/n8n/node_modules/.pnpm/@n8n+n8n-nodes-langchain@file+packages+@n8n+nodes-langchain_6626676b36b9e45b1aba3b7dc84f924c/node_modules/@n8n/n8n-nodes-langchain/nodes/agents/Agent/agents/ToolsAgent/V3/helpers/prepareItemContext.ts:47:36)
    at /usr/local/lib/node_modules/n8n/node_modules/.pnpm/@n8n+n8n-nodes-langchain@file+packages+@n8n+nodes-langchain_6626676b36b9e45b1aba3b7dc84f924c/node_modules/@n8n/n8n-nodes-langchain/nodes/agents/Agent/agents/ToolsAgent/V3/helpers/executeBatch.ts:78:47
    at Array.map ()
    at executeBatch (/usr/local/lib/node_modules/n8n/node_modules/.pnpm/@n8n+n8n-nodes-langchain@file+packages+@n8n+nodes-langchain_6626676b36b9e45b1aba3b7dc84f924c/node_modules/@n8n/n8n-nodes-langchain/nodes/agents/Agent/agents/ToolsAgent/V3/helpers/executeBatch.ts:73:30)
    at ExecuteContext.toolsAgentExecute (/usr/local/lib/node_modules/n8n/node_modules/.pnpm/@n8n+n8n-nodes-langchain@file+packages+@n8n+nodes-langchain_6626676b36b9e45b1aba3b7dc84f924c/node_modules/@n8n/n8n-nodes-langchain/nodes/agents/Agent/agents/ToolsAgent/V3/execute.ts:46:84)
    at ExecuteContext.execute (/usr/local/lib/node_modules/n8n/node_modules/.pnpm/@n8n+n8n-nodes-langchain@file+packages+@n8n+nodes-langchain_6626676b36b9e45b1aba3b7dc84f924c/node_modules/@n8n/n8n-nodes-langchain/nodes/agents/Agent/V3/AgentV3.node.ts:151:10)
    at WorkflowExecute.executeNode (/usr/local/lib/node_modules/n8n/node_modules/.pnpm/n8n-core@file+packages+core_@opentelemetry+api@1.9.0_@opentelemetry+exporter-trace-otlp_9f358c3eeaef0d2736f54ac9757ada43/node_modules/n8n-core/src/execution-engine/workflow-execute.ts:1043:8)
    at WorkflowExecute.runNode (/usr/local/lib/node_modules/n8n/node_modules/.pnpm/n8n-core@file+packages+core_@opentelemetry+api@1.9.0_@opentelemetry+exporter-trace-otlp_9f358c3eeaef0d2736f54ac9757ada43/node_modules/n8n-core/src/execution-engine/workflow-execute.ts:1222:11)
    at /usr/local/lib/node_modules/n8n/node_modules/.pnpm/n8n-core@file+packages+core_@opentelemetry+api@1.9.0_@opentelemetry+exporter-trace-otlp_9f358c3eeaef0d2736f54ac9757ada43/node_modules/n8n-core/src/execution-engine/workflow-execute.ts:1668:27

Debug info

core

  • n8nVersion: 2.12.3
  • platform: docker (self-hosted)
  • nodeJsVersion: 24.13.1
  • nodeEnv: production
  • database: sqlite
  • executionMode: regular
  • concurrency: -1
  • license: community
  • consumerId: unknown

storage

  • success: all
  • error: all
  • progress: false
  • manual: true
  • binaryMode: filesystem

pruning

  • enabled: true
  • maxAge: 336 hours
  • maxCount: 10000 executions

client

  • userAgent: mozilla/5.0 (windows nt 10.0; win64; x64) applewebkit/537.36 (khtml, like gecko) chrome/146.0.0.0 safari/537.36
  • isTouchDevice: false

Generated at: 2026-03-23T10:15:08.244Z

Operating System

Windows 11

n8n Version

2.12.3

Node.js Version

24.14.0

Database

SQLite (default)

Execution mode

main (default)

Hosting

self hosted

Metadata

Assignees

No one assigned

Labels

status:in-linear (Issue or PR is now in Linear), status:team-assigned (A team has been assigned the issue or PR), team:ai (Issue is with the ai team), triage:pending (Waiting to be triaged)
