Functionality
The If Output Contains node acts as a content filter, analyzing LLM output text and directing the flow based on pattern matching. This can be useful for:
- Output Validation: Ensure AI outputs meet specific criteria
- Content Classification: Route outputs based on detected themes or content
- Quality Control: Identify and handle outputs with particular characteristics
- Output Filtering: Filter out unwanted content patterns
- Automated Output Handling: Route different types of outputs to appropriate processing steps
Node Properties
For each condition you add, you can configure:
- Text to Match: The pattern or text to search for in the LLM output
- Use Regex: Toggle to enable regular expression pattern matching
- Case Sensitive: Toggle to make the pattern matching case sensitive
Multiple conditions are evaluated using OR logic — if any condition is true, the workflow will proceed down the true path. To use AND logic, simply chain multiple If Output Contains nodes together.
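The node's evaluation behavior can be sketched in plain Python. This is an illustrative model, not the product's actual implementation; the function names and the condition-dictionary shape are assumptions chosen to mirror the properties above (Text to Match, Use Regex, Case Sensitive):

```python
import re

def condition_matches(output, text, use_regex=False, case_sensitive=False):
    """Check one condition against the LLM output."""
    if use_regex:
        flags = 0 if case_sensitive else re.IGNORECASE
        return re.search(text, output) is not None if case_sensitive \
            else re.search(text, output, flags) is not None
    if not case_sensitive:
        output, text = output.lower(), text.lower()
    return text in output

def if_output_contains(output, conditions):
    """OR logic: the node takes the 'True' path if any condition matches."""
    return any(condition_matches(output, **c) for c in conditions)

conditions = [
    {"text": "I apologize"},
    {"text": "refund", "case_sensitive": True},
]
print(if_output_contains("i apologize for the confusion.", conditions))  # True
print(if_output_contains("The order shipped yesterday.", conditions))    # False
```

Chaining two such nodes corresponds to nesting the check: the second node only ever sees outputs that already took the first node's true path, which is why chaining yields AND semantics.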
Usage Examples
Scenario: Output Quality Control
Let’s say you want to handle AI outputs differently based on their content characteristics:
- Add an If Output Contains node after your LLM node
- Add conditions for quality checks:
  - Text: “I apologize” (to catch uncertainty or inability to answer)
  - Text: “I don’t know” (to identify knowledge gaps)
- Connect the “True” path to fallback handling
- Connect the “False” path to normal processing
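The routing that results from these steps can be sketched as follows. The phrases and the `route_output` helper are illustrative (the product handles routing visually, not in code); note that real LLM output may use a curly apostrophe (don’t) rather than a straight one, so you may need a condition for each variant:

```python
# Phrases that signal uncertainty; matching any one takes the "True" path (OR logic).
FALLBACK_PHRASES = ["I apologize", "I don't know", "I don\u2019t know"]

def route_output(output):
    """Simulate the node: return which branch the output would take."""
    if any(p.lower() in output.lower() for p in FALLBACK_PHRASES):
        return "fallback"   # "True" path: uncertainty detected
    return "normal"         # "False" path: proceed with normal processing

print(route_output("I don't know the answer to that."))  # fallback
print(route_output("The capital of France is Paris."))   # normal
```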
Scenario: Content Classification with Regex
To categorize outputs based on specific patterns:
- Add an If Output Contains node
- Enable “Use Regex”
- Add a condition with the pattern \$\d+(\.\d{2})? (to identify outputs containing dollar amounts)
- Route matching outputs to financial processing nodes
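You can verify a pattern like this against sample outputs before putting it in the node. A minimal Python check (the helper name is illustrative):

```python
import re

# \$     -> a literal dollar sign
# \d+    -> one or more digits
# (\.\d{2})? -> optional cents, e.g. ".99"
PRICE_PATTERN = re.compile(r"\$\d+(\.\d{2})?")

def contains_dollar_amount(output):
    return PRICE_PATTERN.search(output) is not None

print(contains_dollar_amount("The total comes to $19.99."))    # True
print(contains_dollar_amount("No pricing information found.")) # False
```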
Tips and Best Practices
- Place this node immediately after LLM nodes to analyze their outputs
- Use simple text matches for basic content detection
- Implement regex for more complex pattern recognition
- Consider case sensitivity based on your content requirements
- Chain multiple nodes for sophisticated output analysis
- Test conditions with various LLM outputs to ensure reliable routing
- Consider checking the Execution Logs to track which conditions are being triggered