The Ultimate n8n Automation Cheatsheet 2025

Field Manual for Automation Engineers
This cheatsheet distills 20+ years of enterprise automation experience into actionable patterns, shortcuts, and best practices for building lightning-fast, reliable, and scalable workflows in n8n.
1. Introduction
What is n8n?
n8n is an open-source, self-hostable workflow automation platform that enables you to connect APIs, services, and databases without writing code. Unlike Zapier or Make, n8n gives you complete control over your data, infrastructure, and execution logic.
Key Differentiators:
- Self-hostable - Run on your infrastructure
- Open-source - Full transparency and customization
- Visual workflow builder - Intuitive drag-and-drop interface
- Code nodes - Write custom JavaScript/TypeScript when needed
- Webhook-first - Built for event-driven architectures
- Cost-effective - No per-task pricing, unlimited executions
The Automation Mindset
⚠️ Pro Tip: Before building, ask: "What manual process am I replacing, and what's the failure cost?"
Core Principles:
- Modularity - Build reusable workflow components
- Resilience - Design for failure with retries and fallbacks
- Observability - Log everything, monitor execution times
- Scalability - Batch operations, avoid rate limits
- Security - Encrypt credentials, validate webhook signatures
2. Setup & Configuration
Installation Methods
Desktop App (Quick Start)
```bash
# Download from n8n.io/download
# Or via Homebrew (macOS)
brew install n8n
```
Docker (Recommended for Production)
```bash
docker run -it --rm \
  --name n8n \
  -p 5678:5678 \
  -v ~/.n8n:/home/node/.n8n \
  n8nio/n8n
```
Docker Compose (Full Stack)
```yaml
version: '3.8'
services:
  n8n:
    image: n8nio/n8n
    ports:
      - "5678:5678"
    environment:
      - N8N_BASIC_AUTH_ACTIVE=true
      - N8N_BASIC_AUTH_USER=admin
      - N8N_BASIC_AUTH_PASSWORD=${N8N_PASSWORD}
      - DB_TYPE=postgresdb
      - DB_POSTGRESDB_HOST=postgres
      - DB_POSTGRESDB_DATABASE=n8n
      - DB_POSTGRESDB_USER=n8n
      - DB_POSTGRESDB_PASSWORD=${POSTGRES_PASSWORD}
    volumes:
      - n8n_data:/home/node/.n8n
    depends_on:
      - postgres

  postgres:
    image: postgres:15
    environment:
      - POSTGRES_DB=n8n
      - POSTGRES_USER=n8n
      - POSTGRES_PASSWORD=${POSTGRES_PASSWORD}
    volumes:
      - postgres_data:/var/lib/postgresql/data

volumes:
  n8n_data:
  postgres_data:
```
Cloud (n8n Cloud)
- Sign up at n8n.io
- Instant setup, managed infrastructure
- Free tier: 2,000 executions/month
Environment Configuration
Essential Environment Variables:
```bash
# Authentication
N8N_BASIC_AUTH_ACTIVE=true
N8N_BASIC_AUTH_USER=admin
N8N_BASIC_AUTH_PASSWORD=your-secure-password

# Database (PostgreSQL recommended for production)
DB_TYPE=postgresdb
DB_POSTGRESDB_HOST=localhost
DB_POSTGRESDB_DATABASE=n8n
DB_POSTGRESDB_USER=n8n
DB_POSTGRESDB_PASSWORD=your-db-password

# Encryption (generate with: openssl rand -base64 32)
N8N_ENCRYPTION_KEY=your-32-char-encryption-key

# Webhook URL (for self-hosted)
WEBHOOK_URL=https://your-domain.com/

# Timezone
TZ=UTC

# Execution Settings
EXECUTIONS_PROCESS=main
EXECUTIONS_DATA_PRUNE=true
EXECUTIONS_DATA_MAX_AGE=336  # 14 days in hours
```
Best Practice: Use environment variables for all sensitive data. Never hardcode credentials in workflows.
Credential Management
Secure Credential Storage:
1. Environment Variables - Use for API keys, secrets
```bash
# In .env file
OPENAI_API_KEY=sk-...
SLACK_WEBHOOK_URL=https://hooks.slack.com/...
```
2. n8n Credentials - Encrypted storage in database
- Access via: Settings → Credentials
- Supports OAuth2, API keys, basic auth
3. External Vaults (Advanced)
- HashiCorp Vault integration
- AWS Secrets Manager
- Azure Key Vault
Version Control & Backup
Workflow Versioning:
```bash
# Export workflows as JSON
# Via UI: Workflow → Download
# Via API:
curl -X GET \
  "https://your-n8n-instance.com/api/v1/workflows" \
  -H "X-N8N-API-KEY: your-api-key" \
  > workflows-backup.json
```
Automated Backup Script:
```bash
#!/bin/bash
# backup-n8n.sh
DATE=$(date +%Y%m%d_%H%M%S)
BACKUP_DIR="./backups"
mkdir -p $BACKUP_DIR

# Export workflows
curl -X GET \
  "http://localhost:5678/api/v1/workflows" \
  -H "X-N8N-API-KEY: $N8N_API_KEY" \
  > "$BACKUP_DIR/workflows_$DATE.json"

# Backup database (PostgreSQL)
pg_dump -h localhost -U n8n n8n > "$BACKUP_DIR/db_$DATE.sql"

echo "Backup completed: $DATE"
```
🔧 Pro Tip: Set up daily automated backups. Workflows are code: treat them like code.
3. Core Concepts
Nodes, Connections, and Execution Flow
Node Types:
- Trigger Nodes - Start workflows (Webhook, Cron, Schedule)
- Action Nodes - Perform operations (HTTP Request, Database, API calls)
- Transform Nodes - Modify data (Set, Code, Merge, Split)
- Control Nodes - Control flow (IF, Switch, Wait, Error Trigger)
Execution Flow:
```
Trigger → Transform → Action → Transform → Action → End
```
Data Passing:
Each node receives data from the previous node as an array of items. Access data using:
```javascript
// In a Code node
const inputData = $input.all();
const firstItem = $input.first();
const itemValue = $input.item.json.fieldName;
```
JSON Structure Handling
Understanding n8n Data Format:
```json
[
  {
    "json": {
      "id": 1,
      "name": "John",
      "email": "john@example.com"
    },
    "binary": {},
    "pairedItem": null
  }
]
```
Common Access Patterns:
```javascript
// Get all items
const items = $input.all();

// Get first item's JSON
const first = $input.first().json;

// Map over items
const emails = items.map(item => item.json.email);

// Filter items
const filtered = items.filter(item => item.json.status === 'active');
```
Workflow Activation
Activation Methods:
- Manual Execution - Click "Execute Workflow"
- Production Mode - Toggle "Active" switch (workflow runs automatically)
- Webhook Trigger - Workflow activates on HTTP POST
- Schedule Trigger - Cron-based activation
- API Call - Trigger via REST API
Production Activation Checklist:
- [ ] Test workflow in manual mode
- [ ] Add error handling nodes
- [ ] Set up monitoring/alerting
- [ ] Document workflow purpose
- [ ] Enable "Active" toggle
4. Node Reference
Trigger Nodes
Webhook
- Use Case: HTTP POST/GET endpoints
- Pro Tip: Use for real-time integrations
- Example Config: Method: POST, Path: `/webhook/my-workflow`
Cron
- Use Case: Scheduled tasks
- Pro Tip: Use cron syntax: `0 9 * * *` (daily at 9 AM)
- Example Config: Expression: `0 */6 * * *` (every 6 hours)
Schedule Trigger
- Use Case: Simple recurring tasks
- Pro Tip: Easier than Cron for basic schedules
- Example Config: Every: 1 hour, Starting at: 09:00
IMAP Email
- Use Case: Email-triggered workflows
- Pro Tip: Filter by subject/from to reduce noise
- Example Config: Folder: INBOX, Options: Unread only
WebSocket
- Use Case: Real-time bidirectional communication
- Pro Tip: Use for live data streams
- Example Config: URL: `wss://api.example.com/stream`
Polling
- Use Case: Check APIs at intervals
- Pro Tip: Set reasonable intervals to avoid rate limits
- Example Config: Interval: 5 minutes
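A polling trigger usually needs to remember what it already processed so each run only handles new records. A runnable sketch of that bookkeeping (in an n8n Code node, `lastId` would be persisted via `$getWorkflowStaticData('global')`; here a plain object stands in, and the item shape is illustrative):

```javascript
// Stand-in for $getWorkflowStaticData('global'); pretend id 41 was the last one seen
const staticData = { lastId: 41 };
// Stand-in for $input.all() from the polled API
const items = [{ json: { id: 40 } }, { json: { id: 41 } }, { json: { id: 42 } }];

// Keep only items newer than the last processed id, then advance the marker
const fresh = items.filter(item => item.json.id > (staticData.lastId || 0));
if (fresh.length > 0) {
  staticData.lastId = Math.max(...fresh.map(i => i.json.id));
}
// fresh → [{ json: { id: 42 } }]
```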
Data Transformation Nodes
Code (JavaScript)
- Use Case: Custom logic, data mapping
- Pro Tip: Use `$input.all()` for batch processing
- Example: See Code Node Mastery section
Set
- Use Case: Add/modify fields
- Pro Tip: Use dot notation: `user.email`
- Example: Field: `fullName`, Value: `{{$json.firstName}} {{$json.lastName}}`
Merge
- Use Case: Combine data from multiple nodes
- Pro Tip: Choose merge mode: "Merge By Index" or "Merge By Key"
- Example: Mode: Merge By Key, Key: `id`
Split In Batches
- Use Case: Process large datasets
- Pro Tip: Set batch size based on API limits
- Example: Batch Size: 100, Options: Reset
IF
- Use Case: Conditional branching
- Pro Tip: Use expressions: `{{$json.status === 'active'}}`
- Example: Condition: `{{$json.amount > 1000}}`
Switch
- Use Case: Multi-path routing
- Pro Tip: Use multiple rules for complex routing
- Example: Rule 1:
{{$json.type === 'email'}}, Rule 2:{{$json.type === 'sms'}}
Regex
- Use Case: Extract/transform text
- Pro Tip: Test regex patterns before using
- Example: Pattern: `(\d{4})`, Replace: `****`
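The same `(\d{4})` pattern can be applied in a Code node when you need masking inline. A runnable sketch (the item shape and text are illustrative, not from the article):

```javascript
// Stand-in for $input.all(); item shape and text are illustrative
const items = [{ json: { note: 'Card ending 1234, PIN 9876' } }];

// Mask every 4-digit run before passing data downstream
const masked = items.map(item => ({
  json: {
    ...item.json,
    note: item.json.note.replace(/(\d{4})/g, '****')
  }
}));
// masked[0].json.note → 'Card ending ****, PIN ****'
```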
Integration Nodes
HTTP Request
- Use Case: Call any REST API
- Pro Tip: Use authentication in headers
- Example: Method: POST, URL: `https://api.example.com/data`, Headers: `Authorization: Bearer {{$env.API_KEY}}`
MySQL/Postgres
- Use Case: Database operations
- Pro Tip: Use parameterized queries to prevent SQL injection
- Example: Query: `SELECT * FROM users WHERE status = :status`, Parameters: `{"status": "active"}`
Google Sheets
- Use Case: Spreadsheet operations
- Pro Tip: Use named ranges for better performance
- Example: Operation: Append, Sheet: "Data", Range: "A1:D100"
Slack
- Use Case: Team notifications
- Pro Tip: Use blocks for rich formatting
- Example: Channel: `#alerts`, Message: `{{$json.message}}`
Telegram
- Use Case: Personal/bot notifications
- Pro Tip: Use Markdown for formatting
- Example: Chat ID: `{{$env.TELEGRAM_CHAT_ID}}`, Text: `*Alert*: {{$json.title}}`
Notion
- Use Case: Knowledge base operations
- Pro Tip: Use page IDs, not URLs
- Example: Operation: Create Page, Parent: `{{$json.parentId}}`
GitHub
- Use Case: Repository automation
- Pro Tip: Use personal access tokens with minimal scopes
- Example: Operation: Create Issue, Repository: `owner/repo`, Title: `{{$json.title}}`
Utility Nodes
Wait
- Use Case: Pause execution
- Pro Tip: Use for rate limiting or delays
- Example: Wait For: 5 seconds
Delay
- Use Case: Time-based delays
- Pro Tip: Use for scheduled retries
- Example: Wait For: 1 hour
Error Trigger
- Use Case: Catch and handle errors
- Pro Tip: Use it as the trigger of a dedicated error workflow, then point failing workflows at it via their "Error Workflow" setting; for inline recovery, enable "Continue On Fail" on individual risky nodes
- Example: Error workflow: Error Trigger → Format Error → Send Alert
Execute Workflow
- Use Case: Call other workflows
- Pro Tip: Use for modular design
- Example: Workflow ID: `{{$env.MAIN_WORKFLOW_ID}}`
Function Item
- Use Case: Per-item processing
- Pro Tip: Use for item-level transformations
- Example: See Code Node examples
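The Wait and Delay nodes above pair naturally with exponential backoff between retries. A runnable sketch of the delay schedule (the base and cap values are arbitrary assumptions, not from the article):

```javascript
// Exponential backoff schedule with a cap, to feed a Wait node between retries
function backoffMs(attempt, baseMs = 1000, capMs = 10000) {
  return Math.min(capMs, baseMs * 2 ** attempt);
}

const schedule = [0, 1, 2, 3, 4].map(a => backoffMs(a));
// → [1000, 2000, 4000, 8000, 10000]
```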
5. Code Node Mastery
Basic Patterns
Accessing Input Data:
```javascript
// Get all items as array
const items = $input.all();

// Get first item
const firstItem = $input.first().json;

// Get specific field
const email = $input.item.json.email;

// Get binary data
const imageData = $input.item.binary.data;
```
Returning Data:
```javascript
// Return single item
return {
  json: {
    id: 1,
    name: "John",
    processed: true
  }
};

// Return multiple items
return items.map(item => ({
  json: {
    ...item.json,
    processed: true,
    timestamp: new Date().toISOString()
  }
}));
```
Data Transformation Examples
Mapping and Filtering:
```javascript
// Filter active users
const activeUsers = $input.all()
  .filter(item => item.json.status === 'active')
  .map(item => ({
    json: {
      id: item.json.id,
      email: item.json.email,
      name: item.json.name
    }
  }));

return activeUsers;
```
API Response Parsing:
```javascript
// Parse nested API response
const response = $input.first().json;

const transformed = {
  json: {
    userId: response.data.user.id,
    userName: response.data.user.name,
    email: response.data.user.email,
    metadata: {
      createdAt: response.data.user.created_at,
      lastLogin: response.data.user.last_login
    }
  }
};

return transformed;
```
Batch Processing:
```javascript
// Process items in chunks
const items = $input.all();
const batchSize = 10;
const batches = [];

for (let i = 0; i < items.length; i += batchSize) {
  batches.push(items.slice(i, i + batchSize));
}

// Return first batch (use Split In Batches node for full solution)
return batches[0].map(item => ({
  json: {
    ...item.json,
    batchNumber: 1
  }
}));
```
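The chunking loop above can be factored into a small reusable helper. A runnable sketch:

```javascript
// Split an array into fixed-size chunks (same logic as the loop above)
function chunk(arr, size) {
  const out = [];
  for (let i = 0; i < arr.length; i += size) {
    out.push(arr.slice(i, i + size));
  }
  return out;
}

const batches = chunk([1, 2, 3, 4, 5, 6, 7], 3);
// → [[1, 2, 3], [4, 5, 6], [7]]
```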
Reusable Functions
Helper Functions:
```javascript
// Date formatting helper
function formatDate(dateString, format = 'YYYY-MM-DD') {
  const date = new Date(dateString);
  const year = date.getFullYear();
  const month = String(date.getMonth() + 1).padStart(2, '0');
  const day = String(date.getDate()).padStart(2, '0');

  return format
    .replace('YYYY', year)
    .replace('MM', month)
    .replace('DD', day);
}

// Email validation
function isValidEmail(email) {
  return /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(email);
}

// Use helpers
const items = $input.all();
return items.map(item => ({
  json: {
    ...item.json,
    formattedDate: formatDate(item.json.createdAt),
    isValidEmail: isValidEmail(item.json.email)
  }
}));
```
Error Handling Patterns
Try-Catch in Code Node:
```javascript
const items = $input.all();
const results = [];

for (const item of items) {
  try {
    // Process item
    const processed = {
      json: {
        ...item.json,
        processed: true,
        error: null
      }
    };
    results.push(processed);
  } catch (error) {
    // Handle error gracefully
    results.push({
      json: {
        ...item.json,
        processed: false,
        error: error.message
      }
    });
  }
}

return results;
```
Validation Pattern:
```javascript
const item = $input.first().json;

// Validate required fields
const required = ['email', 'name', 'phone'];
const missing = required.filter(field => !item[field]);

if (missing.length > 0) {
  throw new Error(`Missing required fields: ${missing.join(', ')}`);
}

// Validate format
if (!/^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(item.email)) {
  throw new Error('Invalid email format');
}

return { json: item };
```
Dynamic Variable Usage
Environment Variables:
```javascript
// Access environment variables
const apiKey = $env.OPENAI_API_KEY;
const webhookUrl = $env.SLACK_WEBHOOK_URL;

// Use in API calls
return {
  json: {
    apiKey: apiKey, // Don't expose in production!
    webhookUrl: webhookUrl
  }
};
```
Node Output References:
```javascript
// Reference previous node outputs
const webhookData = $('Webhook').first().json;
const dbResult = $('Postgres').all();

// Combine data
return {
  json: {
    webhookId: webhookData.id,
    dbRecords: dbResult.length,
    combined: {
      ...webhookData,
      records: dbResult
    }
  }
};
```
6. Workflow Design Patterns (Expert Section)
Parallel Execution Strategy
Running Multiple Operations Simultaneously:
```
Webhook Trigger
  ├── HTTP Request (API 1)
  ├── HTTP Request (API 2)
  └── Database Query
        ↓
  Merge Node (Wait for all)
        ↓
  Process Combined Results
```
✅ Best Practice: Use Merge node with "Wait for all inputs" to synchronize parallel operations.
Example: Parallel API Calls
```javascript
// In a Code node after the Merge, combine results
const api1Data = $input.all()[0].json; // First input
const api2Data = $input.all()[1].json; // Second input
const dbData = $input.all()[2].json;   // Third input

return {
  json: {
    api1: api1Data,
    api2: api2Data,
    database: dbData,
    timestamp: new Date().toISOString()
  }
};
```
Error-Resilient Automation
Try/Catch Pattern with Error Trigger:
```
HTTP Request
  ↓
Error Trigger (Continue On Fail: true)
  ├── Success Path → Process Data
  └── Error Path → Log Error → Send Alert → Fallback Action
```
Retry Mechanism:
```javascript
// Retry logic in a Code node
async function retryOperation(operation, maxRetries = 3) {
  for (let i = 0; i < maxRetries; i++) {
    try {
      return await operation();
    } catch (error) {
      if (i === maxRetries - 1) throw error;
      await new Promise(resolve => setTimeout(resolve, 1000 * (i + 1)));
    }
  }
}

// Use in workflow
const result = await retryOperation(() => {
  // Your operation here
  return fetch('https://api.example.com/data');
});
```
Deduplication Pattern:
```javascript
// Track processed items within this execution
// (the Set lives only for this run; persist IDs via $getWorkflowStaticData
// or a database if you need deduplication across executions)
const processedIds = new Set();
const items = $input.all();

const uniqueItems = items.filter(item => {
  const id = item.json.id;
  if (processedIds.has(id)) {
    return false; // Duplicate
  }
  processedIds.add(id);
  return true;
});

return uniqueItems.map(item => ({ json: item.json }));
```
Event-Driven Architecture
Webhook-First Design:
```
External Service → Webhook → n8n → Process → Trigger Actions
                               ↓
                      Multiple Workflows
```
Webhook Security:
```javascript
// Verify webhook signature (in a Code node after the Webhook node)
const crypto = require('crypto');

const secret = $env.WEBHOOK_SECRET;
// The Webhook node puts headers and body under item.json
const signature = $input.first().json.headers['x-signature'];
const payload = JSON.stringify($input.first().json.body);

const expectedSignature = crypto
  .createHmac('sha256', secret)
  .update(payload)
  .digest('hex');

if (signature !== expectedSignature) {
  throw new Error('Invalid webhook signature');
}

return $input.all();
```
Modular Workflow Design
Using Execute Workflow Node:
```
Main Workflow
  ├── Execute Workflow: Data Processing
  ├── Execute Workflow: Notification
  └── Execute Workflow: Reporting
```
Benefits:
- Reusability across workflows
- Easier testing and debugging
- Better organization
- Independent versioning
Passing Data Between Workflows:
```javascript
// In the calling workflow (input for the Execute Workflow node)
{
  "workflowId": "{{$env.PROCESSING_WORKFLOW_ID}}",
  "data": {
    "items": $input.all(),
    "config": {
      "mode": "production",
      "notify": true
    }
  }
}
```
AI-Powered Automations
Integrating OpenAI:
```javascript
// OpenAI API call pattern
const items = $input.all();
const results = [];

for (const item of items) {
  const response = await fetch('https://api.openai.com/v1/chat/completions', {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${$env.OPENAI_API_KEY}`,
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({
      model: 'gpt-4',
      messages: [
        {
          role: 'system',
          content: 'You are a helpful assistant that summarizes data.'
        },
        {
          role: 'user',
          content: `Summarize this: ${JSON.stringify(item.json)}`
        }
      ],
      temperature: 0.7
    })
  });

  const data = await response.json();
  results.push({
    json: {
      ...item.json,
      summary: data.choices[0].message.content
    }
  });
}

return results;
```
AI-Driven Decision Making:
```javascript
// Use AI to classify or route items
// (classifyItem wraps an AI call like the OpenAI example above)
const classification = await classifyItem(item.json);

if (classification.category === 'urgent') {
  // Route to urgent handling workflow
} else if (classification.category === 'normal') {
  // Route to standard processing
}
```
7. Debugging & Optimization
Common Mistakes and Fixes
Missing error handling
- Symptom: Workflow fails silently
- Fix: Add Error Trigger nodes
Not handling empty arrays
- Symptom: "Cannot read property of undefined"
- Fix: Check `$input.all().length > 0`
Rate limit issues
- Symptom: API calls fail intermittently
- Fix: Add Wait nodes, implement retry logic
Memory leaks
- Symptom: Workflow slows over time
- Fix: Use Split In Batches, limit data retention
Hardcoded values
- Symptom: Workflow breaks when data changes
- Fix: Use expressions: `{{$json.field}}`
Infinite loops
- Symptom: Workflow runs forever
- Fix: Add execution time limits, break conditions
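Two of the fixes above, empty-input checks and loop guards, can be combined in one Code-node-style function. A runnable sketch (the item shape and limits are illustrative assumptions):

```javascript
// Guard against empty input and runaway loops
function safeProcess(items, maxIterations = 1000) {
  if (items.length === 0) {
    // Emit an explicit marker item instead of crashing downstream nodes
    return [{ json: { skipped: true, reason: 'no input items' } }];
  }
  const out = [];
  let iterations = 0;
  for (const item of items) {
    if (++iterations > maxIterations) break; // hard stop against runaway input
    out.push({ json: { ...item.json, processed: true } });
  }
  return out;
}

const onEmpty = safeProcess([]);
const onTwo = safeProcess([{ json: { id: 1 } }, { json: { id: 2 } }]);
```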
Performance Tuning
Batch Size Optimization:
```javascript
// Optimal batch size depends on:
// 1. API rate limits
// 2. Memory constraints
// 3. Processing time

// For most APIs: 50-100 items per batch
// For heavy processing: 10-20 items per batch
// For simple operations: 200-500 items per batch
```
Lazy Loading Pattern:
```javascript
// Only fetch what you need
const items = $input.all();
const ids = items.map(item => item.json.id);

// Fetch details only for active items
const activeItems = items.filter(item => item.json.status === 'active');
// Process only active items
```
Data Trimming:
```javascript
// Remove unnecessary fields to reduce memory
const items = $input.all();
return items.map(item => ({
  json: {
    id: item.json.id,
    email: item.json.email
    // Only keep essential fields
  }
}));
```
Monitoring and Logging
Execution Logging:
```javascript
// Log to console (visible in n8n execution log)
console.log('Processing item:', item.json.id);
console.error('Error occurred:', error.message);

// Log to external service
await fetch($env.LOGGING_WEBHOOK, {
  method: 'POST',
  body: JSON.stringify({
    level: 'info',
    message: 'Workflow executed',
    data: item.json,
    timestamp: new Date().toISOString()
  })
});
```
Performance Monitoring:
```javascript
// Track execution time
const startTime = Date.now();

// ... your processing ...

const executionTime = Date.now() - startTime;

return {
  json: {
    ...item.json,
    _metadata: {
      executionTimeMs: executionTime,
      timestamp: new Date().toISOString()
    }
  }
};
```
Alerting on Failures:
```
Error Trigger
  ↓
Code Node (Format Error)
  ↓
Slack/Telegram/Email Notification
```

```javascript
// Format error for notification
const error = $input.first().json;

return {
  json: {
    alert: true,
    severity: 'high',
    message: `Workflow failed: ${error.error.message}`,
    workflow: error.workflow?.name || 'Unknown',
    node: error.node?.name || 'Unknown',
    timestamp: new Date().toISOString(),
    details: error
  }
};
```
8. Real-World Examples
Example 1: Google Sheets β Airtable β Slack Pipeline
Use Case: Sync data from Google Sheets to Airtable and notify team in Slack.
```
Schedule Trigger (Daily at 9 AM)
  ↓
Google Sheets (Read Data)
  ↓
Code Node (Transform Data)
  ↓
Airtable (Create/Update Records)
  ↓
IF Node (Check for new records)
  ├── Yes → Slack (Send notification)
  └── No → End
```
Code Node Transformation:
```javascript
const sheetsData = $input.all();

return sheetsData.map(item => ({
  json: {
    fields: {
      'Name': item.json.name,
      'Email': item.json.email,
      'Status': item.json.status,
      'Last Updated': new Date().toISOString()
    }
  }
}));
```
Example 2: Multi-Platform Social Posting
Use Case: Format one post for multiple social networks from a single source item.
Platform-Specific Formatting:
```javascript
const post = $input.first().json;

// Twitter format (280 chars)
const twitterPost = {
  text: post.content.substring(0, 280),
  media_ids: post.imageId ? [post.imageId] : undefined
};

// LinkedIn format
const linkedInPost = {
  author: `urn:li:person:${$env.LINKEDIN_PERSON_URN}`,
  lifecycleState: 'PUBLISHED',
  specificContent: {
    'com.linkedin.ugc.ShareContent': {
      shareCommentary: {
        text: post.content
      },
      shareMediaCategory: 'ARTICLE',
      media: post.imageUrl ? [{
        status: 'READY',
        media: post.imageUrl
      }] : undefined
    }
  },
  visibility: {
    'com.linkedin.ugc.MemberNetworkVisibility': 'PUBLIC'
  }
};

return [
  { json: { platform: 'twitter', data: twitterPost } },
  { json: { platform: 'linkedin', data: linkedInPost } }
];
```
9. Security, Scalability & Maintenance
Webhook Security
Signature Verification:
```javascript
// Verify webhook signature
const crypto = require('crypto');
const secret = $env.WEBHOOK_SECRET;
// The Webhook node puts headers and body under item.json
const signature = $input.first().json.headers['x-signature'];
const payload = JSON.stringify($input.first().json.body);

const expectedSignature = crypto
  .createHmac('sha256', secret)
  .update(payload)
  .digest('hex');

if (signature !== `sha256=${expectedSignature}`) {
  throw new Error('Invalid webhook signature');
}
```
IP Whitelisting:
```javascript
// Check allowed IPs
const allowedIPs = $env.ALLOWED_IPS.split(',');
const headers = $input.first().json.headers;
const clientIP = headers['x-forwarded-for'] || headers['x-real-ip'];

if (!allowedIPs.includes(clientIP)) {
  throw new Error('IP not allowed');
}
```
Rate Limiting:
```javascript
// Simple fixed-window rate limiting (use Redis for production)
const staticData = $getWorkflowStaticData('global');
const now = Date.now();
const windowMs = 60000; // 1-minute window
const maxRequests = 60;

staticData.hits = (staticData.hits || []).filter(t => now - t < windowMs);
if (staticData.hits.length >= maxRequests) {
  throw new Error('Rate limit exceeded');
}
staticData.hits.push(now);
return $input.all();
```
Scaling n8n
Docker Compose for High Availability:
```yaml
version: '3.8'
services:
  n8n:
    image: n8nio/n8n
    deploy:
      replicas: 3
    environment:
      - EXECUTIONS_MODE=queue   # queue mode is required for multi-instance scaling
      - QUEUE_BULL_REDIS_HOST=redis
    depends_on:
      - postgres
      - redis

  redis:
    image: redis:7-alpine
    volumes:
      - redis_data:/data

  postgres:
    image: postgres:15
    environment:
      - POSTGRES_DB=n8n
    volumes:
      - postgres_data:/var/lib/postgresql/data
```
Kubernetes Deployment:
```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: n8n
spec:
  replicas: 3
  selector:
    matchLabels:
      app: n8n
  template:
    metadata:
      labels:
        app: n8n
    spec:
      containers:
        - name: n8n
          image: n8nio/n8n
          env:
            - name: DB_TYPE
              value: postgresdb
            - name: DB_POSTGRESDB_HOST
              value: postgres-service
```
Database Maintenance
Backup Strategy:
```bash
# Daily PostgreSQL backup (crontab entry)
0 2 * * * pg_dump -h localhost -U n8n n8n | gzip > /backups/n8n_$(date +\%Y\%m\%d).sql.gz

# Retention: keep 30 days
find /backups -name "n8n_*.sql.gz" -mtime +30 -delete
```
Data Pruning:
```bash
# Environment variables for automatic pruning
EXECUTIONS_DATA_PRUNE=true
EXECUTIONS_DATA_MAX_AGE=336  # 14 days
EXECUTIONS_DATA_PRUNE_MAX_COUNT=10000
```
High-Throughput Optimizations
Batch Processing:
```javascript
// Process in optimal batch sizes
const BATCH_SIZE = 100; // Adjust based on API limits
const items = $input.all();

for (let i = 0; i < items.length; i += BATCH_SIZE) {
  const batch = items.slice(i, i + BATCH_SIZE);
  // Process batch
  // Use Split In Batches node for automatic handling
}
```
Connection Pooling:
```javascript
// Reuse HTTP connections
const https = require('https');
const agent = new https.Agent({
  keepAlive: true,
  maxSockets: 50
});

// Pass the agent to clients that support it (e.g. node-fetch's `agent`
// or axios's `httpsAgent`); the built-in WHATWG fetch ignores this option
```
10. Power Shortcuts & Productivity Hacks
n8n Hotkeys
- Save workflow: `Cmd/Ctrl + S`
- Execute workflow: `Cmd/Ctrl + Enter`
- Add node: `Space` (when a node is selected)
- Delete node: `Delete` or `Backspace`
- Undo: `Cmd/Ctrl + Z`
- Redo: `Cmd/Ctrl + Shift + Z`
- Zoom in: `Cmd/Ctrl + +`
- Zoom out: `Cmd/Ctrl + -`
- Reset zoom: `Cmd/Ctrl + 0`
Workflow Templates
Create Reusable Templates:
- Build workflow with placeholder values
- Export as JSON
- Store in version control
- Import and customize for new use cases
Template Structure:
```json
{
  "name": "Template: Data Sync",
  "nodes": [
    {
      "name": "Source",
      "type": "n8n-nodes-base.httpRequest",
      "parameters": {
        "url": "{{$env.SOURCE_API_URL}}"
      }
    }
  ]
}
```
Community Nodes
Recommended Integrations:
- Redis, AWS, Google Cloud, and Kubernetes operations are covered by nodes bundled with n8n core (the `n8n-nodes-base` package); no extra install is needed.
- True community nodes are separate npm packages that follow the `n8n-nodes-*` naming convention.
Installation:
```bash
# Preferred: Settings → Community Nodes → Install (enter the npm package name)
# Or, on a self-hosted instance, install the package manually and restart n8n:
npm install n8n-nodes-<package-name>
```
VS Code Integration
Workflow Development Workflow:
- Export workflow as JSON
- Edit in VS Code with JSON schema validation
- Import back to n8n
- Use Git for version control
Recommended Extensions:
- JSON Tools
- Prettier
- GitLens
Productivity Tips
🔧 Pro Tip: Use workflow tags and naming conventions for easy discovery.
Naming Conventions:
```
[Category] - [Purpose] - [Environment]

Examples:
- "Marketing - Email Campaign - Production"
- "Data - Sync Sheets to DB - Staging"
- "Monitoring - Error Alerts - Production"
```
Workflow Organization:
- Group by department/team
- Use tags: `production`, `staging`, `deprecated`
- Document purpose in workflow notes
- Set up workflow folders
11. Bonus: Advanced Automation Frameworks
Integrating with LangChain
n8n + LangChain Pattern:
```javascript
// Call a LangChain API from n8n
const response = await fetch('http://localhost:8000/chain/invoke', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    input: {
      question: $input.first().json.question,
      context: $input.first().json.context
    }
  })
});

const result = await response.json();
return {
  json: {
    answer: result.output,
    sources: result.sources
  }
};
```
Python Script Integration
Calling Python Scripts:
```javascript
// Execute a Python script via HTTP Request
const pythonResponse = await fetch('http://localhost:5000/process', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' }, // required for Flask's request.json
  body: JSON.stringify($input.first().json)
});

return { json: await pythonResponse.json() };
```
Python Script Example:
```python
# process.py
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route('/process', methods=['POST'])
def process():
    data = request.json
    # Your processing logic
    result = {
        'processed': True,
        'data': data
    }
    return jsonify(result)

if __name__ == '__main__':
    app.run(port=5000)
```
GitHub Actions Integration
Trigger n8n from GitHub Actions:
```yaml
# .github/workflows/trigger-n8n.yml
name: Trigger n8n Workflow
on:
  push:
    branches: [main]

jobs:
  trigger:
    runs-on: ubuntu-latest
    steps:
      - name: Trigger n8n
        run: |
          curl -X POST \
            "${{ secrets.N8N_WEBHOOK_URL }}" \
            -H "Content-Type: application/json" \
            -d '{
              "event": "push",
              "branch": "${{ github.ref }}",
              "commit": "${{ github.sha }}"
            }'
```
Firebase Functions Integration
n8n β Firebase Functions:
```javascript
// Firebase Function calling n8n
const functions = require('firebase-functions');
const axios = require('axios');

exports.triggerN8n = functions.https.onRequest(async (req, res) => {
  await axios.post(process.env.N8N_WEBHOOK_URL, {
    data: req.body,
    source: 'firebase'
  });

  res.json({ success: true });
});
```
CLI Integration
n8n CLI Commands:
```bash
# Install n8n (includes the CLI)
npm install -g n8n

# Execute workflow via CLI
n8n execute --id=WORKFLOW_ID

# Export workflow
n8n export:workflow --id=WORKFLOW_ID --output=workflow.json

# Import workflow
n8n import:workflow --input=workflow.json
```
AI-Driven Automation Orchestration
Pattern: AI Decision Maker β n8n Executor
```
External Event
  ↓
AI Classifier (OpenAI/Claude)
  ↓
Route to Appropriate n8n Workflow
  ├── Workflow A (High Priority)
  ├── Workflow B (Normal)
  └── Workflow C (Low Priority)
```
AI Classification Code:
```javascript
const item = $input.first().json;

const classification = await fetch('https://api.openai.com/v1/chat/completions', {
  method: 'POST',
  headers: {
    'Authorization': `Bearer ${$env.OPENAI_API_KEY}`,
    'Content-Type': 'application/json'
  },
  body: JSON.stringify({
    model: 'gpt-4',
    messages: [{
      role: 'system',
      content: 'Classify items into: urgent, normal, low-priority'
    }, {
      role: 'user',
      content: JSON.stringify(item)
    }]
  })
}).then(r => r.json());

const priority = classification.choices[0].message.content.toLowerCase();

return {
  json: {
    ...item,
    priority: priority,
    workflowId: priority === 'urgent'
      ? $env.URGENT_WORKFLOW_ID
      : $env.NORMAL_WORKFLOW_ID
  }
};
```
Error Handling Checklist
- [ ] Add Error Trigger nodes after critical operations
- [ ] Implement retry logic for transient failures
- [ ] Set up alerting for persistent errors
- [ ] Log errors with context
- [ ] Provide fallback actions
- [ ] Test error scenarios
Performance Checklist
- [ ] Use Split In Batches for large datasets
- [ ] Implement rate limiting
- [ ] Trim unnecessary data fields
- [ ] Use parallel execution where possible
- [ ] Monitor execution times
- [ ] Set up data retention policies
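For the "Monitor execution times" item above, a minimal timing wrapper can be dropped around any step. A runnable sketch (the function names and labels are illustrative):

```javascript
// Minimal timing wrapper: returns the step result plus elapsed milliseconds
function timed(label, fn) {
  const start = Date.now();
  const result = fn();
  return { label, result, ms: Date.now() - start };
}

const run = timed('transform', () => [1, 2, 3].map(x => x * 2));
// run.result → [2, 4, 6]; run.ms holds the elapsed time
```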
Conclusion
This cheatsheet represents decades of automation engineering wisdom distilled into actionable patterns and practices. Use it as your daily reference while building workflows, and remember:
The best automation is invisible - it works so reliably that you forget it exists.
Key Takeaways:
- Start simple, iterate complex - Build MVP workflows first
- Design for failure - Errors will happen, plan for them
- Monitor everything - You can't optimize what you don't measure
- Document as you build - Future you will thank present you
- Reuse and modularize - Don't repeat yourself
Next Steps:
- Set up your n8n instance with proper security
- Build your first workflow using the patterns above
- Join the n8n community for support
- Contribute your own patterns and share knowledge
Happy Automating!
This cheatsheet is a living document. Update it as you discover new patterns and best practices.
Written by
Badal Khatri
AI Engineer & Architect