Keepfig v1.0

AI-powered invoice extraction and financial management platform

Complete SaaS solution with Next.js 16 frontend, Go backend microservices, document extraction, and intelligent data processing.

πŸš€ Quick Start

Prerequisites

Installation

# Clone repository
git clone https://github.com/yourusername/invoice-pro.git
cd invoice-pro

# Install dependencies
./deploy.sh install

# Start development
docker-compose up -d

Access Points

Service | URL | Purpose
Frontend | http://localhost:3000 | Next.js web app
Backend API | http://localhost:8080 | gRPC-Web gateway
PostgreSQL | localhost:5432 | Primary database
Redis | localhost:6379 | Cache & rate limiting

πŸ—οΈ Architecture

System Overview

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚              Frontend (Next.js 16)              β”‚
β”‚  β€’ App Router, TypeScript, Tailwind CSS        β”‚
β”‚  β€’ Connect-RPC gRPC client                     β”‚
β”‚  β€’ React Query for state management            β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
                 β”‚ HTTP/gRPC-Web
                 β–Ό
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚         Backend Services (Go)                   β”‚
β”‚  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”  β”‚
β”‚  β”‚  Rate Limiter (Redis)                    β”‚  β”‚
β”‚  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜  β”‚
β”‚                 β–Ό                                β”‚
β”‚  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”  β”‚
β”‚  β”‚  Authentication (JWT)                    β”‚  β”‚
β”‚  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜  β”‚
β”‚                 β–Ό                                β”‚
β”‚  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”  β”‚
β”‚  β”‚  Business Logic Handlers                 β”‚  β”‚
β”‚  β”‚  β€’ Accounting Service                    β”‚  β”‚
β”‚  β”‚  β€’ Import Service                        β”‚  β”‚
β”‚  β”‚  β€’ Integration Service                   β”‚  β”‚
β”‚  β”‚  β€’ AI Agent Service                      β”‚  β”‚
β”‚  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜  β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
                 β”‚
    β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”Όβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
    β–Ό            β–Ό            β–Ό
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚PostgreSQLβ”‚  β”‚ Redis  β”‚  β”‚ AI APIs  β”‚
β”‚(Primary) β”‚  β”‚(Cache) β”‚  β”‚(OpenAI)  β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜  β””β”€β”€β”€β”€β”€β”€β”€β”€β”˜  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜

πŸ“¦ Data Import System (Complete)

Overview

Universal CSV/Excel import with AI-powered column mapping. Supports any provider (Paystack, Stripe, Flutterwave, Square) with automatic field detection and transformation.

Key Features

Import Strategy

Method | Speed | Cost | Accuracy
Template Matching | Instant | FREE | 100%
Rule-Based Patterns | Instant | FREE | 70-95%
AI Fallback | ~2s | ~$0.02/file | 85%
90-day Cache | Instant | FREE | 95% hit rate
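
As a rough illustration of the 90-day cache row above, the sketch below shows how a deterministic cache key could be derived from a file's column headers, so repeat uploads with the same columns reuse a stored mapping instead of triggering the AI fallback. The helper name is an assumption, not the actual ImportMapperService code.

// Hypothetical helper: derive mapping_cache.cache_key from column headers.
import (
    "crypto/sha256"
    "encoding/hex"
    "sort"
    "strings"
)

func mappingCacheKey(provider string, columns []string) string {
    normalized := make([]string, 0, len(columns))
    for _, c := range columns {
        normalized = append(normalized, strings.ToLower(strings.TrimSpace(c)))
    }
    sort.Strings(normalized) // same columns in any order produce the same key
    sum := sha256.Sum256([]byte(provider + "|" + strings.Join(normalized, ",")))
    return hex.EncodeToString(sum[:])
}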

Database Schema

import_sessions
  - id, tenant_id, filename, status
  - total_rows, processed_rows, error_rows
  - integration_id, provider, template_id
  
import_field_mappings
  - session_id, source_column, target_entity, target_field
  - mapping_strategy (template/ai/rules)
  - confidence_score, transformations (JSONB)
  
import_errors
  - session_id, row_number, field_name, error_message
  
import_templates
  - name, description, provider
  - column_mappings (JSONB), is_public
  
mapping_cache
  - cache_key (hash of columns), mapping_result (JSONB)
  - hit_count, expires_at

API Endpoints (gRPC)

Frontend Components

frontend/src/
β”œβ”€β”€ app/dashboard/data-imports/
β”‚   β”œβ”€β”€ page.tsx                 (list all imports)
β”‚   β”œβ”€β”€ new/page.tsx             (import wizard)
β”‚   β”œβ”€β”€ [id]/page.tsx            (import details)
β”‚   └── templates/page.tsx       (manage templates)
β”œβ”€β”€ components/data-imports/
β”‚   β”œβ”€β”€ import-wizard.tsx        (multi-step wizard)
β”‚   β”œβ”€β”€ upload-step.tsx
β”‚   β”œβ”€β”€ review-step.tsx
β”‚   └── execute-step.tsx
β”œβ”€β”€ hooks/
β”‚   └── use-imports.ts           (8 React hooks)
└── lib/api/
    └── imports.ts               (ImportsAPI client)

Usage Example

// Upload file
const session = await ImportsAPI.uploadFile({
  tenantId: 'uuid',
  file: csvFile,
  integrationId: 'uuid', // optional
  provider: 'stripe',    // optional
  templateId: 'uuid',    // optional (instant import)
})

// Get mappings (AI or template)
const { mappings } = await ImportsAPI.getMappings(tenantId, sessionId)

// Adjust if needed
await ImportsAPI.updateMapping({
  tenantId,
  sessionId,
  mappingId: 'uuid',
  targetField: 'amount',
  isUserConfirmed: true
})

// Execute import
await ImportsAPI.executeImport(tenantId, sessionId)
πŸ’‘ Template Optimization: When a template is selected, mappings are applied instantly without AI analysis. This provides 100% accuracy at zero cost, compared to ~$0.02 per AI analysis.

⚑ Workflow Automation (Production Ready)

Overview

Visual workflow builder with scheduled execution, AI-powered automation, and seamless integration management. Build complex business processes with drag-and-drop nodes that execute automatically on configurable intervals.

Key Features

Workflow Architecture

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚        Workflow Builder (Frontend V2)            β”‚
β”‚  β€’ Drag-and-drop canvas with zoom/pan           β”‚
β”‚  β€’ Node palette (triggers, actions, logic)       β”‚
β”‚  β€’ Configuration sidebar                         β”‚
β”‚  β€’ Integration validation before save            β”‚
β”‚  β€’ Activate/Pause buttons with status badge      β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
                 β”‚ Save β†’ status: draft
                 β”‚ Activate β†’ /api/workflows/{id}/activate
                 β–Ό
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚      Workflow API Routes (Next.js)               β”‚
β”‚  β€’ POST /api/workflows - Create/update          β”‚
β”‚  β€’ POST /activate - Set active + create schedule β”‚
β”‚  β€’ POST /pause - Set paused + disable schedule   β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
                 β”‚ gRPC: activateWorkflow, pauseWorkflow
                 β–Ό
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚       Backend Workflow Services (Go)             β”‚
β”‚  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”‚
β”‚  β”‚  Workflow Scheduler (robfig/cron/v3)       β”‚ β”‚
β”‚  β”‚  β€’ Auto-starts with app lifecycle          β”‚ β”‚
β”‚  β”‚  β€’ Loads active schedules from DB          β”‚ β”‚
β”‚  β”‚  β€’ Converts polling intervals to cron      β”‚ β”‚
β”‚  β”‚  β€’ Executes workflows on schedule          β”‚ β”‚
β”‚  β”‚  β€’ Checks workflow status before execution β”‚ β”‚
β”‚  β”‚  β€’ Refresh loop every 5 minutes            β”‚ β”‚
β”‚  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ β”‚
β”‚               β–Ό                                  β”‚
β”‚  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”‚
β”‚  β”‚  Workflow Runner (Graph Executor)          β”‚ β”‚
β”‚  β”‚  β€’ Resolves {{template}} variables         β”‚ β”‚
β”‚  β”‚  β€’ Executes nodes in dependency order      β”‚ β”‚
β”‚  β”‚  β€’ Manages execution context               β”‚ β”‚
β”‚  β”‚  β€’ Updates run counts and timestamps       β”‚ β”‚
β”‚  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ β”‚
β”‚               β–Ό                                  β”‚
β”‚  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”‚
β”‚  β”‚  Node Executors (11 types)                β”‚ β”‚
β”‚  β”‚  β€’ Triggers: paystack_scheduled_sync       β”‚ β”‚
β”‚  β”‚  β€’ Actions: create_payment, create_contact β”‚ β”‚
β”‚  β”‚  β€’ Logic: condition, loop, transform       β”‚ β”‚
β”‚  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜

Scheduled Execution System

Workflows can be activated to run automatically on configurable intervals. The scheduler uses cron expressions and tracks execution history.

Polling Intervals

Interval | Cron Expression | Use Case
Every 15 minutes | */15 * * * * | Real-time transaction sync
Every 30 minutes | */30 * * * * | Frequent updates
Every hour | 0 * * * * | Hourly reports
Every 6 hours | 0 */6 * * * | Periodic sync
Daily (midnight) | 0 0 * * * | Daily reconciliation (recommended)

Workflow Lifecycle

// 1. Save workflow (status: draft)
POST /api/workflows
{
  "name": "Auto-import Paystack",
  "nodes": [...],
  "connections": [...]
}
β†’ Workflow saved, NOT scheduled

// 2. Activate workflow (creates schedule)
POST /api/workflows/{id}/activate
{
  "id": "workflow-uuid"
}
β†’ Response:
{
  "workflow": { "status": "active", ... },
  "scheduleId": "schedule-uuid",
  "cronExpression": "0 0 * * *",
  "nextRunAt": "2025-11-29T00:00:00Z"
}
β†’ Status badge shows "● Active" (green)
β†’ Scheduler begins executing daily

// 3. Pause workflow (disables schedule)
POST /api/workflows/{id}/pause
{
  "id": "workflow-uuid"
}
β†’ Response:
{
  "workflow": { "status": "paused", ... },
  "success": true
}
β†’ Status badge shows "⏸ Paused" (yellow)
β†’ Schedule preserved but disabled

// 4. Re-activate (re-enables existing schedule)
POST /api/workflows/{id}/activate
β†’ Same schedule reactivated
β†’ Status returns to "active"

Scheduler Implementation

// WorkflowScheduler (Go)

type WorkflowScheduler struct {
    cron         *cron.Cron
    db           *database.DB
    scheduleMap  map[string]cron.EntryID
    workflowRunner *WorkflowRunner
}

// Start - Called on app startup
func (s *WorkflowScheduler) Start(ctx context.Context) {
    s.cron.Start()
    s.LoadActiveSchedules(ctx)
    go s.refreshLoop(ctx)  // Reload every 5 min
}

// ConvertPollingIntervalToCron
func ConvertPollingIntervalToCron(interval string) string {
    cronMap := map[string]string{
        "15":   "*/15 * * * *",  // Every 15 minutes
        "30":   "*/30 * * * *",  // Every 30 minutes
        "60":   "0 * * * *",     // Every hour
        "360":  "0 */6 * * *",   // Every 6 hours
        "1440": "0 0 * * *",     // Daily at midnight
    }
    return cronMap[interval]
}

// CreateOrUpdateScheduleForWorkflow
func (s *WorkflowScheduler) CreateOrUpdateScheduleForWorkflow(
    ctx context.Context,
    workflowID, tenantID uuid.UUID,
    pollingInterval string,
) (*models.WorkflowSchedule, error) {
    cronExpr := ConvertPollingIntervalToCron(pollingInterval)
    
    // Check if a schedule already exists for this workflow
    var existing models.WorkflowSchedule
    found := db.Where("workflow_id = ?", workflowID).First(&existing).Error == nil
    if found {
        // Update cron and re-add to scheduler
        existing.CronExpression = cronExpr
        existing.IsActive = true
        db.Save(&existing)
        s.RemoveSchedule(existing.ID)
        s.AddSchedule(ctx, &existing)
        return &existing, nil
    }
    
    // Create new schedule
    return s.CreateSchedule(workflowID, tenantID, cronExpr, "UTC")
}

// executeScheduledWorkflow - Run workflow on schedule
func (s *WorkflowScheduler) executeScheduledWorkflow(
    ctx context.Context,
    schedule *models.WorkflowSchedule,
) {
    // Check workflow status
    var workflow models.Workflow
    db.Where("id = ?", schedule.WorkflowID).First(&workflow)
    
    if workflow.Status != "active" {
        log.Printf("Skipping workflow %s: status is '%s'", 
            schedule.WorkflowID, workflow.Status)
        return
    }
    
    // Update last_run_at, calculate next_run_at from the cron expression
    now := time.Now()
    cronSchedule, _ := cron.ParseStandard(schedule.CronExpression)
    nextRun := cronSchedule.Next(now)
    db.Model(schedule).Updates(map[string]interface{}{
        "last_run_at": now,
        "next_run_at": nextRun,
    })
    
    // Execute workflow
    s.workflowRunner.ExecuteWorkflow(
        ctx, 
        schedule.WorkflowID, 
        schedule.TenantID,
        models.TriggerTypeSchedule, 
        nil,
    )
}

Frontend Activation Controls

// workflow-builder-v2.tsx

const [currentWorkflow, setCurrentWorkflow] = useState<{
  id: string
  name: string
  status?: string  // 'draft' | 'active' | 'paused'
} | null>(null)

const activateWorkflow = async () => {
  const response = await fetch(
    `/api/workflows/${currentWorkflow.id}/activate`, 
    { method: 'POST' }
  )
  const data = await response.json()
  showToast(
    `Workflow activated! Next run: ${new Date(data.nextRunAt).toLocaleString()}`,
    "success"
  )
  setCurrentWorkflow({ ...currentWorkflow, status: 'active' })
}

const pauseWorkflow = async () => {
  await fetch(`/api/workflows/${currentWorkflow.id}/pause`, { method: 'POST' })
  showToast("Workflow paused", "success")
  setCurrentWorkflow({ ...currentWorkflow, status: 'paused' })
}

// UI - Status badge
{currentWorkflow?.status && (
  <span className={`badge ${
    currentWorkflow.status === 'active' ? 'badge-success' :
    currentWorkflow.status === 'paused' ? 'badge-warning' :
    'badge-secondary'
  }`}>
    {currentWorkflow.status === 'active' ? '● Active' :
     currentWorkflow.status === 'paused' ? '⏸ Paused' :
     'β—‹ Draft'}
  </span>
)}

// UI - Activate/Pause button
{currentWorkflow?.status === 'active' ? (
  <Button onClick={pauseWorkflow}>Pause</Button>
) : (
  <Button onClick={activateWorkflow}>Activate</Button>
)}

Node Executors

Category | Node Type | Purpose
Triggers | paystack_sync | Fetch transactions from Paystack API
Actions | create_payment | Create payment record in database
Actions | create_contact | Create customer/vendor contact
Actions | create_journal | Create journal entry for accounting
Actions | link_external | Link to external transaction source
Actions | send_email | Send notification email
Logic | condition | If/else branching with operators
Logic | loop | Iterate over arrays, set context per item
Logic | transform | Data mapping and transformation

Template Variable System

Use {{variable}} syntax to reference data from previous nodes. Supports dot notation for nested fields.

// Template syntax examples
{{current_transaction.amount}}       // Access nested field
{{loop_item.customer.email}}         // Access from loop context
{{paystack_transactions[0].status}}  // Array indexing

// Context variables
{{loop_item}}      // Current item in loop
{{loop_index}}     // Current iteration index (0-based)
{{loop_total}}     // Total items in array
{{loop_is_first}}  // Boolean: first iteration
{{loop_is_last}}   // Boolean: last iteration
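
A minimal sketch of how dot-notation resolution could work against the execution context is shown below. The resolvePath helper is illustrative, not the workflow runner's actual implementation, and array indexing such as [0] is omitted.

import "strings"

// resolvePath walks a map-based execution context following "a.b.c" paths,
// e.g. resolvePath(ctx, "loop_item.customer.email").
func resolvePath(ctx map[string]interface{}, path string) interface{} {
    var current interface{} = ctx
    for _, part := range strings.Split(path, ".") {
        m, ok := current.(map[string]interface{})
        if !ok {
            return nil // path goes deeper than the available data
        }
        current = m[part]
    }
    return current
}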

Condition Operators

Operator | Example | Use Case
equals | status == "success" | Exact match
not_equals | type != "refund" | Exclusion
greater_than | amount > 10000 | Numeric threshold
less_than | amount < 1000 | Lower bound
contains | email contains "@gmail" | Substring match
starts_with | reference starts_with "PAY" | Prefix match
ends_with | name ends_with ".pdf" | Suffix match
is_empty | description is_empty | Null/empty check
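
The following sketch shows how a condition node could evaluate these operators against a resolved field value. It is an assumption about the shape of executors/condition.go, not an excerpt from it.

import (
    "fmt"
    "strconv"
    "strings"
)

func evaluateCondition(field interface{}, operator string, value interface{}) bool {
    fieldStr := fmt.Sprintf("%v", field)
    valueStr := fmt.Sprintf("%v", value)
    toFloat := func(s string) float64 {
        f, _ := strconv.ParseFloat(s, 64)
        return f
    }
    switch operator {
    case "equals":
        return fieldStr == valueStr
    case "not_equals":
        return fieldStr != valueStr
    case "greater_than":
        return toFloat(fieldStr) > toFloat(valueStr)
    case "less_than":
        return toFloat(fieldStr) < toFloat(valueStr)
    case "contains":
        return strings.Contains(fieldStr, valueStr)
    case "starts_with":
        return strings.HasPrefix(fieldStr, valueStr)
    case "ends_with":
        return strings.HasSuffix(fieldStr, valueStr)
    case "is_empty":
        return strings.TrimSpace(fieldStr) == ""
    default:
        return false // unknown operator: fail closed
    }
}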

Integration-Aware Workflow Builder

When creating workflows that require integrations (Paystack, Stripe, etc.), the system automatically validates that integrations are connected before saving.

User Flow

  1. User drags Paystack Scheduled Sync node onto canvas
  2. Node configuration shows alert: "⚠️ Integration Required - Connect Paystack Integration β†’"
  3. User configures polling interval and transaction filter
  4. User clicks Save Workflow
  5. Backend validates that Paystack integration exists and is connected
  6. If missing: Returns 400 error with provider name
  7. Frontend shows alert: "Please connect Paystack integration before saving"
  8. User clicks link β†’ Redirects to integrations page
  9. User connects Paystack with API key
  10. Returns to workflow builder and saves successfully βœ…

Backend Validation

// API Route: /api/workflows (POST)

// Extract integration nodes
const integrationNodes = nodes.filter(node => 
  node.nodeType === 'paystack_scheduled_sync' || 
  node.nodeType === 'stripe_scheduled_sync'
)

// Check each integration
for (const node of integrationNodes) {
  const provider = node.nodeType.replace('_scheduled_sync', '')
  
  const integrations = await accountingClient.listIntegrations({})
  const hasIntegration = integrations.some(
    int => int.provider === provider && int.status === 'connected'
  )
  
  if (!hasIntegration) {
    return NextResponse.json({
      error: 'Missing required integrations',
      missingIntegrations: [{ provider }],
      message: `Please connect ${provider} before saving`
    }, { status: 400 })
  }
}

Frontend Node Configuration

// Paystack Sync Node Config UI

{selectedNode.nodeType === "paystack_scheduled_sync" && (
  <>
    <div className="bg-blue-50 border border-blue-200 rounded p-3">
      <AlertCircle className="text-blue-600" />
      <span className="font-semibold">Integration Required</span>
      <p>This workflow requires a connected Paystack integration.</p>
      <a href="/integrations">Connect Paystack Integration β†’</a>
    </div>
    
    <select>
      <option value="15">Every 15 minutes</option>
      <option value="60">Every hour</option>
      <option value="1440">Daily (recommended)</option>
    </select>
  </>
)}

Example Workflow: Auto-Import Paystack Transactions

{
  "name": "Auto-import Paystack Transactions",
  "nodes": [
    {
      "id": "node_1",
      "nodeType": "paystack_scheduled_sync",
      "config": {
        "polling_interval": "1440",
        "from_date": "{{today - 7 days}}",
        "to_date": "{{today}}",
        "transaction_filter": "success_only"
      }
    },
    {
      "id": "node_2",
      "nodeType": "loop",
      "config": {
        "array_path": "paystack_transactions.transactions"
      }
    },
    {
      "id": "node_3",
      "nodeType": "condition",
      "config": {
        "field": "{{loop_item.amount}}",
        "operator": "greater_than",
        "value": 10000
      }
    },
    {
      "id": "node_4",
      "nodeType": "create_payment",
      "config": {
        "amount": "{{loop_item.amount}}",
        "reference": "{{loop_item.reference}}",
        "customer_email": "{{loop_item.customer.email}}",
        "payment_type": "received",
        "payment_method": "card"
      }
    },
    {
      "id": "node_5",
      "nodeType": "create_journal",
      "config": {
        "description": "Payment received: {{loop_item.reference}}",
        "lines": [
          {
            "account_code": "1010",
            "debit": "{{loop_item.amount}}"
          },
          {
            "account_code": "4010",
            "credit": "{{loop_item.amount}}"
          }
        ]
      }
    }
  ],
  "connections": [
    { "sourceId": "node_1", "targetId": "node_2" },
    { "sourceId": "node_2", "targetId": "node_3" },
    { "sourceId": "node_3", "targetId": "node_4", "type": "true_branch" },
    { "sourceId": "node_4", "targetId": "node_5" }
  ]
}

Workflow Execution Flow

  1. Scheduler triggers workflow (daily at midnight)
  2. Paystack Sync Node fetches last 7 days of transactions via API
  3. Loop Node iterates over each transaction, sets {{loop_item}}
  4. Condition Node checks if amount > ₦10,000
  5. Create Payment Node creates payment record with resolved values
  6. Create Journal Node generates double-entry journal entry
  7. Workflow completes, logs execution summary

Paystack API Integration

Real HTTP integration to api.paystack.co/transaction with pagination, date filtering, and Bearer authentication.

// Paystack Sync Executor (Go)

// Signature simplified for illustration: tenantID and config come from the
// workflow execution context.
func (e *PaystackSyncExecutor) Execute(ctx context.Context, tenantID uuid.UUID, config NodeConfig) (map[string]interface{}, error) {
  // 1. Get integration credentials
  integration := getIntegration(tenantID, "paystack")
  
  // 2. Fetch transactions from Paystack
  transactions := fetchPaystackTransactions(
    integration.SecretKey,
    config.FromDate,
    config.ToDate,
    100, // perPage
  )
  
  // 3. Transform API response
  processedTxns := make([]map[string]interface{}, 0, len(transactions))
  for _, txn := range transactions {
    amount := txn.Amount / 100 // kobo to naira
    processedTxns = append(processedTxns, map[string]interface{}{
      "amount":    amount,
      "reference": txn.Reference,
      "status":    txn.Status,
      "customer": map[string]interface{}{
        "email": txn.Customer.Email,
        "name":  txn.Customer.FirstName + " " + txn.Customer.LastName,
      },
    })
  }
  
  // 4. Store in execution context
  return map[string]interface{}{
    "paystack_transactions": map[string]interface{}{
      "transactions": processedTxns,
      "total":        len(processedTxns),
    },
  }, nil
}

Performance

Files

File | Lines | Purpose
workflow_scheduler.go | 337 | Cron scheduler with lifecycle management
workflow_runner.go | 380 | Graph execution engine (stubs removed)
executors/paystack_sync.go | 210 | Paystack API integration
executors/loop.go | 145 | Array iteration with context
executors/condition.go | 217 | If/else branching logic
executors/transform.go | 180 | Data transformation
workflows.go | 851 | gRPC handlers (activate/pause added)
accounting.proto | 2660+ | Proto definitions (activate/pause RPCs)
workflow-builder-v2.tsx | 2245 | Visual builder with activation controls
/api/workflows/.../activate | 29 | Activation endpoint
/api/workflows/.../pause | 27 | Pause endpoint
βœ… Production Ready: Scheduled workflow execution system complete with activation/pause controls, status tracking, cron-based scheduler, and automatic execution on configurable intervals. All 11 node executors are registered and legacy stubs removed. Backend runs in Docker; frontend runs via npm.

πŸ“§ Email System (Production Ready)

Overview

Template-based email system with Sparkpost integration, dynamic recipients, file attachments, and workflow automation. All emails use HTML templates with automatic plain text generation.

Key Features

Email Architecture

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚         Workflow Email Node (send_email)         β”‚
β”‚  β€’ Template selection (required)                 β”‚
β”‚  β€’ Recipient configuration                       β”‚
β”‚  β€’ Subject with variables                        β”‚
β”‚  β€’ Attachment configuration                      β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
                 β”‚
                 β–Ό
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚        Email Executor (Go Backend)               β”‚
β”‚  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”‚
β”‚  β”‚  1. Resolve Recipients                     β”‚ β”‚
β”‚  β”‚     β€’ tenant_admin β†’ Tenant owner          β”‚ β”‚
β”‚  β”‚     β€’ user β†’ From workflow context         β”‚ β”‚
β”‚  β”‚     β€’ contact β†’ From payment/transaction   β”‚ β”‚
β”‚  β”‚     β€’ custom β†’ Static email list           β”‚ β”‚
β”‚  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ β”‚
β”‚               β–Ό                                  β”‚
β”‚  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”‚
β”‚  β”‚  2. Prepare Template Data                  β”‚ β”‚
β”‚  β”‚     β€’ Flatten payment/contact data         β”‚ β”‚
β”‚  β”‚     β€’ Add workflow metadata                β”‚ β”‚
β”‚  β”‚     β€’ Merge custom data from config        β”‚ β”‚
β”‚  β”‚     β€’ Replace {{variables}}                β”‚ β”‚
β”‚  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ β”‚
β”‚               β–Ό                                  β”‚
β”‚  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”‚
β”‚  β”‚  3. Render Template (html/template)        β”‚ β”‚
β”‚  β”‚     β€’ Load from templates/emails/          β”‚ β”‚
β”‚  β”‚     β€’ Execute with template data           β”‚ β”‚
β”‚  β”‚     β€’ Generate plain text version          β”‚ β”‚
β”‚  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ β”‚
β”‚               β–Ό                                  β”‚
β”‚  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”‚
β”‚  β”‚  4. Generate Attachments                   β”‚ β”‚
β”‚  β”‚     β€’ payment_receipt β†’ PDF/HTML           β”‚ β”‚
β”‚  β”‚     β€’ csv_export β†’ CSV from data           β”‚ β”‚
β”‚  β”‚     β€’ financial_report β†’ Comprehensive PDF β”‚ β”‚
β”‚  β”‚     β€’ Base64 encode for API                β”‚ β”‚
β”‚  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ β”‚
β”‚               β–Ό                                  β”‚
β”‚  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” β”‚
β”‚  β”‚  5. Send via Sparkpost API                 β”‚ β”‚
β”‚  β”‚     β€’ POST /transmissions                  β”‚ β”‚
β”‚  β”‚     β€’ Multiple recipients (To/CC/BCC)      β”‚ β”‚
β”‚  β”‚     β€’ Attachments with base64 encoding     β”‚ β”‚
β”‚  β”‚     β€’ Sandbox mode for development         β”‚ β”‚
β”‚  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜

Built-in Email Templates

Template | Purpose | Key Variables
payment_received | Payment confirmation | amount, currency, reference, transaction_date
workflow_success | Workflow completion | workflow_name, execution_id, nodes_executed, duration
workflow_failed | Workflow failure alert | workflow_name, failed_node, error_message
financial_report | Financial summary | revenue, expenses, net_income, top_expenses, top_revenue

Recipient Resolution Types

Type | Description | Configuration
tenant_admin | Send to tenant owner/administrator | {"type": "tenant_admin"}
user | Extract from workflow context | {"type": "user", "email_field": "user.email", "name_field": "user.name"}
contact | From payment/transaction data | {"type": "contact", "email_field": "created_payment.contact_email"}
custom | Static email list | {"type": "custom", "emails": ["admin@company.com"]}

Email Node Configuration Example

// Payment notification with receipt
{
  "type": "send_email",
  "config": {
    "template": "payment_received",
    "recipients": {
      "type": "contact",
      "email_field": "created_payment.contact_email",
      "name_field": "created_payment.contact_name",
      "cc_tenant": true  // CC tenant admin
    },
    "subject": "Payment Received - {{reference}}",
    "attachments": [
      {
        "type": "payment_receipt",
        "filename": "receipt_{{reference}}.pdf"
      }
    ],
    "data": {
      "recipient_name": "{{contact_name}}",
      "dashboard_url": "https://app.company.com"
    }
  }
}

Attachment Types

Type | Generated Content | Use Case
payment_receipt | HTML/PDF receipt | Payment confirmations
csv_export | CSV from data array | Transaction reports, data exports
financial_report | Comprehensive PDF | Monthly/quarterly reports
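
As a sketch of the csv_export path (the row shape and helper name are assumptions), rows gathered from the workflow context are serialized to CSV and base64-encoded before being attached to the Sparkpost transmission:

import (
    "bytes"
    "encoding/base64"
    "encoding/csv"
)

// buildCSVAttachment turns tabular data into the base64 string the email
// API expects for attachment content.
func buildCSVAttachment(headers []string, rows [][]string) (string, error) {
    var buf bytes.Buffer
    w := csv.NewWriter(&buf)
    if err := w.Write(headers); err != nil {
        return "", err
    }
    if err := w.WriteAll(rows); err != nil { // WriteAll flushes internally
        return "", err
    }
    return base64.StdEncoding.EncodeToString(buf.Bytes()), nil
}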

Variable Substitution

Use {{variable}} syntax with dot notation for nested values:

// Automatically available variables
{{workflow_id}}           // Current workflow ID
{{execution_id}}          // Current execution ID
{{tenant_name}}           // Tenant company name
{{current_date}}          // YYYY-MM-DD format
{{current_datetime}}      // RFC3339 format
{{dashboard_url}}         // Dashboard base URL

// Flattened payment data (from created_payment)
{{amount}}                // Payment amount
{{currency}}              // Payment currency (USD, NGN, etc.)
{{reference}}             // Payment reference
{{description}}           // Payment description
{{payment_method}}        // Payment method (card, bank_transfer)
{{transaction_date}}      // Transaction date
{{payment_id}}            // Payment UUID

// Flattened contact data (from created_contact)
{{contact_name}}          // Contact full name
{{contact_email}}         // Contact email
{{contact_phone}}         // Contact phone

// Nested access from workflow context
{{created_payment.metadata.custom_field}}
{{loop_item.customer.name}}

Environment Configuration

# Sparkpost API Configuration
SPARKPOST_API_KEY=your_sparkpost_api_key
SPARKPOST_API_URL=https://api.sparkpost.com/api/v1

# Email Defaults
EMAIL_FROM_ADDRESS=noreply@yourdomain.com
EMAIL_FROM_NAME=Keepfig

# Environment (affects sandbox mode)
ENVIRONMENT=production  # development enables sandbox mode

Sparkpost Setup Steps

  1. Get API Key: Log in to Sparkpost dashboard β†’ API Keys β†’ Create New Key
  2. Verify Domain: Account β†’ Sending Domains β†’ Add Domain
  3. Add DNS Records: Copy DKIM and SPF records to Namecheap DNS
  4. DKIM: TXT record like scph1234._domainkey β†’ v=DKIM1; k=rsa; p=...
  5. SPF: TXT record @ β†’ v=spf1 include:sparkpostmail.com ~all
  6. Verify: Wait 24-48h for DNS propagation, click Verify in Sparkpost
  7. Test: Use sandbox mode (ENVIRONMENT=development) for testing

Pricing

πŸ’‘ Cost Estimate: 100 tenants Γ— 50 emails/month = 5,000 emails = $1/month

Schema-Driven Configuration UI

The workflow builder uses a dynamic, composable configuration system instead of hardcoded conditions for each node type. This makes it easy to add new node types without frontend code changes.

How It Works

  1. Backend Schema: Node types define config_schema in JSON (e.g., actions.json)
  2. Dynamic Renderer: DynamicConfigRenderer component reads the schema and generates UI
  3. Field Types: text, textarea, select, boolean, number, array, object (nested configs)
  4. Conditional Fields: required_when shows/hides fields based on parent values
  5. Template Metadata: Each email template option includes required_data, suggested_recipients, supports_attachments

Example: send_email Config Schema

{
  "template": {
    "type": "select",
    "label": "Email Template",
    "required": true,
    "options": [
      {
        "value": "payment_received",
        "label": "Payment Received",
        "description": "Payment confirmation with receipt",
        "required_data": ["amount", "currency", "reference"],
        "suggested_recipients": ["contact"],
        "supports_attachments": ["payment_receipt"]
      }
    ]
  },
  "recipients": {
    "type": "object",
    "properties": {
      "type": {
        "type": "select",
        "options": [
          { "value": "tenant_admin", "label": "Tenant Owner/Admin" },
          { "value": "user", "label": "User from Context" },
          { "value": "contact", "label": "Contact from Payment" },
          { "value": "custom", "label": "Custom Email List" }
        ]
      },
      "email_field": {
        "type": "text",
        "required_when": ["user", "contact"],
        "placeholder": "team_member.email"
      }
    }
  }
}

Benefits

πŸ’‘ Location: frontend/src/components/workflow/dynamic-config-renderer.tsx

Common Workflow Patterns

1. Payment Notification Flow

paystack_sync β†’ create_payment β†’ send_email (payment_received)
                                         ↓
                              Store payment in DB
                                         ↓
                              Email customer with receipt

2. Workflow Status Notifications

// Success notification
[workflow completes] β†’ send_email (workflow_success to admin)

// Failure notification  
[workflow fails] β†’ send_email (workflow_failed to admin)

3. Scheduled Reports

[cron: daily at midnight]
  ↓
generate_report_data
  ↓
send_email (financial_report with CSV attachment)

4. Conditional Notifications

create_payment
  ↓
condition (amount > 10000)
  ↓ (true)
send_email (high_value_payment alert to CFO)

πŸ”’ Rate Limiting (Complete)

Overview

Multi-tier token bucket rate limiting with Redis. Protects API from abuse with automatic retry and user-friendly feedback.

Architecture

Default Limits

Tier | Requests/Min | Burst
Global | 10,000 | 100
Tenant | 1,000 | 50
User | 200 | 20
AI Chat | 100 | 10
File Upload | 200 | 20
Bulk Ops | 50 | 5

Backend Implementation

// Go - Rate limiter with token bucket
rateLimiter := middleware.NewRateLimiter(redisClient, config)

// gRPC interceptor chain
grpcSrv := grpc.NewServer(
    grpc.ChainUnaryInterceptor(
        rateLimiter.Unary(),      // Rate limit FIRST
        authInterceptor.Unary(),  // Auth second
    ),
)
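
For context, a simplified Redis-backed check might look like the sketch below, assuming github.com/redis/go-redis/v9. This windowed counter only approximates the production token bucket, which additionally tracks per-tier burst.

import (
    "context"

    "github.com/redis/go-redis/v9"
)

// Lua keeps the read-check-decrement step atomic per key
// (e.g. "ratelimit:tenant:<uuid>").
var rateLimitScript = redis.NewScript(`
local tokens = redis.call("GET", KEYS[1])
if not tokens then
  tokens = ARGV[1]                                -- bucket starts full
  redis.call("SET", KEYS[1], tokens, "EX", ARGV[2])
end
if tonumber(tokens) <= 0 then return 0 end        -- no tokens left: reject
redis.call("DECR", KEYS[1])
return 1
`)

func allow(ctx context.Context, rdb *redis.Client, key string, limit, windowSec int) (bool, error) {
    res, err := rateLimitScript.Run(ctx, rdb, []string{key}, limit, windowSec).Int()
    if err != nil {
        return false, err
    }
    return res == 1, nil
}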

Frontend Integration

// Automatic retry on rate limit
import { withRateLimitRetry } from '@/lib/rate-limit-handler'

const result = await withRateLimitRetry(async () => {
  return await fetch('/api/endpoint')
})
// Retries 3x with backoff: 1s β†’ 2s β†’ 4s

// Toast notification
import { useRateLimitToast } from '@/components/ui/rate-limit-toast'

const { toast, showToast } = useRateLimitToast()
if (isRateLimited) {
  showToast("Rate limit reached, retrying in 3s...", 3)
}

Performance

πŸ“„ Complete Guide: See rate-limiting.html for detailed documentation with examples, testing, and monitoring.

πŸ€– AI Chatbot (Enhanced)

Overview

Natural language interface for financial queries. Powered by OpenAI with context-aware function calling.

Features

Available Functions

Function | Purpose
get_dashboard_stats | Revenue, expenses, profit overview
list_transactions | Recent transactions with filters
get_revenue_trend | Time-series revenue analysis
get_expense_breakdown | Category-wise expense breakdown
search_payments | Find specific payments
get_mrr_report | Monthly recurring revenue metrics
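
A hedged sketch of how a function call returned by OpenAI could be dispatched to these handlers follows. The dispatch method and the lower-case helpers are illustrative names, not the actual AIAgentService API.

import (
    "context"
    "fmt"
)

func (s *AIAgentService) dispatch(ctx context.Context, tenantID, name string, args map[string]interface{}) (interface{}, error) {
    switch name {
    case "get_dashboard_stats":
        return s.getDashboardStats(ctx, tenantID)
    case "list_transactions":
        return s.listTransactions(ctx, tenantID, args)
    case "get_revenue_trend":
        return s.getRevenueTrend(ctx, tenantID, args)
    case "get_expense_breakdown":
        return s.getExpenseBreakdown(ctx, tenantID, args)
    case "search_payments":
        return s.searchPayments(ctx, tenantID, args)
    case "get_mrr_report":
        return s.getMRRReport(ctx, tenantID)
    default:
        return nil, fmt.Errorf("unknown function: %s", name)
    }
}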

Usage

// Frontend - Chat with AI
import { accountingClientBrowser } from '@/lib/connect-rpc-browser'

const stream = accountingClientBrowser.chatWithAI({
  tenantId: 'uuid',
  sessionId: 'uuid',
  userMessage: 'What was my revenue last month?'
})

for await (const response of stream) {
  console.log(response.assistantMessage)
}

πŸ”Œ Integrations (Active)

πŸ“š Full API Documentation: For complete REST API documentation with all endpoints, request/response schemas, and examples, see the Swagger API Reference β†’

Supported Providers

Provider | Type | Features
Paystack | API + CSV | Live sync, transaction history, webhooks
Stripe | CSV | Balance history import, template matching
Flutterwave | CSV | Transaction export import
Square | CSV | Payment export import
Generic | CSV | Any provider with AI mapping

Integration Architecture

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚     Integration Manager                 β”‚
β”œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€
β”‚                                         β”‚
β”‚  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”   β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”  β”‚
β”‚  β”‚  API Sync    β”‚   β”‚  CSV Import  β”‚  β”‚
β”‚  β”‚  (Paystack)  β”‚   β”‚  (All)       β”‚  β”‚
β”‚  β””β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”˜   β””β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”˜  β”‚
β”‚         β”‚                  β”‚           β”‚
β”‚         β–Ό                  β–Ό           β”‚
β”‚  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”  β”‚
β”‚  β”‚  ExternalTransactionSource      β”‚  β”‚
β”‚  β”‚  (Deduplication & Linking)      β”‚  β”‚
β”‚  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜  β”‚
β”‚         β”‚                               β”‚
β”‚         β–Ό                               β”‚
β”‚  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”  β”‚
β”‚  β”‚  Payment & Journal Entry        β”‚  β”‚
β”‚  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜  β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜

Dual Entry Points

πŸ’‘ Duplicate Prevention: CSV imports create ExternalTransactionSource records to prevent re-import via API sync. Reference column becomes ExternalID for uniqueness.
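
A minimal sketch of that deduplication check, assuming GORM and the external_transaction_sources table described later in the schema (the helper name and model fields shown here are assumptions):

import (
    "errors"

    "gorm.io/gorm"
)

// Minimal model for illustration only.
type ExternalTransactionSource struct {
    IntegrationID string
    ExternalID    string
    TransactionID string
}

// alreadyImported reports whether an external reference has been recorded by
// a previous CSV import or API sync for this integration.
func alreadyImported(db *gorm.DB, integrationID, externalID string) (bool, error) {
    var src ExternalTransactionSource
    err := db.Where("integration_id = ? AND external_id = ?", integrationID, externalID).
        First(&src).Error
    if errors.Is(err, gorm.ErrRecordNotFound) {
        return false, nil // safe to import
    }
    if err != nil {
        return false, err
    }
    return true, nil // skip: reference already linked to a transaction
}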

πŸ“Š Pitch Deck Generation (Active)

Overview

AI-generated investor pitch decks with live financial metrics. Connect data sources, answer questions, get presentation-ready slides.

Features

Data Sources

βš™οΈ Backend Services

Service Architecture

backend/
β”œβ”€β”€ cmd/
β”‚   └── server/
β”‚       └── main.go              (gRPC server, interceptors)
β”œβ”€β”€ internal/
β”‚   β”œβ”€β”€ accountingserver/        (20+ gRPC handlers)
β”‚   β”‚   β”œβ”€β”€ accounts.go
β”‚   β”‚   β”œβ”€β”€ payments.go
β”‚   β”‚   β”œβ”€β”€ journalentries.go
β”‚   β”‚   β”œβ”€β”€ reconciliation.go
β”‚   β”‚   β”œβ”€β”€ ai_agent.go
β”‚   β”‚   └── ...
β”‚   β”œβ”€β”€ services/                (Business logic)
β”‚   β”‚   β”œβ”€β”€ import_parser_service.go
β”‚   β”‚   β”œβ”€β”€ import_mapper_service.go
β”‚   β”‚   β”œβ”€β”€ import_service.go
β”‚   β”‚   β”œβ”€β”€ ai_agent_service.go
β”‚   β”‚   └── ...
β”‚   β”œβ”€β”€ middleware/
β”‚   β”‚   β”œβ”€β”€ ratelimit.go         (Rate limiting)
β”‚   β”‚   └── auth.go              (JWT validation)
β”‚   β”œβ”€β”€ database/
β”‚   β”‚   └── database.go          (GORM setup)
β”‚   └── models/
β”‚       └── models.go            (50+ database models)
└── proto/
    └── accounting.proto         (gRPC definitions)

Key Services

Service | Purpose | LOC
ImportParserService | CSV/Excel parsing & validation | 400
ImportMapperService | AI/template column mapping | 500
ImportService | Import workflow orchestration | 400
AIAgentService | Chatbot with function calling | 600
PaymentService | Payment CRUD & reconciliation | 300
JournalService | Double-entry accounting | 450

Database Models (50+)

🎨 Frontend App

Tech Stack

Structure

frontend/src/
β”œβ”€β”€ app/
β”‚   β”œβ”€β”€ (auth)/
β”‚   β”‚   β”œβ”€β”€ login/page.tsx
β”‚   β”‚   └── register/page.tsx
β”‚   └── dashboard/
β”‚       β”œβ”€β”€ page.tsx              (dashboard)
β”‚       β”œβ”€β”€ payments/
β”‚       β”œβ”€β”€ journal-entries/
β”‚       β”œβ”€β”€ accounts/
β”‚       β”œβ”€β”€ data-imports/
β”‚       β”œβ”€β”€ integrations/
β”‚       └── settings/
β”œβ”€β”€ components/
β”‚   β”œβ”€β”€ ui/                       (shadcn components)
β”‚   β”œβ”€β”€ dashboard/
β”‚   β”œβ”€β”€ data-imports/
β”‚   └── pitch-deck/
β”œβ”€β”€ lib/
β”‚   β”œβ”€β”€ connect-rpc-browser.ts   (gRPC client)
β”‚   β”œβ”€β”€ rate-limit-handler.ts
β”‚   └── api/
β”‚       └── imports.ts
β”œβ”€β”€ hooks/
β”‚   └── use-imports.ts           (React hooks)
└── gen/
    β”œβ”€β”€ accounting_pb.ts         (Protobuf messages)
    └── accounting_connect.ts    (gRPC service)

πŸ—„οΈ Database Schema

Core Tables

-- Tenants & Users
tenants (id, name, base_currency, created_at)
users (id, email, password_hash, created_at)
tenant_memberships (tenant_id, user_id, role)

-- Accounting
accounts (id, tenant_id, name, type, code, parent_id)
transactions (id, tenant_id, amount, date, type, status)
payments (id, tenant_id, amount, date, type, payment_method)
journal_entries (id, tenant_id, date, description, status)
journal_lines (entry_id, account_id, debit, credit)

-- Data Imports
import_sessions (id, tenant_id, filename, status, total_rows)
import_field_mappings (id, session_id, source_column, target_field)
import_errors (id, session_id, row_number, error_message)
import_templates (id, name, provider, column_mappings)
mapping_cache (cache_key, mapping_result, expires_at)

-- Integrations
integrations (id, tenant_id, provider, status, credentials)
external_transaction_sources (integration_id, external_id, transaction_id)

-- AI & Analytics
ai_query_history (id, tenant_id, user_id, query, response)
subscriptions (id, tenant_id, plan, mrr, status)
pitch_data (tenant_id, data_type, content)

πŸš€ Production Deployment

Deployment Script

# Install dependencies
./deploy.sh install

# Deploy to production
./deploy.sh deploy

# Setup SSL certificate
./deploy.sh ssl yourdomain.com

# Status check
./deploy.sh status

Docker Compose Services

Service | Image | Purpose
frontend | node:18-alpine | Next.js app
backend | golang:1.25-alpine | gRPC server
extractor | golang:1.25-alpine | Document AI
postgres | postgres:15-alpine | Database
redis | redis:7-alpine | Cache & rate limiting
nginx | nginx:alpine | Reverse proxy

Environment Variables

# Backend (.env)
DATABASE_URL=postgres://user:pass@postgres:5432/invoice_pro_db
REDIS_URL=redis://redis:6379
JWT_SECRET=your-secret-key
OPENAI_API_KEY=sk-...
PAYSTACK_SECRET_KEY=sk_...

# Frontend (.env.local)
NEXT_PUBLIC_GRPC_WEB_URL=http://localhost:8080
NEXT_PUBLIC_API_URL=http://localhost:3000/api

Resource Limits (Optimized for $6-24/mo droplets)

Service | CPU | Memory
Frontend | 0.5 | 512MB
Backend | 1.0 | 1GB
Extractor | 0.5 | 512MB
PostgreSQL | 1.0 | 1GB
Redis | 0.25 | 256MB
Nginx | 0.25 | 128MB

πŸ“Š Monitoring

Health Checks

# Check all services
docker-compose ps

# View logs
docker-compose logs -f backend
docker-compose logs -f frontend

# Database connection
docker-compose exec postgres psql -U postgres -d invoice_pro_db

# Redis status
docker-compose exec redis redis-cli INFO

Key Metrics

⚠️ Important:
  • Always backup PostgreSQL before major updates
  • Monitor disk usage (imports create temp files)
  • Set up log rotation for Docker containers
  • Test SSL renewal process (Let's Encrypt auto-renews)

πŸ›‘οΈ Fraud Detection API Standard

Industry Standards Compliance:
The Fraud Detection API follows industry best practices from ISO 31000 (Risk Management), ACFE (Association of Certified Fraud Examiners), and PCI DSS fraud detection guidelines.

Response Structure Overview

All verification responses follow a standardized structure with clear decisions, signals, and recommendations:

message VerificationResult {
  // Identifiers
  string verification_id = 1;
  string document_hash_id = 2;
  
  // Decision (ACCEPT/REVIEW/REJECT)
  VerificationDecision decision = 5;
  string recommendation = 6;
  
  // Risk Assessment
  double risk_score = 7;        // 0.0 to 1.0
  string risk_level = 8;        // low, medium, high, critical
  double confidence = 9;        // 0.0 to 1.0
  
  // Individual fraud indicators
  repeated DetectionSignal signals = 10;
  
  // Duplicate detection results
  DuplicateAnalysis duplicate_analysis = 11;
  
  // Visual similarity (recurring vendors)
  SimilarityAnalysis similarity_analysis = 12;
  
  // Processing metadata
  ProcessingInfo processing_info = 13;
}

Decision Types

Decision | Risk Level | Action | Use Case
ACCEPT | LOW | Process automatically | Document verified, no risk detected
REVIEW | MEDIUM/HIGH | Manual review required | Elevated risk, human verification needed
REJECT | CRITICAL | Block/reject document | Confirmed duplicate or critical fraud indicators

Detection Signals

Each signal represents an individual fraud indicator with its own confidence and risk contribution:

message DetectionSignal {
  string signal_type = 1;        // duplicate_hash, visual_similarity, etc.
  string category = 2;           // duplicate_detection, image_forensics, etc.
  string severity = 3;           // info, low, medium, high, critical
  string description = 4;        // Human-readable description
  double confidence = 5;         // 0.0 to 1.0
  double risk_contribution = 6;  // How much this adds to risk score
  map<string, string> metadata = 7;
  string phase = 8;              // Which phase detected this signal
}

Signal Categories

Common Signal Types

Signal Type | Severity | Risk | Description
duplicate_document | CRITICAL | 0.6 | Confirmed duplicate (metadata verified)
duplicate_invoice_number | CRITICAL | 0.6 | Invoice number already exists in system
visual_similarity | INFO | 0.0 | Similar to previous doc (likely recurring vendor)
copy_move_detected | HIGH | 0.4 | Cloned regions found (possible manipulation)
ela_suspicious | MEDIUM | 0.3 | Error Level Analysis indicates editing
no_exif_data | LOW | 0.1 | Image missing EXIF metadata
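
The sketch below shows one way the per-signal risk_contribution values could roll up into risk_score and a decision. The thresholds are illustrative assumptions, not the documented scoring rules, and DetectionSignal refers to the generated proto message defined above.

func scoreAndDecide(signals []DetectionSignal, isDuplicate bool) (float64, string) {
    risk := 0.0
    for _, s := range signals {
        risk += s.RiskContribution // each signal adds its contribution
    }
    if risk > 1.0 {
        risk = 1.0
    }
    switch {
    case isDuplicate:
        return 0.95, "REJECT" // confirmed duplicates are rejected outright
    case risk >= 0.5:
        return risk, "REVIEW" // elevated risk needs human verification
    default:
        return risk, "ACCEPT"
    }
}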

Duplicate vs Similarity Analysis

DuplicateAnalysis - True Duplicates

Confirmed duplicates that should be rejected:

message DuplicateAnalysis {
  bool is_duplicate = 1;
  string detection_method = 2;        // "exact_hash", "invoice_number", "metadata_verified"
  repeated string duplicate_document_ids = 3;
  double match_confidence = 4;        // Typically 0.95+ for true duplicates
  string explanation = 5;
}
⚠️ True Duplicate Criteria:
  • Exact Hash Match: SHA-256 identical (100% same file)
  • Invoice Number Match: Same invoice number in database
  • Metadata Verified: Visual similarity + same amount + same date + same reference number

SimilarityAnalysis - Recurring Vendors

Visual similarity that's NOT a duplicate (legitimate recurring transactions):

message SimilarityAnalysis {
  bool has_similar_documents = 1;
  int32 similar_document_count = 2;
  repeated SimilarDocument similar_documents = 3;
  string interpretation = 4;          // "recurring_vendor", "same_template"
  string recommendation = 5;
}

message SimilarDocument {
  string document_id = 1;
  double visual_similarity_score = 2;  // 0.0 to 1.0
  int32 hamming_distance = 3;          // Perceptual hash distance
  DifferentiatingFactors differentiating_factors = 4;
}

Differentiating Factors

Shows WHY visually similar documents are NOT duplicates:

message DifferentiatingFactors {
  bool different_invoice_numbers = 1;
  bool different_amounts = 2;
  bool different_dates = 3;
  string amount_difference = 4;       // "$52.00 vs $26.00"
  string date_difference = 5;         // "Oct 2025 vs Dec 2025"
  string invoice_numbers = 6;         // "#256348 vs #284081"
}
πŸ’‘ Key Insight: Visual similarity (same invoice template/layout) is DIFFERENT from duplicates. Recurring vendors use the same template but with different transaction details. The system detects similarity but checks metadata to confirm it's a unique transaction.
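
Conceptually, the metadata verification step can be sketched as below. The 0.90 similarity threshold and the field names are assumptions used only to illustrate why a matching template alone never triggers a rejection.

// Minimal types for illustration only.
type DocMetadata struct {
    InvoiceNumber string
    Amount        float64
    Date          string // e.g. "2025-10-03"
}

// isTrueDuplicate: visual similarity is only treated as a duplicate when the
// extracted invoice number, amount, and date all match as well.
func isTrueDuplicate(visualSimilarity float64, a, b DocMetadata) bool {
    if visualSimilarity < 0.90 {
        return false // not similar enough to even be a candidate
    }
    return a.InvoiceNumber == b.InvoiceNumber &&
        a.Amount == b.Amount &&
        a.Date == b.Date
}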

Processing Info

Multi-phase forensics tracking:

message ProcessingInfo {
  repeated string phases_completed = 1;   // ["hash", "ocr", "image_forensics"]
  repeated string phases_pending = 2;     // ["deep_analysis"]
  int64 processing_duration_ms = 3;
  string processed_by = 4;                // Service version
  map<string, string> phase_timings = 5;
}

Forensics Phases (Multi-Phase Architecture)

  1. Phase 1 - Hash Analysis: SHA-256 + perceptual hash (exact + visual similarity)
  2. Phase 2 - OCR Extraction: Invoice number, amount, merchant (docTR/PaddleOCR)
  3. Phase 3 - Image Forensics: EXIF, ELA, copy-move detection (if image)
  4. Phase 4 - PDF Forensics: Metadata, structure, timestamps (if PDF)
  5. Phase 5 - Metadata Verification: Cross-reference to determine true duplicates
  6. Phase 6 - Deep Analysis (Future): ML-based anomaly detection, behavioral patterns

Example Responses

Scenario 1: Confirmed Duplicate (REJECT)

{
  "verification_id": "abc123...",
  "decision": "REJECT",
  "recommendation": "REJECT: This document is a confirmed duplicate. Do not process to avoid duplicate payments.",
  "risk_score": 0.95,
  "risk_level": "critical",
  "confidence": 0.98,
  
  "signals": [
    {
      "signal_type": "duplicate_invoice_number",
      "category": "duplicate_detection",
      "severity": "critical",
      "description": "Invoice number #284081 already exists in system",
      "confidence": 0.99,
      "risk_contribution": 0.6
    }
  ],
  
  "duplicate_analysis": {
    "is_duplicate": true,
    "detection_method": "invoice_number_match",
    "duplicate_document_ids": ["def456..."],
    "match_confidence": 0.99,
    "explanation": "Duplicate invoice number detected: #284081"
  },
  
  "similarity_analysis": {
    "has_similar_documents": false
  }
}

Scenario 2: Recurring Vendor (ACCEPT)

{
  "verification_id": "xyz789...",
  "decision": "ACCEPT",
  "recommendation": "ACCEPT: Visual similarity detected with previous documents (likely recurring vendor/template), but metadata confirms this is a unique transaction. Safe to process.",
  "risk_score": 0.15,
  "risk_level": "low",
  "confidence": 0.92,
  
  "signals": [],
  
  "duplicate_analysis": {
    "is_duplicate": false,
    "detection_method": "metadata_verified_unique",
    "match_confidence": 0.0,
    "explanation": "No duplicates detected. Document is unique."
  },
  
  "similarity_analysis": {
    "has_similar_documents": true,
    "similar_document_count": 2,
    "similar_documents": [
      {
        "document_id": "abc123...",
        "visual_similarity_score": 0.92,
        "hamming_distance": 3,
        "differentiating_factors": {
          "different_invoice_numbers": true,
          "different_amounts": true,
          "different_dates": true,
          "amount_difference": "$52.00 vs $26.00",
          "date_difference": "Oct 2025 vs Dec 2025",
          "invoice_numbers": "#256348 vs #284081"
        }
      }
    ],
    "interpretation": "recurring_vendor",
    "recommendation": "Visual similarity detected with previous documents. This appears to be from the same vendor/template with different transaction details. Document is unique and safe to process."
  }
}

Scenario 3: Medium Risk (REVIEW)

{
  "verification_id": "review123...",
  "decision": "REVIEW",
  "recommendation": "REVIEW: Elevated risk detected (medium). Please verify document authenticity before processing.",
  "risk_score": 0.55,
  "risk_level": "medium",
  "confidence": 0.78,
  
  "signals": [
    {
      "signal_type": "ela_suspicious",
      "category": "image_forensics",
      "severity": "medium",
      "description": "Error Level Analysis indicates possible manipulation",
      "confidence": 0.75,
      "risk_contribution": 0.3
    },
    {
      "signal_type": "no_exif_data",
      "category": "image_forensics",
      "severity": "low",
      "description": "Image missing EXIF metadata",
      "confidence": 0.9,
      "risk_contribution": 0.1
    }
  ],
  
  "duplicate_analysis": {
    "is_duplicate": false
  },
  
  "similarity_analysis": {
    "has_similar_documents": false
  }
}

Frontend Integration

TypeScript Client Usage

import { FraudDetectionApiClient } from '@/lib/fraud-detection/api-client';

// Upload document for verification
const result = await fraudClient.verifyDocument(file, {
  gdprConsent: true,
  consentFields: {
    purpose: 'fraud_detection',
    retention: '90_days'
  }
});

// Handle decision
switch (result.decision) {
  case 'ACCEPT':
    // Process document automatically
    console.log('βœ…', result.recommendation);
    await processDocument(result.verificationId);
    break;
    
  case 'REVIEW':
    // Flag for manual review
    console.log('⚠️', result.recommendation);
    await flagForReview(result.verificationId);
    break;
    
  case 'REJECT':
    // Block document
    console.log('🚫', result.recommendation);
    await rejectDocument(result.verificationId);
    break;
}

// Show similarity info if present
if (result.similarityAnalysis?.hasSimilarDocuments) {
  console.log('ℹ️ Recurring vendor detected:', 
    result.similarityAnalysis.recommendation);
}

UI Display Patterns

Recommended UI Flow:
  1. Primary Display: Show decision badge (ACCEPT/REVIEW/REJECT) with color coding
  2. Recommendation: Display the recommendation text prominently
  3. Risk Indicator: Show risk level and score as visual gauge
  4. Signals Section: Expandable list of individual fraud indicators
  5. Similarity Alert: If visual similarity detected, show info banner explaining recurring vendor
  6. Differentiating Factors: Show why similar documents are NOT duplicates

Best Practices

Implementation Guidelines:
  • Trust the Decision: Use the decision field as primary guidance (ACCEPT/REVIEW/REJECT)
  • Show Context: Always display the recommendation text to explain WHY a decision was made
  • Distinguish Duplicates from Similarity: Don't confuse recurring vendors with true duplicates
  • Educate Users: Explain that visual similarity (same template) is normal for recurring vendors
  • Track Phases: Show which forensics phases have completed for transparency
  • Signal Transparency: Display individual signals so users understand risk composition
  • Confidence Scores: Show confidence alongside decisions for reliability context

Future Phases (Roadmap)

Logging & Observability

Grafana Loki + Promtail + Grafana - Production-grade centralized logging system

Architecture Overview

Keepfig uses industry-standard logging stack for centralized log management:

Component | Purpose | Port | Resources
Loki | Log aggregation storage | 3100 | 0.5 CPU, 512MB RAM
Promtail | Log collector (Docker) | - | 0.25 CPU, 128MB RAM
Grafana | Visualization dashboard | 3000 | 0.5 CPU, 512MB RAM

Features

Quick Start

Development

# Start all services including logging
docker-compose up -d

# Access Grafana
open http://localhost:3000
# Login: admin / admin

Production

# Set secure Grafana password
export GRAFANA_ADMIN_PASSWORD="your-secure-password"

# Deploy with logging
docker-compose -f docker-compose.prod.yml up -d

# Access Grafana
open http://your-server:3000

Using Grafana

Pre-configured Dashboard

Navigate to: Dashboards β†’ Invoice Pro - Application Logs

Dashboard Panels:

LogQL Query Examples

# All backend logs
{service="backend"}

# Fraud detection logs
{service="backend"} |~ "VerifyDocument|GetVerificationResult"

# Currency conversion logs
{service="backend"} |~ "convertToBaseCurrency|Currency"

# OCR service logs
{service="ocr-service"}

# All errors across all services
{project="invoice-pro"} |~ "ERROR|CRITICAL|Failed"

# Search by verification ID (distributed tracing)
{service="backend"} |~ "VerificationID=abc-123-def-456"

# Count errors per service
sum by (service) (count_over_time({project="invoice-pro"} |~ "ERROR" [5m]))

Log Retention

Adjust Retention: Edit loki-config.yaml

limits_config:
  retention_period: 720h  # Change to desired hours

Security Best Practices

Production Security Checklist:
  1. Change Grafana Password:
    # In .env or docker-compose.prod.yml
    - GF_SECURITY_ADMIN_PASSWORD=your-strong-password
  2. Restrict Grafana Port:
    ports:
      - "127.0.0.1:3000:3000"  # Localhost only
  3. Use Reverse Proxy: Setup nginx with SSL for Grafana access
  4. Enable OAuth: Integrate with existing auth (OAuth, LDAP, SAML)

Troubleshooting

Logs Not Appearing

# Check Promtail is running
docker-compose ps promtail
docker-compose logs promtail

# Verify Loki is accessible
curl http://localhost:3100/ready
# Should return: ready

# Check Docker labels
docker inspect invoice-pro-backend-1 | grep com.docker.compose.service

Grafana Can't Connect to Loki

# Test network connectivity
docker-compose exec grafana ping loki

# Check datasource in Grafana UI
# Configuration β†’ Data Sources β†’ Loki
# URL should be: http://loki:3100

High Disk Usage

# Check Loki storage
docker-compose exec loki du -sh /loki

# Reduce retention in loki-config.yaml
# Then restart
docker-compose restart loki

OCR Service Build Timeout

Models are lazy-loaded at startup (not during build) to avoid GitHub Actions timeout:

# Check OCR service status
docker compose -f docker-compose.prod.yml ps ocr-service

# Watch model download during first startup
docker compose -f docker-compose.prod.yml logs ocr-service -f
# Look for: "Initializing CPU-optimized docTR OCR predictor (first request)..."
# Look for: "CPU-optimized docTR OCR predictor initialized successfully"

# Test health endpoint (should return after model download)
curl http://localhost:8001/health
# Should return: {"status": "healthy"}

Integration with Application

Backend (Go)

Logs automatically collected via stdout/stderr. Use structured logging:

log.Printf("[VerifyDocument] TenantID=%s, Amount=%.2f %s, Decision=%s",
    tenantID, amount, currency, decision)

OCR Service (Python)

import logging
logger = logging.getLogger(__name__)
logger.info(f"Detected currency: {currency} with confidence {confidence}")

Add Correlation IDs

log.Printf("[RequestID=%s] Processing verification...", requestID)

# Search in Grafana:
{service="backend"} |~ "RequestID=abc-123"

Cost Analysis

Solution | Monthly Cost | Notes
Loki (Self-hosted) | $0 + hosting | Free, ~$5/month infrastructure
Datadog | $31/host | Commercial SaaS
New Relic | $25/month | Commercial SaaS
Loggly | $79/month | Commercial SaaS

Annual Savings: $300-900 compared to commercial alternatives

Recent Updates (December 2025)

Universal Currency Detection

Status: Production Ready | Supports 180+ currencies worldwide

Enhancements

Example Supported Currencies

Europe: EUR €, GBP Β£, CHF, SEK kr, NOK kr, DKK kr, PLN zΕ‚
Africa: NGN ₦, ZAR R, GHS β‚΅, KES KSh, EGP Β£
Asia: JPY Β₯, CNY Β₯, INR β‚Ή, KRW β‚©, THB ΰΈΏ, SGD $, HKD $
Americas: USD $, CAD $, BRL R$, MXN $, ARS $, CLP $
Middle East: AED Ψ―.Ψ₯, SAR Ψ±.Ψ³, ILS β‚ͺ

How It Works

import re

def detect_currency(text: str) -> str:
    text_upper = text.upper()
    # 1. Search for any 3-letter ISO code pattern
    currency_pattern = r'\b([A-Z]{3})\b'
    matches = re.findall(currency_pattern, text_upper)
    
    # 2. Prioritize common currencies if multiple found
    # 3. Fallback to symbol detection (€, $, Β£, Β₯, β‚Ή, β‚½, ₦, etc.)
    # 4. Context-aware disambiguation for ambiguous symbols

Files Updated: ocr-service/app/main.py, backend/internal/accountingserver/forensics.go

Enhanced Similar Documents Display

Status: Production Ready | Complete document hash visibility

New Information Displayed

When similar documents are detected (not duplicates), the UI now shows:

Visual Design

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ Similar Documents Found:                       β”‚
β”œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€
β”‚ Document ID: abc-123-def-456...                β”‚
β”‚ Visual Similarity: 87%                         β”‚
β”‚                                                β”‚
β”‚ SHA-256 Hash: a1b2c3d4e5f6...                 β”‚
β”‚ Perceptual Hash: 1a2b3c4d5e6f...              β”‚
β”‚ Hamming Distance: 8 (Similar)                  β”‚
β”‚                                                β”‚
β”‚ Uploaded: Dec 3, 2025, 10:30 AM               β”‚
β”‚                                                β”‚
β”‚ βœ“ Why This Is NOT a Duplicate:                β”‚
β”‚   βœ“ Different Invoice Numbers                 β”‚
β”‚     #256348 vs #284081                         β”‚
β”‚   βœ“ Different Amounts                          β”‚
β”‚     $52.00 vs $26.00                           β”‚
β”‚   βœ“ Different Dates                            β”‚
β”‚     Oct 2025 vs Dec 2025                       β”‚
β”‚                                                β”‚
β”‚ View Full Details β†’                            β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜

Files Updated: proto/forensics.proto, forensics.go, use-fraud-detection.ts, fraud-check-results.tsx

Deployment Checklist

GitHub Actions Deployment:

The deployment workflow automatically handles all services including the new logging stack (Loki, Promtail, Grafana) and OCR service.

Required GitHub Secrets

Secret Name | Description | Template File
ENV_POSTGRES | PostgreSQL configuration | .env.postgres.prod
ENV_GRAFANA | Grafana admin credentials | .env.grafana.prod
ENV_BACKEND | Backend service config | .env.backend.prod
ENV_EXTRACTOR | Document extractor config | .env.extractor.prod
DEPLOY_HOST | Server IP address | e.g., 123.45.67.89
DEPLOY_USER | SSH username | e.g., root
DEPLOY_SSH_KEY | Private SSH key | Full private key content

Deployment Steps

  1. Configure GitHub Secrets:
    • Go to repository Settings β†’ Secrets and variables β†’ Actions
    • Add each secret from the table above
    • Use strong passwords for ENV_POSTGRES and ENV_GRAFANA
  2. Push to Main: Automatic deployment triggers on push to main branch
  3. Verify Services:
    ssh root@your-server
    cd ~/invoice-pro
    ./deploy.sh monitor  # Check all services are running
  4. Access Grafana: http://your-server:3001 (admin credentials from ENV_GRAFANA)
  5. Test Logging: Upload invoice β†’ Check logs in Grafana dashboard

Manual Deployment (Optional)

# SSH into server
ssh root@your-server

# Navigate to project
cd ~/invoice-pro

# Pull latest changes
git pull origin main

# Deploy all services
./deploy.sh deploy

# Monitor health
./deploy.sh monitor

Services Deployed

Post-Deployment Verification

  1. Regenerate Protobuf: make proto-go
  2. Rebuild Services: docker-compose build backend ocr-service
  3. Deploy: docker-compose up -d
  4. Verify Grafana: open http://localhost:3001
  5. Test Currency Detection: Upload invoices with various currencies
  6. Test Similar Documents: Upload recurring vendor invoices

Invoice Pro Documentation | Complete Implementation Guide | 2025
For detailed rate limiting documentation, see rate-limiting.html