This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
```bash
go build -ldflags="-w -s" -o bank-transactions main.go database.go models.go parser.go categorization.go

# Run all unit tests
go test ./tests/unit/...

# Run specific test suites
go test ./tests/unit/ -run TestDirectoryIngestion
go test ./tests/unit/ -run TestFullIntegration
go test ./tests/unit/ -run TestTransactionsQueryOutput

# Run individual test files
go test -v ./tests/unit/integration_test.go
go test -v ./tests/unit/transactions_test.go
go test -v ./tests/unit/directory_ingestion_test.go
go test -v ./tests/unit/s3_functionality_test.go
go test -v ./tests/unit/db_file_flag_test.go
```

```bash
# Build Lambda function
./infrastructure/build-lambda.sh

# Deploy infrastructure
./infrastructure/deploy.sh

# Test deployed infrastructure
./infrastructure/test.sh

# Destroy infrastructure
./infrastructure/destroy.sh
```

This is a Go CLI tool for parsing and storing bank transaction data with AWS infrastructure for automated processing.
- main.go: CLI interface using Cobra framework with commands for user/account management, ingestion, and S3 operations
- database.go: SQLite database setup, schema management, and data operations
- models.go: Data structures for User, BankAccount, and BankTransaction entities
- parser.go: Parser interface and implementations for different bank formats (currently TestBank)
- categorization.go: Transaction categorization engine for automated classification
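For orientation, here is a minimal, hypothetical sketch of keyword-based categorization. The rules, names, and matching logic are assumptions for illustration; the real engine lives in categorization.go and may work differently.

```go
package main

import (
	"fmt"
	"strings"
)

// categoryRules maps lowercase keywords to category names. These rules are
// made up for illustration; categorization.go may use different rules,
// data sources, and matching logic.
var categoryRules = map[string]string{
	"grocery": "Groceries",
	"salary":  "Income",
	"rent":    "Housing",
}

// categorizeDescription returns a category for a transaction description,
// falling back to "Uncategorized" when no keyword matches.
func categorizeDescription(description string) string {
	lower := strings.ToLower(description)
	for keyword, category := range categoryRules {
		if strings.Contains(lower, keyword) {
			return category
		}
	}
	return "Uncategorized"
}

func main() {
	fmt.Println(categorizeDescription("ACME GROCERY STORE 123")) // Groceries
}
```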
SQLite database with three main tables:
- `users`: User information with UUID primary keys
- `bank_accounts`: Bank account details linked to users
- `bank_transactions`: Transaction records with hash-based idempotency
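A rough sketch of how this layout might be created from Go is below. Only the table names and documented keys come from this file; the remaining columns and the SQLite driver choice are assumptions, and the actual schema lives in database.go.

```go
package main

import (
	"database/sql"
	"log"

	_ "github.com/mattn/go-sqlite3" // assumed driver; database.go may use another
)

// openAndMigrate opens the SQLite file and creates the three documented
// tables. Column sets beyond the documented keys are illustrative only.
func openAndMigrate(path string) (*sql.DB, error) {
	db, err := sql.Open("sqlite3", path)
	if err != nil {
		return nil, err
	}
	schema := `
CREATE TABLE IF NOT EXISTS users (
    id   TEXT PRIMARY KEY, -- UUID
    name TEXT NOT NULL
);
CREATE TABLE IF NOT EXISTS bank_accounts (
    id       TEXT PRIMARY KEY,
    user_id  TEXT NOT NULL REFERENCES users(id),
    nickname TEXT
);
CREATE TABLE IF NOT EXISTS bank_transactions (
    hash         TEXT PRIMARY KEY, -- SHA-256 hash provides idempotency
    account_id   TEXT NOT NULL REFERENCES bank_accounts(id),
    date         TEXT NOT NULL,
    description  TEXT,
    amount_cents INTEGER NOT NULL
);`
	if _, err := db.Exec(schema); err != nil {
		db.Close()
		return nil, err
	}
	return db, nil
}

func main() {
	db, err := openAndMigrate("bank_transactions.db")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()
}
```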
Hierarchical command structure with subcommands for better organization:
User Management:
- `users list`: List all users (read-only)
- `users create`: Create users (S3 recommended for regular use, local DB discouraged)
Bank Account Management:
- `bank-accounts list`: List bank accounts with filters and nicknames (replaces the deprecated `banks` command)
- `bank-accounts create`: Create bank account structures in S3
Core Operations:
- `upload`: Upload transaction files to S3
- `download`: Download processed databases from S3
- `ingest`: Process transaction files into database (hidden, Lambda use only)
- `transactions`: Query transactions with filters
Important Notes:
- The `banks` command has been removed; use `bank-accounts list` instead
- Local database mutations are discouraged for CLI use; prefer S3 operations
- The `ingest` command is hidden and intended for AWS Lambda internal use only
- For regular use, upload files to S3 rather than using local ingestion
Commands support `--db-file` for reads and `--bucket-name` for S3 operations (mutually exclusive).
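One way that exclusivity can be expressed with Cobra is `MarkFlagsMutuallyExclusive`, sketched below; main.go may enforce it differently, and the command body here is a placeholder.

```go
package main

import (
	"fmt"
	"os"

	"github.com/spf13/cobra"
)

// newTransactionsCmd sketches a command that accepts either --db-file or
// --bucket-name, but never both. The real flag wiring lives in main.go.
func newTransactionsCmd() *cobra.Command {
	cmd := &cobra.Command{
		Use:   "transactions",
		Short: "Query transactions with filters",
		RunE: func(cmd *cobra.Command, args []string) error {
			dbFile, _ := cmd.Flags().GetString("db-file")
			bucket, _ := cmd.Flags().GetString("bucket-name")
			fmt.Println("db-file:", dbFile, "bucket-name:", bucket)
			return nil
		},
	}
	cmd.Flags().String("db-file", "", "path to a local SQLite database (reads)")
	cmd.Flags().String("bucket-name", "", "S3 bucket to operate against")
	// Cobra rejects invocations that set both flags.
	cmd.MarkFlagsMutuallyExclusive("db-file", "bucket-name")
	return cmd
}

func main() {
	if err := newTransactionsCmd().Execute(); err != nil {
		os.Exit(1)
	}
}
```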
Three-bucket architecture:
- Users bucket: User definitions as JSON files
- Exports bucket: Bank account structures with transaction files
- Transactions bucket: Processed SQLite databases
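As an illustration of the users bucket, here is a hedged sketch of uploading a user definition with aws-sdk-go-v2. The bucket name is a placeholder and the `users/<id>.json` key layout is an assumption; the CLI's real upload paths may differ.

```go
package main

import (
	"bytes"
	"context"
	"encoding/json"
	"log"

	"github.com/aws/aws-sdk-go-v2/aws"
	"github.com/aws/aws-sdk-go-v2/config"
	"github.com/aws/aws-sdk-go-v2/service/s3"
)

// uploadUserDefinition writes a user definition as JSON into the users
// bucket. The "users/<id>.json" key layout is an assumption for this sketch.
func uploadUserDefinition(ctx context.Context, bucket, userID string, user any) error {
	cfg, err := config.LoadDefaultConfig(ctx) // honours AWS_PROFILE / env credentials
	if err != nil {
		return err
	}
	body, err := json.Marshal(user)
	if err != nil {
		return err
	}
	client := s3.NewFromConfig(cfg)
	_, err = client.PutObject(ctx, &s3.PutObjectInput{
		Bucket: aws.String(bucket),
		Key:    aws.String("users/" + userID + ".json"),
		Body:   bytes.NewReader(body),
	})
	return err
}

func main() {
	// Placeholder bucket name; the real buckets are created by infrastructure/terraform.
	user := map[string]string{"id": "example-user", "name": "Example User"}
	if err := uploadUserDefinition(context.Background(), "example-users-bucket", "example-user", user); err != nil {
		log.Fatal(err)
	}
}
```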
Lambda function processes ALL accounts on any file upload, creating complete database snapshots.
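A minimal sketch of that handler shape, assuming aws-lambda-go and an S3 trigger; `rebuildAllAccounts` is a hypothetical placeholder for the snapshot rebuild, not the real function name.

```go
package main

import (
	"context"
	"log"

	"github.com/aws/aws-lambda-go/events"
	"github.com/aws/aws-lambda-go/lambda"
)

// rebuildAllAccounts is a hypothetical placeholder for the full-snapshot
// rebuild described above; the real logic lives in the Lambda code.
func rebuildAllAccounts(ctx context.Context) error { return nil }

// handler reacts to any S3 upload by rebuilding the complete database,
// rather than applying an incremental update for the uploaded file alone.
func handler(ctx context.Context, event events.S3Event) error {
	for _, record := range event.Records {
		log.Printf("upload detected: s3://%s/%s", record.S3.Bucket.Name, record.S3.Object.Key)
	}
	return rebuildAllAccounts(ctx)
}

func main() {
	lambda.Start(handler)
}
```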
Extensible parser interface supporting:
- TestBank: Simple CSV format
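The interface in parser.go may differ, but the idea is roughly the sketch below, with an assumed record struct and a simple CSV-backed TestBank implementation; the column layout (date, description, amount in cents) is an assumption.

```go
package main

import (
	"encoding/csv"
	"fmt"
	"io"
	"log"
	"strconv"
	"strings"
)

// BankTransactionRecord is an assumed minimal shape for a parsed row; the
// real struct lives in models.go and likely has more fields.
type BankTransactionRecord struct {
	Date        string
	Description string
	AmountCents int64
}

// Parser is roughly the kind of interface parser.go describes: each bank
// format provides an implementation, and new formats are added by adding one.
type Parser interface {
	Name() string
	Parse(r io.Reader) ([]BankTransactionRecord, error)
}

// TestBankParser handles the simple TestBank CSV format described above.
type TestBankParser struct{}

func (TestBankParser) Name() string { return "TestBank" }

func (TestBankParser) Parse(r io.Reader) ([]BankTransactionRecord, error) {
	rows, err := csv.NewReader(r).ReadAll()
	if err != nil {
		return nil, err
	}
	records := make([]BankTransactionRecord, 0, len(rows))
	for _, row := range rows {
		if len(row) < 3 {
			continue
		}
		cents, err := strconv.ParseInt(row[2], 10, 64)
		if err != nil {
			return nil, err
		}
		records = append(records, BankTransactionRecord{Date: row[0], Description: row[1], AmountCents: cents})
	}
	return records, nil
}

func main() {
	input := "2024-01-02,Coffee,450\n2024-01-03,Salary,250000\n"
	records, err := TestBankParser{}.Parse(strings.NewReader(input))
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(records)
}
```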
Tests located in tests/unit/ directory:
- `integration_test.go`: Full end-to-end testing
- `directory_ingestion_test.go`: Directory-based ingestion tests
- `transactions_test.go`: Query operation tests
- `s3_functionality_test.go`: S3 integration tests
- `db_file_flag_test.go`: Database file flag tests
Test data in tests/data/ with sample account structures and transaction files.
- Idempotent ingestion prevents duplicate transactions using SHA-256 hashes (see the sketch after this list)
- AWS credentials required for S3 operations (via AWS_PROFILE or environment variables)
- Database path resolution: `--db-file` flag > `DB_PATH` env var > default `bank_transactions.db`
- Lambda processes complete account snapshots, not incremental updates
- All S3 operations are mutually exclusive with local `--db-file` operations
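A hedged sketch of the hash-based idempotency follows. The exact fields hashed by the real code are an assumption, the package name is hypothetical, and the column names follow the schema sketch above rather than the actual database.go schema.

```go
// Package ingest is a hypothetical package name for this sketch.
package ingest

import (
	"crypto/sha256"
	"database/sql"
	"encoding/hex"
	"fmt"
)

// transactionHash derives a stable key from the fields that identify a
// transaction. The exact field set hashed by the real code is an assumption.
func transactionHash(accountID, date, description string, amountCents int64) string {
	sum := sha256.Sum256([]byte(fmt.Sprintf("%s|%s|%s|%d", accountID, date, description, amountCents)))
	return hex.EncodeToString(sum[:])
}

// insertIdempotent skips rows whose hash already exists, so re-ingesting the
// same file leaves the database unchanged.
func insertIdempotent(db *sql.DB, accountID, date, description string, amountCents int64) error {
	hash := transactionHash(accountID, date, description, amountCents)
	_, err := db.Exec(
		`INSERT OR IGNORE INTO bank_transactions (hash, account_id, date, description, amount_cents)
		 VALUES (?, ?, ?, ?, ?)`,
		hash, accountID, date, description, amountCents,
	)
	return err
}
```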
CRITICAL: Always run tests before creating PRs
- Run `go test ./tests/unit/...` after any code changes
- Fix all failing tests before committing
- Test command parsing is particularly sensitive - ensure subcommands use separate arguments
- Example: Use `["users", "create"]`, not `["users create"]`, in test commands
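A sketch of what that looks like in a test; the test name is hypothetical, and it assumes the test build step has produced `./bank-transactions` in this directory.

```go
package unit

import (
	"os/exec"
	"testing"
)

// TestUsersCreateHelp is a hypothetical test showing the separate-argument
// rule: each subcommand is its own argv entry.
func TestUsersCreateHelp(t *testing.T) {
	// Correct: "users" and "create" are separate arguments.
	cmd := exec.Command("./bank-transactions", "users", "create", "--help")
	if out, err := cmd.CombinedOutput(); err != nil {
		t.Fatalf("users create --help failed: %v\n%s", err, out)
	}

	// Wrong: exec.Command("./bank-transactions", "users create") would pass
	// "users create" as a single (unknown) command name and fail to parse.
}
```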
When modifying CLI commands:
- Update main.go - Modify command definitions and subcommand structure
- Update all test files - Search for old command usage in `tests/unit/`
- Update documentation - Update both CLAUDE.md and README.md
- Test thoroughly - Run full test suite multiple times
- Verify manually - Test CLI commands manually with `--help` flags
- Follow existing Cobra command patterns for consistency
- Use subcommands for related functionality (e.g., `users list`, `users create`); a sketch follows this list
- Maintain backward compatibility in functionality, even if command structure changes
- Preserve all existing flags and their behavior
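The sketch below shows the subcommand grouping pattern only; these are not the actual command definitions in main.go, and the command bodies are placeholders.

```go
package main

import (
	"fmt"

	"github.com/spf13/cobra"
)

// newUsersCmd groups related functionality under one parent command,
// mirroring the users list / users create structure described above.
func newUsersCmd() *cobra.Command {
	users := &cobra.Command{Use: "users", Short: "Manage users"}

	users.AddCommand(&cobra.Command{
		Use:   "list",
		Short: "List all users (read-only)",
		RunE: func(cmd *cobra.Command, args []string) error {
			fmt.Println("listing users...")
			return nil
		},
	})

	users.AddCommand(&cobra.Command{
		Use:   "create",
		Short: "Create users (S3 recommended for regular use)",
		RunE: func(cmd *cobra.Command, args []string) error {
			fmt.Println("creating user...")
			return nil
		},
	})

	return users
}

func main() {
	root := &cobra.Command{Use: "bank-transactions"}
	root.AddCommand(newUsersCmd())
	cobra.CheckErr(root.Execute())
}
```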
Branch Management:
- Development is typically done in git worktrees created in advance
- Each worktree should be on its own feature branch
- WARNING: If you find yourself on the `main` branch, STOP immediately
- Display a clear warning that direct pushes to main are not allowed
- Prompt to create a feature branch before proceeding
- Use: `git checkout -b feature/descriptive-name`
Commit Strategy:
- Commit logical units of work separately
- Include test fixes in the same PR as the feature
- Update documentation in the same PR as code changes
- Use descriptive commit messages with implementation details
- IMPORTANT: Do not prompt for commit messages - generate them automatically based on the changes made
Pre-Push Checklist:
- Verify you're on a feature branch (not main)
- Run full test suite: `go test ./tests/unit/...`
- Build verification: `go build -o bank-transactions main.go database.go models.go parser.go categorization.go`
- Manual CLI testing of changed functionality
Pull Request Management:
- NEVER merge PRs without explicit user instruction
- Creating PRs is encouraged for review and testing
- Always wait for explicit merge instruction from the user
- Do not assume merge approval even if workflows pass
Rule Updates:
- When the user says "update your rules", modify the CLAUDE.md file to add the new rule
- All behavioral guidelines should be documented in this file for persistence
Issue Implementation:
- When asked to implement an issue, check out a new feature branch with the issue number if not already on one
- Use format: `git checkout -b feature/issue-N` or `git checkout -b feature/issue-N-brief-description`
- Only create a new branch if currently on main or another non-feature branch
Claude Commands:
- `/understand-architecture`: Quick architecture overview command for new Claude sessions
  - Read this CLAUDE.md file first for project context
  - Read README.md for user-facing documentation
  - Examine main.go for CLI command structure
  - Review models.go for data structures
  - Check categorization.go for business logic
  - Look at infrastructure/terraform/ for AWS setup
  - Review tests/unit/ for test patterns and examples
  - This provides complete context for working with the codebase
- `/implement-issue`: Implement a GitHub issue based on the current branch name
  - Extract issue number from branch name format: `<number>-<name>`
  - Use `/understand-architecture` first to familiarize with the repository
  - Fetch and analyze the GitHub issue with the extracted number
  - CRITICAL: Before implementation, identify clear completion signals (tests, commands, outputs, behaviors)
  - If no good signal exists, prompt the user with suggestions for how to verify the task is complete
  - Build up a verification approach interactively before implementing
  - Never assume or proceed without concrete ways to validate success
  - Implement the required changes following all development guidelines
  - Run the full test suite and fix any failing tests
  - Create a Pull Request when implementation is complete
  - IMPORTANT: Always create a PR at the end of issue implementation
- Test Command Parsing: When updating commands to subcommands, remember to update test files to use separate arguments
- Documentation Sync: Always update both CLAUDE.md and README.md when changing commands
- Test Coverage: Verify that all test scenarios still pass, especially integration tests
- Manual Verification: Always test the actual CLI commands manually after changes
- Tests build their own binaries in the `tests/unit/` directory (see the sketch after this list)
- Debug test failures by checking command argument parsing
- Use the `-v` flag for verbose test output: `go test -v ./tests/unit/...`
- Check that test binaries are built with the latest code changes
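A hedged sketch of a test helper that rebuilds the binary before running CLI tests; the helper name is hypothetical, the `../..` path assumes the repository root is the CLI's main package, and the real helpers in `tests/unit/` may work differently.

```go
package unit

import (
	"os/exec"
	"testing"
)

// buildCLI is a hypothetical helper that rebuilds the CLI binary into the
// test directory so each run exercises the latest code.
func buildCLI(t *testing.T) string {
	t.Helper()
	bin := "./bank-transactions"
	cmd := exec.Command("go", "build", "-o", bin, "../..")
	if out, err := cmd.CombinedOutput(); err != nil {
		t.Fatalf("building CLI failed: %v\n%s", err, out)
	}
	return bin
}
```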
- Use terraform to describe infrastructural resources.
- The only things you need to query directly via the AWS CLI are:
- Logs
- Metrics
- Events
- Contents of S3 buckets
- The rest should be defined/queried via terraform.
- If you are debugging something specific, the AWS CLI may be used, but you MUST have a plan for making it possible to diagnose via terraform in the future.