
kanchokanchev / nginx_fail2ban.md
Created May 19, 2025 14:14
Nginx - Fail2Ban #Nginx_ADMIN #Nginx_FAIL2BAN

🛡️ Securing Nginx with Fail2Ban

This guide walks you through:

  • Installing and configuring Fail2Ban to block SSH attacks and malicious requests to sensitive files

✅ 1. Install Fail2Ban
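A minimal installation sketch, assuming a Debian/Ubuntu host (use dnf or yum on RHEL-based systems):

```bash
# Install Fail2Ban from the distribution repositories
sudo apt update
sudo apt install -y fail2ban

# Enable the service and start it immediately
sudo systemctl enable --now fail2ban

# Confirm it is running
sudo systemctl status fail2ban
```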

kanchokanchev / nginx_log_management_guide.md
Last active May 8, 2025 06:37
Nginx - Log Management Guide #Nginx_ADMIN #Nginx_Log

Nginx Log Management Guide

Efficient log management helps maintain free disk space and improve observability. This guide covers how to find, inspect, and manage old Nginx logs.


📁 1. Find and Manage Old Log Files

🔍 List Compressed Logs Older Than N Days
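A sketch assuming logs live under the default /var/log/nginx; the 30 stands in for N, the age threshold in days:

```bash
# List rotated, gzip-compressed Nginx logs older than 30 days
find /var/log/nginx -name "*.gz" -type f -mtime +30 -ls
```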

kanchokanchev / nginx_log_rotation_&_compression.md
Last active May 8, 2025 07:15
Nginx - Log Rotation & Compression #Nginx_ADMIN #Nginx_Log

Nginx Log Rotation & Compression Setup Guide


Prerequisites

  • Check available disk space
df -h /
  • Verify gzip is installed
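A quick check for the second prerequisite; gzip prints its version if it is present:

```bash
# A "command not found" error here means gzip must be installed first
gzip --version
```
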
kanchokanchev / git_repo_best_practices.md
Created April 7, 2025 12:40
Git Repo - Best Practices #GIT #GitHub #GitBest

GitHub Repo Best Practices

Prerequisites

  • Git version 2.34.0 or higher
  • Git Bash (Windows users)

A. Authenticating with GitHub

Important: ✅ Use SSH when working locally.
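A minimal sketch of the SSH setup; the email address is a placeholder:

```bash
# Generate an Ed25519 key pair (accept the default path when prompted)
ssh-keygen -t ed25519 -C "you@example.com"

# Start the agent and load the new key
eval "$(ssh-agent -s)"
ssh-add ~/.ssh/id_ed25519

# After adding ~/.ssh/id_ed25519.pub to your GitHub account,
# verify that authentication works
ssh -T git@github.com
```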

kanchokanchev / git_permanently_remove_file_git_history.md
Created February 1, 2025 17:20
Git Repo - Permanently Remove A File From Git History #GIT #GIT_HISTORY

How to Permanently Remove a .env File from GitHub History

If you accidentally committed a .env file to your Git repository, you cannot remove it directly through GitHub’s web interface. You must use Git commands to rewrite history.

Step 1: Remove the .env File from All Commits Using git filter-repo (Recommended)

GitHub officially recommends using git filter-repo instead of git filter-branch because it's faster and safer.

1. Install git filter-repo (if not installed)

Check if git filter-repo is installed:
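A sketch of the check, the install, and the removal itself; the remote URL is a placeholder:

```bash
# Check whether git filter-repo is available
git filter-repo --version

# If not, install it (it is distributed as a Python package)
pip install git-filter-repo

# From a fresh clone, remove .env from every commit
git filter-repo --invert-paths --path .env

# filter-repo strips the origin remote as a safety measure;
# re-add it, then force-push the rewritten history
git remote add origin git@github.com:USER/REPO.git
git push origin --force --all
```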

kanchokanchev / dbt_pipeline_aws_s3_lambda_data_validation_control_m.md
Created January 14, 2025 07:42
DBT - Cloud Pipeline Using AWS S3, Lambda With Data Validation and Control-M #DBT #DBT_AWS_S3 #DBT_Pipeline #Control_M #AWS_LAMBDA #AWS_SNS_TOPIC

Multi-Folder S3 to DBT Cloud Pipeline Using Lambda With Data File Validation and Control-M

This guide explains how to set up a pipeline where files uploaded to multiple S3 bucket folders trigger specific DBT jobs in DBT Cloud. The pipeline uses AWS Lambda to validate each uploaded data file, send an email notification when validation fails, and trigger DBT jobs through Control-M.


Prerequisites

1. Store secrets in AWS Secrets Manager: create a secret containing the credentials for the Control-M API.
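A sketch using the AWS CLI; the secret name and JSON fields are illustrative assumptions, not values from this guide:

```bash
# Create a secret holding the Control-M API credentials
aws secretsmanager create-secret \
  --name controlm/api-credentials \
  --secret-string '{"endpoint": "https://controlm.example.com/automation-api", "token": "REPLACE_ME"}'
```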

kanchokanchev / dbt_pipeline_aws_s3_eventbridge_lambda.md
Last active January 14, 2025 07:06
DBT - Cloud Pipeline Using AWS S3, EventBridge and Lambda #DBT #DBT_AWS_S3 #DBT_Pipeline #DBT_AWS_EVENT_BRIDGE #AWS_EVENT_BRIDGE #AWS_LAMBDA

Multi-Folder S3 to DBT Core Pipeline Using EventBridge and Lambda

This guide explains how to set up a pipeline where files uploaded to multiple S3 bucket folders trigger specific DBT jobs in DBT Core. The pipeline uses AWS EventBridge for event routing and AWS Lambda for triggering DBT jobs.


Architecture Overview

  1. S3 Bucket Folders: Up to 7 folders in an S3 bucket (e.g., folder1/, folder2/, ... folder7/).
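
A sketch of the event wiring for one folder, assuming the AWS CLI and placeholder names; the remaining folders follow the same pattern:

```bash
# Send the bucket's S3 events to EventBridge
aws s3api put-bucket-notification-configuration \
  --bucket my-dbt-input-bucket \
  --notification-configuration '{"EventBridgeConfiguration": {}}'

# Route "Object Created" events for folder1/ to a rule
# (attach the Lambda afterwards with `aws events put-targets`)
aws events put-rule \
  --name dbt-folder1-trigger \
  --event-pattern '{
    "source": ["aws.s3"],
    "detail-type": ["Object Created"],
    "detail": {
      "bucket": {"name": ["my-dbt-input-bucket"]},
      "object": {"key": [{"prefix": "folder1/"}]}
    }
  }'
```
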
kanchokanchev / dbt_docker_compose_postgresql_setup.md
Last active April 11, 2025 20:35
DBT - Setup With Docker Compose and PostgreSQL #DBT #DBT_Docker_Compose #DBT_PostgreSQL #Docker #Docker_Compose

DBT Environment with Docker Compose and PostgreSQL

This guide explains how to set up a DBT environment using Docker Compose with PostgreSQL as the database. It includes SQL and Python models to transform data, fetch data from an external API (ASPIRE), and export it to AWS S3.

Directory Structure

my_dbt_project/
├── docker-compose.yml
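
A minimal sketch of what the docker-compose.yml might contain; image tags, credentials, and paths are illustrative assumptions:

```yaml
services:
  postgres:
    image: postgres:16
    environment:
      POSTGRES_USER: dbt_user
      POSTGRES_PASSWORD: dbt_pass
      POSTGRES_DB: dbt_db
    ports:
      - "5432:5432"

  dbt:
    image: ghcr.io/dbt-labs/dbt-postgres:1.8.2  # illustrative tag
    volumes:
      - ./:/usr/app/dbt
    working_dir: /usr/app/dbt
    depends_on:
      - postgres
```
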
kanchokanchev / snowflake_using_tasks.md
Created January 8, 2025 14:28
Snowflake - Using Tasks #Snowflake #Snowflake_TASKS

Snowflake SQL Task Demonstration

Introduction

This document provides a comprehensive guide to working with Snowflake tasks. It includes SQL code examples, detailed instructions, and best practices for creating, managing, and executing tasks in Snowflake.


Setting Up the Environment
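As a first example, a minimal task sketch; the task, warehouse, and table names are placeholders:

```sql
-- A task that runs every 5 minutes on a named warehouse
CREATE OR REPLACE TASK my_demo_task
  WAREHOUSE = my_wh
  SCHEDULE = '5 MINUTE'
AS
  INSERT INTO audit_log (event_time) VALUES (CURRENT_TIMESTAMP());

-- Tasks are created suspended; resume to start the schedule
ALTER TASK my_demo_task RESUME;
```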

kanchokanchev / snowflake_real_time_data_pipeline.md
Last active January 3, 2025 15:37
Snowflake - Real-Time or Near Real-Time Data Pipeline #Snowflake #Snowflake_Pipeline

Real-Time or Near Real-Time Data Pipeline in Snowflake

Scenario Overview

You have data files from two or three different suppliers. These files are uploaded to an AWS S3 bucket and then loaded into the corresponding Snowflake tables by Snowpipe, which is triggered automatically via S3 event notifications (SNS, Simple Notification Service) whenever new files arrive. Once the data lands in the Snowflake tables, a stored procedure is triggered for further processing and distribution into other tables.
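A sketch of the ingestion step; the pipe, stage, and table names are placeholders:

```sql
-- A pipe that auto-ingests supplier files from an external S3 stage
CREATE OR REPLACE PIPE supplier_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO raw.supplier_data
  FROM @supplier_stage
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);
```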


Solution Architecture