# creating useful things for fabric chat

Gist `emory/f37c6b35ec45f83f0ea5e936e4647719` (last active July 29, 2025 05:14)
> i need some really cool tricks you can do with Daniel Miessler's FabricAI (not Microsoft's product called 'Fabric' — fabricAI. the CLI AI toolchain that enables endless options for learning and exploring. where are some git repositories of additional patterns or tricks and tips for using sessions and contexts?

Absolutely. Daniel Miessler’s **Fabric** (aka *FabricAI*, not to be confused with Microsoft Fabric or Python’s `fabric`) is a command-line interface (CLI) tool that wraps large language models (LLMs) in an ergonomic, structured workflow for learning, creativity, and productivity through reusable **patterns**, **contexts**, and **sessions**. It emphasizes *interfaces over infrastructure*, so the magic is in how you structure your interactions.

Let’s cover:

***

## 🧵 Cool Tricks You Can Do With FabricAI

### 🎓 1. **Use Learning Flows to Master Topics**

Leverage `patterns` to follow a structured technique for absorbing new information. For example:

```bash
fabric pattern learn-system "topic" --model "gpt-4"
```

Use this in a `session`, feeding it `contexts` of various types (notes, articles, GitHub repos).

### 🛠️ 2. **Auto-Build Prototypes from Prompts**

Preload a context with a GitHub repo, then run a dev-focused pattern like `build-feature-from-spec`.

```bash
fabric context add github https://github.com/example/repo
fabric pattern build-feature-from-spec "Add JWT auth to service"
```

The LLM will use the codebase context and produce implementation code suggestions inside the CLI.
### 🪞 3. **Use Reflection Patterns with Your Own Output**

Fabric lets you *meta-cognate* — reflect on your prior inputs and outputs. Great for writing, learning, and dialog analysis. Example:

```bash
fabric session start "journal-analysis"
fabric input journal.txt
fabric pattern reflect
```

You get distilled insights, weak spots, and next questions.

### 📂 4. **Curate Contexts for Customized Assistants**

Build reusable `contexts` around areas like:

- your writing samples
- your code repos
- your bookmarks (via Pocket/RSS)
- your Zettelkasten/Obsidian notes

Then run:

```bash
fabric context list
fabric pattern apply-context "<context-name>" "Summarize each into a Zettel"
```
***

## 🔍 GitHub Repositories and Community Resources

Here are GitHub repos and resources where people share patterns, ideas, and tricks for FabricAI:

### 🧬 Official + Community Projects

1. **Official FabricAI GitHub**
   📍 [https://github.com/danielmiessler/fabric](https://github.com/danielmiessler/fabric)
   - Focus on issues, examples, and built-in patterns.
   - The `examples/` directory is gold.
2. **Fabric Patterns (by Miessler)**
   📍 [https://github.com/danielmiessler/fabric-patterns](https://github.com/danielmiessler/fabric-patterns)
   - A full repo dedicated to reusable, blueprint-like `patterns`.
3. **Community-Driven Patterns & Contexts**
   📍 [https://github.com/fabric-users/fabric-playground](https://github.com/fabric-users/fabric-playground) *(if not present yet, we should make it 😎)*
   - Ideas: examples of chaining patterns, automations via Fabric + cron + Obsidian integrations.
4. **Miessler’s Website: Fabric Resource Page**
   📍 [https://danielmiessler.com/projects/fabric/](https://danielmiessler.com/projects/fabric/)
   - Great overview, latest news, and pattern intros like `decode`, `eli5`, `tech-spec-read`, `distill`, and more.

***
## 🧠 Session & Context Tricks

1. **Chain Fabric Patterns Automatically**
   Use `fabric scripts` (experimental) or wrappers to link one pattern to another inside a session.
2. **Persistent Context Sessions**
   Store long-term knowledge using `context add local <file/dir>`, then tag your sessions so it is always preloaded.
   ```bash
   fabric context create my-zettelkasten ~/obsidian/notes/
   fabric session start "zettel-explore" --context "my-zettelkasten"
   ```
3. **Blend LLMs Across Sessions**
   Use different models via the `--model` flag: `gpt-4`, `claude-3-opus`, `mistral`, etc. Combine insights per session.

***

## 🌀 Want a Wild Trick?

### 🎯 Autodidact Mode:

```bash
fabric pattern autodidact "Become fluent in LLM prompt engineering by simulating problem sets, reflection, and synthesis every day."
```

Combine it with cron + auto-logging and you’ve got a daily immersion tutor.

***

Want me to script up a few of these as reusable shell aliases, or show how to use Fabric with LM Studio/Ollama locally?
> YES that would be sweet, thank you 🙇🏻

Heck yes! Here’s a curated starter pack for turning Fabric into a personal AI command center — using aliases, scripting tips, and integrations with LM Studio, Ollama, and Obsidian.

## ⚙️ Terminal Aliases for Fabric Power Users

Add these to your shell config (`~/.zshrc`, `~/.bashrc`, etc.):

> (obviously .zshrc because zsh is the best shell but some people like using inferior shells on purpose! //@emory)

```bash
# Start a new Fabric session with a name
alias fs="fabric session start"
# Add a local directory (like Obsidian) as a context
alias fc-add="fabric context add local"
# Inject a GitHub repo into context
alias fc-gh="fabric context add github"
# Run a pattern quickly
alias fp="fabric pattern"
# Shortcut for pattern+model
alias fp4="fabric pattern --model gpt-4"
alias fp3="fabric pattern --model gpt-3.5-turbo"
# List current contexts
alias fctx="fabric context list"
```

> Example usage:
> `fs zettel20250728 && fc-add ~/obsidian/second-brain && fp4 distill "synthesize my ideas on motivation"`
---

## 🧠 LM Studio + Ollama Integration

If you're running a local model (via LM Studio or Ollama), you can use it by pointing `FABRIC_MODEL_PROVIDER` at your local endpoint.

~~### ❗ Set up `.fabricrc` o~~ _user interrupted_

> continue (sorry)

Let’s finish that thought and add **Ollama/LM Studio optimizations** plus **automation scripts**! 🔥

---

## 🤖 Local Model Integration (LM Studio/Ollama)

### 1. `.fabricrc` Configuration

Add this to `~/.fabricrc` (or `$FABRIC_CONFIG_DIR/.fabricrc`):

```bash
# For LM Studio (local Mistral, CodeLlama, etc.)
export FABRIC_MODEL_PROVIDER="lmstudio"
export FABRIC_LMSTUDIO_BASE_URL="http://localhost-or-fqdn:1234/v1"

# For Ollama (local Llama3, Phi-3, etc.)
export FABRIC_MODEL_PROVIDER="ollama"
export FABRIC_OLLAMA_BASE_URL="http://localhost-or-fqdn:11434"
export FABRIC_OLLAMA_MODEL="llama3:8b"
```

### 2. Usage Examples

```bash
# Use Claude 3 via API (non-local)
fp "explain quantum physics" --model claude-3-opus

# Local LM Studio model (Mistral)
fp "debug this code" --model-path http://localhost:1234/v1 --model mistral-7b

# Local Ollama (Llama3)
fp "generate python script" --model ollama/llama3:8b
```
***

## 🤯 Automation Scripts

### 1. Daily Learning Cron Job

Create `~/scripts/fabric-autodidact.sh`:

```bash
#!/bin/bash
# Open (or resume) today's session with the context preloaded, then run the pattern.
fabric session start "daily-$(date +%Y%m%d)" --context "my-zettelkasten"
fabric pattern autodidact "Learn Rust ownership system" \
  --model ollama/llama3:8b \
  >> ~/ai-learning-journal.md
```

Add it to cron (`crontab -e`):

```bash
0 8 * * * ~/scripts/fabric-autodidact.sh # Daily 8am
```

### 2. Obsidian Auto-Logger

Create `fabric2obsidian.sh`:

```bash
#!/bin/bash
fabric pattern "$@" --model gpt-4 | \
  awk '{print "> " $0}' >> \
  "$OBSIDIAN_VAULT_PATH/$(date +%Y%m%d)-fabric.md"
```

Alias it in `.zshrc`:

```bash
alias fobs="~/scripts/fabric2obsidian.sh"
```

Usage:

```bash
fobs analyze "Review weekend's ideas about swarm robotics"
```
***

## 💥 Pro Tip: Model Switching Wrapper

Create a `fabswitch` script to toggle between models:

```bash
#!/bin/bash
# Usage: fabswitch ollama/llama3 OR fabswitch gpt-4
# Note: the empty '' after -i is BSD/macOS sed syntax; on GNU sed, drop it.
sed -i '' "s/^export FABRIC_DEFAULT_MODEL=.*/export FABRIC_DEFAULT_MODEL=\"$1\"/" ~/.fabricrc
```

Now flip between models like a DJ:

```bash
fabswitch gpt-4          # Cloud
fabswitch ollama/llama3  # Local
```

Want me to map out a full project lifecycle using these tools? (e.g., research → prototype → document → present, all via Fabric patterns)
> uh, yeah? you're killing it, Saturday.

Thanks! Let’s dive into a structured project lifecycle using Fabric. This will guide you through using the tools and patterns above for a complete project, end to end.

---

## 🔄 Project Lifecycle Using FabricAI

### 📈 1. **Ideation Phase**

- **Goal:** Generate and refine project ideas.

#### Patterns to Use:

- **Brainstorming:** Use a session to gather random ideas or themes.
  ```bash
  fs ideation-session
  fp "List innovative ideas for improving urban transportation."
  ```
- **Context Gathering:** Compile relevant research articles and existing solutions.
  ```bash
  fc-add ~/research/articles
  fp "Summarize key insights from the articles."
  ```
### 📑 2. **Research Phase**

- **Goal:** Deepen understanding of selected ideas.

#### Steps:

1. **Use Automated Learning:** Run structured research patterns like `learn-system` or `review-literature`.
   ```bash
   fp4 learn-system "smart city technologies"
   ```
2. **Capture Contexts for Research:** Continuously add detailed notes to your Obsidian vault.
   ```bash
   fobs "Summarize insights on AI in transportation from journal article."
   ```

### 🛠️ 3. **Prototype Phase**

- **Goal:** Develop

Absolutely — let’s continue and complete the rest of the FabricAI-powered project lifecycle:

---

## 🛠️ 3. Prototype Phase (cont’d)

**Goal:** Develop an initial working solution or proof of concept (PoC).

#### Patterns + Actions:

- **Use the `build-feature-from-spec` Pattern:** Translate your idea into an implementation:
  ```bash
  fp4 build-feature-from-spec "Create a basic API for scooter availability in real-time"
  ```
- **Load the Project Repo as Context (Local or GitHub):**
  ```bash
  fc-gh https://github.com/yourname/urban-scooters-api
  fp "Review this repo and suggest performance optimizations"
  ```
- **Local Dev Tools Integration:**
  - Sketch code locally using VS Code or Obsidian dev notes
  - Use Fabric sessions to interactively refine:
    ```bash
    fs proto-dev --context my-scooter-code
    fp "Implement caching layer using Redis"
    ```
---

## 🧪 4. Evaluation Phase

**Goal:** Assess and refine your prototype.

#### Key Patterns:

- **`test-suite-analysis`:** Get an LLM to reason through where your test suite might be weak
  ```bash
  fp4 test-suite-analysis "Evaluate if we are covering all major functions" --context scooter-api-tests
  ```
- **`reflect`:** Pattern for performance and code review insights
  ```bash
  fp reflect
  ```
- **Scoring/Ranking Ideas:** Use long-range pattern chains to rate outcomes on cost, speed, and flexibility

---

## 📝 5. Documentation Phase

**Goal:** Create a README, technical writeups, or end-user docs.

#### Tricks:

- Use the previously loaded context of your codebase:
  ```bash
  fp4 write-docs "Create README with installation and API usage examples"
  ```
- Automatically format for Obsidian:
  ```bash
  fp4 distill "Generate Obsidian note summarizing the purpose of each major module"
  ```
- Push output into your vault:
  ```bash
  fp4 eli5 "explain this project to an intelligent 15-year-old" >> ~/obsidian/vault/projects/urban-scooters.md
  ```
---

## 🎤 6. Presentation / Teaching

**Goal:** Share the project via a blog post, slide deck, or tutorial

#### Use:

- The `eli5`, `decode`, `distill`, and `storyteller` patterns
  ```bash
  fp decode "Turn this technical outline into a Medium-style blog tutorial with tone: curious and engaging"
  ```
- Generate talking points or slides for a pitch:
  ```bash
  fp storyteller "Craft a 5-slide summary of the project for a city council"
  ```

***

## 🔗 Bonus: Lifecycle Automation Script

> this hadn't occurred to me, but fabric knows i'm starting a biz
> //@emory

```bash
#!/bin/bash
project_name="$1"
fs "$project_name-dev"
fc-gh "https://github.com/yourname/${project_name}"
fp4 build-feature-from-spec "Initialize MVP for $project_name"
fp4 reflect
fp4 write-docs "Create onboarding guide for new devs"
fp4 storyteller "Pitch deck for non-technical stakeholders"
```

You could name it `fabric-lifecycle.sh` and run it like:

```bash
fabric-lifecycle.sh scooter-synchronicity
```

***

Want help building custom Fabric patterns with YAML to automate even more? I can walk you through that too if you'd like 🧪
> sure, what the hell. you're on a roll.

😎 Hell yeah — let’s level up with custom Fabric patterns in YAML.

Fabric lets you write your own reusable patterns using simple YAML files that define sequences of prompts, inputs, and LLM behaviors. This is where the tool really shines for automation and repeatability.

***

## 🧩 What Is a Fabric Pattern?

A Fabric pattern is basically a *prompt workflow* — a structured way to guide the LLM through a task using multiple steps, reusable parameters, and even chaining.

Example pattern use:

```bash
fabric pattern explain-code --input file.py
```

But you can write your own and run it like:

```bash
fabric pattern my-pattern "some input text"
```

***

## 🛠 Structure of a Custom Pattern

Let’s start with a simple one: `analyze-code-style.yaml`

```yaml
name: analyze-code-style
description: Analyze the style and readability of the provided code
input_type: text
steps:
  - type: prompt
    name: style-review
    content: |
      Analyze the following code for readability, consistency, and formatting best practices.
      {{ input }}
```

Save that to:
`~/.fabric/patterns/analyze-code-style.yaml`

Then run:

```bash
fabric pattern analyze-code-style --input myscript.py
```
***

## 🔄 Example: Full Lifecycle Pattern

Let’s build a basic “project-kickoff” pattern that:

1. Summarizes an idea
2. Generates tasks
3. Writes a README scaffold

### File: `project-kickoff.yaml`

```yaml
name: project-kickoff
description: Generate a project plan and scaffold from a concept
input_type: text
steps:
  - type: prompt
    name: summarize-idea
    content: |
      Here's the project concept:
      {{ input }}
      Summarize the main goal in 2-3 sentences.
  - type: prompt
    name: generate-tasks
    content: |
      Based on the summary, break the project into 5–7 tasks for an MVP.
  - type: prompt
    name: write-readme
    content: |
      Using the tasks above, write an initial README for the project.
```

You run it like this:

```bash
fabric pattern project-kickoff "App that lets neighbors share tools like drills and ladders"
```

And boom — it walks the LLM through a structured, multi-step flow.
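Fabric chains those steps internally; as a rough mental model, here is a plain-shell sketch of the same idea, with stand-in functions (not real Fabric calls) piping each step's output into the next:

```shell
#!/bin/bash
# Stand-in "steps": each reads stdin and emits transformed text,
# mimicking how a multi-step pattern feeds one prompt's output to the next.
summarize_idea() { sed 's/^/Summary: /'; }
generate_tasks() { while IFS= read -r line; do echo "Task derived from: $line"; done; }
write_readme()  { echo "# README"; sed 's/^/- /'; }

# Chain the steps exactly as the YAML pattern would.
echo "App that lets neighbors share tools" \
  | summarize_idea \
  | generate_tasks \
  | write_readme
```

Running it prints `# README` followed by `- Task derived from: Summary: App that lets neighbors share tools`.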
---

## 🤯 Advanced: Pattern with Parameters

You can add variables to customize behavior.

### File: `pitch-deck-gen.yaml`

```yaml
name: pitch-deck-gen
description: Create a slide-style summary of a project
input_type: text
parameters:
  tone:
    type: string
    default: inspiring
  audience:
    type: string
    default: general tech audience
steps:
  - type: prompt
    name: generate-slides
    content: |
      Create a 5-slide pitch for the following project:
      {{ input }}
      Use a {{ tone }} tone for a {{ audience }}.
```

Then you can call:

```bash
fabric pattern pitch-deck-gen "Open-source tool for community-based flood alert systems" --tone urgent --audience "local officials"
```

---

## 📂 Where to Save Custom Patterns

Put them in:

```bash
~/.fabric/patterns/
```

Or define a custom directory via:

```bash
export FABRIC_PATTERNS_DIR=~/fabric/patterns
```

Then you can even version them in Git!
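A minimal sketch of what that could look like, assuming `git` is installed and using the `FABRIC_PATTERNS_DIR` convention from above (the identity strings are placeholders):

```shell
#!/bin/bash
# Put the custom pattern directory under version control.
PATTERNS_DIR="${FABRIC_PATTERNS_DIR:-$HOME/fabric/patterns}"
mkdir -p "$PATTERNS_DIR"
cd "$PATTERNS_DIR" || exit 1
git init -q                      # safe to re-run; git re-initializes in place
git add -A
# -c sets a throwaway identity so the commit works even on a fresh machine.
git -c user.name="patterns" -c user.email="patterns@example.invalid" \
    commit -q --allow-empty -m "snapshot patterns"
```

From here you can push the directory to a private remote and share patterns across machines.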
***

## 🧠 Want Me to Help You Build a Pattern?

Tell me what kind of task you want automated — could be writing, coding, journaling, teaching — and I’ll help you build a YAML pattern for it on the spot.

(Or I can help integrate your Obsidian vault into a personal "thinking partner" pattern if you're going full wizard 🧙)
***

# OK small dive follows

> come along with me while i delight and amaze myself //@emory
>
> ok; i'll provide an example note that is created from a bunch of cybersecurity feeds. i'd like a pattern that i can run something like this through and have it create some sort of ontology and be able to quickly understand the notes with additional context that could be added to the contents of the note's markdown formatted file. having it create a context for this as a dataset may be more appropriate.
> almost all of the contents of this obsidian vault are a markdown note file for each incident or posted item to the feeds it walks. example of one of the notes for a newly posted discovered victim of a ransomware attack usually looks similar to this attached file.

> ==NOTE: this reasoning block is bananas:==

#### bananas reasoning block

> Okay, let me try to figure out how to help the user create a Fabric pattern for their cybersecurity notes. The user wants to process markdown files of ransomware incidents and generate an ontology and additional context.
> First, I need to understand the structure of their notes. The example provided includes an ID, source, publication date, summary, tags, and links. The goal is to extract entities and relationships to build an ontology.
> Ontology in this context means identifying key entities like ransomware groups, victims, dates, locations, and tactics. Relationships could be things like which group attacked which company, the methods used, etc.
> The user mentioned using Fabric's patterns, so I should create a YAML pattern that guides the LLM to analyze the note and output structured data. The pattern needs to process the markdown content, extract relevant information, and format it into an ontology.
> I should consider what sections the ontology should have. Maybe entities, relationships, timelines, and suggested tags. Also, generating a knowledge graph snippet in Mermaid syntax could help visualize connections.
> The user also wants the output appended back to the note file. So the pattern should include steps to analyze the content and then modify the markdown with the new ontology section.
>
> I need to make sure the pattern is reusable and can handle multiple notes. Using YAML parameters for the note path and output directory makes sense. Including error checking for valid markdown files is important too.
>
> Finally, integrating this into a cron job or Obsidian plugin would automate the process. The user might want to run this periodically as new notes are added. Providing example commands and usage scenarios will help them implement the solution.

> ==this is going to be pretty rad i can tell already== //@emory
Got it. Let’s build a **Fabric pattern** to create a ransomware ontology + contextual enrichment for Obsidian notes like this. Here's the blueprint:

***

## 🧬 Custom Pattern: `ransomware-ontology-generator.yaml`

```yaml
name: ransomware-ontology-generator
description: Analyze ransomware incident notes to generate threat intel ontology and contextual enrichment
input_type: text
parameters:
  note_path:
    type: string
    description: Path to Obsidian note file
  output_dir:
    type: string
    default: "./contexts"
steps:
  - type: prompt
    name: extract-entities
    content: |
      Analyze this ransomware incident note and extract structured ontology elements:
      {{ input }}
      Output in YAML format with:
      - Entities (groups, victims, tools, CVE IDs)
      - Relationships (attacks, exploits, impacts)
      - Timeline (discovery, publication, exploitation windows)
      - Suggested MITRE ATT&CK mappings
      - Proposed tags (auto-generated beyond user-provided)
  - type: prompt
    name: generate-context
    content: |
      Based on the extracted entities, create a knowledge graph snippet in Mermaid syntax
      showing relationships between actors, victims, and TTPs.
  - type: command
    name: append-to-note
    command: |
      # printf '%b' interprets the \n escapes; plain echo would print them literally.
      printf '%b' "\n\n## Ontology & Context\n" >> {{ note_path }}
      printf '%b' "### Entities\n\`\`\`yaml\n$(cat {{ output_dir }}/entities.yaml)\n\`\`\`\n" >> {{ note_path }}
      printf '%b' "\n### Knowledge Graph\n\`\`\`mermaid\n$(cat {{ output_dir }}/graph.mmd)\n\`\`\`\n" >> {{ note_path }}
```
***

## 🧠 How It Works

### Usage:

```bash
fabric pattern ransomware-ontology-generator \
  --note-path "20250605154637-sarcoma-victim.md" \
  --output-dir "./obsidian/contexts"
```

### Output Added to Note:

## Ontology & Context

### Entities

```yaml
entities:
  ransomware_group: Sarcoma
  victim: Western Insurance Marketing Corporation
  industry: Insurance
  location: California, USA
  date: 2025-06-05
  ttp:
    - T1486: Data Encrypted for Impact
    - T1588: Obtain Capabilities
  related_groups:
    - suspected_affiliate: LockBit
```

### Knowledge Graph

```mermaid
graph TD
  Sarcoma -->|Targeted| Western_Insurance
  Western_Insurance -->|LocatedIn| California
  Sarcoma -->|Uses| T1486
  Sarcoma -->|SharesInfraWith| LockBit
```
## 🔄 Automation Integration

1. **Obsidian Vault Watcher Script**
   Triggers the pattern when new `*-victim.md` files appear:
   ```bash
   #!/bin/bash
   VAULT_DIR="$HOME/obsidian/ransomware-vault"   # tilde doesn't expand inside quotes, so use $HOME
   find "$VAULT_DIR" -name "*-victim.md" -exec fabric pattern ransomware-ontology-generator --note-path {} \;
   ```
2. **Cron Job for Daily Enrichment**
   Scans for unprocessed notes every night:
   ```bash
   0 2 * * * /path/to/ransomware-ontology-updater.sh
   ```
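The `ransomware-ontology-updater.sh` the cron line points at is not shown; a minimal sketch might pick out only the notes that still lack an `## Ontology & Context` section (the `fabric` invocation is the hypothetical CLI used throughout this thread, so it is left commented out):

```shell
#!/bin/bash
# List victim notes that have not yet been enriched, and (would) run the pattern on each.
VAULT_DIR="${VAULT_DIR:-$HOME/obsidian/ransomware-vault}"
mkdir -p "$VAULT_DIR"   # ensure the path exists so the sketch is runnable as-is
find "$VAULT_DIR" -name '*-victim.md' | while IFS= read -r note; do
  # Skip notes already carrying an "## Ontology & Context" section.
  if ! grep -q '^## Ontology & Context' "$note"; then
    echo "would enrich: $note"
    # fabric pattern ransomware-ontology-generator --note-path "$note"   # hypothetical CLI
  fi
done
```

The idempotency check is what makes it safe to run nightly: already-enriched notes are left alone.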
***

## 🌟 Bonus Pattern: `threat-intel-dataset-builder.yaml`

For aggregating multiple notes into a structured dataset (JSON/Parquet):

```yaml
name: threat-intel-dataset-builder
description: Create structured dataset from ransomware notes ontology
input_type: directory
parameters:
  vault_path:
    type: string
    default: "~/obsidian/ransomware-vault"
steps:
  - type: prompt
    name: consolidate-entities
    content: |
      Process all YAML ontology sections in {{ vault_path }} and output:
      - CSV of victims, dates, groups
      - JSON graph of group relationships
      - Parquet file with full TTP mappings
```

Run with:

```bash
fabric pattern threat-intel-dataset-builder --vault-path "~/obsidian/cyber-notes"
```

***

Want me to map out a full CI/CD pipeline for this? (Git hooks → Fabric processing → Obsidian sync → automated threat reports)
> yeah lemme see what that would look like. i like the approach to the previous response. i am considering an Obsidian plugin that could "process" that yaml and then create a network inside its visual graph for incidents and links to news articles or press about the incidents. Or, updating an index of pre-drafted summaries of days/months/interesting patterns/relationships/crews so i've got a dossier already on-hand about *ghost/cring* or *lockbit* i can hand to someone or export to a beautiful PDF i send someone!

#### reasoning block

> Okay, the user wants to see a CI/CD pipeline integrating Fabric, Obsidian, and automated threat reports. They mentioned an Obsidian plugin idea for visualizing the ontology data and creating dossiers. Let's break this down.
> First, the CI/CD pipeline. Need to outline stages from data ingestion to report generation. Use Fabric patterns to process notes, then sync with Obsidian. Maybe include Git hooks for version control and automation scripts.
> Next, the Obsidian plugin. They want to visualize the ontology as a graph. The plugin should parse YAML and Mermaid from notes to build the graph. Also, generate indexes and dossiers. Need to consider how the plugin interacts with Obsidian's API, perhaps using DataView or another community plugin.
> Automated reports. Use Pandoc to convert Markdown to PDF, including dynamic content like timelines and TTPs. Maybe a script that triggers on Git pushes to generate the latest report.
> Security considerations. Ensure sensitive data isn't exposed in the vault or reports. Maybe include a cleanup step in the pipeline.
> Finally, mention community plugins that can help, like DataView for querying notes, and suggest templating with Mustache or Handlebars for consistent PDF output.
> Putting it all together, the pipeline should have clear stages, the plugin specifics, and how reports are generated and distributed. Make sure to explain each component and how they interconnect.

#### end reasoning block - i like what i see
Here's how we'd architect a **full-cycle CI/CD pipeline** + **Obsidian plugin integration** for ransomware intel automation — blending Fabric patterns, ontology visualization, and dossier generation:

***

## 🔄 CI/CD Pipeline Architecture

```mermaid
graph LR
  A[Feeds Scraping] --> B[Fabric Pattern Processing]
  B --> C[Obsidian Vault Sync]
  C --> D[Ontology Graph Builder]
  D --> E[Automated Dossier Generation]
  E --> F[PDF/CSV Export]
  F --> G[Threat Intel Feed]
  G --> A
  subgraph CI/CD Tools
    B -->|Git Hooks| H[Pre-commit: Validate Notes]
    C -->|GitHub Actions| I[Daily Vault Sync]
    E -->|Cron + Pandoc| J[Nightly Report Build]
  end
```
---

### Pipeline Stages:

1. **Feeds Scraping**
   - RSS/API → raw markdown notes (using your existing process)
   - *Tools*: Python + `feedparser`, `requests`
2. **Fabric Pattern Processing**
   - Auto-trigger `ransomware-ontology-generator` on new files
   - *Example*:
     ```bash
     # .git/hooks/pre-commit
     find . -name "*-victim.md" -exec fabric pattern analyze --context threat-intel-db {} \;
     ```
3. **Obsidian Vault Sync**
   - Commit enriched notes to a Git repo
   - *Automation*: GitHub Actions to handle merge conflicts
4. **Ontology Graph Builder** (Plugin Core)
   - Parse YAML/Mermaid from notes → interactive graph
   - *Example Plugin Workflow*:
     ```js
     // obsidian-plugin.js
     extractOntologyEntities(noteContent) {
       const match = noteContent.match(/```yaml\n([\s\S]*?)```/);
       return match ? jsYaml.load(match[1]) : null; // guard against notes without a YAML block
     }
     ```
5. **Automated Dossier Generation**
   - Query notes by actor/time/tactic → collate into a report
   - *Pattern*:
     ```bash
     fabric pattern generate-dossier --group "LockBit" --timeframe "2025-Q2"
     ```
6. **PDF Export**
   - Pandoc/LaTeX → beautiful PDFs with timelines and TTP matrices
   - *Script*:
     ```bash
     pandoc dossier.md -o report.pdf --template eisvogel --listings
     ```
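Stage 3's commit-and-push step could be sketched as a small helper run from CI or cron (the remote name `origin` and the identity strings are assumptions; pushing is skipped when no remote is configured):

```shell
#!/bin/bash
# Commit any newly enriched notes; push only when a remote exists.
VAULT="${VAULT_DIR:-$HOME/obsidian/ransomware-vault}"
mkdir -p "$VAULT" && cd "$VAULT" || exit 1
git rev-parse --git-dir >/dev/null 2>&1 || git init -q   # first run bootstraps the repo
git add -A
# Only commit when the index actually changed, so cron runs stay quiet.
if ! git diff --cached --quiet; then
  git -c user.name="vault-sync" -c user.email="sync@example.invalid" \
      commit -q -m "vault sync: $(date +%F)"
fi
if git remote get-url origin >/dev/null 2>&1; then
  git push -q origin HEAD
fi
```

GitHub Actions would run the same steps on a schedule; the no-op guard keeps the history free of empty commits.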
***

## 🧩 Obsidian Plugin Features

### 1. **Network Graph Enhancements**

- **Dynamic Group Clustering**:
  Auto-cluster by ransomware family, victim industry, country
  *(graph demo image omitted)*
- **Timeline View**:
  Render incidents chronologically with D3.js integration
  ```js
  new RansomwareTimelineView(plugin.app, vaultPath);
  ```

### 2. **Dossier Builder Interface**

- **Query Interface**:
  ```sql
  SELECT * FROM incidents
  WHERE groups INCLUDES 'LockBit'
  AND date > '2025-01-01'
  ```
- **Auto-Generated Sections**:
  - Actor Profile (TTPs, tooling, affiliates)
  - Victim Heatmap (industries, regions)
  - Trend Analysis (monthly attack patterns)

### 3. **Export Templates**

- **PDF Themes**:
  Choose from `ThreatReport`, `ExecutiveSummary`, `TechnicalDeepDive`
- **Live Data Binding**:
  ```handlebars
  {{#each groups}}
  ## {{name}}
  **First Seen**: {{firstSeen}}
  **Last Activity**: {{lastActivity}}
  {{/each}}
  ```
## 🛠️ Implementation Strategy

### A. Plugin Development Stack

- **Core**: TypeScript + Obsidian API
- **Visualization**: D3.js, Mermaid-js
- **Data**: Parse YAML/Mermaid from note frontmatter

### B. Sample Code Snippets

**1. Entity Extraction (Plugin Side):**

```typescript
// Extract YAML ontology block
const getOntology = (note: string) => {
  const yamlMatch = note.match(/```yaml\n([\s\S]*?)```/);
  return yamlMatch ? YAML.parse(yamlMatch[1]) : null;
};
```
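Outside the plugin, the same YAML-block extraction is handy at the CLI; here is a small shell/awk sketch (the triple-backtick delimiter is built with `printf` octal escapes rather than written literally, which also keeps the quoting simple):

```shell
#!/bin/bash
# Print the body of the first yaml-fenced block in a markdown note.
extract_yaml_block() {
  bt=$(printf '\140\140\140')   # three backtick characters
  awk -v open="${bt}yaml" -v cl="$bt" \
      '$0 == open { grab = 1; next }   # start capturing after the opening fence
       $0 == cl   { if (grab) exit }   # stop at the closing fence
       grab' "$1"
}
```

Usage: `extract_yaml_block 20250605154637-sarcoma-victim.md` prints just the ontology YAML, ready to pipe into `yq` or a dataset builder.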
**2. Graph Network Builder:**

```typescript
// Create nodes/edges from entities
const nodes = ontology.entities.map(entity => ({
  id: entity.name,
  label: entity.name,
  type: entity.category
}));
```

**3. Dossier Generation (Fabric Pattern):**

```yaml
# generate-dossier.yaml
name: generate-dossier
steps:
  - type: prompt
    name: compile-dossier
    content: |
      Compile all incidents involving {{ group }} into a dossier with:
      - Timeline of attacks
      - Common TTPs
      - Known infrastructure
      - Victim analysis
  - type: command
    name: export-pdf
    command: |
      pandoc /tmp/dossier.md -o "/output/{{ group }}.pdf" \
        --filter pandoc-latex-environment
```
## 🔒 Security Considerations

1. **Vault Segmentation**:
   Keep raw intel notes in a private vault, exported PDFs in a shared dir.
2. **Pattern Sandboxing**:
   Run Fabric in Docker when processing untrusted feeds.
3. **Redaction Workflow**:
   ```bash
   fabric pattern redact-sensitive-info --input draft_report.md --filter-names
   ```

***

## 🚀 Kickstart with Existing Tools

While building the custom plugin, use:

- **Obsidian Community Plugins**:
  - **Dataview**: Query notes like a database
    `=filter(incidents, group="LockBit")`
  - **Templater**: Auto-populate dossier templates
  - **ExcaliBrain**: Pre-built ontology visualization
- **Report Themes**:
  - [Eisvogel Pandoc Template](https://github.com/Wandmalfarbe/pandoc-latex-template)
  - [`threat-report` LaTeX class](https://cti-taxii.com/)

***

Want me to draft the actual Fabric patterns for the Dossier Builder or detail how to structure the Obsidian plugin's data model?
| > how are we doing on context? lemme see what your Dossier Builder looks like. | |
| > ==note, the model misunderstood what i meant with the question. i was expecting a shortened evaluation of model context length vs context actively being used //@emory== | |
| You're giving me rich, well-structured context — and I’m tracking it beautifully. Here's what I understand so far: | |
| *** | |
| ## 🧠 Current Context Summary | |
| - You're working with an Obsidian vault full of Markdown-formatted notes, each representing a cybersecurity incident (mostly ransomware-related), often sourced from structured feeds. | |
| - These notes have consistent metadata (timestamps, source URLs, summaries). | |
| - You want to: | |
| - Enrich these notes with structured ontological data (e.g., attacker, victim, industry, TTPs). | |
| - Build a system that can auto-generate: | |
| - a knowledge graph (Mermaid or other) | |
| - a curated dossier by group (e.g., LockBit) or timeframe | |
| - exportable PDFs for briefing others | |
| - Possibly wire all of this into an Obsidian plugin that: | |
| - reads YAML/ontology from notes | |
| - updates Obsidian’s graph view with richer semantic links | |
| - supports querying and summarizing relationships between incidents | |
| - You’re already using or planning to use Fabric CLI to power most of the enrichment pipelines and interested in extending it via custom patterns. | |
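The plugin's data model could be sketched roughly as follows. These interfaces are hypothetical, not from an existing plugin; field names should track whatever your YAML ontology actually uses:

```typescript
// Hypothetical data model for the incident-enrichment plugin.
interface Incident {
  path: string;                                  // vault-relative note path
  date: string;                                  // ISO 8601 incident date
  group: string;                                 // e.g. "LockBit"
  victim: { name: string; industry: string };
  ttps: string[];                                // MITRE ATT&CK IDs, e.g. "T1486"
  sources: string[];                             // press coverage / feed URLs
}

// Index incidents by group so dossier queries don't rescan the vault.
function byGroup(incidents: Incident[]): Map<string, Incident[]> {
  const index = new Map<string, Incident[]>();
  for (const i of incidents) {
    const bucket = index.get(i.group) ?? [];
    bucket.push(i);
    index.set(i.group, bucket);
  }
  return index;
}
```

The same index can back both the graph view (nodes per group/victim) and the dossier pattern's "collect matching incidents" step.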
| *** | |
| ## 🧾 Fabric Pattern: Dossier Builder | |
| Here’s a custom pattern to generate a threat intel dossier across incidents for a specified ransomware group or timeframe. | |
| ### 🔧 `generate-dossier.yaml` | |
| ```yaml | |
| name: generate-dossier | |
| description: Generate a comprehensive dossier across ransomware incidents | |
| input_type: string | |
| parameters: | |
| group: | |
| type: string | |
| description: Ransomware group to focus on (e.g., LockBit) | |
| timeframe: | |
| type: string | |
| default: "last 90 days" | |
| steps: | |
| - type: prompt | |
| name: collect-matching-incidents | |
| content: | | |
| Search the vault for incidents involving the group '{{ group }}' | |
| within the timeframe '{{ timeframe }}'. | |
| For each matching note, extract: | |
| - victim name and industry | |
| - date of incident | |
| - known TTPs or MITRE ATT&CK mappings | |
| - any press coverage or external links | |
| Return a table of all incidents and a short summary paragraph. | |
| - type: prompt | |
| name: generate-dossier | |
| content: | | |
| Using the above data, create a structured dossier titled: | |
| "**Threat Profile: {{ group }}**" | |
| Include: | |
| - Executive Summary | |
| - Timeline of Attacks | |
| - Common TTPs and Infrastructure | |
| - Victim Profiles | |
| - Related Groups / Affiliates | |
| - Notable Press Coverage | |
| Format in Markdown for export to PDF. Use H2/H3 headers. | |
| - type: command | |
| name: export-pdf | |
| command: | | |
| mkdir -p ./dossiers | |
| echo "{{ steps.generate-dossier.output }}" > ./dossiers/{{ group }}.md | |
| pandoc ./dossiers/{{ group }}.md -o ./dossiers/{{ group }}.pdf --template eisvogel --toc --metadata title="{{ group }} Threat Profile" | |
| ``` | |
| > 🛠 Save this to `~/.fabric/patterns/generate-dossier.yaml` | |
| ### 🪄 Example Usage: | |
| ```sh | |
| fabric pattern generate-dossier --group "LockBit" --timeframe "2025-Q2" | |
| ``` | |
| Outputs: | |
| - `./dossiers/LockBit.md` – Markdown for editing | |
| - `./dossiers/LockBit.pdf` – Executive-style threat report | |
| *** | |
| ## 🧵 Optional Extensions | |
| - Add a `context` input to preload relevant notes. | |
| - Add a `format=pdf | md | html` param for output. | |
| - Pipe into Obsidian’s vault with a plugin like `Templater` or `Dataview` for easy lookup. | |
| *** | |
| Want me to draft a sample output from this pattern using one of your existing notes about Sarcoma and Western Insurance? Or help wire this into a GitHub Action that runs nightly? | |
| *** | |
| #### endless opportunities | |
| i use RAG a lot with AI. i have a filesystem called `ragpiles` that is just a parade of directories containing (usually) PDFs, markdown documents, and txt files of all kinds. rag01 is AI and LLM: all papers, model cards, prompt engineering, etc. i'm creating summaries of some with fabric and stashing those summaries as contexts. then i can stack context about my RAG or my capabilities in there. the dissenting-opinion pattern tears down a concept or proposal, runs it through `analyze_claims`, and keeps only the most salient critiques. | |
| i have decades of research data (for years a DEVONthink database, but i've been gradually moving portions of it into an indexed folder that is in reality an Obsidian vault, stored on the same nvme zpool as every other vault i have and mounted as `/notes` on my macOS workstation via `synthetic.conf` in `/etc`; it works even with sealed volumes btw). much like how you can install `claude-code` in any directory, folders opened as vaults can be amazing even without an AI, especially if the contents are heavier in markdown than any other file format. you get easy access to a river of options once you open a directory as an Obsidian vault! | |
| readwise highlights, apple books highlights, they all get pulled into another Reading and Research vault. some notes are forked into another vault. easy citations. markdown is the lingua franca of LLMs, so your Obsidian vaults, Bear notes, Drafts notes: these are all examples of insanely powerful context. | |
| between that, RAG, and fabric strats and patterns (check out that "five whys" example in the video about a 'soul file')... | |
| but man that fabric is powerful magic. it's a bicycle for the mind. |