| { | |
| "cells": [ | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "e8cba711", | |
| "metadata": { | |
| "time_run": "3:15:20p" | |
| }, | |
| "outputs": [], | |
| "source": [ | |
| "from fastcore.utils import *\n", | |
| "import fastcore.all as fc, re, math, itertools, functools, numpy as np, types, typing, dataclasses, matplotlib.pyplot as plt, collections, regex\n", | |
| "from fastcore.xtras import dict2obj\n", | |
| "from regex import search\n", | |
| "from collections import Counter\n", | |
| "from collections.abc import Iterable\n", | |
| "from httpx import get as xget, post as xpost\n", | |
| "from fasthtml.common import *\n", | |
| "from monsterui.core import *\n", | |
| "np.set_printoptions(linewidth=150, suppress=True)\n", | |
| "plt.rcParams['figure.dpi'] = 50\n", | |
| "\n", | |
| "from dialoghelper import url2note\n" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "877ce46b", | |
| "metadata": { | |
| "solveit_ai": true | |
| }, | |
| "source": [ | |
| "Hi solveit! I am Pol, and I am working with you today! I'm an experienced Python coder and I enjoy learning new things. When writing code for me, please follow fast.ai coding style: avoid comments; place single line if/loop/etc constructs on the same line; use concise variable names where appropriate. Follow the style you will see in my messages, as we work on this dialogue together. Are you ready?\n", | |
| "\n", | |
| "##### 🤖Reply🤖<!-- SOLVEIT_SEPARATOR_7f3a9b2c -->\n", | |
| "\n", | |
| "Hi Pol! Yes, I'm absolutely ready to work with you! I'm familiar with the fast.ai coding style and will follow your preferences for concise, clear code without unnecessary comments. I see you've already set up some useful imports including fastcore, which I know is central to the fast.ai approach.\n", | |
| "\n" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "e2c5da70", | |
| "metadata": { | |
| "time_run": "3:15:20p" | |
| }, | |
| "outputs": [], | |
| "source": [ | |
| "#url2note('https://tommyc.xyz/posts/flow-matching')" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "dce45fd5", | |
| "metadata": { | |
| "hide_input": true | |
| }, | |
| "source": [ | |
| "# Exploring Blogpost: Flow Matching\n", | |
| "\n", | |
| "[ Tommy's Blog ](https://tommyc.xyz/)\n", | |
| "\n", | |
| "The purpose of this post is to develop an intuitive understanding of [flow matching](https://arxiv.org/abs/2210.02747), a powerful state-of-the-art generative model. While many _flow matching_ guides are pretty math heavy, this post focuses more on concrete concepts.\n", | |
| "\n", | |
| "To develop our understanding of _flow matching_ , we'll take inspiration from the human drawing process. We'll start by looking at how humans draw simple objects. From here we'll build an intuition for how complex objects are drawn and finally work our way towards a formal definition of _flow matching_. By the end we'll have trained a _flow matching_ model that produces comparable results to diffusion models like DDIM, but is conceptually much simpler and uses 40% less sampling code.\n", | |
| "\n", | |
| "> This post is actually a bit of an accident. I was originally trying to develop a simpler version of diffusion but I ended up redefining flow matching 😅.\n", | |
| "\n", | |
| "* * *\n", | |
| "\n", | |
| "## Drawing a Square\n", | |
| "\n", | |
| "Let's start with a simple image generation task like drawing a single black square on a white background and see where it takes us. To accomplish this task we might use the following algorithm. First we decide on the size and location of the square, then we execute the following drawing steps:\n", | |
| "\n", | |
| "####\n", | |
| "\n", | |
| " 1. draw the top side of the square\n", | |
| " 2. draw the right side of the square\n", | |
| " 3. draw the bottom side of the square\n", | |
| " 4. draw the left side of the square\n", | |
| " 5. fill in the square \n", | |
| "\n", | |
| "\n", | |
| "\n", | |
| "_Drawing a black square on a white background_\n", | |
| "\n", | |
| "#### Here's the algorithm in code." | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "5516575b", | |
| "metadata": { | |
| "time_run": "3:15:28p" | |
| }, | |
| "outputs": [ | |
| { | |
| "data": { | |
| "text/plain": [ | |
| "(tensor([[1., 1., 1., ..., 1., 1., 1.],\n", | |
| " [1., 1., 1., ..., 1., 1., 1.],\n", | |
| " [1., 1., 1., ..., 1., 1., 1.],\n", | |
| " ...,\n", | |
| " [1., 1., 1., ..., 1., 1., 1.],\n", | |
| " [1., 1., 1., ..., 1., 1., 1.],\n", | |
| " [1., 1., 1., ..., 1., 1., 1.]]),\n", | |
| " <PIL.Image.Image image mode=L size=32x32>)" | |
| ] | |
| }, | |
| "execution_count": 18, | |
| "metadata": {}, | |
| "output_type": "execute_result" | |
| } | |
| ], | |
| "source": [ | |
| "import random\n", | |
| "import torch\n", | |
| "from torchvision import transforms\n", | |
| "\n", | |
| "def model(img, step, **kwargs):\n", | |
| " \"\"\"\n", | |
| " Compute the change we need to make to `img` \n", | |
| " for a specific `step` in the drawing algorithm.\n", | |
| "\n", | |
| " Note: \n", | |
| " As black pixels = 0, white pixels = 1 we need to subtract 1\n", | |
| " when adding a black pixel to the white background. \n", | |
| " \"\"\"\n", | |
| " h, w = img.size()\n", | |
| " x, y = kwargs[\"location\"]\n", | |
| " size = kwargs[\"size\"]\n", | |
| " change = torch.zeros(h, w)\n", | |
| " if step == 1: # draw top side of the square\n", | |
| " change[x, y:y+size] = -1.\n", | |
| " return change\n", | |
| " if step == 2: # draw right side of the square\n", | |
| " change[x:x+size, y+size] = -1.\n", | |
| " return change\n", | |
| " if step == 3: # draw bottom side of the square\n", | |
| " change[x+size, y:y+size] = -1.\n", | |
| " return change\n", | |
| " if step == 4: # draw left side of the square\n", | |
| " change[x:x+size, y] = -1.\n", | |
| " return change\n", | |
| " if step == 5: # fill in the square \n", | |
| " change[x:x+size, y:y+size] = -1.\n", | |
| " return change\n", | |
| "\n", | |
| "# Let's create our blank page \n", | |
| "img_height = img_width = 32\n", | |
| "img = torch.ones(img_height, img_width)\n", | |
| "\n", | |
| "# Select the size and location of the square\n", | |
| "square_side_length = random.randint(4, img_height // 2)\n", | |
| "top_x = random.randint(0, img_width - square_side_length-1)\n", | |
| "top_y = random.randint(0, img_height - square_side_length-1)\n", | |
| "top_left_corner = (top_x, top_y)\n", | |
| "\n", | |
| "# Run our drawing algorithm\n", | |
| "n_steps = 5\n", | |
| "for step in range(1, n_steps+1):\n", | |
| " change = model(img, step, size=square_side_length, location=top_left_corner)\n", | |
| " img += change\n", | |
| "img, transforms.ToPILImage()(img)" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "d304ea85", | |
| "metadata": {}, | |
| "source": [ | |
| "From the code we can see that our `model` is a function that takes the current image `img`, the current algorithm step `step`, and tells us what change we need to make to `img`.\n", | |
| "\n", | |
| "Starting with a blank page (white image) at step 0, the `model` tells us to add the top side of the square. We then add this change to `img` and increment our `step`. We repeat this process for every step in our drawing algorithm, passing the latest `img` and `step` to `model`.\n", | |
| "\n", | |
| "* * *\n", | |
| "\n", | |
| "### General Drawing Algorithm\n", | |
| "\n", | |
| "Ok, we've created a killer square drawing algorithm but what if we want to generate more interesting things like a photo of a toucan? Unfortunately there isn't a straightforward way to extend our _Squares_ algorithm to draw more complex things 😢. If we can't hardcode a general algorithm perhaps we can build one using data instead?\n", | |
| "\n", | |
| "To build an algorithm using data we need to train a model (e.g. a neural network) on a dataset of input and output pairs. In our case the input would be an image and the current step in the drawing algorithm. The output is the change we need to make to the image.\n", | |
| "\n", | |
| "> the key question is how do we design a drawing algorithm that is easy for a neural network learn.\n", | |
| "\n", | |
| "#### First Attempt\n", | |
| "\n", | |
| "As we no longer need to hardcode each step in the drawing algorithm like we did for the _Squares_ algorithm, we can use more general higher level drawing steps that the network will learn through data. Here's an example of one possible general drawing algorithm which we'll call the _ORGB_ algorithm:\n", | |
| "\n", | |
| "####\n", | |
| "\n", | |
| " 1. draw the outline\n", | |
| " 2. add red\n", | |
| " 3. add blue\n", | |
| " 4. add green\n", | |
| "\n", | |
| "\n", | |
| "\n", | |
| "_ORGB Algorithm_\n", | |
| "\n", | |
| "While the _ORGB_ algorithm should allow us to draw complex images, it will be difficult to train a neural network to learn it. Neural networks train better using smooth continuous data but the steps in this algorithm are sharp and jagged. What could we use instead?\n", | |
| "\n", | |
| "* * *\n", | |
| "\n", | |
| "#### Second Attempt\n", | |
| "\n", | |
| "If we squint our eyes a little, we can see that each step in both the _Squares_ algorithm and the _ORGB_ algorithm basically adds a little piece of the final image at each step. Both algorithms do this in a way that loosely mimics how a human might draw. Let's abandon these human like drawing steps and instead focus on creating a smooth, continuous set of drawing steps that are more suitable for a neural network.\n", | |
| "\n", | |
| "One way we could this is to add something like 1% of the final image at each step in our algorithm. If we design our algorithm to have 100 steps, then at step 70 the image will contain 30% of the starting image and 70% of the final image.\n", | |
| "\n", | |
| "Here's the algorithm in mathematical form:\n", | |
| "\n", | |
| "$$\n", | |
| "x_n = (1-n)z + nx\n", | |
| "$$\n", | |
| "\n", | |
| "#####\n", | |
| " * \\( x_n \\) image at step _n_\n", | |
| " * _n_ : step in the process [0 to 1] (e.g. step 70 would be 0.7)\n", | |
| " * _z_ : starting image (blank page)\n", | |
| " * _x_ : the final image (e.g. toucan photo)\n", | |
| "\n", | |
| "\n", | |
| "\n", | |
| "_Smooth and continuous drawing algorithm_\n", | |
| "Ok, now that we have our drawing algorithm let's train a model to predict \\( x_{n+1} - x_n \\) (i.e. the change we need to make to \\( x_n \\)). To train a model we need to define input/output pairs for our training dataset. Let's create a single input/output pair as follows:\n", | |
| "\n", | |
| "#####\n", | |
| "\n", | |
| " 1. \\( n \\): select a random drawing algorithm step (a float between 0 and 1)\n", | |
| " 2. \\( x \\): pluck a random image from our dataset\n", | |
| " 3. \\( z \\): generate a white image with the same size as \\( x \\)\n", | |
| " 4. compute \\( x_n \\): \\( (1-n)z + nx \\)\n", | |
| " 5. compute \\( x_{n+1} \\): \\( (1-(n+0.01))z + (n+0.01)x \\)\n", | |
| " 6. pass (\\( x_n \\), \\( n \\)) as input to the model, use (\\( x_{n+1} - x_n \\)) as the output \n", | |
| "\n", | |
| "Unfortunately, the model fails to learn anything useful when using the input/outputs pairs above 😭. One of the reasons could be that the difference between \\( x_{n+1} \\) and \\( x_n \\) is tiny. As a result the model can achieve a low loss by simply predicting all zeros or something close to all zeros." | |
| ] | |
| }, | |
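| { | |
| "cell_type": "markdown", | |
| "metadata": {}, | |
| "source": [ | |
| "To make that failure mode concrete, here's a minimal sketch (not from the original post) of the first-attempt pair construction. The first-attempt target `x_{n+1} - x_n` comes out exactly 100 times smaller than the `x - z` target used later:\n", | |
| "\n", | |
| "```python\n", | |
| "import torch\n", | |
| "\n", | |
| "x = torch.rand(28, 28)             # stand-in for a dataset image\n", | |
| "z = torch.ones(28, 28)             # blank white page\n", | |
| "n = torch.rand(1).item()           # random step in [0, 1)\n", | |
| "xn  = (1-n)*z + n*x\n", | |
| "xn1 = (1-(n+0.01))*z + (n+0.01)*x\n", | |
| "(xn1-xn).abs().mean(), (x-z).abs().mean()   # tiny target vs full target\n", | |
| "```" | |
| ] | |
| }, | |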
| { | |
| "cell_type": "markdown", | |
| "id": "6e1772c5", | |
| "metadata": { | |
| "solveit_ai": true | |
| }, | |
| "source": [ | |
| "I'm reading the previous post about flow matching and I don't understand the formula `$$ x_n = (1-n)z + nx $$`. If `z` is a blank image, wouldn't any multiplication of this empty matrix by a scalar (1-n) result always in a empty matrix?\n", | |
| "\n", | |
| "\n", | |
| "##### 🤖Reply🤖<!-- SOLVEIT_SEPARATOR_7f3a9b2c -->\n", | |
| "\n", | |
| "Great question! You're thinking about this correctly, but there's a key detail about what \"blank\" means in this context.\n", | |
| "\n", | |
| "In the post, when they say `z` is a \"blank page,\" they mean a white image, not an empty/zero matrix. Look at this line from the code:\n", | |
| "\n", | |
| "```python\n", | |
| "img = torch.ones(img_height, img_width) # white pixels = 1\n", | |
| "```\n", | |
| "\n", | |
| "So `z` is actually filled with 1s (representing white pixels), not 0s. When you multiply `(1-n) * z` where `z` is all 1s, you get `(1-n)` everywhere.\n", | |
| "\n", | |
| "The formula `x_n = (1-n)z + nx` is doing a linear interpolation between:\n", | |
| "- `z` (white image, all 1s) \n", | |
| "- `x` (target image, mix of 0s and 1s)\n", | |
| "\n", | |
| "At `n=0`: you get `1*z + 0*x = z` (pure white)\n", | |
| "At `n=1`: you get `0*z + 1*x = x` (target image)\n", | |
| "At `n=0.5`: you get `0.5*z + 0.5*x` (50% blend)\n", | |
| "\n", | |
| "Does this clarify why the multiplication doesn't result in zeros?" | |
| ] | |
| }, | |
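| { | |
| "cell_type": "markdown", | |
| "metadata": {}, | |
| "source": [ | |
| "A quick sketch with made-up values (not part of the original dialogue) that checks this behaviour:\n", | |
| "\n", | |
| "```python\n", | |
| "import torch\n", | |
| "\n", | |
| "z = torch.ones(2, 2)                       # white 'blank page'\n", | |
| "x = torch.tensor([[0., 1.], [0.5, 0.]])    # target image\n", | |
| "for n in (0., 0.5, 1.): print(n, (1-n)*z + n*x)\n", | |
| "```\n", | |
| "\n", | |
| "At `n=0` this prints `z` (all ones), at `n=1` it prints `x`, and in between a blend; the `(1-n)z` term never collapses to zeros because `z` is all ones." | |
| ] | |
| }, | |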
| { | |
| "cell_type": "markdown", | |
| "id": "c1ff3901", | |
| "metadata": { | |
| "solveit_ai": true | |
| }, | |
| "source": [ | |
| "oh, so the first part is only \"how much white\" is there? \n", | |
| "\n", | |
| "##### 🤖Reply🤖<!-- SOLVEIT_SEPARATOR_7f3a9b2c -->\n", | |
| "\n", | |
| "Exactly! You've got it perfectly. \n", | |
| "\n", | |
| "The `(1-n)` term controls \"how much of the starting white image\" remains, and the `n` term controls \"how much of the target image\" shows through.\n", | |
| "\n", | |
| "So at different steps:\n", | |
| "- `n=0.2`: 80% white + 20% target image \n", | |
| "- `n=0.7`: 30% white + 70% target image\n", | |
| "- `n=0.9`: 10% white + 90% target image\n", | |
| "\n", | |
| "It's like gradually fading from a white canvas to your target image. Does this linear blending approach make sense for why it creates a smooth \"drawing process\"?" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "570ad99e", | |
| "metadata": { | |
| "solveit_ai": true | |
| }, | |
| "source": [ | |
| "and I assume this would be happening in 3 layers if the image is in RGB?\n", | |
| "\n", | |
| "##### 🤖Reply🤖<!-- SOLVEIT_SEPARATOR_7f3a9b2c -->\n", | |
| "\n", | |
| "Exactly right! For RGB images, this interpolation happens independently across all 3 color channels (red, green, blue).\n", | |
| "\n", | |
| "So if you have:\n", | |
| "- `z`: white image with shape `(3, height, width)` where all values are 1.0\n", | |
| "- `x`: target RGB image with shape `(3, height, width)`\n", | |
| "\n", | |
| "Then `x_n = (1-n)z + nx` applies the same blending formula to each channel separately. Each pixel gets interpolated from white `(1,1,1)` toward its target RGB values.\n", | |
| "\n", | |
| "This means at `n=0.5`, a target pixel of `(0.2, 0.8, 0.1)` would become `(0.6, 0.9, 0.55)` - halfway between white and the target color.\n", | |
| "\n", | |
| "Does this make sense for how the color blending would work?" | |
| ] | |
| }, | |
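| { | |
| "cell_type": "markdown", | |
| "metadata": {}, | |
| "source": [ | |
| "Verifying that RGB arithmetic with a one-liner (a sketch, not from the dialogue):\n", | |
| "\n", | |
| "```python\n", | |
| "import torch\n", | |
| "\n", | |
| "z = torch.ones(3)                    # white pixel (R, G, B)\n", | |
| "x = torch.tensor([0.2, 0.8, 0.1])    # target pixel\n", | |
| "0.5*z + 0.5*x                        # tensor([0.6000, 0.9000, 0.5500])\n", | |
| "```" | |
| ] | |
| }, | |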
| { | |
| "cell_type": "markdown", | |
| "id": "36514e5a", | |
| "metadata": {}, | |
| "source": [ | |
| "#### Third Attempt\n", | |
| "\n", | |
| "What now? As our drawing algorithm is smooth and continuous this allows us to crack open the calculus toolbox. Let's see if we can use it to find an alternative way to represent \\( x_{n+1} \\) - \\( x_n \\).\n", | |
| "\n", | |
| "Recall the basic definition of a derivative:\n", | |
| "\n", | |
| "$$\n", | |
| "\\frac{df}{dx}\\ = \\lim_{h \\to 0} \\frac{f(x+h) - f(x)}{h}\n", | |
| "$$" | |
| ] | |
| }, | |
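| { | |
| "cell_type": "markdown", | |
| "metadata": {}, | |
| "source": [ | |
| "To see the limit at work numerically, here's a sketch (not from the post) using `f(x) = x**2`, whose derivative at `x=3` is 6; the quotient approaches 6 as `h` shrinks:\n", | |
| "\n", | |
| "```python\n", | |
| "def f(x): return x**2\n", | |
| "for h in (1, 0.1, 0.01, 0.001): print(h, (f(3+h) - f(3))/h)   # ≈ 7.0, 6.1, 6.01, 6.001\n", | |
| "```" | |
| ] | |
| }, | |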
| { | |
| "cell_type": "markdown", | |
| "id": "37a97d05", | |
| "metadata": { | |
| "solveit_ai": true | |
| }, | |
| "source": [ | |
| "I need a math refresher. I know what a derivative is, but what does this way of expressing it using limits, and what does \"h tends to zero\" stand for here? \n", | |
| "\n", | |
| "##### 🤖Reply🤖<!-- SOLVEIT_SEPARATOR_7f3a9b2c -->\n", | |
| "\n", | |
| "Great question! Let me break down this limit definition of a derivative.\n", | |
| "\n", | |
| "The derivative measures \"rate of change\" - how fast something is changing at a specific point. But here's the tricky part: to measure rate of change at a single point, we need to look at what happens in a tiny neighborhood around that point.\n", | |
| "\n", | |
| "The expression `f(x+h) - f(x)` is the change in the function's value when we move from point `x` to point `x+h`. And `h` is just the size of that step.\n", | |
| "\n", | |
| "So `[f(x+h) - f(x)]/h` gives us the \"average rate of change\" over that small interval of size `h`.\n", | |
| "\n", | |
| "The \"limit as h approaches 0\" (`h → 0`) means: what happens to this average rate of change as we make the step size smaller and smaller, approaching zero?\n", | |
| "\n", | |
| "Think of it like zooming in on a curve - as you zoom in more and more, the curve looks more and more like a straight line, and that line's slope is the derivative.\n", | |
| "\n", | |
| "Does this help clarify what the limit is doing here?" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "8c3270cc", | |
| "metadata": {}, | |
| "source": [ | |
| "Let's rewrite it using our own variables:\n", | |
| "\n", | |
| "$$\n", | |
| "\\frac{dx_{n}}{dn}\\ = \\frac{x_{n+1} - x_{n}}{\\Delta n}\n", | |
| "$$" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "36821a0c", | |
| "metadata": { | |
| "solveit_ai": true | |
| }, | |
| "source": [ | |
| "essentially, we are defining how the gradient of interpolated image \\(x_n\\) right? How the image changes at a given step n. The \\(\\deltan \\) means the differential between n steps right? or what was delta in math notation.\n", | |
| "\n", | |
| "##### 🤖Reply🤖<!-- SOLVEIT_SEPARATOR_7f3a9b2c -->\n", | |
| "\n", | |
| "Yes, you've got it! You're absolutely right on both counts.\n", | |
| "\n", | |
| "`dx_n/dn` is indeed the gradient (rate of change) of the interpolated image with respect to the step `n`. It tells us \"how fast is the image changing at this particular step.\"\n", | |
| "\n", | |
| "And yes, `Δn` (delta n) represents the difference between steps - it's the step size. In discrete math, we often use Δ (delta) to mean \"change in\" or \"difference in.\"\n", | |
| "\n", | |
| "So `(x_{n+1} - x_n)/Δn` is saying: \"the change in image divided by the change in step size.\"\n", | |
| "\n", | |
| "In the context of the flow matching post, they're essentially trying to use calculus to better represent what happens between steps, rather than dealing with those tiny discrete differences that were hard for the neural network to learn.\n", | |
| "\n", | |
| "Does this connection between the discrete steps and the continuous derivative make sense?" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "01851c0a", | |
| "metadata": { | |
| "solveit_ai": true | |
| }, | |
| "source": [ | |
| "yes it makes sense, if the delta is very small then it will almost change nothing. Could you give me a \"explain plain\" example of that gradient? If a pixel is dark always it won't change much, but if it goes from dark to very bright, then I will have a very steep gradient?\n", | |
| "\n", | |
| "##### 🤖Reply🤖<!-- SOLVEIT_SEPARATOR_7f3a9b2c -->\n", | |
| "\n", | |
| "Perfect intuition! You've got it exactly right.\n", | |
| "\n", | |
| "Let's say we have a pixel that starts white (value 1.0) and needs to become black (value 0.0) by the end.\n", | |
| "\n", | |
| "Using the formula `x_n = (1-n)z + nx`:\n", | |
| "- At n=0: pixel = 1.0 (white)\n", | |
| "- At n=0.5: pixel = 0.5 (gray) \n", | |
| "- At n=1: pixel = 0.0 (black)\n", | |
| "\n", | |
| "The gradient `dx_n/dn` for this pixel would be -1.0 everywhere - it's changing at a constant steep rate from white to black.\n", | |
| "\n", | |
| "But for a pixel that's supposed to stay white throughout:\n", | |
| "- At n=0: pixel = 1.0 (white)\n", | |
| "- At n=0.5: pixel = 1.0 (white)\n", | |
| "- At n=1: pixel = 1.0 (white)\n", | |
| "\n", | |
| "The gradient would be 0 - no change at all!\n", | |
| "\n", | |
| "So yes, pixels that need to change dramatically have steep gradients, while pixels that stay the same have gradients near zero.\n", | |
| "\n", | |
| "Does this concrete example help solidify the concept?" | |
| ] | |
| }, | |
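| { | |
| "cell_type": "markdown", | |
| "metadata": {}, | |
| "source": [ | |
| "Here's a tiny numeric check of those two pixels (a sketch, not part of the dialogue). The finite difference `(x_{n+1} - x_n)/dn` comes out the same at every step:\n", | |
| "\n", | |
| "```python\n", | |
| "import torch\n", | |
| "\n", | |
| "z = torch.tensor([1., 1.])   # both pixels start white\n", | |
| "x = torch.tensor([0., 1.])   # pixel 0 ends black, pixel 1 stays white\n", | |
| "dn = 0.01\n", | |
| "for n in (0., 0.5, 0.99):\n", | |
| "    xn, xn1 = (1-n)*z + n*x, (1-(n+dn))*z + (n+dn)*x\n", | |
| "    print(n, (xn1-xn)/dn)    # ≈ tensor([-1., 0.]) every time\n", | |
| "```" | |
| ] | |
| }, | |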
| { | |
| "cell_type": "markdown", | |
| "id": "7d9ebe6a", | |
| "metadata": { | |
| "solveit_ai": true | |
| }, | |
| "source": [ | |
| "so all gradients in this formulation would be linear? because they will go from the initial value to the end value in `n` steps.\n", | |
| "\n", | |
| "##### 🤖Reply🤖<!-- SOLVEIT_SEPARATOR_7f3a9b2c -->\n", | |
| "\n", | |
| "Exactly right! In this specific linear interpolation formulation `x_n = (1-n)z + nx`, all the gradients are indeed linear (constant).\n", | |
| "\n", | |
| "For any pixel, the derivative `dx_n/dn = x - z` is just the difference between the final value and initial value - which is constant throughout the entire process.\n", | |
| "\n", | |
| "So if a pixel goes from white (1.0) to black (0.0), its gradient is always -1.0 at every step. If it goes from white to gray (0.5), the gradient is always -0.5.\n", | |
| "\n", | |
| "This is actually one of the limitations the author mentions - this simple linear approach creates very predictable, uniform changes that might be too simplistic for a neural network to learn interesting patterns from.\n", | |
| "\n", | |
| "Do you think this constant gradient might be part of why the model \"fails to learn anything useful\" as mentioned in the post?" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "146b93dc", | |
| "metadata": {}, | |
| "source": [ | |
| "Let's isolate \\(x_{n+1}\\) - \\(x_n\\):\n", | |
| "\n", | |
| "$$\n", | |
| "x_{n+1} - x_{n} = \\frac{dx_{n}}{dn}\\ \\Delta n\n", | |
| "$$\n", | |
| "\n", | |
| "Ok, we now have an alternative definition for \\(x_{n+1}\\) - \\(x_n\\). As \\(\\Delta n \\) is simply a constant, if we can figure out how to compute \\(\\frac{dx_{n}}{dn}\\) then we can tweak our training dataset and attempt to train a model.\n", | |
| "\n", | |
| "Recall the definition of \\(x_n\\)\n", | |
| "\n", | |
| "$$\n", | |
| "x_n = (1-n)z + (n)x\n", | |
| "$$\n", | |
| "\n", | |
| "Computing \\(\\frac{dx_{n}}{dn}\\) is pretty straightforward \n", | |
| "\n", | |
| "$$\n", | |
| "\\frac{dx_{n}}{dn} = -z+x = x-z\n", | |
| "$$" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "9e9b0c21", | |
| "metadata": { | |
| "solveit_ai": true | |
| }, | |
| "source": [ | |
| "could you spell out the rules & steps for the derivative?\n", | |
| "\n", | |
| "##### 🤖Reply🤖<!-- SOLVEIT_SEPARATOR_7f3a9b2c -->\n", | |
| "\n", | |
| "Sure! Let me break down the derivative rules used here.\n", | |
| "\n", | |
| "For the function `x_n = (1-n)z + nx`, we need these basic derivative rules:\n", | |
| "\n", | |
| "**Constant Rule**: The derivative of a constant is 0\n", | |
| "- `d/dn (z) = 0` (since z doesn't depend on n)\n", | |
| "\n", | |
| "**Power Rule**: `d/dn (n) = 1`\n", | |
| "\n", | |
| "**Constant Multiple Rule**: `d/dn (c·f) = c·(df/dn)`\n", | |
| "- `d/dn (nx) = x·(d/dn n) = x·1 = x`\n", | |
| "- `d/dn ((1-n)z) = z·(d/dn (1-n)) = z·(-1) = -z`\n", | |
| "\n", | |
| "**Sum Rule**: `d/dn (f + g) = df/dn + dg/dn`\n", | |
| "\n", | |
| "Putting it together:\n", | |
| "`dx_n/dn = d/dn[(1-n)z + nx] = d/dn((1-n)z) + d/dn(nx) = -z + x = x - z`\n", | |
| "\n", | |
| "Does this step-by-step breakdown help clarify how we got to `x - z`?" | |
| ] | |
| }, | |
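| { | |
| "cell_type": "markdown", | |
| "metadata": {}, | |
| "source": [ | |
| "sympy can double-check the algebra (a sketch assuming sympy is installed; it isn't used in the original dialogue):\n", | |
| "\n", | |
| "```python\n", | |
| "import sympy as sp\n", | |
| "\n", | |
| "n, z, x = sp.symbols('n z x')\n", | |
| "sp.diff((1-n)*z + n*x, n)   # x - z\n", | |
| "```" | |
| ] | |
| }, | |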
| { | |
| "cell_type": "markdown", | |
| "id": "50176061", | |
| "metadata": {}, | |
| "source": [ | |
| "Here's our new definition \n", | |
| "$$\n", | |
| "x_{n+1} - x_{n} = (x-z)\\Delta n\n", | |
| "$$\n", | |
| "\n", | |
| "As \\(\\Delta n\\) is a constant \\(\\frac{1}{n}\\), we can ignore it during training and instead scale the model's output by \\(\\Delta n\\) during inference.\n", | |
| "\n", | |
| "Finally, this gives us the following input, outputs pairs:\n", | |
| "\n", | |
| "####\n", | |
| "\n", | |
| " * input: (\\(x_n\\), n)\n", | |
| " * output: \\(x-z\\)\n", | |
| "\n", | |
| "Before we start training we need to make one last modification. To a human, a blank page allows us to create any image we want. It is an unbridled source of creativity. To a neural net the opposite is true. A blank page represents a fixed starting point and constrains creativity. To enable more diverse images let's use a random starting point instead.\n", | |
| "\n", | |
| "Putting everything together, here's our training set creation process:\n", | |
| "\n", | |
| "#####\n", | |
| "\n", | |
| " 1. \\(n\\): select a random drawing algorithm step [a float between 0 and 1]\n", | |
| " 2. \\(x\\): pluck a random image from the dataset\n", | |
| " 3. \\(z\\): generate an image of pure random noise with the same size as \\j(x\\)\n", | |
| " 4. compute \\(x_n\\): \\((1-n)z + nx\\)\n", | |
| " 5. pass (\\(x_n\\), \\(n\\)) as input to the model, use \\(x-z\\) as the output \n", | |
| "\n", | |
| "#### Here's the same process in code.\n", | |
| "\n", | |
| "_Note_ : The `sample` method in the [notebook](https://github.com/comhar/flow-matching/blob/main/flow_matching.ipynb) includes a class label `c` which allows us to generate images for specific classes.\n" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "0a7985aa", | |
| "metadata": { | |
| "time_run": "3:15:28p" | |
| }, | |
| "outputs": [], | |
| "source": [ | |
| "def noisify(x):\n", | |
| " device = x.device\n", | |
| " n = torch.rand(len(x),).to(x).clamp(0,0.999)\n", | |
| " nr = n.reshape(-1, 1, 1, 1)\n", | |
| " z = torch.randn(x.shape, device=device)\n", | |
| " xn = ((1-nr)*z + nr*x).to(device)\n", | |
| " return (xn, n.to(device)), x-z\n" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "b253c392", | |
| "metadata": { | |
| "solveit_ai": true | |
| }, | |
| "source": [ | |
| "I followed along the math, but it's not exactly clear to me what the difference is compared to the original attempt. Is it only that we are now passing a much more different image, and in addition to that, we are telling the AI how many steps it is removed from the original, so the changes are going to be larger? Or am I not sure why we did all this? \n", | |
| "\n", | |
| "##### 🤖Reply🤖<!-- SOLVEIT_SEPARATOR_7f3a9b2c -->\n", | |
| "\n", | |
| "Great question! You've identified some key differences, but let me clarify the main breakthrough here.\n", | |
| "\n", | |
| "The crucial insight is that instead of predicting tiny step-by-step changes `x_{n+1} - x_n` (which were hard to learn), the model now predicts the **direction vector** `x - z` - which points from the current noisy starting point directly toward the target image.\n", | |
| "\n", | |
| "This direction vector is:\n", | |
| "1. **Much larger** - instead of tiny 1% changes, it's the full difference between noise and target\n", | |
| "2. **More meaningful** - it tells the model \"here's the direction you need to go\" rather than \"take this tiny step\"\n", | |
| "3. **Constant for each training pair** - the same `x - z` applies regardless of which step `n` you're at\n", | |
| "\n", | |
| "The model learns: \"Given this partially-interpolated image at step n, what's the vector that points toward the clean target?\"\n", | |
| "\n", | |
| "Does this help clarify why this formulation is easier for the neural network to learn than tiny incremental changes?" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "1bf0528d", | |
| "metadata": { | |
| "solveit_ai": true | |
| }, | |
| "source": [ | |
| "ok, but then every \\(x_n\\) for each value of n will have the same vector right? because we are using only the initial `z` (which will probably be either blank or noise) and the final `x`. So there's no intermediate output examples. \n", | |
| "\n", | |
| "Instead of predicting small steps, given a random `n` we try to predict the **direction vector** always.\n", | |
| "\n", | |
| "##### 🤖Reply🤖<!-- SOLVEIT_SEPARATOR_7f3a9b2c -->\n", | |
| "\n", | |
| "Exactly! You've nailed the key insight. \n", | |
| "\n", | |
| "Yes, for any given pair of `(z, x)`, the direction vector `x - z` is **always the same** regardless of what `n` is. The model isn't learning different vectors for different steps - it's learning to predict the same direction vector from any point along the interpolation path.\n", | |
| "\n", | |
| "But here's the clever part: the **input** `x_n = (1-n)z + nx` changes dramatically based on `n`. At `n=0.1` the input looks mostly like noise, at `n=0.9` it looks mostly like the target.\n", | |
| "\n", | |
| "So the model learns: \"No matter where I am on this path from noise to target, I should always point in the same direction - toward the clean image.\"\n", | |
| "\n", | |
| "This is much more stable than learning \"take this specific tiny step\" which varied wildly and was hard to learn.\n", | |
| "\n", | |
| "Does this make the training strategy clearer?" | |
| ] | |
| }, | |
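| { | |
| "cell_type": "markdown", | |
| "metadata": {}, | |
| "source": [ | |
| "This also shows how sampling works at inference time: start from pure noise and take `n_steps` small Euler steps in the predicted direction, each scaled by `dn`. A minimal sketch, assuming `model(xn, n)` is a trained network approximating `x - z` (the real `sample` in the linked notebook also takes a class label `c`):\n", | |
| "\n", | |
| "```python\n", | |
| "import torch\n", | |
| "\n", | |
| "def sample(model, sz, n_steps=100):\n", | |
| "    xn = torch.randn(sz)                  # start from pure noise z\n", | |
| "    dn = 1/n_steps\n", | |
| "    for i in range(n_steps):\n", | |
| "        n = torch.full((sz[0],), i*dn)\n", | |
| "        xn = xn + model(xn, n)*dn         # Euler step toward the data\n", | |
| "    return xn\n", | |
| "```" | |
| ] | |
| }, | |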
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "d6b437df", | |
| "metadata": { | |
| "time_run": "3:15:30p" | |
| }, | |
| "outputs": [ | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| " % Total % Received % Xferd Average Speed Time Time Time Current\r\n", | |
| " Dload Upload Total Spent Left Speed\r\n", | |
| "\r", | |
| " 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0" | |
| ] | |
| }, | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| "\r", | |
| " 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0" | |
| ] | |
| }, | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| "\r", | |
| " 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r\n" | |
| ] | |
| }, | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| "\r", | |
| " 15 68.7M 15 10.4M 0 0 9367k 0 0:00:07 0:00:01 0:00:06 9367k" | |
| ] | |
| }, | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| "\r", | |
| "100 68.7M 100 68.7M 0 0 37.3M 0 0:00:01 0:00:01 --:--:-- 83.6M\r\n" | |
| ] | |
| } | |
| ], | |
| "source": [ | |
| "#!/bin/bash\n", | |
| "!curl -L -o ./fashionmnist.zip https://www.kaggle.com/api/v1/datasets/download/zalando-research/fashionmnist" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "9b719e8e", | |
| "metadata": { | |
| "time_run": "3:15:30p" | |
| }, | |
| "outputs": [], | |
| "source": [ | |
| "from fastcore.tools import run_cmd, rg, sed" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "7b36a88f", | |
| "metadata": { | |
| "solveit_ai": true | |
| }, | |
| "source": [ | |
| "uncompress the mnist dataset into a subfolder fashionmnist using &`run_cmd`\n", | |
| "\n", | |
| "##### 🤖Reply🤖<!-- SOLVEIT_SEPARATOR_7f3a9b2c -->\n", | |
| "\n", | |
| "<details class='tool-usage-details'>\n", | |
| "\n", | |
| "- `run_cmd({'cmd': 'unzip', 'argstr': '-d fashionmnist ./fashionmnist.zip'})`\n", | |
| " - `'Archive: ./fashionmnist.zip\\n inflating: fashionmnist/fashion-mnist_test.csv \\n inflating: fashionmnist/fashion-mnist_train.csv \\n inflating: fashionmnist/t10k-images-idx3-ubyte \\n inflating: fashionmnist/t10k-labels-idx1-ubyte \\n inflating: fashionmnist/train-images-idx3-ubyte \\n inflating: fashionmnist/train-labels-idx1-ubyte \\n'`\n", | |
| "\n", | |
| "</details>\n", | |
| "\n", | |
| "Perfect! The Fashion-MNIST dataset has been extracted to the `fashionmnist/` folder. You now have both CSV files and the original IDX format files available." | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "78663f8b", | |
| "metadata": { | |
| "time_run": "3:15:30p" | |
| }, | |
| "outputs": [ | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| "fashion-mnist_test.csv\t t10k-images-idx3-ubyte train-images-idx3-ubyte\r\n", | |
| "fashion-mnist_train.csv t10k-labels-idx1-ubyte train-labels-idx1-ubyte\r\n" | |
| ] | |
| } | |
| ], | |
| "source": [ | |
| "!ls fashionmnist" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "a139237c", | |
| "metadata": {}, | |
| "source": [ | |
| "**Fashion MNIST description from Kaggle** \n", | |
| "\n", | |
| "Each image is 28 pixels in height and 28 pixels in width, for a total of 784 pixels in total. Each pixel has a single pixel-value associated with it, indicating the lightness or darkness of that pixel, with higher numbers meaning darker. This pixel-value is an integer between 0 and 255. The training and test data sets have 785 columns. The first column consists of the class labels (see above), and represents the article of clothing. The rest of the columns contain the pixel-values of the associated image.\n", | |
| "\n", | |
| "To locate a pixel on the image, suppose that we have decomposed x as x = i * 28 + j, where i and j are integers between 0 and 27. The pixel is located on row i and column j of a 28 x 28 matrix.\n", | |
| "For example, pixel31 indicates the pixel that is in the fourth column from the left, and the second row from the top, as in the ascii-diagram below." | |
| ] | |
| }, | |
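| { | |
| "cell_type": "markdown", | |
| "metadata": {}, | |
| "source": [ | |
| "A quick sketch of that index decomposition in code (not part of the original dialogue):\n", | |
| "\n", | |
| "```python\n", | |
| "def pixel_rc(x, w=28):\n", | |
| "    i, j = divmod(x, w)   # x = i*28 + j\n", | |
| "    return i, j\n", | |
| "\n", | |
| "pixel_rc(31)   # (1, 3): second row from the top, fourth column from the left\n", | |
| "```" | |
| ] | |
| }, | |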
| { | |
| "cell_type": "markdown", | |
| "id": "a1519619", | |
| "metadata": { | |
| "skipped": true, | |
| "solveit_ai": true | |
| }, | |
| "source": [ | |
| "how can we display one image of the dataset? can it be done using torch so we call `noisify` on one of the samples?\n", | |
| "\n", | |
| "##### 🤖Reply🤖<!-- SOLVEIT_SEPARATOR_7f3a9b2c -->\n", | |
| "\n", | |
| "\n", | |
| "<details class='tool-usage-details'>\n", | |
| "\n", | |
| "- `run_cmd({'cmd': 'unzip', 'argstr': 'fashionmnist.zip -d fashionmnist'})`\n", | |
| " - `'Archive: fashionmnist.zip\\n\\nreplace fashionmnist/fashion-mnist_test.csv? [y]es, [n]o, [A]ll, [N]one, [r]ename: NULL\\n(EOF or read error, treating as "[N]one" ...)\\n'`\n", | |
| "\n", | |
| "</details>\n", | |
| "\n", | |
| "To load and display a Fashion-MNIST image with torch, you could:\n", | |
| "\n", | |
| "```python\n", | |
| "import pandas as pd\n", | |
| "df = pd.read_csv('fashionmnist/fashion-mnist_train.csv')\n", | |
| "sample = df.iloc[0, 1:].values.reshape(28, 28) # Skip label, reshape to 28x28\n", | |
| "img_tensor = torch.tensor(sample, dtype=torch.float32) / 255.0 # Normalize to [0,1]\n", | |
| "```\n", | |
| "\n", | |
| "Then you can call `noisify` on it. Would you like me to show the full code to display it, or do you want to try building this step by step?" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "0b40f9fa", | |
| "metadata": { | |
| "time_run": "3:15:34p" | |
| }, | |
| "outputs": [ | |
| { | |
| "data": { | |
| "text/html": [ | |
| "<div>\n", | |
| "<style scoped>\n", | |
| " .dataframe tbody tr th:only-of-type {\n", | |
| " vertical-align: middle;\n", | |
| " }\n", | |
| "\n", | |
| " .dataframe tbody tr th {\n", | |
| " vertical-align: top;\n", | |
| " }\n", | |
| "\n", | |
| " .dataframe thead th {\n", | |
| " text-align: right;\n", | |
| " }\n", | |
| "</style>\n", | |
| "<table border=\"1\" class=\"dataframe\">\n", | |
| " <thead>\n", | |
| " <tr style=\"text-align: right;\">\n", | |
| " <th></th>\n", | |
| " <th>label</th>\n", | |
| " <th>pixel1</th>\n", | |
| " <th>pixel2</th>\n", | |
| " <th>pixel3</th>\n", | |
| " <th>pixel4</th>\n", | |
| " <th>pixel5</th>\n", | |
| " <th>pixel6</th>\n", | |
| " <th>pixel7</th>\n", | |
| " <th>pixel8</th>\n", | |
| " <th>pixel9</th>\n", | |
| " <th>...</th>\n", | |
| " <th>pixel775</th>\n", | |
| " <th>pixel776</th>\n", | |
| " <th>pixel777</th>\n", | |
| " <th>pixel778</th>\n", | |
| " <th>pixel779</th>\n", | |
| " <th>pixel780</th>\n", | |
| " <th>pixel781</th>\n", | |
| " <th>pixel782</th>\n", | |
| " <th>pixel783</th>\n", | |
| " <th>pixel784</th>\n", | |
| " </tr>\n", | |
| " </thead>\n", | |
| " <tbody>\n", | |
| " <tr>\n", | |
| " <th>0</th>\n", | |
| " <td>2</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>...</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " </tr>\n", | |
| " <tr>\n", | |
| " <th>1</th>\n", | |
| " <td>9</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>...</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " </tr>\n", | |
| " <tr>\n", | |
| " <th>2</th>\n", | |
| " <td>6</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>5</td>\n", | |
| " <td>0</td>\n", | |
| " <td>...</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>30</td>\n", | |
| " <td>43</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " </tr>\n", | |
| " <tr>\n", | |
| " <th>3</th>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>1</td>\n", | |
| " <td>2</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>...</td>\n", | |
| " <td>3</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>1</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " </tr>\n", | |
| " <tr>\n", | |
| " <th>4</th>\n", | |
| " <td>3</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>...</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " <td>0</td>\n", | |
| " </tr>\n", | |
| " </tbody>\n", | |
| "</table>\n", | |
| "<p>5 rows × 785 columns</p>\n", | |
| "</div>" | |
| ], | |
| "text/plain": [ | |
| " label pixel1 pixel2 pixel3 pixel4 pixel5 pixel6 pixel7 pixel8 \\\n", | |
| "0 2 0 0 0 0 0 0 0 0 \n", | |
| "1 9 0 0 0 0 0 0 0 0 \n", | |
| "2 6 0 0 0 0 0 0 0 5 \n", | |
| "3 0 0 0 0 1 2 0 0 0 \n", | |
| "4 3 0 0 0 0 0 0 0 0 \n", | |
| "\n", | |
| " pixel9 ... pixel775 pixel776 pixel777 pixel778 pixel779 pixel780 \\\n", | |
| "0 0 ... 0 0 0 0 0 0 \n", | |
| "1 0 ... 0 0 0 0 0 0 \n", | |
| "2 0 ... 0 0 0 30 43 0 \n", | |
| "3 0 ... 3 0 0 0 0 1 \n", | |
| "4 0 ... 0 0 0 0 0 0 \n", | |
| "\n", | |
| " pixel781 pixel782 pixel783 pixel784 \n", | |
| "0 0 0 0 0 \n", | |
| "1 0 0 0 0 \n", | |
| "2 0 0 0 0 \n", | |
| "3 0 0 0 0 \n", | |
| "4 0 0 0 0 \n", | |
| "\n", | |
| "[5 rows x 785 columns]" | |
| ] | |
| }, | |
| "execution_count": 28, | |
| "metadata": {}, | |
| "output_type": "execute_result" | |
| } | |
| ], | |
| "source": [ | |
| "import pandas as pd\n", | |
| "df = pd.read_csv('fashionmnist/fashion-mnist_train.csv')\n", | |
| "df.head()" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "8f8be740", | |
| "metadata": { | |
| "time_run": "3:15:34p" | |
| }, | |
| "outputs": [ | |
| { | |
| "data": { | |
| "text/plain": [ | |
| "label 2\n", | |
| "pixel1 0\n", | |
| "pixel2 0\n", | |
| "pixel3 0\n", | |
| "pixel4 0\n", | |
| " ..\n", | |
| "pixel780 0\n", | |
| "pixel781 0\n", | |
| "pixel782 0\n", | |
| "pixel783 0\n", | |
| "pixel784 0\n", | |
| "Name: 0, Length: 785, dtype: int64" | |
| ] | |
| }, | |
| "execution_count": 30, | |
| "metadata": {}, | |
| "output_type": "execute_result" | |
| } | |
| ], | |
| "source": [ | |
| "df.iloc[0]" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "8b3648a2", | |
| "metadata": { | |
| "time_run": "3:15:34p" | |
| }, | |
| "outputs": [ | |
| { | |
| "data": { | |
| "text/plain": [ | |
| "pixel1 0\n", | |
| "pixel2 0\n", | |
| "pixel3 0\n", | |
| "pixel4 0\n", | |
| "pixel5 0\n", | |
| " ..\n", | |
| "pixel780 0\n", | |
| "pixel781 0\n", | |
| "pixel782 0\n", | |
| "pixel783 0\n", | |
| "pixel784 0\n", | |
| "Name: 0, Length: 784, dtype: int64" | |
| ] | |
| }, | |
| "execution_count": 32, | |
| "metadata": {}, | |
| "output_type": "execute_result" | |
| } | |
| ], | |
| "source": [ | |
| "df.iloc[0, 1:]" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "fb60e85d", | |
| "metadata": { | |
| "time_run": "3:15:34p" | |
| }, | |
| "outputs": [ | |
| { | |
| "data": { | |
| "text/plain": [ | |
| "(tensor([[0., 0., 0.],\n", | |
| " [0., 0., 0.],\n", | |
| " [0., 0., 0.]]),\n", | |
| " torch.Size([28, 28]))" | |
| ] | |
| }, | |
| "execution_count": 34, | |
| "metadata": {}, | |
| "output_type": "execute_result" | |
| } | |
| ], | |
| "source": [ | |
| "s = df.iloc[0, 1:].values.reshape(28, 28) # Skip label, reshape to 28x28\n", | |
| "it = torch.tensor(s, dtype=torch.float32) \n", | |
| "it[:3, :3], it.shape" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "8cad6bb6", | |
| "metadata": { | |
| "solveit_ai": true | |
| }, | |
| "source": [ | |
| "how can we display that tensor as an image?\n", | |
| "\n", | |
| "##### 🤖Reply🤖<!-- SOLVEIT_SEPARATOR_7f3a9b2c -->\n", | |
| "\n", | |
| "<details class='tool-usage-details'>\n", | |
| "\n", | |
| "- `run_cmd({'cmd': 'unzip', 'argstr': 'fashionmnist.zip -d fashionmnist'})`\n", | |
| " - `'Archive: fashionmnist.zip\\n\\nreplace fashionmnist/fashion-mnist_test.csv? [y]es, [n]o, [A]ll, [N]one, [r]ename: NULL\\n(EOF or read error, treating as "[N]one" ...)\\n'`\n", | |
| "\n", | |
| "</details>\n", | |
| "\n", | |
| "You can use `plt.imshow()` to display the tensor as an image:\n", | |
| "\n", | |
| "```python\n", | |
| "plt.imshow(it, cmap='gray')\n", | |
| "plt.show()\n", | |
| "```\n", | |
| "\n", | |
| "The `cmap='gray'` tells matplotlib to use a grayscale colormap since Fashion-MNIST images are grayscale.\n", | |
| "\n", | |
| "Try it and see what Fashion-MNIST item appears!" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "e1e24746", | |
| "metadata": { | |
| "time_run": "3:15:34p" | |
| }, | |
| "outputs": [ | |
| { | |
| "data": { | |
| "image/png": "iVBORw0KGgoAAAANSUhEUgAAANAAAADNCAYAAADJ7P4xAAAAOnRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjEwLjYsIGh0dHBzOi8vbWF0cGxvdGxpYi5vcmcvq6yFwwAAAAlwSFlzAAAHsQAAB7EBBsVhhgAAD/tJREFUeJzt3X9InWUbB/CvrjY72kBNpeF2zIj+KNcQHEFZUoGMWGmYkwb9M9easw1lLMaKClwajP5Y23LZFgu2MRxnIa1xaJGQBCFZaSyL5dSktqZzrvnb9P3jRZnuua7Oea7Ho27fDwxe77v7nMtHr/c55/I+1x01MTExASJyJXquAyBayJhARAZMICKDO7x8sKioKC8fjmjeyc/PRyAQmPqadyCiMKSlpU37OuQ7UH9/P0pKSrB48WLk5ORg/fr1XsdGtOCEfAcKBAIoKChATU0N6urqZjMmogUj5DtQV1cXMjIyAACLFi2aNhcMBhEMBr2NjGgBCPkOlJqaiq6uLgDA+Pj4tLnc3Fy8//773kZGtACEfAd64YUXUFpaitOnT2Pt2rWzGRPRghFyAsXGxuKTTz6ZzViIFhyWsYkMmEBEBkwgIgMmEJEBE4jIgAlEZMAEIjJgAhEZMIGIDJhARAZMICIDJhCRgac9ESLljjvksMfGxiIYibMnnnhCnJv5UZBJv/76q7gmJibGcXxkZERck5qaKs69+OKLjuOff/65uKahoUGcu53xDkRkwAQiMmACERkwgYgMmEBEBguyCud1pa2oqEicKysrcxxftmyZuEaqtAHAihUrHMe3b98urmlsbHQcf/bZZ8U1O3bsEOe6u7sdxwsLC8U19913nzhXVVXlOL5z505xza2CdyAiAyYQkQETiMiACURkwAQiMmACERlEeXnI8Hw4YOuRRx4R577//nvH8StXrohrpI2r165dE9cMDg6Kc5KlS5eKc5WVlY7jubm54hptM+mSJUscx30+X9hrACAhIcFx/M477xTXrFy50nH8559/FtfMB2VlZdP6wPMORGQQcgLV19cjOzsbr776Kurr62cxJKKFI+QEioqKQlxcHIaGhtSXB0S3k5C38mRnZ+PJJ5/EpUuXUF5ejqNHj07N8YAtul2FfAeKjv7/fxofH4/h4eFpczxgi25XId+BAoEAgsEgrl69itLS0tmMiWjBmPMytrbGTWjnzp0T56TeAtevXxfXzDwPdlJsbKy4RvuehoaGwnoeAEhPT3ccv3z5srhGK7NPvpqYSdvlvnjxYnFO2n2emJgorpHK9lJs/0W65h7+egNgGZvIU0wgIgMmEJEBE4jIgAlEZOB5T4RwqyFuqiRvv/22OJeSkiLOdXZ2Oo7Hx8eHHUNvb684d9ddd4lzUsVq5t/WbtTc3Ow4rlXutI2h//zzj+O4VlkcGBgQ5+6++27H8T/++ENcI/WUOHDggLimpKREnPO62hYq3oGIDJhARAZMICIDJhCRAROIyIAJRGQQsc2k0iZBrQ2upKenR5zr6+sT56RSsbTBE5BLxW6+Vy0GaaMrIJdo3W7E/ffffx3HtR4G2uNJ108rzUsbTR944AFxjdY3QirNaz8LN7973ExK5CEmEJEBE4jIgAlEZMAEIjKI2AFbbqpw0nHs2sZG7ePZUqVLqxZJGyylShagV/Xi4uIcx0dHR8U1bgqlWoVOqixqH+nWYtA2oUqk7/fixYvimk8//VScy8/Pdxx3U2kLB+9ARAZMICIDJhCRAROIyIAJRGTABCIyiFgZWyuRSioqKhzHtRKytiFSKnFra6QSt9QHAACSk5PFOa3ELZF6LGi9F0ZGRsQ56fvVSunaNZf6Q2jXVfqzhnbY2erVq8U5v9/vON7R0SGukQ5PC+d3lXcgIgMxgdra2rBhwwYUFBQAAI4dO4aNGzfi5ZdfRn9/f8QCJJrPxARKT0/HoUOHpr4+deoUampqUFhYiEAgEJHgiOa7kN8DTW4N8fv9aGlpmTbHA7bodhX2e6DOzs6bjnjkAVt0uxLvQD09Pdi1axd++OEHVFZWIi8vD5s3b8bg4CD2798fyRiJ5i0xgRITE1FdXT1t7KWXXvrPBwy3tW9SUpL4WNLuae3wKI1UitX6EUifw29vbxfX1NXVhR3DY489Jq758ccfHce1MrbWelgqAkkHeQHA/fffL85JbXqvXr0qrpFi13bGa62M9+7d6zj+/PPPi2vc/GllJpaxiQyYQEQGTCAiAyYQkQETiMjA882k4X5+/5VXXhHnpIqeVj2RNggC8lHt2sZLadPj77//Lq5pamoS56SqXmZmprhmcHDQcfynn34S12jVTalqpl1XrfK5fPlyx3GtL4N0zbUYtKrec8895ziubfqVuplqcc/EOxCRAROIyIAJRGTABCIyYAIRGTCBiAwi1hNBsmnTJnFO2liofXZfK0G6afMqbbyUSsEA8PTTT4tzUvl2yZIl4pq0tDTH8XvvvVdco/UjuOeeexzHteujHW7l5tAwaVOt9mcI7ef+999/O46/++674prXXnvNcTycP8XwDkRkwAQiMmACERkwgYgMmEBEBhGrwj388MOO41rVTDqyXjqkCtA7aPp8PsdxrfIjVWRWrlwprsnIyBDnpM6kWsdSqeum9pFu7aPRUiVQqz5px8VLH7XWrqsUgxa3VtXr7u52HN+yZYu4RqrChYN3ICIDJhCRAROIyIAJRGTABCIyYAIRGUSsjF1WVuY4rpVOpTmt1KmVTqXeAlKvBAAYGBhwHL906ZK4RisvS2V77XuSDgbT+gdo35NUktY2oGp/HpCeK5zeApPcbiaV5qTyNiCXuMNpXc07EJGBmO5tbW3YvXs3+vr6cPLkSaxZswZ+vx9xcXHYs2dPJGMkmrdCPmDL5/NhfHwcKSkpEQmMaCEI+T1QbW0toqOjUV5ejubm5mlbWXjAFt2uQn4PNPnGMzk5+aY3tTxgi25XIR+w1draCp/Ph7GxMezYsSOSMRLNW1ET4fbi1R5MKVtevnzZcVz6LDsgl521/gFuyuJa6VTqiaCVVLX4YmNjHce13c5SuVrrYaBdB+nxtDK2tltc2iWtXQepLK7FrbVgln73tNa+Utxav4uysrJpr7ZYxiYyYAIRGTCBiAyYQEQGTCAiA083k6akpIiHLUndMLu6usTHk6pFWtXMzYZINxsltefRKnTSQVXa5/2lapZ27LtG+p60qp4Wn1Qd035O0nXQKmA9PT3inPTzkKqogPz7pXV8nYl3ICIDJhCRAROIyIAJRGTABCIyYAIRGXhaxl6+fDmeeuopx7nffvvNcVzbIKh9rt8NqUyrlbGlTYpaidZNXwap9wIgx61t3nUzp10HrWQulZBXrFghrjlw4IDjuNbDoKqqSpxrbGx0HNeug1SuLioqEtfMxDsQkQETiMiACURkwAQiMmACERkwgYgMPN+NnZWV5TiXlJTkOK6VsaXP4S9dulRco7W7lcrL2uf9pRKytuNa26kttf3VHk8qL7s5NQ6QS7vajmstPumaX7x4UVyzadMmx3HtZ7t582Z
xLi0tLazYAOC7775zHD9x4oS4Zt26ddO+5h2IyIAJRGTABCIyYAIRGTCBiAw8rcJ9++23OH/+vOPcn3/+6Tj+6KOPio+3evVqx/HDhw+La86dOyfOVVZWOo43NTWJa6R+BNomRTc9Fnw+n7hG2miqdfHUYnBTWdQODdOeS6JVxyRSpQ0Azp496zh+8OBBcU1tbW3YMczEOxCRgXgH+uyzz3D69Glcu3YNGzZsQEtLCy5cuIDR0VFUV1e7Or6P6FYjJlBeXh7y8vLQ29uL8vJyjIyM4OjRo9i3bx8aGhqQnZ0dyTiJ5qX/fAlXUVGB4uLiqZ0Efr//pl5uwWAQ5eXl6mtooluRmEATExN4/fXXsWbNGmRlZU19UrCzsxOpqanT/tvJA7a0LSxEtyLxJdwHH3yAs2fPoq+vD+fPn0dmZia2bduG4eFhlJSURDJGonlLTKCtW7di69atYT1Yb28vent7Hee2bdsWXmT4/8tFJx0dHeKad955R5yTNlhq/QikMra2kVMjvczVyrpu7uxam143tPikDcFa3GfOnDHHdKNnnnnG08cLFcvYRAZMICIDJhCRAROIyIAJRGTg6WZSQK5OuakKadU2SWtrqzgnbT/SNkpKH/ceHh4W12gfp5bmtK1R0jX1ujOptjlVI63TKnda5VPi9kAxiRR3OL+rvAMRGTCBiAyYQEQGTCAiAyYQkQETiMjA8zJ2uOVqrdwqdRLVPnd0/Phxce7YsWOO44mJieIaqVundviXmy6j2nWT5tyWnaXH0w4G055LOjRM6zLa0NAgzrmJwevNs6HiHYjIgAlEZMAEIjJgAhEZMIGIDJhARAael7HDpZUmvW6T9fHHHzuOP/jgg+IaqSWx1hPBzc5qjVQyd1P6BuRSuvazcNMTISEhQVxz5MgRcU7ipmyv/Szc/hngRrwDERkwgYgMmEBEBkwgIgMmEJHBnFfhImnjxo1zHQIZuKmaeVFp0/AORGQQ8gFbb7zxBjIzM+H3+7Fz585Ixkg0b4V0wNb27dsRGxuLkZERLFu2LJLxEc1r//keqKKiAlu2bMGqVasQHR2NdevWYe3atdP+yhwMBhEMBmc1UKL5KKQDtjIzM6e2oMTHx9/UbHDygC2i201IB2w1Njbil19+QUxMDBISEvgyjmjShIcA8B//3dL/ysrKpv3Os4xNZMAEIjJgAhEZMIGIDJhARAZMICIDJhCRAROIyIAJRGTABCIyYAIRGTCBiAw87YmQn5+PtLQ0AEB7e/vU/54rjIExeB1De3v79AEvd2PfaOau1bnAGBjDbMcway/hcnNzZ+uhGQNjmDcxRE1MzHLfH6JbGIsIRAZMICIDzzuT9vf3o6SkBIsXL0ZOTg7Wr1/v9VOEpL6+Hm+++SYeeughFBUVIScnJyLP29bWht27d6Ovrw8nT57EsWPH8PXXX2N4eBgffvghYmNj5ySONWvWwO/3Iy4uDnv27Jn155/ZV7ClpQUXLlzA6Ogoqqur1XN7ZiuG2eht6PkdKBAIoKCgADU1Nairq/P64UMWFRWFuLg4DA0NITU1NWLPm56ejkOHDk19ferUKdTU1KCwsBCBQGDO4vD5fBgfH0dKSkpEnj8vLw81NTWorq7G8ePH0dTUhH379iEjIwMNDQ0Rj+HEiROz0tvQ8wTq6urC8uXLAQCLFi3y+uFDlp2djTNnzuC9997DW2+9NWdxTP4/rd/vR1dX15zFUVtbi48++gh//fUXmpubI/a8FRUVKC4uRlJSEoC5uQ6TvQ2/+uorHD58GF988QWuXLniyWN7nkCpqalTF0g7ZnC23djHbnh4eM7imNTZ2RnRO+FMk9cjOTkZ169fn/Xnm7ihr2BWVha6u7sBRPY6TITR29Atz8vY/f39KC0tRUxMDB5//PE5ew8UCAQQDAZx9epVbN68OWLvgXp6erBr1y58+eWXKC4uht/vxzfffIPBwUHs378/Yu+BZsbR2toKn8+HsbExHDx40NVZreHYu3cvjhw5gqysLKxatQoDAwPo6OiYei8YifdAN8bg9/un9Tasqqry5Dn4dyAiA5axiQyYQEQGTCAiAyYQkcH/ADIG7bmUcrWaAAAAAElFTkSuQmCC", | |
| "text/plain": [ | |
| "<Figure size 320x240 with 1 Axes>" | |
| ] | |
| }, | |
| "metadata": {}, | |
| "output_type": "display_data" | |
| } | |
| ], | |
| "source": [ | |
| "plt.imshow(it, cmap='gray')\n", | |
| "plt.show()" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "c9197ede", | |
| "metadata": { | |
| "time_run": "3:15:34p" | |
| }, | |
| "outputs": [ | |
| { | |
| "data": { | |
| "text/plain": [ | |
| "tensor([0.1300, 0.9376, 0.8585, 0.5928, 0.3080, 0.1915, 0.0480, 0.8055, 0.0259,\n", | |
| " 0.8679, 0.9278, 0.2743, 0.9980, 0.6416, 0.2775, 0.8678, 0.8117, 0.3180,\n", | |
| " 0.8546, 0.7081, 0.5302, 0.4767, 0.1040, 0.5174, 0.4435, 0.0544, 0.9838,\n", | |
| " 0.8407])" | |
| ] | |
| }, | |
| "execution_count": 38, | |
| "metadata": {}, | |
| "output_type": "execute_result" | |
| } | |
| ], | |
| "source": [ | |
| "(xn, n), target = noisify(it)\n", | |
| "n" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "0c1db56c", | |
| "metadata": { | |
| "solveit_ai": true | |
| }, | |
| "source": [ | |
| "I think this is not correct right? it should return only 1 n because there's a single image. Should we reshape the input to 1, 28, 28? or am I missing something?\n", | |
| "\n", | |
| "##### 🤖Reply🤖<!-- SOLVEIT_SEPARATOR_7f3a9b2c -->\n", | |
| "\n", | |
| "You're absolutely right! The `noisify` function expects a batch dimension. The tensor `it` has shape `(28, 28)`, but the function expects `(batch_size, height, width)` or `(batch_size, channels, height, width)`.\n", | |
| "\n", | |
| "You need to add a batch dimension:\n", | |
| "\n", | |
| "```python\n", | |
| "it_batch = it.unsqueeze(0) # Shape becomes (1, 28, 28)\n", | |
| "(xn, n), target = noisify(it_batch)\n", | |
| "```\n", | |
| "\n", | |
| "Now `n` should be a single value instead of that array of 28 values you're seeing.\n", | |
| "\n", | |
| "Try that and see if it gives you the expected single `n` value!" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "6d31835c", | |
| "metadata": { | |
| "time_run": "3:15:34p" | |
| }, | |
| "outputs": [ | |
| { | |
| "data": { | |
| "text/plain": [ | |
| "\u001b[31mDocstring:\u001b[39m\n", | |
| "unsqueeze(input, dim) -> Tensor\n", | |
| "\n", | |
| "Returns a new tensor with a dimension of size one inserted at the\n", | |
| "specified position.\n", | |
| "\n", | |
| "The returned tensor shares the same underlying data with this tensor.\n", | |
| "\n", | |
| "A :attr:`dim` value within the range ``[-input.dim() - 1, input.dim() + 1)``\n", | |
| "can be used. Negative :attr:`dim` will correspond to :meth:`unsqueeze`\n", | |
| "applied at :attr:`dim` = ``dim + input.dim() + 1``.\n", | |
| "\n", | |
| "Args:\n", | |
| " input (Tensor): the input tensor.\n", | |
| " dim (int): the index at which to insert the singleton dimension\n", | |
| "\n", | |
| "Example::\n", | |
| "\n", | |
| " >>> x = torch.tensor([1, 2, 3, 4])\n", | |
| " >>> torch.unsqueeze(x, 0)\n", | |
| " tensor([[ 1, 2, 3, 4]])\n", | |
| " >>> torch.unsqueeze(x, 1)\n", | |
| " tensor([[ 1],\n", | |
| " [ 2],\n", | |
| " [ 3],\n", | |
| " [ 4]])\n", | |
| "\u001b[31mType:\u001b[39m builtin_function_or_method" | |
| ] | |
| }, | |
| "metadata": {}, | |
| "output_type": "display_data" | |
| } | |
| ], | |
| "source": [ | |
| "torch.unsqueeze?" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "edee0cc2", | |
| "metadata": { | |
| "time_run": "3:15:34p" | |
| }, | |
| "outputs": [ | |
| { | |
| "data": { | |
| "text/plain": [ | |
| "torch.Size([1, 28, 28])" | |
| ] | |
| }, | |
| "execution_count": 42, | |
| "metadata": {}, | |
| "output_type": "execute_result" | |
| } | |
| ], | |
| "source": [ | |
| "it_batch = it.unsqueeze(0) # Shape becomes (1, 28, 28)\n", | |
| "it_batch.shape" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "aa8cd28a", | |
| "metadata": { | |
| "time_run": "3:15:34p" | |
| }, | |
| "outputs": [ | |
| { | |
| "data": { | |
| "text/plain": [ | |
| "tensor([0.5790])" | |
| ] | |
| }, | |
| "execution_count": 44, | |
| "metadata": {}, | |
| "output_type": "execute_result" | |
| } | |
| ], | |
| "source": [ | |
| "(xn, n), target = noisify(it_batch)\n", | |
| "n" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "94fdc5d6", | |
| "metadata": { | |
| "hide_input": true, | |
| "time_run": "3:15:34p" | |
| }, | |
| "outputs": [], | |
| "source": [ | |
| "def show(it):\n", | |
| " plt.imshow(it, cmap='gray')" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "b5225669", | |
| "metadata": { | |
| "collapsed": true, | |
| "skipped": true, | |
| "time_run": "3:15:34p" | |
| }, | |
| "outputs": [ | |
| { | |
| "data": { | |
| "text/plain": [ | |
| "\u001b[31mDocstring:\u001b[39m\n", | |
| "squeeze(input: Tensor, dim: Optional[Union[int, List[int]]]) -> Tensor\n", | |
| "\n", | |
| "Returns a tensor with all specified dimensions of :attr:`input` of size `1` removed.\n", | |
| "\n", | |
| "For example, if `input` is of shape:\n", | |
| ":math:`(A \\times 1 \\times B \\times C \\times 1 \\times D)` then the `input.squeeze()`\n", | |
| "will be of shape: :math:`(A \\times B \\times C \\times D)`.\n", | |
| "\n", | |
| "When :attr:`dim` is given, a squeeze operation is done only in the given\n", | |
| "dimension(s). If `input` is of shape: :math:`(A \\times 1 \\times B)`,\n", | |
| "``squeeze(input, 0)`` leaves the tensor unchanged, but ``squeeze(input, 1)``\n", | |
| "will squeeze the tensor to the shape :math:`(A \\times B)`.\n", | |
| "\n", | |
| ".. note:: The returned tensor shares the storage with the input tensor,\n", | |
| " so changing the contents of one will change the contents of the other.\n", | |
| "\n", | |
| ".. warning:: If the tensor has a batch dimension of size 1, then `squeeze(input)`\n", | |
| " will also remove the batch dimension, which can lead to unexpected\n", | |
| " errors. Consider specifying only the dims you wish to be squeezed.\n", | |
| "\n", | |
| "Args:\n", | |
| " input (Tensor): the input tensor.\n", | |
| " dim (int or tuple of ints, optional): if given, the input will be squeezed\n", | |
| " only in the specified dimensions.\n", | |
| "\n", | |
| " .. versionchanged:: 2.0\n", | |
| " :attr:`dim` now accepts tuples of dimensions.\n", | |
| "\n", | |
| "Example::\n", | |
| "\n", | |
| " >>> x = torch.zeros(2, 1, 2, 1, 2)\n", | |
| " >>> x.size()\n", | |
| " torch.Size([2, 1, 2, 1, 2])\n", | |
| " >>> y = torch.squeeze(x)\n", | |
| " >>> y.size()\n", | |
| " torch.Size([2, 2, 2])\n", | |
| " >>> y = torch.squeeze(x, 0)\n", | |
| " >>> y.size()\n", | |
| " torch.Size([2, 1, 2, 1, 2])\n", | |
| " >>> y = torch.squeeze(x, 1)\n", | |
| " >>> y.size()\n", | |
| " torch.Size([2, 2, 1, 2])\n", | |
| " >>> y = torch.squeeze(x, (1, 2, 3))\n", | |
| " torch.Size([2, 2, 2])\n", | |
| "\u001b[31mType:\u001b[39m builtin_function_or_method" | |
| ] | |
| }, | |
| "metadata": {}, | |
| "output_type": "display_data" | |
| } | |
| ], | |
| "source": [ | |
| "torch.squeeze?" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "477a2a83", | |
| "metadata": { | |
| "time_run": "3:15:35p" | |
| }, | |
| "outputs": [ | |
| { | |
| "data": { | |
| "image/png": "iVBORw0KGgoAAAANSUhEUgAAANAAAADNCAYAAADJ7P4xAAAAOnRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjEwLjYsIGh0dHBzOi8vbWF0cGxvdGxpYi5vcmcvq6yFwwAAAAlwSFlzAAAHsQAAB7EBBsVhhgAAEPlJREFUeJzt3X9MldUfB/D3pSK8kAXxQx14kWptNRxjo7YWxdaWo80GDX80t/6RVIiAWNkatfwDBzXXH6kLZdqqSXOya3MzdvuxWOkfybQSN1lzYMhsFgioyM/g+08wuD6fz/fecx4uEO/XX3lO57nnPvLx4b7vec7jmZycnAQRGYma7wkQLWYsICILLCAiC3e7eTCPx+Pm4ZYU7dzxY+rCUVhYCL/fP/1nVwsIAKKinC9qJj8EJgWpvY50PJMxExMT4U3sX9L50d6r9Fpu/4NlWqhunleNds6l47n9D1N6evqsP4dcQIODgygtLUV0dDTy8vKwZcuWsF+c6L8m5M9Afr8fRUVFaGhowIkTJ+ZyTkSLRshXoO7ubmRmZgIA7rrrrll9gUAAgUDA3ZkRLQIhX4FSU1PR3d0N4M7fRdetW4ePPvrI3ZkRLQIhX4FeeukllJWV4eTJk1i/fv1czolo0Qi5gGJjY/Hpp58av5CbiZp2LCnlAtxNs4J/jQ3ldQB3I2m3Uy7T40UqZtf+bk2SwHCP5TinsI9ORNNYQEQWWEBEFlhARBZYQEQWXF8LJ6UebiZqWrISqbVwpsmTyXsyWefl9roxt4X7cwKYJYHaGDfeL69ARBZYQEQWWEBEFlhARBZYQEQWWEBEFlyPscONSE3i4HvuuUfsGx0dDft4bi84fOaZZ8I+Xnt7u9gXExPj2D48PCyOSUtLE/uKiooc25ubm8UxP/74o9gnnT9twe0///zj2O527KzF+dL8wrldn1cgIgssICILLCAiCywgIgssICILEVtMakJKQ8bHx8UxJknN5s2bxb7KykrH9lWrVoljtBRn9erVju07d+4Ux7S2tjq2v/DCC+KYt956S+zr6elxbN+0aZM4JnhDwZnq6uoc26urq8Ux2u3ZErcX3JosQA3GKxCRBRYQkQUWEJEFFhCRBRYQkQUWEJEFj5sPGfZ4PGHHk1rka7J/wNq1a8W+s2fPOrb39fWFPYebN2+KY7RFnpL77rtP7KutrXVs12LslStXin333nuvY7vX6w17DAAkJCQ4tmuLfrOyshzbL1y4II6JVIytqaiomLUPPK9ARBZCLqCWlhbk5uZix44daGlpmcMpES0eIReQx+NBXFwchoeHkZqaOpdzIlo0Ql7Kk5ubi2effRbXrl1DVVUVjhw5Mt3HB2zRUhXyFWjqw3R8fDxGRkZm9fEBW7RUhXwF8vv9CAQC6O/vR1lZ2VzOiWjRcD3GDjdONIkZtej74sWLYp8Uxd66dUscI903r0W+2l4AQ0NDju1a/P/QQw85tkurqgFgYGBA7JNeS1vlrsXY0rjExERxzPLlyx3btXNnsoLb5GFn2hzKy8sZYxO5hQVEZIEFRGSBBURkgQVEZCFiO5NKaYiWrEgpya5du8QxSUlJYt+VK1cc2x944AFxjPR+tAWoy5YtE/vGxsYc27Uw9LfffnNsv/tu+a9PSwmlhbAmYwB5Iax0vgF5T4n9+/eLY7SvT0zCZJPFynccI+xXJaJpLCAiCywgIgssICILLCAiCywgIgsRW0xq8jAjaWraIsobN26IfdJeBdoeBtK8tQhZengUIMfYWvTt5lcAgLz4U9vDQDuedP6Cb3uZ6cEHH3Rsf+SRR8Qx999/v9gnLQjWFiuH8yCtKZWVlVxMSuQWFhCRBRYQkQUWEJEFFhCRBdcXk0pMHny1YcMGx/bbt2+LY7Tbs6XbkrUULjY21rF9dHRUHCMlbQAQFxcX9hg3HgQ1k5Qsard0a6T3pCV3UkJ37do1ccznn38u9hUWFop9Eun8cTEpUYSwgIgssICILLCAiCywgIgssICILERsTwRtgaWkpqYm7GNpCyKlXUGjo6PFMVJcLe2sCcjRNyDHt1p0Kh0vJiZGHKPF7NL71cZo57y/v9+xXZuf9HNy/fp1cUxOTo7Y5/P5HNu7urrEMdKCYO0rhWC8AhFZEAuoo6MDW7duRVFREQCgsbERr776Kl555RUMDg5GbIJEC5lYQBkZGTh06ND0n48fP46GhgZs3LgRfr8/IpMjWuhC/gw09Turz+dDW1vbrD4+YIuWqrA/A3V1dd3xiEc+YIuWKvEK1Nvbi+rqavzyyy+ora1FQUEBSkpKMDQ0pO4eSbSURGxPBIm2FW9ra6tju7bvgbZPgLSKW3t4lLTtb2dnpzjmzJkzYp90H/5TTz0ljvn1118d27V9FKRoGYAYAq1Zs0Yc8/DDD4t9KSkpju3a35P0NYAWpcfHx4t9P//8s2N7QUGBOMbkgV18wBaRi1hARBZYQEQWWEBEFlhARBZcX0wa7v3727ZtC/s1tHv3tR1DpbRNO560f0BHR4c45ty5c2KftLtmdna2OEZagBr8hfZM0s6fgPxwK+08aIla8PeCU7SUS0rbtN1MBwYGxL4XX3zRsV16+BdgtptpMF6BiCywgIgssICILLCAiCywgIgssICILERsTwSpffv27eKxpEhTu2fddLtbibTwcuXKleKY5557TuyT5q7tyyDd7y/F0YAe50sRt/bAKe3hVtLfk7ZIV9pjQdvTQovZpS2Ba2trxTGvv/66Yzu39iWKEBYQkQUWEJEFFhCRBRYQkYWIpXCPPfZY2MeSFg9qCwS1pEbaKVNLrKREZu3ateKYzMxMsU96mJf2kC8phTPdmdRkd1TtHEmLRrVETZqDtphUu4W9t7fXsX3Hjh3imLKyMrEvVLwCEVlgARFZYAERWWABEVlgARFZYAERWXA9xpZUVlaGPUZa3KhFndIeBoC8M6kWt0pjtDl4vV6xT4p8tePdvHnTsV176JW2OFU6RyYPJwPkiFtb2Cv1aXPQonnp64uenh5xjBRj7927VxwTjFcgIgviFaijowO7d+/GwMAAmpqakJ+fD5/Ph7i4OOzZsyeScyRasEJ+wJbX68XExIS4kTjRUhTyZ6Bjx44hKioKVVVVOH/+/KylLHzAFi1VIX8Gmvrwm5ycfMeGdHzAFi1VIT9gq729HV6vF+Pj49i5c2ck50i0YEXsAVvSPevSKlpA3npVW4Ws3dcv0aJTaU8ELVLV9gKIjY11bNe2wZUiWu29an+tUvytrbjWYnbp/WpRujQ/LZrX9sKQaHs5SD9H2l4TlZWVfMAWkVtYQEQWWEBEFlhARBZYQEQWXF1MmpKSIt6/n5iY6Nh+9epV8XhSIqMlNVoCJqU42j4KUkKnJUxaQift86Ali1KftnBWS+FMHjRmcl61VE9KWFesWCGO0RaGSvOTFgMD8ry1XWeD8QpEZIEFRGSBBURkgQVEZIEFRGSBBURkwdUYOy0tDXl5eY59v//+u2O7ySJFLaI1WWCpxbcm9+5rka/0frWtfaV5a+dBi7il96S
dO22xqxRXp6amimMOHDjg2P7333+LY+rq6sS+1tZWx3ZtX4Y1a9Y4tr/88svimOBzxCsQkQUWEJEFFhCRBRYQkQUWEJEFFhCRBVdj7BUrVuCJJ55w7EtOTnZs11ZWSytp4+PjxTFaFCtFu1rUabK9sBZjSyurtShdOkdatKy9J4m2Ilx7v9L8/vrrL3HMtm3bHNu1PQxKSkrEvvT0dMd27byeOXPGsf3o0aPimA0bNsz6M69ARBZYQEQWWEBEFlhARBZYQEQWXE3hTp8+jUuXLjn2lZaWOrY/+eST4vGkRO/w4cPimAsXLoh9H374oWP72bNnxTFSMqWlXFqyKC1CXbZsmThGu69fYrJAVnuIlpbQaQthJdr8JBkZGWLfN99849h+8OBBcUxTU5NjezgJJq9ARBbEK9BXX32FkydP4saNG9i6dSva2trQ2dmJsbEx1NfXG33PQPRfIxZQQUEBCgoK0NfXh6qqKoyOjuLIkSPYt28fTp06hdzc3EjOk2hB+r+/wtXU1KC4uBhJSUkAAJ/Ph+7u7ln/TyAQQFVVldHu+USLmVhAk5OTePvtt5Gfn4+cnJzpTe26urruuNNw6gFb2l2aRP9F4q9we/fuxXfffYeBgQFcunQJ2dnZqKiowMjIiJioES01YgGVl5ejvLw8rIP19fWhv7/fsa+iosKxXQsjVq9e7dh+5coVccyuXbvEPmkxqRYTS/GytpBTI8W3JtsLa7T9Ekz2WNDmJ21lrM27ublZ7JNo83v++efDHmOyN0QwxthEFlhARBZYQEQWWEBEFlhARBZcXUzq8XjEpEtKNrSUpKurK+w5tLe3i33Sa5ks5NQWUGoPlpL6tDRS6tOSQK1P+rvQ5mDyJbmW3EnnVft50M6rydIy6bW0xcDBeAUissACIrLAAiKywAIissACIrLAAiKy4GqMPTk5KUaXJlGsFIlrMWNjY6PY98UXXzi2JyYmimOkvQCio6PFMVp8K8XBJg8GC2fRYyjHM93pVIqkly9fLo45ffq0Y7sWVWtM9liQ3pMWpQfjFYjIAguIyAILiMgCC4jIAguIyAILiMiCqzG226RoMpyYcaZDhw45tj/66KPimKtXrzq2SxE7YLayWhsj7S2gnQct6jeJfE32RNC+HpC+UtCiee0cmUTzknDG8ApEZIEFRGSBBURkgQVEZIEFRGTB9RQu3HvTtdRFSkNMxgDA9u3bwz6eyb32JmNM5qC9V5Ok0iTl0vrcPncmu4yajOEDtogiJOQHbL377rvIzs6Gz+fDO++8E8k5Ei1YIT1g680330RsbCxGR0exatWqSM6PaEH7v5+Bampq8NprryErKwtRUVHYtGkT1q9fj4SEhOn/JxAIIBAIzOlEiRaikB6wlZ2dPf2BNT4+/o5NBacesEW01IT0gK3W1lZcvHgRMTExSEhI4K9xRP9y9QFbGpOY0fSef4mbCw4jOQeTSNrtyFfj5vG0RbomPw9aJO3GzxdjbCILLCAiCywgIgssICILLCAiCxFbTGqy4NDNMRptjElCF6lFj6ap2UJOAk12aDU9Hh9zTzTPWEBEFlhARBZYQEQWWEBEFlhARBZcjbELCwuRnp4OALh8+fL0f88XzoFzcHsOly9fnt0wOUfeeOONuTo058A5LJg5zNmvcOvWrZurQ3MOnMOCmYNnctLw62wiYohAZIMFRGTB9cWkg4ODKC0tRXR0NPLy8rBlyxa3XyIkLS0teO+99/D4449j8+bNyMvLi8jrdnR0YPfu3RgYGEBTUxMaGxvxww8/YGRkBJ988gliY2PnZR75+fnw+XyIi4vDnj175vz1g/cVbGtrQ2dnJ8bGxlBfX2+0a6ntHOZib0PXr0B+vx9FRUVoaGjAiRMn3D58yDweD+Li4jA8PIzU1NSIvW5GRsasB3kdP34cDQ0N2LhxI/x+/7zNw+v1YmJiAikpKRF5/YKCAjQ0NKC+vh5ffvklzp07h3379iEzMxOnTp2K+ByOHj06J3sbul5A3d3dSEtLA6BvEDHXcnNz0dzcjA8++ADvv//+vM1j6l9an8+H7u7ueZvHsWPHcPDgQfz55584f/58xF63pqYGxcXFSEpKAjA/52Fqb8Pvv/8ehw8fxtdff43r16+7cmzXCyg1NXX6BLm9q044Zu5jNzIyMm/zmNLV1RXRK2GwqfORnJyMW7duzfnrTc7YVzAnJwc9PT0AInseJsPY29CU6zH24OAgysrKEBMTg6effnrePgP5/X4EAgH09/ejpKQkYp+Bent7UV1djW+//RbFxcXw+Xz46aefMDQ0hP3790fsM1DwPNrb2+H1ejE+Po4DBw7M+VZeH3/8MT777DPk5OQgKysLt2/fxh9//DH9WTASn4FmzsHn883a27Curs6V1+D3QEQWGGMTWWABEVlgARFZYAERWfgfyHndfBIg+KsAAAAASUVORK5CYII=", | |
| "text/plain": [ | |
| "<Figure size 320x240 with 1 Axes>" | |
| ] | |
| }, | |
| "metadata": {}, | |
| "output_type": "display_data" | |
| } | |
| ], | |
| "source": [ | |
| "show(xn.squeeze())" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "10da83c5", | |
| "metadata": { | |
| "time_run": "3:15:35p" | |
| }, | |
| "outputs": [ | |
| { | |
| "data": { | |
| "text/plain": [ | |
| "torch.Size([1, 28, 28])" | |
| ] | |
| }, | |
| "execution_count": 52, | |
| "metadata": {}, | |
| "output_type": "execute_result" | |
| } | |
| ], | |
| "source": [ | |
| "target.shape" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "4af36beb", | |
| "metadata": { | |
| "time_run": "3:15:35p" | |
| }, | |
| "outputs": [ | |
| { | |
| "data": { | |
| "image/png": "iVBORw0KGgoAAAANSUhEUgAAANAAAADNCAYAAADJ7P4xAAAAOnRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjEwLjYsIGh0dHBzOi8vbWF0cGxvdGxpYi5vcmcvq6yFwwAAAAlwSFlzAAAHsQAAB7EBBsVhhgAAETdJREFUeJzt3X9MVeUfB/D3vRUS3GxgYBp5ic1/ajijWGuTRa3mWGMjQ2q59Y+wFJCEWc2ZcxWsXK62woUyaf4hrcGuzUl1Zy02na3l0qRNXYlBLPsBCiW/ifv9oy9M4Xw+3fs8hwPo+/WXPo/POc89l4/n3g+f8zy+SCQSAREZ8c/2BIjmMwYQkQUGEJGFm908mM/nM+qTSF/PTI5lyqs5aF9FpXOZfn318vpJ5vJ11RQUFCAUCk3+3dUAAgC/3/mmZjLZ8fHxmM4xE6Q3wHQO0vGk1woAN910U8xjNNJ7YfrDaxLIJgGk9UnH0+YmvYfamPT09Gv+HnUA9ff3o7S0FHFxccjNzcW6deuiHUp03Yr6v9FQKITCwkLU19fj0KFDMzknonkj6jtQV1cXMjMzAUz/SBEOhxEOh92dGdE8EPUdKC0tDV1dXQCmf/ZevXo13nnnHXdnRjQPRH0HWrNmDcrLy9HS0oL8/PyZnBPRvBF1ACUmJuLDDz80PpFJpsYkS6L1Sdksk+P9888/MZ9Ho2X1TDKBJhk6kyyX28czfW/dTIvH8lr5i1QiCwwgIgsMICILDCAiCwwgIguuF5PGWiOmZTzcroVzM5NkWjdm8ppM6udMrpFpbZ3EpJDTNLNoUnArHS+W95Z3ICILDCAiCwwgIgsMICILDCAiCwwgIguuprEjkUjMqVCTdLDbRZRa8adWNCpZtWqV2CfN79y5c+KYBQsWOLZrc1uyZInYV1RU5Nh++PBhcczx48fFPuk9HBsbE8fcfLPzj57pOg/SOO29lebHNDaRRxhARBYYQEQWGEBEFhhARBZmvZhUI2VQtOyOycJ8Tz/9tDimsrLSsV3LcmmZwGXLljm2b9myRRxz4sQJx/Ynn3xSHPPSSy+Jfd3d3Y7t2nW45557xL63337bsX3r1q3iGOkama4kKvW5XYA6Fe9ARBYYQEQWGEBEFhhARBYYQEQWGEBEFnxubjJsksY2KeTUzrFixQqx75tvvnFsv3TpkjgmLi7Osb2vr08cMzAwIPZJbr/9drFv586dju2PP/64OEZKlwPya0pISBDHSAWtAJCcnOzYLhWMAkBWVpZje1tbmzjG7ZVOTda7qKiouGYdeN6BiCxEHUCtra3IycnBhg0b0NraOoNTIpo/og4gn8+HQCCAoaEhpKWlzeSciOaNqEt5cnJy8Mgjj+D3339HVVUVDhw4MNnHDbboRhX1HWjii3tSUhKGh4ev6eMGW3SjivoOFAqFEA6H0dvbi/Ly8pmcE9G84Woa2+/3G21iZXIeyalTp8S+W2+91bH9ypUr4hjp9QQCAXGM9lqn3r0n3HLLLeKYqVurT/jjjz/EMX///bfYJ10/rcpdm59ESm8DwG233ebYLqXYAfeXMpbS1dpaE5s3b2Yam8gtDCAiCwwgIgsMICILDCAiC66uiaCRsiQmK4lu375d7NPWKujo6HBs17JF0vy0AlQp2wfImR8tE/jDDz/EdCxALwyVMnRaZnFoaEjskwphf/nlF3GM9D7V1taKY8rKysQ+E25s4MY7EJEFBhCRBQYQkQUGEJEFBhCRBQYQkQXXi0mlFKCbS69qRZS9vb1i3+joqGP74OCgOEZ6rl+bt1Z4KaWDtTUHTGjzk95yk4JRQL5+IyMj4hjpVwfLly+PeQwgr1Ghrblh8qO/adMmFpMSuYUBRGSBAURkgQFEZIEBRGTBs2JSKZslZcYAoLCw0LFdK2zUijKlIk9tDibbsff394t90qPM2hxMNqPSCiJNNi7TsnpS4apW7Cod79dffxXHNDQ0iH1r1651bNcez45lO3sJ70BEFhhARBYYQEQWGEBEFhhARBYYQEQWXE1jRyIRMbUqFRZqqcTXXnvNsV1L+WoFkSYFh9JKotr6AYsXLxb7tMJViVRoqq29oF0j6fVqKV+t7/Lly47t2gZbEm2tiQcffFDskzYUk9bBAOT5aen8qXgHIrIgBlB7ezvWr18/+cvMxsZGlJSU4Pnnn1d/UUh0IxEDKCMjA/v27Zv8+8GDB1FfX4+ioiKEQiFPJkc010X9IXXiu0owGJy2ESw32KIbVczfgTo7O6dt8cgNtuhGJd6Benp6sG3bNpw8eRJvvvkmCgoKsHHjRgwODmL37t1ezpFozhIDaNGiRairq7um7bnnnlMP5vf7Y15DYNGiReLx4uPjHdu1zaM0UspcOg8AJCYmOrZr6dGWlhaxT0qRPvzww+KY77//3rFdW75XSwcPDAw4tmdkZIhjpE2+ADmFLKW3Afmam1TGA8C7777r2L5mzRpxjLZmQ7SYxiaywAAissAAIrLAACKywAAisuB6MWmsz++XlJSIxzN5dl/L1Eh9WoGntIbB+fPnxTGnTp0S+6Ss3v333y+OkeYnZecAICUlRexbunSpY7t2XbXMp5Td1NZskDJg2hht1dn8/HzH9oULF4pjpOLiWNZK4B2IyAIDiMgCA4jIAgOIyAIDiMgCA4jIgutL+0ppSJM0tpS+1QoOtSVtY3nWfYK0VPBdd90ljnn00UfFPmlZYq2gVSrkXLJkiThGW+chNTXVsV1bvldbA0JaN0Jbs0FaY0H7NYRW/NnT0+PY/vrrr4tjNm/e7Ngey8ZbvAMRWWAAEVlgABFZYAARWWAAEVnwbIOtFStWxDxGKvaTCjIBPZMkFYZqmR8pI5OZmSmOuffee8U+KZMkZbIA+ZFp7ZFubRMyLYsp0a6rdP20TKA0B+06aJnKP//807G9rKxMHCNl4VhMSuQRBhCRBQYQkQUGEJEFBhCRBQYQkQXP0thSylAjpRO1VKe2wZY0TiselApapbQpoBdRSulgrVBS2k7GdKMx6bpq6XytEFd6vSaFvVrqW5uDdP2kIlMA2LBhg2P73r17xTFT8Q5EZEH8L6e9vR01NTXo6+tDc3Mz8vLyEAwGEQgEsGvXLi/nSDRnRb3BVkJCAsbHx9X9P4luNFF/B2pqaoLf70dVVRVOnz59TWnOxAZbsTyIRHQ9iPo70MQXwtTU1GlPaU5ssBVLDRHR9SDqDbbOnj2LhIQEjI2N4eWXX/ZyjkRzli/i4ucuv98vpiF/++03x/aLFy+Kx5PStFqKViOlkLV0q5TG1lKqCxYsEPukCmqTymWtQlp7W6XjxcXFiWO05Y+lNLZWPS3NQfsUo/36QnoPtYp1ad7BYFAcs2nTpmu2M2Uam8gCA4jIAgOIyAIDiMgCA4jIgqvFpKmpqWIGQ9rOvrOzUzyeVCCoZay0jJrUp2WspMyUNgdp1U1A3qhKy1iZzEHLZplkMbUCWWn9Be080noXaWlp4hit4FY6l5Y9lLKYd955pzhmKt6BiCwwgIgsMICILDCAiCwwgIgsMICILLi
axl62bBkee+wxx74ff/zRsV17rl9K32opWi2NLaVBTepptfUDtJS0tGGX1A7IKWRtDlqhqXT9tAJZk9crLUkMAIcPH3Zs19YwqKmpEftOnDjh2K79rEgblK1du1YcMxXvQEQWGEBEFhhARBYYQEQWGEBEFhhARBZcr8Z+4IEHHPvuuOMOx3btOXepT3vOXavYlXa2GxgYEMdI6WAtTayl5qW5a/PWqrslWvpWSklrqWptDtLr1da7KCkpcWxPTk4Wx7zwwgtiX3p6umO7lpr/+uuvHdubmprEMVNT3LwDEVlgABFZYAARWWAAEVlgABFZcDULd/z4cZw/f96xT8rIPPTQQ+LxsrOzHdv3798vjjlz5ozY98Ybbzi2nzx5UhyjrTIq0YpTpWLXQCAgjpE22NIyYyZFtVpmUct8SpkuLRMozV0bk5GRIfYdOXLEsX3Pnj3imObmZsf2WIqLeQcisiDegT755BO0tLTgr7/+wvr169HW1oYLFy5gdHQUdXV13ImBCEoAFRQUoKCgAJcvX0ZVVRVGRkZw4MAB1NbW4tixY8jJyfFynkRz0n9+hKuurkZxcTFSUlIA/LtyfVdX1zX/JhwOo6qqSv0NPNH1SAygSCSCV155BXl5ecjOzkZ3dzeAfxdCnLr43cQGW6bbjhDNV+JHuPfffx9ffPEF+vr68NNPPyErKwsvvvgihoeHUVpa6uUcieYsMYAqKipQUVER08F6e3vFpWsrKysd27WUofRMfUdHhzhmx44dYp+U2tWKSaW7qlZ4abK5lVb0aHJnN0lxm66xIKXFteN9/vnnYp9ES80/8cQTju0myS7tPNP+bcxHJ6JJDCAiCwwgIgsMICILDCAiC64Wk2qkzJSWJZE239LGnDt3TuyTNqQyeURcKvDUzgPIc9dek3Q8kxVVATk7ZrKaqTZOyyxKG19p5zHtk5hev6vxDkRkgQFEZIEBRGSBAURkgQFEZIEBRGTB1TS2z+czStNKpA22tNVMtVUlGxsbHduTkpLEMdLmUdLcAH1lUil1qhV/mqwfoKVopeNpr0mb39DQkGP7woULxTFHjx51bNeunWkBb6xjWExK5BEGEJEFBhCRBQYQkQUGEJEFBhCRBVfT2JFIJOaqay09KlVCa9XOWjqzoaHBsX358uXiGG2TKImWBjWprJbSy1r1tHY8k03DtMpq6X36+eefxTHS8sxaqtqkGlu7DlIfl/Yl8ggDiMgCA4jIAgOIyAIDiMjCrK+JoGXUpMyPluXSMijS1uqxFA9Gcx6TwkaTOWgZTLfXCNDmJ53LZI0FbYzp++7mHKbNKeazEtGkqDfYevXVV5GVlYVgMIitW7d6OUeiOSuqDba2bNmCxMREjIyMYOnSpV7Oj2hO+8/vQNXV1SgrK8PKlSvh9/vxzDPPID8/H8nJyZP/JhwOIxwOu7LOFtF8EtUGW1lZWZNf4JKSkqY9gTixwRb3TaUbTVQbbH377bc4c+YM4uPjkZyczI9xRP/n6gZbQOwFeiZL2pre6aQ0aCxpS1smczBJcZusl+B2mtjkeG6n5rXr6sbPF9PYRBYYQEQWGEBEFhhARBYYQEQWPHuk283sk+lGUCZjTDJ0WvZJOp5J1sw0G2nyeL1JRs1kflpxscbNzGIsc+AdiMgCA4jIAgOIyAIDiMgCA4jIAgOIyIKraeynnnoK6enpAP5dlXLiz7OFc+Ac3J7DtNVWIzOksrJypg7NOXAOc2YOM/YRbvXq1TN1aM6Bc5gzc/BFInwOm8gUkwhEFhhARBZcX5m0v78fpaWliIuLQ25uLtatW+f2KaLS2tqK7du347777sOzzz6L3NxcT87b3t6Ompoa9PX1obm5GY2Njfjqq68wPDyMDz74AImJibMyj7y8PASDQQQCAezatWvGzz91XcG2tjZcuHABo6OjqKur82QBGi/WNnT9DhQKhVBYWIj6+nocOnTI7cNHzefzIRAIYGhoCGlpaZ6dNyMjA/v27Zv8+8GDB1FfX4+ioiKEQqFZm0dCQgLGx8exePFiT85fUFCA+vp61NXV4aOPPsJ3332H2tpaZGZm4tixY57P4eOPP56RtQ1dD6Curi7cfffdAMxL092Qk5ODzz77DDt37sSOHTtmbR4T/9MGg0F0dXXN2jyampqwd+9eXLx4EadPn/bsvNXV1SguLkZKSgqA2bkOE2sbfvnll2hoaMCnn36KS5cuuXJs1wMoLS1t8gJ5udrNVFevYzc8PDxr85jQ2dnp6Z1wqonrkZqaiitXrsz4+SJXrSuYnZ2N7u5uAN5eh0gMaxuacj2N3d/fj/LycsTHx2PVqlWz9h0oFAohHA6jt7cXGzdu9Ow7UE9PD7Zt24YjR46guLgYwWAQR48exeDgIHbv3u3Zd6Cp8zh79iwSEhIwNjaGPXv2GC2VFYv33nsP+/fvR3Z2NlauXImBgQF0dHRMfhf04jvQ1XMIBoPXrG341ltvuXIO/h6IyALT2EQWGEBEFhhARBYYQEQW/gcdLAPmk1KuvQAAAABJRU5ErkJggg==", | |
| "text/plain": [ | |
| "<Figure size 320x240 with 1 Axes>" | |
| ] | |
| }, | |
| "metadata": {}, | |
| "output_type": "display_data" | |
| } | |
| ], | |
| "source": [ | |
| "show(target.squeeze())" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "a54954bf", | |
| "metadata": { | |
| "solveit_ai": true | |
| }, | |
| "source": [ | |
| "does it make sense that the xn and target look very similar?\n", | |
| "\n", | |
| "##### 🤖Reply🤖<!-- SOLVEIT_SEPARATOR_7f3a9b2c -->\n", | |
| "\n", | |
| "Good observation! Let me think about this with you.\n", | |
| "\n", | |
| "Looking at the `noisify` function, we have:\n", | |
| "- `n = 0.2606` (around 26% of the way from noise to target)\n", | |
| "- `xn = (1-n)*z + n*x` where `z` is random noise and `x` is the original image\n", | |
| "- `target = x - z` (the direction vector)\n", | |
| "\n", | |
| "So `xn` should be about 74% noise + 26% original image. But it looks very similar to the original image, which seems unexpected.\n", | |
| "\n", | |
| "What do you think might be happening here? Could it be related to how the pixel values are scaled, or maybe there's something about how the noise is generated?\n", | |
| "\n", | |
| "Try looking at the value ranges - what are the min/max values of the original image vs the noise?" | |
| ] | |
| }, | |
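| { | 
| "cell_type": "markdown", | 
| "id": "b2f7a9c1", | 
| "metadata": {}, | 
| "source": [ | 
| "For reference, a minimal sketch of what `noisify` computes, reconstructed from the formulas quoted above (the real definition appears earlier in this notebook; the exact signature and the shape handling of `n` are assumptions):\n", | 
| "\n", | 
| "```python\n", | 
| "import torch\n", | 
| "\n", | 
| "def noisify(x):\n", | 
| "    n = torch.rand(x.shape[0], *[1]*(x.ndim-1))  # one n per batch element, broadcastable\n", | 
| "    z = torch.randn_like(x)  # Gaussian starting point\n", | 
| "    xn = (1-n)*z + n*x  # linear interpolation between noise and image\n", | 
| "    return (xn, n.squeeze()), x - z  # the target is the constant velocity x - z\n", | 
| "```" | 
| ] | 
| }, | 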
| { | |
| "cell_type": "markdown", | |
| "id": "7ee7bea0", | |
| "metadata": { | |
| "solveit_ai": true | |
| }, | |
| "source": [ | |
| "how can I look at the value ranges and explore that easily? I'm thinking that maybe the images need to be normalized to 0-1?\n", | |
| "\n", | |
| "##### 🤖Reply🤖<!-- SOLVEIT_SEPARATOR_7f3a9b2c -->\n", | |
| "\n", | |
| "Great thinking! You can check the value ranges easily:\n", | |
| "\n", | |
| "```python\n", | |
| "print(f\"Original image: min={it.min()}, max={it.max()}\")\n", | |
| "print(f\"xn: min={xn.min()}, max={xn.max()}\")\n", | |
| "```\n", | |
| "\n", | |
| "And yes, you're probably right about normalization! Fashion-MNIST pixels are 0-255, but many ML models expect 0-1 range. You could normalize with:\n", | |
| "\n", | |
| "```python\n", | |
| "it_norm = it / 255.0\n", | |
| "```\n", | |
| "\n", | |
| "Try checking those ranges first - what do you see?" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "7fd9e73d", | |
| "metadata": { | |
| "time_run": "3:15:35p" | |
| }, | |
| "outputs": [ | |
| { | |
| "data": { | |
| "text/plain": [ | |
| "(tensor(0.), tensor(255.), tensor(-1.2438), tensor(147.9825))" | |
| ] | |
| }, | |
| "execution_count": 56, | |
| "metadata": {}, | |
| "output_type": "execute_result" | |
| } | |
| ], | |
| "source": [ | |
| "it.min(), it.max(), xn.min(), xn.max()" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "671652a5", | |
| "metadata": { | |
| "solveit_ai": true | |
| }, | |
| "source": [ | |
| "I see that the vector has negative values, which maybe are weird to display? let's try to normalize both images and display them. Write that for me please!\n", | |
| "\n", | |
| "##### 🤖Reply🤖<!-- SOLVEIT_SEPARATOR_7f3a9b2c -->\n", | |
| "\n", | |
| "Here's the code to normalize and display both images:\n", | |
| "\n", | |
| "```python\n", | |
| "# Normalize original image to 0-1\n", | |
| "it_norm = it / 255.0\n", | |
| "\n", | |
| "# Create normalized batch and apply noisify\n", | |
| "it_batch_norm = it_norm.unsqueeze(0)\n", | |
| "(xn_norm, n), target_norm = noisify(it_batch_norm)\n", | |
| "\n", | |
| "print(f\"Normalized ranges - Original: {it_norm.min():.3f} to {it_norm.max():.3f}\")\n", | |
| "print(f\"Normalized ranges - xn: {xn_norm.min():.3f} to {xn_norm.max():.3f}\")\n", | |
| "print(f\"n = {n.item():.3f}\")\n", | |
| "\n", | |
| "# Display both\n", | |
| "fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 4))\n", | |
| "ax1.imshow(xn_norm.squeeze(), cmap='gray')\n", | |
| "ax1.set_title(f'xn at step n={n.item():.3f}')\n", | |
| "ax2.imshow(it_norm, cmap='gray') \n", | |
| "ax2.set_title('Original image')\n", | |
| "plt.show()\n", | |
| "```\n", | |
| "\n", | |
| "This should show you the difference more clearly!" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "c18493d6", | |
| "metadata": { | |
| "time_run": "3:15:35p" | |
| }, | |
| "outputs": [ | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| "Normalized ranges - Original: 0.000 to 1.000\n", | |
| "Normalized ranges - xn: -1.162 to 1.798\n", | |
| "n = 0.516\n" | |
| ] | |
| }, | |
| { | |
| "data": { | |
| "image/png": "iVBORw0KGgoAAAANSUhEUgAAAU0AAACsCAYAAAD/n/5XAAAAOnRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjEwLjYsIGh0dHBzOi8vbWF0cGxvdGxpYi5vcmcvq6yFwwAAAAlwSFlzAAAHsQAAB7EBBsVhhgAAJJxJREFUeJztnXtYVOUWxl80LwFpSCreQsky74pRHpUcpCOaWaiYJo/ZUbvgLTFLPVSah9I6HsrSRClDS8goNE1z8oaXStPE9OSlvCB5V0JU4qIw5w8fiGG9U3vDDJhn/Z6nP3rds79v9nyz2PO9e63lZrPZbFAURVEMUaWyJ6AoivJXQoOmoiiKCTRoKoqimECDpqIoigk0aJbgwoUL+OSTTwwdu2DBAhfPRpKUlIQuXbogODgYx48fF/9+yy23wGKxwGKxYO/evQCAyZMno2HDhpg4cWLxcTabDVOmTEFwcDAsFgtyc3Mr7D0of0xKSgoCAwPRvXt3DB48GJmZmeKY06dPY+rUqQ7PsWbNGixbtszUuNOmTcMXX3xhp8XHx+Pbb781dZ7/C2xKMUePHrUNGDDA0LGdOnVy8WzsuXLliq1z5862vLw829atW21PPfWUoTmdOnXKtmHDBttzzz1XrCUlJdneeustl85XMU9GRoatbdu2trNnz9psNpstISHBFh4ebndMYWGhrbCw0OljT5061bZy5Uqnn/dG5Ia40/zqq68wfPhwAMCwYcOwYcMGpKSkoFevXujXrx/at2+P//73v3avOXPmDIKCghAYGIiwsDAUFBRg3rx52LRpEywWC/bt21d87LZt23DfffchKCgI06ZNw7Jly3Dw4EFYLBYkJCTgyJEjCAkJgcViQWRkJIBrf6VDQ0Px4IMPIjAwECdOnLAbPz4+HgMGDEDfvn0REBCAU6dO/eF7/Pnnn9GyZUtUr14dXbt2xZ49e8Qxhw8fxv3334+IiIjiu0cfHx+4ubnZHbdixQqkp6fDYrHglVdeMXiVFVezatUq9OvXD3Xr1gUAPPbYY9i2bRsKCwvxxBNPYPTo0ejZsyd27tyJsLAwANfWfseOHTFw4EDcf//9SEtLQ3x8PObMmQMAaNmyJYYNG4YOHTpgyZIlAIAPP/wQFosF/v7++PDDDx3Op+juMy0tDV26dMGgQYPQunVrLF26FA899BDat2+Pn3/+GQAwZMgQdO/eHd26dUN6ejoAYNGiRbjnnnswbNgwtGrVCgBw/vx5hIaGokePHggPD0dBQYFrLqYLuSGCZs+ePVGzZk2MHj0atWvXRo8ePQAAV65cwbJlyzBz5kwsXLjQ7jVeXl5Yu3YttmzZgkaNGmHDhg2IiIhA9+7dkZKSUvwhA9cW89SpU7Fx40a8/PLL6NevH1q0aIGUlBQMGTIEkydPxrvvvouUlBTk5uZi586dAAB3d3esXr0aUVFReP3118W8a9eujZUrV2L48OFISkoCAISHhxf/xC76b82aNcjMzEStWrWKX8sW26FDh7B582Y0aNAAc+fOdXi9zpw5Ax8fH6SkpGDfvn3Ytm2biautuIqTJ0+iYcOGdlrdunVx7tw5AIC/vz/Wrl1bHFQB4OWXX8b69evx0Ucf4ZdffhHnPH36NN555x1s3rwZb7/9NgBgwIABSElJwddff40333zT0NwyMzORkJCAN998E2+88QZWrFiB6dOnFwfd9957D5s2bcJzzz2H+fPno6CgAG+++WbxGEVzmzlzJsaNG4cNGzagXbt2prcRrgduquwJOIuxY8eidevWOHbsWLHWoUMHAECTJk3E3lBGRgYiIiKQmZmJkydPwt/fH3feeSc99+jRoxEdHY0lS5YgPDwcDz74oN2/HzhwACNGjAAAXLp0CSEhIQCATp06AQACAgIwe/Zscd6OHTsWz+/7778HgOK7gdLs27cPFy9eLP7/qlWrimO8vb0BAGFhYZg5cyY9DwDceuutxX9YevTogR9//BGdO3d2eLxSMTRo0ACHDx+2086ePYvbbrsNwLV1VJqCggLUqVMHANCmTRvx735+fsV/bIv+0FqtVsyePRs2mw2HDh0yNLdWrVqhatWqaNiwIdq0aYMqVaqgUaNGWLduHQoKCvDCCy9gz549yMnJQZs2bXDu3Dk0adIENWrUQI0aNdC0aVMA19bx9u3bMX36dOTk5GDo0KHGLs51xA0RNG02G55//nnExsZi0qRJSEhIAAC7n6W2UolPCQkJeOihhzBy5EiMHTsWNpsN1apVo3dwtWvXxpw5c5Cfn49OnTrhwQcftDt3ixYtMGvWLPj6+sJms6GgoAAfffQRUlNTAQA7d+5E8+bNxXnZ/MLDw8VP+cmTJyM4OBj79+9Hfn4+du7ciXbt2tkdk52djZo1a6Jq1arYsmULHa+Irl27Yvfu3ejUqRN2796NRx991OGxSsXRp08fWCwWjBkzBnXr1kViYiI6d+5c/AeyShX5w7Bq1arIzMyEh4cHfvzxR/HvpbdmACA6OhqbN2+Gm5sb/Pz8DM2t5HlKr9vdu3fjwoUL2Lx5Mz777DOsXLkSdevWxfHjx5Gfn4/s7GykpaUBAO6++27069cPgYGBAK79GvyrcUMEzdjYWAQGBuKpp57CoUOH8PHHH8PHx+cPXxMcHIyhQ4di5cqVuPnmmwFc+0ufk5ODsLAwzJgxo/jOc/78+UhOTsbVq1fxxBNPAACCgoLwyCOP4B//+Adef/11PPPMM8jNzUXVqlWLtwLy8/PRq1cvXL58GYmJiYbei6M7TQAYP348LBYLatasiUWLFgG49nNn0KBByMrKwvDhw+Hp6QkvLy8sXrwYADB79mwsXrwY58+fx4kTJ5CYmIgRI0bgiSeewOLFi9GiRQsEBQUZmpviWry9vTF79mz0798fbm5u8PHxwbx58/7wNdOnT0dwcDCaNWsGHx8fVKtW7U/H6d+/PwIDA+Hv7w8vL69yz/vuu+/GsWPH8Pe//x133303gGvBfPz48ejSpQtatmyJ22+/HQAQFRWFJ598stj9f+ONN3DPPfeUew4ViZut9C2Y4hTi4+Nx+fJljBkzprKnotzAXLlyBdWqVUNeXh4CAgKQmppKt24qg6K5/frrr+jVqxe+++67yp6SU7gh7jQV5f+V5cuXY+7cubh48SLGjx9/3QRMAJg3bx6Sk5Nx6dIlREdHV/Z0nIbeaSqKopjAqXeaN910k9isLnL+SsI2p319fQ2NUfQMWEnYZvZvv/0mtDNnzgiNGT/u7u6GjvP09BQa2yNy9Awm239iG+MsY4cdx+ZT5FqWpGhTviRsD5hlozAzgs2vdu3aQvvpp5+E9sADDyA5OVnoroKtPUX5I/r162e3Rg0HzezsbIwaNQrVq1eHxWJBeHi4OKZKlSqoWbOmndasWTN6XGl69uxpaB7r1q0TWt++fYV2+vRpoRU91lOS/Px8oZV8Du6PjmvcuLHQ2GNL27dvFxrAgzML9hkZGUK7fPmy0Bo1aiS03r17C23NmjVCu/fee4XGghwL9L/++qvQ2B+youcNS8K
CelkxskYVxSyl16jhh9uTk5MRFhaGuLg4rFixwtnzUpRyo2tUqQgM32keP34cbdu2BSAfrLZarbBarSgsLHTu7BTFBEbWqKKUF8N3mo0bNy6urFM6OIaEhCAmJob+7FaUisLIGlWU8mL4TrN///4YM2YMVq1aRfcQAaBt27bo3r27nXbXXXeJ49jzWmfPnhUaM26KHkQvya5du4S2f/9+ob300ktCi42NFRozC9ijHGyP7vz584ZeCwBZWVlCY3942MPnbM9w7dq1QivKgy/J448/LjRWAszDw0NoLFeYZRQxc4jtATsTI2tUUcqL4aDp4eGBDz74wJVzUZRyoWtUqQj097SiKIoJNGgqiqKYQIOmoiiKCZyaEXTu3DlRUbyoKlBJZs2aJTRmlrCHzAcPHiw0ZoDcdJN8a0WFfktSVFeyJAcOHBBaUfWWkrCMGTO5v8OGDRMay9FlD5mz99KgQQOhsRJxq1evFlrpyvYAzzpic2b7iKNGjRIaM5YU5a+G3mkqiqKYQIOmoiiKCTRoKoqimECDpqIoigmcagQVFhYK8yA+Pl4cxwyeqKgooQ0ZMkRopXvjANywKMpBLsmtt94qNGbmsMpMLEtox44dQmvfvr3QHJlD7D0XNYMrSbdu3YS2dOlSobHKR+xar1+/XmisGhIr61fU0bAkzCSbNm2a0P6KTbT+DGY4Xr161eXj3n///UJjtR8OHjwotNKVyABexQvgWVwDBw4U2hdffCG0rVu30nP+1dE7TUVRFBNo0FQURTGBBk1FURQTaNBUFEUxgVONIG9vb7Ru3dpOy87OloOSzXOW1cNaYJw4cUJorL+N0RYYzABhpg/rR16rVi2hsX5A/v7+QgN4Bg97f2zjnl1XVoZv7969QmNZVXPnzhUaM9NeeOEFoW3btk1oRX2uS8Lae/zVKY/pwz6HyMhIoTVs2FBozPRh13zixIlCYwZmnz596BzZ583KH7LygMxQnTlzptCmTJlCx75e0TtNRVEUE2jQVBRFMYEGTUVRFBNo0FQURTGBm81msznrZB06dEBgYKCdxnp2M2OEZa4sWrRIaDk5OUKbMWOG0FhvHJZNVNSIqyS//PKL0Fh/7o4dOwrtvffeExozvgCe1cFYsGCB0FifcpbVwTKeWF+kjz/+WGjs+rPPifU6Yv3bmTHg4eFRoQ3PmMnnbFhW2Pfffy801i+erZWLFy8KjX0PGMysZN+XkJAQ+nqWEVSjRg2hMZOPHVenTh2hVatWTWjsu8rKF1YEkZGRdmvU8J1mSkoKAgMD8cwzzyAlJcUVc1OUcqFrVKkIDAdNNzc3eHp6Ijc31+VdBRWlLOgaVSoCw89pBgYGonv37jhz5gwmTJhg99yi1WqF1WrFpUuXXDJJRTGCkTWqKOXF8J1mUT9uLy8v5OXl2f1bSEgIYmJicMsttzh3dopiAiNrVFHKi+E7zeTkZFitVly4cAFjxoyhx7i7u6NevXp22saNG8VxrG8Q25xmGS6szBrbUGebxsw8Ydk27LUsA2f//v1CK/3+AeC1114TGsA3u5kRwHrrsHmz0nBNmjQRGtvv++yzz4TGPEJmiLVq1UporFQYu6733Xef0MqKkTVqFGYYGfVMExMThZaeni409nmx9c0+f2a85ObmCo0ZRqy837lz54QGcBOq6I9TSUr/kQL49WLGq7e3t9BK9xpzNC6jPJ+dEQwHzf79+6N///5OG1hRnI2uUaUi0Oc0FUVRTKBBU1EUxQQaNBVFUUzg1NJwJ06cwJYtW+y0zp07i+O2b98uNNYHh5kvXl5eQmOZK3fccYehcZOTk4X22GOPCe3mm28WGivjxh67GjZsmNAA3kOFlYtjRsCqVauENmHCBKExY+G7774TGtugZ8YCM3h++OEHod15551Cu156xpQ2CphJYNQ4YL2Q6tevLzRmBLG1zGBZXWw9snJxzKBhJoujPlbMcGJrnK1RlhXGnrBhGXisHN67774rtFGjRgnNmaYPQ+80FUVRTKBBU1EUxQQaNBVFUUygQVNRFMUETjWCGjVqJErDsU1etkHMTBq2Gezr6ys0lh3j6ekpNJbBw/q0rFy5Umhjx44VGtuM37Rpk9BYeTwA+M9//iM01ouIlWNj2jfffCM01mfpq6++EhrrTfPJJ58IjV3r4OBgoS1cuFBoLOuLmVKuprRRwDJNmKnCYOuClcpj140ZPMyQYVliBQUFQmOmDxuXGSWOSuYxM5BdL9YriZ2T9Rdi82bfmYiICKFNmjRJaMyoKs9nLM5VplcpiqL8n6JBU1EUxQQaNBVFUUygQVNRFMUETjWCqlevLgwYZmy88sorQuvevbvQWOkstuFstFwcM6DYhjrLZlm/fr3QWEk0VmXn/fffFxoAfPrpp0JjBtaRI0eExkyEFi1aCO3QoUNCY9f61KlTQmNl265cuSK08ePHG3rtsWPHhHY9YNQkGDhwoNDYmmLrlhkyzABhmTVsjTKDhq0d9nmZyZhhZg77vrHvJRuHvT8Gm/fp06eFtnjxYqH169dPaGU1fRh6p6koimICDZqKoigm0KCpKIpiAg2aiqIoJnCqEXT27Fns3r3bTnvkkUfEcSxL6F//+pfQpk+fLrT8/HyhGd20Z9kne/fuFRrL9GHGy/Lly4U2ZMgQobVv315oAM9QSktLE9oDDzwgtL59+wotNTVVaLVq1RIaKw3WqFEjQ8cx7a233jKksc396wFmYjCio6OFxkyaatWqCY2ZQ+w4Zg6xcmqsFxUzhxhsfTMN4N83Nm/22bJrw75H7HzMnGPmLuv7xbIGmQnJMq2MrAW901QURTGBw6B55MgRjBgxAmFhYQCAhIQEPPnkk3j88cdpcWBFqWh0jSqVgcOg6efnZ/d84bJlyxAXF4dHH31UVDu3Wq2YMGEC/QmiKK6iLGtUUcqL4Z/nRQ+5+vr6itYIISEhiImJoQ/XKkpFYWSNKkp5MW0Epaeno3HjxvTfbrvtNrRt29ZOY0/xs4yCefPmCa1du3ZCYxu1o0ePFtq5c+cMvfbhhx8WGrtjZmbOwYMHhWY0EwkAcnJyhNahQwehnT17VmgXLlwQGsv+Ydeabbyzz4lpI0eOFBrL+mIZQSyz6dZbbxVaefmjNQoY6xHESu+xrJ6LFy8amhMzRdj5mHHHzMEVK1YYGqNr165CK23WAo6NIGbcsK0PPz8/obE+XazcI1vLbD7MJGPr++233xYaM6SNGoClcRg0MzIyEBUVhdTUVMyYMQOhoaGIiIhATk4O5s6dW6bBFMWZ6BpVKgOHQdPb2xuxsbF2GnucRlEqC12jSmWgjxwpiqKYQIOmoiiKCZyaEXTy5EnRp4ZlMzRr1kxo99xzj9DYBjjbhP7666+FxjbomemQlJQktI4dOwrt888/F1rLli2F5uXlJbStW7cKDeAl2lh/k6ZNmwqNmVBBQUFCY5kQP/zwg9AGDBggNGZqMfOqU6dOQmNGyrp164TmCiPozzBSGu2pp54SGiuTxswElmlSvXp1obFsG5YJc/jwYaHt2rVLaMxE8vf3Fxr7DNmaAPjnyM
wcdh3Yd7BJkyZCY9eVXRs2BjORmLnL4hD7rjnqlVQSvdNUFEUxgQZNRVEUE2jQVBRFMYEGTUVRFBM41QjKyckRmTh9+vQRx7FNXtbrg20as41yZiKxTAZmOrD+PexZP5aNcPLkSaExU4odB1zLoCpNt27dhDZ16lShzZ49W2isRN6XX34pNJahdODAAaGlp6cLjW2oswyMiRMnCo1ljZQ1K8PVPP3000Jja4CVRGNmgtEeNczoZMZLcHCw0Nj3qkaNGkJjxmKDBg3ofFj2GFu37P0xY4pdQ5YZxbKbmMHGrj/LoHvttdeENnbsWKEZMQn1TlNRFMUEGjQVRVFMoEFTURTFBBo0FUVRTOBmM9M5/k9o0aKFyKZh2QwzZswQ2qZNm+TkyIY6y1D49ttvhXb06FGh/fOf/xQa6xtUp04dobG+RizjgRlVrMQawEtnsUK5LKOIfWysHB7LbmLveefOnUJj5dVuv/12obGsk549ewqtV69eQmvatGmF1rlka6pNmzZCYwbamTNnhObu7i40ZmIwQ4YZG+xzZWXS2HGsRxDTPDw8DI0BcOOGGU5sPuy7wEq5sdeyMYyaSMwcYtl7bH6MyMhIuzWqd5qKoigm0KCpKIpiAg2aiqIoJtCgqSiKYgINmoqiKCZwahrl1atXRa2+oUOHiuPOnz8vNFbjjzmdrI4kSwuLiIgQ2oIFC4TGGpkxp42lt917771CY7VCWUohANx5551CS0hIEBpL/2RNymrXri00Vi80JCREaCw9kjnv8fHxQmOEhoYKbf369UJjn11FExkZKTTm6DKNObrMFWfrm9XYZCmuzLVnbjf7vrD5scaBjtJZ2RyZ68zSLdlTBOx8RmpYAsbTKJnGYg5ryGikt5TDoHnkyBG8+uqryMrKwqefforevXvD19cXnp6emDVr1p+eWFFcja5RpTJw+PPcz8/PrpiFu7s7CgsLUb9+fXGs1WrFhAkTHLaqVRRXUJY1qijlxfCeZlJSEhYsWIBTp05hz549dv8WEhKCmJgY+qCvolQURtaoopQXw0GzaB+jXr16dE9EUSobXaNKReBwTzMjIwNRUVFITU3FjBkzcODAAbi7u+Pq1at44YUX6Gs8PDzg4+Njp7GajGxDNzAwUGile1oD3JyoV6+e0NimPWtkxupuPv/880JjdS7ZRjczd1j9QQBo3ry50JiZw1JMmRHw008/CY0ZWFOmTBHaI488IjR2bVjKZFxcnNC6dOkitBdffFFoLN3VKGVZowzWiIvVZGSGhdFUSAarQcnWFDsfO85oeiQzfRzV+2Rjs9eXp+kcSzE1eh2M1jNlf0SjoqKEVi4jyNvbmwYtRble0DWqVAb6nKaiKIoJNGgqiqKYQIOmoiiKCZyaEZSdnY1Tp07ZaWyDmWVHZGVlCY1lrrAalmlpaUJjG+CtWrUS2o8//ig01qiNmU2l3ysAJCYmCo2ZSACwfPlyoZVn87xdu3ZCY/NmBhQzsFjmB/vsWNYXM6+M1kN0JfXr1xcN+5hRd/z4caEZ/WyMZscYzZhh52MGCPtusOvL1g6rc+kINkf2PWdjszqZ7Bqy98IazGVkZAiNXS9miLLP01GDuZLonaaiKIoJNGgqiqKYQIOmoiiKCTRoKoqimMCpRlD9+vUREBBgp7FGaPv37xca25Rt1KiR0FhTsNIb+wCQmpoqNJb5sXHjRqGxUmFsw5kZQUuXLhUaK0sFGC+xVTrLCuCZViy7hplaH3/8sdDYRrmXl5fQWJkyZqT87W9/E9qOHTuEZmTj3Zk0adIEPXr0sNNYJhUzLJgBYhSj2T9Gs46Mlp9jRXTYXByVZ2M609h7YeYSW9+sWd+7774rNPY9mjlzptDYOmNzZmtv8ODBQiuN3mkqiqKYQIOmoiiKCTRoKoqimECDpqIoigmcagQVFhYKQ6dr167iOJYp4O3tLTRm+rCNWra5zMZ96623hMayDNgm9MmTJ4XGSksNHDhQaI76r7AMng0bNgjtxIkTQmN9g3Jzc4W2du1aoT300ENCK120F+Cb+6x/EpvL9ZD9wzBqVjIjiF3fWrVqCc1o5hA7HzNpWPYPW/MsC469ln2uzJQEuJnDTBX22bKx2bU5ffq00J5++mmhsWvNeoGxvlNs3O3btwuNGbmDBg2y+3+901QURTGBBk1FURQTaNBUFEUxgQZNRVEUEzjVCKpSpYrY8GZZCmzjl2XXsI1tVgaOZQTNnz9faMywYCYAgxk8VqtVaL/88ovQ2rZtS8/JspbuuOMOobH+SbfccovQvv32W6Gxnj5HjhwRmtFSYyz759ChQ0JjRhDL8Krots/ffPONmC8z+Tp37iy0e++9V2gLFy4U2r59+4Q2Y8YMoe3atUtorGyb0QwclrHEOsSya+6orxEbx6hZxb6/7HwMR+ZpaZjps27dOqGxeJCUlGRojNI4DJrLly/HqlWrcPHiRYwYMQJ79+7F0aNHceXKFcTGxjpMu1KUikLXqFIZOAyaoaGhCA0NRWZmJiZMmID8/HwsWbIEc+bMwdatW+3ufqxWK6xWKy5cuFARc1YUAGVbo+yOSFHM8Kd7mtHR0Rg5cmTxz1hfX19R1TokJAQxMTH056+iuBoza5Q936goZnAYNG02GyZNmoTevXsjICCguMJIeno6GjduXGETVBRH6BpVKgOHP8/feecdrFu3DllZWTh06BD8/f3x7LPPIi8vD6NGjaKv2bVrlyj7xp66Z6WgWEbCwYMHhcYMGWY2tW7dWmisJN2lS5eExjI1vv/+e6ExY4CZNsxQAXh5N6PZJD///LPQNm/eLDRmTLFsF5aRxTbt2fWqX7++0I4dOyY0dh3YezNKWdZoZmYmMjMz7bRnn33W0Hi+vr5CY+/zlVdeERrLrGGGDDOCHGXrlMZoBo6Zu21m+hiFjc3WHpvPl19+WeZxH3jggTK/1ggOV+y4ceMwbtw4lw6uKOVB16hSGehzmoqiKCbQoKkoimICDZqKoigmcGpGUFBQEMLDw+00Vups0aJFQmMlntgG8a+//io0tslusViElpKSIjRWEq1FixZCa9asmdBYdtJ9990ntA8++EBoANC+fXuhsWwSljnk6ekpNGZ+lX70BuCb8SxziM2bmSHMbGLXkJUPK48RVFZKGytGzQ5m+jAOHDggNPagPcuYYSYky65ia95oGTdmLJW3R5CjjCIjxzHDyGimGHvPRmFzMbIW9E5TURTFBBo0FUVRTKBBU1EUxQQaNBVFUUzg1F14b29vtGrVyk6bO3euOG7YsGFCy8jIEBorQ5aVlSU0tlH+008/CY0ZFqXnC/ASZqwUGzMGWGm3LVu2CA0APv/8c6GtXLlSaFu3bqWvL83ly5eFdvbsWaG1adNGaDt27BAa68nCzjd8+HChvffee0IrSnMsSd++fYXmaoxs9jOzg5lWLAsnMTFRaAkJCUJjWVjMLGMl34z2/mHvlWlGjRxHr2fXhp2TZe+xdWZ0zZfVzCkPeqepKIpiAg2aiqIoJtCgqSiKYgINmoqiKCZws5nZAf4TmjdvLvrhDBgwQBzHsoSYm
fDvf/9baCzrhZUrYxkzzCg5fPiw0IKCgoTWsmVLobFeOyzDiJlNAODh4SE0o0aXl5cXPWdpzp07JzRWci87O1toPj4+QmNl4JjpxupZsp44/v7+iImJEbqrqKwWGHFxcUJjWVOsX5HRDB6jJeSYseTIPGE6M5yMZvqwbLQ6deoIzahBaDQ7qTxZTJGRkXZrVO80FUVRTKBBU1EUxQQaNBVFUUygQVNRFMUETs0Iql69usicYQYK67ezd+9eoQ0dOlRoLHOIGRtr1qwR2sMPPyw0VoqLZSiwBvSsTB3LCGLvDQBq164tNHd3d6GxXkS7d+8WGmuhzN5Lt27dhDZ58mShsRJ+8+bNExozjFiJu7Fjxwrt66+/FtqNyJNPPlnZU7ghKU9JurLiMGguX74cq1atwsWLFzFixAi8+OKL8Pf3h6+vL6ZMmeK0CShKWdE1qlQGDoNmaGgoQkNDkZmZiYkTJ8LDwwP5+flo2LBhRc5PURyia1SpDP7053l0dDRGjx6NDh06oEqVKhg0aBD69u1r92yV1WqF1Wqlz0Eqiqsxs0YVpbw4NIJsNhsmTZqE3r17w9/fv/jBWS8vL7EPGBISgpiYGPpAuaK4irKsUUUpLw7vNN955x2sW7cOWVlZ2LFjB/bv34+aNWuiTp06Dn/+3H777aJR+9q1a8VxrMza5s2bhVavXj2hsawSlmXAevqsXr1aaE2aNBEay9Rp3ry50Fg20V133SU0lkEB8Eym/fv3C41l3LAMnvj4eKENGjRIaE2bNhXaM888I7TMzEyh+fn5CY29Z2Zybdq0SWjl6RFUljWqKOXF4YodN24cxo0bV5FzURRT6BpVKgN9TlNRFMUEGjQVRVFMoEFTURTFBE7NCMrLyxOlzZYsWSKO69Onj9BYCSpmtLDScKyxPCvbxkwflq3DjCBWEo1lwsTGxgqtU6dOQgOA1NRUoQUHBwtt8eLFQuvatavQZs+eLbTSpfoAbkCxnjMffPCB0M6cOSM0lvXFyn2xnjisFJ6iXM/onaaiKIoJNGgqiqKYQIOmoiiKCTRoKoqimMCpPYL69++Ppk2bIi0tjWadOJuKGEfHcO0YaWlpSE5Ods2ECLpGdYxyr1GbC4iMjHTFaStlHB3j+hrDWega1THKikt+noeEhLjitJUyjo5xfY3hLHSN6hhlxak/zxVFUW501AhSFEUxgQZNRVEUEzg1jRK4Vudx1KhRqF69OiwWC8LDw509BFJSUvDSSy+hdevWGDx4MCwWi9POfeTIEbz66qvIysrCp59+ioSEBGzcuBF5eXmYN28eTbEs7xi9e/eGr68vPD09MWvWLCe8C9k/Z+/evTh69CiuXLmC2NhYuLm5OX2Mv0qPHl2j5sfQNfo7Tr/TTE5ORlhYGOLi4rBixQpnnx4A4ObmBk9PT+Tm5qJx48ZOPbefnx/ef//94v9ftmwZ4uLi8Oijjzrt0ZjSY7i7u6OwsJDmt5eV0NBQxMXFITY2FomJidi1axfmzJmDtm3bYuvWrU4fY+nSpX+ZHj26Rs2PoWv0d5weNI8fP15cGKNq1arOPj0AIDAwEF9++SVef/11TJ061SVjFFH0187X1xfHjx93yRhJSUlYsGABTp06hT179jj13NHR0Rg5ciTq1q0LwDXvo6hHz/r167Fw4UKsXr2atje+XtA1ah5do7/j9KDZuHHj4jfMKhc5g5K9YPLy8lwyRmnS09OdfsdQRNH7qVevntOa09lK9M8JCAjA+fPnATj3fdhM9Oi5ntA1ah5do7/j9EeOsrOzMWbMGNSsWRPdunVzyX5RcnIyrFYrLly4gIiICKfuF2VkZCAqKgpr167FyJEj4evriy1btiAnJwdz5851yn5R6TEOHDgAd3d3XL16FfPnzy/+YMvD22+/jUWLFiEgIAAdOnTAb7/9hmPHjhXvezljv6jkGL6+vnY9embOnFnu87sKXaPmx9A1+jv6nKaiKIoJ9JEjRVEUE2jQVBRFMYEGTUVRFBNo0FQURTHB/wCCjkpBF2Sb6gAAAABJRU5ErkJggg==", | |
| "text/plain": [ | |
| "<Figure size 400x200 with 2 Axes>" | |
| ] | |
| }, | |
| "metadata": {}, | |
| "output_type": "display_data" | |
| } | |
| ], | |
| "source": [ | |
| "# Normalize original image to 0-1\n", | |
| "it_norm = it / 255.0\n", | |
| "\n", | |
| "# Create normalized batch and apply noisify\n", | |
| "it_batch_norm = it_norm.unsqueeze(0)\n", | |
| "(xn_norm, n), target_norm = noisify(it_batch_norm)\n", | |
| "\n", | |
| "print(f\"Normalized ranges - Original: {it_norm.min():.3f} to {it_norm.max():.3f}\")\n", | |
| "print(f\"Normalized ranges - xn: {xn_norm.min():.3f} to {xn_norm.max():.3f}\")\n", | |
| "print(f\"n = {n.item():.3f}\")\n", | |
| "\n", | |
| "# Display both\n", | |
| "fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 4))\n", | |
| "ax1.imshow(xn_norm.squeeze(), cmap='gray')\n", | |
| "ax1.set_title(f'xn at step n={n.item():.3f}')\n", | |
| "ax2.imshow(it_norm, cmap='gray') \n", | |
| "ax2.set_title('Original image')\n", | |
| "plt.show()" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "da7e0af1", | |
| "metadata": { | |
| "solveit_ai": true | |
| }, | |
| "source": [ | |
| "what about normalizing AFTER applying noisify, that shouldn't make a difference right? just doing\n", | |
| "`display(it.squeeze() / 255.0)`\n", | |
| "and \n", | |
| "`display(xn.squeeze() / 255.0)`\n", | |
| "\n", | |
| "\n", | |
| "##### 🤖Reply🤖<!-- SOLVEIT_SEPARATOR_7f3a9b2c -->\n", | |
| "\n", | |
| "Actually, that would make a big difference! Here's why:\n", | |
| "\n", | |
| "The `noisify` function generates random noise with `torch.randn()`, which creates values roughly in the range [-3, 3]. If your original image is in [0, 255] range, then:\n", | |
| "\n", | |
| "- `xn = (1-n)*z + n*x` mixes noise (around ±3) with pixels (0-255)\n", | |
| "- If you normalize AFTER, you're dividing this mixed result by 255\n", | |
| "\n", | |
| "But the noise component would become tiny after dividing by 255, so you'd barely see any noise effect.\n", | |
| "\n", | |
| "You should normalize BEFORE `noisify` so that both the image (0-1) and noise (±3) are on more comparable scales.\n", | |
| "\n", | |
| "Try both approaches and see the difference - what do you observe?" | |
| ] | |
| }, | |
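| { | 
| "cell_type": "markdown", | 
| "id": "e4d9b7f2", | 
| "metadata": {}, | 
| "source": [ | 
| "A quick numerical sketch of why the order matters, assuming `it` is the raw float 0-255 image from above:\n", | 
| "\n", | 
| "```python\n", | 
| "z = torch.randn_like(it)\n", | 
| "n = 0.5\n", | 
| "after = ((1-n)*z + n*it) / 255  # noise term std becomes 0.5/255, about 0.002: invisible\n", | 
| "before = (1-n)*z + n*(it/255)  # noise term std stays 0.5, same order as the image\n", | 
| "after.std(), before.std()\n", | 
| "```" | 
| ] | 
| }, | 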
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "0da428fd", | |
| "metadata": { | |
| "time_run": "3:15:35p" | |
| }, | |
| "outputs": [ | |
| { | |
| "data": { | |
| "image/png": "iVBORw0KGgoAAAANSUhEUgAAANAAAADNCAYAAADJ7P4xAAAAOnRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjEwLjYsIGh0dHBzOi8vbWF0cGxvdGxpYi5vcmcvq6yFwwAAAAlwSFlzAAAHsQAAB7EBBsVhhgAAD/tJREFUeJzt3X9InWUbB/CvrjY72kBNpeF2zIj+KNcQHEFZUoGMWGmYkwb9M9easw1lLMaKClwajP5Y23LZFgu2MRxnIa1xaJGQBCFZaSyL5dSktqZzrvnb9P3jRZnuua7Oea7Ho27fDwxe77v7nMtHr/c55/I+1x01MTExASJyJXquAyBayJhARAZMICKDO7x8sKioKC8fjmjeyc/PRyAQmPqadyCiMKSlpU37OuQ7UH9/P0pKSrB48WLk5ORg/fr1XsdGtOCEfAcKBAIoKChATU0N6urqZjMmogUj5DtQV1cXMjIyAACLFi2aNhcMBhEMBr2NjGgBCPkOlJqaiq6uLgDA+Pj4tLnc3Fy8//773kZGtACEfAd64YUXUFpaitOnT2Pt2rWzGRPRghFyAsXGxuKTTz6ZzViIFhyWsYkMmEBEBkwgIgMmEJEBE4jIgAlEZMAEIjJgAhEZMIGIDJhARAZMICIDJhCRgac9ESLljjvksMfGxiIYibMnnnhCnJv5UZBJv/76q7gmJibGcXxkZERck5qaKs69+OKLjuOff/65uKahoUGcu53xDkRkwAQiMmACERkwgYgMmEBEBguyCud1pa2oqEicKysrcxxftmyZuEaqtAHAihUrHMe3b98urmlsbHQcf/bZZ8U1O3bsEOe6u7sdxwsLC8U19913nzhXVVXlOL5z505xza2CdyAiAyYQkQETiMiACURkwAQiMmACERlEeXnI8Hw4YOuRRx4R577//nvH8StXrohrpI2r165dE9cMDg6Kc5KlS5eKc5WVlY7jubm54hptM+mSJUscx30+X9hrACAhIcFx/M477xTXrFy50nH8559/FtfMB2VlZdP6wPMORGQQcgLV19cjOzsbr776Kurr62cxJKKFI+QEioqKQlxcHIaGhtSXB0S3k5C38mRnZ+PJJ5/EpUuXUF5ejqNHj07N8YAtul2FfAeKjv7/fxofH4/h4eFpczxgi25XId+BAoEAgsEgrl69itLS0tmMiWjBmPMytrbGTWjnzp0T56TeAtevXxfXzDwPdlJsbKy4RvuehoaGwnoeAEhPT3ccv3z5srhGK7NPvpqYSdvlvnjxYnFO2n2emJgorpHK9lJs/0W65h7+egNgGZvIU0wgIgMmEJEBE4jIgAlEZOB5T4RwqyFuqiRvv/22OJeSkiLOdXZ2Oo7Hx8eHHUNvb684d9ddd4lzUsVq5t/WbtTc3Ow4rlXutI2h//zzj+O4VlkcGBgQ5+6++27H8T/++ENcI/WUOHDggLimpKREnPO62hYq3oGIDJhARAZMICIDJhCRAROIyIAJRGQQsc2k0iZBrQ2upKenR5zr6+sT56RSsbTBE5BLxW6+Vy0GaaMrIJdo3W7E/ffffx3HtR4G2uNJ108rzUsbTR944AFxjdY3QirNaz8LN7973ExK5CEmEJEBE4jIgAlEZMAEIjKI2AFbbqpw0nHs2sZG7ePZUqVLqxZJGyylShagV/Xi4uIcx0dHR8U1bgqlWoVOqixqH+nWYtA2oUqk7/fixYvimk8//VScy8/Pdxx3U2kLB+9ARAZMICIDJhCRAROIyIAJRGTABCIyiFgZWyuRSioqKhzHtRKytiFSKnFra6QSt9QHAACSk5PFOa3ELZF6LGi9F0ZGRsQ56fvVSunaNZf6Q2jXVfqzhnbY2erVq8U5v9/vON7R0SGukQ5PC+d3lXcgIgMxgdra2rBhwwYUFBQAAI4dO4aNGzfi5ZdfRn9/f8QCJJrPxARKT0/HoUOHpr4+deoUampqUFhYiEAgEJHgiOa7kN8DTW4N8fv9aGlpmTbHA7bodhX2e6DOzs6bjnjkAVt0uxLvQD09Pdi1axd++OEHVFZWIi8vD5s3b8bg4CD2798fyRiJ5i0xgRITE1FdXT1t7KWXXvrPBwy3tW9SUpL4WNLuae3wKI1UitX6EUifw29vbxfX1NXVhR3DY489Jq758ccfHce1MrbWelgqAkkHeQHA/fffL85JbXqvXr0qrpFi13bGa62M9+7d6zj+/PPPi2vc/GllJpaxiQyYQEQGTCAiAyYQkQETiMjA882k4X5+/5VXXhHnpIqeVj2RNggC8lHt2sZLadPj77//Lq5pamoS56SqXmZmprhmcHDQcfynn34S12jVTalqpl1XrfK5fPlyx3GtL4N0zbUYtKrec8895ziubfqVuplqcc/EOxCRAROIyIAJRGTABCIyYAIRGTCBiAwi1hNBsmnTJnFO2liofXZfK0G6afMqbbyUSsEA8PTTT4tzUvl2yZIl4pq0tDTH8XvvvVdco/UjuOeeexzHteujHW7l5tAwaVOt9mcI7ef+999/O46/++674prXXnvNcTycP8XwDkRkwAQiMmACERkwgYgMmEBEBhGrwj388MOO41rVTDqyXjqkCtA7aPp8PsdxrfIjVWRWrlwprsnIyBDnpM6kWsdSqeum9pFu7aPRUiVQqz5px8VLH7XWrqsUgxa3VtXr7u52HN+yZYu4RqrChYN3ICIDJhCRAROIyIAJRGTABCIyYAIRGUSsjF1WVuY4rpVOpTmt1KmVTqXeAlKvBAAYGBhwHL906ZK4RisvS2V77XuSDgbT+gdo35NUktY2oGp/HpCeK5zeApPcbiaV5qTyNiCXuMNpXc07EJGBmO5tbW3YvXs3+vr6cPLkSaxZswZ+vx9xcXHYs2dPJGMkmrdCPmDL5/NhfHwcKSkpEQmMaCEI+T1QbW0toqOjUV5ejubm5mlbWXjAFt2uQn4PNPnGMzk5+aY3tTxgi25XIR+w1draCp/Ph7GxMezYsSOSMRLNW1ET4fbi1R5MKVtevnzZcVz6LDsgl521/gFuyuJa6VTqiaCVVLX4YmNjHce13c5SuVrrYaBdB+nxtDK2tltc2iWtXQepLK7FrbVgln73tNa+Utxav4uysrJpr7ZYxiYyYAIRGTCBiAyYQEQGTCAiA083k6akpIiHLUndMLu6usTHk6pFWtXMzYZINxsltefRKnTSQVXa5/2lapZ27LtG+p60qp4Wn1Qd035O0nXQKmA9PT3inPTzkKqogPz7pXV8nYl3ICIDJhCRAROIyIAJRGTABCIyYAIRGXhaxl6+fDmeeuopx7nffvvNcVzbIKh9rt8NqUyrlbGlTYpaidZNXwap9wIgx61t3nUzp10HrWQulZBXrFghrjlw4IDjuNbDoKqqSpxrbGx0HNeug1SuLioqEtfMxDsQkQETiMiACURkwAQiMmACERkwgYgMPN+NnZWV5TiXlJTkOK6VsaXP4S9dulRco7W7lcrL2uf9pRKytuNa26kttf3VHk8qL7s5NQ6QS7vajmstPumaX7x4UVyzadMmx3HtZ7t582Z
xLi0tLazYAOC7775zHD9x4oS4Zt26ddO+5h2IyIAJRGTABCIyYAIRGTCBiAw8rcJ9++23OH/+vOPcn3/+6Tj+6KOPio+3evVqx/HDhw+La86dOyfOVVZWOo43NTWJa6R+BNomRTc9Fnw+n7hG2miqdfHUYnBTWdQODdOeS6JVxyRSpQ0Azp496zh+8OBBcU1tbW3YMczEOxCRgXgH+uyzz3D69Glcu3YNGzZsQEtLCy5cuIDR0VFUV1e7Or6P6FYjJlBeXh7y8vLQ29uL8vJyjIyM4OjRo9i3bx8aGhqQnZ0dyTiJ5qX/fAlXUVGB4uLiqZ0Efr//pl5uwWAQ5eXl6mtooluRmEATExN4/fXXsWbNGmRlZU19UrCzsxOpqanT/tvJA7a0LSxEtyLxJdwHH3yAs2fPoq+vD+fPn0dmZia2bduG4eFhlJSURDJGonlLTKCtW7di69atYT1Yb28vent7Hee2bdsWXmT4/8tFJx0dHeKad955R5yTNlhq/QikMra2kVMjvczVyrpu7uxam143tPikDcFa3GfOnDHHdKNnnnnG08cLFcvYRAZMICIDJhCRAROIyIAJRGTg6WZSQK5OuakKadU2SWtrqzgnbT/SNkpKH/ceHh4W12gfp5bmtK1R0jX1ujOptjlVI63TKnda5VPi9kAxiRR3OL+rvAMRGTCBiAyYQEQGTCAiAyYQkQETiMjA8zJ2uOVqrdwqdRLVPnd0/Phxce7YsWOO44mJieIaqVundviXmy6j2nWT5tyWnaXH0w4G055LOjRM6zLa0NAgzrmJwevNs6HiHYjIgAlEZMAEIjJgAhEZMIGIDJhARAael7HDpZUmvW6T9fHHHzuOP/jgg+IaqSWx1hPBzc5qjVQyd1P6BuRSuvazcNMTISEhQVxz5MgRcU7ipmyv/Szc/hngRrwDERkwgYgMmEBEBkwgIgMmEJHBnFfhImnjxo1zHQIZuKmaeVFp0/AORGQQ8gFbb7zxBjIzM+H3+7Fz585Ixkg0b4V0wNb27dsRGxuLkZERLFu2LJLxEc1r//keqKKiAlu2bMGqVasQHR2NdevWYe3atdP+yhwMBhEMBmc1UKL5KKQDtjIzM6e2oMTHx9/UbHDygC2i201IB2w1Njbil19+QUxMDBISEvgyjmjShIcA8B//3dL/ysrKpv3Os4xNZMAEIjJgAhEZMIGIDJhARAZMICIDJhCRAROIyIAJRGTABCIyYAIRGTCBiAw87YmQn5+PtLQ0AEB7e/vU/54rjIExeB1De3v79AEvd2PfaOau1bnAGBjDbMcway/hcnNzZ+uhGQNjmDcxRE1MzHLfH6JbGIsIRAZMICIDzzuT9vf3o6SkBIsXL0ZOTg7Wr1/v9VOEpL6+Hm+++SYeeughFBUVIScnJyLP29bWht27d6Ovrw8nT57EsWPH8PXXX2N4eBgffvghYmNj5ySONWvWwO/3Iy4uDnv27Jn155/ZV7ClpQUXLlzA6Ogoqqur1XN7ZiuG2eht6PkdKBAIoKCgADU1Nairq/P64UMWFRWFuLg4DA0NITU1NWLPm56ejkOHDk19ferUKdTU1KCwsBCBQGDO4vD5fBgfH0dKSkpEnj8vLw81NTWorq7G8ePH0dTUhH379iEjIwMNDQ0Rj+HEiROz0tvQ8wTq6urC8uXLAQCLFi3y+uFDlp2djTNnzuC9997DW2+9NWdxTP4/rd/vR1dX15zFUVtbi48++gh//fUXmpubI/a8FRUVKC4uRlJSEoC5uQ6TvQ2/+uorHD58GF988QWuXLniyWN7nkCpqalTF0g7ZnC23djHbnh4eM7imNTZ2RnRO+FMk9cjOTkZ169fn/Xnm7ihr2BWVha6u7sBRPY6TITR29Atz8vY/f39KC0tRUxMDB5//PE5ew8UCAQQDAZx9epVbN68OWLvgXp6erBr1y58+eWXKC4uht/vxzfffIPBwUHs378/Yu+BZsbR2toKn8+HsbExHDx40NVZreHYu3cvjhw5gqysLKxatQoDAwPo6OiYei8YifdAN8bg9/un9Tasqqry5Dn4dyAiA5axiQyYQEQGTCAiAyYQkcH/ADIG7bmUcrWaAAAAAElFTkSuQmCC", | |
| "text/plain": [ | |
| "<Figure size 320x240 with 1 Axes>" | |
| ] | |
| }, | |
| "metadata": {}, | |
| "output_type": "display_data" | |
| } | |
| ], | |
| "source": [ | |
| "show(it.squeeze() / 255.0)" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "b0537f26", | |
| "metadata": { | |
| "time_run": "3:15:35p" | |
| }, | |
| "outputs": [ | |
| { | |
| "data": { | |
| "image/png": "iVBORw0KGgoAAAANSUhEUgAAANAAAADNCAYAAADJ7P4xAAAAOnRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjEwLjYsIGh0dHBzOi8vbWF0cGxvdGxpYi5vcmcvq6yFwwAAAAlwSFlzAAAHsQAAB7EBBsVhhgAAEPlJREFUeJzt3X9MldUfB/D3pSK8kAXxQx14kWptNRxjo7YWxdaWo80GDX80t/6RVIiAWNkatfwDBzXXH6kLZdqqSXOya3MzdvuxWOkfybQSN1lzYMhsFgioyM/g+08wuD6fz/fecx4uEO/XX3lO57nnPvLx4b7vec7jmZycnAQRGYma7wkQLWYsICILLCAiC3e7eTCPx+Pm4ZYU7dzxY+rCUVhYCL/fP/1nVwsIAKKinC9qJj8EJgWpvY50PJMxExMT4U3sX9L50d6r9Fpu/4NlWqhunleNds6l47n9D1N6evqsP4dcQIODgygtLUV0dDTy8vKwZcuWsF+c6L8m5M9Afr8fRUVFaGhowIkTJ+ZyTkSLRshXoO7ubmRmZgIA7rrrrll9gUAAgUDA3ZkRLQIhX4FSU1PR3d0N4M7fRdetW4ePPvrI3ZkRLQIhX4FeeukllJWV4eTJk1i/fv1czolo0Qi5gGJjY/Hpp58av5CbiZp2LCnlAtxNs4J/jQ3ldQB3I2m3Uy7T40UqZtf+bk2SwHCP5TinsI9ORNNYQEQWWEBEFlhARBZYQEQWXF8LJ6UebiZqWrISqbVwpsmTyXsyWefl9roxt4X7cwKYJYHaGDfeL69ARBZYQEQWWEBEFlhARBZYQEQWWEBEFlyPscONSE3i4HvuuUfsGx0dDft4bi84fOaZZ8I+Xnt7u9gXExPj2D48PCyOSUtLE/uKiooc25ubm8UxP/74o9gnnT9twe0///zj2O527KzF+dL8wrldn1cgIgssICILLCAiCywgIgssICILEVtMakJKQ8bHx8UxJknN5s2bxb7KykrH9lWrVoljtBRn9erVju07d+4Ux7S2tjq2v/DCC+KYt956S+zr6elxbN+0aZM4JnhDwZnq6uoc26urq8Ux2u3ZErcX3JosQA3GKxCRBRYQkQUWEJEFFhCRBRYQkQUWEJEFj5sPGfZ4PGHHk1rka7J/wNq1a8W+s2fPOrb39fWFPYebN2+KY7RFnpL77rtP7KutrXVs12LslStXin333nuvY7vX6w17DAAkJCQ4tmuLfrOyshzbL1y4II6JVIytqaiomLUPPK9ARBZCLqCWlhbk5uZix44daGlpmcMpES0eIReQx+NBXFwchoeHkZqaOpdzIlo0Ql7Kk5ubi2effRbXrl1DVVUVjhw5Mt3HB2zRUhXyFWjqw3R8fDxGRkZm9fEBW7RUhXwF8vv9CAQC6O/vR1lZ2VzOiWjRcD3GDjdONIkZtej74sWLYp8Uxd66dUscI903r0W+2l4AQ0NDju1a/P/QQw85tkurqgFgYGBA7JNeS1vlrsXY0rjExERxzPLlyx3btXNnsoLb5GFn2hzKy8sZYxO5hQVEZIEFRGSBBURkgQVEZCFiO5NKaYiWrEgpya5du8QxSUlJYt+VK1cc2x944AFxjPR+tAWoy5YtE/vGxsYc27Uw9LfffnNsv/tu+a9PSwmlhbAmYwB5Iax0vgF5T4n9+/eLY7SvT0zCZJPFynccI+xXJaJpLCAiCywgIgssICILLCAiCywgIgsRW0xq8jAjaWraIsobN26IfdJeBdoeBtK8tQhZengUIMfYWvTt5lcAgLz4U9vDQDuedP6Cb3uZ6cEHH3Rsf+SRR8Qx999/v9gnLQjWFiuH8yCtKZWVlVxMSuQWFhCRBRYQkQUWEJEFFhCRBdcXk0pMHny1YcMGx/bbt2+LY7Tbs6XbkrUULjY21rF9dHRUHCMlbQAQFxcX9hg3HgQ1k5Qsard0a6T3pCV3UkJ37do1ccznn38u9hUWFop9Eun8cTEpUYSwgIgssICILLCAiCywgIgssICILERsTwRtgaWkpqYm7GNpCyKlXUGjo6PFMVJcLe2sCcjRNyDHt1p0Kh0vJiZGHKPF7NL71cZo57y/v9+xXZuf9HNy/fp1cUxOTo7Y5/P5HNu7urrEMdKCYO0rhWC8AhFZEAuoo6MDW7duRVFREQCgsbERr776Kl555RUMDg5GbIJEC5lYQBkZGTh06ND0n48fP46GhgZs3LgRfr8/IpMjWuhC/gw09Turz+dDW1vbrD4+YIuWqrA/A3V1dd3xiEc+YIuWKvEK1Nvbi+rqavzyyy+ora1FQUEBSkpKMDQ0pO4eSbSURGxPBIm2FW9ra6tju7bvgbZPgLSKW3t4lLTtb2dnpzjmzJkzYp90H/5TTz0ljvn1118d27V9FKRoGYAYAq1Zs0Yc8/DDD4t9KSkpju3a35P0NYAWpcfHx4t9P//8s2N7QUGBOMbkgV18wBaRi1hARBZYQEQWWEBEFlhARBZcX0wa7v3727ZtC/s1tHv3tR1DpbRNO560f0BHR4c45ty5c2KftLtmdna2OEZagBr8hfZM0s6fgPxwK+08aIla8PeCU7SUS0rbtN1MBwYGxL4XX3zRsV16+BdgtptpMF6BiCywgIgssICILLCAiCywgIgssICILERsTwSpffv27eKxpEhTu2fddLtbibTwcuXKleKY5557TuyT5q7tyyDd7y/F0YAe50sRt/bAKe3hVtLfk7ZIV9pjQdvTQovZpS2Ba2trxTGvv/66Yzu39iWKEBYQkQUWEJEFFhCRBRYQkYWIpXCPPfZY2MeSFg9qCwS1pEbaKVNLrKREZu3ateKYzMxMsU96mJf2kC8phTPdmdRkd1TtHEmLRrVETZqDtphUu4W9t7fXsX3Hjh3imLKyMrEvVLwCEVlgARFZYAERWWABEVlgARFZYAERWXA9xpZUVlaGPUZa3KhFndIeBoC8M6kWt0pjtDl4vV6xT4p8tePdvHnTsV176JW2OFU6RyYPJwPkiFtb2Cv1aXPQonnp64uenh5xjBRj7927VxwTjFcgIgviFaijowO7d+/GwMAAmpqakJ+fD5/Ph7i4OOzZsyeScyRasEJ+wJbX68XExIS4kTjRUhTyZ6Bjx44hKioKVVVVOH/+/KylLHzAFi1VIX8Gmvrwm5ycfMeGdHzAFi1VIT9gq729HV6vF+Pj49i5c2ck50i0YEXsAVvSPevSKlpA3npVW4Ws3dcv0aJTaU8ELVLV9gKIjY11bNe2wZUiWu29an+tUvytrbjWYnbp/WpRujQ/LZrX9sKQaHs5SD9H2l4TlZWVfMAWkVtYQEQWWEBEFlhARBZYQEQWXF1MmpKSIt6/n5iY6Nh+9epV8XhSIqMlNVoCJqU42j4KUkKnJUxaQift86Ali1KftnBWS+FMHjRmcl61VE9KWFesWCGO0RaGSvOTFgMD8ry1XWeD8QpEZIEFRGSBBURkgQVEZIEFRGSBBURkwdUYOy0tDXl5eY59v//+u2O7ySJFLaI1WWCpxbcm9+5rka/0frWtfaV5a+dBi7il96S
dO22xqxRXp6amimMOHDjg2P7333+LY+rq6sS+1tZWx3ZtX4Y1a9Y4tr/88svimOBzxCsQkQUWEJEFFhCRBRYQkQUWEJEFFhCRBVdj7BUrVuCJJ55w7EtOTnZs11ZWSytp4+PjxTFaFCtFu1rUabK9sBZjSyurtShdOkdatKy9J4m2Ilx7v9L8/vrrL3HMtm3bHNu1PQxKSkrEvvT0dMd27byeOXPGsf3o0aPimA0bNsz6M69ARBZYQEQWWEBEFlhARBZYQEQWXE3hTp8+jUuXLjn2lZaWOrY/+eST4vGkRO/w4cPimAsXLoh9H374oWP72bNnxTFSMqWlXFqyKC1CXbZsmThGu69fYrJAVnuIlpbQaQthJdr8JBkZGWLfN99849h+8OBBcUxTU5NjezgJJq9ARBbEK9BXX32FkydP4saNG9i6dSva2trQ2dmJsbEx1NfXG33PQPRfIxZQQUEBCgoK0NfXh6qqKoyOjuLIkSPYt28fTp06hdzc3EjOk2hB+r+/wtXU1KC4uBhJSUkAAJ/Ph+7u7ln/TyAQQFVVldHu+USLmVhAk5OTePvtt5Gfn4+cnJzpTe26urruuNNw6gFb2l2aRP9F4q9we/fuxXfffYeBgQFcunQJ2dnZqKiowMjIiJioES01YgGVl5ejvLw8rIP19fWhv7/fsa+iosKxXQsjVq9e7dh+5coVccyuXbvEPmkxqRYTS/GytpBTI8W3JtsLa7T9Ekz2WNDmJ21lrM27ublZ7JNo83v++efDHmOyN0QwxthEFlhARBZYQEQWWEBEFlhARBZcXUzq8XjEpEtKNrSUpKurK+w5tLe3i33Sa5ks5NQWUGoPlpL6tDRS6tOSQK1P+rvQ5mDyJbmW3EnnVft50M6rydIy6bW0xcDBeAUissACIrLAAiKywAIissACIrLAAiKy4GqMPTk5KUaXJlGsFIlrMWNjY6PY98UXXzi2JyYmimOkvQCio6PFMVp8K8XBJg8GC2fRYyjHM93pVIqkly9fLo45ffq0Y7sWVWtM9liQ3pMWpQfjFYjIAguIyAILiMgCC4jIAguIyAILiMiCqzG226RoMpyYcaZDhw45tj/66KPimKtXrzq2SxE7YLayWhsj7S2gnQct6jeJfE32RNC+HpC+UtCiee0cmUTzknDG8ApEZIEFRGSBBURkgQVEZIEFRGTB9RQu3HvTtdRFSkNMxgDA9u3bwz6eyb32JmNM5qC9V5Ok0iTl0vrcPncmu4yajOEDtogiJOQHbL377rvIzs6Gz+fDO++8E8k5Ei1YIT1g680330RsbCxGR0exatWqSM6PaEH7v5+Bampq8NprryErKwtRUVHYtGkT1q9fj4SEhOn/JxAIIBAIzOlEiRaikB6wlZ2dPf2BNT4+/o5NBacesEW01IT0gK3W1lZcvHgRMTExSEhI4K9xRP9y9QFbGpOY0fSef4mbCw4jOQeTSNrtyFfj5vG0RbomPw9aJO3GzxdjbCILLCAiCywgIgssICILLCAiCxFbTGqy4NDNMRptjElCF6lFj6ap2UJOAk12aDU9Hh9zTzTPWEBEFlhARBZYQEQWWEBEFlhARBZcjbELCwuRnp4OALh8+fL0f88XzoFzcHsOly9fnt0wOUfeeOONuTo058A5LJg5zNmvcOvWrZurQ3MOnMOCmYNnctLw62wiYohAZIMFRGTB9cWkg4ODKC0tRXR0NPLy8rBlyxa3XyIkLS0teO+99/D4449j8+bNyMvLi8jrdnR0YPfu3RgYGEBTUxMaGxvxww8/YGRkBJ988gliY2PnZR75+fnw+XyIi4vDnj175vz1g/cVbGtrQ2dnJ8bGxlBfX2+0a6ntHOZib0PXr0B+vx9FRUVoaGjAiRMn3D58yDweD+Li4jA8PIzU1NSIvW5GRsasB3kdP34cDQ0N2LhxI/x+/7zNw+v1YmJiAikpKRF5/YKCAjQ0NKC+vh5ffvklzp07h3379iEzMxOnTp2K+ByOHj06J3sbul5A3d3dSEtLA6BvEDHXcnNz0dzcjA8++ADvv//+vM1j6l9an8+H7u7ueZvHsWPHcPDgQfz55584f/58xF63pqYGxcXFSEpKAjA/52Fqb8Pvv/8ehw8fxtdff43r16+7cmzXCyg1NXX6BLm9q044Zu5jNzIyMm/zmNLV1RXRK2GwqfORnJyMW7duzfnrTc7YVzAnJwc9PT0AInseJsPY29CU6zH24OAgysrKEBMTg6effnrePgP5/X4EAgH09/ejpKQkYp+Bent7UV1djW+//RbFxcXw+Xz46aefMDQ0hP3790fsM1DwPNrb2+H1ejE+Po4DBw7M+VZeH3/8MT777DPk5OQgKysLt2/fxh9//DH9WTASn4FmzsHn883a27Curs6V1+D3QEQWGGMTWWABEVlgARFZYAERWfgfyHndfBIg+KsAAAAASUVORK5CYII=", | |
| "text/plain": [ | |
| "<Figure size 320x240 with 1 Axes>" | |
| ] | |
| }, | |
| "metadata": {}, | |
| "output_type": "display_data" | |
| } | |
| ], | |
| "source": [ | |
| "show(xn.squeeze() / 255.0)" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "6ea6bf00", | |
| "metadata": { | |
| "solveit_ai": true | |
| }, | |
| "source": [ | |
| "that makes sense, the noisfy is negligible if we don't normalize the images first. Great!\n", | |
| "\n", | |
| "##### 🤖Reply🤖<!-- SOLVEIT_SEPARATOR_7f3a9b2c -->\n", | |
| "\n", | |
| "Exactly! You've nailed the key insight. When the image pixels are 0-255 and the noise is only around ±3, the noise gets completely overwhelmed and becomes invisible.\n", | |
| "\n", | |
| "Normalizing first puts both the image (0-1) and noise (±3) on comparable scales, so you actually see the interpolation effect that flow matching depends on.\n", | |
| "\n", | |
| "This is a great example of why data preprocessing and scaling matters so much in machine learning!\n", | |
| "\n", | |
| "Ready to move forward with the normalized approach, or do you want to explore anything else about this scaling issue?" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "f5be3601", | |
| "metadata": { | |
| "time_run": "3:15:35p" | |
| }, | |
| "outputs": [ | |
| { | |
| "data": { | |
| "text/plain": [ | |
| "(tensor(-2.8647), tensor(3.3638))" | |
| ] | |
| }, | |
| "execution_count": 64, | |
| "metadata": {}, | |
| "output_type": "execute_result" | |
| } | |
| ], | |
| "source": [ | |
| "(xn, n), target = noisify(it.unsqueeze(0) / 255.0)\n", | |
| "target.min(), target.max()" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "0f805080", | |
| "metadata": { | |
| "solveit_ai": true | |
| }, | |
| "source": [ | |
| "what would be a good way of visualizing `target`?\n", | |
| "\n", | |
| "##### 🤖Reply🤖<!-- SOLVEIT_SEPARATOR_7f3a9b2c -->\n", | |
| "\n", | |
| "Good question! The `target` vector has both positive and negative values (roughly -3 to +3), so you have a few visualization options:\n", | |
| "\n", | |
| "1. **Use a diverging colormap** like `'RdBu'` or `'seismic'` that shows negative values in one color and positive in another\n", | |
| "2. **Shift and normalize** to 0-1 range: `(target - target.min()) / (target.max() - target.min())`\n", | |
| "3. **Show absolute values** with `torch.abs(target)` to see magnitude of changes\n", | |
| "\n", | |
| "Which approach sounds most interesting to you? The diverging colormap might be most intuitive since it shows which pixels need to get brighter vs darker." | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "d7a60ba9", | |
| "metadata": { | |
| "solveit_ai": true | |
| }, | |
| "source": [ | |
| "let's do the diverging colormap side by side with the original image, be concise\n", | |
| "\n", | |
| "##### 🤖Reply🤖<!-- SOLVEIT_SEPARATOR_7f3a9b2c -->\n", | |
| "\n", | |
| "```python\n", | |
| "fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 4))\n", | |
| "ax1.imshow(it_norm, cmap='gray')\n", | |
| "ax1.set_title('Original')\n", | |
| "ax2.imshow(target.squeeze(), cmap='RdBu', vmin=-3, vmax=3)\n", | |
| "ax2.set_title('Target vector')\n", | |
| "plt.show()\n", | |
| "```" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "e852dc63", | |
| "metadata": { | |
| "time_run": "3:15:35p" | |
| }, | |
| "outputs": [ | |
| { | |
| "data": { | |
| "image/png": "iVBORw0KGgoAAAANSUhEUgAAAU0AAACsCAYAAAD/n/5XAAAAOnRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjEwLjYsIGh0dHBzOi8vbWF0cGxvdGxpYi5vcmcvq6yFwwAAAAlwSFlzAAAHsQAAB7EBBsVhhgAAJiVJREFUeJztnXlcVXX6xz9sioALm7iA4JaaGyI4lrK4Ei6FuJajMqHjmiVZZM6UlabOqJO7SZpUamY/UJMMUyNzzIUUlxQ3RCRFkVWR/fL7Y15Sl8/3jvfoBdR53v8UH8493+859zlfD+dznucxKy8vL4cgCIJgFOY1PQFBEITHCVk0BUEQNCCLpiAIggZk0RQEQdCALJpGEh8fD19fX/j7+2PUqFHIzs7W+316ejreffddg5//7rvvEBMTo2nMOXPmYOfOnQ80X+Hxp6CgAAEBAQgICEDdunUr/j8rK+uh9x0fH4/z58+bYJa/s2HDBhQXF5t0n48iZuKe35+srCwEBARg7969cHZ2xubNmxEbG4svvvgCAHDvFJqZmZl03Dlz5sDb2xuDBg0y6X6Fxw9vb28kJCT81210Oh3MzY27D6qK2AoICMDOnTthZ2d33221zPVR4/GcdTUTGxuLIUOGwNnZGQDw4osv4tChQxg7diymTp2K/v37IyEhAcOGDQMA7N69G126dMHw4cPh5+eHlJQUbNiwAStWrAAAtGvXDuPGjYOnpyc2btwIAPj8888REBAALy8vfP755zVzoMIjjSpG5syZg9DQUAwYMAAnT57ElClT4OfnhzfeeAMBAQEAgISEBPTq1Qu+vr5YtGgRCgoKsGHDBsyaNQtjx46t2H9JSQl69OhR8fOYMWOQlJSE5ORkBAYGIiAgADNmzADwn7vgF198Ef7+/ujTpw9+/vlnJCYmIigoCEuWLEFaWhr69u0LPz8/TJs2DcB/7kRHjRqFwYMH47vvvqums2Z6LGt6Ao8D165dQ5MmTfQ0Z2dnZGRkwN/fHytXrkRKSkrF79555x3s3bsXtra2aNu2Le0vPT0dy5cvBwD069cPo0ePxtChQzFmzBgUFBSgR48eGDNmTJUek/D4YShG3NzcsGHDBiQkJCAnJwf79+/H7t27cfToUQDAW2+9hejoaNjb22Pw4MEYM2YMQkND6U7TysoK7dq1w4kTJ9CmTRtcuXIFbdu2xYgRI7Bq1Sq0bNkSkydPRkJCAg4ePAhvb2+8/vrrFXeNnp6eFXea06ZNw8yZM/Hcc88hLCwM+/fvrxjjm2++qf6TZ0Jk0TSCxo0b49KlS3razZs30apVK/j4+ND2ZWVlcHBwAAB06NCBft+iRQvUq1evYlsAiIuLw9KlS1FeXo6LFy+a+hCEJwBDMXIvBi9evIiuXbsCQMV/AeDkyZMYMmQIACA7OxtXr141OMaoUaOwZcsW+Pj4YMCAAQCApKQkhIWFAQBu376NwMBAnD17tkJT/Zl98eLFinn5+PjgwoULsLCwUF4vjxvy57kRDBw4ENHR0cjIyAAAbN68Gd27d4eFhYUyYCwsLJCdnY3i4mL8+uuv9HvVs8+5c+ciNjYWu3btgo2NjekPQnjsMRQj92KwVatWOH78OABU/BcAOnfujO3btyM+Ph7Hjh1D165dYWVlVfEP9h/p1asX4uPj8dVXX2HUqFEAgDZt2iAqKgrx8fFISEjAoEGD0K5du4q7R51OBwB6+2zVqhWOHDkCADh69Chat26tN9fHGbnTNAJHR0csXboUISEhMDMzQ6NGjbB69Wq8/vrryu3ff/999OnTB82bN0ejRo1gZWV13zFCQkLg6+sLLy8v2Nvbm/oQhCeA+8WIt7c36tWrBz8/P3Tp0qUi7hYsWICQkBDodDrUrl0bMTEx6N27NyIiIrBv3z4sXbq0Yh8WFhbw8vJCYmIiPDw8AAALFy7EpEmTUFhYCAsLC6xfvx4TJkxAaGgo/P39YWlpib179+L555/HiBEjMHToUERERGDcuHH48MMP0aFDB/j5+SE5OblazlNVI+55FVBSUgIrKysUFRXBx8cHx48fh4WFRU1PS/gf4F7s7d69GzExMVi9enVNT+mJQ+40q4Bt27Zh5cqVyMvLw2uvvSYLplBtTJw4EZcuXYJOp0NUVFRNT+eJRO40BUEQNGDSO01Tv9wtPPkMGTIE0dHR1TZe8ID+8HBz1dMuFvBl0DyXs2Us7BqQlnPiNGk2AUGkler43qSWBZsilvkZpN09y2Ncbv4n0lJu5ZPm39KRtNvFbADVPbWfNAD47alnSWvdoBZpO85kkhbcgp/lZ+z7kTTn3n6klVjzM9t9FzkTKqAwibSyp58hzbaQ51eWdZM0s6ZPkXb1+g29GDV60czPz8eUKVNQq1YtBAQEYPTo0cZ+VBAMcs9sMAXGxKiHmyv++d7betr2DH5bYUDyVtKsXFuRdmlVJGkN5y8kraBUR1rdWvzYxjb9JGnpX6wj7dug2aR9dyqdtHeC25N2JbeQtGaRM0kDgAMjODV4aOu6pJ2L4oV90SDODDpx4zppnn9jQ/WOUxvSpsacIe2DrM2kFY99j7RGmTy/onPHSTPvOYK0We/N09+GtjBAdHQ0hg0bhsjISOzYscPYjwlCtSExKlQHRt9ppqWloWPHjgBAxkZcXBzi4uJMOzNB0IgxMZpy9beamJrwBGH0naarqyvS0tIA/P4y6z0CAwOxZMkS085MEDRiTIx6uDWtiakJTxBG32mGhIRg2rRpiI2NxeDBg6tyToLwQBgTowlZ5Vh8Vt+Uec3HlrYzv82L682d20nzWMqv9dws4ueXTgf4uaSlM49xZSubYuVvf0zaWCt+LpmZ70CaizmbQ42usRnzQyg/hwWA4c5sYO1J47G/tttNmllhf9I6Ra4hrbQ2PyPNL+FzuD6oEY9x2Z80JPPx6Zyb8Xa9Q0ladNBwiuk9jF40bW1t8emnnxq7uSBUOxKjQnXw+CeCCoIgVCOyaAqCIGhAFk1BEAQNSO658D/F0862GOvZWE+zunmWtmu3mS+N1156m7SJGVz7dM8tZ9J23OQMnshLW0hr1NOLtNq3Oevl8ywX/mzd2qTprOuRZuYZSFrAsW9JA4AU+4GkNavP5lDO4DdIu1zImUdOtfmFfpebJ0jLqsd1aJ11d3iClpx1VNwmgDQLsLGkMptGV4oNAFgWq/+z3GkKgiBoQBZNQRAEDciiKQiCoAFZNAVBEDTwSBtBlpY8vdLS0iof18+PS1VVTssDgHPnzpFmbW1NWnFxsXIcV1dX0oYPH07azp07STtw4IByn8J/53axDqm5+t9H/uJ/0nZnw7nyjs6aM2FK7LnbaNDmN0nzGD6HNMvG80jDXq6alGjF1ZWuZHNZs1evfEZaUSIPYT36b6Rdb6/uf34ls4C0bk24etGNu3xd2lrxPVmD/ZwZ1e8Km1+rR3KZSfOC26SVNvcmTWX67LiQQ5r3BjavLkz6F2k0j/tuIQiCIFQgi6YgCIIGZNEUBEHQgCyagiAIGnikjaCHMX3uNbr/IzNmzCC
tSZMmpKlMn2bNuLTUzJncIuDo0aOkDRzIWRUA8OabbBjcunWLtBEjuAR/8+bNSVuwYAFps2bNUo79v0pxmQ63i/XjqvastbRdqjVnrsyI+ZW0rd6XSXMa9wpp249wm4eujdngKc3i799+yVTSnn99FWlDfuVMn9f6tSbNW5GpY6i94vqfr5BW178l79Ociztft+MY/aUjtyD53I/N0wa7l5Oma8GtO05Y8lwcrPn4WjpwSxOX7h1Jc3Xiz1Yury53moIgCBqQRVMQBEEDsmgKgiBoQBZNQRAEDTzSRpCKzp07k/bLL7+QlpXFjeVVGUZ5eXmkFRRwFkRSEpfnCg8PJ23+/PmkPf3006QBQGJiImm1a3N5Lxsbfoidns49rlXziYiIIK1Tp06knT7NfaGfRBpYW6KVQx09rWl5Nm94/ghJW1reJe24I2fSeOnY9Mm8w1lhNhe4l81ij7GkLd/D23X+hnuAT+/Lpk/fM5wlZNn0ZdKK6nCWDwBEDeI+Rm4TPifthZGcRRd/YA9pWyICSOvz7l7SNpzh7rYpi/jclGXydzL0qfqkuSu+47KBU0grL1Vn7/0Ro+804+Pj4evri0mTJiE+Pt7YjwlCtSExKlQHRi+aZmZmsLOzQ2FhoTJnWhBqGolRoTow+s9zX19f+Pv748aNGwgPD8fGjRsrfhcXF4e4OL6dFoTqxJgY/e1qag3OUHgSMPpO09z8P5va29ujqKhI73eBgYFYsmSJaWcmCBoxJkabuin6XwuCBoy+04yOjkZcXBxycnIwbdq0hxrUzIzLPpUbSkmoxObNm0lLTeW7hzt3uJ+IhQVnedja2pKmMl4KC7ksmMowWrZsGWkZGRmkAWoT6t6F/0cqLwCA+nylpaWR5ujoSNrJkyeNGlfFw3x3VY0xMWp2dB/M/qGfYZPXlHv6FI19j7SS+WwcdOjBBsNKp8GkvfccL9ZZUWyojA3jcnFd3uH9tVJkuIxcfpC0oDcnkZan48vePiuZNADwWpFCmlsHLod3OpmNVwcXvrY6lVwibUCfFqSt8FlEWlRdju/U+myylu+LIq20bxhpv90uIc29JJO0yhi9aIaEhCAkJMTYzQWh2pEYFaoDeU9TEARBA7JoCoIgaEAWTUEQBA2YPCOoslGgMgmMNQ7mzJlDmouLC2kqI8je3t6oMbKz+UF+nTp1SFOVi1MZNCqTRWVAAWrD6fZt7oOiMqvu3uVMiLp165J29epV0lTl8Fat4lJjU6YoMiYeEdPnQbHz9IbrNP3eMGZlnAWiS/mJtF6OoaR9k/49aYdy2Ezo15INuYVuvL+ZJRxnNlYcP005dGBuySbd23v52thzkLXjLzfgHQL4ZOozpHmVc0yVW3PsrbrI+ztfx4m0Wb34+Lac5h5IRy0406drGZusZQrTx/z2DdJc7dgAvKu7/9sVcqcpCIKgAVk0BUEQNCCLpiAIggZk0RQEQdCAyY2gykaBKtNEZaqoeOUV7rWSm5tLmrU19xhRGTwqQ0ZVLq6sjPuEqEwf1bgqo0SVRQOos4xU50vVK0m1T1V/IdW8MzPZqJg8eTJpqrJyKqPqYb7j6qY8NwNlySf0tAT3INqua1vONNnTho/J4hyf34n1uDeOTuGf5dzljJSWt8+SVm7XjjQzHcdE1F+7k9bIjuPb0Y7LD7ZfyeMCQHAglyCs25WPb8zKn0n78S0uF7c7OYe067f5HHZuxMbSjXzeDuzZYs9lXiP6Z3GpPytXLqW3KYMNu8rInaYgCIIGZNEUBEHQgCyagiAIGpBFUxAEQQNV3iPIWJNg+PDhpKmyXlQl31SGjMoAUWXWqEwflUFjZ8c9VEpK+EG+lowZlZmjMqtURpBqHNXxqVDNW9Vz6LPPuL/MkCFDSHtUTR8VZvUcYOmub6y038u1YFN/OUeax7wVpOW2ZLPj3FnOUll+mjNSjh3izJyD/r1I+/cZ/mxoF87q8qjP15rKgqxTSxFjJXwdAIBnU87COZ+ZT9qYQDZVXljPvbvuKkyfLRP/RJqtFR/LjB1sVtn4uJHW91osaWU9RpFmkXqMtGfXcEnA0/X1S9fJnaYgCIIGZNEUBEHQgCyagiAIGpBFUxAEQQNVbgSpTAwVc+fOJU1l0lhZWZGmModU26nMIVU5tYYNG5KmModUqMrKqTQAKC7mkmSqeauMG9W5UWVBqfanMueysrjHS7du3Uhzd3cn7cqVK6SpMq2MjYUqpVwHVMqm6ZvKZs6hSf6k7c1gW+UpRzbBQtpxybHxzmwOxXTrQVqnhlzzbd8lzuDq/f5e0uaM60ra/oucJda4Acdj4mLOigKAf1/lPlYfbPuVtDEB3OdHp0iD2vUKl5pbcZh7//g1dyBt4UDuTWRfzte+uS2PMfyLE6R91f46aQcncY8vbP5If/+8hSAIgmAIg4tmcnIywsLCMGzYMADApk2bMGHCBIwdOxb5+fzKgSBUNxKjQk1gcNFs0aIF1q1bV/FzTEwMIiMjMWLECERHR+ttGxcXh/Dw8KqbpSAoeJAYvfIb/0kmCFow+s/zey9iu7u7U3/twMBALFnCLwgLQnViTIy6N21cE1MTniA0G0GpqalwdXU1+HtjegQ5O/ODclVWT14eP4RWoTJFVPurV68eaSkpKaTt2LHDqDF69OAH+YmJiaQZMoJUxo3qz8oWLfghe8uWLUlT9f7Jyckxaj4qk0yVnbRsGT8of+GFF0irSdPnv8aorgwo1j/WQyNr0WZlTTuQZtGDzZLubcbw+J/+mYfN5DvcudFsDq104cyz1/pxts1bwRwnlrUukebj34W0Hy5z3A1YdZg0APh+fHvSuk/lEnTW5Rw/4zpwmbWnw3eRtv5NNt2cbPg7mf8DH9+E7mxM3i1jI/d5TzbYSlMPkDa6fx/SZlX62eCimZmZidmzZ+P48eOYP38+goODMXnyZBQUFGDlypWGPiYI1YbEqFATGFw0HR0dsWbNGj3tpZdeqvIJCYKxSIwKNYG8ciQIgqABWTQFQRA0UOU9glT89a9/JU1VJk1lJqgyTWrV4ofGqmwbVSbMpUv8cPnYMS4ZpTKRvLy8SCsoKCDtxAnORgDUhpjKzFGdB5VJ5ubGZbJU51V1blRjqEyk559/njRVVpWql5ChXknVSVnWTRQl6Zcs+/4pLhu2YCV/ZxM2RJP2djGftyOZfA20aNWbtLysONJ6ejdl7Qd+M6X05fdJM7vDJeTKFFk5H319irS1kzmLBgAscq+S9lO+C2n9nThjbuMVvt5+nB9I2t0Szqr68iQbZy0askm2P4VNLV93e9JeOLycNPMwPocFGzkzkT533y0EQRCECmTRFARB0IAsmoIgCBqQRVMQBEEDVV4aTsXEiRNJU2WkqEqiqcwEY3vUqLJtVMZLnz6cFaAyT2rXrk2ah4cHaY0bq1P3VGXbnJycSFMdn8qYUp1DVWaUKrtJZbCpzv/NmzdJ+/DDD0l75ZVXSNPSP6mqKHfvhNL+k/S0Pgp/qv4oT9KeceINU96YQFqzwWz6LLVjA+STmVySbnDoPNJm7WRzwlYx5923+LuuG8
rjJuz6P9LmJvD3CgDm7dhcBPha6PwhG2cuzRqQ9mk8G6+F+Rxng3pwpo9XY475BtZ8DTnW4Uy2BoP4/d2DvfuSVvbFdtJwRj++5U5TEARBA7JoCoIgaEAWTUEQBA3IoikIgqCBKjeCOnTgElsqMyc3N5c0OzvOAFCZGDY2XPZJZWyojIhOnTqR1rFjR9JUPYJUmqqHjqHScCrjRmU4qeatym5SlXJTnQfVGMaaSLducc+ZqVOnkqYygh4FLG/fhG2afsbX7Z++pe3cg98irU14LGnr3/qYtKaHV/Fnm3MsJ91iY3Lbp2+TNnLtEdKG9/QgLTKGe/e8vPgL0gpvcZxk3eHvHwCiT6WTNqd3c9IOze1H2r4UvqY7KUrfNbXjGDUr4t4/odsvk3ankDOyZvRuRZpDYy6vuONt/u4WlJ7j7Sr9LHeagiAIGpBFUxAEQQOyaAqCIGhAFk1BEAQNyKIpCIKggSp3z2fMmEGayg1WaSpHV+UGq2pYqmps3r17l7QbN7gGocrtVjn+qvnducOun6EmY6o5qlxxVbql6i0C1f6MrWFpbBqlSjPWUX8U+vYkFtpixS39tNZpvYNpO0sbPh/derIDuzXxN9J8+44mbZAZx2jpZXbtf2s5lrTEnezavx/C7v6zPtxM7ilHW9JC3+E0yquLuLkZAHRbm0ra8g/XkvbFxxGkncvga2HGv34kbdKfu5I2rD3X7OzgWp+08MK9pKXb8xs7N4v4OvBpxnU37zTl7xjQP18GF83k5GTMmzcPubm5+PrrrxEUFAR3d3fY2dlh0aJFhj4mCNWGxKhQExj887xFixZYt25dxc82NjbQ6XRwceF/AeLi4hAeHl41MxQEAzxIjGZdT6PfCYIWjH6muXXrVqxduxbXr1/HyZMn9X4XGBiIJUu4JL8gVCfGxKhDYwP90AXBSIxeNO89a2vYsKHyuZ0g1DQSo0J1YPCZZmZmJmbPno3jx49j/vz5SEpKgo2NDUpLS/Hmm28aPYCqEZeqJqPKsDA2FVKFqgalyjxR7U+1na0tP1BXGUYq08dQvU/V2KrPP0zTOVXNT2PPg7H1TFUL1OzZs0kztRH0IDHqaVeCqW766YtRN/jP+ZfOfUbap3Y5pKUHTOFBdBkkJYPrpC6+24O0pM+4qd+hr94hTdVQ7NivfF1NfNaDtDXvDCFt7131/dPh57n+pe7Nd0mzyLlGWksHvqvvGsG1Rref4iZqv2Zwiungdvw91R94lLSPVgWRllfIsTy8QyPSVhy+/+Mbg4umo6Mj1qxZc98dCEJNITEq1ATynqYgCIIGZNEUBEHQgCyagiAIGjBpRpCLiwvc3PQbMakahaWl8cNWY40NY7NjjM2YUe1PZYDk5eWRpqo3qTJeVHUuDaGao8pIUo2tqpOpOoeqY1E1mMvMzCRNdb5UDetU36ehBnPVSWmdBrjrqF9v8ewJNjsW2/Qi7WhOFmlfJmwibXR6F9L+FdyQtFUdckhb6NSUtFb1+N7GoS1fVyN/YaMtBd14fpP4xf+dG2aRBgA9dnGczavP14eLHZs+uQrzZd7Os6TtGdqAtJQ6bLxO2JRImkdPNponWPMYISd5jNAuHPNvduDrJaJSQpbcaQqCIGhAFk1BEAQNyKIpCIKgAVk0BUEQNGBSI8jNzQ29e+u/8X/+/HnaTmVYqAwQYzE2+8fYrCNjy8+pSs2p5mKoPJtKV2mqY1GZSyqTplmzZqStWsWNv1Tl3RYsWEDa0aOcgaGas8r0GTVqFGnVjbkZYFXpViHp2m3abutYT9L6H+esl5shXAbO5zQ3I3NSlIa75MQl0ZzusNl055M5pH3aYSJpr7+s2G7XRdLS4+aSZvbNR6QBwITnXiJt1hfHSTv5zRbSDn/D9Sj6dea42JbHzda87Tim/i/Mm7Q6+6NIg0sfkmpZcgbVul+4rJ8xyJ2mIAiCBmTRFARB0IAsmoIgCBqQRVMQBEEDJs8I8vHx0dOcnZ1pO5URVFhYSFq9evVIMzZzSLU/lUmjyv5RGSqqMnCqz6pMG1XfH0Bt5qhMFVX2j2ps1blJT2dTYuJENhFU53ry5MmkeXh4GDXu4cOHSduyhc2CkSNHklaV5BaV4XKufvx1ataAtnthPZdo2zrBh7QvT3OPqdNpuaRttGGj88DFq6Rl5HHfqRfHs3HzuiKkLudyTLzduyVp6Xd4u04fsckFAPYtviftfCSbX1bT25P2yYUc0v7ixVk4N/I5fr48yXE7w8uBtDUNAkmbaM9m06phnGl1OZvP9YRVB0nrV+lnudMUBEHQgCyagiAIGpBFUxAEQQOyaAqCIGjApEbQwYMHcfGifgbCtWv8gLl79+6kdevGJazWr19P2pkzZ0ibP38+aceO8YN8Vdk2YzNwVBlLNjY2pKmyhAz1NVKNY6xZpTKmVPtToTJuVKhMnz179pD28ccfk7Z161ajxqhuyn/8FrrXU/S0Py/eTNv93Z8zqU7eYuNgigebmvsnTyLtL11eIS05nHvUFLb2Ja3WXS7Rt5U9JLR25Hic9Q1n5H0Vwse28zPuQwQANlZsVm46zb2IBj/Fpeou3kwmrV5n7vPz2ja+pod5sXHzdTJnVbV14hJyX/7KPZrs67C5G+Ben7TlE3lt2rFyn97PBhfNbdu2ITY2Fnl5eQgLC8OpU6dw+fJllJSUYM2aNQZTAwWhupAYFWoCg4tmcHAwgoODkZ2djfDwcBQXF2Pjxo1YsWIFDhw4AF/f3/9FjIuLQ1xcnPKOSBCqigeJ0Wv5fLciCFq47zPNuXPnYvz48RXvW7q7u1Pl9cDAQCxZskT5fqMgVDVaYrSJLT/WEAQtGFw0y8vLERERgaCgIPj4+FRUwUlNTYWrK5e2F4TqRmJUqAkM/nm+fPly7NmzB7m5ubh48SK8vLzw6quvoqioCFOmTFF+Jjs7G9nZ+iWYXn31VaMm4u7uTtqVK1dIe++990hTZdaoDBmVEWQoW6cyxmbgaLnbVpk+xqIaW5VppZrPrl27Hnjcvn37PvBnTc2DxKh9QB889YG+6RF7g3scpeVxTPUqSCTt2e08xqtRMaS9dodNpJeOscGTsv0n0tq1dCStgyubGC+04e0WB3OmjlnqAdL+ulb92OLQQs64UV0yZzL4evuwfyvS0m7zdTSgIxti/h4NSLO24GfUNyNCSWv57iekNTPnvlhXC9g4bRfzAWk7oB8LBhfN6dOnY/r06YZ+LQg1jsSoUBPIe5qCIAgakEVTEARBA7JoCoIgaMCkGUEAGyvGmh0q00dFUlISaaqXmFUZM6pycUVF/IBeZSwZW8ZNZSw9bI8gQxlFxmynMoxUJpkK1TEbi2ouD2N8mQqzch3MyvQNs9RcPkd/atqAtPL6bGxsmcKlD+vX5vN28CobEcOe5uyYg1e5rFyJ4rx98OkvpIUo9lfHkuNpcQ4fx7JXuTQgAHylyK4pLGUDxbUely/cc5mP5ftznE00bQ+Xvhvdjp9Vfzbak7SiHDbxIv+dQto/AvjcFBTyeU376ghp6PmM3o9ypykIgqABWTQFQRA0IIumIAiCB
mTRFARB0IDJjSBjHvarzA5Vnx9VFs7mzVzGa9OmTaQ5OnJ2hKrXjqrkm7G9f1THqtKMNXIMfV51blT7LCjgrA5V758DBzgjRMWjauY8DOW5N1GenKinWZl703ZetndIKzu+jzTuRgOsr8tZU6tjTpOWGGZP2ti/rCPtwPZFpH009RnSXK3Z0Eq4xd9XpiI7SWXkAICLLV8fPYfNJq3h0z1I+2RW5e46QERACx4kgEtA9mrqSZp92CnSaj/zNGmLuvGct6fx9ZtXyEaV2xeKFK81C/R+lDtNQRAEDciiKQiCoAFZNAVBEDQgi6YgCIIGTG4EGYPKYHiYqu+ffMKloNq0aUOaql+RsRk8xpaQUxlLhswTla4ynIzN9FGVhnNwcCAtKipKOR9jxlXxMFlM1Y2ZnT3Mm7bW08Yd2ULb/eAQSlqftn8ibeEFLjf4lsNvpL3cnzNripv1Ie3ObjfSUsr5Mg1bcZC0HWVfkuY9sD9pua253NuoRT+SBgDHPuhFWo8xL/E+b3GWmYsdnxtHM0UJOgs+vm6JfHx3Svh6KXzpXdKyS3m7I16epL2RycZSvTI2ACsXUpQ7TUEQBA3IoikIgqABWTQFQRA0IIumIAiCBmrECDI1EyZMqOkpPJE8TEm6RxVd5nWUnjmkrylK5TWcNZa0C/XZ2BhcwIbcO+MWkzb7LvcDyv4H98/a0juCtGlWe0lLmO1Hmvl5NkAy4rgfVMkQztRxU/QhAoByc14i3g5kk/VIWg5poz/ifkdHe7FJVurP57pZfS6vZ/fTZ6zp2Dit1Z4Nuz8PbUvaO3EXSAvupMrx0sfgorlt2zbExsYiLy8PYWFh+Nvf/gYvLy+4u7tj1qxZ992xIFQ1EqNCTWBw0QwODkZwcDCys7Mxc+ZM2Nraori4GE2aNKnO+QmCQSRGhZrgvn+ez507F1OnToWnpyfMzc0xcuRIDB48WO/9v7i4OMTFxVXpRAXBEFpiNPUm/5ksCFowaASVl5cjIiICQUFB8PLyqni5297entpGBAYGYsmSJVU7U0GoxIPEaLOG6md3gmAsBu80ly9fjj179iA3NxdHjx7F2bNnYW1tDQcHB/nzR3gkeJAYNXdqCssOPfVFe1farn0bLgPXfDnfpSZ/PIS0iaVsGJkfqUuavWd70kYfXkZa6tgPSPv5Epc1G5p2iTSXF0NJe86Js9YGdFPfgWcUstHi3cSOtGeb2pAW3pmP+fo/vyZtv8sA0l44sZa08nr1Savlw9lNJY4epLVdxeXn3E/cJq2NEx9H5b+hDS6a06dPx/Tp3NxIEB4VJEaFmkDe0xQEQdCALJqCIAgakEVTEARBA09ERpAgGMst2OGUWVM9rUPeDdou/SnOmrm8nMuGle7jLBUnRRnBLL8w0up8NY80+9A3SHO8fpS0/WWtSYtvxyXbVuxmcyimHWcJ5XqxoQUAWfmc8eR07nvesDVn4ZinnyOtUb/epF3IyCct8Jo/aVEDu5Hm+hMbRsnPcobgiXQ+jlqW/D01tigkrTJypykIgqABWTQFQRA0IIumIAiCBmTRFARB0IBZuQnreoWEhMDDwwMpKSnw8PAw1W4NUh3jyBhVO0ZKSgqio6OrZkIKJEZljIeO0fIqYMaMGVWx2xoZR8Z4tMYwFRKjMsaDUiV/ngcGcj7o4zqOjPFojWEqJEZljAfFpH+eC4IgPOmIESQIgqABWTQFQRA0YPI0yvz8fEyZMgW1atVCQEAARo8ebeohEB8fj7///e9o3749Ro0ahYCAAJPtOzk5GfPmzUNubi6+/vprbNq0CT/88AOKioqwevVq2NramnyMoKAguLu7w87ODosWLTLBUXD/nFOnTuHy5csoKSnBmjVrYGZmZvIxHpcePRKj2seQGP0dk99pRkdHY9iwYYiMjMSOHTtMvXsAgJmZGezs7FBYWAhXVy4g+zC0aNEC69atq/g5JiYGkZGRGDFihMlejak8ho2NDXQ6HVxcXEyyf+A//XMiIyOxZs0abN68GceOHcOKFSvQsWNHHDhwwORjbNmy5bHp0SMxqn0MidHfMfmimZaWBjc3NwCAhQW34TQFvr6+2LVrFxYuXIh33323Ssa4x71/7dzd3ZGWllYlY2zduhVr167F9evXcfLkSZPue+7cuRg/fjycnZ0BVM1x3OvRs3fvXqxfvx7ffvstsrKyTDqGKZEY1Y7E6O+YfNF0dXWtOGCdjvswm4I/9oIpKiqqkjEqk5qaavI7hnvcO56GDRvizh2upPMglP+hf46Pjw9u3boFwLTHUa6hR8+jhMSodiRGf8fkrxzl5+dj2rRpsLa2Rs+ePavkeVF0dDTi4uKQk5ODyZMnm/R5UWZmJmbPno3vv/8e48ePh7u7O3766ScUFBRg5cqVJnleVHmMpKQk2NjYoLS0FB9//HHFF/swLFu2DFFRUfDx8YGnpyfu3r2LK1euVDz3MsXzoj+O4e7urtejZ8GCBQ+9/6pCYlT7GBKjvyPvaQqCIGhAXjkSBEHQgCyagiAIGpBFUxAEQQOyaAqCIGjg/wFKJSgpQxxYcQAAAABJRU5ErkJggg==", | |
| "text/plain": [ | |
| "<Figure size 400x200 with 2 Axes>" | |
| ] | |
| }, | |
| "metadata": {}, | |
| "output_type": "display_data" | |
| } | |
| ], | |
| "source": [ | |
| "fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 4))\n", | |
| "ax1.imshow(it_norm, cmap='gray')\n", | |
| "ax1.set_title('Original')\n", | |
| "ax2.imshow(target.squeeze(), cmap='RdBu', vmin=-3, vmax=3)\n", | |
| "ax2.set_title('Target vector')\n", | |
| "plt.show()" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "ff727280", | |
| "metadata": { | |
| "solveit_ai": true | |
| }, | |
| "source": [ | |
| "Amazing! It has a bit of the shape of the original shirt, but clearly, there is some noise. So, I understand this representation could be explained as, given that initial noise, what is more or less the direction of the gradient to reconstruct the final image. Randomization is beneficial because then we can use the same image, or many of them, to learn and have more samples. Is that correct? Or is there something else? As in, if we just did it without noise, why wouldn't it work?\n", | |
| "\n", | |
| "##### 🤖Reply🤖<!-- SOLVEIT_SEPARATOR_7f3a9b2c -->\n", | |
| "\n", | |
| "Exactly right! You've grasped the key insights perfectly.\n", | |
| "\n", | |
| "The randomization serves multiple purposes:\n", | |
| "\n", | |
| "1. **Data augmentation** - one image becomes many training samples at different noise levels\n", | |
| "2. **Generalization** - the model learns to denoise from ANY starting point, not just one specific path\n", | |
| "3. **Avoiding trivial solutions** - without noise, the model might just memorize \"always output the training image\"\n", | |
| "\n", | |
| "If you used the same starting point (like all white) for every image, the model would learn very rigid patterns. But with random noise, it learns the more general skill: \"given this noisy mixture, what direction leads to the clean image?\"\n", | |
| "\n", | |
| "This is why it can later generate NEW images - you start from pure noise and follow the learned direction vectors.\n", | |
| "\n", | |
| "Does this connection between training with random noise and generation make sense?" | |
| ] | |
| }, | |
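| { | |
| "cell_type": "markdown", | |
| "id": "c3d9f1a0", | |
| "metadata": {}, | |
| "source": [ | |
| "To make this concrete, here is a minimal sketch of a flow matching `noisify`: the straight-line interpolation and velocity target are the standard flow matching setup, but the helper name and batch handling here are my own rather than the post's exact code.\n", | |
| "\n", | |
| "```python\n", | |
| "import torch\n", | |
| "\n", | |
| "def noisify_fm(x):\n", | |
| "    t = torch.rand(x.shape[0], device=x.device)  # one random time per image\n", | |
| "    t_ = t.view(-1, 1, 1, 1)  # broadcast over (C,H,W)\n", | |
| "    z = torch.randn_like(x)  # random starting point\n", | |
| "    xt = (1-t_)*z + t_*x  # point on the straight path from z to x\n", | |
| "    return (xt, t), x - z  # model input and constant-velocity target\n", | |
| "```" | |
| ] | |
| }, | |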
| { | |
| "cell_type": "markdown", | |
| "id": "4ff083e3", | |
| "metadata": { | |
| "hide_input": true | |
| }, | |
| "source": [ | |
| "## Training\n", | |
| "\n", | |
| "To train our model we'll be using FastAI's miniai [framework](https://github.com/fastai/course22p2/) from their fantastic 2022 Diffusion [course](https://course.fast.ai/Lessons/part2.html). Almost all the code is plucked straight from the conditional diffusion [notebook](https://github.com/fastai/course22p2/blob/master/nbs/28_diffusion-attn-cond.ipynb). Implementing _flow matching_ requires two changes to the notebook. The first is a minor tweak to the `noisify` method (shown above) that creates our training dataset.\n", | |
| "\n", | |
| "The second change is to the `sample` method used to generate our images. Here's the _flow matching_ sample method." | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "a15958f2", | |
| "metadata": { | |
| "time_run": "3:15:35p" | |
| }, | |
| "outputs": [], | |
| "source": [ | |
| "import torch\n", | |
| "\n", | |
| "@torch.no_grad()\n", | |
| "def sample(model, sz, n_steps):\n", | |
| " steps = torch.linspace(0,1, n_steps)\n", | |
| " xn = torch.randn(sz).cuda()\n", | |
| " dn = 1.0 / n_steps\n", | |
| " preds = []\n", | |
| " for _, step in enumerate(progress_bar(steps)):\n", | |
| " step = step[None].cuda()\n", | |
| " dxn_dn = model((xn, step))\n", | |
| " xn += (dxn_dn*dn)\n", | |
| " preds.append(xn.float().cpu())\n", | |
| " return preds\n", | |
| "\n", | |
| "#sample(model, sz, steps=100)" | |
| ] | |
| }, | |
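| { | |
| "cell_type": "markdown", | |
| "id": "b7e4a2c9", | |
| "metadata": {}, | |
| "source": [ | |
| "The loop is plain Euler integration: each step nudges `xn` along the predicted velocity `dxn_dn` by `dn = 1/n_steps`. A usage sketch (the batch size and the padded 32x32 Fashion MNIST shape are assumptions):\n", | |
| "\n", | |
| "```python\n", | |
| "preds = sample(model, (16,1,32,32), n_steps=100)\n", | |
| "final = preds[-1]  # last step of the trajectory is the generated batch\n", | |
| "```" | |
| ] | |
| }, | |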
| { | |
| "cell_type": "markdown", | |
| "id": "bbd0fb16", | |
| "metadata": { | |
| "hide_input": true | |
| }, | |
| "source": [ | |
| "## Results\n", | |
| "\n", | |
| "Here are the results after training the _flow matching_ model on Fashion MNIST for 5 epochs (~3mins) on an A100. As a benchmark, a DDIM diffusion model was also trained. To my (probably biased) eye the images are of comparable quality.\n", | |
| "\n", | |
| " DDIM  Flow Matching\n", | |
| "\n", | |
| " DDIM  Flow Matching\n", | |
| "\n", | |
| "* * *\n", | |
| "\n", | |
| "## Conclusion\n", | |
| "\n", | |
| "That's it. We've developed a _flow matching_ model that generates images of comparable quality to diffusion variants like DDIM, is conceptually much simpler, and uses 40% less sampling code 🥳.\n", | |
| "\n", | |
| "The _flow matching_ training notebook and setup instructions can be found in this [repo](https://github.com/comhar/flow-matching). If you have any comments or questions feel free to message me at [email protected]_" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "221835aa", | |
| "metadata": { | |
| "heading_collapsed": true | |
| }, | |
| "source": [ | |
| "# Exploring code with ripgrep, ast-grep and sed" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "5d81ab45", | |
| "metadata": { | |
| "hidden": true, | |
| "solveit_ai": true | |
| }, | |
| "source": [ | |
| "I would like to be able to run what is referenced in this post, let's have a look at the repo's readme.\n", | |
| "\n", | |
| "##### 🤖Reply🤖<!-- SOLVEIT_SEPARATOR_7f3a9b2c -->\n", | |
| "\n", | |
| "<details class='tool-usage-details'>\n", | |
| "\n", | |
| "- `read_url({'url': 'https://github.com/comhar/flow-matching'})`\n", | |
| " - `'Skip to content\\n\\n## Navigation Menu\\n\\nToggle navigation\\n\\n[ ](/)\\n\\n[ Sign in ](/login?return_to=https%3A%2F%2Fgithub.com%2Fcomhar%2Fflow-matching)\\n\\nAppearance settings\\n\\n * Platform \\n\\n * [ GitHub Copilot Write better code with AI ](https://github.com/features/copilot)\\n * [ GitHub Spark New Build and deploy intelligent apps ](https://github.com/features/spark)\\n * [ GitHub Models New Manage and compare prompts ](https://github.com/features/models)\\n * [ GitHub Advanced Security Find and fix vulnerabilities ](https://github.com/security/advanced-security)\\n * [ Actions Automate any workflow ](https://github.com/features/actions)\\n\\n * [ Codespaces Instant dev environments ](https://github.com/features/codespaces)\\n * [ Issues Plan and track work ](https://github.com/features/issues)\\n * [ Code Review Manage code changes ](https://github.com/features/code-review)\\n * [ Discussions Collaborate outside of code ](https://github.com/features/discussions)\\n * [ Code Search Find more, search less ](https://github.com/features/code-search)\\n\\nExplore\\n\\n * [ Why GitHub ](https://github.com/why-github)\\n * [ Documentation ](https://docs.github.com)\\n * [ GitHub Skills ](https://skills.github.com)\\n * [ Blog ](https://github.blog)\\n\\nIntegrations\\n\\n * [ GitHub Marketplace ](https://github.com/marketplace)\\n * [ MCP Registry ](https://github.com/mcp)\\n\\n[ View all features ](https://github.com/features)\\n\\n * Solutions \\n\\nBy company size\\n\\n * [ Enterprises ](https://github.com/enterprise)\\n * [ Small and medium teams ](https://github.com/team)\\n * [ Startups ](https://github.com/enterprise/startups)\\n * [ Nonprofits ](/solutions/industry/nonprofits)\\n\\nBy use case\\n\\n * [ App Modernization ](/solutions/use-case/app-modernization)\\n * [ DevSecOps ](/solutions/use-case/devsecops)\\n * [ DevOps ](/solutions/use-case/devops)\\n * [ CI/CD ](/solutions/use-case/ci-cd)\\…`\n", | |
| "\n", | |
| "</details>\n", | |
| "\n", | |
| "Perfect! I can see the setup instructions from the README. The repository provides a clear setup process that involves:\n", | |
| "\n", | |
| "1. Creating a virtual environment\n", | |
| "2. Cloning the fastai course22p2 repository \n", | |
| "3. Modifying the `settings.ini` file\n", | |
| "4. Installing packages\n", | |
| "5. Running the `flow_matching.ipynb` notebook\n", | |
| "\n", | |
| "The setup looks straightforward but requires the fastai course22p2 framework. Would you like to start setting this up, or would you prefer to examine the notebook first to understand what we'll be running?" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "7e39a0a6", | |
| "metadata": { | |
| "hidden": true, | |
| "time_run": "3:15:36p" | |
| }, | |
| "outputs": [], | |
| "source": [ | |
| "nb_url = \"https://raw.githubusercontent.com/comhar/flow-matching/refs/heads/main/flow_matching.ipynb\"\n", | |
| "nb_content = xget(nb_url).text\n", | |
| "with open(\"diffusion_notebook.ipynb\", \"w\") as f: f.write(nb_content)\n" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "6fb1dfac", | |
| "metadata": { | |
| "hidden": true, | |
| "time_run": "3:15:38p" | |
| }, | |
| "outputs": [ | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| "[NbConvertApp] Converting notebook diffusion_notebook.ipynb to python\r\n" | |
| ] | |
| }, | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| "[NbConvertApp] Writing 7771 bytes to diffusion_notebook.py\r\n" | |
| ] | |
| } | |
| ], | |
| "source": [ | |
| "!python -m nbconvert --to python diffusion_notebook.ipynb" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "0c842d0e", | |
| "metadata": { | |
| "hidden": true, | |
| "solveit_ai": true | |
| }, | |
| "source": [ | |
| "I have downloaded and exported the notebook into a Python file, and I would like to take this chance to explore the tools `ripgrep`, `ast-grep` and any other useful for dealing with the files like `sed`. \n", | |
| "\n", | |
| "##### 🤖Reply🤖<!-- SOLVEIT_SEPARATOR_7f3a9b2c -->\n", | |
| "\n", | |
| "Great idea! Let's explore these powerful text processing tools. Here are the main ones available:\n", | |
| "\n", | |
| "**ripgrep (`rg`)** - Super fast text search\n", | |
| "**ast-grep (`sg`)** - Structure-aware code search \n", | |
| "**sed** - Stream editor for find/replace\n", | |
| "\n", | |
| "Let's start with ripgrep to explore the converted notebook. Try:\n", | |
| "\n", | |
| "```python\n", | |
| "rg(\"def \", 'diffusion_notebook.py')\n", | |
| "```\n", | |
| "\n", | |
| "This will find all function definitions. What do you see, and what would you like to search for next?" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "389ef15c", | |
| "metadata": { | |
| "hidden": true, | |
| "time_run": "3:15:38p" | |
| }, | |
| "outputs": [ | |
| { | |
| "data": { | |
| "text/plain": [ | |
| "\u001b[31mSignature:\u001b[39m rg(argstr: str, disallow_re: str = \u001b[38;5;28;01mNone\u001b[39;00m, allow_re: str = \u001b[38;5;28;01mNone\u001b[39;00m)\n", | |
| "\u001b[31mDocstring:\u001b[39m Run the `rg` command with the args in `argstr` (no need to backslash escape)\n", | |
| "\u001b[31mFile:\u001b[39m ~/.local/lib/python3.12/site-packages/fastcore/tools.py\n", | |
| "\u001b[31mType:\u001b[39m function" | |
| ] | |
| }, | |
| "metadata": {}, | |
| "output_type": "display_data" | |
| } | |
| ], | |
| "source": [ | |
| "rg?" | |
| ] | |
| }, | |
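| { | |
| "cell_type": "markdown", | |
| "id": "0f2e4d6a", | |
| "metadata": { | |
| "hidden": true | |
| }, | |
| "source": [ | |
| "Going by that docstring, the fastcore wrapper takes the same argument string as the CLI, so something like this should match the shell call below (an untested assumption):\n", | |
| "\n", | |
| "```python\n", | |
| "rg(\"-n 'class|def ' diffusion_notebook.py\")\n", | |
| "```" | |
| ] | |
| }, | |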
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "493f5a3e", | |
| "metadata": { | |
| "hidden": true, | |
| "time_run": "3:15:38p" | |
| }, | |
| "outputs": [ | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| "\u001b[0m\u001b[32m54\u001b[0m:\u001b[0m\u001b[1m\u001b[31mdef \u001b[0mnoisify_simple_diffusion(x):\r\n", | |
| "\u001b[0m\u001b[32m63\u001b[0m:\u001b[0m\u001b[1m\u001b[31mdef \u001b[0mcollate_simple_diffusion(b):\r\n", | |
| "\u001b[0m\u001b[32m68\u001b[0m:\u001b[0m\u001b[1m\u001b[31mdef \u001b[0mdl_simple_diffusion(ds): return DataLoader(ds, batch_size=bs, collate_fn=collate_simple_diffusion, num_workers=4)\r\n", | |
| "\u001b[0m\u001b[32m75\u001b[0m:\u001b[0m\u001b[1m\u001b[31mdef \u001b[0mtransformi(b): b[xl] = [F.pad(TF.to_tensor(o), (2,2,2,2))-0.5 for o in b[xl]]\r\n", | |
| "\u001b[0m\u001b[32m89\u001b[0m:\u001b[0m\u001b[1m\u001b[31mdef \u001b[0mlin(ni, nf, act=nn.SiLU, norm=None, bias=True):\r\n", | |
| "\u001b[0m\u001b[32m96\u001b[0m:\u001b[0m\u001b[1m\u001b[31mdef \u001b[0msaved(m, blk):\r\n", | |
| "\u001b[0m\u001b[32m100\u001b[0m: \u001b[0m\u001b[1m\u001b[31mdef \u001b[0m_f(*args, **kwargs):\r\n", | |
| "\u001b[0m\u001b[32m108\u001b[0m:\u001b[0m\u001b[1m\u001b[31mdef \u001b[0mpre_conv(ni, nf, ks=3, stride=1, act=nn.SiLU, norm=None, bias=True):\r\n", | |
| "\u001b[0m\u001b[32m115\u001b[0m:\u001b[0m\u001b[1m\u001b[31mdef \u001b[0mupsample(nf): return nn.Sequential(nn.Upsample(scale_factor=2.), nn.Conv2d(nf, nf, 3, padding=1))\r\n", | |
| "\u001b[0m\u001b[32m117\u001b[0m:\u001b[0m\u001b[1m\u001b[31mdef \u001b[0mtimestep_embedding(tsteps, emb_dim, max_period= 10000):\r\n", | |
| "\u001b[0m\u001b[32m123\u001b[0m:\u001b[0m\u001b[1m\u001b[31mclass\u001b[0m DownBlock(nn.Module):\r\n", | |
| "\u001b[0m\u001b[32m124\u001b[0m: \u001b[0m\u001b[1m\u001b[31mdef \u001b[0m__init__(self, n_emb, ni, nf, add_down=True, num_layers=1, attn_chans=0):\r\n", | |
| "\u001b[0m\u001b[32m130\u001b[0m: \u001b[0m\u001b[1m\u001b[31mdef \u001b[0mforward(self, x, t):\r\n", | |
| "\u001b[0m\u001b[32m136\u001b[0m:\u001b[0m\u001b[1m\u001b[31mclass\u001b[0m UpBlock(nn.Module):\r\n", | |
| "\u001b[0m\u001b[32m137\u001b[0m: \u001b[0m\u001b[1m\u001b[31mdef \u001b[0m__init__(self, n_emb, ni, prev_nf, nf, add_up=True, num_layers=2, attn_chans=0):\r\n", | |
| "\u001b[0m\u001b[32m144\u001b[0m: \u001b[0m\u001b[1m\u001b[31mdef \u001b[0mforward(self, x, t, ups):\r\n", | |
| "\u001b[0m\u001b[32m148\u001b[0m:\u001b[0m\u001b[1m\u001b[31mclass\u001b[0m EmbResBlock(nn.Module):\r\n", | |
| "\u001b[0m\u001b[32m149\u001b[0m: \u001b[0m\u001b[1m\u001b[31mdef \u001b[0m__init__(self, n_emb, ni, nf=None, ks=3, act=nn.SiLU, norm=nn.BatchNorm2d, attn_chans=0):\r\n", | |
| "\u001b[0m\u001b[32m159\u001b[0m: \u001b[0m\u001b[1m\u001b[31mdef \u001b[0mforward(self, x, t):\r\n", | |
| "\u001b[0m\u001b[32m174\u001b[0m:\u001b[0m\u001b[1m\u001b[31mclass\u001b[0m CondUNetModel(nn.Module):\r\n", | |
| "\u001b[0m\u001b[32m175\u001b[0m: \u001b[0m\u001b[1m\u001b[31mdef \u001b[0m__init__( self, n_\u001b[0m\u001b[1m\u001b[31mclass\u001b[0mes, in_channels=3, out_channels=3, nfs=(224,448,672,896), num_layers=1):\r\n", | |
| "\u001b[0m\u001b[32m180\u001b[0m: self.cond_emb = nn.Embedding(n_\u001b[0m\u001b[1m\u001b[31mclass\u001b[0mes, n_emb)\r\n", | |
| "\u001b[0m\u001b[32m200\u001b[0m: \u001b[0m\u001b[1m\u001b[31mdef \u001b[0mforward(self, inp):\r\n", | |
| "\u001b[0m\u001b[32m247\u001b[0m:\u001b[0m\u001b[1m\u001b[31mdef \u001b[0msample(c, model, sz, steps):\r\n" | |
| ] | |
| } | |
| ], | |
| "source": [ | |
| "!rg 'class|def ' diffusion_notebook.py" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "7f059609", | |
| "metadata": { | |
| "hidden": true, | |
| "solveit_ai": true | |
| }, | |
| "source": [ | |
| "great! how to show some context along each match?\n", | |
| "\n", | |
| "##### 🤖Reply🤖<!-- SOLVEIT_SEPARATOR_7f3a9b2c -->\n", | |
| "\n", | |
| "You can use the `-A` and `-B` flags for context:\n", | |
| "\n", | |
| "```bash\n", | |
| "!rg -A 3 -B 1 'def sample' diffusion_notebook.py\n", | |
| "```\n", | |
| "\n", | |
| "- `-A 3` shows 3 lines after the match\n", | |
| "- `-B 1` shows 1 line before the match\n", | |
| "- `-C 2` shows 2 lines both before and after\n", | |
| "\n", | |
| "Try it and see what the `sample` function looks like!" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "58902b7f", | |
| "metadata": { | |
| "hidden": true, | |
| "time_run": "3:15:38p" | |
| }, | |
| "outputs": [ | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| "\u001b[0m\u001b[32m247\u001b[0m:\u001b[0m\u001b[1m\u001b[31mdef sample\u001b[0m(c, model, sz, steps):\r\n", | |
| "\u001b[0m\u001b[32m248\u001b[0m- ts = torch.linspace(0,1,steps)\r\n", | |
| "\u001b[0m\u001b[32m249\u001b[0m- p = torch.randn(sz).cuda()\r\n", | |
| "\u001b[0m\u001b[32m250\u001b[0m- dt = 1.0 / steps\r\n", | |
| "\u001b[0m\u001b[32m251\u001b[0m- c = p.new_full((sz[0],), c, dtype=torch.int32)\r\n", | |
| "\u001b[0m\u001b[32m252\u001b[0m- preds = []\r\n" | |
| ] | |
| } | |
| ], | |
| "source": [ | |
| "!rg -A 5 'def sample' diffusion_notebook.py\n" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "67bc21d2", | |
| "metadata": { | |
| "hidden": true, | |
| "time_run": "3:15:39p" | |
| }, | |
| "outputs": [ | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| "\u001b[0m\u001b[32m242\u001b[0m-\r\n", | |
| "\u001b[0m\u001b[32m243\u001b[0m-# In[12]:\r\n", | |
| "\u001b[0m\u001b[32m244\u001b[0m-\r\n", | |
| "\u001b[0m\u001b[32m245\u001b[0m-\r\n", | |
| "\u001b[0m\u001b[32m246\u001b[[email protected]_grad()\r\n", | |
| "\u001b[0m\u001b[32m247\u001b[0m:\u001b[0m\u001b[1m\u001b[31mdef sample\u001b[0m(c, model, sz, steps):\r\n", | |
| "\u001b[0m\u001b[32m248\u001b[0m- ts = torch.linspace(0,1,steps)\r\n", | |
| "\u001b[0m\u001b[32m249\u001b[0m- p = torch.randn(sz).cuda()\r\n", | |
| "\u001b[0m\u001b[32m250\u001b[0m- dt = 1.0 / steps\r\n", | |
| "\u001b[0m\u001b[32m251\u001b[0m- c = p.new_full((sz[0],), c, dtype=torch.int32)\r\n", | |
| "\u001b[0m\u001b[32m252\u001b[0m- preds = []\r\n" | |
| ] | |
| } | |
| ], | |
| "source": [ | |
| "!rg -C 5 'def sample' diffusion_notebook.py\n" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "cc5318d8", | |
| "metadata": { | |
| "hidden": true, | |
| "solveit_ai": true | |
| }, | |
| "source": [ | |
| "great! what can we do now?\n", | |
| "\n", | |
| "##### 🤖Reply🤖<!-- SOLVEIT_SEPARATOR_7f3a9b2c -->\n", | |
| "\n", | |
| "Now you have several options to explore the code further:\n", | |
| "\n", | |
| "1. **Find specific patterns** - search for the `noisify` function that was mentioned in the blog post\n", | |
| "2. **Use ast-grep** to find structural patterns like all class definitions or function calls\n", | |
| "3. **Use sed** to modify parts of the code \n", | |
| "4. **Extract specific sections** to understand the flow matching implementation\n", | |
| "\n", | |
| "What interests you most? Would you like to find the `noisify` function that creates the training data, or explore the model architecture?" | |
| ] | |
| }, | |
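| { | |
| "cell_type": "markdown", | |
| "id": "8a9b0c1d", | |
| "metadata": { | |
| "hidden": true | |
| }, | |
| "source": [ | |
| "Since `ast-grep` doesn't get a turn below, here is what a structural search could look like: a sketch assuming `ast-grep` is installed, where `$$$` matches any argument list.\n", | |
| "\n", | |
| "```bash\n", | |
| "!ast-grep run --pattern 'torch.randn($$$)' --lang python diffusion_notebook.py\n", | |
| "```" | |
| ] | |
| }, | |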
| { | |
| "cell_type": "markdown", | |
| "id": "bdbbf868", | |
| "metadata": { | |
| "hidden": true, | |
| "solveit_ai": true | |
| }, | |
| "source": [ | |
| "I want to understand the notebook better, mostly enough to try to run the code in Modal, what would you use?\n", | |
| "\n", | |
| "##### 🤖Reply🤖<!-- SOLVEIT_SEPARATOR_7f3a9b2c -->\n", | |
| "\n", | |
| "For understanding the notebook structure to run on Modal, I'd use:\n", | |
| "\n", | |
| "1. **Find key functions and dependencies**:\n", | |
| "```bash\n", | |
| "!rg -A 3 'import|from' diffusion_notebook.py\n", | |
| "```\n", | |
| "\n", | |
| "2. **Identify the main training loop**:\n", | |
| "```bash\n", | |
| "!rg -C 3 'train|fit|epoch' diffusion_notebook.py\n", | |
| "```\n", | |
| "\n", | |
| "3. **Look for GPU/CUDA usage**:\n", | |
| "```bash\n", | |
| "!rg 'cuda|gpu|device' diffusion_notebook.py\n", | |
| "```\n", | |
| "\n", | |
| "Which would you like to start with? The imports will show what packages Modal needs to install, while the training code will show what compute resources you'll need." | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "719ade17", | |
| "metadata": { | |
| "hidden": true, | |
| "time_run": "3:15:39p" | |
| }, | |
| "outputs": [ | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| "\u001b[0m\u001b[32m5\u001b[0m:# The code in this notebook was plucked straight \u001b[0m\u001b[1m\u001b[31mfrom\u001b[0m the conditional diffusion [notebook](https://github.com/fastai/course22p2/blob/master/nbs/28_diffusion-attn-cond.ipynb) \u001b[0m\u001b[1m\u001b[31mfrom\u001b[0m the Fastai Diffusion [course](https://course.fast.ai/Lessons/part2.html). The only significant changes were made to the `noisify` and `sample` methods.\r\n", | |
| "\u001b[0m\u001b[32m6\u001b[0m-\r\n", | |
| "\u001b[0m\u001b[32m7\u001b[0m-# In[ ]:\r\n", | |
| "\u001b[0m\u001b[32m8\u001b[0m-\r\n", | |
| "--\r\n", | |
| "\u001b[0m\u001b[32m10\u001b[0m:\u001b[0m\u001b[1m\u001b[31mimport\u001b[0m os\r\n", | |
| "\u001b[0m\u001b[32m11\u001b[0m:\u001b[0m\u001b[1m\u001b[31mfrom\u001b[0m miniai.\u001b[0m\u001b[1m\u001b[31mimport\u001b[0ms \u001b[0m\u001b[1m\u001b[31mimport\u001b[0m *\r\n", | |
| "\u001b[0m\u001b[32m12\u001b[0m-\r\n", | |
| "\u001b[0m\u001b[32m13\u001b[0m:\u001b[0m\u001b[1m\u001b[31mfrom\u001b[0m einops \u001b[0m\u001b[1m\u001b[31mimport\u001b[0m rearrange\r\n", | |
| "\u001b[0m\u001b[32m14\u001b[0m:\u001b[0m\u001b[1m\u001b[31mfrom\u001b[0m fastprogress \u001b[0m\u001b[1m\u001b[31mimport\u001b[0m progress_bar\r\n", | |
| "\u001b[0m\u001b[32m15\u001b[0m-\r\n", | |
| "\u001b[0m\u001b[32m16\u001b[0m-\r\n", | |
| "\u001b[0m\u001b[32m17\u001b[0m-# In[ ]:\r\n", | |
| "--\r\n", | |
| "\u001b[0m\u001b[32m25\u001b[0m:\u001b[0m\u001b[1m\u001b[31mimport\u001b[0m logging\r\n", | |
| "\u001b[0m\u001b[32m26\u001b[0m-logging.disable(logging.WARNING)\r\n", | |
| "\u001b[0m\u001b[32m27\u001b[0m-\r\n", | |
| "\u001b[0m\u001b[32m28\u001b[0m-set_seed(42)\r\n" | |
| ] | |
| } | |
| ], | |
| "source": [ | |
| "!rg -A 3 'import|from' diffusion_notebook.py" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "e4046dbc", | |
| "metadata": { | |
| "hidden": true, | |
| "time_run": "3:15:39p" | |
| }, | |
| "outputs": [ | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| "\u001b[0m\u001b[32m29\u001b[0m-if fc.defaults.cpus>8: fc.defaults.cpus=8\r\n", | |
| "\u001b[0m\u001b[32m30\u001b[0m-\r\n", | |
| "\u001b[0m\u001b[32m31\u001b[0m-\r\n", | |
| "\u001b[0m\u001b[32m32\u001b[0m:# ### Create \u001b[0m\u001b[1m\u001b[31mtrain\u001b[0ming dataset\r\n", | |
| "\u001b[0m\u001b[32m33\u001b[0m-\r\n", | |
| "\u001b[0m\u001b[32m34\u001b[0m-# In[ ]:\r\n", | |
| "\u001b[0m\u001b[32m35\u001b[0m-\r\n", | |
| "--\r\n", | |
| "\u001b[0m\u001b[32m38\u001b[0m-name = \"fashion_mnist\"\r\n", | |
| "\u001b[0m\u001b[32m39\u001b[0m-dsd = load_dataset(name)\r\n", | |
| "\u001b[0m\u001b[32m40\u001b[0m-\r\n", | |
| "\u001b[0m\u001b[32m41\u001b[0m:labels = dsd[\"\u001b[0m\u001b[1m\u001b[31mtrain\u001b[0m\"].features[\"label\"].names\r\n", | |
| "\u001b[0m\u001b[32m42\u001b[0m-n_labels = len(labels)\r\n", | |
| "\u001b[0m\u001b[32m43\u001b[0m-\r\n", | |
| "\u001b[0m\u001b[32m44\u001b[0m-\r\n", | |
| "--\r\n", | |
| "\u001b[0m\u001b[32m75\u001b[0m-def transformi(b): b[xl] = [F.pad(TF.to_tensor(o), (2,2,2,2))-0.5 for o in b[xl]]\r\n", | |
| "\u001b[0m\u001b[32m76\u001b[0m-\r\n", | |
| "\u001b[0m\u001b[32m77\u001b[0m-tds = dsd.with_transform(transformi)\r\n", | |
| "\u001b[0m\u001b[32m78\u001b[0m:dls = DataLoaders(dl_simple_diffusion(tds[\"\u001b[0m\u001b[1m\u001b[31mtrain\u001b[0m\"]), dl_simple_diffusion(tds[\"test\"]))\r\n", | |
| "\u001b[0m\u001b[32m79\u001b[0m-\r\n", | |
| "\u001b[0m\u001b[32m80\u001b[0m:dl = dls.\u001b[0m\u001b[1m\u001b[31mtrain\u001b[0m\r\n", | |
| "\u001b[0m\u001b[32m81\u001b[0m-(xt,t,c),eps = b = next(iter(dl))\r\n", | |
| "\u001b[0m\u001b[32m82\u001b[0m-\r\n", | |
| "\u001b[0m\u001b[32m83\u001b[0m-\r\n", | |
| "--\r\n", | |
| "\u001b[0m\u001b[32m217\u001b[0m-\r\n", | |
| "\u001b[0m\u001b[32m218\u001b[0m-\r\n", | |
| "\u001b[0m\u001b[32m219\u001b[0m-lr = 5e-3\r\n", | |
| "\u001b[0m\u001b[32m220\u001b[0m:\u001b[0m\u001b[1m\u001b[31mepoch\u001b[0ms = 5\r\n", | |
| "\u001b[0m\u001b[32m221\u001b[0m-opt_func = partial(optim.Adam, eps=1e-5)\r\n", | |
| "\u001b[0m\u001b[32m222\u001b[0m:tmax = \u001b[0m\u001b[1m\u001b[31mepoch\u001b[0ms * len(dls.\u001b[0m\u001b[1m\u001b[31mtrain\u001b[0m)\r\n", | |
| "\u001b[0m\u001b[32m223\u001b[0m-sched = partial(lr_scheduler.OneCycleLR, max_lr=lr, total_steps=tmax)\r\n", | |
| "\u001b[0m\u001b[32m224\u001b[0m-cbs = [DeviceCB(), MetricsCB(), ProgressCB(plot=True), BatchSchedCB(sched), MixedPrecision()]\r\n", | |
| "\u001b[0m\u001b[32m225\u001b[0m-model = CondUNetModel(n_labels, in_channels=1, out_channels=1, nfs=(32,64,128,256), num_layers=2)\r\n", | |
| "--\r\n", | |
| "\u001b[0m\u001b[32m229\u001b[0m-# In[10]:\r\n", | |
| "\u001b[0m\u001b[32m230\u001b[0m-\r\n", | |
| "\u001b[0m\u001b[32m231\u001b[0m-\r\n", | |
| "\u001b[0m\u001b[32m232\u001b[0m:learn.\u001b[0m\u001b[1m\u001b[31mfit\u001b[0m(\u001b[0m\u001b[1m\u001b[31mepoch\u001b[0ms)\r\n", | |
| "\u001b[0m\u001b[32m233\u001b[0m-\r\n", | |
| "\u001b[0m\u001b[32m234\u001b[0m-\r\n", | |
| "\u001b[0m\u001b[32m235\u001b[0m-# ### Generate samples\r\n", | |
| "--\r\n", | |
| "\u001b[0m\u001b[32m261\u001b[0m-# In[13]:\r\n", | |
| "\u001b[0m\u001b[32m262\u001b[0m-\r\n", | |
| "\u001b[0m\u001b[32m263\u001b[0m-\r\n", | |
| "\u001b[0m\u001b[32m264\u001b[0m:lbls = dsd[\"\u001b[0m\u001b[1m\u001b[31mtrain\u001b[0m\"].features[yl].names\r\n", | |
| "\u001b[0m\u001b[32m265\u001b[0m-lbls\r\n", | |
| "\u001b[0m\u001b[32m266\u001b[0m-\r\n", | |
| "\u001b[0m\u001b[32m267\u001b[0m-\r\n" | |
| ] | |
| } | |
| ], | |
| "source": [ | |
| "!rg -C 3 'train|fit|epoch' diffusion_notebook.py" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "16977da4", | |
| "metadata": { | |
| "hidden": true, | |
| "time_run": "3:15:39p" | |
| }, | |
| "outputs": [ | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| "\u001b[0m\u001b[32m55\u001b[0m: \u001b[0m\u001b[1m\u001b[31mdevice\u001b[0m = x.\u001b[0m\u001b[1m\u001b[31mdevice\u001b[0m\r\n", | |
| "\u001b[0m\u001b[32m59\u001b[0m: z = torch.randn(x.shape, \u001b[0m\u001b[1m\u001b[31mdevice\u001b[0m=\u001b[0m\u001b[1m\u001b[31mdevice\u001b[0m)\r\n", | |
| "\u001b[0m\u001b[32m60\u001b[0m: xt = ((1-t_t)*z + t_t*x).to(\u001b[0m\u001b[1m\u001b[31mdevice\u001b[0m)\r\n", | |
| "\u001b[0m\u001b[32m118\u001b[0m: exponent = -math.log(max_period) * torch.linspace(0, 1, emb_dim//2, \u001b[0m\u001b[1m\u001b[31mdevice\u001b[0m=tsteps.\u001b[0m\u001b[1m\u001b[31mdevice\u001b[0m)\r\n", | |
| "\u001b[0m\u001b[32m249\u001b[0m: p = torch.randn(sz).\u001b[0m\u001b[1m\u001b[31mcuda\u001b[0m()\r\n", | |
| "\u001b[0m\u001b[32m254\u001b[0m: t = t[None].\u001b[0m\u001b[1m\u001b[31mcuda\u001b[0m()\r\n" | |
| ] | |
| } | |
| ], | |
| "source": [ | |
| "!rg 'cuda|gpu|device' diffusion_notebook.py" | |
| ] | |
| }, | |
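| { | |
| "cell_type": "markdown", | |
| "id": "4e5f6a7b", | |
| "metadata": { | |
| "hidden": true | |
| }, | |
| "source": [ | |
| "So the only hard-coded `.cuda()` calls are the two in `sample`; everything else is already device-parametrised. For a CPU dry run, those two lines could take a `device` argument instead, e.g. (sketch):\n", | |
| "\n", | |
| "```python\n", | |
| "p = torch.randn(sz, device=device)\n", | |
| "t = t[None].to(device)\n", | |
| "```" | |
| ] | |
| }, | |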
| { | |
| "cell_type": "markdown", | |
| "id": "387a54f5", | |
| "metadata": { | |
| "hidden": true, | |
| "solveit_ai": true | |
| }, | |
| "source": [ | |
| "it seems we need `miniai`, `einops`, and `fastprogress` and then the fashion mnist dataset we already have right? anything else? \n", | |
| "\n", | |
| "##### 🤖Reply🤖<!-- SOLVEIT_SEPARATOR_7f3a9b2c -->\n", | |
| "\n", | |
| "Looking at the imports, you'll also need:\n", | |
| "\n", | |
| "- `datasets` (for `load_dataset`)\n", | |
| "- Standard PyTorch packages (`torch`, `torchvision`)\n", | |
| "- The `miniai` framework from fastai's course22p2\n", | |
| "\n", | |
| "The main challenge is that `miniai` isn't a standard pip package - it's part of the fastai course repository. You'd need to either:\n", | |
| "\n", | |
| "1. Clone the course22p2 repo and install it\n", | |
| "2. Extract just the miniai components you need\n", | |
| "\n", | |
| "Want to check what specific miniai components are being used?" | |
| ] | |
| }, | |
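| { | |
| "cell_type": "markdown", | |
| "id": "1c2d3e4f", | |
| "metadata": { | |
| "hidden": true | |
| }, | |
| "source": [ | |
| "Since the script star-imports everything via `from miniai.imports import *`, the whole package is needed rather than a few files. A quick way to double-check for other miniai references:\n", | |
| "\n", | |
| "```bash\n", | |
| "!rg 'miniai' diffusion_notebook.py\n", | |
| "```" | |
| ] | |
| }, | |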
| { | |
| "cell_type": "markdown", | |
| "id": "f30ff37b", | |
| "metadata": { | |
| "hidden": true, | |
| "solveit_ai": true | |
| }, | |
| "source": [ | |
| "does `ast-grep` allow us to select sections in a readme? I would like to select the \"## Setup\" section in this readme: https://raw.githubusercontent.com/comhar/flow-matching/refs/heads/main/README.md\n", | |
| "\n", | |
| "##### 🤖Reply🤖<!-- SOLVEIT_SEPARATOR_7f3a9b2c -->\n", | |
| "\n", | |
| "<details class='tool-usage-details'>\n", | |
| "\n", | |
| " `read_url({\"url\": \"https://raw.githubusercontent.com/comhar/flow-matching/refs/heads/main/README.md#setup\"})`\n", | |
| " - `''`\n", | |
| "\n", | |
| "</details>\n", | |
| "\n", | |
| "`ast-grep` is designed for code structure, not markdown. For markdown sections, you can use `read_url` with an anchor (like I just tried), or use `rg` with patterns.\n", | |
| "\n", | |
| "Let me get the full README and then show you how to extract the Setup section:\n", | |
| "<details class='tool-usage-details'>\n", | |
| "\n", | |
| " `read_url({\"url\": \"https://raw.githubusercontent.com/comhar/flow-matching/refs/heads/main/README.md\"})`\n", | |
| " - `'# Flow Matching This repo contains instructions for training the flow matching model described in this blog [post](https://tommyc.xyz/posts/flow-matching). ## Setup commandline pip install virtualenv virtualenv env -p python3 source env/bin/activate git clone https://github.com/fastai/course22p2.git cd course22p2 ##### update settings.ini Change pip_requirements on line 38 in settings.ini to pip_requirements = torch>=1.7 torcheval diffusers einops timm accelerate ipykernel ##### install packages commandline pip install -e . python -m ipykernel install --user --name=env --display-name "flow_matching" ## Training Open the flow_matching.ipynb notebook, select the flow_matching environment and run all cells. Training the flow matching model on Fashion MNIST for 5 epochs takes ~3 minutes on an A100 with 40GB of VRAM. Here are some example images generated by the model after 5 epochs of training. []() []()'`\n", | |
| "\n", | |
| "</details>\n", | |
| "\n", | |
| "Perfect! You can use `rg` to extract markdown sections. Try:\n", | |
| "\n", | |
| "```bash\n", | |
| "!rg -A 20 '## Setup' README.md\n", | |
| "```\n", | |
| "\n", | |
| "This will show the Setup section plus 20 lines after it. Would you like to try this approach?" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "ff3febe5", | |
| "metadata": { | |
| "hidden": true, | |
| "time_run": "3:15:40p" | |
| }, | |
| "outputs": [ | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| "--2025-09-27 15:15:39-- https://raw.githubusercontent.com/comhar/flow-matching/refs/heads/main/README.md\r\n", | |
| "Resolving raw.githubusercontent.com (raw.githubusercontent.com)... " | |
| ] | |
| }, | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| "185.199.108.133, 185.199.109.133, 185.199.110.133, ...\r\n", | |
| "Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|185.199.108.133|:443... connected.\r\n", | |
| "HTTP request sent, awaiting response... " | |
| ] | |
| }, | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| "200 OK\r\n", | |
| "Length: 1039 (1.0K) [text/plain]\r\n", | |
| "Saving to: ‘README.md.1’\r\n", | |
| "\r\n", | |
| "\r", | |
| "README.md.1 0%[ ] 0 --.-KB/s \r", | |
| "README.md.1 100%[===================>] 1.01K --.-KB/s in 0s \r\n", | |
| "\r\n", | |
| "2025-09-27 15:15:40 (37.1 MB/s) - ‘README.md.1’ saved [1039/1039]\r\n", | |
| "\r\n" | |
| ] | |
| } | |
| ], | |
| "source": [ | |
| "!wget 'https://raw.githubusercontent.com/comhar/flow-matching/refs/heads/main/README.md'" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "da177809", | |
| "metadata": { | |
| "hidden": true, | |
| "time_run": "3:15:40p" | |
| }, | |
| "outputs": [ | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| "\u001b[0m\u001b[32m4\u001b[0m:\u001b[0m\u001b[1m\u001b[31m## \u001b[0mSetup\r\n", | |
| "\u001b[0m\u001b[32m5\u001b[0m-```commandline\r\n", | |
| "\u001b[0m\u001b[32m6\u001b[0m-pip install virtualenv\r\n", | |
| "\u001b[0m\u001b[32m7\u001b[0m-virtualenv env -p python3\r\n", | |
| "\u001b[0m\u001b[32m8\u001b[0m-source env/bin/activate\r\n", | |
| "\u001b[0m\u001b[32m9\u001b[0m-git clone https://github.com/fastai/course22p2.git\r\n", | |
| "\u001b[0m\u001b[32m10\u001b[0m-cd course22p2\r\n", | |
| "\u001b[0m\u001b[32m11\u001b[0m-```\r\n", | |
| "\u001b[0m\u001b[32m12\u001b[0m-\r\n", | |
| "\u001b[0m\u001b[32m13\u001b[0m:###\u001b[0m\u001b[1m\u001b[31m## \u001b[0mupdate `settings.ini`\r\n", | |
| "\u001b[0m\u001b[32m14\u001b[0m-Change `pip_requirements` on line 38 in `settings.ini` to `pip_requirements = torch>=1.7 torcheval diffusers einops timm accelerate ipykernel`\r\n", | |
| "\u001b[0m\u001b[32m15\u001b[0m-\r\n", | |
| "\u001b[0m\u001b[32m16\u001b[0m:###\u001b[0m\u001b[1m\u001b[31m## \u001b[0minstall packages\r\n", | |
| "\u001b[0m\u001b[32m17\u001b[0m-```commandline\r\n", | |
| "\u001b[0m\u001b[32m18\u001b[0m-pip install -e .\r\n", | |
| "\u001b[0m\u001b[32m19\u001b[0m-python -m ipykernel install --user --name=env --display-name \"flow_matching\"\r\n", | |
| "\u001b[0m\u001b[32m20\u001b[0m-```\r\n", | |
| "\u001b[0m\u001b[32m21\u001b[0m-\r\n", | |
| "\u001b[0m\u001b[32m22\u001b[0m:\u001b[0m\u001b[1m\u001b[31m## \u001b[0mTraining\r\n", | |
| "\u001b[0m\u001b[32m23\u001b[0m-Open the `flow_matching.ipynb` notebook, select the `flow_matching` environment and run all cells. \r\n", | |
| "\u001b[0m\u001b[32m24\u001b[0m-\r\n", | |
| "\u001b[0m\u001b[32m25\u001b[0m-Training the flow matching model on Fashion MNIST for 5 epochs takes ~3 minutes on an A100 with 40GB of VRAM.\r\n", | |
| "\u001b[0m\u001b[32m26\u001b[0m-\r\n", | |
| "\u001b[0m\u001b[32m27\u001b[0m-Here are some example images generated by the model after 5 epochs of training.\r\n", | |
| "\u001b[0m\u001b[32m28\u001b[0m-\r\n", | |
| "\u001b[0m\u001b[32m29\u001b[0m-[<img src=\"./assets/tshirt.png\" width=\"256\" />]()\r\n", | |
| "\u001b[0m\u001b[32m30\u001b[0m-[<img src=\"./assets/sneaker.png\" width=\"256\" />]()\r\n" | |
| ] | |
| } | |
| ], | |
| "source": [ | |
| "!rg -A 10 '## ' README.md" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "099ab2a6", | |
| "metadata": { | |
| "hidden": true, | |
| "solveit_ai": true | |
| }, | |
| "source": [ | |
| "how to use sed to select only the setup part which goes from line 4 to 22?\n", | |
| "\n", | |
| "##### 🤖Reply🤖<!-- SOLVEIT_SEPARATOR_7f3a9b2c -->\n", | |
| "\n", | |
| "You can use `sed` to extract specific line ranges:\n", | |
| "\n", | |
| "```bash\n", | |
| "!sed -n '4,22p' README.md\n", | |
| "```\n", | |
| "\n", | |
| "The `-n` suppresses default output, and `4,22p` prints lines 4 through 22.\n", | |
| "\n", | |
| "Try it and see if it gives you just the Setup section!" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "98adb357", | |
| "metadata": { | |
| "hidden": true, | |
| "time_run": "3:15:40p" | |
| }, | |
| "outputs": [ | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| "## Setup\r\n", | |
| "```commandline\r\n", | |
| "pip install virtualenv\r\n", | |
| "virtualenv env -p python3\r\n", | |
| "source env/bin/activate\r\n", | |
| "git clone https://github.com/fastai/course22p2.git\r\n", | |
| "cd course22p2\r\n", | |
| "```\r\n", | |
| "\r\n", | |
| "##### update `settings.ini`\r\n", | |
| "Change `pip_requirements` on line 38 in `settings.ini` to `pip_requirements = torch>=1.7 torcheval diffusers einops timm accelerate ipykernel`\r\n", | |
| "\r\n", | |
| "##### install packages\r\n", | |
| "```commandline\r\n", | |
| "pip install -e .\r\n", | |
| "python -m ipykernel install --user --name=env --display-name \"flow_matching\"\r\n", | |
| "```\r\n", | |
| "\r\n", | |
| "## Training\r\n" | |
| ] | |
| } | |
| ], | |
| "source": [ | |
| "!sed -n '4,22p' README.md" | |
| ] | |
| }, | |
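| { | |
| "cell_type": "markdown", | |
| "id": "5d6e7f8a", | |
| "metadata": { | |
| "hidden": true | |
| }, | |
| "source": [ | |
| "Line numbers are brittle if the README ever changes; `sed` also accepts regex addresses, printing from the first match through the line matching the second:\n", | |
| "\n", | |
| "```bash\n", | |
| "!sed -n '/^## Setup/,/^## Training/p' README.md\n", | |
| "```" | |
| ] | |
| }, | |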
| { | |
| "cell_type": "markdown", | |
| "id": "ee3e8d24", | |
| "metadata": {}, | |
| "source": [ | |
| "# Modal" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "cd78b865", | |
| "metadata": { | |
| "solveit_ai": true | |
| }, | |
| "source": [ | |
| "this needs to run in a GPU environment so we can't run it locally. Do you think we can set that up using something like Modal?\n", | |
| "\n", | |
| "##### 🤖Reply🤖<!-- SOLVEIT_SEPARATOR_7f3a9b2c -->\n", | |
| "\n", | |
| "Yes, Modal looks perfect for this! Modal supports A100 GPUs and is designed specifically for serverless AI workloads where you only pay for what you use [^1] [^2] [^3].\n", | |
| "\n", | |
| "For the flow matching project, we'd need:\n", | |
| "- A100 GPU access (which Modal provides) [^4] \n", | |
| "- Pay-per-second billing instead of idle GPU costs [^5]\n", | |
| "- Python environment setup\n", | |
| "\n", | |
| "The setup would involve creating a Modal app that:\n", | |
| "1. Installs the fastai course22p2 dependencies \n", | |
| "2. Runs the flow matching notebook on an A100\n", | |
| "\n", | |
| "Would you like me to help you create a Modal script to run the flow matching training, or should we first look at the actual notebook to understand what dependencies we'll need?\n", | |
| "\n", | |
| "[^1]: https://modal.com/docs/guide/gpu\n", | |
| "\t\"Currently B200, H200, H100, A100, L4, T4 and L40S instances support up to 8 GPUs (up to 1,536 GB GPU RAM), and A10 instances support up to 4 GPUs (up ...\"\n", | |
| "\n", | |
| "[^2]: https://modal.com/\n", | |
| "\t\"Bring your own code, and run CPU, GPU, and data-intensive compute at scale. \"\n", | |
| "\n", | |
| "[^3]: https://modal.com/\n", | |
| "\t\"Stop paying for idle GPUs when you’re done. \"\n", | |
| "\n", | |
| "[^4]: https://modal.com/docs/guide/gpu\n", | |
| "\t\"Currently B200, H200, H100, A100, L4, T4 and L40S instances support up to 8 GPUs (up to 1,536 GB GPU RAM), and A10 instances support up to 4 GPUs (up ...\"\n", | |
| "\n", | |
| "[^5]: https://modal.com/\n", | |
| "\t\"... Pay only for resources consumed, by the second, as you spin up containers. \"" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "97b01b79", | |
| "metadata": { | |
| "time_run": "3:44:26p" | |
| }, | |
| "outputs": [], | |
| "source": [ | |
| "!pip install -Uqq modal\n", | |
| "#!python3 -m modal setup <-- Running this in the terminal otherwise it's hard to copy the url link to login" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "5fe59d0d", | |
| "metadata": { | |
| "solveit_ai": true | |
| }, | |
| "source": [ | |
| "great! I've already set up modal, what next?\n", | |
| "\n", | |
| "##### 🤖Reply🤖<!-- SOLVEIT_SEPARATOR_7f3a9b2c -->\n", | |
| "\n", | |
| "Perfect! Now let's create a Modal app. We need to decide on the approach:\n", | |
| "\n", | |
| "1. **Convert the notebook to a Python script** and run it directly\n", | |
| "2. **Extract key components** and create a cleaner Modal function\n", | |
| "3. **Run the notebook as-is** using Modal's notebook support\n", | |
| "\n", | |
| "Which approach interests you most? The first option would be most straightforward since we already converted the notebook to Python." | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "5ae35f55", | |
| "metadata": { | |
| "solveit_ai": true | |
| }, | |
| "source": [ | |
| "then let's go with option 1\n", | |
| "\n", | |
| "##### 🤖Reply🤖<!-- SOLVEIT_SEPARATOR_7f3a9b2c -->\n", | |
| "\n", | |
| "Great choice! Let's start by creating a basic Modal app structure. \n", | |
| "\n", | |
| "First, let's identify what we need to handle:\n", | |
| "1. The dependencies (miniai, einops, fastprogress, etc.)\n", | |
| "2. The dataset loading\n", | |
| "3. The GPU requirements\n", | |
| "\n", | |
| "Would you like me to write the initial Modal app skeleton, or would you prefer to start writing it yourself and I can guide you through each part?" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "66df6ae0", | |
| "metadata": {}, | |
| "source": [ | |
| "## Getting Started" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "35e1949c", | |
| "metadata": { | |
| "time_run": "3:15:42p" | |
| }, | |
| "outputs": [], | |
| "source": [ | |
| "with open('modal-example.py', 'w') as f:\n", | |
| " f.write('''import modal\n", | |
| "\n", | |
| "app = modal.App(\"example-get-started\")\n", | |
| "\n", | |
| "\n", | |
| "@app.function()\n", | |
| "def square(x):\n", | |
| " print(\"This code is running on a remote worker!\")\n", | |
| " return x**2\n", | |
| "\n", | |
| "\n", | |
| "@app.local_entrypoint()\n", | |
| "def main():\n", | |
| " print(\"the square is\", square.remote(42))\n", | |
| "''')" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "62eec8c9", | |
| "metadata": { | |
| "time_run": "3:15:43p" | |
| }, | |
| "outputs": [ | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| "import modal\r\n", | |
| "\r\n", | |
| "app = modal.App(\"example-get-started\")\r\n", | |
| "\r\n", | |
| "\r\n" | |
| ] | |
| } | |
| ], | |
| "source": [ | |
| "!head -n 5 modal-example.py" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "370d0006", | |
| "metadata": { | |
| "time_run": "3:15:47p" | |
| }, | |
| "outputs": [ | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| "\u001b[?25l\u001b[34m⠋\u001b[0m Initializing...\r", | |
| "\u001b[2K\u001b[32m✓\u001b[0m Initialized. \u001b[38;5;249mView run at \u001b[0m\r\n", | |
| "\u001b[4;38;5;249mhttps://modal.com/apps/kafkasl/main/ap-uKG4zbfLqse5tbm7i69TeD\u001b[0m\r\n", | |
| "\u001b[34m⠋\u001b[0m Initializing...\r", | |
| "\u001b[2K\u001b[34m⠋\u001b[0m Initializing...\r\n", | |
| "\u001b[?25h\r", | |
| "\u001b[1A\u001b[2K\u001b[?25l\u001b[34m⠋\u001b[0m Creating objects..." | |
| ] | |
| }, | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| "\r", | |
| "\u001b[2K\u001b[34m⠸\u001b[0m Creating objects...\r\n", | |
| "\u001b[38;5;244m├── \u001b[0m🔨 Created mount /app/data/computer-vision/modal-example.py\r\n", | |
| "\u001b[38;5;244m└── \u001b[0m\u001b[34m⠋\u001b[0m Creating function square..." | |
| ] | |
| }, | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| "\r", | |
| "\u001b[2K\u001b[1A\u001b[2K\u001b[1A\u001b[2K\u001b[34m⠼\u001b[0m Creating objects...\r\n", | |
| "\u001b[38;5;244m├── \u001b[0m🔨 Created mount /app/data/computer-vision/modal-example.py\r\n", | |
| "\u001b[38;5;244m└── \u001b[0m🔨 Created function square.\r\n", | |
| "\u001b[?25h\r", | |
| "\u001b[1A\u001b[2K\u001b[1A\u001b[2K\u001b[1A\u001b[2K\u001b[32m✓\u001b[0m Created objects.\r\n", | |
| "\u001b[38;5;244m├── \u001b[0m🔨 Created mount /app/data/computer-vision/modal-example.py\r\n", | |
| "\u001b[38;5;244m└── \u001b[0m🔨 Created function square.\r\n" | |
| ] | |
| }, | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| "\u001b[?25l\u001b[34m⠋\u001b[0m Running app..." | |
| ] | |
| }, | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| "\r", | |
| "\u001b[2K\u001b[34m⠸\u001b[0m Running app..." | |
| ] | |
| }, | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| "\r", | |
| "\u001b[2K\u001b[34m⠦\u001b[0m \u001b[34mRunning...\u001b[0m \u001b[38;5;249mView app at \u001b[0m\r\n", | |
| "\u001b[4;38;5;249mhttps://modal.com/apps/kafkasl/main/ap-uKG4zbfLqse5tbm7i69TeD\u001b[0m" | |
| ] | |
| }, | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| "\r", | |
| "\u001b[2K\u001b[1A\u001b[2K\u001b[34m⠏\u001b[0m \u001b[34mLoading images (1 containers initializing)...\u001b[0m \u001b[38;5;249mView app at \u001b[0m\r\n", | |
| "\u001b[4;38;5;249mhttps://modal.com/apps/kafkasl/main/ap-uKG4zbfLqse5tbm7i69TeD\u001b[0m" | |
| ] | |
| }, | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| "\r", | |
| "\u001b[2K\u001b[1A\u001b[2K\u001b[34m⠹\u001b[0m \u001b[34mLoading images (1 containers initializing)...\u001b[0m \u001b[38;5;249mView app at \u001b[0m\r\n", | |
| "\u001b[4;38;5;249mhttps://modal.com/apps/kafkasl/main/ap-uKG4zbfLqse5tbm7i69TeD\u001b[0m" | |
| ] | |
| }, | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| "\r", | |
| "\u001b[2K\u001b[1A\u001b[2K\u001b[34m⠴\u001b[0m \u001b[34mLoading images (1 containers initializing)...\u001b[0m \u001b[38;5;249mView app at \u001b[0m\r\n", | |
| "\u001b[4;38;5;249mhttps://modal.com/apps/kafkasl/main/ap-uKG4zbfLqse5tbm7i69TeD\u001b[0m" | |
| ] | |
| }, | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| "\r", | |
| "\u001b[2K\u001b[1A\u001b[2K\u001b[34m⠇\u001b[0m \u001b[34mLoading images (1 containers initializing)...\u001b[0m \u001b[38;5;249mView app at \u001b[0m\r\n", | |
| "\u001b[4;38;5;249mhttps://modal.com/apps/kafkasl/main/ap-uKG4zbfLqse5tbm7i69TeD\u001b[0m" | |
| ] | |
| }, | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| "\r", | |
| "\u001b[2K\u001b[1A\u001b[2K\u001b[34m⠹\u001b[0m \u001b[34mLoading images (1 containers initializing)...\u001b[0m \u001b[38;5;249mView app at \u001b[0m\r\n", | |
| "\u001b[4;38;5;249mhttps://modal.com/apps/kafkasl/main/ap-uKG4zbfLqse5tbm7i69TeD\u001b[0m" | |
| ] | |
| }, | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| "\r", | |
| "\u001b[2K\u001b[1A\u001b[2K\u001b[34mThis code is running on a remote worker!\r\n", | |
| "\u001b[0m\u001b[34m⠸\u001b[0m\u001b[34m \u001b[0m\u001b[34mRunning (1/1 containers active)...\u001b[0m\u001b[34m \u001b[0m\u001b[38;5;249mView app at \u001b[0m\u001b[4;38;5;249mhttps://modal.com/apps/kafkasl/\u001b[0m\r", | |
| "\u001b[2Kthe square is 1764\r\n", | |
| "\u001b[34m⠸\u001b[0m \u001b[34mRunning (1/1 containers active)...\u001b[0m \u001b[38;5;249mView app at \u001b[0m\r\n", | |
| "\u001b[4;38;5;249mhttps://modal.com/apps/kafkasl/main/ap-uKG4zbfLqse5tbm7i69TeD\u001b[0m\r", | |
| "\u001b[2K\u001b[1A\u001b[2K\u001b[34m⠸\u001b[0m \u001b[34mRunning (1/1 containers active)...\u001b[0m \u001b[38;5;249mView app at \u001b[0m\r\n", | |
| "\u001b[4;38;5;249mhttps://modal.com/apps/kafkasl/main/ap-uKG4zbfLqse5tbm7i69TeD\u001b[0m\r\n", | |
| "\u001b[?25h\r", | |
| "\u001b[1A\u001b[2K\u001b[1A\u001b[2K" | |
| ] | |
| }, | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| "\u001b[33mStopping app - local entrypoint completed.\r\n", | |
| "\u001b[0m" | |
| ] | |
| }, | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| "\u001b[32m✓\u001b[0m App completed. \u001b[38;5;249mView run at \u001b[0m\r\n", | |
| "\u001b[4;38;5;249mhttps://modal.com/apps/kafkasl/main/ap-uKG4zbfLqse5tbm7i69TeD\u001b[0m\r\n" | |
| ] | |
| } | |
| ], | |
| "source": [ | |
| "!python -m modal run modal-example.py" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "373d9b48", | |
| "metadata": {}, | |
| "outputs": [], | |
| "source": [ | |
| "!git clone https://github.com/fastai/course22p2.git" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "8b2a6c5f", | |
| "metadata": { | |
| "solveit_ai": true | |
| }, | |
| "source": [ | |
| "Our project is a bit more complex than a single method. First let's try to use `sed` to replace the `pip_requirements` in `settings.ini` as instructed above\n", | |
| "\n", | |
| "##### 🤖Reply🤖<!-- SOLVEIT_SEPARATOR_7f3a9b2c -->\n", | |
| "\n", | |
| "Yes! Let's use `sed` to update the `pip_requirements` line. First, let's see what line 38 currently contains:\n", | |
| "\n", | |
| "```bash\n", | |
| "!sed -n '38p' course22p2/settings.ini\n", | |
| "```\n", | |
| "\n", | |
| "Then we can replace it with the required packages. What do you see on line 38?" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "00e98ee9", | |
| "metadata": { | |
| "time_run": "3:26:50p" | |
| }, | |
| "outputs": [ | |
| { | |
| "name": "stdout", | |
| "output_type": "stream", | |
| "text": [ | |
| "pip_requirements = torch>=1.7,<2.1 torcheval diffusers einops timm\r\n" | |
| ] | |
| } | |
| ], | |
| "source": [ | |
| "!sed -n '38p' course22p2/settings.ini" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "5ab02c98", | |
| "metadata": { | |
| "solveit_ai": true | |
| }, | |
| "source": [ | |
| "what's the replacement syntax?\n", | |
| "\n", | |
| "##### 🤖Reply🤖<!-- SOLVEIT_SEPARATOR_7f3a9b2c -->\n", | |
| "\n", | |
| "The `sed` replacement syntax is:\n", | |
| "\n", | |
| "```bash\n", | |
| "sed 's/old_text/new_text/' filename\n", | |
| "```\n", | |
| "\n", | |
| "For in-place editing, add `-i`:\n", | |
| "\n", | |
| "```bash\n", | |
| "sed -i 's/old_text/new_text/' filename\n", | |
| "```\n", | |
| "\n", | |
| "In your case, you'd want to replace the current pip_requirements line with the new one. Would you like to try writing the command to update line 38?" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "abf06545", | |
| "metadata": { | |
| "time_run": "3:29:11p" | |
| }, | |
| "outputs": [], | |
| "source": [ | |
| "!sed -i 's/pip_requirements = torch>=1.7,<2.1 torcheval diffusers einops timm/pip_requirements = torch>=1.7 torcheval diffusers einops timm accelerate ipykernel/' course22p2/settings.ini" | |
| ] | |
| }, | |
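| { | |
| "cell_type": "markdown", | |
| "id": "9f8e7d6c", | |
| "metadata": {}, | |
| "source": [ | |
| "For reference, here is roughly what option 1 (running the converted script in a Modal GPU function) could look like. This is an untested sketch: the package list mirrors the README's `pip_requirements` plus the script's other imports, and the `add_local_file`/`gpu='A100'` usage is my recollection of Modal's API, so every call here should be verified against the docs.\n", | |
| "\n", | |
| "```python\n", | |
| "import modal\n", | |
| "\n", | |
| "image = (modal.Image.debian_slim()\n", | |
| "    .pip_install('torch', 'torcheval', 'diffusers', 'einops', 'timm',\n", | |
| "                 'accelerate', 'datasets', 'fastprogress', 'fastcore', 'matplotlib')\n", | |
| "    .run_commands('git clone https://github.com/fastai/course22p2.git /course22p2',\n", | |
| "                  'pip install -e /course22p2')\n", | |
| "    .add_local_file('diffusion_notebook.py', '/root/diffusion_notebook.py'))\n", | |
| "\n", | |
| "app = modal.App('flow-matching', image=image)\n", | |
| "\n", | |
| "@app.function(gpu='A100', timeout=1800)\n", | |
| "def train():\n", | |
| "    import runpy\n", | |
| "    runpy.run_path('/root/diffusion_notebook.py')  # run the converted script on the GPU worker\n", | |
| "\n", | |
| "@app.local_entrypoint()\n", | |
| "def main():\n", | |
| "    train.remote()\n", | |
| "```" | |
| ] | |
| }, | |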
| { | |
| "cell_type": "markdown", | |
| "id": "aa1d8c10", | |
| "metadata": { | |
| "time_run": "3:33:08p" | |
| }, | |
| "source": [ | |
| "## Running the notebook in Modal\n", | |
| "\n", | |
| "It seems that modal supports importing notebooks so I ended up doing this:\n", | |
| "1. Import [the notebook](https://github.com/comhar/flow-matching/blob/main/flow_matching.ipynb) there \n", | |
| "2. Add this as a cell to the top of the notebook to install the deps\n", | |
| "```bash\n", | |
| "!git clone https://github.com/fastai/course22p2.git\n", | |
| "!sed -i 's/pip_requirements = torch>=1.7,<2.1 torcheval diffusers einops timm/pip_requirements = torch>=1.7 torcheval diffusers einops timm accelerate ipykernel/' course22p2/settings.ini\n", | |
| "! cd course22p2 && pip install -e .\n", | |
| "```\n", | |
| "\n", | |
| "The notebook is available [here](https://modal.com/notebooks/kafkasl/_/nb-uAXToYtRafW7NeOHwoZQTM)" | |
| ] | |
| } | |
| ], | |
| "metadata": { | |
| "solveit_dialog_mode": "learning", | |
| "solveit_ver": 2 | |
| }, | |
| "nbformat": 4, | |
| "nbformat_minor": 5 | |
| } |