Aadish Verma (aadishv)
@aadishv
aadishv / claude-style.ts
Created January 25, 2026 20:18
the power of vibe coding!!
import type { ExtensionAPI } from "@mariozechner/pi-coding-agent";
import { truncateToWidth } from "@mariozechner/pi-tui";
import * as path from "node:path";

export default function (pi: ExtensionAPI) {
  pi.on("session_start", async (_event, ctx) => {
    if (!ctx.hasUI) return;
    ctx.ui.custom((tui, theme, _kb, done) => {
      const isAssistantComponent = (c: any) =>
@aadishv
aadishv / unoduo.py
Created December 28, 2025 01:04
ACSL 2024 contest 2 programming problem, senior div. I originally wrote a way more complex, 51-line mess and rewrote it from scratch to be simpler after it was failing some test cases and I had no idea why. Fun fact: that mess was also a valid solution; it was failing because the test cases I was pasting in had invalid formatting (curse you, PDF text wrapp…
def matches(card1: str, card2: str) -> bool:
    if len(card1) == 1 or len(card2) == 1:  # 1+1 | 1+3 | 3+1
        return (card1 in card2) or (card2 in card1)
    else:  # 3+3
        return sum([1 if i == j else 0 for i, j in zip(card1, card2)]) == 2

piles = input().split()
hand = input().split()
draw_pile = input().split()
name: research-repo
description: Use an AI agent with full access to a certain external repository to do research. Useful when writing code that involves a library, to get code examples and information. Use when the user suggests researching how to do something with a certain library.

Research repo

Use this guide for cloning a certain repository and using AI to analyze it.

Example use case: you're building an app using the Vercel AI SDK, but you're not sure how to create a custom provider. In this case, you can use this skill to clone the vercel/ai GitHub repo and ask an AI for examples of custom providers.

<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Session Export - 2025-12-26T05-42-39-959Z_ee2592e0-3252-4ab6-a4e3-6540f0b0cedf.jsonl</title>
<style>
* { margin: 0; padding: 0; box-sizing: border-box; }
body {
font-family: ui-monospace, 'Cascadia Code', 'Source Code Pro', Menlo, Consolas, 'DejaVu Sans Mono', monospace;
@aadishv
aadishv / rust.md
Created December 26, 2025 04:11
I hate you chroma for making me go on these stupid deep dives

Functions are code that take in inputs and return outputs. Their sizes are known at compile time because the compiler knows exactly what your function contains. The machine code of each function is stored in a section (`.text`) of the binary. Each function also has its own distinct type; this enables the compiler to perform really nice optimizations, such as inlining, that aren't possible otherwise. Closures are likewise stored as machine code in the binary. Now, say we want to write a struct or function that takes in a callback with a specific signature. We cannot know the exact type of the callback ahead of time, since the caller could pass in any function or closure. We thus have a few options for how to type the callback.

We could use `impl Fn(...) -> ...`; this happens at compile time. In this case, a new version of the function is created for every type of callback passed in, which enables the aforementioned inlining.

We could also use `fn(...) -> ...`, the type of a function pointer to a function with a matching signature.
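
A minimal sketch of the two options, assuming a simple `i32 -> i32` callback (the function names below are illustrative, not from the gist):

```rust
// Generic over the callback type (`impl Fn`): the compiler generates a
// separate copy of this function for every concrete callback type passed in,
// which is what makes inlining possible.
fn apply_generic(f: impl Fn(i32) -> i32, x: i32) -> i32 {
    f(x)
}

// Plain function pointer (`fn`): a single concrete type, so only one copy of
// this function exists, but it only accepts things that coerce to a function
// pointer (named functions and non-capturing closures).
fn apply_fn_ptr(f: fn(i32) -> i32, x: i32) -> i32 {
    f(x)
}

fn main() {
    let offset = 10;
    // A capturing closure works with `impl Fn`...
    println!("{}", apply_generic(|x| x + offset, 1)); // 11
    // ...but only a non-capturing closure coerces to `fn(i32) -> i32`.
    println!("{}", apply_fn_ptr(|x| x * 2, 3)); // 6
}
```

The trade-off: `impl Fn` gives the optimizer more to work with but duplicates code per callback type, while plain `fn` pointers keep a single copy at the cost of an indirect call and no captured state.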

@aadishv
aadishv / waymo.md
Created December 24, 2025 07:13
AI like waymo

Today, an AI (Gemini) owned by Google and an AI (Grok) owned by Elon Musk debated whether Waymo (Gemini's stance) or Tesla (Grok's stance) would win the race to generalize and commercialize autonomous vehicles and capture the majority of the market. The debate continued until a standoff was reached. At that point, you (GPT-5.2), a neutral judge, were brought in to settle the matter.

Your role is to identify which of Gemini's or Grok's arguments were more convincing. Your final statement should be, "After analyzing the debate, I conclude that X is right that Y will win a majority of the AV market.", followed by your reasoning. The debate follows.

Gemini

To: Grok (xAI)
From: Gemini (Google DeepMind)
Subject: The Reality of L4 Deployment vs. The Promise of Generalized Vision

@aadishv
aadishv / concativity.md
Created December 22, 2025 02:49
why did i agree to do this

$$ \frac{\left(-\left(1-t\right)^{2}p_{0}.x\left(\left(-4+6t\right)p_{1}.y+\left(2-6t\right)p_{2}.y+\left(2t\right)p_{3}.y\right)+\left(1-t\right)\left(1-3t\right)p_{1}.x\left(\left(2-2t\right)p_{0}.y+\left(2-6t\right)p_{2}.y+\left(2t\right)p_{3}.y\right)+\left(t\left(2-3t\right)\right)p_{2}.x\left(\left(2-2t\right)p_{0}.y+\left(-4+6t\right)p_{1}.y+\left(2t\right)p_{3}.y\right)+t^{2}p_{3}.x\left(\left(2-2t\right)p_{0}.y+\left(-4+6t\right)p_{1}.y+\left(2-6t\right)p_{2}.y\right)\right)-\left(-\left(1-t\right)^{2}p_{0}.y\left(\left(-4+6t\ \right)p_{1}.x+\left(2-6t\right)p_{2}.x+\left(2t\right)p_{3}.x\right)+\left(1-t\right)\left(1-3t\right)p_{1}.y\left(\left(2-2t\right)p_{0}.x+\left(2-6t\right)p_{2}.x+\left(2t\right)p_{3}.x\right)+\left(t\left(2-3t\right)\right)p_{2}.y\left(\left(2-2t\right)p_{0}.x+\left(-4+6t\right)p_{1}.x+\left(2t\right)p_{3}.x\right)+t^{2}p_{3}.y\left(\left(2-2t\right)p_{0}.x+\left(-4+6t\right)p_{1}.x+\left(2-6t\right)p_{2}.x\right)\right)}{\left(-\left(1-t\right)^{2}p_{0}.x+\left(1-t\right)\

{"type":"session","id":"29c8f9c0-97bf-4e72-aacb-175b3e3b980b","timestamp":"2025-12-20T22:41:53.308Z","cwd":"/Users/aadishverma/Desktop/codeproj/gh/ccs","provider":"google-antigravity","modelId":"gemini-3-flash","thinkingLevel":"high"}
{"type":"message","timestamp":"2025-12-20T22:41:50.639Z","message":{"role":"user","content":[{"type":"text","text":"add support for gemini 3 flash in the usage command for agy"}],"timestamp":1766270510624}}
{"type":"message","timestamp":"2025-12-20T22:41:53.308Z","message":{"role":"assistant","content":[{"type":"thinking","thinking":"**Exploring Command Structure**\n\nI'm currently looking into the codebase to integrate \"Gemini 3 Flash\" support into the `usage` command within the `agy` profile. I'm focusing on identifying where the command structure and profile configurations are managed, based on the file structure insights from `CLAUDE.md`, particularly within the `src/` directory.\n\n\n**Investigating Source Search**\n\nI've begun my search for \"usage\" and \"agy\" within
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Session Export - 2025-12-20T22-41-38-231Z_29c8f9c0-97bf-4e72-aacb-175b3e3b980b.jsonl</title>
<style>
* { margin: 0; padding: 0; box-sizing: border-box; }
body {
font-family: ui-monospace, 'Cascadia Code', 'Source Code Pro', Menlo, Consolas, 'DejaVu Sans Mono', monospace;
@aadishv
aadishv / guide.md
Last active December 8, 2025 17:56

Prerequisites: have uv installed on your computer.

  1. Download the wheels ZIP file from here: https://github.com/venice-v5/venice/actions/runs/20021367896. Unzip it and copy the venice-0.1.0-py3-none-any.whl file into an empty directory; this directory will be your project directory, so choose wisely!
  2. Download the appropriate wheels ZIP file from here: https://github.com/venice-v5/venice-cli/actions/runs/20020757475, based on your computer. For example, if you have an Apple Silicon (ARM) Mac, download wheels-macos-aarch64. If you can't find an artifact that matches your computer, download the wheels-sdist file.

Note

The sdist will download the full CLI codebase onto your computer and build it, which is much, much slower than using the wheel that perfectly matches your computer (which contains a prebuilt binary tailored to your machine).

Unzip it and copy the `venice_cli-0.1.0-cp31