@mseri
mseri / textbooks.md
Created December 2, 2025 09:21 — forked from mikephys8/textbooks.md
Alex Stef's list of freely-available mathematics textbooks

Textbooks in Mathematics

A list of links to useful mathematical textbooks available for free on the Internet. They are all legal and maintained by their authors or by their legitimate publishers.

All the documents are in English and in a printable format, PostScript or Adobe Portable Document Format (PDF). You are free to download, read, and print them. Here are some links to other sites offering lists of free mathematical textbooks.

For any comments, please contact me: [email protected]

@mseri
mseri / grpo_demo.py
Created November 6, 2025 19:57 — forked from willccbb/grpo_demo.py
GRPO Llama-1B
# train_grpo.py
#
# See https://github.com/willccbb/verifiers for ongoing developments
#
"""
citation:
@misc{brown2025grpodemo,
title={Granular Format Rewards for Eliciting Mathematical Reasoning Capabilities in Small Language Models},
author={Brown, William},
@mseri
mseri / lll.sh
Last active December 3, 2025 17:44
llama-launcher
#!/bin/bash
set -euo pipefail
MODELS="aquif, gemma3, granite4, granite4-moe, lfm2, lfm2-moe, lfm2-vl, ministral3 (3,8B;i,r), nemotron, qwen3(i,r), voxtral"
function usage() {
echo "Usage: $0 <model> [options]"
echo "Models: $MODELS."
echo "Options:"
echo " --dry to see the invocation string only"
@mseri
mseri / llm-hackathon.md
Created July 8, 2025 15:31 — forked from chriscarrollsmith/llm-hackathon.md
Coders' Colaboratory mini-hackathon on `llm` by simonw

Let's hack on llm!

If you have uv installed (and you should!), you can install llm globally in a uv-managed tool environment with:

uv tool install llm

If you want to use models other than OpenAI models, you'll need some extensions:
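Extensions are installed through llm's own plugin installer. As one illustrative example (chosen here only as an illustration, not necessarily the plugin the hackathon notes list), llm-ollama adds locally running Ollama models:

llm install llm-ollama
llm models   # the Ollama models should now show up in the list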

@mseri
mseri / TextRecognizer.swift
Created June 18, 2025 22:03
Small Swift script to extract text from images (using Apple's Vision framework)
#!/usr/bin/swift
// Started from the code in https://terminalbytes.com/iphone-8-solar-powered-vision-ocr-server/
// Edited from Mistral generated code: https://chat.mistral.ai/chat/563cacdf-6def-49e4-9df6-ee8e263978c5
import AppKit
import CoreGraphics
import Foundation
import SwiftUI
import Vision
func processImageSync(imagePath: String) -> String? {
@mseri
mseri / code_completion_ide.py
Created November 2, 2024 16:07 — forked from iamaziz/code_completion_ide.py
simple offline code completion example with ollama/streamlit and code execution
import sys
from io import StringIO
import streamlit as st # pip install streamlit
from code_editor import code_editor # pip install streamlit_code_editor
import ollama as ol # pip install ollama
st.set_page_config(layout='wide')
st.title('`Offline code completion`')
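A hedged sketch (not the gist's exact code) of how the completion request to a local Ollama server could look, reusing the `ol` import from the snippet above; the model name "qwen2.5-coder" is an assumption, any locally pulled model works:

def complete_code(snippet: str) -> str:
    # Ask a locally running Ollama model to continue the editor contents.
    response = ol.chat(
        model="qwen2.5-coder",  # assumed model name; substitute your own
        messages=[{"role": "user", "content": f"Complete this code:\n{snippet}"}],
    )
    return response["message"]["content"]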
@mseri
mseri / l3min.py
Created November 2, 2024 16:06 — forked from awni/l3min.py
A minimal, fast implementation of Llama 3.1 in MLX.
"""
A minimal, fast example generating text with Llama 3.1 in MLX.
To run, install the requirements:
pip install -U mlx transformers fire
Then generate text with:
python l3min.py "How tall is K2?"
@mseri
mseri / chat-interface.html
Created October 23, 2024 14:13
Simple HTML chat interface, made with Claude, used to interact with LM Studio, Ollama, or any other OpenAI-compatible server (I am using it with Firefox's new AI panel)
<!doctype html>
<html lang="en">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>LM Studio Chat Interface</title>
<style>
body {
font-family: -apple-system, BlinkMacSystemFont, "Segoe UI",
Roboto, sans-serif;
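Under the hood such a page only needs to POST an OpenAI-compatible /v1/chat/completions request. A hedged Python sketch of that call (the URL is LM Studio's default local address, also used elsewhere in these gists; "local-model" is a placeholder name):

import requests

payload = {
    "model": "local-model",  # placeholder; LM Studio serves whichever model is loaded
    "messages": [{"role": "user", "content": "Hello!"}],
}
reply = requests.post("http://localhost:1234/v1/chat/completions", json=payload)
print(reply.json()["choices"][0]["message"]["content"])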
@mseri
mseri / lms_to_llm.py
Last active November 5, 2024 09:46
Get models from the LM Studio server and prepare them for llm. On macOS the output goes to `Library/Application Support/io.datasette.llm/extra-openai-models.yaml`
import requests
import subprocess
import yaml
def get_data_from_api():
    base_url = "http://localhost:1234/v1"
    response = requests.get(base_url + "/models")
    if response.status_code == 200:
        json_data = response.json()
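From there the script presumably maps each returned model id into llm's documented extra-openai-models.yaml format before writing the file with yaml. A hedged sketch of one such entry (field names follow llm's documentation; this is not necessarily the gist's exact mapping):

def to_llm_entry(model_id):
    # One YAML entry per model, in the shape llm reads from extra-openai-models.yaml.
    return {
        "model_id": model_id,                    # name llm exposes on the CLI
        "model_name": model_id,                  # name sent to the server
        "api_base": "http://localhost:1234/v1",  # LM Studio's local endpoint
    }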
package main

import (
	"encoding/json"
	"fmt"
	"io/ioutil"
	"os"
	"os/user"
	"path/filepath"
	"strings"