UPDATE: Ubuntu 25.10 completely fixed this issue. This fix is NOT required for Ubuntu 25.10
```python
import os
import shutil
import json
import logging
import re
from abc import ABC, abstractmethod
from typing import BinaryIO, Tuple, Dict
from open_webui.config import (
    S3_ACCESS_KEY_ID,
```
(Seriously?)
The following instructions are for Chrome; users of other browsers can perform similar actions.
Visit https://heyzine.com and you should see this:

Key column. The original key should be something like `hz_cache_1201_p_10_normal_https://cdnc.heyzine.com/files/uploaded/v3/[REDACTED].pdf` (the part after the `hz_cache_1201_p_10_normal_` prefix, `https://cdnc.heyzine.com/files/uploaded/v3/[REDACTED].pdf`, is the PDF's direct URL).

```python
#!/usr/bin/python3
import os
import base64
import hashlib
from datetime import datetime, timezone
import requests
from http.client import RemoteDisconnected

# This function is literally imported from Gemini, I can't write comments *this* detailed and format *this* properly :)
def generate_soap_header_values(plain_password: str) -> dict:
```
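The body of that function isn't shown above, but the imports (`base64`, `hashlib`, `datetime`) suggest the standard WS-Security UsernameToken scheme. A hedged, generic reconstruction (not the script's actual body) of what such a function typically computes:

```python
# Generic sketch of WS-Security UsernameToken header values:
# PasswordDigest = Base64(SHA-1(nonce + created + password)).
# This is a reconstruction from the imports above, not the original code.
import base64
import hashlib
import os
from datetime import datetime, timezone

def generate_soap_header_values(plain_password: str) -> dict:
    nonce = os.urandom(16)  # random 16-byte nonce
    created = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
    digest = hashlib.sha1(nonce + created.encode() + plain_password.encode()).digest()
    return {
        "Nonce": base64.b64encode(nonce).decode(),
        "Created": created,
        "PasswordDigest": base64.b64encode(digest).decode(),
    }
```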
I have an IP cam that outputs an RTSP stream with:
and I want to stream that to YouTube (using ffmpeg).
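A minimal sketch of the idea (not my exact command): pull the camera's stream over RTSP, pass the H.264 video through unchanged, transcode audio to AAC, and push FLV to YouTube's RTMP ingest. The RTSP URL and stream key below are placeholders.

```python
# Build and launch an ffmpeg command for RTSP -> YouTube RTMP.
# RTSP_URL and STREAM_KEY are placeholders; adjust to your camera/account.
import subprocess

RTSP_URL = "rtsp://192.168.1.10:554/stream1"  # hypothetical camera URL
STREAM_KEY = "xxxx-xxxx-xxxx-xxxx"            # your YouTube stream key

def build_ffmpeg_cmd(rtsp_url: str, stream_key: str) -> list:
    return [
        "ffmpeg",
        "-rtsp_transport", "tcp",       # TCP is more reliable than UDP for RTSP
        "-i", rtsp_url,
        "-c:v", "copy",                 # pass H.264 through without re-encoding
        "-c:a", "aac", "-b:a", "128k",  # YouTube expects AAC audio
        "-f", "flv",
        "rtmp://a.rtmp.youtube.com/live2/" + stream_key,
    ]

if __name__ == "__main__":
    subprocess.run(build_ffmpeg_cmd(RTSP_URL, STREAM_KEY), check=True)
```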
```
root@cado:~# mkdir crack
root@cado:~# cd cado-nfs/
root@cado:~/cado-nfs# ./cado-nfs.py 7071142888328001159781190611496955124398513403641364137666269609876069075501385806782438994383629333720075608597235677504440240558571591145514673066702347 server.address=192.168.1.9 server.ssl=no server.whitelist=REDACTED workdir=/root/crack/
Info:root: Using default parameter file ./parameters/factor/params.c155
Info:root: No database exists yet
Info:Database: Database URI is db:sqlite3:///root/crack//c155.db
Info:Database: Opened connection to database /root/crack//c155.db
Info:root: Set tasks.linalg.bwc.threads=20 based on detected physical cores
Info:root: Set tasks.threads=20 based on detected logical cpus
Info:root: tasks.threads = 20 [via tasks.threads]
```
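When cado-nfs finishes, it prints the prime factors on its last output line. A trivial sanity check is to multiply them back together (sketch; the real factors aren't shown here, so the example uses stand-in numbers):

```python
# Verify that a list of candidate factors multiplies back to the original N.
def check_factors(n: int, factors: list) -> bool:
    prod = 1
    for f in factors:
        prod *= f
    return prod == n

# e.g. check_factors(N, [p, q]) with the primes cado-nfs reports
```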
```javascript
setInterval(function () {
    var c_f_b = document.querySelector("#alert_o_o");
    if (c_f_b !== null) { c_f_b.click(); }
}, 30000);
```
```
D:\dev>git clone --recursive https://github.com/mapillary/OpenSfM
D:\dev>cd OpenSfM
D:\dev\OpenSfM>git clone https://github.com/microsoft/vcpkg
Cloning into 'vcpkg'...
remote: Enumerating objects: 247401, done.
remote: Counting objects: 100% (59827/59827), done.
remote: Compressing objects: 100% (4389/4389), done.
remote: Total 247401 (delta 55699), reused 55438 (delta 55438), pack-reused 187574 (from 1)
Receiving objects: 100% (247401/247401), 75.41 MiB | 10.16 MiB/s, done.
Resolving deltas: 100% (164296/164296), done.
```
MemGPT provides a free endpoint for trying it out, at https://inference.memgpt.ai/chat/completions. The official docs claim the free endpoint runs on variants of Mixtral 8x7b:

"MemGPT Free Endpoint: select this if you'd like to try MemGPT on a top open LLM for free (currently variants of Mixtral 8x7b!)"

However, after manually running Mixtral 8x7b on my own machine, I found it nowhere near the free endpoint in terms of accuracy (function calling + response quality). That made me want to find out the real model behind this endpoint.

The free endpoint just forwards your call to OpenAI's GPT-3.5 (ChatGPT). However, I cannot be sure whether they log our requests or not.
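You can probe the endpoint yourself by POSTing an OpenAI-style chat-completion payload to it. The payload shape below follows the standard `/chat/completions` format; the model name is a placeholder, since the endpoint's accepted names aren't documented here.

```python
# Build a standard OpenAI-style /chat/completions request body.
def build_chat_request(prompt: str, model: str = "memgpt") -> dict:
    return {
        "model": model,  # placeholder name, not confirmed by the endpoint
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0,
    }

# Usage (needs the `requests` package and network access):
# import requests
# resp = requests.post("https://inference.memgpt.ai/chat/completions",
#                      json=build_chat_request("Which model are you?"))
# print(resp.json())
```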
```python
# Import
import cv2
from deepsparse.pipeline import Pipeline
from deepsparse.yolo.schemas import YOLOInput
from deepsparse.yolo.utils import COCO_CLASSES
import time

# Model settings
task = "yolo"
model_path = "zoo:yolov5-l-coco-pruned.4block_quantized"
```
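The snippet stops at the model settings; a hedged sketch of the remaining steps might look like the following. `Pipeline.create(task=..., model_path=...)` and the `images=` keyword follow DeepSparse's YOLO pipeline API, while the webcam index (0) and the 0.5 confidence threshold are assumptions of mine.

```python
# Sketch of completing the setup: create the pipeline and loop over frames.
task = "yolo"  # same settings as above
model_path = "zoo:yolov5-l-coco-pruned.4block_quantized"

def filter_detections(boxes, labels, scores, min_score=0.5):
    """Keep (box, label, score) triples whose score clears the threshold."""
    return [(b, l, s) for b, l, s in zip(boxes, labels, scores) if s >= min_score]

def run_webcam():
    # Heavy imports kept local so the pure helper above works anywhere.
    import cv2
    from deepsparse.pipeline import Pipeline

    yolo = Pipeline.create(task=task, model_path=model_path)
    cap = cv2.VideoCapture(0)  # assumption: first webcam
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        out = yolo(images=[frame])  # YOLO output: boxes / labels / scores per image
        print(filter_detections(out.boxes[0], out.labels[0], out.scores[0]))
    cap.release()
```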