
@visionarylab
Forked from aras-p/unity_6_empty_web_build.md
Last active January 26, 2026 06:19
Unity 6 "empty" web build file sizes
First off, you need 3D projection-mapping software such as MadMapper (a leading tool for complex work; $51/month, or $533 to buy), HeavyM (best for new users; $294–$714, with rent-to-own options too), Resolume Arena (aimed at VJing; $927/year, and you have to renew to keep getting updates), TouchDesigner, and so on. All of these cost hundreds of dollars, and some are subscription-only because they like gatekeeping the tech. Other than that, you just stretch an image or video in the software, like you would in Photoshop, at the angles you need to fit wherever your projector is pointing.
https://github.com/LiterallyPhil/Echelon-OS
https://github.com/vzakharchenko/Forge-Secure-Notes-for-Jira
https://github.com/kannonboy/unsplash-rovo-agent-example
https://fibery.io/partners
#Notion, integrately, unito, latenote
Census - FiveTran, tines, https://hightouch.com/pricing free tier
DeepNote
Clay.earth
Form, feedback
BankSync, transaction
https://marketplace.atlassian.com/apps/1226811/golinks-for-confluence?hosting=cloud&tab=overview
https://news.ycombinator.com/item?id=33978767
Grapple Importer
https://www.polymerhq.io/pricing/
https://aws.amazon.com/marketplace/pp/prodview-mui6ld62t4ub6
https://marketplace.builtfirst.com/ramp-expense-management
#Mapon
https://www.gitbook.com/integrations/reflag
https://marketplace.atlassian.com/apps/1215727/templateme-customized-notifications?hosting=datacenter&tab=pricing
https://trello.com/power-ups/652cd18cd9ce322b3e03bebe/hidynotes-private-card-notes
#Embedpress
#guardian
https://www.gitguardian.com/pricing#plan-details
https://trufflesecurity.com/pricing
https://www.reddit.com/r/devsecops/comments/tg2wb5/experience_with_application_security_tools_cycode/
https://www.consoleconnect.com/services/marketplace/
Akrew Pro is our in-house membership system (similar to Patreon)
The Rise and Fall of Nookazon | ACNH
Alternatives to Traderie depend on the game or market you are trading in, but common options include sites like mm2values.com, gamersberg.com, and diablo2.io for specific games, or platforms like eToro and Interactive Brokers for financial trading and copy trading. Other gaming-focused alternatives include zeusx.com and itemku.com.
For gaming:
  • MM2Values.com: a popular site for valuing items in the game Murder Mystery 2.
  • GamersBerg.com: a community for trading and checking values of various items.
  • Diablo2.io: a community site with a trading focus for Diablo 2.
  • ZeusX.com: a global marketplace for trading in-game items.
  • Itemku.com: an online marketplace focused on digital goods, including in-game items.
https://www.macprovideo.com/article/audio-software/create-your-own-midi-generated-realtime-visuals-with-openframeworks
1. Create a Render Texture: in the Project window, right-click and select Create > Render Texture. Name it appropriately (e.g., VideoTexture). This asset acts as the canvas the video is drawn into frame by frame.
2. Add a display object:
  • For 3D playback: create a 3D object such as a Quad or Plane in your scene to serve as the screen. A Quad is usually the better choice, since it faces the camera by default and has far fewer vertices than a Plane.
  • For 2D (UI) playback: in the Hierarchy, right-click and select UI > Canvas, then right-click the Canvas and select UI > Raw Image.
3. Configure the Video Player:
  • Create an empty GameObject (right-click in the Hierarchy > Create Empty) and name it VideoPlayerObject.
  • Select VideoPlayerObject, click Add Component in the Inspector, and add a Video Player component.
  • Drag your imported video file into the Video Clip field.
  • Change the Render Mode to Render Texture and drag your VideoTexture asset into the Target Texture field.
4. Display the video on the object:
  • For 3D playback: create a new Material (right-click in the Project window > Create > Material) and name it VideoMaterial. With the material selected, drag VideoTexture into the Albedo (and optionally Emission) slot, then drag VideoMaterial onto your 3D object in the scene.
  • For 2D (UI) playback: select your Raw Image element and drag the VideoTexture asset into the Texture field of the Raw Image component.
5. Configure audio (optional):
  • In the Video Player component, change the Audio Output Mode to Audio Source.
  • Add an Audio Source component to the same GameObject (VideoPlayerObject).
  • Link that Audio Source to the Audio Source field in the Video Player.
6. Play:
  • Ensure Play On Awake is checked in the Video Player component so the video starts when the scene loads.
  • Run the scene in the Unity Editor to see the video play on your chosen surface.
If you tell me your target platform (Windows / Linux X11 / Linux Wayland / macOS / Android) I can find the most relevant repo(s) and search inside them (semantic search) for the exact functions/classes that implement video-as-texture, drawArea/mask, and projection/rotation code, then summarize the implementation and point to the files/lines.
I can produce a minimal example (OpenGL/WebGL or a small Electron/SDL app) that:
plays a video as a texture,
maps it into a subset region (clipped quad or UV mask),
applies a transform/perspective so the video appears rotated to face the viewer,
and shows how to animate/rotate it.
Progress so far — what I looked for and how
I loaded the repository-code-search ability and ran targeted code searches for patterns that indicate a video being used as a texture and then mapped/clipped to a sub-region or mesh (keywords/patterns: VideoPlayer, video.texture, VideoTexture, drawImage with source-rect, setDrawArea, setProjectionMatrix, mesh UVs, plane geometry).
I inspected matching files/readmes and opened the most promising hits to confirm they actually map video frames into a subset region, onto quads/meshes, or update textures from canvases/videos.
Top repositories and files that implement video → texture → subset-region / mesh / UV mapping (with links and quick notes)
DepthViewer (Unity) — direct video-as-texture onto a mesh with UVs
ImgVidDepthTexInputs.cs — handles VideoPlayer, reads VideoPlayer.texture, updates frames and sets the mesh:
https://github.com/parkchamchi/DepthViewer/blob/6c69d7ed39f1188819b7ca30e04fc4963eea0673/DEPTH/Assets/Scripts/TexInputs/ImgVidDepthTexInputs.cs
(this file reads vp.texture and then UpdateVid() sets a mesh)
MeshBehavior.cs — builds Mesh, sets vertices & uv arrays and uses a Material (so the video texture is mapped using UVs onto the mesh):
https://github.com/parkchamchi/DepthViewer/blob/6c69d7ed39f1188819b7ca30e04fc4963eea0673/DEPTH/Assets/Scripts/MeshBehavior.cs
Why this is relevant: Unity VideoPlayer.texture → Texture assigned to a material, and MeshBehavior holds the UVs/triangles, so the video frames are rendered onto a custom mesh (subset region / warped plane).
three.js / WebGL examples — video mapped to a plane / mesh (canvas copying, VideoTexture)
NIN demo README shows creating a canvas, THREE.VideoTexture and mapping it to a mesh:
https://github.com/ninjadev/nin/blob/e22015939cf616287f63e4b1d8bc6876d382439a/README.md#L135-L222
jocly (Gadget3DVideo) — creates a <video> tag, copies frames into a canvas via drawImage(video,...), uses new THREE.Texture(videoCanvas) and applies as material.map on PlaneGeometry:
https://github.com/aclap-dev/jocly/blob/f06a159c8f7e21483ab63508d94059c0be822ddc/src/browser/jocly.xd-view.js#L2202-L2293
Why this is relevant: these show the common web approach — draw video into canvas or use VideoTexture, then map it to a plane/mesh and update texture each frame. You can crop the source (drawImage with source rectangle) or change UVs to map a subset region.
Canvas / 2D drawImage cropping (video → subset) — direct clipping/cropping solution
video-canvas — helper that continuously paints a video into a canvas and exposes a custom drawCall (ctx.drawImage(video, sx, sy, sw, sh, dx, dy, dw, dh) can clip/crop):
https://github.com/fregante/video-canvas/blob/cd9ab0b2cd21a2759304aa8ec820043c0b53f413/readme.md
Why this is relevant: the HTML5 canvas drawImage API directly supports drawing a source rect (a subset of the frame) to a destination rect; this is a simple way to map a video frame into a subregion or mask part of it (you can then use the canvas as a texture in WebGL or apply clipping/mask shapes).
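The 9-argument drawImage crop described above can be sketched without a browser. Here is a minimal pure-Python illustration (my own nearest-neighbour sketch; lists of lists stand in for pixel buffers, and all names are mine) of copying the (sx, sy, sw, sh) sub-rect of a source frame into the (dx, dy, dw, dh) rect of a destination:

```python
def draw_image(src, sx, sy, sw, sh, dst, dx, dy, dw, dh):
    """Copy the (sx, sy, sw, sh) sub-rect of `src` into the
    (dx, dy, dw, dh) rect of `dst`, nearest-neighbour scaled --
    the same semantics as canvas drawImage's 9-argument form."""
    for j in range(dh):
        for i in range(dw):
            # Map each destination pixel back to a source pixel.
            u = sx + i * sw // dw
            v = sy + j * sh // dh
            dst[dy + j][dx + i] = src[v][u]
    return dst

# Crop the 2x2 top-left corner of a 4x4 "frame" and blow it up to 4x4.
frame = [[10 * r + c for c in range(4)] for r in range(4)]
canvas = [[0] * 4 for _ in range(4)]
draw_image(frame, 0, 0, 2, 2, canvas, 0, 0, 4, 4)
```

Scaling a crop like this is what lets you draw a subset of a video frame anywhere on the canvas; a real implementation would let the browser or GPU do the sampling.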
PlayCanvas / Three demos / Misc game engines
PlayCanvas and other engine demos use the same pattern: <video> → drawImage(video, canvas) → engine texture with needsUpdate = true → material on a mesh/plane. VideoTexture references appear across several PlayCanvas export repos; one example context:
https://github.com/TyRadman/PlayCanvas_InteractivePortfolio/blob/566c9cfe7377d57ab56257b654ac1e2d2ae845a9/log.txt
Why this is relevant: identical workflow — video → canvas → texture → mesh. The canvas gives precise control over cropping/clipping before mapping.
ofxReprojection / OpenFrameworks — drawing textures into arbitrary projected areas
ofxReprojection README/docs show methods like setProjectionMatrix, setDrawArea, drawImage which let you draw a texture into a specified area with a projection transform:
https://github.com/Illd/ofxReprojection/blob/e1da235a31e71e0650d25b65b95eb044eaded6f2/DOCS.md#L124-L187
Why this is relevant: when you need the video rotated to “face” the viewer (perspective correction), reprojection utilities show the exact APIs to draw images/textures into arbitrary transformed regions.
Unity simple examples — VideoPlayer → RawImage and VideoPlayer.texture assignments
Many Unity examples show image.texture = videoPlayer.texture; that covers mapping into UI elements (RawImage) — example: OpeningLogo.cs and others:
https://github.com/Catacomb-Snatch/Catacomb-Snatch/blob/af8fc849d0730e36ad02f2a6f6da1946341ee613/Assets/Scripts/Menus/OpeningLogo.cs
Why this is relevant: RawImage supports cropping/rect transforms in UI; for arbitrary mesh transforms, DepthViewer above is closer to what you asked.
Short analysis / how these approaches implement "subset region + rotation/perspective"
Canvas drawImage(sourceRect, destRect) — simplest: select subset of the video frame and draw wherever you want inside the canvas. Use canvas as texture in WebGL if you need GPU transforms. (video-canvas, multiple web demos)
VideoTexture / canvas → Three.Texture → map to PlaneGeometry or custom geometry — then change UVs or the geometry vertices to place/warp/rotate the video region; update the texture each frame. (jocly, NIN, many three.js demos)
Unity VideoPlayer.texture → assign to Material used on Mesh → set UVs in mesh (MeshBehavior.cs) and apply transforms/rotation so that the video quad faces camera/viewer. (DepthViewer)
ofxReprojection / custom projection matrices — use real projection matrices to map texture coordinates to screen/projector coordinates for perspective-correct mapping.
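The four approaches above share two ingredients: selecting a subset region (via a source rect or UVs) and a perspective transform so the quad "faces" the viewer. As a language-neutral sketch under stated assumptions (the function names and the simple pinhole camera model are mine, not taken from any of the repos):

```python
import math

def subregion_uvs(x, y, w, h, tex_w, tex_h):
    """UVs (0..1, one pair per corner) that sample only the
    (x, y, w, h) pixel sub-rect of a tex_w x tex_h video texture."""
    u0, v0 = x / tex_w, y / tex_h
    u1, v1 = (x + w) / tex_w, (y + h) / tex_h
    return [(u0, v0), (u1, v0), (u1, v1), (u0, v1)]

def project(points, angle_y, distance=4.0, focal=2.0):
    """Rotate quad corners around the Y axis, push them `distance`
    away from the camera, and apply a pinhole perspective divide."""
    c, s = math.cos(angle_y), math.sin(angle_y)
    out = []
    for x, y, z in points:
        xr, zr = c * x + s * z, -s * x + c * z  # rotate around Y
        zc = zr + distance                      # camera looks down +Z
        out.append((focal * xr / zc, focal * y / zc))
    return out

quad = [(-1, -1, 0), (1, -1, 0), (1, 1, 0), (-1, 1, 0)]
uvs = subregion_uvs(100, 50, 640, 360, 1920, 1080)   # sample a 640x360 window of a 1080p frame
screen = project(quad, math.radians(30))             # quad turned 30 degrees toward the viewer
```

subregion_uvs picks which part of the frame a mesh samples; project shows why a rotated quad's near edge renders larger than its far edge, which is the perspective-correct look that the reprojection utilities automate.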
Limitations of my search
GitHub code-search responses are limited; the results I inspected are representative but not exhaustive — there may be more wallpaper-specific projects that implement subset-region video mapping (especially commercial/closed or non-GitHub code).
If you want exhaustive coverage I can run targeted semantic searches inside any specific repo from the list or more widely for patterns like "VideoPlayer.texture" OR "drawImage(video" OR "VideoTexture" OR "setDrawArea" across more repos (note: semantic repo searches require repoOwner/repoName per call).
Suggested next steps — pick one
I can deep-inspect one of the top hits (recommended: parkchamchi/DepthViewer) with a semantic search to extract the exact functions that do: bind video texture → build UVs → apply projection/rotation; then paste the relevant code snippets and explain how to adapt it for wallpaper usage.
Or I can produce a minimal example (WebGL/three.js or Unity) that demonstrates:
play video to texture,
crop a source rect (subset) or use UVs to map only part of it,
render that textured quad with rotation/perspective so it "faces" the viewer.
Or I can run additional code searches for other projects that are explicitly wallpaper apps (search terms: "live wallpaper", "wallpaper engine", "set wallpaper video", "desktop wallpaper video") and filter to those that do mesh/UV mapping.
Which of these do you want next?
Inspect DepthViewer files and extract the exact code paths (VideoPlayer → texture → mesh UVs + how to rotate to face camera), or
Produce a short three.js or Unity minimal example that maps a video into a sub-region and rotates it to face the viewer, or
Broaden the repo search to find more wallpaper-specific projects that do mesh/UV mapping (I can run repo-scoped semantic searches once you pick a repo to inspect).
If you want me to inspect a repo, tell me which one (or say "DepthViewer") and I’ll open it and extract the exact functions and code lines that implement the mapping (I already have DepthViewer files located and can run a semantic search inside it next).
If your reprojection is shader-heavy, you can convert the GLSL shaders from openFrameworks to work with Shadertoy or X Live Wallpaper (which supports shaders).
Key Considerations
Performance: ofxReprojection involves real-time 3D rendering. Running this as a wallpaper will consume significant CPU/GPU resources.
Optimization: Ensure your project is optimized to reduce battery drain on laptops or phones.

This short post by Defold people about "empty project build size" comparison between Defold, Unity and Godot (twitter, mastodon) sparked my interest.

It is curious that Godot builds seem to be larger than Unity? Would not have expected that! Anyway.

A way more extensive Unity "web" build comparison and analysis is over at https://github.com/JohannesDeml/UnityWebGL-LoadingTest but here are just my short notes from trying out Unity 6 (6.0.23 - Oct 2024).

Default (3D, URP) template

Create a project using the default ("3D", Universal Render Pipeline) template, do nothing, just switch to the "Web" platform and make a build: 10.7MB (3.7MB data, 6.9MB code). Takes about 6 minutes to finish the build.

Look at (uncompressed) asset sizes that are printed in the Editor.log during the build (note: it's been an issue for years that some of what goes into the build files is not reported for some reason), but anyway. The largest (uncompressed) data contributors are:

  • 2.7MB: unity logo/splash texture (!). I remember years ago we tried to keep it small since "this goes into all builds". Apparently not anymore.
  • 2.7MB: URP FilmGrain textures (10 textures, 256KB each).
  • 0.5MB: "unity_builtin_extra" - various "built-in" assets that are included into the build if something needs/uses them. Again, it's known for years that it would be more useful to report details on what got included, but here we are.
  • 0.4MB: URP blue noise textures (7 textures, 64KB each).
  • 0.3MB: URP anti-aliasing (SMAA) AreaTex texture.
  • 0.2MB: URP UberPost shader.

Looks like various things, mostly related to URP post-processing, are "always included" into the build, even if you don't explicitly use them. The 3+ MB above is just "film grain" textures, "blue noise" textures, anti-aliasing texture, plus a bunch of shaders and so on.

Not terribly large, but curious things: 80KB is URP "Runtime Debugging" truetype font (PerfectDOSVGA437.ttf), plus 16KB is another runtime debugging font (DebugFont.tga). There's also 60KB of URP "DebugUIbitField.prefab". All of these sound like some sort of "debug overlay/visualization" thingy, that is for some reason included into the build (even if I'm making a non-Development build!).

There's 3KB of Assets/Resources/PerformanceTestRunInfo.json and while it is tiny, I wonder why it is there at all, and what it does contain (there's no asset like that anywhere in the project or packages; it somehow gets generated during build time apparently).

Anyway, at this point sounds like URP has not really paid much attention to minimizing the build sizes, so let's try our good old friend, the built-in render pipeline (BiRP).

2D, Built-in Render Pipeline template

Create a project using "2D + BiRP" template that is an option in Unity Hub. Again switch platform to "Web" (unity hangs at "compiling scripts: backend" state with zero CPU utilization; kill it, restart, now works), make a build. 7.7MB (1.5MB data, 6.1MB code), takes almost 4 minutes to build.

Ok, so BiRP saves about 2MB worth of (Brotli-compressed, which is default) data size, good; Brotli-compressed code size is smaller too. Out of uncompressed assets reported in Editor.log, the same 2.7MB for splash screen / logo is still there, the rest is peanuts.

Turn off splash/logo, remove packages we don't need

With Unity 6 you can turn off the default splash/logo even in the free ("personal") license, so do that.

Also, while the project feels like it is "empty" and contains nothing, that is not actually true; it contains several dozen packages. I think I'm not gonna need: Visual Scripting, Timeline, Version Control, Performance Testing API (whatever that is), Multiplayer Center, Visual Studio Editor, Test Framework. Turn those off in the package manager window.

There's also Burst, Mathematics, Collections, as well as a bunch of "2D" related packages like Aseprite Importer and so on. Confusingly enough, the package manager UI does not allow me to remove them, since they are part of the "2D feature". It only allows me to "unlock" said packages, but what that does I have no idea. After unlocking them, they still can't be removed. You know what, just remove the whole "2D" feature. After all, basic 2D (sprites, materials, 2D physics etc.) is still built-in and available.

Build is 6.9MB (1.1MB data, 5.6MB code), 3 minute build time.

Try out "disk size" code optimization

In the build profiles window, "code optimization" setting defaults to "shorter build time". Switch that to "disk size", build. Build size increases, lol (7.1MB), and takes over 10 minutes to build. Ok that does not sound terribly useful! Forget about it, change code optimization setting back to default.

Increase code stripping level

Hidden deep inside Player Settings (which is organized like a major mess), there are several settings that might affect build size:

  • Player Settings -> Other Settings -> Optimization -> Managed Stripping Level. Change to "High". This is what allows removing "not used by the game" parts of the "engine", I think.
  • Player Settings -> Other Settings -> Configuration -> IL2CPP Code Generation. Change to "Faster (smaller) builds". Various tooltips there talk about "scripting backends", which is confusing for all of these platforms where there's only one scripting backend.
  • Player Settings -> Publishing Settings -> Web Assembly Features -> Enable Exceptions. Change to "None".
  • Player Settings -> Publishing Settings -> Web Assembly Features -> Use WebAssembly.Table. Turn on. Might lose some old browsers support, but the tooltip indicates that it might save some code size.
  • Graphics Settings -> Shader Settings -> Video. Change to "Don't include".

Build: 4.8MB (0.9MB data, 3.8MB code), two minutes to build.

Remove Input System, Unity UI

Now, for some reason there's still a lot of engine code that does not get removed. We have removed almost all packages from the project... but not all of them! You know what, let's remove "Input System" and "Unity UI", just to see.

Build: 2.3MB! (0.4MB data, 1.8MB code), one minute to build.

Now we're talking! And again, it is hard to say why, for example, the data file shrank to half its size; the Editor.log build size report does not contain useful information. But the engine code size got way smaller. I did not check whether it is the Input System package or the Unity UI package that "drags in" a ton of engine code.

Try to remove built-in engine modules

In the package manager UI left sidebar there is a section called "Built-in" with no explanations. And it lists a bunch of things that do not have descriptions either. These are not "packages" but rather "engine modules" (IMHO a largely misguided and/or unfinished effort from years ago). Let's try to turn off all the ones we think we don't need: Accessibility, AI, Cloth, Director, Physics (keep Physics 2D), Screen Capture, Terrain, Terrain Physics, UIElements, Umbra, Unity Analytics, Vehicles, Video, VR, Wind, XR.

Build: size unchanged. Turning off these "built-in modules" does not do much/anything! It might help to avoid accidentally adding a dependency to them during development, but otherwise if you have code stripping on already (see sections above), it won't reduce file sizes.

Change code optimization setting back

Previously (with "managed code stripping" player setting at default), build profile code optimization setting for "disk size" was not useful (larger code, and way longer build times). But maybe now it would be better?

Code optimization set to "disk size": 2.1MB (0.4MB data, 1.6MB code), build time 70 seconds.

Code optimization set to "disk size with LTO": 2.0MB (0.4MB data, 1.5MB code), build time 60 seconds.

That's it!
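For a quick recap of the arithmetic, here is the size progression from the steps above (sizes as reported in the text; the step labels are my own shorthand):

```python
# Web build sizes (MB, Brotli-compressed totals) after each step described above.
steps = [
    ("Default 3D URP template",             10.7),
    ("2D BiRP template",                     7.7),
    ("No splash, fewer packages",            6.9),
    ("High stripping + IL2CPP/wasm opts",    4.8),
    ("Remove Input System + Unity UI",       2.3),
    ("Code optimization: disk size",         2.1),
    ("Disk size with LTO",                   2.0),
]
baseline = steps[0][1]
for name, size in steps:
    # Print each step's size and how it compares to the default build.
    print(f"{name:38s} {size:5.1f} MB  ({size / baseline:5.1%} of baseline)")
```

From the default URP template to the fully stripped BiRP build, the compressed size drops by a factor of about 5.3.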

https://platform.uno/docs/articles/guides/silverlight-migration/01-create-uno-solution.html
>dotnet new install [email protected] (Uno 6.0 introduces Uno Design Studio, Hot Design, a new MediaPlayerElement, drops UWP support, etc.)
>dotnet new unoapp-uwp -o uwptesting -tfm=net8.0 -wasm=false -mobile=false -skia-wpf=false -skia-gtk=false -skia-linux-fb=false
dotnet restore
Best old tutorial is https://platform.uno/blog/tag/goodreads/
thecatapi
uno contoso
https://www.visioforge.com/buy