From b8d4a815453acac752c6fb3c56d047e39a76fd05 Mon Sep 17 00:00:00 2001
From: skal
Date: Sat, 14 Feb 2026 19:05:34 +0100
Subject: feat(gpu): add SDF camera infrastructure and effect base class

Add unified camera system for SDF raymarching effects:
- CameraParams struct (80 bytes): inv_view matrix + FOV/near/far/aspect
- SDFEffect base class: manages camera uniform, provides update_camera() helpers
- camera_common.wgsl: getCameraRay(), position/forward/up/right extractors
- SDFTestEffect: working example with orbiting camera + animated sphere

Refactor effect headers:
- Extract class definitions from demo_effects.h to individual .h files
- Update includes in .cc files to use specific headers
- Cleaner compilation dependencies, faster incremental builds

Documentation:
- Add SDF_EFFECT_GUIDE.md with complete workflow
- Update ARCHITECTURE.md, UNIFORM_BUFFER_GUIDELINES.md
- Update EFFECT_WORKFLOW.md, CONTRIBUTING.md

Tests: 34/34 passing, SDFTestEffect validated

Co-Authored-By: Claude Sonnet 4.5
---
 doc/ARCHITECTURE.md              |  26 +++++++
 doc/CONTRIBUTING.md              |   8 +-
 doc/EFFECT_WORKFLOW.md           |   8 +-
 doc/SDF_EFFECT_GUIDE.md          | 164 +++++++++++++++++++++++++++++++++++++++
 doc/UNIFORM_BUFFER_GUIDELINES.md |  40 +++++++++-
 5 files changed, 238 insertions(+), 8 deletions(-)
 create mode 100644 doc/SDF_EFFECT_GUIDE.md

diff --git a/doc/ARCHITECTURE.md b/doc/ARCHITECTURE.md
index 4c36ec5..ebb2a59 100644
--- a/doc/ARCHITECTURE.md
+++ b/doc/ARCHITECTURE.md
@@ -4,6 +4,28 @@ Detailed system architecture for the 64k demo project.
 
 ---
 
+## SDF Camera System
+
+**Purpose**: Unified camera infrastructure for SDF raymarching effects.
+
+**CameraParams** (80 bytes, `src/gpu/camera_params.h`):
+- `inv_view`: mat4 (inverse view matrix for screen→world transform)
+- `fov`, `near_plane`, `far_plane`, `aspect_ratio`: f32 (camera parameters)
+
+**SDFEffect Base Class** (`src/gpu/sdf_effect.h`):
+- Manages the camera's `UniformBuffer`
+- Provides `update_camera()` helper methods (from a Camera object or manual values)
+- Standard binding: 0=CommonUniforms, 1=CameraParams
+
+**WGSL Helpers** (`common/shaders/camera_common.wgsl`):
+- `getCameraRay(cam, uv)`: Generate a ray from screen UV coordinates
+- `getCameraPosition/Forward/Up/Right()`: Extract camera vectors from inv_view
+- Integrates with the existing `render/raymarching.wgsl` (rayMarch, normal, shadow)
+
+**Usage**: Effects inherit from SDFEffect and update the camera each frame; the shader reads the camera uniforms for raymarching.
+
+---
+
 ## Hybrid 3D Renderer
 
 **Core Idea**: Uses standard rasterization to draw proxy hulls (boxes), then raymarches inside the fragment shader to find the exact SDF surface.
@@ -18,6 +40,10 @@ Detailed system architecture for the 64k demo project.
 
 **Effect**: Abstract base for visual elements. Supports `compute` and `render` phases.
 
+**PostProcessEffect**: Subclass for full-screen post-processing effects.
+
+**SDFEffect**: Subclass for SDF raymarching effects with camera management (see SDF Camera System above).
+
 **Sequence**: Timeline of effects with start/end times defined in beats.
 
 **MainSequence**: Top-level coordinator and framebuffer manager.
diff --git a/doc/CONTRIBUTING.md b/doc/CONTRIBUTING.md
index d7ef88a..7fbfd64 100644
--- a/doc/CONTRIBUTING.md
+++ b/doc/CONTRIBUTING.md
@@ -65,10 +65,14 @@ See `doc/CODING_STYLE.md` for detailed examples.
 ## Development Protocols
 
 ### Adding Visual Effect
-1. Create effect class files (use `tools/shadertoy/convert_shadertoy.py` or templates)
+
+**For SDF/raymarching effects:** Use the `SDFEffect` base class (see `doc/SDF_EFFECT_GUIDE.md`).
+
+**For standard effects:**
+1. Create effect class files (each effect should have its own `.h` and `.cc` file, e.g., `src/effects/my_effect.h` and `src/effects/my_effect.cc`). Use `tools/shadertoy/convert_shadertoy.py` or templates.
 2. Add shader to `workspaces/main/assets.txt`
 3. Add effect `.cc` file to `CMakeLists.txt` GPU_SOURCES (both sections)
-4. Include header in `src/gpu/demo_effects.h`
+4. Include header in `src/gpu/demo_effects.h` (which now serves as a central include for all individual effect headers)
 5. Add to workspace `timeline.seq` (e.g., `workspaces/main/timeline.seq`)
 6. **Update `src/tests/gpu/test_demo_effects.cc`**:
    - Add to `post_process_effects` list (lines 80-93) or `scene_effects` list (lines 125-137)
diff --git a/doc/EFFECT_WORKFLOW.md b/doc/EFFECT_WORKFLOW.md
index 22b8dc9..57cf904 100644
--- a/doc/EFFECT_WORKFLOW.md
+++ b/doc/EFFECT_WORKFLOW.md
@@ -10,6 +10,8 @@ Automated checklist for adding new visual effects to the demo.
 
 **For ShaderToy conversions:** Use `tools/shadertoy/convert_shadertoy.py` then follow steps 3-8 below.
 
+**For SDF/raymarching effects:** See `doc/SDF_EFFECT_GUIDE.md` for a streamlined workflow using the SDFEffect base class.
+
 **For custom effects:** Follow all steps 1-8.
 
 ---
@@ -18,6 +20,8 @@ Automated checklist for adding new visual effects to the demo.
 
 ### 1. Create Effect Files
 
+**Description:** Each visual effect must have its own dedicated header (`.h`) and implementation (`.cc`) file pair.
+
 **Location:**
 - Header: `src/effects/<name>_effect.h`
 - Implementation: `src/effects/<name>_effect.cc`
@@ -84,7 +88,7 @@ SHADER_TUNNEL, NONE, shaders/tunnel.wgsl, "Tunnel effect shader"
 
 # In normal section (line ~183):
     src/effects/solarize_effect.cc
-    src/effects/tunnel_effect.cc  # <-- Add here
+    src/effects/tunnel.cc  # <-- Add here
     src/effects/chroma_aberration_effect.cc
 ```
 
@@ -92,7 +96,7 @@ SHADER_TUNNEL, NONE, shaders/tunnel.wgsl, "Tunnel effect shader"
 
 **File:** `src/gpu/demo_effects.h`
 
-**Action:** Add include directive:
+**Action:** `src/gpu/demo_effects.h` now acts as a central include file. Add a single include directive for your new effect's header:
 ```cpp
 #include "effects/<name>_effect.h"
 ```
diff --git a/doc/SDF_EFFECT_GUIDE.md b/doc/SDF_EFFECT_GUIDE.md
new file mode 100644
index 0000000..fba80e7
--- /dev/null
+++ b/doc/SDF_EFFECT_GUIDE.md
@@ -0,0 +1,164 @@
+# SDF Effect Guide
+
+Streamlined workflow for SDF raymarching effects using the `SDFEffect` base class.
+
+---
+
+## Quick Start
+
+```cpp
+// src/effects/my_sdf_effect.h
+class MySDFEffect : public SDFEffect {
+ public:
+  MySDFEffect(const GpuContext& ctx);
+  void render(WGPURenderPassEncoder pass,
+              const CommonPostProcessUniforms& uniforms) override;
+
+ private:
+  RenderPass pass_;
+};
+```
+
+```cpp
+// src/effects/my_sdf_effect.cc
+#include <cmath>
+
+#include "effects/my_sdf_effect.h"
+#include "gpu/gpu.h"
+#include "gpu/shaders.h"
+
+MySDFEffect::MySDFEffect(const GpuContext& ctx) : SDFEffect(ctx) {
+  ResourceBinding bindings[] = {
+      {uniforms_.get(), WGPUBufferBindingType_Uniform},
+      {camera_params_.get(), WGPUBufferBindingType_Uniform}};
+  pass_ = gpu_create_render_pass(ctx_.device, ctx_.format,
+                                 my_sdf_shader_wgsl, bindings, 2);
+  pass_.vertex_count = 3;
+}
+
+void MySDFEffect::render(WGPURenderPassEncoder pass,
+                         const CommonPostProcessUniforms& uniforms) {
+  uniforms_.update(ctx_.queue, uniforms);
+
+  // Orbiting camera
+  vec3 cam_pos(std::cos(uniforms.time * 0.5f) * 5.0f, 2.0f,
+               std::sin(uniforms.time * 0.5f) * 5.0f);
+  update_camera(cam_pos, vec3(0, 0, 0), vec3(0, 1, 0), 0.785398f, 0.1f, 100.0f,
+                uniforms.aspect_ratio);
+
+  wgpuRenderPassEncoderSetPipeline(pass, pass_.pipeline);
+  wgpuRenderPassEncoderSetBindGroup(pass, 0, pass_.bind_group, 0, nullptr);
+  wgpuRenderPassEncoderDraw(pass, pass_.vertex_count, 1, 0, 0);
+}
+```
+
+```wgsl
+// workspaces/main/shaders/my_sdf.wgsl
+#include "common_uniforms"
+#include "camera_common"
+#include "math/sdf_shapes"
+#include "render/raymarching"
+
+@group(0) @binding(0) var<uniform> uniforms: CommonUniforms;
+@group(0) @binding(1) var<uniform> camera: CameraParams;
+
+fn df(p: vec3<f32>) -> f32 {
+  return sdSphere(p, 1.0);
+}
+
+@vertex
+fn vs_main(@builtin(vertex_index) vid: u32) -> @builtin(position) vec4<f32> {
+  let x = f32((vid & 1u) << 2u) - 1.0;
+  let y = f32((vid & 2u) << 1u) - 1.0;
+  return vec4<f32>(x, y, 0.0, 1.0);
+}
+
+@fragment
+fn fs_main(@builtin(position) pos: vec4<f32>) -> @location(0) vec4<f32> {
+  let uv = (pos.xy / uniforms.resolution - 0.5) * 2.0;
+  let ray = getCameraRay(camera, uv);
+  let t = rayMarch(ray.origin, ray.direction, 0.0);
+
+  var col = vec3<f32>(0.1);
+  if (t < MAX_RAY_LENGTH) {
+    let hit_pos = ray.origin + ray.direction * t;
+    let n = normal(hit_pos);
+    col = vec3<f32>(n * 0.5 + 0.5);
+  }
+  return vec4<f32>(col, 1.0);
+}
+```
+
+---
+
+## Available Uniforms
+
+### CommonUniforms (binding 0)
+- `resolution`: vec2 (screen size)
+- `time`: float (physical seconds)
+- `beat_time`: float (musical beats)
+- `beat_phase`: float (0-1 within beat)
+- `audio_intensity`: float (peak)
+- `aspect_ratio`: float
+
+### CameraParams (binding 1)
+- `inv_view`: mat4x4 (inverse view matrix)
+- `fov`: float (vertical FOV in radians)
+- `near_plane`, `far_plane`: float
+- `aspect_ratio`: float
+
+---
+
+## WGSL Helpers
+
+From `camera_common.wgsl`:
+
+```wgsl
+fn getCameraRay(cam: CameraParams, uv: vec2<f32>) -> Ray;
+fn getCameraPosition(cam: CameraParams) -> vec3<f32>;
+fn getCameraForward(cam: CameraParams) -> vec3<f32>;
+fn getCameraUp(cam: CameraParams) -> vec3<f32>;
+fn getCameraRight(cam: CameraParams) -> vec3<f32>;
+```
+
+From `render/raymarching.wgsl`:
+
+```wgsl
+fn rayMarch(ro: vec3<f32>, rd: vec3<f32>, initt: f32) -> f32;
+fn normal(pos: vec3<f32>) -> vec3<f32>;
+fn shadow(lp: vec3<f32>, ld: vec3<f32>, mint: f32, maxt: f32) -> f32;
+```
+
+From `math/sdf_shapes.wgsl`:
+
+```wgsl
+fn sdSphere(p: vec3<f32>, r: f32) -> f32;
+fn sdBox(p: vec3<f32>, b: vec3<f32>) -> f32;
+fn sdTorus(p: vec3<f32>, t: vec2<f32>) -> f32;
+fn sdPlane(p: vec3<f32>, n: vec3<f32>, h: f32) -> f32;
+```
+
+---
+
+## Camera Control
+
+```cpp
+// Method 1: Manual values
+update_camera(position, target, up, fov, near, far, aspect);
+
+// Method 2: Camera object
+Camera cam;
+cam.position = vec3(0, 5, 10);
+cam.target = vec3(0, 0, 0);
+update_camera(cam, uniforms.aspect_ratio);
+```
+
+---
+
+## Registration Checklist
+
+1. Add shader to `workspaces/main/assets.txt`
+2. Add extern declaration to `src/gpu/shaders.h`
+3. Add definition to `src/gpu/shaders.cc`
+4. Add `.cc` to `cmake/DemoSourceLists.cmake` (both headless & normal)
+5. Include header in `src/gpu/demo_effects.h`
+6. Add to `src/tests/gpu/test_demo_effects.cc`
+
+---
+
+## Example: workspaces/main/shaders/sdf_test.wgsl
diff --git a/doc/UNIFORM_BUFFER_GUIDELINES.md b/doc/UNIFORM_BUFFER_GUIDELINES.md
index 93999d8..c6cf9c8 100644
--- a/doc/UNIFORM_BUFFER_GUIDELINES.md
+++ b/doc/UNIFORM_BUFFER_GUIDELINES.md
@@ -16,11 +16,17 @@ Structs are padded to the alignment of their largest member. Any trailing space
 
 ## Standard Uniform Buffer Pattern
 
-To maintain consistency and facilitate efficient rendering, a standard pattern for uniform buffer usage is established:
+To maintain consistency and facilitate efficient rendering, standard patterns for uniform buffer usage are established:
 
+### Post-Process Effects
 - **Binding 0 & 1:** Reserved for Sampler and Texture access (handled by `pp_update_bind_group`).
-- **Binding 2:** **Common Uniforms** (`CommonPostProcessUniforms` or similar). This buffer should contain frequently used data like resolution, aspect ratio, physical time, beat time, beat phase, and audio intensity.
-- **Binding 3:** **Effect-Specific Parameters**. This buffer holds parameters unique to a particular effect (e.g., `strength`, `speed`, `fade_amount`).
+- **Binding 2:** **Common Uniforms** (`CommonPostProcessUniforms`). Contains resolution, aspect ratio, physical time, beat time, beat phase, audio intensity.
+- **Binding 3:** **Effect-Specific Parameters**. Unique per-effect data (e.g., `strength`, `speed`, `fade_amount`).
+
+### SDF/Raymarching Effects
+- **Binding 0:** **Common Uniforms** (`CommonPostProcessUniforms`). Same as above.
+- **Binding 1:** **Camera Parameters** (`CameraParams`). Camera transform and projection data for raymarching.
+- **Binding 2+:** **Effect-Specific Parameters** (optional).
 
 This pattern ensures that common data is shared efficiently across effects, while effect-specific data remains isolated.
@@ -98,10 +104,36 @@ struct GaussianBlurParams {
   float strength = 2.0f;
   float _pad = 0.0f;
 };
-static_assert(sizeof(GaussianBlurParams) == 8,
+static_assert(sizeof(GaussianBlurParams) == 8,
               "GaussianBlurParams must be 8 bytes for WGSL alignment");
 ```
 
+**Example (C++ CameraParams):**
+
+```cpp
+struct CameraParams {
+  mat4 inv_view;       // 64 bytes - inverse view matrix (screen→world)
+  float fov;           // 4 bytes - vertical field of view (radians)
+  float near_plane;    // 4 bytes - near clipping plane
+  float far_plane;     // 4 bytes - far clipping plane
+  float aspect_ratio;  // 4 bytes - width/height ratio
+};
+static_assert(sizeof(CameraParams) == 80,
+              "CameraParams must be 80 bytes for WGSL alignment");
+```
+
+**Corresponding WGSL:**
+
+```wgsl
+struct CameraParams {
+  inv_view: mat4x4<f32>,  // 64 bytes
+  fov: f32,               // 4 bytes
+  near_plane: f32,        // 4 bytes
+  far_plane: f32,         // 4 bytes
+  aspect_ratio: f32,      // 4 bytes
+}
+```
+
 ## Handling Common Pitfalls
 
 - **`vec3` Padding:** Avoid using `vec3` for padding in WGSL, as it has a 16-byte alignment. If padding is needed, use `vec2` for 8 bytes or individual `f32`s for 4-byte alignment.