author    skal <pascal.massimino@gmail.com>  2026-02-14 19:05:34 +0100
committer skal <pascal.massimino@gmail.com>  2026-02-14 19:05:34 +0100
commit    b8d4a815453acac752c6fb3c56d047e39a76fd05 (patch)
tree      e8b49ac34aed2b5cfbdbdc4a4c99903fbd709cef /doc
parent    57aeae226617dbce364716f2d4e7c4aaa6271c1d (diff)
feat(gpu): add SDF camera infrastructure and effect base class
Add unified camera system for SDF raymarching effects:
- CameraParams struct (80 bytes): inv_view matrix + FOV/near/far/aspect
- SDFEffect base class: manages camera uniform, provides update_camera() helpers
- camera_common.wgsl: getCameraRay(), position/forward/up/right extractors
- SDFTestEffect: working example with orbiting camera + animated sphere

Refactor effect headers:
- Extract class definitions from demo_effects.h to individual .h files
- Update includes in .cc files to use specific headers
- Cleaner compilation dependencies, faster incremental builds

Documentation:
- Add SDF_EFFECT_GUIDE.md with complete workflow
- Update ARCHITECTURE.md, UNIFORM_BUFFER_GUIDELINES.md
- Update EFFECT_WORKFLOW.md, CONTRIBUTING.md

Tests: 34/34 passing, SDFTestEffect validated

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
Diffstat (limited to 'doc')
-rw-r--r--  doc/ARCHITECTURE.md               26
-rw-r--r--  doc/CONTRIBUTING.md                8
-rw-r--r--  doc/EFFECT_WORKFLOW.md             8
-rw-r--r--  doc/SDF_EFFECT_GUIDE.md          164
-rw-r--r--  doc/UNIFORM_BUFFER_GUIDELINES.md  40
5 files changed, 238 insertions, 8 deletions
diff --git a/doc/ARCHITECTURE.md b/doc/ARCHITECTURE.md
index 4c36ec5..ebb2a59 100644
--- a/doc/ARCHITECTURE.md
+++ b/doc/ARCHITECTURE.md
@@ -4,6 +4,28 @@ Detailed system architecture for the 64k demo project.
---
+## SDF Camera System
+
+**Purpose**: Unified camera infrastructure for SDF raymarching effects.
+
+**CameraParams** (80 bytes, `src/gpu/camera_params.h`):
+- `inv_view`: mat4 (inverse view matrix for screen→world transform)
+- `fov`, `near_plane`, `far_plane`, `aspect_ratio`: f32 (camera parameters)
+
+**SDFEffect Base Class** (`src/gpu/sdf_effect.h`):
+- Manages `UniformBuffer<CameraParams>`
+- Provides `update_camera()` helper methods (from Camera object or manual values)
+- Standard binding: 0=CommonUniforms, 1=CameraParams
+
+**WGSL Helpers** (`common/shaders/camera_common.wgsl`):
+- `getCameraRay(cam, uv)`: Generate ray from screen UV coordinates
+- `getCameraPosition/Forward/Up/Right()`: Extract camera vectors from inv_view
+- Integrates with existing `render/raymarching.wgsl` (rayMarch, normal, shadow)
+
+**Usage**: Effects inherit from SDFEffect, update camera each frame, shader accesses camera uniforms for raymarching.
+
+---
+
## Hybrid 3D Renderer
**Core Idea**: Uses standard rasterization to draw proxy hulls (boxes), then raymarches inside the fragment shader to find the exact SDF surface.
@@ -18,6 +40,10 @@ Detailed system architecture for the 64k demo project.
**Effect**: Abstract base for visual elements. Supports `compute` and `render` phases.
+**PostProcessEffect**: Subclass for full-screen post-processing effects.
+
+**SDFEffect**: Subclass for SDF raymarching effects with camera management (see SDF Camera System below).
+
**Sequence**: Timeline of effects with start/end times defined in beats.
**MainSequence**: Top-level coordinator and framebuffer manager.
diff --git a/doc/CONTRIBUTING.md b/doc/CONTRIBUTING.md
index d7ef88a..7fbfd64 100644
--- a/doc/CONTRIBUTING.md
+++ b/doc/CONTRIBUTING.md
@@ -65,10 +65,14 @@ See `doc/CODING_STYLE.md` for detailed examples.
## Development Protocols
### Adding Visual Effect
-1. Create effect class files (use `tools/shadertoy/convert_shadertoy.py` or templates)
+
+**For SDF/raymarching effects:** Use `SDFEffect` base class (see `doc/SDF_EFFECT_GUIDE.md`).
+
+**For standard effects:**
+1. Create effect class files (each effect should have its own `.h` and `.cc` file, e.g., `src/effects/my_effect.h` and `src/effects/my_effect.cc`). Use `tools/shadertoy/convert_shadertoy.py` or templates.
2. Add shader to `workspaces/main/assets.txt`
3. Add effect `.cc` file to `CMakeLists.txt` GPU_SOURCES (both sections)
-4. Include header in `src/gpu/demo_effects.h`
+4. Include header in `src/gpu/demo_effects.h` (which now serves as a central include for all individual effect headers)
5. Add to workspace `timeline.seq` (e.g., `workspaces/main/timeline.seq`)
6. **Update `src/tests/gpu/test_demo_effects.cc`**:
- Add to `post_process_effects` list (lines 80-93) or `scene_effects` list (lines 125-137)
diff --git a/doc/EFFECT_WORKFLOW.md b/doc/EFFECT_WORKFLOW.md
index 22b8dc9..57cf904 100644
--- a/doc/EFFECT_WORKFLOW.md
+++ b/doc/EFFECT_WORKFLOW.md
@@ -10,6 +10,8 @@ Automated checklist for adding new visual effects to the demo.
**For ShaderToy conversions:** Use `tools/shadertoy/convert_shadertoy.py` then follow steps 3-8 below.
+**For SDF/raymarching effects:** See `doc/SDF_EFFECT_GUIDE.md` for streamlined workflow using SDFEffect base class.
+
**For custom effects:** Follow all steps 1-8.
---
@@ -18,6 +20,8 @@ Automated checklist for adding new visual effects to the demo.
### 1. Create Effect Files
+**Description:** Each visual effect must have its own dedicated header (`.h`) and implementation (`.cc`) file pair.
+
**Location:**
- Header: `src/effects/<effect_name>_effect.h`
- Implementation: `src/effects/<effect_name>_effect.cc`
@@ -84,7 +88,7 @@ SHADER_TUNNEL, NONE, shaders/tunnel.wgsl, "Tunnel effect shader"
# In normal section (line ~183):
src/effects/solarize_effect.cc
- src/effects/tunnel_effect.cc # <-- Add here
+ src/effects/tunnel.cc # <-- Add here
src/effects/chroma_aberration_effect.cc
```
@@ -92,7 +96,7 @@ SHADER_TUNNEL, NONE, shaders/tunnel.wgsl, "Tunnel effect shader"
**File:** `src/gpu/demo_effects.h`
-**Action:** Add include directive:
+**Action:** `src/gpu/demo_effects.h` now acts as a central include file. Add a single include directive for your new effect's header:
```cpp
#include "effects/<effect_name>_effect.h"
```
diff --git a/doc/SDF_EFFECT_GUIDE.md b/doc/SDF_EFFECT_GUIDE.md
new file mode 100644
index 0000000..fba80e7
--- /dev/null
+++ b/doc/SDF_EFFECT_GUIDE.md
@@ -0,0 +1,164 @@
+# SDF Effect Guide
+
+Streamlined workflow for SDF raymarching effects using the `SDFEffect` base class.
+
+---
+
+## Quick Start
+
+```cpp
+// src/effects/my_sdf_effect.h
+class MySDFEffect : public SDFEffect {
+ public:
+  explicit MySDFEffect(const GpuContext& ctx);
+  void render(WGPURenderPassEncoder pass,
+              const CommonPostProcessUniforms& uniforms) override;
+
+ private:
+  RenderPass pass_;
+};
+```
+
+```cpp
+// src/effects/my_sdf_effect.cc
+#include "effects/my_sdf_effect.h"
+#include "gpu/gpu.h"
+#include "gpu/shaders.h"
+
+MySDFEffect::MySDFEffect(const GpuContext& ctx) : SDFEffect(ctx) {
+ ResourceBinding bindings[] = {
+ {uniforms_.get(), WGPUBufferBindingType_Uniform},
+ {camera_params_.get(), WGPUBufferBindingType_Uniform}};
+ pass_ = gpu_create_render_pass(ctx_.device, ctx_.format,
+ my_sdf_shader_wgsl, bindings, 2);
+ pass_.vertex_count = 3;
+}
+
+void MySDFEffect::render(WGPURenderPassEncoder pass,
+ const CommonPostProcessUniforms& uniforms) {
+ uniforms_.update(ctx_.queue, uniforms);
+
+ // Orbiting camera
+ vec3 cam_pos(std::cos(uniforms.time * 0.5f) * 5.0f, 2.0f,
+ std::sin(uniforms.time * 0.5f) * 5.0f);
+ update_camera(cam_pos, vec3(0, 0, 0), vec3(0, 1, 0), 0.785398f, 0.1f, 100.0f,
+ uniforms.aspect_ratio);
+
+ wgpuRenderPassEncoderSetPipeline(pass, pass_.pipeline);
+ wgpuRenderPassEncoderSetBindGroup(pass, 0, pass_.bind_group, 0, nullptr);
+ wgpuRenderPassEncoderDraw(pass, pass_.vertex_count, 1, 0, 0);
+}
+```
+
+```wgsl
+// workspaces/main/shaders/my_sdf.wgsl
+#include "common_uniforms"
+#include "camera_common"
+#include "math/sdf_shapes"
+#include "render/raymarching"
+
+@group(0) @binding(0) var<uniform> uniforms: CommonUniforms;
+@group(0) @binding(1) var<uniform> camera: CameraParams;
+
+fn df(p: vec3<f32>) -> f32 {
+ return sdSphere(p, 1.0);
+}
+
+@vertex
+fn vs_main(@builtin(vertex_index) vid: u32) -> @builtin(position) vec4<f32> {
+ let x = f32((vid & 1u) << 2u) - 1.0;
+ let y = f32((vid & 2u) << 1u) - 1.0;
+ return vec4<f32>(x, y, 0.0, 1.0);
+}
+
+@fragment
+fn fs_main(@builtin(position) pos: vec4<f32>) -> @location(0) vec4<f32> {
+ let uv = (pos.xy / uniforms.resolution - 0.5) * 2.0;
+ let ray = getCameraRay(camera, uv);
+ let t = rayMarch(ray.origin, ray.direction, 0.0);
+
+ var col = vec3<f32>(0.1);
+ if (t < MAX_RAY_LENGTH) {
+ let hit_pos = ray.origin + ray.direction * t;
+ let n = normal(hit_pos);
+ col = vec3<f32>(n * 0.5 + 0.5);
+ }
+ return vec4<f32>(col, 1.0);
+}
+```
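The bit tricks in `vs_main` above generate a single triangle that covers the whole screen. A CPU-side mirror of the two expressions (a sketch for verification only) shows the three vertices land at (-1,-1), (3,-1), and (-1,3), which encloses the full [-1,1] clip-space square:

```cpp
#include <cassert>
#include <cstdint>

struct Vec2 { float x, y; };

// Mirrors the WGSL expressions in vs_main for vertex indices 0..2.
Vec2 fullscreen_vertex(uint32_t vid) {
  float x = static_cast<float>((vid & 1u) << 2u) - 1.0f;  // -1 or 3
  float y = static_cast<float>((vid & 2u) << 1u) - 1.0f;  // -1 or 3
  return {x, y};
}
```

This is why `pass_.vertex_count = 3` suffices: one oversized triangle avoids the diagonal seam of a two-triangle quad.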
+
+---
+
+## Available Uniforms
+
+### CommonUniforms (binding 0)
+- `resolution`: vec2 (screen size)
+- `time`: float (physical seconds)
+- `beat_time`: float (musical beats)
+- `beat_phase`: float (0-1 within beat)
+- `audio_intensity`: float (peak)
+- `aspect_ratio`: float
+
+### CameraParams (binding 1)
+- `inv_view`: mat4x4 (inverse view matrix)
+- `fov`: float (vertical FOV in radians)
+- `near_plane`, `far_plane`: float
+- `aspect_ratio`: float
+
+---
+
+## WGSL Helpers
+
+From `camera_common.wgsl`:
+
+```wgsl
+fn getCameraRay(cam: CameraParams, uv: vec2<f32>) -> Ray;
+fn getCameraPosition(cam: CameraParams) -> vec3<f32>;
+fn getCameraForward(cam: CameraParams) -> vec3<f32>;
+fn getCameraUp(cam: CameraParams) -> vec3<f32>;
+fn getCameraRight(cam: CameraParams) -> vec3<f32>;
+```
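The actual body of `getCameraRay` is not shown here; the following CPU sketch illustrates one common convention it could use, assuming column-major `inv_view` with right/up/back/position in columns 0-3 and a virtual image plane at distance 1 (these conventions are assumptions, not taken from `camera_common.wgsl`):

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

// Hypothetical CPU mirror of getCameraRay(): fan a ray out from the camera
// through the point (u * tan(fov/2) * aspect, v * tan(fov/2), -1) in
// camera space, then rotate into world space via the inv_view basis.
Vec3 camera_ray_dir(const float inv_view[16], float fov, float aspect,
                    float u, float v) {
  const float* right = &inv_view[0];
  const float* up = &inv_view[4];
  const float* back = &inv_view[8];  // camera looks along -back
  float t = std::tan(fov * 0.5f);
  Vec3 d = {u * t * aspect * right[0] + v * t * up[0] - back[0],
            u * t * aspect * right[1] + v * t * up[1] - back[1],
            u * t * aspect * right[2] + v * t * up[2] - back[2]};
  float len = std::sqrt(d.x * d.x + d.y * d.y + d.z * d.z);
  return {d.x / len, d.y / len, d.z / len};
}
```

With an identity `inv_view`, the center ray (uv = 0,0) points straight down -Z, matching a camera at the origin looking forward.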
+
+From `render/raymarching.wgsl`:
+
+```wgsl
+fn rayMarch(ro: vec3<f32>, rd: vec3<f32>, initt: f32) -> f32;
+fn normal(pos: vec3<f32>) -> vec3<f32>;
+fn shadow(lp: vec3<f32>, ld: vec3<f32>, mint: f32, maxt: f32) -> f32;
+```
+
+From `math/sdf_shapes.wgsl`:
+
+```wgsl
+fn sdSphere(p: vec3<f32>, r: f32) -> f32;
+fn sdBox(p: vec3<f32>, b: vec3<f32>) -> f32;
+fn sdTorus(p: vec3<f32>, t: vec2<f32>) -> f32;
+fn sdPlane(p: vec3<f32>, n: vec3<f32>, h: f32) -> f32;
+```
+
+---
+
+## Camera Control
+
+```cpp
+// Method 1: Manual values
+update_camera(position, target, up, fov, near, far, aspect);
+
+// Method 2: Camera object
+Camera cam;
+cam.position = vec3(0, 5, 10);
+cam.target = vec3(0, 0, 0);
+update_camera(cam, uniforms.aspect_ratio);
+```
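Under the hood, both `update_camera` variants boil down to building the camera-to-world (inverse view) matrix from position/target/up. A minimal sketch of that construction (illustrative only; the real helper lives in `sdf_effect.h` and may differ) also shows why `getCameraPosition` can read the camera position straight out of column 3:

```cpp
#include <cassert>
#include <cmath>

struct V3 { float x, y, z; };

static V3 sub(V3 a, V3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static V3 cross(V3 a, V3 b) {
  return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}
static V3 norm(V3 a) {
  float l = std::sqrt(a.x * a.x + a.y * a.y + a.z * a.z);
  return {a.x / l, a.y / l, a.z / l};
}

// Column-major camera-to-world matrix: right, up, back, position.
void make_inv_view(V3 pos, V3 target, V3 up, float out[16]) {
  V3 back = norm(sub(pos, target));   // camera looks along -back
  V3 right = norm(cross(up, back));
  V3 cam_up = cross(back, right);
  float m[16] = {right.x,  right.y,  right.z,  0,
                 cam_up.x, cam_up.y, cam_up.z, 0,
                 back.x,   back.y,   back.z,   0,
                 pos.x,    pos.y,    pos.z,    1};  // column 3: position
  for (int i = 0; i < 16; ++i) out[i] = m[i];
}
```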
+
+---
+
+## Registration Checklist
+
+1. Add shader to `workspaces/main/assets.txt`
+2. Add extern declaration to `src/gpu/shaders.h`
+3. Add definition to `src/gpu/shaders.cc`
+4. Add `.cc` to `cmake/DemoSourceLists.cmake` (both headless & normal)
+5. Include header in `src/gpu/demo_effects.h`
+6. Add to `src/tests/gpu/test_demo_effects.cc`
+
+---
+
+## Example: workspaces/main/shaders/sdf_test.wgsl
diff --git a/doc/UNIFORM_BUFFER_GUIDELINES.md b/doc/UNIFORM_BUFFER_GUIDELINES.md
index 93999d8..c6cf9c8 100644
--- a/doc/UNIFORM_BUFFER_GUIDELINES.md
+++ b/doc/UNIFORM_BUFFER_GUIDELINES.md
@@ -16,11 +16,17 @@ Structs are padded to the alignment of their largest member. Any trailing space
## Standard Uniform Buffer Pattern
-To maintain consistency and facilitate efficient rendering, a standard pattern for uniform buffer usage is established:
+To maintain consistency and facilitate efficient rendering, standard patterns for uniform buffer usage are established:
+### Post-Process Effects
- **Binding 0 & 1:** Reserved for Sampler and Texture access (handled by `pp_update_bind_group`).
-- **Binding 2:** **Common Uniforms** (`CommonPostProcessUniforms` or similar). This buffer should contain frequently used data like resolution, aspect ratio, physical time, beat time, beat phase, and audio intensity.
-- **Binding 3:** **Effect-Specific Parameters**. This buffer holds parameters unique to a particular effect (e.g., `strength`, `speed`, `fade_amount`).
+- **Binding 2:** **Common Uniforms** (`CommonPostProcessUniforms`). Contains resolution, aspect ratio, physical time, beat time, beat phase, audio intensity.
+- **Binding 3:** **Effect-Specific Parameters**. Unique per-effect data (e.g., `strength`, `speed`, `fade_amount`).
+
+### SDF/Raymarching Effects
+- **Binding 0:** **Common Uniforms** (`CommonPostProcessUniforms`). Same as above.
+- **Binding 1:** **Camera Parameters** (`CameraParams`). Camera transform and projection data for raymarching.
+- **Binding 2+:** **Effect-Specific Parameters** (optional).
This pattern ensures that common data is shared efficiently across effects, while effect-specific data remains isolated.
@@ -98,10 +104,36 @@ struct GaussianBlurParams {
float strength = 2.0f;
float _pad = 0.0f;
};
-static_assert(sizeof(GaussianBlurParams) == 8,
+static_assert(sizeof(GaussianBlurParams) == 8,
"GaussianBlurParams must be 8 bytes for WGSL alignment");
```
+**Example (C++ CameraParams):**
+
+```cpp
+struct CameraParams {
+ mat4 inv_view; // 64 bytes - inverse view matrix (screen→world)
+ float fov; // 4 bytes - vertical field of view (radians)
+ float near_plane; // 4 bytes - near clipping plane
+ float far_plane; // 4 bytes - far clipping plane
+ float aspect_ratio; // 4 bytes - width/height ratio
+};
+static_assert(sizeof(CameraParams) == 80,
+ "CameraParams must be 80 bytes for WGSL alignment");
+```
+
+**Corresponding WGSL:**
+
+```wgsl
+struct CameraParams {
+ inv_view: mat4x4<f32>, // 64 bytes
+ fov: f32, // 4 bytes
+ near_plane: f32, // 4 bytes
+ far_plane: f32, // 4 bytes
+ aspect_ratio: f32, // 4 bytes
+}
+```
+
## Handling Common Pitfalls
- **`vec3<f32>` Padding:** Avoid using `vec3<f32>` for padding in WGSL, as it has a 16-byte alignment. If padding is needed, use `vec2<f32>` for 8 bytes or individual `f32`s for 4-byte alignment.
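The pitfall is concrete on the C++ side too: WGSL sizes a struct up to a multiple of its largest member's alignment, so a lone `vec3<f32>` occupies 16 bytes, while a trailing `f32` slots into that same tail. A small sketch (the `LightParams`/`DirOnly` structs are hypothetical, for illustration only) shows the matching C++ layouts:

```cpp
#include <cassert>
#include <cstddef>

// Mirrors WGSL struct { dir: vec3<f32>, intensity: f32 }: the f32 occupies
// the vec3's 4-byte tail slot, so no explicit padding is needed.
struct LightParams {
  float dir[3];     // 12 bytes
  float intensity;  // 4 bytes, fills the tail of the 16-byte slot
};
static_assert(sizeof(LightParams) == 16,
              "LightParams must be 16 bytes for WGSL alignment");

// Mirrors WGSL struct { dir: vec3<f32> }: WGSL rounds the struct size up
// to 16, so the C++ side must pad explicitly.
struct DirOnly {
  float dir[3];  // 12 bytes
  float _pad;    // 4 bytes, required to reach WGSL's 16-byte size
};
static_assert(sizeof(DirOnly) == 16,
              "DirOnly must be 16 bytes for WGSL alignment");
```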