Diffstat (limited to 'tools')
-rw-r--r--  tools/cnn_test.cc                          2
-rw-r--r--  tools/cnn_v2_test/README.md              251
-rw-r--r--  tools/cnn_v2_test/index.html            2049
-rw-r--r--  tools/common/style.css                   117
-rw-r--r--  tools/shader_editor/index.html            42
-rw-r--r--  tools/spectral_editor/index.html           1
-rw-r--r--  tools/spectral_editor/style.css          106
-rw-r--r--  tools/timeline_editor/README.md           13
-rw-r--r--  tools/timeline_editor/index.html         956
-rw-r--r--  tools/timeline_editor/timeline-playback.js  322
-rw-r--r--  tools/timeline_editor/timeline-viewport.js  170
-rw-r--r--  tools/track_visualizer/index.html         21
12 files changed, 1180 insertions, 2870 deletions
diff --git a/tools/cnn_test.cc b/tools/cnn_test.cc
index 7d060ae..137d235 100644
--- a/tools/cnn_test.cc
+++ b/tools/cnn_test.cc
@@ -5,7 +5,7 @@
#error "cnn_test requires STRIP_ALL=OFF (tool builds only)"
#endif
-#include "effects/cnn_effect.h"
+#include "../cnn_v1/src/cnn_v1_effect.h"
#include "generated/assets.h"
#include "gpu/bind_group_builder.h"
#include "gpu/gpu.h"
diff --git a/tools/cnn_v2_test/README.md b/tools/cnn_v2_test/README.md
deleted file mode 100644
index d41a00f..0000000
--- a/tools/cnn_v2_test/README.md
+++ /dev/null
@@ -1,251 +0,0 @@
-# CNN v2 Testing Tool
-
-WebGPU-based browser tool for testing trained CNN v2 weights.
-
----
-
-## Features
-
-- Drag-drop PNG images and `.bin` weights (or click to browse)
-- Real-time CNN inference with WebGPU compute shaders
-- View modes: CNN output, original input, difference (×10)
-- Adjustable blend amount and depth
-- Data-driven pipeline (supports variable layer count)
-- GPU timing display
-- **Left Panel:** Weights info + kernel visualization (1px/weight, all layers)
-- **Right Panel:** Layer activation viewer with 4-channel split + 4× zoom
-
----
-
-## Requirements
-
-- Browser with WebGPU support:
- - Chrome/Edge 113+ (enable `chrome://flags/#enable-unsafe-webgpu` if needed)
- - Safari 18+ (macOS Ventura+)
-- Trained CNN v2 weights in binary format (`.bin`)
-- Test images (PNG format)
-
----
-
-## Usage
-
-### 1. Open Tool
-
-```bash
-open tools/cnn_v2_test/index.html
-```
-
-Or use a local server to avoid CORS:
-```bash
-python3 -m http.server 8000
-# Open http://localhost:8000/tools/cnn_v2_test/
-```
-
-### 2. Load Data
-
-1. **Drop `.bin` weights** into left sidebar zone (or click to browse)
-2. **Drop PNG image** anywhere in center canvas area
-3. CNN runs automatically once both are loaded
-
-### 3. Layout
-
-**Left Sidebar:**
-- Weights drop zone (click or drag-drop `.bin` files)
-- Weights info panel (layer specs, ranges, file size)
-- Weights visualization (click Layer 0/1/2 buttons)
- - 1 pixel per weight, all input channels horizontally
- - Output channels (Out 0-3) stacked vertically
-
-**Center Canvas:**
-- Main output view (CNN result, original, or diff)
-- Keyboard: `SPACE` = original, `D` = diff (×10)
-
-**Right Sidebar:**
-- Layer selection buttons (Static 0-3/4-7, Layer 0/1/2)
-- 4 small activation views (Ch0/1/2/3) in a row
-- Large zoom view below (4× magnification, follows mouse)
-
-**Header Controls:**
-- **Blend:** Mix between original (0.0) and CNN output (1.0)
-- **Depth:** Uniform depth value for all pixels (0.0–1.0)
-- **View:** Current display mode
-
-**Footer:**
-- Status: GPU timing (ms), image dimensions, view mode
-- Console: Timestamped event log (file loads, errors)
-
----
-
-## Preparing Test Data
-
-### Export Weights
-
-```bash
-# From trained checkpoint
-./training/export_cnn_v2_weights.py \
- checkpoints/checkpoint_epoch_100.pth \
- --output-weights tools/cnn_v2_test/test_weights.bin
-```
-
-Binary format: 16-byte header + 20 bytes per layer + f16 weights (~3.2 KB for 3-layer model)
-
-### Test Images
-
-Use training images or any PNG:
-```bash
-# Copy test image
-cp training/input/test.png tools/cnn_v2_test/
-```
-
-**Note:** Grayscale images are automatically converted to RGB.
-
----
-
-## Validation
-
-### Visual Comparison
-
-Compare browser output with C++ tool:
-
-```bash
-# Generate C++ output
-./build/cnn_test training/input/test.png /tmp/cpp_output.png
-
-# Load same image in browser tool
-# Visually compare outputs
-```
-
-### GPU Timing
-
-Expected performance:
-- 512×512: ~1-2 ms (integrated GPU)
-- 1024×1024: ~3-5 ms
-- 1920×1080: ~5-8 ms
-
-Slower than expected? Check:
-- WebGPU enabled in browser
-- Dedicated GPU selected (if available)
-- No background tabs consuming GPU
-
----
-
-## Troubleshooting
-
-### "WebGPU not supported"
-
-- Update browser to latest version
-- Enable WebGPU flag: `chrome://flags/#enable-unsafe-webgpu`
-- Try Safari 18+ (native WebGPU on macOS)
-
-### "Invalid .bin file"
-
-- Check magic number: `hexdump -C weights.bin | head`
-- Should start with: `43 4e 4e 32` ('CNN2')
-- Re-export weights: `./training/export_cnn_v2_weights.py`
-
-### Black output / incorrect colors
-
-- Check blend slider (set to 1.0 for full CNN output)
-- Verify training converged (loss < 0.01)
-- Compare with C++ tool output
-
-### Shader compilation errors
-
-Open browser console (F12) for detailed errors. Common issues:
-- Image too large (>4096×4096 not tested)
-- Unsupported texture format (rare on modern GPUs)
-
----
-
-## Architecture
-
-**Pipeline:**
-1. **Static Features Pass** - Generate 8D features (RGBD, UV, sin, bias)
-2. **CNN Layer Passes** - Compute N layers with ping-pong textures
-3. **Display Pass** - Unpack and render with view mode
-
-**Textures:**
-- Input: RGBA8 (original image)
-- Depth: R32F (uniform depth)
-- Static features: RGBA32Uint (8×f16 packed)
-- Layer buffers: RGBA32Uint (ping-pong)
-
-**Data-Driven Execution:**
-- Layer count read from binary header
-- Per-layer params (kernel size, channels, offsets) from binary
-- Single CNN shader dispatched N times
-
----
-
-## Implemented Features
-
-**✓ Weights Metadata Panel:**
-- Layer descriptions (kernel size, channels, weight count)
-- Weight statistics (min/max per layer)
-- File size and layer count
-
-**✓ Weights Visualization:**
-- Per-layer kernel heatmaps (1px/weight)
-- All input channels displayed horizontally
-- Output channels stacked vertically
-- Normalized grayscale display
-
-**✓ Layer Activation Viewer:**
-- Static features (8D split into 0-3 and 4-7 views)
-- All CNN layer outputs (Layer 0/1/2...)
-- 4-channel split view (grayscale per channel)
-- Mouse-driven 4× zoom view
-
-## TODO
-
-**Future Enhancements:**
-- Weight distribution histograms per layer
-- Activation statistics (min/max/mean overlay)
-- Side-by-side diff mode (browser vs C++ output)
-- Export rendered layers as PNG
-
----
-
-## Extensions (v2+)
-
-Planned enhancements:
-
-**Variable Feature Count:**
-- Binary v2: Add `num_features` to header
-- Shader: Dynamic feature array or multiple textures
-
-**Multi-Scale Input (Mip Levels):**
-- Uncomment mip bindings in static shader
-- No binary format change needed
-
-**8-bit Quantized Weights:**
-- Binary version bump (format field already present)
-- Add quantization codepath in `get_weight()` function
-- 2× size reduction (~1.6 KB)
-
-**Pre-defined Test Images:**
-- Dropdown menu with training/input/*.png
-- Requires local file server
-
----
-
-## Size
-
-- HTML structure: ~2 KB
-- CSS styling: ~2 KB
-- JavaScript logic: ~10 KB (includes zoom + weights viz)
-- Static shader: ~1 KB
-- CNN shader: ~3 KB
-- Display shader: ~1 KB
-- Layer viz shader: ~2 KB
-- Zoom shader: ~1 KB
-- **Total: ~22 KB** (single file, no dependencies)
-
----
-
-## See Also
-
-- `doc/CNN_V2.md` - Architecture and design
-- `doc/HOWTO.md` - Training workflows
-- `training/export_cnn_v2_weights.py` - Binary format
-- `src/effects/cnn_v2_effect.cc` - C++ reference implementation
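
The `.bin` layout documented above (a 16-byte header starting with the `CNN2` magic, 20 bytes per layer, then packed f16 weights) can be sanity-checked before loading it into the browser tool. The following is a minimal sketch: the field order after the magic (version, layer count, total weight count) and the contents of the per-layer record (mirroring the shader's `LayerParams`: kernel size, input channels, output channels, weight offset, weight count) are assumptions inferred from the tool's code, not a confirmed format spec.

```python
import struct

MAGIC = b"CNN2"  # bytes 43 4e 4e 32, per the README's hexdump check

def parse_cnn2(buf: bytes):
    """Parse the hypothesized CNN2 weight layout: 16-byte header,
    20-byte per-layer records, then the f16 weight payload."""
    if buf[:4] != MAGIC:
        raise ValueError("not a CNN2 weights file")
    # Assumed header fields after the magic: version, layer count,
    # total f16 weight count (field order is a guess, not confirmed).
    version, num_layers, total_weights = struct.unpack_from("<3I", buf, 4)
    layers = []
    offset = 16
    for _ in range(num_layers):
        # Assumed per-layer record, mirroring the shader's LayerParams.
        ks, cin, cout, woff, wcount = struct.unpack_from("<5I", buf, offset)
        layers.append({"kernel_size": ks, "in_channels": cin,
                       "out_channels": cout, "weight_offset": woff,
                       "weight_count": wcount})
        offset += 20
    return version, layers

# Synthetic example: one 3x3 layer, 12 -> 4 channels = 432 weights.
header = MAGIC + struct.pack("<3I", 2, 1, 432)
layer = struct.pack("<5I", 3, 12, 4, 0, 432)
version, layers = parse_cnn2(header + layer + b"\x00" * (432 * 2))
print(version, layers[0]["kernel_size"], layers[0]["in_channels"])
```

A parser like this is also a quick way to diagnose the "Invalid .bin file" error described in the README's troubleshooting section, since it fails loudly on a missing magic rather than producing black output.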
diff --git a/tools/cnn_v2_test/index.html b/tools/cnn_v2_test/index.html
deleted file mode 100644
index e226d0c..0000000
--- a/tools/cnn_v2_test/index.html
+++ /dev/null
@@ -1,2049 +0,0 @@
-<!DOCTYPE html>
-<html lang="en">
-<!--
- CNN v2 Testing Tool - WebGPU-based inference validator
-
- Architecture:
- - Static features (8D): p0-p3 (parametric), uv_x, uv_y, sin(20*uv_y), bias (NOT a CNN layer)
- - Layer 0: input RGBD (4D) + static (8D) = 12D → 4 channels
- - Layer 1+: previous layer (4D) + static (8D) = 12D → 4 channels
- - All CNN layers: uniform 12D input, 4D output (ping-pong buffer)
-
- Naming convention (matches train_cnn_v2.py / .wgsl / .cc):
- - UI shows: "Static 0-3", "Static 4-7", "Layer 0", "Layer 1", "Layer 2"
- - weights.layers[] array: Layer 0 = weights.layers[0], Layer 1 = weights.layers[1]
-
- Features:
- - Input: PNG images or video files (MP4, WebM, etc.)
- - Video playback: Play/Pause, frame-by-frame navigation (◄/► buttons)
- - Video mode: Non-realtime processing (drops frames if CNN slower than playback)
- - Side panel: .bin metadata display, weight statistics per layer
- - Layer inspection: 4-channel grayscale split, intermediate layer visualization
- - View modes: CNN output, original, diff (×10)
- - Optimization: Layer viz updates only on pause/seek during video playback
-
- WGSL Shader Reuse:
- - CNN_SHADER (inference), STATIC_SHADER, LAYER_VIZ_SHADER are inline for single-file deployment
- - Can extract to .wgsl files for: better IDE support, testing, cross-tool reuse
- - Tradeoff: extraction needs fetch() or build step, breaks single-file portability
- - C++ sync: manual (WGSL ≠ GLSL) but logic identical
--->
-<head>
- <meta charset="UTF-8">
- <meta name="viewport" content="width=device-width, initial-scale=1.0">
- <title>CNN v2 Testing Tool</title>
- <style>
- * { margin: 0; padding: 0; box-sizing: border-box; }
- body {
- font-family: 'Courier New', monospace;
- background: #1a1a1a;
- color: #e0e0e0;
- display: flex;
- flex-direction: column;
- height: 100vh;
- overflow: hidden;
- }
- .header {
- background: #2a2a2a;
- padding: 16px;
- border-bottom: 1px solid #404040;
- display: flex;
- align-items: center;
- gap: 24px;
- flex-wrap: wrap;
- }
- h1 { font-size: 18px; }
- .controls {
- display: flex;
- gap: 16px;
- align-items: center;
- flex-wrap: wrap;
- }
- .control-group {
- display: flex;
- gap: 8px;
- align-items: center;
- }
- .control-group label { font-size: 12px; }
- input[type="range"] { width: 120px; }
- input[type="number"] { width: 60px; background: #1a1a1a; color: #e0e0e0; border: 1px solid #404040; padding: 4px; }
- .drop-zone {
- border: 3px dashed #606060;
- padding: 20px;
- text-align: center;
- cursor: pointer;
- transition: all 0.2s;
- font-size: 13px;
- font-weight: bold;
- background: #252525;
- border-radius: 6px;
- color: #4a9eff;
- }
- button {
- background: #1a1a1a;
- border: 1px solid #404040;
- color: #e0e0e0;
- padding: 6px 12px;
- font-size: 12px;
- font-family: 'Courier New', monospace;
- cursor: pointer;
- transition: all 0.2s;
- border-radius: 4px;
- }
- button:hover { border-color: #606060; background: #252525; }
- button:disabled { opacity: 0.3; cursor: not-allowed; }
- video { display: none; }
- .drop-zone:hover { border-color: #4a9eff; background: #2a3545; }
- .drop-zone.active { border-color: #4a9eff; background: #1a2a3a; }
- .drop-zone.error { border-color: #ff4a4a; background: #3a1a1a; }
- .content {
- flex: 1;
- display: flex;
- overflow: hidden;
- gap: 1px;
- background: #404040;
- }
- .left-sidebar {
- width: 315px;
- background: #2a2a2a;
- overflow-y: auto;
- display: flex;
- flex-direction: column;
- gap: 16px;
- padding: 16px;
- }
- .main {
- flex: 1;
- display: flex;
- justify-content: center;
- align-items: center;
- padding: 24px;
- overflow: auto;
- position: relative;
- background: #1a1a1a;
- }
- .video-controls-float {
- position: absolute;
- top: 16px;
- left: 50%;
- transform: translateX(-50%);
- display: flex;
- gap: 8px;
- background: rgba(42, 42, 42, 0.95);
- padding: 8px 12px;
- border-radius: 4px;
- border: 1px solid #404040;
- z-index: 100;
- }
- .bottom-controls-float {
- position: absolute;
- bottom: 16px;
- left: 50%;
- transform: translateX(-50%);
- display: flex;
- gap: 16px;
- align-items: center;
- background: rgba(42, 42, 42, 0.95);
- padding: 8px 16px;
- border-radius: 4px;
- border: 1px solid #404040;
- z-index: 100;
- }
- .bottom-controls-float .control-group {
- display: flex;
- gap: 8px;
- align-items: center;
- }
- .bottom-controls-float #videoControls {
- display: flex;
- gap: 8px;
- align-items: center;
- padding-right: 16px;
- border-right: 1px solid #404040;
- }
- .main.drop-active::after {
- content: 'Drop PNG/video here';
- position: absolute;
- inset: 24px;
- display: flex;
- align-items: center;
- justify-content: center;
- border: 3px dashed #4a9eff;
- background: rgba(74, 158, 255, 0.1);
- font-size: 24px;
- color: #4a9eff;
- pointer-events: none;
- z-index: 10;
- }
- .sidebar {
- width: 400px;
- background: #2a2a2a;
- overflow-y: auto;
- display: flex;
- flex-direction: column;
- gap: 16px;
- padding: 16px;
- }
- .panel {
- border: 1px solid #404040;
- border-radius: 4px;
- overflow: hidden;
- }
- .panel.collapsed .panel-content {
- display: none;
- }
- .panel-header {
- background: #1a1a1a;
- padding: 8px 12px;
- font-size: 12px;
- font-weight: bold;
- border-bottom: 1px solid #404040;
- }
- .panel-content {
- padding: 12px;
- font-size: 11px;
- }
- .panel-content table {
- width: 100%;
- border-collapse: collapse;
- }
- .panel-content th {
- text-align: left;
- padding: 4px;
- font-size: 10px;
- color: #808080;
- border-bottom: 1px solid #404040;
- }
- .panel-content td {
- padding: 4px;
- font-size: 10px;
- }
- .panel-content tr:hover {
- background: #1a1a1a;
- }
- .layer-buttons {
- display: flex;
- flex-wrap: wrap;
- gap: 6px;
- margin-bottom: 12px;
- }
- .layer-buttons button {
- background: #1a1a1a;
- border: 1px solid #404040;
- color: #e0e0e0;
- padding: 6px 12px;
- font-size: 10px;
- font-family: 'Courier New', monospace;
- cursor: pointer;
- transition: all 0.2s;
- }
- .layer-buttons button:hover {
- border-color: #606060;
- background: #252525;
- }
- .layer-buttons button.active {
- background: #4a9eff;
- border-color: #4a9eff;
- color: #1a1a1a;
- }
- .layer-buttons button:disabled {
- opacity: 0.3;
- cursor: not-allowed;
- }
- .layer-buttons button:disabled:hover {
- border-color: #404040;
- background: #1a1a1a;
- }
- .layer-grid {
- display: grid;
- grid-template-columns: repeat(4, 1fr);
- gap: 4px;
- margin-bottom: 12px;
- }
- .layer-view {
- aspect-ratio: 1;
- background: #1a1a1a;
- border: 1px solid #404040;
- display: flex;
- flex-direction: column;
- overflow: hidden;
- }
- .layer-preview {
- background: #1a1a1a;
- border: 1px solid #404040;
- display: flex;
- flex-direction: column;
- overflow: hidden;
- margin-top: 8px;
- }
- .layer-preview canvas {
- width: 100%;
- height: 100%;
- image-rendering: pixelated;
- }
- .layer-view.active {
- border: 2px solid #ffffff;
- }
- .layer-view canvas {
- cursor: pointer;
- }
- .layer-view-label {
- background: #2a2a2a;
- padding: 4px;
- font-size: 9px;
- text-align: center;
- border-bottom: 1px solid #404040;
- }
- .layer-view canvas {
- width: 100%;
- height: 100%;
- image-rendering: pixelated;
- }
- canvas {
- max-width: 100%;
- max-height: 100%;
- image-rendering: pixelated;
- box-shadow: 0 4px 12px rgba(0,0,0,0.5);
- }
- .footer {
- background: #2a2a2a;
- border-top: 1px solid #404040;
- font-size: 11px;
- display: flex;
- flex-direction: column;
- gap: 8px;
- }
- .footer-top {
- padding: 12px 16px 0;
- display: flex;
- justify-content: space-between;
- }
- .status { color: #4a9eff; }
- .shortcuts { color: #808080; }
- .console {
- background: #1a1a1a;
- padding: 8px 16px;
- font-family: 'Courier New', monospace;
- font-size: 10px;
- color: #808080;
- max-height: 100px;
- overflow-y: auto;
- border-top: 1px solid #404040;
- }
- .console-line { margin: 2px 0; }
- .console-line.error { color: #ff4a4a; }
- .console-line.info { color: #4a9eff; }
- </style>
-</head>
-<body>
- <div class="header">
- <h1>CNN v2 Testing Tool</h1>
- </div>
- <video id="videoSource" muted loop></video>
- <div class="content">
- <div class="left-sidebar">
- <input type="file" id="weightsFile" accept=".bin" style="display: none;">
- <div class="drop-zone" id="weightsDrop" onclick="document.getElementById('weightsFile').click()">
- Drop .bin Weights or Click to Browse
- </div>
- <div class="panel" id="weightsInfoPanel">
- <div class="panel-header">Weights Info</div>
- <div class="panel-content" id="weightsInfo">
- <p style="color: #808080; text-align: center;">No weights loaded</p>
- </div>
- </div>
- <div class="panel" id="weightsVizPanel" style="display: none;">
- <div class="panel-header">Weights Visualization</div>
- <div class="panel-content" id="weightsViz">
- <div class="layer-buttons" id="weightsLayerButtons"></div>
- <canvas id="weightsCanvas" style="width: 100%; image-rendering: pixelated; border: 1px solid #404040;"></canvas>
- </div>
- </div>
- <div class="panel">
- <div class="panel-content">
- <label for="mipLevel" style="font-size: 11px;">Mip Level:</label>
- <select id="mipLevel" style="width: 100%; background: #1a1a1a; color: #e0e0e0; border: 1px solid #404040; padding: 4px; margin-top: 4px;">
- <option value="0">Mip 0 (original)</option>
- <option value="1">Mip 1 (half res)</option>
- <option value="2">Mip 2 (quarter res)</option>
- </select>
- </div>
- </div>
- </div>
- <div class="main" id="mainDrop">
- <div class="bottom-controls-float">
- <div id="videoControls">
- <button id="playPauseBtn" disabled>Play</button>
- <button id="stepBackBtn" disabled>◄ Frame</button>
- <button id="stepForwardBtn" disabled>Frame ►</button>
- </div>
- <div class="control-group">
- <label>Blend:</label>
- <input type="range" id="blend" min="0" max="1" step="0.01" value="1.0">
- <span id="blendValue">1.0</span>
- </div>
- <div class="control-group">
- <label>Depth:</label>
- <input type="range" id="depth" min="0" max="1" step="0.01" value="1.0">
- <span id="depthValue">1.0</span>
- </div>
- <button id="savePngBtn">Save PNG</button>
- </div>
- <canvas id="canvas"></canvas>
- </div>
- <div class="sidebar">
- <div class="panel" style="flex: 1; display: flex; flex-direction: column; min-height: 0;">
- <div class="panel-header">Layer Visualization</div>
- <div class="panel-content" id="layerViz" style="flex: 1; overflow: hidden;">
- <p style="color: #808080; text-align: center;">Load image + weights</p>
- </div>
- </div>
- </div>
- </div>
- <div class="footer">
- <div class="footer-top">
- <span class="status" id="status">Drop PNG/video anywhere to begin</span>
- <span class="shortcuts">[SPACE] Original | [D] Diff (×10)</span>
- </div>
- <div class="console" id="console"></div>
- </div>
-
- <script>
-// ============================================================================
-// EMBEDDED WEIGHTS & CONSTANTS
-// ============================================================================
-
-// Default pre-trained weights (base64-encoded binary format)
-// Version 2: 4 layers (3×3, 5×5, 3×3, 3×3), 2496 f16 weights, mip_level=2
-const DEFAULT_WEIGHTS_B64 = 'Q05OMgIAAAAEAAAAwAkAAAIAAAADAAAADAAAAAQAAAAAAAAAsAEAAAUAAAAMAAAABAAAALABAACwBAAAAwAAAAwAAAAEAAAAYAYAALABAAADAAAADAAAAAQAAAAQCAAAsAEAAAU3faplMDmtR7gnMLqt6bSrLM4RCa/En4q257kVsmWz57aSHJMxz6wILJC0tLdBriWww7IULUehCClCo60dBiu1nWqsf60ZKn6ktCWKjrswATSfLwQunzJjKKWkN6hxLTMwbS2DJvgvUjFDL1YsQDFFL78ysC5OL/cvxC2kJ6qh0i1BLH2rzCrcKFUoeixTqwwopjD+rXmewCY6sYUtXCwwsaKqGjBcqoykKigRJYStaqjMp+siPi1BLI+tGatfK5Ii6C1qLY0tYSGFKz4wpzNdH1QuJDKmMJi0lLVAs0y2Q7YWtY21fLXusf+n8LDSsaethK3drB4rtSROKYOrLK53qrqu0REYLEUuVy1qEqohDSzgqk4sDKKSKi0clKcVKvupJ69rKTmw8q7qptatQK7OsFUw5Z5JKJ4udSp9LLQeui87LbcxljEgJ6Iw75jDLfUvIjCxnh0g763Lq/ItMqzDqP0sXCRcqnkl9qDlJUStSyR8oTuwA616IrAnNqo5JS4qDKeILmahyaHZI48tryiajuEs0aghLBcuny+aovQpAhj6Kqkwdy+8MZ0wLzBvKBStsrRAKJez+raaKAotBiVSqZqyk7b2sHO1e7cJsfGmQLACpWizBLP9LnWxYLWoJPeb/CY5ISokXqynJ4qtG6K1qpesL6zGqYssIDJRpnErRi3RL9kh1zBFLPkdGSNvKtEuvyywmgilbC43LNovbywCKj4pFzEbMmMuly2gMFYscCgzliIomSqZnpSnyK3hJJKsAasgJGMrfCyNqXwpqaYNq14wiyzWLrSn/yLbqm+tnauOpkKtRKdCrBcYQS0dnGAveqeBrD8sMiGpLkAugzEaLM6lLzAkL5YydzYnqGo15zh2MuSwJK0nqxI04jZ5LAs2TjilNeSc3yANLecrCzBCprUvfjUHMWCuFrAkItyq/an0JSUnvKnrrAosv5CRrTGvQKesntuur6v2rsyxzbCAsHYn1y5GrAGsASYUmawrpSLooRSy86sBqmaxAq67sD0lJalOKxOtkqx8H+wqgygMLhup8SzNKZuhcafWKUKs567KI1opDCsoplatAykJpc+skavUrK4p2iznLlMqcig4Le6mDKiaJpIsMiOgLGOtQqI7sFGworKfsTOq86ZIlru0dLCEoMqq4KzsI6I2MzixMocqSym8MwQtT7Njqrwy26rEthe2nTGxL/Gq+az8MPg1Tq6EqXmslqyArkKs/S73MqEwmyuzrUUxejLhKYaw0yUlMzgxAZULsZ4rhq8ssgarCjDTrPop0ywBLswwjbT7MMAxdq2fsEC04DZoOIovG7G4LwM1gTNnKDsuEbByrzyxvLLBKJgkGDQANSMy66wVrM21ebURriAluK5quFa3wLBsK2wvaDU7OEg3RDGWKVUzpTfPNG+tbrGcr3ytRKosr7yuCbB2rV6gZq3msWmtjqvmoNurP6YXrOIpf6l/J2irl6/iqK2jy6MCLkkhjSDQoAWWACo1JrWjP6nvKvmthay+KJ6rUqoKqaatHKyJrUOarydBo5yu/CUaKFoxFCW1CNgpri2WK02kgqvYqkotwqlIrdiiEa1aKZ2tXa6mrkax4KkYKp2vcKgErYsi2RvbqWapU6EAnMyqtyPBpYwdZyVZkwGl1yhhJ2QBPaUJqMmMJJ54IikpcqmUHzmacCDzq1Cr3yR9n8aizKlWKFiogapBFlknrimnHmemDqbVKHciNRyII5AsxZ0+Lf0Xmyh7LMIqDS2KK9EkxyxRHKgp2iL9K0QfxCwGLLEuwiqrLcWob6xpppasp6+lotypGrC9qdmpPKUuplagES2cpSyrsS
yHJTMi3Kk4KWAlSCaqKNMtR626rKaoj6koI1wqeivGI9cpuqQ9KQUkZyEJKOmquyW0JymirSjhprWgkBpKLFykzZyloWSrNKxrGaCtMi1MqL6t56lLqu+wbbTetYkqYDR1rB0wqir/sWQwNas8N9E4wq+9I6WwT6xuMDy1yC9tM/Kwka+btK8vJisnIJWeUa30LRkwDaqIsNqzWK9lLnEzKjEMqYMuWy8uMs0qI6xKLjcvxicEqYCv06zrrLusKK/lMeMz8CyCMmqxO7AtNpW38zFzL5i2Wq19tkCuBaTlt8Kv85Mlsg6wWLfgstutzDJVNAqZxCywrQgspDYOMS0mGbQCuf63QS7GJ4GsBLizuRS0mKyiKKMkBbLXseCufCr4qKUpah7Vqh8tV6eqLLQoGy1bMNEu6i4fMD4wZSvbjwOpmCBzLMmeJKddoYqkIic6qpqRY6nNqDiwIq5dqcmndqbnKnGkSCjmKBUsriySrHWsZyTaG7smSKxAIwolIi2zLX6unK5KqXCwKq03qyarcKWMqQmmd6tIodWtH6UvLg2tTadPJOOp2iGgny0ufyy+L7AvNClhpiEpC6qMqqMp7KTopJ4mmB2ylM6mrKhfKiQrTyiiKdGoQqjKJ6Umxip/qDiq/ChgKtmqIiwOr+CunZF7Kfot36poqkcthCx+Ksapg5T5pn0oNqOPq4osMSbSqQQmGqgXKhEl3yV1piyswazLK7QoQBTaqU8lIS13Ldch+qQqJ2AsPKfmp3Ink5Z2HhosR5z4qLIoGqkNLCct2Ck3KPGnUC0oJBQq7agOKyaq0qsqpAap8SylLg4qriy6M3MqKCtdKpMjSi86KigsGCz/n2erEyu7J/QRVCkpILUwcC35LI8qxiw6Knoq5jAAKo8wnieqLF0vVTAYMZw4Jyx2t/ayTjGWMoGzKbwus1w4QRxeJse1dTGSNJGwmCrEJV8uQKygKe4gjSqkrLeydiaMroS0FrQms8Uygi28qe2uXS2Ko4q1d7ZxszEpiDSBMoc0STWpNc0xJKSvrMWm6bCKsOC3CrEOJNC1Ga5Qubi7U6/+NRQ0AqnSuFoySDmKtJS0b7KcNAMmqi45IbMvGzjeMg2qSioPKVWtSK6EpaA1UTckMt2m16nwM5E2oDHBsZ+pniVpMc4vQy1epXkqHifBl7Mu36T/KzQorix4JAOmWyqJFVUqq67doiot2CxYME8i2JxVKhQt5ioYJsWp1KiSpL0lhq1JpWAgbCweKW2o1CrCIMsrcghkHUqW3hiTI5osYqMlB+WaLy0uKNUooKx4qdEezqRlJEapyKuUoEmoZyT7nqcoo6v3n4yqZaGcpNElwij3IkinQiAFIFQK2ygqIoKsiZxEI6ukqCf7KFSkgqSTqjEq8JZLJPufXKmFkaEj36lCKj2qURxfKkQouaqQhRIrGSmepKin7Cl8KEcuKI+ip4Evz6xIF0woVK/yHLyfLSj0ny+oWywSJHWmQaEomWos6ZTMpPWlY61pqLelZqYGpAidcyzQE5kneBr1pnQkJSwIqWYpIabdKA8oHKroGeCnYplOKzAmC51LJ0emp6o+rXAofCkCKV4w4x1sKCYjrKAgKa0r+BcPJDMmP6o2JW4pIqqtm4srTqgHlLWlsBBepaqrKq27rBat9aTlot8qkaw2o5sl76ivKDkjNyjzKKWY5KlHrQCr8SjxquarXqrlKB2xyyfZL1Sqq7LWpxA04zZwMkyvUiyHMig1ay+GJqenVq1Ao1awVLHQnrEqxTD/LO8kKB+NH1grfKsPsY6u+aIELLaj4LBmLBU0wDOlM8ksdKjbqPSqQykHJmYodC+WMcYuSCJ7psYvNDTaLqWw/qy7Myw4xjTnMIouQTV9OJ81YSlbLiIx3TVuMUcokrDzI0ow8CQQr9IvDyxsLnk0OTVhLmmobLAULN4zkyyZsGC0LK01L3Upw52Jroywlix0MCwr5qkQJkot9aWzsYuui66HrHykMa
9ZsDet96yBqXWvXbAXsraxIqgpsVOvtq5frF+iZa2WqROwcaP+qX2w+aW3rxWpI7Bwrlqu5K0LrxexX7DUrfOvhK3QrUGwP7BrsY2tU6yWr8qkpK18rn2rHCbloYmfaqM1nfSr7Sn1qjuk2KT2qyem4KXJJ4MdxaidqPWsa58zKTSsoKXAJUymz6rJpv+oGKsOJo2hSicHqA4oOiiRmr4k0BxBq8Ui16jTKvyq7ijmqHcpZanhHnGfMikxIiEk7S4Yq90sfKWSoZyntKg/qh+nJiifnAyvlKeXJMIdViKeoxEjLKvZpXymAqkhraCofK5SnTGmLqdkq7mjYCD8qV0qQKo0qrUo+KsZKVSs0iaULFUI8qS0mlWtiiqbGBegACwBoAErhaW1qMwqHSxfKVKpp6x7poiweKxCrdkivK48sJewrKdArHYnqyhoHbUnsagYK58qSjAgMcUwsCt0K/4rLC7mJGwtvStOMFQu0SzuJQUsBTBMLswqcJyEnVQsESn3ox2z9ai/qFqwES7tKP0vSChMoqQwVzR4LKaT+y/NK06q2y0LIi2wHrIcKZuzsrSHn/6xkrPssAovJzEipEQiDbDjr3SqIis5LGIoOSm6p1apeqGGrtAqJzCIJRuptqrApiktWTAwMB4xQizXKoIgASFFsLwweTHbLdQtqyzXoKYtay3SLeOke6wgoPWr/SpFKUEmDacWptSoMChJKm6s6azkHe+mfzFKKyamfi6bK/wr5atPqEMxUTAlKSeueiRxoSQjQqxQLRavgauKriOssymXLZOooa97pFoufTSppqgoVq05tEg196yCsQIy7bEitAItJ7RgtUEzxjGML/QmEKIlrPgjPDFaoTYoPDFcJRavtK4XrKmsk6zjsCwsTa4UsPQs9jI/I3ct1C6cMV+b5y7wJZ0tYTF9MGojdS/oLTShziM/MVmnxC8FKJUwRCUxIz8wiS4QLWipLCCYq9EseabMKnEll6kPqIawRq+xGcgjyCkgqKed7SB6qZcr6CwJLW+st6ePq7WuHycUrhqsSq7zsKuZtimgCXCrmKkqnIGp4LHNsX2wnqyBsH2xIbDhpwCzra1ss44wTCypKDCyyK23LRiwYKKPMJmxcaqZKcshCCYipoyxNa1Nsbwozi1+MB8lQ5mtsDel3jDnlbutxiPzsWmp5SpTHaqys7EstauTPqoRsOosf6g3sLOgeaAfKUIsWi/BJdosUSzdMM4pSy3kpGM0DjWvLWw0cjR4MWWqQaYMLo2rZSijJjstZiFaLBadMq0TseyjYi0VGsQt8yo5oZCgti/HMLciM6r3KgMk8K6OqKup9q0srT0xcaWMMMwra67qrhSfsZ3GrrIj2a2+pqSvdrEcrRQ0IDhgMB+PCDWVM8qjnJ5ZKOmw4C0dMGyuG6DGMQUvrq+Oq4UsTSzHMRg2ibbXs+Axa7N5sAqqnSoerQUmky8oKIiuUjGsoBitdKy9q6iw661pqg4thKnpkYmt+a3gseypGp5Co22fM6YSKJap66hwopmsmqhlrCMkZyiLL4KnGKupKvUmyCQbLFUrbSZerKahlaRoqCYm5SqYKW0rcS8WrAUkzaMcGlqpRK3bnresXy18IXapEKqHKFssXKCpKMUrfamapf4tKjBiKJGoU54HK+8q5qq4qVuiZiy4JuEsTixNMFQnlSSIIw4k1KzxpbMlDqyKqz6gra4SpcOw3a3Vq+qqC6tOq22eORvnpC8hRadkka2q/K7HHUiowawpqPInLyA0qYMlsihUqGGkWCb7K1WdWK5Dr5EhnKv5KHKlXqYnJ/2l9i0YKUYuMzHxpyCs/ChMkPEtwanxoFQqJi3Uq7Mseq3arXskWKc5pOAc7CZcqCwc5w7qKO4f3iaKIDsq/KRgLpWsQqn5rYYkxCWPoU0bx6hzGdkkqibtofEoxy8GpUupSCTiKiwvpij7LbiulqkErXetejFkL2+upqtUp0OwiLAPsdCpxLIlrKOyQ7
C2r3utIg0drZEl2y6oLkquoaX4rCysAa9GDRCwKrHDsNivAbHsqtioqiGvrqgJE66Kqw4rzKyDKgaomp6TK2EsDyc0oOSol6NZJkmsvyxorMss5pR0KBquEixPpjsgXCpsnXQocq2MrfGmoivvLBeacahmLROpe6kcGCSfdC03qL6i6yitHHohrxzqq4UiP6JMqF8qThOshWAVUqHupDsoohQuJSkv/ywqLiwlNjG7o++hxi3vIKmleCdyrH6wYatdsPWsjLCNol+sSTDpryCptbBDK+qs4zBpLGc0Nqc1rdo09jX5MqsrHi2xKOad8igwJxAoeSsiqgkqdChcLOYxJzGlMkAsUzCuKzskTjAOKhuplqjHqf8wzDKYIGefNDISqd8pIC23Ltwu7zC9KgMsQDL/JcgrryYzLJ0oTSoyqpkmLax+KuejVyqxr08ulZ2XpyQr5yxRsEMpwzD0KmEqoihRC6mwF6xOplwmjSSmpMep0SvhpOEndCluqLyvtCGgo3unOyy9IXKtmZ9yIK8hlqohrEUtxh0XKH0sGi18p6coHa3Tow6psqa/JRUMU6yiKbUoXigQpo2i7C18q3ur6CnWrSateC3/KY+jlCJ6o6qr+x8VJUkSFadyAgGpji0xraytBSd+rYksTqDAHQAtxSjkqMAmNqxhqNesEi5uKsqlFqo9Kg6seizOrdusAasErjmtoKv8rb8ph6cYLnMmcKlCLJ6pjiuIKpkpKK1UKvyq3RhVpZac+izlrYitWB+DrI4omKOZKikiZS1Fqicf+q25rJmsqKrYrNGt0JWRLWel2KfLqQ==';
-
-// Reusable fullscreen quad vertex shader (2 triangles covering NDC)
-const FULLSCREEN_QUAD_VS = `
-@vertex
-fn vs_main(@builtin(vertex_index) idx: u32) -> @builtin(position) vec4<f32> {
- var pos = array<vec2<f32>, 6>(
- vec2<f32>(-1.0, -1.0), vec2<f32>(1.0, -1.0), vec2<f32>(-1.0, 1.0),
- vec2<f32>(-1.0, 1.0), vec2<f32>(1.0, -1.0), vec2<f32>(1.0, 1.0)
- );
- return vec4<f32>(pos[idx], 0.0, 1.0);
-}`;
-
-// ============================================================================
-// WGSL SHADERS
-// ============================================================================
-
-// Static features: 8D parametric features (RGBD + UV + sin(20*uv_y) + bias)
-const STATIC_SHADER = `
-@group(0) @binding(0) var input_tex: texture_2d<f32>;
-@group(0) @binding(1) var linear_sampler: sampler;
-@group(0) @binding(2) var depth_tex: texture_2d<f32>;
-@group(0) @binding(3) var output_tex: texture_storage_2d<rgba32uint, write>;
-@group(0) @binding(4) var<uniform> mip_level: u32;
-
-@compute @workgroup_size(8, 8)
-fn main(@builtin(global_invocation_id) id: vec3<u32>) {
- let coord = vec2<i32>(id.xy);
- let dims = textureDimensions(input_tex);
- if (coord.x >= i32(dims.x) || coord.y >= i32(dims.y)) { return; }
-
- // Use normalized UV coords with linear sampler (bilinear filtering)
- let uv = (vec2<f32>(coord) + 0.5) / vec2<f32>(dims);
- let rgba = textureSampleLevel(input_tex, linear_sampler, uv, f32(mip_level));
-
- let p0 = rgba.r;
- let p1 = rgba.g;
- let p2 = rgba.b;
- let p3 = textureLoad(depth_tex, coord, 0).r;
-
- let uv_x = f32(coord.x) / f32(dims.x);
- let uv_y = f32(coord.y) / f32(dims.y);
- let sin20_y = sin(20.0 * uv_y);
- let bias = 1.0;
-
- let packed = vec4<u32>(
- pack2x16float(vec2<f32>(p0, p1)),
- pack2x16float(vec2<f32>(p2, p3)),
- pack2x16float(vec2<f32>(uv_x, uv_y)),
- pack2x16float(vec2<f32>(sin20_y, bias))
- );
- textureStore(output_tex, coord, packed);
-}`;
-
-const CNN_SHADER = `
-struct LayerParams {
- kernel_size: u32,
- in_channels: u32,
- out_channels: u32,
- weight_offset: u32,
- is_output_layer: u32,
- blend_amount: f32,
- is_layer_0: u32,
-}
-
-@group(0) @binding(0) var static_features: texture_2d<u32>;
-@group(0) @binding(1) var layer_input: texture_2d<u32>;
-@group(0) @binding(2) var output_tex: texture_storage_2d<rgba32uint, write>;
-@group(0) @binding(3) var<storage, read> weights_buffer: array<u32>;
-@group(0) @binding(4) var<uniform> params: LayerParams;
-@group(0) @binding(5) var original_input: texture_2d<f32>;
-
-fn unpack_static_features(coord: vec2<i32>) -> array<f32, 8> {
- let packed = textureLoad(static_features, coord, 0);
- let v0 = unpack2x16float(packed.x);
- let v1 = unpack2x16float(packed.y);
- let v2 = unpack2x16float(packed.z);
- let v3 = unpack2x16float(packed.w);
- return array<f32, 8>(v0.x, v0.y, v1.x, v1.y, v2.x, v2.y, v3.x, v3.y);
-}
-
-fn unpack_layer_channels(coord: vec2<i32>) -> vec4<f32> {
- let packed = textureLoad(layer_input, coord, 0);
- let v0 = unpack2x16float(packed.x);
- let v1 = unpack2x16float(packed.y);
- return vec4<f32>(v0.x, v0.y, v1.x, v1.y);
-}
-
-fn pack_channels(values: vec4<f32>) -> vec4<u32> {
- return vec4<u32>(
- pack2x16float(vec2<f32>(values.x, values.y)),
- pack2x16float(vec2<f32>(values.z, values.w)),
- 0u,
- 0u
- );
-}
-
-fn get_weight(idx: u32) -> f32 {
- let pair_idx = idx / 2u;
- let packed = weights_buffer[pair_idx];
- let unpacked = unpack2x16float(packed);
- return select(unpacked.y, unpacked.x, (idx & 1u) == 0u);
-}
-
-@compute @workgroup_size(8, 8)
-fn main(@builtin(global_invocation_id) id: vec3<u32>) {
- let coord = vec2<i32>(id.xy);
- let dims = textureDimensions(static_features);
- if (coord.x >= i32(dims.x) || coord.y >= i32(dims.y)) { return; }
-
- let kernel_size = params.kernel_size;
- let in_channels = params.in_channels; // Always 12 (4 prev + 8 static)
- let out_channels = params.out_channels; // Always 4
- let weight_offset = params.weight_offset;
- let is_output = params.is_output_layer != 0u;
- let kernel_radius = i32(kernel_size / 2u);
-
- let static_feat = unpack_static_features(coord);
-
- var output: vec4<f32> = vec4<f32>(0.0);
- for (var c: u32 = 0u; c < 4u; c++) {
- var sum: f32 = 0.0;
- for (var ky: i32 = -kernel_radius; ky <= kernel_radius; ky++) {
- for (var kx: i32 = -kernel_radius; kx <= kernel_radius; kx++) {
- let sample_coord = coord + vec2<i32>(kx, ky);
- let clamped = vec2<i32>(
- clamp(sample_coord.x, 0, i32(dims.x) - 1),
- clamp(sample_coord.y, 0, i32(dims.y) - 1)
- );
- let static_local = unpack_static_features(clamped);
- let layer_local = unpack_layer_channels(clamped);
-
- let ky_idx = u32(ky + kernel_radius);
- let kx_idx = u32(kx + kernel_radius);
- let spatial_idx = ky_idx * kernel_size + kx_idx;
-
- // Previous layer channels (4D)
- for (var i: u32 = 0u; i < 4u; i++) {
- let w_idx = weight_offset +
- c * in_channels * kernel_size * kernel_size +
- i * kernel_size * kernel_size + spatial_idx;
- sum += get_weight(w_idx) * layer_local[i];
- }
-
- // Static features (8D)
- for (var i: u32 = 0u; i < 8u; i++) {
- let w_idx = weight_offset +
- c * in_channels * kernel_size * kernel_size +
- (4u + i) * kernel_size * kernel_size + spatial_idx;
- sum += get_weight(w_idx) * static_local[i];
- }
- }
- }
-
- if (is_output || params.is_layer_0 != 0u) {
- output[c] = 1.0 / (1.0 + exp(-sum)); // Sigmoid [0,1]
- } else {
- output[c] = max(0.0, sum); // ReLU
- }
- }
-
- if (is_output) {
- let original = textureLoad(original_input, coord, 0).rgb;
- let result_rgb = vec3<f32>(output.x, output.y, output.z);
- let blended = mix(original, result_rgb, params.blend_amount);
- output.x = blended.r;
- output.y = blended.g;
- output.z = blended.b;
- }
-
- textureStore(output_tex, coord, pack_channels(output));
-}`;
-
-const DISPLAY_SHADER = `
-@group(0) @binding(0) var result_tex: texture_2d<u32>;
-@group(0) @binding(1) var original_tex: texture_2d<f32>;
-@group(0) @binding(2) var<uniform> mode: u32;
-
-@vertex
-fn vs_main(@builtin(vertex_index) idx: u32) -> @builtin(position) vec4<f32> {
- var pos = array<vec2<f32>, 6>(
- vec2<f32>(-1.0, -1.0), vec2<f32>(1.0, -1.0), vec2<f32>(-1.0, 1.0),
- vec2<f32>(-1.0, 1.0), vec2<f32>(1.0, -1.0), vec2<f32>(1.0, 1.0)
- );
- return vec4<f32>(pos[idx], 0.0, 1.0);
-}
-
-@fragment
-fn fs_main(@builtin(position) pos: vec4<f32>) -> @location(0) vec4<f32> {
- let coord = vec2<i32>(pos.xy);
- let packed = textureLoad(result_tex, coord, 0);
- let v0 = unpack2x16float(packed.x);
- let v1 = unpack2x16float(packed.y);
- let result = vec3<f32>(v0.x, v0.y, v1.x);
-
- if (mode == 0u) {
- return vec4<f32>(result, 1.0);
- } else if (mode == 1u) {
- let original = textureLoad(original_tex, coord, 0).rgb;
- return vec4<f32>(original, 1.0);
- } else {
- let original = textureLoad(original_tex, coord, 0).rgb;
- let diff = abs(result - original) * 10.0;
- return vec4<f32>(diff, 1.0);
- }
-}`;
-
-const LAYER_VIZ_SHADER = `
-@group(0) @binding(0) var layer_tex: texture_2d<u32>;
-@group(0) @binding(1) var<uniform> viz_params: vec2<f32>; // x=channel_idx, y=scale
-
-@vertex
-fn vs_main(@builtin(vertex_index) idx: u32) -> @builtin(position) vec4<f32> {
- var pos = array<vec2<f32>, 6>(
- vec2<f32>(-1.0, -1.0), vec2<f32>(1.0, -1.0), vec2<f32>(-1.0, 1.0),
- vec2<f32>(-1.0, 1.0), vec2<f32>(1.0, -1.0), vec2<f32>(1.0, 1.0)
- );
- return vec4<f32>(pos[idx], 0.0, 1.0);
-}
-
-@fragment
-fn fs_main(@builtin(position) pos: vec4<f32>) -> @location(0) vec4<f32> {
- let coord = vec2<i32>(pos.xy);
- let dims = textureDimensions(layer_tex);
-
- let channel = u32(viz_params.x);
-
- // DEBUG MODE 1: Texture coordinates (channel 10)
- if (channel == 10u) {
- let uv = vec2<f32>(f32(coord.x) / f32(dims.x), f32(coord.y) / f32(dims.y));
- return vec4<f32>(uv.x, uv.y, 0.0, 1.0);
- }
-
- let packed = textureLoad(layer_tex, coord, 0);
-
- // DEBUG MODE 2: Raw packed data (channel 11)
- if (channel == 11u) {
- let raw_val = f32(packed.x) / 4294967295.0;
- return vec4<f32>(raw_val, raw_val, raw_val, 1.0);
- }
-
- let v0 = unpack2x16float(packed.x);
- let v1 = unpack2x16float(packed.y);
- let v2 = unpack2x16float(packed.z);
- let v3 = unpack2x16float(packed.w);
-
- // DEBUG MODE 3: First unpacked value (channel 12)
- if (channel == 12u) {
- return vec4<f32>(v0.x, v0.x, v0.x, 1.0);
- }
-
- var channels: array<f32, 8>;
- channels[0] = v0.x;
- channels[1] = v0.y;
- channels[2] = v1.x;
- channels[3] = v1.y;
- channels[4] = v2.x;
- channels[5] = v2.y;
- channels[6] = v3.x;
- channels[7] = v3.y;
-
- let scale = viz_params.y;
-
- let idx = min(channel, 7u);
- let raw = channels[idx];
-
- // Apply scale: multiply and clamp to [0, 1]
- let val = clamp(raw * scale, 0.0, 1.0);
-
- return vec4<f32>(val, val, val, 1.0);
-}`;
-
-class CNNTester {
- constructor() {
- this.canvas = document.getElementById('canvas');
- this.status = document.getElementById('status');
- this.console = document.getElementById('console');
- this.image = null;
- this.video = document.getElementById('videoSource');
- this.weights = null;
- this.viewMode = 0;
- this.blendAmount = 1.0;
- this.depth = 1.0;
- this.currentLayerIdx = null;
- this.currentChannelOffset = null;
- this.isVideo = false;
- this.fps = 30;
- this.isProcessing = false;
- this.mipLevel = 0;
- this.selectedChannel = 0;
- this.init();
- }
-
- log(msg, type = 'info') {
- const line = document.createElement('div');
- line.className = `console-line ${type}`;
- line.textContent = `[${new Date().toLocaleTimeString()}] ${msg}`;
- this.console.appendChild(line);
- this.console.scrollTop = this.console.scrollHeight;
- }
-
- async init() {
- if (!navigator.gpu) {
- this.setStatus('WebGPU not supported', true);
- this.log('WebGPU not supported in this browser', 'error');
- return;
- }
-
- try {
- this.adapter = await navigator.gpu.requestAdapter();
- this.device = await this.adapter.requestDevice();
- this.context = this.canvas.getContext('webgpu');
- this.format = navigator.gpu.getPreferredCanvasFormat();
- this.log('WebGPU initialized successfully');
- } catch (e) {
- this.setStatus(`GPU init failed: ${e.message}`, true);
- this.log(`GPU initialization failed: ${e.message}`, 'error');
- }
- }
-
- setStatus(msg, isError = false) {
- this.status.textContent = msg;
- this.status.style.color = isError ? '#ff4a4a' : '#4a9eff';
- }
-
- // Get current source dimensions (video or image)
- getDimensions() {
- if (this.isVideo) {
- return { width: this.video.videoWidth, height: this.video.videoHeight };
- }
- return { width: this.image.width, height: this.image.height };
- }
-
- // Enable/disable video playback controls
- setVideoControlsEnabled(enabled) {
- ['playPauseBtn', 'stepBackBtn', 'stepForwardBtn'].forEach(id =>
- document.getElementById(id).disabled = !enabled
- );
- }
-
- parseWeights(buffer) {
- const view = new DataView(buffer);
-    const magic = view.getUint32(0, true);
-    if (magic !== 0x32_4E_4E_43) { // ASCII 'CNN2', read little-endian
-      throw new Error('Invalid .bin file (bad magic)');
-    }
-
- const version = view.getUint32(4, true);
- const numLayers = view.getUint32(8, true);
- const totalWeights = view.getUint32(12, true);
-
- // Version 2: added mip_level field (20-byte header)
- let mipLevel = 0;
- let headerSize = 16;
- if (version === 2) {
- mipLevel = view.getUint32(16, true);
- headerSize = 20;
- this.log(`Binary header: version=${version}, layers=${numLayers}, weights=${totalWeights}, mip_level=${mipLevel}`);
- } else if (version === 1) {
- this.log(`Binary header: version=${version}, layers=${numLayers}, weights=${totalWeights}`);
- } else {
- throw new Error(`Unsupported binary version: ${version}`);
- }
-
- const layers = [];
- for (let i = 0; i < numLayers; i++) {
- const offset = headerSize + i * 20;
- const layer = {
- kernelSize: view.getUint32(offset, true),
- inChannels: view.getUint32(offset + 4, true),
- outChannels: view.getUint32(offset + 8, true),
- weightOffset: view.getUint32(offset + 12, true),
- weightCount: view.getUint32(offset + 16, true),
- };
- layers.push(layer);
- this.log(` Layer ${i}: ${layer.inChannels}→${layer.outChannels}, kernel=${layer.kernelSize}×${layer.kernelSize}, weights=${layer.weightCount}`);
- }
-
- const weightsOffset = headerSize + numLayers * 20;
- const weights = new Uint32Array(buffer.slice(weightsOffset));
-
- // Calculate min/max per layer
- for (let i = 0; i < numLayers; i++) {
- const layer = layers[i];
- let min = Infinity, max = -Infinity;
- const startIdx = layer.weightOffset;
- const endIdx = startIdx + layer.weightCount;
-
- for (let j = startIdx; j < endIdx; j++) {
- const pairIdx = Math.floor(j / 2);
- const packed = weights[pairIdx];
- const unpacked = this.unpackF16(packed);
- const val = (j % 2 === 0) ? unpacked[0] : unpacked[1];
- min = Math.min(min, val);
- max = Math.max(max, val);
- }
-
- layer.min = min;
- layer.max = max;
- this.log(` Layer ${i} range: [${min.toFixed(4)}, ${max.toFixed(4)}]`);
- }
-
- let nonZero = 0;
- for (let i = 0; i < weights.length; i++) {
- if (weights[i] !== 0) nonZero++;
- }
- this.log(` Weight buffer: ${weights.length} u32 (${nonZero} non-zero)`);
-
- return { version, layers, weights, mipLevel, fileSize: buffer.byteLength };
- }
-
-  // Decode two IEEE 754 half-precision floats packed into one u32
-  // (low 16 bits first, matching WGSL's unpack2x16float).
-  unpackF16(packed) {
- const lo = packed & 0xFFFF;
- const hi = (packed >> 16) & 0xFFFF;
- const toFloat = (bits) => {
- const sign = (bits >> 15) & 1;
- const exp = (bits >> 10) & 0x1F;
- const frac = bits & 0x3FF;
- if (exp === 0) return (sign ? -1 : 1) * Math.pow(2, -14) * (frac / 1024);
- if (exp === 31) return frac ? NaN : (sign ? -Infinity : Infinity);
- return (sign ? -1 : 1) * Math.pow(2, exp - 15) * (1 + frac / 1024);
- };
- return [toFloat(lo), toFloat(hi)];
- }
-
- async loadImage(file) {
- const img = await createImageBitmap(file);
- this.image = img;
- this.isVideo = false;
- this.canvas.width = img.width;
- this.canvas.height = img.height;
- this.setVideoControlsEnabled(false);
- this.log(`Loaded image: ${file.name} (${img.width}×${img.height})`);
- if (this.weights) {
- this.setStatus(`Ready: ${img.width}×${img.height}`);
- this.run();
- } else {
- this.setStatus(`Image loaded (${img.width}×${img.height}) - drop .bin weights to process`);
- this.displayOriginal();
- }
- }
-
-  // Video loading: wait for metadata, then poll until the first frame is decoded (readyState >= 2)
- async loadVideo(file) {
- return new Promise((resolve, reject) => {
- this.video.src = URL.createObjectURL(file);
-
- this.video.onloadedmetadata = () => {
- const w = this.video.videoWidth;
- const h = this.video.videoHeight;
- if (w === 0 || h === 0) {
- reject(new Error('Video has invalid dimensions'));
- return;
- }
-
- this.isVideo = true;
- this.canvas.width = w;
- this.canvas.height = h;
-        this.fps = 30; // HTMLVideoElement exposes no frame-rate metadata; assume 30 fps
- this.log(`Loaded video: ${file.name} (${w}×${h}, ${this.video.duration.toFixed(1)}s)`);
- this.setVideoControlsEnabled(true);
-
- // Set up event handlers
- this.video.onpause = () => { document.getElementById('playPauseBtn').textContent = 'Play'; };
- this.video.onplay = () => { document.getElementById('playPauseBtn').textContent = 'Pause'; this.playbackLoop(); };
-
- // Wait for first frame to be decoded before displaying
- const displayFirstFrame = () => {
- this.video.onseeked = () => { if (!this.isProcessing) this.processVideoFrame(); };
- if (this.video.readyState >= 2) { // HAVE_CURRENT_DATA or better
- if (this.weights) {
- this.setStatus(`Ready: ${w}×${h}`);
- this.processVideoFrame().then(() => resolve());
- } else {
- this.setStatus(`Video loaded - drop .bin weights to process`);
- this.displayOriginal();
- resolve();
- }
- } else {
- setTimeout(displayFirstFrame, 50); // Poll until frame ready
- }
- };
-
- this.video.onseeked = displayFirstFrame;
- this.video.currentTime = 0;
- };
-
- this.video.onerror = () => reject(new Error('Failed to load video'));
- });
- }
-
-  // Video playback loop (not real-time; frames are skipped while CNN inference is in flight)
- playbackLoop() {
- if (this.video.paused || this.video.ended) return;
- if (!this.isProcessing) this.processVideoFrame();
- requestAnimationFrame(() => this.playbackLoop());
- }
-
- // Process current video frame through CNN pipeline
- async processVideoFrame() {
- if (!this.weights || this.isProcessing) return;
- this.isProcessing = true;
- await this.run();
- this.isProcessing = false;
- }
-
- // Video controls
- togglePlayPause() {
- this.video.paused ? this.video.play() : this.video.pause();
- }
-
- stepFrame(direction) {
- if (!this.isVideo) return;
- this.video.pause();
- this.video.currentTime = Math.max(0, Math.min(this.video.duration,
- this.video.currentTime + direction / this.fps));
- }
-
- async loadWeights(file) {
- const buffer = await file.arrayBuffer();
- this.weights = this.parseWeights(buffer);
- this.weightsBuffer = buffer;
- this.mipLevel = this.weights.mipLevel; // Set mip level from binary format
- this.log(`Loaded weights: ${file.name} (${this.weights.layers.length} layers, ${(buffer.byteLength/1024).toFixed(1)} KB)`);
-
- // Update UI dropdown to reflect loaded mip level
- const mipLevelSelect = document.getElementById('mipLevel');
- if (mipLevelSelect) {
- mipLevelSelect.value = this.mipLevel.toString();
- }
-
- this.updateWeightsPanel();
- if (this.image) {
- this.setStatus(`Ready: ${this.image.width}×${this.image.height}`);
- this.run();
- } else {
- this.setStatus('Weights loaded - drop PNG image to process');
- }
- }
-
- updateWeightsPanel() {
- const panel = document.getElementById('weightsInfo');
- const { version, layers, mipLevel, fileSize } = this.weights;
-
- let html = `
- <div style="margin-bottom: 12px;">
- <div><strong>File Size:</strong> ${(fileSize / 1024).toFixed(2)} KB</div>
- <div><strong>Version:</strong> ${version}</div>
- <div><strong>CNN Layers:</strong> ${layers.length}</div>
- <div><strong>Mip Level:</strong> ${mipLevel} (p0-p3 features)</div>
- <div style="font-size: 9px; color: #808080; margin-top: 4px;">Static features (input) + ${layers.length} conv layers</div>
- </div>
- <table>
- <thead>
- <tr>
- <th>Layer</th>
- <th>Size</th>
- <th>Weights</th>
- <th>Min</th>
- <th>Max</th>
- </tr>
- </thead>
- <tbody>
- `;
-
- // Display layers as "Layer 0", "Layer 1", etc. (matching codebase convention)
- for (let i = 0; i < layers.length; i++) {
- const l = layers[i];
- html += `
- <tr>
- <td>Layer ${i}</td>
- <td>${l.inChannels}→${l.outChannels} (${l.kernelSize}×${l.kernelSize})</td>
- <td>${l.weightCount}</td>
- <td>${l.min.toFixed(3)}</td>
- <td>${l.max.toFixed(3)}</td>
- </tr>
- `;
- }
-
- html += `
- </tbody>
- </table>
- `;
-
- panel.innerHTML = html;
-
- // Show weights visualization panel and create layer buttons
- const weightsVizPanel = document.getElementById('weightsVizPanel');
- weightsVizPanel.style.display = 'block';
-
- const weightsLayerButtons = document.getElementById('weightsLayerButtons');
- let buttonsHtml = '';
- for (let i = 0; i < layers.length; i++) {
- buttonsHtml += `<button onclick="tester.visualizeWeights(${i})" id="weightsBtn${i}">Layer ${i}</button>`;
- }
- weightsLayerButtons.innerHTML = buttonsHtml;
-
- // Auto-select first layer
- this.visualizeWeights(0);
- }
-
- generateMipmaps(texture, width, height) {
- if (!this.mipmapPipeline) {
- const mipmapShader = FULLSCREEN_QUAD_VS + `
- @group(0) @binding(0) var src: texture_2d<f32>;
- @fragment
- fn fs_main(@builtin(position) pos: vec4<f32>) -> @location(0) vec4<f32> {
- let coord = vec2<i32>(i32(pos.x) * 2, i32(pos.y) * 2);
- var sum = vec4<f32>(0.0);
- for (var y: i32 = 0; y < 2; y++) {
- for (var x: i32 = 0; x < 2; x++) {
- sum += textureLoad(src, coord + vec2<i32>(x, y), 0);
- }
- }
- return sum * 0.25;
- }
- `;
- this.mipmapPipeline = this.device.createRenderPipeline({
- layout: 'auto',
- vertex: { module: this.device.createShaderModule({ code: mipmapShader }), entryPoint: 'vs_main' },
- fragment: {
- module: this.device.createShaderModule({ code: mipmapShader }),
- entryPoint: 'fs_main',
- targets: [{ format: 'rgba8unorm' }]
- }
- });
- }
-
- const encoder = this.device.createCommandEncoder();
-
- for (let mip = 1; mip < 3; mip++) {
- const mipWidth = Math.max(1, width >> mip);
- const mipHeight = Math.max(1, height >> mip);
-
- const bindGroup = this.device.createBindGroup({
- layout: this.mipmapPipeline.getBindGroupLayout(0),
- entries: [
- { binding: 0, resource: texture.createView({ baseMipLevel: mip - 1, mipLevelCount: 1 }) }
- ]
- });
-
- const renderPass = encoder.beginRenderPass({
- colorAttachments: [{
- view: texture.createView({ baseMipLevel: mip, mipLevelCount: 1 }),
- loadOp: 'clear',
- storeOp: 'store'
- }]
- });
-
- renderPass.setPipeline(this.mipmapPipeline);
- renderPass.setBindGroup(0, bindGroup);
- renderPass.setViewport(0, 0, mipWidth, mipHeight, 0, 1);
- renderPass.draw(6);
- renderPass.end();
- }
-
- this.device.queue.submit([encoder.finish()]);
- }
-
- displayOriginal() {
- const source = this.isVideo ? this.video : this.image;
- if (!source || !this.device) return;
-
- const { width, height } = this.getDimensions();
- this.context.configure({ device: this.device, format: this.format });
-
- const inputTex = this.device.createTexture({
- size: [width, height],
- format: 'rgba8unorm',
- usage: GPUTextureUsage.TEXTURE_BINDING | GPUTextureUsage.COPY_DST | GPUTextureUsage.RENDER_ATTACHMENT
- });
-
- this.device.queue.copyExternalImageToTexture(
- { source: source },
- { texture: inputTex },
- [width, height]
- );
-
- const simpleShader = FULLSCREEN_QUAD_VS + `
- @group(0) @binding(0) var tex: texture_2d<f32>;
- @fragment
- fn fs_main(@builtin(position) pos: vec4<f32>) -> @location(0) vec4<f32> {
- let coord = vec2<i32>(pos.xy);
- return textureLoad(tex, coord, 0);
- }
- `;
-
- const pipeline = this.device.createRenderPipeline({
- layout: 'auto',
- vertex: { module: this.device.createShaderModule({ code: simpleShader }), entryPoint: 'vs_main' },
- fragment: {
- module: this.device.createShaderModule({ code: simpleShader }),
- entryPoint: 'fs_main',
- targets: [{ format: this.format }]
- }
- });
-
- const bindGroup = this.device.createBindGroup({
- layout: pipeline.getBindGroupLayout(0),
- entries: [{ binding: 0, resource: inputTex.createView() }]
- });
-
- const encoder = this.device.createCommandEncoder();
- const renderPass = encoder.beginRenderPass({
- colorAttachments: [{
- view: this.context.getCurrentTexture().createView(),
- loadOp: 'clear',
- storeOp: 'store'
- }]
- });
- renderPass.setPipeline(pipeline);
- renderPass.setBindGroup(0, bindGroup);
- renderPass.draw(6);
- renderPass.end();
-
- this.device.queue.submit([encoder.finish()]);
- }
-
- // Run CNN inference pipeline on current source (image or video frame)
- async run() {
- const t0 = performance.now();
- const source = this.isVideo ? this.video : this.image;
- if (!source) return;
- const { width, height } = this.getDimensions();
-
- this.context.configure({ device: this.device, format: this.format });
-
- // Create persistent input texture for original view with mipmaps
- if (this.inputTexture) this.inputTexture.destroy();
- this.inputTexture = this.device.createTexture({
- size: [width, height],
- format: 'rgba8unorm',
- mipLevelCount: 3,
- usage: GPUTextureUsage.TEXTURE_BINDING | GPUTextureUsage.COPY_DST | GPUTextureUsage.RENDER_ATTACHMENT
- });
-
- this.device.queue.copyExternalImageToTexture(
- { source: source },
- { texture: this.inputTexture, mipLevel: 0 },
- [width, height]
- );
-
- // Generate mipmaps
- this.generateMipmaps(this.inputTexture, width, height);
-
- const staticTex = this.device.createTexture({
- size: [width, height],
- format: 'rgba32uint',
- usage: GPUTextureUsage.STORAGE_BINDING | GPUTextureUsage.TEXTURE_BINDING | GPUTextureUsage.COPY_SRC
- });
-
- // Create one texture per layer output (static + all CNN layers)
- this.layerOutputs = [];
- const numLayers = this.weights.layers.length + 1; // +1 for static features
- const layerTextures = [];
- for (let i = 0; i < numLayers; i++) {
- layerTextures.push(this.device.createTexture({
- size: [width, height],
- format: 'rgba32uint',
- usage: GPUTextureUsage.STORAGE_BINDING | GPUTextureUsage.TEXTURE_BINDING | GPUTextureUsage.COPY_DST
- }));
- }
-
- // Ping-pong buffers for computation
- const computeTextures = [
- this.device.createTexture({
- size: [width, height],
- format: 'rgba32uint',
- usage: GPUTextureUsage.STORAGE_BINDING | GPUTextureUsage.TEXTURE_BINDING | GPUTextureUsage.COPY_SRC
- }),
- this.device.createTexture({
- size: [width, height],
- format: 'rgba32uint',
- usage: GPUTextureUsage.STORAGE_BINDING | GPUTextureUsage.TEXTURE_BINDING | GPUTextureUsage.COPY_SRC
- })
- ];
-
- const weightsGPU = this.device.createBuffer({
- size: this.weightsBuffer.byteLength,
- usage: GPUBufferUsage.STORAGE | GPUBufferUsage.COPY_DST
- });
- this.device.queue.writeBuffer(weightsGPU, 0, this.weightsBuffer);
- const staticPipeline = this.device.createComputePipeline({
- layout: 'auto',
- compute: { module: this.device.createShaderModule({ code: STATIC_SHADER }), entryPoint: 'main' }
- });
-
- const cnnPipeline = this.device.createComputePipeline({
- layout: 'auto',
- compute: { module: this.device.createShaderModule({ code: CNN_SHADER }), entryPoint: 'main' }
- });
-
- const displayPipeline = this.device.createRenderPipeline({
- layout: 'auto',
- vertex: { module: this.device.createShaderModule({ code: DISPLAY_SHADER }), entryPoint: 'vs_main' },
- fragment: {
- module: this.device.createShaderModule({ code: DISPLAY_SHADER }),
- entryPoint: 'fs_main',
- targets: [{ format: this.format }]
- }
- });
-
- const encoder = this.device.createCommandEncoder();
-
- const mipLevelBuffer = this.device.createBuffer({
- size: 4,
- usage: GPUBufferUsage.UNIFORM | GPUBufferUsage.COPY_DST
- });
- this.device.queue.writeBuffer(mipLevelBuffer, 0, new Uint32Array([this.mipLevel]));
-
-    if (!this.pointSampler) {
-      // Despite the name, this sampler uses linear (trilinear) filtering;
-      // it is bound to the static-feature pass alongside the mip-level uniform.
-      this.pointSampler = this.device.createSampler({
-        magFilter: 'linear',
-        minFilter: 'linear',
-        mipmapFilter: 'linear'
-      });
-    }
-
- // Extract depth from alpha channel (or 1.0 if no alpha)
- const depthTex = this.device.createTexture({
- size: [width, height, 1],
- format: 'r32float',
- usage: GPUTextureUsage.TEXTURE_BINDING | GPUTextureUsage.COPY_DST
- });
-
- // Read image data to extract alpha channel
- const tempCanvas = document.createElement('canvas');
- tempCanvas.width = width;
- tempCanvas.height = height;
- const tempCtx = tempCanvas.getContext('2d');
- tempCtx.drawImage(source, 0, 0, width, height);
- const imageData = tempCtx.getImageData(0, 0, width, height);
- const pixels = imageData.data;
-
- // Extract alpha channel (RGBA format: every 4th byte)
- const depthData = new Float32Array(width * height);
- for (let i = 0; i < width * height; i++) {
- depthData[i] = pixels[i * 4 + 3] / 255.0; // Alpha channel [0, 255] → [0, 1]
- }
-
- this.device.queue.writeTexture(
- { texture: depthTex },
- depthData,
- { bytesPerRow: width * 4 },
- [width, height, 1]
- );
-
- const staticBG = this.device.createBindGroup({
- layout: staticPipeline.getBindGroupLayout(0),
- entries: [
- { binding: 0, resource: this.inputTexture.createView() },
- { binding: 1, resource: this.pointSampler },
- { binding: 2, resource: depthTex.createView() }, // Depth from alpha (matches training)
- { binding: 3, resource: staticTex.createView() },
- { binding: 4, resource: { buffer: mipLevelBuffer } }
- ]
- });
-
- const staticPass = encoder.beginComputePass();
- staticPass.setPipeline(staticPipeline);
- staticPass.setBindGroup(0, staticBG);
- staticPass.dispatchWorkgroups(Math.ceil(width / 8), Math.ceil(height / 8));
- staticPass.end();
-
- // Copy static features to persistent storage (visualization index 0, shown as Static 0-3 / Static 4-7)
- encoder.copyTextureToTexture(
- { texture: staticTex },
- { texture: layerTextures[0] },
- [width, height]
- );
- this.layerOutputs.push(layerTextures[0]);
-
- let srcTex = staticTex;
- let dstTex = computeTextures[0];
-
- for (let i = 0; i < this.weights.layers.length; i++) {
- const layer = this.weights.layers[i];
- const isOutput = i === this.weights.layers.length - 1;
-
- // Calculate absolute weight offset in f16 units (add header offset)
- // Version 1: 4 u32 header, Version 2: 5 u32 header
- const headerSizeU32 = (this.weights.version === 1) ? 4 : 5;
- const headerOffsetU32 = headerSizeU32 + this.weights.layers.length * 5; // Header + layer info in u32
- const absoluteWeightOffset = headerOffsetU32 * 2 + layer.weightOffset; // Convert to f16 units
-
- const paramsData = new Uint32Array(7);
- paramsData[0] = layer.kernelSize;
- paramsData[1] = layer.inChannels;
- paramsData[2] = layer.outChannels;
- paramsData[3] = absoluteWeightOffset; // Use absolute offset
- paramsData[4] = isOutput ? 1 : 0;
- paramsData[6] = (i === 0) ? 1 : 0; // is_layer_0 flag
-
-      const paramsView = new Float32Array(paramsData.buffer);
-      paramsView[5] = this.blendAmount; // f32 blend_amount aliased into slot 5 of the u32 params buffer
-
- const paramsBuffer = this.device.createBuffer({
- size: 28,
- usage: GPUBufferUsage.UNIFORM | GPUBufferUsage.COPY_DST
- });
- this.device.queue.writeBuffer(paramsBuffer, 0, paramsData);
-
- const cnnBG = this.device.createBindGroup({
- layout: cnnPipeline.getBindGroupLayout(0),
- entries: [
- { binding: 0, resource: layerTextures[0].createView() },
- { binding: 1, resource: srcTex.createView() },
- { binding: 2, resource: dstTex.createView() },
- { binding: 3, resource: { buffer: weightsGPU } },
- { binding: 4, resource: { buffer: paramsBuffer } },
- { binding: 5, resource: this.inputTexture.createView() }
- ]
- });
-
- const cnnPass = encoder.beginComputePass();
- cnnPass.setPipeline(cnnPipeline);
- cnnPass.setBindGroup(0, cnnBG);
- cnnPass.dispatchWorkgroups(Math.ceil(width / 8), Math.ceil(height / 8));
- cnnPass.end();
-
- [srcTex, dstTex] = [dstTex, srcTex];
-
- // Copy CNN layer output to persistent storage for visualization
- // i=0: Layer 0 → layerTextures[1]
- // i=1: Layer 1 → layerTextures[2], etc.
- encoder.copyTextureToTexture(
- { texture: srcTex },
- { texture: layerTextures[i + 1] },
- [width, height]
- );
-
- // Always push layer outputs for visualization (including output layer)
- this.layerOutputs.push(layerTextures[i + 1]);
- }
-
- const modeBuffer = this.device.createBuffer({
- size: 4,
- usage: GPUBufferUsage.UNIFORM | GPUBufferUsage.COPY_DST
- });
- this.device.queue.writeBuffer(modeBuffer, 0, new Uint32Array([this.viewMode]));
-
- // Store result texture and display pipeline for view mode switching
- this.resultTexture = srcTex;
- this.displayPipeline = displayPipeline;
- this.modeBuffer = modeBuffer;
-
- const displayBG = this.device.createBindGroup({
- layout: displayPipeline.getBindGroupLayout(0),
- entries: [
- { binding: 0, resource: srcTex.createView() },
- { binding: 1, resource: this.inputTexture.createView() },
- { binding: 2, resource: { buffer: modeBuffer } }
- ]
- });
- this.displayBindGroup = displayBG;
-
- const renderPass = encoder.beginRenderPass({
- colorAttachments: [{
- view: this.context.getCurrentTexture().createView(),
- loadOp: 'clear',
- storeOp: 'store'
- }]
- });
- renderPass.setPipeline(displayPipeline);
- renderPass.setBindGroup(0, displayBG);
- renderPass.draw(6);
- renderPass.end();
-
- this.device.queue.submit([encoder.finish()]);
-
- // Wait for GPU to finish before visualizing layers
- await this.device.queue.onSubmittedWorkDone();
-
- const t1 = performance.now();
- const mode = ['CNN Output', 'Original', 'Diff (×10)'][this.viewMode];
- this.setStatus(`GPU: ${(t1-t0).toFixed(1)}ms | ${width}×${height} | ${mode}`);
- this.log(`Completed in ${(t1-t0).toFixed(1)}ms`);
-
- // Update layer visualization panel
- this.updateLayerVizPanel();
- }
-
- updateLayerVizPanel() {
- const panel = document.getElementById('layerViz');
-
- if (!this.layerOutputs || this.layerOutputs.length === 0) {
- panel.innerHTML = '<p style="color: #808080; text-align: center;">No layers to visualize</p>';
- return;
- }
-
- // Only rebuild panel structure if layer count changed
- const needsRebuild = !this.lastLayerCount || this.lastLayerCount !== this.layerOutputs.length;
-
- if (needsRebuild) {
- let html = '<div class="layer-buttons">';
- html += `<button onclick="tester.visualizeLayer(0, 0)" id="layerBtn0_0">Static 0-3</button>`;
- html += `<button onclick="tester.visualizeLayer(0, 4)" id="layerBtn0_4">Static 4-7</button>`;
-
- for (let i = 1; i < this.layerOutputs.length; i++) {
- const label = `Layer ${i - 1}`;
- html += `<button onclick="tester.visualizeLayer(${i})" id="layerBtn${i}">${label}</button>`;
- }
- html += `<button onclick="tester.saveCompositedLayer()" style="margin-left: 20px; background: #28a745;">Save Composited</button>`;
- html += '</div>';
-
- html += '<div class="layer-grid" id="layerGrid"></div>';
- html += '<div class="layer-preview"><div class="layer-view-label" id="previewLabel">Ch0</div><canvas id="previewCanvas"></canvas></div>';
-
- panel.innerHTML = html;
- this.log(`Layer visualization ready: ${this.layerOutputs.length} layers`);
- this.recreateCanvases();
- this.lastLayerCount = this.layerOutputs.length;
- }
-
- // Update current visualization
- if (this.currentLayerIdx !== null) {
- this.visualizeLayer(this.currentLayerIdx, this.currentChannelOffset || 0);
- } else {
- this.visualizeLayer(0, 0);
- }
- }
-
- recreateCanvases() {
- const grid = document.getElementById('layerGrid');
- if (!grid) return;
-
- // Force removal of old canvases to clear any WebGPU contexts
- const oldCanvases = grid.querySelectorAll('canvas');
- oldCanvases.forEach(canvas => {
- canvas.width = 0;
- canvas.height = 0;
- });
-
- grid.innerHTML = '';
- for (let c = 0; c < 4; c++) {
- const div = document.createElement('div');
- div.className = 'layer-view';
- div.innerHTML = `
- <div class="layer-view-label" id="channelLabel${c}">Ch ${c}</div>
- <canvas id="layerCanvas${c}"></canvas>
- `;
- div.onclick = () => this.selectChannel(c);
- grid.appendChild(div);
- }
- this.selectedChannel = 0;
- }
-
- async visualizeLayer(layerIdx, channelOffset = 0) {
- if (!this.layerOutputs || layerIdx >= this.layerOutputs.length) {
- this.log(`Cannot visualize layer ${layerIdx}: no data`, 'error');
- return;
- }
-
- // Store current selection
- this.currentLayerIdx = layerIdx;
- this.currentChannelOffset = channelOffset;
-
- // Update button states
- document.querySelectorAll('.layer-buttons button').forEach(btn => btn.classList.remove('active'));
- if (layerIdx === 0) {
- // Static features
- const btnId = `layerBtn0_${channelOffset}`;
- const btn = document.getElementById(btnId);
- if (btn) btn.classList.add('active');
- } else {
- const btn = document.getElementById(`layerBtn${layerIdx}`);
- if (btn) btn.classList.add('active');
- }
-
- const layerName = layerIdx === 0 ? `Static Features (${channelOffset}-${channelOffset + 3})` : `Layer ${layerIdx - 1}`;
- const layerTex = this.layerOutputs[layerIdx];
- const { width, height } = this.getDimensions();
-
- // Update channel labels based on layer type
- // Static features (layerIdx=0): 8 channels split into two views
- // CNN layers (layerIdx≥1): 4 channels per layer
- const staticLabels = [
- ['Ch0 (p0)', 'Ch1 (p1)', 'Ch2 (p2)', 'Ch3 (p3)'],
- ['Ch4 (uv_x)', 'Ch5 (uv_y)', 'Ch6 (sin10_x)', 'Ch7 (bias)']
- ];
- const channelLabels = layerIdx === 0
- ? staticLabels[channelOffset / 4]
- : ['Ch0', 'Ch1', 'Ch2', 'Ch3'];
-
- for (let c = 0; c < 4; c++) {
- const label = document.getElementById(`channelLabel${c}`);
- if (label) label.textContent = channelLabels[c];
- }
-
- // Create layer viz pipeline if needed
- if (!this.layerVizPipeline) {
- this.layerVizPipeline = this.device.createRenderPipeline({
- layout: 'auto',
- vertex: {
- module: this.device.createShaderModule({ code: LAYER_VIZ_SHADER }),
- entryPoint: 'vs_main'
- },
- fragment: {
- module: this.device.createShaderModule({ code: LAYER_VIZ_SHADER }),
- entryPoint: 'fs_main',
- targets: [{ format: this.format }]
- }
- });
- this.log('Created layer visualization pipeline');
- }
-
- // Render each channel to its canvas
- for (let c = 0; c < 4; c++) {
- const canvas = document.getElementById(`layerCanvas${c}`);
- if (!canvas) {
- this.log(`Canvas layerCanvas${c} not found`, 'error');
- continue;
- }
-
- // Set canvas size BEFORE getting context
- canvas.width = width;
- canvas.height = height;
-
- const ctx = canvas.getContext('webgpu');
- if (!ctx) {
- this.log(`Failed to get WebGPU context for channel ${c}`, 'error');
- continue;
- }
-
- try {
- ctx.configure({ device: this.device, format: this.format });
- } catch (e) {
- this.log(`Failed to configure canvas ${c}: ${e.message}`, 'error');
- continue;
- }
-
- const vizScale = 1.0; // Always 1.0, shader clamps to [0,1]
- const paramsBuffer = this.device.createBuffer({
- size: 8,
- usage: GPUBufferUsage.UNIFORM | GPUBufferUsage.COPY_DST
- });
- // Use channel index with offset for static features
- const actualChannel = channelOffset + c;
- const paramsData = new Float32Array([actualChannel, vizScale]);
- this.device.queue.writeBuffer(paramsBuffer, 0, paramsData);
-
- const bindGroup = this.device.createBindGroup({
- layout: this.layerVizPipeline.getBindGroupLayout(0),
- entries: [
- { binding: 0, resource: layerTex.createView() },
- { binding: 1, resource: { buffer: paramsBuffer } }
- ]
- });
-
- const encoder = this.device.createCommandEncoder();
- const renderPass = encoder.beginRenderPass({
- colorAttachments: [{
- view: ctx.getCurrentTexture().createView(),
- loadOp: 'clear',
- clearValue: { r: 1.0, g: 0.0, b: 1.0, a: 1.0 }, // Magenta clear for debugging
- storeOp: 'store'
- }]
- });
-
- renderPass.setPipeline(this.layerVizPipeline);
- renderPass.setBindGroup(0, bindGroup);
- renderPass.draw(6);
- renderPass.end();
-
- this.device.queue.submit([encoder.finish()]);
- }
-
- // Wait for all renders to complete
- await this.device.queue.onSubmittedWorkDone();
-
- // Update active channel highlighting and preview
- this.updateChannelSelection();
- await this.renderChannelPreview();
- }
-
- selectChannel(channelIdx) {
- this.selectedChannel = channelIdx;
- this.updateChannelSelection();
- this.renderChannelPreview();
- }
-
- updateChannelSelection() {
- const grid = document.getElementById('layerGrid');
- if (!grid) return;
-
- const views = grid.querySelectorAll('.layer-view');
- views.forEach((view, idx) => {
- view.classList.toggle('active', idx === this.selectedChannel);
- });
- }
-
- async renderChannelPreview() {
- const previewCanvas = document.getElementById('previewCanvas');
- const previewLabel = document.getElementById('previewLabel');
- if (!previewCanvas || !this.device) return;
-
- const { width, height } = this.getDimensions();
- previewCanvas.width = width;
- previewCanvas.height = height;
-
- const ctx = previewCanvas.getContext('webgpu');
- if (!ctx) return;
-
- try {
- ctx.configure({ device: this.device, format: this.format });
- } catch (e) {
- return;
- }
-
- // Update label
- const channelLabel = document.getElementById(`channelLabel${this.selectedChannel}`);
- if (channelLabel && previewLabel) {
- previewLabel.textContent = channelLabel.textContent;
- }
-
- // Render selected channel
- const layerIdx = this.currentLayerIdx;
- const channelOffset = this.currentChannelOffset;
- const layerTex = this.layerOutputs[layerIdx];
- if (!layerTex) return;
-
- // Always 1.0, shader clamps to [0,1] - show exact layer values
- const vizScale = 1.0;
- const actualChannel = channelOffset + this.selectedChannel;
-
- const paramsBuffer = this.device.createBuffer({
- size: 8,
- usage: GPUBufferUsage.UNIFORM | GPUBufferUsage.COPY_DST
- });
- const paramsData = new Float32Array([actualChannel, vizScale]);
- this.device.queue.writeBuffer(paramsBuffer, 0, paramsData);
-
- const bindGroup = this.device.createBindGroup({
- layout: this.layerVizPipeline.getBindGroupLayout(0),
- entries: [
- { binding: 0, resource: layerTex.createView() },
- { binding: 1, resource: { buffer: paramsBuffer } }
- ]
- });
-
- const encoder = this.device.createCommandEncoder();
- const renderPass = encoder.beginRenderPass({
- colorAttachments: [{
- view: ctx.getCurrentTexture().createView(),
- loadOp: 'clear',
- storeOp: 'store'
- }]
- });
-
- renderPass.setPipeline(this.layerVizPipeline);
- renderPass.setBindGroup(0, bindGroup);
- renderPass.draw(6);
- renderPass.end();
-
- this.device.queue.submit([encoder.finish()]);
- }
-
- visualizeWeights(cnnLayerIdx) {
- const layer = this.weights.layers[cnnLayerIdx];
- if (!layer) {
- this.log(`Layer ${cnnLayerIdx} not found`, 'error');
- return;
- }
-
- // Update button states
- document.querySelectorAll('#weightsLayerButtons button').forEach(btn => btn.classList.remove('active'));
- const btn = document.getElementById(`weightsBtn${cnnLayerIdx}`);
- if (btn) btn.classList.add('active');
-
- const { kernelSize, inChannels, outChannels, weightOffset, min, max } = layer;
-
- const canvas = document.getElementById('weightsCanvas');
- const ctx = canvas.getContext('2d', { willReadFrequently: false });
-
- // 1 pixel per weight, show all input channels horizontally
- const width = inChannels * kernelSize;
- const height = outChannels * kernelSize;
-
- canvas.width = width;
- canvas.height = height;
-
- ctx.fillStyle = '#1a1a1a';
- ctx.fillRect(0, 0, width, height);
-
- // Stack output channels vertically
- for (let outCh = 0; outCh < outChannels; outCh++) {
- const yOffset = outCh * kernelSize;
-
- for (let inCh = 0; inCh < inChannels; inCh++) {
- const xOffset = inCh * kernelSize;
-
- for (let ky = 0; ky < kernelSize; ky++) {
- for (let kx = 0; kx < kernelSize; kx++) {
- const spatialIdx = ky * kernelSize + kx;
- const wIdx = weightOffset +
- outCh * inChannels * kernelSize * kernelSize +
- inCh * kernelSize * kernelSize +
- spatialIdx;
-
- const weight = this.getWeightValue(wIdx);
- const normalized = (weight - min) / (max - min);
- const intensity = Math.floor(normalized * 255);
-
- ctx.fillStyle = `rgb(${intensity}, ${intensity}, ${intensity})`;
- ctx.fillRect(xOffset + kx, yOffset + ky, 1, 1);
- }
- }
- }
- }
- }
-
- getWeightValue(idx) {
- const pairIdx = Math.floor(idx / 2);
- const packed = this.weights.weights[pairIdx];
- const unpacked = this.unpackF16(packed);
- return (idx % 2 === 0) ? unpacked[0] : unpacked[1];
- }
-
- toggleWeightsInfo() {
- const panel = document.getElementById('weightsInfoPanel');
- const toggle = document.getElementById('weightsInfoToggle');
- panel.classList.toggle('collapsed');
- toggle.textContent = panel.classList.contains('collapsed') ? '▶' : '▼';
- }
-
- updateDisplay() {
- if (!this.displayPipeline || !this.displayBindGroup) return;
-
- this.device.queue.writeBuffer(this.modeBuffer, 0, new Uint32Array([this.viewMode]));
-
- const encoder = this.device.createCommandEncoder();
- const renderPass = encoder.beginRenderPass({
- colorAttachments: [{
- view: this.context.getCurrentTexture().createView(),
- loadOp: 'clear',
- storeOp: 'store'
- }]
- });
- renderPass.setPipeline(this.displayPipeline);
- renderPass.setBindGroup(0, this.displayBindGroup);
- renderPass.draw(6);
- renderPass.end();
-
- this.device.queue.submit([encoder.finish()]);
- }
-
- async savePNG() {
- if (!this.image && !this.isVideo) {
- this.log('No image loaded', 'error');
- return;
- }
-
- if (!this.resultTexture) {
- this.log('No result to save', 'error');
- return;
- }
-
- try {
- const { width, height } = this.getDimensions();
-
- // GPU readback from result texture
- const bytesPerRow = width * 16; // 4×u32 per pixel
- const paddedBytesPerRow = Math.ceil(bytesPerRow / 256) * 256;
- const bufferSize = paddedBytesPerRow * height;
-
- const stagingBuffer = this.device.createBuffer({
- size: bufferSize,
- usage: GPUBufferUsage.COPY_DST | GPUBufferUsage.MAP_READ
- });
-
- const encoder = this.device.createCommandEncoder();
- encoder.copyTextureToBuffer(
- { texture: this.resultTexture },
- { buffer: stagingBuffer, bytesPerRow: paddedBytesPerRow, rowsPerImage: height },
- { width, height, depthOrArrayLayers: 1 }
- );
- this.device.queue.submit([encoder.finish()]);
-
- await stagingBuffer.mapAsync(GPUMapMode.READ);
- const mapped = new Uint8Array(stagingBuffer.getMappedRange());
-
- // Unpack f16 to RGBA8
- const pixels = new Uint8Array(width * height * 4);
- for (let y = 0; y < height; y++) {
- const rowOffset = y * paddedBytesPerRow;
- for (let x = 0; x < width; x++) {
- const pixelOffset = rowOffset + x * 16;
- const data = new Uint32Array(mapped.buffer, mapped.byteOffset + pixelOffset, 4);
-
- // Unpack f16 (first 4 channels only)
- const unpack = (u32, idx) => {
- const h = (idx === 0) ? (u32 & 0xFFFF) : ((u32 >> 16) & 0xFFFF);
- const sign = (h >> 15) & 1;
- const exp = (h >> 10) & 0x1F;
- const frac = h & 0x3FF;
- if (exp === 0) return 0;
- if (exp === 31) return sign ? 0 : 255;
- const e = exp - 15;
- const val = (1 + frac / 1024) * Math.pow(2, e);
- return Math.max(0, Math.min(255, Math.round(val * 255)));
- };
-
- const outIdx = (y * width + x) * 4;
- pixels[outIdx + 0] = unpack(data[0], 0); // R
- pixels[outIdx + 1] = unpack(data[0], 1); // G
- pixels[outIdx + 2] = unpack(data[1], 0); // B
- pixels[outIdx + 3] = 255; // A
- }
- }
-
- stagingBuffer.unmap();
- stagingBuffer.destroy();
-
- // Create blob from pixels
- const canvas = document.createElement('canvas');
- canvas.width = width;
- canvas.height = height;
- const ctx = canvas.getContext('2d');
- const imageData = new ImageData(new Uint8ClampedArray(pixels), width, height);
- ctx.putImageData(imageData, 0, 0);
-
- const blob = await new Promise(resolve => canvas.toBlob(resolve, 'image/png'));
- const url = URL.createObjectURL(blob);
- const a = document.createElement('a');
- const mode = ['cnn', 'original', 'diff'][this.viewMode];
- a.href = url;
- a.download = `output_${width}x${height}_${mode}.png`;
- a.click();
- URL.revokeObjectURL(url);
-
- this.log(`Saved PNG: ${a.download}`);
- this.setStatus(`Saved: ${a.download}`);
- } catch (err) {
- this.log(`Failed to save PNG: ${err.message}`, 'error');
- this.setStatus(`Save failed: ${err.message}`, true);
- }
- }
-
- async saveCompositedLayer() {
- if (!this.currentLayerIdx) {
- this.log('No layer selected for compositing', 'error');
- return;
- }
-
- try {
- const canvases = [];
- for (let i = 0; i < 4; i++) {
- const canvas = document.getElementById(`layerCanvas${i}`);
- if (!canvas) {
- this.log(`Canvas layerCanvas${i} not found`, 'error');
- return;
- }
- canvases.push(canvas);
- }
-
- const width = canvases[0].width;
- const height = canvases[0].height;
- const compositedWidth = width * 4;
-
- // Create composited canvas
- const compositedCanvas = document.createElement('canvas');
- compositedCanvas.width = compositedWidth;
- compositedCanvas.height = height;
- const ctx = compositedCanvas.getContext('2d');
-
- // Composite horizontally
- for (let i = 0; i < 4; i++) {
- ctx.drawImage(canvases[i], i * width, 0);
- }
-
- // Convert to grayscale
- const imageData = ctx.getImageData(0, 0, compositedWidth, height);
- const pixels = imageData.data;
- for (let i = 0; i < pixels.length; i += 4) {
- const gray = 0.299 * pixels[i] + 0.587 * pixels[i + 1] + 0.114 * pixels[i + 2];
- pixels[i] = pixels[i + 1] = pixels[i + 2] = gray;
- }
- ctx.putImageData(imageData, 0, 0);
-
- // Save as PNG
- const blob = await new Promise(resolve => compositedCanvas.toBlob(resolve, 'image/png'));
- const url = URL.createObjectURL(blob);
- const a = document.createElement('a');
- a.href = url;
- a.download = `composited_layer${this.currentLayerIdx - 1}_${compositedWidth}x${height}.png`;
- a.click();
- URL.revokeObjectURL(url);
-
- this.log(`Saved composited layer: ${a.download}`);
- this.setStatus(`Saved: ${a.download}`);
- } catch (err) {
- this.log(`Failed to save composited layer: ${err.message}`, 'error');
- this.setStatus(`Compositing failed: ${err.message}`, true);
- }
- }
-}
-
-const tester = new CNNTester();
-
-// Load default weights on startup
-(async () => {
- try {
- const binaryString = atob(DEFAULT_WEIGHTS_B64);
- const bytes = new Uint8Array(binaryString.length);
- for (let i = 0; i < binaryString.length; i++) {
- bytes[i] = binaryString.charCodeAt(i);
- }
- await tester.loadWeights({ name: 'default.bin', arrayBuffer: () => Promise.resolve(bytes.buffer) });
- tester.log('Loaded default weights');
- } catch (err) {
- tester.log(`Failed to load default weights: ${err.message}`, 'error');
- }
-})();
-
-function setupDropZone(id, callback) {
- const zone = document.getElementById(id);
- ['dragenter', 'dragover', 'dragleave', 'drop'].forEach(e => {
- zone.addEventListener(e, ev => { ev.preventDefault(); ev.stopPropagation(); });
- });
- ['dragenter', 'dragover'].forEach(e => zone.addEventListener(e, () => zone.classList.add('active')));
- ['dragleave', 'drop'].forEach(e => zone.addEventListener(e, () => zone.classList.remove('active')));
- zone.addEventListener('drop', e => {
- const file = e.dataTransfer.files[0];
- if (file) callback(file).catch(err => {
- zone.classList.add('error');
- tester.setStatus(err.message, true);
- tester.log(err.message, 'error');
- setTimeout(() => zone.classList.remove('error'), 2000);
- });
- });
-}
-
-// Whole window drop for PNG images and videos
-const mainArea = document.getElementById('mainDrop');
-['dragenter', 'dragover', 'dragleave', 'drop'].forEach(e => {
- mainArea.addEventListener(e, ev => { ev.preventDefault(); ev.stopPropagation(); });
-});
-['dragenter', 'dragover'].forEach(e => mainArea.addEventListener(e, () => mainArea.classList.add('drop-active')));
-['dragleave', 'drop'].forEach(e => mainArea.addEventListener(e, () => mainArea.classList.remove('drop-active')));
-mainArea.addEventListener('drop', e => {
- const file = e.dataTransfer.files[0];
- if (file) {
- if (file.type.startsWith('image/')) {
- tester.loadImage(file).catch(err => {
- tester.setStatus(err.message, true);
- tester.log(err.message, 'error');
- });
- } else if (file.type.startsWith('video/')) {
- tester.loadVideo(file).catch(err => {
- tester.setStatus(err.message, true);
- tester.log(err.message, 'error');
- });
- }
- }
-});
-
-// Weights drop zone
-setupDropZone('weightsDrop', f => tester.loadWeights(f));
-
-// Weights file input
-document.getElementById('weightsFile').addEventListener('change', e => {
- const file = e.target.files[0];
- if (file) {
- tester.loadWeights(file).catch(err => {
- tester.setStatus(err.message, true);
- tester.log(err.message, 'error');
- });
- }
-});
-
-document.getElementById('blend').addEventListener('input', e => {
- tester.blendAmount = parseFloat(e.target.value);
- document.getElementById('blendValue').textContent = e.target.value;
- if ((tester.image || tester.isVideo) && tester.weights) {
- tester.log(`Blend changed to ${e.target.value}`);
- tester.run();
- }
-});
-
-document.getElementById('depth').addEventListener('input', e => {
- tester.depth = parseFloat(e.target.value);
- document.getElementById('depthValue').textContent = e.target.value;
- if ((tester.image || tester.isVideo) && tester.weights) tester.run();
-});
-
-document.getElementById('mipLevel').addEventListener('change', e => {
- tester.mipLevel = parseInt(e.target.value);
- tester.log(`Mip level changed to ${e.target.value}`);
- if ((tester.image || tester.isVideo) && tester.weights) tester.run();
-});
-
-document.getElementById('playPauseBtn').addEventListener('click', () => tester.togglePlayPause());
-document.getElementById('stepBackBtn').addEventListener('click', () => tester.stepFrame(-1));
-document.getElementById('stepForwardBtn').addEventListener('click', () => tester.stepFrame(1));
-document.getElementById('savePngBtn').addEventListener('click', () => tester.savePNG());
-
-document.addEventListener('keydown', e => {
- if (e.code === 'Space') {
- e.preventDefault();
- if (tester.viewMode === 1) {
- tester.viewMode = 0;
- } else {
- tester.viewMode = 1;
- }
- const modeName = ['CNN Output', 'Original', 'Diff (×10)'][tester.viewMode];
- if ((tester.image || tester.isVideo) && tester.weights) {
- tester.log(`View mode: ${modeName}`);
- tester.updateDisplay();
- const width = tester.isVideo ? tester.video.videoWidth : tester.image.width;
- const height = tester.isVideo ? tester.video.videoHeight : tester.image.height;
- tester.setStatus(`${width}×${height} | ${modeName}`);
- }
- } else if (e.code === 'KeyD') {
- e.preventDefault();
- if (tester.viewMode === 2) {
- tester.viewMode = 0;
- } else {
- tester.viewMode = 2;
- }
- const modeName = ['CNN Output', 'Original', 'Diff (×10)'][tester.viewMode];
- if ((tester.image || tester.isVideo) && tester.weights) {
- tester.log(`View mode: ${modeName}`);
- tester.updateDisplay();
- const width = tester.isVideo ? tester.video.videoWidth : tester.image.width;
- const height = tester.isVideo ? tester.video.videoHeight : tester.image.height;
- tester.setStatus(`${width}×${height} | ${modeName}`);
- }
- }
-});
- </script>
-</body>
-</html>
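The deleted `savePNG` path above hand-decodes IEEE 754 binary16 values packed two-per-u32 in the readback buffer. As a standalone reference (a sketch mirroring the deleted unpack helper, not code from this commit; it returns the raw float instead of clamping to 0–255):

```javascript
// Decode one IEEE 754 binary16 value from the low (idx 0) or high (idx 1)
// half of a u32, matching the bit layout the deleted code assumed.
// Subnormals are flushed to 0, as in the original helper.
function unpackF16(u32, idx) {
  const h = (idx === 0) ? (u32 & 0xFFFF) : ((u32 >>> 16) & 0xFFFF);
  const sign = (h >> 15) & 1;
  const exp = (h >> 10) & 0x1F;
  const frac = h & 0x3FF;
  if (exp === 0) return 0;                              // zero / subnormal
  if (exp === 31) return sign ? -Infinity : Infinity;   // Inf (NaN not distinguished)
  const val = (1 + frac / 1024) * Math.pow(2, exp - 15);
  return sign ? -val : val;
}

// 0x3C00 is f16 1.0; 0x4000 is f16 2.0
console.log(unpackF16(0x40003C00, 0)); // low half:  1
console.log(unpackF16(0x40003C00, 1)); // high half: 2
```

Note the 256-byte `bytesPerRow` padding in the deleted readback is a WebGPU requirement for `copyTextureToBuffer`, which is why rows are indexed via `paddedBytesPerRow` rather than `width * 16`.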
diff --git a/tools/common/style.css b/tools/common/style.css
new file mode 100644
index 0000000..1ba4bad
--- /dev/null
+++ b/tools/common/style.css
@@ -0,0 +1,117 @@
+:root {
+ --bg-dark: #1e1e1e;
+ --bg-medium: #252526;
+ --bg-light: #3c3c3c;
+ --text-primary: #d4d4d4;
+ --text-muted: #858585;
+ --accent-blue: #0e639c;
+ --accent-blue-hover: #1177bb;
+ --accent-green: #4ec9b0;
+ --accent-orange: #ce9178;
+ --accent-red: #f48771;
+ --border-color: #858585;
+ --gap: 10px;
+ --radius: 4px;
+}
+
+* {
+ margin: 0;
+ padding: 0;
+ box-sizing: border-box;
+}
+
+body {
+ font-family: 'Segoe UI', Tahoma, Geneva, Verdana, sans-serif;
+ background: var(--bg-dark);
+ color: var(--text-primary);
+ overflow: hidden;
+}
+
+button, .btn, .file-label {
+ background: var(--accent-blue);
+ color: white;
+ border: none;
+ padding: 10px 20px;
+ border-radius: var(--radius);
+ cursor: pointer;
+ font-size: 14px;
+ display: inline-block;
+ text-align: center;
+}
+
+button:hover, .btn:hover, .file-label:hover {
+ background: var(--accent-blue-hover);
+}
+
+button:disabled, .btn:disabled {
+ background: var(--bg-light);
+ cursor: not-allowed;
+}
+
+input[type="file"] {
+ display: none;
+}
+
+input, select {
+ background: var(--bg-light);
+ border: 1px solid var(--border-color);
+ border-radius: var(--radius);
+ color: var(--text-primary);
+ padding: 8px;
+ font-size: 14px;
+}
+
+.container {
+ width: 100%;
+ height: 100vh;
+ display: flex;
+ flex-direction: column;
+}
+
+.header {
+ background: var(--bg-medium);
+ padding: 15px;
+ border-bottom: 1px solid var(--border-color);
+ display: flex;
+ align-items: center;
+ gap: 20px;
+ flex-wrap: wrap;
+}
+
+h1 {
+ color: var(--accent-green);
+ font-size: 18px;
+ white-space: nowrap;
+}
+
+.controls {
+ display: flex;
+ gap: var(--gap);
+ flex-wrap: wrap;
+ align-items: center;
+}
+
+.panel {
+ background: var(--bg-medium);
+ border: 1px solid var(--border-color);
+ border-radius: var(--radius);
+ padding: 15px;
+}
+
+.error-message {
+ background: #5a1d1d;
+ color: var(--accent-red);
+ padding: 10px;
+ border-radius: var(--radius);
+ box-shadow: 0 2px 8px rgba(0,0,0,0.3);
+ margin: 10px 0;
+}
+
+.success-message {
+ background: #1e5231;
+ color: #89d185;
+ padding: 10px;
+ border-radius: var(--radius);
+ box-shadow: 0 2px 8px rgba(0,0,0,0.3);
+ margin: 10px 0;
+}
diff --git a/tools/shader_editor/index.html b/tools/shader_editor/index.html
index bad0abb..d93a595 100644
--- a/tools/shader_editor/index.html
+++ b/tools/shader_editor/index.html
@@ -4,26 +4,8 @@
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>WGSL Shader Editor</title>
+ <link rel="stylesheet" href="../common/style.css">
<style>
-* {
- margin: 0;
- padding: 0;
- box-sizing: border-box;
-}
-
-body {
- font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", monospace;
- background: #1e1e1e;
- color: #d4d4d4;
- overflow: hidden;
- height: 100vh;
-}
-
-.container {
- display: flex;
- height: 100vh;
-}
-
.preview-pane {
flex: 0 0 57%;
background: #252526;
@@ -89,26 +71,13 @@ body {
}
.control-group button {
- background: #0e639c;
- color: #fff;
- border: none;
padding: 6px 12px;
- border-radius: 3px;
- cursor: pointer;
font-size: 13px;
}
-.control-group button:hover {
- background: #1177bb;
-}
-
.control-group input[type="number"],
.control-group select {
- background: #3c3c3c;
- color: #d4d4d4;
- border: 1px solid #3e3e42;
padding: 4px 8px;
- border-radius: 3px;
font-size: 13px;
}
@@ -153,19 +122,10 @@ body {
}
.editor-header button {
- background: #0e639c;
- color: #fff;
- border: none;
padding: 6px 12px;
- border-radius: 3px;
- cursor: pointer;
font-size: 13px;
}
-.editor-header button:hover {
- background: #1177bb;
-}
-
.editor-container {
flex: 1;
position: relative;
diff --git a/tools/spectral_editor/index.html b/tools/spectral_editor/index.html
index 75658ae..2d5f3e5 100644
--- a/tools/spectral_editor/index.html
+++ b/tools/spectral_editor/index.html
@@ -4,6 +4,7 @@
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Spectral Brush Editor</title>
+ <link rel="stylesheet" href="../common/style.css">
<link rel="stylesheet" href="style.css">
</head>
<body>
diff --git a/tools/spectral_editor/style.css b/tools/spectral_editor/style.css
index 48f7463..87fb54e 100644
--- a/tools/spectral_editor/style.css
+++ b/tools/spectral_editor/style.css
@@ -1,18 +1,4 @@
-/* Spectral Brush Editor Styles */
-
-* {
- margin: 0;
- padding: 0;
- box-sizing: border-box;
-}
-
-body {
- font-family: 'Segoe UI', Tahoma, Geneva, Verdana, sans-serif;
- background: #1e1e1e;
- color: #d4d4d4;
- overflow: hidden;
- height: 100vh;
-}
+/* Spectral Brush Editor Specific Styles */
#app {
display: flex;
@@ -20,41 +6,12 @@ body {
height: 100vh;
}
-/* Header */
-header {
- background: #252526;
- padding: 12px 20px;
- border-bottom: 1px solid #3e3e42;
- display: flex;
- justify-content: space-between;
- align-items: center;
-}
-
-header h1 {
- font-size: 18px;
- font-weight: 600;
- color: #cccccc;
-}
-
-.header-controls {
- display: flex;
- align-items: center;
- gap: 15px;
-}
-
-.file-info {
- font-size: 13px;
- color: #858585;
-}
-
-/* Main content area */
.main-content {
display: flex;
flex: 1;
overflow: hidden;
}
-/* Canvas container (80% width) */
.canvas-container {
flex: 1;
position: relative;
@@ -89,7 +46,6 @@ header h1 {
display: none;
}
-/* Mini spectrum viewer (bottom-right overlay) */
.spectrum-viewer {
position: absolute;
bottom: 10px;
@@ -99,12 +55,8 @@ header h1 {
background: rgba(30, 30, 30, 0.9);
border: 1px solid #3e3e42;
border-radius: 3px;
- display: block; /* Always visible */
- pointer-events: none; /* Don't interfere with mouse events */
-}
-
-.spectrum-viewer.active {
- display: block; /* Keep for backward compatibility */
+ display: block;
+ pointer-events: none;
}
#spectrumCanvas {
@@ -123,7 +75,6 @@ header h1 {
color: #858585;
}
-/* Toolbar (20% width) */
.toolbar {
width: 250px;
background: #252526;
@@ -155,16 +106,6 @@ header h1 {
transition: background 0.2s;
}
-.btn-toolbar:hover {
- background: #1177bb;
-}
-
-.btn-toolbar:disabled {
- background: #3e3e42;
- color: #858585;
- cursor: not-allowed;
-}
-
.btn-toolbar.btn-danger {
background: #a82d2d;
}
@@ -199,7 +140,6 @@ header h1 {
border-color: #0e639c;
}
-/* Point info panel */
.point-info {
margin-top: 10px;
padding: 10px;
@@ -224,7 +164,6 @@ header h1 {
font-family: monospace;
}
-/* Control panel (bottom) */
.control-panel {
background: #252526;
border-top: 1px solid #3e3e42;
@@ -314,16 +253,6 @@ header h1 {
transition: background 0.2s;
}
-.btn-playback:hover:not(:disabled) {
- background: #1177bb;
-}
-
-.btn-playback:disabled {
- background: #3e3e42;
- color: #858585;
- cursor: not-allowed;
-}
-
.btn-playback kbd {
background: rgba(255, 255, 255, 0.1);
padding: 2px 5px;
@@ -331,7 +260,6 @@ header h1 {
font-size: 11px;
}
-/* Action bar (bottom) */
.action-bar {
background: #2d2d30;
border-top: 1px solid #3e3e42;
@@ -365,11 +293,6 @@ header h1 {
border-color: #0e639c;
}
-.btn-action:disabled {
- color: #858585;
- cursor: not-allowed;
-}
-
.btn-primary {
padding: 6px 16px;
background: #0e639c;
@@ -381,17 +304,11 @@ header h1 {
transition: background 0.2s;
}
-.btn-primary:hover {
- background: #1177bb;
-}
-
-/* Icon styling */
.icon {
font-size: 14px;
line-height: 1;
}
-/* Modal */
.modal {
position: fixed;
z-index: 1000;
@@ -490,7 +407,6 @@ header h1 {
color: #cccccc;
}
-/* Scrollbar styling */
::-webkit-scrollbar {
width: 10px;
height: 10px;
@@ -509,7 +425,6 @@ header h1 {
background: #4e4e52;
}
-/* Waveform intensity viewer */
.waveform-container {
position: relative;
height: 120px;
@@ -570,24 +485,9 @@ header h1 {
transition: background 0.2s;
}
-.btn-copy:hover, .btn-snap:hover {
- background: #1177bb;
-}
-
-.btn-copy:active, .btn-snap:active {
- background: #0d5a8f;
-}
-
.spectrogram-wrapper {
flex: 1;
position: relative;
overflow: hidden;
z-index: 1;
}
-
-#spectrogramCanvas {
- width: 100%;
- height: 100%;
- display: block;
- cursor: crosshair;
-}
diff --git a/tools/timeline_editor/README.md b/tools/timeline_editor/README.md
index 72b5ae0..66e39bd 100644
--- a/tools/timeline_editor/README.md
+++ b/tools/timeline_editor/README.md
@@ -39,7 +39,12 @@ This helps identify performance hotspots in your timeline.
## Usage
-1. **Open:** `open tools/timeline_editor/index.html` or double-click in browser
+1. **Open:** Requires an HTTP server (ES6 modules cannot load via `file://`):
+ ```bash
+ cd tools/timeline_editor
+ python3 -m http.server 8080
+ ```
+ Then open: `http://localhost:8080`
2. **Load timeline:** Click "📂 Load timeline.seq" → select `workspaces/main/timeline.seq`
3. **Load audio:** Click "🎵 Load Audio (WAV)" → select audio file
4. **Auto-load via URL:** `index.html?seq=timeline.seq&wav=audio.wav`
@@ -125,7 +130,11 @@ open "tools/timeline_editor/index.html?seq=../../workspaces/main/timeline.seq"
## Technical Notes
-- Pure HTML/CSS/JavaScript (no dependencies, works offline)
+- Modular ES6 structure (requires HTTP server, not file://)
+ - `index.html` - Main editor and rendering
+ - `timeline-viewport.js` - Zoom/scroll/indicator control
+ - `timeline-playback.js` - Audio playback and waveform
+- No external dependencies
- **Internal representation uses beats** (not seconds)
- Sequences have absolute times (beats), effects are relative to parent sequence
- BPM used for seconds conversion (tooltips, audio waveform alignment)
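The README hunk above notes that the editor stores positions in beats and converts to seconds via BPM. A minimal sketch of that stated convention (illustrative only, not code from the editor):

```python
def beats_to_seconds(beats: float, bpm: float) -> float:
    """Convert a beat position to seconds at a fixed BPM (bpm beats per 60 s)."""
    return beats * 60.0 / bpm

def seconds_to_beats(seconds: float, bpm: float) -> float:
    """Inverse conversion, e.g. for aligning the audio waveform to the grid."""
    return seconds * bpm / 60.0

print(beats_to_seconds(120, 120))  # 60.0 — 120 beats at 120 BPM is one minute
```

Since effects are stored relative to their parent sequence, an effect's absolute time is `sequence_start_beats + effect_offset_beats`, converted to seconds only at display/playback time.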
diff --git a/tools/timeline_editor/index.html b/tools/timeline_editor/index.html
index 363c5cb..c5e0264 100644
--- a/tools/timeline_editor/index.html
+++ b/tools/timeline_editor/index.html
@@ -4,105 +4,466 @@
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Timeline Editor - timeline.seq</title>
+ <link rel="stylesheet" href="../common/style.css">
<link rel="icon" href="data:image/svg+xml,<svg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 100 100'><rect width='100' height='100' fill='%231e1e1e'/><rect x='10' y='30' width='15' height='40' fill='%234ec9b0'/><rect x='30' y='20' width='15' height='60' fill='%234ec9b0'/><rect x='50' y='35' width='15' height='30' fill='%234ec9b0'/><rect x='70' y='15' width='15' height='70' fill='%234ec9b0'/></svg>">
<style>
- :root {
- --bg-dark: #1e1e1e;
- --bg-medium: #252526;
- --bg-light: #3c3c3c;
- --text-primary: #d4d4d4;
- --text-muted: #858585;
- --accent-blue: #0e639c;
- --accent-blue-hover: #1177bb;
- --accent-green: #4ec9b0;
- --accent-orange: #ce9178;
- --accent-red: #f48771;
- --border-color: #858585;
- --gap: 10px;
- --radius: 4px;
+ body {
+ padding: 20px;
+ min-height: 100vh;
}
- * { margin: 0; padding: 0; box-sizing: border-box; }
- body { font-family: 'Segoe UI', Tahoma, Geneva, Verdana, sans-serif; background: var(--bg-dark); color: var(--text-primary); padding: 20px; min-height: 100vh; }
- .container { max-width: 100%; width: 100%; margin: 0 auto; }
+ .container {
+ max-width: 100%;
+ width: 100%;
+ margin: 0 auto;
+ }
+
+ header {
+ background: var(--bg-medium);
+ padding: 20px;
+ border-radius: 8px;
+ margin-bottom: 20px;
+ display: flex;
+ align-items: center;
+ justify-content: space-between;
+ gap: 20px;
+ flex-wrap: wrap;
+ }
+
+ .zoom-controls {
+ display: flex;
+ gap: var(--gap);
+ flex-wrap: wrap;
+ align-items: center;
+ margin-bottom: var(--gap);
+ }
+
+ .checkbox-label {
+ display: flex;
+ align-items: center;
+ gap: 8px;
+ cursor: pointer;
+ user-select: none;
+ }
+
+ .checkbox-label input[type="checkbox"] {
+ cursor: pointer;
+ }
+
+ .timeline-container {
+ background: var(--bg-medium);
+ border-radius: 8px;
+ position: relative;
+ height: calc(100vh - 280px);
+ min-height: 500px;
+ display: flex;
+ flex-direction: column;
+ }
+
+ .timeline-content {
+ flex: 1;
+ overflow: auto;
+ position: relative;
+ padding: 0 20px 20px 20px;
+ scrollbar-width: none;
+ -ms-overflow-style: none;
+ }
+
+ .timeline-content::-webkit-scrollbar {
+ display: none;
+ }
+
+ .timeline {
+ position: relative;
+ min-height: 100%;
+ }
+
+ .sticky-header {
+ position: sticky;
+ top: 0;
+ background: var(--bg-medium);
+ z-index: 100;
+ padding: 20px 20px 10px 20px;
+ border-bottom: 2px solid var(--bg-light);
+ flex-shrink: 0;
+ }
+
+ .waveform-container {
+ position: relative;
+ height: 80px;
+ overflow: hidden;
+ background: rgba(0, 0, 0, 0.5);
+ border-radius: var(--radius);
+ cursor: crosshair;
+ }
+
+ #cpuLoadCanvas {
+ position: absolute;
+ left: 0;
+ bottom: 0;
+ height: 10px;
+ display: block;
+ z-index: 1;
+ }
- header { background: var(--bg-medium); padding: 20px; border-radius: 8px; margin-bottom: 20px; display: flex; align-items: center; justify-content: space-between; gap: 20px; flex-wrap: wrap; }
- h1 { color: var(--accent-green); white-space: nowrap; }
- .controls { display: flex; gap: var(--gap); flex-wrap: wrap; align-items: center; }
- .zoom-controls { display: flex; gap: var(--gap); flex-wrap: wrap; align-items: center; margin-bottom: var(--gap); }
+ #waveformCanvas {
+ position: absolute;
+ left: 0;
+ top: 0;
+ height: 80px;
+ display: block;
+ z-index: 2;
+ }
+
+ .waveform-cursor {
+ position: absolute;
+ top: 0;
+ bottom: 0;
+ width: 1px;
+ background: rgba(78, 201, 176, 0.6);
+ pointer-events: none;
+ z-index: 3;
+ display: none;
+ }
+
+ .waveform-tooltip {
+ position: absolute;
+ background: rgba(30, 30, 30, 0.95);
+ color: var(--text-primary);
+ padding: 6px 10px;
+ border-radius: 4px;
+ font-size: 12px;
+ pointer-events: none;
+ z-index: 4;
+ display: none;
+ white-space: nowrap;
+ border: 1px solid var(--border-color);
+ box-shadow: 0 2px 8px rgba(0,0,0,0.3);
+ }
+
+ .playback-indicator {
+ position: absolute;
+ top: 0;
+ bottom: 0;
+ left: 20px;
+ width: 2px;
+ background: var(--accent-red);
+ box-shadow: 0 0 4px rgba(244, 135, 113, 0.8);
+ pointer-events: none;
+ z-index: 110;
+ display: none;
+ }
- button, .file-label { background: var(--accent-blue); color: white; border: none; padding: 10px 20px; border-radius: var(--radius); cursor: pointer; font-size: 14px; display: inline-block; }
- button:hover, .file-label:hover { background: var(--accent-blue-hover); }
- button:disabled { background: var(--bg-light); cursor: not-allowed; }
- input[type="file"] { display: none; }
+ .time-markers {
+ position: relative;
+ height: 30px;
+ margin-top: var(--gap);
+ border-bottom: 1px solid var(--bg-light);
+ }
- .checkbox-label { display: flex; align-items: center; gap: 8px; cursor: pointer; user-select: none; }
- .checkbox-label input[type="checkbox"] { cursor: pointer; }
+ .time-marker {
+ position: absolute;
+ top: 0;
+ font-size: 12px;
+ color: var(--text-muted);
+ }
- .timeline-container { background: var(--bg-medium); border-radius: 8px; position: relative; height: calc(100vh - 280px); min-height: 500px; display: flex; flex-direction: column; }
- .timeline-content { flex: 1; overflow: auto; position: relative; padding: 0 20px 20px 20px; scrollbar-width: none; -ms-overflow-style: none; }
- .timeline-content::-webkit-scrollbar { display: none; }
- .timeline { position: relative; min-height: 100%; }
+ .time-marker::before {
+ content: '';
+ position: absolute;
+ left: 0;
+ top: 20px;
+ width: 1px;
+ height: 10px;
+ background: var(--bg-light);
+ }
- .sticky-header { position: sticky; top: 0; background: var(--bg-medium); z-index: 100; padding: 20px 20px 10px 20px; border-bottom: 2px solid var(--bg-light); flex-shrink: 0; }
- .waveform-container { position: relative; height: 80px; overflow: hidden; background: rgba(0, 0, 0, 0.3); border-radius: var(--radius); cursor: crosshair; }
- #cpuLoadCanvas { position: absolute; left: 0; bottom: 0; height: 10px; display: block; z-index: 1; }
- #waveformCanvas { position: absolute; left: 0; top: 0; height: 80px; display: block; z-index: 2; }
+ .time-marker::after {
+ content: '';
+ position: absolute;
+ left: 0;
+ top: 30px;
+ width: 1px;
+ height: 10000px;
+ background: rgba(100, 100, 60, 0.9);
+ pointer-events: none;
+ }
- .playback-indicator { position: absolute; top: 0; bottom: 0; left: 20px; width: 2px; background: var(--accent-red); box-shadow: 0 0 4px rgba(244, 135, 113, 0.8); pointer-events: none; z-index: 110; display: none; }
+ .sequence {
+ position: absolute;
+ background: #264f78;
+ border: 2px solid var(--accent-blue);
+ border-radius: var(--radius);
+ padding: 8px;
+ cursor: move;
+ min-height: 40px;
+ transition: box-shadow 0.2s;
+ }
- .time-markers { position: relative; height: 30px; margin-top: var(--gap); border-bottom: 1px solid var(--bg-light); }
- .time-marker { position: absolute; top: 0; font-size: 12px; color: var(--text-muted); }
- .time-marker::before { content: ''; position: absolute; left: 0; top: 20px; width: 1px; height: 10px; background: var(--bg-light); }
- .time-marker::after { content: ''; position: absolute; left: 0; top: 30px; width: 1px; height: 10000px; background: rgba(60, 60, 60, 0.2); pointer-events: none; }
+ .sequence:hover {
+ box-shadow: 0 0 10px rgba(14, 99, 156, 0.5);
+ }
+
+ .sequence.selected {
+ border-color: var(--accent-green);
+ box-shadow: 0 0 10px rgba(78, 201, 176, 0.5);
+ }
+
+ .sequence.collapsed {
+ overflow: hidden !important;
+ background: #1a3a4a !important;
+ }
+
+ .sequence.collapsed .sequence-name {
+ display: none !important;
+ }
+
+ .sequence.active-playing {
+ border-color: var(--accent-green);
+ background: #2a5f4a;
+ }
+
+ .sequence.active-flash {
+ animation: sequenceFlash 0.6s ease-out;
+ }
- .sequence { position: absolute; background: #264f78; border: 2px solid var(--accent-blue); border-radius: var(--radius); padding: 8px; cursor: move; min-height: 40px; transition: box-shadow 0.2s; }
- .sequence:hover { box-shadow: 0 0 10px rgba(14, 99, 156, 0.5); }
- .sequence.selected { border-color: var(--accent-green); box-shadow: 0 0 10px rgba(78, 201, 176, 0.5); }
- .sequence.collapsed { overflow: hidden !important; background: #1a3a4a !important; }
- .sequence.collapsed .sequence-name { display: none !important; }
- .sequence.active-playing { border-color: var(--accent-green); background: #2a5f4a; }
- .sequence.active-flash { animation: sequenceFlash 0.6s ease-out; }
@keyframes sequenceFlash {
- 0% { box-shadow: 0 0 20px rgba(78, 201, 176, 0.8); border-color: var(--accent-green); }
- 100% { box-shadow: 0 0 10px rgba(14, 99, 156, 0.5); border-color: var(--accent-blue); }
+ 0% {
+ box-shadow: 0 0 20px rgba(78, 201, 176, 0.8);
+ border-color: var(--accent-green);
+ }
+ 100% {
+ box-shadow: 0 0 10px rgba(14, 99, 156, 0.5);
+ border-color: var(--accent-blue);
+ }
+ }
+
+ .sequence-header {
+ position: absolute;
+ top: 0;
+ left: 0;
+ right: 0;
+ padding: 8px;
+ z-index: 5;
+ cursor: move;
+ user-select: none;
+ }
+
+ .sequence-header-name {
+ font-size: 14px;
+ font-weight: bold;
+ color: #ffffff;
}
- .sequence-header { position: absolute; top: 0; left: 0; right: 0; padding: 8px; z-index: 5; cursor: move; user-select: none; }
- .sequence-header-name { font-size: 14px; font-weight: bold; color: #ffffff; }
- .sequence:not(.collapsed) .sequence-header-name { display: none; }
- .sequence-name { position: absolute; top: 50%; left: 50%; transform: translate(-50%, -50%); font-size: 24px; font-weight: bold; color: #ffffff; text-shadow: 2px 2px 8px rgba(0, 0, 0, 0.9), -1px -1px 4px rgba(0, 0, 0, 0.7); pointer-events: none; white-space: nowrap; opacity: 1; transition: opacity 0.3s ease; z-index: 10; }
- .sequence.hovered .sequence-name { opacity: 0; }
+ .sequence:not(.collapsed) .sequence-header-name {
+ display: none;
+ }
+
+ .sequence-name {
+ position: absolute;
+ top: 50%;
+ left: 50%;
+ transform: translate(-50%, -50%);
+ font-size: 24px;
+ font-weight: bold;
+ color: #ffffff;
+ text-shadow: 2px 2px 8px rgba(0, 0, 0, 0.9), -1px -1px 4px rgba(0, 0, 0, 0.7);
+ pointer-events: none;
+ white-space: nowrap;
+ opacity: 1;
+ transition: opacity 0.3s ease;
+ z-index: 10;
+ }
- .effect { position: absolute; background: #3a3d41; border: 1px solid var(--border-color); border-radius: 3px; padding: 4px 8px; cursor: move; font-size: 11px; transition: box-shadow 0.2s; display: flex; align-items: center; white-space: nowrap; overflow: hidden; text-overflow: ellipsis; }
- .effect:hover { box-shadow: 0 0 8px rgba(133, 133, 133, 0.5); background: #45484d; }
- .effect.selected { border-color: var(--accent-orange); box-shadow: 0 0 8px rgba(206, 145, 120, 0.5); }
- .effect.conflict { background: #4a1d1d; border-color: var(--accent-red); box-shadow: 0 0 8px rgba(244, 135, 113, 0.6); }
- .effect.conflict:hover { background: #5a2424; }
- .effect-handle { position: absolute; top: 0; width: 6px; height: 100%; background: rgba(78, 201, 176, 0.8); cursor: ew-resize; display: none; z-index: 10; }
- .effect.selected .effect-handle { display: block; }
- .effect-handle.left { left: 0; border-radius: 3px 0 0 3px; }
- .effect-handle.right { right: 0; border-radius: 0 3px 3px 0; }
- .effect-handle:hover { background: var(--accent-green); width: 8px; }
+ .sequence.hovered .sequence-name {
+ opacity: 0;
+ }
- .properties-panel { position: fixed; bottom: 20px; left: 20px; width: 350px; max-height: 80vh; background: var(--bg-medium); padding: 15px; border-radius: 8px; box-shadow: 0 4px 12px rgba(0, 0, 0, 0.5); z-index: 1000; overflow-y: auto; transition: transform 0.3s ease; }
- .properties-panel.collapsed { transform: translateY(calc(100% + 40px)); }
- .panel-header { display: flex; justify-content: space-between; align-items: center; margin-bottom: 15px; padding-bottom: 10px; border-bottom: 1px solid var(--bg-light); }
- .panel-header h2 { margin: 0; color: var(--accent-green); font-size: 16px; }
- .panel-toggle { background: transparent; border: 1px solid var(--border-color); color: var(--text-primary); padding: 4px 8px; border-radius: 3px; cursor: pointer; font-size: 12px; }
- .panel-toggle:hover { background: var(--bg-light); }
- .panel-collapse-btn { position: fixed; bottom: 20px; left: 20px; background: var(--bg-medium); border: 1px solid var(--border-color); color: var(--text-primary); padding: 8px 12px; border-radius: var(--radius); cursor: pointer; z-index: 999; box-shadow: 0 2px 6px rgba(0, 0, 0, 0.3); display: none; }
- .panel-collapse-btn:hover { background: var(--bg-light); }
- .panel-collapse-btn.visible { display: block; }
+ .effect {
+ position: absolute;
+ background: #3a3d41;
+ border: 1px solid var(--border-color);
+ border-radius: 3px;
+ padding: 4px 8px;
+ cursor: move;
+ font-size: 11px;
+ transition: box-shadow 0.2s;
+ display: flex;
+ align-items: center;
+ white-space: nowrap;
+ overflow: hidden;
+ text-overflow: ellipsis;
+ }
- .property-group { margin-bottom: 15px; }
- .property-group label { display: block; margin-bottom: 5px; color: var(--text-muted); font-size: 14px; }
- .property-group input, .property-group select { width: 100%; padding: 8px; background: var(--bg-light); border: 1px solid var(--border-color); border-radius: var(--radius); color: var(--text-primary); font-size: 14px; }
+ .effect:hover {
+ box-shadow: 0 0 8px rgba(133, 133, 133, 0.5);
+ background: #45484d;
+ }
- .stats { background: var(--bg-dark); padding: 10px; border-radius: var(--radius); margin-top: 10px; font-size: 12px; color: var(--text-muted); }
- #messageArea { position: fixed; top: 80px; right: 20px; z-index: 2000; max-width: 400px; }
- .error { background: #5a1d1d; color: var(--accent-red); padding: 10px; border-radius: var(--radius); box-shadow: 0 2px 8px rgba(0,0,0,0.3); }
- .success { background: #1e5231; color: #89d185; padding: 10px; border-radius: var(--radius); box-shadow: 0 2px 8px rgba(0,0,0,0.3); }
+ .effect.selected {
+ border-color: var(--accent-orange);
+ box-shadow: 0 0 8px rgba(206, 145, 120, 0.5);
+ }
+
+ .effect.conflict {
+ background: #4a1d1d;
+ border-color: var(--accent-red);
+ box-shadow: 0 0 8px rgba(244, 135, 113, 0.6);
+ }
+
+ .effect.conflict:hover {
+ background: #5a2424;
+ }
+
+ .effect-handle {
+ position: absolute;
+ top: 0;
+ width: 6px;
+ height: 100%;
+ background: rgba(78, 201, 176, 0.8);
+ cursor: ew-resize;
+ display: none;
+ z-index: 10;
+ }
+
+ .effect.selected .effect-handle {
+ display: block;
+ }
+
+ .effect-handle.left {
+ left: 0;
+ border-radius: 3px 0 0 3px;
+ }
+
+ .effect-handle.right {
+ right: 0;
+ border-radius: 0 3px 3px 0;
+ }
+
+ .effect-handle:hover {
+ background: var(--accent-green);
+ width: 8px;
+ }
+
+ .properties-panel {
+ position: fixed;
+ bottom: 20px;
+ left: 20px;
+ width: 350px;
+ max-height: 80vh;
+ background: var(--bg-medium);
+ padding: 15px;
+ border-radius: 8px;
+ box-shadow: 0 4px 12px rgba(0, 0, 0, 0.5);
+ z-index: 1000;
+ overflow-y: auto;
+ transition: transform 0.3s ease;
+ }
+
+ .properties-panel.collapsed {
+ transform: translateY(calc(100% + 40px));
+ }
+
+ .panel-header {
+ display: flex;
+ justify-content: space-between;
+ align-items: center;
+ margin-bottom: 15px;
+ padding-bottom: 10px;
+ border-bottom: 1px solid var(--bg-light);
+ }
+
+ .panel-header h2 {
+ margin: 0;
+ color: var(--accent-green);
+ font-size: 16px;
+ }
+
+ .panel-toggle {
+ background: transparent;
+ border: 1px solid var(--border-color);
+ color: var(--text-primary);
+ padding: 4px 8px;
+ border-radius: 3px;
+ cursor: pointer;
+ font-size: 12px;
+ }
+
+ .panel-toggle:hover {
+ background: var(--bg-light);
+ }
+
+ .panel-collapse-btn {
+ position: fixed;
+ bottom: 20px;
+ left: 20px;
+ background: var(--bg-medium);
+ border: 1px solid var(--border-color);
+ color: var(--text-primary);
+ padding: 8px 12px;
+ border-radius: var(--radius);
+ cursor: pointer;
+ z-index: 999;
+ box-shadow: 0 2px 6px rgba(0, 0, 0, 0.3);
+ display: none;
+ }
+
+ .panel-collapse-btn:hover {
+ background: var(--bg-light);
+ }
+
+ .panel-collapse-btn.visible {
+ display: block;
+ }
+
+ .property-group {
+ margin-bottom: 15px;
+ }
+
+ .property-group label {
+ display: block;
+ margin-bottom: 5px;
+ color: var(--text-muted);
+ font-size: 14px;
+ }
+
+ .property-group input,
+ .property-group select {
+ width: 100%;
+ }
+
+ .stats {
+ background: var(--bg-dark);
+ padding: 10px;
+ border-radius: var(--radius);
+ margin-top: 10px;
+ font-size: 12px;
+ color: var(--text-muted);
+ }
+
+ #messageArea {
+ position: fixed;
+ top: 80px;
+ right: 20px;
+ z-index: 2000;
+ max-width: 400px;
+ }
+
+ .error {
+ background: #5a1d1d;
+ color: var(--accent-red);
+ padding: 10px;
+ border-radius: var(--radius);
+ box-shadow: 0 2px 8px rgba(0, 0, 0, 0.3);
+ }
+
+ .success {
+ background: #1e5231;
+ color: #89d185;
+ padding: 10px;
+ border-radius: var(--radius);
+ box-shadow: 0 2px 8px rgba(0, 0, 0, 0.3);
+ }
</style>
</head>
<body>
@@ -149,12 +510,14 @@
<div id="messageArea"></div>
- <div class="timeline-container">
+ <div class="timeline-container" id="timelineContainer">
<div class="playback-indicator" id="playbackIndicator"></div>
<div class="sticky-header">
<div class="waveform-container" id="waveformContainer">
<canvas id="cpuLoadCanvas"></canvas>
<canvas id="waveformCanvas"></canvas>
+ <div class="waveform-cursor" id="waveformCursor"></div>
+ <div class="waveform-tooltip" id="waveformTooltip"></div>
</div>
<div class="time-markers" id="timeMarkers"></div>
</div>
@@ -176,16 +539,15 @@
<div class="stats" id="stats"></div>
</div>
- <script>
+ <script type="module">
+ import { ViewportController } from './timeline-viewport.js';
+ import { PlaybackController } from './timeline-playback.js';
+
// Constants
const POST_PROCESS_EFFECTS = new Set(['FadeEffect', 'FlashEffect', 'GaussianBlurEffect',
'SolarizeEffect', 'VignetteEffect', 'ChromaAberrationEffect', 'DistortEffect',
'ThemeModulationEffect', 'CNNEffect', 'CNNv2Effect']);
- const TIMELINE_LEFT_PADDING = 20;
- const SCROLL_VIEWPORT_FRACTION = 0.4;
- const SMOOTH_SCROLL_SPEED = 0.1;
- const VERTICAL_SCROLL_SPEED = 0.3;
const SEQUENCE_GAP = 10;
const SEQUENCE_DEFAULT_WIDTH = 10;
const SEQUENCE_DEFAULT_DURATION = 16;
@@ -195,21 +557,29 @@
const SEQUENCE_BOTTOM_PADDING = 5;
const EFFECT_SPACING = 30;
const EFFECT_HEIGHT = 26;
- const WAVEFORM_AMPLITUDE_SCALE = 0.4;
+
+ // BPM computation helper
+ const computeBPMValues = (bpm) => ({
+ secondsPerBeat: 60.0 / bpm,
+ beatsPerSecond: bpm / 60.0
+ });
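The BPM helper added here is a pure function and easy to sanity-check on its own. A minimal standalone sketch (duplicating the helper so it runs outside the editor): at any tempo, `secondsPerBeat` and `beatsPerSecond` are reciprocals.

```javascript
// Standalone mirror of the computeBPMValues helper above.
const computeBPMValues = (bpm) => ({
  secondsPerBeat: 60.0 / bpm,
  beatsPerSecond: bpm / 60.0
});

const { secondsPerBeat, beatsPerSecond } = computeBPMValues(120);
console.assert(secondsPerBeat === 0.5);  // 120 BPM: half a second per beat
console.assert(beatsPerSecond === 2);    // ...and two beats per second
// The two values are reciprocals by construction.
console.assert(Math.abs(secondsPerBeat * beatsPerSecond - 1) < 1e-12);
```

Precomputing both directions once per BPM change (via `updateBPM` below) avoids repeating the division in every `beatsToTime`/`timeToBeats` call.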
// State
+ const DEFAULT_BPM = 120;
const state = {
- sequences: [], currentFile: null, selectedItem: null, pixelsPerSecond: 100,
- showBeats: true, quantizeUnit: 1, bpm: 120, isDragging: false, dragOffset: { x: 0, y: 0 },
+ sequences: [], currentFile: null, selectedItem: null, pixelsPerBeat: 100,
+ showBeats: true, quantizeUnit: 1, bpm: DEFAULT_BPM, isDragging: false, dragOffset: { x: 0, y: 0 },
lastActiveSeqIndex: -1, isDraggingHandle: false, handleType: null, handleDragOffset: 0,
- audioBuffer: null, audioDuration: 0, audioSource: null, audioContext: null,
+ audioBuffer: null, audioDurationSeconds: 0, audioSource: null, audioContext: null,
isPlaying: false, playbackStartTime: 0, playbackOffset: 0, playStartPosition: 0, animationFrameId: null,
- lastExpandedSeqIndex: -1, dragMoved: false
+ lastExpandedSeqIndex: -1, dragMoved: false,
+ ...computeBPMValues(DEFAULT_BPM)
};
// DOM
const dom = {
timeline: document.getElementById('timeline'),
+ timelineContainer: document.getElementById('timelineContainer'),
timelineContent: document.getElementById('timelineContent'),
fileInput: document.getElementById('fileInput'),
saveBtn: document.getElementById('saveBtn'),
@@ -238,7 +608,9 @@
bpmSlider: document.getElementById('bpmSlider'),
currentBPM: document.getElementById('currentBPM'),
showBeatsCheckbox: document.getElementById('showBeatsCheckbox'),
- quantizeSelect: document.getElementById('quantizeSelect')
+ quantizeSelect: document.getElementById('quantizeSelect'),
+ waveformCursor: document.getElementById('waveformCursor'),
+ waveformTooltip: document.getElementById('waveformTooltip')
};
// Parser
@@ -247,7 +619,7 @@
let currentSequence = null, bpm = 120, currentPriority = 0;
const parseTime = (timeStr) => {
- if (timeStr.endsWith('s')) return parseFloat(timeStr.slice(0, -1)) * bpm / 60.0;
+ if (timeStr.endsWith('s')) return parseFloat(timeStr.slice(0, -1)) * bpm / 60.0; // uses the parser-local bpm, not state.bpm
if (timeStr.endsWith('b')) return parseFloat(timeStr.slice(0, -1));
return parseFloat(timeStr);
};
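The suffix rules in `parseTime` can be exercised in isolation. This sketch duplicates the helper with `bpm` fixed at 120 for illustration (the real parser uses the file-local `bpm` it has read so far):

```javascript
// Standalone copy of parseTime: 's' suffix means seconds (converted to
// beats via bpm / 60); 'b' suffix or a bare number is already in beats.
const bpm = 120;
const parseTime = (timeStr) => {
  if (timeStr.endsWith('s')) return parseFloat(timeStr.slice(0, -1)) * bpm / 60.0;
  if (timeStr.endsWith('b')) return parseFloat(timeStr.slice(0, -1));
  return parseFloat(timeStr);
};

console.assert(parseTime('2s') === 4);   // 2 seconds at 120 BPM = 4 beats
console.assert(parseTime('4b') === 4);   // explicit beats pass through
console.assert(parseTime('4') === 4);    // bare numbers are beats
```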
@@ -293,14 +665,25 @@
}
// Helpers
- const beatsToTime = (beats) => beats * 60.0 / state.bpm;
- const timeToBeats = (seconds) => seconds * state.bpm / 60.0;
+ const updateBPM = (newBPM) => {
+ state.bpm = newBPM;
+ Object.assign(state, computeBPMValues(newBPM));
+ };
+ const beatsToTime = (beats) => beats * state.secondsPerBeat;
+ const timeToBeats = (seconds) => seconds * state.beatsPerSecond;
const beatRange = (start, end) => {
const s = start.toFixed(1), e = end.toFixed(1);
const ss = beatsToTime(start).toFixed(1), es = beatsToTime(end).toFixed(1);
return state.showBeats ? `${s}-${e}b (${ss}-${es}s)` : `${ss}-${es}s (${s}-${e}b)`;
};
+ // Utilities
+ function showMessage(text, type) {
+ if (type === 'error') console.error(text);
+ dom.messageArea.innerHTML = `<div class="${type}">${text}</div>`;
+ setTimeout(() => dom.messageArea.innerHTML = '', 3000);
+ }
+
function detectConflicts(seq) {
const conflicts = new Set();
const priorityGroups = {};
@@ -334,84 +717,16 @@
return output;
}
- // Audio
- async function loadAudioFile(file) {
- try {
- const arrayBuffer = await file.arrayBuffer();
- if (!state.audioContext) state.audioContext = new (window.AudioContext || window.webkitAudioContext)();
- state.audioBuffer = await state.audioContext.decodeAudioData(arrayBuffer);
- state.audioDuration = state.audioBuffer.duration;
- renderWaveform();
- dom.playbackControls.style.display = 'flex';
- dom.playbackIndicator.style.display = 'block';
- dom.clearAudioBtn.disabled = false;
- dom.replayBtn.disabled = false;
- showMessage(`Audio loaded: ${state.audioDuration.toFixed(2)}s`, 'success');
- renderTimeline();
- } catch (err) {
- showMessage(`Error loading audio: ${err.message}`, 'error');
- }
- }
-
- function renderWaveform() {
- if (!state.audioBuffer) return;
- const canvas = dom.waveformCanvas, ctx = canvas.getContext('2d');
-
- // Calculate maxTime same as timeline to ensure alignment
- let maxTime = 60;
- for (const seq of state.sequences) {
- maxTime = Math.max(maxTime, seq.startTime + SEQUENCE_DEFAULT_DURATION);
- for (const effect of seq.effects) maxTime = Math.max(maxTime, seq.startTime + effect.endTime);
- }
- if (state.audioDuration > 0) maxTime = Math.max(maxTime, state.audioDuration * state.bpm / 60.0);
-
- const w = maxTime * state.pixelsPerSecond, h = 80;
- canvas.width = w; canvas.height = h;
- canvas.style.width = `${w}px`; canvas.style.height = `${h}px`;
- ctx.fillStyle = 'rgba(0, 0, 0, 0.3)'; ctx.fillRect(0, 0, w, h);
-
- const channelData = state.audioBuffer.getChannelData(0);
- const audioBeats = timeToBeats(state.audioDuration);
- const audioPixelWidth = audioBeats * state.pixelsPerSecond;
- const samplesPerPixel = Math.ceil(channelData.length / audioPixelWidth);
- const centerY = h / 2, amplitudeScale = h * WAVEFORM_AMPLITUDE_SCALE;
-
- ctx.strokeStyle = '#4ec9b0'; ctx.lineWidth = 1; ctx.beginPath();
- for (let x = 0; x < audioPixelWidth; x++) {
- const start = Math.floor(x * samplesPerPixel);
- const end = Math.min(start + samplesPerPixel, channelData.length);
- let min = 1.0, max = -1.0;
- for (let i = start; i < end; i++) {
- min = Math.min(min, channelData[i]);
- max = Math.max(max, channelData[i]);
- }
- const yMin = centerY - min * amplitudeScale, yMax = centerY - max * amplitudeScale;
- x === 0 ? ctx.moveTo(x, yMin) : ctx.lineTo(x, yMin);
- ctx.lineTo(x, yMax);
- }
- ctx.stroke();
- ctx.strokeStyle = 'rgba(255, 255, 255, 0.1)';
- ctx.beginPath(); ctx.moveTo(0, centerY); ctx.lineTo(audioPixelWidth, centerY); ctx.stroke();
-
- // Draw beat markers across full maxTime width
- ctx.strokeStyle = 'rgba(255, 255, 255, 0.15)';
- ctx.lineWidth = 1;
- for (let beat = 0; beat <= maxTime; beat++) {
- const x = beat * state.pixelsPerSecond;
- ctx.beginPath();
- ctx.moveTo(x, 0);
- ctx.lineTo(x, h);
- ctx.stroke();
- }
- }
+ // Controllers - initialized after DOM setup
+ let viewportController, playbackController;
function computeCPULoad() {
- if (state.sequences.length === 0) return { maxTime: 60, loads: [], conflicts: [] };
- let maxTime = Math.max(60, ...state.sequences.flatMap(seq =>
+ if (state.sequences.length === 0) return { maxTimeBeats: 60, loads: [], conflicts: [] };
+ let maxTimeBeats = Math.max(60, ...state.sequences.flatMap(seq =>
seq.effects.map(eff => seq.startTime + eff.endTime)));
- if (state.audioDuration > 0) maxTime = Math.max(maxTime, timeToBeats(state.audioDuration));
+ if (state.audioDurationSeconds > 0) maxTimeBeats = Math.max(maxTimeBeats, timeToBeats(state.audioDurationSeconds));
- const resolution = 0.1, numSamples = Math.ceil(maxTime / resolution);
+ const resolution = 0.1, numSamples = Math.ceil(maxTimeBeats / resolution);
const loads = new Array(numSamples).fill(0);
const conflicts = new Array(numSamples).fill(false);
@@ -462,19 +777,19 @@
});
});
- return { maxTime, loads, conflicts, resolution };
+ return { maxTimeBeats, loads, conflicts, resolution };
}
function renderCPULoad() {
const canvas = dom.cpuLoadCanvas, ctx = canvas.getContext('2d');
- const { maxTime, loads, conflicts, resolution } = computeCPULoad();
- const w = maxTime * state.pixelsPerSecond, h = 10;
+ const { maxTimeBeats, loads, conflicts, resolution } = computeCPULoad();
+ const w = maxTimeBeats * state.pixelsPerBeat, h = 10;
canvas.width = w; canvas.height = h;
canvas.style.width = `${w}px`; canvas.style.height = `${h}px`;
ctx.fillStyle = 'rgba(0, 0, 0, 0.3)'; ctx.fillRect(0, 0, w, h);
if (loads.length === 0) return;
- const barWidth = resolution * state.pixelsPerSecond;
+ const barWidth = resolution * state.pixelsPerBeat;
loads.forEach((load, i) => {
if (load === 0) return;
const n = Math.min(load / 8, 1.0);
@@ -487,114 +802,31 @@
});
}
- function clearAudio() {
- stopPlayback(); state.audioBuffer = null; state.audioDuration = 0; state.playbackOffset = 0;
- state.playStartPosition = 0;
- dom.playbackControls.style.display = 'none';
- dom.playbackIndicator.style.display = 'none';
- dom.clearAudioBtn.disabled = true;
- dom.replayBtn.disabled = true;
- const ctx = dom.waveformCanvas.getContext('2d');
- ctx.clearRect(0, 0, dom.waveformCanvas.width, dom.waveformCanvas.height);
- renderTimeline();
- showMessage('Audio cleared', 'success');
- }
-
- async function startPlayback() {
- if (!state.audioBuffer || !state.audioContext) return;
- if (state.audioSource) try { state.audioSource.stop(); } catch (e) {} state.audioSource = null;
- if (state.audioContext.state === 'suspended') await state.audioContext.resume();
- try {
- state.audioSource = state.audioContext.createBufferSource();
- state.audioSource.buffer = state.audioBuffer;
- state.audioSource.connect(state.audioContext.destination);
- state.audioSource.start(0, state.playbackOffset);
- state.playbackStartTime = state.audioContext.currentTime;
- state.isPlaying = true; dom.playPauseBtn.textContent = '⏸ Pause';
- updatePlaybackPosition();
- state.audioSource.onended = () => { if (state.isPlaying) stopPlayback(); };
- } catch (e) {
- console.error('Failed to start playback:', e); showMessage('Playback failed: ' + e.message, 'error');
- state.audioSource = null; state.isPlaying = false;
- }
- }
-
- function stopPlayback(savePosition = true) {
- if (state.audioSource) try { state.audioSource.stop(); } catch (e) {} state.audioSource = null;
- if (state.animationFrameId) { cancelAnimationFrame(state.animationFrameId); state.animationFrameId = null; }
- if (state.isPlaying && savePosition) {
- const elapsed = state.audioContext.currentTime - state.playbackStartTime;
- state.playbackOffset = Math.min(state.playbackOffset + elapsed, state.audioDuration);
- }
- state.isPlaying = false; dom.playPauseBtn.textContent = '▶ Play';
- }
-
- function updatePlaybackPosition() {
- if (!state.isPlaying) return;
- const elapsed = state.audioContext.currentTime - state.playbackStartTime;
- const currentTime = state.playbackOffset + elapsed;
- const currentBeats = timeToBeats(currentTime);
- dom.playbackTime.textContent = `${currentTime.toFixed(2)}s (${currentBeats.toFixed(2)}b)`;
- updateIndicatorPosition(currentBeats, true);
- expandSequenceAtTime(currentBeats);
- state.animationFrameId = requestAnimationFrame(updatePlaybackPosition);
- }
-
- function expandSequenceAtTime(currentBeats) {
- let activeSeqIndex = -1;
- for (let i = 0; i < state.sequences.length; i++) {
- const seq = state.sequences[i];
- const seqEndBeats = seq.startTime + (seq.effects.length > 0 ? Math.max(...seq.effects.map(e => e.endTime)) : 0);
- if (currentBeats >= seq.startTime && currentBeats <= seqEndBeats) { activeSeqIndex = i; break; }
- }
- if (activeSeqIndex !== state.lastExpandedSeqIndex) {
- const seqDivs = dom.timeline.querySelectorAll('.sequence');
- if (state.lastExpandedSeqIndex >= 0 && seqDivs[state.lastExpandedSeqIndex]) {
- seqDivs[state.lastExpandedSeqIndex].classList.remove('active-playing');
- }
- if (activeSeqIndex >= 0 && seqDivs[activeSeqIndex]) {
- seqDivs[activeSeqIndex].classList.add('active-playing');
- }
- state.lastExpandedSeqIndex = activeSeqIndex;
- }
- }
-
- function updateIndicatorPosition(beats, smoothScroll = false) {
- const timelineX = beats * state.pixelsPerSecond;
- const scrollLeft = dom.timelineContent.scrollLeft;
- dom.playbackIndicator.style.left = `${timelineX - scrollLeft + TIMELINE_LEFT_PADDING}px`;
- if (smoothScroll) {
- const targetScroll = timelineX - dom.timelineContent.clientWidth * SCROLL_VIEWPORT_FRACTION;
- const scrollDiff = targetScroll - scrollLeft;
- if (Math.abs(scrollDiff) > 5) dom.timelineContent.scrollLeft += scrollDiff * SMOOTH_SCROLL_SPEED;
- }
- }
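The smooth-scroll step deleted here (it now lives in `timeline-viewport.js`) is an exponential approach: each frame moves a fixed fraction (`SMOOTH_SCROLL_SPEED`) of the remaining distance toward the target, with a 5px dead zone to stop jitter. A standalone sketch of the convergence behaviour, with the frame loop simulated:

```javascript
// Exponential approach: each step covers 10% of the remaining distance,
// so the scroll position converges smoothly on the target. Updates stop
// once the remaining distance is within the 5px dead zone.
const SMOOTH_SCROLL_SPEED = 0.1;
let scrollLeft = 0;
const target = 1000;
for (let frame = 0; frame < 100; frame++) {
  const diff = target - scrollLeft;
  if (Math.abs(diff) > 5) scrollLeft += diff * SMOOTH_SCROLL_SPEED;
}
// After ~51 frames the remaining distance falls under the dead zone.
console.assert(Math.abs(target - scrollLeft) <= 5);
```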
-
// Render
function renderTimeline() {
renderCPULoad();
dom.timeline.innerHTML = ''; document.getElementById('timeMarkers').innerHTML = '';
- let maxTime = 60;
+ let maxTimeBeats = 60;
for (const seq of state.sequences) {
- maxTime = Math.max(maxTime, seq.startTime + SEQUENCE_DEFAULT_DURATION);
- for (const effect of seq.effects) maxTime = Math.max(maxTime, seq.startTime + effect.endTime);
+ maxTimeBeats = Math.max(maxTimeBeats, seq.startTime + SEQUENCE_DEFAULT_DURATION);
+ for (const effect of seq.effects) maxTimeBeats = Math.max(maxTimeBeats, seq.startTime + effect.endTime);
}
- if (state.audioDuration > 0) maxTime = Math.max(maxTime, state.audioDuration * state.bpm / 60.0);
- const timelineWidth = maxTime * state.pixelsPerSecond;
+ if (state.audioDurationSeconds > 0) maxTimeBeats = Math.max(maxTimeBeats, state.audioDurationSeconds * state.beatsPerSecond);
+ const timelineWidth = maxTimeBeats * state.pixelsPerBeat;
dom.timeline.style.width = `${timelineWidth}px`;
let totalTimelineHeight = 0;
const timeMarkers = document.getElementById('timeMarkers');
if (state.showBeats) {
- for (let beat = 0; beat <= maxTime; beat += 4) {
+ for (let beat = 0; beat <= maxTimeBeats; beat += 4) {
const marker = document.createElement('div');
- marker.className = 'time-marker'; marker.style.left = `${beat * state.pixelsPerSecond}px`;
+ marker.className = 'time-marker'; marker.style.left = `${beat * state.pixelsPerBeat}px`;
marker.textContent = `${beat}b`; timeMarkers.appendChild(marker);
}
} else {
- const maxSeconds = maxTime * 60.0 / state.bpm;
+ const maxSeconds = maxTimeBeats * state.secondsPerBeat;
for (let t = 0; t <= maxSeconds; t += 1) {
- const beatPos = t * state.bpm / 60.0, marker = document.createElement('div');
- marker.className = 'time-marker'; marker.style.left = `${beatPos * state.pixelsPerSecond}px`;
+ const beatPos = t * state.beatsPerSecond, marker = document.createElement('div');
+ marker.className = 'time-marker'; marker.style.left = `${beatPos * state.pixelsPerBeat}px`;
marker.textContent = `${t}s`; timeMarkers.appendChild(marker);
}
}
@@ -611,9 +843,9 @@
const numEffects = seq.effects.length;
const fullHeight = Math.max(SEQUENCE_MIN_HEIGHT, SEQUENCE_TOP_PADDING + numEffects * EFFECT_SPACING + SEQUENCE_BOTTOM_PADDING);
const seqHeight = seq._collapsed ? SEQUENCE_COLLAPSED_HEIGHT : fullHeight;
- seqDiv.style.left = `${seqVisualStart * state.pixelsPerSecond}px`;
+ seqDiv.style.left = `${seqVisualStart * state.pixelsPerBeat}px`;
seqDiv.style.top = `${cumulativeY}px`;
- seqDiv.style.width = `${(seqVisualEnd - seqVisualStart) * state.pixelsPerSecond}px`;
+ seqDiv.style.width = `${(seqVisualEnd - seqVisualStart) * state.pixelsPerBeat}px`;
seqDiv.style.height = `${seqHeight}px`; seqDiv.style.minHeight = `${seqHeight}px`; seqDiv.style.maxHeight = `${seqHeight}px`;
seq._yPosition = cumulativeY; cumulativeY += seqHeight + SEQUENCE_GAP; totalTimelineHeight = cumulativeY;
const seqHeaderDiv = document.createElement('div'); seqHeaderDiv.className = 'sequence-header';
@@ -640,9 +872,9 @@
if (conflicts.has(effectIndex)) effectDiv.classList.add('conflict');
Object.assign(effectDiv.dataset, { seqIndex, effectIndex });
Object.assign(effectDiv.style, {
- left: `${(seq.startTime + effect.startTime) * state.pixelsPerSecond}px`,
+ left: `${(seq.startTime + effect.startTime) * state.pixelsPerBeat}px`,
top: `${seq._yPosition + SEQUENCE_TOP_PADDING + effectIndex * EFFECT_SPACING}px`,
- width: `${(effect.endTime - effect.startTime) * state.pixelsPerSecond}px`,
+ width: `${(effect.endTime - effect.startTime) * state.pixelsPerBeat}px`,
height: `${EFFECT_HEIGHT}px`
});
effectDiv.innerHTML = `<div class="effect-handle left"></div><small>${effect.className}</small><div class="effect-handle right"></div>`;
@@ -685,13 +917,13 @@
if (!state.isDragging || !state.selectedItem) return;
state.dragMoved = true;
const containerRect = dom.timelineContent.getBoundingClientRect();
- let newTime = Math.max(0, (e.clientX - containerRect.left + dom.timelineContent.scrollLeft - state.dragOffset.x) / state.pixelsPerSecond);
- if (state.quantizeUnit > 0) newTime = Math.round(newTime * state.quantizeUnit) / state.quantizeUnit;
- if (state.selectedItem.type === 'sequence') state.sequences[state.selectedItem.index].startTime = newTime;
+ let newTimeBeats = Math.max(0, (e.clientX - containerRect.left + dom.timelineContent.scrollLeft - state.dragOffset.x) / state.pixelsPerBeat);
+ if (state.quantizeUnit > 0) newTimeBeats = Math.round(newTimeBeats * state.quantizeUnit) / state.quantizeUnit;
+ if (state.selectedItem.type === 'sequence') state.sequences[state.selectedItem.index].startTime = newTimeBeats;
else if (state.selectedItem.type === 'effect') {
const seq = state.sequences[state.selectedItem.seqIndex], effect = seq.effects[state.selectedItem.effectIndex];
- const duration = effect.endTime - effect.startTime, relativeTime = newTime - seq.startTime;
- effect.startTime = relativeTime; effect.endTime = effect.startTime + duration;
+ const durationBeats = effect.endTime - effect.startTime, relativeTimeBeats = newTimeBeats - seq.startTime;
+ effect.startTime = relativeTimeBeats; effect.endTime = effect.startTime + durationBeats;
}
renderTimeline(); updateProperties();
}
@@ -709,7 +941,7 @@
state.selectedItem = { type: 'effect', seqIndex, effectIndex, index: seqIndex };
const seq = state.sequences[seqIndex], effect = seq.effects[effectIndex];
const containerRect = dom.timelineContent.getBoundingClientRect();
- const mouseTimeBeats = (e.clientX - containerRect.left + dom.timelineContent.scrollLeft) / state.pixelsPerSecond;
+ const mouseTimeBeats = (e.clientX - containerRect.left + dom.timelineContent.scrollLeft) / state.pixelsPerBeat;
const handleTimeBeats = seq.startTime + (type === 'left' ? effect.startTime : effect.endTime);
state.handleDragOffset = handleTimeBeats - mouseTimeBeats;
document.addEventListener('mousemove', onHandleDrag); document.addEventListener('mouseup', stopHandleDrag);
@@ -718,13 +950,13 @@
function onHandleDrag(e) {
if (!state.isDraggingHandle || !state.selectedItem) return;
const containerRect = dom.timelineContent.getBoundingClientRect();
- let newTime = (e.clientX - containerRect.left + dom.timelineContent.scrollLeft) / state.pixelsPerSecond + state.handleDragOffset;
- newTime = Math.max(0, newTime);
- if (state.quantizeUnit > 0) newTime = Math.round(newTime * state.quantizeUnit) / state.quantizeUnit;
+ let newTimeBeats = (e.clientX - containerRect.left + dom.timelineContent.scrollLeft) / state.pixelsPerBeat + state.handleDragOffset;
+ newTimeBeats = Math.max(0, newTimeBeats);
+ if (state.quantizeUnit > 0) newTimeBeats = Math.round(newTimeBeats * state.quantizeUnit) / state.quantizeUnit;
const seq = state.sequences[state.selectedItem.seqIndex], effect = seq.effects[state.selectedItem.effectIndex];
- const relativeTime = newTime - seq.startTime;
- if (state.handleType === 'left') effect.startTime = Math.min(relativeTime, effect.endTime - 0.1);
- else if (state.handleType === 'right') effect.endTime = Math.max(effect.startTime + 0.1, relativeTime);
+ const relativeTimeBeats = newTimeBeats - seq.startTime;
+ if (state.handleType === 'left') effect.startTime = Math.min(relativeTimeBeats, effect.endTime - 0.1);
+ else if (state.handleType === 'right') effect.endTime = Math.max(effect.startTime + 0.1, relativeTimeBeats);
renderTimeline(); updateProperties();
}
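Both drag paths (`onDrag` and `onHandleDrag`) snap with the same rounding. Extracted as a standalone helper, where `quantizeUnit` is subdivisions per beat and 0 disables snapping:

```javascript
// Snap a beat position to the nearest 1/quantizeUnit of a beat,
// matching the rounding used in onDrag and onHandleDrag above.
const quantize = (beats, quantizeUnit) =>
  quantizeUnit > 0 ? Math.round(beats * quantizeUnit) / quantizeUnit : beats;

console.assert(quantize(3.26, 4) === 3.25);  // quarter-beat grid
console.assert(quantize(3.7, 1) === 4);      // whole-beat grid rounds up here
console.assert(quantize(3.7, 0) === 3.7);    // 0 disables quantization
```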
@@ -762,8 +994,8 @@
const samePriority = effect.priorityModifier === '=';
dom.propertiesContent.innerHTML = `
<div class="property-group"><label>Effect Class</label><input type="text" id="propClassName" value="${effect.className}"></div>
- <div class="property-group"><label>Start Time (relative to sequence)</label><input type="number" id="propStartTime" value="${effect.startTime}" step="0.1"></div>
- <div class="property-group"><label>End Time (relative to sequence)</label><input type="number" id="propEndTime" value="${effect.endTime}" step="0.1"></div>
+ <div class="property-group"><label>Start Time (beats, relative to sequence)</label><input type="number" id="propStartTime" value="${effect.startTime}" step="0.1"></div>
+ <div class="property-group"><label>End Time (beats, relative to sequence)</label><input type="number" id="propEndTime" value="${effect.endTime}" step="0.1"></div>
<div class="property-group"><label>Constructor Arguments</label><input type="text" id="propArgs" value="${effect.args || ''}"></div>
<div class="property-group"><label>Stack Position (determines priority)</label>
<div style="display: flex; gap: 5px; margin-bottom: 10px;">
@@ -826,18 +1058,11 @@
updateProperties();
}
- // Utilities
- function showMessage(text, type) {
- if (type === 'error') console.error(text);
- dom.messageArea.innerHTML = `<div class="${type}">${text}</div>`;
- setTimeout(() => dom.messageArea.innerHTML = '', 3000);
- }
-
function updateStats() {
const effectCount = state.sequences.reduce((sum, seq) => sum + seq.effects.length, 0);
- const maxTime = Math.max(0, ...state.sequences.flatMap(seq =>
+ const maxTimeBeats = Math.max(0, ...state.sequences.flatMap(seq =>
seq.effects.map(e => seq.startTime + e.endTime).concat(seq.startTime)));
- dom.stats.innerHTML = `📊 Sequences: ${state.sequences.length} | 🎬 Effects: ${effectCount} | ⏱️ Duration: ${maxTime.toFixed(2)}s`;
+ dom.stats.innerHTML = `📊 Sequences: ${state.sequences.length} | 🎬 Effects: ${effectCount} | ⏱️ Duration: ${maxTimeBeats.toFixed(2)}b (${beatsToTime(maxTimeBeats).toFixed(2)}s)`;
}
async function loadFromURLParams() {
@@ -848,21 +1073,22 @@
const response = await fetch(seqURL);
if (!response.ok) throw new Error(`HTTP ${response.status}`);
const content = await response.text(), parsed = parseSeqFile(content);
- state.sequences = parsed.sequences; state.bpm = parsed.bpm;
+ state.sequences = parsed.sequences;
+ updateBPM(parsed.bpm);
dom.currentBPM.value = state.bpm; dom.bpmSlider.value = state.bpm;
state.currentFile = seqURL.split('/').pop();
state.playbackOffset = 0;
renderTimeline(); dom.saveBtn.disabled = false; dom.addSequenceBtn.disabled = false; dom.reorderBtn.disabled = false;
- updateIndicatorPosition(0, false);
+ if (viewportController) viewportController.updateIndicatorPosition(0, false);
showMessage(`Loaded ${state.currentFile} from URL`, 'success');
} catch (err) { showMessage(`Error loading seq file: ${err.message}`, 'error'); }
}
- if (wavURL) {
+ if (wavURL && playbackController) {
try {
const response = await fetch(wavURL);
if (!response.ok) throw new Error(`HTTP ${response.status}`);
const blob = await response.blob(), file = new File([blob], wavURL.split('/').pop(), { type: 'audio/wav' });
- await loadAudioFile(file);
+ await playbackController.loadAudioFile(file);
} catch (err) { showMessage(`Error loading audio file: ${err.message}`, 'error'); }
}
}
@@ -876,11 +1102,12 @@
reader.onload = e => {
try {
const parsed = parseSeqFile(e.target.result);
- state.sequences = parsed.sequences; state.bpm = parsed.bpm;
+ state.sequences = parsed.sequences;
+ updateBPM(parsed.bpm);
dom.currentBPM.value = state.bpm; dom.bpmSlider.value = state.bpm;
state.playbackOffset = 0;
renderTimeline(); dom.saveBtn.disabled = false; dom.addSequenceBtn.disabled = false; dom.reorderBtn.disabled = false;
- updateIndicatorPosition(0, false);
+ if (viewportController) viewportController.updateIndicatorPosition(0, false);
showMessage(`Loaded ${state.currentFile} - ${state.sequences.length} sequences`, 'success');
} catch (err) { showMessage(`Error parsing file: ${err.message}`, 'error'); }
};
@@ -894,41 +1121,7 @@
showMessage('File saved', 'success');
});
- dom.audioInput.addEventListener('change', e => { const file = e.target.files[0]; if (file) loadAudioFile(file); });
- dom.clearAudioBtn.addEventListener('click', () => { clearAudio(); dom.audioInput.value = ''; });
- dom.playPauseBtn.addEventListener('click', async () => {
- if (state.isPlaying) stopPlayback();
- else {
- if (state.playbackOffset >= state.audioDuration) state.playbackOffset = 0;
- state.playStartPosition = state.playbackOffset;
- await startPlayback();
- }
- });
-
- dom.replayBtn.addEventListener('click', async () => {
- stopPlayback(false);
- state.playbackOffset = state.playStartPosition;
- const replayBeats = timeToBeats(state.playbackOffset);
- dom.playbackTime.textContent = `${state.playbackOffset.toFixed(2)}s (${replayBeats.toFixed(2)}b)`;
- updateIndicatorPosition(replayBeats, false);
- await startPlayback();
- });
-
- dom.waveformContainer.addEventListener('click', async e => {
- if (!state.audioBuffer) return;
- const rect = dom.waveformContainer.getBoundingClientRect();
- const canvasOffset = parseFloat(dom.waveformCanvas.style.left) || 0;
- const clickX = e.clientX - rect.left - canvasOffset;
- const clickBeats = clickX / state.pixelsPerSecond;
- const clickTime = beatsToTime(clickBeats);
- const wasPlaying = state.isPlaying;
- if (wasPlaying) stopPlayback(false);
- state.playbackOffset = Math.max(0, Math.min(clickTime, state.audioDuration));
- const pausedBeats = timeToBeats(state.playbackOffset);
- dom.playbackTime.textContent = `${state.playbackOffset.toFixed(2)}s (${pausedBeats.toFixed(2)}b)`;
- updateIndicatorPosition(pausedBeats, false);
- if (wasPlaying) await startPlayback();
- });
+ // Audio/playback event handlers - managed by PlaybackController
dom.addSequenceBtn.addEventListener('click', () => {
state.sequences.push({ type: 'sequence', startTime: 0, priority: 0, effects: [], _collapsed: true });
@@ -963,30 +1156,24 @@
showMessage('Sequences re-ordered by start time', 'success');
});
- dom.zoomSlider.addEventListener('input', e => {
- state.pixelsPerSecond = parseInt(e.target.value);
- dom.zoomLevel.textContent = `${state.pixelsPerSecond}%`;
- if (state.audioBuffer) renderWaveform();
- renderTimeline();
- updateIndicatorPosition(timeToBeats(state.playbackOffset), false);
- });
+ // Zoom handler - managed by ViewportController
dom.bpmSlider.addEventListener('input', e => {
- state.bpm = parseInt(e.target.value);
+ updateBPM(parseInt(e.target.value));
dom.currentBPM.value = state.bpm;
- if (state.audioBuffer) renderWaveform();
+ if (state.audioBuffer && playbackController) playbackController.renderWaveform();
renderTimeline();
- updateIndicatorPosition(timeToBeats(state.playbackOffset), false);
+ if (viewportController) viewportController.updateIndicatorPosition(timeToBeats(state.playbackOffset), false);
});
dom.currentBPM.addEventListener('change', e => {
const bpm = parseInt(e.target.value);
if (!isNaN(bpm) && bpm >= 60 && bpm <= 200) {
- state.bpm = bpm;
+ updateBPM(bpm);
dom.bpmSlider.value = bpm;
- if (state.audioBuffer) renderWaveform();
+ if (state.audioBuffer && playbackController) playbackController.renderWaveform();
renderTimeline();
- updateIndicatorPosition(timeToBeats(state.playbackOffset), false);
+ if (viewportController) viewportController.updateIndicatorPosition(timeToBeats(state.playbackOffset), false);
} else {
e.target.value = state.bpm;
}
@@ -998,22 +1185,15 @@
dom.panelCollapseBtn.addEventListener('click', () => { dom.propertiesPanel.classList.remove('collapsed'); dom.panelCollapseBtn.classList.remove('visible'); dom.panelToggle.textContent = '▼ Collapse'; });
dom.timeline.addEventListener('click', () => { state.selectedItem = null; dom.deleteBtn.disabled = true; dom.addEffectBtn.disabled = true; renderTimeline(); updateProperties(); });
- dom.timeline.addEventListener('dblclick', async e => {
+ dom.timeline.addEventListener('dblclick', e => {
if (e.target !== dom.timeline) return;
+ if (!playbackController || !state.audioBuffer) return;
const containerRect = dom.timelineContent.getBoundingClientRect();
- const clickX = e.clientX - containerRect.left + dom.timelineContent.scrollLeft - TIMELINE_LEFT_PADDING;
- const clickBeats = clickX / state.pixelsPerSecond;
+ const clickX = e.clientX - containerRect.left + dom.timelineContent.scrollLeft - viewportController.TIMELINE_LEFT_PADDING;
+ const clickBeats = clickX / state.pixelsPerBeat;
const clickTime = beatsToTime(clickBeats);
- if (state.audioBuffer) {
- const wasPlaying = state.isPlaying;
- if (wasPlaying) stopPlayback(false);
- state.playbackOffset = Math.max(0, Math.min(clickTime, state.audioDuration));
- const pausedBeats = timeToBeats(state.playbackOffset);
- dom.playbackTime.textContent = `${state.playbackOffset.toFixed(2)}s (${pausedBeats.toFixed(2)}b)`;
- updateIndicatorPosition(pausedBeats, false);
- if (wasPlaying) await startPlayback();
- showMessage(`Seek to ${clickTime.toFixed(2)}s (${clickBeats.toFixed(2)}b)`, 'success');
- }
+ const result = playbackController.seekTo(clickBeats, clickTime);
+ if (result) showMessage(`Seek to ${result.clickTime.toFixed(2)}s (${result.clickBeats.toFixed(2)}b)`, 'success');
});
document.addEventListener('keydown', e => {
@@ -1030,51 +1210,21 @@
}
});
- dom.timelineContent.addEventListener('scroll', () => {
- const scrollLeft = dom.timelineContent.scrollLeft;
- dom.cpuLoadCanvas.style.left = `-${scrollLeft}px`;
- dom.waveformCanvas.style.left = `-${scrollLeft}px`;
- document.getElementById('timeMarkers').style.transform = `translateX(-${scrollLeft}px)`;
- updateIndicatorPosition(timeToBeats(state.playbackOffset), false);
- });
+ // Scroll/wheel handlers - managed by ViewportController
- dom.timelineContent.addEventListener('wheel', e => {
- e.preventDefault();
- if (e.ctrlKey || e.metaKey) {
- const rect = dom.timelineContent.getBoundingClientRect(), mouseX = e.clientX - rect.left;
- const scrollLeft = dom.timelineContent.scrollLeft, timeUnderCursor = (scrollLeft + mouseX) / state.pixelsPerSecond;
- const zoomDelta = e.deltaY > 0 ? -10 : 10;
- const newPixelsPerSecond = Math.max(10, Math.min(500, state.pixelsPerSecond + zoomDelta));
- if (newPixelsPerSecond !== state.pixelsPerSecond) {
- state.pixelsPerSecond = newPixelsPerSecond;
- dom.zoomSlider.value = state.pixelsPerSecond;
- dom.zoomLevel.textContent = `${state.pixelsPerSecond}%`;
- if (state.audioBuffer) renderWaveform();
- renderTimeline();
- dom.timelineContent.scrollLeft = timeUnderCursor * newPixelsPerSecond - mouseX;
- updateIndicatorPosition(timeToBeats(state.playbackOffset), false);
- }
- return;
- }
- dom.timelineContent.scrollLeft += e.deltaY;
- const currentScrollLeft = dom.timelineContent.scrollLeft, viewportWidth = dom.timelineContent.clientWidth;
- const slack = (viewportWidth / state.pixelsPerSecond) * 0.1, currentTime = (currentScrollLeft / state.pixelsPerSecond) + slack;
- let targetSeqIndex = 0;
- for (let i = 0; i < state.sequences.length; i++) {
- if (state.sequences[i].startTime <= currentTime) targetSeqIndex = i; else break;
- }
- if (targetSeqIndex !== state.lastActiveSeqIndex && state.sequences.length > 0) {
- state.lastActiveSeqIndex = targetSeqIndex;
- const seqDivs = dom.timeline.querySelectorAll('.sequence');
- if (seqDivs[targetSeqIndex]) {
- seqDivs[targetSeqIndex].classList.add('active-flash');
- setTimeout(() => seqDivs[targetSeqIndex]?.classList.remove('active-flash'), 600);
- }
+ // Initialize controllers
+ const renderCallback = (trigger) => {
+ if (trigger === 'zoom' || trigger === 'zoomWheel') {
+ if (state.audioBuffer && playbackController) playbackController.renderWaveform();
+ renderTimeline();
+ if (viewportController) viewportController.updateIndicatorPosition(timeToBeats(state.playbackOffset), false);
+ } else {
+ renderTimeline();
}
- const targetScrollTop = state.sequences[targetSeqIndex]?._yPosition || 0;
- const currentScrollTop = dom.timelineContent.scrollTop, scrollDiff = targetScrollTop - currentScrollTop;
- if (Math.abs(scrollDiff) > 5) dom.timelineContent.scrollTop += scrollDiff * VERTICAL_SCROLL_SPEED;
- }, { passive: false });
+ };
+
+ viewportController = new ViewportController(state, dom, renderCallback);
+ playbackController = new PlaybackController(state, dom, viewportController, renderCallback, showMessage);
window.addEventListener('resize', renderTimeline);
renderTimeline(); loadFromURLParams();
diff --git a/tools/timeline_editor/timeline-playback.js b/tools/timeline_editor/timeline-playback.js
new file mode 100644
index 0000000..a1c50ab
--- /dev/null
+++ b/tools/timeline_editor/timeline-playback.js
@@ -0,0 +1,322 @@
+// timeline-playback.js - Audio playback and waveform rendering
+
+export class PlaybackController {
+ constructor(state, dom, viewportController, renderCallback, showMessage) {
+ this.state = state;
+ this.dom = dom;
+ this.viewport = viewportController;
+ this.renderCallback = renderCallback;
+ this.showMessage = showMessage;
+
+ // Constants
+ this.WAVEFORM_AMPLITUDE_SCALE = 0.4;
+ this.SEQUENCE_DEFAULT_DURATION = 16;
+
+ this.init();
+ }
+
+ init() {
+ this.dom.audioInput.addEventListener('change', e => {
+ const file = e.target.files[0];
+ if (file) this.loadAudioFile(file);
+ });
+
+ this.dom.clearAudioBtn.addEventListener('click', () => {
+ this.clearAudio();
+ this.dom.audioInput.value = '';
+ });
+
+ this.dom.playPauseBtn.addEventListener('click', async () => {
+ if (this.state.isPlaying) this.stopPlayback();
+ else {
+ if (this.state.playbackOffset >= this.state.audioDurationSeconds) {
+ this.state.playbackOffset = 0;
+ }
+ this.state.playStartPosition = this.state.playbackOffset;
+ await this.startPlayback();
+ }
+ });
+
+ this.dom.replayBtn.addEventListener('click', async () => {
+ this.stopPlayback(false);
+ this.state.playbackOffset = this.state.playStartPosition;
+ const replayBeats = this.timeToBeats(this.state.playbackOffset);
+ this.dom.playbackTime.textContent = `${this.state.playbackOffset.toFixed(2)}s (${replayBeats.toFixed(2)}b)`;
+ this.viewport.updateIndicatorPosition(replayBeats, false);
+ await this.startPlayback();
+ });
+
+ this.dom.waveformContainer.addEventListener('click', async e => {
+ if (!this.state.audioBuffer) return;
+ const rect = this.dom.waveformContainer.getBoundingClientRect();
+ const canvasOffset = parseFloat(this.dom.waveformCanvas.style.left) || 0;
+ const clickX = e.clientX - rect.left - canvasOffset;
+ const clickBeats = clickX / this.state.pixelsPerBeat;
+ const clickTime = this.beatsToTime(clickBeats);
+
+ const wasPlaying = this.state.isPlaying;
+ if (wasPlaying) this.stopPlayback(false);
+ this.state.playbackOffset = Math.max(0, Math.min(clickTime, this.state.audioDurationSeconds));
+ const pausedBeats = this.timeToBeats(this.state.playbackOffset);
+ this.dom.playbackTime.textContent = `${this.state.playbackOffset.toFixed(2)}s (${pausedBeats.toFixed(2)}b)`;
+ this.viewport.updateIndicatorPosition(pausedBeats, false);
+ if (wasPlaying) await this.startPlayback();
+ });
+ }
+
+ async loadAudioFile(file) {
+ try {
+ const arrayBuffer = await file.arrayBuffer();
+
+ // Detect original WAV sample rate before decoding
+ const dataView = new DataView(arrayBuffer);
+ let originalSampleRate = 32000; // Default assumption
+
+ // Parse the WAV header for the original sample rate. Assumes the
+ // canonical chunk layout: "RIFF" at 0, "WAVE" at 8, "fmt " at 12,
+ // sample rate at byte 24 (little-endian)
+ if (dataView.getUint32(0, false) === 0x52494646 && // "RIFF"
+ dataView.getUint32(8, false) === 0x57415645) { // "WAVE"
+ originalSampleRate = dataView.getUint32(24, true); // Little-endian
+ console.log(`Detected WAV sample rate: ${originalSampleRate}Hz`);
+ }
+
+ if (!this.state.audioContext) {
+ this.state.audioContext = new (window.AudioContext || window.webkitAudioContext)();
+ }
+
+ this.state.audioBuffer = await this.state.audioContext.decodeAudioData(arrayBuffer);
+ this.state.audioDurationSeconds = this.state.audioBuffer.duration;
+ this.state.originalSampleRate = originalSampleRate;
+ this.state.resampleRatio = this.state.audioContext.sampleRate / originalSampleRate;
+
+ console.log(`AudioContext rate: ${this.state.audioContext.sampleRate}Hz, resample ratio: ${this.state.resampleRatio.toFixed(3)}x`);
+
+ this.renderWaveform();
+ this.dom.playbackControls.style.display = 'flex';
+ this.dom.playbackIndicator.style.display = 'block';
+ this.dom.clearAudioBtn.disabled = false;
+ this.dom.replayBtn.disabled = false;
+ this.showMessage(`Audio loaded: ${this.state.audioDurationSeconds.toFixed(2)}s @ ${originalSampleRate}Hz`, 'success');
+ this.renderCallback('audioLoaded');
+ } catch (err) {
+ this.showMessage(`Error loading audio: ${err.message}`, 'error');
+ }
+ }
+
+ renderWaveform() {
+ if (!this.state.audioBuffer) return;
+ const canvas = this.dom.waveformCanvas;
+ const ctx = canvas.getContext('2d');
+
+ // Compute maxTimeBeats the same way the timeline does
+ let maxTimeBeats = 60;
+ for (const seq of this.state.sequences) {
+ maxTimeBeats = Math.max(maxTimeBeats, seq.startTime + this.SEQUENCE_DEFAULT_DURATION);
+ for (const effect of seq.effects) {
+ maxTimeBeats = Math.max(maxTimeBeats, seq.startTime + effect.endTime);
+ }
+ }
+ if (this.state.audioDurationSeconds > 0) {
+ maxTimeBeats = Math.max(maxTimeBeats, this.state.audioDurationSeconds * this.state.beatsPerSecond);
+ }
+
+ const w = maxTimeBeats * this.state.pixelsPerBeat;
+ const h = 80;
+ canvas.width = w;
+ canvas.height = h;
+ canvas.style.width = `${w}px`;
+ canvas.style.height = `${h}px`;
+
+ ctx.fillStyle = 'rgba(0, 0, 0, 0.3)';
+ ctx.fillRect(0, 0, w, h);
+
+ const channelData = this.state.audioBuffer.getChannelData(0);
+ const audioBeats = this.timeToBeats(this.state.audioDurationSeconds);
+ const audioPixelWidth = audioBeats * this.state.pixelsPerBeat;
+ const samplesPerPixel = Math.ceil(channelData.length / audioPixelWidth);
+ const centerY = h / 2;
+ const amplitudeScale = h * this.WAVEFORM_AMPLITUDE_SCALE;
+
+ ctx.strokeStyle = '#4ec9b0';
+ ctx.lineWidth = 1;
+ ctx.beginPath();
+
+ for (let x = 0; x < audioPixelWidth; x++) {
+ const start = Math.floor(x * samplesPerPixel);
+ const end = Math.min(start + samplesPerPixel, channelData.length);
+ let min = 1.0, max = -1.0;
+
+ for (let i = start; i < end; i++) {
+ min = Math.min(min, channelData[i]);
+ max = Math.max(max, channelData[i]);
+ }
+
+ const yMin = centerY - min * amplitudeScale;
+ const yMax = centerY - max * amplitudeScale;
+
+ if (x === 0) ctx.moveTo(x, yMin);
+ else ctx.lineTo(x, yMin);
+ ctx.lineTo(x, yMax);
+ }
+ ctx.stroke();
+
+ // Center line
+ ctx.strokeStyle = 'rgba(255, 255, 255, 0.1)';
+ ctx.beginPath();
+ ctx.moveTo(0, centerY);
+ ctx.lineTo(audioPixelWidth, centerY);
+ ctx.stroke();
+
+ // Beat markers
+ ctx.strokeStyle = 'rgba(255, 255, 255, 0.50)';
+ ctx.lineWidth = 1;
+ for (let beat = 0; beat <= maxTimeBeats; beat++) {
+ const x = beat * this.state.pixelsPerBeat;
+ ctx.beginPath();
+ ctx.moveTo(x, 0);
+ ctx.lineTo(x, h);
+ ctx.stroke();
+ }
+ }
+
+ clearAudio() {
+ this.stopPlayback();
+ this.state.audioBuffer = null;
+ this.state.audioDurationSeconds = 0;
+ this.state.playbackOffset = 0;
+ this.state.playStartPosition = 0;
+
+ this.dom.playbackControls.style.display = 'none';
+ this.dom.playbackIndicator.style.display = 'none';
+ this.dom.clearAudioBtn.disabled = true;
+ this.dom.replayBtn.disabled = true;
+
+ const ctx = this.dom.waveformCanvas.getContext('2d');
+ ctx.clearRect(0, 0, this.dom.waveformCanvas.width, this.dom.waveformCanvas.height);
+
+ this.renderCallback('audioClear');
+ this.showMessage('Audio cleared', 'success');
+ }
+
+ async startPlayback() {
+ if (!this.state.audioBuffer || !this.state.audioContext) return;
+
+ if (this.state.audioSource) {
+ try { this.state.audioSource.stop(); } catch (e) {}
+ this.state.audioSource = null;
+ }
+
+ if (this.state.audioContext.state === 'suspended') {
+ await this.state.audioContext.resume();
+ }
+
+ try {
+ this.state.audioSource = this.state.audioContext.createBufferSource();
+ this.state.audioSource.buffer = this.state.audioBuffer;
+ this.state.audioSource.connect(this.state.audioContext.destination);
+ this.state.audioSource.start(0, this.state.playbackOffset);
+ this.state.playbackStartTime = this.state.audioContext.currentTime;
+ this.state.isPlaying = true;
+ this.dom.playPauseBtn.textContent = '⏸ Pause';
+
+ this.updatePlaybackPosition();
+
+ this.state.audioSource.onended = () => {
+ if (this.state.isPlaying) this.stopPlayback();
+ };
+ } catch (e) {
+ console.error('Failed to start playback:', e);
+ this.showMessage('Playback failed: ' + e.message, 'error');
+ this.state.audioSource = null;
+ this.state.isPlaying = false;
+ }
+ }
+
+ stopPlayback(savePosition = true) {
+ if (this.state.audioSource) {
+ try { this.state.audioSource.stop(); } catch (e) {}
+ this.state.audioSource = null;
+ }
+
+ if (this.state.animationFrameId) {
+ cancelAnimationFrame(this.state.animationFrameId);
+ this.state.animationFrameId = null;
+ }
+
+ if (this.state.isPlaying && savePosition) {
+ const elapsed = this.state.audioContext.currentTime - this.state.playbackStartTime;
+ this.state.playbackOffset = Math.min(this.state.playbackOffset + elapsed, this.state.audioDurationSeconds);
+ }
+
+ this.state.isPlaying = false;
+ this.dom.playPauseBtn.textContent = '▶ Play';
+ }
+
+ updatePlaybackPosition() {
+ if (!this.state.isPlaying) return;
+
+ const elapsed = this.state.audioContext.currentTime - this.state.playbackStartTime;
+ const currentTime = this.state.playbackOffset + elapsed;
+ const currentBeats = this.timeToBeats(currentTime);
+
+ this.dom.playbackTime.textContent = `${currentTime.toFixed(2)}s (${currentBeats.toFixed(2)}b)`;
+ this.viewport.updateIndicatorPosition(currentBeats, true);
+ this.expandSequenceAtTime(currentBeats);
+
+ this.state.animationFrameId = requestAnimationFrame(() => this.updatePlaybackPosition());
+ }
+
+ expandSequenceAtTime(currentBeats) {
+ let activeSeqIndex = -1;
+
+ for (let i = 0; i < this.state.sequences.length; i++) {
+ const seq = this.state.sequences[i];
+ const seqEndBeats = seq.startTime + (seq.effects.length > 0 ?
+ Math.max(...seq.effects.map(e => e.endTime)) : 0);
+
+ if (currentBeats >= seq.startTime && currentBeats <= seqEndBeats) {
+ activeSeqIndex = i;
+ break;
+ }
+ }
+
+ if (activeSeqIndex !== this.state.lastExpandedSeqIndex) {
+ const seqDivs = this.dom.timeline.querySelectorAll('.sequence');
+
+ if (this.state.lastExpandedSeqIndex >= 0 && seqDivs[this.state.lastExpandedSeqIndex]) {
+ seqDivs[this.state.lastExpandedSeqIndex].classList.remove('active-playing');
+ }
+
+ if (activeSeqIndex >= 0 && seqDivs[activeSeqIndex]) {
+ seqDivs[activeSeqIndex].classList.add('active-playing');
+ }
+
+ this.state.lastExpandedSeqIndex = activeSeqIndex;
+ }
+ }
+
+ seekTo(clickBeats, clickTime) {
+ if (!this.state.audioBuffer) return;
+
+ const wasPlaying = this.state.isPlaying;
+ if (wasPlaying) this.stopPlayback(false);
+
+ this.state.playbackOffset = Math.max(0, Math.min(clickTime, this.state.audioDurationSeconds));
+ const pausedBeats = this.timeToBeats(this.state.playbackOffset);
+ this.dom.playbackTime.textContent = `${this.state.playbackOffset.toFixed(2)}s (${pausedBeats.toFixed(2)}b)`;
+ this.viewport.updateIndicatorPosition(pausedBeats, false);
+
+ if (wasPlaying) this.startPlayback();
+
+ return { clickTime, clickBeats };
+ }
+
+ // Helpers
+ beatsToTime(beats) {
+ return beats * this.state.secondsPerBeat;
+ }
+
+ timeToBeats(seconds) {
+ return seconds * this.state.beatsPerSecond;
+ }
+}
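Reviewer note on the sample-rate sniffing in `loadAudioFile` above: it assumes the `fmt ` chunk sits immediately after `WAVE`, so the sample rate lands at byte 24. Canonical WAVs satisfy this, but files with leading `LIST`/`JUNK` chunks do not. A sketch of a chunk-walking alternative (`wavSampleRate` is a hypothetical helper, not part of this patch):

```javascript
// Walk RIFF chunks to find "fmt " instead of assuming it starts at
// byte 12; returns the sample rate, or null if the file is not a WAV.
function wavSampleRate(arrayBuffer) {
  const view = new DataView(arrayBuffer);
  if (view.byteLength < 12 ||
      view.getUint32(0, false) !== 0x52494646 ||   // "RIFF"
      view.getUint32(8, false) !== 0x57415645) {   // "WAVE"
    return null;
  }
  let offset = 12;
  while (offset + 8 <= view.byteLength) {
    const chunkId = view.getUint32(offset, false);
    const chunkSize = view.getUint32(offset + 4, true);
    if (chunkId === 0x666d7420) {                  // "fmt "
      return view.getUint32(offset + 12, true);    // sample-rate field
    }
    offset += 8 + chunkSize + (chunkSize & 1);     // chunks are word-aligned
  }
  return null;
}
```

For a canonical file this returns the same value the patch reads at byte 24, so it would be a drop-in replacement for the inline header check.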
diff --git a/tools/timeline_editor/timeline-viewport.js b/tools/timeline_editor/timeline-viewport.js
new file mode 100644
index 0000000..dcedb45
--- /dev/null
+++ b/tools/timeline_editor/timeline-viewport.js
@@ -0,0 +1,170 @@
+// timeline-viewport.js - Viewport zoom/scroll control
+
+export class ViewportController {
+ constructor(state, dom, renderCallback) {
+ this.state = state;
+ this.dom = dom;
+ this.renderCallback = renderCallback;
+
+ // Constants
+ this.TIMELINE_LEFT_PADDING = 20;
+ this.SCROLL_VIEWPORT_FRACTION = 0.4;
+ this.SMOOTH_SCROLL_SPEED = 0.1;
+ this.VERTICAL_SCROLL_SPEED = 0.3;
+
+ this.init();
+ }
+
+ init() {
+ // Zoom controls
+ this.dom.zoomSlider.addEventListener('input', e => this.handleZoomSlider(e));
+
+ // Scroll sync
+ this.dom.timelineContent.addEventListener('scroll', () => this.handleScroll());
+
+ // Wheel handling - capture at container level to override all child elements
+ const wheelHandler = e => this.handleWheel(e);
+ this.dom.timelineContainer.addEventListener('wheel', wheelHandler, { passive: false, capture: true });
+
+ // Prevent wheel bubbling from UI containers outside timeline
+ document.querySelector('header').addEventListener('wheel', e => e.stopPropagation());
+ this.dom.propertiesPanel.addEventListener('wheel', e => e.stopPropagation());
+ document.querySelector('.zoom-controls').addEventListener('wheel', e => e.stopPropagation());
+ document.querySelector('.stats').addEventListener('wheel', e => e.stopPropagation());
+
+ // Waveform hover tracking
+ this.dom.waveformContainer.addEventListener('mouseenter', () => this.showWaveformCursor());
+ this.dom.waveformContainer.addEventListener('mouseleave', () => this.hideWaveformCursor());
+ this.dom.waveformContainer.addEventListener('mousemove', e => this.updateWaveformCursor(e));
+ }
+
+ handleZoomSlider(e) {
+ this.state.pixelsPerBeat = parseInt(e.target.value);
+ this.dom.zoomLevel.textContent = `${this.state.pixelsPerBeat}%`;
+ this.renderCallback('zoom');
+ }
+
+ handleScroll() {
+ const scrollLeft = this.dom.timelineContent.scrollLeft;
+ this.dom.cpuLoadCanvas.style.left = `-${scrollLeft}px`;
+ this.dom.waveformCanvas.style.left = `-${scrollLeft}px`;
+ document.getElementById('timeMarkers').style.transform = `translateX(-${scrollLeft}px)`;
+ this.updateIndicatorPosition(this.timeToBeats(this.state.playbackOffset), false);
+ }
+
+ handleWheel(e) {
+ e.preventDefault();
+
+ // Zoom with ctrl/cmd
+ if (e.ctrlKey || e.metaKey) {
+ this.handleZoomWheel(e);
+ return;
+ }
+
+ // Horizontal scroll
+ this.dom.timelineContent.scrollLeft += e.deltaY;
+
+ // Auto-scroll to active sequence
+ this.autoScrollToSequence();
+ }
+
+ handleZoomWheel(e) {
+ const rect = this.dom.timelineContent.getBoundingClientRect();
+ const mouseX = e.clientX - rect.left;
+ const scrollLeft = this.dom.timelineContent.scrollLeft;
+ const timeUnderCursor = (scrollLeft + mouseX) / this.state.pixelsPerBeat;
+
+ const zoomDelta = e.deltaY > 0 ? -10 : 10;
+ const newPixelsPerBeat = Math.max(10, Math.min(500, this.state.pixelsPerBeat + zoomDelta));
+
+ if (newPixelsPerBeat !== this.state.pixelsPerBeat) {
+ this.state.pixelsPerBeat = newPixelsPerBeat;
+ this.dom.zoomSlider.value = this.state.pixelsPerBeat;
+ this.dom.zoomLevel.textContent = `${this.state.pixelsPerBeat}%`;
+ this.renderCallback('zoomWheel');
+ this.dom.timelineContent.scrollLeft = timeUnderCursor * newPixelsPerBeat - mouseX;
+ this.updateIndicatorPosition(this.timeToBeats(this.state.playbackOffset), false);
+ }
+ }
+
+ autoScrollToSequence() {
+ const currentScrollLeft = this.dom.timelineContent.scrollLeft;
+ const viewportWidth = this.dom.timelineContent.clientWidth;
+ const slack = (viewportWidth / this.state.pixelsPerBeat) * 0.1;
+ const currentTime = (currentScrollLeft / this.state.pixelsPerBeat) + slack;
+
+ let targetSeqIndex = 0;
+ for (let i = 0; i < this.state.sequences.length; i++) {
+ if (this.state.sequences[i].startTime <= currentTime) targetSeqIndex = i;
+ else break;
+ }
+
+ if (targetSeqIndex !== this.state.lastActiveSeqIndex && this.state.sequences.length > 0) {
+ this.state.lastActiveSeqIndex = targetSeqIndex;
+ const seqDivs = this.dom.timeline.querySelectorAll('.sequence');
+ if (seqDivs[targetSeqIndex]) {
+ seqDivs[targetSeqIndex].classList.add('active-flash');
+ setTimeout(() => seqDivs[targetSeqIndex]?.classList.remove('active-flash'), 600);
+ }
+ }
+
+ const targetScrollTop = this.state.sequences[targetSeqIndex]?._yPosition || 0;
+ const currentScrollTop = this.dom.timelineContent.scrollTop;
+ const scrollDiff = targetScrollTop - currentScrollTop;
+ if (Math.abs(scrollDiff) > 5) {
+ this.dom.timelineContent.scrollTop += scrollDiff * this.VERTICAL_SCROLL_SPEED;
+ }
+ }
+
+ updateIndicatorPosition(beats, smoothScroll = false) {
+ const timelineX = beats * this.state.pixelsPerBeat;
+ const scrollLeft = this.dom.timelineContent.scrollLeft;
+ this.dom.playbackIndicator.style.left = `${timelineX - scrollLeft + this.TIMELINE_LEFT_PADDING}px`;
+
+ if (smoothScroll) {
+ const targetScroll = timelineX - this.dom.timelineContent.clientWidth * this.SCROLL_VIEWPORT_FRACTION;
+ const scrollDiff = targetScroll - scrollLeft;
+ if (Math.abs(scrollDiff) > 5) {
+ this.dom.timelineContent.scrollLeft += scrollDiff * this.SMOOTH_SCROLL_SPEED;
+ }
+ }
+ }
+
+ showWaveformCursor() {
+ if (!this.state.audioBuffer) return;
+ this.dom.waveformCursor.style.display = 'block';
+ this.dom.waveformTooltip.style.display = 'block';
+ }
+
+ hideWaveformCursor() {
+ this.dom.waveformCursor.style.display = 'none';
+ this.dom.waveformTooltip.style.display = 'none';
+ }
+
+ updateWaveformCursor(e) {
+ if (!this.state.audioBuffer) return;
+ const rect = this.dom.waveformContainer.getBoundingClientRect();
+ const mouseX = e.clientX - rect.left;
+ const scrollLeft = this.dom.timelineContent.scrollLeft;
+ const timeBeats = (scrollLeft + mouseX) / this.state.pixelsPerBeat;
+ const timeSeconds = timeBeats * this.state.secondsPerBeat;
+
+ // Position cursor
+ this.dom.waveformCursor.style.left = `${mouseX}px`;
+
+ // Position and update tooltip
+ const tooltipText = `${timeSeconds.toFixed(3)}s (${timeBeats.toFixed(2)}b)`;
+ this.dom.waveformTooltip.textContent = tooltipText;
+
+ // Position tooltip above cursor, offset to the right
+ const tooltipX = mouseX + 10;
+ const tooltipY = 5;
+ this.dom.waveformTooltip.style.left = `${tooltipX}px`;
+ this.dom.waveformTooltip.style.top = `${tooltipY}px`;
+ }
+
+ // Helper
+ timeToBeats(seconds) {
+ return seconds * this.state.beatsPerSecond;
+ }
+}
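Reviewer note on `handleZoomWheel` above: the `scrollLeft` formula falls out of keeping the beat under the mouse at a fixed screen x before and after the zoom, i.e. solving `beat * ppb - scrollLeft = mouseX` for the new `scrollLeft`. A standalone sketch of that invariant (`zoomAroundCursor` is a hypothetical name, not part of this patch):

```javascript
// Zoom-around-cursor math: the beat under the mouse must map to the
// same screen x at the new zoom level, so the new scrollLeft is
//   beatUnderCursor * newPpb - mouseX.
function zoomAroundCursor(scrollLeft, mouseX, oldPpb, newPpb) {
  const beatUnderCursor = (scrollLeft + mouseX) / oldPpb;
  return beatUnderCursor * newPpb - mouseX;
}
```

This is exactly the `timeUnderCursor * newPixelsPerBeat - mouseX` assignment in the handler, just isolated so the invariant is testable.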
diff --git a/tools/track_visualizer/index.html b/tools/track_visualizer/index.html
index 4a613ec..d1e7480 100644
--- a/tools/track_visualizer/index.html
+++ b/tools/track_visualizer/index.html
@@ -4,18 +4,8 @@
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Music Track Visualizer</title>
+ <link rel="stylesheet" href="../common/style.css">
<style>
- * {
- margin: 0;
- padding: 0;
- box-sizing: border-box;
- }
- body {
- font-family: 'Segoe UI', Tahoma, Geneva, Verdana, sans-serif;
- background: #1e1e1e;
- color: #d4d4d4;
- overflow: hidden;
- }
#controls {
padding: 15px;
background: #2d2d2d;
@@ -27,16 +17,8 @@
}
button, input[type="file"] {
padding: 8px 16px;
- background: #0e639c;
- color: white;
- border: none;
- border-radius: 4px;
- cursor: pointer;
font-size: 14px;
}
- button:hover {
- background: #1177bb;
- }
input[type="file"] {
padding: 6px 12px;
}
@@ -58,7 +40,6 @@
width: 100%;
height: calc(100vh - 70px);
overflow: auto;
- background: #1e1e1e;
}
#timeline-canvas {
display: block;