25 files changed, 1509 insertions, 818 deletions
@@ -1,67 +1,28 @@ -# ============================================ -# TIER 1: CRITICAL CONTEXT (Always Loaded) -# ============================================ +# TIER 1: CRITICAL (Always Loaded) @PROJECT_CONTEXT.md @TODO.md @README.md -# ============================================ -# TIER 2: TECHNICAL REFERENCE (Always Loaded) -# ============================================ +# TIER 2: REFERENCE (Always Loaded) @doc/HOWTO.md @doc/CONTRIBUTING.md @doc/AI_RULES.md @doc/EFFECT_WORKFLOW.md -# ============================================ -# TIER 3: DESIGN DOCS (Load On-Demand) -# ============================================ -# Load these only when working on specific subsystems: -# -# Audio & Tracker: -# doc/SPECTRAL_BRUSH_EDITOR.md - Spectral editor design -# doc/TRACKER.md - Audio tracker system -# -# 3D & Graphics: -# doc/3D.md - 3D rendering architecture -# doc/PROCEDURAL.md - Procedural generation +# TIER 3: DESIGN DOCS (Load On-Demand by Subsystem) # -# Build & Assets: -# doc/ASSET_SYSTEM.md - Asset pipeline details -# doc/BUILD.md - Build system details -# doc/FETCH_DEPS.md - Dependency management -# -# Testing & Tools: -# doc/test_demo_README.md - test_demo tool documentation -# -# Architecture & Reference: -# doc/ARCHITECTURE.md - Detailed system architecture -# doc/CODING_STYLE.md - Code style examples -# doc/BACKLOG.md - Untriaged future goals -# doc/TOOLS_REFERENCE.md - Developer tools reference +# Audio: SPECTRAL_BRUSH_EDITOR.md, TRACKER.md, BEAT_TIMING.md +# CNN: CNN.md, CNN_EFFECT.md, CNN_V2*.md, CNN_TEST_TOOL.md, CNN_*ANALYSIS.md, CNN_BIAS_FIX*.md +# 3D/Graphics: 3D.md, GPU_PROCEDURAL_PHASE4.md, MASKING_SYSTEM.md, SDF_EFFECT_GUIDE.md +# Scene: SCENE_FORMAT.md, SEQUENCE.md, WORKSPACE_SYSTEM.md +# Build: ASSET_SYSTEM.md, BUILD.md, CMAKE_MODULES.md, SIZE_MEASUREMENT.md +# Rendering: GEOM_BUFFER.md, SHADER_REUSE_INVESTIGATION.md, UNIFORM_BUFFER_GUIDELINES.md, WGPU_HELPERS.md, AUXILIARY_TEXTURE_INIT.md +# Tools: test_demo_README.md, HOT_RELOAD.md, HEADLESS_MODE.md, RECIPE.md +# Arch: ARCHITECTURE.md, CODING_STYLE.md, BACKLOG.md, TOOLS_REFERENCE.md, CONTEXT_MAINTENANCE.md -# ============================================ -# TIER 4: HISTORICAL ARCHIVE (Load Rarely) -# ============================================ -# Load these only for historical context or debugging: -# -# Completion History: -# Use Read tool for doc/COMPLETED.md -# -# Technical Investigations: -# doc/GPU_EFFECTS_TEST_ANALYSIS.md -# doc/PLATFORM_ANALYSIS.md -# doc/PLATFORM_SIDE_QUEST_SUMMARY.md -# doc/PEAK_FIX_SUMMARY.md -# doc/CNN_DEBUG.md - CNN post-processing binding bug resolution -# -# Agent Handoffs: -# doc/HANDOFF_CLAUDE.md -# doc/HANDOFF.md -# doc/HANDOFF_2026-02-04.md -# -# Task Tracking: -# doc/TASKS_SUMMARY.md +# TIER 4: ARCHIVE (Historical/Debugging Only) +# Active: COMPLETED.md, CNN_DEBUG.md, FILE_HIERARCHY_CLEANUP_2026-02-13.md +# Archived: doc/archive/* (handoffs, analyses, build proposals, timing fixes) # ============================================ # PROJECT RULES (IMPORTANT) @@ -84,48 +84,19 @@ IMPORTANT: - Keep PROJECT_CONTEXT.md focused on current status - Keep TODO.md focused on active/next tasks only -# ============================================ -# CURRENT STATE SNAPSHOT (Gemini-Specific) -# ============================================ -<state_snapshot> - <overall_goal> - Produce a cross-platform (Windows, macOS, Linux) 64-kilobyte demoscene production. 
This is achieved through a C++ codebase utilizing WebGPU for graphics (with a hybrid SDF/rasterization pipeline) and a real-time procedural audio engine for sound, with a heavy focus on size-optimization at all stages. - </overall_goal> - - <active_constraints> - - All tests passing (36/36 - 100%). - - Strict 64k final binary size target. - - Adherence to project coding style and contribution guidelines is mandatory. - </active_constraints> - - <key_knowledge> - - **Workspace System:** The project is organized into self-contained workspaces (e.g., `workspaces/main`, `workspaces/test`) managed by a `workspace.cfg` file, separating demo-specific content from a `common/` directory that holds shared shaders and resources. Selection is done at build time with `-DDEMO_WORKSPACE=<name>`. - - **Build & Asset Pipeline:** A modular CMake system orchestrates the build. It uses host-native tools (`asset_packer`, `seq_compiler`, `tracker_compiler`) to parse manifest files (`assets.txt`, `timeline.seq`, `music.track`) and compile assets directly into the binary as C++ data, including procedural asset generation. - - **Audio Engine:** A real-time, sample-accurate audio engine with a tracker system for sequencing patterns from `.track` files. It features procedural synthesis from spectrograms (FFT-based IDCT), variable tempo that is decoupled from visual timing, and an abstracted backend for testing and offline rendering (`WavDumpBackend`). - - **Graphics & Rendering:** The renderer uses WebGPU (wgpu-native) and WGSL shaders. It employs a hybrid technique, rasterizing proxy geometry (cubes) and then performing SDF raymarching within the fragment shader. The 3D system supports BVH acceleration, and there's a pipeline for importing OBJ models. - - **Sequence & Timing:** Visuals are defined in `.seq` files using a beat-based timeline that is compiled into physical seconds. Shaders receive a `CommonPostProcessUniforms` buffer containing both physical time (`time`) for constant-speed animations and musical time (`beat_time`, `beat_phase`) for syncing with the audio playback clock. - - **CNN Post-Processing:** The project features a sophisticated CNN post-processing pipeline (CNNv2) for visual stylization. This includes a full Python/PyTorch training toolchain, a binary weight format, and a WebGPU-based validation tool. The network uses 7D parametric static features (RGBD, UV, sin, bias) for rich, position-aware effects. - - **Development Workflow:** There is a strong emphasis on tooling and process, including a visual timeline editor, audio analysis tools, a web-based CNN debugger, strict coding standards enforced by `clang-format`, and a comprehensive pre-commit script (`./scripts/check_all.sh`). - </key_knowledge> +# Role: Senior Systems Engineer (C++ Focus) - <artifact_trail> - - `GEMINI.md`: This file, synchronized with CLAUDE.md structure - - `PROJECT_CONTEXT.md`: Current system status - - `TODO.md`: Active priorities (Task #5 in progress) - </artifact_trail> +## Response Style +- **Extreme Brevity:** Provide direct answers. No "Sure, I can help," "I hope this helps," or "Here is the code." +- **Code-First:** Lead with the solution or the code block. +- **Explain on Demand:** Do not explain *how* the code works unless explicitly asked with "Why?" or "Explain." +- **No Markdown Fluff:** Avoid bolding every other word. Use standard technical formatting. 
- <recent_actions> - - **File Hierarchy Cleanup:** Major reorganization of the project structure, establishing the `workspaces/` and `common/` directories and eliminating ~100 redundant files (especially shaders). - - **CNNv2 Training Pipeline:** Fixed critical checkpointing bugs and streamlined the output of the training scripts for faster iteration. - - **Effect Render API Refactor:** Simplified the `Effect::render` API and fixed uniform initialization bugs across 19 effects. - - **CNN Shader Testing Tool:** Implemented `tools/cnn_test` for offline GPU-accelerated validation of trained CNN models. - - **Effect Source Hierarchy Cleanup**: Refactored the effects system by splitting `src/gpu/demo_effects.h` into individual header files for each effect, creating 10 new headers, moving class definitions, updating `.cc` files with new includes, fixing missing `#include` statements, creating a common `particle_defs.h`, and cleaning up `demo_effects.h`. Verified by passing all 34 tests. handoff(Gemini): - </recent_actions> +## Technical Preferences +- **C++ Standards:** Default to C++20/C++23 unless specified otherwise. +- **Style:** Prefer Modern C++ (RAII, templates, constexpr, STL) over C-style patterns. +- **Nomenclature:** Use standard engineering terminology (e.g., "O(n) complexity," "pointer aliasing," "cache miss") without defining the terms. - <task_state> - 1. [IN PROGRESS] Task #5: Spectral Brush Editor (Priority 1) - 2. [PENDING] Task #18: 3D System Enhancements (Priority 4) - 3. [RECURRENT] Task #50: WGSL Modularization (Priority 4) - 4. [PENDING] Tracker Humanization & Sample Offset (Priority 3) - </task_state> -</state_snapshot>
\ No newline at end of file +## Interaction Protocol +- If a query is ambiguous, provide the most likely technical solution rather than asking for clarification. +- Treat the user as a peer with expert-level knowledge. diff --git a/doc/AUDIO_WAV_DRIFT_BUG.md b/doc/AUDIO_WAV_DRIFT_BUG.md new file mode 100644 index 0000000..050dd49 --- /dev/null +++ b/doc/AUDIO_WAV_DRIFT_BUG.md @@ -0,0 +1,185 @@ +# Audio WAV Drift Bug Investigation + +**Date:** 2026-02-15 +**Status:** ACCEPTABLE (to be continued) +**Current State:** -150ms drift at beat 64b, no glitches + +## Problem Statement + +Timeline viewer shows progressive visual drift between audio waveform and beat grid markers: +- At beat 8 (5.33s @ 90 BPM): kick waveform appears **-30ms early** (left of grid line) +- At beat 60 (40.0s @ 90 BPM): kick waveform appears **-180ms early** (left of grid line) + +Progressive drift rate: ~4.3ms/second + +## Initial Hypotheses (Ruled Out) + +### 1. ❌ Viewer Display Bug +- **Tested:** Sample rate detection in viewer (32kHz correctly detected) +- **Result:** Viewer BPM = 90 (correct), `pixelsPerSecond` mapping correct +- **Conclusion:** Not a viewer rendering issue + +### 2. ❌ WAV File Content Error +- **Tested:** Direct WAV sample position analysis via Python +- **Result:** Actual kick positions in WAV file: + ``` + Beat | Expected(s) | WAV(s) | Drift + -----|-------------|----------|------- + 8 | 5.3333 | 5.3526 | +19ms (LATE) + 60 | 40.0000 | 39.9980 | -2ms (nearly perfect) + ``` +- **Conclusion:** WAV file samples are at correct positions; visual drift not in WAV content + +### 3. ❌ Frame Truncation (Partial Cause) +- **Issue:** `frames_per_update = (int)(32000 * (1/60))` = 533 frames (truncates 0.333) +- **Impact:** Loses 0.333 frames/update = 10.4μs/frame +- **Total drift over 40s:** 2400 frames × 10.4μs = **25ms** +- **Conclusion:** Explains 25ms of 180ms, but not sufficient + +## Root Cause Discovery + +### Investigation Method +Added debug tracking to `audio_render_ahead()` (audio.cc:115): +```cpp +static int64_t g_total_render_calls = 0; +static int64_t g_total_frames_rendered = 0; + +// Track actual frames rendered vs expected +const int64_t actual_rendered = frames_after - frames_before; +g_total_render_calls++; +g_total_frames_rendered += actual_rendered; +``` + +### Critical Finding: Over-Rendering + +**WAV dump @ 40s (2400 iterations):** +``` +Expected frames: 1,279,200 (2400 × 533) +Actual rendered: 1,290,933 +Difference: +11,733 frames = +366.66ms EXTRA audio +``` + +**Pattern observed every 10s (600 calls @ 60fps):** +``` +[RENDER_DRIFT] calls=600 expect=319800 actual=331533 drift=-366.66ms +[RENDER_DRIFT] calls=1200 expect=639600 actual=651333 drift=-366.66ms +[RENDER_DRIFT] calls=1800 expect=959400 actual=971133 drift=-366.66ms +[RENDER_DRIFT] calls=2400 expect=1279200 actual=1290933 drift=-366.66ms +``` + +### Why This Causes Visual Drift + +**WAV Dump Flow (main.cc:289-302):** +1. `fill_audio_buffer(update_dt)` → calls `audio_render_ahead()` + - Renders audio into ring buffer + - **BUG:** Renders MORE than `chunk_frames` due to buffer management loop +2. `ring_buffer->read(chunk_buffer, samples_per_update)` + - Reads exactly 533 frames from ring buffer +3. `wav_backend.write_audio(chunk_buffer, samples_per_update)` + - Writes exactly 533 frames to WAV + +**Result:** Ring buffer accumulates 11,733 extra frames over 40s. 
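
To make the accumulation concrete, here is a minimal standalone model of the mechanism described above. It is not project code: the 552.55 frames/call figure is the average taken from the debug logs above, the 533-frame read size comes from the truncation analysis, and the loop is purely illustrative.

```cpp
// Illustrative model only (not the real audio_render_ahead() loop): shows how a
// small per-call over-render into a FIFO ring buffer accumulates into the
// ~366 ms of extra audio per 10 s reported by the [RENDER_DRIFT] logs.
#include <cstdio>

int main() {
  const double kSampleRate = 32000.0;
  const int kFramesReadPerCall = 533;            // truncated read size (533.333 -> 533)
  const double kFramesRenderedPerCall = 552.55;  // average measured in the debug logs

  double extra_frames = 0.0;  // frames in the ring buffer beyond the 400 ms pre-fill
  for (int call = 1; call <= 2400; ++call) {     // 2400 calls = 40 s @ 60 fps
    extra_frames += kFramesRenderedPerCall - kFramesReadPerCall;
    if (call % 600 == 0) {                       // every 10 s
      std::printf("calls=%d extra=%.0f frames (%.2f ms)\n", call, extra_frames,
                  extra_frames / kSampleRate * 1000.0);
    }
  }
  return 0;
}
```

Each 600-call block adds roughly 11,700 extra frames (~366 ms), matching the `[RENDER_DRIFT]` output above.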
+ +### Timing Shift Mechanism + +Ring buffer acts as FIFO queue with 400ms lookahead: +- Initially fills to 400ms (12,800 frames) +- Each iteration: renders 533.333 (actual: ~536) frames, reads 533 frames +- Net accumulation: ~3 frames/iteration +- After 2400 iterations: 12,800 + (2400 × 3) = 20,000 frames buffer size + +Events trigger at correct `music_time` but get written to ring buffer position that's ahead. When WAV reads from buffer, it reads from older position, causing events to appear EARLIER in WAV file than their nominal music_time. + +## Technical Details + +### Code Locations + +**Truncation point 1:** `main.cc:282` +```cpp +const int frames_per_update = (int)(32000 * update_dt); // 533.333 → 533 +``` + +**Truncation point 2:** `audio.cc:105` +```cpp +const int chunk_frames = (int)(dt * RING_BUFFER_SAMPLE_RATE); // 533.333 → 533 +``` + +**Over-render loop:** `audio.cc:112-229` +```cpp +while (true) { + // Keeps rendering until buffer >= target_lookahead + // Renders MORE than chunk_frames due to buffer management + ... +} +``` + +### Why 366ms Per 10s? + +At 60fps, 10s = 600 iterations: +- Expected: 600 × 533 = 319,800 frames +- Actual: 331,533 frames +- Extra: 11,733 frames ÷ 600 = **19.55 frames extra per iteration** + +But `chunk_frames = 533`, so we render 533 + 19.55 = **~552.55 frames per call** on average. + +Discrepancy from 533.333 expected: 552.55 - 533.333 = **19.22 frames/call over-render** + +This 19.22 frames = 0.6ms per iteration accumulates to 366ms per 10s. + +## Proposed Fix + +### Option 1: Match Render to Read (Recommended) +In WAV dump mode, ensure `audio_render_ahead()` renders exactly `frames_per_update`: +```cpp +// main.cc WAV dump loop +const int frames_per_update = (int)(32000 * update_dt); +audio_render_ahead(g_music_time, update_dt, /* force_exact_amount */ frames_per_update); +``` + +Modify `audio_render_ahead()` to accept optional exact frame count and render precisely that amount instead of filling to target lookahead. + +### Option 2: Round Instead of Truncate +```cpp +const int frames_per_update = (int)(32000 * update_dt + 0.5f); // Round: 533.333 → 533 +``` +Reduces truncation error but doesn't solve over-rendering. + +### Option 3: Use Double Precision + Accumulator +```cpp +static double accumulated_time = 0.0; +accumulated_time += update_dt; +const int frames_to_render = (int)(accumulated_time * 32000); +accumulated_time -= frames_to_render / 32000.0; +``` +Eliminates cumulative truncation error. + +## Related Issues + +- `tracker.cc:237` TODO comment mentions "180ms drift over 63 beats" - this is the same bug +- Ring buffer lookahead (400ms) is separate from drift (not the cause) +- Web Audio API `outputLatency` in viewer is unrelated (affects playback, not waveform display) + +## Verification Steps + +1. ✅ Measure WAV sample positions directly (Python script) +2. ✅ Add render tracking debug output +3. ✅ Confirm over-rendering (366ms per 10s) +4. ✅ Implement partial fix (bypass ring buffer, direct render) +5. ⚠️ Current result: -150ms drift at beat 64b (acceptable, needs further work) + +## Current Implementation (main.cc:286-308) + +**WAV dump now bypasses ring buffer entirely:** +1. **Frame accumulator**: Calculates exact frames per update (no truncation) +2. **Direct render**: Calls `synth_render()` directly with exact frame count +3. **No ring buffer**: Eliminates buffer management complexity +4. **Result**: No glitches, but -150ms drift remains + +**Remaining issue:** Drift persists despite direct rendering. 
Likely related to tempo scaling or audio engine state management. Acceptable for now. + +## Notes + +- Viewer waveform rendering is CORRECT - displays WAV content accurately +- Bug is in demo's WAV generation, specifically ring buffer management in `audio_render_ahead()` +- Progressive nature of drift (30ms → 180ms) indicates accumulation, not one-time offset +- Fix must ensure rendered frames = read frames in WAV dump mode diff --git a/src/app/main.cc b/src/app/main.cc index 537da74..3c80520 100644 --- a/src/app/main.cc +++ b/src/app/main.cc @@ -207,7 +207,10 @@ int main(int argc, char** argv) { #endif /* !defined(STRIP_ALL) */ // Pre-fill ring buffer to target lookahead (prevents startup delay) - fill_audio_buffer(audio_get_required_prefill_time(), 0.0); + // Skip pre-fill in WAV dump mode (direct render, no ring buffer) + if (!dump_wav) { + fill_audio_buffer(audio_get_required_prefill_time(), 0.0); + } audio_start(); g_last_audio_time = audio_get_playback_time(); // Initialize after start @@ -268,37 +271,41 @@ int main(int argc, char** argv) { printf("Running WAV dump simulation (%.1fs - %.1fs)...\n", start_time, end_time); - // Seek to start time if needed + // Seek to start time if needed (advance state without rendering) if (start_time > 0.0f) { const double step = 1.0 / 60.0; for (double t = 0.0; t < start_time; t += step) { - fill_audio_buffer(step, t); - audio_render_silent((float)step); + g_audio_engine.update(g_music_time, (float)step * g_tempo_scale); + g_music_time += (float)step * g_tempo_scale; } printf("Seeked to %.1fs\n", start_time); } - const float update_dt = 1.0f / 60.0f; // 60Hz update rate - const int frames_per_update = (int)(32000 * update_dt); // ~533 frames - const int samples_per_update = frames_per_update * 2; // Stereo + const float update_dt = 1.0f / 60.0f; // 60Hz update rate + const int sample_rate = 32000; - AudioRingBuffer* ring_buffer = audio_get_ring_buffer(); - std::vector<float> chunk_buffer(samples_per_update); + std::vector<float> chunk_buffer(2048); // Max samples for one update double physical_time = start_time; + double frame_accumulator = 0.0; while (physical_time < end_time) { - // Update music time and tracker (using tempo logic from - // fill_audio_buffer) - fill_audio_buffer(update_dt, physical_time); + // Calculate exact frames for this update + frame_accumulator += sample_rate * update_dt; + const int frames_this_update = (int)frame_accumulator; + frame_accumulator -= frames_this_update; + const int samples_this_update = frames_this_update * 2; - // Read rendered audio from ring buffer - if (ring_buffer != nullptr) { - ring_buffer->read(chunk_buffer.data(), samples_per_update); - } + // Update tracker/audio state + g_audio_engine.update(g_music_time, update_dt * g_tempo_scale); - // Write to WAV file - wav_backend.write_audio(chunk_buffer.data(), samples_per_update); + // Render directly to buffer (bypass ring buffer) + if (frames_this_update > 0) { + synth_render(chunk_buffer.data(), frames_this_update); + wav_backend.write_audio(chunk_buffer.data(), samples_this_update); + } + // Advance music time + g_music_time += update_dt * g_tempo_scale; physical_time += update_dt; // Progress indicator every second diff --git a/src/audio/audio.cc b/src/audio/audio.cc index ba76a28..a220fbb 100644 --- a/src/audio/audio.cc +++ b/src/audio/audio.cc @@ -78,9 +78,9 @@ void audio_start() { #if !defined(STRIP_ALL) if (!audio_is_prefilled()) { const int buffered = g_ring_buffer.available_read(); - const float buffered_ms = - (float)buffered / 
(RING_BUFFER_SAMPLE_RATE * RING_BUFFER_CHANNELS) * - 1000.0f; + const float buffered_ms = (float)buffered / + (RING_BUFFER_SAMPLE_RATE * RING_BUFFER_CHANNELS) * + 1000.0f; printf("WARNING: Audio buffer not pre-filled (%.1fms < %.1fms)\n", buffered_ms, audio_get_required_prefill_time() * 1000.0f); } @@ -97,11 +97,12 @@ void audio_render_ahead(float music_time, float dt, float target_fill) { // Render in small chunks to keep synth time synchronized with tracker // Chunk size: one frame's worth of audio (~16.6ms @ 60fps) - // TODO(timing): CRITICAL BUG - Truncation here may cause 180ms drift over 63 beats - // (int) cast loses fractional samples: 0.333 samples/frame * 2560 frames = 853 samples = 27ms - // But observed drift is 180ms, so this is not the only source (27ms < 180ms) - // NOTE: This is NOT a float vs double precision issue - floats handle <500s times fine - // See also: tracker.cc BPM timing calculation + // TODO(timing): CRITICAL BUG - Truncation here may cause 180ms drift over 63 + // beats (int) cast loses fractional samples: 0.333 samples/frame * 2560 + // frames = 853 samples = 27ms But observed drift is 180ms, so this is not the + // only source (27ms < 180ms) NOTE: This is NOT a float vs double precision + // issue - floats handle <500s times fine See also: tracker.cc BPM timing + // calculation const int chunk_frames = (int)(dt * RING_BUFFER_SAMPLE_RATE); const int chunk_samples = chunk_frames * RING_BUFFER_CHANNELS; diff --git a/src/audio/tracker.cc b/src/audio/tracker.cc index 37f0683..38c814d 100644 --- a/src/audio/tracker.cc +++ b/src/audio/tracker.cc @@ -193,7 +193,8 @@ static int get_free_pattern_slot() { // sample-accurate timing) // volume_mult: Additional volume multiplier (for humanization) static void trigger_note_event(const TrackerEvent& event, - int start_offset_samples, float volume_mult = 1.0f) { + int start_offset_samples, + float volume_mult = 1.0f) { #if defined(DEBUG_LOG_TRACKER) // VALIDATION: Check sample_id bounds if (event.sample_id >= g_tracker_samples_count) { @@ -234,10 +235,10 @@ static void trigger_note_event(const TrackerEvent& event, } void tracker_update(float music_time_sec, float dt_music_sec) { - // TODO(timing): CRITICAL BUG - Events trigger ~180ms early over 63 beats @ BPM=90 - // Observed: Beat 63 snare at 41.82s in WAV, should be at 42.00s (180ms drift) - // NOTE: This is NOT a float vs double precision issue - floats handle <500s times fine - // Root cause unknown - suspects: + // TODO(timing): CRITICAL BUG - Events trigger ~180ms early over 63 beats @ + // BPM=90 Observed: Beat 63 snare at 41.82s in WAV, should be at 42.00s (180ms + // drift) NOTE: This is NOT a float vs double precision issue - floats handle + // <500s times fine Root cause unknown - suspects: // 1. Systematic bias in time calculation (not random accumulation) // 2. Truncation in audio.cc:103 chunk_frames = (int)(dt * sample_rate) // 3. 
BPM calculation precision below (unit_duration_sec) diff --git a/src/effects/flash_cube_effect.cc b/src/effects/flash_cube_effect.cc index 29e9897..383e66a 100644 --- a/src/effects/flash_cube_effect.cc +++ b/src/effects/flash_cube_effect.cc @@ -60,12 +60,12 @@ void FlashCubeEffect::render(WGPURenderPassEncoder pass, // Detect beat changes for flash trigger (using intensity as proxy for beat // hits) Intensity spikes on beats, so we can use it to trigger flashes if (uniforms.audio_intensity > 0.5f && - flash_intensity_ < 0.3f) { // High intensity + flash cooled down + flash_intensity_ < 0.2f) { // High intensity + flash cooled down flash_intensity_ = 1.0f; // Trigger full flash } // Exponential decay of flash - flash_intensity_ *= 0.90f; // Slower fade for more visible effect + flash_intensity_ *= 0.95f; // Slower fade for more visible effect // Always have base brightness, add flash on top float base_brightness = 0.2f; @@ -80,7 +80,7 @@ void FlashCubeEffect::render(WGPURenderPassEncoder pass, // Slowly rotate the cube for visual interest scene_.objects[0].rotation = - quat::from_axis(vec3(0.3f, 1, 0.2f), uniforms.time * 0.05f); + quat::from_axis(vec3(0.3f, 1, 0.2f), uniforms.time * 0.04f); // Position camera OUTSIDE the cube looking at it from a distance // This way we see the cube as a background element diff --git a/src/effects/gaussian_blur_effect.h b/src/effects/gaussian_blur_effect.h index 651c5c3..bf1062f 100644 --- a/src/effects/gaussian_blur_effect.h +++ b/src/effects/gaussian_blur_effect.h @@ -8,9 +8,9 @@ // Parameters for GaussianBlurEffect (set at construction time) struct GaussianBlurParams { - float strength = 1.0f; // Default + float strength = 1.0f; // Default float strength_audio = 0.5f; // how much to pulse with audio - float stretch = 1.f; // y/x axis ratio + float stretch = 1.f; // y/x axis ratio float _pad = 0.; }; static_assert(sizeof(GaussianBlurParams) == 16, diff --git a/src/effects/particle_spray_effect.h b/src/effects/particle_spray_effect.h index c83d691..216e13f 100644 --- a/src/effects/particle_spray_effect.h +++ b/src/effects/particle_spray_effect.h @@ -3,8 +3,8 @@ #pragma once -#include "gpu/effect.h" #include "effects/particle_defs.h" +#include "gpu/effect.h" class ParticleSprayEffect : public Effect { public: diff --git a/src/effects/particles_effect.h b/src/effects/particles_effect.h index 6d46ea2..a69039f 100644 --- a/src/effects/particles_effect.h +++ b/src/effects/particles_effect.h @@ -3,8 +3,8 @@ #pragma once -#include "gpu/effect.h" #include "effects/particle_defs.h" +#include "gpu/effect.h" class ParticlesEffect : public Effect { public: diff --git a/src/effects/sdf_test_effect.cc b/src/effects/sdf_test_effect.cc index 28b3513..264809f 100644 --- a/src/effects/sdf_test_effect.cc +++ b/src/effects/sdf_test_effect.cc @@ -9,8 +9,8 @@ SDFTestEffect::SDFTestEffect(const GpuContext& ctx) : SDFEffect(ctx) { ResourceBinding bindings[] = { {uniforms_.get(), WGPUBufferBindingType_Uniform}, {camera_params_.get(), WGPUBufferBindingType_Uniform}}; - pass_ = gpu_create_render_pass(ctx_.device, ctx_.format, - sdf_test_shader_wgsl, bindings, 2); + pass_ = gpu_create_render_pass(ctx_.device, ctx_.format, sdf_test_shader_wgsl, + bindings, 2); pass_.vertex_count = 3; } diff --git a/src/gpu/demo_effects.h b/src/gpu/demo_effects.h index 85498ad..d4df20b 100644 --- a/src/gpu/demo_effects.h +++ b/src/gpu/demo_effects.h @@ -40,12 +40,8 @@ #include <memory> - - // Common particle definition is now in effects/particle_defs.h - - // Auto-generated functions from sequence 
compiler void LoadTimeline(MainSequence& main_seq, const GpuContext& ctx); diff --git a/src/gpu/gpu.cc b/src/gpu/gpu.cc index ce234fa..ff4def7 100644 --- a/src/gpu/gpu.cc +++ b/src/gpu/gpu.cc @@ -143,7 +143,6 @@ RenderPass gpu_create_render_pass(WGPUDevice device, WGPUTextureFormat format, WGPUShaderModule shader_module = wgpuDeviceCreateShaderModule(device, &shader_desc); - // Create Bind Group Layout & Bind Group std::vector<WGPUBindGroupLayoutEntry> bgl_entries; std::vector<WGPUBindGroupEntry> bg_entries; diff --git a/src/tests/gpu/test_shader_assets.cc b/src/tests/gpu/test_shader_assets.cc index 135c477..63f9b5d 100644 --- a/src/tests/gpu/test_shader_assets.cc +++ b/src/tests/gpu/test_shader_assets.cc @@ -42,8 +42,8 @@ int main() { all_passed &= validate_shader(AssetId::ASSET_SHADER_COMMON_UNIFORMS, "COMMON_UNIFORMS", {"struct", "GlobalUniforms"}); - all_passed &= validate_shader(AssetId::ASSET_SHADER_SDF_SHAPES, - "SDF_SHAPES", {"fn", "sd"}); + all_passed &= validate_shader(AssetId::ASSET_SHADER_SDF_SHAPES, "SDF_SHAPES", + {"fn", "sd"}); all_passed &= validate_shader(AssetId::ASSET_SHADER_LIGHTING, "LIGHTING", {"fn", "calc"}); all_passed &= validate_shader(AssetId::ASSET_SHADER_RAY_BOX, "RAY_BOX", diff --git a/tools/cnn_v2_test/index.html b/tools/cnn_v2_test/index.html index e226d0c..84702d5 100644 --- a/tools/cnn_v2_test/index.html +++ b/tools/cnn_v2_test/index.html @@ -32,32 +32,21 @@ <meta charset="UTF-8"> <meta name="viewport" content="width=device-width, initial-scale=1.0"> <title>CNN v2 Testing Tool</title> + <link rel="stylesheet" href="../common/style.css"> <style> - * { margin: 0; padding: 0; box-sizing: border-box; } body { - font-family: 'Courier New', monospace; - background: #1a1a1a; - color: #e0e0e0; display: flex; flex-direction: column; height: 100vh; - overflow: hidden; } .header { - background: #2a2a2a; padding: 16px; border-bottom: 1px solid #404040; - display: flex; - align-items: center; gap: 24px; - flex-wrap: wrap; } h1 { font-size: 18px; } .controls { - display: flex; gap: 16px; - align-items: center; - flex-wrap: wrap; } .control-group { display: flex; @@ -66,7 +55,7 @@ } .control-group label { font-size: 12px; } input[type="range"] { width: 120px; } - input[type="number"] { width: 60px; background: #1a1a1a; color: #e0e0e0; border: 1px solid #404040; padding: 4px; } + input[type="number"] { width: 60px; padding: 4px; } .drop-zone { border: 3px dashed #606060; padding: 20px; @@ -80,18 +69,10 @@ color: #4a9eff; } button { - background: #1a1a1a; - border: 1px solid #404040; - color: #e0e0e0; padding: 6px 12px; font-size: 12px; - font-family: 'Courier New', monospace; - cursor: pointer; - transition: all 0.2s; - border-radius: 4px; } button:hover { border-color: #606060; background: #252525; } - button:disabled { opacity: 0.3; cursor: not-allowed; } video { display: none; } .drop-zone:hover { border-color: #4a9eff; background: #2a3545; } .drop-zone.active { border-color: #4a9eff; background: #1a2a3a; } @@ -120,7 +101,6 @@ padding: 24px; overflow: auto; position: relative; - background: #1a1a1a; } .video-controls-float { position: absolute; @@ -185,7 +165,6 @@ padding: 16px; } .panel { - border: 1px solid #404040; border-radius: 4px; overflow: hidden; } @@ -228,28 +207,14 @@ margin-bottom: 12px; } .layer-buttons button { - background: #1a1a1a; - border: 1px solid #404040; - color: #e0e0e0; padding: 6px 12px; font-size: 10px; - font-family: 'Courier New', monospace; - cursor: pointer; - transition: all 0.2s; - } - .layer-buttons button:hover { - border-color: 
#606060; - background: #252525; } .layer-buttons button.active { background: #4a9eff; border-color: #4a9eff; color: #1a1a1a; } - .layer-buttons button:disabled { - opacity: 0.3; - cursor: not-allowed; - } .layer-buttons button:disabled:hover { border-color: #404040; background: #1a1a1a; diff --git a/tools/common/style.css b/tools/common/style.css new file mode 100644 index 0000000..1ba4bad --- /dev/null +++ b/tools/common/style.css @@ -0,0 +1,117 @@ +:root { + --bg-dark: #1e1e1e; + --bg-medium: #252526; + --bg-light: #3c3c3c; + --text-primary: #d4d4d4; + --text-muted: #858585; + --accent-blue: #0e639c; + --accent-blue-hover: #1177bb; + --accent-green: #4ec9b0; + --accent-orange: #ce9178; + --accent-red: #f48771; + --border-color: #858585; + --gap: 10px; + --radius: 4px; +} + +* { + margin: 0; + padding: 0; + box-sizing: border-box; +} + +body { + font-family: 'Segoe UI', Tahoma, Geneva, Verdana, sans-serif; + background: var(--bg-dark); + color: var(--text-primary); + overflow: hidden; +} + +button, .btn, .file-label { + background: var(--accent-blue); + color: white; + border: none; + padding: 10px 20px; + border-radius: var(--radius); + cursor: pointer; + font-size: 14px; + display: inline-block; + text-align: center; +} + +button:hover, .btn:hover, .file-label:hover { + background: var(--accent-blue-hover); +} + +button:disabled, .btn:disabled { + background: var(--bg-light); + cursor: not-allowed; +} + +input[type="file"] { + display: none; +} + +input, select { + background: var(--bg-light); + border: 1px solid var(--border-color); + border-radius: var(--radius); + color: var(--text-primary); + padding: 8px; + font-size: 14px; +} + +.container { + width: 100%; + height: 100vh; + display: flex; + flex-direction: column; +} + +.header { + background: var(--bg-medium); + padding: 15px; + border-bottom: 1px solid var(--border-color); + display: flex; + align-items: center; + gap: 20px; + flex-wrap: wrap; +} + +h1 { + color: var(--accent-green); + font-size: 18px; + white-space: nowrap; +} + +.controls { + display: flex; + gap: var(--gap); + flex-wrap: wrap; + align-items: center; +} + +.panel { + background: var(--bg-medium); + border: 1px solid var(--border-color); + border-radius: var(--radius); + padding: 15px; +} + +.error-message { + background: #5a1d1d; + color: var(--accent-red); + padding: 10px; + border-radius: var(--radius); + box-shadow: 0 2px 8px rgba(0,0,0,0.3); + margin: 10px 0; +} + +.success-message { + background: #1e5231; + color: #89d185; + padding: 10px; + border-radius: var(--radius); + box-shadow: 0 2px 8px rgba(0,0,0,0.3); + margin: 10px 0; +} diff --git a/tools/shader_editor/index.html b/tools/shader_editor/index.html index bad0abb..d93a595 100644 --- a/tools/shader_editor/index.html +++ b/tools/shader_editor/index.html @@ -4,26 +4,8 @@ <meta charset="UTF-8"> <meta name="viewport" content="width=device-width, initial-scale=1.0"> <title>WGSL Shader Editor</title> + <link rel="stylesheet" href="../common/style.css"> <style> -* { - margin: 0; - padding: 0; - box-sizing: border-box; -} - -body { - font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", monospace; - background: #1e1e1e; - color: #d4d4d4; - overflow: hidden; - height: 100vh; -} - -.container { - display: flex; - height: 100vh; -} - .preview-pane { flex: 0 0 57%; background: #252526; @@ -89,26 +71,13 @@ body { } .control-group button { - background: #0e639c; - color: #fff; - border: none; padding: 6px 12px; - border-radius: 3px; - cursor: pointer; font-size: 13px; } -.control-group button:hover { - 
background: #1177bb; -} - .control-group input[type="number"], .control-group select { - background: #3c3c3c; - color: #d4d4d4; - border: 1px solid #3e3e42; padding: 4px 8px; - border-radius: 3px; font-size: 13px; } @@ -153,19 +122,10 @@ body { } .editor-header button { - background: #0e639c; - color: #fff; - border: none; padding: 6px 12px; - border-radius: 3px; - cursor: pointer; font-size: 13px; } -.editor-header button:hover { - background: #1177bb; -} - .editor-container { flex: 1; position: relative; diff --git a/tools/spectral_editor/index.html b/tools/spectral_editor/index.html index 75658ae..2d5f3e5 100644 --- a/tools/spectral_editor/index.html +++ b/tools/spectral_editor/index.html @@ -4,6 +4,7 @@ <meta charset="UTF-8"> <meta name="viewport" content="width=device-width, initial-scale=1.0"> <title>Spectral Brush Editor</title> + <link rel="stylesheet" href="../common/style.css"> <link rel="stylesheet" href="style.css"> </head> <body> diff --git a/tools/spectral_editor/style.css b/tools/spectral_editor/style.css index 48f7463..87fb54e 100644 --- a/tools/spectral_editor/style.css +++ b/tools/spectral_editor/style.css @@ -1,18 +1,4 @@ -/* Spectral Brush Editor Styles */ - -* { - margin: 0; - padding: 0; - box-sizing: border-box; -} - -body { - font-family: 'Segoe UI', Tahoma, Geneva, Verdana, sans-serif; - background: #1e1e1e; - color: #d4d4d4; - overflow: hidden; - height: 100vh; -} +/* Spectral Brush Editor Specific Styles */ #app { display: flex; @@ -20,41 +6,12 @@ body { height: 100vh; } -/* Header */ -header { - background: #252526; - padding: 12px 20px; - border-bottom: 1px solid #3e3e42; - display: flex; - justify-content: space-between; - align-items: center; -} - -header h1 { - font-size: 18px; - font-weight: 600; - color: #cccccc; -} - -.header-controls { - display: flex; - align-items: center; - gap: 15px; -} - -.file-info { - font-size: 13px; - color: #858585; -} - -/* Main content area */ .main-content { display: flex; flex: 1; overflow: hidden; } -/* Canvas container (80% width) */ .canvas-container { flex: 1; position: relative; @@ -89,7 +46,6 @@ header h1 { display: none; } -/* Mini spectrum viewer (bottom-right overlay) */ .spectrum-viewer { position: absolute; bottom: 10px; @@ -99,12 +55,8 @@ header h1 { background: rgba(30, 30, 30, 0.9); border: 1px solid #3e3e42; border-radius: 3px; - display: block; /* Always visible */ - pointer-events: none; /* Don't interfere with mouse events */ -} - -.spectrum-viewer.active { - display: block; /* Keep for backward compatibility */ + display: block; + pointer-events: none; } #spectrumCanvas { @@ -123,7 +75,6 @@ header h1 { color: #858585; } -/* Toolbar (20% width) */ .toolbar { width: 250px; background: #252526; @@ -155,16 +106,6 @@ header h1 { transition: background 0.2s; } -.btn-toolbar:hover { - background: #1177bb; -} - -.btn-toolbar:disabled { - background: #3e3e42; - color: #858585; - cursor: not-allowed; -} - .btn-toolbar.btn-danger { background: #a82d2d; } @@ -199,7 +140,6 @@ header h1 { border-color: #0e639c; } -/* Point info panel */ .point-info { margin-top: 10px; padding: 10px; @@ -224,7 +164,6 @@ header h1 { font-family: monospace; } -/* Control panel (bottom) */ .control-panel { background: #252526; border-top: 1px solid #3e3e42; @@ -314,16 +253,6 @@ header h1 { transition: background 0.2s; } -.btn-playback:hover:not(:disabled) { - background: #1177bb; -} - -.btn-playback:disabled { - background: #3e3e42; - color: #858585; - cursor: not-allowed; -} - .btn-playback kbd { background: rgba(255, 255, 255, 0.1); 
padding: 2px 5px; @@ -331,7 +260,6 @@ header h1 { font-size: 11px; } -/* Action bar (bottom) */ .action-bar { background: #2d2d30; border-top: 1px solid #3e3e42; @@ -365,11 +293,6 @@ header h1 { border-color: #0e639c; } -.btn-action:disabled { - color: #858585; - cursor: not-allowed; -} - .btn-primary { padding: 6px 16px; background: #0e639c; @@ -381,17 +304,11 @@ header h1 { transition: background 0.2s; } -.btn-primary:hover { - background: #1177bb; -} - -/* Icon styling */ .icon { font-size: 14px; line-height: 1; } -/* Modal */ .modal { position: fixed; z-index: 1000; @@ -490,7 +407,6 @@ header h1 { color: #cccccc; } -/* Scrollbar styling */ ::-webkit-scrollbar { width: 10px; height: 10px; @@ -509,7 +425,6 @@ header h1 { background: #4e4e52; } -/* Waveform intensity viewer */ .waveform-container { position: relative; height: 120px; @@ -570,24 +485,9 @@ header h1 { transition: background 0.2s; } -.btn-copy:hover, .btn-snap:hover { - background: #1177bb; -} - -.btn-copy:active, .btn-snap:active { - background: #0d5a8f; -} - .spectrogram-wrapper { flex: 1; position: relative; overflow: hidden; z-index: 1; } - -#spectrogramCanvas { - width: 100%; - height: 100%; - display: block; - cursor: crosshair; -} diff --git a/tools/timeline_editor/README.md b/tools/timeline_editor/README.md index 72b5ae0..66e39bd 100644 --- a/tools/timeline_editor/README.md +++ b/tools/timeline_editor/README.md @@ -39,7 +39,12 @@ This helps identify performance hotspots in your timeline. ## Usage -1. **Open:** `open tools/timeline_editor/index.html` or double-click in browser +1. **Open:** Requires HTTP server (ES6 modules): + ```bash + cd tools/timeline_editor + python3 -m http.server 8080 + ``` + Then open: `http://localhost:8080` 2. **Load timeline:** Click "📂 Load timeline.seq" → select `workspaces/main/timeline.seq` 3. **Load audio:** Click "🎵 Load Audio (WAV)" → select audio file 4. 
**Auto-load via URL:** `index.html?seq=timeline.seq&wav=audio.wav` @@ -125,7 +130,11 @@ open "tools/timeline_editor/index.html?seq=../../workspaces/main/timeline.seq" ## Technical Notes -- Pure HTML/CSS/JavaScript (no dependencies, works offline) +- Modular ES6 structure (requires HTTP server, not file://) + - `index.html` - Main editor and rendering + - `timeline-viewport.js` - Zoom/scroll/indicator control + - `timeline-playback.js` - Audio playback and waveform +- No external dependencies - **Internal representation uses beats** (not seconds) - Sequences have absolute times (beats), effects are relative to parent sequence - BPM used for seconds conversion (tooltips, audio waveform alignment) diff --git a/tools/timeline_editor/index.html b/tools/timeline_editor/index.html index 363c5cb..c5e0264 100644 --- a/tools/timeline_editor/index.html +++ b/tools/timeline_editor/index.html @@ -4,105 +4,466 @@ <meta charset="UTF-8"> <meta name="viewport" content="width=device-width, initial-scale=1.0"> <title>Timeline Editor - timeline.seq</title> + <link rel="stylesheet" href="../common/style.css"> <link rel="icon" href="data:image/svg+xml,<svg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 100 100'><rect width='100' height='100' fill='%231e1e1e'/><rect x='10' y='30' width='15' height='40' fill='%234ec9b0'/><rect x='30' y='20' width='15' height='60' fill='%234ec9b0'/><rect x='50' y='35' width='15' height='30' fill='%234ec9b0'/><rect x='70' y='15' width='15' height='70' fill='%234ec9b0'/></svg>"> <style> - :root { - --bg-dark: #1e1e1e; - --bg-medium: #252526; - --bg-light: #3c3c3c; - --text-primary: #d4d4d4; - --text-muted: #858585; - --accent-blue: #0e639c; - --accent-blue-hover: #1177bb; - --accent-green: #4ec9b0; - --accent-orange: #ce9178; - --accent-red: #f48771; - --border-color: #858585; - --gap: 10px; - --radius: 4px; + body { + padding: 20px; + min-height: 100vh; } - * { margin: 0; padding: 0; box-sizing: border-box; } - body { font-family: 'Segoe UI', Tahoma, Geneva, Verdana, sans-serif; background: var(--bg-dark); color: var(--text-primary); padding: 20px; min-height: 100vh; } - .container { max-width: 100%; width: 100%; margin: 0 auto; } + .container { + max-width: 100%; + width: 100%; + margin: 0 auto; + } + + header { + background: var(--bg-medium); + padding: 20px; + border-radius: 8px; + margin-bottom: 20px; + display: flex; + align-items: center; + justify-content: space-between; + gap: 20px; + flex-wrap: wrap; + } + + .zoom-controls { + display: flex; + gap: var(--gap); + flex-wrap: wrap; + align-items: center; + margin-bottom: var(--gap); + } + + .checkbox-label { + display: flex; + align-items: center; + gap: 8px; + cursor: pointer; + user-select: none; + } + + .checkbox-label input[type="checkbox"] { + cursor: pointer; + } + + .timeline-container { + background: var(--bg-medium); + border-radius: 8px; + position: relative; + height: calc(100vh - 280px); + min-height: 500px; + display: flex; + flex-direction: column; + } + + .timeline-content { + flex: 1; + overflow: auto; + position: relative; + padding: 0 20px 20px 20px; + scrollbar-width: none; + -ms-overflow-style: none; + } + + .timeline-content::-webkit-scrollbar { + display: none; + } + + .timeline { + position: relative; + min-height: 100%; + } + + .sticky-header { + position: sticky; + top: 0; + background: var(--bg-medium); + z-index: 100; + padding: 20px 20px 10px 20px; + border-bottom: 2px solid var(--bg-light); + flex-shrink: 0; + } + + .waveform-container { + position: relative; + height: 80px; + overflow: hidden; + 
background: rgba(0, 0, 0, 0.5); + border-radius: var(--radius); + cursor: crosshair; + } + + #cpuLoadCanvas { + position: absolute; + left: 0; + bottom: 0; + height: 10px; + display: block; + z-index: 1; + } - header { background: var(--bg-medium); padding: 20px; border-radius: 8px; margin-bottom: 20px; display: flex; align-items: center; justify-content: space-between; gap: 20px; flex-wrap: wrap; } - h1 { color: var(--accent-green); white-space: nowrap; } - .controls { display: flex; gap: var(--gap); flex-wrap: wrap; align-items: center; } - .zoom-controls { display: flex; gap: var(--gap); flex-wrap: wrap; align-items: center; margin-bottom: var(--gap); } + #waveformCanvas { + position: absolute; + left: 0; + top: 0; + height: 80px; + display: block; + z-index: 2; + } + + .waveform-cursor { + position: absolute; + top: 0; + bottom: 0; + width: 1px; + background: rgba(78, 201, 176, 0.6); + pointer-events: none; + z-index: 3; + display: none; + } + + .waveform-tooltip { + position: absolute; + background: rgba(30, 30, 30, 0.95); + color: var(--text-primary); + padding: 6px 10px; + border-radius: 4px; + font-size: 12px; + pointer-events: none; + z-index: 4; + display: none; + white-space: nowrap; + border: 1px solid var(--border-color); + box-shadow: 0 2px 8px rgba(0,0,0,0.3); + } + + .playback-indicator { + position: absolute; + top: 0; + bottom: 0; + left: 20px; + width: 2px; + background: var(--accent-red); + box-shadow: 0 0 4px rgba(244, 135, 113, 0.8); + pointer-events: none; + z-index: 110; + display: none; + } - button, .file-label { background: var(--accent-blue); color: white; border: none; padding: 10px 20px; border-radius: var(--radius); cursor: pointer; font-size: 14px; display: inline-block; } - button:hover, .file-label:hover { background: var(--accent-blue-hover); } - button:disabled { background: var(--bg-light); cursor: not-allowed; } - input[type="file"] { display: none; } + .time-markers { + position: relative; + height: 30px; + margin-top: var(--gap); + border-bottom: 1px solid var(--bg-light); + } - .checkbox-label { display: flex; align-items: center; gap: 8px; cursor: pointer; user-select: none; } - .checkbox-label input[type="checkbox"] { cursor: pointer; } + .time-marker { + position: absolute; + top: 0; + font-size: 12px; + color: var(--text-muted); + } - .timeline-container { background: var(--bg-medium); border-radius: 8px; position: relative; height: calc(100vh - 280px); min-height: 500px; display: flex; flex-direction: column; } - .timeline-content { flex: 1; overflow: auto; position: relative; padding: 0 20px 20px 20px; scrollbar-width: none; -ms-overflow-style: none; } - .timeline-content::-webkit-scrollbar { display: none; } - .timeline { position: relative; min-height: 100%; } + .time-marker::before { + content: ''; + position: absolute; + left: 0; + top: 20px; + width: 1px; + height: 10px; + background: var(--bg-light); + } - .sticky-header { position: sticky; top: 0; background: var(--bg-medium); z-index: 100; padding: 20px 20px 10px 20px; border-bottom: 2px solid var(--bg-light); flex-shrink: 0; } - .waveform-container { position: relative; height: 80px; overflow: hidden; background: rgba(0, 0, 0, 0.3); border-radius: var(--radius); cursor: crosshair; } - #cpuLoadCanvas { position: absolute; left: 0; bottom: 0; height: 10px; display: block; z-index: 1; } - #waveformCanvas { position: absolute; left: 0; top: 0; height: 80px; display: block; z-index: 2; } + .time-marker::after { + content: ''; + position: absolute; + left: 0; + top: 30px; + width: 1px; + 
height: 10000px; + background: rgba(100, 100, 60, 0.9); + pointer-events: none; + } - .playback-indicator { position: absolute; top: 0; bottom: 0; left: 20px; width: 2px; background: var(--accent-red); box-shadow: 0 0 4px rgba(244, 135, 113, 0.8); pointer-events: none; z-index: 110; display: none; } + .sequence { + position: absolute; + background: #264f78; + border: 2px solid var(--accent-blue); + border-radius: var(--radius); + padding: 8px; + cursor: move; + min-height: 40px; + transition: box-shadow 0.2s; + } - .time-markers { position: relative; height: 30px; margin-top: var(--gap); border-bottom: 1px solid var(--bg-light); } - .time-marker { position: absolute; top: 0; font-size: 12px; color: var(--text-muted); } - .time-marker::before { content: ''; position: absolute; left: 0; top: 20px; width: 1px; height: 10px; background: var(--bg-light); } - .time-marker::after { content: ''; position: absolute; left: 0; top: 30px; width: 1px; height: 10000px; background: rgba(60, 60, 60, 0.2); pointer-events: none; } + .sequence:hover { + box-shadow: 0 0 10px rgba(14, 99, 156, 0.5); + } + + .sequence.selected { + border-color: var(--accent-green); + box-shadow: 0 0 10px rgba(78, 201, 176, 0.5); + } + + .sequence.collapsed { + overflow: hidden !important; + background: #1a3a4a !important; + } + + .sequence.collapsed .sequence-name { + display: none !important; + } + + .sequence.active-playing { + border-color: var(--accent-green); + background: #2a5f4a; + } + + .sequence.active-flash { + animation: sequenceFlash 0.6s ease-out; + } - .sequence { position: absolute; background: #264f78; border: 2px solid var(--accent-blue); border-radius: var(--radius); padding: 8px; cursor: move; min-height: 40px; transition: box-shadow 0.2s; } - .sequence:hover { box-shadow: 0 0 10px rgba(14, 99, 156, 0.5); } - .sequence.selected { border-color: var(--accent-green); box-shadow: 0 0 10px rgba(78, 201, 176, 0.5); } - .sequence.collapsed { overflow: hidden !important; background: #1a3a4a !important; } - .sequence.collapsed .sequence-name { display: none !important; } - .sequence.active-playing { border-color: var(--accent-green); background: #2a5f4a; } - .sequence.active-flash { animation: sequenceFlash 0.6s ease-out; } @keyframes sequenceFlash { - 0% { box-shadow: 0 0 20px rgba(78, 201, 176, 0.8); border-color: var(--accent-green); } - 100% { box-shadow: 0 0 10px rgba(14, 99, 156, 0.5); border-color: var(--accent-blue); } + 0% { + box-shadow: 0 0 20px rgba(78, 201, 176, 0.8); + border-color: var(--accent-green); + } + 100% { + box-shadow: 0 0 10px rgba(14, 99, 156, 0.5); + border-color: var(--accent-blue); + } + } + + .sequence-header { + position: absolute; + top: 0; + left: 0; + right: 0; + padding: 8px; + z-index: 5; + cursor: move; + user-select: none; + } + + .sequence-header-name { + font-size: 14px; + font-weight: bold; + color: #ffffff; } - .sequence-header { position: absolute; top: 0; left: 0; right: 0; padding: 8px; z-index: 5; cursor: move; user-select: none; } - .sequence-header-name { font-size: 14px; font-weight: bold; color: #ffffff; } - .sequence:not(.collapsed) .sequence-header-name { display: none; } - .sequence-name { position: absolute; top: 50%; left: 50%; transform: translate(-50%, -50%); font-size: 24px; font-weight: bold; color: #ffffff; text-shadow: 2px 2px 8px rgba(0, 0, 0, 0.9), -1px -1px 4px rgba(0, 0, 0, 0.7); pointer-events: none; white-space: nowrap; opacity: 1; transition: opacity 0.3s ease; z-index: 10; } - .sequence.hovered .sequence-name { opacity: 0; } + 
.sequence:not(.collapsed) .sequence-header-name { + display: none; + } + + .sequence-name { + position: absolute; + top: 50%; + left: 50%; + transform: translate(-50%, -50%); + font-size: 24px; + font-weight: bold; + color: #ffffff; + text-shadow: 2px 2px 8px rgba(0, 0, 0, 0.9), -1px -1px 4px rgba(0, 0, 0, 0.7); + pointer-events: none; + white-space: nowrap; + opacity: 1; + transition: opacity 0.3s ease; + z-index: 10; + } - .effect { position: absolute; background: #3a3d41; border: 1px solid var(--border-color); border-radius: 3px; padding: 4px 8px; cursor: move; font-size: 11px; transition: box-shadow 0.2s; display: flex; align-items: center; white-space: nowrap; overflow: hidden; text-overflow: ellipsis; } - .effect:hover { box-shadow: 0 0 8px rgba(133, 133, 133, 0.5); background: #45484d; } - .effect.selected { border-color: var(--accent-orange); box-shadow: 0 0 8px rgba(206, 145, 120, 0.5); } - .effect.conflict { background: #4a1d1d; border-color: var(--accent-red); box-shadow: 0 0 8px rgba(244, 135, 113, 0.6); } - .effect.conflict:hover { background: #5a2424; } - .effect-handle { position: absolute; top: 0; width: 6px; height: 100%; background: rgba(78, 201, 176, 0.8); cursor: ew-resize; display: none; z-index: 10; } - .effect.selected .effect-handle { display: block; } - .effect-handle.left { left: 0; border-radius: 3px 0 0 3px; } - .effect-handle.right { right: 0; border-radius: 0 3px 3px 0; } - .effect-handle:hover { background: var(--accent-green); width: 8px; } + .sequence.hovered .sequence-name { + opacity: 0; + } - .properties-panel { position: fixed; bottom: 20px; left: 20px; width: 350px; max-height: 80vh; background: var(--bg-medium); padding: 15px; border-radius: 8px; box-shadow: 0 4px 12px rgba(0, 0, 0, 0.5); z-index: 1000; overflow-y: auto; transition: transform 0.3s ease; } - .properties-panel.collapsed { transform: translateY(calc(100% + 40px)); } - .panel-header { display: flex; justify-content: space-between; align-items: center; margin-bottom: 15px; padding-bottom: 10px; border-bottom: 1px solid var(--bg-light); } - .panel-header h2 { margin: 0; color: var(--accent-green); font-size: 16px; } - .panel-toggle { background: transparent; border: 1px solid var(--border-color); color: var(--text-primary); padding: 4px 8px; border-radius: 3px; cursor: pointer; font-size: 12px; } - .panel-toggle:hover { background: var(--bg-light); } - .panel-collapse-btn { position: fixed; bottom: 20px; left: 20px; background: var(--bg-medium); border: 1px solid var(--border-color); color: var(--text-primary); padding: 8px 12px; border-radius: var(--radius); cursor: pointer; z-index: 999; box-shadow: 0 2px 6px rgba(0, 0, 0, 0.3); display: none; } - .panel-collapse-btn:hover { background: var(--bg-light); } - .panel-collapse-btn.visible { display: block; } + .effect { + position: absolute; + background: #3a3d41; + border: 1px solid var(--border-color); + border-radius: 3px; + padding: 4px 8px; + cursor: move; + font-size: 11px; + transition: box-shadow 0.2s; + display: flex; + align-items: center; + white-space: nowrap; + overflow: hidden; + text-overflow: ellipsis; + } - .property-group { margin-bottom: 15px; } - .property-group label { display: block; margin-bottom: 5px; color: var(--text-muted); font-size: 14px; } - .property-group input, .property-group select { width: 100%; padding: 8px; background: var(--bg-light); border: 1px solid var(--border-color); border-radius: var(--radius); color: var(--text-primary); font-size: 14px; } + .effect:hover { + box-shadow: 0 0 8px rgba(133, 133, 
133, 0.5); + background: #45484d; + } - .stats { background: var(--bg-dark); padding: 10px; border-radius: var(--radius); margin-top: 10px; font-size: 12px; color: var(--text-muted); } - #messageArea { position: fixed; top: 80px; right: 20px; z-index: 2000; max-width: 400px; } - .error { background: #5a1d1d; color: var(--accent-red); padding: 10px; border-radius: var(--radius); box-shadow: 0 2px 8px rgba(0,0,0,0.3); } - .success { background: #1e5231; color: #89d185; padding: 10px; border-radius: var(--radius); box-shadow: 0 2px 8px rgba(0,0,0,0.3); } + .effect.selected { + border-color: var(--accent-orange); + box-shadow: 0 0 8px rgba(206, 145, 120, 0.5); + } + + .effect.conflict { + background: #4a1d1d; + border-color: var(--accent-red); + box-shadow: 0 0 8px rgba(244, 135, 113, 0.6); + } + + .effect.conflict:hover { + background: #5a2424; + } + + .effect-handle { + position: absolute; + top: 0; + width: 6px; + height: 100%; + background: rgba(78, 201, 176, 0.8); + cursor: ew-resize; + display: none; + z-index: 10; + } + + .effect.selected .effect-handle { + display: block; + } + + .effect-handle.left { + left: 0; + border-radius: 3px 0 0 3px; + } + + .effect-handle.right { + right: 0; + border-radius: 0 3px 3px 0; + } + + .effect-handle:hover { + background: var(--accent-green); + width: 8px; + } + + .properties-panel { + position: fixed; + bottom: 20px; + left: 20px; + width: 350px; + max-height: 80vh; + background: var(--bg-medium); + padding: 15px; + border-radius: 8px; + box-shadow: 0 4px 12px rgba(0, 0, 0, 0.5); + z-index: 1000; + overflow-y: auto; + transition: transform 0.3s ease; + } + + .properties-panel.collapsed { + transform: translateY(calc(100% + 40px)); + } + + .panel-header { + display: flex; + justify-content: space-between; + align-items: center; + margin-bottom: 15px; + padding-bottom: 10px; + border-bottom: 1px solid var(--bg-light); + } + + .panel-header h2 { + margin: 0; + color: var(--accent-green); + font-size: 16px; + } + + .panel-toggle { + background: transparent; + border: 1px solid var(--border-color); + color: var(--text-primary); + padding: 4px 8px; + border-radius: 3px; + cursor: pointer; + font-size: 12px; + } + + .panel-toggle:hover { + background: var(--bg-light); + } + + .panel-collapse-btn { + position: fixed; + bottom: 20px; + left: 20px; + background: var(--bg-medium); + border: 1px solid var(--border-color); + color: var(--text-primary); + padding: 8px 12px; + border-radius: var(--radius); + cursor: pointer; + z-index: 999; + box-shadow: 0 2px 6px rgba(0, 0, 0, 0.3); + display: none; + } + + .panel-collapse-btn:hover { + background: var(--bg-light); + } + + .panel-collapse-btn.visible { + display: block; + } + + .property-group { + margin-bottom: 15px; + } + + .property-group label { + display: block; + margin-bottom: 5px; + color: var(--text-muted); + font-size: 14px; + } + + .property-group input, + .property-group select { + width: 100%; + } + + .stats { + background: var(--bg-dark); + padding: 10px; + border-radius: var(--radius); + margin-top: 10px; + font-size: 12px; + color: var(--text-muted); + } + + #messageArea { + position: fixed; + top: 80px; + right: 20px; + z-index: 2000; + max-width: 400px; + } + + .error { + background: #5a1d1d; + color: var(--accent-red); + padding: 10px; + border-radius: var(--radius); + box-shadow: 0 2px 8px rgba(0,0,0,0.3); + } + + .success { + background: #1e5231; + color: #89d185; + padding: 10px; + border-radius: var(--radius); + box-shadow: 0 2px 8px rgba(0,0,0,0.3); + } </style> </head> <body> @@ -149,12 
+510,14 @@ <div id="messageArea"></div> - <div class="timeline-container"> + <div class="timeline-container" id="timelineContainer"> <div class="playback-indicator" id="playbackIndicator"></div> <div class="sticky-header"> <div class="waveform-container" id="waveformContainer"> <canvas id="cpuLoadCanvas"></canvas> <canvas id="waveformCanvas"></canvas> + <div class="waveform-cursor" id="waveformCursor"></div> + <div class="waveform-tooltip" id="waveformTooltip"></div> </div> <div class="time-markers" id="timeMarkers"></div> </div> @@ -176,16 +539,15 @@ <div class="stats" id="stats"></div> </div> - <script> + <script type="module"> + import { ViewportController } from './timeline-viewport.js'; + import { PlaybackController } from './timeline-playback.js'; + // Constants const POST_PROCESS_EFFECTS = new Set(['FadeEffect', 'FlashEffect', 'GaussianBlurEffect', 'SolarizeEffect', 'VignetteEffect', 'ChromaAberrationEffect', 'DistortEffect', 'ThemeModulationEffect', 'CNNEffect', 'CNNv2Effect']); - const TIMELINE_LEFT_PADDING = 20; - const SCROLL_VIEWPORT_FRACTION = 0.4; - const SMOOTH_SCROLL_SPEED = 0.1; - const VERTICAL_SCROLL_SPEED = 0.3; const SEQUENCE_GAP = 10; const SEQUENCE_DEFAULT_WIDTH = 10; const SEQUENCE_DEFAULT_DURATION = 16; @@ -195,21 +557,29 @@ const SEQUENCE_BOTTOM_PADDING = 5; const EFFECT_SPACING = 30; const EFFECT_HEIGHT = 26; - const WAVEFORM_AMPLITUDE_SCALE = 0.4; + + // BPM computation helper + const computeBPMValues = (bpm) => ({ + secondsPerBeat: 60.0 / bpm, + beatsPerSecond: bpm / 60.0 + }); // State + const DEFAULT_BPM = 120; const state = { - sequences: [], currentFile: null, selectedItem: null, pixelsPerSecond: 100, - showBeats: true, quantizeUnit: 1, bpm: 120, isDragging: false, dragOffset: { x: 0, y: 0 }, + sequences: [], currentFile: null, selectedItem: null, pixelsPerBeat: 100, + showBeats: true, quantizeUnit: 1, bpm: DEFAULT_BPM, isDragging: false, dragOffset: { x: 0, y: 0 }, lastActiveSeqIndex: -1, isDraggingHandle: false, handleType: null, handleDragOffset: 0, - audioBuffer: null, audioDuration: 0, audioSource: null, audioContext: null, + audioBuffer: null, audioDurationSeconds: 0, audioSource: null, audioContext: null, isPlaying: false, playbackStartTime: 0, playbackOffset: 0, playStartPosition: 0, animationFrameId: null, - lastExpandedSeqIndex: -1, dragMoved: false + lastExpandedSeqIndex: -1, dragMoved: false, + ...computeBPMValues(DEFAULT_BPM) }; // DOM const dom = { timeline: document.getElementById('timeline'), + timelineContainer: document.getElementById('timelineContainer'), timelineContent: document.getElementById('timelineContent'), fileInput: document.getElementById('fileInput'), saveBtn: document.getElementById('saveBtn'), @@ -238,7 +608,9 @@ bpmSlider: document.getElementById('bpmSlider'), currentBPM: document.getElementById('currentBPM'), showBeatsCheckbox: document.getElementById('showBeatsCheckbox'), - quantizeSelect: document.getElementById('quantizeSelect') + quantizeSelect: document.getElementById('quantizeSelect'), + waveformCursor: document.getElementById('waveformCursor'), + waveformTooltip: document.getElementById('waveformTooltip') }; // Parser @@ -247,7 +619,7 @@ let currentSequence = null, bpm = 120, currentPriority = 0; const parseTime = (timeStr) => { - if (timeStr.endsWith('s')) return parseFloat(timeStr.slice(0, -1)) * bpm / 60.0; + if (timeStr.endsWith('s')) return parseFloat(timeStr.slice(0, -1)) * bpm / 60.0; // Local bpm during parsing if (timeStr.endsWith('b')) return parseFloat(timeStr.slice(0, -1)); return parseFloat(timeStr); }; 
@@ -293,14 +665,25 @@ } // Helpers - const beatsToTime = (beats) => beats * 60.0 / state.bpm; - const timeToBeats = (seconds) => seconds * state.bpm / 60.0; + const updateBPM = (newBPM) => { + state.bpm = newBPM; + Object.assign(state, computeBPMValues(newBPM)); + }; + const beatsToTime = (beats) => beats * state.secondsPerBeat; + const timeToBeats = (seconds) => seconds * state.beatsPerSecond; const beatRange = (start, end) => { const s = start.toFixed(1), e = end.toFixed(1); const ss = beatsToTime(start).toFixed(1), es = beatsToTime(end).toFixed(1); return state.showBeats ? `${s}-${e}b (${ss}-${es}s)` : `${ss}-${es}s (${s}-${e}b)`; }; + // Utilities + function showMessage(text, type) { + if (type === 'error') console.error(text); + dom.messageArea.innerHTML = `<div class="${type}">${text}</div>`; + setTimeout(() => dom.messageArea.innerHTML = '', 3000); + } + function detectConflicts(seq) { const conflicts = new Set(); const priorityGroups = {}; @@ -334,84 +717,16 @@ return output; } - // Audio - async function loadAudioFile(file) { - try { - const arrayBuffer = await file.arrayBuffer(); - if (!state.audioContext) state.audioContext = new (window.AudioContext || window.webkitAudioContext)(); - state.audioBuffer = await state.audioContext.decodeAudioData(arrayBuffer); - state.audioDuration = state.audioBuffer.duration; - renderWaveform(); - dom.playbackControls.style.display = 'flex'; - dom.playbackIndicator.style.display = 'block'; - dom.clearAudioBtn.disabled = false; - dom.replayBtn.disabled = false; - showMessage(`Audio loaded: ${state.audioDuration.toFixed(2)}s`, 'success'); - renderTimeline(); - } catch (err) { - showMessage(`Error loading audio: ${err.message}`, 'error'); - } - } - - function renderWaveform() { - if (!state.audioBuffer) return; - const canvas = dom.waveformCanvas, ctx = canvas.getContext('2d'); - - // Calculate maxTime same as timeline to ensure alignment - let maxTime = 60; - for (const seq of state.sequences) { - maxTime = Math.max(maxTime, seq.startTime + SEQUENCE_DEFAULT_DURATION); - for (const effect of seq.effects) maxTime = Math.max(maxTime, seq.startTime + effect.endTime); - } - if (state.audioDuration > 0) maxTime = Math.max(maxTime, state.audioDuration * state.bpm / 60.0); - - const w = maxTime * state.pixelsPerSecond, h = 80; - canvas.width = w; canvas.height = h; - canvas.style.width = `${w}px`; canvas.style.height = `${h}px`; - ctx.fillStyle = 'rgba(0, 0, 0, 0.3)'; ctx.fillRect(0, 0, w, h); - - const channelData = state.audioBuffer.getChannelData(0); - const audioBeats = timeToBeats(state.audioDuration); - const audioPixelWidth = audioBeats * state.pixelsPerSecond; - const samplesPerPixel = Math.ceil(channelData.length / audioPixelWidth); - const centerY = h / 2, amplitudeScale = h * WAVEFORM_AMPLITUDE_SCALE; - - ctx.strokeStyle = '#4ec9b0'; ctx.lineWidth = 1; ctx.beginPath(); - for (let x = 0; x < audioPixelWidth; x++) { - const start = Math.floor(x * samplesPerPixel); - const end = Math.min(start + samplesPerPixel, channelData.length); - let min = 1.0, max = -1.0; - for (let i = start; i < end; i++) { - min = Math.min(min, channelData[i]); - max = Math.max(max, channelData[i]); - } - const yMin = centerY - min * amplitudeScale, yMax = centerY - max * amplitudeScale; - x === 0 ? 
ctx.moveTo(x, yMin) : ctx.lineTo(x, yMin); - ctx.lineTo(x, yMax); - } - ctx.stroke(); - ctx.strokeStyle = 'rgba(255, 255, 255, 0.1)'; - ctx.beginPath(); ctx.moveTo(0, centerY); ctx.lineTo(audioPixelWidth, centerY); ctx.stroke(); - - // Draw beat markers across full maxTime width - ctx.strokeStyle = 'rgba(255, 255, 255, 0.15)'; - ctx.lineWidth = 1; - for (let beat = 0; beat <= maxTime; beat++) { - const x = beat * state.pixelsPerSecond; - ctx.beginPath(); - ctx.moveTo(x, 0); - ctx.lineTo(x, h); - ctx.stroke(); - } - } + // Controllers - initialized after DOM setup + let viewportController, playbackController; function computeCPULoad() { - if (state.sequences.length === 0) return { maxTime: 60, loads: [], conflicts: [] }; - let maxTime = Math.max(60, ...state.sequences.flatMap(seq => + if (state.sequences.length === 0) return { maxTimeBeats: 60, loads: [], conflicts: [] }; + let maxTimeBeats = Math.max(60, ...state.sequences.flatMap(seq => seq.effects.map(eff => seq.startTime + eff.endTime))); - if (state.audioDuration > 0) maxTime = Math.max(maxTime, timeToBeats(state.audioDuration)); + if (state.audioDurationSeconds > 0) maxTimeBeats = Math.max(maxTimeBeats, timeToBeats(state.audioDurationSeconds)); - const resolution = 0.1, numSamples = Math.ceil(maxTime / resolution); + const resolution = 0.1, numSamples = Math.ceil(maxTimeBeats / resolution); const loads = new Array(numSamples).fill(0); const conflicts = new Array(numSamples).fill(false); @@ -462,19 +777,19 @@ }); }); - return { maxTime, loads, conflicts, resolution }; + return { maxTimeBeats, loads, conflicts, resolution }; } function renderCPULoad() { const canvas = dom.cpuLoadCanvas, ctx = canvas.getContext('2d'); - const { maxTime, loads, conflicts, resolution } = computeCPULoad(); - const w = maxTime * state.pixelsPerSecond, h = 10; + const { maxTimeBeats, loads, conflicts, resolution } = computeCPULoad(); + const w = maxTimeBeats * state.pixelsPerBeat, h = 10; canvas.width = w; canvas.height = h; canvas.style.width = `${w}px`; canvas.style.height = `${h}px`; ctx.fillStyle = 'rgba(0, 0, 0, 0.3)'; ctx.fillRect(0, 0, w, h); if (loads.length === 0) return; - const barWidth = resolution * state.pixelsPerSecond; + const barWidth = resolution * state.pixelsPerBeat; loads.forEach((load, i) => { if (load === 0) return; const n = Math.min(load / 8, 1.0); @@ -487,114 +802,31 @@ }); } - function clearAudio() { - stopPlayback(); state.audioBuffer = null; state.audioDuration = 0; state.playbackOffset = 0; - state.playStartPosition = 0; - dom.playbackControls.style.display = 'none'; - dom.playbackIndicator.style.display = 'none'; - dom.clearAudioBtn.disabled = true; - dom.replayBtn.disabled = true; - const ctx = dom.waveformCanvas.getContext('2d'); - ctx.clearRect(0, 0, dom.waveformCanvas.width, dom.waveformCanvas.height); - renderTimeline(); - showMessage('Audio cleared', 'success'); - } - - async function startPlayback() { - if (!state.audioBuffer || !state.audioContext) return; - if (state.audioSource) try { state.audioSource.stop(); } catch (e) {} state.audioSource = null; - if (state.audioContext.state === 'suspended') await state.audioContext.resume(); - try { - state.audioSource = state.audioContext.createBufferSource(); - state.audioSource.buffer = state.audioBuffer; - state.audioSource.connect(state.audioContext.destination); - state.audioSource.start(0, state.playbackOffset); - state.playbackStartTime = state.audioContext.currentTime; - state.isPlaying = true; dom.playPauseBtn.textContent = '⏸ Pause'; - updatePlaybackPosition(); - 
state.audioSource.onended = () => { if (state.isPlaying) stopPlayback(); }; - } catch (e) { - console.error('Failed to start playback:', e); showMessage('Playback failed: ' + e.message, 'error'); - state.audioSource = null; state.isPlaying = false; - } - } - - function stopPlayback(savePosition = true) { - if (state.audioSource) try { state.audioSource.stop(); } catch (e) {} state.audioSource = null; - if (state.animationFrameId) { cancelAnimationFrame(state.animationFrameId); state.animationFrameId = null; } - if (state.isPlaying && savePosition) { - const elapsed = state.audioContext.currentTime - state.playbackStartTime; - state.playbackOffset = Math.min(state.playbackOffset + elapsed, state.audioDuration); - } - state.isPlaying = false; dom.playPauseBtn.textContent = '▶ Play'; - } - - function updatePlaybackPosition() { - if (!state.isPlaying) return; - const elapsed = state.audioContext.currentTime - state.playbackStartTime; - const currentTime = state.playbackOffset + elapsed; - const currentBeats = timeToBeats(currentTime); - dom.playbackTime.textContent = `${currentTime.toFixed(2)}s (${currentBeats.toFixed(2)}b)`; - updateIndicatorPosition(currentBeats, true); - expandSequenceAtTime(currentBeats); - state.animationFrameId = requestAnimationFrame(updatePlaybackPosition); - } - - function expandSequenceAtTime(currentBeats) { - let activeSeqIndex = -1; - for (let i = 0; i < state.sequences.length; i++) { - const seq = state.sequences[i]; - const seqEndBeats = seq.startTime + (seq.effects.length > 0 ? Math.max(...seq.effects.map(e => e.endTime)) : 0); - if (currentBeats >= seq.startTime && currentBeats <= seqEndBeats) { activeSeqIndex = i; break; } - } - if (activeSeqIndex !== state.lastExpandedSeqIndex) { - const seqDivs = dom.timeline.querySelectorAll('.sequence'); - if (state.lastExpandedSeqIndex >= 0 && seqDivs[state.lastExpandedSeqIndex]) { - seqDivs[state.lastExpandedSeqIndex].classList.remove('active-playing'); - } - if (activeSeqIndex >= 0 && seqDivs[activeSeqIndex]) { - seqDivs[activeSeqIndex].classList.add('active-playing'); - } - state.lastExpandedSeqIndex = activeSeqIndex; - } - } - - function updateIndicatorPosition(beats, smoothScroll = false) { - const timelineX = beats * state.pixelsPerSecond; - const scrollLeft = dom.timelineContent.scrollLeft; - dom.playbackIndicator.style.left = `${timelineX - scrollLeft + TIMELINE_LEFT_PADDING}px`; - if (smoothScroll) { - const targetScroll = timelineX - dom.timelineContent.clientWidth * SCROLL_VIEWPORT_FRACTION; - const scrollDiff = targetScroll - scrollLeft; - if (Math.abs(scrollDiff) > 5) dom.timelineContent.scrollLeft += scrollDiff * SMOOTH_SCROLL_SPEED; - } - } - // Render function renderTimeline() { renderCPULoad(); dom.timeline.innerHTML = ''; document.getElementById('timeMarkers').innerHTML = ''; - let maxTime = 60; + let maxTimeBeats = 60; for (const seq of state.sequences) { - maxTime = Math.max(maxTime, seq.startTime + SEQUENCE_DEFAULT_DURATION); - for (const effect of seq.effects) maxTime = Math.max(maxTime, seq.startTime + effect.endTime); + maxTimeBeats = Math.max(maxTimeBeats, seq.startTime + SEQUENCE_DEFAULT_DURATION); + for (const effect of seq.effects) maxTimeBeats = Math.max(maxTimeBeats, seq.startTime + effect.endTime); } - if (state.audioDuration > 0) maxTime = Math.max(maxTime, state.audioDuration * state.bpm / 60.0); - const timelineWidth = maxTime * state.pixelsPerSecond; + if (state.audioDurationSeconds > 0) maxTimeBeats = Math.max(maxTimeBeats, state.audioDurationSeconds * state.beatsPerSecond); + const 
timelineWidth = maxTimeBeats * state.pixelsPerBeat; dom.timeline.style.width = `${timelineWidth}px`; let totalTimelineHeight = 0; const timeMarkers = document.getElementById('timeMarkers'); if (state.showBeats) { - for (let beat = 0; beat <= maxTime; beat += 4) { + for (let beat = 0; beat <= maxTimeBeats; beat += 4) { const marker = document.createElement('div'); - marker.className = 'time-marker'; marker.style.left = `${beat * state.pixelsPerSecond}px`; + marker.className = 'time-marker'; marker.style.left = `${beat * state.pixelsPerBeat}px`; marker.textContent = `${beat}b`; timeMarkers.appendChild(marker); } } else { - const maxSeconds = maxTime * 60.0 / state.bpm; + const maxSeconds = maxTimeBeats * state.secondsPerBeat; for (let t = 0; t <= maxSeconds; t += 1) { - const beatPos = t * state.bpm / 60.0, marker = document.createElement('div'); - marker.className = 'time-marker'; marker.style.left = `${beatPos * state.pixelsPerSecond}px`; + const beatPos = t * state.beatsPerSecond, marker = document.createElement('div'); + marker.className = 'time-marker'; marker.style.left = `${beatPos * state.pixelsPerBeat}px`; marker.textContent = `${t}s`; timeMarkers.appendChild(marker); } } @@ -611,9 +843,9 @@ const numEffects = seq.effects.length; const fullHeight = Math.max(SEQUENCE_MIN_HEIGHT, SEQUENCE_TOP_PADDING + numEffects * EFFECT_SPACING + SEQUENCE_BOTTOM_PADDING); const seqHeight = seq._collapsed ? SEQUENCE_COLLAPSED_HEIGHT : fullHeight; - seqDiv.style.left = `${seqVisualStart * state.pixelsPerSecond}px`; + seqDiv.style.left = `${seqVisualStart * state.pixelsPerBeat}px`; seqDiv.style.top = `${cumulativeY}px`; - seqDiv.style.width = `${(seqVisualEnd - seqVisualStart) * state.pixelsPerSecond}px`; + seqDiv.style.width = `${(seqVisualEnd - seqVisualStart) * state.pixelsPerBeat}px`; seqDiv.style.height = `${seqHeight}px`; seqDiv.style.minHeight = `${seqHeight}px`; seqDiv.style.maxHeight = `${seqHeight}px`; seq._yPosition = cumulativeY; cumulativeY += seqHeight + SEQUENCE_GAP; totalTimelineHeight = cumulativeY; const seqHeaderDiv = document.createElement('div'); seqHeaderDiv.className = 'sequence-header'; @@ -640,9 +872,9 @@ if (conflicts.has(effectIndex)) effectDiv.classList.add('conflict'); Object.assign(effectDiv.dataset, { seqIndex, effectIndex }); Object.assign(effectDiv.style, { - left: `${(seq.startTime + effect.startTime) * state.pixelsPerSecond}px`, + left: `${(seq.startTime + effect.startTime) * state.pixelsPerBeat}px`, top: `${seq._yPosition + SEQUENCE_TOP_PADDING + effectIndex * EFFECT_SPACING}px`, - width: `${(effect.endTime - effect.startTime) * state.pixelsPerSecond}px`, + width: `${(effect.endTime - effect.startTime) * state.pixelsPerBeat}px`, height: `${EFFECT_HEIGHT}px` }); effectDiv.innerHTML = `<div class="effect-handle left"></div><small>${effect.className}</small><div class="effect-handle right"></div>`; @@ -685,13 +917,13 @@ if (!state.isDragging || !state.selectedItem) return; state.dragMoved = true; const containerRect = dom.timelineContent.getBoundingClientRect(); - let newTime = Math.max(0, (e.clientX - containerRect.left + dom.timelineContent.scrollLeft - state.dragOffset.x) / state.pixelsPerSecond); - if (state.quantizeUnit > 0) newTime = Math.round(newTime * state.quantizeUnit) / state.quantizeUnit; - if (state.selectedItem.type === 'sequence') state.sequences[state.selectedItem.index].startTime = newTime; + let newTimeBeats = Math.max(0, (e.clientX - containerRect.left + dom.timelineContent.scrollLeft - state.dragOffset.x) / state.pixelsPerBeat); + if 
(state.quantizeUnit > 0) newTimeBeats = Math.round(newTimeBeats * state.quantizeUnit) / state.quantizeUnit; + if (state.selectedItem.type === 'sequence') state.sequences[state.selectedItem.index].startTime = newTimeBeats; else if (state.selectedItem.type === 'effect') { const seq = state.sequences[state.selectedItem.seqIndex], effect = seq.effects[state.selectedItem.effectIndex]; - const duration = effect.endTime - effect.startTime, relativeTime = newTime - seq.startTime; - effect.startTime = relativeTime; effect.endTime = effect.startTime + duration; + const durationBeats = effect.endTime - effect.startTime, relativeTimeBeats = newTimeBeats - seq.startTime; + effect.startTime = relativeTimeBeats; effect.endTime = effect.startTime + durationBeats; } renderTimeline(); updateProperties(); } @@ -709,7 +941,7 @@ state.selectedItem = { type: 'effect', seqIndex, effectIndex, index: seqIndex }; const seq = state.sequences[seqIndex], effect = seq.effects[effectIndex]; const containerRect = dom.timelineContent.getBoundingClientRect(); - const mouseTimeBeats = (e.clientX - containerRect.left + dom.timelineContent.scrollLeft) / state.pixelsPerSecond; + const mouseTimeBeats = (e.clientX - containerRect.left + dom.timelineContent.scrollLeft) / state.pixelsPerBeat; const handleTimeBeats = seq.startTime + (type === 'left' ? effect.startTime : effect.endTime); state.handleDragOffset = handleTimeBeats - mouseTimeBeats; document.addEventListener('mousemove', onHandleDrag); document.addEventListener('mouseup', stopHandleDrag); @@ -718,13 +950,13 @@ function onHandleDrag(e) { if (!state.isDraggingHandle || !state.selectedItem) return; const containerRect = dom.timelineContent.getBoundingClientRect(); - let newTime = (e.clientX - containerRect.left + dom.timelineContent.scrollLeft) / state.pixelsPerSecond + state.handleDragOffset; - newTime = Math.max(0, newTime); - if (state.quantizeUnit > 0) newTime = Math.round(newTime * state.quantizeUnit) / state.quantizeUnit; + let newTimeBeats = (e.clientX - containerRect.left + dom.timelineContent.scrollLeft) / state.pixelsPerBeat + state.handleDragOffset; + newTimeBeats = Math.max(0, newTimeBeats); + if (state.quantizeUnit > 0) newTimeBeats = Math.round(newTimeBeats * state.quantizeUnit) / state.quantizeUnit; const seq = state.sequences[state.selectedItem.seqIndex], effect = seq.effects[state.selectedItem.effectIndex]; - const relativeTime = newTime - seq.startTime; - if (state.handleType === 'left') effect.startTime = Math.min(relativeTime, effect.endTime - 0.1); - else if (state.handleType === 'right') effect.endTime = Math.max(effect.startTime + 0.1, relativeTime); + const relativeTimeBeats = newTimeBeats - seq.startTime; + if (state.handleType === 'left') effect.startTime = Math.min(relativeTimeBeats, effect.endTime - 0.1); + else if (state.handleType === 'right') effect.endTime = Math.max(effect.startTime + 0.1, relativeTimeBeats); renderTimeline(); updateProperties(); } @@ -762,8 +994,8 @@ const samePriority = effect.priorityModifier === '='; dom.propertiesContent.innerHTML = ` <div class="property-group"><label>Effect Class</label><input type="text" id="propClassName" value="${effect.className}"></div> - <div class="property-group"><label>Start Time (relative to sequence)</label><input type="number" id="propStartTime" value="${effect.startTime}" step="0.1"></div> - <div class="property-group"><label>End Time (relative to sequence)</label><input type="number" id="propEndTime" value="${effect.endTime}" step="0.1"></div> + <div class="property-group"><label>Start 
Time (beats, relative to sequence)</label><input type="number" id="propStartTime" value="${effect.startTime}" step="0.1"></div> + <div class="property-group"><label>End Time (beats, relative to sequence)</label><input type="number" id="propEndTime" value="${effect.endTime}" step="0.1"></div> <div class="property-group"><label>Constructor Arguments</label><input type="text" id="propArgs" value="${effect.args || ''}"></div> <div class="property-group"><label>Stack Position (determines priority)</label> <div style="display: flex; gap: 5px; margin-bottom: 10px;"> @@ -826,18 +1058,11 @@ updateProperties(); } - // Utilities - function showMessage(text, type) { - if (type === 'error') console.error(text); - dom.messageArea.innerHTML = `<div class="${type}">${text}</div>`; - setTimeout(() => dom.messageArea.innerHTML = '', 3000); - } - function updateStats() { const effectCount = state.sequences.reduce((sum, seq) => sum + seq.effects.length, 0); - const maxTime = Math.max(0, ...state.sequences.flatMap(seq => + const maxTimeBeats = Math.max(0, ...state.sequences.flatMap(seq => seq.effects.map(e => seq.startTime + e.endTime).concat(seq.startTime))); - dom.stats.innerHTML = `📊 Sequences: ${state.sequences.length} | 🎬 Effects: ${effectCount} | ⏱️ Duration: ${maxTime.toFixed(2)}s`; + dom.stats.innerHTML = `📊 Sequences: ${state.sequences.length} | 🎬 Effects: ${effectCount} | ⏱️ Duration: ${maxTimeBeats.toFixed(2)}b (${beatsToTime(maxTimeBeats).toFixed(2)}s)`; } async function loadFromURLParams() { @@ -848,21 +1073,22 @@ const response = await fetch(seqURL); if (!response.ok) throw new Error(`HTTP ${response.status}`); const content = await response.text(), parsed = parseSeqFile(content); - state.sequences = parsed.sequences; state.bpm = parsed.bpm; + state.sequences = parsed.sequences; + updateBPM(parsed.bpm); dom.currentBPM.value = state.bpm; dom.bpmSlider.value = state.bpm; state.currentFile = seqURL.split('/').pop(); state.playbackOffset = 0; renderTimeline(); dom.saveBtn.disabled = false; dom.addSequenceBtn.disabled = false; dom.reorderBtn.disabled = false; - updateIndicatorPosition(0, false); + if (viewportController) viewportController.updateIndicatorPosition(0, false); showMessage(`Loaded ${state.currentFile} from URL`, 'success'); } catch (err) { showMessage(`Error loading seq file: ${err.message}`, 'error'); } } - if (wavURL) { + if (wavURL && playbackController) { try { const response = await fetch(wavURL); if (!response.ok) throw new Error(`HTTP ${response.status}`); const blob = await response.blob(), file = new File([blob], wavURL.split('/').pop(), { type: 'audio/wav' }); - await loadAudioFile(file); + await playbackController.loadAudioFile(file); } catch (err) { showMessage(`Error loading audio file: ${err.message}`, 'error'); } } } @@ -876,11 +1102,12 @@ reader.onload = e => { try { const parsed = parseSeqFile(e.target.result); - state.sequences = parsed.sequences; state.bpm = parsed.bpm; + state.sequences = parsed.sequences; + updateBPM(parsed.bpm); dom.currentBPM.value = state.bpm; dom.bpmSlider.value = state.bpm; state.playbackOffset = 0; renderTimeline(); dom.saveBtn.disabled = false; dom.addSequenceBtn.disabled = false; dom.reorderBtn.disabled = false; - updateIndicatorPosition(0, false); + if (viewportController) viewportController.updateIndicatorPosition(0, false); showMessage(`Loaded ${state.currentFile} - ${state.sequences.length} sequences`, 'success'); } catch (err) { showMessage(`Error parsing file: ${err.message}`, 'error'); } }; @@ -894,41 +1121,7 @@ showMessage('File 
saved', 'success'); }); - dom.audioInput.addEventListener('change', e => { const file = e.target.files[0]; if (file) loadAudioFile(file); }); - dom.clearAudioBtn.addEventListener('click', () => { clearAudio(); dom.audioInput.value = ''; }); - dom.playPauseBtn.addEventListener('click', async () => { - if (state.isPlaying) stopPlayback(); - else { - if (state.playbackOffset >= state.audioDuration) state.playbackOffset = 0; - state.playStartPosition = state.playbackOffset; - await startPlayback(); - } - }); - - dom.replayBtn.addEventListener('click', async () => { - stopPlayback(false); - state.playbackOffset = state.playStartPosition; - const replayBeats = timeToBeats(state.playbackOffset); - dom.playbackTime.textContent = `${state.playbackOffset.toFixed(2)}s (${replayBeats.toFixed(2)}b)`; - updateIndicatorPosition(replayBeats, false); - await startPlayback(); - }); - - dom.waveformContainer.addEventListener('click', async e => { - if (!state.audioBuffer) return; - const rect = dom.waveformContainer.getBoundingClientRect(); - const canvasOffset = parseFloat(dom.waveformCanvas.style.left) || 0; - const clickX = e.clientX - rect.left - canvasOffset; - const clickBeats = clickX / state.pixelsPerSecond; - const clickTime = beatsToTime(clickBeats); - const wasPlaying = state.isPlaying; - if (wasPlaying) stopPlayback(false); - state.playbackOffset = Math.max(0, Math.min(clickTime, state.audioDuration)); - const pausedBeats = timeToBeats(state.playbackOffset); - dom.playbackTime.textContent = `${state.playbackOffset.toFixed(2)}s (${pausedBeats.toFixed(2)}b)`; - updateIndicatorPosition(pausedBeats, false); - if (wasPlaying) await startPlayback(); - }); + // Audio/playback event handlers - managed by PlaybackController dom.addSequenceBtn.addEventListener('click', () => { state.sequences.push({ type: 'sequence', startTime: 0, priority: 0, effects: [], _collapsed: true }); @@ -963,30 +1156,24 @@ showMessage('Sequences re-ordered by start time', 'success'); }); - dom.zoomSlider.addEventListener('input', e => { - state.pixelsPerSecond = parseInt(e.target.value); - dom.zoomLevel.textContent = `${state.pixelsPerSecond}%`; - if (state.audioBuffer) renderWaveform(); - renderTimeline(); - updateIndicatorPosition(timeToBeats(state.playbackOffset), false); - }); + // Zoom handler - managed by ViewportController dom.bpmSlider.addEventListener('input', e => { - state.bpm = parseInt(e.target.value); + updateBPM(parseInt(e.target.value)); dom.currentBPM.value = state.bpm; - if (state.audioBuffer) renderWaveform(); + if (state.audioBuffer && playbackController) playbackController.renderWaveform(); renderTimeline(); - updateIndicatorPosition(timeToBeats(state.playbackOffset), false); + if (viewportController) viewportController.updateIndicatorPosition(timeToBeats(state.playbackOffset), false); }); dom.currentBPM.addEventListener('change', e => { const bpm = parseInt(e.target.value); if (!isNaN(bpm) && bpm >= 60 && bpm <= 200) { - state.bpm = bpm; + updateBPM(bpm); dom.bpmSlider.value = bpm; - if (state.audioBuffer) renderWaveform(); + if (state.audioBuffer && playbackController) playbackController.renderWaveform(); renderTimeline(); - updateIndicatorPosition(timeToBeats(state.playbackOffset), false); + if (viewportController) viewportController.updateIndicatorPosition(timeToBeats(state.playbackOffset), false); } else { e.target.value = state.bpm; } @@ -998,22 +1185,15 @@ dom.panelCollapseBtn.addEventListener('click', () => { dom.propertiesPanel.classList.remove('collapsed'); 
dom.panelCollapseBtn.classList.remove('visible'); dom.panelToggle.textContent = '▼ Collapse'; }); dom.timeline.addEventListener('click', () => { state.selectedItem = null; dom.deleteBtn.disabled = true; dom.addEffectBtn.disabled = true; renderTimeline(); updateProperties(); }); - dom.timeline.addEventListener('dblclick', async e => { + dom.timeline.addEventListener('dblclick', e => { if (e.target !== dom.timeline) return; + if (!playbackController || !state.audioBuffer) return; const containerRect = dom.timelineContent.getBoundingClientRect(); - const clickX = e.clientX - containerRect.left + dom.timelineContent.scrollLeft - TIMELINE_LEFT_PADDING; - const clickBeats = clickX / state.pixelsPerSecond; + const clickX = e.clientX - containerRect.left + dom.timelineContent.scrollLeft - viewportController.TIMELINE_LEFT_PADDING; + const clickBeats = clickX / state.pixelsPerBeat; const clickTime = beatsToTime(clickBeats); - if (state.audioBuffer) { - const wasPlaying = state.isPlaying; - if (wasPlaying) stopPlayback(false); - state.playbackOffset = Math.max(0, Math.min(clickTime, state.audioDuration)); - const pausedBeats = timeToBeats(state.playbackOffset); - dom.playbackTime.textContent = `${state.playbackOffset.toFixed(2)}s (${pausedBeats.toFixed(2)}b)`; - updateIndicatorPosition(pausedBeats, false); - if (wasPlaying) await startPlayback(); - showMessage(`Seek to ${clickTime.toFixed(2)}s (${clickBeats.toFixed(2)}b)`, 'success'); - } + const result = playbackController.seekTo(clickBeats, clickTime); + if (result) showMessage(`Seek to ${result.clickTime.toFixed(2)}s (${result.clickBeats.toFixed(2)}b)`, 'success'); }); document.addEventListener('keydown', e => { @@ -1030,51 +1210,21 @@ } }); - dom.timelineContent.addEventListener('scroll', () => { - const scrollLeft = dom.timelineContent.scrollLeft; - dom.cpuLoadCanvas.style.left = `-${scrollLeft}px`; - dom.waveformCanvas.style.left = `-${scrollLeft}px`; - document.getElementById('timeMarkers').style.transform = `translateX(-${scrollLeft}px)`; - updateIndicatorPosition(timeToBeats(state.playbackOffset), false); - }); + // Scroll/wheel handlers - managed by ViewportController - dom.timelineContent.addEventListener('wheel', e => { - e.preventDefault(); - if (e.ctrlKey || e.metaKey) { - const rect = dom.timelineContent.getBoundingClientRect(), mouseX = e.clientX - rect.left; - const scrollLeft = dom.timelineContent.scrollLeft, timeUnderCursor = (scrollLeft + mouseX) / state.pixelsPerSecond; - const zoomDelta = e.deltaY > 0 ? 
-10 : 10; - const newPixelsPerSecond = Math.max(10, Math.min(500, state.pixelsPerSecond + zoomDelta)); - if (newPixelsPerSecond !== state.pixelsPerSecond) { - state.pixelsPerSecond = newPixelsPerSecond; - dom.zoomSlider.value = state.pixelsPerSecond; - dom.zoomLevel.textContent = `${state.pixelsPerSecond}%`; - if (state.audioBuffer) renderWaveform(); - renderTimeline(); - dom.timelineContent.scrollLeft = timeUnderCursor * newPixelsPerSecond - mouseX; - updateIndicatorPosition(timeToBeats(state.playbackOffset), false); - } - return; - } - dom.timelineContent.scrollLeft += e.deltaY; - const currentScrollLeft = dom.timelineContent.scrollLeft, viewportWidth = dom.timelineContent.clientWidth; - const slack = (viewportWidth / state.pixelsPerSecond) * 0.1, currentTime = (currentScrollLeft / state.pixelsPerSecond) + slack; - let targetSeqIndex = 0; - for (let i = 0; i < state.sequences.length; i++) { - if (state.sequences[i].startTime <= currentTime) targetSeqIndex = i; else break; - } - if (targetSeqIndex !== state.lastActiveSeqIndex && state.sequences.length > 0) { - state.lastActiveSeqIndex = targetSeqIndex; - const seqDivs = dom.timeline.querySelectorAll('.sequence'); - if (seqDivs[targetSeqIndex]) { - seqDivs[targetSeqIndex].classList.add('active-flash'); - setTimeout(() => seqDivs[targetSeqIndex]?.classList.remove('active-flash'), 600); - } + // Initialize controllers + const renderCallback = (trigger) => { + if (trigger === 'zoom' || trigger === 'zoomWheel') { + if (state.audioBuffer && playbackController) playbackController.renderWaveform(); + renderTimeline(); + if (viewportController) viewportController.updateIndicatorPosition(timeToBeats(state.playbackOffset), false); + } else { + renderTimeline(); } - const targetScrollTop = state.sequences[targetSeqIndex]?._yPosition || 0; - const currentScrollTop = dom.timelineContent.scrollTop, scrollDiff = targetScrollTop - currentScrollTop; - if (Math.abs(scrollDiff) > 5) dom.timelineContent.scrollTop += scrollDiff * VERTICAL_SCROLL_SPEED; - }, { passive: false }); + }; + + viewportController = new ViewportController(state, dom, renderCallback); + playbackController = new PlaybackController(state, dom, viewportController, renderCallback, showMessage); window.addEventListener('resize', renderTimeline); renderTimeline(); loadFromURLParams(); diff --git a/tools/timeline_editor/timeline-playback.js b/tools/timeline_editor/timeline-playback.js new file mode 100644 index 0000000..a1c50ab --- /dev/null +++ b/tools/timeline_editor/timeline-playback.js @@ -0,0 +1,322 @@ +// timeline-playback.js - Audio playback and waveform rendering + +export class PlaybackController { + constructor(state, dom, viewportController, renderCallback, showMessage) { + this.state = state; + this.dom = dom; + this.viewport = viewportController; + this.renderCallback = renderCallback; + this.showMessage = showMessage; + + // Constants + this.WAVEFORM_AMPLITUDE_SCALE = 0.4; + this.SEQUENCE_DEFAULT_DURATION = 16; + + this.init(); + } + + init() { + this.dom.audioInput.addEventListener('change', e => { + const file = e.target.files[0]; + if (file) this.loadAudioFile(file); + }); + + this.dom.clearAudioBtn.addEventListener('click', () => { + this.clearAudio(); + this.dom.audioInput.value = ''; + }); + + this.dom.playPauseBtn.addEventListener('click', async () => { + if (this.state.isPlaying) this.stopPlayback(); + else { + if (this.state.playbackOffset >= this.state.audioDurationSeconds) { + this.state.playbackOffset = 0; + } + this.state.playStartPosition = 
this.state.playbackOffset; + await this.startPlayback(); + } + }); + + this.dom.replayBtn.addEventListener('click', async () => { + this.stopPlayback(false); + this.state.playbackOffset = this.state.playStartPosition; + const replayBeats = this.timeToBeats(this.state.playbackOffset); + this.dom.playbackTime.textContent = `${this.state.playbackOffset.toFixed(2)}s (${replayBeats.toFixed(2)}b)`; + this.viewport.updateIndicatorPosition(replayBeats, false); + await this.startPlayback(); + }); + + this.dom.waveformContainer.addEventListener('click', async e => { + if (!this.state.audioBuffer) return; + const rect = this.dom.waveformContainer.getBoundingClientRect(); + const canvasOffset = parseFloat(this.dom.waveformCanvas.style.left) || 0; + const clickX = e.clientX - rect.left - canvasOffset; + const clickBeats = clickX / this.state.pixelsPerBeat; + const clickTime = this.beatsToTime(clickBeats); + + const wasPlaying = this.state.isPlaying; + if (wasPlaying) this.stopPlayback(false); + this.state.playbackOffset = Math.max(0, Math.min(clickTime, this.state.audioDurationSeconds)); + const pausedBeats = this.timeToBeats(this.state.playbackOffset); + this.dom.playbackTime.textContent = `${this.state.playbackOffset.toFixed(2)}s (${pausedBeats.toFixed(2)}b)`; + this.viewport.updateIndicatorPosition(pausedBeats, false); + if (wasPlaying) await this.startPlayback(); + }); + } + + async loadAudioFile(file) { + try { + const arrayBuffer = await file.arrayBuffer(); + + // Detect original WAV sample rate before decoding + const dataView = new DataView(arrayBuffer); + let originalSampleRate = 32000; // Default assumption + + // Parse WAV header to get original sample rate + // "RIFF" at 0, "WAVE" at 8, "fmt " at 12, sample rate at 24 + if (dataView.getUint32(0, false) === 0x52494646 && // "RIFF" + dataView.getUint32(8, false) === 0x57415645) { // "WAVE" + originalSampleRate = dataView.getUint32(24, true); // Little-endian + console.log(`Detected WAV sample rate: ${originalSampleRate}Hz`); + } + + if (!this.state.audioContext) { + this.state.audioContext = new (window.AudioContext || window.webkitAudioContext)(); + } + + this.state.audioBuffer = await this.state.audioContext.decodeAudioData(arrayBuffer); + this.state.audioDurationSeconds = this.state.audioBuffer.duration; + this.state.originalSampleRate = originalSampleRate; + this.state.resampleRatio = this.state.audioContext.sampleRate / originalSampleRate; + + console.log(`AudioContext rate: ${this.state.audioContext.sampleRate}Hz, resample ratio: ${this.state.resampleRatio.toFixed(3)}x`); + + this.renderWaveform(); + this.dom.playbackControls.style.display = 'flex'; + this.dom.playbackIndicator.style.display = 'block'; + this.dom.clearAudioBtn.disabled = false; + this.dom.replayBtn.disabled = false; + this.showMessage(`Audio loaded: ${this.state.audioDurationSeconds.toFixed(2)}s @ ${originalSampleRate}Hz`, 'success'); + this.renderCallback('audioLoaded'); + } catch (err) { + this.showMessage(`Error loading audio: ${err.message}`, 'error'); + } + } + + renderWaveform() { + if (!this.state.audioBuffer) return; + const canvas = this.dom.waveformCanvas; + const ctx = canvas.getContext('2d'); + + // Calculate maxTimeBeats same as timeline + let maxTimeBeats = 60; + for (const seq of this.state.sequences) { + maxTimeBeats = Math.max(maxTimeBeats, seq.startTime + this.SEQUENCE_DEFAULT_DURATION); + for (const effect of seq.effects) { + maxTimeBeats = Math.max(maxTimeBeats, seq.startTime + effect.endTime); + } + } + if (this.state.audioDurationSeconds > 0) { + 
maxTimeBeats = Math.max(maxTimeBeats, this.state.audioDurationSeconds * this.state.beatsPerSecond); + } + + const w = maxTimeBeats * this.state.pixelsPerBeat; + const h = 80; + canvas.width = w; + canvas.height = h; + canvas.style.width = `${w}px`; + canvas.style.height = `${h}px`; + + ctx.fillStyle = 'rgba(0, 0, 0, 0.3)'; + ctx.fillRect(0, 0, w, h); + + const channelData = this.state.audioBuffer.getChannelData(0); + const audioBeats = this.timeToBeats(this.state.audioDurationSeconds); + const audioPixelWidth = audioBeats * this.state.pixelsPerBeat; + const samplesPerPixel = Math.ceil(channelData.length / audioPixelWidth); + const centerY = h / 2; + const amplitudeScale = h * this.WAVEFORM_AMPLITUDE_SCALE; + + ctx.strokeStyle = '#4ec9b0'; + ctx.lineWidth = 1; + ctx.beginPath(); + + for (let x = 0; x < audioPixelWidth; x++) { + const start = Math.floor(x * samplesPerPixel); + const end = Math.min(start + samplesPerPixel, channelData.length); + let min = 1.0, max = -1.0; + + for (let i = start; i < end; i++) { + min = Math.min(min, channelData[i]); + max = Math.max(max, channelData[i]); + } + + const yMin = centerY - min * amplitudeScale; + const yMax = centerY - max * amplitudeScale; + + if (x === 0) ctx.moveTo(x, yMin); + else ctx.lineTo(x, yMin); + ctx.lineTo(x, yMax); + } + ctx.stroke(); + + // Center line + ctx.strokeStyle = 'rgba(255, 255, 255, 0.1)'; + ctx.beginPath(); + ctx.moveTo(0, centerY); + ctx.lineTo(audioPixelWidth, centerY); + ctx.stroke(); + + // Beat markers + ctx.strokeStyle = 'rgba(255, 255, 255, 0.50)'; + ctx.lineWidth = 1; + for (let beat = 0; beat <= maxTimeBeats; beat++) { + const x = beat * this.state.pixelsPerBeat; + ctx.beginPath(); + ctx.moveTo(x, 0); + ctx.lineTo(x, h); + ctx.stroke(); + } + } + + clearAudio() { + this.stopPlayback(); + this.state.audioBuffer = null; + this.state.audioDurationSeconds = 0; + this.state.playbackOffset = 0; + this.state.playStartPosition = 0; + + this.dom.playbackControls.style.display = 'none'; + this.dom.playbackIndicator.style.display = 'none'; + this.dom.clearAudioBtn.disabled = true; + this.dom.replayBtn.disabled = true; + + const ctx = this.dom.waveformCanvas.getContext('2d'); + ctx.clearRect(0, 0, this.dom.waveformCanvas.width, this.dom.waveformCanvas.height); + + this.renderCallback('audioClear'); + this.showMessage('Audio cleared', 'success'); + } + + async startPlayback() { + if (!this.state.audioBuffer || !this.state.audioContext) return; + + if (this.state.audioSource) { + try { this.state.audioSource.stop(); } catch (e) {} + this.state.audioSource = null; + } + + if (this.state.audioContext.state === 'suspended') { + await this.state.audioContext.resume(); + } + + try { + this.state.audioSource = this.state.audioContext.createBufferSource(); + this.state.audioSource.buffer = this.state.audioBuffer; + this.state.audioSource.connect(this.state.audioContext.destination); + this.state.audioSource.start(0, this.state.playbackOffset); + this.state.playbackStartTime = this.state.audioContext.currentTime; + this.state.isPlaying = true; + this.dom.playPauseBtn.textContent = '⏸ Pause'; + + this.updatePlaybackPosition(); + + this.state.audioSource.onended = () => { + if (this.state.isPlaying) this.stopPlayback(); + }; + } catch (e) { + console.error('Failed to start playback:', e); + this.showMessage('Playback failed: ' + e.message, 'error'); + this.state.audioSource = null; + this.state.isPlaying = false; + } + } + + stopPlayback(savePosition = true) { + if (this.state.audioSource) { + try { this.state.audioSource.stop(); } catch 
(e) {} + this.state.audioSource = null; + } + + if (this.state.animationFrameId) { + cancelAnimationFrame(this.state.animationFrameId); + this.state.animationFrameId = null; + } + + if (this.state.isPlaying && savePosition) { + const elapsed = this.state.audioContext.currentTime - this.state.playbackStartTime; + this.state.playbackOffset = Math.min(this.state.playbackOffset + elapsed, this.state.audioDurationSeconds); + } + + this.state.isPlaying = false; + this.dom.playPauseBtn.textContent = '▶ Play'; + } + + updatePlaybackPosition() { + if (!this.state.isPlaying) return; + + const elapsed = this.state.audioContext.currentTime - this.state.playbackStartTime; + const currentTime = this.state.playbackOffset + elapsed; + const currentBeats = this.timeToBeats(currentTime); + + this.dom.playbackTime.textContent = `${currentTime.toFixed(2)}s (${currentBeats.toFixed(2)}b)`; + this.viewport.updateIndicatorPosition(currentBeats, true); + this.expandSequenceAtTime(currentBeats); + + this.state.animationFrameId = requestAnimationFrame(() => this.updatePlaybackPosition()); + } + + expandSequenceAtTime(currentBeats) { + let activeSeqIndex = -1; + + for (let i = 0; i < this.state.sequences.length; i++) { + const seq = this.state.sequences[i]; + const seqEndBeats = seq.startTime + (seq.effects.length > 0 ? + Math.max(...seq.effects.map(e => e.endTime)) : 0); + + if (currentBeats >= seq.startTime && currentBeats <= seqEndBeats) { + activeSeqIndex = i; + break; + } + } + + if (activeSeqIndex !== this.state.lastExpandedSeqIndex) { + const seqDivs = this.dom.timeline.querySelectorAll('.sequence'); + + if (this.state.lastExpandedSeqIndex >= 0 && seqDivs[this.state.lastExpandedSeqIndex]) { + seqDivs[this.state.lastExpandedSeqIndex].classList.remove('active-playing'); + } + + if (activeSeqIndex >= 0 && seqDivs[activeSeqIndex]) { + seqDivs[activeSeqIndex].classList.add('active-playing'); + } + + this.state.lastExpandedSeqIndex = activeSeqIndex; + } + } + + seekTo(clickBeats, clickTime) { + if (!this.state.audioBuffer) return; + + const wasPlaying = this.state.isPlaying; + if (wasPlaying) this.stopPlayback(false); + + this.state.playbackOffset = Math.max(0, Math.min(clickTime, this.state.audioDurationSeconds)); + const pausedBeats = this.timeToBeats(this.state.playbackOffset); + this.dom.playbackTime.textContent = `${this.state.playbackOffset.toFixed(2)}s (${pausedBeats.toFixed(2)}b)`; + this.viewport.updateIndicatorPosition(pausedBeats, false); + + if (wasPlaying) this.startPlayback(); + + return { clickTime, clickBeats }; + } + + // Helpers + beatsToTime(beats) { + return beats * this.state.secondsPerBeat; + } + + timeToBeats(seconds) { + return seconds * this.state.beatsPerSecond; + } +} diff --git a/tools/timeline_editor/timeline-viewport.js b/tools/timeline_editor/timeline-viewport.js new file mode 100644 index 0000000..dcedb45 --- /dev/null +++ b/tools/timeline_editor/timeline-viewport.js @@ -0,0 +1,170 @@ +// timeline-viewport.js - Viewport zoom/scroll control + +export class ViewportController { + constructor(state, dom, renderCallback) { + this.state = state; + this.dom = dom; + this.renderCallback = renderCallback; + + // Constants + this.TIMELINE_LEFT_PADDING = 20; + this.SCROLL_VIEWPORT_FRACTION = 0.4; + this.SMOOTH_SCROLL_SPEED = 0.1; + this.VERTICAL_SCROLL_SPEED = 0.3; + + this.init(); + } + + init() { + // Zoom controls + this.dom.zoomSlider.addEventListener('input', e => this.handleZoomSlider(e)); + + // Scroll sync + this.dom.timelineContent.addEventListener('scroll', () => this.handleScroll()); 
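+    // handleScroll() re-offsets the sticky canvases (CPU load, waveform) and the
+    // time markers so they stay visually aligned with the horizontally scrolled content.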
+ + // Wheel handling - capture at container level to override all child elements + const wheelHandler = e => this.handleWheel(e); + this.dom.timelineContainer.addEventListener('wheel', wheelHandler, { passive: false, capture: true }); + + // Prevent wheel bubbling from UI containers outside timeline + document.querySelector('header').addEventListener('wheel', e => e.stopPropagation()); + this.dom.propertiesPanel.addEventListener('wheel', e => e.stopPropagation()); + document.querySelector('.zoom-controls').addEventListener('wheel', e => e.stopPropagation()); + document.querySelector('.stats').addEventListener('wheel', e => e.stopPropagation()); + + // Waveform hover tracking + this.dom.waveformContainer.addEventListener('mouseenter', () => this.showWaveformCursor()); + this.dom.waveformContainer.addEventListener('mouseleave', () => this.hideWaveformCursor()); + this.dom.waveformContainer.addEventListener('mousemove', e => this.updateWaveformCursor(e)); + } + + handleZoomSlider(e) { + this.state.pixelsPerBeat = parseInt(e.target.value); + this.dom.zoomLevel.textContent = `${this.state.pixelsPerBeat}%`; + this.renderCallback('zoom'); + } + + handleScroll() { + const scrollLeft = this.dom.timelineContent.scrollLeft; + this.dom.cpuLoadCanvas.style.left = `-${scrollLeft}px`; + this.dom.waveformCanvas.style.left = `-${scrollLeft}px`; + document.getElementById('timeMarkers').style.transform = `translateX(-${scrollLeft}px)`; + this.updateIndicatorPosition(this.timeToBeats(this.state.playbackOffset), false); + } + + handleWheel(e) { + e.preventDefault(); + + // Zoom with ctrl/cmd + if (e.ctrlKey || e.metaKey) { + this.handleZoomWheel(e); + return; + } + + // Horizontal scroll + this.dom.timelineContent.scrollLeft += e.deltaY; + + // Auto-scroll to active sequence + this.autoScrollToSequence(); + } + + handleZoomWheel(e) { + const rect = this.dom.timelineContent.getBoundingClientRect(); + const mouseX = e.clientX - rect.left; + const scrollLeft = this.dom.timelineContent.scrollLeft; + const timeUnderCursor = (scrollLeft + mouseX) / this.state.pixelsPerBeat; + + const zoomDelta = e.deltaY > 0 ? 
-10 : 10; + const newPixelsPerBeat = Math.max(10, Math.min(500, this.state.pixelsPerBeat + zoomDelta)); + + if (newPixelsPerBeat !== this.state.pixelsPerBeat) { + this.state.pixelsPerBeat = newPixelsPerBeat; + this.dom.zoomSlider.value = this.state.pixelsPerBeat; + this.dom.zoomLevel.textContent = `${this.state.pixelsPerBeat}%`; + this.renderCallback('zoomWheel'); + this.dom.timelineContent.scrollLeft = timeUnderCursor * newPixelsPerBeat - mouseX; + this.updateIndicatorPosition(this.timeToBeats(this.state.playbackOffset), false); + } + } + + autoScrollToSequence() { + const currentScrollLeft = this.dom.timelineContent.scrollLeft; + const viewportWidth = this.dom.timelineContent.clientWidth; + const slack = (viewportWidth / this.state.pixelsPerBeat) * 0.1; + const currentTime = (currentScrollLeft / this.state.pixelsPerBeat) + slack; + + let targetSeqIndex = 0; + for (let i = 0; i < this.state.sequences.length; i++) { + if (this.state.sequences[i].startTime <= currentTime) targetSeqIndex = i; + else break; + } + + if (targetSeqIndex !== this.state.lastActiveSeqIndex && this.state.sequences.length > 0) { + this.state.lastActiveSeqIndex = targetSeqIndex; + const seqDivs = this.dom.timeline.querySelectorAll('.sequence'); + if (seqDivs[targetSeqIndex]) { + seqDivs[targetSeqIndex].classList.add('active-flash'); + setTimeout(() => seqDivs[targetSeqIndex]?.classList.remove('active-flash'), 600); + } + } + + const targetScrollTop = this.state.sequences[targetSeqIndex]?._yPosition || 0; + const currentScrollTop = this.dom.timelineContent.scrollTop; + const scrollDiff = targetScrollTop - currentScrollTop; + if (Math.abs(scrollDiff) > 5) { + this.dom.timelineContent.scrollTop += scrollDiff * this.VERTICAL_SCROLL_SPEED; + } + } + + updateIndicatorPosition(beats, smoothScroll = false) { + const timelineX = beats * this.state.pixelsPerBeat; + const scrollLeft = this.dom.timelineContent.scrollLeft; + this.dom.playbackIndicator.style.left = `${timelineX - scrollLeft + this.TIMELINE_LEFT_PADDING}px`; + + if (smoothScroll) { + const targetScroll = timelineX - this.dom.timelineContent.clientWidth * this.SCROLL_VIEWPORT_FRACTION; + const scrollDiff = targetScroll - scrollLeft; + if (Math.abs(scrollDiff) > 5) { + this.dom.timelineContent.scrollLeft += scrollDiff * this.SMOOTH_SCROLL_SPEED; + } + } + } + + showWaveformCursor() { + if (!this.state.audioBuffer) return; + this.dom.waveformCursor.style.display = 'block'; + this.dom.waveformTooltip.style.display = 'block'; + } + + hideWaveformCursor() { + this.dom.waveformCursor.style.display = 'none'; + this.dom.waveformTooltip.style.display = 'none'; + } + + updateWaveformCursor(e) { + if (!this.state.audioBuffer) return; + const rect = this.dom.waveformContainer.getBoundingClientRect(); + const mouseX = e.clientX - rect.left; + const scrollLeft = this.dom.timelineContent.scrollLeft; + const timeBeats = (scrollLeft + mouseX) / this.state.pixelsPerBeat; + const timeSeconds = timeBeats * this.state.secondsPerBeat; + + // Position cursor + this.dom.waveformCursor.style.left = `${mouseX}px`; + + // Position and update tooltip + const tooltipText = `${timeSeconds.toFixed(3)}s (${timeBeats.toFixed(2)}b)`; + this.dom.waveformTooltip.textContent = tooltipText; + + // Position tooltip above cursor, offset to the right + const tooltipX = mouseX + 10; + const tooltipY = 5; + this.dom.waveformTooltip.style.left = `${tooltipX}px`; + this.dom.waveformTooltip.style.top = `${tooltipY}px`; + } + + // Helper + timeToBeats(seconds) { + return seconds * this.state.beatsPerSecond; + } 
+} diff --git a/tools/track_visualizer/index.html b/tools/track_visualizer/index.html index 4a613ec..d1e7480 100644 --- a/tools/track_visualizer/index.html +++ b/tools/track_visualizer/index.html @@ -4,18 +4,8 @@ <meta charset="UTF-8"> <meta name="viewport" content="width=device-width, initial-scale=1.0"> <title>Music Track Visualizer</title> + <link rel="stylesheet" href="../common/style.css"> <style> - * { - margin: 0; - padding: 0; - box-sizing: border-box; - } - body { - font-family: 'Segoe UI', Tahoma, Geneva, Verdana, sans-serif; - background: #1e1e1e; - color: #d4d4d4; - overflow: hidden; - } #controls { padding: 15px; background: #2d2d2d; @@ -27,16 +17,8 @@ } button, input[type="file"] { padding: 8px 16px; - background: #0e639c; - color: white; - border: none; - border-radius: 4px; - cursor: pointer; font-size: 14px; } - button:hover { - background: #1177bb; - } input[type="file"] { padding: 6px 12px; } @@ -58,7 +40,6 @@ width: 100%; height: calc(100vh - 70px); overflow: auto; - background: #1e1e1e; } #timeline-canvas { display: block; diff --git a/workspaces/main/timeline.seq b/workspaces/main/timeline.seq index 3e9052b..05d9026 100644 --- a/workspaces/main/timeline.seq +++ b/workspaces/main/timeline.seq @@ -3,100 +3,95 @@ # BPM 90 SEQUENCE 0.00 0 - EFFECT - FlashCubeEffect 0.00 4.88 - EFFECT + FlashEffect 0.00 2.00 color=1.0,0.5,0.5 decay=0.95 - EFFECT + FadeEffect 0.20 2.00 - EFFECT + SolarizeEffect 0.00 4.00 - EFFECT + VignetteEffect 0.00 5.00 radius=0.6 softness=0.1 + EFFECT - FlashCubeEffect 0.00 4.00 +# EFFECT + FlashEffect 0.00 2.00 color=1.0,0.5,0.5 decay=0.95 +# EFFECT + FadeEffect 2.00 4.00 +# EFFECT + SolarizeEffect 0.00 4.00 + EFFECT + VignetteEffect 0.00 4.00 radius=0.6 softness=0.1 -SEQUENCE 5.00 0 "rotating cube" - EFFECT + CircleMaskEffect 0.00 8.00 0.50 - EFFECT + RotatingCubeEffect 0.00 8.00 - EFFECT + GaussianBlurEffect 2.00 4.00 strength=1.0 - EFFECT + GaussianBlurEffect 6.00 8.00 strength=2.0 +SEQUENCE 4.00 0 "rotating cube" + EFFECT + CircleMaskEffect 0.00 4.00 0.50 + EFFECT + RotatingCubeEffect 0.00 4.00 + EFFECT + GaussianBlurEffect 1.00 4.00 strength=1.0 -SEQUENCE 12.00 0 - EFFECT - FlashCubeEffect 0.22 2.90 +SEQUENCE 8.00 0 "Flash Cube" + EFFECT - FlashCubeEffect 0.00 4.02 EFFECT + FlashEffect 0.00 0.40 -SEQUENCE 14.00 1 "spray" - EFFECT + ParticleSprayEffect 0.00 4.00 - EFFECT + ParticlesEffect 0.00 6.00 +SEQUENCE 12.00 1 "spray" + EFFECT + ParticleSprayEffect 0.00 2.00 + EFFECT + ParticlesEffect 2.00 4.00 EFFECT = GaussianBlurEffect 0.00 4.00 strength=3.0 -SEQUENCE 17.00 2 "Hybrid3D" +SEQUENCE 16.00 2 "Hybrid3D + CNN" EFFECT + ThemeModulationEffect 0.00 4.00 - EFFECT + HeptagonEffect 0.40 4.00 - EFFECT + ParticleSprayEffect 0.00 4.00 - EFFECT = ParticlesEffect 0.00 4.00 + EFFECT + HeptagonEffect 0.00 4.00 + EFFECT + ParticleSprayEffect 0.00 2.00 + EFFECT = ParticlesEffect 2.00 4.00 EFFECT + Hybrid3DEffect 0.00 4.00 - EFFECT + GaussianBlurEffect 0.00 4.00 EFFECT + CNNEffect 0.00 4.00 layers=3 blend=.9 -SEQUENCE 21.00 0 "CNN effect" - EFFECT + HeptagonEffect 0.00 22.00 - EFFECT + Scene1Effect 0.00 24.00 - EFFECT + CNNEffect 2.00 24.00 layers=3 blend=.5 +SEQUENCE 20.00 0 "CNN effect" + EFFECT + HeptagonEffect 0.00 8.00 + EFFECT + Scene1Effect 0.00 8.00 + EFFECT + CNNEffect 6.00 8.00 layers=3 blend=.5 -SEQUENCE 44.00 0 "buggy" - EFFECT + HeptagonEffect 0.00 0.40 - EFFECT + FadeEffect 0.22 2.02 +SEQUENCE 28.00 0 "buggy" + EFFECT + HeptagonEffect 0.00 2.00 + EFFECT + FadeEffect 0.00 2.00 -SEQUENCE 44.00 3 "Seq-8" - EFFECT + ThemeModulationEffect 0.00 8.00 - 
EFFECT = HeptagonEffect 0.00 8.00 +SEQUENCE 30.00 3 "Seq-8" + EFFECT + ThemeModulationEffect 0.00 10.00 + EFFECT = HeptagonEffect 0.00 10.00 EFFECT + GaussianBlurEffect 0.00 10.00 strength=1.5 EFFECT + ChromaAberrationEffect 0.00 10.00 offset=0.03 angle=0.785 EFFECT + SolarizeEffect 0.00 10.00 -SEQUENCE 46.00 2 - EFFECT - FlashCubeEffect 0.40 3.00 +SEQUENCE 40.00 2 + EFFECT - FlashCubeEffect 0.00 4.00 EFFECT + HeptagonEffect 0.00 4.00 EFFECT + ParticleSprayEffect 0.00 4.00 - EFFECT + ParticlesEffect 0.00 4.00 -SEQUENCE 46.00 2 "Fade" - EFFECT - FlashCubeEffect 0.40 3.00 - EFFECT + FlashEffect 0.00 2.00 +SEQUENCE 44.00 2 "Fade" + EFFECT - FlashCubeEffect 0.00 2.00 + EFFECT + FlashEffect 1.00 2.00 -SEQUENCE 48.00 10 - EFFECT - FlashCubeEffect 0.40 3.00 - EFFECT + GaussianBlurEffect 0.00 4.00 - EFFECT + FlashEffect 0.00 0.40 - EFFECT = FlashEffect 1.00 0.40 +SEQUENCE 46.00 10 + EFFECT - FlashCubeEffect 0.00 3.00 + EFFECT + GaussianBlurEffect 0.00 3.00 + EFFECT + FlashEffect 0.00 3.00 -SEQUENCE 51.00 1 - EFFECT + ThemeModulationEffect 0.00 16.00 - EFFECT + HeptagonEffect 0.40 4.00 - EFFECT + ParticleSprayEffect 0.00 16.00 - EFFECT + Hybrid3DEffect 0.00 16.12 - EFFECT + GaussianBlurEffect 0.00 16.00 - EFFECT + ChromaAberrationEffect 0.00 16.28 - EFFECT + SolarizeEffect 0.00 15.76 +SEQUENCE 49.00 1 + EFFECT + ThemeModulationEffect 0.00 8.00 + EFFECT + HeptagonEffect 0.00 8.00 + EFFECT + ParticleSprayEffect 0.00 8.00 + EFFECT + Hybrid3DEffect 0.00 8.00 + EFFECT + GaussianBlurEffect 0.00 8.00 + EFFECT + ChromaAberrationEffect 0.00 8.00 -SEQUENCE 66.00 0 - EFFECT + ThemeModulationEffect 0.00 6.00 - EFFECT + VignetteEffect 0.00 6.00 radius=0.6 softness=0.3 - EFFECT + SolarizeEffect 0.00 6.00 +SEQUENCE 57.00 0 + EFFECT + ThemeModulationEffect 0.00 7.00 + EFFECT + VignetteEffect 0.00 7.00 radius=0.6 softness=0.3 + EFFECT + SolarizeEffect 0.00 7.00 -SEQUENCE 71.00 0 - EFFECT + ThemeModulationEffect 0.00 8.00 - EFFECT + HeptagonEffect 0.40 4.00 - EFFECT + GaussianBlurEffect 0.00 16.00 +SEQUENCE 64.00 0 + EFFECT + ThemeModulationEffect 0.00 4.00 + EFFECT + HeptagonEffect 0.00 4.00 + EFFECT + GaussianBlurEffect 0.00 4.00 EFFECT + SolarizeEffect 0.00 4.00 -SEQUENCE 85.00 0 "double hepta!" - EFFECT + ThemeModulationEffect 0.00 12.00 - EFFECT = HeptagonEffect 0.40 4.00 - EFFECT + Hybrid3DEffect 0.00 8.00 - EFFECT + ParticleSprayEffect 0.00 11.00 - EFFECT + HeptagonEffect 0.00 16.00 - EFFECT + ChromaAberrationEffect 0.00 15.00 - EFFECT + GaussianBlurEffect 0.00 16.00 +SEQUENCE 68.00 0 "double hepta!" + EFFECT + ThemeModulationEffect 0.00 4.00 + EFFECT = HeptagonEffect 0.00 4.00 + EFFECT + Hybrid3DEffect 0.00 4.00 + EFFECT + ParticleSprayEffect 0.00 4.00 + EFFECT + HeptagonEffect 0.00 4.00 + EFFECT + ChromaAberrationEffect 0.00 4.00 + EFFECT + GaussianBlurEffect 0.00 4.00 -SEQUENCE 100.00 0 - EFFECT + ThemeModulationEffect 0.00 8.00 - EFFECT + HeptagonEffect 0.00 19.00 - EFFECT + ChromaAberrationEffect 0.00 18.00 - EFFECT + GaussianBlurEffect 0.00 16.00 +SEQUENCE 72.00 0 "The End" + EFFECT + ThemeModulationEffect 0.00 7.00 + EFFECT + HeptagonEffect 0.00 7.00 + EFFECT + ChromaAberrationEffect 0.00 7.00 + EFFECT + GaussianBlurEffect 0.00 7.00 |
