| author | skal <pascal.massimino@gmail.com> | 2026-02-13 16:12:24 +0100 |
|---|---|---|
| committer | skal <pascal.massimino@gmail.com> | 2026-02-13 16:12:24 +0100 |
| commit | b04816a400703ac6c364efb70ae84930d79ccb12 | |
| tree | 257acfe047ee79c6037db0dd983b91396139d5a4 /training/target_1/img_002.png | |
| parent | b5e8abad0490e47b52d300d2d0c48425c3fac4f3 | |
CNN v2: Fix activation function mismatch between training and inference
Layer 0 now uses clamp [0,1] in both training and inference (the generated shaders previously used ReLU).
- index.html: Add is_layer_0 flag to LayerParams, handle Layer 0 separately
- export_cnn_v2_shader.py: Generate correct activation for Layer 0
Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
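
For illustration, a minimal NumPy sketch of the two Layer 0 activations and where they diverge (the helper names and sample values are hypothetical; in the repository the fix is applied in index.html and in the shader code emitted by export_cnn_v2_shader.py):

```python
import numpy as np

def relu(x):
    # What the generated shader previously applied to Layer 0.
    return np.maximum(x, 0.0)

def clamp01(x):
    # What training uses for Layer 0, and what the shader now emits.
    return np.clip(x, 0.0, 1.0)

# Hypothetical pre-activation values straddling both clip points.
x = np.array([-0.5, 0.3, 1.7])
print(relu(x))     # [0.  0.3 1.7] -> values above 1 pass through
print(clamp01(x))  # [0.  0.3 1. ] -> values above 1 saturate
```

The two functions agree on [0, 1] but diverge above 1, which is where a training/inference mismatch of this kind shows up.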
Diffstat (limited to 'training/target_1/img_002.png')
0 files changed, 0 insertions, 0 deletions
