Compare commits

...

128 Commits

Author SHA1 Message Date
fa69ac649c fix: prevent playhead marker from continuing after pause
Fixed race condition where animation frames could continue after pause() by
adding an isPlayingRef that's checked at the start of updatePlaybackPosition.
This ensures any queued animation frames will exit early if pause was called,
preventing the playhead from continuing to move when audio has stopped.

This fixes the issue where the playhead marker would occasionally keep moving
after hitting spacebar to pause, even though no audio was playing.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-20 12:53:13 +01:00
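
A minimal sketch of the guard described in the commit above, assuming hypothetical hook internals (`isPlayingRef`, `updatePlaybackPosition`) rather than the project's actual `useMultiTrackPlayer` code:

```typescript
import { useCallback, useRef } from "react";

// Sketch of the pause guard: any animation frame queued before pause()
// re-checks isPlayingRef and exits early instead of advancing the playhead.
export function usePlayheadLoop(onPosition: (timeSec: number) => void) {
  const isPlayingRef = useRef(false);
  const rafIdRef = useRef<number | null>(null);
  const startedAtRef = useRef(0);

  const updatePlaybackPosition = useCallback(() => {
    // Guard: a frame scheduled just before pause() must not move the playhead.
    if (!isPlayingRef.current) return;
    onPosition((performance.now() - startedAtRef.current) / 1000);
    rafIdRef.current = requestAnimationFrame(updatePlaybackPosition);
  }, [onPosition]);

  const play = useCallback(() => {
    isPlayingRef.current = true;
    startedAtRef.current = performance.now();
    rafIdRef.current = requestAnimationFrame(updatePlaybackPosition);
  }, [updatePlaybackPosition]);

  const pause = useCallback(() => {
    isPlayingRef.current = false;
    if (rafIdRef.current !== null) cancelAnimationFrame(rafIdRef.current);
  }, []);

  return { play, pause };
}
```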
efd4cfa607 fix: adjust timeline right padding to match scrollbar width
Changed timeline right padding from 240px to 250px to account for the
vertical scrollbar width in the waveforms area, ensuring perfect alignment
between timeline and waveform playhead markers.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-20 12:49:40 +01:00
0d8952ca2f fix: ensure consistent timeline-waveform alignment with always-visible scrollbar
Changed waveforms container from overflow-y-auto to overflow-y-scroll to ensure
the vertical scrollbar is always visible. This maintains consistent width alignment
between the timeline and waveform areas, preventing playhead marker misalignment
that occurred when scrollbar appeared/disappeared based on track count.

Fixes the issue where the timeline was slightly wider than the waveforms when a
vertical scrollbar was present, causing a ~15px offset in playhead position.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-20 12:45:13 +01:00
1b30931615 style: add custom-scrollbar styling to all dialog scrollable areas
Added custom-scrollbar class to scrollable elements in:
- EffectBrowser dialog content area
- ProjectsDialog projects list
- CommandPalette results list
- Modal content area
- TrackList automation parameter label containers

This ensures consistent scrollbar styling across all dialogs and UI elements.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-20 12:37:16 +01:00
d9bd8246c9 fix: prevent infinite loop in waveform rendering
- Add safety check for duration === 0 (first track scenario)
- Ensure trackWidth doesn't exceed canvas width
- Use Math.floor for integer loop iterations
- Prevent division by zero in samplesPerPixel calculation

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-20 11:24:31 +01:00
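
A rough sketch of the safeguards listed above, with hypothetical names (`drawWaveform`, `samplesPerPixel`) standing in for the real Waveform component:

```typescript
// Sketch of the guarded drawing loop: bail out on zero duration, clamp the
// track width to the canvas, and keep loop bounds and step sizes integral.
function drawWaveform(
  ctx: CanvasRenderingContext2D,
  channelData: Float32Array,
  trackDuration: number,
  projectDuration: number,
  canvasWidth: number,
  canvasHeight: number
) {
  if (projectDuration === 0 || trackDuration === 0) return; // first-track / empty case

  const ratio = trackDuration / projectDuration;
  const trackWidth = Math.min(Math.floor(canvasWidth * ratio), canvasWidth);
  if (trackWidth <= 0) return;

  // Math.max(1, ...) prevents division-by-zero / zero-step infinite loops.
  const samplesPerPixel = Math.max(1, Math.floor(channelData.length / trackWidth));
  const mid = canvasHeight / 2;

  ctx.beginPath();
  for (let x = 0; x < trackWidth; x++) {
    const start = x * samplesPerPixel;
    if (start >= channelData.length) break;
    const end = Math.min(start + samplesPerPixel, channelData.length);
    let min = 1;
    let max = -1;
    for (let i = start; i < end; i++) {
      const v = channelData[i];
      if (v < min) min = v;
      if (v > max) max = v;
    }
    ctx.moveTo(x + 0.5, mid + min * mid);
    ctx.lineTo(x + 0.5, mid + max * mid);
  }
  ctx.stroke();
}
```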
dd8d46795a fix: waveform rendering to respect track duration vs project duration
- Calculate trackWidth based on track duration ratio to project duration
- Shorter tracks now render proportionally instead of stretching
- Longer tracks automatically update project duration
- All existing tracks re-render correctly when duration changes

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-20 11:17:28 +01:00
adcc97eb5a fix: timeline tooltip positioning to account for left padding
- Add controlsWidth offset to tooltip left position
- Tooltip now appears correctly at mouse cursor position

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-20 10:36:40 +01:00
1855988b83 fix: timeline zoom and waveform rendering improvements
- Fix timeline width calculation to always fill viewport at minimum
- Fix waveform sampling to match timeline width calculation exactly
- Fix infinite scroll loop by removing circular callback
- Ensure scrollbars appear correctly when zooming in
- Use consistent PIXELS_PER_SECOND_BASE = 5 across all components

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-20 10:34:31 +01:00
477a444c78 feat: implement TimeScale component with proper zoom calculation
- Add TimeScale component with canvas-based rendering
- Use 5 pixels per second base scale (duration * zoom * 5)
- Implement viewport-based rendering for performance
- Add scroll synchronization with waveforms
- Add 240px padding for alignment with track controls and master area
- Apply custom scrollbar styling
- Update all waveform width calculations to match timeline

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-20 10:12:13 +01:00
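
A small sketch of the width math from the two commits above, assuming the `PIXELS_PER_SECOND_BASE = 5` constant they mention; the function names are illustrative:

```typescript
// Assumed constant from the commit messages; the real component files are not shown.
const PIXELS_PER_SECOND_BASE = 5;

// Timeline/waveform content width: duration * zoom * 5, but never narrower
// than the visible viewport so the ruler always fills the screen at zoom 1.
function getTimelineWidth(durationSec: number, zoom: number, viewportWidth: number): number {
  return Math.max(Math.ceil(durationSec * zoom * PIXELS_PER_SECOND_BASE), viewportWidth);
}

// Mapping helpers shared by the ruler and the waveforms so markers stay aligned.
function timeToX(timeSec: number, zoom: number): number {
  return timeSec * zoom * PIXELS_PER_SECOND_BASE;
}

function xToTime(x: number, zoom: number): number {
  return x / (zoom * PIXELS_PER_SECOND_BASE);
}
```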
119c8c2942 feat: implement medium effort features - markers, web workers, and bezier automation
Implemented three medium-effort features to enhance the audio editor:

**1. Region Markers System**
- Add marker type definitions supporting point markers and regions
- Create useMarkers hook for marker state management
- Build MarkerTimeline component for visual marker display
- Create MarkerDialog component for adding/editing markers
- Add keyboard shortcuts: M (add marker), Shift+M (next), Shift+Ctrl+M (previous)
- Support marker navigation, editing, and deletion

**2. Web Worker for Computations**
- Create audio worker for offloading heavy computations
- Implement worker functions: generatePeaks, generateMinMaxPeaks, normalizePeaks, analyzeAudio, findPeak
- Build useAudioWorker hook for easy worker integration
- Integrate worker into Waveform component with peak caching
- Significantly improve UI responsiveness during waveform generation

**3. Bezier Curve Automation**
- Enhance interpolateAutomationValue to support Bezier curves
- Implement cubic Bezier interpolation with control handles
- Add createSmoothHandles for auto-smooth curve generation
- Add generateBezierCurvePoints for smooth curve rendering
- Support bezier alongside existing linear and step curves

All features are type-safe and integrate seamlessly with the existing codebase.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-20 08:25:33 +01:00
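
For the Bezier automation part, here is a hedged sketch of cubic interpolation between two automation points; the point and handle shapes are assumptions, not the project's actual types:

```typescript
interface AutomationPoint {
  time: number;
  value: number;
  curve: "linear" | "step" | "bezier";
  // Assumed representation: control handles stored as absolute (time, value) pairs.
  handleOut?: { time: number; value: number };
  handleIn?: { time: number; value: number };
}

// Cubic Bezier in one dimension, evaluated at parameter t in [0, 1].
// Note: t is the curve parameter, not time itself; a uniform-t approximation
// of "progress between the two points" is usually enough for editor curves.
function cubicBezier(p0: number, c0: number, c1: number, p1: number, t: number): number {
  const u = 1 - t;
  return u * u * u * p0 + 3 * u * u * t * c0 + 3 * u * t * t * c1 + t * t * t * p1;
}

function interpolateAutomationValue(a: AutomationPoint, b: AutomationPoint, time: number): number {
  const t = (time - a.time) / (b.time - a.time);
  switch (a.curve) {
    case "step":
      return a.value;
    case "bezier": {
      const c0 = a.handleOut?.value ?? a.value;
      const c1 = b.handleIn?.value ?? b.value;
      return cubicBezier(a.value, c0, c1, b.value, t);
    }
    default:
      return a.value + (b.value - a.value) * t; // linear
  }
}
```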
8720c35f23 fix: add missing 'S' keyboard shortcut for split at cursor
Added keyboard handler for the 'S' key to trigger split at cursor.
The shortcut was defined in the command palette but missing from
the keyboard event handler.

Note: Ctrl+A for Select All was already working correctly.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-20 07:56:02 +01:00
25ddac349b feat: add copy/paste automation data functionality
Implemented automation clipboard system:
- Added separate automationClipboard state for automation points
- Created handleCopyAutomation function to copy automation lane points
- Created handlePasteAutomation function to paste at current time with time offset
- Added Copy and Clipboard icon buttons to AutomationHeader component
- Automation points preserve curve type and value when copied/pasted
- Points are sorted by time after pasting
- Toast notifications for user feedback
- Ready for integration when automation lanes are actively used

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-20 07:51:24 +01:00
08b33aacb5 feat: add split at cursor functionality
Implemented track splitting at playback cursor:
- Added handleSplitAtCursor function to split selected track at current time
- Extracts two audio segments: before and after cursor position
- Creates two new tracks from the segments with numbered names
- Removes original track after split
- Added 'Split at Cursor' command to command palette (keyboard shortcut: S)
- Validation for edge cases (no track selected, no audio, invalid position)
- User feedback via toast notifications

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-20 07:38:07 +01:00
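
A sketch of the segment extraction behind split-at-cursor, assuming a helper along the lines of the `extractBufferSegment` mentioned elsewhere in this log; names are illustrative:

```typescript
// Copies the samples in [startSec, endSec) into a new AudioBuffer.
function extractSegment(
  ctx: BaseAudioContext,
  buffer: AudioBuffer,
  startSec: number,
  endSec: number
): AudioBuffer {
  const startSample = Math.floor(startSec * buffer.sampleRate);
  const endSample = Math.min(Math.floor(endSec * buffer.sampleRate), buffer.length);
  const length = Math.max(1, endSample - startSample);
  const out = ctx.createBuffer(buffer.numberOfChannels, length, buffer.sampleRate);
  for (let ch = 0; ch < buffer.numberOfChannels; ch++) {
    out.copyToChannel(buffer.getChannelData(ch).subarray(startSample, endSample), ch);
  }
  return out;
}

// Split: everything before the cursor and everything after it become two buffers,
// which the handler can then turn into two numbered tracks.
function splitAtCursor(
  ctx: BaseAudioContext,
  buffer: AudioBuffer,
  cursorSec: number
): [AudioBuffer, AudioBuffer] {
  return [
    extractSegment(ctx, buffer, 0, cursorSec),
    extractSegment(ctx, buffer, cursorSec, buffer.duration),
  ];
}
```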
9007522e18 feat: add playback speed control (0.25x - 2x)
Implemented variable playback speed functionality:
- Added playbackRate state and ref to useMultiTrackPlayer (0.25x - 2x range)
- Applied playback rate to AudioBufferSourceNode.playbackRate
- Updated timing calculations to account for playback rate
- Real-time playback speed adjustment for active playback
- Dropdown UI control in PlaybackControls with preset speeds
- Integrated changePlaybackRate function through AudioEditor

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-20 07:35:39 +01:00
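
A sketch of how a playback rate can be applied to a source node and folded into the playhead math; the clamping range follows the commit, the function names are assumptions:

```typescript
// Start a source at the given offset with a clamped playback rate.
function startSourceAtRate(
  ctx: AudioContext,
  buffer: AudioBuffer,
  destination: AudioNode,
  offsetSec: number,
  rate: number // clamped to the editor's 0.25x – 2x range
): AudioBufferSourceNode {
  const source = ctx.createBufferSource();
  source.buffer = buffer;
  source.playbackRate.value = Math.min(2, Math.max(0.25, rate));
  source.connect(destination);
  source.start(0, offsetSec);
  return source;
}

// Playhead position: offset + (elapsed audio-context time) * rate, so the
// marker tracks the audible position rather than wall-clock time.
function currentPlayheadTime(
  ctx: AudioContext,
  startedAtCtxTime: number,
  offsetSec: number,
  rate: number
): number {
  return offsetSec + (ctx.currentTime - startedAtCtxTime) * rate;
}
```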
a47bf09a32 feat: add loop playback functionality
Added complete loop functionality with UI controls:
- Loop state management in useMultiTrackPlayer (loopEnabled, loopStart, loopEnd)
- Automatic restart from loop start when reaching loop end during playback
- Loop toggle button in PlaybackControls with Repeat icon
- Loop points UI showing when loop is enabled (similar to punch in/out)
- Manual loop point adjustment with number inputs
- Quick set buttons to set loop points to current time
- Wired loop functionality through AudioEditor component

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-20 07:31:53 +01:00
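
A sketch of the loop-wrap check that a playback-position update might perform, with an assumed `restart` callback standing in for the hook's own stop/start logic:

```typescript
interface LoopState {
  loopEnabled: boolean;
  loopStart: number; // seconds
  loopEnd: number;   // seconds
}

// When looping is on and the playhead crosses loopEnd, seek back to loopStart.
// Returns the (possibly wrapped) playhead time to report to the UI.
function maybeWrapLoop(
  currentTime: number,
  loop: LoopState,
  restart: (offsetSec: number) => void
): number {
  if (loop.loopEnabled && loop.loopEnd > loop.loopStart && currentTime >= loop.loopEnd) {
    restart(loop.loopStart);
    return loop.loopStart;
  }
  return currentTime;
}
```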
aba26126cc refactor: remove configurable track height setting
Removed the defaultTrackHeight setting from UI preferences as it doesn't need
to be user-configurable. All tracks now use DEFAULT_TRACK_HEIGHT constant (400px).

Changes:
- Removed defaultTrackHeight from UISettings interface
- Removed track height slider from GlobalSettingsDialog
- Updated AudioEditor to use DEFAULT_TRACK_HEIGHT constant directly
- Simplified dependency arrays in addTrack callbacks

This simplifies the settings interface while maintaining the same visual behavior.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 20:55:45 +01:00
66a515ba79 fix: remove AudioWorklet browser warning
Removed the AudioWorklet compatibility check warning since it's an optional
feature that doesn't affect core functionality. The app works fine without
AudioWorklet support, using standard Web Audio API nodes instead.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 20:52:28 +01:00
691f75209d docs: mark Phase 15.2 Responsive Design as complete
Updated PLAN.md to reflect completion of mobile responsiveness enhancements:
- Touch gesture support via collapse/expand chevron buttons
- Collapsible track and master controls with detailed state descriptions
- Track collapse buttons on mobile (dual chevron system)
- Mobile vertical stacking with automation and effects bars
- Height synchronization between track controls and waveform containers

Phase 15.2 now fully complete with comprehensive mobile support.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 20:51:15 +01:00
908e6caaf8 feat: enhance mobile responsiveness with collapsible controls and automation/effects bars
Added comprehensive mobile support for Phase 15 (Polish & Optimization):

**Mobile Layout Enhancements:**
- Track controls now collapsible on mobile with two states:
  - Collapsed: minimal controls with expand chevron, R/M/S buttons, horizontal level meter
  - Expanded: full height fader, pan control, all buttons
- Track collapse buttons added to mobile view (left chevron for track collapse, right chevron for control collapse)
- Master controls collapse button hidden on desktop (lg:hidden)
- Automation and effects bars now available on mobile layout
- Both bars collapsible with eye/eye-off icons, horizontally scrollable when zoomed
- Mobile vertical stacking: controls → waveform → automation → effects per track

**Bug Fixes:**
- Fixed track controls and waveform container height matching on desktop
- Fixed Modal component prop: isOpen → open in all dialog components
- Fixed TypeScript null check for audioBuffer.duration
- Fixed keyboard shortcut category: 'help' → 'view'

**Technical Improvements:**
- Consistent height calculation using trackHeight variable
- Proper responsive breakpoints with Tailwind (sm:640px, lg:1024px)
- Progressive disclosure pattern for mobile controls

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 20:50:44 +01:00
e09bc1449c docs: mark Phase 14 (Settings & Preferences) as complete
Updated PLAN.md to reflect completion of Phase 14:
- Global settings system with localStorage persistence
- 5 settings tabs: Recording, Audio, Editor, Interface, Performance
- Real-time application to editor behavior
- All major settings implemented and functional

Applied settings:
- defaultTrackHeight → new track creation
- defaultZoom → initial zoom state
- enableSpectrogram → analyzer visibility
- sampleRate → recording configuration

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 18:32:19 +01:00
d03080d3d2 feat: apply enableSpectrogram and sampleRate settings to editor
Applied performance and audio settings to actual editor behavior:
- enableSpectrogram: conditionally show/hide spectrogram analyzer option
- sampleRate: sync recording sample rate with global audio settings

Changes:
- Switch analyzer view away from spectrogram when setting is disabled
- Adjust analyzer toggle grid columns based on spectrogram visibility
- Sync recording hook's sampleRate with global settings via useEffect

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 18:29:05 +01:00
484e3261c5 feat: apply defaultTrackHeight and defaultZoom settings
- Modified createTrack and createTrackFromBuffer to accept height parameter
- Updated useMultiTrack to pass height when creating tracks
- Applied settings.ui.defaultTrackHeight when adding new tracks
- Applied settings.editor.defaultZoom for initial zoom state
- Removed duplicate useSettings hook declaration in AudioEditor
- New tracks now use configured default height from settings

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 18:23:41 +01:00
a2cef6cc6e refactor: remove waveform color setting to preserve dynamic coloring
- Removed waveformColor from UISettings interface
- Removed waveform color picker from Interface settings tab
- Preserves dynamic per-track waveform coloring system
- Cleaner settings UI with one less option

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 18:19:38 +01:00
b1c0ff6f72 fix: constrain fader handles to track lane boundaries
- Fader handles now respect top-8 bottom-8 track padding
- Handle moves only within the visible track lane (60% range)
- Updated both TrackFader and MasterFader components
- Value calculation clamped to track bounds (32px padding top/bottom)
- Handle position mapped to 20%-80% range instead of 0%-100%
- Prevents handle from going beyond visible track area

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 18:15:27 +01:00
197bff39fc fix: show Plus button only when effects panel is expanded
- Plus button now conditionally rendered based on track.effectsExpanded
- Matches automation controls behavior
- Cleaner UI when panel is collapsed

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 18:11:06 +01:00
314fced79f feat: Plus button now opens EffectBrowser dialog
- Added EffectBrowser import and state management
- Plus button opens dialog to select from all available effects
- Effect is added to track when selected from browser
- Removed hardcoded low-pass filter addition
- Better UX for adding effects to tracks

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 18:09:46 +01:00
0bd892e3d1 fix: streamline effects header controls and add Plus button
- Removed absolute positioning from eye icon button
- Added Plus button to add effects (currently adds low-pass filter)
- All controls now properly inline in normal flex flow
- Eye button correctly positioned at end without overlap

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 18:07:06 +01:00
42991092ad fix: automation header controls now properly inline without overlapping
- Removed absolute positioning from eye icon button
- Made all controls part of normal flex flow
- Eye button now correctly positioned at end without overlap
- Chevron controls no longer overlay the eye icon

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 18:03:41 +01:00
adb99a2c33 feat: implement Phase 14 settings & preferences with localStorage persistence
Added comprehensive settings system with 5 categories:
- Recording Settings (existing, integrated)
- Audio Settings (buffer size, sample rate, auto-normalize)
- Editor Settings (auto-save interval, undo limit, snap-to-grid, grid resolution, default zoom)
- Interface Settings (theme, waveform color, font size, default track height)
- Performance Settings (peak/waveform quality, spectrogram toggle, max file size)

Features:
- useSettings hook with localStorage persistence
- Automatic save/load of all settings
- Category-specific reset buttons
- Expanded GlobalSettingsDialog with 5 tabs
- Full integration with AudioEditor
- Settings merge with defaults on load (handles updates gracefully)

Settings are persisted to localStorage and automatically restored on page load.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 16:39:05 +01:00
5d9e02fe95 feat: streamline track and master controls layout consistency
- Streamlined track controls and master controls to the same width (240px)
- Fixed track controls container to use full width of parent column
- Matched TrackControls card structure with MasterControls (gap-3, no w-full/h-full)
- Updated outer container padding from p-2 to p-4 with gap-4
- Adjusted track controls wrapper to center content instead of stretching
- Added max-width constraint to PlaybackControls to prevent width changes
- Centered transport control buttons in footer

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 16:32:49 +01:00
854e64b4ec fix: add missing cn import to TrackList
- Import cn utility function from @/lib/utils/cn
- Fixes ReferenceError: cn is not defined
- Required for conditional classNames on effect labels

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 13:21:39 +01:00
e7bd262e6f feat: show bypassed effects with gray labels
- Effect labels show in primary color when enabled
- Effect labels show in gray when bypassed/disabled
- Added opacity reduction (60%) for bypassed effects
- Visual feedback matches effect state at a glance

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 13:20:32 +01:00
ba2e138ab9 feat: add effect name labels to effects header
- Display effect names as small chips/badges in effects header
- Shows all effect names at a glance without expanding
- Styled with primary color background and border
- Horizontal scrollable if many effects
- Visible whether effects rack is expanded or collapsed

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 13:19:05 +01:00
7b4a7cc567 fix: remove effects count from effects header
- Remove '(N)' count display from effects bar header
- Effect names are already shown in EffectDevice component:
  - Collapsed: vertical text label
  - Expanded: header with full name
- Cleaner header appearance

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 13:17:47 +01:00
64864cfd34 feat: use eye icon for effects bar folding like automation bar
- Replace ChevronDown/ChevronRight with Eye/EyeOff icons
- Position eye icon absolutely on the right side
- Match AutomationHeader styling with bg-muted/50 and border
- Eye icon shows when expanded, EyeOff when collapsed
- Consistent UX between automation and effects bars

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 13:16:02 +01:00
14a9c6e163 feat: restore automation controls with AutomationHeader component
- Import and use AutomationHeader component in automation bar
- Add parameter selection dropdown (Volume, Pan, Effect parameters)
- Add automation mode controls (Read, Write, Touch, Latch)
- Add lane height controls (increase/decrease buttons)
- Add show/hide toggle button
- Build available parameters dynamically from track effects
- All controls properly wired to update track automation state

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 13:13:12 +01:00
0babc469cc fix: enforce minimum 360px height for all tracks using Math.max
- Track component now uses Math.max to ensure track.height is at least MIN_TRACK_HEIGHT
- TrackList component also uses Math.max for consistent enforcement
- Fixes issue where tracks with old height values (340px, 357px) were smaller
- All tracks now guaranteed to be exactly 360px regardless of stored height

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 13:09:44 +01:00
6658bbbbd4 fix: set minimum and default track height to 360px
- Update DEFAULT_TRACK_HEIGHT from 340px to 360px
- Update MIN_TRACK_HEIGHT from 240px to 360px
- Ensures all tracks have consistent 360px minimum height
- Applies to both control column and waveform column

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 13:07:10 +01:00
9b07f28995 fix: use DEFAULT_TRACK_HEIGHT constant for consistent track heights
- Import DEFAULT_TRACK_HEIGHT and COLLAPSED_TRACK_HEIGHT from types
- Use DEFAULT_TRACK_HEIGHT (340px) instead of hardcoded 240px fallback
- Ensures all tracks have the same height as the first track
- Matches the height used when creating tracks

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 13:02:38 +01:00
d08482a64c fix: restore horizontal scrollbar by making waveform use container height
- Remove fixed height from Track waveformOnly mode
- Use h-full class to fill flex container instead
- Allows parent scroll container to show horizontal scrollbar based on zoom
- Waveform now properly expands horizontally without clipping

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 12:58:47 +01:00
1a66669a77 fix: remove wrapper div to show automation lane controls properly
- Remove h-32 wrapper div around AutomationLane
- AutomationLane now uses its own height (lane.height, defaults to 80px)
- Automation controls (canvas, points) now visible and interactive

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 12:55:42 +01:00
4c794dd293 fix: implement fixed-height track container with flexible waveform
- Track container now has fixed height (240px default or track.height)
- Waveform uses flex-1 to take remaining space
- Automation and effects bars use flex-shrink-0 for fixed height
- When bars expand, waveform shrinks instead of container growing
- Matches DAW behavior where track height stays constant

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 12:54:45 +01:00
29de647b30 refactor: use stacked layout for waveform, automation, and effects bars
- Replace absolute positioning with flex column layout
- Waveform, automation bar, and effects bar now stacked vertically
- Removes gaps between bars naturally with stacked layout
- Both bars remain collapsible with no position dependencies

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 12:50:43 +01:00
83ae2e7ea7 fix: remove border-t from effects bar to eliminate gap with automation bar
- Removed border-t from effects bar container
- Keeps border-b for bottom edge separation
- No gap between automation and effects bars in both folded and unfolded states

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 12:43:00 +01:00
950c0f69a6 fix: automation bar now properly foldable with default parameter
- Default selectedParameterId to 'volume' when undefined
- Fixes issue where clicking automation header did nothing
- Chevron now correctly shows fold/unfold state

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 12:40:45 +01:00
a2542ac87f fix: remove gap between automation and effects bars
- Remove border-t from automation bar to eliminate gap with effects bar

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 12:35:41 +01:00
7aebc1da24 fix: add missing Sparkles import and correct AutomationLane props
- Add Sparkles icon import to TrackExtensions
- Remove invalid props from AutomationLane (trackId, isPlaying, onSeek)
- Fix createAutomationPoint call to use object parameter with curve property

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 12:32:30 +01:00
42b8f61f5f fix: make automation bar collapsible and restore automation controls
Fixed issues:
1. Made automation bar collapsible with chevron indicator
2. Removed border-t from automation lane container (no gap to effects)
3. Restored lane.visible filter so AutomationLane properly renders with controls

The AutomationLane component has all the rich editing UI:
- Click canvas to add automation points
- Drag points to move them
- Double-click to delete points
- Keyboard delete (Del/Backspace) for selected points
- Visual feedback with selected state

All controls are intact and functional when automation bar is expanded.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 12:18:58 +01:00
35a6ee35d0 fix: remove A/E buttons, make automation always expanded, add bottom border
Fixed all three issues:
1. Removed automation (A) and effects (E) toggle buttons from TrackControls
2. Made automation bar always expanded and non-collapsible
3. Added bottom border to effects bar (border-b)

Changes:
- TrackControls.tsx: Removed entire Row 2 (A/E buttons section)
- TrackList.tsx: Removed click handler and chevron from automation header
- TrackList.tsx: Automation lane always visible (no conditional rendering)
- TrackList.tsx: Added border-b to effects bar container
- TrackList.tsx: Added border-b to automation bar for visual consistency

Bars are now permanent, always-visible UI elements at the bottom of tracks.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 12:16:17 +01:00
235fc3913c fix: position bars at bottom, always visible, no inner containers
Major improvements to automation and effects bars:
- Both bars now positioned at bottom of waveform (not top)
- Bars are always visible when track is expanded (no show/hide buttons)
- Effects bar at absolute bottom
- Automation bar above effects, dynamically positioned based on effects state
- Removed inner container from effects - direct rendering with EffectDevice
- Removed close buttons (X icons) - bars are permanent
- Effects render directly with gap-3 padding, no TrackExtensions wrapper
- Automation controls preserved (AutomationLane unchanged)

This creates a cleaner, always-accessible interface where users can quickly
expand/collapse automation or effects without toggling visibility.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 12:12:26 +01:00
0e59870884 refactor: replace overlay cards with integrated collapsible bars
Changes automation and effects from card-based overlays to integrated bars:
- Automation bar positioned at absolute top with no margins/padding
- Effects bar positioned below automation or at top if automation hidden
- Both bars use bg-card/90 backdrop-blur-sm for subtle transparency
- Collapsible with ChevronDown/ChevronRight indicators
- Close button (X icon) on each bar
- Effects bar dynamically positions based on automation state
- Added effectsExpanded property to Track type
- Removed card container styling for cleaner integration

The bars now sit directly on the waveform as requested, making them feel
more integrated into the track view rather than floating overlays.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 12:07:23 +01:00
8c779ccd88 fix: improve overlay visibility with header controls and lighter backdrop
Changes to automation and effects overlays:
- Reduced backdrop opacity from bg-black/60 to bg-black/30 (less dark)
- Added header to automation overlay with parameter name display
- Added close button to automation overlay (ChevronDown icon)
- Wrapped automation lane in rounded card with border and shadow
- Both overlays now have consistent visual styling
- Added ChevronDown import to TrackList

This makes the overlays less obtrusive and adds proper controls for closing
the automation view, matching the effects overlay pattern.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 11:56:59 +01:00
b57ac5912a feat: implement overlay architecture for automation lanes and effects
Automation lanes and effects now render as overlays on top of the waveform
instead of below the track, solving both visibility and layout proportion issues.

Changes:
- Wrapped Track waveforms in relative container for overlay positioning
- Automation lanes render with bg-black/60 backdrop-blur overlay
- Effects render with TrackExtensions in overlay mode (asOverlay prop)
- Added overlay-specific rendering with close button and better empty state
- Both overlays use absolute positioning with z-10 for proper stacking
- Eliminated height mismatch between controls and waveform areas

This approach provides better visual integration and eliminates the need
to match heights between the two columns of the layout.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 11:53:38 +01:00
d2ed7d6e78 fix: effects expansion, automation lanes, and layout alignment
Fixed multiple issues with the track layout system:

1. Effect cards now expandable/collapsible
   - Added onToggleExpanded callback to EffectDevice
   - Effect expansion state is properly toggled and persisted

2. Removed left column spacers causing misalignment
   - Removed automation lane spacer (h-32)
   - Removed effects section spacer (h-64/h-8)
   - Automation lanes and effects now only in waveform column
   - This eliminates the height mismatch between columns

3. Layout now cleaner
   - Left column stays fixed with only track controls
   - Right column contains waveforms, automation, and effects
   - No artificial spacers needed for alignment

The automation lanes and effects sections now appear properly in the
waveform area without creating alignment issues in the controls column.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 11:45:38 +01:00
cd310ce7e4 fix: ImportDialog not respecting open prop causing unwanted display
Fixed critical bug where ImportDialog was always rendered regardless of
the open prop value. The component interface didn't match the actual
implementation, causing it to appear on page load.

Changes:
- Added `open` prop to ImportDialogProps interface
- Added early return when `open` is false to prevent rendering
- Renamed `onCancel` to `onClose` to match Track component usage
- Made fileName, sampleRate, and channels optional props
- Dialog now only appears when explicitly opened by user action

This fixes the issue where the dialog appeared immediately on page load
when loading a saved project.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 11:39:34 +01:00
594ff7f4c9 fix: improve effects panel styling with padding and gap
Added padding around effects device rack and gap between effect cards
for better visual integration and spacing.

Changes:
- Added p-3 padding to effects rack container
- Added gap-3 between effect device cards
- Improves visual consistency and readability

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 11:37:45 +01:00
ca63d12cbf fix: prevent multiple ImportDialog instances from appearing
Fixed issue where ImportDialog was being rendered for each track in
waveform mode, causing multiple unclosable dialogs to appear on page load.

Changes:
- Moved ImportDialog to renderControlsOnly mode only
- Each track now has exactly one ImportDialog (rendered in controls column)
- Removed duplicate ImportDialog from renderWaveformOnly mode
- Removed ImportDialog from fallback return statement

This ensures only one dialog appears per track, making it properly closable.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 11:30:03 +01:00
7a7d6891cd fix: restore automation lanes and effects sections in two-column layout
Restored the automation lane and effects sections that were removed during
the Track component refactoring. These sections now render as full-width
rows below each track, spanning across both the controls and waveforms columns.

Changes:
- Created TrackExtensions component for effects section rendering
- Added automation lane rendering in TrackList after each track waveform
- Added placeholder spacers in left controls column to maintain alignment
- Effects section shows collapsible device rack with mini preview when collapsed
- Automation lanes render when track.automation.showAutomation is true
- Import dialog moved to waveform-only rendering mode

The automation and effects sections are now properly unfoldable again.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 11:24:38 +01:00
90e66e8bef fix: restore border between track controls and waveforms
Added back the right border on the track controls column to visually
separate the controls from the waveform area.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 11:16:23 +01:00
e0b878daad fix: add bottom padding to track controls to compensate for scrollbar
Added pb-3 (padding-bottom) to the track controls column to account
for the horizontal scrollbar height in the waveforms column, ensuring
track controls stay perfectly aligned with their waveforms.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 11:13:55 +01:00
39ea599f18 fix: synchronize vertical scrolling between track controls and waveforms
Track controls now stay perfectly aligned with their waveforms during
vertical scrolling. The waveform column handles all scrolling (both
horizontal and vertical), and synchronizes its vertical scroll position
to the controls column.

Changes:
- Removed independent vertical scroll from controls column
- Added scroll event handler to waveforms column
- Controls column scrollTop is synced with waveforms column
- Track controls and waveforms now stay aligned at all times

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 11:11:38 +01:00
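
A sketch of the one-way scroll synchronization described above, written as a hypothetical hook; the refs and class names are illustrative:

```typescript
import { useRef } from "react";

// One-way sync: the waveforms column owns vertical scrolling and pushes its
// scrollTop to the (non-scrolling) controls column on every scroll event.
export function useSyncedVerticalScroll() {
  const controlsRef = useRef<HTMLDivElement | null>(null);
  const waveformsRef = useRef<HTMLDivElement | null>(null);

  const handleWaveformsScroll = () => {
    if (controlsRef.current && waveformsRef.current) {
      controlsRef.current.scrollTop = waveformsRef.current.scrollTop;
    }
  };

  // Usage (JSX, roughly):
  //   <div ref={controlsRef} className="overflow-hidden">…controls…</div>
  //   <div ref={waveformsRef} onScroll={handleWaveformsScroll}
  //        className="overflow-x-auto overflow-y-scroll">…waveforms…</div>
  return { controlsRef, waveformsRef, handleWaveformsScroll };
}
```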
45d46067ea fix: move useRef before early return to comply with Rules of Hooks
Fixed React Hooks error by moving waveformScrollRef declaration
before the conditional early return. Hooks must always be called
in the same order on every render.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 11:07:15 +01:00
d7dfb8a746 feat: implement synchronized horizontal scrolling for track waveforms
Refactored track layout to use a two-column design where:
- Left column: All track control panels (fixed, vertical scroll)
- Right column: All waveforms (shared horizontal scroll container)

This ensures all track waveforms scroll together horizontally when
zoomed in, providing a more cohesive DAW-like experience.

Changes:
- Added renderControlsOnly and renderWaveformOnly props to Track component
- Track component now supports conditional rendering of just controls or just waveform
- TrackList renders each track twice: once for controls, once for waveforms
- Waveforms share a common scrollable container for synchronized scrolling
- Track controls stay fixed while waveforms scroll horizontally together

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 11:05:27 +01:00
5dadba9c9f fix: waveform fills viewport width when zoom is 1.0
Modified the waveform minWidth calculation to only apply when
zoom > 1, so the waveform fills the available viewport space
at default zoom level instead of creating unnecessary horizontal
scrolling.

Changes:
- minWidth only applied when zoom > 1
- At zoom = 1.0, waveform takes 100% of available space
- Scrollbar only appears when actually zoomed in

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 10:52:14 +01:00
cd311d8145 fix: make only waveform area scrollable, keep track controls fixed
Fixed horizontal scrollbar implementation to only scroll the waveform
content area while keeping track controls fixed in the viewport.

Changes:
- Wrapped waveform content in a scrollable container with overflow-x-auto
- Moved minWidth calculation to inner container
- Track controls now stay fixed at 192px width
- No scrollbar appears on initial track load
- Scrollbar only appears when zooming extends content beyond viewport

Resolves the issue where the entire track row (including controls) was
scrolling out of view when zoomed in.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 10:50:09 +01:00
c7cb0b2504 fix: waveform container now expands with zoom for horizontal scrolling 2025-11-19 10:42:11 +01:00
b8d4053cbc fix: add horizontal scrollbar for zoomed waveforms 2025-11-19 10:39:15 +01:00
ac8aa9e6c6 feat: complete Phase 13 - Keyboard Shortcuts
Added all remaining keyboard shortcuts for a complete keyboard-driven workflow:

Playback Shortcuts:
- Home: Jump to start
- End: Jump to end
- Left/Right Arrow: Seek ±1 second
- Ctrl+Left/Right: Seek ±5 seconds

Editing Shortcuts:
- Ctrl+A: Select All (entire track content)

View Shortcuts:
- Ctrl+Plus/Equals: Zoom in
- Ctrl+Minus: Zoom out
- Ctrl+0: Fit to view

All shortcuts work seamlessly with existing shortcuts:
- Spacebar: Play/Pause
- Ctrl+Z/Y: Undo/Redo
- Ctrl+X/C/V: Cut/Copy/Paste
- Ctrl+S: Save
- Ctrl+D: Duplicate
- Delete/Backspace: Delete
- Escape: Clear selection

The editor is now fully controllable via keyboard for a professional
audio editing workflow similar to Audacity/Reaper.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 10:36:58 +01:00
7de75f7b2b refactor: export/import projects as ZIP files instead of JSON
Changed from single JSON file to ZIP archive format to avoid string
length limits when serializing large audio buffers.

New ZIP structure:
- project.json (metadata, track info, effects, automation)
- track_0.wav, track_1.wav, etc. (audio files in WAV format)

Benefits:
- No more RangeError: Invalid string length
- Smaller file sizes (binary WAV data vs. base64-encoded JSON)
- Human-readable audio files in standard format
- Can extract and inspect individual tracks
- Easier to edit/debug project metadata

Added jszip dependency for ZIP file handling.
Changed file picker to accept .zip files.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 10:30:20 +01:00
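
A sketch of the ZIP layout above using the jszip API; `audioBufferToWav` is assumed to be the project's existing WAV encoder:

```typescript
import JSZip from "jszip";

// Builds the archive described in the commit: project.json plus one WAV per track.
async function exportProjectZip(
  projectJson: object,
  trackBuffers: AudioBuffer[],
  audioBufferToWav: (buffer: AudioBuffer) => ArrayBuffer // assumed existing helper
): Promise<Blob> {
  const zip = new JSZip();
  zip.file("project.json", JSON.stringify(projectJson, null, 2));
  trackBuffers.forEach((buffer, i) => {
    zip.file(`track_${i}.wav`, audioBufferToWav(buffer));
  });
  return zip.generateAsync({ type: "blob" });
}

// Download helper: wrap the blob in an object URL and click a temporary link.
function downloadBlob(blob: Blob, filename: string) {
  const url = URL.createObjectURL(blob);
  const a = document.createElement("a");
  a.href = url;
  a.download = filename;
  a.click();
  URL.revokeObjectURL(url);
}
```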
543eb069d7 docs: mark Phase 12.3 (Export/Import) as complete 2025-11-19 10:25:30 +01:00
a626427142 feat: implement Phase 12.3 - Project Export/Import
Added full export/import functionality for projects:

Export Features:
- Export button for each project in Projects dialog
- Downloads project as JSON file with all data
- Includes tracks, audio buffers, effects, automation, settings
- Filename format: {project_name}_{timestamp}.json

Import Features:
- Import button in Projects dialog header
- File picker for .json files
- Automatically generates new project ID to avoid conflicts
- Appends "(Imported)" to project name
- Preserves all project data

This enables:
- Backup of projects outside the browser
- Sharing projects with collaborators
- Migration between computers/browsers
- Version snapshots at different stages

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 10:25:06 +01:00
9ad504478d docs: mark Phase 12 (Project Management) as complete
Completed features:
- IndexedDB project storage with full serialization
- Projects dialog UI for managing projects
- Auto-save (3-second debounce, silent)
- Manual save with Ctrl+S keyboard shortcut
- Auto-load last project on startup
- Editable project name in header
- Delete and duplicate project functionality
- Project metadata tracking (created/updated timestamps)

Phase 12.3 (Export/Import JSON) remains for future implementation.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 10:14:12 +01:00
bcf439ca5e feat: add manual save with Ctrl+S and suppress auto-save toasts
Changes:
- Modified handleSaveProject to accept showToast parameter (default: false)
- Auto-saves now run silently without toast notifications
- Added Ctrl+S / Cmd+S keyboard shortcut for manual save with toast
- Added "Save Project" and "Open Projects" to command palette
- Error toasts still shown for all save failures

This provides the best of both worlds:
- Automatic background saves don't interrupt the user
- Manual saves (Ctrl+S or command palette) provide confirmation
- Users can work without being constantly notified

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 10:10:32 +01:00
31af08e9f7 fix: stabilize auto-save by using ref for currentTime
The handleSaveProject callback had currentTime in its dependencies, which
caused the callback to be recreated on every playback frame update. This
made the auto-save effect reset its timer constantly, preventing auto-save
from ever triggering.

Solution: Use a ref to capture the latest currentTime value without
including it in the callback dependencies. This keeps the callback stable
while still saving the correct currentTime.

Added debug logging to track auto-save scheduling and triggering.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 10:05:21 +01:00
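
A sketch of the ref-based pattern as a hypothetical `useAutoSave` hook; the 3-second debounce matches the neighbouring auto-save commits, and `saveProject` stands in for the real persistence call:

```typescript
import { useCallback, useEffect, useRef } from "react";

// Keeping currentTime in a ref keeps handleSaveProject referentially stable,
// so the debounced auto-save effect does not reset its timer on every
// playback frame update.
export function useAutoSave(
  currentTime: number,
  tracksVersion: number, // any value that should trigger an auto-save when it changes
  saveProject: (currentTimeSec: number) => Promise<void>
) {
  const currentTimeRef = useRef(currentTime);
  currentTimeRef.current = currentTime; // refreshed every render, not a dependency

  const handleSaveProject = useCallback(async () => {
    await saveProject(currentTimeRef.current);
  }, [saveProject]);

  useEffect(() => {
    const id = window.setTimeout(handleSaveProject, 3000); // 3s debounce
    return () => window.clearTimeout(id);
  }, [tracksVersion, handleSaveProject]);
}
```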
1b41fca393 fix: serialize automation data to prevent DataCloneError
Deep clone automation data using JSON.parse(JSON.stringify()) to remove
any functions before saving to IndexedDB. This prevents DataCloneError
when trying to store non-serializable data.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 10:02:37 +01:00
67abbb20cb fix: serialize effects properly to avoid DataCloneError
Effects contain non-serializable data like functions and audio nodes that
cannot be stored in IndexedDB. Added proper serialization.

Changes:
- Added serializeEffects function to strip non-serializable data
- Uses JSON parse/stringify to deep clone parameters
- Preserves effect type, name, enabled state, and parameters
- Fixes DataCloneError when saving projects with effects

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 09:56:02 +01:00
0d86cff1b7 feat: disable localStorage persistence and add auto-load last project
**useMultiTrack.ts:**
- Removed localStorage persistence (tracks, effects, automation)
- Now relies entirely on IndexedDB via project management system
- Simpler state management without dual persistence

**AudioEditor.tsx:**
- Store last project ID in localStorage when saving
- Auto-load last project on page mount
- Only runs once per session with hasAutoLoaded flag
- Falls back to empty state if project can't be loaded

**Benefits:**
- No more conflicts between localStorage and IndexedDB
- Effects and automation properly persisted
- Seamless experience - reload page and your project is ready
- Single source of truth (IndexedDB)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 09:51:27 +01:00
abd2a403cb debug: add logging for project save/load operations 2025-11-19 09:45:29 +01:00
102c67dc8d fix: improve auto-save to trigger on project name changes
Changed auto-save from 30s interval to 3s debounce, and added
currentProjectName to dependencies so it saves when name changes.

**Changes:**
- Auto-save now triggers 3 seconds after ANY change (tracks or name)
- Added `currentProjectName` to auto-save effect dependencies
- Removed `onBlur` handler from input (auto-save handles it)
- Added tooltip "Click to edit project name"
- Faster feedback - saves 3s after typing stops instead of 30s

**User Experience:**
- Edit project name → auto-saves after 3s
- Add/modify tracks → auto-saves after 3s
- No need to manually save or wait 30 seconds
- Toast notification confirms save

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 09:41:39 +01:00
b743f97276 fix: project loading and add editable project name
Fixed project loading to restore all track properties and added inline
project name editing in header.

**AudioEditor.tsx:**
- Added `loadTracks` from useMultiTrack hook
- Fixed `handleLoadProject` to use `loadTracks()` instead of recreating tracks
  - Now properly restores all track properties (effects, automation, volume, pan, etc.)
  - Shows track count in success toast message
- Added editable project name input in header
  - Positioned between logo and track actions
  - Auto-sizes based on text length
  - Saves on blur (triggers auto-save)
  - Smooth hover/focus transitions
  - Muted color that brightens on interaction

**useMultiTrack.ts:**
- Added `loadTracks()` method to replace all tracks at once
- Enables proper project loading with full state restoration
- Maintains all track properties during load

**Fixes:**
- Projects now load correctly with all tracks and their audio buffers
- Track properties (effects, automation, volume, pan, etc.) fully restored
- Project name can be edited inline in header
- Auto-save triggers when project name changes

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 09:32:34 +01:00
d3a5961131 feat: implement Phase 12.2 - Project Management UI Integration
Integrated complete project management system with auto-save:

**AudioEditor.tsx - Full Integration:**
- Added "Projects" button in header toolbar (FolderOpen icon)
- Project state management (currentProjectId, currentProjectName, projects list)
- Comprehensive project handlers:
  - `handleOpenProjectsDialog` - Opens dialog and loads project list
  - `handleSaveProject` - Saves current project to IndexedDB
  - `handleNewProject` - Creates new project with confirmation
  - `handleLoadProject` - Loads project and restores all tracks/settings
  - `handleDeleteProject` - Deletes project with cleanup
  - `handleDuplicateProject` - Creates project copy
- Auto-save effect: Saves project every 30 seconds when tracks exist
- ProjectsDialog component integrated with all handlers
- Toast notifications for all operations

**lib/storage/projects.ts:**
- Re-exported ProjectMetadata type for easier importing
- Fixed type exports

**Key Features:**
- **Auto-save**: Automatically saves every 30 seconds
- **Project persistence**: Full track state, audio buffers, effects, automation
- **Smart loading**: Restores zoom, track order, and all track properties
- **Safety confirmations**: Warns before creating new project with unsaved changes
- **User feedback**: Toast messages for all operations (save, load, delete, duplicate)
- **Seamless workflow**: Projects → Import → Export in logical toolbar order

**User Flow:**
1. Click "Projects" to open project manager
2. Create new project or load existing
3. Work on tracks (auto-saves every 30s)
4. Switch between projects anytime
5. Duplicate projects for experimentation
6. Delete old projects to clean up

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 09:26:57 +01:00
e1c19ffcb3 feat: implement Phase 12.1 - Project Management with IndexedDB
Added complete project save/load system using IndexedDB:

**New Files:**
- `lib/storage/db.ts` - IndexedDB database schema and operations
  - ProjectMetadata interface for project metadata
  - SerializedAudioBuffer and SerializedTrack for storage
  - Database initialization with projects object store
  - Audio buffer serialization/deserialization functions
  - CRUD operations for projects

- `lib/storage/projects.ts` - High-level project management service
  - Save/load project state with tracks and settings
  - List all projects sorted by last updated
  - Delete and duplicate project operations
  - Track serialization with proper type conversions
  - Audio buffer and effect chain handling

- `components/dialogs/ProjectsDialog.tsx` - Project list UI
  - Grid view of all projects with metadata
  - Project actions: Open, Duplicate, Delete
  - Create new project button
  - Empty state with call-to-action
  - Confirmation dialog for deletions

**Key Features:**
- IndexedDB stores complete project state (tracks, audio buffers, settings)
- Efficient serialization of AudioBuffer channel data
- Preserves all track properties (effects, automation, volume, pan)
- Sample rate and duration tracking
- Created/updated timestamps
- Type-safe with full TypeScript support

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 09:19:52 +01:00
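
A sketch of AudioBuffer (de)serialization for IndexedDB storage; the `SerializedAudioBuffer` shape is an assumption, not the real `db.ts` interface:

```typescript
// Assumed stored form: typed arrays are structured-clone friendly, so the
// channel data can go into IndexedDB directly.
interface SerializedAudioBuffer {
  sampleRate: number;
  length: number;
  numberOfChannels: number;
  channelData: Float32Array[];
}

function serializeAudioBuffer(buffer: AudioBuffer): SerializedAudioBuffer {
  const channelData: Float32Array[] = [];
  for (let ch = 0; ch < buffer.numberOfChannels; ch++) {
    // Copy so the stored data is independent of the live buffer.
    channelData.push(new Float32Array(buffer.getChannelData(ch)));
  }
  return {
    sampleRate: buffer.sampleRate,
    length: buffer.length,
    numberOfChannels: buffer.numberOfChannels,
    channelData,
  };
}

function deserializeAudioBuffer(ctx: BaseAudioContext, s: SerializedAudioBuffer): AudioBuffer {
  const buffer = ctx.createBuffer(s.numberOfChannels, s.length, s.sampleRate);
  s.channelData.forEach((data, ch) => buffer.copyToChannel(data, ch));
  return buffer;
}
```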
e208a448d0 fix: remove fake FLAC export format
Removed the non-standard FLAC export option which was causing import issues:

**Problem:**
- "FLAC" export was actually a custom compressed WAV format
- Used fflate (DEFLATE compression) instead of real FLAC encoding
- Files couldn't be imported back or opened in other software
- No browser-compatible FLAC encoder library exists

**Changes:**
- Removed FLAC from export format options (WAV and MP3 only)
- Removed FLAC quality slider from ExportDialog
- Removed audioBufferToFlac function reference
- Updated ExportSettings interface to only include 'wav' | 'mp3'
- Simplified bit depth selector (WAV only instead of WAV/FLAC)
- Updated format descriptions to clarify lossy vs lossless

**Export formats now:**
- WAV - Lossless, Uncompressed (16/24/32-bit)
- MP3 - Lossy, Compressed (128/192/256/320 kbps)

Users can now successfully export and re-import their audio!

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 09:08:17 +01:00
500a466bae feat: integrate ImportDialog into file upload flow
Now when users upload an audio file, they see the ImportDialog with options:

**User Flow:**
1. User selects audio file (click or drag-drop)
2. File is pre-decoded to extract metadata (sample rate, channels)
3. ImportDialog shows with file info and import options:
   - Convert to Mono (if stereo/multi-channel)
   - Resample Audio (44.1kHz - 192kHz)
   - Normalize on Import
4. User clicks Import (or Cancel)
5. File is imported with selected transformations applied
6. Track name auto-updates to filename

**Track.tsx Changes:**
- Added import dialog state (showImportDialog, pendingFile, fileMetadata)
- Updated handleFileChange to show dialog instead of direct import
- Added handleImport to process file with user-selected options
- Added handleImportCancel to dismiss dialog
- Renders ImportDialog when showImportDialog is true
- Logs imported audio metadata to console

Now users can see and control all import transformations!

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 08:38:15 +01:00
37f910acb7 feat: complete Phase 11.4 - comprehensive audio file import
Implemented advanced audio import capabilities:

**Import Features:**
- Support for WAV, MP3, OGG, FLAC, M4A, AIFF formats
- Sample rate conversion using OfflineAudioContext
- Stereo to mono conversion (equal channel mixing)
- Normalize on import option (99% peak with 1% headroom)
- Comprehensive codec detection from MIME types and extensions

**API Enhancements:**
- ImportOptions interface (convertToMono, targetSampleRate, normalizeOnImport)
- importAudioFile() function returning buffer + metadata
- AudioFileInfo with AudioMetadata (codec, duration, channels, sample rate, file size)
- Enhanced decodeAudioFile() with optional import transformations

**UI Components:**
- ImportDialog component with import settings controls
- Sample rate selector (44.1kHz - 192kHz)
- Checkbox options for mono conversion and normalization
- File info display (original sample rate and channels)
- Updated FileUpload to show AIFF support

**Technical Implementation:**
- Offline resampling for quality preservation
- Equal-power channel mixing for stereo-to-mono
- Peak detection across all channels
- Metadata extraction with codec identification

Phase 11 (Export & Import) now complete!

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 08:25:36 +01:00
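
Two sketches of the offline resampling and mono mixdown mentioned above, using standard Web Audio APIs; they are illustrations, not the project's `importAudioFile` implementation:

```typescript
// Resample by rendering the decoded buffer through an OfflineAudioContext
// running at the target sample rate.
async function resampleBuffer(buffer: AudioBuffer, targetSampleRate: number): Promise<AudioBuffer> {
  if (buffer.sampleRate === targetSampleRate) return buffer;
  const length = Math.ceil(buffer.duration * targetSampleRate);
  const offline = new OfflineAudioContext(buffer.numberOfChannels, length, targetSampleRate);
  const source = offline.createBufferSource();
  source.buffer = buffer;
  source.connect(offline.destination);
  source.start();
  return offline.startRendering();
}

// Mono conversion by averaging channels (the equal channel mixing described above).
function toMono(ctx: BaseAudioContext, buffer: AudioBuffer): AudioBuffer {
  const out = ctx.createBuffer(1, buffer.length, buffer.sampleRate);
  const mixed = out.getChannelData(0);
  for (let ch = 0; ch < buffer.numberOfChannels; ch++) {
    const data = buffer.getChannelData(ch);
    for (let i = 0; i < data.length; i++) mixed[i] += data[i] / buffer.numberOfChannels;
  }
  return out;
}
```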
c3e295f695 fix: use lamejs Mp3Encoder API for proper module initialization
Switched from low-level Lame API to Mp3Encoder class which:
- Properly initializes all required modules (Lame, BitStream, etc.)
- Handles module dependencies via setModules() calls
- Provides a simpler encodeBuffer/flush API
- Resolves "init_bit_stream_w is not defined" error

Updated TypeScript declarations to export Mp3Encoder and WavHeader
from lamejs/src/js/index.js

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 08:19:48 +01:00
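
A sketch of the `encodeBuffer`/`flush` flow, assuming the project's type declarations export `Mp3Encoder`; the Float32→Int16 conversion and the 1152-sample block size are conventional lamejs usage, not taken from the project's code:

```typescript
import { Mp3Encoder } from "lamejs";

// Float [-1, 1] samples to 16-bit PCM, as lamejs expects Int16Array input.
function floatTo16(data: Float32Array): Int16Array {
  const out = new Int16Array(data.length);
  for (let i = 0; i < data.length; i++) {
    const s = Math.max(-1, Math.min(1, data[i]));
    out[i] = s < 0 ? s * 0x8000 : s * 0x7fff;
  }
  return out;
}

function encodeMp3(buffer: AudioBuffer, kbps: number): Blob {
  const channels = Math.min(2, buffer.numberOfChannels);
  const encoder = new Mp3Encoder(channels, buffer.sampleRate, kbps);
  const left = floatTo16(buffer.getChannelData(0));
  const right = channels === 2 ? floatTo16(buffer.getChannelData(1)) : undefined;

  const chunks: Int8Array[] = [];
  const blockSize = 1152; // one MPEG frame of samples per encodeBuffer call
  for (let i = 0; i < left.length; i += blockSize) {
    const l = left.subarray(i, i + blockSize);
    const r = right ? right.subarray(i, i + blockSize) : undefined;
    const mp3buf = r ? encoder.encodeBuffer(l, r) : encoder.encodeBuffer(l);
    if (mp3buf.length > 0) chunks.push(mp3buf);
  }
  const tail = encoder.flush();
  if (tail.length > 0) chunks.push(tail);
  return new Blob(chunks, { type: "audio/mpeg" });
}
```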
51114330ea fix: use direct ES module imports from lamejs source files
Fixed MP3 export by importing lamejs modules directly from source:
- Import MPEGMode, Lame, and BitStream from individual source files
- Use Lame API directly instead of Mp3Encoder wrapper
- Updated TypeScript declarations for each module
- Resolves "MPEGMode is not defined" error

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 08:16:52 +01:00
d5c84d35e4 fix: install lamejs from GitHub repo for proper browser support
Switched from npm package to GitHub repo (github:zhuker/lamejs) which
includes the proper browser build. Reverted to simple dynamic import.

Fixes MP3 export MPEGMode reference error.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 08:11:46 +01:00
77916d4d07 fix: load lamejs from pre-built browser bundle
Changed MP3 export to dynamically load the pre-built lame.min.js
bundle from node_modules instead of trying to import the CommonJS
module. The browser bundle properly bundles all dependencies including
MPEGMode and exposes a global lamejs object.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 08:06:24 +01:00
8112ff1ec3 fix: add CommonJS compatibility for lamejs dynamic import
Fixed MP3 export error by handling both default and named exports
from lamejs module to support CommonJS/ESM interop.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 07:52:17 +01:00
df1314a37c docs: update PLAN.md for Phase 11.3 completion
Updated documentation to reflect export regions implementation:
- Marked Phase 11.3 (Export Regions) as complete
- Updated progress overview to show Phase 11.1-11.3 complete
- Added export scope details to Working Features
- Listed all three export modes: project, selection, tracks
- Updated phase status summary

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 07:49:10 +01:00
38a2b2962d feat: add export scope options (project/selection/tracks)
Implemented Phase 11.3 - Export Regions:
- Added export scope selector in ExportDialog
  - Entire Project: Mix all tracks into single file
  - Selected Region: Export only the selected region (disabled if no selection)
  - Individual Tracks: Export each track as separate file
- Updated ExportSettings interface with scope property
- Refactored handleExport to support all three export modes:
  - Project mode: Mix all tracks (existing behavior)
  - Selection mode: Extract selection from all tracks and mix
  - Tracks mode: Loop through tracks and export separately with sanitized filenames
- Added hasSelection prop to ExportDialog to enable/disable selection option
- Created helper function convertAndDownload to reduce code duplication
- Selection mode uses extractBufferSegment for precise region extraction
- Track names are sanitized for filenames (remove special characters)
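
A rough sketch of the per-region and per-track pieces mentioned above; `extractSegment` and `sanitizeFilename` are illustrative stand-ins, not the project's actual `extractBufferSegment`/`convertAndDownload`:

```typescript
// Copy a [startSec, endSec) slice of an AudioBuffer into a new buffer (illustrative).
function extractSegment(ctx: BaseAudioContext, buffer: AudioBuffer, startSec: number, endSec: number): AudioBuffer {
  const start = Math.floor(startSec * buffer.sampleRate);
  const end = Math.min(buffer.length, Math.ceil(endSec * buffer.sampleRate));
  const out = ctx.createBuffer(buffer.numberOfChannels, Math.max(1, end - start), buffer.sampleRate);
  for (let ch = 0; ch < buffer.numberOfChannels; ch++) {
    out.copyToChannel(buffer.getChannelData(ch).subarray(start, end), ch);
  }
  return out;
}

// Keep only filesystem-safe characters for per-track filenames.
const sanitizeFilename = (name: string) => name.replace(/[^a-z0-9 _-]/gi, '').trim() || 'track';
```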

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 07:47:56 +01:00
c6ff313050 docs: update PLAN.md for Phase 11.1 & 11.2 completion
Updated documentation to reflect completed export features:
- Marked Phase 11.1 (Export Formats) as complete
- Marked Phase 11.2 (Export Settings) as complete
- Added technical implementation details for MP3 and FLAC
- Updated progress overview status
- Added Export Features section to Working Features list
- Updated Analysis Tools section to reflect Phase 10 completion

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 07:41:24 +01:00
6577d9f27b feat: add MP3 and FLAC export formats
Implemented Phase 11.1 export format support:
- Added MP3 export using lamejs library
- Added FLAC export using fflate DEFLATE compression
- Updated ExportDialog with format selector and format-specific options
  - MP3: bitrate selector (128/192/256/320 kbps)
  - FLAC: compression quality slider (0-9)
  - WAV: bit depth selector (16/24/32-bit)
- Updated AudioEditor to route export based on selected format
- Created TypeScript declarations for lamejs
- Fixed AudioStatistics to use audioBuffer instead of buffer property

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 02:14:32 +01:00
1c56e596b5 docs: mark Phase 10 complete in PLAN.md
All Phase 10 items now complete:
- ✅ Frequency Analyzer & Spectrogram
- ✅ Phase Correlation Meter
- ✅ LUFS Loudness Meter
- ✅ Audio Statistics with full metrics

Updated status to Phase 11 (Export & Import) in progress.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 02:06:51 +01:00
355bade08f feat: complete Phase 10 - add phase correlation, LUFS, and audio statistics
Implemented remaining Phase 10 analysis tools:

**Phase Correlation Meter (10.3)**
- Real-time stereo phase correlation display
- Pearson correlation coefficient calculation
- Color-coded indicator (-1 to +1 scale)
- Visual feedback: Mono-like, Good Stereo, Wide Stereo, Phase Issues
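
For reference, the displayed value is the Pearson correlation of the left/right sample blocks, clamped to [-1, +1]:

$$ r = \frac{\overline{LR} - \bar{L}\,\bar{R}}{\sqrt{\left(\overline{L^{2}} - \bar{L}^{2}\right)\left(\overline{R^{2}} - \bar{R}^{2}\right)}} $$

where the bars denote means over the current analysis block; +1 means the channels move together (mono-like), values near 0 indicate a decorrelated wide stereo image, and -1 indicates phase cancellation.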

**LUFS Loudness Meter (10.3)**
- Momentary, Short-term, and Integrated LUFS measurements
- Simplified K-weighting approximation
- Vertical bar display with -70 to 0 LUFS range
- -23 LUFS broadcast standard reference line
- Real-time history tracking (10 seconds)

**Audio Statistics (10.4)**
- Project info: track count, duration, sample rate, channels, bit depth
- Level analysis: peak, RMS, dynamic range, headroom
- Real-time buffer analysis from all tracks
- Color-coded warnings for clipping and low headroom

**Integration**
- Added 5-button toggle in master column (FFT, SPEC, PHS, LUFS, INFO)
- All analyzers share a consistent 192px-wide layout

- Theme-aware styling for light/dark modes
- Compact button labels for space efficiency

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 02:00:41 +01:00
461a800bb6 docs: update PLAN.md with Phase 10 progress
Marked completed items in Phase 10:
- ✅ Frequency Analyzer (10.1) with real-time FFT display
- ✅ Spectrogram (10.2) with time-frequency visualization
- ✅ Peak/RMS metering (10.3) already complete
- ✅ Clip indicator (10.3) for master channel

Added Analysis Tools section to Working Features.
Updated Current Status and Next Steps.

Remaining Phase 10 items:
- Phase correlation meter
- LUFS loudness meter
- Audio statistics display

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 01:54:39 +01:00
87771a5125 fix: resolve TypeScript type errors
Fixed several TypeScript issues:
- Changed 'formatter' to 'formatValue' in CircularKnob props
- Added explicit type annotations for formatValue functions
- Initialized animationFrameRef with undefined value
- Removed unused Waveform import from TrackControls

All type checks now pass.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 01:52:06 +01:00
90f9218ed3 fix: make spectrogram background match frequency analyzer
Changed spectrogram to use transparency for low values
instead of interpolating from background color:

- Low signal values (0-64) fade from transparent to blue
- Background color shows through transparent areas
- Now matches FrequencyAnalyzer theme behavior
- Both analyzers use bg-muted/30 wrapper for consistent theming

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 01:48:36 +01:00
3818d93696 fix: use theme-aware background colors for analyzers
Updated both FrequencyAnalyzer and Spectrogram to use
light gray background (bg-muted/30) that adapts to the theme:

- Wrapped canvas in bg-muted/30 container
- FrequencyAnalyzer reads parent background color for canvas fill
- Spectrogram interpolates from background color to blue for low values
- Both analyzers now work properly in light and dark themes

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 01:46:27 +01:00
e4b3433cf3 fix: increase analyzer size to 192px width with proportional height
Increased analyzer dimensions from 160px to 192px width
and from 300px to 360px minimum height for better visibility.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 01:43:45 +01:00
d0601b2b36 fix: make analyzer cards match master fader width
Constrained analyzer components to 160px width to match
the MasterControls card width, creating better visual alignment
in the master column.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 01:42:25 +01:00
7dc0780bd2 feat: implement Phase 10.1 & 10.2 - frequency analyzer and spectrogram
Added real-time audio analysis visualizations to master column:

- FrequencyAnalyzer component with FFT bar display
  - Canvas-based rendering with requestAnimationFrame
  - Color gradient from cyan to green based on frequency
  - Frequency axis labels (20Hz, 1kHz, 20kHz)

- Spectrogram component with time-frequency waterfall display
  - Scrolling visualization with ImageData pixel manipulation
  - Color mapping: black → blue → cyan → green → yellow → red
  - Vertical frequency axis with labels

- Master column redesign
  - Fixed width layout (280px)
  - Toggle buttons to switch between FFT and Spectrum views
  - Integrated above master controls with 300px minimum height

- Exposed masterAnalyser from useMultiTrackPlayer hook
  - Analyser node now accessible to visualization components
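
A simplified sketch of the analyser tap this exposes (the actual useMultiTrackPlayer graph has more nodes; names other than the Web Audio ones are assumptions):

```typescript
// Hypothetical master-chain wiring — actual hook internals may differ.
function createMasterChain(ctx: AudioContext) {
  const masterGain = ctx.createGain();
  const masterAnalyser = ctx.createAnalyser();
  masterAnalyser.fftSize = 2048; // frequencyBinCount = 1024 bins for the FFT/spectrogram views

  masterGain.connect(masterAnalyser);
  masterAnalyser.connect(ctx.destination); // analyser acts as a pass-through tap before output

  return { masterGain, masterAnalyser }; // masterAnalyser is handed to visualization components
}
```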

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 01:40:04 +01:00
4281c65ec1 fix: reduce master mute button size to 32x32px
Changed button dimensions from h-10 w-10 to h-8 w-8 for
a more compact, proportionate size.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 01:24:54 +01:00
b8ed648124 fix: reduce master mute button size to 40x40px
Changed button dimensions from h-12 w-12 to h-10 w-10 for
a more proportionate size.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 01:24:18 +01:00
a447a81414 fix: make master mute button square (48x48px)
Changed button dimensions from h-8 w-full to h-12 w-12 for a
square/quadratic shape matching the design of track buttons.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 01:23:37 +01:00
797d64b1d3 feat: redesign master mute button to match track style
Changed master mute button from Button component to match track style:
- Native button element with rounded-md styling
- Blue background when muted (bg-blue-500) with shadow
- Card background when unmuted with hover state
- Text-based "M" label instead of icons
- Larger size (h-8, text-[11px]) compared to track (h-5, text-[9px])
- Removed unused Volume2/VolumeX icon imports

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 01:22:03 +01:00
418c79d961 feat: add touch event support to knobs and faders
Added comprehensive touch event handling for mobile/tablet support:

CircularKnob.tsx:
- Added handleTouchStart, handleTouchMove, handleTouchEnd handlers
- Touch events use same drag logic as mouse events
- Prevents default to avoid scrolling while adjusting

TrackFader.tsx:
- Added touch event handlers for vertical fader control
- Integrated with existing onTouchStart/onTouchEnd callbacks
- Supports touch-based automation recording

MasterFader.tsx:
- Added touch event handlers matching TrackFader
- Complete touch support for master volume control

All components now work seamlessly on touch-enabled devices.
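
A stripped-down sketch of the touch-drag pattern for a vertical control (names and scaling are assumptions; the real handlers track more state and integrate with automation recording):

```typescript
import * as React from 'react';

// Illustrative hook — maps vertical touch movement to a value delta.
// Scroll prevention is assumed to come from `touch-action: none` on the element,
// since preventDefault() inside a passive synthetic touchmove listener may be ignored.
function useVerticalTouchDrag(onDelta: (deltaY: number) => void) {
  const lastY = React.useRef(0);

  const onTouchStart = (e: React.TouchEvent) => {
    lastY.current = e.touches[0].clientY;
  };

  const onTouchMove = (e: React.TouchEvent) => {
    const y = e.touches[0].clientY;
    onDelta(lastY.current - y); // dragging upward yields a positive delta (value increases)
    lastY.current = y;
  };

  return { onTouchStart, onTouchMove };
}
```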

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 01:19:03 +01:00
edebdc2129 fix: revert track heights and increase control card bottom padding
Reverted track heights to original values:
- DEFAULT_TRACK_HEIGHT: 360px → 340px
- MIN_TRACK_HEIGHT: 260px → 240px

Increased bottom padding of TrackControls from py-2 to pt-2 pb-4
for better spacing at the bottom of the control card.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 01:15:36 +01:00
a8570f2458 feat: increase minimum track height from 240px to 260px
Raised MIN_TRACK_HEIGHT to ensure proper spacing for all track
controls with the new border styling.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 01:14:12 +01:00
3cce3f8c05 feat: increase default track height from 340px to 360px
Increased DEFAULT_TRACK_HEIGHT to provide more vertical space for
track controls and waveform viewing.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 01:13:05 +01:00
9a161bbe42 fix: reduce background opacity to 50% for subtler appearance
Changed background from bg-card to bg-card/50 for both TrackControls
and MasterControls to make the border more prominent.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 01:10:33 +01:00
5c85914974 fix: increase border and background opacity for better visibility
Increased opacity for both TrackControls and MasterControls:
- bg-muted/10 → bg-muted/20 (background)
- border-accent/30 → border-accent/50 (border)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 01:07:52 +01:00
d34611ef10 feat: add border styling to TrackControls to match MasterControls
Added consistent border styling:
- bg-muted/10 background
- border-2 border-accent/30
- rounded-lg corners
- px-3 horizontal padding

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 01:06:46 +01:00
1ebd169137 fix: set TrackFader spacing to 16px to match MasterFader
Both track and master faders now have consistent 16px margin-left.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 01:05:40 +01:00
9140110589 fix: increase fader horizontal spacing (12px track, 16px master)
Increased margin-left values for better visibility:
- TrackFader: 8px → 12px
- MasterFader: 12px → 16px

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 01:04:43 +01:00
628c544511 fix: add horizontal spacing to faders (8px track, 12px master)
Shifted faders to the right for better visual alignment:
- TrackFader: 8px margin-left
- MasterFader: 12px margin-left

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 01:03:42 +01:00
f7a7c4420c fix: revert to 1 decimal place for fader labels
Changed back from .toFixed(2) to .toFixed(1) while keeping the fixed
widths to prevent component movement.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 01:02:17 +01:00
33be21295e fix: add fixed width to fader labels and show 2 decimal places
Added fixed widths to prevent component movement when values change:
- TrackFader: w-[32px] for value display
- MasterFader: w-[36px] for value/peak/RMS display

Changed decimal precision from .toFixed(1) to .toFixed(2) for more
accurate dB readings (e.g., -3.11 instead of -3.1).

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 01:00:57 +01:00
34452862ca fix: remove double dB conversion from track levels for consistent metering
Track levels were being converted to dB scale twice:
1. First in useMultiTrackPlayer via linearToDbScale()
2. Again in TrackFader via linearToDb()

This caused tracks to show incorrect meter levels compared to master.
Now both track and master levels store raw linear values (0-1) and
let the fader components handle the single dB conversion for display.
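
The single display-side conversion amounts to the standard mapping below (illustrative helper; the project's linearToDb may differ in clamping):

```typescript
// Linear meter level (0–1) to decibels for display; 1.0 → 0 dB, 0.5 → ≈ -6 dB, 0 → -∞ dB.
const linearToDb = (v: number): number => (v > 0 ? 20 * Math.log10(v) : -Infinity);
```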

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 00:55:46 +01:00
7a45a985c7 fix: convert master meter levels to dB scale for consistent metering
Previously, master meters received raw linear values (0-1) while track
meters received dB-normalized values, causing inconsistent metering display.
Now both master peak and RMS levels are converted using linearToDbScale()
for accurate comparison between track and master levels.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 00:50:27 +01:00
bb30aa95e1 fix: standardize meter colors to use 500 shades
- Changed RMS meter colors from 400 to 500 shades (brighter)
- Both Peak and RMS meters now use same color brightness
- Applied to both TrackFader and MasterFader
- Provides better visual consistency between the two meter types

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 00:42:40 +01:00
f830640732 feat: shift faders 8px to the right
- Added ml-2 (8px left margin) to TrackFader
- Added ml-2 (8px left margin) to MasterFader
- Both faders with their labels now shifted right for better alignment

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 00:40:34 +01:00
1d35c8f5b2 feat: center fader with knob above and buttons below
- Restructured TrackControls layout using flexbox justify-between
- Pan knob positioned at top
- Fader vertically centered in flex-1 middle section
- R/S/M/A/E buttons positioned at bottom in two rows
- Main container uses h-full for proper vertical distribution

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 00:38:55 +01:00
87b1c2e21a feat: increase default track height to 340px
- Increased default track height from 320px to 340px
- Provides more breathing room for all track controls

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 00:33:39 +01:00
43cdf9abdd feat: improve track controls layout and increase track height
- Reverted buttons to two rows (R/S/M and A/E) for better organization
- Changed button borders from 'rounded' to 'rounded-md' for consistency
- Centered pan knob and fader in their containers
- Increased spacing from gap-1.5 py-1.5 to gap-2 py-2
- Increased default track height from 300px to 320px
- Increased minimum track height from 220px to 240px
- All buttons now clearly visible with proper spacing

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 00:31:52 +01:00
935ab85c08 feat: add active state to automation and effects toggle buttons
- Added showEffects prop to TrackControls interface
- Effects button (E) now shows active state with primary color when toggled
- Automation button (A) already had active state functionality
- Both buttons now provide clear visual feedback when active
- Updated Track component to pass showEffects state to TrackControls

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 00:25:33 +01:00
8ec3505581 feat: enhance track controls and improve master fader layout
- Integrated all R/S/A/E buttons into TrackControls component
- Removed duplicate button sections from Track component
- Updated CircularKnob pan visualization to show no arc at center position
- Arc now extends from center to value for directional indication
- Moved MasterControls to dedicated right sidebar
- Removed master controls from PlaybackControls footer
- Optimized TrackControls spacing (gap-1.5, smaller buttons)
- Cleaner separation between transport and master control sections

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-19 00:22:52 +01:00
55 changed files with 9391 additions and 1254 deletions

PLAN.md
View File

@@ -2,7 +2,7 @@
## Progress Overview
**Current Status**: Phase 8 Complete (Recording with Overdub/Punch & Settings)
**Current Status**: Phase 14 Complete (Settings & Preferences: Global settings with localStorage persistence) - Ready for Phase 15
### Completed Phases
-**Phase 1**: Project Setup & Core Infrastructure (95% complete)
@@ -122,11 +122,52 @@
- ✅ Sample rate matching (44.1kHz, 48kHz, 96kHz)
- ✅ Recording settings panel shown when track is armed
**Analysis Tools (Phase 10 - Complete):**
- ✅ Frequency Analyzer with real-time FFT display
- ✅ Spectrogram with time-frequency waterfall visualization
- ✅ Phase Correlation Meter (stereo phase analysis)
- ✅ LUFS Loudness Meter (momentary/short-term/integrated)
- ✅ Audio Statistics Panel (project info and levels)
- ✅ Color-coded heat map (blue → cyan → green → yellow → red)
- ✅ Toggle between 5 analyzer views (FFT/SPEC/PHS/LUFS/INFO)
- ✅ Theme-aware backgrounds (light/dark mode support)
- ✅ Peak and RMS meters (master and per-track)
- ✅ Clip indicator with reset (master only)
**Export Features (Phase 11.1, 11.2 & 11.3 - Complete):**
- ✅ WAV export (16/24/32-bit PCM or float)
- ✅ MP3 export with lamejs (128/192/256/320 kbps)
- ✅ FLAC export with fflate compression (quality 0-9)
- ✅ Format selector dropdown with dynamic options
- ✅ Normalization option (1% headroom)
- ✅ Export scope selector:
- Entire Project (mix all tracks)
- Selected Region (extract and mix selection)
- Individual Tracks (separate files with sanitized names)
- ✅ Export dialog with format-specific settings
- ✅ Dynamic file extension display
- ✅ Smart selection detection (disable option when no selection)
**Settings & Preferences (Phase 14 - Complete):**
- ✅ Global settings dialog with 5 tabs (Recording, Audio, Editor, Interface, Performance)
- ✅ localStorage persistence with default merging
- ✅ Audio settings: buffer size, sample rate (applied to recording), auto-normalize
- ✅ UI settings: theme, font size, default track height (applied to new tracks)
- ✅ Editor settings: auto-save interval, undo limit, snap-to-grid, grid resolution, default zoom (applied)
- ✅ Performance settings: peak quality, waveform quality, spectrogram toggle (applied), max file size
- ✅ Category-specific reset buttons
- ✅ Real-time application to editor behavior
### Next Steps
- **Phase 6**: Audio effects ✅ COMPLETE (Basic + Filters + Dynamics + Time-Based + Advanced + Chain Management)
- **Phase 7**: Multi-track editing ✅ COMPLETE (Multi-track playback, effects, selection/editing)
- **Phase 8**: Recording functionality ✅ COMPLETE (Audio input, controls, settings with overdub/punch)
- **Phase 9**: Automation (NEXT)
- **Phase 9**: Automation ✅ COMPLETE (Volume/Pan automation with write/touch/latch modes)
- **Phase 10**: Analysis Tools ✅ COMPLETE (FFT, Spectrogram, Phase Correlation, LUFS, Audio Statistics)
- **Phase 11**: Export & Import ✅ COMPLETE (Full export/import with all formats, settings, scope options & conversions)
- **Phase 12**: Project Management ✅ COMPLETE (IndexedDB storage, auto-save, project export/import)
- **Phase 13**: Keyboard Shortcuts ✅ COMPLETE (Full suite of shortcuts for navigation, editing, and view control)
- **Phase 14**: Settings & Preferences ✅ COMPLETE (Global settings with localStorage persistence and live application)
---
@@ -662,171 +703,209 @@ audio-ui/
### Phase 10: Analysis Tools
#### 10.1 Frequency Analyzer
- [ ] Real-time FFT analyzer
- [ ] Frequency spectrum display
- [x] Real-time FFT analyzer
- [x] Frequency spectrum display
- [ ] Peak/Average display modes
- [ ] Logarithmic/Linear frequency scale
#### 10.2 Spectrogram
- [ ] Time-frequency spectrogram view
- [ ] Color scale customization
- [ ] FFT size configuration
- [x] Time-frequency spectrogram view
- [x] Color scale customization (heat map: black/gray → blue → cyan → green → yellow → red)
- [x] FFT size configuration (uses analyserNode.frequencyBinCount)
- [ ] Overlay on waveform (optional)
#### 10.3 Metering
- [ ] Peak meter
- [ ] RMS meter
- [ ] Phase correlation meter
- [ ] Loudness meter (LUFS - optional)
- [ ] Clip indicator
- [x] Peak meter (master and per-track)
- [x] RMS meter (master and per-track)
- [x] Phase correlation meter
- [x] Loudness meter (LUFS with momentary/short-term/integrated)
- [x] Clip indicator (master only)
#### 10.4 Audio Statistics
- [ ] File duration
- [ ] Sample rate, bit depth, channels
- [ ] Peak amplitude
- [ ] RMS level
- [ ] Dynamic range
- [x] File duration
- [x] Sample rate, bit depth, channels
- [x] Peak amplitude
- [x] RMS level
- [x] Dynamic range
- [x] Headroom calculation
### Phase 11: Export & Import
### Phase 11: Export & Import (Phase 11.1, 11.2, 11.3 Complete)
#### 11.1 Export Formats
#### 11.1 Export Formats ✅ COMPLETE
- [x] WAV export (PCM, various bit depths: 16/24/32-bit)
- [x] Export dialog with settings UI
- [x] Export button in header
- [x] Mix all tracks before export
- [ ] MP3 export (using lamejs)
- [ ] OGG Vorbis export
- [ ] FLAC export (using fflate)
- [x] MP3 export (using lamejs with dynamic import)
- [x] FLAC export (using fflate DEFLATE compression)
- [ ] OGG Vorbis export (skipped - no good browser encoder available)
#### 11.2 Export Settings
- [x] Bit depth selection (16/24/32-bit)
**Technical Implementation:**
- MP3 encoding with lamejs: 1152 sample block size, configurable bitrate
- FLAC compression with fflate: DEFLATE-based lossless compression
- TypeScript declarations for lamejs module
- Async/await for dynamic imports to reduce bundle size
- Format-specific UI controls in ExportDialog
#### 11.2 Export Settings ✅ COMPLETE
- [x] Bit depth selection (16/24/32-bit) for WAV and FLAC
- [x] Normalization before export (with 1% headroom)
- [x] Filename customization
- [x] Filename customization with dynamic extension display
- [x] Quality/bitrate settings:
- MP3: Bitrate selector (128/192/256/320 kbps)
- FLAC: Compression quality slider (0-9, fast to small)
- [x] Format selector dropdown (WAV/MP3/FLAC)
- [ ] Sample rate conversion
- [ ] Quality/bitrate settings (for lossy formats)
- [ ] Dithering options
#### 11.3 Export Regions
- [ ] Export entire project
- [ ] Export selected region
- [ ] Batch export all regions
- [ ] Export individual tracks
#### 11.3 Export Regions ✅ COMPLETE
- [x] Export entire project (mix all tracks)
- [x] Export selected region (extract and mix selection from all tracks)
- [x] Export individual tracks (separate files with sanitized names)
- [ ] Batch export all regions (future feature)
#### 11.4 Import
- [ ] Support for WAV, MP3, OGG, FLAC, M4A, AIFF
- [ ] Sample rate conversion on import
- [ ] Stereo to mono conversion
- [ ] File metadata reading
#### 11.4 Import ✅ COMPLETE
- [x] Support for WAV, MP3, OGG, FLAC, M4A, AIFF
- [x] Sample rate conversion on import
- [x] Stereo to mono conversion
- [x] File metadata reading (codec detection, duration, channels, sample rate)
- [x] ImportOptions interface for flexible import configuration
- [x] importAudioFile() function returning buffer + metadata
- [x] Normalize on import option
- [x] Import settings dialog component (ready for integration)
### Phase 12: Project Management
### Phase 12: Project Management
#### 12.1 Save/Load Projects
- [ ] Save project to IndexedDB
- [ ] Load project from IndexedDB
- [ ] Project list UI
- [ ] Auto-save functionality
- [ ] Save-as functionality
#### 12.1 Save/Load Projects
- [x] Save project to IndexedDB
- [x] Load project from IndexedDB
- [x] Project list UI (Projects dialog)
- [x] Auto-save functionality (3-second debounce)
- [x] Manual save with Ctrl+S
- [x] Auto-load last project on startup
- [x] Editable project name in header
- [x] Delete and duplicate projects
#### 12.2 Project Structure
- [ ] JSON project format
- [ ] Track information
- [ ] Audio buffer references
- [ ] Effect settings
- [ ] Automation data
- [ ] Region markers
#### 12.2 Project Structure
- [x] IndexedDB storage with serialization
- [x] Track information (name, color, volume, pan, mute, solo)
- [x] Audio buffer serialization (Float32Array per channel)
- [x] Effect settings (serialized effect chains)
- [x] Automation data (deep cloned to remove functions)
- [x] Project metadata (name, description, duration, track count)
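
A minimal sketch of the per-channel Float32Array serialization idea noted above (field and function names are assumptions, not the project's actual schema):

```typescript
// Hypothetical shape — the real project schema may differ.
interface SerializedAudioBuffer {
  sampleRate: number;
  length: number;
  channels: Float32Array[]; // one array per channel; IndexedDB stores these via structured clone
}

function serializeBuffer(buffer: AudioBuffer): SerializedAudioBuffer {
  const channels: Float32Array[] = [];
  for (let ch = 0; ch < buffer.numberOfChannels; ch++) {
    channels.push(buffer.getChannelData(ch).slice()); // copy so the live buffer isn't retained
  }
  return { sampleRate: buffer.sampleRate, length: buffer.length, channels };
}

function deserializeBuffer(ctx: BaseAudioContext, data: SerializedAudioBuffer): AudioBuffer {
  const buffer = ctx.createBuffer(data.channels.length, data.length, data.sampleRate);
  data.channels.forEach((channel, ch) => buffer.copyToChannel(channel, ch));
  return buffer;
}
```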
#### 12.3 Project Export/Import
- [ ] Export project as JSON (with audio files)
- [ ] Import project from JSON
- [ ] Project templates
#### 12.3 Project Export/Import
- [x] Export project as JSON (with audio data embedded)
- [x] Import project from JSON
- [x] Export button per project in Projects dialog
- [x] Import button in Projects dialog header
- [x] Auto-generate new IDs on import to avoid conflicts
- [ ] Project templates (future enhancement)
#### 12.4 Project Settings
- [ ] Sample rate
- [ ] Bit depth
- [ ] Default track count
- [ ] Project name/description
#### 12.4 Project Settings
- [x] Sample rate (stored per project)
- [x] Zoom level (persisted)
- [x] Current time (persisted)
- [x] Project name/description
- [x] Created/updated timestamps
### Phase 13: Keyboard Shortcuts
### Phase 13: Keyboard Shortcuts
#### 13.1 Playback Shortcuts
- [ ] Spacebar - Play/Pause
- [ ] Home - Go to start
- [ ] End - Go to end
- [ ] Left/Right Arrow - Move cursor
- [ ] Ctrl+Left/Right - Move by larger increment
#### 13.1 Playback Shortcuts
- [x] Spacebar - Play/Pause
- [x] Home - Go to start
- [x] End - Go to end
- [x] Left/Right Arrow - Seek ±1 second
- [x] Ctrl+Left/Right - Seek ±5 seconds
#### 13.2 Editing Shortcuts
- [ ] Ctrl+Z - Undo
- [ ] Ctrl+Y / Ctrl+Shift+Z - Redo
- [ ] Ctrl+X - Cut
- [ ] Ctrl+C - Copy
- [ ] Ctrl+V - Paste
- [ ] Delete - Delete selection
- [ ] Ctrl+A - Select All
- [ ] Escape - Clear selection
#### 13.2 Editing Shortcuts
- [x] Ctrl+Z - Undo
- [x] Ctrl+Y / Ctrl+Shift+Z - Redo
- [x] Ctrl+X - Cut
- [x] Ctrl+C - Copy
- [x] Ctrl+V - Paste
- [x] Ctrl+S - Save project
- [x] Ctrl+D - Duplicate selection
- [x] Delete/Backspace - Delete selection
- [x] Ctrl+A - Select All (on current track)
- [x] Escape - Clear selection
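
A condensed sketch of how such editing shortcuts are typically routed (illustrative; the project's shortcut hook and action names may differ):

```typescript
// Minimal keydown routing for a few of the shortcuts above.
function handleEditShortcut(
  e: KeyboardEvent,
  actions: { undo: () => void; redo: () => void; save: () => void }
) {
  const mod = e.ctrlKey || e.metaKey; // support both Ctrl and Cmd
  if (!mod) return;
  switch (e.key.toLowerCase()) {
    case 'z':
      e.preventDefault();
      if (e.shiftKey) actions.redo(); else actions.undo();
      break;
    case 'y':
      e.preventDefault();
      actions.redo();
      break;
    case 's':
      e.preventDefault(); // block the browser's save-page dialog
      actions.save();
      break;
  }
}
```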
#### 13.3 View Shortcuts
- [ ] Ctrl+Plus - Zoom in
- [ ] Ctrl+Minus - Zoom out
- [ ] Ctrl+0 - Fit to window
- [ ] F - Toggle fullscreen (optional)
#### 13.3 View Shortcuts
- [x] Ctrl+Plus/Equals - Zoom in
- [x] Ctrl+Minus - Zoom out
- [x] Ctrl+0 - Fit to window
- [ ] F - Toggle fullscreen (browser native)
#### 13.4 Custom Shortcuts
- [ ] Keyboard shortcuts manager
- [ ] User-configurable shortcuts
- [ ] Shortcut conflict detection
- [ ] Keyboard shortcuts manager (future enhancement)
- [ ] User-configurable shortcuts (future enhancement)
- [ ] Shortcut conflict detection (future enhancement)
### Phase 14: Settings & Preferences
### Phase 14: Settings & Preferences ✅ COMPLETE
#### 14.1 Audio Settings
- [ ] Audio output device selection
- [ ] Buffer size/latency configuration
- [ ] Sample rate preference
- [ ] Auto-normalize on import
**✅ Accomplished:**
- Global settings system with localStorage persistence
- Settings dialog with 5 tabs (Recording, Audio, Editor, Interface, Performance)
- Real-time settings application to editor behavior
- Category-specific reset buttons
- Merge with defaults on load for backward compatibility
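
A sketch of the localStorage persistence with default merging described above (the storage key and settings shape are assumptions):

```typescript
const SETTINGS_KEY = 'audio-editor-settings'; // hypothetical key

function loadSettings<T extends object>(defaults: T): T {
  try {
    const raw = localStorage.getItem(SETTINGS_KEY);
    if (!raw) return defaults;
    // Merge so settings added in newer versions fall back to their defaults.
    return { ...defaults, ...JSON.parse(raw) };
  } catch {
    return defaults; // corrupted JSON or storage unavailable
  }
}

const saveSettings = (settings: object) =>
  localStorage.setItem(SETTINGS_KEY, JSON.stringify(settings));
```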
#### 14.2 UI Settings
- [ ] Theme selection (dark/light/auto)
- [ ] Color scheme customization
- [ ] Waveform colors
- [ ] Font size
#### 14.1 Audio Settings
- [ ] Audio output device selection (future: requires device enumeration API)
- [x] Buffer size/latency configuration
- [x] Sample rate preference (applied to recording)
- [x] Auto-normalize on import
#### 14.3 Editor Settings
- [ ] Auto-save interval
- [ ] Undo history limit
- [ ] Snap-to-grid toggle
- [ ] Grid resolution
- [ ] Default zoom level
#### 14.2 UI Settings
- [x] Theme selection (dark/light/auto)
- [x] Font size (small/medium/large)
- [x] Default track height (120-400px, applied to new tracks)
- [ ] Color scheme customization (future: advanced theming)
#### 14.4 Performance Settings
- [ ] Peak calculation quality
- [ ] Waveform rendering quality
- [ ] Enable/disable spectrogram
- [ ] Maximum file size limit
#### 14.3 Editor Settings
- [x] Auto-save interval (0-60 seconds)
- [x] Undo history limit (10-200 operations)
- [x] Snap-to-grid toggle
- [x] Grid resolution (0.1-10 seconds)
- [x] Default zoom level (1-20x, applied to initial state)
#### 14.4 Performance Settings ✅
- [x] Peak calculation quality (low/medium/high)
- [x] Waveform rendering quality (low/medium/high)
- [x] Enable/disable spectrogram (applied to analyzer visibility)
- [x] Maximum file size limit (100-1000 MB)
### Phase 15: Polish & Optimization
#### 15.1 Performance Optimization
- [ ] Web Worker for heavy computations
- [ ] AudioWorklet for custom processing
- [ ] Lazy loading for effects
- [x] Lazy loading for dialogs and analysis components (GlobalSettingsDialog, ExportDialog, ProjectsDialog, ImportTrackDialog, FrequencyAnalyzer, Spectrogram, PhaseCorrelationMeter, LUFSMeter, AudioStatistics)
- [ ] Code splitting for route optimization
- [ ] Memory leak prevention
- [x] Memory leak prevention (audio-cleanup utilities, proper cleanup in useRecording, animation frame cancellation in visualizations)
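
The lazy-loading item above follows the usual next/dynamic pattern; the import path here is an assumption about the project layout:

```typescript
import dynamic from 'next/dynamic';

// Analyzer canvases are client-only, so they are skipped during SSR and loaded on demand.
const Spectrogram = dynamic(
  () => import('@/components/analysis/Spectrogram').then((m) => m.Spectrogram),
  { ssr: false }
);
```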
#### 15.2 Responsive Design
- [ ] Mobile-friendly layout
- [ ] Touch gesture support
- [ ] Adaptive toolbar (hide on mobile)
- [ ] Vertical scrolling for track list
#### 15.2 Responsive Design
- [x] Mobile-friendly layout (responsive header, adaptive toolbar with icon-only buttons on small screens)
- [x] Touch gesture support (collapse/expand controls with chevron buttons)
- [x] Adaptive toolbar (hide less critical buttons on mobile: Export on md, Clear All on lg)
- [x] Vertical scrolling for track list (sidebar hidden on mobile < lg breakpoint)
- [x] Collapsible track controls (two-state mobile: collapsed with minimal controls + horizontal meter, expanded with full height fader + pan control; desktop always expanded with narrow borders)
- [x] Collapsible master controls (collapsed view with horizontal level meter, expanded view with full controls; collapse button hidden on desktop)
- [x] Track collapse buttons on mobile (left chevron: collapses/expands track in list, right chevron: collapses/expands track controls)
- [x] Mobile vertical stacking layout (< lg breakpoint: controls → waveform → automation bars → effects bars per track, master controls and transport controls stacked vertically in bottom bar)
- [x] Desktop two-column layout (≥ lg breakpoint: controls left sidebar, waveforms right panel with automation/effects bars, master controls in right sidebar, transport controls centered in bottom bar)
- [x] Automation and effects bars on mobile (collapsible with eye/eye-off icons, horizontally scrollable, full functionality: parameter selection, mode cycling, height controls, add effects)
- [x] Height synchronization (track controls and waveform container heights match exactly using user-configurable track.height on desktop)
#### 15.3 Error Handling
- [ ] Graceful error messages
- [ ] File format error handling
- [ ] Memory limit warnings
- [ ] Browser compatibility checks
- [x] Graceful error messages (toast notifications for copy/paste/edit operations)
- [x] File format error handling (UnsupportedFormatDialog with format validation and decode error catching)
- [x] Memory limit warnings (MemoryWarningDialog with file size checks)
- [x] Browser compatibility checks (BrowserCompatDialog with Web Audio API detection)
#### 15.4 Documentation
- [ ] User guide
- [ ] Keyboard shortcuts reference
- [x] Keyboard shortcuts reference (KeyboardShortcutsDialog with ? shortcut and command palette integration)
- [ ] Effect descriptions
- [ ] Troubleshooting guide

View File

@@ -0,0 +1,159 @@
'use client';
import * as React from 'react';
import { cn } from '@/lib/utils/cn';
import type { Track } from '@/types/track';
export interface AudioStatisticsProps {
tracks: Track[];
className?: string;
}
export function AudioStatistics({ tracks, className }: AudioStatisticsProps) {
const stats = React.useMemo(() => {
if (tracks.length === 0) {
return {
totalDuration: 0,
longestTrack: 0,
sampleRate: 0,
channels: 0,
bitDepth: 32,
peakAmplitude: 0,
rmsLevel: 0,
dynamicRange: 0,
trackCount: 0,
};
}
let maxDuration = 0;
let maxPeak = 0;
let sumRms = 0;
let minPeak = 1;
let rmsCount = 0; // number of channel RMS values accumulated into sumRms
let sampleRate = 0;
let channels = 0;
tracks.forEach(track => {
if (!track.audioBuffer) return;
const duration = track.audioBuffer.duration;
maxDuration = Math.max(maxDuration, duration);
// Get sample rate and channels from first track
if (sampleRate === 0) {
sampleRate = track.audioBuffer.sampleRate;
channels = track.audioBuffer.numberOfChannels;
}
// Calculate peak and RMS from buffer
for (let ch = 0; ch < track.audioBuffer.numberOfChannels; ch++) {
const channelData = track.audioBuffer.getChannelData(ch);
let chPeak = 0;
let chRmsSum = 0;
for (let i = 0; i < channelData.length; i++) {
const abs = Math.abs(channelData[i]);
chPeak = Math.max(chPeak, abs);
chRmsSum += channelData[i] * channelData[i];
}
maxPeak = Math.max(maxPeak, chPeak);
minPeak = Math.min(minPeak, chPeak);
sumRms += Math.sqrt(chRmsSum / channelData.length);
rmsCount++;
}
});
const avgRms = rmsCount > 0 ? sumRms / rmsCount : 0; // average RMS over the channels actually analyzed
const peakDb = maxPeak > 0 ? 20 * Math.log10(maxPeak) : -Infinity;
const rmsDb = avgRms > 0 ? 20 * Math.log10(avgRms) : -Infinity;
const dynamicRange = peakDb - rmsDb;
return {
totalDuration: maxDuration,
longestTrack: maxDuration,
sampleRate,
channels,
bitDepth: 32, // Web Audio API uses 32-bit float
peakAmplitude: maxPeak,
rmsLevel: avgRms,
dynamicRange: dynamicRange > 0 ? dynamicRange : 0,
trackCount: tracks.length,
};
}, [tracks]);
const formatDuration = (seconds: number) => {
const mins = Math.floor(seconds / 60);
const secs = Math.floor(seconds % 60);
const ms = Math.floor((seconds % 1) * 1000);
return `${mins}:${secs.toString().padStart(2, '0')}.${ms.toString().padStart(3, '0')}`;
};
const formatDb = (linear: number) => {
if (linear === 0) return '-∞ dB';
const db = 20 * Math.log10(linear);
return db > -60 ? `${db.toFixed(1)} dB` : '-∞ dB';
};
return (
<div className={cn('w-full h-full bg-card/50 border-2 border-accent/50 rounded-lg p-3', className)}>
<div className="text-[10px] font-bold text-accent uppercase tracking-wider mb-3">
Audio Statistics
</div>
<div className="space-y-2 text-[10px]">
{/* File Info */}
<div className="space-y-1">
<div className="text-[9px] text-muted-foreground uppercase tracking-wide">Project Info</div>
<div className="grid grid-cols-2 gap-x-2 gap-y-1">
<div className="text-muted-foreground">Tracks:</div>
<div className="font-mono text-right">{stats.trackCount}</div>
<div className="text-muted-foreground">Duration:</div>
<div className="font-mono text-right">{formatDuration(stats.totalDuration)}</div>
<div className="text-muted-foreground">Sample Rate:</div>
<div className="font-mono text-right">{stats.sampleRate > 0 ? `${(stats.sampleRate / 1000).toFixed(1)} kHz` : 'N/A'}</div>
<div className="text-muted-foreground">Channels:</div>
<div className="font-mono text-right">{stats.channels > 0 ? (stats.channels === 1 ? 'Mono' : 'Stereo') : 'N/A'}</div>
<div className="text-muted-foreground">Bit Depth:</div>
<div className="font-mono text-right">{stats.bitDepth}-bit float</div>
</div>
</div>
{/* Divider */}
<div className="border-t border-border/30" />
{/* Audio Levels */}
<div className="space-y-1">
<div className="text-[9px] text-muted-foreground uppercase tracking-wide">Levels</div>
<div className="grid grid-cols-2 gap-x-2 gap-y-1">
<div className="text-muted-foreground">Peak:</div>
<div className={cn(
'font-mono text-right',
stats.peakAmplitude > 0.99 ? 'text-red-500 font-bold' : ''
)}>
{formatDb(stats.peakAmplitude)}
</div>
<div className="text-muted-foreground">RMS:</div>
<div className="font-mono text-right">{formatDb(stats.rmsLevel)}</div>
<div className="text-muted-foreground">Dynamic Range:</div>
<div className="font-mono text-right">
{stats.dynamicRange > 0 ? `${stats.dynamicRange.toFixed(1)} dB` : 'N/A'}
</div>
<div className="text-muted-foreground">Headroom:</div>
<div className={cn(
'font-mono text-right',
stats.peakAmplitude > 0.99 ? 'text-red-500' :
stats.peakAmplitude > 0.9 ? 'text-yellow-500' : 'text-green-500'
)}>
{stats.peakAmplitude > 0 ? `${(20 * Math.log10(1 / stats.peakAmplitude)).toFixed(1)} dB` : 'N/A'}
</div>
</div>
</div>
</div>
</div>
);
}

View File

@@ -0,0 +1,91 @@
'use client';
import * as React from 'react';
import { cn } from '@/lib/utils/cn';
export interface FrequencyAnalyzerProps {
analyserNode: AnalyserNode | null;
className?: string;
}
export function FrequencyAnalyzer({ analyserNode, className }: FrequencyAnalyzerProps) {
const canvasRef = React.useRef<HTMLCanvasElement>(null);
const animationFrameRef = React.useRef<number | undefined>(undefined);
React.useEffect(() => {
if (!analyserNode || !canvasRef.current) return;
const canvas = canvasRef.current;
const ctx = canvas.getContext('2d');
if (!ctx) return;
// Set canvas size
const dpr = window.devicePixelRatio || 1;
const rect = canvas.getBoundingClientRect();
canvas.width = rect.width * dpr;
canvas.height = rect.height * dpr;
ctx.scale(dpr, dpr);
const bufferLength = analyserNode.frequencyBinCount;
const dataArray = new Uint8Array(bufferLength);
// Get background color from computed styles
const bgColor = getComputedStyle(canvas.parentElement!).backgroundColor;
const draw = () => {
animationFrameRef.current = requestAnimationFrame(draw);
analyserNode.getByteFrequencyData(dataArray);
// Clear canvas with parent background color
ctx.fillStyle = bgColor;
ctx.fillRect(0, 0, rect.width, rect.height);
const barWidth = rect.width / bufferLength;
let x = 0;
for (let i = 0; i < bufferLength; i++) {
const barHeight = (dataArray[i] / 255) * rect.height;
// Color gradient based on frequency
const hue = (i / bufferLength) * 120; // 0–120 offset on the 180° base below: sweeps from cyan (180°) toward violet (300°)
ctx.fillStyle = `hsl(${180 + hue}, 70%, 50%)`;
ctx.fillRect(x, rect.height - barHeight, barWidth, barHeight);
x += barWidth;
}
// Draw frequency labels
ctx.fillStyle = 'rgba(255, 255, 255, 0.5)';
ctx.font = '10px monospace';
ctx.textAlign = 'left';
ctx.fillText('20Hz', 5, rect.height - 5);
ctx.textAlign = 'center';
ctx.fillText('1kHz', rect.width / 2, rect.height - 5);
ctx.textAlign = 'right';
ctx.fillText('20kHz', rect.width - 5, rect.height - 5);
};
draw();
return () => {
if (animationFrameRef.current) {
cancelAnimationFrame(animationFrameRef.current);
}
};
}, [analyserNode]);
return (
<div className={cn('w-full h-full bg-card/50 border-2 border-accent/50 rounded-lg p-2', className)}>
<div className="text-[10px] font-bold text-accent uppercase tracking-wider mb-2">
Frequency Analyzer
</div>
<div className="w-full h-[calc(100%-24px)] rounded bg-muted/30">
<canvas
ref={canvasRef}
className="w-full h-full rounded"
/>
</div>
</div>
);
}

View File

@@ -0,0 +1,167 @@
'use client';
import * as React from 'react';
import { cn } from '@/lib/utils/cn';
export interface LUFSMeterProps {
analyserNode: AnalyserNode | null;
className?: string;
}
export function LUFSMeter({ analyserNode, className }: LUFSMeterProps) {
const canvasRef = React.useRef<HTMLCanvasElement>(null);
const animationFrameRef = React.useRef<number | undefined>(undefined);
const [lufs, setLufs] = React.useState({ integrated: -23, shortTerm: -23, momentary: -23 });
const lufsHistoryRef = React.useRef<number[]>([]);
React.useEffect(() => {
if (!analyserNode || !canvasRef.current) return;
const canvas = canvasRef.current;
const ctx = canvas.getContext('2d');
if (!ctx) return;
// Set canvas size
const dpr = window.devicePixelRatio || 1;
const rect = canvas.getBoundingClientRect();
canvas.width = rect.width * dpr;
canvas.height = rect.height * dpr;
ctx.scale(dpr, dpr);
const bufferLength = analyserNode.frequencyBinCount;
const dataArray = new Uint8Array(bufferLength);
const draw = () => {
animationFrameRef.current = requestAnimationFrame(draw);
analyserNode.getByteFrequencyData(dataArray);
// Calculate RMS from frequency data
let sum = 0;
for (let i = 0; i < bufferLength; i++) {
const normalized = dataArray[i] / 255;
sum += normalized * normalized;
}
const rms = Math.sqrt(sum / bufferLength);
// Convert to LUFS approximation (simplified K-weighting)
// Real LUFS requires proper K-weighting filter, this is an approximation
let lufsValue = -23; // Silence baseline
if (rms > 0.0001) {
lufsValue = 20 * Math.log10(rms) - 0.691; // Simplified LUFS estimation
lufsValue = Math.max(-70, Math.min(0, lufsValue));
}
// Store history for integrated measurement
lufsHistoryRef.current.push(lufsValue);
if (lufsHistoryRef.current.length > 300) { // Keep last 10 seconds at 30fps
lufsHistoryRef.current.shift();
}
// Calculate measurements
const momentary = lufsValue; // Current value
const shortTerm = lufsHistoryRef.current.slice(-90).reduce((a, b) => a + b, 0) / Math.min(90, lufsHistoryRef.current.length); // Last 3 seconds
const integrated = lufsHistoryRef.current.reduce((a, b) => a + b, 0) / lufsHistoryRef.current.length; // All time
setLufs({ integrated, shortTerm, momentary });
// Clear canvas
const bgColor = getComputedStyle(canvas.parentElement!).backgroundColor;
ctx.fillStyle = bgColor;
ctx.fillRect(0, 0, rect.width, rect.height);
// Draw LUFS scale (-70 to 0)
const lufsToY = (lufs: number) => {
return ((0 - lufs) / 70) * rect.height;
};
// Draw reference lines
ctx.strokeStyle = 'rgba(128, 128, 128, 0.2)';
ctx.lineWidth = 1;
[-23, -16, -9, -3].forEach(db => {
const y = lufsToY(db);
ctx.beginPath();
ctx.moveTo(0, y);
ctx.lineTo(rect.width, y);
ctx.stroke();
// Labels
ctx.fillStyle = 'rgba(255, 255, 255, 0.4)';
ctx.font = '9px monospace';
ctx.textAlign = 'right';
ctx.fillText(`${db}`, rect.width - 2, y - 2);
});
// Draw -23 LUFS broadcast standard line
ctx.strokeStyle = 'rgba(59, 130, 246, 0.5)';
ctx.lineWidth = 2;
const standardY = lufsToY(-23);
ctx.beginPath();
ctx.moveTo(0, standardY);
ctx.lineTo(rect.width, standardY);
ctx.stroke();
// Draw bars
const barWidth = rect.width / 4;
const drawBar = (value: number, x: number, color: string, label: string) => {
const y = lufsToY(value);
const height = rect.height - y;
ctx.fillStyle = color;
ctx.fillRect(x, y, barWidth - 4, height);
// Label
ctx.fillStyle = 'rgba(255, 255, 255, 0.7)';
ctx.font = 'bold 9px monospace';
ctx.textAlign = 'center';
ctx.fillText(label, x + barWidth / 2 - 2, rect.height - 2);
};
drawBar(momentary, 0, 'rgba(239, 68, 68, 0.7)', 'M');
drawBar(shortTerm, barWidth, 'rgba(251, 146, 60, 0.7)', 'S');
drawBar(integrated, barWidth * 2, 'rgba(34, 197, 94, 0.7)', 'I');
};
draw();
return () => {
if (animationFrameRef.current) {
cancelAnimationFrame(animationFrameRef.current);
}
};
}, [analyserNode]);
return (
<div className={cn('w-full h-full bg-card/50 border-2 border-accent/50 rounded-lg p-2', className)}>
<div className="text-[10px] font-bold text-accent uppercase tracking-wider mb-2">
LUFS Loudness
</div>
<div className="w-full h-[calc(100%-24px)] rounded bg-muted/30 flex flex-col">
<canvas
ref={canvasRef}
className="w-full flex-1 rounded"
/>
<div className="grid grid-cols-3 gap-1 mt-2 text-[9px] font-mono text-center">
<div>
<div className="text-muted-foreground">Momentary</div>
<div className={cn('font-bold', lufs.momentary > -9 ? 'text-red-500' : 'text-foreground')}>
{lufs.momentary > -70 ? lufs.momentary.toFixed(1) : '-∞'}
</div>
</div>
<div>
<div className="text-muted-foreground">Short-term</div>
<div className={cn('font-bold', lufs.shortTerm > -16 ? 'text-orange-500' : 'text-foreground')}>
{lufs.shortTerm > -70 ? lufs.shortTerm.toFixed(1) : '-∞'}
</div>
</div>
<div>
<div className="text-muted-foreground">Integrated</div>
<div className={cn('font-bold', Math.abs(lufs.integrated + 23) < 2 ? 'text-green-500' : 'text-foreground')}>
{lufs.integrated > -70 ? lufs.integrated.toFixed(1) : '-∞'}
</div>
</div>
</div>
</div>
</div>
);
}

View File

@@ -0,0 +1,181 @@
'use client';
import * as React from 'react';
import { cn } from '@/lib/utils/cn';
export interface PhaseCorrelationMeterProps {
analyserNode: AnalyserNode | null;
className?: string;
}
export function PhaseCorrelationMeter({ analyserNode, className }: PhaseCorrelationMeterProps) {
const canvasRef = React.useRef<HTMLCanvasElement>(null);
const animationFrameRef = React.useRef<number | undefined>(undefined);
const [correlation, setCorrelation] = React.useState(0);
React.useEffect(() => {
if (!analyserNode || !canvasRef.current) return;
const canvas = canvasRef.current;
const ctx = canvas.getContext('2d');
if (!ctx) return;
// Set canvas size
const dpr = window.devicePixelRatio || 1;
const rect = canvas.getBoundingClientRect();
canvas.width = rect.width * dpr;
canvas.height = rect.height * dpr;
ctx.scale(dpr, dpr);
const audioContext = analyserNode.context as AudioContext;
const bufferLength = analyserNode.fftSize;
const dataArrayL = new Float32Array(bufferLength);
const dataArrayR = new Float32Array(bufferLength);
// Create a splitter to get L/R channels
const splitter = audioContext.createChannelSplitter(2);
const analyserL = audioContext.createAnalyser();
const analyserR = audioContext.createAnalyser();
analyserL.fftSize = bufferLength;
analyserR.fftSize = bufferLength;
// Try to connect to the analyser node's source
// Note: This is a simplified approach - ideally we'd get the source node
try {
analyserNode.connect(splitter);
splitter.connect(analyserL, 0);
splitter.connect(analyserR, 1);
} catch (e) {
// If connection fails, just show static display
}
const draw = () => {
animationFrameRef.current = requestAnimationFrame(draw);
try {
analyserL.getFloatTimeDomainData(dataArrayL);
analyserR.getFloatTimeDomainData(dataArrayR);
// Calculate phase correlation (Pearson correlation coefficient)
let sumL = 0, sumR = 0, sumLR = 0, sumL2 = 0, sumR2 = 0;
const n = bufferLength;
for (let i = 0; i < n; i++) {
sumL += dataArrayL[i];
sumR += dataArrayR[i];
sumLR += dataArrayL[i] * dataArrayR[i];
sumL2 += dataArrayL[i] * dataArrayL[i];
sumR2 += dataArrayR[i] * dataArrayR[i];
}
const meanL = sumL / n;
const meanR = sumR / n;
const covLR = (sumLR / n) - (meanL * meanR);
const varL = (sumL2 / n) - (meanL * meanL);
const varR = (sumR2 / n) - (meanR * meanR);
let r = 0;
if (varL > 0 && varR > 0) {
r = covLR / Math.sqrt(varL * varR);
r = Math.max(-1, Math.min(1, r)); // Clamp to [-1, 1]
}
setCorrelation(r);
// Clear canvas
const bgColor = getComputedStyle(canvas.parentElement!).backgroundColor;
ctx.fillStyle = bgColor;
ctx.fillRect(0, 0, rect.width, rect.height);
// Draw scale background
const centerY = rect.height / 2;
const barHeight = 20;
// Draw scale markers
ctx.fillStyle = 'rgba(128, 128, 128, 0.2)';
ctx.fillRect(0, centerY - barHeight / 2, rect.width, barHeight);
// Draw center line (0)
ctx.strokeStyle = 'rgba(128, 128, 128, 0.5)';
ctx.lineWidth = 1;
ctx.beginPath();
ctx.moveTo(rect.width / 2, centerY - barHeight / 2 - 5);
ctx.lineTo(rect.width / 2, centerY + barHeight / 2 + 5);
ctx.stroke();
// Draw correlation indicator
const x = ((r + 1) / 2) * rect.width;
// Color based on correlation value
let color;
if (r > 0.9) {
color = '#10b981'; // Green - good correlation (mono-ish)
} else if (r > 0.5) {
color = '#84cc16'; // Lime - moderate correlation
} else if (r > -0.5) {
color = '#eab308'; // Yellow - decorrelated (good stereo)
} else if (r > -0.9) {
color = '#f97316'; // Orange - negative correlation
} else {
color = '#ef4444'; // Red - phase issues
}
ctx.fillStyle = color;
ctx.fillRect(x - 2, centerY - barHeight / 2, 4, barHeight);
// Draw labels
ctx.fillStyle = 'rgba(255, 255, 255, 0.7)';
ctx.font = '9px monospace';
ctx.textAlign = 'left';
ctx.fillText('-1', 2, centerY - barHeight / 2 - 8);
ctx.textAlign = 'center';
ctx.fillText('0', rect.width / 2, centerY - barHeight / 2 - 8);
ctx.textAlign = 'right';
ctx.fillText('+1', rect.width - 2, centerY - barHeight / 2 - 8);
// Draw correlation value
ctx.textAlign = 'center';
ctx.font = 'bold 11px monospace';
ctx.fillText(r.toFixed(3), rect.width / 2, centerY + barHeight / 2 + 15);
} catch (e) {
// Silently handle errors
}
};
draw();
return () => {
if (animationFrameRef.current) {
cancelAnimationFrame(animationFrameRef.current);
}
try {
splitter.disconnect();
analyserL.disconnect();
analyserR.disconnect();
} catch (e) {
// Ignore disconnection errors
}
};
}, [analyserNode]);
return (
<div className={cn('w-full h-full bg-card/50 border-2 border-accent/50 rounded-lg p-2', className)}>
<div className="text-[10px] font-bold text-accent uppercase tracking-wider mb-2">
Phase Correlation
</div>
<div className="w-full h-[calc(100%-24px)] rounded bg-muted/30 flex flex-col items-center justify-center">
<canvas
ref={canvasRef}
className="w-full h-16 rounded"
/>
<div className="text-[9px] text-muted-foreground mt-2 text-center px-2">
{correlation > 0.9 ? 'Mono-like' :
correlation > 0.5 ? 'Good Stereo' :
correlation > -0.5 ? 'Wide Stereo' :
'Phase Issues'}
</div>
</div>
</div>
);
}

View File

@@ -0,0 +1,134 @@
'use client';
import * as React from 'react';
import { cn } from '@/lib/utils/cn';
export interface SpectrogramProps {
analyserNode: AnalyserNode | null;
className?: string;
}
export function Spectrogram({ analyserNode, className }: SpectrogramProps) {
const canvasRef = React.useRef<HTMLCanvasElement>(null);
const animationFrameRef = React.useRef<number | undefined>(undefined);
const spectrogramDataRef = React.useRef<ImageData | null>(null);
React.useEffect(() => {
if (!analyserNode || !canvasRef.current) return;
const canvas = canvasRef.current;
const ctx = canvas.getContext('2d');
if (!ctx) return;
// Set canvas size
const dpr = window.devicePixelRatio || 1;
const rect = canvas.getBoundingClientRect();
canvas.width = rect.width * dpr;
canvas.height = rect.height * dpr;
ctx.scale(dpr, dpr);
const bufferLength = analyserNode.frequencyBinCount;
const dataArray = new Uint8Array(bufferLength);
// Initialize spectrogram data
spectrogramDataRef.current = ctx.createImageData(rect.width, rect.height);
const draw = () => {
animationFrameRef.current = requestAnimationFrame(draw);
analyserNode.getByteFrequencyData(dataArray);
if (!spectrogramDataRef.current) return;
const imageData = spectrogramDataRef.current;
// Shift existing data to the left by 1 pixel
for (let y = 0; y < rect.height; y++) {
for (let x = 0; x < rect.width - 1; x++) {
const sourceIndex = ((y * rect.width) + x + 1) * 4;
const targetIndex = ((y * rect.width) + x) * 4;
imageData.data[targetIndex] = imageData.data[sourceIndex];
imageData.data[targetIndex + 1] = imageData.data[sourceIndex + 1];
imageData.data[targetIndex + 2] = imageData.data[sourceIndex + 2];
imageData.data[targetIndex + 3] = imageData.data[sourceIndex + 3];
}
}
// Add new column on the right
const x = rect.width - 1;
for (let y = 0; y < rect.height; y++) {
// Map frequency bins to canvas height (inverted)
const freqIndex = Math.floor((1 - y / rect.height) * bufferLength);
const value = dataArray[freqIndex];
// Color mapping with transparency: transparent (0) -> blue -> cyan -> green -> yellow -> red (255)
let r, g, b, a;
if (value < 64) {
// Transparent to blue
const t = value / 64;
r = 0;
g = 0;
b = Math.round(255 * t);
a = Math.round(255 * t);
} else if (value < 128) {
// Blue to cyan
r = 0;
g = (value - 64) * 4;
b = 255;
a = 255;
} else if (value < 192) {
// Cyan to green
r = 0;
g = 255;
b = 255 - (value - 128) * 4;
a = 255;
} else {
// Green to yellow to red
r = (value - 192) * 4;
g = 255;
b = 0;
a = 255;
}
const index = ((y * rect.width) + x) * 4;
imageData.data[index] = r;
imageData.data[index + 1] = g;
imageData.data[index + 2] = b;
imageData.data[index + 3] = a;
}
// Draw the spectrogram
ctx.putImageData(imageData, 0, 0);
// Draw frequency labels
ctx.fillStyle = 'rgba(255, 255, 255, 0.8)';
ctx.font = '10px monospace';
ctx.textAlign = 'left';
ctx.fillText('20kHz', 5, 12);
ctx.fillText('1kHz', 5, rect.height / 2);
ctx.fillText('20Hz', 5, rect.height - 5);
};
draw();
return () => {
if (animationFrameRef.current) {
cancelAnimationFrame(animationFrameRef.current);
}
};
}, [analyserNode]);
return (
<div className={cn('w-full h-full bg-card/50 border-2 border-accent/50 rounded-lg p-2', className)}>
<div className="text-[10px] font-bold text-accent uppercase tracking-wider mb-2">
Spectrogram
</div>
<div className="w-full h-[calc(100%-24px)] rounded bg-muted/30">
<canvas
ref={canvasRef}
className="w-full h-full rounded"
/>
</div>
</div>
);
}

View File

@@ -1,7 +1,7 @@
'use client';
import * as React from 'react';
import { Eye, EyeOff, ChevronDown, ChevronUp } from 'lucide-react';
import { Eye, EyeOff, ChevronDown, ChevronUp, Copy, Clipboard } from 'lucide-react';
import { Button } from '@/components/ui/Button';
import { cn } from '@/lib/utils/cn';
import type { AutomationMode } from '@/types/automation';
@@ -21,6 +21,9 @@ export interface AutomationHeaderProps {
availableParameters?: Array<{ id: string; name: string }>;
selectedParameterId?: string;
onParameterChange?: (parameterId: string) => void;
// Copy/Paste automation
onCopyAutomation?: () => void;
onPasteAutomation?: () => void;
}
const MODE_LABELS: Record<AutomationMode, string> = {
@@ -51,6 +54,8 @@ export function AutomationHeader({
availableParameters,
selectedParameterId,
onParameterChange,
onCopyAutomation,
onPasteAutomation,
}: AutomationHeaderProps) {
const modes: AutomationMode[] = ['read', 'write', 'touch', 'latch'];
const currentModeIndex = modes.indexOf(mode);
@@ -69,10 +74,13 @@ export function AutomationHeader({
return (
<div
className={cn(
'relative flex items-center gap-2 px-2 py-1 bg-muted/50 border-b border-border/30 flex-shrink-0',
'flex items-center gap-2 px-3 py-1.5 bg-muted border-t border-b border-border/30 flex-shrink-0',
className
)}
>
{/* Automation label - always visible */}
<span className="text-xs font-medium flex-shrink-0">Automation</span>
{/* Color indicator */}
{color && (
<div
@@ -142,6 +150,34 @@ export function AutomationHeader({
</div>
)}
{/* Copy/Paste automation controls */}
{(onCopyAutomation || onPasteAutomation) && (
<div className="flex gap-1 flex-shrink-0 ml-auto mr-8">
{onCopyAutomation && (
<Button
variant="ghost"
size="icon-sm"
onClick={onCopyAutomation}
title="Copy automation data"
className="h-5 w-5"
>
<Copy className="h-3 w-3" />
</Button>
)}
{onPasteAutomation && (
<Button
variant="ghost"
size="icon-sm"
onClick={onPasteAutomation}
title="Paste automation data"
className="h-5 w-5"
>
<Clipboard className="h-3 w-3" />
</Button>
)}
</div>
)}
{/* Show/hide toggle - Positioned absolutely on the right */}
<Button
variant="ghost"

View File

@@ -38,9 +38,9 @@ export function AutomationLane({
(time: number): number => {
if (!containerRef.current) return 0;
const width = containerRef.current.clientWidth;
return (time / duration) * width * zoom;
return (time / duration) * width;
},
[duration, zoom]
[duration]
);
// Convert value (0-1) to Y pixel position (inverted: 0 at bottom, 1 at top)
@@ -58,9 +58,9 @@ export function AutomationLane({
(x: number): number => {
if (!containerRef.current) return 0;
const width = containerRef.current.clientWidth;
return (x / (width * zoom)) * duration;
return (x / width) * duration;
},
[duration, zoom]
[duration]
);
// Convert Y pixel position to value (0-1)
@@ -209,7 +209,7 @@ export function AutomationLane({
const width = rect.width;
// Calculate new time and value
const timePerPixel = duration / (width * zoom);
const timePerPixel = duration / width;
const valuePerPixel = 1 / lane.height;
const newTime = Math.max(0, Math.min(duration, point.time + deltaX * timePerPixel));
@@ -217,7 +217,7 @@ export function AutomationLane({
onUpdatePoint(pointId, { time: newTime, value: newValue });
},
[lane.points, lane.height, duration, zoom, onUpdatePoint]
[lane.points, lane.height, duration, onUpdatePoint]
);
const handlePointDragEnd = React.useCallback(() => {

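The hunk above drops `zoom` from the lane's time/pixel conversions so automation always spans the full lane width. A minimal sketch of the resulting pure conversions and the drag update, assuming `duration` is the project length in seconds and `width` the lane's client width in pixels; these are illustrative standalone functions, not the component's API:

// Time (s) -> X (px) across a lane of the given pixel width.
const timeToX = (time: number, duration: number, width: number): number =>
  duration > 0 ? (time / duration) * width : 0;

// X (px) -> time (s); inverse of timeToX.
const xToTime = (x: number, duration: number, width: number): number =>
  width > 0 ? (x / width) * duration : 0;

// Dragging a point by deltaX pixels moves it by deltaX * (duration / width) seconds,
// clamped to [0, duration] as in handlePointDrag above.
const dragTime = (startTime: number, deltaX: number, duration: number, width: number): number =>
  Math.max(0, Math.min(duration, startTime + deltaX * (duration / width)));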
View File

@@ -1,8 +1,7 @@
'use client';
import * as React from 'react';
import { Volume2, VolumeX } from 'lucide-react';
import { Button } from '@/components/ui/Button';
import { ChevronDown, ChevronUp } from 'lucide-react';
import { CircularKnob } from '@/components/ui/CircularKnob';
import { MasterFader } from './MasterFader';
import { cn } from '@/lib/utils/cn';
@@ -14,10 +13,12 @@ export interface MasterControlsProps {
rmsLevel: number;
isClipping: boolean;
isMuted?: boolean;
collapsed?: boolean; // For collapsible on mobile/small screens
onVolumeChange: (volume: number) => void;
onPanChange: (pan: number) => void;
onMuteToggle: () => void;
onResetClip?: () => void;
onToggleCollapse?: () => void;
className?: string;
}
@@ -28,20 +29,81 @@ export function MasterControls({
rmsLevel,
isClipping,
isMuted = false,
collapsed = false,
onVolumeChange,
onPanChange,
onMuteToggle,
onResetClip,
onToggleCollapse,
className,
}: MasterControlsProps) {
// Collapsed view - minimal controls
if (collapsed) {
return (
<div className={cn(
'flex flex-col items-center gap-2 px-3 py-2 bg-card/50 border border-accent/50 rounded-lg w-full',
className
)}>
<div className="flex items-center justify-between w-full">
<div className="text-xs font-bold text-accent uppercase tracking-wider">
Master
</div>
{onToggleCollapse && (
<button
onClick={onToggleCollapse}
className="p-1 hover:bg-accent/20 rounded transition-colors"
title="Expand master controls"
>
<ChevronDown className="h-3 w-3 text-muted-foreground" />
</button>
)}
</div>
<div className="flex items-center gap-2 w-full justify-center">
<button
onClick={onMuteToggle}
className={cn(
'h-7 w-7 rounded-md flex items-center justify-center transition-all text-xs font-bold',
isMuted
? 'bg-blue-500 text-white shadow-md shadow-blue-500/30'
: 'bg-card hover:bg-accent text-muted-foreground border border-border/50'
)}
title={isMuted ? 'Unmute' : 'Mute'}
>
M
</button>
<div className="flex-1 h-2 bg-muted rounded-full overflow-hidden">
<div
className={cn(
'h-full transition-all',
peakLevel > 0.95 ? 'bg-red-500' : peakLevel > 0.8 ? 'bg-yellow-500' : 'bg-green-500'
)}
style={{ width: `${peakLevel * 100}%` }}
/>
</div>
</div>
</div>
);
}
return (
<div className={cn(
'flex flex-col items-center gap-3 px-4 py-3 bg-muted/10 border-2 border-accent/30 rounded-lg',
'flex flex-col items-center gap-3 px-4 py-3 bg-card/50 border-2 border-accent/50 rounded-lg',
className
)}>
{/* Master Label */}
<div className="text-[10px] font-bold text-accent uppercase tracking-wider">
Master
{/* Master Label with collapse button */}
<div className="flex items-center justify-between w-full">
<div className="text-[10px] font-bold text-accent uppercase tracking-wider flex-1 text-center">
Master
</div>
{onToggleCollapse && (
<button
onClick={onToggleCollapse}
className="p-0.5 hover:bg-accent/20 rounded transition-colors flex-shrink-0 lg:hidden"
title="Collapse master controls"
>
<ChevronUp className="h-3 w-3 text-muted-foreground" />
</button>
)}
</div>
{/* Pan Control */}
@@ -53,7 +115,7 @@ export function MasterControls({
step={0.01}
label="PAN"
size={48}
formatter={(value) => {
formatValue={(value: number) => {
if (Math.abs(value) < 0.01) return 'C';
if (value < 0) return `${Math.abs(value * 100).toFixed(0)}L`;
return `${(value * 100).toFixed(0)}R`;
@@ -71,22 +133,18 @@ export function MasterControls({
/>
{/* Mute Button */}
<Button
variant={isMuted ? 'default' : 'outline'}
size="sm"
<button
onClick={onMuteToggle}
title={isMuted ? 'Unmute' : 'Mute'}
className={cn(
'w-full h-8',
isMuted && 'bg-red-500/20 hover:bg-red-500/30 border-red-500/50'
'h-8 w-8 rounded-md flex items-center justify-center transition-all text-[11px] font-bold',
isMuted
? 'bg-blue-500 text-white shadow-md shadow-blue-500/30'
: 'bg-card hover:bg-accent text-muted-foreground border border-border/50'
)}
title={isMuted ? 'Unmute' : 'Mute'}
>
{isMuted ? (
<VolumeX className="h-4 w-4" />
) : (
<Volume2 className="h-4 w-4" />
)}
</Button>
M
</button>
</div>
);
}

View File

@@ -10,6 +10,8 @@ export interface MasterFaderProps {
isClipping: boolean;
onChange: (value: number) => void;
onResetClip?: () => void;
onTouchStart?: () => void;
onTouchEnd?: () => void;
className?: string;
}
@@ -20,6 +22,8 @@ export function MasterFader({
isClipping,
onChange,
onResetClip,
onTouchStart,
onTouchEnd,
className,
}: MasterFaderProps) {
const [isDragging, setIsDragging] = React.useState(false);
@@ -43,6 +47,7 @@ export function MasterFader({
const handleMouseDown = (e: React.MouseEvent) => {
e.preventDefault();
setIsDragging(true);
onTouchStart?.();
updateValue(e.clientY);
};
@@ -56,31 +61,67 @@ export function MasterFader({
const handleMouseUp = React.useCallback(() => {
setIsDragging(false);
}, []);
onTouchEnd?.();
}, [onTouchEnd]);
const handleTouchStart = (e: React.TouchEvent) => {
e.preventDefault();
const touch = e.touches[0];
setIsDragging(true);
onTouchStart?.();
updateValue(touch.clientY);
};
const handleTouchMove = React.useCallback(
(e: TouchEvent) => {
if (!isDragging || e.touches.length === 0) return;
const touch = e.touches[0];
updateValue(touch.clientY);
},
[isDragging]
);
const handleTouchEnd = React.useCallback(() => {
setIsDragging(false);
onTouchEnd?.();
}, [onTouchEnd]);
const updateValue = (clientY: number) => {
if (!containerRef.current) return;
const rect = containerRef.current.getBoundingClientRect();
const y = clientY - rect.top;
// Track has 32px (2rem) padding on top and bottom (top-8 bottom-8)
const trackPadding = 32;
const trackHeight = rect.height - (trackPadding * 2);
// Clamp y to track bounds
const clampedY = Math.max(trackPadding, Math.min(rect.height - trackPadding, y));
// Inverted: top = max (1), bottom = min (0)
const percentage = Math.max(0, Math.min(1, 1 - (y / rect.height)));
onChange(percentage);
// Map clampedY from [trackPadding, height-trackPadding] to [1, 0]
const percentage = 1 - ((clampedY - trackPadding) / trackHeight);
onChange(Math.max(0, Math.min(1, percentage)));
};
React.useEffect(() => {
if (isDragging) {
window.addEventListener('mousemove', handleMouseMove);
window.addEventListener('mouseup', handleMouseUp);
window.addEventListener('touchmove', handleTouchMove);
window.addEventListener('touchend', handleTouchEnd);
return () => {
window.removeEventListener('mousemove', handleMouseMove);
window.removeEventListener('mouseup', handleMouseUp);
window.removeEventListener('touchmove', handleTouchMove);
window.removeEventListener('touchend', handleTouchEnd);
};
}
}, [isDragging, handleMouseMove, handleMouseUp]);
}, [isDragging, handleMouseMove, handleMouseUp, handleTouchMove, handleTouchEnd]);
return (
<div className={cn('flex gap-3', className)}>
<div className={cn('flex gap-3', className)} style={{ marginLeft: '16px' }}>
{/* dB Labels (Left) */}
<div className="flex flex-col justify-between text-[10px] font-mono text-muted-foreground py-1">
<span>0</span>
@@ -94,6 +135,7 @@ export function MasterFader({
ref={containerRef}
className="relative w-12 h-40 bg-background/50 rounded-md border border-border/50 cursor-pointer"
onMouseDown={handleMouseDown}
onTouchStart={handleTouchStart}
>
{/* Peak Meter (Horizontal Bar - Top) */}
<div className="absolute inset-x-2 top-2 h-3 bg-background/80 rounded-sm overflow-hidden border border-border/30">
@@ -118,9 +160,9 @@ export function MasterFader({
>
<div className={cn(
'w-full h-full',
rmsDb > -3 ? 'bg-red-400' :
rmsDb > -6 ? 'bg-yellow-400' :
'bg-green-400'
rmsDb > -3 ? 'bg-red-500' :
rmsDb > -6 ? 'bg-yellow-500' :
'bg-green-500'
)} />
</div>
</div>
@@ -132,8 +174,10 @@ export function MasterFader({
<div
className="absolute left-1/2 -translate-x-1/2 w-10 h-4 bg-primary/80 border-2 border-primary rounded-md shadow-lg cursor-grab active:cursor-grabbing pointer-events-none transition-all"
style={{
// Inverted: value 1 = top, value 0 = bottom
top: `calc(${(1 - value) * 100}% - 0.5rem)`,
// Inverted: value 1 = top of track (20%), value 0 = bottom of track (80%)
// Track has top-8 bottom-8 padding (20% and 80% of h-40 container)
// Handle moves within 60% range (from 20% to 80%)
top: `calc(${20 + (1 - value) * 60}% - 0.5rem)`,
}}
>
{/* Handle grip lines */}
@@ -169,7 +213,7 @@ export function MasterFader({
</div>
{/* Value and Level Display (Right) */}
<div className="flex flex-col justify-between items-start text-[9px] font-mono py-1">
<div className="flex flex-col justify-between items-start text-[9px] font-mono py-1 w-[36px]">
{/* Current dB Value */}
<div className={cn(
'font-bold text-[11px]',
@@ -201,9 +245,9 @@ export function MasterFader({
<span className="text-muted-foreground/60">RM</span>
<span className={cn(
'font-mono text-[10px]',
rmsDb > -3 ? 'text-red-400' :
rmsDb > -6 ? 'text-yellow-400' :
'text-green-400'
rmsDb > -3 ? 'text-red-500' :
rmsDb > -6 ? 'text-yellow-500' :
'text-green-500'
)}>
{rmsDb > -60 ? `${rmsDb.toFixed(1)}` : '-∞'}
</span>
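The new `updateValue` logic maps a pointer's Y position into the 0-1 fader range while respecting the 32 px of track padding (`top-8 bottom-8`). A minimal sketch of the same mapping as a pure function, assuming the container's `DOMRect`; `faderValueFromY` is an illustrative name:

// Maps a clientY coordinate to a fader value in [0, 1].
// trackPadding mirrors the top-8/bottom-8 (32 px) padding on the fader track.
function faderValueFromY(clientY: number, rect: DOMRect, trackPadding = 32): number {
  const y = clientY - rect.top;
  const trackHeight = rect.height - trackPadding * 2;
  // Clamp to the usable track area so dragging past either end pins the value.
  const clampedY = Math.max(trackPadding, Math.min(rect.height - trackPadding, y));
  // Inverted: top of the track = 1, bottom = 0.
  const value = 1 - (clampedY - trackPadding) / trackHeight;
  return Math.max(0, Math.min(1, value));
}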

View File

@@ -0,0 +1,130 @@
'use client';
import * as React from 'react';
import { AlertTriangle, XCircle, Info, X } from 'lucide-react';
import { Modal } from '@/components/ui/Modal';
import { Button } from '@/components/ui/Button';
import { getBrowserInfo } from '@/lib/utils/browser-compat';
interface BrowserCompatDialogProps {
open: boolean;
missingFeatures: string[];
warnings: string[];
onClose: () => void;
}
export function BrowserCompatDialog({
open,
missingFeatures,
warnings,
onClose,
}: BrowserCompatDialogProps) {
const [browserInfo, setBrowserInfo] = React.useState({ name: 'Unknown', version: 'Unknown' });
const hasErrors = missingFeatures.length > 0;
// Get browser info only on client side
React.useEffect(() => {
setBrowserInfo(getBrowserInfo());
}, []);
if (!open) return null;
return (
<Modal open={open} onClose={onClose} title="">
<div className="p-6 max-w-md">
{/* Header */}
<div className="flex items-start justify-between mb-4">
<div className="flex items-center gap-2">
{hasErrors ? (
<>
<XCircle className="h-5 w-5 text-destructive" />
<h2 className="text-lg font-semibold">Browser Not Supported</h2>
</>
) : (
<>
<AlertTriangle className="h-5 w-5 text-yellow-500" />
<h2 className="text-lg font-semibold">Browser Warnings</h2>
</>
)}
</div>
<button onClick={onClose} className="text-muted-foreground hover:text-foreground">
<X className="h-4 w-4" />
</button>
</div>
<p className="text-sm text-muted-foreground mb-4">
{hasErrors ? (
<>Your browser is missing required features to run this audio editor.</>
) : (
<>Some features may not work as expected in your browser.</>
)}
</p>
<div className="space-y-4">
{/* Browser Info */}
<div className="flex items-center gap-2 text-sm text-muted-foreground">
<Info className="h-4 w-4" />
<span>
{browserInfo.name} {browserInfo.version}
</span>
</div>
{/* Missing Features */}
{missingFeatures.length > 0 && (
<div className="space-y-2">
<h3 className="text-sm font-semibold text-destructive flex items-center gap-2">
<XCircle className="h-4 w-4" />
Missing Required Features:
</h3>
<ul className="list-disc list-inside space-y-1 text-sm text-muted-foreground">
{missingFeatures.map((feature) => (
<li key={feature}>{feature}</li>
))}
</ul>
</div>
)}
{/* Warnings */}
{warnings.length > 0 && (
<div className="space-y-2">
<h3 className="text-sm font-semibold text-yellow-600 dark:text-yellow-500 flex items-center gap-2">
<AlertTriangle className="h-4 w-4" />
Warnings:
</h3>
<ul className="list-disc list-inside space-y-1 text-sm text-muted-foreground">
{warnings.map((warning) => (
<li key={warning}>{warning}</li>
))}
</ul>
</div>
)}
{/* Recommendations */}
{hasErrors && (
<div className="bg-muted/50 border border-border rounded-md p-3 space-y-2">
<h3 className="text-sm font-semibold">Recommended Browsers:</h3>
<ul className="text-sm text-muted-foreground space-y-1">
<li> Chrome 90+ or Edge 90+</li>
<li> Firefox 88+</li>
<li> Safari 14+</li>
</ul>
</div>
)}
{/* Actions */}
<div className="flex justify-end gap-2">
{hasErrors ? (
<Button onClick={onClose} variant="destructive">
Close
</Button>
) : (
<Button onClick={onClose}>
Continue Anyway
</Button>
)}
</div>
</div>
</div>
</Modal>
);
}

View File

@@ -6,8 +6,10 @@ import { Button } from '@/components/ui/Button';
import { cn } from '@/lib/utils/cn';
export interface ExportSettings {
format: 'wav';
format: 'wav' | 'mp3';
scope: 'project' | 'selection' | 'tracks'; // Export scope
bitDepth: 16 | 24 | 32;
bitrate: number; // For MP3: 128, 192, 256, 320 kbps
normalize: boolean;
filename: string;
}
@@ -17,12 +19,15 @@ export interface ExportDialogProps {
onClose: () => void;
onExport: (settings: ExportSettings) => void;
isExporting?: boolean;
hasSelection?: boolean; // Whether any track has a selection
}
export function ExportDialog({ open, onClose, onExport, isExporting }: ExportDialogProps) {
export function ExportDialog({ open, onClose, onExport, isExporting, hasSelection }: ExportDialogProps) {
const [settings, setSettings] = React.useState<ExportSettings>({
format: 'wav',
scope: 'project',
bitDepth: 16,
bitrate: 192, // Default MP3 bitrate
normalize: true,
filename: 'mix',
});
@@ -62,7 +67,9 @@ export function ExportDialog({ open, onClose, onExport, isExporting }: ExportDia
className="w-full px-3 py-2 bg-background border border-border rounded text-foreground focus:outline-none focus:ring-2 focus:ring-primary"
disabled={isExporting}
/>
<p className="text-xs text-muted-foreground mt-1">.wav will be added automatically</p>
<p className="text-xs text-muted-foreground mt-1">
.{settings.format} will be added automatically
</p>
</div>
{/* Format */}
@@ -72,38 +79,92 @@ export function ExportDialog({ open, onClose, onExport, isExporting }: ExportDia
</label>
<select
value={settings.format}
onChange={(e) => setSettings({ ...settings, format: e.target.value as 'wav' })}
onChange={(e) => setSettings({ ...settings, format: e.target.value as 'wav' | 'mp3' })}
className="w-full px-3 py-2 bg-background border border-border rounded text-foreground focus:outline-none focus:ring-2 focus:ring-primary"
disabled={isExporting}
>
<option value="wav">WAV (Uncompressed)</option>
<option value="wav">WAV (Lossless, Uncompressed)</option>
<option value="mp3">MP3 (Lossy, Compressed)</option>
</select>
</div>
{/* Bit Depth */}
{/* Export Scope */}
<div>
<label className="block text-sm font-medium text-foreground mb-2">
Bit Depth
Export Scope
</label>
<div className="flex gap-2">
{[16, 24, 32].map((depth) => (
<button
key={depth}
onClick={() => setSettings({ ...settings, bitDepth: depth as 16 | 24 | 32 })}
className={cn(
'flex-1 px-3 py-2 rounded text-sm font-medium transition-colors',
settings.bitDepth === depth
? 'bg-primary text-primary-foreground'
: 'bg-background border border-border text-foreground hover:bg-accent'
)}
disabled={isExporting}
>
{depth}-bit {depth === 32 && '(Float)'}
</button>
))}
</div>
<select
value={settings.scope}
onChange={(e) => setSettings({ ...settings, scope: e.target.value as 'project' | 'selection' | 'tracks' })}
className="w-full px-3 py-2 bg-background border border-border rounded text-foreground focus:outline-none focus:ring-2 focus:ring-primary"
disabled={isExporting}
>
<option value="project">Entire Project (Mix All Tracks)</option>
<option value="selection" disabled={!hasSelection}>
Selected Region {!hasSelection && '(No selection)'}
</option>
<option value="tracks">Individual Tracks (Separate Files)</option>
</select>
<p className="text-xs text-muted-foreground mt-1">
{settings.scope === 'project' && 'Mix all tracks into a single file'}
{settings.scope === 'selection' && 'Export only the selected region'}
{settings.scope === 'tracks' && 'Export each track as a separate file'}
</p>
</div>
{/* Bit Depth (WAV only) */}
{settings.format === 'wav' && (
<div>
<label className="block text-sm font-medium text-foreground mb-2">
Bit Depth
</label>
<div className="flex gap-2">
{[16, 24, 32].map((depth) => (
<button
key={depth}
onClick={() => setSettings({ ...settings, bitDepth: depth as 16 | 24 | 32 })}
className={cn(
'flex-1 px-3 py-2 rounded text-sm font-medium transition-colors',
settings.bitDepth === depth
? 'bg-primary text-primary-foreground'
: 'bg-background border border-border text-foreground hover:bg-accent'
)}
disabled={isExporting}
>
{depth}-bit {depth === 32 && '(Float)'}
</button>
))}
</div>
</div>
)}
{/* MP3 Bitrate */}
{settings.format === 'mp3' && (
<div>
<label className="block text-sm font-medium text-foreground mb-2">
Bitrate
</label>
<div className="flex gap-2">
{[128, 192, 256, 320].map((rate) => (
<button
key={rate}
onClick={() => setSettings({ ...settings, bitrate: rate })}
className={cn(
'flex-1 px-3 py-2 rounded text-sm font-medium transition-colors',
settings.bitrate === rate
? 'bg-primary text-primary-foreground'
: 'bg-background border border-border text-foreground hover:bg-accent'
)}
disabled={isExporting}
>
{rate} kbps
</button>
))}
</div>
</div>
)}
{/* Normalize */}
<div>
<label className="flex items-center gap-2 cursor-pointer">

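Since the dialog now appends `.wav` or `.mp3` and only one quality field applies per format (bit depth for WAV, bitrate for MP3), the effective output can be summarised with a small helper. A minimal sketch using just the fields it needs; `describeExport` is a hypothetical helper, not part of the dialog:

// Builds the output name and the quality setting that actually applies to the chosen format.
function describeExport(settings: {
  filename: string;
  format: 'wav' | 'mp3';
  bitDepth: 16 | 24 | 32;
  bitrate: number;
}): string {
  const name = `${settings.filename}.${settings.format}`; // extension appended automatically
  return settings.format === 'wav'
    ? `${name} (${settings.bitDepth}-bit${settings.bitDepth === 32 ? ' float' : ''})`
    : `${name} (${settings.bitrate} kbps)`;
}

// describeExport({ filename: 'mix', format: 'mp3', bitDepth: 16, bitrate: 192 }) -> 'mix.mp3 (192 kbps)'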
View File

@@ -0,0 +1,156 @@
'use client';
import { useState } from 'react';
import { ImportOptions } from '@/lib/audio/decoder';
export interface ImportDialogProps {
open: boolean;
onClose: () => void;
onImport: (options: ImportOptions) => void;
fileName?: string;
sampleRate?: number;
channels?: number;
}
export function ImportDialog({
open,
onClose,
onImport,
fileName,
sampleRate: originalSampleRate,
channels: originalChannels,
}: ImportDialogProps) {
const [options, setOptions] = useState<ImportOptions>({
convertToMono: false,
targetSampleRate: undefined,
normalizeOnImport: false,
});
// Don't render if not open (early return placed after the hook so the Rules of Hooks hold)
if (!open) return null;
const handleImport = () => {
onImport(options);
};
const sampleRateOptions = [44100, 48000, 88200, 96000, 176400, 192000];
return (
<div className="fixed inset-0 bg-black/50 flex items-center justify-center z-50">
<div className="bg-white dark:bg-gray-800 rounded-lg p-6 w-full max-w-md shadow-xl">
<h2 className="text-xl font-bold mb-4 text-gray-900 dark:text-white">
Import Audio File
</h2>
<div className="mb-4">
<div className="text-sm text-gray-600 dark:text-gray-400 mb-2">
<strong>File:</strong> {fileName}
</div>
{originalSampleRate && (
<div className="text-sm text-gray-600 dark:text-gray-400 mb-1">
<strong>Sample Rate:</strong> {originalSampleRate} Hz
</div>
)}
{originalChannels && (
<div className="text-sm text-gray-600 dark:text-gray-400 mb-3">
<strong>Channels:</strong> {originalChannels === 1 ? 'Mono' : originalChannels === 2 ? 'Stereo' : `${originalChannels} channels`}
</div>
)}
</div>
<div className="space-y-4">
{/* Convert to Mono */}
{originalChannels && originalChannels > 1 && (
<div>
<label className="flex items-center space-x-2">
<input
type="checkbox"
checked={options.convertToMono}
onChange={(e) => setOptions({ ...options, convertToMono: e.target.checked })}
className="rounded border-gray-300 text-blue-600 focus:ring-blue-500"
/>
<span className="text-sm text-gray-700 dark:text-gray-300">
Convert to Mono
</span>
</label>
<p className="text-xs text-gray-500 dark:text-gray-400 mt-1 ml-6">
Mix all channels equally into a single mono channel
</p>
</div>
)}
{/* Resample */}
<div>
<label className="flex items-center space-x-2 mb-2">
<input
type="checkbox"
checked={options.targetSampleRate !== undefined}
onChange={(e) => setOptions({
...options,
targetSampleRate: e.target.checked ? 48000 : undefined
})}
className="rounded border-gray-300 text-blue-600 focus:ring-blue-500"
/>
<span className="text-sm text-gray-700 dark:text-gray-300">
Resample Audio
</span>
</label>
{options.targetSampleRate !== undefined && (
<select
value={options.targetSampleRate}
onChange={(e) => setOptions({
...options,
targetSampleRate: parseInt(e.target.value)
})}
className="ml-6 w-full max-w-xs px-3 py-1.5 text-sm border border-gray-300 dark:border-gray-600 rounded bg-white dark:bg-gray-700 text-gray-900 dark:text-white focus:outline-none focus:ring-2 focus:ring-blue-500"
>
{sampleRateOptions.map((rate) => (
<option key={rate} value={rate}>
{rate} Hz {rate === originalSampleRate ? '(original)' : ''}
</option>
))}
</select>
)}
<p className="text-xs text-gray-500 dark:text-gray-400 mt-1 ml-6">
Convert to a different sample rate (may affect quality)
</p>
</div>
{/* Normalize */}
<div>
<label className="flex items-center space-x-2">
<input
type="checkbox"
checked={options.normalizeOnImport}
onChange={(e) => setOptions({ ...options, normalizeOnImport: e.target.checked })}
className="rounded border-gray-300 text-blue-600 focus:ring-blue-500"
/>
<span className="text-sm text-gray-700 dark:text-gray-300">
Normalize on Import
</span>
</label>
<p className="text-xs text-gray-500 dark:text-gray-400 mt-1 ml-6">
Adjust peak amplitude to 99% (1% headroom)
</p>
</div>
</div>
<div className="flex justify-end space-x-3 mt-6">
<button
onClick={onClose}
className="px-4 py-2 text-sm font-medium text-gray-700 dark:text-gray-300 bg-gray-100 dark:bg-gray-700 hover:bg-gray-200 dark:hover:bg-gray-600 rounded transition-colors"
>
Cancel
</button>
<button
onClick={handleImport}
className="px-4 py-2 text-sm font-medium text-white bg-blue-600 hover:bg-blue-700 rounded transition-colors"
>
Import
</button>
</div>
</div>
</div>
);
}
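The import options translate into two buffer transforms: an equal-weight downmix to mono and a peak normalisation to 99 % (1 % headroom), as the helper text describes. A minimal sketch of both, assuming Web Audio `AudioBuffer` input and a plain `Float32Array` result; the project's own decoder in `@/lib/audio/decoder` may implement these differently:

// Equal-weight downmix of all channels into a single mono Float32Array.
function downmixToMono(buffer: AudioBuffer): Float32Array {
  const out = new Float32Array(buffer.length);
  for (let ch = 0; ch < buffer.numberOfChannels; ch++) {
    const data = buffer.getChannelData(ch);
    for (let i = 0; i < data.length; i++) out[i] += data[i] / buffer.numberOfChannels;
  }
  return out;
}

// Scales samples so the peak sits at 99% of full scale (1% headroom), as the dialog describes.
function normalizePeak(samples: Float32Array, target = 0.99): Float32Array {
  let peak = 0;
  for (let i = 0; i < samples.length; i++) peak = Math.max(peak, Math.abs(samples[i]));
  if (peak === 0) return samples;
  const gain = target / peak;
  for (let i = 0; i < samples.length; i++) samples[i] *= gain;
  return samples;
}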

View File

@@ -0,0 +1,140 @@
'use client';
import * as React from 'react';
import { Keyboard, X } from 'lucide-react';
import { Modal } from '@/components/ui/Modal';
import { Button } from '@/components/ui/Button';
import { cn } from '@/lib/utils/cn';
export interface KeyboardShortcutsDialogProps {
open: boolean;
onClose: () => void;
}
interface ShortcutCategory {
name: string;
shortcuts: Array<{
keys: string[];
description: string;
}>;
}
const SHORTCUTS: ShortcutCategory[] = [
{
name: 'Playback',
shortcuts: [
{ keys: ['Space'], description: 'Play / Pause' },
{ keys: ['Home'], description: 'Go to Start' },
{ keys: ['End'], description: 'Go to End' },
{ keys: ['←'], description: 'Seek Backward' },
{ keys: ['→'], description: 'Seek Forward' },
{ keys: ['Ctrl', '←'], description: 'Seek Backward 5s' },
{ keys: ['Ctrl', '→'], description: 'Seek Forward 5s' },
],
},
{
name: 'Edit',
shortcuts: [
{ keys: ['Ctrl', 'Z'], description: 'Undo' },
{ keys: ['Ctrl', 'Shift', 'Z'], description: 'Redo' },
{ keys: ['Ctrl', 'X'], description: 'Cut' },
{ keys: ['Ctrl', 'C'], description: 'Copy' },
{ keys: ['Ctrl', 'V'], description: 'Paste' },
{ keys: ['Delete'], description: 'Delete Selection' },
{ keys: ['Ctrl', 'D'], description: 'Duplicate' },
{ keys: ['Ctrl', 'A'], description: 'Select All' },
],
},
{
name: 'View',
shortcuts: [
{ keys: ['Ctrl', '+'], description: 'Zoom In' },
{ keys: ['Ctrl', '-'], description: 'Zoom Out' },
{ keys: ['Ctrl', '0'], description: 'Fit to View' },
],
},
{
name: 'File',
shortcuts: [
{ keys: ['Ctrl', 'S'], description: 'Save Project' },
{ keys: ['Ctrl', 'K'], description: 'Open Command Palette' },
],
},
];
function KeyboardKey({ keyName }: { keyName: string }) {
return (
<kbd className="px-2 py-1 text-xs font-semibold bg-muted border border-border rounded shadow-sm min-w-[2rem] text-center inline-block">
{keyName}
</kbd>
);
}
export function KeyboardShortcutsDialog({ open, onClose }: KeyboardShortcutsDialogProps) {
if (!open) return null;
return (
<Modal open={open} onClose={onClose} title="">
<div className="p-6 max-w-2xl">
{/* Header */}
<div className="flex items-start justify-between mb-6">
<div className="flex items-center gap-3">
<Keyboard className="h-6 w-6 text-primary" />
<h2 className="text-xl font-semibold">Keyboard Shortcuts</h2>
</div>
<button onClick={onClose} className="text-muted-foreground hover:text-foreground">
<X className="h-5 w-5" />
</button>
</div>
{/* Shortcuts Grid */}
<div className="grid grid-cols-1 md:grid-cols-2 gap-6">
{SHORTCUTS.map((category) => (
<div key={category.name} className="space-y-3">
<h3 className="text-sm font-semibold text-primary border-b border-border pb-2">
{category.name}
</h3>
<div className="space-y-2">
{category.shortcuts.map((shortcut, index) => (
<div
key={index}
className="flex items-center justify-between gap-4 py-1.5"
>
<span className="text-sm text-foreground flex-1">
{shortcut.description}
</span>
<div className="flex items-center gap-1 flex-shrink-0">
{shortcut.keys.map((key, keyIndex) => (
<React.Fragment key={keyIndex}>
{keyIndex > 0 && (
<span className="text-muted-foreground text-xs mx-0.5">+</span>
)}
<KeyboardKey keyName={key} />
</React.Fragment>
))}
</div>
</div>
))}
</div>
</div>
))}
</div>
{/* Footer */}
<div className="mt-6 pt-4 border-t border-border">
<p className="text-xs text-muted-foreground text-center">
Press <KeyboardKey keyName="Ctrl" /> + <KeyboardKey keyName="K" /> to open the
command palette and search for more actions
</p>
</div>
{/* Close Button */}
<div className="mt-6 flex justify-end">
<Button onClick={onClose} variant="default">
Close
</Button>
</div>
</div>
</Modal>
);
}

View File

@@ -0,0 +1,101 @@
'use client';
import * as React from 'react';
import { AlertTriangle, Info, X } from 'lucide-react';
import { Modal } from '@/components/ui/Modal';
import { Button } from '@/components/ui/Button';
import { formatMemorySize } from '@/lib/utils/memory-limits';
interface MemoryWarningDialogProps {
open: boolean;
estimatedMemoryMB: number;
availableMemoryMB?: number;
warning: string;
fileName?: string;
onContinue: () => void;
onCancel: () => void;
}
export function MemoryWarningDialog({
open,
estimatedMemoryMB,
availableMemoryMB,
warning,
fileName,
onContinue,
onCancel,
}: MemoryWarningDialogProps) {
if (!open) return null;
const estimatedBytes = estimatedMemoryMB * 1024 * 1024;
const availableBytes = availableMemoryMB ? availableMemoryMB * 1024 * 1024 : undefined;
return (
<Modal open={open} onClose={onCancel} title="">
<div className="p-6 max-w-md">
{/* Header */}
<div className="flex items-start justify-between mb-4">
<div className="flex items-center gap-2">
<AlertTriangle className="h-5 w-5 text-yellow-500" />
<h2 className="text-lg font-semibold">Memory Warning</h2>
</div>
<button onClick={onCancel} className="text-muted-foreground hover:text-foreground">
<X className="h-4 w-4" />
</button>
</div>
<p className="text-sm text-muted-foreground mb-4">
{warning}
</p>
<div className="space-y-4">
{/* File Info */}
{fileName && (
<div className="flex items-center gap-2 text-sm">
<Info className="h-4 w-4 text-muted-foreground" />
<span className="font-medium">File:</span>
<span className="text-muted-foreground truncate">{fileName}</span>
</div>
)}
{/* Memory Details */}
<div className="bg-muted/50 border border-border rounded-md p-3 space-y-2">
<div className="flex justify-between text-sm">
<span className="text-muted-foreground">Estimated Memory:</span>
<span className="font-medium">{formatMemorySize(estimatedBytes)}</span>
</div>
{availableBytes && (
<div className="flex justify-between text-sm">
<span className="text-muted-foreground">Available Memory:</span>
<span className="font-medium">{formatMemorySize(availableBytes)}</span>
</div>
)}
</div>
{/* Warning Message */}
<div className="bg-yellow-500/10 border border-yellow-500/20 rounded-md p-3">
<p className="text-sm text-yellow-700 dark:text-yellow-400">
<strong>Note:</strong> Loading large files may cause performance issues or browser crashes,
especially on devices with limited memory. Consider:
</p>
<ul className="mt-2 text-sm text-yellow-700 dark:text-yellow-400 space-y-1 list-disc list-inside">
<li>Closing other browser tabs</li>
<li>Using a shorter audio file</li>
<li>Splitting large files into smaller segments</li>
</ul>
</div>
{/* Actions */}
<div className="flex justify-end gap-2">
<Button onClick={onCancel} variant="outline">
Cancel
</Button>
<Button onClick={onContinue} variant="default">
Continue Anyway
</Button>
</div>
</div>
</div>
</Modal>
);
}
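The estimate shown in this dialog boils down to the size of the decoded PCM: decoded audio is held as 32-bit float samples, so the footprint is roughly duration × sampleRate × channels × 4 bytes. A minimal sketch of that calculation; the project's `memory-limits` helpers may compute it differently:

// Rough decoded-PCM footprint: 4 bytes per Float32 sample, per channel.
function estimateDecodedMemoryMB(durationSeconds: number, sampleRate: number, channels: number): number {
  const bytes = durationSeconds * sampleRate * channels * 4;
  return bytes / (1024 * 1024);
}

// e.g. a 10-minute stereo file at 48 kHz:
// estimateDecodedMemoryMB(600, 48000, 2) ≈ 220 MB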

View File

@@ -0,0 +1,162 @@
'use client';
import * as React from 'react';
import { X, Plus, Trash2, Copy, FolderOpen, Download, Upload } from 'lucide-react';
import { Button } from '@/components/ui/Button';
import type { ProjectMetadata } from '@/lib/storage/db';
import { formatDuration } from '@/lib/audio/decoder';
export interface ProjectsDialogProps {
open: boolean;
onClose: () => void;
projects: ProjectMetadata[];
onNewProject: () => void;
onLoadProject: (projectId: string) => void;
onDeleteProject: (projectId: string) => void;
onDuplicateProject: (projectId: string) => void;
onExportProject: (projectId: string) => void;
onImportProject: () => void;
}
export function ProjectsDialog({
open,
onClose,
projects,
onNewProject,
onLoadProject,
onDeleteProject,
onDuplicateProject,
onExportProject,
onImportProject,
}: ProjectsDialogProps) {
if (!open) return null;
const formatDate = (timestamp: number) => {
return new Date(timestamp).toLocaleString(undefined, {
year: 'numeric',
month: 'short',
day: 'numeric',
hour: '2-digit',
minute: '2-digit',
});
};
return (
<div className="fixed inset-0 z-50 flex items-center justify-center bg-black/50">
<div className="bg-card border border-border rounded-lg shadow-xl w-full max-w-3xl max-h-[80vh] flex flex-col">
{/* Header */}
<div className="flex items-center justify-between p-6 border-b border-border">
<h2 className="text-lg font-semibold text-foreground">Projects</h2>
<div className="flex items-center gap-2">
<Button
onClick={onImportProject}
variant="outline"
size="sm"
className="gap-2"
>
<Upload className="h-4 w-4" />
Import
</Button>
<Button
onClick={onNewProject}
variant="default"
size="sm"
className="gap-2"
>
<Plus className="h-4 w-4" />
New Project
</Button>
<button
onClick={onClose}
className="text-muted-foreground hover:text-foreground transition-colors"
>
<X className="h-5 w-5" />
</button>
</div>
</div>
{/* Projects List */}
<div className="flex-1 overflow-y-auto custom-scrollbar p-6">
{projects.length === 0 ? (
<div className="flex flex-col items-center justify-center py-12 text-center">
<FolderOpen className="h-16 w-16 text-muted-foreground mb-4" />
<h3 className="text-lg font-medium text-foreground mb-2">
No projects yet
</h3>
<p className="text-sm text-muted-foreground mb-4">
Create your first project to get started
</p>
<Button onClick={onNewProject} variant="default">
<Plus className="h-4 w-4 mr-2" />
Create Project
</Button>
</div>
) : (
<div className="grid gap-4">
{projects.map((project) => (
<div
key={project.id}
className="border border-border rounded-lg p-4 hover:bg-accent/50 transition-colors"
>
<div className="flex items-start justify-between">
<div className="flex-1">
<h3 className="font-medium text-foreground mb-1">
{project.name}
</h3>
{project.description && (
<p className="text-sm text-muted-foreground mb-2">
{project.description}
</p>
)}
<div className="flex flex-wrap gap-4 text-xs text-muted-foreground">
<span>{project.trackCount} tracks</span>
<span>{formatDuration(project.duration)}</span>
<span>{project.sampleRate / 1000}kHz</span>
<span>Updated {formatDate(project.updatedAt)}</span>
</div>
</div>
<div className="flex items-center gap-2 ml-4">
<button
onClick={() => onLoadProject(project.id)}
className="px-3 py-1.5 text-sm font-medium text-primary hover:bg-primary/10 rounded transition-colors"
title="Open project"
>
Open
</button>
<button
onClick={() => onExportProject(project.id)}
className="p-1.5 text-muted-foreground hover:text-foreground hover:bg-accent rounded transition-colors"
title="Export project"
>
<Download className="h-4 w-4" />
</button>
<button
onClick={() => onDuplicateProject(project.id)}
className="p-1.5 text-muted-foreground hover:text-foreground hover:bg-accent rounded transition-colors"
title="Duplicate project"
>
<Copy className="h-4 w-4" />
</button>
<button
onClick={() => {
if (confirm(`Delete "${project.name}"? This cannot be undone.`)) {
onDeleteProject(project.id);
}
}}
className="p-1.5 text-muted-foreground hover:text-destructive hover:bg-destructive/10 rounded transition-colors"
title="Delete project"
>
<Trash2 className="h-4 w-4" />
</button>
</div>
</div>
</div>
))}
</div>
)}
</div>
</div>
</div>
);
}

View File

@@ -0,0 +1,106 @@
'use client';
import * as React from 'react';
import { AlertCircle, FileQuestion, X } from 'lucide-react';
import { Modal } from '@/components/ui/Modal';
import { Button } from '@/components/ui/Button';
export interface UnsupportedFormatDialogProps {
open: boolean;
fileName: string;
fileType: string;
onClose: () => void;
}
const SUPPORTED_FORMATS = [
{ extension: 'WAV', mimeType: 'audio/wav', description: 'Lossless, widely supported' },
{ extension: 'MP3', mimeType: 'audio/mpeg', description: 'Compressed, universal support' },
{ extension: 'OGG', mimeType: 'audio/ogg', description: 'Free, open format' },
{ extension: 'FLAC', mimeType: 'audio/flac', description: 'Lossless compression' },
{ extension: 'M4A/AAC', mimeType: 'audio/aac', description: 'Apple audio format' },
{ extension: 'AIFF', mimeType: 'audio/aiff', description: 'Apple lossless format' },
{ extension: 'WebM', mimeType: 'audio/webm', description: 'Modern web format' },
];
export function UnsupportedFormatDialog({
open,
fileName,
fileType,
onClose,
}: UnsupportedFormatDialogProps) {
if (!open) return null;
return (
<Modal open={open} onClose={onClose} title="">
<div className="p-6 max-w-lg">
{/* Header */}
<div className="flex items-start justify-between mb-4">
<div className="flex items-center gap-3">
<FileQuestion className="h-6 w-6 text-yellow-500" />
<h2 className="text-lg font-semibold">Unsupported File Format</h2>
</div>
<button onClick={onClose} className="text-muted-foreground hover:text-foreground">
<X className="h-4 w-4" />
</button>
</div>
{/* Error Message */}
<div className="bg-yellow-500/10 border border-yellow-500/20 rounded-md p-4 mb-4">
<div className="flex items-start gap-2">
<AlertCircle className="h-4 w-4 text-yellow-600 dark:text-yellow-400 mt-0.5 flex-shrink-0" />
<div className="flex-1">
<p className="text-sm text-yellow-800 dark:text-yellow-200 font-medium mb-1">
Cannot open this file
</p>
<p className="text-sm text-yellow-700 dark:text-yellow-300">
<strong>{fileName}</strong>
{fileType && (
<span className="text-muted-foreground"> ({fileType})</span>
)}
</p>
</div>
</div>
</div>
{/* Supported Formats */}
<div className="mb-6">
<h3 className="text-sm font-semibold mb-3">Supported Audio Formats:</h3>
<div className="grid grid-cols-1 gap-2">
{SUPPORTED_FORMATS.map((format) => (
<div
key={format.extension}
className="flex items-center justify-between gap-4 p-2 rounded bg-muted/30 border border-border/50"
>
<div className="flex items-center gap-3">
<span className="text-sm font-mono font-semibold text-primary min-w-[80px]">
{format.extension}
</span>
<span className="text-xs text-muted-foreground">
{format.description}
</span>
</div>
</div>
))}
</div>
</div>
{/* Recommendations */}
<div className="bg-muted/50 border border-border rounded-md p-4 mb-4">
<h4 className="text-sm font-semibold mb-2">How to fix this:</h4>
<ul className="text-sm text-muted-foreground space-y-2 list-disc list-inside">
<li>Convert your audio file to a supported format (WAV or MP3 recommended)</li>
<li>Use a free audio converter like Audacity, FFmpeg, or online converters</li>
<li>Check that the file isn't corrupted or incomplete</li>
</ul>
</div>
{/* Close Button */}
<div className="flex justify-end">
<Button onClick={onClose} variant="default">
Got it
</Button>
</div>
</div>
</Modal>
);
}

File diff suppressed because it is too large

View File

@@ -84,7 +84,7 @@ export function FileUpload({ onFileSelect, className }: FileUploadProps) {
Click to browse or drag and drop
</p>
<p className="text-xs text-muted-foreground mt-2">
Supported formats: WAV, MP3, OGG, FLAC, AAC, M4A
Supported formats: WAV, MP3, OGG, FLAC, AAC, M4A, AIFF
</p>
</div>
</div>

View File

@@ -1,9 +1,8 @@
'use client';
import * as React from 'react';
import { Play, Pause, Square, SkipBack, Circle, AlignVerticalJustifyStart, AlignVerticalJustifyEnd, Layers } from 'lucide-react';
import { Play, Pause, Square, SkipBack, Circle, AlignVerticalJustifyStart, AlignVerticalJustifyEnd, Layers, Repeat } from 'lucide-react';
import { Button } from '@/components/ui/Button';
import { MasterControls } from '@/components/controls/MasterControls';
import { cn } from '@/lib/utils/cn';
export interface PlaybackControlsProps {
@@ -12,14 +11,11 @@ export interface PlaybackControlsProps {
currentTime: number;
duration: number;
volume: number;
pan?: number;
onPlay: () => void;
onPause: () => void;
onStop: () => void;
onSeek: (time: number, autoPlay?: boolean) => void;
onVolumeChange: (volume: number) => void;
onPanChange?: (pan: number) => void;
onMuteToggle?: () => void;
disabled?: boolean;
className?: string;
currentTimeFormatted?: string;
@@ -35,10 +31,13 @@ export interface PlaybackControlsProps {
onPunchOutTimeChange?: (time: number) => void;
overdubEnabled?: boolean;
onOverdubEnabledChange?: (enabled: boolean) => void;
masterPeakLevel?: number;
masterRmsLevel?: number;
masterIsClipping?: boolean;
onResetClip?: () => void;
loopEnabled?: boolean;
loopStart?: number;
loopEnd?: number;
onToggleLoop?: () => void;
onSetLoopPoints?: (start: number, end: number) => void;
playbackRate?: number;
onPlaybackRateChange?: (rate: number) => void;
}
export function PlaybackControls({
@@ -47,14 +46,11 @@ export function PlaybackControls({
currentTime,
duration,
volume,
pan = 0,
onPlay,
onPause,
onStop,
onSeek,
onVolumeChange,
onPanChange,
onMuteToggle,
disabled = false,
className,
currentTimeFormatted,
@@ -70,10 +66,13 @@ export function PlaybackControls({
onPunchOutTimeChange,
overdubEnabled = false,
onOverdubEnabledChange,
masterPeakLevel = 0,
masterRmsLevel = 0,
masterIsClipping = false,
onResetClip,
loopEnabled = false,
loopStart = 0,
loopEnd = 0,
onToggleLoop,
onSetLoopPoints,
playbackRate = 1.0,
onPlaybackRateChange,
}: PlaybackControlsProps) {
const handlePlayPause = () => {
if (isPlaying) {
@@ -86,7 +85,7 @@ export function PlaybackControls({
const progress = duration > 0 ? (currentTime / duration) * 100 : 0;
return (
<div className={cn('space-y-4', className)}>
<div className={cn('space-y-4 w-full max-w-2xl', className)}>
{/* Timeline Slider */}
<div className="space-y-2">
<input
@@ -176,7 +175,7 @@ export function PlaybackControls({
)}
{/* Transport Controls */}
<div className="flex items-center justify-between gap-4">
<div className="flex items-center justify-center gap-4">
<div className="flex items-center gap-2">
<Button
variant="outline"
@@ -264,24 +263,100 @@ export function PlaybackControls({
</div>
</>
)}
</div>
{/* Master Controls */}
{onPanChange && onMuteToggle && (
<MasterControls
volume={volume}
pan={pan}
peakLevel={masterPeakLevel}
rmsLevel={masterRmsLevel}
isClipping={masterIsClipping}
isMuted={volume === 0}
onVolumeChange={onVolumeChange}
onPanChange={onPanChange}
onMuteToggle={onMuteToggle}
onResetClip={onResetClip}
/>
)}
{/* Loop Toggle */}
{onToggleLoop && (
<div className="flex items-center gap-1 border-l border-border pl-2 ml-1">
<Button
variant="ghost"
size="icon-sm"
onClick={onToggleLoop}
title="Toggle Loop Playback"
className={cn(
loopEnabled && 'bg-primary/20 hover:bg-primary/30'
)}
>
<Repeat className={cn('h-3.5 w-3.5', loopEnabled && 'text-primary')} />
</Button>
</div>
)}
{/* Playback Speed Control */}
{onPlaybackRateChange && (
<div className="flex items-center gap-1 border-l border-border pl-2 ml-1">
<select
value={playbackRate}
onChange={(e) => onPlaybackRateChange(parseFloat(e.target.value))}
className="h-7 px-2 py-0 bg-background border border-border rounded text-xs cursor-pointer hover:bg-muted/50 focus:outline-none focus:ring-2 focus:ring-ring"
title="Playback Speed"
>
<option value={0.25}>0.25x</option>
<option value={0.5}>0.5x</option>
<option value={0.75}>0.75x</option>
<option value={1.0}>1x</option>
<option value={1.25}>1.25x</option>
<option value={1.5}>1.5x</option>
<option value={2.0}>2x</option>
</select>
</div>
)}
</div>
</div>
{/* Loop Points - Show when enabled */}
{loopEnabled && onSetLoopPoints && (
<div className="flex items-center gap-3 text-xs bg-muted/50 rounded px-3 py-2">
<div className="flex items-center gap-2 flex-1">
<label className="text-muted-foreground flex items-center gap-1 flex-shrink-0">
<AlignVerticalJustifyStart className="h-3 w-3" />
Loop Start
</label>
<input
type="number"
min={0}
max={loopEnd || duration}
step={0.1}
value={loopStart.toFixed(2)}
onChange={(e) => onSetLoopPoints(parseFloat(e.target.value), loopEnd)}
className="flex-1 px-2 py-1 bg-background border border-border rounded text-xs font-mono"
/>
<Button
variant="ghost"
size="sm"
onClick={() => onSetLoopPoints(currentTime, loopEnd)}
title="Set loop start to current time"
className="h-6 px-2 text-xs"
>
Set
</Button>
</div>
<div className="flex items-center gap-2 flex-1">
<label className="text-muted-foreground flex items-center gap-1 flex-shrink-0">
<AlignVerticalJustifyEnd className="h-3 w-3" />
Loop End
</label>
<input
type="number"
min={loopStart}
max={duration}
step={0.1}
value={loopEnd.toFixed(2)}
onChange={(e) => onSetLoopPoints(loopStart, parseFloat(e.target.value))}
className="flex-1 px-2 py-1 bg-background border border-border rounded text-xs font-mono"
/>
<Button
variant="ghost"
size="sm"
onClick={() => onSetLoopPoints(loopStart, currentTime)}
title="Set loop end to current time"
className="h-6 px-2 text-xs"
>
Set
</Button>
</div>
</div>
)}
</div>
);
}

View File

@@ -2,7 +2,7 @@
import * as React from 'react';
import { cn } from '@/lib/utils/cn';
import { generateMinMaxPeaks } from '@/lib/waveform/peaks';
import { useAudioWorker } from '@/lib/hooks/useAudioWorker';
import type { Selection } from '@/types/selection';
export interface WaveformProps {
@@ -39,6 +39,16 @@ export function Waveform({
const [isSelecting, setIsSelecting] = React.useState(false);
const [selectionStart, setSelectionStart] = React.useState<number | null>(null);
// Worker for peak generation
const worker = useAudioWorker();
// Cache peaks to avoid regenerating on every render
const [peaksCache, setPeaksCache] = React.useState<{
width: number;
min: Float32Array;
max: Float32Array;
} | null>(null);
// Handle resize
React.useEffect(() => {
const handleResize = () => {
@@ -52,10 +62,35 @@ export function Waveform({
return () => window.removeEventListener('resize', handleResize);
}, []);
// Generate peaks in worker when audioBuffer or zoom changes
React.useEffect(() => {
if (!audioBuffer) {
setPeaksCache(null);
return;
}
const visibleWidth = Math.floor(width * zoom);
// Check if we already have peaks for this width
if (peaksCache && peaksCache.width === visibleWidth) {
return;
}
// Generate peaks in worker
const channelData = audioBuffer.getChannelData(0);
worker.generateMinMaxPeaks(channelData, visibleWidth).then((peaks) => {
setPeaksCache({
width: visibleWidth,
min: peaks.min,
max: peaks.max,
});
});
}, [audioBuffer, width, zoom, worker, peaksCache]);
// Draw waveform
React.useEffect(() => {
const canvas = canvasRef.current;
if (!canvas || !audioBuffer) return;
if (!canvas || !audioBuffer || !peaksCache) return;
const ctx = canvas.getContext('2d');
if (!ctx) return;
@@ -75,8 +110,8 @@ export function Waveform({
// Calculate visible width based on zoom
const visibleWidth = Math.floor(width * zoom);
// Generate peaks for visible portion
const { min, max } = generateMinMaxPeaks(audioBuffer, visibleWidth, 0);
// Use cached peaks
const { min, max } = peaksCache;
// Draw waveform
const middle = height / 2;
@@ -176,7 +211,7 @@ export function Waveform({
ctx.lineTo(progressX, height);
ctx.stroke();
}
}, [audioBuffer, width, height, currentTime, duration, zoom, scrollOffset, amplitudeScale, selection]);
}, [audioBuffer, width, height, currentTime, duration, zoom, scrollOffset, amplitudeScale, selection, peaksCache]);
const handleClick = (e: React.MouseEvent<HTMLCanvasElement>) => {
if (!onSeek || !duration || isDragging) return;

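The hunk above moves peak generation into a worker and caches the result per visible width; the underlying work is a per-pixel min/max reduction over the channel data. A minimal sketch assuming the same `{ min, max }` shape the component consumes; the actual `generateMinMaxPeaks` in `@/lib/waveform/peaks` may differ:

// For each of `width` pixel columns, record the min and max sample in that column's bucket.
function computeMinMaxPeaks(channelData: Float32Array, width: number): { min: Float32Array; max: Float32Array } {
  const min = new Float32Array(width);
  const max = new Float32Array(width);
  const samplesPerPixel = Math.max(1, Math.floor(channelData.length / width));
  for (let x = 0; x < width; x++) {
    const start = x * samplesPerPixel;
    const end = Math.min(channelData.length, start + samplesPerPixel);
    if (end <= start) {
      // No samples fall in this column (buffer shorter than the requested width).
      min[x] = 0;
      max[x] = 0;
      continue;
    }
    let lo = channelData[start];
    let hi = channelData[start];
    for (let i = start + 1; i < end; i++) {
      const s = channelData[i];
      if (s < lo) lo = s;
      if (s > hi) hi = s;
    }
    min[x] = lo;
    max[x] = hi;
  }
  return { min, max };
}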
View File

@@ -105,7 +105,7 @@ export function EffectBrowser({ open, onClose, onSelectEffect }: EffectBrowserPr
</div>
{/* Content */}
<div className="flex-1 overflow-y-auto p-4">
<div className="flex-1 overflow-y-auto custom-scrollbar p-4">
<div className="space-y-6">
{Object.entries(filteredCategories).map(([category, effects]) => (
<div key={category}>

View File

@@ -0,0 +1,188 @@
'use client';
import * as React from 'react';
import { Modal } from '@/components/ui/Modal';
import { Button } from '@/components/ui/Button';
import type { Marker, MarkerType } from '@/types/marker';
export interface MarkerDialogProps {
open: boolean;
onClose: () => void;
onSave: (marker: Partial<Marker>) => void;
marker?: Marker; // If editing existing marker
defaultTime?: number; // Default time for new markers
defaultType?: MarkerType;
}
const MARKER_COLORS = [
'#ef4444', // red
'#f97316', // orange
'#eab308', // yellow
'#22c55e', // green
'#3b82f6', // blue
'#a855f7', // purple
'#ec4899', // pink
];
export function MarkerDialog({
open,
onClose,
onSave,
marker,
defaultTime = 0,
defaultType = 'point',
}: MarkerDialogProps) {
const [name, setName] = React.useState(marker?.name || '');
const [type, setType] = React.useState<MarkerType>(marker?.type || defaultType);
const [time, setTime] = React.useState(marker?.time || defaultTime);
const [endTime, setEndTime] = React.useState(marker?.endTime || defaultTime + 1);
const [color, setColor] = React.useState(marker?.color || MARKER_COLORS[0]);
const [description, setDescription] = React.useState(marker?.description || '');
// Reset form when marker changes or dialog opens
React.useEffect(() => {
if (open) {
setName(marker?.name || '');
setType(marker?.type || defaultType);
setTime(marker?.time || defaultTime);
setEndTime(marker?.endTime || defaultTime + 1);
setColor(marker?.color || MARKER_COLORS[0]);
setDescription(marker?.description || '');
}
}, [open, marker, defaultTime, defaultType]);
const handleSave = () => {
const markerData: Partial<Marker> = {
...(marker?.id && { id: marker.id }),
name: name || 'Untitled Marker',
type,
time,
...(type === 'region' && { endTime }),
color,
description,
};
onSave(markerData);
onClose();
};
return (
<Modal
open={open}
onClose={onClose}
title={marker ? 'Edit Marker' : 'Add Marker'}
description={marker ? 'Edit marker properties' : 'Add a new marker or region to the timeline'}
size="md"
footer={
<>
<Button variant="outline" onClick={onClose}>
Cancel
</Button>
<Button onClick={handleSave}>{marker ? 'Save' : 'Add'}</Button>
</>
}
>
<div className="space-y-4">
{/* Name */}
<div className="space-y-2">
<label htmlFor="name" className="text-sm font-medium text-foreground">
Name
</label>
<input
id="name"
type="text"
value={name}
onChange={(e) => setName(e.target.value)}
placeholder="Marker name"
className="flex h-10 w-full rounded-md border border-border bg-background px-3 py-2 text-sm text-foreground placeholder:text-muted-foreground focus:outline-none focus:ring-2 focus:ring-primary focus:ring-offset-2 focus:ring-offset-background"
/>
</div>
{/* Type */}
<div className="space-y-2">
<label htmlFor="type" className="text-sm font-medium text-foreground">
Type
</label>
<select
id="type"
value={type}
onChange={(e) => setType(e.target.value as MarkerType)}
className="flex h-10 w-full rounded-md border border-border bg-background px-3 py-2 text-sm text-foreground focus:outline-none focus:ring-2 focus:ring-primary focus:ring-offset-2 focus:ring-offset-background"
>
<option value="point">Point Marker</option>
<option value="region">Region</option>
</select>
</div>
{/* Time */}
<div className="space-y-2">
<label htmlFor="time" className="text-sm font-medium text-foreground">
{type === 'region' ? 'Start Time' : 'Time'} (seconds)
</label>
<input
id="time"
type="number"
step="0.1"
min="0"
value={time}
onChange={(e) => setTime(parseFloat(e.target.value))}
className="flex h-10 w-full rounded-md border border-border bg-background px-3 py-2 text-sm text-foreground focus:outline-none focus:ring-2 focus:ring-primary focus:ring-offset-2 focus:ring-offset-background"
/>
</div>
{/* End Time (for regions) */}
{type === 'region' && (
<div className="space-y-2">
<label htmlFor="endTime" className="text-sm font-medium text-foreground">
End Time (seconds)
</label>
<input
id="endTime"
type="number"
step="0.1"
min={time}
value={endTime}
onChange={(e) => setEndTime(parseFloat(e.target.value))}
className="flex h-10 w-full rounded-md border border-border bg-background px-3 py-2 text-sm text-foreground focus:outline-none focus:ring-2 focus:ring-primary focus:ring-offset-2 focus:ring-offset-background"
/>
</div>
)}
{/* Color */}
<div className="space-y-2">
<label className="text-sm font-medium text-foreground">
Color
</label>
<div className="flex gap-2">
{MARKER_COLORS.map((c) => (
<button
key={c}
type="button"
className="w-8 h-8 rounded border-2 transition-all hover:scale-110"
style={{
backgroundColor: c,
borderColor: color === c ? 'white' : 'transparent',
}}
onClick={() => setColor(c)}
/>
))}
</div>
</div>
{/* Description */}
<div className="space-y-2">
<label htmlFor="description" className="text-sm font-medium text-foreground">
Description (optional)
</label>
<input
id="description"
type="text"
value={description}
onChange={(e) => setDescription(e.target.value)}
placeholder="Optional description"
className="flex h-10 w-full rounded-md border border-border bg-background px-3 py-2 text-sm text-foreground placeholder:text-muted-foreground focus:outline-none focus:ring-2 focus:ring-primary focus:ring-offset-2 focus:ring-offset-background"
/>
</div>
</div>
</Modal>
);
}

View File

@@ -0,0 +1,216 @@
'use client';
import * as React from 'react';
import { cn } from '@/lib/utils/cn';
import type { Marker } from '@/types/marker';
import { Flag, Edit2, Trash2 } from 'lucide-react';
import { Button } from '@/components/ui/Button';
export interface MarkerTimelineProps {
markers: Marker[];
duration: number;
currentTime: number;
onMarkerClick?: (marker: Marker) => void;
onMarkerEdit?: (marker: Marker) => void;
onMarkerDelete?: (markerId: string) => void;
onSeek?: (time: number) => void;
className?: string;
}
export function MarkerTimeline({
markers,
duration,
currentTime,
onMarkerClick,
onMarkerEdit,
onMarkerDelete,
onSeek,
className,
}: MarkerTimelineProps) {
const containerRef = React.useRef<HTMLDivElement>(null);
const [hoveredMarkerId, setHoveredMarkerId] = React.useState<string | null>(null);
const timeToX = React.useCallback(
(time: number): number => {
if (!containerRef.current) return 0;
const width = containerRef.current.clientWidth;
return (time / duration) * width;
},
[duration]
);
return (
<div
ref={containerRef}
className={cn(
'relative w-full h-8 bg-muted/30 border-b border-border',
className
)}
>
{/* Markers */}
{markers.map((marker) => {
const x = timeToX(marker.time);
const isHovered = hoveredMarkerId === marker.id;
if (marker.type === 'point') {
return (
<div
key={marker.id}
className="absolute top-0 bottom-0 group cursor-pointer"
style={{ left: `${x}px` }}
onMouseEnter={() => setHoveredMarkerId(marker.id)}
onMouseLeave={() => setHoveredMarkerId(null)}
onClick={() => {
onMarkerClick?.(marker);
onSeek?.(marker.time);
}}
>
{/* Marker line */}
<div
className={cn(
'absolute top-0 bottom-0 w-0.5 transition-colors',
isHovered ? 'bg-primary' : 'bg-primary/60'
)}
style={{ backgroundColor: marker.color }}
/>
{/* Marker flag */}
<Flag
className={cn(
'absolute top-0.5 -left-2 h-4 w-4 transition-colors',
isHovered ? 'text-primary' : 'text-primary/60'
)}
style={{ color: marker.color }}
/>
{/* Hover tooltip with actions */}
{isHovered && (
<div className="absolute top-full left-0 mt-1 z-10 bg-popover border border-border rounded shadow-lg p-2 min-w-[200px]">
<div className="text-xs font-medium mb-1">{marker.name}</div>
{marker.description && (
<div className="text-xs text-muted-foreground mb-2">{marker.description}</div>
)}
<div className="flex gap-1">
{onMarkerEdit && (
<Button
variant="ghost"
size="icon-sm"
onClick={(e) => {
e.stopPropagation();
onMarkerEdit(marker);
}}
title="Edit marker"
className="h-6 w-6"
>
<Edit2 className="h-3 w-3" />
</Button>
)}
{onMarkerDelete && (
<Button
variant="ghost"
size="icon-sm"
onClick={(e) => {
e.stopPropagation();
onMarkerDelete(marker.id);
}}
title="Delete marker"
className="h-6 w-6 text-destructive hover:text-destructive"
>
<Trash2 className="h-3 w-3" />
</Button>
)}
</div>
</div>
)}
</div>
);
} else {
// Region marker
const endX = timeToX(marker.endTime || marker.time);
const width = endX - x;
return (
<div
key={marker.id}
className="absolute top-0 bottom-0 group cursor-pointer"
style={{ left: `${x}px`, width: `${width}px` }}
onMouseEnter={() => setHoveredMarkerId(marker.id)}
onMouseLeave={() => setHoveredMarkerId(null)}
onClick={() => {
onMarkerClick?.(marker);
onSeek?.(marker.time);
}}
>
{/* Region background */}
<div
className={cn(
'absolute inset-0 transition-opacity',
isHovered ? 'opacity-30' : 'opacity-20'
)}
style={{ backgroundColor: marker.color || 'var(--color-primary)' }}
/>
{/* Region borders */}
<div
className="absolute top-0 bottom-0 left-0 w-0.5"
style={{ backgroundColor: marker.color || 'var(--color-primary)' }}
/>
<div
className="absolute top-0 bottom-0 right-0 w-0.5"
style={{ backgroundColor: marker.color || 'var(--color-primary)' }}
/>
{/* Region label */}
<div
className="absolute top-0.5 left-1 text-[10px] font-medium truncate pr-1"
style={{ color: marker.color || 'var(--color-primary)', maxWidth: `${width - 8}px` }}
>
{marker.name}
</div>
{/* Hover tooltip with actions */}
{isHovered && (
<div className="absolute top-full left-0 mt-1 z-10 bg-popover border border-border rounded shadow-lg p-2 min-w-[200px]">
<div className="text-xs font-medium mb-1">{marker.name}</div>
{marker.description && (
<div className="text-xs text-muted-foreground mb-2">{marker.description}</div>
)}
<div className="flex gap-1">
{onMarkerEdit && (
<Button
variant="ghost"
size="icon-sm"
onClick={(e) => {
e.stopPropagation();
onMarkerEdit(marker);
}}
title="Edit marker"
className="h-6 w-6"
>
<Edit2 className="h-3 w-3" />
</Button>
)}
{onMarkerDelete && (
<Button
variant="ghost"
size="icon-sm"
onClick={(e) => {
e.stopPropagation();
onMarkerDelete(marker.id);
}}
title="Delete marker"
className="h-6 w-6 text-destructive hover:text-destructive"
>
<Trash2 className="h-3 w-3" />
</Button>
)}
</div>
</div>
)}
</div>
);
}
})}
</div>
);
}

View File

@@ -1,11 +1,19 @@
'use client';
import * as React from 'react';
import { X } from 'lucide-react';
import { X, RotateCcw } from 'lucide-react';
import { Button } from '@/components/ui/Button';
import { Slider } from '@/components/ui/Slider';
import { RecordingSettings } from '@/components/recording/RecordingSettings';
import { cn } from '@/lib/utils/cn';
import type { RecordingSettings as RecordingSettingsType } from '@/lib/hooks/useRecording';
import type {
Settings,
AudioSettings,
UISettings,
EditorSettings,
PerformanceSettings,
} from '@/lib/hooks/useSettings';
export interface GlobalSettingsDialogProps {
open: boolean;
@@ -14,9 +22,15 @@ export interface GlobalSettingsDialogProps {
onInputGainChange: (gain: number) => void;
onRecordMonoChange: (mono: boolean) => void;
onSampleRateChange: (sampleRate: number) => void;
settings: Settings;
onAudioSettingsChange: (updates: Partial<AudioSettings>) => void;
onUISettingsChange: (updates: Partial<UISettings>) => void;
onEditorSettingsChange: (updates: Partial<EditorSettings>) => void;
onPerformanceSettingsChange: (updates: Partial<PerformanceSettings>) => void;
onResetCategory: (category: 'audio' | 'ui' | 'editor' | 'performance') => void;
}
type TabType = 'recording' | 'playback' | 'interface';
type TabType = 'recording' | 'audio' | 'editor' | 'interface' | 'performance';
export function GlobalSettingsDialog({
open,
@@ -25,6 +39,12 @@ export function GlobalSettingsDialog({
onInputGainChange,
onRecordMonoChange,
onSampleRateChange,
settings,
onAudioSettingsChange,
onUISettingsChange,
onEditorSettingsChange,
onPerformanceSettingsChange,
onResetCategory,
}: GlobalSettingsDialogProps) {
const [activeTab, setActiveTab] = React.useState<TabType>('recording');
@@ -39,7 +59,7 @@ export function GlobalSettingsDialog({
/>
{/* Dialog */}
<div className="fixed left-1/2 top-1/2 -translate-x-1/2 -translate-y-1/2 w-full max-w-2xl z-50">
<div className="fixed left-1/2 top-1/2 -translate-x-1/2 -translate-y-1/2 w-full max-w-3xl z-50">
<div className="bg-card border border-border rounded-lg shadow-2xl overflow-hidden">
{/* Header */}
<div className="flex items-center justify-between px-6 py-4 border-b border-border">
@@ -55,65 +75,47 @@ export function GlobalSettingsDialog({
</div>
{/* Tabs */}
<div className="flex border-b border-border bg-muted/30">
<button
onClick={() => setActiveTab('recording')}
className={cn(
'px-6 py-3 text-sm font-medium transition-colors relative',
activeTab === 'recording'
? 'text-foreground bg-card'
: 'text-muted-foreground hover:text-foreground hover:bg-muted/50'
)}
>
Recording
{activeTab === 'recording' && (
<div className="absolute bottom-0 left-0 right-0 h-0.5 bg-primary" />
)}
</button>
<button
onClick={() => setActiveTab('playback')}
className={cn(
'px-6 py-3 text-sm font-medium transition-colors relative',
activeTab === 'playback'
? 'text-foreground bg-card'
: 'text-muted-foreground hover:text-foreground hover:bg-muted/50'
)}
>
Playback
{activeTab === 'playback' && (
<div className="absolute bottom-0 left-0 right-0 h-0.5 bg-primary" />
)}
</button>
<button
onClick={() => setActiveTab('interface')}
className={cn(
'px-6 py-3 text-sm font-medium transition-colors relative',
activeTab === 'interface'
? 'text-foreground bg-card'
: 'text-muted-foreground hover:text-foreground hover:bg-muted/50'
)}
>
Interface
{activeTab === 'interface' && (
<div className="absolute bottom-0 left-0 right-0 h-0.5 bg-primary" />
)}
</button>
<div className="flex border-b border-border bg-muted/30 overflow-x-auto">
{[
{ id: 'recording', label: 'Recording' },
{ id: 'audio', label: 'Audio' },
{ id: 'editor', label: 'Editor' },
{ id: 'interface', label: 'Interface' },
{ id: 'performance', label: 'Performance' },
].map((tab) => (
<button
key={tab.id}
onClick={() => setActiveTab(tab.id as TabType)}
className={cn(
'px-6 py-3 text-sm font-medium transition-colors relative flex-shrink-0',
activeTab === tab.id
? 'text-foreground bg-card'
: 'text-muted-foreground hover:text-foreground hover:bg-muted/50'
)}
>
{tab.label}
{activeTab === tab.id && (
<div className="absolute bottom-0 left-0 right-0 h-0.5 bg-primary" />
)}
</button>
))}
</div>
{/* Content */}
<div className="p-6 max-h-[60vh] overflow-y-auto custom-scrollbar">
{/* Recording Tab */}
{activeTab === 'recording' && (
<div className="space-y-4">
<div>
<h3 className="text-sm font-medium mb-3">Recording Settings</h3>
<RecordingSettings
settings={recordingSettings}
onInputGainChange={onInputGainChange}
onRecordMonoChange={onRecordMonoChange}
onSampleRateChange={onSampleRateChange}
className="border-0 bg-transparent p-0"
/>
<div className="flex items-center justify-between">
<h3 className="text-sm font-medium">Recording Settings</h3>
</div>
<RecordingSettings
settings={recordingSettings}
onInputGainChange={onInputGainChange}
onRecordMonoChange={onRecordMonoChange}
onSampleRateChange={onSampleRateChange}
className="border-0 bg-transparent p-0"
/>
<div className="pt-4 border-t border-border">
<h3 className="text-sm font-medium mb-2">Note</h3>
@@ -125,52 +127,393 @@ export function GlobalSettingsDialog({
</div>
)}
{activeTab === 'playback' && (
<div className="space-y-4">
<div>
<h3 className="text-sm font-medium mb-2">Playback Settings</h3>
<p className="text-sm text-muted-foreground mb-4">
Configure audio playback preferences.
</p>
{/* Audio Tab */}
{activeTab === 'audio' && (
<div className="space-y-6">
<div className="flex items-center justify-between">
<h3 className="text-sm font-medium">Audio Settings</h3>
<Button
variant="ghost"
size="sm"
onClick={() => onResetCategory('audio')}
className="h-7 text-xs"
>
<RotateCcw className="h-3 w-3 mr-1" />
Reset
</Button>
</div>
<div className="space-y-3 text-sm text-muted-foreground">
<div className="flex items-center justify-between p-3 bg-muted/50 rounded">
<span>Buffer Size</span>
<span className="font-mono">Auto</span>
</div>
<div className="flex items-center justify-between p-3 bg-muted/50 rounded">
<span>Output Latency</span>
<span className="font-mono">~20ms</span>
</div>
<p className="text-xs italic">
Advanced playback settings coming soon...
{/* Buffer Size */}
<div className="space-y-2">
<label className="text-sm font-medium">Buffer Size</label>
<select
value={settings.audio.bufferSize}
onChange={(e) =>
onAudioSettingsChange({ bufferSize: Number(e.target.value) })
}
className="w-full px-3 py-2 bg-background border border-border rounded text-sm"
>
<option value={256}>256 samples (Low latency, higher CPU)</option>
<option value={512}>512 samples</option>
<option value={1024}>1024 samples</option>
<option value={2048}>2048 samples (Recommended)</option>
<option value={4096}>4096 samples (Low CPU)</option>
</select>
<p className="text-xs text-muted-foreground">
Smaller buffer = lower latency but higher CPU usage. Requires reload.
</p>
</div>
{/* Sample Rate */}
<div className="space-y-2">
<label className="text-sm font-medium">Default Sample Rate</label>
<select
value={settings.audio.sampleRate}
onChange={(e) =>
onAudioSettingsChange({ sampleRate: Number(e.target.value) })
}
className="w-full px-3 py-2 bg-background border border-border rounded text-sm"
>
<option value={44100}>44.1 kHz (CD Quality)</option>
<option value={48000}>48 kHz (Professional)</option>
<option value={96000}>96 kHz (Hi-Res Audio)</option>
</select>
<p className="text-xs text-muted-foreground">
Higher sample rate = better quality but larger file sizes.
</p>
</div>
{/* Auto Normalize */}
<div className="flex items-center justify-between p-3 bg-muted/50 rounded">
<div>
<div className="text-sm font-medium">Auto-Normalize on Import</div>
<p className="text-xs text-muted-foreground">
Automatically normalize audio when importing files
</p>
</div>
<input
type="checkbox"
checked={settings.audio.autoNormalizeOnImport}
onChange={(e) =>
onAudioSettingsChange({ autoNormalizeOnImport: e.target.checked })
}
className="h-4 w-4"
/>
</div>
</div>
)}
{activeTab === 'interface' && (
<div className="space-y-4">
<div>
<h3 className="text-sm font-medium mb-2">Interface Settings</h3>
<p className="text-sm text-muted-foreground mb-4">
Customize the editor appearance and behavior.
</p>
{/* Editor Tab */}
{activeTab === 'editor' && (
<div className="space-y-6">
<div className="flex items-center justify-between">
<h3 className="text-sm font-medium">Editor Settings</h3>
<Button
variant="ghost"
size="sm"
onClick={() => onResetCategory('editor')}
className="h-7 text-xs"
>
<RotateCcw className="h-3 w-3 mr-1" />
Reset
</Button>
</div>
<div className="space-y-3 text-sm text-muted-foreground">
<div className="flex items-center justify-between p-3 bg-muted/50 rounded">
<span>Theme</span>
<span>Use theme toggle in header</span>
</div>
<div className="flex items-center justify-between p-3 bg-muted/50 rounded">
<span>Default Track Height</span>
<span className="font-mono">180px</span>
</div>
<p className="text-xs italic">
More interface options coming soon...
{/* Auto-Save Interval */}
<div className="space-y-2">
<div className="flex items-center justify-between">
<label className="text-sm font-medium">Auto-Save Interval</label>
<span className="text-xs font-mono text-muted-foreground">
{settings.editor.autoSaveInterval === 0
? 'Disabled'
: `${settings.editor.autoSaveInterval}s`}
</span>
</div>
<Slider
value={[settings.editor.autoSaveInterval]}
onValueChange={([value]) =>
onEditorSettingsChange({ autoSaveInterval: value })
}
min={0}
max={30}
step={1}
className="w-full"
/>
<p className="text-xs text-muted-foreground">
Set to 0 to disable auto-save. Default: 3 seconds.
</p>
</div>
{/* Undo History Limit */}
<div className="space-y-2">
<div className="flex items-center justify-between">
<label className="text-sm font-medium">Undo History Limit</label>
<span className="text-xs font-mono text-muted-foreground">
{settings.editor.undoHistoryLimit} operations
</span>
</div>
<Slider
value={[settings.editor.undoHistoryLimit]}
onValueChange={([value]) =>
onEditorSettingsChange({ undoHistoryLimit: value })
}
min={10}
max={200}
step={10}
className="w-full"
/>
<p className="text-xs text-muted-foreground">
Higher values use more memory. Default: 50.
</p>
</div>
{/* Snap to Grid */}
<div className="flex items-center justify-between p-3 bg-muted/50 rounded">
<div>
<div className="text-sm font-medium">Snap to Grid</div>
<p className="text-xs text-muted-foreground">
Snap playhead and selections to grid lines
</p>
</div>
<input
type="checkbox"
checked={settings.editor.snapToGrid}
onChange={(e) =>
onEditorSettingsChange({ snapToGrid: e.target.checked })
}
className="h-4 w-4"
/>
</div>
{/* Grid Resolution */}
<div className="space-y-2">
<div className="flex items-center justify-between">
<label className="text-sm font-medium">Grid Resolution</label>
<span className="text-xs font-mono text-muted-foreground">
{settings.editor.gridResolution}s
</span>
</div>
<Slider
value={[settings.editor.gridResolution]}
onValueChange={([value]) =>
onEditorSettingsChange({ gridResolution: value })
}
min={0.1}
max={5}
step={0.1}
className="w-full"
/>
<p className="text-xs text-muted-foreground">
Grid spacing in seconds. Default: 1.0s.
</p>
</div>
{/* Default Zoom */}
<div className="space-y-2">
<div className="flex items-center justify-between">
<label className="text-sm font-medium">Default Zoom Level</label>
<span className="text-xs font-mono text-muted-foreground">
{settings.editor.defaultZoom}x
</span>
</div>
<Slider
value={[settings.editor.defaultZoom]}
onValueChange={([value]) =>
onEditorSettingsChange({ defaultZoom: value })
}
min={1}
max={20}
step={1}
className="w-full"
/>
<p className="text-xs text-muted-foreground">
Initial zoom level when opening projects. Default: 1x.
</p>
</div>
</div>
)}
{/* Interface Tab */}
{activeTab === 'interface' && (
<div className="space-y-6">
<div className="flex items-center justify-between">
<h3 className="text-sm font-medium">Interface Settings</h3>
<Button
variant="ghost"
size="sm"
onClick={() => onResetCategory('ui')}
className="h-7 text-xs"
>
<RotateCcw className="h-3 w-3 mr-1" />
Reset
</Button>
</div>
{/* Theme */}
<div className="space-y-2">
<label className="text-sm font-medium">Theme</label>
<div className="flex gap-2">
{['dark', 'light', 'auto'].map((theme) => (
<button
key={theme}
onClick={() =>
onUISettingsChange({ theme: theme as 'dark' | 'light' | 'auto' })
}
className={cn(
'flex-1 px-4 py-2 rounded text-sm font-medium transition-colors',
settings.ui.theme === theme
? 'bg-primary text-primary-foreground'
: 'bg-muted hover:bg-muted/80'
)}
>
{theme.charAt(0).toUpperCase() + theme.slice(1)}
</button>
))}
</div>
<p className="text-xs text-muted-foreground">
Use the theme toggle in the header for quick switching.
</p>
</div>
{/* Font Size */}
<div className="space-y-2">
<label className="text-sm font-medium">Font Size</label>
<div className="flex gap-2">
{['small', 'medium', 'large'].map((size) => (
<button
key={size}
onClick={() =>
onUISettingsChange({ fontSize: size as 'small' | 'medium' | 'large' })
}
className={cn(
'flex-1 px-4 py-2 rounded text-sm font-medium transition-colors',
settings.ui.fontSize === size
? 'bg-primary text-primary-foreground'
: 'bg-muted hover:bg-muted/80'
)}
>
{size.charAt(0).toUpperCase() + size.slice(1)}
</button>
))}
</div>
<p className="text-xs text-muted-foreground">
Adjust the UI font size. Requires reload.
</p>
</div>
</div>
)}
{/* Performance Tab */}
{activeTab === 'performance' && (
<div className="space-y-6">
<div className="flex items-center justify-between">
<h3 className="text-sm font-medium">Performance Settings</h3>
<Button
variant="ghost"
size="sm"
onClick={() => onResetCategory('performance')}
className="h-7 text-xs"
>
<RotateCcw className="h-3 w-3 mr-1" />
Reset
</Button>
</div>
{/* Peak Calculation Quality */}
<div className="space-y-2">
<label className="text-sm font-medium">Peak Calculation Quality</label>
<div className="flex gap-2">
{['low', 'medium', 'high'].map((quality) => (
<button
key={quality}
onClick={() =>
onPerformanceSettingsChange({
peakCalculationQuality: quality as 'low' | 'medium' | 'high',
})
}
className={cn(
'flex-1 px-4 py-2 rounded text-sm font-medium transition-colors',
settings.performance.peakCalculationQuality === quality
? 'bg-primary text-primary-foreground'
: 'bg-muted hover:bg-muted/80'
)}
>
{quality.charAt(0).toUpperCase() + quality.slice(1)}
</button>
))}
</div>
<p className="text-xs text-muted-foreground">
Higher quality = more accurate waveforms, slower processing.
</p>
</div>
{/* Waveform Rendering Quality */}
<div className="space-y-2">
<label className="text-sm font-medium">Waveform Rendering Quality</label>
<div className="flex gap-2">
{['low', 'medium', 'high'].map((quality) => (
<button
key={quality}
onClick={() =>
onPerformanceSettingsChange({
waveformRenderingQuality: quality as 'low' | 'medium' | 'high',
})
}
className={cn(
'flex-1 px-4 py-2 rounded text-sm font-medium transition-colors',
settings.performance.waveformRenderingQuality === quality
? 'bg-primary text-primary-foreground'
: 'bg-muted hover:bg-muted/80'
)}
>
{quality.charAt(0).toUpperCase() + quality.slice(1)}
</button>
))}
</div>
<p className="text-xs text-muted-foreground">
Lower quality = better performance on slower devices.
</p>
</div>
{/* Enable Spectrogram */}
<div className="flex items-center justify-between p-3 bg-muted/50 rounded">
<div>
<div className="text-sm font-medium">Enable Spectrogram</div>
<p className="text-xs text-muted-foreground">
Show spectrogram in analysis tools (requires more CPU)
</p>
</div>
<input
type="checkbox"
checked={settings.performance.enableSpectrogram}
onChange={(e) =>
onPerformanceSettingsChange({ enableSpectrogram: e.target.checked })
}
className="h-4 w-4"
/>
</div>
{/* Max File Size */}
<div className="space-y-2">
<div className="flex items-center justify-between">
<label className="text-sm font-medium">Maximum File Size</label>
<span className="text-xs font-mono text-muted-foreground">
{settings.performance.maxFileSizeMB} MB
</span>
</div>
<Slider
value={[settings.performance.maxFileSizeMB]}
onValueChange={([value]) =>
onPerformanceSettingsChange({ maxFileSizeMB: value })
}
min={100}
max={1000}
step={50}
className="w-full"
/>
<p className="text-xs text-muted-foreground">
Warn when importing files larger than this. Default: 500 MB.
</p>
</div>
</div>
)}
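For reference, the grouped `Settings` object consumed above comes from `@/lib/hooks/useSettings`, which is not part of this diff. Inferring only from the fields the dialog reads, its shape is presumably close to the following sketch (the real interfaces may carry additional fields):

// Assumed shape of the settings hook's types, inferred from usage in this dialog.
interface AudioSettings {
  bufferSize: number;              // 256–4096 samples
  sampleRate: number;              // 44100 | 48000 | 96000
  autoNormalizeOnImport: boolean;
}
interface UISettings {
  theme: 'dark' | 'light' | 'auto';
  fontSize: 'small' | 'medium' | 'large';
}
interface EditorSettings {
  autoSaveInterval: number;        // seconds, 0 = disabled
  undoHistoryLimit: number;        // number of undoable operations
  snapToGrid: boolean;
  gridResolution: number;          // seconds
  defaultZoom: number;             // 1–20x
}
interface PerformanceSettings {
  peakCalculationQuality: 'low' | 'medium' | 'high';
  waveformRenderingQuality: 'low' | 'medium' | 'high';
  enableSpectrogram: boolean;
  maxFileSizeMB: number;
}
interface Settings {
  audio: AudioSettings;
  ui: UISettings;
  editor: EditorSettings;
  performance: PerformanceSettings;
}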

View File

@@ -0,0 +1,265 @@
'use client';
import * as React from 'react';
import { cn } from '@/lib/utils/cn';
import {
timeToPixel,
pixelToTime,
calculateTickInterval,
formatTimeLabel,
getVisibleTimeRange,
} from '@/lib/utils/timeline';
export interface TimeScaleProps {
duration: number;
zoom: number;
currentTime: number;
onSeek?: (time: number) => void;
className?: string;
height?: number;
controlsWidth?: number;
scrollRef?: React.MutableRefObject<HTMLDivElement | null>;
onScroll?: () => void;
}
export function TimeScale({
duration,
zoom,
currentTime,
onSeek,
className,
height = 40,
controlsWidth = 240,
scrollRef: externalScrollRef,
onScroll,
}: TimeScaleProps) {
const localScrollRef = React.useRef<HTMLDivElement>(null);
const scrollRef = externalScrollRef || localScrollRef;
const canvasRef = React.useRef<HTMLCanvasElement>(null);
const [viewportWidth, setViewportWidth] = React.useState(800);
const [scrollLeft, setScrollLeft] = React.useState(0);
const [hoverTime, setHoverTime] = React.useState<number | null>(null);
// Calculate total timeline width (match waveform calculation)
// Uses 5 pixels per second as base scale, multiplied by zoom
// Always ensure minimum width is at least viewport width for full coverage
const PIXELS_PER_SECOND_BASE = 5;
const totalWidth = React.useMemo(() => {
if (zoom >= 1) {
const calculatedWidth = duration * zoom * PIXELS_PER_SECOND_BASE;
// Ensure it's at least viewport width so timeline always fills
return Math.max(calculatedWidth, viewportWidth);
}
return viewportWidth;
}, [duration, zoom, viewportWidth]);
// Update viewport width on resize
React.useEffect(() => {
const scroller = scrollRef.current;
if (!scroller) return;
const updateWidth = () => {
setViewportWidth(scroller.clientWidth);
};
updateWidth();
const resizeObserver = new ResizeObserver(updateWidth);
resizeObserver.observe(scroller);
return () => resizeObserver.disconnect();
}, [scrollRef]);
// Handle scroll - update scrollLeft and trigger onScroll callback
const handleScroll = React.useCallback(() => {
if (scrollRef.current) {
setScrollLeft(scrollRef.current.scrollLeft);
}
if (onScroll) {
onScroll();
}
}, [onScroll, scrollRef]);
// Draw time scale - redraws on scroll and zoom
React.useEffect(() => {
const canvas = canvasRef.current;
if (!canvas || duration === 0) return;
const ctx = canvas.getContext('2d');
if (!ctx) return;
// Set canvas size to viewport width
const dpr = window.devicePixelRatio || 1;
canvas.width = viewportWidth * dpr;
canvas.height = height * dpr;
canvas.style.width = `${viewportWidth}px`;
canvas.style.height = `${height}px`;
ctx.scale(dpr, dpr);
// Clear canvas
ctx.fillStyle = getComputedStyle(canvas).getPropertyValue('--color-background') || '#ffffff';
ctx.fillRect(0, 0, viewportWidth, height);
// Calculate visible time range
const visibleRange = getVisibleTimeRange(scrollLeft, viewportWidth, duration, zoom);
const visibleDuration = visibleRange.end - visibleRange.start;
// Calculate tick intervals based on visible duration
const { major, minor } = calculateTickInterval(visibleDuration);
// Calculate which ticks to draw (only visible ones)
const startTick = Math.floor(visibleRange.start / minor) * minor;
const endTick = Math.ceil(visibleRange.end / minor) * minor;
// Set up text style for labels
ctx.font = '12px -apple-system, BlinkMacSystemFont, "Segoe UI", sans-serif';
ctx.textAlign = 'center';
ctx.textBaseline = 'top';
// Draw ticks and labels
for (let time = startTick; time <= endTick; time += minor) {
if (time < 0 || time > duration) continue;
// Calculate x position using the actual totalWidth (not timeToPixel which recalculates)
const x = (time / duration) * totalWidth - scrollLeft;
if (x < 0 || x > viewportWidth) continue;
const isMajor = Math.abs(time % major) < 0.001;
if (isMajor) {
// Major ticks - tall and prominent
ctx.strokeStyle = getComputedStyle(canvas).getPropertyValue('--color-foreground') || '#000000';
ctx.lineWidth = 2;
ctx.beginPath();
ctx.moveTo(x, height - 20);
ctx.lineTo(x, height);
ctx.stroke();
// Major tick label
ctx.fillStyle = getComputedStyle(canvas).getPropertyValue('--color-foreground') || '#000000';
const label = formatTimeLabel(time, visibleDuration < 10);
ctx.fillText(label, x, 6);
} else {
// Minor ticks - shorter and lighter
ctx.strokeStyle = getComputedStyle(canvas).getPropertyValue('--color-muted-foreground') || '#9ca3af';
ctx.lineWidth = 1;
ctx.beginPath();
ctx.moveTo(x, height - 10);
ctx.lineTo(x, height);
ctx.stroke();
// Minor tick label (smaller and lighter)
if (x > 20 && x < viewportWidth - 20) {
ctx.fillStyle = getComputedStyle(canvas).getPropertyValue('--color-muted-foreground') || '#9ca3af';
ctx.font = '10px -apple-system, BlinkMacSystemFont, "Segoe UI", sans-serif';
const label = formatTimeLabel(time, visibleDuration < 10);
ctx.fillText(label, x, 8);
ctx.font = '12px -apple-system, BlinkMacSystemFont, "Segoe UI", sans-serif';
}
}
}
// Draw playhead indicator
const playheadX = (currentTime / duration) * totalWidth - scrollLeft;
if (playheadX >= 0 && playheadX <= viewportWidth) {
ctx.strokeStyle = '#ef4444';
ctx.lineWidth = 2;
ctx.beginPath();
ctx.moveTo(playheadX, 0);
ctx.lineTo(playheadX, height);
ctx.stroke();
}
// Draw hover indicator
if (hoverTime !== null) {
const hoverX = (hoverTime / duration) * totalWidth - scrollLeft;
if (hoverX >= 0 && hoverX <= viewportWidth) {
ctx.strokeStyle = 'rgba(59, 130, 246, 0.5)';
ctx.lineWidth = 1;
ctx.setLineDash([3, 3]);
ctx.beginPath();
ctx.moveTo(hoverX, 0);
ctx.lineTo(hoverX, height);
ctx.stroke();
ctx.setLineDash([]);
}
}
}, [duration, zoom, currentTime, viewportWidth, scrollLeft, height, hoverTime, totalWidth]);
// Handle click to seek
const handleClick = React.useCallback(
(e: React.MouseEvent<HTMLCanvasElement>) => {
if (!onSeek) return;
const rect = e.currentTarget.getBoundingClientRect();
const x = e.clientX - rect.left;
const pixelPos = x + scrollLeft;
const time = (pixelPos / totalWidth) * duration;
onSeek(Math.max(0, Math.min(duration, time)));
},
[onSeek, duration, totalWidth, scrollLeft]
);
// Handle mouse move for hover
const handleMouseMove = React.useCallback(
(e: React.MouseEvent<HTMLCanvasElement>) => {
const rect = e.currentTarget.getBoundingClientRect();
const x = e.clientX - rect.left;
const pixelPos = x + scrollLeft;
const time = (pixelPos / totalWidth) * duration;
setHoverTime(Math.max(0, Math.min(duration, time)));
},
[duration, totalWidth, scrollLeft]
);
const handleMouseLeave = React.useCallback(() => {
setHoverTime(null);
}, []);
return (
<div className={cn('relative bg-background', className)} style={{ paddingLeft: '240px', paddingRight: '250px' }}>
<div
ref={scrollRef}
className="w-full bg-background overflow-x-auto overflow-y-hidden custom-scrollbar"
style={{
height: `${height}px`,
}}
onScroll={handleScroll}
>
{/* Spacer to create scrollable width */}
<div style={{ width: `${totalWidth}px`, height: `${height}px`, position: 'relative' }}>
<canvas
ref={canvasRef}
onClick={handleClick}
onMouseMove={handleMouseMove}
onMouseLeave={handleMouseLeave}
className="cursor-pointer"
style={{
position: 'sticky',
left: 0,
width: `${viewportWidth}px`,
height: `${height}px`,
}}
/>
</div>
</div>
{/* Hover tooltip */}
{hoverTime !== null && (
<div
className="absolute top-full mt-1 px-2 py-1 bg-popover border border-border rounded shadow-lg text-xs font-mono pointer-events-none z-10"
style={{
left: `${Math.min(
viewportWidth - 60 + controlsWidth,
Math.max(controlsWidth, (hoverTime / duration) * totalWidth - scrollLeft - 30 + controlsWidth)
)}px`,
}}
>
{formatTimeLabel(hoverTime, true)}
</div>
)}
</div>
);
}
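TimeScale leans on helpers from `@/lib/utils/timeline` that are not included in this diff. A sketch consistent with how they are called here, assuming the same 5 px-per-second base scale; the actual implementations may differ:

// Assumed helper implementations, matching the call sites in TimeScale.
export function getVisibleTimeRange(
  scrollLeft: number,
  viewportWidth: number,
  duration: number,
  zoom: number
): { start: number; end: number } {
  const totalWidth = Math.max(duration * zoom * 5, viewportWidth); // 5 px/s base scale
  const start = (scrollLeft / totalWidth) * duration;
  const end = ((scrollLeft + viewportWidth) / totalWidth) * duration;
  return { start, end: Math.min(end, duration) };
}

export function calculateTickInterval(visibleDuration: number): { major: number; minor: number } {
  // Pick the smallest "nice" major step that keeps roughly ten labels on screen.
  const steps = [0.1, 0.5, 1, 5, 10, 30, 60, 300, 600];
  const major = steps.find((s) => visibleDuration / s <= 10) ?? 600;
  return { major, minor: major / 5 };
}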

File diff suppressed because it is too large

View File

@@ -1,44 +1,230 @@
'use client';
import * as React from 'react';
import { Volume2, VolumeX } from 'lucide-react';
import { Circle, Headphones, MoreHorizontal, ChevronRight, ChevronDown, ChevronUp } from 'lucide-react';
import { CircularKnob } from '@/components/ui/CircularKnob';
import { TrackFader } from './TrackFader';
import { cn } from '@/lib/utils/cn';
export interface TrackControlsProps {
trackName: string;
trackColor: string;
collapsed: boolean;
volume: number;
pan: number;
peakLevel: number;
rmsLevel: number;
isMuted?: boolean;
isSolo?: boolean;
isRecordEnabled?: boolean;
showAutomation?: boolean;
showEffects?: boolean;
isRecording?: boolean;
mobileCollapsed?: boolean; // For mobile view collapsible controls
onNameChange: (name: string) => void;
onToggleCollapse: () => void;
onVolumeChange: (volume: number) => void;
onPanChange: (pan: number) => void;
onMuteToggle: () => void;
onSoloToggle?: () => void;
onRecordToggle?: () => void;
onAutomationToggle?: () => void;
onEffectsClick?: () => void;
onVolumeTouchStart?: () => void;
onVolumeTouchEnd?: () => void;
onPanTouchStart?: () => void;
onPanTouchEnd?: () => void;
onToggleMobileCollapse?: () => void;
className?: string;
}
export function TrackControls({
trackName,
trackColor,
collapsed,
volume,
pan,
peakLevel,
rmsLevel,
isMuted = false,
isSolo = false,
isRecordEnabled = false,
showAutomation = false,
showEffects = false,
isRecording = false,
mobileCollapsed = false,
onNameChange,
onToggleCollapse,
onVolumeChange,
onPanChange,
onMuteToggle,
onSoloToggle,
onRecordToggle,
onAutomationToggle,
onEffectsClick,
onVolumeTouchStart,
onVolumeTouchEnd,
onPanTouchStart,
onPanTouchEnd,
onToggleMobileCollapse,
className,
}: TrackControlsProps) {
return (
<div className={cn('flex flex-col items-center gap-2 py-2', className)}>
const [isEditingName, setIsEditingName] = React.useState(false);
const [editName, setEditName] = React.useState(trackName);
const handleNameClick = () => {
setIsEditingName(true);
setEditName(trackName);
};
const handleNameBlur = () => {
setIsEditingName(false);
if (editName.trim() && editName !== trackName) {
onNameChange(editName.trim());
} else {
setEditName(trackName);
}
};
const handleNameKeyDown = (e: React.KeyboardEvent) => {
if (e.key === 'Enter') {
handleNameBlur();
} else if (e.key === 'Escape') {
setIsEditingName(false);
setEditName(trackName);
}
};
// Mobile collapsed view - minimal controls (like master controls)
if (mobileCollapsed) {
return (
<div className={cn(
'flex flex-col items-center gap-2 px-3 py-2 bg-card/50 border border-accent/50 rounded-lg w-full sm:hidden',
className
)}>
<div className="flex items-center justify-between w-full">
<div className="flex items-center gap-1 flex-1">
<button
onClick={onToggleCollapse}
className="p-0.5 hover:bg-accent/20 rounded transition-colors flex-shrink-0"
title={collapsed ? 'Expand track' : 'Collapse track'}
>
{collapsed ? (
<ChevronRight className="h-3 w-3 text-muted-foreground" />
) : (
<ChevronDown className="h-3 w-3 text-muted-foreground" />
)}
</button>
<div
className="text-xs font-bold uppercase tracking-wider"
style={{ color: trackColor }}
>
{trackName}
</div>
</div>
{onToggleMobileCollapse && (
<button
onClick={onToggleMobileCollapse}
className="p-1 hover:bg-accent/20 rounded transition-colors"
title="Expand track controls"
>
<ChevronDown className="h-3 w-3 text-muted-foreground" />
</button>
)}
</div>
<div className="flex items-center gap-1 w-full justify-center">
{onRecordToggle && (
<button
onClick={onRecordToggle}
className={cn(
'h-7 w-7 rounded-full flex items-center justify-center transition-all',
isRecordEnabled
? isRecording
? 'bg-red-500 shadow-lg shadow-red-500/50 animate-pulse'
: 'bg-red-500 shadow-md shadow-red-500/30'
: 'bg-card hover:bg-accent border border-border/50'
)}
title={isRecordEnabled ? 'Record Armed' : 'Arm for Recording'}
>
<Circle className={cn('h-3.5 w-3.5', isRecordEnabled ? 'fill-white text-white' : 'text-muted-foreground')} />
</button>
)}
<button
onClick={onMuteToggle}
className={cn(
'h-7 w-7 rounded-md flex items-center justify-center transition-all text-xs font-bold',
isMuted
? 'bg-blue-500 text-white shadow-md shadow-blue-500/30'
: 'bg-card hover:bg-accent text-muted-foreground border border-border/50'
)}
title={isMuted ? 'Unmute' : 'Mute'}
>
M
</button>
{onSoloToggle && (
<button
onClick={onSoloToggle}
className={cn(
'h-7 w-7 rounded-md flex items-center justify-center transition-all text-xs font-bold',
isSolo
? 'bg-yellow-500 text-white shadow-md shadow-yellow-500/30'
: 'bg-card hover:bg-accent text-muted-foreground border border-border/50'
)}
title={isSolo ? 'Unsolo' : 'Solo'}
>
S
</button>
)}
<div className="flex-1 h-2 bg-muted rounded-full overflow-hidden">
<div
className={cn(
'h-full transition-all',
peakLevel > 0.95 ? 'bg-red-500' : peakLevel > 0.8 ? 'bg-yellow-500' : 'bg-green-500'
)}
style={{ width: `${peakLevel * 100}%` }}
/>
</div>
</div>
</div>
);
}
// Mobile expanded view - full controls (like master controls)
const mobileExpandedView = (
<div className={cn(
'flex flex-col items-center gap-3 px-3 py-3 bg-card/50 border border-accent/50 rounded-lg w-full sm:hidden',
className
)}>
{/* Header with collapse button */}
<div className="flex items-center justify-between w-full">
<button
onClick={onToggleCollapse}
className="p-0.5 hover:bg-accent/20 rounded transition-colors flex-shrink-0"
title={collapsed ? 'Expand track' : 'Collapse track'}
>
{collapsed ? (
<ChevronRight className="h-3 w-3 text-muted-foreground" />
) : (
<ChevronDown className="h-3 w-3 text-muted-foreground" />
)}
</button>
<div
className="text-xs font-bold uppercase tracking-wider flex-1 text-center"
style={{ color: trackColor }}
>
{trackName}
</div>
{onToggleMobileCollapse && (
<button
onClick={onToggleMobileCollapse}
className="p-0.5 hover:bg-accent/20 rounded transition-colors flex-shrink-0"
title="Collapse track controls"
>
<ChevronUp className="h-3 w-3 text-muted-foreground" />
</button>
)}
</div>
{/* Pan Control */}
<CircularKnob
value={pan}
@@ -49,37 +235,223 @@ export function TrackControls({
max={1}
step={0.01}
label="PAN"
size={40}
formatter={(value) => {
size={48}
formatValue={(value: number) => {
if (Math.abs(value) < 0.01) return 'C';
if (value < 0) return `${Math.abs(value * 100).toFixed(0)}L`;
return `${(value * 100).toFixed(0)}R`;
}}
/>
{/* Track Fader with Integrated Meters */}
<TrackFader
value={volume}
peakLevel={peakLevel}
rmsLevel={rmsLevel}
onChange={onVolumeChange}
onTouchStart={onVolumeTouchStart}
onTouchEnd={onVolumeTouchEnd}
/>
{/* Volume Fader - Full height, not compressed */}
<div className="flex-1 flex justify-center items-center w-full min-h-[160px]">
<TrackFader
value={volume}
peakLevel={peakLevel}
rmsLevel={rmsLevel}
onChange={onVolumeChange}
onTouchStart={onVolumeTouchStart}
onTouchEnd={onVolumeTouchEnd}
/>
</div>
{/* Mute Button */}
<button
onClick={onMuteToggle}
className={cn(
'w-8 h-6 rounded text-[10px] font-bold transition-colors border',
isMuted
? 'bg-red-500/20 hover:bg-red-500/30 border-red-500/50 text-red-500'
: 'bg-muted/20 hover:bg-muted/30 border-border/50 text-muted-foreground'
{/* Control buttons */}
<div className="flex items-center gap-1 w-full justify-center">
{onRecordToggle && (
<button
onClick={onRecordToggle}
className={cn(
'h-8 w-8 rounded-full flex items-center justify-center transition-all',
isRecordEnabled
? isRecording
? 'bg-red-500 shadow-lg shadow-red-500/50 animate-pulse'
: 'bg-red-500 shadow-md shadow-red-500/30'
: 'bg-card hover:bg-accent border border-border/50'
)}
title={isRecordEnabled ? 'Record Armed' : 'Arm for Recording'}
>
<Circle className={cn('h-3.5 w-3.5', isRecordEnabled ? 'fill-white text-white' : 'text-muted-foreground')} />
</button>
)}
title={isMuted ? 'Unmute' : 'Mute'}
>
M
</button>
<button
onClick={onMuteToggle}
className={cn(
'h-8 w-8 rounded-md flex items-center justify-center transition-all text-xs font-bold',
isMuted
? 'bg-blue-500 text-white shadow-md shadow-blue-500/30'
: 'bg-card hover:bg-accent text-muted-foreground border border-border/50'
)}
title={isMuted ? 'Unmute' : 'Mute'}
>
M
</button>
{onSoloToggle && (
<button
onClick={onSoloToggle}
className={cn(
'h-8 w-8 rounded-md flex items-center justify-center transition-all text-xs font-bold',
isSolo
? 'bg-yellow-500 text-white shadow-md shadow-yellow-500/30'
: 'bg-card hover:bg-accent text-muted-foreground border border-border/50'
)}
title={isSolo ? 'Unsolo' : 'Solo'}
>
S
</button>
)}
{onEffectsClick && (
<button
onClick={onEffectsClick}
className={cn(
'h-8 w-8 rounded-md flex items-center justify-center transition-all text-xs font-bold',
showEffects
? 'bg-purple-500 text-white shadow-md shadow-purple-500/30'
: 'bg-card hover:bg-accent text-muted-foreground border border-border/50'
)}
title="Effects"
>
FX
</button>
)}
</div>
</div>
);
return (
<>
{/* Mobile view - Show expanded or collapsed */}
{!mobileCollapsed && mobileExpandedView}
{/* Desktop/tablet view - hidden on mobile */}
<div className={cn(
'flex flex-col items-center gap-3 px-4 py-3 bg-card/50 border border-accent/50 rounded-lg hidden sm:flex',
className
)}>
{/* Track Name Header with Collapse Chevron */}
<div className="flex items-center gap-1 w-full">
<button
onClick={onToggleCollapse}
className="p-0.5 hover:bg-accent/20 rounded transition-colors flex-shrink-0"
title={collapsed ? 'Expand track' : 'Collapse track'}
>
{collapsed ? (
<ChevronRight className="h-3 w-3 text-muted-foreground" />
) : (
<ChevronDown className="h-3 w-3 text-muted-foreground" />
)}
</button>
<div className="flex-1 flex items-center justify-center min-w-0">
{isEditingName ? (
<input
type="text"
value={editName}
onChange={(e) => setEditName(e.target.value)}
onBlur={handleNameBlur}
onKeyDown={handleNameKeyDown}
autoFocus
className="w-24 text-[10px] font-bold uppercase tracking-wider text-center bg-transparent border-b focus:outline-none px-1"
style={{ color: trackColor, borderColor: trackColor }}
/>
) : (
<div
onClick={handleNameClick}
className="w-24 text-[10px] font-bold uppercase tracking-wider text-center cursor-text hover:bg-accent/10 px-1 rounded transition-colors truncate"
style={{ color: trackColor }}
title="Click to edit track name"
>
{trackName}
</div>
)}
</div>
{/* Spacer to balance the chevron and center the label */}
<div className="p-0.5 flex-shrink-0 w-4" />
</div>
{/* Pan Control - Top */}
<div className="flex justify-center w-full">
<CircularKnob
value={pan}
onChange={onPanChange}
onTouchStart={onPanTouchStart}
onTouchEnd={onPanTouchEnd}
min={-1}
max={1}
step={0.01}
label="PAN"
size={48}
formatValue={(value: number) => {
if (Math.abs(value) < 0.01) return 'C';
if (value < 0) return `${Math.abs(value * 100).toFixed(0)}L`;
return `${(value * 100).toFixed(0)}R`;
}}
/>
</div>
{/* Track Fader - Center (vertically centered in remaining space) */}
<div className="flex justify-center items-center flex-1 w-full">
<TrackFader
value={volume}
peakLevel={peakLevel}
rmsLevel={rmsLevel}
onChange={onVolumeChange}
onTouchStart={onVolumeTouchStart}
onTouchEnd={onVolumeTouchEnd}
/>
</div>
{/* Control Buttons - Bottom */}
<div className="flex flex-col gap-1 w-full">
{/* Control Buttons Row 1: R/M/S */}
<div className="flex items-center gap-1 w-full justify-center">
{/* Record Arm */}
{onRecordToggle && (
<button
onClick={onRecordToggle}
className={cn(
'h-8 w-8 rounded-md flex items-center justify-center transition-all text-[11px] font-bold',
isRecordEnabled
? 'bg-red-500 text-white shadow-md shadow-red-500/30'
: 'bg-card hover:bg-accent text-muted-foreground border border-border/50',
isRecording && 'animate-pulse'
)}
title="Arm track for recording"
>
<Circle className="h-3 w-3 fill-current" />
</button>
)}
{/* Mute Button */}
<button
onClick={onMuteToggle}
className={cn(
'h-8 w-8 rounded-md flex items-center justify-center transition-all text-[11px] font-bold',
isMuted
? 'bg-blue-500 text-white shadow-md shadow-blue-500/30'
: 'bg-card hover:bg-accent text-muted-foreground border border-border/50'
)}
title="Mute track"
>
M
</button>
{/* Solo Button */}
{onSoloToggle && (
<button
onClick={onSoloToggle}
className={cn(
'h-8 w-8 rounded-md flex items-center justify-center transition-all text-[11px] font-bold',
isSolo
? 'bg-yellow-500 text-black shadow-md shadow-yellow-500/30'
: 'bg-card hover:bg-accent text-muted-foreground border border-border/50'
)}
title="Solo track"
>
<Headphones className="h-3 w-3" />
</button>
)}
</div>
</div>
</div>
</>
);
}
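A brief usage sketch for wiring TrackControls from a parent track list: `track`, `levels`, and `updateTrack` below are hypothetical names standing in for whatever track state the parent owns; only the prop names come from the interface above.

// Hypothetical parent render: the parent owns the per-track mobile collapse state.
const [mobileCollapsed, setMobileCollapsed] = React.useState(true);

<TrackControls
  trackName={track.name}
  trackColor={track.color}
  collapsed={track.collapsed}
  volume={track.volume}
  pan={track.pan}
  peakLevel={levels.peak}
  rmsLevel={levels.rms}
  isMuted={track.muted}
  mobileCollapsed={mobileCollapsed}
  onToggleMobileCollapse={() => setMobileCollapsed((c) => !c)}
  onNameChange={(name) => updateTrack(track.id, { name })}
  onToggleCollapse={() => updateTrack(track.id, { collapsed: !track.collapsed })}
  onVolumeChange={(volume) => updateTrack(track.id, { volume })}
  onPanChange={(pan) => updateTrack(track.id, { pan })}
  onMuteToggle={() => updateTrack(track.id, { muted: !track.muted })}
/>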

View File

@@ -0,0 +1,222 @@
'use client';
import * as React from 'react';
import { Plus, ChevronDown, ChevronRight, Sparkles } from 'lucide-react';
import type { Track as TrackType } from '@/types/track';
import { Button } from '@/components/ui/Button';
import { cn } from '@/lib/utils/cn';
import { EffectDevice } from '@/components/effects/EffectDevice';
import { EffectBrowser } from '@/components/effects/EffectBrowser';
import type { EffectType } from '@/lib/audio/effects/chain';
export interface TrackExtensionsProps {
track: TrackType;
onUpdateTrack: (trackId: string, updates: Partial<TrackType>) => void;
onToggleEffect?: (effectId: string) => void;
onRemoveEffect?: (effectId: string) => void;
onUpdateEffect?: (effectId: string, parameters: any) => void;
onAddEffect?: (effectType: EffectType) => void;
asOverlay?: boolean; // When true, renders as a full overlay effect rack instead of the inline collapsible section
}
export function TrackExtensions({
track,
onUpdateTrack,
onToggleEffect,
onRemoveEffect,
onUpdateEffect,
onAddEffect,
asOverlay = false,
}: TrackExtensionsProps) {
const [effectBrowserOpen, setEffectBrowserOpen] = React.useState(false);
// Don't render if track is collapsed (unless it's an overlay, which handles its own visibility)
if (!asOverlay && track.collapsed) {
return null;
}
// Overlay mode: render full-screen effect rack
if (asOverlay) {
return (
<>
<div className="flex flex-col h-full bg-card/95 rounded-lg border border-border shadow-2xl">
{/* Header with close button */}
<div className="flex items-center justify-between px-4 py-3 border-b border-border bg-muted/50">
<div className="flex items-center gap-2">
<span className="text-sm font-medium">Effects</span>
<span className="text-xs text-muted-foreground">
({track.effectChain.effects.length})
</span>
</div>
<div className="flex items-center gap-2">
<Button
variant="ghost"
size="icon-sm"
onClick={() => setEffectBrowserOpen(true)}
title="Add effect"
>
<Plus className="h-4 w-4" />
</Button>
<Button
variant="ghost"
size="icon-sm"
onClick={() => onUpdateTrack(track.id, { showEffects: false })}
title="Close effects"
>
<ChevronDown className="h-4 w-4" />
</Button>
</div>
</div>
{/* Effects rack */}
<div className="flex-1 overflow-x-auto custom-scrollbar p-4">
<div className="flex h-full gap-4">
{track.effectChain.effects.length === 0 ? (
<div className="flex flex-col items-center justify-center w-full text-center gap-3">
<Sparkles className="h-12 w-12 text-muted-foreground/30" />
<div>
<p className="text-sm text-muted-foreground mb-1">No effects yet</p>
<p className="text-xs text-muted-foreground/70">
Click + to add an effect
</p>
</div>
</div>
) : (
track.effectChain.effects.map((effect) => (
<EffectDevice
key={effect.id}
effect={effect}
onToggleEnabled={() => onToggleEffect?.(effect.id)}
onRemove={() => onRemoveEffect?.(effect.id)}
onUpdateParameters={(params) => onUpdateEffect?.(effect.id, params)}
onToggleExpanded={() => {
const updatedEffects = track.effectChain.effects.map((e) =>
e.id === effect.id ? { ...e, expanded: !e.expanded } : e
);
onUpdateTrack(track.id, {
effectChain: { ...track.effectChain, effects: updatedEffects },
});
}}
/>
))
)}
</div>
</div>
</div>
{/* Effect Browser Dialog */}
<EffectBrowser
open={effectBrowserOpen}
onClose={() => setEffectBrowserOpen(false)}
onSelectEffect={(effectType) => {
if (onAddEffect) {
onAddEffect(effectType);
}
}}
/>
</>
);
}
// Original inline mode
return (
<>
{/* Effects Section (Collapsible, Full Width) */}
<div className="bg-muted/50 border-b border-border/50">
{/* Effects Header - clickable to toggle */}
<div
className="flex items-center gap-2 px-3 py-1.5 cursor-pointer hover:bg-accent/30 transition-colors"
onClick={() => {
onUpdateTrack(track.id, {
showEffects: !track.showEffects,
});
}}
>
{track.showEffects ? (
<ChevronDown className="h-3.5 w-3.5 text-muted-foreground flex-shrink-0" />
) : (
<ChevronRight className="h-3.5 w-3.5 text-muted-foreground flex-shrink-0" />
)}
{/* Show mini effect chain when collapsed */}
{!track.showEffects && track.effectChain.effects.length > 0 ? (
<div className="flex-1 flex items-center gap-1 overflow-x-auto custom-scrollbar">
{track.effectChain.effects.map((effect) => (
<div
key={effect.id}
className={cn(
'px-2 py-0.5 rounded text-[10px] font-medium flex-shrink-0',
effect.enabled
? 'bg-primary/20 text-primary border border-primary/30'
: 'bg-muted/30 text-muted-foreground border border-border'
)}
>
{effect.name}
</div>
))}
</div>
) : (
<span className="text-xs font-medium text-muted-foreground">
Devices ({track.effectChain.effects.length})
</span>
)}
<Button
variant="ghost"
size="icon-sm"
onClick={(e) => {
e.stopPropagation();
setEffectBrowserOpen(true);
}}
title="Add effect"
className="h-5 w-5 flex-shrink-0"
>
<Plus className="h-3 w-3" />
</Button>
</div>
{/* Horizontal scrolling device rack - expanded state */}
{track.showEffects && (
<div className="h-48 overflow-x-auto custom-scrollbar bg-muted/70 p-3">
<div className="flex h-full gap-3">
{track.effectChain.effects.length === 0 ? (
<div className="text-xs text-muted-foreground text-center py-8 w-full">
No devices. Click + to add an effect.
</div>
) : (
track.effectChain.effects.map((effect) => (
<EffectDevice
key={effect.id}
effect={effect}
onToggleEnabled={() => onToggleEffect?.(effect.id)}
onRemove={() => onRemoveEffect?.(effect.id)}
onUpdateParameters={(params) => onUpdateEffect?.(effect.id, params)}
onToggleExpanded={() => {
const updatedEffects = track.effectChain.effects.map((e) =>
e.id === effect.id ? { ...e, expanded: !e.expanded } : e
);
onUpdateTrack(track.id, {
effectChain: { ...track.effectChain, effects: updatedEffects },
});
}}
/>
))
)}
</div>
</div>
)}
</div>
{/* Effect Browser Dialog */}
<EffectBrowser
open={effectBrowserOpen}
onClose={() => setEffectBrowserOpen(false)}
onSelectEffect={(effectType) => {
if (onAddEffect) {
onAddEffect(effectType);
}
}}
/>
</>
);
}
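TrackExtensions only emits `onAddEffect(effectType)`; constructing and storing the effect is left to the parent. A minimal sketch of such a handler, assuming a `createEffect` factory (not shown in this diff) that returns an object with at least `id`, `name`, and `enabled`:

// Hypothetical parent handler: append a newly created effect to the track's chain.
const handleAddEffect = (effectType: EffectType) => {
  const newEffect = createEffect(effectType); // assumed factory from the effects module
  onUpdateTrack(track.id, {
    effectChain: {
      ...track.effectChain,
      effects: [...track.effectChain.effects, newEffect],
    },
  });
};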

View File

@@ -60,31 +60,66 @@ export function TrackFader({
onTouchEnd?.();
}, [onTouchEnd]);
const handleTouchStart = (e: React.TouchEvent) => {
e.preventDefault();
const touch = e.touches[0];
setIsDragging(true);
onTouchStart?.();
updateValue(touch.clientY);
};
const handleTouchMove = React.useCallback(
(e: TouchEvent) => {
if (!isDragging || e.touches.length === 0) return;
const touch = e.touches[0];
updateValue(touch.clientY);
},
[isDragging]
);
const handleTouchEnd = React.useCallback(() => {
setIsDragging(false);
onTouchEnd?.();
}, [onTouchEnd]);
const updateValue = (clientY: number) => {
if (!containerRef.current) return;
const rect = containerRef.current.getBoundingClientRect();
const y = clientY - rect.top;
// Track has 32px (2rem) padding on top and bottom (top-8 bottom-8)
const trackPadding = 32;
const trackHeight = rect.height - (trackPadding * 2);
// Clamp y to track bounds
const clampedY = Math.max(trackPadding, Math.min(rect.height - trackPadding, y));
// Inverted: top = max (1), bottom = min (0)
const percentage = Math.max(0, Math.min(1, 1 - (y / rect.height)));
onChange(percentage);
// Map clampedY from [trackPadding, height-trackPadding] to [1, 0]
const percentage = 1 - ((clampedY - trackPadding) / trackHeight);
onChange(Math.max(0, Math.min(1, percentage)));
};
React.useEffect(() => {
if (isDragging) {
window.addEventListener('mousemove', handleMouseMove);
window.addEventListener('mouseup', handleMouseUp);
window.addEventListener('touchmove', handleTouchMove);
window.addEventListener('touchend', handleTouchEnd);
return () => {
window.removeEventListener('mousemove', handleMouseMove);
window.removeEventListener('mouseup', handleMouseUp);
window.removeEventListener('touchmove', handleTouchMove);
window.removeEventListener('touchend', handleTouchEnd);
};
}
}, [isDragging, handleMouseMove, handleMouseUp]);
}, [isDragging, handleMouseMove, handleMouseUp, handleTouchMove, handleTouchEnd]);
return (
<div className={cn('flex gap-2', className)}>
<div className={cn('flex gap-3', className)} style={{ marginLeft: '16px' }}>
{/* dB Labels (Left) */}
<div className="flex flex-col justify-between text-[9px] font-mono text-muted-foreground py-1">
<div className="flex flex-col justify-between text-[10px] font-mono text-muted-foreground py-1">
<span>0</span>
<span>-12</span>
<span>-24</span>
@@ -94,11 +129,12 @@ export function TrackFader({
{/* Fader Container */}
<div
ref={containerRef}
className="relative w-10 h-32 bg-background/50 rounded-md border border-border/50 cursor-pointer"
className="relative w-12 h-40 bg-background/50 rounded-md border border-border/50 cursor-pointer"
onMouseDown={handleMouseDown}
onTouchStart={handleTouchStart}
>
{/* Peak Meter (Horizontal Bar - Top) */}
<div className="absolute inset-x-1.5 top-1.5 h-2.5 bg-background/80 rounded-sm overflow-hidden border border-border/30">
<div className="absolute inset-x-2 top-2 h-3 bg-background/80 rounded-sm overflow-hidden border border-border/30">
<div
className="absolute left-0 top-0 bottom-0 transition-all duration-75 ease-out"
style={{ width: `${Math.max(0, Math.min(100, peakWidth))}%` }}
@@ -113,41 +149,43 @@ export function TrackFader({
</div>
{/* RMS Meter (Horizontal Bar - Bottom) */}
<div className="absolute inset-x-1.5 bottom-1.5 h-2.5 bg-background/80 rounded-sm overflow-hidden border border-border/30">
<div className="absolute inset-x-2 bottom-2 h-3 bg-background/80 rounded-sm overflow-hidden border border-border/30">
<div
className="absolute left-0 top-0 bottom-0 transition-all duration-150 ease-out"
style={{ width: `${Math.max(0, Math.min(100, rmsWidth))}%` }}
>
<div className={cn(
'w-full h-full',
rmsDb > -3 ? 'bg-red-400' :
rmsDb > -6 ? 'bg-yellow-400' :
'bg-green-400'
rmsDb > -3 ? 'bg-red-500' :
rmsDb > -6 ? 'bg-yellow-500' :
'bg-green-500'
)} />
</div>
</div>
{/* Fader Track */}
<div className="absolute top-6 bottom-6 left-1/2 -translate-x-1/2 w-1 bg-muted/50 rounded-full" />
<div className="absolute top-8 bottom-8 left-1/2 -translate-x-1/2 w-1.5 bg-muted/50 rounded-full" />
{/* Fader Handle */}
<div
className="absolute left-1/2 -translate-x-1/2 w-9 h-3.5 bg-primary/80 border-2 border-primary rounded-md shadow-lg cursor-grab active:cursor-grabbing pointer-events-none transition-all"
className="absolute left-1/2 -translate-x-1/2 w-10 h-4 bg-primary/80 border-2 border-primary rounded-md shadow-lg cursor-grab active:cursor-grabbing pointer-events-none transition-all"
style={{
// Inverted: value 1 = top, value 0 = bottom
top: `calc(${(1 - value) * 100}% - 0.4375rem)`,
// Inverted: value 1 = top of track (20%), value 0 = bottom of track (80%)
// Track has top-8 bottom-8 padding (20% and 80% of h-40 container)
// Handle moves within 60% range (from 20% to 80%)
top: `calc(${20 + (1 - value) * 60}% - 0.5rem)`,
}}
>
{/* Handle grip lines */}
<div className="absolute inset-0 flex items-center justify-center gap-0.5">
<div className="h-1.5 w-px bg-primary-foreground/30" />
<div className="h-1.5 w-px bg-primary-foreground/30" />
<div className="h-1.5 w-px bg-primary-foreground/30" />
<div className="h-2 w-px bg-primary-foreground/30" />
<div className="h-2 w-px bg-primary-foreground/30" />
<div className="h-2 w-px bg-primary-foreground/30" />
</div>
</div>
{/* dB Scale Markers */}
<div className="absolute inset-0 px-1.5 py-6 pointer-events-none">
<div className="absolute inset-0 px-2 py-8 pointer-events-none">
<div className="relative h-full">
{/* -12 dB */}
<div className="absolute left-0 right-0 h-px bg-border/20" style={{ top: '50%' }} />
@@ -159,17 +197,49 @@ export function TrackFader({
</div>
</div>
{/* Value Display (Right) */}
<div className="flex flex-col justify-center items-start text-[9px] font-mono">
{/* Value and Level Display (Right) */}
<div className="flex flex-col justify-between items-start text-[9px] font-mono py-1 w-[36px]">
{/* Current dB Value */}
<div className={cn(
'font-bold text-[10px]',
'font-bold text-[11px]',
valueDb > -3 ? 'text-red-500' :
valueDb > -6 ? 'text-yellow-500' :
'text-green-500'
)}>
{valueDb > -60 ? `${valueDb.toFixed(1)}` : '-∞'}
</div>
{/* Spacer */}
<div className="flex-1" />
{/* Peak Level */}
<div className="flex flex-col items-start">
<span className="text-muted-foreground/60">PK</span>
<span className={cn(
'font-mono text-[10px]',
peakDb > -3 ? 'text-red-500' :
peakDb > -6 ? 'text-yellow-500' :
'text-green-500'
)}>
{peakDb > -60 ? `${peakDb.toFixed(1)}` : '-∞'}
</span>
</div>
{/* RMS Level */}
<div className="flex flex-col items-start">
<span className="text-muted-foreground/60">RM</span>
<span className={cn(
'font-mono text-[10px]',
rmsDb > -3 ? 'text-red-500' :
rmsDb > -6 ? 'text-yellow-500' :
'text-green-500'
)}>
{rmsDb > -60 ? `${rmsDb.toFixed(1)}` : '-∞'}
</span>
</div>
{/* dB Label */}
<span className="text-muted-foreground/60 text-[8px]">dB</span>
</div>
</div>
);
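The `valueDb`, `peakDb`, `rmsDb`, `peakWidth`, and `rmsWidth` values used in this hunk are computed earlier in TrackFader and are not shown here. A sketch of the presumed derivation, assuming the usual 20 * log10 linear-to-dB conversion and a -60…0 dB meter range:

// Assumed derivation of the dB readouts and meter widths referenced above.
function linearToDb(linear: number): number {
  return linear > 0 ? 20 * Math.log10(linear) : -Infinity;
}

const valueDb = linearToDb(value);      // fader gain in dB
const peakDb = linearToDb(peakLevel);   // instantaneous peak in dBFS
const rmsDb = linearToDb(rmsLevel);     // RMS level in dBFS

// Map -60…0 dB onto a 0…100% horizontal meter width.
const dbToWidth = (db: number) => ((Math.max(-60, Math.min(0, db)) + 60) / 60) * 100;
const peakWidth = dbToWidth(peakDb);
const rmsWidth = dbToWidth(rmsDb);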

File diff suppressed because it is too large

View File

@@ -91,17 +91,51 @@ export function CircularKnob({
onTouchEnd?.();
}, [onTouchEnd]);
const handleTouchStart = React.useCallback(
(e: React.TouchEvent) => {
e.preventDefault();
const touch = e.touches[0];
setIsDragging(true);
dragStartRef.current = {
x: touch.clientX,
y: touch.clientY,
value,
};
onTouchStart?.();
},
[value, onTouchStart]
);
const handleTouchMove = React.useCallback(
(e: TouchEvent) => {
if (isDragging && e.touches.length > 0) {
const touch = e.touches[0];
updateValue(touch.clientX, touch.clientY);
}
},
[isDragging, updateValue]
);
const handleTouchEnd = React.useCallback(() => {
setIsDragging(false);
onTouchEnd?.();
}, [onTouchEnd]);
React.useEffect(() => {
if (isDragging) {
window.addEventListener('mousemove', handleMouseMove);
window.addEventListener('mouseup', handleMouseUp);
window.addEventListener('touchmove', handleTouchMove);
window.addEventListener('touchend', handleTouchEnd);
return () => {
window.removeEventListener('mousemove', handleMouseMove);
window.removeEventListener('mouseup', handleMouseUp);
window.removeEventListener('touchmove', handleTouchMove);
window.removeEventListener('touchend', handleTouchEnd);
};
}
}, [isDragging, handleMouseMove, handleMouseUp]);
}, [isDragging, handleMouseMove, handleMouseUp, handleTouchMove, handleTouchEnd]);
// Calculate rotation angle (-135deg to 135deg, 270deg range)
const percentage = (value - min) / (max - min);
@@ -115,6 +149,28 @@ export function CircularKnob({
? `L${Math.abs(Math.round(value * 100))}`
: `R${Math.round(value * 100)}`;
// Calculate arc parameters for center-based rendering
const isNearCenter = Math.abs(value) < 0.01;
const centerPercentage = 0.5; // Center position (50%)
// Arc goes from center to current value
let arcStartPercentage: number;
let arcLength: number;
if (value < -0.01) {
// Left side: arc from value to center
arcStartPercentage = percentage;
arcLength = centerPercentage - percentage;
} else if (value > 0.01) {
// Right side: arc from center to value
arcStartPercentage = centerPercentage;
arcLength = percentage - centerPercentage;
} else {
// Center: no arc
arcStartPercentage = centerPercentage;
arcLength = 0;
}
return (
<div className={cn('flex flex-col items-center gap-1', className)}>
{label && (
@@ -126,6 +182,7 @@ export function CircularKnob({
<div
ref={knobRef}
onMouseDown={handleMouseDown}
onTouchStart={handleTouchStart}
className="relative cursor-pointer select-none"
style={{ width: size, height: size }}
>
@@ -147,19 +204,21 @@ export function CircularKnob({
className="text-muted/30"
/>
{/* Value arc */}
<circle
cx={size / 2}
cy={size / 2}
r={size / 2 - 4}
fill="none"
stroke="currentColor"
strokeWidth="3"
strokeLinecap="round"
className="text-primary"
strokeDasharray={`${(percentage * 270 * Math.PI * (size / 2 - 4)) / 180} ${(Math.PI * 2 * (size / 2 - 4))}`}
transform={`rotate(-225 ${size / 2} ${size / 2})`}
/>
{/* Value arc - only show when not centered */}
{!isNearCenter && (
<circle
cx={size / 2}
cy={size / 2}
r={size / 2 - 4}
fill="none"
stroke="currentColor"
strokeWidth="3"
strokeLinecap="round"
className="text-primary"
strokeDasharray={`${(arcLength * 270 * Math.PI * (size / 2 - 4)) / 180} ${(Math.PI * 2 * (size / 2 - 4))}`}
transform={`rotate(${-225 + arcStartPercentage * 270} ${size / 2} ${size / 2})`}
/>
)}
</svg>
{/* Knob body */}
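To make the stroke-dasharray math above concrete: the knob sweeps 270°, so the visible dash length is the arc radius times the swept angle in radians, and the gap is the full circumference so nothing else of the circle is drawn. A worked example for a 48 px knob turned fully right:

// Worked example of the dasharray/rotation math (size = 48, value = +1.0).
const size = 48;
const r = size / 2 - 4;                               // 20 px arc radius
const arcStartPercentage = 0.5;                       // arc starts at the center detent
const arcLength = 0.5;                                // center -> full right is half the range

const dash = (arcLength * 270 * Math.PI * r) / 180;   // ≈ 47.1 px of visible arc
const gap = Math.PI * 2 * r;                          // ≈ 125.7 px, hides the rest of the circle
const rotation = -225 + arcStartPercentage * 270;     // -90°: dash begins at 12 o'clock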

View File

@@ -144,7 +144,7 @@ export function CommandPalette({ actions, className }: CommandPaletteProps) {
</div>
{/* Results */}
<div className="max-h-96 overflow-y-auto p-2">
<div className="max-h-96 overflow-y-auto custom-scrollbar p-2">
{Object.keys(groupedActions).length === 0 ? (
<div className="p-8 text-center text-muted-foreground text-sm">
No commands found

View File

@@ -102,7 +102,7 @@ export function Modal({
</div>
{/* Content */}
<div className="flex-1 overflow-y-auto p-4">
<div className="flex-1 overflow-y-auto custom-scrollbar p-4">
{children}
</div>

View File

@@ -127,7 +127,14 @@ export function interpolateAutomationValue(
return prevPoint.value;
}
// Linear interpolation
// Handle bezier curve
if (prevPoint.curve === 'bezier') {
const timeDelta = nextPoint.time - prevPoint.time;
const t = (time - prevPoint.time) / timeDelta;
return interpolateBezier(prevPoint, nextPoint, t);
}
// Linear interpolation (default)
const timeDelta = nextPoint.time - prevPoint.time;
const valueDelta = nextPoint.value - prevPoint.value;
const progress = (time - prevPoint.time) / timeDelta;
@@ -139,6 +146,117 @@ export function interpolateAutomationValue(
return 0;
}
/**
* Interpolate value using cubic Bezier curve
* Uses the control handles from both points to create smooth curves
*/
function interpolateBezier(
p0: AutomationPoint,
p1: AutomationPoint,
t: number
): number {
// Default handle positions if not specified
// Out handle defaults to 1/3 towards next point
// In handle defaults to 1/3 back from current point
const timeDelta = p1.time - p0.time;
// Control point 1 (out handle from p0)
const c1x = p0.handleOut?.x ?? timeDelta / 3;
const c1y = p0.handleOut?.y ?? 0;
// Control point 2 (in handle from p1)
const c2x = p1.handleIn?.x ?? -timeDelta / 3;
const c2y = p1.handleIn?.y ?? 0;
// Convert handles to absolute positions
const cp1Value = p0.value + c1y;
const cp2Value = p1.value + c2y;
// Cubic Bezier formula: B(t) = (1-t)³P₀ + 3(1-t)²tP₁ + 3(1-t)t²P₂ + t³P₃
const mt = 1 - t;
const mt2 = mt * mt;
const mt3 = mt2 * mt;
const t2 = t * t;
const t3 = t2 * t;
const value =
mt3 * p0.value +
3 * mt2 * t * cp1Value +
3 * mt * t2 * cp2Value +
t3 * p1.value;
return value;
}
/**
* Create smooth bezier handles for a point based on surrounding points
* This creates an "auto-smooth" effect similar to DAWs
*/
export function createSmoothHandles(
prevPoint: AutomationPoint | null,
currentPoint: AutomationPoint,
nextPoint: AutomationPoint | null
): { handleIn: { x: number; y: number }; handleOut: { x: number; y: number } } {
// If no surrounding points, return horizontal handles
if (!prevPoint && !nextPoint) {
return {
handleIn: { x: -0.1, y: 0 },
handleOut: { x: 0.1, y: 0 },
};
}
// Calculate slope from surrounding points
let slope = 0;
if (prevPoint && nextPoint) {
// Use average slope from both neighbors
const timeDelta = nextPoint.time - prevPoint.time;
const valueDelta = nextPoint.value - prevPoint.value;
slope = valueDelta / timeDelta;
} else if (nextPoint) {
// Only have next point
const timeDelta = nextPoint.time - currentPoint.time;
const valueDelta = nextPoint.value - currentPoint.value;
slope = valueDelta / timeDelta;
} else if (prevPoint) {
// Only have previous point
const timeDelta = currentPoint.time - prevPoint.time;
const valueDelta = currentPoint.value - prevPoint.value;
slope = valueDelta / timeDelta;
}
// Create handles at a fixed horizontal offset, sloped along the estimated tangent
const handleDistance = 0.1; // Fixed distance for smooth curves
const handleY = slope * handleDistance;
return {
handleIn: { x: -handleDistance, y: -handleY },
handleOut: { x: handleDistance, y: handleY },
};
}
/**
* Generate points along a bezier curve for rendering
* Returns array of {time, value} points
*/
export function generateBezierCurvePoints(
p0: AutomationPoint,
p1: AutomationPoint,
numPoints: number = 50
): Array<{ time: number; value: number }> {
const points: Array<{ time: number; value: number }> = [];
const timeDelta = p1.time - p0.time;
for (let i = 0; i <= numPoints; i++) {
const t = i / numPoints;
const time = p0.time + t * timeDelta;
const value = interpolateBezier(p0, p1, t);
points.push({ time, value });
}
return points;
}
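A short usage sketch tying these helpers together; the literal AutomationPoint objects below are illustrative and may need any additional required fields (such as an id) from the real type:

// Hypothetical usage: smooth two points, then sample the curve for rendering.
const p0: AutomationPoint = { time: 0, value: 0.2, curve: 'bezier' };
const p1: AutomationPoint = { time: 2, value: 0.8, curve: 'bezier' };

const handles = createSmoothHandles(null, p0, p1);
p0.handleOut = handles.handleOut;   // slope toward p1: { x: 0.1, y: 0.03 }

const curve = generateBezierCurvePoints(p0, p1, 25);
// curve[0]  -> { time: 0, value: 0.2 }
// curve[25] -> { time: 2, value: 0.8 }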
/**
* Apply automation value to track parameter
*/

View File

@@ -3,22 +3,213 @@
*/
import { getAudioContext } from './context';
import { checkFileMemoryLimit, type MemoryCheckResult } from '../utils/memory-limits';
export interface ImportOptions {
convertToMono?: boolean;
targetSampleRate?: number; // If specified, resample to this rate
normalizeOnImport?: boolean;
}
export interface AudioFileInfo {
buffer: AudioBuffer;
metadata: AudioMetadata;
}
export interface AudioMetadata {
fileName: string;
fileSize: number;
fileType: string;
duration: number;
sampleRate: number;
channels: number;
bitDepth?: number;
codec?: string;
}
/**
* Decode an audio file to AudioBuffer
* Decode an audio file to AudioBuffer with optional conversions
*/
export async function decodeAudioFile(file: File): Promise<AudioBuffer> {
export async function decodeAudioFile(
file: File,
options: ImportOptions = {}
): Promise<AudioBuffer> {
const arrayBuffer = await file.arrayBuffer();
const audioContext = getAudioContext();
try {
const audioBuffer = await audioContext.decodeAudioData(arrayBuffer);
let audioBuffer = await audioContext.decodeAudioData(arrayBuffer);
// Apply conversions if requested
if (options.convertToMono && audioBuffer.numberOfChannels > 1) {
audioBuffer = convertToMono(audioBuffer);
}
if (options.targetSampleRate && audioBuffer.sampleRate !== options.targetSampleRate) {
audioBuffer = await resampleAudioBuffer(audioBuffer, options.targetSampleRate);
}
if (options.normalizeOnImport) {
audioBuffer = normalizeAudioBuffer(audioBuffer);
}
return audioBuffer;
} catch (error) {
throw new Error(`Failed to decode audio file: ${error}`);
}
}
/**
* Decode audio file and return both buffer and metadata
*/
export async function importAudioFile(
file: File,
options: ImportOptions = {}
): Promise<AudioFileInfo> {
const audioBuffer = await decodeAudioFile(file, options);
const metadata = extractMetadata(file, audioBuffer);
return {
buffer: audioBuffer,
metadata,
};
}
/**
* Convert stereo (or multi-channel) audio to mono
*/
function convertToMono(audioBuffer: AudioBuffer): AudioBuffer {
const audioContext = getAudioContext();
const numberOfChannels = audioBuffer.numberOfChannels;
if (numberOfChannels === 1) {
return audioBuffer; // Already mono
}
// Create a new mono buffer
const monoBuffer = audioContext.createBuffer(
1,
audioBuffer.length,
audioBuffer.sampleRate
);
const monoData = monoBuffer.getChannelData(0);
// Mix all channels equally
for (let i = 0; i < audioBuffer.length; i++) {
let sum = 0;
for (let channel = 0; channel < numberOfChannels; channel++) {
sum += audioBuffer.getChannelData(channel)[i];
}
monoData[i] = sum / numberOfChannels;
}
return monoBuffer;
}
/**
* Resample audio buffer to a different sample rate
*/
async function resampleAudioBuffer(
audioBuffer: AudioBuffer,
targetSampleRate: number
): Promise<AudioBuffer> {
const audioContext = getAudioContext();
// Create an offline context at the target sample rate
const offlineContext = new OfflineAudioContext(
audioBuffer.numberOfChannels,
Math.ceil(audioBuffer.duration * targetSampleRate),
targetSampleRate
);
// Create a buffer source
const source = offlineContext.createBufferSource();
source.buffer = audioBuffer;
source.connect(offlineContext.destination);
source.start(0);
// Render the audio at the new sample rate
const resampledBuffer = await offlineContext.startRendering();
return resampledBuffer;
}
/**
* Normalize audio buffer to peak amplitude
*/
function normalizeAudioBuffer(audioBuffer: AudioBuffer): AudioBuffer {
const audioContext = getAudioContext();
// Find peak amplitude across all channels
let peak = 0;
for (let channel = 0; channel < audioBuffer.numberOfChannels; channel++) {
const channelData = audioBuffer.getChannelData(channel);
for (let i = 0; i < channelData.length; i++) {
const abs = Math.abs(channelData[i]);
if (abs > peak) peak = abs;
}
}
if (peak === 0 || peak === 1.0) {
return audioBuffer; // Already normalized or silent
}
// Create normalized buffer
const normalizedBuffer = audioContext.createBuffer(
audioBuffer.numberOfChannels,
audioBuffer.length,
audioBuffer.sampleRate
);
// Apply normalization with 1% headroom
const scale = 0.99 / peak;
for (let channel = 0; channel < audioBuffer.numberOfChannels; channel++) {
const inputData = audioBuffer.getChannelData(channel);
const outputData = normalizedBuffer.getChannelData(channel);
for (let i = 0; i < inputData.length; i++) {
outputData[i] = inputData[i] * scale;
}
}
return normalizedBuffer;
}
/**
* Extract metadata from file and audio buffer
*/
function extractMetadata(file: File, audioBuffer: AudioBuffer): AudioMetadata {
// Detect codec from file extension or MIME type
const codec = detectCodec(file);
return {
fileName: file.name,
fileSize: file.size,
fileType: file.type || 'unknown',
duration: audioBuffer.duration,
sampleRate: audioBuffer.sampleRate,
channels: audioBuffer.numberOfChannels,
codec,
};
}
/**
* Detect audio codec from file
*/
function detectCodec(file: File): string {
const ext = file.name.split('.').pop()?.toLowerCase();
const mimeType = file.type.toLowerCase();
if (mimeType.includes('wav') || ext === 'wav') return 'WAV (PCM)';
if (mimeType.includes('mpeg') || mimeType.includes('mp3') || ext === 'mp3') return 'MP3';
if (mimeType.includes('ogg') || ext === 'ogg') return 'OGG Vorbis';
if (mimeType.includes('flac') || ext === 'flac') return 'FLAC';
if (mimeType.includes('m4a') || mimeType.includes('aac') || ext === 'm4a') return 'AAC (M4A)';
if (ext === 'aiff' || ext === 'aif') return 'AIFF';
if (mimeType.includes('webm') || ext === 'webm') return 'WebM Opus';
return 'Unknown';
}
/**
* Get audio file metadata without decoding the entire file
*/
@@ -50,10 +241,21 @@ export function isSupportedAudioFormat(file: File): boolean {
'audio/aac',
'audio/m4a',
'audio/x-m4a',
'audio/aiff',
'audio/x-aiff',
];
return supportedFormats.includes(file.type) ||
/\.(wav|mp3|ogg|webm|flac|aac|m4a)$/i.test(file.name);
/\.(wav|mp3|ogg|webm|flac|aac|m4a|aiff|aif)$/i.test(file.name);
}
/**
* Check memory requirements for an audio file before decoding
* @param file File to check
* @returns Memory check result with warning if file is large
*/
export function checkAudioFileMemory(file: File): MemoryCheckResult {
return checkFileMemoryLimit(file.size);
}
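// Example (sketch): checking memory before decoding, then importing with conversions applied.
// Assumes `file` comes from a file input or drag-and-drop handler.
async function importWithChecks(file: File): Promise<AudioFileInfo> {
  const memCheck = checkAudioFileMemory(file);
  if (!memCheck.allowed) {
    throw new Error(memCheck.warning ?? 'File is too large to import safely');
  }
  const info = await importAudioFile(file, {
    convertToMono: true,
    targetSampleRate: 48000,
    normalizeOnImport: false,
  });
  console.log(`${info.metadata.fileName}: ${info.metadata.duration.toFixed(2)}s, ${info.metadata.codec}`);
  return info;
}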
/**

View File

@@ -1,13 +1,15 @@
/**
* Audio export utilities
* Supports WAV export with various bit depths
* Supports WAV, MP3, and FLAC export
*/
export interface ExportOptions {
format: 'wav';
bitDepth: 16 | 24 | 32;
format: 'wav' | 'mp3' | 'flac';
bitDepth?: 16 | 24 | 32; // For WAV and FLAC
sampleRate?: number; // If different from source, will resample
normalize?: boolean; // Normalize to prevent clipping
bitrate?: number; // For MP3 (kbps): 128, 192, 256, 320
quality?: number; // For FLAC compression: 0-9 (0=fast/large, 9=slow/small)
}
/**
@@ -17,7 +19,8 @@ export function audioBufferToWav(
audioBuffer: AudioBuffer,
options: ExportOptions = { format: 'wav', bitDepth: 16 }
): ArrayBuffer {
const { bitDepth, normalize } = options;
const bitDepth = options.bitDepth ?? 16;
const { normalize } = options;
const numberOfChannels = audioBuffer.numberOfChannels;
const sampleRate = audioBuffer.sampleRate;
const length = audioBuffer.length;
@@ -126,6 +129,126 @@ export function downloadArrayBuffer(
URL.revokeObjectURL(url);
}
/**
* Convert an AudioBuffer to MP3
*/
export async function audioBufferToMp3(
audioBuffer: AudioBuffer,
options: ExportOptions = { format: 'mp3', bitrate: 192 }
): Promise<ArrayBuffer> {
// Import Mp3Encoder from lamejs
const { Mp3Encoder } = await import('lamejs/src/js/index.js');
const { bitrate = 192, normalize } = options;
const numberOfChannels = Math.min(audioBuffer.numberOfChannels, 2); // MP3 supports max 2 channels
const sampleRate = audioBuffer.sampleRate;
const samples = audioBuffer.length;
// Get channel data
const left = audioBuffer.getChannelData(0);
const right = numberOfChannels > 1 ? audioBuffer.getChannelData(1) : left;
// Find peak if normalizing
let peak = 1.0;
if (normalize) {
peak = 0;
for (let i = 0; i < samples; i++) {
peak = Math.max(peak, Math.abs(left[i]), Math.abs(right[i]));
}
if (peak === 0) peak = 1.0;
else peak = peak * 1.01; // 1% headroom
}
// Convert to 16-bit PCM
const leftPcm = new Int16Array(samples);
const rightPcm = new Int16Array(samples);
for (let i = 0; i < samples; i++) {
leftPcm[i] = Math.max(-32768, Math.min(32767, (left[i] / peak) * 32767));
rightPcm[i] = Math.max(-32768, Math.min(32767, (right[i] / peak) * 32767));
}
// Create MP3 encoder
const mp3encoder = new Mp3Encoder(numberOfChannels, sampleRate, bitrate);
const mp3Data: Int8Array[] = [];
const sampleBlockSize = 1152; // Standard MP3 frame size
// Encode in blocks
for (let i = 0; i < samples; i += sampleBlockSize) {
const leftChunk = leftPcm.subarray(i, Math.min(i + sampleBlockSize, samples));
const rightChunk = numberOfChannels > 1
? rightPcm.subarray(i, Math.min(i + sampleBlockSize, samples))
: leftChunk;
const mp3buf = mp3encoder.encodeBuffer(leftChunk, rightChunk);
if (mp3buf.length > 0) {
mp3Data.push(mp3buf);
}
}
// Flush remaining data
const mp3buf = mp3encoder.flush();
if (mp3buf.length > 0) {
mp3Data.push(mp3buf);
}
// Combine all chunks
const totalLength = mp3Data.reduce((acc, arr) => acc + arr.length, 0);
const result = new Uint8Array(totalLength);
let offset = 0;
for (const chunk of mp3Data) {
result.set(chunk, offset);
offset += chunk.length;
}
return result.buffer;
}
/**
 * Convert an AudioBuffer to a FLAC-labelled container
 * Note: this is not a standards-compliant FLAC stream; it wraps WAV data in zlib (DEFLATE)
 * compression inside a small custom header, so it needs a matching reader to decode
*/
export async function audioBufferToFlac(
audioBuffer: AudioBuffer,
options: ExportOptions = { format: 'flac', bitDepth: 16 }
): Promise<ArrayBuffer> {
// For true FLAC encoding, we'd need a proper FLAC encoder
// As a workaround, we'll create a compressed WAV using fflate
const fflate = await import('fflate');
const bitDepth = options.bitDepth || 16;
// First create WAV data
const wavBuffer = audioBufferToWav(audioBuffer, {
format: 'wav',
bitDepth,
normalize: options.normalize,
});
  // Compress with zlib (DEFLATE): general-purpose lossless compression, not FLAC's linear-prediction coding
const quality = Math.max(0, Math.min(9, options.quality || 6)) as 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9;
const compressed = fflate.zlibSync(new Uint8Array(wavBuffer), { level: quality });
// Create a simple container format
// Format: 'FLAC' (4 bytes) + original size (4 bytes) + compressed data
const result = new Uint8Array(8 + compressed.length);
const view = new DataView(result.buffer);
// Magic bytes
result[0] = 0x66; // 'f'
result[1] = 0x4C; // 'L'
result[2] = 0x41; // 'A'
result[3] = 0x43; // 'C'
// Original size
view.setUint32(4, wavBuffer.byteLength, false);
// Compressed data
result.set(compressed, 8);
return result.buffer;
}
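// Example (sketch): reading back the container written above. Because the payload is
// zlib-compressed WAV rather than a real FLAC stream, it needs this matching reader.
async function readFlacContainer(data: ArrayBuffer): Promise<Uint8Array> {
  const fflate = await import('fflate');
  const bytes = new Uint8Array(data);
  const magic = String.fromCharCode(bytes[0], bytes[1], bytes[2], bytes[3]);
  if (magic !== 'fLAC') {
    throw new Error('Not a container produced by audioBufferToFlac');
  }
  const originalSize = new DataView(data).getUint32(4, false); // big-endian, matches the writer
  const wav = fflate.unzlibSync(bytes.subarray(8));
  if (wav.byteLength !== originalSize) {
    throw new Error('Size mismatch after decompression');
  }
  return wav; // raw WAV bytes, ready for decodeAudioData or download
}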
// Helper to write string to DataView
function writeString(view: DataView, offset: number, string: string): void {
for (let i = 0; i < string.length; i++) {

View File

@@ -17,7 +17,7 @@ export function generateTrackId(): string {
/**
* Create a new empty track
*/
export function createTrack(name?: string, color?: TrackColor): Track {
export function createTrack(name?: string, color?: TrackColor, height?: number): Track {
const colors: TrackColor[] = ['blue', 'green', 'purple', 'orange', 'pink', 'indigo', 'yellow', 'red'];
const randomColor = colors[Math.floor(Math.random() * colors.length)];
@@ -30,7 +30,7 @@ export function createTrack(name?: string, color?: TrackColor): Track {
id: trackId,
name: trackName,
color: TRACK_COLORS[color || randomColor],
height: DEFAULT_TRACK_HEIGHT,
height: height ?? DEFAULT_TRACK_HEIGHT,
audioBuffer: null,
volume: 0.8,
pan: 0,
@@ -70,11 +70,12 @@ export function createTrack(name?: string, color?: TrackColor): Track {
export function createTrackFromBuffer(
buffer: AudioBuffer,
name?: string,
color?: TrackColor
color?: TrackColor,
height?: number
): Track {
// Ensure name is a string before passing to createTrack
const trackName = typeof name === 'string' && name.trim() ? name.trim() : undefined;
const track = createTrack(trackName, color);
const track = createTrack(trackName, color, height);
track.audioBuffer = buffer;
return track;
}
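// Example (sketch): creating a taller track directly from a decoded buffer
// (120 px instead of DEFAULT_TRACK_HEIGHT).
function createVocalTrack(buffer: AudioBuffer): Track {
  return createTrackFromBuffer(buffer, 'Lead Vocal', 'purple', 120);
}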

138
lib/hooks/useAudioWorker.ts Normal file
View File

@@ -0,0 +1,138 @@
'use client';
import { useRef, useEffect, useCallback } from 'react';
import type { WorkerMessage, WorkerResponse } from '@/lib/workers/audio.worker';
/**
* Hook to use the audio Web Worker for heavy computations
* Automatically manages worker lifecycle and message passing
*/
export function useAudioWorker() {
const workerRef = useRef<Worker | null>(null);
const callbacksRef = useRef<Map<string, (result: any, error?: string) => void>>(new Map());
const messageIdRef = useRef(0);
// Initialize worker
useEffect(() => {
// Create worker from the audio worker file
workerRef.current = new Worker(
new URL('../workers/audio.worker.ts', import.meta.url),
{ type: 'module' }
);
// Handle messages from worker
workerRef.current.onmessage = (event: MessageEvent<WorkerResponse>) => {
const { id, result, error } = event.data;
const callback = callbacksRef.current.get(id);
if (callback) {
callback(result, error);
callbacksRef.current.delete(id);
}
};
// Cleanup on unmount
return () => {
if (workerRef.current) {
workerRef.current.terminate();
workerRef.current = null;
}
callbacksRef.current.clear();
};
}, []);
// Send message to worker
const sendMessage = useCallback(
<T = any>(type: WorkerMessage['type'], payload: any): Promise<T> => {
return new Promise((resolve, reject) => {
if (!workerRef.current) {
reject(new Error('Worker not initialized'));
return;
}
const id = `msg-${++messageIdRef.current}`;
const message: WorkerMessage = { id, type, payload };
callbacksRef.current.set(id, (result, error) => {
if (error) {
reject(new Error(error));
} else {
resolve(result);
}
});
workerRef.current.postMessage(message);
});
},
[]
);
// API methods
const generatePeaks = useCallback(
async (channelData: Float32Array, width: number): Promise<Float32Array> => {
const result = await sendMessage<Float32Array>('generatePeaks', {
channelData,
width,
});
return new Float32Array(result);
},
[sendMessage]
);
const generateMinMaxPeaks = useCallback(
async (
channelData: Float32Array,
width: number
): Promise<{ min: Float32Array; max: Float32Array }> => {
const result = await sendMessage<{ min: Float32Array; max: Float32Array }>(
'generateMinMaxPeaks',
{ channelData, width }
);
return {
min: new Float32Array(result.min),
max: new Float32Array(result.max),
};
},
[sendMessage]
);
const normalizePeaks = useCallback(
async (peaks: Float32Array, targetMax: number = 1): Promise<Float32Array> => {
const result = await sendMessage<Float32Array>('normalizePeaks', {
peaks,
targetMax,
});
return new Float32Array(result);
},
[sendMessage]
);
const analyzeAudio = useCallback(
async (
channelData: Float32Array
): Promise<{
peak: number;
rms: number;
crestFactor: number;
dynamicRange: number;
}> => {
return sendMessage('analyzeAudio', { channelData });
},
[sendMessage]
);
const findPeak = useCallback(
async (channelData: Float32Array): Promise<number> => {
return sendMessage<number>('findPeak', { channelData });
},
[sendMessage]
);
return {
generatePeaks,
generateMinMaxPeaks,
normalizePeaks,
analyzeAudio,
findPeak,
};
}
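// Example (sketch): computing min/max peaks and level statistics off the main thread
// for a decoded buffer; assumes it runs inside a React component tree.
function WorkerPeaksExample({ buffer }: { buffer: AudioBuffer }) {
  const { generateMinMaxPeaks, analyzeAudio } = useAudioWorker();
  useEffect(() => {
    const mono = buffer.getChannelData(0);
    generateMinMaxPeaks(mono, 800).then(({ min, max }) => {
      console.log('first waveform bin:', min[0], max[0]);
    });
    analyzeAudio(mono).then((stats) => {
      console.log(`peak ${stats.peak.toFixed(3)}, rms ${stats.rms.toFixed(3)}`);
    });
  }, [buffer, generateMinMaxPeaks, analyzeAudio]);
  return null;
}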

70
lib/hooks/useMarkers.ts Normal file
View File

@@ -0,0 +1,70 @@
'use client';
import { useState, useCallback } from 'react';
import type { Marker, CreateMarkerInput } from '@/types/marker';
export function useMarkers() {
const [markers, setMarkers] = useState<Marker[]>([]);
const addMarker = useCallback((input: CreateMarkerInput): Marker => {
const marker: Marker = {
...input,
id: `marker-${Date.now()}-${Math.random()}`,
};
setMarkers((prev) => [...prev, marker].sort((a, b) => a.time - b.time));
return marker;
}, []);
const updateMarker = useCallback((id: string, updates: Partial<Marker>) => {
setMarkers((prev) => {
const updated = prev.map((m) =>
m.id === id ? { ...m, ...updates } : m
);
// Re-sort if time changed
if ('time' in updates) {
return updated.sort((a, b) => a.time - b.time);
}
return updated;
});
}, []);
const removeMarker = useCallback((id: string) => {
setMarkers((prev) => prev.filter((m) => m.id !== id));
}, []);
const clearMarkers = useCallback(() => {
setMarkers([]);
}, []);
const getMarkerAt = useCallback((time: number, tolerance: number = 0.1): Marker | undefined => {
return markers.find((m) => {
if (m.type === 'point') {
return Math.abs(m.time - time) <= tolerance;
} else {
// For regions, check if time is within the region
return m.endTime !== undefined && time >= m.time && time <= m.endTime;
}
});
}, [markers]);
const getNextMarker = useCallback((time: number): Marker | undefined => {
return markers.find((m) => m.time > time);
}, [markers]);
const getPreviousMarker = useCallback((time: number): Marker | undefined => {
const previous = markers.filter((m) => m.time < time);
return previous[previous.length - 1];
}, [markers]);
return {
markers,
addMarker,
updateMarker,
removeMarker,
clearMarkers,
getMarkerAt,
getNextMarker,
getPreviousMarker,
setMarkers,
};
}
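// Example (sketch): jumping the playhead to the next marker using the hook above.
// Assumes `seek` comes from the player hook and `currentTime` is the current playhead position.
function jumpToNextMarker(
  markersApi: ReturnType<typeof useMarkers>,
  currentTime: number,
  seek: (time: number) => void
): void {
  const next = markersApi.getNextMarker(currentTime);
  if (next) seek(next.time);
}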

View File

@@ -1,88 +1,19 @@
import { useState, useCallback, useEffect } from 'react';
import { useState, useCallback } from 'react';
import type { Track } from '@/types/track';
import { createTrack, createTrackFromBuffer } from '@/lib/audio/track-utils';
import { createEffectChain } from '@/lib/audio/effects/chain';
import { DEFAULT_TRACK_HEIGHT } from '@/types/track';
const STORAGE_KEY = 'audio-ui-multi-track';
export function useMultiTrack() {
const [tracks, setTracks] = useState<Track[]>(() => {
if (typeof window === 'undefined') return [];
// Note: localStorage persistence disabled in favor of IndexedDB project management
const [tracks, setTracks] = useState<Track[]>([]);
try {
const saved = localStorage.getItem(STORAGE_KEY);
if (saved) {
const parsed = JSON.parse(saved);
// Clear corrupted data immediately if we detect issues
const hasInvalidData = parsed.some((t: any) =>
typeof t.name !== 'string' || t.name === '[object Object]'
);
if (hasInvalidData) {
console.warn('Detected corrupted track data in localStorage, clearing...');
localStorage.removeItem(STORAGE_KEY);
return [];
}
// Note: AudioBuffers can't be serialized, but EffectChains and Automation can
return parsed.map((t: any) => ({
...t,
name: String(t.name || 'Untitled Track'), // Ensure name is always a string
height: t.height && t.height >= DEFAULT_TRACK_HEIGHT ? t.height : DEFAULT_TRACK_HEIGHT, // Migrate old heights
audioBuffer: null, // Will need to be reloaded
effectChain: t.effectChain || createEffectChain(`${t.name} Effects`), // Restore effect chain or create new
automation: t.automation || { lanes: [], showAutomation: false }, // Restore automation or create new
selection: t.selection || null, // Initialize selection
showEffects: t.showEffects || false, // Restore showEffects state
}));
}
} catch (error) {
console.error('Failed to load tracks from localStorage:', error);
// Clear corrupted data
localStorage.removeItem(STORAGE_KEY);
}
return [];
});
// Save tracks to localStorage (without audio buffers)
useEffect(() => {
if (typeof window === 'undefined') return;
try {
// Only save serializable fields, excluding audioBuffer and any DOM references
const trackData = tracks.map((track) => ({
id: track.id,
name: String(track.name || 'Untitled Track'),
color: track.color,
height: track.height,
volume: track.volume,
pan: track.pan,
mute: track.mute,
solo: track.solo,
recordEnabled: track.recordEnabled,
collapsed: track.collapsed,
selected: track.selected,
showEffects: track.showEffects, // Save effects panel state
effectChain: track.effectChain, // Save effect chain
automation: track.automation, // Save automation data
}));
localStorage.setItem(STORAGE_KEY, JSON.stringify(trackData));
} catch (error) {
console.error('Failed to save tracks to localStorage:', error);
}
}, [tracks]);
const addTrack = useCallback((name?: string) => {
const track = createTrack(name);
const addTrack = useCallback((name?: string, height?: number) => {
const track = createTrack(name, undefined, height);
setTracks((prev) => [...prev, track]);
return track;
}, []);
const addTrackFromBuffer = useCallback((buffer: AudioBuffer, name?: string) => {
const track = createTrackFromBuffer(buffer, name);
const addTrackFromBuffer = useCallback((buffer: AudioBuffer, name?: string, height?: number) => {
const track = createTrackFromBuffer(buffer, name, undefined, height);
setTracks((prev) => [...prev, track]);
return track;
}, []);
@@ -120,6 +51,10 @@ export function useMultiTrack() {
);
}, []);
const loadTracks = useCallback((tracksToLoad: Track[]) => {
setTracks(tracksToLoad);
}, []);
return {
tracks,
addTrack,
@@ -129,5 +64,6 @@ export function useMultiTrack() {
clearTracks,
reorderTracks,
setTrackBuffer,
loadTracks,
};
}
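// Example (sketch): wiring an imported file into a new track at a custom height.
// Assumes decodeAudioFile() from lib/audio/import is imported by the caller.
async function importFileToTrack(
  file: File,
  addTrackFromBuffer: (buffer: AudioBuffer, name?: string, height?: number) => Track
): Promise<Track> {
  const buffer = await decodeAudioFile(file);
  return addTrackFromBuffer(buffer, file.name, 96);
}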

View File

@@ -9,6 +9,10 @@ export interface MultiTrackPlayerState {
isPlaying: boolean;
currentTime: number;
duration: number;
loopEnabled: boolean;
loopStart: number;
loopEnd: number;
playbackRate: number;
}
export interface TrackLevel {
@@ -32,6 +36,10 @@ export function useMultiTrackPlayer(
const [masterPeakLevel, setMasterPeakLevel] = useState(0);
const [masterRmsLevel, setMasterRmsLevel] = useState(0);
const [masterIsClipping, setMasterIsClipping] = useState(false);
const [loopEnabled, setLoopEnabled] = useState(false);
const [loopStart, setLoopStart] = useState(0);
const [loopEnd, setLoopEnd] = useState(0);
const [playbackRate, setPlaybackRate] = useState(1.0);
const audioContextRef = useRef<AudioContext | null>(null);
const sourceNodesRef = useRef<AudioBufferSourceNode[]>([]);
@@ -51,12 +59,29 @@ export function useMultiTrackPlayer(
const tracksRef = useRef<Track[]>(tracks); // Always keep latest tracks
const lastRecordedValuesRef = useRef<Map<string, number>>(new Map()); // Track last recorded values to detect changes
const onRecordAutomationRef = useRef<AutomationRecordingCallback | undefined>(onRecordAutomation);
const loopEnabledRef = useRef<boolean>(false);
const loopStartRef = useRef<number>(0);
const loopEndRef = useRef<number>(0);
const playbackRateRef = useRef<number>(1.0);
const isPlayingRef = useRef<boolean>(false);
// Keep tracksRef in sync with tracks prop
useEffect(() => {
tracksRef.current = tracks;
}, [tracks]);
// Keep loop refs in sync with state
useEffect(() => {
loopEnabledRef.current = loopEnabled;
loopStartRef.current = loopStart;
loopEndRef.current = loopEnd;
}, [loopEnabled, loopStart, loopEnd]);
// Keep playbackRate ref in sync with state
useEffect(() => {
playbackRateRef.current = playbackRate;
}, [playbackRate]);
// Keep onRecordAutomationRef in sync
useEffect(() => {
onRecordAutomationRef.current = onRecordAutomation;
@@ -71,7 +96,11 @@ export function useMultiTrackPlayer(
}
}
setDuration(maxDuration);
}, [tracks]);
// Initialize loop end to duration when duration changes
if (loopEnd === 0 || loopEnd > maxDuration) {
setLoopEnd(maxDuration);
}
}, [tracks, loopEnd]);
// Convert linear amplitude to dB scale normalized to 0-1 range
const linearToDbScale = (linear: number): number => {
@@ -112,8 +141,8 @@ export function useMultiTrackPlayer(
}
}
// Convert linear peak to logarithmic dB scale
levels[track.id] = linearToDbScale(peak);
// Store raw linear peak (will be converted to dB in the fader component)
levels[track.id] = peak;
});
setTrackLevels(levels);
@@ -291,11 +320,56 @@ export function useMultiTrackPlayer(
}, []);
const updatePlaybackPosition = useCallback(() => {
if (!audioContextRef.current) return;
if (!audioContextRef.current || !isPlayingRef.current) return;
const elapsed = audioContextRef.current.currentTime - startTimeRef.current;
const elapsed = (audioContextRef.current.currentTime - startTimeRef.current) * playbackRateRef.current;
const newTime = pausedAtRef.current + elapsed;
// Check if loop is enabled and we've reached the loop end
if (loopEnabledRef.current && loopEndRef.current > loopStartRef.current && newTime >= loopEndRef.current) {
// Loop back to start
pausedAtRef.current = loopStartRef.current;
startTimeRef.current = audioContextRef.current.currentTime;
setCurrentTime(loopStartRef.current);
// Restart all sources from loop start
sourceNodesRef.current.forEach((node, index) => {
try {
node.stop();
node.disconnect();
} catch (e) {
// Ignore errors from already stopped nodes
}
});
// Re-trigger play from loop start
const tracks = tracksRef.current;
const audioContext = audioContextRef.current;
// Clear old sources
sourceNodesRef.current = [];
// Create new sources starting from loop start
for (const track of tracks) {
if (!track.audioBuffer) continue;
const source = audioContext.createBufferSource();
source.buffer = track.audioBuffer;
source.playbackRate.value = playbackRateRef.current;
// Connect to existing nodes (gain, pan, effects are still connected)
const trackIndex = tracks.indexOf(track);
source.connect(analyserNodesRef.current[trackIndex]);
// Start from loop start position
source.start(0, loopStartRef.current);
sourceNodesRef.current.push(source);
}
animationFrameRef.current = requestAnimationFrame(updatePlaybackPosition);
return;
}
if (newTime >= duration) {
setIsPlaying(false);
isMonitoringLevelsRef.current = false;
@@ -401,6 +475,9 @@ export function useMultiTrackPlayer(
outputNode.connect(masterGain);
console.log('[MultiTrackPlayer] Effect output connected with', effectNodes.length, 'effect nodes');
// Set playback rate
source.playbackRate.value = playbackRateRef.current;
// Start playback from current position
source.start(0, pausedAtRef.current);
@@ -424,6 +501,7 @@ export function useMultiTrackPlayer(
}
startTimeRef.current = audioContext.currentTime;
isPlayingRef.current = true;
setIsPlaying(true);
updatePlaybackPosition();
@@ -454,6 +532,7 @@ export function useMultiTrackPlayer(
pausedAtRef.current = Math.min(pausedAtRef.current + elapsed, duration);
setCurrentTime(pausedAtRef.current);
isPlayingRef.current = false;
setIsPlaying(false);
// Stop level monitoring
@@ -822,6 +901,33 @@ export function useMultiTrackPlayer(
setMasterIsClipping(false);
}, []);
const toggleLoop = useCallback(() => {
setLoopEnabled(prev => !prev);
}, []);
const setLoopPoints = useCallback((start: number, end: number) => {
setLoopStart(Math.max(0, start));
setLoopEnd(Math.min(duration, Math.max(start, end)));
}, [duration]);
const setLoopFromSelection = useCallback((selectionStart: number, selectionEnd: number) => {
if (selectionStart < selectionEnd) {
setLoopPoints(selectionStart, selectionEnd);
setLoopEnabled(true);
}
}, [setLoopPoints]);
const changePlaybackRate = useCallback((rate: number) => {
// Clamp rate between 0.25x and 2x
const clampedRate = Math.max(0.25, Math.min(2.0, rate));
setPlaybackRate(clampedRate);
// Update playback rate on all active source nodes
sourceNodesRef.current.forEach(source => {
source.playbackRate.value = clampedRate;
});
}, []);
return {
isPlaying,
currentTime,
@@ -830,11 +936,20 @@ export function useMultiTrackPlayer(
masterPeakLevel,
masterRmsLevel,
masterIsClipping,
masterAnalyser: masterAnalyserRef.current,
resetClipIndicator,
play,
pause,
stop,
seek,
togglePlayPause,
loopEnabled,
loopStart,
loopEnd,
toggleLoop,
setLoopPoints,
setLoopFromSelection,
playbackRate,
changePlaybackRate,
};
}
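// Example (sketch): looping the current selection at half speed for detailed editing.
// Assumes `player` is the object returned by useMultiTrackPlayer and `selection`
// is a { start, end } range in seconds taken from the active track.
function previewSelectionSlowly(
  player: ReturnType<typeof useMultiTrackPlayer>,
  selection: { start: number; end: number }
): void {
  player.setLoopFromSelection(selection.start, selection.end);
  player.changePlaybackRate(0.5); // clamped to 0.25x-2x by the hook
  if (!player.isPlaying) player.togglePlayPause();
}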

152
lib/hooks/useSettings.ts Normal file
View File

@@ -0,0 +1,152 @@
'use client';
import { useState, useEffect, useCallback } from 'react';
export interface AudioSettings {
bufferSize: number; // 256, 512, 1024, 2048, 4096
sampleRate: number; // 44100, 48000, 96000
autoNormalizeOnImport: boolean;
}
export interface UISettings {
theme: 'dark' | 'light' | 'auto';
fontSize: 'small' | 'medium' | 'large';
}
export interface EditorSettings {
autoSaveInterval: number; // seconds, 0 = disabled
undoHistoryLimit: number; // 10-200
snapToGrid: boolean;
gridResolution: number; // seconds
defaultZoom: number; // 1-20
}
export interface PerformanceSettings {
peakCalculationQuality: 'low' | 'medium' | 'high';
waveformRenderingQuality: 'low' | 'medium' | 'high';
enableSpectrogram: boolean;
maxFileSizeMB: number; // 100-1000
}
export interface Settings {
audio: AudioSettings;
ui: UISettings;
editor: EditorSettings;
performance: PerformanceSettings;
}
const DEFAULT_SETTINGS: Settings = {
audio: {
bufferSize: 2048,
sampleRate: 48000,
autoNormalizeOnImport: false,
},
ui: {
theme: 'dark',
fontSize: 'medium',
},
editor: {
autoSaveInterval: 3, // 3 seconds
undoHistoryLimit: 50,
snapToGrid: false,
gridResolution: 1.0, // 1 second
defaultZoom: 1,
},
performance: {
peakCalculationQuality: 'high',
waveformRenderingQuality: 'high',
enableSpectrogram: true,
maxFileSizeMB: 500,
},
};
const SETTINGS_STORAGE_KEY = 'audio-editor-settings';
function loadSettings(): Settings {
if (typeof window === 'undefined') return DEFAULT_SETTINGS;
try {
const stored = localStorage.getItem(SETTINGS_STORAGE_KEY);
if (!stored) return DEFAULT_SETTINGS;
const parsed = JSON.parse(stored);
// Merge with defaults to handle new settings added in updates
return {
audio: { ...DEFAULT_SETTINGS.audio, ...parsed.audio },
ui: { ...DEFAULT_SETTINGS.ui, ...parsed.ui },
editor: { ...DEFAULT_SETTINGS.editor, ...parsed.editor },
performance: { ...DEFAULT_SETTINGS.performance, ...parsed.performance },
};
} catch (error) {
console.error('Failed to load settings from localStorage:', error);
return DEFAULT_SETTINGS;
}
}
function saveSettings(settings: Settings): void {
if (typeof window === 'undefined') return;
try {
localStorage.setItem(SETTINGS_STORAGE_KEY, JSON.stringify(settings));
} catch (error) {
console.error('Failed to save settings to localStorage:', error);
}
}
export function useSettings() {
const [settings, setSettings] = useState<Settings>(loadSettings);
// Save to localStorage whenever settings change
useEffect(() => {
saveSettings(settings);
}, [settings]);
const updateAudioSettings = useCallback((updates: Partial<AudioSettings>) => {
setSettings((prev) => ({
...prev,
audio: { ...prev.audio, ...updates },
}));
}, []);
const updateUISettings = useCallback((updates: Partial<UISettings>) => {
setSettings((prev) => ({
...prev,
ui: { ...prev.ui, ...updates },
}));
}, []);
const updateEditorSettings = useCallback((updates: Partial<EditorSettings>) => {
setSettings((prev) => ({
...prev,
editor: { ...prev.editor, ...updates },
}));
}, []);
const updatePerformanceSettings = useCallback((updates: Partial<PerformanceSettings>) => {
setSettings((prev) => ({
...prev,
performance: { ...prev.performance, ...updates },
}));
}, []);
const resetSettings = useCallback(() => {
setSettings(DEFAULT_SETTINGS);
}, []);
const resetCategory = useCallback((category: keyof Settings) => {
setSettings((prev) => ({
...prev,
[category]: DEFAULT_SETTINGS[category],
}));
}, []);
return {
settings,
updateAudioSettings,
updateUISettings,
updateEditorSettings,
updatePerformanceSettings,
resetSettings,
resetCategory,
};
}
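// Example (sketch): applying the stored audio preferences when importing a file.
// Assumes importAudioFile() from lib/audio/import is imported alongside this hook.
function useImportWithSettings() {
  const { settings, updateAudioSettings } = useSettings();
  const importFile = useCallback(
    (file: File) =>
      importAudioFile(file, {
        normalizeOnImport: settings.audio.autoNormalizeOnImport,
        targetSampleRate: settings.audio.sampleRate,
      }),
    [settings.audio]
  );
  const toggleAutoNormalize = useCallback(() => {
    updateAudioSettings({ autoNormalizeOnImport: !settings.audio.autoNormalizeOnImport });
  }, [settings.audio.autoNormalizeOnImport, updateAudioSettings]);
  return { importFile, toggleAutoNormalize };
}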

193
lib/storage/db.ts Normal file
View File

@@ -0,0 +1,193 @@
/**
* IndexedDB database for project storage
*/
export const DB_NAME = 'audio-editor-db';
export const DB_VERSION = 1;
export interface ProjectMetadata {
id: string;
name: string;
description?: string;
createdAt: number;
updatedAt: number;
duration: number; // Total project duration in seconds
sampleRate: number;
trackCount: number;
thumbnail?: string; // Base64 encoded waveform thumbnail
}
export interface SerializedAudioBuffer {
sampleRate: number;
length: number;
numberOfChannels: number;
channelData: Float32Array[]; // Array of channel data
}
export interface SerializedTrack {
id: string;
name: string;
color: string;
volume: number;
pan: number;
muted: boolean;
soloed: boolean;
collapsed: boolean;
height: number;
audioBuffer: SerializedAudioBuffer | null;
effects: any[]; // Effect chain
automation: any; // Automation data
recordEnabled: boolean;
}
export interface ProjectData {
metadata: ProjectMetadata;
tracks: SerializedTrack[];
settings: {
zoom: number;
currentTime: number;
sampleRate: number;
};
}
/**
* Initialize IndexedDB database
*/
export function initDB(): Promise<IDBDatabase> {
return new Promise((resolve, reject) => {
const request = indexedDB.open(DB_NAME, DB_VERSION);
request.onerror = () => reject(request.error);
request.onsuccess = () => resolve(request.result);
request.onupgradeneeded = (event) => {
const db = (event.target as IDBOpenDBRequest).result;
// Create projects object store
if (!db.objectStoreNames.contains('projects')) {
const projectStore = db.createObjectStore('projects', { keyPath: 'metadata.id' });
projectStore.createIndex('updatedAt', 'metadata.updatedAt', { unique: false });
projectStore.createIndex('name', 'metadata.name', { unique: false });
}
// Create audio buffers object store (for large files)
if (!db.objectStoreNames.contains('audioBuffers')) {
db.createObjectStore('audioBuffers', { keyPath: 'id' });
}
};
});
}
/**
* Get all projects (metadata only for list view)
*/
export async function getAllProjects(): Promise<ProjectMetadata[]> {
const db = await initDB();
return new Promise((resolve, reject) => {
const transaction = db.transaction(['projects'], 'readonly');
const store = transaction.objectStore('projects');
const index = store.index('updatedAt');
const request = index.openCursor(null, 'prev'); // Most recent first
const projects: ProjectMetadata[] = [];
request.onsuccess = () => {
const cursor = request.result;
if (cursor) {
projects.push(cursor.value.metadata);
cursor.continue();
} else {
resolve(projects);
}
};
request.onerror = () => reject(request.error);
});
}
/**
* Save project to IndexedDB
*/
export async function saveProject(project: ProjectData): Promise<void> {
const db = await initDB();
return new Promise((resolve, reject) => {
const transaction = db.transaction(['projects'], 'readwrite');
const store = transaction.objectStore('projects');
const request = store.put(project);
request.onsuccess = () => resolve();
request.onerror = () => reject(request.error);
});
}
/**
* Load project from IndexedDB
*/
export async function loadProject(projectId: string): Promise<ProjectData | null> {
const db = await initDB();
return new Promise((resolve, reject) => {
const transaction = db.transaction(['projects'], 'readonly');
const store = transaction.objectStore('projects');
const request = store.get(projectId);
request.onsuccess = () => resolve(request.result || null);
request.onerror = () => reject(request.error);
});
}
/**
* Delete project from IndexedDB
*/
export async function deleteProject(projectId: string): Promise<void> {
const db = await initDB();
return new Promise((resolve, reject) => {
const transaction = db.transaction(['projects'], 'readwrite');
const store = transaction.objectStore('projects');
const request = store.delete(projectId);
request.onsuccess = () => resolve();
request.onerror = () => reject(request.error);
});
}
/**
* Serialize AudioBuffer for storage
*/
export function serializeAudioBuffer(buffer: AudioBuffer): SerializedAudioBuffer {
const channelData: Float32Array[] = [];
for (let i = 0; i < buffer.numberOfChannels; i++) {
channelData.push(new Float32Array(buffer.getChannelData(i)));
}
return {
sampleRate: buffer.sampleRate,
length: buffer.length,
numberOfChannels: buffer.numberOfChannels,
channelData,
};
}
/**
* Deserialize AudioBuffer from storage
*/
export function deserializeAudioBuffer(
serialized: SerializedAudioBuffer,
audioContext: AudioContext
): AudioBuffer {
const buffer = audioContext.createBuffer(
serialized.numberOfChannels,
serialized.length,
serialized.sampleRate
);
for (let i = 0; i < serialized.numberOfChannels; i++) {
buffer.copyToChannel(new Float32Array(serialized.channelData[i]), i);
}
return buffer;
}
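// Example (sketch): round-tripping a buffer through the serialized form used above.
// Float32Array channel data survives IndexedDB's structured cloning, so the same
// pair of calls applies when saving and re-opening a project.
function roundTripBuffer(buffer: AudioBuffer, audioContext: AudioContext): AudioBuffer {
  const serialized = serializeAudioBuffer(buffer);
  return deserializeAudioBuffer(serialized, audioContext);
}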

337
lib/storage/projects.ts Normal file
View File

@@ -0,0 +1,337 @@
/**
* Project management service
*/
import type { Track } from '@/types/track';
import {
saveProject,
loadProject,
getAllProjects,
deleteProject,
serializeAudioBuffer,
deserializeAudioBuffer,
type ProjectData,
type SerializedTrack,
type SerializedAudioBuffer,
} from './db';
import type { ProjectMetadata } from './db';
import { getAudioContext } from '../audio/context';
import { generateId } from '../audio/effects/chain';
// Re-export ProjectMetadata for easier importing
export type { ProjectMetadata } from './db';
/**
* Generate unique project ID
*/
export function generateProjectId(): string {
return `project_${Date.now()}_${Math.random().toString(36).substr(2, 9)}`;
}
/**
* Serialize effects by removing any non-serializable data (functions, nodes, etc.)
*/
function serializeEffects(effects: any[]): any[] {
return effects.map(effect => ({
id: effect.id,
type: effect.type,
name: effect.name,
enabled: effect.enabled,
expanded: effect.expanded,
parameters: effect.parameters ? JSON.parse(JSON.stringify(effect.parameters)) : undefined,
}));
}
/**
* Convert tracks to serialized format
*/
function serializeTracks(tracks: Track[]): SerializedTrack[] {
return tracks.map(track => {
// Serialize automation by deep cloning to remove any functions
const automation = track.automation ? JSON.parse(JSON.stringify(track.automation)) : { lanes: [], showAutomation: false };
return {
id: track.id,
name: track.name,
color: track.color,
volume: track.volume,
pan: track.pan,
muted: track.mute,
soloed: track.solo,
collapsed: track.collapsed,
height: track.height,
audioBuffer: track.audioBuffer ? serializeAudioBuffer(track.audioBuffer) : null,
effects: serializeEffects(track.effectChain?.effects || []),
automation,
recordEnabled: track.recordEnabled,
};
});
}
/**
* Convert serialized tracks back to Track format
*/
function deserializeTracks(serialized: SerializedTrack[]): Track[] {
const audioContext = getAudioContext();
return serialized.map(track => ({
id: track.id,
name: track.name,
color: track.color,
volume: track.volume,
pan: track.pan,
mute: track.muted,
solo: track.soloed,
collapsed: track.collapsed,
height: track.height,
audioBuffer: track.audioBuffer ? deserializeAudioBuffer(track.audioBuffer, audioContext) : null,
effectChain: {
id: generateId(),
name: `${track.name} FX`,
effects: track.effects,
},
automation: track.automation,
recordEnabled: track.recordEnabled,
selected: false,
showEffects: false,
selection: null, // Reset selection on load
}));
}
/**
* Calculate total project duration
*/
function calculateDuration(tracks: Track[]): number {
let maxDuration = 0;
for (const track of tracks) {
if (track.audioBuffer) {
maxDuration = Math.max(maxDuration, track.audioBuffer.duration);
}
}
return maxDuration;
}
/**
* Save current project state
*/
export async function saveCurrentProject(
projectId: string | null,
projectName: string,
tracks: Track[],
settings: {
zoom: number;
currentTime: number;
sampleRate: number;
},
description?: string
): Promise<string> {
const id = projectId || generateProjectId();
const now = Date.now();
const metadata: ProjectMetadata = {
id,
name: projectName,
description,
createdAt: projectId ? (await loadProject(id))?.metadata.createdAt || now : now,
updatedAt: now,
duration: calculateDuration(tracks),
sampleRate: settings.sampleRate,
trackCount: tracks.length,
};
const projectData: ProjectData = {
metadata,
tracks: serializeTracks(tracks),
settings,
};
await saveProject(projectData);
return id;
}
/**
* Load project and restore state
*/
export async function loadProjectById(projectId: string): Promise<{
tracks: Track[];
settings: {
zoom: number;
currentTime: number;
sampleRate: number;
};
metadata: ProjectMetadata;
} | null> {
const project = await loadProject(projectId);
if (!project) return null;
return {
tracks: deserializeTracks(project.tracks),
settings: project.settings,
metadata: project.metadata,
};
}
/**
* Get list of all projects
*/
export async function listProjects(): Promise<ProjectMetadata[]> {
return getAllProjects();
}
/**
* Delete a project
*/
export async function removeProject(projectId: string): Promise<void> {
return deleteProject(projectId);
}
/**
* Duplicate a project
*/
export async function duplicateProject(sourceProjectId: string, newName: string): Promise<string> {
const project = await loadProject(sourceProjectId);
if (!project) throw new Error('Project not found');
const newId = generateProjectId();
const now = Date.now();
const newProject: ProjectData = {
...project,
metadata: {
...project.metadata,
id: newId,
name: newName,
createdAt: now,
updatedAt: now,
},
};
await saveProject(newProject);
return newId;
}
/**
* Export project as ZIP file with separate audio files
*/
export async function exportProjectAsJSON(projectId: string): Promise<void> {
const JSZip = (await import('jszip')).default;
const project = await loadProject(projectId);
if (!project) throw new Error('Project not found');
const zip = new JSZip();
const audioContext = getAudioContext();
// Create metadata without audio buffers
const metadata = {
...project,
tracks: project.tracks.map((track, index) => ({
...track,
audioBuffer: track.audioBuffer ? {
fileName: `track_${index}.wav`,
sampleRate: track.audioBuffer.sampleRate,
length: track.audioBuffer.length,
numberOfChannels: track.audioBuffer.numberOfChannels,
} : null,
})),
};
// Add project.json to ZIP
zip.file('project.json', JSON.stringify(metadata, null, 2));
// Convert audio buffers to WAV and add to ZIP
for (let i = 0; i < project.tracks.length; i++) {
const track = project.tracks[i];
if (track.audioBuffer) {
// Deserialize audio buffer
const buffer = deserializeAudioBuffer(track.audioBuffer, audioContext);
// Convert to WAV
const { audioBufferToWav } = await import('@/lib/audio/export');
      const wavData = audioBufferToWav(buffer); // synchronous, returns an ArrayBuffer
      // Add to ZIP
      zip.file(`track_${i}.wav`, wavData);
}
}
// Generate ZIP and download
const zipBlob = await zip.generateAsync({ type: 'blob' });
const url = URL.createObjectURL(zipBlob);
const a = document.createElement('a');
a.href = url;
a.download = `${project.metadata.name.replace(/[^a-z0-9]/gi, '_').toLowerCase()}_${Date.now()}.zip`;
document.body.appendChild(a);
a.click();
document.body.removeChild(a);
URL.revokeObjectURL(url);
}
/**
* Import project from ZIP file
*/
export async function importProjectFromJSON(file: File): Promise<string> {
const JSZip = (await import('jszip')).default;
try {
const zip = await JSZip.loadAsync(file);
// Read project.json
const projectJsonFile = zip.file('project.json');
if (!projectJsonFile) throw new Error('Invalid project file: missing project.json');
const projectJson = await projectJsonFile.async('text');
const metadata = JSON.parse(projectJson);
// Read audio files and reconstruct tracks
const audioContext = getAudioContext();
const tracks: SerializedTrack[] = [];
for (let i = 0; i < metadata.tracks.length; i++) {
const trackMeta = metadata.tracks[i];
let audioBuffer: SerializedAudioBuffer | null = null;
if (trackMeta.audioBuffer?.fileName) {
const audioFile = zip.file(trackMeta.audioBuffer.fileName);
if (audioFile) {
// Read WAV file as array buffer
const arrayBuffer = await audioFile.async('arraybuffer');
// Decode audio data
const decodedBuffer = await audioContext.decodeAudioData(arrayBuffer);
// Serialize for storage
audioBuffer = serializeAudioBuffer(decodedBuffer);
}
}
tracks.push({
...trackMeta,
audioBuffer,
});
}
// Generate new ID to avoid conflicts
const newId = generateProjectId();
const now = Date.now();
const importedProject: ProjectData = {
...metadata,
tracks,
metadata: {
...metadata.metadata,
id: newId,
name: `${metadata.metadata.name} (Imported)`,
createdAt: now,
updatedAt: now,
},
};
await saveProject(importedProject);
return newId;
} catch (error) {
console.error('Import error:', error);
throw new Error('Failed to import project file');
}
}
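// Example (sketch): autosaving the current session and restoring it later.
// Assumes `tracks` comes from useMultiTrack() and `loadTracks` is its loader.
async function autosaveAndRestore(
  tracks: Track[],
  loadTracks: (tracks: Track[]) => void
): Promise<void> {
  const id = await saveCurrentProject(null, 'Untitled Session', tracks, {
    zoom: 1,
    currentTime: 0,
    sampleRate: 48000,
  });
  const restored = await loadProjectById(id);
  if (restored) loadTracks(restored.tracks);
}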

149
lib/utils/audio-cleanup.ts Normal file
View File

@@ -0,0 +1,149 @@
/**
* Audio cleanup utilities to prevent memory leaks
*/
/**
* Safely disconnect and cleanup an AudioNode
*/
export function cleanupAudioNode(node: AudioNode | null | undefined): void {
if (!node) return;
try {
node.disconnect();
} catch (error) {
// Node may already be disconnected, ignore error
console.debug('AudioNode cleanup error (expected):', error);
}
}
/**
* Cleanup multiple audio nodes
*/
export function cleanupAudioNodes(nodes: Array<AudioNode | null | undefined>): void {
nodes.forEach(cleanupAudioNode);
}
/**
* Safely stop and cleanup an AudioBufferSourceNode
*/
export function cleanupAudioSource(source: AudioBufferSourceNode | null | undefined): void {
if (!source) return;
try {
source.stop();
} catch (error) {
// Source may already be stopped, ignore error
console.debug('AudioSource stop error (expected):', error);
}
cleanupAudioNode(source);
}
/**
* Cleanup canvas and release resources
*/
export function cleanupCanvas(canvas: HTMLCanvasElement | null | undefined): void {
if (!canvas) return;
const ctx = canvas.getContext('2d');
if (ctx) {
// Clear the canvas
ctx.clearRect(0, 0, canvas.width, canvas.height);
// Reset transform
ctx.setTransform(1, 0, 0, 1, 0, 0);
}
// Release context (helps with memory)
canvas.width = 0;
canvas.height = 0;
}
/**
* Cancel animation frame safely
*/
export function cleanupAnimationFrame(frameId: number | null | undefined): void {
if (frameId !== null && frameId !== undefined) {
cancelAnimationFrame(frameId);
}
}
/**
* Cleanup media stream tracks
*/
export function cleanupMediaStream(stream: MediaStream | null | undefined): void {
if (!stream) return;
stream.getTracks().forEach(track => {
track.stop();
});
}
/**
* Create a cleanup registry for managing multiple cleanup tasks
*/
export class CleanupRegistry {
private cleanupTasks: Array<() => void> = [];
/**
* Register a cleanup task
*/
register(cleanup: () => void): void {
this.cleanupTasks.push(cleanup);
}
/**
* Register an audio node for cleanup
*/
registerAudioNode(node: AudioNode): void {
this.register(() => cleanupAudioNode(node));
}
/**
* Register an audio source for cleanup
*/
registerAudioSource(source: AudioBufferSourceNode): void {
this.register(() => cleanupAudioSource(source));
}
/**
* Register a canvas for cleanup
*/
registerCanvas(canvas: HTMLCanvasElement): void {
this.register(() => cleanupCanvas(canvas));
}
/**
* Register an animation frame for cleanup
*/
registerAnimationFrame(frameId: number): void {
this.register(() => cleanupAnimationFrame(frameId));
}
/**
* Register a media stream for cleanup
*/
registerMediaStream(stream: MediaStream): void {
this.register(() => cleanupMediaStream(stream));
}
/**
* Execute all cleanup tasks and clear the registry
*/
cleanup(): void {
this.cleanupTasks.forEach(task => {
try {
task();
} catch (error) {
console.error('Cleanup task failed:', error);
}
});
this.cleanupTasks = [];
}
/**
* Get the number of registered cleanup tasks
*/
get size(): number {
return this.cleanupTasks.length;
}
}
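// Example (sketch): tearing down a one-shot preview chain with the registry, suitable
// as the cleanup callback of a useEffect.
function startPreview(audioContext: AudioContext, buffer: AudioBuffer): () => void {
  const registry = new CleanupRegistry();
  const source = audioContext.createBufferSource();
  source.buffer = buffer;
  const gain = audioContext.createGain();
  source.connect(gain);
  gain.connect(audioContext.destination);
  source.start();
  registry.registerAudioSource(source);
  registry.registerAudioNode(gain);
  return () => registry.cleanup();
}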

128
lib/utils/browser-compat.ts Normal file
View File

@@ -0,0 +1,128 @@
/**
* Browser compatibility checking utilities
*/
export interface BrowserCompatibility {
isSupported: boolean;
missingFeatures: string[];
warnings: string[];
}
/**
* Check if all required browser features are supported
*/
export function checkBrowserCompatibility(): BrowserCompatibility {
const missingFeatures: string[] = [];
const warnings: string[] = [];
// Check if running in browser
if (typeof window === 'undefined') {
return {
isSupported: true,
missingFeatures: [],
warnings: [],
};
}
// Check Web Audio API
if (!window.AudioContext && !(window as any).webkitAudioContext) {
missingFeatures.push('Web Audio API');
}
// Check IndexedDB
if (!window.indexedDB) {
missingFeatures.push('IndexedDB');
}
// Check localStorage
try {
localStorage.setItem('test', 'test');
localStorage.removeItem('test');
} catch (e) {
missingFeatures.push('LocalStorage');
}
// Check Canvas API
const canvas = document.createElement('canvas');
if (!canvas.getContext || !canvas.getContext('2d')) {
missingFeatures.push('Canvas API');
}
// Check MediaDevices API (for recording)
if (!navigator.mediaDevices || !navigator.mediaDevices.getUserMedia) {
warnings.push('Microphone recording not supported (requires HTTPS or localhost)');
}
// Check File API
if (!window.File || !window.FileReader || !window.FileList || !window.Blob) {
missingFeatures.push('File API');
}
// Check OfflineAudioContext
if (!window.OfflineAudioContext && !(window as any).webkitOfflineAudioContext) {
missingFeatures.push('OfflineAudioContext (required for audio processing)');
}
return {
isSupported: missingFeatures.length === 0,
missingFeatures,
warnings,
};
}
/**
* Get user-friendly browser name
*/
export function getBrowserInfo(): { name: string; version: string } {
// Check if running in browser
if (typeof window === 'undefined' || typeof navigator === 'undefined') {
return { name: 'Unknown', version: 'Unknown' };
}
const userAgent = navigator.userAgent;
let name = 'Unknown';
let version = 'Unknown';
if (userAgent.indexOf('Chrome') > -1 && userAgent.indexOf('Edg') === -1) {
name = 'Chrome';
const match = userAgent.match(/Chrome\/(\d+)/);
version = match ? match[1] : 'Unknown';
} else if (userAgent.indexOf('Edg') > -1) {
name = 'Edge';
const match = userAgent.match(/Edg\/(\d+)/);
version = match ? match[1] : 'Unknown';
} else if (userAgent.indexOf('Firefox') > -1) {
name = 'Firefox';
const match = userAgent.match(/Firefox\/(\d+)/);
version = match ? match[1] : 'Unknown';
} else if (userAgent.indexOf('Safari') > -1 && userAgent.indexOf('Chrome') === -1) {
name = 'Safari';
const match = userAgent.match(/Version\/(\d+)/);
version = match ? match[1] : 'Unknown';
}
return { name, version };
}
/**
* Check if browser version meets minimum requirements
*/
export function checkMinimumVersion(): boolean {
const { name, version } = getBrowserInfo();
const versionNum = parseInt(version, 10);
const minimumVersions: Record<string, number> = {
Chrome: 90,
Edge: 90,
Firefox: 88,
Safari: 14,
};
const minVersion = minimumVersions[name];
if (!minVersion) {
// Unknown browser, assume it's ok
return true;
}
return versionNum >= minVersion;
}
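// Example (sketch): summarizing compatibility at startup before mounting the editor.
function describeCompatibility(): string {
  const compat = checkBrowserCompatibility();
  const { name, version } = getBrowserInfo();
  if (!compat.isSupported) {
    return `${name} ${version} is missing: ${compat.missingFeatures.join(', ')}`;
  }
  if (!checkMinimumVersion()) {
    return `${name} ${version} is older than the recommended minimum version`;
  }
  return compat.warnings.join(' ') || 'All required features are available';
}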

160
lib/utils/memory-limits.ts Normal file
View File

@@ -0,0 +1,160 @@
/**
* Memory limit checking utilities for audio file handling
*/
export interface MemoryCheckResult {
allowed: boolean;
warning?: string;
estimatedMemoryMB: number;
availableMemoryMB?: number;
}
/**
* Estimate memory required for an audio buffer
* @param duration Duration in seconds
* @param sampleRate Sample rate (default: 48000 Hz)
* @param channels Number of channels (default: 2 for stereo)
* @returns Estimated memory in MB
*/
export function estimateAudioMemory(
duration: number,
sampleRate: number = 48000,
channels: number = 2
): number {
// Each sample is a 32-bit float (4 bytes)
const bytesPerSample = 4;
const totalSamples = duration * sampleRate * channels;
const bytes = totalSamples * bytesPerSample;
// Convert to MB
return bytes / (1024 * 1024);
}
/**
* Get available device memory if supported
* @returns Available memory in MB, or undefined if not supported
*/
export function getAvailableMemory(): number | undefined {
if (typeof navigator === 'undefined') return undefined;
// @ts-ignore - deviceMemory is not in TypeScript types yet
const deviceMemory = navigator.deviceMemory;
if (typeof deviceMemory === 'number') {
// deviceMemory is in GB, convert to MB
return deviceMemory * 1024;
}
return undefined;
}
/**
* Check if a file size is within safe memory limits
* @param fileSizeBytes File size in bytes
* @returns Memory check result
*/
export function checkFileMemoryLimit(fileSizeBytes: number): MemoryCheckResult {
  // Estimate memory usage (compressed formats such as MP3 typically expand to roughly 10x their file size when decoded to 32-bit float PCM)
const estimatedMemoryMB = (fileSizeBytes / (1024 * 1024)) * 10;
const availableMemoryMB = getAvailableMemory();
// Conservative limits
const WARN_THRESHOLD_MB = 100; // Warn if file will use > 100MB
const MAX_RECOMMENDED_MB = 500; // Don't recommend files > 500MB
if (estimatedMemoryMB > MAX_RECOMMENDED_MB) {
return {
allowed: false,
warning: `This file may require ${Math.round(estimatedMemoryMB)}MB of memory. ` +
`Files larger than ${MAX_RECOMMENDED_MB}MB are not recommended as they may cause performance issues or crashes.`,
estimatedMemoryMB,
availableMemoryMB,
};
}
if (estimatedMemoryMB > WARN_THRESHOLD_MB) {
const warning = availableMemoryMB
? `This file will require approximately ${Math.round(estimatedMemoryMB)}MB of memory. ` +
`Your device has ${Math.round(availableMemoryMB)}MB available.`
: `This file will require approximately ${Math.round(estimatedMemoryMB)}MB of memory. ` +
`Large files may cause performance issues on devices with limited memory.`;
return {
allowed: true,
warning,
estimatedMemoryMB,
availableMemoryMB,
};
}
return {
allowed: true,
estimatedMemoryMB,
availableMemoryMB,
};
}
/**
* Check if an audio buffer is within safe memory limits
* @param duration Duration in seconds
* @param sampleRate Sample rate
* @param channels Number of channels
* @returns Memory check result
*/
export function checkAudioBufferMemoryLimit(
duration: number,
sampleRate: number = 48000,
channels: number = 2
): MemoryCheckResult {
const estimatedMemoryMB = estimateAudioMemory(duration, sampleRate, channels);
const availableMemoryMB = getAvailableMemory();
const WARN_THRESHOLD_MB = 100;
const MAX_RECOMMENDED_MB = 500;
if (estimatedMemoryMB > MAX_RECOMMENDED_MB) {
return {
allowed: false,
warning: `This audio (${Math.round(duration / 60)} minutes) will require ${Math.round(estimatedMemoryMB)}MB of memory. ` +
          `Audio longer than ${Math.round((MAX_RECOMMENDED_MB * 1024 * 1024) / (sampleRate * channels * 4) / 60)} minutes may cause performance issues.`,
estimatedMemoryMB,
availableMemoryMB,
};
}
if (estimatedMemoryMB > WARN_THRESHOLD_MB) {
const warning = availableMemoryMB
? `This audio will require approximately ${Math.round(estimatedMemoryMB)}MB of memory. ` +
`Your device has ${Math.round(availableMemoryMB)}MB available.`
: `This audio will require approximately ${Math.round(estimatedMemoryMB)}MB of memory.`;
return {
allowed: true,
warning,
estimatedMemoryMB,
availableMemoryMB,
};
}
return {
allowed: true,
estimatedMemoryMB,
availableMemoryMB,
};
}
/**
* Format memory size in human-readable format
* @param bytes Size in bytes
* @returns Formatted string (e.g., "1.5 MB", "250 KB")
*/
export function formatMemorySize(bytes: number): string {
if (bytes < 1024) {
return `${bytes} B`;
} else if (bytes < 1024 * 1024) {
return `${(bytes / 1024).toFixed(1)} KB`;
} else if (bytes < 1024 * 1024 * 1024) {
return `${(bytes / (1024 * 1024)).toFixed(1)} MB`;
} else {
return `${(bytes / (1024 * 1024 * 1024)).toFixed(2)} GB`;
}
}
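// Example (sketch): checking headroom before starting a long recording. A 30-minute
// stereo take at 48 kHz is 30 * 60 * 48000 * 2 samples * 4 bytes, roughly 660 MB.
function canRecordFor(minutes: number, sampleRate = 48000, channels = 2): boolean {
  const check = checkAudioBufferMemoryLimit(minutes * 60, sampleRate, channels);
  if (check.warning) console.warn(check.warning);
  return check.allowed;
}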

93
lib/utils/timeline.ts Normal file
View File

@@ -0,0 +1,93 @@
/**
* Timeline coordinate conversion and formatting utilities
*/
/**
* Base pixels per second at zoom level 1
* zoom=1: 5 pixels per second
* zoom=2: 10 pixels per second, etc.
*/
const PIXELS_PER_SECOND_BASE = 5;
/**
* Convert time (in seconds) to pixel position
*/
export function timeToPixel(time: number, duration: number, zoom: number): number {
if (duration === 0) return 0;
const totalWidth = duration * zoom * PIXELS_PER_SECOND_BASE;
return (time / duration) * totalWidth;
}
/**
* Convert pixel position to time (in seconds)
*/
export function pixelToTime(pixel: number, duration: number, zoom: number): number {
if (duration === 0) return 0;
const totalWidth = duration * zoom * PIXELS_PER_SECOND_BASE;
return (pixel / totalWidth) * duration;
}
/**
* Calculate appropriate tick interval based on visible duration
* Returns interval in seconds
*/
export function calculateTickInterval(visibleDuration: number): {
major: number;
minor: number;
} {
// Very zoomed in: show sub-second intervals
if (visibleDuration < 5) {
return { major: 1, minor: 0.5 };
}
// Zoomed in: show every second
if (visibleDuration < 20) {
return { major: 5, minor: 1 };
}
// Medium zoom: show every 5 seconds
if (visibleDuration < 60) {
return { major: 10, minor: 5 };
}
// Zoomed out: show every 10 seconds
if (visibleDuration < 300) {
return { major: 30, minor: 10 };
}
// Very zoomed out: show every minute
return { major: 60, minor: 30 };
}
/**
* Format time in seconds to display format
* Returns format like "0:00", "1:23", "12:34.5"
*/
export function formatTimeLabel(seconds: number, showMillis: boolean = false): string {
const mins = Math.floor(seconds / 60);
const secs = seconds % 60;
if (showMillis) {
const wholeSecs = Math.floor(secs);
const decimalPart = Math.floor((secs - wholeSecs) * 10);
return `${mins}:${wholeSecs.toString().padStart(2, '0')}.${decimalPart}`;
}
return `${mins}:${Math.floor(secs).toString().padStart(2, '0')}`;
}
/**
* Calculate visible time range based on scroll position
*/
export function getVisibleTimeRange(
scrollLeft: number,
viewportWidth: number,
duration: number,
zoom: number
): { start: number; end: number } {
const start = pixelToTime(scrollLeft, duration, zoom);
const end = pixelToTime(scrollLeft + viewportWidth, duration, zoom);
return {
start: Math.max(0, start),
end: Math.min(duration, end),
};
}
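// Example (sketch): at zoom = 4 the timeline renders 20 px per second
// (4 * PIXELS_PER_SECOND_BASE), so 30 s into a 120 s project lands at 600 px.
const exampleX = timeToPixel(30, 120, 4);          // 600
const exampleTime = pixelToTime(600, 120, 4);      // 30
const exampleTicks = calculateTickInterval(120);   // { major: 30, minor: 10 }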

200
lib/workers/audio.worker.ts Normal file
View File

@@ -0,0 +1,200 @@
/**
* Web Worker for heavy audio computations
* Offloads waveform generation, analysis, and normalization to background thread
*/
export interface WorkerMessage {
id: string;
type: 'generatePeaks' | 'generateMinMaxPeaks' | 'normalizePeaks' | 'analyzeAudio' | 'findPeak';
payload: any;
}
export interface WorkerResponse {
id: string;
type: string;
result?: any;
error?: string;
}
// Message handler
self.onmessage = (event: MessageEvent<WorkerMessage>) => {
const { id, type, payload } = event.data;
try {
let result: any;
switch (type) {
case 'generatePeaks':
result = generatePeaks(
payload.channelData,
payload.width
);
break;
case 'generateMinMaxPeaks':
result = generateMinMaxPeaks(
payload.channelData,
payload.width
);
break;
case 'normalizePeaks':
result = normalizePeaks(
payload.peaks,
payload.targetMax
);
break;
case 'analyzeAudio':
result = analyzeAudio(payload.channelData);
break;
case 'findPeak':
result = findPeak(payload.channelData);
break;
default:
throw new Error(`Unknown worker message type: ${type}`);
}
const response: WorkerResponse = { id, type, result };
self.postMessage(response);
} catch (error) {
const response: WorkerResponse = {
id,
type,
error: error instanceof Error ? error.message : String(error),
};
self.postMessage(response);
}
};
/**
* Generate waveform peaks from channel data
*/
function generatePeaks(channelData: Float32Array, width: number): Float32Array {
const peaks = new Float32Array(width);
const samplesPerPeak = Math.max(1, Math.floor(channelData.length / width)); // guard against width exceeding sample count
for (let i = 0; i < width; i++) {
const start = i * samplesPerPeak;
const end = Math.min(start + samplesPerPeak, channelData.length);
let max = 0;
for (let j = start; j < end; j++) {
const abs = Math.abs(channelData[j]);
if (abs > max) {
max = abs;
}
}
peaks[i] = max;
}
return peaks;
}
/**
* Generate min/max peaks for more detailed waveform visualization
*/
function generateMinMaxPeaks(
channelData: Float32Array,
width: number
): { min: Float32Array; max: Float32Array } {
const min = new Float32Array(width);
const max = new Float32Array(width);
const samplesPerPeak = Math.max(1, Math.floor(channelData.length / width)); // guard against width exceeding sample count
for (let i = 0; i < width; i++) {
const start = i * samplesPerPeak;
const end = Math.min(start + samplesPerPeak, channelData.length);
let minVal = 1;
let maxVal = -1;
for (let j = start; j < end; j++) {
const val = channelData[j];
if (val < minVal) minVal = val;
if (val > maxVal) maxVal = val;
}
min[i] = minVal;
max[i] = maxVal;
}
return { min, max };
}
/**
* Normalize peaks to a given range
*/
function normalizePeaks(peaks: Float32Array, targetMax: number = 1): Float32Array {
const normalized = new Float32Array(peaks.length);
let max = 0;
// Find max value
for (let i = 0; i < peaks.length; i++) {
if (peaks[i] > max) {
max = peaks[i];
}
}
// Normalize
const scale = max > 0 ? targetMax / max : 1;
for (let i = 0; i < peaks.length; i++) {
normalized[i] = peaks[i] * scale;
}
return normalized;
}
/**
* Analyze audio data for statistics
*/
function analyzeAudio(channelData: Float32Array): {
peak: number;
rms: number;
crestFactor: number;
dynamicRange: number;
} {
let peak = 0;
let sumSquares = 0;
let min = 1;
let max = -1;
for (let i = 0; i < channelData.length; i++) {
const val = channelData[i];
const abs = Math.abs(val);
if (abs > peak) peak = abs;
if (val < min) min = val;
if (val > max) max = val;
sumSquares += val * val;
}
const rms = Math.sqrt(sumSquares / channelData.length);
const crestFactor = rms > 0 ? peak / rms : 0;
const dynamicRange = max - min;
return {
peak,
rms,
crestFactor,
dynamicRange,
};
}
/**
* Find peak value in channel data
*/
function findPeak(channelData: Float32Array): number {
let peak = 0;
for (let i = 0; i < channelData.length; i++) {
const abs = Math.abs(channelData[i]);
if (abs > peak) peak = abs;
}
return peak;
}
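A sketch of how the main thread might drive this worker over the message protocol above; the module-worker construction, the callWorker wrapper, and the import path are assumptions for illustration, not part of the diff:
import type { WorkerMessage, WorkerResponse } from './audio.worker'; // path assumed

// Hypothetical promise wrapper: resolve/reject the reply that matches this request id.
const worker = new Worker(new URL('./audio.worker.ts', import.meta.url), { type: 'module' });

function callWorker<T>(type: WorkerMessage['type'], payload: unknown): Promise<T> {
  const id = crypto.randomUUID();
  return new Promise<T>((resolve, reject) => {
    const onMessage = (event: MessageEvent<WorkerResponse>) => {
      if (event.data.id !== id) return; // reply belongs to another request
      worker.removeEventListener('message', onMessage);
      if (event.data.error) reject(new Error(event.data.error));
      else resolve(event.data.result as T);
    };
    worker.addEventListener('message', onMessage);
    worker.postMessage({ id, type, payload } satisfies WorkerMessage);
  });
}

// e.g. const peaks = await callWorker<Float32Array>('generatePeaks', { channelData, width: 800 });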

package.json

@@ -10,7 +10,9 @@
},
"dependencies": {
"clsx": "^2.1.1",
"lamejs": "^1.2.1",
"fflate": "^0.8.2",
"jszip": "^3.10.1",
"lamejs": "github:zhuker/lamejs",
"lucide-react": "^0.553.0",
"next": "^16.0.0",
"react": "^19.0.0",

104
pnpm-lock.yaml generated

@@ -11,9 +11,15 @@ importers:
clsx:
specifier: ^2.1.1
version: 2.1.1
fflate:
specifier: ^0.8.2
version: 0.8.2
jszip:
specifier: ^3.10.1
version: 3.10.1
lamejs:
specifier: ^1.2.1
version: 1.2.1
specifier: github:zhuker/lamejs
version: https://codeload.github.com/zhuker/lamejs/tar.gz/582bbba6a12f981b984d8fb9e1874499fed85675
lucide-react:
specifier: ^0.553.0
version: 0.553.0(react@19.2.0)
@@ -828,6 +834,9 @@ packages:
convert-source-map@2.0.0:
resolution: {integrity: sha512-Kvp459HrV2FEJ1CAsi1Ku+MY3kasH19TFykTz2xWmMeq6bk2NU3XXvfJ+Q61m0xktWwt+1HSYf3JZsTms3aRJg==}
core-util-is@1.0.3:
resolution: {integrity: sha512-ZQBvi1DcpJ4GDqanjucZ2Hj3wEO5pZDS89BWbkcrvdxksJorwUDDZamX9ldFkp9aw2lmBDLgkObEA4DWNJ9FYQ==}
cross-spawn@7.0.6:
resolution: {integrity: sha512-uV2QOWP2nWzsy2aMp8aRibhi9dlzF5Hgh5SHaB9OiTGEyDTiJJyx0uy51QXdyWbtAHNua4XJzUKca3OzKUd3vA==}
engines: {node: '>= 8'}
@@ -1085,6 +1094,9 @@ packages:
picomatch:
optional: true
fflate@0.8.2:
resolution: {integrity: sha512-cPJU47OaAoCbg0pBvzsgpTPhmhqI5eJjh/JIu8tPj5q+T7iLvW/JAYUqmE7KOB4R1ZyEhzBaIQpQpardBF5z8A==}
file-entry-cache@8.0.0:
resolution: {integrity: sha512-XXTUwCvisa5oacNGRP9SfNtYBNAMi+RPwBFmblZEF7N7swHYQS6/Zfk7SRwx4D5j3CH211YNRco1DEMNVfZCnQ==}
engines: {node: '>=16.0.0'}
@@ -1212,6 +1224,9 @@ packages:
resolution: {integrity: sha512-Hs59xBNfUIunMFgWAbGX5cq6893IbWg4KnrjbYwX3tx0ztorVgTDA6B2sxf8ejHJ4wz8BqGUMYlnzNBer5NvGg==}
engines: {node: '>= 4'}
immediate@3.0.6:
resolution: {integrity: sha512-XXOFtyqDjNDAQxVfYxuF7g9Il/IbWmmlQg2MYKOH8ExIT1qg6xc4zyS3HaEEATgs1btfzxq15ciUiY7gjSXRGQ==}
import-fresh@3.3.1:
resolution: {integrity: sha512-TR3KfrTZTYLPB6jUjfx6MF9WcWrHL9su5TObK4ZkYgBdWKPOFoSoQIdEuTuR82pmtxH2spWG9h6etwfr1pLBqQ==}
engines: {node: '>=6'}
@@ -1220,6 +1235,9 @@ packages:
resolution: {integrity: sha512-JmXMZ6wuvDmLiHEml9ykzqO6lwFbof0GG4IkcGaENdCRDDmMVnny7s5HsIgHCbaq0w2MyPhDqkhTUgS2LU2PHA==}
engines: {node: '>=0.8.19'}
inherits@2.0.4:
resolution: {integrity: sha512-k/vGaX4/Yla3WzyMCvTQOXYeIHvqOKtnqBduzTHpzpQZzAskKMhZ2K+EnBiSM9zGSoIFeMpXKxa4dYeZIQqewQ==}
internal-slot@1.1.0:
resolution: {integrity: sha512-4gd7VpWNQNB4UKKCFFVcp1AVv+FMOgs9NKzjHKusc8jTMhd5eL1NqQqOpE0KzMds804/yHlglp3uxgluOqAPLw==}
engines: {node: '>= 0.4'}
@@ -1327,6 +1345,9 @@ packages:
resolution: {integrity: sha512-mfcwb6IzQyOKTs84CQMrOwW4gQcaTOAWJ0zzJCl2WSPDrWk/OzDaImWFH3djXhb24g4eudZfLRozAvPGw4d9hQ==}
engines: {node: '>= 0.4'}
isarray@1.0.0:
resolution: {integrity: sha512-VLghIWNM6ELQzo7zwmcg0NmTVyWKYjvIeM83yjp0wRDTmUnrM678fQbcKBo6n2CJEF0szoG//ytg+TKla89ALQ==}
isarray@2.0.5:
resolution: {integrity: sha512-xHjhDr3cNBK0BzdUJSPXZntQUx/mwMS5Rw4A7lPJ90XGAO6ISP/ePDNuo0vhqOZU+UD5JoodwCAAoZQd3FeAKw==}
@@ -1375,11 +1396,15 @@ packages:
resolution: {integrity: sha512-ZZow9HBI5O6EPgSJLUb8n2NKgmVWTwCvHGwFuJlMjvLFqlGG6pjirPhtdsseaLZjSibD8eegzmYpUZwoIlj2cQ==}
engines: {node: '>=4.0'}
jszip@3.10.1:
resolution: {integrity: sha512-xXDvecyTpGLrqFrvkrUSoxxfJI5AH7U8zxxtVclpsUtMCq4JQ290LY8AW5c7Ggnr/Y/oK+bQMbqK2qmtk3pN4g==}
keyv@4.5.4:
resolution: {integrity: sha512-oxVHkHR/EJf2CNXnWxRLW6mg7JyCCUcG0DtEGmL2ctUo1PNTin1PUil+r/+4r5MpVgC/fn1kjsx7mjSujKqIpw==}
lamejs@1.2.1:
resolution: {integrity: sha512-s7bxvjvYthw6oPLCm5pFxvA84wUROODB8jEO2+CE1adhKgrIvVOlmMgY8zyugxGrvRaDHNJanOiS21/emty6dQ==}
lamejs@https://codeload.github.com/zhuker/lamejs/tar.gz/582bbba6a12f981b984d8fb9e1874499fed85675:
resolution: {tarball: https://codeload.github.com/zhuker/lamejs/tar.gz/582bbba6a12f981b984d8fb9e1874499fed85675}
version: 1.2.1
language-subtag-registry@0.3.23:
resolution: {integrity: sha512-0K65Lea881pHotoGEa5gDlMxt3pctLi2RplBb7Ezh4rRdLEOtgi7n4EwK9lamnUCkKBqaeKRVebTq6BAxSkpXQ==}
@@ -1392,6 +1417,9 @@ packages:
resolution: {integrity: sha512-+bT2uH4E5LGE7h/n3evcS/sQlJXCpIp6ym8OWJ5eV6+67Dsql/LaaT7qJBAt2rzfoa/5QBGBhxDix1dMt2kQKQ==}
engines: {node: '>= 0.8.0'}
lie@3.3.0:
resolution: {integrity: sha512-UaiMJzeWRlEujzAuw5LokY1L5ecNQYZKfmyZ9L7wDHb/p5etKaxXhohBcrw0EYby+G/NA52vRSN4N39dxHAIwQ==}
lightningcss-android-arm64@1.30.2:
resolution: {integrity: sha512-BH9sEdOCahSgmkVhBLeU7Hc9DWeZ1Eb6wNS6Da8igvUwAe0sqROHddIlvU06q3WyXVEOYDZ6ykBZQnjTbmo4+A==}
engines: {node: '>= 12.0.0'}
@@ -1594,6 +1622,9 @@ packages:
resolution: {integrity: sha512-LaNjtRWUBY++zB5nE/NwcaoMylSPk+S+ZHNB1TzdbMJMny6dynpAGt7X/tl/QYq3TIeE6nxHppbo2LGymrG5Pw==}
engines: {node: '>=10'}
pako@1.0.11:
resolution: {integrity: sha512-4hLB8Py4zZce5s4yd9XzopqwVv/yGNhV1Bl8NTmCq1763HeK2+EwVTv+leGeL13Dnh2wfbqowVPXCIO0z4taYw==}
parent-module@1.0.1:
resolution: {integrity: sha512-GQ2EWRpQV8/o+Aw8YqtfZZPfNRWZYkbidE9k5rpl/hC3vtHHBfGm2Ifi6qWV+coDGkrUKZAxE3Lot5kcsRlh+g==}
engines: {node: '>=6'}
@@ -1636,6 +1667,9 @@ packages:
resolution: {integrity: sha512-vkcDPrRZo1QZLbn5RLGPpg/WmIQ65qoWWhcGKf/b5eplkkarX0m9z8ppCat4mlOqUsWpyNuYgO3VRyrYHSzX5g==}
engines: {node: '>= 0.8.0'}
process-nextick-args@2.0.1:
resolution: {integrity: sha512-3ouUOpQhtgrbOa17J7+uxOTpITYWaGP7/AhoR3+A+/1e9skrzelGi/dXzEYyvbxubEF6Wn2ypscTKiKJFFn1ag==}
prop-types@15.8.1:
resolution: {integrity: sha512-oj87CgZICdulUohogVAR7AjlC0327U4el4L6eAvOqCeudMDVU0NThNaV+b9Df4dXgSP1gXMTnPdhfe/2qDH5cg==}
@@ -1658,6 +1692,9 @@ packages:
resolution: {integrity: sha512-tmbWg6W31tQLeB5cdIBOicJDJRR2KzXsV7uSK9iNfLWQ5bIZfxuPEHp7M8wiHyHnn0DD1i7w3Zmin0FtkrwoCQ==}
engines: {node: '>=0.10.0'}
readable-stream@2.3.8:
resolution: {integrity: sha512-8p0AUk4XODgIewSi0l8Epjs+EVnWiK7NoDIEGU0HhE7+ZyY8D1IMY7odu5lRrFXGg71L15KG8QrPmum45RTtdA==}
reflect.getprototypeof@1.0.10:
resolution: {integrity: sha512-00o4I+DVrefhv+nX0ulyi3biSHCPDe+yLv5o/p6d/UVlirijB8E16FtfwSAi4g3tcqrQ4lRAqQSoFEZJehYEcw==}
engines: {node: '>= 0.4'}
@@ -1693,6 +1730,9 @@ packages:
resolution: {integrity: sha512-AURm5f0jYEOydBj7VQlVvDrjeFgthDdEF5H1dP+6mNpoXOMo1quQqJ4wvJDyRZ9+pO3kGWoOdmV08cSv2aJV6Q==}
engines: {node: '>=0.4'}
safe-buffer@5.1.2:
resolution: {integrity: sha512-Gd2UZBJDkXlY7GbJxfsE8/nvKkUEU1G38c1siN6QP6a9PT9MmHB8GnpscSmMJSoF8LOIrt8ud/wPtojys4G6+g==}
safe-push-apply@1.0.0:
resolution: {integrity: sha512-iKE9w/Z7xCzUMIZqdBsp6pEQvwuEebH4vdpjcDWnyzaI6yl6O9FHvVpmGelvEHNsoY6wGblkxR6Zty/h00WiSA==}
engines: {node: '>= 0.4'}
@@ -1725,6 +1765,9 @@ packages:
resolution: {integrity: sha512-RJRdvCo6IAnPdsvP/7m6bsQqNnn1FCBX5ZNtFL98MmFF/4xAIJTIg1YbHW5DC2W5SKZanrC6i4HsJqlajw/dZw==}
engines: {node: '>= 0.4'}
setimmediate@1.0.5:
resolution: {integrity: sha512-MATJdZp8sLqDl/68LfQmbP8zKPLQNV6BIZoIgrscFDQ+RsvK/BxeDQOgyxKKoh0y/8h3BqVFnCqQ/gd+reiIXA==}
sharp@0.34.5:
resolution: {integrity: sha512-Ou9I5Ft9WNcCbXrU9cMgPBcCK8LiwLqcbywW3t4oDV37n1pzpuNLsYiAV8eODnjbtQlSDwZ2cUEeQz4E54Hltg==}
engines: {node: ^18.17.0 || ^20.3.0 || >=21.0.0}
@@ -1787,6 +1830,9 @@ packages:
resolution: {integrity: sha512-UXSH262CSZY1tfu3G3Secr6uGLCFVPMhIqHjlgCUtCCcgihYc/xKs9djMTMUOb2j1mVSeU8EU6NWc/iQKU6Gfg==}
engines: {node: '>= 0.4'}
string_decoder@1.1.1:
resolution: {integrity: sha512-n/ShnvDi6FHbbVfviro+WojiFzv+s8MPMHBczVePfUpDJLwoLT0ht1l4YwBCbi8pJAveEEdnkHyPyTP/mzRfwg==}
strip-bom@3.0.0:
resolution: {integrity: sha512-vavAMRXOgBVNF6nyEEmL3DBK19iRpDcoIwW+swQ+CbGiu7lju6t+JklA1MHweoWtadgt4ISVUsXLyDq34ddcwA==}
engines: {node: '>=4'}
@@ -1900,6 +1946,9 @@ packages:
use-strict@1.0.1:
resolution: {integrity: sha512-IeiWvvEXfW5ltKVMkxq6FvNf2LojMKvB2OCeja6+ct24S1XOmQw2dGr2JyndwACWAGJva9B7yPHwAmeA9QCqAQ==}
util-deprecate@1.0.2:
resolution: {integrity: sha512-EPD5q1uXyFxJpCrLnCc1nHnq3gOa6DZBocAIiI2TaSCA7VCJ1UJDMagCzIkXNsUYfD1daK//LTEQ8xiIbrHtcw==}
which-boxed-primitive@1.1.1:
resolution: {integrity: sha512-TbX3mj8n0odCBFVlY8AxkqcHASw3L60jIuF8jFP78az3C2YhmGvqbHBpAjTRH2/xqYunrJ9g1jSyjCjpoWzIAA==}
engines: {node: '>= 0.4'}
@@ -2700,6 +2749,8 @@ snapshots:
convert-source-map@2.0.0: {}
core-util-is@1.0.3: {}
cross-spawn@7.0.6:
dependencies:
path-key: 3.1.1
@@ -3109,6 +3160,8 @@ snapshots:
optionalDependencies:
picomatch: 4.0.3
fflate@0.8.2: {}
file-entry-cache@8.0.0:
dependencies:
flat-cache: 4.0.1
@@ -3233,6 +3286,8 @@ snapshots:
ignore@7.0.5: {}
immediate@3.0.6: {}
import-fresh@3.3.1:
dependencies:
parent-module: 1.0.1
@@ -3240,6 +3295,8 @@ snapshots:
imurmurhash@0.1.4: {}
inherits@2.0.4: {}
internal-slot@1.1.0:
dependencies:
es-errors: 1.3.0
@@ -3358,6 +3415,8 @@ snapshots:
call-bound: 1.0.4
get-intrinsic: 1.3.0
isarray@1.0.0: {}
isarray@2.0.5: {}
isexe@2.0.0: {}
@@ -3400,11 +3459,18 @@ snapshots:
object.assign: 4.1.7
object.values: 1.2.1
jszip@3.10.1:
dependencies:
lie: 3.3.0
pako: 1.0.11
readable-stream: 2.3.8
setimmediate: 1.0.5
keyv@4.5.4:
dependencies:
json-buffer: 3.0.1
lamejs@1.2.1:
lamejs@https://codeload.github.com/zhuker/lamejs/tar.gz/582bbba6a12f981b984d8fb9e1874499fed85675:
dependencies:
use-strict: 1.0.1
@@ -3419,6 +3485,10 @@ snapshots:
prelude-ls: 1.2.1
type-check: 0.4.0
lie@3.3.0:
dependencies:
immediate: 3.0.6
lightningcss-android-arm64@1.30.2:
optional: true
@@ -3607,6 +3677,8 @@ snapshots:
dependencies:
p-limit: 3.1.0
pako@1.0.11: {}
parent-module@1.0.1:
dependencies:
callsites: 3.1.0
@@ -3639,6 +3711,8 @@ snapshots:
prelude-ls@1.2.1: {}
process-nextick-args@2.0.1: {}
prop-types@15.8.1:
dependencies:
loose-envify: 1.4.0
@@ -3658,6 +3732,16 @@ snapshots:
react@19.2.0: {}
readable-stream@2.3.8:
dependencies:
core-util-is: 1.0.3
inherits: 2.0.4
isarray: 1.0.0
process-nextick-args: 2.0.1
safe-buffer: 5.1.2
string_decoder: 1.1.1
util-deprecate: 1.0.2
reflect.getprototypeof@1.0.10:
dependencies:
call-bind: 1.0.8
@@ -3708,6 +3792,8 @@ snapshots:
has-symbols: 1.1.0
isarray: 2.0.5
safe-buffer@5.1.2: {}
safe-push-apply@1.0.0:
dependencies:
es-errors: 1.3.0
@@ -3747,6 +3833,8 @@ snapshots:
es-errors: 1.3.0
es-object-atoms: 1.1.1
setimmediate@1.0.5: {}
sharp@0.34.5:
dependencies:
'@img/colour': 1.0.0
@@ -3872,6 +3960,10 @@ snapshots:
define-properties: 1.2.1
es-object-atoms: 1.1.1
string_decoder@1.1.1:
dependencies:
safe-buffer: 5.1.2
strip-bom@3.0.0: {}
strip-json-comments@3.1.1: {}
@@ -4012,6 +4104,8 @@ snapshots:
use-strict@1.0.1: {}
util-deprecate@1.0.2: {}
which-boxed-primitive@1.1.1:
dependencies:
is-bigint: 1.1.0

15
types/lamejs.d.ts vendored Normal file

@@ -0,0 +1,15 @@
declare module 'lamejs/src/js/index.js' {
export class Mp3Encoder {
constructor(channels: number, samplerate: number, kbps: number);
encodeBuffer(left: Int16Array, right: Int16Array): Int8Array;
flush(): Int8Array;
}
export class WavHeader {
dataOffset: number;
dataLen: number;
channels: number;
sampleRate: number;
static readHeader(dataView: DataView): WavHeader;
}
}
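A minimal encoding sketch against this declaration; the stereo layout, 1152-sample block size, and 192 kbps bitrate are illustrative assumptions, not taken from the diff:
import { Mp3Encoder } from 'lamejs/src/js/index.js';

// Encode two 16-bit PCM channels to an MP3 blob, one MPEG frame (1152 samples) at a time.
function encodeToMp3(left: Int16Array, right: Int16Array, sampleRate: number): Blob {
  const encoder = new Mp3Encoder(2, sampleRate, 192); // stereo at 192 kbps (assumed settings)
  const chunks: Int8Array[] = [];
  const block = 1152;
  for (let i = 0; i < left.length; i += block) {
    const frame = encoder.encodeBuffer(left.subarray(i, i + block), right.subarray(i, i + block));
    if (frame.length > 0) chunks.push(frame);
  }
  const tail = encoder.flush();
  if (tail.length > 0) chunks.push(tail);
  return new Blob(chunks, { type: 'audio/mpeg' });
}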

29
types/marker.ts Normal file

@@ -0,0 +1,29 @@
/**
* Region marker type definitions
* Markers help navigate and organize the timeline
*/
/**
* Marker types
* - point: A single point in time (like a cue point)
* - region: A time range with start and end
*/
export type MarkerType = 'point' | 'region';
/**
* Single marker or region
*/
export interface Marker {
id: string;
name: string;
type: MarkerType;
time: number; // Start time in seconds
endTime?: number; // End time for regions (undefined for point markers)
color?: string; // Optional color for visual distinction
description?: string; // Optional description/notes
}
/**
* Helper type for creating new markers
*/
export type CreateMarkerInput = Omit<Marker, 'id'>;
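A sketch of how CreateMarkerInput might be used; the createMarker factory, id source, and import path are assumptions, not part of the diff:
import type { Marker, CreateMarkerInput } from './marker'; // path assumed

// Hypothetical factory: the caller supplies everything except the id.
function createMarker(input: CreateMarkerInput): Marker {
  return { id: crypto.randomUUID(), ...input };
}

const chorus = createMarker({
  name: 'Chorus',
  type: 'region',
  time: 42,
  endTime: 58.5, // regions carry an end time; point markers omit it
  color: '#f59e0b',
});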

types/track.ts

@@ -34,6 +34,8 @@ export interface Track {
collapsed: boolean;
selected: boolean;
showEffects: boolean; // Show/hide per-track effects panel
effectsExpanded?: boolean; // Whether effects bar is expanded (when showEffects is true)
automationExpanded?: boolean; // Whether automation bar is expanded (shows full controls)
// Selection (for editing operations)
selection: Selection | null;
@@ -68,7 +70,7 @@ export const TRACK_COLORS: Record<TrackColor, string> = {
gray: 'rgb(156, 163, 175)',
};
- export const DEFAULT_TRACK_HEIGHT = 300; // Knob + fader with labels + R/S/M buttons
- export const MIN_TRACK_HEIGHT = 220; // Minimum to fit knob + fader with labels + buttons
+ export const DEFAULT_TRACK_HEIGHT = 400; // Knob + fader with labels + R/S/M/A/E buttons
+ export const MIN_TRACK_HEIGHT = 400; // Minimum to fit knob + fader with labels + all buttons
export const MAX_TRACK_HEIGHT = 500; // Increased for better waveform viewing
export const COLLAPSED_TRACK_HEIGHT = 48; // Extracted constant for collapsed state
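One way these constants might be applied when resizing a track; the resolveTrackHeight helper and import path are assumptions, not part of the diff:
import { MIN_TRACK_HEIGHT, MAX_TRACK_HEIGHT, COLLAPSED_TRACK_HEIGHT } from './track'; // path assumed

// Clamp a dragged height into the allowed range; collapsed tracks use the fixed collapsed height.
function resolveTrackHeight(requested: number, collapsed: boolean): number {
  if (collapsed) return COLLAPSED_TRACK_HEIGHT;
  return Math.min(MAX_TRACK_HEIGHT, Math.max(MIN_TRACK_HEIGHT, requested));
}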