Feature Requests

Stable ID-Based MIDI Binding for Clips and Widgets
Description: Implement a robust MIDI binding system that links MIDI controls to persistent internal IDs for clips and widgets, instead of to their current index, position, or label. The goal is to keep MIDI mappings stable across layout edits, reordering, and design changes, so controllers remain reliable as projects evolve.

Problem: Currently, MIDI bindings appear to be tied to positional or name-based references (for example, "Clip 15" or "Slider 7"). When the user adds, deletes, or reorders clips or widgets, these indices shift, causing existing bindings to:
- Point to the wrong clip or widget, or
- Break entirely and require manual rebuilding.
This leads to:
- MIDI mappings becoming unreliable after even minor layout tweaks.
- Repeatedly having to re-check and rebuild bindings when refining a page or moving UI elements.
- A strong disincentive to iterate on layouts once MIDI is set up.
- High risk of failure in live performance, where a moved clip or widget can silently invalidate mappings.
Users report that even moving a clip slightly on the canvas can change its internal numbering and break mappings, forcing a tedious verification pass over "each and every binding" after small visual adjustments. It also limits practical reuse of MIDI setups across pages and projects.

Proposed Solution: Introduce ID-based object binding for all MIDI mappings:
- Assign each clip and widget a persistent internal ID (unique object identifier).
- When a MIDI binding is created, store the binding against that internal ID, not its index, row, column, or label.
- Ensure that:
  - Reordering clips or widgets on a page does not affect their MIDI bindings.
  - Moving a clip or widget between rows or positions on the canvas leaves its bindings intact.
  - Duplicating a clip or widget (including across pages or projects) offers the option to either copy the bindings along with the object, or create a clean copy without bindings (user-selectable behavior).
  - Only explicit deletion of a clip or widget invalidates its associated bindings.

Implementation notes:
- Provide a safe migration path for existing projects (e.g. converting current index-based bindings to ID-based on load); see the sketch after this entry.
- In the control settings / profiles UI, display the bound target by name and location for user clarity, but internally use the stable ID.
- Optionally expose a "relink target" function for reassigning a binding to another object without recreating it from scratch.

Benefits:
- MIDI mappings become resilient to layout changes, renaming, and reordering.
- Users can freely refine pages, move clips a "few centimeters" for better ergonomics, or redesign a performance layout without destroying their control setup.
- Greatly improved reliability in live contexts, where any unexpected re-mapping is unacceptable.
- Enables copying individual clips or widgets (with their bindings) across pages and projects as reusable building blocks.
- Encourages experimentation and modular UI design, fully aligned with Loopy Pro's flexible canvas concept.

Examples:
Layout refinement:
- A row of clips is moved down to make room for new controls.
- With ID-based binding, the same footswitches still trigger the same clips as before, regardless of their new positions.
Reusing a performance widget:
- A "Record/Play/Stop" button widget with carefully tuned MIDI bindings is copied to a new page or project.
- The copy retains its mappings to the intended target clip or role, instead of reverting to default or pointing at the wrong object.
Multi-song setups:
- A user designs a template page with a grid of clips and a bank of widgets mapped to a MIDI controller.
- They duplicate the page for Song 2 and Song 3, adjust clip contents and layout, and all bindings continue to work without manual re-learning.

This summary was automatically generated by GPT-5.1 Thinking on 2025-12-27.
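A minimal data-model sketch of the idea above, in Swift. The type names (Clip, MidiBinding, Project) and the migration helper are illustrative assumptions, not Loopy Pro's actual internals; the point is only that a binding references a persistent UUID rather than an array index, so reordering or moving clips cannot redirect it.

```swift
import Foundation

// Hypothetical data model (names are assumptions, not Loopy Pro's code).
struct Clip {
    let id: UUID          // persistent identity, assigned once at creation
    var name: String
    var position: Int     // layout position; free to change at any time
}

struct MidiBinding {
    let control: UInt8    // e.g. a CC or note number
    let targetID: UUID    // stable reference, not an index, row, or label
}

struct Project {
    var clips: [Clip]
    var bindings: [MidiBinding]

    // Resolve a binding by ID; layout edits cannot change the result.
    func target(of binding: MidiBinding) -> Clip? {
        clips.first { $0.id == binding.targetID }
    }

    // One possible migration path for legacy index-based bindings:
    // resolve each index once at load time and freeze it as an ID.
    mutating func migrate(legacy: [(control: UInt8, clipIndex: Int)]) {
        for entry in legacy where clips.indices.contains(entry.clipIndex) {
            bindings.append(MidiBinding(control: entry.control,
                                        targetID: clips[entry.clipIndex].id))
        }
    }
}
```

With this shape, reordering the `clips` array or changing `position` leaves every `MidiBinding` pointing at the same object; only deleting the clip (removing its UUID) invalidates the binding.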
4 · chatgpt-proof-read-done · under review
Support Multi-Output AUv3 Effects (Route Each Output Bus Independently)
Description: Enable routing of AUv3 effects that expose multiple output buses (e.g. 4 outputs) so each output can be sent to different inputs/tracks for separate processing and mixing.

Problem: Some AUv3 effects provide multiple discrete outputs (for example, separate taps, layers, or processing paths). Currently, these multi-output effects behave like single-output units, so their additional outputs cannot be routed independently. This prevents common workflows like splitting an effect's individual outputs to different tracks for separate EQ/comp/reverb, parallel chains, or distinct mixes.

Proposed Solution:
1) Expose AUv3 output buses as routable nodes
- When an AUv3 effect reports multiple output buses, display each bus as a selectable output source (e.g. "Out 1", "Out 2", "Out 3", "Out 4").
- Allow routing each bus to different inputs/tracks/mixers.
2) Add an output-bus selector per route (minimal UI option)
- In the routing UI, add a field like "Output Bus" for AUv3 effects.
- Default to the main/mixed output for compatibility, with manual selection for additional buses.
3) Clear channel/bus mapping and monitoring
- Show how each bus maps to channels (mono/stereo) to avoid confusion.
- Provide a safe fallback (downmix or "main out only") if a project is opened on a system where the AUv3 reports a different bus configuration; see the routing sketch after this entry.
4) Optional: per-bus level meters
- Lightweight metering for each exposed bus to confirm signal is present and routed correctly.

Benefits:
- Unlocks advanced multi-chain workflows on iOS/iPadOS with AUv3 effects that are designed for multi-output operation.
- Enables separate processing per output (EQ/comp/reverb), parallel routing, and cleaner mixes.
- Makes complex performance setups more reliable and predictable (no hidden downmixing).

Examples:
Use Bram Bos Scatterbrain with its 4 outputs:
- Route "Out 1" -> Track A (dry-ish) -> compressor
- Route "Out 2" -> Track B -> shimmer reverb
- Route "Out 3" -> Track C -> band-pass + delay
- Route "Out 4" -> Track D -> distortion + limiter
Sound design: Send different effect outputs to separate loopers/tracks to record and layer each output independently.
Live mixing: Keep one output as the main mix, while routing another output to a dedicated track for a sidechain or audience FX send.

This summary was automatically generated by GPT-5.1 Thinking on 2025-12-09.

Original Post: Trying to use Bram Bos Scatterbrain with its 4 outputs in Loopy Pro, but it seems that is not yet possible? My main feature request would be to be able to route multiple outs from an FX to multiple inputs for separate processing.
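A hedged sketch of the per-bus routing model and fallback behavior described above, using plain Swift value types. EffectNode, BusRoute, and validatedRoutes are illustrative assumptions, not Loopy Pro's routing API; a real host would read the bus count from the plug-in (e.g. AUAudioUnit's outputBusses array) rather than store it as a constant.

```swift
import Foundation

// Hypothetical routing model; all names are illustrative.
struct EffectNode {
    let name: String
    let outputBusCount: Int      // as reported by the AUv3 at load time
}

struct BusRoute {
    let sourceBus: Int           // 0 = main/mixed output (compatibility default)
    let destinationTrack: String
}

// Safe fallback: if a saved route references a bus the plug-in no longer
// reports (e.g. project opened on a different system), route the main output
// instead of silently dropping audio.
func validatedRoutes(_ routes: [BusRoute], for effect: EffectNode) -> [BusRoute] {
    routes.map { route in
        route.sourceBus < effect.outputBusCount
            ? route
            : BusRoute(sourceBus: 0, destinationTrack: route.destinationTrack)
    }
}

// Example: a Scatterbrain-style 4-output effect routed to four tracks.
let scatter = EffectNode(name: "Scatterbrain", outputBusCount: 4)
let routes = [
    BusRoute(sourceBus: 0, destinationTrack: "Track A"),
    BusRoute(sourceBus: 1, destinationTrack: "Track B"),
    BusRoute(sourceBus: 2, destinationTrack: "Track C"),
    BusRoute(sourceBus: 3, destinationTrack: "Track D")
]
let safeRoutes = validatedRoutes(routes, for: scatter)
```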
1 · chatgpt-proof-read-done · under review
Dedicated "Set Lists" Access in Main Menu (Optional Floating Button)
Description: Add a dedicated, performance-friendly way to open "Set Lists" directly from the main UI, without having to drill down through the project browser hierarchy. The goal is to make set lists reliably one tap away in live use, on any screen size.

Problem: Accessing set lists currently requires several taps and context switches via the project browser. On stage, this extra navigation is slow and increases the risk of wrong taps, especially on smaller screens, in low-light conditions, or under time pressure between songs. Set lists are a performance-critical feature for users managing many projects, but their current placement deep in the browser makes quick, reliable access harder than necessary and can interrupt flow.

Proposed Solution: Provide one (or more) of the following UI options to surface set lists more prominently:
1) Main menu entry
- Add a direct "Set Lists" entry in the main menu, alongside other top-level destinations.
- This makes set lists consistently one tap away from anywhere in the app.
2) Dedicated button under the folder (browser) icon
- Under the current main menu folder icon, add a clearly labeled "Set Lists" button.
- Tapping it opens the set list view immediately, without extra navigation steps.
3) Optional floating "Set Lists" button (most performance-oriented)
- Add a preference: "Show Set Lists Button".
- When enabled, display a floating on-screen button that opens the set list view instantly (see the UI sketch after this entry).
- Allow the button to be dragged and positioned anywhere on the canvas, so left- and right-handed users can place it in their preferred reach zone.
- Allow toggling the button on/off in preferences to keep layouts clean for users who do not need it all the time.

Compatibility / integration notes:
- The dedicated access options should coexist with existing set list functionality in the browser.
- This feature aligns well with related requests around better project/load targets (e.g. choosing between folders, set lists, recent projects, reload current project).

Benefits:
- Faster navigation between projects and songs in live sets, with fewer taps.
- Reduced cognitive load and lower error risk when switching pieces under pressure.
- Better ergonomics and accessibility, especially with a movable floating button that can be placed for one-handed operation.
- Encourages consistent use of set lists for large libraries (dozens or hundreds of projects), improving overall workflow for complex shows.
- Keeps the UI flexible: users who prefer a clean screen can disable the floating button and rely on the main menu entry instead.

Examples:
Large live set with 60+ songs:
- Tap the floating "Set Lists" button.
- Tap the next project in the set list.
- The project loads immediately, without having to re-open the browser or change context.
One-handed performance:
- Place the floating "Set Lists" button in the bottom-right corner for right-handed use (or bottom-left for left-handed use).
- Quickly switch to the next song between phrases with a single thumb tap.
Studio or rehearsal workflow:
- Keep the floating button disabled for a cleaner layout.
- Use the main menu "Set Lists" entry whenever you need to jump between rehearsal projects, retaining the same quick access without cluttering the canvas.

This summary was automatically generated by GPT-5.1 Thinking on 2025-12-27.
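A minimal SwiftUI sketch of option 3 above: a preference-gated, draggable floating button that opens a set list view. The preference key, view names, and placeholder sheet are assumptions for illustration only, not Loopy Pro's implementation.

```swift
import SwiftUI

// Illustrative sketch: floating "Set Lists" button gated by a user preference.
struct SetListsOverlay: View {
    @AppStorage("showSetListsButton") private var showButton: Bool = true
    @State private var offset: CGSize = .zero       // current drag offset
    @State private var restingOffset: CGSize = .zero
    @State private var showingSetLists = false

    var body: some View {
        ZStack(alignment: .bottomTrailing) {
            Color.clear   // stands in for the canvas content
            if showButton {
                Button("Set Lists") { showingSetLists = true }
                    .padding(12)
                    .background(.thinMaterial, in: Capsule())
                    .offset(offset)
                    .gesture(
                        DragGesture()
                            // Reposition freely so users can pick their reach zone.
                            .onChanged { value in
                                offset = CGSize(width: restingOffset.width + value.translation.width,
                                                height: restingOffset.height + value.translation.height)
                            }
                            .onEnded { _ in restingOffset = offset }
                    )
            }
        }
        .sheet(isPresented: $showingSetLists) {
            Text("Set list view goes here")   // placeholder for the real set list UI
        }
    }
}
```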
2 · chatgpt-proof-read-done · under review
Multi-Target Morph Pad With Zones, Rails, and High-Resolution Output
Description: Add a new control-surface widget: a "Morph Pad" that can continuously morph (interpolate) between multiple user-defined target states. Each target stores values for many destinations at once (AUv3 parameters, mixer controls, sends, actions, and/or MIDI). The performer moves a cursor (finger) on a 2D pad, and Loopy Pro outputs a smooth blend across all assigned destinations. The key goal: one gesture should reliably and repeatably control many parameters ("XYZABC..."), not just X and Y.

Problems:
- Complex transitions currently require many separate controls (sliders/knobs) or multiple XY pads, which is slow to build and fragile live.
- Live morphing across many parameters is hard to hit precisely and hard to repeat.
- Freeform touch control without target/snap logic can cause jitter near boundaries and makes it difficult to land on musically meaningful states.
- Users who want "morphing" often depend on external apps/controllers, adding routing complexity and failure points.

Proposed Solution:
1) Morph core: Targets (the foundation)
- Allow adding N "Targets" (e.g., 2–16+).
- Each Target stores a full snapshot of assigned destinations (any number of parameters/controls).
- During performance, compute weights per Target (distance-based interpolation) and output interpolated values to all destinations in real time (see the interpolation sketch after this entry).
2) Live-safe precision
- Optional "Magnet/Snap" to Targets (strength + radius).
- Optional hysteresis/stability to prevent flicker when hovering near boundaries or between Targets.
- Optional release behavior: hold last value, return to a default Target, or spring to center.
3) Zones (aligned, performance-oriented)
- Provide an aligned zone editor (rectangles/polygons with snap-to-grid, numeric sizing/position).
- Zones serve as: a) visual overlays (labels) to communicate intent, and/or b) mapping layers: Zone A morphs parameter set A, Zone B morphs parameter set B.
- Rationale: aligned zones keep values easy to aim for and repeatable under finger performance, while still enabling complex layouts.
4) Rails/Paths (line tool for repeatable morph gestures)
- Let users define one or more "Rails" (paths).
- Optional cursor lock-to-rail: the pad behaves like a constrained morph fader along an arbitrary curve.
- Rails enable stage-proof morphs (Clean -> Big -> Destroyed) with minimal risk of unintended states.
5) Scaling, curves, and limits per destination
- Per destination: min/max range, curve (linear/exponential/S-curve), invert, smoothing.
- Optional quantized steps for musically discrete parameters (e.g., rhythmic divisions).
6) High-resolution control output (optional)
- Internal high-resolution smoothing for AUv3 parameters.
- Optional high-resolution external MIDI modes (e.g., 14-bit CC via MSB/LSB pairs and/or NRPN) where appropriate.
7) Fast workflow ("build it in minutes")
- "Add Destination" / learn workflow to capture AUv3 params or internal controls quickly.
- "Capture Target" button: store current values into the selected Target.
- Copy/paste Targets and mappings, plus a clear overview list of all destinations.

Benefits:
- Dramatically reduces UI clutter while increasing expressive control.
- Enables repeatable, precise morphing between meaningful sound states.
- Improves reliability for live performance through targets, snap, hysteresis, and rails.
- Unifies internal control and external MIDI ecosystems without extra routing apps.

Examples:
- FX Morph: one pad morphs reverb mix, delay feedback, filter cutoff, and drive from "Clean" to "Cinematic" to "Aggressive".
- Loop Scene Morph: crossfade track groups, adjust send levels, and tweak global FX with one gesture.
- Safe Rail: a single "Clean -> Big" rail that is easy to hit and repeat under stress.
- Zone Layers: the top half morphs "Ambient FX", the bottom half morphs "Rhythmic FX", with identical hand motion producing different musical outcomes depending on region.

This summary was automatically generated by GPT-5.2 Thinking on 2025-12-27.
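A minimal sketch of the distance-based interpolation described in the "Morph core" item above, assuming inverse-square distance weighting as the kernel. All names (PadPoint, MorphTarget, morph) are illustrative; the snap, hysteresis, zone, and rail logic from the proposal would sit on top of something like this.

```swift
import Foundation

// Illustrative morph core: blend N target snapshots by cursor distance.
struct PadPoint { var x: Double; var y: Double }

struct MorphTarget {
    var position: PadPoint            // location on the 2D pad (0...1 range)
    var values: [String: Double]      // destination ID -> stored value
}

func morph(at cursor: PadPoint, targets: [MorphTarget]) -> [String: Double] {
    // Weight each target by 1 / distance^2; an exact hit returns that target.
    var weights: [Double] = []
    for target in targets {
        let dx = cursor.x - target.position.x
        let dy = cursor.y - target.position.y
        let distanceSquared = dx * dx + dy * dy
        if distanceSquared < 1e-12 { return target.values }   // sitting on a target
        weights.append(1.0 / distanceSquared)
    }
    let totalWeight = weights.reduce(0, +)

    // Blend every destination across all targets in proportion to its weight.
    var output: [String: Double] = [:]
    for (target, weight) in zip(targets, weights) {
        for (destination, value) in target.values {
            output[destination, default: 0] += value * (weight / totalWeight)
        }
    }
    return output
}

// Example: "Clean" at the left edge, "Aggressive" at the right edge.
// A cursor at x = 0.75 yields a blend weighted toward "Aggressive".
let clean = MorphTarget(position: PadPoint(x: 0.0, y: 0.5),
                        values: ["reverb.mix": 0.1, "filter.cutoff": 0.9])
let aggressive = MorphTarget(position: PadPoint(x: 1.0, y: 0.5),
                             values: ["reverb.mix": 0.8, "filter.cutoff": 0.3])
let blend = morph(at: PadPoint(x: 0.75, y: 0.5), targets: [clean, aggressive])
```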
1 · chatgpt-proof-read-done · under review
Clear Indicator When Slider Value Is Rounded for Display
Description: When adjusting parameters via sliders, the internally selected value can be a high-precision float (e.g. 0.5966716) while the UI displays a rounded value (e.g. 0.6). This can mislead users into thinking the value is exactly what is shown.

Problem: Slider interaction often lands on a non-rounded internal value, but the UI shows a rounded number without any hint. Users may assume they selected an exact value (e.g. exactly 0.6) when they actually did not. This can cause subtle inconsistencies when matching settings, recreating sounds, comparing presets, or troubleshooting automation.

Proposed Solution: Add a clear, minimal rounding indicator whenever the displayed value is not equal to the actual internal value.
Recommended implementation (low UI impact, high clarity):
- Append a small indicator symbol next to the displayed value, e.g. "≈".
  - Example: "0.6 ≈"
- Only show the indicator when rounding is happening:
  - Condition: displayedValue != actualValue (after formatting/rounding for display; see the sketch after this entry).
- Optional (but very useful): on tap/long-press/hover, show a tooltip or popover that reveals the exact value.
  - Example tooltip: "Displayed value is rounded. Actual: 0.5966716"
Alternative implementations (if preferred):
- Show a secondary smaller line only when needed:
  - Main: "0.6"
  - Secondary: "exact: 0.5966716"
- Use a short label instead of a symbol:
  - Example: "0.6 (rounded)"

Benefits:
- Eliminates confusion about whether the shown value is exact or rounded.
- Improves trust and transparency in parameter editing.
- Helps users reproduce settings accurately (especially with automation, presets, and A/B comparisons).
- Keeps the UI clean because the indicator appears only when necessary.

Examples:
- The user drags a slider and the parameter internally becomes 0.5966716, but the UI formats it to 0.6:
  - Display: "0.6 ≈"
  - Tooltip: "Displayed value is rounded. Actual: 0.5966716"
- If the internal value is exactly 0.6:
  - Display: "0.6" (no indicator)

This summary was automatically generated by GPT-5.2 Thinking on December 27, 2025.
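A tiny Swift sketch of the display rule above (show "≈" only when the formatted value no longer equals the underlying value). The function name and default digit count are assumptions for illustration.

```swift
import Foundation

// Illustrative formatter: append "≈" only when display rounding loses precision.
func displayString(for value: Double, fractionDigits: Int = 1) -> String {
    let formatted = String(format: "%.\(fractionDigits)f", value)
    // Round-trip the display string and compare against the actual value.
    let isRounded = Double(formatted) != value
    return isRounded ? "\(formatted) ≈" : formatted
}

// displayString(for: 0.5966716)  -> "0.6 ≈"
// displayString(for: 0.6)        -> "0.6"   (no indicator)
```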
1 · chatgpt-proof-read-done · under review
Clip Offset + Snap On/Off
Description: Introduce fine-grained control over the start position and snap behavior of loops, including the ability to offset clips by precise values and toggle snap modes dynamically.

Problem: Currently, clip playback and overdub always start at the clip's natural beginning. There's no way to nudge the start point slightly (for timing correction or groove) or start clips at intentional musical offsets. Additionally, toggling between freeform and grid-based snap isn't practical in live performance scenarios.

Proposed Solution:
– Add support for clip offset values, applied per clip or per overdub layer
– Offset types could include (see the conversion sketch after this entry):
  • Time (e.g. ms, beats, bars)
  • Percent of clip
  • Samples
  • Degrees (clip rotation)
  • Cycles
– Add Snap ON/OFF toggle with assignable actions
– Add "back to original snap" action
– MIDI knob/fader can be assigned to cycle/select snap/offset values live
– Optional: individual snap/offset settings per overdub layer

Benefits:
✅ Correct slight timing issues post-recording without re-recording
✅ Creative manipulation of groove, swing, or polyrhythmic layers
✅ Dynamic control over grid behavior during performance
✅ Ideal for glitch looping, layering FX, or experimental workflows
✅ Supports both precision timing needs and playful offset experimentation

Example:
Initial Record: offset -1.5 bars
Overdub 1: offset +517 ms
Overdub 2: offset -17.21% of clip
Overdub 3: offset +1/3 clip
Overdub 4: no offset
Overdub 5: offset -651 samples
Overdub 6: offset +141° clip
Overdub 7: offset 2.5 beats
Overdub 8: offset 4.28 clip cycles

This summary was automatically generated by ChatGPT-4 on May 2, 2025.
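A hedged sketch of how the offset units listed above could be normalized to a single internal unit (samples), assuming a known clip length, tempo, and sample rate. The enum cases and function are illustrative, not Loopy Pro's model; a bar-based offset would additionally need the time signature.

```swift
import Foundation

// Illustrative offset model: every unit reduces to a sample count.
enum ClipOffset {
    case milliseconds(Double)
    case beats(Double)
    case percentOfClip(Double)   // -100...100
    case samples(Int)
    case degrees(Double)         // 360° = one full clip cycle
    case cycles(Double)          // whole clip lengths
}

func offsetInSamples(_ offset: ClipOffset,
                     clipLengthSamples: Int,
                     sampleRate: Double,
                     tempoBPM: Double) -> Int {
    switch offset {
    case .milliseconds(let ms):
        return Int(ms / 1000.0 * sampleRate)
    case .beats(let beats):
        return Int(beats * 60.0 / tempoBPM * sampleRate)
    case .percentOfClip(let percent):
        return Int(percent / 100.0 * Double(clipLengthSamples))
    case .samples(let samples):
        return samples
    case .degrees(let degrees):
        return Int(degrees / 360.0 * Double(clipLengthSamples))
    case .cycles(let cycles):
        return Int(cycles * Double(clipLengthSamples))
    }
}

// Example (Overdub 6): +141° of a 4-bar clip at 120 BPM / 48 kHz.
// 16 beats = 8 s = 384,000 samples; 141/360 of that = 150,400 samples.
let samples = offsetInSamples(.degrees(141),
                              clipLengthSamples: 384_000,
                              sampleRate: 48_000,
                              tempoBPM: 120)
```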
7 · chatgpt-proof-read-done · under review
Action to Dynamically Replace Text in Text Widgets
Description: Add an action that can directly replace the text content of a text widget on the canvas. A button (or other trigger) should be able to target a specific text box and set its text to a defined value, or potentially to a dynamically generated string. This enables text widgets to act as live information displays for controller profiles, signal chains, lyrics, notes, and other performance-related messages.

Problem: Currently, text widgets are mostly static labels. There is no simple way to:
- Press a button and have a text box update to show a different message.
- Use text widgets as dynamic status displays (e.g. current MIDI controller profile, which signal chain is active, which section of a set is playing).
- Integrate text changes with follow actions or other logic to step through lyrics, chord prompts, or performance notes.
As a result:
- MIDI controller users cannot easily see which "mode/profile" they're in from a big on-screen label.
- Live performers can't quickly show large, clear messages on screen for themselves or the audience ("Chill Part", "Solo Coming Up", "Next Song: ...") triggered by buttons.
- Designers who create on-screen control layouts for MIDI mapping can't annotate or update these layouts with changing text states.
- Any kind of pseudo-teleprompter, chord progression stepper, or random notes requires awkward workarounds instead of a straightforward text-replace action.
This limits the potential of text widgets as flexible, performance-relevant UI elements.

Proposed Solution: Introduce a "Replace Text in Text Box" action with clear targeting and configuration (see the sketch after this entry):
Core action behavior:
- Action: "Change / Replace Text"
- Parameters:
  - Target: select a specific text widget.
  - New Text: user-specified content (literal string or dynamic expression in the future).
- When the action is triggered, the target text widget's content is immediately replaced with the specified text.
Trigger sources:
- Canvas buttons (on press).
- MIDI controller inputs.
- Follow actions (e.g. step through a sequence of messages).
- Other Loopy actions/macros.
Dynamic usage examples and extensions:
- Use with dials and other widgets while a broader "dynamic text" system is still evolving: for example, a button that updates a text box to show which control profile is active.
- Support performance communication: show large text on screen indicating which signal chain is currently in use, or display notes to the audience on an external screen ("Next piece", "Short break", etc.).
- Integration with follow actions: cycle through chords, lyric fragments, or other notations by chaining text-replace actions, or build a "scrolling" or stepwise teleprompter-style system.
Future-friendly design:
- The same mechanism could later be expanded to accept variables or bindings (e.g. insert current tempo, section name, loop name).
- Could integrate with a more advanced "dynamic text" concept, but this basic replace-text action is a powerful step on its own.

Benefits:
- Dynamic information display: turn text widgets into live status indicators for MIDI profiles, routing states, scenes, and more.
- Better feedback for controller users: MIDI controller setups can have clear on-screen labels showing which layout or profile is currently active.
- Performance messaging: an easy way to push big, legible messages to the screen for performer or audience, especially when Loopy is on a large display.
- Lightweight building block: a simple, generic action that unlocks many creative uses (teleprompter, chord prompts, random notes) without requiring complex new UI.

Examples:
A MIDI controller profile display:
- Several buttons select different MIDI controller modes (e.g. "Drums", "FX", "Loop Control").
- Each button triggers a "Replace Text" action targeting a central text box:
  - "Current Profile: DRUMS"
  - "Current Profile: FX"
  - "Current Profile: LOOPS"
Signal chain indicator:
- Different buttons activate different signal chains or scenes.
- The same button press updates a big text widget to show:
  - "Chain A: Clean Ambient"
  - "Chain B: Heavy FX"
  - "Chain C: Vocals + Delay"
Lyrics / chords / notes:
- A series of follow actions steps through a list of text replacements.
- Each trigger replaces the text box content with the next lyric line or chord.
- On the big screen, this becomes a simple scrolling notes system for the performer or audience.

This summary was automatically generated by GPT-5.1 Thinking on 2025-11-20.
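A small Swift sketch of the action shape described above: a replace-text action bound to a target widget, plus a follow-action-style sequence that steps through lines (the teleprompter/chord-prompt case). TextWidget, ReplaceTextAction, and TextSequence are illustrative assumptions, not Loopy Pro's action system.

```swift
import Foundation

// Illustrative target: a text widget identified by a stable ID.
final class TextWidget {
    let id = UUID()
    var text: String
    init(text: String) { self.text = text }
}

// Core action behavior: replace the target widget's content with a new string.
struct ReplaceTextAction {
    let target: TextWidget
    let newText: String
    func perform() { target.text = newText }
}

// Follow-action-style stepper: each trigger applies the next replacement.
final class TextSequence {
    private let actions: [ReplaceTextAction]
    private var index = 0
    init(target: TextWidget, lines: [String]) {
        actions = lines.map { ReplaceTextAction(target: target, newText: $0) }
    }
    func trigger() {
        guard !actions.isEmpty else { return }
        actions[index].perform()
        index = (index + 1) % actions.count   // wrap around for looping sets
    }
}

// Example: a profile label updated by mode buttons.
let profileLabel = TextWidget(text: "Current Profile: —")
ReplaceTextAction(target: profileLabel, newText: "Current Profile: DRUMS").perform()

// Example: stepping through lyric lines on each button press.
let lyrics = TextSequence(target: profileLabel,
                          lines: ["Verse 1 ...", "Chorus ...", "Bridge ..."])
lyrics.trigger()
```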
3 · chatgpt-proof-read-done · under review
Piano Roll MIDI Editor Enhancements (editing tools, CC/MPE lanes, navigation & grid)
Description: Upgrade the Piano Roll with pro-grade editing tools, navigation, and controller lanes: multi-clip/ghost editing, scale & drum maps, separate position vs. length quantize, CC/per-note lanes (MPE/MIDI 2.0), line/curve tools, fold/filter views, and performance-safe playback with chase.

Problem: Complex MIDI editing currently takes too many steps: limited CC tooling, no per-note expression lanes, slow navigation on dense clips, and workarounds for drum maps, scale constraints, or ghost references. This slows arrangement, harms live reliability, and obscures key information (articulations, keyswitches, pedal splits).

Proposed Solution:
Navigation & Selection
- Zoom to selection/loop; Focus Playhead; magnetic scroll; marquee + smart lasso.
- Time Selection that spans tracks/clips for batch ops (copy/move/delete/ripple).
- "Solo edit" focus per track/clip; Ghost Notes from other clips/tracks (dimmed, color-tagged).
Grid, Snap & Quantize
- Independent Position Quantize and Length Quantize (see the sketch after this entry); Relative Snap; triplet/dotted sets; cycle-of-clip grid.
- Scale overlays; bar emphasis; late-press guard for record and edit commits.
Note Editing Tools
- Draw/erase/paint; Split at Grid / Join / Glue; Legato (min gap); Humanize (pos/vel/len).
- Strum/Flam creator (direction, spread, curve).
- Transform: transpose (±semitones/oct), scale-constrain (block or move to nearest), invert, retrograde, stretch (time).
- Mute/Solo notes, color by channel/pitch/part.
Drum & Scale Workflows
- Drum Editor mode: per-row names (GM/custom), lane mutes/solos, fixed-length paint, velocity bars.
- Fold to used notes, to Scale, or to Drum Map. User libraries for scales & maps.
CC / Pedal / Automation Lanes
- Multiple lanes for CC/AT/PB/NRPN/RPN with Line/Curve/Step/Freehand tools, smoothing & throttle.
- Pedal-aware ops: Bake Sustain (CC64) → Note Lengths, Split at Pedal Lifts, repedal window, half-pedal curves (ties into Sostenuto/Soft).
- Ramp shapes library; anchor points with bezier; copy/paste between lanes.
Per-Note Expression (MPE / MIDI 2.0)
- Per-note PB/AT/CC lanes grouped by note; filter by selected notes; channel/zone aware.
- MIDI 2.0 ready: high-resolution controllers, per-note profiles, 14-bit compatibility.
Articulations & Keyswitches
- Keyswitch lane with labels, latch/hold modes, and relocation tools; articulation mapping presets.
Playback, Chase & Safety
- Chase notes/CC when starting mid-clip; preview on edit; click-safe commit of drastic edits.
- Loop-safe playback; drift meter for clocked rigs; undo groups for compound actions.
Multi-Clip Editing
- Edit several clips at once (layered or tabbed); per-clip color, quick target switch; apply-to-all/selected toggles.
Actions & Variables
- Actions: Quantize (pos/len), Legato, Humanize, Split/Join, Strum, Scale Constrain, Bake/Unbake Sustain, Add CC Lane, Draw Ramp (line/curve), Toggle Ghost, Fold (Used/Scale/Map), Transpose ±, Nudge ±ticks, Stretch Time, Mute/Solo Notes, Set Drum Map/Scale.
- Variables: edit.grid, edit.snapMode, edit.scale.root/type, edit.foldMode, cc.lane[n].type, mpe.enabled, selection.count, quantize.pos/len, chase.enabled.

Benefits:
- Faster, clearer MIDI editing with fewer workarounds and stronger musical intent.
- Deep control over controllers and per-note expression for modern instruments.
- Reliable drum and scale workflows; better readability on dense arrangements.
- Safer live revisions with chase, guard rails, and undo-grouped edits.

Examples:
- Tighten a piano take: Bake Sustain to Lengths, Quantize Position 1/16 (50%), Quantize Length 1/8, then Humanize velocity ±6.
- Program drums: enable Drum Map, Fold to Used, paint fixed 1/16 hats, line-draw an open→closed CC4 curve, then strum tom rolls.
- MPE lead: edit per-note pitch curves for only the selected notes; clamp global PB to ±2 while keeping per-note at ±48.
- Orchestral keyswitches: use the Keyswitch lane to retime articulations to bar lines; lock them during further quantize passes.

This summary was automatically generated by GPT-5.1 Thinking on 2025-11-24.
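A brief Swift sketch of two mechanisms named above: independent position vs. length quantize (quantize strength omitted for brevity), and 14-bit CC output as MSB/LSB pairs on CC n and CC n+32 per the MIDI 1.0 convention. The Note struct and function names are assumptions for illustration only.

```swift
import Foundation

// Illustrative note model; times are in beats (0.25 = a 1/16 note in 4/4).
struct Note {
    var start: Double
    var length: Double
    var pitch: Int
}

// Quantize start positions and note lengths to independent grids;
// pass nil to leave either property untouched.
func quantize(_ notes: [Note],
              positionGrid: Double?,
              lengthGrid: Double?) -> [Note] {
    notes.map { note in
        var result = note
        if let grid = positionGrid {
            result.start = (note.start / grid).rounded() * grid
        }
        if let grid = lengthGrid {
            result.length = max(grid, (note.length / grid).rounded() * grid)
        }
        return result
    }
}

// 14-bit CC: a 0...16383 value split into MSB (CC n) and LSB (CC n + 32),
// both sent as standard Control Change messages on the same channel.
func fourteenBitCC(controller: UInt8, value: UInt16, channel: UInt8) -> [[UInt8]] {
    let msb = UInt8((value >> 7) & 0x7F)
    let lsb = UInt8(value & 0x7F)
    let status: UInt8 = 0xB0 | (channel & 0x0F)
    return [
        [status, controller, msb],
        [status, controller + 32, lsb]
    ]
}

// Example: quantize positions to 1/16 and lengths to 1/8 (in beats).
let tightened = quantize([Note(start: 1.07, length: 0.41, pitch: 60)],
                         positionGrid: 0.25, lengthGrid: 0.5)
```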
2 · chatgpt-proof-read-done · under review