Feature Requests

Multi-View External Touchscreen Mode (Show 4-6 Independent Screens on One Large Display)
Description: Add an external-display mode that shows multiple independent app views ("screens") simultaneously on one large third-party touchscreen (24" or larger). The goal is to run a single device as the brain while presenting 4-6 separate, touch-interactive panels on one big screen (e.g., Clips, Mixer, Widgets, Set List, Browser, Actions).

Problem: Large performance rigs often need multiple views at the same time:
- On iPad, users constantly switch pages/panels (Clips <-> Mixer <-> Widgets <-> Browser), which costs time and increases live risk.
- Using multiple iPads/iPhones as "visual units" works, but adds hardware, power, cooling, cabling, and reliability complexity.
- A large external touchscreen has enough real estate to replace several small screens, but there is no way to display multiple independent views at once (only a single mirrored or single-window view).

In short:
- Excessive view switching during performance increases mistakes and slows operation.
- Multi-device setups are expensive and fragile (battery, heat, mounts, networking, sync).
- A single mirrored view on a large display does not solve the "many panels at once" workflow.

Proposed Solution:
1) External Display "Multi-View" mode
- When an external display is connected, offer a mode where the external screen can be split into 2/4/6 panels (user-selectable layouts).
- Each panel can show an independently chosen view, for example:
  - Clips page (or a specific page)
  - Mixer
  - Widget canvas (a specific widget page)
  - Project browser / set lists
  - Actions / sequences monitor
  - Plugin windows (optional)
2) Touch routing and interaction
- Touch input on the external touchscreen interacts directly with the panel being touched.
- Optional "lock panel" to prevent accidental edits (performance safety).
3) Panel assignment and persistence
- Each panel has a selector to choose its content.
- Save the panel layout per project, or as a global preset ("Stage layout", "Studio layout").
4) Performance-focused options
- "Always-on" critical panels (e.g., master meters + CPU/DSP + battery if available).
- Optional large-font / high-contrast mode for distance viewing.
- Optional "safe mode" where only whitelisted controls are touchable.
5) Compatibility strategy
- If iOS limits true multi-window external UI, provide the best feasible approximation: a dedicated external UI that renders multiple panels while the main device UI remains normal.
- Graceful fallback to the current single-view behavior on unsupported hardware.

Benefits:
- Replaces multiple small devices with one large, touch-capable surface.
- Dramatically reduces live navigation friction (no constant panel switching).
- Improves safety and speed: key controls remain visible and reachable at all times.
- Cleaner stage ergonomics and a simplified rig (less power management, fewer mounts/cables).

Examples:
- 6-panel stage layout on a 24" touchscreen:
  - Top row: Clips (main page), Mixer, Set List
  - Bottom row: Widgets (performance macros), Actions/Sequences status, Master meters
- Rehearsal layout: large clip grid + mixer + browser, with "lock" enabled for non-critical panels.
- Busking/compact rig: 4 panels (Clips, Widgets, Master meters, Quick browser), replacing a multi-phone display setup.

This summary was automatically generated by GPT-5.2 Thinking on 2025-12-29.

Original Post: Allow 6 loopy pro screens to appear on one 24” or larger 3rd party touchscreen. For immediate and very spontaneous live performance access (quick grab and go on a larger format), designed for a quick glance by a front-man guitarist constantly engaging the audience non-stop between songs in a 45-min set, i.e. one screen all drum and drum loop info, one screen prerecorded .mp3 stereo tracks, etc., etc.
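A minimal sketch of how a layout preset and panel-level touch routing could be represented. This is illustrative pseudocode in Python, not Loopy Pro's actual data model; all view names, fields, and the `panel_at` helper are assumptions for this example.

```python
# Hypothetical data model for a saved external-display layout preset.
# View names ("clips", "mixer", ...) are illustrative, not an actual API.
stage_layout = {
    "name": "Stage layout",
    "grid": (2, 3),  # rows x columns -> 6 panels
    "panels": [
        {"view": "clips",   "page": "Main",   "locked": False},
        {"view": "mixer",   "page": None,     "locked": False},
        {"view": "setlist", "page": None,     "locked": True},
        {"view": "widgets", "page": "Macros", "locked": False},
        {"view": "actions", "page": None,     "locked": True},
        {"view": "meters",  "page": None,     "locked": True},
    ],
}

def panel_at(layout, x, y, screen_w, screen_h):
    """Route a touch at pixel (x, y) to the panel under the finger."""
    rows, cols = layout["grid"]
    col = min(int(x / screen_w * cols), cols - 1)
    row = min(int(y / screen_h * rows), rows - 1)
    return layout["panels"][row * cols + col]

# A touch in the top-middle of a 1920x1080 screen lands on the Mixer panel:
print(panel_at(stage_layout, 960, 100, 1920, 1080)["view"])  # mixer
```

The per-panel `locked` flag is where the proposed "lock panel" safety option would hook in: a locked panel still renders, but its touches are discarded.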
2 · chatgpt-proof-read-done · under review
Stable ID-Based MIDI Binding for Clips and Widgets
Description: Implement a robust MIDI binding system that links MIDI controls to persistent internal IDs for clips and widgets, instead of to their current index, position, or label. The goal is to keep MIDI mappings stable across layout edits, reordering, and design changes, so controllers remain reliable as projects evolve.

Problem: Currently, MIDI bindings appear to be tied to positional or name-based references (for example, "Clip 15" or "Slider 7"). When the user adds, deletes, or reorders clips or widgets, these indices shift, causing existing bindings to:
- Point to the wrong clip or widget, or
- Break entirely and require manual rebuilding.

This leads to:
- MIDI mappings becoming unreliable after even minor layout tweaks.
- Repeatedly having to re-check and rebuild bindings when refining a page or moving UI elements.
- A strong disincentive to iterate on layouts once MIDI is set up.
- High risk of failure in live performance, where a moved clip or widget can silently invalidate mappings.

Users report that even moving a clip slightly on the canvas can change its internal numbering and break mappings, forcing a tedious verification pass over "each and every binding" after small visual adjustments. It also limits practical reuse of MIDI setups across pages and projects.

Proposed Solution: Introduce ID-based object binding for all MIDI mappings:
- Assign each clip and widget a persistent internal ID (unique object identifier).
- When a MIDI binding is created, store the binding against that internal ID, not its index, row, column, or label.
- Ensure that:
  - Reordering clips or widgets on a page does not affect their MIDI bindings.
  - Moving a clip or widget between rows or positions on the canvas leaves its bindings intact.
  - Duplicating a clip or widget (including across pages or projects) offers the option to either copy the bindings along with the object, or create a clean copy without bindings (user-selectable behavior).
  - Only explicit deletion of a clip or widget invalidates its associated bindings.

Implementation notes:
- Provide a safe migration path for existing projects (e.g., converting current index-based bindings to ID-based on load).
- In the control settings / profiles UI, display the bound target by name and location for user clarity, but internally use the stable ID.
- Optionally expose a "relink target" function for reassigning a binding to another object without recreating it from scratch.

Benefits:
- MIDI mappings become resilient to layout changes, renaming, and reordering.
- Users can freely refine pages, move clips a "few centimeters" for better ergonomics, or redesign a performance layout without destroying their control setup.
- Greatly improved reliability in live contexts, where any unexpected re-mapping is unacceptable.
- Enables copying individual clips or widgets (with their bindings) across pages and projects as reusable building blocks.
- Encourages experimentation and modular UI design, fully aligned with Loopy Pro’s flexible canvas concept.

Examples:
- Layout refinement: a row of clips is moved down to make room for new controls. With ID-based binding, the same footswitches still trigger the same clips as before, regardless of their new positions.
- Reusing a performance widget: a "Record/Play/Stop" button widget with carefully tuned MIDI bindings is copied to a new page or project. The copy retains its mappings to the intended target clip or role, instead of reverting to default or pointing at the wrong object.
- Multi-song setups: a user designs a template page with a grid of clips and a bank of widgets mapped to a MIDI controller. They duplicate the page for Song 2 and Song 3, adjust clip contents and layout, and all bindings continue to work without manual re-learning.

This summary was automatically generated by GPT-5.1 Thinking on 2025-12-27.
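The core idea, keying bindings by a stable object ID rather than a list index, can be sketched in a few lines of Python. This is a hedged illustration of the requested behavior, not Loopy Pro's internals; the `Clip` class and `resolve` helper are invented for this example.

```python
import uuid

class Clip:
    """A canvas object with a persistent identity, independent of position."""
    def __init__(self, name):
        self.id = uuid.uuid4()  # assigned once at creation, never changes
        self.name = name

# Bindings map a MIDI source to a stable object ID,
# never to an index into the clip list.
clips = [Clip("Verse"), Clip("Chorus"), Clip("Bridge")]
bindings = {("cc", 64): clips[1].id}  # footswitch CC 64 -> "Chorus", by ID

def resolve(binding_key):
    """Look up the bound object by ID; returns None only after deletion."""
    target_id = bindings[binding_key]
    return next((c for c in clips if c.id == target_id), None)

clips.reverse()  # user reorders the whole layout
print(resolve(("cc", 64)).name)  # Chorus -- the binding survives
```

Index-based binding would now trigger "Verse" instead; ID-based binding is unaffected by the reorder, and only removing the object from `clips` (explicit deletion) would invalidate it.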
4 · chatgpt-proof-read-done · under review
Multi-Target Morph Pad With Zones, Rails, and High-Resolution Output
Description: Add a new control-surface widget: a "Morph Pad" that can continuously morph (interpolate) between multiple user-defined target states. Each target stores values for many destinations at once (AUv3 parameters, mixer controls, sends, actions, and/or MIDI). The performer moves a cursor (finger) on a 2D pad, and Loopy Pro outputs a smooth blend across all assigned destinations. The key goal: one gesture should reliably and repeatably control many parameters ("XYZABC..."), not just X and Y.

Problems:
- Complex transitions currently require many separate controls (sliders/knobs) or multiple XY pads, which is slow to build and fragile live.
- Live morphing across many parameters is hard to hit precisely and hard to repeat.
- Freeform touch control without target/snap logic can cause jitter near boundaries and makes it difficult to land on musically meaningful states.
- Users who want "morphing" often depend on external apps/controllers, adding routing complexity and failure points.

Proposed Solution:
1) Morph core: Targets (the foundation)
- Allow adding N "Targets" (e.g., 2–16+).
- Each Target stores a full snapshot of assigned destinations (any number of parameters/controls).
- During performance, compute weights per Target (distance-based interpolation) and output interpolated values to all destinations in real time.
2) Live-safe precision
- Optional "Magnet/Snap" to Targets (strength + radius).
- Optional hysteresis/stability to prevent flicker when hovering near boundaries or between Targets.
- Optional release behavior: hold last value, return to a default Target, or spring to center.
3) Zones (aligned, performance-oriented)
- Provide an aligned zone editor (rectangles/polygons with snap-to-grid, numeric sizing/position).
- Zones serve as:
  a) Visual overlays (labels) to communicate intent, and/or
  b) Mapping layers: Zone A morphs parameter set A, Zone B morphs parameter set B.
Rationale: aligned zones keep values easy to aim for ("anvisierbar") and repeatable under finger performance, while still enabling complex layouts.
4) Rails/Paths (line tool for repeatable morph gestures)
- Let users define one or more "Rails" (paths).
- Optional cursor lock-to-rail: the pad behaves like a constrained morph fader along an arbitrary curve.
- Rails enable stage-proof morphs (Clean -> Big -> Destroyed) with minimal risk of unintended states.
5) Scaling, curves, and limits per destination
- Per destination: min/max range, curve (linear/exponential/S-curve), invert, smoothing.
- Optional quantized steps for musically discrete parameters (e.g., rhythmic divisions).
6) High-resolution control output (optional)
- Internal high-resolution smoothing for AUv3 parameters.
- Optional high-resolution external MIDI modes (e.g., 14-bit CC via MSB/LSB pairs and/or NRPN) where appropriate.
7) Fast workflow ("build it in minutes")
- "Add Destination" / learn workflow to capture AUv3 params or internal controls quickly.
- "Capture Target" button: store current values into the selected Target.
- Copy/paste Targets and mappings, and a clear overview list of all destinations.

Benefits:
- Dramatically reduces UI clutter while increasing expressive control.
- Enables repeatable, precise morphing between meaningful sound states.
- Improves reliability for live performance through targets, snap, hysteresis, and rails.
- Unifies internal control and external MIDI ecosystems without extra routing apps.

Examples:
- FX Morph: one pad morphs Reverb mix, Delay feedback, Filter cutoff, and Drive from "Clean" to "Cinematic" to "Aggressive".
- Loop Scene Morph: crossfade track groups, adjust send levels, and tweak global FX with one gesture.
- Safe Rail: a single "Clean -> Big" rail that is easy to hit and repeat under stress.
- Zone Layers: the top half morphs "Ambient FX", the bottom half morphs "Rhythmic FX", with identical hand motion producing different musical outcomes depending on region.
This summary was automatically generated by GPT-5.2 Thinking on 2025-12-27 .
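The distance-based interpolation at the heart of the morph core can be sketched as inverse-distance weighting: the closer the cursor is to a Target, the more that Target's snapshot dominates the output. A minimal Python sketch follows, with made-up target names and parameters; the actual widget would add snap, hysteresis, and per-destination curves on top.

```python
import math

# Hypothetical targets: each has a position on the unit pad plus a full
# parameter snapshot. Names and parameters are illustrative only.
targets = {
    "Clean":      {"pos": (0.1, 0.1), "reverb_mix": 0.05, "cutoff": 0.9},
    "Cinematic":  {"pos": (0.5, 0.9), "reverb_mix": 0.70, "cutoff": 0.5},
    "Aggressive": {"pos": (0.9, 0.1), "reverb_mix": 0.20, "cutoff": 0.2},
}

def morph(x, y, power=2.0):
    """Inverse-distance-weighted blend of all target snapshots."""
    weights = {}
    for name, t in targets.items():
        d = math.dist((x, y), t["pos"])
        if d < 1e-6:  # cursor exactly on a target: output its snapshot as-is
            return {k: v for k, v in t.items() if k != "pos"}
        weights[name] = 1.0 / d ** power
    total = sum(weights.values())
    params = [k for k in next(iter(targets.values())) if k != "pos"]
    return {p: sum(weights[n] * targets[n][p] for n in targets) / total
            for p in params}

print(morph(0.1, 0.1))  # on "Clean": {'reverb_mix': 0.05, 'cutoff': 0.9}
```

Because weights vary continuously with cursor position, every destination glides smoothly between its stored values; a "Rail" would simply constrain (x, y) to a curve before calling the same blend.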
1 · chatgpt-proof-read-done · under review
Retrospective Loop Waveform Visualization (Pre-Record Buffer + Instant Waveform)
Description: Add a retrospective (pre-record) loop buffer and immediately display a waveform visualization for the captured audio once the loop is created. The goal is to support “I played it already” moments: users can capture the last few seconds/bars of a performance and instantly see the waveform for editing and confidence.

Problem: In live looping, great moments happen unexpectedly:
- Users start playing, then realize they want that phrase as a loop after it happened. Without retrospective capture, the moment is lost or must be recreated.
- Even when audio is captured, the lack of immediate waveform feedback makes it harder to confirm what was recorded, where the transient start is, and whether trimming is needed.

A retrospective buffer plus waveform display would reduce friction and improve confidence when creating loops from spontaneous performance.

Proposed Solution:
1) Retrospective record buffer
- Maintain a continuous circular buffer per selected input/bus (opt-in, to control CPU/memory).
- Allow “Capture last …” actions:
  - Last X seconds
  - Last X beats/bars (tempo-synced), if applicable
- Optionally provide multiple capture lengths (e.g., 2 bars / 4 bars / 8 bars) as separate actions.
2) Automatic loop creation from buffer
- When triggered, Loopy creates a new loop/clip from the buffer content.
- Provide alignment options:
  - Capture aligned to bar boundaries (quantized)
  - Capture “as played” (free)
- Optional transient detection for better start points (nice-to-have).
3) Waveform visualization immediately after capture
- Once the clip exists, show a waveform view right away:
  - In the clip view/editor, or
  - As a temporary overlay preview
- Include basic markers: start/end points, loop boundary, playhead.
4) Editing integration
- Quick trim/adjust loop points directly from the waveform view.
- Optional “fade in/out” for click prevention (if not already present).
5) Performance and quality considerations
- Configurable buffer length and sources to limit memory/CPU usage.
- Use low-latency waveform generation (generate a coarse waveform first, refine in the background if needed).

Benefits:
- Captures spontaneous musical moments that would otherwise be lost.
- Faster loop creation: no need to re-play parts just to record them.
- Immediate visual confirmation improves confidence and reduces re-takes.
- Waveform-based trimming makes loops cleaner and more musical (better starts/stops).

Examples:
- Spontaneous riff capture: the user plays a great 4-bar phrase, then taps “Capture last 4 bars” to instantly create a loop and see the waveform for quick trimming.
- Free-time texture: the user improvises an ambient swell, then captures the last 12 seconds and trims visually to a clean loop boundary.
- Live reliability: the waveform view immediately shows a clipped transient or off start point, allowing a quick fix before the loop enters the mix.

This summary was automatically generated by GPT-5.2 Thinking on 2025-12-29.

Original Post: When using retrospective loops it’s hard to know if I covered a whole loop with audio. To visualize the waveform inside the loop before it is captured would help to know how the final result will be once I close the loop and play it.
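The "continuous circular buffer" in point 1 is the standard retrospective-record technique: keep only the most recent N seconds, overwriting the oldest audio as new audio arrives. A toy Python sketch (not real audio code; `RetroBuffer` and its tiny sample rate are invented for illustration):

```python
from collections import deque

class RetroBuffer:
    """Rolling pre-record buffer: always holds the last `seconds` of audio."""
    def __init__(self, seconds, sample_rate=48_000):
        self.sample_rate = sample_rate
        # deque with maxlen silently drops the oldest samples as new ones arrive
        self.samples = deque(maxlen=int(seconds * sample_rate))

    def feed(self, block):
        """Called with each incoming audio block (e.g., from the audio callback)."""
        self.samples.extend(block)

    def capture_last(self, seconds):
        """Return the most recent `seconds` of audio as a new clip."""
        n = int(seconds * self.sample_rate)
        return list(self.samples)[-n:]

buf = RetroBuffer(seconds=30, sample_rate=4)  # absurdly low rate, for clarity
for t in range(100):                          # feed 100 numbered "samples"
    buf.feed([t])
clip = buf.capture_last(2)                    # last 2 s = last 8 samples
print(clip)  # [92, 93, 94, 95, 96, 97, 98, 99]
```

A real implementation would use a preallocated ring buffer of float frames rather than a deque of Python objects, but the capture logic, slicing the most recent window out of a continuously overwritten store, is the same; tempo-synced "last X bars" just converts bars to seconds first.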
2 · chatgpt-proof-read-done · under review
Support Multi-Output AUv3 Effects (Route Each Output Bus Independently)
Description: Enable routing of AUv3 effects that expose multiple output buses (e.g., 4 outputs) so each output can be sent to different inputs/tracks for separate processing and mixing.

Problem: Some AUv3 effects provide multiple discrete outputs (for example, separate taps, layers, or processing paths). Currently, these multi-output effects behave like single-output units, so their additional outputs cannot be routed independently. This prevents common workflows such as splitting an effect's individual outputs to different tracks for separate EQ/compression/reverb, parallel chains, or distinct mixes.

Proposed Solution:
1) Expose AUv3 output buses as routable nodes
- When an AUv3 effect reports multiple output buses, display each bus as a selectable output source (e.g., "Out 1", "Out 2", "Out 3", "Out 4").
- Allow routing each bus to different inputs/tracks/mixers.
2) Add an output-bus selector per route (minimal UI option)
- In the routing UI, add a field like "Output Bus" for AUv3 effects.
- Default to the main/mixed output for compatibility, with manual selection for additional buses.
3) Clear channel/bus mapping and monitoring
- Show how each bus maps to channels (mono/stereo) to avoid confusion.
- Provide a safe fallback (downmix or "main out only") if a project is opened on a system where the AUv3 reports a different bus configuration.
4) Optional: per-bus level meters
- Lightweight metering for each exposed bus to confirm signal is present and routed correctly.

Benefits:
- Unlocks advanced multi-chain workflows on iOS/iPadOS with AUv3 effects that are designed for multi-output operation.
- Enables separate processing per output (EQ/compression/reverb), parallel routing, and cleaner mixes.
- Makes complex performance setups more reliable and predictable (no hidden downmixing).
Examples:
- Use Bram Bos Scatterbrain with its 4 outputs:
  - Route "Out 1" -> Track A (dry-ish) -> compressor
  - Route "Out 2" -> Track B -> shimmer reverb
  - Route "Out 3" -> Track C -> band-pass + delay
  - Route "Out 4" -> Track D -> distortion + limiter
- Sound design: send different effect outputs to separate loopers/tracks to record and layer each output independently.
- Live mixing: keep one output as the main mix, while routing another output to a dedicated track for a sidechain or audience FX send.

This summary was automatically generated by GPT-5.1 Thinking on 2025-12-09.

Original Post: Trying to use Bram Bos Scatterbrain with its 4 outputs in Loopy Pro but it seems that is not yet possible? Would be my main feature request to be able to route multiple outs from an FX to multiple inputs for separate processing.
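The per-route bus selector and the safe-fallback behavior in point 3 could be modeled as a simple lookup table keyed by (effect, bus index). A hedged Python sketch, the route table, function, and track names are invented for illustration and don't reflect any real Loopy Pro or AUv3 API:

```python
# Hypothetical routing table:
# (effect instance, output bus index) -> destination track.
routes = {
    ("Scatterbrain", 0): "Track A",
    ("Scatterbrain", 1): "Track B",
    ("Scatterbrain", 2): "Track C",
    ("Scatterbrain", 3): "Track D",
}

def resolve_route(effect, bus, reported_bus_count):
    """Resolve a saved route, falling back to the main output (bus 0) when
    the plugin reports fewer buses than the saved project expects."""
    if bus >= reported_bus_count:
        bus = 0  # safe fallback: main/mixed output only
    return routes.get((effect, bus), "Main Out")

# Project saved with 4 buses, but the plugin on this system reports only 2:
print(resolve_route("Scatterbrain", 3, reported_bus_count=2))  # Track A
```

The key design point is that a changed bus configuration degrades to the main output rather than silently dropping audio or crashing the project load.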
1 · chatgpt-proof-read-done · under review
Dedicated "Set Lists" Access in Main Menu (Optional Floating Button)
Description: Add a dedicated, performance-friendly way to open "Set Lists" directly from the main UI, without having to drill down through the project browser hierarchy. The goal is to make set lists reliably one tap away in live use, on any screen size.

Problem: Accessing set lists currently requires several taps and context switches via the project browser. On stage, this extra navigation is slow and increases the risk of wrong taps, especially on smaller screens, in low-light conditions, or under time pressure between songs. Set lists are a performance-critical feature for users managing many projects, but their current placement deep in the browser makes quick, reliable access harder than necessary and can interrupt flow.

Proposed Solution: Provide one (or more) of the following UI options to surface set lists more prominently:
1) Main menu entry
- Add a direct "Set Lists" entry in the main menu, alongside other top-level destinations.
- This makes set lists consistently one tap away from anywhere in the app.
2) Dedicated button under the folder (browser) icon
- Under the current main menu folder icon, add a clearly labeled "Set Lists" button.
- Tapping it opens the set list view immediately, without extra navigation steps.
3) Optional floating "Set Lists" button (most performance-oriented)
- Add a preference: "Show Set Lists Button".
- When enabled, display a floating on-screen button that opens the set list view instantly.
- Allow the button to be dragged and positioned anywhere on the canvas, so left- and right-handed users can place it in their preferred reach zone.
- Allow toggling the button on/off in preferences to keep layouts clean for users who do not need it all the time.

Compatibility / integration notes:
- The dedicated access options should coexist with the existing set list functionality in the browser.
- This feature aligns well with related requests around better project/load targets (e.g., choosing between folders, set lists, recent projects, or reloading the current project).
Benefits:
- Faster navigation between projects and songs in live sets, with fewer taps.
- Reduced cognitive load and lower error risk when switching pieces under pressure.
- Better ergonomics and accessibility, especially with a movable floating button that can be placed for one-handed operation.
- Encourages consistent use of set lists for large libraries (dozens or hundreds of projects), improving overall workflow for complex shows.
- Keeps the UI flexible: users who prefer a clean screen can disable the floating button and rely on the main menu entry instead.

Examples:
- Large live set with 60+ songs: tap the floating "Set Lists" button, tap the next project in the set list, and the project loads immediately, without re-opening the browser or changing context.
- One-handed performance: place the floating "Set Lists" button in the bottom-right corner for right-handed use (or bottom-left for left-handed use), then switch to the next song between phrases with a single thumb tap.
- Studio or rehearsal workflow: keep the floating button disabled for a cleaner layout, and use the main menu "Set Lists" entry whenever you need to jump between rehearsal projects, retaining the same quick access without cluttering the canvas.

This summary was automatically generated by GPT-5.1 Thinking on 2025-12-27.
2 · chatgpt-proof-read-done · under review
Clear Indicator When Slider Value Is Rounded for Display
Description: When adjusting parameters via sliders, the internally selected value can be a high-precision float (e.g., 0.5966716) while the UI displays a rounded value (e.g., 0.6). This can mislead users into thinking the value is exactly what is shown.

Problem:
- Slider interaction often lands on a non-rounded internal value, but the UI shows a rounded number without any hint.
- Users may assume they selected an exact value (e.g., exactly 0.6) when they actually did not.
- This can cause subtle inconsistencies when matching settings, recreating sounds, comparing presets, or troubleshooting automation.

Proposed Solution: Add a clear, minimal rounding indicator whenever the displayed value is not equal to the actual internal value.

Recommended implementation (low UI impact, high clarity):
- Append a small indicator symbol next to the displayed value, e.g., "≈". Example: "0.6 ≈"
- Only show the indicator when rounding is happening. Condition: displayedValue != actualValue (after formatting/rounding for display).
- Optional (but very useful): on tap/long-press/hover, show a tooltip or popover that reveals the exact value. Example tooltip: "Displayed value is rounded. Actual: 0.5966716"

Alternative implementations (if preferred):
- Show a secondary, smaller line only when needed:
  - Main: "0.6"
  - Secondary: "exact: 0.5966716"
- Use a short label instead of a symbol, e.g., "0.6 (rounded)".

Benefits:
- Eliminates confusion about whether the shown value is exact or rounded.
- Improves trust and transparency in parameter editing.
- Helps users reproduce settings accurately (especially with automation, presets, and A/B comparisons).
- Keeps the UI clean because the indicator appears only when necessary.

Examples:
- The user drags a slider and the parameter internally becomes 0.5966716, but the UI formats it to 0.6:
  - Display: "0.6 ≈"
  - Tooltip: "Displayed value is rounded. Actual: 0.5966716"
- If the internal value is exactly 0.6:
  - Display: "0.6" (no indicator)

This summary was automatically generated by GPT-5.2 Thinking on 2025-12-27.
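The displayedValue != actualValue condition is a one-liner: format for display, parse the formatted string back, and compare against the true value. A minimal Python sketch (the `display_value` helper and its `precision` parameter are hypothetical, not an actual app setting):

```python
def display_value(actual, precision=1):
    """Format a slider value, appending '≈' when display rounding loses precision.

    `precision` (display decimal places) is a hypothetical setting used here
    only to make the rounding explicit.
    """
    shown = f"{actual:.{precision}f}"
    # Indicator only when the formatted value differs from the actual value,
    # i.e. displayedValue != actualValue after rounding for display.
    if float(shown) != actual:
        return f"{shown} ≈"
    return shown

print(display_value(0.5966716))  # 0.6 ≈
print(display_value(0.6))        # 0.6
```

Comparing via the round-tripped string (rather than comparing floats at full precision directly) ensures the indicator tracks exactly what the user sees on screen.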
1 · chatgpt-proof-read-done · under review
Editable Clip Length in Bars/Beats
Description: This feature request proposes the addition of a field in Loopy Pro's clip details and import windows that allows users to specify the musical length of a clip in bars and beats. The goal is to enable Loopy Pro to calculate the original tempo of the clip based on this information, facilitating more accurate tempo matching and synchronization.

Problems:
- Inaccurate tempo detection: when importing pre-made audio content, especially loops that are not standard 2- or 4-bar segments in 4/4 time, Loopy Pro's automatic tempo detection may not always yield accurate results.
- Manual adjustments required: users often need to manually adjust the tempo to fit the clip, which can be cumbersome and time-consuming, particularly when the clip's length in bars and beats is already known.
- Workflow inefficiency: the lack of a direct method to input known musical lengths leads to a less efficient workflow, especially for users working with a large number of pre-made loops.

Proposed Solution:
- Add a length input field: introduce a field in the clip details and import windows where users can enter the clip's length in bars and beats.
- Automatic tempo calculation: use the provided length information to calculate the original tempo of the clip, ensuring accurate synchronization with the project's tempo.
- Enhanced import functionality: improve the import process by allowing users to specify musical length upfront, reducing the need for manual tempo adjustments post-import.

Benefits:
- Improved accuracy: ensures more precise tempo matching for imported clips, enhancing the overall musical coherence of projects.
- Streamlined workflow: reduces the need for manual tempo adjustments, saving time and effort during the import process.
- User convenience: provides a straightforward method for users to input known musical information, aligning with their existing knowledge and resources.

This summary was automatically generated by ChatGPT-4 on 2025-05-08.
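The tempo calculation itself is simple arithmetic: total beats divided by duration gives beats per second, times 60 gives BPM. A short sketch (the `original_tempo` helper is illustrative, not an existing function):

```python
def original_tempo(duration_seconds, bars, beats_per_bar=4):
    """Derive a clip's original tempo (BPM) from its known musical length."""
    total_beats = bars * beats_per_bar
    return total_beats * 60.0 / duration_seconds

# A 7-bar loop in 4/4 lasting 14 seconds: 28 beats / 14 s = 2 beats/s = 120 BPM
print(original_tempo(14.0, bars=7))  # 120.0
```

This is why a user-supplied length beats automatic detection for odd-length loops: a 7-bar phrase has no "standard" length for the detector to assume, but the formula is exact once the bar count is known.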
2 · chatgpt-proof-read-done · under review