NMFOLIO
melabel

Selected Work · melabel

What happens when the designer is also the one who loses if it's wrong.

These case studies come from building melabel — a music business OS for independent artists and labels. That means every design decision had a direct business consequence and no one above me to absorb the fallout.

The platform touches distribution, analytics, contracts, finance, project management, and AI automation. These case studies document the hardest design problems I encountered while building it.

01 · Abstraction Design

The form that needed to be 47 fields — and had to feel like 8

Distribution UX · Independent artists · Progressive disclosure

Background

melabel uses FUGA as its distribution infrastructure — a professional-grade system that routes releases to 40+ DSPs including Spotify, Apple Music, and Tidal. FUGA's submission requirements are comprehensive: release-level metadata, per-track metadata, contributor roles, rights information, pricing tiers, catalog numbers, UPC codes, and more.

The brief I set myself: build a release submission flow that is complete enough to pass FUGA validation on every submission, and simple enough that an artist releasing their first song doesn't feel like they're filing a tax return. Those two requirements are genuinely in tension. That tension is the design problem.

The real problem

Early in the build, I made a mistake that most tools in this space also make: I started by mapping the FUGA field requirements and designing a form around them. The result was technically correct and emotionally brutal.

The insight that changed everything came from watching five artists attempt the submission in early testing. Not one of them abandoned the form because the fields were hard to understand. They abandoned it because of the fields they couldn't answer — publisher name, catalog number, pricing tier, label name. Questions that assumed a business infrastructure most independent artists don't have.

“I don't have a label name. Do I need a label? Is this not for me?”
— Artist, early usability session, abandoning at field 6 of 47

The solution: a two-layer field model

I categorised every FUGA-required field into two buckets: fields that artists understand and need to provide themselves, and fields that exist for technical or rights infrastructure that can either be handled administratively or inferred from defaults.

| Field | Required by FUGA | Shown to artist | Rationale |
| --- | --- | --- | --- |
| Release title | Yes | Visible | Artist knows this. Essential. |
| Artist name(s) | Yes | Visible | Core identity field. |
| Release date | Yes | Visible | Artist controls this. |
| Primary genre | Yes | Visible | Artist knows their genre. |
| Track audio file | Yes | Visible | Artist has the file. |
| Track title | Yes | Visible | Artist names their tracks. |
| Audio language | Yes | Visible | Simple, artist knows this. |
| Contributor roles | Yes | Visible | Artist knows who made it. |
| Original release date | Conditional | Conditional | Only shown for re-releases. |
| UPC / Catalog number | Yes | Admin layer | Generated/assigned by admin. |
| Label name | Yes | Admin layer | Defaults to melabel or artist name. |
| Publisher | Yes | Admin layer | Defaults unless PRO-registered. |
| Price tier | Yes | Admin layer | Standard tier by default. |
| Copyright line | Yes | Admin layer | Auto-generated from artist + year. |
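In code terms, the two-layer model amounts to a field map plus a default resolver: every required field is tagged with the layer that supplies its value. A minimal sketch, using hypothetical field names and default rules rather than melabel's actual schema:

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Field:
    name: str
    layer: str                                        # "artist" or "admin"
    default: Optional[Callable[[dict], str]] = None   # admin-layer default rule

# Illustrative subset of the full field list.
FIELDS = [
    Field("release_title", "artist"),
    Field("artist_names", "artist"),
    Field("label_name", "admin", lambda ctx: ctx.get("label") or ctx["artist"]),
    Field("copyright_line", "admin", lambda ctx: f"© {ctx['year']} {ctx['artist']}"),
    Field("price_tier", "admin", lambda ctx: "standard"),
]

def build_submission(artist_input: dict, ctx: dict) -> dict:
    """Merge artist-facing answers with admin-layer defaults into one
    complete payload: nothing is skipped, only relocated."""
    payload = {}
    for f in FIELDS:
        if f.layer == "artist":
            payload[f.name] = artist_input[f.name]
        else:
            payload[f.name] = f.default(ctx)
    return payload
```

The point is structural: admin-layer fields are relocated, not removed, so the merged payload stays complete underneath the 8-field experience.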
Music Catalog — release list view
Release list with type and status badges, Create New Release CTA
New release modal
Create new release vs. Import existing (coming soon) — entry point into submission flow
Release edit form
6-step wizard: Release, Artists, Metadata, Tracks, Distribution, Revenue Split — artist-facing fields only

The result: the artist-facing submission flow contains 8 required fields, 1 conditional field, and contributor roles — a flow that takes under 4 minutes to complete for a standard single. Zero FUGA submission fields were removed or skipped. The system is complete. The experience is simple. Those two things coexist because they live in different layers.

Before vs. After — two-layer model
Left: V1 with all 47 FUGA fields exposed (68% abandonment). Right: V2 with 8 artist-facing fields (94% completion).
The harder decision — conditional logic

What I cut, and what I kept conditional

The hardest field was Original Release Date — required by FUGA for catalogue releases but irrelevant for new releases. The solution was conditional rendering based on a single upstream question: “Is this a new release or a re-release of existing music?” One binary question surfaces or hides the field entirely.

This became the template for every edge case in the flow. Rather than designing for the average user and leaving edge cases broken, the pattern was: one upstream question that shapes what follows.
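The pattern fits in a few lines. A sketch with illustrative field names, assuming the "new release vs. re-release" question gates the conditional field:

```python
def visible_fields(is_rerelease: bool) -> list[str]:
    """One upstream binary question decides whether the conditional
    field appears at all (field names are illustrative)."""
    fields = ["release_title", "artist_names", "release_date", "primary_genre"]
    if is_rerelease:
        # Required for catalogue releases, irrelevant for new ones.
        fields.append("original_release_date")
    return fields
```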

Outcomes

94%
Release submission completion rate after two-layer model (up from 32%)
<4 min
Average time to complete a standard single submission
0
FUGA submission fields skipped — the system remains complete underneath

Before the two-layer model, the most common support question was some variation of “what is a catalog number / what do I put for label name.” After launch those questions dropped to near zero.

What I'd do differently
The admin layer — the interface that the melabel team uses to handle the hidden fields — was underdesigned at launch. Admin tasks that should take 30 seconds take 3–4 minutes because the interface requires manual cross-referencing. The abstraction only works as well as the admin layer that supports it. I'd invest in both sides equally from the start.

02 · Creator Data Design

Streams are a number. Decisions require meaning.

Creator analytics · Data visualisation · Independent artists

Background

melabel aggregates streaming data across 40+ DSPs into a single analytics layer. When I designed the first version, I showed everything — total streams, per-platform breakdown, daily trend lines, top tracks, territories, demographics. A comprehensive data picture.

The comprehensive picture backfired: more data made artists feel worse, not better informed. They would check the dashboard, see a stream count that meant nothing in isolation, feel vaguely anxious, and close the tab without taking any action.

The real problem

I ran structured interviews with 14 artists. The question was: “Tell me the last time you looked at your analytics and did something differently because of what you saw.”

Eight of 14 couldn't recall a single instance. The other six described making decisions based on three signals: which platform was growing fastest, which territory was showing unexpected traction, and how a recent release had performed in its first 7 days. Everything else they looked at but didn't act on.

“I check my streams every morning like I'm checking my weight. It doesn't help me do anything. It just makes me feel things.”
— Independent artist, 180K monthly listeners
The redesign

Three decisions that restructured the analytics experience

Momentum first
The top of the dashboard no longer shows total stream count. It shows growth rate versus the previous period. After the change, artists who previously described the dashboard as “depressing” described it as “motivating” — the same underlying data, restructured around movement instead of magnitude.
Territory signal
Unexpected territory traction surfaces as a notification, not a buried chart. A prompt that says “you're seeing unusual growth in Brazil this month, 340% above your baseline” converts an observation into an invitation to act.
Release window
The first 7 days of any release get a dedicated view with a different data model. The questions artists have in the release window are completely different from questions 3 months later. Separating them meant both could be designed for their actual use case.
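The momentum-first hero metric is a simple transformation of the same underlying data. A minimal sketch of the growth-rate framing:

```python
def momentum(current: float, previous: float) -> str:
    """Growth rate versus the previous period: the hero metric,
    replacing the raw total stream count (a minimal sketch)."""
    if previous == 0:
        return "new"                      # no baseline to compare against
    rate = (current - previous) / previous
    return f"{rate:+.0%}"                 # signed, e.g. "+20%" or "-10%"
```

Same data, restructured around movement instead of magnitude.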
Analytics dashboard — V1 comprehensive vs. V2 momentum-first
Left: All metrics visible, total streams hero. Right: Growth rate hero, territory signal as notification, release window view.

What I deliberately chose not to show

I removed skip rate and save rate from the primary artist-facing dashboard, even though melabel has access to both signals. Skip rate for an independent artist in the first two years of their career is a metric that causes spiralling and doesn't point toward an action. I tested including it with 8 artists over 4 weeks. Six of eight described it negatively. Zero described changing anything because of it.

It stays in the data layer — accessible in detail views for artists who want it — but it doesn't lead.

Analytics — Web tab
Profile Views, Unique Visitors, Avg. Session, Bounce Rate KPI cards + Views Over Time chart
Release window view
Dedicated 7-day release performance — first 48h spike, platform pickup indicators, algorithmic playlist add signal

Outcomes

3.1x
Increase in action events within 24h of dashboard visit
+61%
Weekly active return rate to analytics after momentum-first redesign
-40%
Support queries categorised as “I don't understand my data” in 3 months

A dashboard that gets opened and closed without producing action is a well-designed anxiety machine. The 3.1× action event increase was the signal that the redesign was actually working.

What I'd do differently
The territory notification uses a static 300% threshold. That number was instinctive. It should have been calibrated per artist based on their normal territory variance. I introduced false positives and trained some artists to dismiss the notification. The threshold logic needs to be relative, not absolute.
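The relative-threshold fix can be sketched as an anomaly check against each artist's own territory history. This is an assumption about how the calibration might work, not the shipped logic:

```python
from statistics import mean, stdev

def territory_alert(history: list[float], current: float, k: float = 3.0) -> bool:
    """Flag unusual territory growth relative to the artist's own variance.
    `history` holds recent per-period stream counts for one territory;
    `k` controls sensitivity (values here are illustrative)."""
    if len(history) < 2:
        return False                     # not enough data to estimate variance
    baseline, spread = mean(history), stdev(history)
    return current > baseline + k * spread
```

A steady territory with small variance trips the alert on modest absolute growth; a naturally volatile one needs a much larger spike, which is exactly what a static 300% threshold cannot express.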

03 · Platform Design

Designing one product for two users who want opposite things

Platform design · Dual ICP · Artists + Labels

Background

melabel serves two distinct customer types on the same platform: independent artists (B2C) and independent labels (B2B). When I launched, these two groups had access to fundamentally the same interface. That was the mistake.

The same navigation, the same dashboard, the same information hierarchy served neither well.

The real problem

I initially assumed this was a permissions problem. It wasn't. The entire mental model of the product was different for each user type. An artist thinks in releases. A label manager thinks in rosters. These aren't the same product with different permissions. They're different products with shared infrastructure.

Dashboard — label view
Dark Chamber label home showing Quick Stats (Releases, Tracks, Members, Files) and recent Releases list
“When I log in I feel like I'm looking at someone else's product. I can tell it wasn't built for me — I have to find my way around it every time.”
— Label manager, 12-artist roster, feedback session
The design response

Two entry points, shared infrastructure underneath

Artist entry
Navigation organised around projects and releases. Home dashboard leads with the active project, upcoming release milestones, and recent performance signals. The product feels like it belongs to the artist.
Label entry
Navigation organised around the roster. Home dashboard leads with roster-level performance, release pipeline across all artists, and financial summary. Individual artist views are accessible from the roster but are clearly sub-contexts.
Shared components
The same underlying components, data models, and feature set serve both. New features built for artists automatically become available in the label view as a sub-context, and vice versa.
Switching model
Label managers can move into an artist's view and experience the product as that artist would. The perspective switch is a single action, clearly indicated, easily reversed.
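The switching model can be sketched as a small stack of view contexts: the label manager's session temporarily adopts an artist's frame, and the switch is one reversible action. Types and names here are illustrative, not melabel's implementation:

```python
from dataclasses import dataclass

@dataclass
class ViewContext:
    role: str          # "artist" or "label": which frame shapes the UI
    subject_id: str    # whose data the view is centred on

class Session:
    def __init__(self, user_id: str, role: str):
        self.user_id = user_id
        self.stack = [ViewContext(role, user_id)]   # home context at the bottom

    @property
    def context(self) -> ViewContext:
        return self.stack[-1]

    def view_as_artist(self, artist_id: str) -> None:
        """Single action: push the artist's perspective onto the stack."""
        self.stack.append(ViewContext("artist", artist_id))

    def exit_view(self) -> None:
        """Single action back: the home context is never popped."""
        if len(self.stack) > 1:
            self.stack.pop()
```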
Dual ICP entry states
Artist home (project-centric) vs. Label home (roster-centric) — same infrastructure, different primary frame

The tension I didn't fully resolve

The hardest ongoing problem is features that are simultaneously artist-facing and label-facing but mean different things to each. The financial dashboard is the clearest example: the same data, read from two positions of power, carries different stakes. This remains an active design problem rather than a solved one.

Perspective switch
Label manager viewing platform from an individual artist's perspective — one action, fully reversible

Outcomes

+74%
Label manager satisfaction score after dual entry state launch
-52%
Navigation-related support queries from label managers in 60 days
50+
Labels on the platform, each managing 2–40 artists in the same product
What I'd do differently
I designed the artist and label views in parallel, treating them as equal design problems. They aren't. The label view has significantly higher complexity and deserved more design investment. For v2, I'd deliberately over-invest in the label view to close that gap.

04 · Work OS Design

Projects & Files Manager — designing work tools for people who think in songs, not tasks

Project management UX · Creative workflows · File organisation · Multi-user collaboration · Independent artists + labels

Background

melabel includes a full project management layer — the ability to create a release project, manage tasks across the release cycle, attach and version files at the task level, collaborate with team members, and track progress across multiple concurrent projects.

Projects grid
Type-labelled project cards (Live Show, Single Release, Campaign) with per-project progress bars

When I built the first version, I modelled it on standard task management. It was technically complete. Artists opened it once and never came back. Usage data at 30 days showed that 91% of users who created a project had not returned to it after the initial setup.

Understanding why V1 failed

I ran 11 interviews with artists who had created a project and abandoned it:

Failure 1
“It felt like work admin, not making music.” The project structure imported the emotional register of a corporate PM tool into a creative context.
Failure 2
“I didn't know what to put in it.” The blank-state experience assumed users knew how to decompose a music release into tasks. Most didn't.
Failure 3
“My files were all over the place.” The flat attachment system meant files were attached to individual tasks with no version logic, no naming convention, no way to understand which file was current.
Failure 4
“My label can see everything but doesn't know what to look at.” A long task list with mixed statuses communicated chaos, not progress.
“I made one project, stared at it, didn't know what to do with it, and went back to my Notes app and a group chat.”
— Independent artist, 90K monthly listeners, 30-day churn interview
Problem 1 — The structure mismatch

Creative work is not linear — and the task model assumed it was

Standard project management assumes a workflow that moves forward: To Do → In Progress → Done. Music production doesn't work that way. A song can be “mixed,” then need to go back to recording. The forward-only status model was the wrong shape for the work.

I replaced the linear status model with one built around the actual states creative work moves through — including backwards movement and parallel states: Not started, Active, Waiting on someone, Ready, Parked, and Needs revision.

“Parked” for work deliberately deferred without being abandoned. “Waiting on someone” to distinguish between active and blocked work. “Needs revision” to handle backwards movement that standard models pretend doesn't exist.
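The state model can be sketched as an explicit transition map, with backwards movement as a first-class edge rather than an exception. The exact transitions below are illustrative:

```python
# Six states instead of a forward-only pipeline; any active work can
# legally move backwards to "needs_revision".
TRANSITIONS = {
    "not_started":    {"active", "parked"},
    "active":         {"waiting", "ready", "parked", "needs_revision"},
    "waiting":        {"active", "needs_revision"},
    "ready":          {"needs_revision"},          # backwards movement is legal
    "parked":         {"active", "not_started"},
    "needs_revision": {"active"},
}

def move(current: str, target: str) -> str:
    """Apply a status change, rejecting transitions the model disallows."""
    if target not in TRANSITIONS[current]:
        raise ValueError(f"{current} -> {target} is not a valid transition")
    return target
```

A linear To Do → Done model would have no legal path from "ready" back to revision; here that path is the whole point.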

Problem 2 — The blank state

Templates as domain knowledge, not just structure

Working with 6 artists, I mapped out what a release project actually contains when it goes smoothly. The resulting template was an embedded release playbook with pre-populated tasks, suggested sequencing, timing guidance, and placeholder file slots at each stage.

Pre-production
Song structure, reference track, key and tempo, demo recording — structured place for reference material that cuts back-and-forth before sessions.
Production
Recording sessions, mix version tracking, master approval — pre-structured file slots (Mix v1, Mix v2, Final Mix, Master) give the file system shape before anyone uploads.
Release prep
Artwork, metadata submission, distribution checklist, pre-save setup — where most release failures happen. Pre-populated checklists reduce silent errors.
Promo
Playlist pitching, social content, press outreach, ad campaign — pre-creating these tasks makes promotion a continuation of the project, not an afterthought.

The template reduced time-to-first-task from 8.4 minutes to under 60 seconds.
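Conceptually, the template is domain knowledge serialised as structure: a phase-to-tasks map that expands into a ready-made project. A minimal sketch, with illustrative task names:

```python
# Illustrative single-release template: four phases, pre-populated tasks.
RELEASE_TEMPLATE = {
    "Pre-production": ["Song structure", "Reference track", "Key and tempo", "Demo recording"],
    "Production": ["Recording sessions", "Mix versions", "Master approval"],
    "Release prep": ["Artwork", "Metadata submission", "Distribution checklist", "Pre-save setup"],
    "Promo": ["Playlist pitching", "Social content", "Press outreach", "Ad campaign"],
}

def instantiate(template: dict) -> list[dict]:
    """Expand the template into concrete tasks, each carrying its phase,
    so the project opens populated instead of blank."""
    return [{"phase": phase, "title": title, "status": "not_started"}
            for phase, titles in template.items()
            for title in titles]
```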

Status model comparison
Left: V1 linear statuses (To Do / In Progress / In Review / Done). Right: V2 creative model with 6 states including Parked, Waiting, Needs revision.
Release project template
Four-phase template — Pre-production, Production, Release prep, Promo — with pre-populated tasks and file slots
Problem 3 — The file management model

Files in context, not files in folders

V1 treated files as attachments — a flat list appended to tasks. I reframed the design question from “Where do files live?” to “What does someone need to know about a file to use it confidently?”

Current vs. versioned
Every file slot has a single “current” version and an accessible version history. You never need to name a file “final_v3_ACTUAL_FINAL.wav” again.
Phase context
Files inherit the phase context of the task they belong to. Searching for “the mix” surfaces all files from the mixing phase, regardless of what they're named.
Who needs it
Files have a role-level visibility model. Files surface to the people who need them, not to everyone with project access.
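The file model can be sketched as a slot that owns its own history: one current version, everything older still reachable, phase inherited from the owning task. Names and shape are illustrative:

```python
from dataclasses import dataclass, field

@dataclass
class FileSlot:
    """A named slot (e.g. "Final Mix") rather than a flat attachment list:
    the newest upload becomes current, history is never lost."""
    name: str
    phase: str                     # inherited from the task's phase
    versions: list = field(default_factory=list)

    def upload(self, filename: str) -> None:
        self.versions.append(filename)

    @property
    def current(self):
        return self.versions[-1] if self.versions else None
```

Because "current" is a property of the slot, not of the filename, nobody has to encode version state in names like "final_v3_ACTUAL_FINAL.wav".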
File model comparison
Left: V1 flat attachments. Right: V2 contextual file system with version history, phase context, and role visibility.
Files Manager views
Phase view (files by project phase) · File type view (files by format) · Version history panel
Problem 4 — Collaborative visibility

Making label oversight feel like support, not surveillance

Artists described feeling watched. The solution was separating two distinct viewing modes:

What the artist sees
My project — what's next
Active tasks, upcoming deadlines, files I'm waiting on. No overdue count in the hero. Progress framed as momentum, not completion percentage.
What the label sees
Roster overview — project health
Completion percentage, blocked items, upcoming release dates, which artists need attention. Same data, framed for management decisions.

The critical decision was removing the completion percentage from the artist-facing project view. I replaced it with a milestone proximity indicator — “your next milestone is Mix Approval, 5 tasks away.” This communicates progress without implying a deficit.

Roster view — label manager
All artist projects with completion %, blocked items, release dates, health indicators per project
Files manager
Grid view showing folders alongside audio and image files with storage usage indicator

Outcomes

34%
30-day project churn after redesign — down from 91% in V1
<60s
Time to first task from project creation after template launch
-67%
"Can't find the file" support queries in 60 days after contextual file system
+82%
Label manager satisfaction with project visibility after roster view redesign
3.4x
Increase in promo phase task completion — artists starting promo before release day
2,000+
Artists currently managing active projects inside melabel

The metric I track most closely is whether artists are using the promo phase — the tasks that exist after the release is live. The 3.4× increase in promo phase completion is the signal that the project structure is working as a system, not just as a task list.

What I'd do differently
The template system launched with one template — the single release project. I waited too long to build more. Artists working on albums, EPs, and merchandise were improvising for months. I'd build at least five core templates before shipping the template system at all.

melabel is live at melabel.io — explore the product that the design work in these case studies helped build.