OpenIDEO is an open innovation platform. Join our global community to solve big challenges for social good.

CODA: Turn Spotify into an accessible music therapy tool

It's not individual songs that bind us; it's the memories and meaning those songs inspire. Create legacy playlists combining songs + stories

Written by Brad Wolfe

Who is your idea designed for and how does it reimagine the end-of-life experience?

This is a reflective experience designed for the dying and their families, making music therapy accessible. Those facing the end of life create Spotify playlists of their own meaningful songs and add the accompanying stories and memories. Loved ones create their own playlists too, selecting their own songs and adding their own stories. The CODA app takes the songs + stories and weaves them into beautiful audio/video playlists that visually tell the stories while the songs play, creating deeper connection.


CODA Inspiration

Coda (n.), Music: a more or less independent passage at the end of a composition, introduced to bring it to a satisfactory close.

What is CODA?

What if we could turn our music player into a "meaning player" to help us reflect on our lives and deepest relationships?

CODA is a way that everyone can participate in music as therapy by reflecting on lives and legacy through song, creating connections and building a community of love and meaning through the process.

Add stories and memories around specific meaningful songs. Add associated photos if you'd like. Tag people who are important or related. Then, when those songs play through Spotify (or your iTunes catalog), CODA syncs up and "plays" those related stories in a beautiful way. Songs + Stories = Meaning.

CODA as therapy

The process of creation is reflective and therapeutic. It uses music to help those facing death through a central component of the art therapy process: using art to explore memories, mental images, and emotions.

It is also therapeutic for friends and family, who engage in the same process by making CODA playlists of songs related to the person they care about. 

CODA as a connecting tool and a legacy relic

The output is a beautiful artistic gift that helps make the end-of-life experience better, and a relic that helps us remember and celebrate someone after they are gone. When someone dies, we have these songs and stories to hold on to forever. And we are bound together through music.

CODA for musicians

Musicians love CODA because it allows people to attach their own personal meaning to artists' songs and share it. It's the ultimate expression of how music affects us. If CODA users make their playlists public, others can search the global CODA Map to discover meaning and music across the globe, connecting to the human experience of memories and love worldwide. All of a sudden, they can hear their favorite songs in a new, more personal way. CODA becomes a global empathy platform.

Description of attached CODA Video prototype

I asked a colleague to submit three items:

1) Song that reminded her of loss

2) Story of why

3) Any associated images and their stories

I then took the stories and evenly distributed the text over the song in iMovie. CODA would do all of this automatically, and in a more beautiful way: all users do is add text to the song, and CODA does the rest. Also, this song was made for someone who had passed away rather than as a tool for the dying, but you could imagine it being used in both cases.
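The "evenly distribute the text over the song" step done by hand in iMovie is straightforward to automate. A minimal sketch in TypeScript; every name here is hypothetical, not part of any existing CODA code, and it assumes equal time slots with optional lead-in/lead-out padding:

```typescript
// A cue is one story line plus when it appears and for how long.
interface Cue {
  text: string;
  startSec: number;    // when the line fades in
  durationSec: number; // how long it stays on screen
}

// Spread the user's story lines evenly across the song's length,
// leaving optional silent padding at the start and end.
function distributeLines(
  lines: string[],
  songLengthSec: number,
  leadInSec = 0,
  leadOutSec = 0
): Cue[] {
  const usable = songLengthSec - leadInSec - leadOutSec;
  const slot = usable / lines.length;
  return lines.map((text, i) => ({
    text,
    startSec: leadInSec + i * slot,
    durationSec: slot,
  }));
}

// Example: four story lines over a 200-second song.
// Each line gets a 50-second slot, starting at 0, 50, 100, and 150 seconds.
const cues = distributeLines(
  ["We met in June.", "This was our song.", "She sang it loudly.", "I miss that."],
  200
);
```

The same cue list could later drive the fade-in/fade-out animation, so the manual iMovie step disappears entirely.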

Technical description of how this might work (from the amazing Jonathan Lipps):

- Build an Electron app (essentially a web app inside a native Electron frame). This gives us access to the local filesystem and the whole Node.js library ecosystem.
- In this app, construct a UI where the user can select a song file from their system via a file-finding dialog.
- The app can then copy this file in and serve it to the webview via an internal web server, using the usual <audio>/<video> tags.
- The user can then enter a bunch of text in a simple text editor window. There could be a convention that each line of text appears by itself as part of the music video.
- Alternatively, we could have a more complex UI where they add each text phrase as a separate object (perhaps controlling font/color independently).
- Using media libraries (the available libraries may determine which song formats the app supports: .mp3, .m4a, etc.), determine the length of the song.
- Create a <canvas> element and, using some kind of animation library, fade the lines of user text in and out. The amount of time each line stays on screen would depend on the length of the song.
- We could add some other options for the user to customize things further without much development effort:
  - font
  - lead-in/lead-out space
  - background picture or color (could even allow multiple pictures that likewise fade in/out as the song plays and the words show)
  - text color (we might need the user to set this, since it's difficult to automatically pick a text color that stays visible on an arbitrary background image)
- Sharing is the most challenging piece here from a technical perspective (not to mention a legal one). Two options I can think of:
  - Figure out how to render the <canvas> animation as a video, then add the music stream to the video track. This would produce a shareable video file (probably illegal).
  - Create an object-storage representation so users could send all the metadata (words, timings, colors, fonts, etc.) that someone else could import in their version of the app; assuming they have the same song, it could be attached to the imported metadata and the experience recreated locally. This would be legal, but sharing would be very limited (it depends on both users having and finding the song file).
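The second sharing option (metadata only, no audio) could be sketched as a simple JSON export/import round trip. The schema below is an illustrative assumption, not a settled format; the song title, artist, and length fields exist only so the recipient's app can try to match the same track locally:

```typescript
// One timed story line, with optional per-line presentation settings.
interface CodaCue {
  text: string;
  startSec: number;
  durationSec: number;
  font?: string;
  color?: string;
}

// Everything needed to recreate the experience, except the audio itself.
interface CodaPlaylistEntry {
  songTitle: string;     // used to match the song on the recipient's side
  songArtist: string;
  songLengthSec: number; // sanity check that both users have the same track
  backgroundColor?: string;
  cues: CodaCue[];
}

function exportEntry(entry: CodaPlaylistEntry): string {
  return JSON.stringify(entry);
}

function importEntry(json: string): CodaPlaylistEntry {
  const entry = JSON.parse(json) as CodaPlaylistEntry;
  if (!entry.songTitle || !Array.isArray(entry.cues)) {
    throw new Error("Not a valid CODA entry");
  }
  return entry;
}

// Round trip: what a recipient would reconstruct locally
// before attaching their own copy of the song.
const sent = exportEntry({
  songTitle: "Hallelujah",
  songArtist: "Jeff Buckley",
  songLengthSec: 414,
  cues: [{ text: "This played at the memorial.", startSec: 10, durationSec: 30 }],
});
const received = importEntry(sent);
```

Because no audio ever leaves the sender's machine, this path avoids the copyright problem entirely, at the cost of the recipient needing their own copy of the song.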

Another direction would be to use Spotify's API: require users to sign in with their Spotify account and select from songs available on Spotify. I'm not sure whether there's a way to kick off Spotify songs (a) without showing their player UI, which would be odd, and (b) at a time of our choosing (to ensure proper synchronization of song and words). If both were possible, this method would probably provide the best overall user experience (not to mention being legal). Also, if Spotify works, there's no reason to build a native Electron app; it could all be purely a web app, since the main point of the Electron architecture would be to allow importing files locally without having users upload them to a web service (and maybe some video-processing magic).
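On question (b): as far as I can tell, Spotify's Web API does expose a "Start/Resume Playback" endpoint (PUT /v1/me/player/play) that accepts a position_ms offset, which would let CODA start a track at a chosen moment and keep the words in sync; and the Web Playback SDK can play tracks inside our own page without Spotify's player UI (Premium accounts only), which speaks to (a). A hedged sketch of building such a request, with token handling and device selection omitted, and all details to be verified against Spotify's documentation:

```typescript
// Shape of the HTTP request we would hand to fetch().
interface PlayRequest {
  url: string;
  method: "PUT";
  headers: { Authorization: string; "Content-Type": string };
  body: string;
}

// Build a "start playback at this offset" request for Spotify's Web API.
// The endpoint and body fields (uris, position_ms) are taken from the
// public Web API docs; verify before relying on them.
function buildPlayRequest(
  accessToken: string,
  trackUri: string,
  positionMs = 0
): PlayRequest {
  return {
    url: "https://api.spotify.com/v1/me/player/play",
    method: "PUT",
    headers: {
      Authorization: `Bearer ${accessToken}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ uris: [trackUri], position_ms: positionMs }),
  };
}

// Usage: start a (placeholder) track 15 seconds in, so the first
// story line can be timed against a known playback position.
const req = buildPlayRequest("<token>", "spotify:track:EXAMPLEID", 15000);
```

If this holds up, CODA's sync problem reduces to issuing this request and then driving the canvas animation off the same clock.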

What early, lightweight experiment might you try out in your own community to find out if the idea will meet your expectations?

Prototype more videos like the one attached and interview folks about their reactions.
Work with a hospice to prototype this as a service for some patients.

What skills, input or guidance from the OpenIDEO community would be most helpful in building out or refining your idea?

Connect us to a hospice for testing?
Help us design look and feel?
Blast this out to an influencer network once it's at the right stage.

Tell us about your work experience:

Founder of the Sunbeam Foundation for pediatric cancer; has performed and written songs for kids with cancer.
Professional musician; featured on MTV, among others.
Positive psychology expert; worked at the Greater Good Science Center and Delivering Happiness.
MBA/MA in Positive Organizational Behavior.

This idea emerged from

  • A group brainstorm
  • An OpenIDEO Outpost or Chapter
  • An Individual

1 comment

Michael

I like this idea. It brings together several elements that people attach to. First, a personal story that is important to them. Second, a song that creates a particular feeling or mood. Then it has visual cues that help anchor the experience in their mind. The song is associated with a story and pictures, which brings multiple pieces of communication together. This is what movies and shows do all the time to evoke a particular feeling in the audience.

I also wonder if there could be an involvement of the other senses to create a greater connection to the experience. Perhaps a suggested drink/snack/food to have while watching the video, or lighting a particular candle or incense for a particular smell, or feeling a particular type of material.

Not sure if that is the answer but it would be interesting to tie in as many senses as possible so that all forms of memory can be associated with that experience and person.