(SEE ATTACHED VIDEO OF PROTOTYPE)
CODA Inspiration
Coda (n.), Music: a more or less independent passage, at the end of a composition, introduced to bring it to a satisfactory close.
What is CODA
What if we could turn our music player into a "meaning player" to help us reflect on our lives and deepest relationships?
CODA is a way for everyone to participate in music as therapy: reflecting on lives and legacies through song, creating connections, and building a community of love and meaning in the process.
Add stories and memories around specific meaningful songs. Add associated photos if you'd like. Tag people who are important or related. Then, when those songs play through Spotify (or your iTunes catalog), CODA syncs up and "plays" the related stories in a beautiful way. Songs + Stories = Meaning.
CODA as therapy
The process of creation is reflective and therapeutic. It uses music to help those facing death through a central component of the art therapy process: using art to explore memories, mental images, and emotions.
It is also therapeutic for friends and family, who engage in the same process by making CODA playlists of songs related to the person they care about.
CODA as a connecting tool and a legacy relic
The output is a beautiful artistic gift that helps make the end-of-life experience better, and a relic that can help us remember and celebrate someone after they are gone. When someone dies, we have these songs and stories to hold on to forever. And we are bound together through music.
CODA for musicians
Musicians love CODA because it allows people to attach their own personal meaning to artists' songs and share it. It's the ultimate expression of how music affects us. If CODA users make playlists public, others can search the global CODA Map to discover meaning and music across the globe, connecting to the human experience of memories and love worldwide. All of a sudden, they can hear their favorite songs in a new, more personal way. CODA becomes a global empathy platform.
Description of attached CODA Video prototype
I asked a colleague to submit three items:
1) Song that reminded her of loss
2) Story of why
3) Any associated images and their stories
I then took the stories and evenly distributed the text over the song in iMovie. CODA would do this all automatically and in a more beautiful way. All users do is add text to the song; CODA does the rest. Also, this song was for someone who passed away, not as a tool for the dying, but you could imagine CODA being used in both cases.
Technical description of how this might work (from the amazing Jonathan Lipps):
- build an Electron app (which is essentially a web app inside of a native Electron frame). With this we have access to the local filesystem and the whole Node.js library ecosystem (rough code sketches of several of these steps follow the list)
- in this app, construct a UI where the user can select a song file from their system via a file-finding dialog
- the app can then copy this file in and serve it to the webview via an internal web server, utilizing the usual media elements (<audio>)
- the user can then enter a bunch of text in a simple text editor window. There could be a convention that each line of text will appear by itself as part of the music video
- alternatively we could have a more complex UI where they add each text phrase as a separate object (perhaps being able to control font/color independently)
- using media libraries, determine the length of the song (which libraries exist may determine what kinds of song files this app supports: .mp3, .m4a, etc.)
- create a <canvas> element and, using some kind of animation library, fade the lines of the user text in and out (sketched after the list). The amount of time each line is on screen would depend on the length of the song
- we could add some other options that let the user customize things a bit more without a lot of development effort:
- font
- lead-in/lead-out space
- background picture or color (could even allow multiple pictures that likewise fade in/out as the song plays and the words show)
- text color (we might need the user to set this since it's difficult to automatically change text color to be visible on an arbitrary background image)
- sharing is the most challenging thing here from a technical perspective (not to mention a legal one). two options I can think of:
- figure out how to render the <canvas> animation as a video, then add the music as an audio track. This would then be a shareable video file (probably illegal)
- create an object storage representation so users could send all the metadata (words, timings, colors, fonts, etc.) that someone else could import in their version of the app; assuming they have the same song, they could attach it to the imported metadata and the experience could be recreated locally (a sample of that metadata format is sketched after the list). This would be legal, but sharing would be very limited (it would depend on both users having and finding the song file, etc.)
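To make the Electron direction concrete, here is a minimal sketch of a main process that handles the first few steps: opening a native file-finding dialog and reading the song's length with the music-metadata Node library. The channel name ("coda:pick-song"), window settings, and preload/index files are illustrative assumptions, not a spec.

```typescript
// main.ts -- a minimal sketch of the Electron main process (names are placeholders).
import { app, BrowserWindow, ipcMain, dialog } from 'electron';
import { parseFile } from 'music-metadata'; // reads tags + duration from local audio files
import * as path from 'path';

async function pickSong() {
  // Native file-finding dialog, limited to common audio formats
  const { canceled, filePaths } = await dialog.showOpenDialog({
    properties: ['openFile'],
    filters: [{ name: 'Audio', extensions: ['mp3', 'm4a', 'wav', 'flac'] }],
  });
  if (canceled || filePaths.length === 0) return null;

  const songPath = filePaths[0];
  const metadata = await parseFile(songPath);
  return {
    path: songPath,
    title: metadata.common.title ?? path.basename(songPath),
    durationSec: metadata.format.duration ?? 0, // drives the text pacing later
  };
}

app.whenReady().then(() => {
  const win = new BrowserWindow({
    width: 1024,
    height: 640,
    webPreferences: { preload: path.join(__dirname, 'preload.js') },
  });
  // Renderer asks for a song; main process replies with path + duration
  ipcMain.handle('coda:pick-song', pickSong);
  win.loadFile('index.html');
});
```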
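Under the same assumptions, serving the chosen song to the webview could be as small as the sketch below: a tiny internal HTTP server that streams the file so the usual <audio> element can play it. The port and route are arbitrary, and a real version would also need HTTP range support so the player can seek.

```typescript
// songServer.ts -- a sketch of the "serve the copied file to the webview" step.
import * as http from 'http';
import * as fs from 'fs';

let currentSongPath: string | null = null; // set after the user picks a song

const server = http.createServer((req, res) => {
  if (req.url === '/song' && currentSongPath) {
    res.writeHead(200, { 'Content-Type': 'audio/mpeg' }); // assumes .mp3 for the sketch
    fs.createReadStream(currentSongPath).pipe(res);       // stream rather than buffer the file
  } else {
    res.writeHead(404);
    res.end();
  }
});
server.listen(3777, '127.0.0.1');

// In the renderer, the usual media element does the playback:
//   <audio id="song" src="http://127.0.0.1:3777/song"></audio>
```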
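For the <canvas> step, a prototype may not even need an animation library: keying opacity off the audio element's own clock keeps the words synced to the song. A sketch, assuming the renderer already has the user's lines of text, the song duration, and an <audio id="song"> plus <canvas id="stage"> in the page:

```typescript
// renderer.ts -- fade each line in and out over an equal share of the song.
const audio = document.getElementById('song') as HTMLAudioElement;
const canvas = document.getElementById('stage') as HTMLCanvasElement;
const ctx = canvas.getContext('2d')!;

function playStories(lines: string[], durationSec: number) {
  const slot = durationSec / lines.length; // each line gets an equal share of the song
  const fade = Math.min(1.5, slot / 4);    // seconds spent fading in/out

  function draw() {
    const t = audio.currentTime;           // the audio clock keeps text and music in sync
    const i = Math.min(lines.length - 1, Math.floor(t / slot));
    const local = t - i * slot;            // time elapsed within this line's slot

    // Opacity ramps up, holds, then ramps down within the slot
    let alpha = 1;
    if (local < fade) alpha = local / fade;
    else if (local > slot - fade) alpha = (slot - local) / fade;

    ctx.clearRect(0, 0, canvas.width, canvas.height);
    ctx.globalAlpha = Math.max(0, alpha);
    ctx.font = '32px Georgia';
    ctx.textAlign = 'center';
    ctx.fillStyle = '#ffffff';
    ctx.fillText(lines[i], canvas.width / 2, canvas.height / 2);

    if (!audio.ended) requestAnimationFrame(draw);
  }

  audio.play();
  requestAnimationFrame(draw);
}
```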
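For the second sharing option, the exported object could simply be a small JSON document carrying everything except the audio itself. The field names below are illustrative only; the recipient re-attaches their own copy of the song.

```typescript
// A sketch of a shareable "CODA document" -- metadata only, no audio data.
interface CodaPhrase {
  text: string;
  startSec: number;   // when the phrase begins fading in, relative to the song
  endSec: number;     // when it has fully faded out
  font?: string;
  color?: string;
}

interface CodaDocument {
  version: 1;
  song: {
    title: string;
    artist?: string;
    durationSec: number;      // lets the importer sanity-check their own song file
  };
  background?: { color?: string; imagePaths?: string[] };
  phrases: CodaPhrase[];
  taggedPeople?: string[];    // the people tagged in the stories
}
```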
Another direction would be to use Spotify's API: require users to sign in with their Spotify account and select from songs available on Spotify. Not sure if there's a way to kick off Spotify songs (a) without showing their player UI, which would be odd, and (b) at a time of our choosing (to ensure proper synchronization of song + words). If those things were possible, this would probably be the method that would provide the best overall user experience (not to mention being legal). Also, if Spotify can work, there's no reason to build a native Electron app; it could all be purely a web app (since the main point of the Electron architecture would be to allow importing files locally without having users upload them to a web service, plus maybe some video processing magic, dunno).
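For what it's worth, Spotify's Web API does appear to allow starting a chosen track at a chosen position on one of the user's active devices (Premium accounts only, with the user-modify-playback-state scope), which would address point (b). A hedged sketch, assuming we already have an OAuth access token:

```typescript
// A sketch of the Spotify direction, using the Web API's start/resume-playback endpoint.
// Requires a Premium account, the user-modify-playback-state scope, and an active device.
async function playOnSpotify(accessToken: string, trackUri: string, startMs = 0) {
  const res = await fetch('https://api.spotify.com/v1/me/player/play', {
    method: 'PUT',
    headers: {
      Authorization: `Bearer ${accessToken}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      uris: [trackUri],       // e.g. 'spotify:track:...'
      position_ms: startMs,   // lets us start at a time of our choosing
    }),
  });
  if (!res.ok) {
    throw new Error(`Spotify playback request failed: ${res.status}`);
  }
  // From here, the words would be timed against our own clock (or by polling the
  // player state), since we no longer control the audio element directly.
}
```

Whether (a) is solvable depends on where playback happens: the Web API drives an existing Spotify client on some device, while the Web Playback SDK can make the browser itself the playback device, so our page could stay in the foreground and show only the words.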