I Made a Tool That Turns Songs into DNA Portraits

Source: DEV Community
You describe how it feels. It draws what it looks like.

There's a question I've been sitting with for a while: what does a song look like? Not the album art. Not the waveform. The feeling of it. The way Gymnopédie No. 1 feels like fog over cold water. The way Bohemian Rhapsody feels like architecture that shouldn't work but does.

Spotify tried to answer this with their audio features API — energy, valence, danceability, tempo. Machine-derived floats between 0 and 1. They deprecated it last year. I think they were solving the wrong problem anyway.

Your ears are the API

Spotify's audio analysis is about the signal. I'm interested in the experience. So I built Song Portrait — a tool where you are the data source. You describe a song across 8 perceptual dimensions:

- Energy — soft and still, or electric and alive?
- Mood — sitting in shadows, or reaching for light?
- Tempo Feel — floating, or driven?
- Texture — smooth like glass, or rough like gravel?
- Warmth — cold and distant, or warm and close?
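A description like this can be captured as a small record of normalized values. Here's a minimal sketch in Python, assuming each dimension is stored as a float in [0, 1] (the field names mirror the five dimensions listed above; the remaining three aren't named in this excerpt, and the class name and example values are illustrative, not the tool's actual code):

```python
from dataclasses import dataclass, fields

@dataclass(frozen=True)
class SongPortrait:
    """One listener's description of a song, each axis normalized to [0, 1]."""
    energy: float      # 0 = soft and still,     1 = electric and alive
    mood: float        # 0 = sitting in shadows, 1 = reaching for light
    tempo_feel: float  # 0 = floating,           1 = driven
    texture: float     # 0 = smooth like glass,  1 = rough like gravel
    warmth: float      # 0 = cold and distant,   1 = warm and close

    def __post_init__(self):
        # Reject values outside the normalized range up front.
        for f in fields(self):
            v = getattr(self, f.name)
            if not 0.0 <= v <= 1.0:
                raise ValueError(f"{f.name} must be in [0, 1], got {v}")

# Gymnopédie No. 1, as one listener might hear it: fog over cold water.
gymnopedie = SongPortrait(
    energy=0.1, mood=0.3, tempo_feel=0.2, texture=0.15, warmth=0.35
)
```

The point of the structure is the same as Spotify's feature vector — a handful of floats per song — but the numbers come from a listener's description rather than signal analysis.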