
Creating Data Art From Emotional Aspects of Song Lyrics

March 2022

Introduction

Written as a research paper, this project explores different types of data art and goes on to propose a form of data art that creatively visualises the emotions contained in the lyrics of different songs.

Explorations

What is Data Art?

Data art, like art itself, is difficult to define clearly. One way is to consider the intent of the creator: whether they meant their product to be art at all.

Data art can also have some major overlaps with data visualization.

  • Data visualisation sacrifices aesthetics for insights, whereas data art does the opposite
  • Data art might not be good data visualisation, but it promotes what’s called “disinterested analysis”: provoking a non-expert into thought through strong emotional connection
  • Both data visualisation and data art consist of two things: objects and features. An example of objects can be cities, with their features being their weather attributes
  • Overly focussing on usability can snatch away the aspect of beauty, spontaneity and emotion from data art
  • Data art is allowed to commit the ‘sins of data visualisation’: unclear presentation of information and blurred readability of data, purely for the sake of aesthetics
  • Data visualisations provide quantitative insights, whereas data art may provide more qualitative insights open to interpretation

Classification of Data Art

Data art can be classified based on form, temporality and generators.

Based on Form

Based on Temporality

Based on Generators

The Proposal

I propose a method to generate digital 3D models using lyrical emotions of different songs, and present data art created using this method.

Music as a dataset is readily available and is consumed by the public in massive volumes. It is relatable and appealing. The data art I present, instead of focussing on music’s quantitative factors like pitch and amplitude, focusses on the actual emotion behind what its lyrics say, which, more often than not, is completely different from the song’s beats and tempo.

Literature Review of Existing Visualisations

Most visualisations are static, considering the entire song at once. Some are dynamic shapes that change in real time.

The aim of this project is to focus not so much on accuracy as on the emotions in the song: its mood and meaning. The result should visually capture the artist’s interpretation of the essence of the song based on the data provided to the algorithm, resulting in a different form for each musical piece.

The Product

I present data art as a ‘growing’ 3D model that develops based on the underlying emotions of a song. The process of generating this piece is as follows:

Choosing a song

For more engaging pieces, it’s good to choose songs that are ‘emotional rollercoasters’. Heavily lyrical songs with minimal repetition are also preferred. The following three songs are considered: Space Bound, How Far I’ll Go and Eastside.

Assigning Emotions to Song Lyrics

Song lyrics, once laid out line by line, have two inputs: one quantitative and one qualitative. The quantitative input is time: how long the singer takes to deliver that line. The qualitative input is the emotional variable assigned to that line. This project uses an arousal-valence dimensional model, Russell’s Circumplex Model of Affect, which places different emotions on a two-dimensional plane based on their positivity (valence) and arousal values.
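As a concrete sketch, the line-by-line annotation can be held in a plain data structure pairing each line’s duration (the quantitative input) with its emotion label and circumplex coordinates (the qualitative input). All field names and values here are illustrative assumptions, not taken from the report.

```javascript
// Each lyric line carries a duration in seconds (quantitative input) and an
// emotion with valence/arousal coordinates in [-1, 1] on Russell's circumplex
// (qualitative input). Texts and values are illustrative placeholders.
const annotatedLines = [
  { text: "line 1", duration: 4.2, emotion: { label: "sad",     valence: -0.7, arousal: -0.4 } },
  { text: "line 2", duration: 3.1, emotion: { label: "hopeful", valence:  0.6, arousal:  0.3 } },
  { text: "line 3", duration: 5.0, emotion: { label: "angry",   valence: -0.6, arousal:  0.8 } },
];

// Total annotated time, which should match the song's lyrical running time.
const totalTime = annotatedLines.reduce((sum, line) => sum + line.duration, 0);
```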

Converting Emotions to Inputs

Based on the emotions assigned to the lyrics, the total time each emotion appears is noted. Based on each emotion’s location in the plane, zones are assigned as seen below. Both arousal and valence are divided into eight zones each, giving 16 zones in total. This yields three numerical values per emotion: the positivity zone (from -4 to +4), the arousal zone (from -4 to +4) and the running time (in seconds).
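The conversion step above can be sketched as follows. The binning of the [-1, 1] circumplex axes into signed zones is one reasonable scheme of my own; the report defines its zone boundaries graphically, so treat `toZone` and `emotionInputs` as illustrative names.

```javascript
// Map a valence or arousal value in [-1, 1] to one of eight signed zones,
// -4..-1 and +1..+4 (there is no zone 0). This binning rule is an
// illustrative assumption, not the report's exact definition.
function toZone(v) {
  const clamped = Math.max(-1, Math.min(1, v));
  if (clamped === 0) return 1; // boundary convention for exactly-neutral values
  const magnitude = Math.min(4, Math.ceil(Math.abs(clamped) * 4));
  return Math.sign(clamped) * magnitude;
}

// Bundle the three numerical inputs the drawing algorithm needs per emotion.
function emotionInputs(label, valence, arousal, runningTimeSeconds) {
  return {
    label,
    positivityZone: toZone(valence),
    arousalZone: toZone(arousal),
    runningTime: runningTimeSeconds,
  };
}
```

For example, an emotion at valence -0.7 and arousal -0.4 that runs for 31 seconds maps to positivity zone -3, arousal zone -2 and a running time of 31.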

Generating a 3D Model from the Inputs

Using three.js, different emotions are visualised as randomly growing lines emanating from a common origin, in 3D space. For each emotion, the drawing algorithm takes as input the positivity zone, arousal zone and the running time.
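One possible growth rule, in plain JavaScript with the three.js rendering omitted, is a random walk whose step count scales with running time, whose jitter scales with arousal, and whose vertical drift follows positivity. This rule and all its constants are my own illustrative assumptions, not the report’s algorithm.

```javascript
// Sketch of a 'growing line' for one emotion: a random walk from a common
// origin. Step count scales with running time, jitter with arousal, and
// vertical drift with positivity. The growth rule and constants are
// illustrative assumptions; three.js rendering is omitted.
function growLine(positivityZone, arousalZone, runningTime, rng = Math.random) {
  const steps = Math.max(1, Math.round(runningTime)); // one segment per second
  const jitter = Math.abs(arousalZone) / 4;           // higher arousal -> wilder path
  const drift = positivityZone / 4;                   // positive emotions grow upward
  const points = [[0, 0, 0]];
  let [x, y, z] = [0, 0, 0];
  for (let i = 0; i < steps; i++) {
    x += (rng() - 0.5) * jitter;
    y += 0.1 + drift * 0.1 + (rng() - 0.5) * jitter;
    z += (rng() - 0.5) * jitter;
    points.push([x, y, z]);
  }
  return points;
}
```

Each emotion’s point list could then be converted to `THREE.Vector3`s, passed to `new THREE.BufferGeometry().setFromPoints(...)` and drawn as a `THREE.Line`, with colour mapped from the emotion’s position in the circumplex.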

Comparing Art Created by the 3 Songs

Data from the lyrics of three songs, Space Bound, How Far I’ll Go and Eastside, is used to create three different art pieces.

Conclusions and Future Work

This project proposed a method to generate digital 3D models from the lyrical emotions of different songs, and presented data art created using this method.

Check out the detailed project report.