DJ el Niño: Expressing synthetic emotions with music

Research output: Contribution to journal › Article › peer-review


Abstract

The purpose of this work is twofold: (1) to present an artistic experiment on how to use artificial intelligence to develop a "different kind" of DJ, and (2) to test a cognitive model of how music expresses emotions. Based on an earlier model conceived by the author, electronic music loops were tagged according to the type and intensity of the emotion they express. Then, using a feedback model, an artificial personality was constructed; it was affected by the music it played and selected the next loops according to its current emotional state. The efficiency and credibility of the model were tested in a "live event" during the Z2000 art festival in Berlin, with encouraging results.
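The abstract describes the feedback model only at a high level, so the sketch below is a hedged reconstruction rather than the author's implementation: emotion-tagged loops, an artificial personality whose emotional state is nudged by the loop it just played, and a selection rule that picks the next loop closest to that state. All names (Loop, ArtificialPersonality, pick_next), the inertia-based update rule, and the demo loop library are assumptions introduced here for illustration.

```python
# Minimal sketch of the feedback loop described in the abstract.
# All class/function names, the state-update rule, and the loop data
# are assumptions, not the published implementation.
import random
from dataclasses import dataclass


@dataclass
class Loop:
    name: str
    emotion: str       # e.g. "joy", "sadness", "tension"
    intensity: float   # 0.0 (weak) .. 1.0 (strong)


class ArtificialPersonality:
    """Holds an emotional state and is 'affected' by the loops it plays."""

    def __init__(self, emotions, inertia=0.8):
        self.state = {e: 0.0 for e in emotions}   # current emotional intensities
        self.inertia = inertia                    # how slowly the state changes

    def listen(self, loop: Loop):
        # Feedback: the played loop pulls its emotion toward the loop's
        # intensity; the other emotions decay slightly.
        for e in self.state:
            target = loop.intensity if e == loop.emotion else 0.0
            self.state[e] = self.inertia * self.state[e] + (1 - self.inertia) * target

    def pick_next(self, library):
        # Choose the loop whose (emotion, intensity) best matches the current
        # state, with a little randomness so the set does not get stuck.
        def score(loop):
            return abs(self.state[loop.emotion] - loop.intensity) + random.uniform(0, 0.2)
        return min(library, key=score)


# Tiny demo library of emotion-tagged loops (values are invented).
library = [
    Loop("bass_01", "joy", 0.7),
    Loop("pad_02", "sadness", 0.4),
    Loop("perc_03", "tension", 0.9),
]

dj = ArtificialPersonality(emotions={"joy", "sadness", "tension"})
current = random.choice(library)
for _ in range(5):                     # play a short "set"
    print("playing", current.name, dj.state)
    dj.listen(current)                 # the music affects the personality...
    current = dj.pick_next(library)    # ...which then chooses the next loop
```

In this toy version the high inertia makes the personality's mood drift slowly, loosely mirroring the abstract's idea that the music being played feeds back into the emotional state that drives the selection of subsequent loops.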
Original language: English
Pages (from-to): 257-263
Journal: AI and Society
Volume: 18
DOIs
Publication status: Published - 1 Dec 2004

