
Publications by Diogo Miguel Cocharro

2016

A multi-level tonal interval space for modelling pitch relatedness and musical consonance

Authors
Bernardes, G; Cocharro, D; Caetano, M; Guedes, C; Davies, MEP;

Publication
JOURNAL OF NEW MUSIC RESEARCH

Abstract
In this paper we present a 12-dimensional tonal space in the context of the Tonnetz, Chew's Spiral Array, and Harte's 6-dimensional Tonal Centroid Space. The proposed Tonal Interval Space is calculated as the weighted Discrete Fourier Transform of normalized 12-element chroma vectors, which we represent as six circles covering the set of all possible pitch intervals in the chroma space. By weighting the contribution of each circle (and hence pitch interval) independently, we can create a space in which angular and Euclidean distances among pitches, chords, and regions concur with music theory principles. Furthermore, the Euclidean distance of pitch configurations from the centre of the space acts as an indicator of consonance.
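The core construction admits a compact implustration in code. Below is a minimal Python/NumPy sketch of the chroma-to-TIV mapping and the norm-based consonance indicator; the per-circle weights are illustrative placeholders, not the empirical values derived in the paper.

```python
import numpy as np

# Illustrative per-circle weights for k = 1..6. The paper derives empirical
# weights; these placeholders merely emphasise the higher interval circles.
WEIGHTS = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 3.0])

def tonal_interval_vector(chroma, weights=WEIGHTS):
    """Map a 12-element chroma vector to the six weighted complex DFT
    coefficients (k = 1..6) that define the Tonal Interval Space."""
    chroma = np.asarray(chroma, dtype=float)
    chroma = chroma / chroma.sum()   # normalise the chroma vector
    spectrum = np.fft.fft(chroma)    # 12-point DFT: one coefficient per interval circle
    return weights * spectrum[1:7]   # keep k = 1..6, each weighted independently

def consonance(tiv):
    """Euclidean distance from the centre of the space indicates consonance."""
    return np.linalg.norm(tiv)

# With these weights, a major triad lies farther from the centre (is more
# consonant) than a semitone cluster of the same cardinality.
c_major = [1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0]
cluster = [1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0]
assert consonance(tonal_interval_vector(c_major)) > consonance(tonal_interval_vector(cluster))
```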

2016

Conchord: An Application for Generating Musical Harmony by Navigating in the Tonal Interval Space

Authors
Bernardes, G; Cocharro, D; Guedes, C; Davies, MEP;

Publication
Music, Mind, and Embodiment

Abstract
We present Conchord, a system for real-time automatic generation of musical harmony through navigation in a novel 12-dimensional Tonal Interval Space. In this tonal space, angular and Euclidean distances among vectors representing multi-level pitch configurations equate with music theory principles, and vector norms act as indicators of consonance. Building upon these attributes, users can intuitively and dynamically define a collection of chords based on their relation to a tonal center (or key) and their consonance level. Furthermore, two algorithmic strategies grounded in principles from function and root-motion harmonic theories allow the generation of chord progressions characteristic of Western tonal music.
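As a rough illustration of the navigation idea, the hypothetical ranking function below trades closeness to a tonal centre against consonance when choosing among candidate chords; the scoring formula and the consonance_bias knob are assumptions for illustration, not Conchord's algorithm.

```python
import numpy as np

def rank_candidates(key_tiv, candidate_tivs, consonance_bias=0.5):
    """Order candidate chords (given as TIVs) from most to least suitable:
    close to the tonal centre and, via `consonance_bias`, more consonant.
    The scoring formula is an illustrative assumption, not the paper's."""
    def score(tiv):
        relatedness = np.linalg.norm(tiv - key_tiv)  # distance to the key centre
        return relatedness - consonance_bias * np.linalg.norm(tiv)  # reward consonance
    return sorted(range(len(candidate_tivs)), key=lambda i: score(candidate_tivs[i]))

# Usage (with tonal_interval_vector from the first sketch):
#   key = tonal_interval_vector([1, 0, 1, 0, 1, 1, 0, 1, 0, 1, 0, 1])  # C major scale
#   order = rank_candidates(key, [tonal_interval_vector(c) for c in chord_chromas])
```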

2016

Harmony Generation Driven by a Perceptually Motivated Tonal Interval Space

Authors
Bernardes, G; Cocharro, D; Guedes, C; Davies, MEP;

Publication
COMPUTERS IN ENTERTAINMENT

Abstract
We present D'accord, a generative music system for creating harmonically compatible accompaniments of symbolic and musical audio inputs with any number of voices, instrumentation, and complexity. The main novelty of our approach centers on offering multiple ranked solutions between a database of pitch configurations and a given musical input based on tonal pitch relatedness and consonance indicators computed in a perceptually motivated Tonal Interval Space. Furthermore, we detail a method to estimate the key of symbolic and musical audio inputs based on attributes of the space, which underpins the generation of key-related pitch configurations. The system is controlled via an adaptive interface implemented for Ableton Live, MAX, and Pure Data, which facilitates music creation for users regardless of music expertise and simultaneously serves as a performance, entertainment, and learning tool. We perform a threefold evaluation of D'accord, which assesses the level of accuracy of our key-finding algorithm, the user enjoyment of generated harmonic accompaniments, and the usability and learnability of the system.
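The key-finding step can be pictured as a nearest-template search in the space. The sketch below assumes hypothetical binary diatonic key profiles and a chroma-to-TIV mapping such as tonal_interval_vector from the first sketch; the actual templates and distance used by D'accord may differ.

```python
import numpy as np

# Hypothetical binary diatonic profiles; D'accord's own key templates may differ.
MAJOR = np.array([1, 0, 1, 0, 1, 1, 0, 1, 0, 1, 0, 1], dtype=float)
MINOR = np.array([1, 0, 1, 1, 0, 1, 0, 1, 1, 0, 1, 0], dtype=float)  # natural minor

def estimate_key(input_tiv, tiv_fn):
    """Return the key whose template lies closest to `input_tiv` in the
    Tonal Interval Space. `tiv_fn` maps a chroma vector to a TIV, e.g.
    tonal_interval_vector from the first sketch."""
    names = "C C# D D# E F F# G G# A A# B".split()
    best_key, best_dist = None, np.inf
    for tonic in range(12):
        for mode, profile in (("major", MAJOR), ("minor", MINOR)):
            dist = np.linalg.norm(input_tiv - tiv_fn(np.roll(profile, tonic)))
            if dist < best_dist:
                best_key, best_dist = f"{names[tonic]} {mode}", dist
    return best_key
```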

2019

Dynamic Music Generation, Audio Analysis-Synthesis Methods

Authors
Bernardes, G; Cocharro, D;

Publication
Encyclopedia of Computer Graphics and Games

Abstract

2021

A Review of Musical Rhythm Representation and (Dis)similarity in Symbolic and Audio Domains

Authors
Cocharro, D; Bernardes, G; Bernardo, G; Lemos, C;

Publication
Perspectives on Music, Sound and Musicology

Abstract

2021

Understanding cross-genre rhythmic audio compatibility: A computational approach

Authors
Lemos, C; Cocharro, D; Bernardes, G;

Publication
ACM International Conference Proceeding Series

Abstract
Rhythmic similarity, a fundamental task within Music Information Retrieval, has recently been applied in creative music contexts to retrieve musical audio or guide audio-content transformations. However, there is still very little knowledge of the typical rhythmic similarity values between overlapping musical structures per instrument, genre, and time scale, which we denote as rhythmic compatibility. This research provides the first steps towards an understanding of rhythmic compatibility through a systematic analysis of MedleyDB, a large multi-track musical database composed and performed by artists. We apply computational methods to compare database stems using representative rhythmic similarity metrics - Rhythm Histogram (RH) and Beat Spectrum (BS) - per genre and instrumental family, and to understand whether RH and BS can discriminate genres at different time scales. Our results suggest that 1) rhythmic compatibility values lie between [.002, .354] (RH) and [.1, .881] (BS), 2) RH outperforms BS in discriminating genres, and 3) different time scales in RH and BS impose significant differences in rhythmic compatibility.
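As one concrete reading of such a stem-to-stem comparison, the sketch below measures the distance between two stems that have already been reduced to Rhythm-Histogram-like vectors (energy per modulation-frequency bin); the feature extraction and the exact metric used in the paper may differ.

```python
import numpy as np

def rhythmic_distance(rh_a, rh_b):
    """Normalised Euclidean distance between two Rhythm-Histogram-like
    vectors (energy per modulation-frequency bin); smaller values mean the
    stems are more rhythmically compatible."""
    a = np.asarray(rh_a, dtype=float)
    b = np.asarray(rh_b, dtype=float)
    a = a / a.sum()   # normalise so stems of different loudness are comparable
    b = b / b.sum()
    return float(np.linalg.norm(a - b))
```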
