Generation of genre musical compositions according to the emotional state of a person
UDC 004.896
DOI: 10.26102/2310-6018/2022.37.2.026
Abstract: The aim of this article is the research and development of algorithms and software for automating and supporting the process of technical creativity through the automated generation of musical compositions in different genres based on a person's emotional state. The work relies on generating musical material with the aid of artificial neural networks. A recurrent neural network with long short-term memory (LSTM) is chosen for music generation because this type of network can capture the hierarchy and interdependence of musical data. The paper gives a detailed description of the training data collection process, the neural network training process, and the network's use for generating musical compositions, together with an illustration of the network architecture. In addition, it outlines a generalized method for determining a person's emotional state by analyzing an image using the principles of the Lüscher color test. Sounds are synthesized from the prepared musical material by sampling, a method that emulates the realistic sound of musical instruments and is relatively easy to implement. Finally, the article describes the design and development of software that validates the algorithms and methods under review, namely a website that generates a musical composition by analyzing an image.
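The abstract's approach of training an LSTM on musical data implies next-note prediction over fixed-length note windows. A minimal sketch of how such training pairs might be prepared is shown below; the function name, window size, and toy melody are illustrative assumptions, not code from the paper.

```python
# Illustrative sketch (not the authors' implementation): preparing a note
# sequence as fixed-length input/target windows, the usual data format for
# training an LSTM-based next-note music generator.

def make_training_windows(notes, window=4):
    """Slide a fixed-size window over a note sequence.

    Each input window of `window` notes is paired with the note that
    immediately follows it, so the network learns next-note prediction.
    """
    pairs = []
    for i in range(len(notes) - window):
        pairs.append((notes[i:i + window], notes[i + window]))
    return pairs

# Toy pitch sequence (MIDI note numbers for a C-major fragment).
melody = [60, 62, 64, 65, 67, 65, 64, 62]
pairs = make_training_windows(melody, window=4)
print(pairs[0])  # ([60, 62, 64, 65], 67)
```

At generation time the same window format is used in reverse: the trained network is fed a seed window, the predicted note is appended, and the window slides forward one step at a time.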
Keywords: automated musical generation, Spotify API, sampling, recurrent neural network, correlation schemes between colors and pitches
For citation: Nikitin N.A., Orlova Y.A., Rozaliev V.L. Generation of genre musical compositions according to the emotional state of a person. Modeling, Optimization and Information Technology. 2022;10(2). URL: https://moitvivt.ru/ru/journal/pdf?id=1175 DOI: 10.26102/2310-6018/2022.37.2.026 (In Russ.)
Received 28.04.2022
Revised 22.06.2022
Accepted 29.06.2022
Published 30.06.2022