Artificial intelligence (A.I.) is not new; humans have experimented with it for many decades. Although A.I. is used primarily in entertainment today, such as in games, it will prove useful in many other fields once the technology matures. Much of modern A.I. rests on machine learning, which essentially allows a computer to recognise patterns in a sequence of steps or actions fed to it by humans. Drawing on its neural network, the computer learns these patterns, mimics them, and generates its own set of steps or actions.
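To make the idea concrete, here is a minimal sketch of a program that "learns" from a sequence without being given any rules: it simply counts which action tends to follow which. The example sequence, function name and counting approach are illustrative assumptions; systems such as Magenta use neural networks rather than a simple counting table.

```python
from collections import Counter

# Toy illustration of learning patterns from a human-supplied sequence:
# the program is never told any rules, it only counts which action tends
# to follow which. (A stand-in for a real neural network.)

def learn_patterns(sequence):
    """Return, for each action, a count of the actions that followed it."""
    patterns = {}
    for current, following in zip(sequence, sequence[1:]):
        patterns.setdefault(current, Counter())[following] += 1
    return patterns

# A sequence of steps supplied by a human.
demo = ["step", "hop", "step", "hop", "step", "turn", "step", "hop"]
for action, followers in learn_patterns(demo).items():
    print(f"after '{action}':", dict(followers))
# after 'step': {'hop': 3, 'turn': 1}
# after 'hop': {'step': 2}
# after 'turn': {'step': 1}
```

From a table like this, the program can already continue a sequence in the style it was shown, which is the essence of generating its "own" steps or actions.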
As part of its ongoing research into A.I., Google has developed an application that makes piano music based on input from a human player. This innovative project, known as A.I. Duet and built on the Magenta music generation model, is the brainchild of the Google Creative Lab. In this human-computer interaction model, the user first plays a series of notes on the piano keys, whereupon the computer responds with its own series of notes. What sets the project apart from traditional A.I. projects is that the programmer does not need to feed the computer specific rules on which to base its decisions; the Magenta model forms its own rules and judgements from what it learns from the human player.
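The interaction itself can be pictured as a simple loop: listen to the human's notes, update what has been learned, and answer with a phrase drawn from those learned patterns. The sketch below assumes a toy note-transition table standing in for Magenta's neural network; its loop, function names and prompts are invented for illustration and are not Google's actual A.I. Duet code.

```python
import random
from collections import defaultdict

# Toy call-and-response loop: the "model" is just a table of which note
# has followed which so far, standing in for Magenta's neural network.

transitions = defaultdict(list)   # note-to-note patterns heard so far

def listen(human_phrase):
    """Absorb the human's phrase, updating the learned patterns."""
    for current_note, next_note in zip(human_phrase, human_phrase[1:]):
        transitions[current_note].append(next_note)

def respond(last_note, length=6):
    """Reply with a phrase sampled from everything heard so far."""
    phrase = [last_note]
    while len(phrase) < length and transitions[phrase[-1]]:
        phrase.append(random.choice(transitions[phrase[-1]]))
    return phrase[1:]   # the reply begins after the human's final note

if __name__ == "__main__":
    print("Enter notes as MIDI pitches separated by spaces (blank line to quit).")
    while True:
        line = input("You play:     ").strip()
        if not line:
            break
        human_phrase = [int(p) for p in line.split()]
        listen(human_phrase)
        print("A.I. replies:", respond(human_phrase[-1]))
```

Even in this crude form, the reply echoes the player's own patterns back in a new order, which is roughly the experience A.I. Duet offers, only with a far more capable model behind it.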
All the user needs to do is tap a series of keys – whether or not they form a coherent piece of music – and then listen as the computer returns the favour. While some of the computer-played responses sound completely random, others sound eerily human, as if the computer really can piece together, on its own, a melody that pleases the human ear! This is made all the more remarkable by the fact that the computer does not rely on rules handed down by a programmer.
Since Google Creative Lab released A.I. Duet, the programme has received coverage in the online press. Many people have tried it, with interesting results. Music lovers, in particular, have been thrilled that it is finally possible for a computer to interact with humans in the form of a piano duet. The emergence of A.I. Duet does away with the traditional need for two human players seated side by side at a keyboard.
However, as with all things technological, breakthroughs bring tough questions. Just as people ask whether driverless cars will replace the joy of driving, or whether email kills the warm, personal touch of a handwritten letter, people are now asking whether it is really possible for computers to make music. Music, after all, has been an essential part of human existence for as long as civilisations have existed. To some, it is even sacred, as can be seen in its use in religious events and festivals.
To what extent, then, can humans accept artificially created music as part of their lives? Beyond this psychological side of the equation, there is also the question of whether computers can understand human emotion, which undoubtedly forms an important element of musical composition. The great composers and performers have poured time, resources and feeling into their passion for the musical arts, and the results can be heard in famous compositions known around the world.
Will computers ever be able to do the same? Will we see a day when computers can experience human joy and pain, and compose melodies that reflect those feelings? What do you think?