Japanese designer Yuri Suzuki has reimagined a sixty-year-old electronic sequencer as a physical instrument running music software that uses artificial intelligence to generate melodies.
Conceived by musician Raymond Scott in 1959, the Electronium, which is regarded as the world's first electronic sequencer, was made up of three switchboards mounted on a wooden cabinet.
Although the machine was never completed, it was meant to allow users to perform and compose music simultaneously.
Using pre-programmed algorithms, it would turn a snippet of any given melody into a full composition while enabling users to add embellishments over the top.
For the upcoming Barbican exhibition AI: More Than Human, Suzuki – a partner at Pentagram – set out to recreate the landmark machine using Google Magenta, a musical AI software library.
"Scott was the first person to build an electronic sequencer and is widely seen as a forefather of electronic and ambient music," said Suzuki. "All of his previous inventions, skills and ambitions culminated in the creation of the Electronium."
"It was conceived as an instantaneous performance-composition machine, able to intelligently generate music by responding to sequenced melodic phrases," he explained.
Similar to the original, Suzuki's Electronium features three touch-screen panels.
Users tap out a short melody on the middle panel; artificial intelligence then transforms it into a more complex composition, which is displayed on the right-hand panel – a function that was analogue in the original design.
The left-hand panel functions as an interface where users can add beats and effects onto the existing melodies. Those unfamiliar with electronic sequencers can also experiment with a series of tutorials created by Suzuki and his team for the software.
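The core interaction described above – a short seed melody tapped in by the user and expanded into a longer line – can be sketched in miniature. The following is a hypothetical illustration using a simple Markov chain over the seed's own note transitions; it is not Suzuki's actual Magenta-based software, and all function names are assumptions:

```python
import random

def build_transitions(melody):
    """Map each pitch to the pitches that follow it in the seed melody."""
    transitions = {}
    for a, b in zip(melody, melody[1:]):
        transitions.setdefault(a, []).append(b)
    return transitions

def extend_melody(seed, length, rng=random):
    """Extend a tapped-in seed melody to `length` notes by sampling
    pitch-to-pitch transitions learned from the seed itself."""
    transitions = build_transitions(seed)
    result = list(seed)
    while len(result) < length:
        choices = transitions.get(result[-1])
        if not choices:              # dead end: fall back to any seed note
            choices = seed
        result.append(rng.choice(choices))
    return result

seed = [60, 62, 64, 62, 67]          # MIDI note numbers (C4 D4 E4 D4 G4)
composition = extend_melody(seed, 16)
```

A real system such as Magenta learns transitions from a large training corpus rather than from the seed alone, which is what lets it introduce material the user never played.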
Given the scarce information surrounding the instrument, Suzuki reached out to Scott's family and to the owner of the only surviving, unfinished Electronium for Scott's schematics, notes and signal flowcharts.
"Always wary that contemporaries could steal his work, Scott was extremely secretive and deliberately avoided writing detailed documentation on his inventions," said Suzuki.
"Various parts and cabling were removed from the machine for future projects, leaving only a series of recordings behind," he explained.
Following weeks of research, Suzuki and his team were able to analyse how the machine worked and find programmers who could help build the software.
Creative studio Counterpoint was enlisted to work on the project. With Google Magenta, the team trained the artificial intelligence to distinguish between multiple singing voices and melodies, using JS Bach's chorales – a collection of four-part harmonies – as training material.
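A four-part chorale dataset of the kind described here is commonly represented as parallel pitch streams, one per voice. Below is a minimal, hypothetical sketch of that representation – not Magenta's actual data format – showing how separate voice lines are combined into the per-step chords a model learns from:

```python
# One entry per time step; four parallel voice lines (soprano, alto,
# tenor, bass) is a standard encoding for Bach chorales. Pitches are
# MIDI note numbers; the values here are made up for illustration.
CHORALE = {
    "soprano": [72, 71, 69, 67],
    "alto":    [67, 67, 65, 64],
    "tenor":   [64, 62, 60, 60],
    "bass":    [48, 55, 53, 48],
}

def chords_at_each_step(chorale):
    """Align the four voice lines into one four-note chord per time step,
    the joint view a model needs to learn how voices move together."""
    voices = [chorale[v] for v in ("soprano", "alto", "tenor", "bass")]
    return [tuple(step) for step in zip(*voices)]
```

Training on this joint view is what lets the software keep several simultaneous lines distinct rather than collapsing them into one melody.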
"As a result of its baroque influence, the results are often extremely idiosyncratic as the code tries to work around a more pop sensibility offered in the major scale of the Electronium."
The resulting software aims to function as a compositional tool for beginners and experts alike.
"Though the machine was conceptualised in 1959, it still poses questions about authenticity, the nature of creativity, and man-machine relationships that are increasingly vital in the present day," said Suzuki.
"We hope that bringing the Electronium to life contributes to the current conversation about the qualities of artificial intelligence, and how it is increasingly common to implement intelligent technology in all aspects of our lives," he continued.
Suzuki's Electronium will be on show at the Barbican's AI: More Than Human exhibition in London, between 16 May and 26 August 2019.
The designer has previously embarked on a number of musical projects, including sound-modifying sculptures and a Brexit-themed acid house record.