Guy Ben-Ary has invented a new form of synthesiser powered entirely by living nerve cells grown from his own body, with no computer intervention.

Guy Ben-Ary is the mastermind behind cellF – a newly created sound machine powered entirely by human brain cells. This remarkable feat is the result of a four-year research and development project exploring the potential for artworks that combine human biology and robotic technologies. The pioneering musician and inventor is one of Australia’s most innovative minds, and now the bionic artist has created a mind all of its own: an “external brain” that can control modular synthesisers.

Ben-Ary received an Australia Council Fellowship in 2012 to begin work on a ‘self-portrait’ project at SymbioticA, the Centre of Excellence in Biological Arts at the University of Western Australia. Over this development period Ben-Ary has grown a second “external brain” from his own bodily tissue. Skin cells taken from his arm via biopsy were coaxed back into an embryonic-like state using induced pluripotent stem cell technology, then reprogrammed into neural stem cells and grown into a functional neural network over a multi-electrode array dish – essentially creating a simple brain that could be used to power cellF.

“When thinking about what kind of body to design for myself I immediately rejected the idea of working within a humanist paradigm,” Ben-Ary explains. “I decided to give my-cellF a sound producing body and follow my naïve childhood dream of being a rock and roll star. At this point, the project took a slight shift in its focus from a self-portrait to a quest – of becoming a Rock Star.”

While based on sound science, the mechanics of this sonic self-portrait read like something from Dr Frankenstein’s laboratory. A grid of electrodes is wired up to the dishes containing the “brain”. These record the electrical signals that the neurons produce, and can simultaneously send stimulation to the neurons from outside sources, allowing the system to react to other living performers. These impulses control the sounds produced by cellF. The neural interface effectively acts as a “read-and-write” channel to the brain, enabling communication between a human performer and the neural network by controlling the input stimulation (the human-generated sound) and monitoring the output signal (the neural activity).
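
For readers who want a concrete picture of that “read-and-write” loop, the sketch below puts it in software terms: stimulate the culture with the performer’s sound, read back the neural activity, and turn that activity into control signals. It is a conceptual illustration only – cellF itself is analogue and computer-free, and the class names, electrode count and voltage mapping here are assumptions rather than details of Ben-Ary’s hardware.

```python
import numpy as np

# Purely conceptual sketch of a "read-and-write" neural interface loop.
# cellF itself is analogue and computer-free; every name and number below
# is hypothetical, chosen only to illustrate the shape of the feedback loop.

class MultiElectrodeArray:
    """Stand-in for an electrode grid that can stimulate and record neurons."""

    def __init__(self, n_electrodes=64):
        self.n_electrodes = n_electrodes

    def stimulate(self, audio_frame):
        # "Write": deliver the performer's sound to the culture as electrical
        # stimulation (hardware details are omitted in this sketch).
        pass

    def record(self):
        # "Read": return one activity value per electrode. Random numbers
        # stand in for real neural recordings here.
        return np.random.rand(self.n_electrodes)


def activity_to_control_voltages(activity):
    # Hypothetical mapping: each electrode's activity level becomes a
    # 0-5 V control voltage driving one parameter of an analogue synth.
    return 5.0 * np.clip(activity, 0.0, 1.0)


def performance_loop(mea, performer_audio):
    # One pass of the closed loop: human sound in, neural "music" out.
    for frame in performer_audio:
        mea.stimulate(frame)                           # write: human-generated sound
        activity = mea.record()                        # read: neural response
        yield activity_to_control_voltages(activity)   # would drive the synths


if __name__ == "__main__":
    mea = MultiElectrodeArray()
    dummy_audio = [np.zeros(512) for _ in range(4)]  # placeholder audio frames
    for control_voltages in performance_loop(mea, dummy_audio):
        print(control_voltages[:4])  # first few control voltages per frame
```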

To give the cells a “voice” and a means of expression, Ben-Ary connected the “brain” interface to a signal amplifier and a custom-built analogue synthesiser. A built-in mixer spatialises the sound generated by the cells across 16 speakers, creating a dynamic, three-dimensional soundscape. Walking among the speakers during a cellF performance is akin to walking inside Ben-Ary’s “brain”.
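
cellF’s spatialisation happens entirely in the analogue mixer, but the general idea of fanning one signal out across a ring of 16 loudspeakers can be sketched digitally. In the hypothetical example below, each speaker’s gain depends on its angular distance from a virtual sound source; the speaker layout and gain law are assumptions made purely for illustration.

```python
import numpy as np

# Hypothetical sketch of spatialising one signal across a ring of 16 speakers.
# cellF's mixer does this in the analogue domain; this digital example only
# illustrates the idea of weighting one source across many loudspeakers.

N_SPEAKERS = 16
SPEAKER_ANGLES = np.linspace(0, 2 * np.pi, N_SPEAKERS, endpoint=False)


def speaker_gains(source_angle):
    # Weight each speaker by its angular proximity to a virtual sound source.
    diff = np.angle(np.exp(1j * (SPEAKER_ANGLES - source_angle)))  # wrapped angle difference
    gains = np.maximum(0.0, np.cos(diff))                          # nearby speakers are louder
    return gains / np.linalg.norm(gains)                           # keep overall power constant


def spatialise(mono_signal, source_angle):
    # Return a (samples, 16) array: the mono signal fanned out to 16 channels.
    return np.outer(mono_signal, speaker_gains(source_angle))


if __name__ == "__main__":
    tone = np.sin(2 * np.pi * 220 * np.arange(4800) / 48000)  # 0.1 s test tone
    multichannel = spatialise(tone, source_angle=np.pi / 3)
    print(multichannel.shape)  # (4800, 16)
```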

So, what is cellF? A brain? A robotic machine? An instrument? Essentially it’s a cybernetic musician and composer, in that it can interact with musicians in a creative and musical manner. It’s not powered by a computer or driven by programming code. Rather, the living neurons cultivated by Ben-Ary control the array of analogue modular synthesisers attached, making it an entirely autonomous analogue instrument – otherwise referred to as a wet-analogue instrument. Musicians can improvise live with cellF by feeding their music to the neurons as stimulation. cellF gave its world premiere in October last year, jamming live with Tokyo-based experimental jazz drummer Darren Moore in a one-off improvisation.

cellF is a collaboration between artists Guy Ben-Ary, Darren Moore, Nathan Thompson and Andrew Fitch, and scientists Stuart Hodgetts, Mike Edel and Douglas Bakkum.

For more information about cellF, visit Ben-Ary’s website. You can hear an interview with Guy Ben-Ary and cellF in performance on ABC RN’s Books and Arts show. 
