Can you make music – or even more – with Apple's iPhone X?

Apple iPhone X

Yesterday Apple announced three new iPhones, among them the iPhone X. What makes it special is not just the new, faster A11 Bionic chip, but something that might later become a creative resource.

I mean the phalanx of cameras and sensors on the new iPhone X. The iPhone 8 and 8 Plus largely share the potential of previous iPhones: they are faster, and camera results can be computed more quickly, which helps suppress noise. The same is true of the iPhone X – but what does that have to do with music?

The Apple iPhone X and music

In principle, this new FaceID sensor array could do what the Kinect has done until now: recognize gestures, hand positions, and much more. I have seen entire art exhibitions filled with this kind of instrumentation – 3D objects projected onto cardboard, looking like living organisms, translating your own dance movements into patterns. Naturally, you could generate music the same way.

These capabilities could translate the movements of a drummer or musician, helping to trigger instruments and sounds, or create a kind of theremin that accounts not only for the distance of the hands from two antennas but also for their positions.

The magic word: motion tracking

Why shouldn't something like this be able to trigger a series of sample loops? Or "record" audio in mid-air? Today's iPhone can handle all of this with ease: even the previous processor, with its six cores, has roughly the power of a current 13" MacBook. That these applications could also help VJs and visual artists, or enable entirely new kinds of controllers, is probably obvious. After all, the iPhone X has an infrared sensor that responds to heat – that lies in the nature of the thing.
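To make the theremin idea concrete, here's a minimal sketch of the kind of mapping involved – hand distance to pitch, hand height to loudness. The ranges and the function name are my own assumptions for illustration, not any real iPhone or Kinect API:

```python
def hand_to_theremin(distance_m, height_m,
                     f_min=110.0, f_max=1760.0,
                     d_min=0.1, d_max=0.8):
    """Theremin-style mapping: the closer the hand, the higher the
    pitch; hand height sets amplitude (clamped to 0..1)."""
    # Clamp distance into the playable range, then normalize to 0..1.
    d = min(max(distance_m, d_min), d_max)
    t = 1.0 - (d - d_min) / (d_max - d_min)   # 1.0 when hand is closest
    # An exponential pitch mapping feels more musical than a linear one.
    freq = f_min * (f_max / f_min) ** t
    amp = min(max(height_m, 0.0), 1.0)
    return freq, amp

print(hand_to_theremin(0.8, 1.0))  # farthest hand, full height → (110.0, 1.0)
```

A real app would feed this from the depth sensor every frame and send the result to a synth; hand orientation could modulate timbre in the same way.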

iPhone X – all in one

Anyone who knows applications of the Kinect may also know instruments like Tim Thompson's Space Palette. It consists of a self-built frame, with a Kinect and a computer standing in a corner of the room – something the iPhone X could now accomplish on its own, and this is what it could look like. The key point is that the iPhone would be sound generator, controller, and gesture-input machine in one – and seriously portable. Far less effort than setting up a Kinect plus computer; at most, you'd need a stand to aim the cameras at the performer. And then the necessary cash …

The iPhone X at first glance

How one community was mapping the future of visuals this summer

There’s a shift underway in the worldwide community of visualists – the growing field of people using electronic visuals as a medium for performance, art, and inquiry. As these media become more mature and more international, there’s a renewed sense of closeness among practitioners. While big media festivals focus on novelty and show, these maker-to-maker events emphasize something else: craft.

This summer seemed a particularly historic moment for not one but two tools – each of them built by small teams who make art themselves. We already covered the Berlin gathering for Isadora, the visual performance tool that has rich connections to the world of dance. Now, we get to look at TouchDesigner, which has made a name for itself as the go-to tool for interactive event visuals (among other things). And maybe it’s fitting that unique tools would leave a particular mark. For artists, that particular piece of software is their axe, their main instrument, something to know inside and out.

I asked Isabelle Rousset from Derivative, TouchDesigner’s developer, to help prepare a report on the gathering in Moscow.

And picking someone from the team here works, because these gatherings are family affairs. This is summer camp for visual nerds – a retreat for people of passion. And I was ready for an exhaustive “what did you do last summer” report. We got it, in the form of obsessive notes on what happened and endless leads to check out yourself.


What TouchFest was about

Moscow’s MARS Center is a hub for the city’s electronic media community – one of a handful of places everybody meets to see the latest tech and visiting artists from around the world. And in this case, that same community got their hands dirty organizing the event.

Curiosity Media Lab’s Yan Kalnberzin and Eugene Afonin spearheaded the four-day event in July. It came against the backdrop of the cancellation of Outline Festival (the last afternoon of TouchFest) – but as such, was a reminder of the possibility still latent in Moscow’s scene.

Here’s Derivative on the experience:

It was solidly packed with masterclasses, lectures, demos, audio visual performances and a ‘marathon of interactive madness’. There was zero fluff!

Derivative’s Greg Hermanovic, Markus Heckmann and Isabelle Rousset who were there to participate were blown away on many fronts: the scope, quality, range of TouchDesigner projects and applications, community engagement and support (100+ in attendance), festival programming and schedule where workshops, lectures and masterclasses ran in parallel for multiple days, the generosity and proficiency of the festival organizers for putting together such a BIG and exciting festival (while working on a major project for clients i.e. they didn’t sleep much if at all for days), the MARS Center who provided the fantastic facilities and staffing…. The volunteers and of course the performers and participants whose work and energies were very far out and intense. A ‘hotbed’ of TouchDesigner 🙂


I’m still surprised that around 250 people came))

This was a chance to learn if Moscow is ready for educational events like this. And we wanted to see how many people are really interested in the subject. I’m amazed how many people came from around Russia – three people from Krasnodar, several from Izhevsk, from Novosibirsk, and some other pretty far away cities. Ed.: Yeah – uh, Googling those myself!

We are super happy and thankful to MARS Center and their technical team. They totally made half of the event. And of course to all the speakers – mostly friends, but not all of them – who agreed to perform for free.


Deep in technical education.


TouchFest’s organizers were themselves long-time TouchDesigner instructors. Yan Kalnberzin and Evgeniy Afonin have been teaching the tool since 2012 – even 3- and 6-month courses. These also culminated in presentations:

Yan’s 2014 work, The Square Root of Sin

Exhibit Item Awakening, 2015 [Golden Mask winner]

As they explain:

Here at Curiosity Media Lab we often get letters about education from scratch.
As we organize such an event, inviting Touch masters from Russia and abroad, we want to give you a chance to understand at least on a basic level what they are speaking about.
Our 8 hour masterclass will start from the very beginning – fields of application, nodes, logic, interface, contexts of the program.

Yan and Evgeniy were teaching again. Other highlights:

Markus Heckmann presented an eight-hour class entitled “Developing in TouchDesigner: Python Extensions and Custom Parameters.” “The master class examined how TouchDesigner’s new Custom Parameters and Extensions help develop complex functionality within the environment of visual programming with networks of nodes,” says Isabelle.

“Probably the longest-standing” TouchDesigner user and teacher Andrew Quinn taught a course that incorporated audio and gesture, “Sound-Reactive Visuals and Gesture Tracking.”

Recorded or live sound could then animate movement, light, and color – in two and three dimensions. Gesture tracking transforms the VJ “into a puppeteer.” Andrew is applying the same concept to coursework with kids.
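As a toy illustration of the sound-reactive idea – not Andrew's actual course material – here's the core of it in Python: take an audio buffer's level and map it to a color. The normalization gain here is an arbitrary assumption:

```python
import numpy as np

def level_to_color(samples):
    """Map an audio buffer's RMS level to an RGB color:
    quiet = deep blue, loud = hot red (a common VJ-style mapping)."""
    rms = float(np.sqrt(np.mean(np.square(samples))))
    level = min(rms * 4.0, 1.0)          # crude normalization gain
    r = int(255 * level)
    b = int(255 * (1.0 - level))
    return (r, 64, b)

# A loud 440 Hz test tone maps toward red:
t = np.linspace(0, 1, 44100)
tone = 0.5 * np.sin(2 * np.pi * 440 * t)
print(level_to_color(tone))  # → (255, 64, 0)
```

A real patch would do this per audio block, often split per frequency band (bass driving one parameter, highs another) – exactly the kind of routing that node-based tools make visible.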

He played the closing audiovisual concert with composer Nikolai Popov and six musicians from the Russian Conservatory.

Conservatory musicians join the AndrewQuinn / NikolayPopov AV performance.


Total immersive interactive chaos intensive

Three days in the Vostochnaya Gallery turned into a circus of interactive hacking, thanks to Ildar Yakubov (someone I’ve also had the pleasure to know).

I like the chaos aspect. As Isabelle describes the event, “NEITHER SEW NOR FASTEN”:

[The marathon] showed the world what pure, boundaryless and unpredictable total interactivity looks like. (At one point, I’m told, a doctor in blue scrubs came down saying he was performing an operation upstairs and the floor was shaking!) There was also a flood!

But for three days the kids worked tirelessly, connecting all kinds of materials to TouchDesigner, including:

Microsoft Kinect 2
Intel RealSense F200
Intel RealSense SR300
Arduinos with a variety of peripherals
I/O devices / physical computing
LED strips
Enttec Open DMX Ethernet
DMX controlled devices

Here’s the proposal-manifesto – slightly broken English here, sorry, but posted as-is (as maybe English can’t really describe everything they imagined)!

“Artists are really suppressed with the totality of technology and this way doomed to a both senseless and endless flirting with it” – some critics say 
“Until you are really familiar with the technology you are not able to reflect on it” – says the other 
“Turn off that weird shit or I call the police” – say the neighbours.

Armed with these tools, and using the unlimited potential Touchdesigner offers we will challenge an intuitive interface and the planned user experience.

After two days, we will create a real media art gezamkunstverk and put it to the mercy of the crowd.
Oh yes, about two days – the event will take place in a Hackathon mode: we will work tirelessly for two days. Some rumors say that on the second morning, participants will become familiar with the machine learning process and will learn to use this powerful tool in their practice!

Tools of the interactive madness.


Plotting the madness marathon.


Ildar writes:

For three days we built the Krastinator – a crazy device like a Rube Goldberg machine, whose sole purpose was to stop its own madness. The rubber glove couldn’t reach the big button, but it successfully fooled the Leap Motion into taking it for a real hand! Analysis of the motion of the non-existent bones of that phantom arm generated sound; a skeleton projected on screens, mixed with the picture from motorized cameras, controlled its own image and triggered a strobe; a robo-ball rolled across the floor, grazing contact microphones; a vibration motor and a cooler were activated by a small Korg; the smoke machine did its work – all of it in an endless loop of interactions and relationships, held together by TouchDesigner.
Collaboration like this between people of different skills and backgrounds is useful in every sense – a huge thank you to all participants: you are super!

Lectures: praxis and philosophy

Thumbing through the notes from the lecture content, what strikes me is that you could navigate the full program without ever even wanting to use TouchDesigner and still be really happy. There’s enough content dealing with theory and general technique. Or, on the other hand, you could come to TouchFest wanting to really hone your skills – even from scratch – and have a ball, too.

To give these two areas physical space, the practical and technical were kept downstairs, with loftier topics literally above. (Nice.)

I was sent pages of notes, so let me summarize.



3D mapping technique with Andrew Flat, who has done everything from event design to VFX supervision, and is now co-founder and technical director of the company AVEA.

Roman Gavrilov of Curiosity Media Lab, who has spent years researching folk traditions in Russia and Ukraine and makes the leap from traditional craft to electronic media, covered LED control software GEOPIX.

SILA SVETA’s Dmitry Napolnov drew on an extensive background in events (from live motion capture to projector calibration) to present solutions to production tasks.

A roundtable discussion looked at how to present LED lighting at low costs (from DMX controllers to DIY LED).

Dmitry Karpov covered the “battle” for VR, evaluating platforms and how the technology would shake up the landscape of designers.

Derivative’s Greg Hermanovic looked at TouchDesigner past, present, and (possible) future, in a talk called “A playground for design, and how it got that way”

Other topics:
Building audiovisual instruments: advanced ways to mix human impact with algorithms, TD+Ableton+Max4Live.
Anatomy of an installation
Coding a pixel shader (GLSL TOP) with DAT nodes, for “raymarching”

Dmitry Napolnov, lecture.


Upstairs philosophy topics

Procedural Functions as a New Canon – watch:

Anna Titovets (Intektra), of Russia’s Plums Fest, went deep into the question of live performance. I will paste the whole description here, as I think it matters:

Live video performance: from fractal tunnels to live cinema
Live video performance as a social and cultural phenomenon in contemporary media art.
A talk about video performance as a format – a kind of marker for the changes occurring in today’s information society: in the psychology of perception, in the methods of communicating with the audience, and in global technological development. How did the VJ emerge from “proto-VJing”? How does Live Cinema differ from other genres? How are changes in video performance linked to the development of technology and to the changes taking place in society in general, and in music in particular?
The lecture will discuss the main current stylistic trends shaping live video performance (from 8-bit to glitch and camp aesthetics, from mash-up and political video interventions to generative work and the nonlinear narrative of Live Cinema).

And Isabelle’s own talk sounds fascinating:

A Timeline of TouchDesigner, or, How We Got From Touch001 to Russia
This talk proposes to untangle the conjoined history of TouchDesigner’s development in the context of
1. making music visuals for raves and
2. everything that followed. Isabelle will attempt to trace TouchDesigner’s development through time, technological advancements, historical events, historical achievements, grand projects, lightning bolt moments, community development and hard work.

Ilya Ostrikov covered “the philosophy of digital art”:

Thinking out loud, or in response to your questions: semantics and a historical review – where it all comes from. Art as a means of expression – and how do you make it? How is it born, and where do you apply it? Do you follow the trend, and is it fashionable right now?

More topics:
Code as supreme and universal coauthor

More speakers

Vadim Epstein is a top VJ – who happens to have a background in theoretical physics and 13 years consulting for HP, to boot. Now he makes generative work with code.

Kir Hachaturov and Konstantin Novikov presented their team STRUTTURA, a generative AV performance team made up of designers, coders, media artists, and musicians, seen at venues like Moscow’s Circle of Light and Berlin’s Fashion Film Festival.

Alexandr Letcius of St. Petersburg’s TUNDRA collective specializes in algorithms for AV instruments and synths, with AV installations and performances.

Markus, lecturing.


Markus Heckmann of the Derivative team has been doing… lots and lots and lots of work, too.

Ilya Derzaev, Curiosity Media Lab

Alexey Nadzharov, Curiosity Media Lab

Andrew Quinn


Alex Nadzharov, performance.


Alex Nadzharov performance.


Here’s a sampling of the AV program:


Kir Hachaturov, Konstantin Novikov

Maxim Emelianov, Alexandr Krivoshapkin

Licht pfad: Stanislav Glazov, Margo Kudrina


Ildar Yakubov performs live – the same crazy scientist behind the mayhem of the interactive marathon.

Fiber optic sin: Ildar Yakubov, Viktor Kudryashov

AV live – Zero One – RUSTAMO

Katarina Pits’ video of Rustamo Yusupov:

AV live – Uno
Alexandr Letcius

AV live Ilya Derzaev

AV live Markus Heckmann, WORX.

ARTRA for MIDI percussion and electronics from akipix on Vimeo.

H/5: for accordion and electronic music from akipix on Vimeo.

Nikolay Popov -"ANF-93" for ensemble and electronic (2014) from Nikolay Popov on Vimeo.

Andrew Quinn / Nikolay Popov AV performance.


Seems like the start of something huge.

And it’s great to be back on Create Digital Motion – as one CDM. Seems just the right moment, as visuals reach a new age.



The post How one community was mapping the future of visuals this summer appeared first on CDM Create Digital Music.

A composition you can only hear by moving your head

“It’s almost like there’s an echo of the original music in the space.”

After years of music being centered on stereo space and fixed timelines, sound seems ripe for reimagination as open and relative. Tim Murray-Browne sends us a fascinating idea for how to do that, in a composition in sound that transforms as you change your point of view.

Anamorphic Composition (No. 1) is a work that uses head and eye tracking so that you explore the piece by shifting your gaze and craning your neck. That makes for a different sort of composition – one in which time is erased, and fragments of sound are placed in space.

Here’s a simple intro video:

Anamorphic Composition (No. 1) from Tim Murray-Browne on Vimeo.

I was also unfamiliar with the word “anamorphosis”:

Anamorphosis is a form which appears distorted or jumbled until viewed from a precise angle. Sometimes in the chaos of information arriving at our senses, there can be a similar moment of clarity, a brief glimpse suggestive of a perspective where the pieces align.

Tech details:

The head tracking and most of the 3D is done in Cinder using the Kinect One. This pipes OSC into SuperCollider, which does the sound synthesis. It’s pretty much entirely additive synthesis based around the harmonics of a bell.
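For a feel of what bell-harmonic additive synthesis means, here's a minimal sketch in Python rather than SuperCollider – the partial ratios and decay rates below are generic bell-like values chosen for illustration, not Tim's actual tuning:

```python
import numpy as np

SR = 44100  # sample rate

def bell(freq=220.0, dur=3.0,
         partials=((0.5, 0.6), (1.0, 1.0), (1.2, 0.4),
                   (1.5, 0.5), (2.0, 0.3), (2.7, 0.2))):
    """Sum of decaying sine partials at bell-like inharmonic ratios.
    `partials` is (frequency ratio, amplitude); higher partials decay faster."""
    t = np.linspace(0.0, dur, int(SR * dur), endpoint=False)
    out = np.zeros_like(t)
    for i, (ratio, amp) in enumerate(partials):
        decay = np.exp(-t * (1.0 + i))        # faster decay per partial
        out += amp * decay * np.sin(2 * np.pi * freq * ratio * t)
    return out / np.max(np.abs(out))          # normalize to [-1, 1]

tone = bell()
```

The inharmonic ratios (not integer multiples) and the faster decay of the upper partials are what make the result read as "bell" rather than "organ."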

I’d love to see experiments with this via acoustically spatialized sound, too (not just virtual tracking). Indeed, this question came up in a discussion we hosted in Berlin in April, as one audience member talked about how his perception of a composition changed as he tilted his head. I had a similar experience taking in the work of Tristan Perich at Sónar Festival this weekend (more on that later).

On the other hand, virtual spaces will present still other possibilities – as well as approaches that would bend the “real.” With the rise of VR experiences in technology, the question of point of view in sound will become as important as point of view in image. So this is the right time to ask this question, surely.

Something is lost on the Internet, so if you’re in London, check out the exhibition in person. It opens on the 25th:



From Beethoven to Kinect, linking music to our bodies

“Gesture” is a term that gets tossed about regularly in modern interaction design. But to me, the word is most deeply associated with classical music – and the gestures that first brought me to music, the piano. In this video for TED@BCG, I got to talk about that and why I think it can inform design through today’s newest interfaces.

In rapid-fire form, obviously more could be said about this.

The software involved:
NI Mate, which I need to revisit – it’s gotten more sophisticated since what you see here

And visuals by Geso.

I am now deep into another TED project – TEDxESA at the European Space Agency – but while having a quick coffee, it seemed long overdue to share the last TED project, in front of the audience of TED’s TED@BCG (an official TED-curated event organized with Boston Consulting Group). I’ll be honest: I was a bit slow to share this partly because I would have loved to have had more time to prepare that talk and performance. Performing onstage at a conference is a unique challenge, on another order of magnitude when adding computer vision.

Anyway, thought I might share it (having put off doing so) to see what reactions were.

And I will say, it was a pleasure to work with the TED organization and see how they operate behind the scenes. It felt like having a personal trainer for talks, even in just a brief time. And now I look forward to an entirely different setting, with the folks at ESA. Stay tuned – suffice to say, in the midst of posting this, we’re exploring the world of space exploration and research through the medium of sound. And I think there will be a lot more to say very soon. So back to that – and greetings from south Holland.

From Beethoven to Kinect, linking music to our bodies [ted.com / TED Institute]

Photo: Wolfram Scheible



Seattle Symphony Debuts New Work For Orchestra & Kinect-Controlled Music Robots

The Seattle Symphony (Music Director Ludovic Morlot) recently premiered Above, Below, and In Between, a commission and site-specific composition by sculptor, sound artist and composer Trimpin. Above, Below, and In Between is a sculpture and musical composition for small orchestra, soprano voice, … Continue reading

A Toe-Tapping, Dancing 3D-Printed Robot Plays Music

Making Music With Poppy from Pierre Rouanet on Vimeo.

It can “learn” to tap its toe and bob its head. And then it can make sounds as you move its arms. It’s a robotic interface for music – a bit like playing with a very smart toy doll.

To show off its interactive/interfacing abilities, the team behind Poppy used music.

Poppy is a robot that can be produced with a 3D printer. All the hardware and software are fully open source. The idea – fused with cash from the EU’s European Research Council for funding science and creativity – is to help teach, as well as to empower engineers, scientists, and researchers. Apart from getting kids excited by being really cool, robotics are an excellent way to explore ideas in physical space, honing skills around logic as well as programming.


The combination of robotics and teaching has a long, proud history; look no further than the Logo programming language and the educational Turtle robot. See the founding pioneers of creative computing who led that effort, like roboticist and neuroscientist William Grey Walter, Wally Feurzeig, AI pioneer Seymour Papert, and notably Cynthia Solomon. Solomon helped create Logo, but also took that R&D to Apple and Atari, which brought it to the masses – I was a child of that effort, experiencing Logo for the first time on the Apple //e and going on to teach creative coding myself.

The juncture of science and computer science with music, though, is an important one. It can make those concepts expressive and immediate.

This video could just be the beginning: the research team, led by France’s Dr. Clément Moulin-Frier, produced it after just a few hours in a code sprint, plus the video shoot. So, you could well build on this idea and do something better, given more time. In the meantime, I think it’s already more than reasonably fun.

You’ll find more details on the Poppy forum:
Poppy in a musical setup, please share your ideas

The same team created a Kinect-tracked robotic dance, which is oddly mesmerizing:

More on that effort, by dancer Marie-Aline and researcher/developer Jean-Marc Weber:
Artist residency: Êtres et Numérique

The European Commission has a release on the project in general (here printed in the English-language Prague Post):

Meet Poppy, the printable robot

An overview video covers how the whole rig works:

Poppy humanoid beta Overview from Poppy Project on Vimeo.

Here’s what assembly looks like:

Time lapse of Poppy humanoid's assembly from Poppy Project on Vimeo.

And printing goes something like this, as a hand is produced on a Makerbot Replicator. (Fun trivia: years ago, before founding Makerbot, now-celebrity Bre Pettis was one of the first presenters at CDM’s MusicMakers/Handmade Music event, showing off a cassette tape Mellotron built with Etsy’s Eric Beug. I think it even sort of worked. So, here, things come sort of full circle.)

Poppy's hand being 3Dprinted on a replicator 2 from Poppy Project on Vimeo.

You’re going to need access to a 3D printer, of course, to try this out, but if someone ventures into experimenting with Poppy, we’d love to hear about it.


Watch Adriano Make Surprising Objects, Laser Beams into Triggers for Wild Music


Now that anything can become an instrument, musicianship can become the practice of finding the spirit in the unexpected. It’s what Matt Moldover championed in the notion of controllerism, what years of DIYers have made evident. It’s not just a matter of finding a novelty or two. It’s really taking those novelties and making them a creative force.

Adriano Clemente, the Italian-born, Brooklyn-based artist (aka Capcom), is a shining light of just that sort of imagination. Regular CDM readers will see some familiar techniques. There’s a laser harp, a circuit-bent toy, mic transducers making objects into triggers, a Numark Orbit controller, a LEAP Motion, a Kinect, an Ableton Push, and I’m fairly sure that’s fellow Italian Marco Donnarumma’s wonderful Xth Sense controller in VICE/Motherboard’s featurette on the artist. But it’s the way Adriano puts it all together that becomes the magic.
To put it simply, it’s hard not to get infected by his enthusiasm. He doesn’t just play these unusual objects – he really plays. He’s exploring the reality around him.
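One of the techniques named above – contact mics turning objects into triggers – boils down to threshold crossing with a debounce. Here's a toy sketch (the threshold values are made up, and this has nothing to do with Adriano's actual rig):

```python
import numpy as np

def detect_triggers(signal, threshold=0.3, refractory=2048):
    """Turn a contact-mic signal into discrete trigger points:
    fire when the absolute level crosses the threshold, then
    ignore further crossings for `refractory` samples (debounce)."""
    triggers = []
    i = 0
    while i < len(signal):
        if abs(signal[i]) >= threshold:
            triggers.append(i)
            i += refractory   # skip the ring-out of this hit
        else:
            i += 1
    return triggers

# Two synthetic "taps" 5000 samples apart:
sig = np.zeros(10000)
sig[1000] = 0.9
sig[6000] = 0.8
print(detect_triggers(sig))  # → [1000, 6000]
```

Each returned sample index would then fire a sample or a MIDI note; the refractory window keeps one physical tap from firing several times as the object rings.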

This is in fact the perfect companion to last week’s story by Matt Earp, with Spanish artist Ain TheMachine:
Music That’s All Human Body and Objects, No Instruments: Biotronica with Ain TheMachine [Interview]

The scene for this kind of work, once limited to isolated experiments and academia, is really heating up. It’s actually becoming a realm in which people are outdoing one another, as the world community of experimental performance grows.

I think readers here will also respond to what Adriano says about encountering conservatism – about the people who try to put these different approaches into boxes. (The “that isn’t real music” argument is something we’ve all certainly found.)

Watching the VICE video, you may miss out on Adriano’s musical versatility – and there’s a lot. So, here’s more to see. He isn’t just using odd DIY tools; he mixes familiar options like Ableton Live and conventional MIDI controllers with more experimental approaches, and teaches both, as well. (He’s on the faculty at New York’s Dubspot – and now runs their mysterious and intriguing Dubspot Labs.)

I find his music across genres to be really evocative. Here’s a quick experiment with custom Rutt Etra-style visuals and rather lovely music.

Analog Cubes Processing / Rutt Etra Studies- Adriano Clemente from adriano clemente on Vimeo.

In Den Haag, NL, he turned the Leap Motion into a triply gestural controller for light and sound – a kind of Theremin light and sound organ. Done before? Oh, indeed. But by mixing in clever, glitchy rhythmic elements, he ramped up the expressive, fun quality of that interaction. Implementation is everything. Visuals here are produced by Resolume Arena with sound by Ableton Live.

But he’s just as at home improvising on more conventional controllers. Here he is (for Dubspot) on Ableton Push and (for KORG) on the KORG taktile. I actually think this is a better demonstration of Push improvisation than the promo videos Ableton themselves produced – but, then, Adriano has done a lot of expert work with setup. Ahem – that is to say, he can make the rest of us look clumsy. (I’d better practice my Push routines.)

Adriano on the setup:

In this video, I’ve made an effort to concentrate on the major features and options that users have to perform with in Ableton Push. I want to clarify that I don’t necessarily define Push as a performance controller, nor do I use it live as the only component of my rig; it’s more a studio buddy that helps me transport the experience of making music into a more engaging dimension and lets me escape the classic keyboard-mouse setting. – Adriano Clemente

This sure does look like a performance to me, though I understand that he’s choosing different tools in his main performance rig. But maybe that’s the point: this sort of live improvisation can invigorate studio work, too.

He goes into more detail on the Dubspot blog.

With KORG taktile, he blows a huge hole in my previous argument that you don’t necessarily want pads on a keyboard, by showing just what you can do combining a keyboard with an X/Y controller and pads all in one device. (This is part of what makes KORG taktile an interesting rival to Native Instruments’ Komplete Kontrol – the NI option is more minimal, which could be a factor depending on your tastes.)

None of this would be worthwhile if it were just flailing arms around. Fortunately, his music can send you into a state of glitched-out mental vacation. For instance, here he is going nuts in a trippy, game-inspired world:

And there’s a lot more on SoundCloud:


Previously, I covered his Kinect work.

Ableton has the best profile of his background and inspiration – as much about the nature of the interactions he explores as it is about their products:
Adriano Clemente: Human Interaction

You’ll find lots more via his official site (including links to social media):


Thanks, Adriano, for the latest inspiration!


Experimental: Nagual Dance – Soundscape Firedance with Kinect

The music in this video was created through the movement of the dancers in front of a Kinect camera. This new interactive music experience is called Nagual Dance and will soon be available to the public. It works like this: the Kinect tracks your movements and sends them as data to a computer. The software […]

The next version of the iOS app Gestrument will support the Kinect camera

Gestrument is a wonderful app that lets you perform music by touching the iPad’s screen, even if you can’t play the piano or guitar. The next version of Gestrument will apparently support the Microsoft Kinect 360 camera, and the developers have released a research video. The video shows the app being controlled by a juggler and with rhythmic-gymnastics-like gestures. The fusion of the Kinect camera with music software is one of the things I’m most looking forward to in the near future.




¥800 on the App Store





The Computer Orchestra

The Computer Orchestra is a crowdsourcing platform, created by Simon de Diesbach, Jonas Lacôte, and Laura Perrenoud, that allows users to create and conduct their own orchestra. Users can choose to upload their own music or download samples to integrate into their … Continue reading