Live compositions on oscilloscope: nuuun, ATOM TM

The Well-Tempered vector rescanner? A new audiovisual release finds poetry in vintage video synthesis and scan processors – and launches a new AV platform for ATOM TM.

nuuun, a collaboration between Atom™ (raster, formerly Raster-Noton) and Americans Jahnavi Stenflo and Nathan Jantz, has produced a “current suite.” Everything was recorded live – sound and visuals alike – in Uwe Schmidt’s Chilean studio.

Minimalist, exposed presentation of electronic elements is nothing new to the Raster crowd, known for bringing that raw aesthetic to their work. You could read it as part punk aesthetic, part fascination with visual imagery, rooted in the collective’s history in East Germany’s underground. But as these elements cycle back, there’s now a fresh interest in working with vectors as a medium (see link below, in fact). And as we move from novelty to more refined technique, more artists are finding ways of turning these technologies into instruments.

And it’s really the fact that these are instruments – a chamber trio, in title and construct – that’s essential to the work here. It’s not just about the impression of the tech, in other words, but the fact that working on technique brings the different media closer together. As nuuun describe the release:

Informed and inspired by Scan Processors of the early 1970’s such as the Rutt/Etra video synthesizer, “Current Suite No.1” uses the oscillographic medium as an opportunity to bring the observer closer to the signal. Through a technique known as “vector-rescanning”, one can program and produce complex encoded wave forms that can only be observed through and captured from analog vector displays. These signals modulate an electron-beam of a cathode-ray tube where the resulting phosphorescent traces reveal a world of hidden forms. Both the music and imagery in each of these videos were recorded as live compositions, as if they were intertwined two-way conversations between sound and visual form to produce a unique synesthetic experience.
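True vector rescanning needs the analog hardware, but the underlying trick – drawing shapes by feeding audio-rate signals to X and Y deflection – is easy to sketch digitally. Here’s a minimal Python example (filename and frequencies are arbitrary) that writes a stereo file; on any scope in XY mode, the left channel drives X, the right drives Y, and a Lissajous figure appears:

```python
import numpy as np
import wave

SR = 48000            # sample rate
DUR = 5.0             # seconds
t = np.arange(int(SR * DUR)) / SR

# A Lissajous figure: two sine waves in a 3:2 frequency ratio.
# On an oscilloscope in XY mode: left channel -> X, right channel -> Y.
x = np.sin(2 * np.pi * 300 * t)                 # X deflection
y = np.sin(2 * np.pi * 200 * t + np.pi / 4)     # Y deflection, phase offset

stereo = np.empty(2 * len(t), dtype=np.int16)
stereo[0::2] = (x * 32767).astype(np.int16)     # interleave L/R
stereo[1::2] = (y * 32767).astype(np.int16)

with wave.open("lissajous.wav", "wb") as f:
    f.setnchannels(2)
    f.setsampwidth(2)   # 16-bit samples
    f.setframerate(SR)
    f.writeframes(stereo.tobytes())
```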


Even with lots of prominent festivals, audiovisual work – and putting visuals on equal footing with music – still faces an uphill battle. Online music distribution isn’t really geared for AV work; it’s not even obvious how audiovisual work is meant to be uploaded and disseminated apart from channels like YouTube or Vimeo. So it’s also worth noting that Atom™ is promising that NN will be a platform for more audiovisual work. We’ll see what that brings.

Of course, NOTON and Carsten Nicolai (aka Alva Noto) already have a rich fine art / high-end media art career going, and the “raster-media” launched by Olaf Bender in 2017 describes itself as a “platform – a network covering the overlapping border areas of pop, art, and science.” We at least saw raster continue to present installations and other works, extending their footprint beyond just the usual routine of record releases.

There’s perhaps not a lot that can be done about the fleeting value of music in distribution, but then music has always been ephemeral. Let’s look at it this way – for those of us who see sound as interconnected with image and science, any conduit to that work is welcome. So watch this space.

For now, we’ve got this first release:

http://atom-tm.com/NN/1/Current-Suite-No-IVideo/

Previously:

Vectors are getting their own festival: lasers and oscilloscopes, go!

In Dreamy, Electrified Landscapes, Nalepa ‘Daytime’ Music Video Meets Rutt-Etra


In gorgeous ETHER, a handmade micro lens brings cymatics closer

Sound is physical, but we don’t often get to see that physicality. In this gorgeous video for Thomas Vaquié, directed by Nico Neefs, those worlds of vibrations explode across your screen. It’s the latest release from ANTIVJ, and it’s spellbinding.

The sounds really do generate the visuals here, from terrain generated from analysis of the waveform to footage of metal powder animated by sonic vibrations. A self-made micro lens provides the optics.

https://www.youtube.com/watch?v=aK0BXH7zu-M

Everything in this video was made using the sound waves of the track Ether.
Equipped with a home-made micro lens, a camera travels inside physical representations of the musical composition, from a concrete mountain built from the spectrogram of the music, to eruptions of metal powder caused by rhythmic impulsions.

(Impulsion is a word; look it up! I had to do so.)
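For the technically curious, the “concrete mountain built from the spectrogram” idea is conceptually simple: a magnitude spectrogram is already a heightfield, with time on one axis, frequency on the other, and level as elevation. A rough sketch of that step in Python – the filename and export format are stand-ins, not the production pipeline:

```python
import numpy as np
from scipy.io import wavfile
from scipy.signal import stft

# Load audio and compute a magnitude spectrogram (STFT).
sr, audio = wavfile.read("ether.wav")          # stand-in filename
audio = audio.astype(float)
if audio.ndim > 1:
    audio = audio.mean(axis=1)                 # mix to mono

f, t, Z = stft(audio, fs=sr, nperseg=2048)
height = np.abs(Z)

# Log-scale and normalize to 0..1: now it's a terrain heightmap,
# time along one axis, frequency along the other, level as elevation.
height = np.log1p(height)
height /= height.max()

# Save as a 16-bit grayscale array a 3D tool could displace a mesh with.
np.save("heightmap.npy", (height * 65535).astype(np.uint16))
```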

Still from the video.

Nico Neefs is the director, working with images he created with Corentin Kopp. It’s set to music from Belgian producer Thomas Vaquié’s new album Ecume, on Antivj Recordings. That imprint has for over a decade been a label for audiovisual creations across media – release, installation, performance. Simon Geilfus developed the tool for visualization.

They’ve employed the same techniques to make a very attractive physical release. The image you see in the artwork is cast from a concrete mold. For a limited edition box set, they’re producing 33cm x 33cm plates cast from that mold in dark resin. And it’s ready to mount to a wall if you choose; hardware included. Or if you feel instead like you own enough things, there’s a digital edition.

Ultra-limited handmade physical release.

Concrete mold.

Concrete mold; detail.

The whole album is beautiful; I’m especially fond of the bell-like resonances in the opening piece. It’s a sumptuous sonic environment, full of evocative sound designs that rustle and ring in easy, organic assemblies, part synthetic, part string. Those then give way to broken, warped grooves that push forward. (Hey, more impulsion – like a horse.)

The music was repurposed from installations and art contexts:

These are all derivations of compositions for site-specific and installation projects, the original pieces having been created as a response to place and space, to light and architecture, to code and motion. Now separated and transformed from their original context, the music takes on an independent existence in these new realisations.

That does lend the whole release an environmental quality – spaces you can step in and out of – but it’s nonetheless present emotionally. There’s impact, listening top to bottom, enough so that you might not immediately guess the earlier context. And the release is fully consistent and coherent as a whole. (It is very possible you heard an installation here or there. Vaquié has produced compositions for Centre Pompidou-Metz, for the Old Port of Montreal’s metallic conveyor tower, in Songdo, South Korea, at Oaxaca’s ethnobotanical gardens, and at Hala Stulecia, Poland’s huge concrete dome.)

And there’s thoroughly fine string writing throughout – with a sense that strings and electronic media are always attuned to one another.

Cover artwork.

Thomas Vaquié.

Poetic explanation accompanies the album:

Ether embodies the world that exists above the skies.
It is the air that the gods breathe.
It is that feeling of dizziness,
that asphyxiation that we feel when faced with immensity.

Full video credits:

Music by Thomas Vaquié
Video directed by Nico Neefs
Images by Nico Neefs & Corentin Kopp
Edit & Post-production by Nico Neefs
Video produced by Charles Kinoo for Less Is More Studio and Thomas Vaquié
Filmed at BFC Studio, Brussels 2018.

More, including downloads / physical purchases:

https://thomasvaquie.bandcamp.com/

Plus:
www.thomasvaquie.com
www.antivj.com


SPIRALALALA transforms a spiral staircase into a vocal vortex

Just when you’re bored with digital media installations, something happens that gets you back to childlike wonder mode. And a magical staircase is a pretty good way to do that.

The team at Poland’s panGenerator have been on a tear lately. This time, they took a grand spiral staircase and imagined what would happen if you could make your voice a kinetic part of the architecture. It’s way better than just shouting your echo at a wall.

It’s also a great example of how spatial sound and architecture can interact, making the normally static structures of an environment more dynamic. This is the sort of interactive architecture we’re routinely promised, but now you see/hear it actually working. Each floor gets its own audio, so the sound seems to descend with the ball. Custom-built gates with infrared sensors and radio modules complete the illusion by transforming the sound accordingly.

It’s neo-baroque sonic trompe l’oeil, made with digital technology. The digital transformations of the sound, mapped to the actual kinetic movement of the ball, mix virtual and real.
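Not panGenerator’s actual code, but the “falling sound” illusion they describe boils down to a gain crossfade across the five floor speakers, driven by the ball’s estimated height. A sketch of that pattern, with made-up speaker spacing:

```python
# Hypothetical sketch: crossfade one sound across five stacked speakers
# (one per floor) so it appears to descend with the ball.
FLOOR_HEIGHTS = [15.0, 11.25, 7.5, 3.75, 0.0]  # speaker heights in meters (assumed)

def speaker_gains(ball_height: float, spread: float = 2.5) -> list[float]:
    """Gain per speaker, loudest at the speaker nearest the ball."""
    gains = [max(0.0, 1.0 - abs(ball_height - h) / spread) for h in FLOOR_HEIGHTS]
    total = sum(gains) or 1.0
    return [g / total for g in gains]          # normalize so overall level stays steady

# The gates' infrared sensors give position fixes; between gates,
# position can be extrapolated from the last measured speed.
print(speaker_gains(9.0))   # ball mid-building: middle floors dominate
```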

The artists:

Krzysztof Cybulski
Krzysztof Goliński
Jakub Koźniewski

What we got most recently from the same Warszawa-based crew:

The retro-futuristic Apparatum draws from Polish electronic music history

Details:

During MDF Festival we’ve changed the iconic spiral staircase of the Szczecin Philharmonic into 35m long / 15m high spatial voice-transforming instrument.

The audience has been invited to experiment with various spatialised sound effects applied to their vocalisations that were synchronised with the movement of the balls falling along 35m long track. The interaction starts with insertion of the ball into the microphone. Then recording starts and after the recorded sound stops the ball is released to slide down along the track.

Thanks to custom built gates with infrared sensors and radio modules the sound transformations applied to the recording were synchronised with the current speed and position of the ball. The light trail following the ball has also been created thanks to the sensors and microcontrollers measuring the speed of the ball passing the gates.

Since we were using five speakers – one per floor, we were also able to achieve spatialisation of the sound creating the illusion of the sound “falling” with the ball. As a finishing touch we’ve also used simple projection mapping synchronised with the motion of the ball to make the whole thing more visible for the people standing in the lobby of the Philharmonic.

In the end we’ve created a playful and engaging audience-driven audiovisual performance that exemplifies our vision for integrating new media art practice with architecture and breathing the life into static form thanks to digital technology.

——

VIDEO CREDITS

DOP – Hola Hola Film – holaholafilm.pl
VIDEO EDITING & POSTPRODUCTION – Jakub Koźniewski
SOUND EDITING – Krzysztof Cybulski
VIDEO SOUNDTRACK – Maciek Dobrowolski – mdobrowolski.com
VOICE – Jona Ardyn – jonaardyn.pl

SPECIAL THANKS

Paulina Stok-Stocka
Barbara Kinga Majewska
Tomasz Midzio
Maciej Kalczyński

—–

pangenerator.com/
mdf.filharmonia.szczecin.pl/
https://filharmonia.szczecin.pl/en

More:

http://pangenerator.com/projects/spiralalala/


Teenage Engineering OP-Z has DMX track for lighting, Unity 3D integration

The OP-Z may be the hot digital synth of the moment, but it’s also the first consumer music instrument to have dedicated features for live visuals. And that starts with lighting (DMX) and 3D visuals (Unity 3D).

One of various surprises about the OP-Z launch is this: there’s a dedicated track for controlling DMX. That’s the MIDI-like protocol that’s an industry standard for stage lighting, supported by lighting instruments and light boards.
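If you’ve never touched DMX: a universe is just 512 byte-sized channels refreshed continuously, with each fixture listening at a start address. Here’s a rough Python sketch of building one frame and pushing it out an Enttec-Pro-style USB widget – the framing and port name are assumptions, so check your interface’s docs:

```python
import serial  # pyserial

universe = bytearray(512)          # one DMX universe: 512 channels, 0-255 each

# A hypothetical RGB fixture patched at address 10:
universe[9]  = 255                 # channel 10: dimmer
universe[10] = 255                 # channel 11: red
universe[11] = 64                  # channel 12: green
universe[12] = 0                   # channel 13: blue

# Enttec USB Pro-style framing (an assumption; consult your widget's docs):
payload = bytes([0x00]) + bytes(universe)      # DMX start code + channel data
n = len(payload)
packet = bytes([0x7E, 6, n & 0xFF, n >> 8]) + payload + bytes([0xE7])

with serial.Serial("/dev/ttyUSB0", 57600) as port:   # port name is a stand-in
    port.write(packet)
```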

Not a whole lot revealed here, but you get the sense that Teenage Engineering are committed to live visual applications:

There’s also integration with Unity 3D, for 2D and 3D animations you can sequence. This integration relies on MIDI, but they’ve gone as far as developing a framework for MIDI-controlled animations. Since Unity runs happily both on mobile devices and beefy desktop rigs, it’s a good match both for doing fun things with your iOS display (which the OP-Z uses anyway), and desktop machines with serious GPUs for more advanced AV shows.
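The videolab framework itself lives on the Unity side, but the gist – animation parameters bound to MIDI – is easy to exercise from any sequencer or script. A sketch using mido that sends the kind of notes and CCs such a binding could follow (the port name and CC assignment are hypothetical):

```python
import time
import mido

# Send a ramp on CC 1 -- something a MIDI-bound Unity animation property
# (say, a material color or camera position) could follow.
# The port name below is a placeholder; list ports with mido.get_output_names().
with mido.open_output("OP-Z") as port:
    for step in range(16):
        port.send(mido.Message("note_on", note=60, velocity=100, channel=0))
        port.send(mido.Message("control_change", control=1,
                               value=int(step / 15 * 127), channel=0))
        time.sleep(0.125)   # sixteenth notes at 120 bpm
        port.send(mido.Message("note_off", note=60, channel=0))
```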

Check out the framework so far on their GitHub:

https://github.com/teenageengineering/videolab

We’ll talk to Teenage Engineering to find out more about what they’re planning here, because #createdigitalmotion.

https://teenageengineering.com/products/op-z


Fake a $30k pro video controller with an APC40 or Beatstep and Davinci Resolve

We’re living in an age of video and motion graphics. But now not only can you get a free license of Davinci Resolve to use pro-level tools, but this hack will let you make a standard music controller do a convincing impression of a $30,000 controller. Finally, visuals get as easily hands-on as music.

The Tachyon Post has a bunch of excellent tools for users of Davinci Resolve. (Resolve is the editor / motion graphics / post tool from Blackmagic. It’s a pro-grade tool, but you can use a free license.) But most intriguing are controller mappings for the Akai APC40 and original Arturia Beatstep. If you don’t have an APC40 already, for instance, that’s an inexpensive used buy. (And maybe this will inspire other mappings, too.)

The APC mapping is the most interesting. And it’s ridiculous how much it does. Suddenly color grading, shapes and motion, tracking, and all the editing functions are tangible controls. The developer has also added mappings for Resolve FX. And it’s updated for the latest version, Resolve 15, released this summer.

Watch:

The Beatstep version is pretty cool, as well, with similar functionality to the APC. This isn’t the Beatstep Pro but the “vintage” Beatstep. Unlike the APC, that controller hasn’t had quite the staying power on the music side – the Pro version was much better. But that means it’s even better to repurpose it for video, and of course then you have an effective mobile solution.

If you’re the sort of person to drop 30 grand on the actual controller, this probably isn’t for you. But what it does is to liberate all those workflows for the rest of us – to make them physical again. The APC is uniquely suited to the task because of a convenient layout of buttons and encoders.

I’m definitely dusting off an APC40 and a forgotten Beatstep to try this out. Maybe if enough of us buy a license, it’ll prompt the developer to try other hardware, too.

Super custom edition by the script developer, with some hardware hacks and one-off paint job. Want.

Meanwhile, where this really gets fun is with this gorgeous custom paint job. DIY musicians get to be the envy of all those studio video people.

Grab the scripts to make this work (paid):

https://posttools.tachyon-consulting.com/davinci-resolve-controllers/apc40-resolve-edition/

https://posttools.tachyon-consulting.com/davinci-resolve-controllers/beatstep-resolve-edition/

Thank you, Davo, for the tip!


Max 8: Multichannel, mappable, faster patching is here

Max 8 is released today, as the latest version of the audiovisual development environment brings new tools, faster performance, multichannel patching, MIDI learn, and more.

Max is now 30 years old, with a direct lineage to the beginning of visual programming for musicians – creating your own custom tools by connecting virtual cables on-screen instead of typing in code. Since then, its developers have incorporated additional facilities for other code languages (like JavaScript), different data types, real-time visuals (3D and video), and integrated support inside Ableton Live (with Max for Live). Max 8 actually hits all of those different points with improvements. Here’s what’s new:

MC multichannel patching.

It’s always been possible to do multichannel patching – and therefore support multichannel audio (as with spatial sound) – in Max and Pure Data. But Max’s new MC approach makes this far easier and more powerful.

  • Any sound object can be made into multiples, just by typing mc. in front of the object name.
  • A single patch cord can incorporate any number of channels.
  • You can edit multiple objects all at once.

So, yes, this is about multichannel audio output and spatial audio. But it’s also about way more than that – and it addresses one of the most significant limitations of the Max/Pd patching paradigm.

Polyphony? MC.

Synthesis approaches with loads of oscillators (like granular synthesis or complex additive synthesis)? MC.

MPE assignments (from controllers like the Linnstrument and ROLI Seaboard)? MC.

MC means the ability to use a small number of objects and cords to do a lot – from spatial sound to mass polyphony to anything else that involves multiples.

It’s just a much easier way to work with a lot of stuff at once. That was present in open code environment SuperCollider, for instance, if you were willing to put in some time learning SC’s code language. But it was never terribly easy in Max. (Pure Data, your move!)
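To get a feel for what MC abstracts away, here’s the “many oscillators, one gesture” idea outside Max – a detuned bank summed in numpy, conceptually the kind of multiple that a single mc. object and one patch cord now carry. A sketch, not Max code:

```python
import numpy as np

SR = 48000
t = np.arange(SR * 2) / SR                     # two seconds

# 64 slightly detuned sine oscillators around 220 Hz; the kind of
# "multiples" patching that MC handles with a single object and cord.
N = 64
detune = 1.0 + np.random.uniform(-0.01, 0.01, size=(N, 1))
freqs = 220.0 * detune                         # shape (N, 1) broadcasts over t
phases = np.random.uniform(0, 2 * np.pi, size=(N, 1))

bank = np.sin(2 * np.pi * freqs * t + phases)  # (N, samples): one row per channel
mono = bank.mean(axis=0)                       # mix down, or keep rows for spatial outs
```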

MIDI mapping

Mappings let you MIDI-learn from controllers, keyboards, and whatnot, just by selecting a control and moving your controller.

Computer keyboard mappings work the same way.

The whole implementation looks very much borrowed from Ableton Live, down to the list of mappings for keyboard and MIDI. It’s slightly disappointing they didn’t cover OSC messages with the same interface, though, given this is Max.
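MIDI learn itself is a tiny pattern: listen, grab the first controller message that moves, bind it. A generic sketch with mido – not Max’s internals, and the port name is a placeholder:

```python
import mido

def midi_learn(port_name: str) -> tuple[int, int]:
    """Wait for the user to wiggle a control; return (channel, cc number)."""
    with mido.open_input(port_name) as port:
        for msg in port:                       # blocks until messages arrive
            if msg.type == "control_change":
                return msg.channel, msg.control

# channel, cc = midi_learn("APC40")             # placeholder port name
# mappings["filter_cutoff"] = (channel, cc)     # then route that CC to the parameter
```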

It’s faster

Max 8 has various performance optimizations, says Cycling ’74. But in particular, look for 2x (Mac) – 20x (Windows) faster launch times, 4x faster patch loading, and performance enhancements in the UI, Jitter, physics, and objects like coll.

Also, Max 8’s Vizzie library of video modules is now OpenGL-accelerated, which additionally means you can mix and match with Jitter OpenGL patching. (No word yet on what that means for OpenGL deprecation by Apple.)

Node.JS

This is, I suspect, a pretty big deal for a lot of Max patchers who moonlight in some JavaScript coding. NodeJS support lets you run Node applications from inside a patch – for extending what Max can do, running servers, connecting to the outside world, and whatnot.

There’s full NPM support, which is to say all the ability to share code via that package manager is now available inside Max.

Patching works better, and other stuff that will make you say “finally”

Actually, this may be the bit that a lot of long-time Max users find most exciting, even despite the banner features.

Patching is now significantly enhanced. You can patch and unpatch objects just by dragging them in and out of patch cords, instead of doing this in multiple steps. Group dragging and whatnot finally works the way it should, without accidentally selecting other objects. And you get real “probing” of data flowing through patch cords by hovering over the cords.

There’s also finally an “Operate While Unlocked” option so you can use controls without constantly locking and unlocking patches.

There’s also a refreshed console, color themes, and a search sidebar for quickly bringing up help.

Plus there’s external editor support (coll, JavaScript, etc.). You can use “waypoints” to print stuff to the console.

And additionally, essential:

  • High definition and multitouch support on Windows
  • UI support for the latest Mac OS
  • Plug-in scanning

And of course a ton of new improvements for Max objects and Jitter.

What about Max for Live?

Okay, Ableton and Cycling ’74 did talk about “lockstep” releases of Max and Max for Live. But… what’s happening is not what lockstep usually means. Maybe it’s better to say that the releases of the two will be better coordinated.

Max 8 today is ahead of the Max for Live that ships with Ableton Live. But we know Max for Live incorporated elements of Max 8, even before its release.

For their part, Cycling ’74 today say that “in the coming months, Max 8 will become the basis of Max for Live.”

Based on past conversations, that means that as much functionality as can practically be delivered in Max for Live will be there. And with all these Max 8 improvements, that’s good news. I’ll try to get more clarity on this as information becomes available.

Max 8 now…

There’s a 30-day free trial. Upgrades are US$149; the full version is US$399, plus subscription and academic discount options.

Full details on the new release are neatly laid out on Cycling’s website today:

https://cycling74.com/products/max-features?utm_source=press&utm_campaign=max8-release


This light sculpture plays like an instrument, escaped from Tron

Espills is a “solid light dynamic sculpture,” made of laser beams, laser scanners, and robotic mirrors. And it makes a real-life effect that would make Tron proud.

The work, made public this month but part of ongoing research, is the creation of multidisciplinary Barcelona-based AV team Playmodes. And while large-scale laser projects are becoming more frequent in audiovisual performance and installation, this one stands out both for being especially expressive and for being a heavily DIY project. So while dedicated vendors make sophisticated, expensive off-the-shelf solutions, the Playmodes crew went a bit more punk and designed and built many of their own components. That includes robotic mirrors, light drawing tools, synths, scenery, and even the laser modules. They hacked into existing DMX light fixtures, swapping in mirrors in place of lamps. They constructed their own microcontroller solutions for controlling the laser diodes via Artnet and DMX.
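Art-Net, mentioned there, is simply DMX wrapped in UDP, which is part of why it suits microcontroller hacks like these. For illustration, a minimal ArtDMX packet built by hand in Python – the universe and target IP are placeholders:

```python
import socket

def artdmx_packet(universe: int, channels: bytes, sequence: int = 0) -> bytes:
    """Build an ArtDMX packet: DMX channel data in Art-Net's UDP framing."""
    assert len(channels) <= 512
    return (
        b"Art-Net\x00"                         # protocol ID
        + (0x5000).to_bytes(2, "little")       # OpCode: ArtDMX
        + (14).to_bytes(2, "big")              # protocol version
        + bytes([sequence, 0])                 # sequence, physical port
        + universe.to_bytes(2, "little")       # 15-bit port address
        + len(channels).to_bytes(2, "big")     # data length
        + channels
    )

dmx = bytearray(512)
dmx[0] = 255                                   # e.g. laser diode intensity on channel 1
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(artdmx_packet(0, bytes(dmx)), ("192.168.1.50", 6454))  # placeholder IP
```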

And, oh yeah, they have their own visual programming framework, OceaNode, a kind of home-brewed solution for imagining banks of modulation as oscillators, a visual motion synth of sorts.

It’s in-progress, so this is not a Touch Designer rival so much as an interesting homebrew project, but you can toy around with the open source software. (Looks like you might need to do some work to get it to build on your OS of choice.)

https://github.com/playmodesStudio/ofxoceanode

Typically, too, visual teams work separately from music artists. But adding to the synesthesia you feel as a result, they coupled laser motion directly to sound, building their own synth engine in Reaktor. (OceaNode sends control signals to Reaktor via the now-superior OSC implementation in the latter.)

They hacked that synth engine together from Santiago Vilanova’s PolyComb – a beautiful-sounding set of resonating tuned oscillators (didn’t know this one, now playing!):

https://www.native-instruments.com/es/reaktor-community/reaktor-user-library/entry/show/9717/

Oh yeah, and they made a VST plug-in to send OSC from Reaper, so they can automate OSC envelopes using the Reaper timeline.
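You don’t need a VST to try that OSC-automation idea; any script can ramp a control address. A sketch with python-osc, where the port and /cutoff address stand in for whatever a Reaktor patch actually exposes:

```python
import time
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 10000)   # Reaktor's OSC port: a stand-in

# Ramp an envelope over two seconds at roughly 60 Hz control rate;
# the sort of signal OceaNode's oscillator banks stream out continuously.
STEPS = 120
for i in range(STEPS):
    client.send_message("/cutoff", i / (STEPS - 1))  # hypothetical address, 0..1
    time.sleep(2.0 / STEPS)
```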

OceaNode, visual programming software, also a DIY effort by the team.

… and the DIY OSC VST plug-in, to allow easy automation from a DAW (Reaper, in this case).

It’s really beautiful work. You have to notice that the artists making best use of laser tech – see also Robert Henke and Christopher Bauder here in Berlin – are writing some of their own code, in order to gain full control over how the laser behaves.

I think we’ll definitely want to follow this work as it evolves. And if you’re working in similar directions, let us know.


A flood of user-submitted faces make a poignant new Max Cooper video

It’s a kind of love letter to humanity – and strikingly personal, both in the music and imagery, finding some hopeful part of the current zeitgeist. Don’t miss the new “Lovesong” video, made by music producer Max Cooper in collaboration with artist Kevin McGloughlin.

There’s a lot of bombast both in mainstage electronic music and in music videos – shouting being one way of attempting to get above the din of media in our world. But what’s engaging about “Lovesong” is that it does feel so intimate. There’s the unaffected, simple chord line, wandering around thirds like a curious child, slow fuzzy beats ambling in and out with just enough bits of sonic icing and flourish. And the video, composed of a stream of user-submitted faces, manages to make those faces seem to gaze back through the screen. It’s where we’ve come at last: the visual effects aren’t so gimmicky any more, but read more like a natural first language. (Compare the fanfare with which Michael Jackson’s “Black or White” arrived – see below.)

That is, while this video is surely fodder for design blogs and … well, this one … I suspect it’ll spread person to person, more on the human level suggested by the video.

The visual work, by the way, comes from fellow Irishman and animator/filmmaker Kevin McGloughlin, a self-taught artist and director. Here’s what Max and Kevin have to say for themselves:

From the album ‘One Hundred Billion Sparks’ [Mesh]
Stream / buy: MaxCooper.lnk.to/OHBS
For more information visit onehundredbillionsparks.net and sign up for first news exclusive content at maxcooper.net/#join

“My new album, one hundred billion sparks, is out today, so it seemed a good day to also launch the music video which you created. The whole thing is built from photos which were submitted by those of you who listen to my work, so many thanks for that, and I hope you can spot your face in there!

The topic itself was a difficult one to approach, as so much popular music is written about love that it seems to have become more of an exercise in sales than anything authentic. So it’s a topic I’ve always avoided, but one that came naturally during the process of creating this album about the mind and what creates us, especially with the arrival of a new tiny person at the end of the writing period.

I chatted to Kevin McGloughlin about how we could visualise the idea in a general sense, and we decided that imagery of the human face would be the way to do it. Kevin had the great idea of setting up a platform for the viewers of the video project to submit their own photos to build it from, so as to make the video a personalised, and more meaningful rendering of the love song. Then Kevin worked his magic with the photos creating a beautiful complex blending and processing of stills. Many thanks again to all of you who submitted your photos, and I recommend scanning through to find yourself in there and getting screenshot. It’s amazing how much is going on when you slow down the video to look at the individual frames too, hats off to the awesome Kevin McGloughlin once again!”

– Max Cooper

– – –

“I am completely honoured to have worked directly with so many people for this global portrait.
It was a great experience of collaboration and though some of the images are less prominent than others, each and every image was as instrumental and important as the next in creating the final piece.

When Max told me about his vision for ‘Love Song’, “love of the species”
I immediately had the idea to include real people and real moments in the video.
We asked for submissions and images flooded in from people all over the world, and work got under way.

This video is like a postcard for me. Something for all the people involved.

Big thanks to all the collaborators and to Memfies who aided us in the compilation of all the initial images.”

– Kevin McGloughlin

That idea of “love of the species” recalls for me one of my favorite texts, associated with a street corner in my hometown of Louisville, Kentucky. It comes from Catholic mystic Thomas Merton, but it’s universal enough that the Dalai Lama took it as a title when visiting the city. (And it’s partly about getting away from superficial religiosity.)

“In Louisville, at the corner of Fourth and Walnut, in the center of the shopping district, I was suddenly overwhelmed with the realization that I loved all those people, that they were mine and I theirs, that we could not be alien to one another even though we were total strangers. It was like waking from a dream of separateness, of spurious self-isolation in a special world, the world of renunciation and supposed holiness… This sense of liberation from an illusory difference was such a relief and such a joy to me that I almost laughed out loud… I have the immense joy of being man, a member of a race in which God Himself became incarnate. As if the sorrows and stupidities of the human condition could overwhelm me, now I realize what we all are. And if only everybody could realize this! But it cannot be explained. There is no way of telling people that they are all walking around shining like the sun.”

“Spurious self-isolation” is certainly an idea that music producers will find familiar, not only monks. But even though it’s not always easy, it’s great when we can find our way to wake from the dream of separateness and find love again – and we’re lucky to have music for those love songs.

maxcooper.net
kevinmcgloughlin.com/

By way of history, this piece on “Black or White” and early morphing is a must-read for lovers of animation and computer graphics:

An Oral History of Morphing in Michael Jackson’s ‘Black or White’ [Cartoon Brew]

I had forgotten about the Plymouth Voyager, but I suppose you could argue the minivan is also a love letter to humanity. (The “pooling kids to soccer practice” one, maybe?)


Touché now puts expressive control at hand for $229

“Expressive control” has largely translated to “wiggly keyboards” and “squishy grids,” with one notable exception – the unique, paddle-like Touché from Expressive E. And while keeping essentially the same design, they’ve gotten the price down to just US$/EUR229, making this potentially a no-brainer.

The result: add this little device to your rig, and play gesturally with a whole bunch of instruments, either using provided examples or creating your own.

Preset-packed paddle?

Expressive E’s approach has set itself apart in two key ways. First, they’ve gone with a design that’s completely different from anyone else’s in expressive control. It’s not a ribbon, not a grid, not an X/Y pad, and not a keyboard, in other words.

The Touché is best described as a paddle, a standalone object that you sit next to your computer or instrument. There’s a patented mechanism in there that responds to mechanical movements, so with the slightest pressure or tap, you can activate it, or push harder for multi-axis control.

And that, in turn, opens this up to lots of different control applications. Expressive E market this mainly for controlling instruments, like synthesizers, but any music or visual performance input could be relevant.

The second clever element in Expressive E’s approach is to bundle a whole bunch of presets. The first Touché had loads of support even for hardware synths. The new one is focused more on software. But together, this means that while you can map your own ideas, you’ve got a load of places to start.

Touché SE

The original Touché is US$/EUR 399.

Touché SE is just $/EUR 229.

Here’s the cool thing about that price break: the only real sacrifice here is the standalone operation with hardware. (The SE works with bus-powered USB only.)

Other than that, it’s the same hardware as before, though with a polycarbonate touch plate.

In fact, otherwise you get more:

  • Lié hosting software, with VST hosting so you can use your own plug-ins
  • UVI-powered internal sound engine with leads and mallets and loads of other things
  • 200 ready-to-play internal sounds, which you can call up using dedicated buttons on the device
  • 200+ presets for popular plug-ins (like Native Instruments’ Massive and Prism, Serum, Arturia software, etc.)

So connect this USB bus-powered device (they put a huge four-foot cable in the box), and you get multi-dimensional gestural control.

Standalone, VST, AU, Mac, Windows. (Would love to see a Linux/Raspi version!)

I’ve been playing one for a bit and – it’s hugely powerful, likely of appeal both to plug-in and synth lovers and DIYers alike.

http://www.expressivee.com/touche-se


Vectors are getting their own festival: lasers and oscilloscopes, go!

It’s definitely an underground subculture of audiovisual media, but lovers of graphics made with vintage displays, analog oscilloscopes, and lasers are getting their own fall festival to share performances and techniques.

Vector Hack claims to be “the first ever international festival of experimental vector graphics” – a claim that is, uh, probably fair. And it’ll span two cities, starting in Zagreb, Croatia, but wrapping up in the Slovenian capital of Ljubljana.

Why vectors? Well, I’m sure the festival organizers could come up with various answers to that, but let’s go with because they look damned cool. And the organizers behind this particular effort have been spitting out eyeball-dazzling artwork that’s precise, expressive, and unique to this visceral electric medium.

Unconvinced? Fine. Strap in for the best. Festival. Trailer. Ever.

Here’s how they describe the project:

Vector Hack is the first ever international festival of experimental vector graphics. The festival brings together artists, academics, hackers and performers for a week-long program beginning in Zagreb on 01/10/18 and ending in Ljubljana on 07/10/18.

Vector Hack will allow artists creating experimental audio-visual work for oscilloscopes and lasers to share ideas and develop their work together alongside a program of open workshops, talks and performances aimed at allowing young people and a wider audience to learn more about creating their own vector based audio-visual works.

We have gathered a group of fifteen participants all working in the field from a diverse range of locations including the EU, USA and Canada. Each participant brings a unique approach to this exciting field and it will be a rare chance to see all their works together in a single program.

Vector Hack festival is an artist-led initiative organised with support from Radiona.org/Zagreb Makerspace as a collaborative international project alongside Ljubljana’s Ljudmila Art and Science Laboratory and Projekt Atol Institute. It was conceived and initiated by Ivan Marušić Klif and Derek Holzer with assistance from Chris King.

Robert Henke is featured, naturally – the Berlin-based artist and co-founder of Ableton and Monolake has spent the last years refining his skills in spinning his own code to control ultra-fine-tuned laser displays. But maybe what’s most exciting about this scene is discovering a whole network of people hacking into supposedly outmoded display technologies to find new expressive possibilities.

One person who has helped lead that direction is festival initiator Derek Holzer. He’s finishing a thesis on the topic, so we’ll get some more detail soon, but anyone interested in this practice may want to check out his open source Pure Data library. The Vector Synthesis library “allows the creation and manipulation of vector shapes using audio signals sent directly to oscilloscopes, hacked CRT monitors, Vectrex game consoles, ILDA laser displays, and oscilloscope emulation software using the Pure Data programming environment.”

https://github.com/macumbista/vectorsynthesis

The results are entrancing – organic and synthetic all at once, with sound and sight intertwined (both in terms of control signal and resulting sensory impression). That is itself perhaps significant, as neurological research reveals that these media are experienced simultaneously in our perception. Here are just two recent sketches for a taste:

They’re produced by hacking into a Vectrex console – an early-80s consumer game console that used vector signals to manipulate a cathode-ray screen. From Wikipedia, here’s how it works:

The vector generator is an all-analog design using two integrators: X and Y. The computer sets the integration rates using a digital-to-analog converter. The computer controls the integration time by momentarily closing electronic analog switches within the operational-amplifier based integrator circuits. Voltage ramps are produced that the monitor uses to steer the electron beam over the face of the phosphor screen of the cathode ray tube. Another signal is generated that controls the brightness of the line.
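That integrator scheme is easy to model in software: the computer picks X/Y rates, closes the switch for a set time, and the beam position is the integral. A discrete-time sketch in Python, with an invented segment list:

```python
# Model of the Vectrex-style vector generator: per segment, set X/Y
# integration rates, integrate for a duration, and the beam traces a line.
DT = 1e-6                                     # integrator time step (1 us)

def trace(segments, x=0.0, y=0.0):
    """segments: (x_rate, y_rate, seconds) tuples. Yields beam positions."""
    for x_rate, y_rate, duration in segments:
        for _ in range(int(duration / DT)):
            x += x_rate * DT                  # voltage ramp on the X integrator
            y += y_rate * DT                  # voltage ramp on the Y integrator
            yield x, y

# An invented square: equal-magnitude rates, four segments.
square = [(1e4, 0, 1e-4), (0, 1e4, 1e-4), (-1e4, 0, 1e-4), (0, -1e4, 1e-4)]
points = list(trace(square))                  # 400 beam positions around a square
```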

Ted Davis is working to make these technologies accessible to artists, too, by developing a library for coding-for-artists tool Processing.

http://teddavis.org/xyscope/

Oscilloscopes, ready for interaction with a library by Ted Davis.

Ted Davis.

Here’s a glimpse of some of the other artists in the festival, too. It’s wonderful to watch new developments in the post-digital age, as artists produce work that innovates through deeper excavation of technologies of the past.

Akiras Rebirth.

Alberto Novell.

Vanda Kreutz.

Stefanie Bräuer.

Jerobeam Fenderson.

Hrvoslava Brkušić.

Andrew Duff.

More on the festival:
https://radiona.org/
https://wiki.ljudmila.org/Main_Page

http://vectorhackfestival.com/
