
Roland Groovebox MC-101 live performance at Channel #21 (with Unity visualizer)

A video clip from my live performance with Roland Groovebox MC-101 last night. The MC-101 project file and the visualizer (made with Unity) are available from the GitHub download page. Try them if you have a synth and a performant PC. – Grubo, an audio visual experience with Roland MC-101 and Unity. I’ll use this in a live performance tomorrow.

Roland Groovebox MC-101 and Unity

Implemented a variable resolution test pattern generator as a custom post processing effect

Blurring the boundaries between technology and the individual – Exclusive interview with Shoeg.

By Hayley Cantor

After seeing Shoeg’s project Infiltrate at LEV Matadero, we decided to catch up with him in Barcelona to find out more about his work, to try to decipher the fascinating performance we saw, and to discover what technologies he uses to create his live AV shows.

Primarily, I understand you would consider yourself to be a musician, am I right? Or how would you label yourself? And when did you decide to experiment with the A/V side of your show?

In recent years I’ve changed that way of seeing myself, so I would say I’m an artist. It’s not only sound anymore; I really feel that I am trying to express myself through my code, my visual work, even my movements. I’m also collaborating with dance companies, where it is quite important to know how you move on stage, and this made me aware of that. So, for example, I try to play without the table and computer blocking the sight line to the audience. I have also changed my relationship with sound, focusing more on textured layers instead of pitch.

I started as a “musician”, but my visual side has always been there. I’ve been working for 15 years as a video editor, and I have always had this fascination with the synchronicity and feedback between image and sound.

Image from Shoeg’s project – Oudeis

Have you created the visual part of the show yourself or collaborated with a visual artist? (If so, who and why?) If not, tell us about how you developed the project and any challenges you faced in dealing with both elements of the performance.

I almost always create my own stuff. I’m not closed to collaborating with other people, but I have tried to involve other artists in the past and, for one reason or another, it almost never happened, except at the very beginning of the project, when I worked with Ana Drucker. After that I spent 2-3 years without a visual show, and I was really missing it. At some point I wanted it back, and I decided I had to refresh my coding knowledge to achieve what I wanted. I studied Computer Science for a couple of years, so at least I had a starting point, more or less.

I wanted to build a real-time reactive visual system that could be completely autonomous in a live set. The idea was to set up a bunch of rules and do something sound-reactive that could last 45 minutes in a live set without getting boring. So the first challenge in this process was choosing which tools suited my needs best. I tried, for example, openFrameworks, which was a bit too complicated for my coding skills. Later, I learned about game engines like Unreal and Unity, which are free and let you do a lot through scripting, which is easier to code. It’s also great to have such a good amount of documentation and work done by other people online. I’m curious now about what TouchDesigner can do, but for the moment Unity gives me precise control over what I need.
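The rule-based, sound-reactive approach he describes can be sketched in a few lines. The following is a purely illustrative sketch (in Python for brevity; in his setup this kind of logic would live in a Unity C# script), with hypothetical names and values: an envelope follower smooths the incoming audio level, and a set of simple rules maps it onto visual parameters.

```python
# Hypothetical sketch of a rule-based, sound-reactive visual system.
# None of these names or constants come from Shoeg's actual code.

class EnvelopeFollower:
    """Smooths a raw 0..1 amplitude signal, rising faster than it falls."""
    def __init__(self, attack=0.5, release=0.05):
        self.attack = attack    # smoothing factor while the level rises
        self.release = release  # smoothing factor while the level falls
        self.level = 0.0

    def update(self, amplitude):
        coeff = self.attack if amplitude > self.level else self.release
        self.level += coeff * (amplitude - self.level)
        return self.level

def visual_rules(level):
    """Map a 0..1 envelope level onto visual parameters via fixed rules."""
    return {
        "scale": 1.0 + 2.0 * level,   # objects grow with loudness
        "glitch": level > 0.8,        # glitch effect only on loud peaks
        "hue": (level * 360.0) % 360.0,  # colour cycles with intensity
    }

follower = EnvelopeFollower()
# One update per video frame: a loud burst followed by silence.
for amp in [0.9, 0.9, 0.9, 0.0, 0.0]:
    params = visual_rules(follower.update(amp))
```

Because the rules are pure functions of the smoothed level, the system can run unattended for a whole set, which matches the "completely autonomous" goal he mentions.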

Image from Shoeg’s project – Container

On the other hand, I wanted to work with objects from the real world in 3D aesthetics. I could have modelled them with Blender, but I had no idea how. So I learned some 3D techniques, like photogrammetry and 3D scanning. I remember wanting something more “perfect”, but I discovered almost by accident the beautiful imperfections these techniques introduce into the models.

We recently saw your performance of your latest project ‘Infiltrate’ at LEV Matadero. What tools and set up are you using for the show? 

All the sound was generated using a couple of Etee sensors that the guys at Tangi0 lent me for a couple of months. These devices capture my hand and finger motion, as well as pressure data, and that is converted into MIDI signals through a Max MSP patch. Finally, the MIDI is sent to the Virus and the Digitakt. I had to bring hardware synths to the live sets because I need a lot of polyphony to build these big layers of sound, and I couldn’t achieve it with virtual synths. Then, the visual part is a Unity app that reacts to the sound mix.
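The sensor-to-MIDI stage he describes can be sketched as follows. This is a hypothetical Python illustration, not his actual Max MSP patch: the function names, scaling, and thresholds are assumptions; the idea is simply that continuous sensor readings (pressure, finger bend) become standard MIDI messages.

```python
# Hypothetical sketch of mapping 0.0-1.0 sensor readings to MIDI messages,
# illustrating the kind of conversion a Max MSP patch like Shoeg's performs.

def pressure_to_cc(pressure, controller=1, channel=0):
    """Scale a 0.0-1.0 pressure reading to a MIDI Control Change message."""
    value = max(0, min(127, round(pressure * 127)))
    status = 0xB0 | channel  # 0xB0 = Control Change status, low nibble = channel
    return (status, controller, value)

def bend_to_note(bend, threshold=0.6, note=60, channel=0):
    """Trigger a Note On when a finger bend crosses a threshold, else Note Off."""
    if bend >= threshold:
        return (0x90 | channel, note, 100)  # Note On, velocity 100
    return (0x80 | channel, note, 0)        # Note Off

msg = pressure_to_cc(0.5)  # half pressure -> mid-range CC value
```

In the real setup, the resulting MIDI is then sent on to the Virus and the Digitakt, which do the actual sound generation.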

Infiltrate at LEV Matadero, photo by Hayley Cantor

How does the use of this technology improve, or add to the quality and experience of your show for you, as an artist?

It allows me to express myself in ways I could never have imagined. I had never performed as comfortably, or with such a wide palette of possibilities, with an instrument until I discovered motion sensors combined with the computer. The ability to map any behaviour to any response allows you to optimize your abilities in order to get what you want. This can never happen with “traditional” instruments; you have to adapt to the instrument’s rigidity and background. I also see the coding process as a prosthesis, an extension able to repeat mechanical operations while you pierce through them.

What does the future hold for Shoeg in the world of live performance?

In the near future, I have a lot of things to improve: I want to make my hands more prominent on stage and be less computer-dependent. People keep asking what is happening with the sensors, and I want to make it a bit more understandable. I also have this long list of ideas to code which I don’t have time to make, and I would also like to collaborate with other people. But before that, I want to record a new album. I hope I’ll be able to work on it in the coming months.

You can find out more about Shoeg’s work through his artist page.

The post Blurring the boundaries between technology and the individual – Exclusive interview with Shoeg. appeared first on Audiovisualcity.

Has the Mirrorless Silent Mode Feature Killed the Unit Stills Blimp?

By Lewis McGregor

Unit stills photographers shoot the majority of promotional material and behind-the-scenes stills, but many people don't know much about their role.

SATIS-Screen4All 2019: Unit Image looks back at Love, Death & Robots

By Shadows

As part of the SATIS-Screen4All conferences held near Paris in early November, a round table was organized on production for VOD services.

Célia Digard, head of production at the studio Unit Image, made the trip to discuss how she and her team handled the episode Beyond the Aquila Rift (Derrière la faille in French) of the animated series Love, Death & Robots, available on Netflix.

Célia Digard – Head of Production – Unit Image

Context and production

Unit was initially contacted by Blur Studio, the company founded in 1995 by Tim Miller that was in charge of producing the episodes. Although the French studio’s talent was well established, Unit Image still had to present two proof-of-concept shots before production was greenlit.

The trailer for the series Love, Death & Robots

The rest of the project went very smoothly, Célia Digard explained: once the script was received, Unit was able to propose a few changes (for reasons of narrative, production, or running time). Despite an episode length of around a quarter of an hour, far beyond the trailers the studio’s teams and pipeline are used to, production went off without a hitch. There were of course exchanges with the client, but Unit enjoyed a great deal of freedom. In the end, most of the communication with Netflix took place fairly early on, around casting (Unit Image having had the chance to be involved from that stage of the project). The studio was also able to handle the sound design and music (but not the voices), a first for Unit: usually, for video game cinematics, sound is not part of the deliverables.

Netflix, a demanding client

Beyond a few pipeline changes, Célia Digard pointed out that the project brought specific legal constraints, particularly regarding confidentiality. Ultimately a very positive point, since the measures put in place now serve as a template for Unit Image’s new contracts.
Another specificity of working for Netflix: the streaming giant took about a month to approve the 2K images generated by Unit Image. Célia Digard explained that Netflix checks every last detail, including under unusual viewing conditions: contrast and brightness pushed to the maximum. Netflix’s goal seems to be to make certain that whatever the viewing conditions and whatever the device, the image will be perfect, including in the shadows. A level of demand and perfectionism consistent with what we had heard from other studios.

One last point: while Célia Digard could not reveal the value of the contract, she did specify that the quote was not challenged by Netflix. Likewise, Netflix did not ask for penalties in case of delays.

A very positive outcome

As you will have gathered, and as Célia Digard clearly stated, Unit Image greatly enjoyed this project. While the studio unfortunately could not get figures on the audience generated by its episode, the opportunity for Unit to step outside its comfort zone while enjoying its client’s trust was clearly a good experience. Even the legal constraints ultimately had a positive impact, as mentioned above.
Finally, note that Love, Death & Robots has been renewed for a second season. We can only hope that Unit Image will be part of it again.

Further reading

– The series Love, Death & Robots is available on Netflix.

– For another example of collaboration between a French studio and Netflix, we invite you to read or reread our interview with D-Seed about the science-fiction film Io.

The article SATIS-Screen4All 2019: Unit Image looks back at Love, Death & Robots first appeared on 3DVF.

Added a block noise glitch effect to the custom post processing effect collection

Shoeg (AKA Carlos Martorell)

By Hayley Cantor

Carlos Martorell is a sound and visual artist based in a small Catalan town called Torelló, near Girona. His work focuses on the symbiotic relationship between humans and technology. He uses his programming skills and knowledge of new technologies to explore the visual and audio through the creation of experimental music and live AV.

Carlos Martorell by Xavi Casanueva

He creates sumptuous virtual worlds through programming in tools such as Unity, as well as with 3D scans. It’s not uncommon to see him performing with non-traditional MIDI equipment, using apparatus such as gloves and hand-held technology, which adds a peculiar physical dimension to his live shows.

SHOEG: Website
LINK BOX: Soundcloud / Bandcamp / Instagram / Youtube / Resonate

Header photo © Hayley Cantor

The post Shoeg (AKA Carlos Martorell) appeared first on Audiovisualcity.