Yesterday — 30 May 2020 — Your RSS feeds

Next live stream on Tuesday.
We will demo some effects shipped with Millumin, but also show how to add new ones, and even create your own.
Of course, we will chat and answer your questions.
Be sure to subscribe on YouTube:
📅 Facebook event:
🇬🇧 In English with a beautiful French accent
📘 Previous streams on
#millumin #live #training #streaming #tracking #effect #shader #isf #custom #youtube


DIT Eduardo Eguia details the imaging process used on The Mandalorian. The Mandalorian is about the travels of a lone bounty hunter in the outer reaches of the galaxy, far from the authority of the New Republic. It is George Lucas's backstory vision of the Star Wars bounty hunter theme that tied together ...

The post DIT EDUARDO EGUIA ON MAKING THE MANDALORIAN appeared first on NAB Show News | 2020 NAB Show Media Partner and Producer of NAB Show LIVE. Broadcast Engineering News.

Speed up Your Editing with a Keyboard-Only Workflow in Premiere

Want to speed up your editing in Premiere Pro? Move to a keyboard-only editing style and improve your editing speed by twenty percent.

Multicast in NDI across subnets

By: tcomerma

I have some questions regarding the use of multicast in NDI enabled devices that I've been unable to verify by myself.

We've been testing some NDI devices (BirdDog, Magewell, NDI Tools) before deploying onto our network. We are quite a large broadcaster and the network covers several buildings, so routing traffic is a must, and multicast is desirable.

While testing, it seems that transmitters and receivers switch from multicast to unicast when they are on different networks, even though I selected multicast with TTL=10 in Access Manager. Support from Magewell said, “Please kindly be aware that multicast is not supposed to cross subnets”. Is this true? Is there a way to force multicast? I guess it has to be possible, because you can specify a TTL for multicast, and that is only necessary to allow routing.

We've set up a Discovery Service. Magewell support also said, “when a Discovery Service is enabled, it would force unicast for transmission”. Is this right?

I’ve been unable to find documentation to verify these.
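For background on the TTL point: multicast datagrams default to a TTL of 1, which confines them to the local segment; each router hop decrements the TTL, so a TTL of 10 permits forwarding across up to ten hops, but only if the intervening routers are configured to forward multicast (IGMP/PIM) at all. A minimal stdlib Python sketch of raising the TTL on a sender socket (the group address and port here are hypothetical placeholders, not anything NDI-specific):

```python
# Sketch: why multicast TTL matters for crossing subnets.
# A TTL of 1 (the OS default for multicast) keeps datagrams on the local
# segment; each router hop decrements the TTL, so TTL=10 allows up to ~10
# hops, provided the routers actually forward multicast at all.
import socket
import struct

GROUP = "239.1.2.3"   # hypothetical administratively-scoped group
PORT = 5004           # hypothetical port

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# Raise the multicast TTL from the default of 1 so routers may forward it.
sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL,
                struct.pack("b", 10))

ttl = sock.getsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL)
print(ttl)  # 10
```

Even with the TTL raised, whether the stream actually crosses subnets depends on the network, not the sender, which is consistent with what Magewell support described.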


Sony ZV-1: the perfect vlogging camera?

By: Thomas

It's nothing new: the current trend is to film, broadcast and share as much as possible about our lives, our activities or our work. Thousands of videos are thus available on YouTube, Instagram, TikTok and so on. These videos are getting better and better, and this is in […]

Output:TargetDisplay: Display01: 29 May 2020 ONLINE

By: Marco Savo

Espronceda Institute of Art and Culture presents this virtual reality exhibition broadcast live on Espronceda social networks.


In the exhibition, the Iranian artist Mohsen Hazrati will describe his experience of the Immense VR / AR Residence 2020 at Espronceda.


After the first stage of his artist residency, the pandemic outbreak forced the audiovisual artist to delay his return to Iran and continue developing research and projects in Barcelona.

During the audiovisual event the artist will describe his virtual talk in the RTTT project of the IAM weekend 2020.


The talk revolves around the concepts of reflections and mirrors in Iranian literature, and how the RTT (render to texture) technique in 3D game engines could translate these concepts into experiences.

In addition, Hazrati explores his speech-based art game project SOKHON. In these game-art experiences, the player's voice can alter the game process.




Facebook | Instagram | Twitter

The post Output:TargetDisplay: Display01: 29 May 2020 ONLINE appeared first on Audiovisualcity.

An iPhone 3GS prototype on eBay

Spotted on eBay: an iPhone 3GS prototype. Sold for €230, it has nothing visually remarkable about it.

The one notable detail is that the device runs SwitchBoard, Apple's internal testing interface. For the rest, it is apparently a fairly standard 16 GB iPhone 3GS.

Under SwitchBoard

Nothing special

PHABRIX successfully passes ST 2110 and AMWA NMOS TR-1001-1 JT-NM Testing

(May 2020) — PHABRIX is pleased to announce that its Qx rasterizer and handheld Sx TAG were successfully self-tested against the criteria for AMWA NMOS TR-1001-1 and SMPTE 2110 at the recent JT-NM Spring 2020 self-test. The JT-NM Tested Program, designed by the Joint Task Force on Networked Media (JT-NM), offers prospective purchasers of IP-based ...

The post PHABRIX successfully passes ST 2110 and AMWA NMOS TR-1001-1 JT-NM Testing appeared first on NAB Show News | 2020 NAB Show Media Partner and Producer of NAB Show LIVE. Broadcast Engineering News.

NDI Camera questions

By: HughC
Last week, I did a very successful presentation of my new live performance initiative, INVITATION – a Socially Distant Performance Installation.

I used Isadora to control it all and got the live feed from my iPhone using the NewTek NDI app, which seems to be an older app no longer offered by NewTek. I've used it in previous performances and never had any trouble with it, and indeed this performance went smoothly for the most part. I was told that there were a few jittery moments, which I didn't see because I was in the show.

I should mention that the wireless network is a dedicated AC3900 router with an extender, one node at each end of the performance space, which is a house, so there are some walls and so on.

This Covid-related installation will be ongoing on Friday nights for the next little while, and in an effort to grow, we've been experimenting with TWO NDI cameras, both the same model of iPhone X. When I went to the NewTek website to get the app onto my partner's iPhone, I learned that the app I have has been replaced by NDI®|HX, which offers 4K and all sorts of other new features. In comparing the two before last week's performance, I found that my phone with the old app was smooth and worked well, while her phone was unacceptably jittery, even when we dialed down the resolution. No problem; we used my phone for the show with the results mentioned above.

Today we tested again with the idea of using BOTH phones at the same time. Using the NewTek apps, both phones were jittery. I decided to try another app on her phone and found JustWifiCam. Now her phone is perfectly smooth, but mine is unacceptably jittery. I tried JustWifiCam on both phones with the same result: she's smooth, but I'm not. I have another app called NDICam, which hasn't worked in the past but does work now, though with the same jittery result.

FYI, Isadora doesn't seem to have a problem handling the two NDI inputs. The load is about 70% on my laptop, which is a bit underpowered for this, but works.

I wonder if anyone has any ideas as to why my phone was fine last week but not this week? Even with only one phone connected, her phone is smooth (using JustWifiCam) and mine is jittery (using anything).


Hugh in Winnipeg - All test machines, Win10 Pro, 64 bit, OS SSD and separate data SSD.

this laptop: Dell5520 2018, i7 7820HQ; 32 gigs, Quadro M1200 (and Iris 630)...

NDI scanner RoI

By:
Is there a way to save a specific RoI, and have the scanner start up on that region?

video stuttering and audio sync issues with OBS

By: jwine
I should preface this by saying that I've got a technology day job but am completely new to video production/streaming.

I'm helping set up video streaming for the church I attend. NDI looks great and we've had some reasonable first tests, but I'm having a couple of issues and I'd like to find a way to resolve them so I can be more comfortable spending a fair bit of money on a permanent setup like a PTZ camera. Our current testing setup uses the NDI HX Camera app on an iPhone XS with a Lightning Ethernet adapter, wired to a MacBook Pro (2018, 16GB RAM, 2.6GHz i7) running OBS 25 with the NDI plugin. We're also using the NDI Tools Scan Converter on a Windows laptop that runs our presentation from the podium, so OBS is able to capture the presentation and video of the speaker, and we can do PIP-style scenes where appropriate. Audio comes through our sound system to a USB capture device.

The problems we've seen are a bit of periodic video stuttering and audio that is not easily synchronized to the video. The audio is the bigger problem right now, as the delay value seems to drift around for unknown reasons. For example, during some testing we arrived at a 750 ms delay which seemed to match the video perfectly. However, the next day when we were live streaming an actual service, it started at 900 ms and then drifted around to 600 ms, 700 ms, 800 ms. I had to make those changes during the stream, which introduces a brief audio glitch, not to mention it being hard to guess the right value. FWIW, we had also tested taking audio directly from the iPhone's NDI source, but that was also delayed and/or unusably glitchy. That setup isn't a viable end target anyway; we were just testing things.

In retrospect, I realized that we were streaming the highest quality from the iPhone but scaling it way down in OBS since we're only streaming 720p to Facebook. I can see how all that extra data would require additional processing to receive and scale down, so I'm going to test using one of the lower quality settings in the NDI HX Camera app. I'm dubious that would cause all of the issues though, given the hardware we're using. Further, OBS was only reporting 20% CPU usage and a couple hundred dropped frames throughout a 2 hour stream, so it didn't seem too taxed from those metrics.

Are these types of issues normal/expected? Is there anything else I should check or consider? Are there other setups (other software, USB camera, USB HDMI capture card) that would be more stable or consistent? We've got a team of volunteers, so we can't be constantly changing the audio delay parameter, or even testing it before every service. Can we get to a point where someone just starts the stream and flips scenes as needed, or is that unrealistic?
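As an aside on pinning down the delay value: rather than guessing, one can record the same transient (for example a clap) on both the video path and the mixer path and find the lag that best aligns the two waveforms by cross-correlation. A minimal pure-Python sketch with synthetic signals; real use would read two WAV captures, and the 37-sample delay here is fabricated purely to demonstrate the technique:

```python
# Sketch: estimate an audio offset by cross-correlation (pure Python).
# Idea: record the same transient (e.g. a clapperboard clap) on the video's
# embedded audio and on the mixer feed, then find the lag that best aligns
# the two waveforms. Real recordings would be read from WAV files; here we
# fabricate a reference pulse and a delayed copy.

def best_lag(reference, delayed, max_lag):
    """Return the lag (in samples) at which `delayed` best matches `reference`."""
    best, best_score = 0, float("-inf")
    for lag in range(max_lag + 1):
        # Dot product of the reference against the delayed signal shifted by `lag`.
        score = sum(r * d for r, d in zip(reference, delayed[lag:]))
        if score > best_score:
            best, best_score = lag, score
    return best

SAMPLE_RATE = 1000                 # 1 kHz for this toy example
pulse = [0.0] * 100
pulse[10:20] = [1.0] * 10          # a short "clap"
delayed = [0.0] * 37 + pulse       # same clap, 37 samples later

lag = best_lag(pulse, delayed, max_lag=80)
print(lag, "samples =", 1000 * lag // SAMPLE_RATE, "ms")  # 37 samples = 37 ms
```

This only measures the offset at one moment; if the delay genuinely drifts over a service, as described above, the root cause (clock drift between capture devices) still needs to be addressed at the hardware or driver level.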

thanks for any help or guidance!

Baltimore Hospital reaches Health Care Workers Through Live Streaming with Teradek

(May 28, 2020) — Martin Jenoff is dedicated to getting the news out around Baltimore. He shoots and edits at the local network news station two days a week, presides over the local videographers group, and with his own Focal Point Productions, has been providing live streaming services for nearly a decade. He says, “Back ...

The post Baltimore Hospital reaches Health Care Workers Through Live Streaming with Teradek appeared first on NAB Show News | 2020 NAB Show Media Partner and Producer of NAB Show LIVE. Broadcast Engineering News.

Seeking Advice - Streaming Live Music and Theater Events

By: tbarden
We're looking to install multi-camera streaming/capture capabilities in our small 80-seat theater. We want the production value of the events to be good or better. For many reasons, the idea of an NDI-based solution really appeals to me, but I've been getting feedback from some corners suggesting that NDI is not yet ready for prime time; they are advocating for SDI instead. I don't want to invest 40K in something that's not going to do the job. Any advice is greatly appreciated.


iPhone is not recognized as NDI Source (NDI HX Camera & NDI Tools 4.0 for Windows)

By: ryoku
Hello, everyone,

I have the following problem and I'm hoping that somebody can help me.

I would like to use my iPhone 11 as a webcam on my Windows notebook. I have installed the NDI Tools 4.0 on my notebook and downloaded the NDI HX Camera App.

Unfortunately, the iPhone is not displayed in either the Studio Monitor or Virtual Input ("No NDI Source found").

- Both devices are on the same Wi-Fi network

- The Windows Firewall allows incoming and outgoing connections to the NDI Tools

- The Wi-Fi network is set to private

- The NDI Button in the NDI HX Camera App is blue

Does anyone have any idea what the problem might be?
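One low-level check worth trying, assuming the usual NDI discovery mechanism: NDI sources announce themselves over mDNS using the service type `_ndi._tcp.local.`, so if mDNS traffic cannot flow between the phone and the notebook, no source will ever appear. The sketch below builds a standard mDNS PTR query for that service with the Python stdlib; actually sending it to `224.0.0.251:5353` and listening for a reply is omitted to keep the example self-contained:

```python
# Sketch: build an mDNS PTR query for the NDI service type with the stdlib.
# Sending this packet to 224.0.0.251:5353 and listening for responses would
# reveal whether discovery traffic can flow on the Wi-Fi network at all.
import struct

def build_mdns_ptr_query(service: str) -> bytes:
    """Build a standard mDNS query asking for PTR records of `service`."""
    # DNS header: ID=0, flags=0 (standard query), QDCOUNT=1, no other records.
    header = struct.pack(">HHHHHH", 0, 0, 1, 0, 0, 0)
    # Encode the name as length-prefixed labels, terminated by a zero byte.
    qname = b"".join(
        struct.pack("B", len(label)) + label.encode("ascii")
        for label in service.rstrip(".").split(".")
    ) + b"\x00"
    question = qname + struct.pack(">HH", 12, 1)  # QTYPE=PTR(12), QCLASS=IN(1)
    return header + question

packet = build_mdns_ptr_query("_ndi._tcp.local.")
print(len(packet))  # 33
```

If no PTR responses come back for `_ndi._tcp.local.` while the app's NDI button is blue, the problem is likely multicast/mDNS filtering on the router or access point rather than the firewall rules on the notebook.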

Reverb G2: HP unveils its new VR headset

By: Gwendal P

HP presents the successor to its Reverb virtual reality headset, soberly named G2. The manufacturer had revealed its outline in late March with a rather dark preview. Today the veil is fully lifted on this PCVR headset from the Windows Mixed Reality range.

To talk about the Reverb G2, it must first be explained that this product is the fruit of a collaboration between HP, Microsoft and Valve. It is above all the latter that brought its expertise and helped Hewlett-Packard deliver a "no compromise" headset. So while it keeps some elements of the G1, including the inside-out tracking, it improves many others. Note that Microsoft handles the tracking and 6DoF.

HP particularly wanted to focus on the display, which improves with LCD screens at a resolution of 2160 x 2160 per eye. The field of view does not change, however, and remains 114°. The refresh rate of these screens is 90 Hz.

The Reverb G2 borrows the Valve Index's speakers

On the audio side, it may not have escaped you, as their design is unchanged: the pair of speakers is exactly the same as that of the Valve Index. The Reverb G2 thus borrows its cousin's acoustics. Indeed, the speakers do not touch the ears but sit 10 mm away from them. Moreover, they deliver spatial audio that automatically steers the sound to create an even stronger sense of immersion.

HP was also keen to focus on the headset's comfort. This is why the weight distribution has been reworked, as have the quality and density of the foam padding. The headset tips the scales at 550 g. It also integrates a lens-spacing (IPD) adjustment system to better fit every face shape. The cable supplied in the box to connect it to the PC is 6 metres long and thinner than on the previous generation.

G2 availability in autumn 2020

The Reverb G2 is now equipped with 4 cameras for better movement tracking. The controllers have evolved and adopt the more generic construction found at other brands; the buttons are accordingly renamed A, B, X and Y. HP announces that the controllers should pair more quickly over Bluetooth. The headset does not, however, support hand tracking.

With this Reverb G2, HP focuses on comfort and image quality to offer gamers a good experience, as they are its priority target. But not only them: as recent events show, virtual reality is also an excellent space for creation, education and collaboration in general.

The HP Reverb G2 will be available for pre-order in France from July, with delivery expected from September. Its price is 599 euros excluding VAT.

This article Reverb G2: HP unveils its new VR headset was published on Réalité

NDI and VT5 with OBS/Streaming


Out of curiosity, is there a way to enable the legacy VT5 as a NDI source?

The VT5 is on Win7 machine and I am planning to use a Macbook Pro as the OBS/streaming machine, same network, same house, different room (mainly because the VT5 machine is noisy).

Would the NDI software tools allow this? (it seems that NDI is a more advanced version of iVGA).

Or would I need a couple of converters (VT5 analog to SDI/HDMI, then SDI/HDMI to NDI), like the NewTek Connect Spark?


DejaSoft launches DejaEdit Version 3 –

Par : RadianceC

DejaEdit Version 3 takes file synchronizing to a new level with an enhanced user experience, additional security features and improved workflow management. For Immediate Release, 28 May 2020, Gothenburg, Sweden – Dynamic tech start-up DejaSoft has launched Version 3 of its cutting-edge media file synchronising software tool, DejaEdit. An application vital for film productions, DejaEdit ...

The post DejaSoft launches DejaEdit Version 3 – appeared first on NAB Show News | 2020 NAB Show Media Partner and Producer of NAB Show LIVE. Broadcast Engineering News.

The Best Creative Advice for Career Videographers

Here's the simple phrase that has helped me through my entire creative and video production career and saved me hours of frustration. Take a look!

The European drone law postponed to January 2021

By: Thomas

As you have certainly heard, following the Covid-19 epidemic, the rollout of the new European legislation has been postponed. Regulation (EU) 2019/947 was initially due to take effect on 1 July 2020, but the European Commission has pushed this date back to 1 January 2021. French law will therefore remain in force […]

How to Remove the Boom Mic in Resolve and Premiere

Let's take a look at two different programs, Adobe Premiere and DaVinci Resolve, and how you can use them to get rid of unwanted objects.

Cinegy announces TURBOCUT – Making Editing with Adobe Premiere Faster than Ever

Munich, Germany, 28 May 2020 – Cinegy today announced TURBOCUT, a new Adobe CC plug-in which significantly accelerates the editing of H.264/HEVC by utilising the NVIDIA GPU’s hardware decoder. This announcement coincides with Adobe releasing version 14.2 of Adobe Premiere with several new features added, but still missing NVIDIA hardware accelerated editing and using the NVIDIA ...

The post Cinegy announces TURBOCUT – Making Editing with Adobe Premiere Faster than Ever appeared first on NAB Show News | 2020 NAB Show Media Partner and Producer of NAB Show LIVE. Broadcast Engineering News.

Yamaha System Designers Conference: immersive audio by leaps and bounds

By: admin

The latest edition of the System Designers Conference, the indispensable annual event organised by Yamaha, held this time in Prague, gave a glimpse of what tomorrow's in-room sound could be. It made us want to dream up an improbable, unstoppable and unusable tool! Read more

The article Yamaha System Designers Conference: immersive audio by leaps and bounds appeared first on SoundLightUp.

Typing a ² with an Apple keyboard

For (really) many years one thing has annoyed me: Mac keyboard layouts offer no key for the superscript two, ². It is quite irritating, because from time to time I need to talk about mm² and enter this character, which is present on PC keyboards. But there is a solution.

The basic solution is to use Show Emoji & Symbols and search for the ² (adding it to favourites).

Shortcuts

The other solution, which I discovered by chance, works quite well for entering several in a row: simply add the French – PC layout in System Preferences -> Keyboard -> Input Sources. Then, when you need to type a ², switch to the PC layout in the menu bar, press the @ key (it will produce a ²) and switch back to the classic Apple layout. It is fairly quick once you are used to it, as long as you don't type ² too often. Beware: this does not work on older versions of the OS, as the PC layout dates from Mavericks in 2013.

The French – PC option

Otherwise, you can of course simply use a PC keyboard…

Showtime’s Penny Dreadful: City of Angels Brings 1938 Los Angeles to Viewers’ Homes with Cooke S4/i Primes

(May 2020) – When it came time for cinematographer John Conroy to develop the look for Showtime’s Penny Dreadful: City of Angels spin-off, he already had eight episodes of lensing the original Penny Dreadful under his belt. A major part of the look for the spin-off would come from using Cooke Optics’ S4/i prime lenses ...

The post Showtime’s Penny Dreadful: City of Angels Brings 1938 Los Angeles to Viewers’ Homes with Cooke S4/i Primes appeared first on NAB Show News | 2020 NAB Show Media Partner and Producer of NAB Show LIVE. Broadcast Engineering News.

Broadcast Pix announces new office location

Broadcast Pix, the award-winning live broadcasting and streaming company, has moved to new offices located at 141 Middlesex Road in Tyngsboro, Massachusetts. “It’s exciting! Our new location is more in tune with Broadcast Pix 2.0 – more office, meeting, and thinking space with less physical manufacturing. As we transition to delivering more software-based streaming appliances, ...

The post Broadcast Pix announces new office location appeared first on NAB Show News | 2020 NAB Show Media Partner and Producer of NAB Show LIVE. Broadcast Engineering News.

Lisa Park

Lisa Park is a multidisciplinary artist based in New York and South Korea.


She is best known for her work with biofeedback devices, such as heart-rate and brainwave sensors, used to express invisible biological signals and emotions as auditory and visual representations. In creating art installations and performances using sensor technology, she strives to explore the importance of human relationships and connections.

Park is a recipient of the New York Foundation for the Arts Fellowship.


Her works have been featured by Art21, Artnet, The Creators Project, New York Times magazine, Wired, PBS, Time Out NY, the New York Post, and through many other media outlets.

She received her BFA in Fine Arts from Art Center College of Design and her master's from the Interactive Telecommunications Program at New York University's Tisch School of the Arts.

One of her interactive audiovisual installations that left us fascinated is “Blooming”.


It highlights the importance of human presence and physical connection in our lives. It cannot bloom alone; it blooms only through the relationship between people. In response to participants' skin-to-skin contact, heart rate and gestures, “Blooming” blossoms according to their intimacy. As audience members hold hands or embrace, the digital cherry tree's flowers bloom and scatter.



Facebook | Instagram | Vimeo

The post Lisa Park appeared first on Audiovisualcity.

6 Reasons for You to Upgrade to Resolve Studio Today

DaVinci Resolve is unique among post-production applications in that it's free. But here are six reasons why you should upgrade to its paid version.

From the day before yesterday — Your RSS feeds

USD in practice: the Luma Pictures case

By: Shadows

We have already had occasion to discuss USD (Universal Scene Description) several times; it facilitates data exchange within a studio or a production and offers many theoretical advantages over classic techniques. Our colleague Ian Failes offers a concrete case of USD in use at the visual effects studio Luma Pictures. An article looks back at the implementation of USD within the company (in particular in their lighting and lookdev pipeline built on Katana), how and why this rollout happened, and also how USD was used on recent projects such as Spider-Man: Far From Home.
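To give a flavour of what USD's layered data exchange looks like in practice, here is a minimal, hypothetical `.usda` layer (the file name and prim name are invented for illustration): a downstream department can reference an upstream asset and add its own opinions, such as a transform override, without ever modifying the referenced file:

```
#usda 1.0
(
    defaultPrim = "Hero"
)

def Xform "Hero" (
    # Reference the asset published by another department; lighting/lookdev
    # layers can then add opinions on top without touching the modelling file.
    references = @./hero_model.usda@
)
{
    # A stronger layer overrides attributes non-destructively:
    double3 xformOp:translate = (0, 0, 10)
    uniform token[] xformOpOrder = ["xformOp:translate"]
}
```

This non-destructive composition is the core of the advantage the article describes: each department contributes a layer, and USD resolves the strongest opinion at load time.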

The article USD in practice: the Luma Pictures case appeared first on 3DVF.

Emilia: a Raspberry Pi case for Amiga fans

Simon Bachman is a true Amiga fan, and it shows in his work: as a game developer, he worked on Pathways, a title released in 2019 that is redolent of Commodore-era pixels. This passion has also led him to work on more personal projects.

On receiving his Prusa 3D printer, he began developing an original case for the Raspberry Pi. Named Emilia, the case is designed to house a 60% keyboard, a model more compact than a classic keyboard but with a real Cherry MX mechanism. He used a Vortex Pok3r model that is unavailable in France, but plenty of equivalent models can be found on the market.

After a series of prints, a few good sheets of sandpaper and some solid coats of paint, his case was ready to receive some electronics.

The choice naturally fell on a Raspberry Pi, which is integrated directly into the Emilia's chassis.

The result is a self-contained solution that picks up the design cues of the Amiga 600 and Amiga 1200. You then just need an Amiga emulator such as Happiga, designed for Raspberry Pi boards, to recover the whole atmosphere of those machines, as well as their very distinctive software library.

He has since published a new series of photos of a new version of the device, this time in red and white, for a machine entirely dedicated to gaming. It runs an Amiga emulator on Raspbian, and the result is stunning. If you knew titles like Turrican 3 back in the day, the touch of nostalgia combined with the relevance of this new design cannot fail to win you over.

The very good news about this project is that Simon has said that once the elements of his design are finished, he intends to share them. That may take a little while, but you can follow his Twitter account to stay informed.

Thanks to @Cafeine for the tip <3.

Emilia: a Raspberry Pi case for Amiga fans © 2020.