
We pick the brains of Cinema.AV on his beautiful video synth work

By Hayley Cantor

These days AV artists are hiding out all over the place. This time curiosity didn’t kill the cat: I stumbled upon the work of Cinema.AV on Instagram. It’s amazing where a hashtag can take you… #videosynth. I was keen to find out how someone so visually analog ends up that way, and how they manage in an ever-expanding digital world (at the time of writing, more so than ever).

1. Tell us about your first ever live gig. When was it and how did it go?

For years, I used to play a kind of ambient, soundscape style of music, and for live performance I would put whatever found VHS tape behind me for visuals, often without a screen. It often just turned into lighting for my performance instead of clear-cut visuals.

Fast-forward a couple of years, to summer 2015, when I started buying JVC video mixers and Archer and Vidicraft boxes. It was here that I took it upon myself to do visuals for a show I had booked. Sadly, I didn’t realize the projector couldn’t handle the distorted signals I was throwing at it. Luckily, someone let me borrow theirs at the last moment. It was a total godsend. The result was this hyper-distorted blend of National Geographic videotapes. It worked for the more abstract psychedelia I had booked for the evening.

Later down the line, I found the need for time base correctors in live performance, and mixers equipped with them, to blend evenly rather than abruptly with one of those RCA Y-splitter cables turned on end (which is actually the same as the classic Klomp dirty mixer). It was all stuff I got for free, or for nearly no money. Never top of the line. Always the most difficult, least practical solutions. But the result was always unique to the moment, to the performance; endlessly fleeting.

2. We discovered your work on Instagram. How do you usually connect with the AV community online? Does social media play a big role for you?

Strangely, yeah. I hardly ever go out locally unless, of course, I’m playing a show. So beyond that setting, you’ll never find me in the wild. Even before this quarantine action, I was a total homebody, staying in whenever possible to work on art and infinitely explore the machines. So having access to social media platforms is actually key to the whole system. I can actively gauge what pieces people actually like, what ideas stick, and in turn get shared with a larger audience.

It’s those posts that snowball into bigger and better gigs, as recognition on a global scale is significantly more gratifying than just the local attention I so often receive. In fact, for the better part of 2019, I was very busy with live video work. Having nearly no time off, I accepted this as a lifestyle rather than just a hobby. And in the social media zone, I’ve been able to publicly beta-test things like the Erogenous Tones Structure module, Phil Baljeu’s custom vector graphics system, and as of late, Andrei Jay’s latest Raspberry Pi video synth and feedback algorithms as hardware devices. The curiosity the results generated has, in turn, sold modules and made the manufacturers money to sustain their efforts.

…having access to social media platforms is actually key to the whole system. I can actively gauge what pieces people actually like, what ideas stick and in turn, get shared with a larger audience. 

To be fair though, I’m not sure how much of this is actually real, if it’s all made up, or the reactions are fabricated. It’s a fine line we’re all walking. One day a post could generate hundreds of interactions, while the next day, nothing. I think a lot of that could actually be the option for folks to drift between realities, between the physical and cyberspace. It’s in this cyberspace that I often connect with other artists, say for example my bud Konopka and his online video painting series. To watch him create something entirely from scratch, in real time, thousands of miles away, is a true head-spin if you think about it. Not even five years ago would that have been possible.

All photos courtesy of Evan Henry.

3. It’s fascinating how analog and digital worlds inspire AV artists. What’s your take on the two, and how do you find working with analog systems for live visuals?

Truly. When I first got started, it was all analog, all found devices. In time, though, I found the whole LZX modular zone, which started analog and has now drifted into this wild digital hardware dimension that has opened up all kinds of doors. The obvious attraction of the large analog modular is the physicality and pure intuitive nature of the whole thing. In a live setting, there is nothing more fun and unpredictable than a hand-wired mess of cables and devices creating this ever-fleeting dialogue, never again to be replicated. For ambient, for house, for techno, and literally everything in between, there’s this infinite body that just works, and often never crashes or fails.

If anything, it’s always the digital component that freezes or fails first. I’ve done shows with computer artists who, for some reason or another, just can’t make it work that particular night. So I just step in and end up taking over the evening with my system. However, I’ve had my fair share of venues tell me their systems are HDMI only. So I learned to convert the analog composite outputs of the modular to HDMI with the aid of things like Ambery converters and scalers, Extron scalers, and even the silly Blackmagic shuttle, which has its own share of issues. It wasn’t until last summer that I realized the Roland V4-EX had a very effective means of conversion and scaling to HDMI, VGA, and back down again. The result was a total game-changer. So I sold my other mixers and devices to scale up to HDMI and haven’t looked back. This meant I could seamlessly work with digital projection systems and streaming processes, and from the get-go it’s been used in every performance effort since. It’s even let me collaborate with both digital and analog artists alike, to fade and key between all manner of artists and ideas.

So little things like that make the whole system go, which leads me into the question…

4. What’s your basic setup when you perform live AV shows? (If you have one)

I am constantly pushing myself as an artist. So every year or so, I’ll experience this major creative shift around wintertime, when my job at the photo lab on campus temporarily shutters for winter break. It is then that I have about a month to chill and regroup my mind. This generally means some new gear enters the studio, and in turn the dirty warehouses it gets thrown into for live work.


In 2019, my modular system grew from a single 6U, two-row case that could fly on any airline to a larger 12U, four-row system that, for the most part, made its way into every gig. In tandem with the V4-EX, the two were all I needed to do 8-10 hours of a rave or whatever else I was getting booked for. However, the few times I flew out for one-offs, I brought it back down to 6U, which was a lot of fun and lent itself to collaboration with other artists. It was in this time, though, away from gigs and in the rather chill moments at the lab, that I began to experiment with the virtual dimension of VSynth, the Max/MSP visual extension. The result was very reminiscent of my larger modular system, though at the time, my computer could only handle small patches. Anything big would see it begin to overheat and grind to a halt.

This got me looking at computers seriously, as a video generation and manipulation tool, much in the same way the dedicated hardware was, but on a larger, more sophisticated, and recallable level. It was months of research and a very generous donation within the family that led me to a gaming-oriented laptop, complete with a dedicated graphics card, that in its day was considered high-spec and miles beyond my aging MacBook. From the moment I lifted open the box and got it booted, I went straight into complex Max patches and dense 3D structures with the aid of Resolume Arena. When I realized I could save and recall every motion, I started plotting how to gig with it: to layer pieces together and to treat Resolume as a video sampler of my analog devices. What began to happen was a meshing of dimensions. No longer was one any better than the other; they were one and the same. It was with this entry that live performance physically became less stressful and far more manageable. No longer did I have to carry this unwieldy modular system on a train or a bus. I could now discreetly carry a common laptop computer, just like everyone else.


Setting up and breaking down, with the projector, is a two-cable, two-power-supply motion. So quick and so light. With the aid of a MIDI controller, all the tactility remains, and nothing changes. The digital results do look incredible, though; I cannot deny that. No matter what I have, I make the best of all of it. For touring in 2020, my setup is just that. I did some dates with Steve Hauschildt and Telefon Tel Aviv across Texas, and the process was so smooth. Same for the brief efforts with LLORA and BATHHOUSE just weeks ago. So much less to think about, all with the same manipulations and motions.

5. What would be your dream AV gig?

Currently speaking, the dream is still to tour, to travel, and to do large-scale art installations with my video work. I had things lined up, but those have all fallen through because of the current pandemic. But that’s honestly not going to hold anyone back for long. These things will all still happen, just not as soon as I had anticipated. I was truthfully hoping to break into the festival dimension (Mutek, Movement, Sonar, Aurora), as on a live scale that feels like the next big move, amidst touring through the theaters and dedicated art spaces. I’ve had tastes of all those, but like anyone serious about their craft, I want to go further and really make a name for myself, as truly, I don’t know what else to do.

Find out more about Cinema.AV on his artist page

The post We pick the brains of Cinema.AV on his beautiful video synth work appeared first on Audiovisualcity.

NDI HX Camera (iPhone) with OBS (Mac OS 10.15.4)

By skr
I've just installed NDI HX Camera and want to use it as an input source for OBS Studio 24.06.

The only way I've figured out to get the stream into OBS Studio is by launching NewTek NDI Video Monitor 3.1 (28) and then choosing "Window" as the source in OBS. That works fine so far, but I wonder if there is a way to get the stream directly into OBS, without using Video Monitor?

"Secure" NDI Streams

By joshmarcik
Looking for a way to limit what a user can see in Access Manager/Studio Monitor.

Unicast addresses in Remote Sources work fine, but we have content security concerns: someone who has access one day shouldn't have it the next, and we cannot constantly change IP addresses to accommodate that.

Anyone with similar needs and solutions?

Blueberry or Bondi Blue? The mouse dilemma

By Pierre Dandumont

Recently, I started taking inventory of my stock of odds and ends (lockdown leaves a bit of free time after work). And in one of the boxes of mice, the one with the colored Apple mice, I ran into a problem. Which one is Bondi Blue, and which one is Blueberry?

(No April Fools’ joke this year: I didn’t have a bright idea, and it’s not really the time for silly pranks.)

The blue of the first iMac (Bondi Blue) and that of the third-generation iMacs (Blueberry) are very close, and telling the two apart is complicated. But there are two points that can settle it.

A Blueberry

A Bondi Blue

The first step can identify a mouse as a Blueberry. The Apple mouse was criticized for its shape, and at some point Apple decided to add a small notch on the button to let the user find the mouse’s orientation more easily by feel. If this notch is present, the mouse is necessarily a Blueberry. The technique has one flaw: there are Blueberries without the notch. So if the mouse has it, it’s a Blueberry. If it doesn’t… it’s either a Blueberry or a Bondi Blue.

A Bondi Blue, a Blueberry

Two Strawberries, different

The second step, I saw it mentioned elsewhere. Apple changed the ball between the first model (Bondi Blue) and the others. In the first, the ball is two-tone, Bondi Blue and white. In the others, the ball is grey and white. The site in question says the opposite, but my personal experience (and this 1998 test of the iMac) shows that this is the right answer. Or, let’s say that my only mouse with a Bondi Blue ball is a Bondi Blue mouse, and all the others have the grey version.

The two balls

In short, if the mouse has a colored ball and no notch, it’s most likely a Bondi Blue. If it has a notch, it’s necessarily a Blueberry.
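The two checks above amount to a small decision procedure. As a sketch (the function and argument names are mine, not Apple’s):

```python
def classify_apple_mouse(has_notch: bool, ball_is_blue: bool) -> str:
    """Classify a round blue Apple USB mouse from the two visible clues."""
    if has_notch:
        # The orientation notch was only ever added to Blueberry mice.
        return "Blueberry"
    if ball_is_blue:
        # Only the first model (Bondi Blue) shipped a blue-and-white ball.
        return "Bondi Blue"
    # No notch and a grey ball: most likely an early, notch-less Blueberry.
    return "Blueberry"
```

For example, `classify_apple_mouse(has_notch=False, ball_is_blue=True)` points at a Bondi Blue, matching the summary above.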

Cinema AV (AKA Evan Henry)

By Hayley Cantor

Evan Henry, from Dallas, Texas, is a truly multidisciplinary AV artist who primarily works visually under the artistic name Cinema AV, but who is also known to write ambient music scores with both analog and digital synthesizers. His work embraces both analog and digital setups, with his main interest being the visual representation of sound.

What began as a love of photography, cinema, and found footage grew into something much greater when, in 2015, Evan was introduced to video circuit-bending and once-obsolete video electronics. Using these pieces in a live performance setting was always his goal, and from the get-go, Tachyons boxes, VCRs, and video mixers turned into buying a used Gieskes 3TrinsRGB+1c standalone video synthesizer, building its expanders, and just over a year later, the LZX Cadet and Castle lines of DIY Eurorack modules.

From there, video art went beyond a hobby to a complete way of life. Reliant on live performance, he plays gigs relentlessly for both local and touring artists alike. In 2018, he joined Ghostly Intl.’s Steve Hauschildt on a tour through the East Coast and Canada. He became the resident visual artist for Proton Limited in Dallas, Texas in 2019. These moves set the stage for a constantly evolving presence in the live visual dimension.

Cinema AV’s work extends to instant and 35mm film renderings and has appeared in galleries and pop-ups throughout North Texas. When not playing live or coordinating visuals for Dallas Ambient Music Nights, Evan is occasionally writing or building a set of modules for fellow artists.

The result is an infinitely growing body of work that, in the last few years, has expanded into largely digital dimensions in Resolume Arena and Max/MSP.

Website | Instagram | Facebook | Youtube | Bandcamp | Soundcloud | Vimeo

The post Cinema AV (AKA Evan Henry) appeared first on Audiovisualcity.

Scan Converter using a lot of CPU resources

By 4xStreamer
I am getting about 22-30% CPU usage from Scan Converter on my PC. What can I do to drop the resource requirement? It stays at the same level even if I use region capture. I thought this was supposed to use only ~6%.

PC stats:

i7-7700K @ 4.2 GHz
GTX 1070
3 - 1080p monitors
1 - 1080p webcam

I'm capturing at 60fps. Also, is there some way to make it only capture from one monitor? ROI doesn't capture the whole screen.

Note: On one of my older computers, which has an i5-2400, it uses 30-40% CPU.

MIX 3 & 4 are missing

By Muzzman
I have just updated our TC-1 to Build 7-1-200210C and realized that MIX 3 and MIX 4 are no longer available in the "Source Input" drop-down list in the TC-1 NDI selection. Only MIX 1 and MIX 2 are available, and those are needed for recording.
I require the DDRs to return as an input so I can use Video Delay on each source (for timing), as all the audio is done externally through a Yamaha mixer, including the DDRs.
I am able to see MIX 3 and MIX 4 on the Output configuration page, where I have selected DDR 1 and DDR 2, but it seems that with this firmware update they are no longer available as inputs, as described above.
Is this a bug with this firmware, or is there anything else I am missing? It was fine before this firmware change.

Video stuttering and audio dropouts

By davidcbny
I've been working with NDI for a while, but one persistent issue is video stuttering and/or audio dropouts. We have the TC 460, the Mini HD-SDI, some PTZ cameras, and two Sparks. The stuttering happens regardless of the hardware combination. Right now, I'm using my laptop with the NDI Tools 4 Scan Converter and receiving with the TC Mini.

NewTek NDI Camera is not NDI|HX Camera? (IOS)

By JoLa
I bought NewTek NDI Camera. Now I only see NDI|HX Camera in the App Store (iOS), and the App Store wants another 235 SEK for it. Has NewTek changed the name to be able to charge me again? Or what have I missed?

Thank you to Newtek.

By kidgkodiak
Just wanted to give a big round of applause and thanks for the hard work of each and every one of the people at NewTek on the latest NDI release today. It came at the perfect time, and while the NewTek staff here are usually hounded by questions, I hope this message gets to your team. Again, thank you for doing all that you are doing to keep us (and definitely those of us working from home) afloat during these hard times.

Cheers to your team, and stay safe folks.

NDI 4.5

By kanep
NDI Tools 4.5 has been released for Windows and Macintosh, along with an updated NDI Analysis tool and SDK, and the Unreal Engine NDI SDK is now available.

Here is a (very) partial list of improvements from the SDK docs. This release is about under-the-hood improvements; the NDI Tools operate as they did before.

* Significant improvement to the UDP sender that should dramatically improve performance on many systems and networks. This should reduce CPU time and also achieve far better network efficiency.

* At 1080p60 the number of channels of decoding possible on Windows should increase when you have a larger number of cores available.

* Lower latency in all forms of HX decoding. This is at least 2 frames better in almost all cases and in many it will be better than that.

* Vastly improved startup discovery performance on Windows.

* Quite a large improvement to codec quality.

* Fix for a problem with right-clicking when using Scan Converter and KVM.

* NDI now uses predictable port numbers in all cases so that firewall rules can be properly built.
- Each connection uses TCP ports 5960 and up.
- Each mTCP or UDP receiver uses ports 6960 and up.
- Each mTCP or UDP sender uses ports 7960 and up.
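Since the port allocation above is deterministic, a receiver can sanity-check a sender's firewall with a plain TCP probe. A minimal sketch (the host and the number of ports probed are assumptions for the example, and this only covers the TCP range, not the UDP ones):

```python
import socket

def check_ndi_ports(host: str, base: int = 5960, count: int = 4,
                    timeout: float = 0.5) -> list[int]:
    """Try a TCP connect to `count` ports starting at `base`; return the open ones."""
    open_ports = []
    for port in range(base, base + count):
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means the connect succeeded
                open_ports.append(port)
    return open_ports
```

Running this against an NDI sender's address should confirm whether the base TCP range (5960 and up) is reachable before digging into discovery issues.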

NDI and Avid Media Composer Across VLANs

By justsomeguy
Hi All,

Here's my problem: I'm trying to give multiple end users the ability to use Studio Monitor to watch down the output of multiple Avid Media Composers running the Open I/O NDI plugin. To be clear, I want them to be able to select from a list of roughly 50 machines and watch one of them using the monitor. My network is neither multicast nor mDNS enabled. My end users are on different VLANs than the Media Composers. For reference, I'm using MC 2018.12 and NDI Tools v4.0 on the monitor endpoints.

1) Avid Media Composer does not seem to have a place to configure the Open I/O NDI plugin in order to point it at a Discovery Server (anyone know of a way?), so I can't use a discovery server implementation.

2) I've resorted to Access Manager on the NDI monitor endpoint. I'm able to manually add the sources in Access Manager for the multiple machines, and they all work (hooray).

3) Access Manager stores its config in several places:
- ndi-config.v1.json in C:\ProgramData\NewTek\NDI
- a registry key called "IPs" in HKEY_USERS\-username-\Software\NDI\Networks, where -username- is the Administrative or Power User that last launched the Access Manager app

The last point is something I need to flag, because it's important for the next one. Access Manager can only be run (on Windows 10, anyway) by a user that has Power User or Administrative privileges on that machine. If the end user that needs the monitor is not an Admin or Power User, an Administrator needs to configure Access Manager, and that user's Studio Monitor will then reference the config put in by the Windows administrative user that did the setup.

What I've found is that neither of the above configs actually takes precedence in the "Remote Sources" tab. I've tried to manually add source IPs (comma-separated) to both the RegKey and the JSON file. When I next launch the Access Manager application and update the config, both the JSON file and the registry key are overwritten by the application upon closing.

So that got my wheels spinning, and I realized there must be another config somewhere. Sure enough, there is. That config is called ndiMemos.xml and lives in %USERPROFILE%\AppData\Roaming\NewTek, where %USERPROFILE% is again the Admin or Power User that configured Access Manager. I was actually able to manually modify that file, which is great. But I still have this problem with Access Manager needing to be run as a Power/Admin user, because unfortunately many of my users are not Power/Admin on their own machines.
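As an illustration of scripting those files rather than clicking through Access Manager, a sketch like the one below could merge source IPs into ndi-config.v1.json. The `ndi.networks.ips` key layout is an assumption on my part (verify it against a file written on your own machine), and as noted above, Access Manager may overwrite the file the next time it is closed:

```python
import json
from pathlib import Path

def add_remote_sources(config_path: str, new_ips: list[str]) -> str:
    """Merge `new_ips` into the comma-separated IP list of an NDI config file.

    Assumes a {"ndi": {"networks": {"ips": "a,b,c"}}} layout, which should be
    checked against a real ndi-config.v1.json before relying on it.
    """
    path = Path(config_path)
    config = json.loads(path.read_text()) if path.exists() else {}
    networks = config.setdefault("ndi", {}).setdefault("networks", {})
    current = [ip for ip in networks.get("ips", "").split(",") if ip]
    # dict.fromkeys de-duplicates while preserving order
    networks["ips"] = ",".join(dict.fromkeys(current + list(new_ips)))
    path.write_text(json.dumps(config, indent=2))
    return networks["ips"]
```

A scheduled task running something like this as a privileged user could keep a non-admin user's source list current without handing them Access Manager.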

So, here's my ask: 1) In the long term, is there any way Access Manager can be properly signed so any user can run it? 2) Is there an easy way to do this? I'm thinking of a webserver inside my environment that "catches" all the Media Composer feeds over NDI and aggregates them into a webpage, so an end user can simply hit that webserver over port 443 and click a source to view it in the browser (HTML5 video). Does anything like that exist out there?

Thanks all

Newtek NDI 4.5

By LeCollagiste
Newtek NDI

The Vizrt group, owner of NewTek, creator of the NDI protocol since 2015, announces the arrival of NDI 4.5.

In the pipeline for this new release: advanced features for use over the Internet and wireless networks, low-latency NDI|HX, full support for GPU-accelerated decoding, and NDI|HX integration in the Mevo standalone camera. Real-time Unreal Engine streams can now be broadcast over an NDI network, leaving the field open for augmented and virtual reality. Version 4.5 also brings unlimited IP recording.

NDI Tools is a free suite of goodies designed for a video-over-network workflow.

NDI Tools


Newtek NDI 4.5 is a post from LeCollagiste VJ.
LeCollagiste VJ - LeCollagiste's mag, news on VJing culture, video mapping, video and high-tech.

Advanced Edition 3 no longer supports iVGA - Need help

By isaac2k2
We recently upgraded our Tricaster 8000 to the latest Advanced Edition 3 firmware and lost the ability to display our EasyWorship via iVGA to the Tricaster.

I have read about the NDI solution, installed it on the desktop, and set everything up, but I am facing a challenge with this new system.

The laptop's main screen is captured on the Tricaster as a feed, but there is nowhere to force it to display the secondary screen, which is where we project our scriptures to external monitors.

The old iVGA had a setting that allowed you to select which screen, but this new one does not have such an option, hence we are stuck.

Anyone able to assist?


Can't Get NDI Studio Monitor to record a feed with alpha channel

By avkid6345
I'm struggling this morning with Studio Monitor and IsoCorder. I have titles coming in from ProPresenter with an alpha channel. I can get them to display properly within Studio Monitor, with the checkerboard background confirming that the alpha channel is working, but all recordings just have a black background. I can't find a way to get the iso feed from the titles recorded with the alpha channel. Any obvious settings I'm missing? Is there a better NDI recording software out there?

NDI HX1 vs HX2

By DStoneburner
Is there an article that shows the improvements of HX2? Thanks in advance.

Magewell’s New Flexible 4K Decoder for NDI® And SRT To Be Launched at 2020 NAB Show

By Andres Benetar

Magewell will unveil its most powerful decoder hardware to date in booth #SU5724 at the 2020 NAB Show (April 19-22 in Las Vegas). Believed to be the industry’s first hardware decoder to support both NewTek’s NDI® technology for production-grade media transport and the Secure Reliable Transport (SRT) protocol – developed and open-sourced by ...

The post Magewell’s New Flexible 4K Decoder for NDI® And SRT To Be Launched at 2020 NAB Show appeared first on NAB Show News | 2020 NAB Show Media Partner and Producer of NAB Show LIVE. Broadcast Engineering News.

Powerpoint in Presenter Mode & Full Screen options?

By Pasnow
Thinking out loud here, and looking for an external set of eyes and brains to help think something through. We do conferences in a fairly large auditorium (100-200 people), about the size of a high school gym, but a professional environment. Previously, we had our laptop at the podium, and speakers would use the clicker to cycle through the slides. In between speakers, though, we have our staff kind of walk up and fumble through the laptop to close out the PPT and load up either an "On Break" set of slides, or maybe a promo or video. It's not terrible, but there's talk it looks a bit amateurish, and we want to control the slides and the projector's view from the back. (Also, some less savvy speakers wish to show a YouTube video or something and have trouble bringing it up.)

We use Wirecast and have a Connect Spark. We are thinking of running the PowerPoint laptop into the Connect Spark, and then Wirecast grabs it for our online viewers as picture-in-picture (this we currently do, and it's A-okay). Then place a laptop up front, and use Studio Monitor for the speaker to view the PowerPoint, and from there, HDMI out to the projector.

However, inevitably, some speakers are going to need/want to see their PPT in Presenter View and see notes about their speech. It is not our auditorium, we only get in about 6:30am for an 8am start, and we do not have access to the projector from the back; it's hanging high up and fixed, with the HDMI input cable at the front. If we send Presenter View to the podium via Studio Monitor, it then outputs Presenter View to the projector. There seem to be things like the BirdDog Mini; however, if the PowerPoint laptop HDMIs out full screen to it, it will likely be full screen at the podium too.

Logistically, can anyone conceive a way to pull this off? We're thinking some form of screen share (GoToAssist) for the podium to view the PPT laptop's Presenter View screen, while the HDMI out is full screen to the monitor. Or maybe via OBS using NDI.
Lastly, two laptops with NDI Scan Converter on them would not see each other, right?

NDI Studio no video but has audio

This is a weird one. Reminds me of when we had to change the .ini file to get overlay to work. Computer specs below.
NDI Studio Monitor Ver 4 loads and receives audio, but only shows a green screen for video. I'm simply trying to display our NDI output on a local computer on the same Gigabit LAN. Could not get VLC to decode it, but I suspect it has encode and no decode. Maybe someone knows how to get VLC to decode and display.

So Studio Monitor has sound and a green screen for video.
NDI Test Pattern Ver 4 crashes with the following error:
Problem signature:
Problem Event Name: APPCRASH
Application Name: Application.Network.TestPatterns.exe
Application Version:
Application Timestamp: 5e45db8b
Fault Module Name: Processing.NDI.Lib.x64.dll
Fault Module Version:
Fault Module Timestamp: 5e45db06
Exception Code: c000001d
Exception Offset: 0000000000bab64a
OS Version: 6.1.7601.
Locale ID: 1033
Additional Information 1: 16a6
Additional Information 2: 16a64011e9427dfff697afa4c86d841b
Additional Information 3: a4c8
Additional Information 4: a4c8bb0ef04e858c1479ab66be8c00c1

Tried Ver 3.6 NDI Tools and can get a test pattern to another machine for 20+ hours, but Studio Monitor 3.6 crashes with this error:
Problem signature:
Problem Event Name: APPCRASH
Application Name: Application.Network.StudioMonitor.x64.exe
Application Version:
Application Timestamp: 5b580b37
Fault Module Name: Application.Network.StudioMonitor.x64.exe
Fault Module Version:
Fault Module Timestamp: 5b580b37
Exception Code: c000001d
Exception Offset: 000000000053fdcb
OS Version: 6.1.7601.
Locale ID: 1033
Additional Information 1: 1ee0
Additional Information 2: 1ee092440bf7aac5428f9257921782a1
Additional Information 3: b72b
Additional Information 4: b72b6fc4d6af5792c99d216a12a03bf2
Cleaned everything out and tried several things; nothing allows it to display the video. Updated all drivers; no virus, no malware, no spyware, nothing unusual; DX updated; very basic setup. Output to one external monitor, not using the laptop monitor, on a Dell D620.

System: Win 7 Ultimate SP1, all updates, turned off everything I didn't need. Plenty of HD space, and swap space set to 5 GB on a separate partition.
CPU: Intel Mobile Core 2 Duo T7200 @ 2.0 GHz, 4096 KB cache, MMX, SSE, SSE2, SSE3, SSSE3, EM64T, VT-x
Load: CPU 12%, physical RAM 50%, network 0.35%
GPU: Nvidia Quadro NVS 110M, 128 MB memory; 19 MB dedicated in use, 25 MB dynamic

RAM: 2x 2 GB = 4 GB
Gigabit Ethernet to a switch to the other PC, which runs almost the same specs but with 16 GB RAM and an i7 CPU. The network is empty except for these two devices.

Any ideas on how to get Studio Monitor 4 to display the video?

Scan Converter closes during use

By tellusmuseum
I am running a PowerPoint on a Windows 10 machine through Scan Converter into a Tricaster TC-1. The two are connected through a local Gigabit switch along with an NCIO, a worksurface, and five NewTek PTZ cameras. I am getting freezing through the NDI input every once in a while on the Tricaster. The screen isn't frozen on the Windows 10 machine, just the input into the TC-1. Today, when I checked, the Scan Converter plug-in must have closed during the PowerPoint. I restarted the plug-in, and once again I could see the PC.

Any thoughts?

A way to build with FFmpeg?

By sneakyjoe
I've recently become interested in building my server with NDI-Nginx-FFmpeg transcoding for my stream. I learned that around a year ago FFmpeg dropped NDI support.
It also looks like even older versions of FFmpeg won't build with the NDI SDK? (I also assume I may have been doing something wrong.)
Is there a static build of FFmpeg to use on Ubuntu or even on Windows, or a way to build FFmpeg myself for personal usage?

NDI stutter

By imryh

We have a large NDI setup with a new TC Mini 4K in the middle of it. All six camera sources come into the TC Mini 4K through BirdDog Studios.

Every single camera source coming up on the TC Mini has a light video stutter every 10-12 seconds.

Any ideas why, and how to fix it?

All on a local network. Lots of bandwidth.



Mage Noir: six illustrators unveil their creations for a card game

By Shadows

Camille Fourcade, Charles Ouvrard, Jessica Heran, Jeffrey Jeanson, Nicolas Camiade, and Johann Goutard unveil their creations for Mage Noir, a collectible card game project. The project, led by game designers Vincent Vimont and Constantin Dedeyan, is the subject of a Kickstarter campaign.

Here is a small selection:

Jessica Heran
Johann Goutard
Charles Ouvrard
Jeffrey Jeanson
Camille Fourcade
Nicolas Camiade

More visuals can be seen on the project's website and Kickstarter page. The game will of course feature many more artworks, since each starter pack will contain 80 cards and boosters are planned.
It is worth noting that 45% of the amount requested for the campaign corresponds precisely to the creation of the artworks, and therefore to paying the artists.

The article Mage Noir: six illustrators unveil their creations for a card game appeared first on 3DVF.

Facebook NDI Users Group

By joly
For those who might like such things.

Not affiliated with NewTek; public, 4.4k users.

PowerPoint & presenter view

By dveldhouse
This one seems strange to me, so I'm hoping someone has an idea why. When I connect my laptop to my Mini to record a PowerPoint presentation, we can't seem to make it work correctly with Presenter View. The standard presentation looks fine on the laptop, but on the Mini and on the pass-through to a projector, the slides are squished top and bottom. The display drivers are fine and set correctly as far as I can tell. There is no problem with the standard presentation view, but some people really like to see their notes while talking. Any thoughts, or am I missing something simple?


Callipeg: a promising 2D animation tool needs you

By Shadows

The team at Enoben presents its work in progress: Callipeg, an animation tool designed for the iPad.

The goal is ambitious: nothing less than a product fit for professionals. The team also stresses its intention to lean on the iPad's capabilities as much as possible: the Apple Pencil and gestures are therefore fully supported.

In beta for several months, the app already offers an advanced timeline, radial menus, multi-touch support, data import/export, and a brush system. The interface has clearly been given careful attention; left-handed users will also appreciate the existence of a dedicated mode.

The chosen business model is a subscription, at a fairly light price (under €5/month or €50/year).

Ahead of the actual launch, a Kickstarter campaign has been set up. The idea is to fund the development of certain features before the tool's public release: a 2D camera and a transform layer, for example, are on the menu.
Backing the campaign won't give you access to the tool; instead, you will receive the animation files used in the presentation videos: a good starting point for when you test the tool later on.

The article Callipeg: a promising 2D animation tool needs you appeared first on 3DVF.

Crowdfunding: a video game inspired by the work of Moebius and Jodorowsky

By Shadows

Adapting the universe of Moebius and Jodorowsky into a video game: an ambitious project, to say the least. Yet that is the challenge Ruiz Dupont Manuel, alias Mata, has set himself with The Eyes of The Cat, an upcoming 3D experience inspired by the comic of the same name.

You will control a cat, an eagle, and a child in what is described as "1 vs 1 vs 1": a kind of three-player chess, except that you play each of the players yourself.
Visually, the title will offer three distinct art directions, one per avatar. Everything will be built in Unreal Engine.

To complete the project, a fundraising campaign has therefore been set up on Kickstarter. The goal: raise at least $8,600, of which around $1,400 has already been collected. A €12 pledge gives you access to the current version and the finished game, a full interview with Jodorowsky, and wallpapers.

Finally, note that Mata has solid experience in video games: he has worked as a game director and technical artist (notably at Epic Games), as well as on real-time projects at Dassault.

For more details, see the Kickstarter campaign.

The article Crowdfunding: a video game inspired by the work of Moebius and Jodorowsky appeared first on 3DVF.

The Fuffee Mug uses heat to power a screen that serves no purpose

By Pierre Lecourt

A mug with an e-ink screen: exactly what a whole generation of users was missing. The ones starved of inane maxims, who aren't lucky enough to hang out on LinkedIn every morning to savor the laborious prose of the geniuses who share their fabulous life lessons there.

The Fuffee Mug has only one point of interest: it embeds a technology that recovers part of the heat given off by the drink it contains in order to display data on its e-ink screen. Truth be told, there's nothing really new here; we have known how to do this for quite a while. Jean-Charles Peltier discovered how to draw current from a heat source in… 1834.


Only JC didn't have the start-up spirit, so he never had the bright idea of building a mug that could display "Don't yell at the boss, his wife takes care of that" in big letters on a Fuffee. The object rides a green wave by insisting heavily on the fact that it is "battery-free". My current mug is too; it holds more, weighs less, and has no chance of one day turning into yet another advertising medium.

The brand's pitch is clear enough: "By harnessing the thermal energy of a coffee, Fuffee becomes a battery-free mug and social platform that will let you cooperate with various brands and services. You will feel something different every time you drink your Fuffee."


Claiming to be a green product while adding useless electronics to an object as simple as a mug strikes me as a bit of a stretch. And if it's to spoil my coffee by accosting me with some inane ad on yet another new medium, with the message relayed over Bluetooth via my smartphone, which could just as well display it itself? That is clearly incitement to crime.


I do, however, find the underlying idea excellent, and one can imagine uses for this heat-to-energy relationship with an e-ink display. No permanent heat source is needed, since the screen keeps displaying its data even once the temperature drops. For a weather station, or for displaying important information with a fairly long shelf life, this kind of interface is worth considering. One could even imagine a heat-recovery system built directly into our computers (which generate plenty of it) to display data this way.
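As a rough sketch of the physics behind that heat-to-energy idea, here is a back-of-envelope Seebeck-effect estimate of what a thermoelectric module under a hot drink could harvest; every figure below (module coefficient, temperature delta, internal resistance) is an illustrative assumption, not a Fuffee spec:

```python
# Back-of-envelope estimate of the power a thermoelectric generator (TEG)
# could harvest from a hot drink via the Seebeck effect.
# All numbers are illustrative assumptions, not Fuffee specifications.

def teg_power_watts(seebeck_v_per_k: float,
                    delta_t_k: float,
                    internal_resistance_ohm: float) -> float:
    """Maximum power transfer into a matched load: P = (S * dT)^2 / (4 * R)."""
    open_circuit_v = seebeck_v_per_k * delta_t_k  # Seebeck open-circuit voltage
    return open_circuit_v ** 2 / (4 * internal_resistance_ohm)

# Assumed: a small Bi2Te3 module (~0.05 V/K effective), coffee ~40 K above
# ambient, ~5 ohm module resistance.
p = teg_power_watts(0.05, 40.0, 5.0)
print(f"{p * 1000:.0f} mW")  # prints "200 mW"
```

A couple of hundred milliwatts is tiny for a backlit display but comfortable for an e-ink panel that only draws power when the image changes, which is presumably why this pairing works at all.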

The Fuffee Mug launches at $69; how much would the same product cost without the mug, for smarter uses?

The Fuffee Mug uses heat to power a screen that serves no purpose © 2020.