Blackmagic Design’s new ATEM Mini Pro live production switcher
Today, during an hour-long live-stream event, Blackmagic Design CEO Grant Petty made several product announcements. The most noteworthy was the introduction of the ATEM Mini Pro, a $595 low-cost live production switcher that, according to the company, has “all the features of ATEM Mini but now with extra features for recording, streaming and monitoring.”
Blackmagic Design also said the new ATEM Mini Pro includes a hardware streaming engine that allows direct streaming via its Ethernet connection to YouTube Live, Facebook and Twitch. The ATEM Mini Pro also supports recording the stream directly to USB flash disks in H.264, with support for recording to multiple disks for continuous recording. In addition, a multi-view capability on the HDMI video output allows all inputs to be monitored on a single monitor, along with live status of recording, streaming and the audio mixer.
Other new ATEM Mini Pro features include:
Petty said the new ATEM Mini Pro switcher is available now.
In addition to the ATEM Mini Pro, Blackmagic Design made a couple of other announcements. The company released two free updates for its popular Blackmagic Pocket Cinema Camera 4K and 6K models that add powerful studio camera features, and it also introduced an update for its HyperDeck Studio Mini broadcast recorders.
The Blackmagic Camera update (6.9) and the HyperDeck Studio Mini update (7.1) are available for download now from the company’s website. For more on each product, see the following press releases:
The post Blackmagic Design Introduces New ATEM Mini Pro Live Production Switcher appeared first on HD Video Pro.
The Panasonic Lumix DC-S1H is one of the newest generation mirrorless hybrids that utilizes the H.265 codec for internal recording.
Does your current primary camera support H.265 recording? Other than a preset in your camera’s codec/recording options, what do you know about the H.265 codec and how to most effectively use it in your production and post workflow? I thought it would be timely to explore H.265 usage in digital cinema and pro video since we’re just receiving word that new options are already on the horizon that will update and compete with H.265. While this is by no means a complete list, popular cameras like the entire ZCam line, the Panasonic Lumix S1H and AU-EVA1, GoPro Hero 7 and 8 Black, Apple iPhone, Canon XF705 and many others currently support H.265 or will in the future.
High-Efficiency Video Coding (HEVC), also known as H.265 and MPEG-H Part 2, is a video compression standard designed as a successor to the widely used Advanced Video Coding (AVC, H.264 or MPEG-4 Part 10). I was amazed to discover that H.265 was finalized back in 2013, but it’s only in about the past two years that it seems to have reached critical mass, with a lot of new cameras, from GoPros to high-end digital cinema cameras, including it in their codec options. In comparison to AVC, HEVC offers from 25 percent to 50 percent better data compression at the same level of video quality, or substantially improved video quality at the same bit rate. It supports resolutions up to 8192×4320, including 8K UHD, and unlike the primarily 8-bit AVC, HEVC’s higher-fidelity Main 10 profile has been incorporated into nearly all supporting hardware.
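To put that compression advantage in concrete terms, here’s a quick back-of-the-envelope sketch (the 100 Mbps AVC bitrate is a hypothetical figure for illustration, not a spec from any particular camera):

```python
def file_size_gb(bitrate_mbps, minutes):
    """Approximate recording size in GB for a given video bitrate and duration."""
    return bitrate_mbps / 8 * minutes * 60 / 1000  # Mbps -> MB/s -> MB -> GB

avc_mbps = 100  # hypothetical H.264/AVC recording bitrate
for savings in (0.25, 0.50):
    hevc_mbps = avc_mbps * (1 - savings)
    print(f"{savings:.0%} savings: 10 min of AVC = {file_size_gb(avc_mbps, 10):.1f} GB, "
          f"equivalent HEVC = {file_size_gb(hevc_mbps, 10):.2f} GB")
```

At the 50 percent end of the range, that works out to HEVC cards and drives filling at half the rate for the same perceived quality.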
Out of my own two main digital cinema cameras, only one of them supports H.265 recording, the Fujifilm X-T3, but I also notice that my GoPro Hero 7 Black supports H.265, although GoPro refers to it only as HEVC. Like many other terms in our business, there are often two or even three commonly used names, but just to be clear, as of today, HEVC and H.265 are the same thing and are often used interchangeably. From here on out, I’ll refer to HEVC primarily as H.265, but just be aware that as you read and research how this codec works and how it relates to your gear, you may see both names, HEVC and H.265, in common and interchangeable use.
For me, the main factor in H.265 is that its efficiency allows 10-bit video recording in a tiny mirrorless camera like the Fujifilm X-T3, although not all H.265-capable cameras support 10-bit recording (looking at you, Apple!), but most do. That alone makes it an interesting codec, as the camera’s other recording choice, H.264, is limited to 8-bit recording. As you’re probably aware, 8-bit video records a maximum of only 256 levels per color channel. The result is that 8-bit video can look very good but will be limited in color reproduction, resulting in color banding in gradients. If you shoot skies, you’ll often see banding, as you will in numerous interior situations. When performing color-intensive techniques like green screen, keying and compositing, it can be more difficult to pull a clean and precise key with 8-bit video.
Shooting 10-bit video, the maximum number of levels that can be recorded per channel jumps from 256 to 1,024. The end result is that 10-bit video reproduces gradients and skies with little to no perceptible banding, and for keying, green/blue screen, compositing and color grading, 10-bit gives you much more color resolution to work with, meaning that you can refine composites and push the signal around in color grading considerably more.
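The arithmetic behind those numbers is simple powers of two. This little sketch shows where the 256 and 1,024 figures come from:

```python
def levels_per_channel(bit_depth):
    """Tonal levels available per color channel at a given bit depth."""
    return 2 ** bit_depth

def total_colors(bit_depth, channels=3):
    """Total representable colors across three channels (ignoring broadcast legal-range limits)."""
    return levels_per_channel(bit_depth) ** channels

for bits in (8, 10):
    print(f"{bits}-bit: {levels_per_channel(bits):,} levels per channel, "
          f"{total_colors(bits):,} total colors")
```

Going from 8-bit to 10-bit quadruples the levels per channel, which is why gradients that band in 8-bit footage hold together in 10-bit.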
How does H.265 increase efficiency over its predecessor, H.264? H.265 achieves greater levels of compression through techniques including better variable-block-size segmentation, improved deblocking and motion compensation filters, sample adaptive offset filtering and better motion vector prediction and precision. Where H.264 was built around fixed-size 16×16-pixel macroblocks, H.265 uses coding tree units that can range up to 64×64 pixels, adapting block size to the content of the image.
Now that we’ve established what H.265 is, let’s talk about how you can actually use it efficiently in your own work. As H.265 footage becomes more commonplace in professional production workflows, I have a few observations about how it’s used and what some common issues are around it.
In our business, time is money, right? In production, shooting H.265 isn’t any more difficult or time consuming than any other codec you’ve been using. Where I see a lot of users having issues with H.265 is when it comes to handing it off to clients or for their own postproduction workflows. Let’s state it now: H.265 is unequivocally not designed for editing. Why? Without getting into the technological intricacies of exactly how Long GOP (group of pictures) and Intraframe (individual video frame) encoding work, H.265 is a heavily compressed Long GOP codec, so your computer has to do a lot more heavy lifting to decompress that information for editing.
When a video has been shot with a long GOP codec, there’s a lot of interpolated information in many of the frames. Transcoding the footage into ProRes or another edit-ready codec converts those groups of pictures to an All-Intraframe codec so that your computer no longer has to devote CPU cycles to making sense of the interpolated data. It also allows you to upgrade the color from 4:2:0 to 4:2:2/4:4:4, which won’t make the colors better in your shot but gives you a bit less color/signal degradation when color grading.
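The 4:2:0-to-4:4:4 difference comes down to how many chroma samples survive relative to luma. As a rough sketch of the raw sample counts involved (uncompressed, before any codec math):

```python
# Chroma fraction: the portion of luma resolution each chroma plane (Cb, Cr) keeps.
CHROMA_FRACTION = {"4:4:4": 1.0, "4:2:2": 0.5, "4:2:0": 0.25}

def samples_per_frame(width, height, subsampling):
    """Total Y'CbCr samples in one frame for a given chroma subsampling scheme."""
    luma = width * height
    chroma = int(luma * CHROMA_FRACTION[subsampling]) * 2  # Cb plane + Cr plane
    return luma + chroma

for scheme in ("4:2:0", "4:2:2", "4:4:4"):
    print(f"{scheme}: {samples_per_frame(1920, 1080, scheme):,} samples per 1080p frame")
```

A 4:2:0 frame carries only a quarter of the chroma detail of 4:4:4, which is why transcoding can’t add color information back; it just stops the signal from degrading further as you grade.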
Sure, if you can invest in a top-of-the-line CPU with a lot of RAM and the fastest RAID for editing, it’s possible to edit H.265 in real time without dropping frames. But even if you have a system capable of processing H.265 footage in real time, what happens when you need to edit a multicam project? Then you’re asking your computer to decode multiple streams of Long GOP H.265 footage in real time.
What if you’re editing a huge project with a long running time and lots of H.265 footage? Generally, if your system can edit an H.265 media stream with real-time performance, it’s probably at or near the edge of its capability. With client projects, deadlines and money on the line, it’s always a good idea to have extra capability and additional overhead. For this reason, most experts agree that transcoding H.265 to a more edit-friendly codec is the safest way to ensure that your system will have enough additional overhead to handle multiple streams and layers without choking and failing to play back in real time.
The two most common solutions to the challenges presented by using a distribution codec like H.265 for editing are:

1. Recording to an edit-ready codec on set with an external recorder.
2. Transcoding the H.265 camera files to an edit-ready codec before editing.
Let’s break down each of these approaches. For number one, many users report good results utilizing an external recorder like the Atomos Ninja V or the Blackmagic Design Video Assist. It’s nice to be able to shoot and then return to your editing system with ready-to-edit ProRes or even compressed RAW files. All isn’t perfect with this approach, though; there are several drawbacks and limitations to using an external recorder, including the additional expense, the added weight, bulk and power requirements on your rig, and the extra HDMI or SDI cable run from your camera to the recorder.
In general, external recorders solve one problem or challenge (getting around the H.265 computing requirements in post) by adding additional problems or challenges to your production workflow. It’s up to you if the tradeoffs are worth the advantages.
For number two, transcoding H.265 to an edit-ready codec, you realize the same advantage as using an external recorder: you begin your post process with an editing-friendly codec like ProRes. Transcoding your footage overcomes almost all of the disadvantages of using an external recorder: no additional expense, weight, bulk, size and power requirements, and no additional HDMI or SDI cable going from your camera to a recorder. Still, all isn’t perfect with shooting H.265 and transcoding after the shoot:
To that end, there are excellent utilities that aren’t very expensive that will efficiently transcode your footage. Even if you don’t want to use FCP X and you’re on the Mac platform, Apple Compressor is an excellent, easy-to-use, full-featured transcoding and compression utility that’s optimized for the latest Apple OS, and it only costs $49 from the App Store.
Brorsoft makes a utility called Video Converter for both PC and Mac platforms that can easily and efficiently transcode your H.265 to the best format for your editing system. There are numerous other utilities from numerous other developers, and most of them cost less than $60 and offer a free trial so you can make sure the utility can do what you need it to. Most of these utilities also offer other functions that may be helpful to your post workflow for other file conversions.
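If you’d rather not buy a utility at all, the free command-line tool ffmpeg can handle the same transcode. Here’s a minimal Python sketch that builds an ffmpeg command converting a clip to ProRes 422 HQ; the filenames are hypothetical, and you’d want to tune the profile and audio settings to your own pipeline:

```python
import subprocess

def build_transcode_cmd(src, dst):
    """Build an ffmpeg command line that transcodes source footage to ProRes 422 HQ."""
    return [
        "ffmpeg", "-i", src,
        "-c:v", "prores_ks",   # ffmpeg's ProRes encoder
        "-profile:v", "3",     # profile 3 = ProRes 422 HQ
        "-c:a", "pcm_s16le",   # uncompressed 16-bit audio
        dst,
    ]

cmd = build_transcode_cmd("A001_C001.mov", "A001_C001_prores.mov")
print(" ".join(cmd))
# To actually run it (requires ffmpeg installed on the system):
# subprocess.run(cmd, check=True)
```

Wrapped in a loop over a card’s worth of clips, a script like this can batch-transcode an entire shoot overnight.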
H.265 is a reality for professional media production right now. How you deal with it in your workflow is a choice you’ll have to make using whatever factors make the most sense for your budget and workflow. For some, investing in the best hardware available to edit H.265 in real time, at least at some level, is worth it for their business plan. For other workflows and uses, an external recorder makes the most sense. For those of us who prefer to deal with H.265 in postproduction, a software solution and transcoding makes the most sense.
Keep in mind that in 2020 we’ll see the Moving Picture Experts Group (MPEG) introduce VVC, or Versatile Video Coding, which will also be known as H.266. From the intro documentation, it’s clear that the design parameters of this new codec are aimed squarely at distribution. The MPEG requirements document defines the target as a bit rate reduction of between 30 percent and 50 percent at the same perceptual quality as the HEVC (H.265) Main Profile.
In terms of complexity, the same document states that “Encoding complexity of approximately 10 times or more than that of HEVC is acceptable for many applications.” Reading between the lines, it seems as if this new H.266 codec hasn’t been thought through any more effectively for shooting/acquisition than H.265 was back in 2013. That means that you, as a digital cinema/video user, may need to continue using the strategies I’ve outlined in this article well beyond the useful lifespan of H.265, since it appears the new H.266 is also mainly a distribution codec.
New codecs will always be coming into our workflows, so it helps to have strategies in place for how to deal with them effectively.
Last time, I talked about the problem of judder—the stuttering or strobing that occurs when there’s fast movement in a scene. The movement may be caused by fast camera movement or by fast movement of objects in a scene—like vehicles. Judder is even more pronounced as we move to more HDR finishing.
At the Hollywood Professional Association’s Tech Retreat in February, Pixelworks discussed their TrueCut motion grading workflow. By analyzing each frame in a shot, the software can smooth out the judder.
The workflow process begins with powerful workstations (either local or cloud-based) analyzing scenes that may need to be adjusted. This isn’t a simple task: proprietary algorithms perform complex motion analysis. It isn’t a press-a-button-and-grab-some-coffee operation. It takes time.
Once the analysis is complete, the TrueCut software integrates with color grading software like DaVinci Resolve or Baselight. Preset options to correct judder are presented for each shot for quick decisions. Or, for more creative control, there’s a software control panel (shown at the top of this article).
One way to offset judder is by increasing motion blur. The control panel allows you to control motion blur via changes to the shutter angle. Yes, changing the perceived shutter angle in post is possible.
Another control is something Pixelworks calls judder angle. This control reduces the appearance of judder, not by motion blur but through pixel interpolation based on the prior analysis of the scene. Between adjusting shutter angle and judder angle, a motion grader can use TrueCut to do a shot-by-shot motion grade for a film.
The demo at HPA showed removing motion blur, adding motion blur, and—more importantly—dialing down the judder. And while almost completely removing judder effects is an option, visually it takes the scene from film to video.
Currently, motion grading occurs during the color grading session. This type of grading, like color grading, really involves creative decisions. I’m told directors/cinematographers have different opinions on the best motion grade for each shot just as they do with color.
As this process is used for more and more films, will we see motion grading move to a new position? Will we see Motion Grader in the credits?
Years ago, when I worked for Nikon, I found one of the most enjoyable “freebies” I had access to as an employee was being able to attend the Nikon School Workshop in New York City. Back then, the day-long workshop was a slideshow presentation given in auditoriums around the U.S. by various Nikon photographers and product experts, who discussed how to take and make great photos.
Today, in response to those who are self-quarantining themselves due to the Coronavirus pandemic, Nikon is, in a way, extending that same benefit to everyone. Here’s the message you’ll now find on NikonUSA’s Events webpage: “Make the most of this time. Nikon’s mission has always been to empower creators. In these uncertain times, we can do that by helping creators stay inspired, engaged and growing. That’s why we’re providing all of our courses free for the entire month of April. Let’s come out of this even better.”
The ten or so educational videos, which in total would cost you a few hundred dollars to access, are taught by Nikon Ambassadors and include:
To watch them for free, though, you’ll need to provide Nikon with your first and last name as well as your email address.
Does this Zoom webcast interface look familiar to you, especially since quarantine started?
This is my first blog entry since we’ve been stuck in quarantine from the Covid-19 virus. Like all of you, I’ve been feeling the full range of the human experience—fear, uncertainty, boredom and frustration. We’re definitely in unprecedented times and one constant I’ve been noticing a lot is that the weeks and possibly months of quarantine are leading to some interesting behaviors. If you consider yourself an introvert, you gain energy and engagement from being with yourself. If you’re an extrovert, you gain energy and engagement from socializing and being with other people.
While quarantine is difficult for all of us, it has shown me that, contrary to popular belief, few of us are all introvert or all extrovert. It seems more likely that each of us shows a disposition toward one or the other, but most of us have elements of both qualities in our personality. It’s in times like these that introverts may be feeling a bit of a longing to go out and socialize after weeks of staying indoors, while those of us who are more extroverted may be dying a thousand deaths at not being able to go out and hang with people at all.
The most immediate solution to our very human problem of isolating through quarantine seems to be streaming. Whether it’s Facebook Live, Microsoft Teams, Zoom or FaceTime, I’ve noticed a lot of live streaming/webcasting activity over the past three weeks. I sat down with a vendor this past week for a one-hour training via Zoom about a new audio app I’ll be writing about here soon. My wife, who is a teacher, has been on two to three webcasts per day from her school since quarantine started a few weeks ago, and next week she’ll begin teaching her students via Zoom. My friends and relatives all seem to be streaming for their work. We’re even getting together tonight for a Zoom “party” to catch up with friends we haven’t seen in quite a few weeks. I’m not sure exactly what we’ll do, but it will be good to hear what they’ve been up to, see their faces and hear their voices since we can’t meet up, thanks to the quarantine.
From a professional viewpoint, this entire virus event feels to me as if it could be the beginning of a sea change in how people regard webcasting/live streaming. More people are using this technology now than probably ever before. Many tasks, meetings and gatherings that people had previously thought of as strictly “live, face-to-face” events are now being done virtually, and they may shift further toward webcasting and live streaming even when this pandemic ends.
Major network entertainment shows, such as NBC’s The Tonight Show, are using webcasting to broadcast live and live-to-tape from Jimmy Fallon’s home. DJs and musicians are live streaming to their audiences. DJ D-Nice has been hosting weekly dance parties called Club Quarantine, where millions of people stream his DJ sets and dance “together” in their homes.
Right now, people who are quarantined, a good portion of the world’s population, don’t really have a choice about live streaming. It’s the only option for reaching audiences with new, live content now that most television studios and networks, at least here in the United States, have severely reduced their staffs, and many have closed or are running with a skeleton crew. Today is March 29, 2020, and predictions are that the virus outbreak will become worse before it gets better. Once we’re on the road to recovery, though, what will this all mean for live streaming/webcasting?
I shoot documentary films, some entertainment programming and some corporate work. Up until this pandemic, almost all of my work was shot, edited and distributed through social media, broadcast and occasionally in theaters and through corporate internal networks. How does what has been happening with live streaming affect what I do for a living? At this point, I can’t really say; I don’t think anyone can. We won’t know how viewing habits and media consumption will truly change until we’re over this pandemic and life begins to return to the new normal. I do have some ideas, though, about directions I’d like to take when we get to that point.
For those of us who aren’t Gen X or Millennials, media through this pandemic and quarantine has, to date, proven that just like demographers and sociologists tell us, what matters most is authenticity. Seeing a celebrity or newscaster broadcasting from their living room or kitchen on their phone is proof of this. Take away the nice lighting, top-end cameras and professionally recorded and mixed audio, and what you’re left with is either the real deal or it’s not. If the authenticity is still there, the audience will still watch, even without all of the niceties of good production value.
Strangely, the things we video professionals become consumed with in our relentless search for the next “big thing” (specifications, ever-increasing resolution, HDR, full frame and all of the other image-making fads that have exploded in popularity over the past couple of years) don’t seem to matter much when it comes to live streaming and webcasting. Most people view webcasts on their phone or tablet, and some on their laptops. All of a sudden, 4K, 6K, 8K, HDR and full frame don’t seem to matter a whole lot. What matters more from a technical viewpoint is having a reliable, fast connection at both the upload and download ends.
I’m not sure exactly what’s happening with media through this pandemic and quarantine that seems to be sweeping the world, but it feels as if some things are evolving, and once things somewhat return to normal (or at least the new normal), it appears that some priorities may be changing. One thing to think about as a content creator is that in one fell swoop, the entire media industry has been placed on hold. There’s literally almost no production happening in TV and features, and even live streaming and webcasting are limited beyond the few productions coming from people’s living rooms.
Think about all of the projects that you were working on (I had four projects cancel as soon as the quarantine orders went into place) that now may or may not happen. Look at all of the projects your friends, colleagues and clients were working on that have all been halted, stopped midway or just went away. Hopefully, we can all take some lessons from this shared experience and begin to think in a new and innovative way how to protect our industry and hedge our bets so that if and when another situation like this arises, we’ll be better prepared, creatively and financially to weather the storm.
For me, a key component of this new preparedness is to more fully embrace live streaming and webcasting. Both seem to be fairly disaster-proof unless the entire structure of the internet goes down. I’m thinking of new ways to tell stories and communicate with my audience during these times of unprecedented disruption, and it seems that a mixture of traditional programming (documentary and short clips) that can be streamed, combined with live, authentic, real-time content, could be an interesting approach that I haven’t seen done much with webcasting.
If you think about live streaming/webcasting as regular live TV but over the internet to anyone’s phone, tablet or laptop, the potential for creatively applying the medium is mind-boggling. It’s an infrastructure that can be put in place now, today, easily and quickly and executed at almost any level, from Facebook Live from your handheld phone to full-on multi-camera, professional-level live broadcast and everything in between. For those of you who plan on sticking with production once the pandemic passes and life returns to the new normal, it’s time to dig into your skills, thought and creative process and be on the leading wave of change in media that I predict will engulf us all. It’s time to adapt or perish.
“White Angel Bread Line, San Francisco,” by Dorothea Lange, 1933. Gelatin silver print, 10 3/4 x 8 7/8″ (27.3 x 22.6 cm). The Museum of Modern Art, New York. Gift of Albert M. Bender
Many times when I write a “Fine-Art Photography Friday” blog post, I’ll mention that “these are some of the intriguing photo exhibitions or films taking place around the U.S.” In doing so, my hope is to allow the reader to physically go to one of the profiled museums, galleries or theaters, buy a ticket and then enter. But since most everything is closed, that doesn’t make much sense.
Instead, I’ve decided to write about some of the wonderful online fine-art photography exhibitions and film presentations currently taking place on the web. Here are three notable examples of such content that I believe will intrigue and inspire you:
The first one is a show that did open in New York, but is now closed. The Museum of Modern Art had planned to present “Dorothea Lange Words & Pictures” through May 9, 2020, which the museum said “was the first major solo exhibition at the museum of the photographer’s incisive work in over 50 years” and included approximately 100 photographs drawn entirely from the MoMA’s collection. It’s an understatement to say that it’s a shame that such a great show had to close. In part, it’s because Lange was one of the great 20th century documentary photographers, whose work includes the iconic “Migrant Mother, Nipomo, California,” and many others that continue to resonate. But luckily, you can still see her work, here online on the MoMA’s website.
The MoMA also includes an additional article about Lange on its website, titled, “Written by Dorothea Lange.” In it, the article asks, “How do words and pictures work together? Dorothea Lange took great care in combining photographs with words to communicate the stories of everyday life.” It also quotes the artist herself, who said, “all photographs—not only those that are so-called ‘documentary’… can be fortified by words.”
One of her landmark photo books “An American Exodus: A Record of Human Erosion,” also vividly illustrates this point. The book was a collaborative project that Dorothea Lange worked on with the economist and writer Paul Taylor. According to the MoMA article, in the book, the two weave “together text drawn from field notes, folk song lyrics, newspaper excerpts, sociological observations, and quotations from the sharecroppers, displaced people, and migrant workers whom Lange photographed.” It’s a great example of how great photographers are often also great communicators, and take care with the words they use to write their captions.
If you’re interested to learn how contemporary photographers are inspired by a master photographer like Lange, be sure to read “The Art of Documentary Photography: Dorothea Lange Reconsidered” by Holly Stuart Hughes, who wrote about the show when it was still open and interviewed Sam Contis, a photographer inspired by Lange’s work.
Here are a couple of additional fine-art online websites to check out:
A preview video of photos in the exhibition.
This exhibition, titled “Vanity Fair: Hollywood Calling—The Stars, the Parties, and the Powerbrokers” at The Annenberg Space for Photography, in Los Angeles, comprises celebrity portraits culled from Vanity Fair magazine over the past four decades, and includes photos by Annie Leibovitz, of course, but also Helmut Newton, Mark Seliger, Herb Ritts and other luminaries. The online tour is expertly narrated by exhibit curator and Vanity Fair creative development editor David Friend in a beautiful slideshow of selected exhibit images, which Friend says reveal “the deep impact of photographic portraiture” and in many ways are “icons of icons.”
“That’s My Jazz” is a wonderfully tender, sumptuous, mostly black-and-white 14-minute heartfelt film that presents an intimate portrait of Milt Abel II, who is considered one of the best young pastry chefs in the world. According to the New York Film Festival’s website, “the son of Kansas City jazz legend Milt Abel Sr., Milt II longed to follow in his father’s footsteps, but on a different stage.” This short is part of the New York Film Festival series A Short Film a Day Keeps Anxiety Away and is directed by Ben Proudfoot and produced by Breakwater Studios. It’s a video that has brilliant visuals, inspiring music and a remarkably captivating monologue that all work together incredibly well. During the film, the son calls his dad, “Hall of fame jazz musician. Hall of fame dad.”
Watch the film! It’s guaranteed to re-connect you to the human race!
As I wrap up my writing about the annual Hollywood Professional Association’s Tech Retreat, I’d like to touch on new technology that I hinted at last time. It deals with motion in cinema capture.
The motion I refer to comes from either moving the camera—panning, tilting, dollying and handholding—or from objects that move within the scene. Because of the typical 24-frames-per-second frame rate used in cinema, the capture of the movement can be tricky.
The problem is referred to as motion judder. If you move the camera too quickly, the image appears to have what’s often called stutter or strobing. The same artifact may also occur when the camera doesn’t move but something in the scene—a vehicle, for instance—moves too fast.
What’s too fast? The picture at the top of this article shows a page from my copy of the ASC’s American Cinematography Manual. There are several charts in this book that address this specific problem. The image above shows a chart listing panning speeds for 35mm cameras at specific focal lengths. Using the data in these charts, a camera operator can determine how fast they can pan a camera with a lens of a given focal length and not get motion judder.
I bet you can guess how many people pull out their ASC manual when setting up a shot! And even if they did, the director would probably tell them that it doesn’t matter what the chart says, “I need the camera to move this way!” or “I need that car at that speed!”
In fact, most of the time people shoot with a 180-degree shutter (in photographic terms: a shutter speed twice the frame rate). This leads to motion blur. And motion blur helps minimize the appearance of motion judder.
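The shutter-angle math is easy to sketch: the exposure time for each frame is the fraction of a full 360-degree rotation the shutter stays open, divided by the frame rate.

```python
def exposure_time(frame_rate, shutter_angle=180.0):
    """Per-frame exposure time in seconds for a rotary shutter angle at a given frame rate."""
    return (shutter_angle / 360.0) / frame_rate

t = exposure_time(24, 180)
print(f"24 fps at a 180-degree shutter -> 1/{round(1 / t)} second")  # 1/48 second
```

Narrow the angle (say, to 90 degrees) and the exposure shortens, reducing motion blur and making judder more visible; widen it and blur increases, masking judder.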
However, as we move to HDR the problems of motion judder are more obvious. High contrast images really magnify judder artifacts. During the breakfast roundtables at the HPA Tech Retreat, post professionals described some scenes as almost unwatchable in HDR because of motion judder.
There’s software that can help. The software is operated by someone who might be called a motion grader. Next time: how this new position can help with motion judder.
It’s probably no surprise to most content creators that many camera companies throughout the U.S. have already shut down repair and service operations or dramatically scaled them back. However, if you’re a pro photographer or videographer and your equipment breaks, you’ll want to know your options.
We’ve been keeping track of what camera companies have been stating on their websites about repairs and service, although some have yet to update their service and repair webpages with specific information. We’ll be sure to update this page with any additional information as we hear it.
The following is a select list of some of the notices and statements we’ve found from camera and lens companies so far:
Canon service facilities in select states—California, New Jersey, Illinois and Hawaii—are temporarily closed due to the Covid-19 virus. For more, see this COVID-19 Update page from Canon USA. Canon also states there will be delays in services for its Canon Professional Services members: “Due to direction from state and local authorities in CA, NJ, IL & HI our Canon service facilities in those states are temporarily closed until further notice. All equipment repairs will be directed to our Factory Service Center in Newport News, VA. We apologize for any inconvenience or delays that may occur during this time.”
On the Fujifilm service and support page, the company said that “our digital camera repair drop off and pick up in Edison, New Jersey is closed until further notice. We are currently still accepting camera repairs that are shipped in by UPS, FedEx, and the US Mail.” Find more on this FujifilmUSA page.
The camera and lens manufacturer, based in Germany, had the following message on its service pages on its U.S. website: “The coronavirus affects the world and the current situation requires careful action by each of us. We at Leica Camera AG are also aware of our social responsibility and our main focus is on the health of our employees. We therefore hope for your understanding that we will no longer be able to offer our services to the usual extent and will close the Leica Customer Care at Leitz-Park Wetzlar [in Germany].” They asked that customers not send any Leica products for repair and maintenance after March 23, 2020, until further notice. See Leica’s service page for more.
Nikon said it was temporarily suspending equipment repairs at service facilities and that they’re not currently accepting equipment. You can learn more at its NikonUSA service and repairs page. For pro shooters who are Nikon Professional Services members, Nikon also posted the following message on its NPS service and support page: “Due to the impact of COVID-19 and in accordance with mandates issued by the federal government and various state governments, we have temporarily suspended Nikon Professional Services including equipment repairs services, product evaluation loans, special assignment loans including repair loans. In addition, NPS events and event support have also been suspended in accordance to this mandate.”
On the Olympus website, the company said that its New Jersey drop-off location was closed, but that you could still mail packages for repair. On another website, they said “Nevertheless, our Repair Facilities are still operating.” However, the Olympus website in Japan just posted this page earlier this week, stating that as “China slowly starts resuming regular operation, many offices in other Asian countries, Americas and in EMEA are working in shifts and/or from home where applicable.”
Sigma stated that its two main US offices—Sigma Ronkonkoma, NY Office, US Headquarters and Sigma Burbank, California, Office and Showroom—are “both currently closed until further notice.” Direct product ordering from the Sigma website has been suspended, and all previously scheduled events and activations for the month of March have been cancelled or rescheduled. However, there will be phone support and online information. For more, go to Sigma’s Special Notice Regarding Offices and Service Due to Coronavirus webpage.
This lens manufacturer, which has its USA headquarters on Long Island, stated that “with the mandate delivered March 20th by New York State’s Governor Cuomo for 100% of non-essential workforces to work from home, Tamron’s office is closed until this mandate is lifted. As such, shipments are not being made or received, mail-in rebates cannot be processed and only limited staff is available to take your call. Repairs are not being made, so please do not send in your lens for repair at this time.” For more, go to the Tamron USA website home page.
The post Coronavirus And Camera Repair And Services Centers appeared first on HD Video Pro.
Today, Madavor Media, LLC announced that it has acquired Imaging Resource, one of the most popular and trusted websites for camera and photography equipment reviews on the web. The company, which also owns Outdoor Photographer, Digital Photo Pro, Digital Photo and HDVideoPro, says the new acquisition will operate under its existing brand, and founder Dave Etchells will maintain an ongoing presence within the company as editor emeritus.
“We’re excited to welcome the engaged consumer base and advertisers that trust Imaging Resource,” said Madavor Chief Operating Officer Courtney Whitaker. “We believe the website’s in-depth expertise on cameras and other products will be a perfect complement to our other photo content.”
“Part of my confidence in transferring the IR brand to Madavor is the level of quality they’ve consistently supported in all of their existing photo publications,” said IR founder Dave Etchells. “I was also struck by the excellent strategic fit between IR and Madavor’s existing respected photo publications and websites.”
For more, see the press release below. You can also subscribe to the free Imaging Resource newsletter.
Madavor Media Acquires Imaging Resource
Madavor Media, LLC announced today that it has acquired Imaging Resource, one of the most popular and trusted websites for camera and photography equipment reviews. The 22-year-old website, which has millions of loyal followers, will continue to operate utilizing its experienced staff and contributors.
The acquisition by Madavor will enable Imaging Resource to seamlessly continue its mission to provide the most comprehensive, independent news and reviews in the photography business.
Imaging Resource will operate under its existing brand, and founder Dave Etchells will maintain an ongoing presence within the company as editor emeritus. The camera review website will now join Outdoor Photographer, Digital Photo Pro, Digital Photo and HDVideoPro as part of the photography portfolio at Madavor and will enhance the company’s position as the dominant media company to connect with a wide range of photographers and videographers.
“We’re excited to welcome the engaged consumer base and advertisers that trust Imaging Resource and we believe the website’s in-depth expertise on cameras and other products will be a perfect complement to our other photo content,” said Madavor Chief Operating Officer Courtney Whitaker. “The opportunities for advertisers to achieve their marketing objectives will increase across all of our photo titles as well as some of our other publications such as BirdWatching and Plane & Pilot. We also anticipate the cross-promotion of content will be very beneficial to our readers, especially our newsletter subscribers.”
Imaging Resource was started in 1998 and has provided in-depth coverage, testing and reviews on new technology since that time from its Georgia headquarters. The company will continue to be based in Georgia, giving Madavor Media a new location as part of its nationwide operations.
“Part of my confidence in transferring the IR brand to Madavor is the level of quality they’ve consistently supported in all of their existing photo publications,” Dave Etchells said. “I was also struck by the excellent strategic fit between IR and Madavor’s existing respected photo publications and websites. Their loyal and passionate audiences enjoy timely award-winning content, which tends to be more focused on the art of photography than the gear. This is an element that I’ve always felt was missing from IR, but that we never had the resources to fill.”
About Madavor Media, LLC
Founded in 2004, Madavor Media develops and markets content for consumers who are passionate about their interests and those seeking highly informative editorial that helps them take charge of their well-being and live happier, healthier lives. Through its team of experts, Madavor delivers highly engaging, world-class content that is disseminated and consumed through virtually all channels. Headquartered in Braintree, Massachusetts, Madavor’s main office acts as the hub for social media, marketing, design and production, operations and information, while its associates throughout the country keep in close contact with its customers and communities of interest.
Contact: Tim Doolan, Social Media & Marketing Manager, (617) 279-0190, email: firstname.lastname@example.org
The post Madavor Media Buys Camera Review Website Imaging Resource appeared first on HD Video Pro.
On the second day of the Hollywood Professional Association Tech Retreat’s Supersession, several scenes were shot for a short film. It demonstrated several technologies now being used on set and in post-production. Last time, I wrote about in-camera effects, but there was a lot more. Maybe not as flashy but still important.
Digital dailies and extensive use of the cloud enabled access to footage by various artists. Interestingly, they ended up shooting about 1.5 TB of footage for this 11-minute film. By the end of the project, that had grown to about 12 TB of data.
With any discussion about using the cloud (or moving footage by any means), security is an issue, and that was evident during the tech retreat. One technology discussed was securing the media at the source: rather than relying on security methods that vary from facility to facility, the data is encrypted at the beginning. To put it another way, the data protects itself.
Part of the data being protected is metadata. During the HPA Supersession shoot, lens data—including lens distortion and shading—was captured on set. During visual effects work, a shot’s lens information could be used to remove the distortion and shading prior to compositing, then later to reapply it to create a composite that matches the look and feel of non-composited shots.
ACES, the Academy Color Encoding System, was used as the way of managing color throughout the workflow. The open system uses input and output transforms to allow a standard interchange of footage throughout the post workflow. This approach eliminates the dozens of different formats typically coming from a variety of cameras and software.
Important to any management of color during workflow is monitor calibration. This was evident at HPA as the on-set monitors and projection were calibrated. Any talk of displays these days will, of course, bring up the topic of high dynamic range (HDR).
One side note that I picked up during a breakfast session was that some productions may not have an HDR display on set. Not only are they expensive, but some cinematographers might not want to look at them. Instead, those cinematographers’ experience at capture—both lighting and exposure—guides them to the results they want, HDR or SDR.
Getting the right results during finishing brought up another rather new technology and new position: motion grader. More about that next time.
Earlier today, the National Association of Broadcasters confirmed the news that it is canceling the NAB 2020 trade show, which was to be held in Las Vegas from April 19 through April 22. According to NAB President and CEO Gordon Smith, the primary reason for shuttering the show was coronavirus concerns:
As you know, we have been carefully monitoring coronavirus developments both domestically and globally over the past few weeks.
In the interest of addressing the health and safety concerns of our stakeholders and in consultation with partners throughout the media and entertainment industry, we have decided not to move forward with NAB Show in April. We are currently considering a number of potential alternatives to create the best possible experience for our community.
Smith also noted that “keeping the community safe and healthy is NAB’s highest priority; therefore, we are deferring to the developing consensus from public health authorities on the challenges posed by coronavirus.”
We’ll continue to follow any future developments, including new product announcements. We’ll also let you know about possible alternatives to the canceled trade show. For more news on NAB 2020, click here.
The post NAB 2020 Trade Show Canceled Due To Coronavirus Fears appeared first on HD Video Pro.
Last time, I talked about an annual meeting in Palm Springs, California, where Hollywood minds discuss all things about content creation. The second day of the Hollywood Professional Association Tech Retreat is referred to as the Supersession. This year it certainly earned that title.
Rather than scheduling speaker after speaker to talk about various technologies, HPA made a short film. It was a unique way to cover the topic of using the cloud for post-production while also creating content—live!—in front of the attendees.
Three scenes of the short film were shot throughout the day, and a drone sequence was shot out in the desert simultaneous to the morning presentations. The film was directed by Steven Shaw, DGA, ASC with Roy Wagner, ASC as cinematographer.
Prior to the retreat, the bulk of the scenes for the 11-minute film were shot at various locations back in Los Angeles. All the footage was uploaded to Amazon S3 storage, allowing access to scenes by various post-production service companies. At the end of the day, the film—with full-color grading, visual effects and sound design—was screened for attendees.
The day showcased a dizzying array of technologies. In-camera effects provided one of the first examples of cutting-edge tech. An interior train scene had been set up on the presentation stage. The action appeared to take place in the evening, and the view out the train windows was produced using LED panels playing back footage previously shot by multiple cameras.
But here’s where it gets real—literally. The footage wasn’t just played back, it was rendered out for playback by a real-time rendering engine that was tied to tracking hardware on the primary shooting camera. As the camera moved to cover the action, the scene outside the windows changed perspective just as it would if you were on an actual train.
A similar scene—shot previously—involved a group sitting around a picnic table in the woods. Although it was also shot with LED panels, this time the panels provided the entire background. With this technology, the background can be organic or synthetic. Using synthetic backgrounds allows for quick modifications to the scenery. This tech was used for scenes in “Star Wars: The Mandalorian.”
This shooting method has multiple benefits. One is that the actors don’t just walk around a green screen. They can actually see and respond to the environment. Another is that the LED panels can also be used as light sources, simply by adding a white shape in any area not in the shot.
On the other hand, there was a drawback to this method of shooting. Some actors may get a little motion sick as the background moves to track the camera. To help with that problem, only the part of the background that the camera sees is moved.
In addition to in-camera effects, the day was filled with other technologies—next time.
This Tiffen Variable ND filter is one of the most commonly used ND filters for mirrorless hybrid and DSLR shooters.
Neutral-density filtration. Why would we want to discuss something as boring as ND filters? Simple: because the times, they are a-changin’, and along with them, how we think about ND filters is changing, too. Depending on which type of camera you predominantly shoot with, your view of neutral-density filters may differ. If you shoot with a mid-level camera, something like a Blackmagic URSA Mini Pro G2, Canon C200 or Panasonic EVA1, neutral-density filtration is built into your camera in the form of a filter wheel.
The equation is really pretty simple: we use ND filters to reduce the amount of light reaching the sensor, allowing us to operate our video and digital cinema cameras at a more typical shutter speed, often at a 180-degree shutter angle. If you set your camera to record at 24p (really 23.976 fps), following the 180-degree shutter rule results in a 1/48th-of-a-second shutter, right? You can extrapolate that rule to any other frame rate you shoot: setting the shutter speed’s denominator to double the frame rate gives you a shutter angle or speed that’s complementary to the typical look of shooting with a film camera.
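As a worked example of that rule (a hypothetical helper of my own, not tied to any particular camera's firmware), the relationship between frame rate, shutter angle and shutter open time can be sketched as:

```python
def shutter_time(fps: float, shutter_angle: float = 180.0) -> float:
    """Shutter open time in seconds for a given frame rate and shutter
    angle. A 180-degree shutter is open for half of each frame interval."""
    frame_interval = 1.0 / fps                    # seconds per frame
    return frame_interval * (shutter_angle / 360.0)

# 24p at a 180-degree shutter gives the classic 1/48-second exposure.
print(round(1.0 / shutter_time(24)))    # 48
# The rule extrapolates: 60p at 180 degrees gives 1/120 second.
print(round(1.0 / shutter_time(60)))    # 120
```

The same function also covers the "broken rule" cases discussed below: a wider angle (say, 270 degrees) keeps the shutter open longer for a smearier look, while a narrower one (45 or 90 degrees) produces a crisper, jittery image.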
Of course, rules are made to be broken, and I often do. It can be fun to undercrank a camera’s frame rate (wow, undercrank, a word that harks all the way back to the silent film era’s hand-cranked cameras) and let the shutter stay open longer, resulting in a smeary, blurry look that’s perfect for a drug- or alcohol-induced haze on a character’s POV shot. Conversely, you can decrease the shutter’s exposure by speeding it up, giving you that jittery “Saving Private Ryan” beach look, which has its own surreal visual signature.
Typically, though, we try to stick to the 180-degree rule for most shots on most projects, unless something “different” or “edgy” is called for in the script or concept. If you have a background in photography, you know that the relatively slow shutter speeds we use, compared to the shutter speeds still photographers use, result in a LOT more light reaching our sensor. If we’re attempting to keep the shutter speed in the 180-degree neighborhood, that leaves only two other in-camera ways to reduce exposure: reducing the lens aperture or reducing the camera’s ISO or gain (or ASA if you’re still shooting film stock).
The other way to reduce the amount of light hitting our sensor or film stock is via ND filters. Neutral density only exists to reduce the light hitting the imager, hopefully without coloring or otherwise adversely affecting the image, hence the “neutral” in the name.
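The strength of those “neutral” filters follows a simple logarithmic relationship. As a hedged sketch (this is standard ND arithmetic, not specific to any product mentioned here), converting between a filter’s optical density, its stop reduction and the light it passes looks like this:

```python
import math

def nd_density_to_stops(optical_density: float) -> float:
    """Stops of light reduction for an ND filter's optical density.
    Each 0.3 of density cuts the light roughly in half (one stop)."""
    return optical_density / math.log10(2)

def stops_to_transmittance(stops: float) -> float:
    """Fraction of light the filter passes for a given stop reduction."""
    return 2.0 ** -stops

# A common ND 1.8 filter is about 6 stops...
print(round(nd_density_to_stops(1.8), 2))   # 5.98
# ...passing roughly 1/64 of the incoming light.
print(stops_to_transmittance(6))            # 0.015625
```

This is why ND filters are labeled two ways on spec sheets: ND 0.6 and “2-stop” describe the same piece of glass.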
If you shoot with a removable-lens mirrorless/DSLR camera or a high-end digital cinema camera like a RED or ARRI Alexa, none of these cameras offers internal ND. You have to buy external ND filters and affix them to the lens, either a circular ND filter fastened to the front of the lens via threads or rectangular glass filters inserted into the slots of a matte box. There are a few specialty lenses whose front element protrudes too far for a filter to be physically mounted in front of the lens. Some lenses of this type have a small slot at the rear, where the diameter of the lens barrel is smaller, so that a small ND or other filter can be inserted into the lens barrel, providing the same light-reducing effect as a larger filter at the front.
If you think about it, ND systems are among the more primitive systems on a modern digital cinema camera. It’s either an external piece of darkened glass affixed to the front or rear of the lens, or a set of small ND glass filters in a wheel inside the camera that rotates, typically on a turret, to place the selected ND strength in front of the sensor. Both are primitive compared to the rest of the electronics in the camera, which are some of the most sophisticated electronic systems outside of military hardware and aviation. Why is it 2020, and why have barely any companies moved beyond either of these mechanical paradigms for reducing the light that reaches a sensor?
You may or may not be familiar with the Z CAM lineup of digital cinema cameras. Z CAM is a Chinese company that has only been on the scene for a relatively short time, yet its popularity has grown quickly because its camera systems offer professional cinema camera features at a small fraction of the cost of a traditional cinema camera from Canon, Sony or Panasonic. Z CAM offers several camera models with Micro Four Thirds, S35 and full-frame sensors, and it just introduced an optional accessory for its cameras called the Z CAM E-ND Filter.
I believe this new electronic filter is very innovative; it offers an ND range of 1.7 to 6.7 stops of filtration in 1/3-stop increments. The advantage of electronic ND is that it allows much more precise control over ND strength than traditional internal NDs, which typically step only in 2-stop increments. The E-ND sells for $399 and is easily integrated into the Z CAM lens mount with just two screws. The electronic ND filter will be implemented in all of the Z CAM E2 flagship models—E2, S6, F6 and F8—with EF or PL mounts. Because the system is electronic, ND ramping in 1/3-stop increments is smooth and seamless, easily controlled in the Z CAM menu. The addition of this accessory immediately elevates the Z CAM lineup’s image and feature set, strengthening the case for considering a Z CAM over the competition.
Sony has implemented an internal electronic variable ND system on several cameras to date: the PXW-FS5/FS5 II, PXW-FS7 II, PXW-Z190/Z280 and the PXW-FX9. Unlike the mechanical filter wheels that its competitors use (and that even Sony uses on its top-of-the-line Venice camera), these models utilize an infinitely variable electronic ND that can easily be adjusted via an access wheel at the front of the camera.
The variable ND can also be set to auto ND to track a particular exposure value, allowing the user to set ISO, aperture and shutter speed, keep them constant, and have the camera ramp the ND filter to hold that exposure through radical lighting changes. It also enables in-camera time-lapse recording, with the camera seamlessly changing ND values as exposure shifts on night-to-day and day-to-night shots, etc. I had a chance to use the electronic variable ND on the FX9 on several recent shoots, and I walked away impressed at the system’s versatility and ease of use. It was painful to go back to my traditional camera with its fixed internal 2-stop ND filter wheel afterward.
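Sony hasn't published its auto-ND algorithm, but the exposure arithmetic underneath the feature is straightforward. This hypothetical sketch (my own helper, not Sony code) computes how many stops of ND are needed to hold fixed camera settings as scene brightness changes:

```python
import math

def required_nd_stops(scene_ev: float, aperture: float,
                      shutter_time: float, iso: float) -> float:
    """Stops of ND needed so fixed aperture/shutter/ISO still expose
    correctly for a scene metered at `scene_ev` (EV at ISO 100).

    The settings expose correctly for:
        settings_ev = log2(aperture^2 / shutter_time) - log2(iso / 100)
    Any scene brighter than that must be knocked down with ND."""
    settings_ev = math.log2(aperture ** 2 / shutter_time) - math.log2(iso / 100)
    return max(0.0, scene_ev - settings_ev)

# f/4 at 1/48 s and ISO 800 exposes correctly around EV 6.6, so a
# bright EV-15 exterior needs roughly 8.4 stops of ND to hold it.
print(round(required_nd_stops(15, 4.0, 1 / 48, 800), 1))   # 8.4
```

An auto-ND system effectively re-runs this calculation continuously as the meter reading changes, which is what lets a day-to-night time-lapse ride a single set of creative exposure settings.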
I think that, much like autofocus technology, which just a few short years ago was unthinkable on a professional camera, electronic variable ND is the future for all cameras. When competitors to Z CAM and Sony see the buyer feedback those companies receive about electronic variable ND, they’ll begin to see that having it on a camera is a competitive advantage.
That should lead them to implement their own versions on their own camera lines, and the end benefit for all of us is that these kinds of features get out of the way of creativity and automate functions we once had to manage ourselves. Once you’ve used this technology, you stop noticing exactly how many stops of ND you’ve dialed in; you just check your exposure measurement tools (waveform monitor for me!) and dial it in until you achieve perfect exposure. In the end, these features give you more control over your exposure value, shutter speed, ISO and aperture. ND becomes a fourth variable for adjusting exposure, which is, in itself, a radical idea.
The new Fujifilm X-T4 Mirrorless Hybrid is the latest offering from Fujifilm.
As you’ve seen if you’re tapped into camera/production social media and its resulting blogosphere, Fujifilm recently announced the successor to the Fujifilm X-T3, one of the most popular cameras it has ever produced. We shot a project for a Los Angeles NPR station two years ago and had a chance to use our client’s Fujifilm X-T2 as a gimbal and B camera, gathering various moving b-roll footage around the station as we shot interviews with our A camera in their on-air studio. Overall, I found the X-T2, despite some glaring video omissions, to be a pleasant camera to use, with some nice results.
I then covered the launch of the Fujifilm X-H1, a more video-centric model with in-body image stabilization (IBIS) and a few more video features. After using the X-H1 a bit and speaking with the engineering team that was over from Japan at Fujifilm’s offices in Los Angeles, I knew that Fujifilm was close to introducing a camera that, while primarily a still camera, would have enough solid video features to be useful for me.
A bit of background: I owned the Panasonic GH4, our first 4K camera, but I found the Micro Four Thirds imager too noisy for my shooting style, and I found the skin tones lacking, with a pastel quality caused by the noise reduction the GH4 applied. I also shot with my producing partner’s Sony A7 II on a few shoots, and while I found its high-ISO ability useful for low-light shooting (we shot some footage in a dark nightclub for a documentary where we couldn’t light the shots), its constant overheating when shooting 4K and its color science didn’t appeal to me.
I have a DSLR, the Canon EOS 80D, which wasn’t a bad video camera, but it only shot 1080, and I found the footage marginal when any kind of grading or even mild color correction was applied. As we headed into production on a docu-series we wanted to produce in 4K, I was on the hunt for a 4K-capable mirrorless hybrid. I really liked the Panasonic GH5; it was a big improvement over the GH4, but its autofocus wasn’t very good, even though the rest of its features were very appealing for video shooting.
I was planning on using whichever mirrorless hybrid we ended up with primarily on a gimbal and as a handheld, in a cage mount for shooting in cars, on small boats in the ocean or in other locations where bringing in our A cameras, the Canon C300 MKII and the C200, fully rigged, would be too conspicuous.
When Fujifilm introduced the X-T3 in late 2018, I knew it could be a good contender to serve as the gimbal and B camera for our docu-series. Fujifilm had improved on the X-T2 and X-H1 video capabilities, with the exception that the X-T3 lacked IBIS. But it had improved autofocus, the ability to use AF while shooting 4K, great color science, a very detailed and good-looking sensor called the X-Trans 4, and not only the ability to shoot F-Log, Fujifilm’s log format, but also the ability to shoot using Fujifilm’s film simulation presets.
I knew that F-Log would generally yield the most dynamic range, but I had seen some YouTube clips shot using the X-T3’s film presets that I thought looked very good, too. With the X-T3, Fujifilm introduced a new film simulation called Eterna that looked to be a great starting spot for light grading and color correction. The other intriguing thing was that the X-T3 shot 10-bit H.265. A 10-bit depth has gone from being considered exotic to standard fare in mirrorless camera video, but at the time of the X-T3’s introduction, 10-bit 4K wasn’t common. The ability to shoot at up to 400 Mbps made competing cameras’ 4K data rates (100 Mbps on all Sony A7 variants!) look weak and inadequate for post-production.
Since I haven’t yet had a Fujifilm X-T4 made available to me for review, I can’t verify a few small details, but looking at Fujifilm’s specifications, press photos, YouTube videos and the like, I can surmise what I believe the X-T4 to be. Cutting to the chase: the X-T4 is basically an X-T3 with a slightly larger body and the same sensor and video specs, save that it can now shoot 240 fps in FHD versus 120 fps on the X-T3. Fujifilm added a new ETERNA Bleach Bypass film simulation, a flippy screen for vloggers, a significantly larger battery and IBIS. There has been much consternation that the X-T4 also loses the X-T3’s 3.5mm headphone jack, which was replaced with a USB-C dongle.
Here are the main points that I found lacking in the X-T3.
The headline feature of the X-T4 is IBIS. The X-H1 had IBIS, but in a physically larger body than either the X-T3 or the X-T4. Fujifilm implemented magnetic IBIS, which, according to preliminary tests, works pretty well. That said, most (but not all) pros utilize a gimbal, motion-control slider or a Steadicam-like device to fluidly move the camera. IBIS seems to be more of a hobbyist feature, but it can be useful in certain situations, taming the micro jitter that’s painfully apparent especially when shooting 4K. I’ve tried shooting handheld with our X-T3 without the accompanying cage, monitor, microphone and external battery system that together make our X-T3 handheld rig weigh about 6 to 7 pounds, depending on the lens. Trying to shoot handheld with the X-T3, even with a wide-angle lens, results in a lot of micro jitter that the IBIS in the X-T4 should tame.
A feature that has been used to hook a lot of still shooters coming into the world of mirrorless hybrids is the full-frame sensor. I debate even including this point, but all of the Fujifilm X series bodies use an S35 sensor. If you shoot a lot of low light and need high gain without as much grain, FF sensors are superior in low light. That said, the Fujifilm X-T3 does well up to about ISO 2,500, which is plenty of gain for all but the darkest situations. Since the X-T4 uses the same sensor, it’s fair to say the ISO performance is probably roughly the same as the X-T3. Most but not all video/digital cinema pros are able to light the majority of their scenes, but if you shoot weddings, events or constantly shoot in other situations where you want or need to shoot at ISO 12,500 or higher, do yourself a favor and buy an FF camera.
Moving on from sensor size, the X-T4 appears to be more of a good thing and one of the most interesting mirrorless hybrids out there. The value equation is still excellent, with the X-T4 body retailing for $1,699 in the United States. IBIS was easily the most requested feature at all of the Fujifilm Summits and in feedback from Fuji user groups. The second most requested feature was a flippy screen versus the tilt screen on the X-T3. The additional battery horsepower is much welcomed, although until we get a hands-on review unit, it’s hard to say what the recording times will be.
Overall, if you’re buying your first mirrorless hybrid, the X-T4 appears to be an across-the-board winner, with a great value equation and features for $1,699. If you’re obsessed with shooting in the dark, look elsewhere for a full-frame camera. If you own the X-T3, the real question is, is it money well spent to sell off your X-T3 and upgrade to the X-T4?
For us, the answer is no. We have tamed the short battery record times of the X-T3 with the external battery grip that adds two more batteries to the internal X-T3 battery. We power the X-T3 from the DC output of our gimbal, so short battery times aren’t a factor anymore. Same with IBIS: we have the Zhiyun Crane 2 gimbal, so we don’t really need it. We don’t need the flippy screen because (thankfully) we don’t vlog. Objectively, the X-T4 is an iterative upgrade, but it’s an upgrade of an already very good camera, one that probably edges into great territory for pro video/digital cinema shooters.
The post The Evolution Of The Mirrorless Hybrid Camera With The Fujifilm X-T4 appeared first on HD Video Pro.
Blackmagic Design introduced the Blackmagic URSA Mini Pro 4.6K G2 in March of 2019, which was a significant update from the original URSA Mini Pro, which was announced two years earlier, in March 2017.
Here’s a brief description of it from Blackmagic Design’s website: “The URSA Mini Pro 4.6K G2 is a next-generation digital film camera with updated electronics and a high-performance 4.6K HDR image sensor for shooting at up to 300 frames per second. You get a Super 35mm 4.6K sensor with 15 stops of dynamic range, built-in optical ND filters, interchangeable EF lens mount that can be swapped for optional PL, B4 or F mounts, Blackmagic RAW and ProRes recording to dual CFast or dual SD cards, and an innovative USB-C expansion port for recording directly to external disks. In addition, URSA Mini Pro 4.6K G2 features a massive set of external broadcast style controls, backlit status display, foldout touchscreen monitor and more!”
The UMP G2 includes additional features as well: a variety of shooting resolutions, including 4.6K (4608 x 2592), 4K DCI, UHD, 3K Anamorphic, 2K DCI and 1080, and high-speed frame rates, including Blackmagic RAW (8:1, 4.6K full, up to 120 fps), UHD windowed (up to 150 fps) and HD windowed (up to 300 fps). The camera can also shoot ProRes 422, up to the HQ codec. It has a built-in four-position ND filter wheel, with clear, two-stop, four-stop and six-stop positions, and offers autofocus (when using compatible lenses), iris control, a 4-inch capacitive LCD touchscreen, a highly accurate timecode clock, a 12G-SDI output and dual XLR audio inputs.
Plus, on the main products page of the website, the new model is described as “three cameras in one” because, the site says, it combines high-end digital film quality with the features and controls live broadcasters want. And just below that, the website refers to the camera as also being “ideal for any kind of work from high-end feature films, television shows and commercials, to independent films, broadcast news, and even studio and live production.”
After doing my research, I believed that Blackmagic Design was positioning the UMP G2 as an all-around professional digital camera. But that was just an educated guess—to get a better sense of this camera, I’d need to do two more things: Do some hands-on tests with the camera and look at pricing. But to do the latter, I’d need to consider how I wanted my UMP G2 configured for the work I do.
Let me break down the camera configurations that I received and give you a realistic picture of how you’d probably want to configure the camera if you were going to buy it:
So, the UMP G2 with all these extras will actually cost $8,709, about $3,000 more than the camera-body price.
Now, the specs and list of features are impressive, but to get a sense of what the UMP G2 can do, I wanted to use it on a project. It’s the best way to find out how it will perform in action.
Luckily, as a producer and cinematographer, I had various projects in the works. So, I decided to put the UMP G2 through its paces by taking it on a client-paid shoot to determine how good the new camera is. My reasoning? It’s simple: Although it’s always tempting to talk about numbers and specs when it comes to cinema gear, there’s nothing like taking a brand-new camera out in the field—under pressure and in challenging conditions—to see how it performs. And where it belongs in the world of cine camera gear.
Now, I didn’t have a chance to use it on a feature film or on broadcast news. But I did use it on a non-profit project: The event we covered with the UMP G2 was a paddleboard race, called the Catalina Crossing. During this event, we filmed interviews and shot b-roll of the event. We captured top paddleboard racers as they crossed the finish line in Manhattan Beach, California. All in all, I felt the shoot offered challenging conditions on every front—requiring handheld footage, shots from tripods with long-lens shooting in extremely bright, harsh sunlight conditions with wind and ocean noise as well as tons of people and lots of white, reflective sand.
I received the review unit from Blackmagic a few days before the shoot, which meant I was able to rig it up for shooting and try a few quick shots around the office before packing it up for the big shoot. My first impressions were good: The camera seemed to be very well built. But the UMP G2 isn’t a light camera. Just consider how much the camera and various accessories weigh:
All in all, the total camera package weighed almost 15 lbs. (Although, to be fair, my Canon C200 setup weighs just a couple of pounds less.) What’s important to note, though, is that weight in cine cameras has its advantages and disadvantages:
During our shoot, we found a lot to like about the camera.
For instance, the EVF is excellent. It’s super bright and clear, with good color. For $1,500, it looks as good as other EVFs that cost a lot more. The menus are clear, simple and easy to navigate. We also found switching between regular shooting and slow motion was a breeze.
But the camera had shortcomings as well. For example, there’s no waveform monitor, only a histogram, and the record button on the handgrip is difficult to find by feel. The XLR inputs also face up instead of being placed at the back of the camera. And I found the autofocus is pretty limited for moving subjects: It’s slow and takes a long time to lock onto the subject.
I also had a hard time balancing the rig, especially using my Canon EF 70-200mm f/2.8 IS II lens. The shoulder pad moves a little to let you shift the balance point, but it doesn’t move enough. Plus, you have to unscrew two screws each time you want to move it, which isn’t convenient when jumping back and forth between lighter and heavier lenses.
We shot with Blackmagic RAW for a few clips just to test it out. We then switched to ProRes HQ, since this footage was supposed to slot in with a lot of other footage that had already been shot and was being edited in FCP X as ProRes. We also shot footage at 120 fps and even higher frame rates. It’s impressive that this camera can shoot at up to 300 fps!
Overall, the footage looked very good. As expected, the Blackmagic RAW images in Resolve seemed to yield better sharpness and slightly lower noise than similar ProRes HQ clips in FCP X. We shot Blackmagic RAW at 5:1, 8:1 and 12:1 compression ratios and, predictably, in really bright sunlight, there weren’t a lot of recognizable artifacts or compression signatures we could see. Rolling shutter wasn’t much of an issue: We put the camera on a tripod, panned back and forth with some light poles on the pier, and saw very little skewing or Jell-O vision.
Our footage was taken on a very bright, sunny day, which meant we didn’t have a chance to increase the ISO to see how it performed at higher gain settings. But we tried it out back at our office.
The URSA Mini Pro G2 doesn’t have the dual-native ISO of the Pocket 4K and 6K cameras. I was comfortable shooting the camera at a maximum of ISO 1600. (In comparison, we’ve shot the Canon EOS C200 several times at ISO 2500 and even 3200 before seeing objectionable noise creep into the image.)
After using the UMP G2 on our project and taking it through its paces, here’s my general assessment of what I found:
Performance: If you use a fast lens, set the UMP G2 at ISO 1600 and light your scenes or shoot in well-lit environments, your footage should look quite good. But if you regularly shoot in very low-light environments, the UMP G2 may not be the best choice for you.
I’m always torn by cameras like this because to make good-looking images, you need to light them, period. And the G2 responded beautifully to the good lighting we used in our project. But I also shoot a lot of documentaries, which are often shot in poorly or even atrociously lit scenarios or in scenarios where there’s only a little light. So, for us, lighting the scene up just isn’t an option.
Value: Whenever I test a cine camera, I ask myself if it represents a good value. I also have to ask how this model might help me in marketing my work. And since I work mostly in the Los Angeles market and for my work-for-hire as a director of photography, I always have to face facts: Trying to sell an ad agency, a PR or marketing firm or the studios on the idea that you’re going to shoot with a Blackmagic camera can be an uphill climb. That’s because Blackmagic Design cameras aren’t nearly as well known to clients as RED, ARRI, Sony, Panasonic or Canon cameras.
On the other hand, some clients don’t care what you shoot with (which was the case in this instance) just as long as the results are professional and the images look and sound good, which they did. But many clients—PR, marketing, corporate and the studios—do care which camera you use.
Still, it’s just one of several factors to consider.
One of the most important, of course, is price. The UMP G2, configured the way I need it to be configured, costs close to $9,000 with all of the accessories I’d need and want to make it work for my shoots. However, I do have alternatives. Most other camera manufacturers in this segment offer cameras in this general price range.
Consider the following: Panasonic offers the VariCam LT for just a little more and the AU-EVA1 for a little less. Sony offers the PXW-FS7, PXW-FS7 MKII and soon the upcoming PXW-FX9, for a couple of thousand dollars more. Canon offers the C200 at roughly $500 more for the basic camera package, except the EOS C200 includes the top handle, handgrip and a few other accessories that are extra on the URSA Mini Pro G2. Canon is still selling the EOS C300 MKII, as well, for just a little more than the URSA Mini Pro G2.
So, if you want a camera in this price range, you do have quite a few options. My point is, there are a lot of candidates for professional digital cinema cameras in the $6,000-to-$10,000 space.
But Blackmagic does have one ace in the hole: The URSA Mini Pro G2 has the ability, with the right Blackmagic accessories, to transform into a live event or broadcast camera, something that none of its competitors are designed to do.
The URSA Mini Pro G2 was designed with this in mind. So if having an extremely versatile camera is important to your work, then the G2 is worth considering.
Overall, we were impressed with the URSA Mini Pro G2 since it does a lot of things exceedingly well, from its superb slow-motion capability to a menu system that’s a joy to navigate.
There is an additional way the UMP G2 is unlike its competition: Blackmagic Design makes one of the best software suites for editing your URSA Mini Pro G2 footage, DaVinci Resolve 16, and the company even gives you a free copy of the Studio version with your purchase of the camera. No other camera manufacturer makes a first-rate editing program to go along with its own lineup of pro digital cinema cameras.
If the limitations mentioned can be worked around in your workflow, the URSA Mini Pro G2 offers excellent value, robust build quality and a great image. It’s a cine camera I’d consider as seriously as any of its competition.
The post Hands-On Review: Blackmagic Design’s URSA Mini Pro G2 appeared first on HD Video Pro.
Livestream Studio is a fully featured live streaming package that’s popular in the streaming world.
How often do you try something new in your work? I mean really new, like something you’ve never done before? Or what about something you haven’t done for a long time? Recently, a friend who I’ve known for quite a few years approached me about helping him with a project he had going on. He’s in the live streaming business. I’m not in the live streaming business. At all. As in, I’ve never participated in a live stream, except as a camera operator. Last year, I was the lead camera operator for the Facebook Live Stream for the NAACP Image Awards Red Carpet.
The ceremony took place in Hollywood, at the Dolby Theatre on Hollywood Boulevard in the Hollywood and Highland complex, directly across the street from the Jimmy Kimmel Live theater. It’s nice that the city frequently allows Hollywood Boulevard to be shut down for various live events, red carpets, concerts and the like, including this one.
From a camera operator’s perspective, the shoot was fairly straightforward live television, except that instead of being uplinked to a broadcast antenna, our Facebook Live stream was routed via a hard-wired Gigabit Ethernet connection at the Hollywood and Highland Center. My job was to be the lead camera operator, shooting our talent as they interviewed the celebrities and actors coming down the red carpet as they arrived at the ceremony. We had five switched cameras set up. One notable difference: Live television is most often shot with 2/3-inch three-CCD broadcast cameras, similar to those used for live broadcasts of most entertainment and sporting events.
Streaming is a whole other ball game, so to speak. Keep in mind that streaming, while growing monthly in viewership, is still playing little brother to the big network live broadcasts. BET was airing its coverage of the ceremony and awards live, while our little “broadcast” was mainly about the red carpet arrivals, wardrobe, guests and interviews before the ceremony—a “pre-game” show, if you will. I try to keep an open mind with production, which is how I’ve come to have a lot of different skill sets in my toolkit. I have shot live broadcast television before, although I must confess, it’s been quite a while since I worked in live television. When my friend offered to hire me as a camera operator for this live stream, he knew that I don’t shoot a lot of ongoing live television events, but he did know that I had live TV experience.
In case you don’t have live television experience, it’s very different than shooting events, documentary, narrative, music videos or commercials. You only get one shot to get things right, and if you make mistakes, they go out live to your audience, which never looks professional and doesn’t motivate your client to call you when the next live gig needs crew. I have found the main thing when operating for live television is that a good director, technical director and stage manager, if the event has one, are key to your performance as a camera operator.
You receive your camera direction from the director in your headset. A calm, cool and collected director is always a good fit for live television. Things can get stressful and heated, and a good live director can keep everyone on the team motivated, creative and on top of their game. Fortunately, we had a good director for the Facebook Live event, and the shoot came off pretty well. We had some stressful moments, but fortunately for us, they mostly happened in the build-up to going live, not during the live stream. Our entire sidewalk studio lost power for about an hour during setup and lighting. The entire red carpet walkway had been lined with 2,500-watt HMI lights, and the electrical load apparently hadn’t been coordinated with the rest of the power needs on the circuits everyone was using. It took a while to figure out how to solve the power distribution problem.
Besides a director, a technical director, five camera ops and a sound mixer, we had a few miscellaneous crew, including someone whose main job was to monitor the Ethernet hookup and how the stream was playing out to end users. In live streaming, a lot rides on the computer, the Ethernet connection and the service provider on the streaming end of things. Our shoot, for the most part, went well, with both the technology and our on-camera hosts. Keep in mind that for this type of shoot, there was a tech rehearsal to make sure the gear was functional, but there was no way to rehearse what our hosts would be covering because that all depended on who was coming down the red carpet next.
The NAACP Awards live stream was about a year ago. Overall, the event and our live stream went well and the clients were happy with the end result. I didn’t hear back from my friend who has the live streaming company for a few months until he recently called and asked if I could once again help him out with some live streaming work. This situation was very different. Instead of covering a red carpet for Facebook Live, we’d be covering a building dedication ceremony for a university. The client was expecting an audience of about 1,000 attendees, students, professors and politicians. Because the new facility could only seat about 700 people, the client wanted to use our stream for IMAG in an adjacent overflow building for those who couldn’t be seated in the main building. The client was also expecting possibly a few hundred other viewers from the East Coast who couldn’t be in California for the building dedication and would want to watch the live stream of the speeches and ceremony.
My friend and the rest of his company were booked to cover a boxing match in Las Vegas the same day as he was booked to cover this building dedication. I was asked if, instead of camera operating, I’d feel comfortable directing the live stream. In this case, because the crew was small, not only would I direct the coverage, I’d also serve as technical director.
Usually in live television, the director and the technical director sit side by side in the control room. The director directs the cameras to get the shots they want and, at the same time, directs the technical director sitting next to them to choose the shots, transitions, video roll-ins, graphics and lower thirds that are sent out to air live. Both positions take a keen ability to multitask. I was going to be doing both positions, but our coverage would only have three sources. We’d cover the ceremony with two cameras, and I’d be rolling in title cards from the client. In turn, while all three sources would go out to the live stream, only camera one would go to the venue’s AV people to be displayed on eight 70-inch OLED screens that flanked the auditorium and projected as two large 16-by-9-foot images on projection screens.
Since we only had two camera feeds and one graphics source, the key to coverage would be to make sure that while one camera or graphics source was live to both the stream and the venue’s IMAG (Image Magnification) screens, it would be my job as director to tell the camera operator which shot I wanted next and how to frame it, and to be ready for the next shot. Fortunately, during my time in a television program in college, we had an opportunity to produce a live television show each week, aired on local cable, and each of us in the class rotated through different crew positions. I had a chance to direct, technical direct, stage manage, operate camera and also work with the sound mix and with motion graphics packages that went to air. I felt confident going in that I’d recall exactly how to direct, call shots and live switch smoothly.
Fortunately, we had a load in and build day built into our schedule. I was able to drive to the location with my friend, load in all of the gear and set it up for a tech rehearsal and run-through. Setting up all of the gear was pretty straightforward—running power, audio from the house mixing board to our streaming setup, hook up and test out the cameras—you know, the usual. Where the whole project became possibly more challenging was when my friend began walking me through setting up the streaming. He uses a service called Dacast for custom setup live streaming.
What’s custom setup? Anytime you aren’t going to use one of the standard streaming providers like YouTube, Facebook or Vimeo, you’re going to need your own streaming provider. As he walked me through Dacast and its setup, he also showed me how it was a good idea to not only stream a 1080 30p 5 Mbps stream but also a smaller, lower-resolution stream for mobile viewers. We targeted that stream at a nominal data rate of 750 Kbps, which I was told is acceptable for mobile users. Very cleverly, the streaming server can poll each device to decide which stream to supply. Very handy.
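The device polling described above boils down to handing each client the richest rendition its connection can sustain. Here’s a minimal sketch of that selection logic in Python, using the two bitrates we configured. The function and the headroom factor are my own illustration, not Dacast’s actual implementation:

```python
# Hypothetical rendition-selection sketch. The bitrates mirror the two
# streams described in the article; the logic is an illustration of how
# a streaming server might choose, not any provider's real algorithm.

RENDITIONS = [
    {"name": "1080p30", "kbps": 5000},  # main 1080 30p desktop stream
    {"name": "mobile", "kbps": 750},    # low-bitrate stream for mobile viewers
]

def pick_rendition(measured_kbps: float, headroom: float = 1.5) -> str:
    """Return the highest-bitrate rendition the client can sustain,
    leaving headroom so playback doesn't stall on bandwidth dips."""
    for r in sorted(RENDITIONS, key=lambda r: r["kbps"], reverse=True):
        if measured_kbps >= r["kbps"] * headroom:
            return r["name"]
    # Nothing fits comfortably: fall back to the smallest stream.
    return min(RENDITIONS, key=lambda r: r["kbps"])["name"]

print(pick_rendition(12000))  # fast connection -> "1080p30"
print(pick_rendition(1000))   # slow mobile link -> "mobile"
```

The headroom multiplier is the important design choice: serving a 5 Mbps stream to a connection that measures exactly 5 Mbps guarantees buffering, so real players and servers always leave margin.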
The other factor that you don’t think about much if you’ve never done this is bit budgeting. You have to figure out how much bandwidth you need to serve out to how many viewers and then pay for it. I won’t bore you with the mathematics of the equation, but let’s just say I can see why a lot of hardware providers dumped hardware in favor of selling bandwidth for streaming video.
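The core of that bit budget is just multiplication: viewers times bitrate times duration. A quick sketch, with hypothetical viewer counts rather than figures from our actual event, shows why the numbers add up fast:

```python
# Rough bit-budgeting sketch (my example numbers, not the event's actuals):
# total data served = viewers x stream bitrate x duration.

def bandwidth_gb(viewers: int, kbps: int, hours: float) -> float:
    """Total data served, in decimal gigabytes."""
    bits = viewers * kbps * 1000 * hours * 3600  # total bits served
    return bits / 8 / 1e9                        # bits -> bytes -> GB

# Say 300 remote viewers watch the 5 Mbps stream for a 2-hour ceremony,
# plus 200 mobile viewers on the 750 Kbps stream.
total = bandwidth_gb(300, 5000, 2) + bandwidth_gb(200, 750, 2)
print(f"{total:.0f} GB")  # 1485 GB of transfer to budget and pay for
```

Nearly 1.5 TB for one modest two-hour event makes it easy to see why streaming providers sell bandwidth by the terabyte.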
As the clock kept moving toward 5:20 p.m., when I’d hit the streaming button on our Livestream Studio hardware, I had been trying to figure out how I could wear the Sony MDR-7506 headphones to monitor the audio for our live stream while also wearing the Eartec single-sided headset so I could communicate with our camera operator and the AV person who was supplying our motion graphics and sending in house audio. I was having an awkward time trying to wear both sets of headphones when our client came up with a brilliant idea.
The function began with a live choir and orchestra that was reaching typical live music performance sound levels with a lot of dynamic range. Contrast that with the first speaker, the university president, who happened to have an average to even lower volume voice. The sound mixer had to pad and reduce the sound levels being sent to us for the musical performance but raise the audio levels significantly for the speakers. Rather than me trying to ride audio levels for the live stream while also directing and TDing, the AV person suggested that with an iPad, we could locate the sound mixer at our streaming table of gear next to me. That way, they could monitor our streaming audio levels and adjust their new Allen & Heath mixing board levels via the iPad.
The iPad app had very nice VU meters and a clean interface for the sound mixer. This freed me up to concentrate on directing, having our camera operator line up the next shot as I switched between camera two and our title cards, then cutting to camera one, the close-up shot of the speaker at the podium.
It had been quite a few years since I had directed and switched live television, but once the show began, I’m glad to say that it all came flooding back to me. I stayed calm, even had fun with the crew as I lined up shots, switched to them, rolled in video clips and title cards and kept things moving. I had a feed from the video stream displaying on an iPad in front of me, but it was disorienting because the actual “live” stream ended up having about a 45-second delay behind the live events happening in front of me. Streaming, other than the actual encoding, hookup and streaming configuration, is very similar to live TV.
If you get the chance to step outside your normal comfort zone, I recommend you do it when you can. I’m definitely not a highly experienced live TV director, but the client and my friend were both happy with how the live stream worked. The key, I found, the same as it is with all live television coverage, is to stay calm and focused. Anticipate what’s going to happen next and be ready to direct your team to react to what’s happening as quickly and smoothly as possible. We had a show schedule, and like in many live situations, things changed on the fly, resulting in us being “lost” as far as what was next a couple of times. Things will rarely be perfect, but that’s just how things happen in live coverage, whether it’s news, the Oscars or just a little live stream for a local client.
Every February, some of the smartest minds in the content creation world come together and engage in presentations, discussions and debates about wide-ranging topics. They call it the Hollywood Professional Association Tech Retreat.
I try to make time every year to travel to Palm Springs so I can absorb the unvarnished content that attracts current and future leaders of our industry. This year’s 2020 Tech Retreat didn’t disappoint.
Although some of the dialogs really don’t apply to my day-to-day work, I love learning about the industry as a whole. The Tech Retreat showcases the generosity of the industry in sharing information. While it would be impossible to summarize everything I learned, here are a few things I walked away with.
For large broadcast facilities, the realm of coax cable is disappearing. It’s being replaced by IP (Internet Protocol) signal flow—think Ethernet—but on fiber. But what’s interesting is that video black—like what you use for genlocking cameras—is still part of the equation, for now.
In the equation of sports broadcasting, it’s becoming harder to ignore betting. Legal gambling in the professional sports arena (pardon the pun) is a big business.
There was debate as to whether gambler-specific game broadcasts would be a thing. For example, would there be different play-by-plays and color commentaries that call out individual player performances for fantasy players? The thought was that serious/professional gamblers are making their bets before the game, but amateurs might be attracted to in-game proposition betting: “Will they make a first down?”
An interesting surprise to me was that professional tennis is the second biggest betting sport in the world.
Speaking of betting, studios prefer to produce movies that have good odds of making money. So, there was discussion on audience research. One presentation talked about measuring carbon dioxide (CO2) in a test viewing room. The CO2 created by audience members’ exhaling seems to indicate their emotion.
Research wasn’t limited to audiences, however. Another talk showcased data mining a database of existing films to determine the “emotional tonality of every single character and every single line in the script.” Couple that with the ability to determine, by every zip code in America, the dislikes and likes of TV shows, and you can see that machine learning is becoming capable of being used to predict success.
For me, the HPA Tech Retreat is all about stopping what you’re doing so you can think about the future. Some of the talk on audience research caused me to stop and think. I’m a little concerned that we’re creating technologies that are amazing but that might also have the unintended consequence of changing us from storytellers to manufacturers. Rather than telling the stories we want to tell, will we—or someone who has power over a project—use market research simply to create experiences that sell?
On the other hand, the Tech Retreat is truly a sharing of information and dialog. There were conversations about the double-edged sword of this type of research. I’m not the only one concerned. Even some of those presenting admitted that they were cognizant of this issue.
That was just the first day at the HPA Tech Retreat. Next time, day 2—I get to be in a movie.
Fujifilm’s new X-T4 mirrorless camera
Earlier today, Fujifilm introduced the next iteration of its popular X-T mirrorless camera line by announcing the X-T4. Fujifilm said the new X-T4 will be available in both black and silver this spring for $1,699.
The new model looks quite similar to its predecessor, the X-T3, but has some important updates. They include:
It still comes with Fujifilm’s 26.1 megapixel X-Trans CMOS 4 sensor and X-Processor 4, which were both on the X-T3. However, there are some additional changes, as well, including changes to the Q menu and the addition of film simulations like the new Eterna Bleach Bypass Film Simulation. The company has also improved the AF system for more accurate AF tracking and eye- and face-detection.
For more, go to fujifilm-x.com/en-us/products/cameras/x-t4/ or see the press release below:
Valhalla, New York – February 26, 2020 – FUJIFILM North America Corporation is proud to announce the launch of the FUJIFILM X-T4 (hereinafter “X-T4”), a flagship model of the X Series family of mirrorless digital cameras.
The X-T4 is an astounding imaging tool, packing a newly designed IBIS, a quiet new shutter unit, a new vari-angle LCD screen, a new Eterna Bleach Bypass Film Simulation, and a new, large-capacity, battery all into a compact and lightweight camera body. This camera is the perfect tool for today’s image makers and is an ideal multi-functional solution for visual storytellers to use in creating their stories.
More information about the key features of X-T4:
Fujifilm’s state-of-the-art X-Trans CMOS 4 sensor and X-Processor 4 combination sits at this camera’s core, pairing this exceptional, 26.1MP, back-side illuminated sensor with a powerful quad-core CPU to produce images with wide dynamic range and incredible image quality, doing so with lightning-fast processing and precision AF performance, right down to -6EV.
A huge part of creating great photos or videos is being in the right place at the right time – and that often means making handheld images to get to the heart of the action. X-T4’s five-axis In-Body Image Stabilization (IBIS) provides up to 6.5 stops1 of image stabilization to make sure that, even in the midst of all the excitement, images remain steady and sharp. Combine this with the new four-axis Digital Image Stabilizer, and there’s lots of room to maneuver.
For any serious image maker, having a tool that can be relied upon to perform flawlessly whenever it is required is extremely important. For this reason, X-T4 features a newly developed mechanical shutter that is the fastest and most robust in the history of the X Series. Not only can it make 26.1 Megapixel images at 15 frames per second, but it is also rated for 300,000 actuations. Combined with its larger capacity battery that is capable of up to 600 frames per charge2, X-T4 has the power and the durability to give users the peace of mind that they’ll never miss the perfect opportunity.
When chasing the perfect image, versatility is key. The 1.62 million pixel vari-angle touchscreen LCD featured on X-T4 can be adjusted to make it visible from a wide range of positions. This not only provides a high-quality monitor to frame with, but also provides quick and simple controls when they’re needed most. On the flip side, there are times when it’s necessary to minimize the light and distractions that a screen can create. That’s why X-T4’s LCD has been designed to easily fold away so it is completely hidden from view, leaving the updated 3.69 million pixel/100fps electronic viewfinder to focus on the moment at hand.
The modern image maker is blurring the lines between photography and videography, and X-T4 has been designed to celebrate this new generation of hybrid creativity. With the simple flick of a switch, movie mode is activated, meaning X-T4 is capable of recording both professional-level DCI 4K/60p and Full HD/240p super slow-motion video. It is also possible to record F-Log footage in 10-bit color, straight to the card. What’s more, the innovative AF-C subject tracking works in low-light conditions down to -6EV and the camera’s use of a new, high capacity battery lets content creators push their creative limits.
For over 86 years, FUJIFILM Corporation has produced photographic films that have been used by some of the world’s best-known moviemakers to create some of the world’s most successful movies. This legendary reputation in color science is celebrated with the company’s hugely popular selection of film simulation modes, which digitize some of the industry’s most iconic films and put them right at hand. X-T4 introduces ETERNA Bleach Bypass, the newest addition to the much-loved collection of Film Simulation modes available in the X Series product line, which creates a beautiful de-saturated, high-contrast look that image-makers will find irresistible.
X-T4 will be available in both black and silver and is expected to be available for sale in Spring 2020 at a manufacturer’s suggested retail price of $1,699.95 USD. For more information, please visit https://fujifilm-x.com/en-us/products/cameras/x-t4/.
The new Sony FE 20mm F1.8 G prime lens on the Sony a7R IV full-frame mirrorless camera (top view)
Today, Sony announced the new FE 20mm F1.8 G prime lens for both E-mount full-frame and APS-C Sony mirrorless cameras. The new versatile lens will retail for $899 and be available in March.
The new wide-angle prime has a small, compact and lightweight design and a large f/1.8 aperture that the company says makes it ideal for many genres of photography, including travel, landscape, street, close-up and low-light photography, as well as more specialized genres, like astrophotography.
Here are some of the notable features included on the new Sony FE 20mm F1.8 G prime lens:
According to Sony, the new lens has an optical formula (specifically two AA and three ED glass elements) that allow it to accurately reproduce point light sources with high contrast and with a minimum of sagittal flare (which is an unnatural spreading of point light sources and is most common in large-aperture lenses).
Below are some recent sample images shot with the new Sony FE 20mm F1.8 G prime lens (on the Sony a7R IV full-frame mirrorless camera):
Besides 8K at CES, high dynamic range (HDR) was also being shouted about from the rooftops. I saw a lot of HDR displays. I saw some good HDR footage. I saw some bad HDR footage. I also saw SDR footage playing on HDR displays. And I saw two spectacular HDR clips. Two.
All the hype and the poor ratio of great to not great footage reminded me a little of the 3D era. One reason was hearing again the term “immersive.” (It was also used to talk about 8K.) Besides HDR, I’ve looked at a lot of VR and 360. But, so far, the only immersive experience I’ve had is scuba diving.
Kidding aside, well done HDR is great to look at. But well done HDR isn’t commonplace. I’m not talking about the screaming bright highlights of battle armaments, I’m talking about the full breadth of color and contrast that HDR offers, from a creative perspective.
For me, there are two obstacles to HDR. First, affordable HDR workflow is critical. How often is HDR relegated to a couple of days of grading after weeks of SDR grading? Is HDR being displayed at checkpoints during the postproduction workflow or is HDR monitoring prohibitively expensive or overly complicated?
Another obstacle is consumer confusion. Certainly, technology like Filmmaker Mode that I wrote about previously will help home viewers. With the push of one button, they can set up their TV to match the content creators’ intent, rather than wading through menu upon menu of settings.
However, the plethora of HDR standards and terms is creeping up to confusing levels. Consumers are bombarded with terms like HDR, HDR10, HDR10+, Advanced HDR, HLG, Dolby Vision, DisplayHDR (400-1400), DisplayHDR True Black, PQ, SL-HDR1, Rec. 2020, Nits, Deep Color, Technicolor. Not all are true HDR standards or even specifically HDR, but they’re bandied about. And I won’t even bring up the various flavors of HDMI!
On the other hand, obstacles like this aren’t new. A lot of new technology introductions experience this. And, in a way, this can be good because it slows things down a bit. That allows us to get it right in terms of workflow and finishing. Happily, most cinema cameras, in the right hands, can capture footage suitable for HDR finishing.
Also, the fact that more and more consumer sets support HDR means that alternatives to expensive displays can make their way into the workflow—when calibrated properly.
So, CES had a lot of hype about 8K and HDR. I spent a lot of time looking at both. After the pixels settled, my final reaction is that I’ll let others push 8K as the next big thing while I hone my skills on finishing in HDR.
In 2019, we saw some intriguing developments in digital video tools: 6K. Full-frame sensors. Mirrorless cameras and mirrorless disruptors. Internal vs. external RAW. New camera card media formats. Shooting to SSDs. Ultra-bright monitors. In many cases, these technologies and features weren’t available to most shooters just a year or two ago. Now, they’re within reach of almost any working professional. So, as a content creator, you can really maximize your equipment budget.
But this year also brought some paradigm shifts in the tools we use. For some, it feels as if the ground is shifting beneath our feet.
So, you can also use this guide to make sense of the vast, continually changing landscape of digital cinema and video production gear. Below, you’ll find this guide divided into five sections:
In this section, I’m going to focus on a few new cameras but will avoid two notable models announced this past September—the Canon EOS C500 MKII and the Sony PXW-FX9—since they won’t likely be available to purchase in time for the holidays. Instead, I’ll focus on cameras that are actually in stock at dealers today.
Blackmagic Design Pocket Cinema Camera 6K: I like to call this camera a “mirrorless disruptor” because it’s not a mirrorless hybrid; it’s definitely a cinema camera, but it sells for what many mirrorless cameras sell for. It’s a true 6K S35 digital cinema camera that also records 4K ProRes and is compatible with Canon EF-mount lenses. It’s a very capable entry-level digital cinema camera that’s an amazing value for the money.
Panasonic Lumix DC-S1H: As far as cost goes, it’s middle of the road, but it’s a fully featured full-frame 6K mirrorless camera with many advanced video features, including V-Log recording and RAW via HDMI (which is coming). It’s also compatible with L-mount lenses. It’s simply one of the most video-capable mirrorless cameras on the market.
The RED Ranger Ecosystem: These new top-of-the-line RED Ranger cameras feature a more integrated body style that comes with many features and accessories that were extra on the standard DSMC2 bodies. I still consider the Monstro 8K VV version a rental-only option, but with the other two sensor versions, you can now own a new RED Ranger.
Price: $24,950 (for Gemini 5K S35 sensor) or $29,950 (for Helium 8K S35 sensor).
The line between “still” camera lenses and cine lenses continues to blur. This year, we found affordable, high-performance cine lenses available for almost every removable-lens camera, in every mount.
DZOFilm 20-70mm t/2.9 Cine Lens: The Chinese manufacturer DZO, which produces optics for a variety of scientific and industrial applications, constructed this lens specifically for Micro Four Thirds cinema cameras, like the Blackmagic Design Pocket Cinema Camera 4K, Panasonic Lumix GH5 and Z Cam E2. It’s a zoom lens that has very good specs—like a 12-blade iris and a parfocal design—and weighs in at just under 39 ounces yet is very reasonably priced. It’s a true cine-quality lens made for inexpensive M43 cameras.
Rokinon Xeen CF Primes: By constructing the bodies out of carbon fiber, Rokinon was able to make this series more lightweight than the metal versions of this Xeen prime series of lenses. Yet the Xeen CF Primes still retain the same strength. They’re currently available in 24mm, 50mm and 85mm focal lengths and in E, EF and PL mounts. And each lens features an 11-blade iris and 200-degree focus rotation.
Price: $2,495 (each)
Angénieux Optimo Anamorphic 56-152mm t/4.0 Zoom Lens: If you want to shoot your magnum opus in anamorphic, consider this top-of-the-line, mid-range zoom from French lens manufacturer Angénieux. It offers a 2X anamorphic squeeze, with 2.7X zoom range. It also produces minimal breathing and is available in PL, PV and EF mounts, all with the legendary Angénieux look.
We saw a dazzling array of new and interesting lights hitting the market this year. Many, although not all, were LEDs.
Aputure LS 300d II: The company has updated the original 300d light to the 300d II, providing it with more output, more features and a quieter power supply. This medium-sized LED (with a mid-range price tag) is a single-source spot/Fresnel. Numerous Bowens-mount accessories are available, allowing you to turn this into whatever kind of light source you need: soft source, Fresnel with barn doors, pancake, space light or lantern. Aputure made it extremely versatile, with high output and performance for the price. It can be powered with AC power or dual high-watt-hour V-Mount or Gold-Mount batteries.
Lowel Ego LED Bi-Color Light: Simple and easy to use, this new LED is a Bi-color LED replacement for the long-discontinued Ego Digital Fluorescent Light. It’s a soft, curved light source, useful for tabletop or as a small key source for interviews and headshots. It’s great for vloggers or others with confined space shooting.
Digital Sputnik DS6 LED Modular Light System: The DS6 is Digital Sputnik’s top-of-the-line lighting system. It has high output, enough to serve where a 4K HMI Fresnel would be used—allowing it to be used as fill in daylight situations or punched through windows as a sun source…without a generator! It also may sound like an expensive light, but if you compare the cost to a new traditional 4K HMI Fresnel kit, the DS6 is actually quite a good value that will consume much less power and run much cooler. And if you’re not familiar with Digital Sputnik fixtures, they’re common on high-end Hollywood feature sets. And for good reason: They’re rugged, reliable and color accurate and can be powered from any AC power outlet with a maximum power draw of 840 watts, no generator needed.
Sound-for-picture technology has evolved in 2019: There were numerous new recorder/mixers and various new choices for microphones and wireless lavalier systems this year.
Zoom F6 6-Input/14-Track Multitrack Field Recorder: The F6 definitely shifted the paradigm by offering this type of audio hardware for less than $1,000. It features 32-bit floating-point audio recording, which is helpful when recordings are made at too high or too low a level. You also get 14 tracks of simultaneous recording.
Sanken CMS-50 Stereo Shotgun Microphone: This small, high-quality stereo shotgun is for those who want to capture music or ambient sound in stereo. If you shoot or record live music or ambience where stereo capture is important, the Sanken CMS-50 is a 5.4-inch compact stereo condenser shotgun that weighs only 4.6 ounces.
Sound Devices Scorpio 32-Channel/36-Track Portable Mixer-Recorder: The Scorpio is Sound Devices’ flagship audio recorder mixer, with 36 tracks for recording and a powerful feature set that includes many innovative new features to improve sound quality. It’s designed for production in the field, with a plethora of connections for almost any scale of production.
Red Giant’s Universe 3.1: This is an inexpensive but full-featured video toolbox for creatives, targeted at editors, designers and motion graphics artists. All Universe plug-ins are GPU-accelerated and run within multiple host applications on both Mac and Windows. The all-purpose suite has 82 different plug-ins for text effects, transitions, generators, motion graphics and utilities. And while Universe requires a yearly subscription, it’s relatively affordable.
Price: $199 subscription per year
Avid Media Composer Ultimate: Although this film-and-video-editing software package is pricey, it’s truly a first-class app, which many consider the father of them all. And while there are many great editing programs on the market, Avid Media Composer is an industry standard for editing Hollywood features and episodic television. Its editing, effects, titling, color and audio tools make the creative process fast, easy and gratifying, and, most importantly, Media Composer is generally considered the most stable and reliable editing program on the market, with the most sophisticated multi-user workflow.
Price: $499 subscription per year (pre-paid)
Apple introduced the new Mac Pro and Pro Display XDR to the world at its Worldwide Developers Conference this past June. And ever since, many in video and film production, as well as elsewhere, have considered them two of the most controversial new products to hit the market in 2019.
Why the controversy? Simple. The cost.
The most basic version of the new Mac Pro will sell for $5,999. Hold on, though, because a fully specced Mac Pro with every possible option could run between $30,000 and $50,000.
With these prices, the “Pro” in Mac Pro is off the charts; this is obviously not a computer for mere mortals. It’s for users who like to load up their AE or 3D rendering queues with as much lighting, shading, blurring and layered particles as possible. It’s for audio editors working with dozens of plug-ins on complex music arrangements and recordings, and for video editors working on insane deadlines with lots of motion graphics, video layers and effects.
The new Mac Pro starts at $5,999 for the 8-core Intel Xeon 3.5GHz model, with 32GB RAM, 256GB SSD, and Radeon Pro 580X graphics card, but there will be multiple build-to-order options and configurations available with up to 1.5TB of RAM, Radeon Pro Vega II Duo graphics and a 28-core Xeon.
In addition, there will be units and expansion modules that can be purchased separately, including the Mac Pro expansion module (MPX Module) and Afterburner and wheels (yes, this computer is heavy enough that many will want to wheel it around rather than carry it).
Additionally, to accompany the new Mac Pro to market, Apple has added a new high-end monitor to the mix with some interesting specs. The monitor, branded the Pro Display XDR, boasts an impressive spec list, including 6K resolution on a 32-inch panel.
But again, the price is in sync with the new Mac Pro: It’s $4,999 for the standard model and $5,999 for a model with “Nano texture glass,” which is said to reduce reflections.
And to top it off: The Apple stand that holds the new monitor itself costs a cool $999 (or $199 for a VESA mount).
If the new monitor is anything like the new Mac Pro itself, it will be an impressive technical achievement, but you’ll pay for it.
Price: Starts at $5,999 (Mac Pro); $4,999 (Pro Display XDR) or $5,999 model (with Pro Display XDR Nano texture glass)
Canon’s EOS R5 full-frame mirrorless camera
In the past week, there were a number of important camera announcements, including a few from Canon: the development of what it calls its most advanced full-frame mirrorless camera, the EOS R5; the latest EOS Rebel, the T8i DSLR; and a new 24-105mm lens for its R full-frame system.
Here’s a synopsis of each announcement:
The development of a new Canon EOS R5 full-frame mirrorless camera is in the works. New features include:
In the same press release, Canon said it’s developing seven RF lenses and two RF-lens extenders, which are scheduled for release during 2020, including the RF 100-500mm F4.5-7.1 L IS USM, Extender RF 1.4x and Extender RF 2x.
This week, Canon also announced its latest entry-level Rebel DSLR, the Canon EOS Rebel T8i. Here’s a list of some of the most notable specs and features:
Canon also announced a new affordable RF 24-105mm lens, a compact lens that includes optical IS yet is only 3.5 inches in length and weighs 13.9 ounces. Canon says the lens also has Movie Servo AF using STM (stepping-motor) technology, which benefits both video and still shooting. It also has a minimum focus distance of 0.43 feet. The new Canon RF 24-105mm F4-7.1 IS STM lens will cost $399, although it will also be included in Canon’s body-and-lens kits for various EOS R and RP cameras.
For more on each announcement, see the below press releases:
The Next Generation: Canon Announces The Development Of The Company’s Most Advanced Full-Frame Mirrorless Camera Ever – The EOS R5
The Company will Also Develop Seven RF Lenses and Two RF Lens Extenders in 2020
MELVILLE, NY, February 12, 2020 – Canon U.S.A. Inc., a leader in digital imaging solutions, today announced that its parent company, Canon Inc., is developing the highly anticipated Canon EOS R5 full-frame mirrorless camera. The camera will feature a newly designed CMOS sensor and new image processor, along with new state-of-the-art optical technologies the company has been able to cultivate through its long history of groundbreaking camera and digital imaging solutions development. In addition, Canon plans to release seven RF lenses and two RF lens extenders that are currently in development. These new photography tools will help to continue to strengthen the EOS R system and cement the RF mount as an industry leader.
“Today’s announcement comes as a direct result of the tireless effort of Canon engineers who have been tasked with developing the next generation of Canon EOS R camera and RF lenses to help elevate the popular system that was announced in 2018,” said Kazuto Ogawa, president and chief operating officer, Canon U.S.A., Inc. “In developing the new camera, Canon listened to extensive user-feedback from a variety of photographers. The outcome is a camera and lenses that will delight a variety of shooters and further helps to demonstrate Canon’s commitment to full-frame mirrorless cameras and lenses.”
The EOS R System was initially developed to provide engineers with the ability to design lenses that were thought to be impossible to create previously. The wide lens mount diameter, shorter back focus, and high-speed system for transmitting data between camera and lens have resulted in an imaging system that delivers higher image quality and greater ease-of-use than ever before.
The new full-frame mirrorless camera currently under development will fully leverage the advantages of the EOS R System, helping to produce a camera that features high-speed continuous shooting and 8K video recording. Furthermore, the camera will provide photographers with more efficient workflows thanks to improved transmission functionality, operability and reliability. These enhancements, along with many others, will help to further elevate and solidify the EOS Series concept of “Speed, Comfort and High Image Quality.”
Canon’s EOS R5, the first of the next generation of full-frame mirrorless cameras planned for the EOS R System, will include a newly developed CMOS sensor. The new sensor will enable enhanced features such as high-speed continuous shooting up to approximately 20 frames per second (fps) when using the silent shutter and up to approximately 12 fps when using the mechanical shutter – a feature professional sports and wildlife photographers will find to be extremely impactful on their ability to capture fast-moving subjects. From a video perspective, the camera’s 8K video capture capability will prepare videographers for the future of movie-making: capturing 8K footage today allows for even higher-quality 4K productions, in addition to the ability to extract high-resolution still images from the video footage. The EOS R5 will be the first Canon camera equipped with IBIS (In-Body Image Stabilization), which, when used in conjunction with the extremely effective in-lens stabilization (IS), will allow photographers to handhold the camera in light levels not previously imagined. Additionally, the camera will feature dual card slots and will support the automatic transfer of image files from the device to Canon’s new cloud platform.
Alongside the EOS R5, Canon is also developing seven RF lenses and two RF lens extenders scheduled for release during 2020, including the RF 100-500mm F4.5-7.1 L IS USM, Extender RF 1.4x and Extender RF 2x.
Bring the Firsts, the Lasts and the In-Between Moments to Life: Capture Photos and Videos with the New EOS Rebel T8i Camera
New Camera Delivers Vertical Video and Advanced Control for Maximum Creative Output
MELVILLE, NY, February 12, 2020 – Whatever your family dynamic, there are certain moments with the ones you love that deserve to be remembered. In the spirit of capturing powerful moments that last a lifetime, Canon U.S.A., Inc., a leader in digital imaging solutions, unveiled today the Canon EOS Rebel T8i. The newest and highest-performing Rebel camera within the Canon lineup features the DIGIC 8 Image Processor, eye detection in live view, 4K video, clean 4K HDMI output, and is the first EOS DSLR with vertical video, all within a compact and lightweight body to bring photography and videos to life.
“Our commitment to high-quality and high-performing DSLR cameras is unwavering,” said Kazuto Ogawa, president and chief operating officer, Canon U.S.A., Inc. “Visual storytelling is not one size fits all, and in order to encourage our current and future customers to explore their content creation journey and make it easier for people to explore the art of photography and the power of an image, it is imperative that the next EOS Rebel provide high-quality still imagery, high-speed shooting capabilities as well as top notch video functionality.”
The Greatness Within the EOS Rebel T8i
The compact and lightweight EOS Rebel T8i camera is ideal for documenting the early days on the soccer field and aspiring shutterbugs looking to go beyond the “Auto” feature. This camera includes:
Compatible with an extensive line of Canon EF and EF-S lenses, this model can capture vertical video and has multiple connectivity options using Bluetooth® and WiFi® technology, making it ideal for vlogging, uploading content to social media platforms, and web services or for day-to-day usage when capturing life’s most precious moments.
The EOS Rebel T8i camera body has an estimated retail price of $749.99 and the EOS Rebel T8i kit with EF-S 18-55mm F4-5.6 IS STM lens has an estimated retail price of $899.99.
Adding to your Lens Arsenal: Canon Introduces its New RF 24-105mm STM Standard Zoom Lens
New Compact, Lightweight Standard Zoom RF Lens is Ideal for Users Looking to Add to their RF Lens Collection at an Affordable Price
MELVILLE, NY, February 12, 2020 – Whether it’s evoking an emotion, telling a story or reminiscing about a moment in time, visual creators of all levels know that a high quality, trusted lens is necessary to capture the essence and power of an image. Creating for the creators, Canon U.S.A., Inc., a leader in digital imaging solutions, today announced the introduction of its newest RF lens, the RF 24-105mm F4-7.1 IS STM standard zoom lens. The new compact and lightweight RF lens will be the perfect addition to a creator’s collection, delivering on quality output at an affordable price.
“Since the introduction of the EOS R system in late 2018, our goal has always been to develop full-frame mirrorless cameras and lenses to match every skill level of photographers, from entry-level to advanced professionals,” said Kazuto Ogawa, president and chief operating officer, Canon U.S.A., Inc. “Having a firm grasp on the needs of our customers looking for an impressive compact, lightweight lens at an affordable price point, the RF 24-105mm lens is the quintessential lens to have in any creator’s camera bag.”
Cementing Canon’s vision for the EOS R line to become as widely popular as its celebrated EOS DSLR line, the new RF 24-105mm F4-7.1 IS STM lens pairs quality output with an affordable price. The lens is compact enough to carry every day, at 3.5 inches (88.8mm) in length and weighing in at 13.9 ounces (395g), and has a long zoom range starting at 24mm wide, all within a compact and lightweight body. The new lens also possesses Optical Image Stabilization Technology, which helps steady camera shake up to five stops, reducing image blur. The lens also includes Movie Servo AF using STM (stepping-motor) technology that contributes to both fast autofocus for still images as well as smooth, quiet autofocus for video, in conjunction with the camera’s Dual Pixel CMOS AF on compatible Canon full-frame mirrorless cameras.
Additional noteworthy features of the RF 24-105mm STM standard zoom lens include:
- Control Ring for Direct Setting Changes
- 12-pin Communication System
- Maximum Magnification of 0.4x at 105mm
- Minimum focus distance of 0.13m (0.43 ft.) using Center Focus Macro
Availability and Pricing
The Canon RF 24-105mm F4-7.1 IS STM lens has an estimated retail price of $399.99 for the lens only. It will also be sold as a body-and-lens kit with the EOS R and RP cameras.
There are so many lens choices possible in 2020. Not only lens brands and models, but different types of lenses for different functionality.
Let’s talk lenses. In 2020, we have many, many choices in lenses, especially depending on the camera you shoot with. Our main digital cinema cameras are the Canon C300 MKII and the Canon C200. Choosing the best lenses for our cameras should be relatively easy, right? Somehow, the more I mull it over, the fuzzier the right choices become.
Lenses are best chosen for the types of shoots you mostly do. I’m a terrible illustrator, but imagine, if you will, a triangle. One side of the triangle is labeled cost, one side is labeled quality and the last side is labeled function. It’s sort of like the famous “Triangle of Buying Choice,” where the sides are labeled “Speed, Quality, Cost” and, as the saying goes, pick any two. You really can’t have all three; you have to compromise in at least one area. My lens triangle is somewhat the same, except I don’t know if you can even have two of the three. At times, it feels like you can only have one.
Case in point, lenses for our digital cinema cameras. Let’s take a look at some of the choices we have in the famous Canon EF mount, probably the most popular camera mount available today for digital cinema and mirrorless cameras overall.
These could be Canon EF and EF-S models, but there are many other types and focal lengths available, obviously from third-party manufacturers.
In the case of Canon, I’d term these lenses hybrids because they take aspects of still lenses, cine lenses and B4 broadcast 2/3” lenses and combine them into one lens suitable only for S35 sensors. These are the Canon CN-E 18-80mm t/4.4 compact servo lens and the Canon CN-E 70-200mm t/4.4 compact servo lens. These aren’t still lenses, but they aren’t proper cinema lenses with hard stops and long focus rotation like Canon’s CN-E primes, either. They’re designed for digital cinema cameras, though, cover an S35 sensor frame and have an integrated servo control for zooming, a feature that otherwise must be obtained from third-party zoom controllers with separate external motors and power supplies.
Most of us know what a cine lens is, but some common characteristics: cine lenses have hard stops, markings on both sides of the barrel for the operator and ACs to view (markings that often glow in the dark for working in dim environments) and all-manual operation. They typically have a long focus rotation, 200 to 300 degrees, for smoother operation with follow-focus units, often controlled by a focus puller. It’s interesting that in this category today we have everything from low-end cine lenses like the Rokinon Cine DS, cine-converted still lenses that cost a few hundred dollars apiece, all the way up to top-of-the-line cine lenses by Zeiss, Cooke, Leica, Angénieux and many other manufacturers that can cost tens or even hundreds of thousands of dollars.
If you’re a new digital cinema camera owner, or even a mirrorless or lower end digital camera owner, you have many, many choices for which lenses will function best for you in your shooting situations.
Any of the three categories above can be used in this situation. But as you work through it, a few truths become self-evident. For these types of shoots, the most important factors are often size and weight. A huge, heavy cine lens will limit a single camera operator far more severely than a smaller, lighter lens. Shooting this type of coverage with prime lenses is definitely possible, but you may find yourself missing certain shots without a zoom. I’ve shot documentary coverage with zoom lenses when I’m running around literally documenting an event as it happens in real time. But when covering an event like this in very low light, a fast prime is superior to any zoom because primes typically have much more light-gathering capability. Nobody wants to shoot low light and have to crank the gain in their camera because they’re saddled with a slow zoom lens.
Interviews can definitely be shot with any lens in any of these categories. Autofocus technology can be very valuable when shooting interviews, especially if you don’t have a camera assistant pulling focus for you. S35 and full frame sensors have a much shallower depth of field than the smaller size sensors we used to use in 2/3” broadcast cameras. Combine that relatively shallow depth of field with talent that rocks or moves back and forth in the frame, and if you’re shooting with manual focus, it can be easy to end up with slightly out of focus footage if you can’t track their slight forward and backward movements perfectly. Even if you have a large enough monitor to judge focus, which you may or may not, or a good EVF where you can definitely judge focus, it can be exhausting to follow focus continuously on your subject over a one- or two-hour-long interview.
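To put rough numbers on that depth-of-field difference, here’s a quick sketch using the standard hyperfocal-distance formulas. The focal lengths, aperture and circle-of-confusion values are illustrative assumptions chosen to match field of view, not specs of any particular camera:

```python
# Rough depth-of-field comparison: S35 vs. a 2/3" broadcast sensor at a
# matched field of view. All numbers are illustrative assumptions.

def depth_of_field(f_mm, n_stop, coc_mm, subject_mm):
    """Total depth of field (mm) via the hyperfocal distance."""
    h = f_mm**2 / (n_stop * coc_mm) + f_mm               # hyperfocal distance
    near = subject_mm * (h - f_mm) / (h + subject_mm - 2 * f_mm)
    far = subject_mm * (h - f_mm) / (h - subject_mm)     # valid while subject < h
    return far - near

subject = 2000  # interview subject at 2 m

# 50mm f/2.8 on S35 (CoC ~0.025mm) vs. the ~19mm lens a 2/3" sensor
# (CoC ~0.011mm) needs for the same framing (crop factor ~2.6x)
dof_s35 = depth_of_field(50, 2.8, 0.025, subject)
dof_23 = depth_of_field(19, 2.8, 0.011, subject)

print(f"S35:  {dof_s35 / 1000:.2f} m of acceptable focus")
print(f'2/3": {dof_23 / 1000:.2f} m of acceptable focus')
```

At the same framing and f-stop, the S35 sensor ends up with roughly a third the depth of field of the 2/3” chip, which is why small subject movements that old ENG cameras forgave now read as soft footage.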
As cameras have moved from SD to 1080 to 4K and now 6K and 8K over the past decade, it’s also becoming more difficult to actually see if something is in sharp focus on a 1080 display, which is what most EVFs and smaller monitors are capable of. There are no 7-inch 4K, 6K or 8K native displays, so what you see isn’t necessarily what you are getting. Footage that looks perfectly sharp on your small monitor or EVF, when played back on a larger screen, can often be slightly soft and out of focus. For this reason, I value autofocus and face/eye detection for interviews. Unfortunately, this limits you to still lenses or hybrid lenses. None of the digital cine lenses on the market at this time have autofocus, although we believe that AF for pro cine users is coming in the near future.
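A quick bit of arithmetic shows why a 1080 monitor hides focus errors on a 6K capture; the pixel counts below are illustrative:

```python
# Why soft focus hides on a small monitor: a 6K frame shown on a 1080
# panel is downscaled so several sensor pixels collapse into one screen
# pixel. Illustrative numbers only.

capture_width = 6144   # "6K" sensor, pixels across
monitor_width = 1920   # 1080p EVF/monitor, pixels across

scale = capture_width / monitor_width
print(f"Each monitor pixel covers ~{scale:.1f} sensor pixels across")

# A focus error that smears detail across 3 sensor pixels...
blur_px = 3
print(f"...spans ~{blur_px / scale:.1f} monitor pixels: effectively invisible")
```

Only when the blur grows wider than the downscale factor does it become visible on the small screen, which is exactly why footage that looked crisp on set can turn up soft on a large display.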
Depending on how often you shoot narrative, you may or may not typically work with camera assistants and focus pullers. If you do, it can be a pleasure to shoot with a good AC and or focus puller. AC/focus pullers need, want and prefer cine lenses with longer focus scales, markings on both sides of the lens, hard stops, etc. Focus pullers don’t like working with still or hybrid lenses because they simply aren’t set up correctly to work in the way that focus pullers are used to working. Still and hybrid lenses typically have focus rotations that are too short and the lenses lack hard stops, making repeatability in hitting focus marks impossible and the physical size of most still and hybrid lenses aren’t conducive to having a crew working on the camera.
Personally, I think that the latest developments in AF technology are becoming so good that soon we’ll see professional level autofocus systems for higher-end digital cinema cameras and PL mount lenses. As of today, though, that technology hasn’t quite yet arrived, so my prediction is that pro cine AF is something to look out for in the near future.
About 15 to 20 years ago, lenses were all manual, other than a few consumer AF lenses. Even still-photo lenses were made out of steel, aluminum and brass; the focusing controls were smooth and mechanically linked to the lens. Same with the aperture controls that open or close the lens’s iris: they were mechanical. Sure, still lenses had clicks to differentiate ƒ-stops for still photographers, while cine and pro video lenses had clickless aperture controls so that the iris could be subtly opened or closed during a shot without the exposure change having steps.
At some point, when better AF became the norm for still lenses, mechanical focusing gave way to focus by wire. What is focus by wire? In most AF lenses today, when you turn the focusing ring, a digital encoder in the lens reads the ring’s rotation and converts it into a signal that’s fed to a small circuit board in the lens. That signal triggers the focusing motors to move the focusing elements. The problem for us is that all of this analog-to-digital conversion introduces a slight latency or delay, and the impulses are merely two-way: the rotation impulse moves the focusing element forward or backward. So any fine feel you had focusing a mechanically linked lens is now gone. No repeatability, no mechanical feel. Manually focusing today’s lenses ranges from impossible to merely terrible.
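The loss of feel can be illustrated with a toy simulation; the tick size, latency and gearing numbers below are invented for illustration and don’t model any manufacturer’s actual control loop:

```python
# Conceptual sketch (hypothetical values): why focus-by-wire loses "feel".
# A mechanical lens maps ring angle to focus position continuously; a
# by-wire lens quantizes the ring into encoder ticks and replays them
# after a processing delay, so tiny corrections vanish or arrive late.

def mechanical_focus(ring_angles, degrees_per_mm=20.0):
    """Direct linkage: every fraction of a degree moves the elements."""
    return [a / degrees_per_mm for a in ring_angles]

def by_wire_focus(ring_angles, tick_degrees=1.0, degrees_per_mm=20.0, latency=2):
    """Quantize ring motion into whole encoder ticks, delivered a few
    samples late; sub-tick adjustments are simply dropped."""
    positions = []
    quantized = [int(a // tick_degrees) for a in ring_angles]
    for i in range(len(ring_angles)):
        # the motor only sees the tick count from `latency` samples ago
        ticks = quantized[max(0, i - latency)]
        positions.append(ticks * tick_degrees / degrees_per_mm)
    return positions

# A slow 0.3-degree-per-sample turn: mechanical tracks it smoothly,
# by-wire sits still until a full tick accumulates, then jumps.
angles = [0.3 * i for i in range(10)]
print(mechanical_focus(angles)[:4])
print(by_wire_focus(angles)[:4])
```

The mechanical mapping responds to every fraction of a degree, while the by-wire version stays flat until a whole tick accumulates and then steps, a couple of samples late; that dead, steppy response is the loss of fine feel described above.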
Same with aperture control. Lenses used to have the iris mechanically linked to the aperture wheel on the lens, so you turned the aperture ring and the iris closed or opened accordingly. Today, many still lenses don’t even have an aperture ring. They’re controlled electronically by a wheel on the camera body. For manual operation, still lenses have actually become much worse over the past decade or two. AF technology has become noticeably better but manually focusing has become pretty bad, making still lenses a poor choice when you want to control focus or smoothly ramp the aperture as your character moves from dark to light or vice versa.
We already own a dozen Canon EF and EF-S still lenses. Some are really great, like the EF 70-200mm f/2.8 IS II, and some not so great, like the EF-S 17-55mm f/2.8 IS. The former is a pro-level telephoto zoom that’s ruggedly built and makes beautiful footage that looks as good as footage from cine lenses costing 10 times more. On the latter, the build isn’t very good, the dust sealing is terrible, the AF is much slower and clunkier and, worst of all, because it’s only designed to cover the APS-C 1.6X crop, it vignettes pretty badly with the 1.5X-crop S35 sensors in the two cameras we shoot with most.
Lately, I've been shooting interviews and some lifestyle b-roll footage with our three Canon still primes: the EF 28mm f/1.8, 50mm f/1.8 STM and EF 85mm f/1.8. All cover full-frame sensors, so there's no vignetting on our S35 cameras, and all look good and match fairly well as far as color and lens coatings. Their only downside comes when we try to focus manually and creatively while shooting b-roll: their focusing is all focus by wire, and the focus rotations are ridiculously short, making manual focusing a nuisance even with Canon's handy focus assist function.
I've shot with and reviewed the Canon CN-E 18-80mm and 70-200mm T4.4 compact-servo hybrids here for HDVideoPro. They're great lenses and very useful for run-and-gun and documentary work. But for manual focusing, they have no hard stops, and because they support AF and IS, both are focus by wire.
I like shooting with primes because they're generally smaller, lighter and much faster than zoom lenses. For situations where we're going to be shooting talent, functions, dance, tabletop, performance or narrative, primes make a lot of sense. Now the dilemma is, which primes to invest in? Stay tuned for more details in an upcoming blog post, where we'll delve into the many choices available in EF-mount cine primes.
The new Nikon D6 Full-Frame DSLR with Nikon’s NIKKOR 24-70mm lens
In the fall of 2019, Nikon shared news about its upcoming D6 DSLR. But it was, kind of… an announcement of an announcement, indicating that the company was working on a replacement for the Nikon D5, its long-time flagship DSLR. Such statements are often referred to as development announcements, and they happen when flagship products are about to be replaced or updated.
Canon, for instance, made such an announcement for the EOS-1D X Mark III. In short, it's an attempt to generate some buzz—and to let the camera industry and photography world know that at some future point, there will be an official announcement of the product mentioned in the development announcement.
So today, that's what Nikon gave us: the official announcement of the Nikon D6, which will be available this coming April for $6,499 in a body-only configuration.
Overall, it looks like a very robust, high-performing DSLR, although the news that the D6 has just a 20.8-megapixel FX-Format (full-frame) CMOS sensor—the same resolution as its predecessor, the D5—doesn’t really generate an aura of marketing buzz.
Still, the camera's default ISO range is 100–102,400, expandable to 3,280,000. That's certainly one spec that will entice many photographers from all genres. Another powerful facet of the new flagship is the D6's autofocusing, which Nikon says is the most "powerful AF system in Nikon's history," with "unparalleled low-light performance, powerful agility, advanced 4K UHD multimedia capabilities" and a mechanical shutter frame rate boosted to a staggering 14 frames per second.
Nikon says some of the new improvements on the D6 include:
For more on the Nikon D6, go to nikonusa.com.
In addition to the D6, Nikon also announced two full-frame lenses for its NIKKOR Z mirrorless lens lineup: the NIKKOR Z 24-200mm f/4-6.3 VR, a versatile, lightweight zoom, and an ultra-wide prime, the NIKKOR Z 20mm f/1.8 S.
Some of the notable features on the NIKKOR Z 24-200mm f/4-6.3 VR zoom include:
The NIKKOR Z 20mm f/1.8 S prime lens has the following capabilities:
The NIKKOR Z 24-200mm f/4-6.3 VR will be available April 2020 for $899, and the NIKKOR Z 20mm f/1.8 S will be available March 2020 for $1,049. For more on the two lenses, visit nikonusa.com.
The post Nikon Announces D6 Arrives In April And Two Z-Series Mirrorless Lenses appeared first on HD Video Pro.
The new Olympus OM-D E-M1 Mark III with 12-40mm zoom lens
Earlier today, Olympus made the somewhat surprising decision to add a second top-of-the-line, or flagship, mirrorless camera to its lineup of Micro Four Thirds cameras: the new 20.4-megapixel OM-D E-M1 Mark III joins its larger, heavier MFT twin, the Olympus OM-D E-M1X, which was announced a little over a year ago. The E-M1 Mark III will be available beginning February 24, 2020.
In addition to the new flagship, Olympus also introduced a new zoom, the 12-45mm f/4 PRO lens, and a PEN-series mirrorless camera, the Olympus PEN E-PL10, the newest addition to its PEN Lite series of Micro Four Thirds cameras, which are targeted at novices and those looking to step up from smartphone photography or basic point-and-shoots.
It may be unusual for a camera brand to offer two flagships, and it's a decision that could easily confuse potential buyers. For example, why would a photographer choose the E-M1 Mark III over the E-M1X? Here's one reason: according to Olympus, a photographer looking for a more agile camera body will likely find the E-M1 Mark III the better choice, since it's more portable, while the E-M1X might be more attractive to photographers who shoot with long lenses. For most photographers, the size and weight of the E-M1 Mark III is much more in keeping with the compact, lightweight form factor many have come to expect from an MFT camera body. (Olympus seems to have purposely built bulk into the E-M1X.)
The E-M1 Mark III shares a number of qualities with its larger brand sibling. One remarkable feature is its robust 5-Axis image stabilization with 7.5 shutter speed steps of compensation, which Olympus says is the “world’s most effective” IS. It’s also incredibly fast, with the ability to fire off 60 frames per second with AF locked (using the silent electronic shutter) or 18 fps for AF/AE-tracking sequential shooting (again, using the silent electronic shutter).
Both OM-D series cameras also share a number of other features, including phase + contrast detection dual AF, a 121-point all cross-type On-chip Phase Detection AF sensor, the ability to shoot 4K video, function buttons, Live ND mode and a 50MP Handheld High Res Shot mode, which lets you create a 50-megapixel, seamless composite of a scene from 16 images without using a tripod. One feature, however, is found only on the E-M1 Mark III: Starry Sky AF mode, which is intended to make photographing stars much simpler and more effective.
The E-M1 Mark III has an excellent viewfinder, a 2.36M-dot LCD EVF, and a 3-inch swiveling touch LCD. It also comes with a variety of powerful 4K-resolution video modes (at 30, 25 and 24 fps) and with OM-log mode.
Like the E-M1X, the E-M1 Mark III is a rugged camera—it even has an IPX1 rating! And both models come with built-in Wi-Fi, Bluetooth and GPS. However, one of the very few things the new E-M1 Mark III doesn't share with the E-M1X is the set of features relying on artificial intelligence, or AI-based shooting, such as Intelligent Subject Detection AF, which are found only on the E-M1X.
Because Olympus has continued to develop new lenses for its MFT-system cameras, it's no surprise that today the company also introduced a new zoom, the M.Zuiko Digital ED 12-45mm F4.0 PRO lens.
The company says it's a high-performance medium-range zoom with a 35mm-film equivalent range of 24mm to 90mm, letting you move between wide-angle and telephoto framing in an instant. Olympus says it also features "supreme macro capabilities with a maximum magnification of 0.5x (35mm equivalent) across the entire zoom range." It has a closest-focusing distance of 4.7 inches at the wide-angle end and a little over 9 inches at the telephoto end, which makes it quite a versatile lens.
The zoom has a lens design of 12 elements in 9 groups. Like many Olympus camera bodies, this lens is also dustproof, splashproof and freezeproof (to -10°C), and weighs just under 9 oz. It's compact and lightweight, which makes it great for bringing on the road as a travel lens or for shooting events.
Olympus is targeting its new PEN E-PL10 MFT mirrorless camera at beginner photographers, who might be interested in stepping up the quality of their photos and video from what they can capture with a basic point-and-shoot digital camera or a smartphone.
This compact, lightweight camera includes a number of easy-to-use features, like a built-in pop-up flash, a wide array of expressive photography functions and art filters, in-body image stabilization (IBIS) and a 180-degree flip-down LCD screen, which Olympus says has “a step-by-step touch menu interface to guide the user as they capture beautiful photos in any situation.” It also is compatible with various interchangeable lenses and has built-in Wi-Fi and Bluetooth.
The Olympus OM-D E-M1 Mark III (black) will be available, beginning February 24, 2020, for $1,799 (body only). It will also be available in two kit configurations—for $2,499 with the M.Zuiko Digital ED 12-40mm F2.8 PRO lens and for $2,899 with the M.Zuiko Digital ED 12-100 F4.0 IS PRO lens. The Olympus M.Zuiko Digital 12-45mm F4.0 PRO lens (black), which comes bundled with the LH-61G lens hood, will be available April 7 for $649. The Olympus PEN E-PL10 is available now in three colors—shiro (or white), kuro (or black) and mocha (or brown)—in a body-only configuration for $599 or bundled with the M.Zuiko Digital ED 14-42mm F3.5-5.6 EZ kit lens for $699. Additionally, Olympus will sell two accessories for the OM-D E-M1 Mark III that will be available in April 2020: a shock mount adapter (SM2) for $39 and an audio cable (KA335) for $14.
Stay tuned for my hands-on, first-look review of the new OM-D E-M1 Mark III from a recent trip with Olympus to Costa Rica.
For more information, see the press releases below or visit http://www.getolympus.com.
[[ press release ]]
OLYMPUS OM-D E-M1 MARK III INTERCHANGEABLE LENS CAMERA
Delivering Stunning Image Quality, Superior Mobility And Absolute Reliability
For Professional Photographers Everywhere
CENTER VALLEY, Pa., February 12, 2020 — Today Olympus debuts the OM-D E-M1 Mark III, scheduled to go on sale February 24, 2020. The Olympus OM-D E-M1 Mark III is a professional model built for superior mobility. This professional interchangeable lens camera conforms to the Micro Four Thirds® System standard. It comes equipped with a new image processing engine, TruePic IX, enabling features such as 50MP Handheld High-Res Shot. Combined with the high image quality of M.Zuiko® Digital lenses, this system fulfills the needs of professional photographers in any field, all in a compact, lightweight, dustproof, splashproof, freezeproof magnesium alloy body for peace of mind when shooting in harsh environments.
This reliable, compact and lightweight body offers the world’s most effective 7.5 shutter speed steps of compensation. The OM-D E-M1 Mark III is also equipped with a 121-point all cross-type On-chip Phase Detection AF sensor for high-precision focusing. Starry Sky AF delivers revolutionary autofocus performance for astrophotography, and the Advanced Face / Eye Priority AF tracks and ensures the subject’s eye is continuously in focus, resulting in a crisp, clear portrait. This model is also equipped with versatile features that were popular on the Olympus OM-D E-M1X, such as 50MP Handheld High Res Shot, Live ND, Pro Capture mode, and handheld 4K video, thanks to its 5-axis In-Body Image Stabilization, designed to meet the demands of the professional photographer.
Compact and Lightweight with High Image Quality
By synchronizing the in-lens image stabilization of supported lenses with the in-body 5-axis image stabilization, this model achieves the world’s best 7.5 shutter speed steps of compensation with 5-axis sync IS. Powerful image stabilization enables shooting handheld in dark locations and during super-telephoto photography for outstanding freedom in various scenarios without the need for a tripod. In-body image stabilization ensures image stabilization with all attached lenses, up to 7.0 shutter speed steps of compensation performance.
The new image processing engine, TruePic IX, combined with the 20.4 Megapixel Live MOS sensor, the world's most effective 7.5 shutter speed steps of compensation and high-resolution M.Zuiko Digital lenses, delivers minimal noise even at high-sensitivity settings. This model boasts top-class image quality in the interchangeable lens camera class with minimal distortion to the edges of the shot. Improved AF algorithms and high-resolution, high-speed performance allow for features such as Handheld High Res Shot, Live ND, Starry Sky AF and improved face/eye priority AF.
50MP Handheld High Res Shot makes it possible to capture high-resolution images without the need for a tripod. This feature makes use of the minimal movement occurring between each of the 16 shots to generate a single 50 Megapixel high-resolution photo. This feature is particularly useful for capturing high-resolution shots in locations where it is impossible to use a tripod. Tripod High Res Shot is also available for recording ultra-high-resolution approximately 80 Megapixel equivalent JPEG images, great for suppressing movement in the merged shot, such as a rippling surface of water or leaves shaking in the wind.
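As a rough illustration of the multi-shot idea (this is a simplified sketch, not Olympus' actual pipeline, and `hires_merge` is a hypothetical name), each frame's known sub-pixel shift lets its samples be binned onto a finer grid and averaged:

```python
import numpy as np

def hires_merge(frames, shifts, scale=2):
    """Merge shifted frames onto a finer grid (toy multi-shot high-res).

    frames: list of HxW arrays; shifts: (dy, dx) offsets in source pixels.
    Real implementations also handle demosaicing and motion rejection.
    """
    h, w = frames[0].shape
    acc = np.zeros((h * scale, w * scale))
    cnt = np.zeros_like(acc)
    ys, xs = np.mgrid[0:h, 0:w]
    for frame, (dy, dx) in zip(frames, shifts):
        # Map each source sample to its sub-pixel slot on the fine grid.
        gy = np.clip(np.round((ys + dy) * scale).astype(int), 0, h * scale - 1)
        gx = np.clip(np.round((xs + dx) * scale).astype(int), 0, w * scale - 1)
        np.add.at(acc, (gy, gx), frame)
        np.add.at(cnt, (gy, gx), 1)
    cnt[cnt == 0] = 1  # leave never-hit slots at zero
    return acc / cnt
```

With 16 frames whose shifts cover the half-pixel positions, every slot on the 2x grid gets hit, which is the intuition behind turning 20 megapixels of sensor into a 50-megapixel composite.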
Live ND, which is highly regarded on the OM-D E-M1X, is also included on this model, creating a slow shutter effect without the need for a physical ND filter. This feature virtually extends the exposure time and allows the capture of images with the appearance of a slow shutter speed by merging multiple exposures together. Users can select the effect level from ND2 (one step) to ND32 (5 steps), and view the slow shutter effects in the viewfinder before capturing, eliminating the need to change lenses or optical ND filters.
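The core of the Live ND idea can be sketched in a few lines (an assumption-level illustration, not Olympus firmware): averaging N equally exposed frames approximates one exposure that is N times longer, so 2**k frames mimic a k-step ND filter (ND2 = 1 step up to ND32 = 5 steps):

```python
import numpy as np

def live_nd(frames):
    """Average equally exposed frames to simulate a longer exposure.

    Moving highlights spread into smooth streaks across the stack,
    while static areas keep their original brightness (toy model).
    """
    return np.mean(np.stack(frames, axis=0), axis=0)
```

For example, merging 32 short exposures of a waterfall would blur the water like a 5-stop ND and slow shutter would, without clipping the static rocks around it.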
The OM-D E-M1 Mark III body is the foundation that meets photographers' need for portability and reliability. Add Olympus M.Zuiko Digital PRO lenses for an unrivaled compact, lightweight system that maintains the best balance of portability and image quality, delivering the performance required and expected by professionals.
The magnesium alloy body of the OM-D E-M1 Mark III features advanced weatherproof construction, resulting in dustproof, splashproof, and freezeproof performance. When paired with a dustproof and splashproof M.Zuiko Digital lens, users can enjoy shooting in the harshest condition without ever worrying about weather or location.
Avoid extra retouching due to dirt and dust on the sensor. The OM-D E-M1 Mark III is equipped with an industry-leading dust reduction system. The SSWF (Super Sonic Wave Filter) vibrates the image sensor at a frequency of 30,000 times per second to shake off dust and dirt. The new dust resistant coating recently introduced on the OM-D E-M1X is also used on this model, making it less likely for dust and dirt to stick to the image sensor, reducing spots in images by 90%.
The Lithium-ion Battery BLH-1 can be fully charged in as little as two hours when charged in the camera via a USB-C PD (Power Delivery) compatible charger of up to 100W. It is also possible to power the camera via a portable USB-C PD power bank or battery pack, allowing the photographer to shoot for long durations, which is especially convenient for astrophotography or photographing in cold locations.
High Speed Sequential Shooting
This camera is equipped with a 121-point all cross-type On-chip Phase Detection AF sensor for tracking subjects across a wide range quickly and accurately. AF information from recorded images is also used during sequential shooting, making it easier to track subjects that move unpredictably. It offers 75% vertical and 80% horizontal coverage of the screen for a wide focusing area. Paired with the advanced AF algorithm, this feature can continually focus on fast-moving subjects with a high degree of precision. Unlike DSLR cameras, there is no degradation in AF precision when using a fast lens: the OM-D E-M1 Mark III offers high-precision focusing that can bring out the full capabilities of large-diameter lenses, such as those with a maximum aperture of F1.2. AF/AE tracking is possible at up to 18 fps high-speed sequential shooting while maintaining the full pixel count of 20.4 Megapixels. The subject can also be checked in the viewfinder during high-speed sequential shooting for accurate tracking. Stunning 60 fps shooting performance captures split-second moments in high resolution that the human eye cannot detect, using AF/AE-lock sequential shooting.
Pro Capture mode makes it possible to record scenes that are difficult to capture due to time lag in the subjects’ reactions or camera operation time lag. Recording begins upon the half shutter release, capturing up to 35 frames retroactively from the point of the full shutter release. Because there is no blackout during shooting, it is possible to keep an eye on subject movement while pressing the shutter button. RAW shooting is also supported. Pro Capture makes it possible to record once-in-a-lifetime shots that you might otherwise miss due to the time lag between people’s reaction and camera operating time lag.
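Mechanically, the pre-recording described above is a classic ring buffer: frames are continuously appended to a fixed-size buffer while the shutter is half-pressed, and a full press keeps whatever is in the buffer. Here's a minimal sketch (class and method names are hypothetical, not Olympus API):

```python
from collections import deque

class ProCaptureBuffer:
    """Toy model of retroactive capture via a fixed-size ring buffer."""

    def __init__(self, pre_frames=35):
        self.buf = deque(maxlen=pre_frames)  # oldest frames drop off

    def on_half_press_frame(self, frame):
        self.buf.append(frame)               # continuous pre-recording

    def on_full_press(self):
        return list(self.buf)                # retroactive frames to save

cam = ProCaptureBuffer(pre_frames=35)
for i in range(100):          # 100 frames arrive while half-pressed
    cam.on_half_press_frame(i)
saved = cam.on_full_press()   # only the last 35 pre-shutter frames remain
```

The `maxlen` deque is what gives the "up to 35 frames retroactively" behavior: older frames are silently discarded, so memory use stays constant no matter how long the shutter is half-pressed.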
High-Accuracy Autofocus System
Various Creative Features
Live Composite is included with the OM-D E-M1 Mark III. This feature makes it possible to check exposure status in Live View in real time, and it supports up to six hours of shooting. With B mode added to the shooting mode dial, Live Composite, Live Bulb and Live Time are now easier to access and configure. Record photos in focus all the way from the foreground to the background: Focus Stacking automatically creates a composite in-body from up to 15 frames, while Focus Bracketing allows the photographer to shoot up to 999 images at different focus points to composite later using the software of their choice. Silent Mode turns off the mechanical shutter and all electronic sounds, perfect for shooting in areas where shutter sounds are inappropriate, such as concert halls. The OM-D E-M1 Mark III is equipped with dual card slots, allowing the user to record JPEG and RAW separately, back up, switch automatically, etc. Slot #1 is UHS-II/UHS-I compatible and Slot #2 is UHS-I compatible. In-body Fisheye Compensation allows the user to remove the distortion created by a fisheye lens, providing more wide-angle creative options. Keystone Compensation applies trapezoidal compensation and perspective enhancement simultaneously, providing the functionality of a tilt/shift lens. Anti-Flicker Shooting (mechanical shutter only) detects the flicker of alternating light sources and reduces the effect by shooting only at peak brightness, reducing exposure variation. Flicker Scan (electronic shutter only) minimizes the effects of flickering that occurs under LED lighting.
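The merge behind Live Composite is usually described as a "lighten" blend: after a base exposure, each new frame only contributes pixels brighter than what has accumulated, so star trails or headlights build up without the static scene blowing out. A minimal sketch of that idea (hypothetical function, not Olympus code):

```python
import numpy as np

def live_composite(base, frames):
    """Toy lighten-blend: keep the brighter pixel at each position."""
    out = base.copy()
    for frame in frames:
        out = np.maximum(out, frame)  # only brighter pixels accumulate
    return out
```

Because `np.maximum` is idempotent for unchanged pixels, a six-hour stack of frames leaves the correctly exposed foreground exactly as it was in the base exposure while moving lights trace trails across it.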
Electronic stabilization combined with in-body 5-axis stabilization delivers powerful image stabilization during video recording. OM-D Movie makes handheld 4K/C4K shooting possible due to a powerful image stabilization mode specifically designed for video recording (M-IS1). This offers three levels of performance to allow handheld 4K and Cinema 4K (C4K) high resolution shooting.
Software and Smartphone Applications
The OI.Share® dedicated iOS and Android app can be used to connect to the camera via Wi-Fi®, import shooting data to a smartphone, and to use the smartphone for remote camera operation and more convenient shooting and image organization. OI.Share can be used to update the camera firmware and backup and restore camera settings for the OM-D E-M1 Mark III.
Olympus Capture camera control software for computers meets the demands of studio photographers. Recorded images can be imported via Wi-Fi without a USB connection, providing powerful support for the studio shooting workflow. It supports high-speed 5 GHz band communication.
Olympus Workspace image editing software can handle professional tasks such as RAW processing and image editing, along with offering freedom over screen layout and more. Connect a computer to the OM-D E-M1 Mark III via USB to enable high-speed RAW processing in Olympus Workspace using the new TruePic IX image processing engine. Clarity and Dehaze editing filters are also included for a greater range of expression in astrophotography and similar genres. With Olympus Workspace Version 1.3, released at the same time as the OM-D E-M1 Mark III, users can easily replace the audio of a recorded video with high-res sound captured on the LS-P4 or LS-100 using Slate Tone while recording video.
Separately Available Accessories
The Power Battery Holder HLD-9 features a dustproof, splashproof, freezeproof design that delivers the same controls whether held vertically or horizontally. Attach the HLD-9 when shooting scenes with frequent changes between vertical and horizontal positions or when you need to capture a lot of shots, or remove it for greater mobility. When using one Lithium-ion Battery BLH-1 in the camera and one in the HLD-9 together, up to 840 shots can be recorded.
Shock Mount Adapter SM2 is an adapter designed for absorbing camera noise while the LS-P4 is attached to the camera hot shoe. It prevents vibration and operational noise from the camera, making video shooting with higher quality audio possible.
Audio Cable KA335 is a high quality cable designed for connecting the camera and recorder. An L-shaped plug and curled cord provides easy handling when connected to the OM-D E-M1 Mark III.
Pricing and Availability
The Olympus OM-D E-M1 Mark III (black) will be available beginning February 24, 2020. The camera body only will have a suggested retail price of $1,799.99 USD and $2,399.99 CAD. The camera body bundled with the M.Zuiko Digital ED 12-40mm F2.8 PRO Lens will have a suggested retail price of $2,499.99 USD and $3,299.99 CAD, and the camera body bundled with the M.Zuiko Digital ED 12-100 F4.0 IS PRO Lens will have a suggested retail price of $2,899.99 USD and $3,799.99 CAD. The shock mount adapter SM2 will have a suggested retail price of $39.99 USD and $51.99 CAD, and the audio cable KA335 will have a suggested retail price of $14.99 USD and $19.99 CAD. These accessories will be available beginning April, 2020.
[[ press release ]]
THE ULTIMATE COMPACT, LIGHTWEIGHT, HIGH-RESOLUTION M.ZUIKO® PRO LENS
M.Zuiko Digital ED 12-45mm F4.0 PRO (35mm equivalent: 24-90mm)
CENTER VALLEY, Pa., February 12, 2020 —Olympus is pleased to announce the M.Zuiko Digital ED 12-45mm F4.0 PRO lens, scheduled for availability April 7, 2020. This high-performance medium range zoom PRO lens conforms to the Micro Four Thirds® System standard and features superb optical performance at all focal lengths, while being the world’s most compact, lightweight model. The M.Zuiko Digital ED 12-45mm F4.0 joins the M.Zuiko PRO category of lenses, possessing dustproof, splashproof and freezeproof (-10°C) performance that delivers excellent image quality and peace of mind even when shooting in the most severe environments. This lens delivers superb resolution to the edge of the frame across the entire zoom range, making the most of the appealing aspects of the Micro Four Thirds System standard. It features supreme macro capabilities with a maximum magnification of 0.5x (35mm equivalent) across the entire zoom range, making it an anytime, anywhere, all-around lens. When paired with the Olympus OM-D® E-M5 Mark III, this lens delivers high resolution and amazing portability to conveniently carry in a small bag.
The M.Zuiko Digital ED 12-45mm F4.0 PRO lens is the world’s most compact, lightweight medium range zoom PRO lens with a fixed aperture value covering a focal length from wide angle 24mm to telephoto 90mm (35mm equivalent). It consists of nearly 190 precision machined components all mounted in a dense configuration, resulting in a size of 63.4 mm./2.5 in. (max. diameter) x 70 mm/2.76 in. (overall length), and a weight of approximately 254 g/8.96 oz. This small and lightweight lens delivers high-speed, precise autofocus for capturing any subject. Its dustproof and splashproof construction contains sealing in nine places to keep out dust and rain, providing peace of mind when shooting in active situations.
Effective placement of aspherical lenses and ZERO (Zuiko Extra-low Reflection Optical) Coating provide clear depictive performance, drastically reduce aberrations, ghosts, and flare for sharp, high-definition image quality. Suppressing loss of light at the edges of images makes it possible to obtain bright, clear depictive performance up to the very edges. Because the aperture value is fixed across the entire focal length, it is easy to control the exposure when zooming and when recording video.
Enjoy macro shooting with a maximum magnification of 0.5x (35mm equivalent) across the entire zoom range. The closest focusing distance is 12 cm at the wide-angle end, and 23 cm at the telephoto end, delivering a wide range of macro shooting effects, including wide-angle macro shots that emphasize a sense of perspective by capturing vast backgrounds, and telephoto macro shots for more significant background defocusing effects. Diverse macro effects are possible, such as Focus Stacking, which generates a single image on the camera with a large depth-of-field in focus from the foreground to the background.
Pair the new M.Zuiko 12-45mm F4.0 PRO lens with the recently announced Olympus OM-D E-M5 Mark III to create the ultimate travel combination. As Olympus’ smallest weathersealed combination to date, at just 670 g/23.6 oz., you are able to travel with ease and shoot on-the-go, no matter the environment. Enjoy a bright constant aperture of F4.0, along with a myriad of pro features brought to you in the OM-D E-M5 Mark III. Lighten up with this ultimate travel combo and change both your photography and lifestyle forever.
Pricing and Availability
The Olympus M.Zuiko Digital 12-45mm F4.0 PRO lens (black) comes bundled with the LH-61G lens hood, specifically designed to protect the lens and reduce unwanted light entering the lens in backlit situations. The lens will be available April 7, 2020. The lens will have a suggested retail price of $649.99 USD and $849.99 CAD.
[[ press release ]]
TRANSFORM YOUR PHOTOGRAPHY WHILE EMBRACING CREATIVITY
WITH THE OLYMPUS PEN® E-PL10
A Compact and Sophisticated Interchangeable Lens Camera That You Can Take Everywhere
CENTER VALLEY, Pa., February 12, 2020 — Today Olympus America Inc. is pleased to introduce the newest addition to its PEN Lite series of Micro Four Thirds® System standard interchangeable lens cameras to our region, the Olympus PEN E-PL10. Drawing from the classic style of the 1963 PEN-F, the E-PL10 features an attractive, clean aesthetic and thoughtful design, and is packed with a TruePic VIII Image Processor, built-in pop-up flash and a wide array of expressive photography functions. The compact, lightweight body is equipped with in-body Image Stabilization and a 180-degree flip-down LCD screen with a step-by-step touch menu interface to guide the user as they capture beautiful photos in any situation.
The Olympus PEN E-PL10 delivers blur-free high image quality with a simple touch operation. It is packed with features that expand creative expressions, such as Selfie, Art Filters for impressive, artistic finishes, and offers compatibility with various interchangeable lenses. By using the built-in Wi-Fi® or Bluetooth® in conjunction with the Olympus Image Share (OI.Share®) smartphone app, the camera easily connects to a smartphone to transfer images and share them on social media. Tutorial videos are also available to learn photography techniques using OI.Share, making it the perfect interchangeable lens camera for the beginner photographer. The PEN E-PL10 packs versatile features in a simple, sophisticated, compact design, available in three trendy colors that you can take everywhere.
Stylish, Premium Finished Design with Hidden Flash
The E-PL10 is available in shiro 白 (white), kuro 黒 (black), and mocha 茶 (brown). Each is designed with premium materials and offers chic finishes, including leather grain, brushed aluminum, a grip that makes it easy to hold, a large mode dial and a built-in hidden flash, turning this camera into a fashion accessory. When paired with the M.Zuiko® Digital ED 14-42mm F3.5-5.6 EZ wide-angle zoom lens, it is highly portable, and lighter than a 16 oz. bottle of water.
Intuitive Guides with Simple Touchscreen
Never get frustrated with your camera again. The Olympus PEN E-PL10 removes the guesswork from your photography with four primary assist options on the mode dial that cater to users based on skill level: Auto, Scene (SCN), Advanced Photo (AP) and Art Filters (ART). For carefree photography, Auto Mode precisely detects the scene, lighting, subject and camera motion and automatically selects the optimal settings. Auto Mode ensures reduced blur for clear, sharp photos by detecting camera shake and moving subjects, then adjusting its settings accordingly. For additional control or hard-to-capture shots, Scene Modes allow the user to quickly customize the camera's settings directly from the touch screen. Double-tap to choose from one of the following six categories: People, Motion, Indoors, Nightscapes, Scenery or Close-ups. Follow the on-screen prompts to capture challenging shots, like a candle-lit birthday cake or a running pet with a blurred motion background. The Shortcut button jumps directly to the primary settings in each photography mode, providing efficient navigation to the settings you need without having to fumble through cumbersome menus or instruction manuals.
180-Degree Flip Touchscreen with Automatic Selfie Mode
The PEN E-PL10’s unique flip touch screen makes photography and learning how to capture a great photo easier. Simply touch the subject shown on the LCD monitor to simultaneously focus and activate the shutter (Touch AF Shutter). When the monitor is flipped down, it automatically switches the camera to Selfie mode for easy and beautifully exposed selfies in all conditions, day or night. You can also select e-Portrait for brighter, smoother skin, or switch to movie recording with a simple touch operation. With Touch AF high-speed autofocus, the camera instantly focuses and captures with a simple touch of the screen.
Next Level Photography Features and Customizable Art Filters
The PEN E-PL10 uses a 16-Megapixel image sensor, paired with Olympus' dual-core TruePic VIII Image Processor, to deliver outstanding quality in every image. Advanced Photo (AP) mode provides functions that generally require advanced photography techniques, but is simple to operate. Anyone can capture a multi-exposure photo by simply overlapping two images in Multi Exposure, and capture light trails of stars or crisp trails of automobile headlights and taillights without the risk of overexposure using Live Composite. Silent Mode, which mutes shutter and operation sounds, is now possible in P, A, S and M modes as well as AP mode. Art Filters make it easy to create distinctive photography in-camera, no post-editing required. With 16 unique Art Filter options, you can capture creative photos simply by scrolling and tapping on the screen. Use the new Fine Tune option to adjust the level of Art Filter effects while checking the results on the screen to create the photo exactly how you like. High Speed Sequential Shooting of up to 8.6 frames per second in Single AF Mode, or up to 4.8 frames per second in Continuous AF Mode, ensures that the shot is never missed.
Handheld Shake-Free Still and Smooth 4K Video
This model is equipped with 3-axis in-body Image Stabilization, allowing the user to capture blur-free photos handheld without the need for a tripod, even in situations where camera shake can cause blur (nighttime photography, dim indoor situations, while shooting video, or when using a telephoto or macro lens). The PEN E-PL10 captures smooth UHD 4K 30p video for ultra-high-resolution capture, no stabilizing gear needed. In-Movie Capture allows the user to capture an 8MP image from a 4K video.
Easy Wireless Sharing with Built-in Bluetooth and Wi-Fi
Use the built-in Wi-Fi and Bluetooth with the free Olympus Image Share (OI.Share) app to easily connect the camera and smartphone to import images and wirelessly share with friends and followers. By using the Share Order function, selected photos or videos on the camera will be automatically transferred to your smartphone once the camera is turned off. Convenient remote control of the PEN E-PL10 allows the user to control the camera settings and compose images all from a smart device, perfect for when the user wants to be in the picture. OI.Share also provides a camera how-to guide, containing tutorial videos of photography techniques and a digital guidebook packed with other useful photo tips.
Versatile Interchangeable Lenses
A versatile lineup of compact, lightweight, high-performance interchangeable lenses is available, including bright, single-focal-length lenses, as well as macro lenses, to deliver beautiful defocusing effects. Dramatically expand the possibilities of photographic expression with the perfect lens.
Separately Available Accessories
Genuine Leather Body Jacket (CS-45B), Genuine Leather Shoulder Strap (CSS-S109LL II), and Genuine Leather Lens Cover (LC-60.5GL). These genuine leather accessories are designed to protect the camera and enhance its design.
Pricing and Availability
The Olympus PEN E-PL10 is available now in shiro 白 (white), kuro 黒 (black), and mocha 茶 (brown) to easily match with any style. The camera body only has a suggested retail price of $599.99 USD and $779.99 CAD. The camera body bundled with the M.Zuiko Digital ED 14-42mm F3.5-5.6 EZ Lens, camera case, lens cloth and SD memory card has a suggested retail price of $699.99 USD.
 Image size is 8160×6120 pixels.
 When 5-axis sync IS used. Lens used: M.Zuiko Digital ED 12-100mm F4.0 IS PRO. At a focal distance of f=100mm (35mm equivalent: f=200mm), halfway release image stabilization: Off, frame rate: high speed. Conforms to CIPA standards, when corrected on 2 axes (Yaw and Pitch). Current as of February 18, 2020.
 M.Zuiko Digital ED 12-100mm F4.0 IS PRO, M.Zuiko Digital ED 300mm F4.0 IS PRO (As of February 18, 2020)
 Lens used: M.Zuiko Digital ED 12-40mm F2.8 PRO. At a focal distance of f=40mm (35mm equivalent: f=80mm), conforms to CIPA standards, when corrected on 2 axes (Yaw and Pitch)
7 Under Olympus test conditions.
8 CIPA testing standard.
 Launch offers may apply.
 35mm equivalent: 24-90mm
 As of February 12, 2020. World’s most compact, lightweight medium range zoom PRO lens with a fixed aperture value.
 When combined with a dustproof and splashproof OM-D series body.
 Supported cameras: OM-D E-M1 Mark III. A firmware update is required for the following camera models: OM-D E-M1X, OM-D E-M1 Mark II, OM-D E-M5 Mark III
 As of February 12, 2020. Camera body sold separately
 The camera must be set to AUTO in order to select e-Portrait on the screen in Selfie mode.
 Only available in Pop Art I and Soft Focus.
 Up to 3.5 shutter speed steps. M.Zuiko Digital ED 14-42mm F3.5-5.6 EZ lens at a focal distance of f=42mm (35mm equivalent: f=84mm), conforms to CIPA standards, when corrected on 2 axes (yaw and pitch)
 When the smartphone OS version is Android 6.0 or later, images are not transferred automatically when the smartphone display is in sleep mode (when the smartphone screen is off). Make sure the display is active. On iOS devices, Olympus Image Share (OI.Share) must be launched first.
The post Olympus Adds Second Mirrorless Flagship Camera: OM-D E-M1 Mark III appeared first on HD Video Pro.
CES always offers a mind-blowing array of screens on display. Also mind-blowing is the hype that goes with them. This year’s exhibition in Las Vegas was no different.
I never walk away from CES with a definite idea of what the best display is. It’s really not a place to compare, as nothing is side by side or showing the same material. What I do get from it is a sense of where manufacturers think things are going.
TV makers were holding on to what was cool last year. LG had their roll-up screens. The entrance to their booth once again showcased their ability to curve screens. Like last year, it drew crowds of people trying to capture the display on their non-curved smartphones.
Samsung attracted attention again with “The Wall,” their MicroLED display. In 2020, the display was even bigger, measuring out at 292 inches. New this year was a model in 8K but at 150 inches.
While Samsung and LG competed to attract the most crowds at their booths, they also vied to define 8K resolution in displays. Continuing a battle started in October, LG claims that their sets are true 8K because of how they measure resolution.
I listened to both sides of the argument. While it’s an interesting conundrum, I won’t feel bad if it doesn’t get resolved for a while. Why? Because I feel we first need to get to a place where 4K is as simple to do as HD has become. I don’t think we’re there yet.
I have to ask how much of the finishing being done is truly 4K. For effects work, is it all done in 4K? Why are streaming services charging extra for 4K? Once we get to a place where 4K isn’t special, then maybe there’s a place for 8K displays. And when we get there, will 8K be different enough?
I saw a lot of 8K on the show floor that was time-lapse footage. That always sends a subliminal message to me that capturing 8K in real-time isn’t easy, either from a practical standpoint (the gear/talent may not be available), or it’s too expensive, or both.
On the other hand, it’s relatively easy to set up a DSLR that can capture 8K (or more) resolution at one frame per second, rather than 24, 30 or 60 frames per second. And that capture process hasn’t really changed since HD times.
That’s not to say the time-lapse footage looks bad because it usually doesn’t. But I rarely see any time-lapse footage on 4K televisions at CES. Using time-lapse also hides the elephant in the room (in my opinion) with 8K: frame rate and camera movement. But that’s for another time. Still, display manufacturers have to have a big splash at CES, so 8K lives.
The Canon EOS Cinema C200 Digital Camcorder with a good amount of accessories on board.
Digital Cinema Camera manufacturers spend a lot of time and effort crafting the designs of the cameras that we buy and use. The ergonomics, layout, design workflow and usability are a significant part of the camera design budget. Yet how many times have you bought a digital cinema camera or even a mirrorless/DSLR camera and NOT accessorized it?
For me, I rarely shoot with any camera without third party accessories. Here’s why. It’s all about purpose and deployment. Case in point, mirrorless cameras are designed and executed as still cameras that also happen to shoot pretty good video as well. For those of us who buy them mainly to shoot video, we aren’t using the cameras for their primary purpose. To make a mirrorless camera more appealing for video shooting takes accessories. Sometimes a lot of them. To deploy a mirrorless camera on a gimbal, handheld, off the shoulder or mounted to a tripod, there’s a huge array of accessories available that allow the camera to perform better in each situation.
Let’s take a look at what I use as my main A camera, the Canon EOS Cinema C200. The C200 is great for clients who want to shoot RAW (yes, they do exist, we have at least a few of them), clients who care about shooting 4K60p and for clients who like a relatively small footprint in the camera used while still retaining an extremely high-quality level with Cinema RAW Light recording. For broadcast clients who care more about a mid-range codec than RAW or 4K60p, we shoot the C300 MKII or the FS7 MKII. For accessory purposes, the two Canons are basically similar; the FS7 is a bit different, but a lot of our various camera accessories are also adaptable to the Sony.
Let’s start at the bottom of the camera. While you can attach a tripod plate directly to any camera, pro users often utilize a baseplate. The camera baseplate can offer a variety of additional functionality to the camera. Here are just a few reasons we use a baseplate: we’re often required to shoulder mount the camera for long-term handheld shooting. Small digital cinema cameras are rarely very shoulder mount friendly unless you attach them to a baseplate designed for shoulder-mounted shooting. The baseplate we chose is the Zacuto VCT Pro. There are many others from SmallRig, Tilta, Wooden Camera, Arri and others, but we chose the VCT Pro for some specific reasons, primarily because it mounts to the ever-popular Sony VCT14 tripod plate, the same model we have been using since we started in the business with Betacams and other larger Sony broadcast-type cameras like the F900. The VCT Pro baseplate also has a sliding adjustment that lets us customize the center of gravity balance point of the camera/lens/accessory package mounted on it.
The VCT Pro also allows the use of 15mm rods, two sets, one in the front (useful for mounting handles, follow focus or FIZ controls, lens supports and other accessories) and one set in the rear (useful for attaching outboard recorders, battery plates and wireless video transmitters). The gel-padded cells on the bottom of the VCT Pro are fairly soft and comfortable, allowing you to shoot shoulder-mounted longer without pain. Lastly, the Zacuto VCT Pro is fairly universal, allowing us to use it with almost any camera we own or rent. Few things in our business are usable with lots of different cameras, so it seemed to be a good idea when we bought it.
Moving up to the top of the camera, we bought a Zacuto C200 Top Plate and Recoil Handle. Why do you need or want a top plate and/or handle? It varies from user to user and camera to camera, but for us, we wanted to specifically mount the accompanying Zacuto Recoil handle so that we would have a place to mount the C200’s touchscreen farther forward than the stock handle allowed.
The top plate also gives us a variety of ¼”-20 and 3/8” mounting options should we need to mount other accessories to the top of the camera. The top plate is rock solid and allows for more configuration options.
What about rods? There are two basic flavors of rods for digital cinema cameras, 15mm and 19mm. 15mm is definitely the most popular, but 19mm is common on large, heavy-duty builds more often used in features and episodic work; the 19mm rods are more rigid and can support more weight without flexing. We use 15mm rods on the front of the VCT Pro for lens supports for longer, heavier lenses and the occasional follow focus. We also mount a Wooden Camera dual Arri Rosette mount for handles when operating shoulder-mounted. We occasionally use shorter 15mm rods on the rear of the camera to support a V-Mount battery plate or sometimes an Atomos recorder.
When operating shoulder-mounted, most cameras have the grip handle placed too high and too far back on the body to be useful. We have two Tilta skeleton camera grip extension arms that we use on the C200, usually only one on the C300 MKII. The Sony FS7 comes with its own grip arm extension, but there are third-party accessory arms from Shape that allow single-handed adjustment of length, unlike the stock Sony unit, which requires two hands to adjust.
Moving away from shoulder mounting, cages are extremely popular for mirrorless and DSLRs as well as for small cameras like Blackmagic Pocket 4K and 6K cameras. These smaller-bodied cameras tend to be more fragile than their larger digital cinema cousins, so placing a metal cage around the camera body gives you more points to mount accessories like monitors, handles, lights and microphones as well as offering protection against impacts against the camera body. A mirrorless or DSLR type camera can be mounted and usually left in a cage permanently unless you need to configure your camera in a way where the cage adds too much bulk or weight.
So far, we’ve covered a bit about why we accessorize our cameras in various ways that have mostly been about physical dimensions and functionality for shooting. Two areas we haven’t yet covered are audio and time code accessories. Neither our C200 nor our Fujifilm X-T3 offers time code inputs, so we purchased three of the Tentacle Sync E, a Bluetooth time code generator system that allows us to sync up to three audio sources with each other wirelessly. The C300 MKII has time code inputs so it can easily be hard wired to a pro sound recorder, but we even use our Tentacle Sync E with the C300 MKII because going wireless for timecode sync is always easier than running long BNC cables all over the floor.
Other audio accessories that are popular for all types of cameras are onboard shotgun microphones. There are a lot of different models on the market, but the Røde line seems to be one of the most popular with mirrorless/DSLR/phone users. We use an Audio Technica AT-875r on our C200, C300 MKII and FS7 as a camera mic for scratch audio, but we use a Røde Video Micro on our Fujifilm X-T3 because it’s smaller, lighter and outputs a 3.5mm signal (versus the XLR output of the pro mics) to better match the Fujifilm X-T3’s 3.5mm microphone input.
Other popular camera audio accessories are audio interfaces like the Panasonic DMW-XLR1 XLR Microphone Adapter, a popular addition to the Panasonic DC-GH5 and DC-GH5S mirrorless cameras that allows pro phantom-powered microphones to be used with a consumer/prosumer mirrorless camera. I wish all mirrorless camera manufacturers offered a similar adapter.
At the end of the day, stock mirrorless/DSLR and digital cinema cameras have to appeal to a wide audience of users who use them in mind-bogglingly diverse ways. I never cease to be amazed at how many people hang their cameras from drones, take them underwater, mount them on motion control sliders and jib arms, shoot handheld or on a gimbal or Steadicam-like devices. Many users need these accessories that allow them to use their cameras in so many different ways. We live in a time when there’s a wealth of aftermarket manufacturers, from budget to high end, to fill our every need for rigging, mounting and shooting. We’re truly spoiled for choice.
At CES, I walk the acres of exhibit floors looking at all the new technology. There isn’t a great deal that applies to my job as a video editor, but certainly, displays and various capture devices do. Also, as someone who has been involved in the tech industry for most of my life, I relish the opportunity to learn where technology is going.
As I view the exhibits, in the back of my mind I try to envision how the technology I see might apply to my suite. But since most of the exhibits are geared to consumers, that brings up the whole issue of using consumer gear in my world.
When I talked about the fact that HDMI was never designed to be used in a professional production environment, it made me think about gear I use that some wouldn’t call professional. (I’ll skip the argument on what’s professional and what’s not. I can leave that for the endless comment sections produced when Final Cut Pro X was introduced.)
Of course, I use HDMI, usually to show my clients edits on a consumer TV in the endless quest to answer the question, “Is it going to look like that at home?” Is HDMI my preference? No, but there aren’t any sets that have BNC connectors.
Do I really care? Am I being a BNC snob? Not at all. I’m fine using consumer gear to get the job done. But, at the same time, I keep in mind that I’m using something that might not have the same performance as professional gear.
So, as I was walking the show floor, I came across Hyundai Technology. And no, there were no cars there. Although its origins came from the same company, Hyundai Technology creates various consumer devices. The one that caught my eye was an external SSD.
Small and portable, with models from 250 GB to 2 TB, its read and write speeds are specified at 450 MB/s and 400 MB/s, respectively. It’s packaged with both USB-A and USB-C cables. This was a device that I could use when I needed to transfer files!
I often have clients in my suite who want to give me source material. The usual method of handing me a USB stick becomes problematic because files are getting larger and larger and most (not all) USB sticks are slow. So we wait to transfer files from their computer to the stick and then from the stick to my edit array. It certainly doesn’t take hours, but it can definitely dash the momentum during a session.
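To put those speeds in perspective, here’s a rough back-of-the-envelope sketch of what the drive’s quoted 400 MB/s write speed means for session momentum. The 30 MB/s figure for a typical budget USB stick is my own assumption, not a number from the article:

```python
# Back-of-the-envelope transfer-time comparison.
# 400 MB/s is the SSD's quoted write speed; 30 MB/s is an assumed
# figure for a typical budget USB stick (not from the article).
def transfer_minutes(size_gb: float, speed_mb_s: float) -> float:
    """Time in minutes to move size_gb of footage at speed_mb_s."""
    return (size_gb * 1000) / speed_mb_s / 60

usb_stick = transfer_minutes(64, 30)   # ~35.6 minutes for a 64 GB card
ssd = transfer_minutes(64, 400)        # ~2.7 minutes for the same card
print(f"USB stick: {usb_stick:.1f} min, SSD: {ssd:.1f} min")
```

Even allowing for real-world overhead, that’s the difference between a coffee break and a stalled edit session.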
Another thought I had when I was learning about the SSD is that while it’s small, it’s large enough that it won’t get lost. And by “lost” I mean that it won’t be accidentally taken by the client. It won’t be in a pile of USB sticks or get mixed in with any client sticks.
The Hyundai SSD will be another piece of gear marketed to consumers that can have a place in my workflow. It might not compare in some performance aspects with another “professional” SSD: it might not be screaming fast or survive being dropped from a two-story building or being run over by a dump truck, but that’s fine. I’ll take that into account in how I use it.
That’s the trade-off of using consumer gear in a professional environment, with emphasis on “trade.” It’s like the trade-off between a warehouse-style store and a full-service store. The warehouse store might not look as nice and it might take you longer to find someone to help you, but you pay less at check-out.
As long as you remember where you’re shopping, you shouldn’t be disappointed that it might take you longer to find what you need. Likewise, as long as you remember what your gear was originally designed for, you shouldn’t be disappointed in its performance.
“Sérgio Mendes: In The Key of Joy,” a documentary that I shot in 2017 and 2018, just had its premiere at the Santa Barbara International Film Festival.
It finally happened. Director John Scheinfeld’s feature documentary about the life of Brazilian composer and musical legend Sérgio Mendes, “Sérgio Mendes: In The Key of Joy,” on which I served as director of photography all through 2017 and into 2018, is finished and had its world premiere at the Santa Barbara International Film Festival a couple of weeks ago. It’s been quite a while since I’ve seen my work projected in a theater with an audience, and if you haven’t had the privilege to experience it, it’s something special to feel their reaction as they watch the story you helped to tell.
During production, there were dozens of shoots here in Los Angeles, at Sérgio’s home and at multiple recording studios. In 2017, we journeyed to Brazil with Sérgio and his wife Gracinha to shoot Sérgio at his birthplace, at the first club he ever played at professionally and at various locations all around his home town of Niterói, across the bay from Rio. We also shot studio sessions with various musicians in Rio and a song session with soccer legend Pelé in São Paulo. I flew with director John Scheinfeld and producer Dave Harding to Rio, where I had my first exposure to working with Brazilian crews. My crew there was so professional, helpful and a lot of fun to work with. Visually, Brazil is a wonderful tapestry of tropical beauty mixed with European and Portuguese influence; I’ve rarely shot in a more beautiful, spectacular location.
While there, we shot Sérgio and various scenic b-roll all around Rio and Niterói, but the highlight of the shoot for me was the day we spent in a small recording studio in the hills of Rio, almost directly in the shadow of the iconic Christ the Redeemer statue. For the session, Sérgio assembled a small jazz ensemble of bass, drums and a three-piece horn section. With Sérgio at the keyboard, the group played original jazz tunes that Sérgio had written as a young composer when he was just coming onto the music scene in Brazil. These jazz tunes were written well before Sérgio found worldwide acclaim for his global hit, “Mas Que Nada,” with Brasil ’66. No vocals, no lyrics—just pure, unadulterated jazz. It was a challenging shoot, trying to light the small studio to resemble a jazz club, but the musical experience was incredible as a fan, and I kept practically pinching myself that, as I shot, I was getting to hear music nobody else had heard played in more than 60 years. It was a magical experience for a jazz fan.
As the days of the shoot wore on, we traveled with Sérgio all around Rio, shooting b-roll of the beautiful resort and beach areas. One of the most spectacular shoots was documenting the sunset over Dois Irmãos (Two Brothers), two iconic mountains that tower over Ipanema Beach. We filmed the scene from a rock outcropping on the east side of the beach. In Rio, watching the sunset is an activity all its own, and we were joined by hundreds of people on the rocks as we filmed the sun descending behind the mountains. Everyone on the beach cheered and applauded as the sun disappeared behind the two mountains, the scene punctuated by vendors selling fruit and drinks to the assembled crowd. It was quite a unique experience. Where else have you filmed where the sunset is its own star with an adoring audience?
Another epic shoot in Brazil, both logistically and emotionally, was filming Sérgio aboard the ferry from Rio to his hometown of Niterói across the bay. Sérgio was born during World War II, in 1941, so riding the same ferry that he rode as a teen, then a young man, to travel from his home in Niterói to Rio, where he played his first gigs as a professional musician and composer, was quite emotional. Compounding that was filming on the top deck of a 200-foot-long ferry over the bay. The winds were very high, making the shoot a challenge for sound and for keeping the flags and reflectors I was using to light Sérgio from blowing away. My Brazilian grip and gaffer were on it, but everyone in the crew pitched in to keep any of our grip gear from going airborne into the ocean as we crossed the bay.
Our director, John, and our producer, Dave Harding, decided that we would shoot all of our interviews using a green screen. As a cinematographer, for me, shooting interviews on a green screen isn’t always the most creative endeavor, as you rarely get to choose and light the background plates. That function, at least in documentaries, is often decided later, in post, long after you’ve shot the interviews. Green screen is often a “cart before the horse” situation in that you don’t know what the backgrounds will be, so how do you decide to light the talent in the green-screen shot? What will the lighting on the backgrounds look like? Which direction will the light come from for a given shot? Color and textures?
Since we were shooting in multiple countries and locations, from a production standpoint I understood the decision to shoot green screen; it made sense. We shot some interviews in various recording studios, several more interview sessions in Sérgio’s home and in many different locations in Brazil. But I was a bit wary that the interviews I had shot probably wouldn’t match very well with the backgrounds. More on this later.
We were able to shoot interviews with an amazing array of talent. Quincy Jones, Harrison Ford, Herb Alpert, Lani Hall, John Legend, will.i.am, Common, Brazilian musician Carlinhos Brown, Gracinha, Sérgio’s wife and all of Sérgio’s family, numerous Brazilian TV executives, journalists and musicians all make appearances in the film. The list went on and on. We developed a loose signature look for the interviews, even though they were all shot green screen. Our producer, Dave Harding, was a former gaffer, so it was nice to have a producer who understood lighting and could confer with me to help push the look to what John’s vision for the film was.
I utilized a soft key source, usually using a LitePanels Gemini 2×1 shone through a 4x or 6x silk and then, rather than utilizing another soft or hard source as a fill source opposite the talent as one might normally do, I’d place another smaller instrument like an Aputure Lightstorm LS-1S LED panel through a 42-inch diffusion disc on the same side as the key source but lower and at less intensity. This would give me some extra power to light talent, whose skin tones ranged from Caucasian and fairly pale to fairly dark skin, in a relatively even light level. I would sometimes use a solid or Duvetyn on a C-stand on the fill side of the talent’s face to knock down the wrap from the two soft sources, allowing us to get some shadow and “mood” on the talent, but not too much as the tone of the film was to be upbeat, lighthearted and joyous.
For the women and some of the male interview subjects, I’d finish off the look using a small hair/rim light. For women, even though hair lights are a bit out of style lately, it gave a nice flattering glow to their hair, but I kept the level to a minimum, trying to not make it too apparent. Most importantly, the hair/rim light would nicely separate Sérgio’s iconic hats that he always wears. It was kind of wonderful; Sérgio seemed to wear a different hat in almost every interview in the film. I wanted to make sure that the hats were clearly highlighted as part of Sérgio’s look, and I think we succeeded.
Later, in 2018, I received a call that we were going to actually have sets built and shoot them as the background plates for the green screen interviews. We also shot a large selection of individual elements that would be integrated into various motion graphics in the film. It’s rare, as a documentary DP, to have the opportunity to also shoot the background plates for your green screen interviews, so that was an interesting experience for me and our Los Angeles crew. Our gaffer, Mark Napier, even came up with a really effective way to add movement to our shadows on the backgrounds using a rotating Mason jar as well as a woven bamboo basket. The patterns and lettering from the glass broke up the light and, along with the bamboo basket, lent a nice, organic quality to the movement, kind of like the sun shining through moving tree branches without being as literal as using a “Branch-o-loris,” shining a light through an actual tree branch on a C-stand. The backgrounds came out nicely and are used to great effect throughout the film.
Director John Scheinfeld’s cut of the film comes in at about 100 minutes and played its opening night at the Santa Barbara Film Festival to a sold-out house. The audience gave the film, John and Sérgio—both of whom were in attendance—a standing ovation. John and Sérgio gave a Q&A after the screening, which was then followed by a set with Sérgio and his touring band. I was lucky enough to be in the audience that night and received a nice shout out from John about the photography of the film and the challenges we encountered shooting it.
At the screening, I reflected back on how lucky I was to use my cinematography to help tell the story of a musical legend. In a way, Sérgio Mendes has musical accomplishments commensurate with artists like the Beatles (Sérgio and his band opened their set at the screening with a Brazilian-flavored cover of the Beatles’ “Fool on the Hill,” a big hit for Sérgio in 1968). His music built a huge global audience for Bossa Nova, and he’s still touring all over the world, collaborating and composing with younger musicians and producers like will.i.am (producer of The Black Eyed Peas, with whom he re-recorded an updated version of his breakthrough hit, “Mas Que Nada,” in 2006), John Legend and Common, as well as numerous other musicians and collaborators who appear on his latest album, which is just being released as I write this.
As John related during the Q&A session at the screening, the film is John’s positive, feel-good antidote to the darker political and social times that are so prevalent in 2020. As Sérgio’s story evolves in the film, he faced some incredibly dark times with political persecution during the military coup in Brazil as a young man, which was the main reason he came to New York in the 1960s. But he never let the darkness he experienced color his optimistic outlook on life; he found joy and happiness through the serendipity of life, which is a timeless and relevant message for us all.
The post Documentary Serendipity—Sérgio Mendes: In The Key of Joy appeared first on HD Video Pro.
Wunmi Mosaku and Sope Dirisu appear in “His House,” directed by Remi Weekes, an official selection of the Midnight program at the 2020 Sundance Film Festival. Courtesy of Sundance Institute | photo by Aidan Monaghan.
“His House” pulsates with claustrophobic unease, a horror film that disturbed audiences to powerful effect. It’s a remarkable and surreal entry from first-time director Remi Weekes, reveling in terrifying moments and a disturbing storyline, screened as part of Sundance’s Midnight program.
In this video clip, Weekes describes how he created such a claustrophobic horror film:
The film tells the story of a refugee couple escaping war-torn Sudan, only to find themselves in an even more dire situation as refugees housed in the sanctuary of the UK.
Saved after a small boat packed full of people capsizes in rough waters, the couple arrives in England, heartbroken over the death of their young daughter, who drowned during the rescue. Sope Dirisu and Wunmi Mosaku play husband and wife with pinpoint accuracy, a relationship becoming increasingly strained while attempting to integrate into a society that simply does not want them around, or just does not care.
Housed in a raggedy interior that is nightmarishly dirty—mysterious holes in its walls, endless streams of roaches and lights that never turn on—the couple quickly becomes aware of irrational and incomprehensible phenomena that surround them in their less-than-humble abode.
A threatening atmosphere builds from the outset, not only inside the derelict house but outside too: As Mosaku’s character tells her caseworker, she survived in Sudan by wearing the scars of two warring tribes, staying alive by trying not to belong anywhere at all. The cultural divide is further underlined when she approaches three black school kids, struggling to get directions from them as they mock her thick African accent.
Her husband is equally at odds, shopping for clothes in a store surrounded by images of pristine Caucasian models. Their horrors are everywhere, both subtle and extreme, and there seems to be no escape. Their growing unease soon escalates with the appearance of dreamy, grotesque and bizarre creatures peeking out from broken walls and shuffling past from dark, creepy corners. Surreal horror plays large, the dreamlike, the grotesque and the fantastical all around, a home with cosmic supernatural spirits at play.
“His House” is a disturbing film that highlights the plight of refugees within a horror-laden storyline, one that questions where we belong, what we call home, and how we identify who we are. Weekes does an admirable job balancing the surreal genre elements of horror with the very real trauma of migration.
The post Sundance 2020: “His House” Offers No Refuge From Horror appeared first on HD Video Pro.