Halation and Gate Weave

I'm working on stepping up my film emulation strategies and I was curious how you guys are handling halation and gate weave. I know there was an old thread about halation floating around, but I wanted to start a new one as that wandered off a bit.

My current strategy for halation is a fairly soft luma key of the highlights combined with a minor red/yellow glow using the built-in glow effect. While it "works", I'd love to use a more technically accurate method, as the glow effect can go a bit crazy on brighter scenes.
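For anyone curious, that "soft luma key plus tinted glow" approach can be sketched outside Resolve with NumPy (the function names, thresholds, and tint values here are all placeholders of my own, not anything from Resolve): key everything above a display-referred luma threshold, blur it, tint it warm, add it back. Working display-referred like this is also why it blows up on bright scenes — a bright sky keys just as hard as a practical lamp.

```python
import numpy as np

def soft_luma_key(img, lo=0.7, hi=0.9):
    """Soft highlight key: 0 below lo, 1 above hi, smoothstep ramp between."""
    luma = 0.2126 * img[..., 0] + 0.7152 * img[..., 1] + 0.0722 * img[..., 2]
    t = np.clip((luma - lo) / (hi - lo), 0.0, 1.0)
    return t * t * (3.0 - 2.0 * t)

def box_blur(channel, radius):
    """Cheap separable box blur standing in for the glow's Gaussian."""
    k = np.ones(2 * radius + 1) / (2 * radius + 1)
    out = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 0, channel)
    return np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, out)

def glow_halation(img, tint=(1.0, 0.35, 0.1), radius=8, strength=0.4):
    """Key the highlights, blur them, tint them red/yellow, add back."""
    glow = box_blur(soft_luma_key(img), radius)[..., None] * np.asarray(tint)
    return np.clip(img + strength * glow, 0.0, 1.0)
```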

In terms of gate weave, I haven't experimented too much yet, but this Twitter thread got me thinking this morning. @Dan Moran had a pretty cool idea of using the Gate Weave OFX effect (link) and @jamie dickinson suggested a Fusion tool, but ideally I'd love to keep it on the color page.

How are you working with these elements of film emulation? Do you have a neat way of re-creating the effects within Resolve w/o 3rd party tools?
 
I have been trying to build a node tree that can be applied to every shot but only adds halation to highlights with strong edges.

So far it’s the original image fed into a layer mixer with lighten mode and a selective Luma key feeding Find Edges OFX.

I have looked at using camera shake for gate weave but have not found the right settings yet.
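For anyone who hasn't played with it, the Layer Mixer's Lighten composite is just a per-channel maximum, which is why it behaves more like flaring than brightening: the halation layer only shows where it exceeds the underlying image, so it reads over dark backgrounds and disappears over bright ones. A trivial NumPy sketch (my own naming):

```python
import numpy as np

def lighten(base, layer):
    """Layer Mixer 'Lighten' composite: per-channel maximum."""
    return np.maximum(base, layer)

# A flat 0.3 halation layer over a dark (0.1) and a bright (0.8) pixel:
# it lifts the dark pixel but leaves the bright one untouched.
base = np.array([0.1, 0.8])
halo = np.array([0.3, 0.3])
print(lighten(base, halo))  # [0.3 0.8]
```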
 

Sounds very interesting, that's along the lines of what I'm trying to do. I keep having to adjust my previous attempts too much per shot.

Neat idea of using the lighten mode!
 
I know that noted DP Steve Yedlin, ASC has a proprietary method for adding halation and other film-like qualities. But I don't think gate weave is a good idea unless you're trying to imitate (or match) archival film. I have had cases where I had to rebuild main title elements for classic films, and we had to precisely match weave, jitter, and grain so that there wasn't any great change going from the real thing to the simulation, and it's tough.

Yedlin's essays start here:

http://www.yedlin.net/OnColorScience/

Also quite a bit on Reddit. I believe Nuke is his weapon of choice, but there's lots of ways to composite or process this stuff.

One trick with halation is it's different in every year, with every stock. I'm doing a Chinese film right now that is weird-city in terms of halation and flare, but was only done in 1989. I gots no idea why it looks the way it does... but it doesn't look like another 1989 film I worked on a few weeks ago. It's a moving target, ephemeral and unpredictable. I'm not sure it's worth imitating, because to me it's not a quality, it's a flaw.
 
Most “flaws” from the '80s are now a “stylistic” choice. But, to add something useful to the conversation... Juan Melara is doing some interesting things matching halation to Alexa-profiled film stock LUTs that he created. They aren't available, but perhaps you could at least see and compare/analyze what he's up to. Check his Instagram.
 
Yes, I am very interested in this too. The key for me is finding a method that, as said above, needs minimal tweaking shot to shot and is pretty much 80-90% there. The issue I have had so far is getting a method that truly interacts with only the highest-contrast edges, and not the interior of the bright source. The edge-detect tool seems to be the best way to isolate high-contrast edges, but I have not been able to get it to look natural; it can create harsh edges at times. Without it, though, there is constant tweaking needed shot to shot. The other tricky part is getting the color right and having it blend well. I started this thread on Lowepost a year ago that has some interesting methods in it: https://lowepost.com/forums/topic/1103-simulating-film-halation/

Also, @Juan Melara just posted some stills on Instagram of his halation simulation and film emulation, which is very impressive.

https://www.instagram.com/p/BzaZw9-hsTK/?utm_source=ig_web_button_share_sheet
 

Attachments

  • LogNoHalation.jpg (112 KB)
  • LogwithHalation.jpg (112.2 KB)
  • Final.jpg (123.8 KB)
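For what it's worth, the "interacts with edges only, not the interior" behaviour of an edge-detect pass shows up even in a bare gradient-magnitude sketch (a stand-in for the Find Edges OFX, with my own function name): the mask fires only where neighbouring pixels differ, so the flat interior of a bright source contributes nothing — and, left unblurred, it produces exactly the kind of harsh one-pixel edge described above.

```python
import numpy as np

def edge_mask(luma):
    """Central-difference gradient magnitude, normalized to 0-1."""
    gx = np.zeros_like(luma)
    gy = np.zeros_like(luma)
    gx[:, 1:-1] = luma[:, 2:] - luma[:, :-2]
    gy[1:-1, :] = luma[2:, :] - luma[:-2, :]
    mag = np.hypot(gx, gy)
    peak = mag.max()
    return mag / peak if peak > 0 else mag

# A hard vertical step: the mask lights up along the boundary only,
# leaving the interior of the bright half at exactly zero.
step = np.zeros((8, 8))
step[:, 4:] = 1.0
m = edge_mask(step)
```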
It was actually @Juan Melara, along with Steve Yedlin's work, that inspired me to take a deeper look at this, as the results are quite impressive. @Marc Wielage, Steve does use Nuke and has been fairly open with his process, but it does require quite a lot of processing power (from what I hear). They used a Linux render farm to handle the Nuke work for The Last Jedi (@Walter Volpatto, feel free to clarify if I'm mistaken). I'm looking for something that finds a middle ground between technical accuracy, render times, and creative control over the effect.

Thanks for sharing your PG, @Jeremy Dulac! Curious to check it out when I get a chance.
 
To me, scene referred workflow is mandatory for this kind of powergrade to work on every shot without re-adjusting. You set it once and then the shot’s exposure/composition dictates how much it naturally halates.
Agree.


So outside of ACES and using a color-managed timeline, how could you set this up? Could you use a scene-referred colorspace for the timeline sequence work?
 
So outside of ACES and using a color-managed timeline, how could you set this up? Could you use a scene-referred colorspace for the timeline sequence work?

You can use any working color space you want, convert all incoming formats to it with CSCs or LUTs, and have a single output LUT. That basically IS a "color-managed timeline," with the caveat that you choose the working color space (often Arri LogC, Sony S-Log3/S-Gamut3, etc.). Many, if not most, studio-level DIs are done that way.
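As a toy illustration of that pipeline (using a made-up pure-log working curve, not the real LogC or S-Log3 math): each source gets its own input transform into the shared working space, grading happens there, and one output transform takes everything to the display.

```python
import numpy as np

# Hypothetical pure-log working curve -- a stand-in for LogC/S-Log3,
# not either camera's actual encoding.
def lin_to_log(x, offset=0.01):
    return np.log2(np.maximum(x, 0.0) + offset)

def log_to_lin(y, offset=0.01):
    return np.exp2(y) - offset

def input_transform(camera_linear):
    """Per-source CSC/LUT: camera linear -> shared working log space."""
    return lin_to_log(camera_linear)

def output_transform(working_log):
    """The single output LUT: working log -> display (bare gamma-2.4 stand-in)."""
    display_linear = np.clip(log_to_lin(working_log), 0.0, 1.0)
    return display_linear ** (1.0 / 2.4)
```

The point of the structure is that every grade and effect in between sees one consistent working space, regardless of what camera the shot came from.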
 
Basically this.

 
For me, the way to do this effect has been to observe the way in which it happens naturally on film and then emulate that process. I observe that:
1. The halation has the characteristic of flaring, not of brightening, around bright objects. This is why it is most noticeable when the bright object is surrounded by black (e.g. a traffic light at night). When the background gets brighter (e.g. a traffic light by day), the halation is less visible or not visible at all. The "spillover" effect still happens, only we don't see it (the overexposed silver halide doesn't "know" whether the grains next to it are also overexposed).
2. The brightness of an object modulates both the luminance and the width of the halo.

It follows then that:
1. The emulation needs to be processed in scene-linear (gamma 1.0) to represent the flare.
2. Its intensity is dynamic in terms of nits and pixels. That means neither qualifiers nor edge detection will be useful.
3. It's alright if it affects the whole image, as long as it's proportional to the intensity of the light and the darkness of the surrounding pixels.

Here is my recipe:
1. Convert the image from the working space curve to linear (gamut stays the same).
2. With curves, force all except for the super bright spots (the "halo emitters") to 0. Being in linear-light, the curve has to be quite aggressive because the highlights are way outside the usual 0.0-1.0 range (see image).
3. Give it the orange tint.
4. Blur. The blur amount is the maximum flare width.
5. Add the result back to the original (linear) image.
6. Convert from linear to working space curve.

HaloCurve.jpg
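To make sure I'm reading the recipe right, here are the six steps as a NumPy sketch (the log working curve, threshold, tint, and strength are all placeholder numbers of mine, nothing calibrated):

```python
import numpy as np

# Placeholder log working curve: code 0..1 -> roughly 0.001..16 linear,
# so linearized highlights really do sit far above 1.0.
def to_linear(img):
    return np.exp2(14.0 * img - 10.0)

def to_working(lin):
    return (np.log2(np.maximum(lin, 1e-6)) + 10.0) / 14.0

def box_blur(channel, radius):
    """Separable box blur standing in for the recipe's blur."""
    k = np.ones(2 * radius + 1) / (2 * radius + 1)
    out = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 0, channel)
    return np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, out)

def add_halation(img, threshold=2.0, tint=(1.0, 0.4, 0.1), radius=6, strength=0.05):
    lin = to_linear(img)                                  # 1. working curve -> linear
    emitters = np.where(lin > threshold, lin, 0.0)        # 2. keep only the "halo emitters"
    emitters = emitters * np.asarray(tint)                # 3. orange tint
    halo = np.stack([box_blur(emitters[..., c], radius)   # 4. blur = max flare width
                     for c in range(3)], axis=-1)
    lin = lin + strength * halo                           # 5. add back in linear
    return to_working(lin)                                # 6. linear -> working curve
```

One nice consequence of adding in linear and then re-encoding: the extra light gets rolled back down by the working curve itself rather than hard-clipped.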
 

When you add the result back, are you using the Add blending mode over the original?
How do you clip or roll off the doubled light you get from the result? Maybe when converting back to the working curve?

Matthias, at what stage of the colour grade would you add the halation effect? As a first step, on the balanced original, or at the very end?
 