Do you look at AF points while culling? Curious about your workflow

Roman Gishtimulat

Hey everyone,

I’m a developer working on a photo culling app, and out of curiosity I wanted to ask how you all handle something that I personally find more and more useful lately:
Do you actually look at autofocus points / AF zones when you cull your photos?
With modern cameras using Eye AF, subject detection and tracking, I’ve noticed that just zooming in to check sharpness doesn’t always tell the whole story. Sometimes the photo is “almost” right — and seeing where the camera focused instantly explains why it missed slightly.

We recently added AF point visualization into PhotoPicker for iOS, mainly because a sports photographer asked for it to understand whether the camera locked onto the right subject in fast-paced situations. But now I’m noticing that wedding and portrait shooters are using it too, especially with shallow depth of field and Eye AF.

I’d really love to hear:
  • Do you check AF points at all when culling?
  • At what point in your workflow do you look at them (before rating, after, only on problem shots)?
  • Does it actually influence your keep/reject decisions — or is sharpness alone enough for you?
If you use any tools or plugins for visualizing focus points inside Lightroom, I’d also be curious which ones you like and why.

Not here to sell anything — genuinely interested in how real photographers use (or ignore) this part of the metadata in practice.

Looking forward to your workflows and experiences!
 
I use BreezeBrowser Pro, which is a Windows-only application. (If you're developing a culling application for Macs, please check out the functionality of BBPro. It'd be great to have something similar on the Apple side, besides PhotoMechanic.) I sometimes check the focus points when I'm still in the culling/captioning process, before importing the files into Lightroom. I'll go back to look at the focus points in BBPro when I discover something is amiss when zooming in on a file in Lightroom. It's educational, being able to figure out why certain files "work" and others don't.
 
I am mostly a landscape / travel shooter, but I occasionally travel with serious wildlife photographers, many of whom use really high frame rates.

The two behaviours I notice:

1. Many rate their images in camera, especially to strike out a whole burst that they know has missed the critical moment. Images are rated first and focus is probably checked later, depending on when they can transfer to a computer for a better viewing experience.
2. Following import to LrC, and depending on the power of the computer, for bursts of images: keep one finger on the right-arrow key and let the computer display the sequence in a pseudo slow-motion effect. The objective is to zone in quickly on the most critical frame.
 

Adobe already has introduced AI culling. How can you improve on that?

I have used the Focus point plug-ins available for Lightroom Classic in the past but only to satisfy my curiosity, not to determine which image to keep. Usually I am able to “eyeball” the parts of the image in focus to determine suitability. I’m sure Adobe’s AI culling looks at focused pixels and especially whether eyes are in focus. In using Adobe’s AI culling tool, I find it more than adequate to determine which images to keep and which to discard.

As for workflow, Lightroom Classic is my DAM tool. Pre-screening apps like PhotoMechanic are IMO unnecessary and slow down the workflow. I import everything and evaluate in Classic, rejecting images while the import process is still ingesting them. For the last few years, I have been using an iPad Pro as a front end to Classic and importing there; I haven’t imported images except through my iPad Pro since the v11 update. So I don’t know if AI culling is available on the more limited platform (it doesn’t appear so).

You should need and use only one DAM tool. Until something better comes along, that tool is, and should be, Lightroom. Any features like autofocus point identification and AI culling should be part of Lightroom. If they are not, then they should be available inside Lightroom via a plugin. One shortcoming of the Lightroom (cloud-based) family is the inability to enhance the app via plugins.


 
I don’t see how showing the focus point can be useful for culling. It only tells you which focus point the camera used, not whether focus was achieved. That means it can indeed be useful for trying to find out why an image is not in focus, but it can’t tell you whether the image is in focus.
 
Never these days. If it’s out of focus, it’s out of focus. I trust the modern AF subject/eye-detect systems, so if it is OOF it’s likely something I did. I used to check many years ago, but I shoot in Servo exclusively and at times it was difficult to judge accuracy.

I shoot with Canon and I really like its proprietary software, DPP. In Quick Check - Full Screen mode, RAW files are displayed as final JPEGs and the screen-resizing algorithms are very good. That is my first wave of culls, and most OOF shots don’t get past that stage. I then import into LrC and do my final culling, etc.
 
For high-volume shooters, increasingly with images on CFexpress cards, I cannot see photographers detouring to an iOS app.

I deliberately purchased an iPhone with 1 TB capacity so I could back up my cards to my phone if it’s not possible to bring my travel MacBook Air M2 (which has never happened yet), and even with that capacity I have no interest in an iPhone or cloud-based workflow.

How does your app cope with raw files, or can you just use the focus coordinates and the built-in JPEG?
 
Never these days. If it’s out of focus, it’s out of focus. I trust the modern AF subject/eye-detect systems, so if it is OOF it’s likely something I did.
Most modern cameras have several focusing methods: Pinpoint, Spot, Wide Area (average), Auto Area, Eye Detection and Tracking. Choosing the wrong one can result in the subject being OOF. AI tools like Topaz Super Focus can sometimes fix an OOF image, or correct motion blur or DOF blur. I find this useful if the best image I captured is slightly OOF. It is something that would be a nice addition to Lightroom, as it could mean not needing an external editor and a derivative image file.
 
Most modern cameras have several focusing methods: Pinpoint, Spot, Wide Area (average), Auto Area, Eye Detection and Tracking. Choosing the wrong one can result in the subject being OOF. AI tools like Topaz Super Focus can sometimes fix an OOF image, or correct motion blur or DOF blur. I find this useful if the best image I captured is slightly OOF. It is something that would be a nice addition to Lightroom, as it could mean not needing an external editor and a derivative image file.
Yep. I start out with Whole Area and Eye Detect, and my cameras are programmed with several manual-override contingencies to get the AI to do what I want, not what it wants. Well over 80% effective when I leave it alone. The transition can lead to some OOF shots, and the worst offender is me, as I can be shaky. I’m not a young whippersnapper anymore, so the more I tire out the more shaky I get. Prop blur and panning is one of the most challenging things for me.
 
For high-volume shooters, increasingly with images on CFexpress cards, I cannot see photographers detouring to an iOS app.
This one does ;-) My CFexpress cards import about as quickly as an XQD or SD card. My 128 GB CFexpress card can hold ~2000 48 MP NEFs.

When I am traveling, a real computer is not going to be available. All of the necessary preliminary evaluation can be done in Lightroom and synced to the Adobe Cloud, and down to my real computer by the time I get home.
 
I should also say that shooting static objects with eye detect these days is like shooting fish in a barrel. Flying birds and sports are still work and fun for me, but it is easier. I don’t keep too many OOF shots. It has to be pretty special for me to keep one and use a third-party de-blurring app: something I’ll never see or get another chance at. If I were a working pro, I’d do whatever I needed to get the job done.
 
Yep. I start out with Whole Area and Eye Detect, and my cameras are programmed with several manual-override contingencies to get the AI to do what I want, not what it wants. Well over 80% effective when I leave it alone. The transition can lead to some OOF shots, and the worst offender is me, as I can be shaky. I’m not a young whippersnapper anymore, so the more I tire out the more shaky I get. Prop blur and panning is one of the most challenging things for me.
I know we are getting off topic here, but I’d like to share some results. AI focusing tools can be a great solution for OOF images. Here are before-and-after examples from Topaz Super Focus, viewed at 200%.
[Attached: before/after screenshots]
 
We are a bit off topic, but I guess once in a while a little doesn’t hurt. Model airplane show, and I just got there as this one was finishing, so I was rushing to set up. I like these planes, so I saved it with Topaz. I downsized it and it looks better on my screen.

[Attached: rescued model airplane photo]
 
Adobe already has introduced AI culling. How can you improve on that?

Roman is the developer of PhotoPicker, which is iOS only as far as I know.

Adobe has not introduced AI culling (so far) for iPad Lightroom, so what he’s working on can improve the current Lightroom culling situation on iOS.
 
Because I shoot a lot of sport and action photography, I do check focus on a regular basis using a plug-in for LrC, though not normally at the culling stage. Having that available while culling would be useful for me.
 
Thanks everyone for sharing your workflows — this has been really insightful to read.

Special thanks to those who mentioned BreezeBrowser Pro — that’s a great tip. I’ll definitely take a closer look at BBPro to see what we can learn from its approach and feature set.

A small technical note that might be interesting in the context of this discussion:
In PhotoPicker, we don’t just show the position of the AF point. We actually read AF information directly from the MakerNotes inside the RAW files. For most supported cameras, that includes not only the coordinates of the focus point / zone, but also whether the camera reported the subject as in focus or not in focus at the time of capture.

We visualize that like this:
  • Red frame = camera reported “in focus”
  • Yellow frame = focus was not confirmed

So it’s not only about where the camera focused, but also whether it thought it succeeded. That helps explain borderline cases where the image looks “almost right” and you’re trying to decide if it’s a keeper or not. For the technically curious, there’s a rough sketch below of how this kind of metadata can be read.
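As an illustration only (this is not PhotoPicker’s actual code, and the function name and the choice of ExifTool as the reader are just for the example), here is roughly how the AF rectangles and the in-focus flag can be pulled out of Canon MakerNotes. The tags below are real ExifTool Canon AFInfo2 tags, but other makers use entirely different ones, and coordinate conventions (including the Y-axis direction on some Canon models) vary:

import json
import subprocess

def read_canon_af_points(path):
    # Ask ExifTool for the Canon AFInfo2 tags as numeric (-n) JSON (-j).
    tags = [
        "-Canon:AFImageWidth", "-Canon:AFImageHeight",
        "-Canon:AFAreaXPositions", "-Canon:AFAreaYPositions",
        "-Canon:AFAreaWidths", "-Canon:AFAreaHeights",
        "-Canon:AFPointsInFocus",
    ]
    out = subprocess.run(["exiftool", "-j", "-n", *tags, path],
                         capture_output=True, text=True, check=True)
    meta = json.loads(out.stdout)[0]

    def nums(key):
        # Multi-value tags may arrive as space- or comma-separated strings.
        raw = meta.get(key)
        if raw is None:
            return []
        if isinstance(raw, list):
            return [float(v) for v in raw]
        return [float(v) for v in str(raw).replace(",", " ").split()]

    img_w = float(meta.get("AFImageWidth") or 1)
    img_h = float(meta.get("AFImageHeight") or 1)
    in_focus = {int(i) for i in nums("AFPointsInFocus")}  # AF area indices

    points = []
    rects = zip(nums("AFAreaXPositions"), nums("AFAreaYPositions"),
                nums("AFAreaWidths"), nums("AFAreaHeights"))
    for i, (x, y, w, h) in enumerate(rects):
        points.append({
            # Canon positions are relative to the AF image centre; normalise
            # to 0..1 so a UI can draw frames over any preview size.
            # (Some models flip the Y axis; a real app needs a per-model table.)
            "x": (x + img_w / 2.0) / img_w,
            "y": (y + img_h / 2.0) / img_h,
            "w": w / img_w,
            "h": h / img_h,
            "confirmed": i in in_focus,  # red frame if True, yellow if not
        })
    return points

The same pattern applies to other makers, just with different tag names and geometry, which is a big part of why per-camera support takes real work.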
Again, thanks a lot for all the thoughtful replies — different shooting styles clearly need different approaches, and that’s exactly what makes these discussions useful.
I genuinely appreciate everyone taking the time to describe how you work.
 