I use exactly none of the above systems, with the exception of simply labeling sheet film shots that need other-than-"normal" development so they can be lumped into the appropriate N+1, N-1, etc. batch sessions.
What I do is almost instinctively visualize the actual developed film curve in my head, and decide where on it I want to place specific spotmeter values. It's a helluva lot more accurate than any math formula or generic zone model. I can readily switch between different films, lighting ratios, developers and development times, filters, the whole nine yards, and nearly always nail it spot on. Interpreting a neg onto the print all transpires in the darkroom anyway. After you've done enough densitometer plotting and have enough field and darkroom experience, it all becomes intuitive. I've even worked without a light meter, just from memory of analogous settings. And I never rely on the "latitude" of a film: I want to know exactly how the shadows, midtones, and highlights are going to differentiate. It might initially sound tricky, but having first been schooled (school of hard knocks) by shooting and printing chromes, I find black-and-white exposure relatively easy to cope with.
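If you want to see what "placing spotmeter values on the curve" means in numbers, here's a minimal sketch in Python. The toe-plus-straight-line curve model, the gamma values, and the zone placements are all invented stand-ins, not measurements of any real film; in practice you'd substitute your own densitometer plot for a specific film/developer/time combination.

```python
import math

# Toy H&D curve: negative density vs. relative log exposure, modeled as a
# soft toe flowing into a straight line of slope `gamma`. Every number here
# is an invented stand-in -- a real curve comes from your own densitometer
# plots for a specific film/developer/time combination.
def density(log_e, gamma=0.62, toe_softness=0.25, threshold=-1.35, base_fog=0.10):
    x = (log_e - threshold) / toe_softness
    return base_fog + gamma * toe_softness * math.log1p(math.exp(x))

# Spotmeter placements in stops relative to the metered midtone (Zone V);
# one stop of exposure = 0.3 in log-E units.
placements = {
    "deep shadow (Zone II)":         -3,
    "shadow with detail (Zone III)": -2,
    "metered midtone (Zone V)":       0,
    "bright skin (Zone VI)":          1,
    "textured highlight (Zone VIII)": 3,
}

for label, stops in placements.items():
    log_e = 0.3 * stops
    normal = density(log_e)              # "N" development
    pulled = density(log_e, gamma=0.50)  # hypothetical "N-1": lower slope
    print(f"{label:30s}  N: {normal:4.2f}   N-1: {pulled:4.2f}")
```

Note how dropping gamma (the hypothetical N-1 column) pulls the Zone VIII density down far more than it moves the shadows: that's the whole point of minus development for contrasty scenes, and it's exactly the kind of shift you eventually learn to see on the curve without doing any arithmetic.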