Looking to the sky
And seeing what I see
My heart wells up
The muses to please
I go grab my camera
And frame what I see
Release the shutter
And become so displeased
Finally! I made some time to get some moon frames and stuff.
I met a young fellow who was trying to make an image of the moon, handheld.
He was with his son and he was struggling. Eventually I offered some help, but by that time it was just too dark to make the piece without a tripod.
I did however introduce him to manual mode and graduated filters (for handheld work).
All of this reminds me of something I have believed since day one of “serious” photography making for me: cameras are dumb, and the effort to make them smarter seems to sit at the bottom of the priority list. They are nowhere near the capability of the human eye, and we are stuck fixing and working around their shortcomings.
According to some reporting, the human eye can see some 21 stops of dynamic range. That is one reason we can look at the bright parts of a cloud, then look into shadow, and see detail in both. (The eye also adjusts very quickly.)
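To put that number in perspective, here is a quick back-of-the-envelope sketch. Each photographic stop is a doubling of light, so stops convert to contrast ratios as powers of two. The 21-stop figure comes from the article linked below; the 14-stop sensor figure is my own rough ballpark for a good modern camera, not a measured spec.

```python
# Compare dynamic range expressed as contrast ratios (brightest:darkest).
# One "stop" doubles the light, so N stops of range = 2**N : 1.

def contrast_ratio(stops: int) -> int:
    """Convert stops of dynamic range to a contrast ratio."""
    return 2 ** stops

eye_stops = 21     # figure cited in the linked article
sensor_stops = 14  # rough ballpark for a good modern sensor (assumption)

print(f"Eye:    ~{contrast_ratio(eye_stops):,}:1")     # ~2,097,152:1
print(f"Sensor: ~{contrast_ratio(sensor_stops):,}:1")  # ~16,384:1
gap = eye_stops - sensor_stops
print(f"Gap: {gap} stops, i.e. the eye handles {2**gap}x the range")
```

Even granting generous assumptions for the sensor, the eye (with its fast adaptation) is working with orders of magnitude more range, which is exactly why one exposure cannot hold both the moon and the stars.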
So in my mind, we can stop considering “more pixels” and start thinking about creating software/sensor systems that can SEE WHAT WE SEE.
Any camera company that could do that would dominate for a few years…until the others catch up.
source: https://www.digitalcameraworld.com/news/if-your-eyes-were-cameras-what-would-the-specs-be
Case in point: The cover photo was exposed for the moon. The photo below was exposed for the stars (barely). Why don’t camera companies make the sensor/software systems see what we see? (Please, no excuses, I know them already.) Excuses do not solve problems.

Here is the fellow I met with his son. For handheld work, grad filters are a lot of fun.

Sorry for the rant. I still love photography and all of its challenges.
