I pointed my camera at the street to test the detection levels. Here's what I noticed: on sunny days it's accurate, and on overcast days it's still accurate for people wearing colored clothing. But for people wearing dark or gray clothing, the detection rate drops to about 50 percent. Why? I don't know, unless the camera uses contrast imaging to see movement.
Depending on the contrast of the person's clothing against the background environment, the AI might only see the movement in certain frames and thus not have enough of a "person detected" signal to flag it as a person.
Imagine your camera pointing at a pure white wall, and a person wearing a white suit similar to that wall. When the person passes in front of the wall, the camera would only pick up the movement of whatever limbs or body parts don't match the wall. It would be similar with low-contrast dark colors, which is probably what's going on.
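To see why contrast matters so much, here's a minimal sketch of pixel-difference motion detection. This is a hypothetical illustration, not any camera's actual firmware: it compares two grayscale frames and counts pixels whose brightness change exceeds a threshold, showing how a low-contrast subject can slip under that threshold entirely.

```python
# Hypothetical sketch of pixel-difference motion detection (illustration only,
# not an actual camera algorithm): count pixels whose brightness change
# between frames exceeds a threshold.
import numpy as np

def changed_pixels(prev, curr, threshold=25):
    """Count pixels differing by more than `threshold` (0-255 scale)."""
    diff = np.abs(curr.astype(int) - prev.astype(int))
    return int(np.count_nonzero(diff > threshold))

# Background: a uniform bright wall (brightness 200).
wall = np.full((100, 100), 200, dtype=np.uint8)

# High-contrast subject: a dark figure (brightness 40) enters the frame.
dark_figure = wall.copy()
dark_figure[20:80, 40:60] = 40

# Low-contrast subject: a figure only slightly brighter than the wall (215).
pale_figure = wall.copy()
pale_figure[20:80, 40:60] = 215

print(changed_pixels(wall, dark_figure))  # 1200 pixels over threshold
print(changed_pixels(wall, pale_figure))  # 0 pixels over threshold -> missed
```

The dark figure flips over a thousand pixels, while the wall-colored one flips none, so a detector built this way never even sees the low-contrast person move.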
Keep in mind that although our cameras have good "vision," the human eye is far more capable, so you will always notice more detail than our cameras do.
Not to mention that because PD uses pixel detection, the higher the color contrast, the easier the camera can discern differences. I'm finding that although my human detection has an 85-90% success rate, the thing can be fooled quite easily by trees and shadows when it comes to motion. I don't sweat it too much; I'd rather be notified ad nauseam than miss some tweaker pulling into my driveway.
Keep feeding the new AI; it likely just needs more learning.
^^^That’s what worked for me