In traditional camera parlance, lower sensitivity means the camera is less likely to detect
motion. Higher sensitivity means the camera is more likely to detect motion.
“The Wyze app lets you set the sensitivity of your Wyze Cam with the Detection Settings.
The slider has a range of 1-100 and will adjust the percentage of changed pixels that are
necessary to generate an event video.”
This is confusing. Does Wyze map the slider value directly to the percentage of changed pixels, 1 through 100? If so, then in "Wyze World" a slider setting of 1 ("low") would mean a 1% pixel change is enough to trigger an event, which would produce nearly endless motion alerts, while a setting of 100 ("high") would require a 100% pixel change and would result in far fewer notifications. That is the opposite of what "low sensitivity" and "high sensitivity" usually mean.
So what I really want to know is: which direction on the Wyze sensitivity slider produces more alerts, and which direction produces fewer?
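To make the two possible readings concrete, here is a minimal sketch in Python. Wyze has not published the actual mapping, so everything here is an assumption for illustration: it assumes the conventional interpretation, where the slider value is inverted into a pixel-change threshold, so a high slider setting needs only a small percentage of changed pixels (more alerts) and a low setting needs a large percentage (fewer alerts).

```python
def changed_pixel_percent(prev_frame, curr_frame, tolerance=10):
    """Percentage of pixels whose brightness differs by more than `tolerance`.

    Frames are modeled as flat lists of per-pixel brightness values;
    this is a toy stand-in for whatever comparison the camera does.
    """
    changed = sum(1 for a, b in zip(prev_frame, curr_frame)
                  if abs(a - b) > tolerance)
    return 100.0 * changed / len(prev_frame)

def motion_triggers(sensitivity, changed_pct):
    """True if the changed-pixel percentage crosses the assumed threshold.

    sensitivity: slider value, 1-100.
    ASSUMED inverted mapping: threshold = 101 - sensitivity, so slider 100
    fires on just a 1% pixel change (many alerts) while slider 1 needs a
    100% change (almost no alerts).
    """
    threshold = 101 - sensitivity
    return changed_pct >= threshold

# A 2% pixel change triggers at high sensitivity but not at low:
print(motion_triggers(100, 2.0))  # True  -> slider high = more alerts
print(motion_triggers(1, 2.0))    # False -> slider low  = fewer alerts
```

Under this assumed mapping, moving the slider toward 100 means more alerts; under a direct (non-inverted) mapping, the behavior would flip, which is exactly the ambiguity in Wyze's wording.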