Here’s a compilation video of five days of Person Detection. I had to compress it from over 200 MB to under 6 MB to be able to post, so it’s small and blurry, but you’ll get the gist of it. I also trimmed the clips and sped them up.
Rather than the vehicles themselves being identified as people, I think the vehicles or other movement (trees, etc.) are triggering the motion detection, and the AI is then identifying something else as a person:
It is seeing a sort of standing structure, like a human, with features that go in and out (hips, shoulders, etc.), a neck-like section, and a bigger head-like shape on top. The green lines follow movement, but they are not necessarily marking what the AI identified as a person (though they probably sometimes do). I can totally see why the AI might think that structure is a person; sometimes it even looks like there are a clear head, shoulders/arms, legs, etc.
I have seen a lot of similar things happen, including with my own cameras. Once I identified the problem, I was able to disrupt the pattern, and that solved the false detections. The same thing also happens a lot with false vehicle detections.
A few things you can try to help resolve this issue:
- In theory you could make that area look less like a person in shape…this might be impractical in this particular instance, but I felt it was still worth mentioning.
- A subcategory of this is to change the camera angle; sometimes even a tiny angle shift fixes the issue.
- Use detection zones to only capture events on your property (driveway, lawn, etc.)…you’ll still get false alerts when you drive in or out, but you’ll have fewer issues.
- Specifically block out that little area from the detection zone. This won’t be a 100% solution, but the Wyze AI team said they were actually working on having the AI respect our detection zones and ignore anything that is blocked out. So, for example, if you blocked out the road and across the street, then at some point at least some cams would not identify a “person” walking on the other side of the street, even if some other motion (a cat walking in your driveway) set off the detection. Thus, if you block out that particular structure, it’s possible your false person detections will decrease, at least in the long term, if not initially. I have had some cameras that work this way and others that don’t, but they sometimes do different processing on the V2 vs. V3 vs. VDB, and I am also beta testing some of the AI features on some cams, so it can be hard to tell. The point is, this might help, at least eventually.
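The zone-aware filtering described above could work something like this. To be clear, this is just my own guess at the logic, not Wyze’s actual code; the function names, rectangle format, and “ignore if the box center falls in a blocked region” rule are all my assumptions:

```python
# Hypothetical sketch of zone-aware detection filtering -- my own guess,
# not Wyze's implementation. Regions and boxes are (x1, y1, x2, y2) tuples.

def center(box):
    """Center point of a bounding box."""
    x1, y1, x2, y2 = box
    return ((x1 + x2) / 2, (y1 + y2) / 2)

def in_region(pt, region):
    """True if point pt falls inside the rectangular region."""
    rx1, ry1, rx2, ry2 = region
    x, y = pt
    return rx1 <= x <= rx2 and ry1 <= y <= ry2

def filter_blocked(detections, blocked_regions):
    """Keep only detections whose box center lies outside every
    blocked-out rectangle of the detection zone."""
    return [
        (label, box) for label, box in detections
        if not any(in_region(center(box), r) for r in blocked_regions)
    ]
```

With this, a “person” detected on the far side of the street (inside a blocked rectangle) would be dropped even if a cat in the driveway triggered the motion event.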
Lastly, sorry you’re having such bad detection.
I have had ups and downs with mine over the last 7 months, but I have seen huge improvements in that time. Early on I had a bunch of false detections, including with vehicles, but those are now extremely rare. A few times a week my black cat is incorrectly identified as a person, but otherwise it’s pretty accurate for me nowadays. I rarely have vehicles identified as a person unless there is also a person in the same event, which is why I think it is detecting something else as a person in this case rather than the vehicles. Mostly I’m saying that since I’ve seen it get a lot better, I would hope it will get better for you too. In the meantime, I hope you’re submitting the bad videos (though of course a few of those legitimately had a person in them and were correct).
Best wishes; hopefully the AI learns to handle your situation, or there is something you can do. It certainly would be frustrating to get so many false alerts. Thanks for sharing.
That’s a brick light post, and it is covered up in my custom detection zone (pavement only). I have this cam at my parents’ house. I have 4 cams at my house and don’t have this issue. I changed every single trigger for the AI for an entire month. I gave up this week.
I’m guessing the paper guy (0:41) doesn’t ever deliver on steep hills
Except for a Tesla, aren’t there always “people” driving those cars?
I’ve moved my hand in front of the camera and it was flagged as a person, too.
Definitely a work in progress. Nice work on the video, btw.
I’m guessing the camera is triggered by motion in the defined zone, and then the AI looks everywhere, regardless of the zone, for people. The light post is most likely being seen as a person, as @carverofchoice said above.
This issue, and many other similar false detections, could be solved by training the AI to compare two different frames of the video. If it detects a person in one frame, then checks a later frame and the “person” is occupying the exact same pixel locations, it should not issue a “person” detection.
Same for cars: if it detects a “vehicle” in one frame, and in a later frame that “vehicle” is still occupying the exact same pixels, it should not apply the “Vehicle” label. This would ensure statues, box-shaped objects, etc. are not flagged as false detections. It would be a simple way to resolve a lot of the false detection problems seen here with the lamp post. Even if the lamp post sometimes resembles a human, if it is always in the exact same location (occupying the exact same pixels in each frame), the AI could rule it out as a person as far as videos are concerned. And if a person sits, stands, or lies very still for a while, then as soon as they do move, the AI can tell they really are a human, because the human-looking shape has moved between two frames, so it should report it in the video labeling.
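The two-frame comparison proposed above can be sketched in code. This is just an illustration of the idea, not anything Wyze actually does; the function names, box format, and the 0.95 overlap threshold (rather than literally identical pixels, to tolerate a little jitter) are my own assumptions:

```python
# Sketch of static-detection suppression: drop a "person"/"vehicle" label
# when a same-label box sits on (nearly) the same pixels in an earlier
# frame. Boxes are (x1, y1, x2, y2); detections are (label, box) pairs.

def iou(a, b):
    """Intersection-over-union overlap of two boxes, 0.0 to 1.0."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

def filter_static(earlier, later, threshold=0.95):
    """Keep only detections in the later frame that are NOT sitting on
    essentially the same pixels as a same-label detection in the earlier
    frame -- a stationary lamp post gets dropped, a walking person stays."""
    moving = []
    for label, box in later:
        is_static = any(
            lbl == label and iou(box, b) >= threshold
            for lbl, b in earlier
        )
        if not is_static:
            moving.append((label, box))
    return moving
```

A lamp post detected as a “person” in both frames, at the same spot, gets suppressed; a real person who walked a few feet between frames comes through, which is exactly the behavior described above.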