A&S partnered with Australian remote monitoring service provider VideoControlRoom to test outdoor detectors. Michael Brown, MD of VideoControlRoom, discussed how solutions that aim to eliminate false alarms while reliably detecting intruders fared in real life.
As VideoControlRoom moves toward “bill per event” services, it is important to prove to customers that we understand and use world's best practice in outdoor detection. We therefore decided to partner with A&S, the only trade publication with the reputation and global reach to engage the manufacturing community, and to share the results for the betterment of the industry. The invitation was open to all manufacturers; the only entry criterion was a willingness to have their products openly tested against their peers.
VideoControlRoom takes its hat off to the brave manufacturers who put their reputations on the line for the first A&S Great Outdoor Detector Shootout. There are no losers in this test. The simple act of participation shows that a detector manufacturer takes detection technology seriously.
We did not expect to find the perfect outdoor detector; we do not believe it exists now, and it may not for some time yet. The point is to provide consumers with the best possible solution today. That requires continuous testing and fine-tuning of the offering across numerous installation conditions, to deliver the best possible balance between false-alarm rate and failed-to-detect rate.
Therefore, any detector manufacturer willing to participate is a winner in our books. All the participants have shown a commitment to being a leader in this market. Why do we say this? It was obvious to VideoControlRoom, which has monitored a number of detectors, that many manufacturers hold too high an opinion of their own products, having never challenged their performance in real-life environments.
To highlight the advantages of a commitment to real-world testing, we mention Suren Technologies. Suren flew its chief R&D engineer from Shenzhen, China to our Australian test grounds, armed with its new dual-frequency IR outdoor detector in prototype form. During pretesting, we could not tune out alarms from flapping plastic at the freight and logistics test location. As the unit was not a full release, Suren withdrew from the competition and has since modified the algorithm to accommodate swaying objects, which is critical in outdoor detection. We look forward to running further tests on the new release and thank Suren for its effort.
The Great Outdoor Detector Shootout is a step toward a solution which is perfectly tuned for the job. The participants have all shown commitment to outdoor detector excellence:
* D-Tect GJD 310 by GJD Manufacturing (United Kingdom)
* TriWatcher SIR10 by Atsumi Electric (Japan)
* Bobby by Lince Italia (Italy)
* Prestige External TD by Texecom (United Kingdom)
The D-Tect and TriWatcher mount at a good height for video monitoring. They are also resistant to vandalism and accidental damage.
In commercial installations with trucks and forklifts, there is a good chance a low-mounted detector will get wiped off the wall. VideoControlRoom advises that outdoor detectors be mounted at least 2.7 meters high. Otherwise, they can easily be reached by people, and they do not readily match the camera's field of view.
We picked two typical outdoor environments for test locations.
Freight and Logistics
We set up the first camera trap location at the rear of a small freight and logistics yard. Typical of this environment are sea containers and pallets with plastic wrappings. Typical security issues experienced at this site are after-hour intruders looking for valuable stock in pallets and containers.
The main benign-alarm cause at this location was Oliver, a small to mid-size dog.
Scrap Metal Recycling
Typical security issues experienced at this site are after-hour intruders trying to steal copper.
Scrap metal yards are a challenging environment, with a great deal of loose materials blowing around. Among the benign causes of false alarms was debris blowing in the wind.
At each location, we set all detectors to a detection range of around 12 meters, two pulse counts and mid-sensitivity. This may not have been ideal for every detector, but we had to find a common setting point. It would not have been fair to have one detector on one pulse count and maximum sensitivity while another was on four pulse counts and minimum sensitivity.
For mounting height, we went with the manufacturer's recommendation. We set up the detectors to have the same field of view at 12 meters. In total, there were four test detectors and a control detector, as testing outdoor detectors in isolation is difficult and we needed to have a baseline for comparison.
The control detector selected is one we are familiar with. The unit is good at detecting human presence. However, as it was designed for sterile high-security sites, it is prone to activation by small birds, rubbish and other nuisance factors. With the control detector, we knew there would be false alarms. Would the test detectors activate on the benign-alarm causes as well?
For the test, false alarms were defined as alarm activations caused by a benign-alarm cause, such as cats, dogs, foxes, birds, rubbish, waving bushes or flapping plastic. A failed-to-detect event was the lack of an alarm when a known human presence moved through a detector's field of view.
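The two metrics defined above can be sketched in code. The following is a minimal, hypothetical illustration of how a control room might score a detector from an event log; the event format, cause labels and the `score_detector` function are our own illustrative assumptions, not the actual VideoControlRoom test protocol.

```python
# Illustrative sketch only: scoring a detector from a log of events.
# Each event is a (cause, alarmed) tuple, where cause is "human" or
# one of the benign-alarm causes, and alarmed is True if the detector
# activated during that event. Names and format are hypothetical.

BENIGN_CAUSES = {"cat", "dog", "fox", "bird", "rubbish", "bush", "plastic"}

def score_detector(events):
    """Return (false_alarm_rate, failed_to_detect_rate) for one detector."""
    benign = [(c, a) for c, a in events if c in BENIGN_CAUSES]
    human = [(c, a) for c, a in events if c == "human"]
    # False alarm: the detector activated on a benign cause.
    false_alarms = sum(1 for _, alarmed in benign if alarmed)
    # Failed to detect: a human passed through with no activation.
    failed = sum(1 for _, alarmed in human if not alarmed)
    fa_rate = false_alarms / len(benign) if benign else 0.0
    ftd_rate = failed / len(human) if human else 0.0
    return fa_rate, ftd_rate

# Example log: two human walk-throughs (one missed) and three benign
# events (one triggered an alarm).
log = [("human", True), ("human", False),
       ("dog", True), ("bird", False), ("plastic", False)]
print(score_detector(log))  # false-alarm rate 1/3, failed-to-detect rate 1/2
```

The point of the sketch is that the two rates pull against each other: raising sensitivity or lowering pulse counts tends to reduce the failed-to-detect rate while raising the false-alarm rate, which is exactly the trade-off the shootout set out to measure.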
The D-Tect was probably the best at cutting down false alarms, but failed to detect human presence too often. It might have performed better had it been set to higher sensitivity.
The TriWatcher was good at detecting human intrusion, with decent false-alarm immunity. It might have performed better with extra pulse counts or lower sensitivity. The detector had a good mounting height for commercial applications and video monitoring.
Lince's Bobby detected humans well, but did not have enough pet immunity.
The Prestige had a higher failed-to-detect rate than the others.
The second site was a harsh environment; perhaps the TriWatcher should have been set there to minimum sensitivity and more pulse counts. However, we set all detectors to two pulse counts and mid-sensitivity in order to compare the technologies in play. Bear in mind that this does not reflect how the detectors would be set up under commercial conditions: installers tuning a single detector, rather than comparing several, can adjust settings and field of view to suit the site, which may improve individual detector performance.
What the test highlighted is a great deal of variance between products, or “not all detectors are created equal.” Outdoor detectors are not a “bolt up and forget” proposition; it is a science to get an acceptable balance between false-alarm rate and failed-to-detect rate. Perhaps manufacturers, customers and control rooms should be looking for measured benchmarks at installation.