
‘A lot safer place:’ Calgary university uses AI to monitor threats

May 14, 2019 | 2:20 PM

CALGARY — A post-secondary school in southern Alberta is turning to artificial intelligence to make its buildings safer and to allow security officers to catch criminals in the act.

The technology at Mount Royal University in Calgary alerts security officials when anything out of the ordinary disrupts the usual flow of movement in its hallways or on its grounds.

Developed in Australia, the iCetana system breaks everything down to pixels and “learns” the movement patterns of people, equipment and vehicles across the campus over a 14-day period. It comes to recognize shapes, sizes and movement — but not people or specific objects.

“What it will do if people normally walk in this hallway, and now they’re running, it’s going to … flash it on the screen,” explains Grant Sommerfeld, the university’s associate vice-president of facilities management.

“All the system does essentially is track the movement of pixels. It could be a gun. It could be a backpack, a selfie stick. It’s not going to automatically know that’s a gun. It doesn’t look for images.”
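What Sommerfeld describes amounts to motion-based anomaly detection: the system learns what ordinary pixel movement looks like in each camera view during a baseline period, then flags frames that deviate from it. The Python sketch below, using the OpenCV library, is only a heavily simplified illustration of that general idea; the camera source, learning window and thresholds are placeholder assumptions, and none of it is iCetana's actual code.

    # Simplified sketch of motion-based anomaly detection in the spirit of
    # what Sommerfeld describes: no object recognition, only pixel movement.
    # NOT iCetana's algorithm; camera source and thresholds are placeholders.
    import cv2
    import numpy as np

    cap = cv2.VideoCapture(0)          # assumed camera index
    baseline = []                      # motion scores from the "learning" period
    LEARN_FRAMES = 5000                # stand-in for the 14-day learning window
    prev_gray = None

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if prev_gray is not None:
            # Motion score: how many pixels changed noticeably since the last frame.
            diff = cv2.absdiff(gray, prev_gray)
            score = float(np.count_nonzero(diff > 25))
            if len(baseline) < LEARN_FRAMES:
                baseline.append(score)         # still learning "normal" movement
            else:
                mean, std = np.mean(baseline), np.std(baseline)
                if score > mean + 3 * std:     # movement well outside the norm
                    print("Unusual movement detected -- flag this feed")
        prev_gray = gray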

Sommerfeld says security officials used to watch an outdated bank of video screens, which would cycle through cameras on campus.

“Security staff faced with a wall of monitors and a full shift could almost get hypnotized by it and essentially it just becomes like wallpaper.”

New, high-resolution, 360-degree cameras that have been installed across campus catch details the old ones would have missed. And the screens are black unless something out of place has been detected.

“When there’s a change in the movement of the pixels on these screens, it flashes up in security with 15 seconds of what was happening before the pattern changed. It fast-forwards … and presents a real-time image,” says Sommerfeld.

“The human sitting at the desk decides we’ve got to go and investigate or decides, no, that’s fine.”
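The 15 seconds of lead-up footage implies the system keeps a short rolling buffer of recent frames for each camera and replays it when an alert fires. A minimal sketch of such a pre-event buffer follows; the frame rate and buffer length are assumed values, not figures from the university or iCetana.

    # Minimal sketch of a pre-event ring buffer: keep the last 15 seconds of
    # frames so that, when an alert fires, security sees what led up to it.
    from collections import deque

    FPS = 30               # assumed camera frame rate
    BUFFER_SECONDS = 15    # lead-up window described by Sommerfeld

    class PreEventBuffer:
        def __init__(self):
            # deque automatically discards the oldest frame once full
            self.frames = deque(maxlen=FPS * BUFFER_SECONDS)

        def add(self, frame):
            self.frames.append(frame)

        def on_alert(self):
            # Hand the buffered lead-up footage to the operator, who decides
            # whether to dispatch officers or dismiss the alert.
            return list(self.frames)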

Sommerfeld says the system also recognizes something as simple as a backpack left in an empty hallway.

“It may be somebody forgot a backpack or it could be more sinister than that. It just knows that something is not normal in the movement on that hallway.”

Sommerfeld says security staff can intervene right away if there is a problem, unlike previously when video was used as evidence after the fact.

“It allows us to become much more proactive in terms of intervening and stopping crimes or bad behaviour,” he says. “I think it makes the school a lot safer place.”

Other artificial intelligence systems for detecting threats are also being developed, especially in the aftermath of school shootings.

A software company out of Philadelphia has developed its own AI threat-detection system. It identifies weapons and sends images of the firearms and the shooter to law enforcement and school officials.

“We’re not in Canada right now,” says ZeroEyes spokesman Rob Huberty, who adds the technology is still part of pilot projects.

Huberty says the software focuses on active shooter situations and uses video systems already in place.

“We basically take all the weapons that have been used in school shootings as our models and we constantly add to our database,” Huberty says.

“A lot of people in school shootings expose their weapons fairly early, a lot of times in the parking lot, so we can let everybody know before any shot is fired.”
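ZeroEyes has not published its software, but the workflow Huberty describes (scan existing camera feeds with a model trained to recognize firearms, then push the frame to people who can act on it) can be sketched generically. In the Python sketch below, the detection model and the alert function are hypothetical placeholders, not ZeroEyes' actual code.

    # Generic sketch of the workflow Huberty describes: scan an existing
    # camera feed with a firearm-detection model and alert humans with the
    # frame. The detector and alert function are hypothetical placeholders.
    import cv2

    def send_alert(frame, detections):
        # Placeholder: forward the frame and suspected-weapon detections to
        # law enforcement and school officials before any shot is fired.
        print(f"ALERT: {len(detections)} possible weapon(s) in view")

    def monitor(feed_url, detect_weapons):
        # detect_weapons is assumed to be a model fine-tuned on images of
        # firearms used in past shootings; it returns bounding boxes.
        cap = cv2.VideoCapture(feed_url)   # reuse the school's existing cameras
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            detections = detect_weapons(frame)
            if detections:
                send_alert(frame, detections)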

— Follow @BillGraveland on Twitter

 

Bill Graveland, The Canadian Press