Aislinn Conrad, of social work, and Karim Abdel-Malek, of the Iowa Technology Institute, have partnered on the invention.
Thursday, October 31, 2024

University of Iowa professors in engineering and social work are inventing a camera that uses artificial intelligence (AI) and physics to detect abuse of vulnerable people, such as children or the elderly, in real time.

Aislinn Conrad, an associate professor of social work, calls it a “nanny cam on steroids.”

"Imagine a single camera that is running, but in the background AI is whirring,” said Conrad, a former child welfare investigator and case manager in the foster care system.

“The system is asleep until there is a jarring act that is leading to violence. The system comes awake and begins recording. If the AI determines violence is occurring, it sends an alert to caregivers so they can quickly respond.”

Conrad has partnered with Karim Abdel-Malek, a professor of mechanical engineering and interim director of the Iowa Technology Institute. Abdel-Malek has secured a patent for the algorithm behind the AI.

The algorithm analyzes kinematics, or the motion of the human body. Speed and movement patterns help the AI recognize when conditions indicating potential violence have been met and categorize the type of violent act. If those conditions are met, the segment of footage is sent to designated personnel for review.
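
The patented algorithm and its model are not public, but the general idea of a kinematics-based trigger can be shown in a short sketch: track the speed of body keypoints from frame to frame and flag a segment for human review only when the motion is fast and abrupt enough to resemble a strike. Everything below, including the keypoint format, the thresholds, and the function name, is an illustrative assumption rather than a description of the actual system.

```python
import numpy as np

# Illustrative sketch only: a kinematic "wake-up" check, not the patented algorithm.
# Assumes `keypoints` is an array of shape (frames, joints, 2) holding pixel
# coordinates of tracked body joints, sampled at `fps` frames per second.

def flag_candidate_violence(keypoints: np.ndarray, fps: float = 30.0,
                            speed_thresh: float = 900.0,    # px/s, assumed
                            accel_thresh: float = 6000.0):  # px/s^2, assumed
    """Return frame indices where motion is fast and abrupt enough to review."""
    # Per-joint velocity (px/s) and change in speed (px/s^2) via finite differences.
    velocity = np.diff(keypoints, axis=0) * fps
    speed = np.linalg.norm(velocity, axis=-1)          # shape (frames-1, joints)
    accel = np.abs(np.diff(speed, axis=0)) * fps       # shape (frames-2, joints)

    # A frame is a candidate if any joint is both fast and rapidly changing speed.
    fast = speed[1:] > speed_thresh
    abrupt = accel > accel_thresh
    return np.where((fast & abrupt).any(axis=1))[0] + 2

# Example with synthetic data: 90 frames, 17 joints, mostly still with one jolt.
rng = np.random.default_rng(0)
kp = np.cumsum(rng.normal(0, 1, size=(90, 17, 2)), axis=0)
kp[45:48, 5] += np.array([[120, 0], [260, 0], [120, 0]])   # sudden hand movement
print(flag_candidate_violence(kp))
```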

The algorithm is an outgrowth of Abdel-Malek's virtual soldier research, which has been applied in military settings for more than 20 years to create a safer, more resilient soldier.

“We are optimistic about creating a world where technology acts as a silent guardian, protecting the most vulnerable among us,” Abdel-Malek said. “Our goal is to develop a system that not only detects abuse but also prevents it, empowering individuals and communities to build a safer future.”

The two are working to secure funding to conduct trials with research subjects.

The subjects would act out different scenarios to help train the AI to distinguish between behaviors such as a punch versus a kick, a pinch versus a slap, and playing versus fighting.
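
As a rough illustration of what such labeled trial recordings could feed, the sketch below extracts simple per-clip motion features and trains an off-the-shelf classifier to separate action categories. The feature set, the labels, and the model choice are stand-in assumptions, not the team's actual training pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Illustrative sketch only: learn to tell acted-out action categories apart
# from per-clip motion features. Labels, features, and model are assumptions.
ACTIONS = ["punch", "kick", "pinch", "slap", "play", "fight"]

def clip_features(keypoints: np.ndarray, fps: float = 30.0) -> np.ndarray:
    """Summarize one clip of shape (frames, joints, 2) as a small feature vector."""
    speed = np.linalg.norm(np.diff(keypoints, axis=0), axis=-1) * fps
    return np.array([speed.mean(), speed.max(), speed.std()])

# Synthetic clips standing in for the acted-out trial recordings.
rng = np.random.default_rng(1)
clips = [np.cumsum(rng.normal(0, 1 + i % 3, size=(60, 17, 2)), axis=0)
         for i in range(120)]
labels = [ACTIONS[i % len(ACTIONS)] for i in range(120)]

X = np.stack([clip_features(c) for c in clips])
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)
print(model.predict(X[:3]))   # predicted action labels for the first three clips
```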

They have high hopes for the camera, noting that the vast majority of abusers walk free for a variety of reasons, including lack of proof. In addition to allowing rapid intervention, the recordings could potentially serve as evidence to aid prosecution.

The hope is to commercialize the technology into a product that could become widely available by 2026, Conrad said.

“Our challenge is how do we mitigate abuse,” Conrad said. “Part of that is determining when it occurs. This is bridging a gap in the field in how to quickly respond to abuse and prove it is happening.”