On the morning of Jan. 22, 2025, Solomon Henderson went inside a restroom at Antioch High School in Nashville, Tennessee. The 17-year-old student posted photos on social media, and, according to news reports, began livestreaming on Kick, a loosely moderated platform popular with video game streamers. The teenager headed to the nearby cafeteria and fired 10 shots within 17 seconds from a 9 mm semi-automatic pistol, killing a 16-year-old student, wounding another, and, finally, turning the gun on himself. One year after that fateful day, investigators at the Metropolitan Nashville Police Department believe that Henderson acted alone.
It’s possible that traditional discipline could have prevented the shooting. Henderson had a history of violence and threats against family, students, and teachers. The teenager had recently been suspended for threatening another student with a box cutter, and a ProPublica investigation raised questions about why he was allowed to return to school at that time. The school did not have traditional metal detectors, which probably would have detected the gun, according to critics of the school system’s security measures such as former Metro Nashville Public Schools board member Fran Bush.
Instead, the school was using an emerging technology: an artificial intelligence-driven weapons detection system that turns existing video surveillance systems into “smart cameras” that scan the footage in real time to detect visible weapons.
The school’s system — developed by the Virginia-based company Omnilert — was active at the time of the shooting but did not detect Henderson’s gun because it was not visible from the cameras’ locations, according to news reports. In an emailed statement to Undark, Blake Mitchell, Omnilert’s vice president of marketing, confirmed those reports. “While cameras were present at the location, the weapon was not brandished within the field of view of cameras running visual gun detection,” Mitchell wrote, adding that the company has since worked with the school district “and other school partners to reinforce best practices for deployment, including recommending that visual gun detection be enabled on all eligible cameras, where feasible, to maximize coverage.”
In the last decade or so, a growing number of school districts have deployed AI-based weapons detection systems marketed as a way to prevent school shootings. Data are limited, but hundreds of school buildings are believed to use some type of AI weapons detection.
Some experts — and representatives from the companies themselves — suggest these technologies could complement, but not replace, existing school safety measures such as metal detectors, video cameras, single-point entry, and safety drills. At the same time, some officials and researchers are questioning the technology’s effectiveness and accuracy, citing frequent false positives in which harmless items — such as a clarinet and even a bag of chips — are mistaken for guns. In addition, there is little to no empirical evidence demonstrating that artificial intelligence systems have prevented shootings in a real-world setting like a school, a point emphasized in a 2024 paper that argues for a multidisciplinary approach to firearms detection.
The lack of hard data and several high-profile failures to prevent violent school attacks raise troubling questions about the growth of AI-based weapons detection technology, critics say, and whether security companies are preying on the fears of parents and school districts. There is “a real mismatch between the expectations around what these technologies are capable of and the reality of what they’re capable of,” said Hannah Quay-de la Vallee, a computer scientist and senior technologist at the Center for Democracy and Technology, a nonprofit advocating for digital rights. “The consequences of that are catastrophic.”
As evidence, Quay-de la Vallee and other researchers point to the Nashville shooting as well as a 2022 high school stabbing in Utica, New York, where a student brought a hunting knife in his backpack through the AI-enhanced entry scanners. The student later attacked a classmate with the knife, police say. The victim sued the city of Utica, the Utica School District, and Evolv, the Massachusetts-based company that developed the AI system, as well as the company’s third-party contractor.
The Utica case was highlighted by the Federal Trade Commission in November 2024 as part of its investigation into allegations that Evolv had deceptively advertised its ability to reliably detect weapons. As part of a 2024 settlement, the company was barred from making “unsupported claims” about its products and ordered to give schools the option to cancel their contracts. “The FTC has been clear that claims about technology — including artificial intelligence — need to be backed up, and that is especially important when these claims involve the safety of children,” said Samuel Levine, director of the FTC’s Bureau of Consumer Protection, according to a press release on the FTC website. The district decided to remove the Evolv entry scanners at the high school, replacing them with traditional metal detectors, according to news reports.
Evolv’s technology has detected weapons that students have brought to school. In January of last year, for instance, a student was arrested after a loaded gun was found in his backpack. Still, Quay-de la Vallee suggests a more effective school safety strategy could include investing in students’ well-being. Simone Browne, a Black studies scholar at the University of Texas at Austin whose research focuses on surveillance, questions whether schools are doing this already. “[Is there] something holistic around health, de-escalation of violence, mental health counseling, or other ways?” asked Browne. Otherwise, she added, “is it just more and more staging of this very carceral technology?”
AI-based weapons detection technology gained more visibility in 2018 following the school shooting in Parkland, Florida, that killed 17 people. The so-called school security industry has grown significantly since then and is reported to be worth up to $4 billion today.
Advances in artificial intelligence have led to improvements in several areas of weapons detection. The algorithms are faster, more sophisticated, and more accurate, with fewer false positives than in earlier versions. But there are still many questions surrounding the technology’s accuracy and effectiveness — especially in identifying smaller weapons and knives, experts say.
From a scientific standpoint, firearm detection algorithms work well, said Beidi Dong, an associate professor of criminology, law, and society at George Mason University and co-author of the 2024 paper on the limitations of visual weapons detection published in the peer-reviewed Security Journal. But “we don’t know yet how successful or valid” the algorithms are in the real world, he added, and more research and data are needed.
There are two main types of automated weapons detection. The first — the kind of system used in Nashville — enhances existing video security systems to identify weapons by analyzing shapes, movements, and situations.
The main suppliers of these systems include Omnilert, VOLT AI, and ZeroEyes. In addition to schools, universities, and hospitals, some public transit systems have also installed these systems. The Regional Transportation Commission of Southern Nevada, which serves the Las Vegas area, recently became the first transit agency in the nation to deploy the ZeroEyes system across all of its transit centers.

The technology analyzes each object in a video frame — such as a dog, a human, or a potential gun — and assigns it a label. Images showing guns, or humans with guns, are quickly prioritized and forwarded to a remote operations center to be validated. Alerts are then automatically dispatched to local law enforcement. The entire process — from image capture to validation to the dispatch of officers — is said to take less than five seconds.
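The prioritize-validate-alert pipeline described above can be sketched in code. This is a simplified illustration, not any vendor’s actual implementation: the label names, the confidence threshold, and the `frames_to_escalate` function are all hypothetical, and real systems rely on trained computer-vision models rather than pre-labeled detections.

```python
# Hypothetical sketch of the detection-to-alert pipeline described above.
# Real systems run trained computer-vision models on live video; here,
# each "detection" is a pre-labeled object with a confidence score.

from dataclasses import dataclass

# Assumed label names; vendors' taxonomies are not public.
PRIORITY_LABELS = {"gun", "human_with_gun"}

@dataclass
class Detection:
    label: str         # e.g., "dog", "human", "gun"
    confidence: float  # model confidence, 0.0 to 1.0

def frames_to_escalate(frames, threshold=0.8):
    """Return indices of frames that would be forwarded to a human
    operator for validation before any alert goes to law enforcement."""
    flagged = []
    for i, detections in enumerate(frames):
        if any(d.label in PRIORITY_LABELS and d.confidence >= threshold
               for d in detections):
            flagged.append(i)
    return flagged

# Example: three frames of labeled objects.
frames = [
    [Detection("dog", 0.95), Detection("human", 0.90)],
    [Detection("human_with_gun", 0.85)],
    [Detection("gun", 0.60)],  # below threshold: not escalated
]
print(frames_to_escalate(frames))  # [1]
```

The human-validation step is the key design choice the companies describe: the algorithm only filters and prioritizes frames, while a remote operator confirms the detection before police are dispatched.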
In a statement emailed to Undark, which Mark Prindle, senior director of Fusion Public Relations, asked to be attributed to a ZeroEyes spokesperson, the company noted that it has identified more than 1,000 fake and real guns. “Until a gun is identified, the monitoring screens in the ZOC [ZeroEyes Operations Center] stay absolutely blank,” the statement noted. “The team does not see a live feed from any cameras. Further, the platform does not store personal or biometric data or conduct any kind of facial recognition, so there is no risk of bias based on skin color or other personal characteristics.”
Omnilert employs a similar technology in its visual gun detection software. Mitchell noted that the system, which is deployed in more than 600 schools across 50 districts, should be one layer of a school safety plan.
“While no single technology can prevent all acts of violence, Omnilert remains committed to strengthening situational awareness and response capabilities as part of a layered approach to school safety,” he wrote in an email to Undark.
The second type of automated weapons detection looks for concealed weapons. Used at arts and entertainment venues, sports stadiums, hospitals, and schools, these systems enhance traditional metal detectors or X-ray scanners to detect weapons carried in bags, purses, backpacks, or under clothing. The Evolv Express, for example, uses low-frequency electromagnetic fields and artificial intelligence to penetrate clothing and bags and assess objects — such as keys, mobile phones, or handguns — based on material composition, size, and shape.
These systems — which require the installation of new hardware as well as regular software updates and staffing — are expensive. A typical wireless sensor from Evolv can cost tens of thousands of dollars. Traditional metal detectors, on the other hand, are a one-time purchase and could cost from a few hundred dollars up to as much as $10,000.
Evolv’s technology is used to screen 850,000 people every day in more than 1,500 school buildings across the country, according to Alexandra Ozerkis, chief marketing officer for the Massachusetts-based company.
“Advanced weapons detection, such as Evolv, is one layer of security schools are considering in their safety plan,” Ozerkis wrote in a statement to Undark. “It is important to understand that no single technology or layer should be relied upon as a single source of protection; nothing is perfect. Security involves a layered approach to reduce risk.”
Despite the millions of dollars committed to buying such systems — Gwinnett County Public Schools in suburban Atlanta, for example, is reportedly spending about $20 million to install the Evolv system — there do not appear to be any independent, peer-reviewed studies on the effectiveness of commercial AI weapons detection systems in schools or elsewhere.
The lack of hard data is disturbing, said Browne, especially since such systems may be purchased with federal funding.
But recent research has begun focusing on the overall merits of automated weapons detection. A 2024 systematic literature review submitted to a workshop by a research team based at the Federal University of Goiás in Brazil analyzed about 10,500 papers related to artificial intelligence, security, and weapons identification. The team identified 13 papers that were focused on computer vision algorithms used in schools and other public spaces.
The review found “significant gaps in current research, such as the lack of studies focused on specific educational environments,” and emphasized the need for more empirical data.
Other limitations cited by the Brazilian researchers and others include the persistently high number of false positives and the algorithms’ limited accuracy in detecting smaller weapons like knives. More research is needed on developing algorithms and techniques to identify knives, researchers say, because these weapons are frequently confiscated at schools.
An over-reliance on AI-based weapons detection technology can actually make schools and students less safe, according to scholars such as the Center for Democracy and Technology’s Quay-de la Vallee, who recently published a brief on the failures of education-based artificial intelligence in general.
When looking at these types of surveillance technologies, there is a tendency to say, “What’s the harm, and why not use it if it can prevent one school shooting?” Quay-de la Vallee said in an interview with Undark. “The implication is that there is no harm, and that implication is not correct. There’s the harm to students of that feeling of surveillance. There’s the loss of resources. And loss of resources may also mean that schools are not doing other things that would be more effective.”
Researchers also note that metal detectors by themselves do not make schools any safer.
In interviews with Undark — as well as in articles published by nonprofit organizations such as the American Civil Liberties Union and the Center for Democracy and Technology — experts argue that more transparency and data are needed to understand how students interact with AI-powered weapons detection technology.
“What happens if there is a false positive? What resources are deployed? What is the protocol?” asked Juan Gilbert, a computer scientist at the University of Florida’s Herbert Wertheim College of Engineering. “I have no idea. But we need to understand it.”
Gilbert’s research largely focuses on “human-centered computing” — designing technology with a primary focus on human needs and behavior. “The easy comparison is the metal detector,” he said. “We know exactly what the protocol is there. In other words, there needs to be a similar understanding.”
The standard protocol with a walk-through metal detector, for example, is an alarm and a secondary screening to confirm that the person does not have weapons.
But protocols and best practices are often absent with emerging technologies such as AI-based weapons detection, said several researchers.
“My biggest concern here is that rather than keeping civilians and also police safer — which is what this technology claims to do — it’s going to instead misclassify harmless items like books as weapons and cause more fear and overreaction on the part of police,” said Cooper Quintin, a senior staff technologist at the Electronic Frontier Foundation, who co-wrote a 2024 article on the Federal Trade Commission’s recent investigation into Evolv.
To this end, critics cite a recent incident outside a suburban Baltimore high school, where the Omnilert detection system incorrectly identified a crumpled bag of Doritos as a gun. Police officers were called to the scene and drew their service weapons on a 16-year-old student, who was handcuffed and searched.
This points to another key challenge, critics say: AI-enhanced technology can lead to more searches, discipline, and contact with police. These searches are based on technology that has been shown to generate many false and inaccurate results, said Clarence Okoh, senior attorney for Civil Rights and Technology at TechTonic Justice, a nonprofit focused on digital rights and algorithmic justice in lower-income and under-resourced communities. “From a legal perspective, these are reasonable-suspicion, probable-cause generating machines,” he said. If the technology says that a person may have a weapon on them, “a school official or a law enforcement official has a reason to go and conduct a search of their backpacks, pat them down, and so on and so forth.”
Okoh wrote a policy brief on the impact of artificial intelligence and the “school-to-prison pipeline,” the term used by many researchers to describe when early exposure of students to harsh disciplinary measures such as suspensions and expulsions leads to poorer academic outcomes, as well as more contact with police and arrests. The data show that Black and Hispanic students are disproportionately affected.
Okoh said some data show poorer academic performance in schools with high surveillance. A 2022 paper by researchers at Johns Hopkins University and Washington University in St. Louis, for example, found that increased school surveillance measures, such as metal detectors and security cameras, are associated with lower math test scores and fewer college enrollments.
Most researchers and security experts agree that technology alone will not deter or prevent school violence — especially gun violence. A multi-layered approach involving traditional metal detectors, AI-powered security solutions, and other security measures could be a stronger defense, they say. Experts also suggest that students should be taught skills such as conflict resolution and violence intervention to reduce the likelihood of gun violence.
It’s crucial that school districts and administrators ask more questions about emerging surveillance technologies, say researchers like Browne. That’s “not to say you need TSA scanners in every school — I don’t think that’s the answer either,” said Quay-de la Vallee. But until the technology improves dramatically, “it’s just not doing what schools need it to be doing,” she added. “There’s a real question about whether any resources should go towards this.”
UPDATE: A previous version of this piece referred to Beidi Dong as an assistant professor of criminology, law, and society at George Mason University. He is an associate professor.