Schools are using AI monitoring software such as GoGuardian Beacon to scan students' online communications for signs of self-harm, and alerts from these systems often result in police being dispatched to students' homes over misinterpreted content. The practice has led to traumatic incidents, including a Missouri teenager who was awakened by police because of a poem the software misread. Supporters argue the alerts can avert genuine crises, but critics point to the software's invasiveness, the lack of data on its effectiveness, and the potential dangers of sending law enforcement to respond to mental-health concerns.