Gaggle Knows Everything About Teens And Kids In School

Gaggle monitors the work and communications of almost 5 million students in the US, and schools are paying big money for its services. Hundreds of company documents reveal a sprawling surveillance industrial complex that targets kids who can’t opt out.

This is part of a BuzzFeed News package on schools and social media surveillance. Read more here.

For the 1,300 students of Santa Fe High School, participating in school life means producing a digital trail — homework assignments, essays, emails, pictures, creative writing, songs they've written, and chats with friends and classmates.

All of it is monitored by student surveillance service Gaggle, which promises to keep Santa Fe High School kids free from harm.

Santa Fe High, located in Santa Fe, Texas, is one of more than 1,400 schools that have taken Gaggle up on its promise to “stop tragedies with real-time content analysis." It's understandable why Santa Fe's leaders might want such a service. In 2018, a shooter killed eight students and two teachers at the school. Its student body is now part of the 4.8 million US students that the for-profit "safety management" service monitors.

A college student whose middle school used Gaggle told BuzzFeed News that the tool taught them that they would always be watched. “I feel like now I’m very desensitized to the threat of my information being looked at by people,” they said.

Using a combination of in-house artificial intelligence and human content moderators paid about $10 an hour, Gaggle polices schools for suspicious or harmful content and images, which it says can help prevent gun violence and student suicides. It plugs into two of the biggest software suites around, Google’s G Suite and Microsoft 365, and tracks everything, including notifications that may float in from Twitter, Facebook, and Instagram accounts linked to a school email address.

Gaggle touts itself as a tantalizingly simple solution to a diverse set of horrors. It claims to have saved hundreds of lives from suicide during the 2018–19 school year. The company, which is based in Bloomington, Illinois, also markets itself as a tool that can detect threats of violence.

“... Maybe it teaches students some lessons, but not the ones we want them to learn.”

But hundreds of pages of newly revealed Gaggle documentation and content moderation policies, as well as invoices and student incident reports from 17 school districts around the country obtained via public records requests, show that Gaggle is subjecting young lives to relentless inspection, and charging the schools that use it tens of thousands of dollars a year. And it’s not at all clear whether Gaggle is as effective in saving lives as it claims, or that its brand of relentless surveillance is without long-term consequences for the students it promises to protect.

Parents and caregivers have always monitored their kids. Sarah Igo, a professor of history at Vanderbilt University who studies surveillance and privacy, told BuzzFeed News that modern schools are often a de facto “training house for personhood.” But student surveillance services like Gaggle raise questions about how much monitoring is too much, and what rights minors have to control the ways that they’re watched by adults.

“It just seems like maybe it teaches students some lessons, but not the ones we want them to learn,” Igo said.

Questionable content

Gaggle is one of the biggest players in today’s new economy of student surveillance, in which student work and behavior are scrutinized for indicators of violence or a mental health crisis, and profanity and sexuality are policed.

“[There’s] a very consistent and long-standing belief that children have fewer rights to their own communications, and to their own inner thoughts and to their own practices,” Igo told BuzzFeed News.

Gaggle uses an in-house, AI-powered filtering system to monitor everything that a student produces on their school account associated with Google’s G Suite or Microsoft’s 365 suite of tools. It scans student emails, documents, chats, and calendars, and compares what students write against a “blocked word list,” which contains profanity as well as references to self-harm, violence, bullying, or drugs. A Gaggle spokesperson told BuzzFeed News the list is regularly updated and “based on the language commonly used by children and adolescents for almost a decade.”
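
Gaggle’s matching logic is proprietary, so the following is an illustration only: a minimal Python sketch of blocked-phrase scanning of the sort described above, using sample phrases drawn from this story. The function name and word list are ours, not Gaggle’s.

```python
import re

# Hypothetical excerpt of a blocked-word list; Gaggle's real list is proprietary.
# These sample phrases come from the reporting in this story.
BLOCKED_PHRASES = ["suicide", "kill myself", "want to die", "hurt me", "drunk", "heroin"]

def scan_text(text: str) -> list[str]:
    """Return every blocked phrase found in a piece of student writing."""
    lowered = text.lower()
    hits = []
    for phrase in BLOCKED_PHRASES:
        # Match whole words/phrases so substrings don't trigger spurious hits.
        if re.search(r"\b" + re.escape(phrase) + r"\b", lowered):
            hits.append(phrase)
    return hits

# A quoted song lyric trips the filter — the kind of false positive described later.
print(scan_text("These song lyrics say I want to die"))  # ['want to die']
```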

The service also runs images uploaded by students through an “Anti-Pornography Scanner” (also proprietary and powered by AI). Gaggle, citing the sensitivity of its proprietary information, declined to tell BuzzFeed News how these tools were trained, answer questions about the original training sets, or say whether Gaggle’s AI tools learn based on what students put into G Suite and Microsoft 365.

Among the many banned words and phrases on Gaggle’s list are "suicide," "kill myself," "want to die," "hurt me,” "drunk," and "heroin." Gaggle also commonly catches profanity — in 17 US school districts, about 80% of posts flagged by Gaggle within a particular school year were flagged for such words, according to documents obtained by BuzzFeed News.

Also on the list: LGBTQ-related words like "gay," "lesbian," and "queer." When asked whether the company provides LGBTQ sensitivity training to safety representatives, Gaggle said that it coaches them in how to separate "bias and personal opinion" from content moderation decisions. It’s unclear if, in addition to antidiscrimination training, Gaggle educates moderators on the specific challenges that LGBTQ youth may encounter.

According to Gaggle’s company wiki, student emails are scanned in real time, and those that violate the rules are blocked from their intended recipients. Meanwhile, documents and files are moderated “after the fact.”

When Gaggle’s software flags student content for any reason, it passes the material to one of the company’s 125 Level 1 safety representatives. If that first reviewer thinks there’s an imminent risk to student safety, Gaggle said, the content is forwarded to one of a core group of 25 “trained safety professionals.” These employees are responsible for flagging content to school administrators.
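
Gaggle hasn’t published this pipeline, but as a rough, hypothetical sketch, the two-tier review flow described above might be modeled like this (the FlaggedItem type and both function names are invented for illustration):

```python
from dataclasses import dataclass

@dataclass
class FlaggedItem:
    student_id: int  # Gaggle surfaces numeric student IDs, per the documents
    content: str

def level1_review(item: FlaggedItem, imminent_risk: bool) -> str:
    """A Level 1 contractor screens each machine-flagged item; only items judged
    an imminent risk move up to the smaller pool of Level 2 reviewers."""
    if not imminent_risk:
        return "logged, not escalated"
    return level2_review(item)

def level2_review(item: FlaggedItem) -> str:
    # Level 2 "trained safety professionals" decide whether to alert the school.
    return "flagged to school administrators"

print(level1_review(FlaggedItem(1042, "flagged essay text"), imminent_risk=True))
```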

Gaggle said that its “trained” safety representatives “have degrees in fields such as criminal justice, psychology, communications, health administration, and education.” It’s unclear whether Gaggle was referring to its Level 2 safety representatives.

Gaggle did not answer BuzzFeed News' list of specific questions about the training or qualifications required of its contract and staff safety representatives. It also didn’t specify how much content safety representatives are required to review each day and month. However, Gaggle wrote on its company wiki in 2017 that its content moderators reviewed “over a million blocked student communications each month.”

According to two job descriptions for “Level 1 Safety Representatives,” these positions are filled by contract employees paid about $10 an hour. These safety representatives do not need experience in counseling or adolescent health, or even experience working with children, teens, or young adults, though familiarity with current slang and social media is encouraged. A job description posted to Indeed notes that they are typically capped at 38 hours per week, just shy of what an employee would need to be considered full time and legally entitled to benefits. Gaggle declined to say whether its Level 1 safety representatives receive benefits or insurance.

Gaggle also declined to detail its safety representative hiring process and training program, but job reviews posted to Glassdoor suggest they are not particularly rigorous. One person claimed that submitting an online application put them on a waiting list, and that shortly after, they received a job offer and emails “explaining how everything works.”

Gaggle analyzes student data and funnels its conclusions to a “Safety Management Dashboard.” School administrators with access can see rule violations alongside numeric IDs of the students who committed them. The dashboard includes a “Top Concerns” graph, which displays a scoreboard of students who have violated Gaggle’s rules over a particular time period. Administrators can expand the graph into a list that ranks the students within their district based on how often they violate Gaggle’s rules.
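
The dashboard’s internals aren’t public, but the “Top Concerns” ranking described above amounts to counting rule violations per student ID and sorting. A toy reconstruction with invented sample data:

```python
from collections import Counter

# Invented sample data: (numeric student ID, rule violated).
violations = [
    (1042, "profanity"), (1042, "profanity"), (2217, "self-harm language"),
    (1042, "bullying"), (3390, "profanity"),
]

# Rank students by how often they tripped the rules, most frequent first.
top_concerns = Counter(student_id for student_id, _ in violations).most_common()
for student_id, count in top_concerns:
    print(student_id, count)  # 1042 3 / 2217 1 / 3390 1
```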

Gaggle ranks rule-breaking incidents according to three tiers of severity. A “violation” may include any use of language that breaches Gaggle rules, including false positives, like sending song lyrics or quoting from a book. “Questionable Content” includes material that’s concerning but not an imminent threat to student safety. This category could include cyberbullying or sending “professional pornographic images or files,” according to Gaggle. School officials are alerted when these occur.

A “Possible Student Situation” is the most severe type of content violation. According to Gaggle, it represents an “immediate threat” to the safety of students, including “student-produced pornography, violence, suicide, rape, or harmful family situations.” If a Gaggle safety representative determines that a piece of pornography is student-produced, Gaggle automatically sends the file to the National Center for Missing and Exploited Children, which maintains a database of child pornography.
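
One minimal way to encode the three tiers, purely as an illustration (Gaggle’s internal representation isn’t public):

```python
from enum import Enum

class Severity(Enum):
    """The three tiers described above, least to most severe (our encoding)."""
    VIOLATION = 1                   # rule-breaking language, incl. false positives
    QUESTIONABLE_CONTENT = 2        # concerning, but not an imminent threat
    POSSIBLE_STUDENT_SITUATION = 3  # an "immediate threat" to student safety

# Per the article, school officials are alerted for the two higher tiers.
ALERTS_SCHOOL = {Severity.QUESTIONABLE_CONTENT, Severity.POSSIBLE_STUDENT_SITUATION}
```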

Gaggle operates by a “three strike rule,” meaning that mild rule violations can be flagged to school administrators if a student commits them repeatedly. For instance, if a student wrote “fuck” three times, school officials would be alerted. According to Gaggle, students who commit three strikes have their account privileges limited until a school official restores them. It’s unclear whether a student would lose email privileges in these situations, even though email can be necessary for communicating with teachers and completing assignments.
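
A hedged sketch of how such a strike counter could work; the STRIKE_LIMIT constant and record_violation function are invented, not Gaggle’s code:

```python
from collections import defaultdict

STRIKE_LIMIT = 3  # the "three strike rule" described above

strikes: dict[int, int] = defaultdict(int)

def record_violation(student_id: int) -> str:
    """Accumulate minor violations; on the third, restrict the account until a
    school official restores it (per Gaggle's description of the rule)."""
    strikes[student_id] += 1
    if strikes[student_id] >= STRIKE_LIMIT:
        return "account privileges limited; school officials alerted"
    return "violation logged"

for _ in range(3):                   # e.g., a student writes "fuck" three times
    status = record_violation(1042)
print(status)  # account privileges limited; school officials alerted
```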

Sarah Roberts, a UCLA professor and a scholar of digital content moderation, told BuzzFeed News that people should be skeptical about the effectiveness of policies based on “three strikes.”

“They're talking about 'three strikes' rules and other kinds of quasi-judicial language,” Roberts said. “These framings, we know that three strikes rules are not particularly held in high regard in places like California where they came from. And yet they're being replicated and seamlessly built into the architecture of these platforms and their whole logic.”

“A solution in search of a problem”

Analytics designed to make sense of massive troves of student data might seem like useful tools for school administrators. However, critics worry that schools are teaching students to accept sweeping forms of surveillance.

“My sense about this particular suite of products and services is that it's a solution in search of a problem,” Roberts said, “which is to say that the only way that the logic of it works is if we first accept that our children ought to be captured within a digital system, basically, from the time they're sentient until further notice.”

Gaggle claims that its tool promotes a sense of “digital citizenship.” However, documents obtained by BuzzFeed News show that students often don’t understand that their work and communications are being surveilled until they violate the rules.

“Sometimes, a frank talk about the proper use of a school-issued email account is enough to make students realize that emails are not private,” an email obtained from a school in St. Mary Parish in Louisiana says. “We've discovered that some students don't realize that they are using inappropriate language.”

“These are children. They're supposed to be exploring and learning and have the ability to make mistakes.”

Chris Gilliard, an English professor at Macomb Community College who studies privacy and inequality, told BuzzFeed News he believes that Gaggle’s definition of “citizenship” is misguided.

“Teaching young people that you should exist online to the extent that you're palatable to companies, or future employers, I think is worrisome,” Gilliard said. “These are children. They're supposed to be exploring and learning and have the ability to make mistakes.”

Gaggle told BuzzFeed News that it “recommends” school districts get permission from parents and students before they use the company’s tools to monitor them. It also offers a choice to opt out, but since Gaggle monitors required school services, like email accounts, it’s unclear how opting out would work in practice.

“If a student opts out of Gaggle, then they would not be able to use the school-provided technology and would have to use their personal email addresses for their school work — and that personal email would not be scanned by Gaggle,” a Gaggle spokesperson told BuzzFeed News. In other words, once a school district buys Gaggle services, students don’t have a school-friendly alternative.

“Life saved”

With suicide one of the leading causes of death for people under 18 and school shootings an ever-present concern, Gaggle’s student surveillance has a clear appeal to school administrators. While the federal government struggles to pass even elementary gun control, Gaggle is something schools can easily pay for. Gaggle is also a more toned-down option than, say, proposals from politicians to arm teachers or install armed police in schools. When a robust team of student counselors is financially impossible, and when school administrators believe something is better than nothing, Gaggle becomes a feasible choice.

However, experts doubt whether Gaggle, or any kind of surveillance, is an adequate or appropriate response to adolescent suicide and school shootings.

“Everybody would nearly line up behind the thing that saving a young person's life is worth it,” Igo said. “But this is always, always the argument that is made about any kind of surveillance that has to do with crime or destruction or violence. We cannot just simply turn over when that explanation comes along. We need to think about the consequences of using these kinds of surveillance technologies on students.”

“We need to think about the consequences of using these kinds of surveillance technologies on students.”

One of Gaggle’s central selling points is that it saves hundreds of lives each year from suicide. For the 2018–19 school year, it claimed that it prevented 722 students from committing suicide. When asked to explain, Gaggle said that out of the 52,000 references to suicide and self-harm it flagged, 6,000 were serious enough that Gaggle called the school “immediately.”

“Of those, 722 were so specific that we identify them as a ‘life saved,’” Gaggle said.

When asked what Gaggle meant by “specific,” a company spokesperson said, “(a) that they have heard back from the district that a life was saved, or (b) that what the student wrote included a clear and definitive plan, i.e. the means to commit suicide, the specific place and time as to when they would be taking their lives, and the reason for it, or (c) both.”

Gaggle’s claim is tough to verify, especially since its system casts a wide net when flagging text as indicating possible thoughts of suicide and self-harm. Between July 2018 and June this year, for example, Gaggle flagged 447 cases of “questionable content” and 31 “possible student situations,” including possible suicides, in one Illinois school district.

According to 500 pages of emails from the school district, many of the documents flagged by Gaggle appear creative or expository. Items flagged for demonstrating a risk of suicide or self-harm often had titles like “poem portfolio,” “Possible Poems ???,” “Narrative Essay,” and “Narrative/Common App College Entry Essay.”

“One of the things [Gaggle is] teaching is not to share things, which is presumably the opposite of what many mental health professionals would say,” Igo said. If a student is afraid of being interviewed by school administrators, they may shut themselves off from adults.

It’s unclear, based on the emails alone, whether Gaggle was a helpful tool in assisting vulnerable students. In several email chains, teachers and administrators were already aware that the child was struggling and were in touch with the parents of the students in question.

“It's a little bit cynical to wave [student suicides] in front of people to try to get them to buy in,” Roberts said. “I have more faith in relationships that could be built locally and interpersonally than far-removed through content moderators and through for-profit, revenue-driven systems in terms of thinking about the safety of children.”

Gaggle claims that its safety representatives moderate content “24/7 every day of the year.” If Gaggle flags an imminent threat of suicide between 6 a.m. and 11 p.m. Central time, a safety representative will try to alert school administrators by phone. If they can’t reach anyone, safety representatives are told to call local police.
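
The after-hours procedure isn’t spelled out beyond this, but the escalation rule as described can be sketched as a simple time-window check (the function name and the fallback strings are assumptions, not Gaggle’s code):

```python
from datetime import datetime, time
from zoneinfo import ZoneInfo

# The phone-escalation window described above: 6 a.m. to 11 p.m. Central.
WINDOW_START, WINDOW_END = time(6, 0), time(23, 0)

def escalation_route(now: datetime, reached_admin_by_phone: bool) -> str:
    central = now.astimezone(ZoneInfo("America/Chicago")).time()
    if WINDOW_START <= central <= WINDOW_END:
        if reached_admin_by_phone:
            return "school administrators alerted by phone"
        return "no answer; safety representative told to call local police"
    return "outside the phone window (procedure not described)"
```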

“I wouldn't want a stranger poring over the communications and engagement of my children."

Samuel Fasulo, a psychologist at NYU Langone specializing in adolescent mental health, said that escalating a student’s mental health treatment by hospitalizing them or calling the police can be risky. If poorly timed, these escalations could worsen a student’s symptoms.

“If it really helped to escalate, we would escalate all situations,” Fasulo said. “The reality is that there are what's called iatrogenic effects in health care, which is a negative impact as a result of receiving treatment. It's not necessarily bad treatment, per se, but it's just treatment that increases the various symptoms you're trying to treat.”

Fasulo said that tools like Gaggle could be helpful in flagging suicidal ideation, but noted that it’s likely best used by a team of human clinicians and counselors.

“If it's done right, I don't actually think there's anything inherently wrong with [Gaggle],” Fasulo said. “It just has to be done well, and it can't be used as the end-all, be-all — which I think people are often looking for.”

But doing right by students isn’t easy, or cheap. Gaggle, like any platform whose success requires ongoing growth, faces issues similar to Facebook and Twitter when it comes to moderating at scale. AI-dependent moderation is clunky, and good moderators require training, reasonable pay, and emotional support. These are all things that impact a company’s bottom line. Beyond this is the issue of student privacy.

Roberts expressed concern about the policy of having human beings review every piece of content flagged by Gaggle’s internal AI tools.

“That seems like a gross mismatch right there too,” Roberts said. “I wouldn't want a stranger poring over the communications and engagement of my children. But I wouldn't want it taking place fully online anyway. Because you know it again, it poses so many questions that I think the tools themselves are ill-equipped to respond to.”

“We can't even get on board to funding school lunches”

Gaggle is not cheap. Invoices, letters of intent, and receipts from 17 cities show that the company charges school districts tens of thousands of dollars for its services each academic year. Depending on how many students were in the district and how many premium services it chose to use, annual fees ranged from about $12,000 to more than $60,000 for a single academic year. (Gaggle’s contracts prohibit it from using "district data" or "student records" for targeted advertising during or after an agreement period, and it’s required to delete that data when the period ends.)

According to trade publication SmartBrief, schools can tap into a hefty amount of federal money to pay for student surveillance services like Gaggle. Under Title IV of the 2015 Every Student Succeeds Act (ESSA), school districts may receive upward of $30,000 in federal money, allocated according to need, to fund various, vaguely defined initiatives. Districts that receive more than $30,000 under ESSA must dedicate at least 20% of the money to a "well-rounded" education, at least 20% to "safe and healthy" activities for students, and some undefined amount toward "technology."
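
For a concrete sense of the arithmetic: under the rules as described, a district receiving a $50,000 Title IV grant would have to dedicate at least $10,000 to each of the two 20% categories. A toy calculation (the function name is ours, not ESSA’s):

```python
def title_iv_minimums(grant: float) -> dict:
    """Illustrative arithmetic only, per the ESSA Title IV rules described above."""
    if grant <= 30_000:
        return {"note": "at or below $30,000; the 20% floors don't apply"}
    return {
        "well_rounded_min": 0.20 * grant,      # at least 20% on "well-rounded" education
        "safe_and_healthy_min": 0.20 * grant,  # at least 20% on "safe and healthy" activities
        "discretionary_max": 0.60 * grant,     # some of this must still go to "technology"
    }

print(title_iv_minimums(50_000))
# {'well_rounded_min': 10000.0, 'safe_and_healthy_min': 10000.0, 'discretionary_max': 30000.0}
```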

Roberts said that she feared tools like Gaggle could be used disproportionately as a stand-in for a larger staff of counselors at poorly funded school districts. She said that human support services like these could build relationships and trust with students and perhaps prevent issues from escalating rather than simply responding to them.

“I think this is a question of values, and it's a question of how we value our young people. It's a question of where we collectively as a society want to spend money,” Roberts said. “We can't even get on board to funding school lunches, and yet we're going to give massive amounts of money to enrich this company that is spying on our kids.” ●

You'll find the source documents on which this report was based here.

