If Tom Sawyer were a real boy, alive today, he’d be arrested for what he does in the first chapter of Mark Twain’s famous novel. Tom skips school. He beats up the new kid in town, then sneaks into his guardian’s house in the middle of the night. In the 1840s, towns along the Mississippi River lacked professional police forces, so Tom escaped the fate he would face today.
Let’s call today’s Tom Sawyer “Tommy S.” He would be known to the local police. Officers, teachers, and court officials would be monitoring and logging his activities using digital technologies. By chronicling his activities on Instagram, Snapchat, and other digital platforms, Tommy himself would create exhibits A through Z in the court case against him.
Meanwhile, Aunt Polly, Tommy’s guardian, might well write her own Facebook post about the affair: “Blue lights flashing outside my bedroom window. Cops woke me up again tonite. Tommy!!! How do you deal with #teentroubles? #bailmoney #xtralargecoffee.”
Aunt Polly is a sharent — a parent, teacher, or other adult caregiver who publishes, transmits, stores, or engages in other activities involving private information about a child in her or his care via digital channels.
Today, parents, teachers, and other caregivers are in Aunt Polly’s position. Sharents make decisions to disclose digital data about children that invade traditional zones of privacy and threaten kids’ and teens’ current and future opportunities, as well as their ability to develop their own sense of self.
Sharenting decisions disrupt any common understanding we may have of childhood and adolescence as protected spaces for play. How can our kids and teens discover who they are when we adults are tracking them, analyzing them, and attempting to decide for them, based on the data we gather, who they are and should become? In many ways, we owe them greater freedom to engage in self-discovery than we give ourselves.
Youth “currently enjoy almost no privacy rights vis-à-vis their parents,” as two legal scholars write in “The New Law of the Child” in the Yale Law Journal. And because parents typically serve as gatekeepers for their children’s privacy rights in schools and other settings outside the home, the lack of youth privacy rights may be even more acute with respect to nonparental adults, such as teachers. The privacy and related tech choices that adults make shape our children’s current lives and future prospects so fundamentally that we are barely aware of their magnitude.
Tech providers tend to lack transparency about what data they collect, why they collect it, what they will do with it, and whether users can set meaningful boundaries. The legal system offers little comprehensive data privacy protection, for youth and adults alike. And being digital is now close to a necessity in work and social life. So are we making choices to sharent? Or are we “sharent-trapped,” stuck in our routines by forces outside ourselves?
The short answer to both questions is yes. The relationship between choice and structural context is not “either/or” — it’s “yes and.”
You’d tell your teenager to use social media to create a positive portrayal of herself, so that future employers are impressed by her savvy dissection of current events rather than by how many tequila shots she had at Tommy’s #houseparty last weekend. Tell yourself the same thing: post only a tasteful update about your 10-year-old’s success at baseball, not the fight he got into with his younger sister after the game. Use the “holiday card” rule of thumb: if you wouldn’t put it in hard copy and mail it to a few hundred people in your life for display on their refrigerators, don’t put it on the Internet for thousands of people in, near, or outside of your life to repurpose and display indiscriminately.
Social media and other tech platforms frequented by parents, teachers, and other adults could develop more features to encourage choices that protect play, such as one that asks, “Are you sure you want to post this about your child? It could have the following consequences.”
We could also look to companies for parent versions of kid-focused platforms. We have YouTube Kids and new Google services for kids, which allow parental controls. But what about YouTube Parents or Facebook Parents: ways for parents to connect without these platforms tracking, aggregating, or otherwise using data about the parents’ kids? For example, Facebook could leave up a post from a parent about toilet training with the privacy settings that the parent picked but couldn’t pass that information through to third parties in any way or use information from it for its own internal market analysis or product development.
We’re at an interesting moment: we think we’re in the depths of the digital kid privacy rapids, but we are probably only at the start. We can still build a better raft.
Leah A. Plunkett is an associate dean and professor at the University of New Hampshire School of Law and a faculty associate at the Berkman Klein Center for Internet & Society at Harvard. This essay was adapted from “Sharenthood: Why We Should Think Before We Talk About Our Kids Online” by Leah A. Plunkett. Copyright 2019 Massachusetts Institute of Technology. Reprinted by permission of the MIT Press. Send comments to firstname.lastname@example.org.