She will host a livestreamed panel at BKC on November 9th to debate who profits from so-called “kid-influencers” who produce content on social media platforms such as TikTok.
Plunkett is also the inaugural Executive Director of Harvard Law School Online and Associate Dean of Learning Experience and Innovation (LXI) and the Meyer Research Lecturer on Law at Harvard Law School. Her book, Sharenthood: Why We Should Think before We Talk about Our Kids Online (MIT Press 2020), explores how adults unwittingly compromise children’s privacy online. Plunkett sat down with BKC intern Zhamilya Bilyalova for a deep dive on her work.
I wanted to talk a little bit about “sharenting” and its legal aspects. What do you see happening in the future [in terms of] potential cases that we might have to struggle with to understand how to fix [these issues]?
I think that one of the areas where there's likely to be a fair amount of movement on the legal front as today's kids and younger teenagers come of age…is around what I call “commercial sharenting,” but [what is] more commonly called influencer culture. When we look at parents who are engaged in commercial sharenting, they're actually monetizing or trying to monetize their kids in a way that, right now, the law is really not regulating in the United States in any consistent, meaningful way.
We have a huge global industry that has, as one of its core components, child labor that is in most instances unregulated. Right there, when you have kids who are doing that labor, who are providing the content, when they come of age—or maybe even a little bit later into adulthood, once they kind of look around—that's when we might start to see a cluster of litigation and/or more proactive labor laws being passed at the state level.
I'm looking at the advocacy of young people, and it seems like what they're mostly asking for is to make social media companies more accountable, and also asking the government to help regulate social media platforms [by having them] remove harmful content, for example. In order for us [young people] to prepare for the future, what would you think are the most important things for us to ask for right now?
I think it's really important to ask for limitations on the use of data collected from or about minors, and not just if it's the minor themselves sharing it, but also if it is shared about them. I think that it is really tricky from a privacy perspective to put the brakes on data collection—it's not impossible, and certainly there are many important efforts happening—but I think that it is difficult to fully mitigate current or future harms when we're talking about data collection. I think data use makes more sense, and what I mean by “data use” is having legal guardrails in place so that decision makers for key life opportunities—higher education, bank products, homes, jobs, et cetera—are not able to use digitally acquired data about minors or from minors.
There do need to be carve-outs; for instance, you don't want doctors to be unable to use digital information from electronic medical records, so you need to be able to draft any potential law in a way that carves out the ability for collectively agreed-upon, legitimate uses of that data. But I think what young people would be well served to think about is having prohibitions on gatekeepers to major life opportunities using digital information about them or from them unless they have consented in a very direct and explicit way.
Even the word “privacy” is confusing. Do you have an idea or a framework you use to think about it?
The way I think about privacy—and this builds on Jonathan [Zittrain]'s work—is the importance of having, for kids and teens, a protected place to play, to make mischief and make mistakes and grow up better for having made them. I really do think that that overarching framework is sort of an umbrella to a very practical and ethical approach. Now you're absolutely right that under that umbrella…you would need law and policy, you would need technical [solutions], you would need education, you would need changing of norms and standards. You would certainly need private industry. But if we think about having a coherent vision of privacy, we then need to recognize that with any vision, the solution space will be multi-stakeholder. It really needs to be across the board.
I think we can protect minors by saying, okay, some amount of private data is going to get out of the bag. We need to try to say that wherever the information comes from, unless a specific reason like healthcare applies, that information just can't be used to decide whether a minor gets access to education or health insurance or consumer credit. I think that is something that makes a really big difference in setting up childhood and adolescence as more private spaces because it would stop the long tail of digital information from following you.
I want to ask more about your approach to digital parenting. I feel like a lot of parents are really struggling with even using parental controls or trying to monitor social media. Parents’ instincts maybe are not working as well just because the world has really changed. Do you have any final thoughts on that?
My first thought is you're correct. Parenting instincts do not work as well when the available tools are changing and evolving at such a rapid rate, and the digital landscape that our kids and teens are in is so different from the digital landscape that almost all of us came of age into. I think, maybe counterintuitively, the best way to address that is through building more and stronger human-to-human relationships. If we can increase the number of trusting, respectful human-to-human connections, then we are going to have more opportunities for kids to be able to just think through and share and get guidance and wave their hands if they need help or share something that there is to celebrate. And so, I think building human-facing community for our kids—more and more of it—is actually the best way to counterbalance and, ultimately, ideally support online communities as well.
Interviewer
Zhamilya Bilyalova is a student at Wellesley College, where she studies Anthropology and Data Science. She is also the co-founder of PrivaZy, a community-centered movement to raise awareness about and address online privacy issues faced by Gen Z. During the summer of 2023, she interned with the Berkman Klein Center’s Institute for Rebooting Social Media.