A guide to parental controls on social media

By Samantha Murphy Kelly, CNN Business
A little over a year ago, social media companies were put on notice for how they protect, or fail to protect, their youngest users.
In a series of congressional hearings, executives from Facebook, TikTok, Snapchat and Instagram faced tough questions from lawmakers over how their platforms can lead younger users to harmful content and damage mental health and body image (particularly among teenage girls), and over their lack of sufficient parental controls and safeguards to protect teens.
Those hearings, which followed disclosures in what became known as the “Facebook Papers” from whistleblower Frances Haugen about Instagram’s impact on teens, prompted the companies to vow to change. The four social networks have since introduced more tools and parental control options aimed at better protecting younger users. Some have also made changes to their algorithms, such as defaulting teens to seeing less sensitive content, and have stepped up their moderation efforts. But some lawmakers, social media experts and psychologists say the new solutions are still limited, and more needs to be done.
“More than a year after the Facebook Papers dramatically revealed Big Tech’s abuse, social media companies have made only small, slow steps to clean up their act,” Sen. Richard Blumenthal, who chairs the Senate’s consumer protection subcommittee, told CNN Business. “Trust in Big Tech is long gone and we need real rules to ensure kids’ safety online.”
Michela Menting, a digital security director at market research firm ABI Research, agreed that social media platforms are “offering very little of substance to counter the ills their platforms incur.” Their solutions, she said, put the onus on guardians to activate various parental controls, such as those intended to filter, block and restrict access, and on more passive options, such as monitoring and surveillance tools that run in the background.
Alexandra Hamlet, a New York City-based clinical psychologist, recalls being invited to a roundtable roughly 18 months ago to discuss ways to improve Instagram, in particular for younger users. “I don’t see many of our ideas being implemented,” she said. Social media platforms, she added, need to work on “continuing to improve parental controls, protect young people against targeted advertising, and remove objectively harmful content.”
The social media companies featured in this piece either declined to comment or did not respond to a request for comment on criticism that more needs to be done to protect young users.
For now, guardians must learn how to use the parental controls while also being mindful that teens can often circumvent those tools. Here’s a closer look at what parents can do to help keep their kids safe online.

Instagram

After the fallout from the leaked documents, Meta-owned Instagram paused its much-criticized plan to release a version of Instagram for kids under age 13 and focused on making its main service safer for young users.
It has since introduced an educational hub for parents with resources, tips and articles from experts on user safety, and rolled out a tool that allows guardians to see how much time their kids spend on Instagram and set time limits. Parents can also receive updates on which accounts their teens follow and which accounts follow them, and can view, and be notified of, any changes their child makes to their privacy and account settings. Parents can see which accounts their teens have blocked as well. The company also provides video tutorials on how to use the new supervision tools.
Another feature encourages users to take a break from the app after a predetermined amount of time, suggesting they take a deep breath, write something down, check a to-do list or listen to a song. Instagram also said it’s taking a “stricter approach” to the content it recommends to teens and will actively nudge them toward different topics, such as architecture and travel destinations, if they’ve been dwelling on any one type of content for too long.

Facebook

Facebook’s Safety Center provides supervision tools and resources, such as articles and advice from leading experts. “Our vision for Family Center is to eventually allow parents and guardians to help their teens manage experiences across Meta technologies, all from one place,” Liza Crenshaw, a Meta spokesperson, told CNN Business.
The hub also offers a guide to Meta’s VR parental supervision tools from ConnectSafely, a nonprofit aimed at helping kids stay safe online, to assist parents in discussing virtual reality with their teens. Guardians can see which accounts their teens have blocked and access supervision tools. They can also approve their teen’s download or purchase of an app that is blocked by default based on its rating, or block specific apps that may be inappropriate for their teen.

Snapchat

In August, Snapchat introduced a parent guide and hub aimed at giving guardians more insight into how their teens use the app, including who they’ve been talking to within the last week (without divulging the content of those conversations). To use the feature, parents must create their own Snapchat account, and teens have to opt in and give permission.
While this was Snapchat’s first formal foray into parental controls, it already had a few safety measures for young users, such as requiring teens to be mutual friends before they can start communicating with each other and prohibiting them from having public profiles. Teens have the Snap Map location-sharing tool off by default, but can use it to share their real-time location with a friend or family member, even while the app is closed, as a safety measure. Meanwhile, a Friend Check Up tool encourages Snapchat users to review their friend lists and make sure they still want to be in touch with certain people.
Snap previously said it’s working on more features, such as the ability for parents to see which new friends their teens have added and to confidentially report concerning accounts that may be interacting with their child. It’s also working on a tool to give younger users the option to notify their parents when they report an account or piece of content.
The company told CNN Business it will continue to build on its safety features and consider feedback from the community, policymakers, safety and mental health advocates, and other experts to improve the tools over time.

TikTok

In July, TikTok announced new ways to filter out mature or “potentially problematic” videos. The new safeguards assign a “maturity score” to videos detected as potentially containing mature or complex themes. It also rolled out a tool that aims to help people decide how much time they want to spend on TikTok. The tool lets users set regular screen-time breaks, and provides a dashboard that details the number of times they opened the app, a breakdown of daytime and nighttime usage and more.
The popular short-form video app currently offers a Family Pairing hub, which allows parents and teens to customize their safety settings. A parent can also link their TikTok account to their teen’s app and set parental controls, including how long their teen can spend on the app each day, whether their exposure to certain content is restricted, whether they can search for videos, hashtags or Live content, and whether their account is private or public. TikTok also offers a Guardian’s Guide that highlights how parents can best protect their kids on the platform.
In addition to parental controls, the app restricts younger users’ access to some features, such as Live and direct messaging. A pop-up also surfaces when teens under the age of 16 are ready to publish their first video, asking them to choose who can watch it. Push notifications are curbed after 9 p.m. for users ages 13 to 15, and after 10 p.m. for users ages 16 to 17.
The company said it will be doing more around boosting awareness of its parental control features in the coming days and months.

Discord

Discord did not appear before the Senate last year, but the popular messaging platform has faced criticism over the difficulty of reporting problematic content and the ease with which strangers can get in touch with young users.
In response, the company recently refreshed its Safety Center, where parents can find guidance on how to turn on safety settings, FAQs about how Discord works, and tips on how to talk about online safety with teens. Some existing parental control tools include an option to prohibit a minor from receiving a friend request or a direct message from someone they don’t know.
Still, it’s possible for minors to connect with strangers on public servers or in private chats if the stranger is invited by someone else in the room or if the channel link is dropped into a public group that the user accessed. By default, all users (including those ages 13 to 17) can receive friend invitations from anyone in the same server, which then opens up the ability to send private messages.
The-CNN-Wire™ & © 2022 Cable News Network, Inc., a Warner Bros. Discovery Company. All rights reserved.