Smartphone with TikTok logo. (Shutterstock)
Last month, after the Wall Street Journal revealed that social media app TikTok served drug and bondage videos to teenage accounts, TikTok said the experiment "in no way represents the behavior and viewing experience of a real person."
"Protecting minors is vitally important," a spokeswoman said, "and TikTok has taken industry-first steps to promote a safe and age-appropriate experience for teens."
A Raw Story investigation, however, found the TikTok experience — seen through the lens of a teen account that dwelled on law enforcement content — to be anything but safe. Within twelve hours of opening a 13-year-old account, TikTok recommended content promoting firearms, along with videos promoting body armor and rifle mounts that improve the accuracy of weapons fire. It also provided links to websites where those products are sold.
TikTok also suggested an account about serial killers that described the murder of a naked 14-year-old. Within several days, the app played videos that young users uploaded of their apparent failed suicide attempts, including one girl who appeared to be in a hospital. TikTok's promotion of suicide content will be the subject of a report from Raw Story later this week.
TikTok, owned by Beijing-based ByteDance, is an app that provides a stream of user-uploaded videos. It recommends additional videos based on which videos users watch. Generally, it offers innocuous content like people doing funny dances. While the technology is effective at keeping users on the app, it can send users down rabbit holes of toxic content if they show interest in certain videos. It also tends to surface increasingly extreme videos related to the type of content a user watches.
Raw Story's simulated 13-year-old user initially dwelled on videos of police, servicemembers and hunting. Within two hours, TikTok showed hunting videos jokingly suggesting shooting a neighbor's dog and an Amish man. Within three hours, TikTok recommended "flexible" rifle armor. After five hours, TikTok suggested we consider Unity Tactical's Fast Mount, a device used to improve firearms targeting. Unity Tactical's website says the mount is helpful "especially while wearing tactical gear, night vision goggles, gas masks, helmets, and plate carriers."
Both profiles promoting body armor and rifle equipment linked to websites where the items were sold.
By the time bedtime rolled around — 10 p.m. for our eighth-grader — TikTok served up videos about serial killers. By clicking the profile, our 13-year-old found graphic descriptions of murders committed by serial killer Jeffrey Dahmer, including the killing of a 14-year-old who was found naked in the street by police and an 18-year-old "dismembered and disposed of… in the woods behind his parent's home."
Many videos appear to violate TikTok's Community Guidelines. TikTok bars content that "promotes, normalizes, or glorifies extreme violence or suffering" and "depiction, promotion, or trade of firearms, ammunition, firearm accessories, or explosive weapons." Raw Story found the depiction or promotion of all four types of prohibited weapons products.
Though the guidelines ban any depiction of firearms, they later say weapons "carried by a police officer, in a military parade, or used in a safe and controlled environment such as a shooting range may be allowed."
After an inquiry, TikTok called Raw Story to gather background information and take questions. Raw Story provided TikTok with the content, but TikTok declined to comment.
TikTok's owner offers a different, more restrictive version of the app in China. The version provided to Americans is banned there.
TikTok knows about the prevalence of guns on its platform. In 2020, Gizmodo published "TikTok is Full of Guns," which found at least 100 accounts in apparent violation of TikTok's policies. In March, Media Matters followed up with an article entitled "TikTok is teaching teens how to build fully automatic rifles and make 'hollow point' ammunition." The news site Digital Trends then found demonstrations of how to manufacture ammunition and 3D-print guns — "dozens" of clips which, it said, in some cases, accumulated half a million views.
TikTok told Digital Trends it "prohibits the trade, sale, and promotion of weapons," and removes "content and related accounts as they're identified." The platform allows users to flag content they don't like. TikTok now bans searches for the hashtags #homemadegun and #3dgun, though videos may be available under other terms.
One video Digital Trends cited showed a purported 9-year-old firing a handgun. Months later, the video remains on TikTok, and is easily discoverable by a 13-year-old.
Children may fire guns legally in many states. Minnesota allows 10-year-olds to fire guns while hunting with parents; Wisconsin did away with an age restriction altogether in 2017. Children fire guns in video games and movies, including this year's James Bond film, No Time to Die.
What's different about TikTok is that users who show an interest in content depicting soldiers or toy weapons are recommended videos of people firing real weapons, then suggested an opportunity to buy them. Raw Story's 13-year-old user paused on military videos and within 12 hours was shown content advertising firearms accessories and body armor.
Matthew Hogenmiller, digital manager for the gun control group March For Our Lives, worries about TikTok showing teens radicalizing content alongside links to buy guns. March for Our Lives was founded by survivors of the 2018 Parkland school shooting.
"To think that these young people are getting that radicalizing content, and could get a gun marketing video within the same scroll session is incredibly dangerous," Hogenmiller said. "On other platforms, like YouTube or Facebook, you have to actively seek out content surrounding guns. With TikTok the very nature of the platform is to allow an algorithm to choose what it shows you, and for some people, that algorithm can show you alt-right ideologies and what website to buy a [modified] gun on in a span of a few hours."
The profile for the video allegedly showing a 9-year-old firing a handgun links to a New York firearms dealer which sells handguns, rifles, shotguns, knives and silencers.
Raw Story went through the dealer's online process to purchase a Glock handgun. The gun must be retrieved at a federally licensed dealer. A minor could not legally pick up the weapon, but the ease of the process — address, credit card and a checkbox to accept terms — would allow an agreeable adult to help with a weapons purchase. Teens have used adults to buy them weapons, including the teens who murdered 12 people at Columbine High School in 1999. In Georgia, where Raw Story chose to have the gun sent, the Glock could be picked up at one of eleven nearby Cash America Pawn stores.
The dealer did not respond to an email seeking comment.
Nick Groat, founder of Safe Life Defense, whose rifle armor TikTok recommended, said his product wasn't meant for teens and that he couldn't control TikTok's algorithms. He emphasized his products were intended for safety and shouldn't be grouped with weapons.
"Body armor unfortunately gets grouped in with weapons occasionally, but that's just simply not what it is," Groat said. "It's the exact opposite. It's no different than wearing a seatbelt or a hard hat for a construction worker."
When Raw Story noted that many of TikTok's users are minors, he said, "TikTok is one of the fastest growing platforms in the world and is commonly used by adults as well. Most of the influencers that we work with are law enforcement that are older than I am."
TikTok also showed Raw Story's teen account body armor from two other sellers, whose profiles linked to their online stores. Neither responded to repeated requests for comment.
Concern about teens and guns stems from the fact that teens have used firearms in massacres at U.S. public schools. In 2018, 19-year-old Nikolas Cruz shot more than a dozen students to death at Marjory Stoneman Douglas High School in Parkland.
A video of Cruz describing his plan to murder teens prior to carrying it out is on TikTok. The video was found through a TikTok search, and remains available on the platform.
"I'm gonna be the next school shooter of 2018," Cruz tells the camera. "My goal is at least 20 people. With an AR-15 and a couple of tracer rounds, I think I can get it done."
The video has 6.3 million views.
The AR-15, the lightweight semi-automatic rifle Cruz used in the Parkland shooting, is also on TikTok. TikTok's app notes that users have viewed AR-15 content 277 million times.
The Columbine High School killers also live on in TikTok. A search for Columbine reveals the two teens screaming into the camera prior to their classmates' deaths. TikTok also hosts a fan account for Columbine shooter Eric Harris which says "he is so hot" and "I love you more and more."
In addition to school shooter videos, TikTok also provides a virtual marketplace for ammunition dealers. A seller cited by Media Matters in March continues to hawk bullets, even though their original account was taken down.
"We got lots of ammo in stock," one video states. "Order through our email or inbox us directly on TikTok." The company did not respond to an email seeking comment.
Raw Story easily found three other sellers by searching "ammo."
Media Matters Research Director Sharon Kann said that following its March report, TikTok removed many of the videos it had flagged. But she noted that content TikTok takes down frequently reappears.
"Although TikTok has taken down potentially violative content, we've seen time and again how a lack of proactive and consistent enforcement enables bad actors, harmful content and misinformation to reappear and flourish on the platform," Kann said.
While its rules bar the depiction of guns except in limited circumstances, TikTok seems to have made little effort to prevent teens from searching for firearms videos. The app shows weapons and violence to millions of users. According to TikTok's own app, the hashtag for murder has 3.4 billion views; guns, 1.8 billion views; and ak47, 100 million views. TikTok even suggests content under the hashtag "killingchallenge."
One account TikTok recommended to Raw Story's 13-year-old offered drawings created by serial killers (video here), pictures of a serial killer's torture chamber, and descriptions of the murders of teens (videos here and here). It depicted a bloody mattress being removed from serial killer Jeffrey Dahmer's apartment, and retold his murder of Stephen Hicks, an 18-year-old that Dahmer said he lured with alcohol and killed. A photo by serial killer Richard Ramirez follows.
The account also described the murder of a teen Milwaukee police officers found naked in the street in 1991. Dahmer later said he injected hydrochloric acid into the boy's brain after drilling a hole in his skull.
"Jeffrey came home to find one of his victims who was in a zombie like state had escaped," the video notes. "He was naked standing near some woman on the street. When the police came, he told them that it was his boyfriend and that he was drunk. They gave the boy back to Dahmer and he was killed."
"That boy was 14," the video adds.
The account claims it is "not a fan account."
TikTok's impact on teens is not simply theoretical. Across the country, schools are dealing with TikTok "challenges," which encourage teens to engage in destructive behavior and upload videos of it to the platform. Last month's "devious licks" challenge resulted in theft and vandalism at schools across the country. Three weeks ago, a teen was arrested after punching a disabled 64-year-old teacher in the face.
Even toy weapons that appeared in pranks have posed challenges. A "Gun Prank War" in North Carolina found police responding to reports of teens pointing guns at motorists. The weapons, which turned out to be toys, so alarmed residents that police put out a statement on Facebook.
"Most of the people involved were under the age of eighteen," Roxboro police Chief David Hess told Raw Story. "In today's society, with airsoft and water guns designed to replicate real firearms, law enforcement and even general citizens cannot tell the difference, which creates a potentially deadly situation. We used our local incident as an educational approach to hopefully prevent a deadly situation occurring."
Within two hours of Raw Story's teen account joining the platform, TikTok recommended videos of people firing airsoft guns. The devices are so realistic they are banned in public in three states.
TikTok's critics have been pleading with the company to take a more hands-on approach to protect its young users. In 2019, the Federal Trade Commission fined TikTok $5.7 million for violating the Children's Online Privacy Protection Act, for improperly collecting the personal information of children.
TikTok is also facing inquiries from Congress. The company's head of public policy will testify today in a Senate consumer protection subcommittee hearing on social media and child safety.
"Recent revelations about harm to kids online show that Big Tech is facing its Big Tobacco moment—a moment of reckoning," subcommittee chairman Sen. Richard Blumenthal (D-CT) said in a statement. "We need to understand the impact of popular platforms like Snapchat, TikTok, and YouTube on children and what companies can do better to keep them safe."
The committee's ranking Republican, Sen. Marsha Blackburn (R-TN), has said "companies are prioritizing profits over safety."
Blumenthal, along with Sen. Ed Markey (D-MA) and Rep. Kathy Castor (D-FL), introduced legislation aimed at protecting users under 16. The so-called KIDS Act would ban auto-play settings for young teens and prohibit websites from amplifying violent or inappropriate content.
"Figuring out that they're interested in things like suicide or guns and then bombarding them with that content, is something that the KIDS Act would expressly prohibit," said Josh Golin, executive director of Fairplay, a nonprofit watchdog which has filed an FTC complaint against TikTok.
In September, the UK began enforcing a law requiring social media companies to employ "age appropriate design" when serving content to users they believe to be minors. Two days before the bill went into effect, Instagram began requiring users to enter their birthday before using the app. The company also introduced changes that restrict advertisers from targeting audiences under 18 using anything other than their basic demographic information.
Following the law's passage, TikTok announced that it would stop sending push notifications after 9 p.m. to 13- to 15-year-olds. YouTube announced it would turn off auto-play settings for children and add break and bedtime reminders.
The companies' moves to restrict teen accounts raise hope for critics who say legislation is necessary to curb potentially harmful impacts on teens. Instagram made the changes globally, suggesting that individual countries may be able to influence tech giants' global behavior.
Correction: An earlier version of this story incorrectly attributed quotes to Devorah Heitner, author of Screenwise: Helping Kids Thrive (and Survive) in Their Digital World.
Have tips about TikTok or internal information about social media impacting teens? Email [email protected]
John Byrne holds direct investments in Softbank, one of TikTok's large early investors; Alibaba; Facebook; Microsoft; Tencent; and Alphabet, the parent company of Google and YouTube. He is the founder of Raw Story.