this post was submitted on 30 Aug 2024
172 points (96.2% liked)


Taiwanna Anderson’s life changed forever in December 2021, when she found her 10-year-old daughter Nylah unconscious, hanging from a purse strap in a bedroom closet.

Barely an adolescent, Nylah wasn’t suicidal. She had merely come across the “Blackout Challenge” in a feed of videos curated for her by TikTok’s algorithm. The challenge circulating on the video-sharing app encouraged users to choke themselves with household items until they blacked out; when they regained consciousness, they were supposed to upload their video results for others to replicate. After several days in a hospital’s intensive care unit, Nylah succumbed to her strangulation injuries. Anderson sued TikTok over product liability and negligence that she alleges led to Nylah’s death.

For years, when claimants tried to sue various internet platforms for harms experienced online, the platforms benefited from what amounted to a get-out-of-jail-free card: Section 230 of the Communications Decency Act, a 1996 statute that offers apps and websites broad immunity from liability for content posted to their sites by third-party users. In 2022, a federal district judge accepted TikTok’s Section 230 defense to dismiss a lawsuit filed by Anderson based on the assessment that TikTok didn’t create the blackout challenge video Nylah saw—a third-party user of TikTok did.

But on Tuesday, the federal Third Circuit Court of Appeals released an opinion reviving the mother’s lawsuit, allowing her case against TikTok to proceed to trial. TikTok may not have filmed the video that encouraged Nylah to hang herself, but the platform “makes choices about the content recommended and promoted to specific users,” Judge Patty Shwartz wrote in the appellate court’s opinion, “and by doing so, is engaged in its own first-party speech.”

top 21 comments
[–] v1605@lemmy.world 48 points 2 months ago

I think the laws need to clarify the difference between just hosting content and suggesting it. TikTok should be responsible since it suggested the dangerous video, unlike if the video were just hosted on an AWS server.

[–] Fedizen@lemmy.world 46 points 2 months ago (1 children)

Algorithms are content decisions these companies have already made. Surprisingly good decision by the court.

[–] theherk@lemmy.world 17 points 2 months ago

This seems so black and white to me that I have a hard time discussing it with people. I support 230 protections if you blindly host and aggregate content. But the moment you do anything to drive engagement, a site should be fully responsible for every bit of content.

[–] schnurrito@discuss.tchncs.de 18 points 2 months ago (2 children)

Section 230 was invented for things like mailing lists, newsgroups, phpBB-style forums, wikis. IMHO all of those are great things that we should definitely continue to have.

Notice something? None of these have recommendation algorithms. If someone posts something to any of these kinds of platforms, it will be shown deterministically to those who have chosen to look at or subscribe to the place where it is posted; but it will not be shown to anyone else.

I think it makes sense, regardless of how the current law should be interpreted, to say that operators of these kinds of platforms aren't liable for what users do with them, but once you install a recommendation algorithm to show things to more users who haven't actively made the choice to look for/at those things, you are liable for the choices of that recommendation algorithm.

I admit that there are gray areas in that distinction, particularly whether search results should count as recommendation algorithms; but that is the general idea of what the law should ideally be.

Current large social media platforms get so much data uploaded to them all the time that their recommendation algorithms basically constitute a choice on the part of the social media platform to promote some of that data over other data. This makes these kinds of social media platforms essentially media companies, and everyone agrees that media companies are liable for what they promote.
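
To make that distinction concrete, here is a minimal sketch (Python, with made-up field and function names, not any platform's actual code) of the two behaviours being contrasted: deterministic display to people who subscribed, versus platform-driven recommendation based on inferred interest.

```python
from dataclasses import dataclass

@dataclass
class Post:
    id: int
    community: str           # where the author chose to post it
    created_at: float        # unix timestamp
    engagement_score: float  # watch time / clicks the platform has measured

# Deterministic display (mailing list / phpBB / wiki style): every subscriber
# to a community sees the same posts, newest first. Nothing is inferred about
# the individual user.
def deterministic_feed(posts, subscriptions):
    visible = [p for p in posts if p.community in subscriptions]
    return sorted(visible, key=lambda p: p.created_at, reverse=True)

# Personalized recommendation: the platform itself decides to surface posts
# the user never asked for, weighted by inferred interest and measured engagement.
def recommended_feed(posts, inferred_interest):
    def score(p):
        return inferred_interest.get(p.community, 0.0) * p.engagement_score
    return sorted(posts, key=score, reverse=True)
```

The argument above is that the first function is what Section 230 was written for, while the second is the platform making its own editorial choice about what to promote.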

[–] jaybone@lemmy.world 2 points 2 months ago (2 children)

phpBB-style forums have plugins for things like trending threads and posts, right? At some point it becomes a slippery slope.

[–] ted@sh.itjust.works 3 points 2 months ago (2 children)

A trending-posts list is not a recommendation algorithm, in my opinion. I think the slope stops at curation (interest-based algorithms targeted at specific users).

For example, the Reddit homepage is now curated; Lemmy sorting methods are not.

[–] NotAnotherLemmyUser@lemmy.world 1 points 2 months ago

Then if they go this route, they'd better make sure they clearly define what they mean by a "recommendation algorithm" or an "interest-based algorithm", because the opinions of individuals won't hold up in court.

If it's not defined, an attorney could easily argue that Lemmy's "Scaled" sort is a "recommendation algorithm", and you'd have to hope the judge understood enough about programming to know where to draw the line.

[–] Not_mikey@slrpnk.net 1 points 2 months ago

Lemmy sorting is still interest-based if you're not scrolling through /all; it's just that those are declared interests (you subscribed to the tennis community) as opposed to inferred interests (the algorithm figured out you like tennis from your watching habits). It's still curated, just self-curated instead of algorithmically curated.

So I guess you could say the line is how the interests are compiled and whether the interest was given explicitly by the user, but then you get into how users understand certain actions, like likes. If people like something just to give feedback to the poster, then it shouldn't be used at all. If they like something because they want to boost it and have their wider community see it, then the algorithm can take that into account when showing it to friends/followers. If they like something because they want to see more of it, then the algorithm can use that information to recommend things to that user. My guess is people use it as some combination of all three, and as long as the social media platform tells its users up front that the heart button is all three, it could get away with calling its algorithm explicit while not changing much.
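
As a rough sketch of those three readings of a like (Python, all names hypothetical, not any platform's actual code), the same event could be routed into three separate signals, and the open question is which of them counts as the platform's own recommendation:

```python
from collections import defaultdict

class LikeSignals:
    """Three possible interpretations of a single 'like' event."""

    def __init__(self):
        self.feedback_counts = defaultdict(int)      # 1) feedback to the poster only
        self.boosted_for = defaultdict(set)          # 2) boost to the liker's followers
        self.inferred_interest = defaultdict(float)  # 3) shape the liker's own feed

    def record_like(self, user, post_id, topic, followers):
        self.feedback_counts[post_id] += 1            # visible count; no ranking effect
        for follower in followers:
            self.boosted_for[follower].add(post_id)   # surfaced to people who follow the liker
        self.inferred_interest[(user, topic)] += 1.0  # used to recommend more of this topic
```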

[–] schnurrito@discuss.tchncs.de 1 points 2 months ago

Normally they don't. Perhaps such plugins are available, but the normal way of using these forums is to show the threads that were most recently posted in at the top, and that concept should definitely be protected by Section 230 and its equivalents.

[–] barsquid@lemmy.world 2 points 2 months ago (1 children)

Search is a gray area that can be resolved in the courts. Like searching "blackout challenge" and seeing blackout content is different from searching "cartoons" and seeing blackout content.

[–] schnurrito@discuss.tchncs.de 3 points 2 months ago

Yes; more interesting is which videos are at the top of a search for, say, "Kamala Harris" or "Donald Trump"; if those videos tell lies about these people (which could cause them to lose an election in a few months), what is the liability there?

[–] Carrolade@lemmy.world 18 points 2 months ago

Advocates of Section 230 have long held the broad liability shield is necessary for the internet to exist and evolve as a societal tool; if websites were responsible for monitoring the heaps of content that hundreds of millions of independent users create, they contend, lawsuits would devastate platforms’ coffers and overwhelm the judicial system. “If you have fewer instances in which 230 applies, then platforms will be exposed to more liability...

If you're contributing to enough actual harm that lawsuits could feasibly devastate your balance sheet and overwhelm the judicial system, perhaps the problem is worse than we thought.

[–] Bassman1805@lemmy.world 12 points 2 months ago (1 children)

This "challenge" has been going on since before widespread internet access.

[–] catloaf@lemm.ee 14 points 2 months ago (1 children)

But it wasn't facilitated by a social media platform, it was word-of-mouth, right? That's the difference.

[–] glimse@lemmy.world 10 points 2 months ago

It's even more different because the "challenge" videos show you that "it works" and that you'll get fame (in the form of views) for participating.

[–] WhatsHerBucket@lemmy.world 11 points 2 months ago (1 children)

Where were the parents? This isn’t Gen X, kids these days can’t be left alone without doing something stupid. TikTok is not a replacement for a babysitter.

I’m not saying TikTok isn’t at fault for its shitty algorithms, but why is a 10-year-old on TikTok by herself in the first place? I can’t believe TikTok’s TOS would even allow someone that age to have an account.

[–] some_guy@lemmy.sdf.org 6 points 2 months ago

Right? Part of modern parenting needs to include a talk about how to recognize when following a fad is dangerous. The parents failed their child.

I tried joking around with a Ziploc bag when I was around 7 and got a very good talking-to about the dangers of a plastic bag and the potential for suffocation, and that shit couldn't even fit over my head.

[–] dan1101@lemm.ee 9 points 2 months ago (1 children)

Jesus Christ, if I were running TikTok I'd have employees working shifts 24/7 to delete all such videos and ban and prosecute the uploaders.

[–] Drusas@fedia.io 5 points 2 months ago

They want the chaos.

[–] DaCrazyJamez@sh.itjust.works -4 points 2 months ago (1 children)

I hate to take TikTok's side in anything, but in this case I don't think they should be liable. A kid made a dumb choice based on a stupid video... but even if it showed up in a feed, TikTok didn't encourage the behavior.

This one is a sad case of Darwinism.

[–] Diplomjodler3@lemmy.world 10 points 2 months ago

Wrong. Children are being shown these videos in the name of "engagement", i.e. in order to maximise profits. That is the main problem with oligopolistic social media platforms. It's the algorithms that are destroying society.