The future of social media is in front of SCOTUS this week

“How Two Supreme Court Cases Could Completely Change the Internet”

“The future of the federal law that protects online platforms from liability for content uploaded on their site is up in the air as the Supreme Court is set to hear two cases that could change the internet this week.”

“The cases will decide whether online platforms can be held liable for the targeted advertisements or algorithmic content spread on their platforms.”

“Tech companies argue that Section 230 protects them from these types of lawsuits because it grants them legal immunity from liability over third-party content that is posted on their platform. The case will decide whether platforms can be held liable for spreading harmful content to users through their algorithm.”

“The law explicitly states, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider,” meaning online platforms are not responsible for the content a user may post.”

“Conservatives have long criticized Section 230, alleging that it allows social media platforms to censor right-leaning content.”

“Democrats have similarly argued against Section 230, saying that it prevents platforms from being held liable for hate speech and misinformation spread on their sites. President Joe Biden released an Op-Ed in the Wall Street Journal, asking for bipartisan legislation that would hold tech companies accountable.”

https://time.com/6256887/supreme-court-section-230-internet/

Seems as if the Court can’t just walk away from this one, and individuals such as Musk have got to be concerned
 
“How Two Supreme Court Cases Could Completely Change the Internet”

https://time.com/6256887/supreme-court-section-230-internet/

I don’t really give a shit. I just want the rules, whatever they are, to be applied to everyone.
 
The problem for tech platforms that censor information and posts is that they are no longer bystanders but have become editors, and it can be argued that makes them publishers, because they select which user content can be displayed and which cannot. Further, if they are limiting some content while promoting other content, it amounts to the same thing. The tech company is no longer a neutral party but an active participant in the conversation.

That wouldn't stop them from removing content that is criminal, malicious, or illegal; it would mean they can't inject themselves into the conversation by censoring material they disagree with.
 
The problem for tech platforms that censor information and posts is that they are no longer bystanders but have become editors, and it can be argued that makes them publishers, because they select which user content can be displayed and which cannot. Further, if they are limiting some content while promoting other content, it amounts to the same thing. The tech company is no longer a neutral party but an active participant in the conversation.

That wouldn't stop them from removing content that is criminal, malicious, or illegal; it would mean they can't inject themselves into the conversation by censoring material they disagree with.

But if the Court rules against the tech companies and dumps Section 230, they can be held legally responsible for anything they post or allow to be posted, which they aren’t at the moment.

If you think censorship is prevalent now, what do you think is going to happen when they have to increase scrutiny in order to prevent million-dollar lawsuits?
 
But if the Court rules against the tech companies and dumps Section 230, they can be held legally responsible for anything they post or allow to be posted, which they aren’t at the moment.

If you think censorship is prevalent now, what do you think is going to happen when they have to increase scrutiny in order to prevent million-dollar lawsuits?

Or, they could rule that Section 230 only applies if the company allows content without interference. If the company isn't acting as a censor or editor, it isn't a publisher; it's merely providing a platform for speech.
 
The problem for tech platforms that censor information and posts is that they are no longer bystanders but have become editors, and it can be argued that makes them publishers, because they select which user content can be displayed and which cannot. Further, if they are limiting some content while promoting other content, it amounts to the same thing. The tech company is no longer a neutral party but an active participant in the conversation.

That wouldn't stop them from removing content that is criminal, malicious, or illegal; it would mean they can't inject themselves into the conversation by censoring material they disagree with.

They should be treated like a newspaper.
 
Or, they could rule that Section 230 only applies if the company allows content without interference. If the company isn't acting as a censor or editor, it isn't a publisher; it's merely providing a platform for speech.

No, you are misinterpreting the rule. Currently they are protected under 230 as a publisher regardless of content, just passing on information. However, if the rule is struck down, they lose this protection no matter what posts are offered or how they are offered.
 
But if the Court rules against the tech companies and dumps Section 230, they can be held legally responsible for anything they post or allow to be posted, which they aren’t at the moment.

If you think censorship is prevalent now, what do you think is going to happen when they have to increase scrutiny in order to prevent million-dollar lawsuits?

Well, either those platforms are a place for free speech or they aren't. The bottom line is that the people posting should be the only ones responsible. This is like that stupidity of trying to hold bars responsible for the actions of drunk drivers.
 
Well, either those platforms are a place for free speech or they aren't. The bottom line is that the people posting should be the only ones responsible. This is like that stupidity of trying to hold bars responsible for the actions of drunk drivers.

Which is the protection Section 230 offers now: the social media sites can’t be sued. But that might disappear this week, which is what makes the case important.
 
No, you are misinterpreting the rule. Currently they are protected under 230 as a publisher regardless of content, just passing on information. However, if the rule is struck down, they lose this protection no matter what posts are offered or how they are offered.

But they aren't "just passing on information". They are deciding what gets "passed on", and that should make them responsible.
 
Which is the protection Section 230 offers now: the social media sites can’t be sued. But that might disappear this week, which is what makes the case important.

They should be sued when they decide what will and will not appear on their sites. They are controlling the content, not the posters.
 
“How Two Supreme Court Cases Could Completely Change the Internet”

https://time.com/6256887/supreme-court-section-230-internet/

What is on the line are the algorithms that censor conservative political views. At the point that Facebook or Twitter determines WHAT content it will carry, it becomes a publisher.
 
But they aren't "just passing on information". They are deciding what gets "passed on", and that should make them responsible.

That is a different issue: what they decide to pass on and what not to. But without 230, everything they run will be subject to legal recourse, and that will certainly make them even more careful about what they offer.
 
I don’t really give a shit. I just want the rules, whatever they are, to be applied to everyone.

That's what the court will decide. Can a platform determine content and still be a forum, or does the exercise of editorial control over content make the platform a publisher?

A forum will have a set of published rules that are applied to all. A publisher exercises editorial discretion over what is carried, as Twitter, Facebook, CNN, and the Los Angeles Times do.
 
They are now, which annoys many from both sides, because newspapers can be sued for what they print but social media sites cannot.

Yep. Social media companies decide what gets printed on their platforms and newspapers decide what gets printed in the paper. Essentially the same.
 
They should be sued when they decide what will and will not appear on their sites. They are controlling the content, not the posters.

You are still missing the point: right now they aren’t sued if disinformation is run on their platforms; if 230 disappears, they can be sued.
 
That is a different issue: what they decide to pass on and what not to. But without 230, everything they run will be subject to legal recourse, and that will certainly make them even more careful about what they offer.

No, it is the issue. They should be careful. And if they lean heavily one way, then we will know their stance and people can decide whether to participate or not. Everyone should be responsible for their choices.
 