The ongoing expansion of social media has sparked a continuing conversation about safety and privacy, particularly for children and young adults. The latest point of contention is the responsibility of tech companies to regulate content. The Supreme Court is currently evaluating a case that seeks to hold tech giant Google accountable for enabling harmful content to be shared online and allowing its algorithms to promote it to users.
In some cases, users have been driven to commit violence against others or themselves, based on unmoderated or even promoted content. If you have questions about pursuing legal action against a tech company for its role in your personal injury, it is in your best interest to speak with a Florida personal injury lawyer as soon as possible. You may be able to pursue compensation for the damages that you have suffered because of social media via a social media harm lawsuit.
Our social media harm lawyers are currently accepting cases and are prepared to hear your stories. Dolman Law Group offers free consultations for prospective clients, and we encourage you to take advantage of this opportunity to learn more about your options for legal recourse.
Supreme Court Case May Transform How Tech Companies Operate
While multiple tech companies are facing criticism for their moderation policies and algorithms, the case currently before the Supreme Court focuses on Google, the owner of YouTube. The case, Gonzalez v. Google, is being brought on behalf of the estate of Nohemi Gonzalez, a young woman who was killed in an ISIS terrorist attack in 2015. In their wrongful death lawsuit, her family has accused Google of failing to properly regulate content on YouTube's platform.
They claim that not only did Google allow ISIS to post violent videos for recruitment purposes, but YouTube's algorithm also pushed increasingly radical videos to its users. The Gonzalez family's lawyers argue that by allowing and encouraging this type of content on its platform, Google should be considered negligent in Nohemi Gonzalez's death. This would potentially allow them to recover damages from the tech giant. If the Supreme Court rules in the family's favor, it would set a precedent allowing injured parties to go after the platforms that enabled harmful content, not just the individual or group who posted it.
The outcome of this case could radically alter the way that tech companies function and the type of content users encounter. The primary issue being decided is to what extent tech companies can be held liable for their algorithms and moderation policies when content displayed on their platforms leads to serious physical and mental injuries.
How Can Social Media Usage Cause Injuries?
Researchers, doctors, teachers, and parents have been sounding the alarm for years: social media has the power to irreparably damage the mental, emotional, and physical health of users, especially young users. Some of the most commonly cited issues are anxiety and depression, which can lead to substance abuse, sleep issues, eating disorders, self-harm, or even suicide. This may stem from actions like bullying, child sex abuse or exploitation, and terrorism, which can be amplified and conducted via social media.
Understanding Examples of Damage Caused by Social Media
In one tragic case, an 11-year-old girl was "financially and sexually exploited" by adult men after interacting with them on Discord, a platform known for its limited moderation. As a result of her communication with these predators, she has struggled with her mental health and has attempted suicide multiple times. Her parents are currently seeking damages for the costs of her treatment. This case illustrates how online interactions can cause extensive, real-world harm to a person's health.
Not only does the anonymity of the internet provide an opportunity for predators, but it also allows users to construct a false narrative of their lives and create harmful content with limited accountability. For example, suppose Instagram's algorithm surfaces an influencer who promotes dangerous dieting practices, and a young user becomes obsessed with modeling the influencer's life. Over time, they develop an eating disorder that does significant damage to their heart.
At this time, the tech company would not share liability, even though it was its algorithm that deliberately encouraged the young user to view this content based on "interest." While this is a hypothetical situation, it is entirely plausible. Instagram's own internal research has acknowledged that its platform promotes body image issues and even eating disorders among young girls, yet the company has done little to mitigate the problem. This leaves parents with limited ways to hold powerful companies accountable for the damage they do to their children.
Are Tech Companies Liable for Damages?
Thus far, tech companies have been largely protected from personal injury lawsuits under Section 230 of the Communications Decency Act of 1996. Under this law, tech companies generally cannot face civil lawsuits over illegal content that is posted or shared across their platforms. They remain protected from civil liability even if the content in question violates their own policies and comparable content has been removed elsewhere on the platform.
What these protections mean in practice is that users who are injured by another party, or driven to injure themselves, have had limited legal recourse against the platforms that host this content. Tech companies claim that their status is distinct from that of "publishers," who bear a greater responsibility to vet their content, and this distinction has so far allowed them to evade liability.
However, critics of this reading of Section 230 argue that a tech company whose algorithm actively promotes harmful content should not be shielded from litigation. In their view, recommendation algorithms mean the company is taking an active role in amplifying content that incites violence or harassment, rather than passively hosting it, and so falls outside the protections Section 230 was meant to provide. As of right now, the debate remains unresolved.
Contact Dolman Law Group for Help With Your Social Media Injury Lawsuit
When social media becomes a tool for violence or harassment, there are often real-world consequences. If you or your child has been injured due to social media activity, you may be wondering if you can hold the platform or company itself accountable for allowing harmful content. Dolman Law Group recognizes that this is a complicated issue that will largely depend on the outcome of Gonzalez v. Google. Trusting our results-based approach is the wisest way to ensure your damages are properly compensated.
We are prepared to investigate all possible avenues for compensation for damages for costs like medical bills, lost wages, and pain and suffering. Our award-winning team will work closely with you to establish a compelling claim for damages, and we will advocate for it in negotiations and in court. You can trust us to dedicate the time and effort that is needed to recover maximum compensation for your injuries and damages.
Dolman Law Group has been on the front line of this emerging area in personal injury law. As we await a formal decision by the Supreme Court, our team is on standby to answer any questions you may have and assist you in discerning liability as you consider filing a claim. You can reach us by phone at (727) 451-6900 or through our website contact form to schedule a free consultation with our personal injury lawyers.