Fletcher breaks down this story in English. Octavio reacts and expands in Spanish. Follow along with the live transcript and tap any word for its translation. Intermediate level, ideal for learners ready to expand their range.
So.
A jury in Los Angeles just handed down a verdict that could be one of the most consequential decisions in the history of tech litigation, and almost nobody is talking about it because of everything happening in the Gulf right now.
Bueno, mira, el caso se llama K.G.M.
Well, look, the case is called K.G.M.
contra Meta y Google.
versus Meta and Google.
Una mujer demandó a estas dos empresas porque cuando era niña, se hizo adicta a Instagram y a YouTube.
A woman sued these two companies because when she was a child, she became addicted to Instagram and YouTube.
Y el jurado decidió que ella tenía razón.
And the jury decided she was right.
Right, and the reason this matters, the reason I wanted to dig into this today, is that this isn't just one woman's story.
This is the first major jury verdict in what is essentially a tidal wave of lawsuits against social media companies for harming children's mental health.
Es que hay miles de casos similares en los Estados Unidos ahora mismo.
The thing is, there are thousands of similar cases in the United States right now.
Muchas familias dicen que sus hijos sufrieron problemas de salud mental porque usaron estas aplicaciones cuando eran muy jóvenes.
Many families say their children suffered mental health problems because they used these apps when they were very young.
I mean, when I first heard about this case, my instinct was skepticism.
I grew up watching television for hours, and my parents worried about that too.
So I asked myself: is this really different?
And the more I looked at it, the more I thought, yes, actually, it is fundamentally different.
Claro, la televisión no sabía quién eras tú.
Exactly, television didn't know who you were.
No tenía información sobre tus emociones, tus miedos, qué fotos miraste más tiempo.
It had no information about your emotions, your fears, which photos you looked at longest.
Los algoritmos de Instagram y YouTube sí tienen toda esa información, y la usaron para que los niños pasaran más y más horas en la aplicación.
Instagram's and YouTube's algorithms do have all that information, and they used it so children would spend more and more hours in the app.
Here's the thing that keeps coming back to me, and I think it's the heart of the legal argument.
There's a phrase that came out of internal Meta documents a few years ago.
A researcher wrote, and I'm quoting, 'we make body image issues worse for one in three teenage girls.' That's their own researcher, saying that internally.
Bueno, eso es muy importante.
Well, that's very important.
La empresa sabía que su producto era dañino para los adolescentes, especialmente para las chicas, y no cambió nada.
The company knew its product was harmful to teenagers, especially girls, and changed nothing.
Continuó exactamente igual.
It just carried on exactly the same.
And that is exactly the structure of the tobacco litigation, which is the historical parallel I keep coming back to.
For decades, tobacco companies knew their product caused cancer.
They had internal studies.
They suppressed those studies.
And when litigation finally broke through in the 1990s, it broke through in exactly this way, with internal documents.
La verdad es que la comparación con el tabaco es muy buena.
Honestly, the comparison with tobacco is very apt.
Con el tabaco, los abogados encontraron los documentos internos y todo cambió.
With tobacco, the lawyers found the internal documents and everything changed.
Y ahora tenemos los documentos de Meta, y el mundo sabe lo que la empresa conocía sobre el daño a los niños.
And now we have Meta's documents, and the world knows what the company knew about the harm to children.
The extraordinary thing is how long it took.
The tobacco settlements, the big ones in the US, that was 1998.
Decades of people getting sick, decades of litigation, and then suddenly the dam broke.
I wonder if we're at that moment now with social media.
A ver, pero hay una diferencia importante.
Well, but there's an important difference.
Con el tabaco, el daño físico fue muy claro, el cáncer es visible.
With tobacco, the physical harm was very clear, cancer is visible.
Con las redes sociales, el daño es psicológico, y es más difícil de probar en un tribunal.
With social media, the harm is psychological, and it's harder to prove in a courtroom.
That's a fair challenge.
And yet a jury just decided it was provable.
So something shifted.
I think part of what shifted is the sheer volume of research, clinical studies linking heavy social media use in adolescence to anxiety, depression, self-harm, eating disorders.
The science caught up.
Sí, y los números son muy serios.
Yes, and the numbers are very serious.
En muchos países occidentales, los casos de depresión y ansiedad entre adolescentes aumentaron mucho entre 2012 y 2020.
In many Western countries, cases of depression and anxiety among teenagers increased a lot between 2012 and 2020.
Y 2012 fue exactamente el año cuando los teléfonos inteligentes se hicieron muy populares entre los jóvenes.
And 2012 was exactly the year when smartphones became very popular among young people.
Look, I want to be careful about correlation and causation, because that's a trap journalists fall into.
But the researcher Jonathan Haidt has spent years on this, and his argument, backed by a lot of data, is that the timing is too precise to be coincidental.
The mental health of teenage girls in particular just falls off a cliff starting around 2012.
Mira, en España también vimos esto.
Look, in Spain we saw this too.
Los médicos y los psicólogos que trabajan con adolescentes dicen que en los últimos diez años, los problemas de salud mental en jóvenes aumentaron mucho.
Doctors and psychologists who work with teenagers say that in the last ten years, mental health problems among young people increased a lot.
Especialmente los problemas de imagen corporal y la ansiedad social.
Especially body image issues and social anxiety.
And that's not just America, not just one culture.
You're describing Spain, I've read similar things about South Korea, Brazil, the UK.
Which tells you something about what Instagram and YouTube actually are.
They're a global environment that adolescent brains are developing inside, and the environment was designed to maximize engagement, not wellbeing.
Exactamente.
Exactly.
Y eso es el problema central del caso legal.
And that's the central problem of the legal case.
La mujer que demandó no dijo simplemente que pasó mucho tiempo en Instagram.
The woman who sued didn't simply say she spent a lot of time on Instagram.
Dijo que el algoritmo la manipuló activamente, que le mostró contenido más y más extremo para que no pudiera parar de mirar.
She said the algorithm actively manipulated her, that it showed her more and more extreme content so she couldn't stop watching.
Right, and this is where the opioid crisis parallel is maybe even stronger than the tobacco one.
Because with opioids, the argument wasn't just that the drug was addictive.
The argument was that pharmaceutical companies actively marketed addictive drugs to vulnerable populations and downplayed the risks.
That's a different moral category.
Bueno, y Meta y Google hicieron exactamente lo mismo.
Well, and Meta and Google did exactly the same thing.
Diseñaron sus aplicaciones para crear hábitos difíciles de romper.
They designed their apps to create habits that are hard to break.
Usaron técnicas psicológicas muy conocidas, como las notificaciones, los 'me gusta', los videos infinitos que no terminan nunca.
They used well-known psychological techniques, like notifications, likes, infinite videos that never end.
The slot machine comparison.
That's the one that always lands for me.
The variable reward schedule, you don't know if your post is going to get twenty likes or two hundred, and that uncertainty is precisely what makes it compulsive.
Casinos figured this out decades ago, and then Silicon Valley imported the technique into a product marketed to children.
Es que hay personas que trabajaron en estas empresas y después explicaron exactamente esto.
The thing is, there are people who worked at these companies and afterwards explained exactly this.
Dijeron que usaron los mismos principios que las máquinas de los casinos, y que sabían que funcionaba especialmente bien en los cerebros jóvenes.
They said they used the same principles as casino machines, and that they knew it worked especially well on young brains.
Tristan Harris.
Former Google design ethicist, went on to found the Center for Humane Technology.
His testimony and his broader work essentially created the intellectual framework for these lawsuits.
He coined the phrase 'a race to the bottom of the brainstem,' which is a phrase I've never forgotten.
A ver, pero yo quiero ser honesto también.
Well, but I also want to be honest.
Hay personas que dicen que la relación entre las redes sociales y la salud mental no es tan simple.
There are people who say the relationship between social media and mental health isn't that simple.
Que algunos estudios encontraron efectos negativos muy pequeños.
That some studies found very small negative effects.
Que otros factores también son importantes.
That other factors are also important.
No, you're absolutely right about that.
The science isn't uniformly catastrophist.
But here's where I land on it: even if the effect size is modest in average populations, the design of these platforms creates severe outcomes for the most vulnerable users, and the companies knew who the vulnerable users were because their algorithms identified them.
Sí, y eso es exactamente lo que probó el caso.
Yes, and that's exactly what the case proved.
No que Instagram daña a todos los niños.
Not that Instagram harms all children.
Sino que identificó a esta chica específica como una persona vulnerable y le mostró más contenido dañino porque eso significaba que pasaba más tiempo en la aplicación.
But that it identified this specific girl as a vulnerable person and showed her more harmful content because that meant she spent more time in the app.
Which is, frankly, chilling.
The algorithm didn't know her name.
It knew her behavior.
It knew she spent longer looking at images related to thinness and dieting.
And it gave her more of that.
Not because anyone decided to harm her.
Because the system was optimizing for time-on-platform, and her distress was just a variable in that equation.
Mira, en Europa tenemos el GDPR, que protege los datos personales.
Look, in Europe we have the GDPR, which protects personal data.
Y también hay leyes nuevas, el Reglamento de Servicios Digitales, que intentan controlar cómo las plataformas usan los algoritmos.
And there are also new laws, the Digital Services Act, which try to control how platforms use algorithms.
Pero la verdad es que estas leyes llegaron muy tarde.
But honestly, these laws arrived very late.
Very late.
Instagram launched in 2010.
TikTok hit a billion users in 2021.
An entire generation grew up in this environment before regulators even understood what they were dealing with.
So the question now is, can litigation do what regulation failed to do?
Bueno, con el tabaco, la combinación de litigios y regulación fue la que funcionó al final.
Well, with tobacco, it was the combination of litigation and regulation that worked in the end.
Las demandas costaron mucho dinero a las empresas, y eso las obligó a cambiar algunas cosas.
The lawsuits cost the companies a lot of money, and that forced them to change some things.
Quizás el mismo proceso va a pasar con las redes sociales.
Perhaps the same process will happen with social media.
The tobacco settlements in the US totaled two hundred and forty-six billion dollars over twenty-five years.
Which sounds enormous until you realize what it actually produced: cigarette prices went up, advertising restrictions were imposed, and a fund was created for anti-smoking campaigns.
The companies survived.
The question is whether a comparable outcome for social media would be enough.
La verdad es que yo no creo que el dinero sea suficiente.
Honestly, I don't think money is enough.
El problema real es el diseño de las aplicaciones.
The real problem is the design of the apps.
Si los algoritmos continúan optimizando para el tiempo en la pantalla, los problemas de salud mental van a continuar también.
If the algorithms keep optimizing for time on screen, the mental health problems will continue too.
So what would meaningful change actually look like?
Some researchers have proposed chronological feeds instead of algorithmic ones, because the algorithm is the thing doing the optimizing.
Others have proposed age verification.
Norway has moved to raise the social media age limit to fifteen.
Australia went to sixteen.
There's real movement here.
En España también hay un debate muy intenso sobre esto ahora mismo.
In Spain there's also a very intense debate about this right now.
El gobierno habló de prohibir los móviles en las escuelas, y muchas familias dicen que están de acuerdo.
The government talked about banning mobile phones in schools, and many families say they agree.
Pero es difícil porque los jóvenes ya tienen los teléfonos en casa.
But it's difficult because young people already have phones at home.
I covered a story years ago in Jakarta about how mobile phones spread into rural areas faster than sanitation systems.
The technology moves faster than any society's ability to adapt.
And social media is the same story on a more psychologically intimate level.
The product was in children's bedrooms before anyone thought about what that meant.
A ver, yo tengo dos hijos adolescentes.
Look, I have two teenage children.
Y la verdad es que como padre, yo no sabía exactamente qué hacían en sus teléfonos.
And honestly, as a father, I didn't know exactly what they were doing on their phones.
Los algoritmos son invisibles.
The algorithms are invisible.
Tú ves que tu hijo mira el teléfono, pero no sabes qué le está mostrando el teléfono a él.
You see your child looking at the phone, but you don't know what the phone is showing them.
And that asymmetry of information is, I think, what makes this different from television or video games or the things previous generations of parents worried about.
With those, a parent could watch what their child was watching.
With an algorithmic feed, the content is personalized and invisible, and it adapts in real time.
Exactamente.
Exactly.
Y por eso el argumento legal es tan interesante.
And that's why the legal argument is so interesting.
No se trata solo de que el producto es peligroso.
It's not just that the product is dangerous.
Se trata de que la empresa ocultó información a los padres, a los usuarios, y a los reguladores sobre cómo funcionaba realmente el sistema.
It's that the company hid information from parents, users, and regulators about how the system actually worked.
There's going to be an appeal, of course.
Meta and Google have armies of lawyers and this is one jury in Los Angeles.
But the significance of this verdict is what it signals to the other plaintiffs in those thousands of pending cases.
It signals that a jury can be convinced.
That changes the calculus for everyone.
Sí, y también cambia el cálculo para las empresas.
Yes, and it also changes the calculus for the companies.
Si hay miles de casos similares, y los jurados empezaron a decidir en contra de Meta y Google, entonces para las empresas es mejor negociar un acuerdo general que pagar caso por caso.
If there are thousands of similar cases, and juries have started deciding against Meta and Google, then for the companies it's better to negotiate a general settlement than to pay case by case.
Which is exactly how tobacco ended.
Not with a single dramatic verdict but with the weight of accumulated litigation forcing a negotiated settlement.
The tobacco companies didn't admit liability, they paid, they accepted some restrictions, and they moved on.
I expect we'll see something similar here within the next five years.
Bueno, pero yo espero que el resultado final incluya cambios reales en el diseño de las aplicaciones.
Well, but I hope the final outcome includes real changes to how the apps are designed.
No solo dinero.
Not just money.
Porque si las empresas solo pagan y continúan con los mismos algoritmos, los jóvenes del futuro van a sufrir los mismos problemas.
Because if the companies just pay up and carry on with the same algorithms, young people in the future will suffer the same problems.
That's the right note to end on, I think.
This verdict matters not as the conclusion of something but as the beginning of a reckoning.
The question for the next decade is whether that reckoning produces genuine structural change or just very expensive legal settlements that leave the underlying system intact.
La verdad es que soy un poco pesimista.
Honestly, I'm a little pessimistic.
Las empresas tecnológicas son muy poderosas y muy ricas.
The tech companies are very powerful and very rich.
Pero este caso demostró que es posible ganar.
But this case showed that it's possible to win.
Y eso es importante para las familias que esperan justicia.
And that matters for the families waiting for justice.
Before we wrap up, let's do our vocabulary for today.
Octavio, give us the key words from this episode.
Muy bien.
Very good.
Las palabras importantes de hoy son: la adicción, el algoritmo, el jurado, la salud mental, el daño, y demandar.
The important words today are: addiction, algorithm, jury, mental health, harm, and to sue.
Son palabras muy útiles para hablar de tecnología y de la ley.
They're very useful words for talking about technology and law.
La adicción, el algoritmo, el daño.
I'm going to practice those and probably mispronounce all of them.
Octavio, thanks, as always.
And listeners, we'll see you next time on Twilingua.
Hasta la próxima.
Until next time.
Y Fletcher, no pongas hielo en tu vino mientras escuchas el podcast.
And Fletcher, don't put ice in your wine while you listen to the podcast.