Fletcher breaks down this story in English. Octavio reacts and expands in Spanish. Follow along with the live transcript and tap any word for its translation. Intermediate level — perfect for learners expanding their range.
So here's a number that stopped me cold this morning: twelve million people in the UK were overcharged on car loans, and they didn't know it.
The regulator just ruled they're owed compensation.
That's not a rounding error.
That's roughly one in five British adults.
Bueno, mira, es un número enorme.
Look, it's an enormous number.
La autoridad financiera del Reino Unido, la FCA, dijo que doce millones de personas pagaron demasiado por sus créditos de coches.
The UK's financial authority, the FCA, said twelve million people overpaid on their car loans.
Cada persona va a recibir unas 829 libras.
Each person will receive about £829.
Y los bancos tienen que pagar un total de 9.100 millones de libras.
And the banks have to pay a total of £9.1 billion.
Right, so 9.1 billion pounds.
Which is, for context, more than the annual GDP of some countries we could name.
And the thing that makes this a technology story, not just a finance story, is how it happened.
It wasn't a crooked loan officer in a back room.
It was built into the system.
Exacto.
Exactly.
El sistema se llamaba "discretionary commission arrangement", o DCA.
The system was called a 'discretionary commission arrangement', or DCA.
Los vendedores de coches podían cambiar el tipo de interés del crédito.
Car salespeople could change the interest rate on a loan.
Si ponían un interés más alto, ganaban más dinero.
If they set a higher rate, they earned more money.
Pero el cliente no sabía esto.
But the customer didn't know this.
I mean, let's just sit with that for a second.
You walk into a dealership, you're thinking about horsepower and whether the seats are heated, and the guy across the desk has a financial incentive to make your loan more expensive.
And there's nothing on the wall that tells you this.
No, no había transparencia.
No, there was no transparency.
Y esto es lo importante: el sistema digital de los bancos permitía este ajuste de forma automática.
And this is what matters: the banks' digital systems allowed this adjustment automatically.
El vendedor usaba un programa, cambiaba el número, y el banco procesaba el crédito sin preguntar.
The salesperson used a program, changed the number, and the bank processed the loan without asking questions.
Era un negocio muy bien organizado.
It was a very well-organized business.
Here's what gets me about that.
We talk about fintech, financial technology, as this great democratizing force.
More access to credit, faster decisions, less paperwork.
And all of that is true.
But what this case shows is that the same systems that make credit faster can also make exploitation faster.
Sí, y esto no es nuevo en el Reino Unido.
Yes, and this isn't new in the UK.
Antes hubo otro escándalo muy grande, el PPI, el "payment protection insurance".
Before this there was another very big scandal, the PPI, 'payment protection insurance'.
Los bancos vendieron seguros que los clientes no necesitaban, o que no funcionaban.
Banks sold insurance that customers didn't need, or that didn't work.
Ese escándalo costó más de 38.000 millones de libras.
That scandal cost more than £38 billion.
The PPI scandal.
I remember covering the edges of that.
It went on for years.
People were still getting letters about PPI refunds a decade after the initial exposure.
So Britain has been here before.
The question is whether digital systems make this kind of thing easier to catch, or easier to hide.
Bueno, las dos cosas.
Well, both.
Con los sistemas digitales, hay más datos.
With digital systems, there's more data.
Los reguladores pueden analizar millones de contratos muy rápido.
Regulators can analyze millions of contracts very quickly.
Pero también es más fácil esconder los problemas dentro del código, dentro de los algoritmos.
But it's also easier to hide problems inside code, inside algorithms.
La gente normal no puede ver cómo funciona el sistema.
Ordinary people can't see how the system works.
That opacity point is crucial.
Look, in the old days, a bad loan had a paper trail.
An actual human signed something, in a room, with a pen.
You could subpoena the documents.
Now the decision is made in milliseconds by software, and the audit trail requires a team of forensic data scientists to untangle.
A ver, pero hay algo positivo también.
But there's something positive too.
La FCA descubrió este problema porque analizó los datos de millones de contratos.
The FCA discovered this problem because it analyzed data from millions of contracts.
Comparó los intereses que pagaron diferentes clientes.
It compared the interest rates different customers paid.
Y vio que algunos pagaron mucho más que otros, sin una razón clara.
And it saw that some paid much more than others, without a clear reason.
So the data that enabled the abuse also enabled the detection of the abuse.
There's a kind of dark irony in that.
The same database that held all these inflated rates became the evidence against the banks.
Exactamente.
Exactly.
Y aquí está la parte más interesante para mí.
And here's the most interesting part for me.
La FCA usó técnicas de análisis de datos muy modernas para comparar los contratos.
The FCA used very modern data analysis techniques to compare the contracts.
No es que un inspector haya leído doce millones de documentos.
It's not that an inspector read twelve million documents.
Los algoritmos detectaron los patrones del abuso.
The algorithms detected the patterns of abuse.
Algorithms catching the damage done by algorithms.
Right.
And this is where I think the story gets philosophically interesting, because we tend to have one of two reactions to algorithmic decision-making.
Either we think it's neutral and objective because it's math, or we think it's inherently biased because it was built by humans with biases.
The truth is somewhere messier.
La verdad es que los algoritmos reflejan las decisiones de las personas que los crearon.
The truth is that algorithms reflect the decisions of the people who created them.
En este caso, el sistema DCA fue diseñado para dar poder a los vendedores.
In this case, the DCA system was designed to give power to the salespeople.
Los programadores hicieron exactamente lo que les pidieron.
The programmers did exactly what they were asked to do.
El problema era el diseño del negocio, no el código en sí.
The problem was the business design, not the code itself.
That's a really important distinction.
The technology did what it was told.
The ethical failure was upstream of the technology.
Someone, at some point in some boardroom, decided that salespeople should have discretion over rates, and that customers didn't need to know about it.
The code just executed that decision at scale.
Mira, esto me recuerda a algo que pasó en España también.
Look, this reminds me of something that happened in Spain too.
No con créditos de coches exactamente, sino con hipotecas.
Not with car loans exactly, but with mortgages.
Los bancos españoles vendieron hipotecas con condiciones ocultas, las llamadas "cláusulas suelo".
Spanish banks sold mortgages with hidden conditions, the so-called 'floor clauses'.
Millones de familias pagaron más de lo que debían pagar.
Millions of families paid more than they should have.
The cláusulas suelo.
I actually reported on that during the financial crisis period.
The floor clause scandal, where the interest rate on your variable-rate mortgage could go down but only to a certain point, and you were never told that clearly.
The Spanish banks had to pay out billions too.
Sí, exacto.
Yes, exactly.
Y en España el Tribunal Supremo tardó muchos años en decidir quién tenía razón.
And in Spain the Supreme Court took many years to decide who was right.
Los bancos decían que la información estaba en el contrato.
The banks said the information was in the contract.
Los clientes decían que el contrato era demasiado complicado para entender.
The customers said the contract was too complicated to understand.
Los dos tenían razón, en parte.
Both were right, in part.
And that tension, between technically legal and genuinely fair, is at the heart of every fintech regulation debate happening right now.
You can be technically compliant and still be stealing from people.
The law hasn't caught up to the speed of digital financial innovation, and I'm not sure it ever will.
Es que los reguladores siempre van detrás de la industria.
The thing is, regulators are always behind the industry.
Cuando los reguladores entendieron el sistema DCA, ya había doce millones de víctimas.
By the time regulators understood the DCA system, there were already twelve million victims.
Esto es un problema fundamental.
This is a fundamental problem.
La tecnología avanza muy rápido, y las leyes son lentas.
Technology moves very fast, and laws are slow.
The regulatory lag problem.
I've watched this play out in every sector I've covered.
Social media, pharmaceuticals, aviation.
The industry moves, the damage accumulates, and then the regulators show up five years later with a fine.
And sometimes the fine is smaller than the profit they made during the abuse.
Bueno, en este caso el número es muy grande.
Well, in this case the number is very large.
9.100 millones de libras es mucho dinero.
£9.1 billion is a lot of money.
Los bancos más grandes del Reino Unido, como Lloyds y Santander UK, tienen que pagar una parte importante.
The biggest UK banks, like Lloyds and Santander UK, have to pay a significant part.
Lloyds, por sí solo, reservó casi 2.000 millones de libras para este escándalo.
Lloyds alone set aside almost £2 billion for this scandal.
Santander UK is in there, which I find interesting given we're a Spanish language show.
Santander is a Spanish bank, of course.
One of the great international banking success stories.
And they're on the hook for a chunk of this British car finance mess.
Sí, Santander opera en muchos países y tiene que seguir las reglas de cada mercado.
Yes, Santander operates in many countries and has to follow the rules of each market.
Esto es otra consecuencia de la tecnología financiera global: un banco español puede tener problemas legales en el Reino Unido por cómo funcionó su plataforma digital allí.
This is another consequence of global financial technology: a Spanish bank can have legal problems in the UK because of how its digital platform worked there.
So let's go deeper on the technology itself for a minute.
Because what we're describing, this DCA system, is actually a precursor to something much more sophisticated that's happening now: algorithmic credit scoring.
Where the interest rate you're offered isn't set by a salesperson anymore.
It's set by a model.
A ver, los modelos algorítmicos usan muchos datos sobre el cliente.
The algorithmic models use a lot of data about the customer.
El historial de crédito, el trabajo, a veces incluso el código postal donde vive.
Credit history, employment, sometimes even the postal code where they live.
Y esto puede ser un problema porque si vives en un barrio pobre, el algoritmo puede darte un interés más alto, aunque tú seas una persona responsable.
And this can be a problem because if you live in a poor neighborhood, the algorithm might give you a higher interest rate, even if you are a responsible person.
That's the discrimination-by-postcode problem, and it's been documented in the United States extensively.
Redlining, they used to call it.
Banks would literally draw red lines on maps around Black neighborhoods and refuse to lend there.
We outlawed that decades ago.
But algorithmic systems can produce the same outcome without anyone making an explicitly racist decision.
Esto es muy importante.
This is very important.
El algoritmo no tiene intención de discriminar.
The algorithm has no intention of discriminating.
Pero aprende de datos históricos.
But it learns from historical data.
Y si los datos históricos reflejan discriminación del pasado, el algoritmo repite esa discriminación de forma automática.
And if the historical data reflects past discrimination, the algorithm repeats that discrimination automatically.
Es un ciclo muy difícil de romper.
It's a very difficult cycle to break.
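The cycle Octavio describes can be shown with a deliberately crude sketch — the data and the "model" here are hypothetical, not any real lender's system. A pricing model that simply learns the average rate historically charged in each postcode will reproduce whatever overcharging that history contains:

```python
from collections import defaultdict

def learn_rate_by_postcode(history):
    """'Train' by averaging the APR historically charged per postcode.
    If the history encodes past overcharging, so does the model."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for postcode, apr in history:
        sums[postcode] += apr
        counts[postcode] += 1
    return {pc: sums[pc] / counts[pc] for pc in sums}

# Hypothetical history: borrowers in area "B" were overcharged in the past.
history = [("A", 6.0), ("A", 6.2), ("B", 9.8), ("B", 10.2)]
model = learn_rate_by_postcode(history)
# model["B"] > model["A"]: yesterday's bias becomes tomorrow's price,
# and no line of this code "intends" to discriminate.
```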
No, you're absolutely right about that.
And what strikes me is that the UK car finance case is almost the innocent version of this problem.
It's human greed encoded into software.
The algorithmic discrimination problem is subtler.
Nobody programmed racism in.
The racism emerged from the data.
La verdad es que los dos problemas son muy serios.
The truth is that both problems are very serious.
Pero el segundo es más difícil de ver y de corregir.
But the second one is harder to see and to correct.
Con el sistema DCA, la FCA pudo identificar el problema y calcular la compensación.
With the DCA system, the FCA was able to identify the problem and calculate the compensation.
Con los algoritmos de discriminación, es mucho más complicado porque el daño es invisible.
With discrimination algorithms, it's much more complicated because the damage is invisible.
So where does this go from here?
Because 9.1 billion pounds in compensation is significant.
But I'm more interested in whether this changes how the industry behaves.
Does the size of the penalty actually deter the next version of this?
Mira, en España después del escándalo de las hipotecas, los bancos cambiaron muchas cosas.
Look, in Spain after the mortgage scandal, the banks changed a lot of things.
Ahora los contratos son más claros, hay más información para el cliente.
Now contracts are clearer, there's more information for the customer.
Pero también hay nuevos productos financieros que son complicados otra vez.
But there are also new financial products that are complicated again.
Los bancos son muy creativos para encontrar nuevas formas de ganar dinero.
Banks are very creative at finding new ways to make money.
There's a term for this in regulatory economics: regulatory arbitrage.
You close one loophole, the industry finds another.
And in a digital environment, they can find the next loophole faster than ever, because software is infinitely malleable.
A paper contract takes weeks to redesign.
An algorithm can be updated overnight.
Exacto.
Exactly.
Y por eso algunos expertos hablan de "open banking" como una solución.
And that's why some experts talk about 'open banking' as a solution.
La idea es que el cliente tiene control de sus datos financieros y puede compartirlos con diferentes empresas para comparar precios.
The idea is that the customer has control of their financial data and can share it with different companies to compare prices.
Si hay más transparencia, hay más competencia, y es más difícil cobrar demasiado.
If there's more transparency, there's more competition, and it's harder to overcharge.
Open banking.
The UK is actually one of the leaders in this, which makes the car finance scandal all the more ironic.
You have a country that pioneered giving consumers control of their financial data, and simultaneously allowed car dealers to secretly inflate loan rates for a decade.
Both things are true.
Sí, el Reino Unido es interesante.
Yes, the UK is interesting.
Tiene regulación financiera muy sofisticada, pero también tiene estos escándalos grandes.
It has very sophisticated financial regulation, but it also has these big scandals.
Creo que es porque el sector financiero allí es muy poderoso.
I think it's because the financial sector there is very powerful.
La City de Londres es el centro financiero más importante de Europa, y tiene mucha influencia política.
The City of London is the most important financial center in Europe, and it has a lot of political influence.
The capture problem.
When the regulated industry is more powerful than the regulator, the regulation tends to be shaped in favor of the industry.
And you can have all the sophisticated data analysis tools in the world, but if the FCA is chronically underfunded or politically constrained, they're still going to be five years behind.
Bueno, aquí hay algo que creo que es muy importante para los oyentes que estudian el español.
Well, here's something I think is very important for listeners learning Spanish.
En España decimos "el que hace la ley, hace la trampa".
In Spain we say 'el que hace la ley, hace la trampa'.
Significa que la persona que crea las reglas también sabe cómo evitarlas.
It means that the person who creates the rules also knows how to avoid them.
Es un dicho que explica mucho de la historia financiera.
It's a saying that explains a lot of financial history.
El que hace la ley, hace la trampa.
The person who makes the law also makes the loophole.
That's...
that's pretty cynical but I cannot argue with it.
Every financial regulation I've ever reported on has had an industry-shaped exemption baked into it somewhere.
A ver, no quiero ser demasiado negativo.
Look, I don't want to be too negative.
Hay soluciones tecnológicas que pueden ayudar.
There are technological solutions that can help.
Por ejemplo, algunos países experimentan con algoritmos de regulación que analizan las transacciones financieras en tiempo real.
For example, some countries are experimenting with regulatory algorithms that analyze financial transactions in real time.
Si el sistema detecta un patrón anormal, alerta al regulador inmediatamente.
If the system detects an abnormal pattern, it alerts the regulator immediately.
No después de diez años.
Not after ten years.
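In very simplified form, the real-time monitoring Octavio mentions could look like a running check applied to each transaction as it arrives. The feed and threshold below are hypothetical; the point is only that the alert fires at the moment of the anomaly, not in a report years later:

```python
def monitor(transactions, threshold=2.0):
    """Minimal streaming check: maintain a running mean and variance
    (Welford's algorithm) and yield an alert the instant a rate looks
    anomalous relative to everything seen so far."""
    n, mean, m2 = 0, 0.0, 0.0
    for tx in transactions:
        if n >= 2:
            stdev = (m2 / n) ** 0.5
            if stdev and (tx["apr"] - mean) / stdev > threshold:
                yield tx  # alert the regulator immediately
        # Update running statistics with this transaction.
        n += 1
        delta = tx["apr"] - mean
        mean += delta / n
        m2 += delta * (tx["apr"] - mean)

# Four ordinary rates, then one inflated one.
feed = [{"id": i, "apr": r} for i, r in enumerate([6.0, 6.2, 5.9, 6.1, 12.0])]
alerts = list(monitor(feed))
```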
Real-time regulatory surveillance of financial markets.
Which sounds either like the future of consumer protection or an Orwellian nightmare, depending on who controls the algorithm.
I mean, that's the fundamental tension, isn't it?
More data, more oversight, also means more surveillance.
And who watches the watchers?
Es verdad.
That's true.
Pero para mí, la pregunta más importante es: ¿quién paga cuando el sistema falla?
But for me, the most important question is: who pays when the system fails?
En este caso, los bancos van a pagar.
In this case, the banks are going to pay.
Pero los doce millones de personas pagaron demasiado durante muchos años.
But the twelve million people overpaid for many years.
El dinero que recuperan no compensa el tiempo perdido, los intereses pagados, el estrés.
The money they get back doesn't compensate for the lost time, the interest paid, the stress.
The extraordinary thing is that 829 pounds sounds like real money to most people, but it's actually the tip of what was taken.
If you're paying an inflated rate on a five-year car loan, the actual cost over time is much larger.
The compensation is a political number, not a mathematical one.
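Fletcher's point is easy to check with the standard amortization formula. The figures below are hypothetical — the episode doesn't give an average loan size or the typical rate uplift — but they show how a few extra percentage points on a five-year loan can dwarf an £829 payout:

```python
def monthly_payment(principal, annual_rate, years):
    """Standard amortized monthly payment for a fixed-rate loan."""
    r = annual_rate / 12          # monthly rate
    n = years * 12                # number of payments
    return principal * r / (1 - (1 + r) ** -n)

# Hypothetical £15,000 loan over 5 years: a 6% fair rate vs. a 9% inflated one.
fair = monthly_payment(15_000, 0.06, 5)
inflated = monthly_payment(15_000, 0.09, 5)
extra_paid = (inflated - fair) * 60   # total overpayment across the term
```

In this sketch the borrower overpays roughly £1,300 across the term — well above the £829 average payout, which is exactly the "tip of what was taken" point.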
Totalmente de acuerdo.
Totally agree.
Y esto es lo que los ciudadanos tienen que entender sobre la tecnología financiera.
And this is what citizens have to understand about financial technology.
No es neutral.
It's not neutral.
Tiene consecuencias reales para personas reales.
It has real consequences for real people.
Cuando lees un contrato de crédito y hay cosas que no entiendes, eso no es un accidente.
When you read a credit contract and there are things you don't understand, that's not an accident.
Muchas veces es una decisión de diseño.
Many times it's a design decision.
Complexity as a feature, not a bug.
If the contract is impossible to understand, you sign it anyway, and the bank wins.
I've seen this in every country I've ever worked in.
It's not a British problem or a Spanish problem.
It's a power problem.
And technology, for all its promise, has mostly made the powerful more powerful.
La verdad es que soy un poco más optimista que tú.
The truth is I'm a bit more optimistic than you.
Creo que escándalos como este, cuando los reguladores actúan y las personas reciben compensación, son también una señal de que el sistema puede corregirse.
I think scandals like this, when regulators act and people receive compensation, are also a sign that the system can correct itself.
No perfectamente, no rápido, pero puede aprender de los errores.
Not perfectly, not quickly, but it can learn from mistakes.
I'll take that.
A cautious optimist and a skeptical journalist walk into a bar.
Maybe that's what this show is.
Look, the bottom line for me: this ruling matters beyond Britain.
It's a signal to every fintech company operating in every market that the data you generate about your customers can also be used against you.
Build the system badly, and the system will tell on you.
Eso es un buen resumen.
That's a good summary.
Los datos son como un espejo.
Data is like a mirror.
Muestran lo que realmente pasó, no lo que los bancos decían que pasó.
It shows what really happened, not what the banks said happened.
Y para los estudiantes de español que nos escuchan, espero que este episodio les haya ayudado a entender no solo vocabulario nuevo, sino también cómo funciona el dinero en el mundo digital.
And for the Spanish students listening to us, I hope this episode helped you understand not just new vocabulary, but also how money works in the digital world.
Well said.
And if you walked away from this episode thinking a little harder before you sign the next financial contract on your phone at three in the morning, then we've done our job.
For Twilingua, I'm Fletcher Haines.
Y yo soy Octavio Solana.
And I'm Octavio Solana.
Hasta la próxima.
Until next time.