Consumer rights aren’t guaranteed in a digital world, warns Consumer Reports CEO

In today’s technology-driven economy, protecting consumer rights is becoming increasingly difficult.

That’s what Marta Tellado, president and CEO of Consumer Reports, a nonprofit that conducts product testing and other consumer advocacy, argues in her new book, “Buyer Aware: Harnessing Consumer Power for a Safe, Fair, and Transparent Marketplace.” As the use of algorithms and artificial intelligence grows, she says, so does the risk of unfair practices that consumers never see.

“You have to make decisions every day based on algorithms and things you can’t see, feel or touch,” Tellado told Marketplace’s David Brancaccio. “And it’s incredibly difficult for everyday consumers.”

Below is an edited transcript of their conversation.

David Brancaccio: Someone might think your job is to review robot vacuum cleaners or to remind people once again that Toyotas are reliable. But do you see your professional work as, in some ways, a crusade for civil rights?

Marta Tellado: That’s right. I think a lot of people know us because of the ratings. But a lot of what we do is really about shaping the marketplace to make it fairer and safer. And what I really tried to do in “Buyer Aware” was tell a broader story about our democracy, which can only flourish if we have a marketplace that is fair and just for all consumers.

Brancaccio: All consumers. No one wants to be taken advantage of by the companies we interact with. But beyond that, and I think it’s very important in reading your book, you see an equity issue: exploitation harms some people more than others.

Tellado: That’s right. I think the seed was planted when I was a young immigrant child, coming to the United States after the revolution in Cuba and seeing my parents have to rebuild their economic lives. I am very grateful that I came to know democracy through my own experience. But you see with your own eyes that economic freedom is a civil right, and if there is inherent bias, you cannot have a fair marketplace. And when our economic power and agency are compromised, so is our power to function as free and equal members of a democracy. So the book tries to draw out some examples, some really frightening examples, of how strong the connection is between a free and fair democracy and economic opportunity in the marketplace.

The issue of digital consumer protection

Brancaccio: When you think about civil rights and justice as they relate to consumer rights, I mean, I think some people talk about redlining in credit, or broadband access, which is not the same on, say, Native American tribal lands. But it goes much deeper. I mean, do you worry much about emerging technology, artificial intelligence, and built-in biases?

Tellado: Yes, absolutely. We’re proud of all the work we’ve done for more than 86 years to codify the laws and rules of fairness and justice in the marketplace. But unfortunately, many of these rules do not apply to this new digital landscape, where we lack transparency. And you have to make decisions every day based on algorithms and things you can’t see, feel or touch. And it’s incredibly difficult for everyday consumers.

Brancaccio: I spoke with the head of the computer science department at an engineering school. And he pointed out to me that with machine learning, even computer scientists can’t reverse-engineer their systems to fully understand why a machine chooses to do one thing and not another. And you can see how that can lead to abuse and possibly discrimination.

Tellado: That’s right. When you think about machine learning: if you feed it bad data, you get biased results. And I’ve dedicated a section to really digging into what it means when an algorithm discriminates against you. Sometimes it’s a life-and-death situation. Think about what that looks like in a medical algorithm. Imagine something serious, like going to the doctor and finding out you’re in end-stage kidney disease, which means you need a transplant. And we know there aren’t enough organs to go around, so you need to get on the national waiting list. How does that happen? Well, as for all of us, you have to qualify. You have to have a score of 20 or below on a test, based on your medical data, that shows how quickly your kidneys are filtering blood. But here’s the catch: if you’re Black, your score gets adjusted. There is a race-adjusted coefficient based on flawed research done in the ’90s, bad data, suggesting that Black people have different kidney function. Let’s call this patient Eli. He takes the test, he can’t get a score of 20 or below, and he doesn’t make the cut. And in this life-and-death situation, it was an algorithm, not transparent to the patient, that made decisions about his access to medical care, in this case life-saving medical care.

Brancaccio: Now, I know you looked for it in your book, and I don’t think you found it: what federal rule governs fairness in artificial intelligence?

Tellado: Well, unfortunately, there is no federal rule. While we are proud of the work we do on consumer rights and protections, those rules, regulations and consumer protections do not carry over into the digital landscape. That rears its head in many ways, and artificial intelligence is no exception. And right now, we may have amazing leaders in our agencies, but we don’t have the tools, capabilities or guidelines to ensure fairness and transparency. And you can’t trust something that isn’t transparent to you. And, of course, you can’t hold it accountable.

Potential discrimination … hidden in algorithms

Brancaccio: And Consumer Reports gets involved in the public policy process. Can you make your organization heard on an issue like this?

Tellado: Yes. Many people come to us because they are making personal choices. And what we’re looking at is, “How do these choices scale up in the marketplace?” We have a Digital Lab that looks precisely at bias. We also looked at car insurance, and you’d think your car insurance is based on your driving record, on whether you have a ticket or a violation. But in reality, the algorithm also looks at non-driving factors about you: where you live, what your income is, what your level of education is. So what we found is that the price you pay for your car insurance has a lot to do with your ZIP code and whether that area is Black, white or Hispanic. Black and Hispanic neighborhoods are paying higher premiums than white neighborhoods.

Brancaccio: I mean, there are also language barriers, where companies can’t even communicate important information in languages that people know.

Tellado: And as you say, David, the stakes are really high in some of these examples. They are life and death. And I’ll give you an example of something that is not an algorithm but a product, a familiar product. If you’ve seen a little clip on someone’s finger at the hospital, that’s a pulse oximeter. But it doesn’t work well on darker skin. And what we do know is that people of color are three times more likely than white people to have low oxygen levels go undetected by it. The implications are staggering, because you come into the ER, they put this on you, and if your reading doesn’t qualify, you’re turned away from the ER. That’s really striking given what we’ve been through in the pandemic, when we know that many people of color were turned away and suffered disparate impacts. So again, fairness by design is something Consumer Reports looks at as well.

Brancaccio: You can imagine the engineers of these devices saying, “We never checked for that or thought of it.” And you’re insisting, “This is how you should be thinking, companies.”

Tellado: That’s right. Another area where we’ve seen a lot of bias, and we’ve known this for a while, is that women are much more likely to be hurt or injured in a car crash. That’s because our biology and bone structure are very different from men’s, but crash-test dummies are not anatomically correct. They are based on male bodies and how those bodies are affected by the forces of a crash. This, for us, is a battle we are still fighting. But that’s part of what we’re doing: we’re testing these products because we need to root out bias and create products that work for everyone.

Brancaccio: Marta, do you get pushback on some of this? Maybe readers just want you to tell them whether the blender is good or not?

Tellado: We get that all the time. I think people are like, “Wait a minute, you know, stay in your lane; just tell me which machine or which blender to get.” But the stakes are too high, David. What people don’t know is that, as you say, we are a nonprofit organization. We are not a private publishing company. We work with our members, just like public radio. So we are a public asset. We collect data because we want to make the world and the marketplace a better, fairer and more transparent place for consumers. So our work really helps strengthen and lay the foundation for many of the safety standards in the hard products you see and bring into your home. But in the digital landscape, the burden really falls on the consumer. So now we live in a world where our privacy and data security are a setting, not a right.
