FTC Chair warns airlines could one day ‘charge you more’ if they know you’re attending a funeral — here’s how

Updated: October 20, 2024, 2:57 p.m.

If you’re still worried about AI taking your job, the Federal Trade Commission (FTC) Chair has news for you: it could also be used as a tool to take your money.

If this sounds like some rejected subplot from a dystopian sci-fi novel, think again. FTC Chair Lina Khan has warned the public about the perils of customer data being leveraged for possible “surveillance pricing.”

During the 2024 Fast Company Innovation Festival in September, Khan used the example of a passenger being charged more money for an airline ticket “because the company knows that they just had a death in the family and need to fly across the country.”

As of now, no airline has implemented this practice, nor has any been investigated for it. Still, could Khan’s prediction become a reality?

From cunning scammers to commerce vendors

Over the years, airlines have already introduced additional fees for checked bags, seat selection, ticket changes, and even carry-on luggage — items once taken for granted as free.

The situation has become so dire that it stoked the ire of the U.S. Senate, which in March stepped up its investigation into the billions of dollars airlines rake in from excess fees.

“Given just how much intimate and personal information that digital companies are collecting on us, there’s increasingly the possibility of each of us being charged a different price based on what firms know about us,” Khan elaborated.

This isn’t just about air transport, either. Targeted data crunched by artificial intelligence (AI) is already here, arriving — as these things often do — via scammers.

Whether through events such as the 2017 Equifax breach that exposed the personal information of 147 million people, or companies such as Meta reneging on user privacy commitments, it’s increasingly possible for your valuable personal data to somehow get into the digital wild.

Sometimes, the danger can come from an innocuous source: purchase data that a company acquires through honest means (think your favorite online merchants).

If that data is merged with broader information and analyzed with cutting-edge analytics, a merchant could discover you’re in a vulnerable position in the blink of an AI, if you will. That could open the door to price gouging.

How to protect yourself

In February, the popular fast-food chain Wendy’s endured online scorn and ridicule shortly after CEO Kirk Tanner announced a $20 million investment in digital menu boards to test dynamic pricing.

The AI-driven technology could, in theory, analyze consumer habits based on rushes and lulls, and charge accordingly. Customers slammed it as a backdoor way to rip them off. (Wendy’s has since said the digital menu boards would only provide discounts or recommend products.)

You can also tell politicians and media outlets when you spot price discrimination — and let companies know you’re onto them through social media.

Meanwhile, AI scams have become part of the FTC’s “bread and butter fraud work,” Khan noted, so reach out to the agency when you see or experience something amiss.

“Some of these AI tools are turbocharging that fraud because they allow some of these scams to be disseminated much more quickly, much more cheaply, and on a much broader scale,” she said.

Before anyone comes to your rescue, know that criminals will continue to ply their cunning as AI grows more sophisticated.

One new scam involves voice cloning, in which malefactors use a voice imprint of a loved one (recorded in a coffee shop, for example) to plead for emergency cash, hoping your guard is down because it’s a voice you recognize.

The technology is so widespread that reputable outlets, such as Descript, boast that they can clone your voice in a minute using only seconds of an audio recording.

In 2023, McAfee surveyed 7,000 people and found that one in four had experienced a voice cloning scam or knew someone who had.

One in 10 respondents also revealed they’d received a phone call featuring an AI voice clone, and 77% of that cohort admitted they’d lost money as a result.

To protect yourself, don’t take calls from strange or unknown numbers. If someone asks for money, even if they claim to be someone you know, hang up and call that person back at their legitimate phone number.

With email phishing, scammers now use AI to make their copy more realistic, or to pose as a representative of an account service you use, such as Netflix.

This article provides information only and should not be construed as advice. It is provided without warranty of any kind.
