Your voice could be the biggest threat to your privacy... How can you prevent artificial intelligence from exploiting it?

 


A person's voice can reveal their education level, psychological state, profession, financial status, and much more than you might imagine. Now, scientists warn that voice-analysis technology could be used for price manipulation, unfair profiling, harassment, and tracking.

While humans are attuned to obvious cues such as fatigue, nervousness, or happiness, computers can pick up the same signals, but from far more information and far faster.
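To illustrate how software turns a voice into measurable signals, here is a minimal sketch (not from the study, and using a synthetic waveform rather than real speech) that computes two classic acoustic features: frame energy, a rough proxy for loudness, and an autocorrelation-based pitch estimate.

```python
import numpy as np

def frame_features(signal, sr, frame_ms=30):
    """Split a waveform into short frames and compute two simple vocal cues:
    energy (loudness) and an autocorrelation-based pitch estimate in Hz."""
    n = int(sr * frame_ms / 1000)
    feats = []
    for start in range(0, len(signal) - n, n):
        frame = signal[start:start + n]
        energy = float(np.mean(frame ** 2))
        # Autocorrelation peaks at the pitch period; search lags that
        # correspond to 60-400 Hz, a typical range for speech.
        ac = np.correlate(frame, frame, mode="full")[n - 1:]
        lo, hi = sr // 400, sr // 60
        lag = lo + int(np.argmax(ac[lo:hi]))
        feats.append((energy, sr / lag))
    return feats

sr = 16000
t = np.arange(sr) / sr
voice = np.sin(2 * np.pi * 150 * t)  # one second of a steady 150 Hz "voice"
feats = frame_features(voice, sr)
print(feats[0])  # (energy, estimated pitch close to 150 Hz)
```

Real systems extract dozens of such features per frame and feed them to trained models; this is only the first, mechanical step of that pipeline.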

A new study claims that tone patterns and word choice can reveal everything from your political leanings to underlying health or medical conditions, according to a report by Live Science, which was reviewed by Al Arabiya Business.

This study, published on November 19, 2025 in the journal "Proceedings of the IEEE," highlights serious concerns about the potential for privacy violations and unfair profiling associated with this technology.

Despite the opportunities offered by voice processing and recognition technologies, Tom Backstrom, associate professor in the Department of Speech and Language Technologies at Aalto University and lead author of the study, sees the potential for serious risks and harm.

For example, if a company can infer your economic situation or your needs from your voice, this opens the door to price manipulation, such as discriminatory insurance premiums.

When voices can reveal details such as emotional vulnerability, gender, or other personal information, cybercriminals or stalkers can identify and track victims across platforms, exposing them to blackmail or harassment. We unconsciously transmit all these details when we speak, and we respond to them subconsciously before anything else.

“Very little attention is paid to the physiology of listening,” Jenalyn Bonraj, founder of Delayer and a futurist specializing in regulation of the human nervous system in light of emerging technologies, told Live Science. “In times of crisis, people are not primarily processing language, but rather responding to tone, rhythm, intonation, and breathing, often before the mind has a chance to react.”

Pay attention to your tone of voice

Backstrom told Live Science that although the technology has not yet been used, its seeds have already been sown.

He added: "Automatic detection of anger and toxic behavior in online games and call centers is openly discussed. These are useful and ethically sound goals." But, he continued, "the growing adoption of voice interfaces that adapt to the customer, for example so that the automated response style mirrors the customer's own, tells me that more ethically questionable or even malicious goals can be pursued."

He noted that although he had not heard of any entity being caught using this technology inappropriately, he did not know whether it was because no one had done so, or simply because they had not been looking for it.

We must also remember that our voices are everywhere. Between every voice message we leave and every customer-service call that is "recorded for training and quality assurance purposes," there is a digital record of our voices to rival the rest of our digital footprint of posts, purchases, and other online activity.

If or when a major insurance company realizes that it can increase its profits by selectively pricing coverage based on information about us gathered from our voices using artificial intelligence, what will stop it?

Backstrom said that simply talking about this issue could open the door to major problems, as it exposes both the public and "adversaries" to the new technology.
He stressed the need to educate the public about the potential risks; otherwise, "large corporations and surveillance states will have already won." He concluded, "This may sound overly pessimistic, but I choose to be optimistic that I can do something about it."

Protect your voice

Fortunately, there are promising engineering approaches that can help people protect themselves. The first step is to accurately measure what our voices reveal. As Backstrom noted, it's difficult to build protective devices when you don't know what you're trying to protect.

This idea led to the creation of the "Security And Privacy In Speech Communication Interest Group," a group that provides a multidisciplinary forum for research and a framework for measuring the amount of information contained in speech.
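One crude way to make "how much information speech carries" concrete, sketched here as an illustration rather than the group's actual methodology, is to compute the empirical mutual information between a speech-derived signal and a private attribute: the more bits they share, the more the signal leaks.

```python
from collections import Counter
from math import log2

def mutual_information(pairs):
    """Empirical mutual information (in bits) between two discrete
    variables given as (x, y) pairs, e.g. a voice-derived label x
    and a private attribute y. Higher means more leakage."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum(c / n * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

# Toy data: a voice feature that perfectly tracks a private attribute
# leaks a full bit; an unrelated feature leaks nothing.
leaky = [("low", "A"), ("low", "A"), ("high", "B"), ("high", "B")]
safe  = [("low", "A"), ("high", "A"), ("low", "B"), ("high", "B")]
print(mutual_information(leaky))  # 1.0
print(mutual_information(safe))   # 0.0
```

Research measurements operate on far richer representations of speech, but the principle is the same: quantify the leak before you try to plug it.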

From there, it becomes possible to transmit only the information the intended transaction actually requires. Imagine a system that converts speech to text and extracts just the essential details: either the service provider enters that information into its system without recording the actual call, or your phone converts your words to text before transmission.

As Backstrom stated in an interview with Live Science, "The information sent to the service will be the minimum necessary to accomplish the required task."
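A minimal sketch of that data-minimization idea, with illustrative field names and patterns of my own choosing: the transcript is assumed to come from on-device speech recognition, and only the extracted fields, never the audio, would be sent to the service.

```python
import re

def minimal_payload(transcript: str) -> dict:
    """Keep only the fields the service actually needs from a locally
    produced transcript; the raw audio (tone, accent, stress, and other
    sensitive cues) never leaves the device. Patterns are illustrative."""
    payload = {}
    m = re.search(r"account (?:number )?(\d+)", transcript, re.IGNORECASE)
    if m:
        payload["account"] = m.group(1)
    m = re.search(r"\bon (monday|tuesday|wednesday|thursday|friday)\b",
                  transcript, re.IGNORECASE)
    if m:
        payload["day"] = m.group(1).lower()
    return payload

# In practice the transcript would come from on-device speech recognition.
transcript = "Hi, this is about account 48213, I'd like to reschedule on Friday."
print(minimal_payload(transcript))  # {'account': '48213', 'day': 'friday'}
```

The design choice is the point: what crosses the network is a handful of structured fields, not a recording from which pitch, emotion, or identity could later be mined.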

