VRB News
Virtual Reality Brisbane

Why it’s hard to correct algorithm bias

by admin
November 9, 2021
in IT news

Artificial intelligence models inadvertently acquire gender, race or class biases.

Algorithms are racist. They are sexist. They discriminate by purchasing power, by nationality, on a thousand grounds. This is something we have come to accept at the same time that artificial intelligence has become a fixture on everyone’s lips.

Once the complaint is made, artificial intelligence experts try to fix the biases their algorithms are born with, or acquire. But it is not easy at all. The problem is that biases go unnoticed, and sometimes they are so inherent in the data the models are trained on that it becomes difficult to solve the issue.

The data with which the algorithms are trained is usually blamed for these deviations. But bias is also found in the design of the system. When managers decide what they want to achieve with a machine learning or deep learning model, they are determining how the software will behave. The goal itself can be biased instead of fair. A business might pursue economic benefit, which would first imply discriminating between customers who spend more and those who spend less. It would not be strange if this translated into a differentiation based on purchasing power, with which it would indirectly be discriminating on this ground.
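A minimal sketch of this point, with invented customer names and spend figures (none of them from the article): when the objective is revenue, "ranking customers" collapses into ranking by spend, a direct proxy for purchasing power.

```python
# Hypothetical customers: (id, monthly_spend). The numbers are made up
# purely for illustration.
customers = [("a", 120.0), ("b", 45.0), ("c", 300.0), ("d", 15.0)]

# A "fair" goal might serve everyone alike. A revenue-maximizing goal
# instead prioritizes big spenders: the objective itself encodes the
# discrimination, before any data quality issue enters the picture.
by_revenue = sorted(customers, key=lambda c: c[1], reverse=True)
print([cid for cid, _ in by_revenue])  # ['c', 'a', 'b', 'd']
```

Nothing in this snippet looks biased in isolation; the discrimination is entirely a consequence of which quantity was chosen to optimize.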

Obviously, the data may also be to blame for algorithm bias. It may not be representative of reality as a whole, or it may simply reflect existing prejudices. The world is far from perfect, and little by little these imperfections are being acknowledged as the first step toward correcting them. But it is with this imperfect information that the algorithm is trained.

To illustrate with an example that has actually happened: managers of a human resources department commission a model to help recruit the most suitable candidates. However, the data fed into the algorithm includes past hiring decisions in which men were favored over women. Artificial intelligence can only inherit this bias, one the human resources managers themselves would likely now disagree with.
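The inheritance mechanism can be shown with a toy simulation, using invented hiring rates rather than any real recruiter's data: a naive model that scores candidates by their group's historical hire rate simply reproduces the gap baked into the records.

```python
import random

random.seed(0)

# Hypothetical historical hiring records: (gender, hired). Past
# decisions favored men (70% hired) over women (30%), independent of
# qualifications -- the rates are invented for illustration.
history = [("M", random.random() < 0.7) for _ in range(500)] + \
          [("F", random.random() < 0.3) for _ in range(500)]

def hire_rate(records, gender):
    group = [hired for g, hired in records if g == gender]
    return sum(group) / len(group)

# A naive "model" that scores candidates by the historical hire rate
# of their group inherits the bias in its training data wholesale.
model = {g: hire_rate(history, g) for g in ("M", "F")}
print(model["M"] > model["F"])  # the learned scores reproduce the gap
```

A real recruitment model is far more elaborate, but the principle is the same: it has no information other than the biased labels, so the bias is the signal it learns.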


When data contains prejudices, it is difficult to get rid of them. Some improvements are more obvious than others. For example, in the case of the recruitment algorithm, the terms alluding to gender in the justifications used to train the model can be removed. But it is very difficult to clean the information completely. Other words or forms of expression may also correlate with gender, contribute to the same bias, and go unnoticed.
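This proxy problem can be sketched with made-up numbers: even after the gender column is dropped, a feature that merely correlates with gender (standing in here for wording in an application) carries the bias through.

```python
import random

random.seed(1)

# Hypothetical rows: (gender, proxy, hired). The proxy stands in for a
# word or phrase that correlates with gender; the hiring labels carry
# the historical bias. All rates are invented for illustration.
rows = []
for _ in range(2000):
    gender = random.choice("MF")
    proxy = random.random() < (0.8 if gender == "M" else 0.2)
    hired = random.random() < (0.7 if gender == "M" else 0.3)
    rows.append((gender, proxy, hired))

# The "debiased" model never sees gender: it scores candidates only by
# the proxy feature.
def score(proxy_value):
    sel = [hired for g, p, hired in rows if p == proxy_value]
    return sum(sel) / len(sel)

# Yet because the proxy tracks gender, the scores still split along
# gender lines even though the sensitive attribute was removed.
print(score(True) > score(False))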

The working method

Experts in artificial intelligence invest a fair amount of effort in creating accurate models that answer a particular problem. From there, it is common practice, when an algorithm works well, to try applying it to another task. Although the new task may be similar, the social context sometimes changes. The design of the first model does not take into account the new environment in which it will work. Therefore, it may carry assumptions that, in the new context, become prejudices.

There is also a weakness in how a model is prepared. First it is trained, and this is done with a data set. Then it is tested, which is done with another data set. But in reality these come from the same pool of information, which is simply divided between the two stages. The artificial intelligence is never exposed to great diversity and is therefore more vulnerable to algorithm bias.
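The split can be illustrated with a toy data set (plain numbers drawn from one invented distribution, nothing from a real model): the train and test halves look statistically alike, so testing cannot surface what neither half contains.

```python
import random

random.seed(2)

# One pool of examples; here just numbers from a single distribution.
data = [random.gauss(0, 1) for _ in range(1000)]
random.shuffle(data)

# The usual 80/20 split: both halves come from the same pool, so the
# test set shares whatever blind spots the training set has.
train, test = data[:800], data[800:]

def mean(xs):
    return sum(xs) / len(xs)

# The two splits look statistically alike...
print(abs(mean(train) - mean(test)) < 0.3)

# ...while data from a genuinely different environment does not, and a
# same-pool test set would never have warned us about it.
shifted = [random.gauss(2, 1) for _ in range(200)]
print(abs(mean(train) - mean(shifted)) > 1.0)
```

This is why a model can score well on its held-out test set and still behave poorly on populations the original pool under-represents.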

Finally, we also enter the field of philosophy. What is neutrality? Getting completely unbiased results in a prediction is, in fact, a chimera. And that is precisely what artificial intelligence models do: predict the unknown.

Images: comfreak, insspirito

© 2023 - The project has been developed ServReality
