
Algorithms: Programmed racism

Search engines deliver discriminatory images, and job-application software disadvantages Black people: where algorithms get their prejudices, and why those prejudices are so dangerous.

Racism shapes our thinking and our lives together. With a focus on "everyday racism", this series asks why that is, what it means for society, and how it could be changed. In this installment, Eva Wolfangel explains why even technology is not free of racist patterns.

We humans do not know everything, but at least we know this: we make decisions by mixing facts, half-knowledge, preferences, prejudices, and expectations of personal benefit, hoping to end up being right. There are no truly objective decisions. To counteract this problem, we increasingly rely on machines. Computers and algorithms are supposed to free us from prejudice. Developments in the field of artificial intelligence are meant not only to take decisions off our hands, but to make better, more objective ones.

So much for the idea. But this is what actually happened: Google software labeled the photo of an African-American woman "gorilla". When people googled "professional hairstyle", the first image-search results showed only blonde braided hairstyles. In the final selection of a beauty contest judged by a machine, only white candidates remained. Software used by the New Zealand passport authority rejected the passport photos of Asian applicants because the algorithm assumed the eyes of the people pictured were closed. And a system meant to help judges decide which prisoners should be released early treated dark skin color as a decisive criterion for a high likelihood of reoffending. That is just one example of what can go wrong with "predictive policing", the technical term for police work based on crime forecasts.

    "We thought dassAlgorithmen could make better decisions, such as VorhersagendenPolizeiarbeit or staff selection," says US amerikanischeTechniksoziologin Zeynep Tufekci time online, "Aberdieser Dream is dreamy".

The fact that such software does not merely err, but discriminates against population groups with surprising consistency, that is, carries a racist bias, is attributable to two causes. The first is quickly explained: modern algorithms learn from selected training data, for example records of how people have made these decisions so far, and they find patterns in it. If the developers chose the training data poorly, the algorithms reproduce the problem. The computer programs themselves are not racist, but the data can lead them to discriminatory results. It is actually simple, says Margaret Mitchell of Google Research in Seattle: "If you put garbage in, garbage will come out." Or: "We put prejudices in, we get prejudices out."
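To make this mechanism concrete: below is a minimal sketch in Python, with entirely synthetic data and made-up variable names (none of this reflects the systems from the article), of how a model trained on biased historical decisions reproduces the bias.

```python
# Minimal sketch: a classifier trained on biased historical labels
# reproduces the bias. All data and names here are synthetic assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000
group = rng.integers(0, 2, size=n)   # 0 = majority, 1 = minority (synthetic)
skill = rng.normal(size=n)           # the feature that *should* decide

# Historical bias: past decisions favored group 0 regardless of skill.
past_decision = (skill + 0.8 * (group == 0) + rng.normal(scale=0.5, size=n)) > 0

# Train on the biased historical labels; group membership enters as a feature
# (in practice it often leaks in via proxies such as zip code or school).
X = np.column_stack([skill, group])
model = LogisticRegression().fit(X, past_decision)

pred = model.predict(X)
for g in (0, 1):
    print(f"group {g}: positive-decision rate {pred[group == g].mean():.2f}")
# The model faithfully reproduces the historical gap: garbage in, garbage out.
```

Running the sketch shows a markedly lower positive-decision rate for the synthetic minority group, even though "skill" was drawn from the same distribution for everyone.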

"If you put garbage in, garbage will come out." Margaret Mitchell, senior scientist at Google Research

These biases, however, are mostly not obvious, which is why they went unnoticed at first. "Thanks to the deep-learning revolution, we now have powerful technologies," says Mitchell. "The tendencies in the data sometimes only become visible through the output of the systems." In other words: what is going wrong often only shows at the end, and only if the developers are aware that they have to question the results.

One could tackle the problem by making the training data more inclusive. Facial-recognition software, for example, must not be fed only with an image database of Central Europeans; it must also correctly recognize people with darker skin and the faces of Asians. Then Google's software might not have confused a Black woman with a gorilla. That, however, requires a correspondingly comprehensive data set. And even then, there is no systematic way to detect discriminatory tendencies in training data, says Mitchell: "That technology has yet to be developed. We have to deal with this now, because these systems are the foundation for the technologies of the future."
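Since such tendencies often only surface in a system's output, a first practical step is to audit predictions group by group. Here is a minimal sketch, again with synthetic data and hypothetical names, that compares false-positive rates between two groups, the kind of gap reported for the recidivism software mentioned above.

```python
# Minimal sketch of a per-group output audit: compare false-positive rates
# instead of trusting the training data. All data here is synthetic.
import numpy as np

def false_positive_rate(y_true, y_pred, mask):
    """Share of actual negatives in the group that were predicted positive."""
    neg = mask & (y_true == 0)
    return (y_pred[neg] == 1).mean() if neg.any() else float("nan")

rng = np.random.default_rng(1)
n = 5_000
group = rng.integers(0, 2, size=n)    # synthetic group labels
y_true = rng.integers(0, 2, size=n)   # synthetic ground truth

# A deliberately skewed "model": far more false alarms for group 1.
flip_prob = np.where(group == 1, 0.30, 0.05)
y_pred = np.where((y_true == 0) & (rng.random(n) < flip_prob), 1, y_true)

for g in (0, 1):
    fpr = false_positive_rate(y_true, y_pred, group == g)
    print(f"group {g}: false-positive rate {fpr:.2f}")
# A persistent gap between groups is exactly the pattern such audits look for.
```

The point of the design is that the audit needs nothing but the system's inputs, outputs, and ground truth; it works even when the training data itself cannot be inspected.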

Published: Wednesday, 20 June 2018, 12:02
