Search engines deliver discriminatory images, application software disadvantages Black people: where algorithms harbor prejudices, and why they are so dangerous.

Algorithms: Programmed racism

Content
  • Page 1 — Programmed racism
  • Page 2 — Algorithms should explain themselves
  • Page 3 — The new racism is no less perfidious
  • Read on one page

    Racism pervades our thinking and our life together. With a focus on "everyday racism", we want to find out why that is, what it means for society, and how it could be changed. In this installment, Eva Wolfangel explains why even technology is not free of racist patterns.

    We humans do not know everything, but at least we know this: we make decisions by mixing facts, half-knowledge, preferences, prejudices and the expectation of personal benefit, hoping to end up being right. There are no truly objective decisions. To counteract this problem, we increasingly rely on machines. Computers and algorithms are supposed to liberate us from prejudice. Developments in the field of artificial intelligence are meant not only to take decisions off our hands, but to make better, more objective ones.

    So much for the idea. But recently, the following happened: Google software labeled the photo of an African American woman "gorilla". When people googled "professional hairstyle", only blonde braided hairstyles appeared in the first results of the image search. Only white candidates made it into the final round of a beauty contest judged by a machine. Software used by the New Zealand passport authority rejected the passport photos of Asian applicants because the algorithm assumed that the eyes of those pictured were closed. And a system meant to help judges decide which prisoners should be released early concluded that dark skin color was a decisive criterion for a high likelihood of reoffending. And that is just one example of what can go wrong with "predictive policing", the technical term for police work based on crime forecasts.

    "We thought algorithms could make better decisions, for example in predictive policing or personnel selection," the American sociologist of technology Zeynep Tufekci told ZEIT ONLINE. "But that dream is over."

    The fact that software not only errs, but surprisingly consistently discriminates against certain population groups, i.e. has a racist bias, can be traced to two causes. The first is quickly explained: modern algorithms learn from selected training data, for example information about how people have made such decisions so far, and they detect patterns in it. If that training data was poorly selected by the developers, the algorithms reproduce the problem. The computer programs themselves are not racist, but based on the data they can produce discriminatory results. It is really quite simple, says Margaret Mitchell of Google Research in Seattle: "If you put garbage in, garbage will come out." Or: "We put prejudices in, we get prejudices out."

    If you put garbage in, garbage will come out. Margaret Mitchell, senior scientist at Google Research
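    Mitchell's "garbage in, garbage out" point can be made concrete with a toy model. The following sketch uses entirely invented data and a deliberately naive "model": it estimates hiring rates per group from historically skewed records, and the skew in the records reappears directly in its recommendations. It is an illustration of the mechanism the article describes, not of any real system.

```python
# Toy illustration of "garbage in, garbage out": a naive model trained on
# historically biased hiring records reproduces that bias.
# All data below is invented for illustration.
from collections import defaultdict

# Biased historical records: (group, hired). Group "A" was hired far more
# often than group "B" for otherwise comparable candidates.
training_data = ([("A", 1)] * 80 + [("A", 0)] * 20 +
                 [("B", 1)] * 20 + [("B", 0)] * 80)

# "Training": estimate the historical hire rate for each group.
stats = defaultdict(lambda: [0, 0])   # group -> [times hired, total seen]
for group, hired in training_data:
    stats[group][0] += hired
    stats[group][1] += 1

def predict_hire(group):
    """Recommend hiring if the group's historical hire rate is at least 50%."""
    hired, total = stats[group]
    return hired / total >= 0.5

print(predict_hire("A"))  # True:  the bias in the data ...
print(predict_hire("B"))  # False: ... becomes bias in the model
```

    The model never sees skin color as an explicit rule; it simply learns the pattern present in the data, which is exactly why such distortions are hard to spot before the system produces output.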

    These distortions, however, are mostly not obvious, which is why they go unnoticed at first. "Today, thanks to the deep-learning revolution, we have powerful technologies," says Mitchell. "The tendencies in the data sometimes become visible only through the output of the system." In other words: what is going wrong often shows only at the end, and only if developers are aware that they have to question the results.

    The problem could be tackled by making the training data more inclusive. Facial-recognition software, for example, must not be fed only with an image database of Central Europeans; it must also correctly recognize people with darker skin and the faces of Asians. Then Google's software might not have confused a Black person with a gorilla. This, however, requires a correspondingly comprehensive data set. And even then, there is no way to systematically detect discriminatory tendencies in training data, says Mitchell: "This technology has yet to be developed. We have to deal with this now, because these systems are the foundation for the technologies of the future."

    Published: Wednesday, 20 June 2018, 12:02
