{"id":5970,"date":"2023-12-29T17:04:45","date_gmt":"2023-12-29T16:04:45","guid":{"rendered":"https:\/\/www.gorr.si\/?p=5970"},"modified":"2024-02-14T14:38:34","modified_gmt":"2024-02-14T13:38:34","slug":"did-we-make-ai-racist-addressing-gender-and-racial-bias-in-ai","status":"publish","type":"post","link":"https:\/\/gorr.si\/en\/2023\/12\/29\/did-we-make-ai-racist-addressing-gender-and-racial-bias-in-ai\/","title":{"rendered":"Did We Make AI Racist? Addressing Gender and Racial Bias in AI"},"content":{"rendered":"\n<p class=\"has-text-align-center\">While it might sound like a ridiculous premise, there is a startling amount of evidence demonstrating that (human) discrimination has (already) found its way into AI, an issue that <a href=\"https:\/\/www.gorr.si\/en\" target=\"_blank\" rel=\"noreferrer noopener\">translation services<\/a> are no stranger to. <\/p>\n\n\n\n<p class=\"has-text-align-center\">As the capabilities of AI continue to be probed, it is becoming more and more evident that <strong>even it is not immune to biases,<\/strong> particularly (and troublingly) those concerning <a href=\"https:\/\/www.gorr.si\/en\/translation-editing\/\" target=\"_blank\" rel=\"noreferrer noopener\">race and gender.<\/a> This persistent issue is naturally not confined to <a href=\"https:\/\/www.gorr.si\/translation-editing\/?lang=en\" target=\"_blank\" rel=\"noreferrer noopener\">professional translation<\/a>, but extends well into the fields of art, design and even tech, leaving us with a host of <strong>uncomfortable challenges<\/strong> to face. 
<\/p>\n\n\n\n<div style=\"height:25px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity is-style-dots\"\/>\n\n\n\n<div style=\"height:25px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<h2 class=\"wp-block-heading has-text-align-center\"><strong>The Birth of AI Bias<\/strong> <\/h2>\n\n\n\n<p class=\"has-text-align-center\">The root of bias in \u201ctranslation\u201d AI and AI in general lie in t<strong>he data on which these models are trained.<\/strong> Machine learning algorithms learn from vast datasets, and if these datasets contain biased or culturally insensitive content (which we can clearly see they do), the AI may inadvertently <a href=\"https:\/\/www.npr.org\/2023\/07\/19\/1188739764\/how-ai-could-perpetuate-racism-sexism-and-other-biases-in-society\" target=\"_blank\" rel=\"noreferrer noopener\">adopt and perpetuate these biases.<\/a> <\/p>\n\n\n\n<p class=\"has-text-align-center\">Making matters worse, biases can even emerge from the <strong>underrepresentation of certain languages, dialects, or cultures<\/strong> in the source training data, leading to skewed results and <a href=\"https:\/\/www.npr.org\/2022\/02\/13\/1080464162\/lack-of-diversity-in-ai-development-causes-serious-real-life-harm-for-people-of-\" target=\"_blank\" rel=\"noreferrer noopener\">real-life consequences.<\/a> <\/p>\n\n\n\n<div class=\"wp-block-buttons is-content-justification-center is-layout-flex wp-container-core-buttons-is-layout-16018d1d wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button\"><a class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/www.gorr.si\/en\" target=\"_blank\" rel=\"noreferrer noopener\">GORR LANGUAGE PROFESSIONALS<\/a><\/div>\n<\/div>\n\n\n\n<div style=\"height:50px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<h3 class=\"wp-block-heading has-text-align-center\"><strong>What Went Wrong?<\/strong> <\/h3>\n\n\n\n<p 
class=\"has-text-align-center\">It\u2019s not an easy question to answer (in every possible sense). As previously mentioned, the development of \u201ctranslation\u201d AI involves training models on large corpora of text from the internet, books, and other sources. Should these datasets unintentionally reflect <strong>existing societal biases,<\/strong> the AI then absorbs and regurgitates <em>our<\/em> stereotypes and discriminatory patterns in myriad forms. <\/p>\n\n\n\n<p class=\"has-text-align-center\">In layman\u2019s terms, AI learned its gender and racial bias from <em>us.<\/em>  <\/p>\n\n\n\n<p class=\"has-text-align-center\">The additional <strong>lack of diversity in the teams developing AI models<\/strong> also contributes to oversight regarding the potential discrimination or alienation of people of color. Examples include the use of <a href=\"https:\/\/www.cbc.ca\/news\/science\/artificial-intelligence-racism-bias-1.6027150\" target=\"_blank\" rel=\"noreferrer noopener\">inappropriate terminology<\/a>, image generators being <a href=\"https:\/\/www.nytimes.com\/2023\/07\/04\/arts\/design\/black-artists-bias-ai.html\" target=\"_blank\" rel=\"noreferrer noopener\">unable to realistically depict Black women<\/a> (smiling or crying), <a href=\"https:\/\/www.ted.com\/talks\/joy_buolamwini_how_i_m_fighting_bias_in_algorithms?language=en&amp;subtitle=en\" target=\"_blank\" rel=\"noreferrer noopener\">facial recognition detection catastrophes<\/a>, and <a href=\"https:\/\/academic.oup.com\/applij\/article\/44\/4\/613\/6901317\" target=\"_blank\" rel=\"noreferrer noopener\">failures in speech recognition technology<\/a> to recognize commands given by Black speakers or those who speak English as a second language. 
<\/p>\n\n\n\n<div class=\"wp-block-buttons is-content-justification-center is-layout-flex wp-container-core-buttons-is-layout-16018d1d wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button\"><a class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/www.gorr.si\/en\/2023\/06\/19\/seven-ways-to-use-chatgpt-in-the-translation-industry\/\" target=\"_blank\" rel=\"noreferrer noopener\">GORR AND ChatGPT<\/a><\/div>\n<\/div>\n\n\n\n<div style=\"height:50px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity is-style-dots\"\/>\n\n\n\n<h2 class=\"wp-block-heading has-text-align-center\"><strong>So, What Can We Do?<\/strong> <\/h2>\n\n\n\n<p class=\"has-text-align-center\">Seeing as there is undeniable room for improvement, here are a few things that can be done to diversify the abilities of AI: <\/p>\n\n\n\n<h3 class=\"wp-block-heading has-text-align-center\"><strong>Regular Collaboration Between Human Translators and AI<\/strong> <\/h3>\n\n\n\n<p class=\"has-text-align-center\">At the risk of sounding self-serving, human translators, with their nuanced understanding of <a href=\"https:\/\/www.gorr.si\/en\/localization\/\" target=\"_blank\" rel=\"noreferrer noopener\">cultural contexts<\/a>, idioms, and linguistic subtleties, can play a crucial role in avoiding or correcting <strong>linguistic issues<\/strong> before they even become issues. Unlike AI, humans possess the ability to comprehend contexts beyond the original \u201ctext\u201d navigating undertones and nuances that would be lost on machines. 
<\/p>\n\n\n\n<p class=\"has-text-align-center\">The reliance on human translators becomes even more critical in sensitive areas such as <a href=\"https:\/\/www.gorr.si\/en\/legal-translation\/\" target=\"_blank\" rel=\"noreferrer noopener\">legal,<\/a> <a href=\"https:\/\/www.gorr.si\/en\/life-sciences\/\" target=\"_blank\" rel=\"noreferrer noopener\">medical,<\/a> <a href=\"https:\/\/www.gorr.si\/en\/it-telecommunications\/\" target=\"_blank\" rel=\"noreferrer noopener\">IT,<\/a> or <a href=\"https:\/\/www.gorr.si\/en\/technical\/\" target=\"_blank\" rel=\"noreferrer noopener\">technical<\/a> contexts where <strong>precision and cultural sensitivity are paramount.<\/strong> A collaboration between the two would likely benefit both parties: translators could make use of AI to be more efficient while also helping to refine and improve machine translation algorithms for future use. <\/p>\n\n\n\n<div class=\"wp-block-buttons is-content-justification-center is-layout-flex wp-container-core-buttons-is-layout-16018d1d wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button\"><a class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/www.gorr.si\/en\/industries\/\" target=\"_blank\" rel=\"noreferrer noopener\">OUR SPECIALTIES<\/a><\/div>\n<\/div>\n\n\n\n<div style=\"height:50px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<h3 class=\"wp-block-heading has-text-align-center\"><strong>Diversify Training Data Input<\/strong> <\/h3>\n\n\n\n<p class=\"has-text-align-center\">At this point, this is probably self- explanatory. Ensuring diverse and representative datasets is an <strong>absolutely pivotal step<\/strong> in <a href=\"https:\/\/www.forbes.com\/sites\/jeffraikes\/2023\/04\/21\/ai-can-be-racist-lets-make-sure-it-works-for-everyone\/?sh=37b5a022e40d\" target=\"_blank\" rel=\"noreferrer noopener\">mitigating biases in AI<\/a>. It would be a massive undertaking, but well worth the effort. 
<\/p>\n\n\n\n<p class=\"has-text-align-center\">To succeed, developers would have to <strong>actively seek out and incorporate content<\/strong> from underrepresented groups, languages, and cultures to foster a more inclusive and accurate AI. This technology would have to not only be <a href=\"https:\/\/time.com\/5520558\/artificial-intelligence-racial-gender-bias\/\" target=\"_blank\" rel=\"noreferrer noopener\">produced in association with these underrepresented groups<\/a>, but also extensively tested with them in mind to guarantee coherent function. <\/p>\n\n\n\n<div class=\"wp-block-buttons is-content-justification-center is-layout-flex wp-container-core-buttons-is-layout-16018d1d wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button\"><a class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/www.gorr.si\/en\/languages\/\" target=\"_blank\" rel=\"noreferrer noopener\">GORR KNOWS LANGUAGE<\/a><\/div>\n<\/div>\n\n\n\n<div style=\"height:50px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<h3 class=\"wp-block-heading has-text-align-center\"><strong>The Development of Ethical AI<\/strong><\/h3>\n\n\n\n<p class=\"has-text-align-center\">As an accompaniment to diversifying input, t<strong>he implementation of ethical guidelines<\/strong> for AI development, including <strong>the promotion of transparency and accountability<\/strong>, would go a long way towards helping identify and rectify biases. <\/p>\n\n\n\n<p class=\"has-text-align-center\">Again, this would be an area where <a href=\"https:\/\/www.gorr.si\/en\/2023\/12\/08\/breaking-boundaries-the-crucial-role-of-translation-services-and-translators-2\/\" target=\"_blank\" rel=\"noreferrer noopener\">human intervention<\/a> would be indispensable to carrying out r<strong>egular audits and assessments<\/strong> of AI systems. 
Such evaluation would <a href=\"https:\/\/www.vox.com\/technology\/23738987\/racism-ai-automated-bias-discrimination-algorithm\" target=\"_blank\" rel=\"noreferrer noopener\">add a constant human touch<\/a> to ongoing improvements and corrections as the natural flow of time inevitably leads to change and further growth.<\/p>\n\n\n\n<div class=\"wp-block-buttons is-content-justification-center is-layout-flex wp-container-core-buttons-is-layout-16018d1d wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button\"><a class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/www.gorr.si\/en\/mt-mtpe\/\" target=\"_blank\" rel=\"noreferrer noopener\">GORR MT &amp; MTPE<\/a><\/div>\n<\/div>\n\n\n\n<div style=\"height:50px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<h3 class=\"wp-block-heading has-text-align-center\"><strong>Establishing a System of User Feedback<\/strong> <\/h3>\n\n\n\n<p class=\"has-text-align-center\">As most businesses know, a strong pillar of progress is the <strong>internalization of user feedback<\/strong> into your processes, approaches, and methodology. Establishing mechanisms for users to provide feedback on function, <a href=\"https:\/\/www.gorr.si\/en\/certified-translation\/\" target=\"_blank\" rel=\"noreferrer noopener\">translation,<\/a> and limitations would aid in <a href=\"https:\/\/www.gorr.si\/en\/2024\/01\/18\/what-is-certified-translation-and-when-do-i-need-it\/\" target=\"_blank\" rel=\"noreferrer noopener\">identifying and rectifying biases<\/a> without exerting excessive energy on the developers\u2019 end. Continuous improvement through user input is <strong>crucial to refining AI<\/strong> over time and creating systems that are as inclusive as they are extensive. 
<\/p>\n\n\n\n<div class=\"wp-block-buttons is-content-justification-center is-layout-flex wp-container-core-buttons-is-layout-16018d1d wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button\"><a class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/www.gorr.si\/en\/about-us\/\" target=\"_blank\" rel=\"noreferrer noopener\">GORR\u2019S HUMAN TOUCH<\/a><\/div>\n<\/div>\n\n\n\n<div style=\"height:50px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<p class=\"has-text-align-center\">Though opinions on the subject vary, the hard truth is that before any of these challenges can be met and overcome, we must first acknowledge and understand <a href=\"https:\/\/www.techtarget.com\/searchenterpriseai\/definition\/machine-learning-bias-algorithm-bias-or-AI-bias#:~:text=Machine%20learning%20bias%20generally%20stems,biases%20or%20real%2Dlife%20prejudices.\" target=\"_blank\" rel=\"noreferrer noopener\">the origins of AI discrimination.<\/a> Once this has been accomplished, the identification of fallacies and implementation of corrections will come much more easily. <\/p>\n\n\n\n<p class=\"has-text-align-center\">AI may be a powerful tool, one that is likely to play large part in the future of our world but <strong>one that cannot be left to its own devices.<\/strong> Certainly, in the case of <a href=\"https:\/\/www.gorr.si\/en\" target=\"_blank\" rel=\"noreferrer noopener\">translation services<\/a> constant human supervision and guidance is the only way to ensure accurate, <a href=\"https:\/\/www.gorr.si\/en\/localization\/\" target=\"_blank\" rel=\"noreferrer noopener\">culturally sensitive,<\/a> and contextually nuanced translations and a future where AI better reflects the rich diversity and complexity of the human experience. 
<\/p>\n\n\n\n<div class=\"wp-block-buttons is-content-justification-center is-layout-flex wp-container-core-buttons-is-layout-16018d1d wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button\"><a class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/www.gorr.si\/en\/blogs\/\" target=\"_blank\" rel=\"noreferrer noopener\">READ MORE<\/a><\/div>\n<\/div>\n\n\n\n<div style=\"height:50px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity is-style-dots\"\/>\n\n\n\n<h3 class=\"wp-block-heading has-text-align-center\">DIVE HEADFIRST INTO AI WITH GORR<\/h3>\n\n\n\n<p class=\"has-text-align-center\"><a href=\"https:\/\/www.gorr.si\/en\/2023\/12\/20\/humans-vs-ai-navigating-a-new-frontier-in-translation\/\" target=\"_blank\" rel=\"noreferrer noopener\">Humans VS AI<\/a><\/p>\n\n\n\n<p class=\"has-text-align-center\"><a href=\"https:\/\/www.gorr.si\/en\/2024\/02\/14\/5-ways-translators-can-make-ai-better\/\">5 Ways Translators Can Make A<\/a><a href=\"https:\/\/www.gorr.si\/en\/2024\/02\/14\/5-ways-translators-can-make-ai-better\/\" target=\"_blank\" rel=\"noreferrer noopener\">I Better<\/a><br><br><a href=\"https:\/\/www.gorr.si\/en\/2023\/12\/28\/the-top-5-pitfalls-of-ai-translation\/\" target=\"_blank\" rel=\"noreferrer noopener\">The Top 5 Pitfalls of AI Translation<\/a><br><br><a href=\"https:\/\/www.gorr.si\/en\/2024\/01\/31\/ai-avatars-and-deepfakes-and-what-they-can-do-for-you\/\" target=\"_blank\" rel=\"noreferrer noopener\">AI Avatars and Deepfakes: What Can They Do for You?<\/a><br><br><a href=\"https:\/\/www.gorr.si\/en\/2024\/01\/08\/lost-in-translation-the-real-life-consequences-of-ai-taking-the-wheel\/\" target=\"_blank\" rel=\"noreferrer noopener\">The Real-Life Consequences of AI Taking the Wheel<\/a><br><br><a href=\"https:\/\/www.gorr.si\/en\/2023\/06\/19\/seven-ways-to-use-chatgpt-in-the-translation-industry\/\" target=\"_blank\" rel=\"noreferrer noopener\">7 Ways to 
Use ChatGPT in the Translation Industry<\/a><br><br><a href=\"https:\/\/www.gorr.si\/en\/2023\/06\/19\/is-machine-translation-the-future-of-translation\/\" target=\"_blank\" rel=\"noreferrer noopener\">Is Machine Translation the Future of Translation?<\/a><\/p>\n\n\n\n<div class=\"wp-block-buttons is-content-justification-center is-layout-flex wp-container-core-buttons-is-layout-16018d1d wp-block-buttons-is-layout-flex\">\n<div class=\"wp-block-button\"><a class=\"wp-block-button__link wp-element-button\" href=\"https:\/\/www.gorr.si\/en\/blogs\/\" target=\"_blank\" rel=\"noreferrer noopener\">MORE ON AI<\/a><\/div>\n<\/div>\n\n\n\n<div style=\"height:25px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n","protected":false},"excerpt":{"rendered":"<p>As the use of AI becomes more normalized, it has begun to reflect the merits and flaws of its creators, demonstrating that AI is what we made it. <\/p>\n","protected":false},"author":4,"featured_media":5960,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"_uf_show_specific_survey":0,"_uf_disable_surveys":false,"_breakdance_hide_in_design_set":false,"_breakdance_tags":"","footnotes":""},"categories":[54],"tags":[56,58,70,55],"class_list":["post-5970","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-when-ai-goes-wrong","tag-ai-en","tag-artificial-intelligence","tag-chatgpt-en","tag-translation"],"aioseo_notices":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v22.6 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>Did We Make AI Racist? Addressing Gender and Racial Bias in AI - GORR<\/title>\n<meta name=\"description\" content=\"Can a machine be racist? 
Most would say no, but GORR knows that the rabbit hole of AI bias is a deep one.\" \/>\n<meta name=\"robots\" content=\"noindex, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Did We Make AI Racist? Addressing Gender and Racial Bias in AI - GORR\" \/>\n<meta property=\"og:description\" content=\"Can a machine be racist? Most would say no, but GORR knows that the rabbit hole of AI bias is a deep one.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/gorr.si\/en\/2023\/12\/29\/did-we-make-ai-racist-addressing-gender-and-racial-bias-in-ai\/\" \/>\n<meta property=\"og:site_name\" content=\"GORR\" \/>\n<meta property=\"article:published_time\" content=\"2023-12-29T16:04:45+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2024-02-14T13:38:34+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/gorr.si\/wp-content\/uploads\/2023\/12\/jobs-8053534_1280.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"1280\" \/>\n\t<meta property=\"og:image:height\" content=\"964\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"Gregor\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Gregor\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"5 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"WebPage\",\"@id\":\"https:\/\/gorr.si\/en\/2023\/12\/29\/did-we-make-ai-racist-addressing-gender-and-racial-bias-in-ai\/\",\"url\":\"https:\/\/gorr.si\/en\/2023\/12\/29\/did-we-make-ai-racist-addressing-gender-and-racial-bias-in-ai\/\",\"name\":\"Did We Make AI Racist? 
Addressing Gender and Racial Bias in AI - GORR\",\"isPartOf\":{\"@id\":\"https:\/\/gorr.si\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/gorr.si\/en\/2023\/12\/29\/did-we-make-ai-racist-addressing-gender-and-racial-bias-in-ai\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/gorr.si\/en\/2023\/12\/29\/did-we-make-ai-racist-addressing-gender-and-racial-bias-in-ai\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/gorr.si\/wp-content\/uploads\/2023\/12\/jobs-8053534_1280.jpg\",\"datePublished\":\"2023-12-29T16:04:45+00:00\",\"dateModified\":\"2024-02-14T13:38:34+00:00\",\"author\":{\"@id\":\"https:\/\/gorr.si\/#\/schema\/person\/b0f552a67f6a2330a460512721cbb0f5\"},\"description\":\"Can a machine be racist? Most would say no, but GORR knows that the rabbit hole of AI bias is a deep one.\",\"breadcrumb\":{\"@id\":\"https:\/\/gorr.si\/en\/2023\/12\/29\/did-we-make-ai-racist-addressing-gender-and-racial-bias-in-ai\/#breadcrumb\"},\"inLanguage\":\"en-EN\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/gorr.si\/en\/2023\/12\/29\/did-we-make-ai-racist-addressing-gender-and-racial-bias-in-ai\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-EN\",\"@id\":\"https:\/\/gorr.si\/en\/2023\/12\/29\/did-we-make-ai-racist-addressing-gender-and-racial-bias-in-ai\/#primaryimage\",\"url\":\"https:\/\/gorr.si\/wp-content\/uploads\/2023\/12\/jobs-8053534_1280.jpg\",\"contentUrl\":\"https:\/\/gorr.si\/wp-content\/uploads\/2023\/12\/jobs-8053534_1280.jpg\",\"width\":1280,\"height\":964,\"caption\":\"An\"},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/gorr.si\/en\/2023\/12\/29\/did-we-make-ai-racist-addressing-gender-and-racial-bias-in-ai\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/gorr.si\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Did We Make AI Racist? 
Addressing Gender and Racial Bias in AI\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/gorr.si\/#website\",\"url\":\"https:\/\/gorr.si\/\",\"name\":\"GORR\",\"description\":\"translation center\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/gorr.si\/?s={search_term_string}\"},\"query-input\":\"required name=search_term_string\"}],\"inLanguage\":\"en-EN\"},{\"@type\":\"Person\",\"@id\":\"https:\/\/gorr.si\/#\/schema\/person\/b0f552a67f6a2330a460512721cbb0f5\",\"name\":\"Gregor\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-EN\",\"@id\":\"https:\/\/gorr.si\/#\/schema\/person\/image\/\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/1c33e3a761220c52fbd1e9df19b3d2913ef05f4cb2a14dc97df56ed9c97b0dd9?s=96&d=mm&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/1c33e3a761220c52fbd1e9df19b3d2913ef05f4cb2a14dc97df56ed9c97b0dd9?s=96&d=mm&r=g\",\"caption\":\"Gregor\"},\"url\":\"https:\/\/gorr.si\/en\/author\/gregor\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Did We Make AI Racist? Addressing Gender and Racial Bias in AI - GORR","description":"Can a machine be racist? Most would say no, but GORR knows that the rabbit hole of AI bias is a deep one.","robots":{"index":"noindex","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"og_locale":"en_US","og_type":"article","og_title":"Did We Make AI Racist? Addressing Gender and Racial Bias in AI - GORR","og_description":"Can a machine be racist? 
Most would say no, but GORR knows that the rabbit hole of AI bias is a deep one.","og_url":"https:\/\/gorr.si\/en\/2023\/12\/29\/did-we-make-ai-racist-addressing-gender-and-racial-bias-in-ai\/","og_site_name":"GORR","article_published_time":"2023-12-29T16:04:45+00:00","article_modified_time":"2024-02-14T13:38:34+00:00","og_image":[{"width":1280,"height":964,"url":"https:\/\/gorr.si\/wp-content\/uploads\/2023\/12\/jobs-8053534_1280.jpg","type":"image\/jpeg"}],"author":"Gregor","twitter_card":"summary_large_image","twitter_misc":{"Written by":"Gregor","Est. reading time":"5 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"WebPage","@id":"https:\/\/gorr.si\/en\/2023\/12\/29\/did-we-make-ai-racist-addressing-gender-and-racial-bias-in-ai\/","url":"https:\/\/gorr.si\/en\/2023\/12\/29\/did-we-make-ai-racist-addressing-gender-and-racial-bias-in-ai\/","name":"Did We Make AI Racist? Addressing Gender and Racial Bias in AI - GORR","isPartOf":{"@id":"https:\/\/gorr.si\/#website"},"primaryImageOfPage":{"@id":"https:\/\/gorr.si\/en\/2023\/12\/29\/did-we-make-ai-racist-addressing-gender-and-racial-bias-in-ai\/#primaryimage"},"image":{"@id":"https:\/\/gorr.si\/en\/2023\/12\/29\/did-we-make-ai-racist-addressing-gender-and-racial-bias-in-ai\/#primaryimage"},"thumbnailUrl":"https:\/\/gorr.si\/wp-content\/uploads\/2023\/12\/jobs-8053534_1280.jpg","datePublished":"2023-12-29T16:04:45+00:00","dateModified":"2024-02-14T13:38:34+00:00","author":{"@id":"https:\/\/gorr.si\/#\/schema\/person\/b0f552a67f6a2330a460512721cbb0f5"},"description":"Can a machine be racist? 
Most would say no, but GORR knows that the rabbit hole of AI bias is a deep one.","breadcrumb":{"@id":"https:\/\/gorr.si\/en\/2023\/12\/29\/did-we-make-ai-racist-addressing-gender-and-racial-bias-in-ai\/#breadcrumb"},"inLanguage":"en-EN","potentialAction":[{"@type":"ReadAction","target":["https:\/\/gorr.si\/en\/2023\/12\/29\/did-we-make-ai-racist-addressing-gender-and-racial-bias-in-ai\/"]}]},{"@type":"ImageObject","inLanguage":"en-EN","@id":"https:\/\/gorr.si\/en\/2023\/12\/29\/did-we-make-ai-racist-addressing-gender-and-racial-bias-in-ai\/#primaryimage","url":"https:\/\/gorr.si\/wp-content\/uploads\/2023\/12\/jobs-8053534_1280.jpg","contentUrl":"https:\/\/gorr.si\/wp-content\/uploads\/2023\/12\/jobs-8053534_1280.jpg","width":1280,"height":964,"caption":"An"},{"@type":"BreadcrumbList","@id":"https:\/\/gorr.si\/en\/2023\/12\/29\/did-we-make-ai-racist-addressing-gender-and-racial-bias-in-ai\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/gorr.si\/"},{"@type":"ListItem","position":2,"name":"Did We Make AI Racist? 
Addressing Gender and Racial Bias in AI"}]},{"@type":"WebSite","@id":"https:\/\/gorr.si\/#website","url":"https:\/\/gorr.si\/","name":"GORR","description":"translation center","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/gorr.si\/?s={search_term_string}"},"query-input":"required name=search_term_string"}],"inLanguage":"en-EN"},{"@type":"Person","@id":"https:\/\/gorr.si\/#\/schema\/person\/b0f552a67f6a2330a460512721cbb0f5","name":"Gregor","image":{"@type":"ImageObject","inLanguage":"en-EN","@id":"https:\/\/gorr.si\/#\/schema\/person\/image\/","url":"https:\/\/secure.gravatar.com\/avatar\/1c33e3a761220c52fbd1e9df19b3d2913ef05f4cb2a14dc97df56ed9c97b0dd9?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/1c33e3a761220c52fbd1e9df19b3d2913ef05f4cb2a14dc97df56ed9c97b0dd9?s=96&d=mm&r=g","caption":"Gregor"},"url":"https:\/\/gorr.si\/en\/author\/gregor\/"}]}},"_links":{"self":[{"href":"https:\/\/gorr.si\/en\/wp-json\/wp\/v2\/posts\/5970","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/gorr.si\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/gorr.si\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/gorr.si\/en\/wp-json\/wp\/v2\/users\/4"}],"replies":[{"embeddable":true,"href":"https:\/\/gorr.si\/en\/wp-json\/wp\/v2\/comments?post=5970"}],"version-history":[{"count":11,"href":"https:\/\/gorr.si\/en\/wp-json\/wp\/v2\/posts\/5970\/revisions"}],"predecessor-version":[{"id":7511,"href":"https:\/\/gorr.si\/en\/wp-json\/wp\/v2\/posts\/5970\/revisions\/7511"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/gorr.si\/en\/wp-json\/wp\/v2\/media\/5960"}],"wp:attachment":[{"href":"https:\/\/gorr.si\/en\/wp-json\/wp\/v2\/media?parent=5970"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/gorr.si\/en\/wp-json\/wp\/v2\/categories?post=5970"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/gorr.si\/en\/wp-json\/wp\/v2\
/tags?post=5970"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}