In 2020, Detroit police arrested a Black man for shoplifting almost $4,000 worth of watches from an upscale boutique. He was handcuffed in front of his family and spent a night in lockup. After some questioning, however, it became clear that they had the wrong man. So why did they arrest him in the first place? The reason: a facial recognition algorithm had matched the photo on his driver's license to grainy security camera footage.

Facial recognition algorithms - which have repeatedly been demonstrated to be less accurate for people with darker skin - are just one example of how racial bias gets replicated within and perpetuated by emerging technologies.

"There's an urgency as AI is used to make really high-stakes decisions," says MLK Visiting Professor S. Craig Watkins, whose academic home for his time at MIT is the Institute for Data, Systems, and Society (IDSS). "The stakes are higher because new systems can replicate historical biases at scale."

Watkins, a professor at the University of Texas at Austin and the founding director of the Institute for Media Innovation, researches the impacts of media and data-based systems on human behavior, with a specific concentration on issues related to systemic racism. "One of the fundamental questions of the work is: how do we build AI models that deal with systemic inequality more effectively?"

Inequality is perpetuated by technology in many ways across many sectors. One broad domain is health care, where Watkins says inequity shows up in both quality of and access to care. The demand for mental health care, for example, far outstrips the capacity for services in the United States. That demand has been exacerbated by the pandemic, and access to care is harder for communities of color.

For Watkins, taking the bias out of the algorithm is just one component of building more ethical AI. He also works to develop tools and platforms that can address inequality outside of tech head-on. In the case of mental health access, this entails developing a tool to help mental health providers deliver care more efficiently.

"We are building a real-time data collection platform that looks at activities and behaviors and tries to identify patterns and contexts in which certain mental states emerge," says Watkins. "The goal is to provide data-informed insights to care providers in order to deliver higher-impact services."

Watkins is no stranger to the privacy concerns such an app would raise. He takes a user-centered approach to development that is grounded in data ethics. "Data rights are a significant component," he argues. "You have to give the user complete control over how their data is shared and used and what data a care provider sees."

Here at MIT, Watkins has joined the newly launched Initiative on Combatting Systemic Racism (ICSR), an IDSS research collaboration that brings together faculty and researchers from the MIT Stephen A. Schwarzman College of Computing and beyond. The aim of the ICSR is to develop and harness computational tools that can help effect structural and normative change toward racial equity. The ICSR collaboration has separate project teams researching systemic racism in different sectors of society, including health care. Each of these "verticals" addresses different but interconnected issues, from sustainability to employment to gaming.