The pandemic, Alondra Nelson said, taught us a lesson we need to learn again: science and technology are deeply entangled with society, with inequality, and with social life.

A year into a pandemic in which science was politicized during the presidential campaign, President Joe Biden in January appointed Nelson deputy director for science and society at the White House Office of Science and Technology Policy, a newly created position. Nelson will build a science and society division within OSTP to address issues ranging from data and democracy to STEM education. In another first, Biden elevated his science adviser, Eric Lander, who is also the director of OSTP, to his Cabinet.

Nelson has spent her career at the intersection of race, technology, and society, writing on topics such as how Afrofuturism can make the world a better place, and how the Black Panthers used health care as a form of activism, which led to the organization's early interest in genetics. She is the author of several books, including The Social Life of DNA, which focuses on the rise of the consumer genetic testing industry and how the desire to understand their ancestry led Black people and Mormons to become early adopters of the technology.

Nelson is a professor at the Institute for Advanced Study in Princeton, New Jersey. Before her appointment, she was writing a book about OSTP and the Obama administration's major science projects, including a series of reports on artificial intelligence and government policy.

In her first official speech after assuming her new position in January, Nelson called science a social phenomenon and said that technologies such as artificial intelligence can reveal or reflect dangerous social structures underlying the pursuit of scientific progress. In an interview with WIRED, Nelson said that Black communities in particular have been overexposed to the harms of science and technology while not fully sharing in its benefits.

In the interview, she talked about the Biden administration's scientific moonshot plans, why the administration has no official position on banning facial recognition, and what she believes must be addressed during her time in government with respect to emerging technologies and society. An edited transcript follows.

Wired: In January, you talked about the "dangerous social structures hidden beneath the scientific progress we are pursuing" and mentioned gene editing and artificial intelligence. What prompted you to single out gene editing and AI in your first public speech in this position?

Alondra Nelson: I think what genetic science and artificial intelligence have in common is that they are data-centric. Some of what we know about data, and about how data analysis works at scale, holds true for aspects of large-scale genome analysis and machine learning, so those are the foundations. I think the problems we still need to solve as a country concern the sources of the data analyzed with AI tools, who gets to decide which variables are used, and which questions scientific and technological research raises. What I hope will be different at this OSTP is honesty about the past. Technology has harmed some communities and left others out, preventing people from engaging in technology work.

Racial equity and restoring trust in government were identified as key issues on the administration's first day, which means the work of science and technology policy must treat the past honestly in order to restore trust in government. Part of restoring trust in the ability of science and technology to do any good in the world is, in fact, being open about the history of science and technology's shortcomings and failures.

Unfortunately, there are many examples. Next month marks another anniversary of the Associated Press story that, about 40 years ago, exposed the Tuskegee syphilis study, so we are coming to that anniversary again. And of course, one problem we encounter with AI research is that the data being used are incomplete, and that incompleteness means the inferences drawn from them are incomplete and inaccurate, especially when they are used in social services and criminal justice systems, where they have a truly disproportionate, harmful impact on Black and brown communities.

Wired: Lander said at his confirmation hearing that OSTP will address algorithm-based discrimination and bias. How will that work?

