  • 09 November 2020 – 10 November 2020

A French-American Dialogue on Privacy, Trust, and Human Oversight in Health and Well-Being

This event is organized by the Georgia Institute of Technology, the Consulate General of France in Atlanta, the Emory University Center for Ethics, the Georgia Tech Ethics, Technology, and Human Interaction Center, the University of Nantes “Droit et Changement Social” (Law and Social Change) Research Center and DataSanté Research Program, SKEMA Business School, and French Tech Raleigh – Research Triangle, with the support of the Atlanta Office of the Cultural Services of the Embassy of France in the United States and the Office for Science and Technology of the Embassy of France in the United States.

About this Event

Artificial intelligence (AI) models are now capable of collecting and analyzing enormous datasets in ways that challenge fundamental values embraced in Europe and the United States. Holding much promise in terms of increased productivity, efficiency, and quality time, AI programs and algorithms could function as an assistant, a peer, a manager, or even a friend. Indeed, they may prove so revolutionary that no one, whether consumer, citizen, patient, operator, or stakeholder, will remain unaffected.

AI is powerful enough to call into question what it means to be human and whether people retain freedom of choice, and it may reshape the relationship between humans and technology in society. The ethical issues emerging from AI are complex and quickly evolving, which makes identifying and implementing appropriate solutions difficult.

The approaches taken by France, the European Union, and the United States to address these ethical issues are still being defined; in 2020, governments are still weighing options to maximize the potential of AI and big data while mitigating potential ethical harms.

See: