Establishing ethical rules for neurotechnology and AI

It is clear that new technologies enable scientists to do unprecedented things. For this reason, scientists and ethicists from the Morningside Group sent a warning about neurotechnology and AI to the journal Nature, arguing that certain ethical standards should be set for the use of these technologies. They proposed four basic areas on which the protection of people and the setting of rules should focus: the protection of human privacy, identity, agency, and equality:

Privacy and consent. Most people are already hooked up to the internet through their smartphones, which can collect huge amounts of revealing data. Google says that people touch their phones 1 million times a year. “We believe that citizens should have the ability — and right — to keep their neural data private,” assert the authors. Opting out of sharing this data should be the default choice on devices. The transfer of neural data should be regulated like organ donation to prevent a market from developing.

Agency and identity. “Neurotechnologies could clearly disrupt people’s sense of identity and agency, and shake core assumptions about the nature of the self and personal responsibility — legal or moral.” Therefore “neurorights” should be protected by international treaties. Consent forms should warn patients about the risk of changes in mood, sense of self and personality.

Augmentation. Neurotechnology could allow people to radically increase their endurance or intelligence, creating discrimination and changes in social norms. The researchers urge that “guidelines [be] established at both international and national levels to set limits on the augmenting neurotechnologies that can be implemented, and to define the contexts in which they can be used — as is happening for gene editing in humans.”

Bias. Research has shown that bias can be incorporated into AI systems, and can be devilishly hard to eliminate. The Morningside Group recommends that "probable user groups (especially those who are already marginalized) have input into the design of algorithms and devices as another way to ensure that biases are addressed from the first stages of technology development."

(Source: https://www.bioedge.org/bioethics/ethical-standards-urgently-needed-for-neurotechnology-say-researchers-and-e/12544)