The Code of Conduct for the Human-Centered and Ethical Development of Immersive Technologies aims to foster a trusted, competitive, and socially responsible XR ecosystem in Europe.
The XR4Human Project is currently developing this Code of Conduct and values your input.
We cordially invite you to participate in the stakeholder consultation for the draft Code of Conduct for the Human-Centered and Ethical Development of Immersive Technologies. You can access and provide your comments on the draft Code of Conduct here.
Comments added! I’d be curious to know what level of specificity you are aiming for. Some of the sections give directions toward quite specific actions, while others are rather high level and abstract.
@lene.hagen and @RigmorBaraas What do you think here? I think that the ideal is that the articles are specific enough to be intuitively recognizable in the actions of as many designers and developers as possible, but not so specific as to bias interpretation or limit scope. It’s a needle to thread.
That may not be a great answer, but @Rasa, do you think the Code of Conduct needs to be more or less specific than it currently is to achieve the goal above?
Thank you so much for your feedback, @Rasa! We appreciate the comments and insights you’ve shared.
I agree with @michaelbarngrover’s response. The balance between specificity and flexibility is important, and it’s valuable to hear your and other stakeholders’ opinions on this matter, as your perspectives can help us refine the Code of Conduct to be both actionable and adaptable.
XR has both great potential and great danger. Most XR users and XR developers are not aware of the dangers, especially those arising from manipulation. VR can serve as the perfect Skinner box, combined with eye tracking and AI to “read the users’ thoughts”.
In part, the CoC already covers important elements like data ownership and privacy.
I would like to add “education about the risks and dangers” and a prohibition on manipulation in an ethics section.
Hi David, do you mean that XR developers are not aware of the danger that they themselves may manipulate their own users? Can you list or describe what you see as the great dangers posed by VR? You mentioned manipulation and thought reading, but are there others that you see?
For the Code of Conduct, how would you suggest adding “education about the risks and dangers” to it? Should the code advise developers to regularly undertake training to understand the risks they pose, or should designers and developers educate their users about the dangers posed by the technology they use?
Regarding manipulation, I recall that this was meant to be addressed by the “trustworthy and transparency” principle. There was a discussion about how much the Code should explicitly prohibit activities that were already illegal or already defined as unethical, with the decision being that the Code should not restate these prohibitions. Manipulation, as a term, is already commonly defined as an unethical or unscrupulous act on another person. Because of that, the idea was for the Code to describe how to address issues and decisions that are less ethically pre-defined.
@lene.hagen Can you check on the settings required to access the file? I understand that others outside the project have been able to access it and comment in it, but maybe there’s something required?
I’m sorry to hear you’re having trouble accessing the document, @Amiodezky. There are no special requirements to access it, and you shouldn’t need to sign in.
I’d like to respond first with an analogy: it is one thing to hand someone a ham sandwich and tell them not to kill anyone, and quite another to hand someone a loaded gun and tell them not to kill anyone. Right? For me, XR is like this loaded gun. Obviously, it does not make sense to repeat general ethical consensus.
XR and especially VR have great potential. And as we know from Spiderman, with great power comes great responsibility. What I notice in the market is that a) some big companies try to abuse this power for selfish reasons, and b) almost all users and most developers are not aware of the dangers of this technology. My wish is to change that, and I believe we all have an ethical duty to educate. I would not differentiate between developers and non-developers; I’d argue that all (aspiring) VR experts have a responsibility to educate themselves and everyone they consult.
Regarding manipulation: a lot of marketing and advertising is manipulation, so to some degree it is accepted. With broad adoption, XR will become the main medium of advertising. Since it is such a powerful tool for changing the perception of its users, advertisements will push the boundaries of what is accepted. I propose that the code of conduct take a stance against this trend, as it could lead to election manipulation and basically the end of democracy.
I hope this explanation makes my points clearer and helps with refining the wording so that it better suits the code of conduct and its purpose.
“XR is like this loaded gun” is a very profound statement! It makes me think of how designers and developers of XR in Europe are going to need to adapt to the greater prominence of defense sector projects in the coming years.
I say this because my first response to your analogy was, who would want to create guns? Certainly some, but just as certainly there are many who would not want to create lethal weapons. If someone saw XR as a lethal weapon, what would attract them to creating it, and do they represent a particular ethical profile?
Your analogy inspired in me more questions than it did ideas for the Code of Conduct, but that’s often what a good analogy does.
Regarding your suggestions, I would support an addition to the CoC that encourages not just continuous learning but also continuous sharing of that learning with peers and stakeholders. I envisioned the transparency principle and the relevant examples in the articles as carrying an obligation to educate, but a more explicit obligation to educate, such as through disseminated learning, seems valuable. It would also help share the burden of adopting the CoC among its adopters.
Regarding taking a stance on the ubiquity of manipulative advertising, I fully agree, and I even suggested that the CoC consider including subarticles that address digital platforms and business models built on advertising. That seems to be the core motivation for many of the ills we are sensitive to in digital platforms, so why not address it? Ultimately it was not included, but I would support its inclusion, even at the risk of lowering the likelihood of adoption by major companies.
We’re looking for designers, developers, and others who work with XR applications to join Zoom sessions to discuss the XR Code of Conduct and offer feedback. Please consider joining one of these sessions in April and May.