I really like the intentions behind the Code of Conduct. However, as an artist, I feel that the requirement for safe experiences and “content moderation” raises the question: who determines what is “safe”? If an experience is controversial or provocative but consensual, should it be restricted? For instance, how would the Code of Conduct apply to immersive erotic, sex-related experiences?
What about an XR experience that is censored in one country because it is political and critical of the government, yet is meaningful precisely because it documents that country’s social and political issues? Could this principle be used to justify overreach and censorship in the name of safety?
So the question, again, is: who determines what is “safe”, and what counts as safe?
I think the Code of Conduct (CoC) applies to erotic, sex-related experiences the same as to any other. The developers and designers (D&Ds) decide whom they are making the experience for, and that should extend to deciding what content moderation is appropriate for their users, how they transparently communicate the content they create or host, and how they obtain and maintain informed consent from their users. The CoC should always recognise that D&Ds have the greatest agency in the creation process and thus the greatest responsibility.
Your second question, about differences across borders, is a good one and something I think about a lot. I would say the CoC should prompt D&Ds to consider the varying risks their users face in different cultural and legal jurisdictions. D&Ds should make reasonable efforts to avoid putting users at unnecessary risk in jurisdictions that don’t permit the kinds of experiences they want to make or host.
I also think that D&Ds have the right to self-censor, or to limit or ban types of content in the experiences or platforms they create. When D&Ds make these decisions and are transparent about them, users can make informed decisions about whether those experiences and platforms are right for them. When corporate or legal authorities make content moderation decisions for large numbers of D&Ds, especially without the consent of users and D&Ds, that is when moderation risks becoming suppressive or oppressive.
My understanding is that this XR CoC is meant to support the decision-making of D&Ds, which I think leads to the safest outcomes.