ACM CCS 2025
October 13-17, 2025
Taipei, Taiwan
Diversity and Inclusion
ACM CCS is committed to taking steps to enhance the diversity and inclusion of our community. We value your feedback and input. While we may not all share the same ideas about how to make our community healthier and more welcoming to everyone, we are eager to hear your diverse perspectives, and we hope that our efforts will lead to more open conversations about this goal.
The open exchange of ideas is central to the Association for Computing Machinery’s mission. This requires an environment that embraces diversity and provides a safe, welcoming environment for all.
Inclusion and Diversity in Writing
Because the security community is a large scientific and technical community with a direct impact on many people from different backgrounds around the world, diversity and inclusion are crucial to it. ACM explains these goals as follows. Diversity is achieved when the individuals around the table are drawn from a variety of backgrounds and experiences, leading to a breadth of viewpoints, reasoning, and approaches (also referred to as “the who”). Inclusion is achieved when the environment is characterized by behaviors that welcome and embrace diversity (“the how”). Both are important in our writing and in other forms of communication such as posters and talks.
Inclusion
Be mindful not to use language or examples that further the marginalization, stereotyping, or erasure of any group of people, especially historically marginalized and/or under-represented groups (URGs) in computing. Exclusionary treatment can, of course, arise unintentionally; be vigilant and actively guard against such issues in your writing. Reviewers are also empowered to monitor and demand changes if such issues arise in your submissions.
Examples of exclusionary and other non-inclusive writing to consider avoiding:
- Implicit assumption: An example of data integrity constraints: “Every person has a mother and a father.” This example is exclusionary and potentially hurtful to people from single-parent households and people with same-sex parents.
- Oppressive terminology: Using the term “Master-Slave” to describe a distributed data system architecture. This can be hurtful to people whose families have suffered the inhumanity of enslavement. A good source of alternative terms to oppressive language often used in computer science can be found in this article.
- Marginalization of URGs: An example of attribute domains: “The Gender attribute is either Male or Female.” This example is exclusionary and potentially hurtful to people who are intersex, transgender, third gender, two-spirit, agender, or have other non-binary gender identities.
- Lack of accessibility: Using color alone to convey information in a plot when good alternative data visualization schemes exist. This can be exclusionary to people who are color-blind. Please consider using patterns, symbols, and textures to emphasize and contrast visual elements in graphs and figures rather than relying on color alone. Use a color-blind friendly palette designed with accessibility in mind, and avoid problematic color combinations such as green/red or blue/purple. A brief plotting sketch follows this list.
- Stereotyping: Reinforcing gender stereotypes in names or examples of roles, e.g., using only feminine names or presentations for personal secretary or assistant roles.
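As one way to put the accessibility advice above into practice, here is a minimal matplotlib sketch. It is only an illustration, not something prescribed by CCS: the data, series labels, and the choice of the Okabe-Ito palette are our own assumptions. The point is that each series is distinguished by a marker and line style in addition to a color-blind friendly color, so the grouping survives even if color is lost.

```python
# Illustrative sketch: encode each series with redundant visual channels
# (marker and line style) so the figure does not rely on color alone.
# Data and labels below are made up for demonstration purposes.
import matplotlib.pyplot as plt

x = [1, 2, 3, 4, 5]
series = {
    "Baseline": [0.62, 0.64, 0.66, 0.67, 0.69],
    "Our approach": [0.65, 0.70, 0.74, 0.77, 0.80],
    "Ablation": [0.60, 0.63, 0.65, 0.68, 0.70],
}

# Okabe-Ito colors: a commonly recommended color-blind friendly palette.
colors = ["#0072B2", "#D55E00", "#009E73"]
# Distinct markers and line styles carry the same grouping as color.
markers = ["o", "s", "^"]
linestyles = ["-", "--", ":"]

fig, ax = plt.subplots()
for (label, ys), c, m, ls in zip(series.items(), colors, markers, linestyles):
    ax.plot(x, ys, color=c, marker=m, linestyle=ls, label=label)

ax.set_xlabel("Round")
ax.set_ylabel("Accuracy")
ax.legend()
fig.savefig("comparison.png", dpi=300)
```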
Diversity
Going further, please also consider actively raising the representation of URGs in your writing. Diversity of representation helps create an environment and community culture that could ultimately make our field more welcoming and attractive to people from URGs. This is a small but crucial step you can take towards celebrating and improving our community’s diversity. Examples of infusing diversity into writing to consider adopting:
- Embracing different cultures: Names of people are a visible way to enhance diversity of representation in writing. Instead of defaulting to overused names in computing such as Alice and Bob, consider using names from a variety of languages, cultures, and nationalities, e.g., Alvarez and Bano. Take advantage of the many online resources on this front for ideas, e.g., this article on names across different cultures.
- Embracing differences in figures: Depictions of people or people-like icons in illustrations are also a good avenue to enhance diversity of representation. Consider depicting people of different gender presentations, skin colors, ability status, and other visible attributes of people.
- Embracing gender diversity in pronouns: Make a conscious effort to use a variety of gender pronouns across your named examples, including “he/him/his,” “she/her/hers,” and “they/them/theirs”. Likewise, consider using gender-neutral nouns when referring to generic roles, e.g., “chairperson” or simply “chair” instead of “chairman,” and gender-neutral pronouns for such roles.
Responsibility
Finally, if your work involves data-driven techniques that make decisions about people, please consider explicitly discussing whether it may lead to disparate impact on different groups, especially URGs, and consider discussing the ethical and societal implications. For example, see this article discussing the potential for disparate impact of facial recognition in healthcare and strategies to avoid or reduce harm. This SIGMOD Blog article also gives a comprehensive overview of various dimensions of, and approaches to, the responsible application of data management ideas. We hope our community can help spread this culture of responsibility and awareness of the potentially harmful unintended consequences of our work throughout the larger computing landscape.
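As a concrete, if simplified, illustration of the kind of disparate-impact check one might report, the sketch below computes the ratio of favorable-outcome rates between two groups, a common first-pass measure sometimes compared against the “four-fifths rule” threshold. This is our own example under assumed inputs (hypothetical group labels and decisions), not a method mandated by CCS or a substitute for the fuller discussions in the articles cited above.

```python
# Illustrative sketch: a simple disparate-impact ratio over model decisions.
# Group names and data are hypothetical; a ratio well below ~0.8 is often
# treated as a warning sign worth discussing explicitly in the paper.
from collections import defaultdict

def favorable_rates(records):
    """Return the rate of favorable outcomes per group.

    `records` is an iterable of (group, outcome) pairs, where outcome is
    True when the decision favors the individual (e.g., loan approved).
    """
    totals, favorable = defaultdict(int), defaultdict(int)
    for group, outcome in records:
        totals[group] += 1
        favorable[group] += int(outcome)
    return {g: favorable[g] / totals[g] for g in totals}

def disparate_impact_ratio(records, protected_group, reference_group):
    """Ratio of the protected group's favorable rate to the reference group's."""
    rates = favorable_rates(records)
    return rates[protected_group] / rates[reference_group]

# Hypothetical decisions produced by some model under evaluation.
decisions = [("A", True), ("A", False), ("A", True), ("A", True),
             ("B", True), ("B", False), ("B", False), ("B", False)]
print(disparate_impact_ratio(decisions, protected_group="B", reference_group="A"))
# 0.25 / 0.75 = 0.33..., well below 0.8, flagging a potential disparity.
```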