Compassion Through Computation
I am so inspired by Joy Buolamwini's work and her creation of the Algorithmic Justice League. We all do better as a society when we are inclusive; the research is clear about this. But what if we are unaware of the biases in our code?

As a programmer in a field often dominated by white men, I wonder: what underlying assumptions are baked into our algorithms? What training sets do we default to without even being aware of it? If 'norms' default only to the group of people writing the code, who is left out, and what is the impact?

I believe there is a moral and ethical dimension to computation that needs greater focus and attention. For organizations, there is also a strong business case: inclusive practices grow the employee, customer, and partner base. If you want to learn more, check out Joy's inspirational TED Talk.