
Artificial intelligence promises major societal transformation, but its development culture often shuts out women, top computer scientist Dame Wendy Hall said at the AI Impact Summit in New Delhi, reports Tech Xplore. Hall, a professor at the University of Southampton, described the industry as “totally male-dominated,” arguing that this imbalance prevents half the population from shaping how AI is built and deployed. She noted that most leaders in AI companies remain men and that few women are seen in senior roles or at high-profile events. This skewed representation, Hall said, means that important perspectives on design, ethics, and impact are missing from critical decisions.
Gender bias doesn’t stop at leadership. Hall explained that because AI systems learn from data reflecting existing societal inequalities, these tools can reproduce and amplify stereotypes. Language models, for example, often echo traditional gender roles, and generative systems have produced harmful outputs targeting women. Cases such as controversial deepfake images that disproportionately depict women underline the risks when diverse viewpoints are excluded early in development and governance.
The pipeline problem compounds the issue. Fewer women choose computer science or related fields, and those who do may encounter career obstacles that push them out. Hall recounted facing discrimination early in her own career and noted that similar structural barriers persist decades later. Women-led startups also struggle to secure funding, Hall said, limiting their influence on the technology’s direction.
Even so, Hall saw positive signs at the summit, where many young women from India were actively engaging with AI discussions. She stressed that change requires both encouraging more women into technical roles and reshaping industry culture so that technology teams reflect the broader society they serve. Inclusive participation, Hall believes, is essential not only for fairness but also for building AI that benefits everyone rather than reinforcing existing biases.