Female-founded AI companies have accounted for just 2% of startup deals in the sector over the last 10 years, according to new research.
Despite investors seemingly tripping over themselves to back promising AI startups, a new report from the Alan Turing Institute has found there is a stark gender imbalance among the founders receiving investment.
The report found that in addition to being poorly represented across the total number of deals, female-founded AI startups also received significantly smaller average investments.
According to the research, female-founded AI startups that do secure funding receive an average investment of £1.3m, compared with an average of £8.6m for their all-male-founded counterparts.
“The recent explosion in interest and investment in AI, especially generative AI, means that there is an urgent need for women and minorities to have equal access in the tech and venture space,” said Alan Turing Institute research fellow Dr Erin Young.
“Venture capital firms impact the business models of the startups in which they invest, and VCs tend to invest in companies that reflect their own networks and value systems, in turn shaping the technologies developed. Encouraging inclusion in the VC space can help promote responsible AI design, tackle AI biases and foster innovation.”
The report recommended addressing gender diversity at the recruitment level to ensure women have equal opportunities in the field, and encouraged investors to scrutinise their investment practices more deliberately to tackle bias.
Figures published by the Office for National Statistics (ONS) in August revealed that the number of women working in the UK tech industry fell by around 3,000 in the first half of 2023.
Data published by the British Business Bank in July found that the share of equity investment deals raised by all-female-founded startups in the UK had seen no improvement over the last decade.
A lack of diversity in the sector has led to problems in AI systems in the past. For example, facial recognition systems have struggled to accurately identify people with darker skin, in some cases leading to false positives when used by law enforcement.