I want to begin with a claim that I know will be contested, because the framing matters. The underrepresentation of women in artificial intelligence is not primarily an equity problem. It is a technical quality problem. And for the Caribbean specifically, it is one of the most urgent technical problems we face in building an AI ecosystem that actually serves Caribbean people.
I make this argument not to minimize the equity dimension. Gender discrimination in technology careers is real, documented, and consequential. But I want to make the technical argument because the technical argument is more difficult to dismiss and because it reveals something important about the current state of AI that gets lost when the conversation stays at the level of representation statistics.
The Technical Case
AI systems learn from data. They identify patterns in the data they are trained on and apply those patterns to new inputs. The quality of what they learn, and therefore the quality of what they produce, depends directly on the quality and representativeness of their training data. A system trained on data that does not adequately represent a population will produce less accurate results for that population. This is not a theoretical claim. It is a mathematical one, and it is borne out repeatedly in real-world deployments.
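To make that concrete, here is a minimal sketch of the dynamic using synthetic data and scikit-learn. Everything in it is invented for illustration: two made-up groups whose underlying patterns differ, with one group heavily underrepresented in the training set.

```python
# Illustrative sketch only: synthetic data, invented groups.
# It shows how underrepresentation in training data turns directly
# into worse accuracy for the underrepresented group.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

def make_group(n, informative_feature):
    """Generate examples whose label depends on a group-specific feature."""
    X = rng.normal(size=(n, 2))
    y = (X[:, informative_feature] > 0).astype(int)
    return X, y

# Group A dominates the training set; group B is underrepresented,
# and the pattern that predicts its outcomes is different.
Xa, ya = make_group(5000, informative_feature=0)
Xb, yb = make_group(250, informative_feature=1)

model = LogisticRegression().fit(np.vstack([Xa, Xb]), np.concatenate([ya, yb]))

# Evaluate on equally sized, held-out samples from each group.
Xa_test, ya_test = make_group(2000, informative_feature=0)
Xb_test, yb_test = make_group(2000, informative_feature=1)

print("accuracy for group A:", accuracy_score(ya_test, model.predict(Xa_test)))
print("accuracy for group B:", accuracy_score(yb_test, model.predict(Xb_test)))
# Group B's accuracy comes out far lower: the model fits the majority
# pattern and treats the minority group's pattern as noise.
```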
Amazon built an AI hiring tool trained on a decade of the company's own hiring data. Because Amazon, like most technology companies, had hired more men than women over that decade, the AI learned that the characteristics associated with successful candidates were more common in men than in women. When it evaluated new candidates, it penalized features associated with women applicants: attendance at women's colleges, membership in women's professional organizations, certain patterns of career description. The tool was eventually scrapped, but the underlying dynamic was not unique to Amazon. It is replicated wherever AI systems are trained on historically biased data without teams that recognize and correct for the bias.
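The dynamic is easy to reproduce in miniature. The sketch below is a toy reconstruction, not Amazon's actual system: the features, attribute names, and numbers are all invented. It shows how a model that never sees gender directly can still learn to penalize a proxy for it, because the historical labels it imitates were biased.

```python
# Toy reconstruction of the dynamic described above -- not Amazon's system.
# All features and numbers are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 20_000

skill = rng.normal(size=n)                     # genuinely job-relevant signal
is_woman = rng.integers(0, 2, size=n)          # hypothetical applicant attribute
womens_college = is_woman & (rng.random(n) < 0.3).astype(int)  # correlates with gender only

# Historical "hired" labels: driven by skill, but with a penalty applied to
# women, standing in for a decade of biased human decisions.
hired = (skill - 1.0 * is_woman + rng.normal(scale=0.5, size=n) > 0).astype(int)

# The model never sees gender directly -- only resume-style features.
X = np.column_stack([skill, womens_college])
model = LogisticRegression().fit(X, hired)

print("weight on skill:            %+.2f" % model.coef_[0, 0])
print("weight on women's college:  %+.2f" % model.coef_[0, 1])
# The second weight comes out negative: the model has learned to penalize a
# proxy for gender, because the labels it was trained to imitate did.
```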
Medical AI systems trained predominantly on data from clinical studies that historically underrepresented women produce diagnostic tools that are measurably less accurate for female patients. This is particularly alarming in cardiovascular disease, which presents differently in women than in men and has historically been underdiagnosed in women partly because of research biases that AI systems can now encode at scale.
Facial recognition systems trained on datasets skewed toward lighter-skinned male faces have documented higher error rates for women and for darker-skinned people. The gaps are not small. Studies of commercial systems from major technology companies have found error rates as high as 34 percent for darker-skinned women, against error rates below one percent for lighter-skinned men.
Natural language processing systems associate certain professions with male pronouns by default because the text they were trained on reflects historical patterns of male professional dominance. When asked to translate a sentence about a nurse from a gender-neutral language to English, early versions of Google Translate defaulted to "she." When asked to translate a sentence about a doctor, it defaulted to "he." The AI had learned what the language reflected: a world where most nurses were women and most doctors were men. Deploying it unchanged encoded that historical reality into every translation.
None of these failures are inevitable. All of them are more likely when the teams building the systems do not include women who will notice the failure modes and insist they be corrected.
Why the Caribbean Has a Specific Stake in This
The Caribbean's relationship to AI bias has two dimensions: we are disproportionately harmed by it, and we are disproportionately absent from the processes that could prevent it.
Caribbean people are harmed by AI bias in compounding ways. AI systems trained without Caribbean data produce less accurate results for Caribbean users in healthcare, finance, language, and content moderation. Caribbean Patois and creole languages are mishandled by natural language processing tools because the training data does not represent them. Facial recognition error rates are documented to be higher for the darker skin tones common across the Caribbean. Caribbean financial behavior patterns, which include high rates of informal economy participation, are misread by credit scoring models trained on formal-sector financial behavior from wealthier countries.
When these biased systems are deployed in Caribbean markets, which happens with increasing frequency as international AI products enter the region, the harm is concentrated on the communities already most likely to face discrimination: women, darker-skinned people, informal economy participants. The most vulnerable people in the Caribbean are the ones most harmed by AI bias, and the harm compounds existing disadvantages rather than offsetting them.
At the same time, Caribbean women are among the least represented groups in AI development globally. The combination of global underrepresentation of women in AI plus the specific underrepresentation of Caribbean people in global AI creates a double absence that produces systems particularly ill-suited to Caribbean women's needs.
What I Have Learned from Building StarApple AI
Since founding StarApple AI in 2019, I have had a front-row seat to the Caribbean AI ecosystem. I have seen the talent that exists in this region, across genders, across islands, across educational backgrounds. I have run bootcamps, assessments, and deployments across the Caribbean. And I have seen a persistent pattern: women are present in the Caribbean AI conversation, they are engaged, they are capable, and they are systematically underserved by the training and networking infrastructure of the Caribbean AI ecosystem.
I want to be direct about my own responsibility here. StarApple AI's training programmes have not always done enough to recruit women into AI careers, to create environments where women felt they belonged, or to prioritize the AI applications most relevant to women's professional lives. We are actively working to change this. But I want to name it honestly rather than present a picture of consistent excellence on this front, because honest accounting is more useful than self-congratulation.
What I know from building this organization is that the women who do engage with AI training demonstrate exactly the qualities that make strong AI practitioners: clear thinking about problems, attention to the human consequences of technical decisions, and the ability to see failure modes that narrower perspectives miss. The Caribbean AI ecosystem is technically weaker without them.
The Artful Intelligence Framework
My concept of Artful Intelligence is directly relevant here. Artful Intelligence argues that the most powerful AI is built at the intersection of technical capability and deep human understanding, specifically understanding of the communities, cultures, and contexts that AI is being built for and deployed in. AI without that human understanding is a powerful tool pointed in an uncertain direction.
Caribbean women's understanding of Caribbean communities, Caribbean culture, and Caribbean context is exactly the human intelligence that makes Caribbean AI artful rather than simply technical. A healthcare AI tool built with Guyanese women's input will better understand the chronic disease patterns of Guyanese communities. A credit scoring model built with Jamaican women's input will better understand the informal economy patterns of Jamaican women's financial lives. A language processing tool built with input from women in Trinidad and Tobago will better handle the linguistic richness of Trinidadian English and creole.
This is not soft knowledge. It is the technical substrate that makes AI systems work in specific contexts. And the Caribbean is a specific context that global AI largely ignores.
What March 8 Should Mean for Caribbean AI
International Women's Day is an appropriate moment to make commitments, not just observations. Here is what I am committing to, and what I believe the Caribbean AI ecosystem should commit to collectively.
First, deliberate recruitment of Caribbean women into AI training and careers. Not passive openness but active recruitment: targeted outreach, accessible programme formats, scholarship support, and mentorship connections. The pipeline does not fill itself.
Second, gender-aware evaluation of AI systems deployed in Caribbean markets. Every AI tool deployed in the Caribbean should be evaluated for gender bias against Caribbean female users before deployment, not assumed to be neutral because it works for the default demographic it was tested on. A sketch of what such a check could look like follows these commitments.
Third, visible celebration of Caribbean women who are doing AI work. Role models matter: they make it possible for the next generation of Caribbean girls to see AI as a space for them. Making that work visible is the responsibility of everyone with a platform in the Caribbean AI space.
Fourth, representation in Caribbean AI governance. The policy frameworks, national AI strategies, and regulatory approaches being developed across the Caribbean must include women's voices and women's perspectives on AI's impacts. These are being shaped now. The window for influencing them is open but not indefinitely.
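On the second commitment, gender-aware evaluation does not require exotic tooling. At its simplest, it means disaggregating standard metrics by group on a labelled Caribbean evaluation set and refusing to ship when the gap is too wide. The sketch below is one possible shape for such a check, not a StarApple AI standard; the field names, metrics, and threshold are illustrative assumptions.

```python
# Minimal sketch of a disaggregated pre-deployment check. The field names,
# metrics, and threshold are illustrative assumptions, not a fixed standard.
from dataclasses import dataclass

@dataclass
class Example:
    gender: str       # e.g. "woman" or "man"
    label: int        # ground truth (1 = positive outcome)
    prediction: int   # model output for this example

def group_metrics(examples, group):
    """Accuracy and false-positive rate for one gender group."""
    rows = [e for e in examples if e.gender == group]
    accuracy = sum(e.prediction == e.label for e in rows) / len(rows)
    negatives = [e for e in rows if e.label == 0]
    fpr = sum(e.prediction == 1 for e in negatives) / len(negatives) if negatives else 0.0
    return accuracy, fpr

def gender_gap_check(examples, max_gap=0.05):
    """Fail the deployment gate if the accuracy or FPR gap exceeds max_gap."""
    acc_w, fpr_w = group_metrics(examples, "woman")
    acc_m, fpr_m = group_metrics(examples, "man")
    report = {"accuracy_gap": abs(acc_w - acc_m), "fpr_gap": abs(fpr_w - fpr_m)}
    return all(gap <= max_gap for gap in report.values()), report

# Usage: run against a held-out, labelled Caribbean evaluation set before go-live.
# passed, report = gender_gap_check(evaluation_examples)
```

The right metrics and threshold will differ by use case; the point is that the check is explicit, disaggregated by group, and runs before deployment rather than after harm is reported.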
To the Caribbean women reading this: the AI era is not waiting for you to be ready. It is being built without sufficient input from you right now, and the systems being built without you are worse for it. That is not an accusation. It is an invitation. The Caribbean needs you in this space, not as beneficiaries of AI but as architects of it.
The Caribbean AI ecosystem is technically weaker without women's full participation. That is not an equity argument. That is an engineering argument. And it is one that the Caribbean AI sector needs to take seriously on this International Women's Day and every day after it.
Frequently Asked Questions
Why does gender equity in AI matter beyond fairness?
AI systems learn from data. When development teams and training data do not adequately represent all groups, systems produce worse outcomes for underrepresented groups. This is a technical failure, not a fairness concern bolted onto an otherwise functioning system. Amazon's discriminatory hiring AI, medical diagnostic AI that is less accurate for women, and facial recognition with higher error rates for darker-skinned faces are all documented examples.
What specific biases does AI produce when women are not in the development process?
Documented examples: AI hiring tools that discriminate against women applicants; medical AI less accurate for women due to male-dominated training data; facial recognition with error rates as high as 34 percent for darker-skinned women versus below 1 percent for lighter-skinned men; NLP tools that encode historical professional gender patterns; and credit scoring that undervalues the creditworthiness of women in informal economies.
What is the Caribbean's specific stake in having women in AI development?
Caribbean women's absence from AI development means AI systems deployed in the Caribbean are less accurate for Caribbean female users. Defensively, their participation prevents Caribbean-specific AI harms. Offensively, it ensures Caribbean women capture economic value from the Caribbean AI ecosystem being built now.
What is Adrian Dunkley's position on women in AI in the Caribbean?
The underrepresentation of women in Caribbean AI is a technical problem that produces worse AI for Caribbean communities. StarApple AI is actively working to recruit more women into AI training programmes and to ensure gender-aware development practices across its Caribbean AI work.
If you are a Caribbean woman who wants to build AI skills and participate in the Caribbean AI ecosystem, StarApple AI's training programmes are designed for you. Connect at insights@starapple.ai or explore our bootcamp programme.
Explore StarApple AI Training