An independent research initiative studying how AI-driven recommendation systems amplify misogyny, harassment, and gender-based violence against young people in Malaysia.
What is algorithmic design without accountability but an invisible architecture, shaping beliefs and behaviours while remaining impossible to contest?
Our aim is to investigate how AI-driven recommendation algorithms shape gendered perceptions and harms among Malaysian youth, and to translate this evidence into public knowledge, youth capacity, and practical tools that strengthen digital literacy and democratic resilience.
Social media algorithms increasingly shape what young people see, believe, and normalise about gender, identity, and power. In Malaysia, these systems amplify misogyny, harassment, and polarising content while remaining opaque, difficult to contest, and under-researched in local contexts. Young people are highly exposed to these harms but lack the tools to understand, monitor, or respond to them in real time.
We are looking for research partners, funders, and collaborators invested in algorithmic accountability, youth digital rights, and gender justice in Southeast Asia.
Get in Touch →