Abstract
The Rohingya genocide demonstrates how modern technology can exacerbate historical ethnic tensions and facilitate severe human rights abuses. In 2016–17, Myanmar’s military orchestrated clearance operations that forced over 700,000 Rohingya to flee to Bangladesh. Meta, Facebook’s parent company, played a critical role in this crisis through systematic failures in content moderation and platform design. The company’s inadequate investment in Burmese-speaking moderators and culturally appropriate algorithmic systems allowed hate speech to flourish, while its engagement-based recommender system amplified anti-Rohingya content. Using Stanton’s Ten Stages of Genocide model, this paper traces how these platform dynamics catalysed the progression of offline violence against the Rohingya. These failures highlight a stark disparity in the safety measures afforded to Global South and Global North users.
This case exemplifies a broader pattern of digital colonialism, where Meta’s Internet.org initiative drove users to Facebook while prioritising data extraction and market influence over local population safety. Similar patterns of Facebook-amplified ethnic violence have emerged in other Global South nations, including Ethiopia and Sri Lanka. The paper argues that this systematic neglect of user safety in favour of economically valuable data collection perpetuates colonial power structures, challenging the assumption that technology platforms are neutral intermediaries in protecting human rights.
Citation
Oates, H., 2025. “Digital colonialism: An analysis of Facebook’s role in the Rohingya genocide”, ANU Undergraduate Research Journal, 14(1), pp. 36–46.
@article{oates2025digital,
author = {Harrison Oates},
title = {Digital Colonialism: An Analysis of Facebook’s Role in the Rohingya Genocide},
journal = {ANU Undergraduate Research Journal},
volume = {14},
number = {1},
year = {2025},
pages = {36--46},
publisher = {ANU Press},
}