
Surveillance as Erasure: How China’s Digital Infrastructure, the IJOP, Targets Uyghur Identity
The Chinese state's technological surveillance of Uyghurs in East Turkestan, carried out through the Integrated Joint Operations Platform (IJOP), constitutes a severe form of repression. The system criminalizes cultural and religious practice, enables arbitrary detention, and aims to erase Uyghur identity; as similar technologies spread internationally, it poses global risks as well.
The Chinese state's technological control over Uyghurs in East Turkestan is not merely an extension of traditional authoritarian methods. It is a deeply insidious, total, and enduring form of repression. Over the past decade, the Integrated Joint Operations Platform (IJOP) has emerged as the central nervous system of a sprawling mass surveillance architecture, one that uses predictive policing, facial recognition, and biometric databases to monitor, sort, and punish Uyghurs in East Turkestan (Xinjiang). This system not only watches Uyghurs but also preemptively criminalizes them. As Human Rights Watch (2019) has documented, technologies like IJOP actively shape life in occupied regions, deciding whose presence is visible, whose identity is recognized, and whose existence is effectively disposable.
Data from police checkpoints, security cameras, Wi-Fi sniffers, and smartphone scans are fused into an ever-expanding database that flags “suspicious” behavior. Suspicion is not tied to evidence of wrongdoing. Markers of cultural or religious life often trigger it. Possession of a Qur’an, use of WhatsApp, communication with relatives abroad, or attendance at a mosque can be recorded as potential threats (Australian Strategic Policy Institute [ASPI], n.d.; Human Rights Watch, 2019). Facial recognition systems deployed in public spaces feed into this database, matching Uyghur faces against watchlists in real time (Amnesty International, 2022). The result is an environment in which every gesture of faith, every act of memory, every expression of language becomes a data point in a system designed for repression.
Biometric collection deepens this control. Chinese authorities have taken DNA samples, voice recordings, iris scans, and even three-dimensional body scans from millions of Uyghurs, often under the guise of free health check-ups (Human Rights Watch, 2017). These biometric profiles are integrated into IJOP, creating a permanent record linking identity to suspicion. When surveillance becomes so granular that it can track the body itself, its movements, its voice, its biological essence, there is no refuge.
Surveillance does not merely make people visible. It erases alternative ways of being. Cultural life becomes a source of risk until, eventually, it becomes safer not to live that life at all.
This is not an abstract story of human rights violations. The pairing of IJOP’s predictive policing with arbitrary detention has created a direct pipeline to “re-education” camps, where Uyghurs are held indefinitely, subjected to political indoctrination, and often coerced into abandoning their religion and language (U.S. Department of Labor, 2024). During my Ph.D. fieldwork in Istanbul with Uyghur exiles, survivors described being punished for praying, for wearing traditional clothing, and for speaking Uyghur instead of Mandarin. These are not isolated abuses. They are intended outcomes of a system targeting the cultural, linguistic, and religious fabric of a people. Under the UN Convention on the Prevention and Punishment of the Crime of Genocide, the destruction of a group’s cultural and spiritual life constitutes an element of genocide (United Nations, 1948). By that measure, IJOP’s digital infrastructure is not merely an accessory to repression. It is a central tool in dismantling Uyghur identity.
What makes this infrastructure particularly dangerous is its combination of physical control with narrative manipulation. In my research on digital Potemkin villages, I documented how the erasure of Uyghur voices online is paired with state-sponsored influencer campaigns that portray East Turkestan as a peaceful and happy tourist destination. This substitution, replacing testimony with tourism, memory with myth, allows the state to deny atrocities even as they occur. In the logic of IJOP, the Uyghur who disappears from the street also disappears from the internet, replaced by an image of a compliant citizen or a sanitized cultural performance (Human Rights Watch, 2019).
The danger is not contained to East Turkestan. China’s surveillance technologies have already been sold to governments in Central Asia, Africa, and Latin America (Wall Street Journal, 2023). The combination of biometric tracking, AI-driven policing, and centralized data fusion offers authoritarian leaders an irresistible tool to criminalize dissent preemptively, strip populations of privacy, and enact cultural erasure under the guise of “stability” or “development” (Axios, 2022). If adopted elsewhere, this model could normalize automated repression globally.
There is also a feedback loop in the technology itself. The algorithms powering IJOP and its facial recognition systems are trained on biased datasets that disproportionately flag Muslim identity markers as suspicious (Amnesty International, 2022). These biases persist when the technology is exported, embedding discriminatory assumptions in AI systems. Digital erasure, as I argue in my dissertation, is not only about absence. It is about substitution, where AI extrapolates from what is visible online. When state violence has already silenced certain voices and narratives on the internet, AI-driven systems reinforce the erasures that have shaped their data.
We are entering an era in which automated moderation, predictive policing, and biometric surveillance converge into a single infrastructure of repression. For Uyghurs, that convergence has already brought devastating consequences: loss of freedom, the hollowing of cultural life, and the rewriting of history. For the rest of the world, it presents a choice. East Turkestan can be treated as a warning, a case study in how technology enables systematic genocide, or ignored, allowing the model to spread unchallenged to other authoritarian contexts.
The story of IJOP is not only about what is being done to Uyghurs; it is about the kind of future that becomes possible when such systems are normalized. If we fail to confront this reality, we risk inheriting a world in which surveillance is permanent, erasure is automated, and cultural extinction is just another application of technology (Human Rights Watch, 2019).
References
Amnesty International. (2022). China: Mass surveillance in Xinjiang and the digital erasure of Uyghurs. https://www.amnesty.org/en/latest/research/2022/03/china-uyghur-surveillance
Australian Strategic Policy Institute. (n.d.). How mass surveillance works in Xinjiang. https://xjdp.aspi.org.au/explainers/how-mass-surveillance-works-in-xinjiang/
Axios. (2022, June 14). Report: Hikvision cameras help Xinjiang police ensnare Uyghurs. https://www.axios.com/2022/06/14/report-hikvision-cameras-xinjiang-police-uyghurs
Human Rights Watch. (2017, December 13). China: Minority region collects DNA from millions. https://www.hrw.org/news/2017/12/13/china-minority-region-collects-dna-millions
Human Rights Watch. (2019, May 1). China’s algorithms of repression: Reverse-engineering a Xinjiang police mass surveillance app. https://www.hrw.org/report/2019/05/01/chinas-algorithms-repression/reverse-engineering-xinjiang-police-mass
U.S. Department of Labor. (2024, November 6). Uyghur and Turkic Muslims: Forced labour in China. https://hansard.parliament.uk/commons/2024-11-06/debates/7C4D516D-FC4F-4C8A-AC19-5BEC8D7769D8/UyghurAndTurkicMuslimsForcedLabourInChina
United Nations. (1948). Convention on the Prevention and Punishment of the Crime of Genocide. https://www.un.org/en/genocideprevention/documents/
Wall Street Journal. (2023, September 13). Canada reviewing request to sanction Hikvision, other Chinese surveillance companies. https://www.wsj.com/articles/canada-reviewing-request-to-sanction-hikvision-other-chinese-surveillance-companies-2e5d9cde
Ifat Gazia
Ifat Gazia is a Kashmiri Muslim researcher currently pursuing her Ph.D. in Communication at UMass Amherst. Her research focuses on Kashmir, Palestine, and East Turkestan, and her work broadly examines the intersection of technology and social justice...