Kafka’s Nightmare Emerges: China’s “Social Credit Score”
China is creating Kafka’s nightmare world: the perfection of centralized control over its citizenry.
China is rapidly building out a Total Surveillance State on a scale that far surpasses any government surveillance program in the West. The scope of this surveillance is so broad and pervasive that it borders on science fiction:
- Life Inside China’s Total Surveillance State (8-minute video)
- China Aims For Near-Total Surveillance, Including in People’s Homes (the “Sharp Eyes” nationwide surveillance network)
- “You’re Being Controlled All The Time” – An Inside Look At China’s “Social Credit Score”
- China Assigns Every Citizen A ‘Social Credit Score’ To Identify Who Is And Isn’t Trustworthy
It’s well known that America’s intelligence agencies seek what’s called Total Information Awareness, the goal being to identify and disrupt terrorists before they can strike.
This level of surveillance has run partly aground on civil liberties concerns, which still have a fragile hold on the American psyche and culture.
The implicit goal of China’s Total Surveillance State is to control the citizenry and root out any dissent before it threatens The Communist Party’s hold on power, but the explicit goal is a behavioral psychologist’s dream: to reward “positive social behaviors” and punish “negative social behaviors” via a “Social Credit Score.”
There is something breathtakingly appealing to anyone in a position of power about this goal: imagine being able to catch miscreants who smoke in no-smoking zones, who jaywalk, who cheat people online, and of course, who say something negative about those in power.
But let’s ask a simple question of China’s vast surveillance system: what happens when it’s wrong? What if one of those thousands of cameras misidentifies a citizen breaking some minor social code, and does so often enough over time to trigger negative consequences for the innocent citizen?
What recourse does the citizen have? It appears the answer is none: the process is not, strictly speaking, judicial, and the system is largely automated.
Here’s a second question: is the scoring system truly transparent, or can insiders place their thumbs on the scale, so to speak, to exact revenge on personal enemies?