Does a tracking system that makes laws more enforceable actually improve society?
Shazeda Ahmed is juggling four research papers, involving multiple collaborators and data sets, from the confines of her San Francisco home, with the subject of her research roughly 6,500 miles away. A PhD candidate at UC Berkeley’s School of Information, Ahmed studies the social effects of technology. In her role as a Stanford HAI and Center for International Security and Cooperation fellow, she is specifically examining how technology firms and the Chinese government are constructing China’s social credit system: an initiative to build databases and information-sharing procedures that monitor the behavior of individuals, corporations, legal institutions, and government representatives, with the end goal of a society in which individuals and corporations follow the law.
To that end, the government creates dossiers on whether individuals and corporations comply with the law. Law-breakers can be publicly shamed through court-issued blacklists, and these blacklists can be shared with tech companies to further punish the people on them.
Here Ahmed shares her research findings and what drew her to this field.
How do you explain what the social credit system is?
Ahmed: There is a lot of misinformation and myth surrounding it, but at its core it is a system the government has set up to make laws more enforceable. Not criminal laws, but administrative laws, such as failing to pay taxes and then failing to pay the fines that a judge issues as a penalty for tax evasion.
The courts create blacklists of individuals who are able to comply but still choose not to fulfill court orders or stop their law-breaking behavior. For example, if someone is sued for violating administrative law and doesn’t show up to court, and the judge levies a fine and the person doesn’t pay, they can go on a blacklist. The courts share the names of these people, who are referred to as “judgment defaulters,” with tech companies when a punishment involves a service that the tech company provides. For example, you might not be allowed to purchase a certain item or use a certain service.
There are scores of blacklists, not just a single one. Being on one can mean you are not allowed to enroll your children in private school, ride high-speed rail, or fly, because under Chinese law these are considered forms of “luxury consumption.” The lists put pressure on people to change their behavior and get off the blacklists.
There are aspects to what you describe that many people would find troubling. How do you approach the research?
In doing the research interviews, I try to keep my opinions about the social credit system to myself. One of the myths I hear is, “China has a social credit system where everyone gets a score of how moral they are.” I don’t know how some of these myths originated, but I know they’re not true. That’s not to say there aren’t troubling elements, but there is no single score that determines your rank in Chinese society, and no talk of one being created. For example, cities are creating their own local scoring systems that look at data like tax and social insurance payment history and offer rewards like faster processing of state-issued licenses, but most people who live in these cities aren’t even aware of their local scoring platforms.
How are technology companies involved in the process of creating these lists?
Technology companies don’t create the blacklists; they just act on them. The courts generate the lists and share them with companies through a mechanism I’m still trying to figure out. The system restricts certain types of consumption and notifies the user: “This is why you can’t buy [plane tickets, high-speed rail tickets, etc.] at this time.” Some courts have specific arrangements with tech companies, like WeChat, to publicize those blacklists. Sometimes it’s about public shaming. Part of my dissertation is trying to figure out how the whole thing works, and where the tension lies: where it maybe goes too far and what rights it might harm.
One city in China partnered with Douyin (the name the short-video platform TikTok goes by in China) to run ads showing images of people on the city’s intermediate court’s blacklists, asking viewers for information about them on the court’s behalf, and even offering reward money. In China, a discussion is starting to emerge: “What if this is published after you’ve already fulfilled your court obligations?” There are conversations about reputational harm that are still very niche, but the mainstream discussion is, “Oh my gosh, it’s working.” This kind of practice gets justified as a solution to the complaint that “enforcement [of court orders] is hard.”
Alibaba (the e-commerce company) is trying to vertically integrate itself into the court system. It also runs court auctions for several courts around China: when a court seizes property, such as land or office supplies or cars, Alibaba auctions it off.
For the Chinese government, what is this about?
It’s about legitimacy. The state is making people face consequences, and alongside the punishments they’re trying to create a reward system. In some places, if you have a history of winning awards for volunteer service, donating blood, or giving to charity, there’s talk of finding ways to reward that. The rewards are still rudimentary, and they haven’t been rolled out on a national level. The best you get is, for example, fast-tracked processing of government documents and licenses. It’s far more about punishment than rewards. It’s a naming-and-shaming scenario in which they hope to create pressure on people to obey laws.
Does this system work?
I do qualitative work and only recently received ethical review board permission to talk to people on blacklists, so the answer is: I don’t know, and I probably won’t be able to find out until it’s safe to return to China for research. Anecdotally, I’ve seen news stories in which people say it’s ostracizing, especially in less densely populated areas. For example, courts may publicize blacklist mugshots before a screening at a movie theater, and the people in those mugshots, or their family or friends, may end up seeing them.
There are all kinds of unanswered questions about those people that nobody is talking about. I’ve heard professors in China point out that the government hasn’t done a cost-benefit analysis, or asked what counts as a cost when someone is placed on a list in error.
In addition to your work on the social credit system, what else are you focusing on right now?
A colleague and I are doing a big sweep of Chinese-language sources on emotion recognition technology, which detects and identifies emotion from faces, and on multi-modal emotion recognition, which identifies emotion from signals such as tone of voice, body movement, and gait. The basic takeaway right now is that it’s not enormously widespread in China, but it’s being discussed as the next step after facial recognition: “Why wouldn’t we do this to help us catch criminals or find out who’s depressed or not paying attention in class?” But there’s limited discussion of the privacy and civil rights implications.
It’s not just China that’s pursuing emotion recognition. Many researchers started this work in the U.S. decades ago, and part of the reason I got interested is that I have colleagues at Berkeley who work on emotion recognition and its proposed uses. Even a few years ago, my colleagues were questioning what harms could result from its use by, for example, the TSA (Transportation Security Administration). The reason to look at China is its resources: if its technology sector decides this is worth pursuing, it has a market for it at home and can easily export the technology. It’s worth studying what precedent China will set in other places and what social impacts the technology might have.
How did you come to focus your work on China?
In high school, in the Bronx, I wanted to learn Japanese, but the class was always full. I read somewhere that Mandarin uses the same characters, so I started studying that, and I loved it and kept studying in college. That’s not to say I haven’t struggled with language barriers, but I’ve never had a total communication breakdown. I’ve never struggled to express myself in big cities in China, but I have had other struggles with bureaucracy and with doing research as a foreigner in more rural areas.
In China, I stick out as a South Asian. Nobody has ever made offensive remarks, but they ask, “Are you really American? How does that work?” Many people I studied Mandarin with were drawn to China by its economic opportunities, but for me, fascination with the language is what ultimately made China the place I ended up pursuing research.
Stanford HAI's mission is to advance AI research, education, policy and practice to improve the human condition.