Article: Thursday 15 January
Imagine entering a room knowing that everyone can see your social credit score – the single number that is meant to summarise your social behaviour, trustworthiness and moral standing. What are the effects? In a new paper, ‘Social credit scores reduce interpersonal cooperation and trust’, published recently in PLOS One, behavioural scientist Dr Alexander Genevsky of Rotterdam School of Management, Erasmus University (RSM) shows that social credit systems reduce trust, undermine cooperation and disproportionately disadvantage people whose circumstances mean they start with a lower score. Rather than strengthening communities, these systems make people less willing to work together.
Social credit systems may sound like something from dystopian fiction about the dark side of technology and its impact on human nature, but they are no longer speculative. Formal social credit systems already affect more than a billion people worldwide. China’s nationwide system is the best-known example, but Genevsky stresses that Western societies are moving in the same direction, albeit in more fragmented ways.
Insurance companies can legally use social media data to profile customers and adjust premiums; visitors to bars and public venues are screened and their ID scans shared through centralised databases; and governments increasingly use algorithmic risk models, sometimes incorporating sensitive demographic characteristics, to detect fraud. In all these cases, past behaviour is turned into a score that determines future opportunities. The justification is familiar: transparency creates trust and encourages cooperation. But as Genevsky puts it, “that assumption is rarely tested empirically.” His research set out to do exactly that.
Genevsky examined how the availability of social credit scores affects trust, cooperation and social perception using three preregistered experiments involving more than 2,400 participants.
The first experiment measured cooperation: participants played a ‘public goods’ game, an economic experiment in which individuals choose between self-interest (keeping money) and group benefit (contributing to a shared pool that is multiplied and redistributed). Participants contributed more to the shared pot when no social credit scores were visible, but as soon as scores were introduced, overall cooperation declined. Strikingly, participants with higher scores became less cooperative, particularly when paired with lower-scoring partners.
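To make the incentive structure concrete, here is a minimal sketch of one round of a public goods game in Python. The endowment, multiplier and group size are illustrative assumptions, not the parameters used in the study.

```python
# Minimal sketch of one public goods round; the endowment and multiplier
# are illustrative assumptions, not the study's actual parameters.

def public_goods_payoffs(contributions, endowment=20, multiplier=1.6):
    """Return each player's payoff for one round.

    Each player keeps (endowment - contribution) and receives an equal
    share of the contributed pool after it is multiplied.
    """
    n = len(contributions)
    share = multiplier * sum(contributions) / n   # pooled, multiplied, split evenly
    return [endowment - c + share for c in contributions]

# Everyone contributing beats everyone keeping their money...
print(public_goods_payoffs([20, 20, 20, 20]))   # [32.0, 32.0, 32.0, 32.0]
# ...but a lone free-rider does better still, at the cooperators' expense.
print(public_goods_payoffs([0, 20, 20, 20]))    # [44.0, 24.0, 24.0, 24.0]
```

The tension the game creates is visible in the two example rounds: the group is best off when everyone contributes, but each individual is best off free-riding on the contributions of others.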
The second experiment focused on interpersonal trust using a trust game. Participants had to decide how much to entrust to a partner, knowing the partner could return some or all of it. Again, the presence of social credit scores reduced trust. People sent less when scores were visible, and low-score individuals received significantly less in return, even when their behaviour was identical to that of higher-scoring partners.
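The trust game’s payoff logic can be sketched the same way. The endowment and the convention that the entrusted amount is tripled in transit are standard assumptions for this paradigm, not necessarily the exact design of Genevsky’s experiment.

```python
# Minimal sketch of a trust game exchange; the endowment and the tripling
# of the transfer are standard conventions assumed here, not necessarily
# the exact design used in the study.

def trust_game(sent, returned, endowment=10, multiplier=3):
    """Return (sender_payoff, receiver_payoff) for one exchange."""
    received = multiplier * sent                 # the entrusted amount grows in transit
    assert 0 <= sent <= endowment and 0 <= returned <= received
    sender_payoff = endowment - sent + returned  # keeps the rest, plus any repayment
    receiver_payoff = received - returned
    return sender_payoff, receiver_payoff

# Full trust repaid halfway leaves both parties better off than no trust at all.
print(trust_game(sent=10, returned=15))   # (15, 15)
print(trust_game(sent=0, returned=0))     # (10, 0)
```

Because both parties gain when trust is extended and honoured, any factor that suppresses the amount sent – such as a visible low score – leaves measurable value on the table.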
The third experiment examined perceptions and partner choice. Participants evaluated others on traits such as trustworthiness, warmth and desirability as a future partner. Here, the effect of scoring was most persistent. Even when low-score individuals behaved generously, they were judged more harshly than high-score individuals who behaved in exactly the same way. Negative impressions proved resistant to correction, even when contradicted by clear behavioural evidence.
Together, the findings paint a troubling picture. Social credit systems do not merely reflect behaviour; they actively shape how people interpret it. Scores become cognitive shortcuts that override direct experience. Once someone is labelled as ‘low score’, that label sticks.
Crucially, the costs are not evenly distributed. Individuals with lower scores, who often come from already disadvantaged backgrounds, face a double burden. They are trusted less, rewarded less and struggle to recover their reputation, even when they act generously. In some cases, they contributed more yet received less in return. Rather than promoting fairness, these social credit systems risk reinforcing existing inequalities and creating self-fulfilling cycles of exclusion.
Although the experiments were conducted in controlled settings, the implications of their results extend far beyond the laboratory. This research raises urgent questions for policymakers, regulators and industry leaders as algorithmic reputation systems spread across public services, digital platforms and commercial markets. For policymakers, the findings suggest that efficiency gains from scoring systems may come at the cost of social cohesion and fairness. For businesses, especially those relying on ratings, trust scores or behavioural profiling, the research highlights the risk of embedding bias into systems that appear neutral on the surface.
Genevsky does not argue against all forms of reputation. Informal trust built through repeated interaction can support cooperation. But formal, centralised and quantified systems operate differently. By reducing complex human behaviour to a single number, they distort judgement in ways that are difficult to undo.
Dr Genevsky summarised his concerns about the effects of using algorithms to measure reputation.
Read Alexander Genevsky’s open-access research article, ‘Social credit scores reduce interpersonal cooperation and trust’, in PLOS One.
Rotterdam School of Management, Erasmus University (RSM) is one of Europe’s top-ranked business schools. RSM provides ground-breaking research and education furthering excellence in all aspects of management and is based in the international port city of Rotterdam – a vital nexus of business, logistics and trade. RSM’s primary focus is on developing business leaders with international careers who can become a force for positive change by carrying their innovative mindset into a sustainable future. Our first-class range of bachelor, master, MBA, PhD and executive programmes encourages them to become critical, creative, caring and collaborative thinkers and doers.