As health care predictive algorithms, including those behind Prescription Drug Monitoring Programs (PDMPs), expand their reach beyond the traditional enforcement of health care regulations, the concept of a social credit score has emerged as a powerful but controversial tool. Like predictive policing, social credit scores draw on an individual’s behaviors, online activity, and social interactions to assess perceived trustworthiness and risk to society, and they increasingly serve as fodder for the prosecution of health care providers.
A notable example involves Dr. Muhamad Aly Rifai, whose case was highlighted in a November 14, 2022, press release from the United States Department of Justice (DOJ). The DOJ’s claims centered on Medicare fraud allegations, casting Dr. Rifai as a significant figure in health care fraud. Such public statements can have lasting effects on the reputations of health care providers, much as financial credit scores shape personal financial standing. Although the DOJ’s stated intention is to protect public resources, the consequences of these statements are often disproportionate and damaging.
The DOJ has a long tradition of using “social scandal” in press releases touting its prosecutions of U.S. physicians. With some degree of success, it has capitalized on social-credit-style scoring to increase pressure on physicians accused of malfeasance related to health care services. Once a scandalous DOJ press release is launched onto the World Wide Web, it remains there forever, together with every unjust and untrue allegation it contains: “Lehigh Valley psychiatrist stole from Medicare,” the DOJ proclaimed in the November 14, 2022, press release, as if Muhamad Aly Rifai, MD, were the sole reason the Medicare system will become insolvent in the next few years. Such pronouncements and allegations sully the social credit score of the unjustly accused, much as bankruptcy from medical debt destroys the financial credit score of an American today. Social credit scores are often touted as a way to enhance public safety on the claim that they predict who is most likely to commit a crime. That claim is deeply flawed: these systems are far more adept at predicting who is most likely to be indicted or arrested than at identifying actual criminal behavior, a critical distinction that blurs the line between prevention and profiling, between justice and surveillance.
At its core, a social credit score is a data-driven profile of an individual, incorporating information from various sources, including financial records, social media activity, and even one’s friends or neighborhood. This data is used to flag “risky” behavior with the aim of predicting who might violate laws or social norms in the future. In United States v. Rajendra Bothra and his colleagues, the DOJ’s tactic was to damage the physicians’ social credit scores and impugn their trustworthiness and credibility at trial (if they ever reached one). Dr. Bothra, a well-known international philanthropist who had courted Mother Teresa and Pope John Paul II, was a figure beyond reproach. In its effort to deny him bail, the DOJ insisted that he would abscond to India, filing numerous briefs and motions that used social media information, statistics, and profiles of people who had fled justice to convince the courts to keep him imprisoned. The case ultimately collapsed, and Dr. Bothra was found not guilty after three and a half years of imprisonment. He and his colleagues had been portrayed as the root cause of deaths from the opioid and fentanyl epidemics, accused of “flooding the streets of Detroit with drugs.”
The U.S. DOJ’s utilization of social credit scores parallels practices in countries like the People’s Republic of China, where such scores are used to monitor everything from financial habits to jaywalking. In both countries, the system is presented as a way to promote lawful behavior and improve societal trust. The traditional press and media, which in China serve as an instrument of the government, have likewise aided the U.S. DOJ in executing such unfair systems, which conflate the risk of committing a crime with the risk of being arrested. Dr. David Lewis, an African American physician indicted alongside Dr. Bothra, was portrayed in the media as a “thug” through descriptions of how he bought “Gucci shoes” and owned a prized “antique 1965 Rolls Royce.” The problem is that computer algorithms trained on historical arrest data often reflect biased enforcement patterns, not objective criminality. Individuals from historically marginalized groups, for example, are more likely to be flagged as “high-risk” in these systems, not because they are more likely to commit crimes, but because they are more likely to encounter the criminal justice system. Another physician, of Indian ethnicity, was accused by the DOJ in his indictment of owning a 20,000-square-foot mansion. This points to a crucial distinction that many miss: when an artificial intelligence algorithm says someone is “likely to commit a crime,” it may really be saying that the person is more likely to be policed and, thus, more likely to be arrested.
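The claim above, that a model trained on arrest records learns enforcement patterns rather than underlying behavior, can be illustrated with a minimal simulation. The sketch below is not drawn from any real dataset or case; the group names, rates, and the simple two-factor arrest model are hypothetical assumptions chosen only to make the point concrete.

```python
# Illustrative sketch (hypothetical numbers): why a "risk score" fit to arrest
# records tracks policing intensity rather than underlying behavior.
import random

random.seed(0)

# Two hypothetical groups with IDENTICAL underlying rates of the behavior being
# "predicted," but very different levels of enforcement scrutiny.
groups = {
    "heavily_policed": {"true_behavior_rate": 0.05, "enforcement_rate": 0.90},
    "lightly_policed": {"true_behavior_rate": 0.05, "enforcement_rate": 0.10},
}

def simulate(n_people=100_000):
    """Return per-group arrest rates when an arrest requires both the behavior AND scrutiny."""
    arrest_rates = {}
    for name, g in groups.items():
        arrests = 0
        for _ in range(n_people):
            behaved_unlawfully = random.random() < g["true_behavior_rate"]
            was_scrutinized = random.random() < g["enforcement_rate"]
            if behaved_unlawfully and was_scrutinized:
                arrests += 1
        arrest_rates[name] = arrests / n_people
    return arrest_rates

for name, rate in simulate().items():
    print(f"{name}: observed arrest rate = {rate:.3%}")
# Any score trained on these arrest labels will rank the heavily policed group
# roughly nine times "riskier," even though both groups engage in the behavior
# at exactly the same rate.
```

Under these assumed numbers, the two groups behave identically, yet the recorded arrest rates, and therefore any score trained on them, differ by nearly an order of magnitude: the data measure who is watched, not who offends.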
Predictive policing and health care social credit scores also suffer from the alignment problem: they are frequently misaligned with the values of fairness and justice, reflecting entrenched inequalities rather than delivering unbiased assessments. A fellow physician from Puerto Rico, Dr. Antonio Reyes Vizcarrondo, was accused of false billing after a staffing company billed on his behalf for the same services he had already billed, and of using the proceeds “for his pleasures and expenses.” After he had languished under indictment for close to five years, the government dropped the charges on the eve of trial, having discovered that the staffing company had no authorization to bill on his behalf. Because social credit scores rely on past data, they perpetuate the same biases found in traditional criminal justice practices, disproportionately labeling individuals from certain racial, ethnic, or economic backgrounds as risky. Even benign circumstances, such as having a low credit score or associating with individuals deemed “high-risk,” can lower one’s social credit score, triggering increased scrutiny or penalties such as restricted access to health care services or heightened surveillance. These outcomes are tied not to actual criminal behavior but to a perception that certain individuals or communities are inherently more suspicious. The confusion between predicting who will commit a crime and who will be arrested mirrors the dynamics detailed in the 2020 Netflix documentary Coded Bias. These biases have translated into the targeting of physicians such as Dr. Muhamad Aly Rifai, Dr. Neil Anand, Dr. Lesly Pompy, Dr. Lonnie Joseph Parker, and Dr. Felix Brizuela. Unlike in the movies, however, today’s systems do not have clairvoyance; they have data, and that data is flawed, and the system is rigged.
A system built on flawed data not only leads to the unjust, biased prosecution of minority physicians but also targets minority patients, the “undesired” and “deplorable,” including racial and ethnic minorities, children, and the elderly. In USA v. Muhamad Aly Rifai, the prosecution claimed that psychiatric services provided lawfully and accurately to supposedly dispensable elderly patients in rural Pennsylvania were “not medically necessary,” despite severe psychiatric illness and suicidal ideation. In USA v. Bothra, the DOJ called 25,000 minority patients, inner-city Detroit residents with significant disabilities, “addicts,” even though many of them had longstanding chronic pain from true physical disease.
The widespread adoption of predictive tools like social credit scores, predictive policing, and PDMP algorithms is changing how societies define criminality and risk. But by confusing who is most likely to commit a crime with who is most likely to be arrested, these systems are not making communities safer. Instead, they are amplifying systemic biases and entrenching the surveillance and punishment of already marginalized populations. If we are to truly create fair and just societies, we must recognize the difference between predicting crime and predicting arrest and reform these systems before they become tools of institutionalized exclusion.
Muhamad Aly Rifai is a practicing internist and psychiatrist in the Greater Lehigh Valley, Pennsylvania. He is the CEO, chief psychiatrist and internist of Blue Mountain Psychiatry. He holds the Lehigh Valley Endowed Chair of Addiction Medicine. Dr. Rifai is board-certified in internal medicine, psychiatry, addiction medicine, and psychosomatic medicine. He is a fellow of the American College of Physicians, the Academy of Psychosomatic Medicine, and the American Psychiatric Association. He is the former president of the Lehigh Valley Psychiatric Society.
He can be reached on LinkedIn, Facebook, X @muhamadalyrifai, YouTube, and his website. You can also read his Wikipedia entry and publications.
Neil Anand is an anesthesiologist.