The excesses of wokeness are thankfully getting their overdue correction, but corrections have a way of overshooting. A case in point comes from the usually reasonable writer Wesley Yang on Twitter, who posted the following:
Poor long-term immigration policy has apparently led to too many foreign-born and native criminals of foreign descent in Norway, a disproportionate number of whom, per the chart, hail from Somalia and Morocco. Yang argues that using this data to inform Norway’s immigration policies going forward is both unproblematic and “no longer prejudice.”
This is a sleight of hand. You can advocate for designing prejudicial immigration policies based on the group to which an individual potential immigrant belongs, but you cannot both do so and pretend it is not prejudicial.
This is not a point about what countries should or should not DO. They are free, IMO, to design immigration policies as they see fit, and there are arguments to be had over that. But there is a reactionary blindness wherein people pretend that grouping individuals by race, religion or country of origin isn't prejudicial to those individuals, who have no control over how others in their assigned group behave.
The grouping is justified, in their view, by "data" or "statistics" or "reason." The truth is that most of us still view "prejudice" as negative, so no one advocating for discriminatory immigration policies wants to be considered prejudiced. They are for the policy, they think it's good, and therefore they try to pretend it's not prejudicial when it obviously is.
Don't pretend. If you're for prejudicial, discriminatory immigration policies "for the greater good", just advocate for them on those terms. Be honest about what you're saying.
This might seem like a trivial point on which to base a post, but I don’t think it is. It’s not because “prejudice is bad” or “racism!” That’s just one instance in which this midwit tendency rears its terrible head: assigning individuals to a group and then pretending you are being “data-driven.” I remember a couple of years ago, people on Twitter accused me of “taking up an ICU bed” because I refused to inject myself with Pfizer’s latest. Their reasoning was similar: because the unvaccinated are more likely to take up ICU beds (which turned out to be a lie, but let’s assume it was true), I was in the *group* taking up more beds, therefore I was taking up a bed.
The trick is to assign someone to a group for which they qualify (Somali, unvaccinated, etc.), get “data” about that group, and then apply that data to the individuals within it, even if it is totally inaccurate with respect to any given one of them.
This is the same trick used to justify disastrous DEI policies. You are from a historically underprivileged group, so you deserve to be held to lower standards. It doesn’t matter if you would succeed on your own merit, it doesn’t matter that your race, gender, and sexual orientation are irrelevant to the task for which you are being hired, it doesn’t matter if you are the son of wealthy and famous people and attended all the best schools: you belong to a particular category, and therefore this standard will be applied.
But the implications of this midwit sleight of hand are even worse than that, especially on the eve of AI-based systems taking over so many of the administrative functions at both the corporate and state level. You will be categorized, and you will be pre-judged on that basis, because it is far more efficient for the system to do so than to examine everyone individually. In fact, I’d argue that, unlike humans, AI is incapable of seeing individuals; it sees only “data.”
This was the premise of the movie Minority Report, set in a dystopian future wherein people get convicted of “pre-crime” because the system determined, from the data about them, that they were likely to commit a crime. You might not have done anything *yet*, but based on your angry social media posts, your high testosterone, your age and background, you have a 99.9 percent chance of committing violence. Why not arrest you now, before you victimize someone? For every 1000 victims we protect, we’ll convict only one person who would not in fact have done it!
There will be many people who are for this kind of reasoning. It’s “data,” after all. Never mind that the data can be biased by the grouping itself. Why am I grouped with other people I don’t know and over whom I have no control? What if they added 100 other parameters that showed I’m among the least likely to commit violence? Who is selecting these parameters, and what are their agendas?
But more to the point, such a society is not free. You are at the whims of your grouping, of parameters, of statistics. The paradigm of individual civil liberties, of innocence until proven guilty, of responsibility only for your own actions, would be dead. You are pre-judged before you have done anything.
The second-order effect of such a paradigm would be a race to the bottom. If I am not in the wrong group, I can do no wrong; the favorable statistics of my peers cover for me. I can roam the streets and commit acts of vandalism and violence with impunity; I can loaf at work, not do my job, and never be fired. I would argue that under the present (and mercifully soon-to-be-departed) administration we’ve already experienced many of these effects.
So back to the original question: what is Norway to do with this “data”? One idea would be to have a non-discriminatory vetting process that involves delayed gratification and requires some diligence on the part of the prospective immigrant. This might self-sort those capable of living in an open, prosperous Western society from those who are not. You would be admitted or denied based on demonstrated suitability (merit), not on the group to which you were assigned (identity). This would be closer to the “postjudice” policy prescription Yang erroneously claims as his own in the cited post.
Whether that results in more people of one race or another, or one nationality or another, getting in isn't important. What's important is both not judging individuals on the basis of their assigned group and keeping the country free of violent criminals. Both goals are essential if we want to avoid dystopian outcomes.