Sunday, September 11, 2016

(Some of) What Technocrats Get Wrong

Slate ran an article this week about the newsy side of Facebook, and it's a reminder of so many reasons that technocrats are not to be trusted around education.

Facebook has been having trouble handling the news. Well, and history.


They censored the award-winning photo sometimes known as "napalm girl," an immediately recognizable Vietnam War photo, important both for its role in driving public opinion about the war and as a stunning record of the war's horrors. But of course the algorithm Facebook uses says that a naked girl = bad, so Facebook first got into a fight-by-deletion with a Norwegian news organization and the actual Norwegian prime minister before finally registering what a whole bunch of users were telling it and allowing the photo.

Facebook has also been having trouble with its bot-run trending news feature. On Friday it celebrated 9/11 by kicking a piece about how 9/11 was all faked to the top of its trending news. And that's only the latest way in which bot-managed news on Facebook has been... um... unimpressive.

Facebook's woes are reminders of some major flaws in technocrat thinking. If you get a well-constructed pipeline in place, the reasoning goes, and you set up an algorithm to run the pipeline, then you don't have to have any understanding of what is moving through that pipeline. This is the same kind of flawed reasoning that presumes that reading can be treated as a context-free set of skills, that reading skills are unrelated to the content of what is being read.

An algorithm can censor an important picture and promote a piece of junk writing because the algorithm does not grasp the context of either piece of "content."
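To make the point concrete, here's a toy sketch (in Python, and entirely hypothetical -- nothing like Facebook's actual code, and the labels and keywords are made up for illustration) of what a context-free rule looks like. It matches surface features and nothing else, so it has no way to know that one item is a historic war photo and the other is junk:

```python
# A toy, hypothetical sketch of a context-free content rule.
# It sees only surface features (labels, keywords), never what an item actually is.

BLOCKED_LABELS = {"nudity"}                      # hypothetical "policy violation" labels
ENGAGEMENT_KEYWORDS = {"truth", "exposed"}       # hypothetical "trending" signals

def context_free_filter(item):
    """Decide what to do with an item based on surface features alone."""
    if BLOCKED_LABELS & set(item["labels"]):
        return "censor"      # no notion of historical or journalistic importance
    if ENGAGEMENT_KEYWORDS & set(item["title"].lower().split()):
        return "promote"     # popularity stands in for accuracy
    return "pass"

# A famous war photo and a hoax story both get exactly the wrong treatment,
# because nothing in the rule knows the context of either.
items = [
    {"title": "The Terror of War (1972)", "labels": ["nudity", "news", "history"]},
    {"title": "The truth about 9/11 exposed", "labels": ["news"]},
]
for item in items:
    print(item["title"], "->", context_free_filter(item))
```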

What happens when we apply this kind of thinking to a school? We get a technocratic system, a pipeline through which students and educational content are supposed to just move, with no recognition of the context of either. The pipeline algorithm does not recognize relationships between anything and anything else; to the pipeline operators, it's all just a uniform stream of stuff, to be moved through the system according to the system's rules.

And yet at the end of the day, because systems and algorithms are stupid in a way that actual humans are not, it takes humans speaking up to say, "Hey, your system made a very bad choice" to keep the system from making terrible and stupid mistakes. The degree to which that human voice is silenced and disregarded is the degree to which the system will screw up. That, of course, is the problem we face in the education world; the system actually has teachers installed as gatekeepers at every significant point, but rather than depend on their judgment, systems technocrats are determined to silence the "noise" of teacher input, to stop the disruptions to the smooth-running system that occur every time a teacher speaks up to say, "Hey, this is not right."

That's because systems technocrats ultimately want to be responsible only for the system. Facebook does not want to admit it is a media company, because it doesn't want a media company's responsibilities. Uber doesn't want to be responsible for issues with its drivers and passengers. AirBnB doesn't want to be responsible for issues with its hosts and guests. They all just want to run a system, and their ultimate loyalty is to the system and not to the people who use it. "Hey, our system is working great-- if the results weren't that great for you, that's not our problem."

This approach is exactly wrong for a school, for education, for the growth and support of young humans. Removing human judgment from the system removes the system's ability to deal with the full range of human behavior, needs, and yes, screw-ups. It's no way to run education.

Bonus: Here is an absolutely magnificent rant in reply to Facebook's assertion that these sorts of human social problems are just, you know, too hard -- an excuse they never use for engineering problems. Also, Cory Doctorow's spin from that rant.



2 comments:

  1. Even if the programmers do take the content into account, how do they know that what they think is significant in terms of content is actually significant? The curriculum must remain open to review and many different people with many different perspectives and fields of expertise need to be involved in a constant discussion of what is significant, in terms of curriculum, and what is not.

    But the technocratic view is that someone can be certain about what needs to be taught and that person can simply devise a curriculum/computer program that can teach it. Their idea is that knowledge is static. This is a very facile view of what counts as knowledge in many, many fields and disciplines.

  2. This is related to the infatuation with the ideal of objectivity. It unwittingly exalts dumb processes. Because humans are not to be trusted. Unless they have a lot of money, of course.
