I’m kind of lumping a few things together in this post that seem related in my mind, but maybe not in yours. The one thing they do have in common is that they all involve some understanding of basic social science, an understanding that appears to have been lacking, judging by the results.
For decades now, highways have been built based on traffic studies that tried to predict traffic patterns going out 20 years. Obviously, accounting for all of the possible changes over the next twenty years would be almost impossible, but what if I told you there was one they didn’t even try to account for?
“It’s not so much about the measurement being wrong, it’s that the whole underlying thesis is wrong,” said University of Connecticut professor Norman Garrick. “You’re not thinking about how people behave and how they’re using the system. You’re just saying this is how it happened in the past [and] this is how it will happen in the future, even though you’re injecting this big change into the system.”
That’s right, the algorithm in use doesn’t even account for the fact that a huge change, like building a new highway, or not building one, might change people’s behavior. When there is plenty of highway capacity, people feel free to move further away, generating more traffic and, eventually, congestion all over again. And if you turn a freeway into a toll road, people won’t simply keep driving the same amount on it while other roads stay less traveled; they’ll reroute, or skip some trips entirely if the drive takes too long.
All of this seems pretty obvious, and yet it wasn’t included. I assume that’s because it was also unpredictable. Over the course of 20 years, things change.
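The missing feedback loop is easy to see in a toy model. This is purely illustrative (the function names, growth rate, and `induced_factor` are all invented for the sketch, not anything from an actual traffic study): the naive forecast just extrapolates past growth, while the second version lets spare capacity induce new trips, the way new lanes fill themselves.

```python
def naive_forecast(base_trips, growth_rate, years):
    """Extrapolate past growth forward, ignoring behavioral feedback.
    This is roughly the 'how it happened in the past is how it will
    happen in the future' approach."""
    return base_trips * (1 + growth_rate) ** years

def feedback_forecast(base_trips, growth_rate, years, capacity,
                      induced_factor=0.3):
    """Each year, some fraction of the spare capacity induces extra
    trips (people move farther out, take drives they'd otherwise skip),
    while congestion caps total demand at capacity."""
    trips = base_trips
    for _ in range(years):
        trips *= 1 + growth_rate
        spare = max(0.0, capacity - trips)
        trips += induced_factor * spare  # new capacity fills itself
    return min(trips, capacity)
```

With any reasonable numbers, the feedback version climbs toward the road’s capacity far faster than the straight-line extrapolation predicts, which is the whole point: the forecast changes once you admit that people react to the thing you built.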
Not all that different from the “great” Facebook algorithm, which starts showing me posts on day 31 after a 30-day snooze of someone I follow, all at once at the top of my feed.
Think about that, I literally just told the algorithm that I’m not interested in seeing what this person has to say for 30 days. I’m that disengaged from their posts right now. And immediately after that, it decides that the most important thing is to show me what I missed while I was ignoring it.
That makes no sense. It’s almost as if the algorithm simply pays no attention to how I’ve chosen to use the platform’s own filtering tools.
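The fix is not complicated. Here’s a hypothetical sketch (the post shapes, the `rank_posts` function, and the snooze table are all invented for illustration; this is not Facebook’s actual code) of a feed that respects what the snooze actually told it:

```python
from datetime import datetime

def rank_posts(posts, snoozes):
    """Drop posts published while their author was snoozed, rather than
    batching them to the top of the feed the day the snooze expires.

    posts:   list of {"author": str, "time": datetime}
    snoozes: {author: (snooze_start, snooze_end)}
    """
    visible = []
    for post in posts:
        window = snoozes.get(post["author"])
        if window and window[0] <= post["time"] <= window[1]:
            continue  # I said I wasn't interested; stay buried on day 31
        visible.append(post)
    # newest first, with no special resurfacing of snoozed-away posts
    return sorted(visible, key=lambda p: p["time"], reverse=True)
```

The design choice is the point: a snooze is a signal about the past 30 days of content, not just a 30-day timer on delivery.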
This last example is similar, but not the same. Rather than simply forgetting to account for people making their own decisions, it’s an example of taking one input, making decisions based on it, and ignoring how those decisions then change people down the road.
Similar to the idea that if we build a new road and ease congestion, people feel free to drive more, we have to take a look at what we tell people about the control of their own lives.
On a recent episode of No Stupid Questions, the first half of the show relates to this question of growth mindset, and whether success is within our own control.
In the episode, it becomes very clear that there is a huge benefit to looking at life and goals with a growth mindset, believing that success, or grit, is within our own internal locus of control. That, with enough effort, willpower, etc., we can reach any particular goal. But, it’s also not so simple. There’s a bit of a downside to it when it comes to dealing with other people who are not as successful as we might be.
As the experiment goes, the kids who went into a test believing the outcome was within their own control, with a toy as the reward, were indeed more likely to achieve the result. But there was a significant difference when it came to sharing that toy. They might share it with kids who weren’t part of the test, but if there was any ambiguity about whether the kids asking to share had possibly been part of it, they did not share. Clearly, the thought was: if I achieved this and you didn’t, that’s your fault, and I don’t need to share with you.
This might be a bit problematic when it comes to real, adult life. It leads to what we commonly refer to as the “pulling yourself up by the bootstraps” mentality, where we don’t recognize that there are life situations, and systems, that make it much harder, if not impossible, for some people to achieve the same goals.
If you spend any time on social media, you’ve seen this mentality, either when it comes to social justice issues, or even in the “positivity” movement. It leaves no room for the possibility that maybe, just maybe, the reality is that there’s a whole lot we do not control.
I’ve written about this before on my other site because this belief that we, in connection with fate, the universe, our higher power, etc. control our futures, is an incredibly damaging thing to believe when someone is actually a victim.
And this really brings me back to my point, that we do a poor job of truly understanding science, statistics, and cause and effect. We believe that algorithms have all been well-thought-out, and produce a “true” result, even when they are trying to predict something as unpredictable as what traffic will look like 20 years from now. We assume social science studies are giving us the “right” answer for how to educate people or train them for the best outcomes, without considering what we are teaching them about the larger world. We assume that we can tweak one belief, or one thing, without human beings reacting to those changes in unpredictable ways, all the while thinking our one change will cause the reaction we DO predict.
We assume a lot that should never be assumed. We over-simplify a world that actually has more influences than we can possibly account for, and assume that what is really a small statistical difference represents universal truth.
It doesn’t. There are no simple answers. It takes hard work, hard discussions, and lots of listening to figure out the best way forward. Don’t wait for AI to tell you what to do; it may be missing quite a bit.