There’s free speech and the question of free will, but lately I’ve been wondering how free our thoughts are. And I’m pretty sure they’re not free to roam wherever they like, but are constrained by unseen guardrails and self-censoring mechanisms that arrest the development of interesting ideas before they can be fully articulated.
On average we are probably most creative and generative as children, and the dwindling of this superpower coincides almost perfectly with when we begin to develop self-consciousness. Self-consciousness seems to go hand in hand with policing our own thoughts and doubting them before we can fully articulate them and consider their veracity. It’s as if something inside our brain says “stop kidding yourself, that can’t be” and shuts everything down before the thought can be further entertained. Self-consciousness seems to make some things unthinkable, and this is largely in line with the behavior of people lacking self-consciousness: people incapable of considering what others think of them do things others wouldn’t consider doing.

But the interesting question remains: why would the mind undermine itself in the first place? My first hunch is that it has to do with the consequences and costs of realizing new ideas. Divergent ideas force you to update your worldview. Occasionally this provides a shortcut, but it’s much more likely to invalidate the guideposts and milestones we organize our lives and identities around, so our memetic immune systems suppress the new idea before it and its implications can throw your life into disarray. It’s memetic suppression, or having to backtrack, cross uncharted waters, adjust your exposure, and recognize that a milestone you hold so dearly has lost some of its shine.

Another possible explanation is moral disgust with ourselves. Biting into a rancid apple and smelling decaying flesh both activate the region of the brain associated with gustatory and sensory disgust. Interestingly, the same region activates when we observe violations of social norms. So maybe this region of the brain turns itself inward, directing moral disgust at yourself, and that’s when your mind quickly backs away from a new idea, depriving it of the oxygen it needs to fully develop.
Comparing languages that assign opposite grammatical genders to the same words helps unearth some of these guardrails. To describe a “bridge,” which is feminine in German, German speakers use words like “beautiful,” “elegant,” and “fragile.” But in Spanish, where the same word is masculine, speakers use words like “big,” “strong,” and “dangerous.” So language at least influences perception, but it also seems to funnel our thoughts in specific directions. This is the case with Russell conjugations, where, owing to our social nature, we unconsciously let the connotations of words discourage us from considering whether the opposite of what we are being told could be true. Describing someone as “crazy” instead of a “freethinker” affects how much consideration your audience gives to their ideas.
If language can do these things, it’s not a stretch to assume that gaps in our language create artificial boundaries in our minds, forbidding us from thinking from new angles and stifling variation in thought.
Luckily, I’m pretty sure we can neutralize the constraints language places on our thoughts. The invention of new words could let us overcome these language-induced boundaries in the same way first-principles thinking lets us exceed the constraints of thinking exclusively in available analogies. The closest precedent is probably the invention of numbers. Before numbers, people could presumably distinguish between one and more than one, or at least a little and a lot, but two options only leave you with so much room to reason. The invention of the numeral system allowed for precision and more complexity. The same should be true of words. Coining new words should let us explore ideas with more precision and granularity, explore fresh shades of grey for deeper truths, reason up from new building blocks to totally new conclusions, and generally give us more room to think. Most importantly, it should let us ask questions we previously couldn’t conceive of, expanding what is imaginable and, finally, what is possible.[1][2]
The most pervasive guardrail stems from a default expectation of ours: the expectation that the natural state is for ideas across a population to be more or less homogeneous.[3] This expectation appears to corrupt, or at least color, a lot of our thinking. You see it every time someone comes up with an idea and immediately assumes someone else has already tried it and likely failed, you see it when someone discourages you from entering a crowded market,[4] and it’s especially prevalent among the “innovation is inevitable” crowd. Implicit in the belief that if inventor A hadn’t invented X, inventors B, C, and D would have is the assumption that an individual having an idea no one else has is very unlikely.[5] Besides the fact that this totally undermines the individual and deprives us of additional innovation, it’s no wonder we don’t give new ideas the space they need to develop when everything around us, including ourselves, is telling us they don’t exist.[6]
Just realizing these guardrails exist seems to make it easier to detect when these self-censoring mechanisms are at work. But the most useful thing I’ve realized is that you want a memetic immune system that protects you from integrating falsehoods, while also weakening the part of it that is calibrated to produce false positives in very specific contexts: those where the implications of a thought are too much to handle. If the idea of your brain stopping you from thinking certain things sounds totally unbelievable, consider that in cases of severe trauma it is not uncommon for individuals to have no recollection of what happened to them for years on end. The problem is that your brain doesn’t work for you, it works for your genes. There’s a bit of a principal-agent problem.
Notes:
1. Interestingly, some of the smartest people I know routinely coin new words and phrases out of necessity. When you can see past these guardrails, you run into yet-to-be-defined ideas and phenomena all the time. ↩
2. But isn’t language just a representation of what we think? Don’t we have to conceive of a concept before we can label it? Well, I think there’s a distinction between learning an existing language and inventing a new one. If you were to invent a new one, in some places it would map one-to-one onto existing languages, but there would likely be places where they don’t overlap, and that’s where the interesting things lie. Also, existing languages erect preconceptions and direct us by embedding default answers in the questions we may ask. ↩
3. Closely related to variation in ideas is variation in thought processes. Some people simply think differently, and we tend to label different modes of thinking as disorders, which is strange. ↩
4. If my understanding of the history of the payments-processing industry is correct, Stripe seems like a really good example of this. In 2010 the industry appeared crowded (PayPal, Braintree, etc.), and the assumption was you’d have no choice but to compete on price, as there were no insights or angles left for startups to leverage their way to success. It turns out the innovation in the industry was far from maxed out. ↩
5. This topic very much deserves its own post. If you’re interested, enter your email to get it in your inbox. ↩
6. Additionally, by choosing to believe new ideas are scarce, we predispose ourselves to assume we’ve exhausted our frontiers before we actually have, and we give up a great deal of progress as a result. Part of this is undoubtedly due to the lack of a physical frontier, but that lack wouldn’t be much of a problem if we acknowledged the frontiers that exist around us today. Physics, economics, biology, chemistry, etc. never stopped being frontiers. Maybe if we spent time in school focused on the unanswered questions, we wouldn’t treat our knowledge as so complete and dogmatic. ↩
First published August 2020