The Law Says What I Say It Says
Knowledge is knowing a tomato is a fruit. Wisdom is not putting it in a fruit salad.
I’ve heard this all my life, and honestly I don’t trust a lot of the attribution citations I was able to find, so let’s say… -Anonymous
(This is perhaps beside the point, but I think you absolutely can include tomatoes in a fruit salad if you know what you’re doing.)
There is a decent chance you may have already encountered the seemingly ridiculous California court ruling that found that bees are fish. By the time I sat down to write a bit on this early in June, I think I’d seen at least five headlines about it. It’s one of those “ain’t that wacky?” headlines that’s fun on the surface, but challenging once you dig into the story.
So what is California up to here? California has a law on the books called the Endangered Species Act. That law specifically names a number of categories of wildlife which are entitled to legal protections related to their being, you guessed it, endangered. In 2019, the California Fish and Game Commission used the law to protect four species of bumblebees… and I can feel your eyes glazing over, so I’m gonna take a step back.
Something I learned all too well while serving on the Portland, Maine rent board is just how challenging it is to both craft and interpret legislation. More often than not, intention is irrelevant against an argument of strict interpretation. (This is NOT the same as Originalism, which is a line of wildly successful bullshit funded by market fundamentalists for the past generation plus.) For instance, using a phrase like “fair rate of return” in one section and then “fair return on investment” in another can create hours and hours of discussion that can really just take it out of you.
California’s Endangered Species Act doesn’t create categories broad enough to cover insects in general, but the underlying Fish and Game Code defines “fish” to include invertebrates, without limiting them to aquatic ones. So the Commission moved to protect the bees under that definition, agricultural groups pushed back because it hit their bottom line, and after a bunch of back and forth, an appeals court effectively codified into law that bees are fish, because that reading allows for the protection of these endangered bees.
(Side note: did you know bees have to be trucked into California to support almond agriculture? Honey bees aren’t native to the US and thus don’t do great without a lot of support, so they are driven around the country to pollinate for us.)
(Not so side note: makes sense why we might need to protect the bees we do have, aye?)
Anyway, the story is interesting but to me the more interesting theme is the myriad ways that legislation and legal dicta (what a judge says about a case that isn’t strictly part of a ruling) are used to wedge something into a legal category for in-/exclusion from protections.
This story immediately brought to mind the ironic fight to classify the X-Men as not human for the purposes of toy making, despite the fact that in X-Men stories, their humanity is the whole point! I won’t do the story the disservice of reducing it to a few pithy lines. This one is worth a deeper read.
We will always fail to accurately gauge diversity
Humans are pattern recognizers. (We are not alone in this, but I don’t want to speak ignorantly for the Lorax, so let’s just talk about humans.)
When we see novelty in an environment, such as a loudly dressed person at a somber gathering, we are apt to focus in on it subconsciously even if we don’t give much direct thought to the person.
Recognizing novelty in an environment can be evolutionarily advantageous; being alert to sudden difference in your surroundings can save you from predators and accidents alike.
Unfortunately, it can also result in overestimating the significance of the novel component. If you’ve ever seen a settled bird take flight because the wind caught a branch just right, you’ve seen this at the micro scale. No big thing, really. The bird will settle again and, over time, likely acclimate to this overreaction.
The critical issue is what this overestimation might mean in systems that are orders of magnitude more complex, because this fallacy applies to perceptions of social/cultural minorities as much as it does to a boldly painted house or the sudden appearance of morels on the forest floor.
Overestimating the number of morels available in a patch of woods might simply mean an underwhelming dinner, but misperceiving the prevalence of a group of people can mean they don’t have what they need. Greater complexity, greater consequences.
This error of extrapolation holds true no matter what the minority is. As this article points out, “[r]esidents of New York City, for example, are a tiny minority of Americans, only 3 percent of the population. But adult respondents to this nationwide survey thought that a whopping 30 percent of Americans live in the Big Apple.” More concerning still is that this thinking also leads people to assume that over 40% of Americans are Black while only about 12% are.
This overestimation, in other words, leads so easily to the majority group succumbing to an “illusion of diversity.”
I’ve been learning a bit recently about systems thinking, and this tendency to see novelty as more prevalent than it actually is strikes me as the kind of fallacy that can really doom an attempt to model the world, even when the person doing the modeling is aware of such biases.
It highlights the incredible care that must be taken with our assumptions. Inflating the percentage of a religious minority might mean little or nothing at all on the individual scale, but the knowledge that this kind of thinking is default puts a lot of power into the hands of a demagogue that wishes to stoke fear and anger. Prepare accordingly.
Sec Column: Developing a Security Posture
I’m going to have to think of a clever name for this recurring column, something I won’t tire of immediately. If you have any ideas, shoot them my way.
Last month, I introduced something of a personal origin story of my interest in IT, and later operational, security. This origin story is almost certainly reductive, but the time in which I grew up and a handful of personality quirks are surely part of what led me to want to talk regularly about ideas of security.
To review, last month I shared the following high-level definitions of IT & operational security:
IT Security - strategies for making sure our digital information is accessed only by those who should have it and in ways that we have specifically allowed for
Operational Security (OPSEC) - “a security and risk management process that prevents sensitive information from getting into the wrong hands.” (source)
And to reiterate quickly, I tend to view OPSEC as taking principles of IT security and bringing them into the analog (not “real”) world.
These security considerations are predicated on a security posture, but what the heck even is that?
Like our previous terms, a lot of the definitions one finds pertain more to institutions or corporations than they do to individuals. I think that’s a shame, because we all stand to benefit from developing a security posture. I tend to think of a personal security posture as having two components.
The first component is a recognition of risk channels. Put more simply, where and how are you vulnerable? We all possess a version of this by default; our brains analyze risk and attempt to mitigate it. We are generally fearful of great heights, high speeds, dangerous creatures, etc.
Unfortunately, modern life and access to information can shift this innate risk mitigation into overdrive. The human mind privileges negative stories in the name of self-preservation. Media and political narratives often take this into consideration as they highlight that which leads us to fear and anxiety, and thus keeps our attention at the expense of our feeling secure.
Fortunately, that same access to information can be wielded against this onslaught by focusing on media literacy. I find asking ‘cui bono?,’ or ‘to whom is it a benefit?,’ is often a good first step.
Recognizing risk channels requires education, but not obsession. A fear of increased crime, stoked by poorly contextualized data, can lead to one perceiving danger in and around their own home whether it exists or not. Contextualizing the information, perhaps by looking at the data yourself and learning to interpret it, can reveal to what extent you are actually vulnerable.
This applies to all the ways one might be vulnerable, from someone gaining access to your email (and what that really means in 2022, when every online service uses your email as a backup) to swimming in unfamiliar waters to buying dietary supplements with specious health claims. This first component of your security posture does not require obsession, but it is an ongoing process. In that way, it isn’t like learning. It is learning. It is simply doing so with an appreciation of what does and doesn’t impact you and your family’s security.
The second component of a personal security posture, at least according to your humble Ephemeral Card Puncher, is what you do to prepare to respond to these recognized risk channels. Again, to put it more simply, first you recognize risk and then you take steps to at least understand how to address them.
You may note that I’ve already gotten ahead of myself, as these two components are not so discrete from one another. When you research the actual data around crime statistics in your neighborhood, you are as much refining your understanding of the risk as you are moving from recognition to preparedness. This is the absolute essence of a security posture. Identify, learn, and prepare so that you don’t have to live fearfully.
In the example of swimming, this can mean so many different things. Maybe it means knowing where it’s safe to swim, or taking survival swimming classes to make yourself safe even in less safe places. Hopefully it doesn’t mean avoiding the water entirely, because swimming is amazing.
Another example with a particularly American flavor might be a handgun owner taking shooting classes and practicing regularly. (OBVIOUS ALERT) Guns are insanely dangerous and in abundance in America. Bringing one into your home is at once a risk increase and a risk mitigation. Doing so without preparation has no connection to a security posture. Instead, you are acting only off the innate, unexamined version of that first component. You are acting out of fear, not preparedness.
Above is my not-at-all-succinct definition of security posture, a concept that we all innately understand. I think this thorough definition maps well onto future plans for this column, and so I think it was worth sharing in some detail. Going forward, however, I want to simplify.
A personal security posture is a recognition of risk, properly educated and contextualized, that spurs one to some measure of planning in the event that the risk becomes an actual threat. It is there I will pick up next month to discuss how you can find the right digital tools to support a digital security posture.