Photos of people
before murder. Stop. No. Don’t
go there. Run away.
Using the blog to put together a book, gradually. If you want to come along for the ride then start below with the Introduction.
“A little knowledge is a dangerous thing” (these days more often referred to as the “Dunning-Kruger Effect”) has been the downfall of the human race since there was one, and perhaps earlier. But there is a second syndrome, just as dangerous, which is “a moderate amount of knowledge is misleading”. Then there are the contrarians: people who are quite information rich, though not as rich as they think, but who know there is more money to be made in opposing reality than in recognising it. And finally, one that could perhaps be called the Kruger-Dunning effect, in which scientists greatly underestimate how information rich they are because of concerns about how much there is still to know.
The combination of these syndromes has left the human race largely rejecting and disregarding the findings of the scientific method, the only process by which we can reach the truth. It is as if, scientists having invented vaccination, the only method for resisting disease, people had refused to vaccinate their children. Oh, wait, that actually happens…
The Dunning-Kruger Effect is a self-fulfilling fallacy. Some grain of partly understood information makes sufferers confident that they can dismiss, say, evolutionary theory, the Big Bang, climate change, cancer treatments, continental drift, ancient history, the authorship of Shakespeare, and replace them with crackpot ideas that came to them in a moment of inspiration.
The Contrarians are the useful idiots of the world of big corporations and venal politicians. They all have a Galileo complex (as would the Dunning-Krugers if they had heard of him), or pretend to have one. They have learnt that claiming (with 100% certainty) the Earth to be flat (say) will get you more publicity than being just another round Earth clone, and that publicity will see you become a media darling, appearing on screen to provide “balance” (mustn’t just present a biased view that the Earth is round without presenting both equal sides of what obviously must be a debate), and eventually coming to the attention of people with money who have an interest in not letting the public accept the Earth’s roundness.
The Kruger-Dunning (remember, this is just a name I invented) effect is the result of a kind of Zeno’s paradox in science. The fundamental bedrock of science, which every scientist has drummed into them in the cradle, is that nothing is 100% certain. Every finding has a probability attached to it, ranging from low, where the result doesn’t differ from random, to high (95%), where the chance is very low (i.e. 5%) that the result is random, to very high (99%). Sometimes it is even higher in certain kinds of physics experiments, although not in the real world. In other words, just like a Zeno runner who covers half the distance to the finish line, then half of half, then half of a quarter, half of an eighth, but can never cover the last little fraction (only half of it) to complete the course, a scientist can approach certainty in increments, but never reach it.
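The halving in the Zeno analogy can be made concrete in a few lines of Python (a purely illustrative toy of my own, not a model of real statistical inference): each new increment of evidence halves the remaining distance to certainty, so confidence climbs toward 100% without ever arriving.

```python
def confidence_after(n_steps):
    """Confidence after n halving steps: 1 - (1/2)**n.

    Like Zeno's runner, the value approaches 1 (certainty)
    but never reaches it for any finite n.
    """
    return 1 - 0.5 ** n_steps

for n in [1, 2, 5, 10, 20]:
    print(f"after {n:2d} steps: {confidence_after(n):.6f}")
```

After a handful of steps the value is already past the conventional 95% and 99% thresholds, which is the point: the gap from “very high probability” to “certain” shrinks below any practical significance, even though it never formally closes.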
The result of this philosophy is that scientists constantly, when discussing their results and their implications, use words like “may”, “possibly”, “likely”, “probably”. When other scientists hear such words they recognise them as referring to levels of statistical probability; when the general public, journalists, and politicians hear them, they hear “uncertainty”. Consequently non-scientists can be easily convinced that, say, smoking may or may not damage your health, and that climate change may or may not be real.
But the scientific law that nothing is certain is simply wrong. It would be better expressed as “levels of certainty increase over time”. When Darwin and Wallace proposed a mechanism (natural selection and geographic isolation) for evolution in 1859 their contemporaries could have said little more (apart from “why didn’t I see that?”) than that it was possible. As more and more work was done in the following years to establish genetic mechanisms, fossil sequences, descriptions of new species, comparisons of skeletons and so on, it moved to likely and then to probable. Now (and for many years) the theory of evolution is 100% certain, and all work in the biological sciences uses it as a firm bedrock on which to investigate details. [Incidentally, while it is commonly said that a single anomalous fossil find – a rabbit fossil in Precambrian rocks is, I think, the example usually given – would disprove the theory of evolution, and that therefore you can’t say it is 100% certain, this is simply not true.
Evolution is so well established that such a find (or similar anomaly) would stick out like a sore Panda’s thumb. It would not cause a “rewriting of the science books” (to use that journalism cliché) but would instead cause a thorough re-investigation of the fossil itself (chemical composition etc.) and of the stratum in which it was found. The first suspect would be scientific fraud, a Piltdown Rabbit; next would come some oddity of the geology that had caused fossil transposition; and finally, if the fossil passed all such tests, there would be a re-examination of the currently accepted phylogenetic sequence. In fact no one is going to find a genuine early rabbit, of course, but much lesser “anomalies” do occur from time to time and result in a new understanding of the time of origin of some group, or of its late survival. But for that to be the case the new arrangement would have to fit logically into existing frameworks of knowledge.]
Similarly, the hypothesis that the Earth moved around the Sun, not vice versa, was originally very unlikely, then possible, now certain. Continental Drift went through the same sequence. So has Climate Change. In all areas of science we know far more now than we don’t know (the possible exception being cosmology where there are still many known unknowns – not least Dark Matter and Dark Energy – but even in this discipline the broad outlines of the structure and history of the universe are certainly known). So instead of pandering to and enabling the Dunning-Krugers and the Contrarians by saying that nothing is certain, and that really, in comparison to how much there is to know we know nothing, nothing, speak firmly, confident that you know a great deal more than they do, and smack down idiotic suggestions based on invincible ignorance.
The syndromes described above are all so straightforward as to need no further discussion. But that brings us to those with a moderate amount of knowledge, and the ill effects that causes can only be illustrated by a series of examples, the discussion of which forms the rest of this book.
Should Pen Pals now be called Pixel Pals?
Suppose I had found an ape lying dead upon the ground, and it should be inquired how the ape happened to be in that place; I should hardly think of the answer I had before given, that for anything I knew, the ape might have always been there. … There must have existed, at some time, and at some place or other, some ancestors of this ape, who became adapted into the form in which we actually find it; adapted by natural selection to become different from other apes, including ourselves. Clearly such an ape is evidence of evolution.
The Bletchley Park Orchestra played the Enigma Variations.
The trick in dealing with anxiety is to forget the evils of the past, and not to fret about the perils of the future, while living entirely in the present. The problem is finding out the secret of how the trick is done.
Was it Warren Buffett who said “of course there is a class war and my class is winning”? No matter, it’s the thought that counts. It comes in response to the neoconservative meme, going back a few years now, in which parties vaguely of the Left (Australian Labor, American Democrats, British Labour) are accused of conducting “class warfare” (or “class envy”) in response to any suggestion by them that the rich might pay a little more in taxes in order to try to restore some balance to huge disparities in wealth, educational opportunity, access to health care, access to decent housing and so on.
So let us translate. When the rich tell the poor to stop indulging in class warfare this is a euphemism. They really mean, as they have been saying since civilisation (a euphemism for the creation of wealth disparity in human populations) began, the following: “Listen up, peasants, this hierarchy you find yourself in, with 1% super rich, 4% very rich, and 95% poor, is perfectly natural, prescribed by God. After the Garden of Eden socialist experiment failed He divided society into the Deserving Rich and the Undeserving Poor, and anointed some Super Rich Kings to keep what should be an unstable system stable. To help the Kings He appointed some very rich religious leaders, their own riches dependent on the stability of the system, who would keep the 95% happy and unrebellious by making sure they understood how undeserving they were, and by dangling the unverified and unverifiable promise of a life after death in which they might get a bit more milk and honey.
So, peasants, be thankful you are not actually slaves any more and that we, out of the goodness of our hearts and a rich social conscience, have agreed to pay some small amount of money to buy enough food to stay alive. In return we will decide how many hours you work, and under what conditions, and, if profits fall even slightly, we will throw you out on to the street without a moment’s notice. We will decide what kind of houses you can live in, what kind of schools your children will go to, what kind of medical care you can get, what kind of media you can have. Oh, and if we hear any more of that class warfare talk about inequity from anyone they will also find themselves on the street and out of a job. Get it? Good, now back to work.”
If your popular musical tastes were formed in the 1950s and 1960s little written after that time will seem very good.
YouTube comment threads are an embarrassing display of the ignorance of the world in the 21st century.
Proust’s taste of a madeleine catapulted him to the memory of a time and place. We all have our madeleine moments. Perhaps, like Proust, the taste of a cake, or of fish and chips, or of a certain flavour of soda drink. Not just tastes though – smells: of a wood fire, or sun lotion, or road tar on a hot day; sounds: a popular song, a bird call, a bat hitting a ball; touch: the feel of canvas, or a blanket, or a dog; sights: the Sun setting over the ocean, a vintage motor bike, a movie.
But beyond these specific memories of time and place triggered by sensations, we also have “memory chains”. A chain can start with a word or a phrase, read or heard. The word triggers memory of a person or place or event, that memory in turn can take us to another event, person, place, which in turn takes us still further, each trigger reaching further back in time in our lives.
But there is a limit to how far back any chain can stretch (rather like the limit on how far we can see out into the universe because beyond a certain point the light simply can’t reach us as the universe expands). We all think we have a memory (or memories if we are very confident) reaching back to the age of two or even one. But –
“In a survey of more than 6,600 people, published in Psychological Science, researchers found that 40% of people believe they have a first memory from when they were two or even younger, even though evidence suggests it is not possible for memories from this age to be retained. Around three to three-and-a-half seems to be the agreed age of a first memory, although Martin Conway, the study’s co-author and director of the Centre for Memory and Law at City, University of London, has said it’s “not until we’re five or six that we form adult-like memories due to the way that the brain develops and due to our maturing understanding of the world”.” (https://www.theguardian.com/science/2018/jul/19/sunflowers-and-santa-claus-guardian-writers-and-readers-on-how-their-first-memory-changed-them?CMP=Share_iOSApp_Other)
I suspect even 5 or 6 is probably ambitious. I think what happens is that we remember remembering early memories (perhaps even remember remembering remembering and so on) although we have lost the actual primary memory. I mean, if cells are replaced every 7 years (or whatever the figure is, my memory fails me) then by the time you get to my age all your cells have been replaced 10 times over. Including, I think, brain cells.
And there are other sources of false memory (fake news?) – we are told about things that happened by our family, and this implants their memories into ours. We see, all our lives, photos of when we were young – holding grandmother’s hand at the station, sitting on a horse statue, blowing out birthday candles, dressed in fancy dress – and the memories of those photos also become memories we think we have of the events concerned.
So, my earliest memory? Dunno. It can’t be separated from all the false memories that provide a fuzzy image of my youth. But then, most memories are like that. Even when triggered by a madeleine we think we remember eating.
Donald Trump – perhaps a Manchurian Candidate; certainly, and worse, a Murdochian Candidate.