Humans: how and what we are

What is initially required is an answer to a simple question: what return, as energy (food and warmth), did the hominid organism gain from day to day in the roughly two-thousand-million-day journey from ape to man, to balance the increasing energy consumption of the increasing brain?

The human brain is needy, and what it needs in order to work properly are two things: energy and information, both in amounts far in excess of anything that our most recent common ancestor with the chimpanzees had access to. By information I mean everything that has gone into a human phenotype up to this moment, whenever this moment may be, through the eyes and ears and nose and mouth and skin, and that has ended up as what we collectively know about the universe. And what each and every one of us individually knows about the universe is orders of magnitude greater than what even the brightest chimp ever knows or knew.

Biological energy, though complex in its detail, is in principle equally straightforward. It is what our mitochondria produce to fuel each of the forty trillion cells in our bodies. Mitochondria (each of us has about a quadrillion) are tiny energy pumps, distributed about each of our cells as densely as stars in the sky. They convert electrons stripped from food into ATP, the chemical that fuels our whole organism and the organisms of all complex life (Lane, The Vital Question, 2015). They fuel our brains and nerves, our muscles, our digestion, all our organs. If all my mitochondria disappeared in an instant, I would be dead within seconds.

"There's no such thing as a free lunch" is an important principle of evolution, though Darwin never put it quite that way. The energy that our mitochondria convert into ATP comes ultimately from the sun, more immediately from the food we eat.

The human brain has long been considered an outlier, somehow significantly different from all other animal brains in its configuration and structure. This is to an extent true.
The brain structures of various orders of animals can be quite evolutionarily distant from each other, so the brains of elephants, whales, rats and gorillas are structurally different. But our brains have much the same configuration and structure as other primate brains, gorilla and chimpanzee and orangutan (Herculano-Houzel, 2013). It is just that they are much bigger, and the increased distance from one place to another across the brain leads, for reasons of energy economy, to an increase in localised connectivity on the outer layer of the cortex, the formation of "small world networks". Inevitably all this increased neural bulk leads to an equivalent rise in energy consumption. Our brains consume a fifth of all the energy that our resting bodies, the collective of our mitochondria, produce.

When we parted company with chimpanzees, between perhaps eight and five million years ago, evolution took them along one path and us along another. Their success went along with big body size and the energy consumption of big, powerful muscles, and a concomitantly unevolving, unincreasing brain. We went along a riskier route. We developed the increasing brain size, and partially compensated for its energy needs with a relatively puny body (which is not to say that our bodies are negligible, far from it). But there is still no such thing as a free lunch, and the big brain had to do something to pay its way from day to day, something that ape brains couldn't do. What was that something?

The big brain did not arrive just like that, not absent one millennium and there the next. It evolved over probably eight million years since our last common ancestor with the chimpanzee (the fossil evidence for this period is sparse, the calibration of the biological clock not yet determined). The modern chimpanzee brain has about six billion neurons.
Roughly three million years ago the brains of the man-ape australopithecines had grown to about thirty-five billion neurons, and around one and a half million years ago the nearer-human Homo erectus had reached about sixty-two billion. Our kissing (and not just kissing) cousins the Neanderthals raised the neuron score to between 79 and 90 billion, and 90 billion is about where we are today (Herculano-Houzel and Kaas, 2011).

The energy cost of procuring nutrition to fuel these almost hundred billion neurons had become huge over our millions of years of evolution, and the big brain had to be doing something, and more and more of it, that put food in the mouth; and it had to be an immediate something. An empty belly does not wait for millennia of evolutionary change to feed it. It needs feeding now. Two good suggestions for what supplied the extra nutrition are complex tool use and fire. Sharp-edged stone tools to cut up meat, obviating the need for big teeth and massive jaws and jaw muscles, and fire to cook it with, would go a long way towards compensating for the metabolic costs.

Okay, so the answer is easy. The big brain had to be able to invent the things that it so clearly did: fire, weapons, cutting and shaping tools, hafted axes and picks; aeroplanes; quantum physics. Invent is the word that weakens the conventional analysis. If we look at the emergence of any morphological locus external to the organism and vital to more energy-efficient food processing, let us say the cutting edge (which can use the already multipurpose and in situ hand and arm to do the job of big teeth and massive jaws), then that cutting edge emerged very slowly. Indeed, long before the beginning of its evolutionary development it was already sparsely present in the landscape.
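These neuron counts can be turned into a rough energy budget. The sketch below is illustrative only: it assumes Herculano-Houzel's figure of roughly six kilocalories per billion neurons per day and a simple linear scaling, neither of which is stated above; the absolute numbers are a back-of-envelope exercise, not data.

```python
# Rough scaling of brain energy cost with neuron count across the
# hominin lineage, using the neuron estimates quoted in the text.
# ASSUMPTION: ~6 kcal per billion neurons per day (Herculano-Houzel's
# approximate figure), scaling linearly with neuron number.

KCAL_PER_BILLION_NEURONS_PER_DAY = 6

lineage = {
    "chimpanzee (modern)":  6,    # billions of neurons
    "australopithecines":   35,
    "Homo erectus":         62,
    "Homo sapiens":         90,
}

for species, billions in lineage.items():
    kcal = billions * KCAL_PER_BILLION_NEURONS_PER_DAY
    print(f"{species:22s} ~{billions:3d} bn neurons -> ~{kcal} kcal/day")

# On these assumptions the modern human brain costs ~540 kcal/day,
# of the same order as the "fifth of resting energy" claim in the text,
# while a chimp-sized brain costs only ~36 kcal/day.
```

The point of the arithmetic is the gradient: each step up the lineage adds a daily food bill that has to be paid immediately, which is the pressure the text describes.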
Capuchin monkeys bang stones on stones with a motivation not wholly understood (they lick the pulverised dust, apparently) and produce adventitious flakes that you or I could use to cut with, but they don't (Proffitt, 2016). Frost and glaciation do the same with flint, so it is difficult to tell the difference between a flint geofact, produced by nature, and a flint artefact produced by a monkey or hominin. The earliest, roughest worked blades, from the site at Lomekwi 3 in West Turkana, Kenya (Harmand, 2015), are now dated possibly as far back as 3.3 million years ago, contemporaneous with Australopithecus afarensis.

None of this fits with "invention", the cartoon character in animal skins sitting outside a cave, a speech bubble over his head with first a light bulb and then a graphic of a flint knife. Yet the prevalent use of the word "invent" in thousands of academic papers on hominin evolution suggests that this is what their writers really think happened. Nothing in nature, including that bit of nature which is our species, is ever invented.

When Archimedes leapt from his bath (well, that's what the neighbours said) shouting "Eureka" because he had finally come to a conclusion about why some things float and others sink, he hadn't invented a single thing. He was at the time one of the world's leading engineers and naval architects. For him ships, floating craft going back maybe sixty thousand years, their properties and qualities, were a given from the world. He had been familiar with them since he was first apprenticed. That ships should float, and reciprocally should not sink, was his job. The margin between the two was a preoccupation. Archimedes' breakthrough was his synthesis of bits of human knowledge acquired over uncountable generations. He recognised a process: when a human body is immersed in a bath it displaces its own weight of water, just about; and a human body floats in water, just about.
He recalled that if he dropped a lump of lead into the bath, it sank (and being so heavy must displace less than its own weight of water). If he dropped a cork into the bath, it floated. And what does floating mean? It means that something is supported by the water, some of it below the surface and some of it above; and that the water displaced by a floating cork, of equal weight to the cork, has less volume than the cork itself. This synthesis was an act of genius, but it was a synthesis of what was already there. It had nothing to do with a sudden lightbulb switching on in the head. Invention, the imagined material product of pure human thought, is a Cartesian delusion. It has no place in an account of hominin evolution.

The emergence of the first worked stone tools, extrapolated from the archaeological evidence, happened like this. Australopithecus afarensis individuals, or a contemporary species, were already using found blades, geofacts or other adventitious flakes, to cut flesh, and they were already, as apes and monkeys do, using stones as hammers to crack seeds and nuts. Using a stone as a hammer, as with a nut on an anvil stone, will if mis-struck produce similar adventitious flakes. And here was the holotype of human competence. Australopithecus afarensis individuals recognised the chance products of their hammering as blades of the same type as the found blades they were already using, which monkeys and apes had never done. These evolving hominins had the capacity, not of invention, but of recognition. And they had the capacity to shut their eyes, look away, and still have that blade stored in the brain. They had a durable registration of all that is a >flake that is also a cutting tool<, and everything else in the world that is not a >flake that is also a cutting tool<.
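The comparison Archimedes arrived at can be written down in a few lines. This is a sketch of the principle only; the density figures for lead and cork are standard handbook values, not drawn from the text.

```python
# Archimedes' principle as a single comparison: a fully submerged object
# displaces its own volume of fluid, and it floats if that displaced
# fluid weighs more than the object itself.

WATER_DENSITY = 1000.0  # kg per cubic metre, fresh water

def floats(mass_kg, volume_m3, fluid_density=WATER_DENSITY):
    """True if the object would float: the weight of fluid displaced at
    full submersion exceeds the object's own weight."""
    displaced_mass_kg = fluid_density * volume_m3
    return displaced_mass_kg > mass_kg

# A one-litre lump of lead (~11.3 kg) sinks; a one-litre cork (~0.24 kg) floats.
print(floats(11.3, 0.001))  # lead -> False
print(floats(0.24, 0.001))  # cork -> True
```

The synthesis is the single inequality: everything else Archimedes needed (lead sinks, cork floats, a bath overflows) was already in the world.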
And among the things that were spatio-temporally contiguous to >flakes that were also cutting tools<, but were not >flakes that were also cutting tools<, was a >hammer stone<, recognisable and distinguishable from all in the world that was not a >hammer stone<, including a >cutting tool<; and the same goes for the >anvil stone<.

This is the human genius. Chimpanzees can enact it in the context of the already evolved human extended phenotype, but it seems they cannot perform autogenic acts of recognition, what Iain Davidson (Davidson, 2013) characterises by the sequence of distraction (attention moving away from the object blade) and "re-engagement where you left off". This necessitates not only a durable passive registration of the object in the brain, which triggers simple recognition, but a replicable registration of the object in the brain which triggers anticipation of the object being present even when it is not immediately available to the senses.

If the worked stones of Lomekwi 3 really are the product of Australopithecus afarensis or a similar taxon, then they could work this act of durably registered recognition and anticipation over three million years ago. But even if, as was the general view up to last year, the earliest worked tools were the product of the bigger-brained Homo habilis seven hundred thousand years later, that is still two and a half million years ago, which puts the cognitive distinction between ape and human at long before the emergence of Homo sapiens.

The significant word in all this is recognise. The competence is not one of invention; it is one of persisting registration of a type in the brain such as will trigger recognition of that type even when an immediate stimulus for acquiring it is absent. It is collecting behaviour, but not the same as a squirrel storing nuts.
Collecting without an evolved hardwired stimulatory pathway, collecting as in a hominin picking up a stone good for cutting and carrying it home, is significant, but it did not build the Taj Mahal. It constituted only a part of the evolving hominin competence. The other part was to recognise, initially probably only as a brain-neuro-muscular registration, the spatio-temporal relationship (the semantic space where the verb would emerge) between hand, hammer stone, core, anvil stone and flakes durably registered as cutting stones, such that after a period of distraction, a few seconds or a whole day, the operation of striking flakes off a core resting on an anvil stone, the same operation as a chimpanzee cracking a nut, could be repeated. That is to say, they knew you had to hit the core with the hammer, and that this would produce flakes with a cutting edge.

That sounds a simple operation, but there is an instructive video of a young capuchin monkey that knows, by observation, that you can get at the kernel of a nut with a hammer and anvil, but goes through a series of ineffective operations, such as holding the hammer in one hand, putting the nut on the anvil, and then hitting the anvil with the other, empty, hand. Adult chimpanzees, as we saw with the Panda oleosa nuts in chapter 12, crack nuts expertly, incidentally demonstrating the technique to their young, who pick it up eventually, especially when they learn to use only their front feet. They clearly recognise the types hammer stone and anvil stone, and have a durable registration of the spatio-temporal relationship between the two, since they will collect hammer stones before they set off for the Panda oleosa tree. But they have never, as far as is known by human beings, got as far as cutting, because they do not recognise adventitious flakes.
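The distinction being drawn here, between recognising single types and holding a stored, re-runnable relationship between types, can be caricatured in code. Everything below is a toy illustration of the argument's structure, not a model from the literature; all names are invented for the sketch.

```python
# Toy model: a durable registration of object *types*, plus a stored
# relationship (an action schema) between them that remains addressable
# after distraction, when no stimulus object is in view.

registered_types = {"hammer stone", "core", "anvil stone", "cutting flake"}

# The stored schema: which types the action relates, and what it yields.
knapping_schema = {
    "action": "strike",
    "roles": {"tool": "hammer stone", "target": "core", "support": "anvil stone"},
    "yields": "cutting flake",
}

def recognise(thing):
    """Simple recognition: is this thing an instance of a registered type?"""
    return thing in registered_types

def re_engage(schema, available):
    """After distraction, the schema can be re-run if an instance of every
    role's type is present; the yield is anticipated before it exists."""
    needed = set(schema["roles"].values())
    return schema["yields"] if needed <= available else None

print(re_engage(knapping_schema, {"hammer stone", "core", "anvil stone"}))
# -> cutting flake: the product is anticipated before any flake exists
print(re_engage(knapping_schema, {"hammer stone"}))
# -> None: the young capuchin's predicament, roles present but unfilled
```

On this caricature, the chimpanzee has `recognise` for hammer and anvil but no schema whose yield is a flake; the evolving hominin has both.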
The unique hominin competence, of not just recognising a potential specific dynamic relationship between two objects when the objects are present, but of retaining a durable and addressable representation of this relationship when the physical referents are absent, is clearly described in Iain Davidson's game-changing CARTA lecture (University of California, UCSD-TV and YouTube: https://www.ucsd.tv/search-details.aspx?showID=25398). It constitutes anticipation, or foresight.

The "cultural" evolutionary parting of the ways with the other apes, whether it occurred with Australopithecus afarensis or Homo habilis, required the (very, very slowly) increasing ability to divide the world up into more and more discrete things, >?<, that might later become the >cord<, the >awl<, the >chisel<, the >bowl<, the >basket<, the >spear<, all initially adventitious morphological loci, each with a function that contributed to more efficient nutrition or other means of balancing the energy equation, and each a durable registration in the brain; and to locate each of those loci in a matrix of spatio-temporal relationships, or actions.

There was never, with ape or human, a first of anything, cutting flake or cord. All were on an evolutionary continuum. The typological loci, cutting flake or cord, were always derived from previous in-the-world existence, actual flint tools or sharpened sticks or strands of bark, and the collective of all these things in the world was as much the human extended phenotype as the web, the nest and the dam are the extended phenotypes of the spider, the bird and the beaver. But as a result of the newly emerging competence, the ability to recognise and use an increasing number of things-in-the-world, this extended phenotype could carry on extending indefinitely, and indeed it has, weighing in at today's thirty trillion tons (give or take the odd steel girder, skyscraper, or burgeoning African, Chinese or Indian city).
And as it extended it provided more and more efficient nutrition, first purely proximately, as in hunting and gathering, sharing and cooking, then in more complex interdependencies, as between the vineyard and the cathedral, or the clock and the factory, or Facebook and a contemporary hominin's economic function.

And, the core of this thesis: the brain, which could continue to expand and consume energy as the extended phenotype proliferated, was the environment in which the extended phenotype evolved. And the proliferating extended human phenotype was the environment in which the human brain evolved. And when I say evolved in both cases, it seems lacking in parsimony to say that the brain evolved by a process of Darwinian evolution (observed phylogenetic heritability, incessant replication with fidelity, an envelope of fractional variation, selection by external factors) and then to say that the extended human phenotype did not evolve according to exactly the same process; particularly as their co-evolution, which is absolutely explicit in the above, is a process of obligate symbiosis. The one could not have evolved alongside and in step with the other in any other relationship. To assume at the outset that the human extended phenotype was not evolving according to the Darwinian model therefore seems irrational. But of course there are complications, not least of which is that stone blades do not reproduce according to any biological pattern.