Nowadays, to be intelligible is to be found out.
February 1972: My first two radio assignments were in delicious contrast. I had to launch a series on metric conversion (Australia had just opted for kilometers and kilos) and, at the same time, prepare for Apollo 16. We were still flying to the moon then—how old-fashioned.
For metrics, your fearless reporter embarked on a tour of David Jones’ lingerie department and asked the shoppers their bra sizes in centimeters. Yes, I squirm, too. The responses were polite but puzzled. Mr. Bean investigates! I set the resulting “interviews” and helpful tips on conversion from the imperial system to a surprisingly jaunty piece of Shostakovich, representing subliminally the continental invasion of your trusted, traditional standards. When the result went to air, I congratulated myself on becoming a real science broadcaster.
Meanwhile, Apollo was due to launch in April 1972. It was to be the penultimate mission to the moon. The adventure would end in December, cut short, after only six landings, for the usual financial reasons, with 12 brave men doing something that now, 40 years later, in 2012, the world cannot manage.
I, with my long hair and jeans, was fashionably jaundiced about “The Right Stuff.” I thought the space race was tainted by military machismo and too little real science, but I willingly did the research my new bosses at the Australian Broadcasting Corporation demanded: What was the height of the Saturn rocket, what technological kit would be deployed on the lunar surface, why was the small observatory to be left there carried up wrapped in gold foil?
Come the day of the launch, I was completely overwhelmed by the adventure and the exhilaration: Apollo was, I realized, the first great reality show outside sports. My conversion was in seconds. And so, it seems, was that of our nationwide listeners. And there really was plenty of science; it wasn’t just a cosmic frolic.
We explained the trajectory and the calculations involved, we contrasted the moon rocks found with basalt from Earth (I even held some moon dirt from a previous mission in my hand as I went “live” to air), and we gave compelling details about the physiology of the astronauts and, inevitably, the proper arrangements for lunar loos. What was immensely satisfying was that no one at the ABC quibbled about our taking over studios and going to air for hours on end, displacing normal programs. ’Twas great to be alive at such a time.
A few weeks before that broadcasting bonanza, I had walked in off the street, fresh off my ship from Britain, straight into a job at the ABC. It was a fluke. Two people had left the science unit a few days before. I was put on as a temp, a stopgap. I’ve been there ever since.
That year, 1972, was momentous in other ways. Almost exactly coincident with my sudden employment was the launch of the “Limits to Growth” study by the Club of Rome, a pioneering, often maligned attempt to track the finite nature of Earthly resources in the face of human expansion. Then came the first United Nations Conference on the Human Environment, in Stockholm, forerunner of Rio, Kyoto, Copenhagen and all those other fests where hope was demolished by bitter experience. That April, as I talked on air about men on the moon, I referred romantically to that first photo, taken from Apollo 8, of our dear planet shown, for the first time, all alone in its blue magnificence, hanging there in space. Many mark our global preoccupation with the environment as being inspired by that breathtaking picture. Limits, indeed.
In December, just as Apollo 17 signaled the end to an adventure that had been planned to go on much longer, I was finally put formally on the ABC staff. It was the day Gough Whitlam, that very first of the “elites,” became prime minister.
Was the end of Apollo a turning point? Was that when so many lost their infatuations with science, after a generation of space-watching, DNA revelations and the Pill? Did the new green awareness, ironically, put the kibosh on our commitment to scientific authority? Did we cease to see those boffins as gentlemen, as reliable as doctors and librarians? You would imagine that the environment would be the best imaginable stimulant of our need to know. But there was talk of “scientism” and, worse, “kumbaya.”
In 1975, I launched “The Science Show.” I did the first program at the Pacific Science Congress in Vancouver. Two interviews stand out today from that very first show, all these years later. One was with Lord Ritchie-Calder, who told me about the burning of oil and coal and how all this pollution of the atmosphere could well cause climatic catastrophe. He said, “We’ve been concerned about this since the early 1960s, and here we are in 1975, and still nobody’s doing anything about it!”
The second interview was with Gerard Piel, then publisher of Scientific American. He told me his life’s work as a communicator of science was a failure because “young people have lost interest in science.”
How the themes repeat. How we go around in circles—like uncles at Christmas lunch, telling the same tale yet again after their third grog.
So where are we now, 40 years on, with science as culture, as a set of ideas and as the greatest force for change in our lives? Well, it’s hard to see beyond the clichés.
Yes, young people are rejecting science, in a way, in some countries such as the U.S., the U.K. and Australia. Paradoxically, the Anglo axis is still, by a long way, the source of most research and development. Why are the young saying no? The answer often given amounts really to a failure of imagination: “It’s got nothing to do with my life!” Really? So where did the tide of acronyms come from—those PCs, GPSs, ATMs, DVDs, CDs and all those iThings? From the Tooth Fairy?
Have they thought, for an instant, about the rate of change in the modern world? None of those acronyms existed on the day I walked into the ABC for the first time: There were no office computers, no mobile phones, no video machines, no cable TV. Color television, indeed, was three years away, as were jumbo jets. And I’m no Methuselah. Where did the rapid technological revolution originate?
If you’d asked me that same question when my mates and I were inventing the future in the 1960s—yes, we really were; hubris had no limits in that generation as we escaped the aftermath of the war and the grim, gray 1950s—I’d have said, as Churchill did, that science was on tap, not on top. Inventiveness was the driver, not the shaper of how we wanted to live. The boffins were the last people you’d ask for directions to the Promised Land. Too narrow, too odd looking, standing there in front of a graph, holding a piece of flex or a widget.
And when you review those scientists or engineers who have achieved leadership positions, you do worry: Jimmy Carter, Margaret Thatcher, Osama bin Laden, Yasser Arafat, Bashar al-Assad, Radovan Karadzic. … OK, there’s Angela Merkel, but you get my drift. It is a fast-evolving material world, but the guys in white coats aren’t designing it, only its contents.
But perhaps young people do, after all, appreciate that science underpins everything, the world that surrounds them, inside and out, their digestion, fitness, drug use, leisure and much more. Perhaps they are simply not moved by entreaties to work very hard, studying for a few decades with incomes barely above minimum wage, in the hope that in early middle age, they may score a job with the prospects that their city contemporaries manage to pocket in their 20s. Or used to.
Never mind that Australia, like elsewhere, is said to be catastrophically short of engineers (by 20,000) or scientists (by 100,000) to keep us in the forefront of innovation. Are those jobs really there when you go to find them?
What, then, should be the message to convince the next generation, in any
nation, that science is a must? The answer is that no one can do any job, be equipped for any profession, without science as part of their culture. We should build this into our kids’ schooling from the start, as we do language. Some will, along the way, become gripped by stars, birds, numbers or materials, and wish to stay with them forever, and I’m sure you’ll get more actual scientists that way than by asking 9-year-olds or teenagers to plan for a life in the lab when they’re grown up.
But that’s not the only block against science in 2012. And when I say “science,” I mean “the understanding of nature.” The real block is a viciously effective political one. It was significant that the new president of the Royal Society of London, Sir Paul Nurse, formerly head of Rockefeller University in New York, chose to spend his first weeks in office last year shooting a film for the BBC called “Science Under Attack.” In her book “Merchants of Doubt,” Professor Naomi Oreskes, with her colleagues at the University of California, San Diego, showed how this attack has been successfully and relentlessly mounted by special interests: Don’t tell people they are dumb because they don’t understand all the endless research; don’t blast them with instructions on how their lives must change. Just spread doubt. Then whisper that university people are self-serving plotters in sheltered workshops.
And it works. Most surveys show that acceptance of science-based concerns has plummeted in the last five to 10 years.
While that has been going on, nature is as it is. The atmosphere, quite uninformed by human ideology, is changing, nonetheless. The land and the oceans are being transformed drastically by human intervention. There is a new age added to the Devonian, Jurassic and Pleistocene: It’s called the Anthropocene.
The real contrast, for me, these 40 years on, is how we seem no longer to have a sense of the future. Even those Apollo men seemed to presage a coming age of space exploration. How quaint! Now, do we ever think of that International Space Station “up there”? Does anyone know—do you know—how many men and women are living there in the sky above your head, right now?
Even toward the end of the last century, quite a few had a vague idea of what kind of society they thought they were working toward. But now? Obama won the last U.S. election saying, “Yes, we can!” Now, most people you ask will reply, “No, we can’t.” The wheels fell off finance, banks, jobs; normal polite discourse turned to projectile vomit and gratuitous insult—and we blunder on, hoping to improvise a way to cope with next week, let alone the next decade.
Can better science communication help? I’m sure it can.
A story, to finish: When my daughter was finishing high school, she couldn’t choose between art, her favorite, and marine biology, to which she aspired. I arranged for her to spend three weeks as a “slave” on Lizard Island, carrying gear and assisting, generally, the PhDs and other researchers doing work on the Great Barrier Reef.
When she returned, she decided—as I knew she would—to go to art school at the University of Sydney. The chemistry for marine science was beyond her, she thought, and her flair ought to be her priority. But that time with the scientists, she said, would never leave her and would always be reflected in her art. Why? I asked. Because, she replied, “Those fish, those corals, the sea grasses, the tides are real.”
Unlike so much of this increasingly virtual existence, as we bend, blinkered, into our little screens, squinting at reflections of our selves, reality is everywhere. Embracing it is science.