4 Nov

http://www.aeonmagazine.com/world-views/anyone-can-learn-to-be-a-polymath/
 

Master of many trades

Our age reveres the narrow specialist but humans are natural polymaths, at our best when we turn our minds to many things

Renaissance man: Portrait of a Young Gentleman in His Studio by Lorenzo Lotto, c. 1530. Gallerie dell’Accademia, Venice. Photo by Corbis

Robert Twigger is a British poet, writer and explorer. He lives in Cairo, Egypt.

 

I travelled with Bedouin in the Western Desert of Egypt. When we got a puncture, they used tape and an old inner tube to suck air from three tyres to inflate a fourth. It was the cook who suggested the idea; maybe he was used to making food designed for a few go further. Far from expressing shame at having no pump, they told me that carrying too many tools is the sign of a weak man; it makes him lazy. The real master has no tools at all, only a limitless capacity to improvise with what is to hand. The more fields of knowledge you cover, the greater your resources for improvisation.

We hear the descriptive words psychopath and sociopath all the time, but here’s a new one: monopath. It means a person with a narrow mind, a one-track brain, a bore, a super-specialist, an expert with no other interests — in other words, the role-model of choice in the Western world. You think I jest? In June, I was invited on the Today programme on BBC Radio 4 to say a few words on the river Nile, because I had a new book about it. The producer called me ‘Dr Twigger’ several times. I was flattered, but I also felt a sense of panic. I have never sought or held a PhD. After the third ‘Dr’, I gently put the producer right. And of course, it was fine — he didn’t especially want me to be a doctor. The culture did. My Nile book was necessarily the work of a generalist. But the radio needs credible guests. It needs an expert — otherwise why would anyone listen?

The monopathic model derives some of its credibility from its success in business. In the late 18th century, Adam Smith (himself an early polymath who wrote not only on economics but also philosophy, astronomy, literature and law) noted that the division of labour was the engine of capitalism. His famous example was the way in which pin-making could be broken down into its component parts, greatly increasing the overall efficiency of the production process. But Smith also observed that ‘mental mutilation’ followed the too-strict division of labour. Or as Alexis de Tocqueville wrote: ‘Nothing tends to materialise man, and to deprive his work of the faintest trace of mind, more than extreme division of labour.’

Ever since the beginning of the industrial era, we have known both the benefits and the drawbacks of dividing jobs into ever smaller and more tedious ones. Riches must be balanced against boredom and misery. But as long as a boring job retains an element of physicality, one can find a rhythm, entering a ‘flow’ state wherein time passes easily and the hard labour is followed by a sense of accomplishment. In Jack Kerouac’s novel Big Sur (1962) there is a marvellous description of Neal Cassady working like a demon, changing tyres in a tyre shop and finding himself uplifted rather than diminished by the work. Industrialism tends toward monopathy because of the growth of divided labour, but it is only when the physical element is removed that the real problems begin. When the body remains still and the mind is forced to do something repetitive, the human inside us rebels.

The average job now is done by someone who is stationary in front of some kind of screen. Someone who has just one overriding interest is tunnel-visioned, a bore, but also a specialist, an expert. Welcome to the monopathic world, a place where only the single-minded can thrive. Of course, the rest of us are very adept at pretending to be specialists. We doctor our CVs to make it look as if all we ever wanted to do was sell mobile homes or Nespresso machines. It’s common sense, isn’t it, to try to create the impression that we are entirely focused on the job we want? And wasn’t it ever thus?

In fact, it wasn’t. Classically, a polymath was someone who ‘had learnt much’, conquering many different subject areas. As the 15th-century polymath Leon Battista Alberti — an architect, painter, horseman, archer and inventor — wrote: ‘a man can do all things if he will’. During the Renaissance, polymathy became part of the idea of the ‘perfected man’, the manifold master of intellectual, artistic and physical pursuits. Leonardo da Vinci was said to be as proud of his ability to bend iron bars with his hands as he was of the Mona Lisa.

Polymaths such as Da Vinci, Goethe and Benjamin Franklin were such high achievers that we might feel a bit reluctant to use the word ‘polymath’ to describe our own humble attempts to become multi-talented. We can’t all be geniuses. But we do all still indulge in polymathic activity; it’s part of what makes us human.

So, say that we all have at least the potential to become polymaths. Once we have a word, we can see the world more clearly. And that’s when we notice a huge cognitive dissonance at the centre of Western culture: a huge confusion about how new ideas, new discoveries, and new art actually come about.

Science, for example, likes to project itself as clean, logical, rational and unemotional. In fact, it’s pretty haphazard, driven by funding and ego, reliant on inspired intuition by its top-flight practitioners. Above all it is polymathic. New ideas frequently come from the cross-fertilisation of two separate fields. Francis Crick, who intuited the structure of DNA, was originally a physicist; he claimed this background gave him the confidence to solve problems that biologists thought were insoluble. Richard Feynman came up with his Nobel Prize-winning ideas about quantum electrodynamics by reflecting on a peculiar hobby of his — spinning a plate on his finger (he also played the bongos and was an expert safe-cracker). Percy Spencer, a radar expert, noticed that microwave radiation from the radar equipment he worked with had melted a chocolate bar in his pocket, and went on to develop the microwave oven. And Hiram Maxim, the inventor of the modern machine gun, was inspired by a self-cocking mousetrap he had made in his teens.

I thought you were either a ‘natural’ or nothing. Then I saw natural athletes fall behind when they didn’t practise enough. This, shamefully, was a great morale booster

Despite all this, there remains the melancholy joke about the scientist who outlines a whole new area of study only to dismiss it out of hand because it trespasses across too many field boundaries and would never get funding. Somehow, this is just as believable as any number of amazing breakthroughs inspired by the cross-fertilisation of disciplines.

One could tell similar stories about breakthroughs in art — cubism crossed the simplicity of African carving with a growing non-representational trend in European painting. Jean-Michel Basquiat and Banksy took street graffiti and made it acceptable to galleries. In business, cross-fertilisation is the source of all kinds of innovations: fibres inspired by spider webs have become a source of bulletproof fabric; practically every mobile phone also seems to be a computer, a camera and a GPS tracker. To come up with such ideas, you need to know things outside your field. What’s more, the further afield your knowledge extends, the greater potential you have for innovation.

Invention fights specialisation at every turn. Human nature and human progress are polymathic at root. And life itself is various — you need many skills to be able to live it. In traditional cultures, everyone can do a little of everything. Though one man might be the best hunter or archer or trapper, he doesn’t do only that.

The benefits of polymathic endeavour in innovation are not so hard to see. What is less obvious is how we ever allowed ourselves to lose sight of them. The problem, I believe, is some mistaken assumptions about learning. We come to believe that we can only learn when we are young, and that only ‘naturals’ can acquire certain skills. We imagine that we have a limited budget for learning, and that different skills absorb all the effort we plough into them, without giving us anything to spend on other pursuits.

Our hunch that it’s easier to learn when you’re young isn’t completely wrong, or at least it has a real basis in neurology. However, the pessimistic assumption that learning somehow ‘stops’ when you leave school or university or hit thirty is at odds with the evidence. It appears that a great deal depends on the nucleus basalis, located in the basal forebrain. Among other things, this bit of the brain produces significant amounts of acetylcholine, a neurotransmitter that regulates the rate at which new connections are made between brain cells. This in turn dictates how readily we form memories of various kinds, and how strongly we retain them. When the nucleus basalis is ‘switched on’, acetylcholine flows and new connections occur. When it is switched off, we make far fewer new connections.

Between birth and the age of ten or eleven, the nucleus basalis is permanently ‘switched on’. It contains an abundance of the neurotransmitter acetylcholine, and this means new connections are being made all the time. Typically this means that a child will be learning almost all the time — if they see or hear something once, they remember it. But as we progress towards the later teenage years the brain becomes more selective. From research into the way stroke victims recover lost skills, it has been observed that the nucleus basalis only switches on when one of three conditions occurs: a novel situation, a shock, or intense focus, maintained through repetition or continuous application.

Over-specialisation eventually retreats into defending what one has learnt rather than making new connections

I know from my own experience of studying martial arts in Japan that intense study brings rewards that are impossible to achieve by casual application. For a year I studied an hour a day, three days a week, and made minimal progress. For a further year I switched to an intensive course of five hours a day, five days a week. The gains were dramatic and permanent, resulting in a black belt and an instructor certificate. Deep down I was pessimistic that I could actually learn a martial art. I thought you were either a ‘natural’ or nothing. Then I saw natural athletes fall behind when they didn’t practise enough. This, shamefully, was a great morale booster.

The fact that I succeeded where others were failing also gave me an important key to the secret of learning. There was nothing special about me, but I worked at it and I got it. One reason many people shy away from polymathic activity is that they think they can’t learn new skills. I believe we all can — and at any age too — but only if we keep learning. ‘Use it or lose it’ is the watchword of brain plasticity.

People as old as 90 who actively acquire new interests that involve learning retain their ability to learn. But if we stop taxing the nucleus basalis, it begins to dry up. In some older people it has been shown to contain no acetylcholine — they have been ‘switched off’ for so long the organ no longer functions. In extreme cases this is considered to be one factor in Alzheimer’s and other forms of dementia — treated, effectively at first, by artificially raising acetylcholine levels. But simply attempting new things seems to offer health benefits to people who aren’t suffering from Alzheimer’s. After only short periods of trying, the ability to make new connections develops. And it isn’t just about doing puzzles and crosswords; you really have to try and learn something new.

Monopathy, or over-specialisation, eventually retreats into defending what one has learnt rather than making new connections. The initial spurt of learning gives out, and the expert is left, like an animal, merely defending his territory. One sees this in the academic arena, where ancient professors vie with each other to expel intruders from their hard-won patches. Just look at the bitter arguments over how far the sciences should be allowed to encroach on the humanities. But the polymath, whatever his or her ‘level’ or societal status, is not constrained to defend their own turf. The polymath’s identity and value comes from multiple mastery.

Besides, it may be that the humanities have less to worry about than it seems. An intriguing study funded by the Dana Foundation and summarised by Dr Michael Gazzaniga of the University of California, Santa Barbara, suggests that studying the performing arts — dance, music and acting — actually improves one’s ability to learn anything else. Collating several studies, the researchers found that performing arts generated much higher levels of motivation than other subjects. These enhanced levels of motivation made students aware of their own ability to focus and concentrate on improvement. Later, even if they gave up the arts, they could apply their new-found talent for concentration to learning anything new.

I find this very suggestive. The old Renaissance idea of mastering physical as well as intellectual skills appears to have real grounding in improving our general ability to learn new things. It is having the confidence that one can learn something new that opens the gates to polymathic activity.

There is, I think, a case to be made for a new area of study to counter the monopathic drift of the modern world. Call it polymathics. Any such field would have to include physical, artistic and scientific elements to be truly rounded. It isn’t just that mastering physical skills aids general learning. The fact is, if we exclude the physicality of existence and reduce everything worth knowing down to book-learning, we miss out on a huge chunk of what makes us human. Remember, Feynman had to be physically competent enough to spin a plate to get his new idea.

Polymathics might focus on rapid methods of learning that allow you to master multiple fields. It might also work to develop transferable learning methods. A large part of it would naturally be concerned with creativity — crossing unrelated things to invent something new. But polymathics would not just be another name for innovation. It would, I believe, help build better judgment in all areas. There is often something rather obvious about people with narrow interests — they are bores, and bores always lack a sense of humour. They just don’t see that it’s absurd to devote your life to a tiny area of study and have no other outside interests. I suspect that the converse is true: by being more polymathic, you develop a better sense of proportion and balance — which gives you a better sense of humour. And that can’t be a bad thing.

Published on 4 November 2013


NASA’s Plutonium Problem Could End Deep-Space Exploration

11 Oct

http://www.wired.com/wiredscience/2013/09/plutonium-238-problem/all/
 

NASA’s Plutonium Problem Could End Deep-Space Exploration

The Voyager probe’s three radioisotope thermoelectric generators (RTGs) can be seen mounted end-to-end on the left-extending boom. (NASA)

In 1977, the Voyager 1 spacecraft left Earth on a five-year mission to explore Jupiter and Saturn. Thirty-six years later, the car-size probe is still exploring, still sending its findings home. It has now put more than 19 billion kilometers between itself and the sun. Last week NASA announced that Voyager 1 had become the first man-made object to reach interstellar space.

The distance this craft has covered is almost incomprehensible. It’s so far away that it takes more than 17 hours for its signals to reach Earth. Along the way, Voyager 1 gave scientists their first close-up looks at Saturn, took the first images of Jupiter’s rings, discovered many of the moons circling those planets and revealed that Jupiter’s moon Io has active volcanoes. Now the spacecraft is discovering what the edge of the solar system is like, piercing the heliosheath where the last vestiges of the sun’s influence are felt and traversing the heliopause where cosmic currents overcome the solar wind. Voyager 1 is expected to keep working until 2025 when it will finally run out of power.

None of this would be possible without the spacecraft’s three batteries filled with plutonium-238. In fact, most of what humanity knows about the outer planets came back to Earth on plutonium power. Cassini’s ongoing exploration of Saturn, Galileo’s trip to Jupiter, Curiosity’s exploration of the surface of Mars, and the 2015 flyby of Pluto by the New Horizons spacecraft are all fueled by the stuff. The characteristics of this metal’s radioactive decay make it a super-fuel. More importantly, there is no other viable option. Solar power is too weak, chemical batteries don’t last, nuclear fission systems are too heavy. So, we depend on plutonium-238, a fuel largely acquired as a by-product of making nuclear weapons.

But there’s a problem: We’ve almost run out.

“We’ve got enough to last to the end of this decade. That’s it,” said Steve Johnson, a nuclear chemist at Idaho National Laboratory. And it’s not just the U.S. reserves that are in jeopardy. The entire planet’s stores are nearly depleted.

The country’s scientific stockpile has dwindled to around 36 pounds. To put that in perspective, the battery that powers NASA’s Curiosity rover, which is currently studying the surface of Mars, contains roughly 10 pounds of plutonium, and what’s left has already been spoken for and then some. The implications for space exploration are dire: No more plutonium-238 means not exploring perhaps 99 percent of the solar system. In effect, much of NASA’s $1.5 billion-a-year (and shrinking) planetary science program is running out of time. The nuclear crisis is so bad that affected researchers know it simply as “The Problem.”

But it doesn’t have to be that way. The required materials, reactors, and infrastructure are all in place to create plutonium-238 (which, unlike plutonium-239, is practically impossible to use for a nuclear bomb). In fact, the U.S. government recently approved spending about $10 million a year to reconstitute production capabilities the nation shuttered almost two decades ago. In March, the DOE even produced a tiny amount of fresh plutonium inside a nuclear reactor in Tennessee.

It’s a good start, but the crisis is far from solved. Political ignorance and shortsighted squabbling, along with false promises from Russia and penny-wise management of NASA’s ever-thinning budget, still stand in the way of a robust plutonium-238 production system. The result: Meaningful exploration of the solar system has been pushed to a cliff’s edge. One ambitious space mission could deplete the remaining plutonium stockpiles, and any hiccup in a future supply chain could undermine the missions that follow.

**********

The only natural supplies of plutonium-238 vanished eons before the Earth formed some 4.6 billion years ago. Exploding stars forge the silvery metal, but its half-life, or time required for 50 percent to disappear through decay, is just under 88 years.

Fortunately, we figured out how to produce it ourselves — and to harness it to create a remarkably persistent source of energy. Like other radioactive materials, plutonium-238 decays because its atomic structure is unstable. When an atom’s nucleus spontaneously decays, it fires off a helium nucleus at high speed while leaving behind a uranium atom. These helium bullets, called alpha radiation, collide en masse with nearby atoms within a lump of plutonium — a material twice as dense as lead. The energy can cook a puck of plutonium-238 to nearly 1,260 degrees Celsius. To turn that into usable power, you wrap the puck with thermoelectrics that convert heat to electricity. Voila: You’ve got a battery that can power a spacecraft for decades.
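
To get a feel for why such a battery fades so slowly, here is a minimal sketch of the decay arithmetic. The 88-year half-life is from the article; the initial thermal output and the heat-to-electricity conversion efficiency are illustrative assumptions, not figures from the piece.

```python
# Rough sketch of why a plutonium-238 RTG keeps working for decades.
# Illustrative numbers: the initial heat and converter efficiency below
# are assumptions, not values taken from the article.

HALF_LIFE_YEARS = 87.7      # plutonium-238 half-life ("just under 88 years")
INITIAL_HEAT_W = 2000.0     # assumed thermal output at launch, in watts
EFFICIENCY = 0.06           # assumed heat-to-electricity conversion efficiency

def electrical_power(years_after_launch: float) -> float:
    """Electric watts still available a given number of years after launch."""
    remaining_fraction = 0.5 ** (years_after_launch / HALF_LIFE_YEARS)
    return INITIAL_HEAT_W * remaining_fraction * EFFICIENCY

for t in (0, 10, 20, 36):
    print(f"year {t:2d}: ~{electrical_power(t):5.1f} W of electricity")
```

Even after 36 years, the decay alone leaves roughly three-quarters of the launch-day output in this sketch, which is why a probe like Voyager can still phone home.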

“It’s like a magic isotope. It’s just right,” said Jim Adams, NASA’s deputy chief technologist and former deputy director of the space agency’s planetary science division.

A radiation-shielded glove box at Savannah River Site. In chambers like these during the Cold War, the government assembled plutonium-238 fuel for use in spacecraft such as Galileo and Ulysses. (Savannah River Site)

U.S. production came primarily from two nuclear laboratories that created plutonium-238 as a byproduct of making bomb-grade plutonium-239. The Hanford Site in Washington state left the plutonium-238 mixed into a cocktail of nuclear wastes. The Savannah River Site in South Carolina, however, extracted and refined more than 360 pounds during the Cold War to power espionage tools, spy satellites, and dozens of NASA’s pluckiest spacecraft.

By 1988, with the Iron Curtain full of holes, the U.S. and Russia began to dismantle wartime nuclear facilities. Hanford and Savannah River no longer produced any plutonium-238. But Russia continued to harvest the material by processing nuclear reactor fuel at a nuclear industrial complex called Mayak. The Russians sold their first batch, weighing 36 pounds, to the U.S. in 1993 for more than $45,000 per ounce. Russia had become the planet’s sole supplier, but it soon fell behind on orders. In 2009, it reneged on a deal to sell 22 pounds to the U.S.

Whether or not Russia has any material left or can still create some is uncertain. “What we do know is that they’re not willing to sell it anymore,” said Alan Newhouse, a retired nuclear space consultant who spearheaded the first purchase of Russian plutonium-238. “One story I’ve heard … is that they don’t have anything left to sell.”

By 2005, according to a Department of Energy report (.pdf), the U.S. government owned 87 pounds, of which roughly two-thirds was designated for national security projects, likely to power deep-sea espionage hardware. The DOE would not disclose to WIRED what is left today, but scientists close to the issue say just 36 pounds remain earmarked for NASA.

That’s enough for the space agency to launch a few small deep-space missions before 2020. A twin of the Curiosity rover is planned to lift off for Mars in 2020 and will require nearly a third of the stockpile. After that, NASA’s deep-space exploration program is left staring into a void — especially for high-profile, plutonium-hungry missions, like the proposed Jupiter Europa Orbiter. To seek signs of life around Jupiter’s icy moon Europa, such a spacecraft could require more than 47 pounds of plutonium.

“The supply situation is already impacting mission planning,” said Alice Caponiti, a nuclear engineer who leads the DOE’s efforts to restart plutonium-238 production. “If you’re planning a mission that’s going to take eight years to plan, the first thing you’re going to want to know is if you have power.”

Many of the eight deep-space robotic missions that NASA had envisioned over the next 15 years have already been delayed or canceled. Even more missions — some not yet even formally proposed — are silent casualties of NASA’s plutonium poverty. Since 1994, scientists have pleaded with lawmakers for the money to restart production. The DOE believes a relatively modest $10 to 20 million in funding each year through 2020 could yield an operation capable of making between 3.3 and 11 pounds of plutonium-238 annually — plenty to keep a steady stream of spacecraft in business.

**********

In 2012, a line item in NASA’s $17-billion budget fed $10 million in funding toward an experiment to create a tiny amount of plutonium-238. The goals: gauge how much could be made, estimate full-scale production costs, and simply prove the U.S. could pull it off again. It was half of the money requested by NASA and the DOE, the space agency’s partner in the endeavor (the Atomic Energy Act forbids NASA to manufacture plutonium-238). The experiment may last seven more years and cost between $85 and $125 million.

At Oak Ridge National Laboratory in Tennessee, nuclear scientists have used the High Flux Isotope Reactor to produce a few micrograms of plutonium-238. A fully reconstituted plutonium program described in the DOE’s latest plan, released this week, would also utilize a second reactor west of Idaho Falls, called the Advanced Test Reactor.

That facility is located on the 890-square-mile nuclear ranch of Idaho National Laboratory. The scrub of the high desert rolls past early morning visitors as the sun crests the Teton Range. Armed guards stop and inspect vehicles at a roadside outpost, waving those with the proper credentials toward a reactor complex fringed with barbed wire and electrified fences.

The Advanced Test Reactor’s unique four-leaf-clover core design. (Idaho National Laboratory)

Beyond the last security checkpoint is a warehouse-sized, concrete-floored room. Yellow lines painted on the floor cordon off what resembles an aboveground swimming pool capped with a metal lid. A bird’s-eye view reveals four huge, retractable metal slabs; jump through one and you’d plunge into 36 feet of water that absorbs radiation. Halfway to the bottom is the reactor’s 4-foot-tall core, its four-leaf clover shape dictated by slender, wedge-shaped bars of uranium. “That’s where you’d stick your neptunium,” nuclear chemist Steve Johnson said, pointing to a diagram of the radioactive clover.

Neptunium, a direct neighbor to plutonium on the periodic table and a stable byproduct of Cold War-era nuclear reactors, is the material from which plutonium-238 is most easily made. In Johnson’s arrangement, engineers pack tubes with neptunium-237 and slip them into the reactor core. Every so often an atom of neptunium-237 absorbs a neutron emitted by the core’s decaying uranium, later shedding an electron to become plutonium-238. A year or two later — after harmful isotopes vanish — technicians could dissolve the tubes in acid, remove the plutonium, and recycle the neptunium into new targets.

The inescapable pace of radioactive decay and limited reactor space mean it may take five to seven years to create 3.3 pounds of battery-ready plutonium. Even if full production reaches that rate, NASA needs to squeeze every last watt out of what will inevitably always be a rather small stockpile.

The standard-issue power source, called a multi-mission thermoelectric generator — the kind that now powers the Curiosity rover — won’t cut it for space exploration’s future. “They’re trustworthy, but they use a heck of a lot of plutonium,” Johnson said.

In other words, NASA doesn’t just need new plutonium. It needs a new battery.

 **********

In a cluttered basement at NASA Glenn Research Center in Cleveland, metal cages and transparent plastic boxes house a menagerie of humming devices. Many look like stainless-steel barbells about a meter long and riddled with wires; others resemble white crates the size of two-drawer filing cabinets.

The unpretentious machines are prototypes of NASA’s next-generation nuclear power system, called the Advanced Stirling Radioisotope Generator. It’s shaping up to be a radically different, more efficient nuclear battery than any before it.

On the outside, the machines are motionless. Inside is a flurry of heat-powered motion driven by the Stirling cycle, developed in 1816 by the Scottish clergyman Robert Stirling. Gasoline engines burn fuel to rapidly expand air that pushes pistons, but Stirling converters need only a heat gradient. The greater the difference between a Stirling engine’s hot and cold parts, the faster its pistons hum. When heat warms one end of a sealed chamber containing helium, the gas expands, pushing a magnet-laden piston through a tube of coiled wire to generate electricity. The displaced, cooling gas then moves back to the hot side, sucking the piston backward to restart the cycle.

“Nothing is touching anything. That’s the whole beauty of the converter,” said Lee Mason, one of several NASA engineers crowded into the basement. Their pistons float like air hockey pucks on the cycling helium gas.

For every 100 watts of heat generated, the Stirling generator converts more than 30 watts into electricity. That’s nearly five times better than the nuclear battery powering Curiosity. In effect, the generator can use one-fourth of the plutonium while boosting electrical output by at least 25 percent. Less plutonium also means these motors weigh two-thirds less than Curiosity’s 99-pound battery — a big difference for spacecraft on 100 million-mile-or-more journeys. Curiosity was the biggest, heaviest spacecraft NASA could send to Mars at the time, with a vast majority of its mass dedicated to a safe landing — not science. Reducing weight expands the possibilities for advanced instruments on future missions.
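
A back-of-the-envelope comparison makes the plutonium savings concrete. The roughly 30 percent Stirling figure is from the article; the 6 percent efficiency assumed for a Curiosity-style thermoelectric generator and the 110-watt load are illustrative assumptions consistent with the “nearly five times better” claim.

```python
# Decay heat (and hence plutonium) needed per watt of electricity delivered.
# STIRLING_EFFICIENCY comes from the article's ">30 W per 100 W of heat";
# MMRTG_EFFICIENCY and the 110 W load are assumptions for illustration.

STIRLING_EFFICIENCY = 0.30
MMRTG_EFFICIENCY = 0.06

def heat_needed(electric_watts: float, efficiency: float) -> float:
    """Watts of plutonium decay heat required for a given electrical output."""
    return electric_watts / efficiency

load_watts = 110.0  # assumed electrical demand of a spacecraft
print(f"Thermoelectric generator: ~{heat_needed(load_watts, MMRTG_EFFICIENCY):.0f} W of decay heat")
print(f"Stirling generator:       ~{heat_needed(load_watts, STIRLING_EFFICIENCY):.0f} W of decay heat")
```

Roughly a fifth of the heat per delivered watt is what would let the Stirling design stretch a small plutonium stockpile across more missions.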

But the Stirling generator’s relatively complicated technology, while crucial to the design, worries some space scientists. “There are people who are very concerned that this unit has moving parts,” said John Hamley, manager of NASA Glenn’s nuclear battery program. The concern is that the motion might interfere with spacecraft instruments that must be sensitive enough to map gravity fields, electromagnetism, and other subtle phenomena in space.

As a workaround, each generator uses two Stirling converters sitting opposite each other. An onboard computer constantly synchronizes their movements to cancel out troublesome vibrations. To detect and correct design flaws, engineers have abused their generator prototypes in vacuum chambers, assaulted them on shaking tables, and barraged them with powerful blasts of radiation and magnetism.

But NASA typically requires new technologies to be tested for one and a half expected lifetimes before flying them in space. For the Stirling generator, that would take 25 years. Earnest testing began in 2001, cutting the delay to 13 years – but that’s longer than NASA can wait: In 2008, only one of 10 nuclear-powered missions called for the device. By 2010, seven of eight deep-space missions planned through 2027 required them.

To speed things up, Hamley and his team run a dozen different units at a time. The oldest device has operated almost continuously for nearly 10 years while the newest design has churned since 2009. The combined data on the Stirling generators totals more than 50 years, enough for simulations to reliably fast-forward a model’s wear-and-tear. So far, so good. “Nothing right now is a show-stopper,” Hamley said. His team is currently building two flight-worthy units, plus a third for testing on the ground (Hamley expects Johnson’s team in Idaho to fuel it sometime next year).

For all of the technology’s promise, however, it “won’t solve this problem,” Johnson said. Even if the Stirling generator is used, plutonium-238 supplies will only stretch through 2022.

An early ASRG prototype. Its 10,016 hours of use has contributed to decades of combined data on the performance of NASA’s revolutionary nuclear battery. (Dave Mosher/WIRED)

Any hiccups in funding for plutonium-238 production could put planetary science into a tailspin and delay, strip down, or smother nuclear-powered missions. The outlook among scientists is simultaneously optimistic and rattled.

The reason: It took countless scientists and their lobbyists more than 15 years just to get lawmakers’ attention. A dire 2009 report about “The Problem,” authored by more than five dozen researchers, ultimately helped slip the first earnest funding request into the national budget in 2009. Congressional committees squabbled over if and how to spend $20 million of taxpayers’ money — it took them three years to make up their minds.

********** 

“There isn’t a day that goes by that I don’t think about plutonium-238,” said Jim Adams, the former deputy boss of NASA’s planetary science division.

At the National Air and Space Museum in Washington, D.C., Adams stares through the glass at the nuclear wonder that powered his generation’s space exploration. Amid the fake moon dust sits a model of SNAP-27, a plutonium-238-fueled battery that every lunar landing after Apollo 11 carried to power its science experiments. “My father worked on the Lunar Excursion Module, which that thing was stored on, and it’s still up there making power,” Adams said.

Just a few steps away is a model of the first Viking Lander, which touched down on Mars in 1976 and began digging for water and life. It found neither. “We didn’t dig deep enough,” Adams said. “Just 4 centimeters below the depth that Viking dug was a layer of pristine ice.”

One floor up, a model of a Voyager spacecraft hangs from the ceiling. The three nuclear power supplies aboard the real spacecraft are what allow Voyager 1 and its twin, Voyager 2, to contact the Earth after 36 years. Any other type of power system would have expired decades ago.

The same technology fuels the Cassini spacecraft, which continues to survey Saturn, sending a priceless stream of data and almost-too-fantastic-to-believe images of that planet and its many moons. New Horizons’ upcoming flyby of Pluto — nine and a half years in the making — wouldn’t be possible without a reliable source of nuclear fuel.

The Viking lander needed to dig deeper. Now we do, too.

Is It Safe to Launch Nuclear Batteries?

Anti-nuclear activists often state that just one microscopic particle of plutonium-238 inhaled into the lungs can lead to fatal cancer. There’s something to the claim, as pure plutonium-238 — ounce-for-ounce — is 270 times more radioactive than the plutonium-239 inside nuclear warheads. But the real risks of launching a nuclear battery are frequently misrepresented or misunderstood.
Statisticians compare apples to apples by looking at a threat’s severity, likelihood and affected population. An asteroid able to wipe out 1.5 billion people, for example, hits Earth about once every 500,000 years — so the risk is high-severity, yet low-probability. Nuclear battery disasters, meanwhile, exist as low-severity and low-probability events, even near the launch pad.

Cassini, for example, left Earth with the most plutonium of any spacecraft: 72 pounds. Late in that probe’s launch there was about a 1 in 476 chance of plutonium release. If that had happened, fatalities over 50 years from that release would have numbered an estimated 1/25th of a person, thanks to the safety design of its nuclear batteries. The overall risk of cancer to a person near the launch pad during an accident was estimated at 7 in 100,000. Beyond that zone, risk was even lower.

Statisticians also considered a second hypothetical and potentially dangerous event with Cassini. To get to Saturn, the spacecraft swung back toward and flew within 600 miles of Earth, zooming by at tens of thousands of miles per hour. The chance of releasing plutonium then was less than 1 in a million. If a release of plutonium occurred, statisticians estimated it might cause 120 cancer fatalities — for the whole planet — over 50 years. By contrast, natural background radiation likely claims a million lives a year, and lightning strikes about 10,000 lives.
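
For readers who want the arithmetic behind “low-severity and low-probability”, here is the standard expected-value framing applied to the two Cassini scenarios above. The probabilities and consequences are the article’s figures; multiplying them together is an illustration, not a calculation spelled out in the piece.

```python
# Expected fatalities = probability of a release x estimated fatalities if it
# happens. The input figures are the article's; the framing is the standard
# statistician's way of putting low-probability risks on a common scale.

scenarios = {
    # name: (probability of plutonium release, estimated fatalities over 50 years if released)
    "Cassini launch accident": (1 / 476, 1 / 25),
    "Cassini Earth flyby":     (1 / 1_000_000, 120),
}

for name, (p_release, fatalities_if_released) in scenarios.items():
    expected = p_release * fatalities_if_released
    print(f"{name}: ~{expected:.6f} expected fatalities")
```

Both numbers come out around one ten-thousandth of a single fatality, which is the sense in which the launches count as low-severity as well as low-probability.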

A launch accident with NASA’s Curiosity rover had a roughly 1 in 250 chance of releasing plutonium. But the low chance of cancer fatalities brought individual risk down to about 1 in 5.8 million. “I feel that they’re completely safe,” said Ryan Bechtel, DOE’s nuclear battery safety manager. “My entire family was there at Curiosity’s launch site.”

Earth’s copper ring; or, a science experiment that didn’t catch on

19 Aug

This pairs well with another post of mine from a while ago:  Science You Never Knew Existed
 
 
http://www.wired.com/wiredscience/2013/08/project-west-ford/

The Forgotten Cold War Plan That Put a Ring of Copper Around the Earth

 

 

During the summer of 1963, Earth looked a tiny bit like Saturn.

The same year that Martin Luther King, Jr. marched on Washington and Beatlemania was born, the United States launched half a billion whisker-thin copper wires into orbit in an attempt to install a ring around the Earth. It was called Project West Ford, and it’s a perfect, if odd, example of the Cold War paranoia and military mentality at work in America’s early space program.

The Air Force and Department of Defense envisioned the West Ford ring as the largest radio antenna in human history. Its goal was to protect the nation’s long-range communications in the event of an attack from the increasingly belligerent Soviet Union.

During the late 1950s, long-range communications relied on undersea cables or over-the-horizon radio. These were robust, but not invulnerable. Had the Soviets attacked an undersea telephone or telegraph cable, America would only have been able to rely on radio broadcasts to communicate overseas. But the fidelity of the ionosphere, the layer of the atmosphere that makes most long-range radio broadcasts possible, is at the mercy of the sun: It is routinely disrupted by solar storms. The U.S. military had identified a problem.

A potential solution was born in 1958 at MIT’s Lincoln Labs, a research station on Hanscom Air Force Base northwest of Boston. Project Needles, as it was originally known, was Walter E. Morrow’s idea. He suggested that if Earth possessed a permanent radio reflector in the form of an orbiting ring of copper threads, America’s long-range communications would be immune from solar disturbances and out of reach of nefarious Soviet plots.

Each copper wire was about 1.8 centimeters in length. This was half the wavelength of the 8 GHz transmission signal beamed from Earth, effectively turning each filament into what is known as a dipole antenna. The antennas would boost long-range radio broadcasts without depending on the fickle ionosphere.
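
As a quick sanity check on that sizing, the half-wave dipole arithmetic works out as follows; nothing beyond the speed of light and the stated 8 GHz frequency goes into it.

```python
# Wavelength = c / f; a half-wave dipole is half of that.

SPEED_OF_LIGHT_M_S = 3.0e8
FREQUENCY_HZ = 8.0e9

wavelength_cm = SPEED_OF_LIGHT_M_S / FREQUENCY_HZ * 100   # ~3.75 cm
half_wave_dipole_cm = wavelength_cm / 2                   # ~1.9 cm

print(f"wavelength at 8 GHz: {wavelength_cm:.2f} cm")
print(f"half-wave dipole:    {half_wave_dipole_cm:.2f} cm")
```

The result, just under 1.9 centimeters, matches the “about 1.8 centimeters” figure once you allow for rounding.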

Today it’s hard to imagine a time when filling space with millions of tiny metal projectiles was considered a good idea. But West Ford was spawned before men had set foot in space, when generals were in charge of NASA’s rockets, and most satellites and spacecraft hadn’t flown beyond the drafting table. The agency operated under a “Big Sky Theory.” Surely space is so big that the risks of anything crashing into a stray bit of space junk were minuscule compared to the threat of communism.

The project was renamed West Ford, for the neighboring town of Westford, Massachusetts. It wasn’t the first, or even the strangest, plan to build a global radio reflector. In 1945, science fiction author Arthur C. Clarke suggested that Germany’s V2 rocket arsenal could be repurposed to deploy an array of antennas into geostationary orbit around the Earth. So prescient was Clarke’s vision that today’s communications satellites, parked at these fixed points above the planet, are said to reside in “Clarke Orbit”.

Meanwhile, American scientists had been attempting to use our own moon as a communications relay, a feat that would finally be accomplished with 1946’s Project Diana. An even more audacious scheme was hatched in the early 1960s from a shiny Mylar egg known as Project Echo, which utilized a pair of microwave reflectors in the form of space-borne metallic balloons.

Size of the copper needles dispersed as part of Project West Ford. (NASA)

As Project West Ford progressed through development, radio astronomers raised alarm at the ill effects this cloud of metal could have on their ability to survey the stars. Concerns were beginning to arise about the problem of space junk. But beneath these worries was an undercurrent of frustration that a space mission under the banner of national security was not subject to the same transparency as public efforts.

The Space Science Board of the National Academy of Sciences convened a series of classified discussions to address astronomers’ worries, and President Kennedy attempted a compromise in 1961. The White House ensured that West Ford’s needles would be placed in a low orbit, the wires would likely re-enter Earth’s atmosphere within two years, and no further tests would be conducted until the results of the first were fully evaluated. This partially appeased the international astronomy community, but still, no one could guarantee precisely what would happen to twenty kilograms of copper wire dispersed into orbit.

The West Ford dispersal system. (NASA)

On October 21, 1961, NASA launched the first batch of West Ford dipoles into space. A day later, this first payload had failed to deploy from the spacecraft, and its ultimate fate was never completely determined.

“U.S.A. Dirties Space” read a headline in the Soviet newspaper Pravda. 

Ambassador Adlai Stevenson was forced to make a statement before the UN declaring that the U.S. would consult more closely with international scientists before attempting another launch. Many remained unsatisfied. Cambridge astronomer Fred Hoyle went so far as to accuse the U.S. of undertaking a military project under “a façade of respectability,” referring to West Ford as an “intellectual crime.”

On May 9, 1963, a second West Ford launch successfully dispersed its spindly cargo approximately 3,500 kilometers above the Earth, along an orbit that crossed the North and South Pole. Voice transmissions were successfully relayed between California and Massachusetts, and the technical aspects of the experiment were declared a success. As the dipole needles continued to disperse, the transmissions fell off considerably, although the experiment proved the strategy could work in principle.

Concern about the clandestine and military nature of West Ford continued following this second launch. On May 24 of that year, The Harvard Crimson quoted British radio astronomer Sir Bernard Lovell as saying, “The damage lies not with this experiment alone, but with the attitude of mind which makes it possible without international agreement and safeguards.”

Recent military operations in space had given the U.S. a reckless reputation, especially following 1962’s high-altitude nuclear test Starfish Prime. This famously bad idea dispersed radiation across the globe, spawning tropical auroras and delivering a debilitating electromagnetic pulse to Hawaiian cities.

The ultimate fate of the West Ford needles is also surrounded by a cloud of uncertainty. Because the copper wires were so light, project leaders assumed that they would re-enter the atmosphere within several years, pushed Earthward by the solar wind. Most of the needles from the failed 1961 and successful 1963 launches likely met this fate. Many now lie beneath snow at the poles.

But not all the needles returned to Earth. Thanks to a design flaw, it’s possible that several hundred, perhaps thousands of clusters of clumped needles still reside in orbit around Earth, along with the spacecraft that carried them.

The copper needles were embedded in a naphthalene gel designed to evaporate quickly once it reached the vacuum of space, dispersing the needles in a thin cloud. But this design allowed metal-on-metal contact, which, in a vacuum, can weld fragments into larger clumps.

In 2001, the European Space Agency published a report that analyzed the fate of needle clusters from the two West Ford payloads. Unlike the lone needles, these chains and clumps have the potential to remain in orbit for several decades, and NORAD space debris databases list several dozen still aloft from the 1963 mission. But the ESA report suggests that, because the 1961 payload failed to disperse, thousands more clusters could have been deployed, and several may be too small to track.

Active communication satellites quickly made projects like West Ford obsolete, and no more needles were launched after 1963. Telstar, the first modern communications satellite, was launched in 1962, beaming television signals across the Atlantic for two hours a day.

In Earth’s catalog of space junk, West Ford’s bits of copper make up only a fraction of the total debris cloud that circles the Earth. But they surely have one of the strangest stories.

The scheme serves as yet another reminder that it was military might that brought the first space missions to bear, for better and worse. Like moon bases and men on Mars, it’s another long-lost dream born at a time when nothing was out of reach. Even putting a ring around the Earth.

10 Ways to Happiness – Because Science

8 Aug

http://lifehacker.com/ten-things-you-can-do-to-be-happier-backed-by-science-1065356587?utm_campaign=socialflow_lifehacker_facebook&utm_source=lifehacker_facebook&utm_medium=socialflow

 

Ten Simple Things You Can Do to Be Happier, Backed by Science

 

Happiness is so interesting, because we all have different ideas about what it is and how to get it. I would love to be happier—as I’m sure most people would—so I thought it would be interesting to find some ways to become a happier person that are actually backed up by science. Here are ten of the best ones I found.

Exercise More

Exercise has such a profound effect on our happiness and well-being that it’s actually been proven to be an effective strategy for overcoming depression. In a study cited in Shawn Achor’s book, The Happiness Advantage, three groups of patients treated their depression with either medication, exercise, or a combination of the two. The results of this study really surprised me. Although all three groups experienced similar improvements in their happiness levels to begin with, the follow-up assessments proved to be radically different:

The groups were then tested six months later to assess their relapse rate. Of those who had taken the medication alone, 38 percent had slipped back into depression. Those in the combination group were doing only slightly better, with a 31 percent relapse rate. The biggest shock, though, came from the exercise group: Their relapse rate was only 9 percent!

You don’t have to be depressed to gain benefit from exercise, though. It can help you to relax, increase your brain power and even improve your body image, even if you don’t lose any weight. A study in the Journal of Health Psychology found that people who exercised felt better about their bodies, even when they saw no physical changes:

Body weight, shape and body image were assessed in 16 males and 18 females before and after both 6 × 40 mins exercise and 6 × 40 mins reading. Over both conditions, body weight and shape did not change. Various aspects of body image, however, improved after exercise compared to before.

We’ve explored exercise in depth before, and looked at what it does to our brains, such as releasing proteins and endorphins that make us feel happier, as you can see in the image below.

[Image from the original post: how exercise releases proteins and endorphins that make us feel happier]

Sleep More

We know that sleep helps our bodies to recover from the day and repair themselves, and that it helps us focus and be more productive. It turns out, it’s also important for our happiness. In NurtureShock, Po Bronson and Ashley Merryman explain how sleep affects our positivity:

Negative stimuli get processed by the amygdala; positive or neutral memories get processed by the hippocampus. Sleep deprivation hits the hippocampus harder than the amygdala. The result is that sleep-deprived people fail to recall pleasant memories, yet recall gloomy memories just fine.

In one experiment by Walker, sleep-deprived college students tried to memorize a list of words. They could remember 81% of the words with a negative connotation, like “cancer.” But they could remember only 31% of the words with a positive or neutral connotation, like “sunshine” or “basket.”

The BPS Research Digest explores another study that proves sleep affects our sensitivity to negative emotions. Using a facial recognition task over the course of a day, the researchers studied how sensitive participants were to positive and negative emotions. Those who worked through the afternoon without taking a nap became more sensitive late in the day to negative emotions like fear and anger.

Using a face recognition task, here we demonstrate an amplified reactivity to anger and fear emotions across the day, without sleep. However, an intervening nap blocked and even reversed this negative emotional reactivity to anger and fear while conversely enhancing ratings of positive (happy) expressions.

Of course, how well (and how long) you sleep will probably affect how you feel when you wake up, which can make a difference to your whole day. The graph below, showing how brain activity drops off without enough sleep, is a good illustration of how important adequate sleep is for productivity and happiness:

[Image from the original post: graph of how brain activity decreases without enough sleep]

Another study tested how employees’ moods when they started work in the morning affected their work day.

Researchers found that employees’ moods when they clocked in tended to affect how they felt the rest of the day. Early mood was linked to their perceptions of customers and to how they reacted to customers’ moods.

And most importantly to managers, employee mood had a clear impact on performance, including both how much work employees did and how well they did it.

Sleep is another topic we’ve looked into before, exploring how much sleep we really need to be productive.

Move Closer to Work

Our commute to the office can have a surprisingly powerful impact on our happiness. The fact that we tend to do this twice a day, five days a week, makes it unsurprising that its effect would build up over time and make us less and less happy. According to The Art of Manliness, having a long commute is something we often fail to realize will affect us so dramatically:

… while many voluntary conditions don’t affect our happiness in the long term because we acclimate to them, people never get accustomed to their daily slog to work because sometimes the traffic is awful and sometimes it’s not. Or as Harvard psychologist Daniel Gilbert put it, “Driving in traffic is a different kind of hell every day.”

We tend to try to compensate for this by having a bigger house or a better job, but these compensations just don’t work:

Two Swiss economists who studied the effect of commuting on happiness found that such factors could not make up for the misery created by a long commute.

Spend Time with Friends and Family

Not staying in touch with friends and family is one of the top five regrets of the dying. If you want more evidence of how beneficial it is to spend time with the people you care about, I’ve found some research that proves it can make you happier right now. Social time is highly valuable when it comes to improving our happiness, even for introverts. Several studies have found that time spent with friends and family makes a big difference to how happy we feel, generally.

I love the way Harvard happiness expert Daniel Gilbert explains it:

We are happy when we have family, we are happy when we have friends and almost all the other things we think make us happy are actually just ways of getting more family and friends.

George Vaillant is the director of a 72-year study of the lives of 268 men.

In an interview in the March 2008 newsletter to the Grant Study subjects, Vaillant was asked, “What have you learned from the Grant Study men?” Vaillant’s response: “That the only thing that really matters in life are your relationships to other people.”

He shared insights of the study with Joshua Wolf Shenk at The Atlantic on how the men’s social connections made a difference to their overall happiness:

The men’s relationships at age 47, he found, predicted late-life adjustment better than any other variable, except defenses. Good sibling relationships seem especially powerful: 93 percent of the men who were thriving at age 65 had been close to a brother or sister when younger.

In fact, a study published in the Journal of Socio-Economics states that your relationships are worth more than $100,000:

Using the British Household Panel Survey, I find that an increase in the level of social involvements is worth up to an extra £85,000 a year in terms of life satisfaction. Actual changes in income, on the other hand, buy very little happiness.

I think that last line is especially fascinating: Actual changes in income, on the other hand, buy very little happiness. So we could increase our annual income by hundreds of thousands of dollars and still not be as happy as if we increased the strength of our social relationships.

The Terman study, which is covered in The Longevity Project, found that relationships and how we help others were important factors in living long, happy lives:

We figured that if a Terman participant sincerely felt that he or she had friends and relatives to count on when having a hard time then that person would be healthier. Those who felt very loved and cared for, we predicted, would live the longest.

Surprise: our prediction was wrong… Beyond social network size, the clearest benefit of social relationships came from helping others. Those who helped their friends and neighbors, advising and caring for others, tended to live to old age.

Go Outside

In The Happiness Advantage, Shawn Achor recommends spending time in the fresh air to improve your happiness:

Making time to go outside on a nice day also delivers a huge advantage; one study found that spending 20 minutes outside in good weather not only boosted positive mood, but broadened thinking and improved working memory…

This is pretty good news for those of us who are worried about fitting new habits into our already-busy schedules. Twenty minutes is a short enough time to spend outside that you could fit it into your commute or even your lunch break. A UK study from the University of Sussex also found that being outdoors made people happier:

Being outdoors, near the sea, on a warm, sunny weekend afternoon is the perfect spot for most. In fact, participants were found to be substantially happier outdoors in all natural environments than they were in urban environments.

The American Meteorological Society published research in 2011 that found current temperature has a bigger effect on our happiness than variables like wind speed and humidity, or even the average temperature over the course of a day. It also found that happiness is maximized at 13.9°C, so keep an eye on the weather forecast before heading outside for your 20 minutes of fresh air.

Help Others

One of the most counterintuitive pieces of advice I found is that to make yourself feel happier, you should help others. In fact, 100 hours per year (or two hours per week) is the optimal time we should dedicate to helping others in order to enrich our lives. If we go back to Shawn Achor’s book again, he says this about helping others:

…when researchers interviewed more than 150 people about their recent purchases, they found that money spent on activities—such as concerts and group dinners out—brought far more pleasure than material purchases like shoes, televisions, or expensive watches. Spending money on other people, called “prosocial spending,” also boosts happiness.

The Journal of Happiness Studies published a study that explored this very topic:

Participants recalled a previous purchase made for either themselves or someone else and then reported their happiness. Afterward, participants chose whether to spend a monetary windfall on themselves or someone else. Participants assigned to recall a purchase made for someone else reported feeling significantly happier immediately after this recollection; most importantly, the happier participants felt, the more likely they were to choose to spend a windfall on someone else in the near future.

So spending money on other people makes us happier than buying stuff for ourselves. What about spending our time on other people? A study of volunteering in Germany explored how volunteers were affected when their opportunities to help others were taken away:

Shortly after the fall of the Berlin Wall but before the German reunion, the first wave of data of the GSOEP was collected in East Germany. Volunteering was still widespread. Due to the shock of the reunion, a large portion of the infrastructure of volunteering (e.g. sports clubs associated with firms) collapsed and people randomly lost their opportunities for volunteering. Based on a comparison of the change in subjective well-being of these people and of people from the control group who had no change in their volunteer status, the hypothesis is supported that volunteering is rewarding in terms of higher life satisfaction.

In his book Flourish: A Visionary New Understanding of Happiness and Well-being, University of Pennsylvania professor Martin Seligman explains that helping others can improve our own lives:

…we scientists have found that doing a kindness produces the single most reliable momentary increase in well-being of any exercise we have tested.

Practice Smiling

Smiling itself can make us feel better, but it’s more effective when we back it up with positive thoughts, according to this study:

A new study led by a Michigan State University business scholar suggests customer-service workers who fake smile throughout the day worsen their mood and withdraw from work, affecting productivity. But workers who smile as a result of cultivating positive thoughts–such as a tropical vacation or a child’s recital–improve their mood and withdraw less.

Of course it’s important to practice “real smiles” that engage the muscles around your eyes (what psychologists call a Duchenne smile). It’s very easy to spot the difference:

[Image from the original post comparing a forced smile with a genuine one]

According to PsyBlog, smiling can improve our attention and help us perform better on cognitive tasks:

Smiling makes us feel good which also increases our attentional flexibility and our ability to think holistically. When this idea was tested by Johnson et al. (2010), the results showed that participants who smiled performed better on attentional tasks which required seeing the whole forest rather than just the trees.

A smile is also a good way to alleviate some of the pain we feel in troubling circumstances:

Smiling is one way to reduce the distress caused by an upsetting situation. Psychologists call this the facial feedback hypothesis. Even forcing a smile when we don’t feel like it is enough to lift our mood slightly (this is one example of embodied cognition).

Plan a Trip

Rather than the holiday itself, it seems that planning a vacation or even just a break from work can improve our happiness. A study published in the journal Applied Research in Quality of Life showed that the highest spike in happiness came during the planning stage of a vacation, as employees enjoyed the sense of anticipation:

In the study, the effect of vacation anticipation boosted happiness for eight weeks. After the vacation, happiness quickly dropped back to baseline levels for most people.

Shawn Achor has some info for us on this point, as well:

One study found that people who just thought about watching their favorite movie actually raised their endorphin levels by 27 percent. If you can’t take the time for a vacation right now, or even a night out with friends, put something on the calendar—even if it’s a month or a year down the road. Then whenever you need a boost of happiness, remind yourself about it.

Meditate

Meditation is often touted as an important habit for improving focus, clarity and attention span, as well as helping to keep you calm. It turns out it’s also useful for improving your happiness:

In one study, a research team from Massachusetts General Hospital looked at the brain scans of 16 people before and after they participated in an eight-week course in mindfulness meditation. The study, published in the January issue of Psychiatry Research: Neuroimaging, concluded that after completing the course, parts of the participants’ brains associated with compassion and self-awareness grew, and parts associated with stress shrank.

Meditation helps clear your mind and calm you down, and it’s frequently cited as one of the most effective ways to live a happier life. I believe this graphic explains it best:

[Graphic from the original post on how meditation affects the brain]

According to Shawn Achor, meditation can actually make you happier long-term:

Studies show that in the minutes right after meditating, we experience feelings of calm and contentment, as well as heightened awareness and empathy. And, research even shows that regular meditation can permanently rewire the brain to raise levels of happiness.

The fact that we can actually alter our brain structure through meditation is what surprises me most, and it’s somewhat reassuring that however we feel and think today isn’t permanent.

Practice Gratitude

This is a seemingly simple strategy, but I’ve personally found it to make a huge difference to my outlook. There are lots of ways to practice gratitude, from keeping a journal of things you’re grateful for, to sharing three good things that happen each day with a friend or your partner, to going out of your way to show gratitude when others help you.

In an experiment where some participants took note of things they were grateful for each day, their moods were improved just from this simple practice:

The gratitude-outlook groups exhibited heightened well-being across several, though not all, of the outcome measures across the 3 studies, relative to the comparison groups. The effect on positive affect appeared to be the most robust finding. Results suggest that a conscious focus on blessings may have emotional and interpersonal benefits.

The Journal of Happiness Studies published a study that used letters of gratitude to test how being grateful can affect our levels of happiness:

Participants included 219 men and women who wrote three letters of gratitude over a 3 week period. Results indicated that writing letters of gratitude increased participants’ happiness and life satisfaction, while decreasing depressive symptoms.

Quick Last Fact: Getting Older Will Make You Happier

As a final point, it’s interesting to note that as we get older, particularly past middle age, we tend to grow happier naturally. There’s still some debate over why this happens, but scientists have got a few ideas:

Researchers, including the authors, have found that older people shown pictures of faces or situations tend to focus on and remember the happier ones more and the negative ones less. Other studies have discovered that as people age, they seek out situations that will lift their moods—for instance, pruning social circles of friends or acquaintances who might bring them down. Still other work finds that older adults learn to let go of loss and disappointment over unachieved goals, and hew their goals toward greater wellbeing.

So if you thought being old would make you miserable, rest assured that it’s likely you’ll develop a more positive outlook than you probably have now.

F*** yeah fluid dynamics – On blenders and cavitation

20 Mar

 

The fluid dynamics of a commercial-quality blender amount to a lot more than just stirring. Here high-speed video shows how the blender’s moving blades create a suction effect that pulls contents down through the middle of the blender, then flings them outward. This motion creates large shear stresses, which help break up the food, as well as turbulence that can mix it. But if you watch carefully, you’ll also see tiny bubbles spinning off the blades. These bubbles, formed by the pressure drop of fluid accelerated over the arms of the blades, are cavitation bubbles. When they collapse, or implode, they create localized shock waves that further break up the blender’s contents. This same effect is responsible for damage to boat propellers and lets you destroy glass bottles. (Video credit: ChefSteps; via Wired; submitted by jshoer)
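To see why the pressure drop over the blades is strong enough to make room-temperature water "boil" into bubbles, here's a rough back-of-envelope sketch. The blade speed and radius are assumed illustrative values, not measurements from the video; the physics is just Bernoulli's principle, with cavitation becoming possible once the local pressure falls below water's vapor pressure.

```python
import math

# Rough cavitation estimate for a blender blade (illustrative numbers only).
rho = 1000.0           # density of water, kg/m^3
p_ambient = 101_325.0  # atmospheric pressure, Pa
p_vapor = 2_300.0      # vapor pressure of water near 20 °C, Pa

rpm = 25_000.0         # assumed blade speed for a commercial blender
radius = 0.03          # assumed blade radius, m

tip_speed = 2 * math.pi * radius * rpm / 60.0   # m/s
dynamic_pressure = 0.5 * rho * tip_speed ** 2   # Bernoulli pressure drop, Pa
p_local = p_ambient - dynamic_pressure          # idealized local pressure, Pa

print(f"tip speed      ≈ {tip_speed:.0f} m/s")
print(f"pressure drop  ≈ {dynamic_pressure / 1e3:.0f} kPa")
print(f"local pressure ≈ {p_local / 1e3:.0f} kPa")
# A negative local pressure just means the idealized estimate overshoots
# the cavitation threshold by a wide margin.
print("cavitation plausible:", p_local < p_vapor)
```

Even with these rough numbers the estimated pressure drop dwarfs the roughly 2 kPa vapor pressure of water, which is why the video catches bubbles spinning off the blades so readily.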

http://fuckyeahfluiddynamics.tumblr.com/post/45345509325/the-fluid-dynamics-of-a-commercial-quality-blender

Mechanical Porn

29 Jun

 

I feel like I’m back in my Mechanisms of Machinery class in college. LOVE it!

“The film presents a deceptively “open” series of images of gears and pistons that transfer movement from vertical to rotary directions. Musical in its repetitive visual form, it now seems akin to Charles Sheeler’s paintings and photographs of railroad locomotive gears and wheels, a tribute to the machine age.” 

— Robert A. Haller

 

7 minutes of terror – NASA style

25 Jun

http://www.nasa.gov/multimedia/videogallery/index.html?media_id=146903741

MSL (Mars Science Laboratory) Style. 

Mars Science Laboratory (MSL, or Curiosity) is a Mars rover launched by NASA on November 26, 2011. Currently en route to the planet, it is scheduled to land in Gale Crater at about 05:31 UTC on August 6, 2012. The rover’s objectives include searching for past or present life, studying the Martian climate, studying Martian geology, and collecting data for a future manned mission to Mars.

Curiosity is about five times larger than the Spirit or Opportunity Mars exploration rovers, and carries over ten times the mass of scientific instruments. It will attempt a more precise landing than previous rovers, within a landing ellipse of 7 km by 20 km, in the Aeolis Palus region of Gale Crater. This location is near the mountain Aeolis Mons (formerly called “Mount Sharp”). It is designed to explore for at least 687 Earth days (1 Martian year) over a range of 5–20 km (3–12 miles).

The Mars Science Laboratory mission is part of NASA’s Mars Exploration Program, a long-term effort for the robotic exploration of Mars, and the project is managed by the Jet Propulsion Laboratory of California Institute of Technology. When MSL launched, the program’s director was Doug McCuistion of NASA’s Planetary Science Division. The total cost of the MSL project is about US$2.5 billion.

http://en.wikipedia.org/wiki/Mars_Science_Laboratory
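As a side note, the "687 Earth days (1 Martian year)" figure in the excerpt above follows from nothing more than Kepler's third law. Here's a quick sanity check, using the standard value of Mars's semi-major axis (about 1.524 AU, which is not a number taken from the article):

```python
# Kepler's third law: T^2 = a^3, with T in Earth years and a in AU.
mars_semi_major_axis_au = 1.524   # standard value for Mars's orbit

mars_year_in_earth_years = mars_semi_major_axis_au ** 1.5
mars_year_in_earth_days = mars_year_in_earth_years * 365.25

print(f"Martian year ≈ {mars_year_in_earth_years:.2f} Earth years")  # ≈ 1.88
print(f"Martian year ≈ {mars_year_in_earth_days:.0f} Earth days")    # ≈ 687
```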


 

Curiosity’s mission site: http://www.nasa.gov/mission_pages/msl/index.html

What’s in a number?

23 Apr

http://www.lastwordonnothing.com/2012/04/12/whats-in-a-number/

 

What’s In a Number

By Richard Panek | April 12, 2012 |

 

“Since there is an infinite number of alternative universes, there must be one in which there isn’t an infinite number of alternative universes. Perhaps this is it.”

 

No, that speculation didn’t come from the “Ask Mr. Cosmology” mailbag. It’s from a reader of New Scientist, courtesy of LWON’s own Sally, who is an editor at the magazine. She forwarded it to me because, she said, “it kind of made my head asplode.” After receiving reassurances from her that her head hadn’t actually spontaneously detonated—this is, after all, someone who is capable of falling into the Thames without any help—I sat and thought and tried to find the flaw in the logic.

 

The speculation has a logical basis in the current standard cosmological model. According to quantum theory, virtual particles should be popping into and out of existence all the time—and are, as experiments have repeatedly shown over the past six decades. In that case, the universe could be the product of one such quantum pop.

If it is, then it could have gone through a process that physicists call a “phase transition” and that everyone else calls “the thing that happens when water turns into ice or vice versa.” At the age of a trillionth of a trillionth of a trillionth of one second—that’s 1 divided by a 1 followed by 36 zeros, or 10⁻³⁶ seconds—the universe would have expanded ten septillion-fold—or to 10²⁵ times its previous size. And it would have done so over the course of 1/10³⁵ seconds.
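Cosmologists usually express that stretch in "e-folds", the number of factors of e by which the universe grows. Taking only the ten-septillion-fold figure quoted above as input, a quick calculation lands close to the roughly 60 e-folds that inflation is usually said to need:

```python
import math

# "Ten septillion-fold" expansion, expressed in e-folds (factors of e).
expansion_factor = 1e25
e_folds = math.log(expansion_factor)   # natural logarithm

print(f"e-folds ≈ {e_folds:.0f}")      # ≈ 58, in the ballpark of the ~60
                                       # usually quoted for inflation
```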

And if inflation can pop one quantum universe into existence, then why not many? In fact, according to quantum theory, it should. It would, if inflation actually happened.

The inflationary universe. Also, Sally’s head.

 

The case for inflation isn’t airtight, but with every fresh observation of the Cosmic Microwave Background—the remnant echo of the Big Bang, loosely speaking—the evidence has looked better and better. Over the past decade, consensus has coalesced: We very likely did come from a quantum pop. In that case, our inflationary bubble would be one of an ensemble of 10⁵⁰⁰ universes. The number isn’t quite infinity, as the New Scientist reader suggests, but who’s counting?

Still, let’s say the number of universes is infinite. In that case, the reader’s argument goes like so:

A. The number of universes is infinite.

B. A universe exists in which the number of universes is not infinite.

C. This might be it.

When the argument is stated this starkly, the flaw in the logic becomes pretty clear. B contradicts A. “The number of universes is infinite” and “the number of universes is not infinite” can’t both be true. The contradiction, however, is obscured by the inclusion of “A universe exists in which.” The implication is that there’s something special about universes, something that, for instance, doorknobs don’t have. “Since there is an infinite number of doorknobs, there must be one for which there isn’t an infinite number of doorknobs” wouldn’t make anyone’s head asplode, except perhaps in bewilderment.

So what’s so special about universes that the existence of an infinite number of them would, for physics-savvy readers, somehow seem to suggest the necessary existence of one that allows the impossible?

I suspect the answer is quantum probability. According to quantum theory, everything is a matter of probability; therefore anything is possible. Anything. The probability that a butterfly will give birth to a dragon or that I will one day fall into the Thames is vanishingly small—but, technically, it’s not zero. Same with the emergence of a universe, or a cornucopia of universes, from nothing. The laws of physics allow it.

And that’s the implicit, but missing, “something special” in premise B: the laws of physics. As in “A universe exists in which the laws of physics require the number of universes to not be infinite.” What prompted the New Scientist reader, and what posed a threat to Sally’s noggin, was an unthinking assumption: that “the laws of physics”—in particular quantum theory—are part of the argument.

It’s a tempting assumption. According to current cosmological thinking, if an infinite ensemble of (or 10⁵⁰⁰, anyway) universes exists, then presumably each could come equipped with its own laws of physics. So couldn’t our universe be the one in which the laws of physics require that other universes don’t exist?

Yes—but only if our laws of physics have something to do with the other universes. We all, however, went our separate ways 13.7 billion years ago. Our laws of physics affect what happens within our universe, but there’s no reason to think they would influence the multiverse at large. Doorknobs, after all, don’t dictate the laws of physics.

Still, if they did, then maybe we could reframe the New Scientist reader’s comment:

“Since there is an infinite number of alternative universes, there must be one in which there is just one alternative universe. Perhaps this is it.”

“Since there is an infinite number of alternative universes, there must be one in which there are two alternative universes. Perhaps this is it.”

“Since there is an infinite number of alternative universes, there must be one in which there are 2,125,179,218 alternative universes. Perhaps this is it.”

Memo to New Scientist staff: You can remove your plastic ponchos now.

You Become the Sound

9 Apr

In the world’s quietest place, ‘you become the sound’

By Adi Robertson

 

http://www.theverge.com/2012/4/5/2927823/orfield-laboratories-worlds-quietest-place

 

[Image via i.dailymail.co.uk]

 

The quietest place on Earth is a room in Minneapolis, Minnesota, and the longest anyone has stayed in the dark there is 45 minutes. The ‘anechoic chamber’ at Orfield Laboratories absorbs over 99 percent of sound with 3-foot-thick fiberglass wedges and insulated walls, removing virtually every sound except that of people and objects brought into the chamber. In some cases, that’s used for simple industrial purposes: it’s a way to hear the sounds of switches, motors, or washing machines without outside interference.
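To put "absorbs over 99 percent of sound" in more familiar units, here's a rough decibel translation. It treats each wall reflection as returning at most 1 percent of the incident energy, which is a simplification of how absorption is actually specified, but it conveys the scale:

```python
import math

reflected_fraction = 0.01   # at most 1% of the energy comes back per bounce

def level_drop_db(bounces: int) -> float:
    """Reduction in sound level (dB) after a number of wall reflections."""
    return -10.0 * math.log10(reflected_fraction ** bounces)

for bounces in (1, 2, 3):
    print(f"after {bounces} bounce(s): about {level_drop_db(bounces):.0f} dB quieter")
# 20, 40, 60 dB -- after a few reflections an ordinary conversation
# (~60 dB) is down around the threshold of hearing.
```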

Put a human being in there, however, and they become disoriented or even experience hallucinations. After a few minutes, founder Steven Orfield told the Daily Mail, your body begins to adapt to the soundlessness, picking up smaller and smaller sounds. “You’ll hear your heart beating, sometimes you can hear your lungs, hear your stomach gurgling loudly. In the anechoic chamber, you become the sound.” Because there are no external sounds, it’s difficult to move around: “If you’re in there for half an hour, you have to be in a chair.”

In extreme cases, the sensory deprivation is debilitating. NASA astronauts train by being placed in a water tank in the room, an experience that apparently causes hallucinations as the body tries to create sensations out of thin air. When the lights are turned out, the Mail says that the longest time anyone has been able to stay inside is 45 minutes. At Orfield, it seems, the greatest distraction of all is not noise but silence.

To Grasp a Billion Stars

9 Apr

Reposting another Phil Plait piece. This one is totally mindblowing.
 
http://mblogs.discovermagazine.com/badastronomy/

To grasp a billion stars

There are times — rare, but they happen — when I have a difficult time describing the enormity of something. Something so big, so overwhelming, that words simply cannot suffice.

The basic story is: Using the VISTA telescope in Chile and the UKIRT telescope in Hawaii, astronomers have made an incredibly detailed map of the sky in infrared. This map will help us understand our own galaxy, more distant galaxies, quasars, nebulae, and much more.

But what do I mean by “incredibly detailed”?

This is where words get hard. So hang on tight; let me show you instead.

Here’s a section of the survey they made, showing the star-forming region G305, an enormous cloud of gas about 12,000 light years away which is busily birthing tens of thousands of stars:

[Click to enstellarnate.]

Pretty, isn’t it? There are about 10,000 stars in this image, and you can see the gas and dust that’s forming new stars even as you look.

But it’s the scale of this image that’s so amazing. It’s only a tiny, tiny part of this new survey. How tiny? Well, it came from this image (the area of the first image is outlined in the white square):

Again, click to embiggen — it’ll blow your socks off. But we’re not done! That image is a subsection of this one:

… which itself is a subsection of this image:

Sure, I’ll admit that last one doesn’t look like much, squished down into a width of a few hundred pixels here for the blog. So go ahead, click on it. I dare you. If you do, you’ll get a roughly 20,000 x 2000 pixel picture of the sky, a mosaic made from thousands of individual images… and even that is grossly reduced from the original survey.

How big is the raw data from the survey? Why, it only has 150 billion pixels aiieeee aiieeeeee AIIEEEEE!!!

And this would be where I find myself lacking in adjectives. Titanic? Massive? Ginormous? These all fail utterly when trying to describe a one hundred fifty thousand megapixel picture of the sky.
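If "one hundred fifty thousand megapixel" is still hard to picture, the arithmetic itself is short. The 12-megapixel phone camera used for comparison is my own illustrative assumption, not a figure from the post:

```python
total_pixels = 150e9                      # 150 billion pixels in the raw survey

megapixels = total_pixels / 1e6
print(f"{megapixels:,.0f} megapixels")    # 150,000 megapixels

phone_camera_megapixels = 12              # assumed typical phone camera
print(f"≈ {megapixels / phone_camera_megapixels:,.0f} phone photos' worth of pixels")
```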

Yegads.

And again, why worry over words when I can show you? The astronomers involved helpfully made the original data — all 150 billion pixels of it — into a pan-and-zoomable image where you can zoom in, and in, and in. It’s hypnotizing, like watching “Inception”, but made of stars.

And made of stars it is: there are over a billion stars in the original image! A billion. With a B. It’s one of the most comprehensive surveys of the sky ever made, and yet it still only scratches the surface. This survey only covers the part of the sky where the Milky Way galaxy itself is thickest — in the bottom image above you can see the edge-on disk of our galaxy plainly stretching across the entire shot — and that’s only a fraction of the entire sky.

Think on this: there are a billion stars in that image alone, but that’s less than 1% of the total number of stars in our galaxy! As deep and broad as this amazing picture is, it’s a tiny slice of our local Universe.
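And the "less than 1 percent" claim is easy to check against the usual estimate that the Milky Way holds somewhere between 100 and 400 billion stars (a commonly quoted range, not a number from the post):

```python
stars_in_image = 1e9   # "a billion stars", from the post

for total_stars in (100e9, 400e9):
    fraction = stars_in_image / total_stars
    print(f"{fraction:.2%} of a {total_stars / 1e9:.0f}-billion-star galaxy")
# 1.00% and 0.25% -- about one percent at the very most, likely a good deal less
```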

And once again, we’ve reached the point where I’m out of words. Our puny brains, evolved to count the number of our fingers and toes, to grasp only what’s within reach, to picture only what we can immediately see — balk at these images.

But… we took them. Human beings looked up and wondered, looked around and observed, looked out and discovered. In our quest to seek ever more knowledge, we built the tools needed to make these pictures: the telescopes, the detectors, the computers. And all along, the power behind that magnificent work was our squishy pink brains.

A billion stars in one shot, thanks to a fleshy mass of collected neurons weighing a kilogram or so. The Universe is amazing, but so are we.

Images credit: Mike Read (WFAU), UKIDSS/GPS and VVV
