2013-02-27

Hooray for Science

A Moment to Celebrate Science

In February, we celebrate the birthdays of two people whose simple observations established much of what we consider modern science: Darwin (Feb. 12) and Galileo (Feb. 15).

Each of these men made observations that changed the way we think and, to a rather extraordinary extent, defined how doctors at their best should practice medicine.

You might think that medical practice rests soundly on the firmest principles of science, but as time goes on, the connection between medicine and science seems to be getting weaker and stronger at the same time.

Given that there are forces that would reverse the power of science to help heal humanity, it seems fitting to say a word about what science is, why it is suspect, and why it should be valued as the ultimate foundation for good medical practice.

What is Science?
At its best, science is a well-defined approach to two big challenges in life: figuring out why things happen, and figuring out how to make things happen.  Perhaps the most curious thing about these challenges is that our minds did not evolve to answer either question very clearly.

It turns out our minds were built to make the best guess possible in the moment.  The people who were best, at a glance, at knowing a tiger was over there were more likely to have descendants than those who were not.  Life itself values the best guess over almost any other technique.

But over time, our minds progressed to allow us the option of thinking past the best guess, and towards a deeper understanding of true causes and true effects.  

Science, then, is an approach to thinking about the world that asks us to go beyond what is apparent and try to figure out what is real, particularly when it comes to explaining why something happens, or how to make something happen.

The work of Galileo and Darwin serves as an outstanding example of what science is.  In every instance, Galileo's great work began with simply seeing something and asking a question about it.  There are many examples, but my two favorites are his observation about weight and gravity, and his observation that we do not notice our own motion.

The thing Galileo noticed, which we still have trouble believing today, is that a very heavy ball dropped from a height falls to Earth at exactly the same rate as a light ball.  Heavy things don't fall faster than light things.  The famous story is that he proved this by dropping a very heavy cannonball and a light ball from a very tall tower and watching them land at the same time.
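The claim can be checked with a quick sketch. In the simple constant-gravity model (ignoring air resistance), the fall time depends only on the height of the drop; the mass never appears in the formula. The ~56 m tower height below is an illustrative assumption, roughly that of the Tower of Pisa.

```python
import math

def fall_time(height_m, g=9.81):
    """Time for an object dropped from rest to fall height_m meters,
    ignoring air resistance: h = (1/2) * g * t**2, so t = sqrt(2h/g).
    Notice that the object's mass appears nowhere in the formula."""
    return math.sqrt(2 * height_m / g)

# A cannonball and a light ball dropped from ~56 m take the same time.
t = fall_time(56)
print(f"Fall time from ~56 m: {t:.2f} s (for any mass)")  # ≈ 3.38 s
```

The point of the sketch is exactly Galileo's: since mass is not an input, the heavy and light balls get the same answer.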

The other observation he made was that even though we are whizzing through space at truly high speeds, we don't feel as if the Earth is moving at all.  But we are.  The Earth's rotation carries us along at about 1,000 miles per hour, and its orbit carries us around the Sun at about 67,000 miles per hour.  If we got in a car and drove in a big circle at 67,000 mph, or a tighter circle at 1,000 mph, we would feel it, and dramatically.
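The arithmetic behind those two speeds is simple division. The circumference and distance figures below are standard textbook values, not figures from the essay:

```python
import math

# Standard rough figures (assumptions for this sketch):
EARTH_CIRCUMFERENCE_MI = 24_901      # Earth's equatorial circumference, miles
SUN_DISTANCE_MI = 93_000_000         # average Earth-Sun distance, miles

# One full rotation takes 24 hours.
rotation_speed = EARTH_CIRCUMFERENCE_MI / 24

# One full orbit (a circle of radius 93 million miles) takes about a year.
orbit_speed = 2 * math.pi * SUN_DISTANCE_MI / (365.25 * 24)

print(f"Rotation: ~{rotation_speed:,.0f} mph")  # ≈ 1,000 mph
print(f"Orbit:    ~{orbit_speed:,.0f} mph")     # ≈ 67,000 mph
```

Both results land close to the round numbers quoted above.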

So how could we possibly move at such tremendous speeds (a rifle bullet goes about 2,000 mph) and not feel as though we are moving at all?  It turns out that no matter how fast you move, if your speed is constant and your background doesn't change much, you do not notice moving at all.  Believe it or not, it was this observation, simply noticing that we don't feel the Earth spin or orbit, that launched the centuries-long march toward the theory of relativity.
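Why the Earth's enormous circles are imperceptible while the car's tight one is not comes down to acceleration: our bodies sense changes in speed or direction, not speed itself, and for motion in a circle the felt acceleration is v²/r. A rough sketch, using standard textbook values for the radii (assumptions, not figures from the essay):

```python
MPH_TO_MS = 0.44704  # miles per hour -> meters per second

def centripetal_accel(speed_mph, radius_m):
    """Acceleration felt when moving in a circle: a = v**2 / r."""
    v = speed_mph * MPH_TO_MS
    return v**2 / radius_m

# Riding the Earth's spin: ~1,000 mph around a ~6,371 km radius.
a_spin = centripetal_accel(1_000, 6_371_000)
# Riding the Earth's orbit: ~67,000 mph around a ~150 million km radius.
a_orbit = centripetal_accel(67_000, 150_000_000_000)
# A car taking a tight 50 m traffic circle at only 30 mph.
a_car = centripetal_accel(30, 50)

print(f"Earth's spin:  {a_spin:.4f} m/s^2")   # a few hundredths
print(f"Earth's orbit: {a_orbit:.4f} m/s^2")  # a few thousandths
print(f"Car, 30 mph:   {a_car:.2f} m/s^2 (vs. gravity at 9.81)")
```

Because the Earth's circles are so vast, the accelerations are hundreds of times too small for our bodies to notice, while the slow car in a tight circle presses on us quite noticeably.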

And, of course, Darwin observed that animals and plants change in response to changes in their environment: that species evolve.  His entire theory of evolution came from simple observations of the world.

So science is an approach to understanding why things happen based on observing what is actually happening around us.

Why is Science Suspect?

One might assume that simply observing what actually happens would not be controversial, but we live in a time of great tension over the ways of science.

On the one hand, we treasure it.  Almost everyone uses, treasures, or depends on some form of electronic wonder: not only the fancy things like smartphones, the cloud, and the Internet, but even television itself.  Not many people would gladly go back to a day before we knew what electricity was.  And when it comes time for a major medical intervention, like having appendicitis and needing your appendix removed, nearly everyone is grateful for the progress made in fields like medicine, surgery, microbiology, and, yes, pharmaceuticals.

But on the other hand, there is a deep unease with science, a suspicion of it, and a reaction against it.

It is not clear what truly fuels this suspicion.  The unease may come in part from a sense that the heady days of science actually solving our problems have given way to problems that are more complicated and unlikely to have quick fixes.  It may also come in part from a sense that scientific work too often gets caught up in money-making, with research giving way to business.

We see this suspicion across the medical sciences when drug companies devote so many resources to promoting an ever-growing array of drugs, or when medical centers push so hard for people to have a rapidly growing menu of tests and procedures.

Whatever the reasons, we are living through an era that is witnessing the rise of well-organized opposition to even the most direct and simple observations.  From religion to politics to grassroots non-profits, people are organized to deny the observations made about evolution, the melting of our ice caps, and the benefits of some basic medical interventions.

But Science Turns Out to Be the Best Foundation for Good Medical Practice

I am very sympathetic to many of the suspicions raised about science, but I find that most of the trouble relates to how the observations science makes are used, not to the observations themselves.  Consider the example noted above regarding people's actual experience with medicines, even just antibiotics.  The observation that certain chemicals kill bacteria is one of the great discoveries of history, but the actual use of antibiotics is a bit crazy right now.  We soak our livestock in them, and we wildly overuse them for conditions, like colds, for which they offer no help at all.  Everyone has an uneasy feeling that we are being oversold on the use of medications, and that much harm is done to us as a result.

Our main point is that we should be very careful not to blame science itself for the problem.
The problem in this example is the selling of antibiotics, not the discovery of them.

BOTTOM LINE

  1. Science has come under unfair suspicion recently, for a number of reasons.
  2. But most of the trouble related to science comes from how its products are used and sold.
  3. Science itself is a wonderful approach to the world that simply consists of observing what happens and learning what really goes on, how things work, and how to make things better.
  4. The challenge we all face today is how to oppose the abuse of the products of science without rejecting science itself, AND without embracing clearly irrational approaches to problem-solving (also known as quackery).
  5. Science stands alone in its ability to improve our understanding of how things work in the world and how to make things work better.    This is not only a good foundation for the practice of medicine, it is essential if medicine is to really help anyone.
  6. At Advanced Pediatrics we work hard to respect and learn from the great observations science delivers and to combine that with the second pillar of medical care: caring and connecting.

Dr. Arthur Lavin

PS: Here is one of the best essays on science I have read in a long time, an essay on Galileo:
http://www.newyorker.com/arts/critics/atlarge/2013/02/11/130211crat_atlarge_gopnik?printable=true



*Disclaimer* The comments contained in this electronic source of information do not constitute and are not designed to imply that they constitute any form of individual medical advice. The information provided is purely for informational purposes only and not relevant to any person's particular medical condition or situation. If you have any medical concerns about yourself or your family please contact your physician immediately. In order to provide our patients the best uninfluenced information that science has to offer, we do not accept samples of drugs, advertising tchotchkes, money, food, or any item from outside vendors.

http://www.newyorker.com/arts/critics/atlarge/2013/02/11/130211crat_atlarge_gopnik?printable=true&currentPage=all

A CRITIC AT LARGE

MOON MAN

What Galileo saw.

by Adam Gopnik, February 11, 2013

Galileo facing the Inquisition: he provided every argument for toleration he could, and still the Church couldn’t tolerate him.

Although Galileo and Shakespeare were both born in 1564, just coming up on a shared four-hundred-and-fiftieth birthday, Shakespeare never wrote a play about his contemporary. (Wise man that he was, Shakespeare never wrote a play about anyone who was alive to protest.) The founder of modern science had to wait three hundred years, but when he got his play it was a good one: Bertolt Brecht’s “Galileo,” which is the most Shakespearean of modern history plays, the most vivid and densely ambivalent. It was produced with Charles Laughton in 1947, during Brecht’s Hollywood exile, and Brecht’s image of the scientist as a worldly sensualist and ironist is hard to beat, or forget. Brecht’s Galileo steals the idea for the telescope from the Dutch, flatters the Medici into giving him a sinecure, creates two new sciences from sheer smarts and gumption—and then, threatened by the Church with torture for holding the wrong views on man’s place in the universe, he collapses, recants, and lives on in a twilight of shame.
It might be said that Brecht, who truckled to the House Un-American Activities Committee—“My activities . . . have always been purely literary activities of a strictly independent nature”—and then spent the next bit of his own life, post-Hollywood, accessorized to the Stalinist government of East Germany, was the last man in the world to be pointing a finger at someone for selling out honesty for comfort. But then the last man who ought to point that finger is always the one who does. Galileo’s shame, or apostasy, certainly shapes the origin myth of modern science, giving it not a martyr-hero but a turncoat, albeit one of genius. “Unhappy is the land that breeds no heroes,” his former apprentice says at the play’s climax to the master who has betrayed the Copernican faith. “No,” Galileo replies, “unhappy is the land that needs a hero.” It is a bitter valediction for the birth of the new learning. The myth that, once condemned, he muttered under his breath, about the earth, “But still, it moves,” provides small comfort for the persecuted, and is not one that Brecht adopted.
A number of books have come out in anticipation of the anniversary, including a fine big biography, “Galileo” (Oxford), by the Berkeley historian of science John L. Heilbron, and new studies reflecting new research within the archives of the Roman Inquisition. Modern scholars have a gravitational pull toward ancient bureaucrats—keep records even of your cruelties and history will love you—and the new research has produced a slightly, if significantly, revised picture of Galileo’s enemies. The newer (and, unsurprisingly, Church-endorsed) view is that Galileo made needless trouble for himself by being impolitic, and that, in the circumstances of the time, it would have been hard for the Church to act otherwise. The Church wanted, as today’s intelligent designers now say, to be allowed to “teach the controversy”—to teach the Copernican and Aristotelian views as rival hypotheses, both plausible, both unproved. All Galileo had to do was give the Church a break and say that you could see it that way if you wanted to. He wouldn’t give it a break. The complaint is, in a way, the familiar torturer’s complaint: Why did you force us to do this to you? But the answer is the story of his life.
Although the twinship of Shakespeare and Galileo is one that we see retrospectively, another, even more auspicious twinning was noted and celebrated during Galileo’s lifetime: Galileo was born in Pisa on the day that Michelangelo died. In truth, it was probably about a week later, but the records were tweaked to make it seem so. The connection was real, and deep. Galileo spent his life as an engineer and astronomer, but his primary education was almost exclusively in what we would call the liberal arts: music, drawing, poetry, and rhetoric—the kind of thing that had made Michelangelo’s Florence the capital of culture in the previous hundred years.
Galileo was afflicted with a cold and crazy mother—after he made his first telescope, she tried to bribe a servant to betray its secret so that she could sell it on the market!—and some of the chauvinism that flecks his life and his writing may have derived from weird-mom worries. He was, however, very close to his father, Vincenzo Galilei, a lute player and, more important, a musical theorist. Vincenzo wrote a book, startlingly similar in tone and style to the ones his son wrote later, ripping apart ancient Ptolemaic systems of lute tuning, as his son ripped apart Ptolemaic astronomy. Evidently, there were numerological prejudices in the ancient tuning that didn’t pass the test of the ear. The young Galileo took for granted the intellectual freedom conceded to Renaissance musicians. The Inquisition was all ears, but not at concerts.
Part of Galileo’s genius was to transfer the spirit of the Italian Renaissance in the plastic arts to the mathematical and observational ones. He took the competitive, empirical drive with which Florentine painters had been looking at the world and used it to look at the night sky. The intellectual practices of doubting authority and trying out experiments happened on lutes and with tempera on gesso before they turned toward the stars. You had only to study the previous two centuries of Florentine drawing, from the rocky pillars of Masaccio to the twisting perfection of Michelangelo, to see how knowledge grew through a contest in observation. As the physicist and historian of science Mark Peterson points out, the young Galileo used his newly acquired skills as a geometer to lecture on the architecture of Hell as Dante had imagined it, grasping the hidden truth of “scaling up”: an Inferno that big couldn’t be built on classical engineering principles. But the painters and poets could look at the world, safely, through the lens of religious subjects; Galileo, looking through his lens, saw the religious non-subject. They looked at people and saw angels; he looked at the heavens, and didn’t.
In the fifteen-eighties, Galileo studied at the University of Pisa, where he absorbed the Aristotelian orthodoxy of his time—one as synthetic as most orthodoxy is. There were Arab-spiced versions of Aristotle, which led first to alchemy and then to chemistry; more pious alternatives merged the Greek philosopher with St. Thomas Aquinas. They all agreed that what made things move in nature was an impetus locked into the moving things themselves. The universe was divided into neat eternal zones: the earth was rough, rugged, and corrupt with mortality, and therefore had settled in, heavy and unhappy, at the center of the universe. Things up above were pure and shining and smooth, and were held aloft, like the ladies in the Renaissance romances, by the conceited self-knowledge of their perfection. Movement was absolute. Things had essences, constantly revealed. You could know in advance how something would move or act by knowing what it was. A brick and a cannonball, dropped from a tower, would fall at different rates based on their weight. And the best argument, often the only argument, for all these beliefs was that Aristotle had said so, and who were you to say otherwise?
Galileo soon began to have doubts about this orthodoxy, which he aired in conversation with friends and then in correspondence with other natural philosophers in Europe, particularly the great German astronomer Johannes Kepler. Mail was already the miracle of the age. In correspondence, the new science passed back and forth through Europe, almost as fluidly as it does in the e-mail era. It’s astonishing to follow the three-way correspondence among Tycho Brahe, Kepler, and Galileo, and see how little time was lost in disseminating gossip and discovery. Human curiosity is an amazing accelerant.
Kepler encouraged Galileo to announce publicly his agreement with the sun-centered cosmology of the Polish astronomer monk Copernik, better known to history by the far less euphonious, Latinized name of Copernicus. His system, which greatly eased astronomical calculation, had been published in 1543, to little ideological agitation. It was only half a century later, as the consequences of pushing the earth out into plebeian orbit dawned on the priests, that it became too hot to handle, or even touch.
In 1592, Galileo made his way to Padua, right outside Venice, to teach at the university. He promised to help the Venetian Navy, at the Arsenale, regain its primacy, by using physics to improve the placement of oars on the convict-rowed galleys. Once there, he earned money designing and selling new gadgets. He made a kind of military compass and fought bitterly in support of his claim to have invented it. Oddly, he also made money by casting horoscopes for his students and wealthy patrons. (Did he believe in astrology? Maybe so. He cast them for himself and his daughters, without being paid.)
If you were trying to choose the best places in history to have lived—making allowances for syphilis, childbirth mortality, and all the other pre-antibiotic plagues—Venice in Galileo’s day would have to be high on the list. The most beautiful of cities, with the paint still wet on the late Bellinis and Titians, Venice also had wonderful music, geisha-like courtesans, and a life of endless, mostly free conversation. Galileo called these years the happiest of his life.
He became an ever more convinced Copernican, but he had his crotchets. He never accepted Kepler’s proof that the orbits of the planets in the Copernican system had to be ellipses, because he loved the perfection of circles; and he was sure that the movement of the tides was the best proof that the earth was turning, since the ocean water on the earth’s surface was so obviously sloshing around as it turned. The truth—that the moon was pulling the water at a distance—seemed to him obvious nonsense, and he never tired of mocking it.
Although Copernicus didn’t see any big ideas flowing from the sun-centered system, the Church was slowly beginning to suspect that heliocentrism, heretically, elbowed man right out of the center of things. Galileo alone saw something more: the most interesting thing about the earth’s spinning at high speeds around the sun was that, in the normal course of things, none of us noticed. One of the deepest insights in the history of thought was his slowly developed idea of what we now call the “inertial system”: the idea that the physics stays the same within a system whether it’s in rapid movement or at rest—indeed, that “rest” and “movement” are relative terms. Physical laws, he insisted, are the same in all inertial systems. We experience the earth as stable and still, but it might well be racing around the cosmos, just as we could lock ourselves up in the hold of a ship and, if it was moving evenly, never know that it was moving at all. (The insight is nicely available to New Yorkers when the local and the express trains catch up on parallel subway tracks, and, travelling alongside each other at the same speed, suddenly seem to stand still.) Fast and slow, large and small, up and down are all relative conditions, and change depending on where you stand and how fast you’re moving. The idea demolished absolutes and democratized the movement of the spheres. Galileo grasped some of the significance of what he had discovered, writing later that “to our natural and human reason, I say that these terms ‘large,’ ‘small,’ ‘immense,’ ‘minute,’ etc. are not absolute but relative; the same thing in comparison with various others may be called at one time ‘immense’ and at another ‘imperceptible.’ ” But he saw only sporadically just how far you could push the principle: he saw the sun at the center of things, and didn’t reflect, at any length, that the sun might itself be turning around some other star.
In 1609, Galileo heard rumors about a Dutch gadget that gave you a closeup look at faraway ships and distant buildings. After a friend sent him the dimensions and the basic layout—two lenses in a forty-eight-inch tube—he got to work, and within weeks had made his own telescope. One night in December, he turned it on the moon, and saw what no man had seen before. Or, rather, since there were Dutch gadgets in many hands by then, and many eyes, he understood what he was seeing as no man of his time had before—that shadows from some of the splotches were craters and others mountains. The moon was not a hard, pure sphere; it was geological.
A few weeks later, he pointed his gadget at Jupiter. Some of his notes, scratched on the back of an envelope, still exist, here in New York, at the Morgan Library. He was startled to see four little stars near the planet. In an episode in the history of thought that can still make the heart beat faster, he noticed that, night after night, they were waltzing back and forth near the big planet: first left, then right, never quite clearing its path, as though the planet were sticky and they wanted to stay near it. In a flash of intuition, he had it: the new stars near Jupiter were actually moons, orbiting the planet as our moon orbits us. So their light might be reflected light, as is our moon’s. All moonlight might be sunshine, bounced off a hall of celestial mirrors. More important still, there in the sky was a miniature Copernican system, visible to the aided eye.
It’s hard to overstate how important the telescope was to Galileo’s image. It was his emblem and icon, the first next big thing, the ancestor of Edison’s light bulb and Steve Jobs’s iPhone. A Tuscan opportunist to the bone, Galileo rushed off letters to the Medici duke in Florence, hinting that, in exchange for a job, he would name the new stars after the Medici. He wanted to go back to Florence, partly, it seems, because he wanted to persuade the smart, well-educated Jesuits who clustered there to accept his world picture. Sell the powerful Jesuits on the New Science, he thought, and you wouldn’t have to worry about the Inquisition or the Pope. Galileo felt himself already under enough religious pressure to continue to encode all talk of his discoveries in his correspondence with Kepler. He even sent him a letter about the phases of Venus in cipher, ending, “Oy!” Really, he did. Heilbron suggests, smilingly, that this hints at Jewish ancestry. (No evidence exists that Kepler replied “Vey!”)
Throughout Italy, the Inquisition was what Heilbron calls “low-level background terrorism.” (One of Galileo’s servants had already reported him for not going to Mass regularly.) It was an Italian Inquisition, meaning subject to the laws and influences of clan, and cheerfully corrupt, but disinclined to killing. Disinclined but not incapable; as recently as 1600, the Roman Inquisition had burned alive, in public, the great Giordano Bruno, who taught the doctrine of the plurality of worlds, uncomfortably like Galileo’s doctrine of many moons. It was unusual for the Inquisition to burn philosophers alive; on the other hand, how many philosophers do you have to burn alive to keep other philosophers from thinking twice before they say anything inflammatory?
The Catholic Church in Italy then was very much like the Communist Party in China now: an institution in which few of the rulers took their own ideology seriously but still held a monopoly on moral and legal authority, and also the one place where ambitious, intelligent people could rise, even without family connections (though they helped). Like the Party in China now, the Church then was pluralistic in practice about everything except an affront to its core powers.
For the next two decades, Galileo tried to do what we would now call basic research while simultaneously negotiating with the Church to let him do it. Eventually, he and the Church came to an implicit understanding: if he would treat Copernicanism merely as a hypothesis, rather than as a truth about the world, it would be acceptable—if he would claim his work only as “istoria,” not as “dimostrazione,” the Inquisitors would leave him alone. The Italian words convey the same ideas as the English equivalents: a new story about the cosmos to contemplate for pleasure is fine, a demonstration of the way things work is not. You could calculate, consider, and even hypothesize with Copernicus. You just couldn’t believe in him.
Again, the distinction, bewildering on the surface, makes sense transposed to contemporary China: you can engage in the free market, and make every calculation that the University of Chicago demands. But you can’t publish a book saying that Milton Friedman was right about everything and Mao was wrong. Galileo even seems to have had six interviews with the sympathetic new Pope, Urban VIII—a member of the sophisticated Barberini family—in which he was more or less promised freedom of expression in exchange for keeping quiet about his Copernicanism. It was a poisoned promise: though Galileo, vain as ever, thought he could finesse the point, Copernicanism was at the heart of what he wanted to express.
It all came to a head in 1632, with the publication of his masterpiece, manifesto, poem, and comedy, “Dialogue Concerning the Two Chief World Systems.” Set in Venice as a conversation among three curious friends, the book was in part an evocation of happy times there—a highly stylized version of the kinds of evenings and conversations Galileo had once had. It was in honor of those evenings that he named two of the characters after his friends: Salviati, who here speaks entirely for Galileo, and Sagredo, who represents an honest non-scientist of common sense. He invented a third puppet, Simplicio, who speaks, stumblingly, for Aristotle and the establishment—the other World System. Salviati describes him as “one of that herd who, in order to learn how matters such as this take place, do not betake themselves to ships or crossbows or cannons, but retire into their studies and glance through an index and a table of contents to see whether Aristotle has said anything about them.” Aristotle is to Simplicio one of those complete thinkers, of the Heidegger or Ayn Rand kind, whose every thought must be true even if you can’t show why it is in this particular instance: it explains everything except anything.
“Dialogue Concerning the Two Chief World Systems” is the most entertaining classic of science ever published. Written in the vernacular—the best modern translation is by Stillman Drake—it uses every device of Renaissance humanism: irony, drama, comedy, sarcasm, pointed conflict, and a special kind of fantastic poetry. There are passages that are still funny, four hundred years later. At one point, the dispute takes up the high-minded Aristotelian view that “corrupt” elements must have trajectories different from pure ones, and Sagredo points out that an Aristotelian author “must believe that if a dead cat falls out of a window, a live one cannot possibly fall, too, since it is not a proper thing for a corpse to share in qualities suitable for the living.” The dialogue is also philosophically sophisticated. Though Galileo/Salviati wants to convince Simplicio and Sagredo of the importance of looking for yourself, he also wants to convince them of the importance of not looking for yourself. The Copernican system is counterintuitive, he admits—the earth certainly doesn’t seem to move. It takes intellectual courage to grasp the argument that it does.
Galileo’s tone is thrilling: he is struggling to find things out, and his eye covers everything from the movement of birds in the air to the actual motion of cannonballs fired at the horizon, from the way stars glow to the way all movable bones of animals are rounded. There’s even a lovely moment when, trying to explain to Simplicio how deceptive appearances can be, Sagredo refers to “the appearance to those who travel along a street by night of being followed by the moon, with steps equal to theirs, when they see it go gliding along the eaves of the roofs.” You can’t trust your eyes, but you can’t trust old books, either. What can you trust? Nothing, really, is Galileo/Salviati’s answer, only some fluid mixture of sense impression and strong argument. “Therefore, Simplicius, come either with arguments and demonstrations,” Salviati declares, in Thomas Salusbury’s fine Jacobean translation, in words that remain the slogan of science, “and bring us no more Texts and authorities, for our disputes are about the Sensible World, and not one of Paper.”
Contemporary historians of science have a tendency to deprecate the originality of the so-called scientific revolution, and to stress, instead, its continuities with medieval astrology and alchemy. And they have a point. It wasn’t that one day people were doing astrology in Europe and then there was this revolution and everyone started doing astronomy. Newton practiced alchemy; Galileo drew up all those horoscopes. But if you can’t tell the difference in tone and temperament between Galileo’s sound and that of what went before, then you can’t tell the difference between chalk and cheese. The difference is apparent if you compare what astrologers actually did and what the new astronomers were doing. “The Arch-Conjuror of England” (Yale), Glynn Parry’s entertaining new biography of Galileo’s contemporary the English magician and astrologer John Dee, shows that Dee was, in his own odd way, an honest man and a true intellectual. He races from Prague to Paris, holding conferences with other astrologers and publishing papers, consulting with allies and insulting rivals. He wasn’t a fraud. His life has all the look and sound of a fully respectable intellectual activity, rather like, one feels uneasily, the life of a string theorist today.
The look and the sound of science . . . but it does have a funny smell. Dee doesn’t once ask himself, “Is any of this real or is it all just bullshit?” If it works, sort of, and you draw up a chart that looks cool, it counts. Galileo never stopped asking himself that question, even when it wasn’t bullshit but sounded as though it might well be. That’s why he went wrong on the tides; the-moon-does-it-at-a-distance explanation sounds too much like the assertion of magic. The temperament is not all-seeing and curious; it is, instead, irritable and impatient with the usual stories. The new stories might be ugly, but they’re not crap. “It is true that the Copernican system creates disturbances in the Aristotelian universe,” Salviati admits in the “Dialogue,” “but we are dealing with our own real and actual universe.”
What is so strange, and sad, given what would soon happen, is that “Two Chief World Systems” contains some of the best “accommodationist” rhetoric that has ever been written. To the objections that the Copernican universe, with its vast spaces outside the solar system, is now too big to be beautiful, Galileo has his puppets ask, Too big for whom? How presumptuous to say it is too big for God’s mind! God’s idea of beauty is surely different and more encompassing than ours. The truth that God has his eye on the sparrow means that the space between the sparrow and outer space is impossible for us to see as God sees it.
These are the arguments that, less eloquently put, are used now by smart accommodationists in favor of evolution. Evolution is not an alternative to intelligent design; it is intelligent design, seen from the point of view of a truly intelligent designer. Galileo was happy enough to go on doing research under the generally benevolent umbrella of the Church if only it would let him.
It wouldn’t let him. He provided every argument for toleration he could, and still he wasn’t tolerated. Part of the trouble was traceable to his hubris: he had remembered at the last minute to put the Pope’s favorite argument for a “hypothetical” reading of Copernicus into his book, but he had made it into a closing speech for Simplicio, and when you are going to put the Pope’s words in a puppet’s mouth it is a good idea first to make sure that the puppet is not named Dumbso. But it went deeper than the insult. Whatever might be said to accord faith and Copernicus, religion depends for its myth on a certain sense of scale. Small domestic dogmatists are always merely funny (like Alceste, in “The Misanthrope,” or the dad in just about any American sitcom). Man must be at the center of a universe on a stable planet, or else the core Catholic claim that the omnipotent ruler of the cosmos could satisfy his sense of justice only by sending his son here to be tortured to death begins to seem a little frayed. Scale matters. If Clark Kent had never left Smallville, then the significance of Superman would be much reduced.
Two new books by the historian Thomas F. Mayer take up exactly what happened to Galileo: “The Trial of Galileo” (Toronto) is specifically about the scientist’s persecution by the Inquisition, while his much longer “The Roman Inquisition: A Papal Bureaucracy and Its Laws in the Age of Galileo” (Pennsylvania) delves into its social and intellectual context. Mayer deprecates the conventional account as, in the words of another scholar, “shrouded in myth and misunderstanding.” But, when you’ve read through his collected evidence, the myth seems pretty much right: Galileo wrote a book about the world saying that the earth goes around the sun, and the Church threatened to have him tortured or killed if he didn’t stop saying it, so he stopped saying it. Mayer believes that had Galileo been less pugnacious things would have worked out better for science; yet his argument is basically one of those “If you put it in context, threatening people with hideous torture in order to get them to shut up about their ideas was just one of the ways they did things then” efforts, much loved by contemporary historians.
To be sure, Galileo’s trial was a bureaucratic muddle, with crossing lines of responsibility, and it left fruitfully unsettled the question of whether Copernican ideas had been declared heretical or if Galileo had simply been condemned as an individual for continuing to promote them after he had promised not to. But what is certain is that, in 1633, Galileo was threatened with torture, forced on his knees to abjure his beliefs and his book, and then kept under house arrest and close watch for the rest of his life. (Albeit of a fairly loose kind: John Milton came to see him, and the image of the imprisoned scientist appears in Milton’s defense of free speech, the “Areopagitica.”) Galileo’s words, read a certain way, were not innocent of irony: “I do not hold the Copernican opinion, and have not held it after being ordered by injunction to abandon it.” Notice that he does not say that he never held it, or that he would not still hold it, had he not been forced to abandon it.
Could he, as Brecht might have wanted, have done otherwise, acted more heroically? Milton’s Galileo was a free man imprisoned by intolerance. What would Shakespeare’s Galileo have been, one wonders, had he ever written him? Well, in a sense, he had written him, as Falstaff, the man of appetite and wit who sees through the game of honor and fidelity. Galileo’s myth is not unlike the fat knight’s, the story of a medieval ethic of courage and honor supplanted by the modern one of cunning, wit, and self-knowledge. Martyrdom is the test of faith, but the test of truth is truth. Once the book was published, who cared what transparent lies you had to tell to save your life? The best reason we have to believe in miracles is the miracle that people are prepared to die for them. But the best reason that we have to believe in the moons of Jupiter is that no one has to be prepared to die for them in order for them to be real.
So the scientist can shrug at the torturer and say, Any way you want me to tell it, I will. You’ve got the waterboard. The stars are still there. It may be no accident that so many of the great scientists really have followed Galileo, in ducking and avoiding the consequences of what they discovered. In the roster of genius, evasion of worldly responsibility seems practically a fixed theme. Newton escaped the world through nuttiness, Darwin through elaborate evasive courtesies and by farming out the politics to Huxley. Heisenberg’s uncertainty was political—he did nuclear-fission research for Hitler—as well as quantum-mechanical. Science demands heroic minds, but not heroic morals. It’s one of the things that make it move. 

Read more: http://www.newyorker.com/arts/critics/atlarge/2013/02/11/130211crat_atlarge_gopnik?printable=true#ixzz2KJby2NQa 

2013-02-01

The Word on Hemangiomas in Infancy- A Type of Red Bump




Since about one in twenty people develops a particular type of red bump, called an infantile hemangioma, we thought you might be interested in the latest information about them.

These are so common that most people have a child with one or know someone who does.  They are often called strawberry hemangiomas, or strawberries, which is how I will refer to them in this note.

What is a Strawberry Hemangioma?

Strawberries have the wonderful property of tending to go away all on their own.   But what are they?

To understand them, it helps to know just a bit about blood vessels.  Blood vessels are not inert pipes, but rather are formed by cells that are especially adept at forming living tubes.  Strawberries are clumps of these special cells that form a tangle of tiny, tiny blood vessels.

Strawberries are very different from the usual assembly of these special cells.  Strawberries do not contain arteries, veins, or even well-developed capillaries.  Rather, they contain clumps of the cells that make these structures and so really are a tangle of nearly developed blood vessels.

The Timing of a Strawberry Appearing and Disappearing

The timing of appearance and disappearance of these red bumps is quite striking.  Here is how it usually goes:
  1. The strawberries tend not to be present at birth.
  2. They tend to appear in the first few weeks of life.
  3. Then they grow for a few months, typically to age 4-8 months old.
  4. If any part of a strawberry turns a bit blue or gray, that usually is the signal that it will no longer grow.
  5. After it ceases to grow, it slowly shrinks in a process that can take several years.  Often by age 4-6 years old, the strawberry is gone.
  6. Most strawberries that are small and not too deep disappear completely.  
  7. Deeper or larger strawberries also tend to have their color fully clear, but may leave behind a small rubbery lump.
What is the Cause of the Strawberry Hemangioma?

So what causes these to occur, and who is more likely to get them?

The who-is-more-likely question is well studied.  We see strawberries more often the more prematurely a child is born.  We also see them more often in girls than in boys, in children of European descent more than African or Asian descent, in multiples more than singletons, and in children of older mothers more than younger mothers. 

As to what causes them, recent leads have been quite intriguing.  

The first is the idea that some early embryonic cells from the placenta sometimes find their way into the fetal circulation and lodge in the skin.

The second idea is that as the fetus forms, certain areas may be relatively short on oxygen.  We know that shortages of oxygen supply stimulate blood vessel formation, so this concept suggests that when a local area of the developing body is relatively low on oxygen, the cells that make blood vessels might form a lump.

And, there are some observations that some of the molecules that signal the body to make blood vessels might, for some reason, accumulate in certain spots after birth.

Types of Strawberries
Whatever the mechanism of cause, strawberries come in three varieties: small and single, multiple, or one very large.  By far, the small and single type is the most common.

Treatment for Strawberry Hemangiomas
The small and single types tend to go away entirely on their own, but can take 4-6 years to do so.
Since the end result of these small, single strawberries can be full clearing, any intervention has to be measured against that result: painless, complete disappearance, with no side effects and no scars at all.

For the larger strawberries, or those located in very sensitive areas, like the airway, there are treatments that can work.

Treatments tend to fall into one of four categories:
1.  Topical agents applied to the strawberry.  
2.  Lasers.
3.  Oral medications.
4.  Doing nothing.

Topical agents
These are limited to a couple of medications.  
Topical steroids might help in specific circumstances, but overall have little impact.  
Injected steroids are local if not topical, and do work, but this can be very uncomfortable.
Timolol, a drug newly used for strawberries, is related to the propranolol discussed below and may turn out to be effective, but carries some of the same risks as oral propranolol.  


Lasers
Laser therapy is set to burn areas that are just the right color of red, effectively burning the clump of cells that make the strawberry.  But the cells can only be eliminated one zap at a time, clearing just a very small area of redness away.  So for most strawberries, full clearing of the red area may take a large number of treatments.
Each treatment is said to feel like a rubber band snapping onto your skin.  So if a large area needs treatment, that is many zaps and can be quite painful.  Laser therapy has its own side effects- including some breakdown in the strawberry, and even scarring.

Oral medications
A number of drugs have been tried to make the strawberry hemangioma go away, but one seems to be offering real possibilities.  That is propranolol, which has had some dramatic success in making strawberries go away.  However, there are potentially very serious side effects to worry about in young infants, including sharp drops in blood sugar levels, drops in blood pressure, and trouble breathing.  So the use of this drug is currently limited to very large strawberries, and its use requires observation in a hospital.

Doing Nothing
For the vast majority of strawberries, this is by far the best choice.  
After all, if even at its peak size the strawberry is small and not too deep, doing nothing typically gives the best possible result- nothing left over and no scar.
The only problem with doing nothing is that you have to do nothing for several years.
But your child is safeguarded from exposure to potentially harmful medications, painful procedures, and/or long-term scarring.

Bottom Line
  • Infantile hemangiomas, or strawberries, are very common.
  • They are made up of the cells that usually form blood vessels and may come from stray cells from the placenta, or be responding to conditions present during embryonic and fetal development.
  • For most children the strawberries are small.  
  • They grow in size for a few months, then a bit of blue or gray appears in the center.  At that point they stop growing and slowly disappear over several years' time.
  • For the usual small strawberry, doing nothing is an excellent choice of therapy.  It works, it is painless, there is no risk, and you are left with no mark or scar.
  • Unusually, a strawberry hemangioma can be too large or located in too risky an area to leave alone.
  • For these very unusual circumstances, a new drug, propranolol, although risky, may turn out to be very effective in making them shrink or even clear.
Dr. Arthur Lavin






*Disclaimer* The comments contained in this electronic source of information do not constitute and are not designed to imply that they constitute any form of individual medical advice. The information provided is purely for informational purposes only and not relevant to any person's particular medical condition or situation. If you have any medical concerns about yourself or your family please contact your physician immediately. In order to provide our patients the best uninfluenced information that science has to offer,we do not accept samples of drugs, advertising tchotchkes, money, food, or any item from outside vendors.