Speaking Warmly of Saturn

Okay, this just proves I’m a whore for astronomy images. I’d already started writing something about PET scans in relation to today’s story about the long-term effects of chemotherapy on brain function, but then I saw the press release for the above image.

In spite of the mediocre resolution (mostly because it’s a composite of data from an infrared mapping spectrometer, which operates in a scanning, single-pixel-at-a-time mode), the image rather intuitively communicates the idea of seeing “through” Saturn’s clouds.

As described in the caption from JPL, we’re looking at a near-infrared image of Saturn, in which the shorter-wavelength light (shown as blue-to-green) is reflected off the cloudtops, whereas the longer-wavelength emission (colored red) from Saturn’s warm interior shines through in the shadowed regions and, less obviously, in the daylit regions. Because most people (stellar astronomers excluded) think of red as warm and blue as cool, this image capitalizes on people’s natural sensibilities. Always a good thing.

This strikes me as a good image to talk about some of the confusing aspects of infrared light, which well-informed people typically perceive simply as “heat,” because that’s what they’ve been told. Of course, the problem with blackbody radiation is that there’s a big contrast issue: yeah, the lower layers of Saturn’s atmosphere may be warm, but their infrared glow gets blocked by clouds in the upper atmosphere, plus it has to compete with the bright, reflected glare of the Sun. The image above allows one to talk about those contrast issues while clearly conveying that infrared light allows us to see things we can’t in visible light. Also, the rings cut across the center of the image as a blue line, indicating that they reflect the short-wavelength infrared light but don’t emit much thermally—pretty much as one would expect from chilly rings made mostly of ice.
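
If you’re curious how such a false-color composite gets assembled, here’s a minimal sketch in Python. The band assignments and the random toy data are assumptions of mine, not the actual Cassini processing pipeline; the point is simply the convention of mapping thermal emission to red and reflected sunlight to blue and green.

```python
import numpy as np
import matplotlib.pyplot as plt

# Toy stand-ins for three calibrated near-infrared band images (shapes and values
# are invented; real spectrometer cubes would come from the mission archive).
ny, nx = 256, 256
short_band = np.random.rand(ny, nx)   # shorter wavelength: sunlight reflected off cloud tops
mid_band   = np.random.rand(ny, nx)   # intermediate wavelength
long_band  = np.random.rand(ny, nx)   # longer wavelength: thermal glow from the warm interior

def stretch(band):
    """Linear contrast stretch to 0..1 so each band fills the display range."""
    lo, hi = np.percentile(band, (2, 98))
    return np.clip((band - lo) / (hi - lo), 0, 1)

# Thermal band goes to red, reflected bands to green and blue, echoing the
# warm-red / cool-blue intuition the Cassini image relies on.
rgb = np.dstack([stretch(long_band), stretch(mid_band), stretch(short_band)])

plt.imshow(rgb)
plt.title("False color: red = thermal emission, blue/green = reflected sunlight")
plt.axis("off")
plt.show()
```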

Stunningly Incomprehensible

Words fail me on this one (well, not really). How incomprehensible is it? Let me count the ways.

First off, we’re confronted with the somewhat cryptic information that we’re seeing the “cosmos” at a distance of 450 million light years. Mmmm hmmm. Then we have a color bar at the bottom, sans units, that tells us that the values range from –1.0 something to 4.3 something. Other labels appear scattered pell-mell across the image… “Fur-For”? “Per-Peg”? What is an Average Jane to make of these? (They’re short for “Further Fornax” and “Perseus-Pegasus,” BTW. I’ll even make the obnoxious grammarian observation that it should be “Farther Fornax” if anything.) And to add insult to injury, the red typeface blends right into the color scheme of the plot!

What saddens me most about this image is that it represents a spectacular result poorly communicated. What you’re looking at, FYI, is a color-coded representation of matter density within a thin sliver of the Universe at an approximate distance of (you guessed it) 450 million light years. The press release correctly identifies this as “the largest full-sky, three-dimensional survey of galaxies ever conducted” (based on 2MASS data, just so you know). How kewl is that? You wouldn’t know from this picture.

(If I were petty, I would note that the headings on the images don’t even match the captions provided in the press release from the Royal Astronomical Society, at least as it was emailed to me. For example, the caption for the image above reads, “The reconstructed density field, evaluated on a thin shell… at 45 million light years. The main overdensity, Shapley, is shown in red and green. Other overdensities are Rixos F231\526 (RIX), SC44, C19, Pisces (Pis), Perseus-Pegasus (Per-Peg), C25, C26, Hercules (Her), Abell S0757, C28, C29, C30, SC 43, Leo, Abell 3376, C27 and C21. The voids (blue) are V4, Further-Fornax (Fur-For) and V15.” Emphasis mine. But I’m not petty. Oh, and the picture is correctly labelled; the caption is wrong.)

The academic paper presents the same images as black-and-white contour plots, so I can only assume that the researchers believed themselves to be translating their results into a user-friendly format simply by adding garish color.

Allow me to digress for a moment to explain why I see this as such an egregious mistake. My two-bit definition of science visualization goes something like this… We make images from data (i.e., ones and zeros) for essentially one of three purposes: 1) communication to oneself in the form of data analysis, 2) communication to peers, often in the form of a graph or contour plot, and 3) communication to a general audience. The first two purposes depend heavily on a well-developed visual language—for example, knowing what variables you’re plotting against one another, knowing what color-coding means, etc.—that often tends to be very specialized. Those of us who have been reading Cartesian coordinates for most of our lives forget what it’s like not to be exposed to that visual vocabulary on a day-to-day basis. Needless to say, scientists tend to communicate via an extremely sophisticated visual language (that furthermore varies from discipline to discipline). The biggest problem occurs when scientists try to make the leap to the third form of visualization—communicating to the general public. It’s difficult to translate their complex visual language into a visual vernacular. (More details on my “personal paradigm” are available as part of my “What Is Viz?” PowerPoint, but be warned that you need to read the comments for each slide; otherwise, it’s just a bunch of pictures.)

Thus, the fundamental issue I see here can be likened to a mistranslation. The image above uses a very specific vocabulary (e.g., false color, Aitoff projection of a sphere onto a plane) to describe a small part of the Universe. Without attempting to translate the truly challenging aspects of the image, the presentation of candy-colored data sans specific quantifying information in fact results in a terribly confusing message.
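
To make the complaint concrete, here’s a rough sketch of what a friendlier version might look like, using a completely synthetic density field. The projection is the same Aitoff-style all-sky view; the units, labels, and annotated feature are placeholders of my own invention, not the survey’s actual values.

```python
import numpy as np
import matplotlib.pyplot as plt

# Synthetic, made-up "density contrast" field standing in for the real survey data.
lon = np.linspace(-np.pi, np.pi, 181)         # longitude in radians
lat = np.linspace(-np.pi / 2, np.pi / 2, 91)  # latitude in radians
LON, LAT = np.meshgrid(lon, lat)
density = np.sin(3 * LON) * np.cos(2 * LAT)   # fake overdensity/underdensity pattern

fig = plt.figure(figsize=(9, 5))
ax = fig.add_subplot(111, projection="aitoff")  # same all-sky projection style as the paper
mesh = ax.pcolormesh(LON, LAT, density, cmap="coolwarm", shading="auto")
ax.grid(True)

# The two fixes that matter: a colorbar that says what the numbers mean, and
# labels in a color that doesn't vanish into the map.
cbar = fig.colorbar(mesh, ax=ax, orientation="horizontal", pad=0.08)
cbar.set_label("Galaxy density contrast (relative to the mean)")
ax.annotate("Shapley (overdensity)", xy=(0.5, 0.3), color="black",
            bbox=dict(facecolor="white", alpha=0.8))
ax.set_title("All-sky density map at ~450 million light years", pad=20)
plt.show()
```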

Oddly enough, we have played with an earlier version of the 2MASS data as part of the Hayden Planetarium Digital Universe, so I’ve seen (in 3-D) the galaxy data upon which this is based. The data certainly lend themselves to a 3-D representation, and I can imagine a fly-through in which the matter distribution is represented by 3-D isosurfaces or, if a lot of rendering time were at one’s disposal, by volumetric rendering (similar to the manner in which some dark matter simulations are depicted).
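
For the record, here’s the kind of isosurface extraction I have in mind, sketched with a made-up density cube and an arbitrary threshold (marching cubes via scikit-image); none of this is the actual 2MASS processing.

```python
import numpy as np
from skimage import measure
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d.art3d import Poly3DCollection

# Fake 3-D galaxy-density cube: two Gaussian blobs standing in for superclusters.
x, y, z = np.mgrid[-2:2:40j, -2:2:40j, -2:2:40j]
density = np.exp(-((x - 0.7)**2 + y**2 + z**2) * 3) + \
          np.exp(-((x + 0.7)**2 + (y - 0.5)**2 + z**2) * 3)

# Pull out the surface where the density crosses an (arbitrary, assumed) threshold.
verts, faces, _, _ = measure.marching_cubes(density, level=0.5)

fig = plt.figure()
ax = fig.add_subplot(111, projection="3d")
ax.add_collection3d(Poly3DCollection(verts[faces], alpha=0.5))
ax.set_xlim(0, density.shape[0])
ax.set_ylim(0, density.shape[1])
ax.set_zlim(0, density.shape[2])
ax.set_title("Isosurface of a synthetic density field")
plt.show()
```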

Holy Ozone, Batman!

(N.B. that the animated GIF above is rather large, about four megabytes, so it could take a little while to load in your browser. If you prefer, a smaller version is also available online.)

Not a new story, really, but an interesting choice of animation to illustrate it. New data from the SCIAMACHY instrument onboard the European Space Agency (ESA) satellite Envisat show a record-breaking ozone hole—after a few years of improvement, the hole seems to be deepening once again.

When I first glanced at the above image sequence, I reacted with a basic, nonplussed “hmmm.” The data in the middle look screwed up, and I’ve never been a big fan of false color, so I scanned down and started reading the article on the ESA website. When I glanced back at the animated GIF, it was much more interesting! How embarrassing to discover that I’m as much a victim of our culture’s impatience effect as the museum- and planetarium-goers I’m trying to entice into taking lengthier looks at things.

Basically, the above animation doesn’t hit its stride until about halfway through (mid-September), at which point the change is quite striking (and depressing): a big, black hole develops over Antarctica, swallowing up the continent’s outline like a killer blob from a 1950s sci-fi film. And like the slower-paced films of that era, you have to wait a little while for the punchline.

The only other issue I have with the image is that it doesn’t quite underscore the story, namely that the ozone hole is “deeper” than at any time in the last eight years. To communicate that, the page of figures on the ESA website relies on a bunch of wiggly graphs. All well and good, but how ’bout a side-by-side comparison of two years, even as a still image? Or perhaps a viewer that would allow you to select two years to show side-by-side, clicking through dates in lock step? That would be an impressive and intuitive interface.
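
Something like this, say, with two fabricated ozone fields standing in for real SCIAMACHY data (the years, values, and Dobson-unit scale here are purely illustrative):

```python
import numpy as np
import matplotlib.pyplot as plt

# Fabricated stand-ins for two years of ozone maps; real fields would come from ESA.
ny, nx = 180, 360
lat = np.linspace(-90, 90, ny)[:, None]
ozone_a = 300 - 150 * np.exp(-((lat + 80) / 15)**2) + np.random.normal(0, 5, (ny, nx))
ozone_b = 300 - 220 * np.exp(-((lat + 80) / 15)**2) + np.random.normal(0, 5, (ny, nx))

fig, axes = plt.subplots(1, 2, figsize=(10, 4), sharex=True, sharey=True)
for ax, data, year in zip(axes, (ozone_a, ozone_b), ("1998", "2006")):
    im = ax.imshow(data, origin="lower", extent=[-180, 180, -90, 90],
                   vmin=80, vmax=350, cmap="viridis")
    ax.set_title(f"Total ozone, late September {year}")
fig.colorbar(im, ax=axes, label="Total column ozone (Dobson units)")
plt.show()
```

The one non-negotiable design choice is pinning both panels to the same color scale, so the difference in depth between the two years reads honestly.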

Environmental data demand good visualization, for personal impact and political import both. Sadly, we have no superheroes to save us from the ozone problem, and with humanity’s track record for addressing long-term problems, we need all the help we can get.

Mapmaking

A slow Monday. So I’ll highlight a reference I came across while reading the book Weighing the World, by Edwin Danson. (Danson’s book describes the processes of surveying in illuminating but excruciating detail; what struck me as most interesting was both the variety of individuals involved and the dramatic sweep of the effort, risking life and limb to determine once and for all that, in fact, Earth is not perfectly spherical, for example.)

Anyway, the reference is Antique Maps, by Carl Moreland, but you can read it online at its very own website. Now, the goal of this tome is to introduce prospective buyers to the essentials of map- and print-making, but the information it passes along makes it well worth the occasional digression. For example, Chapter Two, “The Printing of Old Maps,” gives a succinct and worthwhile survey of the techniques that underpinned diagram and map printing from the 16th to the 19th centuries.

This historical stuff fascinates me, as my previous post about dodo lithographs may suggest. Printing made maps worthwhile in a way that hand-drawing them never did: a printed map could be amended and improved upon, and it could effectively incorporate input from myriad voyagers, surveyors, and sailors. In my opinion, that conceptual transformation has few parallels in the history of human thought.

More on that in posts to come…

Trying in Vein

Yes, I live in New York, but no, I did not attend NextFest while it was here. I nabbed the above image from their website, though!

It shows a Luminetx VeinViewer in action. Basically, the gadget takes an infrared image of a part of one’s body then re-projects it onto the corresponding surface of one’s skin. Veins don’t reflect the near infrared light, but the surrounding tissue does, so you get a high-contrast view of the veins. Hence, “VeinViewer.”
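
For the curious, the contrast trick is simple enough to fake in a few lines. This is a toy sketch with a synthetic frame, not Luminetx’s actual processing.

```python
import numpy as np
import matplotlib.pyplot as plt

# Fake near-infrared frame: bright tissue with a darker, vein-like squiggle through it.
# (Purely illustrative; the real device captures this with an NIR camera.)
ny, nx = 200, 300
frame = np.full((ny, nx), 0.8) + np.random.normal(0, 0.02, (ny, nx))
xs = np.arange(nx)
vein_y = (100 + 30 * np.sin(xs / 40)).astype(int)
for x, y in zip(xs, vein_y):
    frame[y - 3:y + 3, x] = 0.45   # veins absorb the NIR light, so they show up darker

# Contrast-stretch and invert so the veins come out bright for projection onto the skin.
lo, hi = np.percentile(frame, (1, 99))
stretched = np.clip((frame - lo) / (hi - lo), 0, 1)
vein_map = 1.0 - stretched

fig, axes = plt.subplots(1, 2, figsize=(9, 3))
axes[0].imshow(frame, cmap="gray");    axes[0].set_title("Simulated NIR frame")
axes[1].imshow(vein_map, cmap="gray"); axes[1].set_title("Inverted, contrast-stretched map")
for ax in axes:
    ax.axis("off")
plt.show()
```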

What I find amazing about the above image is that it communicates that concept fairly clearly—okay, not the near-infrared part, but the general function of the device. You see the image above, and you say to yourself, “Wow, I can see his veins!”

Now, that may not seem like such an accomplishment, but take a look at the images from the Luminetx gallery. Actually, I’ll save you the trouble and link one of the images below:

I mean, okay, the kid looks happy, but what’s that on his arm? It looks like he got a weird tattoo! (Or was the victim of some poor Photoshop work.) Having seen the other image, I understand what’s going on, but the image doesn’t read as clearly as the one from the NextFest site. Offhand, one might think that the real-world image from Luminetx would communicate the concept better than the somewhat stylized image of just the backlit hand. But the composition, lighting, and sheer simplicity combine to distill the concept in an impressively lucid (and certainly more aesthetically pleasing) manner.

High-Resolution Abstract Art

Having blogged remote sensing on Earth in my last post, I guess I should make the leap to remote sensing on Mars…

So, what annoys me about the way the press release describes the above image is that, although the story is “the highest-resolution camera ever to orbit Mars,” the reader is given no sense of scale. You have to go to the caption on the mission webpage to find out that the resolution is about a foot per pixel. If nothing else, the press release could indicate the size of something in the image—for example, the small crater in the lower right-hand corner is about 15 feet across (I checked).

To be honest, I’d like to see a graphic scale on the image, marking out a distance of, say, 100 feet. But that ruins the aesthetic that has started to develop around NASA images. Images from the rovers have their own appeal, but the Mars orbiters send back images that function more as abstract art (take, for example, the Mars Express image of Aureum Chaos). A little scale in the corner of the image would ruin that. I have in mind Elizabeth Kessler’s thesis about Hubble images and their relationship to American culture as I write this.
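
For what it’s worth, adding such a scale takes almost no effort. Here’s a sketch that assumes a resolution of one foot per pixel and draws a 100-foot bar in the corner; the image itself is just noise standing in for a real frame.

```python
import numpy as np
import matplotlib.pyplot as plt

# Stand-in for an orbiter frame at an assumed resolution of one foot per pixel.
feet_per_pixel = 1.0
image = np.random.rand(600, 800)

fig, ax = plt.subplots()
ax.imshow(image, cmap="gray")
ax.axis("off")

# Draw a 100-foot scale bar: 100 ft / (1 ft per pixel) = 100 pixels long.
bar_pixels = 100 / feet_per_pixel
x0, y0 = 30, 560
ax.plot([x0, x0 + bar_pixels], [y0, y0], color="white", linewidth=3)
ax.text(x0 + bar_pixels / 2, y0 - 12, "100 ft", color="white", ha="center")
plt.show()
```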

Mud Volcano

Okay, I’m an American, so no one would expect me to remain current in international events, but what’s with this “mud volcano” in Java? The images above come from the Centre for Remote Imaging, Sensing and Processing, which offers a web page with a spiffy (I must admit) viewer to examine the extent of the damage.

To me, the above images represent the power of remote sensing and our new-found ability to contextualize terrestrial events (cf. the extensive use of such imagery in Al Gore’s impressively illustrated An Inconvenient Truth).

Just bizarre. Listening to a bunch of talks at the recent Pale Blue Dot meeting convinced me that I really don’t understand geology, and this just confirms my suspicion…

Uranus in Season

Hubble just announced the discovery of a dark cloud in the atmosphere of Uranus, but as I looked at the press-release image, I realized that I was seeing Uranus from a slightly different angle compared to previous images. So I created the composite image above, using the various Hubble pictures of Uranus from 1994 through 2006. The Hubble crew did much the same thing with Saturn a few years ago, so it’s not an original idea. But it’s kinda kewl to see Uranus going through 14% of its orbit.
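
(The back-of-the-envelope arithmetic, for anyone checking my work; 84 years is Uranus’s approximate orbital period.)

```python
# Back-of-the-envelope check on the "14% of its orbit" claim.
orbital_period_years = 84.0          # Uranus takes roughly 84 Earth years to circle the Sun
span_years = 2006 - 1994             # the Hubble images cover a 12-year span
fraction = span_years / orbital_period_years
print(f"{fraction:.0%} of an orbit")  # -> 14% of an orbit
```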

Sonar or Later

Several sources are reporting on the discovery of the remains of the U.S.S. Macon, one of the last dirigibles flown by the United States, which foundered and sank near Point Sur in the Pacific Ocean back in 1935. A webpage on the Monterey Bay Aquarium site describes the research project nicely—including an intriguing detective story (involving a girder and a seafood restaurant) and the plans to return to the site this month for further exploration.

It takes a little digging to find the image above, which is probably a good thing, since it’s not the easiest to interpret… It’s a sonar image that shows the track of the scanning device as a vertical white line on the left, with the debris field showing up as a mottled patch on the right of the image. The U.S.G.S. offers a good description of side scan sonar techniques, including a spiffy diagram at the top of the page that should make the above image somewhat more comprehensible.

Images like this one come in handy, however, when trying to describe the challenges of scientific discovery. Part of the detective story that doesn’t show up on the website is how you take miles and miles of scans like the one above and sift through them to determine the location of interesting stuff—remains of an early-20th-century airship, for example. The astronomical community will face just such a challenge in the near future, when the Large Synoptic Survey Telescope (LSST) comes online with several terabytes of imagery collected every day. How does one sift through so much data to find interesting events? And how does one convey the magnitude of that effort to the general public?
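
Just to make the sifting problem concrete, here’s a toy sketch of the simplest possible approach: slide a window along a synthetic scan strip and flag patches whose texture jumps above the background. Real target detection is far more sophisticated, and everything here (the data, the window size, the threshold) is invented.

```python
import numpy as np

# Synthetic sonar strip: mostly smooth seafloor plus one rough, mottled patch (the "debris field").
rng = np.random.default_rng(0)
strip = rng.normal(0, 0.05, (200, 5000))
strip[80:140, 3200:3400] += rng.normal(0, 0.6, (60, 200))  # planted anomaly

# Slide a window along the strip and flag columns whose local variance jumps
# well above the background level, a crude stand-in for real target detection.
window = 100
scores = np.array([strip[:, i:i + window].var()
                   for i in range(0, strip.shape[1] - window, window)])
threshold = scores.mean() + 3 * scores.std()
hits = np.where(scores > threshold)[0] * window
print("Candidate debris fields near columns:", hits)
```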

By the way, you may actually recognize the U.S.S. Macon from the famous picture of a dirigible over New York Harbor. That’s her! With Manhattan stretching out in the distance…

From Mice to Magnetospheres

A coworker just asked me a question about the magnetosphere, and in a quick online check, I came across the above image. I tend toward the nit-picky in matters concerning space weather, in large part because I attended a graduate program in “Space Physics and Astronomy” at Rice University. I see several problems with this image… Most disconcertingly, it makes the bow shock appear like a nearly solid boundary, with the interior and exterior vastly different in their texture and color; in fact, the bow shock simply marks the location where particles slow from supersonic to subsonic speeds (yes, there is such a thing as sound speed in space). Furthermore, the little squiggly arrows presumably indicating particle motion through the magnetosheath look nothing like the actual flow. Finally, Earth’s magnetic field should look more like a dipole, so all the field lines should not converge to a single point at the poles.
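
Since I brought up the dipole, here’s a quick sketch of what the field lines actually look like, using the textbook relation r = L sin²θ for a dipole field line; the particular L-shell values are arbitrary.

```python
import numpy as np
import matplotlib.pyplot as plt

# Textbook dipole field lines: each line satisfies r = L * sin^2(theta), where L is the
# equatorial crossing distance (in Earth radii) and theta is the colatitude.
theta = np.linspace(0.01, np.pi - 0.01, 400)
fig, ax = plt.subplots(figsize=(5, 5))

for L in (2, 3, 4, 6):                 # a handful of field lines at different L-shells
    r = L * np.sin(theta)**2
    x = r * np.sin(theta)              # meridional plane: x toward the equator, z along the poles
    z = r * np.cos(theta)
    ax.plot(x, z, color="steelblue")
    ax.plot(-x, z, color="steelblue")  # mirror image on the other side of the planet

ax.add_patch(plt.Circle((0, 0), 1, color="gray"))  # Earth, one Earth radius in size
ax.set_aspect("equal")
ax.set_xlabel("Earth radii")
ax.set_ylabel("Earth radii")
ax.set_title("Dipole field lines")
plt.show()
```

Note that the lines touch down at different latitudes on the surface rather than pinching to a single polar point.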

For contrast, allow me to reproduce a scene from the most recent space show produced by my institution, Cosmic Collisions:

We used data from the Center for Integrated Space Weather Modeling as the basis for the sequence, so the effect is not quite so diagrammatic. The grey surface represents the bow shock; the bluish surface represents the exterior of the region dominated by Earth’s magnetic field. And note that the flow moves through the bow shock and around the Earth’s magnetic field. You can see more from the show (and more of the data) in a short piece produced for our “Science Bulletins” AstroViz segment.

Finally, speaking of alma maters, a space weather story from Cornell just appeared in my inbox. Solar flares may cause problems with GPS receivers—no great shakes if you’re driving your Lexus down I-10, but more problematic if you’re flying through dense fog on a commuter jet.