SciTech Now 506

In this episode of SciTech Now, we look at a camouflaging squid; research magnetic fields in rocks; go inside an operating room for brain surgery; and predict weather patterns and disasters.



Coming up, a camouflaging squid...

The bacteria in the squid have been living together so long that they've both adapted to helping each other and being the perfect life partner.

Physical evidence of Earth's climate swings...

When the lakes literally dried out, the sediments became red, and so you had essentially ancient soils.

Augmented reality and brain surgery...

It's another thing to see your own tumor, to see it in three dimensions, to see it in relation to your own facial features.

Predicting weather patterns and disasters...

With CYGNSS, we're able to measure these wind speeds in places that have never been measured before.

It's all ahead.

Funding for this program is made possible by...


I'm Hari Sreenivasan.

Welcome to 'SciTech Now,' our weekly program bringing you the latest breakthroughs in science, technology, and innovation.

Let's get started.

Bobtail squids, cephalopods closely related to cuttlefish, have a unique way of protecting themselves against predators at night.

With the help of glowing bacteria, the squid camouflages itself against the starlit night sky.

Our partner, 'Science Friday,' has the story.

I think you'd have to be nuts to not think a bobtail squid is cute.

Those big black eyes, I think they're just the cutest things ever, but Hawaiian bobtail squid are basically the couch potatoes of the cephalopod world.

They sit in the sand for much of their lives.

They do occasionally get up to hunt, but they're effectively, like, relaxing on vacation in Hawaii.

♪♪ My name is Sarah McAnulty, and I'm a squid biologist.

I study the symbiosis between Hawaiian bobtail squid and their beneficial bioluminescent bacterial partner.

A bobtail squid is a small squid that lives off the coast of Hawaii.

They're nocturnal, and they're about the size of a lime.

So these squid swim around at night, and it's basically like the squid has a constantly glowing lightbulb situated sort of in the center of the squid's body on the underside, and that lightbulb is all chock-full of bacteria called Vibrio fischeri.

If you're a cephalopod, you're super easy to eat.

You're basically like a swimming protein bar, so you have to be very good at camouflaging.

Typically if the squid wasn't glowing, it would look like a little squid-shaped shadow or silhouette.

With the help of the bacteria, they match the moonlight precisely coming down from above.

That partnership is called 'symbiosis.'

The bacteria gets a place to live, and if you're a squid, from the bacteria, you get light as camouflage.

Now, in terms of how the bacteria get into the light organ, this is a really cool process.

They go through this molecular gauntlet.

The squid is constantly beating cilia in the opposite direction of where the bacteria need to swim, so these bacteria need to be awesome swimmers, and they also need to be able to put up with a lot of, basically, insults from the squid.

You've got nitric oxide.

You've got acid, but this Vibrio fischeri is able to eventually swim actively down pores and ducts into the depths of the light organ, and we call that deep area where the bacteria are trying to get 'the crypt.'

And once they're there, the immune cells also play a role in this specific symbiosis.

♪♪ My work is trying to understand how immune cells, which we call 'hemocytes,' are able to tell the difference between beneficial bacteria and others.

So I've developed a test and a method for watching the behavior of hemocytes.

The way I do this is, I take a squid from downstairs in the squid room, and I anesthetize it using ethanol.

It knocks them out completely, and I do a blood draw.

I only take out, like, a tiny drop of blood from these little squid, and then I'll do what I call squid PR, which is just, like, CPR on a squid.

I'll blow water over their gills using a pipette.

I'll tap them a little bit, and they burst back into life with color.

♪♪ So we take the blood out of the squid, and then we stain the immune cells with a dye, so we have bacteria in one color, and then we have a different type of bacteria in another color, and then we have the immune cells in far red.

We look at these different colors to tell who's who, and then I take time-lapse videos of these different cell types all interacting.
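The multicolor labeling she describes amounts to a simple classification step: each object in a time-lapse frame is identified by its brightest fluorescence channel. Here is a minimal sketch of that idea; the channel names, threshold, and intensity values are invented for illustration and are not the lab's actual analysis pipeline:

```python
# Toy classifier: assign each detected object an identity based on
# which fluorescence channel is brightest (all values hypothetical).

CHANNELS = ["green", "red", "far_red"]          # dye -> channel mapping (assumed)
LABELS = {"green": "Vibrio fischeri",
          "red": "other bacteria",
          "far_red": "hemocyte"}

def classify(intensities, min_signal=50):
    """intensities: dict of channel -> mean pixel intensity for one object."""
    channel = max(CHANNELS, key=lambda c: intensities.get(c, 0))
    if intensities.get(channel, 0) < min_signal:
        return "background"                     # too dim to call
    return LABELS[channel]

# One dict per detected blob in a time-lapse frame:
frame = [{"green": 180, "red": 12, "far_red": 8},
         {"green": 10, "red": 5, "far_red": 200},
         {"green": 3, "red": 4, "far_red": 6}]
print([classify(obj) for obj in frame])
# -> ['Vibrio fischeri', 'hemocyte', 'background']
```

In the real experiment the interesting part is then tracking how the hemocyte-labeled objects interact with each bacterial class over time.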

When the immune cell finds a bad bacterium in this context, it just engulfs it and eats it and kills it, but with Vibrio fischeri, it will bind, maybe carry it around for a little while, and then it just sort of, like, leaves it behind.

So, we think that there's some kind of education process that occurs between the bacteria and the squid at the beginning of the squid's life to sort of teach the immune cells that Vibrio fischeri is the beneficial bacteria.

The immune cells will migrate into the crypts where the bacteria live at night, and they will basically sacrifice themselves to feed the bacteria.

This is totally nuts and super cool because normally you would think of an immune cell as being just, like, a destroyer, not as something that would be feeding bacteria of all things, but this group found that this is what's going on.

The bacteria in the squid have been living together so long that they've both adapted to helping each other and being the perfect life partners.

It may seem bizarre, but it's really important because these squid are giving us a really unique opportunity to understand how animals and bacteria relate and how the colonization by bacteria affects your whole immune system, affects your whole genome and what far-reaching parts of the body are affected by having beneficial bacteria live with you.

♪♪ ♪♪

For years, some scientists have hypothesized about the existence of the Earth's natural climate cycles.

Now some researchers believe there is physical evidence to support this.

Dennis Kent, an expert in paleomagnetism, the study of magnetic fields in rocks, has coauthored a study that analyzes rocks to show how the Earth's climate is dictated by cycles.

He's here today to talk about his study, so tell me.

What'd you find?

Well, the springboard is actually just across the river from where we're speaking, in New Jersey, where, 200 million years ago, way back in geologic time, there was a series of lake beds that accumulated literally miles of sediment, and in the course of this, there's a rhythmicity that's been obvious there and has now been documented in more and more detail. This rhythmicity, or cyclicity, can often be discerned very simply from the colors of the rock: the rocks that are dark colored are those surmised to have been deposited under deep water, where there's very little oxygen, so the carbon content colored the rock, and then at other times, when the lakes literally dried out, the sediments became red, and so you had essentially ancient soils.

And so this is the alternation we interpret as climate, essentially precipitation in this case, and many, many of these cycles have been recorded, but one of the difficulties was that there's very little to date in the sediments themselves, few ways of independently determining the periodicity, or the timing, between these rhythms.

So, when you are talking about these lake beds, I'm just imagining in my head here.

If I could kind of push the lake bed back, you'd have a giant cross section, and you'd be digging straight down, kind of like ice-core samples, and you could say, 'Oh, a rock from here versus a rock from there versus a rock from here,' right?

Is that what you...

That's right, yeah.

It's almost like a cake, a layer cake.

That's exactly right, and these are ancient, so they've... The lakes are long gone.

There's been tectonic activity, or the land has been uplifted and incised, but to make it convenient, we've taken cores through these thousands of feet of sediment, and those cores record the cycles in a very accurate and precise way.

So, if you see this, you're able to tell what, that there were periods of incredible rain and dry spells or...

That's pretty much it.

It's how rainy it is: there are times when the lake fills up, and then times when it rains less, and in the drier conditions, the lake literally dries out, quite analogous actually to the East African lake systems.

Many of those have gone in a more recent geologic past through these cycles of drying out and then being full.

So you're talking about time scales that are hard for us to wrap our heads around, hundreds of millions of years.

This is well before humans ever walked the Earth, right?

Well before.

Right, so what else can you start to extrapolate from this?

In this case, the rhythmicity is essentially a signal.

We're looking for a relation to the rest of the solar system.

It's our clock.

It's the periodic motions of the planets going around the sun, and they're influencing Earth's orbit and the way we receive sun over the course of a year and then over the eons, and it's this rhythmicity, that back-and-forth, that we've been particularly interested in because that's a potential clock, and we want to be able to calibrate that clock so that, when we see it in other places, we'll be able to utilize it to document the course of climate change or whatever else might be recorded in those particular sediments at a particular time.
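The idea of treating the rhythmicity as a signal with a recoverable period can be illustrated with a toy spectral analysis: build a synthetic "sediment color" series with a known cycle and recover that cycle with a Fourier transform. This is only a conceptual sketch, with an invented 405-unit period and noise level; the study's actual methods involve cycle counting in cores and independent dating, not this script:

```python
import numpy as np

# Synthetic depth series: darkness of sediment varying with a
# hypothetical 405-unit climate cycle plus random noise.
rng = np.random.default_rng(0)
depth = np.arange(0, 4050, 1.0)                # arbitrary depth units
cycle = 405.0                                  # imposed period
signal = np.sin(2 * np.pi * depth / cycle) + 0.3 * rng.standard_normal(depth.size)

# Recover the dominant period from the power spectrum.
freqs = np.fft.rfftfreq(depth.size, d=1.0)     # cycles per depth unit
power = np.abs(np.fft.rfft(signal)) ** 2
dominant = freqs[np.argmax(power[1:]) + 1]     # skip the zero-frequency bin
print(f"recovered period ≈ {1 / dominant:.0f} depth units")
```

The same logic, applied to real core data, is what lets a periodic astronomical influence show up as a sharp peak even when the raw record looks noisy.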

I'm sure that you and other researchers have been conscious of the climate science and the climate-science denial that's been happening.

What if someone looks at your work and says, 'You know what?

Here is proof that humans aren't causing climate change because look at these cycles that he's proving in the rock.'

Well, I don't think there's much doubt that there's been climate change in the past, and since it was before humans, then it's natural, if you will.


And these changes have been very large, so, for example, most recently there was the change from a glacial interval: where we're sitting now in Manhattan had quite a bit of ice that withdrew some 10,000 years ago.

So, there have been very large climate changes, but that's not to say that we don't have an influence on going forward.

All right.

Dennis Kent from Columbia University Lamont-Doherty Earth Observatory, thanks so much for joining us.

You're quite welcome.

A new technological advancement in brain surgery comes to Mount Sinai Hospital in New York City.

Reporter Maddie Orton takes us into the operating room to see how one surgeon uses 3-D imaging and augmented reality.

A warning -- This story contains graphic images of the surgery process.

Lisa Galioto of Long Island, New York, was experiencing pain in her neck when she got an MRI in March 2018.

The test revealed an unrelated and shocking discovery.

Galioto had five benign tumors in her brain, one of which was so large it had reached the size of an orange.

Making matters worse, the tumor had arteries wrapped around it.

Removing a brain tumor is a challenging procedure, but Galioto's doctor, Joshua Bederson, at the Mount Sinai Hospital in New York City, has high-tech tools to help navigate these tricky surgeries.

He's one of the first brain surgeons in the country to use 3-D images of the brain integrated with augmented and virtual reality.

Augmented and virtual reality help us manage a situation like this in many ways, from the initial patient consultation all the way through the planning of the case and generally navigating toward safe corridors of surgery.

Here is how it works.

2-D MRI and CT scans, referred to as DICOM images, are digitally fused together to create a 3-D image of the patient's brain.

Software is then used to paint outlines of the tumors, arteries and veins different colors on that 3-D image.

This creates a comprehensive map of targets and no-fly zones for the procedure ahead.
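The pipeline just described, stacking 2-D slices into a volume and painting labeled structures onto it, can be sketched in a few lines of NumPy. This is only an illustration of the data structures involved (a voxel volume plus a parallel color-coded label mask), not the hospital's actual planning software, and the arrays here are fabricated stand-ins for real DICOM pixel data:

```python
import numpy as np

# Stack of 2-D slices -> 3-D volume. We fake 4 slices of 8x8 pixels;
# a real pipeline would read pixel arrays from DICOM files and sort
# them by slice position before stacking.
slices = [np.random.rand(8, 8) for _ in range(4)]
volume = np.stack(slices, axis=0)               # shape (4, 8, 8)

# "Painting" structures = a parallel label volume: 0 background,
# 1 tumor, 2 artery, 3 vein (label numbers are arbitrary).
labels = np.zeros(volume.shape, dtype=np.uint8)
labels[1:3, 2:5, 2:5] = 1                       # painted tumor region
labels[:, 0, :] = 2                             # painted artery

# Each label maps to a display color for the 3-D overlay.
COLORMAP = {1: (255, 0, 0), 2: (0, 255, 0), 3: (0, 0, 255)}

tumor_voxels = int((labels == 1).sum())
print(volume.shape, tumor_voxels)
```

Keeping the labels in a separate mask is what makes it easy to render the tumor, artery, and vein outlines as an overlay on top of the live surgical view.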

Holly Oemke researches new technology in the surgery process.

She helps Dr. Bederson implement and optimize these tools.

We have to tell the computer what pieces of the anatomy we're interested in.

So, we're using a tool called Smart Brush to actually paint on the DICOM images the tumor as a whole.

All those flow voids in there...


...those just have to go.

We've outlined the sinus, which is a vein, a major vein in the brain, and then the arteries that are either feeding the tumor or are close to the tumor that have potential to cause damage if we were to disrupt them in some way.

Inferior sagittal sinus.

For Lisa Galioto's surgery, this color-coded 3-D map of her brain helps Dr. Bederson and his team better develop their plan.

By preoperatively planning this case, we're able to visualize ahead of time what the tumor looks like before even getting into the operating room.

The patient, Galioto, is also able to visualize what the tumors look like and how Dr. Bederson plans to address them, a big benefit for laypeople about to undergo a major surgery.

It's one thing to be told by a surgeon, 'You have a tumor in your right frontal lobe, and we have to make an incision over the top of your head.'

It's another thing to see your own tumor, to see it in three dimensions, to see it in relation to your own facial features and to understand exactly why we would be making an incision here and where we would be doing the opening.

Although this is still an invasive surgery, it seemed a little less invasive to me that he kind of knows where he's going.

Galioto gets rolled into the operating room.

It's in here that the technology perhaps comes most in handy.

The 3-D color-coded lines that were created to outline Galioto's tumors, veins and arteries are available to Dr. Bederson in the operating room through augmented reality.

He can view his patient's anatomy with the naked eye and then look through a microscope that provides an overlay of the digitally drawn outlines so he can see where the arteries, veins and tumors lie before he even reaches them.

This is referred to as a heads-up display.

The name originates from pilots using the technology to view information while looking ahead in flight rather than looking down to check their instruments.

Dr. Bederson says the same concept applies here.

Prior to heads-up-display capability, a surgeon's methods would've been very different.

You would've used a map.

You would have looked at the MRI scan, CT scan, gone back to the patient, looked at the patient, internalizing what you've seen on the MRI scan and trying to project that onto the patient in an accurate way.

More recently, Dr. Bederson relied on a GPS-like navigation probe.

The tip of what looks like a pen touches the patient's anatomy.

It's synced with a map of the brain to tell the surgeon where he is in real time.

This is what most surgeons currently use.

Dr. Bederson also incorporates this navigation probe into his process, but he doesn't have to rely on it solely anymore.

That's a big advance and very, very helpful, but it still requires that you stop what you're doing, look at the map and then go back to what you were doing.

In the analogy of flying a plane, you have to stop flying the plane to look at your information and then start flying the plane again.

That's no longer an issue.

Dr. Bederson says heads-up-display technology for surgery has made his work faster and safer.

We use the heads-up-display projection of the virtual-reality reconstruction onto the scalp while we're planning the skin incision so that we can position the opening precisely and get the maximum exposure for the minimum opening.

Likewise, after the skin incision, we'll be able to position our craniotomy, which is the bone opening, right over the tumor by projecting the tumor onto the surface of the skull and sort of seeing through the skull to the tumor to more precisely outline our opening.

The heads-up display helps the rest of the team in the OR see what Dr. Bederson sees, as well, because the microscope he uses also functions as an exoscope, providing a magnified video feed to a screen complete with the overlay of color-coded outlines, viewed in 2-D or in 3-D with special glasses.

As the operation continues, the team can see the clear, solid outlines of the patient's tumors, veins and arteries despite blood flowing from the tumor, and they can anticipate where these structures will be as they move further into the brain thanks to dotted outlines that provide depth perception.

Six hours later, the surgery is a success.

Can you stand on one foot?


Then the other foot.


Today is my follow-up visit with Dr. Bederson.

I had brain surgery 2 weeks ago, and I'm feeling awesome.

And you never had any seizures, so I think that driving is okay for you now.


So that's good.

Get my independence back.

I think the technology made a big difference.

It helped us identify the important vascular structures that had been displaced and enveloped by the tumor and were at considerable risk when we reached the deep parts of the tumor removal.

So, knowing exactly where those were, being able to navigate to them and then to stay away from them once we were nearby was key in preventing a stroke.

Dr. Bederson is already excited about new technological features under development with several of the hospital's partners.

What if we could also provide haptic feedback to my surgical instruments?

We could provide auditory information.

We could assign different tissues different sounds, for example.

And with each technological advance, the hope is for more and more outcomes like Lisa Galioto's.

She's as normal as can be.

That's as good as you... Can't make someone better than normal, I don't think.


Deep in space, eight satellites orbit the Earth as part of NASA's Cyclone Global Navigation Satellite System.

NASA hopes the data from these satellites will help predict weather patterns and potentially save lives.

Here is the story.

CYGNSS, the Cyclone Global Navigation Satellite System, is a constellation of eight microsatellite-class spacecraft. They're 29 kilograms each, about the size of a lunch tray.

You could fit them on the desk.

We launched all eight of them on a single launch vehicle.

They're in a 35-degree inclination, about 550-kilometer-altitude orbit.

That means that they are flying over the tropics mostly, between plus and minus 35 degrees latitude.
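The orbit numbers quoted, roughly 550 kilometers of altitude at 35 degrees of inclination, fix how fast each spacecraft circles the Earth and hence how quickly the constellation revisits the tropics. A back-of-the-envelope check using Kepler's third law with standard textbook constants (this is a sketch, not mission data):

```python
import math

MU = 398_600.4418          # Earth's gravitational parameter, km^3/s^2
R_EARTH = 6_371.0          # mean Earth radius, km

altitude_km = 550.0        # CYGNSS altitude quoted in the segment
a = R_EARTH + altitude_km  # semi-major axis of a circular orbit

# Kepler's third law: T = 2*pi*sqrt(a^3 / mu)
period_s = 2 * math.pi * math.sqrt(a**3 / MU)
period_min = period_s / 60
orbits_per_day = 24 * 60 / period_min

print(f"period ≈ {period_min:.1f} min, ≈ {orbits_per_day:.1f} orbits/day")
```

At roughly 95 minutes per orbit, each spacecraft laps the Earth about 15 times a day, which is why eight of them can blanket the band between plus and minus 35 degrees latitude with ground tracks.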

There's a single instrument, single science instrument, on the spacecraft.

It's a GPS receiver, very sensitive.

We have antennas on both the top and the bottom of the spacecraft.

We're able to measure the directly received signal from the GPS satellites, as well as the GPS signal that reflects off of the ocean's surface, and by comparing those signals, we're able to measure the ocean-surface roughness, and from the ocean-surface roughness, we're able to infer the surface wind speed.

We've made a lot of advancements compared to previous platforms to do similar research.

We can revisit the same place on the Earth quite quickly.

We've got great coverage over the tropics, and we're able to see through dense precipitation in the core of a hurricane and make wind-speed retrievals in places where other aircraft-based instruments and other satellites that have similar instrumentation weren't able to do this in the past.

It's not a direct measurement of the wind speed, but it's a measure of the ocean-surface roughness.

So, if you can imagine a very still pond or a lake, very glassy surface, and you can, say, see the moon reflecting in that surface, you're going to get a very sharp reflection.

You're able to see the moon very easily, you know, specular reflection on the surface.

If it's windy and there's a little bit of a roughness on the surface of the ocean, you'll see that reflection of the moon is kind of scattered out, and you don't have that clear reflection of the moon.

The light is scattering off the surface, and a similar thing happens with the GPS signal when it hits the ocean: the more that signal is scattered, and the less of a specular, sharp reflection there is, the higher the wind speed at the surface that causes that roughness. So there are wind-retrieval algorithms that the scientists have developed to correlate surface roughness with actual wind speeds, anywhere in the range from a few meters per second all the way up to 70 meters per second, which is a category-5 hurricane, 150-mile-per-hour-plus winds.
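The retrieval idea, a monotonic mapping from a measured roughness quantity to a wind speed, can be illustrated with a toy lookup table. The calibration points below are invented purely for illustration; the real CYGNSS retrievals use empirically fitted geophysical model functions that are far more involved than a piecewise-linear table:

```python
import bisect

# Hypothetical calibration: roughness metric (unitless) -> wind speed (m/s).
# Monotonic by construction; values are made up for this sketch.
ROUGHNESS = [0.00, 0.10, 0.25, 0.45, 0.70, 1.00]
WIND_MS =   [0.0,  3.0,  8.0,  18.0, 40.0, 70.0]   # up to category-5 winds

def wind_from_roughness(r):
    """Piecewise-linear interpolation of the calibration table."""
    r = min(max(r, ROUGHNESS[0]), ROUGHNESS[-1])   # clamp to table range
    i = bisect.bisect_right(ROUGHNESS, r) - 1
    if i >= len(ROUGHNESS) - 1:
        return WIND_MS[-1]
    frac = (r - ROUGHNESS[i]) / (ROUGHNESS[i + 1] - ROUGHNESS[i])
    return WIND_MS[i] + frac * (WIND_MS[i + 1] - WIND_MS[i])

print(wind_from_roughness(0.05))   # calm-ish sea
print(wind_from_roughness(0.85))   # hurricane-force winds
```

The essential property is the same as in the real algorithms: a rougher reflecting surface maps to a stronger inferred wind, and the mapping must be calibrated against independently measured winds.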

Really, what motivated the whole mission is that, in the past, oh, at least 30 years, since about 1990, there's been a great improvement in forecasting the track of a storm, where it's going to make landfall, several days out.

So, we've gotten a lot better at that, about 50% improvement in being able to forecast the track of a storm.

Where we haven't made any improvement is being able to forecast how strong the storm is going to be when it makes landfall and how fast it's going to intensify.

So, we fly hurricane-hunter aircraft through the storm to make measurements, but you can only do that so often and really when the storm gets closer to land, and with CYGNSS, we're able to make similar measurements 24/7 from multiple spacecraft and measure these wind speeds in places that have never been measured before in the hurricane with a hope that we will improve the models that scientists use to forecast rapid intensification of these storms.

So, ultimately, what it's going to mean to, you know, people on the ground that live in coastal areas is better indication of, is this storm going to be strong when it makes landfall?

Do you need to evacuate early?

Is it more likely that the storm is going to degrade and not be so intense such that evacuation isn't necessary?

Ultimately, you know, saving lives and, you know, advanced warning for when strong storms are going to hit is what CYGNSS is going to provide in the future.

So this is a tool that we can use to visualize where our satellites are located right now in orbit, and so if I switch to this view, you can see, as they're going around the orbit, they're not all perfectly evenly spaced right now.

One of our goals is to space them more evenly, but as long as they're about 5 to 10 degrees apart in the plane of the orbit, we can get unique measurements from all the spacecraft.

So, this is a very useful tool.

At any point in time, I can quickly jump to see, you know, where over the Earth are the spacecraft transiting.

It gives you a world map, or maybe I prefer to look at the ground tracks of the spacecraft as they fly over.

So, here you can kind of see an example: if there was a hurricane in the Caribbean here, you know, our spacecraft are flying over on these ground tracks.

One after another, our spacecraft fly over the storm.

So, you can imagine as over the course of the day, all these ground tracks, they fill in and just cover the entire tropics.

So, we're able to make measurements over the entire region between plus and minus 35 degrees latitude.

I think really what we've been trying to do all along is design and operate a mission that's going to save lives by allowing people an opportunity to evacuate coastal areas, you know, well in advance of the storm.

You know, we can't stop the brute force of a hurricane itself, but, you know, just like, you know, a tornado warning gives someone the opportunity to evacuate the area, you know, these hurricanes that span hundreds of miles and cause all the devastation that they do, people really need as good of an indication of whether to evacuate or not, and CYGNSS will be able to provide the improvement to the models that allow this to happen.

And that wraps it up for this time.

For more on science, technology, and innovation, visit our website, check us out on Facebook and Instagram, and join the conversation on Twitter.

You can also subscribe to our YouTube channel.

Until next time, I'm Hari Sreenivasan.

Thanks for watching.

Funding for this program is made possible by... ♪♪