SciTech Now Episode 514

In this episode of SciTech Now, discover how researchers are testing drones that can think for themselves; whether the Internet is affecting the creative side of our brains; a 4-D VR experience to the Moon; and a solution to rainwater and its many pollutants.



Coming up, self-thinking drones.

The overall mission goals are supplied by the human, but executing the tasks is done onboard autonomously.

Is the Internet affecting the creative side of our brains?

We won't have the skills to do the critical thinking that used to happen when we used to read long magazine articles.

Moving beyond gravity.

Through our partnership with NASA, we're really looking to inspire the astronauts of tomorrow.

What lies beneath?

Instead of removing the nitrate, they're actually adding nitrogen into the system.

It's all ahead.

Funding for this program is made possible by...


I'm Hari Sreenivasan.

Welcome to our weekly program bringing you the latest breakthroughs in science, technology and innovation.

Let's get started.

Driverless cars are already on the streets in some places.

Now Central New York scientists are moving from the roads to the skies, designing and testing drones that can think for themselves.

Here's the story.

All drones are not equal.

There are, like, photography drones that you can use for cinematography, and someone can easily pilot that one, and you can use it, like, with GPS.

And people are thinking, 'Okay.

We can buy a few of those drones and use them for inspecting, like, my nuclear power plant or inspecting this power line or inspecting this farmland.'

No, that is not the case.

So at Akrobotix, we are developing safe, reliable, autonomous unmanned systems in all domains -- space, air, ground and underwater.

It's purely mathematical.

We provide scientific solutions rather than an engineering fix, and we want to get this trust, so missions are not going to be harmful.

We can trust this mission.

It's going to be really good, and it's going to help humanity.

A lot of people have an idea of what a drone is, and they think of it as this sort of remote-controlled thing.

You're trying to take it to the next step.

What's that next step?

So there is a lot of research ongoing where you have onboard autonomous unmanned aerial systems, which... At which point it's basically a flying robot.

So the overall mission goals are supplied by the human, but most of the details of planning the trajectory, executing the tasks is done onboard autonomously.

And that's the next level; a lot of research is going on in that regard.

Was the drone programmed to fly in a certain way and then you see if it actually flew the way you wanted it to?

Yes, exactly.



So we have... So that drone is just like a black box.

It doesn't know anything, so all the command and control is from this one.


We are trying to make drones smarter and more autonomous, and whatever you see in our lab, actually, is being used for this purpose.

There are eight Vicon cameras, which are motion-capture cameras.

They help us to know the current position and orientation of the drones, and then we can send the required commands, via the control, guidance and navigation codes and algorithms we have, to the drone, and the drone will know, actually, where to go and how to go.
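The loop the researcher describes, where cameras report the drone's position and the computer sends corrective commands, can be sketched as a simple proportional controller. This is purely an illustration; the function names, gain, and update rate here are invented, and the lab's actual guidance and navigation code is certainly more sophisticated.

```python
# Illustrative sketch of the camera-to-drone feedback loop described above.
# All names, gains and rates are hypothetical, not the lab's actual code.

def proportional_step(current_pos, target_pos, gain=0.8):
    """One control update: command a velocity toward the target position."""
    return tuple(gain * (t - c) for c, t in zip(current_pos, target_pos))

pos = (0.0, 0.0, 0.0)     # position reported by the motion-capture cameras
target = (1.0, 2.0, 1.5)  # where we want the drone to go
dt = 0.1                  # seconds between camera updates

# Each cycle: cameras measure, computer commands, drone moves, repeat.
for _ in range(100):
    velocity_cmd = proportional_step(pos, target)
    pos = tuple(p + v * dt for p, v in zip(pos, velocity_cmd))
```

Run long enough, the commanded position converges on the target, which is the "feedback loop" the host summarizes a moment later.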

So to make sure I understand then, the drone has the sensors on it.

The cameras detect where the drone is.

The cameras send the signals to the computer.


The computer sends the signal to the drone, so the drone knows where it is.


And this is all a feedback loop that's happening, and how specific is it?

Sort of, generally, you know the drone is sort of over here or sort of over there?

Down to 1 millimeter?


So we know exactly where the drone is?


Exactly, precisely.

Why is that so important?

Can't you just be a few feet off?

Just thinking big picture and as you look forward, what happens with unmanned drones when everything is successful, say 10 or 15 years from now?

Is the idea of delivering Amazon and pizza by drone... Could that be real?

I mean, the technology is there to deliver packages.

What it needs, actually, is the factor of what happens when something bad happens, like when a drone crashes.

Who is responsible?

Those are factors that have to be worked out, and those involve not just the technology developers like us but involve regulatory policy makers.

They involve federal agencies as well as local communities.

But isn't part of your research making sure they don't crash?

That's what it is about.

So the autonomous part, the safe autonomy, the reliable autonomy part is to ensure that it does not crash, but there are things that happen in nature that we are not in control of.

For example, what happens if you have bad weather or a sudden wind gust blows your craft off course, and those are actually the more challenging as well as the more interesting research questions that we are looking at.

♪♪ ♪♪

Ainissa Ramirez is a scientist, author and a self-proclaimed science evangelist.

She is the creator of a podcast series called 'Science Underground.'

She joins me now to discuss the Internet and its impact on the creative side of our brains.

So this is a topic that I am sure that everyone has an opinion on.

Now, is our brain something that actually changes over time given the type of stimulus in the world that we live in?

Absolutely, and this is good news because if you're older, like myself, you can learn that second language.

You can learn how to play the guitar.

The brain is flexible.

It's plastic.

That's what they would say.

So that's wonderful.

But it's also worrisome because it means that whatever we expose the brain to, it will also change accordingly.

And so now that we're all on the Internet where we're jumping from topic to topic, we're skimming.

We don't think deeply, so we're going to start adapting those kinds of skills as well.

So we are going to enter what this book that I read recently called 'The Shallows.'

The shallows, what does that mean?

So, like, right now, if I look at my phone, I have about six different headlines.

I have four different apps that are telling me of things that it thinks are very important to me.


Now, I feel somewhat informed in the news all of a sudden.

That's right.

I mean, obviously, I'm in the news business, but I feel somewhat informed just by looking at these headlines, and I might not click...

Right, but you're not doing a deep... Well, you will do a deep dive.


But most people won't do a deep dive, and so we don't develop the skills for deep thinking.

We won't have the skills to do the critical thinking that used to happen when we used to read long magazine articles or read books, but now that we're on the Internet and we're reading a tweet or a Facebook post, our attention spans are looking at things that are very short in size, and so the brain will start to adapt to digesting just small nuggets.

So what part of our actual brain is the creative side versus the numbers side, so to speak, and how is that affected?

Is it because it's just not stimulated that it starts to shrink down, or what's happening?

Oh, that's a good point.

Well, the parts of the brain for creativity are still under debate.

If you talk to different scientists, you'll hear that they're still starting to study it, and they're still trying to figure out how the brain creates.

It ends up that it's a bunch of parts of the brain that's interacting with each other, and what you want is a superhighway between all of them to be highly creative, so they're still kind of sorting that out.

But it has been found, in terms of new skills... Let's say that you and I learn how to juggle for a couple of weeks.

There's a part of our brains that will grow because of that new skill, so we know that it's plastic, and it can change, but for the Internet, they're still trying to study that.

The difficulty is that it's really hard to study the brain because it's really hard to study people who have not used the Internet.

You need a control, and most people have some exposure to it.

Those who don't, maybe they speak a different language or they're Amish or they are suffering abject poverty.

They've got other issues that make them difficult to be controls.


So what I hear you saying, though, is that it is possible to be keeping that part of our brain active by learning new things.



So besides maybe using the Internet less or maybe diving deeper when possible, we can learn to stay creative.

Oh, absolutely.

So there's two factions with creativity.

There's those who say we're going to be highly creative because we're exposed to so many new ideas as you just showed on your cellphone, and we'll know more, so we'll be more creative.

There's another faction that says that in the creativity process, you need to be able to simmer, and because of the way that we use the Internet, we don't really give ourselves space to simmer.

We're highly distracted.

We play Candy Crush.

We're not giving our brains time to think in the background about those ideas, so these two different factions say that creativity is going to change, but one says more and the other says less.


I could see right now a 12-year-old watching this segment saying, 'Ugh, these people are so old.

They don't understand how creative I am on Snapchat.

I did the face swap, and then I made the pineapples rain, and I did all this other stuff.'

What are you saying?


I don't know what you're saying.


So I mean, aren't there tools... I mean, the Internet being a tool...


Aren't there opportunities for creativity that it is enabling or it can enable?


It can create... It can make us more creative.

It's just that, the way that we're using it, we're not providing enough time for our brains to incubate, so I don't think that we'll be as creative as we'd like.

It definitely is a tool where we can be more creative.

We can learn things instantly.

We don't... Back in the day, we'd have to go to the library, go at certain times, go into encyclopedias and then...

The microfiche.

Microfiche, for the little ones, microfiche is this, you know, film, which is also another thing they don't know.


But so we have access to so much information, but we're highly distracted, and so we're not using that tool effectively to make us the most creative.

All right.

Ainissa Ramirez, thanks so much for joining us.

Thank you.


When we enter the Martian atmosphere, we're going at about 12,500 miles per hour, and we end up landing at about 5 miles per hour, nice and soft and easy and a piece of cake.

I'm Julie Wertz, and I help spacecraft land safely on Mars.

We'll all be in here monitoring the spacecraft.

Entry, descent and landing is basically the entire process of how we get from cruise when we're flying through space and on our way to Mars all the way through entering the atmosphere, deploying our parachute and landing safely on the ground.

There are a lot of things that have to go right.

I'm going to be nervous, but at the same time, we've prepared for a lot of it.

We've tested every scenario we can think of.

In this room, we'll be watching for data from the spacecraft trying to help inform the public about what's going on and how we're doing.

My husband was on the entry, descent and landing team for Curiosity.

Coming up on entry.

I've been through EDL before as a wife and as a family member.

I've gone through the stress and the panic of going through EDL, but this will be my first time going through it with something that I've helped directly work on.

That's how we ended up as an EDL couple.

It's my fault.

Imagine embarking on a journey to space as an astronaut.

Now you could do just that on a 4-D virtual reality journey at the Samsung flagship store in New York City.

In celebration of the 50th anniversary of the first Moon landing, Samsung has partnered with NASA to showcase STEM education and the important role space has in the future of our humanity through virtual reality.

Come along for the journey.

Samsung is an innovation company, and when we looked to open our first-ever flagship, we really looked through that same lens of innovation in terms of what is the future of retail, the store of tomorrow?

And so we wanted to create a space where people could really immerse themselves in all aspects of the Samsung ecosystem, and we've really built it as a digital playground.

When people come to Samsung 837, we want them to really feel like they've experienced the future, like they've gotten a glimpse into how technology can help them unlock their potential and defy barriers and really where technology is headed.

So when you come to 837, you can experience virtual reality, augmented reality, get a sense of what 5G is going to mean in terms of connected cities, connected living.

So it's really a space where people can discover, explore on their own terms and really understand how technology is pushing us forward as a society.

So 'A Moon for All Mankind' was really our effort to push the boundaries of the immersive storytelling we can do through virtual reality.

We truly believe as an organization that virtual reality is not only an empathy machine but that it is the most immersive storytelling platform.

And we have multiple 4-D VR experiences.

That means we're adding other aspects and other senses into the experience: moving platforms, wind, smell, different elements to heighten the overall experience.

And so when we were looking to create our latest experience, we really wanted to push the boundaries of what we could do.

And so space kept on rising up to the surface, and we really felt another mission for Samsung is to democratize experiences that are typically reserved for an elite few.

Only 12 men in history have ever walked on the Moon, and so we really wanted to break that barrier and bring the Moon to absolutely anyone who wanted to experience it.

So in partnership with NASA, we've developed this Gravity Offload System, which simulates lunar gravity, the 1/6 gravity that you experience on the Moon, and combined that with completely realistic virtual reality content and even IMUs that track your body movements so that you can see your limbs in goggle.

All of that combined creates a completely realistic experience.

So when you first step into the mission control area, you watch a briefing video from mission control, which tells you a few dos and don'ts about your Moonwalk and also a little bit more information about the mission that you're about to embark on.

You then undergo a multistep process in terms of getting ready for your mission.

So the first step is putting on the harness that we've specially developed for this experience.

After the harness, you then put on your flight suit, which is very realistic; astronauts wear something very similar.

Following that, you then strap on IMUs, which are motion-detecting sensors that allow you to view your limbs in goggle as if you were really looking at them.

And then the last piece is to put on your helmet, which has a Gear VR headset integrated into it, and then you're going to come onto the landing platform.

So once you get here on the stage behind me, which is... And behind us we have our 3-story digital screen with a brilliant Moonscape.

You're strapped into the rig, which measures your weight so that the system can counterbalance based on your movements, and then we strap in a Gear S9, and you go on your mission.
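The counterbalancing the rig performs comes down to simple arithmetic: to make a visitor feel the Moon's 1/6 gravity, the harness must carry the remaining 5/6 of their Earth weight. The sketch below illustrates that calculation only; the function name and the example mass are made up, and the real system adjusts dynamically as the wearer moves.

```python
# Illustrative arithmetic for a gravity-offload rig: to feel lunar gravity
# (1/6 of Earth's), the harness must support the other 5/6 of your weight.
# Names and the example mass are invented for illustration.

G_EARTH = 9.81          # Earth surface gravity, m/s^2
LUNAR_FRACTION = 1 / 6  # Moon's surface gravity relative to Earth's

def offload_force(mass_kg):
    """Upward force (newtons) the rig applies so the wearer feels 1/6 g."""
    return mass_kg * G_EARTH * (1 - LUNAR_FRACTION)

# A hypothetical 72 kg visitor: the harness carries about 588.6 N,
# leaving roughly 117.7 N of apparent weight -- the lunar feel.
print(round(offload_force(72.0), 1))
```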

So the in-goggle experience lasts about 5 minutes where you actually land on the Moon, you step out of your lander and then you're engaging with mission control as they guide you through your Moon mission.

Welcome to your first EVA.

Heart rate and biometrics are looking good.

The Moon was a domain for the few.

It's now open to all.

Time to make your mark.

Point and click your controller at the air lock door to open it.

We can confirm that you have the all clear to exit.

Take your first leap.

So I've always been a bit of a space geek, so this was a real bucket list project to work on.

Myself and my team got to go to Johnson Space Center through our partnership with NASA and actually test the ARGOS Gravity Offload System that NASA uses to train astronauts on lunar and zero gravity.

It was an amazing experience, but it really helped us ensure that the experience that we were developing for consumers here was completely realistic.

So NASA was incredibly helpful not only in providing us historical content so that the content, the Moonscape, that you experience in goggle is completely realistic but also in terms of the harness development and the Gravity Offload System that we specifically designed for Samsung 837.

Having gone through that training and done this, I can tell you this is a very, very realistic experience, so much so that we even had astronaut Mike Massimino, who's done two space walks on the International Space Station, come and experience our 'Moon for All Mankind' and said he actually felt it was almost more realistic than NASA's experience because of the VR content.

That was great.

That was so awesome.

Oh, it's beautiful.

We launched this experience on the 49th anniversary of the Apollo mission, the first Moon landing, as we lead up to July 20, 2019, which is the 50th anniversary of the Moon landing, and our goal is to, you know, have as many people experience this as possible.

Through our partnership with NASA, we're really using this experience to showcase our commitment to STEM and STEAM education, hoping to inspire the astronauts of tomorrow as the Moon holds a critical role in our eventual manned mission to Mars.

So we want as many people to come and experience it and realize what an important role space has in the future of our humanity.

Rainwater containing pollutants, waste and motor oil is flowing into ponds across the country.

As a result, thick algae grows at the bottom of these ponds, creating environmental issues.

In this segment, we visit some researchers in North Carolina who are searching for a solution to this common problem.

Have you ever wondered what lies beneath the waters of those nicely landscaped retention ponds that are a pretty common sight in the urban landscape?

It's a pretty silty and sometimes slimy world underwater, and that's a problem.

New research shows retention ponds may look nice above the water, but they aren't working the way they're supposed to underwater.

The water, when it rains, flows into these wet ponds that are one of many of a class of storm water control measures that people use, and these wet ponds are a part of the landscape.

Our story begins with a question.

What can be done to reduce the amount of rainwater sitting on roads and on developed land?

That storm water carries pollutants such as fertilizers, yard and pet waste and even motor oil.

And as storm water carries all of that pollution into streams and rivers, the water supply, along with people and the environment, could be seriously harmed.

Because the presumption is they capture everything during the storm and fewer things leave, and if you use that in a general sense, that can mean water, which is accurate.

It does.

It captures the water.

It takes the peak off of the flow so that you don't have water rushing out into streams during storms.

But if you think about everything else like sediments and nutrients in particular, it's not as simple as they get captured in the pond and they just stay there.

State law requires developers to reduce the impact of storm water runoff.

The ponds are specifically designed to not only hold the water and let solid pollutants settle out but also to remove 30 percent of the nitrogen and phosphorus in the runoff.

That happens through a process called denitrification.

Where an available form of nitrogen, which is a nutrient that can stimulate algal blooms, comes in and through a microbially mediated process is removed.

So the presumption was that's what was going on.

Essentially, the microbes and plants in the water are supposed to take up the nitrogen.

Some is consumed.

Some is released into the air as N2 gas.

Problem solved.

A lot of those nutrients in the water are gone.

But that's not always what happens.

Some of our early work found out that not only was that not going on but the opposite of that -- nitrogen fixation was occurring.

So more nitrogen, new nitrogen from the atmosphere, was being added.

Yeah, just maybe a smidge lower if you can, and then we can always release some later in the lab if we have to.

It turns out all the action in a wet retention pond happens at the bottom, so researchers collected core samples.

You see, like, the fluff right here?


There's a bunch of just algal detritus.


Yeah, gunk on there.

It gets sandy underneath there.

You can kind of see...


...maybe some plant roots as well.

So what we do is we have our line connected to a core, so this one right here, where the water is pumping in the top, and it's being pulled out close to the sediment water interface.

So right down here is where we want to pull our water from, so then we can get a signal of what the sediments are doing, whether or not they're adding nitrogen or removing nitrogen.

What we would want to happen is denitrification where the sediments are converting nitrate, which is a nutrient, into N2 gas, which is a gas that organisms can't use to grow.

What we're seeing instead, sometimes, is net nitrogen fixation, which means that N2 gas is being taken from the water by the bacteria and turned into nutrients that they can use to grow.

Which explains why some ponds have a thick layer of algae on the surface and on the bottom.

Usually during the summer months, we're seeing actually net nitrogen fixation in the sediments, so instead of removing the nitrate, they're actually adding nitrogen into the system through nitrogen fixation.

It's bad because it's a net influx of nitrogen into the system, so instead of being net sinks, so an area where nitrogen is removed from the system where things can't use it, it's actually a net influx of nitrogen.

There's more nitrogen coming into the system than before.
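The sink-versus-source distinction the researchers draw comes down to the direction of the net N2 flux measured over those sediment cores: denitrification releases N2 gas from the water (nitrogen removed), while fixation consumes N2 (new nitrogen added). A toy sketch of that classification; the sign convention, function name, and values are invented for illustration, not taken from the study.

```python
# Toy classifier for the sink-vs-source distinction described above.
# Convention (invented for this sketch): positive flux = N2 leaving the
# water column (denitrification, a nitrogen sink); negative flux = N2
# taken up by sediment bacteria (fixation, a net nitrogen source).

def classify_sediment(n2_flux):
    """Label a measured net N2 flux as a nitrogen sink, source, or balanced."""
    if n2_flux > 0:
        return "net sink (denitrification)"
    if n2_flux < 0:
        return "net source (nitrogen fixation)"
    return "balanced"

# A hypothetical summer measurement showing N2 uptake, i.e. fixation:
print(classify_sediment(-4.2))
```

Under this convention, the summer readings the researchers describe would fall on the negative, nitrogen-fixing side, which is why the ponds act as sources rather than sinks.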

And those explosions of algae can threaten streams and rivers and lakes farther downstream from the pond.

In a world where we have a lot of increasing urban development and a lot of impervious surfaces and storm water increasing every time that you're building another Walmart or another parking lot, it's really important to think about where that water is going and how that water is affecting the wildlife community, the kind of microbes that you don't see and the nutrient flow that you also don't see, so yeah.

It's a key component for sure.

The solution, researchers say, is to routinely excavate the pond to prevent the buildup of organic material.

Installing aerators that mix the water in the pond, including fountains, can also help promote algae removal.

As they receive more and more water, they're receiving more and more sediment and other organic material, and it builds up, and so you have the physical filling of the pond, but you also have a big change in what the sediment composition is.

And if it gets too organic-rich, that doesn't favor the removal processes that the microorganisms do.

That favors the recycling of nutrients into the water column and is more likely to make there be nutrients passing through.

Right at the sediment surface is where all the action is, and so maintaining it, where you scoop it out and start anew, does an awful lot to change the conditions and potentially create conditions that are more favorable to removal.

And that wraps it up for this time.

For more on science, technology and innovation, visit our website.

Check us out on Facebook and Instagram, and join the conversation on Twitter.

You can also subscribe to our YouTube channel.

Until then, I'm Hari Sreenivasan.

Thanks for watching.

Funding for this program is made possible by... ♪♪ ♪♪ ♪♪ ♪♪