SciTech Now Episode 231

In this episode of SciTech Now: thin, flexible screens may be the future of technology; understanding the significance of gravitational waves; a company that’s giving a new kind of voice to those with speech disorders; and how pesticides used a century ago are affecting residents of a Washington town.

TRANSCRIPT

Coming up... the screens of the future.

You can always make them richer, better, nicer, more bright, vibrant.

Helping children with autism find their voices.

If you have challenges with communicating, this app not only can talk for you, but will teach you how to talk.

And, finally, dirty dirt.

I wondered if there was any type of old contamination but just assumed that it would be healthy, that they would take care of that.

It's all ahead.

Funding for this program is made possible by...

Hello.

I'm Hari Sreenivasan.

Welcome to 'SciTech Now,' our weekly program bringing you the latest breakthroughs in science, technology, and innovation.

Let's get started.

Our world is filled with display screens.

They're in computers, phones, e-readers, watches, and many other gadgets and devices.

Now, two researchers at the University of Central Florida are developing nanostructures with tunable pigments, which can be used to create flexible, very thin screens.

Here's the story.

In a high-tech culture permeated by screens, it's hard to imagine the next advancement.

Will the display be the size of a billboard or a handheld?

Could it impact the military, as well as the fashion world?

Two University of Central Florida researchers think all this and more is possible.

They want to perfect the nanoscience behind a color-tunable surface.

The project started three years ago, when Debashis Chanda, my advisor, joined UCF.

He had this neat idea of creating a color-tunable surface, pretty much like that of an octopus or other cephalopods.

They can create color and pattern on their skin.

And you can see their color-generation mechanism simply uses surrounding light and some kind of nanostructure on the surface to create color.

So, that actually was one of the motivations for us: can we create color in a similar way to how nature does?

The researchers are moving beyond rigid glass displays, typically found on current televisions, phones, and e-readers.

They're creating a flexible surface, with tunable pigment that mimics nature.

Here's an image of the helix array that we use for a demo.

These are structures that are more similar to the ones that we use in our research.

But these structures have very interesting interactions with light, and those are some of the aspects that we study.

We looked into how light gets coupled to this kind of metallic nanostructure surface and what happens to that light.

So, that was a first.

We played with the fundamental aspect of coupling light to those nanostructures.

This one is to show the minimum size that we can achieve with the machine.

So, here we have an individual line that's about 100 nanometers.

Light itself has a wavelength of around 500 to 700 nanometers.

So, one wavelength of light is about five times the width of that line.

And by using patterning at the surface, we can kind of cheat that limit a little bit and create things even smaller.

Here we have a line that's 60 nanometers.
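For a sense of scale, here is the arithmetic behind those numbers (a back-of-the-envelope comparison added for illustration, not a figure quoted by the researchers):

```latex
% Visible wavelength versus patterned feature width (illustrative comparison)
\frac{\lambda_{\text{visible}}}{w_{\text{line}}}
  \approx \frac{500\text{--}700\,\text{nm}}{100\,\text{nm}} \approx 5\text{--}7,
\qquad
\frac{500\text{--}700\,\text{nm}}{60\,\text{nm}} \approx 8\text{--}12
```

In other words, each patterned line is several times narrower than the light it manipulates, which is what "cheating the limit" refers to.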

So, the process begins with making a nanostructured surface.

And then you take that stamp, and you can actually keep imprinting onto another polymeric surface.

And from that, we can then deposit metal, which will be our plasmonic surface, which actually absorbs the light.

And the liquid crystal, which is in contact with that metal surface, is what allows us to tune the color that we see from that surface.

Here we have a sample hooked up to a microscope.

And on top of the microscope we have a camera, which is then sent to a computer, where we can see the device working in real time.

We have connected two electrical leads to the device, which will change the orientation of the liquid crystal and result in the tuning of the color.

So, liquid crystals are pretty unique in that they can change their orientation based on an electric field.

So, by applying a voltage, we switch the direction of these liquid-crystal molecules, and then that changes the color that is absorbed by that surface.
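As a rough illustration of that voltage-to-color chain, here is a toy Python sketch. The drive voltages, tilt angles, and wavelength endpoints are hypothetical placeholders chosen only to show the shape of the mapping; they are not the team's actual device physics.

```python
import numpy as np

def toy_color_tuning(voltage_v,
                     v_min=0.0, v_max=10.0,                # hypothetical drive range (V)
                     lambda_off=650.0, lambda_on=450.0):   # hypothetical red-to-blue tuning (nm)
    """Toy model: an applied voltage tilts the liquid-crystal director,
    which shifts the wavelength the plasmonic surface reflects.
    Purely illustrative; not the researchers' model."""
    frac = np.clip((voltage_v - v_min) / (v_max - v_min), 0.0, 1.0)
    tilt_deg = 90.0 * frac                                  # 0 deg (relaxed) to 90 deg (fully switched)
    peak_nm = lambda_off + (lambda_on - lambda_off) * frac  # linearly interpolated reflected peak
    return tilt_deg, peak_nm

for v in (0.0, 5.0, 10.0):
    tilt, peak = toy_color_tuning(v)
    print(f"{v:4.1f} V -> tilt {tilt:4.1f} deg, reflected peak ~{peak:.0f} nm")
```

The real structure's response is far richer than a linear interpolation, but the chain is the same: voltage changes the liquid-crystal orientation, and that changes which color the surface shows.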

But significant challenges remain before these new screens become a mass-market reality.

Think about the color you generate.

You can always make them richer, better, nicer, more bright, vibrant.

You can always think about adding various gray-scale levels.

The remaining challenges are things like angle dependence.

You want to make sure that your display looks the same color if you're viewing it at 45 degrees or straight on.

So, we are looking at those fundamental science aspects to improve the color-generating mechanism from a science standpoint.

What excites the researchers most about their process is the seemingly limitless possibilities.

If you have a phone -- say that you want to change color.

You want it to be red one day, but you want it to be green the next.

You can tune that actively.

You could also mount a patch of this on your clothes.

You can actually watch video played on it.

There's also defense applications, like for camouflage uniforms and things like that.

It's very expensive to create different uniforms for different terrains, and so if you have one that maybe changes color, that would be advantageous.

It's not just focused on doing it once in a lab and then calling it a day.

It's about finding the technology, the fundamental science, and then trying to turn it into a real-life device.

Up next, reporter Andrea Vasquez has an interview via Google Hangout.

The scientific world was abuzz when a team of scientists announced that they had recorded the sound of two black holes colliding 1.3 billion light years away.

The discovery is the first direct proof of the existence of gravitational waves, or ripples, in the fabric of space and time.

Gravitational waves were first predicted by Albert Einstein in 1916, as part of his Theory of General Relativity, and they have major implications for the study of black holes and the dark universe.

Here to help us unpack the groundbreaking discovery is physicist and educator Umberto Cannella.

Can you explain what happened and why this took so long to figure out?

On February 11th, the big discovery was announced, but it was the end of a process of continuous checking and cross-checking that started around mid-September of last year.

And that day in September, the story goes that the two instruments, the microphones of the universe, had just barely come online after major upgrades, and they immediately detected these gravitational waves from the universe.

And this wave had been traveling at the speed of light, covering a distance of 1.3 billion light years to reach us.

So, what did scientists take away from that?

Well, first of all, the theory of gravity that Einstein built is reinforced.

And the nice thing about science is that not even Einstein is believed until it's proved.

But now that we know they exist, we can access information about the universe that we were deaf to before.

What is a gravitational wave, and how exactly is it created?

We're all familiar with throwing pebbles in ponds.

Mm-hmm.

From the point of impact, we can see these ripples going further and further and then dying out.

We now replace water with the space in the universe.

It's pretty similar.

We only need heavy pebbles, which in our case are black holes or very compact stars that are called 'neutron stars.'

These stars and these black holes have a heavy mass, and that's what's creating this ripple?

Exactly.

In the usual way of describing things, one can think of a trampoline that is elastic and receives a deformation, a dent, by a bowling ball, for example.

And the bowling ball could be the Earth, could be the Sun.

Depending on the mass, it would make a bigger dent.

So, what effects are these gravitational waves having on the environment around them?

Well, close to this very romantic encounter, things can really be a little bit violent.

The energy is huge in terms of the deformation of space.

It's like if a wave was trying to knead us, like pasta.

So, an incoming wave, a gravitational wave, hitting our bodies would stretch us from top to bottom, like spaghetti, squeezing our hips at the same time.

But as if that were not enough, a second later the effects would reverse.

So, our hips would be pulled outwards, our arms, too, and our head would be squashed towards our feet like a lasagne.

And for something that's so far out in this deep darkness of space, how are scientists able to observe and measure these waves?

Because of the peculiar effect that we discussed, if you build an instrument that is shaped like the letter 'L,' you are sensitive to these alternating deformations, even if the deformation is minuscule.

Why is that?

Is it bouncing off of one to the other?

The light in the instrument bounces off the ends of the two tunnels in the 'L,' and as the two arms of the tunnels are alternately squeezed and stretched, that alternation is what makes the waves apparent in the data.
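To put a number on "minuscule": detectors like these measure strain, the fractional change in arm length. Using the publicly reported order-of-magnitude figures for this first detection (a peak strain of roughly 10^-21 on 4-kilometer arms), the arm-length change works out to be far smaller than an atomic nucleus:

```latex
% Strain h is the fractional change in arm length L (reported order-of-magnitude figures)
h = \frac{\Delta L}{L} \approx 10^{-21},
\qquad
L = 4\,\text{km}
\;\Rightarrow\;
\Delta L \approx 10^{-21} \times 4\times 10^{3}\,\text{m} = 4\times 10^{-18}\,\text{m}
```

That is why it is the alternating squeeze and stretch of the two arms relative to each other, rather than any absolute displacement, that shows up in the data.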

Okay.

So, we're sending out these instruments to collect these measurements?

We will send instruments out in space.

For the moment we have two on Earth.

Could this impact things like space travel and other things that affect what we're actually doing here on Earth?

Everyone wishes so, even a few scientists, because it would be affecting the so-called fabric of space and time.

And because we are inside it, we can think of shortening the path between us and a star.

And how often are these gravitational waves rippling out into the universe?

That's a very deep question because so far we have been deaf, so we have basically no idea.

There were estimates that said we could have as many as a thousand signals per year.

Other estimates said that we would have to be lucky to have one signal per year, because we were deaf, so we didn't have many points of comparison.

So, it's completely new.

That's why people are really excited about it.

And how long do we have to wait until we start getting the data back to give us a better picture?

Well, scientists are very thorough in their analysis because you have to disprove your own self, basically.

You don't want to believe what you would like to be true.

And that's why they waited all that time, while rumors were spreading on the Web, on the Internet, because they wanted to be as sure as possible.

But now that the procedure has been tested, if nature is kind enough, we could get another one very soon.

Okay.

Well, we look forward to seeing it.

Thank you, Umberto, for being with us.

My pleasure.

'Euphony' -- it's a word that means pleasant-sounding to the ear.

It's also the name of a startup that's using innovative software to help children on the autism spectrum find their voices.

Here's a look.

Euphony's got an interesting story behind it.

We were working on a project for a government contractor in the advanced-research division, and the government sequester and shutdown at the time put in jeopardy a lot of the follow-on funding that we were really relying on.

Consequently, all of our projects got shut down.

So, I took that as an opportunity, instead of just a chance to lament, and asked my company if I could take out the technology that we'd actually developed that would have ended up rotting on a shelf.

So, we formed a small business, and here we are.

It really exposed me to a different world, where speech is important and where people either don't have the voice they want or don't have a voice at all.

So, people within the speech-communication-disorder community and the autism-spectrum-disorder community decided that that would be a really neat place to bring our technology advances and make use of them.

What we have managed to focus on specifically right now, looking at how to model emotion and context change, has allowed us to do the equivalent of putting a smile onto a face that doesn't smile -- doing that with a voice.

[ Monotone ] He turned sharply and faced Gregson across the table.

[ Inflected ] He turned sharply and faced Gregson across the table.

Right now what we've done is we're partnered with a speech-language-pathology company to replace the speech capability in one of their apps.

And that's called 'InnerVoice.'

InnerVoice is a mobile app.

It's available now on the Apple iTunes store.

It's only on iOS currently.

And it is a communication app.

So, if you have challenges with communicating, talking, with language, social language, this app not only can talk for you, but will teach you how to talk.

I want hot chocolate.

And Fuzz is the expert there.

He's working on all these wonderful text-to-speech things where you can put emotions into text-to-speech, and now we can teach our kids how to say things in a happy way, maybe.

Today it's fairly robotic.

[ Monotone ] I want an apple pie.

And it doesn't connect the emotional content with the user.

It doesn't say, 'This is what you should sound like when you're happy or sad.'

That's a piece that's missing.

So, what we're focused on right now in this project is adding three significant emotions and tying those to improvements to the face.

Here's the state of the art.

[ Monotone ] I like having a voice to speak with.

Here's Euphony's version of that.

[ Inflected ] I like having a voice to speak with.
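To make the monotone-versus-inflected contrast concrete, here is a small, self-contained Python sketch. It is not Euphony's engine; it only synthesizes two short test tones, one with a flat pitch contour and one whose pitch rises and falls, the kind of prosodic movement an "inflected" voice adds.

```python
import wave
import numpy as np

SAMPLE_RATE = 16_000  # Hz
DURATION_S = 1.5

def synth_tone(f0_contour_hz, filename):
    """Synthesize a tone whose pitch follows f0_contour_hz and save it as a 16-bit WAV."""
    n = int(SAMPLE_RATE * DURATION_S)
    t = np.arange(n) / SAMPLE_RATE
    # Stretch the coarse pitch contour across the whole clip.
    f0 = np.interp(t, np.linspace(0, DURATION_S, len(f0_contour_hz)), f0_contour_hz)
    phase = 2 * np.pi * np.cumsum(f0) / SAMPLE_RATE  # integrate frequency to get phase
    samples = (0.3 * np.sin(phase) * 32767).astype(np.int16)
    with wave.open(filename, "wb") as wav:
        wav.setnchannels(1)
        wav.setsampwidth(2)   # 16-bit PCM
        wav.setframerate(SAMPLE_RATE)
        wav.writeframes(samples.tobytes())

synth_tone([120, 120, 120, 120], "monotone.wav")   # flat contour: the "robotic" voice
synth_tone([110, 160, 190, 130], "inflected.wav")  # rise-and-fall contour: crude "inflection"
```

Real emotional speech synthesis also shapes timing, loudness, and voice quality, but pitch movement is the most audible difference between the two samples above.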

It's a running spectrogram.

So, it's recording everything I'm saying and displaying it in a spectrogram that measures frequencies, frequency ranges, and power -- those kinds of things.

And it's live.

But what it really demonstrates is how much is involved in analyzing a speech signal.

You can see all the noise up top.

If I make 'S' sounds -- 'ssss'.... That's what I'm analyzing under the hood.

When I make voice sounds, like vowels in particular -- 'eeee' and 'aaaa' -- you can see what changes here based on the shape of the mouth.
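For a rough sense of what that running spectrogram computes (a generic sketch, not the app's code), here is how one could break an audio signal into its frequency content over time using scipy:

```python
import numpy as np
from scipy.signal import spectrogram

SAMPLE_RATE = 16_000  # Hz

# Stand-in for two seconds of microphone input: a 440 Hz tone plus noise.
t = np.arange(SAMPLE_RATE * 2) / SAMPLE_RATE
audio = np.sin(2 * np.pi * 440 * t) + 0.1 * np.random.randn(t.size)

# Short-time Fourier analysis: frequency bins (Hz), frame times (s),
# and power at each (frequency, time) cell.
freqs, times, power = spectrogram(audio, fs=SAMPLE_RATE, nperseg=512, noverlap=256)

# The loudest frequency bin in each frame. Voiced sounds like "eeee" and "aaaa"
# concentrate energy in low harmonics, while "ssss" spreads energy up high.
dominant_hz = freqs[power.argmax(axis=0)]
print(dominant_hz[:10])
```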

The biggest challenge is how do we build the ability to change emotional context without completely overhauling and rebuilding a synthesis engine from scratch?

As a startup business, I don't have the resources to do that.

So, what we have successfully demonstrated is that we can build voices that very effectively drop into existing technology environments, with zero changes or very few in some cases, and allow a new context to be introduced.

It's completely out there.

The other companies are great.

They've been creating the grid styles since the '60s.

And, truthfully, when mobile technology came out, they took that same style and put it onto the mobile technology.

Well, we know mobile technology is capable of so much more than that.

It's wonderful.

I want a cookie.

The reason Euphony decided to really focus on this is because numbers in different areas are changing.

For instance, 1 in 68 people is being diagnosed with autism.

We know that's changing.

And that's just a number we're familiar with in the Western, predominantly English-speaking, world.

This is a worldwide problem where 1 in 68 people could potentially not have the same opportunity to communicate in life verbally.

They should have that opportunity if they want it.

Not everybody necessarily wants to express themselves with speech, and that's fine.

But for those who want to, they should be part of our culture.

I want a cookie.

Lead and arsenic used decades ago in pesticides are still lingering in the topsoil of Pacific Northwest apple country.

This poses a health risk for children who come into close contact with dirt in backyards and playgrounds developed on former orchards.

Our environmental news partner Earth-Fix gives us a look at what's being done to keep kids safe from contaminated soil.

One thing Jennifer Garcia loves about her home in Yakima, Washington, is having a yard where her children can play.

We set up a little garden for the kids out here for fun.

We have our pumpkins and tomatoes.

Garcia didn't know that the soil in her yard is contaminated.

I wondered if there was any type of old contamination, but I had hoped, or you would just assume, that it would be healthy, that they would take care of that.

So, no eating dirt back here, guys.

Okay?

Her home sits on land that used to be an apple orchard.

Recent tests show high levels of arsenic in the topsoil.

The poisonous substance is left over from pesticides sprayed here decades ago.

It's the legacy from a decades-long battle between apple growers and an insatiable pest -- the codling moth.

Codling moth caterpillars infest apples, turning them into mush.

Damaged apples were thrown out by the thousands.

But in 1905, orchardists found an answer -- a pesticide called 'lead arsenate.'

Frank Peryea has studied lead arsenate at Washington State University for decades.

By the end of the 1930s, there was so much lead arsenate being used that there's an argument that it wasn't really poisoning the insects.

It was forming such a coating that the insects couldn't chew through it.

By the late 1940s, farmers started using other pesticides, but lead doesn't break down or move in soil.

And for the most part, the same is true for arsenic.

So, much of the lead arsenate sprayed here, even 100 years ago, is still in the soil today.

It's estimated that legacy lead and arsenic contaminates more than 187,000 acres of old orchards in central Washington.

That's an area roughly the size of Seattle and Portland put together.

That's a lot of acres.

You can't dig that up and cart it away.

And here's the problem.

That lead and arsenic poses a health risk to people who come into contact with it, especially children.

Lead is bad for young brains at any level.

There's no level that we consider safe for kids.

And it crosses the blood-brain barrier until you're about six years old, and it affects your brain.

The Washington Department of Ecology knows contaminated soil is a problem.

It's done cleanup projects at 26 contaminated schools but stopped before cleaning up the final two schools in Yakima.

The Department of Ecology says local communities and the state legislature haven't expressed much interest in cleaning up old orchard lands.

Valerie Bound oversees central Washington's toxics-cleanup program.

If I had people who were calling on a regular basis and wanting information, I would see a need.

This is an agricultural community.

Everybody knows people who are in the industry.

There's also the issue of money.

Cleaning up one school can cost more than $500,000.

Bound says the Department of Ecology has used up its funds for cleaning up old orchard sites.

So, there aren't any plans to finish the school cleanups or move on to other places where children can play, like parks or daycares.

That doesn't sit well with Jose Luis Mendoza.

He operates several daycare centers in the Yakima area.

Mendoza learned last year that his daycare center was contaminated with lead and arsenic.

He decided to cover the contaminated soil with clean dirt and then put in a new lawn.

Little kids under six years old, wherever they're playing -- if they're playing in the backyard, they put everything in their mouth.

Mendoza wanted to make sure the cleanup worked.

So, he asked state officials to retest the topsoil on his property.

So, that top layer looks great.

Okay.

And it could be because that's the clean stuff you brought in.

Mm-hmm.

These ones down a little lower here, we'll see what's down underneath.

That dirt shows evidence that the daycare was indeed built on old orchard land.

But Mendoza had already done what state officials would have recommended to keep kids safe.

The clean dirt keeps the contaminated soil away from kids.

It's like, 'Oh, the Department of Ecology, they have to take care of it.'

No, we are living in the community.

We are part of the community.

We have to take care of it.

We have to do it together.

When developers build or sell homes on former orchards, they aren't required to disclose potential soil contamination.

Until that changes or the state decides to pay for more cleanups, many residents won't realize they're living with a legacy of pesticides.

And that wraps it up for this time.

For more on science, technology, and innovation, visit our website, check us out on Facebook and Instagram, and join the conversation on Twitter.

You can also subscribe to our YouTube channel.

Until next time, I'm Hari Sreenivasan.

Thanks for watching.

Funding for this program is made possible by...