Technology for the blind tends to rely heavily on auditory cues to convey information. Now WearWorks, a Brooklyn, New York-based technology company, is developing a device that communicates to the visually impaired through touch. Keith Kirkland, CEO and co-founder of WearWorks, joins Hari Sreenivasan to discuss this topic.
A wearable guides the visually impaired
Technology for the blind tends to rely heavily on auditory cues to convey information.
Now, WearWorks, a Brooklyn, New York-based technology company, is developing a device that communicates to the visually impaired through touch.
Joining us is the CEO and co-founder of WearWorks, Keith Kirkland.
Thanks for joining us.
Thank you for having me.
So, what do we have here?
What do you got?
So, this is the Wayband, and this is a wearable tactile navigation device for the blind and visually impaired.
Basically, we figured out a way to guide people to an end-destination using only vibration, without the need for any visual or audio cues at all.
Okay, so, somebody, uh, a visually impaired person would put this on their -- it looks like it's an arm band -- and what are the kind of -- what's the feedback that they would be getting through it to tell them to turn left or turn right?
And so, what most companies have done in this space is, they've either used two separate devices, a left device and a right device, to tell you to turn left or turn right.
Or they've used a set of different vibration patterns, two buzzes, left, three buzzes, right.
We wanted to do something that was so intuitive that like, we could just give people the device, and they could figure out the direction on their own without any instructions or prior experiences.
And so, we've designed this kind of -- you know, this haptic corridor.
It basically gives you the sensation that you're walking down an invisible hallway that you can feel the edges of, and as you get to the corners, you can feel the edges of the next part of the corridor.
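The corridor idea described here can be sketched as a simple function of heading error: silence inside the corridor means "on course," and vibration ramps up as you drift toward the "walls." The corridor width, the intensity curve, and all names below are illustrative assumptions, not WearWorks' actual implementation.

```python
CORRIDOR_HALF_WIDTH_DEG = 15.0  # assumed tolerance around the target bearing


def heading_error(user_heading_deg: float, target_bearing_deg: float) -> float:
    """Signed smallest angle (degrees) from the user's heading to the target bearing."""
    return (target_bearing_deg - user_heading_deg + 180.0) % 360.0 - 180.0


def vibration_intensity(user_heading_deg: float, target_bearing_deg: float) -> float:
    """0.0 inside the corridor (silence = on course); grows toward 1.0 as the
    user turns further off course, like feeling the walls of an invisible hallway."""
    err = abs(heading_error(user_heading_deg, target_bearing_deg))
    if err <= CORRIDOR_HALF_WIDTH_DEG:
        return 0.0
    return min(1.0, (err - CORRIDOR_HALF_WIDTH_DEG) / (180.0 - CORRIDOR_HALF_WIDTH_DEG))
```

This matches the demo later in the interview: the band pulses while the host points the wrong way and goes quiet once he faces the opening.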
But let me put this in -- in practical terms.
They're walking down the sidewalk, and there's other people.
Are those other people part of, are they intruding, or coming into and out of that corridor that you're talking about?
No, no, so, right now, there are two issues around navigation for blind people.
One is kind of microscale, like the idea of avoiding the people that are around you, the fire hydrant, the poles, things like that, and their cane skills and their dog and mobility training skills really help with that.
The part that we're looking at is like macroscale wayfinding.
Like, the dog or the cane doesn't help you to get to the post office, or to the coffee shop to meet your friend.
You could say that it's almost like a Google Maps or a Waze, something that helps you plot these longer walking paths, and you're not trying to replace the cane or the dog, per se, but you're helping them get to places they couldn't reach in the past.
Like, right now, most blind people use audio navigation.
Um, and audio navigation's a really great aid for them.
But the thing is, for blind people, their ears are almost like their eyes.
It's their primary sense for taking in, you know, environmental information, and it's how they keep themselves safe, and understand what's going on around them.
And so, having someone constantly talk into your ear might make you miss cues in the environment that are really important.
So, what kind of -- what kind of feedback is that actually giving you?
Are these pulses?
Yeah, I was about to say, actually, if you don't mind, I can show you real quick.
And so, we built a custom application.
Um, it's called Wayband.
And so, you launch the app, and you get a map of the location of where you are, right?
And so, I'll give -- I'll let you try the -- the arm device on.
I'll pull it over your suit jacket.
Call it the forearm device.
Okay, I've got it on.
[ Laughs ] Looking quite stylish.
Um, and so, basically...
All right, I felt a little pulse there, turning it on.
You felt a little pulse there -- okay, perfect.
Whoa. All right.
And so, now what you can do is -- um, we built a visual app, because visual impairments have such a huge range.
You might have like, perfect pinhole vision in one eye, and still be considered legally blind.
And so, we built a visual app still, so that people could have access to see, but we also built it so that it works with iPhone's VoiceOver features for accessibility, which is what blind people use to navigate their phones currently.
Um, and I can show you a demo of that in a second.
But you do a start navigation --
Yep, it's pulsing.
So you can't cheat, I want you to hold the phone upside down.
I want you to do a 360 spin really slow.
I want you to feel the whole experience, and come back around.
Oh, okay, now, it stopped, and now it's pulsing again.
And now, I want you to spin again, I want you to stop in the direction that you think is the right way to go based off what the device is telling you.
When you get there, freeze.
I want to check.
Let me see.
Oh, now it's -- now it's pulsing again.
Yeah, yeah, yeah. Perfect.
You got it right.
So, now, I should go in this direction?
And so, what it would do is... I'll turn it upside down so it's not working, but basically, it's guiding you to this point, and when you get to this point, it's going to give you your next point, so we can orient you toward the direction of your next point.
When you get there and you collect that dot, almost like playing Pac-Man, we automatically update the map so that you're guided to the next point.
It's going to constantly be buzzing until I get to a -- it's sort of an opening.
And I walk through opening after opening after opening.
And so, like, once you get to -- Exactly, so you find the opening and you keep walking through it, and then you get a notification buzz that lets you know you've arrived at your end destination, and it feels something like this.
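The "collect the dot" behavior described above could be sketched as a simple waypoint loop: advance past any point the user has reached, and signal arrival when none remain. The arrival radius, the flat-earth distance approximation, and the class name are assumptions for illustration, not WearWorks' code.

```python
import math

ARRIVAL_RADIUS_M = 10.0  # assumed distance at which a dot counts as "collected"


def distance_m(p, q):
    """Equirectangular approximation between (lat, lon) pairs in degrees;
    fine at street scale."""
    lat1, lon1 = map(math.radians, p)
    lat2, lon2 = map(math.radians, q)
    x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2)
    y = lat2 - lat1
    return math.hypot(x, y) * 6_371_000  # Earth radius in metres


class Route:
    def __init__(self, waypoints):
        self.waypoints = list(waypoints)
        self.index = 0

    def update(self, position):
        """Advance past every waypoint the user has collected; return the
        current target, or None once the end destination is reached."""
        while (self.index < len(self.waypoints)
               and distance_m(position, self.waypoints[self.index]) < ARRIVAL_RADIUS_M):
            self.index += 1  # dot collected; re-aim at the next point
        if self.index == len(self.waypoints):
            return None  # arrived: time for the notification buzz
        return self.waypoints[self.index]
```

Each call with a fresh GPS fix either keeps the current target, hops to the next one, or reports arrival.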
All right, so, this -- this -- How did you get into designing this stuff in the first place?
Um, so, me and my three co-founders, we all kind of came together.
Um, we all have backgrounds in industrial design.
We met at Pratt Institute in Brooklyn.
Yang, one of my other co-founders, and I did the master's program there.
And Kevin, the other co-founder, he was a part of the bachelor's program.
And so, we all kind of came into the space through different doors.
My personal door was, I had spent my thesis year trying to figure out a way of combining fashion, movement, and technology.
And I ended up designing, or setting the premise for a suit that you can download kung fu into, and the suit would teach you kung fu.
This is straight out of 'The Matrix.'
My thesis was 'The Matrix.'
I saw that movie like a thousand times.
And I was like, wouldn't it be great if...? And then I started putting all the pieces together.
I was like, wait, you can do this now with existing technology.
Building the kung fu suit was actually much harder, and it taught you some of the basics, and so you were able to apply some of that learning to this?
Yeah, almost -- almost a reverse.
Like building a kung fu suit was so -- was much harder.
And we were kind of like, okay, like, let's simplify, right?
You know, like navigation has a very distinct set of commands.
It's go straight, turn left, turn right, wrong way.
You know, you've arrived, you begin.
So, we're like, okay, let's take a simplified experience, and, like, let's see if we can figure out a way of communicating this experience through touch.
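That small command vocabulary (go straight, turn left, turn right, wrong way, arrived) could be modeled as a dispatch from heading error to command. The enum names and angle thresholds below are illustrative guesses, not WearWorks' actual scheme.

```python
from enum import Enum


class NavCommand(Enum):
    GO_STRAIGHT = "go straight"
    TURN_LEFT = "turn left"
    TURN_RIGHT = "turn right"
    WRONG_WAY = "wrong way"
    ARRIVED = "you've arrived"


def command_for(error_deg: float, arrived: bool) -> NavCommand:
    """Map a signed heading error (target minus heading, degrees) to a command.
    Thresholds are illustrative: within 15° counts as on course, beyond 120°
    as facing the wrong way."""
    if arrived:
        return NavCommand.ARRIVED
    if abs(error_deg) <= 15:
        return NavCommand.GO_STRAIGHT
    if abs(error_deg) >= 120:
        return NavCommand.WRONG_WAY
    return NavCommand.TURN_RIGHT if error_deg > 0 else NavCommand.TURN_LEFT
```

Note that how each command is rendered as touch is the hard part; the interview makes clear WearWorks deliberately avoided counted-buzz codes in favor of the corridor metaphor.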
You had a -- a visually impaired marathoner that wore this through the New York City Marathon.
And this was not something that -- you know, this wasn't the goal of the product, but you kind of put it through its paces.
That was a -- that was a wonderful experience for us, you know.
Um, Simon Wheatcroft, he's the -- the marathon runner, he reached out to us, and he said, 'Hey, you know, if you can have this device ready by the New York City Marathon, I'd run this year with it.'
And so, we did not believe that we could have it ready by the New York City Marathon, but we were like, let's make the choice to do this anyway, knowing that the process of trying to get to the marathon is going to advance the product and the technology so far.
In our minds, it was like, this marathon was like Le Mans.
Like, if we can get through this, like, we -- we set the standard, and we create trust for the community around what it is that we've been working on and what we can do.
So, what'd you learn?
Oh. Well, one of the things that we learned is that, um, make sure the devices are very water-resistant.
Um, the marathon -- we weren't expecting the rain, and the rain got a lot harder, and at some point, maybe around mile 15 or 16, the device stopped working because it got wet.
But we also got a lot of interesting data along the way that points to concerns we're now looking to resolve.
You know, one of the things is that GPS accuracy in the city is pretty troublesome.
So, we're looking on the software side at how we can resolve that.
And then, ultimately, for our group, you know, like, the real issue is -- is not necessarily navigating through the city streets.
We're giving them a better way to navigate by letting them use touch.
But they can navigate with the audio navigation.
The real challenge is that last 30 feet: where is the door handle?
That's the part that GPS resolution doesn't let you solve for.
And we're also working on a few projects around indoor navigation.
Keith Kirkland, CEO and co-founder of WearWorks.
Thanks so much for joining us today.
Yeah, thanks so much for having me.