Unmute Presents CSUN Recap with Special Guests

Marty, Michael, John, Chris, and Patrick reflect on their experiences at the CSUN tech conference, emphasizing the prevalence of AI in assistive technologies and the importance of networking with industry professionals. They discuss innovations like Glidance for individuals with visual impairments, Envision’s new assistant tool, and Aira for visual interpretation services. The team explores OKO for safe intersection crossing, Sorensen’s sign language service, and technologies for individuals with disabilities. Appreciating the networking opportunities and advancements, they express anticipation for future collaborations and innovations in accessibility tech.

Send us your feedback online: https://pinecast.com/feedback/unmute-presents-on-acb-communi/c4d0908c-a263-49d4-9e3c-530c5c667f87

This podcast is powered by Pinecast. Try Pinecast for free, forever, no credit card required. If you decide to upgrade, use coupon code r-e4dc67 for 40% off for 4 months, and support Unmute Presents.

WEBVTT

00:04.600 --> 00:14.894
Hey, everyone, and welcome back to another Unmute. And today we are bringing you our CSUN recap. And with me today, I’ve got Michael Babcock. Say hello, Michael.

00:15.014 --> 00:16.278
Hello, everyone.

00:16.446 --> 00:18.742
And we have John Gassman. Say hello, John.

00:18.878 --> 00:22.526
Hello, John. You knew I was going to do that.

00:22.590 --> 00:23.478
Yeah, I did.

00:23.526 --> 00:24.766
Absolutely. Yeah.

00:24.830 --> 00:25.474
Yeah.

00:26.614 --> 00:29.436
And we have Chris Cook with us. Say hello, Chris.

00:29.590 --> 00:30.524
Hey there.

00:31.104 --> 00:34.384
And Patrick Burton. Say hello. Hey.

00:34.424 --> 00:35.284
How you doing?

00:35.704 --> 00:47.884
All right, so let’s see, John, what are some of your experiences? What was your favorite thing? What did you see? What’s at the top of your list for CSUN?

00:48.304 --> 01:27.616
Probably, like, this is. This is an answer that nobody else will give. Probably AI. Everybody in the world had AI this year at CSUN. All the major companies had something to do with AI, and it was just fun to see what they came up with. Envision and Aira are pretty close to being very similar with what they’re doing. Of course, a couple weeks before CSUN, JAWS came out with an update to their Picture Smart feature with AI, which is really outstanding.

01:27.680 --> 01:48.024
I’ve told people several times that if your event or products didn’t have AI in the title or description, you probably shouldn’t have been at the 39th annual CSUN, because either AI or document remediation were the two big topics at this year’s CSUN.

01:48.364 --> 02:11.344
Yeah, I thought it was in everything, and at some point it almost becomes redundant, I would say, because a lot of it is kind of trying to achieve the same thing, but some of it was implemented better than others. But, Chris, what about you? What did you see? What did you find that you thought was interesting?

02:12.324 --> 02:52.330
Well, first of all, it was my first time to attend, and I just loved it. I had a great time, so I was really thrilled to be there. It was my first CSUN, so it was really awesome. I was so excited. The best part about it for me was being there and meeting all the people. I know the tech was there, but it’s the people whose companies bring the tech, and the personalities, and it was just really, really great to meet everyone, you know. So it was good to visit with all the braille displays and just anything to do with braille. I was all over it. But, yeah, the highlight was actually the.

02:52.362 --> 03:33.038
People for me, I would definitely agree with that. It was fun. We had a great lunch, and we got to hang out with Allison from the NosillaCast and her husband Steve, and they’re always a really, really good time, and we all got to kind of sit around and talk about what we had seen already and the technology that we were seeing out there up until that point, so that was really cool. And the other fun thing is, at lunch, everyone’s pulling out all their little things. We were passing around Zoom recorders, new Zoom recorders that people had. And, Chris, you had a little braille input device.

03:33.166 --> 03:34.754
Yeah, the little Hable One.

03:35.534 --> 03:43.314
Yeah. So that was really cool to check that out as well. So we brought a friend with us, Patrick, who’s here as well. How are you, Patrick?

03:43.804 --> 03:45.996
Oh, doing great. Really happy to be here.

03:46.180 --> 03:48.784
And what was some of your highlights from CSUN?

03:49.564 --> 04:58.118
Yeah, I would echo what Chris said about the people. You know, I have a new assistive tech company, and I wasn’t sure how I would be welcomed as a newcomer to the space, but everybody that I talked to. You know, I talked to some of who I foresee maybe being our top competitors one day, and, again, didn’t really know if it would be a friendly conversation or not, but the sentiment really seems to be, you know, there’s enough food for everyone to eat, and everybody was very generous in their conversations with the questions that I was asking. And I got to see a lot of tech that I’m already familiar with, like Aira, Glidance, ReBokeh. I got to learn that Rebecca, the CEO and founder of ReBokeh, actually came up with that name oblivious that it was a pun, that it was a play on words on her own name, which I’m not sure if I actually believe or not. But, yeah, it was just really cool, because I’ve spent the last year doing a lot of research on these companies, and it was really cool to meet the great minds behind them in person and talk to them and to commend them on the great work that they’re doing.

04:58.286 --> 05:09.234
Yeah, I think when we were talking to her, she was saying she was taking the name from photography bokeh, kind of a thing. Right? Isn’t that what she said?

05:09.654 --> 05:21.714
Yeah, yeah. And that resonates with me because I come from a videography background, so I’m quite familiar with the term bokeh, but I really wanted her to say that it was a play on words, because I really wanted to just call her ReBokeh.

05:22.214 --> 06:14.354
Yeah, definitely. I got to play with one of Patrick’s apps, and this app is really cool. It’s still in beta, but basically, you use your phone and you put it out in front of you, and it tells you if the path in front of you is clear, so you can avoid running into people or things. And it was definitely interesting in a really crowded environment like the one we were in, trying to kind of guide yourself around using that app. It was auditory, you had earbuds in, and it was helping you make sure that the path in front of you was clear, using the camera and auditory signals. So it was really cool. It was interesting to be able to kind of navigate around all those people and all the things when nobody’s paying attention. But, Patrick, do you want to talk a little bit more about that?

06:14.814 --> 06:20.990
Yeah. Yeah. I don’t know if I told you this part, Marty, but you’re actually the first person outside of the company who got to try that prototype.

06:21.182 --> 06:22.310
Oh, that’s cool.

06:22.502 --> 06:23.834
I was a guinea pig.

06:24.294 --> 09:10.404
Yeah, yeah, exactly. So we were really happy to hear that you enjoyed the experience. And so that prototype came. Well, if I can back up a little bit. My company is called BenVision, and our flagship product is software. It’s called the Binaural Experience Navigator, or BEN for short. And the reason it’s called BEN is we were inspired by a man named Ben Underwood, who is one of the handful of blind people that managed to teach himself how to use echolocation. And I saw a couple of people actually at the conference using echolocation themselves, which was really cool. But we were inspired by this man, and we thought, with this great technology we have at our fingertips these days, augmented reality, AI, as was very popular at the conference, couldn’t we meet the human brain halfway in its ability to map nonverbal cues to visual objects? And so what we came up with was, again, called BEN. What our original prototype did was sonify objects. So the premise was to give everybody the power of echolocation. Using the power of computer vision, it would recognize objects and then assign spatial audio cues to them. But not just any spatial audio cues. We wanted to design an elegant and enjoyable experience, so we mapped them to music cues. Every object, if you can imagine, like a water bottle, a potted plant, a cell phone, they all have unique music associated with them. And my co-founder, Subin, she spent ten years as a film composer, and now she does sound design for games. And she’s composed all of these sounds together as a single, uniform soundscape to make sure that everything sounds good together. And it’s actually a pleasant experience. It’s not chaotic. And then she pulls out each of those cues that pertain to each of the objects and assigns them to the objects individually. So the experience is supposed to be like you’re navigating through a symphony.
And so that was our first prototype, and when we did our first user test, it was met very positively. We didn’t really know how the community would receive it, but luckily, a lot of the people who tried it liked it. But one piece of feedback that we got was, you know, it’s great that I can tell that there’s this cluster of chairs in front of me. Like, it’s really cool that it’s informing me of this group of chairs, but I think it would be even cooler if I could identify the empty space between the chairs so I know where I can walk. And we realized that there was a very real need that we were kind of missing, which was the inverse of what we were doing: sonifying the negative space, finding out where objects aren’t instead of where objects are. And so that’s what that prototype was all about.

09:11.424 --> 09:19.832
Cool. That’s awesome. Well, thanks for the explanation. I’m sure people will really love to hear more and find out as it kind of comes to market and all.

09:19.848 --> 09:20.752
Of that fun stuff.

09:20.808 --> 09:22.844
So thanks, Patrick. That’s awesome.

09:23.184 --> 09:24.764
Yeah, thanks for the shout out.

09:25.064 --> 09:34.964
So, John, Glidance was really popular at CSUN this year. Did you get to make it over there, and what was the experience like? And maybe you can talk about what Glidance is.

09:35.144 --> 15:19.934
Glidance really is, for those of us who have used a cane or a dog for years, another option that you have with regard to navigation and getting around, and they bill it as something that could be a complete option for people brand new to the world of blindness who maybe don’t want to bother with a cane or a dog. This is a robot, it actually feels to me like a vacuum cleaner, and it has wheels and it has sensors on it. You hang onto the handle, or maybe just touch the handle, you don’t have to hang onto it, but anyway, you’ve got your hand on the handle and it maneuvers you around various objects. It’s really in its prototype stage right now, so it’s just getting started. Even though they’re taking preorders, it’s probably not going to be out till summer, but it does walk you around objects and avoids crowds, and it does a pretty good job so far. And it adjusts to your walking pace, so it gets to know how quickly you move around and will adjust accordingly. They’ve got some work to do on it still, but they’re expecting to have it out in the summer, hopefully. And it’s probably meant more for the person who doesn’t have a cane or a dog and maybe doesn’t want a cane or a dog, although I’m sure they’re hoping that people with one of those two will give it a try. The cost, they say, is going to be about the same as a cell phone. And who knows for sure? We’re still in the early stages, but it’s an interesting concept and it’s an interesting experience. And I did get a chance to walk with Glide. They had an individual there the first time who was manually moving it around so that I could get used to what it felt like to be turned and moving from one direction to another. And we talked about all of that. And then when we did the second go-around, he allowed it to go with me automatically, without having to do anything manually. 
So it was up to Glide to make all of the decisions when we were avoiding crowds and chairs and so forth. So it was an interesting experience, and I wish that my recording had turned out. There was a little glitch that prevented that from happening, because I would have had a great conversation and a great demo that I could have played. But anyway, that’s just the way it goes. It’s an interesting opportunity, and I’d like to do it again, maybe when they have fixed a few problems and maybe taken advantage of some of the suggestions that were made. So maybe at the convention during the summertime, I’ll get a chance to do it again and we’ll see how different it is. So that was really nice. And I got a chance to see Envision’s new assistant, which is now in pre-pre-beta, which means it’s not even ready for beta yet, but they came out with a version of it to get some feedback from people. And it really is an interesting assistant. You pick your own name and your own avatar, and it asks you questions so that it can learn a little bit about you. And you can pick a voice that you want as well, and it’ll speak to you in just regular slang if you wanted it to, or it’ll be very, very proper, whatever you want it to be. And you can ask it questions similar to those you would have asked some of the other AI assistants out there. Right now it’s a part of the free app, but when it comes out, it might be its own individual app. They haven’t decided yet. They are going to be doing the monthly Envision webinar, and I’m sure they’re going to be talking a lot more about it. As slots open up for availability, they’re going to be asking more people to become beta testers. So it really is interesting, and Aira has one, similarly, that they’re doing. So, I mean, all your major companies are doing the AI. 
Vispero came out with an update to Picture Smart a couple of weeks ago, and that really is all ChatGPT and Gemini, and it really has enhanced that feature a lot. I’ve used it a lot now on photos that weren’t correctly identified, to create new labels. So it really works well. And it also reads and gives you information on files like PNG files and other files that weren’t accessible before. I’ve not yet tried it on an embedded file within an email, but I bet it would work. So I’ll have to do that now that I’m back from CSUN and have a little bit of time to play with it. So it really was a lot of fun. Usually when I go, I’m there five days of the week and I’m going to a lot of the presentations, so I missed the networking this time around because I was only able to be there one day. I missed the little receptions that some of the companies have. And I just missed, you know, talking to people who you normally don’t see but once a year, or maybe twice a year if you go to one of the conventions. So I immersed myself in CSUN as much as I could this year, because it wasn’t going to be the type of week-long CSUN experience that I’ve had over the last nine or ten years, and a few before that when it was in LA. So I love going to CSUN. I don’t want to miss another one.

15:20.094 --> 15:29.370
Yeah. Do you think you’re going to pick up a Glidance once it comes to market and they get the bugs worked out? Is that something you think that you would pick up and use in your day-to-day life?

15:29.502 --> 15:55.496
No, I don’t think so, because I’ve been using a cane for centuries and I just can’t see myself switching to a Glidance. I mean, I can see where it would be good for somebody who maybe didn’t have the mobility skills or just didn’t want to invest the time in a cane or a dog. And for them, I think it would be great, but it wouldn’t be for me in an everyday situation. But I’m interested to learn about it because it’s a new and unique type of device.

15:55.650 --> 16:58.142
So I’ve retired two dogs now, and the first thing that I thought about with Glidance, when I got an opportunity to play with it, was, huh, this reminds me a lot of working with a dog. Now, I have been told that in some instances they were using the remote control, and I don’t know if anyone else would agree with this, but I think Glide was one of the more popular booths, and so I think things got super busy, and sometimes they forgot to let people know when they were using the remote. But I will say that the experience of using that device reminded me a lot of working with a dog. I would like to initially say that I would get one. Honestly, though, Glide doesn’t give you the companionship that a dog would give you, and I’m afraid it wouldn’t keep up with me. I’m a very swift walker, and I think that it would be very difficult for it to keep up with me. Though Amos did say that they want to get to the point where they can work with people who are jogging, so I’m really interested to see how that plays out.

16:58.338 --> 17:09.422
Yeah, it’s a unique device, and I think that as it grows, more people will be interested in it. They could probably drop an AI assistant in there for the companionship part of it, Michael. I mean, then you could talk to it.

17:09.478 --> 17:14.958
Speaking of companionship, you met a new little companion doggy friend while you were there, didn’t you?

17:15.046 --> 18:58.774
Hey, don’t tell people about the robotic dog that bit my finger. Thank you, Tony. So, yeah, there was a robotic dog at the Sony booth, and I see this working for people who might have cognitive difficulties or might simply need a companion in their life but don’t want the responsibility of having to feed a dog or take it out. Now, I came home and I’m like, I’ve got to do a little bit more research into this dog. Apparently, they’ve been around for a couple of years now. And do any of you remember Neopets from the late nineties, where you would have these pets? Yeah, yeah. So you had this little video game that would hang on your keychain, and you could play with your pet. That’s kind of what this is like, but in real life. The dog is about the size of a small Labrador retriever puppy, and it will move around on its own. There’s no guiding what it does. You can teach it positive behaviors by encouraging that positive behavior with voice commands and actions, and you can discourage bad behaviors as well. But what I learned is there are accessories you can buy for it. So you can buy it a food bowl, and you can buy it water bowls, and then you download the app that connects to the puppy, and you can feed it within the app. You can also make relationships with other people who have the puppies, and your puppy and that other person’s puppy can become friends in the virtual world. It’s really interesting to see real life and the virtual world come together. And, yes, it did bite me. I scratched it under the chin. It liked that. So I stopped scratching it under the chin because I was talking to the person, and it moved its mouth down to clamp onto my finger. There were no teeth, but it was an interestingly weird feeling.

18:59.474 --> 19:01.090
Cool. Were you.

19:01.122 --> 19:05.266
Were you near a water, a fire hydrant at all during any of the.

19:05.410 --> 19:06.850
I was not. You weren’t?

19:06.882 --> 19:09.650
Okay. They didn’t bring in, like, a virtual fake fire hydrant.

19:09.682 --> 19:09.898
Right.

19:09.946 --> 19:21.714
You know, just in case. So, Chris, you’re an avid Aira user. Do you want to talk about what Aira is and the new stuff they’ve got coming? Sure.

19:21.794 --> 20:57.894
I’ve been an Aira Explorer for probably almost six years. It’ll be six years in June. And I think they’re one of the best things since sliced bread. I used Aira quite a bit to navigate around the hotel and exhibit hall, and it was just great having that visual interpreting on demand, you know, in such a chaotic place. Wonderful though it is, it’s tough when, as you’re navigating, if you don’t navigate correctly the first time and then you fix it, you don’t know what you did to go the right direction or something. So it was just great to have Aira there. And, of course, I was traveling with my guide dog, so, yeah, Aira’s visual interpreting service was just really awesome to have there. And I know that they’re doing a lot with their AI, though I don’t know all the details. I know that Jonathan Mosen, I believe his most recent Living Blindfully podcast just had a thing on there. You can submit a photo or an image, and the assistant will analyze it, and then you can always confirm it with an Aira agent. They can get back to you with a message. So that’s something new that’s coming up with them. And, of course, you know, everybody’s getting on the AI bandwagon and putting their twist on it, so it should be interesting to see how that rolls out. I was just very thankful to have Aira at CSUN.

20:58.234 --> 21:06.082
Yeah. And I heard that they were not charging any extra for that. That’s going to be a free service through Aira, and you won’t have to pay for it.

21:06.138 --> 21:48.404
Yeah, I did hear that, too. That concerns me only a little bit, because once you offer something for free, if it’s not as sustainable as it could be, then people get upset if they then have to start paying for things. But hopefully they’ll have the bandwidth to support that. It reminds me of the old TapTapSee days. Maybe you all had that app, submitting pictures to it and waiting and waiting and waiting for a response, and then you got a crowdsourced explanation back. So it kind of reminds me of a really super updated, excellent version of the opportunity that TapTapSee was ten years ago.

21:49.544 --> 21:50.232
Cool.

21:50.368 --> 23:43.594
So there’s another tool that I thought was really interesting, and I hope someone from the Marriott is listening, because, Marriott, why are you not an access point? Like, come on. That would have made the experience a lot better for a lot of individuals. But there’s a tool by a company called Sorensen. And what Sorensen is, is an on-demand video interpretation service. They interpret sign language on demand and are able to communicate verbally to a third party the signs that the individual who’s signing is using. The example that was given to me by the gentleman at Sorensen was, let’s say that someone walks into a Starbucks, which is a Sorensen access location. At that Starbucks, they can sign their order and then use this app to have someone remote, across the country, see through their camera and verbally interpret the sign language that the individual is using. And it works vice versa: if someone is speaking, then the individual on the app can translate that to sign, so the person who can’t hear is able to communicate. They were connected with Marriott for this year, and so throughout the entire hotel, you could use that at no additional charge. I think it’s really interesting. Number one, I was surprised that there really hadn’t been a service like this too much in the past. And number two, the fact that the camera is being used for so much more than just visual interpretation. As someone who’s just blind and only has hearing problems when my wife is talking to me, I don’t even think about the fact that this might be something that someone else might need. And that’s what I really enjoyed about CSUN: finding tools that would help people with multiple disabilities, or other disabilities that I don’t personally experience on a day-to-day basis.

23:43.914 --> 24:16.814
And the hotel we were at was right next to Disneyland. The interesting thing about that is, as you pull in, there are multiple hotels in this kind of whole little circle area right next to Disneyland. So to Michael’s point, you would think they would definitely implement that and similar things, because they would have a lot of people coming there to go over to Disneyland and using the hotels and all of that stuff. So they should, yeah, try and implement that. It would be really great.

24:17.234 --> 24:59.814
It’d be great if they had something for deafblind individuals, too, because if there was a captioning service, then that person could read it with their braille display. I got to get together with somebody that I’ve been communicating with on Mastodon, and they’re deafblind, and so they turned on voice control on their phone, opened a Pages document, and then whatever I said was transcribed into the document, and the person read it on their braille display. And so we just had a fantastic conversation right from the start, and it was really great. I wasn’t sure how to navigate that. I haven’t spent a lot of time around folks who are deafblind, so it was just really, really cool what you can do with technology.

25:01.194 --> 25:15.474
I got to try out this really cool app that helps you cross intersections. And I think, Michael, you actually got to try this app out in real life, at an actual crosswalk, I believe. Isn’t that correct?

25:15.594 --> 25:27.854
Yeah, you’re talking about OKO. And we did demonstrate OKO on Unmute two Sundays ago and how the OKO experience actually worked. And it’s a really interesting technology, for sure.

25:28.194 --> 28:19.854
Yeah, I definitely thought it was really interesting. I got to try it. It wasn’t the same experience as if you’re actually at an intersection. They kind of had a fake little intersection set up there, and you could try it there. I thought it could be really helpful. With that being said, though, you still have to pay attention to the flow of traffic. If you’re standing at the corner and you’re about to step into the crosswalk, you still have to make sure that that person isn’t going to blow through that right-hand turn before you step out into the intersection. So that problem is not solved yet. You still have to pay attention there. But the technology was really cool. I can see it helping a lot of people cross crosswalks a little more safely. So I thought that was really cool technology as well. Another thing I saw that was also really cool is there are a few companies out there that are actually taking your documents and turning them into accessible documents. What that would mean is, let’s say you had some kind of a document and you need to make it accessible. You put all the content that you need into the document, and then what they do is make sure that all of your content is locked in there. All the spaces where you have to fill in the answers to the questions, or fill out the boxes, or any of that stuff, are the spaces where you’re able to put information in, and everything else is kind of locked down, so you’re able to reuse the form over and over again if it’s something that you’re going to need to give out to many people, or if you have a form for your own needs. And they work with everyone from individuals all the way up to corporate, so they have everything in between. So I thought there was some pretty cool stuff there. A lot of people need a document to be accessible, and nobody knows how to do it. Or at least a lot of people don’t know where to find someone that can make it happen for them. So it’s great that those kinds of services are out there as well. And I would say also, like everybody else, you know, I had a lot of fun meeting everyone and seeing everyone. We had a couple really good hangouts at lunch and at dinner, and everyone got to bring their technology, and we all got to hang out and talk. And that was probably one of my favorite parts, just being able to see everyone and hang out and have a good time. So, yep, it was a great one. Anyone have anything else they’d like to say in closing? I think we’re all looking forward to going back to CSUN next year, and maybe we can all spend a little bit more time there. Like John said, hopefully we can spend a few more days there. I only got to go for one day this year, like John, but I’m hoping next year I’ll get to go for the whole week, or at least three out of the four or five days. So, with that all being said, this was our CSUN recap. Everyone, thank you for being here, and we will talk to you guys soon.
Or at least a lot of people don’t know how to go to the right place to find someone that can make it happen for you. So it’s great that those kind of services are out there as well. And I would say also, like everybody else, you know, I had a lot of fun meeting everyone, seeing everyone hanging out. We had a couple really good hangouts at lunch and at dinner, and everyone got to bring their technology, and we all got to hang out and talk. And that was probably one of my favorite parts, is just being able to, you know, see everyone and hang out and have a good time. So, yep, it was a great one. And anyone have anything else they’d like to say in closing, I think we’re all looking forward to going back next year to see sun once again, and maybe we can all spend a little bit more time there. Like John said, hopefully you know, we can spend, you know, a few more days there. I only got for. Got to go for one day this year like John, but I’m hoping next year I’ll get to go for the whole week, or at least maybe three out of the four or five days. So, with that all being said, this was our csun recap, and everyone, thank you for being here, and we will talk to you guys soon.