In this episode of Digital Bytes, we discuss accessing visual information through smart glasses and apps. We explore the features of the Envision glasses, including text scanning and scene description. We also touch on the Celeste glasses, which are still in development. The Apple Vision Pro and its spatial audio capabilities are also mentioned. We recommend the Be My Eyes app for its AI assistant and Microsoft help desk connection. Join us as we keep you updated on the latest advancements in mobile assistive devices.
00:02.650 –> 00:18.990
Have you caught the travel bug? Don’t worry, I have, too. I’m Katie, a travel agent who specializes in accessible travel. If you love to talk all things vacations, with a little fun along the way, you should head on over to Katie Talks Travel, a podcast on Unmute Presents.
00:24.970 –> 00:36.882
Hey, everyone, we’re back with Digital Bytes. This is the brand new show that’s going to be coming out every Sunday. So with me today, I have Michael Babcock. Say hello, Michael.
00:38.236 –> 00:41.898
And I have Chris Cook with us as well. Hey, Chris, how’s it going?
00:42.064 –> 00:43.900
Hey, it’s going great, thanks.
00:45.184 –> 01:01.760
Yeah. We’re going to be talking about accessing visual information. You could use a pair of smart glasses, or you could use apps, or a combination of the two. So talking through how to access visual information would be great.
01:02.290 –> 01:11.460
And you have some really cool glasses, Chris, right? That you were going to chat with us about and let us know how those work to gather information for you.
01:11.990 –> 01:21.350
Yeah, I do. I have two pairs of glasses, and I really enjoy them. I have the Envision glasses and the Celeste glasses.
01:22.170 –> 01:24.200
And what would be the difference in the two?
01:26.730 –> 03:37.122
I’ll tell you. They’re two totally different products. The Envision glasses have been out for a bit longer than the Celeste glasses. They work with an app on your phone, but the app doesn’t necessarily have to be running, and they offer a lot of different features. They’re built on the Google Glass, and Envision assures us they have enough product to really move this product forward. Google Glass was discontinued recently, so that was a little disheartening, but I think they are assuring us that things will be fine. The glasses have a little touch panel on the side, on the right, what I call the arm. Some other folks would call it maybe a leg of the glasses; there’s a little debate going on on another podcast about whether it’s an arm or a leg. So you can swipe through these different functions. The kind of features the glasses offer: scan text, instant text, call an ally, which is somebody you have in your contacts that you’ve added, and they’ve agreed to participate as a visual interpreter, so to speak. There’s also call an Aira agent. Yay, Aira. And describe scene. What’s really cool about that is once you take a picture and you have listened to the information it tells you, you can swipe back through the different components of what it told you about the scene, so if you swipe left, you’ll get the previous thing that it said. And then you can also export the text of that description to your Envision app, which is really cool. I’m always thinking about how this would be for folks who are deafblind, because we’re getting audio information from the glasses, but if somebody is deafblind, they might want to access whatever was transcribed on the screen, and this would be a way to do that.
03:37.256 –> 03:54.230
Yeah, definitely with the Envision glasses. And you’re swiping around on that. Is that on the outside of the right arm, or is that touch panel on the top part, kind of like where the buttons are with the Celeste glasses?
03:54.730 –> 05:17.780
No, it is on the outside of the arm, near the front of the glasses. The camera is located on the far right-hand side of the front of the glasses, whereas on the Celeste, it’s located in the middle. But the touch panel is on the right, and there’s also what they call a hinge button on the top, where you can press it and ask questions. The other features are recognize cash; find object, which has quite a few objects in its little database, so if I lost my coffee mug, I could search for my mug, though I haven’t done that too much. There’s also a light detector, kind of like on the... what is it, Be My Eyes? No, sorry, Seeing AI, one of those apps, a light detector. Then there’s also batch scanning, for when you have a lot of pages to scan; find people, where you can train it to find a person you’re looking for; explore, which just describes things in your environment; and scan QR code. And I have not tried the last thing that it does, which is the color detector, because I’m always frustrated by how inept those things are. So I haven’t tried the color detector, but it has a wide array of things that it can do.
05:19.110 –> 05:26.870
And if someone was interested in picking up those glasses, what would the cost be on those?
05:27.020 –> 05:59.982
So I bought them outright for $3,600, which is a chunk of change; it’s about what my braille display cost. I can’t speak to the current state of affairs with the subscription model or anything else Envision has going, because that’s probably changeable. But I think there are ways to get the glasses without necessarily having to buy them outright. You’d want to check the Envision website for sure.
06:00.116 –> 06:29.690
As of Wednesday, January 31, they do have a multi-tier option. It starts at $1,895 for the glasses; they have a second-level tier at $2,495, and then $3,495, I think, is the top tier of the Envision glasses. And then they are offering a fourth payment option, which is $150 a month, in order to get those on a subscription.
06:32.030 –> 07:03.060
That’s great. And, you know, to kind of put this in perspective: our assistive tech is always pretty costly, but think about how people in the sighted world are clamoring to get the Apple Vision Pro, which is about the same price. And these glasses we can get now and try out, knowing that they’re made for us, with all these features, and they connect with third-party apps like Aira, which is awesome.
07:03.590 –> 07:08.530
Wow, that’s really awesome. And, Michael, you actually have the Celeste glasses, right?
07:08.680 –> 08:24.670
Yeah. So I have the Celeste glasses. I picked them up about a month after the ACB national convention. So I put $100 down in August, and I received my Celeste glasses about a week before Christmas, but I didn’t open them until Christmas, and I have mixed feelings about them. With Celeste, you have to always be running the application on your host device, so in this case, it’s the iPhone. I have not worked with Celeste on getting it on Android; I know they’re doing some beta testing on Android right now. But when you’re running the application on the iPhone, the Celeste glasses interact with the iPhone, and you can either control them from your phone or from the glasses. As Chris kind of described, the camera on the Celeste glasses is at the top of the nose, in the middle. In my opinion, that seems like a little bit more of a logical place. Actually, Chris, do you have any experience with scanning documents on both Envision and Celeste? And do you find the camera above your nose to be better than the camera above the right eye?
08:26.050 –> 08:41.278
I really like the camera above the nose because it’s centered. And as a person who has no vision and never has, it makes more sense to me to know how I should be holding things when the camera is centered.
08:41.454 –> 10:34.390
Yeah, I would agree with that, because even not using the Envision glasses, to me it just seems more logical to have that camera centered, because then you just naturally hold what you’re trying to look at, and I use air quotes when I use the words look at, but you naturally hold it where it needs to go, and the glasses’ camera will pick up what it needs to pick up. So on the right arm of the Celeste glasses (and like Chris, I agree they are arms, not legs), at the top edge of the arm, are two buttons. The front button is what you use to take your picture by pressing it once, or to cycle between the modes by double-pressing it. You can also activate those modes and take the picture from your phone using the app, if you would like. The button toward the back, closer to your ear, is what you press to get the battery status, or press and hold to turn the glasses on or off. I did not realize until maybe a couple of weeks ago that on the outside of the arm you can swipe forward to turn the volume up and swipe backward to turn the volume down. The Celeste glasses don’t get very loud, so I do think I would struggle with them in a loud environment like an airport or something. And they don’t have as much flexibility as the Envision glasses. So, for example, you can’t natively call someone from the glasses themselves; there are workarounds using screen sharing from your iPhone with different tools to connect with other individuals who can see through the camera. But for $100 down and $50 a month, it’s really a good way to get started in the wearable arena. Would you agree, Chris?
10:34.890 –> 11:00.266
I would. I think it’s a great way to get started. And knowing that these are still in beta, and that the developers are still working really hard on this product and software, I think it’s really a good way to get into it. Just be patient, though, knowing that it’s a beta and that things might change next week and work differently. But that’s why you’re along for the ride: for the experience of the beta.
11:00.298 –> 11:11.970
And using the product with third-party apps: do you think there’s going to be an ecosystem for more developers to bring third-party apps to these glasses?
11:14.940 –> 11:35.570
In talking with the developer, I think they will be amenable to working with third-party apps. I think they realize that their strength is in combining the resources you have and expanding what they’re offering, so I think they will probably be amenable to that.
11:35.940 –> 12:18.190
Yeah. And I think it’s important to explain (I almost said preface, but we’ve already gone into the details) that this is the first iteration of the Celeste glasses, and it’s my understanding that in the next couple of months, those who have subscribed to the Celeste glasses are going to have an opportunity to receive an updated version of them. So I’m interested to see what that brings, especially with some of the stuff they have added to the existing hardware, because the existing hardware is older, and they’ve added things like assistants and other tools to experiment with or get a better result from the glasses themselves.
12:19.760 –> 12:42.630
Right. It will be really good to know what they’re offering with the new hardware. And that is another advantage of being part of this particular beta testing program: they will send you the new hardware. So it’s a subscription-based beta testing program, and it’s a really good deal for testing things out, I think.
12:44.780 –> 14:09.696
The interesting thing is, with these new Apple goggles or glasses that have come out, they already have a huge ecosystem. A lot of developers are already making apps, you know, apps like Fantastical and so on, that are already going to work on the Apple glasses or goggles. They actually look like goggles; they’re not really glasses, in my opinion. Hopefully, moving forward, they’ll get a lot smaller and be more like regular glasses, but we’ll have to see how that goes as time goes on. But it’s also expensive. They start at $3,500, and knowing Apple and their ecosystem, I’m sure you’ll have to pay for all these apps from the third-party developers who are bringing them. So this is just going to be an interesting space to watch as it grows and gets bigger. And I kind of think, or I should say I wonder, because the Apple glasses or goggles are so far ahead of the game at this point with third-party developers and all of that, if it’s going to blow the rest of these glasses or goggles that other manufacturers are making out of the water.
14:09.718 –> 16:07.880
Something that I found to be interesting: if you haven’t yet, go to my Mastodon, payone at unmute community, and take a look at the content shared on February 2. One of the posts I recently shared is an Apple Vision Pro accessibility walkthrough, and at six minutes into that video, you can actually hear what the select to speak option sounds like. The ability to hear that in spatial audio is amazing; it kind of comes off in front of you and to your left, and if you’re just wearing headphones, you can actually hear and experience what that sounds like. One of the things I thought was really intriguing is, let’s say you have your, I don’t know, Mastodon application of choice configured off to your right, so you can just bring it into focus when you want to use it or slide it off to the right when you’re done. You can set it up so that if VoiceOver gets an alert from that specific app, VoiceOver will come out of your right ear and be spatially aware of where the application it’s interacting with is. So I’m really intrigued by the Vision Pro. I’m not going to go out and buy one right now, but if one landed in my lap, I wouldn’t complain. And it’s really interesting to see what Apple is going to do with this, not only for accessibility but for the mainstream. Where is Apple going to go with the Vision Pro? Is it going to be one of those things they try out for five or six years and then realize it just isn’t working for them? Or after a couple of years, are they going to be able to iterate and make it smaller and more accessible, in the sense of more accessible to people who want to actually use it outside of their living room, not just to watch big-screen TV?
16:08.940 –> 16:57.130
Right. That’s what I’m wondering, too, because right now I think they’re hoping it’s a productivity tool for working on your computer, plus all the entertainment features. But I want to be able to connect to Aira and Be My Eyes; I want to be able to connect to those third-party apps. I want to walk out and about with whatever I’m wearing. I want to be out and about. I want to be in the store looking at stuff. I want to be in the library pulling a book off the shelf. I want to be places. I don’t want to just sit in a chair and listen to stuff. So that’s my concern: I want to be able to access all the things I’m doing here on the Envision glasses and have that be what I’m using it for.
16:59.020 –> 17:26.690
So, Chris, you mentioned Aira is available on the Envision glasses right now. With Be My AI taking the world by storm, really, if you haven’t experienced it, go download Be My Eyes. I highly recommend it; well worth it. And I’ll hint at this: there are some really creative things coming in the beta. We’ll just leave it at that. So is Be My Eyes available on Envision or not yet?
17:27.540 –> 18:02.604
It is not available on the Envision glasses. As far as I know, though, you’re getting information on the Envision glasses that you would get from the AI part of Be My Eyes, like describing the scene and maybe some textual information. But as an app to contact, I think it is not.
18:04.640 –> 18:10.048
Yeah. Check that out, though, because if you have not yet, it is awesome. What do you think of Be My Eyes?
18:10.214 –> 19:25.450
I really love the AI. I’m not so fond of just calling a random person; that doesn’t make me feel comfortable. But I really love Be My AI, because of things like the description of the clothing that I might pick, you know, the things I would normally call Aira for. Sometimes I just feel like, oh, I just want to contact the AI and find out. Also, another thing I really like is that through the Be My Eyes app, you can call the Microsoft help desk, and that is super cool, because then you can get written directions for things like, tell me how to format the columns of a spreadsheet in Excel. If you say, give me written directions on doing this task, it will do that. If you don’t say that, it will say, click on this and click on that, and I’m like, oh, sorry, I meant written directions, not using a mouse. So I love the fact that you can contact the Microsoft help desk virtually. That’s very cool.
19:26.300 –> 20:23.450
I would say another really cool thing about it is that it does very well at orienting you to a new environment. Say, for example, you’re at a convention and you’re staying in a hotel room you’ve never been in before. You could literally take a picture of the room, and it will explain everything to you from the spot you took the picture in. So if you walked in, shut the door behind you, and took a picture of the room while standing at the door, it’ll tell you what is around you and where things are, depending on how much it can pick up. But it’s pretty good; I was pretty surprised at how well it actually works. So that’s definitely something you can do with it also. And it’d be really cool to see if and how the glasses and the goggles and these other devices would do the same thing, and how well it would translate, especially in a place you have never been before.
20:24.860 –> 21:26.940
That’s really great. I’m looking forward to trying that when I go to the CSUN conference in March. Another thing I was going to say about Be My Eyes is I love the description of photographs, including pictures that people take and post on Mastodon, because it gives a detailed description of the picture, so I can enjoy the photos that another blind person has taken. Somebody also recently posted that their family member was taking pictures of their kid while they were on an outing and sending them back to the blind spouse, and it was so cool that she could enjoy what was going on with her kid while he was out and about. I just think it’s great. It described a photo of me from when I was just six, and I hadn’t known all the details of that picture. So it’s really fantastic having pictures described, and that’s another advantage of Be My Eyes, the Be My AI.
21:29.140 –> 22:08.060
Yeah, sounds amazing. So much cool technology is coming, so we’ll have to stay tuned and keep everyone posted, especially as updates come and as all of the glasses and the apps that go along with them evolve. So we’ll definitely keep everyone updated. We’ll be back here every Sunday, so stay tuned and check it out. If you have any comments, questions, or anything you’d like us to bring to you, you can email us at feedback at Unmute show, and you can check out our website at Unmute show. We’ll see you next week. And say goodbye, Chris.
22:10.322 –> 22:11.980
And say goodbye, Michael.
22:12.640 –> 22:15.356
Have a good one, and we’ll see you guys next time.