The Voices of Screen Readers
Article 19 explores the nuances and everyday use of screen readers
Explore the everyday use of screen readers and discover the nuances of navigating inaccessible web pages. Host Katie Samson dives into this topic with co-host Kristen Witucki, who shares her personal journey with assistive tech, and accessibility expert Dax Castro. This episode explores how screen readers work, common misconceptions, and their role in creating inclusive digital spaces.
Stay connected and continue the conversation with us on LinkedIn.
Meet our guests:
- Kristen Witucki is a blind writer, teacher, and mother who brings her accessibility expertise and her lived experience to the Tamman team.
- Dax Castro is Co-founder of Chax. Dax is addicted to accessibility, aquariums and the outdoors. Some have called him the Sheldon Cooper of Accessibility. He loves his alpaca and is constantly building something.
Listen to more Article 19 Podcast Episodes
Full Transcript
Access the PDF Transcript
Katie Samson:
Hello everyone and welcome to Article 19. I’m Katie Samson, Senior Director of Education at Tamman, and I’ll be your host for this episode. Let me start with a story. About 12 years ago, I found myself teaching Introduction to Disability Studies at West Chester University outside Philadelphia. While I wasn’t new to teaching undergrads, I was new to teaching disability studies, and the Dean of the Health Sciences Department wanted to bring in educators with lived experience who were passionate advocates. The semester had gotten off to an awkward start. The students were shy and squirmy in their seats the first day. They either couldn’t make eye contact or they gave me these reassuring smiles and nods to everything I said. It was obvious they had limited exposure to a wheelchair user. In the second class, a student walked in and gave me a complete double take. What? I asked. Do I have something on my face? “No, uh, I just thought the wheelchair thing was an act, like, like an after-school special. You’re, like, uh, really, um, handicapped? That’s freaking cool,” he smirked. That was a first. I wasn’t sure what to make of it. Objectified? Labeled? Insulted? And complimented? All in the same moment. One thing I recognized right away was I had to beef up my syllabus. Call in the reinforcements. Ask some friends in the disability community to visit the class. I needed to provide the students with the reality of everyday life of people with disabilities, live and in person. If I was going to do anything to mold young minds, I needed to tackle stereotypes, stigma, and assumptions right out of the gate. Little did I know, I had a lot to learn in that regard and frankly still do.
A few weeks later, my friend Austin came into class to talk about his experience as a blind person. Austin was killing time before class on his iPhone. He had his headphones on and he was scrolling down the screen pretty fast. He was using the accessible VoiceOver feature to listen to the article through his headphones. He stopped, surprised by something he was reading, and asked me if I had read the article. He pulled the cord out of his phone, and all I could hear was VoiceOver reading the text so fast. Rapid-fire words that were incomprehensible. Holy s***! That’s so fast. How can you possibly understand what it’s saying? I blurted out without even thinking. It turned out I needed to take my own course. There I was, humbled, embarrassed, and completely naive to assistive technology. The volume and the speed were completely normal for Austin. He was familiar with the technology and his brain had adapted to the pace so that it was perfectly normal for him. Looking back on this story, now that I’m at Tamman and hosting Article 19, I thought if I had this reaction after being in disability spaces for quite some time, then many others out there might also.
Turns out, I had a lot more questions. Today I’m diving into everyday screen reader usage. What does a screen reader sound like? How is it used in everyday life? How is it used to test digital assets for accessibility, like in a Word document or a PDF file? What are the common applications it provides to millions of people who are blind or with low vision? So let’s kick off this episode with some help from JAWS. Not the great white shark from the 1975 Spielberg film, but JAWS as in Job Access with Speech. It’s a computer screen reader program for Microsoft Windows that was released in 1995 to allow blind and low vision users to read a computer screen with a voice output or through a touchable braille display. Now, many of you may already know this, but my co-host Kristen uses JAWS. So, let’s have JAWS take the first bite with our introduction. So, Kristen, could you have JAWS do our intro for today’s podcast, please?
Kristen Witucki:
Sure. I’m going to do it at the normal reading speed and see how you feel about it.
JAWS:
Article 19 is a call for others to join us in a bigger conversation around the ADA, digital accessibility, and access to information. At Tamman we are working to build the inclusive web every day. But to do that, we need all of us working together and learning together. Thank you so much for joining us.
Katie Samson:
Whoa, whoa, whoa, that’s, uh, that’s, that’s speedy. I think we need to slow it down a little bit.
Kristen Witucki:
Okay, okay, okay.
JAWS:
Article 19 is a call for others to join us in a bigger conversation around the ADA, digital accessibility, and access to information. At Tamman we are working to build the inclusive web every day. But to do that, we need all of us working together and learning together. Thank you so much for joining us.
Produced Introduction:
Expression is one of the most powerful tools we have. A voice, a pen, a keyboard. The real change which must give to people throughout the world their human rights must come about in the hearts of people. We must want our fellow human beings to have rights and freedoms which give them dignity. Article 19 is the voice in the room.
Katie Samson:
Hello everyone and welcome to Article 19. I’m Katie Samson, Senior Director of Education at Tamman and I’ll be your host for our conversation today. Welcome Kristen Witucki, my co-producer, co-colleague, co-creator, co-friend. It’s great to have you with us.
Kristen Witucki:
Thank you. It’s really great to be here.
Katie Samson:
Let’s jump right in. I wondered if you could tell us a little bit about your background and your relationship with assistive technology.
Kristen Witucki:
I was born in 1981, and I think I’m one of the last people, certainly not the last, but one of the last people with disabilities who sort of remembers a pre-internet era. And the technology really came to me pretty early in my life as a kid, but there was still this whole realm of life outside of it for a long time. Like I couldn’t write a letter that someone else who could see would be able to read. I couldn’t read letters that people wrote to me. The mail was this whole big inaccessible thing in general, except for the occasional braille mail. And books were very inaccessible. So anything outside of actually producing a book report was really inaccessible for a long time. And that really influenced my appreciation of how far we have come and my impatience for us to get the rest of the way there.
Katie Samson:
I think it’s really interesting, the evolution of your experience. And I wondered if you could take us back to those early days of learning how to use a screen reader, when that came into your life, like how old you were and what that process was at first?
Kristen Witucki:
Yeah, I think I was somewhere between seven and nine, maybe when I had my first computer that could talk. It was a DOS computer and it used a screen reading program called Softvert. I wonder if there’s a YouTube relic of it somewhere, but if you think the screen readers now sound like robots, trust me, they don’t because this really did.
Audio Insert of Old Screenreader:
To be able to read the screen when it first powers up, it goes through loading the program and sending a dictionary to the speech synthesizer to pronounce words in phonetic sounds that will sound proper to anybody that’s listening to it. Communication. Move on. Okay. Okay. C colon backslash W P period N E C file. Closed. No lock. Move on. Find display installed rider presented. Let’s shift to activate. Find display installed rider presented. Let’s shift to activate. No news, it’s got no security. C, colon, backslash, greater than.
Kristen Witucki:
I remember a lot of memorization to activate commands. Like there was a whole string of keys that you had to press in a certain order in order to get the computer to open a file, save a file, create a file. It wasn’t just sort of selecting your icon or pressing one set of keys. It was, you know, kind of spelling it out to the computer. So that’s really what I remember. And the incredibly robotic voice and the weird way it pronounced my last name. One of the first things I did was try to figure out how to spell my last name so it would pronounce it correctly, like just for fun. You know, it’s really fun to try to make the computer pronounce things correctly and wrong. And that was just kind of a neat way for a little kid to get into it. But I was able to do things like book reports, research papers, anything major that I wanted to type from a relatively early age that way.
Katie Samson:
Do you remember that time when you felt so confident that you could help out a fellow classmate, or you just felt like you were in the flow of the technology to the point that you could kind of cruise without using it as if it wasn’t assistive technology, it was just an everyday thing?
Kristen Witucki:
Well, I don’t think that I felt like it connected with my classmates until the internet really became… Until we were in this era of collaboration on discussion boards and things like that. In the early days, it was still a very isolated feeling. I didn’t feel ashamed of it because it was a way for me to do my work. I was really determined to do well in school. So that didn’t bother me, but I didn’t get the sense that those early computers would lead to collaboration. So I think it was really not until the 2000s when I started to imagine that I could be a contributing group member.
Katie Samson:
Yeah, that’s interesting, especially what we know about the early 2000s. Both of us are relatively the same age and experienced that transition in college. So the early 2000s where it all started happening all at the same time. I wondered if you could talk about some of just the general misconceptions that people have about screen readers?
Kristen Witucki:
The misconceptions people have about screen readers, to me, boil down to a lot of the under- and over-reactions that people have to disability in general, like either pity or amazement. So I think the screen reader misconceptions really fall into those. If you feel sorry for someone using a screen reader, it’s kind of like, oh, that’s so hard. Like I could never do that. You know, not, uh, it wouldn’t work for me. Like, I think it’s a tool that can’t possibly work. And if you’re amazed, you’re like, oh my gosh, it’s so complicated.
Wow. I could never do it. Kind of the opposite reason. Like you must be so amazing. And I’m just like, no, I’m just trying to do my work here. So I think, yeah, those are the primary misconceptions I would get from people. But also, when people don’t use them all the time, I think people have the idea that they’re really scary and difficult to learn and that they’ll never understand them. And that’s not a problem in the sense that no one has to use a screen reader and everyone learns in their own way. But when I think about something like document accessibility, if you can learn just a few screen reader commands to check your document, you’re not only learning those commands, you’re actually learning that the screen reader isn’t that hard for most people to use. And it’s not scary and it’s not this other foreign thing that is, you know, across the ocean from you. It’s really accessible.
Katie Samson:
I want to pause our conversation here for just a moment to bring in another perspective to what Kristen is actually talking about. This notion of not being scared of screen readers. So I reached out to Dax Castro. He’s a document accessibility wizard. He’s also the co-founder of Chax Training and Consulting, who we work with. I just really wanted to get his perspective. So I asked him what it was like for him to learn about using a screen reader for the first time and how it’s used to test digital accessibility. What was that first encounter that really planted the seed for him?
Dax Castro:
There is a company, and I’m going to plug them because it was a great session, called Access Ingenuity. And there was a guy, his name was Xi. X-I. Xi. And I used to work for the California High Speed Rail. And he came in and taught a class on how to use a screen reader. Xi was completely blind, taught the class amazingly well. And I was hooked from the moment I heard the screen reader to the end of the class. Literally, we started with not knowing anything. And by the end of the class, the test was: turn your screen off, and he gave you a scenario, you are at this train station and you need to get to this college to get to your class by this time. Figure out how to get there. So we had to use the screen reader to go on the web, to find the bus schedule, to figure out what time it was. And then we had to tell the class what the answer was: what bus number do you need to get on? And what time do you need to get on that bus? At the end I was able to do it all. And it was, it was just great. It was really an epiphany moment where I really felt like, oh my gosh, I get this.
Katie Samson:
Great information from Dax. So let’s get back to Kristen.
Kristen Witucki:
And you can also slow it down. I think people hear the screen reader at the speed that I’m listening to it and they don’t understand it right away. And they’re like, oh, no, you can never do that. And you can actually adjust the speed easily in the menus and change the voice. And once you’ve done that, you can get it to a rate and a voice that’s really comfortable for you. And then as you get more and more familiar with it, you can change it again.
Katie Samson:
What do people normally say when they hear it for the first time?
Kristen Witucki:
Oh, I think most of the time people don’t say anything because what they’d want to say, they’d be scared of saying. But my imagination would suggest that people would say, what did that say? I can’t understand that.
Katie Samson:
Can you take us through a typical web surfing day for you? Document surfing. Like, do you pay your bills online? Shop for groceries? Do a little binge shopping?
Kristen Witucki:
Yeah, I do all of the above. I mean, as soon as I’m awake, I’ve often thought to myself, like, how did our days get to be like this? But as soon as I’m awake, I’m in my email, checking to see if there’s anything urgent. You know, getting some of those chores finished before my kids wake up. I do use websites to pay bills or, you know, check things. I also use apps like Instacart and some of the payment apps as well. So I’m back and forth on the computer or the phone, depending on which thing is more accessible for the task that I’m doing. And it depends on what mood the app or the website is in that day, I think. You know, sometimes, just as an example, like, PayPal works fine on the phone. Other times, it seems to be really annoying, and then I switch to the website, and the website’s fine. So I think there’s always this sort of thought, like, oh, is it going to work on this device today before I have to switch to the other one. So time is really important and just using the seconds that I have to accomplish things. And I do binge shop online. There are enough sites that are accessible that binging is possible, especially around Christmas time for my kids and stuff.
Katie Samson:
I think that’s really interesting the way that you described using your mobile device and then going back on the computer and realizing that it works better and how that can change from day to day. How often are you getting stuck on an inaccessible web page, on an inaccessible PDF that gets emailed to you? And is that “oh, well, I’m going to move on or I’m going to take up the fight”? Talk us through a little bit of that day-to-day process.
Kristen Witucki:
So it really varies. I have had days where a couple of different PDFs don’t load right in a row. And it’s like, how is this happening? And it depends on how important it is to me whether I will just let someone know that this doesn’t work or go somewhere else. You know, just as an example, one of my children’s teachers was using inaccessible PDFs and it was new this year after the teacher last year was using PDFs that I could read just fine. And I reached out to let the person know and she apologized and sent, you know, maybe one or two more, I think, that were inaccessible when I pointed them out. After that, the PDFs disappeared, I noticed. They’re becoming email body text. I’m like, thank you. So I think it just depends on how urgent it is for me to read the document. And, you know, in a school situation, there’s not a competitor. So if you can’t read your child’s teacher’s words, then you can’t read them. Whereas if there’s a company whose website I can’t read, then I would go try another company and see if it’s any better.
Katie Samson:
I can definitely relate to that in the physical accessibility world, you know, when it comes to storefronts or restaurants or something like that. You know, it’s just move along. So Dax even discussed this with me: when he checks a document for accessibility, how he has come to understand the user experience for someone like Kristen. Let’s listen to Dax’s process as a sighted person reviewing a document with a screen reader to make it more accessible or to remediate it.
Dax Castro:
I think it really boils down to understanding a persona, right? When we think about explaining a disability, it’s always best to use a persona for context. You know, so-and-so has a vision disability, but he just has low vision. Somebody else might be colorblind, right? Someone else might have mobility issues. And so they use a text-to-speech tool, right? Rather than a full-blown NVDA or JAWS, which is really designed to be more of a, what we call a formal screen reader. And so lots of different avenues. And I think that’s one of the biggest challenges for people is to keep all of those personas in their brain as they’re remediating a document. Because passing a checker is just 30% of accessibility of a document if I had to throw out a number. Right. Because some people are colorblind. Some people have vision issues where they have tunnel vision or peripheral vision. And all of those things all play into what will this user experience be for the person digesting this document? So as I walk through a document, I’m thinking about all the different ways that a person might access this information. Is it useful for this item to be a heading or to be a list or to be a table? Is there a different way to present this that would make a bigger impact or create less of a barrier, right? And oftentimes, it really comes down to knowing what that user experience is like, as a person familiar with that experience.
Katie Samson:
It is that barrier, that user experience upended in the digital space by inaccessibility. But that frustration is hard to convey, even when I’m face-to-face with a store owner who unintentionally blocks the aisle with way too much inventory, making it difficult for me to get around in my wheelchair, versus Kristen shopping online, when she can’t be face-to-face to address any issues. Kristen and I experience barriers on a daily basis. They can be very different types, physical versus technological. But I wanted to dig a little deeper and ask her how she handles it. I wonder if you could talk a little bit about just the feeling of alienation, segregated experience, and how often that, you know, for lack of a better way to put it, just kind of gets you down or at least just sort of gets you frustrated enough that you feel like you want to do something about it and put in the time to self-advocate, whether it’s a platform specifically or a website that you know could be helpful for your family, for your children’s education, for your own personal writing?
Kristen Witucki:
Well, in the case of my kids’ education, it’s a really complex feeling of “I shouldn’t really be making this a priority because I’m concerned about his learning and what’s going on with him.” And obviously the teacher should be most concerned about the students in the class and not their family members. I know how valuable time is when you’re a teacher. So I feel really weird about bringing things like that up because I don’t want to distract the teacher from the 30,000 other things that they have to do every single day. In the case of a company, I think the frustration of voicing a complaint about accessibility is most likely more about worrying that whoever gets that message doesn’t actually know what I’m talking about because there’s not a dedicated public channel for accessibility complaints, or if there is, it comes across as, oh, let’s help these people with disabilities because our site doesn’t work, rather than necessarily understanding what the problem is or how to fix it or, you know, being able to talk that accessibility language. So I think there’s always that concern when I’m reaching out to a company, like who am I going to talk to? And even if they’re helpful, are they really going to address the systemic barrier or are they just going to be like, oh, let’s help this person because we’re nice and they’re nice. And then let’s go back to business as usual.
Katie Samson:
Yeah. It’s interesting because you’re not having that face-to-face communication. It’s oftentimes through email. You might not hear back for a very long time. Sometimes, I know with the work that we’ve been doing at Tamman, you can reach out and get a roadmap from a company that will explain to you, basically, like, we’re working on it. Check back with us in a year or so.
Kristen Witucki:
Yeah, this is funny. I’m like, do I have a year to wait? Like, do I really have to wait that long? But, I mean, the MTA will be ready with 90% of their stations by 2055, so…
Katie Samson:
Yeah, and meanwhile, Delta Airlines is making it so power wheelchairs can be on flights. So if Boeing can recraft an airplane to accommodate a 500-pound wheelchair, I think you can map out an accessible part of your website.
Kristen Witucki:
I hope so, but you never know, right?
Katie Samson:
I want to talk briefly about images, specifically alt text, which stands for alternative text. And if people are really interested, they should go back and listen to an earlier podcast in which we talk about alternative text and AI with Be My Eyes. It’s a really great episode. Basically, alternative text is an alternative description for a visual or graphic element that allows a screen reader user to understand the context of an object not directly available as text, such as photos, drawings, charts, infographics. And this is code that’s written into the design of a web page or a document. Oftentimes, designers don’t understand how to build a website with accessibility for screen reader users. The information might come across as a title given to an image when uploaded. So for example, a screen reader might read an image as like young child seated on grass hugs a puppy, or it could come across as boy3896.jpg. So my question is, like, your thoughts around alt text, its specificity, its objectivity. And have you ever come across a website where you’re like “someone in the backend is writing some really amazing alt text. I kind of want to get to know them.”
Kristen Witucki:
Yeah. So alt text fortunately is getting better and better, and I…there are many reasons for that. Humans are learning about it. AI is getting involved, sometimes actually pretty well, sometimes not so well. And both of those channels lead to more and more awareness of alt text, and websites are making it more and more obvious in the code when people are creating them that alt text is required. So that is a really amazing development. I think it depends on so many factors, like where the image is, how much text is on the screen, how many times I’ve been to the website, like whether it’s new or familiar. If the images are sort of out of the way or between paragraphs or whether they just kind of interrupt the flow of text. And again, those wonderful image names, which can be totally lovely and descriptive or can be complete gibberish. So all of those things contribute to a really wonderful or really terrible experience.
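For anyone curious about where alt text actually lives in the code Katie mentions, here is a minimal, illustrative sketch in Python. It is not a tool used by anyone in this episode; it simply scans a snippet of web page markup and flags the two situations described above: an image with no alternative text, and an image whose alt text is just a raw filename like boy3896.jpg.

```python
from html.parser import HTMLParser

# Illustrative only: flag <img> tags whose alt text is missing, empty,
# or looks like a raw filename rather than a description.
class AltTextChecker(HTMLParser):
    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        attrs = dict(attrs)
        alt = attrs.get("alt")
        src = attrs.get("src", "<no src>")
        if alt is None or not alt.strip():
            # With no alt text, a screen reader may fall back to the filename.
            print(f"Missing alt text on {src}")
        elif alt.strip().lower().endswith((".jpg", ".jpeg", ".png", ".gif")):
            print(f"Alt text looks like a filename: {alt!r} on {src}")
        else:
            print(f"OK: {alt!r}")

checker = AltTextChecker()
checker.feed('<img src="boy3896.jpg" alt="boy3896.jpg">')
checker.feed('<img src="photo.jpg" alt="Young child seated on grass hugs a puppy">')
```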
Katie Samson:
As we talked about in the interview that you did with me and my background in arts and culture, I wondered if you’d ever come across alt text as poetry and some of the trainings and workshops that are being done in that space?
Kristen Witucki:
I didn’t until I read the interview script for this interview. So I looked at the link and I’m kind of amazed that I haven’t because there’s so much that we can learn from poets, not just the emotional resonance, but how to create a compelling, thoughtful, beautiful, economical piece of alt text. So, you know, keeping it short and to the point and using the words that you mean and really being aware of the emotional weight of words as well as their complexity or simplicity when you create alt text is so, so important. It’s something that people at Tamman talk about all the time, but I’m not sure that I ever heard anyone talk about it as poetry, and maybe I’ll start evangelizing that to our documents team because I think they would love that.
Katie Samson:
We’ll put the link for alt text as poetry in when we post this. And also you can definitely just Google alt text as poetry. And I think it will come up. I want to just move on a little bit to talking about advocacy versus anxiety. And a topic that comes up often in disability community circles is the frustration and fatigue that comes with self-advocacy. And you definitely talked about this a little bit. Some days we’re really ready to take on the world and other days we don’t want to think about and fight through the barriers. Can you talk about your relationship with this general idea and where you find yourself aligning with self-advocacy, potentially in the digital space? But let’s take a trip outside the digital space too, because I’m just interested in that. And I don’t think we’ve ever talked about it.
Kristen Witucki:
For me, self-advocacy changes day to day, minute to minute. And I’m sure that biology has something to do with this. If I’m tired, if I’m hungry, if I’ve had one too many negative interactions that day, or even if I haven’t, all of these things play a role in my ability to advocate. I do think that advocating in the digital space, now that it’s part of my role at Tamman, and now that I’m getting more used to it, is becoming easier and more liberating. And I definitely don’t always win. Sometimes I feel like I’m just this tiny morality blip for these giant companies that have enough people to isolate them from an accessibility conscience. But it’s still freeing. Of course, part of that is not advocating in person. But I think most of it comes from being with a community of allies from all around the country who truly understand this. Some of them have disabilities, some of them don’t. Some of them come from other traditionally marginalized or underrepresented groups, and others don’t. And yet, we’re all brought together by this profound ethos of caring and human generosity and of being very mission-driven about accessibility. And that has really inspired me as well to advocate for and support others when it’s their turn to be seen, heard, known, understood, and validated.
When I think about advocacy out in the physical world, outside of my home office, it’s so much harder. I’ve often been the target of what I would call disability microaggressions. People will just say things to me that are insensitive or just communicate those distancing feelings of pity or amazement, or else they’ll say things about me if they think I can’t hear them or maybe even if they know I can. I recently read an update from a blind acquaintance on Facebook that although she generally feels comfortable advocating for herself and for what she needs, sometimes when those moments of pressure become so great, she can feel her words jumbling. She can feel herself not making sense. And I understand that so well. When you feel these moments of anything you say will be used against you, or like someone has already formed an opinion that your words will never change, sometimes you just can’t make sense anymore. And the most extreme example of this in my life is when I went to the hospital to give birth to my first child. I had already been there a few times. I had taken my own tour. I had talked to some of the doctors and nurses, and when my son decided to make his entrance and then maybe retracted his decision at the last minute, none of those people were there, and no one had communicated across the staff. And the nurse assigned to me was just incredibly hostile because of her own fears. I had a friend with me, and I remember she said, she’s about to have a baby. Are you going to help her or not? But still, it’s so hard to advocate while getting a baby across that barrier into the world. And I know that most days are not that extreme. You’re not about to give birth or at the point of death when you just struggle to make sense. A lot of times you just struggle. But it’s hard to be alive. It’s hard to make yourself understood when you feel like it’s not going to matter. And my colleagues and friends at Tamman have shown me over and over again how much it does matter. So here’s a much more positive example of that. That birth story is from 14 years ago. So that means this same boy is now a teenager. He’s like this gawky but very dear young bird who’s just starting to fledge. He walked with me to a Lyft I had ordered one morning to get me to a school where I was teaching a blind student. There was construction work around the house and I wasn’t sure where the car was. So he came with me to find it. And the driver told me that he wasn’t going to take me if I wasn’t traveling with someone because I was blind and he’d had a blind person before and he wasn’t going to take the risk. And of course, with that welcome, I really didn’t want to get into that car, but I didn’t want to back down. And I also didn’t want to be late for work. So I said, you need to take me right now or else I’ll let Lyft know that you’re violating the law, which is the Americans with Disabilities Act. Really, I should have my sections of that law memorized, and I still need to do that. But it was good enough in this case. He did let me into the car, and he took me where I needed to go. On the way to work, I talked to that driver a little bit. I gave him the line Marty, our president, sometimes says about, “If you’ve met one blind person, you’ve met one blind person.” Ultimately, I don’t think he totally understood, but I still won. And the really powerful part of that was my son was there to see it. 
Because while I do my very best to show my kids that I like my life a lot and that I have a full life, I also know that society gives them messages about how parents should be. And oh, your parents are different. And I know they internalize those. One of the most hurtful things I hear people call to me or them is, “now you take care of your mom.” And I’m just like, okay, they can read signs, but I’m in charge and they’d be dead without me. And people just don’t get that. And so to have that power in that moment with my child witnessing it is something that I’ll never, ever forget.
Katie Samson:
So you’ve brought a sample exercise today. And I’d like you to take us through a screen reader with a little poetry that you’re going to share with our audience. And I’m really excited to listen to it and to talk through what that sounds like for you.
Kristen Witucki:
This is a tiny bit of poetry by John Lee Clark who is a deafblind poet and so therefore experiences the screen reader in a completely different way than I do. And screen readers will actually, for people who are deafblind, give them access to connect the braille display to the computer. So it’s a very text-based experience and not an audio experience. So just really important to know that screen readers can handle these things in many, many different ways. I have really been into this book that we talked about at Tamman called Touch the Future by John Lee Clark. It’s a very provocative book by a deafblind author. And you know that I love this book when chapter one is called Against Access, but my whole job is for access and I still love the book. He’s also a poet and I found a really short poem that he wrote and I’d be happy to have JAWS read it for us.
JAWS Screenreader:
Our heading level one approach by Visited Link John Lee Clark.
Kristen Witucki:
So the heading level one is, you know, showing if you were on the website, which is very busy and you wanted to jump to a certain area really quickly to get to the poem, which is the most important part of the website, you can use your level one heading to get there.
JAWS Screenreader:
Article.
I spin around in the middle of the corridor.
My cane taps against four elevator doors.
I have pressed both the up and down buttons because there is a fifth elevator door.
If I tried to tap all five, I would come to closing doors too late.
Let the fifth door open to a ghost.
Let it be confused and close again.
Katie Samson:
So interesting. I wonder what your experience is in listening to poetry through JAWS versus reading it through Braille. Do you get a different level of satisfaction doing it this way? Or would you prefer just normally to be reading it the way that you normally read it?
Kristen Witucki:
So to be completely honest, I prefer Braille poetry because I feel like when I read poems, I want sort of my inner voice to be reading it, even if I’m not reading it out loud, and not to have that mediation where it doesn’t bother me so much on a day-to-day basis with emails and other things. However, if I’m in the middle of a doom scroll and I’m like, okay, I’m going to go read a poem, then that’s not going to stop me from enjoying a poem on JAWS.
Katie Samson:
What technology has made available to us is remarkable, and still we have a ways to go. Dax trains people in how to design digital documents with accessibility in mind. He put me through a demo navigating a document using a screen reader called NVDA.
Dax Castro:
So what’s interesting is I think that this is a really good experience for people who are listening to the podcast right now because you don’t get the benefit of seeing the content just like a person who is using a screen reader might do. So I’m going to go ahead and turn on NVDA.
NVDA Screen Reader:
Speech mode. Speech mode talk.
Dax Castro:
And we’re going to listen for a second. I’m just going to navigate around a bit and let you listen.
NVDA Screen Reader:
H. Digital line. Contact us heading level three. Smithsonian information heading level three. Welcome to the Smithsonian Heading Level 2. Smithsonian Information Heading Level 3. List with four items web. Link space www.si.edu. No indent plan. Start your visit planning at the Smithsonian’s vert hours. Most museums are entry. Free, unless otherwise no out of list heading level three, contact us.
Dax Castro:
So you can hear, we heard heading level two, heading level three, and then we heard list four items. And hopefully in your brain, as you were listening to that, you were getting a sense of the structure of the document, right? There’s a heading that says, welcome to Smithsonian. There’s a couple of subheadings for Smithsonian information and contact us. And then under each one of those, there’s a list of a few items. And that’s really the gist of what a document with proper tags sounds like. Because imagine you heard all that information just as run on text. There’d be no context as to what is a heading versus what is the information in the list or how many list items. One of the nice things you heard was list four items. Imagine that you had a list of a hundred items and you did manual bullets on every one of them. You just use the bullet symbol and hit tab and put your text in. The user would hear bullet item one, bullet item two, and it would get to about number five and go, okay, how many more items are in this list?
Katie Samson:
That’s a lot of bullets.
Dax Castro:
Exactly, because you manually made the bullets. But if you properly use the list tool inside Word or PowerPoint or InDesign, then it will voice list five items or list 100 items. Because if I know there’s 100 items, I might search, do a text search to see if I can find the one item I’m looking for. Imagine it’s a program with a list of graduates and it’s all in a list. I might just search for that person’s name, right? And then I would know, oh, you know, there’s Jane Smith in the middle of this list. And because you’re using heading structure, the screen reader user can skip that whole list by maybe pressing the H key for heading. And then so they would pop out of that list and go right to the next heading. And this goes to what we call designing with accessibility in mind. If I know I’m trying to tune my user experience for someone using a screen reader, I’m going to make sure that I am using headings from one section to another so that a person could skip. Because imagine there were three paragraphs after that list and the person pressed H, they would have skipped all three paragraphs, right? So making sure that you have anchors and markers in your document to allow a person to skip content is kind of an important skill that you learn as you learn to design with accessibility in mind.
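To make the idea of headings and properly tagged lists a little more concrete, here is a small, illustrative Python sketch. It is our own simplified model, not how NVDA or JAWS is actually implemented, of the “press H to jump to the next heading” behavior Dax describes, over a document shaped roughly like the Smithsonian example.

```python
from dataclasses import dataclass

@dataclass
class Node:
    role: str       # "heading", "list", or "paragraph"
    text: str
    level: int = 0  # heading level; 0 for non-headings
    items: int = 0  # item count, announced only because the list is properly tagged

doc = [
    Node("heading", "Welcome to the Smithsonian", level=2),
    Node("heading", "Smithsonian Information", level=3),
    Node("list", "www.si.edu; hours; admission; directions", items=4),
    Node("paragraph", "Most museums are free unless otherwise noted."),
    Node("heading", "Contact Us", level=3),
]

def announce(node: Node) -> str:
    """Roughly what a screen reader voices when it lands on an element."""
    if node.role == "heading":
        return f"{node.text}, heading level {node.level}"
    if node.role == "list":
        return f"list with {node.items} items: {node.text}"
    return node.text

def next_heading(doc: list, position: int) -> int:
    """Jump forward to the next heading, skipping lists and paragraphs entirely."""
    for i in range(position + 1, len(doc)):
        if doc[i].role == "heading":
            return i
    return position  # no further headings; stay where we are

pos = 0
print(announce(doc[pos]))      # Welcome to the Smithsonian, heading level 2
pos = next_heading(doc, pos)
print(announce(doc[pos]))      # Smithsonian Information, heading level 3
pos = next_heading(doc, pos)   # the tagged list and the paragraph are skipped
print(announce(doc[pos]))      # Contact Us, heading level 3
```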
Katie Samson:
So great to get a little taste of the work that Dax and Kristen do. And I know, Kristen, you have actually presented together with Dax, talking about screen reader use at conferences and webinars. Anything you want to add to that or just little tidbits that you might want to share about that experience?
Kristen Witucki:
Dax is almost literally a superhero. I’m convinced that that’s hiding back there somewhere. Maybe not, because he really inhabits that experience of many minds, always thinking about how different people will interact with a document, and he really knows his commands and takes time to learn screen reader commands. And I’m learning from him, basically. So we presented together a couple of months ago, and it was really interesting to encounter some of the barriers in these webinar programs. So we had created a presentation video ahead of time, and that video was uploaded and the audience was watching the video. And our job for the first part of the presentation was to respond to the chat. So the only way I could really hear the chat clearly and focus on it without being distracted was to turn my video volume all the way down and listen to that. And I thought I had come up with a great solution. And then we had the live question portion of the webinar. And I had messed up the volume in a way so I couldn’t hear the audience asking questions. And Dax just swooped in and called me on Slack. He was like, okay, this person’s asking this question. Can you answer it, please? And I answered it. And it was a really interesting tech experience, and his ability to think with this lightning speed is really, really admirable.
Katie Samson:
Yeah, sometimes in those situations it can be nerve-wracking, because it’s like you don’t really know what’s happening and you want to solve it really quickly, but you’re trying to figure out how to do it with typing and what format to use. So I think we definitely have to plug Chax Training and Consulting, because it’s been just an absolute delight to bring this team together, and please feel free to check out the website and sign up for a drop-in class if you’re really interested as well. C-H-A-X, get to know them and us as well. Kristen, I do want to have you share a little bit about resources or opportunities. If people are really interested in getting to know a screen reader or experiencing it for the first time, whether from the design perspective or as a person perhaps new to the experience of low vision or blindness, what’s out there, what’s available that you could recommend?
Kristen Witucki:
Well, there is a phenomenon called the no mouse challenge. And it’s the idea that people don’t understand how mouse dependent our computers are. And they actually were not always so. And people who are keyboard users only can find some websites really difficult to use because the designers and developers have thought about it from a mouse perspective. So there’s this idea that you take an hour or a day or a week or parts of hours and days as time permits to try using only your keyboard to get an idea of how websites work using that interface. And I guess I just want to reiterate that using a screen reader itself is very easy to do. It’s not scary. You really can’t mess it up. You can always turn it off when you’re scared or tired of it, but it is so available. You can get VoiceOver on your iPhone, if you have an iPhone, by going into Settings, Accessibility, and just turning it on and checking out how VoiceOver gestures work on a phone. If you have an Android, it’s TalkBack. And on a computer, you can download either NVDA for free or you can turn VoiceOver on if you’re using a Mac. So there are lots of ways to interact with a screen reader. Start with a really low-priority task or, god forbid, something fun, you know, and just play with it. You know, just enjoy it. You know, play with the voices. People love playing with the voices. Adjust the speed so that you’re comfortable hearing what you hear. You don’t need to start out at the crazy speeds that we’re kind of showing off to you. You can really make it very comfortable for you. If you have a colleague who is blind or low vision, you can reach out and ask questions. And we’re very happy to answer them and talk about this because, you know, it’s fun to talk about. And also, get a colleague involved. Like, find a friend. If you’re worried about trying this out on your own, then try it with someone else. Try it with a kid. That’s always fun.
Katie Samson:
If you’re trying it with a kid, it’s sparking that curiosity at a really young age to learn about accessibility and all the aspects of it. So that’s a great idea.
Kristen Witucki:
Yeah, it goes both ways. It sparks the curiosity in a child, and it gives you the child’s enjoyment. Like, it’s not serious all the time. It’s not like, oh, I guess I need to go build my empathy now, you know? It’s like, oh, you know, a child will just have fun with it. Like when I started, I was playing with the pronunciation. So that’s fine. You know, nobody’s going to send the screen reader police over.
Katie Samson:
Well, thank you so much. I love that we shared with our listeners today two really different types of screen reader users and demonstrated both. I have learned a ton, but I also think there’s a lot more to learn and more to tell. So let’s keep this conversation going. Please reach out to us if this sparked your curiosity in some way, if you’re interested in learning more about assistive tech, and if you’d like us to open up further conversations and further interviews with folks out there. So thank you, Kristen, for being a part of my first hosting responsibility.
Kristen Witucki:
Oh, it’s been my pleasure, Katie. Thanks so much for hosting.
Katie Samson:
I thought it was only fair that we let my iPhone VoiceOver take us out of this episode. But before that, our guest today was Dax Castro. Markus Goldman is our executive producer. Support also came from Stephen Stufflebeam, Sydney Bromfield, Sloan Miller, Kristen Witucki, and Lena Marchese. I’m your host, Katie Samson.
JAWS Screenreader:
If you enjoyed listening to us today and want to explore more about digital accessibility, technology, our company culture, or anything else, schedule a time to meet with us. You can find the whole Tamman team at tammaninc.com. That’s T-A-M-M-A-N-I-N-C-D-O-T-C-O-M. Be sure to rate our podcast five stars on Spotify, Apple Podcasts, or wherever you listen to us. It really helps our podcast grow and reach new audiences. Make sure to follow us. Hit the bell icon so you never miss an episode. If social media is more your style, you can also follow us at Tamman Incorporated on LinkedIn, Twitter X, Instagram, or Facebook and share our podcast on your favorite platform.
Show Notes
- JAWS Screen Reader: Screen reader used by Kristen and the world’s most popular screen reader
- NVDA Screen Reader: Screen reader used by Dax and is available to everyone for free
- Alt Text as Poetry: An artistic endeavor that reframes alt text as a type of poetry
- Article 19 Episode, ‘Can AI See the World the Way We Do?’: Joined by Vice-Chair of Be My Eyes, Bryan Bashin, and Tamman document accessibility specialist, Liza Grant, we explore the differences of alt text written by a trained human and AI and what the future of AI looks like for blind and low-vision individuals.
- Chax: Further connect with and learn from Dax and the Chax team
- Access Ingenuity: Screen Reader class mentioned by Dax
- Screen Reader DOS Computer Demo: YouTube video of Kristen’s first Screen Reader