Matt May, Head of Inclusive Design, Adobe: how to build inclusive technologies; ethics & trust; privilege & allyship – The Human Show Podcast 31
Matt May has had an extensive career in technology design, access and inclusivity. He is currently Head of Inclusive Design for Adobe where his work includes integrating inclusive design practices across every aspect of the Adobe user experience, training and mentoring the Adobe Design team, and advocating principles of accessibility and inclusive design to the public at large.
In today’s episode we talk to Matt about access, inclusion and exclusion in technology design through the lens of his work in this space. We cover what it is and how to start building inclusive design; how to scale inclusivity at a corporate level; the connection between inclusion, ethics and trust; the difference between inclusion and compliance. We also talk about how he uses research to support his work. Lastly, we cover privilege and allyship in the technology space.
Get the Podcast Here:
Mentioned in Podcast:
– Web Content Accessibility Guidelines (WCAG)
– AI principles, ethics and Google
– The Americans with Disabilities Act
– Safiya Umoja Noble: Algorithms of Oppression: How Search Engines Reinforce Racism
– Emily Chang: Brotopia: Breaking Up the Boys’ Club of Silicon Valley
– Virginia Eubanks: Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor
[00:00:02] Hi everyone. This is Corina Enache. Welcome to The Human Show, proudly presented and supported by WorldPodcasts.com. Here we explore the relationships between people, technology and business. Join us on this journey where we interview anthropologists, other researchers and industry people from all over the world, from India to Kenya, the US and Europe, to right back here in New Zealand.
[00:00:28] Corina: Hi friends. In today’s episode we are talking to Matt May, head of inclusive design for Adobe. We talked to Matt about access, inclusion and exclusion in technology design through the lens of his work in this space. We cover topics such as what inclusive design is and how to start building it, how to scale inclusivity at a corporate level, the connection between inclusion, ethics and trust, and the difference between inclusion and compliance. We also talk about how he uses research to support his work. And lastly, we cover privilege and allyship in the technology space. We hope you enjoy it.
[00:01:09] Corina: We are here today with Matt May, head of inclusive design for Adobe. Hi, Matt.
[00:01:15] Matt: Hi! How are you doing?
[00:01:17] Corina: Very, very good. Full of energy after an hour in the gym and my coffee. I need to kind of like lower my volume a bit.
[00:01:26] Matt: That’s cool. We’ll balance each other out because I’m in the last few hours of my Friday.
[00:01:33] Corina: Okay, we’ll do that. Well, Matt, before we dive into the wonderful topic of inclusion in technology, I want you to tell us and our listeners a bit about your own career path with technology and inclusion. Maybe you could start by defining what inclusion is to you personally.
[00:01:54] Matt: Sure. I guess to borrow from something I wrote a few weeks ago: to talk about inclusion is to speak in a way that is like the second half of a sentence. You talk about inclusion because exclusion is the norm. And we see that in every aspect of technology, from the user interface to hiring practices to the decision making that goes into the products we create on a regular basis, how we test users, how we ask for feedback. All of those things have aspects of inclusion or exclusion, depending on where you’re looking.
So my role at Adobe is basically to find those instances of bias, whether they’re explicit or implicit, and come up with a way to systematically eradicate them from the equation, wherever I have the opportunity. From my background in accessibility, and I guess we can get into that, the concept of disability is something that jumps out, because it’s almost always an area where people are actively being excluded by very solid barriers to access. But my role at Adobe includes talking about the same kinds of exclusion along lines of race, gender, sexual identity, economic status, anywhere that we are the ones who are, even unconsciously, raising barriers that don’t need to be there.
[00:03:44] Corina: How did you come to have an interest in this topic?
[00:03:48] Matt: This actually takes me back to the beginning of my career as an engineer. I was a web developer in the mid-90s, so I was there more or less from the start. My first HTML book was about 95 pages long, and it was a chapter-and-verse recitation of every element, down to the dfn element for definitions. So really ancient material.
The dotcom crash destroyed the entire industry, really. But we started getting calls from people saying: hey, it’s really nice that you’re making it easier for the suburban household to buy their groceries, so let me tell you what my experience of buying groceries is. I call the local supermarket.
I tell them what I need. They open their paper catalog and tell me what they have, and then they go and pick what’s available from the supermarket aisles. And then I get into... I live in Seattle, it’s a big city, so we have an access shuttle, which picks you up on a certain schedule.
So I call the access shuttle, the shuttle takes me to the supermarket, I pay for the groceries plus the packing fee, and I come home and do my own work. So what you’re doing for people to do a shop in 10 or 15 minutes takes me maybe three or four hours out of my day. That for me was the moment where the light came on. We were dealing in a business where we were trying to shave minutes off of somebody’s day, right? We were just trying to get them a little extra time and improve the quality of their life. And if we looked at it through the lens of disability, we were talking about freeing up wide swathes of someone’s day.
That’s life-enabling technology, when you start thinking about the people who can actually benefit from it most. And once you see that, you don’t unsee it, right? It just continues to be the thing in the back of your head: who is going to benefit from this the most? How can we use all of this technology, all of this CPU power, all of these peripherals like cameras, to be transformative, instead of just rent-seeking or finding little optimizations for things?
[00:07:17] Corina: You were mentioning earlier that exclusion is the norm, and you can’t get inclusion without looking at exclusion. So how do you decide which areas you want to focus on to make them inclusive?
[00:07:32] Matt: I think it starts with what you know and what you’ve experienced. The next project that I jump into is not my first try; I’ve already learned a lot of the ways that projects go wrong. And I think in the area of inclusion we all have the kinds of best practices, or patterns and anti-patterns, that we’ve recognized.
And I think that as time goes on, those things turn into policies, or we build them in as architectural constraints in the things that we build from. So when I go into a product team, one of the easiest things for me to look at is how the product intersects with disability, because there are concrete, well-documented policies and laws, like the Web Content Accessibility Guidelines, that give you the boundaries. Beyond this line, you have failed in a very concrete way. It’s also a way to express to designers and engineers and product managers why this is important.
There’s this story about Van Halen. I don’t know if you’ve heard it before, but on the rider for Van Halen’s contract, David Lee Roth had put in that he wanted M&Ms, but with one color of M&M taken out. It was an absurd, rock-star kind of demand. And if he walked into the green room and saw that they hadn’t done it, he would just trash the place; that was his reputation. Later on he said the reason he did it was: if they didn’t read that, then how can I trust them to build a stage that I’m not going to fall through? If you’re not paying attention to these kinds of details, you’re going to make far bigger errors that are going to be swept under the covers.
So accessibility, designing for and with disability in mind, is a way for us to have some structure in the way that we do this, so that it isn’t just “hey, let’s make it faster and add more features,” but finding the right way for us to express this kind of interface. We basically start from there because the tools exist for it. There are well-defined markers of progress.
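Concrete criteria like WCAG’s are what make this testable in practice. As a purely illustrative sketch (not an Adobe tool, and no substitute for a real audit), here is a minimal Python check for one of the oldest WCAG requirements, text alternatives for images:

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Collects the src of every <img> tag that lacks an alt attribute.

    A missing alt attribute is a concrete, machine-checkable failure
    of the kind described above: a hard boundary, not a judgment call.
    """
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            names = {name for name, _ in attrs}
            if "alt" not in names:
                self.missing_alt.append(dict(attrs).get("src", "<no src>"))

def find_images_missing_alt(html):
    checker = AltTextChecker()
    checker.feed(html)
    return checker.missing_alt

page = '<img src="logo.png" alt="Company logo"><img src="hero.jpg">'
print(find_images_missing_alt(page))  # ['hero.jpg']
```

Note that an empty alt="" passes, which is correct: WCAG treats empty alt text as the right marking for purely decorative images. Checks like this only catch the floor; they say nothing about whether the alt text is actually meaningful.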
There are ways that we can test this in all different scenarios, and it really is oriented toward the design and development of a product in general. The other areas where we’re focusing on inclusion have to do with the human aspect of developing products, like machine learning and artificial intelligence. I have somebody who has been working in machine learning, has a Ph.D., and has been telling me details that are just shocking to me. Like that genome research is still built around basically an able-bodied, twenty-something white male.
And when someone from another race is introduced into this data set, what they do is just find the differences between their DNA and the white male’s. The lazier researchers just ignore that data, because it’s more difficult to deal with than they want. So basically the end result is sort of like eugenics in digital form: we’re still only learning things about white males, to the detriment of everyone else. Those things happen in software all the time, right? We do user testing, but we do it in our own neighborhood; we don’t control for a balance of genders or a balance of races. So for all of this new technology, all this great stuff that could theoretically be empowering to everyone, we’re still introducing bias into it just by our own neglect.
Then it comes down to things as simple as filling out a form online. We’re getting a lot more casual about how we address people online; 30 years ago it would always be Mr. or Mrs. or Ms. I’ve said this before: why does Cat Fancy magazine need to know what your gender is? Why? Because gender is something that culturally is shifting in a great number of directions, and it varies from culture to culture. Some cultures have already reached the point where it’s not even up for discussion that there are more than two.
So we have reached this point of asking: why do we need to potentially offend a portion of our user base by asking them for their gender if we’re never going to use it for anything? A lot of the time, the only reason an organization is capturing that data is that they want to use it for analytics, and it’s this shallow little data point that just reinforces a gender binary and also has this sexist aspect of “women do this and men do that.” And it’s those little things where you go into a meeting, you see that this thing is there, and you say: hey, why did we do that? And somebody says: uh... what?
[00:14:23] Corina: I want to share a story of mine. I normally don’t do this, but I think it really fits here. I was in a meeting on a project in the tech sector, and everyone else in the room was a man. I was the only woman there, and they were discussing how the avatar should look for a virtual assistant that would take you through your loan application.
They were all saying: oh, she needs to be trustworthy, so she needs to be tall and blonde, and because women in general are seen as being better with finances, we need to make her look like that. And it necessarily needs to be a woman, because men are not seen as trustworthy when it comes to finances. And I was sitting there and asked: how many of your users are men? And they said: around 35 percent; the rest are women.
So would women have the same perception about who they would want guiding them through an application process? And there was this silence in the room. Yeah, it’s incredible how much our own bias comes up in the conversation all the time. How do you manage to gently call it out? Or maybe not even call it out, but make it a positive point of conversation?
[00:15:37] Matt: So, I have a couple of advantages in this department. Number one, I do what it says on the tin. I am the head of inclusive design. It is my job to introduce these kinds of questions and demand an answer when I’m not really happy with the way things are right now. The other one is that I am a cisgender heterosexual white male in North America, and I don’t really have to worry so much about stepping on people’s toes in these kinds of environments. That is something I have to calculate into all of this as well: how many people in this situation are going to be able to do that?
I think there is room for people from higher levels of privilege to be calling out something as privilege, because we are the ones who end up in the rooms where we are more likely to hear people being biased, being racist or classist or ableist, and it’s just as much our responsibility to be doing it. So if I have any advantage in being able to call things as I see them, then so be it. I get to be loud. That’s the main advantage.
[00:18:05] Corina: Do you have specific areas of bias, especially when it comes to machine learning, that you and Adobe focus on calling out as you work on projects?
[00:18:16] Matt: Yeah, just this week Google published a sort of one-pager about their AI ethics: what they’re going to do and what they’re not going to do. For us right now, the things that concern me have to do with using AI and computer vision. For us at Adobe, our system is called Sensei, and we’re loading things into the cloud, so normal computational tasks that would go into Photoshop, for example, are now put into this giant graphics processor in the sky.
And we have a lot more that we can do when we offload that. But when you add machine learning into the equation, you end up with some of the unintended but really serious consequences that Google has experienced, where human faces of a certain color are recognized as non-human. Denying someone their humanity is about as terrible a thing as you can do, and whether or not it is an entirely computer-generated outcome, you should be horrified if that’s the outcome.
So what concerns me is making sure that the data we input into these machine learning systems, because that’s how we get the results at the other end, is as diverse as it can be. That includes various shades of skin, eye colors, hair, height, size, and the things that I think a lot of people will leave behind at first glance. If you are trying to track somebody’s face, looking for markers on their face that are going to move, we have to understand that there are people born without one or both eyes, that there are all different kinds of configurations of someone’s face. If you try to build systems like that without accounting for it, you create an absolute barrier to someone being able to use them. We’re still learning what we need to do in these kinds of systems. But my approach is that I want to defuse the bomb before it goes off: when we’re building these systems, we take these things into account, versus me waking up some Monday morning to find out that a random project has refused to acknowledge a set of humanity as being human.
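The kind of pre-flight data check Matt describes can be sketched in a few lines. This is a toy illustration, not Adobe’s or Sensei’s actual process; the attribute names and the 5 percent threshold are invented for the example:

```python
from collections import Counter

def audit_representation(samples, attribute, expected_values, min_share=0.05):
    """Flag expected attribute values that are missing or rare in a dataset.

    samples: list of dicts of per-example metadata (e.g. annotated
    demographic labels). Returns a dict mapping each under-represented
    value to its share of the dataset, so gaps surface before training.
    """
    counts = Counter(s.get(attribute) for s in samples)
    total = len(samples)
    report = {}
    for value in expected_values:
        share = counts.get(value, 0) / total
        if share < min_share:
            report[value] = share
    return report

# 90% "light", 10% "medium", and no "dark" examples at all:
samples = [{"skin_tone": "light"}] * 90 + [{"skin_tone": "medium"}] * 10
print(audit_representation(samples, "skin_tone", ["light", "medium", "dark"]))
# {'dark': 0.0}
```

The important design choice is passing in the values you expect to serve, so that groups entirely absent from the data are flagged rather than silently ignored.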
[00:21:29] Corina: Do you have some metrics or systems that you use to track progress?
[00:21:36] Matt: Yes, that is one of the things my organization is going to be working on. The way we’re developing our products so far is, we have a lot of different projects at different levels of development, not all using one specific piece that I can focus on and say: we need to get this part right. A lot of it is the product of a ton of original research, and those systems will start to come together to make something larger. That’s just the software development process. I think that for me right now, in that research process, the best thing I can do in the short term is bring those questions to each individual project, because first off, it corrects things that might be happening right now, but in the longer term they know I’m going to come back to them the next time, right? So they know they’re going to have to think about the same kinds of problems as they go, that it isn’t just a one-time event.
This is a change to our organizational makeup, as a matter of the ethics of the organization. When we talk about accessibility and the standards involved in it, those standards exist because we know things are reliably accurate for all values of X. When we get into research, we start finding new ways that we can break things.
What we need is a solid ethical viewpoint that we can refer back to, a set of core values, so that the people who are going out and blazing a trail have that to refer to. I mean, Google has always had that “don’t be evil” thing, you know, though I think they just changed that recently. But “don’t be evil” is problematic in a lot of different ways, because evil is in the eye of the beholder.
I think there are a lot of people working at Google who were doing what they considered, from their perspective, to be good things that don’t really square with the external opinion of them. I think we need something a little bit better than that when we are talking about design ethics. And that’s a subject that I think is exploding right now, just because we’re encountering not just the power of the new capabilities that we have, but also how easily they can be taken in the wrong direction.
[00:24:36] Corina: And where does this function of ethics sit within Adobe, and within your work?
[00:24:43] Matt: For me, I think I would always want this to be something that I can refer back to, where I could just say: these are the ethics of this. And that is also evolving. There are active discussions going on within Adobe, within the design community, and in tech in general around this. We do need to find a place where we say: we’re not going to do that. Again, Google just had this come up because they were working on a project with the Department of Defense, which led to a revolt among their engineers, with people actually quitting because they felt the work was offensive and damaging. So we need to express those values, and we’re working on it. I don’t think I can say much more than that.
[00:25:45] Corina: I wanted to ask you how do you work with research in this space? Like what’s the kind of role that you think research plays in supporting some of the work that you do?
[00:25:55] Matt: This is actually, I mean, I think each of your last three questions has hit on at least one meeting I’ve had in the last week or so. We have a design research organization, and then we have a larger corporate research organization.
And I look at things on a timeline. There’s the stuff that’s happening right now, where it’s everywhere in the marketplace, especially if there’s one issue that everybody has replicated because we never thought about it: the waterfall scenario, where we’ve waited so long that it’s going to be thousands of times harder to fix than if we had simply thought about it in the first place. And then we look at things like VR, augmented reality and mixed reality, where all of the pieces we’ve been talking about so far come together.
Differences in humanity, whether it’s appearance, capabilities, heritage, cultural and otherwise, all of those pieces come to a head when you are in an AR/VR kind of environment.
So here we are in a new territory without a map, right? What we know from the early days of the web is that we missed these pieces, so the next time we come up with some kind of transformative technology, we need to be there at the very beginning, asking questions about how we’re going to do this so that we can reach the greatest number of people possible. Some of that is a matter of increasing access: making it available at a reasonable price point, making it available geographically.
But also: if you’re deaf and you’re in a VR environment, how do we establish the captions that are made for you? Obviously we need to have the captions, but in that environment, how do we communicate them to you? If you see a crowd of people 5 feet off to your left, and another crowd of people 20 feet off over here, how do we communicate that these people are within your range and those people aren’t? There are these kinds of questions where we don’t have a good answer yet. Research is what bridges that gap. We know that we’re going to need to know that answer, and we can find it out in six months, or we can find it out in 10 years, after we’ve built eight or nine years’ worth of things that people actually use.
[00:29:08] Corina: And what type of profile of a researcher are you guys looking at when you contemplate these kinds of questions?
[00:29:15] Matt: I would say a background in human factors is an easy match. The researchers I’ve worked with come from various backgrounds. The head of design research came from a computer vision background, or, I think, just a vision background. I know this because I mentioned something to her about a project that I wanted, and she was like, (gasps) let’s do that! So nobody’s going to know everything about everything, right? It would be nice if we had one of those TV Ph.D.s who knew everything about everything. House.
The person who somehow knows 300 years’ worth of research and has it available at the top of their head. But for an organization our size, we thrive on the diversity of backgrounds and voices. From my experience working with the researchers here, they like being able to have somebody who knows more than them about one thing or another. And my place in this is not coming from a real academic background, but coming from accessibility, which is always kind of a... we don’t always have the resources that we want, let’s put it that way.
To have people like that whom I can talk with and say: this is a positive, this is good, this is something we can do and share with the world if we get it right, and to sway them that way instead of saying this product is going to make us X million dollars in 18 months. So research is incredibly valuable in that area, because it gives you the space to incubate something that can turn into something even greater.
I think a lot of companies are actually using disability especially as a bridge for technologies where they don’t really know what they can become in three to five years. They focus them on people for whom 80 percent of what they want is a dramatic, life-enhancing improvement, and what they learn in that effort makes a better platform for the kind of work that comes down the line. I think Microsoft especially is doing that.
You’re seeing Soundscape and all of these projects where they’re doing what they can with computer vision and 3D audio, and putting it in front of people in a way that’s really practical. I’ll throw in Google too, because two weeks ago they had Google I/O, and they have a lot of tools that are working from that same kind of playbook.
[00:32:37] Corina: I also wanted to ask you: what do you think is the connection between this work of inclusion and ethics, and consumer trust?
[00:32:46] Matt: That’s a good one. The way that I think about it is that we establish relationships with the products that we use and the organisations that create them. Each time we’re happy with something, it’s like adding to the bank, but each withdrawal is heavier than each deposit. So I think there’s a certain point where we are internally keeping tabs on the value we’re putting into this system versus taking out. Facebook is a great example of that. They have every connection that hasn’t already turned off Facebook in their system; that is their core value. But then you look at what they’re doing with your privacy, and the effects on electoral politics or on the day-to-day lives of people. And each time, there’s a lapse of trust.
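Matt’s bank-account metaphor can be made concrete as a toy model, purely illustrative, in which a lapse of trust costs more than a positive experience earns. The 3x weight here is an arbitrary assumption chosen only to make the asymmetry visible:

```python
class TrustLedger:
    """Toy model of the trust bank account: withdrawals (lapses of
    trust) weigh more than deposits (good experiences)."""
    WITHDRAWAL_WEIGHT = 3.0  # arbitrary illustrative asymmetry

    def __init__(self):
        self.balance = 0.0

    def deposit(self, amount=1.0):
        self.balance += amount

    def withdraw(self, amount=1.0):
        self.balance -= self.WITHDRAWAL_WEIGHT * amount

ledger = TrustLedger()
for _ in range(3):
    ledger.deposit()   # three good experiences...
ledger.withdraw()      # ...erased by a single lapse of trust
print(ledger.balance)  # 0.0
```

Under this weighting, three good experiences are wiped out by a single lapse, which is the shape of the relationship Matt describes, not a measured quantity.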
That accrues to your connection to Facebook as a company. So I think they get that now, because beyond just the stock-price level of analyzing the effects of trust on Facebook, they’re doing a TV ad campaign saying, effectively: we know we lost your trust, and we’re going to try to earn it back. But I think the way you put those two pieces together is telling, because for every case of Facebook not really caring who you are in general but selling your data anyway, there are all of the things that accrue especially along racial lines. There was a case where, if I remember this correctly, Facebook data was basically being used to effectively redline people out of housing ads and things like that, which is illegal in the United States.
There are issues where trust is lost along the lines of inclusion that I was talking about before. There is representation in advertising scenarios, representation in the interface itself, or things like asking for gender or individual details like that. So there are all different kinds of ways that you can feel: this is not my place, I don’t feel comfortable in this space. And a lot of that, I think, gets translated as trust. But some of it has a greater impact than the rest, especially if it’s a conscious or semi-conscious effort to exclude people without actually telling them that that’s what they’re doing.
[00:36:15] Corina: Do you know of any company or activity where there is a direct line drawn between acting ethically and inclusively, or not acting that way, and that leading to a loss of consumer base, or a loss of something that will tangibly shake the organization, especially when they’re focused on specific metrics?
[00:36:39] Matt: I can name a bunch of different lawsuits or legal settlements and the like in disability, along those lines. There are airlines, supermarkets, places of business that have failed to meet the legal requirements, which for us means the Americans with Disabilities Act; there are similar kinds of legal requirements, at different levels, around the world. But to be clear, these things are the minimum, right? The reason things become law is that we as a city, state, nation, federation have determined that you do this and you’re done, right?
Laws happen when you’re trying to establish a minimum. So that’s not ethics; that’s law. Ethics is how you transcend that, how you go above it, how you reset the bar, how you go to the logical conclusion of, in my case, inclusion. Each of those cases on disability grounds is a case where you have denied someone access, even at a physical level, like not providing a ramp into the entrance of a building. We have civil rights legislation along the same lines as well that covers a number of different aspects of humanity.
So what I think is happening right now is that in a lot of ways we are not just seeing our own communities, where we all stand up for ourselves, and one group gets angry and then another group gets upset about something else; we’re all starting to see the effects on one another. That, to me, is ethics. One of these things is just self-preservation, but when you start to see exclusion happen to someone who doesn’t look like you, and you see that as a wrong, I think that is when ethical behavior is coming into your life. So we see on a fairly regular basis people saying, “I’m fed up with the way that the big corporations are dealing with my data,” with how Twitter, for example, is suspending one group of users for threatening people’s lives, and then suspending or terminating the accounts of other users for firing back after being victimized.
So we see those things on a regular basis. And that’s one where I think there’s enough evidence to say something’s wrong with that system, something’s wrong with the management of the system, when the people who are receiving death threats are treated more harshly than the ones who are sending them.
[00:40:25] Corina: I have a question around how you scale this at a corporate level.
[00:40:30] Matt: There are like eight different directions I can go with this. I think there’s a piece in there that I want to tease out, which is the quantification, if you will, of ethics: how do you know that somebody has “ethicsed” enough, right? What I think about there is something like an annual diversity statement, and we put these out. The United States Department of Labor gets companies to produce data on their diversity metrics and things like that.
And there are lots of different ways to look at that information. What I have focused on has been changing hearts and minds. I want somebody, especially an executive who has the ability to drive change through the organization, to hear what I’m saying about inclusion and respond in kind. When I say “inclusion” to somebody and they say “compliance” back to me, they are already in the wrong mindset. Compliance is: I have this heavy weight, and I need to carry it over this line, and then I can drop it. That is what compliance is; I know when I’m done. Inclusion is a set of things that we put in our backpack and carry for as long as we’re moving. We always have to come back to it, and we always have to keep evaluating it.
My concern with the idea of a scorecard for organizational ethics is that when you create a game, people learn how to play the game. If your value in the public sphere is connected to did-you-do-this checkboxes or ratings, then people are going to go for the perception of doing the right thing without actually doing the right thing. You get some gains out of that. But if people don’t actually care about doing this, and we know this from the accessibility world because we’ve seen that check-mark kind of accessibility, then you can end up creating almost as much of a mess as if you hadn’t done anything at all.
So I spend a lot of my time trying to make people uncomfortable enough that they evaluate where they’re coming from as they’re doing this, but not so uncomfortable that they never want to talk with me again. So I keep working at it. I still feel like the best way to talk about humanity is to talk about it from a human perspective, that we need to better express our own differences, the things that drive us. I couldn’t say that I would want to be working in technology forever if it weren’t somewhere I felt like I could actually make life materially better for the people around me, and I like to think that I work for a company where that’s true of everyone. I haven’t really found that many exceptions to that scenario, so I’m going to go with it.
So I start from this point where I know I have some opportunity to actually get people to think about the work that they’re doing as a gateway to the kind of life-changing moment that I experienced early on in my career. I wouldn’t want to become the arbiter of all that is ethical in technology, because I have struggles of my own. I have things that I still need to work on, and I feel like it’s the product of a larger discussion. But we need to have that discussion. We need to keep having that discussion even when it’s uncomfortable. And I think that is, generationally, the way we make sure that everybody is working with integrity, so that these are not things we have to constantly bring up from scratch.
[00:45:28] Corina: I’m very mindful of your time and I just want to ask you if we have time for one last question.
[00:45:34] Matt: Sure.
[00:45:34] Corina: Oh, that’s great, thank you. So that is also about inclusion. I’m having a hard time phrasing this, but bear with me. I wanted to ask you: how do you move from ideological statements to actual action? How do you make that critical leap, and how do you start judging your own actions and get them to align better with your intentions, with your ideological intentions?
[00:46:05] Matt: I think a lot of us, especially those of us who are, you know, male, straight, at least middle class, are lucky if we even understand what the consequences would be if one of those things were taken away.
And I’ve seen it time and time again. You see it in social media, these viral posts, or something published in The New York Times, like, “so-and-so says you can save a million dollars in the next three years by doing these things,” and supposedly this person says, “well, I just bootstrapped, I did this all on my own,” and then within the third paragraph they’re like, “oh, I just lived in the condo that my parents owned, and I lived there rent-free.”
And you’re like, do you understand what kind of social fabric carried you to this destination? Because if you go into this saying the things that worked for me would work for everyone else, you’re not going to do it. You’re never going to make that leap.
And it goes all the way back to our own sort of mythology, Americans especially, that we are the builders and we’ve created all of this, and it gives us the ability to absolve ourselves or erase the actual history in which we exploited basically everyone else to do the work for us. So there are workshops going on around the United States where we are dealing with that sort of racial inequity. There are workshop leaders going from city to city, talking about white allyship.
First off, you need to understand aspects of the LGBTQ community, and the fact that a huge percentage, especially of trans kids, have spent time on the street because they were kicked out by their families. Right. Something most of us would never consider. They start from nothing. I mean, they’re there on the street without a penny. Anything that they’ve done from that point, you can say, is a testament to them individually. But understand that a lot of them start from a deep hole. They don’t start from, “well, I made it from prep school to my freshman year at Harvard, anybody can do it!”
Understanding those layers of privilege from the perspective of somebody who never had them is something that those of us who had them don’t get to live ourselves. And we probably didn’t go to school learning it either. So there is an aspect of it where we need to go back to where we started, and we need to think about, to put it more bluntly, that it’s not just the color of somebody’s skin that made them different.
It’s this layering of oppression that’s happened, sometimes in their lifetime and sometimes in their parents’ and grandparents’ and the rest of their fabric, that has led to this. And if we’re going to destroy it, it actually requires us to do some work, and to not get credit for it. And when you’re somebody of a different race or gender, somebody who’s LGBTQ, somebody who has a disability, and you go into this space that you’re nominally invited to, and you feel like this might be an opportunity to talk about things from your perspective, and then you just get that same cultural conditioning put back onto you, it’s like getting knifed twice.
And so I think there has to be some sense of humility, some sense of responsibility among people of higher socioeconomic status. Some sense that if we’re ever going to get any better at this, we can’t just discount the lived experiences of the people who didn’t have that status. And to a certain extent we have to get used to the concept that there are people who deserve better, and deserve it more than us, deserve to be where we are, because they do have to work harder.
They do have to endure things that we don’t. And that is something that, if you just sort of hold your breath and say, all right, I’m going to be a good ally, I’m going to go to this thing, and then you subject people to the same kind of environment that they expect white people to be giving them, that they expect men to be giving them… So, are you doing this so that you can feel better about yourself, or are you doing this so that you can make the world a better place? That is a question that I’m sure everybody will be really happy to answer, “Yes! I’m the good person that’s going to make this all better!”
But in order for you to do that, you need to jump into places that are outside of your comfort zone, and you need to hear things that you don’t like to hear, that may not be 100 percent accurate from your perspective, but if you go home and have sort of the long dark night of the soul, you’ll realize that it’s truer than you want to admit. And if you think about it in terms of addiction, you have to hit rock bottom. You have to actually understand the damage that you have created in order to be in a position to build back up. It’s uncomfortable, but it’s necessary.
Which isn’t to say that I’m going to walk into every room and destroy people, but I want to hit them at the first layer, where I know that the next step they take is going to get us out of the hole rather than further in. That’s just expediency on my part. But there is a much broader scope of work that needs to be done, and it starts with our own individual and cultural and subcultural histories.
[00:54:06] Corina: Yeah. And also just people’s ability, right, to do that long dark night of the soul without the kind of judgment that comes: oh, you’re doing this because you are ignorant, or you don’t want to be better, or…
[00:54:21] Matt: Yeah, and there’s a cultural awareness reading list now, at least in technology. So you see the books: Automating Inequality; Algorithms of Oppression by Safiya Noble; Brotopia. There are a lot of books coming out that describe what is happening from the perspective of an African-American woman, of a white woman in technology.
The one thing I’m waiting for is one from the perspectives of disabled people; that’s the one I haven’t seen on the bookshelf yet. But we need to reconcile those voices, because they haven’t been a part of the cultural narrative all this time. We need to reconcile that. So I think there’s room for a lot more of those discussions to take place.
[00:55:43] Corina: I’m going to put all the links you mentioned in the description of the episode, so that our listeners can go check those books out. Thank you so much for being with us today.
[00:55:58] Matt: Great, thank you for having me.
[00:55:58] Corina: Thank you for listening everyone. Follow us on our social media channels and look at the show notes for links to our speakers’ work. Join us next time for more interesting conversations.