In this episode of Reckoning, Kathryn Kosmides speaks with Adam Dodge about technology-enabled abuse. Adam is the founder of EndTAB, an organization whose goal is to end technology-enabled abuse against victims of domestic violence, sexual violence, human trafficking, stalking, and elder abuse.
Welcome to Reckoning, a podcast that explores gender-based justice, safety, survival, and resilience in the digital age, through conversations with experts and advocates.
I'm your host, Kathryn Kosmides, the founder and CEO of Garbo, a tech non-profit building a new kind of online background check. Before we jump in, I'd like to warn our audience that we have raw, honest conversations about gender-based violence, which may be too much for some listeners. Please put your safety and health above all else when listening.
Kathryn: Today, we will be talking with Adam Dodge from EndTAB. Welcome, Adam.
Adam: Thanks for having me.
Kathryn: Yeah. So first, what is EndTAB?
Adam: Sure. So EndTAB stands for ending technology-enabled abuse, and our focus is training victim-serving organizations and professionals to address the ways in which technology is used or misused to harm victims.
Kathryn: What exactly is technology-enabled abuse?
Adam: So while a lot of folks find technology-enabled abuse very intimidating, or think you need to be an expert to address it, the reality is that technology-enabled abuse is not new abuse. It's just the ways in which familiar, or traditional, or longstanding forms of abuse like stalking, harassment, or power and control are perpetrated via technology. And by via technology, we mean online or through a victim's devices, like their smartphone, their smartwatch, or their wifi.
Kathryn: Can you tell us a little bit about your background and kind of how you got into this work?
Adam: Yeah. I often tell folks during my trainings, which is probably a bad thing, that I'm not an expert in technology. Mainly because I want them to know that you don't need to be an expert to engage with the technologies that are being used to harm people and make a difference. So my background: I'm a licensed attorney in California, and my focus was primarily on addressing the legal issues that domestic violence survivors were facing. I helped a lot of domestic violence survivors get restraining order protection in California. And in doing that work, what I found was that technology was becoming pretty ubiquitous in those cases. My clients were facing technology-enabled abuse in some shape or form, and yet I didn't feel like I was well equipped to address it or prevent it.
And so I decided to educate myself, work with some cybersecurity experts, and really drill down into a subject matter area that has not been fully vetted or explored in a way that's meaningful for my colleagues and the folks who work in this field. When I did that, people started reaching out to me for training and asking me for technical assistance. And I realized the unmet demand, the critical need for those types of services, was really broad, like globally broad. So I founded EndTAB last summer and left the nonprofit I was working for to focus on this full-time so I could help organizations around the world. And it's been really fun.
Kathryn: One question I have is: how does technology-enabled abuse manifest itself among targets of not only domestic violence, but also sexual violence, elder abuse, and human trafficking?
Adam: That's something I spend a lot of time focusing on, because those victimization groups you just identified all have something in common, which is that the victims tend to know who the person harming them is, know who the perpetrator is. And that is the key to why those victims are at higher risk of technology-enabled abuse. You and I are at risk of being targeted with cybercrime, but I think the risk is relatively low that some hacker or third party in another country who doesn't know us is going to single out you or me to hack into our accounts or steal our identities or something like that. They're not motivated by any personal vendetta and don't have any skin in the game to go after us. Whereas if you take a victim of domestic violence, for example, the person that's seeking to harm them knows them and is personally motivated to harm them. Seven out of ten sexual violence victims know who the perpetrator is, and human trafficking victims almost always know who is trafficking them. So, to take a common form of tech-enabled abuse that a lot of them experience: monitoring. That can be monitoring their accounts through unauthorized access to their email or their social media, or it might manifest through location tracking. Our phones are preloaded with a variety of ways to track us that can be misused to do harm. For example, on an iPhone, Find My Friends, Family Sharing, and Find My iPhone are just three preloaded ways. If you're not using those, and somebody who has access to your device because they're close to you and they know you decides to engage one of them without your knowledge, they can follow you anywhere you go. And that really puts folks in danger.
Kathryn: It definitely does. As a victim of that kind of unauthorized monitoring you're talking about, I can say it's so true, and it's so scary how easily these things can be done without your knowledge, without your consent.
Adam: Absolutely. One of the exercises I go through with folks a lot of the time is to access the Significant Locations section on an iPhone. Unbeknownst to most folks, Significant Locations is activated on our iPhones and it tracks everywhere we go. So if you get access to that section through the settings, you can see where somebody was, how long they spent there, how often they went there, whether they walked or drove there, and how long it took them to drive there. That kind of information, while it benefits Apple by improving the way their apps work for you, can also give somebody who wants to exert power and control over you, or track your movements, a very detailed map of your whereabouts and your patterns. It's not intended to be malicious, but when it's misused or weaponized, it really can be a terrifying tool to exert power and control over somebody.
Kathryn: And all it really takes is having access to your device once. If you allow someone to use your device, they can easily add their own thumbprint, back when that was still a thing. If they know your passcode, they can quietly go into those settings and give themselves those permissions. That's what happened with my calendar settings: someone gave themselves access to my calendar, so they knew everywhere I was going, and they would show up there. It was so invasive, and I had no idea how they had access to this information until one day I was randomly in my calendar settings, doing something completely different, and I saw their name pop up and thought, "Holy shit! I cannot believe this person gave themselves access to my calendar." And this is how they'd been stalking and harassing me. So all it really takes is just one time, one small thing, and you never know the damage it could do.
Adam: Totally. And you're hitting on something else that I spend a lot of time talking about. We have these massive digital footprints, right? The average person has, I think, 80 online accounts and 70 apps, and access to any one of those can give somebody information about your address, your location, where you spend your time. So to your point, it's really difficult to figure out where the leak is, because we rely on technology so heavily to do basically everything. If somebody is trying to figure out how their address got compromised, it could be through a food delivery app, an exercise app, their calendar, their email, a variety of different things... I've worked with so many victims who can't figure out how their address was learned by the other party, and it's usually due to access from some account they didn't realize was still shared, like a Domino's Pizza delivery app. It's pretty shocking. You've really hit on something that puts survivors and victims of this kind of harm in a really disadvantaged position. We're very reliant on technology, very tethered to it. But when that technology is misused or turned on us, it becomes very intimidating, very difficult to navigate, and very difficult to use deductive reasoning to figure out where the leak is, because we use it for everything.
Kathryn: Exactly, exactly. And when I was going through this, people recommended that I just get a new phone or a new computer, and it's like, "Yes, I can totally afford to just buy a brand new MacBook or a brand new iPhone..." It's such unrealistic advice, and it's the kind you often get when you're going through these scenarios.
Adam: Yeah. I get a little hot about that one, because that's a form of victim-blaming, right? Like it's the victim's issue? Well, why don't you just get off social media? Why don't you just change your email account? For me, that's tantamount to saying to a rape survivor, "Well, why don't you stop dressing that way or stop going to those bars?" It's not the victim's fault. We should not be asking why the victim doesn't get off social media. We should be asking why the person harming them continues to do this. Why don't we focus on the person causing the harm rather than making it the victim's problem? And there's a practical problem with that advice too. In your circumstance, for example, say someone told you, "Well, just get a new phone and get a new computer."
Well, your calendar settings are going to remain unchanged. So you would have done that and still had the same issue. I don't like that solution very much at all, because it doesn't empower the person who's being targeted to figure out what the issue is. And for most folks, I mean, when I was working for the nonprofit I was at prior, Laura's House, I was working with people at the poverty level, below the poverty level, the working poor. Getting a new device is not an option. So I can't stand that. The victim-blaming stuff just gets me really hot.
Kathryn: It's so prevalent, so prevalent... What are some common red flags someone can look for when it comes to technology-enabled abuse, if they think they're experiencing something?
Adam: So this is going to sound like a cop-out, but I've done this enough, and I've talked to enough folks, to come to the conclusion that this really is the number one red flag: the person's instincts, if you feel like something is off. The problem is that this often gets dismissed by others as paranoia, or self-dismissed by the person: "Well, I'm just being paranoid, this isn't real..." But we are so tethered, as I mentioned before. We use our technology for everything. We wake up in the morning, and typically the first thing we do is grab our smartphone, and that starts a pattern for the rest of the day. We're on our devices, we're on email, we're on social media. It's how we're getting our news. It's how we're communicating with our community.
It's how we're working, it's how we're learning, it's how we're doing all these things. So we're very in tune with our technology, and if there is something amiss, we may not be able to put our finger on it, like in your circumstance where it was the calendar, but we know something is off. I never ignore that, and I encourage all the professionals I train not to dismiss it. If somebody says, "I feel like this person knows something, or they've listened to a conversation, or they read an email, or they know where I am," take that seriously, because it's very possible that there really is something wrong. There's another red flag, and this one I refer to as a reddish flag, because I don't mean to say that everybody who works in technology is engaging in technology-enabled abusive behaviors.
But again, I've worked on this enough that if I'm working with a client and their partner works in tech, or is very tech-savvy, or set up their devices, or set up their home wifi or their network... I've seen it happen not uncommonly, where that person is using technology to monitor or engage in these sorts of unhealthy behaviors, because it's very natural, very second nature. If you work in tech, or you're very tech-savvy, or tech is your world, then you're going to use it to do healthy things and unhealthy things. So whenever I'm working with a domestic violence survivor, or whoever, and they say the other party works in tech, my antenna goes up, because I'm immediately concerned and immediately focusing on their devices. So those are two good ones.
Kathryn: I know you focus a lot on digital impersonation. How is this going to become an even bigger threat with things like deepfakes, voice cloning, and spoofing?
Adam: Huge, huge issue. Impersonation already is a huge issue, because there's no verification process to set up an email or an account in somebody else's name, or to be them. There are no barriers to impersonating someone. You can keep creating accounts over and over again. And at its base, it's still impersonation. This isn't new abuse. It's just that technology has made it easier to perpetrate and made the harm orders of magnitude worse, right? So for the deepfake example, let's take nonconsensual pornography, or revenge porn. Everybody knows what revenge porn is now, right? And yet revenge porn has been around for a long time. I would argue that when the camera first became commercially available in the 1830s, there could have been an act of revenge porn.
Somebody could have taken an intimate photo of their partner and shared it with their friends without that person's consent, okay? That's an act of revenge porn. The reason we all know what revenge porn is now is because of technology. Technology has been an accelerant for this form of abuse. Flash forward to today: I can take a picture of my naked partner without their consent, post it to social media or the internet in three seconds, and share it with the world. And the photo can never be scrubbed from the internet. So the order of magnitude of harm is greater, and my ability to perpetuate that harm is much greater. Now, fast forward to deepfakes, which are based on artificial intelligence. It's basically face swapping: a form of abuse where you can take somebody's image or likeness from a photo and map their face onto another person in a video, let's say pornography. And now it looks as if that victim is participating in this sex tape. People argue about calling it sexual violence, but to me, it is a form of sexual violence or sexual abuse: you are fetishizing somebody without their consent. These videos can be very realistic looking, and whether people know it's fake or believe it's real, the harm is still there, right? The example I often give is a high-school-age woman or girl walking down the hall who sees a bunch of kids watching a video, and it turns out to be a deepfake porn video of her, that her face has been swapped into a pornographic video. Even if everybody knows it's fake, when that young woman walks away, the idea that she doesn't feel any harm or is not impacted by that experience is ridiculous. We don't even know what this form of harm is going to look like long-term, but being sexualized and fetishized without one's consent is certainly damaging.
Kathryn: 100%. It goes to that quote: "Impact, not intent." Whether they're intending to cause harm or not, they are causing harm. I believe, as you said, it is sexual violence, a new form of sexual violence. And I read in one of your resources that 96% of all deepfake videos are pornographic and almost 100% of them target women.
Adam: Yes. You know, online violence is so gendered and skews so heavily towards women, and I think nonconsensual deepfake pornography really encapsulates that. And unfortunately, a lot of the men thinking about how to solve this problem are worried about the wrong thing. Everybody's worried about disproving deepfakes: "Oh, what if somebody creates a deepfake of Trump or Biden and it upsets the electoral process or upsets democracy?" But that hasn't become an issue yet. Almost a hundred percent of deepfake victims, pornography or not, though it's mostly porn, are women. And nobody cares whether deepfake pornographic videos are real or not.
Millions and millions of viewers every month are flocking to these websites to watch these videos. Nobody believes it's really Scarlett Johansson, or some other actor, in this sex tape. So even if they came out with technology tomorrow that was able to debunk a deepfake immediately, which doesn't exist, by the way, those websites that are sexualizing, fetishizing, and harming women would continue to thrive, because disproving deepfakes is not the problem. The problem is this idea that it's okay to take control of women's bodies, manipulate them, do whatever you want with them, and make that public. That is what we need to address. And it's really a sickening symptom of online culture right now.
Kathryn: It is, it definitely is. What can someone do if they're experiencing technology-enabled abuse, whether it's digital stalking, harassment, or this deepfake pornography issue? What are their options?
Adam: There are options. I will say that the current tech landscape really empowers people who want to do harm and disempowers victims, because as we talked about before, it's really easy to make these online accounts and anonymously harass somebody. That's why I'm heavily focused on prevention: don't wait until you feel like you're being targeted. Not that there's anything wrong with acting then, but it's better if we can take steps in the near term, before there are any issues. For example, when I provide trainings, I tell the victim-serving professionals: every time a client comes in, assume that their devices and their accounts are compromised, whether it's true or not, and engage in tech-savvy safety planning with them. Because as you pointed out, someone only needs access to your device one time to make all these changes and get into your calendar, or what have you.
I'll spin that another way. If you use your partner's laptop to log into your Amazon or your Gmail and it saves the password, then that person can log into your accounts whenever they want. Amazon, for example, saves the password and now views that laptop as safe. So here are some things folks should do when they get out of relationships, let's talk about intimate partner relationships for a second: they should always update their passwords. And the smartphone market is dominated by Android and Apple, and on both of those you can go in and remove all trusted devices. So even if somebody has your password, if the device they're logging in from is no longer trusted, then Apple is going to require two-factor authentication. Updating passwords and removing trusted devices are the big ones, especially for people who are at high risk, like victims of domestic violence, sexual violence, human trafficking, stalking, or elder abuse, where the person trying to harm them had access to their devices.
So I know that maybe sounds oversimplified, but those really are two critical steps folks can take to make sure that even if they're not targeted now, they're less likely to be targeted in the future, because the reality is the person trying to harm them is not an expert, just somebody who had an opportunity. If they're trying to log in and they can't get in because the password is no longer valid, they're very unlikely to try to hack the account at that point, because they don't know how to do it. They're at least going to stop trying to force their way into the account in that manner. I'm not saying they won't try something else, but at least you can throw up a blockade, right?
Kathryn: I definitely agree when you say that they're not experts, they just have an opportunity. And that's the biggest thing: most people are not experts, most people are not hackers. They don't know how to hack into your computer or put a keylogger on it, things like that, or even install actual stalkerware, but they can do so many things without having to do any of that.
Adam: Yeah, a hundred percent. Why would they do that? That's extra. I find that perpetrators of this type of abuse are inherently lazy. They're not going out and teaching themselves hacking skills or paying for stalkerware or anything. I mean, some do, but most of them aren't. Why would they, when they can log into their partner's iCloud account and monitor their text messages, see where they're going, monitor their location activity? It's the path of least resistance. And to that point, I hear a lot from victims saying, "Oh, I'm not tech-savvy, I'm not tech-savvy." The reality is they're plenty tech-savvy. They know what a smartphone is. They know what the internet is. They know what location apps do. All they have to do is pivot and think, "Okay, now how are these things being misused to harm me?" I don't have to explain to a victim-serving professional what Instagram is. When a victim says, "Hey, I think my Instagram account has been hacked," they know what that is, they know that there are settings, they know how to reset a password. So they have the fundamental skills. With survivors, it's just a matter of getting them out of that mindset of being intimidated or overwhelmed by all the different ways in which they could be compromised. Because when you start to think, "Oh my gosh, I use my phone for everything. All my accounts could be compromised," it can become overwhelming, and then you don't do anything because you're paralyzed. So just take it one step at a time: start with your most used account and move on to the next one. You don't have to do it all in one day.
Kathryn: Very true. And I appreciate that you talked about prevention, because here at Garbo, prevention is the name of the game, right? We believe in proactively preventing crimes by giving you access to information about a person, so that you know if they have a history of gender-based violence. That way we can proactively prevent dangerous meetings from occurring, or at least arm you with information before you go into that meeting. And in the same way, I think it's so true that we need to focus on preventing these types of crimes from occurring. I don't allow anyone to use my device. Even if a stranger on the street asks if they can make a call, I just say, "I'm so sorry, I can't do that." I don't let friends use it, and I change my passwords, like, monthly. I'm crazy about prevention, because you truly never know who might be a bad apple or a bad actor.
Adam: A hundred percent. Yeah, I wish everybody adopted that way of thinking. It would greatly reduce a lot of the issues we face. And I'm really excited about what you're doing, because I can't tell you how many domestic violence survivors I've worked with who, after getting a restraining order against someone, feel compelled to warn others about that person, or want to publish an article about them. And I always have to tell them, "Look, you likely don't want to take up that endeavor." I really value what they want to do, but I don't think they should be obligated to do it. So it would be really great to be able to say, "Actually, there's a great tool out there that other potential victims of this person can use to determine whether or not that person is a danger to them." I think that's a tool that will be really, really beneficial to the domestic violence survivor community.
Kathryn: Especially because our whisper networks don't work the way they used to, right? We used to be able to warn an individual about someone because we kind of knew everyone within six degrees of separation. Your friends were all connected in a network, and if you saw someone interacting with a dangerous person, you could say, "Hey, Joe's a bad person, stay away from him." But now, because of online dating and online everything, our networks have grown disparate; they're no longer connected. So whisper networks don't work the way they used to, and that's really where Garbo can step in and begin to be a digital whisper network, in a sense. Instead of relying on people warning each other, it uses the data that's out there, public records and reports, while not putting the victim in harm's way, right? So many people, myself included, have been sued civilly for talking about their abuser, and that is a big thing that's happening: abusers are suing their victims for talking about the abuse. It's insane, but it's the reality of the situation. The platform enables people to express themselves and get those records and reports out there while protecting themselves at the same time.
Adam: Yeah, don't, don't get me started on perpetrators of harm suing victims...
Kathryn: It's crazy, it's crazy. So as we get to the end of this conversation, what would you like to leave our listeners with?
Adam: Oh, that's a great question. I'm going to revisit something I already said, which is: trust your instincts. People who are in abusive relationships especially have often been made to believe that they're not smart, or not capable, or aren't techie, or don't know what they're doing... And because they've been convinced of these things, they tend to dismiss their instincts. But our instincts are designed to keep us alive. They're a fundamental core of who we are, very primitive, very lizard brain. And I've seen it enough now. If you feel like you're being watched, or a device is compromised, or someone knows where you are, or you're getting harassed online, don't discount it. Don't convince yourself that it's not there or think that you're paranoid.
Really, give yourself a little bit of a boost and look into it. It can be as easy as changing a password. You don't have to run some huge diagnostic test. It might get there at some point, but in the majority of cases, it's as simple as updating passwords and making sure you have two-factor authentication on. These are really basic online safety steps we can all take. For as much as we rely on technology to do everything, we don't spend a lot of time making it safe. So I would tell people to do what you do: trust your instincts, keep it simple, update your passwords on the regular, turn on two-factor authentication, and remove any trusted devices.
Reckoning is a podcast produced by Garbo, a tech non-profit building a new kind of online background check. Our executive producer is Amani Nichols with Whisper and Moderator. Please subscribe to the show via your favorite podcast app. And as always, please send your questions and comments to hello@garbo.io.
We work with online platforms to help proactively protect their communities through our innovative background check system. Get in touch to learn more about how we can integrate Garbo or help you proactively screen users at scale.