Harm reduction for developer relations teams

Written by Knut Melvær


I wish I didn’t have to write this post. I think it’s unfair that we even have to take the measures outlined here. But the fact is that being a public person on the web comes with risks to your health and wellbeing, especially if you’re a person of color, identify as a woman or non-binary, or find yourself at an intersection of these. It’s fundamentally unfair that I, as a white guy, mostly get questions about my code editor theme or a follow-up to the code I demoed, while my colleagues in the industry who happen to present as female get comments about how they look, have their competence questioned, get mansplained, or are outright stalked and harassed with disgusting messages in their DMs. (I recommend this post to get an idea of what we’re talking about.) And it’s unfair that for them, this is normal.

Sure, we can tell ourselves that “it’s the Internet, what did you expect?” But isn’t that just accepting that we can’t expect better? Is that really living by our values? And regardless of expectations, receiving negative comments and harassment will get to you, even if you grow a “thick skin.” That worries me as a Developer Relations manager, and it should worry you too, especially if you have employees whose job it is to be public-facing on behalf of your company. Harassment and unwanted objectifying attention are obvious sources of burnout. They can suck the joy out of communicating and teaching technology.

You need a harm reduction strategy

I’ve tried to ask around about how companies with developer relations teams approach the fact that their employees get harassed as part of their job. I haven’t succeeded in getting many clear answers. My impression is that it’s usually up to the individual team member to figure it out. Typically, people form informal private support groups or alliances in order to vent and cope. In developer relations circles you’ll be advised to be liberal with the block button on Twitter. This isn’t sustainable, and it leaves the responsibility with those being abused.

So, together with my team and leaders, I have figured out some practical strategies for reducing harm in developer relations teams. Harm, in this context, is mostly the psychological trauma and threat to personal safety that comes from attention that transgresses your boundaries and happens without consent, and from messages that are hateful and diminish our role as competent, professional people. Harm reduction is the set of measures, strategies, and practices we put in place to prevent this.

We hope this can spark a conversation on how we can promote better environments for technical educators to do their best work. Ultimately, it’s about retaining people who are still underrepresented in tech, and who are crucial in inspiring and relating to people who want to get into tech and are looking for someone who looks like themselves. And, of course, these strategies aren’t exclusive to developer relations.

Doing the right thing out of human compassion and kindness also makes sense from a business perspective. Making sure that the people whose job it is to communicate your technology and build community can grow and thrive in their work is directly tied to your ability to scale your endeavor. In other words, there should be plenty of incentive to do these things better.

Support psychological well-being

At Sanity, one of our benefits is access to mental health services and therapists, as well as physical exercise (personal trainers and company workout sessions in core strength and yoga). We support a healthy work/home-life balance, we have mandatory vacation time, and we’re building a culture where you are expected and allowed to prioritize your work so that you can focus on what matters and feel good about the things you don’t have time to do. I should know; one of my own development areas has been to work less. Our Slack is mostly inactive during weekends, and if something is posted on a Saturday, it’s either an emergency or just surplus banter.

Part of this is also normalizing going to therapy. As someone who has suffered from recurring depression and didn’t seek help as early as I should have, I wholeheartedly subscribe to this. Still, there can be a certain stigma around having mental health issues, especially in professional life. But you would be surprised (or maybe not) how many people have a story about mental health challenges, either their own or a close one’s, once you get through that outer shell. And I don’t know anyone who didn’t wish they had sought help earlier. Taking responsibility for one’s mental health is a strength, not a weakness. As a manager, the least you can do is be open to having that conversation and proactively remove the stigma. It can also help to learn some basics about common mental health conditions like depression, anxiety, and trauma.

Be invested in your team's emotional work

Being called incompetent or “just a talking head” in the comment section on YouTube, receiving marriage proposals, or being contacted by the same person in DMs across multiple channels just because you posted a video tutorial will get to you at some level. That is stuff you have to digest and work through. One negative comment from a stranger can ruin the rest of your day. The “working through” bit is emotional labor. It’s impossible to avoid shitty feedback from strangers on the web, but it can help to share that reality with someone.

As a manager, I’m invested in my team’s well-being. Being inquisitive about their experiences when they are out representing Sanity makes it easier for me to help them navigate our goals and expectations. Of course, for most people, it requires a level of trust to share things like this. We’re all too used to saying “it’s fine” when it’s really not. It also requires a level of preparedness from you as a manager. Fortunately, there are people like Lara Hogan who have written about how you can take part in your reports’ emotional work, and how you can take care of yourself.

Choose tools that support community moderation

While I think the awareness and support systems you’re able to build in DevRel teams are the most important, there are some practical approaches too.

There is a lot of attention on community-led growth, and with it comes a lot of tools and services to accommodate it. Inviting people into a space on the Internet, especially when you hope to draw a large crowd, also comes with responsibility. There are good reasons for large social media services to have block buttons and controls around who can engage with you. As a company, you want controls that let you remove unwanted content in your community spaces. Having a code of conduct is not enough; you should be able to enforce and promote it too.

Moderation options in Discord

One of the major drawbacks of having our community in a Slack workspace is that we don’t have great controls for moderation (automatic notifications for keywords, etc.), and its members can’t block other users, which is a primary measure against harassment. If we were to choose again (which we might at some point), we would probably go for a platform like Discord or Discourse, which come with more advanced moderation features and let their users protect themselves from unwanted DMs. To compensate, we have to be diligent in monitoring and make it easy for folks to report transgressions. We have also put work into our Code of Conduct to make it enforceable and explicit about consequences.
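To make the keyword-notification idea concrete, here is a minimal sketch of a Discord bot that flags messages containing watched terms to a private moderators channel, using the discord.py library. The keyword list, channel ID, and token variable are hypothetical placeholders, not our actual setup.

```python
# Minimal keyword-alert bot sketch using discord.py.
# KEYWORDS, MOD_CHANNEL_ID, and DISCORD_BOT_TOKEN are placeholder
# values for illustration only.
import os

import discord

KEYWORDS = {"example-slur", "example-threat"}  # terms to watch for
MOD_CHANNEL_ID = 123456789012345678  # private moderators-only channel

intents = discord.Intents.default()
intents.message_content = True  # required to read message text

client = discord.Client(intents=intents)

@client.event
async def on_message(message: discord.Message) -> None:
    if message.author.bot:
        return  # ignore other bots (and ourselves)
    text = message.content.lower()
    if any(keyword in text for keyword in KEYWORDS):
        mod_channel = client.get_channel(MOD_CHANNEL_ID)
        if mod_channel is not None:
            await mod_channel.send(
                f"Flagged message from {message.author} in "
                f"#{message.channel}: {message.jump_url}"
            )

client.run(os.environ["DISCORD_BOT_TOKEN"])
```

The specific bot matters less than the fact that the platform’s API makes this kind of monitoring possible at all, so moderators can be alerted instead of having to read everything.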

Automate and collaborate on blocking

If you have developer relations specialists who are active and have a growing audience on Twitter, that will also be a source of harassment if they don’t identify as a white male. While Twitter makes it relatively easy to block people, you can end up with a large influx of unwanted attention as a by-product of doing a great job and making an impact. At Sanity, we have started using blockparty.app, which proactively filters and mutes unwanted content and gives you better tools for blocking accounts. The (cheap) premium edition even has a helper view that lets colleagues moderate block suggestions for you. Our experience with it so far has been great.

The Block Party filter configuration screen showing the free options and the premium options.
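If a third-party service isn’t an option, the collaborative part can also be approximated directly against Twitter’s API. Here is a minimal sketch, assuming the team maintains a shared CSV of screen names to block; the file name and credential variables are hypothetical, and it uses the tweepy library.

```python
# Sketch: apply a team-maintained blocklist via the Twitter API.
# shared-blocklist.csv and the credential environment variables are
# placeholders; any OAuth 1.0a user-context credentials would work.
import csv
import os

import tweepy

auth = tweepy.OAuth1UserHandler(
    os.environ["TWITTER_API_KEY"],
    os.environ["TWITTER_API_SECRET"],
    os.environ["TWITTER_ACCESS_TOKEN"],
    os.environ["TWITTER_ACCESS_SECRET"],
)
api = tweepy.API(auth)

# One screen name per line, e.g. collected in a shared spreadsheet
# that colleagues review together before it gets applied.
with open("shared-blocklist.csv", newline="") as f:
    for row in csv.reader(f):
        screen_name = row[0].strip().lstrip("@")
        if screen_name:
            api.create_block(screen_name=screen_name)
            print(f"Blocked @{screen_name}")
```

The review step is the point: colleagues can curate the list so the person being targeted doesn’t have to look at every abusive account themselves.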

Turn off or pre-moderate all comments

It’s commonly known that the comment section on YouTube can be challenging. We have set our comments to be pre-moderated by default. And we’re not the only ones. Jason Lengstorf, VP of Developer Experience at Netlify and host of “Learn with Jason,” has found it necessary to do the same for his show:

[Embedded tweet from Jason Lengstorf]

It’s not ideal, because it takes time to moderate comments, and comments are a signal of engagement. But making sure we’re not accommodating and hosting hateful and misogynist content is part of living out a code of conduct. This is not only about the well-being of our team; it’s also about keeping up the standards of our community.
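The moderation queue doesn’t have to live in YouTube Studio alone: the YouTube Data API exposes comments held for review, so the triage work can be shared or scripted. A minimal sketch, assuming you already have OAuth credentials for the channel (the credentials file and video ID below are placeholders), using google-api-python-client:

```python
# Sketch: review YouTube comments held for moderation via the Data API.
# credentials.json and the video ID are placeholders for illustration.
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

creds = Credentials.from_authorized_user_file("credentials.json")
youtube = build("youtube", "v3", credentials=creds)

# List comment threads waiting for review on one of your videos.
response = youtube.commentThreads().list(
    part="snippet",
    videoId="YOUR_VIDEO_ID",
    moderationStatus="heldForReview",
    textFormat="plainText",
).execute()

for thread in response.get("items", []):
    comment = thread["snippet"]["topLevelComment"]
    print("Held for review:", comment["snippet"]["textDisplay"][:100])
    # After a human decision, publish or reject the comment:
    # youtube.comments().setModerationStatus(
    #     id=comment["id"], moderationStatus="published"
    # ).execute()
```

A script like this lets a colleague or manager take a shift in the queue, so the person being targeted isn’t the one reading the worst of it.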

Set (formal) expectations with partners

While writing this post and reflecting on past experiences, a final approach came up. In developer relations we often collaborate with partner companies and appear as guests on their channels. Then we’re beholden to other audiences and our hosts’ practices around moderation. We’re working on being more intentional about setting expectations around moderation and harm reduction. Even though partners might not always follow up, asking will set a precedent and make content creators aware of problems they haven’t considered. It’s basically asking if there’s a code of conduct and how it’s enforced.

Whenever there is a transaction involved, for example if it’s sponsored content, you can even make comment moderation part of the contract. The expectation is also more impactful when it comes from the person running the developer relations team, or whoever decides whether an agreement should be made. It also lets your report focus 100% on making their content, knowing that someone has a stake in their safety.

Have conversations like this in the industry

I don’t think we have figured out online harm reduction in tech yet. However, it’s not hard to start employing some preventive measures. The first step is becoming aware that harassment like this is rampant for non-white and non-male-presenting devrels (as well as many other groups). Even though I believed myself to be relatively tuned in to this, I was taken aback seeing some of the messages that people (only men) have sent to my team. Over the past few years, the recipients of online harassment have become more vocal about their experiences. Now it’s time for those of us who are leaders and have influence to do our part of the work and figure out how we can be accountable and set higher standards.

If you have thoughts, questions, reflections, or criticisms, feel free to contact me at knut@sanity.io or @kmelve on Twitter. My DMs are open (which is not practically possible for everyone).