Guide to Trust & Safety and Content Moderation
Trust & Safety and Content Moderation stand as cornerstones of a responsible and secure online environment
A comprehensive guide to understanding both Trust & Safety and Content Moderation, including what they are, why they’re important, and how to do them well.
The Importance of Trust & Safety
Trust is an essential part of customer loyalty.
Without it, customers won’t continue to do business with you, and they won’t recommend you to their friends. In fact, trust is a factor for the 41% of American consumers who report switching to a different brand or business.
Building a safe environment for your customers, where they aren’t faced with offensive content, spam, fraud, or harassment, increases trust in your brand. To manage these risks, many businesses are building Trust & Safety and Content Moderation into their operations plans. At Peak Support, we work closely with many of our clients, from gaming platforms to social media sites, to develop and run comprehensive Trust & Safety teams and programs that keep their users safe. In this guide, we’ll share what we’ve learned and help you develop your own content guidelines and Trust & Safety team.
What are Trust & Safety and Content Moderation?
Trust & Safety meaning: The process of monitoring, policing, and preventing fraudulent and/or abusive behavior on your platform. Trust & Safety includes the policies you create, the interactions you have with users, and the steps you take to ensure your platform is a safe experience for everyone involved.
Content Moderation meaning: The process of reviewing content posted on social media or on your platform to make sure it meets your company’s guidelines. Much of this moderation happens on social media platforms, but it’s relevant for every company, because any company can find itself with violent, racist, or discriminatory content on its public forums.
Who Needs a Trust & Safety Team?
Both Trust and Safety and Content Moderation are most often associated with social media platforms or online gaming communities. However, any company with an online presence can quickly find itself in need of moderation policies.
While you might not need an entire team dedicated to monitoring content, you at least need to be thinking about these policies. If you don’t have Content Moderation standards, and if your customer support team isn’t trained on them, your brand is at risk. When something unusual arises, such as your company sponsoring something controversial or becoming the target of fraud, you need to be able to act purposefully. If your agents aren’t prepared, things can get out of hand quickly, and your customers will lose trust in your company.
A well-thought-out set of policies helps mitigate the spread of misinformation, fraud, and abuse and ensures that your customers (whether online gamers, social media users, or consumers) can have the best experience.
Completing a risk assessment can open your eyes to the various ways Content Moderation can help secure your product and your brand. Here are some of the Trust and Safety problems your team may encounter:
- Players creating multiple accounts to “game” the system
- Commenters spreading misinformation
- Spam
- Racism or hate speech
- Unauthorized requests for users’ personal information
- Pornography
- Harassment of users on your social media profiles
Ask yourself: Where do you see risk to your reputation, brand, or users? What type of fraud could be perpetrated through your platform? Educate yourself on what your risks are and how you can start to mitigate them. One step is to build a Trust and Safety team.
7 Steps to Building a Great Trust and Safety Team
1. Create Clear Guidelines For Your Team
Agents working in Trust and Safety and Content Moderation shouldn’t be making difficult decisions on every flagged post or account.
Instead, they should be following clear-cut guidelines that help divide the grey area into black and white.
Clear policies are essential for setting the right expectations for your customers. Without digital content guidelines, it becomes more difficult to hold customers accountable for their words.
To create helpful digital content guidelines for your customers and your agents, start by defining what you want to achieve through your Trust and Safety policies. What are your Content Moderation goals? What kind of experience are you trying to provide your customers with? Are you focused solely on preventing fraud? Or does your customer base want stronger moderation in place?
Include specific examples in your internal content guidelines. Terms like “objectionable content” or “offensive language” are far too vague to help the agents who are making decisions about what to allow.
Finally, be open to recalibrating your guidelines as needed, based on feedback from the team and evolving trends. Ask the team where they need more guidance, or what isn’t clear enough. Add more specific examples, and adjust as needed.
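One way to keep guidelines unambiguous is to store them as structured entries that pair each policy category with concrete examples and a default action. The sketch below is illustrative only; the categories, examples, and actions are hypothetical placeholders for your own policies.

```python
# A minimal sketch of machine-readable moderation guidelines.
# Categories, examples, and actions are hypothetical placeholders;
# replace them with your own policies.
GUIDELINES = {
    "hate_speech": {
        "examples": ["slurs targeting a protected group", "coded hate symbols"],
        "action": "remove_and_warn",
        "escalate_if_unsure": True,
    },
    "spam": {
        "examples": ["repeated identical links", "bulk promotional posts"],
        "action": "remove",
        "escalate_if_unsure": False,
    },
}

def lookup_action(category: str) -> str:
    """Return the default action for a category, or route to a human."""
    entry = GUIDELINES.get(category)
    return entry["action"] if entry else "escalate_to_human"
```

A structure like this makes it easy to add new examples as the team recalibrates, and it keeps the same source of truth in front of every agent.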
2. Create an Effective Hiring Process
Finding the right people for your Trust and Safety team is paramount to your success.
When hiring for our Trust and Safety teams, we look for someone with a “facilitator” mindset who can be objective and examine their own biases, even in stressful situations. To find that person, we define a specific set of skills in the Trust and Safety job description.
We screen for those skills during resume review and with a remote, written assessment that requires candidates to do some of their own research and draft responses. Applicants also meet with different members of the team, including a rep and a team lead, to assess their abilities and whether they’re a good fit for the team culture.
Finally, because Trust and Safety agents often have access to back-end systems and data, a background check is necessary to protect your customers’ sensitive information and prevent malicious behavior.
For every team, we seek diverse, geographically dispersed agents so that we can bring as many perspectives to the table as possible. Remote hiring makes this much easier. Note that offshore agents may not have the same cultural background or context as onshore agents, so account for that in training.
3. Create Detailed Training
After you find the right Trust and Safety team, it’s time to onboard them with a well-designed training program.
This isn’t a role where you can throw agents into the deep end and hope they swim. Without training, you’re putting your brand at risk.
Start by showing Trust and Safety agents lots of real-world examples, both negative and positive. You want to get them familiar with the types of content they’ll run into on a daily basis.
Use job shadowing to show agents the workflows they need to know and how to use any required tools (such as internal tools, social media platforms, and help desks).
The new agent can sit beside an experienced team member (either in-person or over Zoom) as they go through their day, working together and discussing questions as they come up.
Once new agents are familiar with the guidelines, create a queue of straightforward tickets for them to work through. Offer suggestions and check their work as they go; immediate feedback will help them progress faster.
Training becomes a lot easier if you put together a thorough set of resources and documentation that new agents can refer to, including:
- A spreadsheet of unusual or grey-area examples, along with the steps that were taken.
- Self-guided training tools such as Lessonly to review guidelines and action steps.
- Easily searchable, thorough documentation of Content Moderation guidelines.
4. Find Tools To Help
Depending on the scale of your Content Moderation program, you may not be able to catch everything yourself.
Using software and algorithms to help moderate can ensure nothing falls through the cracks, freeing human eyes to spend more time sorting through the grey area rather than monitoring a busy platform.
If Content Moderation is a concern for your platform, a basic algorithm (either designed in-house or a pre-built tool) can help categorize fraudulent behavior and give your team a head start. Tools like Sift and Utopia come with a catalog of flagged material.
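As an illustration of how a basic in-house rule set can give your team that head start, the sketch below auto-flags posts that match known bad patterns and routes everything ambiguous to human review. The patterns and category names are hypothetical examples, not a production rule set.

```python
import re

# Illustrative patterns only; a real rule set would be far larger
# and maintained by the Trust and Safety team over time.
RULES = {
    "spam": re.compile(r"(buy now|free money|click here)", re.IGNORECASE),
    "personal_info_request": re.compile(r"(send me your password|social security number)", re.IGNORECASE),
}

def triage(post_text: str) -> str:
    """Return a category for obvious violations, else queue for human review."""
    for category, pattern in RULES.items():
        if pattern.search(post_text):
            return category  # clear-cut match: auto-flag for removal workflow
    return "human_review"    # grey area: let an agent decide

print(triage("Click here for FREE MONEY"))   # -> "spam"
print(triage("This game update is great!"))  # -> "human_review"
```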
However, automation isn’t the whole answer. Technology can be biased, and with the current state of AI Content Moderation, a system is only as knowledgeable as the humans who train it. That means humans still need to be involved to catch mistakes, audit the algorithms, and spot new trends where users have found ways around the content filters.
Besides an automated content flagging system, you might also find a logging system helpful. Being able to see where agents are looking and what changes they’re making is important for data security and for the ability to audit or QA their workflows. This is especially important for remote teams.
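A logging system can start as simply as an append-only audit trail recording who took which action, and why. Here is a minimal sketch; the field names are assumptions rather than any standard schema.

```python
import json
import time

def log_moderation_action(agent_id: str, ticket_id: str, action: str, reason: str,
                          logfile: str = "moderation_audit.jsonl") -> None:
    """Append one moderation decision to an audit trail for later QA review."""
    record = {
        "timestamp": time.time(),
        "agent_id": agent_id,
        "ticket_id": ticket_id,
        "action": action,   # e.g. "removed", "approved", "escalated"
        "reason": reason,
    }
    with open(logfile, "a") as f:
        f.write(json.dumps(record) + "\n")

log_moderation_action("agent_42", "ticket_981", "removed", "hate speech, rule 3.2")
```

An append-only file like this (or its equivalent in your help desk) gives QA reviewers and security audits a single place to reconstruct any decision.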
5. Stay Agile and Prepared for Change
Content Moderation is constantly changing and your team needs to be prepared for this.
New conspiracy theories, new racial codes and symbols, and controversial current events pop up every day. This requires that the team have an ongoing discussion, so they can stay up to date and keep customers safe. Building a process and a safe space for these discussions to happen is paramount.
Content Moderation updates will most often come from agents, rather than leadership. One person can’t be responsible for collecting every new issue alone. Instead, empower the team to collect and document new issues together.
Invest in building a culture where everyone knows they can bring up feedback and ask questions. Because each agent might only see a trending issue once, it’s important to share hunches and raise questions openly. When controversial topics come up, agents need a communicative environment where they feel safe discussing them; a private Slack channel can be that space for Trust and Safety or Content Moderation discussions.
Tactics to use:
- Teach agents how to fact-check and investigate new issues, for example with factcheck.org or Snopes, before bringing them to the team.
- Plan ahead for major events. Be aware of any controversial issues that are popping up in the news.
- Create a shared document where agents can log new issues and examples as they arise. Review this document often to spot trends.
- Investigate and annotate flagged content so that other agents can see the thought process behind each decision.
- Review updates in a weekly team meeting and discuss any unresolved issues as a team so everyone is on the same page.
6. Maintain Team Morale
Working in Trust & Safety can be draining, as agents only interact with the most difficult, offensive, or fraudulent cases. It can be hard to remember why we do what we do, or that there is an entire audience of happy customers out there who follow the rules and appreciate the safe, risk-free experience. Fortunately, there are ways to maintain team morale and avoid burnout.
7. Measure Your Success
How do you know that you are doing Content Moderation well? What does it look like when Trust and Safety work is done correctly? It can be difficult to measure the success of your program when ideally, no one notices the work behind the scenes to keep your customers safe. However, there are three aspects of this work that are measurable: the productivity of the team, the experience of the customers on your platforms, and the quality of the audit work you’re doing.
Productivity:
Measuring Content Moderation and Trust and Safety productivity is similar to measuring other customer service metrics. How quickly are reports being responded to? It’s important to respond quickly and remove unsafe content or activity promptly.
Consider tracking the length of time between the activity being reported or flagged and the ticket being closed with a decision. We aim for 24 hours but frequently act more quickly. You can also track the number of tickets each agent responds to per hour; however, moving too quickly can create quality issues by rushing decisions. Find a balance between moving fast and not rushing.
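As a sketch of tracking that balance, the snippet below computes the median time to decision and the share of tickets closed within the 24-hour target; the ticket timestamps are hypothetical stand-ins for data pulled from your help desk.

```python
from datetime import datetime, timedelta
from statistics import median

# Hypothetical (flagged_at, closed_at) pairs pulled from your help desk.
tickets = [
    (datetime(2024, 1, 1, 9, 0), datetime(2024, 1, 1, 15, 0)),
    (datetime(2024, 1, 1, 10, 0), datetime(2024, 1, 2, 16, 0)),
    (datetime(2024, 1, 2, 8, 0), datetime(2024, 1, 2, 9, 30)),
]

TARGET = timedelta(hours=24)
durations = [closed - flagged for flagged, closed in tickets]
within_target = sum(d <= TARGET for d in durations) / len(durations)

print(f"Median time to decision: {median(durations)}")
print(f"Closed within 24 hours: {within_target:.0%}")
```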
Customer Experience:
One of the primary reasons to have Trust and Safety processes is to improve and protect the customer experience. A well-executed program will increase the safety of your users and build trust in your product and platforms. These are measurable goals.
- How many fraud reports do you receive?
- What is the impact on NPS after implementing a Trust and Safety program? Are people more likely to refer you to friends and family? (See the calculation sketch after this list.)
- Add a question to customer surveys to ask about trust. “On a scale from 1 to 5, how much do you agree with the following statement: I trust [Company] to keep my information safe.”
- Read reviews of your company on Trustpilot and other websites where customers can rate their experience.
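For reference, NPS is computed from the standard 0-to-10 “how likely are you to recommend us?” question: the percentage of promoters (scores of 9 or 10) minus the percentage of detractors (scores of 0 through 6). A minimal sketch of the calculation, with made-up scores:

```python
def nps(scores: list[int]) -> float:
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(s >= 9 for s in scores)
    detractors = sum(s <= 6 for s in scores)
    return 100 * (promoters - detractors) / len(scores)

print(nps([10, 9, 8, 7, 6, 3, 10]))  # -> roughly 14.3
```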
Occasionally a customer or user will report something that is offensive to them but doesn’t violate the Content Moderation guidelines. Sometimes content will stay up even if it reflects a viewpoint we personally disagree with. This can result in customers leaving bad reviews, but it doesn’t necessarily mean the Trust and Safety program isn’t doing its job.
Quality of Moderation:
How well are agents adhering to the guidelines you’ve developed? Are they catching every issue? Are they being too strict? Without measuring the quality of moderation, it’s impossible to know.
Quality assurance (QA) can help ensure consistency in your Content Moderation by adding another pair of eyes.
Each week, review a specified number of tickets or decisions using a defined rubric. Discuss any feedback in 1-on-1s with team members so they can improve going forward. Learn more about starting a QA program in our free guide.
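To make the weekly review concrete, here is a minimal sketch of sampling a fixed number of each agent’s closed tickets for QA. The ticket fields, sample size, and rubric criteria are assumptions, not part of any particular help desk’s API.

```python
import random

def sample_for_qa(tickets: list[dict], per_agent: int = 5, seed: int | None = None) -> list[dict]:
    """Randomly sample up to `per_agent` closed tickets per agent for weekly QA."""
    rng = random.Random(seed)
    by_agent: dict[str, list[dict]] = {}
    for t in tickets:
        by_agent.setdefault(t["agent_id"], []).append(t)
    sample = []
    for agent_tickets in by_agent.values():
        sample.extend(rng.sample(agent_tickets, min(per_agent, len(agent_tickets))))
    return sample

# A hypothetical rubric: the reviewer scores each sampled ticket pass/fail per criterion.
RUBRIC = [
    "decision matches the written guidelines",
    "reasoning is documented on the ticket",
    "response met the time-to-decision target",
]
```

Random sampling keeps the review fair across agents, and a fixed rubric keeps feedback consistent from week to week.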
Ready to chat with us?
We’ve helped dozens of innovative companies launch and scale their customer service teams. Whatever you need to grow your business, our flexible offerings can fit. Let’s chat about how outsourcing can unlock new levels of growth for your business.
Talk to Sales