AI is helping brands avoid controversial influencer partnerships  

Influencer partnerships can be great for brands looking to pump out content that promotes their products and services in an authentic way. These types of engagements can yield significant brand awareness and brand sentiment lift, but they can be risky too. Social media stars are unpredictable at the best of times, with many deliberately chasing controversy to increase their fame. 

These antics don’t always reflect well on the brands that collaborate with especially attention-hungry influencers, leaving marketers no choice but to conduct careful due diligence on the individuals they work with. Luckily, that task can be made much easier thanks to the evolving utility of AI.  

Lightricks, a software company best known for its AI-powered video and image editing tools, is once again expanding the AI capabilities of its suite with this week’s announcement of SafeCollab, an AI-powered influencer vetting module that lives within the company’s Popular Pays creator collaboration platform and automates the vetting process for marketers.

Traditionally, marketers have had no choice but to spend hours researching influencers’ backgrounds, combing through years’ worth of video uploads and social media posts. It’s a lengthy, manual process that is ripe for automation with intelligent tools.

SafeCollab provides that intelligence with its underlying large language models, which do the job of investigating influencers to ensure the image they portray is consistent with brand values. The LLMs perform what amounts to a risk assessment of creators’ content across multiple social media channels in minutes, searching through hours of videos, audio uploads, images and text.  

In doing this, SafeCollab significantly reduces the time it takes for brand marketers to perform due diligence on the social media influencers they’re considering partnering with. Likewise, when creators opt in to SafeCollab, they make it easier for marketers to understand the brand safety implications of working together, reducing friction from campaign lifecycles. 

Brands can’t take chances 

The idea here is to empower brand marketers to avoid working with creators whose content is not aligned with the brand’s values – as well as those who have a tendency to kick up a storm.  

Such due diligence is vital, for even the most innocuous influencers can have some skeletons in their closets. A case in point is the popular lifestyle influencer Brooke Schofield, who has more than 2.2 million followers on TikTok and co-hosts the “Canceled” podcast on YouTube. With her large following, good looks and keen sense of fashion, Schofield looked like a great fit for the clothing brand Boys Lie, which collaborated with her on an exclusive capsule collection called “Bless His Heart.” 

However, Boys Lie quickly came to regret its collaboration with Schofield when a scandal erupted in April after fans unearthed a number of years-old social media posts where she expressed racist views.  

The posts, which were uploaded on X between 2012 and 2015 when Schofield was a teenager, contained a string of racist profanities and insulting jokes about Black people’s hairstyles. In one post, she vigorously defended George Zimmerman, a white American who was controversially acquitted of the murder of the Black teenager Trayvon Martin.  

Schofield apologized profusely for her posts, admitting that they were “very hurtful” while stressing that she’s a changed person, having had time to “learn and grow and formulate my own opinions.”  

However, Boys Lie decided it had no option but to drop its association with Schofield. After posting a statement on Instagram saying it was “working on a solution,” the company quietly withdrew the clothing collection the pair had collaborated on.

Accelerating due diligence  

If the marketing team at Boys Lie had access to a tool like SafeCollab, they likely would have uncovered Schofield’s controversial posts long before commissioning the collaboration. The tool, which is a part of Lightricks’ influencer marketing platform Popular Pays, is all about helping brands to automate their due diligence processes when working with social media creators.  

By analyzing years of a creator’s post history across platforms like Instagram, TikTok, and YouTube, it can check everything they’ve posted online to make sure there’s nothing that might reflect badly on a brand.

Brands can define their risk parameters, and the tool will quickly generate a risk assessment, so they can confidently choose the influencers they want to work with, safe in the knowledge that their partnerships are unlikely to spark a backlash.
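To make the workflow concrete, here is a minimal, hypothetical sketch of how brand-defined risk parameters might feed a vetting report. SafeCollab’s actual pipeline relies on proprietary LLMs; in this runnable toy version, a simple keyword classifier stands in for the model, and all category names, weights, and function names are illustrative assumptions, not SafeCollab’s API.

```python
# Hypothetical sketch: brand-defined risk parameters -> creator risk report.
# A keyword lookup stands in for the LLM classifier used by the real product.

# Per-category severity weights a brand might configure.
DEFAULT_RISK_WEIGHTS = {
    "profanity": 2,
    "violence": 5,
    "substance_use": 3,
}

# Toy stand-in for an LLM: maps a post to the risk categories it triggers.
CATEGORY_KEYWORDS = {
    "profanity": {"damn"},
    "violence": {"fight", "punch"},
    "substance_use": {"vodka", "beer"},
}

def classify(post: str) -> set[str]:
    """Return the set of risk categories a single post falls into."""
    words = set(post.lower().split())
    return {cat for cat, kws in CATEGORY_KEYWORDS.items() if words & kws}

def risk_report(posts: list[str], weights=DEFAULT_RISK_WEIGHTS) -> dict:
    """Aggregate a per-creator risk score and list of flagged posts."""
    flagged, score = [], 0
    for post in posts:
        cats = classify(post)
        if cats:
            flagged.append((post, sorted(cats)))
            score += sum(weights[c] for c in cats)
    return {"score": score, "flagged": flagged}

report = risk_report([
    "loving this new jacket",
    "wild night, too much vodka",
    "want to punch monday in the face",
])
print(report["score"])  # substance_use (3) + violence (5) = 8
```

A real system would replace the keyword lookup with a moderation model call and weight recent posts differently from decade-old ones, but the shape of the output (a score plus the evidence behind it) is what lets a marketer make a quick go/no-go call.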

Without a platform like SafeCollab, the task of performing all of this due diligence falls on the shoulders of marketers, and that means spending hours trawling through each influencer’s profiles, checking everything and anything they’ve ever said or done to ensure there’s nothing in their past that the brand would rather not be associated with.  

When we consider that the scope of work might include audio voiceovers, extensive comment threads and frame-by-frame analyses of video content, it’s a painstaking process that never really ends. After all, the top influencers have a habit of churning out fresh content every day. Careful marketers have no choice but to continuously monitor what they’re posting.  

Beyond initial history scans, SafeCollab’s real-time monitoring algorithms take over, generating instant alerts for any problematic content, such as posts that contain graphic language or inappropriate images, promote violence or drug and alcohol use, or cross whatever other lines the brand deems unsavory.
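The monitoring step described above can be pictured as a stream of incoming posts checked against the brand’s blocked categories, with a matching post raising an immediate alert. The sketch below is a hypothetical illustration, not SafeCollab’s implementation: a keyword match stands in for the LLM-based moderation check, and the category names are invented.

```python
# Hypothetical sketch: continuous monitoring of a creator's new posts,
# yielding an alert whenever a post matches a brand-blocked category.
# Keyword matching stands in for the product's LLM-based content checks.

BLOCKED = {
    "graphic_language": {"damn"},
    "violence": {"punch"},
}

def monitor(post_stream):
    """Yield (post, category) alerts for each problematic incoming post."""
    for post in post_stream:
        words = set(post.lower().split())
        for category, keywords in BLOCKED.items():
            if words & keywords:
                yield post, category

alerts = list(monitor([
    "new merch drop tomorrow",
    "gonna punch my way through this workout",
]))
for post, category in alerts:
    print(f"ALERT [{category}]: {post}")
```

Writing the monitor as a generator mirrors the real-world shape of the problem: posts arrive indefinitely, so alerts should be emitted as they are found rather than after a batch completes.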

AI’s expanding applications 

With the launch of SafeCollab, Lightricks is demonstrating yet another use case for generative AI. The company first made a name for itself as a developer of AI-powered video and image editing apps, including Photoleap, Facetune and Videoleap.  

The latter app incorporates AI-powered video filters and text-to-video generative AI functionalities. It also boasts an AI Effects feature, where users can apply specialized AI art styles to achieve the desired vibe for each video they create.  

Lightricks is also the company behind LTX Studio, which is a comprehensive platform that helps advertising production firms and filmmakers to create storyboards and asset-rich pitch decks for their video projects using text-to-video generative AI.  

With all of Lightricks’ AI apps, the primary benefit is that they save users time by automating manual work and bringing creative visions to life, and SafeCollab is a great example of that. By automating the due diligence process from start to finish, marketers can quickly identify controversial influencers they’d rather steer clear of, without spending hours conducting exhaustive research.  
