Advertisers are seeing different options under the new creative breakdown. I’ve recorded separate videos covering the Flexible Format and Image Generation breakdown options.
However, a few advertisers are seeing a third option: Related Media.
What is Related Media, and how does this new breakdown apply to it?
What is Related Media?
First, here’s how Related Media works. When you create a new ad, Meta may recommend using media from your existing ad sets. Meta explains it this way:
For example, when you create a new ad using Media A featuring wireless earbuds, Related Media may recommend adding Media B, C, and D from your currently active ad sets that promote the same product.
Who is eligible?
At the moment, only advertisers who “regularly use Advantage+ Creative” are eligible for Related Media. That doesn’t mean you’ll automatically have it if you use Advantage+ Creative (I don’t). It’s unclear how extensively you need to use Advantage+ Creative to qualify (I didn’t have all of the options turned on).
You must use a single-image or single-video ad, and you’ll see Related Media within the ad preview. You can then individually select or deselect the recommended media, or toggle the feature off entirely.
Note that Related Media can only pull image or video assets from your existing ad sets, not text or URLs. So when you use Related Media, you effectively provide multiple images or videos for a single ad, a bit like Flexible Format.
The Breakdown
But how is this related media performing? That’s where the new breakdown comes in. It separates your original media from the Related Media you use.
Well, it may not separate them completely. Meta explains:
When you use Related Media, you can view separate performance metrics in Ads Manager: one shows how your original creative performed, and the other shows how the original and related assets performed together.
Unless that statement is poorly worded, you won’t be able to completely isolate the performance of Related Media from your original media, though you should be able to infer it.
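To illustrate that inference: if we assume the two reported rows are additive (an assumption Meta doesn’t confirm), subtracting the original creative’s results from the combined results approximates the Related Media contribution. A minimal sketch with hypothetical numbers:

```python
# Hypothetical numbers for illustration only; these are not Meta's metric names.
combined_conversions = 100  # original + related assets together
original_conversions = 70   # original creative alone

# If the breakdown rows are additive, the difference approximates
# what the Related Media assets contributed on their own.
related_conversions = combined_conversions - original_conversions
print(related_conversions)  # 30
```

The same subtraction would apply to spend, impressions, or any other additive metric, though rate metrics like CTR would need to be recomputed from the inferred totals rather than subtracted directly.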
It’s nice to see Meta continue to add some reporting transparency.