Know What Your Audience Wants Before Investing in Content Creation and Marketing – Whiteboard Friday
Posted by randfish
Content marketing is an iterative process: We learn and improve by analyzing the success of the things we produce. That doesn’t mean, though, that we shouldn’t set ourselves up for that success in the first place, and the best way to do that is by knowing what our audiences want before we actually go through the effort to create it. In today’s Whiteboard Friday, Rand (along with his stick-figure friends Rainy Bill and Hailstorm Hal) explains how we can stack our own decks in our favor with that knowledge.
For reference, here’s a still of this week’s whiteboard!
Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. It’s 2015. It’s going to be a year where, again, many, many marketers engage in a ton of content investments and content marketing for a wide variety of purposes from SEO to driving traffic to growing their email newsletters and lists to earning links and attention and growing their social channels. Unfortunately, there’s a content marketing problem that we see over and over and over again, and that is that folks are making investments in content without knowing whether their audience is going to know and love and appreciate what they’re doing beforehand.
That kind of sucks because it adds a lot of risk to a process that is already risk intensive. You’re going to put a lot of work into the content that you’re creating. Well, hopefully you are. If you’re not, I don’t know how well it’s going to do. All of that work can be for naught.
Let me show you two examples. Over here I have Rainy Bill from WhatTheWeather.com, and here’s Hailstorm Hal from KingOfClimate.com. We’ll start with Rainy Bill’s story.
So Rainy Bill, he’s thinking to himself, “You know, I want to invest in some content marketing for WhatTheWeather.com.” He has an idea. He’s like, “You know, maybe I could make a chart of the T-shirts that meteorologists wear by season. I’ll look at all the TV meteorologists, all the Internet meteorologists, and I’ll look at the T-shirts that they wear. They all wear T-shirts, and I’ll make a big chart of them.”
You might think this is a ridiculous idea. I have seen worse. But Rainy Bill is thinking to himself, “Well, if I do this, it’s kind of ego bait. I get all the meteorologists involved. I’ll feature all their T-shirts, and, of course, all of them will see it and they’ll all link to me, talk about me, share it on their social media channels, email their friends with it. Oh check it out. Put it on their Facebook.”
He makes it. He’s got this beautiful chart showing different kinds of T-shirts that meteorologists are wearing over the seasons, and Bill’s just as happy as a clam. He can’t believe how beautiful that is until he tries to launch and promote it. Then it’s just sadness. He’s just crying tears.
What happened here is that no one actually cared what Bill had to say. No one cared about T-shirt patterns that are worn by meteorologists, and Bill didn’t actually realize this until he had already made the investment and started trying to do the promotion.
This might be a slightly ridiculous example, but I can’t tell you how many times I’ve seen exactly this story play out by marketer after marketer of content investments. They put something together that they hope will achieve their goal of reaching a new audience, of getting promoted, but it falls flat mostly because they had the idea before they talked to anyone else. Before they realized whether anyone else was interested, they went and built it.
That’s actually kind of a terrible idea. Unless you have your finger so firmly on the pulse of an industry or a field that you don’t need that process, and I’d say that’s the 1% of the 1%, you can’t skip going out and talking to your audience first to understand them.
Hailstorm Hal, from KingOfClimate, instead of starting with a great idea for a piece of content, is going to start with the idea from which all content marketing springs: “I want to make something people will really want and something they’ll really love.” Okay. They want it, and they’re going to love it when they see it and when they get it.
So Hailstorm Hal is going to go out and say, “Well, what are the weather watchers talking about? People who are active in this community, in this industry, the people who do the sharing and the amplification, who influence what the rest of us see, what are they talking about?”
So he goes onto this weather forum and hears someone complaining, “The weather in Cincinnati is totally unpredictable.” The reply, “Yeah, but it’s way more predictable than Seattle is.” “Nuh-uh, you liar.” From this, eureka, Hailstorm Hal has a great idea. “Wait a minute. What if I were to actually go and take all of this online commentary and turn it into something useful where these two commenters could prove to each other who’s correct and people would know for certain how much . . .”
It’s not just helpful to them. This is helpful to a huge, broad swath of society. How accurate are your meteorologists, on average, city by city? I don’t actually know, but I would be fascinated to know whether when I go to San Diego — I was there for the holidays to see my wife’s family — maybe the weather reports in San Diego are much more or much less accurate than what I’m used to here at home in Seattle.
So Hal’s going to put together this great map that’s got an illustration of different regions of the United States, and you can see that in the Midwest actually weather is more predictable than it is on the coast or less predictable than it is on the coast. That’s awesome. That’s terrific. This is going to work far, far better than anything that Hal could have come up with on his own without first understanding the industry.
Now the process and tips that I’m going to recommend here are not exhaustive. There are a lot more things in this. But if you follow these five, at least, I think you’re going to do much better with your content investment.
First off, even before you do this process, get to know the industry, the niche, or the community that you’re operating in. If Hal didn’t know where to find weather watchers, he might just search weather forum, click on the first link in Google, and be at some place that doesn’t really have a very serious investment from the community of people he’s trying to reach. Without understanding all of the sites and pages, without understanding who are the big influencers in the community on social media, without understanding what are the popular websites, what gets a lot of interaction and engagement and doesn’t, that’s going to be really tough for him to figure out.
So that’s why I would say you need to go out and learn about your industry before you make something for it. Incidentally, this is why it’s really tough to do this as a consultant, and why, if you are paying consultants to do this, you’re going to be paying quite a bit of money for the research time. It can take dozens of hours of research to understand a niche before you can effectively create content for it. It isn’t just an on-demand kind of thing.
Then from there you want to use the discussion forums, Q&A sites, social media, and blog comments to find topics and discussions that inspire questions, curiosity, and need. Some of that is going to be very blatant. Some of it is going to be much more latent, and you’re going to be drawing from both of those. Your job is to have insight and empathy, and that’s what a great marketer should be able to do when they’re researching these communities.
Number three, you want to validate that if you created something, (a) it would be unique, no one else has made it before, and (b) others would actually share it. You can do this very directly by reaching out and talking to people.
So Hal can go and say, “Hey, who’s this commenter right here? Let’s have a quick conversation. Would you like this?” If the answer is, “Yeah, not only would I like that, I would help share that. I would spread that. I would love to know the answer to this question.” Or no reply, or “Sounds interesting, let me know when you get it up.” There’s going to be a different variation.
You can go and use Twitter, Google+, and email to reach out directly to these people. Most of the time, if you’re finding commentary on these forums and in these places, there will be a way to reach them. I also have two tools I’m going to recommend, both for email. One is Conspire and the other is VoilaNorbert. VoilaNorbert.com is an email finding tool. I think it’s the best one out there right now, and Conspire is a great tool for seeing who you’re connected to that’s connected to people you might want to reach. When you’re trying to reach someone, those can be very helpful.
Number four, it tends to be the case that visual and/or interactive content is going to perform a lot better than text. So if Hal’s list had simply been a list of data — here are all the major U.S. regions and here’s how predictable and unpredictable their weather is — well, that might work okay. But this map, this visual is probably going to sail around the weather world much faster, much better, be picked up by news sources, be written about, be embedded in social media graphics, all that kind of stuff, far better than a mere chart would be.
Number five, remember that as you’re doing the creation, you need to align the audience goals with your business goals. So if KingOfClimate’s goal is to get people signing up for a weather tracking service on an email list, well great, you should have this and then say, “We can send you variability reports. We can tell you if things are getting more or less accurate,” and have an email call to action to get people to sign up to the newsletter. But you want to tie those business goals together.
The one thing I’d be careful of, and this is a mistake that many, many folks who invest in content marketing make, is that a lot of those benefits are going to be indirect and long term. If the goal is that KingOfClimate.com is trying to sell professional meteorologists on a software subscription service, well, you know what? You’re probably not going to sell a whole lot with this. But you are going to get a lot more professional meteorologists who remember the name KingOfClimate, and that brand memory is going to influence future purchase decisions, likely nudging conversion rates up a little bit.
It’s probably going to help with links. Links will lead to rankings. Rankings will lead to being higher up in search engines when professional meteorologists search for precisely, “I’m looking for weather tracking software or weather notification software.” So these kinds of things are long term and indirect. You have to make sure you’re tying all of the benefits of content marketing to the business goals you might achieve.
I hope to see some phenomenal content here in 2015. I’m sure you guys are already working on some great stuff. Applying this can mean that you don’t have to be psychic. You just have to put in a little bit of elbow grease, and you can make things that will perform far better for your customers, for your community, and for your business.
All right, everyone. Look forward to the discussion, and we will see you again next week for another edition of Whiteboard Friday. Take care.
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!
Posted by CharleneKate
Have you ever noticed how Rand is often speaking at conferences all around the world? Well, we realized that those of us here in Seattle rarely get to see him. So we started MozTalks, a free event here at the MozPlex.
It normally runs 2-3 hours with multiple speakers, one of whom is Rand. The event is hosted at the Moz HQ and offers time for mingling, appetizers, refreshments and of course, swag. The series is still evolving as we continue to test out new ideas (maybe taking the show on the road), so be on the lookout for any updates.
The world of marketing and SEO continues to change, but are you changing with it? Staying on the cutting edge should always be a priority, especially since early adoption has proven more beneficial than ever. Sticking with what works isn’t enough anymore, and marketing isn’t just about analyzing our successes and failures or understanding our returns and losses. It’s about what we do next.
In the presentations below, Rand and Dr. Pete will dive deep into where metrics serve us best, as well as what really works to drive traffic and what’s better left behind.
Rand: What Changed? A Brief Look at How SEO has Evolved over the Last 5 Years
Dr. Pete: From Lag to Lead: Actionable Analytics
We asked both presenters for a few of their top takeaways from their talks, and they’ve got some gems. Here’s what they had to say:

From Rand
- Keyword matching has become intent matching, which doesn’t mean we should avoid using keywords, but it does mean we need to change the way we determine which pages to build, which to canonicalize, and how to structure our sites and content.
- The job title “SEO” may be limiting the influence we have, and we may need broader authority to impact SEO in the modern era. The onus is on marketers to make teams, clients, and execs aware of these new requirements, so they understand what we need to do in order to grow search traffic.
- Webspam has gone from Google’s problem to our problem. The onus is on marketers to stay wary and up-to-date with how Google is seeing their links and their site.
From Dr. Pete
- As content marketers, we can’t afford to see only the forest or the trees. We have to understand a wide variety of metrics, and combine them in new and insightful ways.
- We have to stop looking backward using lag goals like “Get 100,000 Likes in Q4.” They aren’t actionable, and succeed or fail, we have no way to repeat success. We have to focus on objectives that drive specific, measurable actions.
Missed the previous talk?
The first MozTalk featured Rand and his wife Geraldine, known in the blogosphere as The Everywhereist. Rand covered what bloggers need to know about SEO, and Geraldine talked about how to make your blog audience fall in love with you. Check them both out here:
Join us for the next one
Our next free MozTalk is set for Thursday, April 2nd, and we’re still finalizing plans. We’ll be sure to post the videos on this blog for those of you who can’t make it, but if you’re in town, keep your eyes open for more details. We hope to see you there!
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!
Posted by ChadPollitt
This post was originally in YouMoz, and was promoted to the main blog because it provides great value and interest to our community. The author’s views are entirely his or her own and may not reflect the views of Moz, Inc.
Many of the traditional channels for online content discovery are thoroughly understood and their adoption rates are high.
The readily accepted channels, from SEO and PPC to email and social media broadcasting, can deliver the best content to the right people at the right time.
Today, however, the Internet is experiencing a deluge of content, and many channels for content discovery are bloated. Estimates say that more than 2.73 million blog posts are written and published daily. Many industries are experiencing a content surplus, making it even more challenging for marketers to get their content seen.
Social media networks like Facebook and Twitter are adjusting their algorithms to ensure the least amount of organic visibility for brands, too. Traditional paid media, such as banner advertising, is becoming less effective year-over-year because banner blindness runs rampant. According to Solve Media, you’re more likely to survive a plane crash than click on a banner ad.
That sounds farfetched until you look at the results from the Nielsen Norman Group’s 2007 eyetracking study (shown below).
Red areas indicate where users looked the most; yellow areas indicate fewer views; areas colored blue depict the least-viewed portions of the page; gray areas didn’t attract any views/actions; and the green boxes are used to highlight advertisements.
As a result, new techniques, tactics and tools are cropping up and being used by marketers of all stripes to maximize the visibility of their content. There’s now an entire content promotion ecosystem. From influencer marketing to native advertising, brands are experimenting in new ways.
Many brands are sponsoring articles on blogs or other online publications with large preexisting audiences. An interesting stat we just included in our own “Content Promotion Manifesto” is that brands spent, on average, 6.7 percent of their content marketing budgets on sponsored content in 2013. It’s trending upwards, too. From The New York Times to Forbes’ Brand Voice, there’s no shortage of famous examples.
While advertorials have been around for decades, this top-of-the-funnel sponsored article channel is relatively new for many content marketers. Over the last year, we have received many questions from clients about sponsored content: questions about pricing, scale, value and strategy. We struggled to answer most of them; there wasn’t anywhere to get answers.
Because of this, we decided to reach out to 550 online publications to gather as much information about their sponsored content programs as possible. We wanted to find out the following:
- An agreed upon definition for sponsored articles
- The current state of sponsored articles as a channel
- Examples of sponsored articles
- Sponsored article pricing and value
- A media buying strategy for sponsored articles
- Tools and platforms for sponsored articles
We quickly learned that sponsored content on blogs and other online publications, when viewed as a marketing channel, is very immature. Pricing doesn’t have much rhyme or reason, either. However, after collecting and interpreting data on 550 online properties, and dissecting countless native advertising studies, we hope to shine a light on a little-known content marketing channel.
The results of the study are outlined below.
Note: The complete Media Buyers Guide to Sponsored Content study is available for download here.
Defining Sponsored Articles
With content marketing adoption rates so high, many brands are looking to native advertising to promote their content. The Interactive Advertising Bureau (IAB) defines native advertising as “paid ads that are so cohesive with the page content, assimilated into the design, and consistent with the platform behavior that the viewer simply feels that they belong.” According to the IAB, native advertising contains six different types of ad units: in-feed, promoted listings, in-ad with native element, paid search, recommendation widgets, and custom.
Sponsored articles fall into the in-feed subgroup. However, so does promoted content on Facebook, LinkedIn and Twitter. Because these units appear within the normal content feed of the publisher, it doesn’t matter whether the publisher is Facebook or BuzzFeed.
In other words, sponsored articles amount to advertising on a media outlet in the form of editorial content that looks like it’s supposed to be there. Brands value this because association with a publication and exposure to its audience can drive awareness, traffic, conversions, and leads.
The Current State of Sponsored Articles
We uncovered lots of fascinating information about sponsored articles while conducting our research. They are actually an evolved version of what many marketers call advertorials, which have been around for decades. The biggest difference between the two is where the content resides in the customer buying journey. Advertorials are middle to bottom-of-the-funnel content.
An example of a magazine advertorial
On the other hand, sponsored articles strictly reside at the top of the funnel. Their purpose is to be helpful, entertaining, or both. Top-of-the-funnel content doesn’t appear to be salesy and brand-centric to the reader. It’s the rise of content marketing that helped move advertorials up the funnel. This helps brands become not just purveyors of goods and services, but a producer of ideas and a distributor of knowledge.
Sponsored articles have received pushback from some publishers, brands, and consumers, and even from government regulators who are concerned because the articles resemble editorial content. This can damage the editorial integrity of a publication, as well as a brand’s image.
Both publishers and marketers have a vested interest in not appearing to mislead consumers. Native advertising in general is misunderstood by many consumers and marketers. (That’s partly why we conducted this study.)
In the video below, John Oliver does a good job of articulating many consumers’ concerns regarding sponsored articles.
The 2014 State of Native Advertising Report surveyed over 2,000 marketers and discovered that 73 percent were either completely unfamiliar with or hardly familiar with native advertising.
Thirty-eight percent of the marketers could identify forms of native advertising from a checklist, and only three percent claimed to be very knowledgeable.
Earlier this year, Contently surveyed 542 U.S. Internet users to determine what they thought about sponsored articles. Only 48 percent of the respondents believed sponsored content that was labeled as such was paid for by an advertiser that had influenced the content produced. The rest thought the label meant something else.
Just over 66 percent of the respondents reported they are not likely to click on an article sponsored by a brand and 33 percent said they’re just as likely to click on a sponsored article as they are to click on (unsponsored) editorial content.
There is also contradictory evidence surrounding the overall effectiveness of sponsored articles.
Research from Chartbeat shows that only 24 percent of visitors scroll past the fold when visiting a sponsored article, compared with 71 percent for editorial content.
The New York Times claims readers spend the same amount of time on sponsored articles as traditional news stories. This is backed up by a study from Sharethrough and IPG Media Labs. They found that consumers actually look at sponsored articles more than typical editorial articles (26 percent vs. 24 percent) and spend a similar amount of time on each (1 minute vs. 1.2 minutes).
Not all publishers offer sponsored article opportunities to marketers. During our research, some respondents told us that protecting editorial integrity and preserving audience trust were higher priorities. On the other hand, many big-name publishers like Forbes, The New York Times, Business Insider, The Atlantic, Washington Post and The Wall Street Journal have all embraced sponsored articles as a revenue source.
BuzzFeed’s entire business model is built around what it calls sponsored “listicles,” a.k.a. sponsored articles. While some publishers are averse to adopting this native form of advertising, it doesn’t seem to be causing any damage to the publishers who are using native ads.
The U.S. Federal Trade Commission (FTC) hasn’t quite figured out how to regulate native advertising. The FTC has delayed handing down regulations around disclosure requirements, language and graphic separation. Until that happens, the display of native advertising will remain at the discretion of publishers.
With that said, the IAB has set native advertising guidelines for its members. The IAB reports that clarity and prominence of paid native ad unit disclosure are vital, regardless of native advertising type.
Their two criteria are straightforward:
- Use language that conveys the advertising has been paid for, thus making it an advertising unit, even if that unit does not contain traditional promotional advertising messages.
- Be large and visible enough for a consumer to notice it in the context of a given page and/or relative to the device the ad is being viewed on.
In the case of sponsored articles, a reasonable consumer should be able to distinguish between editorial content from the publisher and paid advertising.
A 2013 survey conducted by Hexagram and Spada revealed that 62 percent of publishers had embraced sponsored articles, with another 16 percent planning to go this route by the end of 2014. Comparable research from eMarketer showed that only 10 percent of digital publishers didn’t have and weren’t considering native advertising on their sites.
The 2014 Native Advertising Roundup revealed that 73 percent of media buyers use native advertising, and 93 percent expect to spend the same or more in the future. Native advertising spending in the U.S. is expected to increase from .3 billion in 2013 to .4 billion in 2018. A full 40 percent of publishers expect native advertising to drive a quarter or more of their digital revenue this year.
Native advertising, when compared to traditional display ads, has been found to be more effective. Twenty-five percent more consumers looked at sponsored articles than at display ad units. Native ads produced an 18 percent lift in purchase intent and a nine percent lift in brand affinity responses.
BIA/Kelsey released a study which shows brands are planning on spending more on native advertising, and publishers stand to benefit as long as they can preserve the trust and interest of their audience.
Examples of Sponsored Articles
Since look, feel, design, language and requirements of sponsored content are left up to the discretion of publishers, the presentation of sponsored content on different sites varies widely. Some publications provide brands with what can be described as a virtual microsite within the site itself. Others are more streamlined, using an article that appears as a piece of featured content but is labeled as sponsored.
Sponsored Article Pricing
There are no real standards for pricing in the digital world with regard to sponsored content. This makes budgeting for the channel very tough to do. It also makes long-term strategic execution at scale and across multiple publications a near impossibility.
While the value of sponsoring content is clearly understood by many brands, how to execute it and who to talk to in order to get it done is generally unclear. The study set out to add rhyme and reason to this burgeoning channel by exploring costs and comparing them across a broad spectrum of online publications and blogs.
This is valuable information for marketers and media buyers wishing to negotiate with online publications. It can even be used by publications that have not yet offered sponsored content opportunities to establish fair pricing.
Since publishers completely control their own pricing and standards, they maintain their own criteria for validating costs associated with sponsored articles. In today’s analytics-driven marketing culture, where channels are often compared and returns are measured, sponsoring content across several different publications can’t be so easily consolidated into a single “sponsored content” channel since each one has a unique value proposition.
This study is the industry’s first attempt to scientifically justify, quantify, and predict current going-rate prices of sponsored articles using explicit data points that can be measured for each online publication. Our goal was to create the first-ever quantitatively supported pricing standard for sponsored articles.
We hope our research puts an end to these challenges and empowers marketers with the ability to budget, negotiate, and ultimately scale the deployment of sponsored articles within their channel mix.
In total, the research for this study was conducted over a five-month period earlier this year. It included manual outreach via email and phone to over 1,000 media outlets and blogs. The outreach resulted in responses from 550 publishers that sold sponsored article units.
The study took an unbiased approach to data inclusion and included a representative sample set. It collected data on globally-recognized publications, one-person blogs, and everything in between.
Publications were classified using the following criteria:
- Content is created by more than five writers/contributors/columnists, and
- The website already utilizes traditional display advertising (e.g., banner ads)
Everything that didn’t meet the above criteria was classified as a blog.
Each price collected in the study was the minimum charge for getting a sponsored article published, regardless of other pricing factors. A total of 17 factors were cited as justification for pricing schemes from the 550 publishers.
- Word count: The number of words in a sponsored article
- User time on page: The amount of time a typical reader spends on a web page
- Links: Specifications regarding whether or not links would be provided, and if so, how many, where and whether or not they would be “nofollow” links
- Lead capture: For publishers that provide links to gated assets, many charge on a per-lead basis
- Impressions (CPM): Cost per thousand impressions based on historic data
- Time and effort required from publication’s editorial staff
- Monthly website traffic
- PageRank: Often used by publishers to justify relative pricing when they run more than one media outlet
- Domain Authority: Often used for publishers to justify relative pricing when they own more than one publication
- Page-level engagement: A metric that is measured by how far readers scroll down the page and the amount of time spent on a given article
- Social media promotion: Often an optional add-on that would increase price (may come as part of a package deal)
- Email promotion: Often an optional add-on that would increase price (may come as part of a package deal)
- Display advertising: Often an optional add-on that would increase price (may come as part of a package deal)
- Number of articles: How many sponsored articles you are buying at a time
- Visibility time: The amount of time an article stays live on the site
- Verticals: For large publications that cover many verticals or subject areas, some verticals are more expensive than others
- Pay-per-click: Another engagement-level metric that is measured by the number of click-throughs to an intended landing page
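Several of the factors above boil down to simple arithmetic. An impressions-based (CPM) quote, for example, is just the expected view count divided by 1,000, times the rate per thousand. Here is a minimal sketch; the function name and figures are hypothetical, not drawn from the study:

```python
# Hypothetical sketch of CPM-based sponsored-article pricing.
# "Cost per thousand impressions" means: price = (impressions / 1000) * rate.

def cpm_price(expected_impressions: int, cpm_rate: float) -> float:
    """Return the article price for a given impression forecast and CPM rate."""
    return expected_impressions / 1000 * cpm_rate

# A publisher forecasting 50,000 article views at a $20 CPM would quote:
print(cpm_price(50_000, 20.0))  # 1000.0
```

The other engagement-based factors (per-lead or per-click pricing) follow the same pattern, just with a different unit in the denominator.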
In order to do a quantitative analysis, explicit data was collected from all of the publications to calculate predictor variables. Those variables included:
- Domain Authority: A ranking score from Moz, on a 100-point scale, that uses more than 40 signals to calculate how well a website will perform in the search engine results pages (SERPs). The higher the score, the more authoritative the website is viewed as being.
- Page Authority: Another ranking score from Moz, on a 100-point scale, that calculates how well a given webpage is likely to rank in the SERPs. In the case of this study, the publication’s home pages were used.
- PageRank: A ranking metric from Google that calculates the relevance of a webpage. This score analyzes the number of incoming links and the quality of the referring webpages to generate a measurement between 0 (low relevance) and 10 (high relevance).
- AlexaRank: A ranking score from Alexa.com that is based on traffic data from users over a rolling three-month period. A site’s ranking is based on a combined measure of unique visitors and page views. The site with the greatest combination of these is ranked No. 1, and higher number rankings correlate with lower traffic data.
- Facebook Following: The number of fans (or “likes”) a publication’s Facebook page has.
- Twitter Following: The number of followers a publication’s or a blogger’s Twitter account has. For publications with multiple accounts and/or contributing authors, only the account with the largest following was used.
- Pinterest Following: The number of followers a publication’s or a blogger’s Pinterest account has.
It is assumed that the data set in this study is a representative sample of the entire ecosystem of blogs and other online publications, because the results closely mirror the distribution of Page Authority from a Moz analysis of more than 10,000 SERPs and 200,000 unique pages. That data set had a mean (average) Page Authority of 40.8 and a standard deviation of 15.1. The distribution can be seen below.
The data set in this study had a mean of 47.1 and a standard deviation of 15.5, so the sample of blogs and publications had a slightly higher Page Authority than the Moz study. This was expected, because this study only measured root domains and not long-tail pages within those domains.
Aside from that slight disparity, the distribution curves are nearly identical. For those readers who are number junkies, the descriptive statistics of the Page Authority data in the study are below.
- Variations in sponsored content offerings – The study established the pricing baseline based on the cost of one sponsored article. Since some publications only offered long-term commitments to marketers that could include other benefits (banners, email, social promotion, etc.), some publications' unit pricing could be inflated. As a result, the regression model may not be an accurate price predictor in all scenarios.
- Social account data – Not all online publications have accounts on Facebook, Twitter, and Pinterest. In these cases, the number zero was used to quantify followers. Also, for publications with multiple accounts on the same network, the study measured the account with the most followers.
- Alexa Rank inaccuracies – Alexa admits publicly that there are limits to making judgments from its data. Sites with relatively low traffic may not be accurately measured by Alexa.
The graph below shows the exact methodology we used to conduct the sponsored content pricing study. Its purpose is to give readers confidence in our pricing models so they feel comfortable adapting the formulas.
When all prices are graphed, bloat appears on each end of the pricing spectrum. In order to reconcile the dense areas, the study broke down the pricing data and regression models for blogs and publications separately.
Blog Pricing Analysis
The graph below represents the distribution of prices for all 474 blogs in the study.
Because of the wide range and low frequency of prices recorded in the “more” area, we decided to label these data points as outliers. By removing the outliers (approximately 3.8 percent of the sample) from the analysis, the variance decreased by 87 percent, making for a more accurate predictive model. All descriptive statistics for the blog data sample before and after removing the outliers were laid out in the study.
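The effect of trimming outliers on variance can be illustrated with a small sketch. The prices below are made up for illustration, and the cutoff is an arbitrary threshold, not the one the study used:

```python
from statistics import pvariance

# Hypothetical price sample: most blogs charge modest fees, a few charge a lot
prices = [40, 60, 75, 90, 110, 120, 150, 180, 200, 250, 4000, 9500]

cutoff = 1000  # illustrative threshold; the study identified outliers from its own distribution
trimmed = [p for p in prices if p <= cutoff]

var_before = pvariance(prices)
var_after = pvariance(trimmed)
print(f"variance dropped by {1 - var_after / var_before:.1%}")
```

A handful of extreme values dominates the variance of a sample like this, which is why removing under 4 percent of cases could shrink the study's variance by 87 percent.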
With the remaining 456 cases, a multi-variable regression test for price against all of the predictor variables was run, after which the insignificant variables were removed to formulate the pricing regression model for blogs, as shown below.
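For readers who want intuition for what such a regression does, here is a one-predictor version using the closed-form least-squares fit. The data points are invented for illustration; the study fit several predictors over 456 real cases:

```python
# Closed-form least-squares fit for a single predictor (price ~ Domain Authority).
da    = [20, 30, 35, 40, 50, 60, 70]          # Domain Authority (hypothetical)
price = [70, 120, 140, 180, 230, 290, 350]    # sponsored-article price in $ (hypothetical)

n = len(da)
mean_x = sum(da) / n
mean_y = sum(price) / n

# slope = covariance(x, y) / variance(x)
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(da, price)) \
        / sum((x - mean_x) ** 2 for x in da)
intercept = mean_y - slope * mean_x

print(f"price = {intercept:.1f} + {slope:.2f} * DA")
```

A multi-variable regression generalizes this to several predictors at once, then drops the ones whose coefficients are not statistically significant.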
The end result confidently determined the fair market price formula for a sponsored article on a blog:
Publication Pricing Analysis
The graph below represents the distribution of pricing for all 76 publications recorded in the study.
The outliers were kept in this regression model because the range in quality and size of online publications is large. The descriptive statistics are available in the actual study.
Following the same methods as the blog analysis, the study ran a multi-variable regression test to construct a predictive model for publication pricing. After removing the insignificant variables, the output looks like this:
The end result confidently determines the fair market price formula for a sponsored article on a publication:
What All This Math Really Boils Down To
With the formulas below, marketers now have a way to assign value when purchasing or negotiating for sponsored articles on blogs or publications.
Prior to this study, marketers had no way of knowing whether they were getting a fair deal when using this emerging channel.
- Blog Price Formula = -60.5 + 5.97(DA) + 0.978(thousand Fb fans) + 15.1(PR) - 0.000007(AR)
- Publication Price Formula = -37000 + 314(DA) + 20.9(thousand Fb fans) + 5152(PR) - 46.6(thousand Pinterest followers)
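The two formulas can be wrapped in small helpers so a media buyer can plug in a publication's metrics. This is a minimal sketch; the function and parameter names are my own shorthand, and the coefficients come straight from the formulas above:

```python
def blog_price(da, fb_fans, pagerank, alexa_rank):
    """Fair-market price estimate for one sponsored article on a blog.

    da         -- Moz Domain Authority (0-100)
    fb_fans    -- Facebook fans, as an absolute count
    pagerank   -- Google PageRank (0-10)
    alexa_rank -- Alexa traffic rank
    """
    return (-60.5
            + 5.97 * da
            + 0.978 * (fb_fans / 1000.0)   # formula uses thousands of fans
            + 15.1 * pagerank
            - 0.000007 * alexa_rank)

def publication_price(da, fb_fans, pagerank, pinterest_followers):
    """Fair-market price estimate for one sponsored article on a publication."""
    return (-37000
            + 314 * da
            + 20.9 * (fb_fans / 1000.0)
            + 5152 * pagerank
            - 46.6 * (pinterest_followers / 1000.0))

# Example: a mid-sized blog with DA 40, 10,000 fans, PR 4, Alexa rank 500,000
print(round(blog_price(40, 10_000, 4, 500_000), 2))  # → 244.98
```

As the study notes, these are baselines for a single article; packaged deals will move the real price.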
That said, media buyers should also note that many top-tier publications package their sponsored content offering in different ways. Keep this in mind when using the formulas above. Below are examples of some variation in sponsored article packages.
Networks and Tools for Sponsored Articles
While conducting research, several tools and networks kept coming up. Some networks were set up for the sole purpose of connecting marketers with publishers for sponsored content. Even HubSpot has built an informal ad hoc network for its partner agencies to connect with its publishing customers. Content measurement tools, including Nudge, which was built to measure sponsored content, are starting to crop up, too.
A few other networks and tools worth noting:
- Adproval: A media outlet marketplace for connecting publishers and advertisers
- BlogHer: A blog and social media influencer community focused on women content creators
- Blogsvertise: A blog marketplace for connecting publishers and advertisers
- Buysellads: A media outlet marketplace for connecting publishers and advertisers
- Cision: The brand’s Content Marketing Database includes a searchable database of over 2,000 sponsored opportunities with thousands of U.S. publications
- GroupHigh: Blogger outreach marketing software that helps companies find bloggers, in addition to managing and tracking relationships, and measuring results
- Izea: A sponsorship marketplace that connects social media influencers with brands
- Markerly: A brand amplification platform that connects brands with bloggers
- Sway Group: Connects brands and agencies with the largest network of female bloggers on the Web
- The Syndicate: A brand storytelling partner and blog sponsorship network
With the growth of online content showing no signs of slowing, the use of sponsored content as a marketing channel will undoubtedly continue to grow as well. Besides, it’s a proven revenue stream for publishers who have often struggled to make money on the Internet.
However, as the popularity of sponsored content grows, so does the likelihood of it being regulated by governments. Until then, consider this post your definitive guide to sponsored content.
Posted by randfish
Today I’m going to make a crazy claim: that in modern SEO, there are times, situations, and types of analyses where correlation is actually MORE interesting and useful than causality. I know that sounds insane, but stick with me until the end and at least give the argument a chance. And for those of you who like visuals, our friend AJ Ghergich and his intrepid team of designers created some nifty graphics to accompany the piece.
Once upon a time, SEO professionals had a reasonable sense of many (or perhaps even most) of the inputs into the search engine’s ranking systems. We leveraged our knowledge of how Google interpreted various modifications to keywords, links, content, and technical aspects to hammer on the signals that produced results.
But today, there can be little argument: Google’s ranking algorithm has become so incredibly complex, nuanced, powerful, and full-featured that modern SEOs have all but given up on hammering away at individual signals. Instead, we’re becoming more complete marketers, with greater influence on all of the elements of our organizations’ online presence.
Web marketers operate in a world where Google:
- Uses machine learning to identify editorial endorsements vs. spam (e.g. Penguin)
- Measures and rewards engagement (e.g. pogo-sticking)
- Rewards signals that correlate with brands (and attempts to remove/punish non-brand entities)
- Applies thousands of immensely powerful and surprisingly accurate ways to analyze content (e.g. Hummingbird)
- Punishes sites that produce mediocre content (intentionally or accidentally) even if the site has good content, too (e.g. Panda)
- Rapidly recognizes and accounts for patterns of queries and clicks as rank boosting signals (e.g. this recent test)
- Makes 600+ algorithmic updates each year, the vast majority of which are neither announced nor known by the marketing/SEO community
Given this frenetic ecosystem, the best path forward isn’t to exclusively build to the signals that are recognized and accepted as having a direct impact on rankings (keyword-matching, links, etc). Those who’ve previously pursued such a strategy have mostly failed to deliver on long-term results. Many have found their sites in serious trouble due to penalization, more future-focused competitors, and/or a devaluing of their tactics.
Instead, successful marketers have been engaging in the tactics that Google’s own algorithms are chasing: popularity, relevance, trust, and a great overall experience for visitors. Very frequently, that means looking at correlation rather than causation.
[Via Moz's 2013 Ranking Factors - the new 2015 version is coming this summer!]
We’ll engage in a thought experiment to help highlight the issue:
Let’s say you discover, as a signal of quality, Google directly measures the time a given searcher spends on a page visited from the SERPs. Sites with pages searchers spend more time on get a rankings boost, while those with quick abandonment find their pages falling in the rankings. You decide to press your advantage with this knowledge by using some clever hacks to keep visitors on your page longer and to make clicking the back button more difficult. Sure, it may suck for some visitors, but those are the ones you would have lost anyway (and they would have hurt your rankings!), so you figure they’re not worth worrying about. You’ve identified a metric that directly impacts Google’s algorithm, and you’re going to make the most of it.
Meanwhile, your competitor (who has no idea about the algorithmic impact of this factor) has been working on a new design that makes their website content easier, faster, and more pleasurable to consume. When the new design launches, they initially see a fall in rankings, and don’t understand why. But you’re pretty sure you know what’s happened. Google’s use of the time-on-site metric is hurting them because visitors are now getting the information they want from your competitor’s new design faster than before, and thus, they’re leaving more quickly, hurting the site’s rankings. You cackle with delight as your fortune swells.
But what happens long term? Google’s quality testers see diminished happiness among searchers. They rework their algorithms to reward sites that successfully deliver great experiences more quickly. At the same time, competitors gain more links, amplification, social sharing, and word of mouth because real users are deriving more positive experiences from their site than yours. You found an algorithmic loophole and exploited it briefly, but by playing the “where’s Google weak?” game rather than the “where’s Google going?” game, you’ve ultimately lost.
Over the last decade, in case after case of marketers optimizing for the causal elements of Google’s algorithm, this pattern of short-term gain leading to long-term loss continually occurs. That’s why, today, I suggest marketers think about what correlates with rankings as much as what actually causes them.
If many high-ranking sites in your field are offering mobile apps for Android and iOS, you may be tempted to think there’s no point to considering an app-strategy just for SEO because, obviously, having an app doesn’t make Google rank your site any higher. But what if those mobile apps are leading to more press coverage for those competitors, and more links to their site, and more direct visits to their webpages from those apps, and more search queries that include their brand names, and a hundred other things that Google maybe IS counting directly in their algorithm?
And if many high-ranking sites in your field run TV ads, you may be tempted to think it’s useless to investigate TV as a channel, because there’s no way Google would reward advertising as a signal for SEO. But what if those TV ads drive searches and clicks, which could lead directly to rankings? What if those TV ads create brand-biasing behaviors through psychological nudges that lead to greater recognition and a higher likelihood of searchers clicking on, linking to, sharing, talking about, writing about, and buying from your TV-advertising competitor?
Thousands of hard-to-identify, individual signals, mashed together through machine learning, are most likely directly responsible for your competitor’s website outranking yours on a particular search query. But even if you had a list of the potential inputs and the mathematical formulas Google’s process considers most valuable for that query’s ranking evaluation, you’d be little closer to competently beating them. You may feel smugly satisfied that your own SEO knowledge exceeded that of your competitor, or of their SEO consultants, but smug satisfaction does not raise rankings. In fact, I think some of the SEO field’s historic obsession with knowing precisely how Google works and which signals matter is, at times, costing us a broader, deeper understanding of big-picture marketing*.
Time and again, I’ve seen SEO professionals whom I admire, respect, and find to be brilliant analysts of Google’s algorithms lose out to less-hyper-SEO-aware marketers who combine that big picture knowledge with more-basic/fundamental SEO tactics. While I certainly wouldn’t advise anyone to learn less about their field nor give up their investigation of Google’s inner workings, I am and will continue to strongly advise marketers of all specialties to think about all the elements that might have a second-order or purely correlated effect on Google’s rankings, rather than just concentrate on what we know to be directly causal.
* No one’s guiltier than I am of obsessing over discovering and sharing Google’s operations. And I’ll probably keep being that way because that’s how obsession works. But, I’m trying to recognize that this obsession isn’t necessarily connected to being the most successful marketer or SEO I can be.
Posted by GeoffKenyon
Back in 2011, I wrote a technical site audit checklist, and while it was thorough, there have been a lot of additions to what is encompassed in a site audit. I have gone through and updated that old checklist for 2015. Some of the biggest changes were the addition of sections for mobile, international, and site speed.
This checklist should help you put together a thorough site audit and determine what is holding back the organic performance of your site. At the end of your audit, don’t write a document that says what’s wrong with the website. Instead, create a document that says what needs to be done, then explain why these actions need to be taken and why they are important. What I’ve found to be really helpful is to provide, along with your document, a prioritized list of all the actions you would like implemented. This list can easily be handed off to a dev or content team, and those teams can refer to your more thorough document as needed.
- Do a site: search.
- How many pages are returned? (This number can be way off, so don’t put too much stock in it.)
- Is the homepage showing up as the first result?
- If the homepage isn’t showing up as the first result, there could be issues, like a penalty or poor site architecture/internal linking, affecting the site. This may be less of a concern as Google’s John Mueller recently said that your homepage doesn’t need to be listed first.
Review the number of organic landing pages in Google Analytics
- Does this match with the number of results in a site: search?
- This is often the best view of how many of the pages in a search engine’s index the engine actually finds valuable.
Search for the brand and branded terms
- Is the homepage showing up at the top, or are correct pages showing up?
- If the proper pages aren’t showing up as the first result, there could be issues, like a penalty, in play.
- Is the content showing up?
- Are navigation links present?
- Are there links that aren’t visible on the site?
Don’t forget to check the text-only version of the cached page. Here is a bookmarklet to help you do that.
Do a mobile search for your brand and key landing pages
- Does your listing have the “mobile friendly” label?
- Are your landing pages mobile friendly?
- If the answer is no to either of these, it may be costing you organic visits.
- Title tags should be optimized and unique.
- Your brand name should be included in your title tag to improve click-through rates.
- Title tags should be about 55-60 characters (512 pixels) to be fully displayed. You can test here or review title pixel widths in Screaming Frog.
- This will help improve your organic traffic independent of your rankings.
- You can use SERP Turkey for this.
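A quick character-count pass can flag titles likely to be truncated before you verify pixel widths. This is a rough sketch (the real cutoff is pixel width, so treat character counts as an approximation):

```python
def check_title(title, max_chars=60):
    """Rough title-tag check by character count.

    The true SERP limit is pixel width (~512px), so this is only an
    approximation; confirm with Screaming Frog or a pixel-width tool.
    """
    if not title:
        return "missing"
    if len(title) > max_chars:
        return f"too long ({len(title)} chars; may be truncated in SERPs)"
    return "ok"

print(check_title("Acme Widgets | Buy Industrial Widgets Online"))  # → ok
```

Run it over every title exported from a crawl to build a quick worklist of pages to fix.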
Check for pages missing page titles and meta descriptions
Images’ file names and alt text are optimized to include the primary keyword phrase associated with the page.
- While it is beneficial to include your keyword phrase in URLs, changing your URLs can negatively impact traffic when you do a 301. As such, I typically recommend optimizing URLs when the current ones are really bad or when you don’t have to change URLs with existing external links.
- No excessive parameters or session IDs.
- URLs exposed to search engines should be static.
- 115 characters or shorter – this character limit isn’t set in stone, but shorter URLs are better for usability.
- Does the homepage have at least one paragraph?
- There has to be enough content on the page to give search engines an understanding of what a page is about. Based on my experience, I typically recommend at least 150 words.
- Do these pages have at least a few paragraphs of content? Is it enough to give search engines an understanding of what the page is about?
- Is it template text or is it completely unique?
- Is there real content on the site or is the “content” simply a list of links?
- Does the intent behind the keyword match the intent of the landing page?
- Are there pages targeting head terms, mid-tail, and long-tail keywords?
- Do a site: search in Google for important keyword phrases.
- Check for duplicate content/page titles using the Moz Pro Crawl Test.
- In addition to search engine driven content, there should be content to help educate users about the product or service.
- Is the content formatted well and easy to read quickly?
- Are H tags used?
- Are images used?
- Is the text broken down into easy to read paragraphs?
- Good headlines go a long way. Make sure the headlines are well written and draw users in.
- Since the implementation of Panda, the amount of ad-space on a page has become important to evaluate.
- Make sure there is significant unique content above the fold.
- If you have more ads than unique content, you are probably going to have a problem.
How to Write Magnetic Headlines
SEO Copywriting Tips for Improved Link Building
The Ultimate Blogger Writing Guide
Tips to Earn Links and Tweets to Your Blog Post
- Do URLs include parameters or tracking code? This will result in multiple URLs for a piece of content.
- Does the same content reside on completely different URLs? This is often due to products/content being replicated across different categories.
Exclude common parameters, such as those used to designate tracking code, in Google Webmaster Tools. Read more at
Search Engine Land.
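One way to see how many URLs collapse to the same content is to canonicalize them by stripping tracking parameters. A minimal sketch, with an illustrative parameter list you would extend for your own analytics setup:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Parameters that only carry tracking data and so create duplicate URLs.
# Illustrative list; extend it with whatever your analytics stack appends.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "utm_term",
                   "utm_content", "gclid", "sessionid"}

def canonicalize(url):
    """Strip tracking parameters and sort the rest, so duplicate URLs collapse."""
    parts = urlsplit(url)
    kept = sorted((k, v) for k, v in parse_qsl(parts.query)
                  if k.lower() not in TRACKING_PARAMS)
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), ""))

a = canonicalize("https://example.com/widgets?utm_source=news&color=blue")
b = canonicalize("https://example.com/widgets?color=blue&utm_campaign=spring")
print(a == b)  # → True
```

Two URLs that canonicalize to the same string are duplicates a search engine may index separately unless you exclude the parameters or use rel=canonical.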
- Take a content snippet, put it in quotes and search for it.
- Does the content show up elsewhere on the domain?
- Has it been scraped? If the content has been scraped, you should file a content removal request with Google.
- Does the same content exist on different sub-domains?
- Does the content exist on a secure version of the site?
- Is the content replicated on other domains owned by the company?
- If there are “printer friendly” versions of pages, they may be causing duplicate content.
Accessibility & Indexation
Check the robots.txt
- Has the entire site, or important content been blocked? Is link equity being orphaned due to pages being blocked via the robots.txt?
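You can script this check with the standard library's robots.txt parser. The robots.txt content below is hypothetical; in practice you would fetch the live file from /robots.txt:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt for illustration
robots_txt = """\
User-agent: *
Disallow: /checkout/
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Important, link-worthy pages should NOT be blocked:
print(parser.can_fetch("Googlebot", "https://example.com/products/widget"))  # → True
print(parser.can_fetch("Googlebot", "https://example.com/private/report"))   # → False
```

Running every important landing page through `can_fetch` quickly surfaces accidental blocks, including pages whose inbound link equity is being orphaned.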
- Use the Web Developer Toolbar
- Is the content there?
- Do the navigation links work?
Now change your user agent to Googlebot
- Use the User Agent Add-on
- Are they cloaking?
- Does it look the same as before?
SEO Browser to do a quick spot check.
Check the SEOmoz PRO Campaign
- Check for 4xx errors and 5xx errors.
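If you export URL-to-status data from a crawler, a few lines can bucket the errors for your report. A sketch, using a made-up crawl result for illustration:

```python
def bucket_status_codes(crawl_results):
    """Group crawled URLs by HTTP error class.

    crawl_results -- dict of {url: status_code}, e.g. the export from
    a crawler such as Screaming Frog (the data below is invented).
    """
    errors = {"4xx": [], "5xx": []}
    for url, status in crawl_results.items():
        if 400 <= status < 500:
            errors["4xx"].append(url)
        elif 500 <= status < 600:
            errors["5xx"].append(url)
    return errors

crawl = {"/": 200, "/old-page": 404, "/search": 500, "/about": 200}
print(bucket_status_codes(crawl))  # → {'4xx': ['/old-page'], '5xx': ['/search']}
```

4xx pages usually need redirects or link fixes, while 5xx pages point to server-side problems worth escalating to the dev team.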
XML sitemaps are listed in the robots.txt file
XML sitemaps are submitted to Google/Bing Webmaster Tools
Check pages for meta robots noindex tag
- Are pages accidentally being tagged with the meta robots noindex command?
- Are there pages that should have the noindex command applied?
- You can check the site quickly via a crawl tool such as Moz or Screaming Frog
Do goal pages have the noindex command applied?
- This is important to prevent direct organic visits from showing up as goals in analytics.
Site architecture and internal linking
- 100-200 is a good target, but not a rule.
- Homepage links to category pages.
- Category pages link to sub-category and product pages as appropriate.
- Product pages link to relevant category pages.
- Category pages link to other relevant category pages.
- Product pages link to other relevant product pages.
- Does not utilize massive blocks of links stuck in the content to do internal linking.
- Does not use a block of footer links instead of proper navigation.
- Does not link to landing pages with optimized anchors.
- Link Checker and Xenu are good tools for this.
Importance of Internal Linking
Internal Linking Tactics
Using Anchor Links to Make Google Ignore The First Link
Successful Site Architecture for SEO
The SEO Guide to Site Architecture
Information Architecture and Faceted Navigation
- Are 301s being used for all redirects?
- If the root is being directed to a landing page, are they using a 301 instead of a 302?
- Use Live HTTP Headers Firefox plugin to check 301s.
- These redirects can easily be identified with a tool like Screaming Frog.
- Redirect chains significantly diminish the amount of link equity associated with the final URL.
- Google has said that they will stop following a redirect chain after several redirects.
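Chain-following can be sketched in a few lines. The redirect map below stands in for what a crawler like Screaming Frog would report, and the hop limit reflects the fact that Google gives up after several redirects:

```python
def resolve_chain(redirects, url, max_hops=5):
    """Follow a redirect map to the final URL, giving up after max_hops.

    redirects -- dict of {url: redirect_target}; a stand-in for a
    crawler's redirect report (the example data below is invented).
    """
    hops = 0
    while url in redirects:
        if hops >= max_hops:
            raise RuntimeError(f"redirect chain longer than {max_hops} hops")
        url = redirects[url]
        hops += 1
    return url, hops

redirects = {
    "http://example.com/old": "http://example.com/interim",
    "http://example.com/interim": "https://example.com/new",
}
print(resolve_chain(redirects, "http://example.com/old"))
# → ('https://example.com/new', 2)
```

Any URL that resolves in more than one hop is a candidate for collapsing into a single direct 301 to preserve link equity.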
- Is content being pulled in via iFrames?
- Is the entire site done in Flash, or is Flash used sparingly in a way that doesn’t hinder crawling?
- Google WMT will give you a good list of technical problems that they are encountering on your site (such as: 4xx and 5xx errors, inaccessible pages in the XML sitemap, and soft 404s)
- Are XML sitemaps in place?
- Are XML sitemaps covering for poor site architecture?
- Are XML sitemaps structured to show indexation problems?
- Do the sitemaps follow proper XML protocols?
- Make sure it points to the correct page, and that not every page simply points to the homepage.
- This can cause a lot of problems if you have a root domain with secure sections.
Review page load time for key pages
- Is it significant for users or search engines?
Optimize your images for the web
Minify your CSS/JS/HTML
- Consider using a CDN for your images.
- Is there a mobile site set up?
- If there is, is it a mobile site, responsive design, or dynamic serving?
Make sure analytics are set up if separate mobile content exists
If dynamic serving is being used, make sure the Vary HTTP header is being used
- This helps search engines understand that the content is different for mobile users.
- Google on dynamic serving.
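The header itself is simple to send. Here is a minimal, framework-agnostic sketch of the response headers a dynamically serving site should emit (the function name is my own):

```python
def build_headers(dynamic_serving=True):
    """Sketch of response headers for a site using dynamic serving.

    When the same URL returns different HTML to mobile and desktop user
    agents, the Vary: User-Agent header tells crawlers and caches that
    the response depends on the requesting user agent.
    """
    headers = {"Content-Type": "text/html; charset=utf-8"}
    if dynamic_serving:
        headers["Vary"] = "User-Agent"
    return headers

print(build_headers()["Vary"])  # → User-Agent
```

In most stacks this is a one-line addition to the response middleware or server config rather than per-page code.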
- Do your mobile visitors have a different intent than desktop based visitors?
- If your site redirects mobile visitors away from their intended URL (typically to the homepage), you’re likely going to run into issues impacting your mobile organic performance.
- If a mobile site (m.) exists, does the desktop equivalent URL point to the mobile version with rel=”alternate”?
- Does the mobile version canonical to the desktop version?
- Official documentation.
- ex: site.com/uk/ or uk.site.com
- If the site is targeted to one specific country, is this specified in webmaster tools?
- If the site has international sections, are they targeted in webmaster tools?
- Try to avoid having all URLs in the default language
- You can check this using the “custom” filter in a Screaming Frog Crawl or by looking for self referrals.
- Are there pages that should be blocked?
- Having the same Google Analytics property will create problems with pageview-related metrics, such as inflated page views and pages per visit, and a reduced bounce rate.
- It is OK to have multiple GA properties listed; this won’t cause a problem.
- These can artificially lower bounce rates.
This audit covers the main technical elements of a site and should help you uncover any issues that are holding a site back. As with any project, the deliverable is critical. I’ve found focusing on the solution and impact (business case) is the best approach for site audit reports. While it is important to outline the problems, too much detail here can take away from the recommendations. If you’re looking for more resources on site audits, I recommend the following:
Helpful tools for doing a site audit:
Annie Cushing’s Site Audit
Web Developer Toolbar
User Agent Add-on
MozBar (Moz’s SEO toolbar)
Your own scraper
Inflow’s technical mobile best practices