Posted by randfish
We’re hearing a lot about voice search lately, and that trend doesn’t seem likely to disappear. But does it have a direct impact on how you should be thinking about your SEO strategy? In today’s Whiteboard Friday, Rand discusses what to expect when it comes to the future of search and what you can do to stay on top.
Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat about voice search, conversational search, Internet of things search, and how these attributes and the rise in these trends may or may not play a big role in our SEO strategy and tactics for the future.
Today, we have a few sort of nascent beginnings of this, and I made a prediction at the beginning of this year, in my traditional predictions post, that voice search, conversational search, Internet of things, that these wouldn’t actually have a big impact or much of an impact at all on the web marketing world. What we are hearing is from the engines, specifically Google and Bing, talking about how a higher and higher percent of their queries are coming through voice searches. However, what we’re not hearing is how this might be changing SEO or whether it’s changing SEO.
So today what we have going on is things like folks asking their device, their Android device, “Okay, Google, what’s the difference between libel and slander?” You might hear this. Maybe you have a question, something you want answered, and Google will respond verbally to you, or they might just show you the results on the screen, and then you can click through to there, or some combination of the two.
You can ask your Alexa device, the Amazon Echo Alexa device, “Alexa, did Iceland beat England in the Euro soccer game?” Or football game, as English and Icelandic people would call it. In fact, they did. Really sorry about that, England, but I kind of want to see the Icelandic commentator freak out again. That seems exciting.
For Apple products, “Hey Siri, where can I get Vietnamese rice noodles near here?” And Siri will look around you, and then return some results, that sort of thing.
Talking to cars
Of course, there’s also this idea that more and more cars are becoming hotspots for searches as drivers ask their cars, or their phones in their cars, things like, “All right, Tesla” (this is not real; you can’t actually say this to a Tesla yet, but I’m sure it’s coming), “When is my brother-in-law’s birthday, and does he drink whiskey?” Hopefully, your Tesla will be smart enough, through whatever partnerships it has with other technology companies, to be able to answer that.
This is what’s happening today. We’re seeing the rise in conversational and voice search. So there’s a new and different kind of keyword demand and also a new and different kind of result set that returns because of that. Does it really make a huge difference from an SEO perspective? Well, I’m going to argue that not yet, no, it doesn’t. However, I think there are strategic and tactical things that we should be paying attention to as this world progresses, this world of voice search, conversational search progresses.
1. The rise in instant answers without search results will continue
We’re going to see a continual rise in instant answers. What is happening is that when a lot of these voice and conversational search queries are coming through, they tend to be longer, and they tend to be seeking out an answer that a device can quickly give you a direct answer to. Things like, what I placed here, and this requires some logic and some understanding from the machines, some contextual understanding, but it’s not that challenging, and the machines are doing a good job of this.
I suspect that what we’ll continue to see is that the percent of queries with an instant answer result keeps rising over time. Now this is percent, not absolute numbers. I mean, obviously the absolute number is rising, but that doesn’t necessarily mean that the traditional kinds of queries that have been made to search engines are going to disappear.
In fact, one of the things that I would urge you and caution against is to say, “Oh, because voice and conversational search are rising, we should stop paying attention to direct, traditional web search and web results.” It may in fact be the case that even with the rise of all these instant answers and new SERP features and voice search that the raw number of clicks on search results in your industry, in your field, for your keywords may actually have gone up despite all these trends. Because these trends are additive, they are not necessarily taking away from other forms of queries, at least not necessarily.
2. Google (& Apple, Amazon, etc.) will continue to disintermediate simplistic answer/data problems:
I think Google and Apple and Amazon and all of the engines that participate in this will continue to disintermediate simplistic data and answer publishers. So I think it behooves you to question what types of information you’re publishing.
The way I’d phrase this is if a certain percent, X percent of queries that result in traffic can be answered in fewer than Y words, or with a quick image or a quick graphic, a quick number, then the engine is going to do it themselves. They don’t need you, and very frankly they’re faster than you are. They can answer that more quickly, more directly than you can. So I think it pays to consider: Are you in the safe or dangerous portion of this strategic framework with the current content that you publish and with the content plans that you have out in the future?
SAFETY DANCE VS. DANGER ZONE
- Safe: Recipes
- Dangerous: Cooking conversions
So if you’re in the world of food and cooking, recipes are probably very safe. It’s very, very difficult for an engine to say, “Okay, here, let me read you the ingredients. Let me show you the photos. Let me give you the entire rundown. I’ll give you the comments. I’ll give you the star rating.” This is too complex.
What’s very simple is cooking conversions. “Alexa, how many pounds of flour do I need to make up a cup?” Very simple cooking conversion, instant answer very possible. Pretty dangerous to be relying on a ton of your click-through traffic for that dangerous stuff.
- Safe: Sports analysis
- Dangerous: Sport scores
Sports analysis, very, very difficult for any of these services to try and provide analysis of a game, very easy for them to provide a score.
- Safe: In-depth product comparison
- Dangerous: Simplistic product price comparison
Very difficult for them to do an in-depth product comparison, very easy for them to do a specific, simplistic product price comparison. “What are the prices of X on these?”
- Safe: Learn to code tutorials
- Dangerous: Quick function lookups
Learn to code tutorials, almost impossible to disintermediate, but a quick function look-up, very easy to disintermediate.
SAFE: If it’s hard to aggregate and present simply, you have a competitive advantage, and you probably will be able to keep that traffic.
DANGEROUS: If it is easy to aggregate and present simply, you’re probably in dangerous territory long term.
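One way to think about the framework above is as a simple threshold test. Here’s a minimal sketch in Python; the word-count cutoff and the example answers are entirely hypothetical (not from the video), but they capture the idea that if a query’s core answer fits in very few words, an engine can serve it directly and skip your site.

```python
# Hypothetical cutoff: answers shorter than this are easy for an
# engine to read aloud or show as an instant answer. Tune to your data.
DANGER_WORD_THRESHOLD = 15

def snippet_risk(core_answer: str) -> str:
    """Classify content by how easily an engine could disintermediate it."""
    word_count = len(core_answer.split())
    return "dangerous" if word_count < DANGER_WORD_THRESHOLD else "safe"

# A cooking conversion: short, factual, instantly answerable.
print(snippet_risk("There are about 4 cups of flour in a pound."))  # dangerous

# A full recipe step sequence: too long and too rich to compress.
print(snippet_risk(
    "Start by browning the butter over medium heat, then whisk in the "
    "flour a little at a time, seasoning as you go, before folding in "
    "the roasted vegetables and simmering for twenty minutes."
))  # safe
```

The point isn’t the exact threshold; it’s to audit each content type you publish and ask on which side of the line its core answer falls.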
There are three things that we really think about as we move to the conversational and voice search world. Those are…
1. Keyword research & targeting requires SERP feature analysis
It requires SERP analysis of both desktop and mobile, and preferably in the future I think we’re actually going to be looking for keyword research tools that can perform a voice query and then can tell us what the results either look like or sound like from the engine.
We need to do our prioritization of keyword targeting, which keywords we actually want to select and which keywords we want to create content for and try to rank for, based on our click-through opportunity and our value. If we don’t have that information and that data, then we’re probably going to be choosing some keywords unwisely compared to our competition who is thinking about this.
2. Content structure should optimize toward formats engines will use in their instant answers
If someone searches for libel versus slander, and you rank on the first page with the right content structure, Google may pull you into that instant answer box. What we’ve seen from our research is that being in that instant answer box is not a bad thing. In fact, it tends to increase click-through rate and overall traffic for many, many publishers. Not true for everyone. Some instant answers really do disintermediate queries, like “Iceland versus England, what was the score?” If Google just tells you, you don’t need to click through. But certainly on libel versus slander, you may see “libel is a written or published defamatory statement, while slander is spoken.” It’s very likely that people will actually click through to learn more about that subject, and then you have an opportunity to serve up ads or your services or whatever product you’re selling, those types of things. So format things intelligently.
Dr. Pete did a great blog post* on how to rank number zero, how to get into those instant answer results. He recently did a presentation at SMX Advanced that he’s published on SlideShare, that you can check out as well. Both those resources very handy.
*Editor’s note: This is indeed a great blog post, but it’s still a draft. Stay tuned: we’ll share this with you on Tuesday, July 26th.
3. Keep an eye on absolute volume and search volume demand trends, NOT just percentages of queries and aggregated stats
So if keyword search volume for the terms and phrases that you care about, if the orange is typed and the green here is voice search, you can see that it looks like over here this is 60% plus, so voice search has overtaken typed search. But what’s actually happened is that, year to year, typed search has gone up as well. It didn’t stop paying to try and rank for these keywords. In fact, it paid more and more dividends. It’s just that voice search grew even faster. So I think we have to be cautious if we think about voice as completely disintermediating or taking over our industry or our content. Rather we should think of this as additive, and we need to pay close attention to the true overall volume demand, both typed and voice search over time.
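To make that arithmetic concrete, here’s a tiny illustration with made-up numbers (not data from the video). Both claims can be true at once: voice can cross 60% of queries for a keyword set while typed volume still grows year over year.

```python
# Hypothetical query volumes for one keyword set, two years apart.
year1 = {"typed": 1000, "voice": 400}
year2 = {"typed": 1200, "voice": 1900}  # typed grew 20%; voice grew ~4.75x

# Voice's *share* of all queries in year 2.
voice_share_y2 = year2["voice"] / (year2["typed"] + year2["voice"])

# Typed search's *absolute* growth, year over year.
typed_growth = year2["typed"] / year1["typed"] - 1

print(f"Voice share in year 2: {voice_share_y2:.0%}")   # 61%
print(f"Typed search still grew: {typed_growth:+.0%}")  # +20%
```

Looking only at the share, you’d conclude typed search is dying; looking at absolute volume, you’d see it’s still worth more than it was last year. That’s why the percentages alone can mislead you.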
All right gang, look forward to your tactics, your strategies for voice and conversational search, and we’ll see you again next week for another edition of Whiteboard Friday. Take care.
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!
Posted by Loop11
Rapid growth. In the business world, it’s generally thought of as a good thing. Scratch that: a great thing. But when you’re an ecommerce site, that rapid growth can also mean more hurdles to jump, especially when it comes to your SEO and information architecture.
In this episode of True North, you’ll be given a firsthand look at how one company found a way to overcome the obstacles and unite their processes of search, discovery, and transactions.
Architecting a Unicorn – SEO & IA at Envato
Ben Newton: “Hi. I’m a serial entrepreneur.” We hear that a lot these days, don’t we? I don’t know about you, but when I hear a person say that, I kind of find it repelling, as though someone has blown cigarette smoke into my personal space. It didn’t always used to be this way. Ten years ago, for instance, being an entrepreneur wasn’t the buzzword it is today. Another thing that happened 10 years ago was the founding of a bootstrapped company called Envato.
Chris Thelwell: We’re a marketplace for creative professionals. So we have people that produce assets for us like WordPress themes, like graphic assets, like photography and videos, and then we have people that want to buy that kind of stuff. And we’re the marketplace that fits in between.
Ben: Unlike the many wantrepreneurs of today, Envato has actually helped to create thousands of real entrepreneurs who have been hugely successful.
Fiorella Rizzà: A lot of our authors were able to quit their day-to-day jobs and just focus on doing what they love. We have stories of authors that have built houses and were able to provide for their children, stay at home and spend more time with their family, or travel around the world. To me, it’s this: we actually help people reach for freedom.
Ben: In the last six years, Envato has grown exponentially. Today, it has roughly 10 million assets available for sale through its network of websites. Some of their sites, such as ThemeForest, are among the most trafficked websites in the world. All of this is good news, but with a rapid growth fueled by user-generated content, problems are created. Many of these issues can be thrown into two main buckets: information architecture and search engine optimization, also known as IA and SEO.
Kate Hunter: We’ve got sort of two streams. We have new products. So when it comes to new products, it’s about working with those teams prior to development and mapping out a structure that will allow it to scale and not run into architecture problems down the track, that will inhibit the ability to grow organic traffic. And then conversely on the marketplaces, if you architect for a small amount of categories or a small amount of content 10 years ago, and you now have a large amount of content and you now have to shoehorn that into the original structure you created, it’s not necessarily the best fit.
Ben: So put yourself in Envato’s shoes. You have a runaway success, which is only growing in momentum, yet you know some things have to change. The bones of Envato need to be altered to not only handle future growth, but also to get the most out of what is currently there.
There’s no easy path here. And no step forward seems to be without two steps back, but they’re not letting that stop them. Let’s find out how they’re planning for success.
Search, discovery, and transactions in one seamless experience
Hi, and welcome to True North. My name is Ben Newton and I’m from Loop11. This is the podcast where we share stories of discovery and innovation.
As we found out, Envato manages roughly a dozen websites which are geared towards connecting creatives, from around the world, with people who need their services or assets. Having millions of visitors pouring into their sites would seem like the kind of problem you want to have, and it is. They’re not complaining.
What they are trying to do, though, is figure out how to give these visitors the best product, and this ultimately means wrapping search, discovery, and transactions into one seamless experience.
So whose problem is this? Is it an SEO problem? Or maybe it’s IA. Maybe it’s a UX or development issue.
Fiorella: From 2010 to 2015, the number of items that were uploaded to our marketplaces grew exponentially, roughly 50,000 items at the beginning of 2010 to almost 10 million items now.
Ben: That’s Fiorella Rizzà. Her role straddles copywriting, search, and information architecture.
Fiorella: But of course, what happens when you get all this content coming through is you just want to put it out there and make sure that the users are going to see it. So we have this technical constraint where an item cannot fall under multiple categories. It might sound like it’s not a big deal, but it’s actually extremely . . . like it’s a huge constraint, and it doesn’t allow for flexibility.
So what happened was the easiest way to go was just to create new categories to accommodate a new technology or a new type of item as it came up. But of course, the result of that is that the IA is, at the moment, extremely complex and not intuitive at all.
Ben: Often problems with IA don’t become apparent until you’re literally using the product and a completely rational use case exposes a severe limitation.
Fiorella: So we have a top-level category on VideoHive that’s called Stock Footage, which is pretty straightforward. You’re going to go there, you’re going to find stock footage. But then Stock Footage has a number of subcategories. I’ll give you a few examples: Holiday, Water, Nature, Hobby.
So there is an item where you can just see a boat on water, and that’s it. That’s all you see, and it’s currently falling under the Water subcategory of Stock Footage. But there’s also a Vehicles subcategory, a Hobby subcategory. All these categories would be okay for this video.
The problem there is that those pieces of information are all good to describe that item. The problem is they have been treated in the wrong way. It’s not really about the content in this case. The content is fine. The problem is how we captured the information and how we presented it to the user.
So if you search for it from the homepage, you’re going to find it. But if you click “Stock Footage” and then “Vehicles,” and you search for a boat inside the Vehicles category, you’re not going to find that one, because it belongs to another subcategory. So you’re not going to find it.
SEO is what happens when everything else is done right, including UX & IA
Ben: Although navigation and discoverability are arguably two of the most important facets for an ecommerce website, poor IA has wider-reaching implications. For a company like Envato, organic search is a massive source of traffic, and the level of your IA relates directly to your potential to perform in search engines.
Kate: To steal a quote from a conference I was just at, SEO isn’t something you can sprinkle on or apply over the top of something. It’s what happens when everything else is done right, and one of those things that has to be done right is UX and IA.
Ben: That’s Kate Hunter. She’s the Organic Performance Manager at Envato.
Kate: Search engines are trying to emulate human behavior. If it’s hard for them to crawl, if it’s hard for them to understand, then they’re not going to rank you as high as possible because they don’t think you’re doing as good a job as you could.
So at the moment I can say we’ve mapped our click-through rates based on where you can possibly rank, and our content in some cases deserves to rank two positions higher than it currently does, but it doesn’t, because search engines aren’t able to crawl it efficiently. Which also means we aren’t able to distribute our PageRank efficiently between our pages, which means discoverability and authority are very hard to achieve and execute, which is why we don’t rank those two positions higher.
So the other thing is competition. So sometimes if you were to launch a niche and you launch with a terrible IA and it stays that way, but no one ever competes with you, you’ll always rank for that because you are still the best content. But the Envato business competes in a highly competitive field with a lot of money attached to them, and the reality is our competitors are building sites the way we would build them, if we built ThemeForest today, except we built ThemeForest 10 years ago.
Ben: So this cuts to the core of the problem Envato is facing. The direction that information architecture should head is clear, but how and when to implement those changes isn’t. Their decisions are bookended by the urgency to stay in front of newer competitors and the realization that the old architecture takes a long time to change. It’s the common scenario of too big to start again, but too important not to address.
Chris: Legacy is a huge issue, and you can sort of plan where you want to be. You can plan the future. It’s a similar issue with design. We could design a really great marketplace, but we can’t deliver that. We’re too big to just deliver it. We have to deliver in little tiny steps, and we’ll probably never reach that end goal.
Ben: That’s Chris Thelwell. He’s the head of UX and Design at Envato.
Chris: So that kind of vision of where you want to get to is really hard to achieve, and you have to work out how you can take those little steps together. One example: who believes in clicking logos to go back to the homepage? Now, we’ve seen a significant amount of traffic go to our homepage and then exit on that page. The theory behind that is that people are clicking on that logo expecting to go somewhere different than where we send them. So it’s about trying to understand why you get the results you do.
We’ve got a page with a very high exit rate. You’re trying to understand: why has it got that high an exit rate? Where do the people who visit it come from? And it’s not necessarily a problem with the page; it’s maybe a problem with where the link to that page was. We’re constantly trying to understand those things.
Ben: So knowing your product and the user data behind it is key to understanding the problems you need to address. We’ve also heard that SEOs are a big reason for getting your IA right, but can it work the other way? Can improving your SEO help your information architecture?
Can improving your SEO help improve your IA?
Kate: More and more importantly for SEO is tone of voice and authenticity. Google has always said it’s trying to get its algorithms to understand results and websites in the same way that humans do. It’s never been able to do that more so than it does now.
A great example is that, two years ago, best practice would be to not use stop words. So stop words being things like “on, a, by, from” because you’ve got character limits in your title tags and that’s a waste of characters in that title.
Since Hummingbird, the difference is that these stop words are actually really, really important, because they’re not a waste of characters: stop words help to define intent. So that’s where copywriting comes in. For example, in that page title, instead of using “WordPress templates” or “WordPress themes” with the pipe character and then the term “ThemeForest,” it will now say “WordPress themes from ThemeForest,” because we need to indicate that we are a platform that allows people to sell this.
If we say “buy,” that would indicate that we make these ourselves, that we’ve made them. But we need to say they’re from us because we are the platform. If you think about how people talk or use voice search, they wouldn’t just say “WordPress themes ThemeForest”, they’d say “WordPress themes from ThemeForest.” So voice search is a good indicator. So how you’d search if you were verbally searching is a good indicator of how you should be looking at your on-page text.
Ben: What this can teach us is to keep coming back to what real people would respond to. How would normal people group or search for your information? This is easily forgotten, especially when the focus is on rankings and not the end user.
Now, Kate has said that good SEO is as a result of everything else being done right. But that doesn’t mean that it’s not constantly being monitored during each and every process of UX, design, and IA development. Both Kate and Fiorella are constantly forming benchmarks, running tests, and measuring results to ensure that the waves of constant improvement keep flowing.
Fiorella: When you create or change the information architecture, basically what you’re doing is you are defining or redefining the discovery patterns of users. So whether they’re going to be successful or unsuccessful is all up to how you structure the information. What we’ve found to be extremely useful is card sorting and especially tree-testing exercises. First we test the current structure and then we come up with a new proposed way of organizing the content, and we test it again and see how the results compare.
Ben: The way Fiorella executes tree testing is to use an online tool which is completely removed from their website’s design and content, so as not to influence the results. Rather, it’s an interface which shows just the categories and the subsequent subcategories as the user proceeds through the test. She then configures tasks for the user to complete which provides feedback.
For example, if she was testing the ThemeForest IA, the task might be for a user to pretend they were a restaurant owner looking for a new website template. They’d be asked to navigate the categories until they found an area which they believed would contain the content they were looking for.
Fiorella would then analyze the user data in aggregate, looking for the most common paths taken by users and what percentage of them found the right pages.
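That aggregation step can be sketched in a few lines of Python. The data below is invented for illustration (it’s not from Envato’s actual tool): each entry is the category path where one tree-test participant stopped, and we compute the most common paths and the share who landed on the correct page.

```python
# Minimal sketch of tree-test result aggregation, with invented data.
from collections import Counter

# The page a restaurant owner *should* land on (hypothetical IA path).
CORRECT = "Site Templates > Restaurant"

# One entry per participant: the path where they ended the task.
results = [
    "Site Templates > Restaurant",
    "Site Templates > Restaurant",
    "Site Templates > Corporate",
    "Site Templates > Restaurant",
    "Blog > Food",
]

paths = Counter(results)
success_rate = paths[CORRECT] / len(results)

# Report the most common destinations and the overall success rate.
for path, count in paths.most_common():
    print(f"{count} participant(s) ended at: {path}")
print(f"Success rate: {success_rate:.0%}")  # 60%
```

Running the same analysis against the current IA and a proposed IA gives you a before/after success rate to compare, which is exactly the kind of evidence that justifies (or kills) a restructure.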
This example also acts as a counterpoint to the SEO monitoring and testing done by Kate. While IA can be done offline and removed from the actual website, SEO is a constant monitoring and tweaking process, based on what’s actually happening out in the wild.
Kate: One of the first things I did coming into the business was put in place a rank tracking tool, which allows me to see how we were currently ranking for the popular terms relevant to us. I now know where everything ranks, and I know where I think we deserve to rank. For me, SEO isn’t about number one. Number one is a very old-school place to play, particularly because we’re a global company. You have to be at the top: one of the best 10 results worldwide. So it’s a very competitive game.
In WordPress for example, we’re highly relevant to WordPress, but WordPress is not our business. WordPress is WordPress.org or WordPress.com. So ThemeForest ranks for number two in the U.S. for WordPress themes. We cannot aim to be any higher than that because we will never trump the original source.
So for a business like ours, which doesn’t actually own its products, in a lot of cases we can’t be number one. The best we can hope for is number two, which is not a bad thing by any means. We are simply not the most relevant in every case. So the goal is to improve. I have an idea of where we deserve to rank, and that’s my goal. Number one is not the goal.
Build with a vision of the future in mind
Ben: So as Envato moves forward, there are two clear and separate ways in which they’re addressing the problem of IA and SEO.
Kate: We have, I guess, two streams. There’s the one working with the existing aging platform which we’re retrofitting, and then there’s working with the new products. Baking in IA before development, so mapping out what the future looks like. If you’re an online clothing retailer, you might only have pants and t-shirts to sell right now, but could you imagine where you sold all sorts of apparel in the future? And if so, what would the IA for a really detailed clothing store or online clothing store look like in 10 years? Map that structure and then build for that structure, but only populate the content you have now.
Fiorella: To me, the thing that helped most was that we made the decision to go with the best-case scenario. Imagine we don’t have any constraints: what is this going to look like? Because this helps you have a vision. You know where you’re going and what you’re heading towards, and that prevents you from losing track of what you’re doing and just wandering off, thinking about other possible solutions, which is very likely to happen.
Ben: To find out more about the team at Envato and how they’re thinking about the challenges they face, go to inside.envato.com. If you have a story you’d like us to consider for the show, please visit our website and send us an email.
You can subscribe to our show on iTunes where you can also rate and review us. Or go to truenorthpodcast.com and join the community.
Our music is by the Mysterious Breakmaster Cylinder.
True North is produced by Loop11. We’ll see you next time.
Posted by nikkielizabethdemere
This post was originally in YouMoz, and was promoted to the main blog because it provides great value and interest to our community. The author’s views are entirely his or her own and may not reflect the views of Moz, Inc.
Any old picture might be worth a thousand words. But your target niche doesn’t need or want a thousand words. Your ideal audience needs the right words, paired with the right images, to tell a story that uniquely appeals to their deepest desires.
Studies show that people understand images faster than words, remember them longer, and if there’s a discrepancy between what we see and what we hear, our brains will choose to believe what they see. Our brains prioritize visual information over any other kind, which makes images the fast-track to connection all marketers are looking for.
So don’t just slap some text on a stock photo and call it good. You can do better. Much better. And I’ll show you how.
Understand the symbolic underpinnings
This homepage from Seer Interactive does a lot right. The copy below this central image is golden: “We’re Seer. We pride ourselves on outcaring the competition.” Outcaring? That’s genius!
But, I would argue, pairing this image with these words, “It’s not just marketing, it’s personal,” is less than genius. There’s nothing personal about this picture. Sure, there are people in it, but chatting with a group of coworkers doesn’t say “personal” to me. It says corporate.
What if they paired those words with this free image by Greg Rakozy from Unsplash?
There’s something about this image that isn’t just personal; it’s intimate. Two people connecting in the dark, surrounded by snowflakes that almost look like white noise. Could this be a metaphor for reaching out through the noise of the Internet to make a personal connection? To get someone to fall in love (with your brand), even?
Many philosophers, anthropologists, sociologists, and psychologists have pointed out that humans are uniquely symbolic creatures.
– Clay Routledge, Ph.D., The Power of Symbolism, Psychology Today
A truly powerful image speaks to us on a symbolic level, feeding us information by intuition and association. Humans are associative creatures. We naturally derive deep, multifaceted meanings from visual cues, an idea brought into prominence by both Sigmund Freud and Carl Jung.
The magic behind an effective symbol is its ability to deliver messages to both our conscious minds and subconscious awareness. When choosing the right image for marketing copy, whether an ad or the “hero” section of your website, consider not just what you want to tell people, but what you want them to feel.
A symbol must possess at one and the same time a double or a multiple significance … Thus all symbols possess both a ‘face’ and a ‘hidden’ value, and it is one of the great achievements of psychology to have shown how the ‘hidden’ value is generally, from the point of view of function, the more important. …Behind this face value lies a mass of undifferentiated feelings and impulses, which do not rise into consciousness, which we could not adequately put into words even if we wanted to… and which, though they go unattended to, powerfully influence our behavior.
– F.C. Bartlett, “The social functions of symbols,” Australasian Journal of Psychology and Philosophy
And, of course, as you’re looking through images, consider this:
What type of images and experiences will resonate with your target audience’s deepest desires?
This, of course, requires you to have built out a robust buyer persona that includes not just their demographic information with a catchy name but also their extracurricular passions: the driving forces that get them out of bed and into the office each day.
As with conversion copywriting, the key to success is identifying motivations and using them to create a visual representation of your niche’s most desired outcomes.
Set the stage for an experience, not just a product
In keeping with the theme of images that deliver the desired outcome, the most effective online ads do this in a way that invites the viewer to experience that outcome. Instead of featuring simply a product, for example, these ads set the stage for the experience that buying the product just might enable you to have.
ModCloth is a master of this. Doesn't this image make you want to take a nap in a nice, cozy cabin? You can get that experience (or something like it) if you buy their hammock.
Unless you live in the deep woods of the Appalachian mountains, your home will never look like this. But some of us wish ours did, and we’re clearly the target audience. This picture speaks to our deepest need to get away from everyone and everything for some much-needed rest and recuperation.
When choosing images, it’s just as important to consider symbolism as it is to consider the target viewers. What experience will resonate with them most? What images will sell their desired experiences?
ModCloth's recent "road trip" slider doesn't say anything about the clothes they're trying to sell, for example. But it does speak to a sense of adventure and the power of female friendships, both of which are defining characteristics of their target niche of millennial women with a delightfully quirky fashion sense.
You don't have to be a clothing company to capitalize on this idea, or even a B2C company. Check out how these B2B companies use images to make their words not just read, but felt.
Don't you feel like you're Superman out for a midnight joyride? All the world at your fingertips? Yeah, that's the point. What they're selling, essentially, is omniscience via data. All the benefits of DC Comics-like superpowers, minus the kryptonite.
You might not catch it at first glance, but look at how cozy these people are. They're wearing knit sweaters (not suits) while cradling warm cappuccinos in their hands – clearly, this sales meeting is going well. No pressure tactics here. Quite the opposite.
For this example from Blitz Marketing, you'll have to visit their website, because this isn't a static image – it's a video montage designed to get you PUMPED! Energy practically radiates off the screen (which, we are left to infer, is the feeling you'd get all the time if you worked with this creative marketing agency).
Piston, another ad agency, takes a more subtle approach, which I love. Instead of having your standard stock photo of “man in a suit,” they did a custom photo shoot and added quirky elements, like a pink candy ring. I find this image particularly powerful because it effectively sets up an expectation (man in a suit), then adds a completely unexpected element (candy ring), which is conveniently located behind the word CREATIVE. This illustrates just how creative this agency is while remaining utterly professional.
Numbers are compelling. Numbers with visual aids? Unstoppable.
Let's say your buyer persona isn't driven by emotion. Show this persona a grid of city lights from 2,000 feet up, and they won't feel like Superman. They'll be wondering what this has to do with the ROI they can expect.
Someone get this persona some numbers already.
When conversion depends heavily on gaining credibility, pictures can be very compelling. In fact, one study out of the Victoria University of Wellington in New Zealand showed that simply having an image makes the text alongside that image more believable, even if the image had nothing at all to do with the text.
When people evaluate claims, they often rely on what comedian Stephen Colbert calls ‘truthiness,’ or subjective feelings of truth.
– Nonprobative photographs (or words) inflate truthiness, by E.J. Newman, M. Garry, D.M. Bernstein, J. Kantner, D.S. Lindsay
Essentially, any image is better than nothing. But the right image? It's worth even more. In a similar study by the Psychology departments at Colorado State University and the University of California, researchers experimented with brain images.
Brain images are believed to have a particularly persuasive influence on the public perception of research on cognition. Three experiments are reported showing that presenting brain images with articles summarizing cognitive neuroscience research resulted in higher ratings of scientific reasoning for arguments made in those articles, as compared to articles accompanied by bar graphs, a topographical map of brain activation, or no image.
– Seeing is believing: The effect of brain images on judgments of scientific reasoning, by David P. McCabe and Alan D. Castel
However, what if we traded in this either/or philosophy (either picture or no picture, either picture or bar graph) for a philosophy that uses the best of all resources?
Having the right image, supported by the right words, and given credibility by real numbers (as statistics or in graphs/charts) is the most effective possible combination.
Statistics have also proven to be compelling. In Blinded with science: Trivial graphs and formulas increase ad persuasiveness and belief in product efficacy, a study out of Cornell University reveals that just the appearance of being scientific increases an ad's persuasiveness. What does that "appearance" require?
Graphs. Simple, unadorned graphs.
And those graphs were even more effective at persuading people who had "a greater belief in science" (e.g., your logical buyer persona).
Put the right words together with the right image, then overlay with a supportive set of numbers, and you can convince even the most logical persona that you have the solutions they seek.
Caveat: When the name of the game is building credibility, don't undermine yourself with shoddy data and lazy analysis. One of your smart customers will, without fail, call you out on it.
Graphs and charts don't have to be fancy or complicated to be convincing. Check out these two graphs from the Kissmetrics article Most of Your A/B Test Results are Illusory and That's Okay by Will Kurt.
Do you even need to read the rest of the article to get the point? (Though you will want to read the article to find out exactly what that scientist is doing so right.) This is highly effective data storytelling that shows you, at a glance, the central point the author is trying to make.
CubeYou, a social data mining company that turns raw numbers into actionable insights, does great data storytelling by combining stats and images. Not only do these visuals deliver demographic information, they put a face on the target at the same time, effectively appealing to both logical and more intuitive personas in one fell swoop.
And for even more powerful images, look at the data visualizations Big Mountain Data put together of the #WhyIStayed domestic violence hashtag. Talk about telling an impactful story.
Then there are infographics that include data visualization, images, and analysis. I love this one from CyberPRMusic.com.
It's all about telling their story
Uninspired visuals are everywhere. Seriously, they're easy to find. In researching this article, I could find 20 bad images for every good one I've included here.
Herein lies an opportunity to stand out.
Maybe the intersection of words, images, and numbers isn't well understood in online marketing. Maybe having free stock photos at our fingertips has made us lazy in their use. Maybe there aren't enough English majors touting the benefits of effective symbolism.
Whatever the reason, you now have the chance to go beyond telling your target niche about your product or service's features and benefits. You have the ability to set your brand apart by showing them just how great life can be. Free tools such as Visage make it possible.
But first, you have to care enough to make compelling images a priority.
What are your thoughts on using stunning visuals as needle-movers for your brand?
Posted by Kaitlin
There are lots of myths and misconceptions surrounding the subject of international SEO. I recently gave a Mozinar on this; I'd like to share the basis of that talk in written form here. Let's first explore why international SEO is so confusing, then dive into some of the most common myths. By the end of this article, you should have a much clearer understanding of how international SEO works and how to apply the proper strategies and tactics to your website.
One common trend is the lack of clarity around the subject. Let’s dig into that:
Why is international SEO so confusing?
There are several reasons:
- Not everyone reads the Google Webmaster Guidelines or has a clear understanding of how Google indexes and ranks international content.
- Guidelines vary among search engines, such as Bing, Yandex, Baidu, and Google.
- Guidelines change over time, so it's difficult to keep up with changes and adapt your strategy accordingly.
- It's difficult to implement best practices on your own site. There are many technical and strategic considerations that can conflict with business needs and competing priorities. This makes it hard to test and find out what works best for your site(s).
A little history
Let's explore the reasons behind the lack of clarity on international SEO a bit further. Looking at its development over the years will help you better understand why it's confusing, laying some groundwork for the myth-busting that is about to come. (Also, I was a history major in college, so I can't help but think in terms of timelines.)
Please note: This timeline is constructed almost entirely on Google Webmaster blog announcements. There are a few notes in here about Bing and Yandex, but it’s mostly focused on Google and isn’t meant to be a comprehensive timeline. Mostly this is for illustrative purposes.
Our story begins in 2006. In 2006 and 2007, things are pretty quiet. Google makes a few announcements, the biggest being that webmasters could use geo-targeting settings within Webmaster Tools. They also clarify some of the signals they use for detecting the relevance of a page for a particular market: ccTLDs, and the IP of a server.
In 2009, Bing reveals its secret sauce, which includes ccTLDs, reverse IP lookup, language of body content, server location, and location of backlinks.
In 2010, things start to get really exciting. Google reveals some of the other hints that they use to detect geo-targeting, presents the pros and cons of the main URL structures that you can use to set up your international sites, and gives loads of advice about what you should or shouldn't do on your site. Note that at just about the same time Google says they ignore the meta language tag, Bing says that they do use that tag.
Then, in fall of 2010, hreflang tags are introduced to the world. Until this, there was no standard page-level tag to tell a search engine what country or language you were specifically targeting.
Originally, hreflang tags were only meant to help Google sort out multi-regional pages (that is, pages in the same language that target different countries). Then, in 2011, Google expands hreflang tag support to work across languages as well. Also during this time, Google removes the requirement to use canonical tags in conjunction with hreflang tags, saying they want to simplify the process.
Then in 2012, hreflang tags are supported in XML sitemaps (not just page tags). Also, the Google International Help Center is created, with a bunch of useful information for webmasters.
In 2013, the concept of the “x-default” hreflang tag is introduced, and we learn that Yandex is also supporting hreflang tags. This same year, Bing adds geo-targeting functionality to Bing Webmaster Tools, a full 5 years after Google did.
Note that it isn't until 2014 that Google begins including hreflang tag reporting within Google Webmaster Tools. Up until that point, webmasters would have had to read about hreflang tags somewhere else to know that they exist and should be used for geo-targeting and language-targeting purposes. Hreflang tags become much more prominent after this change.
In 2015, we see improvements to locale-adaptive crawling, and some clarity on the importance of server location.
To sum up, this timeline shows several trends:
- Hreflang tags were super confusing at first
- There were several iterations to improve hreflang tag recommendations between 2011 and 2013
- Hreflang tag reporting was only added to Google Search Console in 2014
- Even today, only Google and Yandex support hreflang. Bing and the other major search engines still do not.
There are good reasons for why webmasters and SEO professionals have misconceptions and questions about how best to approach international SEO.
At least 25% of hreflang tags are incorrect
Letâ€™s look at the adoption of hreflang tags specifically. According to NerdyData, 1.7 million sites have at least one hreflang tag.
I did a quick search to find out:
438,417 sites have hreflang="uk"
7,829 sites have hreflang="en-uk"
Both of these tags are incorrect. The correct ISO code for the United Kingdom is actually gb, not uk. Plus, you can't target by country alone – you have to target by language-country pairs or by language. Thus, just writing "uk" is incorrect as well.
That means at least 25% of hreflang tags are incorrect, and I only did a brief search to find a couple of the most commonly mistaken ones. You can imagine just how many sites out there are getting these hreflang values wrong.
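Mistakes like these are easy to catch programmatically. Below is a minimal Python sketch of an hreflang value checker; the whitelists are tiny illustrative subsets rather than the full ISO lists, and the function name is my own invention:

```python
import re

# Tiny illustrative subsets -- a real validator should load the complete
# ISO 639-1 (language) and ISO 3166-1 alpha-2 (region) code lists.
VALID_LANGS = {"en", "es", "de", "fr", "ja", "uk"}   # careful: "uk" is Ukrainian!
VALID_REGIONS = {"us", "gb", "ca", "de", "jp"}

def check_hreflang(value):
    """Return None if the value looks valid, else a short error message."""
    v = value.lower()
    if v == "x-default":
        return None
    if v == "uk":
        # Technically parseable, but it means the Ukrainian language --
        # sites targeting the United Kingdom almost always want "en-gb".
        return "'uk' means Ukrainian; use 'en-gb' for the United Kingdom"
    m = re.fullmatch(r"([a-z]{2})(?:-([a-z]{2}))?", v)
    if not m:
        return "malformed value; expected 'xx' or 'xx-yy'"
    lang, region = m.group(1), m.group(2)
    if lang not in VALID_LANGS:
        return f"unknown language code '{lang}'"
    if region and region not in VALID_REGIONS:
        return f"unknown region code '{region}' (the UK is 'gb', not 'uk')"
    return None
```

Running this over the two values above flags both: `check_hreflang("uk")` and `check_hreflang("en-uk")` each return an error message, while `check_hreflang("en-gb")` returns None.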
All of this is to prove a point: the field is ripe for optimization when it comes to global SEO. Now, let's debunk some myths!
Myth #1: I need to have multiple websites in order to rank around the world.
There’s a lot of talk about needing ccTLDs or separate websites for your international content. (A ccTLD is a country-coded top-level domain, such as example.ca, which is country-coded for Canada).
However, it is possible for your website to rank in multiple locations around the world. You don’t necessarily need multiple websites or sub-domains to rank internationally; in many cases, you can work within the confines of your current domain.
In fact, if you take a look at your analytics on your website, even if it has no geo-targeting whatsoever, chances are you already have traffic coming in from various languages and countries.
Many global brands have only one site, using subfolders for their multilingual or multi-regional content. Don’t feel that international SEO is beyond your reach because you believe it requires multiple websites. You may only need one!
The most important thing to remember when deciding whether you need separate websites is that new websites start with zero authority. You will have to fight an uphill battle to establish and rank those new ccTLDs – and for some companies, organic traffic growth may lag for many years after launching ccTLDs. This is not to say that ccTLDs are not a good option, but keep in mind that they are not the only option.
Myth #2: “The best site structure for international rankings is _________.”
There’s a lot of debate about what the best site structure is for international rankings. Is it subfolders? Subdomains? ccTLDs?
Some people swear by ccTLDs, saying that in some markets users prefer to buy from local sites, resulting in higher click-through rates. Others champion subdomains or sub-directories.
There is no single answer to the best international site structure; I've seen websites with each of these structures dominate in their verticals. However, there are certain advantages and disadvantages to each, so it's best to research your options and decide which is best for you.
Google has published their pros and cons breakdown of the URL structures you can use for international targeting. There are 4 options listed there:
- Country-specific domains, aka ccTLDs
- Subdirectories with gTLDs (generic top-level domains, like .com or .org)
- Subdomains with gTLDs
- URL parameters. These are not recommended.
Subdirectories with gTLDs have the added benefit of consolidating domain authority, while subdomains and ccTLDs have the disadvantage of making it harder to build up domain authority. In my opinion, subdomains are the least advantageous of the 3 recommended options because they lack the distinct geo-targeting advantage that ccTLDs have, and they lack the consolidated backlink profile that subdirectories have.
The most important thing to think about is what's best for your business. Consider whether you want to target at a language level or a country level. Then decide how much effort you want to (or can) put behind building up domain authority for your new domains.
Or, for those who are more visual learners:
- ccTLDs are a good option if you're Godzilla. If branding isn't a problem for you, if you have your own PR, if building up domain authority and handling multiple domains is no big deal, then ccTLDs are a good way to go.
- Subdirectories are a good option if you're MacGyver. You're able to get the job done using only what you've got.
- Subdomains are a good option if you're Wallace and Gromit. Somehow, everything ends well despite the many bumps in the road.
I researched the accuracy of each type of site structure. First, I looked at Google Analytics data and SEMRush data to find out what percentage of the time the correct landing page URL was ranking in the correct version of Google. I did this for 8 brands and 30 sites in total, so my sample size was small, and there are many other factors that could skew the accuracy of this data. But it’s interesting all the same. ccTLDs were the most accurate, followed by subdirectories, and then subdomains. ccTLDs can be very effective because they give very clear, unambiguous geo-targeting signals to search engines.
However, there’s no one-size-fits-all approach. You need to take a cold, hard look at your business and consider things like:
- Marketing budget you have available for each locale
- Crawl bandwidth and crawl budget available for your site
- Market research: which locales should you target?
- Costs associated with localization and site maintenance
- Site performance concerns
- Overall business objectives
As SEOs, we’re responsible for forecasting how realistically our websites will be able to grow and improve in terms of domain authority. If you believe your website can gain fantastic link authority and your team can manage the work involved in handling multiple websites, then you can consider ccTLDs (but whichever site structure you choose will be effective). But if your team will struggle under the added burden of developing and maintaining multiple (localized!) content efforts to drive traffic to your varied sites, then you need to slow down and perhaps start with subdirectories.
Myth #3: I can duplicate my website on separate ccTLDs or geo-targeted sub-folders & each will rank in their respective Googles.
This myth refers to taking a site, duplicating it exactly, and then putting it on another domain, subdomain, or subfolder for the purposes of geo-targeting.
And when I say “in their respective Googles,” I mean the country-specific versions of Google (such as google.co.uk, where searchers in the United Kingdom will typically begin a search).
You can duplicate your site, but it’s kind of pointless. Duplication does not give you an added boost; it gives you added cruft. It reduces your crawl budget if you have all that content on one domain. It can be expensive and often ineffective to host your site duplicated across multiple domains. There will be cannibalization.
Often I’ll see a duplicate ccTLD get outranked by its .com sister in its local version of Google. For example, say a site like example.co.uk is a mirror of example.com, and the example.com outranks the example.co.uk site in google.co.uk. This is because geo-targeting is outweighed by the domain authority of the .com. We saw in an earlier chart that ccTLDs can be the most accurate for showing the right content in the right version of Google, but that’s because those sites had a good spread of link authority among each of their ccTLDs, as well as localized content.
There's a big difference between the accuracy of ccTLDs when they're localized and when they are dupes. I did some research using the SEMRush API, looking at 3 brands using ccTLDs in 26 country versions of Google, where the .com outranked the ccTLD 42 times. You shouldn't host your site mirrored across multiple ccTLDs just for the heck of it; it's only effective if you can localize each one.
To sum it up: Avoid simply duplicating your site if you can. The more you can do to localize and differentiate your sites, the better.
Myth #4: Geo-targeting in Search Console will be enough for search engines to understand and rank my content correctly.
Geo-targeting your content is not enough. As we covered in the last example, if you have two pages that are exactly the same and you geo-target them in Search Console, that doesn't necessarily mean those two pages will show up in the correct versions of Google. This doesn't mean you should neglect geo-targeting in Google Search Console (or Bing or Yandex Webmaster Tools) – you should definitely use those options. However, search engines use a number of different clues to handle international content, and geo-targeting settings do not trump those other signals.
Search engines have revealed some of the international ranking factors they use. Here are a few that have been confirmed:
- Translated content of the page
- Translated URLs
- Local links from ccTLDs
- NAP info – this could also include local currencies and links to Google My Business profiles
- Server location*
*Note that I included server location in this list, but with a caveat – we'll talk more about that in a bit.
You need to take into account all of these factors, and not just some of them.
Myth #5: Why reinvent the wheel? There are multinational companies who have invested millions in R&D – just copy what they do.
The problem here is that large multinational companies don't always prioritize SEO. They make SEO mistakes all the time. It's a myth that you should look to Fortune 500 websites or top e-commerce websites to see how they structure their websites; they don't always get it right. Imitation may be the best form of flattery, but it shouldn't replace careful thought.
Besides, what multinational companies do in terms of site structure and SEO differs widely. So if you were to copy a large brand's site structure, which should you copy? Apple, Amazon, TripAdvisor, Ikea…?
Myth #6: Using URL parameters to indicate language is OK.
Google recommends against this, and from my experience, it’s definitely best to avoid URL parameters to indicate language or region.
What this looks like in the wild is a URL such as example.com/page?lang=fr (a made-up illustration), where the target language or region of the page changes depending on the parameter. The problem is that parameters aren't dependable: sometimes they'll be indexed, sometimes not. Search engines prefer unique URLs.
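If a site is stuck with parameter-based language URLs, the usual remedy is migrating to unique, path-based URLs. Here's a minimal Python sketch of such a rewrite; the domain, paths, and parameter name are all hypothetical:

```python
from urllib.parse import urlsplit, parse_qs

def path_based_url(url, param="lang"):
    """Rewrite a hypothetical ?lang=xx URL into a unique subdirectory URL."""
    parts = urlsplit(url)
    query = parse_qs(parts.query)
    lang = query.get(param, [None])[0]
    if not lang:
        return url  # nothing to rewrite
    # Drop the language parameter and keep any remaining query string.
    remaining = "&".join(
        f"{k}={v}" for k, vs in query.items() if k != param for v in vs
    )
    # Prefix the path with the language subdirectory.
    new_url = f"{parts.scheme}://{parts.netloc}/{lang.lower()}{parts.path}"
    return new_url + (f"?{remaining}" if remaining else "")

print(path_based_url("https://example.com/products?lang=fr"))
# -> https://example.com/fr/products
```

In practice you'd pair a migration like this with 301 redirects from the old parameter URLs to the new paths.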
Myth #7: I can just proxy localized content into my existing URLs.
In this situation, a website will use the IP address or the Accept-Language header of a user to detect their location or browser language preference, then change the content of the page based on that information. So the URL stays the same, but the content changes.
Google and Bing have clearly said they don't like parameters and recommend keeping one language on one URL. Proxied content, content served by a cookie, and side-by-side translations all make it very problematic for search engines to index a page in one language. Search engines may crawl from locations all over the world, so they'll get conflicting messages about the content of a page.
Basically, you always want to have 1 URL = 1 version of a page.
Google has improved and will continue to improve its locale-aware crawling. In early 2015, they announced that Googlebot will crawl from a number of IP addresses around the world, not just the US, and will use the Accept-Language header to see if your website is locale-adaptive and changes the content of the page depending on the user. But in the same breath, they made it very clear this technology is not perfect, it does not replace the recommendation to use hreflang, and they still recommend you NOT use locale-adaptive content.
Myth #8: Adding hreflang tags will help my multinational content rank better.
Hreflang tags are one of the most powerful tools in the international SEO toolbox. They’re foundational to a successful international SEO strategy. However, they’re not meant to be a ranking factor. Instead, they’re intended to ensure the correct localized page is shown in the correct localized version of Google.
In order to get hreflang tags right, you have to follow the documentation exactly. With hreflang, there is no margin for error. Make sure to use the correct language (in ISO 639-1 format) and country codes (in ISO 3166-1 Alpha 2 format) when selecting the values for your hreflang tags.
- Exact ISO codes for language, and for language-country if you target by country
- Return tags
- Self-referential tags
- Point to correct URLs
- Include all URLs in an hreflang group
- Use page tags or XML sitemaps, preferably not both
- Use HTTP headers for PDFs, etc.
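Most of the checklist above boils down to one rule: every page in a group carries the same, complete set of tags, including one for itself. A minimal Python sketch of a tag generator (the example.com URLs are hypothetical):

```python
def hreflang_tags(group, x_default=None):
    """Given {hreflang_value: url} for one group of localized pages, emit
    the <link> elements that EVERY page in the group should carry in its
    <head>. Because each page gets the full set, the self-referential and
    return-tag requirements are satisfied automatically."""
    tags = [
        f'<link rel="alternate" hreflang="{value}" href="{url}" />'
        for value, url in sorted(group.items())
    ]
    if x_default:
        tags.append(
            f'<link rel="alternate" hreflang="x-default" href="{x_default}" />'
        )
    return "\n".join(tags)

group = {
    "en-us": "https://example.com/us/widgets",
    "en-gb": "https://example.com/uk/widgets",
    "de-de": "https://example.com/de/widgets",
}
print(hreflang_tags(group, x_default="https://example.com/widgets"))
```

Generating the tags programmatically from one source of truth, rather than hand-editing each page, is the simplest way to keep a group consistent.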
Be sure to check your Google Search Console data regularly to make sure no return tag errors or other errors have been found. A return tag error is when Page A has an hreflang tag that points to Page B, but Page B doesn’t have an hreflang tag pointing back to Page A. That means the entire hreflang association for that group of pages won’t work, and you’ll see return tag errors for those pages in Google Search Console.
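The reciprocity requirement is simple enough to audit with a script. Here's a hedged sketch; the page URLs are hypothetical, and a real audit would first parse the hreflang tags out of your crawled HTML:

```python
def find_return_tag_errors(pages):
    """pages maps each URL to the set of alternate URLs its hreflang tags
    point to. Report every A -> B link where B does not point back to A."""
    errors = []
    for page, alternates in pages.items():
        for alt in alternates:
            if alt == page:
                continue  # self-referential tag, always fine
            if page not in pages.get(alt, set()):
                errors.append((page, alt))
    return errors

pages = {
    "https://example.com/en/": {"https://example.com/en/", "https://example.com/fr/"},
    "https://example.com/fr/": {"https://example.com/fr/"},  # missing return tag!
}
print(find_return_tag_errors(pages))
# -> [('https://example.com/en/', 'https://example.com/fr/')]
```

Running a check like this before deployment catches return tag errors earlier than waiting for them to surface in Search Console.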
Either the page tagging method or the XML hreflang sitemap method works well. For some sites, an XML sitemap is advantageous because it avoids the code bloat of page tags. Whichever implementation allows you to add hreflang tags programmatically is a good choice. If you use one of the popular CMS platforms, there are tools on the market to assist with page tagging.
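For the sitemap method, each URL in a group gets its own `<url>` block repeating the full set of alternates. A minimal Python sketch of the rendering (the URLs are hypothetical):

```python
def sitemap_entries(group):
    """Render the <url> blocks for one hreflang group. Every URL in the
    group repeats the full set of xhtml:link alternates, mirroring the
    return-tag rule for page tags. The enclosing <urlset> element must
    declare xmlns:xhtml="http://www.w3.org/1999/xhtml"."""
    alternates = "\n".join(
        f'    <xhtml:link rel="alternate" hreflang="{value}" href="{url}"/>'
        for value, url in sorted(group.items())
    )
    return "\n".join(
        f"  <url>\n    <loc>{url}</loc>\n{alternates}\n  </url>"
        for url in sorted(group.values())
    )

group = {
    "en-gb": "https://example.com/uk/",
    "de-de": "https://example.com/de/",
}
print(sitemap_entries(group))
```

Note how quickly this grows: a group of N locales produces N blocks of N alternates each, which is why generating the sitemap from data beats maintaining it by hand.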
Here are some tools to help you with hreflang:
- Page tag generator: http://www.internationalseomap.com/hreflang-tags-generator
- Hreflang validators:
- XML sitemap generators:
Myth #9: I can't use a canonical tag on a page with hreflang tags.
When it comes to hreflang tags AND canonical tags, many eyes glaze over. This is where things get really confusing. I like to keep it super simple.
The simplest thing is to keep all your canonical tags self-referential. This is a standard SEO best practice anyways. Regardless of whether you have hreflang tags on a page, you should be implementing self-referential canonical tags.
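A sketch of what that looks like in practice, assuming hypothetical example.com URLs: each localized page's canonical points at itself, while its hreflang tags cross-reference every locale in the group:

```python
def head_tags(page_url, alternates):
    """Build the <head> links for one localized page: a SELF-REFERENTIAL
    canonical, plus hreflang alternates (which include the page itself)."""
    tags = [f'<link rel="canonical" href="{page_url}" />']
    for value, url in sorted(alternates.items()):
        tags.append(f'<link rel="alternate" hreflang="{value}" href="{url}" />')
    return "\n".join(tags)

alternates = {
    "fr": "https://example.com/fr/",
    "en": "https://example.com/en/",
}
# The French page canonicalizes to itself -- never to the English page.
print(head_tags("https://example.com/fr/", alternates))
```

The mistake to avoid is canonicalizing all locales to one "main" version, which tells Google to ignore the very alternates your hreflang tags describe.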
Myth #10: I can use flag icons on my site to indicate the site's language.
Flags are not languages – there's even a whole website dedicated to talking about this common myth: http://www.flagsarenotlanguages.com. It has many examples of sites that mistakenly use flag icons to indicate languages.
For example, the UK’s Union Jack doesn’t represent all speakers of English in the world. Thanks to the course of history, there are at least 101 countries in the world where English is a common tongue. A flag of a country to represent speakers of a language is very off-putting for any users who speak the language but aren’t from that country.
Here’s an example where flag icons are used to indicate language. A better (and more creative) approach is to replace the flag icons with localized greetings:
If you have a multilingual site, you should not use flags to represent languages. Instead, use the name of each language, written in that language. English should be "English," Spanish should be "Español," German should be "Deutsch," etc. You'd be surprised how many websites forget to use localized language or country spellings.
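A language switcher built this way needs nothing more than a mapping of locale codes to endonyms (each language's name in that language). A small Python sketch; the labels and URL scheme are illustrative:

```python
# Hypothetical locale labels: name each language in that language
# (its endonym) -- never with a flag icon.
LANGUAGE_LABELS = {
    "en": "English",
    "es": "Español",
    "de": "Deutsch",
    "fr": "Français",
    "ja": "日本語",
}

def switcher_links(current, base="https://example.com"):
    """Render a simple language-switcher list, skipping the active locale."""
    return "\n".join(
        f'<a href="{base}/{code}/">{label}</a>'
        for code, label in sorted(LANGUAGE_LABELS.items())
        if code != current
    )

print(switcher_links("en"))
```

Keeping the labels in one mapping also makes it harder for a translated label to drift out of sync across templates.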
Myth #11: I can get away with automated translations.
The technology for automated translations or machine translations has been improving in recent years, but it’s still better to avoid automated translations, especially machine translation that involves no human editing.
Automatic translations can be inaccurate and off-putting. They can hurt a website trying to rank in a competitive landscape. A great way to get an edge on your competitors is to use professional, high-quality native translators to localize your content into your target languages. High-quality localization is one of the key factors in improving your rankings when it comes to international SEO.
If you have a very large amount of content that you cannot afford to translate, choose some of the most important content for human translation, such as your main category and product pages.
Myth #12: Whichever site layout and user experience works best in our core markets should be rolled out across all our markets.
This is something I've seen happen on many, many sites, and it was part of the reason why eBay failed in China.
Porter Erisman tells the story in his book Alibaba's World, which I highly recommend. He describes how, when eBay and Alibaba were duking it out in China, eBay made the decision to apply its Western UX principles to its Chinese site.
In Alibaba's World, Erisman writes that eBay "eliminated localized features and functions that Chinese Internet users enjoyed and forced them to use the same platform that had been popular in the US and Germany. Most likely, eBay executives figured that because the platform had thrived in more industrialized markets, its technology and functionality must be superior to a platform from a developing country.
"Chinese users preferred Alibaba's Taobao platform over eBay, because it had an interface that Chinese users were used to – cute icons, flashing animations, and a chat feature that connected customers with sellers. In the West, bidding starts low and ends high, but Chinese users preferred to haggle with sellers, who would start their bids high and end low."
From this story, you can see how localization – in terms of site design, UX, and holistic business strategy – can be of paramount importance.
Here is an example of Lush's Japanese site, which has bright colors, a lot going on, and is almost completely localized into Japanese. Also notice the chat box in the bottom right:
Now compare that to the Lush USA site. There’s a lot more white space here, fewer tiles, and the chat box is only a small button on the right sidebar.
They've taken the effort to adjust the layout according to how they want to express their brand in each market, rather than just replacing tiles in the same CMS layout with localized tiles. Yet in both markets they have many elements that are similar, too. They're a good example of keeping a unified global brand while leaving plenty of room for local expression.
The key to success internationally is localizing your online presence while at the same time having a unified global brand. From an SEO perspective, you should make sure there’s a logical organization to your global URLs so that localized content can be geo-targeted by subdirectory, subdomain, or domain. You should focus on getting hreflang tags right, etc. But you should also work with a content strategy team to make sure that there will be room for trans-creation of content, as well as with a UX design team to make sure that localized content can be showcased appropriately.
Design, UX, site architecture – all of these things play increasingly important roles in SEO. By localizing your design, you're reducing duplicate content and potentially improving your site engagement metrics (and, by extension, your clickstream data).
Things that an SEO definitely wants to localize are:
- Meta titles & descriptions
- Navigation labels
- Image file names, internal anchor text, & alt text
- Body content
Make sure to focus on keyword variations between countries, even within the same language. For example, there are differences in names and spellings for many things in the UK versus the US. A travel agency might describe their tours to a British audience as “tailor-made, bespoke holidays,” while they would tell their American audience they sell “customized vacation packages.”
If you used the same keywords to target all countries that share a common tongue, you’d be losing out on the ability to choose the best keywords for each country. Take this into account when considering your keyword optimization.
Myth #13: We can just use IP sniffing and auto-redirect users to the right place. We don’t need hreflang tags or any type of geo-targeting.
A lot of sites use some form of automatic redirection, detecting the user’s IP address and redirecting them to another website or to a different page on their site that’s localized for their region. Another common practice is to use the Accept-Language header to detect the user’s browser language preference, redirecting users to localized content that way.
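As a sketch of what that Accept-Language detection usually looks like under the hood (the helper name is mine; most web frameworks ship an equivalent), the header carries language tags with optional quality weights:

```python
def parse_accept_language(header):
    """Parse an Accept-Language header into (tag, quality) pairs, best first."""
    langs = []
    for part in header.split(","):
        piece = part.strip()
        if not piece:
            continue
        if ";q=" in piece:
            # e.g. "en;q=0.8" means English with quality weight 0.8
            tag, q = piece.split(";q=", 1)
            try:
                quality = float(q)
            except ValueError:
                quality = 0.0
        else:
            # No explicit weight means quality 1.0 (most preferred)
            tag, quality = piece, 1.0
        langs.append((tag.strip(), quality))
    return sorted(langs, key=lambda pair: -pair[1])

# A UK user's browser might send:
print(parse_accept_language("en-GB,en;q=0.8,ja;q=0.6")[0][0])  # en-GB
```

As the next paragraph explains, this detection is fine for suggesting a locale, but not for forcing a redirect.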
However, Google recommends against automatic redirection. It can be inaccurate, can prevent users and search engines from reaching or indexing your whole site, and can be frustrating for users who are redirected to a page they don’t want. In fact, hreflang annotations, when correctly added to all your localized content and correctly cross-referenced, should eliminate or greatly reduce the need for any auto-redirection. You should avoid automatic redirection as much as possible.
Here are all the reasons (that I can think of) why you shouldn’t do automatic redirection:
- User agents like Googlebot may have a hard time reading all versions of your page if you keep redirecting them.
- IP detection can be inaccurate.
- Multiple countries can have multiple official languages.
- Multiple languages can be official in multiple countries.
- Server response time can be negatively affected by the overhead of all these redirects.
- Computers shared between spouses, children, etc. may be used by people with different language preferences.
- Expats and travelers may try to access a website that assumes they’re locals, making it frustrating for the users to switch languages.
- Internet cafes, hotel computer centers, and school computer labs may have diverse users.
- The user may prefer to browse in one language but transact in another. For example, many people are fluent in English and will search in English if they think they’ll get better results that way. But when it comes to the checkout process, especially when reading legalese, they’ll prefer to switch to their native language.
- A person may send a link to a friend who lives in a different place, and the friend can’t see the same thing the sender saw.
Instead, a much better user experience is to provide a small, unobtrusive banner that appears when you detect that a user may find another portion of your site more relevant. TripAdvisor and Amazon do a great job of this. Here’s an image from the Google Webmaster Central Blog that exemplifies how to do this well:
One exception to the never-use-auto-redirection rule is that, when a user selects a country and/or language preference on your site, you should store that preference in a cookie and redirect the user to their preferred locale whenever they visit your site in the future. Make sure that they can set a new preference at any time, which updates the stored cookie.
On that note, also always make sure to have a country and/or language selector on your website that’s located on every page and is easy for users to see and for search engine bots to crawl.
Myth #14: I need local servers to host my global content.
Many website owners believe they need local servers in order to rank well abroad. This is because Google and Bing clearly stated that local servers were an important international ranking factor in the past.
However, Google confirmed last year that local server signals are not as important as they once were. With the rise in popularity of CDNs, local servers are generally not necessary. You definitely need a local server for hosting sites in China, and it may be useful in some other markets like Japan. It’s always good to experiment. But as a general rule, what you need is a good CDN that will serve up content to your target markets quickly.
Myth #15: I can’t have multi-country targeted content that’s all in the same language, because then I’d incur a duplicate content penalty.
This myth is born from an underlying fear of duplicate content. Something like 30% of the web contains duplicate content (according to a recent RavenTools study); it’s simply a fact of life on the web. You have to do something spammy with that duplicate content, such as creating doorway pages or scraping content, in order to incur a penalty.
Geo-targeted, localized content is not spammy or manipulative. There are valid business reasons for wanting to have very similar content geared for different users around the world. Matt Cutts confirmed that you will not incur a penalty for having similar content across multiple ccTLDs.
The reality is, you CAN have multi-country targeted content in the same language. It’s just that you need to combine hreflang tags + localization in order to get it right. Here are some ways to avoid duplicate content problems:
- Use hreflang tags
- Localize your keyword targeting
- Add local information such as telephone numbers, currencies, and addresses in schema markup and Google My Business profiles
- Localize your HTML sitemaps
- Localize navigation and home page features to cater to each specific audience
- Localize images so they resonate with the audience (American football, for example, is not very popular outside the US; also be mindful of holidays around the world and of current events)
- Transcreate content (take an idea and tailor it for a specific locale) rather than translating it, which is more word-for-word than concept-for-concept
- Obtain links from local ccTLDs pointing to your localized content
As you can see, there are many common myths surrounding international SEO, but hopefully you’ve gained some clarity and feel better equipped to build a great global site strategy. I believe international SEO will continue to be of growing interest, as globalization is a continuing trend. Cross-border e-commerce is booming: Facebook and Google are looking at emerging markets in Africa, India, and Southeast Asia, where more and more people are going online and getting comfortable buying online.
International SEO is ripe for optimization, so you, as SEO experts, are in a very good position if you understand how to set your website up for international SEO success.
Posted by Tom.Capper
Sampling is a process used in statistics when it’s unfeasible or impractical to analyse all the data that exists. Instead, a small, randomly selected subset is used to keep things manageable. Many analytics platforms use some sort of sampling to keep report loading times in check, and there seem to be three schools of thought when it comes to sampling in analytics. There are those who are terrified of it, insisting on unsampled versions of any report. Then there are those who are relaxed about it, trusting the statistical logic. And then, lastly, there are those who are oblivious.
All three are misguided.
Sampling isn’t something to fear, but, in Google Analytics in particular, it can’t always be trusted. Because of that, it’s definitely worth your time to understand when it occurs, how it affects your work, and how it can be avoided.
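To see why the relaxed camp trusts the statistical logic, here's a toy simulation (the numbers are invented, nothing to do with any real GA property): if 30% of a million sessions are organic, even a 1% random sample recovers that total closely.

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

# Pretend population: 1,000,000 sessions, of which exactly 30% are "organic".
sessions = list(range(1_000_000))
organic = set(sessions[:300_000])

# Take a 1% random sample and scale the organic share back up.
sample = random.sample(sessions, 10_000)
in_sample = sum(s in organic for s in sample)
estimate = in_sample / len(sample) * len(sessions)

print(round(estimate))  # lands close to the true 300,000
```

The catch, as the rest of this post shows, is knowing when the effective sample behind a specific number has shrunk to almost nothing.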
When it happens
You can always tell when sampling is being used, because of this line at the top of every report:
If the percentage is less than 100%, then sampling is in progress. You’ll notice above that I’ve produced a report based on more than half a billion sessions without any sampling: sampling isn’t just about the sheer number of sessions involved in a report; it’s about the complexity of what you’re asking the platform to report on. Contrast the below (apologies for the small screenshots; I wanted to make sure the whole context was included, so have added captions explaining just what you’re looking at):
No segment applied, report based on 100% of sessions
Segment applied, report based on 0.17% of sessions
The two are identical apart from the use of a segment in the second case. Google Analytics can always provide unsampled data for top-line totals like that first case, but segments in particular are very prone to prompting sampling.
The exact same level of sampling can also be induced through use of a secondary dimension:
Secondary dimension applied, report based on 0.17% of sessions
A few other specialised reports are also prone to this level of sampling, most notably:
- The Ecommerce Overview
- “Flow Reports”
Report based on 0.17% of sessions
Report based on <0.1% of sessions
To summarise so far, sampling can happen when we use:
- A segment
- More than one dimension
- Certain detailed reports (including Ecommerce Overview and AdWords Campaigns)
- “Flow” reports
The accuracy of sampling
Sampling, for the most part, is actually pretty reliable. Take the below two numbers for organic traffic over the same period, one taken from a tiny 0.17% sample, and one taken without sampling:
Report based on 0.17% of sessions, reports 303,384,785 sessions via organic
Report based on 100% of sessions, reports 296,387,352 sessions via organic
The difference is just 2.4%, from a sample of 0.17% of actual sessions. Interestingly, when I repeated this comparison over a shorter period (last quarter), the size of the sample went up to 71.3%, but the margin of error was fairly similar at 2.3%.
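As a rough sanity check on why such tiny samples still work, here's a back-of-the-envelope margin-of-error calculation. It assumes GA draws a simple random sample of sessions, which is a simplification, so real-world error (like the 2.4% above) can be larger than this naive bound:

```python
import math

def relative_error(total_sessions, sample_fraction, z=1.96):
    """Naive 95% relative margin of error for a share estimated
    from a simple random sample of sessions."""
    n = total_sessions * sample_fraction  # sessions actually sampled
    return z / math.sqrt(n)

# A 0.17% sample of ~530M sessions still contains roughly 900k sessions,
# so the naive bound comes out well under 1%:
print(f"{relative_error(530_000_000, 0.0017):.3%}")
```

The point is that error scales with the absolute number of sampled sessions, not the sampling percentage, which is why a 0.17% sample of a huge property can still be usable.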
It’s worth noting, of course, that the deeper you dig into your data, the smaller the effective sample becomes. If you’re looking at a 1% sample and a landing page shows 100 sessions in a report, that figure is extrapolated from a single sampled visit, because 1 is 1% of 100. For example, take the below:
Report based on 45 sessions
Eight percent of a whole year’s traffic to Distilled is a lot, but 8% of organic traffic to my profile page is not, so we end up viewing a report (above) based on 45 visits. Whether or not this should concern you depends on the size of the changes you’re looking to detect and your threshold for acceptable levels of uncertainty. These topics will be familiar to those with experience in CRO, but I recommend this tool to get you started, and I’ve written about some of the key concepts here.
In extreme cases like the one above, though, your intuition should suffice. That click-through from my /about/ page to /resources/…tup-guide/ claims to feature in 12 sessions, in a report based on 8.11% of sessions. As 8.11% of 12 is roughly 1, we know that this figure is in fact based on a single session. Not something you’d want to base a strategy on.
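The arithmetic behind both of these intuition checks is the same: multiply the reported figure by the sampling fraction to estimate how many real sessions sit behind it. A quick sketch (the function name is mine):

```python
def sampled_basis(reported_sessions, sample_fraction):
    """Estimate how many actually-sampled sessions a reported,
    scaled-up figure rests on."""
    return round(reported_sessions * sample_fraction)

print(sampled_basis(100, 0.01))   # 1: a "100 sessions" row in a 1% sample
print(sampled_basis(12, 0.0811))  # 1: the click-through example above
```

Anything that comes out as a single-digit number here deserves heavy scepticism.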
If any of the above concerns you, then I’ve some solutions later in this post. Either way, there’s one more thing you should know about. Check out the below screenshot:
Report based on 100% of sessions, but “All Users” only accounts for 38.81% “of Total”
There’s no sampling here, but the number displayed for “All Users” in fact accounts for only 38.8% of sessions. This is down to the combination of the report having more than 1,000,000 rows (as indicated by the yellow “high-cardinality” warning at the top) and the use of a segment: the rows grouped into “(other)” are hidden when a segment is active. The numbers in the individual rows remain as accurate as they would be otherwise (apart from “(other)” being missing), but the segment totals at the top end up of limited use.
So, we’ve now gone over:
- Sampling is generally pretty accurate (+/- 2.5% in the examples above).
- When you’re looking at small numbers in reports with a high level of sampling, you can work out how many sampled sessions they’re based on.
- For example, 1% sampling showing 100 sessions means 1 session was the basis of the number in the report.
- You should keep an eye out for that yellow high-cardinality warning when also using segments.
What you can do about it
Often it’s possible to recreate the key data you want in alternative ways that do not trigger sampling. Mainly this means avoiding segments and secondary dimensions. For example, if we wanted to view the session counts for the top organic landing pages, we might ordinarily use the Landing Pages report and apply a segment:
Landing Pages report with Organic Traffic segment, based on 71.27% of sessions
In the above report, I’ve simply applied a segment to the landing pages report, resulting in sampling. However, I can get the same data unsampled. In the below case, I’ve instead gone to the “Channels” report and clicked on “Organic Search” in the report:
Channels > Organic Search report, with primary dimension “Landing Page”, based on 100% of sessions
This takes me to a report where I’m only looking at organic search sessions, and I can pick a primary dimension of my choice, in this case Landing Page. It’s worth noting, however, that this trick does not work reliably: when I replicated the same method starting from the “Source / Medium” report, I still ended up with sampling.
A similar trick applies to custom segments: if I wanted to create a segment to show me only visits to certain landing pages, I could instead write a regex advanced filter to replicate the functionality with less chance of sampling:
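As a toy illustration (the paths here are hypothetical), the same kind of expression you'd paste into the advanced filter box can be sanity-checked in Python before use:

```python
import re

# Hypothetical filter: only landing pages under /about/ or /resources/
landing_page_filter = re.compile(r"^/(about|resources)/")

pages = ["/about/", "/resources/guide/", "/blog/post/", "/about/team/"]
print([p for p in pages if landing_page_filter.match(p)])
# ['/about/', '/resources/guide/', '/about/team/']
```

One thing worth remembering is that GA's filters match the pattern anywhere in the value unless you anchor it with ^ and $, so anchoring as above keeps the filter from catching pages that merely contain those path segments.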
Lastly, there are a few more extreme solutions. Firstly, you can create duplicate views, then apply view-level filters, to replicate segment functionality (permanently for that view):
Secondly, you can use the API and Google Sheets to break up a report into smaller date ranges, then aggregate them. My colleague Tian Wang wrote about that tool here.
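The heart of that approach is simply splitting the date range before querying. A minimal sketch of the chunking step (the chunk length you'd pick depends on your traffic volume, and the actual API calls are omitted here):

```python
from datetime import date, timedelta

def split_date_range(start, end, days_per_chunk=7):
    """Split [start, end] into consecutive sub-ranges, so each can be
    requested separately and stay small enough to come back unsampled."""
    chunks = []
    cursor = start
    while cursor <= end:
        chunk_end = min(cursor + timedelta(days=days_per_chunk - 1), end)
        chunks.append((cursor, chunk_end))
        cursor = chunk_end + timedelta(days=1)
    return chunks

for lo, hi in split_date_range(date(2016, 1, 1), date(2016, 1, 31)):
    print(lo, hi)  # five week-ish chunks covering January
```

You'd then issue one API request per chunk and sum the results in your sheet or script.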
Lastly, there’s GA Premium, which, for a not inconsiderable cost, gets you this button:
To summarise, here’s how you can avoid sampling:
- You can construct reports differently to avoid segments or secondary dimensions and thus reduce the chance of sampling being triggered.
- You can create duplicate views to show you subsets of your data that you’d otherwise have to view sampled.
- You can use the GA API to request large numbers of smaller reports then aggregate them in Google Sheets.
- For larger businesses, there’s always the option of GA Premium to receive unsampled reports.
I hope you’ve found this post useful. I’d love to read your thoughts and suggestions in the comments below.