Why Content Is More King than Ever – Here’s Why #218


You’ve heard that content is king, but today, content is more important than ever. Here’s why.

Content is king. It’s still king and it hasn’t really changed. And today, I’m going to show you three case studies that will show you that content is more king than it’s ever been.

Note: Our future videos will be published on the Perficient Digital channel, so please subscribe to the Perficient Digital channel.

Don’t miss a single episode of Here’s Why. Click the subscribe button below to be notified via email each time a new video is published.

Subscribe to Here’s Why

Resources

Transcript

Content is king. It’s still king and it hasn’t really changed. And today, I’m going to show you three case studies that will show you that content is more king than it’s ever been.

I’m going to start, though, by talking a little bit about Google’s algorithm updates over the past 14 to 16 months. The chart I’m showing lists all the major updates that Google confirmed as “core algorithm updates.”

Chart shows Google's major algorithm updates from March 9, 2018 to March 12, 2019

It turns out that these updates all had several things in common. There was a big focus on better understanding user intent. Google was looking to lower the rankings of poorer-quality content and raise the rankings of higher-quality content. But another element that really emerged, I felt, was a much bigger emphasis on the depth and breadth of your content. So, with that in mind, I want to jump into the case studies and show you some data.

Here’s the first case study. This is in the addiction marketplace. The first chart shows the publishing volume of one particular vendor in that marketplace.

Chart shows publishing volume of an addiction treatment site from January 2013 to January 2019

You can see that there are wild fluctuations, but at times we’re talking about hundreds of actual new pieces of content being published every month, some months as high as 700. So, that’s the first data point.

Second data point: let’s look at the rate at which this site was adding links, shown in this chart.

Chart shows added link volume added to an addiction treatment site from Jan. 2014 - Jan. 2019

The link volume begins to grow rapidly around the same time the content volume started growing.

And now for our third chart. This is the SEO visibility from Searchmetrics. You can see that it begins to accelerate rapidly in May of 2017. So, it’s very interesting to see the correlation between the rapid content growth, the rapid link growth, and how it drove massive changes in traffic to this particular site.

Chart shows SEO Visibility score of an addiction treatment site from May 2016 to August 2018

Now let’s look at case study two. This one’s in the career space. And again, I’m going to start with a chart on the publishing volume for this particular company.

Chart shows content publishing volume of a career site from January 2018 to March 2019

The volume was actually moderately heavy in 2017, running about 45ish pieces of content a month. That’s pretty significant—one and a half pieces a day on average. But in January of 2018, this scaled into many hundreds of pieces of content per month. So, now let’s look at the “rate of links added” chart for this particular company.

Here you see that the links did not really scale until around March and April of 2018, when there was a really sharp spike.

Chart shows link volume added to a career site, with a spike in links added in May of 2018

Now, what that sharp spike is actually showing us, it turns out, is a redirect of another domain to this particular domain, so a lot of links transferred nearly instantaneously, if you will.

Let’s look at the traffic chart for this particular company. The traffic actually scaled very rapidly after the links took off in May of 2018.

Chart shows SEO visibility of a career site from 2018-2019

What I like about this case study is that it shows us that publishing content at volume while your links aren’t really growing isn’t going to do much for you. Creating lots of great content is a key part of the picture, but if you don’t promote it effectively, you’re not going to get the right results.

Let’s look at case study number three. This one is a consumer retail sales site. Let’s start with the publishing volume chart.

Chart shows publishing volume of a retail site from August 2018 to April 2019

This site has been adding content at a heavy volume for a very sustained period of time—it’s consistently in the thousands per month.

Now let’s look at the rate of links added for this site. This doesn’t have as sharp a spike as the second example I showed, or even as dramatic growth as the first example.

Chart shows rates of added link volume of a retail site from 2013-2019

Yet you do see that links are being added steadily over time, built on top of a very strong base.

Now let’s look at the traffic for this one. This is actually the SEO visibility chart again from Searchmetrics.

Chart shows SEO Visibility score of a consumer retail site from August 2017 to April 2019

In this particular case, the SEO visibility started at a very high level, but you get continuous steady growth over time, as supported by the strength of their publishing program and the rates at which they’re adding links.

I have two more charts for you before we wrap up.

This chart is data from a company called serpIQ that shows the correlation between ranking in Google and length of content.

Chart from a study conducted by serpIQ shows the correlation between ranking in Google and length of content. Google may rank longer content in a higher position.

You’ll see from this chart there’s a clear bias for Google to rank longer-form content. Now, before we go off and say that every page should have tons of content on it, it’s very dependent on the context. There are plenty of pages where you don’t need a long-form article. I’m not saying every piece of content or every page on your site needs to have a mass of text on it. That’s not the point. But from the point of view of informational content, it’s very clear that longer form is better.

And then another chart. This one’s from HubSpot. This data shows that longer-form content actually earns more links.

Chart from HubSpot shows that longer-form content earns more links

Now you can see how I’m making the connection here and drawing all the pieces together.

One last chart. This one’s a bonus chart from a Perficient Digital study that we published on links as a ranking factor. In this chart, you can see that Google ranks content with more links higher based on a normalized link score that we created.

Chart shows data from Perficient Digital on links and ranking - Google ranks content with more links higher.

Look at the three pieces: longer-form content ranks higher, longer-form content gets more links, and sites with more links rank higher. These three things are all tied very, very closely together. The reason why content is king is that you’re not going to get the links if you don’t have the right content to earn them. So, content is indeed more king than ever.

Don’t miss a single episode of Here’s Why. Click the subscribe button below to be notified via email each time a new video is published.

Subscribe to Here’s Why

See all of our Here’s Why Videos | Subscribe to our YouTube Channel

Guide to Web Compliance and Web Accessibility


ADA compliance and web accessibility are more serious matters than you may realize. Consider this scenario: you or one of your clients suddenly receives a letter stating that the website you administer is not ADA compliant and you’re facing litigation. Facing litigation? Now what?

The best course of action is to proactively review your website for ADA compliance and ensure that it is accessible to people with disabilities before you get into trouble. The level of compliance necessary is outlined in the Web Content Accessibility Guidelines (WCAG) 2.0 (available here). The guidelines are quite detailed, but because they are comprehensive, following them will help you fully comply with the law and insulate your company from litigation.

A good place to start for website ADA compliance and accessibility is to work through the following checklist:

  • Check the current state of your website’s accessibility with tools like WAVE (wave.webaim.org) and the Google Lighthouse tool (available in the Chrome browser)
  • Ensure that all images have descriptive alt text
  • Provide closed captioning on any videos your site may have
  • Provide text transcripts of any video or audio-only files
  • Give users the ability to pause, stop, or hide any automated content such as email sign-up pop-ups
  • Use a simpler design: be sure the website isn’t overly complex, and provide options for adjusting the size and color of text and content
  • Be sure your website supports keyboard navigation (think navigation between elements with arrows and tab keys)
  • Provide support features so a person with a disability can contact the webmaster and receive a response
  • Be sure any forms on your website have instructions for their use and that each form element is labeled with clear and understandable text
    • Also, use the id and label HTML elements on form items (see the sketch after this list)
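
To illustrate that last point, here is a minimal accessible-form sketch; the field names, IDs, and form action are placeholders, not prescribed values:

```html
<!-- Each input gets an id, and a <label> is tied to it via the for
     attribute so assistive technology announces the field's purpose. -->
<form action="/subscribe" method="post">
  <label for="email">Email address (required)</label>
  <input type="email" id="email" name="email" required
         aria-describedby="email-help">
  <p id="email-help">We will only use this address to send the newsletter.</p>
  <button type="submit">Subscribe</button>
</form>
```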

Once the above checklist has been followed, it is advisable to have a legal professional review your website in light of the WCAG 2.0 guidelines.

 

Why You Must Know about the New Evergreen Googlebot – Here’s Why #217

Google made an announcement at Google I/O in early May of 2019 that Googlebot is now evergreen. What does it mean for the search community?

In this episode of the popular Here’s Why digital marketing video series, Eric Enge, together with Google’s Martin Splitt, explains what the new evergreen Googlebot means for search, including how it handles hash URLs, <div> tags, and infinite scroll.



Don’t miss a single episode of Here’s Why. Click the subscribe button below to be notified via email each time a new video is published.

Subscribe to Here’s Why

Resources

Transcript

Eric: Hey, everybody. My name is Eric Enge and today I’m excited to bring to you Martin Splitt, a Google Webmaster trends analyst based out of Zurich, I believe.

Martin: Yes.

Eric: Say hi, Martin.

Martin: Hello, everyone. Very nice to be here. Thank you very much, Eric, for the opportunity to be a guest here as well. And yes, I am, in fact, based in Zurich.

Eric: Awesome. Great. Today, we want to talk a little bit about what happened at Google I/O related to the announcement that Googlebot became evergreen, which means that it will stay on the latest version of Chrome on an ongoing basis (in this case, Chrome 74 right now). So, what are some of the things that that means, and what are some of the things that still won’t be supported as a result of this move?

Martin: What it means is that we now support many, many features; I think it’s 1,000 features or so that weren’t supported beforehand. Most notably, ES2015 (ES6) and onwards: we have now upgraded to a modern version of JavaScript. A lot of language features are now supported by default, and a bunch of new web APIs are supported, such as Intersection Observer or the web components v1 APIs, which are the stable ones.

That being said, there is a bunch of stuff that just doesn’t make sense for Googlebot and that we continue not to support. To give you examples, there is the service worker. We’re not supporting that because users clicking onto your page from the search results might never have been there beforehand, so it doesn’t make sense for us to run the service worker, which is basically caching data for later visits. We also do not support things that have permission requests, such as the webcam, the geolocation API, or push notifications. If those block your content, Googlebot will decline the requests, and if that means your content doesn’t show up, then Googlebot doesn’t see your content either.

Those are the most important ones. Also, Googlebot is still stateless: we’re still not supporting cookies, session storage, local storage, or IndexedDB across page loads. If you want to store data in any of these mechanisms, that is possible, but it will be cleared out before the next URL or the next page is loaded.

Eric: Got it. There are some other common things that I’ve seen people do that maybe you could comment on. I’ll give you three. One is having URLs with hash marks in them and rendering those as separate content. Another one is infinite scroll. And a third one is links implemented as <div> tags.

Martin: All of the examples you gave are things we have very good reasons not to support. With hash URLs, the issue is that you’re using a hack. The URL protocol was not designed to be used that way. The fragments, these bits with a hash in front of them, are supposed to point to a part of the page content, not to different content. So hash URLs will still not be supported. Links in things that are not links, like buttons or <div> tags or anything else, will also still not be supported, because we’re not clicking on things; that’s ridiculously expensive and also a very, very bad accessibility practice. You should definitely use proper links. What was the third one?
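
To illustrate Martin’s point about links, here is a minimal sketch contrasting markup that Googlebot can follow with markup it won’t treat as a link (the URLs are placeholders):

```html
<!-- Googlebot follows real anchor tags with a resolvable href: -->
<a href="/products/blue-widget">Blue widget</a>

<!-- These are NOT treated as links, because the crawler does not click
     on elements or execute arbitrary click handlers: -->
<div onclick="location.href='/products/blue-widget'">Blue widget</div>
<span class="link" data-url="/products/blue-widget">Blue widget</span>

<!-- A fragment-only URL is treated as part of the current page: -->
<a href="#/products/blue-widget">Blue widget</a>
```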

Eric: Infinite scroll.

Martin: Yes, infinite scroll is a different story. Googlebot still doesn’t scroll, but if you’re using techniques such as the Intersection Observer, which we point out in our documentation, then you should be fine; I highly recommend using it. You should still test it, and we need to update the testing tools at this point; we’re working on that sooner rather than later. But generally speaking, lazy loading and infinite scroll are working better than before.
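
As an illustration of that pattern, here is a minimal lazy-loading sketch using the Intersection Observer API; the class name and data attribute are placeholders:

```html
<img class="lazy" data-src="/images/product.jpg" alt="Product photo">

<script>
  // Load each image as it approaches the viewport, so content appears
  // without requiring the visitor (or Googlebot) to scroll.
  const observer = new IntersectionObserver((entries, obs) => {
    entries.forEach((entry) => {
      if (entry.isIntersecting) {
        const img = entry.target;
        img.src = img.dataset.src; // swap in the real source
        obs.unobserve(img);        // stop watching once loaded
      }
    });
  }, { rootMargin: '200px' });     // begin loading slightly early

  document.querySelectorAll('img.lazy').forEach((img) => observer.observe(img));
</script>
```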

Eric: One of the things that I believe is still true is that the actual rendering of JavaScript-based content is deferred from the crawl process. So, that also has some impact on sites. Can you talk about that?

Martin: Yes. Absolutely. As you know, we have been talking about this last year as well as this year. Again, we do have a render queue. It’s not always easy to figure out when rendering is the culprit and when crawling is the culprit, because you don’t necessarily see the difference that easily. We are working on removing this separation as well, but there’s nothing to announce at this point. If you have a site with high-frequency content changes (let’s say, a news site where stories may change every couple of minutes), then you are probably well off considering something like server-side rendering or dynamic rendering to get this content seen a little faster. If you are a site like an auction portal, you might want to do the same thing. Basically, if you have lots of pages (and I’m talking about millions) whose content changes continuously, then you probably want to consider an alternative to client-side rendering.

Eric: Right. One of the things that used to be recommended was this idea of dynamic rendering. If you have one of these issues, such as infinite scroll or real-time content or some of the other things we talked about, dynamic rendering allows an already pre-rendered version of the content, if you will, to be delivered to Googlebot. Is that something that you still recommend?

Martin: It’s not a recommendation, per se. If you can, make the investment in server-side rendering, server-side rendering with hydration, or pre-rendering. Pre-rendering fits when you have a website that only changes so often and you know when it changes. Let’s say you have a marketing site that you update every month: you know when the update happens, so you can run your JavaScript whenever you deploy something new and generate static HTML content from it. We recommend making these investments as a long-term strategy because they also speed up the experience for the user, whereas dynamic rendering only speeds things up for crawlers, not for users specifically. It’s more a workaround than a recommendation, but it can still get you out of hot water if you can’t make the investment in server-side rendering, pre-rendering, or server-side rendering with hydration yet, or if you are on the way there but need something for the interim.
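
For readers unfamiliar with the workaround Martin describes, here is a hypothetical sketch of dynamic rendering, assuming a Node.js server using Express; the bot list and the renderSnapshot() helper (standing in for a real pre-rendering service such as Rendertron or a headless browser) are illustrative placeholders:

```javascript
const express = require('express');
const app = express();

// Illustrative, not exhaustive: crawlers that receive pre-rendered HTML.
const BOT_PATTERN = /googlebot|bingbot|baiduspider/i;

// Placeholder for a real pre-rendering service; here it just returns
// static HTML so the sketch is self-contained.
async function renderSnapshot(url) {
  return `<html><body><h1>Pre-rendered snapshot of ${url}</h1></body></html>`;
}

app.get('*', async (req, res) => {
  if (BOT_PATTERN.test(req.headers['user-agent'] || '')) {
    // Crawlers get the fully rendered HTML snapshot.
    res.send(await renderSnapshot(req.originalUrl));
  } else {
    // Regular users get the client-side JavaScript application shell.
    res.sendFile(`${__dirname}/dist/index.html`);
  }
});

app.listen(3000);
```

As Martin notes, this only speeds things up for crawlers; server-side rendering or pre-rendering for all visitors is the better long-term investment.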

Eric: Awesome. Any final comments about JavaScript before we wrap up?

Martin: I would love to see more people experimenting and working with JavaScript rather than just downright disregarding it. JavaScript brings a lot of cool features and fantastic capabilities to the web. However, as it is with every other tool, if you use it the wrong way then you might hurt yourself.

Eric: Awesome. Thanks, Martin.

Martin: You’re welcome, Eric.

Don’t miss a single episode of Here’s Why. Click the subscribe button below to be notified via email each time a new video is published.

Subscribe to Here’s Why

See all of our Here’s Why Videos | Subscribe to our YouTube Channel

Why You Must be Prepared for Visual Search – Here’s Why #216

Image SEO and visual search have been around for a long time, but why are they becoming more important to marketers?

In this episode of the award-winning Here’s Why digital marketing video series, Jess explains the changes Google has made to its search result pages to show more visual content, and how those changes may impact rankings.



Don’t miss a single episode of Here’s Why. Click the subscribe button below to be notified via email each time a new video is published.

Subscribe to Here’s Why

Resources

Transcript

Eric: So, Jess, image SEO and visual search have been around for a long time. Why are they becoming more important now? What’s changed recently?

Jess: In a macro sense, the technology surrounding image hosting, image recognition, visual search, and that kind of thing has really improved. Image processing has become faster and you can get better quality images. And Google has noticed. In the “Next 20 Years of Google Search” post, Google signaled a switch from text to a more visual way of search. You can see this with their commitment to a much more visual mobile SERP (Search Engine Result Page). 

Eric: A lot of these changes have happened over the last year. What changes have you seen most recently? 

Jess: Some major changes have been with Google Lens, SERP experiments and changes, the Google Discover feed, and Google Collections. 

Eric: Tell us about Google Lens. 

Jess: Lens is Google’s built-in image recognition and search product. It’s accessible through the Google app and it lets you search for objects, image first. Say I want a version of a shirt—I can just take a picture of it on my phone and search for it online. 

Eric: And we’ve also seen it in Discover and Collections. Both are Google services. Discover shows a feed of topics related to the user’s interests, and Collections lets the user save search results to boards; it’s kind of like Pinterest in that way. Both display search results with large visuals, titles, and then short amounts of text. They’re extremely visual-first, especially compared with traditional SERPs. So how is this showing up in the SERPs?

Jess: We’ve seen massive fluctuations in visuals in the SERP results. Image thumbnails, increased importance of images on the page, all that kind of thing. But the million-dollar question is, “Does this impact rankings?” 

Eric: Probably. Maybe. Well, we don’t know directly, and we don’t know how much, especially compared with other ranking factors. But recently, I had a chance to talk with Bing’s Fabrice Canel, who confirmed that a page with a high-quality, relevant image on it could be seen as a higher-quality page as a result. And as for Google, we know they also care about the user’s experience. Having relevant, well-optimized images can create a much better experience than just a big block of text. We do know that speed is a ranking factor and is clearly very important to Google. Won’t images slow down your page? Maybe that would impact rankings.

Jess: You can use good compression and next-gen image formats like WebP and JPEG 2000. But you can also think about the speed of the information making its way to the user. In that way, images are speed. 

Eric: Can you explain? 

Jess: You can explain what the Mona Lisa is in 1,000 words, or you can just show what the Mona Lisa looks like. 

Eric: If images are important, how can publishers best implement images on their pages? 

Jess: The usual rules for image optimization still apply. Make sure your images are an appropriate size, that you use alt text correctly and accurately, and that your images are of good quality. Beyond that, for speed, you can try implementing lazy loading while still making sure Googlebot can see your images. Try next-gen image formats and use unique images. You can even run your images through Google’s image recognition API to see if it sees what you want it to see.
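
For instance, here is a minimal markup sketch combining several of those tips; the file names and dimensions are placeholders:

```html
<!-- Serve a next-gen format with a fallback, lazy-load the image
     natively, and describe it accurately in the alt text. -->
<picture>
  <source type="image/webp" srcset="/img/mona-lisa.webp">
  <img src="/img/mona-lisa.jpg"
       alt="The Mona Lisa, oil painting by Leonardo da Vinci"
       width="800" height="1192" loading="lazy">
</picture>
```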

Eric: Images can be useful in different ways for different niches. You have to think about how your images can be used for users to find you, and then how they can help your users once they have found you. E-commerce sites, for example, should make sure their products are discoverable using a reverse image search. Financial pages should use images and visual storytelling to help their users understand their text as well.

Jess: Yes, exactly. You can use images to stand out in the SERPs, help your users take advantage of visuals, and take advantage of search features like Collections and Google Discover.

Don’t miss a single episode of Here’s Why. Click the subscribe button below to be notified via email each time a new video is published.

Subscribe to Here’s Why

See all of our Here’s Why Videos | Subscribe to our YouTube Channel

Eric Enge’s interview with Fabrice Canel

Fabrice Canel is a Principal Program Manager at Bing, Microsoft, where he is responsible for web crawling and indexing. Today’s post is the transcript of an interview I conducted with Fabrice. Over the 60 minutes we spent together, we covered a lot of topics.

During our conversation, Fabrice shared how he and his team think about the value of APIs, crawling, selection, quality, relevancy, and visual search, and the important role the SEO continues to play.

Eric: What’s behind this idea of letting people submit 10,000 URLs a day to Bing? 

Fabrice: The thought process is that our customers expect to find the latest content published online, so we try to get content indexed seconds after it is published. Getting content indexed fast is particularly important for content like news. To achieve freshness, relying only on discovering new content by crawling existing web pages, sitemaps, and RSS feeds does not always work. Many sitemaps are updated only once a day, and RSS feeds may not provide full visibility into all the changes made on a website.

So instead of crawling and crawling again to see if content has changed, the Bing Webmaster API allows site owners to programmatically notify us of the latest URLs published on their sites. We see this need not only for large websites but also for small and medium websites, which then don’t have to wait for us to crawl them and don’t want too many visits from our crawler, Bingbot, on their sites.

Eric: It’s a bit like they’re pushing Sitemaps your way. And the code to do this is really very simple. Here is what that looks like: 

You can use any of the below protocols to easily integrate the Submit URL API into your system.  

A screenshot of protocols you can use to easily integrate the Submit URL API into your system.
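
As a concrete illustration of the JSON flavor (a hedged sketch: the endpoint shape follows Bing’s published Webmaster API documentation, and the API key and URLs here are placeholders), a batch submission looks roughly like this:

```javascript
// Submit a batch of freshly published URLs to Bing.
async function submitUrlsToBing(urls) {
  const response = await fetch(
    'https://ssl.bing.com/webmaster/api.svc/json/SubmitUrlBatch?apikey=YOUR_API_KEY',
    {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        siteUrl: 'https://www.example.com',
        urlList: urls,
      }),
    },
  );
  return response.status; // 200 indicates the batch was accepted
}

submitUrlsToBing([
  'https://www.example.com/new-article',
  'https://www.example.com/updated-page',
]).then((status) => console.log('Bing responded with HTTP', status));
```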

 

Fabrice: Yes, we encourage both: pushing the latest URLs to Bing, and having sitemaps to ensure we are aware of all relevant URLs on the site. Pushing is great, but the Internet is not 100% reliable (sites go down, and your publishing system or our system may have temporary issues), so sitemaps are the guarantee that we are aware of all relevant URLs on your site. In general, we aim to fetch sitemaps at least once a day; we could fetch them more often, but most sites don’t want us fetching as often as every second. Complementary to freshness, RSS feeds are still a good solution for small and medium sites, but some sites are really big, and a single RSS feed can’t handle much more than 2,500 URLs if it is to stay within 1 MB. All of these things are complementary ways to tell us about site changes.

Eric: This means you will get lots of pages pushed your way that you might not have gotten to during crawling, so it should not only enable you to get more real-time content but also let you see some sites more deeply.

Fabrice: Absolutely. Every day we discover more than 100 billion URLs that we have never seen before. What is even scarier, these are URLs we have already normalized: no session IDs, parameters, and so on. Even limited to content that really matters, it’s still 100 billion new URLs a day, and a large percentage of them are not worth indexing. Some simple examples include date archives within blogs, or pages that are largely lacking in unique content of value. The Bing mechanism for submitting URLs is, in many cases, more useful and trustworthy than what Bingbot can discover through links.

Eric: For sites that are very large, I heard you mention that you would allow them to form a more direct relationship and submit more than 10,000 URLs per day.

Fabrice: You can contact us, and we’ll review and discuss it based on the business criteria of the site. But please don’t send us useless URLs, such as duplicate content or duplicate URLs; we won’t send fetchers to fetch those.

Eric: How will this change SEO? Will crawling still be important? 

Fabrice: It’s still important to ensure that search engines can discover your content and links to that content. With URL submission you may have solved the problem of discovery, but understanding interlinking still matters for context. 

Related to that is selection. The true size of the Internet is infinite, so no search engine can index all of it, and SEOs should still build links to their content to help it get selected.

Some websites are really big. Instead of adding URLs to your site only to get a few of them indexed, it’s preferable to focus on ensuring that the head and body of your URLs are indexed. Develop an audience and develop authority for your site to increase your chances of having your URLs selected. URL submission helps with discovery, but SEOs still need to pay attention to the factors that impact selection, fetching, and content. Ultimately, your pages need to matter on the Internet.

Eric: So, even for the discovery part, there is still a role for the SEO to play, even though the API makes it easier to manage on your end.

Fabrice: Yes, for the discovery part there’s a role for the SEO to remove the noise and guide us to the latest content. LESS IS MORE. The basics of content structure still matter too.  For example, you: 

  • still need titles/headers/content
  • still need depth and breadth of content
  • still need readable pages
  • still need to be concerned about site architecture and internal linking 

Eric: On the AI side of things, one of the things I think we’re seeing is an increasing push toward proactively delivering what people want before they specifically request it: less about search, and more about knowing users’ preferences and needs and serving things up to them in real time, even before they think to do a search. Can you discuss that a little bit?

Fabrice: You might think of this as position “-1.” The idea is not only to provide results, but to provide content that may satisfy people’s needs: information related to you and your interests, within the Bing app or on the Bing home page. You can set your own interests via the Bing settings, and then you will see the latest content on those interests across various canvases. I am deeply interested in the latest news on quantum computing… what are your interests?

Instead of searching for the latest news every five minutes, it’s preferable to be notified about what’s happening in a more proactive way.

Eric: So Bing, or Cortana, becomes a destination in and of itself, and rather than searching, you’re getting proactive delivery of content, which changes the use case.

Fabrice: Yes. We prefer surfacing the content people are searching for based on their personal interests. To be the provider of that content, and to have a chance to be picked up by search engines, you have to create the right content and establish the skill and authority behind it. You must do the right things SEO-wise and amplify the authority of your site above other sites.

Eric: There’s always the issue of authority. You can make great content, but if no one is sharing or linking to it, it probably has little value.

Fabrice: Yes, these things still matter. How your content is perceived on the web is a signal that helps us establish the value of that content. 

Eric: Let’s switch the topic to visual search and discuss use cases for visual search. 

Fabrice: I use it a lot, and shopping is a beautiful example of visual search in action. For example, take a picture of your chair with your mobile device and upload the image to the Bing app, and bingo: you have chairs that match this model. The image is of a chair, it’s black, and the app will find similar items that match.

Screenshot of Bing Visual Search with the 3rd edition of The Art of SEO

Visual search involves everything related to shopping, day-to-day object recognition, people recognition, and extracting information that matches what your camera captures.

Eric: For example, I want to know what kind of tree that is … 

Fabrice: Trees, flowers, everything.

Eric: How much of this kind of visual search do you anticipate happening? I’d guess it’s currently small. 

Fabrice: Well, yes and no. We already use this technology in Bing for search and image search, to understand the images we encounter on the Internet. For images with no caption or alt text, if we are able to recognize the shapes in the image, the image may carry meaning beyond whatever keywords people put in the text, and we can extract information that advances the relevance of a web page.

Going beyond Bing and search, this capability is offered in Azure and integrated into all kinds of systems across the industry, offering enterprises the ability to recognize images, camera inputs, and more. This can also extend into movies.

Eric: You mentioned the role images can play in further establishing the relevance of a web page. Can visual elements play a role in assessing a page’s quality as well? 

Fabrice: Yes, for example you can have a page on the Internet with text content, and within it you may have an image that is offensive in different ways. The content of the text is totally okay, but the image is offensive for whatever reason. We must detect that and treat it appropriately. 

Eric: I’d imagine there are scenarios where the presence of an image is a positive quality identifier; people like content with images, after all.

Fabrice: Yes, images can make consuming the content of a page more enjoyable. I think in the end it’s all about the SEO: you need to have good text, good schema, and good images. Users will love to come back to your site if it’s not full of ads or long stretches of text with nothing to illustrate them. If you have a bad website with junky HTML, people may not come back. They may prefer another site with better content.

Eric: Integration of searching across office networks is one of the more intriguing things we’ve heard from Bing, including the integration with Microsoft Office documents. As a result, you can search Office files and other types of content on corporate networks. 

Fabrice: When you search with Bing and you are signed in to a Microsoft/Office 365 plan that enables Bing for business, Bing will also search your company’s data (people, documents, sites, and locations) as well as public web results, and surface everything in a unified search results experience alongside Internet links. People don’t have to search in two or three places to find things. Bing offers a one-click experience where you can search your intranet, your enterprise SharePoint sites, and the Internet all at once. You can have an internal memo come up in a search alongside other information that we find online. We offer you a global view. As an employee, this is tremendously helpful: you get more done because information is easier to find.

Need to find the latest vacation policy for your company? We can help you find it. Need to know where someone is sitting in your office? We can help you find that too. And everyday informational searches can seamlessly find documents both on the public web and inside your organization.

Eric: Back to the machine learning topic for a moment: are we at the point today where the algorithm is obscure enough that it’s not possible for a single human to describe the specifics of the ranking factors?

Fabrice: It can’t be done effectively in 15 minutes. We are guided by the decisions we make in terms of quality expectations and determining good results versus not-so-good results. Machine learning is far more complicated. When we have issues, we can break it down and find out what is happening for a given search. But it’s not made up of simple “if-then” coding structures; it’s far more complicated than that.

Eric: People get confused when they hear about AI and machine learning, and they think it will fundamentally change everything in search. But the reality is that search engines will still want quality content, and they will still need to determine its relevance and quality.

Machine learning may be better at this, but as publishers, our goal is still to create content that is very high quality and relevant, and to promote that content to give it high visibility. That really doesn’t change, whether the ranking algorithm uses AI and machine learning or is generated by humans.

Fabrice: That will never change. SEO is like accessibility, where you need common rules to make things accessible for people with disabilities. In the process of implementing SEO, you’re helping search engines understand your content, and you need to follow the basic rules. You can’t expect search engines to do magic and adapt to each and every complex case.

Eric: There’s an idea that people have that machine learning might bring in whole new ranking factors that have never been seen before. But it’s not really going to change things that much, is it?

Fabrice: Yes, a good article is still a good article. 

Eric: A couple of quick questions to finish. John Mueller of Google tweeted recently that they don’t use prev/next anymore. Does Bing use it? 

Fabrice: We are looking at it for links and discovery, and we use it for clustering, but it is a loose signal. One thing related to AI: at Bing we look at everything. This isn’t a simple “if-then” thing; everything on the page is a hint of some sort. Our code looks at each and every character on each and every page. The only things that aren’t hints are robots.txt and meta noindex, which are directives; everything else is a hint.

About Fabrice Canel 

Bing's Fabrice Canel

Fabrice is a 20-year search veteran at Bing, Microsoft. As a Principal Program Manager, he leads the team responsible for crawling, processing, and indexing at Bing, dealing with the hundreds of billions of new or updated web pages discovered every day. Fabrice joined the MSN Search Beta project in 2006, and since that day he has been driving the evolution of the Bing platform to ensure the Bing index is fresh and comprehensive; he is also responsible for the protocols and standards for Sitemaps.org and AMP on Bing. Prior to MSN Search, Fabrice was the Lead Program Manager for search across Microsoft websites, in a role covering every aspect of search, from search engine technology to the search user experience to content, in the very early days of SEO.

Why Pagination is Important – Here’s Why #215

Google’s John Mueller confirmed that Google has not made use of rel=prev/next tags for some time. But should we still implement pagination?

In this episode of the award-winning Here’s Why digital marketing video series, Eric Enge explains why pagination is still important and how you should implement it.



Don’t miss a single episode of Here’s Why. Click the subscribe button below to be notified via email each time a new video is published.

Subscribe to Here’s Why

Resources

Transcript

So recently, Google’s John Mueller tweeted that Google has not made use of rel=prev/next tags for some time. My assessment is that they dropped them because the quality of the tagging that web developers implemented was, on average, probably poor.

John Mueller’s tweet confirming that Google no longer uses rel=prev/next

This is actually a parallel to what happened with rel=author tags back in 2014, when Google discontinued support for those. Back at that time, we actually did a study on how well those were implemented by people at the time. We’ll share that in the show notes below.

This study showed that 71% of the sites with prominent readership made no attempt to implement authorship or implemented it incorrectly. Many of those who did implement it didn’t understand exactly how to do it and simply got it wrong.

That said, what should we do to support paginated page sequences now? If you have prev/next tags, you can still keep them on your pages if you want. Google won’t use them. Bing might use them; we don’t actually know for sure. But if you are going to keep them, make sure they are implemented correctly. You do have to take the time to learn the specs, follow them carefully, and get it right.

Putting aside the prev/next tags for a moment, let’s think about how you should implement pagination otherwise on your page. Our first preference is to implement that pagination in clean HTML tags that are visible in the source code for the pages on your site—something that is easy for the search engines to parse.

The second choice would be to implement it in a way that isn’t visible in the raw source code but can be seen in the DOM, or Document Object Model, once the page is rendered. Either way, your links should be anchor tags with a valid href attribute, not span or button elements with attached JavaScript click events.

Paginated pages should also carry a canonical tag pointing to themselves; that’s a good reinforcing signal. These are the things that you need.

The reason this is still important is that pagination still matters to users. If you’ve got 200 products in a particular category, you probably don’t want to show all 200 on one single page. Breaking that up into many pages is a very good way to make the content more parsable, readable, and usable. This is really why pagination is still important. But make sure you implement it the correct way, as I’ve outlined in today’s video.
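
Pulling those recommendations together, here is a minimal sketch of a paginated category page; the URLs are placeholders, and the rel=prev/next tags are optional since Google no longer uses them:

```html
<!-- Hypothetical page 2 of a paginated category -->
<head>
  <!-- Each paginated page points the canonical tag at itself, not at page 1 -->
  <link rel="canonical" href="https://www.example.com/widgets?page=2">
  <!-- Optional: Google ignores these, but they are harmless if implemented correctly -->
  <link rel="prev" href="https://www.example.com/widgets?page=1">
  <link rel="next" href="https://www.example.com/widgets?page=3">
</head>
<body>
  <!-- Plain anchor tags with real href values, visible in the HTML source -->
  <nav aria-label="Pagination">
    <a href="/widgets?page=1">Previous</a>
    <a href="/widgets?page=3">Next</a>
  </nav>
</body>
```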

Don’t miss a single episode of Here’s Why. Click the subscribe button below to be notified via email each time a new video is published.

Subscribe to Here’s Why

See all of our Here’s Why Videos | Subscribe to our YouTube Channel

Why We Live in the Age of Voice Assistance (And What That Means for Search) – Here’s Why #213

More and more people are comfortable interacting with devices using their voice. How does that change the world of marketing?

In this episode of the award-winning Here’s Why digital marketing video series, Mark Traphagen shares key insights from Google on how voice assistance is changing our world.



Don’t miss a single episode of Here’s Why with Mark & Eric. Click the subscribe button below to be notified via email each time a new video is published.

Subscribe to Here’s Why

Resources

Transcript

Eric: Mark, usually I’m the one talking about the rise of digital personal assistants and voice interactions with devices, but you had the opportunity to cover a keynote session on this topic at SMX West. Share with us what you learned.

Mark: We heard from Marco Lenoci, who’s the head of Google Product Partnership for the Google Assistant product, and he not only shared with us what Google Assistant can do now and what they’re working on for the future, but also the implications of the rise of voice assistance for search marketers.

Eric: I think one of the important things for people watching this show to be clear on is where we are with the volume of voice interactions with devices, which is what I call it rather than “voice search,” by the way, because it’s not all really search.

We’re not at the point where voice has taken over the world yet, and it’s important to understand that, but by 2020, it should be a significant percentage, which might be 5% or 10% of interactions with devices. That’s enough to matter to a lot of brands, and if you’re going to be ready for that, you have to get going on it now.

With that context, why don’t you go over some of the implications?

Mark: Okay, let me go over the things Marco shared with us. He gave us five key insights at the end of his talk, and that’s what I want to concentrate on.

I think one of the most important things is that we’re seeing that voice is about action. You said it before, it’s not all search, and that’s true.

In fact, Google data shows that there are 40 times more action-oriented interactions in voice than in search. So, people using voice with devices are about doing things, getting things done. It’s not about finding the coffee, which is what you would be looking for in search, but ordering the coffee and expecting it to be ready when you arrive at the coffee shop. So, start to think about the actions your customers want to take: less passive discovery, more action to completion.

People also expect more conversations with their devices. In fact, Google data shows 200 times more conversational interactions on voice assistants and voice-assisted devices than in search. So, this means we’re moving from keywords to something more dynamic. Keywords are still important, search is still important, but in this world… well, let me give you an example.

Doing a traditional search, you’d be searching for something like ‘weather’, and then your zip code, right? But now, we’d ask things to a voice-assisted device or a digital personal assistant like, “Do I need an umbrella today?”

We expect that device to understand, when we say, “Do I need an umbrella today?”, I’m asking a question about the weather. There’s also an expectation that the location is understood. Your device knows where you are, so the assistant should know where you are, and what time of day it is, and as I said, that ‘an umbrella’ implies, “Is it going to rain today?”

Marco told us that there are actually 5,000 ways users can ask for an alarm to be set on Google Assistant, just as an example.

Also, he told us that smart screens are changing everything, and by smart screens, we mean devices that interact by voice but still have a display of some type. Google says that nearly half of the people who are using voice also use touch input on a screen together with it.

So some things still need to be seen. We still live in a multi-modal world. That’s the way we interact as humans. That’s the way we expect these devices to interact.

The fourth insight is that daily routines matter. These devices are becoming more and more able to know things like the time of day, where I am, and what I’d usually be doing at that time of day. For example, this is the time I usually drive home, so do I want to hear my favorite podcast?

Developers need to be thinking in terms of day and time to be there when users need them most. The concept of micro-moments in marketing takes on a whole new context in this.

The fifth and final insight is that voice is universal. We already know how to do it. Keyboards and tapping are still not totally natural for humans. Voice is.

Eric: Yes, that’s really interesting, and some of the research that I dug up in my investigations into voice shows just how universal voice is. People don’t realize, for example, that a baby in a mother’s womb can recognize the mother’s voice as distinct from other voices. So, it’s actually something that’s innate.

Anyway, cool insights overall. What practical actions should we be taking as digital marketers?

Mark: Lenoci shared three takeaways.

The first is, show up. Be there. Be involved with this. Make sure your content, services, and apps are available on Google and across its various services, including developing things for Google Assistant, like we’ve been doing at Perficient Digital, and Amazon Alexa, and all these different things that we’re working with now.

The second is, speed up. Don’t just create experiences. Think about the micro-moments where you can assist. So, “I want to know, I want to play, I want to buy this, I want to go here,” being present at those moments. How can you make that easier and faster for your customers and prospects?

And the third takeaway is, wise up. Take advantage of the info coming out from Google and others who are involved in this marketplace about how to build for that world.

Eric: Thanks, Mark. And your suggestion to focus on helping people in the moment, right now, is a really important one. That’s how these technologies get adopted by people: when the technology makes things so much easier than the alternatives.

Don’t miss a single episode of Here’s Why with Mark & Eric. Click the subscribe button below to be notified via email each time a new video is published.

Subscribe to Here’s Why

See all of our Here’s Why Videos | Subscribe to our YouTube Channel

Why On-Page and Off-Page SEO Together Create Success – Here’s Why #212

What are the fundamental practices that create SEO success? 

 In this episode of the award-winning Here’s Why digital marketing video series, Eric Enge shares a case study that demonstrates the effectiveness of “blocking and tackling” SEO. 



Don’t miss a single episode of Here’s Why with Mark & Eric. Click the subscribe button below to be notified via email each time a new video is published.

Subscribe to Here’s Why

Resources

Transcript

Mark: Eric, have the most important fundamentals of effective SEO really changed much?  

Eric: You know, the basic hard work of technical SEO combined with a content and promotional strategy is really still fundamental to success. So, I’d say not really.  

Mark: How about an example of where those fundamentals paid off for a business?  

Eric: Sure, I’m happy to do that. The example I’m going to use is an online travel company that we worked with that was looking to differentiate itself from the big players. It was actually a new entrant in the market.

A few years back, they were trying to figure out how to carve out their own niche even though they were a late entrant. They did that with more authoritative local content and a really user-oriented experience around each marketplace.

One of the fun things they did is they didn’t try to cover the whole globe or even the whole US. They targeted specific regions of the globe and went very, very deep and created awesome experiences around those marketplaces. That included things like partnerships with local tour guides and getting better content from those people.  

What they really illustrated very well is that it’s better to be excellent at a few things than mediocre at many. So, rather than thinking that you have to cover the entire marketplace, that focus that they brought was really, really great for them.  

They also structured their content in a way where they started small and scaled over time. The local experts, as I mentioned, were driving the content creation and really putting out the kind of stuff you’d never get from a garden-variety travel writer.

And, of course, they did the basic SEO fundamentals really well. They had good site audits repeated regularly, and they continued to refine the site structure and taxonomy. So, the basics of their SEO underpinnings were really sound.

Because of the localized content, they were able to attract attention in local markets really well. That resulted in links from within the specific countries they were covering, because they were writing about things other people weren’t.

Mark: Great, but did the plan produce any measurable results?  

Eric: You know, it really did. They saw really steady growth in organic traffic, as you can see on the chart we’re showing right now.

Line chart shows increased traffic of a travel company site resulting from good fundamental SEO

I think the basic SEO fundamentals worked very, very well, but in today’s world some things are a little bit different. The level of commitment you need to user experience and user value as a primary focus is probably greater than we might have thought 10 years ago. But the underpinnings (sound site architecture, creating great content, and an effective outreach and promotion plan) are, to be honest, the same as they ever were.

Don’t miss a single episode of Here’s Why with Mark & Eric. Click the subscribe button below to be notified via email each time a new video is published.

Subscribe to Here’s Why

See all of our Here’s Why Videos | Subscribe to our YouTube Channel

Why You Must Publish Frequently (But Keep Quality High!) – Here’s Why #210

One of the age-old debates in SEO is whether or not it matters how much content you publish or how frequently. 

In this episode of the award-winning Here’s Why digital marketing video series, Eric Enge shows evidence that having more content can be an advantage, but you must never sacrifice quality to get there. 



Don’t miss a single episode of Here’s Why with Mark & Eric. Click the subscribe button below to be notified via email each time a new video is published.

Subscribe to Here’s Why

Resources

Transcript

Mark: Eric, here at Perficient Digital, we’ve developed advanced content marketing strategies for major brands that drive brand awareness and consumer interest, but we also use that content to gain big SEO wins for those businesses. Now, a question I hear a lot about that is, “Does it matter how frequently a company publishes content, at least for SEO purposes?”

Eric: Sure. It can make a difference, but it’s not the only factor.

Mark: What do you mean by that?

Eric: To answer that, let me tell you a tale of four sites, all in one single marketplace.

The chart that you’re looking at right now shows the number of content updates in a year for four companies in the same industry.

Chart shows publishing volumes among four websites

So, site one in this chart, even though the bar looks really, really tiny, is actually publishing three pieces of content a month, and site two is actually publishing 16 pieces of content a month, which most people would consider a lot. I certainly would. But, site three published almost 100 articles a month, while site four was publishing 500 articles per month.

Now, let’s look at the next chart.

Search visibility line chart from Searchmetrics shows traffic of four different websites over the course of two years

This is a Searchmetrics search visibility chart over the past two years, and the green line is the brand that published five times more than the others, the biggest-volume brand. It started out in last place. In fact, its site launched two years ago, and by August 2018 it had established itself as the dominant player in the market.

I believe that was solely on the back of the volume of content they were publishing, and their coverage of the marketplace with a great deal of depth and breadth.

Mark: That’s it then. That’s it, folks. The magic secret to SEO: outpublish your competitors. We’ll see you…

Eric: Not so fast. Let me tell you the rest of the story.

Line chart shows traffic of a site that published a large volume of content but still saw a traffic drop from the Google algorithm updates in September and October of 2018

When you look at this chart, in September of 2018, the site that was publishing 500 articles a month suddenly sees a big drop in its SEO visibility.

So, it looks like the September/October updates hit this site really hard. And like the rest of the updates Google put out in 2018, there seemed to be a continual focus on content quality and how well you met user intent.

Mark: So, they were cranking out a lot of content, but it wasn’t necessarily all that great?

Eric: Exactly right. I think what we see here is that they rode the wave up on the volume of content, but because the content wasn’t of good enough quality, they took the hit in the September/October updates as Google continued to adjust its algorithms.

So, I think it’s really important to understand that, hey, volume is great and content breadth and depth are great, but it had better be good stuff.

Mark: Got you. So, what lesson can we take away from all this?

Eric: I think you have to have a lot of content on your site and really think about covering your market area in breadth and depth, if your goal is to have a strong role in the SEO results for Google.

But, if you don’t have the right level of quality, it will bite you in the end. So, now you have to set the balance between, “How do I get that coverage in depth and breadth, and really get a volume of stuff going out there so I get that coverage, but keep the quality really, really high?”

Don’t miss a single episode of Here’s Why with Mark & Eric. Click the subscribe button below to be notified via email each time a new video is published.

Subscribe to Here’s Why

See all of our Here’s Why Videos | Subscribe to our YouTube Channel

Mobile vs Desktop Traffic in 2019

Latest update April 4, 2019 — This is the latest edition of our study on the state of the mobile web. This update demonstrates the growth of the mobile web last year (2018) versus the desktop. I’ll also compare the latest data to usage levels in 2016 and 2017. The stats in this and our prior studies were pulled from SimilarWeb and reflect U.S. traffic across the web.

Where is the Mobile vs. Desktop Story Heading?

  1. In 2018, 58% of site visits were from mobile devices.
  2. Mobile devices made up 42% of total time spent online.
  3. Mobile Bounce Rate came in at 50%.

The details are in the charts below.

For reference, here are our prior years’ studies:

Changes to Our Data Collection Methodology for 2018

During 2018, SimilarWeb made some shifts in their data sources. For that reason, the charts below show the 2018 data separated from the 2016 and 2017 data. The new sources in 2018 have slightly lower mobile usage, but this does not reflect an actual drop in mobile usage—just a change in the data sources used.

Nonetheless, SimilarWeb has one of the largest data samples on the web, and was picked by Rand Fishkin as the best tool for getting data on web traffic. For that reason, we will continue to use SimilarWeb as the data source for this study on an annual basis.

Aggregated Stats: Desktop vs. Mobile

The most common stat that people talk about is the percentage of their visits that comes from mobile devices. Here is a look at the percentage of visits sites get from mobile vs. desktop for 2016, 2017, and 2018:

The data continues to show that for most sites, the majority of their traffic comes from mobile devices. This is a critical fact of life for all business and media web sites.

It’s also interesting to consider total time on site. Here is what we see across the three years:

Bear in mind, that’s the percentage of total aggregated time across all visits for mobile, compared with that of desktop. The total time users spend on sites from desktop devices is still larger than the total time from mobile. This implies that time per visit must be longer on desktop, as we see here:

Next, let’s take a look at bounce rate. Here is what we saw for 2016, 2017, and 2018:

With the new data sources from SimilarWeb, the mobile bounce rate is back up a bit, but still higher than it was in 2016. As I said in last year’s study, I believe that mobile site experiences are improving and users are getting more comfortable with them. However, desktop still beats mobile on bounce rate, and that’s not likely to change. For one thing, the use cases for people on mobile devices often involve the need to look something up quickly while they are on the go.

Let’s now take a look at the total page views between desktop and mobile devices:

Because of the new data sources from SimilarWeb, we see a drop in the percentage of total page views from mobile devices vs desktop, but this number is still higher than it was in 2016.

To wrap this section up, let’s also take a look at page views per visitor:

The page views per visitor remain significantly higher on desktop than mobile. This is consistent with the differences in time on site and bounce rate data shown above.

Stats by Industry Category

As we did in the last two years’ studies, we also broke the data down by industry category, to determine which industries are the most mobile-centric. The variance between categories remains significant:

In 2016, the adult industry was the leader, with 73% of the visits coming from mobile devices. In spite of that, it was the biggest gainer this year, jumping up to 86% of all traffic coming from mobile. The other fascinating thing is that the finance category and arts & entertainment categories are the only industries that still see more traffic on desktop, by narrow 52% to 48% and 51% to 49% margins, respectively. By next year, these should also get most of their traffic from mobile.

Next up, let’s look at time on site by industry category:

Here we see that every industry has a longer time on site for desktop over mobile, except for books and literature. The latter is probably due to people reading on mobile devices such as tablets.

Let’s look at bounce rate next:

The desktop bounce rate is lower than the mobile bounce rate in every single industry, though the margin is quite small for these two categories:

  • Recreation and Hobbies
  • Books and Literature.

Last, but not least, let’s look at page views per visitor:

Page views per visitor remained higher in every industry for desktop than mobile.

Four Takeaway Recommendations

How can we use this data to inform our digital marketing strategy? Here are four of my top observations and ideas:

Mobile Experiences are Continuing to Improve: Mobile user interfaces are improving, and users are getting more accustomed to them. Being mobile friendly is important in all industries—it’s the largest source of traffic in nearly all of them.

This means designing your mobile site before you design the desktop site. Instead of coding your desktop site and then writing style sheets to shrink it into a smartphone form factor, design your mobile site first. Then you can figure out how to leverage the larger screen real estate available on a desktop platform as a second step.

Important note: I’m not saying this because desktop is dead; it’s not. It’s still very important, but it’s far easier to take a mobile UI to the desktop than take a desktop one to a smartphone.
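
As a minimal illustration of that mobile-first pattern (the class name is a placeholder), the base styles target the small screen, and a min-width media query layers on the desktop layout:

```css
/* Mobile-first: the base styles serve the small screen by default. */
.product-grid {
  display: grid;
  grid-template-columns: 1fr; /* a single column on phones */
  gap: 1rem;
}

/* Desktop is the enhancement, added only when the viewport allows. */
@media (min-width: 64rem) {
  .product-grid {
    grid-template-columns: repeat(3, 1fr); /* three columns on desktop */
  }
}
```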

Desktop Remains Very Important: Other industry data still suggests that more conversions continue to happen on desktop in most industries, so continuing to pay a lot of attention to your desktop site makes a great deal of sense. And, if you’re in an industry where 75% or more of your conversions come from desktop, you may even want to offer users on mobile devices the option to provide contact information, save shopping carts, or implement other functionality that allows them to defer the actual completion of a conversion to a later time (perhaps on a desktop).

The rationale is that users may not want to deal with complicated forms on a mobile device, and/or may not want to enter their credit card there. Following up with them later lets them come back on a desktop device and convert at a more convenient time.

If you’re open to this idea, I’d urge you to test it thoroughly first, to see which gets better results for you.

Compare Your Site’s Behavior to Industry Norms: If the average percentage of mobile visitors in your industry is 60%, and your site is at 35%, that may indicate a problem like a very slow mobile site. See how you compare to industry norms; if there is a large delta with your site, take the time to understand why.

Pay Attention to Site Speed: Consider implementing AMP. Here is our study on AMP, which thoroughly explains how effective AMP is in accelerating site speed, as well as our detailed guide to implementing AMP. AMP is not the only way to speed up your site, of course, but it’s an open source standardized way to do it, so it deserves consideration.

Wonder why page speed is so important? See our Page Speed Guide.