
When to Show Different Content to Engines and Visitors :

There are a few common causes for displaying content differently to different visitors, including search engines. Here are some of the most common ones:

Multivariate and A/B split testing :

Testing landing pages for conversions requires showing different content to different visitors to measure performance. In these cases, it is best to deliver the variations using JavaScript, cookies, or sessions, and to give the search engines a single, canonical version of the page that doesn’t change with every new spidering (though a changing page won’t necessarily hurt you). Google used to offer software called Google Website Optimizer for this purpose; it has been discontinued and replaced with Google Analytics Content Experiments (https://support.google.com/analytics/answer/2661700?topic=1745146). If you used Google Website Optimizer in the past, Google recommends removing the associated tags from your site pages.
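As a minimal sketch of this idea (the function and variant names here are illustrative, not part of any testing product), a server could bucket human visitors deterministically by a cookie ID while always serving spiders the one canonical version:

```python
import hashlib

# Illustrative (not exhaustive) user-agent tokens for major spiders.
SPIDER_TOKENS = ("googlebot", "bingbot", "slurp")

def choose_variant(user_agent: str, visitor_id: str) -> str:
    """Return the test variant for a request: spiders always get the
    canonical 'A' page; humans get a stable 50/50 A/B split."""
    if any(token in user_agent.lower() for token in SPIDER_TOKENS):
        return "A"  # one stable version for every crawl
    # Deterministic bucket keyed on a visitor cookie ID, so a returning
    # visitor keeps seeing the same variant across requests.
    bucket = int(hashlib.md5(visitor_id.encode()).hexdigest(), 16) % 2
    return "A" if bucket == 0 else "B"
```

Keying the split on a persistent cookie (rather than randomizing per request) is what keeps the experience consistent for returning visitors.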

Content requiring registration and First Click Free:

If you require registration (paid or free) to view specific content, it is best to keep the URL the same for both logged-in and non-logged-in users, and to show a snippet (one to two paragraphs is usually enough) to non-logged-in users and search engines. If you want to display the full content to search engines, you can instead set rules for content delivery, such as showing the first one to two pages of content to a new visitor without requiring registration, and requesting registration only after that grace period. This keeps your intent more honest, and you can use cookies or sessions to restrict human visitors while showing the full pieces to the engines.
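Both halves of that approach, the snippet for non-logged-in users and the session-based grace period, can be sketched as follows (function names and the two-page grace period are assumptions for illustration):

```python
def article_snippet(article_text: str, paragraphs: int = 2) -> str:
    """Snippet shown to non-logged-in users and, in the honest variant,
    to search engine spiders: just the first one or two paragraphs."""
    return "\n\n".join(article_text.split("\n\n")[:paragraphs])

def can_view_full(session: dict, logged_in: bool, grace_pages: int = 2) -> bool:
    """Grace-period rule: a new visitor may read the first `grace_pages`
    pages in full; after that, registration is required."""
    if logged_in:
        return True
    session["pages_viewed"] = session.get("pages_viewed", 0) + 1
    return session["pages_viewed"] <= grace_pages
```

The session dict stands in for whatever cookie- or server-side session store the site actually uses.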

In this scenario, you might also opt to participate in a specific program from Google called First Click Free, wherein websites can expose “premium” or login-restricted content to Google’s spiders, as long as users who click from the engine’s results are given the ability to view that first article for free. Many prominent web publishers have employed this tactic, including the popular site Experts-Exchange.com (http://www.experts-exchange.com/). (Note that Google has since retired First Click Free, replacing it with Flexible Sampling in 2017.)

To be specific, to implement First Click Free the publisher must grant Googlebot (and presumably the other search engine spiders) access to all the content they want indexed, even if users normally have to log in to see it. A user visiting the site will still need to log in, but the search engine spider will not, so the content can show up in the search engine results when applicable. If a user clicks on that search result, however, you must permit them to view the entire article (all pages of it, if it spans multiple pages). Once the user clicks through to another article on your site, you can again require a login. Publishers can also cap the number of free accesses a user gets through this technique at five articles per day.
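The gate described above can be sketched roughly as follows; the user-agent token, referrer check, and session keys are all assumptions for illustration, not Google-specified mechanics:

```python
SPIDER_TOKENS = ("googlebot",)   # assumed spider user-agent token
SEARCH_REFERRERS = ("google.",)  # assumed search-results referrer pattern

def may_view_full_article(user_agent: str, referrer: str,
                          session: dict, daily_limit: int = 5) -> bool:
    """First Click Free-style gate: the spider always sees the full
    article; a visitor clicking through from a results page gets up to
    `daily_limit` free articles per day; everyone else must log in."""
    if any(t in user_agent.lower() for t in SPIDER_TOKENS):
        return True  # spider gets everything it should index
    if any(r in referrer.lower() for r in SEARCH_REFERRERS):
        used = session.get("free_clicks_today", 0)
        if used < daily_limit:
            session["free_clicks_today"] = used + 1
            return True  # free first click, counted against the cap
    return session.get("logged_in", False)
```

A production version would also reset the per-day counter and verify spider identity more robustly (e.g., reverse DNS), but the decision flow is the same.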

Navigation unspider-able by search engines:

If your navigation is built in Flash, JavaScript, a Java application, or another format that search engines may not be able to parse, you should consider showing the engines a version with spiderable, crawlable content in HTML. Many sites do this simply with CSS layers, displaying one layer to human visitors and another to the engines (and to less capable browsers, such as mobile browsers). You can also employ the noscript tag for this purpose, although it is generally riskier, as many spammers have used noscript to hide content. Take care that the content in the search-visible layer is substantially the same as that in the human-visible layer.
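As a rough illustration of the CSS-layer approach (the class names and URLs here are hypothetical), the crawlable HTML navigation sits in the markup alongside the script-driven version, with stylesheet rules deciding which layer each audience sees:

```html
<!-- Plain-HTML navigation: crawlable by spiders and usable by
     less capable browsers; hidden from capable browsers via CSS. -->
<ul class="nav-html">
  <li><a href="/products/">Products</a></li>
  <li><a href="/about/">About</a></li>
</ul>

<!-- Script-driven navigation shown to capable browsers. Its links
     should substantially match the HTML list above. -->
<div class="nav-scripted" id="nav"></div>
```

The safety of this pattern rests entirely on the last comment: the two layers must carry substantially the same links and content.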

Duplicate content:

If a significant portion of a page’s content is duplicated, you might consider keeping spiders away from it by placing it in an iframe whose source URL is blocked via robots.txt. This ensures that you can show the engines the unique portion of your pages while protecting against duplicate content problems. We will discuss this in more detail in the next section.
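A sketch of this setup, assuming the repeated boilerplate lives under a hypothetical /boilerplate/ path:

```
# robots.txt — keep spiders out of the duplicated boilerplate
User-agent: *
Disallow: /boilerplate/
```

Each page then pulls the blocked content in via an iframe, so human visitors still see it while spiders index only the unique portion:

```html
<iframe src="/boilerplate/legal-disclaimer.html"
        width="100%" height="200"></iframe>
```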

Different content for different users:

At times you might target content to users from different geographies (such as product offerings that are more popular in their area), with different screen resolutions (to make the content fit their screens better), or who entered your site from different navigation points. In these instances, it is best to maintain a “default” version of the content, shown to users who don’t exhibit any of these traits, and to serve that same default version to search engines.
