
DISC Inc / Group items tagged Design


Rob Laporte

Geo-Targeting Redirects: Cloaking or Better User Experience? - Search Engine Watch (SEW)

  • If you have your site set to detect a visitor's location and show content based on that, I would recommend the following:
    - Serve a unique URL for distinct content. For instance, don't show English content to US visitors and French content to French visitors on the same mysite.com URL. Instead, redirect English visitors to mysite.com/en and French visitors to mysite.com/fr. That way search engines can index the French content at the mysite.com/fr URL and the English content at the mysite.com/en URL.
    - Provide links that enable visitors (and search engines) to access the other language/country content. For instance, if I'm in Zurich, you might redirect me to the Swiss page, but provide a link to the US version of the page.
    - Or simply present visitors with a home page that lets them choose the country. You can always store the selection in a cookie so visitors are redirected automatically after the first time. (A sketch of this redirect-plus-cookie approach follows the comments below.)
  • Google's policies aren't as inflexible as you're trying to portray. The same Google page you quote also says that intent ought to be a major consideration (just as when evaluating pages with hidden content). Also, why would Google's guidelines prevent you from using geotargeting without an immediate redirect? Just because you don't immediately redirect search users to a different page doesn't mean you have to ask for their zip code instead of using IP-based geotargeting.

    Lastly, I don't think using such redirects from SERPs improves the user experience at all. If I click on a search result, it's because that's the content I'm interested in. It's very annoying to click on a search result and get a page completely different from the SERP snippet. And what about someone who is on business in a different country? Search engines already provide different language localizations as well as language search options to favor localized pages for a particular region. So if someone goes to the French Google, they will see the French version of localized sites/pages. If they're seeing the US version in their SERPs, it's because you didn't SEO or localize your pages properly, or they deliberately used the US version of Google. Don't second-guess your users. Instead, focus on making sure that users know about your localized pages and can access them easily (by choice, not through force).

    Bill Hunt: Frank, you're spot on as usual. We are still chasing this issue, and as I wrote on SEW last year in my article on language-detection issues (http://searchenginewatch.com/3634625), the problem is often the implementation rather than the redirect itself.

    Yes, it is exactly cloaking (maybe gray hat) when you have a single gateway such as "example.com" and a person coming from Germany sees the German-language site while someone whose IP is in New York sees English. Engines typically crawl from a central location like Mountain View or Zurich, so they would only ever see the English version, since they provide no signals for any other location. Where you really get into a tricky area is if you let any search-engine user agent access whichever version it asks for while humans are restricted, a sort of reverse cloaking. If Mr. GoogleBot wants the French home page, let him have it rather than sending him to the US home page. (The sketch below also exempts known crawlers from the redirect.)

    With the growth of CDNs (content delivery networks), I am seeing more and more of these issues crop up around load balancing as well as other forms of geographic targeting. I have a long list of global, multinational, and enterprise-related challenges that are complicated by many of Google's outdated ways of handling kindergarten-level spam tactics. Sounds like a SES New York session...
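A minimal sketch of the approach discussed above: redirect based on an IP country lookup, let a stored cookie preference win over geolocation, and never redirect known crawlers. Flask and the ip_to_country() helper are illustrative assumptions, not anything prescribed by the article or the comments:

```python
# Sketch: language redirects driven by IP geolocation, with a cookie
# override and an exemption for search-engine crawlers.
# Flask is illustrative; ip_to_country() is a hypothetical stand-in
# for a real GeoIP lookup.

from flask import Flask, request, redirect

app = Flask(__name__)

BOT_TOKENS = ("googlebot", "bingbot", "yandex")  # illustrative, not exhaustive
COUNTRY_TO_LANG = {"US": "en", "GB": "en", "FR": "fr", "CH": "fr"}

def ip_to_country(ip):
    """Hypothetical helper; wire up a real GeoIP database here."""
    return "US"

@app.route("/")
def home():
    ua = (request.headers.get("User-Agent") or "").lower()
    if any(token in ua for token in BOT_TOKENS):
        # Never geo-redirect crawlers: serve the chooser page with plain
        # links so every language version stays reachable and indexable.
        return 'Choose a region: <a href="/en">US</a> | <a href="/fr">France</a>'

    # A language the visitor already chose (stored in a cookie) beats geolocation.
    lang = request.cookies.get("lang")
    if lang is None:
        lang = COUNTRY_TO_LANG.get(ip_to_country(request.remote_addr), "en")

    resp = redirect(f"/{lang}", code=302)  # temporary: the preference can change
    resp.set_cookie("lang", lang)
    return resp
```

Because /en and /fr are plain, crawlable URLs, engines can index each language at its own address while human visitors still land on the right version by default.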
Rob Laporte

The need for speed: Google dedicates engineering team to accelerate development of Word...

  • "presentation"
Rob Laporte

Removing URLs From The Index In Bulk

  • Combining The URL Removal Tool With The Basic Tools: Google's URL Removal Tool only removes the content from their index for 90 days, so it is not permanent. It is important, therefore, that you take additional steps to make sure that content does not come back into the index. You need to combine its use with one of the Basic Tools discussed above. Here is how I look at the choices (tactic, then when to use it):
    - URL Removal Tool, plus deleting the pages and all links to them, plus 301s to best-fit pages: always the best choice if there is no need for the pages to exist and you are able to eliminate them.
    - URL Removal Tool plus rel=canonical tagging: the best remaining choice if preserving PageRank is a priority; however, you can only use this when your pages are a true duplicate or a strict subset of the pages the tags point to.
    - URL Removal Tool plus a noindex tag: use when preserving PageRank is a priority but the rel=canonical tag is not appropriate.
    - URL Removal Tool plus Disallow in robots.txt: use when reducing the number of pages the search engines have to crawl is the priority. (A sketch of the server-side pieces follows.)
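As a companion to the list above, here is a minimal sketch of what three of these follow-up measures can look like on the server side. Flask and the specific routes/URLs are illustrative assumptions; the article itself prescribes no particular stack:

```python
# Sketch of measures to pair with the URL Removal Tool so removed
# content stays out of the index. Routes and URLs are hypothetical.

from flask import Flask, redirect, make_response

app = Flask(__name__)

@app.route("/old-page")
def old_page():
    # Deleted page: 301 to the best-fit replacement so link equity follows.
    return redirect("/new-page", code=301)

@app.route("/printable/report")
def printable_report():
    # Duplicate/subset page kept for users: point engines at the canonical URL.
    return ('<html><head>'
            '<link rel="canonical" href="https://example.com/report">'
            '</head><body>Printable version</body></html>')

@app.route("/internal-search-results")
def internal_search():
    # Page that should stay out of the index: noindex via an HTTP header,
    # equivalent to <meta name="robots" content="noindex"> in the page head.
    resp = make_response("Search results page")
    resp.headers["X-Robots-Tag"] = "noindex"
    return resp
```

The fourth option is a one-line robots.txt directive (e.g. Disallow: /some-path/), which blocks crawling of the path rather than marking its pages noindex.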