Home/ SEO FranceProNet/ Group items tagged indexation

Aurelien FpN

Does Google crawl and index dynamic content? - 0 views

  •  
    I think I've already shared a similar resource, but it never hurts to refresh one's memory (I'd forgotten it myself; Manon sent me this link): several tests proving that Google does index content loaded via JS at any point.
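The tests linked above come down to pages whose main text only exists after a script runs. A minimal sketch of such a page, with all names and text invented for illustration:

```html
<!-- Page whose main content is injected by JavaScript after load.
     The tests cited indicate Googlebot renders and indexes this text. -->
<div id="content"></div>
<script>
  // The paragraph is absent from the initial HTML source;
  // it is added to the DOM only at runtime.
  document.getElementById('content').textContent =
      'This sentence exists only after JavaScript has run.';
</script>
```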
Aurelien FpN

Official Google Webmaster Central Blog: Faceted navigation best (and 5 of the worst) pr... - 0 views

  • example.com/product?item=swedish-fish&category=gummy-candy&sid=789 (URL parameters allow more flexibility for search engines to determine how to crawl efficiently)
    • Aurelien FpN
       
      He prefers non-rewritten URLs (plain parameters rather than rewritten paths)
  • Rather than allow user-generated values to create crawlable URLs  -- which leads to infinite possibilities with very little value to searchers -- perhaps publish category pages for the most popular values, then include additional information so the page provides more value than an ordinary search results page. Alternatively, consider placing user-generated values in a separate directory and then robots.txt disallow crawling of that directory.
  • Required parameters may include item-id, category-id, page, etc.
  • ...13 more annotations...
  • I may find the URL parameter “taste” to be valuable to searchers for queries like [sour gummy candies]
    • Aurelien FpN
       
      Confirms that you should choose parameters that correspond to actual search queries. The article's exclusion of price may be confusing in our case, since price is an important notion for us, whereas the article advises against using it.
  • Option 1: rel="nofollow" internal links
  • Option 2: Robots.txt disallow
  • Option 3: Separate hosts
  • Prevent clickable links when no products exist for the category/filter.
  • Improve indexing of paginated content
  • Adding rel=”canonical” from individual component pages in the series to the category’s “view-all” page
  • Using pagination markup with rel=”next” and rel=”prev” to consolidate indexing properties
  • Be sure that if using JavaScript to dynamically sort/filter/hide content without updating the URL
  • Include only canonical URLs in Sitemaps
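The two pagination consolidation techniques listed above can be sketched as head markup; URLs below are placeholders:

```html
<!-- Option A: point each component page of the series
     at the category's "view-all" page -->
<link rel="canonical" href="https://example.com/gummy-candy/view-all">

<!-- Option B: pagination markup, shown here as it would
     appear on page 2 of the series -->
<link rel="prev" href="https://example.com/gummy-candy?page=1">
<link rel="next" href="https://example.com/gummy-candy?page=3">
```

For Option 2 in the list (keeping user-generated filter combinations out of the crawl), the robots.txt equivalent is a disallow rule on a dedicated directory, e.g. `Disallow: /filtering/` under `User-agent: *` (the path is hypothetical).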
Aurelien FpN

Matt Cutts: Google Penalties Get More Severe for Repeat Offenders - Search Engine Watch... - 1 views

  • "Google tends to look at buying and selling links that pass PageRank as a violation of our guidelines and if we see that happening multiple times, repeated times, then the actions that we take get more and more severe," Cutts said. "So we're more willing to take stronger action whenever we see repeated violations."
  • So if your website has received a spam warning from Google, you need to be extra certain that you keep your SEO techniques very clean, and not skate too close to that gray line between white hat and black hat SEO. And this also means checking carefully for other things, like your backlink profile, so you can disavow immediately.
  • So that the sort of thing where company is willing to say, "You know what, we might've had good links for a number of years and then we had really bad advice and someone did everything wrong for a few months, maybe up to year, so just to be safe let's just disavow everything in that time frame." That's a pretty radical action and that's the sort of thing where we heard back on a reconsideration request that someone had taken that kind of a strong action, then we could look and say, "OK, this is something people are taking seriously."
  • ...2 more annotations...
  • Others take the more subtle approach and start disavowing the ones that are definitely dirty and hurting the website, and just add more to the disavow list and until the penalty is lifted. But considering what Cutts is saying now, it seems that if you want immediate help with getting a site back into the index that has been penalized for bad quality backlinks, the fastest resolution seems to be wiping the slate clean by disavowing everything, and starting over again.
  • "A good reconsideration request is often using the domain query (domain:), and taking out large amounts of domains that have bad links. I wouldn't necessarily recommend going the route of everything for last year or everything for the last year and a half, but that's the sort of large-scale action, if taken, can have an impact whenever we're assessing a domain within a reconsideration request."
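The `domain:` query Cutts mentions refers to the disavow file format uploaded through Search Console: a plain text file where a `domain:` line drops every link from a host at once, while a bare URL disavows a single page. A sketch, with placeholder domains:

```
# disavow.txt - one entry per line; lines starting with # are comments
# (domains below are invented for illustration)
domain:spammy-directory.example
domain:paid-links.example
http://one-off-bad-page.example/links.html
```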
jbfrancepronet

Les Extraits Structurés ou Google Structured Snippets - 0 views

  • In addition to the classic page description, they display other data found on the page: a kind of precise facts about the page.
  • this algorithm does not exploit structured data in the schema.org format
  • this algorithm looks at tabular data (in other words, tables, the <table> tag)
  •  
    Google will pick from the tables present on the site to add information to the SERPs.
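Since the algorithm reads `<table>` markup rather than schema.org, the facts it can surface come from plain, clearly labeled HTML tables. A minimal example (product attributes and values invented):

```html
<!-- A simple data table of the kind structured snippets can mine:
     a clear header row and one fact per cell (values are made up) -->
<table>
  <tr><th>Attribute</th><th>Value</th></tr>
  <tr><td>Weight</td><td>1.2 kg</td></tr>
  <tr><td>Battery life</td><td>10 h</td></tr>
</table>
```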
Aurelien FpN

Google n'indexe plus les textes invisibles placés dans des onglets au sein d'... - 0 views

  • if it's content you consider important and relevant, make sure it is clearly visible on the page
  • If that user lands on your page to read this content and doesn't see it, they will be frustrated and conclude that the content doesn't match their expectations
  • if such content is hidden, it may be that it isn't so important to the user after all, so we shouldn't give it as much weight either. In that case, we won't index it
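The pattern Google is discounting here is content hidden behind a tab with CSS on page load. A minimal sketch of the markup in question (class names and text hypothetical):

```html
<!-- Tabbed layout: the second panel is hidden with display:none
     until the user clicks its tab. Per the article, Google now
     treats such hidden text as unimportant and may not index it. -->
<div class="tab-panel">Visible product description.</div>
<div class="tab-panel" style="display:none">
  Detailed specifications. If this content matters,
  make it visible on the page instead.
</div>
```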