DISC Inc: Group items matching "canonical" in title, tags, annotations or url
Rob Laporte

The March 12, 2019 Google Core Algorithm Update - A Softer Side Of Medic, Trust And The Link Graph, Quality Matters, And "The Kitchen Sink" - 1 views

  • when checking queries that dropped and their corresponding landing pages, they line up with the problems I have been surfacing. For example, thin content, empty pages, pages that had render issues, so on and so forth.
  • Author expertise is extremely important, especially for YMYL content.
  • Also, and this is important, the site consumes a lot of syndicated content. I’ve mentioned problems with doing this on a large scale before, and it seems this could be hurting the site now. Many articles are not original, yet they are published on this site with self-referencing canonical tags (basically telling Google that this copy is the canonical version). I see close to 2K articles on the site that were republished from other sources. (A minimal check for self-referencing canonicals is sketched after this list.)
  • And last, but not least, the site still hadn’t moved to https. Now, https is a lightweight ranking factor, but it can be the tiebreaker when two pages are competing for a spot in the SERPs. Also, http sites can turn off users, especially with the way Chrome (and other browsers) flag them; for example, there’s a “not secure” label in the browser. And Google can pick up on user happiness over time in a number of ways, which can indirectly impact a site’s rankings. Maybe users leave quickly, maybe they aren’t as apt to link to the site, share it on social media, etc. So not moving to https can be hurting the site on multiple levels (directly and indirectly).
  • This also leads me to believe that if Google is using reputation, they are doing so in aggregate and not using third-party scores or ratings.
  • What Site Owners Can Do – The “Kitchen Sink” Approach To Remediation: My recommendations aren’t new. I’ve been saying this for a very long time. Don’t try to isolate one or two problems… Google is evaluating many factors when it comes to these broad core ranking updates. My advice is to surface all potential problems with your site and address them all. Don’t tackle just 20% of your problems. Tackle close to 100% of your problems. Google is on record explaining they want to see significant improvement in quality over the long term in order for sites to see improvement.
  • Summary – The March 12 Update Was Huge. The Next Is Probably A Few Months Away: Google only rolled out three broad core ranking updates in 2018. Now we have our first of 2019, and it impacted many sites across the web.
  • Don’t just cherry-pick changes to implement. Instead, surface all potential problems across content, UX, advertising, technical SEO, reputation, and more, and address them as thoroughly as you can. That’s how you can see ranking changes down the line. Good luck.
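A minimal sketch of the syndication check mentioned above, assuming the requests and beautifulsoup4 packages are installed; the URL is a hypothetical example:

```python
import requests
from bs4 import BeautifulSoup

def canonical_is_self_referencing(url: str) -> bool:
    """Return True if the page at `url` declares itself as canonical."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    link = soup.find("link", rel="canonical")
    if link is None or not link.get("href"):
        return False  # no canonical tag declared at all
    # Normalize trailing slashes before comparing
    return link["href"].rstrip("/") == url.rstrip("/")

# Republished articles that return True here are claiming to be the
# canonical version, which is the pattern flagged above.
print(canonical_is_self_referencing("https://example.com/syndicated-article"))
```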
Rob Laporte

Diagnosing Search Issues From the Query Box - ClickZ - 0 views

  • This query: site:yourdomain.com -inurl:www will show you the subset of indexed pages on your site that don't have "www" in their URLs. If you have multiple subdomains on your site, this becomes slightly trickier to diagnose. For example, if you have subdomains called "www," "blog," and "clients," you'll need to add those subdomains to the preceding query to find canonical issues: site:yourdomain.com -inurl:www -inurl:blog -inurl:clients.
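A small helper that assembles the exclusion query described above; the subdomain names are the article's own examples:

```python
def canonical_audit_query(domain: str, subdomains: list[str]) -> str:
    """Build a site: query that excludes the listed subdomains."""
    exclusions = " ".join(f"-inurl:{s}" for s in subdomains)
    return f"site:{domain} {exclusions}"

print(canonical_audit_query("yourdomain.com", ["www", "blog", "clients"]))
# -> site:yourdomain.com -inurl:www -inurl:blog -inurl:clients
```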
Rob Laporte

Google Changes Course on Nofollow - Search Engine Watch (SEW) - 0 views

  • This week at the SMX Advanced conference in Seattle, Cutts joined the discussion around nofollow during the duplicate content session. According to Outspoken Media's Lisa Barone: A debate broke out mid-session when Matt Cutts got involved about whether or not nofollow is still effective. Of course, as soon as it got hot, all search representatives got very tight-lipped about who said what and what they really meant. As far as I could tell, Matt Cutts did NOT say that they ignore nofollow, but he DID hint that it is less effective today than it used to be. Later, Cutts addressed the issue again in his You&A keynote. When asked about PageRank sculpting, Cutts said that it will still work, but not as well. Basically, using nofollow will still prevent PageRank from passing from the linking page through the nofollowed link. But that PageRank is no longer "saved" to be used by other links on the page. It just "evaporates," according to Cutts. Rand Fishkin at SEOmoz has some visual aids to help describe the process. This change mainly affects those SEOs who have tried to optimize their pages using the nofollow tag for PageRank sculpting. It's safe to say that most site owners have no idea what PageRank sculpting is, which is probably a good thing, since it can quite easily be done wrong and cause more problems than it solves.
Rob Laporte

Google Extends Support for rel=canonical - Search Engine Watch (SEW) - 0 views

  • Google announced
Rob Laporte

301 vs. 410 vs. 404 vs. Canonical | LinkedIn - 0 views

  • However, after looking at how webmasters use them in practice we are now treating the 410 HTTP result code as a bit "more permanent" than a 404. So if you're absolutely sure that a page no longer exists and will never exist again, using a 410 would likely be a good thing. I don't think it's worth rewriting a server to change from 404 to 410, but if you're looking at that part of your code anyway, you might as well choose the "permanent" result code if you can be absolutely sure that the URL will not be used again. If you can't be sure of that (for whatever reason), then I would recommend sticking to the 404 HTTP result code.
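A minimal sketch of that advice, assuming Flask; the retired paths are hypothetical examples:

```python
from flask import Flask, abort

app = Flask(__name__)

# Hypothetical set of paths you are absolutely sure will never return.
GONE_FOREVER = {"/old-product", "/retired-campaign"}

@app.route("/<path:page>")
def serve(page):
    path = "/" + page
    if path in GONE_FOREVER:
        abort(410)  # gone for good: treated as a bit more permanent than 404
    abort(404)      # default when you can't be sure the URL won't be used again
```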
Rob Laporte

65+ Best Free SEO Chrome Extensions (As Voted-for by SEO Community) - 1 views

  • Link Redirect Trace — Uncovers all URLs in a redirect chain, including 301s, 302s, etc. Very useful for finding (and regaining) lost “link juice,” amongst other things. (A minimal redirect-tracing sketch appears after this list.) Other similar extensions: Redirect Path
  • Scraper — Scrape data from any web page using XPath or jQuery. Integrates with Google Sheets for one-click export to a spreadsheet. Or you can copy to clipboard and paste into Excel. Other similar extensions: Data Scraper — Easy Web Scraping, XPather
  • Tag Assistant (by Google) — Check for the correct installation of Google tags (e.g. Google Analytics, Tag Manager, etc) on any website. Also, record typical user flows on your website to diagnose and fix implementation errors.
  • Web Developer — Adds a web developer toolbar to Chrome. Use it to check how your website looks on different screen sizes, find images with missing alt text, and more.
  • WhatRuns — Instantly discover what runs any website. It uncovers the CMS, plugins, themes, ad networks, fonts, frameworks, analytics tools, everything.
  • Page Load Time — Measures and displays page load time in the toolbar. Also breaks down this metric by event to give you deeper insights. Simple, but very useful.
  • FATRANK — Tells you where the webpage you’re visiting ranks in Google for any keyword/phrase.
  • SEOStack Keyword Tool — Finds thousands of low-competition, long-tail keywords in seconds. It does this by scraping Google, YouTube, Bing, Yahoo, Amazon, and eBay. All data can be exported to CSV.
  • Window Resizer — Resize your browser window to see how a website looks on screens of different sizes. It has one-click emulation for popular sizes/resolutions (e.g. iPhone, iPad, laptop, desktop, etc).
  • Ghostery — Tells you how websites are tracking you (e.g. Facebook Custom Audiences, Google Analytics, etc) and blocks them. Very useful for regaining privacy. Plus, websites generally load faster when they don’t need to load tracking technologies.
  • Ayima Page Insights — Uncovers technical and on-page issues for any web page. It also connects to Google Search Console for additional insights on your web properties.
  • ObservePoint TagDebugger — Audit and debug issues with website tags (e.g. Google Analytics, Tag Manager, etc) on your websites. Also checks variables and on-click events. Other similar extensions: Event Tracking Tracker
  • The Tech SEO — Quick Click Website Audit — Provides pre-formatted links (for the current URL) to a bunch of popular SEO tools. A very underrated tool that reduces the need for mundane copy/pasting.
  • User-Agent Switcher for Chrome — Mimic user-agents to check that your website displays correctly in different browsers and/or operating systems.
  • Portent’s SEO Page Review — Reviews the current page and kicks back a bunch of data including meta tags, canonicals, outbound links, H1-H6 tags, OpenGraph tags, and more.
  • FindLinks — Highlights all clickable links/elements on a web page in bright yellow. Very useful for finding links on websites with weird CSS styling.
  • SERPTrends SEO Extension — Tracks your Google, Bing, and Yahoo searches. Then, if you perform the same search again, it shows ranking movements directly in the SERPs.
  • SimilarTech Prospecting — Discovers a ton of useful information about the website you’re visiting. This includes estimated monthly traffic, company information, social profiles, web technologies, etc.
  • SEO Search Simulator by Nightwatch — Emulates Google searches from any location. Very useful for seeing how rankings vary for a particular query in different parts of the world.
  • "Find Out How Much Traffic a Website Gets: 3 Ways Compared"
Rob Laporte

Google's internal SEO strategy: Make small changes, embrace change, consolidate - Search Engine Land - 0 views

  • Small changes make a big impact. Google’s first point is that, with large sites, making small changes can often bring a big impact and return when it comes to search rankings. Google plotted the growth of one of its 7,000 websites, the Google My Business marketing site, showing how adding canonicals, adding hreflang to their XML sitemaps, and improving their metadata all resulted in gains in their organic search traffic. (The original post includes the chart; a minimal hreflang sitemap sketch appears after this list.)
  • A second chart in the original post shows the improvement after the AMP error fixes were made.
  • Consolidation. For the past several years, many SEOs have been saying “less is more,” meaning that having fewer sites and fewer pages with higher-quality content often leads to better SEO results. Google says that works for them, and they have been working on consolidating their sites. Google said they found a “large number” of near-duplicate sites across their properties. “Duplicate content is not only confusing for users, it’s also confusing for search engines,” Google said. Google added, “Creating one great site instead of multiple microsites is the best way to encourage organic growth over time.” In one case study Google provided, with the Google Retail site, they took six old websites and consolidated the content. They made “one great website,” and it led to a doubling of the site’s call-to-action click-through rate and a 64% increase in organic traffic.
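A minimal sketch of an hreflang-annotated sitemap entry like the ones referenced above, built with Python's standard library; the URLs and locales are hypothetical:

```python
import xml.etree.ElementTree as ET

SM = "http://www.sitemaps.org/schemas/sitemap/0.9"
XHTML = "http://www.w3.org/1999/xhtml"
ET.register_namespace("", SM)
ET.register_namespace("xhtml", XHTML)

# One <url> entry whose language alternates are declared in the sitemap
# itself, rather than in each page's <head>.
urlset = ET.Element(f"{{{SM}}}urlset")
url = ET.SubElement(urlset, f"{{{SM}}}url")
ET.SubElement(url, f"{{{SM}}}loc").text = "https://example.com/en/page"
for lang, href in [("en", "https://example.com/en/page"),
                   ("de", "https://example.com/de/page")]:
    ET.SubElement(url, f"{{{XHTML}}}link",
                  rel="alternate", hreflang=lang, href=href)

print(ET.tostring(urlset, encoding="unicode"))
```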
jack_fox

11 Little-Known Features In The SEO Spider | Screaming Frog - 0 views

  • If you need to crawl millions of URLs using a desktop crawler, you really can. You don’t need to keep increasing RAM to do it either; switch to database storage instead.
  • If you’re auditing an HTTP to HTTPS migration which has HSTS enabled, you’ll want to check the underlying ‘real’ sitewide redirect status code in place (and find out whether it’s a 301 redirect). Therefore, you can choose to disable HSTS policy by unticking the ‘Respect HSTS Policy’ option.
  • For macOS, to open additional instances of the SEO Spider open a Terminal and type the following: open -n /Applications/Screaming\ Frog\ SEO\ Spider.app/ You can now perform multiple crawls, or compare multiple crawls at the same time.
  • Occasionally it can be useful to crawl URLs with fragments (/page-name/#this-is-a-fragment) when auditing a website, and by default the SEO Spider will crawl them in JavaScript rendering mode.
  • While this can be helpful, the search engines will obviously ignore anything from the fragment and crawl and index the URL without it. Therefore, generally you may wish to switch this behaviour off using the ‘Regex replace’ feature in URL Rewriting. Simply include #.* within the ‘regex’ field and leave the ‘replace’ field blank. (A one-line equivalent is sketched after this list.)
  • Saving HTML & Rendered HTML To Help Debugging We occasionally receive support queries from users reporting a missing page title, description, canonical or on-page content that’s seemingly not being picked up by the SEO Spider, but can be seen to exist in a browser, and when viewing the HTML source. Often this is assumed to be a bug of somekind, but most of the time it’s just down to the site responding differently to a request made from a browser rather than the SEO Spider, based upon the user-agent, accept-language header, whether cookies are accepted, or if the server is under load as examples.
jack_fox

Defense Against the Dark Arts: Why Negative SEO Matters, Even if Rankings Are Unaffected - Moz - 0 views

  • if you get 100,000 links pointing to your site, it is going to push you over the limit of the number of links that Google Search Console will give back to you in the various reports about links
  • Google cuts off at 100,000 total links
  • even though we know Google is ignoring most of these links, they don't label that for us in any kind of useful fashion. Even after we can get access to all of that link data, all of those hundreds of thousands of spammy links, we still can't be certain which ones matter and which ones don't.
  • if somebody syndicates an article of yours that has, let’s say, eight links to other internal pages, and they syndicate it to 10,000 websites, then you’ve just got 80,000 new links pointing to your site: links that should have been internal but are now external.
  • Nofollowed malware links in UGC
  • there are ways to make it look like there are links on your site that aren't really under your control through things like HTML injection
  • it's not so much about bowling you out of the search engines. It's about making it so that SEO just isn't workable anymore.
  • How do you fight back against negative SEO? 1. Canonical burn pages
  • Embedded styled attribution
  • Link Lists
  • As you get links, real links, good links, add them to a Link List, and that way you will always have a list of links that you know are good, that you can compare against the list of links that might be sullied by a negative SEO campaign.
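A minimal sketch of that comparison in Python; the file names are hypothetical, and the fresh export could come from Search Console or any link tool:

```python
def load_links(path: str) -> set[str]:
    """Read one URL per line into a set, skipping blank lines."""
    with open(path) as f:
        return {line.strip() for line in f if line.strip()}

# Links you collected and verified yourself (the "Link List").
known_good = load_links("known_good_links.txt")
# A fresh export that may include negative-SEO links.
latest_export = load_links("gsc_links_export.txt")

suspicious = latest_export - known_good  # candidates for review or disavow
for link in sorted(suspicious):
    print(link)
```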