Home/ DISC Inc/ Group items tagged algorithms

Rob Laporte

SEOmoz | Announcing SEOmoz's Index of the Web and the Launch of our Linkscape Tool - 0 views

  •  
    After 12 long months of brainstorming, testing, developing, and analyzing, the wait is finally over. Today, I'm ecstatic to announce some very big developments here at SEOmoz. They include:

    * An Index of the World Wide Web - 30 billion pages (and growing!), refreshed monthly, built to help SEOs and businesses acquire greater intelligence about the Internet's vast landscape
    * Linkscape - a tool enabling online access to the link data provided by our web index, including ordered, searchable lists of links for sites & pages, and metrics to help judge their value
    * A Fresh Design - that gives SEOmoz a more usable, enjoyable, and consistent browsing experience
    * New Features for PRO Membership - including more membership options, credits to run advanced Linkscape reports (for all PRO members), and more

    Since there's an incredible amount of material, I'll do my best to explain things clearly and concisely, covering each of the big changes. If you're feeling more visual, you can also check out our Linkscape comic, which introduces the web index and tool in a more humorous fashion: Check out the Linkscape Comic

    SEOmoz's Index of the Web
    For too long, data that is essential to the practice of search engine optimization has been inaccessible to all but a handful of search engineers. The connections between pages (links) and the relationship between links, URLs, and the web as a whole (link metrics) play a critical role in how search engines analyze the web and judge individual sites and pages. Professional SEOs and site owners of all kinds deserve to know more about how their properties are being referenced in such a system. We believe there are thousands of valuable applications for this data and have already put some effort into retrieving a few fascinating statistics:

    * Across the web, 58% of all links are to internal pages on the same domain; 42% point to pages off the linking site.
    * 1.83%
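The internal-versus-external link split quoted in the excerpt above can be illustrated with a small classifier. This is only a sketch under stated assumptions: the `split_links` helper and the simple `www.`-stripping host comparison are illustrative inventions, not how Linkscape actually normalizes hosts (real crawlers also handle subdomains, redirects, and canonicalization).

```python
from urllib.parse import urlparse

def split_links(page_url, links):
    """Classify each link as internal (same host, ignoring a leading
    'www.') or external. Simplified sketch, not Linkscape's method."""
    page_host = urlparse(page_url).netloc.lower().removeprefix("www.")
    internal, external = [], []
    for link in links:
        host = urlparse(link).netloc.lower().removeprefix("www.")
        # Relative links (no host) resolve to the linking page's own domain.
        (internal if host in ("", page_host) else external).append(link)
    return internal, external

links = ["/about.html",
         "http://www.site.com/page1.html",
         "http://other.com/ref"]
internal, external = split_links("http://site.com/", links)
print(len(internal), len(external))  # 2 internal, 1 external
```

Run over a full crawl, the same tally yields aggregate ratios like the 58%/42% figure cited above.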
Rob Laporte

Effective Internal Linking Strategies That Prevent Duplicate Content Nonsense - Search ... - 0 views

  •  
    The funny thing about duplicate content is that you don't really have to have it for it to appear as if you do. But whether you have duplicate content on your site or not, to the search engines appearances are everything. The engines are pretty much just mindless bots that can't reason. They only see what is, or appears to be, there, and then do what the programmers have determined through the algorithm.

    How you set up your internal linking structure plays a significant role in whether you appear to have duplicate content on your site. Some things we do without thinking, setting ourselves up for problems ahead. With a little foresight and planning, you can prevent duplicate content issues that result from poor internal link development.

    For example, we know that when we link to site.com/page1.html in one place but link to www.site.com/page1.html in another, we are really linking to the same page. But to the search engines, the www. can make a difference. They'll often treat those two links as links to two separate pages, and then analyze each page as if it is a duplicate of the other.

    There is something we can do with our internal linking to alleviate this appearance of duplicate content: link to the www. version only. Tomorrow I'll provide information on how to set up your site so that when someone types in yoursite.com they are automatically redirected to www.yoursite.com. It's a great permanent fix, but as a safety measure, I also recommend adjusting all your internal links to do the same.

    Example of not linking to www. version. In the image above you can see that the domain contains the www., but when you mouse over any of the navigation links, they point to pages without the www. Even if you have a permanent redirect in place, all the links on your site should point to the proper place. At the very least you're sparing the search engines and visitors a redirect. At best, should y
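The "link to the www. version only" advice above can be sketched as a small link-rewriting pass. This is a hedged illustration, not the article's own tooling: the `canonicalize` helper, the `yoursite.com` host, and the alias set are all hypothetical, and the server-side 301 redirect the author promises to cover separately remains the proper permanent fix.

```python
from urllib.parse import urlparse, urlunparse

CANONICAL_HOST = "www.yoursite.com"          # hypothetical canonical host
ALIASES = {"yoursite.com", "www.yoursite.com"}

def canonicalize(href):
    """Rewrite internal links so they always use the www. host.

    Links to other domains are left untouched; relative links carry
    no host at all, so they inherit the page's domain and need no change.
    """
    parts = urlparse(href)
    if parts.netloc.lower() in ALIASES:
        parts = parts._replace(netloc=CANONICAL_HOST)
    return urlunparse(parts)

print(canonicalize("http://yoursite.com/page1.html"))
# http://www.yoursite.com/page1.html
```

Applying such a pass to every internal link means search engines only ever see one host for each page, which is exactly the "appearance" fix the excerpt describes.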
Rob Laporte

Google Search Algorithm Update Brewing? Depends Who You Ask. - 0 views

  •  
    "Accuranker"
Rob Laporte

Problems Continue With Google Local Business Listings - 0 views

  •  
    Oct 14, 2008 at 1:08pm Eastern by Mike Blumenthal

    What do the Google searches Orlando Hotels, Miami Discount Car Rental & Dallas Discount Car Rental have in common? The obvious answer is that they are all local searches on popular phrases in major metro areas. A less obvious answer is that, like the infamous Denver Florist search last December, they all return seemingly authoritative OneBox results on popular geo-phrase searches in a major market, as in the example below: Orlando Hotels or the Marriott.

    The searches demonstrate clear problems with Google's Universal Local OneBox algorithm. Certainly, "major city + service/product" searches should return a broad range of consumer choices and not an authoritative OneBox that limits the view to one highlighted provider of the service. Google returns the OneBox result because the ostensible business name in the result supposedly mirrors the search phrase and, in Google's opinion, provides strong relevance to the user query.

    The problem with the above result is that the business shown on the map is the Marriott Orlando Downtown, not "travel.ian.com." The Marriott's business listing has apparently been hijacked. In fact, all of the listings returned on these searches have apparently been "hijacked": the business name of each listing has been modified from the original, Marriott Orlando Downtown, to match the search phrase, and the URLs of the listings have also been modified to direct users to an affiliate link on an appropriate site.

    How? Through the use of Google's community edit feature for local business listings. That feature has become the playground of black-hat affiliate marketers and is sorely in need of more security. Of interest in this regard is that many of these listings are for multinational corporations. These are not small independent business that are t
jack_fox

Google Does Not Use BBB Or Other Trust Building Sites For Search Ranking - 0 views

  • John Mueller confirmed yesterday in a video hangout that Google does not use the BBB (Better Business Bureau) score or reviews, or other third-party trust sites, in its ranking algorithm.
Rob Laporte

Google Update 2019: Winners and Losers of the March 2019 Core Update - 0 views

  • Another clear trend resulting from this update seems to be Google favoring websites, particularly when users are searching for sensitive YMYL keywords, that are able to provide a higher level of trust. The main beneficiaries of this focus are websites with a strong brand profile and a broad topical focus. On the flipside, this has meant that niche websites dealing with these topics have seen their rankings fall.
  • An analysis conducted by Malte Landwehr, VP Product at Searchmetrics, suggests that Google’s algorithm has increased its weighting of user signals when calculating rankings. The results show that domains that improved their SEO Visibility following the Google Core Update have higher values for time on site and page views per visit, and lower bounce rates than their online competitors.
  •  
    "niche ranking factors"
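The Searchmetrics analysis quoted above compares domains on user signals: time on site, page views per visit, and bounce rate. As an illustration only (the `engagement_metrics` helper and the `(session_id, timestamp)` log format are my own assumptions; analytics platforms derive these metrics with more nuance), the three figures could be computed from raw pageview events like so:

```python
from collections import defaultdict

def engagement_metrics(pageviews):
    """Compute pages per visit, mean time on site (seconds), and bounce
    rate from (session_id, timestamp) pageview events. Hypothetical log
    format; a bounce is counted as a single-pageview session."""
    sessions = defaultdict(list)
    for session_id, ts in pageviews:
        sessions[session_id].append(ts)
    n = len(sessions)
    pages_per_visit = sum(len(v) for v in sessions.values()) / n
    time_on_site = sum(max(v) - min(v) for v in sessions.values()) / n
    bounce_rate = sum(1 for v in sessions.values() if len(v) == 1) / n
    return pages_per_visit, time_on_site, bounce_rate

events = [("a", 0), ("a", 30), ("a", 90), ("b", 10)]
print(engagement_metrics(events))  # (2.0, 45.0, 0.5)
```

Per the analysis, domains that gained visibility after the update would show higher values on the first two metrics and a lower value on the third than their competitors.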
jack_fox

Google's Ranking Algorithm Allows Unique Tactics - 0 views

  • Just because a site isn't doing well in one specific area doesn't mean that site cannot rank well in Google. In this case, speed is one of hundreds of factors Google uses.
Rob Laporte

The March 12, 2019 Google Core Algorithm Update - A Softer Side Of Medic, Trust And The... - 1 views

  • When checking queries that dropped and their corresponding landing pages, they line up with the problems I have been surfacing: for example, thin content, empty pages, pages that had render issues, and so on.
  • Author expertise is extremely important, especially for YMYL content.
  • Also, and this is important, the site consumes a lot of syndicated content. I’ve mentioned problems with doing this on a large scale before and it seems this could be hurting the site now. Many articles are not original, yet they are published on this site with self-referencing canonical tags (basically telling Google this is the canonical version). I see close to 2K articles on the site that were republished from other sources
  • ...5 more annotations...
  • And last, but not least, the site still hadn’t moved to https. Now, https is a lightweight ranking factor, but it can be the tiebreaker when two pages are competing for a spot in the SERPs. Also, http sites can turn off users, especially with the way Chrome (and other browsers) are flagging them. For example, there’s a “not secure” label in the browser. And Google can pick up on user happiness over time in a number of ways (which can indirectly impact a site rankings-wise). Maybe users leave quickly, maybe they aren’t as apt to link to the site, share it on social media, etc. So not moving to https can be hurting the site on multiple levels (directly and indirectly).
  • This also leads me to believe that if Google is using reputation, they are doing so in aggregate and not using third-party scores or ratings.
  • What Site Owners Can Do – The “Kitchen Sink” Approach To Remediation: My recommendations aren’t new. I’ve been saying this for a very long time. Don’t try to isolate one or two problems… Google is evaluating many factors when it comes to these broad core ranking updates. My advice is to surface all potential problems with your site and address them all. Don’t tackle just 20% of your problems. Tackle close to 100% of your problems. Google is on record explaining they want to see significant improvement in quality over the long-term in order for sites to see improvement.
  • Summary – The March 12 Update Was Huge. The Next Is Probably A Few Months Away: Google only rolled out three broad core ranking updates in 2018. Now we have our first of 2019 and it impacted many sites across the web.
  • Don’t just cherry pick changes to implement. Instead, surface all potential problems across content, UX, advertising, technical SEO, reputation, and more, and address them as thoroughly as you can. That’s how you can see ranking changes down the line. Good luck.
Rob Laporte

Google confirms mid-December search ranking algorithm updates - 0 views

  •  
    "Searchmetrics"