
DISC Inc / Group items tagged: algorithms


Rob Laporte

Google Confirms "Mayday" Update Impacts Long Tail Traffic - 0 views

  • Google Confirms “Mayday” Update Impacts Long Tail Traffic May 27, 2010 at 11:02am ET by Vanessa Fox

    Google made between 350 and 550 changes in its organic search algorithms in 2009. This is one of the reasons I recommend that site owners not get too fixated on specific ranking factors. If you tie construction of your site to any one perceived algorithm signal, you’re at the mercy of Google’s constant tweaks. These frequent changes are one reason Google itself downplays algorithm updates. Focus on what Google is trying to accomplish as it refines things (the most relevant, useful results possible for searchers) and you’ll generally avoid too much turbulence in your organic search traffic.

    However, sometimes a Google algorithm change is substantial enough that even those who don’t spend a lot of time focusing on the algorithms notice it. That seems to be the case with what those discussing it at Webmaster World have named “Mayday”. Last week at Google I/O, I was on a panel with Googler Matt Cutts who said, when asked during Q&A, “this is an algorithmic change in Google, looking for higher quality sites to surface for long tail queries. It went through vigorous testing and isn’t going to be rolled back.”

    I asked Google for more specifics and they told me that it was a rankings change, not a crawling or indexing change, which seems to imply that sites getting less traffic still have their pages indexed, but some of those pages are no longer ranking as highly as before. Based on Matt’s comment, this change impacts “long tail” traffic, which generally comes from longer queries that few people search for individually but that in aggregate can provide a large percentage of traffic.

    This change seems to have primarily impacted very large sites with “item” pages that don’t have many individual links into them, might be several clicks from the home page, and may not have substantial unique and value-added content on them. For instance, ecommerce sites often have this structure: the individual product pages are unlikely to attract external links, and the majority of the content may be imported from a manufacturer database. Of course, as with any change that results in a traffic hit for some sites, other sites experience the opposite. Based on Matt’s comment at Google I/O, the pages that are now ranking well for these long tail queries are from “higher quality” sites (or perhaps are “higher quality” pages).

    My complete speculation is that perhaps the relevance algorithms have been tweaked a bit. Before, pages that didn’t have high quality signals might still rank well if they had high relevance signals. And perhaps now, those high relevance signals don’t carry as much weight in ranking if the page doesn’t have the right quality signals.

    What’s a site owner to do? It can be difficult to create compelling content and attract links to these types of pages. My best suggestion to those who have been hit by this is to isolate a set of queries for which the site is now getting less traffic and check out the search results to see what pages are ranking instead. What qualities do they have that make them seem valuable? For instance, I have no way of knowing how amazon.com has fared during this update, but they’ve done a fairly good job of making individual item pages with duplicated content from manufacturers’ databases unique and compelling by adding content like user reviews. They have set up a fairly robust internal linking (and anchor text) structure with things like recommended items and lists. And they attract external links with features such as the My Favorites widget.

    From the discussion at the Google I/O session, this is likely a long-term change, so if your site has been impacted by it, you’ll likely want to do some creative thinking around how you can make these types of pages more valuable (which should increase user engagement and conversion as well).

    Update on 5/30/10: Matt Cutts from Google has posted a YouTube video about the change. In it, he says “it’s an algorithmic change that changes how we assess which sites are the best match for long tail queries.” He recommends that a site owner who is impacted evaluate the quality of the site, consider whether the site really is the most relevant match for the impacted queries and what “great content” could be added, determine whether the site is considered an “authority”, and ensure that the page does more than simply match the keywords in the query and is relevant and useful for that query. He notes that the change: has nothing to do with the “Caffeine” update (an infrastructure change that is not yet fully rolled out); is entirely algorithmic (and isn’t, for instance, a manual flag on individual sites); impacts long tail queries more than other types; and was fully tested and is not temporary.
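    The "isolate a set of queries" advice above is easy to script. Below is a minimal sketch in Python, assuming you have two query-level CSV exports (for example, from Google Search Console's Performance report) covering periods before and after the update; the filenames, the column names ("query", "clicks"), and the thresholds for "long tail" and "dropped" are illustrative assumptions, not something from the article.

```python
import csv

def load_clicks(path):
    """Read a query-level CSV export into a {query: clicks} dict."""
    clicks = {}
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            clicks[row["query"]] = int(row["clicks"])
    return clicks

# Hypothetical exports for the periods before and after the update.
before = load_clicks("queries_before.csv")
after = load_clicks("queries_after.csv")

# Flag long-tail queries (4+ words, modest volume) whose clicks fell by half or more.
drops = []
for query, old in before.items():
    new = after.get(query, 0)
    if len(query.split()) >= 4 and old >= 5 and new <= old * 0.5:
        drops.append((query, old, new))

# Review the biggest absolute losses first, then inspect those SERPs by hand.
for query, old, new in sorted(drops, key=lambda d: d[1] - d[2], reverse=True)[:50]:
    print(f"{query}: {old} -> {new} clicks")
```

    The script only produces a shortlist; the manual step Fox describes, checking which pages now rank for those queries and what qualities they have, is where the real analysis happens.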
Dale Webb

Local vs Traditional SEO: Why Citation Is the New Link - 0 views

  •  
    Google's Local algorithm (the one that populates maps.google.com and helps populate the 10-pack, 3-pack, and Authoritative OneBox) counts links differently than its standard organic algorithm. In the Local algorithm, links can still bring direct traffic from the people who click on them. But the difference is that these "links" aren't always links; sometimes they're just an address and phone number associated with a particular business! In the Local algorithm, these references aren't necessarily a "vote" for a particular business, but they serve to validate that the business exists at a particular location, and in that sense they make a business more relevant for a particular search.
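    Because these citations amount to a consistent name, address and phone number (NAP) appearing wherever the business is mentioned, a common companion step is to publish the same NAP data as structured markup on the business's own site. The sketch below is an illustration, not something from the article: it uses Python to emit schema.org LocalBusiness JSON-LD, and all the business details are placeholders.

```python
import json

# Placeholder NAP (name, address, phone) data -- keep it identical to the
# values used in external citations and directory listings.
local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Bakery",
    "telephone": "+1-555-0100",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Springfield",
        "addressRegion": "MA",
        "postalCode": "01101",
    },
}

# The output would be embedded in a <script type="application/ld+json"> tag
# in the page template.
print(json.dumps(local_business, indent=2))
```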
jack_fox

Google Says Many Manual Actions Are Algorithmic Now - 0 views

  • John Mueller: A lot of these manual actions have evolved over time. Where we would say well we need to do this manually and at some point we figure out how to do it algorithmically. And it kind of evolved into that direction.
  • Because the web is just gigantic and we can't manually review the whole web. We have to find ways to do as much as possible algorithmically. And in many cases there's still weird things out there that we don't catch algorithmically that maybe we do have to take manual action on.
jack_fox

Google Can & Does Use Historic Data For Search Rankings - 0 views

  • John said "similarly with some of the quality algorithms it can also take a bit of time to kind of adjust from one state to another state. So if you significantly improve your website then it's not that from one crawl to the next crawl we will say oh this is a fantastic website now. It's something we're probably over the course of a year maybe sometimes even longer our algorithms have to learn that actually this is a much better website than we thought initially. And we can treat that a little bit better in the search results over time."
Rob Laporte

They're Back! Google Issues Weather Report For Panda Update - 0 views

  • Panda Weather Report Issued: That seemed pretty much like no, there aren’t going to be weather reports. But then early today, Matt Cutts tweeted: “Weather report: expect some Panda-related flux in the next few weeks, but will have less impact than previous updates (~2%).” Panda doesn’t constantly run. It’s a special algorithm that Google processes content through on a periodic basis. Why Google Panda Is More A Ranking Factor Than Algorithm Update explains much more about this.
Rob Laporte

August 22, 2017: The day the 'Hawk' Google local algorithm update swooped in - 0 views

  •  
    "Possum algorithm update"
Rob Laporte

A deep dive into BERT: How BERT launched a rocket into natural language understanding -... - 0 views

  • Google describes BERT as the largest change to its search system since the company introduced RankBrain, almost five years ago, and probably one of the largest changes in search ever.
  • it is not so much a one-time algorithmic change, but rather a fundamental layer which seeks to help with understanding and disambiguating the linguistic nuances in sentences and phrases, continually fine-tuning itself and adjusting to improve.
  • BERT achieved state-of-the-art results on 11 different natural language processing tasks.  These natural language processing tasks include, amongst others, sentiment analysis, named entity determination, textual entailment (aka next sentence prediction), semantic role labeling, text classification and coreference resolution. BERT also helps with the disambiguation of words with multiple meanings known as polysemous words, in context.
  • ...11 more annotations...
  • “Wouldn’t it be nice if Google understood the meaning of your phrase, rather than just the words that are in the phrase?” said Google’s Eric Schmidt back in March 2009, just before the company announced rolling out their first semantic offerings. This signaled one of the first moves away from “strings to things,” and is perhaps the advent of entity-oriented search implementation by Google.
  • On the whole, however, much of language can be resolved by mathematical computations around where words live together (the company they keep), and this forms a large part of how search engines are beginning to resolve natural language challenges (including the BERT update).
  • Google’s team of linguists (Google Pygmalion) working on Google Assistant, for example, in 2016 was made up of around 100 Ph.D. linguists.
  • By 2019, the Pygmalion team was an army of 200 linguists around the globe
  • BERT in search is mostly about resolving linguistic ambiguity in natural language. BERT provides text cohesion, which often comes from the small details in a sentence that provide structure and meaning.
  • BERT is not an algorithmic update like Penguin or Panda, since BERT does not judge web pages either negatively or positively; rather, it improves the understanding of human language for Google Search. As a result, Google understands much more about the meaning of content on the pages it comes across, and also about the queries users issue, taking a word’s full context into consideration.
  • BERT is about sentences and phrases
  • We may see this reduction in recall reflected in the number of impressions we see in Google Search Console, particularly for pages with long-form content that are currently being recalled for queries they are not particularly relevant to.
  • International SEO may benefit dramatically too
  • Question and answering directly in SERPs will likely continue to get more accurate which could lead to a further reduction in click through to sites.
  • Can you optimize your SEO for BERT? Probably not. The inner workings of BERT are complex and multi-layered. So much so that there is now even a field of study called “Bertology,” created by the team at Hugging Face. It is highly unlikely that any search engineer questioned could explain the reasons why something like BERT makes the decisions it does with regards to rankings (or anything else). Furthermore, since BERT can be fine-tuned across parameters and multiple weights and then self-learns in an unsupervised feed-forward fashion, in a continual loop, it is considered a black-box algorithm, a form of unexplainable AI. BERT is thought to not always know why it makes decisions itself. How are SEOs then expected to try to “optimize” for it? BERT is designed to understand natural language, so keep it natural. We should continue to create compelling, engaging, informative and well-structured content and website architectures in the same way we would write, and build sites, for humans.
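    To get a feel for the context-sensitivity described above, you can query a pretrained BERT model directly. Below is a minimal sketch using the open-source Hugging Face transformers library (the same team behind “Bertology”); the sentences are made-up examples, and this only illustrates masked-language-model behavior, not anything about Google's ranking systems.

```python
from transformers import pipeline

# bert-base-uncased predicts the [MASK] token from the surrounding context.
unmasker = pipeline("fill-mask", model="bert-base-uncased")

# The same masked position gets very different completions depending on the
# rest of the sentence, which is the disambiguation behavior described above.
for sentence in [
    "He opened an account at the [MASK].",
    "They had a picnic on the [MASK] of the river.",
]:
    print(sentence)
    for prediction in unmasker(sentence, top_k=3):
        print(f"  {prediction['token_str']}  (score {prediction['score']:.2f})")
```

    Nothing in the sentence changes except the surrounding words, yet the model's top predictions shift, which is the kind of contextual reading the article credits for better handling of long, conversational queries.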
Rob Laporte

Live Search Webmaster Center Blog : SMX East 2008: Webmaster Guidelines - 0 views

  • The bottom line is there are no scenarios in which we would ever recommend cloaking as a good solution, although we do understand that there are some technical reasons people cloak pages that are not directly related to spam. The problem is that cloaking can set off some automatic spam detection algorithms that may result in parts of your site being penalized. As a search engine optimization practice, cloaking can actually be counter-productive.
  • Q: What can you do if your website does get penalized? The first thing you should do is verify that your site has in fact been penalized. To do this, log into our Webmaster Tools and go to the Site Summary page. From here, look for the Blocked: field in the right-hand column. If your site is blocked, this will show as Yes; otherwise it will show as No. If your site is blocked, then it is time to go review our Webmaster Guidelines and check your site to see which one(s) you may have violated. If you have any questions about this step, please consult our online forums, or contact our technical support staff. Once you've identified and resolved the issue(s), it is time to request that Live Search re-include your pages back into its index. To do that, you'll need to log back into the Webmaster Tools and click on the hyperlinked Blocked: Yes in your Site Summary page. This will take you to a form whereby you can request reevaluation from our support team. Thanks for all of your questions today! If you have any more, please leave them in the comments section and we'll try and answer them as soon as possible.
Rob Laporte

YouTube's 'Buzz Targeting' Sells Ad Space on Soon-to-Be Viral Videos - MarketingVOX - 0 views

  • YouTube's 'Buzz Targeting' Sells Ad Space on Soon-to-Be Viral Videos. What is the difference between 'algorithm' and 'alchemy'? Google has introduced "Buzz Targeting" on YouTube, a new way to wring ad dollars from the video site. Buzz Targeting highlights videos that are about to go viral amongst YouTube users. The algorithm examines videos being favorited and distributed across other sites, among other criteria, then gives advertisers the opportunity to advertise around them. Ads incorporated on the ground floor can then piggy-back on the video's popularity.

    Movie studio Lionsgate was among the first beta testers for Buzz Targeting. The studio placed ads for The Forbidden Kingdom alongside 500 entertainment-related videos. While no figures on the campaign's success were presented, Danielle DePalma of Lionsgate said the program "allowed us to reach a very large, diverse audience." It remains unclear how Buzz Targeting incorporates factors like demographic or location-based criteria. And while the notion of algorithmically gauging a video's ascension into pop culture is comforting, some skepticism is warranted. At ad:tech New York last year, video blogger Kevin Nalty admitted to being uncertain why some videos go viral and others do not. The wisest course, he told audience members of a user-generated video panel, is to keep your cost of entry down. Nalty, known as "Nalts" on YouTube, produced over 500 videos before 2008.
Jennifer Williams

Tag Categories - 24 views

Hey Dale, I added that for you. If anyone else really thinks a new "tag" (category) is needed, post here to the forum. Don't forget to use these tags and make sure that they are spelled the same...


Rob Laporte

Number Crunchers: Who Lost In Google's "Farmer" Algorithm Change? - 0 views

  •  
    Over at Sistrix
Rob Laporte

Google Says Domain Registrations Don't Affect SEO, Or Do They? - 0 views

  •  
    Google Says Domain Registrations Don't Affect SEO, Or Do They? Sep 9, 2009 at 2:01pm ET by Matt McGee

    Over at Search Engine Roundtable today, Barry Schwartz writes about the latest comments from Google about domain registration and its impact on SEO/search rankings. In this case, it's Google employee John Mueller suggesting in a Google Webmaster Help forum thread that Google doesn't look at the length of a domain registration: "A bunch of TLDs do not publish expiration dates - how could we compare domains with expiration dates to domains without that information? It seems that would be pretty hard, and likely not worth the trouble. Even when we do have that data, what would it tell us when comparing sites that are otherwise equivalent? A year (the minimum duration, as far as I know) is pretty long in internet-time :-)."

    But let's look at some more evidence. Earlier this year, Danny spoke with Google's Matt Cutts about a variety of domain/link/SEO issues. In light of the claims from domain registrars that longer domain registrations are good for SEO, Danny specifically asked "Does Domain Registration Length Matter?" Matt's reply: "To the best of my knowledge, no search engine has ever confirmed that they use length-of-registration as a factor in scoring. If a company is asserting that as a fact, that would be troubling."

    But wait, there's more! Shortly after the Q&A with Danny that we posted here, Matt published more thoughts on the matter in a video on the Google Webmaster Central Channel on YouTube. If you don't have time to watch the video, Matt says, "My short answer is not to worry very much about that [the number of years a domain is registered], not very much at all." He reiterates that the domain registrar claims "are not based on anything we said," and talks about a Google "historical data" patent that may or may not be part of Google's algorithm. He sums it up by saying, "make great content, don't worry nea…
Dale Webb

W3C Validation not part of Google Search Engine Ranking Factor - 0 views

  •  
    Matt Cutts has announced that W3C validation and clean coding do not factor into search engine rankings. There are other benefits to having a validated website, but there is nothing in Google's algorithm that rewards validation with higher positions in the SERPs. Note: cleaner code usually loads faster, and load time is a factor.
Rob Laporte

Why Google Panda Is More A Ranking Factor Than Algorithm Update - 0 views

  • At our SMX Advanced conference earlier this month, the head of Google’s spam fighting team, Matt Cutts, explained that the Panda filter isn’t running all the time. Right now, it takes too much computing power to run this particular analysis of pages constantly. Instead, Google runs the filter periodically to calculate the values it needs. Each new run so far has also coincided with changes to the filter, some big, some small, that Google hopes will improve its catching of poor quality content. So far, the Panda schedule has been like this:
    Panda Update 1.0: Feb. 24, 2011
    Panda Update 2.0: April 11, 2011 (about 7 weeks later)
    Panda Update 2.1: May 10, 2011 (about 4 weeks later)
    Panda Update 2.2: June 16, 2011 (about 5 weeks later)
    Recovering From Panda: For anyone who was hit by Panda, it’s important to understand that the changes you’ve made won’t have any immediate impact. For instance, if you started making improvements to your site the day after Panda 1.0 happened, none of those would have registered for getting you back into Google’s good graces until the next time Panda scores were assessed, which wasn’t until around April 11. With the latest Panda round now live, Google says it’s possible some sites that were hit by past rounds might see improvements, if they themselves have improved.
jack_fox

10 Facts You Think You Know About SEO That Are Actually Myths - 0 views

  • The duplicate content penalty doesn’t exist.
  • It isn’t helpful to focus on individual ranking signals, because search engine algorithms are too sophisticated for this to be a useful way of conceptualizing them.
  • Google’s algorithms have gotten better at understanding these types of low-quality backlinks and knowing when they should be ignored. As a result, the need for SEO pros to maintain and update a disavow file has diminished significantly
  • ...2 more annotations...
  • Now it is only recommended to make use of the disavow file when a site has received a manual action, in order to remove the offending links.
  • while there are plans to introduce a speed update later in 2018, Google only uses speed to differentiate between slow pages and those in the normal range.
Rob Laporte

Google Says Being Different Helps Improve Rankings - Search Engine Journal - 0 views

  • So, in general, a site query is not representative of all of the pages that we have indexed. It’s a good way to get a rough view of what we have indexed. But it’s not the comprehensive list. It’s not meant to be like that. For more information on how or what we have indexed, I would use Search Console and the index coverage report there. That gives you a better look at the pages that are actually indexed.
  • With regards to losing traffic, I realize that’s sometimes hard. In general, I think with a website that’s focused on ringtones, it’ll be a little bit tricky because our algorithms really do try to look out for unique, compelling, high quality content. And if your whole website is built up on essentially providing ringtones that are the same as everywhere else, then I don’t know if our algorithms would say this is a really important website that we need to focus on and highlight more in search.

    So with that in mind, if you’re focused on kind of this small amount of content that is the same as everyone else, then I would try to find ways to significantly differentiate yourselves, to really make it clear that what you have on your website is significantly different than all of those other millions of ringtone websites that have kind of the same content. Maybe there is a way to do that with regards to the content that you provide. Maybe there is a way to do that with the functionality that you provide. But you really need to make sure that what you have on your site is significantly different enough that our algorithms will say, well, this is what we need to index instead of all of these others that just have a list of ringtones on the website.

    So that’s probably not going to be that easy to make that kind of a shift, but that’s generally the direction I would head. And that’s the same recommendation I would have for any kind of website that offers essentially the same thing as lots of other websites do. You really need to make sure that what you’re providing is unique and compelling and high quality, so that our systems and users in general will say, I want to go to this particular website because they offer me something that is unique on the web and I don’t just want to go to any random other website.
jack_fox

BruceClay - How to Improve Google Image Search Ranking - 0 views

  • Google Images algorithm now considers not only the image but also the website where it’s embedded.
  • The authority of the webpage itself is now a signal for ranking an image.
jack_fox

Google demotes libelous content in search through its predatory sites algorithms - 0 views

  • Google told us it has already deployed changes to its algorithms but it plans to continue to make changes to catch exploitative sites.
jack_fox

- 0 views

  • I'm not aware of any ranking algorithm that would take IPs like that into account.