
DISC Inc: Group items tagged algorithms


Google Algorithm Updates & Changes: A Complete History

  • "How the Google Hummingbird Update Changed Search"
How Does the Local Algorithm Work? - Whiteboard Friday - Moz

  • there are a couple of tools that will actually let you see results based on geo coordinates, which is really cool and very accurate. Those tools include Local Falcon and a 100% free Chrome extension that you can put in your browser, called GS Location Changer. I use this all the time in an incognito browser if I want to just see what search results look like from a very, very specific location.
    • jack_fox: I will try out the free Chrome extension on our next local SEO Tier 2-3 job.
  • Depending on what industry you are working in, it's really important to know which of these two levels you need to be looking at. If you work with lawyers, for example, zip code level is usually good enough.
  • if you work with dentists or restaurants, let's say, you really need to be looking at the geo coordinate level. We have seen lots of cases where we will scan a specific keyword using these two tools and, depending on where in that zip code we are, we see completely different three-packs.
  • Generally speaking, if you're on a computer, they know what zip code you're in, and they'll list that at the bottom.
  • we've pretty much almost always seen a positive impact by switching to the homepage, even if that homepage is not relevant at all.
  • a Moz whitepaper that they did recently, where they found that only 8% of local pack listings had their website also appearing in the organic search results below.
The January 2020 Core Update: Affiliate Sites, Pet Health, Trust Issues and Spam likely...

  • Affiliate sites that did not properly disclose their affiliate links may have been affected. Truly excellent content appears to have been rewarded. Several elements of trust, as outlined in the Quality Raters’ Guidelines (QRG), were possibly reassessed.
  • A lot of ultra-spammy content may have been deindexed.
  • we believe that if something is outlined in the QRG, it means that Google is either measuring this algorithmically, or they want to be able to measure it algorithmically.
  • some examples of things that we noticed on affiliate sites that saw improvements in overall keyword rankings with this update: Plain text to make it clear that the user is clicking on a link to take them to a sales page. Example: When I make this recipe, I love to use this blender, which you can buy on Amazon.
  • Using an official widget from your affiliate partners.
RankBrain Judgment Day: 4 SEO Strategies You'll Need to Survive | WordStream

  • The future of SEO isn't about beating another page based on content length, social metrics, keyword usage, or your number of backlinks. Better organic search visibility will come from beating your competitors with a higher than expected click-through rate.
  • In “Google Organic Click-Through Rates” on Moz, Philip Petrescu shared CTR data broken down by organic position.
  • The Larry RankBrain Risk Detection Algorithm. Just download all of your query data from Webmaster Tools and plot CTR vs. Average Position for the queries you rank for organically. (A rough sketch of this exercise appears after this list.)
  • Our research into millions of PPC ads has shown that the single most powerful way to increase CTR in ads is to leverage emotional triggers. Tapping into emotions will get your target customer/audience clicking! Anger. Disgust. Affirmation. Fear. These are some of the most powerful triggers; they not only drive click-through rate but also increase conversion rates.
  • No, you need to combine keywords and emotional triggers to create SEO superstorms that result in ridiculous CTRs.
  • Bottom line: Use emotional triggers + keywords in your titles and descriptions if you want your CTR to go from "OK" to great.
  • Bottom line: You must beat the expected CTR for a given organic search position. Optimize for relevance or die.
  • Let's say you work for a tech company. Your visitors, on average, are bouncing away at 80% for the typical session, but users on a competing website are viewing more pages per session and have a bounce rate of just 50%. RankBrain views them as better than you – and they appear above you in the SERPs. In this case, the task completion rate is engagement. Bottom line: If you have high task completion rates, Google will assume your content is relevant. If you have crappy task completion rates, RankBrain will penalize you.
  • 4. Increase Search Volume & CTR Using Social Ads and Display Remarketing: People who are familiar with your brand are 2x more likely to click on your ads and 2x more likely to convert. We know this because targeting a user who has already visited your website (or app) via RLSA (remarketing lists for search ads) always produces higher CTRs than generically targeting the same keywords to users who are unfamiliar with your brand. So, one ingenious method to increase your organic CTRs and beat RankBrain is to bombard your specific target market with Facebook and Twitter ads. Facebook ads are proven to lift mobile search referral traffic volume to advertiser websites (by 6% on average, up to 12.8%) (here’s the research). With more than a billion daily users, your audience is definitely using the Social Network. Facebook ads are inexpensive – even spending just $50 on social ads can generate tremendous exposure and awareness of your brand. Another relatively inexpensive way to dramatically build up brand recognition is to leverage the power of Display Ad remarketing on the Google Display Network. This will ensure the visitors you drive from social media ads remember who you are and what it is you do. In various tests, we found that implementing a display ad remarketing strategy has a dramatic impact on bounce rates and other engagement metrics. Bottom line: If you want to increase organic CTRs for your brand or business, make sure people are familiar with your offering. People who are more aware of your brand and become familiar with what you do will be predisposed to click on your result in the SERP when it matters most, and will have much higher task completion rates after having clicked through to your site.
  • UPDATE: As many of us suspected, Google has continued to apply RankBrain to increasing volumes of search queries - so many, in fact, that Google now says its AI processes every query Google handles, which has enormous implications for SEO. As little as a year ago, RankBrain was reportedly handling approximately 15% of Google's total volume of search queries. Now, it's processing all of them. It's still too soon to say precisely what effect this will have on how you should approach SEO, but it's safe to assume that RankBrain will continue to focus on rewarding quality, relevant content. It's also worth noting that, according to Google, RankBrain itself is now the third-most important ranking signal in the larger Google algorithm, meaning that "optimizing" for RankBrain will likely dominate conversations in the SEO space for the foreseeable future. To read more about the scope and potential of RankBrain and its impact on SEO, check out this excellent write-up at Search Engine Land.
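A rough Python sketch of the plot-and-inspect exercise described in this list (the chart from the article isn't reproduced here). Everything specific is an assumption for illustration: the CSV export name, its columns (query, clicks, impressions, position), and the 50%-of-median threshold used to flag laggards. This is not WordStream's tooling, just one way to run the exercise:

```python
# Sketch: plot CTR vs. average position for organic queries, assuming a
# query-data export with columns: query, clicks, impressions, position.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("search_console_queries.csv")  # hypothetical export file
df["ctr"] = df["clicks"] / df["impressions"]

# Expected CTR at a position: median CTR of all queries near that position.
df["pos_bucket"] = df["position"].round().astype(int)
expected = df.groupby("pos_bucket")["ctr"].median()

fig, ax = plt.subplots()
ax.scatter(df["position"], df["ctr"], alpha=0.4, label="your queries")
ax.plot(expected.index, expected.values, color="red",
        label="median CTR by position")
ax.set_xlabel("Average position")
ax.set_ylabel("CTR")
ax.legend()
plt.show()

# Flag queries sitting well below the median curve for their position;
# these are the "underperform the expected CTR" cases the article warns about.
lagging = df[df["ctr"] < 0.5 * df["pos_bucket"].map(expected)]
print(lagging[["query", "position", "ctr"]])
```

Queries under the curve are the ones whose titles and descriptions the article suggests reworking first.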
Welcome BERT: Google's latest search algorithm to better understand natural language

  • BERT will impact 1 in 10 of all search queries. This is the biggest change in search since Google released RankBrain.
  • What is BERT? It is Google’s neural network-based technique for natural language processing (NLP) pre-training. BERT stands for Bidirectional Encoder Representations from Transformers.
  • When is BERT used? Google said BERT helps better understand the nuances and context of words in searches and better match those queries with more relevant results. It is also used for featured snippets, as described above.
  • RankBrain is not dead. RankBrain was Google’s first artificial intelligence method for understanding queries in 2015. It looks at both queries and the content of web pages in Google’s index to better understand what the meanings of the words are. BERT does not replace RankBrain; it is an additional method for understanding content and queries. It’s additive to Google’s ranking system. RankBrain can and will still be used for some queries. But when Google thinks a query can be better understood with the help of BERT, Google will use that. In fact, a single query can use multiple methods, including BERT, for understanding the query.
  • Why we care. We care not only because Google said this change “represent[s] the biggest leap forward in the past five years, and one of the biggest leaps forward in the history of Search.”
Google MUM: What to Know About the New Search Engine Tech

  • Google MUM, or Multitask Unified Model, is the latest algorithmic successor for the search engine giant. Google has called MUM “a new AI milestone inside of Google search.”
  • It basically gathers subcategories for the query and delivers a more holistic picture for the benefit of the end user. MUM is particularly attuned to comparisons for an umbrella of related queries.
  • One thing that’s interesting about MUM is that it understands things across text and images. In the future, Google expects to be able to incorporate audio and video in the omnichannel mix, too.
  • pull information from different languages
  • understand thoughtful subjects more holistically
  • Google’s algorithm update combines “entities, sentiments and intent” all for the sake of the user experience.
  • Google’s Senior Webmaster Trends Analyst John Mueller says, “I don’t really see how this would reduce the need for SEO.”
  • BERT and MUM are both built on something called a Transformer Architecture. However, MUM has more multitasking capabilities than BERT. Because of this, Google reports that MUM is 1,000 times stronger than BERT at providing nuanced results.
  • Google’s been mum on when MUM will expand from beta mode. It didn’t take an excessive amount of time for BERT, so the outlook seems promising.
  • Continue optimizing your content with multimedia in mind. Keep the user at the forefront of your strategy, since that’s exactly what Google MUM is doing.
  • a sprawling leap forward in machine learning
How Google Interferes With Its Search Algorithms and Changes Your Results - WSJ

  • Some very big advertisers received direct advice on how to improve their organic search results, a perk not available to businesses with no contacts at Google, according to people familiar with the matter. In some cases, that help included sending in search engineers to explain a problem, they said.
Is Rank Tracking by Zip Codes Still Relevant? | How to Accurately Track Ranking in Loca...

  • Google uses proximity to deliver local results and does not use zip codes. All searches are affected by the proximity factor, but businesses with more dense competition will be affected the most. Organic results are affected by proximity but much less than Google Maps rankings. Checking rankings by zip code center does not always provide a complete or meaningful picture to base optimization decisions on.
  • Google couldn't care less about what zip code you are in when performing a search; it only cares about distances when it comes to local businesses.
  • In order to deal with this distance factor, rank trackers have adapted and are now scanning with many points instead of one. A grid is laid out according to distance and each point reflects a different result. This gives a much better picture of what the rankings look like around a business, as it takes into account the granularity that the algorithm actually produces. (See the grid sketch at the end of this item.)
  • The only time Google will factor zip code into a search is if you specifically enter the zip code in your search: “Dentist near me 11219”, in which case you are telling Google to return results for dentists in that zip code. Otherwise, the algorithm will use your location and give you results based on distance.
  • Organic Google search results are less affected by proximity than Google Maps results.
  • This is invaluable data which helps you determine which keywords need better optimization, and when an area is so far out of reach for ranking well that a PPC campaign would be a good option.
  • Zip code tracking is perfectly fine for some industries. But if you work with an industry like dentists or restaurants, where there is a high density of competition in a small radius, zip code tracking will be very inaccurate. I’d suggest scanning some manually in Incognito mode using this Chrome extension to see if there is much variance within the same zip code. https://chrome.google.com/webstore/detail/gs-location-changer/blpgcfdpnimjdojecbpagkllfnkajglp
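To make the grid idea concrete, here is a minimal Python sketch of how such a lattice of scan points can be generated. The center coordinates, grid size, and spacing are arbitrary assumptions, the meters-per-degree factors are the standard approximations, and the rank check itself (whatever API or scraper a tracker actually uses) is deliberately left out:

```python
# Sketch: build a grid of geo coordinates around a business location, the way
# grid-based rank trackers sample many points instead of one zip centroid.
import math

def geo_grid(lat, lng, size=5, spacing_m=500):
    """Return a size x size list of (lat, lng) points centered on the input,
    spaced roughly spacing_m meters apart."""
    lat_step = spacing_m / 111_320  # approx. meters per degree of latitude
    lng_step = spacing_m / (111_320 * math.cos(math.radians(lat)))
    half = size // 2
    return [
        (lat + i * lat_step, lng + j * lng_step)
        for i in range(-half, half + 1)
        for j in range(-half, half + 1)
    ]

# Each point would be fed to a rank checker; here we just show the lattice.
points = geo_grid(40.7128, -74.0060)  # hypothetical center (New York City)
print(len(points), "scan points; first:", points[0])
```

Scanning a keyword from each point and recording the local-pack position seen there yields exactly the per-point picture the article describes.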
Google's December 2020 Core Update Themes

  • The data and overall consensus indicate that Google’s December 2020 Core Update is one of the more impactful algorithm adjustments to hit the SERP over the past year or so.
  • I prefer to look at core updates almost from a pure content and UX perspective. For me, it’s about the specific pages Google swaps out more than it is a per domain analysis.
  • I am performing a qualitative analysis
  • I am not making any sort of definitive statements
  • What moves me, however, is when I look at 100 keywords and start seeing the same content-oriented theme arise again and again.
  • What I’m trying to say, and as you’ll see in the examples I will get into later, is that the content that was more focused on the specific topic mentioned in the query did better. So while the “ultimate guide” here did get to the topic the query deals with, it was not exclusively about that topic.
  • This might call the entire strategy of creating these ultimate guides into question. Perhaps you can’t cast a wide net in that way anymore? Perhaps the “ultimate guide” is only really suitable for people who actually want to get a more broad understanding of a topic? (Crazy to think, I know!)
  • The page from Rocket Mortgage, on the other hand, is only about how much you need for a down payment:
  • So too is the page from Quicken Loans:
  • The Moral of the Story: If I want to understand how much money on average I need to put down when buying a house, or what the various options generally are and what they mean long term, the CFPB page, .gov or not, doesn’t really help me. Its content is not specifically honed in on that particular topic. Again, we have another page that takes a sweeping look at a topic that lost rankings when the query reflected a more specific sort of intent!
  • What’s interesting here is that unlike the previous examples, where too much content resulted in the page’s topical relevance being diluted, the lack of such content here is what I think caused the ranking loss. Look, it’s not bad content. However, it’s pretty much the “general” kind of content you see here, there, and everywhere for all sorts of topics. Just compare it to what the page from the Credit Card Insider offers:
  • This just oozes depth. The third topic on the page alone (6 Ways to Pay Off…) rivals the depth shown on the CreditCards.com page! What differentiates this page from the “guides” shown in the other examples is that this is a guide that drills deep into one topic as opposed to trying to span multiple subtopics. Also, have a look at the formatting; it reminds me of what we saw on the Motley Fool’s page:
  • It’s deep content that is easy to digest. It’s not hard to see why Google swapped these two pages.
  • The Moral of the Story: Exact content relevancy is not only about what topic you talk about. You can be topically aligned but it has to offer real information to the user. It’s even better when that information is digestible. In other words, if you want to rank for a keyword with topic specificity it might be better to create an “ultimate guide” that drills deep into the topic itself versus trying to cover every subtopic under the sun in order to try to rank for more topics with one piece of content.
  • The by-line really sums it up. It tells you this article is about the fact that you most likely won't get addicted to painkillers, but it’s definitely possible so here’s the scoop. To me, it’s far more in line with the average user’s intent of learning about the risks of addiction versus understanding the fine difference between addiction and dependence. It’s the same story with the WebMD page:
  • The Moral of the Story: Again, the issue here is not how authoritative or how substantial the content is. There is no doubt that content from the NIH is both substantial and authoritative. The issue here again seems to relate to Google being better able to show content that is specifically relevant to the nature of the query.
  • First things first, the page doesn’t speak to the query directly. While in the process of learning the difference between sadness and depression one could understand the signs of depression that route is certainly indirect. You could argue that the query how to tell if you have depression could be taken as ‘how do I know if I am just sad or depressed?’ but that really doesn’t seem to be the essential intent here. That topical line (i.e., sadness vs. depression) would most likely produce its own unique query (i.e., am I sad or depressed). From the content shown on the WebMD page, it appears that Google thinks of the intent as understanding the symptoms of depression:
  • The WebMD page, in contradistinction to the MHA page, speaks to the “plain meaning” of the query’s intent… how can you tell if you’re suffering from depression? Aside from that, the WebMD page offers a bit more in terms of substance. While it doesn’t go into great detail per se, the WebMD page does offer a pretty comprehensive list of items. Compare that to the MHA page which, if you read it, is a bit thin and lacks tremendously in offering much of any detail (even a basic list as seen on the WebMD page). The Moral of the Story: Relevancy is a two-pronged equation (at minimum). It requires the content to be topically focused on the one hand as well as substantial on the other.
  • I’ve saved the best for last. This is my favorite example that I came across when diving into the December 2020 Core Update. I mean, for crying out loud, we’re talking about the CDC losing rankings in favor of a .org domain I never heard of. How could this be? Let’s understand the intent of the query. If I were searching for this it would be because I found something on my body that I thought might be skin cancer. If I could be so bold, I would imagine that this is why most of us would search for this term. I wouldn’t, and again I imagine most people in most instances wouldn’t search for this in order to understand if regular screening is officially recommended or not. Yet, that is what the CDC page is about:
  • I hate to make assumptions, but I would also think that someone running this query is most likely not interested in the common tests and methods doctors use to determine if skin cancer is present. Yet, this is what the page from Cancer.net focuses on:
  • Again, I would search for this term if I saw something weird on my body that made me think “Holy crap, do I have skin cancer?” The page from the AOCD is entirely made for people on the verge of freaking out at the possibility of having skin cancer:
  • To me, when you see this page relative to the pages from Cancer.net and the CDC, it is painfully obvious why this page got the ranking boost. The Moral of the Story: Again, I think what has transpired here is painfully obvious. Google has looked past the immediate authority of some of the pages here and has more heavily considered how relevant the content is to the query. As with the cases I have shown earlier, Google is rewarding content that speaks in a highly focused way to the intent and nature of the query.

    What Was the December 2020 Core Update About? Are you expecting a one-liner that definitively characterizes the December 2020 update? You’re not going to get one from me. This update, like any other, certainly included a whole plethora of different “algorithmic considerations” and themes. That said, from where I sit, while other core updates did things to help put the most authoritative content at the top of the SERP, this update seemed to me to be more about pure relevancy. Updates of the past have done things to weed out sites using a marketing tone within YMYL informational content or have rewarded sites that put the right content ahead of their affiliate pursuits. All of that, while part of relevancy, speaks more to a need for something authoritative at the top of the SERP. Seeing so many .gov pages drop in favor of pages from sites like Healthline or WebMD seems to point to the update rewarding relevancy to the nth degree. Perhaps Google felt as if it had “authority” at the top of the SERP in order, paving the way for a focus on relevance? Who knows. All I can say is that I personally have not seen such a strong focus on pure relevance on page one of the SERP.

    Content Creation Takeaways: Practically speaking, I think the era of broadly reaching pages is quickly coming to an end. I think that has been the case for a while. However, seeing Google pull pages off page one of the SERP because they deal with multiple subtopics is a new level, at least for me. It shows that you have to create content that talks about one topic and one topic only (unless the keyword reflects a specific desire for a broader survey of a topic). I wonder if the idea of having one ultimate guide so as to win numerous keywords should be replaced with multiple posts where each post deals with one specific topic or subtopic. (If you do that, please don’t create thin content; that is not what I am advocating for.) It’s a rather logical concept. As Google gets better at understanding content, it is going to prefer highly focused content around a specific topic over content of a more broad nature, unless the query specifically shows intent for a general survey of a topic.
1,000+ Winners and Losers of the December 2020 Google Core Algorithm Update | Path Inte...

  • The most striking aspect of this update is the dramatic reversal in visibility among several of the sites that were the biggest winners of 2020 in the days prior to the update, such as Amazon, Pinterest, CDC, Overstock, CNN, New York Times, and other sites that greatly benefitted due to the coronavirus pandemic, mandatory quarantines, and other breaking news in 2020. Maybe Google decided it was time to give some of the smaller players a chance to compete against the big guys – an unexpected holiday gift, perhaps?
Mueller Says Don't Make Assumptions Based on Site: Search

  • Like all search operators, the advanced site: operator is not connected with the Google algorithm. As a result, it doesn’t offer any insight related to the search algorithm.
The myth of duplicate structured data being wrong - Ilana Davis

  • It doesn’t help that Google is very vague with structured data and how it works.
  • if a set of structured data has an error, Google will flat out ignore that entire set of data for this process
  • if you have three sets of duplicated Product data but two have errors, only the error-free set will be used, even if those other two are more complete (see the sketch at the end of this item)
  • They, being a software company all about algorithms, have a process and algorithm to evaluate all of the structured data on a page and pick the best one to use.
  • From what I’ve seen in my research, all of the types of structured data follow similar rules.
  • it makes sense that people don’t understand how it works, because structured data is complex and requires a deep investment of time and energy to do right.
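As a way to picture the selection behavior described above, here is a toy Python sketch. The required fields, the error check, and the "completeness = number of fields" scoring are all invented for illustration; Google's actual validation and selection logic is not public.

```python
# Toy model: among duplicate sets of structured data, sets with errors are
# ignored outright, and the most complete error-free set is used, even if an
# errored set carried more fields.
REQUIRED = {"@type", "name", "offers"}  # assumed minimum for a Product set

def has_errors(sd):
    """Treat a set as errored if any assumed-required field is missing."""
    return not REQUIRED <= sd.keys()

def pick_best(candidates):
    """Return the most complete error-free set, or None if all have errors."""
    valid = [sd for sd in candidates if not has_errors(sd)]
    if not valid:
        return None  # every duplicate set had errors: nothing usable
    return max(valid, key=len)  # completeness scored as field count

page_sets = [
    {"@type": "Product", "name": "Blender", "offers": {"price": "49.99"}},
    {"@type": "Product", "name": "Blender", "sku": "B-1", "brand": "Acme",
     "image": "x.jpg"},  # richest set, but missing offers: ignored
]
print(pick_best(page_sets))  # the first set wins despite being less complete
```

The point the sketch mirrors is that richness cannot rescue an errored set: validity is checked first, and only then is the best remaining set chosen.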
Google Says Don't Blindly Replace Your HTML Title Tags With Google's Titles

  • maybe there are cases where Google's algorithms have selected a better title and where it makes sense to kind of go in that direction. But there are certainly also situations where maybe Google's algorithms select a worse title and where you want to keep the one that you had there, or maybe you even want to improve the one that you had previously. So I wouldn't just blindly use what we show in search.