
Home/ DISC Inc/ Group items tagged algorithms


jack_fox

Google's Penguin Algorithm May Not Just Ignore Links, It May Target Whole Site - 0 views

  • Is the Penguin penalty still relevant at all, or are less relevant/spammy/toxic backlinks more or less ignored by the ranking algorithm these days? John replied that in most cases Google will just ignore the links, but in some cases, where there is a clear pattern of spammy and manipulative links by the site, Penguin may decide to simply distrust the whole site. John said "I'd say it's a mix of both" when he answered that question, meaning Google Penguin can both ignore links and demote sites if necessary. This happens "if our systems recognize that they can't isolate and ignore these links across a website." John added that if Google sees a "very strong pattern there," its "algorithms" can lose "trust with this website" and you may see a "drop in the visibility there."
  •  
    "Penguin 4.0"
Rob Laporte

The Complete Guide to Google Penalties (Both Manual and Algorithmic) - 0 views

  •  
    "Panguin Tool"
Rob Laporte

Improved Snippets, Rank Boost For "Official" Pages Among 10 New Google Algorithm Changes - 0 views

  •  
    Google Can Now Execute AJAX & JavaScript For Indexing
Rob Laporte

The Quality Update: Google Confirms Changing How Quality Is Assessed, Resulting In Rank... - 0 views

  • there were changes to its core ranking algorithm in terms of how it processes quality signals.
  • we’re dubbing it the Quality Update
  • The update didn’t go after any particular class of sites or any particular sites. It was an update to the overall ranking algorithm itself.
Rob Laporte

The Google Killer No One Dares Discuss - Search Engine Watch (SEW) - 0 views

  • The Empire Strikes Back: Taking the Google example, Google is already working desperately to capture all of the shared human-experience data and to work it into its index in a usable form. That is what its attempts at presenting real-time search, however unpolished, have been all about. Sharing data between people by creating Buzz for Gmail users was headed in exactly the same direction, and Google Analytics together with personalization have both been collating human-behavior data for quite some time. Launching Android as an open-source vehicle was about taking share in that vitally important mobile-phone access zone. So, in fact, the "crawler + data organization (index) + algorithm + search ranking" pattern of search we understand today has to disappear and be replaced with "human behavior logging + data organization (index) + algorithm + ranking" to produce the right result. Google could actually be its own Google Killer, as it, for one, is well placed to do this.
Rob Laporte

BruceClay - Search Engine Optimization: You & A with Matt Cutts - 0 views

  • In 2003 they switched to an incremental index system, pushing it live every few days (Update Fritz). Now they have Caffeine. Instead of crawling docs and then indexing them later that night, they crawl and index at the same time, making the whole index much closer to real time. It used to be like waiting for a bus; now it's like a taxi: no waiting, it's just there and brought to you right away. It unlocks the ability for Google to index a whole lot more documents and makes the whole index 50% fresher.
  • Caffeine isn’t a ranking change.
  • Static HTML links are always nice and safe but they’re getting better at parsing JavaScript.
  • ...1 more annotation...
  • Is bounce rate part of the algorithm? Matt says not as part of the general ranking Web algorithm. (To the best of his knowledge.)
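The batch-versus-incremental distinction Cutts describes can be sketched as a toy model. This is purely illustrative, not Google's actual pipeline; all names and structures here are hypothetical:

```python
# Toy contrast between batch indexing (crawl now, index later) and
# incremental, Caffeine-style indexing (index each doc as it is crawled).

index = {}  # inverted index: term -> set of doc ids

def index_doc(doc_id, text):
    """Add one document's terms to the inverted index immediately."""
    for term in text.lower().split():
        index.setdefault(term, set()).add(doc_id)

def batch_pipeline(docs):
    """Batch model: crawl everything first, then index in one later pass.
    Nothing is searchable until the whole pass completes."""
    crawled = list(docs.items())      # phase 1: crawl all
    for doc_id, text in crawled:      # phase 2: index later
        index_doc(doc_id, text)

def incremental_pipeline(docs):
    """Incremental model: index as you crawl, so each page is
    searchable as soon as it is fetched (no separate indexing phase)."""
    for doc_id, text in docs.items():
        index_doc(doc_id, text)

incremental_pipeline({"a": "fresh caffeine index", "b": "fresh bus taxi"})
print(sorted(index["fresh"]))  # ['a', 'b']
```

The freshness gain comes entirely from removing the second phase: in the batch model a crawled page waits for the nightly pass, while in the incremental model it enters the index immediately.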
Rob Laporte

Evaluating Google's Response To Mapspam Reports - 0 views

  •  
    Conclusions:
    * Local business owners seem to be confused about what actually constitutes spam, but can you blame them? The world of the local search engines is often confusing even to those of us who study them on a daily basis!
    * Google's creation of a public forum for reporting anomalies in Maps has helped a lot of businesses recover traffic lost via Maps, and has probably helped Google identify weaknesses in its own algorithm as well. The responsiveness of the Maps team has been relatively admirable, even without verbal confirmation in the thread that changes have been made. (Of course, business owners whose situations haven't been addressed are irate over the lack of response.)
    * The on-again/off-again bulk-upload feature of Google Maps seems to be a particular favorite tool of mapspammers.
    * Local business owners: claim your listing at Google to avoid being victimized by hijackers and to decrease the likelihood of conflation with someone else's listing. If you don't have a website, direct your Local Business Listing at Google to one of your listings featuring the same information on another portal, such as Yahoo, Citysearch, or Yelp.
    * The large percentage of reported record conflations also underlines the importance of giving Google a strong signal of your business information (i.e., a spiderable HTML address and phone number) on your own website. The more closely Google can associate that particular information with your business, the lower the chance of someone else's business being identified with the same information.
    In all honesty, I was surprised that the total number of bona fide instances of spam reported in two months was so low, and I'm not quite sure what to make of it. It's possible that the quality of local results has improved dramatically since the advent of the 10-pack in January. However, more likely is that the typical local business owner doesn't know where to report possible spam. It'll be interesting to see whether
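The point of a "spiderable HTML address and phone number" is that the information must sit in plain text a crawler can parse, not in an image or a script. A minimal sketch of how a simple spider might pull name/address/phone data out of static HTML, using only the standard library (the phone pattern and the sample markup are hypothetical, not Google's actual extractor):

```python
import re
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text nodes, the way a simple spider would."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        if data.strip():
            self.chunks.append(data.strip())

# US-style phone pattern; real extractors are far more robust.
PHONE = re.compile(r"\(\d{3}\)\s*\d{3}-\d{4}")

html = """
<div id="contact">
  <p>Acme Widgets</p>
  <p>123 Main St, Springfield</p>
  <p>(555) 867-5309</p>
</div>
"""

parser = TextExtractor()
parser.feed(html)
text = " ".join(parser.chunks)
phone = PHONE.search(text)
print(phone.group())  # (555) 867-5309
```

If the same number were rendered inside an image or injected by JavaScript, this kind of extraction would find nothing, which is exactly why the excerpt stresses plain HTML contact details.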