
DISC Inc / Group items tagged: algorithms


Rob Laporte

Google SEO Test - Google Prefers Valid HTML & CSS | Hobo - 0 views

  •  
    Well - the result is clear. From these 4 pages Google managed to pick the page with valid CSS and valid HTML as the preferred page to include in its index! OK, it might be a bit early to see if the four pages in the test eventually appear in Google, but on first glance it appears Google spidered the pages, examined them, applied duplicate content filters as expected, and selected one to include in search engine results. It just happens that Google seems to prefer the page with valid code as laid down by the W3C (World Wide Web Consortium). The W3C was started in 1994 to lead the Web to its full potential by developing common protocols that promote its evolution and ensure its interoperability.
    What is the W3C?
    * W3C stands for the World Wide Web Consortium
    * W3C was created in October 1994
    * W3C was created by Tim Berners-Lee, the inventor of the Web
    * W3C is organized as a Member Organization
    * W3C is working to standardize the Web
    * W3C creates and maintains WWW standards, called W3C Recommendations
    How the W3C Started
    The World Wide Web (WWW) began as a project at the European Organization for Nuclear Research (CERN), where Tim Berners-Lee developed a vision of the World Wide Web. Tim Berners-Lee - the inventor of the World Wide Web - is now the Director of the World Wide Web Consortium (W3C). W3C was created in 1994 as a collaboration between the Massachusetts Institute of Technology (MIT) and CERN, with support from the U.S. Defense Advanced Research Projects Agency (DARPA) and the European Commission.
    W3C Standardizing the Web
    W3C is working to make the Web accessible to all users (despite differences in culture, education, ability, resources, and physical limitations). W3C also coordinates its work with many other standards organizations such as the Internet Engineering Task Force and the Wireless Application Protocols (WAP) Forum…
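To reproduce a check like the one in this test, markup can be validated programmatically. A minimal sketch in Python, assuming the W3C's Nu HTML Checker endpoint (validator.w3.org/nu/) with its JSON output mode; the target URL is a placeholder:

```python
# Hedged sketch: post a page's HTML to the W3C Nu HTML Checker and list the
# validation messages. The example URL is a placeholder.
import requests

def validate_html(url: str) -> list:
    """Fetch a page and return the checker's messages (errors/warnings)."""
    html = requests.get(url, timeout=30).text
    response = requests.post(
        "https://validator.w3.org/nu/?out=json",
        data=html.encode("utf-8"),
        headers={"Content-Type": "text/html; charset=utf-8"},
        timeout=30,
    )
    return response.json().get("messages", [])

for message in validate_html("https://example.com/"):
    print(message.get("type"), "-", message.get("message"))
```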
Rob Laporte

Linkfluence: How to Buy Links With Maximum Juice and Minimum Risk - 0 views

  • Up first is Rand Fishkin. Rand says he asked to be kicked off this panel because he doesn’t endorse buying links and he doesn’t do it anymore [Hear that, Google. SEOmoz doesn't buy links. SO KEEP MOVING.]. He offered to go last…but everyone else bullied the moderator into making him go first. Poor Rand. Always the innocent bunny in a pack of wolves. Unfortunately, the projector is broken so we have no screen. Something about a plug that doesn’t work. So…we’re doing question and answer first while they send someone to try and fix it. I’ll throw the questions at the bottom. Back to Mr. Fishkin. He tries to be very clear about his shift in position on paid links. He doesn’t think not buying links is right for everyone; it’s just what’s right for his clients and for SEOmoz. Rand says he falls into the “Operator of Interest” category, meaning he’s profiled for being an SEO. The problems with paid links: algorithmic detection is getting better than ever before; penalties are hard to diagnose; manual link penalties are also a threat; Google’s Webspam team invests (a lot of) time and resources in shutting down effective paid links [Agreed. And almost an unhealthy amount.]; and competitors have significant incentive to report link spam. (Don’t be a rat.)
Rob Laporte

How to report paid links - 0 views

  • Q: I’m worried that someone will buy links to my site and then report that. A: We’ve always tried very hard to prevent site A from hurting site B. That’s why these reports aren’t being fed directly into algorithms, and are being used as the starting point rather than being used directly. You might also want to review the policy mentioned in my 2005 post (individual links can be discounted and sellers can lose their ability to pass on PageRank/anchortext/etc., which doesn’t allow site A to hurt site B).
Rob Laporte

There is no penalty for buying links! - 0 views

  • There is no penalty for buying links! There, I said it. That’s what I believe is true; there is no such thing as a ‘you have been buying links so you should suffer’ penalty. At least, not if you do it correctly. I’ll make some statements about buying links that probably not everybody will agree on, but this is what I consider to be the truth. If you don’t publish your link buying tactics yourself and if your website’s link profile doesn’t contain >90% paid links, then:
    * Buying links cannot get you penalized;
    * Buying links from obvious link networks only results in backlinks with little to no search engine value;
    * Buying links ninja style will continue to get you killer rankings;
    * Selling links can only disable your ability to pass link juice or PR (but you might want to read this);
    * Google will never be able to detect all paid links.
    Just about every time the topic finally seems to be left alone, someone out there heats up the good old paid link debate again. This time, Rand Fishkin (unintentionally) caused the discussion to emerge once again. By showing the buying and selling link tactics of several websites on SEOmoz’s blog (this info has been removed now), he made it very easy for the Paid Link Police to add some more websites to the list of websites to check out while building the Paid Link Neglecting Algorithm. Several people got all wound up because of this, including (at first) me, because these sites would more than likely receive a penalty (just checked, none of them has been penalized yet). However, it is almost impossible for Google to penalize you for buying links for your website. At least, not if you didn’t scream “Hey, I’m artificially inflating my link popularity!” on your OWN website.
    David Airey penalized? Jim Boykin analyzed his penalty earlier, and the same thing happened here. In some cases, it may seem that certain websites have been penalized for buying links. What in fact happened is that the link juice tap of some obvious paid links was closed, which resulted in less link juice, followed by lower rankings. In most other cases, you can buy all the links you want and not get penalized. You could buy the same links for your competition, right? And if Google states that spammy backlinks can’t hurt you, paid backlinks probably can’t hurt you either. This is basically the same thing. The worst thing that can happen is that you buy hundreds of text links that only provide traffic. And, if you managed to buy the right ones, there’s nothing wrong with that.
Dale Webb

Google is Finally Killing PageRank - 0 views

  •  
    PageRank metric has been removed from Google Webmaster Tools
Rob Laporte

Introducing the NEW Ahrefs' Domain Rating (and how to use it) - 0 views

  • Does Google use anything like Domain Rating in their algorithm? If we refer to official statements, Google’s John Mueller stated that Google does not have a “website authority score”:
    “We don’t have anything like a website authority score.” - John Mueller
    They have consistently educated the SEO community that they calculate scores for actual pages, not entire domains (hence PageRank). But prior to that statement, John Mueller said something else:
    “There are some things where we do look at a website overall though.” - John Mueller
    So is this a “yes” or is this a “no”? Well, here at Ahrefs we have a firm belief that “website authority” doesn’t exist as an isolated ranking factor.
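Neither Google nor Ahrefs publishes a formula for domain-level authority, but the concept the debate circles around can be illustrated with a toy PageRank-style calculation run over domains instead of pages. A sketch under made-up data; this is not Ahrefs' Domain Rating algorithm:

```python
# Toy illustration only: a PageRank-style score computed over a hypothetical
# domain-level link graph. Not Ahrefs' actual Domain Rating formula.

def domain_rank(links: dict, damping: float = 0.85, iterations: int = 50) -> dict:
    """links maps each domain to the set of domains it links out to."""
    domains = set(links) | {d for targets in links.values() for d in targets}
    rank = {d: 1.0 / len(domains) for d in domains}
    for _ in range(iterations):
        new_rank = {d: (1 - damping) / len(domains) for d in domains}
        for source, targets in links.items():
            if targets:
                share = damping * rank[source] / len(targets)
                for target in targets:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical graph: a strong domain linking out to two smaller ones.
graph = {
    "wikipedia.org": {"example.com", "blog.example.net"},
    "blog.example.net": {"example.com"},
    "example.com": set(),
}
for domain, score in sorted(domain_rank(graph).items(), key=lambda x: -x[1]):
    print(f"{domain}: {score:.3f}")
```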
Rob Laporte

BruceClay - SEO Newsletter - FEATURE: Takeaways from SMX Advanced Seattle 2010 - 0 views

  • You & A with Matt Cutts of Google: Google's new Web indexing system, Caffeine, is fully live. The new indexing infrastructure translates to an index that is 50 percent fresher, has more storage capacity, and can recognize more connections between pieces of information. The Mayday update was an algorithm update implemented at the beginning of May that is intended to filter out low-quality search results. A new report in the Crawl errors section of Google Webmaster Tools indicates "soft 404" errors in order to help webmasters recognize and resolve them (a detection sketch follows this list). Keynote Q&A with Yusuf Mehdi of Microsoft: Bing is opening up new ways to interact with maps. The newly released Bing Map App SDK allows developers to create their own applications, which can be used to overlay information on maps. Bing Social integrates the Facebook firehose and Twitter results into a social search vertical. Bing plans to have the final stages of the Yahoo! organic and paid search integration completed by the end of 2010. Decisions about how to maintain or integrate Yahoo! Site Explorer have not been finalized. Bing's Webmaster Tools are about to undergo a major update. Refer to the Bing Webmaster Tools session for more on this development.
  • Bing's program manager said that the functionality provided by Yahoo! Site Explorer will still be available. It's not their intention to alienate SEOs because they consider SEOs users, too.
  • The Bing Webmaster team has built a new Webmaster Tools platform from the ground up. It is scheduled to go live in Summer 2010. The platform focuses on three key areas: crawl, index and traffic. Data in each area will go back through a six-month period. Tree control is a new feature that provides a visual way to traverse the crawl and index details of a site. The rich visualizations are powered by Silverlight. URL submission and URL blocking will be available in the new Webmaster Tools.
  • ...1 more annotation...
  • The Ultimate Social Media Tools Session: Tools to get your message out: HelpaReporter, PitchEngine, Social Mention, ScoutLabs. Customer and user insight tools: Rapleaf, Flowtown. Tools to find influencers: Klout. Forum tools: Bing Boards, Omgili, Board Tracker, Board Reader. Digg tools: Digg Alerter, FriendStatistics, di66.net. Make use of the social tools offered by social networks, e.g. utilize Facebook's many options to update your page and communicate with your fans via SMS. Encourage people to follow you using Twitter's short code.
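On the soft-404 report mentioned in the Matt Cutts notes above: a soft 404 is a page that returns HTTP 200 while its content says the page is gone. A minimal detection sketch in Python; the phrase list and the example URL are assumptions for illustration:

```python
# Hedged sketch: a heuristic soft-404 check. A "soft 404" returns HTTP 200
# while its content says the page doesn't exist, which is what the Webmaster
# Tools report described above surfaces.
import requests

NOT_FOUND_PHRASES = ("page not found", "404", "no longer available",
                     "doesn't exist", "nothing was found")

def is_soft_404(url: str) -> bool:
    response = requests.get(url, timeout=30)
    if response.status_code != 200:
        return False  # a real 404/410 is a "hard" error, not a soft one
    body = response.text.lower()
    return any(phrase in body for phrase in NOT_FOUND_PHRASES)

print(is_soft_404("https://example.com/some-deleted-page"))  # hypothetical URL
```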
Rob Laporte

Local Search Ranking Factors | Google & Yahoo Local SEO Best Practices - 0 views

  • MOST RECOMMENDED FACTORS TO FOCUS ON (79 → 1; 34.44 → 37.61)
    ▲ Physical Address in City of Search (PLACE PAGE)
    ↑1 Manually Owner-verified Place Page (PLACE PAGE)
    ↓1 Proper Category Associations (PLACE PAGE)
    -- Volume of Traditional Structured Citations (IYPs, Data Aggregators) (OFF-PLACE/OFF-SITE)
    -- Crawlable Address Matching Place Page Address (WEBSITE)
    ↑25 PageRank / Authority of Website Homepage / Highest Ranked Page (WEBSITE)
    ↑34 Quality of Inbound Links to Website (OFF-PLACE/OFF-SITE)
    ↑9 Crawlable Phone Number Matching Place Page Phone Number (WEBSITE)
    n/a Local Area Code on Place Page (PLACE PAGE)
    ↑18 City, State in Places Landing Page Title
  •  
    20 Local Search Marketing experts weigh in on the importance of 47 criteria that influence rankings in the Google and Yahoo Local search algorithms. This could be used as a checklist for doing local SEO and IYP work (a structured-data sketch follows these notes).
  •  
    Survey of most recommended factors to focus on. 
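Several of the top factors above reduce to keeping a crawlable address and phone number on the website that match the Place Page. A sketch of one way to do that with schema.org LocalBusiness markup; all business details below are hypothetical placeholders:

```python
# Hedged sketch: generate schema.org LocalBusiness JSON-LD so the crawlable
# address and phone number match the listing. Every detail below is a
# hypothetical placeholder.
import json

business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Bakery",
    "telephone": "+1-617-555-0123",  # local area code, matching the Place Page
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Boston",  # city of search
        "addressRegion": "MA",
        "postalCode": "02101",
    },
}

# Embed this in the landing page so the address is crawlable as text:
print(f'<script type="application/ld+json">{json.dumps(business, indent=2)}</script>')
```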
jack_fox

28 Google rich snippets you should know in 2019 [guide + infographic] - Mangools Blog - 0 views

  • unless you are an authoritative website such as Wikipedia, your information probably won’t appear in the answer box.
  • having an image from your website in an image pack is not very beneficial.
  • Besides the common video thumbnail and video knowledge panel, videos may also appear in a carousel, on both mobile and desktop devices.
  • ...15 more annotations...
  • It is always a good idea to have a video on your website. It increases user engagement and grabs attention. If you appear in a SERP with your own video thumbnail, it increases CTR, and the user will likely stay longer on your site.
  • If you decide to host (or embed) a video on your own website, you have to include proper structured data markup.
  • In general, it’s easier to appear as a video thumbnail in the SERP with a YouTube video.
  • From the technical point of view, it is important to have structured data markup for your article, and Google recommends having an AMP version of the website.
  • It is based on an internal Google algorithm. Your website has to be authoritative and contain high-quality content. It doesn’t matter if you are a big news portal or have a personal blog. If there is long, high-quality content, Google may include your website.
  • If you want to appear as an in-depth article, you should write long, high-quality, unique content marked up with structured data for articles (don’t forget to include your company logo within the schema markup).
  • Higher CTRs. It’s kinda catchy, as numbers will always attract people’s attention. An image can make the feature even more prominent.
  • Implementation: our good old friend, structured data.
  • In the SERP, breadcrumbs replace the classic URL of a result with a simplified, more readable version of it. Categories and leaf pages are separated with chevrons. On desktop you can achieve this with the right structured data; in the mobile SERP it is automatic for all results.
  • Breadcrumbs (as opposed to a common URL) are easier for people to read, which leads to better UX from the very first interaction with your website in the SERP and can also lead to a higher CTR.
  • It’s really easy to implement on every blog or ecommerce site: just add another piece of structured data to your website (a minimal sketch follows these notes). If you have a WordPress site, you can do that with SEO plugins like Yoast SEO.
  • It mainly appears for the root domain, but it can be shown for a leaf page too (e.g. if your blog is a leaf page, its categories may appear as sitelinks).
  • Sitelinks contain links to leaf pages of the current website, with a title and description. A result may contain 2-10 sitelinks. Appearance on mobile is a bit different from desktop. You may also spot small sitelinks as a vertical enhancement of an organic result.
  • High CTRs.
  • You can’t directly control the occurrence of sitelinks; only Google decides whether to display them or not. However, the best practice is to have a clear website hierarchy in the top menu with descriptive anchor text. The sitelinks are links from the menu.
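The breadcrumb notes above hinge on structured data. A minimal sketch of the schema.org BreadcrumbList markup that powers the breadcrumb display; the page names and URLs are hypothetical:

```python
# Hedged sketch: schema.org BreadcrumbList JSON-LD, the structured data that
# lets the SERP show "Home > Blog > SEO" instead of a raw URL. Names and URLs
# below are hypothetical placeholders.
import json

def breadcrumb_jsonld(trail: list[tuple[str, str]]) -> str:
    """trail is an ordered list of (name, url) pairs from root to leaf page."""
    items = [
        {"@type": "ListItem", "position": i, "name": name, "item": url}
        for i, (name, url) in enumerate(trail, start=1)
    ]
    data = {"@context": "https://schema.org", "@type": "BreadcrumbList",
            "itemListElement": items}
    return f'<script type="application/ld+json">{json.dumps(data, indent=2)}</script>'

print(breadcrumb_jsonld([
    ("Home", "https://example.com/"),
    ("Blog", "https://example.com/blog/"),
    ("SEO", "https://example.com/blog/seo/"),
]))
```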
Rob Laporte

Interviewing Google's John Mueller at SearchLove: domain authority metrics, sub-domains... - 0 views

  • Confirmed: Google has domain-level authority metrics
  • Confirmed: Google does sometimes treat sub-domains differently
  • At the back end of last year, John surprised us with a statement that noindexed pages will, in the long run, eventually be treated as nofollow as well.
  • ...2 more annotations...
  • Algorithm changes don’t map easily to actions you can take
  • It's like all of these small steps are taking place, and some of them use machine learning.
Rob Laporte

Google's neural matching versus RankBrain: How Google uses each in search - Search Engi... - 0 views

  • Google said in September 2018 that neural matching impacts about 30 percent of all queries. We asked Google if that has increased, but have not received an update.
    What is RankBrain? Isn’t it similar? Google told us in 2016 that RankBrain (see our RankBrain FAQ) is also an AI, machine learning-based system that helps Google understand queries. Google said a good way to think about RankBrain is as an AI-based system it began using in 2016 primarily to understand how words are related to concepts.
    So what’s the difference between neural matching and RankBrain? Google put it this way: RankBrain helps Google better relate pages to concepts; neural matching helps Google better relate words to searches.
  • Why it matters. The truth is, there isn’t much a search marketer can do to better optimize for RankBrain, as we said in 2016. The same seems to apply to neural matching; there doesn’t seem to be anything special you can do to do better here. This is more about Google understanding queries and content on a page better than it currently does. That said, it seems to indicate that search marketers need to worry a bit less about making sure specific keywords are on their pages, because Google is getting smarter at figuring out the words you use naturally on your pages and matching them to queries. We asked Google if it has additional recommendations around neural matching and RankBrain and were told its advice has not changed: simply “create useful, high quality content.”
  • Google’s neural matching versus RankBrain: How Google uses each in search. Neural matching helps Google better relate words to searches, while RankBrain helps Google better relate pages to concepts.
  • ...1 more annotation...
  • What is neural matching? Google explained: “Neural matching is an AI-based system Google began using in 2018 primarily to understand how words are related to concepts.” “It’s like a super synonym system. Synonyms are words that are closely related to other words,” Google added.
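Google doesn't disclose how neural matching works, but the "super synonym system" description maps onto a familiar idea: represent words as vectors and treat nearby vectors as related. A toy sketch with made-up vectors, purely illustrative and not Google's system:

```python
# Toy illustration of the "super synonym" idea: words as vectors, with cosine
# similarity as relatedness. The vectors below are made up; real systems learn
# them from data.
import math

embeddings = {
    "laptop":   [0.9, 0.1, 0.0],
    "notebook": [0.8, 0.2, 0.1],  # close to "laptop" despite different spelling
    "banana":   [0.0, 0.1, 0.9],
}

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

print(cosine(embeddings["laptop"], embeddings["notebook"]))  # high: related words
print(cosine(embeddings["laptop"], embeddings["banana"]))    # low: unrelated
```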
jack_fox

Black market in Google reviews means you can't believe everything you read | CBC News - 0 views

  • a growing black market in which some companies pay for fake positive reviews, while others are seemingly being extorted by web firms who post negative comments then propose their "review-fixing" services to get them taken down.
  • When CBC News asked Google about Riverbend's complaints, including Pereira's own fake review, it was finally removed — along with 32 other one-star reviews. And as a result, the company's star rating went up from 3.6 to 4.1 overnight. 
  • There is no evidence that Google is planning to turn away from algorithm-based content moderation, or make the kind of massive human investments that Toscano and others are calling for.
Rob Laporte

Google Page Experience Update Begins Rolling Out - 0 views

  • Google’s page experience algorithm update is starting to roll out now and will be completed by the end of August 2021.
  • The page experience update considers several signals that go into creating an optimal browsing experience for users. Google assesses each of the signals and gives a website an overall ‘page experience’ score. Site owners can view their score in the new page experience report in Search Console. These are the signals and what is required to achieve a “good” page experience score:
    Core Web Vitals: See our guide to Google’s Core Web Vitals.
    Mobile usability: A page must have no mobile usability errors.
    Security issues: Any security issues for a site disqualify all pages on the site from a Good status.
    HTTPS usage: A page must be served over HTTPS to be eligible for Good page experience status.
    Ad experience: A site must not use advertising techniques that are distracting, interrupting, or otherwise not conducive to a good user experience.
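Core Web Vitals field data for a page can be pulled from Google's PageSpeed Insights v5 API. A minimal sketch; the example URL is a placeholder, and the metric keys reflect the v5 response format as I understand it:

```python
# Hedged sketch: pull Core Web Vitals field data from Google's PageSpeed
# Insights v5 API (usable without a key for light testing). The example URL
# is a placeholder.
import requests

def core_web_vitals(url: str) -> dict:
    api = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
    data = requests.get(api, params={"url": url, "strategy": "mobile"}, timeout=60).json()
    metrics = data.get("loadingExperience", {}).get("metrics", {})
    return {name: (m.get("percentile"), m.get("category")) for name, m in metrics.items()}

for name, (percentile, category) in core_web_vitals("https://example.com/").items():
    print(f"{name}: {percentile} ({category})")
```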
jack_fox

Google categorizes what organic search traffic drops look like - 0 views

  • With technical issues that are on a page-by-page basis, or an algorithm change like a core update, you would see a slower decline in your traffic, and it would then level off over time.
  • Google has been known to have reporting glitches in Search Console, where you see things bounce back to where they were.
  • Note that the issues can be site-wide (for example, your website is down) or page-wide (for example, a misplaced noindex tag).
  • ...1 more annotation...
  • Sometimes changes in user behavior will change the demand for certain queries, either as a result of a new trend, or seasonality throughout the year. This means your traffic may drop simply as a result of external influences.
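The distinction drawn above between a sudden site-wide failure and a slower algorithmic decline can be checked mechanically against a daily clicks export. A toy sketch with made-up numbers and thresholds:

```python
# Toy sketch: classify the shape of a traffic drop. A large day-over-day fall
# suggests a site-wide technical issue; a steady multi-week slide fits the
# slower decline described above for core updates or page-level issues.
def classify_drop(daily_clicks: list[float], sudden_threshold: float = 0.4) -> str:
    for yesterday, today in zip(daily_clicks, daily_clicks[1:]):
        if yesterday > 0 and (yesterday - today) / yesterday >= sudden_threshold:
            return "sudden drop: check for outages, noindex, or other site-wide issues"
    first_week = sum(daily_clicks[:7]) / 7
    last_week = sum(daily_clicks[-7:]) / 7
    if last_week < first_week * 0.8:
        return "gradual decline: consistent with an algorithm change or seasonality"
    return "no significant drop"

clicks = [1000, 990, 980, 960, 930, 900, 870, 830, 790, 750, 710, 680, 660, 650]
print(classify_drop(clicks))  # hypothetical Search Console export
```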