DISC Inc / Group items tagged: blog guidelines


Rob Laporte

Google Removes Directory Links From Webmaster Guidelines - 0 views

  • Oct 3, 2008 at 9:48am Eastern by Barry Schwartz: Google Removes Directory Links From Webmaster Guidelines. Brian Ussery reported that Google has dropped two important bullet points from the Google Webmaster Guidelines: "Have other relevant sites link to yours" and "Submit your site to relevant directories such as the Open Directory Project and Yahoo!, as well as to other industry-specific expert sites." At the same time, Google Blogoscoped reported that Google removed the dictionary link at the top right of the search results page. Whether the two are related, I'm not sure. I speculated that maybe Google is going to go after more directories in the future; by removing those two bullet points, Google could do so without seeming hypocritical. In addition, I noted a comment from Google's John Mueller in a Google Groups thread where he explained the logic behind removing those two points: "I wouldn't necessarily assume that we're devaluing Yahoo's links, I just think it's not one of the things we really need to recommend. If people think that a directory is going to bring them lots of visitors (I had a visitor from the DMOZ once), then it's obviously fine to get listed there. It's not something that people have to do though :-)." As you can imagine, this is causing a bit of a commotion in some of the forums. Some are worried, some are mad, and some are confused by the change.
Rob Laporte

Live Search Webmaster Center Blog : SMX East 2008: Webmaster Guidelines - 0 views

  • The bottom line is there are no scenarios in which we would ever recommend cloaking as a good solution, although we do understand that there are some technical reasons people cloak pages that are not directly related to spam. The problem is that cloaking can set off some automatic spam detection algorithms that may result in parts of your site being penalized. As a search engine optimization practice, cloaking can actually be counter-productive.
  • Q: What can you do if your website does get penalized? The first thing you should do is verify that your site has in fact been penalized. To do this, log into our Webmaster Tools and go to the Site Summary page. From there, look for the Blocked: field in the right-hand column. If your site is blocked, this will show as Yes; otherwise it will show as No. If your site is blocked, then it is time to review our Webmaster Guidelines and check your site to see which one(s) you may have violated. If you have any questions about this step, please consult our online forums, or contact our technical support staff. Once you've identified and resolved the issue(s), it is time to request that Live Search re-include your pages in its index. To do that, log back into the Webmaster Tools and click the hyperlinked Blocked: Yes on your Site Summary page. This will take you to a form where you can request reevaluation from our support team. Thanks for all of your questions today! If you have any more, please leave them in the comments section and we'll try to answer them as soon as possible.
Rob Laporte

Google & Microsoft Share Advice For Webmasters, SEOs - 0 views

  • On the Live Search blog, Nathan Buggia recaps his SMX East presentation on Webmaster Guidelines, shares the slides from his talk, and expands on topics such as paid links, cloaking, and website penalties. He shares some detail on how Live Search handles paid links: Essentially we look at each link individually to understand the degree to which the site is really endorsing the link. So, while we most likely will not ban your site for buying or selling a few links, it is also likely that they may not actually end up providing any value either.
Rob Laporte

An SEO guide to understanding E-E-A-T - 0 views

  • Google recently added an extra “E” to the search quality standards of E-A-T to ensure content is helpful and relevant. The extra “E” stands for “experience” and precedes the original E-A-T concept – expertise, authoritativeness and trustworthiness. 
  • The Stanford Persuasive Technology Lab compiled 10 guidelines for building web credibility based on three years of research with over 4,500 participants: 1) Make it easy to verify the accuracy of the information on your site. 2) Show that there's a real organization behind your site. 3) Highlight the expertise in your organization and in the content and services you provide. 4) Show that honest and trustworthy people stand behind your site. 5) Make it easy to contact you. 6) Design your site so it looks professional (or is appropriate for your purpose). 7) Make your site easy to use – and useful. 8) Update your site's content often (at least show it's been reviewed recently). 9) Use restraint with any promotional content (e.g., ads, offers). 10) Avoid errors of all types, no matter how small they seem. – Stanford Web Credibility Research. If the above doesn't scream, "Be a human, care about your users and your website experience," I don't know what does.
  • Experience is especially important in a digital world moving toward generative AI content
  • ...13 more annotations...
  • It’s probably no coincidence that Google announced the addition of “experience” in its search quality raters guidelines shortly after ChatGPT’s launch. 
  • Besides, expertise will build confidence with the human reading your content, so I would still consider adding: the author's name; a descriptive bio containing their relevant qualifications and links to their social media profiles; and a Person schema with relevant properties for certifications or professions (a minimal markup sketch follows this list).
  • Authority can be demonstrated in three core ways:  Establishing a strong content architecture covering all aspects of a particular topic. Earning backlinks from other authoritative sites. Building a digital profile or personal brand as an expert in a particular topic.
  • Once again, the idea of publishing content that is truly helpful supports Stanford's web credibility guidelines: Make it easy to contact you. Make it easy to verify the accuracy of the information on your site. Design your site so it looks professional (or is appropriate for your purpose). Make your site easy to use – and useful. Update your site's content often (at least show it's been reviewed recently). Use restraint with any promotional content (e.g., ads, offers). Avoid errors of all types, no matter how small they seem.
  • Although they carry less weight than they used to, backlinks are still an indicator of an authoritative site.
  • Consider page experience
  • Show your humans with an About us or Team page
  • Link to authoritative sources
  • Build topical clusters
  • Use internal links
  • Include different content types
  • Engage experts
  • Encourage reviews
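One of the annotations above suggests adding a Person schema with properties for the author's certifications or profession. A minimal sketch of what that JSON-LD could look like, generated with Python; the author name, URLs, job title, and credential below are hypothetical placeholders, not details from the article:

```python
import json

# Minimal Person schema for an author bio, serialized as JSON-LD.
# All values below are hypothetical placeholders; swap in real author data.
author_schema = {
    "@context": "https://schema.org",
    "@type": "Person",
    "name": "Jane Example",                      # the author's name
    "url": "https://example.com/authors/jane",   # descriptive bio page
    "sameAs": [                                  # links to social media profiles
        "https://www.linkedin.com/in/jane-example",
        "https://twitter.com/janeexample",
    ],
    "jobTitle": "Senior SEO Consultant",
    "hasCredential": {                           # relevant qualification or certification
        "@type": "EducationalOccupationalCredential",
        "name": "Google Analytics Individual Qualification",
    },
}

# Emit the script tag you would place in the page's <head> or next to the author bio.
print('<script type="application/ld+json">')
print(json.dumps(author_schema, indent=2))
print("</script>")
```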
Jennifer Williams

Tag Categories - 24 views

Hey Dale, I added that for you. If anyone else really thinks a new "tag" (category) is needed, post here to the forum. Don't forget to use these tags and make sure that they are spelled the same...

tags

jack_fox

17 Advanced SEO Techniques for 2021 - 0 views

  • These entries are all optimized around a single keyword (usually a long tail). And they’re NOT written like normal blog posts. Instead, each entry is more like a Wikipedia article.
  • a recent SEO experiment discovered that “duplicate images” (like stock photos) can hurt your page’s rankings.
  • Content Features are things like: Calculators Comparison charts Feature breakdowns Pros and cons lists Summaries Quote boxes
  • ...3 more annotations...
  • Google’s own Quality Rater Guidelines state that Supplementary Content is “important”.
  • people are much more likely to edit a brand new post. So the faster you send your email, the more likely you’ll get a link.
  • Google Discover SEO: Content freshness (you get most of your Discover traffic the day a post goes live) Original, high-quality images, charts and graphs Engagement on Twitter (which Google indexes) High level of traffic to a single page Content about popular topics (Discover content suggestions are based largely on that user’s browsing history)
Rob Laporte

How to Optimize for Google's Featured Snippets to Build More Traffic - Moz - 1 views

  • Multiple studies confirm that the majority of featured snippets are triggered by long-tail keywords. In fact, the more words that are typed into a search box, the higher the probability there will be a featured snippet.
  • To avoid confusion, let's stick to the "featured snippet" term whenever there's a URL featured in the box, because these present an extra exposure to the linked site (hence they're important for content publishers):
  • It helps if you use a keyword research tool that shows immediately whether a query triggers featured results. SE Ranking offers a nice filter allowing you to see keywords that are currently triggering featured snippets:
  • ...7 more annotations...
  • Tools like Buzzsumo and Text Optimizer can give you a good insight into questions people tend to ask around your topic:
  • Note that Search Console labels featured snippet positions as #1 (SEOs used to call them position 0). So when you see #1 in Google Search Console, there's nothing more to do there. Focus on #2 and lower.
  • MyBlogU (disclaimer: I am the founder) is a great way to do that. Just post a new project in the "Brainstorm" section and ask members to contribute their thoughts.
  • 1. Aim at answering each question concisely. My own observation of answer boxes has led me to think that Google prefers to feature an answer that was given within one paragraph. An older study by AJ Ghergich cites an average paragraph-snippet length of 45 words (with a maximum of 97 words), so let that be your guideline for how long each answer should be in order to get featured. This doesn't mean your articles need to be one paragraph long. On the contrary, these days Google seems to give preference to long-form content (also known as "cornerstone content," which is obviously a better way to describe it because it's not just about length) that's broken into logical subsections and features attention-grabbing images. Even if you don't believe that cornerstone content receives any special treatment in SERPs, focusing on long articles will help you cover more related questions within one piece (more on that below). All you need to do is adjust your blogging style just a bit: Ask the question in your article (that may be a subheading). Immediately follow the question with a one-paragraph answer. Elaborate further in the article. (A rough script for auditing answer-paragraph length is sketched after these annotations.)
  • 2. Be factual and organize well. Google loves numbers, steps, and lists. We've seen this again and again: more often than not, answer boxes will list the actual ingredients, number of steps, time to cook, year and city of birth, etc. Use Google's guide on writing meta descriptions to get a good idea of what kind of summaries and answers it looks for when generating snippets (including featured snippets). Google loves well-structured, factual, and number-driven content. There's no specific markup to structure your content; Google seems to pick up <table>, <ol>, and <ul> well and doesn't need any other pointers. Using H2 and H3 subheadings will make your content easier to understand for both Google and your readers.
    3. Make sure one article answers many related questions. Google is very good at determining synonymic and closely related questions, and so should you be. There's no point in creating a separate page answering each specific question. Creating one solid article addressing many related questions is a much smarter strategy if you aim at getting featured in answer boxes. This leads us to the next tactic:
    4. Organize your questions properly. To combine many closely related questions in one article, you need to organize your queries properly. This will also help you structure your content well. I have a multi-level keyword organization strategy that can be applied here as well: A generic keyword makes a section or a category of the blog. A more specific search query becomes the title of the article. Even more specific queries determine the subheadings of the article and thus define its structure. There will be multiple queries that are so closely related that they will all go under a single subheading. For example: Serpstat helps me a lot when it comes to both discovering an article idea and then breaking it into subtopics. Check out its "Questions" section. It will provide hundreds of questions containing your core term and then generate a tag cloud of other popular terms that come up in those questions.
  • 5. Make sure to use eye-grabbing images
  • How about structured markup? Many people would suggest using Schema.org (simply because it's been a "thing" to recommend adding schema for anything and everything) but the aforementioned Ahrefs study shows that there's no correlation between featured results and structured markup.
  •  
    "Organize your questions properly"
Jennifer Williams

RSS To HTML - How To Convert RSS Feeds Into Published Web Pages - A Mini-Guide - Robin ... - 0 views

  •  
    Covers various services to convert RSS feeds into published web pages on your site. Good for clients who have an offsite blog.
jack_fox

Everything You Need to Know About Spammy Structured Markup Penalty - 0 views

  • According to Google, a spammy structured markup penalty exists. On the Webmasters Forum, a lot of people report receiving a manual action message in Search Console saying that the website's schema code is spammy and violates Google's quality guidelines.
  • Use structured data for visible content only; Check and fix any warnings with Google’s testing tool; Use different markup for the pages within your website;
  • John Mueller said that, in most cases, the site’s ranking might not get affected by the loss of structured markup data.
  • ...1 more annotation...
  • In practice, if the structured data team takes action on a site, only the rich snippets are affected. So spammy structured data doesn't affect the rankings of a site; the rest of your site is still shown normally in search.
    • jack_fox
       
      6/2/17 video
jack_fox

A reintroduction to Google's featured snippets - 0 views

  • we launched an effort that included updates to our Search Quality Rater Guidelines to provide more detailed examples of low-quality webpages for raters to appropriately flag, which can include misleading information, unexpected offensive results, hoaxes and unsupported conspiracy theories. This work has helped our systems better identify when results are prone to low-quality content. If detected, we may opt not to show a featured snippet.
  • Showing more than one featured snippet may also eventually help in cases where you can get contradictory information when asking about the same thing but in different ways.
Rob Laporte

Beyond conventional SEO: Unravelling the mystery of the organic product carousel - Sear... - 0 views

  • How to influence the organic product carousel: In Google's blog post, they detailed three factors that are key inputs: Structured Data on your website, providing real-time product information via Merchant Center, and providing additional information through Manufacturer Center. This section of the article explores Google's guidance, along with some commentary on what I've noticed based on my own experiences.
  • Make sure your product markup is validated: The key here is to make sure the Product markup (Structured Data) on your page adheres to Google's guidelines and validates (a quick first-pass check is sketched after this list).
  • Submit your product feed to Google via Merchant Center: This is where it starts to get interesting. By using Google's Merchant Center, U.S. product feeds are now given the option to submit data via a new destination. The difference here for Google is that retailers are able to provide more up-to-date information about their products, rather than waiting for Google to crawl the site (what happens in step 1). Checking the box for "Surfaces across Google" grants access to your website's product feed, allowing your products to be eligible in areas such as Search and Google Images. For the purpose of this study we are most interested in Search, with the Organic Product Carousel in mind; "relevance" of information is the deciding factor for this feature. Google states that in order for this feature of Search to operate, you are not required to have a Google Ads campaign. Just create an account, then upload a product data feed.
    Commentary by PPC expert Kirk Williams: "Setting up a feed in Google Merchant Center has become even more simple over time since Google wants to guarantee that they have the right access, and that retailers can get products into ads! You do need to make sure you add all the business information and shipping/tax info at the account level, and then you can set up a feed fairly easily with your dev team, a third party provider like Feedonomics, or with Google Sheets. As I note in my 'Beginner's Guide to Shopping Ads', be aware that the feed can take up to 72 hours to process, and even longer to begin showing in SERPs. Patience is the key here if just creating a new Merchant Center… and make sure to stay up on those disapprovals as Google prefers a clean GMC account and will apply more aggressive product disapproval filters to accounts with more disapprovals." – Kirk Williams
    For a client I'm working with, completing this step resulted in several of their products being added to the top 10 of the Popular Products carousel, one of which is in the top 5 and visible when the SERP first loads. This meant that, in this specific scenario, the product Structured Data that Google was regularly crawling and indexing in the U.S. wasn't enough on its own to be considered for the Organic Product Carousel. Note: the products that were added to the carousel were already considered "popular"; Google just hadn't added them. It is not guaranteed that your products will be added just because this step was completed. It really comes down to the prominence of your product and relevance to the query (the same as for any other page that ranks).
  • ...2 more annotations...
  • 3. Create an additional feed via Manufacturer Center: The next step involves the use of Google's Manufacturer Center. Again, this tool works in the same way as Merchant Center: you submit a feed and can add additional information. This information includes product descriptions, variants, and rich content, such as high-quality images and videos that can show within the Product Knowledge Panel. You'll need to first verify your brand name within the Manufacturer Center dashboard, then you can proceed to uploading your product feed. When Google references the "Product Knowledge Panel" in their release, it's not the same type of Knowledge Panel many in the SEO industry are accustomed to. This Product Knowledge Panel contains very different information compared to your standard KP, which is commonly powered by Wikipedia, and it appears in various capacities (based on how much data it has access to). Here's what this Product Knowledge Panel looks like in its most refined state, completely populated with all information that can be displayed:
    Type #1 just shows the product image(s), the title, and the review count. Type #2 is an expansion on Type #1 with further product details and another link to the reviews. Type #3 is the more standard-looking Knowledge Panel, with the ability to share a link via an icon at the top right; it has a description and more of a breakdown of reviews, with the average rating. This is the evolved state where I tend to see Ads being placed. Type #4 is an expansion of Type #3, with the ability to filter through reviews and search the database with different keywords. This is especially useful functionality when assessing the source of the aggregated reviews.
    Based on my testing with a client in the U.S., adding the additional information via Manufacturer Center resulted in a new product getting added to a Popular Products carousel. This happened two weeks after submitting the feed, so there could still be further impact to come. I will likely wait longer and then test a different approach.
  • Quick recap: Organic Product Carousel features are due to launch globally at the end of 2019. Popular Product and Best Product carousels are the features to keep an eye on. Make sure your products have valid Structured Data, a submitted product feed through Merchant Center, and a feed via Manufacturer Center. Watch out for cases where your client's brand is given a low review score due to the data sources Google has access to. Do your own testing; as Cindy Krum mentioned earlier, there are a lot of clicks between the Organic Product Carousel listings and your website's product page. Remember: there may be cases where it is not possible to get added to the carousel due to an overarching "prominence" factor, so seek out realistic opportunities.
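As a rough first pass on the "make sure your product markup is validated" point above (it does not replace Google's own structured data testing tools), you can confirm that a page exposes Product JSON-LD with a few core properties. A sketch assuming BeautifulSoup; the required-field list and the local file name are illustrative assumptions, not Google's full requirements:

```python
import json
from bs4 import BeautifulSoup  # pip install beautifulsoup4

REQUIRED_FIELDS = ("name", "image", "offers")  # core Product properties; illustrative, not exhaustive

def find_product_blocks(html: str):
    """Extract JSON-LD objects of @type Product from a page."""
    soup = BeautifulSoup(html, "html.parser")
    products = []
    for tag in soup.find_all("script", type="application/ld+json"):
        try:
            data = json.loads(tag.string or "")
        except json.JSONDecodeError:
            continue
        # A JSON-LD block may hold a single object or a list of objects.
        items = data if isinstance(data, list) else [data]
        products.extend(i for i in items if isinstance(i, dict) and i.get("@type") == "Product")
    return products

def missing_fields(product: dict):
    """Report which core fields are absent from a Product block."""
    return [field for field in REQUIRED_FIELDS if not product.get(field)]

if __name__ == "__main__":
    # "product-page.html" is a hypothetical local copy of the page being checked.
    html = open("product-page.html", encoding="utf-8").read()
    for product in find_product_blocks(html):
        gaps = missing_fields(product)
        status = "OK" if not gaps else f"missing: {', '.join(gaps)}"
        print(f"{product.get('name', '(unnamed product)')}: {status}")
```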
Jennifer Williams

Black Hat SEO Techniques Part 2: The Myth of Duplicate Content - SEO - Zimbio - 0 views

  •  
    Concerning RSS feeds as duplicate content.
Rob Laporte

70+ Best Free SEO Tools (As Voted-for by the SEO Community) - 1 views

  • Soovle — Scrapes Google, Bing, Yahoo, Wikipedia, Amazon, YouTube, and Answers.com to generate hundreds of keyword ideas from a seed keyword. Very powerful tool, although the UI could do with some work.
  • Hemingway Editor — Improves the clarity of your writing by highlighting difficult to read sentences, "weak" words, and so forth. A must-have tool for bloggers (I use it myself).
  • Yandex Metrica — 100% free web analytics software. Includes heat maps, form analytics, session reply, and many other features you typically wouldn’t see in a free tool.
  • For example, two of my all-time favourite tools are gInfinity (Chrome extension) and Chris Ainsworth's SERPs extraction bookmarklet. By combining these two free tools, you can extract multiple pages of the SERPs (with meta titles + descriptions) in seconds.
  • ...17 more annotations...
  • Varvy — Checks whether a web page is following Google’s guidelines. If your website falls short, it tells you what needs fixing.
  • LSIgraph.com — Latent Semantic Indexing (LSI) keywords generator. Enter a seed keyword, and it’ll generate a list of LSI keywords (i.e. keywords and topics semantically related to your seed keyword). TextOptimizer is another very similar tool that does roughly the same job.
  • Small SEO Tools Plagiarism Checker — Detects plagiarism by scanning billions of documents across the web. Useful for finding those who’ve stolen/copied your work without attribution.
  • iSearchFrom.com — Emulate a Google search using any location, device, or language. You can customise everything from SafeSearch settings to personalised search.
  • Delim.co — Convert a comma-delimited list (i.e. CSV) in seconds. Not necessarily an SEO tool per se but definitely very useful for many SEO-related tasks.
  • Am I Responsive? — Checks website responsiveness by showing you how it looks on desktop, laptop, tablet, and mobile.
  • SERPLab — Free Google rankings checker. Updates up to 50 keywords once every 24 hours (server permitting).
  • Keyword Mixer — Combine your existing keywords in different ways to try and find better alternatives. Also useful for removing duplicates from your keywords list. Note: MergeWords does (almost) exactly the same job, albeit with a cleaner UI; however, there is no option to de-dupe the list. (A minimal script version of this idea is sketched after this list.)
  • JSON-LD Schema Generator — JSON-LD schema markup generator. It currently supports six markup types including: product, local business, event, and organization.
  • KnowEm Social Media Optimizer — Analyses your web page to see if it’s well-optimised for social sharing. It checks for markup from Facebook, Google+, Twitter, and LinkedIn.
  • Where Goes? — Shows you the entire path of meta-refreshes and redirects for any URL. Very useful for diagnosing link issues (e.g. complex redirect chains).
  • Google Business Review Link Generator — Generates a direct link to your Google Business listing. You can choose between a link to all current Google reviews, or to a pre-filled 5-star review box.
  • PublicWWW — Searches the web for pages using source code-based footprints. Useful for finding your competitors affiliates, websites with the same Google Analytics code, and more.
  • Keywordtool.io — Scrapes Google Autosuggest to generate 750 keyword suggestions from one seed keyword. It can also generate keyword suggestions for YouTube, Bing, Amazon, and more.
  • SERPWatcher — Rank tracking tool with a few unique metrics (e.g. “dominance index”). It also shows estimated visits and ranking distribution charts, amongst other things.
  • GTMetrix — Industry-leading tool for analysing the loading speed of your website. It also gives actionable recommendations on how to make your website faster.
  • Mondovo — A suite of SEO tools covering everything from keyword research to rank tracking. It also generates various SEO reports.
  • SEO Site Checkup — Analyse various on-page/technical SEO issues, monitor rankings, analyse competitors, create custom white-label reports, and more.
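The keyword-combination tools above (Keyword Mixer, MergeWords) essentially take the cross product of two or more term lists and de-duplicate the result, which is easy to script yourself. A minimal sketch; the seed terms are made up:

```python
from itertools import product

# Hypothetical seed lists; replace with your own modifiers and head terms.
modifiers = ["best", "cheap", "free"]
heads = ["seo tools", "rank tracker", "seo tools"]  # duplicate left in on purpose

def mix_keywords(*term_lists):
    """Combine term lists in order and de-duplicate while preserving first-seen order."""
    seen = set()
    combos = []
    for parts in product(*term_lists):
        phrase = " ".join(parts)
        if phrase not in seen:
            seen.add(phrase)
            combos.append(phrase)
    return combos

for keyword in mix_keywords(modifiers, heads):
    print(keyword)
```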
Rob Laporte

What Do Google's New, Longer Snippets Mean for SEO? - Whiteboard Friday - Moz - 0 views

  • So Google's had historic guidelines that said, well, you want to keep your meta description tag between about 160 and 180 characters. I think that was the number. They've updated that to where they say there's no official meta description recommended length. But on Twitter, Danny Sullivan said that he would probably not make that greater than 320 characters. In fact, we and other data providers, that collect a lot of search results, didn't find many that extended beyond 300. So I think that's a reasonable thing.
  • Now it's sitting at about 51% of search results that have these longer snippets in at least 1 of the top 10 as of December 2nd.
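If you want to flag meta descriptions likely to be truncated under the longer-snippet behaviour discussed above, a simple length check is enough. A rough sketch that treats ~300 characters as the soft ceiling mentioned in the quote; the limit and the sample URL are assumptions for illustration:

```python
from bs4 import BeautifulSoup  # pip install beautifulsoup4

SOFT_LIMIT = 300  # rough ceiling based on the ~300-320 character figures quoted above

def check_meta_description(html: str, url: str = "") -> str:
    """Return a short verdict on the page's meta description length."""
    soup = BeautifulSoup(html, "html.parser")
    tag = soup.find("meta", attrs={"name": "description"})
    if tag is None or not tag.get("content", "").strip():
        return f"{url}: no meta description"
    length = len(tag["content"].strip())
    verdict = "likely truncated" if length > SOFT_LIMIT else "within range"
    return f"{url}: {length} characters ({verdict})"

if __name__ == "__main__":
    sample = '<head><meta name="description" content="A short example description."></head>'
    print(check_meta_description(sample, "https://example.com/"))  # placeholder URL
```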
jack_fox

The Ultimate Cheat Sheet for Taking Full Control of Your Google Knowledge Panels - Moz - 0 views

  • Posts can be up to 1500 characters, but 150–350 characters is advisable.
  • Google has let Top Contributors to its forum know that it’s okay for businesses to contribute knowledge to their own Know This Place section
  • Review snippets: This section of the Knowledge Panel features three excerpts from Google-based reviews, selected by an unknown process.
  • ...3 more annotations...
  • Avoid repetition in category choices
  • Do call out desirable aspects of your business in the description, but don’t use it to announce sales or promotions, as that’s a violation of the guidelines.
  • The most popular solution to the need to implement call tracking is to list the call tracking number as the primary number and the store location number as the additional number. Provided that the additional number matches what Google finds on the website, no serious problems have been reported from utilizing this strategy since it was first suggested in 2017
jack_fox

Why Google Plans to Give Non-AMP Pages the Same Treatment as AMP Pages - Here's Why #23... - 1 views

  • The folks that make AMP and that invented AMP don’t really think you have to use AMP to make things fast, it’s just easier to use AMP. So, they’ve thought all along that it’s actually better if any pages that follow AMP’s guidelines for speed, reliability, layout stability
  • AMP can also be restrictive too. So, if you don’t have to use AMP at all and it’s an option, it seems a lot nicer than having to use AMP to get these special Google Search features