Home/ DISC Inc/ Group items tagged linking


jack_fox

What You Need to Know About Open Graph Meta Tags for Total Facebook and Twitter Mastery - 0 views

  • All of the other major platforms (Twitter, LinkedIn, and Google+) recognize Open Graph tags. Twitter has its own meta tags for Twitter Cards, but when its robots cannot find any, it falls back to Open Graph tags instead.
  • These tags can have a large effect on conversions and click-through rates.
  • Adding Open Graph tags to your website won’t directly affect your on-page SEO, but it will influence the performance of your links on social media
  • ...8 more annotations...
  • if Facebook doesn’t find the og:title tag on your page, it uses the meta title instead.
  • og:type – This is how you describe the kind of object you are sharing: blog post, video, picture, or whatever.
  • In most cases, you will use the “website” value, since what you are sharing is a link to a website. In fact, if you don’t define a type, Facebook will read it as “website” by default.
  • og:description – This metadata descriptor is very similar to the meta description tag in HTML. This is where you describe your content, but instead of showing on a search engine results page, it shows below the link title on Facebook.
  • the picture you use as an Open Graph image can be different from what you have on your page
  • og:locale – defines the language, American English is the default
  • twitter:card – This required tag works in a similar way to og:type. It describes the type of content you are sharing.
  • Before you can fully benefit from Twitter Cards, you need to request approval for your page from Twitter. Fortunately, this doesn’t take much time and can be done easily using their Card Validator.
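The fallback behavior described in these annotations (og:title falling back to the meta title, og:type defaulting to "website") can be sketched with Python's standard-library HTML parser. This is a minimal illustration, not production code; the sample HTML and the `social_preview` helper are hypothetical.

```python
from html.parser import HTMLParser

class OpenGraphParser(HTMLParser):
    """Collects og:* and twitter:* meta tags, plus <title> as a fallback."""
    def __init__(self):
        super().__init__()
        self.tags = {}
        self.title = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta":
            key = attrs.get("property") or attrs.get("name") or ""
            if key.startswith(("og:", "twitter:")) and "content" in attrs:
                self.tags[key] = attrs["content"]
        elif tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def social_preview(html: str) -> dict:
    """Resolve preview fields the way the article describes:
    og:title falls back to the <title> tag, og:type defaults to 'website'."""
    p = OpenGraphParser()
    p.feed(html)
    return {
        "title": p.tags.get("og:title", p.title),
        "type": p.tags.get("og:type", "website"),
        "description": p.tags.get("og:description", ""),
    }

html = """<html><head>
<title>Fallback Title</title>
<meta property="og:description" content="Shown below the link title on Facebook.">
</head><body></body></html>"""
print(social_preview(html))
# {'title': 'Fallback Title', 'type': 'website', 'description': 'Shown below the link title on Facebook.'}
```

Because the page above defines no og:title or og:type, the preview falls back to the meta title and the "website" default, exactly as the excerpt describes Facebook doing.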
jack_fox

The January 2020 Core Update: Affiliate Sites, Pet Health, Trust Issues and Spam likely... - 0 views

  • Affiliate sites that did not properly disclose their affiliate links may have been affected. Truly excellent content appears to have been rewarded. Several elements of trust, as outlined in the Quality Raters’ Guidelines (QRG), were possibly reassessed.
  • A lot of ultra-spammy content may have been deindexed.
  • we believe that if something is outlined in the QRG, it means that Google is either measuring this algorithmically, or they want to be able to measure it algorithmically.
  • ...2 more annotations...
  • Some examples of things that we noticed on affiliate sites that saw improvements in overall keyword rankings with this update: plain text that makes it clear the user is clicking on a link that takes them to a sales page. Example: “When I make this recipe, I love to use this blender, which you can buy on Amazon.”
  • Using an official widget from your affiliate partners.
jack_fox

Subdomain vs. Subfolder, Is One Better Than the Other for SEO? - 0 views

  • Google has repeatedly said either is fine.
  • John Mueller said in 2017: “Google websearch is fine with using either subdomains or subdirectories. Making changes to a site’s URL structure tends to take a bit of time to settle down in search, so I recommend picking a setup that you can keep for longer.”
  • Many SEOs believe that subdomains are treated as separate domains, but the truth is more complicated. Anyone who incorporates subdomains as a main part of their site will likely have them treated exactly the same as a subfolder would be. However, if you’re not treating the subdomains as part of your main website (i.e., not connecting them with internal links), then they may be treated as separate.
  • ...3 more annotations...
  • These days, subdomains are likely to be treated as part of the same website if they appear to be part of the same website.
  • Many case studies show subfolders are better than subdomains, but I haven’t seen one that wasn’t complicated by other changes like additional internal linking or migrating multiple properties into one.
  • Changes introduce risk. You might want to think twice before changing from a subdomain to a subfolder if the only reason you’re doing it is for SEO.
jack_fox

Entity-Based Search For Advanced SEO - 0 views

  • From a technical SEO perspective, one of the most effective ways to create strong connections is through schema markup.
  • By linking the content on your website to resources across the Web, search engines begin to understand and contextualize the information.
  • One effective way to contextualize your content is to link the information on your website to other entities in knowledge graphs with high E-A-T, like Wikipedia. Of course, not all entities exist on Wikipedia pages. Other types of entities, such as you, your brand, or your company, can be linked to knowledge graphs like LinkedIn.
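As a sketch of the idea above, schema.org markup can declare sameAs links that tie your organization to its entries in external knowledge graphs such as Wikipedia and LinkedIn. The names and URLs below are hypothetical placeholders, and the `organization_jsonld` helper is just an illustration:

```python
import json

def organization_jsonld(name: str, url: str, same_as: list[str]) -> str:
    """Build schema.org Organization markup whose sameAs links tie the
    entity to its entries in external knowledge graphs."""
    data = {
        "@context": "https://schema.org",
        "@type": "Organization",
        "name": name,
        "url": url,
        "sameAs": same_as,  # links that disambiguate this entity
    }
    return json.dumps(data, indent=2)

# Hypothetical example values:
markup = organization_jsonld(
    "Example Co",
    "https://example.com",
    ["https://en.wikipedia.org/wiki/Example",
     "https://www.linkedin.com/company/example"],
)
print(markup)
```

The resulting JSON-LD would normally be embedded in the page inside a script tag of type application/ld+json so search engines can read it.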
jack_fox

- 0 views

  • 100s of backlinks from http://theglobe.net  and related spammy domains. Ignore or disavow?
    • jack_fox
       
      THS has this situation. According to John Mueller, it's nothing to worry about!
  • We already ignore links from sites like that, where there are unlikely to be natural links. No need to disavow :)
jack_fox

Update: Long Term Shared Hosting Experiment | Reboot - 0 views

  • The results showed that experiment websites sharing an IP address with spammy and/or 'low-quality' websites (defined as those we confirmed were using aggressive and suspicious link building strategies in adult, gambling or pharma niches) ranked lower than those hosted on a dedicated IP
  • In both versions of the experiment, the sites sharing an IP address with low-quality and potentially toxic domains ranked lower than those hosted on a dedicated IP.
jack_fox

Patrick Stox on Twitter: "Uncommon SEO Knowledge #1 HTTPS - 0 views

  • HTTPS is required for many modern web technologies. HTTP/2 (H2), HTTP/3 (H3), Accelerated Mobile Pages (AMP), Progressive Web Apps (PWAs), service workers, geolocation, push notifications and more require HTTPS by default.
  • There are many types of certificates. The most common is Domain Validated (DV) which you can typically get for free from your web host, CDN, or issuers like https://letsencrypt.org. Organization Validated (OV) and Extended Validation (EV) may be seen as more trustworthy.
  • One certificate for a domain is a lot easier to maintain if you need different subdomains.
  • ...2 more annotations...
  • You should set up monitoring on old domains with something like @contentking alerts. If your cert expires, users will receive a warning on your old pages and will not be redirected to the new site.
  • You can still change the Referrer Policy for your site, but this mostly benefits other websites. You are still losing much of the referring-page data to your own website. To see who is sending you traffic, you'll have to get data from a backlink index like @ahrefs.
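The certificate-expiry warning described above is straightforward to monitor yourself. A real monitor would fetch the certificate over the network (for example with Python's ssl module) and alert on its notAfter date; the sketch below covers only the offline date arithmetic, using a hypothetical expiry string in the format that ssl.getpeercert() returns:

```python
import datetime

def days_until_expiry(not_after: str, now: datetime.datetime) -> int:
    """Parse a certificate's notAfter field, e.g. 'Jun  1 12:00:00 2030 GMT'
    (the string format returned by ssl.getpeercert()), and return the
    number of whole days remaining; negative means already expired."""
    expires = datetime.datetime.strptime(not_after, "%b %d %H:%M:%S %Y %Z")
    return (expires - now).days

now = datetime.datetime(2030, 5, 1)
print(days_until_expiry("Jun  1 12:00:00 2030 GMT", now))  # 31
```

A cron job could run a check like this daily and alert when the remaining days drop below a chosen threshold, so redirects on old domains keep working.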
Rob Laporte

Entity SEO: The definitive guide - 0 views

  • why are SEOs still confused about entities?
  • entities get conflated with keywords
  • Entity SEO is a far more scientific approach to SEO – and science just isn’t for everyone
  • ...13 more annotations...
  • By reading this, you’ll learn: what an entity is and why it’s important; the history of semantic search; how to identify and use entities in the SERP; and how to use entities to rank web content.
  • Examples of entities
  • Perhaps the best example of entities in the SERP is intent clusters. The more a topic is understood, the more these search features emerge
  • What is an entity? An entity is a uniquely identifiable object or thing characterized by its name(s), type(s), attributes, and relationships to other entities. An entity is only considered to exist when it exists in an entity catalog. Entity catalogs assign a unique ID to each entity. My agency has programmatic solutions that use the unique ID associated with each entity (services, products, and brands are all included). If a word or phrase is not inside an existing catalog, that does not mean it is not an entity, but you can typically tell whether something is an entity by its existence in a catalog.
  • concepts and ideas are entities
  • Schema is one of my favorite ways of disambiguating content. You are linking entities in your blog to knowledge repositories. Balog says:  “[L]inking entities in unstructured text to a structured knowledge repository can greatly empower users in their information consumption activities.” 
  • That brings us to the current search system. Google went from 570 million entities and 18 billion facts to 8 billion entities and 800 billion facts in less than 10 years. As these numbers grow, entity search improves.
  • How to optimize for entities – What follows are key considerations when optimizing entities for search: the inclusion of semantically related words on a page; word and phrase frequency on a page; the organization of concepts on a page; including unstructured, semi-structured, and structured data on a page; subject-predicate-object (SPO) pairs; web documents on a site that function as pages of a book; the organization of web documents on a website; and including concepts on a web document that are known features of entities.
  • We know this, so how can we optimize for it? Your documents should contain as many search intent variations as possible, and your website should contain every search intent variation for your cluster. Clustering relies on three types of similarity: lexical similarity, semantic similarity, and click similarity.
  • More could be said about schema, but suffice it to say schema is an incredible tool for SEOs looking to make page content clear to search engines.
  • (Remember, Google wants to understand the hierarchy of the content, which is why H1–H6 is important.)
  • Balog writes: “We wish to help editors stay on top of changes by automatically identifying content (news articles, blog posts, etc.) that may imply modifications to the KB entries of a certain set of entities of interest (i.e., entities that a given editor is responsible for).” Anyone who improves knowledge bases, entity recognition, and the crawlability of information will get Google’s love. Changes made in the knowledge repository can be traced back to the document that was the original source. If you provide content that covers the topic and you add a level of depth that is rare or new, Google can identify that your document added that unique information. Eventually, this new information, sustained over a period of time, could lead to your website becoming an authority. This isn’t an authoritativeness based on domain rating but on topical coverage, which I believe is far more valuable. With the entity approach to SEO, you aren’t limited to targeting keywords with search volume. All you need to do is validate the head term (“fly fishing rods,” for example), and then you can focus on targeting search intent variations based on good old-fashioned human thinking.
  • We begin with Wikipedia. For the example of fly fishing, we can see that, at a minimum, the following concepts should be covered on a fishing website: fish species, history, origins, development, technological improvements, expansion, methods of fly fishing, casting, spey casting, fly fishing for trout, techniques for fly fishing, fishing in cold water, dry fly trout fishing, nymphing for trout, still water trout fishing, playing trout, releasing trout, saltwater fly fishing, tackle, artificial flies, and knots. The topics above came from the fly fishing Wikipedia page. While this page provides a great overview of topics, I like to add additional topic ideas that come from semantically related topics. For the topic “fish,” we can add several additional topics, including etymology, evolution, anatomy and physiology, fish communication, fish diseases, conservation, and importance to humans. Has anyone linked the anatomy of trout to the effectiveness of certain fishing techniques? Has a single fishing website covered all fish varieties while linking the types of fishing techniques, rods, and bait to each fish? By now, you should be able to see how the topic expansion can grow. Keep this in mind when planning a content campaign. Don’t just rehash. Add value. Be unique. Use the algorithms mentioned in this article as your guide.
  • Conclusion: This article is part of a series of articles focused on entities. In the next article, I’ll dive deeper into the optimization efforts around entities and some entity-focused tools on the market.
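Of the three similarity types the article lists, lexical similarity is the easiest to illustrate. The sketch below clusters hypothetical fly-fishing queries by Jaccard overlap of their word sets; semantic and click similarity would need embeddings and click data, and the 0.6 threshold is an arbitrary illustration value:

```python
def jaccard(a: str, b: str) -> float:
    """Lexical similarity: word-set overlap between two queries."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

def cluster_queries(queries: list[str], threshold: float = 0.6) -> list[list[str]]:
    """Greedy single-pass clustering: each query joins the first cluster
    whose seed query it is lexically similar enough to."""
    clusters: list[list[str]] = []
    for q in queries:
        for c in clusters:
            if jaccard(q, c[0]) >= threshold:
                c.append(q)
                break
        else:
            clusters.append([q])
    return clusters

queries = [
    "fly fishing rods",
    "best fly fishing rods",
    "saltwater fly fishing",
    "fly fishing rods for trout",
]
print(cluster_queries(queries))
# [['fly fishing rods', 'best fly fishing rods', 'fly fishing rods for trout'],
#  ['saltwater fly fishing']]
```

The rod-related queries cluster together while the saltwater query stays separate, which is the kind of intent grouping the article says emerges in the SERP as a topic becomes better understood.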
Rob Laporte

Google's March 2022 Product Reviews Update (PRU) - Findings and observations from the a... - 0 views

  • sites should consider providing links to more than one retailer to purchase products
  • against Amazon’s TOS
  • my recommendation is to link to more than one seller, if possible (to future-proof your site), but it’s not a requirement as of now
  • ...1 more annotation...
  • provide evidence such as visuals, audio, or other links of your own experience with the product, to support your expertise and reinforce the authenticity of your review
Rob Laporte

Google's Matt Cutts: Black Hat & Link Spammers Less Likely To Show Up In Search Results... - 0 views

  •  
    Google Panda update
Rob Laporte

"Ask On Google+" Links Appearing In Google's Search Results - 0 views

  •  
    search