DISC Inc: Group items tagged Keywords

Rob Laporte

Inner View: Google's Keyword Research Tools (from SMX East) | Maine SEO Blog - 0 views

  • 55% of queries have more than 3 words; 70% of queries have no exact keyword match; 20% of queries in a given day have not been seen in the previous 90 days.
  • Logged in vs. not logged in: when you’re logged in to the Keyword Tool (KWT), you can get up to 1,000 queries; when you’re not logged in, you only get up to 100 queries.
  • Google Suggest Keyword Tool uses Google Suggest, on top of a lot of other metrics.
Rob Laporte

65+ Best Free SEO Chrome Extensions (As Voted-for by SEO Community) - 1 views

  • Link Redirect Trace — Uncovers all URLs in a redirect chain, including 301s, 302s, etc. Very useful for finding (and regaining) lost “link juice,” among other things. Other similar extensions: Redirect Path. (A minimal redirect-tracing sketch appears after this list.)
  • Scraper — Scrape data from any web page using XPath or jQuery. Integrates with Google Sheets for one-click export to a spreadsheet. Or you can copy to clipboard and paste into Excel. Other similar extensions: Data Scraper — Easy Web Scraping, XPather
  • Tag Assistant (by Google) — Check for the correct installation of Google tags (e.g. Google Analytics, Tag Manager, etc) on any website. Also, record typical user flows on your website to diagnose and fix implementation errors.
  • Web Developer — Adds a web developer toolbar to Chrome. Use it to check how your website looks on different screen sizes, find images with missing alt text, and more.
  • WhatRuns — Instantly discover what runs any website. It uncovers the CMS, plugins, themes, ad networks, fonts, frameworks, analytics tools, everything.
  • Page Load Time — Measures and displays page load time in the toolbar. Also breaks down this metric by event to give you deeper insights. Simple, but very useful.
  • FATRANK — Tells you where the webpage you’re visiting ranks in Google for any keyword/phrase.
  • SEOStack Keyword Tool — Finds thousands of low-competition, long-tail keywords in seconds. It does this by scraping Google, YouTube, Bing, Yahoo, Amazon, and eBay. All data can be exported to CSV.
  • Window Resizer — Resize your browser window to see how a website looks on screens of different sizes. It has one-click emulation for popular sizes/resolutions (e.g. iPhone, iPad, laptop, desktop, etc).
  • Ghostery — Tells you how websites are tracking you (e.g. Facebook Custom Audiences, Google Analytics, etc) and blocks them. Very useful for regaining privacy. Plus, websites generally load faster when they don’t need to load tracking technologies.
  • Ayima Page Insights — Uncovers technical and on-page issues for any web page. It also connects to Google Search Console for additional insights on your web properties.
  • ObservePoint TagDebugger — Audit and debug issues with website tags (e.g. Google Analytics, Tag Manager, etc) on your websites. Also checks variables and on-click events. Other similar extensions: Event Tracking Tracker
  • The Tech SEO — Quick Click Website Audit — Provides pre-formatted links (for the current URL) to a bunch of popular SEO tools. A very underrated tool that reduces the need for mundane copy/pasting.
  • User-Agent Switcher for Chrome — Mimic user-agents to check that your website displays correctly in different browsers and operating systems.
  • Portent’s SEO Page Review — Reviews the current page and kicks back a bunch of data including meta tags, canonicals, outbound links, H1-H6 tags, OpenGraph tags, and more.
  • FindLinks — Highlights all clickable links/elements on a web page in bright yellow. Very useful for finding links on websites with weird CSS styling.
  • SERPTrends SEO Extension — Tracks your Google, Bing, and Yahoo searches. Then, if you perform the same search again, it shows ranking movements directly in the SERPs.
  • SimilarTech Prospecting — Discovers a ton of useful information about the website you’re visiting. This includes estimated monthly traffic, company information, social profiles, web technologies, etc.
  • SEO Search Simulator by Nightwatch — Emulates Google searches from any location. Very useful for seeing how rankings vary for a particular query in different parts of the world.
  • "Find Out How Much Traffic a Website Gets: 3 Ways Compared"
Rob Laporte

Valentine's Day AdWords "War" Among Florists Highlights Another Google Challenge - 0 views

  • Though bidding on competitors' trademarked keywords, and even using them in ad text, are fair game on Google, the company does have policies preventing false or misleading ad copy. According to a spokesperson, "Google allows advertisers to bid on competitor keywords as well as to use competitor terms in the ad text itself as long as advertisers do not make any false or inaccurate claims in their ads (see more here). We use a combination of manual and automated processes to enforce this policy. Ads that are found in violation of our policies will be removed."
Jennifer Williams

Wordtracker vs. Keyword Discovery - 0 views

  • Forum thread on Wordtracker vs. Keyword Discovery.
Jennifer Williams

Live Keyword Analysis - 0 views

  • Keyword density tool; recommends a keyword density of 2-8%.
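Keyword density is simple arithmetic: occurrences of the phrase divided by total words, times 100. A minimal sketch (the 2-8% band is this tool's recommendation, not a Google rule):

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Percent of the page's words taken up by occurrences of `phrase`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    target = phrase.lower().split()
    n = len(target)
    hits = sum(words[i:i + n] == target for i in range(len(words) - n + 1))
    return 100.0 * hits * n / len(words) if words else 0.0

sample = "Keyword research guides keyword strategy; good keyword research pays off."
print(f"{keyword_density(sample, 'keyword research'):.1f}%")  # 40.0%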
Rob Laporte

Google Sitelinks - What Sitelinks Are and How They Work - 0 views

  • What are Google Sitelinks? Google Sitelinks are displayed in Google search results and are meant to help users navigate your website. Google’s systems analyze the link structure of your site to find shortcuts that will save users time and let them quickly find the information they want. Sitelinks are completely automated by Google’s algorithm; in short, they are shortcuts to your main pages from the search result pages.
    When do Google Sitelinks show? Google only shows Sitelinks for results when it thinks they’ll be useful to the user. If the structure of your website doesn’t allow Google’s spider to find good Sitelinks, or Google doesn’t think the Sitelinks for your site are relevant to the user’s query, it won’t show them. Although there are no definitive answers from Google, the following factors seem to influence whether Sitelinks are displayed:
    - Your site holds a stable no. 1 ranking for the search query, which is why Sitelinks show up most often for searches on brand names.
    - Your site is old enough; websites under two years old don’t seem to get Sitelinks.
    - Search volume: keywords that aren’t searched for often enough don’t seem to get Sitelinks.
    - Clicks: your site seems to need many clicks for the searched keywords.
    - Sitelinks don’t seem to show for search queries consisting of two or more keywords.
    - Links: links are important everywhere in the SEO world, aren’t they? Inbound links with relevant anchor text seem to influence the chance of getting Sitelinks.
    How can we get Sitelinks for our website? If you meet the criteria above, you have a good chance of Sitelinks showing for your site, but you can also improve the structure of your website to increase the likelihood and quality of your Sitelinks. Google seems to use the first-level links on a website for Sitelinks, so make sure all your important links are on the homepage. The links should be text links or image links with an IMG ALT attribute; JavaScript and Flash links are not considered for Sitelinks. Google also seems to like links that appear at the top of a webpage, so put your important links at the top of the HTML code and re-position them using CSS. Overall, building your website to SEO best practices and ranking no. 1 for your most important keywords will make Sitelinks more likely to appear and help users navigate your website.
Rob Laporte

Limit Anchor Text Links To 55 Characters In Length? | Hobo - 0 views

  • Limit Anchor Text Links To 55 Characters In Length? Blurb by Shaun (Building Links). As an SEO I wanted to know: how many words or characters does Google count in a link? What’s best practice when creating links, internal or external? What is the optimal length of an HTML link? It appears the answer to the question "how many words in a text link?" is 55 characters, about 8-10 words. Why is this important to know?
    1. You get to understand how many words Google will count as part of a link.
    2. You can see why you should keep titles to a maximum number of characters.
    3. You can see why your domain name should be short and why URLs should be snappy.
    4. You can see why you should rewrite your URLs (SEF).
    5. It’s especially useful when thinking about linking internally, via body text on a page.
    I wanted to see how many words Google will count in one link when passing anchor text power to another page, so I ran a test like the one below:
    1. I pointed some nonsense words in one massive link, 50 words long, at the home page of a 'trusted' site.
    2. Each of the nonsense words was 6 characters long.
    3. Then I searched for something generic that the site would rank no. 1 for, and added the nonsense words to the search, so that the famous "These words only appear in links to the site" notice (paraphrased) kicked in.
    4. This, I surmised, would let me see how many of the nonsense words Google attributed to the target page from the massive 50-word link I tried to get it to swallow.
    The answer:
    1. Google counted 8 words in the anchor text link out of a possible 50.
    2. It seemed to ignore everything after the 8th word.
    3. 8 words x 6 characters = 48 characters + 7 spaces = a nice round, easy-to-remember number: 55 characters.
    So a possible best practice is to keep a link under 8 words and, more importantly, under 55 characters, because everything after that is ignored.
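To audit a page against that 55-character heuristic, here is a sketch using BeautifulSoup; the threshold comes from the post's experiment, not from any Google documentation:

```python
from bs4 import BeautifulSoup

MAX_ANCHOR_CHARS = 55  # heuristic from the experiment above, not an official limit

def long_anchors(html: str):
    """Yield (length, anchor text, href) for links whose text exceeds the heuristic."""
    soup = BeautifulSoup(html, "html.parser")
    for a in soup.find_all("a", href=True):
        text = " ".join(a.get_text().split())  # collapse whitespace
        if len(text) > MAX_ANCHOR_CHARS:
            yield len(text), text, a["href"]

html = '<a href="/guide">' + "seo " * 20 + "</a>"  # toy 79-character anchor
for length, text, href in long_anchors(html):
    print(length, href, text[:60])
```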
Rob Laporte

What Google Thinks of Your Site - Search Engine Watch (SEW) - 0 views

  • Internal Links Listings. Sitelinks have been around for years, about five to be exact. Another important SERP feature that has been around as long is a site’s internal links within SERP listings. These don’t only appear for branded or domain-related searches, nor do they require a first-place listing. These horizontally placed links, located between the SERP listing’s description and URL, are most often a mirror of the anchor text of the text links on your home page. To get Google to display them, place the text links in the first few paragraphs of copy to help increase your internal page CTR, and ensure the anchor text is identical to the destination page’s overall keyword focus. Placement of internal links in Google SERPs is Google’s thumbs-up that you have a proper internal-linking-to-keyword strategy.
  • Hierarchical Category Links. One of the most recent SERP listing features you can use to gauge Google’s perception of your site is the hierarchical breadcrumb links placed in the URL line of SERP listings. These began to appear half a year ago and, like the internal link placement above, don’t require a first-place ranking or brand/domain-related searches to appear in SERPs. You earn the hierarchical category links by utilizing breadcrumb navigation across the internal pages of your site. To create an optimal process of breadcrumb linking, apply your keyword strategy alongside the information architecture of your site content. Your URL structure should include keyword-rich, content-relevant category/folder naming conventions, and site content should fall into the appropriate categories. Furthermore, breadcrumb navigation in which the category links closely mimic the folder path of the URL helps indicate to Google how the content of your site flows and that you have taken steps to properly deliver site content to search engines as well as users. Taking these Google SERP features into consideration will let you gain insight into how Google understands the most important elements of your site from an SEO standpoint.
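The breadcrumb-mirrors-the-URL idea can be sketched with a small, hypothetical helper; real sites would map URL slugs to proper category names rather than title-casing them:

```python
from urllib.parse import urlparse

def breadcrumb_trail(url: str) -> list:
    """Derive a breadcrumb trail that mirrors the URL's folder path."""
    segments = [s for s in urlparse(url).path.split("/") if s]
    return ["Home"] + [s.replace("-", " ").title() for s in segments]

print(breadcrumb_trail("https://example.com/mens-shoes/running/"))
# ['Home', 'Mens Shoes', 'Running']
```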
Dale Webb

Website ranking, search engine position & rank checker tool - KPMRS - 0 views

shared by Dale Webb on 24 Jul 09
  • KPMRS helps you find your website's rankings on Google, Yahoo, and Live. Find your website's keyword rankings on Google, Yahoo, and MSN for different keywords.
Rob Laporte

RankBrain Judgment Day: 4 SEO Strategies You'll Need to Survive | WordStream - 0 views

  • The future of SEO isn't about beating another page based on content length, social metrics, keyword usage, or your number of backlinks. Better organic search visibility will come from beating your competitors with a higher than expected click-through rate.
  • In “Google Organic Click-Through Rates” on Moz, Philip Petrescu shared the following CTR data:
  • The Larry RankBrain Risk Detection Algorithm: just download all of your query data from Webmaster Tools and plot CTR vs. Average Position for the queries you rank for organically (a plotting sketch follows this list).
  • Our research into millions of PPC ads has shown that the single most powerful way to increase CTR in ads is to leverage emotional triggers. Tapping into emotions will get your target customer/audience clicking! Anger. Disgust. Affirmation. Fear. These are some of the most powerful triggers; they not only drive click-through rate but also increase conversion rates.
  • No, you need to combine keywords and emotional triggers to create SEO superstorms that result in ridiculous CTRs
  • Bottom line: Use emotional triggers + keywords in your titles and descriptions if you want your CTR to go from "OK" to great.
  • Bottom line: You must beat the expected CTR for a given organic search position. Optimize for relevance or die.
  • Let's say you work for a tech company. Your visitors, on average, are bouncing away at 80% for the typical session, but users on a competing website are viewing more pages per session and have a bounce rate of just 50%. RankBrain views them as better than you – and they appear above you in the SERPs. In this case, the task completion rate is engagement. Bottom line: If you have high task completion rates, Google will assume your content is relevant. If you have crappy task completion rates, RankBrain will penalize you.
  • 4. Increase Search Volume & CTR Using Social Ads and Display Remarketing
    People who are familiar with your brand are 2x more likely to click on your ads and 2x more likely to convert. We know this because targeting a user who has already visited your website (or app) via RLSA (remarketing lists for search ads) always produces higher CTRs than generically targeting the same keywords to users who are unfamiliar with your brand.
    So, one ingenious method to increase your organic CTRs and beat RankBrain is to bombard your specific target market with Facebook and Twitter ads. Facebook ads are proven to lift mobile search referral traffic volume to advertiser websites (by 6% on average, up to 12.8%; here’s the research). With more than a billion daily users, your audience is definitely using the Social Network. Facebook ads are inexpensive: even spending just $50 on social ads can generate tremendous exposure and awareness of your brand.
    Another relatively inexpensive way to dramatically build brand recognition is to leverage the power of Display Ad remarketing on the Google Display Network. This ensures the visitors you drive from social media ads remember who you are and what you do. In various tests, we found that implementing a display ad remarketing strategy has a dramatic impact on bounce rates and other engagement metrics.
    Bottom line: If you want to increase organic CTRs for your brand or business, make sure people are familiar with your offering. People who are aware of your brand and familiar with what you do will be predisposed to click on your result in the SERP when it matters most, and will have much higher task completion rates after clicking through to your site.
  • UPDATE: As many of us suspected, Google has continued to apply RankBrain to increasing volumes of search queries - so many, in fact, that Google now says its AI processes every query Google handles, which has enormous implications for SEO. As little as a year ago, RankBrain was reportedly handling approximately 15% of Google's total volume of search queries. Now, it's processing all of them. It's still too soon to say precisely what effect this will have on how you should approach SEO, but it's safe to assume that RankBrain will continue to focus on rewarding quality, relevant content. It's also worth noting that, according to Google, RankBrain itself is now the third-most important ranking signal in the larger Google algorithm, meaning that "optimizing" for RankBrain will likely dominate conversations in the SEO space for the foreseeable future. To read more about the scope and potential of RankBrain and its impact on SEO, check out this excellent write-up at Search Engine Land.
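A minimal sketch of the CTR-vs-position plot suggested in the list above, assuming a Search Console query export saved as queries.csv with "Position" and "CTR" columns (column names vary by export version):

```python
import pandas as pd
import matplotlib.pyplot as plt

# Assumed file/columns: a Search Console "Queries" export saved as queries.csv
df = pd.read_csv("queries.csv")
df["CTR"] = df["CTR"].str.rstrip("%").astype(float)  # "4.2%" -> 4.2

plt.scatter(df["Position"], df["CTR"], alpha=0.5)
plt.xlabel("Average position")
plt.ylabel("CTR (%)")
plt.title("Organic CTR vs. average position")
plt.show()
# Points far below their neighbors' curve are the underperformers to retitle.
```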
jack_fox

Taking Advantage of Highlighted Terms In Google Results | SERPWoo - 0 views

  • Highlighted terms tend to appear for acronyms, but Google also highlights related words that are relevant. This helps us as SEOs understand how Google interprets a particular keyword phrase and niche.
  • You should use as many highlighted terms as possible when creating content targeting keywords within your project, so Google understands that your content fits your niche and is as relevant as possible.
  • These terms that Google is suggesting on mobile keywords are under the "More Specific Searches" section of the Rich Data tab and My Keywords Tab. I personally think these suggestions are more powerful than highlighted terms since they are based off of Google's data of what searchers within a location have searched for in the past.
Rob Laporte

Google's December 2020 Core Update Themes - 0 views

  • The data and overall consensus point to Google’s December 2020 Core Update being one of the more impactful algorithm adjustments to hit the SERP over the past year or so.
  • I prefer to look at core updates almost from a pure content and UX perspective. For me, it’s about the specific pages Google swaps out more than it is a per domain analysis.
  • I am performing a qualitative analysis
  • I am not making any sort of definitive statements
  • What moves me, however, is when I look at 100 keywords I start seeing the same content-oriented theme arise again and again.
  • What I’m trying to say, and as you’ll see in the examples I will get into later, is that the content that was more focused on the specific topic mentioned in the query did better. So while the "ultimate guide” here did get to the topic the query deals with, it was not exclusively about that topic.
  • This might call the entire strategy of creating these ultimate guides into question. Perhaps you can’t cast a wide net in that way anymore? Perhaps the “ultimate guide” is only really suitable for people who actually want a broader understanding of a topic? (Crazy to think, I know!)
  • The page from Rocket Mortgage, on the other hand, is only about how much you need for a down payment:
  • So too is the page from Quicken Loans:
  • The Moral of the Story: If I want to understand how much money on average I need to put down when buying a house, or what the various options generally are and what they mean long term, the CFPB page, .gov or not, doesn’t really help me. Its content is not specifically honed in on that particular topic. Again, we have another page that takes a sweeping look at a topic and lost rankings when the query reflected a more specific sort of intent!
  • What’s interesting here is that unlike the previous examples, where too much content resulted in the page’s topical relevance being diluted, the lack of such content here is what I think caused the ranking loss. Look, it’s not bad content. However, it’s pretty much the "general” kind of content you see here, there, and everywhere for all sorts of topics. Just compare it to what the page from the Credit Card Insider offers:
  • This just oozes depth. The third topic on the page alone (6 Ways to Pay Off…) rivals the depth shown on the CreditCards.com page! What differentiates this page from the "guides” shown in the other examples is that this is a guide that drills deep into one topic as opposed to trying to span multiple subtopics. Also, have a look at the formatting, it reminds me of what we saw on the Motley Fool’s page:
  • It’s deep content that is easy to digest. It’s not hard to see why Google swapped these two pages.
  • The Moral of the Story: Exact content relevancy is not only about what topic you talk about. You can be topically aligned but it has to offer real information to the user. It’s even better when that information is digestible. In other words, if you want to rank for a keyword with topic specificity it might be better to create an "ultimate guide” that drills deep into the topic itself versus trying to cover every subtopic under the sun in order to try to rank for more topics with one piece of content.
  • The by-line really sums it up. It tells you this article is about the fact that you most likely won't get addicted to painkillers, but it’s definitely possible so here’s the scoop. To me, it’s far more in line with the average user’s intent of learning about the risks of addiction versus understanding the fine difference between addiction and dependence. It’s the same story with the WebMD page:
  • The Moral of the Story: Again, the issue here is not how authoritative or how substantial the content is. There is no doubt that content from the NIH is both substantial and authoritative. The issue here again seems to relate to Google being better able to show content that is specifically relevant to the nature of the query.
  • First things first, the page doesn’t speak to the query directly. While in the process of learning the difference between sadness and depression one could understand the signs of depression that route is certainly indirect. You could argue that the query how to tell if you have depression could be taken as ‘how do I know if I am just sad or depressed?’ but that really doesn’t seem to be the essential intent here. That topical line (i.e., sadness vs. depression) would most likely produce its own unique query (i.e., am I sad or depressed). From the content shown on the WebMD page, it appears that Google thinks of the intent as understanding the symptoms of depression:
  • The WebMD page, in contradistinction to the MHA page, speaks to the “plain meaning” of the query’s intent: how can you tell if you’re suffering from depression? Aside from that, the WebMD page offers a bit more in terms of substance. While it doesn’t go into great detail per se, it does offer a pretty comprehensive list of items. Compare that to the MHA page which, if you read it, is a bit thin and lacks tremendously in offering much of any detail (even a basic list as seen on the WebMD page). The Moral of the Story: Relevancy is a two-pronged equation (at minimum). It requires the content to be topically focused on the one hand and substantial on the other.
  • I’ve saved the best for last. This is my favorite example that I came across when diving into the December 2020 Core Update. I mean, for crying out loud, we’re talking about the CDC losing rankings in favor of a .org domain I never heard of. How could this be? Let’s understand the intent of the query. If I were searching for this it would be because I found something on my body that I thought might be skin cancer. If I could be so bold, I would imagine that this is why most of us would search for this term. I wouldn’t, and again I imagine most people in most instances wouldn’t search for this in order to understand if regular screening is officially recommended or not. Yet, that is what the CDC page is about:
  • I hate to make assumptions, but I would also think that someone running this query is most likely not interested in the common tests and methods doctors use to determine if skin cancer is present. Yet, this is what the page from Cancer.net focuses on:
  • Again, I would search for this term if I saw something weird on my body that made me think "Holy crap, do I have skin cancer?”. The page from the AOCD is entirely made for people on the verge of freaking out at the possibility of having skin cancer:
  • To me, when you see this page relative to the pages from Cancer.net and the CDC, it’s painfully obvious why this page got the ranking boost.
    The Moral of the Story: Again, I think what has transpired here is painfully obvious. Google has looked past the immediate authority of some of the pages here and has more heavily considered how relevant the content is to the query. As with the cases shown earlier, Google is rewarding content that speaks in a highly focused way to the intent and nature of the query.
    What Was the December 2020 Core Update About? Are you expecting a one-liner that definitively characterizes the December 2020 update? You’re not going to get one from me. This update, like any other, certainly included a whole plethora of different “algorithmic considerations” and themes. That said, from where I sit, while other core updates did things to help put the most authoritative content at the top of the SERP, this update seemed to be more about pure relevancy. Updates of the past have done things to weed out sites using a marketing tone within YMYL informational content, or have rewarded sites that put the right content ahead of their affiliate pursuits. All of that, while part of relevancy, speaks more to a need for something authoritative at the top of the SERP. Seeing so many .gov pages drop in favor of pages from sites like Healthline or WebMD seems to point to the update rewarding relevancy to the nth degree. Perhaps Google felt it had “authority” at the top of the SERP in order, paving the way for a focus on relevance? Who knows. All I can say is that I personally have not seen such a strong focus on pure relevance on page one of the SERP.
    Content Creation Takeaways: Practically speaking, I think the era of broadly reaching pages is quickly coming to an end. That has been the case for a while, but seeing Google pull pages off page one of the SERP because they deal with multiple subtopics is a new level, at least for me. It shows that you have to create content that talks about one topic and one topic only (unless the keyword reflects a specific desire for a broader survey of a topic). I wonder if the idea of having one ultimate guide so as to win numerous keywords should be replaced with multiple posts, where each post deals with one specific topic or subtopic. (If you do that, please don’t create thin content; that is not what I am advocating for.) It’s a rather logical concept: as Google gets better at understanding content, it is going to prefer highly focused content around a specific topic over content of a more broad nature, unless the query specifically shows intent for a general survey of a topic.
jack_fox

The 3-Step SEO Process That Grew Organic Traffic 200% - 0 views

  • use Mobile Moxie’s awesome SERPerator tool to check mobile results from your desktop.
  • The key sign of a “gimme” keyword is when the top results show missed opportunities. You can usually tell this just by skimming:  Does the page lack a sensible heading structure? Is it difficult to read or flooded with ads and pop-ups? Does the content seem too thin (or unnecessarily long)? This technique may involve a bit more leg work on the front end, but you will avoid wasting countless hours targeting irrelevant or high-difficulty keywords.
  • This free entity extraction tool provides semantic topics — people, places, brands, and events — referenced in a document.
  • Using the information from your content analysis, create an SEO outline, and have your company’s SMEs fill it out. This provides the trustworthy content you need, while still giving you control over how the content is written
  • It is perfectly acceptable to include trustworthy research from other sites. Outbound links can help users find out more about a topic and allow them to check your sources.
  • "w keywords in the top 10 results, select Position > Competitors > Top 10. You can also filter Volume to o"
Rob Laporte

How to Use Keyword Clustering to Seamlessly Optimize Your SEO Content - Moz - 0 views

  • Keyword clustering is powerful. The graph below shows one article’s journey in Google SERPs. It ranks for 50 clustered keywords and includes questions from PAA. This article quickly achieved a featured snippet, image rankings, 9.37k clicks, 68.9k impressions, 13.6% CTR and an average of six minutes spent on the page. Oh, and this was achieved before a single website back-linked directly to the article.
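The article doesn't spell out its clustering method, but one common approach is to group keywords whose top-10 results share several URLs; a hedged sketch with toy data:

```python
def cluster_by_serp_overlap(serps, min_shared=3):
    """Greedily group keywords whose top-10 result URLs share >= min_shared pages.

    serps: dict mapping keyword -> set of ranking URLs for that keyword.
    """
    clusters, assigned = [], set()
    for kw in serps:
        if kw in assigned:
            continue
        cluster = {kw} | {other for other in serps
                          if other not in assigned and other != kw
                          and len(serps[kw] & serps[other]) >= min_shared}
        assigned |= cluster
        clusters.append(cluster)
    return clusters

serps = {  # toy data; real sets would come from a rank tracker
    "seo content": {"a.com", "b.com", "c.com", "d.com"},
    "seo writing": {"a.com", "b.com", "c.com", "e.com"},
    "link building": {"x.com", "y.com", "z.com"},
}
print(cluster_by_serp_overlap(serps))  # [{'seo content', 'seo writing'}, {'link building'}]
```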
Jennifer Williams

Do Social Media Links Translate In Organic Rankings? - 0 views

  • Social media links (when not part of a larger strategy) are most effective for mid- to long-tail keywords. The head keywords were dominated by bigger brand domains with more domain trust and inbound links. Getting your most desirable keywords into the Digg title is crucial since subsequent links will use it as the anchor text. Links and rankings gained from social media “stick”.
Rob Laporte

Google Webmaster Tools is Incorrectly Displaying Keyword Positions - 0 views

  • October 20, 2008. A WebmasterWorld member reports that he was dependent on the Top Search Queries report in Google Webmaster Tools and has found it to be providing incorrect data: another rank checker showed no such rankings, and there were no visitors to the page. This is likely to be a bug, according to Tedster: "Webmaster Tools reports of all kinds are known to contain wrong information at times. This kind of wrong information would be particularly disturbing, but in any big system errors do creep in. The evidence of your own server logs is more dependable." He adds that it's possible the ranking is achievable: "[M]aybe the WMT report is pulling the position information before some filter is applied to come up with the final rankings. Even though that would certainly be buggy behavior, it might accidentally be showing you that your url COULD rank that well, if only you weren't tripping some kind of filter." Still, the tool in Google's backend is misleading. Would you consider this a bug? On a related note, the Official Google Webmaster Central Blog says this could be an issue with the kind of data WMT sees. They suggest adding the www and non-www versions of the same site to Webmaster Central, doing a site: search to look for any anomalies, setting your preferred domain, and setting a site-wide 301 redirect to www or non-www. This is probably not applicable to the reporting issue in WebmasterWorld, though it may be related to other issues within Google Webmaster Tools. Forum discussion continues at WebmasterWorld.