
DISC Inc: Group items tagged SEO



SEOmoz | The Disconnect in PPC vs. SEO Spending - 0 views

  • The Disconnect in PPC vs. SEO Spending. Posted by randfish on Tue (10/21/08) at 12:21 AM. Paid Search Ads: There's a big disconnect in the way marketing dollars are allocated to search engine focused campaigns. Let me highlight: "Not surprisingly, search advertising should continue to be the largest category, growing from $9.1 billion in 2007 to $20.9 billion in 2013." - Source: C|Net News, June 30, 2008. OK. So companies in the US spent $10 billion last year on paid search ads, and even more this year. How about SEO? SEO: $1.3 billion (11%) - Source: SEMPO data via Massimo Burgio, SMX Madrid 2008. According to SEMPO's data, it's 11% for SEO and 87% for PPC (with another 1.4% for SEM technologies). Now let's turn to Enquiro: Organic ranking visibility (percentage of participants looking at a listing in this location): Rank 1 - 100%, Rank 2 - 100%, Rank 3 - 100%, Rank 4 - 85%, Rank 5 - 60%, Rank 6 - 50%, Rank 7 - 50%, Rank 8 - 30%, Rank 9 - 30%, Rank 10 - 20%. Side sponsored ad visibility (percentage of participants looking at an ad in this location): 1 - 50%, 2 - 40%, 3 - 30%, 4 - 20%, 5 - 10%, 6 - 10%, 7 - 10%, 8 - 10%. Fascinating. So visibility is considerably higher for the organic results. What about clicks? Thanks to Comscore, we can see that clicks on paid search results have gone down over time, and now sit at ~22%. Conclusions: SEO drives 75%+ of all search traffic, yet garners less than 15% of marketing budgets for SEM campaigns. PPC receives less than 25% of all search traffic, yet earns 80%+ of SEM campaign budgets. Questions: * Why does paid search earn so many more marketing dollar
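A quick way to see the disconnect the post describes is to compare budget share with traffic share per channel. The sketch below is illustrative only, using the rounded share figures quoted in the excerpt rather than data from any real campaign.

```python
# Rough, illustrative math using the share figures quoted in the excerpt.
channels = {
    "SEO": {"traffic_share": 0.75, "budget_share": 0.15},
    "PPC": {"traffic_share": 0.25, "budget_share": 0.80},
}

for name, c in channels.items():
    ratio = c["budget_share"] / c["traffic_share"]
    print(f"{name}: {ratio:.2f} units of budget per unit of traffic share")

# SEO: 0.20, PPC: 3.20 -- i.e. under these assumptions paid search absorbs
# roughly 16x more budget per unit of traffic share than organic.
```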

Two Ways To Justify SEO In Uncertain Times - 0 views

  • Oct 22, 2008 at 10:55am Eastern by Paul Bruemmer. Two Ways To Justify SEO In Uncertain Times. In House - A Column From Search Engine Land. During uncertain economic times like these, our advice is to always stick with the fundamentals to maintain business efficiency and progress. No matter what your business model, performing the fundamentals will keep you on-track and in-line for leveraging future success. If the C-level executives in your company are having any doubts about the value of SEO and are hesitating to release more funding, it's time to perform a cost-benefit exercise. It's your job as an in-house SEO manager to reestablish their confidence in the value of SEO as well as your value and the value of your team. When funding gets in the way, having a narrow focus, putting it on the table, and describing company goals you are committed to are all very important. 1) Leverage Your Paid Search Data. To demonstrate implicit value for SEO, start with a baseline. Show where your key terms currently rank in organic and multiply by the cost-per-click value. Run the numbers for the value of direct clicks with high search intent. One way to go about this is to calculate an Effective Cost-Per-Click (eCPC) for your organic listings: 1. Access the Keyword Tool within your Google AdWords account. 2. Type your best performing (for instance, 20) keywords. 3. Select descriptive words or phrases and synonyms. 4. Click Get Keyword Ideas. This will produce a report; select Exact within the "Match Type" field and click on Approx Avg Search Volume. 1. Look at the Cost-Per-Click column to acquire the CPC value (let's assume it's $2.00). 2. Go to your web analytics data and identify the number of organic clicks for these keywords (let's assume 20,000/month). 3. Multiply the two (CPC times the number of organic clicks (in this case $40,000/mo)). 4. Create a spreadsheet with your best performing keywords and make the statement, "if we
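The eCPC justification in the column boils down to one multiplication. A minimal sketch, using the article's example figures ($2.00 CPC, 20,000 organic clicks/month) rather than real account data:

```python
# Effective CPC math from the column: what the organic clicks on your best
# keywords would cost if you had to buy them as paid clicks.
avg_cpc = 2.00           # average AdWords CPC for the keyword set ($), example figure
organic_clicks = 20_000  # monthly organic clicks on those keywords, example figure

equivalent_paid_value = avg_cpc * organic_clicks
print(f"Organic traffic on these keywords is worth ~${equivalent_paid_value:,.0f}/month at paid rates")
# -> ~$40,000/month, matching the article's example
```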

SEOmoz | Announcing SEOmoz's Index of the Web and the Launch of our Linkscape Tool - 0 views

  • After 12 long months of brainstorming, testing, developing, and analyzing, the wait is finally over. Today, I'm ecstatic to announce some very big developments here at SEOmoz. They include: * An Index of the World Wide Web - 30 billion pages (and growing!), refreshed monthly, built to help SEOs and businesses acquire greater intelligence about the Internet's vast landscape * Linkscape - a tool enabling online access to the link data provided by our web index, including ordered, searchable lists of links for sites & pages, and metrics to help judge their value. * A Fresh Design - that gives SEOmoz a more usable, enjoyable, and consistent browsing experience * New Features for PRO Membership - including more membership options, credits to run advanced Linkscape reports (for all PRO members), and more. Since there's an incredible amount of material, I'll do my best to explain things clearly and concisely, covering each of the big changes. If you're feeling more visual, you can also check out our Linkscape comic, which introduces the web index and tool in a more humorous fashion: Check out the Linkscape Comic. SEOmoz's Index of the Web: For too long, data that is essential to the practice of search engine optimization has been inaccessible to all but a handful of search engineers. The connections between pages (links) and the relationship between links, URLs, and the web as a whole (link metrics) play a critical role in how search engines analyze the web and judge individual sites and pages. Professional SEOs and site owners of all kinds deserve to know more about how their properties are being referenced in such a system. We believe there are thousands of valuable applications for this data and have already put some effort into retrieving a few fascinating statistics: * Across the web, 58% of all links are to internal pages on the same domain, 42% point to pages off the linking site. * 1.83%

Google Says Domain Registrations Don't Affect SEO, Or Do They? - 0 views

  • Google Says Domain Registrations Don't Affect SEO, Or Do They? Sep 9, 2009 at 2:01pm ET by Matt McGee. Over at Search Engine Roundtable today, Barry Schwartz writes about the latest comments from Google about domain registration and its impact on SEO/search rankings. In this case, it's Google employee John Mueller suggesting in a Google Webmaster Help forum thread that Google doesn't look at the length of a domain registration: A bunch of TLDs do not publish expiration dates - how could we compare domains with expiration dates to domains without that information? It seems that would be pretty hard, and likely not worth the trouble. Even when we do have that data, what would it tell us when comparing sites that are otherwise equivalent? A year (the minimum duration, as far as I know) is pretty long in internet-time :-). But let's look at some more evidence. Earlier this year, Danny spoke with Google's Matt Cutts about a variety of domain/link/SEO issues. In light of the claims from domain registrars that longer domain registrations are good for SEO, Danny specifically asked "Does Domain Registration Length Matter?" Matt's reply: To the best of my knowledge, no search engine has ever confirmed that they use length-of-registration as a factor in scoring. If a company is asserting that as a fact, that would be troubling. But wait, there's more! Shortly after the Q&A with Danny that we posted here, Matt published more thoughts on the matter in a video on the Google Webmaster Central Channel on YouTube. If you don't have time to watch the video, Matt says, "My short answer is not to worry very much about that [the number of years a domain is registered], not very much at all." He reiterates that the domain registrar claims "are not based on anything we said," and talks about a Google "historical data" patent that may or may not be part of Google's algorithm. He sums it up by saying, "make great content, don't worry nea

SEOmoz Crawls Web To Expand SEO Toolset - 0 views

  • Oct 6, 2008 at 8:06am Eastern by Barry Schwartz    SEOmoz Crawls Web To Expand SEO Toolset Rand Fishkin of SEOmoz announced they have been working for about a year on building out an index of the web, in order to be able to provide SEOs and SEMs with a toolset they have never had before. SEOmoz has crawled and built a 30 billion page index of the web. Rand explains this index is still growing and is refreshed monthly. The purpose, “to help SEOs and businesses acquire greater intelligence about the Internet’s vast landscape.” Part of the indexing was to build out a new tool named Linkscape. Linkscape gives SEOs “online access to the link data provided by our web index, including ordered, searchable lists of links for sites & pages, and metrics to help judge their value,” said Rand. I hope to play with it more after the SMX East conference, but with a quick trial, it seems pretty comprehensive. SEOmoz also launched a new design and has given PRO members more options and features. To read all about these features and the tools, see Rand’s post.

SEOmoz | SEO Pricing & Costs - What Should You Charge / How Much Should You Pay? - 0 views

  • I wanted to explore the world of SEO pricing models from both sides of the issue, so let's dive right in. First off, we'll take a look at how SEO companies commonly price their services, then look at how businesses and organizations should expect to pay for SEO. The 7 Most Popular SEO Pricing Models: * Hourly Consulting. The simplest way to price a project is to charge by the hour. Rates in SEO vary, with the lowest, entry-level folks around $40-50, mid-tier consultants around $100-$200, and high-demand firms & people from $300-500. SEOmoz is obviously actively trying to limit our clients by going way outside the norm and charging $1000 / hour.

65+ Best Free SEO Chrome Extensions (As Voted-for by SEO Community) - 1 views

  • Link Redirect Trace — Uncovers all URLs in a redirect chain including 301’s, 302’s, etc. Very useful for finding (and regaining) lost “link juice,” amongst other things. Other similar extensions: Redirect Path
  • Scraper — Scrape data from any web page using XPath or jQuery. Integrates with Google Sheets for one-click export to a spreadsheet. Or you can copy to clipboard and paste into Excel. Other similar extensions: Data Scraper — Easy Web Scraping, XPather
  • Tag Assistant (by Google) — Check for the correct installation of Google tags (e.g. Google Analytics, Tag Manager, etc) on any website. Also, record typical user flows on your website to diagnose and fix implementation errors.
  • Web Developer — Adds a web developer toolbar to Chrome. Use it to check how your website looks on different screen sizes, find images with missing alt text, and more.
  • WhatRuns — Instantly discover what runs any website. It uncovers the CMS, plugins, themes, ad networks, fonts, frameworks, analytics tools, everything.
  • Page Load Time — Measures and displays page load time in the toolbar. Also breaks down this metric by event to give you deeper insights. Simple, but very useful.
  • FATRANK — Tells you where the webpage you’re visiting ranks in Google for any keyword/phrase.
  • SEOStack Keyword Tool — Finds thousands of low-competition, long-tail keywords in seconds. It does this by scraping Google, Youtube, Bing, Yahoo, Amazon, and eBay. All data can be exported to CSV.
  • Window Resizer — Resize your browser window to see how a website looks on screens of different sizes. It has one-click emulation for popular sizes/resolutions (e.g. iPhone, iPad, laptop, desktop, etc).
  • Ghostery — Tells you how websites are tracking you (e.g. Facebook Custom Audiences, Google Analytics, etc) and blocks them. Very useful for regaining privacy. Plus, websites generally load faster when they don’t need to load tracking technologies.
  • Ayima Page Insights — Uncovers technical and on-page issues for any web page. It also connects to Google Search Console for additional insights on your web properties.
  • ObservePoint TagDebugger — Audit and debug issues with website tags (e.g. Google Analytics, Tag Manager, etc) on your websites. Also checks variables and on-click events. Other similar extensions: Event Tracking Tracker
  • The Tech SEO — Quick Click Website Audit — Provides pre-formatted links (for the current URL) to a bunch of popular SEO tools. A very underrated tool that reduces the need for mundane copy/pasting.
  • User-Agent Switcher for Chrome — Mimic user-agents to check that your website displays correctly in different browsers and/or OS’.
  • Portent’s SEO Page Review — Reviews the current page and kicks back a bunch of data including meta tags, canonicals, outbound links, H1-H6 tags, OpenGraph tags, and more.
  • FindLinks — Highlights all clickable links/elements on a web page in bright yellow. Very useful for finding links on websites with weird CSS styling.
  • SERPTrends SEO Extension — Tracks your Google, Bing, and Yahoo searches. Then, if you perform the same search again, it shows ranking movements directly in the SERPs.
  • SimilarTech Prospecting — Discovers a ton of useful information about the website you’re visiting. This includes estimated monthly traffic, company information, social profiles, web technologies, etc.
  • SEO Search Simulator by Nightwatch — Emulates Google searches from any location. Very useful for seeing how rankings vary for a particular query in different parts of the world.
  • "Find Out How Much Traffic a Website Gets: 3 Ways Compared"

11 Little-Known Features In The SEO Spider | Screaming Frog - 0 views

  • If you need to crawl millions of URLs using a desktop crawler, you really can. You don’t need to keep increasing RAM to do it either; switch to database storage instead.
  • If you’re auditing an HTTP to HTTPS migration which has HSTS enabled, you’ll want to check the underlying ‘real’ sitewide redirect status code in place (and find out whether it’s a 301 redirect). Therefore, you can choose to disable HSTS policy by unticking the ‘Respect HSTS Policy’
  • For macOS, to open additional instances of the SEO Spider open a Terminal and type the following: open -n /Applications/Screaming\ Frog\ SEO\ Spider.app/ You can now perform multiple crawls, or compare multiple crawls at the same time.
  • ...3 more annotations...
  • Occasionally it can be useful to crawl URLs with fragments (/page-name/#this-is-a-fragment) when auditing a website, and by default the SEO Spider will crawl them in JavaScript rendering mode.
  • While this can be helpful, the search engines will obviously ignore anything from the fragment and crawl and index the URL without it. Therefore, generally you may wish to switch off this behaviour using the ‘Regex replace’ feature in URL Rewriting. Simply include #.* within the ‘regex’ field and leave the ‘replace’ field blank (a short illustration follows these notes).
  • Saving HTML & Rendered HTML To Help Debugging. We occasionally receive support queries from users reporting a missing page title, description, canonical or on-page content that’s seemingly not being picked up by the SEO Spider, but can be seen to exist in a browser, and when viewing the HTML source. Often this is assumed to be a bug of some kind, but most of the time it’s just down to the site responding differently to a request made from a browser rather than the SEO Spider, based upon the user-agent, accept-language header, whether cookies are accepted, or if the server is under load, as examples.
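As a rough illustration of what that URL Rewriting rule does (not Screaming Frog's own code, just the same "#.*" regex idea applied in Python to assumed example URLs):

```python
# The "#.*" pattern with an empty replacement strips everything from the
# fragment onwards, so fragment URLs collapse back to the indexable URL.
import re

urls = [
    "https://example.com/page-name/#this-is-a-fragment",
    "https://example.com/other-page/",
]
print([re.sub(r"#.*", "", u) for u in urls])
# -> ['https://example.com/page-name/', 'https://example.com/other-page/']
```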

Nofollow Link Social Media | SEO Training - 0 views

  • The Nofollow Link & Social Media. Published by Your SEO Mentor under SEO, Social Media, Aug 23 2008. There have been a lot of questions about how Social Media is affecting the SEO industry. The question I would like to ask is how it can help the SEO industry and how it will affect the SEO for my clients' sites and my own. The major issue with Social Media sites and how they play a role in your SEO these days is that a majority of them (especially the big boys, e.g. Twitter) use the nofollow link. "Well, you're asking, what does this mean and why do I need to worry about it?" First of all, don't worry about it; this is not the end of the world. But what it means is that going to all these major social media and networking sites and linking back to your website will, for the most part, have no effect on your search engine results. The nofollow link (e.g. rel="nofollow" on an anchor tag) was originally created to block search engines from following links in blog comments, due to the very high amount of blog comment spamming. The wonderful Wikipedia definition says, "nofollow is an HTML attribute value used to instruct some search engines that a hyperlink should not influence the link target's ranking in the search engine's index. It is intended to reduce the effectiveness of certain types of search engine spam, thereby improving the quality of search engine results and preventing spamdexing from occurring in the first place." With Social Media sites popping up daily, and with them being very easy places to post user-generated content and links, spammers began the same old routine, and we suffer from their actions. The top social media and networking sites quickly found that they too needed to use the nofollow attribute to help reduce the amount of spam submitted. So for the most part, placing a link on Social Media sites will not directly help your search engine optimization efforts. That doesn't mean Social Media can not help in gaining valuable links to
  • Current Top 20 Social Bookmarking sites that Dofollow

Wake Up SEOs, the New Google is Here | SEOmoz - 0 views

  • Rel="author" and rel="publisher" are the solution Google is adopting in order to better control, among other things, the spam pollution of the SERPs. If you are a blogger, you are incentivized to mark up your content with Author and link it to your G+ Profile, and as a site, you are incentivized to create your G+ Business page and to promote it with a badge on your site that has rel="publisher" in its code. Trusted seeds are no longer only sites, but can also be persons (i.e. Rand or Danny Sullivan) or social facets of an entity… so the closer I am in the Social Graph to those persons/entities, the more trusted I am in Google's eyes. As we can see, Google is not trying to rely only on the link graph, as it is quite easy to game, but it is not simply adding social signals to the link graph either, because they too can be gamed. What Google is doing is creating and refining a new graph in which the Link graph, Social graph and Trust graph cooperate, and which is possibly harder to game. It can still be gamed, but - hopefully - only with so much effort that it may become non-viable as a practice. Wake up SEOs, the new Google is here. As a conclusion, let me borrow what Larry Page wrote on Google+ (bold is mine): Our ultimate ambition is to transform the overall Google experience […] because we understand what you want and can deliver it instantly. This means baking identity and sharing into all of our products so that we build a real relationship with our users. Sharing on the web will be like sharing in real life across all your stuff. You'll have better, more relevant search results and ads. Think about it this way … last quarter, we've shipped the +, and now we're going to ship the Google part. I think that says it all, and what we have lived through this past year is explained clearly by Larry Page's words. What can we do as SEOs? Evolve, because SEO is not dying, but SEOs can if they don't assume that winter - oops - the

70+ Best Free SEO Tools (As Voted-for by the SEO Community) - 1 views

  • Soovle — Scrapes Google, Bing, Yahoo, Wikipedia, Amazon, YouTube, and Answers.com to generate hundreds of keyword ideas from a seed keyword. Very powerful tool, although the UI could do with some work.
  • Hemingway Editor — Improves the clarity of your writing by highlighting difficult to read sentences, “weak” words, and so forth. A must-have tool for bloggers (I use it myself).
  • Yandex Metrica — 100% free web analytics software. Includes heat maps, form analytics, session reply, and many other features you typically wouldn’t see in a free tool.
  • For example, two of my all-time favourite tools are gInfinity (Chrome extension) and Chris Ainsworth’s SERPs extraction bookmarklet. By combining these two free tools, you can extract multiple pages of the SERPs (with meta titles + descriptions) in seconds.
  • Keyword Mixer — Combine your existing keywords in different ways to try and find better alternatives. Also useful for removing duplicates from your keywords list. Note: MergeWords does (almost) exactly the same job albeit with a cleaner UI. However, there is no option to de-dupe the list.
  • LSIgraph.com — Latent Semantic Indexing (LSI) keywords generator. Enter a seed keyword, and it’ll generate a list of LSI keywords (i.e. keywords and topics semantically related to your seed keyword). TextOptimizer is another very similar tool that does roughly the same job.
  • Small SEO Tools Plagiarism Checker — Detects plagiarism by scanning billions of documents across the web. Useful for finding those who’ve stolen/copied your work without attribution.
  • iSearchFrom.com — Emulate a Google search using any location, device, or language. You can customise everything from SafeSearch settings to personalised search.
  • Delim.co — Convert a comma-delimited list (i.e. CSV) in seconds. Not necessarily an SEO tool per se but definitely very useful for many SEO-related tasks.
  • Am I Responsive? — Checks website responsiveness by showing you how it looks on desktop, laptop, tablet, and mobile.
  • SERPLab — Free Google rankings checker. Updates up to 50 keywords once every 24 hours (server permitting).
  • Varvy — Checks whether a web page is following Google’s guidelines. If your website falls short, it tells you what needs fixing.
  • JSON-LD Schema Generator — JSON-LD schema markup generator. It currently supports six markup types including: product, local business, event, and organization.
  • KnowEm Social Media Optimizer — Analyses your web page to see if it’s well-optimised for social sharing. It checks for markup from Facebook, Google+, Twitter, and LinkedIn.
  • Where Goes? — Shows you the entire path of meta-refreshes and redirects for any URL. Very useful for diagnosing link issues (e.g. complex redirect chains).
  • Google Business Review Link Generator — Generates a direct link to your Google Business listing. You can choose between a link to all current Google reviews, or to a pre-filled 5-star review box.
  • PublicWWW — Searches the web for pages using source code-based footprints. Useful for finding your competitors affiliates, websites with the same Google Analytics code, and more.
  • Keywordtool.io — Scrapes Google Autosuggest to generate 750 keyword suggestions from one seed keyword. It can also generate keyword suggestions for YouTube, Bing, Amazon, and more.
  • SERPWatcher — Rank tracking tool with a few unique metrics (e.g. “dominance index”). It also shows estimated visits and ranking distribution charts, amongst other things.
  • GTMetrix — Industry-leading tool for analysing the loading speed of your website. It also gives actionable recommendations on how to make your website faster.
  • Mondovo — A suite of SEO tools covering everything from keyword research to rank tracking. It also generates various SEO reports.
  • SEO Site Checkup — Analyse various on-page/technical SEO issues, monitor rankings, analyse competitors, create custom white-label reports, and more.

SEO Clients Report 2021: What Do Clients Want from SEO? - 0 views

  • Content marketing is the most sought-after SEO service for 31.3% of SEO pros, followed by keyword strategy (30.8%) and web design (25.5%)
  • By tracking what SEO clients are asking their agencies for, we can begin to see where there is a demand for services and potential gaps to fulfill.
  • 1. Content strategy
  • Even the biggest retail brands have this challenge because Local can be a royal PITA
  • They often ignore the locations and are missing out on a huge amount of potential local search revenue

The Real Impact of Mobile-First Indexing & The Importance of Fraggles - Moz - 0 views

  • We have also recently discovered that Google has begun to index URLs with a # jump-link, after years of not doing so, and is reporting on them separately from the primary URL in Search Console. As you can see below from our data, they aren't getting a lot of clicks, but they are getting impressions. This is likely because of the low average position. 
  • Start to think of GMB as a social network or newsletter — any assets that are shared on Facebook or Twitter can also be shared on Google Posts, or at least uploaded to the GMB account.
  • You should also investigate the current Knowledge Graph entries that are related to your industry, and work to become associated with recognized companies or entities in that industry. This could be from links or citations on the entity websites, but it can also include being linked by third-party lists that give industry-specific advice and recommendations, such as being listed among the top competitors in your industry ("Best Plumbers in Denver," "Best Shoe Deals on the Web," or "Top 15 Best Reality TV Shows"). Links from these posts also help but are not required — especially if you can get your company name on enough lists with the other top players. Verify that any links or citations from authoritative third-party sites like Wikipedia, Better Business Bureau, industry directories, and lists are all pointing to live, active, relevant pages on the site, and not going through a 301 redirect. While this is just speculation and not a proven SEO strategy, you might also want to make sure that your domain is correctly classified in Google’s records by checking the industries that it is associated with. You can do so in Google’s MarketFinder tool. Make updates or recommend new categories as necessary. Then, look into the filters and relationships that are given as part of Knowledge Graph entries and make sure you are using the topic and filter words as keywords on your site.
  • The biggest problem for SEOs is the missing organic traffic, but it is also the fact that current methods of tracking organic results generally don’t show whether things like Knowledge Graph, Featured Snippets, PAA, Found on the Web, or other types of results are appearing at the top of the query or somewhere above your organic result. Position one in organic results is not what it used to be, nor is anything below it, so you can’t expect those rankings to drive the same traffic. If Google is going to be lifting and representing everyone’s content, the traffic will never arrive at the site and SEOs won’t know if their efforts are still returning the same monetary value. This problem is especially poignant for publishers, who have only been able to sell advertising on their websites based on the expected traffic that the website could drive. The other thing to remember is that results differ — especially on mobile, which varies from device to device (generally based on screen size) but also can vary based on the phone OS. They can also change significantly based on the location or the language settings of the phone, and they definitely do not always match with desktop results for the same query. Most SEOs don't know much about the reality of their mobile search results because most SEO reporting tools still focus heavily on desktop results, even though Google has switched to Mobile-First. As well, SEO tools generally only report on rankings from one location — the location of their servers — rather than being able to test from different locations.
  • The only thing that good SEOs can do to address this problem is to use tools like the MobileMoxie SERP Test to check what rankings look like on top keywords from all the locations where their users may be searching. While the free tool only provides results for one location at a time, subscribers can test search results in multiple locations, based on a service-area radius or based on an uploaded CSV of addresses. The tool has integrations with Google Sheets, and a connector with Data Studio, to help with SEO reporting, but APIs are also available for deeper integrations in content editing tools, dashboards, and for use within other SEO tools.
  • Fraggles and Fraggled indexing re-frames the switch to Mobile-First Indexing, which means that SEOs and SEO tool companies need to start thinking mobile-first — i.e. the portability of their information. While it is likely that pages and domains still carry strong ranking signals, the changes in the SERP all seem to focus less on entire pages, and more on pieces of pages, similar to the ones surfaced in Featured Snippets, PAAs, and some Related Searches. If Google focuses more on windowing content and being an "answer engine" instead of a "search engine," then this fits well with their stated identity, and their desire to build a more efficient, sustainable, international engine.

RankBrain Judgment Day: 4 SEO Strategies You'll Need to Survive | WordStream - 0 views

  • The future of SEO isn't about beating another page based on content length, social metrics, keyword usage, or your number of backlinks. Better organic search visibility will come from beating your competitors with a higher than expected click-through rate.
  • In “Google Organic Click-Through Rates” on Moz, Philip Petrescu shared CTR data broken down by organic ranking position.
  • The Larry RankBrain Risk Detection Algorithm: just download all of your query data from Webmaster Tools and plot CTR vs. Average Position for the queries you rank for organically (a minimal plotting sketch follows this list).
  • Our research into millions of PPC ads has shown that the single most powerful way to increase CTR in ads is to leverage emotional triggers. Tapping into emotions will get your target customer/audience clicking! Anger. Disgust. Affirmation. Fear. These are some of the most powerful triggers that not only drive click-through rate, but also increase conversion rates.
  • No, you need to combine keywords and emotional triggers to create SEO superstorms that result in ridiculous CTRs
  • Bottom line: Use emotional triggers + keywords in your titles and descriptions if you want your CTR to go from "OK" to great.
  • Bottom line: You must beat the expected CTR for a given organic search position. Optimize for relevance or die.
  • Let's say you work for a tech company. Your visitors, on average, are bouncing away at 80% for the typical session, but users on a competing website are viewing more pages per session and have a bounce rate of just 50%. RankBrain views them as better than you – and they appear above you in the SERPs. In this case, the task completion rate is engagement. Bottom line: If you have high task completion rates, Google will assume your content is relevant. If you have crappy task completion rates, RankBrain will penalize you.
  • 4. Increase Search Volume & CTR Using Social Ads and Display Remarketing. People who are familiar with your brand are 2x more likely to click on your ads and 2x more likely to convert. We know this because targeting a user who has already visited your website (or app) via RLSA (remarketing lists for search ads) always produces higher CTRs than generically targeting the same keywords to users who are unfamiliar with your brand. So, one ingenious method to increase your organic CTRs and beat RankBrain is to bombard your specific target market with Facebook and Twitter ads. Facebook ads are proven to lift mobile search referral traffic volume to advertiser websites (by 6% on average, up to 12.8%) (here’s the research). With more than a billion daily users, your audience is definitely using the Social Network. Facebook ads are inexpensive – even spending just $50 on social ads can generate tremendous exposure and awareness of your brand. Another relatively inexpensive way to dramatically build up brand recognition is to leverage the power of Display Ad remarketing on the Google Display Network. This will ensure the visitors you drive from social media ads remember who you are and what it is you do. In various tests, we found that implementing a display ad remarketing strategy has a dramatic impact on bounce rates and other engagement metrics. Bottom line: If you want to increase organic CTRs for your brand or business, make sure people are familiar with your offering. People who are more aware of your brand and become familiar with what you do will be predisposed to click on your result in SERP when it matters most, and will have much higher task completion rates after having clicked through to your site.
  • UPDATE: As many of us suspected, Google has continued to apply RankBrain to increasing volumes of search queries - so many, in fact, that Google now says its AI processes every query Google handles, which has enormous implications for SEO. As little as a year ago, RankBrain was reportedly handling approximately 15% of Google's total volume of search queries. Now, it's processing all of them. It's still too soon to say precisely what effect this will have on how you should approach SEO, but it's safe to assume that RankBrain will continue to focus on rewarding quality, relevant content. It's also worth noting that, according to Google, RankBrain itself is now the third-most important ranking signal in the larger Google algorithm, meaning that "optimizing" for RankBrain will likely dominate conversations in the SEO space for the foreseeable future. To read more about the scope and potential of RankBrain and its impact on SEO, check out this excellent write-up at Search Engine Land.
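The CTR-vs-position plot mentioned in the "Risk Detection" point above is easy to reproduce from a Search Console / Webmaster Tools query export. A minimal sketch, assuming a CSV named queries.csv with query, ctr, and position columns (adjust the file name and columns to match your export):

```python
# Plot CTR against average organic position to spot queries whose CTR falls
# well below what their position would normally earn.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("queries.csv")   # assumed columns: query, ctr, position

plt.scatter(df["position"], df["ctr"], alpha=0.5)
plt.xlabel("Average organic position")
plt.ylabel("CTR (%)")
plt.title("CTR vs. average position")
plt.show()
```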

Common Shopify SEO pitfalls and how to avoid them - 0 views

  • No control over your robots.txt file. The problem: Shopify does not allow store owners to edit their robots.txt file. This is an issue because the platform creates duplicate URLs for products associated with a collection/category page. “The ideal solution would be to use robots.txt disallow directives to block these pages from being crawled in the first place,” Kevin Wallner, founder of First Chair Digital, told Search Engine Land, noting that, while Shopify does add canonical tags pointing back to the correct product URL, this does not prevent the duplicate URLs from being crawled and potentially indexed. Solutions: Editing your Shopify theme, as discussed in our technical SEO for Shopify guide, is one way to resolve this issue. Alternatively, pages not included in your robots.txt file can be hidden from search engines by customizing the <head> section of your theme’s layout file, as detailed on this Shopify help page. You can also use an app such as Sitemap & NoIndex Manager to add noindex tags and remove URLs from your sitemap, Wallner suggested. “Unfortunately this won’t work for duplicate product URLs, but it works for several other special Shopify page types with little to no SEO value, so it’s still a good move,” he said. Related: Shopify SEO Guide: How to increase organic traffic to your store. Wallner also advised that store owners avoid linking to duplicate URLs in their header, footer, sidebars, breadcrumbs and within the text on their pages. If particular pages have earned important backlinks, store owners can also get in touch with webmasters to request that they link to the preferred URL.
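To make the duplicate-URL issue concrete, here is a small, illustrative Python sketch. The store domain is invented, and it assumes Shopify's usual /collections/&lt;collection&gt;/products/&lt;handle&gt; pattern; it simply shows which canonical URL the platform's canonical tag points back at for a collection-scoped duplicate.

```python
# Derive the canonical /products/ URL from a collection-scoped duplicate.
import re

def canonical_product_url(url: str) -> str:
    # Drop the /collections/<collection> prefix that creates the duplicate.
    return re.sub(r"/collections/[^/]+(?=/products/)", "", url)

duplicate = "https://example-store.com/collections/sale/products/blue-shirt"
print(canonical_product_url(duplicate))
# -> https://example-store.com/products/blue-shirt
```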

Google Openly Profiles SEOs As Criminals - 2 views

  • If we can stop talking about nofollow and PageRank sculpting for a second, maybe we can openly talk about the bigger story of last week’s SMX Advanced. The one that has to do with Matt Cutts taking the stage during the You&A and openly stating that Google profiles SEOs like common criminals. I was naïve in my youth. I’d read blog posts that accused Google of “having it out” for SEOs and laugh. There’d be rants about how Google was stricter on sites that were clearly touched by an SEO and how SEOs were dumb for “self-identifying” with attributes like nofollow. At the time, I thought these people were insane. Now I know they were right. Google does profile SEOs. They’re identified as “high risk” and so are all of their associated projects.
  • Interesting... further strengthens the position that "content is King" and we should continue to encourage clients in that direction. Value to the audience first, play nice with the search engines second.

Google; You can put 50 words in your title tag, we'll read it | Hobo - 0 views

  • Google; You can put 50 words in your title tag, we'll read it. Blurb by Shaun Anderson. Note - This is a test, testing title tags in Google. Consider also Google Title Tag Best Practice. We recently tested "how many keywords will Google read in the title tag / element?" using our simple SEO mythbuster test (number 2 in the series). And here are the results, which are quite surprising. First - here's the test title tag we tried to get Google to swallow. And it did. All of it. Even though it was a bit spammy: HoboA HoboB HoboC HoboD HoboE HoboF HoboG HoboH HoboI HoboJ HoboK HoboL HoboM HoboN HoboO HoboP HoboQ HoboR HoboS HoboT HoboU HoboV HoboW HoboX HoboY Hob10 Hob20 Hob30 Hob40 Hob50 Hob60 Hob70 Hob80 Hob90 Hob11 Hob12 Hob13 Hob14 Hob15 Hob16 Hob17 Hob18 Hob19 Hob1a Hob1b Hob1c Hob1d Hob1e Hob1f Hob1g Hob1h. Using a keyword search - hoboA Hob1h - we were surprised to see Google returned our page. We also tested it using - Hob1g Hob1h - the keywords right at the end of the title - and again our page was returned. So that's 51 words, and 255 characters without spaces, 305 characters with spaces, at least! It seems clear Google will read just about anything these days! Update: Qwerty pointed out an interesting fact about the intitle: operator in Google. Results with the intitle: command came back as expected for the early words, but further along the sequence Google returned an unexpected result (screenshots in the original post). So what does this tell us? Google seems to stop at the 12th word on this page, at least when returning results using the intitle: operator. Another interesting observation. Thanks Qwerty. We're obviously not sure what benefit a title tag with this many keywords in it has for your page, in terms of keyword density / dilution, and "clickability" in the search engine results pages (SERPs). 50+ words is certainly not best practice! When creating your title tag bear in
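As a sanity check on the excerpt's arithmetic, the test title can be reconstructed and counted in a few lines. This just verifies the 51 / 255 / 305 figures quoted above; it is not part of the original test.

```python
# Rebuild the 51-token test title and confirm the word and character counts.
words = (
    [f"Hobo{c}" for c in "ABCDEFGHIJKLMNOPQRSTUVWXY"]   # HoboA..HoboY (25)
    + [f"Hob{n}0" for n in range(1, 10)]                # Hob10..Hob90 (9)
    + [f"Hob1{n}" for n in range(1, 10)]                # Hob11..Hob19 (9)
    + [f"Hob1{c}" for c in "abcdefgh"]                  # Hob1a..Hob1h (8)
)
title = " ".join(words)
print(len(words), len(title.replace(" ", "")), len(title))
# -> 51 255 305 (words, characters without spaces, characters with spaces)
```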

Buying Sites? Use Trusts To Avoid Google Domain Demolitions | SEO ROI Services - 0 views

  • Buying Sites? Use Trusts To Avoid Google Domain Demolitions. Author: Gabriel Goldenberg, May 9, 2008. At the Domain Roundtable, Matt Cutts said that Google will cut sites that get sold back down to zero ranking value. So after a site has built up SEO strength for a few years, the asset could be worthless on the search market, because Google - which controls the overwhelming majority of North American and most Western search - makes the rules. This is clearly unfair to webmasters. Not to mention that the Fortune 500 are again on a different playing field, because their purchases are just mergers and acquisitions, not “site purchases”… Update: Apparently this treatment is reserved for sites that also change topics. The technique thus remains useful, but obviously the problem it resolves is narrowed to particular situations. Hat tip to Gustavo Cardial for pointing out the error. (Image: Lady Justice, blindfolded with scales and sword, by California criminal defense lawyer Rob Miller.) In an effort to balance out the scales, I’m sharing a legal technique called “the trust.” My hope is that it will enable webmasters to buy sites and sell them without fear that their hard SEO work will go to naught.

Internet Marketing and SEO Blog from Rank Magic - 0 views

  • Paid (PPC) Search versus SEO. August 9, 2007. Increasingly I read and hear about people in the Internet marketing business arguing over whether paid search (pay per click ads) is more valuable than organic SEO, and vice versa. While there are some fascinating and relevant arguments on either side, research shows that marketers are quite satisfied with both. A report from the SEMPO State of the Market Survey from about 18 months ago shows that 83% of respondents were using PPC compared to only 11% using SEO. Other reports show that the value of SEO is rising as user sophistication increases (according to Chris Boggs in the Spring 2007 edition of Search Marketing Standard). Marketing Sherpa's 2005 report showed SEO conversion rates overtaking PPC rates at 4.2% versus 3.6%. That's quite the opposite of what had been found the year before. The Direct Marketing Association reported in 2005 on a list of "online marketing strategies that produce the best ROI" in which PPC and SEO were rated equally by US retailers, behind only "having a website" and "using email marketing". A more recent study by Marketing Sherpa, though, showed SEO ahead of email marketing, with PPC a close third. One thing seems to be true: if a given web site shows up in both the organic search engine listings and the PPC ads, that seems to super-validate it as a good choice, which increases the likelihood of a searcher clicking on one of those listings.

SEO Tools Come To iPhone - 0 views

  • Oct 28, 2008 at 9:17am Eastern by Barry Schwartz. SEO Tools Come To iPhone. I was waiting for the day someone would bring an SEO tool to the iPhone. Today is that day: Infindigm released a tool named proSEO - iPhone SEO Content Analyzer. You can download the tool on iTunes or on your iPhone. To see the tool on iTunes, use this link. It does cost $14.99, but it seems to have a nice feature set, including: Complete source code listing; Listing of META keywords; Listing of META description; Listing of all META tags in the document; Tag counts - this feature counts all the tags in a document to give clues about composition; Contents of the <title> tag; The body of text with tags removed; Percentage of body words that are stop words (see supported languages below) - stop words are not counted by the search engines, so you can determine how effective your marketing copy is by knowing how much of what you’ve written will be ignored; The total word count of the document for words that are not determined to be numbers; Phrase counting for phrases of length 1 to 5 words — this helps you determine repetitive phrases in the document; The anchor tags in the document — and, specifically, if there is an image in the link text; The inner HTML of the tag (this is the same as the link text); All the image tags in the document; The text of the image “alt” attribute. I wonder how popular this app will be. Even for SEOs, do they find themselves needing to analyze sites on the go? If so, would this tool be it?
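Two of the listed features, stop-word percentage and 1-5 word phrase counts, are simple to reason about with a short sketch. This is illustrative only: the tiny stop-word set and the sample sentence are assumptions, not the lists the app actually uses.

```python
# Compute stop-word percentage and phrase counts for a block of copy.
from collections import Counter

STOPWORDS = {"the", "a", "an", "and", "of", "to", "in", "is", "for"}

def analyze(text: str):
    words = [w.lower() for w in text.split()]
    stop_pct = 100 * sum(w in STOPWORDS for w in words) / len(words)
    phrases = Counter()
    for n in range(1, 6):                     # phrases of length 1 to 5 words
        for i in range(len(words) - n + 1):
            phrases[" ".join(words[i:i + n])] += 1
    return stop_pct, phrases

stop_pct, phrases = analyze("the best seo tool for the iphone is the best seo tool")
print(f"{stop_pct:.0f}% stop words")          # -> 42% stop words
print(phrases.most_common(3))                 # most repeated phrases
```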