
DISC Inc
Rob Laporte

Relying On Print Yellow Pages? Most Local Customers Turn To The Web! - 0 views

  •  
    Oct 22, 2008 at 7:13pm Eastern by Greg Sterling

    Online marketers have been predicting the death of print yellow pages for years. While that will never happen, print yellow pages are no longer the primary way that people seek local information. In fact, the internet collectively - through search engines, local search sites, online yellow pages and other venues - is the top way consumers look for local information. A new study underscores this change and documents with hard numbers why local advertisers have to take the internet into account when trying to reach customers.

    The study: The shift from print to web was captured by advertising agency TMP Directional Marketing, which commissioned comScore to perform a study in May 2007 about local search user behavior - online and off. The stated purpose was to "understand the use and value of on- and offline local search sources," including internet yellow pages, print yellow pages and search engines. That study involved behavioral observations and survey responses from 3,000 members of comScore's US consumer panel. TMP followed up the original study with a second one in July 2008, whose results were released late last week. This overview compares the topline findings of the two studies.

    Internet now "primary" local information source: When asked about their primary source for local business information, here's how survey respondents answered. In the 2007 findings, print yellow pages were the single leading source for local business information; however, the internet in the aggregate was used as a primary tool by almost twice as many respondents. In the 2008 survey, search engines (e.g., Google) have pulled ahead of print yellow pages, while internet yellow pages (e.g., Yellowpages.com) saw growth and local search sites (e.g., Google Maps, Yahoo Local) experienced a slight usage
Rob Laporte

Two Ways To Justify SEO In Uncertain Times - 0 views

  •  
    Oct 22, 2008 at 10:55am Eastern by Paul Bruemmer
    In House - A Column From Search Engine Land

    During uncertain economic times like these, our advice is always to stick with the fundamentals to maintain business efficiency and progress. No matter what your business model, performing the fundamentals will keep you on track and in line to leverage future success. If the C-level executives in your company are having any doubts about the value of SEO and are hesitating to release more funding, it's time to perform a cost-benefit exercise. It's your job as an in-house SEO manager to reestablish their confidence in the value of SEO, as well as your own value and that of your team. When funding gets in the way, having a narrow focus, putting it on the table, and describing the company goals you are committed to are all very important.

    1) Leverage Your Paid Search Data

    To demonstrate implicit value for SEO, start with a baseline. Show where your key terms currently rank in organic search and multiply by the cost-per-click value. Run the numbers for the value of direct clicks with high search intent. One way to go about this is to calculate an effective cost-per-click (eCPC) for your organic listings:

    1. Access the Keyword Tool within your Google AdWords account.
    2. Type in your best-performing keywords (for instance, 20 of them).
    3. Select descriptive words or phrases and synonyms.
    4. Click Get Keyword Ideas. This will produce a report; select Exact within the "Match Type" field and click on Approx Avg Search Volume.
    5. Look at the Cost-Per-Click column to acquire the CPC value (let's assume it's $2.00).
    6. Go to your web analytics data and identify the number of organic clicks for these keywords (let's assume 20,000/month).
    7. Multiply the two (CPC times the number of organic clicks - in this case $40,000/mo).
    8. Create a spreadsheet with your best-performing keywords and make the statement, "if we
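The eCPC arithmetic in the steps above can be sketched in a few lines. The keywords, CPC figures, and click counts below are hypothetical placeholders, not real AdWords data; the totals simply mirror the article's $2.00 × 20,000 example.

```python
def organic_value(keywords):
    """Sum of AdWords CPC x monthly organic clicks across keywords."""
    return sum(cpc * clicks for cpc, clicks in keywords.values())

# keyword -> (average CPC in dollars, organic clicks per month);
# illustrative numbers only, chosen to total 20,000 clicks at $2.00 CPC.
keywords = {
    "blue widgets": (2.00, 12000),
    "widget store": (2.00, 8000),
}

print(f"Implied monthly value of organic traffic: ${organic_value(keywords):,.0f}")
# Implied monthly value of organic traffic: $40,000
```

The result is the paid-search budget you would need to buy the same clicks, which is the figure to put in front of the C-level executives.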
Rob Laporte

A Completely Different Kind Of Landing Page Optimization - 0 views

  •  
    How can you begin using segment optimization in your campaigns? Start by making a list of possible segments within your audience. Who are the different types of people who look for you online - and why? Don't restrict yourself to the way you may have segmented people in your database or your business plan. Brainstorm what's important and relevant from the respondent's point of view, by considering any or all of the following issues:

    * the specific "problem" the respondent wants to solve
    * the demographic/psychographic "persona" of the respondent
    * the respondent's stage in the buying process
    * the role of the respondent in their organization
    * the respondent's geographic location
    * the respondent's industry or the size of their organization

    These are your initial buckets into which respondents could be segmented. Don't worry if there's overlap between buckets, as these won't necessarily be either/or choices.

    Next, review the keywords and ad creatives you're running in your search marketing campaigns. For each keyword/creative pair, ask yourself: is there a particular segment that its respondents would clearly belong to? If the answer is yes, add it to that bucket along with the number of clicks per month it generates. If the answer is no, leave a question mark next to it - perhaps with a handful of segments it might appeal to. For instance, in our example above, the keyword phrases "french exam" and "college french" are obvious candidates for the student segment. Phrases like "business french" and "executive french" fall into the business traveler bucket. But "learn french" can't be segmented from the keyword alone.

    Now, look over your segment buckets and see which ones have the most clicks per month. These are your best targets for segment optimization. For each one, create a dedicated landing page that is focused on the needs, wants, and characteristics of that particular
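The bucketing exercise above can be sketched as a small script. It reuses the article's French-course keywords; the segment assignments follow the article, but the click counts are made-up illustration numbers.

```python
from collections import defaultdict

# keyword -> (segment, clicks/month); None marks a keyword that can't be
# segmented from the query alone and gets a "question mark" for review.
# Click volumes are invented for illustration.
keywords = {
    "french exam":      ("student", 900),
    "college french":   ("student", 400),
    "business french":  ("business traveler", 700),
    "executive french": ("business traveler", 300),
    "learn french":     (None, 5000),
}

buckets = defaultdict(int)
unsegmented = []
for kw, (segment, clicks) in keywords.items():
    if segment is None:
        unsegmented.append(kw)
    else:
        buckets[segment] += clicks

# The largest buckets are the best candidates for a dedicated landing page.
for segment, clicks in sorted(buckets.items(), key=lambda item: -item[1]):
    print(f"{segment}: {clicks} clicks/month")
print("needs review:", unsegmented)
```

Sorting the buckets by monthly clicks surfaces the segments where a dedicated landing page will pay off first.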
Rob Laporte

Domain Moving Day the Key Relevance Way | SEMClubHouse - Key Relevance Blog - 0 views

  •  
    Domain Moving Day the Key Relevance Way, by Mike Churchill

    So, you're going to change hosting providers. In many cases, moving the content of the site is as easy as zipping up the content and unzipping it on the new server. But there is another aspect of moving the domain that many people overlook: DNS. The Domain Name System (DNS) is the translation service that converts your domain name (e.g. keyrelevance.com) to the corresponding IP address. Moving hosting companies is like changing houses: if you don't set up the change-of-address information correctly, you might have some visitors going to the old address for a while. Proper handling of the changes to DNS records makes this transition time as short as possible.

    Let's assume that you are changing hosting, and the new hosting company is going to start handling the authoritative DNS for the domain. The first step is to configure the new hosting company as the authority, ideally a couple of days or more before the site moves to the new location.

    What does "authoritative DNS" mean? There are a double-handful of servers (known as the root DNS servers) whose purpose is to keep track of who is keeping track of the IP addresses for a domain. Rather than handling every DNS request themselves, they only keep track of who is the authoritative publisher of the DNS information for each domain. In other words, they don't know your address, but they tell you who does know it. If we tell the root-level DNS servers that the authority is changing, this information may take up to 48 hours to propagate throughout the internet. By changing the authority without changing the IP addresses, both the old authority and the new authority will agree on the address while visiting browsers make requests during the transition (so no traffic gets forwarded before you move).

    Shortening the Transition: The authoritative DNS servers want to minimize their load, so every time they send out an answer to a
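The timing logic behind this advice can be sketched with some simple arithmetic: the worst-case window before every resolver sees the new address is the delegation-change propagation time plus however long the old record may still sit in resolver caches (its TTL). The figures below are illustrative assumptions, not measurements; in practice you would check real TTLs with a tool like `dig`.

```python
def cutover_hours(ns_propagation_hours, record_ttl_seconds):
    """Worst-case hours before all resolvers see the new IP:
    delegation propagation plus the old record's cache lifetime (TTL)."""
    return ns_propagation_hours + record_ttl_seconds / 3600

# Assume the article's 48-hour delegation propagation and a typical
# 24-hour (86400 s) TTL on the old A record.
print(cutover_hours(48, 86400))   # 72.0

# Lowering the TTL to 5 minutes well before the move shrinks the window.
print(cutover_hours(48, 300))
```

This is why the article recommends making the authority change days ahead of the actual move: the long delegation delay is spent while both authorities still agree on the old address.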
Rob Laporte

Google; You can put 50 words in your title tag, we'll read it | Hobo - 0 views

  •  
    Blurb by Shaun Anderson

    Note - This is a test, testing title tags in Google. Consider also Google Title Tag Best Practice. We recently tested "how many keywords will Google read in the title tag / element?" using our simple seo mythbuster test (number 2 in the series). And here are the results, which are quite surprising.

    First, here's the test title tag we tried to get Google to swallow. And it did. All of it. Even though it was a bit spammy:

    HoboA HoboB HoboC HoboD HoboE HoboF HoboG HoboH HoboI HoboJ HoboK HoboL HoboM HoboN HoboO HoboP HoboQ HoboR HoboS HoboT HoboU HoboV HoboW HoboX HoboY Hob10 Hob20 Hob30 Hob40 Hob50 Hob60 Hob70 Hob80 Hob90 Hob11 Hob12 Hob13 Hob14 Hob15 Hob16 Hob17 Hob18 Hob19 Hob1a Hob1b Hob1c Hob1d Hob1e Hob1f Hob1g Hob1h

    Using a keyword search - hoboA Hob1h - we were surprised to see Google return our page. We also tested it using Hob1g Hob1h - the keywords right at the end of the title - and again our page was returned. So that's 51 words, and 255 characters without spaces, 305 characters with spaces, at least! It seems clear Google will read just about anything these days!

    Update: Qwerty pointed out an interesting fact about the intitle: operator in Google. Results with the intitle: command come back as expected at first, but the next query in the sequence returns an unexpected result. So what does this tell us? Google seems to stop at the 12th word, on this page at least, when returning results via the intitle: operator. Another interesting observation. Thanks, Qwerty.

    We're obviously not sure what benefit a title tag with this many keywords has for your page, in terms of keyword density / dilution and "clickability" in the search engine results pages (SERPs). 50+ words is certainly not best practice! When creating your title tag bear in
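The article's word and character counts check out. As a sanity check, the test title can be reconstructed programmatically (each token is five characters, so 51 tokens give 255 characters plus 50 separating spaces):

```python
# Rebuild the 51-token test title: HoboA..HoboY, Hob10..Hob90, Hob11..Hob1h.
title = " ".join(
    [f"Hobo{c}" for c in "ABCDEFGHIJKLMNOPQRSTUVWXY"]   # 25 tokens
    + [f"Hob{n}0" for n in range(1, 10)]                 # 9 tokens
    + [f"Hob1{x}" for x in "123456789abcdefgh"]          # 17 tokens
)

print(len(title.split()))              # 51 words
print(len(title) - title.count(" "))   # 255 characters without spaces
print(len(title))                      # 305 characters with spaces
```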
Rob Laporte

Google SEO Test - Google Prefers Valid HTML & CSS | Hobo - 0 views

  •  
    Well - the result is clear. From these 4 pages, Google managed to pick the page with valid CSS and valid HTML as the preferred page to include in its index! OK, it might be a bit early to see if the four pages in the test eventually appear in Google, but on first glance it appears Google spidered the pages, examined them, applied duplicate content filters as expected, and selected one to include in search engine results. It just happens that Google seems to prefer the page with valid code as laid down by the W3C (World Wide Web Consortium). The W3C was started in 1994 to lead the Web to its full potential by developing common protocols that promote its evolution and ensure its interoperability.

    What is the W3C?

    * W3C stands for the World Wide Web Consortium
    * W3C was created in October 1994
    * W3C was created by Tim Berners-Lee, the inventor of the Web
    * W3C is organized as a Member Organization
    * W3C is working to standardize the Web
    * W3C creates and maintains WWW standards, called W3C Recommendations

    How the W3C Started: The World Wide Web (WWW) began as a project at the European Organization for Nuclear Research (CERN), where Tim Berners-Lee developed a vision of the World Wide Web. Tim Berners-Lee, the inventor of the World Wide Web, is now the Director of the W3C. The W3C was created in 1994 as a collaboration between the Massachusetts Institute of Technology (MIT) and CERN, with support from the U.S. Defense Advanced Research Projects Agency (DARPA) and the European Commission.

    W3C Standardising the Web: W3C is working to make the Web accessible to all users (despite differences in culture, education, ability, resources, and physical limitations). W3C also coordinates its work with many other standards organizations such as the Internet Engineering Task Force, the Wireless Application Protocols (WAP) Forum an
Rob Laporte

SEOmoz | The Disconnect in PPC vs. SEO Spending - 0 views

  •  
    Posted by randfish on Tue (10/21/08) at 12:21 AM

    There's a big disconnect in the way marketing dollars are allocated to search engine focused campaigns. Let me highlight:

    "Not surprisingly, search advertising should continue to be the largest category, growing from $9.1 billion in 2007 to $20.9 billion in 2013." - Source: C|Net News, June 30, 2008

    OK. So companies in the US spent roughly $10 billion last year on paid search ads, and even more this year. How about SEO?

    "SEO: $1.3 billion (11%)" - Source: SEMPO data via Massimo Burgio, SMX Madrid 2008

    According to SEMPO's data, it's 11% for SEO and 87% for PPC (with another 1.4% for SEM technologies). Now let's turn to Enquiro.

    Organic ranking visibility (percentage of participants looking at a listing in this location):
    Rank 1 - 100%; Rank 2 - 100%; Rank 3 - 100%; Rank 4 - 85%; Rank 5 - 60%; Rank 6 - 50%; Rank 7 - 50%; Rank 8 - 30%; Rank 9 - 30%; Rank 10 - 20%

    Side sponsored ad visibility (percentage of participants looking at an ad in this location):
    Ad 1 - 50%; Ad 2 - 40%; Ad 3 - 30%; Ad 4 - 20%; Ad 5 - 10%; Ad 6 - 10%; Ad 7 - 10%; Ad 8 - 10%

    Fascinating. So visibility is considerably higher for the organic results. What about clicks? Thanks to comScore, we can see that clicks on paid search results have gone down over time, and now sit at ~22%.

    Conclusions: SEO drives 75%+ of all search traffic, yet garners less than 15% of marketing budgets for SEM campaigns. PPC receives less than 25% of all search traffic, yet earns 80%+ of SEM campaign budgets.

    Questions:
    * Why does paid search earn so many more marketing dollar
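The size of the disconnect can be made concrete by dividing each channel's budget share by its traffic share. The figures below are the post's own 2008 numbers from SEMPO and comScore, not fresh data:

```python
def dollars_per_traffic_point(budget_share, traffic_share):
    """Budget share allocated per unit of search traffic delivered."""
    return budget_share / traffic_share

# SEMPO budget shares vs. the post's traffic-share estimates.
seo = dollars_per_traffic_point(budget_share=0.11, traffic_share=0.75)
ppc = dollars_per_traffic_point(budget_share=0.87, traffic_share=0.22)

print(f"SEO: {seo:.2f}  PPC: {ppc:.2f}  ratio: {ppc / seo:.0f}x")
# SEO: 0.15  PPC: 3.95  ratio: 27x
```

By these numbers, PPC attracts roughly 27 times more budget per unit of traffic than SEO does, which is the disconnect the post is pointing at.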
Rob Laporte

Nofollow Monstrosity - 0 views

  •  
    Many people link to social sites from their blogs and websites, and they rarely put 'nofollow' on their sites. Most social sites, on the other hand, started putting 'nofollow' on all external links by default. The consequence? For example, bookmark your new site 'example123.com' at 'stumbleupon.com'. If you google for 'example123', the stumbleupon.com page about it (with no content but the link and title) will be on top, while your site (with the actual content) that you searched for will be below. Imagine what effect this PageRank capitalization has when you search for things other than your domain name!

    Each site and blog owner is contributing to this unknowingly and voluntarily. Do any of these social bookmark buttons look familiar? Most blogs and sites have at least a few of them on almost every single page. Not a single one of these buttons has 'nofollow', meaning that people give a very good chunk of their site's importance to these social sites (hint: importance that you give to these buttons is importance taken away from other internal links on your site). Most social sites, however, do have 'nofollow' on the link pointing back to people's sites after users link to them for being good. Conclusion: people give them a lot of credit on almost every page, while these sites give nothing in return. (Two 'good' sites among these, that I know of, are Digg, which does not use 'nofollow', and Slashdot, which tries to identify real spam and puts 'nofollow' on those links only. There are probably a few more.)

    This can be easily prevented, and PageRank can be re-distributed, in no time! The solution is very simple: 'Do unto others as you would have others do unto you.' If you have a WordPress blog (as millions of internet users do), download the plugins Antisocial and Nofollow Reciprocity. The first puts 'nofollow' on the buttons above; the second puts 'nofollow' on all external links pointing to 'bad' sites. If you are using some other blogging app
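A minimal sketch of what such a plugin does: scan the page's anchors and add rel="nofollow" to those pointing at listed social sites. The site list, the sample markup, and the regex approach below are all illustrative assumptions, not the actual code of the plugins named above.

```python
import re

# Hypothetical list of sites to reciprocate nofollow against.
NOFOLLOW_SITES = ("stumbleupon.com", "twitter.com", "reddit.com")

def add_nofollow(html):
    """Add rel="nofollow" to anchors that point at listed social sites."""
    def tag_anchor(match):
        anchor = match.group(0)
        if any(site in anchor for site in NOFOLLOW_SITES) and "rel=" not in anchor:
            return anchor[:-1] + ' rel="nofollow">'
        return anchor
    return re.sub(r"<a\s[^>]*>", tag_anchor, html)

html = '<a href="http://www.stumbleupon.com/submit">Stumble</a>'
print(add_nofollow(html))
# <a href="http://www.stumbleupon.com/submit" rel="nofollow">Stumble</a>
```

Anchors to sites not on the list (your internal links included) pass through untouched, so internal PageRank flow is preserved.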
Jennifer Williams

Nofollow Link Social Media | SEO Training - 0 views

  •  
    Published by Your SEO Mentor under SEO, Social Media - Aug 23 2008

    There have been a lot of questions about how social media is affecting the SEO industry. The question I would like to ask is how it can help the SEO industry, and how it will affect SEO for my clients' sites and my own. The major issue with social media sites and the role they play in your SEO these days is that a majority of them (especially the big boys, e.g. Twitter) use the nofollow link.

    "Well," you're asking, "what does this mean and why do I need to worry about it?" First of all, don't worry about it - this is not the end of the world. But it means that going to all these major social media and networking sites and linking back to your website will, for the most part, have no effect on your search engine results. The nofollow link (e.g. <a href="http://example.com" rel="nofollow">) was originally created to block search engines from following links in blog comments, due to the very high volume of blog comment spam. The Wikipedia definition says, "nofollow is an HTML attribute value used to instruct some search engines that a hyperlink should not influence the link target's ranking in the search engine's index. It is intended to reduce the effectiveness of certain types of search engine spam, thereby improving the quality of search engine results and preventing spamdexing from occurring in the first place."

    With social media sites popping up daily, and with it being very easy to place user-generated content and links on them, spammers began the same old routine, and we all suffer from their actions. The top social media and networking sites quickly found that they too needed to use the nofollow attribute to help reduce the amount of spam submitted. So for the most part, placing a link on social media sites will not directly help your search engine optimization efforts. That doesn't mean social media can not help in gaining valuable links to
  •  
    Current Top 20 Social Bookmarking sites that Dofollow