
Rob Laporte

Paid Search Beats SEO Conversion Rates? - Website Magazine - 0 views

  • In a statement that will surely have SEOs up in virtual arms, WebSideStory, a provider of digital marketing and analytics solutions, today announced the results of a study showing that paid search has a nine percent edge in conversion rates over organic search. I can hear the furious typing of a million outraged SEO bloggers at this very minute.

    Via the news release: "In a study of leading business-to-consumer (B2C) e-commerce sites during the first eight months of this year, paid search -- keywords bought on a pay-per-click basis at search engines such as Google, Yahoo and MSN -- had a median order conversion rate of 3.40 percent at business-to-consumer e-commerce sites using the company's award-winning HBX Analytics technology. This compared to a conversion rate of 3.13 percent for organic search results, defined as non-paid or natural search engine listings, during the same January-to-August timeframe, according to the WebSideStory Index, a compilation of e-commerce, site search and global Internet user trends. The study analyzed more than 57 million search engine visits. Order conversions occurred during the same session."

    "For both paid and organic search, you have highly qualified traffic that converts far above the overall conversion rate of about 2 percent for most e-commerce sites," said Ali Behnam, Senior Digital Marketing Consultant for WebSideStory. "In the case of paid search, marketers have better control over the environment, including the message, the landing page and the ability to eliminate low-converting keywords."
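    The release quotes the rates but not the math, so here is a quick back-of-envelope check of the "nine percent edge" claim, using only the medians quoted above:

    ```python
    # Relative lift of paid over organic search, from the medians in the release.
    paid, organic = 3.40, 3.13           # median order conversion rates, in percent
    lift = (paid - organic) / organic    # relative edge of paid over organic
    print(f"{lift:.1%}")                 # -> 8.6%, i.e. roughly nine percent
    ```

    In absolute terms the gap is only 0.27 percentage points; the headline figure is the relative lift.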
Rob Laporte

Internet Marketing and SEO Blog from Rank Magic - 0 views

  • Paid (PPC) Search versus SEO, August 9, 2007: Increasingly I read and hear people in the Internet marketing business arguing over whether paid search (pay-per-click ads) is more valuable than organic SEO, and vice versa. While there are some fascinating and relevant arguments on either side, research shows that marketers are quite satisfied with both.

    A report from the SEMPO State of the Market Survey from about 18 months ago shows that 83% of respondents were using PPC compared to only 11% using SEO. Other reports show that the value of SEO is rising as user sophistication increases (according to Chris Boggs in the Spring 2007 edition of Search Marketing Standard). Marketing Sherpa's 2005 report showed SEO conversion rates overtaking PPC rates, at 4.2% versus 3.6%, quite the opposite of what had been found the year before.

    The Direct Marketing Association reported in 2005 that, on a list of "online marketing strategies that produce the best ROI," PPC and SEO were rated equally by US retailers, behind only "having a website" and "using email marketing." A more recent study by Marketing Sherpa, though, showed SEO ahead of email marketing, with PPC a close third.

    One thing seems to be true: if a given website shows up in both the organic search engine listings and the PPC ads, that seems to super-validate it as a good choice, which increases the likelihood of a searcher clicking on one of those listings.
Rob Laporte

BIZyCart SEO Manual - Controlled Navigation - 0 views

  • How The Robots Work. Without getting into the programming details, the robots and web crawlers basically follow these steps (a minimal sketch in code follows this excerpt):

    On arrival, the robot pulls out all of the readable text it is interested in and creates a list of the links found on the page. Links set as 'nofollow' or 'disallowed' are not added to the list. If there are too many links, the robot may take a special action based on that.

    While the first robot finishes processing the page, another robot script is launched to follow each of the links. If there are ten links, there are now eleven robots running. Each of those robot scripts loads the page it was sent to and builds another link list. Unless told otherwise, if there are ten links on each of those pages, one hundred additional robots get launched.

    Before going to the next page, each robot checks whether that page has already been looked at. If it was already indexed that day, the robot cancels itself and stops. The number of robots keeps expanding until all of the links have been followed and the site's web pages have been indexed or avoided.

    You can see that on some sites, thousands of robot processes can be taking their turns to work a web page. There is a physical limit on how much memory is available on the server; if the number of active robots exceeds that, they have to be canceled or memory corruption will occur. If you let the robots run in too many directions, they may not finish looking at every web page, or the results from some pages may get scrambled. You are also subject to the number of robots on that server that are looking at other websites. Poorly managed robot servers can end up creating very strange results.
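    The expanding-robots model above maps naturally onto a breadth-first crawl with a visited set and a hard cap. A minimal Python sketch of that idea; fetch_page() and extract_links() are hypothetical helpers standing in for a real HTTP fetch and link parser, and max_pages plays the role of the memory limit described above:

    ```python
    # A sketch only: breadth-first crawl with deduplication and a growth cap.
    from collections import deque

    def crawl(start_url, max_pages=1000):
        seen = {start_url}           # pages already visited today: don't revisit
        queue = deque([start_url])   # each queued URL stands in for one "robot"
        while queue and len(seen) < max_pages:   # cap growth before memory runs out
            url = queue.popleft()
            page = fetch_page(url)               # hypothetical HTTP fetch
            for link in extract_links(page):     # hypothetical parser; assumed to
                if link not in seen:             # skip nofollow/disallowed links
                    seen.add(link)
                    queue.append(link)           # ten links -> ten new "robots"
        return seen
    ```

    Real crawlers add politeness delays, robots.txt checks, and per-site limits, but the visited-set check and the cap are exactly what keep the "eleven robots become a hundred" explosion bounded.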
Rob Laporte

Problem with Google indexing secure pages, dropping whole site. - Search Engine Watch Forums - 0 views

  • Coincidentally, Google e-mailed me today saying to use a 301 redirect from the https page to http. This was the first thought I had, and I tried for days to find code to do this when the problem first occurred, but I never found it. (A configuration sketch follows this excerpt.)
  • 04-25-2006, Chris_D: Hi docprego, set your browser to reject cookies and then surf your site (I'm assuming it's the one in your profile). Now look at your URLs when you reject cookies:

    /index.php?cPath=23&osCsid=8cfa2cb83fa9cc92f78db5f44abea819
    /about_us.php?osCsid=33d0c44757f97f8d5c9c68628eee0e2b

    You are appending cookie strings to the URLs for user agents that reject cookies. That is the biggest problem. Get someone who knows what they are doing to look at your server configuration; it's the problem, not Google. Google has always said:

    Quote: "Use a text browser such as Lynx to examine your site, because most search engine spiders see your site much as Lynx would. If fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash keep you from seeing all of your site in a text browser, then search engine spiders may have trouble crawling your site. Allow search bots to crawl your sites without session IDs or arguments that track their path through the site. These techniques are useful for tracking individual user behavior, but the access pattern of bots is entirely different. Using these techniques may result in incomplete indexing of your site, as bots may not be able to eliminate URLs that look different but actually point to the same page." http://www.google.com/webmasters/guidelines.html

    You've also excluded a few pages in your http port 80 non-secure robots.txt that I would have expected you want indexed, like /about_us.php. From an information architecture perspective, as Marcia said, put the stuff that n…
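    For the record, the 301 the first poster was hunting for is a few lines of web-server configuration. A minimal sketch for Apache with mod_rewrite, assuming the HTTPS virtual host applies the same .htaccess rules and that www.example.com stands in for the real domain:

    ```apache
    # Hypothetical .htaccess sketch: permanently redirect any https request
    # to its http equivalent. Assumes mod_rewrite is enabled and that these
    # rules are also applied to requests arriving on the SSL virtual host.
    RewriteEngine On
    RewriteCond %{HTTPS} on
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
    ```

    Separately, the osCsid parameter suggests an osCommerce shop, and its session settings (e.g. "Prevent Spider Sessions") are the usual fix for the appended session IDs Chris_D describes.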
Rob Laporte

SEOmoz | 12 Ways to Keep Your Content Hidden from the Search Engines - 0 views

  • Iframes. Sometimes there's a certain piece of content on a webpage (or a persistent piece of content throughout a site) that you'd prefer search engines didn't see. In this event, clever use of iframes can come in handy (the original post includes a diagram; a minimal sketch follows below). The concept is simple: by using iframes, you can embed content from another URL onto any page of your choosing. By then blocking spider access to the iframe's source URL with robots.txt, you ensure that the search engines won't "see" this content on your page. Websites may do this for many reasons, including avoiding duplicate content problems, lessening the page size for search engines, lowering the number of crawlable links on a page (to help control the flow of link juice), etc.
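    A sketch of the technique under assumed paths (www.example.com and /noindex/ are hypothetical): the host page embeds the content in an iframe, and robots.txt keeps spiders out of the iframe's source directory.

    ```html
    <!-- Host page: engines index this page but cannot crawl the framed content. -->
    <iframe src="http://www.example.com/noindex/sidebar.html"
            width="600" height="200"></iframe>
    ```

    ```
    # robots.txt at the site root: block spiders from the framed directory.
    User-agent: *
    Disallow: /noindex/
    ```

    Human visitors still see the framed content as part of the page; only crawlers obeying robots.txt are kept out.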