Rob Laporte

Universal Search: The (War) Elephant in the Room - Search Engine Watch (SEW) - 0 views

  • Working as One: Agencies need to throw the old model out the window. Rather than addressing only our own discipline, we need to look at the playing field as a whole. Focusing all efforts in harmony is the only way to properly address the challenges of universal search. Today's search strategy requires optimization of all of the client's assets, not just their Web site. This means press releases, images, videos, and even brick-and-mortar locations, to take full advantage of all that universal search has to offer.
    This includes the paid search team, not only for the promotional opportunities provided by paid search but also for how it must work in conjunction with feeds management. Take the new Google feature being tested called the Plus Box. Clients can't take advantage of this feature without the paid search team and the feeds management team working together: the paid team needs to ensure the bidding strategy meets the needs of the user, and the feeds management team must provide a feed of the appropriately bid-upon terms to Google Base in order to populate the Plus Box. One needs the other to function.
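    For readers who never saw Google Base in action, a rough sketch of the kind of product feed the excerpt describes: an RSS 2.0 feed in Google's g: namespace whose item titles carry the bid-upon terms. The attribute names here are assumptions recalled from the Google Base era, not a confirmed spec, so verify against current feed documentation.

```python
# A minimal sketch of a Google Base style product feed (RSS 2.0 with a
# g: namespace), the kind of feed the excerpt says populates the Plus Box.
# Attribute names such as g:price are assumptions, not a confirmed spec.
import xml.etree.ElementTree as ET

G_NS = "http://base.google.com/ns/1.0"
ET.register_namespace("g", G_NS)

def build_feed(products: list) -> str:
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = "Example product feed"
    for p in products:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = p["title"]  # the bid-upon term
        ET.SubElement(item, "link").text = p["link"]
        ET.SubElement(item, f"{{{G_NS}}}price").text = p["price"]
    return ET.tostring(rss, encoding="unicode")

# Placeholder product data for illustration only.
print(build_feed([{"title": "red widgets",
                   "link": "http://example.com/red-widgets",
                   "price": "9.99 USD"}]))
```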
Rob Laporte

The Long-Tail is Your Best Bet for 1st Position | PalatnikFactor.com - 0 views

  • long-tail keyword
Rob Laporte

WebMama's Look at the Web: Help Me Create a Top 10 SEM List for Blogs.com - 0 views

  • My morning coffee is enjoyed over:
    http://www.searchengineland.com/
    http://www.searchengineguide.com/
    http://www.seroundtable.com/
    Hope that helps!
    At 22/10/08 17:07, Barry Schwartz said... Heh... I'd also vote for SERoundtable.com
    At 23/10/08 12:22, Claudia Bruemmer said...
    http://searchengineland.com/
    http://www.toprankblog.com/
    http://www.seroundtable.com/
    http://www.seomoz.org/blog
    http://www.seobook.com/blog
Rob Laporte

Should you sculpt PageRank using nofollow? | MickMel SEO - 0 views

  • Should you sculpt PageRank using nofollow? I've seen a few posts (Dave Naylor, Joost de Valk) discussing this over the last few days and thought I'd share my view of it. Both posts bring up the same analogy, attributed to Matt Cutts: Nofollowing your internals can affect your ranking in Google, but it's a 2nd-order effect. My analogy is: suppose you've got $100. Would you rather work on getting $300, or would you spend your time planning how to spend your $100 more wisely? Spending the $100 more wisely is a matter of good site architecture (and nofollowing/sculpting PageRank if you want). But most people would benefit more from looking at how to get to the $300 level.
    While I agree in theory, I think that's a bit oversimplified. What if you could re-allocate your $100 more effectively in just a few minutes, then go try to raise it to $300? Sculpting PageRank is one of those things that can earn a nice benefit in a short period of time, but you can keep tweaking forever for progressively smaller gains (see the chart in the original post). For example, you probably have links on your site for "log-in," "privacy policy," and other such pages. Go in and nofollow those. How long did that take? Two minutes? That alone probably brought as much benefit as going through every page and carefully sculpting things out. Knock out a few of those links, then spend your time trying to work on getting to $300.
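    As a rough illustration of the two-minute fix described above, here is a minimal Python sketch (assuming the beautifulsoup4 package; the utility-page paths are illustrative) that adds rel="nofollow" to boilerplate internal links:

```python
# A minimal sketch of "quick" PageRank sculpting: add rel="nofollow" to
# low-value utility links (login, privacy policy, etc.) in an HTML page.
# The UTILITY_PATHS list is illustrative, not from the original post.
from bs4 import BeautifulSoup

UTILITY_PATHS = ("/login", "/privacy-policy", "/terms", "/contact")

def nofollow_utility_links(html: str) -> str:
    soup = BeautifulSoup(html, "html.parser")
    for a in soup.find_all("a", href=True):
        if a["href"].rstrip("/").lower().endswith(UTILITY_PATHS):
            a["rel"] = "nofollow"  # ask crawlers not to pass PageRank here
    return str(soup)

if __name__ == "__main__":
    page = '<a href="/products">Products</a> <a href="/login">Log in</a>'
    print(nofollow_utility_links(page))
    # -> <a href="/products">Products</a> <a href="/login" rel="nofollow">Log in</a>
```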
Rob Laporte

NoFollow and PageRank Sculpting is it Worth the Effort - 0 views

  • For some websites, using nofollow and PageRank sculpting is a complete waste of time, energy, and resources. For other websites there may be some moderate benefit, and for some, ignoring PageRank sculpting may be costing you traffic and sales.
Rob Laporte

NoFollow | Big Oak SEO Blog - 0 views

  • And while the business networking aspect is great, I'm writing to tell you it can be useful for your SEO efforts too, specifically link building. You may not know this, but LinkedIn does not employ the nofollow attribute on its links, unlike most other social networking sites. That means we can use LinkedIn responsibly to build some nice one-way links to our sites and blogs. Even better, your employees can use this to build some SEO-friendly links to your company site.
  • So the days of dropping links onto high-PageRank Flickr pages are over. Or are they? No. Let's examine, in list form, how you can use the remaining scraps of link juice from Flickr in your SEO campaigns.
    1. Flickr has not added nofollow to discussion boards. For those of you who liked to scout out high-PageRank pages and just drop your link as a comment on the photo (easily done if you owned a link-laundering website), you can still do this in the Flickr group discussion boards. Flickr has not yet added nofollow tags to those, and given the preponderance of discussions that revolve around people sharing photos, you can just as easily drop relevant external links in the discussion and reap link-juice benefits.
    2. Flickr has not added nofollow to personal profile pages. If you have a personal profile page, you can place targeted anchor text on it, point links at it, and receive full SEO benefit as it gains PageRank.
    3. Flickr has not added nofollow to group pages. If you own a Flickr group, you can still put as many links as you wish on the main group page without fear of them being turned into nofollow. Many Flickr personal profile and group pages gain toolbar PR just by having the link spread around in-house, so it's not that hard to make those pages accumulate PR. Google seems to be very generous in that regard; there's a lot of PR to be passed around through Flickr, apparently.
    So the glory days of Flickr SEO may be over (unless Yahoo does the improbable and flips the switch back), but Rome didn't burn to rubble in a day, so we might as well make the most of Flickr before it completely collapses.
Rob Laporte

Nofollow Monstrosity - 0 views

  • Many people link to social sites from their blogs and websites, and they rarely put 'nofollow' on those links. Most social sites, on the other hand, have started putting 'nofollow' on all external links by default. The consequence? For example, bookmark your new site 'example123.com' at 'stumbleupon.com'. If you google for 'example123', the stumbleupon.com page about it (with no content but the link and title) will be on top, while your site (with actual content) that you searched for will be below. Imagine what effect this PageRank capitalization has when you search for things other than your domain name!
    Each site and blog owner is contributing to this unknowingly and voluntarily. Do any of these social bookmark buttons look familiar? Most blogs and sites have at least a few of these on almost every single page. Not a single one of these buttons has 'nofollow', meaning that people give a very good chunk of their site's importance to these social sites (hint: importance that you give to these buttons is importance taken away from other internal links on your site). Most social sites, however, do have 'nofollow' on the link pointing back to people's sites after users submit them for being good. In conclusion, people give them a lot of credit on almost every page, while these sites give nothing in return. (Two 'good' sites among these, that I know of, are Digg, which does not have 'nofollow', and Slashdot, which tries to identify real spam and puts 'nofollow' on those links only. There are probably a few more.)
    This can be easily prevented, and PageRank can be re-distributed, in no time! The solution is very simple: 'Do unto others as you would have others do unto you.' If you have a WordPress blog (as millions of internet users do), download the plugins Antisocial and Nofollow Reciprocity. The first puts 'nofollow' on the buttons above; the second puts 'nofollow' on all external links pointing to 'bad' sites. If you are using some other blogging app
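    A hedged sketch of the audit those plugins automate: scan a page and report which external links still pass PageRank. Assumes the requests and beautifulsoup4 packages; the URL is a placeholder.

```python
# A rough audit of where a page "leaks" PageRank: list its external links
# and whether each carries rel="nofollow". The URL below is a placeholder.
from urllib.parse import urlparse

import requests
from bs4 import BeautifulSoup

def external_link_report(page_url: str) -> None:
    host = urlparse(page_url).netloc
    soup = BeautifulSoup(requests.get(page_url, timeout=10).text, "html.parser")
    for a in soup.find_all("a", href=True):
        target = urlparse(a["href"]).netloc
        if target and target != host:        # link points off-site
            rels = a.get("rel") or []        # BeautifulSoup returns rel as a list
            status = "nofollow" if "nofollow" in rels else "PASSES PAGERANK"
            print(f"{a['href']:60} {status}")

external_link_report("http://example123.com/")  # placeholder URL
```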
Rob Laporte

Myths and Truths About Google GrayBar PR - 0 views

  • Two opposing opinions on Graybar PR are expressed: TBPR (and consequently Graybar PR) is just broken (as is Google's backlink operator); OR both Toolbar PR and the backlink operator are not broken but "de-SEO-usefulised", and Google uses them for disinformation. Graybar PR plays the role of a warning: the message might be that the page has been algorithmically flagged as looking like the kind of page that might be selling links. If this is the message, it would be directed both to the potential link buyer (to fuzz up what the TBPR of the page is) and to the potential link seller (as a note that Google is watching this page). Graybar PR might also mean the page was dropped from the index (or just not indexed yet) or penalized for infringing the guidelines.
    Graybar PR facts:
    FACT: gray PR is not the same as PR 0 (zero).
    FACT: graybar PR can mean the site is new and has not yet been through a PR update.
    FACT: gray PR doesn't directly mean the site is penalized or deindexed.
    FACT: gray PR can be a signal of improper behavior (more checks are needed to make sure you're OK / not OK).
    FACT: Toolbar PR can change and even become gray with no impact on performance.
    FACT: if gray PR did not affect other aspects of your site's web life (rankings, number of indexed pages, etc.), it might be a glitch inherent in the bar (wait a bit and see, or try opening the page in other browsers). Another possible signal of a glitch is TBPR going gray without waiting for the next PR update.
Rob Laporte

Google AdWords Finally Breaking Out Search Traffic From Partners - 0 views

  • Oct 17, 2008 at 5:41am Eastern, by Barry Schwartz. The Google AdWords blog announced they have added a method to the AdWords console to break out search traffic between Google and search partners. Previously, you were only able to see a breakout between your content campaigns and search campaigns. Now, you can break out your AdWords results by content campaigns, Google search campaigns, and search partner campaigns. How do you do this? Log into your AdWords console, navigate to a campaign, and click on the "Statistics" drop-down. Then select "Split: Google search/search partners/content network." You will then get three rows of summary data for Google, Search Partners, and the Content network. This level of detail can be found at the ad group or campaign levels. Google said this level of detail is coming to the Report Center soon. Advertisers have been asking for this for a long time! For more information, see this help page.
Rob Laporte

SEOmoz | The Disconnect in PPC vs. SEO Spending - 0 views

  • The Disconnect in PPC vs. SEO Spending. Posted by randfish on Tue (10/21/08) at 12:21 AM in Paid Search Ads. There's a big disconnect in the way marketing dollars are allocated to search-engine-focused campaigns. Let me highlight: "Not surprisingly, search advertising should continue to be the largest category, growing from $9.1 billion in 2007 to $20.9 billion in 2013." - Source: C|Net News, June 30, 2008. OK. So companies in the US spent $10 billion last year on paid search ads, and even more this year. How about SEO? "SEO: $1.3 billion (11%)" - Source: SEMPO data via Massimo Burgio, SMX Madrid 2008. According to SEMPO's data, it's 11% for SEO and 87% for PPC (with another 1.4% for SEM technologies). Now let's turn to Enquiro:
    Organic ranking visibility (percentage of participants looking at a listing in this location):
    Rank 1 - 100%; Rank 2 - 100%; Rank 3 - 100%; Rank 4 - 85%; Rank 5 - 60%; Rank 6 - 50%; Rank 7 - 50%; Rank 8 - 30%; Rank 9 - 30%; Rank 10 - 20%
    Side sponsored ad visibility (percentage of participants looking at an ad in this location):
    1 - 50%; 2 - 40%; 3 - 30%; 4 - 20%; 5 - 10%; 6 - 10%; 7 - 10%; 8 - 10%
    Fascinating. So visibility is considerably higher for the organic results. What about clicks? Thanks to Comscore, we can see that clicks on paid search results have gone down over time, and are now ~22%.
    Conclusions: SEO drives 75%+ of all search traffic, yet garners less than 15% of marketing budgets for SEM campaigns. PPC receives less than 25% of all search traffic, yet earns 80%+ of SEM campaign budgets.
    Questions: Why does paid search earn so many more marketing dollar
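    To make the disconnect concrete, a back-of-the-envelope calculation in Python using the round numbers quoted in the post (illustrative shares, not fresh data):

```python
# Back-of-the-envelope: budget points spent per point of search traffic,
# using the round numbers quoted in the post (illustrative, not fresh data).
seo_traffic_share, seo_budget_share = 0.75, 0.15  # "75%+ of traffic, <15% of budget"
ppc_traffic_share, ppc_budget_share = 0.25, 0.80  # "<25% of traffic, 80%+ of budget"

seo_rate = seo_budget_share / seo_traffic_share   # 0.20
ppc_rate = ppc_budget_share / ppc_traffic_share   # 3.20

print(f"SEO: {seo_rate:.2f} budget points per traffic point")
print(f"PPC: {ppc_rate:.2f} budget points per traffic point")
print(f"PPC is funded ~{ppc_rate / seo_rate:.0f}x more heavily "
      "per unit of traffic than SEO")             # ~16x
```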
Rob Laporte

Tips On Getting a Perfect 10 on Google Quality Score - 0 views

  • October 20, 2008. Ever since Google launched the real-time quality score metric, where Google rates keywords between 0 and 10 (10 being the highest), I have rarely seen threads documenting how to receive a 10 out of 10. Tamar blogged about How To Ensure That Your Google Quality Score is 10/10, based on an experiment by abbotsys. Back then, it was simply about matching the domain name to the keyword phrase, but can it be achieved without that? A DigitalPoint Forums thread reports another advertiser receiving the 10/10 score. He documented what he did to obtain it:
    - Eliminated all the keywords that Google had suggested and only used a maximum of three keywords per ad campaign.
    - Used only one ad campaign per landing page and made each landing page specific to that keyword.
    - Put the cost per click high enough to land around third spot.
    - Geo-targeted the campaigns only to the areas he can sell to.
    - Limited the time his ads were on to only the times when there is real interest.
    - Used three versions of each keyword, "keyword", [keyword], and keyword, and then eliminated whichever wasn't working well.
    If you want to reach that perfect 10, maybe try these tips and see what works for you. There is no guaranteed checklist of items, so keep experimenting. And when you get your perfect 10, do share!
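    The three versions in the last step correspond to AdWords phrase, exact, and broad match. A tiny illustrative Python helper for generating them (the function and sample keywords are hypothetical; only the quoting conventions come from AdWords):

```python
# Generate the three AdWords match-type variants mentioned above:
# "keyword" (phrase match), [keyword] (exact match), keyword (broad match).
def match_type_variants(keyword: str) -> dict:
    kw = keyword.strip()
    return {
        "phrase": f'"{kw}"',
        "exact": f"[{kw}]",
        "broad": kw,
    }

for kw in ("red widgets", "buy red widgets"):  # illustrative keywords
    print(match_type_variants(kw))
# {'phrase': '"red widgets"', 'exact': '[red widgets]', 'broad': 'red widgets'}
# {'phrase': '"buy red widgets"', 'exact': '[buy red widgets]', 'broad': 'buy red widgets'}
```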
Rob Laporte

Google Webmaster Tools is Incorrectly Displaying Keyword Positions - 0 views

  • October 20, 2008. A WebmasterWorld member reports that he was dependent on the Top Search Queries report in Google Webmaster Tools and has found it to be providing incorrect data. After all, checking with another rank checker showed no such rankings, and there were no visitors to that page. This is likely to be a bug, according to Tedster: "Webmaster Tools reports of all kinds are known to contain wrong information at times. This kind of wrong information would be particularly disturbing, but in any big system errors do creep in. The evidence of your own server logs is more dependable." He adds that it's possible that the ranking is achievable: "[M]aybe the WMT report is pulling the position information before some filter is applied to come up with the final rankings. Even though that would certainly be buggy behavior, it might accidentally be showing you that your url COULD rank that well, if only you weren't tripping some kind of filter." Still, though, the tool in Google's backend is misleading. Would you consider this a bug? On a related note, The Official Google Webmaster Central Blog says that this could be an issue with the kind of data that WMT sees. They suggest that you add both the www and non-www versions of the same site to Webmaster Central, do a site: search to look for any anomalies, set your preferred domain, and set a site-wide 301 redirect to the www or non-www version. Of course, this is probably not applicable to the reporting issue in WebmasterWorld, though it may be related to other issues within Google Webmaster Tools. Forum discussion continues at WebmasterWorld.
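    A quick way to confirm the suggested site-wide 301 is actually in place, sketched in Python with the requests library; example.com stands in for your domain:

```python
# Verify that the bare host 301-redirects to the canonical www host
# (or vice versa), as suggested above. example.com is a placeholder.
import requests

def check_canonical_redirect(bare: str, canonical: str) -> bool:
    resp = requests.get(bare, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    ok = resp.status_code == 301 and location.startswith(canonical)
    print(f"{bare} -> {resp.status_code} {location or '(no redirect)'}")
    return ok

check_canonical_redirect("http://example.com/", "http://www.example.com/")
```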
Rob Laporte

Google SEO Test - Google Prefers Valid HTML & CSS | Hobo - 0 views

  • Well, the result is clear. From these 4 pages, Google managed to pick the page with valid CSS and valid HTML as the preferred page to include in its index! OK, it might be a bit early to see if the four pages in the test eventually appear in Google, but on first glance it appears Google spidered the pages, examined them, applied duplicate content filters as expected, and selected one to include in search engine results. It just happens that Google seems to prefer the page with valid code as laid down by the W3C (World Wide Web Consortium). The W3C was started in 1994 to lead the Web to its full potential by developing common protocols that promote its evolution and ensure its interoperability.
    What is the W3C?
    - W3C stands for the World Wide Web Consortium
    - W3C was created in October 1994 by Tim Berners-Lee, the inventor of the Web
    - W3C is organized as a Member Organization
    - W3C is working to standardize the Web
    - W3C creates and maintains WWW standards, called W3C Recommendations
    How the W3C started: The World Wide Web (WWW) began as a project at the European Organization for Nuclear Research (CERN), where Tim Berners-Lee developed a vision of the World Wide Web. Tim Berners-Lee, the inventor of the World Wide Web, is now the Director of the World Wide Web Consortium (W3C). W3C was created in 1994 as a collaboration between the Massachusetts Institute of Technology (MIT) and CERN, with support from the U.S. Defense Advanced Research Projects Agency (DARPA) and the European Commission.
    W3C standardizing the Web: W3C is working to make the Web accessible to all users (despite differences in culture, education, ability, resources, and physical limitations). W3C also coordinates its work with many other standards organizations such as the Internet Engineering Task Force, the Wireless Application Protocols (WAP) Forum an
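    If you want to check your own markup the way this test implies, the W3C's Nu HTML Checker exposes a JSON interface. A minimal sketch follows; the endpoint and response shape are my assumptions about that service, so verify before relying on it:

```python
# Validate an HTML document against the W3C Nu HTML Checker's JSON API.
# Endpoint and response format are assumptions based on the public checker
# at validator.w3.org/nu/ - verify against its documentation before relying on this.
import requests

def validate_html(html: str) -> list:
    resp = requests.post(
        "https://validator.w3.org/nu/?out=json",
        data=html.encode("utf-8"),
        headers={"Content-Type": "text/html; charset=utf-8"},
        timeout=30,
    )
    return resp.json().get("messages", [])

doc = ('<!DOCTYPE html><html lang="en"><head><meta charset="utf-8">'
       "<title>test</title></head><body></body></html>")
messages = validate_html(doc)
for m in messages:
    print(m.get("type"), "-", m.get("message"))
print("valid" if not any(m.get("type") == "error" for m in messages) else "invalid")
```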
Rob Laporte

Google; You can put 50 words in your title tag, we'll read it | Hobo - 0 views

  • Google: You can put 50 words in your title tag; we'll read it. Blurb by Shaun Anderson. Note: this is a test of title tags in Google. Consider also Google Title Tag Best Practice. We recently tested "how many keywords will Google read in the title tag / element?" using our simple SEO mythbuster test (number 2 in the series). And here are the results, which are quite surprising. First, here's the test title tag we tried to get Google to swallow. And it did. All of it. Even though it was a bit spammy:
    HoboA HoboB HoboC HoboD HoboE HoboF HoboG HoboH HoboI HoboJ HoboK HoboL HoboM HoboN HoboO HoboP HoboQ HoboR HoboS HoboT HoboU HoboV HoboW HoboX HoboY Hob10 Hob20 Hob30 Hob40 Hob50 Hob60 Hob70 Hob80 Hob90 Hob11 Hob12 Hob13 Hob14 Hob15 Hob16 Hob17 Hob18 Hob19 Hob1a Hob1b Hob1c Hob1d Hob1e Hob1f Hob1g Hob1h
    Using a keyword search - hoboA Hob1h - we were surprised to see Google return our page. We also tested it using - Hob1g Hob1h - the keywords right at the end of the title, and again our page was returned. So that's 51 words, 255 characters without spaces, and 305 characters with spaces, at least! It seems clear Google will read just about anything these days!
    Update: Qwerty pointed out an interesting fact about the intitle: operator in Google. Results with the intitle: command are as expected up to a point, but the next query in the sequence returns an unexpected result. So what does this tell us? Google seems to stop at the 12th word, on this page at least, when returning results using the intitle: operator. Another interesting observation. Thanks, Qwerty.
    We're obviously not sure what benefit a title tag with this many keywords has for your page, in terms of keyword density / dilution and "clickability" in the search engine results pages (SERPs). 50+ words is certainly not best practice! When creating your title tag bear in
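    To reproduce the word and character arithmetic above for any page's title, a standard-library Python sketch (the URL is a placeholder):

```python
# Count words and characters in a page's <title>, mirroring the
# "51 words / 255 chars without spaces / 305 with spaces" arithmetic above.
# Standard library only; the URL below is a placeholder.
import re
import urllib.request

def title_stats(url: str) -> tuple:
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
    match = re.search(r"<title[^>]*>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    title = (match.group(1) if match else "").strip()
    return len(title.split()), len(title.replace(" ", "")), len(title)

words, chars_no_spaces, chars_with_spaces = title_stats("http://example.com/")
print(f"{words} words, {chars_no_spaces} chars without spaces, "
      f"{chars_with_spaces} with spaces")
```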