DISC Inc: Group items tagged "rates"

Republishing Content: How to Update Old Blog Posts for SEO

  • Republishing any old post isn’t going to work. You need to find the posts that are underperforming because of content issues.
  • It’s sometimes because the pages that outrank you have more high-quality backlinks and ‘link authority.’ To check whether that’s the case, search for your keyword in Keywords Explorer, scroll to the SERP overview, then look at the Domain Rating (DR) and URL Rating (UR) of the sites and pages that outrank you.
  • If product, category, or landing pages are outranking you, then maybe searchers aren’t looking for blog posts.

BruceClay - Search Engine Optimization: You & A with Matt Cutts

  • In 2003 they switched to an incremental index system, pushing it live every few days (the “Update Fritz” era). Now they have Caffeine: instead of crawling documents and then indexing them later that night, they crawl and index at the same time, making the whole index much closer to real time. It used to be like waiting for a bus; now it’s like a taxi that’s just there, bringing results to you right away. Caffeine unlocks the ability for Google to index a whole lot more documents and makes the whole index 50% fresher.
  • Caffeine isn’t a ranking change.
  • Static HTML links are always nice and safe but they’re getting better at parsing JavaScript.
  • Is bounce rate part of the algorithm? Matt says not as part of the general ranking Web algorithm. (To the best of his knowledge.)

Google Webmaster Tools Now Provide Source Data For Broken Links

  • Google has also added functionality to the Webmaster Tools API to enable site owners to provide input on control settings (such as preferred domain and crawl rate) that could previously only be set via the application. As they note in the blog post: “This is especially useful if you have a large number of sites. With the Webmaster Tools API, you can perform hundreds of operations in the time that it would take to add and verify a single site through the web interface.” (A hypothetical sketch of that bulk workflow appears after this item’s annotations.)
  •  
    Oct 13, 2008 at 5:28pm Eastern by Vanessa Fox

    Ever since Google Webmaster Tools started reporting on broken links to a site, webmasters have been asking for the sources of those links. Today, Google has delivered: from Webmaster Tools you can now see the page that each broken link is coming from. This information should be of great help to webmasters in ensuring that visitors find their sites and that their links are properly credited.

    The value of the 404 error report: why does Google report broken links in the first place? As Googlebot crawls the web, it stores a list of all the links it finds. It then uses that list for a couple of things:
    * As the source list to crawl more pages on the web
    * To help calculate PageRank

    If your site has a page with the URL www.example.com/mypage.html and someone links to it using the URL www.example.com/mpage.html, then a few things happen:
    * Visitors who click on that link arrive at your site’s 404 page and aren’t able to get to the content they were looking for
    * Googlebot follows that link and, instead of finding a valid page of your site to crawl, receives a 404 page
    * Google can’t use that link to give a specific page on your site link credit (because it has no page to credit)

    Clearly, knowing about broken links to your site is valuable. The best solution in these situations is generally to implement a 301 redirect from the incorrect URL to the correct one. If you see a 404 error for www.example.com/mpage.html, you can be pretty sure the linker meant www.example.com/mypage.html. By implementing the redirect, visitors who click the link find the right content, Googlebot finds the content, and mypage.html gets credit for the link. In addition, you can scan your site to see whether any of the broken links are internal, and fix them. Finding broken links on your site can be tedious, although it’s valuable to run a broken link check.
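
    On the API quote above: a minimal sketch of the bulk-workflow idea follows. The WebmasterToolsClient class and its methods are hypothetical stand-ins, not the real (long-retired) Webmaster Tools API; they only illustrate why one scripted pass beats clicking through the web interface one site at a time.

        # Hypothetical sketch only: this stub client is NOT the real
        # Google Webmaster Tools API; it just shows the shape of a
        # scripted add/verify/settings pass over many sites.
        class WebmasterToolsClient:
            """Stand-in for an authenticated API client (hypothetical)."""

            def add_site(self, url: str) -> None:
                print(f"added {url}")

            def verify_site(self, url: str) -> None:
                print(f"verified {url}")

            def set_crawl_rate(self, url: str, rate: str) -> None:
                print(f"{url}: crawl rate set to {rate}")

            def set_preferred_domain(self, url: str, domain: str) -> None:
                print(f"{url}: preferred domain set to {domain}")

        client = WebmasterToolsClient()
        sites = [f"http://www.example{i}.com/" for i in range(1, 201)]

        # Hundreds of operations in one scripted pass, versus adding and
        # verifying a single site at a time through the web interface.
        for site in sites:
            client.add_site(site)
            client.verify_site(site)
            client.set_crawl_rate(site, "slower")
            client.set_preferred_domain(site, "www")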
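
    The 301 fix described above is usually a one-line server rule. A minimal sketch for Apache, assuming an .htaccess file with mod_alias enabled; the URLs are the example ones from the post:

        # .htaccess sketch: permanently redirect the misspelled URL to the
        # real page, so visitors, Googlebot, and link credit all arrive at
        # mypage.html (requires Apache mod_alias)
        Redirect 301 /mpage.html http://www.example.com/mypage.html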

3 Reasons Why Blogs for SEO Fail | Online Marketing Blog

  • However, when it comes to blogs, consumer information discovery increasingly involves social networks and social media. Recommendations are competing with search. Looking at the web analytics of our blog and client blogs, social media traffic is among the top 5 referring sources. Blogs are social, and social media sources will become increasingly important for many business blogging efforts in the coming year. So, what can a company do to build upon, and benefit from, the compounding equity that grows with long-term blogging and SEO efforts? I’ll be answering that question specifically in tomorrow’s post, 5 Tips for Successful Blog Optimization. In the meantime, have you started a blog only to lose interest or stop contributing to it? What was your reason? What would you do differently?

Yahoo Improves Content Match Targeting

  • Oct 13, 2008 at 9:42am Eastern by Barry Schwartz. The Yahoo Search Marketing Blog announced that they have improved the targeting and relevancy of their Content Match product. The improvements will lead to a higher click-through rate on ads and higher satisfaction. The specific improvement is that they now target ads based not only on the content of the page but also on the user viewing the page: Yahoo will tailor the ad based on the “users’ geographic and behavioral profiles.”

Tips On Getting a Perfect 10 on Google Quality Score

  • October 20, 2008. Ever since Google launched the real-time Quality Score metric, where Google rates keywords between 0 and 10 (10 being the highest), I have rarely seen threads documenting how to receive a 10 out of 10. Tamar blogged about How To Ensure That Your Google Quality Score is 10/10, based on an experiment by abbotsys. Back then, it was simply about matching the domain name to the keyword phrase, but can it be achieved without that? A DigitalPoint Forums thread reports another advertiser receiving the 10/10 score. He documented what he did to obtain it:
    * Eliminated all the keywords that Google had suggested and used a maximum of three keywords per ad campaign.
    * Used only one ad campaign per landing page and made each landing page specific to that keyword.
    * Set the cost per click high enough to put him around third spot.
    * Geo-targeted the campaigns only to the areas he can sell to.
    * Limited the times his ads ran to only those times when there is real interest.
    * Used three versions of each keyword ("keyword" for phrase match, [keyword] for exact match, and plain keyword for broad match) and eliminated whichever wasn’t working well.
    If you want to reach that perfect 10, maybe try these tips and see what works for you. There is no guaranteed checklist of items, so keep experimenting. And when you get your perfect 10, do share!

Bing - How Microsoft handles bots clicking on ads - Webmaster Blog - Bing Community

  • adCenter uses a variety of techniques to remove bots, including the Interactive Advertising Bureau’s (IAB) Spiders and Robots protocol. The IAB provides a list of known bots, and Microsoft’s bots are part of that list. As a result, any activity generated by bots will not skew adCenter data, because it is categorized as low quality in adCenter Reports. You can view the Standard Quality and Low Quality data by accessing the adCenter Reports tab. In June 2009, Microsoft received Click Quality Accreditation from the IAB, which holds the industry’s highest standards in click measurement. The IAB and independent third-party auditors verified that adCenter meets the requirements for Click Quality Accreditation, which include not billing for our search bot’s ad clicks. For more information, visit the adCenter Blog or the IAB site.
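
    A toy sketch of the list-based filtering described above. The user-agent patterns are illustrative placeholders, not the real IAB Spiders and Robots list (which is licensed, not public), and the bucket names simply mirror the Standard/Low Quality split in adCenter Reports.

        import re

        # Illustrative patterns only; the real IAB bot list is licensed, not public.
        KNOWN_BOT_PATTERNS = [r"bingbot", r"msnbot", r"adidxbot", r"googlebot"]
        BOT_RE = re.compile("|".join(KNOWN_BOT_PATTERNS), re.IGNORECASE)

        def classify_click(user_agent: str) -> str:
            """Bucket one ad click: known-bot traffic goes to 'low_quality'
            (never billed), everything else to 'standard_quality'."""
            if user_agent and BOT_RE.search(user_agent):
                return "low_quality"
            return "standard_quality"

        print(classify_click("Mozilla/5.0 (compatible; bingbot/2.0)"))   # low_quality
        print(classify_click("Mozilla/5.0 (Windows NT 6.1) Firefox/10")) # standard_quality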

The Era of Short URLs - ClickZ

  • URL shorteners do two things. First, they shorten URLs. Second, they create a unique URL to a destination, which can be tracked. That’s the key element for marketers. In fact, one of the up-and-comers in this space, bit.ly, has built analytics directly into its interface.

    By using URL shorteners, you can begin capturing some excellent insight. For example, you can see how many times one community forwards a message compared to another: create two unique short URLs to the same page, put one on Twitter and embed the other in an e-mail, then analyze the clicks as in standard A/B testing. You can also measure pass-along rates. Since the recipient of the original short URL will most likely pass along the short URL (not the original URL), you can see how far the particular path you’ve created to a page travels.

    Not just for other people’s content and Twitter: the cool thing about URL shorteners is that you can shorten anything. Sure, the traditional way to use these services is to shorten a link to someone else’s content and send it via a medium that requires you to be terse. But there’s no reason you can’t use the same method on links to your own site, in an e-mail, or on your own site itself. The fact is, URL shorteners provide an easy way to track traffic you’re generating via social media channels. Like most things in social media, the cost of entry is free. They’re definitely worth a try.
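
    The per-community tracking idea reduces to one thing: distinct short codes pointing at the same destination, with clicks counted per code. A minimal sketch, assuming Flask is installed; the codes, routes, and destination URL are all made up for illustration:

        from collections import Counter

        from flask import Flask, redirect

        app = Flask(__name__)

        DESTINATION = "http://www.example.com/landing-page"
        CODES = {"tw1": DESTINATION, "em1": DESTINATION}  # one code per channel
        clicks = Counter()

        @app.route("/<code>")
        def follow(code):
            target = CODES.get(code)
            if target is None:
                return "Unknown short URL", 404
            clicks[code] += 1  # pass-along clicks still accrue to the original channel
            return redirect(target)

        @app.route("/stats")
        def stats():
            # Compare channels like an A/B test: tw1 (Twitter) vs. em1 (e-mail)
            return dict(clicks)

    Putting the tw1 code on Twitter and the em1 code in an e-mail gives exactly the A/B comparison the article describes, because every click routes through a code you control.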

Why You Should Use Article Directories to Promote Your Website » WNW Design -...

  • Editor’s Note: You can find the Top 50 article directories, ranked by Alexa rating and Google PageRank, at www.vretoolbar.com and www.jackhumphrey.com.

Selling text link ads through TLA or DLA: result in Google penalty?

  • Can selling text link ads in the sidebar through TLA or Direct-Link-Ads result in a Google penalty? I used to use TLA for one of my sites but stopped for fear of Google dropping the site, because I heard a few rumors on webmaster forums of this happening. Is this concrete or not? Are people still using TLA or DLA or something similar?

    C7Mike, 4:52 am on June 11, 2009 (utc 0): Yes, you may receive a penalty for purchasing links that pass PageRank. See Google’s Webmasters/Site owner Help topic for more information: [google.com...]

    Automotive site, 6:42 am on June 11, 2009 (utc 0): Well, I was actually going to use one of those to sell, not purchase. Anyway, I am going to apply to BuySellAds and see if I get accepted, but I heard they mostly accept tech-related sites.

    C7Mike, 2:25 pm on June 11, 2009 (utc 0): You may receive a penalty for both buying and selling paid links that pass PageRank (see [google.com...]). I have had a few sites lose their PR because they published links through TLA. However, the content was still good enough that advertisers have continued to purchase links on those pages through TLA, in spite of the lack of PR, and at a substantially lower rate.
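
    For context, Google’s documented remedy for paid links is to keep them from passing PageRank, typically with rel="nofollow". A minimal markup sketch; the advertiser URL and anchor text are placeholders:

        <!-- Sketch: a paid sidebar link marked rel="nofollow" so it does not
             pass PageRank, which is what Google's paid-links guidance asks for -->
        <a href="http://advertiser.example.com/" rel="nofollow">Advertiser name</a>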

Conversion Rate, A Most Powerful Lever Indeed - Search Engine Watch (SEW)

  • additional

How to fix BingBot OverCrawling by controlling Crawl Rate?

  • "BingBots or MSNBots Over-crawling to Bring the Site Down?"