
DISC Inc: Group items tagged Reviews

Google releases September 2022 product reviews update

  • "these updates can be very big"
Google Search Ranking Updates | Google Search Central  |  What's new  |  Goog...

  • "product reviews update"
Google Confirms "Mayday" Update Impacts Long Tail Traffic

  • Google Confirms “Mayday” Update Impacts Long Tail Traffic, May 27, 2010 at 11:02am ET, by Vanessa Fox.

    Google made between 350 and 550 changes to its organic search algorithms in 2009. This is one of the reasons I recommend that site owners not get too fixated on specific ranking factors: if you tie construction of your site to any one perceived algorithm signal, you’re at the mercy of Google’s constant tweaks. These frequent changes are one reason Google itself downplays algorithm updates. Focus on what Google is trying to accomplish as it refines things (the most relevant, useful results possible for searchers) and you’ll generally avoid too much turbulence in your organic search traffic.

    However, sometimes a Google algorithm change is substantial enough that even those who don’t spend a lot of time focusing on the algorithms notice it. That seems to be the case with what those discussing it at WebmasterWorld have named “Mayday”. Last week at Google I/O, I was on a panel with Googler Matt Cutts, who said when asked during Q&A: “this is an algorithmic change in Google, looking for higher quality sites to surface for long tail queries. It went through vigorous testing and isn’t going to be rolled back.” I asked Google for more specifics and was told that it was a rankings change, not a crawling or indexing change, which seems to imply that sites getting less traffic still have their pages indexed, but some of those pages are no longer ranking as highly as before. Based on Matt’s comment, this change impacts “long tail” traffic, which generally comes from longer queries that few people search for individually but that in aggregate can provide a large percentage of traffic.

    This change seems to have primarily impacted very large sites with “item” pages that don’t have many individual links into them, might be several clicks from the home page, and may not have substantial unique, value-added content on them. Ecommerce sites, for instance, often have this structure: the individual product pages are unlikely to attract external links, and the majority of the content may be imported from a manufacturer database. Of course, as with any change that results in a traffic hit for some sites, other sites experience the opposite. Based on Matt’s comment at Google I/O, the pages that are now ranking well for these long tail queries are from “higher quality” sites (or perhaps are “higher quality” pages). My complete speculation is that perhaps the relevance algorithms have been tweaked a bit. Before, pages that didn’t have high quality signals might still rank well if they had high relevance signals; perhaps now those high relevance signals carry less weight in ranking if the page doesn’t have the right quality signals.

    What’s a site owner to do? It can be difficult to create compelling content and attract links to these types of pages. My best suggestion to those who have been hit is to isolate a set of queries for which the site is now getting less traffic and check the search results to see what pages are ranking instead. What qualities do those pages have that make them seen as valuable? For instance, I have no way of knowing how amazon.com has fared during this update, but they’ve done a fairly good job of making individual item pages with duplicated content from manufacturers’ databases unique and compelling through additions like user reviews. They have set up a fairly robust internal linking (and anchor text) structure with things like recommended items and lists, and they attract external links with features such as the My Favorites widget. From the discussion at the Google I/O session, this is likely a long-term change, so if your site has been impacted, you’ll want to do some creative thinking about how to make these types of pages more valuable (which should increase user engagement and conversion as well).

    Update on 5/30/10: Matt Cutts from Google has posted a YouTube video about the change. In it, he says “it’s an algorithmic change that changes how we assess which sites are the best match for long tail queries.” He recommends that an impacted site owner evaluate the quality of the site, ask whether the site really is the most relevant match for the impacted queries and what “great content” could be added, determine whether the site is considered an “authority”, and ensure that the page does more than simply match the keywords in the query and is actually relevant and useful for that query. He notes that the change: has nothing to do with the “Caffeine” update (an infrastructure change that is not yet fully rolled out); is entirely algorithmic (and isn’t, for instance, a manual flag on individual sites); impacts long tail queries more than other types; and was fully tested and is not temporary.
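Fox's suggestion to isolate the queries that lost traffic can be sketched as a simple before/after diff over two query-report exports. A minimal illustration in Python (the data shapes and function name are hypothetical, not from the article):

```python
def biggest_drops(before, after, top_n=20):
    """Given two {query: clicks} mappings from analytics exports taken
    before and after an update, return the queries that lost the most
    clicks, as (query, old_clicks, new_clicks, drop) tuples."""
    drops = []
    for query, old_clicks in before.items():
        new_clicks = after.get(query, 0)  # a missing query now gets no clicks
        if old_clicks > new_clicks:
            drops.append((query, old_clicks, new_clicks, old_clicks - new_clicks))
    drops.sort(key=lambda row: row[3], reverse=True)  # largest drop first
    return drops[:top_n]
```

Each flagged query is then worth a manual check: search it, and compare the pages that now rank against your own for the quality signals Cutts describes.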
Live Search Webmaster Center Blog : SMX East 2008: Webmaster Guidelines

  • The bottom line is there are no scenarios in which we would ever recommend cloaking as a good solution, although we do understand that there are some technical reasons people cloak pages that are not directly related to spam. The problem is that cloaking can set off some automatic spam detection algorithms that may result in parts of your site being penalized. As a search engine optimization practice, cloaking can actually be counter-productive.
  • Q: What can you do if your website does get penalized? The first thing you should do is verify that your site has in fact been penalized. To do this, log into our Webmaster Tools and go to the Site Summary page. From there, look for the Blocked: field in the right-hand column. If your site is blocked, this will show as Yes; otherwise it will show as No. If your site is blocked, then it is time to review our Webmaster Guidelines and check your site to see which one(s) you may have violated. If you have any questions about this step, please consult our online forums or contact our technical support staff. Once you've identified and resolved the issue(s), it is time to request that Live Search re-include your pages in its index. To do that, log back into the Webmaster Tools and click the hyperlinked Blocked: Yes on your Site Summary page. This will take you to a form where you can request reevaluation from our support team. Thanks for all of your questions today! If you have any more, please leave them in the comments section and we'll try to answer them as soon as possible.
Microsoft adCenter Fall Upgrade

  • Oct 28, 2008 at 9:11am Eastern, by Barry Schwartz. The Microsoft adCenter blog has an army of posts containing details of their large fall upgrade. The main features most advertisers may notice are:
    - Campaign Management: ability to pause and resume ads and keywords, geo-targeting enhancements, and improved performance reporting on the Ads page
    - Editorial Improvements: faster reviews, dynamic feedback about why ads and keywords were disapproved, and inline notification when dynamic text causes your ads to exceed character limits
    - User Management: where previously you could have only one user, you can now create multiple account users
    - Content Ads (U.S. only): get keyword bid suggestions and performance estimates for your content ads
    Here is a breakdown of all the blog posts I found pertaining to this fall upgrade:
    - adCenter Fall Upgrade: New Features, adCenter Blog for Advertisers
    - adCenter API Production Fall Upgrade Now Live, adCenter API Blog for Developers
    - adCenter Fall Upgrade: Campaign Management Updates, adCenter Blog for Advertisers
    - adCenter Fall Upgrade: Content Ads Updates (U.S. only), adCenter Blog for Advertisers
    - adCenter Fall Upgrade: Editorial Updates, adCenter Blog for Advertisers
    - adCenter Fall Upgrade: User Management Updates, adCenter Blog for Advertisers
    - adCenter Analytics Beta Refresh - Check Out The New Features, adCenter Analytics Blog
    - adCenter API Production Upgrade Now Live, adCenter API Blog for Developers
Yahoo! Search Updates: SearchMonkey Enables More Enhanced Results, Google Base Accepted...

  • Additionally, Yahoo! Search will now accept Google Base, a product publishing tool. Five Google Base item types will now be supported: Event, Product, Review, Job, and Personals. Those with Event and Product information can submit their feeds to Yahoo! Site Explorer to have their enhanced results displayed automatically.
Review: LinksManager Reciprocal Links Service

  • Link manager tool.
Are PPC Ads Now Counting in Google Organic Backlinks? - Search Engine Watch (SEW)

  • In the past, I've said there's no direct correlation between editorial rankings and paid advertisements. Well, it seems I was wrong. Paid search really can affect organic search. My team recently noticed this in one of our client's Google Webmaster Tools accounts. They saw instances of backlink anchor text that we knew we weren't optimizing against (not requesting links with these keywords) and they seemed very promotional in nature. When we reviewed these links, we saw that they were coming from paid search efforts. They were the titles of the ads on both Overture/Yahoo Search Marketing and Google AdWords. Yet, Google Webmaster Tools was (and still is) showing these as anchor text of backlinks to the Web site.
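One way to reproduce this check on your own data is to compare an exported list of backlink anchor texts against the titles of your paid ads, since exact matches are the suspicious ones. A rough sketch (the inputs and function name are hypothetical, not from the article):

```python
def flag_ad_title_anchors(backlink_anchors, ad_titles):
    """Return backlink anchor texts that exactly match a known paid-ad
    title (case-insensitively), suggesting the 'backlink' is really a
    paid listing being counted by the tools."""
    titles = {title.strip().lower() for title in ad_titles}
    return [anchor for anchor in backlink_anchors
            if anchor.strip().lower() in titles]
```

Anything this flags deserves a manual look in the backlink report to confirm the source really is a paid placement rather than an organic link.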
SEOmoz | I Don't Buy Links

  • How Google Can Discover Paid Links: a while back I did a post called 15 Methods for Paid Link Detection. Here is the list of methods I discussed in that post:
    1. Links Labeled as Advertisements
    2. Site Wides
    3. Links Are Sold by a Link Agency
    4. Selling Site Has Information on How to Buy a Text Link Ad
    5. Relevance of Your Link
    6. Relevance of Nearby Links
    7. Advertising Location Type
    8. Someone Reports Your Site for Buying Links
    9. Someone Reports Your Site for Some Other Reason
    10. Someone Reports the Site You Bought Links from for Selling Links
    11. Someone Reports the Site You Bought Links from for Some Other Reason
    12. Disgruntled Employee Leaves Your Company, and Reports Your Site
    13. Disgruntled Employee Leaves the Agency You Used, and Reports Your Site
    14. Disgruntled Employee Leaves the Company of the Site You Bought Links from, and Reports Your Site
    15. Internal Human Review
    There are two major methods I want to emphasize here. The first: your competitor can report you. It's the grim truth that your paid links can be reported by your competitor, and there is a form for this built right into Google Webmaster Tools.
How to report paid links

  • Q: I’m worried that someone will buy links to my site and then report that. A: We’ve always tried very hard to prevent site A from hurting site B. That’s why these reports aren’t being fed directly into algorithms, and are being used as the starting point rather than being used directly. You might also want to review the policy mentioned in my 2005 post (individual links can be discounted and sellers can lose their ability to pass on PageRank/anchortext/etc., which doesn’t allow site A to hurt site B).
There is no penalty for buying links!

  • There is no penalty for buying links! There, I said it. That’s what I believe is true: there is no such thing as a ‘you have been buying links so you should suffer’ penalty. At least, not if you do it correctly. I’ll make some statements about buying links that probably not everybody will agree on, but this is what I consider to be the truth. If you don’t publish your link buying tactics yourself and if your website’s link profile doesn’t consist of more than 90% paid links, then:
    - Buying links cannot get you penalized;
    - Buying links from obvious link networks only results in backlinks with little to no search engine value;
    - Buying links ninja style will continue to get you killer rankings;
    - Selling links can only disable your ability to pass link juice or PR (but you might want to read this);
    - Google will never be able to detect all paid links.

    Just about every time the topic finally seems to be left alone, someone out there heats up the good old paid link debate again. This time, Rand Fishkin (unintentionally) caused the discussion to emerge once again. By showing the buying and selling link tactics of several websites on SEOmoz’s blog (this info has since been removed), he made it very easy for the Paid Link Police to add some more websites to the list of sites to check while building the Paid Link Neglecting Algorithm. Several people got all wound up because of this, including (at first) me, because these sites would more than likely receive a penalty (I just checked; none of them has been penalized yet). However, it is almost impossible for Google to penalize you for buying links for your website. At least, not if you didn’t scream “Hey, I’m artificially inflating my link popularity!” on your OWN website. David Airey penalized? Jim Boykin analyzed his penalty earlier, and the same thing happened here. In some cases, it may seem that certain websites have been penalized for buying links. What in fact happened is that the link juice tap of some obvious paid links was closed, which resulted in less link juice, followed by lower rankings. In most other cases, you can buy all the links you want and not get penalized. You could buy the same links for your competition, right? And if Google states that spammy backlinks can’t hurt you, paid backlinks probably can’t hurt you either; it is basically the same thing. The worst thing that can happen is that you buy hundreds of text links that only provide traffic. And if you managed to buy the right ones, there’s nothing wrong with that.
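The author's caveat that only an overwhelmingly paid profile invites trouble amounts to a ratio check. A toy sketch of that heuristic (the 90% cutoff is the post's own figure; how links get labeled as paid is left to your own records):

```python
def paid_link_share(links):
    """links: iterable of (url, is_paid) pairs; return the paid fraction."""
    links = list(links)
    if not links:
        return 0.0
    paid = sum(1 for _url, is_paid in links if is_paid)
    return paid / len(links)

def profile_exceeds_cutoff(links, cutoff=0.9):
    """True when more than `cutoff` of the profile's links are paid,
    the situation the post claims is the only risky one."""
    return paid_link_share(links) > cutoff
```

In practice the labeling is the hard part; the arithmetic here only restates the post's rule of thumb, not anything Google has documented.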
Google Smart Campaigns: The Newest Way to Be Smart with Pay-Per-Click

  • Make reviewing results/effectiveness easy.
  • If you want to make manual decisions within your account and fully control your ad spend, this solution is not for you.