
How to report paid links - 0 views

  • Q: I’m worried that someone will buy links to my site and then report that. A: We’ve always tried very hard to prevent site A from hurting site B. That’s why these reports aren’t being fed directly into algorithms; they’re being used as a starting point rather than being applied directly. You might also want to review the policy mentioned in my 2005 post (individual links can be discounted and sellers can lose their ability to pass on PageRank/anchor text/etc., which doesn’t allow site A to hurt site B).

Google Sitelinks - What Sitelinks Are and How They Work - 0 views

  • What are Google Sitelinks? Google Sitelinks are displayed in Google search results and are meant to help users navigate your website. Google’s systems analyze the link structure of your website to find shortcuts that will save users time and let them quickly find the information they’re looking for. Sitelinks are completely automated by Google’s algorithm. In short, Google Sitelinks are shortcuts from the search result pages to your main pages.

    When do Google Sitelinks show? Google only shows Sitelinks for results when it thinks they’ll be useful to the user. If the structure of your website doesn’t allow Google’s spider to find good Sitelinks, or Google doesn’t think the Sitelinks for your site are relevant to the user’s query, they won’t be shown. Although Google gives no definitive answer, the following factors seem to influence whether Sitelinks are displayed:
    • Your site must have a stable no. 1 ranking for the search query, so Sitelinks show up most often for searches on brand names.
    • Your site must be old enough; websites under two years old don’t seem to get Sitelinks.
    • The number of searches: keywords that aren’t searched often enough don’t seem to get Sitelinks.
    • The number of clicks: your site seems to need many clicks for the searched keywords.
    • Sitelinks don’t seem to show for search queries consisting of two or more keywords.
    • The number of links: links are important everywhere in the SEO world, aren’t they? Inbound links with relevant anchor text seem to influence the chance of getting Sitelinks.

    How can we get Sitelinks for our website? If you meet the criteria above, you have a good chance of getting Sitelinks shown for your site, but you can also improve the structure of your website to increase the likelihood and quality of your Sitelinks. Google seems to use the first-level links on a website for Sitelinks, so make sure all your important links are on the homepage. The links should be text links or image links with an IMG ALT attribute; JavaScript or Flash links are not considered for Sitelinks. Google also seems to favor links that appear at the top of a webpage, so try to put your important links near the top of the HTML code and then re-position them using CSS. Overall, building your website following SEO best practices and ranking no. 1 for your most important keywords will give you the best chance of Sitelinks appearing and will help users navigate your website.
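    A quick way to act on the "first-level text links" advice is to list the internal text links a crawler would find on your homepage. The script below is a minimal, illustrative Python sketch (the example.com URL and the LinkCollector/candidate_sitelinks names are hypothetical, not any official tool); it only sees plain HTML anchors, consistent with the point that JavaScript or Flash links aren't considered.

```python
# Minimal sketch: list the first-level internal text links on a homepage,
# the pool from which Sitelinks appear to be drawn. Placeholder URL below.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class LinkCollector(HTMLParser):
    """Collect (anchor text, href) pairs from plain <a href> tags."""

    def __init__(self):
        super().__init__()
        self.links = []            # list of (text, href)
        self._current_href = None
        self._current_text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._current_href = dict(attrs).get("href")
            self._current_text = []

    def handle_data(self, data):
        if self._current_href is not None:
            self._current_text.append(data.strip())

    def handle_endtag(self, tag):
        if tag == "a":
            if self._current_href:
                text = " ".join(t for t in self._current_text if t)
                if text:  # text links only; empty anchors are skipped
                    self.links.append((text, self._current_href))
            self._current_href = None


def candidate_sitelinks(homepage_url):
    """Return internal (anchor text, absolute URL) pairs found on the homepage."""
    html = urlopen(homepage_url).read().decode("utf-8", errors="replace")
    parser = LinkCollector()
    parser.feed(html)
    site = urlparse(homepage_url).netloc
    return [(text, urljoin(homepage_url, href))
            for text, href in parser.links
            if urlparse(urljoin(homepage_url, href)).netloc == site]


if __name__ == "__main__":
    # Hypothetical example site.
    for text, url in candidate_sitelinks("https://www.example.com/")[:20]:
        print(f"{text:30s} -> {url}")
```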

International SEO 101: Search Marketing for Foreign Countries and Languages - 0 views

  • When possible, have your site/subdomain hosted in the country you're targeting. This isn’t always an option. It may make more sense for your sites to all be hosted at the same place, in which case you'll want to make use of your Google Webmaster account.  Whether you use a ccTLD or a subdomain, you can have a separate Webmaster Tools account for each.  Verify your accounts and then select which country you want to serve in its settings. *Tip: This helps ensure you can be found when searchers decide to use only country-targeted search results. However, it is not an appropriate setting for purely language-specific targeting and optimization, because you don't want to exclude other countries that also speak that language. If searchers use the default “Web,” Google will search the entire Web, in which case it’s a win-win for you.

Google Confirms "Mayday" Update Impacts Long Tail Traffic - 0 views

  • Google Confirms “Mayday” Update Impacts Long Tail Traffic (May 27, 2010 at 11:02am ET, by Vanessa Fox).

    Google made between 350 and 550 changes in its organic search algorithms in 2009. This is one of the reasons I recommend that site owners not get too fixated on specific ranking factors. If you tie construction of your site to any one perceived algorithm signal, you’re at the mercy of Google’s constant tweaks. These frequent changes are one reason Google itself downplays algorithm updates. Focus on what Google is trying to accomplish as it refines things (the most relevant, useful results possible for searchers) and you’ll generally avoid too much turbulence in your organic search traffic.

    However, sometimes a Google algorithm change is substantial enough that even those who don’t spend a lot of time focusing on the algorithms notice it. That seems to be the case with what those discussing it at Webmaster World have named “Mayday”. Last week at Google I/O, I was on a panel with Googler Matt Cutts who said, when asked during Q&A, “this is an algorithmic change in Google, looking for higher quality sites to surface for long tail queries. It went through vigorous testing and isn’t going to be rolled back.” I asked Google for more specifics and they told me that it was a rankings change, not a crawling or indexing change, which seems to imply that sites getting less traffic still have their pages indexed, but some of those pages are no longer ranking as highly as before.

    Based on Matt’s comment, this change impacts “long tail” traffic, which generally comes from longer queries that few people search for individually, but which in aggregate can provide a large percentage of traffic. This change seems to have primarily impacted very large sites with “item” pages that don’t have many individual links into them, might be several clicks from the home page, and may not have substantial unique and value-added content on them. For instance, ecommerce sites often have this structure: the individual product pages are unlikely to attract external links, and the majority of the content may be imported from a manufacturer database. Of course, as with any change that results in a traffic hit for some sites, other sites experience the opposite. Based on Matt’s comment at Google I/O, the pages that are now ranking well for these long tail queries are from “higher quality” sites (or perhaps are “higher quality” pages). My complete speculation is that perhaps the relevance algorithms have been tweaked a bit. Before, pages that didn’t have high quality signals might still rank well if they had high relevance signals. And perhaps now, those high relevance signals don’t carry as much weight in ranking if the page doesn’t have the right quality signals.

    What’s a site owner to do? It can be difficult to create compelling content and attract links to these types of pages. My best suggestion to those who have been hit by this is to isolate a set of queries for which the site is now getting less traffic and check out the search results to see what pages are ranking instead. What qualities do they have that make them seem valuable? For instance, I have no way of knowing how amazon.com has fared during this update, but they’ve done a fairly good job of making individual item pages with duplicated content from manufacturers’ databases unique and compelling by adding content such as user reviews. They have set up a fairly robust internal linking (and anchor text) structure with things like recommended items and lists.
    And they attract external links with features such as the my favorites widget. From the discussion at the Google I/O session, this is likely a long-term change, so if your site has been impacted by it, you’ll likely want to do some creative thinking around how you can make these types of pages more valuable (which should increase user engagement and conversion as well).

    Update on 5/30/10: Matt Cutts from Google has posted a YouTube video about the change. In it, he says “it’s an algorithmic change that changes how we assess which sites are the best match for long tail queries.” He recommends that a site owner who is impacted evaluate the quality of the site and, if the site really is the most relevant match for the impacted queries, what “great content” could be added; determine whether the site is considered an “authority”; and ensure that the page does more than simply match the keywords in the query and is relevant and useful for that query. He notes that the change: has nothing to do with the “Caffeine” update (an infrastructure change that is not yet fully rolled out); is entirely algorithmic (and isn’t, for instance, a manual flag on individual sites); impacts long tail queries more than other types; and was fully tested and is not temporary.

BruceClay - Search Engine Optimization: You & A with Matt Cutts - 0 views

  • In 2003 they switched to an incremental index system, pushing it live every few days (Update Fritz). Now they have Caffeine. But instead of crawling docs and then indexing them later that night, they crawl and index at the same time, making the whole index much closer to real time. It used to be like waiting for a bus; now it’s like a taxi. No waiting. It’s just there, bringing it to you right away. It unlocks the ability for Google to index a whole lot more documents and makes the whole index 50% fresher.
  • Caffeine isn’t a ranking change.
  • Static HTML links are always nice and safe but they’re getting better at parsing JavaScript.
  • Is bounce rate part of the algorithm? Matt says not as part of the general ranking Web algorithm. (To the best of his knowledge.)

Official Google Webmaster Central Blog: Using site speed in web search ranking - 0 views

  • If you are a site owner, webmaster or a web author, here are some free tools that you can use to evaluate the speed of your site:
    • Page Speed, an open-source Firefox/Firebug add-on that evaluates the performance of web pages and gives suggestions for improvement.
    • YSlow, a free tool from Yahoo! that suggests ways to improve website speed.
    • WebPagetest, which shows a waterfall view of your pages' load performance plus an optimization checklist.
    • In Webmaster Tools, Labs > Site Performance shows the speed of your website as experienced by users around the world. We've also blogged about site performance.
    • Many other tools on code.google.com/speed.

    While site speed is a new signal, it doesn't carry as much weight as the relevance of a page. Currently, fewer than 1% of search queries are affected by the site speed signal in our implementation, and the signal for site speed only applies for visitors searching in English on Google.com at this point. We launched this change a few weeks back after rigorous testing. If you haven't seen much change to your site rankings, then this site speed change possibly did not impact your site.
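    Alongside the tools listed above, a rough first number is easy to get yourself. The following is a minimal Python sketch (the URL is a placeholder) that times a few raw HTML fetches; it ignores rendering, images and scripts, so treat it as a sanity check rather than a replacement for Page Speed, YSlow or WebPagetest.

```python
# Minimal sketch: time a few full-page HTML fetches to get a rough feel for
# server response + transfer time. Rendering, images, JS, etc. are ignored.
import time
from urllib.request import urlopen


def fetch_time(url, runs=3):
    """Return the average wall-clock seconds to download the HTML `runs` times."""
    total = 0.0
    for _ in range(runs):
        start = time.perf_counter()
        urlopen(url).read()
        total += time.perf_counter() - start
    return total / runs


if __name__ == "__main__":
    # Placeholder URL.
    print(f"avg fetch time: {fetch_time('https://www.example.com/'):.2f}s")
```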

SEO & Link Building: The Domain Authority Factor - Search Engine Watch (SEW) - 0 views

  • Authority Comes With Age: The main ingredient of authority is time. Websites gain authority by behaving themselves for some time, having links pointing to the site for a longer period, and having other authority sites linking to them.
  • Subdomains start out with the same authority as their www parents, but if they start linking intensively to low- or negative-authority websites, they can lose that authority without affecting the rest of the domain too much. This affects the choice between using subdomains or subdirectories, because activities within a directory influence the entire (sub)domain it's on.
  • Links from authorities aren't easily acquired because they're careful when linking out. Use the Bing operator "linkfromdomain:authority.com" to discover what they already link to. Discover why those sites are being linked to and, by emulating that strategy, you might get great authority links.

The Google Killer No One Dares Discuss - Search Engine Watch (SEW) - 0 views

  • The Empire Strikes Back: Taking the Google example, Google is already working desperately to capture all of the shared human experience data and to work it into their index in a usable form. That is what their attempts, though less than pretty, at presenting real-time search have been all about. Sharing data between people by creating Buzz for Gmail users was headed in exactly the same direction. And Google Analytics together with personalization have both been collating human behavior data for quite some time. Launching Android as an open source vehicle was about taking share in that vitally important mobile phone access zone. So, in fact, the "crawler + data organization (index) + algorithm + search ranking" pattern of search we understand today has to disappear and be replaced with "human behavior logging + data organization (index) + algorithm + ranking" to produce the right result. Google could actually be its own Google Killer as they, for one, are well placed to do this.

Google Share of Searches Hits 72 Percent in May 2010 @SEWatch - 0 views

  • June 21, 2010: Experian Hitwise today announced that Google accounted for 72.17 percent of all U.S. searches conducted in the four weeks ending May 29, 2010. Yahoo! Search, Bing and Ask received 14.43 percent, 9.23 percent and 2.14 percent, respectively.

What Google Thinks of Your Site - Search Engine Watch (SEW) - 0 views

  • Internal Links Listings: Sitelinks have been around for years, about five to be exact. Another important SERP feature that has been around just as long is a site's internal links shown within SERP listings. Their appearance isn't limited to branded or domain-related searches, nor does it require a first-place listing. These horizontally placed links, located between the SERP listing's description and URL, are most often a mirrored replication of the anchor text of the text links on your home page. To perform optimally at getting Google to display these, make sure the text links are placed in the first few paragraphs of copy to help increase your internal page CTR. Also, ensure that the anchor text is identical to the destination page's overall keyword focus. Having internal links placed in Google SERPs is Google's thumbs-up that you have a proper internal-linking and keyword strategy.
  • Hierarchical Category Links: One of the most recent SERP listing features you can use to gauge Google's perception of your site is the hierarchical breadcrumb links placed in the URL line of SERP listings. These began to appear half a year ago and, like the internal link placement above, don't require first-place ranking, brand, or domain-related searches to appear in SERPs. Hierarchical category links are achieved by utilizing a network of breadcrumb navigation across the internal pages of your site. To create an optimal process of breadcrumb linking, make sure you've applied your keyword strategy alongside the information architecture of your site content. Your URL structure should include keyword-rich and content-relevant category/folder naming conventions, and site content should fall into the appropriate categories. Furthermore, having a breadcrumb navigation in which the category links closely mimic the folder path of the URL helps indicate to Google how the content of your site flows and that you have taken steps to properly deliver site content to search engines as well as users. Taking these Google SERP features into consideration will allow you to gain insight into how Google understands the most important elements of your site from an SEO standpoint.
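    To illustrate the point about breadcrumb links mirroring the URL's folder path, here is a small, hypothetical Python sketch that derives a breadcrumb trail from a URL. The labeling rule (hyphens to spaces, title case) is an assumption; a real site would map each folder to its actual category name.

```python
# Minimal sketch: derive a breadcrumb trail (label, link) from a URL's folder
# path so the breadcrumb links mirror the site's category structure.
from urllib.parse import urlparse


def breadcrumbs(url):
    """Return a list of (label, link) pairs from the URL's path segments."""
    parsed = urlparse(url)
    base = f"{parsed.scheme}://{parsed.netloc}"
    parts = [p for p in parsed.path.split("/") if p]
    trail = [("Home", base + "/")]
    path = ""
    for part in parts:
        path += "/" + part
        label = part.replace("-", " ").title()  # assumed labeling rule
        trail.append((label, base + path + "/"))
    return trail


if __name__ == "__main__":
    # Hypothetical example URL.
    for label, link in breadcrumbs("https://www.example.com/outdoor-gear/tents/ultralight"):
        print(f"{label} -> {link}")
```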

BruceClay - SEO Newsletter - FEATURE: Takeaways from SMX Advanced Seattle 2010 - 0 views

  • You & A with Matt Cutts of GoogleGoogle's new Web indexing system, Caffeine, is fully live. The new indexing infrastructure translates to an index that is 50 percent fresher, has more storage capacity and can recognize more connections of information. The Mayday update was an algorithm update implemented at the beginning of May that is intended to filter out low-quality search results. A new report in the Crawl errors section of Google Webmaster Tools indicates "soft 404" errors in order to help webmasters recognize and resolve these errors. Keynote Q&A with Yusuf Mehdi of Microsoft Bing is opening up new ways to interact with maps. The newly released Bing Map App SDK allows developers to create their own applications which can be used to overlay information on maps. Bing Social integrates to Facebook firehose and Twitter results into a social search vertical. Bing plans to have the final stages of the Yahoo! organic and paid search integration completed by the end of 2010. Decisions about how to maintain or integrate Yahoo! Site Explorer have not been finalized. Bing's Webmaster Tools are about to undergo a major update. Refer to the Bing Webmaster Tools session for more on this development.
  • Bing's program manager said that the functionality provided by Yahoo! Site Explorer will still be available. It's not their intention to alienate SEOs because they consider SEOs users, too.
  • The Bing Webmaster team has built a new Webmaster Tools platform from the ground up. It is scheduled to go live Summer 2010. The platform focuses on three key areas: crawl, index and traffic. Data in each area will go back through a six month period. Tree control is a new feature that provides a visual way to traverse the crawl and index details of a site. The rich visualizations are powered by Silverlight. URL submission and URL blocking will be available in the new Webmaster Tools.
  • The Ultimate Social Media Tools Session. Tools to get your message out: HelpaReporter, PitchEngine, Social Mention, ScoutLabs. Customer and user insight tools: Rapleaf, Flowtown. Tools to find influencers: Klout. Forum tools: Bing Boards, Omgili, Board Tracker, Board Reader. Digg tools: Digg Alerter, FriendStatistics, di66.net. Make use of the social tools offered by social networks, e.g. utilize Facebook's many options to update your page and communicate with your fans by SMS. Encourage people to follow you using Twitter's short code.

BruceClay - SEO Newsletter - INTERNATIONAL: Universal Search Occurrences and Types in G... - 0 views

  • Recently, we noticed many more Universal Search results appearing in the Google.com.au SERPs. We performed some testing on the number of occurrences and the type of Universal Search results to provide some actionable insights and data to back up our observations.

    In addition, we wanted to test what Marissa Mayer, the Google VP of Search Products & User Experience stated in November 2009. In the interview, she noted that when Universal Search launched in 2007 a Universal Search item appeared in 4 percent of search queries, whereas in November 2009 a Universal Search item appeared in 25 percent of search queries.

    We selected a sample of different search results in Google.com.au (searched from an Australian IP and eliminating the impacts of personalised search) and recorded the occurrences and types of Universal Search results. We gathered this data across a number of different keyword groups including brand, high-volume, mid-tier, long-tail and celebrity- and news-related keywords. We then tracked those search results over a period of days to determine the level of change.

    Please note that these figures are based on a sample and represent an average across the sample set. The results of our research are outlined below:

    A) Percentage of Times a Universal Search Result Appears on Page 1

    Our research shows that:

    • 86 percent of all searches returned a Universal Search result on page 1.
    • 74 percent of all searches returned a Universal Search result above the fold on page 1.

Consumers Head Online for Local Business Information - Search Engine Watch (SEW) - 0 views

  • Importance of Ratings and Reviews: From 2008 to 2009, usage of consumer ratings and reviews increased to 25 percent (+3) among IYP searchers and to 27 percent (+5) among general searchers. Additionally, people who use social networking sites for local business information are more likely to use consumer reviews (53 percent). It's interesting that, while overall usage of ratings and reviews is only 24 percent, its importance during the business selection process is 57 percent! Because users of ratings and reviews heavily rely on them to select a company to do business with, they should be a serious component of any marketer's online strategy.

Time to Start Placing More Emphasis on Bing SEO | WebProNews - 0 views

  • Janet discusses a tool in Bing's Webmaster tools that lets you see the types of links that point to your site and lets you look at their value, so you can go after similar links.

Google Sidewiki and SEO -- Relevant to Each Other? - Search Engine Watch (SEW) - 0 views

  • Thus, the SEO connection is really about reputation management. As a brand manager, there are a few choices you can make, based on early understanding and further discussed in the point of view published by my agency: register the site with Google Webmaster Tools to claim the first Sidewiki listing for any owned page.
  • One alternative: consider completely blocking Sidewiki users from posting comments on your pages. This choice has many potential negative side effects, however.

Paid Search Reports: Google Profits at Bing/Yahoo's Expense #SEWatch - 0 views

  • October 12, 2010: Google gobbled up more paid search spending share last quarter, a result of the Bing/Yahoo integration, according to new reports from Efficient Frontier and SearchIgnite. Google's share of paid search spend rose from 75.8 percent in Q2 to 77.9 percent in Q3, according to Efficient. SearchIgnite had Google growing to 80.2 percent of PPC ad spend. Paid clicks were up 9 percent year-over-year (YoY); CPCs were up 14 percent YoY; and impressions were up 6 percent YoY, Efficient reported. Efficient said this demonstrates Google's continued ability to increase consumer and advertiser demand.

Geo-Targeting Redirects: Cloaking or Better User Experience? - Search Engine Watch (SEW) - 0 views

  • If you have your site set to detect a visitor's location and show content based on that, I would recommend the following (a minimal code sketch appears after these notes):
    • Serve a unique URL for distinct content. For instance, don't show English content to US visitors on mysite.com and French content to French visitors on mysite.com. Instead, redirect English visitors to mysite.com/en and French visitors to mysite.com/fr. That way search engines can index the French content using the mysite.com/fr URL and the English content using the mysite.com/en URL.
    • Provide links to enable visitors (and search engines) to access other language/country content. For instance, if I'm in Zurich, you might redirect me to the Swiss page, but provide a link to the US version of the page. Or, simply present visitors with a home page that enables them to choose the country. You can always store the selection in a cookie so visitors are redirected automatically after the first time.
  • Google's policies aren't as inflexible as you're trying to portray. The same Google page you quote also says that intent ought to be a major consideration (just as when evaluating pages with hidden content). Also, why would Google's guidelines prevent you from using geotargeting without an immediate redirect? Just because you don't immediately redirect search users to a different page doesn't mean you have to ask for their zip code instead of using IP-based geotargeting.

    Lastly, I don't think using such redirects from SERPs improves user experience at all. If I click on a search result, then it's because that's the content I'm interested in. It's very annoying to click on a search result and get a page completely different from the SERP snippet. And what about someone who is on business in a different country? Search engines already provide different language localizations as well as language search options to favor localized pages for a particular region. So if someone goes to the French Google, they will see the French version of localized sites/pages. If they're seeing the U.S. version in their SERP, then it's because you didn't SEO or localize your pages properly, or they deliberately used the U.S. version of Google. Don't second-guess your users. Instead, focus on making sure that users know about your localized pages and can access them easily (by choice, not through force).

    Bill Hunt: Frank, you're spot on as usual. We still keep chasing this issue and, as I wrote on SEW last year in my article on language detection issues (http://searchenginewatch.com/3634625), it is often the implementation that is the problem rather than the actual redirect.

    Yes, it is exactly cloaking (maybe gray hat) when you have a single gateway such as "example.com" and a person coming from Germany sees the site in German, while someone whose IP is in New York sees English. Engines typically crawl from a central location like Mountain View or Zurich, so they would only see the English version, since they would not provide signals for any other location. Where you really get into a tricky area is if you set it so that any user agent from a search engine can access any version it asks for, yet a human is restricted: a sort of reverse cloaking. If Mr GoogleBot wants the French home page, let him have it rather than sending him to the US homepage.

    With the growth of CDNs (content delivery networks), I am seeing more and more of these issues crop up to handle load balancing as well as other forms of geographical targeting. I have a long list of global, multinational and enterprise-related challenges that are complicated by many of Google's outdated ways of handling kindergarten-level spam tactics. Sounds like a SES New York session...
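    To make the "unique URL per language, chooser on the home page, cookie for return visits" recommendation concrete, here is a minimal, hypothetical sketch using Flask (the framework, paths and cookie name are assumptions, not anything from the article or from Google). Because crawlers send no cookie, they see the chooser page and the stable /en/ and /fr/ URLs rather than being redirected.

```python
# Minimal sketch (assumes Flask; /en/ and /fr/ are placeholder paths):
# give each language its own URL, let first-time visitors pick, remember the
# choice in a cookie, and never redirect crawlers away from the URL they asked for.
from flask import Flask, redirect, request

app = Flask(__name__)


@app.route("/")
def choose_or_redirect():
    # Returning visitors: honor the stored choice.
    lang = request.cookies.get("lang")
    if lang in ("en", "fr"):
        return redirect(f"/{lang}/", code=302)
    # First-time visitors (and crawlers, which carry no cookie): show a chooser
    # page with plain links instead of forcing an IP-based guess.
    return (
        '<p>Choose your language: '
        '<a href="/en/">English</a> | <a href="/fr/">Français</a></p>'
    )


@app.route("/<lang>/")
def localized_home(lang):
    if lang not in ("en", "fr"):
        return "Not found", 404
    resp = app.make_response(f"Localized home page: {lang}")
    resp.set_cookie("lang", lang, max_age=60 * 60 * 24 * 365)  # remember for a year
    return resp


if __name__ == "__main__":
    app.run()
```

    Keeping the localized URLs crawlable and only personalizing the root page is what keeps a setup like this on the right side of the cloaking debate above.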

Link building and social media | Search Engine Optimization | Search Engines - 0 views

  • Link building and social media (Wednesday, 29 April 2009 10:10).

    It's all about the secondary links, silly. Time and time again I see folks in the SEO world talking about getting links from social media websites. Many times this advice will include finding 'followed' links and even lists of 'dofollow' social media sites. This is quite strange and bewildering to me, as the holy grail of link building in SM isn't getting a link from the actual site... but getting the secondary links that follow viral content. You see, one shouldn't be using the state of the links on the site as the measure... and such approaches are often even frowned upon by many in the biz, as noted in this recent Sphinn thread. Regardless of the emotional reaction, the whole concept is flawed. I could give a rat's ass if the links on a given site (including social and blogs) are followed, because that was never the consideration in the first place.

    Secondary links are the goal. The main thing, from a link building perspective, is not really about direct links but the secondary links one garners from having a viral story on said site. If one gets a hot story on places such as Digg or Twitter, how many links are being generated? This is where the story begins for link builders. Having a viral story make the rounds can often result in a great number of back links that can often be of far more value than those single authority links social spammers seem bent on getting. This is the greater value to be had from SM sites for the adventurous link builder. Now, we could discuss brand development and authority building as important aspects of content distribution (and social media), but let's stick to their potential for link building. When we look to target a given social site, what do we want to know?
    • Is the site targeted? Meaning, does it have active categories relating to our market?
    • What's the demographic? Is there a viable number of market-related peeps?
    • What's the reach? Is it syndicated heavily (RSS, Twitter, blogs, scrapers, etc.)?
    • What links are top stories getting? (Is the demo a linking group?)
    You get the idea... we want the best possible opportunity for generating secondary links from the primary exposure. That is the goal at the end of the day (from a link building perspective).

    Don't be short-sighted. This is actually true of a lot of content distribution/placement channels. You shouldn't be as concerned about the type of link as about the ability to generate links from the situation. What would you rather have?
    • Scenario 1: a followed link from a marginally popular location such as http://www.under-link.com/
    • Scenario 2: a nofollowed link from a popular site (or maybe dropped by a top Twitterer).
    • Scenario 3: a followed link buried on a popular site (poor exposure).
    If you said anything but Scenario 2, then please move to the back of the class, because you are failing sadly. Ultimately the actual status of the link is not going to be nearly as important as the ability to get the content in front of as many folks as possible. If you and the content team have done your job, and chosen the right locales, then you should end up with some great secondary links.

Inner View: Google's Keyword Research Tools (from SMX East) | Maine SEO Blog - 0 views

  • 55% of queries have more than 3 words. 70% of queries have no exact keyword match. 20% of queries in a given day have not been seen in the previous 90 days.
  • Logged in vs. non-logged in When you’re logged in to KWT, you could get up to 1,000 queries. When you’re not logged in to KWT, you only get up to 100 queries
  • Google Suggest: the Keyword Tool uses Google Suggest, on top of a lot of other metrics.

Duplicate Content: Block, Redirect or Canonical | SEOmoz - 0 views

  • Having said that, the only problem with using robots.txt to eliminate duplicate content is that some people may be linking to the excluded page. That would prevent those links from contributing to your website's search engine ranking.
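    If you do go the robots.txt route, it's worth verifying which duplicate URLs are actually blocked. The sketch below uses Python's built-in robotparser for that check; the URLs are placeholders.

```python
# Minimal sketch: check whether specific (duplicate) URLs are blocked for
# Googlebot by a site's robots.txt. URLs below are placeholders.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://www.example.com/robots.txt")
rp.read()

for url in (
    "https://www.example.com/product/123",
    "https://www.example.com/print/product/123",   # hypothetical duplicate print version
):
    allowed = rp.can_fetch("Googlebot", url)
    print(f"{url} -> {'crawlable' if allowed else 'blocked by robots.txt'}")
```

    Keep the caveat above in mind: a blocked page can still accumulate links that then pass no value, which is one reason a 301 redirect or rel=canonical is often preferred over blocking.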