
DISC Inc: Group items tagged Linking


Rob Laporte

What Google Thinks of Your Site - Search Engine Watch (SEW) - 0 views

  • Internal Links Listings Sitelinks have been around for years, about five to be exact. Another important SERP feature that has been around just as long is a site's internal links in the SERP listings. These don't appear only for branded or domain-related searches, nor do they require a first-place listing. These horizontally placed links, located between the SERP listing's description and URL, are most often a mirrored replication of the anchor text of the text links on your home page. To perform optimally at getting Google to display these, make sure the text links are placed in the first few paragraphs of copy to help increase your internal page CTR. Also, ensure that the anchor text is identical to the destination page's overall keyword focus. Having internal links placed in Google SERPs is Google's thumbs-up that you have a proper internal-linking-to-keyword strategy.
  • Hierarchical Category Links One of the most recent SERP listing features you can use to gauge Google's perception of your site is the hierarchical breadcrumb links placed in the URL line of SERP listings. These began to appear half a year ago and, like the internal link placement above, don't require a first-place ranking, brand, or domain-related search to appear in SERPs. Receiving the hierarchical category links is achieved by utilizing a network of breadcrumb navigation across the internal pages of your site. To create an optimal process of breadcrumb linking, make sure you've applied your keyword strategy alongside the information architecture of your site content. Your URL structure should include keyword-rich, content-relevant category/folder naming conventions, and site content should fall into the appropriate categories. Furthermore, having a breadcrumb navigation in which the category links closely mimic the folder path of the URL helps indicate to Google how the content of your site flows and that you have taken steps to properly deliver site content to search engines as well as users. Taking these Google SERP features into consideration will give you insight into how Google understands the most important elements of your site from an SEO standpoint.
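The breadcrumb-mirrors-the-URL idea above can be sketched in a few lines: derive the trail directly from the folder path so the two can never drift apart. This is a minimal illustration, not anything from the article; the URL, slugs, and label map are hypothetical.

```python
from urllib.parse import urlparse

def breadcrumb_trail(url, labels):
    """Build a breadcrumb trail whose links mirror the URL's folder path.

    `labels` maps a folder slug to its display text (hypothetical data);
    unknown slugs fall back to a title-cased version of the slug.
    """
    segments = urlparse(url).path.strip("/").split("/")
    trail, href = [], ""
    for slug in segments:
        href += "/" + slug
        label = labels.get(slug, slug.replace("-", " ").title())
        trail.append((label, href))
    return trail

# Hypothetical keyword-rich folder structure:
crumbs = breadcrumb_trail(
    "https://example.com/garden-tools/pruning-shears/bypass-pruners",
    {"garden-tools": "Garden Tools", "pruning-shears": "Pruning Shears"},
)
for label, href in crumbs:
    print(label, "->", href)
```

Because each crumb's link is a prefix of the page's own URL, the navigation "closely mimics the folder path" exactly as the excerpt recommends.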
Rob Laporte

Article Pagination: Actions that Improved Google Search Traffic Google SEO News and Dis... - 0 views

  •  
    The value of "long-form journalism" has been tested on websites such as Salon and shown to be quite viable. It also attracts a better caliber of writer. With this in mind, over a year ago I was working with an online magazine that was already publishing longer, in-depth articles, often many thousands of words each. The SEO challenge we had was that page 2 and beyond of most articles were not getting any search traffic, even though there was plenty of awesome content there. The approach we decided on is labor-intensive for the content creators, but after some education, the writers were all interested in trying to increase the audience size. Here are the steps we took:
    • Page 1 naturally enough uses the overall title of the article for both its title tag and header, and has a unique meta description.
    • Every internal page then has its own unique title and header tag, based on the first SUB-head for that section of the article. This means more keyword research and writing of subheads than would normally be the case. If the article is considered as a whole, a lower-level heading tag would seem more accurate semantically, but Google looks at the semantic structure one URL at a time, not for the overall multi-URL article.
    • Most pages also include internal subheads, and these are styled as subordinate headings.
    • On each internal page, there is also a "pre-head" that repeats the article title from page 1 in a small font. This pre-head does not use a header tag of any kind, just a CSS style. It sits at the top as a navigation cue for the user.
    • An additional navigation cue is that the unique page titles each begin with the numeral "2." or "3."
    • Each internal page also has a unique meta description, one that summarizes that page specifically rather than summarizing the overall article.
    • Every page of the article links to every other page at the top and the bottom. None of this anemic "Back | Next" junk; there's a complete page choice shown everywhere.
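The per-page title and meta-description scheme described in the steps above can be sketched as a small generator. The function name, article, subheads, and descriptions are all hypothetical, chosen only to show the "N. First Subhead" pattern:

```python
def paginated_meta(article_title, subheads, descriptions):
    """Build unique titles and meta descriptions for a paginated article.

    Page 1 uses the overall article title; page N uses "N. <first subhead
    of that page>", so the numeral doubles as a navigation cue.
    `descriptions` holds one page-specific summary per page.
    """
    pages = [{"title": article_title, "description": descriptions[0]}]
    for n, subhead in enumerate(subheads, start=2):
        pages.append({
            "title": f"{n}. {subhead}",
            "description": descriptions[n - 1],
        })
    return pages

pages = paginated_meta(
    "The Complete Guide to Sourdough",
    ["Choosing Your Flour", "Maintaining the Starter"],
    ["An overview of sourdough baking.",
     "How flour choice changes the loaf.",
     "Feeding schedules that keep a starter alive."],
)
print(pages[1]["title"])  # → 2. Choosing Your Flour
```

Keeping the titles in one place like this also makes it easy to verify that no two pages of an article share a title or description.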
Rob Laporte

Google Sitelinks - What Sitelinks Are and How They Work - 0 views

  • What are Google Sitelinks? Google Sitelinks are displayed in Google search results and are meant to help users navigate your website. Google's systems analyze the link structure of your website to find shortcuts that will save users time and let them quickly find the information they're after. Sitelinks are completely automated by Google's algorithm. In short, Google Sitelinks are shortcuts to your main pages from the search result pages. When do Google Sitelinks show? Google only shows Sitelinks for results when it thinks they'll be useful to the user. If the structure of your website doesn't allow Google's spider to find good Sitelinks, or Google doesn't think the Sitelinks for your site are relevant to the user's query, it won't show them. Although Google gives no definitive answers on this, the following factors seem to influence whether Sitelinks are displayed:
    • Your site must have a stable no. 1 ranking for the search query, so Sitelinks show up most often for searches on brand names.
    • Your site must be old enough; websites under two years old don't seem to get Sitelinks.
    • The number of searches: keywords that aren't searched often enough don't seem to get Sitelinks.
    • The number of clicks: your site seems to need many clicks for the searched keywords.
    • Sitelinks seem not to show for search queries consisting of two or more keywords.
    • The number of links: links are important everywhere in the SEO world, aren't they? Inbound links with relevant anchor text seem to influence the chance of getting Sitelinks.
    How can we get Sitelinks for our website? If you meet the above criteria, you'll have a good chance of getting Sitelinks shown for your site. But you can also improve the structure of your website to increase the likelihood and quality of your Sitelinks. Google seems to use the first-level links on a website for Sitelinks, so make sure all your important links are on the homepage. The links should be text links or image links with an IMG ALT attribute; JavaScript or Flash links are not considered for Sitelinks. Also, Google seems to like links that appear at the top of a webpage, so try to put your important links at the top of the HTML code and then re-position them using CSS. Overall, building your website following SEO best practices and ranking no. 1 for your most important keywords will improve the odds that Sitelinks appear and help users navigate your website.
Rob Laporte

Linkfluence: How to Buy Links With Maximum Juice and Minimum Risk - 0 views

  • Up first is Rand Fishkin. Rand says he asked to be kicked off this panel because he doesn't endorse buying links and he doesn't do it anymore [Hear that, Google? SEOmoz doesn't buy links. SO KEEP MOVING.]. He offered to go last…but everyone else bullied the moderator into making him go first. Poor Rand. Always the innocent bunny in a pack of wolves. Unfortunately, the projector is broken, so we have no screen. Something about a plug that doesn't work. So…we're doing question and answer first while they send someone to try and fix it. I'll throw the questions at the bottom. Back to Mr. Fishkin. He tries to be very clear about his shift in position on paid links. He doesn't think that not buying links is right for everyone; it's just what's right for his clients and for SEOmoz. Rand says he falls into the "Operator of Interest" category, meaning he's profiled for being an SEO. The problem with paid links: algorithmic detection is getting better than ever before; penalties are hard to diagnose; manual link penalties are also a threat; Google's webspam team invests (A LOT of) time and resources in shutting down effective paid links [Agreed. And almost an unhealthy amount.]; and competitors have significant incentive to report link spam. (Don't be a rat.)
Jennifer Williams

SEOmoz | Divide and Conquer: Creating and Managing Your Link Campaign - 0 views

  •  
    Having battled the SEO war on all fronts (for myself, for clients, for a firm, and most recently, in-house), I've learned a lot over the years when it comes to link campaigning. Although I am completely FOR generating content that will get linked to naturally, oftentimes this is easier said than done. If you're not a link-baiting aficionado, or if you're limited by what you're authorized to do, then you'll need to get links the old-fashioned way and simply ask for them.
Dale Webb

Local vs Traditional SEO: Why Citation Is the New Link - 0 views

  •  
    Google's Local algorithm (the one that populates maps.google.com and helps populate the 10-pack, 3-pack, and Authoritative OneBox) counts links differently than its standard organic algorithm. In the Local algorithm, links can still bring direct traffic from the people who click on them. But the difference is that these "links" aren't always links; sometimes they're just an address and phone number associated with a particular business! In the Local algorithm, these references aren't necessarily a "vote" for a particular business, but they serve to validate that the business exists at a particular location, and in that sense, they make a business more relevant for a particular search.
Rob Laporte

SEO & Link Building: The Domain Authority Factor - Search Engine Watch (SEW) - 0 views

  • Authority Comes With Age The main ingredient of authority is time. Websites gain authority by behaving themselves for some time, having links pointing to the site for a longer period, and having other authority sites linking to them.
  • Subdomains start out with the same authority as their www parents, but if they start linking intensively to low- or negative-authority websites, they can lose that authority without affecting the rest of the domain too much. This affects the choice between using subdomains or subdirectories, because activities within a directory influence the entire (sub)domain it's on.
  • Links from authorities aren't easily acquired, because authority sites are careful when linking out. Use the Bing operator "linkfromdomain:authority.com" to discover what they already link to. Work out why those sites are being linked to and, by emulating that strategy, you might earn great authority links.
Rob Laporte

Q&A: Rand Fishkin, CEO of SEOmoz | Blog | Econsultancy - 0 views

  • Paid links are always controversial. I found it interesting that "direct link purchases from individual sites/webmasters" was considered by your panel to be the fifth most effective link building tactic, yet "link acquisition from known link brokers/sellers" was the second highest negative ranking factor. Any thoughts on this? Does this reflect the fact that even though paid links in general have a bad reputation, they're still widely employed? I think that's correct. Link buying and selling is still a very popular activity in the SEO sphere, and while the engines continue to fight against it, they're unlikely to ever weed out 100% of the sites and pages that employ this methodology. Link acquisition via this methodology is incredibly attractive to businesses, and something the engines have themselves instilled as a behavior - with PPC ads, you spend more money and get more traffic. It's not unnatural that companies would feel they can apply the same principles to SEO. While I think the engines still have a long way to go on this front, I also believe that, at least at SEOmoz, where our risk tolerance is so low, the smartest way to go is to play by the engines' rules. Why spend a few hundred or few thousand dollars renting links when you could invest that in your site's content, user interface, public relations, social media marketing, etc., and have a long-term return that the engines are far less likely to ever discount?
Rob Laporte

Official: Selling Paid Links Can Hurt Your PageRank Or Rankings On Google - 0 views

  • Oct 7, 2007 at 5:38pm Eastern by Danny Sullivan    Official: Selling Paid Links Can Hurt Your PageRank Or Rankings On Google More and more, I’ve been seeing people wondering if they’ve lost traffic on Google because they were detected to be selling paid links. However, Google’s generally never penalized sites for link selling. If spotted, in most cases all Google would do is prevent links from a site or pages in a site from passing PageRank. Now that’s changing. If you sell links, Google might indeed penalize your site plus drop the PageRank score that shows for it.
Rob Laporte

Wake Up SEOs, the New Google is Here | SEOmoz - 0 views

  •  
    Rel="author" and rel="publisher" are the solution Google is adopting in order to better control, among other things, the spam pollution of the SERPs. If you are a blogger, you are incentivized to mark your content with Author markup and link it to your G+ profile, and as a site, you are incentivized to create your G+ business page and to promote it with a badge on your site that carries rel="publisher" in its code. Trusted seeds are no longer only sites; they can also be persons (e.g., Rand or Danny Sullivan) or social facets of an entity… so the closer I am in the social graph to those persons/entities, the more trusted I am in Google's eyes. As we can see, Google is not trying to rely only on the link graph, as it is quite easy to game, but it is not simply adding social signals to the link graph, because those too can be gamed. What Google is doing is creating and refining a new graph that combines the link graph, social graph, and trust graph, and which is possibly harder to game. It can still be gamed, but - hopefully - only with so much effort that gaming it may become non-viable as a practice. Wake up SEOs, the new Google is here. As a conclusion, let me borrow what Larry Page wrote on Google+ (bold is mine): Our ultimate ambition is to transform the overall Google experience […] because we understand what you want and can deliver it instantly. This means baking identity and sharing into all of our products so that we build a real relationship with our users. Sharing on the web will be like sharing in real life across all your stuff. You'll have better, more relevant search results and ads. Think about it this way … last quarter, we've shipped the +, and now we're going to ship the Google part. I think that says it all, and what we have lived through for a year now is explained clearly by Larry Page's words. What can we do as SEOs? Evolve, because SEO is not dying, but SEOs can if they don't assume that winter - oops - the…
Rob Laporte

Rand Fishkin On Buying Links For SEO - PubCon Review | SEO.com - 0 views

  • About an hour ago, I attended a session about purchasing links to influence your search engine rankings. I have always thought of buying links as black hat SEO and Rand reconfirmed my belief this morning. I wanted to quickly summarize his presentation on purchasing links, why you shouldn’t do it, and what you should do instead. Believe it or not, Google employs a team dedicated to searching for webspam. They invest lots of time and resources into finding and shutting down effective paid linking opportunities. This is the number one reason why you should never participate in paid links. Once Google finds you, you are done!
jack_fox

4 Google My Business Fields That Impact Ranking (and 3 That Don't) - Whiteboard Friday ... - 0 views

  • you do want to kind of think and possibly even test what page on your website to link your Google My Business listing to. Often people link to the homepage, which is fine. But we have also found with multi-location businesses sometimes it is better to link to a location page.
  • we have found that review quantity does make an impact on ranking. But that being said, we've also found that it has kind of diminishing returns. So for example, if you're a business and you go from having no reviews to, let's say, 20 or 30 reviews, you might start to see your business rank further away from your office, which is great. But if you go from, let's say, 30 to 70, you may not see the same lift.
    • jack_fox
       
      I would argue though that recent reviews are a big CTR factor, especially due to COVID.
Rob Laporte

Combining Trust and Relevance - Search Engine Watch (SEW) - 0 views

  • What Happens When You Launch a New Site Section? If there's a close relationship between your new site section and the historical trusted aspect of the site, you'll likely pick up some traffic quite quickly. However, sites stall a bit after that. They get a little taste of the good traffic for their new section, but then it stops growing. It will remain frozen for a period of time, but then, if you're doing the right things (developing quality content, link building), you may see a jump in traffic. My own conjecture is that a combination of quality inbound links and time raises the trust level of the new site section. Once you cross a trust threshold, you enable a new period of growth until you hit the next threshold. Then the cycle repeats. I've seen this behavior several times now during the development and promotion of new sections of content on existing sites. How Can You Speed Things Along? We already mentioned the two most important things above. Developing quality content was one of them. While search engine crawlers can't measure content quality in a direct sense, they can understand the relevance and depth of a Web page, provided you put enough text out there for them to chew on. And if a new site section is really thin on content, you can send negative signals to the search engines. The other thing you need to do? Our old friend, link building. At least some of the signals for evaluating trust are based on link analysis. Getting high-quality links from high-quality sites will help you establish that trust. The above is a sandbox scenario, but applied to a new content section on an existing site, it operates much the same way. You benefit from the inherent trust of the existing domain, but you still need to prove yourself to the search engines by getting new links to the new section itself.
Rob Laporte

BIZyCart SEO Manual - Controlled Navigation - 0 views

  • How The Robots Work Without getting into the programming details, robots and web crawlers basically follow these steps:
    • On arrival, the robot pulls out all of the readable text it is interested in and creates a list of the links found on the page. Links set as 'nofollow' or 'disallowed' are not added to the list. If there are too many links, the robot may take a special action based on that.
    • While the first robot finishes processing the page, another robot script is launched to follow each of the links. If there are ten links, there are now eleven robots running.
    • Each of those robot scripts loads the page it was sent to and builds another link list. Unless told otherwise, if there are ten links on each of those pages, one hundred additional robots get launched.
    • Before going to the next page, each robot checks whether that page has already been looked at. If it was already indexed that day, the robot cancels itself and stops.
    • The number of robots keeps expanding until all of the links have been followed and the site's web pages have been indexed or avoided.
    You can see that on some sites, thousands of robot processes can be taking turns working a web page. There is a physical limit on how much memory is available on the server; if the number of active robots exceeds that, they have to be canceled or memory corruption will occur. If you let the robots run in too many directions, they may not finish looking at every web page, or the results from some pages may get scrambled. You are also subject to the number of robots on that server that are looking at other web sites. Poorly managed robot servers can end up creating very strange results.
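The traversal described above (follow every link, skip nofollow, and cancel when a page has already been indexed) can be sketched as a simple sequential breadth-first crawl over an in-memory site map. Real crawlers parallelize this, as the excerpt describes; the function, the `pages` structure, and the URLs here are all hypothetical:

```python
from collections import deque

def crawl(start, pages, max_links_per_page=100):
    """Sequential sketch of the crawl described above.

    `pages` maps a URL to a list of (link, nofollow) pairs. Nofollow
    links are never queued, a visited set plays the role of the
    "already looked at" check, and a per-page cap stands in for the
    "special action" taken when a page has too many links.
    """
    visited, queue, order = set(), deque([start]), []
    while queue:
        url = queue.popleft()
        if url in visited or url not in pages:
            continue  # already indexed, or points off the known site
        visited.add(url)
        order.append(url)
        for link, nofollow in pages[url][:max_links_per_page]:
            if not nofollow and link not in visited:
                queue.append(link)
    return order

site = {
    "/": [("/a", False), ("/b", False), ("/secret", True)],
    "/a": [("/b", False)],
    "/b": [("/", False)],
}
print(crawl("/", site))  # → ['/', '/a', '/b']
```

The visited set is what keeps the process from exploding: without it, the `/b → /` cycle in this tiny site map would loop forever, which mirrors the runaway-robot problem the excerpt warns about.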
Rob Laporte

Selling text links ads thorugh TLA or DLA result in Google penalty? - 0 views

  • Can selling text link ads in the sidebar using TLA or Direct-Link-Ads result in a Google penalty? I used to use TLA for one of my sites but stopped using them for fear of Google dropping the site, because I heard a few rumors on webmaster forums of this happening. Is this concrete or not? Are people still using TLA or DLA or something similar?
    C7Mike #:3930956, 4:52 am on June 11, 2009 (utc 0): Yes, you may receive a penalty for purchasing links that pass PageRank. See Google's Webmasters/Site owner Help topic for more information: [google.com...]
    Automotive site #:3930991, 6:42 am on June 11, 2009 (utc 0): Well, I was actually going to use one of those to sell and not purchase. Anyway, I am going to apply to BuyandSellAds and see if I get accepted there, but I heard they mostly accept tech-related sites.
    C7Mike #:3931237, 2:25 pm on June 11, 2009 (utc 0): You may receive a penalty for both buying and selling paid links that pass PageRank (see [google.com...]). I have had a few sites lose their PR because they published links through TLA. However, the content was still good enough that advertisers have continued to purchase links on those pages through TLA in spite of the lack of PR, and at a substantially lower rate.
Rob Laporte

Will Selling Links via Text Link Ads SLAM your PageRank? - Webmaster Central Help - 0 views

  • Will Selling Links via Text Link Ads SLAM your PageRank? uploadjockey, 1/6/10
    I have read the FAQs and checked for similar issues: YES
    My site's URL is: http://www.uploadjockey.com
    Description (including timeline of any changes made): Removed Text Link Ads. We started to sell links via text-link-ads.com for some additional income. I cannot say for certain that this caused the problem, but it seems like it did. Our PageRank has dropped from a PR4 to a PR0 in less than a year. Our traffic has dropped from over 75k+ unique hits a day to just barely 20k+. Am I missing something? Is there some other violation that I could be missing that is killing our ranking results? Thanks
jack_fox

Does the URL You Link to in Google My Business Impact Ranking in the Local Pack? - Ster... - 0 views

  • The content on the specific URL you link to is important and impacts ranking in the local results.  For most businesses, it makes sense to link to the homepage since that URL has the most authority, backlinks, and relevance.  However, for some businesses that qualify to have multiple listings, using a strategy like this can help provide better results.
Rob Laporte

SEOmoz Crawls Web To Expand SEO Toolset - 0 views

  • Oct 6, 2008 at 8:06am Eastern by Barry Schwartz    SEOmoz Crawls Web To Expand SEO Toolset Rand Fishkin of SEOmoz announced they have been working for about a year on building out an index of the web, in order to be able to provide SEOs and SEMs with a toolset they have never had before. SEOmoz has crawled and built a 30 billion page index of the web. Rand explains this index is still growing and is refreshed monthly. The purpose, “to help SEOs and businesses acquire greater intelligence about the Internet’s vast landscape.” Part of the indexing was to build out a new tool named Linkscape. Linkscape gives SEOs “online access to the link data provided by our web index, including ordered, searchable lists of links for sites & pages, and metrics to help judge their value,” said Rand. I hope to play with it more after the SMX East conference, but with a quick trial, it seems pretty comprehensive. SEOmoz also launched a new design and has given PRO members more options and features. To read all about these features and the tools, see Rand’s post.
Jennifer Williams

11 Experts on Link Development Speak Out - Sugarrae - 0 views

  •  
    Last year, I asked a bunch of the link development pros to sit down and do an interview on the topic of developing links in regard to SEO. What I ended up with as a result was a six-thousand-plus-word tutorial on developing backlinks.