DISC Inc / Group items tagged: redirects

Rob Laporte

301 several pages to one page? - Search Engine Watch Forums - 0 views

  •  
    301 several pages to one page?

    fspezia: I have had good results by using a 301 from one old, unimportant page with good SERPs to a newer page. I have a number of old, unimportant pages, and I am wondering if it would be a good strategy to redirect them all, or a number of them. Example: old-what-are-widgets.html, old-blue-widget.html, and old-red-widget.html redirect to what-are-widgets.html, blue-widget.html, and red-widget.html respectively, or all to important-widget-page.html. The single page is much more important than the group of pages. Would I be wasting value by having them point to the single page?

    AussieWebmaster (Forums Editor, SearchEngineWatch): Send them to the most appropriate pages... your rank will increase across the board and pass around... giving the site lift and the individual pages for keywords.

    fspezia: In general I do redirect one page to one page, but I do have a case where there are 8 pages (6 have PR4) that are obsolete and provide no value to...
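    A minimal .htaccess sketch of the one-to-one mapping AussieWebmaster recommends, assuming Apache with mod_alias and using the hypothetical filenames from the thread:

        # One-to-one 301s: each retired page points at its closest current equivalent
        Redirect 301 /old-what-are-widgets.html /what-are-widgets.html
        Redirect 301 /old-blue-widget.html /blue-widget.html
        Redirect 301 /old-red-widget.html /red-widget.html

    Pointing all three at important-widget-page.html would also work, but relevance-matched targets are what spreads the value across individual keyword pages, per the replies above.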
Rob Laporte

Effective Internal Linking Strategies That Prevent Duplicate Content Nonsense - Search ... - 0 views

  •  
    The funny thing about duplicate content is that you don't really have to have it for it to appear as if you do. But whether you have duplicate content on your site or not, to the search engines appearances are everything. The engines are pretty much just mindless bots that can't reason. They only see what is, or appears to be, there and then do what the programmers have determined through the algorithm. How you set up your internal linking structure plays a significant role in whether you set yourself up to appear as if you have duplicate content on your site. Some things we do without thinking, setting ourselves up for problems ahead. With a little foresight and planning, you can prevent duplicate content issues that are a result of poor internal link development. For example, we know that when we link to site.com/page1.html in one place but then link to www.site.com/page1.html in another, we are really linking to the same page. But to the search engines, the www. can make a difference. They'll often look at those two links as links to two separate pages, and then analyze each page as if it is a duplicate of the other. But there is something we can do with our internal linking to alleviate this kind of appearance of duplicate content: link to the www. version only. Tomorrow I'll provide information on how to set up your site so that when someone types in yoursite.com they are automatically redirected to www.yoursite.com. It's a great permanent fix, but as a safety measure, I also recommend simply adjusting all your internal links to do the same. [The article's screenshot shows a domain that contains the www., while the navigation links point to pages without the www.] Even if you have a permanent redirect in place, all the links on your site should point to the proper place. At the very least you're making the search engines and visitors NOT have to redirect. At best, should y...
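    The non-www to www fix promised in the excerpt is typically a three-line rewrite. A minimal sketch, assuming Apache with mod_rewrite enabled (yoursite.com stands in for your real domain):

        RewriteEngine On
        # 301 any request for the bare domain to the canonical www host
        RewriteCond %{HTTP_HOST} ^yoursite\.com$ [NC]
        RewriteRule ^(.*)$ http://www.yoursite.com/$1 [R=301,L]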
Rob Laporte

Redirects: Good, Bad & Conditional - 0 views

  •  
    There's one workaround I will leave you with that negates the use of redirects altogether, including conditional ones. It's useful specifically for tracking, and involves appending tracking information to URLs in such a way that tracked URLs are automatically collapsed by the engines. No, it doesn't involve JavaScript. Curiously, I don't ever hear this method being discussed. The method makes use of the # (hash or pound character), which is normally used to direct visitors to an anchored part of a web page. Simply append a # to your URL followed by the tracking code or ID. For example: www.example.com/widgets.php#partner42. Search engines will ignore the # and everything after it; thus, PageRank is aggregated and duplicates are avoided. Hopefully this has challenged you to think critically about redirects (temporary, permanent, and conditional) and their implications for SEO. Opt for permanent (301) over temporary (302) if you want the link juice to transfer. Conditional redirects should be avoided, especially if your risk tolerance for penalization is low. If you take a good hard look at your "need" for conditional redirects, I think you may find you don't really need them at all.
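    In .htaccess terms, the permanent-versus-temporary choice looks like this (hypothetical paths; use one directive or the other, not both). The fragment trick needs no server rule at all, because everything after the # never reaches the server:

        # Temporary: engines keep the old URL indexed; link juice may not transfer
        # Redirect 302 /widgets-promo.php /widgets.php

        # Permanent: engines index the target and pass PageRank to it
        Redirect 301 /widgets-promo.php /widgets.php

        # Fragment tracking: /widgets.php#partner42 is served exactly like
        # /widgets.php, so PageRank aggregates without any redirect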
jack_fox

JavaScript Redirects : TechSEO - 0 views

  •  
    "They should work fine for Google, but keep in mind not all search engines are processing JS. Still, I would do what's best and easiest for you and that probably is the JS redirects. 3 Reply Share Report Save level 2 garyillyes 1 day ago ^ this, what the Patrick said. we used js redirects on webmasters.googleblog.com because that was the only thing we could use for 1:1 redirects, and it works on Google, but i see other search engines are having a tougher time picking them up. edit: if i have had a choice, i wouldn't have used js redirects, ever. alas. i haven't"
jack_fox

Google Shares How 301 Redirects Pass PageRank - Search Engine Journal - 0 views

  • A redirect from one page to an entirely different page will result in no PageRank being passed and will be considered a soft 404.
  • The 301 redirect will pass 100% PageRank only if it redirects to a new page that closely matches the topic of the old page.
  • Is there any link equity loss from redirect chains?
  • ...1 more annotation...
  • John Mueller answered: “For the most part that is not a problem. We can forward PageRank through 301 and 302 redirects. Essentially what happens there is we use these redirects to pick a canonical. By picking a canonical we’re concentrating all the signals that go to those URLs to the canonical URL.”
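    Mueller's canonical-picking point is also the argument for flattening redirect chains. A hypothetical mod_alias sketch:

        # A chain still forwards PageRank, but forces two hops:
        #   Redirect 301 /old.html /interim.html
        #   Redirect 301 /interim.html /final.html

        # Flattened: every retired URL points straight at the canonical target
        Redirect 301 /old.html /final.html
        Redirect 301 /interim.html /final.html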
Rob Laporte

Geo-Targeting Redirects: Cloaking or Better User Experience? - Search Engine Watch (SEW) - 0 views

  • If you have your site set to detect a visitor's location and show content based on that, I would recommend the following: Serve a unique URL for distinct content. For instance, don't show English content to US visitors on mysite.com and French content to French visitors on mysite.com. Instead, redirect English visitors to mysite.com/en and French visitors to mysite.com/fr. That way search engines can index the French content using the mysite.com/fr URL and the English content using the mysite.com/en URL. Provide links to enable visitors (and search engines) to access other language/country content. For instance, if I'm in Zurich, you might redirect me to the Swiss page, but provide a link to the US version of the page. Or, simply present visitors with a home page that enables them to choose the country. You can always store the selection in a cookie so visitors are redirected automatically after the first time. (A configuration sketch follows these notes.)
  • Google's policies aren't as inflexible as you're trying to portray. The same Google page you quote also says that intent ought to be a major consideration (just as when evaluating pages with hidden content). Also, why would Google's guidelines prevent you from using geotargeting without an immediate redirect? Just because you don't immediately redirect search users to a different page doesn't mean you have to ask for their zip code instead of using IP-based geotargeting. Lastly, I don't think such redirects from SERPs improve user experience at all. If I click on a search result, it's because that's the content I'm interested in. It's very annoying to click on a search result and get a page completely different from the SERP snippet. And what about someone who is on business in a different country? Search engines already provide different language localizations, as well as language search options, to favor localized pages for a particular region. So if someone goes to the French Google, they will see the French version of localized sites/pages. If they're seeing the U.S. version in their SERPs, it's because you didn't SEO or localize your pages properly, or they deliberately used the U.S. version of Google. Don't second-guess your users. Instead, focus on making sure that users know about your localized pages and can access them easily (by choice, not through force).

    Bill Hunt: Frank, you're spot on as usual. We still keep chasing this issue, and as I wrote on SEW last year in my article on language-detection issues (http://searchenginewatch.com/3634625), it is often the implementation that is the problem rather than the redirect itself. Yes, it is exactly cloaking (maybe gray hat) when you have a single gateway such as "example.com" and a person coming from Germany sees the site in German while a visitor with a New York IP sees English. Engines typically crawl from a central location like Mountain View or Zurich, so they would only see the English version, since they would not present signals for any other location. Where you really get into a tricky area is if you set things up so that any search engine user agent can access whichever version it asks for, yet a human is restricted - a sort of reverse cloaking. If Mr. GoogleBot wants the French home page, let him have it rather than sending him to the US homepage. With the growth of CDNs (content delivery networks), I am seeing more and more of these issues crop up to handle load balancing as well as other forms of geographical targeting. I have a long list of global, multinational, and enterprise-related challenges that are complicated by many of Google's outdated ways of handling kindergarten-level spam tactics. Sounds like a SES New York session...
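    One way to implement the unique-URL recommendation is an IP-based country lookup that redirects only the bare homepage, leaving /en/ and /fr/ directly crawlable. A sketch assuming Apache with MaxMind's mod_geoip loaded and enabled in the server config (the module, country codes, and paths are assumptions, not from the article); a 302 is used because the right answer varies by visitor:

        RewriteEngine On
        # Redirect only the bare homepage so deep URLs stay directly accessible
        RewriteCond %{ENV:GEOIP_COUNTRY_CODE} ^FR$
        RewriteRule ^$ /fr/ [R=302,L]
        RewriteCond %{ENV:GEOIP_COUNTRY_CODE} ^(US|GB)$
        RewriteRule ^$ /en/ [R=302,L]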
Rob Laporte

Capital Letters (Pascal Casing) in URLs - Advantages and Disadvantages - 0 views

  •  
    I noticed CNN uses some capital letters, and sometimes whole words in capitals, in their URLs. Here is what I thought of as the advantages and disadvantages; please feel free to share more ideas. The advantages: you make the word stand out, and some search engines might put more emphasis on those words. The disadvantages: it makes it more difficult for users to type in the URL or suggest the link over the phone, and it may confuse users, making them think URLs, like domains, are not case sensitive at all.

    webing: I thought URLs were not case sensitive? I just tried my domain name in capital letters and it redirected me to the non-capital version, so I do think domains are not case sensitive. Sorry if I'm completely wrong.

    pageoneresults: You know, it's funny you should start this topic. I was just getting ready to do a full-blown topic on Pascal Casing and "visual" marketing advantages. I started a topic back in September 2007: Domain Names and Pascal Casing, http://www.webmasterworld.com/domain_names/3457393.htm. No, domain names are not case sensitive. These past 12 months I've been on a mission, changing everything to Pascal Casing when it comes to domain names. It's much easier to read and separate words, and it just looks nicer. I've been experimenting with this and it works. Google AdWords is a great place to test the effectiveness of Pascal Casing. What's really cool is that you can literally change your hard-coded references to Pascal Casing and, when you hover over them, they show lower case. It's a browser feature, I guess. I never gave it much thought until this past year when I started my changes. I've also gone one step further and use Pascal Casing in full addresses. We have a rewrite in place that forces lower case, so we can do pretty much whatever we want with the URI and file naming.
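    The force-lowercase rewrite pageoneresults describes usually relies on Apache's internal tolower map. A sketch, with the caveat that RewriteMap is only valid in the main server or virtual-host config, not in .htaccess:

        # httpd.conf / virtual host:
        RewriteMap lc int:tolower

        # then the redirect rule (here, or in .htaccess once the map exists):
        RewriteEngine On
        RewriteCond %{REQUEST_URI} [A-Z]
        RewriteRule (.*) ${lc:$1} [R=301,L]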
jack_fox

How to Use the Site Move Tool - Bing Webmaster Tools - 0 views

  • Although permanently redirecting your URLs using 301 redirects is sufficient for Bing to understand that you want the new URLs to be indexed instead of the old ones and this normally happens automatically, the Site Move tool can help expedite this process.
Rob Laporte

Feedburner Goes All Permanent on Their URL Redirects - Search Marketing News Blog - Sea... - 0 views

  • September 30, 2009 - Feedburner Goes All Permanent on Their URL Redirects. If you've ever clicked on a link in your RSS reader and that link is associated with a site that uses Feedburner, you've probably noticed that the initial URL to appear in your browser's address bar was related to the feed and not the final URL. That's because Feedburner uses the URL to track the click. The redirect was a 302, a temporary redirect. But now Feedburner is updating the URLs to be permanent 301 redirects. Feedburner, which is owned by Google, says that the reason for the change was that some search engines index the feeds, which affects the popularity of a site. If you use Feedburner, you don't have to do anything special. The update is automatic.
Rob Laporte

65+ Best Free SEO Chrome Extensions (As Voted-for by SEO Community) - 1 views

  • Link Redirect Trace — Uncovers all URLs in a redirect chain including 301s, 302s, etc. Very useful for finding (and regaining) lost “link juice,” amongst other things. Other similar extensions: Redirect Path
  • Scraper — Scrape data from any web page using XPath or jQuery. Integrates with Google Sheets for one-click export to a spreadsheet. Or you can copy to clipboard and paste into Excel. Other similar extensions: Data Scraper — Easy Web Scraping, XPather
  • Tag Assistant (by Google) — Check for the correct installation of Google tags (e.g. Google Analytics, Tag Manager, etc) on any website. Also, record typical user flows on your website to diagnose and fix implementation errors.
  • ...16 more annotations...
  • Web Developer — Adds a web developer toolbar to Chrome. Use it to check how your website looks on different screen sizes, find images with missing alt text, and more.
  • WhatRuns — Instantly discover what runs any website. It uncovers the CMS, plugins, themes, ad networks, fonts, frameworks, analytics tools, everything.
  • Page Load Time — Measures and displays page load time in the toolbar. Also breaks down this metric by event to give you deeper insights. Simple, but very useful.
  • FATRANK — Tells you where the webpage you’re visiting ranks in Google for any keyword/phrase.
  • SEOStack Keyword Tool — Finds thousands of low-competition, long-tail keywords in seconds. It does this by scraping Google, YouTube, Bing, Yahoo, Amazon, and eBay. All data can be exported to CSV.
  • Window Resizer — Resize your browser window to see how a website looks on screens of different sizes. It has one-click emulation for popular sizes/resolutions (e.g. iPhone, iPad, laptop, desktop, etc).
  • Ghostery — Tells you how websites are tracking you (e.g. Facebook Custom Audiences, Google Analytics, etc) and blocks them. Very useful for regaining privacy. Plus, websites generally load faster when they don’t need to load tracking technologies.
  • Ayima Page Insights — Uncovers technical and on-page issues for any web page. It also connects to Google Search Console for additional insights on your web properties.
  • ObservePoint TagDebugger — Audit and debug issues with website tags (e.g. Google Analytics, Tag Manager, etc) on your websites. Also checks variables and on-click events. Other similar extensions: Event Tracking Tracker
  • The Tech SEO — Quick Click Website Audit — Provides pre-formatted links (for the current URL) to a bunch of popular SEO tools. A very underrated tool that reduces the need for mundane copy/pasting.
  • User-Agent Switcher for Chrome — Mimic user-agents to check that your website displays correctly in different browsers and/or OSes.
  • Portent’s SEO Page Review — Reviews the current page and kicks back a bunch of data including meta tags, canonicals, outbound links, H1-H6 tags, OpenGraph tags, and more.
  • FindLinks — Highlights all clickable links/elements on a web page in bright yellow. Very useful for finding links on websites with weird CSS styling.
  • SERPTrends SEO Extension — Tracks your Google, Bing, and Yahoo searches. Then, if you perform the same search again, it shows ranking movements directly in the SERPs.
  • SimilarTech Prospecting — Discovers a ton of useful information about the website you’re visiting. This includes estimated monthly traffic, company information, social profiles, web technologies, etc.
  • SEO Search Simulator by Nightwatch — Emulates Google searches from any location. Very useful for seeing how rankings vary for a particular query in different parts of the world.
  •  
    "Find Out How Much Traffic a Website Gets: 3 Ways Compared"
Rob Laporte

Is it OK to HTTP redirect images? - Stack Overflow - 0 views

  • The one thing you should definitely avoid is redirecting many images on a page. This will severely slow down page load time, especially on high-latency networks (e.g. phone, China, satellite internet) where each new HTTP request takes a long time. Also, HTTP clients are limited to a small number of simultaneous HTTP connections per server hostname, so even on fast networks you'll end up with a bottleneck. Redirecting 1 or 2 images on a page is not a big deal, however.
    • Rob Laporte
       
      301 redirects are very fast, so I assume it would take a lot of images being 301'd to slow things down too much, but I'm only 80% sure of this.
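    Where an image redirect is unavoidable, it is a one-line mod_alias directive (hypothetical paths); the cost is the extra round trip per request, which is why a page full of redirected images is the thing to avoid:

        # Fine for one or two images; dozens of these add a round trip each
        Redirect 301 /images/old-logo.png /images/new-logo.png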
jack_fox

Google will pass permanent signals with a redirect after a year - 0 views

  • If you have clients that really want to remove redirects for whatever reason, and the redirect has been live for a year or more, it is safe to do so from an SEO perspective, specific to Google Search. More importantly, if the redirect is removed over time because of just normal maintenance and it has been a year, you still do not need to worry.
Rob Laporte

Google Webmaster Tools Now Provide Source Data For Broken Links - 0 views

  • Google has also added functionality to the Webmaster Tools API to enable site owners to provide input on control settings (such as preferred domain and crawl rate) that could previously only be done via the application. As they note in the blog post: “This is especially useful if you have a large number of sites. With the Webmaster Tools API, you can perform hundreds of operations in the time that it would take to add and verify a single site through the web interface.”
  •  
    Oct 13, 2008, at 5:28pm Eastern, by Vanessa Fox. Google Webmaster Tools Now Provide Source Data For Broken Links. Ever since Google Webmaster Tools started reporting on broken links to a site, webmasters have been asking for the sources of those links. Today, Google has delivered. From Webmaster Tools you can now see the page that each broken link is coming from. This information should be of great help to webmasters in ensuring that visitors find their sites and that their links are properly credited. The value of the 404 error report: Why does Google report broken links in the first place? As Googlebot crawls the web, it stores a list of all the links it finds. It then uses that list for a couple of things: as the source list to crawl more pages on the web, and to help calculate PageRank. If your site has a page with the URL www.example.com/mypage.html and someone links to it using the URL www.example.com/mpage.html, then a few things can happen: visitors who click on that link arrive at the 404 page for your site and aren't able to get to the content they were looking for; Googlebot follows that link and, instead of finding a valid page of your site to crawl, receives a 404 page; and Google can't use that link to give a specific page on your site link credit (because it has no page to credit). Clearly, knowing about broken links to your site is valuable. The best solution in these situations generally is to implement a 301 redirect from the incorrect URL to the correct one. If you see a 404 error for www.example.com/mpage.html, then you can be pretty sure they meant to link to www.example.com/mypage.html. By implementing the redirect, visitors who click the link find the right content, Googlebot finds the content, and mypage.html gets credit for the link. In addition, you can scan your site to see if any of the broken links are internal, and fix them. But finding broken links on your site can be tedious (although it's valuable to run a broken l...
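    The fix described for the mpage.html example is a single mod_alias line (paths taken from the article's example):

        # Catch the misspelled inbound link and credit the real page
        Redirect 301 /mpage.html /mypage.html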
Rob Laporte

SEOmoz | Announcing SEOmoz's Index of the Web and the Launch of our Linkscape Tool - 0 views

  •  
    After 12 long months of brainstorming, testing, developing, and analyzing, the wait is finally over. Today, I'm ecstatic to announce some very big developments here at SEOmoz. They include: an index of the World Wide Web, 30 billion pages (and growing!), refreshed monthly, built to help SEOs and businesses acquire greater intelligence about the Internet's vast landscape; Linkscape, a tool enabling online access to the link data provided by our web index, including ordered, searchable lists of links for sites and pages, and metrics to help judge their value; a fresh design that gives SEOmoz a more usable, enjoyable, and consistent browsing experience; and new features for PRO membership, including more membership options, credits to run advanced Linkscape reports (for all PRO members), and more. Since there's an incredible amount of material, I'll do my best to explain things clearly and concisely, covering each of the big changes. If you're feeling more visual, you can also check out our Linkscape comic, which introduces the web index and tool in a more humorous fashion. SEOmoz's Index of the Web: For too long, data that is essential to the practice of search engine optimization has been inaccessible to all but a handful of search engineers. The connections between pages (links) and the relationships between links, URLs, and the web as a whole (link metrics) play a critical role in how search engines analyze the web and judge individual sites and pages. Professional SEOs and site owners of all kinds deserve to know more about how their properties are being referenced in such a system. We believe there are thousands of valuable applications for this data and have already put some effort into retrieving a few fascinating statistics: across the web, 58% of all links are to internal pages on the same domain, while 42% point to pages off the linking site; 1.83%...
Dale Webb

Inbound links: Official Google Webmaster Central Blog - 0 views

  • So how can you engage more users and potentially increase merit-based inbound links? Many webmasters have written about their success in growing their audience. We've compiled several ideas and resources that can improve the web for all users. Create unique and compelling content on your site and the web in general. Start a blog: make videos, do original research, and post interesting stuff on a regular basis. If you're passionate about your site's topic, there are lots of great avenues to engage more users. If you're interested in blogging, see our Help Center for specific tips for bloggers.
  •  
    How they factor into ranking. Most importantly: Google appropriately flows PageRank and related signals through 301 redirects!
jack_fox

How to Build White-Hat Links by Using the Link Reclamation Tool - 0 views

  • A. Create a landing page with dedicated content, letting users know that the info is no longer available on that page, but that they can access similar data somewhere else. B. 301 redirect: the best way to ensure that users and search engines are directed to a page with the correct info. A 301 redirect is a permanent one which passes link juice and equity to the redirected page.
  • Archives such as Wayback Machine or Warrick can help you recover the original content
jack_fox

How To Use GSC's Crawl Stats Reporting To Analyze and Troubleshoot Site Moves (Domain N... - 0 views

  • By analyzing the source domain name that’s part of the migration, you can view urls that Googlebot is coming across that end up as 404s. And that can help you find gaps in your 301 redirection plan.
  • Although there’s a lag in the data populating (3-4 days), the Crawl Stats reporting can sure help surface problems during domain name changes and url migrations