
DISC Inc / Group items matching "404" in title, tags, annotations, or URL

Rob Laporte

Official Google Webmaster Central Blog: Make your 404 pages more useful - 0 views

  •  
    Make your 404 pages more useful. Tuesday, August 19, 2008 at 10:13 AM. Your visitors may stumble into a 404 "Not found" page on your website for a variety of reasons:
    * A mistyped URL, or a copy-and-paste mistake
    * Broken or truncated links on web pages or in an email message
    * Moved or deleted content
    Confronted by a 404 page, they may then attempt to manually correct the URL, click the back button, or even navigate away from your site. As hinted in an earlier post for "404 week at Webmaster Central", there are various ways to help your visitors get out of the dead-end situation. In our quest to make 404 pages more useful, we've just added a section in Webmaster Tools called "Enhance 404 pages". If you've created a custom 404 page, this allows you to embed a widget in your 404 page that helps your visitors find what they're looking for by providing suggestions based on the incorrect URL.
    Example: Jamie receives the link www.example.com/activities/adventurecruise.html in an email message. Because of formatting by a bad email client, the URL is truncated to www.example.com/activities/adventur. As a result it returns a 404 page. With the 404 widget added, however, she could instead see suggestions. In addition to attempting to correct the URL, the 404 widget also suggests the following, if available:
    * a link to the parent subdirectory
    * a sitemap webpage
    * site search query suggestions and search box
    How do you add the widget? Visit the "Enhance 404 pages" section in Webmaster Tools, which allows you to generate a JavaScript snippet. You can then copy and paste this into your custom 404 page's code. As always, don't forget to return a proper 404 code.
    Can you change the way it looks? Sure. We leave the HTML unstyled initially, but you can edit the CSS block that we've included. For more information, check out our gu
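The post's closing reminder, that a custom 404 page must still return a real 404 status code, is the part that most often goes wrong: sites that serve their "not found" template with a 200 end up with soft 404s. A minimal sketch of that idea, assuming a Node/Express server (my choice of stack, not something the post specifies):

```typescript
import express from "express";

const app = express();

// ...the site's normal routes would be registered here...

// Catch-all handler: any unmatched URL gets the custom "not found" page,
// but the HTTP status is a real 404, not a 200, so crawlers see the error.
app.use((req, res) => {
  res
    .status(404)
    .send("<h1>Page not found</h1><p>Try the search box or the sitemap.</p>");
});

app.listen(3000);
```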
Rob Laporte

Official Google Webmaster Central Blog: More on 404 - 0 views

  • Have you guys seen any good 404s? Yes, we have! (Confession: no one asked us this question, but few things are as fun to discuss as response codes. :) We've put together a list of some of our favorite 404 pages. If you have more 404-related questions, let us know, and thanks for joining us for 404 week!
    * http://www.metrokitchen.com/nice-404-page : "If you're looking for an item that's no longer stocked (as I was), this makes it really easy to find an alternative." -Riona, domestigeek
    * http://www.comedycentral.com/another-404 : "Blame the robot monkeys" -Reid, tells really bad jokes
    * http://www.splicemusic.com/and-another : "Boost your 'Time on site' metrics with a 404 page like this." -Susan, dabbler in music and Analytics
    * http://www.treachery.net/wow-more-404s : "It's not reassuring, but it's definitive." -Jonathan, has trained actual spiders to build websites, ants handle the 404s
    * http://www.apple.com/iPhone4g : "Good with respect to usability."
    * http://thcnet.net/lost-in-a-forest : "At least there's a mailbox." -JohnMu, adventurous
    * http://lookitsme.co.uk/404 : "It's pretty cute. :)" -Jessica, likes cute things
    * http://www.orangecoat.com/a-404-page.html : "Flow charts rule." -Sahala, internet traveller
    * http://icanhascheezburger.com/iz-404-page : "I can has useful links and even e-mail address for questions! But they could have added 'OH NOES! IZ MISSING PAGE! MAYBE TIPO OR BROKN LINKZ?' so folks'd know what's up." -Adam, lindy hop geek
jack_fox

- 0 views

  • a 404 is a 404, it's always been like this. Why would a search engine want to index the contents of something you're saying doesn't exist? It's not a matter of interpretation, it's a clear signal. 404s are perfectly fine & have a place on *every* website.
Rob Laporte

Google Webmaster Tools Now Provide Source Data For Broken Links - 0 views

  • Google has also added functionality to the Webmaster Tools API to enable site owners to provide input on control settings (such as preferred domain and crawl rate) that could previously only be done via the application. As they note in the blog post: “This is especially useful if you have a large number of sites. With the Webmaster Tools API, you can perform hundreds of operations in the time that it would take to add and verify a single site through the web interface.”
  •  
    Oct 13, 2008 at 5:28pm Eastern, by Vanessa Fox. Google Webmaster Tools Now Provide Source Data For Broken Links. Ever since Google Webmaster Tools started reporting on broken links to a site, webmasters have been asking for the sources of those links. Today, Google has delivered. From Webmaster Tools you can now see the page that each broken link is coming from. This information should be of great help to webmasters in ensuring that visitors find their sites and that their links are properly credited.
    The value of the 404 error report: Why does Google report broken links in the first place? As Googlebot crawls the web, it stores a list of all the links it finds. It then uses that list for a couple of things:
    * As the source list to crawl more pages on the web
    * To help calculate PageRank
    If your site has a page with the URL www.example.com/mypage.html and someone links to it using the URL www.example.com/mpage.html, then a few things can happen:
    * Visitors who click on that link arrive at the 404 page for your site and aren't able to get to the content they were looking for
    * Googlebot follows that link and, instead of finding a valid page of your site to crawl, receives a 404 page
    * Google can't use that link to give a specific page on your site link credit (because it has no page to credit)
    Clearly, knowing about broken links to your site is valuable. The best solution in these situations generally is to implement a 301 redirect from the incorrect URL to the correct one. If you see a 404 error for www.example.com/mpage.html, then you can be pretty sure they meant to link to www.example.com/mypage.html. By implementing the redirect, visitors who click the link find the right content, Googlebot finds the content, and mypage.html gets credit for the link. In addition, you can scan your site to see if any of the broken links are internal, and fix them. But finding broken links on your site can be tedious (although it's valuable to run a broken l
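A minimal sketch of the fix the article describes, assuming an Express server (the article itself is server-agnostic), reusing the example.com URLs above as the hypothetical broken and correct paths:

```typescript
import express from "express";

const app = express();

// Incorrect inbound URLs seen as 404s in the error report, mapped to the
// pages they were meant to reach.
const redirects: Record<string, string> = {
  "/mpage.html": "/mypage.html",
};

app.use((req, res, next) => {
  const target = redirects[req.path];
  if (target) {
    // 301 (permanent): visitors, Googlebot, and link credit all follow it.
    res.redirect(301, target);
  } else {
    next();
  }
});
```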
Dale Webb

Customizing Google's 404 Widget with a CSS modification - 0 views

  •  
    Google's handy 404 Error Page Enhancement in Google Webmaster Tools can be modified to ensure visitors stay on your site.
Rob Laporte

301 vs. 410 vs. 404 vs. Canonical | LinkedIn - 0 views

  • However, after looking at how webmasters use them in practice we are now treating the 410 HTTP result code as a bit "more permanent" than a 404. So if you're absolutely sure that a page no longer exists and will never exist again, using a 410 would likely be a good thing. I don't think it's worth rewriting a server to change from 404 to 410, but if you're looking at that part of your code anyway, you might as well choose the "permanent" result code if you can be absolutely sure that the URL will not be used again. If you can't be sure of that (for whatever reason), then I would recommend sticking to the 404 HTTP result code.
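A minimal sketch of the distinction Mueller draws, assuming an Express server (my choice, nothing in the quote prescribes one): URLs you are sure are gone for good return 410, and everything else unmatched falls through to an ordinary 404.

```typescript
import express from "express";

const app = express();

// URLs we are absolutely sure will never exist again.
const gone = new Set(["/old-product", "/2009-promo"]);

app.use((req, res, next) => {
  if (gone.has(req.path)) {
    // 410 Gone: the slightly "more permanent" signal described above.
    res.status(410).send("This page has been permanently removed.");
  } else {
    next();
  }
});

// Anything still unmatched is an ordinary 404.
app.use((req, res) => {
  res.status(404).send("Page not found.");
});
```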
jack_fox

404 (Page Not Found) errors - Search Console Help - 0 views

  • Many (most?) 404 errors are not worth fixing because 404s don't harm your site's indexing or ranking.
  • Don't create fake content, redirect to your homepage, or use robots.txt to block 404s—all of these things make it harder for us to recognize your site’s structure and process it properly.
Rob Laporte

Deduping Duplicate Content - ClickZ - 0 views

  •  
    One interesting thing that came out of SES San Jose's Duplicate Content and Multiple Site Issues session in August was the sheer volume of duplicate content on the Web. Ivan Davtchev, Yahoo's lead product manager for search relevance, said "more than 30 percent of the Web is made up of duplicate content." At first I thought, "Wow! Three out of every 10 pages consist of duplicate content on the Web." My second thought was, "Sheesh, the Web is one tangled mess of equally irrelevant content." Small wonder trust and linkage play such significant roles in determining a domain's overall authority and consequent relevancy in the search engines.
    Three Flavors of Bleh: Davtchev went on to explain three basic types of duplicate content:
    1. Accidental content duplication: This occurs when Webmasters unintentionally allow content to be replicated by non-canonicalization (define), session IDs, soft 404s (define), and the like.
    2. Dodgy content duplication: This primarily consists of replicating content across multiple domains.
    3. Abusive content duplication: This includes scraper spammers, weaving or stitching (mixed and matched content to create "new" content), and bulk content replication.
    Fortunately, Greg Grothaus from Google's search quality team had already addressed the duplicate content penalty myth, noting that Google "tries hard to index and show pages with distinct information." It's common knowledge that Google uses a checksum-like method for initially filtering out replicated content. For example, most Web sites have a regular and a print version of each article. Google only wants to serve up one copy of the content in its search results, which is predominately determined by linking prowess. Because most print-ready pages are dead-end URLs sans site navigation, it's relatively simple to tell which page Google prefers to serve up in its search results. In exceptional cases of content duplication that Google perceives as an abusive attempt to manipula
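The "checksum-like method" mentioned above can be illustrated with a rough sketch (a simplification for illustration, not Google's actual pipeline): normalize each page's text, hash it, and keep only the first URL seen per hash.

```typescript
import { createHash } from "node:crypto";

// Illustrative only: collapse near-identical pages (e.g. regular vs. print
// versions) by hashing their normalized text content.
function dedupeByChecksum(pages: { url: string; text: string }[]): string[] {
  const seen = new Map<string, string>(); // checksum -> first URL kept
  for (const page of pages) {
    const normalized = page.text.toLowerCase().replace(/\s+/g, " ").trim();
    const checksum = createHash("sha1").update(normalized).digest("hex");
    if (!seen.has(checksum)) {
      seen.set(checksum, page.url);
    }
  }
  return [...seen.values()];
}
```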
jack_fox

How to Create a Spectacular 404 Error Page (with 12 Examples) - 0 views

  • Includes a search bar for navigating your website
  • Link to your popular posts or homepage
  • This works because humans are curious beings. By explaining the why and proposing a solution, you motivate the user to take corrective action. 
  • ...1 more annotation...
  • Allow the user to get in touch with you
jack_fox

Do 404s Hurt SEO and Rankings? - 0 views

  • We only redirect them when they get traffic or have backlinks pointing to them. If you change a URL or delete a page and nobody links to it or it gets absolutely no traffic (check with Google Analytics), it’s perfectly fine for it to return a 404.  
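A hedged sketch of that decision rule; the inputs and function name are hypothetical, not from the article. Removed URLs that still get traffic or have backlinks are queued for 301s, and the rest are simply allowed to return 404.

```typescript
// Hypothetical inputs: removed URLs plus sets built from your analytics
// export and your backlink report.
function urlsWorthRedirecting(
  removedUrls: string[],
  urlsWithTraffic: Set<string>,
  urlsWithBacklinks: Set<string>,
): string[] {
  return removedUrls.filter(
    (url) => urlsWithTraffic.has(url) || urlsWithBacklinks.has(url),
  );
}

// Anything not returned here can safely keep returning a 404.
```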
jack_fox

How To Use GSC's Crawl Stats Reporting To Analyze and Troubleshoot Site Moves (Domain Name Changes and URL Migrations) - 0 views

  • By analyzing the source domain name that’s part of the migration, you can view urls that Googlebot is coming across that end up as 404s. And that can help you find gaps in your 301 redirection plan.
  • Although there’s a lag in the data populating (3-4 days), the Crawl Stats reporting can sure help surface problems during domain name changes and url migrations
Rob Laporte

Google Toolbar, IE9, and 404 status on NTFD page - 1 views

  •  
    "Google Toolbar's "Provide suggestions on navigation errors""
Rob Laporte

Capital Letters (Pascal Casing) in URLs - Advantages and Disadvantages - 0 views

  •  
    I noticed CNN uses some capital letters, and sometimes whole words in capitals, in their URLs. Here is what I thought of the advantages and disadvantages, and please feel free to share some more ideas.
    The advantages:
    # You make the word stand out
    # Some search engines might put more emphasis on those words
    The disadvantages:
    # It makes it more difficult for users to type in the URL or suggest the link via phone.
    # It may confuse users, making them think URLs, like domains, are not case sensitive at all.
    webing #:3652026, 6:04 pm on May 16, 2008 (utc 0): i thought urls were not case sensitive? i just tried my domain name in capital letters and it redirected me to the non capital letters so i do think domains are not case sensitive. sorry if i'm completly wrong ^^.
    pageoneresults #:3652029, 6:10 pm on May 16, 2008 (utc 0): You know, it's funny you should start this topic. I was just getting ready to do a full blown topic on Pascal Casing and "visual" marketing advantages. I started a topic back in September 2007 here... Domain Names and Pascal Casing http://www.webmasterworld.com/domain_names/3457393.htm
    No, domain names are not case sensitive. These past 12 months I've been on a mission and changing everything to Pascal Casing when it comes to domain names. It's much easier to read and separate words and it just looks nicer. I've been experimenting with this and it works. Google AdWords is a great place to test the effectiveness of Pascal Casing. What's really cool is that you can literally change your hard coded references to Pascal Casing and when you hover over them, they show lower case. It's a browser feature I guess. I never gave it much thought until this past year when I started my changes. I've also gone one step further and use Pascal Casing in full addresses. We have a rewrite in place that forces lower case so we can do pretty much whatever we want with the URI and file naming. [edited by: pageoneresults at 6:11 pm (utc) on May 16, 2008] ted
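pageoneresults mentions "a rewrite in place that forces lower case" so that Pascal-cased links still resolve to one canonical URL. A minimal sketch of that idea, assuming an Express server (the thread doesn't say what their stack was):

```typescript
import express from "express";

const app = express();

// Force lowercase paths, e.g. /Kitchen-Sinks/Faucets -> /kitchen-sinks/faucets,
// via a 301 so mixed-case links all consolidate on one canonical URL.
app.use((req, res, next) => {
  const lower = req.path.toLowerCase();
  if (req.path !== lower) {
    res.redirect(301, lower + req.url.slice(req.path.length));
  } else {
    next();
  }
});
```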
Rob Laporte

BruceClay - SEO Newsletter - FEATURE: Takeaways from SMX Advanced Seattle 2010 - 0 views

  • You & A with Matt Cutts of Google: Google's new Web indexing system, Caffeine, is fully live. The new indexing infrastructure translates to an index that is 50 percent fresher, has more storage capacity, and can recognize more connections between pieces of information. The Mayday update was an algorithm update implemented at the beginning of May that is intended to filter out low-quality search results. A new report in the Crawl errors section of Google Webmaster Tools indicates "soft 404" errors in order to help webmasters recognize and resolve them.
    Keynote Q&A with Yusuf Mehdi of Microsoft: Bing is opening up new ways to interact with maps. The newly released Bing Map App SDK allows developers to create their own applications, which can be used to overlay information on maps. Bing Social integrates the Facebook firehose and Twitter results into a social search vertical. Bing plans to have the final stages of the Yahoo! organic and paid search integration completed by the end of 2010. Decisions about how to maintain or integrate Yahoo! Site Explorer have not been finalized. Bing's Webmaster Tools are about to undergo a major update. Refer to the Bing Webmaster Tools session for more on this development.
  • Bing's program manager said that the functionality provided by Yahoo! Site Explorer will still be available. It's not their intention to alienate SEOs because they consider SEOs users, too.
  • The Bing Webmaster team has built a new Webmaster Tools platform from the ground up. It is scheduled to go live Summer 2010. The platform focuses on three key areas: crawl, index and traffic. Data in each area will go back through a six month period. Tree control is a new feature that provides a visual way to traverse the crawl and index details of a site. The rich visualizations are powered by Silverlight. URL submission and URL blocking will be available in the new Webmaster Tools.
  • ...1 more annotation...
  • The Ultimate Social Media Tools Session: Tools to get your message out: HelpaReporter, PitchEngine, Social Mention, ScoutLabs. Customer and user insight tools: Rapleaf, Flowtown. Tools to find influencers: Klout. Forum tools: Bing Boards, Omgili, Board Tracker, Board Reader. Digg tools: Digg Alerter, FriendStatistics, di66.net. Make use of the social tools offered by social networks, e.g. utilize Facebook's many options to update your page and communicate with your fans by SMS. Encourage people to follow you using Twitter's short code.
jack_fox

Soft 404 errors - Search Console Help - 0 views

  • If you think that your page is incorrectly flagged as a soft 404, use the URL Inspection tool to examine the rendered content and the returned HTTP code.
Rob Laporte

Proof That 301 Redirects To Less-Relevant Pages Are Seen As Soft 404s To Google [Case Study] - 0 views

  •  
    "Analytics Edge"
Rob Laporte

How To 301 Redirect Images During a Website Redesign or CMS Migration - The Most Forgotten Step When Changing URLs - 0 views

  •  
    "Google can see those pages as soft 404s"