DISC Inc / Group items tagged Other

Rob Laporte

Nofollow Monstrosity - 0 views

  •  
    Many people link to social sites from their blogs and websites, and they rarely put 'nofollow' on those links. Most social sites, on the other hand, now put 'nofollow' on all external links by default. The consequence? Bookmark your new site 'example123.com' at 'stumbleupon.com', then google for 'example123': the stumbleupon.com page about it (with no content but the link and title) will be on top, while your site (with the actual content) that you searched for will be below. Imagine what effect this PageRank capitalization has when you search for things other than your domain name!

    Each site and blog owner contributes to this unknowingly and voluntarily. Do any of these social bookmark buttons look familiar? Most blogs and sites have at least a few of them on almost every single page. Not a single one of these buttons has 'nofollow', meaning that people give a very good chunk of their site's importance to these social sites (hint: importance that you give to these buttons is importance taken away from other internal links on your site). Most social sites, however, do have 'nofollow' on the link pointing back to people's sites after users submit them. In short, people give them a lot of credit on almost every page, while these sites give nothing in return. (Two 'good' sites among these, that I know of, are Digg, which does not use 'nofollow', and Slashdot, which tries to identify real spam and puts 'nofollow' on those links only. There are probably a few more.)

    This can be easily prevented, and PageRank can be re-distributed, in no time. The solution is very simple: 'Do unto others as you would have others do unto you.' If you have a WordPress blog (as millions of internet users do), download the plugins Antisocial and Nofollow Reciprocity. The first puts 'nofollow' on the buttons above; the second puts 'nofollow' on all external links pointing to 'bad' sites. If you are using some other blogging app
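
    For illustration, a minimal sketch of the markup change being advocated (the URLs and link text here are invented): adding rel="nofollow" to outbound social-bookmark buttons so they stop passing PageRank, while ordinary internal links stay followed.

        <!-- Outbound social buttons: nofollow keeps link equity on your own site -->
        <a href="https://www.stumbleupon.com/submit?url=https://example123.com/" rel="nofollow">StumbleUpon</a>
        <a href="https://del.icio.us/post?url=https://example123.com/" rel="nofollow">Save to Delicious</a>

        <!-- An ordinary internal link, left followed -->
        <a href="/archives/">Archives</a>
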
Rob Laporte

SEOmoz | Announcing SEOmoz's Index of the Web and the Launch of our Linkscape Tool - 0 views

  •  
    After 12 long months of brainstorming, testing, developing, and analyzing, the wait is finally over. Today, I'm ecstatic to announce some very big developments here at SEOmoz. They include:
    * An Index of the World Wide Web - 30 billion pages (and growing!), refreshed monthly, built to help SEOs and businesses acquire greater intelligence about the Internet's vast landscape
    * Linkscape - a tool enabling online access to the link data provided by our web index, including ordered, searchable lists of links for sites & pages, and metrics to help judge their value
    * A Fresh Design - that gives SEOmoz a more usable, enjoyable, and consistent browsing experience
    * New Features for PRO Membership - including more membership options, credits to run advanced Linkscape reports (for all PRO members), and more.
    Since there's an incredible amount of material, I'll do my best to explain things clearly and concisely, covering each of the big changes. If you're feeling more visual, you can also check out our Linkscape comic, which introduces the web index and tool in a more humorous fashion: Check out the Linkscape Comic
    SEOmoz's Index of the Web
    For too long, data that is essential to the practice of search engine optimization has been inaccessible to all but a handful of search engineers. The connections between pages (links) and the relationship between links, URLs, and the web as a whole (link metrics) play a critical role in how search engines analyze the web and judge individual sites and pages. Professional SEOs and site owners of all kinds deserve to know more about how their properties are being referenced in such a system. We believe there are thousands of valuable applications for this data and have already put some effort into retrieving a few fascinating statistics:
    * Across the web, 58% of all links are to internal pages on the same domain, 42% point to pages off the linking site.
    * 1.83%
Rob Laporte

What You Can Learn From Google's "Site" Operator - 0 views

  • Though the “site:” operator can teach you a lot about how Google indexes your website, there are some things that it doesn’t show you. For example, the “site:” operator doesn’t show you what your SERP description will look like, or which pages of your website are most important. Often people see their search results from a “site:” operator and panic because the snippet or description that shows up underneath their URL is part of their navigation or something else that looks icky and not click-worthy. Don’t despair! The snippet that shows up when you use a “site:” operator query is rarely the same as the description that shows up for an actual keyword query. Perform some keyword searches yourself and you’ll see the difference. The other mistake people make is thinking that the order of the pages listed when you use the “site:” operator in the SERP shows the order of importance of those pages. While Google does tend to show the home page of a site before the other pages, the rest of the list isn’t sorted in any particular order of importance. So be careful about drawing any conclusions based on that.
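
    A few illustrative queries (example.com is a placeholder) showing the distinction drawn above between an index check and a real keyword search:

        site:example.com           roughly: pages Google has indexed from the domain
        site:example.com/blog/     restrict the check to one section of the site
        best example widgets       a normal keyword query; the snippet and ordering
                                   shown here are what searchers actually see
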
jack_fox

Web Hosting 101 - The Basics - 0 views

  • Linux servers are open source and can be based on a number of different distributions, such as Ubuntu, Debian, Red Hat, or CentOS; FreeBSD is a separate open-source OS that many hosts also offer.
  • the most common forms of web hosting available are: Free Web Hosting Shared Web Hosting Managed Web Hosting VPS Web Hosting Dedicated Web Hosting Cloud Web Hosting
  • Free web hosting is offered by various companies primarily in order to up-sell other domain services or to publish advertising on pages that are hosted under the account.
  • With shared web hosting, there may be thousands of different businesses, individuals, and organizations all serving their website files to the public from the same computer.
  • The web hosting company employs systems administrators to manage the server software installation and security updates. The hosting clients use file transfer management tools to host web pages in HTML or other programming languages which serve the files to the public through the browser. The hard disk space on the remote server can be used for other purposes than web hosting, for example remote file storage, email accounts, sandbox web development, mobile app support, or running software scripts online.
  • Shared web hosting accounts can cost as little as $1 – $3 per month and rarely cost more than $20. It is estimated that over 90% of the websites on the internet use shared web hosting to keep their information online 24 hours a day. Shared web hosts never turn off their services and offer seamless hardware upgrades in the data center that can keep a website online for years. Most of the available web development tools will integrate easily with a shared hosting account.
  • The main disadvantage of shared web hosting is that it is not able to scale effectively to support the traffic of large websites and usually includes strict limitations on the use of CPU processing power because of the pooled resources. Shared web hosting does not typically support the user installation of server extensions through the command line that are important for custom web development and mobile app support.
  • There is still no opportunity for advanced systems administration and custom server configurations on most shared hosting plans. Security on shared web hosting frameworks is not considered robust enough for sensitive corporate information and government accounts. There can also be performance issues that develop on a server if one domain is consistently consuming shared resources or hit with a DDoS attack. Because systems administration and root server configuration control is taken out of the hands of shared web hosting users, they are often overly reliant on the service company for tech support.
  • Shared web hosting is recommended for self-published websites and small business networks.
  • Managed web hosting is a version of shared hosting where the service company specializes in platform-specific products that support custom development frameworks. Examples of this can be seen in Pantheon and Acquia Cloud for Drupal, Nexcess for Magento, or WP Engine for WordPress. Managed host companies provide optimized server environments that can speed up website performance and page load times for high-traffic, CMS-driven websites.
  • Virtual Private Servers (VPS) are a web hosting solution designed to give more power and flexibility to website owners for custom-developed software requirements and complex applications. Technically, a VPS operates in the same manner as a dedicated server, but it runs on a partitioned hardware framework that allots it only a fraction of the host machine's resources.
  • Understanding which virtualization platform the VPS web hosting company is using to manage data center resources and client configurations is important.
  • Developers often prefer VPS accounts because they can custom configure the server with the choice of operating system and install whatever additional server extensions are required for programming web applications
  • The main benefit of VPS hosting is that website owners can “dial in” the exact amount of server resources that are required to optimize the performance of a complex website.
  • The main disadvantage of VPS web hosting is the complexity of systems administration required to install and manage the server software, which requires a lot of command line knowledge and background in web server configuration.
  • Inexperienced users can leave security holes in the environment that hackers using automated script bots and known server or database exploits can easily detect and target. Using a standardized cPanel, CentOS, & WHM environment or administration panels like Webmin and Virtualmin can help simplify the server administration process considerably by adding a GUI layer to access common tasks
  • VPS web hosting accounts are best suited for developers who need to custom configure the environment with server extensions that shared web hosts will not support. Typically these are related to the use of database frameworks other than MySQL, programming languages other than PHP, and server frameworks other than Apache.
  • Dedicated web hosting is the most expensive and flexible of all of the service plans offered by companies in the industry, as site owners are able to directly rent or lease a complete rack-mount server in a data center.
  • Dedicated servers are required to host the largest sites by traffic on the web, as well as by mobile apps which require elite performance
  • The main disadvantage of a dedicated server is that it is costly compared to shared hosting or VPS plans, and expensive even when compared to the price of the underlying hardware itself. With dedicated servers, the client is paying not only for the use of the server, but also for the trained technicians who manage it, the overhead costs of the data center, and access to the internet backbone. Data center costs include not only rental of office and warehouse space, but also the electricity required to run all of the servers and keep them cool. Data centers must also have back-up power generation facilities in case the local electricity supply is cut. All of the residual costs are included in the annual price of a dedicated server plan. Nevertheless, it is still often much cheaper than what would be required to manage a data center for a single business independently.
  • Cloud web hosting provides solutions for websites that need more processing power and require more than a single server instance because the amount of online traffic, including the number of queries to the database and resource files, is too high in volume for a single machine
  • Cloud web hosting is defined by the deployment of server clusters that scale automatically with the user traffic and processing power needs of a website, including advanced software applications for elastic load balancing, file storage, and database optimization
  • Cloud web hosting is similar to content delivery networks (CDNs) which use distributed global servers, advanced page caching, and file management software to optimize website performance for large websites. Many cloud hosting companies will offer the ability to choose the operating system, database framework, and geographic location of the server itself as part of the configuration options.
  • Cloud web hosting is designed for remote computing applications and required by large web sites whose user traffic exceeds the limits of what a single server instance will provide. Cloud web hosting is particularly designed to meet the needs of websites with large database requirements.
  • Not every website will require cloud hosting, but small businesses and start-ups who scale their traffic and user communities often find managed cloud services a reasonable option over dedicated servers because of the ability to “pay-as-you-go” for only the amount of server resources used and for the ability to keep sites online through cluster scaling at the times of peak user service provision.
  • The major downside to cloud hosting is the uncertainty involved with the variability of costs with sites on the “pay-as-you-go” model. Another problem can be associated with “hype” in the industry, which can lead to over-pricing and over-billing for unnecessary services or introductory plans.
  • Cloud web hosting can be similar to VPS or dedicated server frameworks where the systems administrator has the ability to custom configure the installation of the operating system with software extensions that are not available in shared hosting environments. Some managed cloud hosts simplify this process by offering optimally configured solutions with a proprietary base software package.
  • Some of the main features to look for in any web hosting account are: Server Architecture Operating System Version Domain Management Tools Systems Administration Tools Bandwidth & CPU Limitations Free Offers & Promotions Data Security Technical Support
  • Before purchasing any web hosting account, it is essential to verify the server hardware being used on the platform. Currently there is wide variation among the different versions of Intel Xeon, Atom, Itanium, and AMD Opteron servers deployed in data center use.
  • The version of Linux installed, for example CentOS or Cloud Linux, can also come with licensing restrictions due to the use of WHM and cPanel. The use of other distributions like Ubuntu, Red Hat, or Debian (or non-Linux systems like FreeBSD) in web servers is mostly a matter of developer preference for systems administration
  • Many hosting companies claim to offer “unlimited” bandwidth and data transfer. However, if a website uses too many CPU resources, it may still be throttled or taken offline at times of peak user traffic.
  • A web hosting company should provide a guaranteed uptime of at least 99.9% as part of the service plan.
  • Website owners should make sure that any web hosting plan will be configured securely, including firewalls and monitoring software to prevent intrusions by automated script-bot attacks. Check whether the web hosting company offers DDoS attack protection and auto-alerts for unauthorized logins. While shared web hosting plans include the company services related to upgrading the installed operating system and server software with the latest security patches, VPS and dedicated server accounts will need to be responsible for this through a qualified systems administrator. Web hosting companies that provide automated site file and database back-up tools like raid disk mirroring on advanced accounts provide an extra layer of site security in case of a server crash or technical error that leads to data loss.
  • Managed hosts have the advantage of experienced technical support teams with platform-specific knowledge
  • Small business website owners and independent publishers should start with a shared web hosting account, then upgrade to a VPS or Cloud hosting plan if the traffic scales beyond what the server will support.
  • While shared hosting under cPanel and CentOS remains the industry standard, innovations in cloud computing are changing the web hosting landscape quickly. Many web hosting companies are now offering “hybrid” approaches that combine the best features of cloud and shared hosting into a high-performance, low-cost retail plan that offers integrated load balancing, page caching, and CDN services on elite server hardware configurations.
Rob Laporte

SEO Solutions for Multi-Country Sites: Multi-Lingual XML Sitemaps | ClickZ - 0 views

  •  
    For these reasons, option two, editing the sitemap.xml, is the better method. It only concerns one file per version of the website, doesn't affect page loading times, and can be easily used with other file types. Issues: although this solves the problem for Google searches, the method isn't universally recognized by other search engines like Bing and Yahoo, which still send consistent (if low) traffic that nevertheless converts. Despite the lack of support by Bing, this method is a great stratagem for working with the biggest player in the search game, Google. It permits you to do region-specific targeting of your website in search without incurring penalties associated with duplicate or similar content; an SEO win! This can also be achieved on other search engines. Bing, for example, allows you to make such a distinction with a meta tag inserted into the HTML page or a change to the HTTP headers; a harder solution than Google's, but still recommended.
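
    As a rough sketch of the sitemap-based approach described above (the domain and language codes are placeholders; verify the exact syntax against current search engine documentation), each URL entry lists its alternate regional versions:

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
                xmlns:xhtml="http://www.w3.org/1999/xhtml">
          <url>
            <loc>https://example.com/en-gb/</loc>
            <!-- Tell Google which regional variants serve equivalent content -->
            <xhtml:link rel="alternate" hreflang="en-gb" href="https://example.com/en-gb/"/>
            <xhtml:link rel="alternate" hreflang="en-us" href="https://example.com/en-us/"/>
            <xhtml:link rel="alternate" hreflang="fr-fr" href="https://example.com/fr-fr/"/>
          </url>
        </urlset>

    The Bing alternative mentioned at the end of the excerpt is a page-level tag along the lines of <meta http-equiv="content-language" content="en-gb">, or the equivalent HTTP header.
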
Rob Laporte

Google Starts To Classify Content Types In Web Search - 0 views

  • Oct 9, 2008 at 3:00pm Eastern, by Matt McGee. Like other search engines, Google already distinguishes between various types of content. You can search specifically for images, videos, books, blog posts, and so forth; Google has separate search engines for each. But two recent changes suggest that Google is improving its ability to classify different types of content gathered from ordinary web pages. Search Engine Roundtable points to a discussion on WebmasterWorld about the addition of dates at the beginning of some search results, something Michael Gray spotted in mid-September. From my personal experience, this seems to be happening mostly on content that Google can identify as blog posts and news articles, but not exclusively on those types of content. And speaking of identifying types of content, Google Operating System points out that Google is starting to show special forum-related information in search results when it can identify that the result comes from a message board. Author Alex Chitu suggests this could mean new advanced search options in the future: "This new feature shows that Google is able to automatically classify web pages and to extract relevant information. Once Google starts to show data for other kinds of web pages, we can expect to see an option to restrict the search results to a certain category (forums, reviews, blogs, news articles)." The screenshot above has examples of both cases, the top showing dates in the snippets, and the bottom showing forum information.
Rob Laporte

Live Search Webmaster Center Blog : The key to picking the right keywords (SEM 101) - 0 views

  •  
    Tool time
    Lastly, augment all of that good data with professional keyword research tools. Microsoft offers a tool called the adCenter Excel Add-in Keyword Research Tool (for Excel 2003 and 2007). (Note: You'll need to set up an adCenter account before you can use the tool. Luckily, unlike most other online ad vendors, adCenter offers customer support over the phone with a real person - at no cost to you! - to help you get your account set up and running.) Both Google and Yahoo! offer their own keyword research tools. In addition, there are many third-party keyword research tools available, some for free, others for a fee. The adCenter Excel Add-in Keyword Research Tool can do the following:
    * Scan your current website and extract the keywords that offer the highest confidence levels based on their current usage
    * Suggest new keywords based on user behavior or your existing keyword list
    * Provide:
      o Research data on top performing keywords
      o Performance data on the keywords you specify
      o Information on keyword usage based on geographic and demographic data
    Note that the keyword tool is primarily designed to help users figure out which keywords to use with their Pay-Per-Click (PPC) advertising campaigns. However, the tool's output is also extremely relevant to developing or revising a keyword list for your website as part of an SEO update. We'll talk about the process of creating a PPC campaign in later posts. To use the tool, I recommend adding your keywords (one word per line) to an empty Excel spreadsheet, listing them in column A. Select the words for which you want to see adCenter's confidence level rating, click the Ad Intelligence tab, and then click the lower half of the Keyword Suggestion button on the toolbar, using both the Contained and Similarity tasks. You'll get a list of additional suggested keywords and phrases that correspond to each of the keywords you selected. Use the ones that are relevant
Rob Laporte

Google Removes Directory Links From Webmaster Guidelines - 0 views

  • Oct 3, 2008 at 9:48am Eastern, by Barry Schwartz. Brian Ussery reported that Google has dropped two important bullet points from the Google Webmaster Guidelines:
    * Have other relevant sites link to yours.
    * Submit your site to relevant directories such as the Open Directory Project and Yahoo!, as well as to other industry-specific expert sites.
    At the same time, Google Blogoscoped reported that Google removed the dictionary link in the search results, at the top right of the results page. Related? I am not sure. I speculated that maybe Google is going to go after more directories in the future; by removing those two bullet points, maybe Google can do this without seeming all that hypocritical. In addition, I noted a comment from Google's John Mueller in a Google Groups thread where he explained the logic behind removing those two points: "I wouldn't necessarily assume that we're devaluing Yahoo's links, I just think it's not one of the things we really need to recommend. If people think that a directory is going to bring them lots of visitors (I had a visitor from the DMOZ once), then it's obviously fine to get listed there. It's not something that people have to do though :-)." As you can imagine, this is causing a bit of a commotion in some of the forums. Some are worried, some are mad, and some are confused by the change.
Rob Laporte

Domain Moving Day the Key Relevance Way | SEMClubHouse - Key Relevance Blog - 0 views

  •  
    By Mike Churchill. So, you're gonna change hosting providers. In many cases, moving the content of the site is as easy as zipping up the content and unzipping it on the new server. There is another aspect of moving the domain that many people overlook: DNS. The Domain Name System (DNS) is the translation service that converts your domain name (e.g. keyrelevance.com) to the corresponding IP address. When you move hosting companies, it's like changing houses: if you don't set up the Change of Address information correctly, you might have some visitors going to the old address for a while. Proper handling of the changes to DNS records makes this transition time as short as possible.
    Let's assume that you are changing hosting, and the new hosting company is going to start handling the Authoritative DNS for the domain. The first step is to configure the new hosting company as the authority. This is best done a couple of days or more before the site moves to the new location.
    What does "Authoritative DNS" mean? There are a double-handful of servers (known as the Root DNS servers) whose purpose is to keep track of who is keeping track of the IP addresses for a domain. Rather than handling EVERY DNS request, they only keep track of who is the authoritative publisher of the DNS information for each domain. In other words, they don't know your address, but they tell you who does know it. If we tell the Root level DNS servers that the authority is changing, this information may take up to 48 hours to propagate throughout the internet. By changing the authority without changing the IP addresses, both the old authority and the new authority will agree on the address while visiting browsers make requests during this transition (so no traffic gets forwarded before you move).
    Shortening the Transition: The authoritative DNS servers want to minimize their load, so every time they send out an answer to a
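
    A quick, illustrative way to check the handoff described above (the host names are placeholders) is to query DNS directly and confirm that the old and new authoritative servers answer identically before the cutover:

        # Which name servers does the registry currently list as authoritative?
        dig NS example.com +short

        # Ask the old and the new authority directly; the answers should match until the move
        dig @ns1.oldhost.example www.example.com A +short
        dig @ns1.newhost.example www.example.com A +short
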
Rob Laporte

Page 3 - Textlinkbrokers.com & text-link-ads.com - SEO Chat - 0 views

  • Jarrod u seem pretty convincing here. I sent a mail to Brigette (ur account manager) last month and asked a few simple questions regarding the services. Not a single answer was convincing enough to buy your services and that's when i decided not to purchase links through u. Here are the excerpts:
    Quote: 1. What if we decide to discontinue your service in the future? Do we lose all the purchased back links in that case?
    TLB: If you rent links, they would come down. However, if you purchase products that are permanently placed, we do not take them down.
    But you don't place text links permanently. Even your permanent package gives only a 6 month guarantee.
    Quote: 2. How can we secure the ownership of our purchased links? What if the webmaster removed the link we have purchased after some time, or moved the link to some other location or some other web page, or changed the anchor text of the link, or added a large number of other external links (maybe from our competitors) and thus reduced our link weight, or made our link nofollow, or deleted the web page or shut down the website? Can we claim any compensation or refund in that case?
    TLB: Each of our products has different minimums and guarantees. Our permanent links that are included in the "Booster Package" have a 3 month guarantee. During this time we have a script that ensures your link stays live. If, for some reason, it were to come down we would replace it free of charge. Beyond that, you would have no recourse. However, if you purchase a permanent link package, they have a 6 month guarantee that works the same way.
    Do you call this a convincing reply?
    Quote: 3. How can you ensure us that you will not get our website penalized or banned by Google through your back links? What if our website gets penalized or banned by Google because of the links you have purchased for us? What is your policy in that case?
    TLB: We take every step possible to ensure that does not happen. We do things very differently than most link building companies. We do not use software, feeds or auto generated code of any kind. Each of our links is manually placed on 100% SEO friendly sites. Everyone who is accepted into our inventory goes through an extensive approval process. We deny applications daily for not meeting the large number of criteria our Quality Assurance team looks at. Once they are accepted into inventory, their information is not posted on the web site. They are not allowed to post anything on their site that says they are affiliated with us in any way. They are not asked to and not allowed to backlink to us under any circumstances. We take the protection of our Inventory Partners and our clients very seriously. If a potential client goes to our website to view inventory, they will only see general information such as a description, page rank, site age, number of outbound links, etc. The only way to view the actual url is to sign a non-disclosure agreement. That is only done after speaking with a Customer Service Representative or Account Manager who would create the list for you. So, as you can see, for years we have done everything we can do to protect our inventory partners as well as our clients. Our goal is to make you successful so that we can continue with a long term business relationship. If we do not protect our partners and they get penalized, your links will not pass SEO value. Therefore, we take that very seriously.
    Your so-called forbidden inventory is just one report away from Google's web spam team.
Once identified, everyone associated with it will bust like a bubble. IMO that's the risk rand was talking about.
  • Himanshu160, I only wish that I could replicate myself, wouldn't that be great. I would be happy to discuss other options with you outside of the forums or get you to one of our senior account reps. I do not handle very many sales and this isn't the place for it. As for our perm links, most of those are placed on sites that we do not control, thus it becomes too costly to guarantee them forever. We have found that if they have stayed up for 6 months the churn rate is fairly low after that. The 3 month guarantee is being offered at a cheaper rate and usually only used in our bundles. Again, if it has stayed live for 3 months the churn rate isn't going to be very high after that. There are advantages to being on our controlled inventory but also some disadvantages. With our controlled inventory we can make sure every link we place stays up; those tend to be the links we charge monthly, although we have done some custom perm links on controlled inventory. The disadvantage is that if someone reports one of our controlled sites to Google it can lose value, and of course some sites are at more risk than others because they sell a lot of links or they sell homepage links in the sidebar, etc. We do have inventory that is cleaner than others and we can even do exclusive deals so that you are the only one on the site. It all depends on your budget. For most low competition keywords one of our cheap link bundles is all that is needed. Sure, some of the links will go down over time, and yes, Google may devalue some. However, there are always new links being built to replace the few that go down, so the result is a nice increase in rankings over time.
Rob Laporte

Google's December 2020 Core Update Themes - 0 views

  • The data and overall consensus point to Google’s December 2020 Core Update being one of the more impactful algorithm adjustments to hit the SERP over the past year or so.
  • I prefer to look at core updates almost from a pure content and UX perspective. For me, it’s about the specific pages Google swaps out more than it is a per domain analysis.
  • I am performing a qualitative analysis
  • I am not making any sort of definitive statements
  • What moves me, however, is when I look at 100 keywords I start seeing the same content-oriented theme arise again and again.
  • What I’m trying to say, and as you’ll see in the examples I will get into later, is that the content that was more focused on the specific topic mentioned in the query did better. So while the “ultimate guide” here did get to the topic the query deals with, it was not exclusively about that topic.
  • This might call the entire strategy of creating these ultimate guides into question. Perhaps you can’t cast a wide net in that way anymore? Perhaps the “ultimate guide” is only really suitable for people who actually want to get a more broad understanding of a topic? (Crazy to think, I know!)
  • The page from Rocket Mortgage, on the other hand, is only about how much you need for a down payment:
  • So too is the page from Quicken Loans:
  • The Moral of the Story: If I want to understand how much money on average I need to put down when buying a house or what the various options generally are and what they mean long term, the CFPB page, .gov or not, doesn’t really help me. Its content is not specifically homed in on that particular topic. Again, we have another page that takes a sweeping look at a topic that lost rankings when the query reflected a more specific sort of intent!
  • What’s interesting here is that unlike the previous examples, where too much content resulted in the page’s topical relevance being diluted, the lack of such content here is what I think caused the ranking loss. Look, it’s not bad content. However, it’s pretty much the "general” kind of content you see here, there, and everywhere for all sorts of topics. Just compare it to what the page from the Credit Card Insider offers:
  • This just oozes depth. The third topic on the page alone (6 Ways to Pay Off…) rivals the depth shown on the CreditCards.com page! What differentiates this page from the "guides” shown in the other examples is that this is a guide that drills deep into one topic as opposed to trying to span multiple subtopics. Also, have a look at the formatting, it reminds me of what we saw on the Motley Fool’s page:
  • It’s deep content that is easy to digest. It’s not hard to see why Google swapped these two pages.
  • The Moral of the Story: Exact content relevancy is not only about what topic you talk about. You can be topically aligned but it has to offer real information to the user. It’s even better when that information is digestible. In other words, if you want to rank for a keyword with topic specificity it might be better to create an "ultimate guide” that drills deep into the topic itself versus trying to cover every subtopic under the sun in order to try to rank for more topics with one piece of content.
  • The by-line really sums it up. It tells you this article is about the fact that you most likely won't get addicted to painkillers, but it’s definitely possible so here’s the scoop. To me, it’s far more in line with the average user’s intent of learning about the risks of addiction versus understanding the fine difference between addiction and dependence. It’s the same story with the WebMD page:
  • The Moral of the Story: Again, the issue here is not how authoritative or how substantial the content is. There is no doubt that content from the NIH is both substantial and authoritative. The issue here again seems to relate to Google being better able to show content that is specifically relevant to the nature of the query.
  • First things first, the page doesn’t speak to the query directly. While one could come to understand the signs of depression in the process of learning the difference between sadness and depression, that route is certainly indirect. You could argue that the query how to tell if you have depression could be taken as ‘how do I know if I am just sad or depressed?’ but that really doesn’t seem to be the essential intent here. That topical line (i.e., sadness vs. depression) would most likely produce its own unique query (i.e., am I sad or depressed). From the content shown on the WebMD page, it appears that Google thinks of the intent as understanding the symptoms of depression:
  • The WebMD page, in contradistinction to the MHA page, speaks to the “plain meaning” of the query’s intent… how can you tell if you’re suffering from depression? Aside from that, the WebMD page offers a bit more in terms of substance. While it doesn’t go into great detail per se, the WebMD page does offer a pretty comprehensive list of items. Compare that to the MHA page which, if you read it, is a bit thin and lacks tremendously in offering much of any detail (even a basic list as seen on the WebMD page). The Moral of the Story: Relevancy is a two-pronged equation (at minimum). It requires the content to be topically focused on the one hand and substantial on the other
  • I’ve saved the best for last. This is my favorite example that I came across when diving into the December 2020 Core Update. I mean, for crying out loud, we’re talking about the CDC losing rankings in favor of a .org domain I never heard of. How could this be? Let’s understand the intent of the query. If I were searching for this it would be because I found something on my body that I thought might be skin cancer. If I could be so bold, I would imagine that this is why most of us would search for this term. I wouldn’t, and again I imagine most people in most instances wouldn’t search for this in order to understand if regular screening is officially recommended or not. Yet, that is what the CDC page is about:
  • I hate to make assumptions, but I would also think that someone running this query is most likely not interested in the common tests and methods doctors use to determine if skin cancer is present. Yet, this is what the page from Cancer.net focuses on:
  • Again, I would search for this term if I saw something weird on my body that made me think "Holy crap, do I have skin cancer?”. The page from the AOCD is entirely made for people on the verge of freaking out at the possibility of having skin cancer:
  • To me, when you see this page relative to the pages from Cancer.net and the CDC, it is painfully obvious why this page got the ranking boost. The Moral of the Story: Again, I think what has transpired here is painfully obvious. Google has looked past the immediate authority of some of the pages here and has more heavily considered how relevant the content is to the query. As with the cases I have shown earlier, Google is rewarding content that speaks in a highly-focused way to the intent and nature of the query.
    What Was the December 2020 Core Update About? Are you expecting a one-liner that definitively characterizes the December 2020 update? You’re not going to get one from me. This update, like any other, certainly included a whole plethora of different “algorithmic considerations” and themes. That said, from where I sit, while other core updates did things to help put the most authoritative content at the top of the SERP, this update seemed to me to be more about pure relevancy. Updates of the past have done things to weed out sites using a marketing tone within YMYL informational content or have rewarded sites that put the right content ahead of their affiliate pursuits. All of that, while part of relevancy, speaks more to a need for something authoritative at the top of the SERP. Seeing so many .gov pages drop in favor of pages from sites like Healthline or WebMD seems to point to the update rewarding relevancy to the nth degree. Perhaps Google felt as if it had “authority” at the top of the SERP in order, paving the way for a focus on relevance? Who knows. All I can say is that I personally have not seen such a strong focus on pure relevance on page one of the SERP.
    Content Creation Takeaways: Practically speaking, I think the era of broadly reaching pages is quickly coming to an end. I think that has been the case for a while. However, seeing Google pull pages off page one of the SERP because they deal with multiple subtopics is a new level, at least for me. It shows that you have to create content that talks about one topic and one topic only (unless the keyword reflects a specific desire for a broader survey of a topic). I wonder if the idea of having one ultimate guide so as to win numerous keywords should be replaced with multiple posts, where each post deals with one specific topic or subtopic. (If you do that, please don’t create thin content; that is not what I am advocating for.) It’s a rather logical concept. As Google gets better at understanding content it is going to prefer highly-focused content around a specific topic to that which is of a more broad nature, unless the query specifically shows intent for a general survey of a topic.
Rob Laporte

65+ Best Free SEO Chrome Extensions (As Voted-for by SEO Community) - 1 views

  • Link Redirect Trace — Uncovers all URLs in a redirect chain including 301’s, 302’s, etc. Very useful for finding (and regaining) lost “link juice,” amongst other things. Other similar extensions: Redirect Path
  • Scraper — Scrape data from any web page using XPath or jQuery. Integrates with Google Sheets for one-click export to a spreadsheet. Or you can copy to clipboard and paste into Excel. Other similar extensions: Data Scraper — Easy Web Scraping, XPather
  • Tag Assistant (by Google) — Check for the correct installation of Google tags (e.g. Google Analytics, Tag Manager, etc) on any website. Also, record typical user flows on your website to diagnose and fix implementation errors.
  • Web Developer — Adds a web developer toolbar to Chrome. Use it to check how your website looks on different screen sizes, find images with missing alt text, and more.
  • WhatRuns — Instantly discover what runs any website. It uncovers the CMS, plugins, themes, ad networks, fonts, frameworks, analytics tools, everything.
  • Page Load Time — Measures and displays page load time in the toolbar. Also breaks down this metric by event to give you deeper insights. Simple, but very useful.
  • FATRANK — Tells you where the webpage you’re visiting ranks in Google for any keyword/phrase.
  • SEOStack Keyword Tool — Finds thousands of low-competition, long-tail keywords in seconds. It does this by scraping Google, Youtube, Bing, Yahoo, Amazon, and eBay. All data can be exported to CSV.
  • Window Resizer — Resize your browser window to see how a website looks on screens of different sizes. It has one-click emulation for popular sizes/resolutions (e.g. iPhone, iPad, laptop, desktop, etc).
  • Ghostery — Tells you how websites are tracking you (e.g. Facebook Custom Audiences, Google Analytics, etc) and blocks them. Very useful for regaining privacy. Plus, websites generally load faster when they don’t need to load tracking technologies.
  • Ayima Page Insights — Uncovers technical and on-page issues for any web page. It also connects to Google Search Console for additional insights on your web properties.
  • ObservePoint TagDebugger — Audit and debug issues with website tags (e.g. Google Analytics, Tag Manager, etc) on your websites. Also checks variables and on-click events. Other similar extensions: Event Tracking Tracker
  • The Tech SEO — Quick Click Website Audit — Provides pre-formatted links (for the current URL) to a bunch of popular SEO tools. A very underrated tool that reduces the need for mundane copy/pasting.
  • User-Agent Switcher for Chrome — Mimic user-agents to check that your website displays correctly in different browsers and/or OS’.
  • Portent’s SEO Page Review — Reviews the current page and kicks back a bunch of data including meta tags, canonicals, outbound links, H1-H6 tags, OpenGraph tags, and more.
  • FindLinks — Highlights all clickable links/elements on a web page in bright yellow. Very useful for finding links on websites with weird CSS styling.
  • SERPTrends SEO Extension — Tracks your Google, Bing, and Yahoo searches. Then, if you perform the same search again, it shows ranking movements directly in the SERPs.
  • SimilarTech Prospecting — Discovers a ton of useful information about the website you’re visiting. This includes estimated monthly traffic, company information, social profiles, web technologies, etc.
  • SEO Search Simulator by Nightwatch — Emulates Google searches from any location. Very useful for seeing how rankings vary for a particular query in different parts of the world.
  •  
    "Find Out How Much Traffic a Website Gets: 3 Ways Compared"
Rob Laporte

Honey, Social Media Shrunk Big Business - ClickZ - 0 views

  •  
    Marketing Has Become Personal (Again)
    When the Big Guys want to look like Small Players, they make deep investments, mostly in social media. If you look at Coca-Cola's Facebook Page, for example, it doesn't look remarkably different from any other Facebook Page, even those created by tiny companies. On that Facebook Page, Coca-Cola -- one of the largest companies in the world and possibly the most recognized brand on the globe -- is presenting itself as not just small but also personal and approachable. In fact, if you are a fan of its page, you can write on its wall. Coke has videos of its fans and simple pictures of people enjoying a Coke. These aren't professional, glossy images but the sort of pictures we've come to expect online: a bit grainy, not well lit, and very real looking.
    The rule, and indeed the opportunity, of the new medium is to make your marketing personal. You need a bit of guts to do it. We all have a natural tendency to speak and act in ways we feel are professional when doing business, and this is true online as well. But social media is the single most important media space for brands right now, and its nature is different. If you are a big brand, you don't need to pretend you are small, but you do need to find ways to become approachable, engaging, and personal in the way that small brands do.
    Let's Get Small
    There are a few rules to follow when you try to get more personal in your marketing. Use these methods and you can start putting some real faces next to the brands consumers think they know:
    * Start with the current fans. This is really the great story of the Coca-Cola page. It was started by two guys who simply loved Coke, not by the company itself. They amassed a following of brand loyalists, totally on their own. The company came to these guys and asked for the opportunity to help them out and keep them involved. Exactly what you would do if you were an actual human being, not a great big company more concerned with protectin
Rob Laporte

Microsoft Tests Social Media Monitoring Product - ClickZ - 0 views

  • By Christopher Heine, ClickZ, Sep 24, 2009. Microsoft has developed a social media analytics tool that's designed, among other things, to improve a marketing organization's ability to adjust to social media phenomena on the fly. Called "Looking Glass," the product is still in prototype and will only be available to a few companies in the near term. It sends e-mail alerts when social media activity picks up considerably. The sentiment (i.e., negative or positive) of that chatter and the influence level of the content creator are reported in the alert. Digital flow charts show what days of the week generate the most activity on Twitter, Facebook, Flickr, YouTube, and other social media sites. But interweaving social media data with reporting from other campaign channels may turn out to be Microsoft's most significant contribution to the already mature field of social media analytics. Feeds from social media sites can be connected to other business elements like customer databases, CRM centers, and sales data within an organization. The data integrate via Microsoft's enterprise platforms like Outlook and SharePoint. A handful of companies will begin testing Looking Glass in the coming weeks.
Rob Laporte

Effective Internal Linking Strategies That Prevent Duplicate Content Nonsense - Search ... - 0 views

  •  
    The funny thing about duplicate content is that you don't really have to have it for it to appear as if you do have it. But whether you have duplicate content on your site or not, to the search engines appearances are everything. The engines are pretty much just mindless bots that can't reason. They only see what is, or appears to be, there and then do what the programmers have determined through the algorithm. How you set up your internal linking structure plays a significant role in whether you set yourself up to appear as if you have duplicate content on your site or not. Some things we do without thinking, setting ourselves up for problems ahead. With a little foresight and planning, you can prevent duplicate content issues that are a result of poor internal link development. For example, we know that when we link to site.com/page1.html in one place but then link to www.site.com/page1.html in another, we are really linking to the same page. But to the search engines, the www. can make a difference. They'll often look at those two links as links to two separate pages, and then analyze each page as if it is a duplicate of the other. But there is something we can do with our internal linking to alleviate this kind of appearance of duplicate content.
    Link to the www. version only
    Tomorrow I'll provide information on how to set up your site so when someone types in yoursite.com they are automatically redirected to www.yoursite.com. It's a great permanent fix, but as a safety measure, I also recommend simply adjusting all your links internally to do the same. Example of not linking to the www. version: in the image above you can see that the domain contains the www., but when you mouse over any of the navigation links, they point to pages without the www. Even if you have a permanent redirect in place, all the links on your site should point to the proper place. At the very least you're making the search engines and visitors NOT have to redirect. At best, should y
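
    The permanent fix teased above is typically a server-level 301 redirect. A minimal sketch, assuming an Apache host with mod_rewrite enabled (replace yoursite.com with the real domain):

        # .htaccess: 301-redirect the bare domain to the www. hostname
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^yoursite\.com$ [NC]
        RewriteRule ^(.*)$ http://www.yoursite.com/$1 [R=301,L]

    Combined with internal links that already point at the www. form, the engines never see the two hostnames as separate, duplicate pages.
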
Rob Laporte

Google SEO Test - Google Prefers Valid HTML & CSS | Hobo - 0 views

  •  
    Well - the result is clear. From these 4 pages Google managed to pick the page with valid CSS and valid HTML as the preferred page to include in its index! OK, it might be a bit early to see if the four pages in the test eventually appear in Google, but on first glance it appears Google spidered the pages, examined them, applied duplicate content filters as expected, and selected one to include in search engine results. It just happens that Google seems to prefer the page with valid code as laid down by the W3C (World Wide Web Consortium). The W3C was started in 1994 to lead the Web to its full potential by developing common protocols that promote its evolution and ensure its interoperability.
    What is the W3C?
    * W3C stands for the World Wide Web Consortium
    * W3C was created in October 1994
    * W3C was created by Tim Berners-Lee, the inventor of the Web
    * W3C is organized as a Member Organization
    * W3C is working to standardize the Web
    * W3C creates and maintains WWW standards
    * W3C standards are called W3C Recommendations
    How the W3C Started
    The World Wide Web (WWW) began as a project at the European Organization for Nuclear Research (CERN), where Tim Berners-Lee developed a vision of the World Wide Web. Tim Berners-Lee - the inventor of the World Wide Web - is now the Director of the World Wide Web Consortium (W3C). W3C was created in 1994 as a collaboration between the Massachusetts Institute of Technology (MIT) and the European Organization for Nuclear Research (CERN), with support from the U.S. Defense Advanced Research Projects Agency (DARPA) and the European Commission.
    W3C Standardising the Web
    W3C is working to make the Web accessible to all users (despite differences in culture, education, ability, resources, and physical limitations). W3C also coordinates its work with many other standards organizations such as the Internet Engineering Task Force, the Wireless Application Protocols (WAP) Forum an
Rob Laporte

Article Pagination: Actions that Improved Google Search Traffic Google SEO News and Dis... - 0 views

  •  
    The value of "long-form journalism" has been tested on websites such as Salon and shown to be quite viable. It also attracts a better caliber of writer. With this in mind, over a year ago I was working with an online magazine that was already publishing longer, in-depth articles, in the area of many thousands of words. The SEO challenge we had was that page 2 and beyond for most articles were not getting any search traffic - even though there was plenty of awesome content there. The approach we decided on is labor intensive for the content creators. But after some education, the writers were all interested in trying to increase the audience size. Here are the steps we took: Page 1 naturally enough uses the overall title of the article for both its title tag and header, and has a unique meta-description. Every internal page then has its own unique title and header tag . These are based on the first SUB-head for that section of the article. This means more keyword research and writing of subheads than would normally be the case. If the article is considered as a whole, then an tag would seem more accurate semantically. But Google looks at the semantic structure one URL at a time, not for the overall multi-URL article. Most pages also include internal subheads, and these are style as On each internal page, there is also a "pre-head" that does use the article title from page 1 in a small font. This pre-head does not use a header tag of any kind, just a CSS style. This pre-head article title is at the top as a navigation cue for the user. An additional navigation cue is that the unique page titles each begin with the numeral "2." or "3." Each internal page also has a unique meta description, one that summarizes that page specifically, rather than summarizing the overall article. Every page of the article links to every other page at the top and the bottom. None of this anemic "Back | Next" junk. There's a complete page choice shown on everywhe
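
    A minimal sketch of what the head of an internal page could look like under this scheme (the titles, URLs, and description text are invented, and the post does not specify which header level is used):

        <!-- Page 2 of a multi-page article: its own title and meta description, numbered as a navigation cue -->
        <head>
          <title>2. Choosing a Keyword Research Tool | Example Magazine</title>
          <meta name="description" content="What page 2 specifically covers, not a summary of the whole article.">
        </head>
        <body>
          <!-- "Pre-head": the overall article title, small font, no header tag -->
          <p class="prehead">The Complete Guide to Keyword Research</p>
          <!-- This page's own header, taken from its first subhead -->
          <h1>Choosing a Keyword Research Tool</h1>
          ...
        </body>
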
jack_fox

Advanced Technical SEO: How social image sharing works and how to optimize your og:imag... - 0 views

  • It’s impossible to specify different images/formats/files for different networks, other than for Facebook and Twitter (the Facebook image is used, by default, for all other networks/systems). This is a limitation of how these platforms work. The same goes for titles and descriptions.
  • The image size and cropping won’t always be perfect across different platforms, as the way in which they work is inconsistent.
  • Specifically, your images should look great on ‘broadcast’ platforms like Facebook and Twitter, but might sometimes crop awkwardly on platforms designed for 1:1 or small group conversations, like WhatsApp or Telegram.
  • For best results, you should manually specify og:image tags for each post, through the plugin. You should ensure that your primary og:image is between 1200x800px and 2000x1600px, and is less than 2mb in size.
  • As an open project, the Open Graph is constantly changing and improving
  • these tags and approaches sometimes conflict with or override each other. Twitter’s twitter:image property, for example, overrides an og:image value for images shared via Twitter, when both sets of tags are on the same page.
  • the Open Graph specification allows us to provide multiple og:image values. This, in theory, allows the platform to make the best decision about which size to use and allows people who are sharing some choice over which image they pick. How different platforms interpret these values, however, varies considerably.
  • Because each platform maintains its own rules and documentation on how they treat og:image tags, there are often gaps in our knowledge. Specific restrictions, edge cases, and in particular, information on which rules override other rules, are rarely well-documented
  • we’re choosing to optimize the first image in the og:image set for large, high-resolution sharing – the kind which Facebook supports and requires, but which causes issues with networks that expect a smaller image (like Instagram or Telegram).
  • In the context of a newsfeed, like on Facebook or Twitter, the quality of the image is much more important – you’re scrolling through lots of noise, you’re less engaged, and a better image means an increased chance of a click/share/like.
  • When the ‘full’ size image is over 2MB in file size, and/or over 2000 pixels on either axis, we’ll try to fall back to a smaller standard WordPress image size (or to scan the post content for an alternative).
  • If we can’t find a suitable smaller image, we’ll omit the og:image tag, in the hopes that the platform will select an appropriate alternative. Note that this may result in the image not appearing in some sharing contexts.
  • If the ratio exceeds 3:1 we’ll present a warning (this is the maximum ratio for many networks).
  • For most normal use-cases, we’d suggest that you manually set og:image values on your posts via the Yoast SEO plugin, and ensure that their dimensions are between 1200x800px and 2000x1600px (and that they’re less than 2mb in size)
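    As a sketch of what these recommendations translate to in markup (the image URLs and dimensions below are placeholders, not values from the article), the relevant tags in a page's head might look like this:

        <head>
          <!-- Primary Open Graph image: between 1200x800px and 2000x1600px, under 2MB -->
          <meta property="og:image" content="https://example.com/images/share-large.jpg">
          <meta property="og:image:width" content="1600">
          <meta property="og:image:height" content="1200">
          <!-- Optional additional og:image value; platforms vary in how they choose among multiple images -->
          <meta property="og:image" content="https://example.com/images/share-small.jpg">
          <!-- twitter:image overrides og:image for shares on Twitter when both are present -->
          <meta name="twitter:card" content="summary_large_image">
          <meta name="twitter:image" content="https://example.com/images/share-twitter.jpg">
        </head>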
Rob Laporte

The Real Impact of Mobile-First Indexing & The Importance of Fraggles - Moz - 0 views

  • We have also recently discovered that Google has begun to index URLs with a # jump-link, after years of not doing so, and is reporting on them separately from the primary URL in Search Console. As our data shows, they aren't getting a lot of clicks, but they are getting impressions. This is likely because of the low average position. (A minimal markup example of such a jump-link target appears after this list.)
  • Start to think of GMB as a social network or newsletter — any assets that are shared on Facebook or Twitter can also be shared on Google Posts, or at least uploaded to the GMB account.
  • You should also investigate the current Knowledge Graph entries that are related to your industry, and work to become associated with recognized companies or entities in that industry. This could be from links or citations on the entity websites, but it can also include being linked by third-party lists that give industry-specific advice and recommendations, such as being listed among the top competitors in your industry ("Best Plumbers in Denver," "Best Shoe Deals on the Web," or "Top 15 Best Reality TV Shows"). Links from these posts also help but are not required — especially if you can get your company name on enough lists with the other top players. Verify that any links or citations from authoritative third-party sites like Wikipedia, Better Business Bureau, industry directories, and lists are all pointing to live, active, relevant pages on the site, and not going through a 301 redirect. While this is just speculation and not a proven SEO strategy, you might also want to make sure that your domain is correctly classified in Google’s records by checking the industries that it is associated with. You can do so in Google’s MarketFinder tool. Make updates or recommend new categories as necessary. Then, look into the filters and relationships that are given as part of Knowledge Graph entries and make sure you are using the topic and filter words as keywords on your site.
  • The biggest problem for SEOs is the missing organic traffic, but it is also the fact that current methods of tracking organic results generally don’t show whether things like Knowledge Graph, Featured Snippets, PAA, Found on the Web, or other types of results are appearing at the top of the query or somewhere above your organic result. Position one in organic results is not what it used to be, nor is anything below it, so you can’t expect those rankings to drive the same traffic. If Google is going to be lifting and representing everyone’s content, the traffic will never arrive at the site, and SEOs won’t know if their efforts are still returning the same monetary value. This problem is especially acute for publishers, who have only been able to sell advertising on their websites based on the expected traffic that the website could drive. The other thing to remember is that results differ — especially on mobile, which varies from device to device (generally based on screen size) but can also vary based on the phone’s OS. They can also change significantly based on the location or the language settings of the phone, and they definitely do not always match desktop results for the same query. Most SEOs don't know much about the reality of their mobile search results because most SEO reporting tools still focus heavily on desktop results, even though Google has switched to Mobile-First. In addition, SEO tools generally only report on rankings from one location — the location of their servers — rather than being able to test from different locations.
  • The only thing that good SEOs can do to address this problem is to use tools like the MobileMoxie SERP Test to check what rankings look like on top keywords from all the locations where their users may be searching. While the free tool only provides results for one location at a time, subscribers can test search results in multiple locations, based on a service-area radius or an uploaded CSV of addresses. The tool has integrations with Google Sheets and a connector for Data Studio to help with SEO reporting, and APIs are also available for deeper integration in content-editing tools, dashboards, and other SEO tools.
  • Fraggles and Fraggled indexing re-frame the switch to Mobile-First Indexing, which means that SEOs and SEO tool companies need to start thinking mobile-first — i.e., about the portability of their information. While it is likely that pages and domains still carry strong ranking signals, the changes in the SERP all seem to focus less on entire pages and more on pieces of pages, similar to the ones surfaced in Featured Snippets, PAAs, and some Related Searches. If Google focuses more on windowing content and being an "answer engine" instead of a "search engine," then this fits well with their stated identity and their desire to build a more efficient, sustainable, international engine.
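    As a minimal illustration (the URL and id below are hypothetical), a jump-link of the kind Google has begun indexing is simply a fragment URL that points at an element with a matching id on the page:

        <!-- Indexed/reported URL: https://example.com/guide#installation -->
        <h2 id="installation">Installation</h2>
        <p>Step-by-step installation instructions go here.</p>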
Rob Laporte

Google Says Being Different Helps Improve Rankings - Search Engine Journal - 0 views

  • So, in general, a site: query is not representative of all of the pages that we have indexed. It’s a good way to get a rough view of what we have indexed, but it’s not the comprehensive list; it’s not meant to be like that. For more information on how or what we have indexed, I would use Search Console and the Index Coverage report there. That gives you a better look at the pages that are actually indexed.
  • With regards to losing traffic, I realize that’s sometimes hard. In general, I think with a website that’s focused on ringtones, it’ll be a little bit tricky because our algorithms really do try to look out for unique, compelling, high quality content. And if your whole website is built up on essentially providing ringtones that are the same as everywhere else then I don’t know if our algorithms would say this is a really important website that we need to focus on and highlight more in search. So with that in mind, if you’re focused on kind of this small amount of content that is the same as everyone else then I would try to find ways to significantly differentiate yourselves to really make it clear that what you have on your website is significantly different than all of those other millions of ringtone websites that have kind of the same content. Maybe there is a way to do that with regards to the content that you provide. Maybe there is a way to do that with the functionality that you provide. But you really need to make sure that what you have on your site is significantly different enough that our algorithms will say well this is what we need to index instead of all of these others that just have a list of ringtones on the website. So that’s probably not going to be that easy to make that kind of a shift but that’s generally the direction I would hit. And that’s the same recommendation I would have for any kind of website that offers essentially the same thing as lots of other web sites do. You really need to make sure that what you’re providing is unique and compelling and high quality so that our systems and users in general will say, I want to go to this particular website because they offer me something that is unique on the web and I don’t just want to go to any random other website.