DISC Inc: Group items tagged IT

jack_fox

The Ultimate Web Server Security Guide @ MyThemeShop - 0 views

  • They could insert links into the site to boost their SEO rankings. Hackers can make a killing selling links from exploited sites. Alternatively, a hacker could deface the site and demand money to restore it (ransom). They could even place ads on the site and use the traffic to make money. In most cases, an attacker will also install backdoors into the server. These are deliberate security holes that allow them to come back and exploit the site in the future – even if the insecure plugin has been replaced.
  • Unfortunately, under WordPress, every plugin and theme has the ability to alter anything on the site. They can even be exploited to infect other apps and sites hosted on the same machine.
  • Theme developers are often relatively inexperienced coders. Usually, they’re professional graphic artists who have taught themselves a little PHP on the side. Plugins are another popular line of attack – they account for 22% of successful hacks. Put together, themes and plugins are a major source of security trouble.
  • ...102 more annotations...
  • Each person who uses your system should only have the privileges they need to perform their tasks.
  • Don’t depend on a single security measure to keep your server safe. You need multiple rings of defense.
  • Security exploits exist at all levels of the technology stack, from the hardware up. WP White Security revealed that 41% of WordPress sites are hacked through a weakness in the web host.
  • While it’s important to use a strong password, password cracking is not a primary focus for hackers.
  • the more software you have installed on your machine, the easier it is to hack – even if you aren’t using the programs! Clearly, programs that are designed to destroy your system are dangerous. But even innocent software can be used in an attack.
  • There are 3 ways to reduce the attack surface: 1. Run fewer processes 2. Uninstall programs you don’t need 3. Build a system from scratch that only has the processes you need
  • A really good authentication system uses multiple tests. Someone could steal or guess your password. They could grab your laptop with its cryptographic keys.
  • If you want to run multiple processes at the same time, you need some way of managing them. This is basically what a kernel is. It does more than that – it handles all of the complex details of the computer hardware, too. And it runs the computer’s networking capabilities
  • programs exist as files when they are not running in memory
  • SELinux’s default response is to deny any request.
  • SELinux is extremely comprehensive, but this power comes at a price. It’s difficult to learn, complex to set up, and time-consuming to maintain.
  • AppArmor is an example of a MAC tool, although it’s nowhere near as comprehensive as SELinux. It applies rules to programs to limit what they can do.
  • AppArmor is relatively easy to set up, but it does require you to configure each application and program one by one. This puts the onus for security in the hands of the user or sysadmin. Often, when new apps are added, users forget to configure AppArmor. Or they do a horrible job and lock themselves out, so their only option is to disable the profile. That said, several distributions have adopted AppArmor.
  • Generic profiles shipped by repo teams are designed to cover a wide range of different use cases, so they tend to be fairly loose. Your specific use cases are usually more specific. In this case, it pays to fine-tune the settings, making them more restrictive.
  • GRSecurity is a suite of security enhancements
  • In the future, this could become a viable option. For now, we’ll use Ubuntu and AppArmor.
  • Apache is a user-facing service – it’s how your users interact with your website. It’s important to control this interaction too.
  • If your Apache configuration is bad, these files can be viewed as plain text. All of your code will be visible for anyone to see – this potentially includes your database credentials, cryptographic keys, and salts.
  • You can configure Apache to refuse any requests for these essential directories using .htaccess files. These are folder-level configuration files that Apache reads before it replies to a request.
  • The primary use for .htaccess files is to control access (a minimal example appears after this list).
  • If an attacker knows your WordPress cryptographic salts, they can use fake cookies to trick WordPress into thinking they have logged on already.
  • If the hacker has physical access to the computer, they have many options at their disposal. They can type commands through the keyboard, or insert a disk or USB stick into the machine and launch an attack that way.
  • When it comes to network-based attacks, attackers have to reach through one of the machine’s network ports.
  • For an attacker to exploit a system, they have to communicate to a process that’s listening on a port. Otherwise, they’d simply be sending messages that are ignored. This is why you should only run processes that you need for your site to run. Anything else is a security risk.
  • Often, ports are occupied by processes that provide no real valuable service to the machine’s legitimate users. This tends to happen when you install a large distribution designed for multiple uses. Large distros include software that is useless to you in terms of running a website. So the best strategy is to start with a very lightweight distro and add the components you need.
  • If you see any unnecessary processes, you can shut them down manually. Better yet, if the process is completely unnecessary, you can remove it from your system (commands for auditing and removing listeners are sketched after this list).
  • Firewalls are quite similar to access control within the computer. They operate on a network level, and you can use them to enforce security policies. A firewall can prevent processes from broadcasting information from a port. It can stop outside users from sending data to a port. And it can enforce more complex rules.
  • Simply installing and running a firewall does not make your host machine secure – it’s just one layer in the security cake. But it’s a vital and powerful one (a basic default-deny ruleset is sketched after this list).
  • First of all, we need to configure our software to resist common attacks. But that can only protect us from attacks we know about. Access control software, such as AppArmor, can drastically limit the damage caused by unauthorized access. But you still need to know an attack is in progress.
  • This is where Network Intrusion Detection Software (NIDS) is essential. It scans the incoming network traffic, looking for unusual patterns or signs of a known attack. If it sees anything suspicious, it logs an alert.
  • It’s up to you to review these logs and act on them.
  • If it’s a false alarm, you should tune your NIDS software to ignore it. If it’s an ineffective attack, you should review your security and block the attacker through the firewall.
  • That’s why it’s essential to have an automated backup system. Finally, you need to understand how the attack succeeded, so you can prevent it from recurring. You may have to change some settings on your Firewall, tighten your access rules, adjust your Apache configuration, and change settings in your wp-config file. None of this would be possible without detailed logs describing the attack.
  • Every web server has a breaking point and dedicated DOS attackers are willing to increase the load until your server buckles. Good firewalls offer some level of protection against naive DOS attacks
  • a tiny number of sites (less than 1%) are hacked through the WordPress core files
  • Major DNS attacks have taken down some of the biggest sites in the world – including eBay and PayPal. Large hosting companies like HostGator and Bluehost have been attacked. It’s a serious risk!
  • Right now, due to the way the web currently works, it’s impossible to download a web page without the IP address of a server. In the future, technologies like IPFS and MaidSafe could change that.
  • So there are 2 benefits to using a CDN. The first is that your content gets to your readers fast. The second benefit is server anonymity – nobody knows your real IP address – including the psychos. This makes it pretty impossible to attack your server – nobody can attack a server without an IP address.
  • When CDNs discover a DDOS attack, they have their own ways to deal with it. They often display a very lightweight “are you human?” message with a captcha. This tactic reduces the bandwidth costs and screens out the automated attacks.
  • If any of your DNS records point to your actual server, then it’s easy to find it and attack it. This includes A records (address records) and MX records (mail exchange). You should also use a separate mail server machine to send your emails. Otherwise, your email headers will expose your server’s real IP address.
  • If your hosting company refuses to give you a new IP address, it may be time to find a new service provider.
  • WordPress uses hashing to store passwords in the database. It doesn’t store the actual password – instead, it stores a hashed version. If someone steals your database tables, they won’t have the actual passwords.
  • If you used a simple hash function, a hacker could gain privileged access to your app in a short period of time.
  • The salt strings are stored in your site’s wp-config.php file (a quick way to generate fresh ones is sketched after this list).
  • Salts dramatically increase the time it would take to get a password out of a hash code – instead of taking a few weeks, it would take millions of years
  • You keep the other key (the decryption key) to yourself. If anyone stole it, they could decode your private messages! These 2-key cryptographic functions do exist. They are the basis of TLS (https) and SSH.
  • the most secure systems tend to be the simplest. The only absolutely secure machine would be one that was switched off.
  • For WordPress sites, you also need PHP and a database.
  • A VM is an emulated computer system running inside a real computer (the host). It contains its own operating system and resources, such as storage, and memory. The VM could run a completely different operating system from the host system – you could run OSX in a VM hosted on your Windows machine
  • This isolation offers a degree of protection. Let’s imagine your VM gets infected with a particularly nasty virus – the VM’s file system could be completely destroyed, or the data could be hopelessly corrupted. But the damage is limited to the VM itself. The host environment would remain safe.
  • This is how shared hosting and virtual private servers (VPSes) work today. Each customer has access to their own self-contained environment, within a virtual machine.
  • VMs are not just for hosting companies. If you’re hosting multiple sites on a dedicated server or a VPS, VMs can help to make your server more secure. Each site can live inside its own VM. That way, if one server is hacked, the rest of your sites are safe.
  • Even with all these considerations, the benefits of VMs outweigh their drawbacks. But performance is vital on the web.
  • Containers (like Docker) are very similar to VMs.
  • Because we’ve cut the hypervisor out of the loop, applications run much faster – almost as fast as processes in the host environment. Keeping each container separate does involve some computation by the container software. But it’s much lighter than the work required by a hypervisor!
  • Docker Cloud is a web-based service that automates the task for you. It integrates smoothly with the most popular cloud hosting platforms (such as Amazon Web Services, or Digital Ocean).
  • With containers, you can guarantee that the developer’s environment is exactly the same as the live server. Before the developer writes a single line of code, they can download the container to their computer. If the code works on their PC, it will work on the live server. This is a huge benefit of using containers, and it’s a major reason for their popularity.
  • A complete stack of these layers is called an “image”
  • The core of Docker is the Docker Engine – which lives inside a daemon – or long-running process
  • another great resource – the Docker Hub. The hub is an online directory of community-made images you can download and use in your own projects. These include Linux distributions, utilities, and complete applications.
  • Docker has established a relationship with the teams behind popular open source projects (including WordPress) – these partners have built official images that you can download and use as-is.
  • when you finish developing your code, you should wrap it up inside a complete container image. The goal is to put all the code that runs your site inside a container and store the volatile data in a volume.
  • Although Docker can help to make your site more secure, there are a few major issues you need to understand: the Docker daemon runs as a superuser; it’s possible to load the entire filesystem into a container; and it’s possible to pass a reference to the Docker daemon into a container.
  • The solution to this issue is to use a MAC solution like SELinux, GRSecurity or AppArmor.
  • Never let anyone trick you into running a strange docker command.
  • only download and use Docker images from a trustworthy source. Official images for popular projects are security-audited by the Docker team. Community images are not.
  • there are the core WordPress files. These interact with the web server through the PHP runtime. WordPress also relies on the file system and a database server.
  • A service is some software component that listens for requests (over a protocol) and does something when it receives those requests.
  • Using Docker, you could install WordPress, Apache, and PHP in one container, and run MySQL from another. These containers could run on the same physical machine, or on different ones
  • The database service container can be configured to only accept connections that originate from the web container. This immediately removes the threat of external attacks against your database server
  • This gives you the perfect opportunity to remove high-risk software from your host machine, including: Language Runtimes and interpreters, such as PHP, Ruby, Python, etc. Web servers Databases Mail Servers
  • If a new version of MySQL is released, you can update the database container without touching the web container. Likewise, if PHP or Apache are updated, you can update the web container and leave the database container alone.
  • Because Docker makes it easy to connect these containers together, there’s no reason to lump all your software inside a single container. In fact, it’s a bad practice – it increases the security risk for any single container, and it makes it harder to manage them.
  • If your site is already live on an existing server, the best approach is to set up a new host machine and then migrate over to it. Here are the steps you need to take:
  • With a minimal Ubuntu installation, you have a fairly bare-bones server. You also have the benefit of a huge repository of software you can install if you want.
  • If access control is like a lock protecting a building, intrusion detection is the security alarm that rings after someone breaks in.
  • Logging on to your host with a superuser account is bad practice. It’s easy to accidentally break something (creating an unprivileged admin user and disabling root SSH logins is sketched after this list).
  • Fail2ban blocks SSH users who fail the login process multiple times. You can also set it up to detect and block hack attempts over HTTP – this will catch hackers who attempt to probe your site for weaknesses (a minimal jail configuration is sketched after this list).
  • With multiple WordPress sites on your machine, you have 2 choices. You could create a new database container for each, or you could reuse the same container between them. Sharing the DB container is a little riskier, as a hacker could, theoretically, ruin all your sites with one attack. You can minimize that risk by: using a custom admin user and password for your database (don’t use the default username of ‘root’); ensuring the DB container is not accessible over the internet (hide it away inside a Docker network, as sketched after this list); and creating separate databases and users for each WordPress site, where each user only has permissions for their specific database.
  • What are the benefits of using a single database container? It’s easier to configure and scale. It’s easier to backup and recover your data. It’s a little lighter on resources.
  • you could also add a caching container, like Varnish. Varnish caches your content so it can serve pages quickly – much faster than WordPress can
  • Docker has the ability to limit how much processor time and memory each container gets. This protects you against resource-exhaustion DoS attacks (example flags are sketched after this list).
  • A containerized process still has some of the abilities of root, making it more powerful than a regular user. But it’s not as bad as full-on root privileges. With AppArmor, you can tighten the security further, preventing the process from accessing any parts of the system that do not relate to serving your website.
  • Docker Hub works like GitHub – you can upload and download images for free. The downside is that there’s no security auditing. So it’s easy to download a trojan horse inside a container.
  • Official images (such as WordPress and Apache) are audited by the Docker team. These are safe. Community images (which have names like user/myapp) are not audited.
  • a kernel exploit executed inside a container will affect the entire system. The only way to protect against kernel exploits is to regularly update the host system
  • Containers run in isolation from the rest of the system. That does not mean you can neglect security – your website lives inside these containers! Even if a hacker cannot access the full system from a container, they can still damage the container’s contents.
  • Under Ubuntu, AppArmor already protects you – to a degree. The Docker daemon has an AppArmor profile, and each container runs under a default AppArmor profile. The default profile prevents an app from breaking out of the container, and restricts it from doing things that would harm the system as a whole. However, the default profile offers no specific protection against WordPress-specific attacks. We can fix this by creating a custom profile for your WordPress container (a rough profile sketch appears after this list).
  • The net effect is that it’s impossible to install malware, themes or plugins through the web interface. We’ve already covered this to some degree with the .htaccess rules and directory permissions. Now we’re enforcing it through the Linux kernel.
  • There are versions of Docker for Mac and PC, so you’ll be able to run your site from your home machine. If the code works on your PC, it will also work on the server.
  • Tripwire tends to complain about the entries in the /proc filespace, which are auto-generated by the Linux kernel. These files contain information about running processes, and they tend to change rapidly while Linux runs your system. We don’t want to ignore the directory entirely, as it provides useful signs that an attack is in progress. So we’re going to have to update the policy to focus on the files we are interested in.
  • Now we should install an e-mail notification utility – to warn us if anything changes on the system. This will enable us to respond quickly if our system is compromised (depending on how often you check your emails).
  • Rootkits are malicious code that hackers install onto your machine. When they manage to get one on your server, it gives them elevated access to your system
  • Tripwire is configured to search in key areas. It’s good at detecting newly installed software, malicious sockets, and other signs of a compromised system. RKHunter looks in less obvious places, and it checks the contents of files to see if they contain known malicious code. RKHunter is supported by a community of security experts who keep it updated with known malware signatures – just like antivirus software for PCs (install and scan commands are sketched after this list).
  • If your hosting company offers the option, this would be a good point to make an image of your server. Most cloud hosting companies offer tools to do this.
  • With an image, it’s easy to launch new servers or recover the old one if things go horribly wrong.
  • We’ve hidden our server from the world while making it easy to read our content We’ve built a firewall to block malicious traffic We’ve trapped our web server inside a container where it can’t do any harm We’ve strengthened Linux’s access control model to prevent processes from going rogue We’ve added an intrusion detection system to identify corrupted files and processes We’ve added a rootkit scanner We’ve strengthened our WordPress installation with 2-factor authentication We’ve disabled the ability for any malicious user to install poisoned themes or plugins
  • Make a routine of checking the logs (or emails if you configured email reporting). It’s vital to act quickly if you see any warnings. If they’re false warnings, edit the configuration. Don’t get into a habit of ignoring the reports.
  • Virtually everything that happens on a Linux machine is logged.
  • You have to make a habit of checking for new exploits and learn how to protect yourself against them. Regularly check for security patches and issues in the core WordPress app: WordPress Security Notices Also, check regularly on the forums or mailing lists for the plugins and themes you use on your site.
  • There is no network-level intrusion detection service yet – you can fix that by installing Snort or PSAD.
  • The only way to guarantee your safety is to constantly update your security tactics and never get complacent.
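
The hardening steps in the excerpts above are described in prose only. The sketches below illustrate a few of them as shell commands for a Ubuntu host; they are hedged, minimal examples rather than the guide's own scripts, and usernames, paths, and container names are placeholders to adapt. First, the advice about never working as the superuser: create an unprivileged admin account and disable direct root logins over SSH (the account name deploy is just an example).

# Create an unprivileged admin account and give it sudo rights.
sudo adduser deploy
sudo usermod -aG sudo deploy

# Disable direct root logins over SSH, then reload the SSH daemon.
sudo sed -i 's/^#\?PermitRootLogin.*/PermitRootLogin no/' /etc/ssh/sshd_config
sudo systemctl reload ssh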
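
For the firewall layer, a default-deny ruleset with ufw (Ubuntu's front end to iptables) is one plausible starting point; open only the ports your site actually needs.

# Deny all inbound traffic by default, allow outbound.
sudo ufw default deny incoming
sudo ufw default allow outgoing

# Open only the ports the site needs.
sudo ufw allow 22/tcp    # SSH (consider restricting by source IP)
sudo ufw allow 80/tcp    # HTTP
sudo ufw allow 443/tcp   # HTTPS

sudo ufw enable
sudo ufw status verbose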
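
To shrink the attack surface, audit what is actually listening and remove anything you don't need. The mail daemon below (exim4) is only an example of the kind of package that often ships unused.

# Show every process listening on a TCP or UDP port.
sudo ss -tulpn

# Stop and disable a service you don't need...
sudo systemctl disable --now exim4

# ...or, better, remove the package entirely so it can't be exploited at all.
sudo apt purge exim4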
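
The .htaccess access-control idea can be sketched like this (Apache 2.4 "Require" syntax; it only takes effect if your virtual host's AllowOverride setting permits these directives, and the web root path is a placeholder).

# Deny direct requests for wp-config.php and for .htaccess itself.
sudo tee -a /var/www/html/.htaccess > /dev/null <<'EOF'
<Files wp-config.php>
    Require all denied
</Files>
<Files ".htaccess">
    Require all denied
</Files>
EOF

# Block PHP execution inside the uploads directory, a common drop zone for backdoors.
sudo tee /var/www/html/wp-content/uploads/.htaccess > /dev/null <<'EOF'
<FilesMatch "\.php$">
    Require all denied
</FilesMatch>
EOF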
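
For the authentication keys and salts stored in wp-config.php, the official WordPress generator can produce a fresh set; paste its output over the existing define() lines. Note that rotating salts logs all users out of their current sessions.

# Fetch a fresh block of AUTH_KEY/..._SALT define() lines from WordPress.org.
curl -s https://api.wordpress.org/secret-key/1.1/salt/

# Or generate raw random material locally if you prefer not to call out.
openssl rand -base64 48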
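
The "web container plus separate database container" layout described above might look roughly like this with the official images. The MySQL container publishes no ports, so it is reachable only from the private Docker network; network, volume, and password values are placeholders.

# Private bridge network shared by the two containers.
docker network create wpnet

# Database container: no -p flag, so MySQL is only reachable from inside wpnet.
docker run -d --name db --network wpnet \
  -e MYSQL_DATABASE=wordpress \
  -e MYSQL_USER=wp_user \
  -e MYSQL_PASSWORD=change-me \
  -e MYSQL_RANDOM_ROOT_PASSWORD=yes \
  -v db_data:/var/lib/mysql \
  mysql:8.0

# Web container: publishes port 80 and reaches the database by its container name.
docker run -d --name web --network wpnet -p 80:80 \
  -e WORDPRESS_DB_HOST=db \
  -e WORDPRESS_DB_NAME=wordpress \
  -e WORDPRESS_DB_USER=wp_user \
  -e WORDPRESS_DB_PASSWORD=change-me \
  -v wp_data:/var/www/html \
  wordpress:latest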
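
Docker's CPU and memory caps, mentioned above as protection against resource-exhaustion attacks, are ordinary run-time flags; the numbers here are arbitrary examples.

# Cap the web container at one CPU core and 512 MB of RAM.
docker run -d --name web --network wpnet -p 80:80 \
  --cpus="1.0" --memory="512m" --memory-swap="512m" \
  wordpress:latest

# The same limits can be applied to a container that already exists.
docker update --cpus="1.0" --memory="512m" --memory-swap="512m" web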
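
A custom AppArmor profile for the WordPress container, as the excerpt suggests, could be sketched along these lines: read-only code, writable uploads, no writes to the plugin or theme directories. This is a rough illustration of the idea, not a complete or tested profile; check the AppArmor and Docker documentation before relying on it, and treat the paths and profile name as placeholders.

# Sketch of a container profile.
sudo tee /etc/apparmor.d/containers/docker-wordpress > /dev/null <<'EOF'
#include <tunables/global>

profile docker-wordpress flags=(attach_disconnected,mediate_deleted) {
  #include <abstractions/base>

  network inet tcp,

  # Read-only access to the WordPress code base.
  /var/www/html/** r,

  # Uploads must stay writable.
  /var/www/html/wp-content/uploads/** rw,

  # Deny writes to plugins and themes so nothing can be installed via the web.
  deny /var/www/html/wp-content/plugins/** w,
  deny /var/www/html/wp-content/themes/** w,
}
EOF

# Load the profile into the kernel and attach it to the container at run time.
sudo apparmor_parser -r -W /etc/apparmor.d/containers/docker-wordpress
docker run -d --name web --security-opt apparmor=docker-wordpress -p 80:80 wordpress:latest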
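
Fail2ban's SSH jail needs very little configuration; overrides belong in jail.local so package upgrades don't clobber them. The thresholds below are examples.

sudo apt install fail2ban

# Ban an IP for an hour after 5 failed SSH logins within 10 minutes.
sudo tee /etc/fail2ban/jail.local > /dev/null <<'EOF'
[DEFAULT]
bantime  = 3600
findtime = 600
maxretry = 5

[sshd]
enabled = true
EOF

sudo systemctl restart fail2ban
sudo fail2ban-client status sshd   # shows currently banned IPs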
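
And for the rootkit scanner, the basic RKHunter workflow is: update signatures, record a baseline of file properties, then scan and review the log.

sudo apt install rkhunter

# Refresh the known-malware data files and record a baseline of file properties.
sudo rkhunter --update
sudo rkhunter --propupd

# Run a scan without pausing for keypresses; warnings land in /var/log/rkhunter.log.
sudo rkhunter --check --skip-keypress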
Rob Laporte

Honey, Social Media Shrunk Big Business - ClickZ - 0 views

  •  
    Marketing Has Become Personal (Again)

    When the Big Guys want to look like Small Players, they make deep investments, mostly in social media. If you look at Coca-Cola's Facebook Page, for example, it doesn't look remarkably different from any other Facebook Page, even those created by tiny companies. On that Facebook Page, Coca-Cola -- one of the largest companies in the world and possibly the most recognized brand on the globe -- is presenting itself as not just small but also personal and approachable. In fact, if you are a fan of its page, you can write on its wall. Coke has videos of its fans and simple pictures of people enjoying a Coke. These aren't professional, glossy images but the sort of pictures we've come to expect online: a bit grainy, not well lit, and very real looking.

    The rule, and indeed the opportunity, of the new medium is to make your marketing personal. You need a bit of guts to do it. We all have a natural tendency to speak and act in ways we feel are professional when doing business, and this is true online as well. But social media is the single most important media space for brands right now, and its nature is different. If you are a big brand, you don't need to pretend you are small, but you do need to find ways to become approachable, engaging, and personal in the way that small brands do.

    Let's Get Small

    There are a few rules to follow when you try to get more personal in your marketing. Use these methods and you can start putting some real faces next to the brands consumers think they know:

    * Start with the current fans. This is really the great story of the Coca-Cola page. It was started by two guys who simply loved Coke, not by the company itself. They amassed a following of brand loyalists, totally on their own. The company came to these guys and asked for the opportunity to help them out and keep them involved. Exactly what you would do if you were an actual human being, not a great big company more concerned with protectin
Rob Laporte

Wake Up SEOs, the New Google is Here | SEOmoz - 0 views

  •  
    Rel="author" and rel="publisher" are the solution Google is adopting in order to better control, among other things, the spam pollution of the SERPs. If you are a blogger, you are incentivized to mark your content with Author and link it to your G+ Profile, and as a site, you are incentivized to create your G+ Business page and to promote it with a badge on your site that has rel="publisher" in its code.

    Trusted seeds are no longer only sites, but can also be persons (i.e., Rand or Danny Sullivan) or social facets of an entity, so the closer I am in the Social Graph to those persons/entities, the more trusted I am in Google's eyes. As we can see, Google is not trying to rely only on the link graph, as it is quite easy to game, but it is not simply adding social signals to the link graph either, because they too can be gamed. What Google is doing is creating and refining a new graph in which the link graph, social graph, and trust graph cooperate, and which is possibly harder to game. It can still be gamed, but - hopefully - only with so much effort that it is no longer viable as a practice.

    Wake up SEOs, the new Google is here

    As a conclusion, let me borrow what Larry Page wrote on Google+ (bold is mine): "Our ultimate ambition is to transform the overall Google experience […] because we understand what you want and can deliver it instantly. This means baking identity and sharing into all of our products so that we build a real relationship with our users. Sharing on the web will be like sharing in real life across all your stuff. You'll have better, more relevant search results and ads. Think about it this way … last quarter, we've shipped the +, and now we're going to ship the Google part."

    I think that says it all, and what we have lived through this past year is explained clearly by Larry Page's words. What can we do as SEOs? Evolve, because SEO is not dying, but SEOs can if they don't assume that winter - oops - the
Rob Laporte

Capital Letters (Pascal Casing) in URLs - Advantages and Disadvantages - 0 views

  •  
    I noticed CNN uses some capital letters and sometimes whole words in capitals in their URLs. Here is what I thought of the advantages and disadvantages, and please feel free to share some more ideas.

    The advantages:
    # You make the word stand out
    # Some search engines might put more emphasis on those words

    The disadvantages:
    # It makes it more difficult for users to type in the URL or suggest the link via phone.
    # It may confuse users, making them think URLs, like domains, are not case sensitive at all.

    webing #:3652026 6:04 pm on May 16, 2008 (utc 0)
    I thought URLs were not case sensitive? I just tried my domain name in capital letters and it redirected me to the non-capital letters, so I do think domains are not case sensitive. Sorry if I'm completely wrong ^^.

    pageoneresults #:3652029 6:10 pm on May 16, 2008 (utc 0)
    You know, it's funny you should start this topic. I was just getting ready to do a full-blown topic on Pascal Casing and "visual" marketing advantages. I started a topic back in 2007 September here... Domain Names and Pascal Casing http://www.webmasterworld.com/domain_names/3457393.htm

    No, domain names are not case sensitive. These past 12 months I've been on a mission and changing everything to Pascal Casing when it comes to domain names. It's much easier to read and separate words and it just looks nicer. I've been experimenting with this and it works. Google AdWords is a great place to test the effectiveness of Pascal Casing. What's really cool is that you can literally change your hard-coded references to Pascal Casing and when you hover over them, they show lower case. It's a browser feature, I guess. I never gave it much thought until this past year when I started my changes.

    I've also gone one step further and use Pascal Casing in full addresses. We have a rewrite in place that forces lower case so we can do pretty much whatever we want with the URI and file naming. [edited by: pageoneresults at 6:11 pm (utc) on May 16, 2008] ted
Rob Laporte

E-Mail: Evaluating Dedicated vs. Shared IP Addresses - ClickZ - 0 views

  •  
    The downside to having a dedicated IP address is the cost. Most ESPs charge an initial set-up fee of $500 to $1,000 for a dedicated IP address; there's also often a $250 monthly fee for maintaining it. This directly impacts your e-mail ROI (define). For large-quantity senders the additional cost is minimal, but for those sending small volumes of e-mail it can make a dent in your profit margin.

    A shared IP address is just what it sounds like -- you're sharing the IP address with other organizations. Every company sending from the IP address has the potential to impact, positively or negatively, its reputation. If your IP address neighbors are good guys, the reputation shouldn't be damaged. But if one of them (or if you) does something that raises a red flag, the IP address' reputation will be tarnished and all e-mail sent from it could be blacklisted.

    Why Might You Want to Share an IP Address? The ESP I spoke with recently raised another valid positive about shared IP addresses, at least for low-volume senders. When we talk reputation, we talk about positive, neutral, and negative. To get on the reputation radar, the IP address needs to be sending a certain amount of e-mail each month. If your sends are small, your dedicated IP address may be below the radar and never "qualify" for a positive or a negative reputation -- you'll be stuck with a "neutral" reputation or no reputation at all. This isn't all bad, but it's also not all good.

    By having companies share IP addresses, this ESP contends it is able to get enough volume to earn positive IP address reputations, which helps its customers' e-mail get to the inbox. This is a valid point, as long as everyone using the IP address behaves and avoids red flags. It's a calculated strategy, one which requires the ESP to provide education about e-mail best practices and closely monitor every IP address to ensure customers are in compliance. If you're sending from your own in-house system, these same pros and cons apply. (A quick way to check an IP's blacklist status is sketched after this excerpt.)
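
One practical way to keep an eye on a sending IP's reputation, shared or dedicated, is to query a public DNS blacklist. This is a generic sketch using dig and a documentation address (203.0.113.7); Spamhaus is just one well-known list.

# DNSBLs are queried by reversing the IP's octets and appending the list's zone.
# An answer in 127.0.0.x means the address is listed; NXDOMAIN means it is not.
IP=203.0.113.7
REVERSED=$(echo "$IP" | awk -F. '{print $4"."$3"."$2"."$1}')
dig +short "${REVERSED}.zen.spamhaus.org"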
Rob Laporte

Google Says Domain Registrations Don't Affect SEO, Or Do They? - 0 views

  •  
    Google Says Domain Registrations Don't Affect SEO, Or Do They? Sep 9, 2009 at 2:01pm ET by Matt McGee

    Over at Search Engine Roundtable today, Barry Schwartz writes about the latest comments from Google about domain registration and its impact on SEO/search rankings. In this case, it's Google employee John Mueller suggesting in a Google Webmaster Help forum thread that Google doesn't look at the length of a domain registration: "A bunch of TLDs do not publish expiration dates - how could we compare domains with expiration dates to domains without that information? It seems that would be pretty hard, and likely not worth the trouble. Even when we do have that data, what would it tell us when comparing sites that are otherwise equivalent? A year (the minimum duration, as far as I know) is pretty long in internet-time :-)."

    But let's look at some more evidence. Earlier this year, Danny spoke with Google's Matt Cutts about a variety of domain/link/SEO issues. In light of the claims from domain registrars that longer domain registrations are good for SEO, Danny specifically asked "Does Domain Registration Length Matter?" Matt's reply: "To the best of my knowledge, no search engine has ever confirmed that they use length-of-registration as a factor in scoring. If a company is asserting that as a fact, that would be troubling."

    But wait, there's more! Shortly after the Q&A with Danny that we posted here, Matt published more thoughts on the matter in a video on the Google Webmaster Central Channel on YouTube. If you don't have time to watch the video, Matt says, "My short answer is not to worry very much about that [the number of years a domain is registered], not very much at all." He reiterates that the domain registrar claims "are not based on anything we said," and talks about a Google "historical data" patent that may or may not be part of Google's algorithm. He sums it up by saying, "make great content, don't worry nea
Rob Laporte

Google; You can put 50 words in your title tag, we'll read it | Hobo - 0 views

  •  
    Google: You can put 50 words in your title tag, we'll read it. Blurb by Shaun Anderson. Note - This is a test, testing Title Tags in Google. Consider also Google Title Tag Best Practice.

    We recently tested "how many keywords will Google read in the title tag / element?" using our simple SEO mythbuster test (number 2 in the series). And here are the results, which are quite surprising. First - here's the test title tag we tried to get Google to swallow. And it did. All of it. Even though it was a bit spammy: HoboA HoboB HoboC HoboD HoboE HoboF HoboG HoboH HoboI HoboJ HoboK HoboL HoboM HoboN HoboO HoboP HoboQ HoboR HoboS HoboT HoboU HoboV HoboW HoboX HoboY Hob10 Hob20 Hob30 Hob40 Hob50 Hob60 Hob70 Hob80 Hob90 Hob11 Hob12 Hob13 Hob14 Hob15 Hob16 Hob17 Hob18 Hob19 Hob1a Hob1b Hob1c Hob1d Hob1e Hob1f Hob1g Hob1h

    Using a keyword search - hoboA Hob1h - we were surprised to see Google returned our page. We also tested it using - Hob1g Hob1h - the keywords right at the end of the title - and again our page was returned. So that's 51 words, and 255 characters without spaces, 305 characters with spaces, at least! It seems clear Google will read just about anything these days!

    Update: Qwerty pointed out an interesting fact about the intitle: site operator in Google. Google results with the intitle: command… results as expected. But next in the sequence returns the following, unexpected result… Google results with the intitle: command. So what does this tell us? Google seems to stop at the 12th word on this page, at least when returning results using the intitle: site operator. Another interesting observation. Thanks Qwerty.

    We're obviously not sure what benefit a title tag with this many keywords in it has for your page, in terms of keyword density / dilution, and "clickability" in the search engine results pages (SERPs). 50+ words is certainly not best practice! When creating your title tag bear in
Rob Laporte

SEOmoz | Announcing SEOmoz's Index of the Web and the Launch of our Linkscape Tool - 0 views

  •  
    After 12 long months of brainstorming, testing, developing, and analyzing, the wait is finally over. Today, I'm ecstatic to announce some very big developments here at SEOmoz. They include:

    * An Index of the World Wide Web - 30 billion pages (and growing!), refreshed monthly, built to help SEOs and businesses acquire greater intelligence about the Internet's vast landscape
    * Linkscape - a tool enabling online access to the link data provided by our web index, including ordered, searchable lists of links for sites & pages, and metrics to help judge their value.
    * A Fresh Design - that gives SEOmoz a more usable, enjoyable, and consistent browsing experience
    * New Features for PRO Membership - including more membership options, credits to run advanced Linkscape reports (for all PRO members), and more.

    Since there's an incredible amount of material, I'll do my best to explain things clearly and concisely, covering each of the big changes. If you're feeling more visual, you can also check out our Linkscape comic, which introduces the web index and tool in a more humorous fashion: Check out the Linkscape Comic

    SEOmoz's Index of the Web

    For too long, data that is essential to the practice of search engine optimization has been inaccessible to all but a handful of search engineers. The connections between pages (links) and the relationship between links, URLs, and the web as a whole (link metrics) play a critical role in how search engines analyze the web and judge individual sites and pages. Professional SEOs and site owners of all kinds deserve to know more about how their properties are being referenced in such a system. We believe there are thousands of valuable applications for this data and have already put some effort into retrieving a few fascinating statistics:

    * Across the web, 58% of all links are to internal pages on the same domain; 42% point to pages off the linking site.
    * 1.83%
Rob Laporte

Google's December 2020 Core Update Themes - 0 views

  • The data and overall consensus point to Google’s December 2020 Core Update being one of the more impactful algorithm adjustments to hit the SERP over the past year or so.
  • I prefer to look at core updates almost from a pure content and UX perspective. For me, it’s about the specific pages Google swaps out more than it is a per domain analysis.
  • I am performing a qualitative analysis
  • ...19 more annotations...
  • I am not making any sort of definitive statements
  • What moves me, however, is when I look at 100 keywords I start seeing the same content-oriented theme arise again and again.
  • What I’m trying to say, and as you’ll see in the examples I will get into later, is that the content that was more focused on the specific topic mentioned in the query did better. So while the "ultimate guide” here did get to the topic the query deals with, it was not exclusively about that topic.
  • This might call the entire strategy of creating these ultimate guides into question. Perhaps you can’t cast a wide net in that way anymore? Perhaps, the "ultimate guide” is only really suitable for people who actually want to get a more broad understanding of a topic? (Crazy to think, I know!)
  • The page from Rocket Mortgage, on the other hand, is only about how much you need for a down payment:
  • So too is the page from Quicken Loans:
  • The Moral of the Story: If I want to understand how much money on average I need to put down when buying a house or what the various options generally are and what they mean long term, the CFPB page, .gov or not, doesn’t really help me. Its content is not specifically honed in on that particular topic. Again, we have another page that takes a sweeping look at a topic that lost rankings when the query reflected a more specific sort of intent!
  • What’s interesting here is that unlike the previous examples, where too much content resulted in the page’s topical relevance being diluted, the lack of such content here is what I think caused the ranking loss. Look, it’s not bad content. However, it’s pretty much the "general” kind of content you see here, there, and everywhere for all sorts of topics. Just compare it to what the page from the Credit Card Insider offers:
  • This just oozes depth. The third topic on the page alone (6 Ways to Pay Off…) rivals the depth shown on the CreditCards.com page! What differentiates this page from the "guides” shown in the other examples is that this is a guide that drills deep into one topic as opposed to trying to span multiple subtopics. Also, have a look at the formatting, it reminds me of what we saw on the Motley Fool’s page:
  • It’s deep content that is easy to digest. It’s not hard to see why Google swapped these two pages.
  • The Moral of the Story: Exact content relevancy is not only about what topic you talk about. You can be topically aligned but it has to offer real information to the user. It’s even better when that information is digestible. In other words, if you want to rank for a keyword with topic specificity it might be better to create an "ultimate guide” that drills deep into the topic itself versus trying to cover every subtopic under the sun in order to try to rank for more topics with one piece of content.
  • The by-line really sums it up. It tells you this article is about the fact that you most likely won't get addicted to painkillers, but it’s definitely possible so here’s the scoop. To me, it’s far more in line with the average user’s intent of learning about the risks of addiction versus understanding the fine difference between addiction and dependence. It’s the same story with the WebMD page:
  • The Moral of the Story: Again, the issue here is not how authoritative or how substantial the content is. There is no doubt that content from the NIH is both substantial and authoritative. The issue here again seems to relate to Google being better able to show content that is specifically relevant to the nature of the query.
  • First things first, the page doesn’t speak to the query directly. While in the process of learning the difference between sadness and depression one could understand the signs of depression that route is certainly indirect. You could argue that the query how to tell if you have depression could be taken as ‘how do I know if I am just sad or depressed?’ but that really doesn’t seem to be the essential intent here. That topical line (i.e., sadness vs. depression) would most likely produce its own unique query (i.e., am I sad or depressed). From the content shown on the WebMD page, it appears that Google thinks of the intent as understanding the symptoms of depression:
  • The WebMD page, in contradistinction to the MHA page, speaks to the "plain meaning” of the query’s intent… how can you tell if you’re suffering from depression? Aside from that, the WebMD page offers a bit more in terms of substance. While it doesn’t go into great detail per se, the WebMD page does offer a pretty comprehensive list of items. Compare that to the MHA page which, if you read it, is a bit thin and offers little in the way of detail (not even a basic list as seen on the WebMD page). The Moral of the Story: Relevancy is a two-pronged equation (at minimum). It requires the content to be topically focused on the one hand as well as substantial on the other
  • I’ve saved the best for last. This is my favorite example that I came across when diving into the December 2020 Core Update. I mean, for crying out loud, we’re talking about the CDC losing rankings in favor of a .org domain I never heard of. How could this be? Let’s understand the intent of the query. If I were searching for this it would be because I found something on my body that I thought might be skin cancer. If I could be so bold, I would imagine that this is why most of us would search for this term. I wouldn’t, and again I imagine most people in most instances wouldn’t search for this in order to understand if regular screening is officially recommended or not. Yet, that is what the CDC page is about:
  • I hate to make assumptions, but I would also think that someone running this query is most likely not interested in the common tests and methods doctors use to determine if skin cancer is present. Yet, this is what the page from Cancer.net focuses on:
  • Again, I would search for this term if I saw something weird on my body that made me think "Holy crap, do I have skin cancer?”. The page from the AOCD is entirely made for people on the verge of freaking out at the possibility of having skin cancer:
  • To me, when you see this page relative to the pages from Cancer.net and the CDC, it’s painfully obvious why this page got the ranking boost. The Moral of the Story: Again, I think what has transpired here is painfully obvious. Google has looked past the immediate authority of some of the pages here and has more heavily considered how relevant the content is to the query. As with the cases I have shown earlier, Google is rewarding content that speaks in a highly-focused way to the intent and nature of the query.

    What Was the December 2020 Core Update About? Are you expecting a one-liner that definitively characterizes the December 2020 update? You’re not going to get one from me. This update, like any other, certainly included a whole plethora of different "algorithmic considerations” and themes. That said, from where I sit, while other core updates did things to help put the most authoritative content at the top of the SERP, this update seemed to me to be more about pure relevancy. Updates of the past have done things to weed out sites using a marketing tone within YMYL informational content or have rewarded sites that put the right content ahead of their affiliate pursuits. All of that, while part of relevancy, speaks more to a need for something authoritative at the top of the SERP. Seeing so many .gov pages drop in favor of pages from sites like Healthline or WebMD seems to point to the update rewarding relevancy to the nth degree. Perhaps Google felt as if it had "authority” at the top of the SERP in order, paving the way for a focus on relevance? Who knows. All I can say is that I personally have not seen such a strong focus on pure relevance on page one of the SERP.

    Content Creation Takeaways: Practically speaking, I think the era of broadly reaching pages is quickly coming to an end. I think that has been the case for a while. However, seeing Google pull pages off page one of the SERP because they deal with multiple subtopics is a new level, at least for me. It shows that you have to create content that talks about one topic and one topic only (unless the keyword reflects a specific desire for a broader survey of a topic). I wonder if the idea of having one ultimate guide so as to win numerous keywords should be replaced with multiple posts where each post deals with one specific topic or subtopic. (If you do that, please don’t create thin content; that is not what I am advocating for.) It’s a rather logical concept. As Google gets better at understanding content, it is going to prefer highly-focused content around a specific topic to that which is of a more broad nature, unless the query specifically shows intent for a general survey of a topic.
Rob Laporte

Questioning the Future of Search - ClickZ - 0 views

  •  
    Questioning the Future of Search. By Mike Grehan, ClickZ, Jan 26, 2009.

    Last week I presented a Webinar based on the "thought paper" I wrote called, "New Signals To Search Engines." As it was a long read at 23 pages, I highlighted the more salient points, but mainly wanted to try and answer the hundreds of questions I received following its publication. The top question was about social media. It seems that many companies already have barriers to entry. Amy Labroo, associate director of online media at Advantage Business Media, asked specifically about any backlash due to unmonitored content in the social media space.

    I've come across this situation quite a lot recently. Many companies worry about negative commentary and therefore don't accept comments on their blogs or social network sites. In fact, many haven't started a blog or a dialogue space at a social networking site. This is simply hiding from your audience. If people have negative commentary about you and they can't make it known at your Web site or blog, they'll make it known somewhere else. I advocate putting yourself out there and listening to your audience. Marketing has changed from a broadcast-my-corporate-message medium to a listening medium. The voice of the customer is very, very loud online. And those companies that still believe they own their brand and the message may well be in for a bit of shock as brands are hijacked by customers. Let your customers have their say.

    Keyword-driven marketing is all about understanding the language of the customer and creating marketing messages in that language. From time to time, I meet with creative agencies and almost always end u
Rob Laporte

Google SEO Test - Google Prefers Valid HTML & CSS | Hobo - 0 views

  •  
    Well - the result is clear. From these 4 pages, Google managed to pick the page with valid CSS and valid HTML as the preferred page to include in its index! OK, it might be a bit early to see if the four pages in the test eventually appear in Google, but on first glance it appears Google spidered the pages, examined them, applied duplicate content filters as expected, and selected one to include in search engine results. It just happens that Google seems to prefer the page with valid code as laid down by the W3C (World Wide Web Consortium). The W3C was started in 1994 to lead the Web to its full potential by developing common protocols that promote its evolution and ensure its interoperability.

    What is the W3C?
    * W3C stands for the World Wide Web Consortium
    * W3C was created in October 1994
    * W3C was created by Tim Berners-Lee, the inventor of the Web
    * W3C is organized as a Member Organization
    * W3C is working to standardize the Web
    * W3C creates and maintains WWW standards
    * W3C standards are called W3C Recommendations

    How The W3C Started: The World Wide Web (WWW) began as a project at the European Organization for Nuclear Research (CERN), where Tim Berners-Lee developed a vision of the World Wide Web. Tim Berners-Lee - the inventor of the World Wide Web - is now the Director of the World Wide Web Consortium (W3C). W3C was created in 1994 as a collaboration between the Massachusetts Institute of Technology (MIT) and the European Organization for Nuclear Research (CERN), with support from the U.S. Defense Advanced Research Project Agency (DARPA) and the European Commission.

    W3C Standardising the Web: W3C is working to make the Web accessible to all users (despite differences in culture, education, ability, resources, and physical limitations). W3C also coordinates its work with many other standards organizations such as the Internet Engineering Task Force, the Wireless Application Protocols (WAP) Forum an
jack_fox

28 Google rich snippets you should know in 2019 [guide + infographic] - Mangools Blog - 0 views

  • unless you are an authoritative website such as Wikipedia, your information probably won’t appear in the answer box.
  • having an image from your website in an image pack is not very beneficial.
  • Besides the common video thumbnail and video knowledge panel, videos may also appear in a carousel, both on the mobile and the desktop devices.
  • ...15 more annotations...
  • It is always a good idea to have a video on your website. It increases user engagement and grabs attention. If you appear in the SERP with your own video thumbnail, it increases CTR, and the user will likely stay longer on your site.
  • If you decide to host (or embed) a video on your own website, you have to include proper structured data markup.
  • In general, it’s easier to appear as a video thumbnail in SERP with youtube video.
  • From the technical point of view, it is important to have a structured data markup for your article and it is recommended by Google to have an AMP version of the website.
  • It is based on an internal Google algorithm. Your website has to be authoritative and contain high-quality content. It doesn’t matter if you are a big news portal or you have a personal blog. If there is long, high-quality content, Google may include your website.
  • If you want to appear as an in-depth article, you should write long, high quality and unique content marked up with a structured data markup for article (don’t forget to include your company logo within the schema markup).
  • Higher CTRs. It’s kinda catchy, as numbers will always attract people’s attention. An image can make the feature even more prominent.
  • Implementation: Good old friend: structured data
  • In the SERP, they replace the classic URL of a result. It’s a simplified, more readable version of the result’s URL. Categories and leaf pages are separated with chevrons. On the desktop you can achieve it with the right structured data; in the mobile SERP it is automatic for all results.
  • Breadcrumbs (as opposed to a common URL) are easier to read for people, so it leads to a better UX right from the very first interaction with your website in the SERP, which can also lead to a higher CTR.
  • It’s really easy to implement on every blog or ecommerce site – just add another piece of structured data to your website. If you have a WordPress site, you can do that with SEO plugins like Yoast SEO (a minimal JSON-LD sketch appears after this list).
  • It mainly appears for the root domain, but it can be shown for a leaf page too (e.g. if you have the blog as a leaf page, blog categories (leaf pages) may appear as sitelinks).
  • Sitelinks contain links to leaf pages of a current website with title and description. It may contain 2 – 10 sitelinks. Appearance on a mobile is a bit different from a desktop. You may also spot small sitelinks as a vertical enhancement of an organic result.
  • High CTRs.
  • You can’t directly control the occurrence of sitelinks. Only Google decides whether to display them or not. However, the best practice is to have a clear website hierarchy reflected in the top menu, with descriptive anchor text. The sitelinks are links from the menu.
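
As a concrete illustration of the breadcrumb markup discussed above, here is a minimal schema.org BreadcrumbList snippet written to a file from the shell; the page names and URLs are placeholders, and plugins like Yoast SEO generate equivalent markup automatically.

# Write an example BreadcrumbList snippet to a local file for inclusion in a page template.
cat > breadcrumbs-snippet.html <<'EOF'
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Blog",
      "item": "https://example.com/blog/" },
    { "@type": "ListItem", "position": 2, "name": "SEO",
      "item": "https://example.com/blog/seo/" },
    { "@type": "ListItem", "position": 3, "name": "Rich snippets guide" }
  ]
}
</script>
EOF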
Rob Laporte

Domain Moving Day the Key Relevance Way | SEMClubHouse - Key Relevance Blog - 0 views

  •  
    Domain Moving Day the Key Relevance Way by Mike Churchill

    So, you're gonna change hosting providers. In many cases, moving the content of the site is as easy as zipping up the content and unzipping it on the new server. But there is another aspect of moving the domain that many people overlook: DNS. The Domain Name System (DNS) is the translation service that converts your domain name (e.g. keyrelevance.com) to the corresponding IP address. When you move hosting companies, it's like changing houses: if you don't set up the change-of-address information correctly, you might have some visitors going to the old address for a while. Proper handling of the changes to DNS records makes this transition time as short as possible.

    Let's assume that you are changing hosting, and the new hosting company is going to start handling the authoritative DNS for the domain. The first step is to configure the new hosting company as the authority. This is best done a couple of days or more before the site moves to the new location.

    What does "Authoritative DNS" mean? There are a double-handful of servers (known as the root DNS servers) whose purpose is to keep track of who is keeping track of the IP addresses for a domain. Rather than handling EVERY DNS request, they only keep track of who is the authoritative publisher of the DNS information for each domain. In other words, they don't know your address, but they tell you who does know it. If we tell the root-level DNS servers that the authority is changing, this information may take up to 48 hours to propagate throughout the internet. By changing the authority without changing the IP addresses, both the old authority and the new authority will agree on the address while browsers make requests during this transition (so no traffic gets forwarded before you move).

    Shortening the Transition
    The authoritative DNS servers want to minimize their load, so every time they send out an answer to a
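    To make the "both authorities agree on the address" check above concrete, here is a rough Python sketch. It assumes the third-party dnspython package (pip install dnspython); the domain and the two nameserver IPs are placeholders you would swap for the old and new hosts' DNS servers.

      import dns.resolver

      DOMAIN = "example.com"        # hypothetical domain being moved
      OLD_DNS = "198.51.100.53"     # IP of the old host's DNS server (placeholder)
      NEW_DNS = "203.0.113.53"      # IP of the new host's DNS server (placeholder)

      def a_records(domain, nameserver):
          """Ask one specific nameserver for the domain's A records."""
          resolver = dns.resolver.Resolver(configure=False)
          resolver.nameservers = [nameserver]
          return sorted(r.address for r in resolver.resolve(domain, "A"))

      old_answer = a_records(DOMAIN, OLD_DNS)
      new_answer = a_records(DOMAIN, NEW_DNS)

      print("old authority says:", old_answer)
      print("new authority says:", new_answer)
      if old_answer == new_answer:
          print("Both authorities agree - safe to let the root delegation propagate.")
      else:
          print("Mismatch - visitors may reach different servers during the transition.")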
Rob Laporte

Limit Anchor Text Links To 55 Characters In Length? | Hobo - 0 views

  •  
    Limit Anchor Text Links To 55 Characters In Length? Blurb by Shaun

    Building links: as an SEO I wanted to know - how many words or characters does Google count in a link? What's best practice when creating links, internal or external? What is the optimal length of an HTML link? It appears the answer to the question "how many words in a text link?" is 55 characters, about 8-10 words. Why is this important to know?
    1. You get to understand how many words Google will count as part of a link
    2. You can see why you should keep titles to a maximum number of characters
    3. You can see why your domain name should be short and why URLs should be snappy
    4. You can see why you should rewrite your URLs (SEF)
    5. It's especially useful when thinking about linking internally, via body text on a page

    I wanted to see how many words Google will count in one 'link' to pass on anchor text power to another page, so I did a test a bit like this one below:
    1. I pointed some nonsense words in one massive link, 50 words long, at the home page of a 'trusted' site
    2. Each of the nonsense words was 6 characters long
    3. Then I did a search for something generic that the site would rank no. 1 for, and added the nonsense words to the search, so that the famous "These words only appear in links to the site" (paraphrased) message kicked in
    4. This, I surmised, would let me see how many of the nonsense words Google would attribute to the target page from the massive 50-word link I tried to get it to swallow.

    The answer was:
    1. Google counted 8 words in the anchor text link out of a possible 50.
    2. It seemed to ignore everything else after the 8th word.
    3. 8 words x 6 characters = 48 characters + 7 spaces = a nice round and easy to remember number: 55 characters.

    So, a possible best practice for the number of words in an anchor text might be to keep a link under 8 words and, importantly, under 55 characters, because everything after that is ignored.
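    If you want to audit your own pages against this 55-character heuristic, a small standard-library Python script can flag long anchors. This is only a sketch: the URL is a placeholder, and 55 is the number from the test above, not an official Google limit.

      from html.parser import HTMLParser
      from urllib.request import urlopen

      LIMIT = 55  # heuristic from the test above, not an official Google number

      class AnchorAudit(HTMLParser):
          """Print every link whose visible anchor text exceeds LIMIT characters."""

          def __init__(self):
              super().__init__()
              self._href = None
              self._text = []

          def handle_starttag(self, tag, attrs):
              if tag == "a":
                  self._href = dict(attrs).get("href", "")
                  self._text = []

          def handle_data(self, data):
              if self._href is not None:
                  self._text.append(data)

          def handle_endtag(self, tag):
              if tag == "a" and self._href is not None:
                  anchor = " ".join("".join(self._text).split())
                  if len(anchor) > LIMIT:
                      print(f"{len(anchor):3d} chars  {self._href}  ->  {anchor!r}")
                  self._href = None

      if __name__ == "__main__":
          # Placeholder page; point this at your own site.
          html = urlopen("https://www.example.com/").read().decode("utf-8", "ignore")
          AnchorAudit().feed(html)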
jack_fox

When and how to ask your clients for testimonials and case studies - Credo - 0 views

  • If you write it for them and ask them for their approval/edits, you can write that testimonial so that it speaks directly to the pain you just solved.
  • First, make the ask when they are happiest, right after you solved their problem. Second, make it as easy on them as possible by offering to write it for them. Third, do it quickly, while it is still top of mind for them to approve and so that you do not forget about it. Fourth, write the testimonial so that it maps to the pain you just solved for the customer.
Rob Laporte

Google Webmaster Tools Now Provide Source Data For Broken Links - 0 views

  • Google has also added functionality to the Webmaster Tools API to enable site owners to provide input on control settings (such as preferred domain and crawl rate) that could previously only be done via the application. As they note in the blog post: “This is especially useful if you have a large number of sites. With the Webmaster Tools API, you can perform hundreds of operations in the time that it would take to add and verify a single site through the web interface.”
  •  
    Oct 13, 2008 at 5:28pm Eastern by Vanessa Fox

    Google Webmaster Tools Now Provide Source Data For Broken Links
    Ever since Google Webmaster Tools started reporting on broken links to a site, webmasters have been asking for the sources of those links. Today, Google has delivered. From Webmaster Tools you can now see the page that each broken link is coming from. This information should be of great help to webmasters in ensuring that visitors find their sites and that their links are properly credited.

    The value of the 404 error report
    Why does Google report broken links in the first place? As Googlebot crawls the web, it stores a list of all the links it finds. It then uses that list for a couple of things:
    * As the source list to crawl more pages on the web
    * To help calculate PageRank

    If your site has a page with the URL www.example.com/mypage.html and someone links to it using the URL www.example.com/mpage.html, then a few things can happen:
    * Visitors who click on that link arrive at the 404 page for your site and aren't able to get to the content they were looking for
    * Googlebot follows that link and, instead of finding a valid page of your site to crawl, receives a 404 page
    * Google can't use that link to give a specific page on your site link credit (because it has no page to credit)

    Clearly, knowing about broken links to your site is valuable. The best solution in these situations generally is to implement a 301 redirect from the incorrect URL to the correct one. If you see a 404 error for www.example.com/mpage.html, then you can be pretty sure they meant to link to www.example.com/mypage.html. By implementing the redirect, visitors who click the link find the right content, Googlebot finds the content, and mypage.html gets credit for the link. In addition, you can scan your site to see if any of the broken links are internal, and fix them. But finding broken links on your site can be tedious (although it's valuable to run a broken l
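    In practice the 301 described above is usually configured in the web server (Apache, Nginx) or the CMS rather than in application code, but as an illustration here is a minimal standard-library Python sketch of the same idea, using the hypothetical mpage.html/mypage.html URLs from the example.

      from http.server import BaseHTTPRequestHandler, HTTPServer

      # Map of broken inbound URLs to the real pages they should resolve to.
      REDIRECTS = {
          "/mpage.html": "/mypage.html",
      }

      class RedirectHandler(BaseHTTPRequestHandler):
          def do_GET(self):
              target = REDIRECTS.get(self.path)
              if target:
                  # Permanent redirect: visitors, Googlebot and link credit
                  # all end up on the correct URL.
                  self.send_response(301)
                  self.send_header("Location", target)
                  self.end_headers()
              else:
                  self.send_response(404)
                  self.end_headers()
                  self.wfile.write(b"Not found")

      if __name__ == "__main__":
          HTTPServer(("localhost", 8000), RedirectHandler).serve_forever()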
Rob Laporte

How Individuals Can Build a Robust Social Presence - ClickZ - 0 views

  •  
    How to Build a Robust Social Presence
    Get your basic data out there. For many professionals, the core of your social presence probably involves one or more of these: LinkedIn, Facebook, and Twitter. Each of these can be set up in less than five minutes and costs you nothing. Before jumping in, a few tips are in order:
    * When creating your profile, be sure to include a nice photo, and follow the steps suggested at each site to complete as much of your profile as you can. When you're considering adding, following, or contacting someone, think about the impact of missing or otherwise insufficient information. Business networking should not feel like you're living in a mystery novel. None of us has time for that, so think about the people who are looking at you. Make it easy for them to understand who you are and what you do.
    * Thoughtfully add people to your network. I overheard someone on a plane last week saying "I have over a thousand people in my personal network but have no idea who most of them are." If the people in your network lack credibility, what does that say about you? These are your "friends," right?
    * On LinkedIn, seek out recommendations, but only from people who are qualified to give them. Five hundred professional connections without a single recommendation sends an unfortunate message. Likewise, a recommendation that starts out "I've never actually worked with Dave, but..." is useless, and detracts from social capital and personal credibility.
    * Participate. Leverage your ability to add or become friends, to post, and to comment to your advantage. Talk about your business, about news that relates to you or your profession, about things that are of interest to your audience. Do not shill or spam.
    * Be careful with questions like "What are you doing right now?" This common question -- in the context of business -- is a thought-starter, not a literal interrogative. The best response is less along the lines of "ea
Rob Laporte

Effective Internal Linking Strategies That Prevent Duplicate Content Nonsense - Search ... - 0 views

  •  
    The funny thing about duplicate content is that you don't really have to have it for it to appear as if you do. But whether you have duplicate content on your site or not, to the search engines appearances are everything. The engines are pretty much just mindless bots that can't reason. They only see what is, or appears to be, there and then do what the programmers have determined through the algorithm.

    How you set up your internal linking structure plays a significant role in whether you set yourself up to appear as if you have duplicate content on your site. Some things we do without thinking, setting ourselves up for problems ahead. With a little foresight and planning, you can prevent duplicate content issues that are a result of poor internal link development.

    For example, we know that when we link to site.com/page1.html in one place but then link to www.site.com/page1.html in another, we are really linking to the same page. But to the search engines, the www. can make a difference. They'll often look at those two links as links to two separate pages, and then analyze each page as if it is a duplicate of the other. But there is something we can do with our internal linking to alleviate this kind of appearance of duplicate content.

    Link to the www. version only
    Tomorrow I'll provide information on how to set up your site so that when someone types in yoursite.com they are automatically redirected to www.yoursite.com. It's a great permanent fix, but as a safety measure, I also recommend simply adjusting all your links internally to do the same.

    Example of not linking to the www. version: in the image above you can see that the domain contains the www., but when you mouse over any of the navigation links, they point to pages without the www. Even if you have a permanent redirect in place, all the links on your site should point to the proper place. At the very least you're making the search engines and visitors NOT have to redirect. At best, should y
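    As a quick way to find the inconsistent internal links described above, the sketch below fetches a page and lists any link that points at the bare domain instead of the preferred www. hostname. Standard library only (Python 3.9+ for removeprefix); the start URL and preferred host are placeholders, and a real audit would crawl more than one page.

      from html.parser import HTMLParser
      from urllib.parse import urljoin, urlparse
      from urllib.request import urlopen

      PAGE = "https://www.example.com/"      # placeholder start page
      PREFERRED_HOST = "www.example.com"     # the canonical hostname you link to

      class LinkCollector(HTMLParser):
          """Collect absolute URLs of every <a href> on the page."""

          def __init__(self):
              super().__init__()
              self.links = []

          def handle_starttag(self, tag, attrs):
              if tag == "a":
                  href = dict(attrs).get("href")
                  if href:
                      self.links.append(urljoin(PAGE, href))

      html = urlopen(PAGE).read().decode("utf-8", "ignore")
      collector = LinkCollector()
      collector.feed(html)

      bare_domain = PREFERRED_HOST.removeprefix("www.")
      for link in collector.links:
          host = urlparse(link).netloc
          # Internal link pointing at the same site but the "wrong" hostname.
          if host == bare_domain and host != PREFERRED_HOST:
              print("non-www internal link:", link)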
Rob Laporte

A Completely Different Kind Of Landing Page Optimization - 0 views

  •  
    How can you begin using segment optimization in your campaigns? Start by making a list of possible segments within your audience. Who are the different types of people who look for you online - and why? Don't restrict yourself to the way you may have segmented people in your database or your business plan. Brainstorm what's important and relevant from the respondent's point of view, by considering any or all of the following issues:
    * the specific "problem" the respondent wants to solve
    * the demographic/psychographic "persona" of the respondent
    * the respondent's stage in the buying process
    * the role of the respondent in their organization
    * the respondent's geographic location
    * the respondent's industry or the size of their organization

    These are your initial buckets into which respondents could be segmented. Don't worry if there's overlap between buckets, as these won't necessarily be either/or choices. Next, review the keywords and ad creatives you're running in your search marketing campaigns. For each keyword/creative pair, ask yourself: is there a particular segment that its respondents would clearly belong to? If the answer is yes, add it to that bucket along with the number of clicks per month it generates. If the answer is no, leave a question mark next to it - perhaps with a handful of segments it might appeal to. For instance, in our example above, the keyword phrases "french exam" and "college french" are obvious candidates for the student segment. Phrases like "business french" and "executive french" fall into the business traveler bucket. But "learn french" can't be segmented just from the keyword.

    Now, look over your segment buckets and see which ones have the most clicks per month. These are your best targets for segment optimization. For each one, create a dedicated landing page that is focused on the needs, wants, and characteristics of that particular
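    As an illustration of the bucketing exercise, here is a minimal Python sketch using the keyword examples from the passage above; the segment labels and monthly click counts are hypothetical.

      from collections import defaultdict

      # (keyword, segment or None if it can't be segmented yet, clicks/month)
      keywords = [
          ("french exam",      "student",           900),
          ("college french",   "student",           400),
          ("business french",  "business traveler", 650),
          ("executive french", "business traveler", 300),
          ("learn french",     None,                2500),  # too generic to bucket
      ]

      buckets = defaultdict(lambda: {"clicks": 0, "keywords": []})
      unsegmented = []

      for kw, segment, clicks in keywords:
          if segment is None:
              unsegmented.append((kw, clicks))
              continue
          buckets[segment]["clicks"] += clicks
          buckets[segment]["keywords"].append(kw)

      # Biggest buckets first: these are the best candidates for a dedicated
      # landing page per segment.
      for segment, data in sorted(buckets.items(), key=lambda kv: -kv[1]["clicks"]):
          print(f"{segment:<18} {data['clicks']:>5} clicks/mo  {data['keywords']}")

      print("needs more thought:", unsegmented)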
Rob Laporte

Understanding Google Maps & Yahoo Local Search | Developing Knowledge about Local Search - 0 views

  •  
    Google Maps: relative value of a OneBox vs top organic results
    Category: Google Maps (Google Local) - Mike - 5:50 am

    Steve Espinosa has some interesting preliminary research on the relative click-through rates of a #1 listing in the Local 10-Pack and a simultaneous #1 listing in organic. The organic listing showed 1.6x the click-through of the Local 10-Pack listing. As preliminary research that looked only at click-throughs, not call-ins or other measures of action, it is an important piece of work but doesn't speak to ultimate customer action. According to TMP's Local Search Usage Study: following online local searches, consumers most often contact a business over the telephone (39%), visit the business in person (32%), or contact the business online (12%). If one works out the combined math of the two studies (a not very reliable number, I assure you), in the end the top local ranking would still provide more client contacts, either via phone or in person, than the organic ranking.

    At the end of the day, Steve's research cannot be viewed as a reason not to focus on local, but rather as a call to action on the organic side. I think he would agree that, in the excitement around local, you can't forget organic's power, and that in an ideal world a business would use every tool available to them. However, many times, due to the nature of a business, a business may not be able to legitimately play in the local space and their only recourse is to optimize their website for local phrases.

    Another interesting outcome of Steve's initial research was "the fact is that the majority of the users who got to the site via the natural link had resolution above 1024×768 and the majority of users who visited via the Onebox result had resolution of 1024×768 or under." As Steve pointed out, this could be due to the greater real estate visible to those with larger screens and thus greater visibility of organic listings above the fold. It could also, however, be