
DISC Inc - Group items tagged "ad"


Rob Laporte

Wake Up SEOs, the New Google is Here | SEOmoz - 0 views

  •  
    Rel="author" and Rel="publisher" are the solution Google is adopting in order to better control, within other things, the spam pollution of the SERPs. If you are a blogger, you will be incentivized in marking your content with Author and link it to your G+ Profile, and as a Site, you are incentivized to create your G+ Business page and to promote it with a badge on you site that has the rel="publisher" in its code. Trusted seeds are not anymore only sites, but can be also persons (i.e.: Rand or Danny Sullivan) or social facets of an entity… so, the closer I am in the Social Graph to those persons//entity the more trusted I am to Google eyes. As we can see, Google is not trying to rely only on the link graph, as it is quite easy to game, but it is not simply adding the social signals to the link graph, because they too can be gamed. What Google is doing is creating and refining a new graph that see cooperating Link graph, Social graph and Trust graph and which is possibly harder to game. Because it can be gamed still, but - hopefully - needing so many efforts that it may become not-viable as a practice. Wake up SEOs, the new Google is here As a conclusion, let me borrow what Larry Page wrote on Google+ (bold is mine): Our ultimate ambition is to transform the overall Google experience […] because we understand what you want and can deliver it instantly. This means baking identity and sharing into all of our products so that we build a real relationship with our users. Sharing on the web will be like sharing in real life across all your stuff. You'll have better, more relevant search results and ads. Think about it this way … last quarter, we've shipped the +, and now we're going to ship the Google part. I think that it says it all and what we have lived a year now is explained clearly by the Larry Page words. What can we do as SEOs? Evolve, because SEO is not dieing, but SEOs can if they don't assume that winter - oops - the
Rob Laporte

Official Google Webmaster Central Blog: Make your 404 pages more useful - 0 views

  •  
    Make your 404 pages more useful - Tuesday, August 19, 2008 at 10:13 AM. Your visitors may stumble onto a 404 "Not found" page on your website for a variety of reasons: a mistyped URL (or a copy-and-paste mistake), broken or truncated links on web pages or in an email message, or moved or deleted content. Confronted by a 404 page, they may then attempt to manually correct the URL, click the back button, or even navigate away from your site. As hinted in an earlier post for "404 week at Webmaster Central", there are various ways to help your visitors get out of the dead-end situation. In our quest to make 404 pages more useful, we've just added a section in Webmaster Tools called "Enhance 404 pages". If you've created a custom 404 page, this allows you to embed a widget in your 404 page that helps your visitors find what they're looking for by providing suggestions based on the incorrect URL. Example: Jamie receives the link www.example.com/activities/adventurecruise.html in an email message. Because of bad formatting by an email client, the URL is truncated to www.example.com/activities/adventur, and as a result it returns a 404 page. With the 404 widget added, however, she could instead see suggestions. In addition to attempting to correct the URL, the 404 widget also suggests the following, if available: a link to the parent subdirectory, a sitemap webpage, and site search query suggestions with a search box. How do you add the widget? Visit the "Enhance 404 pages" section in Webmaster Tools, which allows you to generate a JavaScript snippet. You can then copy and paste this into your custom 404 page's code. As always, don't forget to return a proper 404 code. Can you change the way it looks? Sure. We leave the HTML unstyled initially, but you can edit the CSS block that we've included. For more information, check out our gu…
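As a historical sketch, embedding the widget amounted to pasting the generated JavaScript into your custom 404 template. The variable names and script URL below are reconstructed from that era and the service has long since been retired, so treat them as illustrative only:

    <!-- Custom 404 page: Google's 404 suggestion widget (illustrative, now retired) -->
    <script type="text/javascript">
      var GOOG_FIXURL_LANG = 'en';                      // language for the suggestions
      var GOOG_FIXURL_SITE = 'http://www.example.com';  // your site's root URL
    </script>
    <script type="text/javascript"
            src="http://linkhelp.clients.google.com/tbproxy/lh/wm/fixurl.js"></script>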
Rob Laporte

NoFollow | Big Oak SEO Blog - 0 views

  • And while the business networking aspect is great, I’m writing to tell you it can be useful for your SEO efforts too, specifically link building. You may not know this, but LinkedIn does not employ the nofollow attribute on its links, unlike most other social networking sites. So that means we can use LinkedIn responsibly to build some nice one-way links to our sites and blogs. Even better, your employees can use this to build some SEO-friendly links to your company site.
  • So the days of pasting links onto high-PageRank Flickr pages are over. Or are they? No. Let’s examine how you can use the remaining scraps of link juice from Flickr in your SEO campaigns. 1) Flickr has not added nofollow to discussion boards. For those of you who liked to scout out high-PageRank pages and just drop your link as a comment on the photo - which could be accomplished easily if you owned a link-laundering website - you can still do this in the Flickr group discussion boards. Flickr has not yet added nofollow tags to those, and given the preponderance of discussions that revolve around people sharing photos, you can just as easily drop relevant external links in the discussion and reap link juice benefits. 2) Flickr has not added nofollow to personal profile pages. If you have a personal profile page, you can place targeted anchor text on it, point links at it, and receive full SEO benefit as it gains PageRank. 3) Flickr has not added nofollow to group pages. If you own a Flickr group, you can still put as many links as you wish on the main group page without fear of them being turned into nofollow. Many Flickr personal profile and group pages gain toolbar PR just by having the link spread around in-house, so it’s not that hard to make those pages accumulate PR. Google seems to be very generous in that regard. There’s a lot of PR to be passed around through Flickr, apparently. So the glory days of Flickr SEO may be over (unless Yahoo does the improbable and flips the switch back), but Rome didn’t burn to rubble in a day, so we might as well make the most of Flickr before it completely collapses.
Rob Laporte

Will Selling Links via Text Link Ads SLAM your PageRank? - Webmaster Central Help - 0 views

  • Will Selling Links via Text Link Ads SLAM your PageRank? uploadjockey, Level 1, 1/6/10. I have read the FAQs and checked for similar issues: YES. My site's URL is: http://www.uploadjockey.com. Description (including timeline of any changes made): removed Text Link Ads. Last year we started to sell links via text-link-ads.com for some additional income. I cannot say for certain that this caused the problem, but it seems like it did. Our PageRank has dropped from a PR4 to a PR0 in less than a year. Our traffic has dropped from over 75k+ unique hits a day to just barely 20k+. Am I missing something? Is there some other violation that I could be missing that is killing our ranking results? Thanks
jack_fox

Google Ads, helping small businesses do more - 1 views

  • Small businesses can now use Smart campaigns, our new default ads experience.
  • We’ve found that Smart campaigns are 3 times better at getting your ad in front of the right audience.
Rob Laporte

Google introduces Smart Campaigns for small businesses - the first new solution to laun... - 0 views

  • The campaigns are almost entirely automated, from ad creatives to delivery optimization, based on the product or service being advertised and the goal the advertiser sets.
  • Smart Campaigns are built on AdWords Express technology, and Spalding says the company will continue to develop on it.
  • Smart Campaign ads can be delivered across Google’s properties, and users do not have the ability to turn off channels.
  • Smart Campaigns are three times more effective at reaching a business’s target audiences than AdWords Express campaigns
  • The product is new and we are always experimenting with different approaches. As more small businesses use Smart Campaigns, we will take their feedback and continue to evolve the product
  •  
    "announced"
Rob Laporte

Conversion Rate Benchmarks: Find Out How YOUR Conversion Rate Compares | WordStream - 0 views

  •  
    "ndustry Average Google Ads Cost per Conversion (Search) Average Google Ads Cost per Conversion (GDN)"
jack_fox

The Ultimate Web Server Security Guide @ MyThemeShop - 0 views

  • They could insert links into the site to boost their SEO rankings. Hackers can make a killing selling links from exploited sites. Alternatively, a hacker could deface the site and demand money to restore it (ransom). They could even place ads on the site and use the traffic to make money. In most cases, an attacker will also install backdoors into the server. These are deliberate security holes that allow them to come back and exploit the site in the future – even if the insecure plugin has been replaced.
  • Unfortunately, under WordPress, every plugin and theme has the ability to alter anything on the site. They can even be exploited to infect other apps and sites hosted on the same machine.
  • Theme developers are often relatively inexperienced coders. Usually, they’re professional graphic artists who have taught themselves a little PHP on the side. Plugins are another popular line of attack – they account for 22% of successful hacks. Put together, themes and plugins are a major source of security trouble.
  • Each person who uses your system should only have the privileges they need to perform their tasks.
  • Don’t depend on a single security measure to keep your server safe. You need multiple rings of defense.
  • Security exploits exist at all levels of the technology stack, from the hardware up. WP White Security revealed that 41% of WordPress sites are hacked through a weakness in the web host.
  • While it’s important to use a strong password, password cracking is not a primary focus for hackers.
  • the more software you have installed on your machine, the easier it is to hack – even if you aren’t using the programs! Clearly, programs that are designed to destroy your system are dangerous. But even innocent software can be used in an attack.
  • There are 3 ways to reduce the attack surface: 1. Run fewer processes 2. Uninstall programs you don’t need 3. Build a system from scratch that only has the processes you need
  • A really good authentication system uses multiple tests. Someone could steal or guess your password. They could grab your laptop with its cryptographic keys.
  • If you want to run multiple processes at the same time, you need some way of managing them. This is basically what a kernel is. It does more than that – it handles all of the complex details of the computer hardware, too. And it runs the computer’s networking capabilities
  • programs exist as files when they are not running in memory
  • SELinux’s default response is to deny any request.
  • SELinux is extremely comprehensive, but this power comes at a price. It’s difficult to learn, complex to set up, and time-consuming to maintain.
  • AppArmor is an example of a MAC tool, although it’s nowhere near as comprehensive as SELinux. It applies rules to programs to limit what they can do.
  • AppArmor is relatively easy to set up, but it does require you to configure each application and program one by one. This puts the onus for security in the hands of the user or sysadmin. Often, when new apps are added, users forget to configure AppArmor. Or they do a horrible job and lock themselves out, so their only option is to disable the profile. That said, several distributions have adopted AppArmor.
  • Generic profiles shipped by repo teams are designed to cover a wide range of different use cases, so they tend to be fairly loose. Your specific use cases are usually more specific. In this case, it pays to fine-tune the settings, making them more restrictive.
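On Ubuntu-family systems, the usual fine-tuning workflow uses the apparmor-utils tools; a minimal sketch (the Apache profile path shown is the conventional one and may differ on your distro):

    # Put the profile in complain mode: violations are logged, not blocked
    sudo aa-complain /etc/apparmor.d/usr.sbin.apache2

    # Exercise the application normally, then turn the logged events into rules
    sudo aa-logprof

    # When the profile fits your real use case, switch it back to enforcement
    sudo aa-enforce /etc/apparmor.d/usr.sbin.apache2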
  • GRSecurity is a suite of security enhancements
  • In the future, this could become a viable option. For now, we’ll use Ubuntu and AppArmor.
  • Apache is a user-facing service – it’s how your users interact with your website. It’s important to control this interaction too.
  • If your Apache configuration is bad, these files can be viewed as plain text. All of your code will be visible for anyone to see – this potentially includes your database credentials, cryptographic keys, and salts.
  • You can configure Apache to refuse any requests for these essential directories using .htaccess files. These are folder-level configuration files that Apache reads before it replies to a request.
  • The primary use for .htaccess files is to control access
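A minimal sketch of that kind of rule, assuming a stock WordPress layout (Apache 2.4 syntax):

    # .htaccess - refuse direct requests for sensitive files
    <Files wp-config.php>
        Require all denied
    </Files>

    # Refuse direct web requests for PHP files under wp-includes
    RewriteEngine On
    RewriteRule ^wp-includes/.*\.php$ - [F,L]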
  • If an attacker knows your WordPress cryptographic salts, they can use fake cookies to trick WordPress into thinking they have logged on already.
  • If the hacker has physical access to the computer, they have many options at their disposal. They can type commands through the keyboard, or insert a disk or USB stick into the machine and launch an attack that way.
  • When it comes to network-based attacks, attackers have to reach through one of the machine’s network ports.
  • For an attacker to exploit a system, they have to communicate to a process that’s listening on a port. Otherwise, they’d simply be sending messages that are ignored. This is why you should only run processes that you need for your site to run. Anything else is a security risk.
  • Often, ports are occupied by processes that provide no real valuable service to the machine’s legitimate users. This tends to happen when you install a large distribution designed for multiple uses. Large distros include software that is useless to you in terms of running a website. So the best strategy is to start with a very lightweight distro and add the components you need.
  • If you see any unnecessary processes, you can shut them down manually. Better yet, if the process is completely unnecessary, you can remove it from your system.
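On a typical Ubuntu server, that clean-up looks roughly like this (vsftpd is just an example of a package you might not need):

    # List every process listening on a TCP port
    sudo ss -tlnp

    # Stop and disable a service you don't need
    sudo systemctl disable --now vsftpd

    # Remove the package entirely so it can't come back
    sudo apt purge vsftpd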
  • Firewalls are quite similar to access control within the computer. They operate on a network level, and you can use them to enforce security policies. A firewall can prevent processes from broadcasting information from a port. It can stop outside users from sending data to a port. And it can enforce more complex rules.
  • Simply installing and running a firewall does not make your host machine secure – it’s just one layer in the security cake. But it’s a vital and a powerful one.
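As a sketch, a default-deny policy with a handful of allowed services is straightforward with Ubuntu's ufw front end:

    sudo ufw default deny incoming    # drop everything unsolicited by default
    sudo ufw default allow outgoing
    sudo ufw allow 22/tcp             # SSH
    sudo ufw allow 80/tcp             # HTTP
    sudo ufw allow 443/tcp            # HTTPS
    sudo ufw enable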
  • First of all, we need to configure our software to resist common attacks. But that can only protect us from attacks we know about. Access control software, such as AppArmor, can drastically limit the damage caused by unauthorized access. But you still need to know an attack is in progress.
  • This is where Network Intrusion Detection Software (NIDS) is essential. It scans the incoming network traffic, looking for unusual patterns or signs of a known attack. If it sees anything suspicious, it logs an alert.
  • It’s up to you to review these logs and act on them.
  • If it’s a false alarm, you should tune your NIDS software to ignore it. If it’s an ineffective attack, you should review your security and block the attacker through the firewall.
  • That’s why it’s essential to have an automated backup system. Finally, you need to understand how the attack succeeded, so you can prevent it from recurring. You may have to change some settings on your Firewall, tighten your access rules, adjust your Apache configuration, and change settings in your wp-config file. None of this would be possible without detailed logs describing the attack.
  • Every web server has a breaking point and dedicated DOS attackers are willing to increase the load until your server buckles. Good firewalls offer some level of protection against naive DOS attacks
  • a tiny number of sites (less than 1%) are hacked through the WordPress core files
  • Major DNS attacks have taken down some of the biggest sites in the world – including eBay and PayPal. Large hosting companies like HostGator and Bluehost have been attacked. It’s a serious risk!
  • Right now, due to the way the web currently works, it’s impossible to download a web page without the IP address of a server. In the future, technologies like IPFS and MaidSafe could change that.
  • So there are 2 benefits to using a CDN. The first is that your content gets to your readers fast. The second benefit is server anonymity – nobody knows your real IP address – including the psychos. This makes it practically impossible to attack your server – nobody can attack a server without an IP address.
  • When CDNs discover a DDOS attack, they have their own ways to deal with it. They often display a very lightweight “are you human?” message with a captcha. This tactic reduces the bandwidth costs and screens out the automated attacks.
  • If any of your DNS records point to your actual server, then it’s easy to find it and attack it. This includes A records (addresses) and MX records (mail exchange). You should also use a separate mail server machine to send your emails. Otherwise, your email headers will expose your real IP address.
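You can audit what your DNS currently reveals with dig; if any of these answers contain your origin server's address, it is exposed (example.com is a placeholder):

    dig +short A example.com      # should point at the CDN, not your origin
    dig +short AAAA example.com   # same check for IPv6
    dig +short MX example.com     # mail should resolve to a separate machine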
  • If your hosting company refuses to give you a new IP address, it may be time to find a new service provider.
  • WordPress uses hashing to store passwords in the database. It doesn’t store the actual password – instead, it stores a hashed version. If someone steals your database tables, they won’t have the actual passwords.
  • If you used a simple hash function, a hacker could gain privileged access to your app in a short period of time.
  • The salt strings are stored in your site’s wp-config.php file.
  • Salts dramatically increase the time it would take to get a password out of a hash code – instead of taking a few weeks, it would take millions of years
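The salts live in wp-config.php as eight constants; the values below are placeholders (WordPress provides a generator at https://api.wordpress.org/secret-key/1.1/salt/ for real ones):

    // wp-config.php - unique keys and salts (placeholders, generate your own)
    define('AUTH_KEY',         'put a unique random phrase here');
    define('SECURE_AUTH_KEY',  'put a unique random phrase here');
    define('LOGGED_IN_KEY',    'put a unique random phrase here');
    define('NONCE_KEY',        'put a unique random phrase here');
    define('AUTH_SALT',        'put a unique random phrase here');
    define('SECURE_AUTH_SALT', 'put a unique random phrase here');
    define('LOGGED_IN_SALT',   'put a unique random phrase here');
    define('NONCE_SALT',       'put a unique random phrase here');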
  • You keep the other key (the decryption key) to yourself. If anyone stole it, they could decode your private messages! These 2-key cryptographic functions do exist. They are the basis of TLS (https) and SSH.
  • the most secure systems tend to be the simplest. The most secure machine of all would be one that was switched off.
  • For WordPress sites, you also need PHP and a database.
  • A VM is an emulated computer system running inside a real computer (the host). It contains its own operating system and resources, such as storage, and memory. The VM could run a completely different operating system from the host system – you could run OSX in a VM hosted on your Windows machine
  • This isolation offers a degree of protection. Let’s imagine your VM gets infected with a particularly nasty virus – the VM’s file system could be completely destroyed, or the data could be hopelessly corrupted. But the damage is limited to the VM itself. The host environment would remain safe.
  • This is how shared hosting and virtual private servers (VPSes) work today. Each customer has access to their own self-contained environment, within a virtual machine.
  • VMs are not just for hosting companies. If you’re hosting multiple sites on a dedicated server or a VPS, VMs can help to make your server more secure. Each site can live inside its own VM. That way, if one server is hacked, the rest of your sites are safe.
  • Even with all these considerations, the benefits of VMs outweigh their drawbacks. But performance is vital on the web.
  • Containers (like Docker) are very similar to VMs.
  • Because we’ve cut the hypervisor out of the loop, applications run much faster – almost as fast as processes in the host environment. Keeping each container separate does involve some computation by the container software. But it’s much lighter than the work required by a hypervisor!
  • Docker Cloud is a web-based service that automates the task for you. It integrates smoothly with the most popular cloud hosting platforms (such as Amazon Web Services, or Digital Ocean).
  • With containers, you can guarantee that the developer’s environment is exactly the same as the live server. Before the developer writes a single line of code, they can download the container to their computer. If the code works on their PC, it will work on the live server. This is a huge benefit of using containers, and it’s a major reason for their popularity.
  • A complete stack of these layers is called an “image”
  • The core of Docker is the Docker Engine – which lives inside a daemon – or long-running process
  • another great resource – the Docker Hub. The hub is an online directory of community-made images you can download and use in your own projects. These include Linux distributions, utilities, and complete applications.
  • Docker has established a relationship with the teams behind popular open source projects (including WordPress) – these partners have built official images that you can download and use as-is.
  • when you finish developing your code, you should wrap it up inside a complete container image. The goal is to put all the code that runs your site inside a container and store the volatile data in a volume.
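A minimal sketch of that idea: build on the official WordPress image and bake your own code in (the theme and plugin directory names are hypothetical):

    # Dockerfile - your site's code baked into an image, volatile data in volumes
    FROM wordpress:php8.2-apache

    # Copy your custom theme and plugins into the image
    COPY ./my-theme   /var/www/html/wp-content/themes/my-theme
    COPY ./my-plugins /var/www/html/wp-content/plugins/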
  • Although Docker can help to make your site more secure, there are a few major issues you need to understand: the Docker daemon runs as a superuser; it’s possible to load the entire filesystem into a container; and it’s possible to pass a reference to the Docker daemon into a container.
  • The solution to this issue is to use a MAC solution like SELinux, GRSecurity or AppArmor.
  • Never let anyone trick you into running a strange docker command.
  • only download and use Docker images from a trustworthy source. Official images for popular projects are security-audited by the Docker team. Community images are not.
  • there are the core WordPress files. These interact with the web server through the PHP runtime. WordPress also relies on the file system and a database server.
  • A service is some software component that listens for requests (over a protocol) and does something when it receives those requests.
  • Using Docker, you could install WordPress, Apache, and PHP in one container, and run MySQL from another. These containers could run on the same physical machine, or on different ones
  • The database service container can be configured to only accept connections that originate from the web container. This immediately removes the threat of external attacks against your database server
  • This gives you the perfect opportunity to remove high-risk software from your host machine, including: language runtimes and interpreters (such as PHP, Ruby, Python, etc.), web servers, databases, and mail servers.
  • If a new version of MySQL is released, you can update the database container without touching the web container. Likewise, if PHP or Apache are updated, you can update the web container and leave the database container alone.
  • Because Docker makes it easy to connect these containers together, there’s no reason to lump all your software inside a single container. In fact, it’s a bad practice – it increases the security risk for any single container, and it makes it harder to manage them.
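A sketch of that separation with Docker Compose, using the official wordpress and mysql images (service names and credentials are placeholders). Only the web container publishes a port; the db container is reachable solely over the compose network:

    services:
      web:
        image: wordpress:php8.2-apache
        ports:
          - "80:80"                   # only the web container is exposed
        environment:
          WORDPRESS_DB_HOST: db
          WORDPRESS_DB_NAME: wordpress
          WORDPRESS_DB_USER: wpuser
          WORDPRESS_DB_PASSWORD: change-me
        depends_on:
          - db
      db:
        image: mysql:8.0              # no ports published: invisible from outside
        environment:
          MYSQL_DATABASE: wordpress
          MYSQL_USER: wpuser
          MYSQL_PASSWORD: change-me
          MYSQL_RANDOM_ROOT_PASSWORD: "1"
        volumes:
          - db_data:/var/lib/mysql

    volumes:
      db_data: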
  • If your site is already live on an existing server, the best approach is to set up a new host machine and then migrate over to it. Here are the steps you need to take:
  • With a minimal Ubuntu installation, you have a fairly bare-bones server. You also have the benefit of a huge repository of software you can install if you want.
  • If access control is like a lock protecting a building, intrusion detection is the security alarm that rings after someone breaks in.
  • Logging on to your host with a superuser account is a bad practice. It’s easy to accidentally break something.
  • Fail2ban blocks SSH users who fail the login process multiple times. You can also set it up to detect and block hack attempts over HTTP – this will catch hackers who attempt to probe your site for weaknesses.
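A minimal fail2ban sketch for SSH, in /etc/fail2ban/jail.local (the thresholds are illustrative):

    [sshd]
    enabled  = true
    port     = ssh
    maxretry = 5      # ban after 5 failed logins...
    findtime = 10m    # ...within a 10-minute window
    bantime  = 1h     # block the offending IP for an hour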
  • With multiple WordPress sites on your machine, you have 2 choices. You could create a new database container for each, or you could reuse the same container between them. Sharing the DB container is a little riskier, as a hacker could, theoretically, ruin all your sites with one attack. You can minimize that risk by: using a custom root user and password for your database (don’t use the default username of ‘root’); ensuring the DB container is not accessible over the internet (hide it away inside a Docker network); and creating new databases and users for each WordPress site, ensuring each user only has permissions for their specific database.
  • What are the benefits of using a single database container? It’s easier to configure and scale. It’s easier to backup and recover your data. It’s a little lighter on resources.
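Creating an isolated database and user per site, as recommended above, is a few statements in MySQL (names and password are placeholders):

    -- One database and one narrowly scoped user per WordPress site
    CREATE DATABASE site_one;
    CREATE USER 'site_one_user'@'%' IDENTIFIED BY 'change-me';
    GRANT ALL PRIVILEGES ON site_one.* TO 'site_one_user'@'%';
    FLUSH PRIVILEGES;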
  • you could also add a caching container, like Varnish. Varnish caches your content so it can serve pages quickly – much faster than WordPress can
  • Docker has the ability to limit how much processor time and memory each container gets. This protects you against exhaustion DOS attacks
  • A containerized process still has some of the abilities of root, making it more powerful than a regular user. But it’s not as bad as full-on root privileges. With AppArmor, you can tighten the security further, preventing the process from accessing any parts of the system that do not relate to serving your website.
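Both ideas - resource caps and a tighter AppArmor profile - are flags on docker run; a sketch (wp-custom-profile is a hypothetical profile you would have loaded beforehand):

    # Cap memory and CPU (blunts resource-exhaustion DOS attacks) and
    # confine the container with a custom AppArmor profile
    docker run -d \
      --memory 512m \
      --cpus 1.0 \
      --security-opt apparmor=wp-custom-profile \
      wordpress:php8.2-apache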
  • Docker Hub works like GitHub – you can upload and download images for free. The downside is that there’s no security auditing. So it’s easy to download a trojan horse inside a container.
  • Official images (such as WordPress and Apache) are audited by the Docker team. These are safe. Community images (which have names like user/myapp) are not audited.
  • a kernel exploit executed inside a container will affect the entire system. The only way to protect against kernel exploits is to regularly update the host system
  • Containers run in isolation from the rest of the system. That does not mean you can neglect security – your website lives inside these containers! Even if a hacker cannot access the full system from a container, they can still damage the container’s contents.
  • Under Ubuntu, AppArmor already protects you – to a degree. The Docker daemon has an AppArmor profile, and each container runs under a default AppArmor profile. The default profile prevents an app from breaking out of the container, and restricts it from doing things that would harm the system as a whole. However, the default profile offers no specific protection against WordPress specific attacks. We can fix this by creating a custom profile for your WordPress container.
  • The net effect is that it’s impossible to install malware, themes or plugins through the web interface. We’ve already covered this to some degree with the .htaccess rules and directory permissions. Now we’re enforcing it through the Linux kernel.
  • There are versions of Docker for Mac and PC, so you’ll be able to run your site from your home machine. If the code works on your PC, it will also work on the server.
  • Tripwire tends to complain about the entries in the /proc filespace, which are auto-generated by the Linux kernel. These files contain information about running processes, and they tend to change rapidly while Linux runs your system. We don’t want to ignore the directory entirely, as it provides useful signs that an attack is in progress. So we’re going to have to update the policy to focus on the files we are interested in.
  • Now we should install an e-mail notification utility – to warn us if anything changes on the system. This will enable us to respond quickly if our system is compromised (depending on how often you check your emails).
  • Rootkits are malicious code that hackers install onto your machine. When they manage to get one on your server, it gives them elevated access to your system
  • Tripwire is configured to search in key areas. It’s good at detecting newly installed software, malicious sockets, and other signs of a compromised system. RKHunter looks in less obvious places, and it checks the contents of files to see if they contain known malicious code. RKHunter is supported by a community of security experts who keep it updated with known malware signatures – just like antivirus software for PCs.
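Day-to-day RKHunter use comes down to a few commands; a sketch for the Ubuntu package:

    sudo rkhunter --update      # refresh the known-malware signature data
    sudo rkhunter --propupd     # re-baseline file properties after legitimate changes
    sudo rkhunter --check --sk  # full scan; --sk skips the keypress prompts

    # Review the findings
    sudo less /var/log/rkhunter.log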
  • If your hosting company offers the option, this would be a good point to make an image of your server. Most cloud hosting companies offer tools to do this.
  • With an image, it’s easy to launch new servers or recover the old one if things go horribly wrong.
  • We’ve hidden our server from the world while making it easy to read our content We’ve built a firewall to block malicious traffic We’ve trapped our web server inside a container where it can’t do any harm We’ve strengthened Linux’s access control model to prevent processes from going rogue We’ve added an intrusion detection system to identify corrupted files and processes We’ve added a rootkit scanner We’ve strengthened our WordPress installation with 2-factor authentication We’ve disabled the ability for any malicious user to install poisoned themes or plugins
  • Make a routine of checking the logs (or emails if you configured email reporting). It’s vital to act quickly if you see any warnings. If they’re false warnings, edit the configuration. Don’t get into a habit of ignoring the reports.
  • Virtually everything that happens on a Linux machine is logged.
  • You have to make a habit of checking for new exploits and learn how to protect yourself against them. Regularly check for security patches and issues in the core WordPress app: WordPress Security Notices Also, check regularly on the forums or mailing lists for the plugins and themes you use on your site.
  • network-level intrusion detection service – you can fix that by installing Snort or PSAD.
  • The only way to guarantee your safety is to constantly update your security tactics and never get complacent.
jack_fox

Robots Meta Tag & X-Robots-Tag: Everything You Need to Know - 0 views

shared by jack_fox on 12 Oct 20
  •  
    "add the following HTML snippet to every page on your site to tell Google that you want no restrictions on your snippets: Note that if you use Yoast SEO, this piece of code is added automatically on every page unless you added noindex or nosnippet directives."
Rob Laporte

An SEO guide to understanding E-E-A-T - 0 views

  • Google recently added an extra “E” to the search quality standards of E-A-T to ensure content is helpful and relevant. The extra “E” stands for “experience” and precedes the original E-A-T concept – expertise, authoritativeness and trustworthiness. 
  • The Stanford Persuasive Technology Lab compiled 10 guidelines for building web credibility, based on three years of research with over 4,500 participants: Make it easy to verify the accuracy of the information on your site. Show that there’s a real organization behind your site. Highlight the expertise in your organization and in the content and services you provide. Show that honest and trustworthy people stand behind your site. Make it easy to contact you. Design your site so it looks professional (or is appropriate for your purpose). Make your site easy to use – and useful. Update your site’s content often (at least show it’s been reviewed recently). Use restraint with any promotional content (e.g., ads, offers). Avoid errors of all types, no matter how small they seem. – Stanford Web Credibility Research. If the above doesn’t scream, “Be a human, care about your users and your website experience,” I don’t know what does.
  • Experience is especially important in a digital world moving toward generative AI content
  • It’s probably no coincidence that Google announced the addition of “experience” in its search quality raters guidelines shortly after ChatGPT’s launch. 
  • Besides, expertise will build confidence with the human reading your content, so I would still consider adding: The author’s name. A descriptive bio containing: Their relevant qualifications. Links to their social media profiles. A Person schema with relevant properties for certifications or professions.
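A sketch of such a Person schema in JSON-LD (all values are placeholders):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Person",
      "name": "Jane Author",
      "jobTitle": "Senior SEO Consultant",
      "sameAs": [
        "https://www.linkedin.com/in/jane-author",
        "https://twitter.com/janeauthor"
      ],
      "hasCredential": {
        "@type": "EducationalOccupationalCredential",
        "name": "Google Analytics Certification"
      }
    }
    </script>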
  • Authority can be demonstrated in three core ways:  Establishing a strong content architecture covering all aspects of a particular topic. Earning backlinks from other authoritative sites. Building a digital profile or personal brand as an expert in a particular topic.
  • Once again, the idea of publishing content that is truly helpful supports Stanford’s web credibility guidelines: Make it easy to contact you. Make it easy to verify the accuracy of the information on your site. Design your site so it looks professional (or is appropriate for your purpose). Make your site easy to use – and useful. Update your site’s content often (at least show it’s been reviewed recently). Use restraint with any promotional content (e.g., ads, offers). Avoid errors of all types, no matter how small they seem.
  • Although they carry less weight than they used to, backlinks are still an indicator of an authoritative site.
  • Consider page experience
  • Show your humans with an About us or Team page
  • Link to authoritative sources
  • Build topical clusters
  • Use internal links
  • Include different content types
  • Engage experts
  • Encourage reviews
Rob Laporte

AI Overview Study for 8,000 Keywords in Google Search - 0 views

  • The average AI Overview is 169 words and 912 pixels long. Only 12.4% of the analyzed keywords display an AI Overview. A Featured Snippet is showing on 17.6% of the analyzed keywords. On average, AI Overviews appear alongside Featured Snippets in 7.4% of cases; for the Health niche, they show up together the most often (34.9% of queries). AI Overviews contain 7.2 links on average when expanded. 33.4% of AI Overview links rank in that query's top 10 organic results. 46.5% of the URLs included in AI Overviews rank outside the top 50 organic results. Five-word queries trigger an AI Overview most frequently. Keywords from the Health and Safety niches are more likely to trigger AI Overviews. No AI Overviews show up for brand-related queries. Navigational intent keywords are less likely to display AI Overviews. Google Ads are displayed for 28.3% of keywords that trigger AI Overviews. Of all the keywords that trigger AI Overviews, ads at the top of the SERP appear for 8.7% of keywords; ads at the bottom are displayed for 19.5%. Shopping Ads are almost never seen together with AI Overviews, and when they are, they always appear below the AI Overview.
  • AI Overview Visibility by Industry: Which industries show AI Overviews more frequently? [A bar chart in the original shows the frequency of AI Overviews across different industries.]
Rob Laporte

Yahoo Improves Content Match Targeting - 0 views

  • Oct 13, 2008 at 9:42am Eastern, by Barry Schwartz. The Yahoo Search Marketing Blog announced they have improved the targeting and relevancy of their Content Match product. The improvements will lead to a higher click-through rate on ads and higher satisfaction. The specific improvement is that they now not only target the ads based on the content of the page, but also based on the user viewing the page. Yahoo will tailor the ad based on the “users’ geographic and behavioral profiles.”
Rob Laporte

Google Now Working With Click Forensics - 0 views

  • Oct 13, 2008 at 9:53am Eastern, by Barry Schwartz. ComputerWorld reports (“Google allies with click-fraud-detection firm Click Forensics”) that Google has now agreed to work with Click Forensics to aid in the detection and reporting of search ad click fraud. Specifically, Google said they would now accept click fraud submissions through the product, FACTr. FACTr is a product Click Forensics created with the help of Yahoo to gather and submit click-quality reports. Google will now be accepting these reports electronically, hopefully streamlining the process for advertisers to get refunds for approved click fraud. LookSmart and Miva are also now accepting them, along with Google, in news that Click Forensics announced last week. As you may remember, Google and Click Forensics have not always seen eye-to-eye, but this is a nice step to see from both parties involved. As a matter of history, Yahoo partnered with Click Forensics back in March of this year.
Rob Laporte

Tips On Getting a Perfect 10 on Google Quality Score - 0 views

  • October 20, 2008. Ever since Google launched the real-time quality score metric, where Google rated keywords between 0 and 10 (10 being the highest), I have rarely seen threads documenting how to receive a 10 out of 10. Tamar blogged about How To Ensure That Your Google Quality Score is 10/10, based on an experiment by abbotsys. Back then, it was simply about matching the domain name to the keyword phrase, but can it be achieved without that? A DigitalPoint Forums thread reports another advertiser receiving the 10/10 score. He documented what he did to obtain it: eliminated all the keywords that Google had suggested and only used a maximum of three keywords per ad campaign; used only 1 ad campaign per landing page and made each landing page specific to that keyword; put the cost per click up high enough to give him around third spot; geo-targeted the campaigns only to the areas he can sell to; limited the times his ads ran to only those where there is real interest; and used three versions of each keyword – "keyword", [keyword], and keyword – then eliminated whichever wasn't working well. If you want to reach that perfect 10, maybe try these tips and see what works for you. There is no guaranteed checklist of items, so keep experimenting. And when you get your perfect 10, do share!
Rob Laporte

Bing - How Microsoft handles bots clicking on ads - Webmaster Blog - Bing Community - 0 views

  • adCenter uses a variety of techniques to remove bots, including the Interactive Advertising Bureau’s (IAB) Spiders and Robots protocol.  The IAB provides a list of known bots, and Microsoft bots are a part of that list. As a result, any activity generated by bots will not skew adCenter data, because it will be categorized as low quality in adCenter Reports. You can view the Standard Quality and Low Quality data by accessing the adCenter Reports tab. In June 2009, Microsoft received Click Quality Accreditation from the IAB, which holds the industry’s highest standards in click measurement. The IAB and independent third-party auditors verified that adCenter meets their requirements for Click Quality Accreditation, which includes not billing for our search bot’s ad clicks. For more information, visit the adCenter Blog or the IAB site.
Rob Laporte

Advertisers Lag Consumers in Mobile Adoption, For Now - ClickZ - 0 views

  • Only 11 percent of both brands and agencies responding to eMarketer said mobile represented a line item in their 2010 budgets; 19 percent said they were "experimenting but have no future plans at all;" and 36 percent of brands said it was simply not part of their plans. But with the spread of smartphones and devices that facilitate easier web searching, advertisers will find themselves faced with more options for reaching consumers on their phones, and are already preparing to take advantage of them. eMarketer projects spending on mobile ads to reach $593 million next year, and $830 million in 2011. By 2013, the report says that number will reach $1.56 billion, or 9.9 percent of total spending on display advertising. "Mobile will grow considerably more quickly than online ad spending as a whole, more in line with emerging online formats such as digital video," Elkin said. The report also noted that widespread experimentation today is making marketers -- and consumers -- more comfortable with ads on mobile devices, and will pay off in the coming years. Of course, talking about mobile is talking about many different things: search, display and SMS texting, to name a few. As for where marketers will put this money, eMarketer predicts the steepest rise to come in money spent on search, from 18 percent of the total in 2008 to 37 percent in 2013. Meanwhile, SMS will see a decline in share as messaging options become more sophisticated, from 60 percent in 2008 to 28 percent in 2013. Display is expected to grow its share, from 22 percent last year to 35 percent in 2013.
Rob Laporte

Page 3 - Textlinkbrokers.com & text-link-ads.com - SEO Chat - 0 views

  • Jarrod, you seem pretty convincing here. I sent a mail to Brigette (your account manager) last month and asked a few simple questions regarding the services. Not a single answer was convincing enough to buy your services, and that's when I decided not to purchase links through you. Here are the excerpts:
    Quote: 1. What if we decide to discontinue your service in the future? Do we lose all the purchased backlinks in that case?
    TLB: If you rent links, they would come down. However, if you purchase products that are permanently placed, we do not take them down.
    But you don't place text links permanently. Even your permanent package gives only a 6-month guarantee.
    Quote: 2. How can we secure the ownership of our purchased links? What if the webmaster removed the link we have purchased after some time, or moved the link to some other location or web page, or changed the anchor text of the link, or added a large number of other external links (maybe from our competitors), thus reducing our link weight? What if he made our link nofollow, or deleted the web page, or shut down the website? Can we claim any compensation or refund in that case?
    TLB: Each of our products has different minimums and guarantees. Our permanent links that are included in the “Booster Package” have a 3-month guarantee. During this time we have a script that ensures your link stays live. If, for some reason, it were to come down, we would replace it free of charge. Beyond that, you would have no recourse. However, if you purchase a permanent link package, they have a 6-month guarantee that works the same way.
    Do you call this a convincing reply?
    Quote: 3. How can you assure us that you will not get our website penalized or banned by Google through your backlinks? What if our website gets penalized or banned by Google because of the links you have purchased for us? What is your policy in that case?
    TLB: We take every step possible to ensure that does not happen. We do things very differently than most link building companies. We do not use software, feeds or auto-generated code of any kind. Each of our links is manually placed on 100% SEO-friendly sites. Everyone who is accepted into our inventory goes through an extensive approval process. We deny applications daily for not meeting the large number of criteria our Quality Assurance team looks at. Once they are accepted into inventory, their information is not posted on the website. They are not allowed to post anything on their site that says they are affiliated with us in any way. They are not asked to and not allowed to backlink to us under any circumstances. We take the protection of our Inventory Partners and our clients very seriously. If a potential client goes to our website to view inventory, they will only see general information such as a description, page rank, site age, number of outbound links, etc. The only way to view the actual URL is to sign a non-disclosure agreement. That is only done after speaking with a Customer Service Representative or Account Manager who would create the list for you. So, as you can see, for years we have done everything we can do to protect our inventory partners as well as our clients. Our goal is to make you successful so that we can continue a long-term business relationship. If we do not protect our partners and they get penalized, your links will not pass SEO value. Therefore, we take that very seriously.
    Your so-called forbidden inventory is just one report away from Google's web spam team. Once identified, everyone associated with it will burst like a bubble. IMO, that's the risk Rand was talking about.
  • Himanshu160, I only wish that I could replicate myself - wouldn't that be great? I would be happy to discuss other options with you outside of the forums, or get you to one of our senior account reps. I do not handle very many sales, and this isn't the place for it. As for our perm links, most of those are placed on sites that we do not control, thus it becomes too costly to guarantee them forever. We have found that if they have stayed up for 6 months, the churn rate is fairly low after that. The 3-month guarantee is being offered at a cheaper rate and usually only used in our bundles. Again, if it has stayed live for 3 months, the churn rate isn't going to be very high after that. There are advantages to being on our controlled inventory, but also some disadvantages. With our controlled inventory we can make sure every link we place stays up; those tend to be the links we charge monthly, although we have done some custom perm links on controlled inventory. The disadvantage is that if someone reports one of our controlled sites to Google, it can lose value; of course, some sites are at more risk than others because they sell a lot of links, or they sell homepage links in the sidebar, etc. We do have inventory that is cleaner than others, and we can even do exclusive deals so that you are the only one on the site. It all depends on your budget. For most low-competition keywords, one of our cheap link bundles is all that is needed. Sure, some of the links will go down over time, and yes, Google may devalue some. However, there are always new links being built to replace the few that go down, so the result is a nice increase in rankings over time.
Rob Laporte

Can selling text link ads through TLA or DLA result in a Google penalty? - 0 views

  • Can selling text link ads in the sidebar using TLA or Direct-Link-Ads result in a Google penalty? I used to use TLA for one of my sites but stopped using them for fear of Google dropping the site, because I heard a few rumors on webmaster forums of this happening. Is this concrete or not? Are people still using TLA or DLA, or something similar?
    C7Mike, 4:52 am on June 11, 2009 (utc 0): Yes, you may receive a penalty for purchasing links that pass PageRank. See Google's Webmasters/Site Owner Help topic for more information: [google.com...]
    Automotive site, 6:42 am on June 11, 2009 (utc 0): Well, I was actually going to use one of those to sell, not purchase. Anyway, I am going to apply to BuySellAds and see if I get accepted there, but I heard they mostly accept tech-related sites.
    C7Mike, 2:25 pm on June 11, 2009 (utc 0): You may receive a penalty for both buying and selling paid links that pass PageRank (see [google.com...]). I have had a few sites lose their PR because they published links through TLA. However, the content was still good enough that advertisers have continued to purchase links on those pages through TLA in spite of the lack of PR, and at a substantially lower rate.