
DISC Inc: Group items tagged "web programming"


jack_fox

Web Hosting 101 - The Basics - 0 views

  • Linux servers are open source and can be based on a number of different distributions, such as Ubuntu, Debian, Red Hat, or CentOS (FreeBSD is a similar open-source option, though it is a BSD operating system rather than a Linux distribution).
  • the most common forms of web hosting available are: Free Web Hosting, Shared Web Hosting, Managed Web Hosting, VPS Web Hosting, Dedicated Web Hosting, and Cloud Web Hosting
  • Free web hosting is offered by various companies primarily in order to up-sell other domain services or to publish advertising on pages that are hosted under the account.
  • ...33 more annotations...
  • With shared web hosting, there may be thousands of different businesses, individuals, and organizations all serving their website files to the public from the same computer.
  • The web hosting company employs systems administrators to manage the server software installation and security updates. The hosting clients use file transfer management tools to host web pages in HTML or other programming languages which serve the files to the public through the browser. The hard disk space on the remote server can be used for other purposes than web hosting, for example remote file storage, email accounts, sandbox web development, mobile app support, or running software scripts online.
  • Shared web hosting accounts can cost as little as $1–$3 per month and rarely cost more than $20. It is estimated that over 90% of the websites on the internet use shared web hosting to keep their information online 24 hours a day. Shared web hosts never turn off their services and offer seamless hardware upgrades in the data center that can keep a website online for years. Most of the available web development tools will integrate easily with a shared hosting account.
  • The main disadvantage of shared web hosting is that it is not able to scale effectively to support the traffic of large websites and usually includes strict limitations on the use of CPU processing power because of the pooled resources. Shared web hosting does not typically support the user installation of server extensions through the command line that are important for custom web development and mobile app support.
  • There is still no opportunity for advanced systems administration and custom server configurations on most shared hosting plans. Security on shared web hosting frameworks is not considered robust enough for sensitive corporate information and government accounts. There can also be performance issues that develop on a server if one domain is consistently consuming shared resources or hit with a DDoS attack. Because systems administration and root server configuration control is taken out of the hands of shared web hosting users, they are often overly reliant on the service company for tech support.
  • Shared web hosting is recommended for self-published websites and small business networks.
  • Managed web hosting is a version of shared hosting where the service company specializes in platform-specific products that support custom development frameworks. Examples of this can be seen in Pantheon and Acquia Cloud for Drupal, Nexcess for Magento, or WP Engine for WordPress. Managed host companies provide optimized server environments that can speed up website performance and page load times for high-traffic, CMS-driven websites.
  • Virtual Private Servers (VPS) are a web hosting solution designed to give more power and flexibility to website owners for custom developed software requirements and complex applications. Technically, a VPS will operate in the same manner as a dedicated server while operating on a partitioned hardware framework that allows for the use of only a fraction of the resources of the host machine.
  • Understanding which virtualization platform the VPS web hosting company is using to manage data center resources and client configurations is important.
  • Developers often prefer VPS accounts because they can custom configure the server with the choice of operating system and install whatever additional server extensions are required for programming web applications
  • The main benefit of VPS hosting is that website owners can “dial in” the exact amount of server resources that are required to optimize the performance of a complex website.
  • The main disadvantage of VPS web hosting is the complexity of systems administration required to install and manage the server software, which requires a lot of command line knowledge and background in web server configuration.
  • Inexperienced users can leave security holes in the environment that hackers using automated script bots and known server or database exploits can easily detect and target. Using a standardized cPanel, CentOS, & WHM environment or administration panels like Webmin and Virtualmin can help simplify the server administration process considerably by adding a GUI layer to access common tasks
  • VPS web hosting accounts are best suited for developers who need to custom configure the environment with server extensions that shared web hosts will not support. Typically these are related to the use of database frameworks other than MySQL, programming languages other than PHP, and server frameworks other than Apache.
  • Dedicated web hosting is the most expensive and flexible of all of the service plans offered by companies in the industry, as site owners are able to directly rent or lease a complete rack-mount server in a data center.
  • Dedicated servers are required to host the largest sites by traffic on the web, as well as by mobile apps which require elite performance
  • The main disadvantage of a dedicated server is that it is costly compared to shared hosting or VPS plans, and expensive even when compared to the price of the underlying hardware itself. With dedicated servers, the client is paying not only for the use of the server, but also for the trained technicians who manage it, the overhead costs of the data center, and access to the internet backbone. Data center costs include not only rental of office and warehouse space, but also the electricity required to run all of the servers and keep them cool. Data centers must also have back-up power generation facilities in case the local electricity supply is cut. All of the residual costs are included in the annual price of a dedicated server plan. Nevertheless, it is still often much cheaper than what would be required to manage a data center for a single business independently.
  • Cloud web hosting provides solutions for websites that need more processing power and require more than a single server instance because the amount of online traffic, including the number of queries to the database and resource files, is too high in volume for a single machine
  • Cloud web hosting is defined by the deployment of server clusters that scale automatically with the user traffic and processing power needs of a website, including advanced software applications for elastic load balancing, file storage, and database optimization
  • Cloud web hosting is similar to content delivery networks (CDNs) which use distributed global servers, advanced page caching, and file management software to optimize website performance for large websites. Many cloud hosting companies will offer the ability to choose the operating system, database framework, and geographic location of the server itself as part of the configuration options.
  • Cloud web hosting is designed for remote computing applications and required by large web sites whose user traffic exceeds the limits of what a single server instance will provide. Cloud web hosting is particularly designed to meet the needs of websites with large database requirements.
  • Not every website will require cloud hosting, but small businesses and start-ups that scale their traffic and user communities often find managed cloud services a reasonable option over dedicated servers because of the ability to “pay-as-you-go” for only the amount of server resources used and the ability to keep sites online through cluster scaling at times of peak demand.
  • The major downside to cloud hosting is the uncertainty involved with the variability of costs with sites on the “pay-as-you-go” model. Another problem can be associated with “hype” in the industry, which can lead to over-pricing and over-billing for unnecessary services or introductory plans.
  • Cloud web hosting can be similar to VPS or dedicated server frameworks where the systems administrator has the ability to custom configure the installation of the operating system with software extensions that are not available in shared hosting environments. Some managed cloud hosts simplify this process by offering optimally configured solutions with a proprietary base software package.
  • Some of the main features to look for in any web hosting account are: Server Architecture, Operating System Version, Domain Management Tools, Systems Administration Tools, Bandwidth & CPU Limitations, Free Offers & Promotions, Data Security, and Technical Support
  • Before purchasing any web hosting account, it is essential to verify the server hardware being used on the platform. Currently there is wide variation among the different generations of Intel Xeon, Atom, Itanium, and AMD Opteron servers deployed in data centers.
  • The version of Linux installed, for example CentOS or CloudLinux, can also come with licensing restrictions due to the use of WHM and cPanel. The use of other distributions like Ubuntu, Red Hat, and Debian (or BSD systems like FreeBSD) in web servers is mostly related to developer preference for systems administration
  • Many hosting companies claim to offer “unlimited” bandwidth and data transfer. However, if a website uses too many CPU resources, it may still be throttled or taken offline at times of peak user traffic.
  • A web hosting company should provide a guaranteed uptime of at least 99.9% as part of the service plan.
  • Website owners should make sure that any web hosting plan will be configured securely, including firewalls and monitoring software to prevent intrusions by automated script-bot attacks. Check whether the web hosting company offers DDoS attack protection and auto-alerts for unauthorized logins. While shared web hosting plans include the company services related to upgrading the installed operating system and server software with the latest security patches, owners of VPS and dedicated server accounts will need to handle this themselves or through a qualified systems administrator. Web hosting companies that provide automated site file and database back-up tools, such as RAID disk mirroring on advanced accounts, provide an extra layer of site security in case of a server crash or technical error that leads to data loss.
  • Managed hosts have the advantage of experienced technical support teams with platform-specific knowledge
  • Small business website owners and independent publishers should start with a shared web hosting account, then upgrade to a VPS or Cloud hosting plan if the traffic scales beyond what the server will support.
  • While shared hosting under cPanel and CentOS remains the industry standard, innovations in cloud computing are changing the web hosting landscape quickly. Many web hosting companies are now offering “hybrid” approaches that combine the best features of cloud and shared hosting into a high-performance, low-cost retail plan that offers integrated load balancing, page caching, and CDN services on elite server hardware configurations.
jack_fox

The Ultimate Web Server Security Guide @ MyThemeShop - 0 views

  • They could insert links into the site to boost their SEO rankings. Hackers can make a killing selling links from exploited sites. Alternatively, a hacker could deface the site and demand money to restore it (ransom). They could even place ads on the site and use the traffic to make money. In most cases, an attacker will also install backdoors into the server. These are deliberate security holes that allow them to come back and exploit the site in the future – even if the insecure plugin has been replaced.
  • Unfortunately, under WordPress, every plugin and theme has the ability to alter anything on the site. They can even be exploited to infect other apps and sites hosted on the same machine.
  • Theme developers are often relatively inexperienced coders. Usually, they’re professional graphic artists who have taught themselves a little PHP on the side. Plugins are another popular line of attack – they account for 22% of successful hacks. Put together, themes and plugins are a major source of security trouble.
  • ...102 more annotations...
  • Each person who uses your system should only have the privileges they need to perform their tasks.
  • Don’t depend on a single security measure to keep your server safe. You need multiple rings of defense.
  • Security exploits exist at all levels of the technology stack, from the hardware up. WP White Security revealed that 41% of WordPress sites are hacked through a weakness in the web host.
  • While it’s important to use a strong password, password cracking is not a primary focus for hackers.
  • the more software you have installed on your machine, the easier it is to hack – even if you aren’t using the programs! Clearly, programs that are designed to destroy your system are dangerous. But even innocent software can be used in an attack.
  • There are 3 ways to reduce the attack surface: 1. Run fewer processes 2. Uninstall programs you don’t need 3. Build a system from scratch that only has the processes you need
  • A really good authentication system uses multiple tests. Someone could steal or guess your password. They could grab your laptop with its cryptographic keys.
  • If you want to run multiple processes at the same time, you need some way of managing them. This is basically what a kernel is. It does more than that – it handles all of the complex details of the computer hardware, too. And it runs the computer’s networking capabilities
  • programs exist as files when they are not running in memory
  • SELinux’s default response is to deny any request.
  • SELinux is extremely comprehensive, but this power comes at a price. It’s difficult to learn, complex to set up, and time-consuming to maintain.
  • AppArmor is an example of a MAC tool, although it’s nowhere near as comprehensive as SELinux. It applies rules to programs to limit what they can do.
  • AppArmor is relatively easy to set up, but it does require you to configure each application and program one by one. This puts the onus for security in the hands of the user or sysadmin. Often, when new apps are added, users forget to configure AppArmor. Or they do a horrible job and lock themselves out, so their only option is to disable the profile. That said, several distributions have adopted AppArmor.
  • Generic profiles shipped by repo teams are designed to cover a wide range of different use cases, so they tend to be fairly loose. Your specific use cases are usually more specific. In this case, it pays to fine-tune the settings, making them more restrictive.
  • GRSecurity is a suite of security enhancements
  • In the future, this could become a viable option. For now, we’ll use Ubuntu and AppArmor.
  • Apache is a user-facing service – it’s how your users interact with your website. It’s important to control this interaction too.
  • If your Apache configuration is bad, these files can be viewed as plain text. All of your code will be visible for anyone to see – this potentially includes your database credentials, cryptographic keys, and salts.
  • You can configure Apache to refuse any requests for these essential directories using .htaccess files. These are folder-level configuration files that Apache reads before it replies to a request.
  • The primary use for .htaccess files is to control access
  • If an attacker knows your WordPress cryptographic salts, they can use fake cookies to trick WordPress into thinking they have logged on already.
  • If the hacker has physical access to the computer, they have many options at their disposal. They can type commands through the keyboard, or insert a disk or USB stick into the machine and launch an attack that way.
  • When it comes to network-based attacks, attackers have to reach through one of the machine’s network ports.
  • For an attacker to exploit a system, they have to communicate to a process that’s listening on a port. Otherwise, they’d simply be sending messages that are ignored. This is why you should only run processes that you need for your site to run. Anything else is a security risk.
  • Often, ports are occupied by processes that provide no real valuable service to the machine’s legitimate users. This tends to happen when you install a large distribution designed for multiple uses. Large distros include software that is useless to you in terms of running a website. So the best strategy is to start with a very lightweight distro and add the components you need.
  • If you see any unnecessary processes, you can shut them down manually. Better yet, if the process is completely unnecessary, you can remove it from your system.
  • Firewalls are quite similar to access control within the computer. They operate on a network level, and you can use them to enforce security policies. A firewall can prevent processes from broadcasting information from a port. It can stop outside users from sending data to a port. And it can enforce more complex rules.
  • Simply installing and running a firewall does not make your host machine secure – it’s just one layer in the security cake. But it’s a vital and a powerful one.
  • First of all, we need to configure our software to resist common attacks. But that can only protect us from attacks we know about. Access control software, such as AppArmor, can drastically limit the damage caused by unauthorized access. But you still need to know an attack is in progress.
  • This is where Network Intrusion Detection Software (NIDS) is essential. It scans the incoming network traffic, looking for unusual patterns or signs of a known attack. If it sees anything suspicious, it logs an alert.
  • It’s up to you to review these logs and act on them.
  • If it’s a false alarm, you should tune your NIDS software to ignore it. If it’s an ineffective attack, you should review your security and block the attacker through the firewall.
  • That’s why it’s essential to have an automated backup system. Finally, you need to understand how the attack succeeded, so you can prevent it from recurring. You may have to change some settings on your Firewall, tighten your access rules, adjust your Apache configuration, and change settings in your wp-config file. None of this would be possible without detailed logs describing the attack.
  • Every web server has a breaking point and dedicated DOS attackers are willing to increase the load until your server buckles. Good firewalls offer some level of protection against naive DOS attacks
  • a tiny number of sites (less than 1%) are hacked through the WordPress core files
  • Major DNS attacks have taken down some of the biggest sites in the world – including eBay and PayPal. Large hosting companies like HostGator and Bluehost have been attacked. It’s a serious risk!
  • Right now, due to the way the web currently works, it’s impossible to download a web page without the IP address of a server. In the future, technologies like IPFS and MaidSafe could change that.
  • So there are 2 benefits to using a CDN. The first is that your content gets to your readers fast. The second benefit is server anonymity – nobody knows your real IP address – including the psychos. This makes it pretty impossible to attack your server – nobody can attack a server without an IP address.
  • When CDNs discover a DDOS attack, they have their own ways to deal with it. They often display a very lightweight “are you human?” message with a captcha. This tactic reduces the bandwidth costs and screens out the automated attacks.
  • If any of your DNS records point to your actual server, then it’s easy to find it and attack it. This includes A records (address records) and MX records (mail exchange). You should also use a separate mail server machine to send your emails. Otherwise, your email headers will expose your real IP address.
  • If your hosting company refuses to give you a new IP address, it may be time to find a new service provider.
  • WordPress uses one-way hashing to store passwords in the database. It doesn’t store the actual password – instead, it stores a hashed version. If someone steals your database tables, they won’t have the actual passwords.
  • If you used a simple hash function, a hacker could gain privileged access to your app in a short period of time.
  • The salt strings are stored in your site’s wp-config.php file.
  • Salts dramatically increase the time it would take to get a password out of a hash code – instead of taking a few weeks, it would take millions of years
  • You keep the other key (the decryption key) to yourself. If anyone stole it, they could decode your private messages! These 2-key cryptographic functions do exist. They are the basis of TLS (https) and SSH.
  • the most secure systems tend to be the simplest. The absolute secure machine would be one that was switched off.
  • For WordPress sites, you also need PHP and a database.
  • A VM is an emulated computer system running inside a real computer (the host). It contains its own operating system and resources, such as storage, and memory. The VM could run a completely different operating system from the host system – you could run OSX in a VM hosted on your Windows machine
  • This isolation offers a degree of protection. Let’s imagine your VM gets infected with a particularly nasty virus – the VM’s file system could be completely destroyed, or the data could be hopelessly corrupted. But the damage is limited to the VM itself. The host environment would remain safe.
  • This is how shared hosting and virtual private servers (VPSes) work today. Each customer has access to their own self-contained environment, within a virtual machine.
  • VMs are not just for hosting companies. If you’re hosting multiple sites on a dedicated server or a VPS, VMs can help to make your server more secure. Each site can live inside its own VM. That way, if one server is hacked, the rest of your sites are safe.
  • Even with all these considerations, the benefits of VMs outweigh their drawbacks. But performance is vital on the web.
  • Containers (like Docker) are very similar to VMs.
  • Because we’ve cut the hypervisor out of the loop, applications run much faster – almost as fast as processes in the host environment. Keeping each container separate does involve some computation by the container software. But it’s much lighter than the work required by a hypervisor!
  • Docker Cloud is a web-based service that automates the task for you. It integrates smoothly with the most popular cloud hosting platforms (such as Amazon Web Services, or Digital Ocean).
  • With containers, you can guarantee that the developer’s environment is exactly the same as the live server. Before the developer writes a single line of code, they can download the container to their computer. If the code works on their PC, it will work on the live server. This is a huge benefit of using containers, and it’s a major reason for their popularity.
  • A complete stack of these layers is called an “image”
  • The core of Docker is the Docker Engine – which lives inside a daemon – or long-running process
  • another great resource – the Docker Hub. The hub is an online directory of community-made images you can download and use in your own projects. These include Linux distributions, utilities, and complete applications.
  • Docker has established a relationship with the teams behind popular open source projects (including WordPress) – these partners have built official images that you can download and use as-is.
  • when you finish developing your code, you should wrap it up inside a complete container image. The goal is to put all the code that runs your site inside a container and store the volatile data in a volume.
  • Although Docker can help to make your site more secure, there are a few major issues you need to understand: the Docker daemon runs as a superuser, it’s possible to load the entire filesystem into a container, and it’s possible to pass a reference to the Docker daemon into a container.
  • The solution to this issue is to use a MAC solution like SELinux, GRSecurity or AppArmor.
  • Never let anyone trick you into running a strange docker command.
  • only download and use Docker images from a trustworthy source. Official images for popular projects are security-audited by the Docker team. Community images are not
  • there are the core WordPress files. These interact with the web server through the PHP runtime. WordPress also relies on the file system and a database server.
  • A service is some software component that listens for requests (over a protocol) and does something when it receives those requests.
  • Using Docker, you could install WordPress, Apache, and PHP in one container, and run MySQL from another. These containers could run on the same physical machine, or on different ones
  • The database service container can be configured to only accept connections that originate from the web container. This immediately removes the threat of external attacks against your database server
  • This gives you the perfect opportunity to remove high-risk software from your host machine, including: language runtimes and interpreters (such as PHP, Ruby, and Python), web servers, databases, and mail servers.
  • If a new version of MySQL is released, you can update the database container without touching the web container. Likewise, if PHP or Apache are updated, you can update the web container and leave the database container alone.
  • Because Docker makes it easy to connect these containers together, there’s no reason to lump all your software inside a single container. In fact, it’s a bad practice – it increases the security risk for any single container, and it makes it harder to manage them.
  • If your site is already live on an existing server, the best approach is to set up a new host machine and then migrate over to it. Here are the steps you need to take:
  • With a minimal Ubuntu installation, you have a fairly bare-bones server. You also have the benefit of a huge repository of software you can install if you want.
  • If access control is like a lock protecting a building, intrusion detection is the security alarm that rings after someone breaks in.
  • Logging on to your host with a superuser account is a bad practice. It’s easy to accidentally break something.
  • Fail2ban blocks SSH users who fail the login process multiple times. You can also set it up to detect and block hack attempts over HTTP – this will catch hackers who attempt to probe your site for weaknesses.
  • With multiple WordPress sites on your machine, you have 2 choices. You could create a new database container for each, or you could reuse the same container between them. Sharing the DB container is a little riskier, as a hacker could, theoretically, ruin all your sites with one attack. You can minimize that risk by: using a custom root user and password for your database (don’t use the default username of ‘root’); ensuring the DB container is not accessible over the internet (hide it away inside a Docker network); and creating new databases and users for each WordPress site, ensuring each user only has permissions for their specific database.
  • What are the benefits of using a single database container? It’s easier to configure and scale. It’s easier to backup and recover your data. It’s a little lighter on resources.
  • you could also add a caching container, like Varnish. Varnish caches your content so it can serve pages quickly – much faster than WordPress can
  • Docker has the ability to limit how much processor time and memory each container gets. This protects you against exhaustion DOS attacks
  • A containerized process still has some of the abilities of root, making it more powerful than a regular user. But it’s not as bad as full-on root privileges. With AppArmor, you can tighten the security further, preventing the process from accessing any parts of the system that do not relate to serving your website.
  • Docker Hub works like GitHub – you can upload and download images for free. The downside is that there’s no security auditing. So it’s easy to download a trojan horse inside a container.
  • Official images (such as WordPress and Apache) are audited by the Docker team. These are safe. Community images (which have names like user/myapp) are not audited.
  • a kernel exploit executed inside a container will affect the entire system. The only way to protect against kernel exploits is to regularly update the host system
  • Containers run in isolation from the rest of the system. That does not mean you can neglect security – your website lives inside these containers! Even if a hacker cannot access the full system from a container, they can still damage the container’s contents.
  • Under Ubuntu, AppArmor already protects you – to a degree. The Docker daemon has an AppArmor profile, and each container runs under a default AppArmor profile. The default profile prevents an app from breaking out of the container, and restricts it from doing things that would harm the system as a whole. However, the default profile offers no specific protection against WordPress specific attacks. We can fix this by creating a custom profile for your WordPress container.
  • The net effect is that it’s impossible to install malware, themes or plugins through the web interface. We’ve already covered this to some degree with the .htaccess rules and directory permissions. Now we’re enforcing it through the Linux kernel.
  • There are versions of Docker for Mac and PC, so you’ll be able to run your site from your home machine. If the code works on your PC, it will also work on the server.
  • Tripwire tends to complain about the entries in the /proc filesystem, which are auto-generated by the Linux kernel. These files contain information about running processes, and they tend to change rapidly while Linux runs your system. We don’t want to ignore the directory entirely, as it provides useful signs that an attack is in progress. So we’re going to have to update the policy to focus on the files we are interested in.
  • Now we should install an e-mail notification utility – to warn us if anything changes on the system. This will enable us to respond quickly if our system is compromised (depending on how often you check your emails).
  • Rootkits are malicious code that hackers install onto your machine. When they manage to get one on your server, it gives them elevated access to your system
  • Tripwire is configured to search in key areas. It’s good at detecting newly installed software, malicious sockets, and other signs of a compromised system. RKHunter looks in less obvious places, and it checks the contents of files to see if they contain known malicious code. RKHunter is supported by a community of security experts who keep it updated with known malware signatures – just like antivirus software for PCs.
  • If your hosting company offers the option, this would be a good point to make an image of your server. Most cloud hosting companies offer tools to do this.
  • With an image, it’s easy to launch new servers or recover the old one if things go horribly wrong.
  • We’ve hidden our server from the world while making it easy to read our content. We’ve built a firewall to block malicious traffic. We’ve trapped our web server inside a container where it can’t do any harm. We’ve strengthened Linux’s access control model to prevent processes from going rogue. We’ve added an intrusion detection system to identify corrupted files and processes. We’ve added a rootkit scanner. We’ve strengthened our WordPress installation with 2-factor authentication. We’ve disabled the ability for any malicious user to install poisoned themes or plugins.
  • Make a routine of checking the logs (or emails if you configured email reporting). It’s vital to act quickly if you see any warnings. If they’re false warnings, edit the configuration. Don’t get into a habit of ignoring the reports.
  • Virtually everything that happens on a Linux machine is logged.
  • You have to make a habit of checking for new exploits and learn how to protect yourself against them. Regularly check for security patches and issues in the core WordPress app: WordPress Security Notices Also, check regularly on the forums or mailing lists for the plugins and themes you use on your site.
  • A network-level intrusion detection service is one thing still missing at this point – you can fix that by installing Snort or PSAD.
  • The only way to guarantee your safety is to constantly update your security tactics and never get complacent.
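Several of the annotations above explain why salted hashes make stolen password tables far less useful. The sketch below illustrates the general idea in Python using PBKDF2 from the standard library; it is a generic example, not the actual scheme WordPress uses (WordPress has its own PHP password-hashing functions and wp-config salts):

```python
# Generic illustration of salted password hashing (not WordPress's implementation).
import hashlib
import hmac
import os
from typing import Optional, Tuple

def hash_password(password: str, salt: Optional[bytes] = None) -> Tuple[bytes, bytes]:
    """Hash a password with a per-password random salt using PBKDF2-HMAC-SHA256."""
    if salt is None:
        salt = os.urandom(16)          # unique salt for every password
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def verify_password(password: str, salt: bytes, expected: bytes) -> bool:
    """Re-hash the candidate password and compare in constant time."""
    _, digest = hash_password(password, salt)
    return hmac.compare_digest(digest, expected)

salt, stored = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, stored))  # True
print(verify_password("not the password", salt, stored))              # False
```

Because every password gets its own random salt and the hash is deliberately slow, an attacker who steals the table cannot run one precomputed lookup against all of the hashes at once, which is exactly the "weeks versus millions of years" point made above.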
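The guide also stresses that an attacker has to reach a process listening on a port, so the fewer listeners the better. One quick way to sanity-check a host from the machine itself is to probe common ports on localhost; this is only an illustrative sketch (the port list is an arbitrary sample), and tools such as `ss -tlnp` or `netstat` remain the authoritative view because they also show which process owns each socket:

```python
# Probe a handful of common TCP ports on localhost and report which ones answer.
# Illustrative sketch only; the port list is an arbitrary sample.
import socket

COMMON_PORTS = {
    22: "ssh", 25: "smtp", 80: "http", 443: "https",
    3306: "mysql", 5432: "postgresql", 6379: "redis", 11211: "memcached",
}

def is_listening(port: int, host: str = "127.0.0.1", timeout: float = 0.5) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(timeout)
        return sock.connect_ex((host, port)) == 0

for port, name in sorted(COMMON_PORTS.items()):
    status = "LISTENING" if is_listening(port) else "closed"
    print(f"{port:>5} ({name:<11}) {status}")
```

Anything reported as listening that you do not recognize is a candidate for shutting down or uninstalling, which is the attack-surface reduction the guide recommends.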
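Finally, the guide's recommended container layout keeps WordPress, Apache, and PHP in one container and MySQL in another, joined by a private network so the database is never published to the internet. Most people express this with Docker Compose or plain `docker run` commands; purely as a sketch of the same layout, here is a version using the Docker SDK for Python, where the image tags, container names, and credentials are placeholders rather than values from the guide:

```python
# Sketch of the web-container + database-container layout using the Docker SDK
# for Python (pip install docker). Names, tags, and credentials are placeholders;
# use proper secrets management rather than literal passwords in real use.
import docker

client = docker.from_env()

# Private bridge network: only containers attached to it can reach the database.
net = client.networks.create("wp_net", driver="bridge")

db = client.containers.run(
    "mysql:8.0",
    name="wp_db",
    detach=True,
    network="wp_net",
    environment={
        "MYSQL_ROOT_PASSWORD": "change-me-root",
        "MYSQL_DATABASE": "wordpress",
        "MYSQL_USER": "wp_user",
        "MYSQL_PASSWORD": "change-me",
    },
    # Note: no ports= mapping, so MySQL is reachable only from wp_net.
)

web = client.containers.run(
    "wordpress:latest",
    name="wp_web",
    detach=True,
    network="wp_net",
    ports={"80/tcp": 8080},  # only the web container is published to the host
    environment={
        "WORDPRESS_DB_HOST": "wp_db",
        "WORDPRESS_DB_NAME": "wordpress",
        "WORDPRESS_DB_USER": "wp_user",
        "WORDPRESS_DB_PASSWORD": "change-me",
    },
)

print(f"started {db.name} and {web.name} on network {net.name}")
```

Because the database container publishes no ports, outside attacks have to get through the web container first, and either container can be updated or replaced without touching the other, matching the isolation argument made in the annotations above.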
jack_fox

Programming for SEOs - Whiteboard Friday - Moz - 0 views

  • If you're going down the path of data analysis, and your primary reason for learning how to program is to work with data and do more sophisticated things with data, then I think there's no better language than Python.
  • If you're going down the path of web development, you want to be a better technical SEO, you want to understand how websites are constructed, JavaScript is an incredibly robust programming language that has boomed in usage on websites over the last few years. It's also very capable of doing back-end web development through Node.js, a runtime that lets JavaScript run on the server. The only issue with learning JavaScript is I would say that you need to learn CSS and HTML first.
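As a small taste of the data-analysis path described above, the Python sketch below reads a hypothetical keyword export and surfaces queries with many impressions but a weak click-through rate; the file name and column names are assumptions, not part of the original talk:

```python
# Hypothetical example: find high-impression, low-CTR queries in a CSV export.
# Assumes a file "search_queries.csv" with columns "query", "impressions", "clicks".
import csv

rows = []
with open("search_queries.csv", newline="") as f:
    for row in csv.DictReader(f):
        impressions = int(row["impressions"])
        clicks = int(row["clicks"])
        ctr = clicks / impressions if impressions else 0.0
        rows.append((row["query"], impressions, ctr))

# Queries people see often but rarely click: candidates for better titles and snippets.
opportunities = sorted(
    (r for r in rows if r[1] >= 1000 and r[2] < 0.02),
    key=lambda r: r[1],
    reverse=True,
)
for query, impressions, ctr in opportunities[:20]:
    print(f"{query!r}: {impressions} impressions, {ctr:.1%} CTR")
```

The thresholds (1,000 impressions, 2% CTR) are arbitrary examples; the point is that a few lines of Python turn a raw export into a prioritized work list.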
Jennifer Williams

Tag Categories - 24 views

Hey Dale, I added that for you. If anyone else really thinks a new "tag" (category) is needed, post here to the forum. Don't forget to use these tags and make sure that they are spelled the same...

tags

Rob Laporte

BIZyCart SEO Manual - Controlled Navigation - 0 views

  • How The Robots Work: Without getting into the programming details, the robots and web crawlers basically follow these steps: On arrival, the robot pulls out all of the readable text it is interested in and creates a list of the links found on the page. Links set as 'nofollow' or 'disallowed' are not added to the list. If there are too many links, the robot may take a special action based on that. While the first robot finishes processing the page, another robot script is launched to follow each of the links; if there are ten links, there are now eleven robots running. Each of those robot scripts loads the page it was sent to and builds another link list. Unless told otherwise, if there are ten links on each of those pages, one hundred additional robots get launched. Before going to the next page, the robots check to see if that page has already been looked at; if it was already indexed that day, they cancel themselves and stop. The number of robots keeps expanding until all of the links have been followed and the site's web pages have been indexed or avoided. You can see that on some sites, thousands of robot processes can be taking turns to work a web page. There is a physical limit on how much memory is available on the server; if the number of active robots exceeds that, they have to be canceled or memory corruption will occur. If you let the robots run in too many directions, they may not finish looking at every web page, or the results from some pages may get scrambled. You are also subject to the number of robots on that server that are looking at other web sites. Poorly managed robot servers can end up creating very strange results.
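The annotation above boils crawling down to three pieces of bookkeeping: collect the links on a page, skip 'nofollow' links, and never revisit a page already seen. A toy crawler makes that concrete; this Python sketch is illustrative only and omits the politeness delays, robots.txt handling, and error recovery a real robot needs:

```python
# Toy breadth-first crawler illustrating the link list, the nofollow check,
# and the "already visited" check described above. Illustrative only.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    """Collect href values from <a> tags, skipping rel="nofollow" links."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        if "nofollow" in (attrs.get("rel") or ""):
            return                      # nofollow links are not added to the list
        if attrs.get("href"):
            self.links.append(attrs["href"])

def crawl(start_url: str, max_pages: int = 20) -> set:
    visited = set()
    queue = deque([start_url])
    while queue and len(visited) < max_pages:
        url = queue.popleft()
        if url in visited:              # already looked at this page: skip it
            continue
        visited.add(url)
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "replace")
        except (OSError, ValueError):
            continue                    # unreachable or non-HTTP link: move on
        collector = LinkCollector()
        collector.feed(html)
        for href in collector.links:
            queue.append(urljoin(url, href))
    return visited

print(crawl("https://example.com"))
```

However a crawler schedules its work, the bookkeeping is the same: build the link list, drop nofollow links, and check the visited set before fetching anything twice.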
Rob Laporte

Questioning the Future of Search - ClickZ - 0 views

  •  
    Questioning the Future of Search, by Mike Grehan, ClickZ, Jan 26, 2009: Last week I presented a webinar based on the "thought paper" I wrote called "New Signals To Search Engines." As it was a long read at 23 pages, I highlighted the more salient points, but mainly wanted to try and answer the hundreds of questions I received following its publication. The top question was about social media. It seems that many companies already have barriers to entry. Amy Labroo, associate director of online media at Advantage Business Media, asked specifically about any backlash due to unmonitored content in the social media space. I've come across this situation quite a lot recently. Many companies worry about negative commentary and therefore don't accept comments on their blogs or social network sites. In fact, many haven't started a blog or a dialogue space at a social networking site. This is simply hiding from your audience. If people have negative commentary about you and they can't make it known at your Web site or blog, they'll make it known somewhere else. I advocate putting yourself out there and listening to your audience. Marketing has changed from a broadcast-my-corporate-message medium to a listening medium. The voice of the customer is very, very loud online. And those companies that still believe they own their brand and the message may well be in for a bit of shock as brands are hijacked by customers. Let your customers have their say. Keyword-driven marketing is all about understanding the language of the customer and creating marketing messages in that language. From time to time, I meet with creative agencies and almost always end u…
Dale Webb

symfony Web PHP Framework - 0 views

shared by Dale Webb on 08 May 08
  •  
    What we're using for new THS floorplans section, CMS, and all additional modules to be added in the future. Jesse is doing most advanced programming but I'm using research time to learn it as well so I don't have to rely on him moving forward.
Dale Webb

W3C Validation not part of Google Search Engine Ranking Factor - 0 views

  •  
    Matt Cutts has announced that W3C validation and clean coding do not factor into search engine rankings. There are other benefits to having a validated website, but there is nothing in Google's algorithm that rewards it with better rankings in the SERPs. Note: cleaner code usually loads faster, and load time is a ranking factor.
Dale Webb

Pixelsilk: SEO-Friendly Content Management System | Search Engine Journal - 0 views

  •  
    Article I found about PixelSilk in my Blog rounds. It does look pretty slick and user-friendly. I'll be very interested to see how easy it is to work with from a development perspective, how it's coded, etc. This article is interesting and insightful because the person is unfamiliar with CMS/coding in general, but knows SEO, and finds it very easy to use and likes the SEO features.
Rob Laporte

BruceClay - SEO Newsletter - FEATURE: Takeaways from SMX Advanced Seattle 2010 - 0 views

  • You & A with Matt Cutts of Google: Google's new Web indexing system, Caffeine, is fully live. The new indexing infrastructure translates to an index that is 50 percent fresher, has more storage capacity, and can recognize more connections between pieces of information. The Mayday update was an algorithm update implemented at the beginning of May that is intended to filter out low-quality search results. A new report in the Crawl errors section of Google Webmaster Tools indicates "soft 404" errors in order to help webmasters recognize and resolve these errors. Keynote Q&A with Yusuf Mehdi of Microsoft: Bing is opening up new ways to interact with maps. The newly released Bing Map App SDK allows developers to create their own applications which can be used to overlay information on maps. Bing Social integrates the Facebook firehose and Twitter results into a social search vertical. Bing plans to have the final stages of the Yahoo! organic and paid search integration completed by the end of 2010. Decisions about how to maintain or integrate Yahoo! Site Explorer have not been finalized. Bing's Webmaster Tools are about to undergo a major update. Refer to the Bing Webmaster Tools session for more on this development.
  • Bing's program manager said that the functionality provided by Yahoo! Site Explorer will still be available. It's not their intention to alienate SEOs because they consider SEOs users, too.
  • The Bing Webmaster team has built a new Webmaster Tools platform from the ground up. It is scheduled to go live Summer 2010. The platform focuses on three key areas: crawl, index and traffic. Data in each area will go back through a six month period. Tree control is a new feature that provides a visual way to traverse the crawl and index details of a site. The rich visualizations are powered by Silverlight. URL submission and URL blocking will be available in the new Webmaster Tools.
  • ...1 more annotation...
  • The Ultimate Social Media Tools Session: Tools to get your message out: HelpaReporter, PitchEngine, Social Mention, ScoutLabs. Customer and user insight tools: Rapleaf, Flowtown. Tools to find influencers: Klout. Forum tools: Bing Boards, Omgili, Board Tracker, Board Reader. Digg tools: Digg Alerter, FriendStatistics, di66.net. Make use of the social tools offered by social networks, e.g. utilize Facebook's many options to update your page and communicate with your fans by SMS. Encourage people to follow you using Twitter's short code.
jack_fox

4 local review trends to watch in 2021 - 0 views

  • The changing distribution of middle star reviews means that it’s more critical than ever for businesses to create a review program to solicit a larger volume of reviews from people who may not have thought to leave one before.
  • In BrightLocal’s 2020 edition of their annual survey of over 1,000 consumers in the US, 79% of respondents say they trust online reviews as much as personal recommendations from friends or family. However, if we look at the overall trend, we can see that 10% fewer respondents trust online reviews compared to 2014.
  • Users are more web-savvy than ever, and they can tell when there are suspicious patterns in reviews – like when a business has all 5-star reviews that were all submitted within the same time period. However, it also means that consumers can write off potentially negative or fake reviews as one-offs when one or two individuals were perhaps having a bad day and took it out on your business.
  • ...1 more annotation...
  • With reviews still believed to be a local SEO ranking factor, it’s important for businesses to not ignore the importance that reviews still have in the local pack–even if customer sentiment regarding reviews is slowly shifting, especially with the pandemic. The data also proves that it’s more important than ever for small businesses to implement a review solicitation strategy that follows each platform’s terms of service.