
Contents contributed and discussions participated by jack_fox

jack_fox


  • I'm not aware of any ranking algorithm that would take IPs like that into account.
jack_fox

What We Learned From A "Google Only" Marketing Approach | GatherUp

  • Consider offering a “Google only” entry-level service as one of your services
  • Then use the metrics that are available to prove its worth to your clients. Show them significant KPIs improvements as a rationale for upgrading to your higher-end services.
  • Consider selling reviews as a service beyond just asking for reviews.
  • ...8 more annotations...
  • A “Google only” approach to marketing works, and it works well.
  • ‘Deep-six’ expensive citations
  • we deleted all of the inappropriate business listings that were out there, cleaned up every bad listing we could find.
  • We took the time to build out new listings across several sites. Spot 2 Be received a few more links, but we saw virtually no impact by the end of that quarter in terms of rank of her top 50 terms. That being said we did see some movement lower down that indicated that the new citations had some value.
  • while citations aren’t what they used to be they might, if done judiciously, provide some benefit.
  • Can a Google My Business website rank? A: Yes
  • Would NAP confusion create additional problems? Would it screw the pooch? A: No
  • Could a Google-only marketing strategy provide ongoing lift and benefit? A: Yes, and it could do so inexpensively
jack_fox

4 Google My Business Fields That Impact Ranking (and 3 That Don't) - Whiteboard Friday ...

  • you do want to kind of think and possibly even test what page on your website to link your Google My Business listing to. Often people link to the homepage, which is fine. But we have also found with multi-location businesses sometimes it is better to link to a location page.
  • we have found that review quantity does make an impact on ranking. But that being said, we've also found that it has kind of diminishing returns. So for example, if you're a business and you go from having no reviews to, let's say, 20 or 30 reviews, you might start to see your business rank further away from your office, which is great. But if you go from, let's say, 30 to 70, you may not see the same lift.
    • jack_fox
       
      I would argue though that recent reviews are a big CTR factor, especially due to COVID.
jack_fox

7 Ways SEMrush Helped Me Launch A Successful PPC Agency

  • The real power of SEMrush is its long tail keyword suggestions. SEMrush empowers my research process to identify profitable keyword phrases throughout the buyer funnel.
  • SEMrush helped me find related keyword ideas. I used that along with Google's Keyword Planner to find keyword opportunities. From there, I “reverse engineered” a profitable ad strategy based on their competitor’s mistakes. In turn, this information helped me seal the deal with a new client. It also saved me from wasting the client’s time and money chasing unprofitable keywords.
  • Historical CPC data from competitors' domains help me project ROAS for clients who don't have any search ads history. With this information, I'll review my client's average conversion rate from non-paid traffic sources and then cast it against the average click-through rate of the target ad position as well as each target keyword's estimated monthly search volume and average cost per click.
  • ...1 more annotation...
  • I'll look at their ad copies and what keywords their ads show for on Google and Bing. The estimated monthly ad spend data helps me qualify new prospects.
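The ROAS projection described in the annotations above can be sketched as simple arithmetic. Every input below (search volume, CTR, CPC, conversion rate, order value) is a made-up figure for illustration, not data from the article:

```python
# Hypothetical ROAS projection for a client with no paid-search history.
# All inputs are illustrative assumptions, not real campaign data.

monthly_search_volume = 10_000   # estimated monthly searches for the keyword
target_position_ctr = 0.05       # average CTR of the target ad position
avg_cpc = 1.50                   # average cost per click (USD)
conversion_rate = 0.03           # client's average conversion rate (from non-paid traffic)
avg_order_value = 120.00         # average revenue per conversion (USD)

clicks = monthly_search_volume * target_position_ctr   # projected clicks
ad_spend = clicks * avg_cpc                            # projected monthly spend
conversions = clicks * conversion_rate                 # projected conversions
revenue = conversions * avg_order_value                # projected revenue
roas = revenue / ad_spend                              # return on ad spend

print(f"Projected monthly ROAS: {roas:.1f}x on ${ad_spend:.2f} spend")
```

With these placeholder numbers the projection works out to 500 clicks, $750 spend, and a 2.4x ROAS; swapping in real SEMrush and Keyword Planner estimates is the actual workflow the author describes.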
jack_fox

How to identify a business prospect for your agency workflow - Specialty Produ...

  • You can look for improving domains to show them their full potential or use their declining traffic to demonstrate just how much you can help.
  • Next use the Keyword Gap to identify the keywords that are unique to their domain, and discover gaps in their competitors’ strategies.
jack_fox

A new era has arrived in local search: Google's Local Trust Pack

  • the real value of the badge is the access it provides to Local Services Ads (LSA). This is Google’s local trust pack. It is a cost-per-call advertising inventory unit that acts unlike anything we have ever encountered as marketers.
  • Badges are earned within two distinct programs – Google Guaranteed and Google Screened.
  • This year, Google solidified the growth intentions behind its newly minted trust layer, with the launch of Google Screened for Professional Services providers. This program is for lawyers, financial planners, real estate agents, photographers, event planners, and tax specialists.
jack_fox

How to Get a Places Label on Google Maps

  • The place labels shown on Google Maps are determined algorithmically based on a large number of factors. Google only populates place labels for some businesses because, stylistically, there simply isn’t room for them all. As you zoom in on Google Maps, different labels will start to appear that weren’t there originally.
  • According to our study, more listings (percentage-wise) that had labels also had websites on them. 
  • The listings that had place labels with no zooming had an average of 6,455 reviews whereas the average number of reviews for listings without place labels was 21.
  • ...8 more annotations...
  • Older listings are more likely to have place labels. 
  • If you have an editorial summary, there is a pretty good chance you’ll also get a place label
  • User engagement is likely a large factor for determining which businesses get a place label.
  • Listings that had popular times graphs on them (ie: businesses that get a lot of physical store visits) were a lot more likely to have a place label
  • In Toronto, only 5% of businesses that we looked at had an active Google post.
  • Professional services (lawyers, dentists etc) are the least likely to have them.
  • In Toronto, the businesses that had place labels in the 0-3 zoom levels had an average of 8,659 searches a month.  For the businesses that didn’t get a place label, the average was 565.
  • During this study, we looked at 12 different factors on 395 businesses in 3 different cities.  We purposely chose cities that had varying populations to see how this differed for businesses that had tons of competition.
jack_fox

Managed vs Unmanaged Hosting Plans: Which Is Best for Your Website?

  • Less flexibility with regard to plugins and software versions; limited access to account configuration options
  • With shared hosting as mentioned above, you are usually given account level access to the server so that you can manage your website. Root level access is only available to your hosting provider, which means that they take care of all of the server management responsibilities. If you need a higher level of server side customizations, then a VPS or Dedicated server may be for you. 
  • Managed VPS and Dedicated servers include several beneficial options: a fully managed server (Linux, cPanel, etc.), technical support for server problems, and assistance with server-side customizations such as PHP modules, firewall rules, etc.
  • ...1 more annotation...
  • Those who desire a higher level of control over their site might look to an unmanaged VPS or Dedicated server instead. With this type of plan, you get: full root access to your server; a choice of preinstalled server templates that allow you to build your server however you like; an excellent choice for tech-savvy developers who need a custom environment; and lower recurring cost, but higher management/administration overhead.
jack_fox

The Ultimate WordPress Security Guide - Make Your Site Hackproof @ MyThemeShop

  • Even if you hand over the security role for your site, you should still have a good understanding of WordPress security – so you can be sure you’re really getting the protection you need.
  • recognize risk signs. A risky product is: rarely updated (it’s hard to code a security fix for a theme you’ve virtually abandoned); has many bad consumer reviews; lacks adequate support; or has a bad history of being hacked.
  • how do you know if a theme or plugin has been hacked in the past? You can start by checking wpvulndb – a database that tracks thousands of exploits across a wide range of plugins and themes.
    • jack_fox
       
      Worth adding to plugin selection/research procedures
  • ...10 more annotations...
  • Google is your best choice for finding the vulnerabilities that are not listed in wpvulndb.com. Just type “plugin-name exploit”
  • Insecure plugins and themes are the main WordPress security risks. So it’s extremely important to make careful choices.
  • It’s better to deactivate the plugin than run it. You can either find a replacement or wait until the issue is fixed.
  • If you ever experience a “white screen of death” after updating your plugins, there’s a simple procedure that will fix the situation: disable all plugins; one by one, activate each plugin you absolutely need; then activate the plugins that are not essential but have cosmetic value, again one by one; finally, delete the plugins you don’t need.
  • Top security pros rely on automated software that scans their networks and sites for weaknesses, notifying them of problems. It gives them the ability to respond quickly. You can do the same with ReScan.
  • Directories should be set to 755
  • all files inside your WordPress installation should have a 644 permission
  • the wp-config.php file should have permissions set to 600
  • You can use .htaccess files to prevent hackers from looking at code they shouldn’t see – including your wp-config file
  • Disabling XML-RPC
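The permission scheme highlighted above (755 for directories, 644 for files, 600 for wp-config.php) can be applied in one pass. A minimal sketch using Python's standard library; the throwaway temp directory stands in for a real WordPress install:

```python
import os
import stat
import tempfile

# Recommended WordPress permissions from the guide: directories 755,
# ordinary files 644, and wp-config.php locked to 600 (owner-only).
DIR_MODE = 0o755
FILE_MODE = 0o644
WPCONFIG_MODE = 0o600

def harden_permissions(root):
    """Walk a WordPress install and apply the recommended modes."""
    for dirpath, dirnames, filenames in os.walk(root):
        os.chmod(dirpath, DIR_MODE)
        for name in filenames:
            path = os.path.join(dirpath, name)
            mode = WPCONFIG_MODE if name == "wp-config.php" else FILE_MODE
            os.chmod(path, mode)

# Demo on a throwaway directory standing in for a WordPress install.
root = tempfile.mkdtemp()
open(os.path.join(root, "index.php"), "w").close()
open(os.path.join(root, "wp-config.php"), "w").close()
harden_permissions(root)

perm = lambda p: stat.S_IMODE(os.stat(p).st_mode)
print(oct(perm(root)))                                 # 0o755
print(oct(perm(os.path.join(root, "index.php"))))      # 0o644
print(oct(perm(os.path.join(root, "wp-config.php"))))  # 0o600
```

On a live server the equivalent is usually done with `find` and `chmod` over SSH; the point is the same three modes, applied consistently.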
jack_fox

Compare Plans - CodeGuard

jack_fox

The Ultimate Web Server Security Guide @ MyThemeShop

  • They could insert links into the site to boost their SEO rankings. Hackers can make a killing selling links from exploited sites. Alternatively, a hacker could deface the site and demand money to restore it (ransom). They could even place ads on the site and use the traffic to make money. In most cases, an attacker will also install backdoors into the server. These are deliberate security holes that allow them to come back and exploit the site in the future – even if the insecure plugin has been replaced.
  • Unfortunately, under WordPress, every plugin and theme has the ability to alter anything on the site. They can even be exploited to infect other apps and sites hosted on the same machine.
  • Theme developers are often relatively inexperienced coders. Usually, they’re professional graphic artists who have taught themselves a little PHP on the side. Plugins are another popular line of attack – they account for 22% of successful hacks. Put together, themes and plugins are a major source of security trouble.
  • ...102 more annotations...
  • Each person who uses your system should only have the privileges they need to perform their tasks.
  • Don’t depend on a single security measure to keep your server safe. You need multiple rings of defense.
  • Security exploits exist at all levels of the technology stack, from the hardware up. WP White Security revealed that 41% of WordPress sites are hacked through a weakness in the web host.
  • While it’s important to use a strong password, password cracking is not a primary focus for hackers.
  • the more software you have installed on your machine, the easier it is to hack – even if you aren’t using the programs! Clearly, programs that are designed to destroy your system are dangerous. But even innocent software can be used in an attack.
  • There are 3 ways to reduce the attack surface: 1) run fewer processes; 2) uninstall programs you don’t need; 3) build a system from scratch that only has the processes you need.
  • A really good authentication system uses multiple tests. Someone could steal or guess your password. They could grab your laptop with its cryptographic keys.
  • If you want to run multiple processes at the same time, you need some way of managing them. This is basically what a kernel is. It does more than that – it handles all of the complex details of the computer hardware, too. And it runs the computer’s networking capabilities
  • programs exist as files when they are not running in memory
  • SELinux’s default response is to deny any request.
  • SELinux is extremely comprehensive, but this power comes at a price. It’s difficult to learn, complex to set up, and time-consuming to maintain.
  • AppArmor is an example of a MAC tool, although it’s nowhere near as comprehensive as SELinux. It applies rules to programs to limit what they can do.
  • AppArmor is relatively easy to set up, but it does require you to configure each application and program one by one. This puts the onus for security in the hands of the user or sysadmin. Often, when new apps are added, users forget to configure AppArmor. Or they do a horrible job and lock themselves out, so their only option is to disable the profile. That said, several distributions have adopted AppArmor.
  • Generic profiles shipped by repo teams are designed to cover a wide range of different use cases, so they tend to be fairly loose. Your specific use cases are usually more specific. In this case, it pays to fine-tune the settings, making them more restrictive.
  • GRSecurity is a suite of security enhancements
  • In the future, this could become a viable option. For now, we’ll use Ubuntu and AppArmor.
  • Apache is a user-facing service – it’s how your users interact with your website. It’s important to control this interaction too.
  • If your Apache configuration is bad, these files can be viewed as plain text. All of your code will be visible for anyone to see – this potentially includes your database credentials, cryptographic keys, and salts.
  • You can configure Apache to refuse any requests for these essential directories using .htaccess files. These are folder-level configuration files that Apache reads before it replies to a request.
  • The primary use for .htaccess files is to control access
  • If an attacker knows your WordPress cryptographic salts, they can use fake cookies to trick WordPress into thinking they have logged on already.
  • If the hacker has physical access to the computer, they have many options at their disposal. They can type commands through the keyboard, or insert a disk or USB stick into the machine and launch an attack that way.
  • When it comes to network-based attacks, attackers have to reach through one of the machine’s network ports.
  • For an attacker to exploit a system, they have to communicate to a process that’s listening on a port. Otherwise, they’d simply be sending messages that are ignored. This is why you should only run processes that you need for your site to run. Anything else is a security risk.
  • Often, ports are occupied by processes that provide no real valuable service to the machine’s legitimate users. This tends to happen when you install a large distribution designed for multiple uses. Large distros include software that is useless to you in terms of running a website. So the best strategy is to start with a very lightweight distro and add the components you need.
  • If you see any unnecessary processes, you can shut them down manually. Better yet, if the process is completely unnecessary, you can remove it from your system.
  • Firewalls are quite similar to access control within the computer. They operate on a network level, and you can use them to enforce security policies. A firewall can prevent processes from broadcasting information from a port. It can stop outside users from sending data to a port. And it can enforce more complex rules.
  • Simply installing and running a firewall does not make your host machine secure – it’s just one layer in the security cake. But it’s a vital and a powerful one.
  • First of all, we need to configure our software to resist common attacks. But that can only protect us from attacks we know about. Access control software, such as AppArmor, can drastically limit the damage caused by unauthorized access. But you still need to know an attack is in progress.
  • This is where Network Intrusion Detection Software (NIDS) is essential. It scans the incoming network traffic, looking for unusual patterns or signs of a known attack. If it sees anything suspicious, it logs an alert.
  • It’s up to you to review these logs and act on them.
  • If it’s a false alarm, you should tune your NIDS software to ignore it. If it’s an ineffective attack, you should review your security and block the attacker through the firewall.
  • That’s why it’s essential to have an automated backup system. Finally, you need to understand how the attack succeeded, so you can prevent it from recurring. You may have to change some settings on your Firewall, tighten your access rules, adjust your Apache configuration, and change settings in your wp-config file. None of this would be possible without detailed logs describing the attack.
  • Every web server has a breaking point and dedicated DOS attackers are willing to increase the load until your server buckles. Good firewalls offer some level of protection against naive DOS attacks
  • a tiny number of sites (less than 1%) are hacked through the WordPress core files
  • Major DNS attacks have taken down some of the biggest sites in the world – including Ebay and Paypal. Large hosting companies like Hostgator and Blue Host have been attacked. It’s a serious risk!
  • Right now, due to the way the web currently works, it’s impossible to download a web page without the IP address of a server. In the future, technologies like IPFS and MaidSafe could change that.
  • So there are 2 benefits to using a CDN. The first is that your content gets to your readers fast. The second benefit is server anonymity – nobody knows your real IP address – including the psychos. This makes it pretty impossible to attack your server – nobody can attack a server without an IP address.
  • When CDNs discover a DDOS attack, they have their own ways to deal with it. They often display a very lightweight “are you human?” message with a captcha. This tactic reduces the bandwidth costs and screens out the automated attacks.
  • If any of your DNS records point to your actual server, then it’s easy to find it and attack it. This includes A records (addresses) and MX records (mail exchange). You should also use a separate mail server machine to send your emails. Otherwise, your email headers will expose your real IP address.
  • If your hosting company refuses to give you a new IP address, it may be time to find a new service provider.
  • WordPress uses encryption to store passwords in the database. It doesn’t store the actual password – instead, it stores an encrypted version. If someone steals your database tables, they won’t have the actual passwords.
  • If you used a simple hash function, a hacker could gain privileged access to your app in a short period of time.
  • The salt strings are stored in your site’s wp-config.php file.
  • Salts dramatically increase the time it would take to get a password out of a hash code – instead of taking a few weeks, it would take millions of years
  • You keep the other key (the decryption key) to yourself. If anyone stole it, they could decode your private messages! These 2-key cryptographic functions do exist. They are the basis of TLS (https) and SSH.
  • the most secure systems tend to be the simplest. The absolutely secure machine would be one that was switched off.
  • For WordPress sites, you also need PHP and a database.
  • A VM is an emulated computer system running inside a real computer (the host). It contains its own operating system and resources, such as storage, and memory. The VM could run a completely different operating system from the host system – you could run OSX in a VM hosted on your Windows machine
  • This isolation offers a degree of protection. Let’s imagine your VM gets infected with a particularly nasty virus – the VM’s file system could be completely destroyed, or the data could be hopelessly corrupted. But the damage is limited to the VM itself. The host environment would remain safe.
  • This is how shared hosting and virtual private servers (VPSes) work today. Each customer has access to their own self-contained environment, within a virtual machine.
  • VMs are not just for hosting companies. If you’re hosting multiple sites on a dedicated server or a VPS, VMs can help to make your server more secure. Each site can live inside its own VM. That way, if one server is hacked, the rest of your sites are safe.
  • Even with all these considerations, the benefits of VMs outweigh their drawbacks. But performance is vital on the web.
  • Containers (like Docker) are very similar to VMs.
  • Because we’ve cut the hypervisor out of the loop, applications run much faster – almost as fast as processes in the host environment. Keeping each container separate does involve some computation by the container software. But it’s much lighter than the work required by a hypervisor!
  • Docker Cloud is a web-based service that automates the task for you. It integrates smoothly with the most popular cloud hosting platforms (such as Amazon Web Services, or Digital Ocean).
  • With containers, you can guarantee that the developer’s environment is exactly the same as the live server. Before the developer writes a single line of code, they can download the container to their computer. If the code works on their PC, it will work on the live server. This is a huge benefit of using containers, and it’s a major reason for their popularity.
  • A complete stack of these layers is called an “image”
  • The core of Docker is the Docker Engine – which lives inside a daemon – or long-running process
  • another great resource – the Docker Hub. The hub is an online directory of community-made images you can download and use in your own projects. These include Linux distributions, utilities, and complete applications.
  • Docker has established a relationship with the teams behind popular open source projects (including WordPress) – these partners have built official images that you can download and use as-is.
  • when you finish developing your code, you should wrap it up inside a complete container image. The goal is to put all the code that runs your site inside a container and store the volatile data in a volume.
  • Although Docker can help to make your site more secure, there are a few major issues you need to understand: the Docker daemon runs as a superuser; it’s possible to load the entire filesystem into a container; and it’s possible to pass a reference to the docker daemon into a container.
  • The solution to this issue is to use a MAC solution like SELinux, GRSecurity or AppArmor.
  • Never let anyone trick you into running a strange docker command.
  • only download and use Docker images from a trustworthy source. Official images for popular projects are security audited by the Docker team. Community images are not
  • there are the core WordPress files. These interact with the web server through the PHP runtime. WordPress also relies on the file system and a database server.
  • A service is some software component that listens for requests (over a protocol) and does something when it receives those requests.
  • Using Docker, you could install WordPress, Apache, and PHP in one container, and run MySQL from another. These containers could run on the same physical machine, or on different ones
  • The database service container can be configured to only accept connections that originate from the web container. This immediately removes the threat of external attacks against your database server
  • This gives you the perfect opportunity to remove high-risk software from your host machine, including: language runtimes and interpreters (such as PHP, Ruby, Python, etc.), web servers, databases, and mail servers.
  • If a new version of MySQL is released, you can update the database container without touching the web container. Likewise, if PHP or Apache are updated, you can update the web container and leave the database container alone.
  • Because Docker makes it easy to connect these containers together, there’s no reason to lump all your software inside a single container. In fact, it’s a bad practice – it increases the security risk for any single container, and it makes it harder to manage them.
  • If your site is already live on an existing server, the best approach is to set up a new host machine and then migrate over to it. Here are the steps you need to take:
  • With a minimal Ubuntu installation, you have a fairly bare-bones server. You also have the benefit of a huge repository of software you can install if you want.
  • If access control is like a lock protecting a building, intrusion detection is the security alarm that rings after someone breaks in.
  • Logging on to your host with a superuser account is a bad practice. It’s easy to accidentally break something.
  • Fail2ban blocks SSH users who fail the login process multiple times. You can also set it up to detect and block hack attempts over HTTP – this will catch hackers who attempt to probe your site for weaknesses.
  • With multiple WordPress sites on your machine, you have 2 choices. You could create a new database container for each, or you could reuse the same container between them. Sharing the DB container is a little riskier, as a hacker could, theoretically, ruin all your sites with one attack. You can minimize that risk by: using a custom root user and password for your database (don’t use the default username of ‘root’); ensuring the db container is not accessible over the internet (hide it away inside a docker network); and creating new databases and users for each WordPress site, ensuring each user only has permissions for their specific database.
  • What are the benefits of using a single database container? It’s easier to configure and scale. It’s easier to backup and recover your data. It’s a little lighter on resources.
  • you could also add a caching container, like Varnish. Varnish caches your content so it can serve pages quickly – much faster than WordPress can
  • Docker has the ability to limit how much processor time and memory each container gets. This protects you against exhaustion DOS attacks
  • A containerized process still has some of the abilities of root, making it more powerful than a regular user. But it’s not as bad as full-on root privileges. With AppArmor, you can tighten the security further, preventing the process from accessing any parts of the system that do not relate to serving your website.
  • Docker Hub works like GitHub – you can upload and download images for free. The downside is that there’s no security auditing. So it’s easy to download a trojan horse inside a container.
  • Official images (such as WordPress and Apache) are audited by the Docker team. These are safe. Community images (which have names like user/myapp) are not audited.
  • a kernel exploit executed inside a container will affect the entire system. The only way to protect against kernel exploits is to regularly update the host system
  • Containers run in isolation from the rest of the system. That does not mean you can neglect security – your website lives inside these containers! Even if a hacker cannot access the full system from a container, they can still damage the container’s contents.
  • Under Ubuntu, AppArmor already protects you – to a degree. The Docker daemon has an AppArmor profile, and each container runs under a default AppArmor profile. The default profile prevents an app from breaking out of the container, and restricts it from doing things that would harm the system as a whole. However, the default profile offers no specific protection against WordPress specific attacks. We can fix this by creating a custom profile for your WordPress container.
  • The net effect is that it’s impossible to install malware, themes or plugins through the web interface. We’ve already covered this to some degree with the .htaccess rules and directory permissions. Now we’re enforcing it through the Linux kernel.
  • There are versions of Docker for Mac and PC, so you’ll be able to run your site from your home machine. If the code works on your PC, it will also work on the server.
  • Tripwire tends to complain about the entries in the /proc filespace, which are auto-generated by the Linux kernel. These files contain information about running processes, and they tend to change rapidly while Linux runs your system. We don’t want to ignore the directory entirely, as it provides useful signs that an attack is in progress. So we’re going to have to update the policy to focus on the files we are interested in.
  • Now we should install an e-mail notification utility – to warn us if anything changes on the system. This will enable us to respond quickly if our system is compromised (depending on how often you check your emails).
  • Rootkits are malicious code that hackers install onto your machine. When they manage to get one on your server, it gives them elevated access to your system
  • Tripwire is configured to search in key areas. It’s good at detecting newly installed software, malicious sockets, and other signs of a compromised system. RKHunter looks in less obvious places, and it checks the contents of files to see if they contain known malicious code. RKHunter is supported by a community of security experts who keep it updated with known malware signatures – just like antivirus software for PCs.
  • If your hosting company offers the option, this would be a good point to make an image of your server. Most cloud hosting companies offer tools to do this.
  • With an image, it’s easy to launch new servers or recover the old one if things go horribly wrong.
  • We’ve hidden our server from the world while making it easy to read our content; built a firewall to block malicious traffic; trapped our web server inside a container where it can’t do any harm; strengthened Linux’s access control model to prevent processes from going rogue; added an intrusion detection system to identify corrupted files and processes; added a rootkit scanner; strengthened our WordPress installation with 2-factor authentication; and disabled the ability for any malicious user to install poisoned themes or plugins.
  • Make a routine of checking the logs (or emails if you configured email reporting). It’s vital to act quickly if you see any warnings. If they’re false warnings, edit the configuration. Don’t get into a habit of ignoring the reports.
  • Virtually everything that happens on a Linux machine is logged.
  • You have to make a habit of checking for new exploits and learn how to protect yourself against them. Regularly check for security patches and issues in the core WordPress app: WordPress Security Notices Also, check regularly on the forums or mailing lists for the plugins and themes you use on your site.
  • network level intrusion detection service – you can fix that by installing Snort or PSAD.
  • The only way to guarantee your safety is to constantly update your security tactics and never get complacent.
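
The highlights above describe Tripwire and RKHunter detecting changed or corrupted files. As a rough sketch of the baseline-and-compare idea those tools build on (the paths and file contents here are invented for the demo; real Tripwire maintains a protected checksum database covering key system paths):

```shell
# Record baseline checksums for files we want to watch.
mkdir -p /tmp/ids-demo
echo "original content" > /tmp/ids-demo/config.txt
sha256sum /tmp/ids-demo/config.txt > /tmp/ids-demo/baseline.sha256

# Simulate tampering, then verify the current files against the baseline.
echo "tampered" > /tmp/ids-demo/config.txt
if sha256sum --check --quiet /tmp/ids-demo/baseline.sha256 2>/dev/null; then
  echo "clean"
else
  # The verify step fails for any modified file, so we raise an alert.
  echo "ALERT: file changed since baseline"
fi
```

The full tools add much more (signed databases, known-rootkit signatures, scheduled scans with email reports), but the core detection loop is this checksum comparison.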
jack_fox

Web Hosting 101 - The Basics - 0 views

  • Linux servers are open source and can be based on a number of different distributions, such as Ubuntu, Debian, Red Hat, CentOS, or FreeBSD.
  • the most common forms of web hosting available are: Free Web Hosting, Shared Web Hosting, Managed Web Hosting, VPS Web Hosting, Dedicated Web Hosting, and Cloud Web Hosting
  • Free web hosting is offered by various companies primarily in order to up-sell other domain services or to publish advertising on pages that are hosted under the account.
  • ...33 more annotations...
  • With shared web hosting, there may be thousands of different businesses, individuals, and organizations all serving their website files to the public from the same computer.
  • The web hosting company employs systems administrators to manage the server software installation and security updates. The hosting clients use file transfer management tools to host web pages in HTML or other programming languages which serve the files to the public through the browser. The hard disk space on the remote server can be used for other purposes than web hosting, for example remote file storage, email accounts, sandbox web development, mobile app support, or running software scripts online.
  • Shared web hosting accounts can cost as little as $1–$3 per month and rarely cost more than $20. It is estimated that over 90% of the websites on the internet use shared web hosting to keep their information online 24 hours a day. Shared web hosts never turn off their services and offer seamless hardware upgrades in the data center that can keep a website online for years. Most of the available web development tools will integrate easily with a shared hosting account.
  • The main disadvantage of shared web hosting is that it is not able to scale effectively to support the traffic of large websites and usually includes strict limitations on the use of CPU processing power because of the pooled resources. Shared web hosting does not typically support the user installation of server extensions through the command line that are important for custom web development and mobile app support.
  • There is still no opportunity for advanced systems administration and custom server configurations on most shared hosting plans. Security on shared web hosting frameworks is not considered robust enough for sensitive corporate information and government accounts. There can also be performance issues that develop on a server if one domain is consistently consuming shared resources or hit with a DDoS attack. Because systems administration and root server configuration control is taken out of the hands of shared web hosting users, they are often overly reliant on the service company for tech support.
  • Shared web hosting is recommended for self-published websites and small business networks.
  • Managed web hosting is a version of shared hosting where the service company specializes in platform-specific products that support custom development frameworks. Examples of this can be seen in Pantheon and Acquia Cloud for Drupal, Nexcess for Magento, or WP Engine for WordPress. Managed host companies provide optimized server environments that can speed up website performance and page load times for high-traffic, CMS-driven websites.
  • Virtual Private Servers (VPS) are a web hosting solution designed to give more power and flexibility to website owners for custom developed software requirements and complex applications. Technically, a VPS will operate in the same manner as a dedicated server while operating on a partitioned hardware framework that allows for the use of only a fraction of the resources of the host machine.
  • Understanding which virtualization platform the VPS web hosting company is using to manage data center resources and client configurations is important.
  • Developers often prefer VPS accounts because they can custom configure the server with the choice of operating system and install whatever additional server extensions are required for programming web applications
  • The main benefit of VPS hosting is that website owners can “dial in” the exact amount of server resources that are required to optimize the performance of a complex website.
  • The main disadvantage of VPS web hosting is the complexity of systems administration required to install and manage the server software, which requires a lot of command line knowledge and background in web server configuration.
  • Inexperienced users can leave security holes in the environment that hackers can easily detect and target using automated script bots and known server or database exploits. Using a standardized cPanel, CentOS, & WHM environment or administration panels like Webmin and Virtualmin can simplify the server administration process considerably by adding a GUI layer for common tasks.
  • VPS web hosting accounts are best suited for developers who need to custom configure the environment with server extensions that shared web hosts will not support. Typically these are related to the use of database frameworks other than MySQL, programming languages other than PHP, and server frameworks other than Apache.
  • Dedicated web hosting is the most expensive and flexible of all of the service plans offered by companies in the industry, as site owners are able to directly rent or lease a complete rack-mount server in a data center.
  • Dedicated servers are required to host the largest sites by traffic on the web, as well as by mobile apps which require elite performance
  • The main disadvantage of a dedicated server is that it is costly compared to shared hosting or VPS plans, and expensive even when compared to the price of the underlying hardware itself. With dedicated servers, the client is paying not only for the use of the server, but also for the trained technicians who manage it, the overhead costs of the data center, and access to the internet backbone. Data center costs include not only rental of office and warehouse space, but also the electricity required to run all of the servers and keep them cool. Data centers must also have back-up power generation facilities in case the local electricity supply is cut. All of the residual costs are included in the annual price of a dedicated server plan. Nevertheless, it is still often much cheaper than what would be required to manage a data center for a single business independently.
  • Cloud web hosting provides solutions for websites that need more processing power and require more than a single server instance because the amount of online traffic, including the number of queries to the database and resource files, is too high in volume for a single machine
  • Cloud web hosting is defined by the deployment of server clusters that scale automatically with the user traffic and processing power needs of a website, including advanced software applications for elastic load balancing, file storage, and database optimization
  • Cloud web hosting is similar to content delivery networks (CDNs) which use distributed global servers, advanced page caching, and file management software to optimize website performance for large websites. Many cloud hosting companies will offer the ability to choose the operating system, database framework, and geographic location of the server itself as part of the configuration options.
  • Cloud web hosting is designed for remote computing applications and required by large web sites whose user traffic exceeds the limits of what a single server instance will provide. Cloud web hosting is particularly designed to meet the needs of websites with large database requirements.
  • Not every website will require cloud hosting, but small businesses and start-ups who scale their traffic and user communities often find managed cloud services a reasonable option over dedicated servers because of the ability to “pay-as-you-go” for only the amount of server resources used and for the ability to keep sites online through cluster scaling at the times of peak user service provision.
  • The major downside to cloud hosting is the uncertainty involved with the variability of costs with sites on the “pay-as-you-go” model. Another problem can be associated with “hype” in the industry, which can lead to over-pricing and over-billing for unnecessary services or introductory plans.
  • Cloud web hosting can be similar to VPS or dedicated server frameworks where the systems administrator has the ability to custom configure the installation of the operating system with software extensions that are not available in shared hosting environments. Some managed cloud hosts simplify this process by offering optimally configured solutions with a proprietary base software package.
  • Some of the main features to look for in any web hosting account are: Server Architecture, Operating System Version, Domain Management Tools, Systems Administration Tools, Bandwidth & CPU Limitations, Free Offers & Promotions, Data Security, and Technical Support
  • Before purchasing any web hosting account, it is essential to verify the server hardware being used on the platform. Currently there is a wide variety of difference between the different versions of Intel Xeon, Atom, Itanium, and AMD Opteron servers deployed in data center use.
  • The version of Linux installed, for example CentOS and Cloud Linux, can also come with licensing restrictions due to the use of WHM and cPanel. The use of other Linux distributions like Ubuntu, Red Hat, Debian, FreeBSD, etc. in web servers is mostly related to developer preference for systems administration
  • Many hosting companies claim to offer “unlimited” bandwidth and data transfer. However, if a website uses too many CPU resources, it may still be throttled or taken offline at times of peak user traffic.
  • A web hosting company should provide a guaranteed uptime of at least 99.9% as part of the service plan.
  • Website owners should make sure that any web hosting plan will be configured securely, including firewalls and monitoring software to prevent intrusions by automated script-bot attacks. Check whether the web hosting company offers DDoS attack protection and auto-alerts for unauthorized logins. While shared web hosting plans include the company services related to upgrading the installed operating system and server software with the latest security patches, VPS and dedicated server accounts will need to be responsible for this through a qualified systems administrator. Web hosting companies that provide automated site file and database back-up tools like raid disk mirroring on advanced accounts provide an extra layer of site security in case of a server crash or technical error that leads to data loss.
  • Managed hosts have the advantage of experienced technical support teams with platform-specific knowledge
  • Small business website owners and independent publishers should start with a shared web hosting account, then upgrade to a VPS or Cloud hosting plan if the traffic scales beyond what the server will support.
  • While shared hosting under cPanel and CentOS remains the industry standard, innovations in cloud computing are changing the web hosting landscape quickly. Many web hosting companies are now offering “hybrid” approaches that combine the best features of cloud and shared hosting into a high-performance, low-cost retail plan that offers integrated load balancing, page caching, and CDN services on elite server hardware configurations.
  •  
    "Linux servers are open source and can be based on a number of different distributions, such as Ubuntu, Debian, Red Hat, CentOS, or FreeBSD."
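
The 99.9% uptime guarantee mentioned in the highlights translates into a concrete downtime budget. A quick back-of-the-envelope check (assuming a 365-day year):

```shell
# Downtime allowed under a 99.9% uptime guarantee: 0.1% of the minutes in a year.
awk 'BEGIN {
  total_min = 365 * 24 * 60;        # 525600 minutes per year
  down_min  = total_min * 0.001;    # the 0.1% that may be downtime
  printf "%.1f minutes/year (%.2f hours)\n", down_min, down_min / 60
}'
# → 525.6 minutes/year (8.76 hours)
```

That works out to roughly 44 minutes of permitted downtime per month, which is a useful yardstick when comparing hosts' uptime claims.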
jack_fox

Server Security Requirements and References: Information Technology - Northwestern Univ... - 0 views

  • Not every recommendation will be applicable to every server; the system administrator should therefore exercise their own judgment in conjunction with their department's own requirements and business needs. Deviations from the recommended guidelines should be documented
  • if a department is required to comply with PCI (Payment Card Industry) regulations, the specific recommendation has been labeled with "PCI/DSS"
  • All local and domain accounts with privileges above normal user level should have a minimum 15 character passphrase and must be changed at least once every quarter.
  • ...9 more annotations...
  • Machines may not be connected to the network until they have had the latest OS and application updates applied, anti-viral software installed and activated, firewall enabled, AND a strong passphrase enabled on all accounts.
  • Departments must establish, maintain, and effectively implement plans for emergency response, backup operations, and post-disaster recovery for organizational information systems to ensure the availability of critical information resources and continuity of operations in emergency situations.
  • Encrypted backups should be taken regularly, and all on/off site storage should be physically secure.
  • Clocks must be synchronized to two (2) internally hosted time servers
  • RedHat Linux
    • jack_fox
       
      Closest to CentOS, the distribution of Linux that FutureHosting uses
  • Encrypt all non-console administrative access. Use technologies such as SSH, VPN, or SSL/TLS (transport layer security) for web-based management and other non-console administrative access.
  • Logs must be available online (electronically) for three months, available on tape (or other removable media) for one year.
  • Establish a process to identify newly discovered security vulnerabilities (for example, subscribe to alert services freely available on the Internet). Update standards to address new vulnerability issues.
  • Remove inactive user accounts at least every 90 days.
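
Several of these requirements lend themselves to small automated checks. For instance, a sketch of enforcing the 15-character minimum from the passphrase rule (the sample passphrases are invented; a real policy would be enforced via PAM, e.g. pam_pwquality):

```shell
# Reject any privileged-account passphrase shorter than 15 characters.
check_passphrase() {
  if [ "${#1}" -ge 15 ]; then
    echo "OK (${#1} chars)"
  else
    echo "REJECTED: minimum is 15 characters, got ${#1}"
  fi
}

check_passphrase "correct-horse-battery-staple"   # OK (28 chars)
check_passphrase "hunter2"                        # REJECTED
```

Length is only the first gate, of course; the quarterly rotation and privilege-level rules in the highlights still need their own enforcement.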
jack_fox

Slow Google Sheets? Here are 27 techniques you can try right now - 0 views

  •  
    "The IMPORTRANGE is a slow formula because it's connecting to another Sheet to retrieve data. In general, it's best to minimize the number of these external calls required."
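
One way to act on that advice is to replace several narrow imports with a single wide one, so the Sheet makes one external call instead of many (the spreadsheet key and ranges below are placeholders):

```
Slow — one external call per column:
=IMPORTRANGE("https://docs.google.com/spreadsheets/d/KEY", "Data!A1:A100")
=IMPORTRANGE("https://docs.google.com/spreadsheets/d/KEY", "Data!B1:B100")

Faster — one call for the whole range:
=IMPORTRANGE("https://docs.google.com/spreadsheets/d/KEY", "Data!A1:B100")
```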
jack_fox

Robots Meta Tag & X-Robots-Tag: Everything You Need to Know - 0 views

shared by jack_fox on 12 Oct 20
  •  
    "add the following HTML snippet to every page on your site to tell Google that you want no restrictions on your snippets: Note that if you use Yoast SEO, this piece of code is added automatically on every page unless you added noindex or nosnippet directives."
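
The HTML snippet referenced in that quote did not survive extraction. A common form of the tag the article describes, built from Google's documented robots-meta preview directives (whether it matches the article's exact snippet is an assumption):

```html
<!-- Tells Google to apply no limits to text snippets, image previews,
     or video previews. Assumed reconstruction of the elided snippet. -->
<meta name="robots" content="max-snippet:-1, max-image-preview:large, max-video-preview:-1">
```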