DISC Inc / Group items tagged: display

28 Google rich snippets you should know in 2019 [guide + infographic] - Mangools Blog

  • unless you are an authoritative website such as Wikipedia, your information probably won’t appear in the answer box.
  • having an image from your website in an image pack is not very beneficial.
  • Besides the common video thumbnail and video knowledge panel, videos may also appear in a carousel, on both mobile and desktop devices.
  • ...15 more annotations...
  • It is always a good idea to have a video on your website. It increases user engagement and grabs attention. If you appear in a SERP with your own video thumbnail, it increases CTR, and users will likely stay longer on your site.
  • If you decide to host (or embed) a video on your own website, you have to include proper structured data markup.
  • In general, it’s easier to appear as a video thumbnail in the SERP with a YouTube video.
  • From the technical point of view, it is important to have a structured data markup for your article and it is recommended by Google to have an AMP version of the website.
  • It is based on internal Google algorithm. Your website has to be authoritative and contain high quality content. It doesn’t matter if you are a big news portal or you have a personal blog. If there is a long, high quality content, Google may include your website.
  • If you want to appear as an in-depth article, you should write long, high quality and unique content marked up with a structured data markup for article (don’t forget to include your company logo within the schema markup).
  • Higher CTRs. It’s catchy, as numbers always attract people’s attention. An image can make the feature even more prominent.
  • Implementation: Good old friend: structured data
  • In the SERP, they replace the classic URL of a result. A breadcrumb is a simplified, human-readable version of the result’s URL, with categories and leaf pages separated by chevrons. On desktop you can achieve it with the right structured data; in the mobile SERP it is automatic for all results.
  • Breadcrumbs (as opposed to a common URL) are easier to read for people, so it leads to a better UX right from the very first interaction with your website in the SERP, which can also lead to a higher CTR.
  • It’s really easy to implement on every blog or ecommerce site – just add another piece of structured data to your website. If you have a WordPress site, you can do that with SEO plugins like Yoast SEO.
  • It mainly appears for the root domain, but it can be shown for a leaf page too (e.g. if you have the blog as a leaf page, blog categories (leaf pages) may appear as sitelinks).
  • Sitelinks contain links to leaf pages of the current website, with a title and description. A result may contain 2–10 sitelinks, and their appearance on mobile is a bit different from desktop. You may also spot small sitelinks as a vertical enhancement of an organic result.
  • High CTRs.
  • You can’t directly control the occurrence of sitelinks. Only Google decides whether to display them or not. However, the best practice is to have a clear website hierarchy in the top menu, with descriptive anchor text. The sitelinks are links from the menu.
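The breadcrumb feature above comes down to one small block of structured data. As a minimal sketch, a schema.org BreadcrumbList in JSON-LD might look like this (all names and URLs are invented placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    {
      "@type": "ListItem",
      "position": 1,
      "name": "Blog",
      "item": "https://www.example.com/blog/"
    },
    {
      "@type": "ListItem",
      "position": 2,
      "name": "SEO",
      "item": "https://www.example.com/blog/seo/"
    }
  ]
}
</script>
```

Plugins like Yoast SEO generate essentially this markup automatically, so hand-writing it is mainly necessary on custom-built sites.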
How to Create the Perfect Robots.txt File for SEO

shared by jack_fox on 07 Mar 19
  • One of the best uses of the robots.txt file is to maximize search engines’ crawl budgets by telling them to not crawl the parts of your site that aren’t displayed to the public.
  • you should not use robots.txt to block pages from search engines
    • jack_fox
       
      Why?
  • you could disallow a page, but it could still end up in the index. Generally, you don’t want that. That’s why you need the noindex directive. It works with the disallow directive to make sure bots don’t visit or index certain pages.
  • ...1 more annotation...
  • If you have any pages that you don’t want indexed (like those precious thank-you pages), you can use both the disallow and noindex directives:
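As a sketch of the combination the article describes (the path is a placeholder), a robots.txt might look like this. Note that Google has since announced it does not support the unofficial Noindex directive in robots.txt, so an on-page meta robots tag is the more reliable route:

```
# Hypothetical robots.txt – keep crawlers away from a thank-you page
User-agent: *
Disallow: /thank-you/

# Unofficial directive the article relies on; no longer honored by Google
Noindex: /thank-you/
```

The on-page alternative is placing `<meta name="robots" content="noindex">` in the page’s `<head>`, which Google respects as long as the page itself is crawlable.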
How AI can uncover new insights and drive SEO performance - Search Engine Land

  • Monitoring website performance in analytics platforms to discover insights.
  • Monitoring performance: AI can process data, alert the user to any anomalies and highlight quick wins to action immediately.
  • The unifying thread through all of this is the fact that AI can deliver highly relevant insights automatically, at huge scale, and in a manner we can easily share with other departments in our organization. Without the right technology, we could only achieve this with the support of hundreds of analysts and an infinite budget. It is worth noting that the difference between a valuable insight and a simple observation is incredibly significant for any business. A true insight illuminates something new and guides future action based on the moments and metrics that matter.
  • ...1 more annotation...
  • Search marketers should seek out a platform that employs deep learning technology to sift through search, social and content marketing data from a range of analytics platforms to produce these insights. This should be achieved across all territories, devices, and demographics, allowing new information to surface that would typically slip through the cracks. When evaluating technology for these purposes, marketers should ask these questions:
    - What is the benefit? How does it save time and build efficiency?
    - What data sources and data sets are involved in all calculations, including search, social and local?
    - How does it index URLs? Is data fresh, accurate and collected frequently to keep track of the SEO landscape?
    - How sophisticated is the AI? What are the machine learning and deep learning applications used to identify patterns in consumer data?
    - How does it change our business operation capabilities? What clear business problems does it solve?
    - Does it contain intuitive dashboards that display all findings in a digestible manner that can be shared with non-technical audiences and across the digital organization?
Search Analytics Report - Search Console Help

  • Aggregating data by site vs. by page: If you group, filter, or compare by page or search appearance, all metrics in the report are aggregated by page; otherwise, all metrics are aggregated by site.
    - Impressions: if a site appears twice on a search results page when aggregating by site, it counts as a single impression; if grouping by page or search appearance, each unique page is counted separately.
    - Clicks: if a site appears twice in search results when grouped by site, and the user clicks on one link, backs up, then clicks the other link, it counts as a single click, since the final destination is the same site.
    - Position: when aggregating by site, the topmost position of your property in search results is reported; when grouped by page or search appearance, the topmost position of the page in search results is reported.
    When aggregating data by site, the site is the true target of the search results link, which might not be the same as the displayed URL, as determined by Google’s skip-redirect behavior. Because of the different accounting methods, the click-through rate and average position are higher when aggregating by site if multiple pages from the same site appear in the search results.
    For example, imagine that a search for "fun pets for children" returns only the following three results, all from the same site, and that users click each of them with equal frequency:
      1. www.petstore.example.com/monkeys
      2. www.petstore.example.com/ponies
      3. www.petstore.example.com/unicorns
    Metrics aggregated by site: click-through rate 100% (all clicks for a site are combined); average position 1 (highest position from the site in the results).
    Metrics aggregated by page: click-through rate 33% (3 pages shown, 1/3 of clicks to each page); average position 2 ((1 + 2 + 3) / 3 = 2).
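The arithmetic behind the two aggregation modes can be sketched in a few lines, using the hypothetical "fun pets for children" SERP above (click counts are invented, but chosen so every impression produces one click somewhere on the site):

```python
# Three results from one site on a SERP shown 100 times.
impressions = 100                # times the SERP was displayed
positions = [1, 2, 3]            # positions of the three pages
clicks_per_page = [33, 33, 34]   # users click each page about equally

# Aggregated by site: the whole site counts once per SERP,
# and its topmost position is reported.
site_ctr = sum(clicks_per_page) / impressions   # 100/100 -> 1.0 (100%)
site_position = min(positions)                  # topmost position: 1

# Aggregated by page: each page counts its own impressions/clicks,
# and positions are averaged across pages.
page_ctrs = [c / impressions for c in clicks_per_page]   # ~33% each
avg_page_position = sum(positions) / len(positions)      # (1+2+3)/3 = 2.0

print(site_ctr, site_position, avg_page_position)
```

This reproduces the report’s numbers: 100% CTR at position 1 by site, versus ~33% CTR at average position 2 by page.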
Image SEO: Optimizing images for search engines * Yoast

  • Adding structured data to your pages can help search engines display your images as rich results. While Google says structured data doesn’t help you rank better, it does help to achieve a more fleshed out listing in Image Search
  • Adding images to your XML sitemaps helps Google index your images, so be sure to do so for better image SEO.
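A sketch of a sitemap entry using Google’s image sitemap extension (both URLs are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://www.example.com/sample-page/</loc>
    <!-- One or more images that appear on this page -->
    <image:image>
      <image:loc>https://www.example.com/images/photo.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```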
How to Create a Knowledge Panel for Your Organization - Go Fish Digital

  • Sometimes, Google will display a brand KP or a local KP depending on what was searched.
  • keep in mind that it’s unlikely that you’ll be able to replace your local KPs with brand KPs for all searches.
Manufacturer Center vs. Merchant Center feeds - Manufacturer Center Help

  • It is possible for a manufacturer to participate in both Merchant Center and Manufacturer Center. Information provided via Manufacturer Center will not affect the listed retailers or the sale price of products advertised on Google via Merchant Center. If a manufacturer would like to advertise their products for sale, they can do so via Merchant Center.
  • Product information that is submitted may be displayed on Google.com, the Google Shopping tab, Google Now cards, and Google Express.
Beyond conventional SEO: Unravelling the mystery of the organic product carousel - Sear...

  • How to influence the organic product carousel: In Google’s blog post, they detailed three key inputs: structured data on your website, real-time product information provided via Merchant Center, and additional information provided through Manufacturer Center. This section of the article will explore Google’s guidance, along with some commentary on what I’ve noticed based on my own experiences.
  • Make sure your product markup is validated: The key here is to make sure the Product markup with structured data on your page adheres to Google’s guidelines and is validated.
  • Submit your product feed to Google via Merchant Center: This is where it starts to get interesting. By using Google’s Merchant Center, U.S. product feeds are now given the option to submit data via a new destination. The difference here for Google is that retailers are able to provide more up-to-date information about their products, rather than waiting for Google to crawl your site (which is what happens in step 1). Checking the box for “Surfaces across Google” grants access to your website’s product feed, allowing your products to be eligible in areas such as Search and Google Images. For the purpose of this study we are most interested in Search, with the Organic Product Carousel in mind. “Relevance” of information is the deciding factor for this feature. Google states that in order for this feature of Search to operate, you are not required to have a Google Ads campaign. Just create an account, then upload a product data feed.
    Commentary by PPC expert Kirk Williams: “Setting up a feed in Google Merchant Center has become even more simple over time since Google wants to guarantee that they have the right access, and that retailers can get products into ads! You do need to make sure you add all the business information and shipping/tax info at the account level, and then you can set up a feed fairly easily with your dev team, a third-party provider like Feedonomics, or with Google Sheets. As I note in my ‘Beginner’s Guide to Shopping Ads’, be aware that the feed can take up to 72 hours to process, and even longer to begin showing in SERPs. Patience is the key here if just creating a new Merchant Center… and make sure to stay up on those disapprovals as Google prefers a clean GMC account and will apply more aggressive product disapproval filters to accounts with more disapprovals.”
    For a client I’m working with, completing this step resulted in several of their products being added to the top 10 of the Popular Products carousel, one of which is in the top 5 and visible when the SERP first loads. This meant that, in this specific scenario, the product structured data that Google was regularly crawling and indexing in the US wasn’t enough on its own to be considered for the Organic Product Carousel. Note: the products that were added to the carousel were already considered “popular” – Google just hadn’t added them in. It is not guaranteed that your products will be added just because this step was completed; it really comes down to the prominence of your product and its relevance to the query (same as any other page that ranks).
  • ...2 more annotations...
  • 3. Create an additional feed via Manufacturer Center: The next step involves the use of Google’s Manufacturer Center. Again, this tool works in the same way as Merchant Center: you submit a feed, and can add additional information. This includes product descriptions, variants, and rich content, such as high-quality images and videos that can show within the Product Knowledge Panel. You’ll need to first verify your brand name within the Manufacturer Center dashboard, then you can proceed to uploading your product feed.
    When Google references the “Product Knowledge Panel” in their release, it’s not the same type of Knowledge Panel many in the SEO industry are accustomed to. This Product Knowledge Panel contains very different information compared to your standard KP, which is commonly powered by Wikipedia, and it appears in various capacities (based on how much data it has access to). Here’s what this Product Knowledge Panel looks like in its most refined state, completely populated with all the information that can be displayed:
    - Type #1 just shows the product image(s), the title and the review count.
    - Type #2 is an expansion of Type #1 with further product details, and another link to the reviews.
    - Type #3 is the more standard-looking Knowledge Panel, with the ability to share a link via an icon on the top right. This Product Knowledge Panel has a description and more of a breakdown of reviews, with the average rating. This is the evolved state where I tend to see Ads being placed within.
    - Type #4 is an expansion of Type #3, with the ability to filter through reviews and search the database with different keywords. This is especially useful functionality when assessing the source of the aggregated reviews.
    Based on my testing with a client in the U.S., adding the additional information via Manufacturer Center resulted in a new product getting added to a Popular Products carousel. This happened two weeks after submitting the feed, so there could still be further impact to come. I will likely wait longer and then test a different approach.
  • Quick recap:
    - Organic Product Carousel features are due to launch globally at the end of 2019.
    - The Popular Products and Best Products carousels are the features to keep an eye on.
    - Make sure your products have valid structured data, a product feed submitted through Merchant Center, and an additional feed via Manufacturer Center.
    - Watch out for cases where your client’s brand is given a low review score due to the data sources Google has access to.
    - Do your own testing. As Cindy Krum mentioned earlier, there are a lot of clicks between the Organic Product Carousel listings and your website’s product page.
    - Remember: there may be cases where it is not possible to get added to the carousel due to an overarching “prominence” factor. Seek out realistic opportunities.
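The product markup from step 1 is ordinary schema.org Product JSON-LD. A minimal sketch that would pass Google’s validator (all names, URLs, and numbers are invented placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Dive Watch",
  "image": "https://www.example.com/img/dive-watch.jpg",
  "description": "Hypothetical product used for illustration only.",
  "brand": { "@type": "Brand", "name": "ExampleBrand" },
  "offers": {
    "@type": "Offer",
    "price": "149.00",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock",
    "url": "https://www.example.com/products/dive-watch"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.5",
    "reviewCount": "120"
  }
}
</script>
```

The Merchant Center and Manufacturer Center feeds then supply fresher, richer versions of this same information without waiting on a crawl.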
Google and Review Snippets - GatherUp

  • 1. Reputation in your title tag: You have control over the meta title tag and can change it, as we outline above, to include your overall rating and review count. Especially if you are optimizing this title tag for a page dedicated to reviews, you can also let the user know you are displaying all of your reviews in one place, offering them a lot of data and efficiency. GatherUp customers would benefit here from showing all of their 1st- and 3rd-party reviews in the Review Widget.
    2. Reputation in your meta description
The Swiss Luxury-Watch Slump in the United States Is Over - Bloomberg

  • The three-year luxury-watch slump in the United States is over. Swiss luxury-watch sales in the U.S., Switzerland’s second-largest export market, jumped substantially in the first half of 2018 versus the same period in 2017, according to three indicators – two for wholesale sales, one for retail sales. The retail data came from the NPD Group, the market research company whose widely respected watch retail tracking service collects point-of-sale data from thousands of stores in the United States. "We're reporting that U.S. sales for watches above $1,000 are up 13.5% in value year-to-date," Reg Brack, NPD's watches and luxury industry analyst, told HODINKEE. Swiss watches dominate the market above $1,000.
  • The Swatch Group boasted that it had "the best first semester sales in the history of the group," CHF 4.27 billion, a 14.7% increase over the same period in 2017. The company reported a 66.5% jump in net income to CHF 468 million.
  • The main drivers of this year's boomlet, according to the FH, were Asian markets, mechanical watches, and relatively affordable steel watches.
  • ...4 more annotations...
  • Mechanical-watch exports grew by double-digit percentages in both volume and value. Unit exports increased 13.6% to 3.8 million pieces. In value, mechanicals rose 11.3% to CHF 8.14 billion. That amounts to 82% of total export sales by value. Exports of electronic watches rose 6.4% in value, but dropped 3.8% in units to 7.85 million, continuing a steady, five-year decline.
  • Watches with export prices in the CHF 500 to CHF 3,000 range showed the strongest growth, up 14.8% in volume and 16.9% in value. Overall, steel watch exports enjoyed a "steep rise," the FH said, up 500,000 units (it didn't give the total number).
  • While global Swiss watch sales this year are strongest in the $1,000 to $5,000 retail range, according to the FH, that's not the case in the U.S. Here watch sales are strongest at the very top of the price pyramid, according to NPD. Watches priced $5,000 and up accounted for nearly half the sales of the entire U.S. watch market in value. 
  • In general, Swiss brands that are less well known have difficulty competing in the U.S. market. That's particularly true in the $1,000 to $3,000 price range, Brack said. That price range is extremely competitive: "A lot of brands are struggling for [consumer] awareness." 
Frequently Asked Questions - LocalFalcon

  • Local Falcon displays Google My Business rankings for listings that show up either in the Maps portion of the organic search, or from a search in the Google Maps Local Finder (i.e. Google Maps).
  • If you wish to record a visual representation of the scan, you will need to save a screenshot before performing another scan. We will be adding saved screenshots as a new feature shortly.

- 0 views

  • 100s of backlinks from http://theglobe.net  and related spammy domains. Ignore or disavow?
    • jack_fox
       
      THS has this situation. According to John Mueller, it's nothing to worry about!
  • We already ignore links from sites like that, where there are unlikely to be natural links. No need to disavow :)
How to Hunt Down and Capture Featured Snippets for More Traffic in 2019

  • Target question-based keywords. Check if there is a featured snippet on the SERP and what type it is (paragraph, list, table, etc.) using a tool like Ahrefs. Keep your paragraphs and sentences fairly short. Answer the query as directly as possible. Structure your content with logical subheadings (H2, H3, H4, etc.). Use tables to display any data. Include the question within the answer if possible. Include a summary at the start or end of the content.
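That checklist translates fairly directly into page structure. A sketch of the HTML shape it implies (the heading, copy, and steps are invented for illustration):

```html
<h2>How do you capture a featured snippet?</h2>
<!-- Short, direct answer right under the matching subheading,
     restating the question within the answer -->
<p>To capture a featured snippet, target a question-based keyword and
   answer it directly in a short paragraph placed under a matching
   subheading.</p>

<h3>Steps</h3>
<ol>
  <li>Check the existing snippet type (paragraph, list, or table)</li>
  <li>Answer the query as directly as possible</li>
  <li>Mark up any data as a real &lt;table&gt;, not an image</li>
</ol>
```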
Meta Description Tag [2019 SEO] - Moz

  • One way to combat duplicate meta descriptions is to implement a dynamic and programmatic way to create unique meta descriptions for automated pages. If possible, though, there's no substitute for an original description that you write for each page.
  • If a page is targeting between one and three heavily searched terms or phrases, write your own meta description that targets those users performing search queries including those terms. If the page is targeting long-tail traffic (three or more keywords), it can sometimes be wiser to let the engines populate a meta description themselves. The reason is simple: when search engines pull together a meta description, they always display the keywords and surrounding phrases that the user has searched for.
  • One caveat to intentionally omitting meta description tags: keep in mind that social sharing sites like Facebook commonly use a page’s meta description tag as the description that appears when the page is shared on their sites. Without the meta description tag, social sharing sites may just use the first text they can find.
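A hand-written description for a page targeting a few head terms might look like this (the title and copy are invented for illustration):

```html
<head>
  <title>Blue Widgets – Example Store</title>
  <!-- Roughly 150–160 characters; search engines may show this in the
       SERP, and social sharing sites often reuse it as the share text -->
  <meta name="description"
        content="Shop durable blue widgets with free shipping. Compare sizes, read reviews, and find the right widget for your project.">
</head>
```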
The Ultimate Web Server Security Guide @ MyThemeShop

  • They could insert links into the site to boost their SEO rankings. Hackers can make a killing selling links from exploited sites. Alternatively, a hacker could deface the site and demand money to restore it (ransom). They could even place ads on the site and use the traffic to make money. In most cases, an attacker will also install backdoors into the server. These are deliberate security holes that allow them to come back and exploit the site in the future – even if the insecure plugin has been replaced.
  • Unfortunately, under WordPress, every plugin and theme has the ability to alter anything on the site. They can even be exploited to infect other apps and sites hosted on the same machine.
  • Theme developers are often relatively inexperienced coders. Usually, they’re professional graphic artists who have taught themselves a little PHP on the side. Plugins are another popular line of attack – they account for 22% of successful hacks. Put together, themes and plugins are a major source of security trouble.
  • ...102 more annotations...
  • Each person who uses your system should only have the privileges they need to perform their tasks.
  • Don’t depend on a single security measure to keep your server safe. You need multiple rings of defense.
  • Security exploits exist at all levels of the technology stack, from the hardware up. WP White Security revealed that 41% of WordPress sites are hacked through a weakness in the web host.
  • While it’s important to use a strong password, password cracking is not a primary focus for hackers.
  • the more software you have installed on your machine, the easier it is to hack – even if you aren’t using the programs! Clearly, programs that are designed to destroy your system are dangerous. But even innocent software can be used in an attack.
  • There are 3 ways to reduce the attack surface:
    1. Run fewer processes
    2. Uninstall programs you don’t need
    3. Build a system from scratch that only has the processes you need
  • A really good authentication system uses multiple tests. Someone could steal or guess your password. They could grab your laptop with its cryptographic keys.
  • If you want to run multiple processes at the same time, you need some way of managing them. This is basically what a kernel is. It does more than that – it handles all of the complex details of the computer hardware, too, and it runs the computer’s networking capabilities.
  • programs exist as files when they are not running in memory
  • SELinux’s default response is to deny any request.
  • SELinux is extremely comprehensive, but this power comes at a price. It’s difficult to learn, complex to set up, and time-consuming to maintain.
  • AppArmor is an example of a MAC tool, although it’s nowhere near as comprehensive as SELinux. It applies rules to programs to limit what they can do.
  • AppArmor is relatively easy to set up, but it does require you to configure each application and program one by one. This puts the onus for security in the hands of the user or sysadmin. Often, when new apps are added, users forget to configure AppArmor. Or they do a horrible job and lock themselves out, so their only option is to disable the profile. That said, several distributions have adopted AppArmor.
  • Generic profiles shipped by repo teams are designed to cover a wide range of different use cases, so they tend to be fairly loose. Your specific use cases are usually more specific. In this case, it pays to fine-tune the settings, making them more restrictive.
  • GRSecurity is a suite of security enhancements
  • In the future, this could become a viable option. For now, we’ll use Ubuntu and AppArmor.
  • Apache is a user-facing service – it’s how your users interact with your website. It’s important to control this interaction too.
  • If your Apache configuration is bad, these files can be viewed as plain text. All of your code will be visible for anyone to see – this potentially includes your database credentials, cryptographic keys, and salts.
  • You can configure Apache to refuse any requests for these essential directories using .htaccess files. These are folder-level configuration files that Apache reads before it replies to a request.
  • The primary use for .htaccess files is to control access
  • If an attacker knows your WordPress cryptographic salts, they can use fake cookies to trick WordPress into thinking they have logged on already.
  • If the hacker has physical access to the computer, they have many options at their disposal. They can type commands through the keyboard, or insert a disk or USB stick into the machine and launch an attack that way.
  • When it comes to network-based attacks, attackers have to reach through one of the machine’s network ports.
  • For an attacker to exploit a system, they have to communicate to a process that’s listening on a port. Otherwise, they’d simply be sending messages that are ignored. This is why you should only run processes that you need for your site to run. Anything else is a security risk.
  • Often, ports are occupied by processes that provide no real valuable service to the machine’s legitimate users. This tends to happen when you install a large distribution designed for multiple uses. Large distros include software that is useless to you in terms of running a website. So the best strategy is to start with a very lightweight distro and add the components you need.
  • If you see any unnecessary processes, you can shut them down manually. Better yet, if the process is completely unnecessary, you can remove it from your system.
  • Firewalls are quite similar to access control within the computer. They operate on a network level, and you can use them to enforce security policies. A firewall can prevent processes from broadcasting information from a port. It can stop outside users from sending data to a port. And it can enforce more complex rules.
  • Simply installing and running a firewall does not make your host machine secure – it’s just one layer in the security cake. But it’s a vital and a powerful one.
  • First of all, we need to configure our software to resist common attacks. But that can only protect us from attacks we know about. Access control software, such as AppArmor, can drastically limit the damage caused by unauthorized access. But you still need to know an attack is in progress.
  • This is where Network Intrusion Detection Software (NIDS) is essential. It scans the incoming network traffic, looking for unusual patterns or signs of a known attack. If it sees anything suspicious, it logs an alert.
  • It’s up to you to review these logs and act on them.
  • If it’s a false alarm, you should tune your NIDS software to ignore it. If it’s an ineffective attack, you should review your security and block the attacker through the firewall.
  • That’s why it’s essential to have an automated backup system. Finally, you need to understand how the attack succeeded, so you can prevent it from recurring. You may have to change some settings on your Firewall, tighten your access rules, adjust your Apache configuration, and change settings in your wp-config file. None of this would be possible without detailed logs describing the attack.
  • Every web server has a breaking point and dedicated DOS attackers are willing to increase the load until your server buckles. Good firewalls offer some level of protection against naive DOS attacks
  • a tiny number of sites (less than 1%) are hacked through the WordPress core files
  • Major DNS attacks have taken down some of the biggest sites in the world – including eBay and PayPal. Large hosting companies like HostGator and Bluehost have been attacked. It’s a serious risk!
  • Right now, due to the way the web currently works, it’s impossible to download a web page without the IP address of a server. In the future, technologies like IPFS and MaidSafe could change that.
  • So there are 2 benefits to using a CDN. The first is that your content gets to your readers fast. The second benefit is server anonymity – nobody knows your real IP address – including the psychos. This makes it pretty impossible to attack your server – nobody can attack a server without an IP address.
  • When CDNs discover a DDOS attack, they have their own ways to deal with it. They often display a very lightweight “are you human?” message with a captcha. This tactic reduces the bandwidth costs and screens out the automated attacks.
  • If any of your DNS records point to your actual server, then it’s easy to find it and attack it. This includes A records (aliases) and MX records (mail exchange). You should also use a separate mail server machine to send your emails. Otherwise, your email headers will expose your real email address.
  • If your hosting company refuses to give you a new IP address, it may be time to find a new service provider.
  • WordPress uses one-way hashing to store passwords in the database. It doesn’t store the actual password – instead, it stores a hashed version. If someone steals your database tables, they won’t have the actual passwords.
  • If you used a simple hash function, a hacker could gain privileged access to your app in a short period of time.
  • The salt strings are stored in your site’s wp-config.php file.
  • Salts dramatically increase the time it would take to get a password out of a hash code – instead of taking a few weeks, it would take millions of years
  • You keep the other key (the decryption key) to yourself. If anyone stole it, they could decode your private messages! These 2-key cryptographic functions do exist. They are the basis of TLS (https) and SSH.
  • the most secure systems tend to be the simplest. The absolute secure machine would be one that was switched off.
  • For WordPress sites, you also need PHP and a database.
  • A VM is an emulated computer system running inside a real computer (the host). It contains its own operating system and resources, such as storage, and memory. The VM could run a completely different operating system from the host system – you could run OSX in a VM hosted on your Windows machine
  • This isolation offers a degree of protection. Let’s imagine your VM gets infected with a particularly nasty virus – the VM’s file system could be completely destroyed, or the data could be hopelessly corrupted. But the damage is limited to the VM itself. The host environment would remain safe.
  • This is how shared hosting and virtual private servers (VPSes) work today. Each customer has access to their own self-contained environment, within a virtual machine.
  • VMs are not just for hosting companies. If you’re hosting multiple sites on a dedicated server or a VPS, VMs can help to make your server more secure. Each site can live inside its own VM. That way, if one server is hacked, the rest of your sites are safe.
  • Even with all these considerations, the benefits of VMs outweigh their drawbacks. But performance is vital on the web.
  • Containers (like Docker) are very similar to VMs.
  • Because we’ve cut the hypervisor out of the loop, applications run much faster – almost as fast as processes in the host environment. Keeping each container separate does involve some computation by the container software. But it’s much lighter than the work required by a hypervisor!
  • Docker Cloud is a web-based service that automates the task for you. It integrates smoothly with the most popular cloud hosting platforms (such as Amazon Web Services, or Digital Ocean).
  • With containers, you can guarantee that the developer’s environment is exactly the same as the live server. Before the developer writes a single line of code, they can download the container to their computer. If the code works on their PC, it will work on the live server. This is a huge benefit of using containers, and it’s a major reason for their popularity.
  • A complete stack of these layers is called an “image”
  • The core of Docker is the Docker Engine – which runs as a daemon (a long-running background process)
  • another great resource – the Docker Hub. The hub is an online directory of community-made images you can download and use in your own projects. These include Linux distributions, utilities, and complete applications.
  • Docker has established a relationship with the teams behind popular open source projects (including WordPress) – these partners have built official images that you can download and use as-is.
  • when you finish developing your code, you should wrap it up inside a complete container image. The goal is to put all the code that runs your site inside a container and store the volatile data in a volume.
  • Although Docker can help to make your site more secure, there are a few major issues you need to understand: the Docker daemon runs as a superuser; it’s possible to load the entire filesystem into a container; and it’s possible to pass a reference to the Docker daemon into a container.
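The filesystem risk is easy to illustrate: anyone allowed to talk to the Docker daemon can mount the host's root filesystem into a container. This is a deliberately dangerous sketch – don't run it on a production box:

```shell
# Mounts the host's entire filesystem at /host inside the container.
# Because the daemon runs as root, the shell you get can read and
# write any file on the host -- effectively a root shell.
docker run -it -v /:/host ubuntu bash
```

This is why access to the Docker daemon must be guarded as carefully as root access itself.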
  • The solution to this issue is to use a MAC solution like SELinux, GRSecurity or AppArmor.
  • Never let anyone trick you into running a strange docker command.
  • only download and use Docker images from a trustworthy source. Official images for popular projects are security-audited by the Docker team. Community images are not.
  • there are the core WordPress files. These interact with the web server through the PHP runtime. WordPress also relies on the file system and a database server.
  • A service is some software component that listens for requests (over a protocol) and does something when it receives those requests.
  • Using Docker, you could install WordPress, Apache, and PHP in one container, and run MySQL from another. These containers could run on the same physical machine, or on different ones
  • The database service container can be configured to only accept connections that originate from the web container. This immediately removes the threat of external attacks against your database server
  • This gives you the perfect opportunity to remove high-risk software from your host machine, including: language runtimes and interpreters (such as PHP, Ruby, Python, etc.), web servers, databases, and mail servers.
  • If a new version of MySQL is released, you can update the database container without touching the web container. Likewise, if PHP or Apache are updated, you can update the web container and leave the database container alone.
  • Because Docker makes it easy to connect these containers together, there’s no reason to lump all your software inside a single container. In fact, it’s a bad practice – it increases the security risk for any single container, and it makes it harder to manage them.
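A minimal docker-compose sketch of this two-container split (service names, credentials, and the network name are placeholders; the `wordpress` and `mysql` images and their environment variables are the official ones):

```yaml
version: "3"
services:
  web:
    image: wordpress            # official image: WordPress + Apache + PHP
    ports:
      - "80:80"                 # only the web container is published
    environment:
      WORDPRESS_DB_HOST: db
      WORDPRESS_DB_USER: wp_user
      WORDPRESS_DB_PASSWORD: change-me
      WORDPRESS_DB_NAME: wordpress
    networks:
      - backend
  db:
    image: mysql:5.7            # the database lives in its own container
    environment:
      MYSQL_DATABASE: wordpress
      MYSQL_USER: wp_user
      MYSQL_PASSWORD: change-me
      MYSQL_RANDOM_ROOT_PASSWORD: "1"
    networks:
      - backend                 # reachable only from services on this network
networks:
  backend:
```

Because `db` publishes no ports, it is unreachable from the internet; only the web container can talk to it over the private network.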
  • If your site is already live on an existing server, the best approach is to set up a new host machine and then migrate over to it. Here are the steps you need to take:
  • With a minimal Ubuntu installation, you have a fairly bare-bones server. You also have the benefit of a huge repository of software you can install if you want.
  • If access control is like a lock protecting a building, intrusion detection is the security alarm that rings after someone breaks in.
  • Logging on to your host with a superuser account is a bad practice. It’s easy to accidentally break something.
  • Fail2ban blocks SSH users who fail the login process multiple times. You can also set it up to detect and block hack attempts over HTTP – this will catch hackers who attempt to probe your site for weaknesses.
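A sketch of the SSH jail in Fail2ban's configuration (the thresholds are illustrative – tune them for your server):

```ini
# /etc/fail2ban/jail.local
# Ban an IP for one hour after 5 failed SSH logins within 10 minutes.
[sshd]
enabled  = true
maxretry = 5
findtime = 600
bantime  = 3600
```

Similar jails can be defined against web-server logs to catch HTTP probing.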
  • With multiple WordPress sites on your machine, you have 2 choices. You could create a new database container for each, or you could reuse the same container between them. Sharing the DB container is a little riskier, as a hacker could, theoretically, ruin all your sites with one attack. You can minimize that risk by: using a custom root user and password for your database (don’t use the default username of ‘root’); ensuring the db container is not accessible over the internet (hide it away inside a docker network); and creating new databases and users for each WordPress site, ensuring each user only has permissions for their specific database.
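The per-site isolation might look like this in MySQL (database name, user name, and password are placeholders):

```sql
CREATE DATABASE site_one;
CREATE USER 'site_one_user'@'%' IDENTIFIED BY 'a-long-unique-password';
-- This account can touch site_one's tables and nothing else.
GRANT ALL PRIVILEGES ON site_one.* TO 'site_one_user'@'%';
FLUSH PRIVILEGES;
```

Repeating this per site means a compromised site's credentials cannot read or alter its neighbours' data.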
  • What are the benefits of using a single database container? It’s easier to configure and scale. It’s easier to backup and recover your data. It’s a little lighter on resources.
  • you could also add a caching container, like Varnish. Varnish caches your content so it can serve pages quickly – much faster than WordPress can
  • Docker has the ability to limit how much processor time and memory each container gets. This protects you against exhaustion DOS attacks
  • A containerized process still has some of the abilities of root, making it more powerful than a regular user. But it’s not as bad as full-on root privileges. With AppArmor, you can tighten the security further, preventing the process from accessing any parts of the system that do not relate to serving your website.
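Putting the resource limits and the tightened privileges together, a hardened `docker run` invocation might look something like this (the AppArmor profile name `wordpress-profile` and the limit values are placeholders):

```shell
# Strip all root-like capabilities except binding to low ports,
# apply a custom AppArmor profile (name is hypothetical), and cap
# memory/CPU so one container can't starve the host (anti-DoS).
docker run -d \
  --cap-drop=ALL \
  --cap-add=NET_BIND_SERVICE \
  --security-opt apparmor=wordpress-profile \
  --memory=512m \
  --cpus=1.0 \
  wordpress
```

`--cap-drop`, `--cap-add`, `--security-opt`, `--memory`, and `--cpus` are standard `docker run` flags.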
  • Docker Hub works like GitHub – you can upload and download images for free. The downside is that there’s no security auditing. So it’s easy to download a trojan horse inside a container.
  • Official images (such as WordPress and Apache) are audited by the Docker team. These are safe. Community images (which have names like user/myapp) are not audited.
  • a kernel exploit executed inside a container will affect the entire system. The only way to protect against kernel exploits is to regularly update the host system
  • Containers run in isolation from the rest of the system. That does not mean you can neglect security – your website lives inside these containers! Even if a hacker cannot access the full system from a container, they can still damage the container’s contents.
  • Under Ubuntu, AppArmor already protects you – to a degree. The Docker daemon has an AppArmor profile, and each container runs under a default AppArmor profile. The default profile prevents an app from breaking out of the container, and restricts it from doing things that would harm the system as a whole. However, the default profile offers no specific protection against WordPress specific attacks. We can fix this by creating a custom profile for your WordPress container.
  • The net effect is that it’s impossible to install malware, themes or plugins through the web interface. We’ve already covered this to some degree with the .htaccess rules and directory permissions. Now we’re enforcing it through the Linux kernel.
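A belt-and-braces control exists inside WordPress itself: one constant in wp-config.php disables all file modifications through the admin UI (shown here as an optional extra, separate from the kernel-level enforcement the article describes):

```php
// In wp-config.php: removes the plugin/theme editor and the
// install/update screens, so nothing can be installed via wp-admin.
define( 'DISALLOW_FILE_MODS', true );
```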
  • There are versions of Docker for Mac and PC, so you’ll be able to run your site from your home machine. If the code works on your PC, it will also work on the server.
  • Tripwire tends to complain about the entries in the /proc filespace, which are auto-generated by the Linux kernel. These files contain information about running processes, and they tend to change rapidly while Linux runs your system. We don’t want to ignore the directory entirely, as it provides useful signs that an attack is in progress. So we’re going to have to update the policy to focus on the files we are interested in.
  • Now we should install an e-mail notification utility – to warn us if anything changes on the system. This will enable us to respond quickly if our system is compromised (depending on how often you check your emails).
  • Rootkits are malicious code that hackers install onto your machine. When they manage to get one on your server, it gives them elevated access to your system
  • Tripwire is configured to search in key areas. It’s good at detecting newly installed software, malicious sockets, and other signs of a compromised system. RKHunter looks in less obvious places, and it checks the contents of files to see if they contain known malicious code. RKHunter is supported by a community of security experts who keep it updated with known malware signatures – just like antivirus software for PCs.
  • If your hosting company offers the option, this would be a good point to make an image of your server. Most cloud hosting companies offer tools to do this.
  • With an image, it’s easy to launch new servers or recover the old one if things go horribly wrong.
  • We’ve hidden our server from the world while making it easy to read our content. We’ve built a firewall to block malicious traffic. We’ve trapped our web server inside a container where it can’t do any harm. We’ve strengthened Linux’s access control model to prevent processes from going rogue. We’ve added an intrusion detection system to identify corrupted files and processes. We’ve added a rootkit scanner. We’ve strengthened our WordPress installation with 2-factor authentication. We’ve disabled the ability for any malicious user to install poisoned themes or plugins.
  • Make a routine of checking the logs (or emails if you configured email reporting). It’s vital to act quickly if you see any warnings. If they’re false warnings, edit the configuration. Don’t get into a habit of ignoring the reports.
  • Virtually everything that happens on a Linux machine is logged.
  • You have to make a habit of checking for new exploits and learn how to protect yourself against them. Regularly check for security patches and issues in the core WordPress app: WordPress Security Notices Also, check regularly on the forums or mailing lists for the plugins and themes you use on your site.
  • We haven’t added a network-level intrusion detection service – you can fix that by installing Snort or PSAD.
  • The only way to guarantee your safety is to constantly update your security tactics and never get complacent.

SEO Spider Tabs | Screaming Frog - 0 views

  • Hreflang URLs must be crawlable and indexable; therefore non-200 URLs are treated as errors and ignored by the search engines.
  • This is optional, and not necessarily an error or issue.
  • Unsafe Cross-Origin Links – This shows any pages that link to external websites using the target=”_blank” attribute (to open in a new tab), without using rel=”noopener” (or rel=”noreferrer”) at the same time. Using target=”_blank” alone leaves those pages exposed to both security and performance issues.
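A before/after sketch of the link markup in question (the URL is a placeholder):

```html
<!-- Exposed: the opened page gets a window.opener handle back to this one -->
<a href="https://example.com/page" target="_blank">External link</a>

<!-- Safe: rel="noopener" severs that handle (noreferrer implies it too) -->
<a href="https://example.com/page" target="_blank" rel="noopener noreferrer">External link</a>
```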
  • Protocol-Relative Resource Links – This filter will show any pages that load resources such as images, JavaScript and CSS using protocol-relative links.
  • this technique is now an anti-pattern with HTTPS everywhere, and can expose some sites to ‘man in the middle’ compromises and performance issues.
  • Missing HSTS Header
  • The HTTP Strict-Transport-Security response header (HSTS) instructs browsers that it should only be accessed using HTTPS, rather than HTTP. If a website accepts a connection to HTTP, before being redirected to HTTPS, visitors will initially still communicate over HTTP.
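As a sketch, the header can be added in nginx with a single directive (one-year max-age; `includeSubDomains` is optional and affects every subdomain, so check them all serve HTTPS first):

```nginx
# Tell browsers to use HTTPS only, for the next year, on this host
# and all of its subdomains.
add_header Strict-Transport-Security "max-age=31536000; includeSubDomains" always;
```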
  • Missing Content-Security-Policy Header
  • The SEO Spider only checks for existence of the header, and does not interrogate the policies found within the header to determine whether they are well set-up for the website. This should be performed manually.
  • To minimise these security issues, the X-Content-Type-Options response header should be supplied and set to ‘nosniff’. This instructs browsers to rely only on the Content-Type header and block anything that does not match accurately. This also means the content-type set needs to be accurate.
  • Missing X-Content-Type-Options Header
  • Missing X-Frame-Options Header
  • This helps avoid ‘click-jacking’ attacks, where your content is displayed on another web page that is controlled by an attacker.
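A hedged nginx sketch of these response headers (the CSP value is a deliberately strict placeholder – a real policy must be tailored to the site's own scripts and styles):

```nginx
add_header X-Content-Type-Options "nosniff" always;             # trust the declared MIME type only
add_header X-Frame-Options "SAMEORIGIN" always;                 # block click-jacking via third-party frames
add_header Content-Security-Policy "default-src 'self'" always; # placeholder policy -- tune per site
```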
  • Bad Content Type – This shows any URLs where the actual content type does not match the content type set in the header. It also identifies any invalid MIME types used.
  • To discover mixed content issues, where HTTPS pages load insecure elements such as images, CSS and JavaScript we recommend using the ‘Insecure Content‘ report under the ‘Reports’ top level menu.

The State of Local SEO: Experts Weigh in on Industry-Specific Tactics - Moz - 0 views

  • Our financial client created COVID landing pages for both personal and business accounts. This client saw a 95% increase in organic goal completions from February to March. There was also a 97% increase in organic goal completions YoY. Google posts that focused on coronavirus-related services and products have also performed well.
  • Figure out the best method for earning reviews. Test email, texting, and in-person requests from your team, physical cards with a bit.ly link, etc. Test each one for a few months, then switch to a different method. Test until you find the method that works best for your customers.  The other thing that really needs to be considered is how to get customers to write about the specific services they used when working with your company. Little prompts or questions that they could answer when you reach out will help customers write better reviews.
  • Home Services
  • Financial Services My number one tactic for reviews has always been to have an actual person ask for a review during key points in the customer journey. For example, an associate that helps someone open a checking account
  • Most home service businesses should not be displaying their address since they are a Service Area Business, but this doesn’t stop some from keeping their address up to rank in that city.  Google does tend to prioritize proximity in the home services industry, unfortunately. 
  • Reviews should definitely be a bigger factor than proximity for financial institutions.
  • With digital banking and the amount of trust we put into financial organizations, proximity isn’t a major factor when considering a financial service provider, but Google results don’t reflect that. 
  • Paragraph, table, and carousel featured snippets are typically the types that we see financial websites achieving most often.
  • I believe that featured snippets will become more and more regionally specific. If you do a search for “new water heater cost” you see a featured snippet for Home Advisor. If a company that is local to me published content around the cost and installation, why wouldn’t Google serve that snippet to me instead of what is shown nationally?
  • Review strategies should include offline tactics. Community outreach and involvement are crucial. I would argue that anyone who is consulting about online reputation management should focus on the company’s reputation offline as well.

Google logo schema markup now requires logos to look good on white backgrounds - 0 views

  • “make sure the image looks how you intend it to look on a purely white background (for example, if the logo is mostly white or gray, it may not look how you want it to look when displayed on a white background).”
  • It can show in the knowledge panel, and possibly in Top Stories and other areas
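The logo markup itself is a small JSON-LD block placed in a `<script type="application/ld+json">` tag (the URLs below are placeholders); the new requirement concerns the image file the `logo` property points to, which should be legible on a pure white background:

```json
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "url": "https://www.example.com",
  "logo": "https://www.example.com/images/logo.png"
}
```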