
DISC Inc / Group items tagged: blogs


Rob Laporte

Google Openly Profiles SEOs As Criminals - 2 views

  • If we can stop talking about nofollow and PageRank sculpting for a second, maybe we can openly talk about the bigger story of last week’s SMX Advanced. The one that has to do with Matt Cutts taking the stage during the You&A and openly stating that Google profiles SEOs like common criminals. I was naïve in my youth. I’d read blog posts that accused Google of “having it out” for SEOs and laugh. There’d be rants about how Google was stricter on sites that were clearly touched by an SEO and how SEOs were dumb for “self-identifying” with attributes like nofollow. At the time, I thought these people were insane. Now I know they were right. Google does profile SEOs. They’re identified as “high risk” and so are all of their associated projects.
  •  
    Interesting...further strengthens the position that "content is King" and we should continue to encourage clients in that direction. Value to the audience first, play nice with the search engines second.
Rob Laporte

Rand Fishkin | SEO Blog - 0 views

  • Why Doesn’t Rand Fishkin Say the Words? October 2, 2009 by Roger · 2 Comments. Filed under: SEO General. There’s a very informative video on SEOmoz’s Whiteboard Friday about link volume versus link quality. At about the 5:00 mark you can see Rand Fishkin holding himself back, trying not to say the B word … “buy links”. He does say barter. Does that mean exchange links for money? I guess it could. The sad truth is that if you are in a very competitive market like travel, car hire, or hotels, and you aren’t a top 200 brand, the only way you are going to get on the front page of Google is to BUY LINKS. Cheap hotels Sydney is an example of the sort of search term you would probably need to buy links for. $1000 to $2000 per month for some quality links should do the trick, which is still cheap compared to other forms of mass media, and I do see Google as a form of mass media. Yep, buy links. But that’s blackhat, you say, and Google doesn’t like it – I can hear some people say. It seems it’s OK to buy links if Google gets the cash via their Adwords money machine, but if you get caught selling or buying links, then watch out. Ever wondered why Google uses a very pale yellow background on their Adwords ads? Why not red or blue, or even a muted grey? You know the answer, don’t you? I suspect over 30% of the market don’t even know the difference between Adwords ads and organic links. What number do you believe? And if you believe the white-hat nonsense about not buying links, you will still be spending time and/or money on article marketing, press release submissions, forum signatures, link exchanges, and other link-building methods.
Rob Laporte

NoFollow | Big Oak SEO Blog - 0 views

  • And while the business networking aspect is great, I’m writing to tell you it can be useful for your SEO efforts too, specifically link building. You may not know this, but LinkedIn does not employ the nofollow attribute on its links, unlike most other social networking sites. So that means we can use LinkedIn responsibly to build some nice one-way links to our sites and blogs. Even better, your employees can use this to build some SEO-friendly links to your company site.
  • So the days of parsing links onto high PageRank Flickr pages are over. Or are they? No. Let’s examine, in list form, how you can use the remaining scraps of link juice from Flickr in your SEO campaigns.
    1. Flickr has not added nofollow to discussion boards. For those of you who liked to scout out high PageRank pages and just drop your link as a comment to the photo, which could be accomplished easily if you owned a link-laundering website, you can still do this in the Flickr group discussion boards. Flickr has not yet added nofollow tags to those, and given the preponderance of discussions that revolve around people sharing photos, you can just as easily drop relevant external links in the discussion and reap link juice benefits.
    2. Flickr has not added nofollow to personal profile pages. If you have a personal profile page, you can place targeted anchor text on it, point links at it, and receive full SEO benefit as it gains PageRank.
    3. Flickr has not added nofollow to group pages. If you own a Flickr group, you can still put as many links as you wish on the main group page without fear of them being turned into nofollow.
    Many Flickr personal profile and group pages gain toolbar PR just by having the link spread around in-house, so it’s not that hard to make those pages accumulate PR. Google seems to be very generous in that regard. There’s a lot of PR to be passed around through Flickr, apparently. So, the glory days of Flickr SEO may be over (unless Yahoo does the improbable and flips the switch back), but Rome didn’t burn to rubble in a day, so we might as well make the most of Flickr before it completely collapses.
Rob Laporte

Limit Anchor Text Links To 55 Characters In Length? | Hobo - 0 views

  •  
    Limit Anchor Text Links To 55 Characters In Length? Blurb by Shaun, Building Links. As an SEO I wanted to know: how many words or characters does Google count in a link? What’s best practice when creating links – internal or external? What is the optimal length of an HTML link? It appears the answer to the question “how many words in a text link” is 55 characters, about 8-10 words. Why is this important to know?
    1. You get to understand how many words Google will count as part of a link.
    2. You can see why you should keep titles to a maximum number of characters.
    3. You can see why your domain name should be short and why URLs should be snappy.
    4. You can see why you should rewrite your URLs (SEF).
    5. It’s especially useful when thinking about linking internally, via body text on a page.
    I wanted to see how many words Google will count in one ‘link’ to pass on anchor text power to another page, so I did a test a bit like this one below:
    1. Pointed some nonsense words in one massive link, 50 words long, at the home page of a ‘trusted’ site.
    2. Each of the nonsense words was 6 characters long.
    3. Then I did a search for something generic that the site would rank no. 1 for, and added the nonsense words to the search, so that the famous “this word only appears in links to the site” (paraphrase) kicked in.
    4. This, I surmised, would let me see how many of the nonsense words Google would attribute to the target page from the massive 50-word link I tried to get it to swallow.
    The answer was…
    1. Google counted 8 words in the anchor text link out of a possible 50.
    2. It seemed to ignore everything after the 8th word.
    3. 8 words x 6 characters = 48 characters + 7 spaces = a nice round and easy to remember number – 55 characters.
    So a possible best practice for the number of words in an anchor text might be to keep a link under 8 words and, importantly, under 55 characters, because everything after that is ignored.
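    A minimal sketch based on the single experiment above: it flags anchor texts longer than roughly 8 words or 55 characters. The thresholds and function names are illustrative assumptions drawn from that test, not an official Google limit.

```python
# Rough check inspired by the Hobo test above: flag anchor texts that exceed
# ~55 characters (about 8 words), the point past which the test suggests
# extra anchor words may be ignored. Thresholds are assumptions, not rules.

MAX_ANCHOR_CHARS = 55
MAX_ANCHOR_WORDS = 8

def check_anchor(anchor_text: str) -> list[str]:
    """Return a list of warnings for an anchor text string (empty = OK)."""
    warnings = []
    if len(anchor_text) > MAX_ANCHOR_CHARS:
        warnings.append(f"{len(anchor_text)} chars (> {MAX_ANCHOR_CHARS})")
    if len(anchor_text.split()) > MAX_ANCHOR_WORDS:
        warnings.append(f"{len(anchor_text.split())} words (> {MAX_ANCHOR_WORDS})")
    return warnings

if __name__ == "__main__":
    samples = [
        "cheap hotels sydney",
        "the complete ultimate guide to everything about cheap hotels in sydney australia",
    ]
    for anchor in samples:
        print(anchor, "->", check_anchor(anchor) or "OK")
```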
Rob Laporte

Official Google Webmaster Central Blog: Using site speed in web search ranking - 0 views

  • If you are a site owner, webmaster or a web author, here are some free tools that you can use to evaluate the speed of your site:
    - Page Speed, an open source Firefox/Firebug add-on that evaluates the performance of web pages and gives suggestions for improvement.
    - YSlow, a free tool from Yahoo! that suggests ways to improve website speed.
    - WebPagetest, which shows a waterfall view of your pages’ load performance plus an optimization checklist.
    - In Webmaster Tools, Labs > Site Performance shows the speed of your website as experienced by users around the world.
    - Many other tools on code.google.com/speed.
    We’ve also blogged about site performance. While site speed is a new signal, it doesn’t carry as much weight as the relevance of a page. Currently, fewer than 1% of search queries are affected by the site speed signal in our implementation, and the signal for site speed only applies for visitors searching in English on Google.com at this point. We launched this change a few weeks back after rigorous testing. If you haven’t seen much change to your site rankings, then this site speed change possibly did not impact your site.
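    Alongside the tools listed above, a quick sanity check can be scripted. A minimal sketch using the requests library; it only measures time-to-download from one location, so it is no substitute for Page Speed, YSlow or WebPagetest, and the URL is a placeholder.

```python
# Rough response-time check with the requests library (pip install requests).
# This is only a crude, single-location measurement -- the tools named above
# give far more useful diagnostics. The URL below is a placeholder.
import time
import requests

def average_fetch_time(url: str, runs: int = 3) -> float:
    """Average seconds to fetch the page body over a few runs."""
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        resp = requests.get(url, timeout=30)
        resp.raise_for_status()
        timings.append(time.perf_counter() - start)
    return sum(timings) / len(timings)

if __name__ == "__main__":
    print(f"Average fetch time: {average_fetch_time('https://www.example.com/'):.2f}s")
```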
Rob Laporte

Google Confirms RSS For Web Search Results - 0 views

  • Oct 8, 2008 at 3:02pm Eastern, by Matt McGee. Google Confirms RSS For Web Search Results. Google has confirmed for Search Engine Land that they’ll soon start offering RSS feeds for web search results. When it happens, the RSS feeds will be an extension of Google Alerts, which currently only allow notification by email. The addition of RSS alerts was first picked up by Amit Agarwal, who found it mentioned in an October 1st Wall Street Journal article where author Katherine Boehret wrote, “In about a month, Google will begin delivering these alerts to users via feeds, as well as emails.” In an email today, a Google spokesperson told us: “While I can’t be more specific about an ETA, I can confirm the launch.” Google Alerts offers email-only notifications on results from News, Web, Blogs, Video and Groups. Google is currently the only major search engine not offering RSS feeds of web search results.
Rob Laporte

Introducing the NEW Ahrefs' Domain Rating (and how to use it) - 0 views

  • Does Google use anything like Domain Rating in their algorithm? If we refer to official statements, Google’s John Mueller stated that Google does not have a “website authority score”: “We don’t have anything like a website authority score.” They consistently educated the SEO community that they calculate scores for actual pages, not entire domains (hence PageRank). But prior to that statement, John Mueller said something else: “There are some things where we do look at a website overall though.” So is this a “yes” or is this a “no”? Well, here at Ahrefs we have a firm belief that “website authority” doesn’t exist as an isolated ranking factor.
jack_fox

How long should a post be? * Ideal blog post length * Yoast - 0 views

  • Your blog post should always contain more than 300 words; otherwise, it will have too few words to rank in the search engines.
jack_fox

Identifying Knowledge Graph Entities in Google Images Results - 0 views

  • When you search for an image on mobile in the U.S., you might see information from the Knowledge Graph related to the result. That information would include people, places or things related to the image from the Knowledge Graph’s database of billions of facts
jack_fox

A WordPress safety plan for SEOs and developers - Search Engine Land - 0 views

  • Block ads to prevent sophisticated attacks that masquerade as images. Use a VPN for end-to-end encryption whenever you’re working at public WiFi hotspots, to prevent session hijacking and MITM attacks.
Rob Laporte

How to Hunt Down and Capture Featured Snippets for More Traffic in 2019 - 0 views

  • According to a study of 2 million featured snippets by Ahrefs, 8.6% of all clicks go to the featured snippet.
  • The most popular types of snippets, according to this study by SEMrush of 10 million SERPs, are: paragraphs (53.2%), lists (35.5%), and tables (11.5%).
Rob Laporte

Google's December 2020 Core Update Themes - 0 views

  • The data and overall consensus point to Google’s December 2020 Core Update being one of the more impactful algorithm adjustments to hit the SERP over the past year or so.
  • I prefer to look at core updates almost from a pure content and UX perspective. For me, it’s about the specific pages Google swaps out more than it is a per domain analysis.
  • I am performing a qualitative analysis
  • ...19 more annotations...
  • I am not making any sort of definitive statements
  • What moves me, however, is when I look at 100 keywords I start seeing the same content-oriented theme arise again and again.
  • What I’m trying to say, and as you’ll see in the examples I will get into later, is that the content that was more focused on the specific topic mentioned in the query did better. So while the "ultimate guide” here did get to the topic the query deals with, it was not exclusively about that topic.
  • This might call the entire strategy of creating these ultimate guides into question. Perhaps you can’t cast a wide net in that way anymore? Perhaps, the "ultimate guide” is only really suitable for people who actually want to get a more broad understanding of a topic? (Crazy to think, I know!)
  • The page from Rocket Mortgage, on the other hand, is only about how much you need for a down payment:
  • So too is the page from Quicken Loans:
  • The Moral of the Story: If I want to understand how much money on average I need to put down when buying a house, or what the various options generally are and what they mean long term, the CFPB page, .gov or not, doesn’t really help me. Its content is not specifically honed in on that particular topic. Again, we have another page that takes a sweeping look at a topic that lost rankings when the query reflected a more specific sort of intent!
  • What’s interesting here is that unlike the previous examples, where too much content resulted in the page’s topical relevance being diluted, the lack of such content here is what I think caused the ranking loss. Look, it’s not bad content. However, it’s pretty much the "general” kind of content you see here, there, and everywhere for all sorts of topics. Just compare it to what the page from the Credit Card Insider offers:
  • This just oozes depth. The third topic on the page alone (6 Ways to Pay Off…) rivals the depth shown on the CreditCards.com page! What differentiates this page from the "guides” shown in the other examples is that this is a guide that drills deep into one topic as opposed to trying to span multiple subtopics. Also, have a look at the formatting, it reminds me of what we saw on the Motley Fool’s page:
  • It’s deep content that is easy to digest. It’s not hard to see why Google swapped these two pages.
  • The Moral of the Story: Exact content relevancy is not only about what topic you talk about. You can be topically aligned but it has to offer real information to the user. It’s even better when that information is digestible. In other words, if you want to rank for a keyword with topic specificity it might be better to create an "ultimate guide” that drills deep into the topic itself versus trying to cover every subtopic under the sun in order to try to rank for more topics with one piece of content.
  • The by-line really sums it up. It tells you this article is about the fact that you most likely won't get addicted to painkillers, but it’s definitely possible so here’s the scoop. To me, it’s far more in line with the average user’s intent of learning about the risks of addiction versus understanding the fine difference between addiction and dependence. It’s the same story with the WebMD page:
  • The Moral of the Story: Again, the issue here is not how authoritative or how substantial the content is. There is no doubt that content from the NIH is both substantial and authoritative. The issue here again seems to relate to Google being better able to show content that is specifically relevant to the nature of the query.
  • First things first, the page doesn’t speak to the query directly. While in the process of learning the difference between sadness and depression one could understand the signs of depression that route is certainly indirect. You could argue that the query how to tell if you have depression could be taken as ‘how do I know if I am just sad or depressed?’ but that really doesn’t seem to be the essential intent here. That topical line (i.e., sadness vs. depression) would most likely produce its own unique query (i.e., am I sad or depressed). From the content shown on the WebMD page, it appears that Google thinks of the intent as understanding the symptoms of depression:
  • The WebMD page, in contradistinction to the MHA page, speaks to the “plain meaning” of the query’s intent… how can you tell if you’re suffering from depression? Aside from that, the WebMD page offers a bit more in terms of substance. While it doesn’t go into great detail per se, the WebMD page does offer a pretty comprehensive list of items. Compare that to the MHA page which, if you read it, is a bit thin and offers hardly any detail (not even a basic list like the one on the WebMD page). The Moral of the Story: Relevancy is a two-pronged equation (at minimum). It requires the content to be topically focused on the one hand and substantial on the other.
  • I’ve saved the best for last. This is my favorite example that I came across when diving into the December 2020 Core Update. I mean, for crying out loud, we’re talking about the CDC losing rankings in favor of a .org domain I never heard of. How could this be? Let’s understand the intent of the query. If I were searching for this it would be because I found something on my body that I thought might be skin cancer. If I could be so bold, I would imagine that this is why most of us would search for this term. I wouldn’t, and again I imagine most people in most instances wouldn’t search for this in order to understand if regular screening is officially recommended or not. Yet, that is what the CDC page is about:
  • I hate to make assumptions, but I would also think that someone running this query is most likely not interested in the common tests and methods doctors use to determine if skin cancer is present. Yet, this is what the page from Cancer.net focuses on:
  • Again, I would search for this term if I saw something weird on my body that made me think "Holy crap, do I have skin cancer?”. The page from the AOCD is entirely made for people on the verge of freaking out at the possibility of having skin cancer:
  • To me, when you see this page relative to the pages from Cancer.net and the CDC, it is painfully obvious why this page got the ranking boost. The Moral of the Story: Again, I think what has transpired here is painfully obvious. Google has looked past the immediate authority of some of the pages here and has more heavily considered how relevant the content is to the query. As with the cases I have shown earlier, Google is rewarding content that speaks in a highly-focused way to the intent and nature of the query.
    What Was the December 2020 Core Update About? Are you expecting a one-liner that definitively characterizes the December 2020 update? You’re not going to get one from me. This update, like any other, certainly included a whole plethora of different “algorithmic considerations” and themes. That said, from where I sit, while other core updates did things to help put the most authoritative content at the top of the SERP, this update seemed to me to be more about pure relevancy. Updates of the past have done things to weed out sites using a marketing tone within YMYL informational content or have rewarded sites that put the right content ahead of their affiliate pursuits. All of that, while part of relevancy, speaks more to a need for something authoritative at the top of the SERP. Seeing so many .gov pages drop in favor of pages from sites like Healthline or WebMD seems to point to the update rewarding relevancy to the nth degree. Perhaps Google felt as if it had “authority” at the top of the SERP in order, paving the way for a focus on relevance? Who knows. All I can say is that I personally have not seen such a strong focus on pure relevance on page one of the SERP.
    Content Creation Takeaways: Practically speaking, I think the era of broadly reaching pages is quickly coming to an end. I think that has been the case for a while. However, seeing Google pull pages off page one of the SERP because they deal with multiple subtopics is a new level, at least for me. It shows that you have to create content that talks about one topic and one topic only (unless the keyword reflects a specific desire for a broader survey of a topic). I wonder if the idea of having one ultimate guide so as to win numerous keywords should be replaced with multiple posts where each post deals with one specific topic or subtopic. (If you do that, please don’t create thin content; that is not what I am advocating for.) It’s a rather logical concept. As Google gets better at understanding content, it is going to prefer highly-focused content around a specific topic to that which is of a more broad nature, unless the query specifically shows intent for a general survey of a topic.
jack_fox

Google Unpaid Shopping Listings: Where Are They Now? - 0 views

  • Now that free listings have been live for 7+ months and were expanded out to the main SERP, we pulled some data to check in on what Merkle clients are seeing
  • The overwhelming majority of this traffic likely comes through the Shopping tab, with some traffic coming from the product knowledge panel on the main SERP. While there’s really no limit to the inventory that Google can show on the Shopping tab, consumer interest in that page likely hasn’t changed much over the course of the year. Since the Google Shopping redesign in 2019, there haven’t been any recent efforts to pull customers away from the main SERP onto the Shopping property.
  • Include your entire product catalog
  • ...1 more annotation...
  • Keep an eye on SKU-specific searches
Rob Laporte

Should Google not trust links in all guest blog posts? - 0 views

  • Does Google even know what is a guest post and what is not? That is what Will Critchlow of Search Pilot and Brainlabs asked on Twitter. He said, “It’s also ridiculous because there is literally no way to tell from the outside whether a writer is an employee, a contractor, a freelancer, or a contributor (e.g. my status when I write for Moz).” As a matter of note, we nofollow links from our contributors here on Search Engine Land. In fact, even the links on my bio are nofollowed, and I am a daily writer here on staff at Search Engine Land. But it works. Many SEOs say that links in guest blog posts still work. It still works in that Google still somehow counts those links and they help you rank better in Google. Of course, it is almost impossible to test whether this is true, since there are so many variables when it comes to ranking in Google search. But some believe it works.
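    As an illustration of the editorial policy described above (nofollowing contributor links), here is a minimal sketch of how a publisher might add rel="nofollow" to outbound links in a post body. It assumes the BeautifulSoup library; the domain constant and the sample HTML are placeholders, not Search Engine Land’s actual implementation.

```python
# Sketch: add rel="nofollow" to external links in contributed-post HTML.
# Uses BeautifulSoup (pip install beautifulsoup4). The publisher domain and
# the sample HTML below are placeholders for illustration only.
from urllib.parse import urlparse
from bs4 import BeautifulSoup

SITE_DOMAIN = "searchengineland.com"  # placeholder for the publisher's own domain

def nofollow_external_links(html: str) -> str:
    soup = BeautifulSoup(html, "html.parser")
    for a in soup.find_all("a", href=True):
        host = urlparse(a["href"]).netloc
        if host and SITE_DOMAIN not in host:  # outbound link to another site
            rel = set(a.get("rel", []))
            rel.add("nofollow")
            a["rel"] = sorted(rel)
    return str(soup)

if __name__ == "__main__":
    sample = '<p>Read <a href="https://example.com/guide">this guide</a>.</p>'
    print(nofollow_external_links(sample))
```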
jack_fox

The Ultimate Web Server Security Guide @ MyThemeShop - 0 views

  • They could insert links into the site to boost their SEO rankings. Hackers can make a killing selling links from exploited sites. Alternatively, a hacker could deface the site and demand money to restore it (ransom). They could even place ads on the site and use the traffic to make money. In most cases, an attacker will also install backdoors into the server. These are deliberate security holes that allow them to come back and exploit the site in the future – even if the insecure plugin has been replaced.
  • Unfortunately, under WordPress, every plugin and theme has the ability to alter anything on the site. They can even be exploited to infect other apps and sites hosted on the same machine.
  • Theme developers are often relatively inexperienced coders. Usually, they’re professional graphic artists who have taught themselves a little PHP on the side. Plugins are another popular line of attack – they account for 22% of successful hacks. Put together, themes and plugins are a major source of security trouble.
  • ...102 more annotations...
  • Each person who uses your system should only have the privileges they need to perform their tasks.
  • Don’t depend on a single security measure to keep your server safe. You need multiple rings of defense.
  • Security exploits exist at all levels of the technology stack, from the hardware up. WP White Security revealed that 41% of WordPress sites are hacked through a weakness in the web host.
  • While it’s important to use a strong password, password cracking is not a primary focus for hackers.
  • the more software you have installed on your machine, the easier it is to hack – even if you aren’t using the programs! Clearly, programs that are designed to destroy your system are dangerous. But even innocent software can be used in an attack.
  • There are 3 ways to reduce the attack surface: 1. Run fewer processes 2. Uninstall programs you don’t need 3. Build a system from scratch that only has the processes you need
  • A really good authentication system uses multiple tests. Someone could steal or guess your password. They could grab your laptop with its cryptographic keys.
  • If you want to run multiple processes at the same time, you need some way of managing them. This is basically what a kernel is. It does more than that – it handles all of the complex details of the computer hardware, too. And it runs the computer’s networking capabilities
  • programs exist as files when they are not running in memory
  • SELinux’s default response is to deny any request.
  • SELinux is extremely comprehensive, but this power comes at a price. It’s difficult to learn, complex to set up, and time-consuming to maintain.
  • AppArmor is an example of a MAC tool, although it’s nowhere near as comprehensive as SELinux. It applies rules to programs to limit what they can do.
  • AppArmor is relatively easy to set up, but it does require you to configure each application and program one by one. This puts the onus for security in the hands of the user or sysadmin. Often, when new apps are added, users forget to configure AppArmor. Or they do a horrible job and lock themselves out, so their only option is to disable the profile. That said, several distributions have adopted AppArmor.
  • Generic profiles shipped by repo teams are designed to cover a wide range of different use cases, so they tend to be fairly loose. Your specific use cases are usually more specific. In this case, it pays to fine-tune the settings, making them more restrictive.
  • GRSecurity is a suite of security enhancements
  • In the future, this could become a viable option. For now, we’ll use Ubuntu and AppArmor.
  • Apache is a user-facing service – it’s how your users interact with your website. It’s important to control this interaction too.
  • If your Apache configuration is bad, these files can be viewed as plain text. All of your code will be visible for anyone to see – this potentially includes your database credentials, cryptographic keys, and salts.
  • You can configure Apache to refuse any requests for these essential directories using .htaccess files. These are folder-level configuration files that Apache reads before it replies to a request.
  • The primary use for .htaccess files is to control access
  • If an attacker knows your WordPress cryptographic salts, they can use fake cookies to trick WordPress into thinking they have logged on already.
  • If the hacker has physical access to the computer, they have many options at their disposal. They can type commands through the keyboard, or insert a disk or USB stick into the machine and launch an attack that way.
  • When it comes to network-based attacks, attackers have to reach through one of the machine’s network ports.
  • For an attacker to exploit a system, they have to communicate to a process that’s listening on a port. Otherwise, they’d simply be sending messages that are ignored. This is why you should only run processes that you need for your site to run. Anything else is a security risk.
  • Often, ports are occupied by processes that provide no real valuable service to the machine’s legitimate users. This tends to happen when you install a large distribution designed for multiple uses. Large distros include software that is useless to you in terms of running a website. So the best strategy is to start with a very lightweight distro and add the components you need.
  • If you see any unnecessary processes, you can shut them down manually. Better yet, if the process is completely unnecessary, you can remove it from your system.
  • Firewalls are quite similar to access control within the computer. They operate on a network level, and you can use them to enforce security policies. A firewall can prevent processes from broadcasting information from a port. It can stop outside users from sending data to a port. And it can enforce more complex rules.
  • Simply installing and running a firewall does not make your host machine secure – it’s just one layer in the security cake. But it’s a vital and a powerful one.
  • First of all, we need to configure our software to resist common attacks. But that can only protect us from attacks we know about. Access control software, such as AppArmor, can drastically limit the damage caused by unauthorized access. But you still need to know an attack is in progress.
  • This is where Network Intrusion Detection Software (NIDS) is essential. It scans the incoming network traffic, looking for unusual patterns or signs of a known attack. If it sees anything suspicious, it logs an alert.
  • It’s up to you to review these logs and act on them.
  • If it’s a false alarm, you should tune your NIDS software to ignore it. If it’s an ineffective attack, you should review your security and block the attacker through the firewall.
  • That’s why it’s essential to have an automated backup system. Finally, you need to understand how the attack succeeded, so you can prevent it from recurring. You may have to change some settings on your Firewall, tighten your access rules, adjust your Apache configuration, and change settings in your wp-config file. None of this would be possible without detailed logs describing the attack.
  • Every web server has a breaking point and dedicated DOS attackers are willing to increase the load until your server buckles. Good firewalls offer some level of protection against naive DOS attacks
  • a tiny number of sites (less than 1%) are hacked through the WordPress core files
  • Major DNS attacks have taken down some of the biggest sites in the world – including Ebay and Paypal. Large hosting companies like Hostgator and Blue Host have been attacked. It’s a serious risk!
  • Right now, due to the way the web currently works, it’s impossible to download a web page without the IP address of a server. In the future, technologies like IFPS and MaidSafe could change that.
  • So there are 2 benefits to using a CDN. The first is that your content gets to your readers fast. The second benefit is server anonymity – nobody knows your real IP address – including the psychos. This makes it pretty impossible to attack your server – nobody can attack a server without an IP address.
  • When CDNs discover a DDOS attack, they have their own ways to deal with it. They often display a very lightweight “are you human?” message with a captcha. This tactic reduces the bandwidth costs and screens out the automated attacks.
  • If any of your DNS records point to your actual server, then it’s easy to find it and attack it. This includes A records (which map a hostname to an IP address) and MX records (mail exchange). You should also use a separate mail server machine to send your emails. Otherwise, your email headers will expose your real IP address.
  • If your hosting company refuses to give you a new IP address, it may be time to find a new service provider.
  • WordPress uses encryption to store passwords in the database. It doesn’t store the actual password – instead, it stores an encrypted version. If someone steals your database tables, they won’t have the actual passwords.
  • If you used a simple hash function, a hacker could gain privileged access to your app in a short period of time.
  • The salt strings are stored in your site’s wp-config.php file.
  • Salts dramatically increase the time it would take to get a password out of a hash code – instead of taking a few weeks, it would take millions of years
  • You keep the other key (the decryption key) to yourself. If anyone stole it, they could decode your private messages! These 2-key cryptographic functions do exist. They are the basis of TLS (https) and SSH.
  • the most secure systems tend to be the simplest. The absolute secure machine would be one that was switched off.
  • For WordPress sites, you also need PHP and a database.
  • A VM is an emulated computer system running inside a real computer (the host). It contains its own operating system and resources, such as storage, and memory. The VM could run a completely different operating system from the host system – you could run OSX in a VM hosted on your Windows machine
  • This isolation offers a degree of protection. Let’s imagine your VM gets infected with a particularly nasty virus – the VM’s file system could be completely destroyed, or the data could be hopelessly corrupted. But the damage is limited to the VM itself. The host environment would remain safe.
  • This is how shared hosting and virtual private servers (VPSes) work today. Each customer has access to their own self-contained environment, within a virtual machine.
  • VMs are not just for hosting companies. If you’re hosting multiple sites on a dedicated server or a VPS, VMs can help to make your server more secure. Each site can live inside its own VM. That way, if one server is hacked, the rest of your sites are safe.
  • Even with all these considerations, the benefits of VMs outweigh their drawbacks. But performance is vital on the web.
  • Containers (like Docker) are very similar to VMs.
  • Because we’ve cut the hypervisor out of the loop, applications run much faster – almost as fast as processes in the host environment. Keeping each container separate does involve some computation by the container software. But it’s much lighter than the work required by a hypervisor!
  • Docker Cloud is a web-based service that automates the task for you. It integrates smoothly with the most popular cloud hosting platforms (such as Amazon Web Services, or Digital Ocean).
  • With containers, you can guarantee that the developer’s environment is exactly the same as the live server. Before the developer writes a single line of code, they can download the container to their computer. If the code works on their PC, it will work on the live server. This is a huge benefit of using containers, and it’s a major reason for their popularity.
  • A complete stack of these layers is called an “image”
  • The core of Docker is the Docker Engine – which lives inside a daemon – or long-running process
  • another great resource – the Docker Hub. The hub is an online directory of community-made images you can download and use in your own projects. These include Linux distributions, utilities, and complete applications.
  • Docker has established a relationship with the teams behind popular open source projects (including WordPress) – these partners have built official images that you can download and use as-is.
  • when you finish developing your code, you should wrap it up inside a complete container image. The goal is to put all the code that runs your site inside a container and store the volatile data in a volume.
  • Although Docker can help to make your site more secure, there are a few major issues you need to understand: the Docker daemon runs as a superuser; it’s possible to load the entire filesystem into a container; and it’s possible to pass a reference to the Docker daemon into a container.
  • The solution to this issue is to use a MAC solution like SELinux, GRSecurity or AppArmor.
  • Never let anyone trick you into running a strange docker command.
  • only download and use Docker images from a trustworthy source. Official images for popular projects are security audited by the Docker team. Community images are not.
  • there are the core WordPress files. These interact with the web server through the PHP runtime. WordPress also relies on the file system and a database server.
  • A service is some software component that listens for requests (over a protocol) and does something when it receives those requests.
  • Using Docker, you could install WordPress, Apache, and PHP in one container, and run MySQL from another. These containers could run on the same physical machine, or on different ones
  • The database service container can be configured to only accept connections that originate from the web container. This immediately removes the threat of external attacks against your database server
  • This gives you the perfect opportunity to remove high-risk software from your host machine, including: language runtimes and interpreters (such as PHP, Ruby, Python, etc.), web servers, databases, and mail servers.
  • If a new version of MySQL is released, you can update the database container without touching the web container. Likewise, if PHP or Apache are updated, you can update the web container and leave the database container alone.
  • Because Docker makes it easy to connect these containers together, there’s no reason to lump all your software inside a single container. In fact, it’s a bad practice – it increases the security risk for any single container, and it makes it harder to manage them.
  • If your site is already live on an existing server, the best approach is to set up a new host machine and then migrate over to it. Here are the steps you need to take:
  • With a minimal Ubuntu installation, you have a fairly bare-bones server. You also have the benefit of a huge repository of software you can install if you want.
  • If access control is like a lock protecting a building, intrusion detection is the security alarm that rings after someone breaks in.
  • Logging on to your host with a superuser account is a bad practice. It’s easy to accidentally break something.
  • Fail2ban blocks SSH users who fail the login process multiple times. You can also set it up to detect and block hack attempts over HTTP – this will catch hackers who attempt to probe your site for weaknesses.
  • With multiple WordPress sites on your machine, you have 2 choices. You could create a new database container for each, or you could reuse the same container between them. Sharing the DB container is a little riskier, as a hacker could, theoretically, ruin all your sites with one attack. You can minimize that risk by: using a custom root user and password for your database (don’t use the default username of ‘root’); ensuring the DB container is not accessible over the internet (hide it away inside a Docker network); creating new databases and users for each WordPress site; and ensuring each user only has permissions for their specific database.
  • What are the benefits of using a single database container? It’s easier to configure and scale. It’s easier to backup and recover your data. It’s a little lighter on resources.
  • you could also add a caching container, like Varnish. Varnish caches your content so it can serve pages quickly – much faster than WordPress can
  • Docker has the ability to limit how much processor time and memory each container gets. This protects you against exhaustion DOS attacks
  • A containerized process still has some of the abilities of root, making it more powerful than a regular user. But it’s not as bad as full-on root privileges. With AppArmor, you can tighten the security further, preventing the process from accessing any parts of the system that do not relate to serving your website.
  • Docker Hub works like GitHub – you can upload and download images for free. The downside is that there’s no security auditing. So it’s easy to download a trojan horse inside a container.
  • Official images (such as WordPress and Apache) are audited by the Docker team. These are safe. Community images (which have names like user/myapp) are not audited.
  • a kernel exploit executed inside a container will affect the entire system. The only way to protect against kernel exploits is to regularly update the host system
  • Containers run in isolation from the rest of the system. That does not mean you can neglect security – your website lives inside these containers! Even if a hacker cannot access the full system from a container, they can still damage the container’s contents.
  • Under Ubuntu, AppArmor already protects you – to a degree. The Docker daemon has an AppArmor profile, and each container runs under a default AppArmor profile. The default profile prevents an app from breaking out of the container, and restricts it from doing things that would harm the system as a whole. However, the default profile offers no specific protection against WordPress specific attacks. We can fix this by creating a custom profile for your WordPress container.
  • The net effect is that it’s impossible to install malware, themes or plugins through the web interface. We’ve already covered this to some degree with the .htaccess rules and directory permissions. Now we’re enforcing it through the Linux kernel.
  • There are versions of Docker for Mac and PC, so you’ll be able to run your site from your home machine. If the code works on your PC, it will also work on the server.
  • Tripwire tends to complain about the entries in the /proc filespace, which are auto-generated by the Linux kernel. These files contain information about running processes, and they tend to change rapidly while Linux runs your system. We don’t want to ignore the directory entirely, as it provides useful signs that an attack is in progress. So we’re going to have to update the policy to focus on the files we are interested in.
  • Now we should install an e-mail notification utility – to warn us if anything changes on the system. This will enable us to respond quickly if our system is compromised (depending on how often you check your emails).
  • Rootkits are malicious code that hackers install onto your machine. When they manage to get one on your server, it gives them elevated access to your system
  • Tripwire is configured to search in key areas. It’s good at detecting newly installed software, malicious sockets, and other signs of a compromised system. RKHunter looks in less obvious places, and it checks the contents of files to see if they contain known malicious code. RKHunter is supported by a community of security experts who keep it updated with known malware signatures – just like antivirus software for PCs.
  • If your hosting company offers the option, this would be a good point to make an image of your server. Most cloud hosting companies offer tools to do this.
  • With an image, it’s easy to launch new servers or recover the old one if things go horribly wrong.
  • We’ve hidden our server from the world while making it easy to read our content. We’ve built a firewall to block malicious traffic. We’ve trapped our web server inside a container where it can’t do any harm. We’ve strengthened Linux’s access control model to prevent processes from going rogue. We’ve added an intrusion detection system to identify corrupted files and processes. We’ve added a rootkit scanner. We’ve strengthened our WordPress installation with 2-factor authentication. We’ve disabled the ability for any malicious user to install poisoned themes or plugins.
  • Make a routine of checking the logs (or emails if you configured email reporting). It’s vital to act quickly if you see any warnings. If they’re false warnings, edit the configuration. Don’t get into a habit of ignoring the reports.
  • Virtually everything that happens on a Linux machine is logged.
  • You have to make a habit of checking for new exploits and learning how to protect yourself against them. Regularly check for security patches and issues in the core WordPress app (see the WordPress Security Notices). Also, check regularly on the forums or mailing lists for the plugins and themes you use on your site.
  • network level intrusion detection service – you can fix that by installing Snort or PSAD.
  • The only way to guarantee your safety is to constantly update your security tactics and never get complacent.
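    On the guide’s attack-surface advice above (only run the processes you need, and shut down anything listening on a port you don’t recognize), a short audit script can help. A minimal sketch, assuming the third-party psutil package (pip install psutil) and sufficient privileges to inspect other users’ processes; it is a quick check, not a replacement for the intrusion detection tools the guide recommends.

```python
# Port audit sketch: list every TCP socket in LISTEN state and the process
# behind it, so unnecessary services can be spotted and removed.
# Requires psutil (pip install psutil); run with enough privileges to see
# processes owned by other users.
import psutil

def listening_ports():
    """Yield (port, pid, process name) for every listening TCP socket."""
    for conn in psutil.net_connections(kind="tcp"):
        if conn.status == psutil.CONN_LISTEN and conn.pid:
            try:
                name = psutil.Process(conn.pid).name()
            except psutil.NoSuchProcess:
                name = "?"
            yield conn.laddr.port, conn.pid, name

if __name__ == "__main__":
    for port, pid, name in sorted(listening_ports()):
        print(f"port {port:>5}  pid {pid:>6}  {name}")
```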
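    On the password-hashing and salt annotations above: the sketch below illustrates the general idea of storing a random salt plus a slow, salted hash instead of the password itself. It uses PBKDF2 from Python’s standard library purely as an illustration; it is not WordPress’s actual hashing scheme, and the iteration count is an arbitrary example.

```python
# Illustration of salted, stretched password hashing: store (salt, digest),
# never the password. Generic PBKDF2 sketch -- NOT the scheme WordPress uses.
import hashlib
import hmac
import os

ITERATIONS = 600_000  # key stretching: makes each guess slow for an attacker

def hash_password(password: str) -> tuple[bytes, bytes]:
    salt = os.urandom(16)  # unique random salt per password
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)

if __name__ == "__main__":
    salt, digest = hash_password("correct horse battery staple")
    print(verify_password("correct horse battery staple", salt, digest))  # True
    print(verify_password("wrong guess", salt, digest))                   # False
```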
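    And on the container-layout annotations above (separate web and database containers, a database reachable only from the web container, per-site credentials, and resource limits), here is a hedged sketch of one way to wire that up through the Docker CLI from Python. Container names, credentials, image tags, and limits are placeholders; this illustrates the isolation idea rather than a production recipe.

```python
# Sketch: WordPress and MySQL in separate containers on a private Docker
# network. The database is never published with -p, so it is unreachable
# from the internet; only containers on wp_net can connect to it.
# --memory/--cpus limits guard against exhaustion DOS. Values are placeholders.
import subprocess

def run(*args: str) -> None:
    print("+", " ".join(args))
    subprocess.run(args, check=True)

run("docker", "network", "create", "wp_net")  # private bridge network

# Database container: no -p flag, so it is only reachable from wp_net.
run("docker", "run", "-d", "--name", "wp_db", "--network", "wp_net",
    "-e", "MYSQL_ROOT_PASSWORD=change-me-root",
    "-e", "MYSQL_DATABASE=site1", "-e", "MYSQL_USER=site1",
    "-e", "MYSQL_PASSWORD=change-me",
    "--memory", "512m", "--cpus", "0.5",
    "mysql:8.0")

# Web container: the only one exposed to the outside world.
run("docker", "run", "-d", "--name", "wp_web", "--network", "wp_net",
    "-p", "80:80",
    "-e", "WORDPRESS_DB_HOST=wp_db", "-e", "WORDPRESS_DB_NAME=site1",
    "-e", "WORDPRESS_DB_USER=site1", "-e", "WORDPRESS_DB_PASSWORD=change-me",
    "--memory", "512m", "--cpus", "1.0",
    "wordpress:latest")
```

    Because the database container is never published, an external attacker has no route to it; adding a second site would mean creating its own database and user inside the same (or a new) database container, as the guide suggests.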
jack_fox

- 0 views

  • I'd argue there's a lot of content on Pinterest -- even if it's not a collection of 3000 word blog posts. Sometimes images, even with minimal textual content, can be exactly what people are looking for. Not always, and sometimes we get it wrong, but it's certainly an option.
jack_fox

Republishing Content: How to Update Old Blog Posts for SEO - 0 views

  • republishing any old post isn’t going to work. You need to find those that are underperforming because of content issues.
  • It’s sometimes because those that outrank you have more high-quality backlinks and ‘link authority.’ To check if that’s the case, search for your keyword in Keywords Explorer, scroll to the SERP overview, then look at the Domain Ratings (DR) and URL Ratings (UR) of the sites and pages that outrank you.
  • If product, category, or landing pages are outranking you, then maybe searchers aren’t looking for blog posts.
jack_fox

Location Data + Reviews: The 1-2 Punch of Local SEO (Updated for 2020) - Moz - 0 views

  • If Google cares this much about ratings, review text, responses, and emerging elements like place topics and attributes, any local brand you’re marketing should see these factors as a priority.
  • In 2017, when I wrote the original version of this post, contributors to the Local Search Ranking Factors survey placed Google star ratings down at #24 in terms of local rankings influence. In 2020, this metric has jumped up to spot #8 — a leap of 16 spots in just three years.
  • local SEOs have noticed patterns over the years like searches with the format of “best X in city” (e.g. best burrito in Dallas) appearing to default to local results made up of businesses that have earned a minimum average of four stars.
  • ...4 more annotations...
  • The central goal of being chosen hinges on recognizing that your reviewer base is a massive, unpaid salesforce that tells your brand story. Survey after survey consistently finds that people trust reviews — in fact, they may trust them more than any claim your brand can make about itself.
  • don’t get too many reviews at once on any given platform but do get enough reviews on an ongoing basis to avoid looking like you’ve gone out of business.
  • There’s no magic number, but the rule of thumb is that you need to earn more reviews than the top competitor you are trying to outrank for each of your search terms. This varies from keyword phrase, to keyword phrase, from city to city, from vertical to vertical. The best approach is steady growth of reviews to surpass whatever number the top competitor has earned.
  • Many reviewers think of their reviews as living documents, and update them to reflect subsequent experiences. Many reviewers are more than happy to give brands a second chance when a problem is resolved.
Rob Laporte

What Is Retrieval-Augmented Generation aka RAG | NVIDIA Blogs - 0 views

  •  
    "description of a RAG process"