DISC Inc: Group items matching "Other" in title, tags, annotations or URL

jack_fox

Is Your Google My Business Listing Getting Filtered? - Sterling Sky Inc - 0 views

  • If you have multiple locations for the same business, Google will often show the one with the highest relevance and filter others.
  • In some cases, the filter causes listings to be completely removed from the Local Finder results unless you zoom in
  • if your listing is too similar (based on criteria above) to a listing that outranks you, it will cause your listing to rank much lower because Google is trying to “diversify” the search results. 
  • The listing that has the most ranking authority for that particular keyword is the one that will rank.  For example, one attorney ranks for “personal injury attorney palmdale” but a different attorney ranks for “auto accident attorney palmdale” (both are using the same address). Because a listing is filtered for one keyword does not mean that it will be filtered for other keywords.
Rob Laporte

Google Webmaster Central Hosting "Link Week" - 0 views

  • Oct 7, 2008 at 8:11am Eastern by Barry Schwartz. Google Webmaster Central Hosting "Link Week": This week at the Google Webmaster Central blog, Google has a series of blog posts all about links. The first two posts are live: "Links information straight from the source" and "Importance of link architecture." Google explains that it will cover three main topics this week. (1) Internal links, the links you have within your site. That post is already live and covers how to structure your internal links for the best search engine visibility. (2) Outbound links, the links you post on your pages to other sites. I assume Google will discuss the value of these links and who you should and should not link to. Clearly, think about your user here and not the search engine. (3) Inbound links, the external sites that link to your site. I assume Google left this for last because it may be the most interesting topic. Google plans to bust some myths, so it will be interesting to see what they say on the topic of links hurting your site. Time will tell, but stay tuned for more information. Postscript: Here is Google's post on outbound linking, which has useful tips for beginners on who and when to link out. It also tells you how to handle user-generated content links. Postscript 2: I was a bit let down by Google's inbound link post.
Rob Laporte

Google Analytics Upgrade: AdSense Reporting, Visualization Tools, & More - 0 views

  • Online publishers may be most interested in the AdSense integration tools coming to Google Analytics. After linking an AdSense and Analytics account, you'll be able to see AdSense data including: total revenue; impressions; clicks and click-through ratio; revenue per day, per hour, etc.; revenue per page (which pages are most profitable); and revenue per referral (which other sites bring you profitable traffic). Here are a couple of screenshots from Google's videos on the new features (see below for link). During our call this morning, we asked why AdSense itself doesn't also offer this data without requiring the need for also using Google Analytics to get it. We're waiting for a reply from Google's AdSense team and will let you know what we learn. Update: A Google spokesperson says, "We can't comment on any future AdSense developments or features." Motion Charts is a visualization tool that lets you see and interact with analytics data in five dimensions, a capability made possible by Google's purchase of Gapminder's Trendalyzer software in March 2007. The Google Analytics API, which is currently in private beta, will open up analytics data for developers to export and use however they want. Advanced segmentation allows users to dig deeper into subsets of traffic, such as "visits with conversions," or create their own segment types. Custom reporting lets users create their own comparisons of metrics. Google has created a series of videos showing how some of these new tools work. Crosby says the new features will be rolled out in coming weeks to Google Analytics users, who may see some new features earlier than others. The AdSense integration, he warns, may take longer to roll out than the other new tools. More discussion at Techmeme.
Rob Laporte

YouTube's 'Buzz Targeting' Sells Ad Space on Soon-to-Be Viral Videos - MarketingVOX - 0 views

  • YouTube's 'Buzz Targeting' Sells Ad Space on Soon-to-Be Viral Videos. What is the difference between 'algorithm' and 'alchemy'? Google has introduced "Buzz Targeting" on YouTube, a new way to wring ad dollars from the video site. Buzz Targeting highlights videos that are about to go viral amongst YouTube users. The algorithm examines videos being favorited and distributed across other sites, among other criteria, then gives advertisers the opportunity to advertise around them. Ads incorporated on the ground floor can then piggyback on the video's popularity. Movie studio Lionsgate was among the first beta testers for Buzz Targeting. The studio placed ads for The Forbidden Kingdom alongside 500 entertainment-related videos. While no figures on the campaign's success were presented, Danielle DePalma of Lionsgate said the program "allowed us to reach a very large, diverse audience." It remains unclear how Buzz Targeting incorporates factors like demographic or location-based criteria. And while the notion of algorithmically gauging a video's ascension into pop culture is comforting, some skepticism is warranted. At ad:tech New York last year, video blogger Kevin Nalty admitted to being uncertain why some videos go viral and others do not. The wisest course, he told audience members of a user-generated video panel, is to keep your cost of entry down. Nalty, known as "Nalts" on YouTube, produced over 500 videos before 2008.
Rob Laporte

E-Mail: Evaluating Dedicated vs. Shared IP Addresses - ClickZ - 0 views

  •  
    The downside to having a dedicated IP address is the cost. Most ESPs charge an initial set-up fee of $500 to $1,000 for a dedicated IP address; there's also often a $250 monthly fee for maintaining it. This directly impacts your e-mail ROI (define). For large-quantity senders the additional cost is minimal, but for those sending small volumes of e-mail it can make a dent in your profit margin.

    A shared IP address is just what it sounds like -- you're sharing the IP address with other organizations. Every company sending from the IP address has the potential to impact, positively or negatively, its reputation. If your IP address neighbors are good guys, the reputation shouldn't be damaged. But if one of them (or if you) does something that raises a red flag, the IP address' reputation will be tarnished and all e-mail sent from it could be blacklisted.

    Why Might You Want to Share an IP Address? The ESP I spoke with recently raised another valid positive about shared IP addresses, at least for low-volume senders. When we talk reputation, we talk about positive, neutral, and negative. To get on the reputation radar, the IP address needs to be sending a certain amount of e-mail each month. If your sends are small, your dedicated IP address may be below the radar and never "qualify" for a positive or a negative reputation -- you'll be stuck with a "neutral" reputation or no reputation at all. This isn't all bad, but it's also not all good.

    By having companies share IP addresses, this ESP contends it is able to get enough volume to earn positive IP address reputations, which helps its customers' e-mail get to the inbox. This is a valid point, as long as everyone using the IP address behaves and avoids red flags. It's a calculated strategy, one which requires the ESP to provide education about e-mail best practices and closely monitor every IP address to ensure customers are in compliance. If you're sending from your own in-house system, these same pros and cons apply.
Rob Laporte

Google third-party policy - Advertising Policies Help - 0 views

  • In addition to meeting the requirements outlined below, third parties must make reasonable efforts to provide their customers with other relevant information when requested.
  • If your applicable terms of service require a monthly performance report for customers, you must include data on costs, clicks, and impressions at the Google advertising account level. When sharing Google advertising cost data with customers, report the exact amount charged by Google, exclusive of any fees that you charge.
  • you can meet this reporting requirement by allowing your customers to sign in to their Google advertising accounts directly to access their cost and performance data. Learn how to share account access.
  • Third parties often charge a management fee for the valuable services they provide, and end-advertisers should know if they are going to be charged these fees. If you charge a management fee (separate from the cost of AdWords or AdWords Express), let customers know. At a minimum, inform new customers in writing before each first sale and disclose the existence of this fee on customer invoices.
  • It's important for advertisers to have the ability to contact Google directly with concerns about a third-party partner. To allow Google to properly investigate and assist the advertiser, we require that you provide your customers with the customer IDs for their AdWords or AdWords Express accounts when requested. Learn how to find an AdWords customer ID
  • putting undue pressure on an advertiser to sign up or stay with your agency
  • Having a separate account for each end-advertiser is essential to maintaining the integrity of the AdWords Quality Score. Because account history is a core component of the AdWords Quality Score, mixing advertisers in one account can result in Quality Scores that inaccurately represent any one advertiser's performance. Additionally, we'll show only one ad per account for a particular keyword, so mixing advertisers in one account could unfairly limit ad serving for those advertisers. For these reasons, we require that you use a separate account for each end-advertiser that you manage.
Rob Laporte

Giving Links Away - Search Engine Watch - 0 views

  • Enter Siloing and PageRank Sculpting. This is simply the activity of controlling which pages of your site share their link love. You do this by adding a "nofollow" attribute to any link that you don't want the search engines to give credit to. Take the example Matt Cutts gives. Maybe you have a friend who is a total underground, blackhat, do-no-good, evil-empire, anarchist spammer. You know he's bad to the bone. But you have a soft place in your heart for him and you want others to check out his site. All you have to do is add a nofollow attribute to the link. It would look like this: <a href="http://www.total-underground-blackhat-do-no-good-evil-empire-anarchist-spammer.com/" rel="nofollow">a blackhat spammer</a>. In this article, Joost de Valk, a Dutch SEO and Web developer, quotes Matt Cutts as saying, "There's no stigma to using nofollow, even on your own internal links; for Google, nofollowed links are dropped out of our link graph; we don't even use such links for discovery." Joost's article explains PageRank sculpting in more detail if you find this topic fascinating. His article also talks about "siloing." He points to an article on BruceClay.com that discusses this concept in a great amount of detail. Siloing is the idea of only linking out to other pages on your site and outside resources that relate to that specific category or topic. So, if you had a cherry ice cream cone page, you would only link to resources discussing cherry ice cream cones. Information about chocolate ice cream cones and ice cream sundaes would either not be linked to or would be linked to using the nofollow tag like I showed you above. Controlling Link Flow Using Robots.txt: Finally, there's more than one way to block link love. You can also add this information to your robots.txt file. This handy file goes in the root folder of your Web server and tells search engines what not to spider and index.
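    As a minimal sketch of that robots.txt approach (the paths are hypothetical, matching the ice cream example above), a file that keeps crawlers out of unrelated sections might look like this:

        User-agent: *
        Disallow: /chocolate-ice-cream-cones/
        Disallow: /ice-cream-sundaes/

    Keep in mind that robots.txt blocks crawling of those URLs rather than sculpting PageRank per link, so it is a blunter instrument than the nofollow attribute shown above.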
Rob Laporte

Geo-Targeting Redirects: Cloaking or Better User Experience? - Search Engine Watch (SEW) - 0 views

  • If you have your site set to detect a visitor's location and show content based on that, I would recommend the following: Serve a unique URL for distinct content. For instance, don't show English content to US visitors on mysite.com and French content to French visitors on mysite.com. Instead, redirect English visitors to mysite.com/en and French visitors to mysite.com/fr. That way search engines can index the French content using the mysite.com/fr URL and can index English content using the mysite.com/en URL. Provide links to enable visitors (and search engines) to access other language/country content. For instance, if I'm in Zurich, you might redirect me to the Swiss page, but provide a link to the US version of the page. Or, simply present visitors with a home page that enables them to choose the country. You can always store the selection in a cookie so visitors are redirected automatically after the first time.
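    A rough sketch of that kind of home-page redirect, assuming Apache with the legacy mod_geoip module enabled (the country code and paths are examples, not a definitive recipe):

        GeoIPEnable On
        RewriteEngine On
        # Send French visitors who hit the root URL to /fr/; deep links are untouched
        RewriteCond %{ENV:GEOIP_COUNTRY_CODE} ^FR$
        RewriteRule ^/?$ /fr/ [L,R=302]

    Redirecting only the root URL while keeping /en/ and /fr/ as crawlable, linkable addresses follows the advice above: each language version lives at its own indexable URL.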
  • Google's policies aren't as inflexible as you're trying to portray. The same Google page you quote also says that intent ought to be a major consideration (just as when evaluating pages with hidden content). Also, why would Google's guidelines prevent you from using geotargeting without an immediate redirect? Just because you don't immediately redirect search users to a different page doesn't mean you have to ask for their zip code instead of using IP-based geotargeting. Lastly, I don't think using such redirects from SERPs improves user experience at all. If I click on a search result, then it's because that's the content I'm interested in. It's very annoying to click on a search result and get a page completely different from the SERP snippet. And what about someone who is on business in a different country? Search engines already provide different language localizations as well as language search options to favor localized pages for a particular region. So if someone goes to the French Google, they will see the French version of localized sites/pages. If they're seeing the U.S. version in their SERP, then it's because you didn't SEO or localize your pages properly, or they deliberately used the U.S. version of Google. Don't second-guess your users. Instead, focus on making sure that users know about your localized pages and can access them easily (by choice, not through force). Bill Hunt: Frank, you're spot on as usual. We still keep chasing this issue, and as I wrote on SEW last year in my article on language detection issues http://searchenginewatch.com/3634625, it is often more the implementation that is the problem than the actual redirect. Yes, it is exactly cloaking (maybe gray hat) when you have a single gateway such as "example.com" and a person coming from Germany sees the site in German, or in English if their IP was in New York. Engines typically crawl from a central location like Mountain View or Zurich, so they would only see the English version, since they would not provide signals for any other location. Where you really get into a tricky area is if you set it so that any user agent from a search engine can access any version they ask for, yet a human is restricted - sort of reverse cloaking. If Mr GoogleBot wants the French home page, let him have it rather than sending him to the US homepage. With the growth of CDNs (content delivery networks) I am seeing more and more of these issues crop up to handle load balancing as well as other forms of geographical targeting. I have a long list of global, multinational and enterprise related challenges that are complicated by many of Google's outdated ways of handling kindergarten-level spam tactics. Sounds like a SES New York session...
Rob Laporte

70+ Best Free SEO Tools (As Voted-for by the SEO Community) - 1 views

  • Soovle — Scrapes Google, Bing, Yahoo, Wikipedia, Amazon, YouTube, and Answers.com to generate hundreds of keyword ideas from a seed keyword. Very powerful tool, although the UI could do with some work.
  • Hemingway Editor — Improves the clarity of your writing by highlighting difficult-to-read sentences, "weak" words, and so forth. A must-have tool for bloggers (I use it myself).
  • Yandex Metrica — 100% free web analytics software. Includes heat maps, form analytics, session reply, and many other features you typically wouldn’t see in a free tool.
  • For example, two of my all-time favourite tools are gInfinity (Chrome extension) and Chris Ainsworth's SERPs extraction bookmarklet. By combining these two free tools, you can extract multiple pages of the SERPs (with meta titles + descriptions) in seconds.
  • ...17 more annotations...
  • Keyword Mixer — Combine your existing keywords in different ways to try and find better alternatives. Also useful for removing duplicates from your keywords list. Note: MergeWords does (almost) exactly the same job, albeit with a cleaner UI. However, there is no option to de-dupe the list.
  • LSIgraph.com — Latent Semantic Indexing (LSI) keywords generator. Enter a seed keyword, and it’ll generate a list of LSI keywords (i.e. keywords and topics semantically related to your seed keyword). TextOptimizer is another very similar tool that does roughly the same job.
  • Small SEO Tools Plagiarism Checker — Detects plagiarism by scanning billions of documents across the web. Useful for finding those who’ve stolen/copied your work without attribution.
  • iSearchFrom.com — Emulate a Google search using any location, device, or language. You can customise everything from SafeSearch settings to personalised search.
  • Delim.co — Convert a comma-delimited list (i.e. CSV) in seconds. Not necessarily an SEO tool per se but definitely very useful for many SEO-related tasks.
  • Am I Responsive? — Checks website responsiveness by showing you how it looks on desktop, laptop, tablet, and mobile.
  • SERPLab — Free Google rankings checker. Updates up to 50 keywords once every 24 hours (server permitting).
  • Varvy — Checks whether a web page is following Google’s guidelines. If your website falls short, it tells you what needs fixing.
  • JSON-LD Schema Generator — JSON-LD schema markup generator. It currently supports six markup types including: product, local business, event, and organization.
  • KnowEm Social Media Optimizer — Analyses your web page to see if it’s well-optimised for social sharing. It checks for markup from Facebook, Google+, Twitter, and LinkedIn.
  • Where Goes? — Shows you the entire path of meta-refreshes and redirects for any URL. Very useful for diagnosing link issues (e.g. complex redirect chains).
  • Google Business Review Link Generator — Generates a direct link to your Google Business listing. You can choose between a link to all current Google reviews, or to a pre-filled 5-star review box.
  • PublicWWW — Searches the web for pages using source code-based footprints. Useful for finding your competitors affiliates, websites with the same Google Analytics code, and more.
  • Keywordtool.io — Scrapes Google Autosuggest to generate 750 keyword suggestions from one seed keyword. It can also generate keyword suggestions for YouTube, Bing, Amazon, and more.
  • SERPWatcher — Rank tracking tool with a few unique metrics (e.g. “dominance index”). It also shows estimated visits and ranking distribution charts, amongst other things.
  • GTMetrix — Industry-leading tool for analysing the loading speed of your website. It also gives actionable recommendations on how to make your website faster.
  • Mondovo — A suite of SEO tools covering everything from keyword research to rank tracking. It also generates various SEO reports.
  • SEO Site Checkup — Analyse various on-page/technical SEO issues, monitor rankings, analyse competitors, create custom white-label reports, and more.
Rob Laporte

Local Search Tools For the SMB and Professional | Understanding Google Maps & Local Search - 0 views

  •  
    Local Search Tools For the SMB and Professional. I have been using two "new" local search tools of late and have been impressed with both of them. The Local Search Toolkit from seOverflow has recently been released from beta and upgraded to work with the many changes that occurred recently in Google Places. The tool provides competitive information for the top 7 listings in a given geo search. It will provide both URLs and totals for each of the following: site title tag, categories, citations, reviews, number of photos, number of videos, whether the listing is owner-verified, and the listing's distance to city center. It's free and provides a wealth of information. It's useful for determining which review sites are most prevalent in which industries and which citation sources are the most prominent. Another tool that I often use is the Whitespark Local Citation Finder. The free version has been around for a while and is also useful in finding citations for either keyword phrases, your own site, or those of a competitor. They just released the Local Citation Finder Pro version. The Pro version is $20/mo, and normally I do not write about products that charge a fee, but it has a new feature that I am finding incredibly useful (they provided me with a free subscription). Local Citations Pro now offers the ability to compare the specific citations between any number of searches and/or business listings. So, for example, you can examine your business listing and its citations against the citations for the listing that is tops in your category, and against the citations for a series of search phrases. The information is offered up both visually and via a spreadsheet file. Pro users also get these other features: Compare Citations (easily determine which citations your competitors have that you're missing); Sort by Value (sort your results by SEOmoz Domain Authority and Majestic SEO ACRank); and Get Results in Minutes.
Rob Laporte

How to "Recycle" PPC & Analytics Audience Data for SEO | Seer Interactive - 1 views

  • opportunities to “recycle” data we already have to find new and different insights. Think about it–if you’re working at a full-service agency, your overall team may have access to a client’s Google Analytics and Adwords, SEMRush, STAT, HotJar, SurveyMonkey, SpyFu, Twitter Analytics, etc.
  • To get started on utilizing other teams' data, you have to pay attention to what the other teams are doing.
  • the key to integration
  • To find opportunities for recycled data, you’ll need to work collaboratively with all teams on the project: SEO, PPC, and Analytics.
  • For this post, we’re going to focus on opportunities to use “recycled” data for SEO strategy.
  • If you work with different channel teams in your day-to-day, it’s easy to become complacent in your own world with your own data. By taking a look at what you have access to as a team, you’ll be able to get outside of your typical resources and can find recycled data opportunities to use for your client–without having to request more from your already strapped-for-time POC.
Rob Laporte

For small, private colleges, fewer students means more worries - The Boston Globe - 0 views

  • Springfield College saw a 26 percent drop in enrollment over the past two decades, from 2,844 to 2,114, but recently managed to stabilize its numbers and even saw an increase this year, to 2,228, according to Stuart Jones, the school's vice president for enrollment management. Among other tactics, the school used targeted digital marketing to recruit a subset of students it believed was likely to attend, he said.
jack_fox

Black market in Google reviews means you can't believe everything you read | CBC News - 0 views

  • a growing black market in which some companies pay for fake positive reviews, while others are seemingly being extorted by web firms who post negative comments then propose their "review-fixing" services to get them taken down.
  • When CBC News asked Google about Riverbend's complaints, including Pereira's own fake review, it was finally removed — along with 32 other one-star reviews. And as a result, the company's star rating went up from 3.6 to 4.1 overnight. 
  • There is no evidence that Google is planning to turn away from algorithm-based content moderation, or make the kind of massive human investments that Toscano and others are calling for.
jack_fox

How to Establish a Legal State Residency (Domicile) as a Nomad - Two Meander - 0 views

  • Another consideration that many nomads face is the difficulty of finding nationwide medical insurance plans. Most plans currently available only cover medical services within a limited network in your home state. This makes these plans nearly useless other than for emergency room care if you seldom return to your home state.
  • It is fairly common for nomads to switch everything from their former address to the address of a friend or relative when they are ready to get on the road. As long as your friend or relative is agreeable, trustworthy, and reliable this can be a good and simple solution. Staying at this address whenever you are in your home state, even if you sleep in the driveway, may further reinforce your legal argument that it is indeed your domicile. It is worth considering though whether your friend or relative may tire of managing your mail or may move while you are on the road. Many people who start off by using the address of a friend or relative eventually end up choosing a different option.
jack_fox

The Ultimate Web Server Security Guide @ MyThemeShop - 0 views

  • They could insert links into the site to boost their SEO rankings. Hackers can make a killing selling links from exploited sites. Alternatively, a hacker could deface the site and demand money to restore it (ransom). They could even place ads on the site and use the traffic to make money. In most cases, an attacker will also install backdoors into the server. These are deliberate security holes that allow them to come back and exploit the site in the future – even if the insecure plugin has been replaced.
  • Unfortunately, under WordPress, every plugin and theme has the ability to alter anything on the site. They can even be exploited to infect other apps and sites hosted on the same machine.
  • Theme developers are often relatively inexperienced coders. Usually, they’re professional graphic artists who have taught themselves a little PHP on the side. Plugins are another popular line of attack – they account for 22% of successful hacks. Put together, themes and plugins are a major source of security trouble.
  • Each person who uses your system should only have the privileges they need to perform their tasks.
  • Don’t depend on a single security measure to keep your server safe. You need multiple rings of defense.
  • Security exploits exist at all levels of the technology stack, from the hardware up. WP White Security revealed that 41% of WordPress sites are hacked through a weakness in the web host.
  • While it’s important to use a strong password, password cracking is not a primary focus for hackers.
  • the more software you have installed on your machine, the easier it is to hack – even if you aren’t using the programs! Clearly, programs that are designed to destroy your system are dangerous. But even innocent software can be used in an attack.
  • There are 3 ways to reduce the attack surface: (1) run fewer processes; (2) uninstall programs you don't need; (3) build a system from scratch that only has the processes you need.
  • A really good authentication system uses multiple tests. Someone could steal or guess your password. They could grab your laptop with its cryptographic keys.
  • If you want to run multiple processes at the same time, you need some way of managing them. This is basically what a kernel is. It does more than that – it handles all of the complex details of the computer hardware, too. And it runs the computer’s networking capabilities
  • programs exist as files when they are not running in memory
  • SELinux’s default response is to deny any request.
  • SELinux is extremely comprehensive, but this power comes at a price. It’s difficult to learn, complex to set up, and time-consuming to maintain.
  • AppArmor is an example of a MAC tool, although it’s nowhere near as comprehensive as SELinux. It applies rules to programs to limit what they can do.
  • AppArmor is relatively easy to set up, but it does require you to configure each application and program one by one. This puts the onus for security in the hands of the user or sysadmin. Often, when new apps are added, users forget to configure AppArmor. Or they do a horrible job and lock themselves out, so their only option is to disable the profile. That said, several distributions have adopted AppArmor.
  • Generic profiles shipped by repo teams are designed to cover a wide range of different use cases, so they tend to be fairly loose. Your specific use cases are usually more specific. In this case, it pays to fine-tune the settings, making them more restrictive.
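    For reference, the usual AppArmor tuning loop uses the apparmor-utils commands (the Apache profile name here is an example):

        # Put a profile into complain mode and watch what the app actually needs
        sudo aa-complain /etc/apparmor.d/usr.sbin.apache2
        # Tighten the profile based on the logged denials, then enforce it
        sudo aa-enforce /etc/apparmor.d/usr.sbin.apache2
        # List loaded profiles and their modes
        sudo aa-status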
  • GRSecurity is a suite of security enhancements
  • In the future, this could become a viable option. For now, we’ll use Ubuntu and AppArmor.
  • Apache is a user-facing service – it’s how your users interact with your website. It’s important to control this interaction too.
  • If your Apache configuration is bad, these files can be viewed as plain text. All of your code will be visible for anyone to see – this potentially includes your database credentials, cryptographic keys, and salts.
  • You can configure Apache to refuse any requests for these essential directories using .htaccess files. These are folder-level configuration files that Apache reads before it replies to a request.
  • The primary use for .htaccess files is to control access
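    A minimal sketch of such a rule, using Apache 2.4 syntax to shield the WordPress configuration file:

        # .htaccess - block all web requests for wp-config.php
        <Files "wp-config.php">
            Require all denied
        </Files>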
  • If an attacker knows your WordPress cryptographic salts, they can use fake cookies to trick WordPress into thinking they have logged on already.
  • If the hacker has physical access to the computer, they have many options at their disposal. They can type commands through the keyboard, or insert a disk or USB stick into the machine and launch an attack that way.
  • When it comes to network-based attacks, attackers have to reach through one of the machine’s network ports.
  • For an attacker to exploit a system, they have to communicate to a process that’s listening on a port. Otherwise, they’d simply be sending messages that are ignored. This is why you should only run processes that you need for your site to run. Anything else is a security risk.
  • Often, ports are occupied by processes that provide no real valuable service to the machine’s legitimate users. This tends to happen when you install a large distribution designed for multiple uses. Large distros include software that is useless to you in terms of running a website. So the best strategy is to start with a very lightweight distro and add the components you need.
  • If you see any unnecessary processes, you can shut them down manually. Better yet, if the process is completely unnecessary, you can remove it from your system.
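    On Ubuntu, a quick audit might look like this (the service name is only an example):

        # List processes listening on network ports
        sudo ss -tlnp
        # Stop and disable a service you don't need
        sudo systemctl disable --now cups
        # Remove the package entirely
        sudo apt-get purge cups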
  • Firewalls are quite similar to access control within the computer. They operate on a network level, and you can use them to enforce security policies. A firewall can prevent processes from broadcasting information from a port. It can stop outside users from sending data to a port. And it can enforce more complex rules.
  • Simply installing and running a firewall does not make your host machine secure – it’s just one layer in the security cake. But it’s a vital and a powerful one.
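    As a sketch, a default-deny setup with Ubuntu's ufw for a standard web server reachable over SSH might be:

        sudo ufw default deny incoming
        sudo ufw default allow outgoing
        sudo ufw allow 22/tcp    # SSH
        sudo ufw allow 80/tcp    # HTTP
        sudo ufw allow 443/tcp   # HTTPS
        sudo ufw enable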
  • First of all, we need to configure our software to resist common attacks. But that can only protect us from attacks we know about. Access control software, such as AppArmor, can drastically limit the damage caused by unauthorized access. But you still need to know an attack is in progress.
  • This is where Network Intrusion Detection Software (NIDS) is essential. It scans the incoming network traffic, looking for unusual patterns or signs of a known attack. If it sees anything suspicious, it logs an alert.
  • It’s up to you to review these logs and act on them.
  • If it’s a false alarm, you should tune your NIDS software to ignore it. If it’s an ineffective attack, you should review your security and block the attacker through the firewall.
  • That’s why it’s essential to have an automated backup system. Finally, you need to understand how the attack succeeded, so you can prevent it from recurring. You may have to change some settings on your Firewall, tighten your access rules, adjust your Apache configuration, and change settings in your wp-config file. None of this would be possible without detailed logs describing the attack.
  • Every web server has a breaking point and dedicated DOS attackers are willing to increase the load until your server buckles. Good firewalls offer some level of protection against naive DOS attacks
  • a tiny number of sites (less than 1%) are hacked through the WordPress core files
  • Major DNS attacks have taken down some of the biggest sites in the world – including Ebay and Paypal. Large hosting companies like Hostgator and Blue Host have been attacked. It’s a serious risk!
  • Right now, due to the way the web currently works, it's impossible to download a web page without the IP address of a server. In the future, technologies like IPFS and MaidSafe could change that.
  • So there are 2 benefits to using a CDN. The first is that your content gets to your readers fast. The second benefit is server anonymity – nobody knows your real IP address – including the psychos. This makes it pretty impossible to attack your server – nobody can attack a server without an IP address.
  • When CDNs discover a DDOS attack, they have their own ways to deal with it. They often display a very lightweight “are you human?” message with a captcha. This tactic reduces the bandwidth costs and screens out the automated attacks.
  • If any of your DNS records point to your actual server, then it’s easy to find it and attack it. This includes A records (aliases) and MX records (mail exchange). You should also use a separate mail server machine to send your emails. Otherwise, your email headers will expose your real email address.
  • If your hosting company refuses to give you a new IP address, it may be time to find a new service provider.
  • WordPress uses encryption to store passwords in the database. It doesn’t store the actual password – instead, it stores an encrypted version. If someone steals your database tables, they won’t have the actual passwords.
  • If you used a simple hash function, a hacker could gain privileged access to your app in a short period of time.
  • The salt strings are stored in your site’s wp-config.php file.
  • Salts dramatically increase the time it would take to get a password out of a hash code – instead of taking a few weeks, it would take millions of years
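    WordPress can generate a fresh set of salts for you; one common way to pull new values for wp-config.php is the official secret-key service:

        curl -s https://api.wordpress.org/secret-key/1.1/salt/

    Paste the output over the existing define() lines in wp-config.php; changing the salts simply forces every user to log in again.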
  • You keep the other key (the decryption key) to yourself. If anyone stole it, they could decode your private messages! These 2-key cryptographic functions do exist. They are the basis of TLS (https) and SSH.
  • the most secure systems tend to be the simplest. The absolutely secure machine would be one that was switched off.
  • For WordPress sites, you also need PHP and a database.
  • A VM is an emulated computer system running inside a real computer (the host). It contains its own operating system and resources, such as storage, and memory. The VM could run a completely different operating system from the host system – you could run OSX in a VM hosted on your Windows machine
  • This isolation offers a degree of protection. Let’s imagine your VM gets infected with a particularly nasty virus – the VM’s file system could be completely destroyed, or the data could be hopelessly corrupted. But the damage is limited to the VM itself. The host environment would remain safe.
  • This is how shared hosting and virtual private servers (VPSes) work today. Each customer has access to their own self-contained environment, within a virtual machine.
  • VMs are not just for hosting companies. If you’re hosting multiple sites on a dedicated server or a VPS, VMs can help to make your server more secure. Each site can live inside its own VM. That way, if one server is hacked, the rest of your sites are safe.
  • Even with all these considerations, the benefits of VMs outweigh their drawbacks. But performance is vital on the web.
  • Containers (like Docker) are very similar to VMs.
  • Because we’ve cut the hypervisor out of the loop, applications run much faster – almost as fast as processes in the host environment. Keeping each container separate does involve some computation by the container software. But it’s much lighter than the work required by a hypervisor!
  • Docker Cloud is a web-based service that automates the task for you. It integrates smoothly with the most popular cloud hosting platforms (such as Amazon Web Services, or Digital Ocean).
  • With containers, you can guarantee that the developer’s environment is exactly the same as the live server. Before the developer writes a single line of code, they can download the container to their computer. If the code works on their PC, it will work on the live server. This is a huge benefit of using containers, and it’s a major reason for their popularity.
  • A complete stack of these layers is called an “image”
  • The core of Docker is the Docker Engine – which lives inside a daemon – or long-running process
  • another great resource – the Docker Hub. The hub is an online directory of community-made images you can download and use in your own projects. These include Linux distributions, utilities, and complete applications.
  • Docker has established a relationship with the teams behind popular open source projects (including WordPress) – these partners have built official images that you can download and use as-is.
  • when you finish developing your code, you should wrap it up inside a complete container image. The goal is to put all the code that runs your site inside a container and store the volatile data in a volume.
  • Although Docker can help to make your site more secure, there are a few major issues you need to understand: the Docker daemon runs as a superuser; it's possible to load the entire filesystem into a container; and it's possible to pass a reference to the Docker daemon into a container.
  • The solution to this issue is to use a MAC solution like SELinux, GRSecurity or AppArmor.
  • Never let anyone trick you into running a strange docker command.
  • only download and use Docker images from a trustworthy source. Official images for popular projects are security audited by the Docker team. Community images are not.
  • there are the core WordPress files. These interact with the web server through the PHP runtime. WordPress also relies on the file system and a database server.
  • A service is some software component that listens for requests (over a protocol) and does something when it receives those requests.
  • Using Docker, you could install WordPress, Apache, and PHP in one container, and run MySQL from another. These containers could run on the same physical machine, or on different ones
  • The database service container can be configured to only accept connections that originate from the web container. This immediately removes the threat of external attacks against your database server
  • This gives you the perfect opportunity to remove high-risk software from your host machine, including: Language Runtimes and interpreters, such as PHP, Ruby, Python, etc. Web servers Databases Mail Servers
  • If a new version of MySQL is released, you can update the database container without touching the web container. Likewise, if PHP or Apache are updated, you can update the web container and leave the database container alone.
  • Because Docker makes it easy to connect these containers together, there’s no reason to lump all your software inside a single container. In fact, it’s a bad practice – it increases the security risk for any single container, and it makes it harder to manage them.
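    As a rough sketch of that separation using the official images (the names, password, and volumes are placeholders):

        # Private network so the database is never exposed to the internet
        docker network create wp_net
        # Database container, reachable only from containers on wp_net
        docker run -d --name wp-db --network wp_net \
          -e MYSQL_ROOT_PASSWORD=changeme -e MYSQL_DATABASE=wordpress \
          -v db_data:/var/lib/mysql mysql:5.7
        # WordPress (Apache + PHP) container, the only one publishing a port
        docker run -d --name wp-web --network wp_net -p 80:80 \
          -e WORDPRESS_DB_HOST=wp-db -e WORDPRESS_DB_PASSWORD=changeme \
          -v wp_data:/var/www/html wordpress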
  • If your site is already live on an existing server, the best approach is to set up a new host machine and then migrate over to it. Here are the steps you need to take:
  • With a minimal Ubuntu installation, you have a fairly bare-bones server. You also have the benefit of a huge repository of software you can install if you want.
  • If access control is like a lock protecting a building, intrusion detection is the security alarm that rings after someone breaks in.
  • Logging on to your host with a superuser account is a bad practice. It’s easy to accidentally break something.
  • Fail2ban blocks SSH users who fail the login process multiple times. You can also set it up to detect and block hack attempts over HTTP – this will catch hackers who attempt to probe your site for weaknesses.
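    A minimal /etc/fail2ban/jail.local enabling the SSH jail might look like this (the thresholds are examples to tune):

        [sshd]
        enabled  = true
        maxretry = 5
        bantime  = 3600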
  • With multiple WordPress sites on your machine, you have 2 choices. You could create a new database container for each, or you could reuse the same container between them. Sharing the DB container is a little riskier, as a hacker could, theoretically, ruin all your sites with one attack. You can minimize that risk by: using a custom root user and password for your database (don't use the default username of 'root'); ensuring the DB container is not accessible over the internet (hide it away inside a Docker network); and creating new databases and users for each WordPress site, ensuring each user only has permissions for their specific database.
  • What are the benefits of using a single database container? It’s easier to configure and scale. It’s easier to backup and recover your data. It’s a little lighter on resources.
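    A sketch of the per-site separation described above, inside a shared database container (the admin user, names, and password are placeholders):

        # Connect as your custom admin user, then isolate each site
        mysql -u admin -p <<'SQL'
        CREATE DATABASE site1;
        CREATE USER 'site1user'@'%' IDENTIFIED BY 'changeme';
        GRANT ALL PRIVILEGES ON site1.* TO 'site1user'@'%';
        FLUSH PRIVILEGES;
        SQL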
  • you could also add a caching container, like Varnish. Varnish caches your content so it can serve pages quickly – much faster than WordPress can
  • Docker has the ability to limit how much processor time and memory each container gets. This protects you against exhaustion DOS attacks
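    Docker exposes these caps as flags on docker run (the values are illustrative):

        docker run -d --name wp-web --memory=512m --cpus=1.0 -p 80:80 wordpress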
  • A containerized process still has some of the abilities of root, making it more powerful than a regular user. But it’s not as bad as full-on root privileges. With AppArmor, you can tighten the security further, preventing the process from accessing any parts of the system that do not relate to serving your website.
  • Docker Hub works like GitHub – you can upload and download images for free. The downside is that there’s no security auditing. So it’s easy to download a trojan horse inside a container.
  • Official images (such as WordPress and Apache) are audited by the Docker team. These are safe. Community images (which have names like user/myapp) are not audited.
  • a kernel exploit executed inside a container will affect the entire system. The only way to protect against kernel exploits is to regularly update the host system
  • Containers run in isolation from the rest of the system. That does not mean you can neglect security – your website lives inside these containers! Even if a hacker cannot access the full system from a container, they can still damage the container’s contents.
  • Under Ubuntu, AppArmor already protects you – to a degree. The Docker daemon has an AppArmor profile, and each container runs under a default AppArmor profile. The default profile prevents an app from breaking out of the container, and restricts it from doing things that would harm the system as a whole. However, the default profile offers no specific protection against WordPress specific attacks. We can fix this by creating a custom profile for your WordPress container.
  • The net effect is that it’s impossible to install malware, themes or plugins through the web interface. We’ve already covered this to some degree with the .htaccess rules and directory permissions. Now we’re enforcing it through the Linux kernel.
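    In AppArmor profile syntax, the heart of such a rule set might look like this (paths assume a standard WordPress docroot):

        # Code is readable but not writable by the web process
        /var/www/html/** r,
        deny /var/www/html/wp-content/plugins/** w,
        deny /var/www/html/wp-content/themes/** w,
        # Media uploads remain writable
        /var/www/html/wp-content/uploads/** rw,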
  • There are versions of Docker for Mac and PC, so you’ll be able to run your site from your home machine. If the code works on your PC, it will also work on the server.
  • Tripwire tends to complain about the entries in the /proc filespace, which are auto-generated by the Linux kernel. These files contain information about running processes, and they tend to change rapidly while Linux runs your system. We don’t want to ignore the directory entirely, as it provides useful signs that an attack is in progress. So we’re going to have to update the policy to focus on the files we are interested in.
  • Now we should install an e-mail notification utility – to warn us if anything changes on the system. This will enable us to respond quickly if our system is compromised (depending on how often you check your emails).
  • Rootkits are malicious code that hackers install onto your machine. When they manage to get one on your server, it gives them elevated access to your system
  • Tripwire is configured to search in key areas. It’s good at detecting newly installed software, malicious sockets, and other signs of a compromised system. RKHunter looks in less obvious places, and it checks the contents of files to see if they contain known malicious code. RKHunter is supported by a community of security experts who keep it updated with known malware signatures – just like antivirus software for PCs.
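    Both tools are typically run from cron, but the basic commands (from the standard Ubuntu packages) are:

        # Tripwire: build the baseline database, then check the system against it
        sudo tripwire --init
        sudo tripwire --check
        # RKHunter: record file properties, then scan for known rootkit signatures
        sudo rkhunter --propupd
        sudo rkhunter --check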
  • If your hosting company offers the option, this would be a good point to make an image of your server. Most cloud hosting companies offer tools to do this.
  • With an image, it’s easy to launch new servers or recover the old one if things go horribly wrong.
  • We’ve hidden our server from the world while making it easy to read our content We’ve built a firewall to block malicious traffic We’ve trapped our web server inside a container where it can’t do any harm We’ve strengthened Linux’s access control model to prevent processes from going rogue We’ve added an intrusion detection system to identify corrupted files and processes We’ve added a rootkit scanner We’ve strengthened our WordPress installation with 2-factor authentication We’ve disabled the ability for any malicious user to install poisoned themes or plugins
  • Make a routine of checking the logs (or emails if you configured email reporting). It’s vital to act quickly if you see any warnings. If they’re false warnings, edit the configuration. Don’t get into a habit of ignoring the reports.
  • Virtually everything that happens on a Linux machine is logged.
  • You have to make a habit of checking for new exploits and learning how to protect yourself against them. Regularly check for security patches and issues in the core WordPress app (see the WordPress Security Notices). Also, check regularly on the forums or mailing lists for the plugins and themes you use on your site.
  • One thing we haven't added is a network-level intrusion detection service – you can fix that by installing Snort or PSAD.
  • The only way to guarantee your safety is to constantly update your security tactics and never get complacent.
jack_fox

Does Search Console's "target audience in:" setting have any effect? : TechSEO - 0 views

  • All you do with that doing is to very strongly hint us that your content is more relevant to users in the region you've set. This may help a little in that particular region, but won't affect your site in other regions.
jack_fox

Bing Webmaster Tools (Step-By-Step Guide for 2020) - 0 views

  • this report automatically runs every other week and scans any verified website in your account
jack_fox

What To Do If You Lose Your Two-Factor Phone - 0 views

  • Other services even let you remove your two-factor security through email. They’ll send you an email at your registered email address and let you click through a few dialogs to remove the protection and gain access to your account. That’s not good for security—it means an attacker with access to your email could easily remove your two-step verification—but many services do it anyway.
Rob Laporte

Record-Breaking Black Friday Paves Way For $1 Billion Cyber-Monday - 0 views

  •  
    Another compelling batch of data released over the weekend concerns the contribution of non-PC devices (smartphones and tablets) to Black Friday sales. IBM reported a marked increase in mobile shopping: mobile traffic increased to 14.3 percent . . . compared to 5.6 percent in 2010; sales on mobile devices surged to 9.8 percent from 3.2 percent year over year; mobile shopping was led by Apple, with the iPhone and iPad ranking one and two for consumers shopping on mobile devices . . . Android came in third at 4.1 percent; and shoppers using the iPad led to more retail purchases more often per visit than other mobile devices, with conversion rates reaching 4.6 percent compared to 2.8 percent for overall mobile devices.