When you click a PASF topic that appears under a domain, 78.87 percent of the time that domain doesn’t rank anywhere in the top 20 of the new SERP. We take this to mean one of two things: either Google knows better than to show you the same site you just left, or, even though the site is topically related, it isn’t relevant enough for the new SERP.
So how’s an SEO to take advantage of all this oddity? Well, if you’re struggling to rank higher, or to rank at all, against your current query’s competitors, it may be worth investigating the SERPs of PASF topics as alternate avenues.
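The 78.87 percent figure above can be reproduced with a simple overlap check, sketched below. This assumes you've already collected the top-20 results for each PASF query (for example, from a rank-tracker export); the domains and queries here are purely illustrative.

```python
# Sketch: measure how often a domain is absent from the SERPs of its own
# PASF topics. `pasf_serps` maps each PASF topic to an ordered list of
# ranked domains (illustrative data, not real SERPs).
def pasf_absence_rate(domain, pasf_serps):
    missing = sum(
        1 for results in pasf_serps.values()
        if domain not in results[:20]  # not anywhere in the top 20
    )
    return missing / len(pasf_serps)

serps = {
    "best running shoes": ["example.com", "othersite.com"],
    "trail running tips": ["blog.example.org", "runnersworld.com"],
}
rate = pasf_absence_rate("example.com", serps)
print(f"{rate:.0%} of PASF SERPs exclude the domain")
```

A rate well above 50% for your own domain suggests those PASF SERPs are open territory rather than queries you already cover.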
While the decline is concerning, the reality is that the volume of searches on Google continues to increase, partially offsetting the impact of that decline.
Spend more of your content budget on individualization
Develop content that answers user questions. Analyze competing sites for questions they’re answering that you’re not.
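That competitor gap analysis can be roughed out programmatically. A minimal sketch, assuming you've already scraped competitor page copy (the text below is an illustrative stand-in):

```python
import re

# Sketch: pull question sentences out of competitor page text so you can
# compare them against the questions your own content already answers.
def extract_questions(text):
    # Grab capitalized sentences that end in a question mark.
    return [q.strip() for q in re.findall(r"[A-Z][^.?!]*\?", text)]

competitor_copy = (
    "How long do running shoes last? Most pairs wear out by 500 miles. "
    "Should you rotate two pairs? Many coaches say yes."
)
our_questions = {"How long do running shoes last?"}
gaps = [q for q in extract_questions(competitor_copy) if q not in our_questions]
print(gaps)
```

Anything left in `gaps` is a question a competitor answers that your content doesn't, i.e. a candidate topic.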
There are opportunities to “recycle” data we already have to find new and different insights. Think about it: if you’re working at a full-service agency, your overall team may have access to a client’s Google Analytics and AdWords, SEMRush, STAT, HotJar, SurveyMonkey, SpyFu, Twitter Analytics, etc.
To get started utilizing other teams’ data, you have to pay attention to what those teams are doing.
To find opportunities for recycled data, you’ll need to work collaboratively with all teams on the project: SEO, PPC, and Analytics.
For this post, we’re going to focus on opportunities to use “recycled” data for SEO strategy.
If you work with different channel teams in your day-to-day, it’s easy to become complacent in your own world with your own data. By taking stock of what you have access to as a team, you can get outside of your typical resources and find recycled-data opportunities to use for your client, without having to request more from your already strapped-for-time POC.
So Google has had historic guidelines that said you want to keep your meta description tag between about 160 and 180 characters; I think that was the number. They've updated that to say there's no official recommended meta description length. But on Twitter, Danny Sullivan said that he would probably not make it greater than 320 characters. In fact, we and other data providers who collect a lot of search results didn't find many that extended beyond 300 characters. So I think that's a reasonable ceiling.
Now it's sitting at about 51% of search results that have these longer snippets in at least one of the top 10 positions, as of December 2nd.
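If you want to audit your own pages against that 320-character ceiling, a rough check is sketched below. The regex and sample HTML are illustrative; in practice you'd fetch real pages and use a proper HTML parser.

```python
import re

# Sketch: flag meta descriptions that exceed a chosen ceiling
# (320 characters, per the hedged guidance quoted above).
MAX_LEN = 320

def meta_description(html):
    # Naive extraction; assumes the name attribute precedes content.
    match = re.search(
        r'<meta\s+name="description"\s+content="([^"]*)"', html, re.I
    )
    return match.group(1) if match else None

page = '<head><meta name="description" content="Short and sweet."></head>'
desc = meta_description(page)
if desc and len(desc) > MAX_LEN:
    print("Too long:", len(desc))
else:
    print("OK:", len(desc) if desc else "no description found")
```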
According to Google, a penalty for spammy structured markup does exist. On the Webmasters Forum, many people report receiving a message in Search Console’s Manual Actions saying that their website’s schema code is spammy and violates Google’s quality guidelines.
Use structured data for visible content only;
Check and fix any warnings with Google’s testing tool;
Use the markup type appropriate to each page within your website;
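The "visible content only" guideline above can be roughed out as an automated sanity check: walk the JSON-LD and confirm its values actually appear in the page's visible text. This is only a sketch with illustrative data; real validation should go through Google's Rich Results Test.

```python
import json

# Sketch: flag JSON-LD string values that don't appear in the visible
# page text (a rough proxy for "mark up visible content only").
def jsonld_values(node):
    if isinstance(node, dict):
        for key, value in node.items():
            if not key.startswith("@"):  # skip @type, @context, etc.
                yield from jsonld_values(value)
    elif isinstance(node, list):
        for item in node:
            yield from jsonld_values(item)
    elif isinstance(node, str) and not node.startswith(("http", "@")):
        yield node

page_text = "Acme Anvil - $19.99 - In stock now."
markup = json.loads(
    '{"@type": "Product", "name": "Acme Anvil", "offers": {"price": "19.99"}}'
)
hidden = [v for v in jsonld_values(markup) if v not in page_text]
print("Values not visible on page:", hidden)
```

An empty list here just means nothing obviously hidden was marked up; it's no substitute for Google's own testing tool.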
John Mueller said that, in most cases, the site’s ranking might not get affected by the loss of structured markup data.
In practice, if the structured data team takes manual action on a site, only the rich snippets are affected. So spammy structured data doesn’t affect a site’s rankings; the rest of the site is still shown normally in search.
If you're targeting users in different locations—for example, if you have a site in French that you want users in France, Canada, and Mali to read—don't use this tool to set France as a geographic target.