
Home/ DISC Inc/ Group items tagged google


Rob Laporte

Google's internal SEO strategy: Make small changes, embrace change, consolidate - Searc... - 0 views

  • Small changes make a big impact. Google’s first point is that with large sites, making small changes can often yield a big impact and return when it comes to search rankings. Google plotted the growth of one of its 7,000 websites, the Google My Business marketing site, showing how adding canonicals, adding hreflang to their XML sitemaps, and improving their metadata all resulted in gains in their organic search traffic. Here is that chart:
  • Here is the chart showing the improvement after making the AMP error fixes:
  • Consolidation. For the past several years, many SEOs have been saying “less is more,” meaning that having fewer sites and fewer pages with higher-quality content often leads to better SEO results. Google says that works for them, and they have been working on consolidating their sites. Google said they found a “large number” of near-duplicate sites across their properties. “Duplicate content is not only confusing for users, it’s also confusing for search engines,” Google said. Google added, “Creating one great site instead of multiple microsites is the best way to encourage organic growth over time.” In one case study, involving the Google Retail site, Google consolidated the content of six old websites. They made “one great website,” and it led to a doubling of the site’s call-to-action click-through rate and a 64% increase in organic traffic.
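The hreflang-in-sitemap tactic mentioned in the first annotation can be sketched programmatically. This is a minimal illustration, not Google's own tooling; the URLs and locales are invented placeholders. It generates the xhtml:link alternate entries that an XML sitemap uses to declare language versions:

```python
import xml.etree.ElementTree as ET

# Hypothetical pages and their language alternates (illustration only).
PAGES = {
    "https://example.com/en/": {"en": "https://example.com/en/",
                                "de": "https://example.com/de/"},
    "https://example.com/de/": {"en": "https://example.com/en/",
                                "de": "https://example.com/de/"},
}

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
XHTML_NS = "http://www.w3.org/1999/xhtml"

def build_sitemap(pages):
    """Build a sitemap where each <url> carries hreflang alternates."""
    ET.register_namespace("", SITEMAP_NS)
    ET.register_namespace("xhtml", XHTML_NS)
    urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
    for loc, alternates in pages.items():
        url = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
        ET.SubElement(url, f"{{{SITEMAP_NS}}}loc").text = loc
        for lang, href in alternates.items():
            link = ET.SubElement(url, f"{{{XHTML_NS}}}link")
            link.set("rel", "alternate")
            link.set("hreflang", lang)
            link.set("href", href)
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap(PAGES))
```

Note that each URL lists all alternates including itself, which is how hreflang annotations are conventionally made reciprocal.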
Rob Laporte

5 Things Google Ads can now do automatically - Search Engine Land - 0 views

  • Smart campaigns for small business Along with Google’s recent rebranding of AdWords to Google Ads, they announced the arrival of a new automated campaign type for small businesses, called Smart Campaigns. This campaign type, now available in the US, is built on top of AdWords Express, and according to Google, it can produce significantly better results. For now, this will become the default campaign type for new advertisers. The target users of this type of campaign might have chosen AdWords Express or Local Service Ads in the past, and those options will remain available until further notice from Google. If a small business decides to work with an agency or wants to venture into PPC management, it can still opt for the full Google Ads experience. This means they can choose from varying levels of automation and make decisions about where to trade off using machine learning to drive results with manual management that provides more control.
  • What is automated: In the case of Smart Bidding strategies like Target CPA, Target ROAS and Enhanced CPC, Google automatically predicts the likelihood of conversions by looking at auction-time signals including device, location, language, dayparts and more. These predictions feed the automated bids used in every unique auction. What still needs to be done manually: While Google can predict changes in conversion rate and conversion value based on factors that apply broadly across a range of advertisers, these systems don’t yet consider the unique factors that affect an individual advertiser. This means advertisers should supplement “automated” bid strategies with a management methodology that changes targets based on business-specific conversion factors. Flash sales, media coverage, weather, social media buzz and so on can all affect how an ad campaign converts, but these factors may not be apparent to Google’s machine learning, so an advertiser who is aware of them must still do active bid management. Instead of managing a max CPC bid, though, management now entails changing the target.
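The "change the target, not the max CPC" idea above can be sketched as a tiny policy function. The factor names and multipliers below are invented for illustration, and real target adjustments would be judged case by case; the point is only that business-specific conversion-rate knowledge translates into a target change:

```python
# Expected conversion-rate lift from factors Google's models can't see.
# All factor names and multipliers here are invented examples.
FACTOR_LIFT = {
    "flash_sale": 1.30,      # site-wide sale announced this morning
    "press_coverage": 1.15,  # product featured in a news article
    "bad_weather": 0.90,     # storm suppressing local pickup orders
}

def adjusted_target_cpa(base_target_cpa, active_factors):
    """Scale a Target CPA by the combined expected conversion-rate lift."""
    lift = 1.0
    for factor in active_factors:
        lift *= FACTOR_LIFT.get(factor, 1.0)
    return round(base_target_cpa * lift, 2)

# During a flash sale in bad weather, a $50 target becomes:
print(adjusted_target_cpa(50.00, ["flash_sale", "bad_weather"]))
```

The design choice to multiply lifts assumes the factors act independently, which is itself a simplification an advertiser would want to sanity-check.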
Rob Laporte

A deep dive into BERT: How BERT launched a rocket into natural language understanding -... - 0 views

  • Google describes BERT as the largest change to its search system since the company introduced RankBrain, almost five years ago, and probably one of the largest changes in search ever.
  • It is not so much a one-time algorithmic change but rather a fundamental layer that seeks to help with understanding and disambiguating the linguistic nuances in sentences and phrases, continually fine-tuning itself and adjusting to improve.
  • BERT achieved state-of-the-art results on 11 different natural language processing tasks.  These natural language processing tasks include, amongst others, sentiment analysis, named entity determination, textual entailment (aka next sentence prediction), semantic role labeling, text classification and coreference resolution. BERT also helps with the disambiguation of words with multiple meanings known as polysemous words, in context.
  • “Wouldn’t it be nice if Google understood the meaning of your phrase, rather than just the words that are in the phrase?” said Google’s Eric Schmidt back in March 2009, just before the company announced rolling out its first semantic offerings. This signaled one of the first moves from “strings to things,” and was perhaps the advent of entity-oriented search implementation by Google.
  • On the whole, however, much of language can be resolved by mathematical computations around where words live together (the company they keep), and this forms a large part of how search engines are beginning to resolve natural language challenges (including the BERT update).
  • Google’s team of linguists (Google Pygmalion) working on Google Assistant, for example, in 2016 was made up of around 100 Ph.D. linguists.
  • By 2019, the Pygmalion team was an army of 200 linguists around the globe
  • BERT in search is mostly about resolving linguistic ambiguity in natural language. BERT provides text-cohesion, which often comes from the small details in a sentence that provide structure and meaning.
  • BERT is not an algorithmic update like Penguin or Panda, since BERT does not judge web pages either negatively or positively; rather, it improves Google Search’s understanding of human language. As a result, Google understands much more about the meaning of content on the pages it comes across, and about the queries users issue, taking a word’s full context into consideration.
  • BERT is about sentences and phrases
  • We may see this reduction in recall reflected in the number of impressions we see in Google Search Console, particularly for pages with long-form content which might currently be in recall for queries they are not particularly relevant for.
  • International SEO may benefit dramatically too
  • Question and answering directly in SERPs will likely continue to get more accurate which could lead to a further reduction in click through to sites.
  • Can you optimize your SEO for BERT? Probably not. The inner workings of BERT are complex and multi-layered, so much so that there is now even a field of study called “Bertology,” created by the team at Hugging Face. It is highly unlikely that any search engineer questioned could explain why something like BERT makes the decisions it does with regard to rankings (or anything else). Furthermore, since BERT can be fine-tuned across parameters and multiple weights, then self-learns in an unsupervised feed-forward fashion in a continual loop, it is considered a black-box algorithm, a form of unexplainable AI. BERT itself is thought to not always “know” why it makes the decisions it makes. How, then, are SEOs expected to “optimize” for it? BERT is designed to understand natural language, so keep it natural. We should continue to create compelling, engaging, informative and well-structured content and website architectures the same way we would write, and build sites, for humans.
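The distributional idea quoted earlier, that much of word meaning can be resolved mathematically from "the company a word keeps," can be shown with a toy sketch. This is emphatically not BERT, which uses deep contextual embeddings; it is a bare-bones sentence-level co-occurrence model over an invented five-sentence corpus, illustrating why "bank" lands nearer the financial sense when most of its contexts are financial:

```python
from collections import Counter
from math import sqrt

# A tiny invented corpus; real distributional models use billions of words.
CORPUS = [
    "the bank approved the loan",
    "the bank raised interest rates",
    "the river bank was muddy",
    "the credit union approved the loan",
    "the river was muddy after rain",
]

def cooccurrence_vector(word, sentences):
    """Count the words that appear in the same sentence as `word`."""
    counts = Counter()
    for sentence in sentences:
        tokens = sentence.split()
        if word in tokens:
            counts.update(t for t in tokens if t != word)
    return counts

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    norm_a = sqrt(sum(c * c for c in a.values()))
    norm_b = sqrt(sum(c * c for c in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

bank = cooccurrence_vector("bank", CORPUS)
union = cooccurrence_vector("union", CORPUS)
river = cooccurrence_vector("river", CORPUS)

# "bank" mostly keeps financial company in this corpus, so its vector
# sits closer to "union" than to "river".
print(round(cosine(bank, union), 3), round(cosine(bank, river), 3))
```

A model at this granularity cannot disambiguate a single polysemous occurrence, which is exactly the gap that contextual models like BERT address.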
jack_fox

How To Get Your Images in a Featured Snippet | UpBuild - 0 views

  • when interacting with (i.e. clicking on) a featured snippet image, presently you are kept within Google’s ecosystem and taken to an expanded version of the image under Google Image Search results
  • In May 2019, Google announced an upcoming feature for Image Search on mobile that would give searchers the option to “swipe up to navigate” to the AMP article associated with the image
  • Google could begin experimenting or simply announce a decision to bypass the Google Image Search step altogether and take searchers directly to the source page
  • Simply ranking first in Google Images for the search term will not guarantee your image is included in the snippet, but ranking an image for a search term that generates a snippet is important for eligibility.
  • Including srcset= and src= is a very important part of image accessibility
  • Google Images best practices state that you should avoid embedding text in images, especially important text elements like page headings and menu items because not all users can access them
  • there are instances where showing text in an image is useful. Take search queries such as, “how to write a check”
  • I would recommend ensuring your image is at least 1024px wide to accommodate larger screen sizes.
  • a contextual relevance factor might supersede a technical one.
  • Leverage the suggested search terms under Google Images to make sure your text content and target image reflect them. This gives great insight into the full intent behind the query and how you can optimize your content to address it.
  • Place the image you want to be featured in the snippet at the top of the page, and near relevant text. Keeping the image prominent on the page not only demonstrates its importance but can help to keep it closer to the H1 heading
  • I would recommend using it under the Article schema type (provided the page in question is an article).
  • Make sure that you use the srcset= and a fallback URL via the src= attribute. To ensure maximum accessibility, keep text in HTML and not embedded in the image itself.
  • Google considers the page content quality when ranking images.
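Several of the recommendations above (srcset with a src fallback, text kept out of the image, Article schema with the featured image) can be combined into one page snippet. The following sketch simply prints the markup such a page might carry; every URL, filename and headline is an invented placeholder, not a prescription:

```python
import json

# All URLs, filenames and the headline below are invented placeholders.
article_schema = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "How to Write a Check",
    "image": [
        "https://example.com/img/write-a-check-1x1.jpg",
        "https://example.com/img/write-a-check-4x3.jpg",
        "https://example.com/img/write-a-check-16x9.jpg",
    ],
}

# srcset offers responsive candidates, src is the required fallback,
# and alt keeps the image's content accessible as text.
img_tag = (
    '<img src="https://example.com/img/write-a-check-1024.jpg"\n'
    '     srcset="https://example.com/img/write-a-check-640.jpg 640w,\n'
    '             https://example.com/img/write-a-check-1024.jpg 1024w"\n'
    '     alt="Sample check with each field labeled">'
)

markup = ('<script type="application/ld+json">\n'
          + json.dumps(article_schema, indent=2)
          + '\n</script>\n' + img_tag)
print(markup)
```

Placing that image element near the top of the article body, close to the H1 and relevant text, lines up with the prominence advice quoted above.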
Rob Laporte

Google's December 2020 Core Update Themes - 0 views

  • The data and overall consensus indicate that Google’s December 2020 Core Update is one of the more impactful algorithm adjustments to hit the SERP over the past year or so.
  • I prefer to look at core updates almost from a pure content and UX perspective. For me, it’s about the specific pages Google swaps out more than it is a per domain analysis.
  • I am performing a qualitative analysis
  • I am not making any sort of definitive statements
  • What moves me, however, is when I look at 100 keywords and start seeing the same content-oriented theme arise again and again.
  • What I’m trying to say, and as you’ll see in the examples I will get into later, is that the content that was more focused on the specific topic mentioned in the query did better. So while the “ultimate guide” here did get to the topic the query deals with, it was not exclusively about that topic.
  • This might call the entire strategy of creating these ultimate guides into question. Perhaps you can’t cast a wide net in that way anymore? Perhaps the “ultimate guide” is only really suitable for people who actually want to get a more broad understanding of a topic? (Crazy to think, I know!)
  • The page from Rocket Mortgage, on the other hand, is only about how much you need for a down payment:
  • So too is the page from Quicken Loans:
  • The Moral of the Story: If I want to understand how much money on average I need to put down when buying a house, or what the various options generally are and what they mean long term, the CFPB page, .gov or not, doesn’t really help me. Its content is not specifically honed in on that particular topic. Again, we have another page that takes a sweeping look at a topic that lost rankings when the query reflected a more specific sort of intent!
  • What’s interesting here is that unlike the previous examples, where too much content resulted in the page’s topical relevance being diluted, the lack of such content here is what I think caused the ranking loss. Look, it’s not bad content. However, it’s pretty much the “general” kind of content you see here, there, and everywhere for all sorts of topics. Just compare it to what the page from the Credit Card Insider offers:
  • This just oozes depth. The third topic on the page alone (6 Ways to Pay Off…) rivals the depth shown on the CreditCards.com page! What differentiates this page from the "guides” shown in the other examples is that this is a guide that drills deep into one topic as opposed to trying to span multiple subtopics. Also, have a look at the formatting, it reminds me of what we saw on the Motley Fool’s page:
  • It’s deep content that is easy to digest. It’s not hard to see why Google swapped these two pages.
  • The Moral of the Story: Exact content relevancy is not only about what topic you talk about. You can be topically aligned but it has to offer real information to the user. It’s even better when that information is digestible. In other words, if you want to rank for a keyword with topic specificity it might be better to create an “ultimate guide” that drills deep into the topic itself versus trying to cover every subtopic under the sun in order to try to rank for more topics with one piece of content.
  • The by-line really sums it up. It tells you this article is about the fact that you most likely won't get addicted to painkillers, but it’s definitely possible so here’s the scoop. To me, it’s far more in line with the average user’s intent of learning about the risks of addiction versus understanding the fine difference between addiction and dependence. It’s the same story with the WebMD page:
  • The Moral of the Story: Again, the issue here is not how authoritative or how substantial the content is. There is no doubt that content from the NIH is both substantial and authoritative. The issue here again seems to relate to Google being better able to show content that is specifically relevant to the nature of the query.
  • First things first, the page doesn’t speak to the query directly. While in the process of learning the difference between sadness and depression one could understand the signs of depression that route is certainly indirect. You could argue that the query how to tell if you have depression could be taken as ‘how do I know if I am just sad or depressed?’ but that really doesn’t seem to be the essential intent here. That topical line (i.e., sadness vs. depression) would most likely produce its own unique query (i.e., am I sad or depressed). From the content shown on the WebMD page, it appears that Google thinks of the intent as understanding the symptoms of depression:
  • The WebMD page, in contradistinction to the MHA page, speaks to the “plain meaning” of the query’s intent: how can you tell if you’re suffering from depression? Aside from that, the WebMD page offers a bit more substance. While it doesn’t go into great detail per se, it does offer a pretty comprehensive list of items. Compare that to the MHA page which, if you read it, is a bit thin and offers little detail (not even a basic list like the one on the WebMD page). The Moral of the Story: Relevancy is a two-pronged equation (at minimum). It requires the content to be topically focused on the one hand and substantial on the other
  • I’ve saved the best for last. This is my favorite example that I came across when diving into the December 2020 Core Update. I mean, for crying out loud, we’re talking about the CDC losing rankings in favor of a .org domain I never heard of. How could this be? Let’s understand the intent of the query. If I were searching for this it would be because I found something on my body that I thought might be skin cancer. If I could be so bold, I would imagine that this is why most of us would search for this term. I wouldn’t, and again I imagine most people in most instances wouldn’t search for this in order to understand if regular screening is officially recommended or not. Yet, that is what the CDC page is about:
  • I hate to make assumptions, but I would also think that someone running this query is most likely not interested in the common tests and methods doctors use to determine if skin cancer is present. Yet, this is what the page from Cancer.net focuses on:
  • Again, I would search for this term if I saw something weird on my body that made me think “Holy crap, do I have skin cancer?” The page from the AOCD is entirely made for people on the verge of freaking out at the possibility of having skin cancer:
  • To me, when you see this page relative to the pages from Cancer.net and the CDC, it is painfully obvious why this page got the ranking boost. The Moral of the Story: Again, I think what transpired here is painfully obvious. Google has looked past the immediate authority of some of the pages here and has more heavily weighed how relevant the content is to the query. As with the cases I have shown earlier, Google is rewarding content that speaks in a highly focused way to the intent and nature of the query.

What Was the December 2020 Core Update About? Are you expecting a one-liner that definitively characterizes the December 2020 update? You’re not going to get one from me. This update, like any other, certainly included a whole plethora of different “algorithmic considerations” and themes. That said, from where I sit, while other core updates did things to help put the most authoritative content at the top of the SERP, this update seemed to me to be more about pure relevancy. Updates of the past have done things to weed out sites using a marketing tone within YMYL informational content or have rewarded sites that put the right content ahead of their affiliate pursuits. All of that, while part of relevancy, speaks more to a need for something authoritative at the top of the SERP. Seeing so many .gov pages drop in favor of pages from sites like Healthline or WebMD seems to point to the update rewarding relevancy to the nth degree. Perhaps Google felt it had “authority” at the top of the SERP in order, paving the way for a focus on relevance? Who knows. All I can say is that I personally have not seen such a strong focus on pure relevance on page one of the SERP.

Content Creation Takeaways: Practically speaking, I think the era of broadly reaching pages is quickly coming to an end. I think that has been the case for a while.
However, seeing Google pull pages off page one of the SERP because they deal with multiple subtopics is a new level, at least for me. It shows that you have to create content that talks about one topic and one topic only (unless the keyword reflects a specific desire for a broader survey of a topic). I wonder if the idea of having one ultimate guide so as to win numerous keywords should be replaced with multiple posts where each post deals with one specific topic or subtopic. (If you do that, please don’t create thin content, that is not what I am advocating for.) It’s a rather logical concept. As Google gets better at understanding content it is going to prefer highly-focused content around a specific topic to that which is of a more broad nature unless the query specifically shows intent for a general survey of a topic.
jack_fox

Why You Should Embed a Google Map on Your Website - Sterling Sky Inc - 0 views

  • There are a few main reasons to add a Google map to your website: It helps customers or website visitors get directions to your business and saves customers the steps of opening a new browser window, leaving your website, and finding directions. Customers can simply use the map on your website. Your business contact information is easy to find. The data an embedded Google Map provides is your business address, phone number, website, directions, reviews, and review stars. A Google map can highlight nearby points of interest, parking areas, restaurants, theaters, parks, etc. Visitors can reference nearby areas if they are not exactly sure where your business is located.
Rob Laporte

Problems Continue With Google Local Business Listings - 0 views

  •  
    Oct 14, 2008 at 1:08pm Eastern, by Mike Blumenthal: What do the Google searches Orlando Hotels, Miami Discount Car Rental and Dallas Discount Car Rental have in common? The obvious answer is that they are all local searches on popular phrases in major metro areas. A less obvious answer is that, like the infamous Denver Florist search last December, they all return seemingly authoritative OneBox results on popular geo-phrase searches in a major market. The searches demonstrate clear problems with Google's Universal Local OneBox algorithm. Certainly, "major city + service/product" searches should return a broad range of consumer choices, not an authoritative OneBox that limits the view to one highlighted provider of the service. Google returns the OneBox result because the ostensible business name in the result supposedly mirrors the search phrase and, in Google's opinion, provides strong relevance to the user query. The problem with the above result is that the business shown on the map is the Marriott Orlando Downtown, not "travel.ian.com." The Marriott's business listing has apparently been hijacked. In fact, all of the listings returned on these searches have apparently been "hijacked" via Google's community edit feature: the business name of each listing has been modified from the original, Marriott Orlando Downtown, to match the search phrase, and the URLs of the listings have been modified to direct users to an affiliate link on an appropriate site. Google's community edit feature has become the playground of black-hat affiliate marketers and is sorely in need of more security. Of interest in this regard is that many of these listings are for multinational corporations. These are not small independent business that are t
Rob Laporte

Google Search Console Insights is now available to all - 0 views

  • What are your best-performing pieces of content? How are your new pieces of content performing? How do people discover your content across the web? What do people search for on Google before they visit your content? Which article refers users to your website and content?
  • Missing data. If you are missing data, that means your Search Console property is not properly linked to your Google Analytics property. Google recommends associating (linking) your Google Analytics property with your relevant Search Console property to get the full experience and the best insights about your content. Please note that for now, Search Console Insights only supports Google Analytics Universal Analytics properties (their ID starts with a “UA-“), but the company is working to support Google Analytics 4.
jack_fox

Is Rank Tracking by Zip Codes Still Relevant? | How to Accurately Track Ranking in Loca... - 0 views

  • Google uses proximity to deliver local results and does not use zip codes. All searches are affected by the proximity factor, but businesses with more dense competition will be affected the most. Organic results are affected by proximity but much less than Google Maps rankings. Checking rankings by zip code center does not always provide a complete or meaningful picture to base optimization decisions on.
  • Google couldn’t care less about what zip code you are in when performing a search; it only cares about distance when it comes to local businesses.
  • To deal with this distance factor, rank trackers have adapted and now scan from many points instead of one. A grid is laid out according to distance, and each point reflects a different result. This gives a much better picture of what the rankings look like around a business, as it takes into account the granularity that the algorithm actually produces.
  • The only time Google will factor a zip code into a search is if you specifically enter it in your search, e.g. “dentist near me 11219,” in which case you are telling Google to return results for dentists in that zip. Otherwise, the algorithm will use your location and return results based on distance.
  • Organic Google search results are less affected by proximity than Google Maps results.
  • This is invaluable data which helps you determine which keywords need better optimization, and when an area is so far out of reach for ranking well that a PPC campaign would be a good option.
  • Zip code tracking is perfectly fine for some industries. But if you work in an industry like dentistry or restaurants, with a high density of competition in a small radius, zip code tracking will be very inaccurate. I’d suggest scanning some manually in Incognito mode using this Chrome extension to see if there is much variance within the same zip code. https://chrome.google.com/webstore/detail/gs-location-changer/blpgcfdpnimjdojecbpagkllfnkajglp
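The multi-point grid scanning described above can be sketched in a few lines. This is an illustrative approximation: the center coordinates are arbitrary, and the flat-earth degree-to-kilometer conversion is only reasonable over a few kilometers. It generates the lat/lng scan points a grid-based rank tracker would use as simulated searcher locations:

```python
import math

def geo_grid(lat, lng, radius_km=5.0, points_per_side=5):
    """Lay out a square grid of scan points centered on a business.

    Returns (lat, lng) pairs; each would serve as the searcher's
    location for one localized rank check.
    """
    km_per_deg_lat = 110.574  # approximate
    km_per_deg_lng = 111.320 * math.cos(math.radians(lat))
    step = (2 * radius_km) / (points_per_side - 1)
    grid = []
    for i in range(points_per_side):
        for j in range(points_per_side):
            d_north = -radius_km + i * step
            d_east = -radius_km + j * step
            grid.append((round(lat + d_north / km_per_deg_lat, 6),
                         round(lng + d_east / km_per_deg_lng, 6)))
    return grid

points = geo_grid(40.7128, -74.0060)  # arbitrary example center
print(len(points))
```

A denser grid or wider radius trades scan cost for resolution, mirroring the proximity granularity the annotation describes.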
Rob Laporte

Online Video Views Up 13% in March, Google Extends Market-Share Lead - MarketingVOX - 0 views

  • Is Lost ever getting off the island? In March, Google Sites were again ranked the top US video property, with more than 4.3 billion videos viewed (38 percent share of all videos), gaining 2.6 share points from the previous month, according to comScore's Video Metrix service, MarketingCharts reports. Other data issued: US internet users viewed 11.5 billion online videos during March, a 13 percent increase from February and a 64 percent gain versus March 2007. YouTube.com accounted for 98 percent of all videos viewed at Google Sites. Fox Interactive Media ranked second with 477 million videos (4.2 percent), followed by Yahoo Sites with 328 million (2.9 percent) and Viacom Digital with 249 million (2.2 percent). Nearly 139 million US internet users watched an average of 83 videos per viewer in March. Number of viewers: Google Sites also attracted the most viewers (85.7 million), who watched an average of 51 videos per person. Fox Interactive attracted the second most viewers (54.3 million), followed by Yahoo Sites (37.5 million) and Viacom Digital (26.6 million). Other notable findings from March 2008: 73.7 percent of the total US internet audience viewed online video. 84.8 million viewers watched 4.3 billion videos on YouTube.com (50.4 videos per viewer). 47.7 million viewers watched 400 million videos on MySpace.com (8.4 videos per viewer). The average online video duration was 2.8 minutes. The average online video viewer watched 235 minutes of video.
Rob Laporte

Evidence of Page Level Google Penalties? - 0 views

  • June 18, 2009: Richard at SEO Gadget showed how Google seemed to have penalized specific pages of his site from ranking in the Google index. The penalty seemed fair, in that nasty comments had slipped through his comment spam filter. The drop in traffic can be seen in the keyword phrases that page ranked well for. He noticed a ~70% drop in traffic for that phrase, which in his case resulted in a 15% drop in his Google traffic and a 5% drop in overall traffic. What I find extra fun is that a Google Search Quality Analyst, @filiber, tweeted: Google Page level penalty for comment spam – rankings and traffic drop http://bit.ly/JNAly (via @AndyBeard) <- interesting read! Of course that is not an admission of it as fact, but it wouldn't be hard to believe that bad comments caused such a decline. Now, I don't think this would be considered a keyword-specific penalty, which most SEOs believe in, but rather a specific page being penalized. Forum discussion at Sphinn.
Rob Laporte

Google Working on Faster, More Caffeinated Search Engine - MarketingVOX - 0 views

  • Google announced today that it has been working on a faster search engine that will improve results for web developers and power searchers. Dubbed Caffeine, the new project focuses on next-generation infrastructure and seeks to improve performance in a host of areas, including size, indexing speed, accuracy and comprehensiveness. Could this also be seen as a step toward improving access to the Deep Web? For now, developers are being asked to check out http://www2.sandbox.google.com/ and try a few searches, then compare those results with those found on the current Google site. If a "Dissatisfied? Help us improve." link displays, click on it and type your feedback in the text box along with the word caffeine. Since it's still a work in progress, Google engineers will be monitoring all feedback.
Rob Laporte

Understanding Google Maps & Yahoo Local Search | Developing Knowledge about Local Search - 0 views

  •  
    Google Maps: relative value of a OneBox vs. top organic results. Category: Google Maps (Google Local) - Mike - 5:50 am. Steve Espinosa has some interesting preliminary research on the relative click-through rates of a #1 listing in the Local 10-Pack and a simultaneous #1 listing in organic. The organic listing showed 1.6x the click-through of the Local 10-Pack listing. As it is preliminary research and only looked at click-through, not call-in or other measures of action, it is an important piece of research but doesn't speak to ultimate customer action. According to TMP's Local Search Usage Study: following online local searches, consumers most often contact a business over the telephone (39%), visit the business in person (32%) or contact the business online (12%). If one works out the combined math of the two studies (not a very reliable number, I assure you), in the end the top local ranking would still provide more client contacts, either via phone or in person, than the organic ranking. At the end of the day, Steve's research cannot be viewed as a reason not to focus on local, but rather as a call to action on the organic side. I think he would agree that, in the excitement around local, you can't forget organic's power, and that in an ideal world a business would use every tool available. However, many times, due to the nature of a business, a business may not be able to legitimately play in the Local space, and its only recourse is to optimize its website for local phrases. Another interesting outcome of Steve's initial research was "the fact is that the majority of the users who got to the site via the natural link had resolution above 1024×768 and the majority of users who visited via the OneBox result had resolution of 1024×768 or under." As Steve pointed out, this could be due to the greater real estate visible to those with larger screens and thus greater visibility of organic listings above the fold. It could also, however, be
jack_fox

Google My Business Not Applied Edits With Send Edit Feedback - 0 views

  • The feature seems to allow you to send additional feedback to Google when a suggested edit you make to a Google business listing is not accepted (i.e. not applied) for some reason.
  • This should help with (1) knowing when your edits are not applied and (2) communicating more information to Google to get them applied to the business listing.
Rob Laporte

Creating SEO-friendly how-to content - Search Engine Watch Search Engine Watch - 0 views

  • 2. Use “how-to” structured data. Google has recently added the ability to properly mark up how-to content so that it can appear in rich results on Search and Google Assistant. Using HowTo structured data explicitly tells Google that your content is a how-to and helps it reach the right users.  The best thing about implementing HowTo structured data is the ability to walk users through a series of steps to finish a task successfully. You can also feature text, images, and video. If the how-to is the focus of the page, HowTo structured data can help you add value to your content. Here’s how it looks in search:  To find out more about adding the markup to web pages that have step-by-step directions, you can visit the developer docs for Google Search and a “how-to” action with markup for Google Assistant. Note that you don’t need to create a separate web page to implement this structured data.  Once Google marks up your page, you can visit a new enhancement report in the Search Console to track all issues, warnings, and errors related to your how-to pages.
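The HowTo markup described above is JSON-LD embedded in the page. Here is a minimal sketch of building such a block in Python; the tire-change steps and field values are illustrative examples, not taken from the bookmarked article:

```python
import json

# Minimal HowTo structured-data example (hypothetical content).
# The resulting JSON-LD would be embedded in the page inside a
# <script type="application/ld+json"> tag.
howto = {
    "@context": "https://schema.org",
    "@type": "HowTo",
    "name": "How to change a flat tire",
    "step": [
        {
            "@type": "HowToStep",
            "name": "Loosen the lug nuts",
            "text": "Turn each lug nut counter-clockwise with the wrench.",
        },
        {
            "@type": "HowToStep",
            "name": "Jack up the car",
            "text": "Place the jack under the frame and raise the wheel.",
        },
    ],
}

# Serialise for embedding in the page's HTML.
json_ld = json.dumps(howto, indent=2)
print(json_ld)
```

Each step becomes a `HowToStep` object; text, images, and video can be attached per step per the schema.org vocabulary.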
jack_fox

Improve your local ranking on Google - Google My Business Help - 0 views

  •  
    "Complete the following tasks in Google My Business. Providing and updating business information in Google My Business can help your business's local ranking on Google and enhance your presence in Search and Maps."
jack_fox

Does Adding Keywords in an Image Filename Help Ranking? - Sterling Sky Inc - 0 views

  • unlike photos on your website, photos on Google My Business listings don’t get indexed or included in Google Images search.  If you take a photo from your GMB listing and run it through Google Images search, it will return no results, provided the image isn’t also hosted anywhere else online.
jack_fox

Google Says Using Internal Linking Can Help Google Trust Your Site More Over Time - 0 views

  • John Mueller explaining that new sites can ultimately use internal linking to funnel Google through the most trusted and quality pages on the site, to earn trust and then get Google to crawl more and more of those pages over time after that trust was earned.
Rob Laporte

How to Optimize for Google's Featured Snippets to Build More Traffic - Moz - 1 views

  • Multiple studies confirm that the majority of featured snippets are triggered by long-tail keywords. In fact, the more words that are typed into a search box, the higher the probability there will be a featured snippet.
  • To avoid confusion, let's stick to the "featured snippet" term whenever there's a URL featured in the box, because these give extra exposure to the linked site (hence they're important for content publishers):
  • It helps if you use a keyword research tool that shows immediately whether a query triggers featured results. SE Ranking offers a nice filter allowing you to see keywords that are currently triggering featured snippets:
  • ...7 more annotations...
  • Tools like Buzzsumo and Text Optimizer can give you a good insight into questions people tend to ask around your topic:
  • Note that Search Console labels featured snippet positions as #1 (SEOs used to call them position 0). So when you see #1 in Google Search Console, there’s nothing to do here. Focus on #2 and lower.
  • MyBlogU (disclaimer: I am the founder) is a great way to do that. Just post a new project in the "Brainstorm" section and ask members to contribute their thoughts.
  • 1. Aim at answering each question concisely. My own observation of answer boxes has led me to think that Google prefers to feature an answer given within one paragraph. An older study by AJ Ghergich found that the average length of a paragraph snippet is 45 words (the maximum is 97 words), so let that be your guideline for how long each answer should be in order to get featured. This doesn't mean your articles need to be one paragraph long. On the contrary, these days Google seems to give preference to long-form content (also known as "cornerstone content," which is obviously a better way to describe it because it's not just about length) that's broken into logical subsections and features attention-grabbing images. Even if you don’t believe that cornerstone content receives any special treatment in SERPs, focusing on long articles will help you cover more related questions within one piece (more on that below). All you need to do is adjust your blogging style just a bit: ask the question in your article (it may be a subheading); immediately follow the question with a one-paragraph answer; elaborate further in the article.
  • 2. Be factual and organize well. Google loves numbers, steps, and lists. We've seen this again and again: more often than not, answer boxes will list the actual ingredients, number of steps, time to cook, year and city of birth, etc. Use Google’s guide on writing meta descriptions to get a good idea of what kind of summaries and answers it looks for when generating snippets (including featured snippets). Google loves well-structured, factual, and number-driven content. There's no specific markup to structure your content: Google seems to pick up <table>, <ol>, and <ul> well and doesn't need any other pointers. Using H2 and H3 subheadings will make your content easier to understand for both Google and your readers. 3. Make sure one article answers many related questions. Google is very good at determining synonymic and closely related questions, and so should you be. There's no point in creating a separate page answering each specific question. Creating one solid article addressing many related questions is a much smarter strategy if you aim at getting featured in answer boxes. This leads us to the next tactic: 4. Organize your questions properly. To combine many closely related questions in one article, you need to organize your queries properly. This will also help you structure your content well. I have a multi-level keyword organization strategy that can be applied here as well: a generic keyword makes a section or a category of the blog; a more specific search query becomes the title of the article; even more specific queries determine the subheadings of the article and thus define its structure; and there will be multiple queries so closely related that they all go under a single subheading. For example: Serpstat helps me a lot when it comes to both discovering an article idea and then breaking it into subtopics. Check out its "Questions" section. It will provide hundreds of questions containing your core term and then generate a tag cloud of other popular terms that come up in those questions:
  • 5. Make sure to use eye-grabbing images
  • How about structured markup? Many people would suggest using Schema.org (simply because it's been a "thing" to recommend adding schema for anything and everything) but the aforementioned Ahrefs study shows that there's no correlation between featured results and structured markup.
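The 45/97-word paragraph guideline cited above lends itself to a quick automated check. A minimal sketch, assuming the thresholds from the AJ Ghergich study; the sample answer text is made up for illustration:

```python
# Thresholds from the cited study of featured-snippet paragraphs.
SNIPPET_MAX_WORDS = 97  # longest paragraph observed in a snippet
SNIPPET_AVG_WORDS = 45  # average length, a sensible target

def snippet_friendly(paragraph: str) -> bool:
    """Return True if the paragraph is short enough to be featured."""
    return len(paragraph.split()) <= SNIPPET_MAX_WORDS

# A 17-word answer fits comfortably under both thresholds.
answer = ("A featured snippet is a highlighted search result that answers "
          "the query directly on the results page.")
print(snippet_friendly(answer))  # True
```

Running a check like this over each question-and-answer pair in a draft is an easy way to enforce the "one concise paragraph per answer" habit.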
jack_fox

Everything Publishers Need to Know About URLs - 0 views

  • if you’re currently getting good traffic from Google News and Top Stories, don’t change any part of your domain name.
  • don’t change section URLs unless you really need to.
  • Including folder names in the URL can help Google identify relevant entities that apply to the section, but there doesn’t need to be a hierarchical relationship.
  • ...12 more annotations...
  • Do article URLs need dates in them? No. If you currently use dates in your article URLs (like theguardian.com does), you don’t need to remove them. It’s fine to have them, but there may be a small downside with regards to evergreen content that you want to rank beyond the news cycle;
  • Should article URLs have a folder structure? This is optional, but it might help. Google likes to see relevant entities that are mentioned in an article reflected in the URL
  • Can the URL be different from the headline? Yep, as long as they convey the same meaning.
  • Can you use special characters in a URL? Short answer: yes, but you shouldn’t.
  • Long answer: special characters are often processed just fine, but sometimes can lead to issues when a character needs to be encoded or is otherwise not easily parsed. It’s better to keep things simple and stick to basic characters. Use the standard alphabet and limit your non-text characters to hyphens and slashes.
  • Is capitalisation okay? This is one of those grey areas where you’ll want to keep things as simple as possible. For Google, a capital letter is a different character than the lowercase version.
  • What about file extensions like .html? This question can be relevant if you have a website that still uses extensions like .php or .html at the end of a webpage URL. Modern web technology doesn’t require file extensions anymore. Whether your article ends in .html or with a slash (which, technically, makes it a folder), or ends without any notation at all - it really doesn’t matter. All those URL patterns work, and they can all perform just as well in Google.
  • Can you use parameters in article URLs? You can, but you shouldn’t. In its Publisher Center documentation concerning redirects, Google specifically advises against using the ‘?id=’ parameter in your article URLs.
  • Do my article URLs need a unique ID? No. This is a leftover from the early days of Google News, when there was an explicit requirement for article URLs to contain a unique ID number that was at least 3 digits long.
  • How long should my URL be? As long as you want, up to the rather extreme 2048-character limit built into most browsers. There’s little correlation between URL length and article performance in Google’s news ecosystem.
  • For almost any other purpose, changing existing URLs is generally a Bad Idea.
  • Over the years Google has given conflicting information about this, though recently they seem to have standardised on “no link value is lost in a redirect,” which I’ll admit I’m a little skeptical of. Regardless, the advice is the same: don’t change URLs for any piece of indexed content unless you have a damn good reason to. This is why site migrations are such a trepidatious enterprise. Changing a site’s tech stack often means changing page URLs, which can cause all sorts of SEO problems, especially for news publishers.
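The URL advice in this Q&A can be folded into a small linter. The sketch below is one reading of those rules (lowercase only, letters/digits/hyphens/slashes, no query parameters, under the ~2048-character browser limit); the regex and rule set are my own assumptions, not an official checker:

```python
import re

MAX_URL_LENGTH = 2048  # practical limit built into most browsers
# Path of lowercase letters, digits, hyphens and slashes,
# optionally ending in .html (extensions are allowed but not needed).
PATH_RE = re.compile(r"^/[a-z0-9\-/]*(\.html)?$")

def lint_article_path(path: str) -> list[str]:
    """Return a list of problems with an article URL path (empty = clean)."""
    problems = []
    if len(path) > MAX_URL_LENGTH:
        problems.append("URL exceeds the 2048-character browser limit")
    if "?" in path:
        problems.append("avoid query parameters such as ?id= in article URLs")
    if path != path.lower():
        problems.append("use lowercase only; A and a are different characters to Google")
    if not PATH_RE.match(path.lower().split("?")[0]):
        problems.append("stick to letters, digits, hyphens and slashes")
    return problems

print(lint_article_path("/news/2024/how-to-fix-urls"))  # []
print(lint_article_path("/News/article?id=123"))        # two problems flagged
```

A check like this could run in a CMS publishing hook to catch non-conforming slugs before an article URL is ever indexed.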