
DISC Inc / Group items tagged: google



How Google's Selective Link Priority Impacts SEO (2023 Study) - 0 views

  • First Link Priority
  • only have selected one of the links from a given page.
  • Google only counted the first anchor text
  • So even if you manage to figure out how we currently do it today, then that’s not necessarily how we’ll do it tomorrow, or how it always is across all websites.
  • Test #1 Takeaway: Google seems to be able to count multiple anchor texts on the same page to the same target, at least if one of the links is an image.
  • Test #2 Takeaway: When Google encountered two text links followed by an image link, Google indexed the first text and image anchors only.
  • Test #3 Takeaway: When Google encountered two text links followed by an image link and finally another text link, Google indexed the first text and image anchors only.
  • How to Optimize For Google’s Selective Link Priority
    Let’s be clear: Selective Link Priority most likely isn’t going to make a huge difference in your SEO strategy, but it can make a difference, especially in tie-breaker situations. In particular, here are five internal linking practices in a Selective Link Priority world:
    1. Be aware when linking on a page multiple times to the same URL that Google may not “count” all of your anchor text.
    2. When in doubt, you should likely prioritize both the first text link and image links on the page.
    3. Remember that each link to a URL—regardless of anchor text—has the potential to increase that URL’s PageRank.
    4. Don’t leave image alt attributes empty, and remember to vary them from any text link anchors. Not only can Google index the alt attribute as a separate anchor, but this gives you the chance to further increase your anchor text variations.
    5. Sites with smaller external link profiles may wish to limit the number of navigational links in preference of in-body text links. The reason is that if Google does indeed tend to prefer the first links on the page—and these are navigational—this limits the number of anchor text variations you can send to any page. (This isn’t a hard-and-fast rule. In fact, it’s a nuanced, complex subject that may warrant a whole other post.)
    The most important thing to remember is this: anchor text is a powerful ranking signal, even for internal links. Carefully choosing your anchor text—while avoiding over-optimization—can make a difference in winning SEO. If your SEO game is otherwise strong, you may be able to get away with ignoring Google’s Selective Link Priority rules (as most sites do already). But you should at least be aware of how it works and what it means to your strategy.
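A rough way to check how this plays out on your own pages is to list, per target URL, the anchors in document order. Below is a minimal sketch (assuming the requests and beautifulsoup4 packages are installed; the URL is a placeholder, not from the study) that reports the first text anchor and first image alt for any URL linked more than once, mirroring the tests described above.

```python
# Sketch: list anchors per target URL in document order, so you can see which
# text anchor and which image alt appear first for each target. Illustrative only.
import requests
from bs4 import BeautifulSoup
from collections import defaultdict
from urllib.parse import urljoin

def anchors_by_target(page_url):
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    targets = defaultdict(list)
    for a in soup.find_all("a", href=True):
        target = urljoin(page_url, a["href"])
        img = a.find("img")
        if img is not None:
            # Image link: Google may index the alt attribute as the anchor.
            targets[target].append(("image-alt", img.get("alt", "").strip()))
        else:
            targets[target].append(("text", a.get_text(strip=True)))
    return targets

if __name__ == "__main__":
    for url, anchors in anchors_by_target("https://example.com/").items():
        if len(anchors) < 2:
            continue  # only interesting when a URL is linked more than once
        first_text = next((t for kind, t in anchors if kind == "text"), None)
        first_alt = next((t for kind, t in anchors if kind == "image-alt"), None)
        print(f"{url}\n  first text anchor: {first_text!r}\n  first image alt:  {first_alt!r}")
```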

Google Webmaster Tools is Incorrectly Displaying Keyword Positions - 0 views

  • October 20, 2008: Google Webmaster Tools is Incorrectly Displaying Keyword Positions
    A WebmasterWorld member reports that he relied on the Top Search Queries report in Google Webmaster Tools and found it to be providing incorrect data: a separate rank checker showed no such rankings, and the page received no visitors. This is likely a bug, according to Tedster: "Webmaster Tools reports of all kinds are known to contain wrong information at times. This kind of wrong information would be particularly disturbing, but in any big system errors do creep in. The evidence of your own server logs is more dependable." He adds that the reported ranking may still be achievable: "[M]aybe the WMT report is pulling the position information before some filter is applied to come up with the final rankings. Even though that would certainly be buggy behavior, it might accidentally be showing you that your url COULD rank that well, if only you weren't tripping some kind of filter." Still, the tool in Google's backend is misleading. Would you consider this a bug?
    On a related note, the Official Google Webmaster Central Blog says this could be an issue with the kind of data that WMT sees. They suggest adding the www and non-www versions of the same site to Webmaster Central, doing a site: search to look for any anomalies, setting your preferred domain, and setting a site-wide 301 redirect to www or non-www. Of course, this is probably not applicable to the reporting issue in WebmasterWorld, though it may be related to other issues within Google Webmaster Tools. Forum discussion continues at WebmasterWorld.
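The "trust your server logs" advice is easy to act on. Here is a minimal sketch (the log path, combined-log regex, and the simple "google." referrer test are assumptions, not anything from the thread) that counts which landing pages actually receive visits referred by Google search, for comparison against what the Top Search Queries report claims.

```python
# Sketch: count landing pages for visits referred by Google search,
# read from a combined-format access log. Path and regex are illustrative.
import re
from collections import Counter

LOG_LINE = re.compile(
    r'"(?:GET|POST) (?P<path>\S+) HTTP/[\d.]+" \d{3} \S+ "(?P<referer>[^"]*)"'
)

def google_referred_landings(log_path):
    hits = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            m = LOG_LINE.search(line)
            if m and "google." in m.group("referer"):
                hits[m.group("path")] += 1
    return hits

if __name__ == "__main__":
    for path, count in google_referred_landings("access.log").most_common(10):
        print(f"{count:6d}  {path}")
```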

Sphinn - Oops, Google Analytics Lost Your Data - 0 views

  • Oops, Google Analytics Lost Your Data (went hot May 21, 2008, 3:07 am; posted by Drupal; source: http://www.getelastic.com)
    Google sent a notice to Analytics users this morning that a data processing error occurred affecting data from April 30th to May 5th. They're working on reprocessing the data, which should be ready in a few days, but some data cannot be recovered.

Calling All SEOs and Webmasters: Google Wants You - MarketingVOX - 0 views

  • Calling All SEOs and Webmasters: Google Wants You
    Now through Sept. 30th, Google is looking once again to its community of developers to help guide others and contribute short tutorial videos to their Webmaster Central YouTube channel. The basic requirements are as follows:
    - Keep the video short: approximately 3-5 minutes.
    - Think small: a short video is a good way to showcase your use of Top Search Queries, but not long enough to highlight an entire SEO strategy.
    - Focus on a real-life example of how you used a particular feature: for example, you could show how you used link data to research your brand, or crawl errors to diagnose problems with your site structure.
    Do you have a great tip or recommendation? (Go here for a complete list of requirements and submit all videos through their help center.) This is not the first time Google has reached out, nor is it something new to the industry. The site is billed as a one-stop shop for webmaster resources that helps with crawling and indexing questions, as well as introducing offerings to enhance and increase site traffic. The YouTube channel has more than 5,000 subscribers and 113 uploaded tutorials since launching in January.

Google third-party policy - Advertising Policies Help - 0 views

  • In addition to meeting the requirements outlined below, third parties must make reasonable efforts to provide their customers with other relevant information when requested.
  • If your applicable terms of service require a monthly performance report for customers, you must include data on costs, clicks, and impressions at the Google advertising account level. When sharing Google advertising cost data with customers, report the exact amount charged by Google, exclusive of any fees that you charge.
  • you can meet this reporting requirement by allowing your customers to sign in to their Google advertising accounts directly to access their cost and performance data. Learn how to share account access.
  • Third parties often charge a management fee for the valuable services they provide, and end-advertisers should know if they are going to be charged these fees. If you charge a management fee (separate from the cost of AdWords or AdWords Express), let customers know. At a minimum, inform new customers in writing before each first sale and disclose the existence of this fee on customer invoices.
  • It's important for advertisers to have the ability to contact Google directly with concerns about a third-party partner. To allow Google to properly investigate and assist the advertiser, we require that you provide your customers with the customer IDs for their AdWords or AdWords Express accounts when requested. Learn how to find an AdWords customer ID
  • putting undue pressure on an advertiser to sign up or stay with your agency
  • Having a separate account for each end-advertiser is essential to maintaining the integrity of the AdWords Quality Score. Because account history is a core component of the AdWords Quality Score, mixing advertisers in one account can result in Quality Scores that inaccurately represent any one advertiser's performance. Additionally, we'll show only one ad per account for a particular keyword, so mixing advertisers in one account could unfairly limit ad serving for those advertisers. For these reasons, we require that you use a separate account for each end-advertiser that you manage.

Problem with Google indexing secure pages, dropping whole site. - Search Engine Watch F... - 0 views

  • Coincidentally, Google e-mailed me today saying to use a 301 redirect from the https page to http. This was the first thought I had, and I tried to find code to do this for days when the problem first occurred; I never found it.
  •  
    04-25-2006, Chris_D (Oversees: Searching Tips & Techniques; joined Jun 2004; Sydney, Australia):
    Hi docprego,
    Set your browser to reject cookies, and then surf your site (I'm assuming it's the one in your profile). Now look at your URLs when you reject cookies:
    /index.php?cPath=23&osCsid=8cfa2cb83fa9cc92f78db5f44abea819
    /about_us.php?osCsid=33d0c44757f97f8d5c9c68628eee0e2b
    You are appending cookie strings to the URLs for user agents that reject cookies. That is the biggest problem. Get someone who knows what they are doing to look at your server configuration - it's the problem, not Google. Google has always said:
    Quote: "Use a text browser such as Lynx to examine your site, because most search engine spiders see your site much as Lynx would. If fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash keep you from seeing all of your site in a text browser, then search engine spiders may have trouble crawling your site. Allow search bots to crawl your sites without session IDs or arguments that track their path through the site. These techniques are useful for tracking individual user behavior, but the access pattern of bots is entirely different. Using these techniques may result in incomplete indexing of your site, as bots may not be able to eliminate URLs that look different but actually point to the same page." http://www.google.com/webmasters/guidelines.html
    You've also excluded a few pages in your http port 80 non-secure robots.txt which I would have expected you want to be indexed - like /about_us.php. From an information architecture perspective, as Marcia said - put the stuff that n…
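Both problems in this thread, https URLs that should 301 to their http equivalents and session IDs (osCsid) appended to links for cookie-less user agents, can be verified from the outside. The sketch below is a minimal illustration (the URLs are placeholders; only the osCsid parameter name comes from the thread; the requests package is assumed) that fetches pages without cookies and reports both conditions.

```python
# Sketch: verify (a) that https URLs 301 to http and (b) that pages served
# to a cookie-less client don't carry session-ID query parameters (osCsid).
import requests

SESSION_PARAM = "osCsid"  # osCommerce session ID seen in the thread

def check(url):
    # A plain requests.get() sends and stores no cookies (no Session object),
    # so this behaves like the cookie-rejecting user agents discussed above.
    resp = requests.get(url, allow_redirects=False, timeout=10)
    if url.startswith("https://"):
        location = resp.headers.get("Location", "")
        if resp.status_code == 301 and location.startswith("http://"):
            print(f"{url}: 301 to {location} (OK)")
        else:
            print(f"{url}: expected a 301 to http, got {resp.status_code}")
    if resp.status_code == 200 and SESSION_PARAM in resp.text:
        print(f"{url}: {SESSION_PARAM} appears in page links (session IDs exposed to cookie-less clients)")

for u in ["https://www.example.com/about_us.php",
          "http://www.example.com/about_us.php"]:
    check(u)
```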

Official Google Webmaster Central Blog: More on 404 - 0 views

  • Have you guys seen any good 404s? Yes, we have! (Confession: no one asked us this question, but few things are as fun to discuss as response codes. :) We've put together a list of some of our favorite 404 pages. If you have more 404-related questions, let us know, and thanks for joining us for 404 week!
    - http://www.metrokitchen.com/nice-404-page: "If you're looking for an item that's no longer stocked (as I was), this makes it really easy to find an alternative." (Riona, domestigeek)
    - http://www.comedycentral.com/another-404: "Blame the robot monkeys" (Reid, tells really bad jokes)
    - http://www.splicemusic.com/and-another: "Boost your 'Time on site' metrics with a 404 page like this." (Susan, dabbler in music and Analytics)
    - http://www.treachery.net/wow-more-404s: "It's not reassuring, but it's definitive." (Jonathan, has trained actual spiders to build websites, ants handle the 404s)
    - http://www.apple.com/iPhone4g: "Good with respect to usability."
    - http://thcnet.net/lost-in-a-forest: "At least there's a mailbox." (JohnMu, adventurous)
    - http://lookitsme.co.uk/404: "It's pretty cute. :)" (Jessica, likes cute things)
    - http://www.orangecoat.com/a-404-page.html: "Flow charts rule." (Sahala, internet traveller)
    - http://icanhascheezburger.com/iz-404-page: "I can has useful links and even e-mail address for questions! But they could have added 'OH NOES! IZ MISSING PAGE! MAYBE TIPO OR BROKN LINKZ?' so folks'd know what's up." (Adam, lindy hop geek)
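Whatever the creative treatment, the technical point behind these examples is that a helpful 404 page should still return an actual 404 status code (not a 200 "soft 404") while giving the visitor useful links, search, or a contact address. A minimal sketch follows; Flask and the specific links are assumptions for illustration, not anything the blog post prescribes.

```python
# Sketch: a custom 404 page that keeps the proper 404 status code
# while giving the visitor somewhere useful to go. Flask is an assumption.
from flask import Flask

app = Flask(__name__)

@app.errorhandler(404)
def not_found(error):
    body = (
        "<h1>Page not found</h1>"
        "<p>The page you asked for doesn't exist. Try the "
        '<a href="/">home page</a> or <a href="/search">search</a>, '
        'or <a href="mailto:help@example.com">email us</a>.</p>'
    )
    # Returning 404 (not 200) tells crawlers this really is a missing page.
    return body, 404

if __name__ == "__main__":
    app.run(debug=True)
```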

Selling text link ads through TLA or DLA result in Google penalty? - 0 views

  • Can selling text link ads in the sidebar using TLA or Direct-Link-Ads result in a Google penalty? I used to use TLA for one of my sites but stopped using them for fear of Google dropping the site, because I heard a few rumors on webmaster forums of this happening. Is this concrete or not? Are people still using TLA or DLA or something similar?
    C7Mike (#3930956, 4:52 am on June 11, 2009 utc): Yes, you may receive a penalty for purchasing links that pass PageRank. See Google's Webmasters/Site owner Help topic for more information: [google.com...]
    Automotive site (#3930991, 6:42 am on June 11, 2009 utc): Well, I was actually going to use one of those to sell and not purchase. Anyway, I am going to apply to BuyandSellAds and see if I get accepted there, but I heard they mostly accept tech-related sites.
    C7Mike (#3931237, 2:25 pm on June 11, 2009 utc): You may receive a penalty for both buying and selling paid links that pass PageRank (see [google.com...]). I have had a few sites lose their PR because they published links through TLA. However, the content was still good enough that advertisers have continued to purchase links on those pages through TLA in spite of the lack of PR, and at a substantially lower rate.

The March 12, 2019 Google Core Algorithm Update - A Softer Side Of Medic, Trust And The... - 1 views

  • when checking queries that dropped and their corresponding landing pages, they line up with the problems I have been surfacing. For example, thin content, empty pages, pages that had render issues, so on and so forth.
  • Author expertise is extremely important, especially for YMYL content.
  • Also, and this is important, the site consumes a lot of syndicated content. I’ve mentioned problems with doing this on a large scale before and it seems this could be hurting the site now. Many articles are not original, yet they are published on this site with self-referencing canonical tags (basically telling Google this is the canonical version). I see close to 2K articles on the site that were republished from other sources
  • And last, but not least, the site still hadn’t moved to https. Now, https is a lightweight ranking factor, but it can be the tiebreaker when two pages are competing for a spot in the SERPs. Also, http sites can turn off users, especially with the way Chrome (and other browsers) are flagging them. For example, there’s a “not secure” label in the browser. And Google can pick up on user happiness over time in a number of ways (which can indirectly impact a site rankings-wise). Maybe users leave quickly, maybe they aren’t as apt to link to the site, share it on social media, etc. So not moving to https can be hurting the site on multiple levels (directly and indirectly).
  • This also leads me to believe that if Google is using reputation, they are doing so in aggregate and not using third-party scores or ratings.
  • What Site Owners Can Do – The “Kitchen Sink” Approach To Remediation
    My recommendations aren’t new. I’ve been saying this for a very long time. Don’t try to isolate one or two problems… Google is evaluating many factors when it comes to these broad core ranking updates. My advice is to surface all potential problems with your site and address them all. Don’t tackle just 20% of your problems. Tackle close to 100% of your problems. Google is on record explaining they want to see significant improvement in quality over the long-term in order for sites to see improvement.
  • Summary – The March 12 Update Was Huge. The Next Is Probably A Few Months Away
    Google only rolled out three broad core ranking updates in 2018. Now we have our first of 2019 and it impacted many sites across the web.
  • Don’t just cherry pick changes to implement. Instead, surface all potential problems across content, UX, advertising, technical SEO, reputation, and more, and address them as thoroughly as you can. That’s how you can see ranking changes down the line. Good luck.
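One of the concrete issues surfaced above, syndicated articles republished with self-referencing canonical tags, is straightforward to audit. The sketch below is illustrative only (the URL mapping is a placeholder; requests and beautifulsoup4 are assumed): it fetches each republished article and reports whether its rel=canonical points at itself or at the original source.

```python
# Sketch: check whether republished articles self-canonicalize instead of
# pointing rel=canonical at the original source. All URLs are placeholders.
import requests
from bs4 import BeautifulSoup

# Hypothetical mapping: republished URL -> original source URL
SYNDICATED = {
    "https://www.example.com/news/republished-article/":
        "https://www.original-source.com/article/",
}

def canonical_of(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for link in soup.find_all("link"):
        if "canonical" in (link.get("rel") or []):
            return link.get("href")
    return None

for copy_url, source_url in SYNDICATED.items():
    canon = canonical_of(copy_url)
    if canon == copy_url:
        print(f"SELF-CANONICAL: {copy_url} claims to be the canonical version")
    elif canon == source_url:
        print(f"OK: {copy_url} canonicalizes to the original source")
    else:
        print(f"CHECK: {copy_url} -> canonical {canon!r}")
```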

An Agency Workflow for Google My Business Dead Ends - Moz - 0 views

  • - Client contracts that are radically honest about the nature of Google
    - Client management that sets correct expectations about the nature of Google
    - A documented process for seeking clarity when unusual client scenarios arise
    - Agency openness to experimentation, failure, and on-going learning
    - Regular monitoring for new Google developments and changes
  • be sure you are continuing to keep tabs on any particularly aggravating dead ends in case solutions emerge in future.
  • Google has apparently changed the rules for front-facing departments within a company: they are no longer allowed to share the same physical address.
  • take it directly to Google in private. You have a variety of options for doing so, including: phone support at (844) 491-9665, chat support, and Twitter support.

Leveraging Wikidata To Gain A Google Knowledge Graph Result - Search Engine Land - 0 views

  • I immediately started filing feedback reports in the SERP advising Google of the incorrect logo. I did this every day for a week, and Google finally updated the logo element with the correct image.

Google Knowledge Graph and How it Works - 0 views

  • Google also states that while the information in the above list might be available directly in their search API, they augment this data considerably internally.
  • There is also a common misconception that Google’s Knowledge Panel is Google’s Knowledge Graph. This is not the case, although the Knowledge panel may represent a subset of data in the graph.
  • A Knowledge Graph is generally described as being made up of “Entities” but Google tends to refer to entities as “Topics” in its public documentation.
  • A new fact about a topic may have to pass some quality threshold before it is added to the Knowledge Graph, but these thresholds are unlikely to be discussed openly by Google.
  • Topics can also be seen in Google Trends.
  • Google also provides a Knowledge Graph Search API as shown above, and surfaces entities in the output of its NLP API.
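For reference, the Knowledge Graph Search API mentioned above can be queried over plain HTTPS. Here is a minimal sketch: you need to supply your own API key, and while the endpoint and the general JSON-LD shape follow Google's documentation, treat the parsing details as illustrative.

```python
# Sketch: query the Knowledge Graph Search API for a topic/entity.
# Replace API_KEY with your own key; parsing follows the documented
# JSON-LD response shape but is illustrative only.
import requests

API_KEY = "YOUR_API_KEY"  # placeholder
ENDPOINT = "https://kgsearch.googleapis.com/v1/entities:search"

def kg_search(query, limit=3):
    params = {"query": query, "key": API_KEY, "limit": limit, "indent": True}
    data = requests.get(ENDPOINT, params=params, timeout=10).json()
    for element in data.get("itemListElement", []):
        result = element.get("result", {})
        print(result.get("name"), "-", result.get("description"),
              "(score:", element.get("resultScore"), ")")

kg_search("Eiffel Tower")
```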

Physical Address vs. Mailing Address - What Does Google Base Ranking On? - Sterling Sky... - 0 views

  • The ranking is based on the physical location that Google thinks you are located in. There is a major difference, as far as ranking goes, between what you think your address is (such as your mailing address) and what Google thinks your physical location is.
  • What you will see is that this business ranks amazing for explicit queries with “Lenexa” (their physical location) but ranks horribly for explicit searches with “Olathe” (mailing address and where they consider themselves to actually be located)
  • Implement a strategy to earn backlinks with anchors that mention the city name. Optimize your website, including your internal linking to make it clear to Google that you have a presence in that city. Move your business inside the border of the desired city (most impactful).

You can now link Google Analytics 4 to Google Search Console - 0 views


Google reenables the validate fix feature in Search Console and adds new classifications - 0 views

  • Google said the URLs or items in the Search Console report are no longer grouped at the top level by three or more status categories, i.e. Valid, Warning, and Error. Now they are grouped or classified into two broader statuses that reflect whether those URLs or items are invalid or not. Google said invalid means that there is a report-specific critical issue in the page or item, and not invalid means that the item might still contain warnings, but has no critical issues. The implications and exact terms for the valid and invalid states vary by report type, Google added.
  • Google explained “grouping the top-level item (a rich result for the rich result reports, a page or URL for the other reports) into two groups: pages or items with critical issues are labeled something like invalid; pages or items without critical issues are labeled something like valid. We think this new grouping will make it easier to see quickly which issues affect your site’s appearance on Google, in order to help you prioritize your fixes.”
  • This should make it easier for you to understand errors in Search Console reports and thus which items to prioritize over others
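As a rough illustration of the grouping rule described above (an item with any critical issue is labeled invalid; an item with only warnings, or no issues, is valid), here is a small sketch. The item records and field names are invented for the example and are not Search Console's actual data model.

```python
# Sketch of the described grouping: any critical issue -> "invalid",
# otherwise "valid" (warnings alone don't make an item invalid).
# The item records below are invented for illustration.
items = [
    {"url": "/product/1", "critical_issues": [], "warnings": ["missing 'priceValidUntil'"]},
    {"url": "/product/2", "critical_issues": ["missing 'name'"], "warnings": []},
    {"url": "/product/3", "critical_issues": [], "warnings": []},
]

groups = {"invalid": [], "valid": []}
for item in items:
    status = "invalid" if item["critical_issues"] else "valid"
    groups[status].append(item["url"])

print(groups)  # {'invalid': ['/product/2'], 'valid': ['/product/1', '/product/3']}
```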

Is Google Dying? Or Did the Web Grow Up? - The Atlantic - 0 views

  • Search “google dying” on Twitter or Reddit and you can see people grousing about it going back to the mid-2010s. Lately, though, the criticisms have grown louder.
  • a PR response from Google’s Search liaison, Danny Sullivan, refuting one of Brereton’s claims. “You said in the post that quotes don’t give exact matches. They really do. Honest,” Sullivan wrote in a series of tweets.
  • Brereton cited Google Trends data that show that people are searching the word reddit on Google more than ever before.
  • In 2020, the company made $147 billion in revenue off ads alone, which is roughly 80 percent of its total revenue
  • Google could use such technology to continue to lead people away from their intended searches and toward its own products and paid ads with greater frequency. Or, less deviously, it could simply gently algorithmically nudge people in unexpected directions. Imagine all the life decisions that you make in a given year based on information you process after Googling. This means that the stakes of Google’s AI interpreting a searcher’s intent are high.

Why do webpages lose page rank and indexing when they are redirected? - Google Groups - 0 views

  • 1) Google has to crawl the old URL. If Google typically crawls 10 pages a week, then it is only likely to find and follow 10 redirects per week (it may be fewer, as it may crawl some pages 2+ times per week).
    2) Google then has to transfer the various values/factors/scores/data etc. from one "account" to another "account". (Think of it like moving house: Google is the mail service, and has to collect the mail and then pass it on.)
    3) PageRank in the Toolbar may not update for a while. Google only "pushes" the visible PR (Toolbar PageRank) every so often, so you may not see a visible PR for some time.
    4) There are numerous factors in ranking. If there is a fair bit of PageRank flow (the passing of link value between your own pages), then this may suffer a temporary upset whilst things are being shifted around.
    Please have some patience. This sort of thing takes time.
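The crawl-rate point is worth making concrete: if redirects are only discovered as the old URLs are recrawled, the time for all signals to migrate scales with the number of redirected URLs divided by the crawl rate. A toy calculation follows; the 10-pages-per-week figure is the hypothetical one from the answer, and the 500-URL site size is an assumption.

```python
# Toy calculation: how long until Google has even *seen* every redirect,
# if it only discovers them by recrawling old URLs. Numbers are hypothetical.
redirected_urls = 500          # old URLs now 301-redirecting (assumed)
crawl_rate_per_week = 10       # pages of this site crawled per week (from the answer)

weeks_to_discover_all = redirected_urls / crawl_rate_per_week
print(f"Upper bound to discover all redirects: {weeks_to_discover_all:.0f} weeks")
# ~50 weeks -- and transferring link value and refreshing visible PageRank
# happen after discovery, so patience really is required.
```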

Tips On Getting a Perfect 10 on Google Quality Score - 0 views

  • October 20, 2008: Tips On Getting a Perfect 10 on Google Quality Score
    Ever since Google launched the real-time quality score metric, where Google rates keywords between 0 and 10 (10 being the highest), I have rarely seen threads documenting how to receive a 10 out of 10. Tamar blogged about How To Ensure That Your Google Quality Score is 10/10 based on an experiment by abbotsys. Back then, it was simply about matching the domain name to the keyword phrase, but can it be achieved without that? A DigitalPoint Forums thread reports another advertiser receiving the 10/10 score. He documented what he did to obtain it:
    - Eliminated all the keywords that Google had suggested and only used a maximum of three keywords per ad campaign.
    - Used only 1 ad campaign per landing page and made each landing page specific for that keyword.
    - Put the cost per click up high enough to give him around third spot.
    - Geo-targeted the campaigns only in the areas he can sell to.
    - Limited the time his ads were on only to the times where there is real interest.
    - Used three versions of each keyword ("keyword", [keyword], and keyword) and then eliminated whichever wasn't working well.
    If you want to reach that perfect 10, maybe try these tips and see what works for you. There is no guaranteed checklist of items, so keep experimenting. And when you get your perfect 10, do share!
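The three "versions" of each keyword described above correspond to AdWords match types: "keyword" (phrase), [keyword] (exact), and bare keyword (broad). A small sketch that expands a seed list into those variants; the seed keywords and output format are purely illustrative.

```python
# Sketch: expand seed keywords into broad, "phrase", and [exact] variants,
# matching the three versions described in the post. Formatting is illustrative.
def match_type_variants(keyword):
    return {
        "broad": keyword,
        "phrase": f'"{keyword}"',
        "exact": f"[{keyword}]",
    }

for seed in ["blue widgets", "buy blue widgets"]:
    for match_type, formatted in match_type_variants(seed).items():
        print(f"{match_type:7s} {formatted}")
```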

How Serious Are Duplicate Meta Tag Issues? - 0 views

  • Sept 8, 2008: You may receive a warning in Google Webmaster Tools, but you will probably not receive a penalty. However, as JohnMu says in a Google Groups thread, those "warnings" are mostly nudges that you should try to diversify the meta data. Keep in mind that the terminology should "make sense to your users." If that means you need to work with duplicate meta data, then that's how you should proceed.

Search Engine Land: Must Read News About Search Marketing & Search Engines - 0 views

  • Report: Verizon May Opt For Google To Provide Mobile Search Front
    "Verizon, Google Close To Mobile Search Deal" from the Wall Street Journal reports that Google and Verizon are close to a deal. The deal would make Google the default search provider for Verizon mobile devices. In the past, Verizon and other mobile carriers were reluctant to let Google or other search companies invade this space, but that may be over now. The Wall Street Journal says Verizon wants Google to create a new search platform that would be a one-stop shop. In exchange, Google would share the ad revenue with Verizon under this platform. The details of the deal are not complete yet, and as soon as we have more information, we will update you.