
DISC Inc: Group items tagged google


Rob Laporte

Google Docs Gains E-Commerce Option - Google Blog - InformationWeek - 0 views

  • Google Docs Gains E-Commerce Option Posted by Thomas Claburn, Jul 30, 2009 06:10 PM Google (NSDQ: GOOG) on Thursday released the Google Checkout store gadget, software that allows any Google Docs user to create an online store and sell items using a Google spreadsheet. "Using new Spreadsheet Data APIs, we've integrated Google Docs and Google Checkout to make online selling a breeze," explains Google Checkout strategist Mike Giardina in a blog post. "In three simple steps, you'll be able to create an online store that's powered by Google Checkout and has inventory managed and stored in a Google spreadsheet." Giardina insists that the process is simple and can be completed in less than five minutes. To use the gadget, Google users first have to sign up for Google Checkout. They can then list whatever they want to sell in a Google spreadsheet and insert the Checkout gadget, which can also be used on Google Sites, Blogger, and iGoogle.
Rob Laporte

Google Confirms "Mayday" Update Impacts Long Tail Traffic - 0 views

  • Google Confirms “Mayday” Update Impacts Long Tail Traffic
    May 27, 2010 at 11:02am ET by Vanessa Fox

    Google made between 350 and 550 changes in its organic search algorithms in 2009. This is one of the reasons I recommend that site owners not get too fixated on specific ranking factors. If you tie construction of your site to any one perceived algorithm signal, you’re at the mercy of Google’s constant tweaks. These frequent changes are one reason Google itself downplays algorithm updates. Focus on what Google is trying to accomplish as it refines things (the most relevant, useful results possible for searchers) and you’ll generally avoid too much turbulence in your organic search traffic.

    However, sometimes a Google algorithm change is substantial enough that even those who don’t spend a lot of time focusing on the algorithms notice it. That seems to be the case with what those discussing it at Webmaster World have named “Mayday”. Last week at Google I/O, I was on a panel with Googler Matt Cutts who said, when asked during Q&A, “this is an algorithmic change in Google, looking for higher quality sites to surface for long tail queries. It went through vigorous testing and isn’t going to be rolled back.” I asked Google for more specifics and they told me that it was a rankings change, not a crawling or indexing change, which seems to imply that sites getting less traffic still have their pages indexed, but some of those pages are no longer ranking as highly as before.

    Based on Matt’s comment, this change impacts “long tail” traffic, which generally is from longer queries that few people search for individually, but in aggregate can provide a large percentage of traffic. This change seems to have primarily impacted very large sites with “item” pages that don’t have many individual links into them, might be several clicks from the home page, and may not have substantial unique and value-added content on them. For instance, ecommerce sites often have this structure. The individual product pages are unlikely to attract external links and the majority of the content may be imported from a manufacturer database. Of course, as with any change that results in a traffic hit for some sites, other sites experience the opposite. Based on Matt’s comment at Google I/O, the pages that are now ranking well for these long tail queries are from “higher quality” sites (or perhaps are “higher quality” pages).

    My complete speculation is that perhaps the relevance algorithms have been tweaked a bit. Before, pages that didn’t have high quality signals might still rank well if they had high relevance signals. And perhaps now, those high relevance signals don’t carry as much weight in ranking if the page doesn’t have the right quality signals.

    What’s a site owner to do? It can be difficult to create compelling content and attract links to these types of pages. My best suggestion to those who have been hit by this is to isolate a set of queries for which the site now is getting less traffic and check out the search results to see what pages are ranking instead. What qualities do they have that make them seem valuable? For instance, I have no way of knowing how amazon.com has fared during this update, but they’ve done a fairly good job of making individual item pages with duplicated content from manufacturers’ databases unique and compelling by adding content like user reviews. They have set up a fairly robust internal linking (and anchor text) structure with things like recommended items and lists.

    And they attract external links with features such as the my favorites widget. From the discussion at the Google I/O session, this is likely a long-term change, so if your site has been impacted by it, you’ll likely want to do some creative thinking around how you can make these types of pages more valuable (which should increase user engagement and conversion as well).

    Update on 5/30/10: Matt Cutts from Google has posted a YouTube video about the change. In it, he says “it’s an algorithmic change that changes how we assess which sites are the best match for long tail queries.” He recommends that a site owner who is impacted evaluate the quality of the site and, if the site really is the most relevant match for the impacted queries, what “great content” could be added; determine if the site is considered an “authority”; and ensure that the page does more than simply match the keywords in the query and is relevant and useful for that query. He notes that the change:
    - has nothing to do with the “Caffeine” update (an infrastructure change that is not yet fully rolled out)
    - is entirely algorithmic (and isn’t, for instance, a manual flag on individual sites)
    - impacts long tail queries more than other types
    - was fully tested and is not temporary
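Fox's triage suggestion lends itself to a quick script. A minimal sketch, assuming two analytics query exports with "query" and "visits" columns for periods before and after the update (before.csv and after.csv are placeholder filenames, and the pandas package is assumed):

    # Isolate the queries that lost the most traffic after the update.
    # before.csv / after.csv are hypothetical analytics exports with
    # "query" and "visits" columns.
    import pandas as pd

    before = pd.read_csv("before.csv").set_index("query")["visits"]
    after = pd.read_csv("after.csv").set_index("query")["visits"]

    delta = after.reindex(before.index, fill_value=0) - before
    losers = delta[delta < 0].sort_values()
    print(losers.head(20))  # the 20 queries with the biggest drops

Searching Google for a handful of these queries and studying the pages that now outrank yours is the comparison Fox recommends.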
Rob Laporte

Google AdWords: Now With Images - 0 views

  • Oct 23, 2008 at 1:42pm Eastern by Barry Schwartz

    Google AdWords: Now With Images

    Some AdWords ads on Google are now showing associated images — and getting much larger in the space they take up — through a “Show products from” Plus Box implementation that some are seeing now when searching at Google. For example, try searching for bluenile, which brings up a Blue Nile ad. Under the usual ad title and description is a plus symbol (called a Plus Box), followed by the words, “Show products from Blue Nile for bluenile.” If you click on the box, it opens up three product listings from Blue Nile, each listing with an associated image. The most shocking part of this ad is how much room it takes up. When you click to open up the product results, the whole visible part of the page is consumed with this one ad; the expanded unit measures about 370 pixels tall for me. The ad also shows on the right hand side, as Steve Rubel shows. I was able to replicate Steve’s findings by searching for diamonds. This implementation is better, in my opinion, because it does not change how the natural/free results are shown but rather only pushes down other ads on the right hand side. Images associated with search ads are not too surprising. We have seen implementations of video ads in AdWords several times. It just seems to me that Google is willing to try anything now when it comes to ads, from video to images to multimedia and who knows what. Do note that back in January I reported that Google was testing product results within AdWords. But those product results seemed to have been powered by Google Base and did not contain product images. Google has also been testing showing banner ads in image search.
Rob Laporte

Search Stats You Need to Know (Sept 08) & Build A Banner In Minutes - 0 views

  • Google AdWords: Separate metrics for Google and search partners are now available. As reported on the Inside AdWords blog, and in the spirit of transparency, Google is finally breaking out stats between Google Search and the Google Search Network. I’ve actually run mirrored campaigns with each option just to be able to see the difference between the two search vehicles. I’m glad Google has now opened this up to us. According to the Google blog: We’re happy to let you know that we’ve changed the way your Campaign Summary and Ad Group Summary pages present statistics in order to give you an additional level of detail into your campaign performance. Previously, these pages divided statistics into two categories: search, which included Google and search partners, and the content network. Now, we show one set of statistics for Google and another set aggregating search partner performance. Search partners include AOL, Ask.com, and many other search sites around the web. You can view ad group or campaign performance at a summary level, or broken down by different combinations of Google, our search partners, and our content network. Additionally, separate Google and aggregate search partner statistics will soon be available in the Report Center.
  • Average Search CPC Data by Category for September 2008. Reported by ClickZ based on an Efficient Frontier study: a look at the average CPC in search by vertical in the U.S. for September 2008, compared to the prior month. Data and research are provided by Efficient Frontier. “Total finance” includes auto finance, banking, credit, financial information, insurance, lending, and mortgage. Each vertical contains data from multiple advertisers. The percentage of change from the previous month is indicated in parentheses.
    Total Finance - $2.06 (-22.6%)
    Mortgage - $2.89 (7.8%)
    Insurance - $12.65 (4.3%)
    Travel - $0.69 (-4.2%)
    Automotive - $0.54 (-5.3%)
    Retail - $0.50 (13.6%)
    Dating - $0.44 (2.3%)
    The biggest change came in the Finance category, which dropped from $2.66 in August to $2.06 in September.
  • ...2 more annotations...
  • Paid Search Spending Pops: Very few cuts planned, most plan to splurge. From eMarketer: The near future of online ad spending in the US—or at least the largest portion of it—continues to look good despite turmoil in some other ad media and the economy at large. More than eight out of 10 marketers who spent at least $50,000 per month on paid search said they planned to maintain or increase their spending during the next 12 months, according to a Marin Software-sponsored study conducted by JupiterResearch. More than 90% of the big spenders also said they would spend as much as 22% more if they had better campaign management tools. Change in paid search spending in the next 12 months, according to US search marketers, 2008:
    55% plan to increase spending
    28% plan to maintain spending
    17% plan to decrease spending
  • Free tool of the week: Build banner ads in minutes in AdWords. Called the Display Ad Builder, AdWords now offers a wizard-type interface that walks you through the process of building a banner ad. As reported on their blog last week: Today we released the AdWords display ad builder, which lets you create professional-looking display ads in AdWords without needing to hire a designer or start from scratch. If you’ve wanted to expand beyond your text ad campaigns, or if you’ve been looking for an easier way to build display ads, this tool can help. This new tool lets you create customized display ads with your own text, images, and logo. You can also change colors and backgrounds. The tool can create ads to fit all possible placements across the Google content network, including video and game placements. The display ad builder is available now to all advertisers in the U.S. and Canada. The interface is very easy to use. Check out the sample ad I designed for this column. Okay, so I’m not going to win a Clio Award for this, but it is a good way to make a quick ad and I’m sure Google will expand the features in the near future. For more info on this tool, check out the YouTube video tutorial and the Display Ads 101 Tutorial.
  •  
    Top 10 Industry Search Terms - September, 2008 by Hitwise US. The terms listed below are ranked by volume of searches that successfully drove traffic to websites in the Hitwise All Categories category for the 4 weeks ending September 27, 2008, based on US Internet usage.
    1. myspace - .78%
    2. craigslist - .47%
    3. ebay - .34%
    4. youtube - .26%
    5. myspace.com - .26%
    6. facebook - .20%
    7. yahoo - .19%
    8. mapquest - .16%
    9. www.myspace.com - .10%
    10. craigs list - .09%
    Top 10 Fast Moving Search Terms - September, 2008 by Hitwise. This list features the search terms for the industry All Categories, ranked by largest relative increase for the week ending September 27, 2008, compared with the week ending September 20, 2008.
    1. dancing with the stars
    2. paul newman
    3. david blaine
    4. clay aiken
    5. britney spears
    6. 2009 ford mustang concept car
    7. hooters
    8. criss angel
    9. heroes
    10. presidential debate
    Some of the terms that are off the top ten list from August: sarah palin, hurricane gustav, how to get a tax refund, palin, democratic convention.
Rob Laporte

Google Discover SEO Best Practices - Moz - 0 views

  • Most article links that appear in Google Discover are sourced from non-Google publishers.
  • There are not many technical requirements to be featured in Google Discover, compared to Google News. You do not need a specific sitemap for Google Discover, nor is there any sort of manual submission process to make your content eligible for Discover feeds.
  • less predictable or dependable when compared to Search
  • ...9 more annotations...
  • Technical Guidelines for Article Links There are two technical requirements that are recommended by Google in order to be featured in Discover feeds, listed below. These recommendations apply only to the ‘Article Link’ content types. These technical guidelines do not apply to YouTube videos or shorts, web stories or Ads.
  • Images are a major part of the Google Discover experience
  • RSS Feeds
  • Follow Feature
  • quality of its content
  • Provide content that's timely for current interests, tells a story well, or provides unique insights
  • The “shelf life” of an article within a Google Discover feed may only be 1 or 2 days.
  • According to a Search Engine Journal study, 46% of a sample size of Google Discover URLs were news sites and 44% were Ecommerce.
  • It's important to note that impressions are only counted when a link from your site is scrolled into view.
jack_fox

Can Google Ignore Portions Of Your Site For Assessing Quality - 0 views

  • how long does a site need to wait for Google to process a quality change and the answer was at least two months - one month won't cut it. And this applies to both Google Search and Google Discover, it isn't different. John said he would guess for a large site a couple of months would give Google a chance to understand it better. A month is too little to see a significant impact.
  • John then goes into explaining that for a site that produces a lot of new content often, Google will "focus essentially on the newer content on the main category sections of the web site." Because of the structure of your site, you are giving your newer content more prominence on your web site, and Google will focus its crawling and indexing more on that newer content. John said if you are constantly creating new content, then that is where Google will shift its focus.
  • if you're looking at an overall quality issue with regards to your website and you have kind of this reference part that's really important for your website but it's really low quality then we will still balance that low quality part with your newer quality news content and try to find some middle ground there with regards to how we understand the quality of your website overall.
  • ...1 more annotation...
  • John has said that Google only judges sites based on what it indexes of that site. And if Google is not indexing big portions of your site, it won't judge those portions. Get it? So if Google is focused on indexing newer content, based on how you structure your web site, then it might not be indexing that low quality content from ages ago anymore. That older, lower quality content won't be ranking in Google, but at the same time, it won't be dragging down your site's quality. Again, "it depends" on your site and the specific situation for your web site.
Rob Laporte

Google Sitelinks - What Sitelinks Are and How They Work - 0 views

  • What are Google Sitelinks? Google Sitelinks are displayed in Google search results and are meant to help users navigate your website. Google's systems analyze the link structure of your website to find shortcuts that will save users time and allow them to quickly find the information they're looking for. Sitelinks are completely automated by Google's algorithm. In short, Google Sitelinks are shortcuts to your main pages from the search result pages.

    When do Google Sitelinks show? Google only shows Sitelinks for results when they think they'll be useful to the user. If the structure of your website doesn't allow Googlebot to find good Sitelinks, or Google doesn't think the Sitelinks for your site are relevant to the user's query, it won't show them. Although there are no definitive answers from Google, the following factors seem to influence whether Google displays Sitelinks:
    - Your site must have a stable no. 1 ranking for the search query, so Sitelinks show up most often for searches on brand names.
    - Your site must be old enough; websites under two years old don't seem to get Sitelinks.
    - The number of searches: keywords that aren't searched often enough don't seem to get Sitelinks.
    - The number of clicks: it seems your site has to get many clicks for the searched keywords.
    - Sitelinks seem not to show for search queries consisting of two or more keywords.
    - The number of links: links are important everywhere in the SEO world, aren't they? Inbound links with relevant anchor text seem to influence the chance of getting Sitelinks.

    How can we get Sitelinks for our website? If you meet the criteria above, you'll have a good chance of getting Sitelinks shown for your site. But you can also improve the structure of your website to increase the likelihood and quality of your Sitelinks. Google seems to use the first-level links on a website for Sitelinks, so make sure all your important links are on the homepage. The links should be text links or image links with an IMG ALT attribute; JavaScript or Flash links are not considered for Sitelinks. Also, Google seems to favor links that appear at the top of a webpage, so try to put your important links at the top of the HTML code and then re-position them using CSS. Overall, building your website following SEO best practices and ranking no. 1 for your most important keywords will ensure Sitelinks appear and help users navigate your website.
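As a rough way to act on the advice above, here is a minimal homepage-link audit sketch, assuming the requests and beautifulsoup4 packages are installed; https://www.example.com/ is a placeholder for your own homepage:

    # List first-level (homepage) links, the pool Google appears to draw
    # Sitelinks from, and flag image links that lack an ALT attribute.
    import requests
    from bs4 import BeautifulSoup

    resp = requests.get("https://www.example.com/", timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")

    for a in soup.find_all("a", href=True):
        text = a.get_text(strip=True)
        img = a.find("img")
        if text:
            print(f"text link: {text[:40]} -> {a['href']}")
        elif img is not None and img.get("alt"):
            print(f"image link (alt ok): {img['alt'][:30]} -> {a['href']}")
        else:
            # Per the advice above, this link is invisible to Sitelinks.
            print(f"image link MISSING alt -> {a['href']}")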
Rob Laporte

RankBrain Judgment Day: 4 SEO Strategies You'll Need to Survive | WordStream - 0 views

  • The future of SEO isn't about beating another page based on content length, social metrics, keyword usage, or your number of backlinks. Better organic search visibility will come from beating your competitors with a higher than expected click-through rate.
  • In “Google Organic Click-Through Rates” on Moz, Philip Petrescu shared the following CTR data:
  • The Larry RankBrain Risk Detection Algorithm. Just download all of your query data from Webmaster Tools and plot CTR vs. Average Position for the queries you rank for organically (a minimal plotting sketch appears after this list of annotations).
  • ...7 more annotations...
  • Our research into millions of PPC ads has shown that the single most powerful way to increase CTR in ads is to leverage emotional triggers. Tapping into emotions will get your target customer/audience clicking! Anger. Disgust. Affirmation. Fear. These are some of the most powerful triggers that not only drive click-through rate, but also increase conversion rates.
  • No, you need to combine keywords and emotional triggers to create SEO superstorms that result in ridiculous CTRs
  • Bottom line: Use emotional triggers + keywords in your titles and descriptions if you want your CTR to go from "OK" to great.
  • Bottom line: You must beat the expected CTR for a given organic search position. Optimize for relevance or die.
  • Let's say you work for a tech company. Your visitors, on average, are bouncing away at 80% for the typical session, but users on a competing website are viewing more pages per session and have a bounce rate of just 50%. RankBrain views them as better than you – and they appear above you in the SERPs. In this case, the task completion rate is engagement. Bottom line: If you have high task completion rates, Google will assume your content is relevant. If you have crappy task completion rates, RankBrain will penalize you.
  • 4. Increase Search Volume & CTR Using Social Ads and Display Remarketing. People who are familiar with your brand are 2x more likely to click on your ads and 2x more likely to convert. We know this because targeting a user who has already visited your website (or app) via RLSA (remarketing lists for search ads) always produces higher CTRs than generically targeting the same keywords to users who are unfamiliar with your brand. So, one ingenious method to increase your organic CTRs and beat RankBrain is to bombard your specific target market with Facebook and Twitter ads. Facebook ads are proven to lift mobile search referral traffic volume to advertiser websites (by 6% on average, up to 12.8%) (here’s the research). With more than a billion daily users, your audience is definitely using the Social Network. Facebook ads are inexpensive – even spending just $50 on social ads can generate tremendous exposure and awareness of your brand. Another relatively inexpensive way to dramatically build up brand recognition is to leverage the power of Display Ad remarketing on the Google Display Network. This will ensure the visitors you drive from social media ads remember who you are and what it is you do. In various tests, we found that implementing a display ad remarketing strategy has a dramatic impact on bounce rates and other engagement metrics. Bottom line: If you want to increase organic CTRs for your brand or business, make sure people are familiar with your offering. People who are more aware of your brand and become familiar with what you do will be predisposed to click on your result in SERP when it matters most, and will have much higher task completion rates after having clicked through to your site.
  • UPDATE: As many of us suspected, Google has continued to apply RankBrain to increasing volumes of search queries - so many, in fact, that Google now says its AI processes every query Google handles, which has enormous implications for SEO. As little as a year ago, RankBrain was reportedly handling approximately 15% of Google's total volume of search queries. Now, it's processing all of them. It's still too soon to say precisely what effect this will have on how you should approach SEO, but it's safe to assume that RankBrain will continue to focus on rewarding quality, relevant content. It's also worth noting that, according to Google, RankBrain itself is now the third-most important ranking signal in the larger Google algorithm, meaning that "optimizing" for RankBrain will likely dominate conversations in the SEO space for the foreseeable future. To read more about the scope and potential of RankBrain and its impact on SEO, check out this excellent write-up at Search Engine Land.
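A minimal sketch of the CTR-vs-position plot described in the annotations above, assuming a Webmaster Tools/Search Console query export (Queries.csv is a placeholder filename) with "CTR" and "Position" columns, plus the pandas and matplotlib packages:

    # Plot organic CTR against average position to spot queries that
    # underperform their position, the "RankBrain risk" check above.
    import pandas as pd
    import matplotlib.pyplot as plt

    df = pd.read_csv("Queries.csv")
    # CTR is exported as a percentage string, e.g. "4.2%" -> 4.2
    df["CTR"] = df["CTR"].str.rstrip("%").astype(float)

    plt.scatter(df["Position"], df["CTR"], alpha=0.5)
    plt.xlabel("Average position")
    plt.ylabel("CTR (%)")
    plt.title("Organic CTR vs. average position")
    plt.show()

Points that sit well below their neighbors at the same position are the titles and descriptions to rewrite first.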
Rob Laporte

Google December 2020 Core Update Is Completely Rolled Out - 0 views

  • Google has finished rolling out the Google December 2020 Core Update on December 16th. It began rolling at around 1pm ET on December 3rd and took 13 days to fully roll out, which is just about the two-week timeframe Google has given us for the core update rollouts.
  • This was an atypical core update (not that any core update is typical), but this rollout felt weird. We saw a huge spike in volatility on December 4th, the day after the update began rolling out. And then we saw nothing for days, until December 10th, when I said we saw the second wave of the update.
Rob Laporte

Google Webmaster Tools Now Provide Source Data For Broken Links - 0 views

  • Google has also added functionality to the Webmaster Tools API to enable site owners to provide input on control settings (such as preferred domain and crawl rate) that could previously only be done via the application. As they note in the blog post: “This is especially useful if you have a large number of sites. With the Webmaster Tools API, you can perform hundreds of operations in the time that it would take to add and verify a single site through the web interface.”
  •  
    Oct 13, 2008 at 5:28pm Eastern by Vanessa Fox

    Google Webmaster Tools Now Provide Source Data For Broken Links

    Ever since Google Webmaster Tools started reporting on broken links to a site, webmasters have been asking for the sources of those links. Today, Google has delivered. From Webmaster Tools you can now see the page that each broken link is coming from. This information should be of great help to webmasters in ensuring that visitors find their sites and that their links are properly credited.

    The value of the 404 error report
    Why does Google report broken links in the first place? As Googlebot crawls the web, it stores a list of all the links it finds. It then uses that list for a couple of things:
    - as the source list to crawl more pages on the web
    - to help calculate PageRank

    If your site has a page with the URL www.example.com/mypage.html and someone links to it using the URL www.example.com/mpage.html, then a few things can happen:
    - Visitors who click on that link arrive at the 404 page for your site and aren't able to get to the content they were looking for.
    - Googlebot follows that link and, instead of finding a valid page of your site to crawl, receives a 404 page.
    - Google can't use that link to give a specific page on your site link credit (because it has no page to credit).

    Clearly, knowing about broken links to your site is valuable. The best solution in these situations generally is to implement a 301 redirect from the incorrect URL to the correct one. If you see a 404 error for www.example.com/mpage.html, you can be pretty sure they meant to link to www.example.com/mypage.html. By implementing the redirect, visitors who click the link find the right content, Googlebot finds the content, and mypage.html gets credit for the link. In addition, you can scan your site to see if any of the broken links are internal, and fix them. But finding broken links on your site can be tedious (although it's valuable to run a broken link check…)
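Where the report leaves off, a short script can confirm which source pages still carry a dead link. A minimal sketch, assuming the requests and beautifulsoup4 packages; the URLs are placeholders standing in for rows from the 404 report:

    # For each broken URL, verify it still 404s and check whether the
    # reported source pages still link to it.
    import requests
    from bs4 import BeautifulSoup

    broken = {
        "https://www.example.com/mpage.html": [
            "https://blog.example.org/post-linking-to-you/",
        ],
    }

    for dead_url, sources in broken.items():
        status = requests.head(dead_url, allow_redirects=True, timeout=10).status_code
        print(f"{dead_url} -> HTTP {status}")
        for src in sources:
            html = requests.get(src, timeout=10).text
            soup = BeautifulSoup(html, "html.parser")
            if any(a["href"] == dead_url for a in soup.find_all("a", href=True)):
                print(f"  still linked from {src}")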
Rob Laporte

As Deal With Twitter Expires, Google Realtime Search Goes Offline - 0 views

  • While Twitter may need Google to continue offering archive search, Google also potentially needs Twitter in another way. Google may have lost some of the data it has recently been using to bring social signals into its results, as covered more below: Google’s Search Results Get More Social; Twitter As The New Facebook “Like” I’ve not yet been able to check on whether Google Social Search and other parts of Google have been impacted by the deal’s end. I’ll look at that later — I’m heading off to enjoy the 4th Of July myself now. Update: Google has sent us a statement addressing the issue above: While we will not have access to this special feed from Twitter, information on Twitter that’s publicly available to our crawlers will still be searchable and discoverable on Google. As for other features such as social search, they will continue to exist, though without Twitter data from the special feed. You can certainly understand why Google+ has become even more important to the service now. While Google has gotten by largely without social signals from Facebook, having its own data from Google+ gives it insulation if it now has to get by without Twitter signals, as well.
Rob Laporte

Eric Schmidt: Google Will Give Higher Rankings to Content Tied to Verified Profiles - S... - 0 views

  •  
    It's no overstatement to say that Google's former CEO Eric Schmidt is quite outspoken. His "talk first, think later" approach has a tendency of providing some great soundbites for the media. This time, it's not an impromptu interview providing the interesting opinion. The Wall Street Journal has obtained some excerpts from Schmidt's upcoming book, "The New Digital Age." One of those excerpts clearly spells out where Google is heading in the future: "Within search results, information tied to verified online profiles will be ranked higher than content without such verification, which will result in most users naturally clicking on the top (verified) results. The true cost of remaining anonymous, then, might be irrelevance." When Google introduced authorship markup in 2011, Google did note that they were "looking closely at ways this markup could help us highlight authors and rank search results." Now Schmidt has made it explicit: in the future, you can boost your rankings by using Google authorship, and as we've reported before, Google+ has been designed to be an identity verification network.
Rob Laporte

Google Analytics Upgrade: AdSense Reporting, Visualization Tools, & More - 0 views

  • Online publishers may be most interested in the AdSense integration tools coming to Google Analytics. After linking an AdSense and Analytics account, you'll be able to see AdSense data including:
    - total revenue, impressions, clicks, and click-through ratio
    - revenue per day, per hour, etc.
    - revenue per page (which pages are most profitable)
    - revenue per referral (which other sites bring you profitable traffic)
    During our call this morning, we asked why AdSense itself doesn't also offer this data without requiring the use of Google Analytics to get it. We're waiting for a reply from Google's AdSense team and will let you know what we learn. Update: A Google spokesperson says, "We can't comment on any future AdSense developments or features." Motion Charts is a visualization tool that lets you see and interact with analytics data in five dimensions, a capability made possible by Google's purchase of Gapminder's Trendalyzer software in March, 2007. The Google Analytics API, which is currently in private beta, will open up analytics data for developers to export and use however they want. Advanced segmentation allows users to dig deeper into subsets of traffic, such as "visits with conversions," or create their own segment types. Custom reporting lets users create their own comparisons of metrics. Google has created a series of videos showing how some of these new tools work. Crosby says the new features will be rolled out in coming weeks to Google Analytics users, who may see some new features earlier than others. The AdSense integration, he warns, may take longer to roll out than the other new tools. More discussion at Techmeme.
Rob Laporte

Small Business Alert: Claim Your Google Local Business Listing Before Someone Else Does! - 0 views

  •  
    Oct 7, 2008 at 11:59am Eastern by Mike Blumenthal

    Small Business Alert: Claim Your Google Local Business Listing Before Someone Else Does!

    Imagine going to the Post Office to check your post office box to discover that all of your mail and receipts for the past few weeks had been forwarded to an unknown party. The Post Office informed you that there was no chance of getting your receipts back and if you wanted to start receiving your mail at your PO box once again, you needed to go over to their new business center and fill out some forms to claim your box. Just notifying the Post Office that it was your box was not enough to protect it in the future. Due to normal delays in processing, it would be two weeks before you started receiving your mail and money again. If you're a small business with a local listing in one of the major search engines, you need to beware: the same scenario described above could happen to your local search result info if you're not careful. The apparent hijacking of a large number of independent florists in Google Maps several weeks back is just such a story. Google, in the role of Post Office, allowed someone to hijack listings in the florist industry using the community edit feature. For those of you unfamiliar with the incident, here is a brief recap. The technique, apparently in widespread use in the locksmith, payday loan and other industries, exploited weaknesses in Google's Community Edit capability. In this newly reported case in the floral industry, affiliate mapspammers targeted high-ranking florists in major markets that had not claimed their business listings in the Local Business Center, so as to benefit from an existing business's ranking and reviews. The spammers, using these community edit tools, would change the phone number to another local number, change the location of the business slightly and then proceed to add a category, a new URL and ultimately change the name of the business. Apparently the small…
Rob Laporte

Google MUM: What to Know About the New Search Engine Tech - 0 views

  • Google MUM, or Multitask Unified Model, is the latest algorithmic successor for the search engine giant. Google has called MUM “a new AI milestone inside of Google search.”
  • It basically gathers subcategories for the query and delivers a more holistic picture for the benefit of the end user. MUM is particularly attuned to comparisons for an umbrella of related queries.
  • One thing that’s interesting about MUM is that it understands things across text and images. In the future, Google expects to be able to incorporate audio and video in the omnichannel mix, too.
  • ...8 more annotations...
  • pull information from different languages
  • understand thoughtful subjects more holistically
  • Google’s algorithm update combines “entities, sentiments and intent” all for the sake of the user experience.
  • Google’s Senior Webmaster Trends Analyst John Mueller says, “I don’t really see how this would reduce the need for SEO.”
  • BERT and MUM are both built on something called a Transformer Architecture. However, MUM has more multitasking capabilities than BERT. Because of this, Google reports that MUM is 1,000 times stronger than BERT at providing nuanced results.
  • Google’s been mum on when MUM will expand from beta mode. It didn’t take an excessive amount of time for BERT, so the outlook seems promising.
  • Continue optimizing your content with multimedia in mind. Keep the user at the forefront of your strategy, since that’s exactly what Google MUM is doing.
  • a sprawling leap forward in machine learning
Rob Laporte

How to Optimize & Track Google Discover in Google Analytics - - 0 views

  • Enable max image preview
  • <meta name="robots" content="max-image-preview:large">
  • Make sure it’s mobile-friendly
  • ...3 more annotations...
  • Google Search Console has a report for Google Discover traffic. But it is not a tracking tool and does not give us any useful insights about the traffic.  Tracking this traffic in Google Analytics can serve the greater purpose of building audiences, analyzing user behavior, and tracking conversions. So, how do we track Discover traffic in Google Analytics? 
  • Method 1 (UA only)
  • Method 2 (Both UA & GA4):
Rob Laporte

Chitika Insights | The Value of Google Result Positioning - 0 views

  • How much is the top spot on Google actually worth?  According to data from the Chitika network, it’s worth a ton – double the traffic of the #2 spot, to be precise. In order to find out the value of SEO, we looked at a sample of traffic coming into our advertising network from Google and broke it down by Google results placement.  The top spot drove 34.35% of all traffic in the sample, almost as much as the numbers 2 through 5 slots combined, and more than the numbers 5 through 20 (the end of page 2) put together. “Obviously, everyone knows that the #1 spot on Google is where you want to be,” says Chitika research director Daniel Ruby.  “It’s just kind of shocking to look at the numbers and see just how important it is, and how much of a jump there is from 2 to 1.” The biggest jump, percentage-wise, is from the top of page 2 to the bottom of page 1.  Going from the 11th spot to 10th sees a 143% jump in traffic.  However, the base number is very low – that 143% jump is from 1.11% of all Google traffic to 2.71%.  As you go up the top page, the raw jumps get bigger and bigger, culminating in that desired top position.
    Google Result   Impressions   Percentage
    1               2,834,806     34.35%
    2               1,399,502     16.96%
    3                 942,706     11.42%
    4                 638,106      7.73%
    5                 510,721      6.19%
    6                 416,887      5.05%
    7                 331,500      4.02%
    8                 286,118      3.47%
    9                 235,197      2.85%
    10                223,320      2.71%
    11                 91,978      1.11%
    12                 69,778      0.85%
    13                 57,952      0.70%
    14                 46,822      0.57%
    15                 39,635      0.48%
    16                 32,168      0.39%
    17                 26,933      0.33%
    18                 23,131      0.28%
    19                 22,027      0.27%
    20                 23,953      0.29%
    Numbers are based on a sample of 8,253,240 impressions across the Chitika advertising network in May, 2010.
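The relative jumps quoted above fall straight out of the impression counts. A small sketch recomputing them:

    # Recompute position-to-position traffic jumps from the Chitika data.
    impressions = {
        1: 2_834_806, 2: 1_399_502, 3: 942_706, 4: 638_106, 5: 510_721,
        6: 416_887, 7: 331_500, 8: 286_118, 9: 235_197, 10: 223_320,
        11: 91_978,
    }

    for pos in range(2, 12):
        jump = impressions[pos - 1] / impressions[pos] - 1
        print(f"moving from #{pos} to #{pos - 1}: +{jump:.0%}")
    # Moving from #11 to #10 prints +143%, the jump highlighted above.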
Rob Laporte

Google Insights Forecasts the Future - Google Blog - InformationWeek - 0 views

  • Google Insights Forecasts the Future. Posted by Thomas Claburn, Aug 17, 2009 05:46 PM. Google (NSDQ: GOOG) has enhanced Google Insights for Search, its search term data analysis tool, to help users see into the future. Now available in 39 languages, Google Insights for Search includes a new forecasting feature that can extrapolate a search term's future popularity based on its past performance. For search terms with a lot of historical data, Google Insights for Search can project a likely trend. It's not a perfect prediction of what's to come, but it may be useful in certain circumstances. Google has also added an animated map that allows users to see how search query volume changes over time in specific geographic regions. Graphs generated using Google Insights for Search can be presented on any Web page or iGoogle page using an embeddable gadget. This is particularly useful for tracking the ebb and flow of online discussion about a particular topic.
Rob Laporte

Why Google Panda Is More A Ranking Factor Than Algorithm Update - 0 views

  • At our SMX Advanced conference earlier this month, the head of Google’s spam-fighting team, Matt Cutts, explained that the Panda filter isn’t running all the time. Right now, it takes too much computing power to run this particular analysis of pages continuously. Instead, Google runs the filter periodically to calculate the values it needs. Each new run so far has also coincided with changes to the filter, some big, some small, that Google hopes will improve its catching of poor-quality content. So far, the Panda schedule has been like this:
    - Panda Update 1.0: Feb. 24, 2011
    - Panda Update 2.0: April 11, 2011 (about 7 weeks later)
    - Panda Update 2.1: May 10, 2011 (about 4 weeks later)
    - Panda Update 2.2: June 16, 2011 (about 5 weeks later)
    Recovering From Panda: For anyone who was hit by Panda, it’s important to understand that the changes you’ve made won’t have any immediate impact. For instance, if you started making improvements to your site the day after Panda 1.0 happened, none of those would have registered toward getting you back into Google’s good graces until the next time Panda scores were assessed — which wasn’t until around April 11. With the latest Panda round now live, Google says it’s possible some sites that were hit by past rounds might see improvements, if they themselves have improved.
Rob Laporte

70+ Best Free SEO Tools (As Voted-for by the SEO Community) - 1 views

  • Soovle — Scrapes Google, Bing, Yahoo, Wikipedia, Amazon, YouTube, and Answers.com to generate hundreds of keyword ideas from a seed keyword. Very powerful tool, although the UI could do with some work.
  • Hemingway Editor — Improves the clarity of your writing by highlighting difficult-to-read sentences, “weak” words, and so forth. A must-have tool for bloggers (I use it myself).
  • Yandex Metrica — 100% free web analytics software. Includes heat maps, form analytics, session replay, and many other features you typically wouldn’t see in a free tool.
  • For example, two of my all-time favourite tools are gInfinity (Chrome extension) and Chris Ainsworth’s SERPs extraction bookmarklet. By combining these two free tools, you can extract multiple pages of the SERPs (with meta titles + descriptions) in seconds.
  • ...17 more annotations...
  • Varvy — Checks whether a web page is following Google’s guidelines. If your website falls short, it tells you what needs fixing.
  • LSIgraph.com — Latent Semantic Indexing (LSI) keywords generator. Enter a seed keyword, and it’ll generate a list of LSI keywords (i.e. keywords and topics semantically related to your seed keyword). TextOptimizer is another very similar tool that does roughly the same job.
  • Small SEO Tools Plagiarism Checker — Detects plagiarism by scanning billions of documents across the web. Useful for finding those who’ve stolen/copied your work without attribution.
  • iSearchFrom.com — Emulate a Google search using any location, device, or language. You can customise everything from SafeSearch settings to personalised search.
  • Delim.co — Convert a comma-delimited list (i.e. CSV) in seconds. Not necessarily an SEO tool per se but definitely very useful for many SEO-related tasks.
  • Am I Responsive? — Checks website responsiveness by showing you how it looks on desktop, laptop, tablet, and mobile.
  • SERPLab — Free Google rankings checker. Updates up to 50 keywords once every 24 hours (server permitting).
  • Keyword Mixer — Combine your existing keywords in different ways to try and find better alternatives. Also useful for removing duplicates from your keywords list. Note: MergeWords does (almost) exactly the same job albeit with a cleaner UI. However, there is no option to de-dupe the list.
  • JSON-LD Schema Generator — JSON-LD schema markup generator. It currently supports six markup types including: product, local business, event, and organization. (A small generation sketch appears after this list.)
  • KnowEm Social Media Optimizer — Analyses your web page to see if it’s well-optimised for social sharing. It checks for markup from Facebook, Google+, Twitter, and LinkedIn.
  • Where Goes? — Shows you the entire path of meta-refreshes and redirects for any URL. Very useful for diagnosing link issues (e.g. complex redirect chains).
  • Google Business Review Link Generator — Generates a direct link to your Google Business listing. You can choose between a link to all current Google reviews, or to a pre-filled 5-star review box.
  • PublicWWW — Searches the web for pages using source code-based footprints. Useful for finding your competitors affiliates, websites with the same Google Analytics code, and more.
  • Keywordtool.io — Scrapes Google Autosuggest to generate 750 keyword suggestions from one seed keyword. It can also generate keyword suggestions for YouTube, Bing, Amazon, and more.
  • SERPWatcher — Rank tracking tool with a few unique metrics (e.g. “dominance index”). It also shows estimated visits and ranking distribution charts, amongst other things.
  • GTMetrix — Industry-leading tool for analysing the loading speed of your website. It also gives actionable recommendations on how to make your website faster.
  • Mondovo — A suite of SEO tools covering everything from keyword research to rank tracking. It also generates various SEO reports.
  • SEO Site Checkup — Analyse various on-page/technical SEO issues, monitor rankings, analyse competitors, create custom white-label reports, and more.
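For a flavor of what a JSON-LD generator like the one above emits, here is a rough sketch using Python's standard json module; all field values are placeholders:

    # Build product schema markup and wrap it in the script tag that
    # belongs in the page's <head>.
    import json

    product = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": "Example Widget",
        "description": "A placeholder product for illustration.",
        "offers": {
            "@type": "Offer",
            "price": "19.99",
            "priceCurrency": "USD",
            "availability": "https://schema.org/InStock",
        },
    }

    markup = ('<script type="application/ld+json">'
              + json.dumps(product, indent=2)
              + '</script>')
    print(markup)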