
Group items tagged Google Analytics

Pedro Gonçalves

Google Launches Content Recommendation Engine For Mobile Sites, Powered By Google+ | Te... - 0 views

  • Google’s launch partner for this service is Forbes, but others can implement these recommendations by just adding a single line of code to their mobile sites.
  • These recommendations, Sternberg told me, are based on social recommendations on the site from your friends on Google+ (only if you are signed in, of course), what the story you just read was about, the story’s author and some of Google’s “secret sauce.”
  • The new Google+-based recommendations, interestingly, only appear once a reader slides back up on a page. This, Google’s analytics show, is a pretty good indicator that a user has finished reading a post (even if there is still more text left on the page). The recommendation widget then slides up from the bottom and one extra click brings up more relevant items for the page. The other option is to show the widget after a user scrolls past a configurable CSS entity.
  • ...1 more annotation...
  • Publishers will be able to manage the recommendations widget from their Google+ publisher accounts. From there, they can decide when exactly the widget should appear and manage a list of pages where the widget shouldn’t appear, as well as a list of pages that should never appear in recommendations.
Pedro Gonçalves

About Traffic Sources - Analytics Help - 0 views

  • The keywords that visitors searched are usually captured in the case of search engine referrals. This is true for both organic and paid search. If a visitor is signed in to a Google account, however, Keyword will have the value “(not provided)”.
    • Pedro Gonçalves
       
      Why!?
Pedro Gonçalves

Standards and benchmarks - 0 views

  • The average page among the top 1,000 websites weighs 1,575 KB.
  • Page growth is a major reason why we keep finding, quarter after quarter, that pages are getting slower. And faster networks are not a cure-all for the challenges of page bloat.
  • According to Akamai’s most recent quarterly State of the Internet report, the global average connection speed among the top 50 internet-using countries is 3.3 Mbps — a 5.2% increase over the previous quarter. But when we’re seeing year-over-year page growth ranging from 45-50%, it’s easy to see that the gap is widening.
  • ...16 more annotations...
  • A whopping 804 KB per page is made up of images. Three years ago, images accounted for just 372 KB of a page’s total payload.
  • images are one of the single greatest impediments to front-end performance. All too often, they’re either in the wrong format or they’re uncompressed or they’re not optimized to load progressively — or all of the above.
  • Today, 38% of pages use Flash, compared to 52% in 2010. This is a good thing. Nothing against Flash, per se, but if Apple has no plans ever to support it, its obsolescence is inevitable in our increasingly mobile-first world.
  • use of custom fonts has exploded — from 1% in 2010 to 33% today.
  • But custom fonts have a dark side: they can incur a significant performance penalty.
  • These days, images on the web have to work hard. They need to be high-res enough to satisfy users with retina displays, and they also need to be small enough in size that they don’t blow your mobile data cap in one fell swoop. Responsive web design attempts to navigate this tricky terrain, with varying degrees of success.
  • Google published findings, based on Google Analytics data, which suggest that load times have gotten marginally faster for desktop users, and up to 30% faster for mobile users.
  • Here at Strangeloop/Radware, we’ve found the opposite. Using WebPagetest, we’ve been testing the same 2,000 top Alexa-ranked ecommerce sites since 2010, and our data tells us that top ecommerce pages have gotten 22% slower in the past year.
  • This quick-and-dirty case study illustrates how network speed doesn’t directly correlate to load time. For example, download bandwidth is 3.3× higher on cable (5Mbps) than on DSL (1.5Mbps), yet the performance gain is only 12%.
  • Move scripts to the bottom of the page
  • It’s better to move scripts from the top to as low in the page as possible. One reason is to enable progressive rendering, but another is to achieve greater download parallelization.
  • Make JavaScript and CSS external
  • If users on your site have multiple page views per session and many of your pages re-use the same scripts and stylesheets, you could potentially benefit from cached external files. Pages that have few (perhaps only one) page view per session may find that inlining JavaScript and CSS results in faster end-user response times.
  • Reduce DNS lookups
  • Minify JavaScript
  • In addition to minifying external scripts, you can also minify inlined script blocks. Even if you’re already gzipping your scripts, minifying them will still reduce the size by at least 5%.
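The compounding effect of minification and gzip noted in the last bullet can be checked empirically. A minimal sketch, assuming nothing beyond the Python standard library — the naive whitespace-stripping `naive_minify` here is only a stand-in for a real minifier such as UglifyJS, and the sample script is invented:

```python
import gzip
import re

# A small invented inline script with comments and whitespace, for illustration.
script = """
// Track an event when the user scrolls past the fold.
function trackScroll(depth) {
    var label = 'scroll-' + depth;
    ga('send', 'event', 'engagement', 'scroll', label);
}
"""

def naive_minify(js):
    """Strip // comments and collapse whitespace. A toy stand-in for a real minifier."""
    js = re.sub(r'//[^\n]*', '', js)   # drop line comments
    js = re.sub(r'\s+', ' ', js)       # collapse runs of whitespace
    return js.strip()

original_gz = len(gzip.compress(script.encode()))
minified_gz = len(gzip.compress(naive_minify(script).encode()))
print(f"gzip only: {original_gz} bytes, minify+gzip: {minified_gz} bytes")
```

Even on a toy snippet, minifying before gzipping produces a smaller payload than gzipping alone, which is the extra saving the annotation refers to.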
Pedro Gonçalves

How Website Speed Actually Impacts Search Ranking - Moz - 0 views

  • in 2010, Google did something very different. Google announced website speed would begin having an impact on search ranking. Now, the speed at which someone could view the content from a search result would be a factor.
  • Google's Matt Cutts announced that slow-performing mobile sites would soon be penalized in search rankings as well.
  • While Google has been intentionally unclear about which particular aspect of page speed impacts search ranking, they have been quite clear in stating that content relevancy remains king.
  • ...13 more annotations...
  • When people say "page load time" for a website, they usually mean one of two measurements: "document complete" time or "fully rendered" time. Think of document complete time as the time it takes a page to load before you can start clicking or entering data. All the content might not be there yet, but you can interact with the page. Think of fully rendered time as the time it takes to download and display all images, advertisements, and analytic trackers. This is all the "background stuff" you see fill in as you're scrolling through a page.
  • Since Google was not clear on what page load time means, we examined the effects of both document complete and fully rendered times on search rankings. However, our biggest surprise came from the lack of correlation of two key metrics! We expected, if anything, these two metrics would clearly have an impact on search ranking. However, our data shows no clear correlation between document complete or fully rendered times and search engine rank, as you can see in the graph below:
  • With no correlation between search ranking and what is traditionally thought of as "page load time", we expanded our search to the Time to First Byte (TTFB). This metric captures how long it takes your browser to receive the first byte of a response from a web server when you request a particular URL. In other words, this metric encompasses the network latency of sending your request to the web server, the amount of time the web server spent processing and generating a response, and the amount of time it took to send the first byte of that response back from the server to your browser.
  • The TTFB result was surprising in that a clear correlation was identified between decreasing search rank and increasing time to first byte. Sites that have a lower TTFB respond faster and have higher search result rankings than slower sites with a higher TTFB. Of all the data we captured, the TTFB metric had the strongest correlation effect, implying a high likelihood of some level of influence on search ranking.
  • The surprising result here was with the median size of each web page, in bytes, relative to the search ranking position. By "page size," we mean all of the bytes that were downloaded to fully render the page, including all the images, ads, third party widgets, and fonts. When we graphed the median page size for each search rank position, we found a counterintuitive correlation of decreasing page size to decreasing page rank, with an anomalous dip in the top 3 ranks.
  • Our data shows there is no correlation between "page load time" (either document complete or fully rendered) and ranking on Google's search results page. This is true not only for generic searches (one or two keywords) but also for "long tail" searches (4 or 5 keywords). We did not see websites with faster page load times ranking higher than websites with slower page load times in any consistent fashion. If Page Load Time is a factor in search engine rankings, it is being lost in the noise of other factors. We had hoped to see some correlation, especially for generic one- or two-word queries. Our belief was that the high competition for generic searches would make smaller factors like page speed stand out more.
  • our data shows there is a correlation between lower time-to-first-byte (TTFB) metrics and higher search engine rankings. Websites with servers and back-end infrastructure that could quickly deliver web content had a higher search ranking than those that were slower. This means that, despite conventional wisdom, it is back-end website performance and not front-end website performance that directly impacts a website's search engine ranking.
  • We suspect over time, though, that page rendering time will also factor into rankings, given the clear importance Google places on user experience.
  • TTFB is affected by three factors: the network latency between a visitor and the server, how heavily loaded the web server is, and how quickly the website's back end can generate the content.
  • Websites can lower network latency by utilizing Content Distribution Networks (CDNs). CDNs can quickly deliver content to all visitors, often regardless of geographic location, in a greatly accelerated manner.
  • Do these websites rank highly because they have better back-end infrastructure than other sites? Or do they need better back-end infrastructure to handle the load of ALREADY being ranked higher? While both are possible, our conclusion is that sites with faster back ends receive a higher rank, and not the other way around.
  • The back-end performance of a website directly impacts search engine ranking. The back end includes the web servers, their network connections, the use of CDNs, and the back-end application and database servers. Website owners should explore ways to improve their TTFB. This includes using CDNs, optimizing your application code, optimizing database queries, and ensuring you have fast and responsive web servers.
  • Fast websites have more visitors, who visit more pages, for longer periods of time, who come back more often, and are more likely to purchase products or click ads. In short, faster websites make users happy, and happy users promote your website through linking and sharing. All of these things contribute to improving search engine rankings.
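TTFB as the article defines it (network latency, plus server processing, plus the first byte of the response coming back) can be measured with little more than a socket timer. A minimal, self-contained sketch using only the Python standard library — it spins up a throwaway local server, so the host and port are illustrative rather than a real site:

```python
import http.client
import http.server
import threading
import time

def measure_ttfb(host, port, path="/"):
    """Return seconds from sending the request until the response starts arriving.

    getresponse() returns once the status line and headers have been parsed,
    so this is a slight upper bound on true time-to-first-byte.
    """
    conn = http.client.HTTPConnection(host, port, timeout=5)
    start = time.perf_counter()
    conn.request("GET", path)
    resp = conn.getresponse()
    resp.read(1)  # force at least one body byte off the wire
    ttfb = time.perf_counter() - start
    conn.close()
    return ttfb

# Throwaway local server so the sketch runs anywhere (port 0 = pick a free port).
server = http.server.HTTPServer(("127.0.0.1", 0),
                                http.server.SimpleHTTPRequestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

ttfb = measure_ttfb("127.0.0.1", server.server_address[1])
print(f"TTFB: {ttfb * 1000:.1f} ms")
server.shutdown()
```

Against a loopback server almost all of the measured time is server processing; against a remote URL the same timer would also capture the network latency and CDN effects the article discusses.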
Pedro Gonçalves

Gmail To Marketers: Drop Dead - ReadWrite - 0 views

  • Google on Thursday updated its Gmail service so that you'll never have to click that pesky “Display images below” link again. Gmail will now automatically display images in email, the catch being that Google will host those images on its own servers. Prior to the change, most emailed images would be loaded from third-party servers—often enough, those of marketers.
  • By filtering these photos through its own servers, however, Google may have shut out the use of Web bugs or beacons—bits of code that let an advertiser know that an email has been opened. Marketers use images as beacons because, at least until now, services like Gmail would load such images from an advertiser’s own web server. Any image can be a beacon, even an invisible one no more than a pixel wide.
  • the following likely consequences for his audience:
    • Marketers won't be able to tell whether you've opened an email for the second or subsequent time
    • Web bugs won't report reliable geolocations for opened emails, as they'll pick up the IP addresses of Gmail servers, not recipients
    • Countdown clocks sent as animated images won't show the right time if an email is opened a second or subsequent time
    • Analytics will only track the first time an email is opened
    • Marketers won't be able to update or change images once they're sent out
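The web-bug mechanism described above is simple to illustrate: the sender embeds a 1×1 image whose URL carries a per-recipient token, and the image request itself is the "open" event. A minimal sketch — the domain, parameter names, and token scheme are all invented for illustration, not any real tracker's API:

```python
import secrets
from urllib.parse import urlparse, parse_qs

def make_beacon_tag(campaign_id, recipient_id,
                    base_url="https://tracker.example.com/open.gif"):
    """Build an invisible 1x1 tracking-pixel <img> tag for an email body."""
    token = secrets.token_urlsafe(8)  # per-send token, stored server-side
    url = f"{base_url}?c={campaign_id}&r={recipient_id}&t={token}"
    tag = f'<img src="{url}" width="1" height="1" alt="" style="display:none">'
    return tag, token

def parse_open_event(request_url):
    """What the tracker's server would log when the pixel is fetched."""
    qs = parse_qs(urlparse(request_url).query)
    return {k: v[0] for k, v in qs.items()}

tag, token = make_beacon_tag("newsletter-42", "user-7")
# Simulate the mail client fetching the image:
pixel_url = tag.split('src="')[1].split('"')[0]
event = parse_open_event(pixel_url)
print(event)
```

Gmail's proxying breaks this in exactly the ways listed: the fetch now comes from a Google server IP rather than the recipient's, and a cached copy means repeat opens never reach the tracker again.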
Pedro Gonçalves

The Dilemma of Social Media Reach « Radian6 - Social media monitoring tools, ... - 0 views

  • Altimeter Group recently studied the internal goals in corporate social strategy. The top priority stated by 48% of companies was “Creating ROI Measurements”. Hypatia Research showed management’s expectations of the return on social communities are rather low. Research by Chief Marketer shows that the number of likes, friends & followers are the most used metrics by 60% of U.S. B2C and B2B marketers.
  • There exists great controversy about the use of ‘reach’ metrics.
  • I noticed strong correlations between all of the metrics. This means that reach, amplification, conversations and sentiment appear to measure the same kind of digital influence.
  • ...9 more annotations...
  • Many consider these to be vanity metrics: measures which are easy to understand but on their own explain little about the actionable effect.  They are easily manipulated, and do not necessarily correlate to the numbers that really matter. More actionable metrics are argued to be active users, engagement, the cost of getting new customers, and ultimately revenues and profits
  • Talking about Twitter specifically, Adi Avnit de-emphasizes the importance of followers due to the fact some users follow back others simply because of etiquette. His ‘million follower fallacy’ entails that this etiquette is leveraged by some users to elevate their follower count. The theory is not without evidence. Cha et al. (2010) measured user influence in Twitter and found that retweets and mentions showed great overlap, while followers gained… not so much. However, Kwak et al. (2010) in contrast found followers and page rank to be similar, while ranking by retweets differed.
  • investigated to what extent consumers engaged with brand tweets based on four dimensions:  amplification (retweets), reach (followers), conversations (mentions) and attitude (sentiment).
  • Popular measures are the 3F’s (friends, fans & followers).
  • following a great number of people primarily affects a brand’s follower count. It doesn’t correlate with the other, more actionable, metrics. In fact, those brands perform worse on the other measures. Ergo, brands that over-focus on increasing their follower count perform worse on the other metrics
  • All interactions, whether it be likes, shares or wallposts, increase the EdgeRank which in turn exposes more fans to your content.
  • As the number of fans grew, so did the number of engaged fans (the interactions per mille stayed about the same). These two elements act as a positive spiral constantly growing the other.
  • I posit that the number of fans, followers or friends is a relevant metric, considering it the potential interaction user base, given that your goal is to increase the number of engaged users.
  • Reach, amplification, conversations and sentiment appear to measure the same kind of digital influence. Brands that over-focus on increasing their follower count perform worse on the other metrics. Increase your user base – as your fans grow, so will the number of engaged fans
Pedro Gonçalves

The Average Web Page Loads in 2.45 Seconds Google Reveals - 0 views

  • The median page load time for desktop websites, as measured by Google Analytics, is about 2.45 seconds. That means that half the pages measured were faster than this, while the other half were slower. The mean page load time is about 6.4 seconds.
  • On mobile, things are significantly slower: the median page load time is about 4.4 seconds, while the mean is above 10 seconds.
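The gap between the 2.45-second median and 6.4-second mean is the signature of a right-skewed distribution: a minority of very slow pages drags the mean up while leaving the median untouched. A toy illustration, with invented load times chosen to mimic the reported figures:

```python
from statistics import mean, median

# Invented desktop load times in seconds; one slow outlier skews the mean.
load_times = [1.2, 2.0, 2.45, 3.0, 23.4]

print(f"median: {median(load_times):.2f} s")  # robust to the outlier
print(f"mean:   {mean(load_times):.2f} s")    # dragged up by the 23.4s page
```

This is why Google reports the median: for a long-tailed metric like page load time, the mean mostly reflects the worst pages.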