DISC Inc: Group items tagged "interesting"


Rob Laporte

Beyond conventional SEO: Unravelling the mystery of the organic product carousel - Sear... - 0 views

  • How to influence the organic product carousel: In Google's blog post, they detailed three key inputs: Structured Data on your website, real-time product information provided via Merchant Center, and additional information provided through Manufacturer Center. This section of the article explores Google's guidance, along with some commentary on what I've noticed based on my own experiences.
  • Make sure your product markup is validated: The key here is to make sure the Product markup (Structured Data) on your page adheres to Google's guidelines and validates. (A minimal example of this markup appears after this entry's notes.)
  • Submit your product feed to Google via Merchant Center: This is where it starts to get interesting. By using Google's Merchant Center, U.S. product feeds are now given the option to submit data via a new destination. The difference here for Google is that retailers are able to provide more up-to-date information about their products, rather than waiting for Google to crawl your site (which is what happens in step 1). Checking the box for "Surfaces across Google" grants access to your website's product feed, allowing your products to be eligible in areas such as Search and Google Images. For the purpose of this study we are most interested in Search, with the Organic Product Carousel in mind. "Relevance" of information is the deciding factor for this feature. Google states that this feature of Search does not require a Google Ads campaign: just create an account, then upload a product data feed.

    Commentary by PPC expert Kirk Williams: "Setting up a feed in Google Merchant Center has become even more simple over time since Google wants to guarantee that they have the right access, and that retailers can get products into ads! You do need to make sure you add all the business information and shipping/tax info at the account level, and then you can set up a feed fairly easily with your dev team, a third party provider like Feedonomics, or with Google Sheets. As I note in my 'Beginner's Guide to Shopping Ads', be aware that the feed can take up to 72 hours to process, and even longer to begin showing in SERPs. Patience is the key here if just creating a new Merchant Center… and make sure to stay up on those disapprovals as Google prefers a clean GMC account and will apply more aggressive product disapproval filters to accounts with more disapprovals."

    For a client I'm working with, completing this step resulted in several of their products being added to the top 10 of the Popular Products (PP) carousel, one of which is in the top 5 and visible when the SERP first loads. This means that, in this specific scenario, the product Structured Data that Google was regularly crawling and indexing in the US wasn't enough on its own to be considered for the Organic Product Carousel. Note: the products that were added to the carousel were already considered "popular"; Google just hadn't added them. It is not guaranteed that your products will be added just because this step was completed. It really comes down to the prominence of your product and its relevance to the query (same as any other page that ranks).
  • 3. Create an additional feed via Manufacturer Center: The next step involves the use of Google's Manufacturer Center. Again, this tool works in the same way as Merchant Center: you submit a feed, and can add additional information. This information includes product descriptions, variants, and rich content such as high-quality images and videos that can show within the Product Knowledge Panel. You'll need to first verify your brand name within the Manufacturer Center dashboard, then you can proceed to uploading your product feed. When Google references the "Product Knowledge Panel" in their release, it's not the same type of Knowledge Panel many in the SEO industry are accustomed to. This Product Knowledge Panel contains very different information compared to your standard KP, which is commonly powered by Wikipedia, and it appears in various capacities (based on how much data it has access to). Here's what this Product Knowledge Panel looks like in its most refined state, completely populated with all the information that can be displayed:

    Type #1 just shows the product image(s), the title, and the review count.
    Type #2 is an expansion of Type #1 with further product details and another link to the reviews.
    Type #3 is the more standard-looking Knowledge Panel, with the ability to share a link via an icon on the top right. This Product Knowledge Panel has a description and more of a breakdown of reviews, with the average rating. This is the evolved state within which I tend to see Ads being placed.
    Type #4 is an expansion of Type #3, with the ability to filter through reviews and search the database with different keywords. This is especially useful functionality when assessing the source of the aggregated reviews.

    Based on my testing with a client in the U.S., adding the additional information via Manufacturer Center resulted in a new product getting added to a PP carousel. This happened two weeks after submitting the feed, so there could still be further impact to come. I will likely wait longer and then test a different approach.
  • Quick recap:
    - Organic Product Carousel features are due to launch globally at the end of 2019.
    - Popular Product and Best Product carousels are the features to keep an eye on.
    - Make sure your products have valid Structured Data, a product feed submitted through Merchant Center, and a feed via Manufacturer Center.
    - Watch out for cases where your client's brand is given a low review score due to the data sources Google has access to.
    - Do your own testing. As Cindy Krum mentioned earlier, there are a lot of clicks between the Organic Product Carousel listings and your website's product page.
    - Remember: there may be cases where it is not possible to get added to the carousel due to an overarching "prominence" factor. Seek out realistic opportunities.
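To make step 1 concrete, here is a minimal sketch of the kind of schema.org Product markup the article refers to, generated from Python as a JSON-LD script tag. All product values (name, URL, price, rating) are invented placeholders rather than examples from the article, and any real markup should still be run through Google's structured data validation tools.

```python
import json

# Minimal schema.org/Product markup with the fields Google's product
# rich-result documentation asks for: name, image, description, an Offer
# (price, currency, availability) and an aggregate rating.
# All values below are hypothetical placeholders.
product_markup = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Running Shoe",
    "image": ["https://www.example.com/img/shoe.jpg"],
    "description": "Lightweight running shoe with a breathable mesh upper.",
    "sku": "SHOE-001",
    "brand": {"@type": "Brand", "name": "ExampleBrand"},
    "offers": {
        "@type": "Offer",
        "url": "https://www.example.com/products/running-shoe",
        "priceCurrency": "USD",
        "price": "89.99",
        "availability": "https://schema.org/InStock",
    },
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "reviewCount": "214",
    },
}

# Emit the script tag that would go in the product page's HTML <head>.
print('<script type="application/ld+json">')
print(json.dumps(product_markup, indent=2))
print("</script>")
```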
Rob Laporte

Google's December 2020 Core Update Themes - 0 views

  • The data and overall consensus point to Google's December 2020 Core Update being one of the more impactful algorithm adjustments to hit the SERP over the past year or so.
  • I prefer to look at core updates almost from a pure content and UX perspective. For me, it’s about the specific pages Google swaps out more than it is a per domain analysis.
  • I am performing a qualitative analysis
  • I am not making any sort of definitive statements
  • What moves me, however, is when I look at 100 keywords and start seeing the same content-oriented theme arise again and again.
  • What I'm trying to say, and as you'll see in the examples I will get into later, is that the content that was more focused on the specific topic mentioned in the query did better. So while the "ultimate guide" here did get to the topic the query deals with, it was not exclusively about that topic.
  • This might call the entire strategy of creating these ultimate guides into question. Perhaps you can't cast a wide net in that way anymore? Perhaps the "ultimate guide" is only really suitable for people who actually want to get a more broad understanding of a topic? (Crazy to think, I know!)
  • The page from Rocket Mortgage, on the other hand, is only about how much you need for a down payment:
  • So too is the page from Quicken Loans:
  • The Moral of the Story: If I want to understand how much money on average I need to put down when buying a house, or what the various options generally are and what they mean long term, the CFPB page, .gov or not, doesn't really help me. Its content is not specifically homed in on that particular topic. Again, we have another page that takes a sweeping look at a topic and that lost rankings when the query reflected a more specific sort of intent!
  • What's interesting here is that unlike the previous examples, where too much content resulted in the page's topical relevance being diluted, the lack of such content here is what I think caused the ranking loss. Look, it's not bad content. However, it's pretty much the "general" kind of content you see here, there, and everywhere for all sorts of topics. Just compare it to what the page from the Credit Card Insider offers:
  • This just oozes depth. The third topic on the page alone (6 Ways to Pay Off…) rivals the depth shown on the CreditCards.com page! What differentiates this page from the "guides" shown in the other examples is that this is a guide that drills deep into one topic as opposed to trying to span multiple subtopics. Also, have a look at the formatting; it reminds me of what we saw on the Motley Fool's page:
  • It’s deep content that is easy to digest. It’s not hard to see why Google swapped these two pages.
  • The Moral of the Story: Exact content relevancy is not only about what topic you talk about. You can be topically aligned, but it has to offer real information to the user. It's even better when that information is digestible. In other words, if you want to rank for a keyword with topic specificity, it might be better to create an "ultimate guide" that drills deep into the topic itself rather than trying to cover every subtopic under the sun in order to rank for more topics with one piece of content.
  • The by-line really sums it up. It tells you this article is about the fact that you most likely won't get addicted to painkillers, but it’s definitely possible so here’s the scoop. To me, it’s far more in line with the average user’s intent of learning about the risks of addiction versus understanding the fine difference between addiction and dependence. It’s the same story with the WebMD page:
  • The Moral of the Story: Again, the issue here is not how authoritative or how substantial the content is. There is no doubt that content from the NIH is both substantial and authoritative. The issue here again seems to relate to Google being better able to show content that is specifically relevant to the nature of the query.
  • First things first, the page doesn't speak to the query directly. While one could come to understand the signs of depression in the process of learning the difference between sadness and depression, that route is certainly indirect. You could argue that the query "how to tell if you have depression" could be taken as "how do I know if I am just sad or depressed?", but that really doesn't seem to be the essential intent here. That topical line (i.e., sadness vs. depression) would most likely produce its own unique query (i.e., "am I sad or depressed"). From the content shown on the WebMD page, it appears that Google thinks of the intent as understanding the symptoms of depression:
  • The WebMD page, in contradistinction to the MHA page, speaks to the "plain meaning" of the query's intent… how can you tell if you're suffering from depression? Aside from that, the WebMD page offers a bit more in terms of substance. While it doesn't go into great detail per se, the WebMD page does offer a pretty comprehensive list of items. Compare that to the MHA page which, if you read it, is a bit thin and doesn't offer much of any detail (not even a basic list like the one on the WebMD page). The Moral of the Story: Relevancy is a two-pronged equation (at minimum). It requires the content to be topically focused on the one hand and substantial on the other.
  • I’ve saved the best for last. This is my favorite example that I came across when diving into the December 2020 Core Update. I mean, for crying out loud, we’re talking about the CDC losing rankings in favor of a .org domain I never heard of. How could this be? Let’s understand the intent of the query. If I were searching for this it would be because I found something on my body that I thought might be skin cancer. If I could be so bold, I would imagine that this is why most of us would search for this term. I wouldn’t, and again I imagine most people in most instances wouldn’t search for this in order to understand if regular screening is officially recommended or not. Yet, that is what the CDC page is about:
  • I hate to make assumptions, but I would also think that someone running this query is most likely not interested in the common tests and methods doctors use to determine if skin cancer is present. Yet, this is what the page from Cancer.net focuses on:
  • Again, I would search for this term if I saw something weird on my body that made me think, "Holy crap, do I have skin cancer?" The page from the AOCD is entirely made for people on the verge of freaking out at the possibility of having skin cancer:
  • To me, when you see this page relative to the pages from Cancer.net and the CDC, it is painfully obvious why this page got the ranking boost. The Moral of the Story: Again, I think what has transpired here is painfully obvious. Google has looked past the immediate authority of some of the pages here and has more heavily considered how relevant the content is to the query. As with the cases I have shown earlier, Google is rewarding content that speaks in a highly focused way to the intent and nature of the query.

    What Was the December 2020 Core Update About? Are you expecting a one-liner that definitively characterizes the December 2020 update? You're not going to get one from me. This update, like any other, certainly included a whole plethora of different "algorithmic considerations" and themes. That said, from where I sit, while other core updates did things to help put the most authoritative content at the top of the SERP, this update seemed to me to be more about pure relevancy. Updates of the past have done things to weed out sites using a marketing tone within YMYL informational content or have rewarded sites that put the right content ahead of their affiliate pursuits. All of that, while part of relevancy, speaks more to a need for something authoritative at the top of the SERP. Seeing so many .gov pages drop in favor of pages from sites like Healthline or WebMD seems to point to the update rewarding relevancy to the nth degree. Perhaps Google felt as if it had "authority" at the top of the SERP in order, paving the way for a focus on relevance? Who knows. All I can say is that I personally have not seen such a strong focus on pure relevance on page one of the SERP.

    Content Creation Takeaways: Practically speaking, I think the era of broadly reaching pages is quickly coming to an end. I think that has been the case for a while. However, seeing Google pull pages off page one of the SERP because they deal with multiple subtopics is a new level, at least for me. It shows that you have to create content that talks about one topic and one topic only (unless the keyword reflects a specific desire for a broader survey of a topic). I wonder if the idea of having one ultimate guide so as to win numerous keywords should be replaced with multiple posts where each post deals with one specific topic or subtopic. (If you do that, please don't create thin content; that is not what I am advocating for.) It's a rather logical concept. As Google gets better at understanding content, it is going to prefer highly focused content around a specific topic over content of a more broad nature, unless the query specifically shows intent for a general survey of a topic.
Rob Laporte

3 Reasons Why Blogs for SEO Fail | Online Marketing Blog - 0 views

  • However, when it comes to blogs, consumer information discovery increasingly involves social networks and social media. Recommendations are competing with search. Looking at the web analytics of our blog and client blogs, social media traffic is among the top 5 referring sources of traffic. Blogs are social, and social media sources will become increasingly important for many business blogging efforts in the coming year. So, what can a company do to build upon, and benefit from, the compounding equity that grows with long-term blogging and SEO efforts? I'll be answering that question specifically in tomorrow's post on 5 Tips for Successful Blog Optimization efforts. In the meantime, have you started a blog only to lose interest or stop contributing to it? What was your reason? What would you do differently?
Rob Laporte

Tips On Getting a Perfect 10 on Google Quality Score - 0 views

  • October 20, 2008: Tips On Getting a Perfect 10 on Google Quality Score. Ever since Google launched the real-time quality score metric, where Google rates keywords between 0 and 10 (10 being the highest), I have rarely seen threads documenting how to receive a 10 out of 10. Tamar blogged about How To Ensure That Your Google Quality Score is 10/10, based on an experiment by abbotsys. Back then, it was simply about matching the domain name to the keyword phrase, but can it be achieved without that? A DigitalPoint Forums thread reports another advertiser receiving the 10/10 score. He documented what he did to obtain it:
    - Eliminated all the keywords that Google had suggested and used a maximum of three keywords per ad campaign.
    - Used only one ad campaign per landing page and made each landing page specific to that keyword.
    - Set the cost per click high enough to land around third spot.
    - Geo-targeted the campaigns only to the areas he can sell to.
    - Limited the times his ads ran to only those when there was real interest.
    - Used three versions of each keyword ("keyword", [keyword], and keyword) and then eliminated whichever wasn't working well.
    If you want to reach that perfect 10, maybe try these tips and see what works for you. There is no guaranteed checklist of items, so keep experimenting. And when you get your perfect 10, do share!
Rob Laporte

Google Analytics Upgrade: AdSense Reporting, Visualization Tools, & More - 0 views

  • Online publishers may be most interested in the AdSense integration tools coming to Google Analytics. After linking an AdSense and Analytics account, you'll be able to see AdSense data including: total revenue; impressions, clicks, and click-through ratio; revenue per day, per hour, etc.; revenue per page (which pages are most profitable); and revenue per referral (which other sites bring you profitable traffic). Here are a couple of screenshots from Google's videos on the new features (see below for link). During our call this morning, we asked why AdSense itself doesn't also offer this data without requiring the use of Google Analytics to get it. We're waiting for a reply from Google's AdSense team and will let you know what we learn. Update: A Google spokesperson says, "We can't comment on any future AdSense developments or features." Motion Charts is a visualization tool that lets you see and interact with analytics data in five dimensions, a capability made possible by Google's purchase of Gapminder's Trendalyzer software in March 2007. The Google Analytics API, which is currently in private beta, will open up analytics data for developers to export and use however they want. Advanced segmentation allows users to dig deeper into subsets of traffic, such as "visits with conversions," or create their own segment types. Custom reporting lets users create their own comparisons of metrics. Google has created a series of videos showing how some of these new tools work. Crosby says the new features will be rolled out in the coming weeks to Google Analytics users, who may see some new features earlier than others. The AdSense integration, he warns, may take longer to roll out than the other new tools. More discussion at Techmeme.
Rob Laporte

Evidence of Page Level Google Penalties? - 0 views

  • June 18, 2009: Evidence of Page Level Google Penalties? Richard at SEO Gadget showed how Google seemed to have penalized specific pages of his site from ranking in the Google index. The penalty seemed to be fair, in that there were nasty comments that had slipped through his comment spam filter. The drop in traffic can be seen in the keyword phrases that page ranked well for. He noticed a ~70% drop in traffic for that phrase, which in his case resulted in a 15% drop in his Google traffic and a 5% drop in overall traffic. What I find extra fun is that a Google Search Quality Analyst, @filiber, tweeted: "Google Page level penalty for comment spam – rankings and traffic drop http://bit.ly/JNAly (via @AndyBeard) <- interesting read!" Of course that is not an admission that this is fact, but it wouldn't be too hard to believe that bad comments caused such a decline. Now, I don't think this would be considered a keyword-specific penalty, which most SEOs believe in, but rather a specific page being penalized. Forum discussion at Sphinn.
Rob Laporte

MediaPost Publications Study: A Third Of All Online Videos Are Shared 08/06/2009 - 0 views

  • Marketers eager to exploit the Web's viral potential will be interested to learn that a full one-third of all videos are shared online. I noticed last week that the YouTube video we added for Pvteye.com already has nearly 5,000 views. We should definitely encourage people to create video for their site if possible. How about virtual tours for sites like THS and SHC?
Rob Laporte

Using Analytics To Measure SEO Success - 0 views

  • Next, you'll want to see which pages of your site are bringing keyword traffic and whether they are the specific ones you optimized. To view this, click on the "content overview" section, and then on the right-hand side of the page, under "landing page optimization," click on "entrance keywords." This allows you to view specific stats for each page of your site. The first screen shows the entrance keywords for the page that receives the most pageviews (typically your home page), but you can click to other pages via the drop-down box labeled "content." If you don't immediately see the page whose entrance keywords you want to view, you can type a word that you know is in the URL of that page into the search box contained in the content drop-down. So if you're looking for a page with a file name of /green-widgets.php, you can type just "green" or "widgets" into the search box and you'll see all pages that have that word in the file name. Now you should be able to see all the entrance keywords for that page. Are they the ones (or variations of the ones) for which you optimized? If so, then your SEO is taking hold! If not, you'll want to determine why. Perhaps it's just too soon after your SEO work was completed. Perhaps they're highly competitive phrases which will need more anchor text links pointing in.
Rob Laporte

BIZyCart SEO Manual - Controlled Navigation - 0 views

  • How The Robots Work: Without getting into the programming details, the robots and web crawlers basically follow these steps:
    - On arrival, the robot pulls out all of the readable text it is interested in and creates a list of the links found on the page. Links set as 'nofollow' or 'disallowed' are not added to the list. If there are too many links, the robot may take a special action based on that.
    - While the first robot finishes processing the page, another robot script is launched to follow each of the links. If there are ten links, there are now eleven robots running.
    - Each of those robot scripts loads the page it was sent to and builds another link list. Unless told otherwise, if there are ten links on each of those pages, one hundred additional robots get launched.
    - Before going to the next page, the robots check to see if that page has already been looked at. If it was already indexed that day, they cancel themselves and stop.
    - The number of robots keeps expanding until all of the links have been followed and the site's web pages have been indexed or avoided.
    You can see that on some sites, thousands of robot processes can be taking turns working a web page. There is a physical limit on how much memory is available on the server. If the number of active robots exceeds that, they have to be canceled or memory corruption will occur. If you let the robots run in too many directions, they may not finish looking at every web page, or the results from some pages may get scrambled. You are also subject to the number of robots on that server that are looking at other web sites. Poorly managed robot servers can end up creating very strange results.
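As a rough illustration of the crawl pattern described above (not the article's own code, which is not published), here is a minimal single-process sketch in Python: it fetches a page, keeps the readable text, skips rel="nofollow" links, avoids revisiting URLs it has already seen, and caps the total number of pages so the crawl cannot grow without bound. The index_page helper is a hypothetical stand-in for whatever indexing step a real crawler would perform.

```python
from collections import deque
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

def index_page(url, text):
    # Hypothetical stand-in for storing the page in an index.
    print(f"indexed {url} ({len(text)} characters of text)")

def crawl(start_url, max_pages=100):
    """Breadth-first crawl that mirrors the steps described above."""
    seen = set()                # pages already looked at this run
    queue = deque([start_url])  # links waiting to be followed

    while queue and len(seen) < max_pages:
        url = queue.popleft()
        if url in seen:         # already indexed -> cancel this branch
            continue
        seen.add(url)

        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue            # unreachable pages are simply skipped

        soup = BeautifulSoup(html, "html.parser")
        # "Pull out the readable text it is interested in"
        page_text = soup.get_text(separator=" ", strip=True)
        index_page(url, page_text)

        # Build the list of links, ignoring rel="nofollow"
        for a in soup.find_all("a", href=True):
            rel = a.get("rel") or []
            if "nofollow" in rel:
                continue
            queue.append(urljoin(url, a["href"]))

    return seen
```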
Rob Laporte

Linkfluence: How to Buy Links With Maximum Juice and Minimum Risk - 0 views

  • Up first is Rand Fishkin. Rand says he asked to be kicked off this panel because he doesn't endorse buying links and he doesn't do it anymore [Hear that, Google. SEOmoz doesn't buy links. SO KEEP MOVING.]. He offered to go last… but everyone else bullied the moderator into making him go first. Poor Rand. Always the innocent bunny in a pack of wolves. Unfortunately, the projector is broken, so we have no screen. Something about a plug that doesn't work. So… we're doing question and answer first while they send someone to try and fix it. I'll throw the questions at the bottom. Back to Mr. Fishkin. He tries to be very clear about his shift in position on paid links. He doesn't think not buying links is right for everyone; it's just what's right for his clients and for SEOmoz. Rand says he falls into the "Operator of Interest" category, meaning he's profiled for being an SEO. The problem with paid links: algorithmic detection is getting better than ever before; penalties are hard to diagnose; manual link penalties are also a threat; Google's Webspam team invests (A LOT of) time and resources in shutting down effective paid links [Agreed. And almost an unhealthy amount.]; and competitors have significant incentive to report link spam. (Don't be a rat.)
Rob Laporte

Google Openly Profiles SEOs As Criminals - 2 views

  • If we can stop talking about nofollow and PageRank sculpting for a second, maybe we can openly talk about the bigger story of last week’s SMX Advanced. The one that has to do with Matt Cutts taking the stage during the You&A and openly stating that Google profiles SEOs like common criminals. I was naïve in my youth. I’d read blog posts that accused Google of “having it out” for SEOs and laugh. There’d be rants about how Google was stricter on sites that were clearly touched by an SEO and how SEOs were dumb for “self-identifying” with attributes like nofollow. At the time, I thought these people were insane. Now I know they were right. Google does profile SEOs. They’re identified as “high risk” and so are all of their associated projects.
  • Interesting... further strengthens the position that "content is King" and we should continue to encourage clients in that direction. Value to the audience first, play nice with the search engines second.
Rob Laporte

Q&A: Rand Fishkin, CEO of SEOmoz | Blog | Econsultancy - 0 views

  • Paid links are always controversial. I found it interesting that "direct link purchases from individual sites/webmasters" was considered by your panel to be the fifth most effective link building tactic yet "link acquisition from known link brokers/sellers" was the second highest negative ranking factor. Any thoughts on this? Does this reflect the fact that even though paid links in general have a bad reputation, they're still widely employed? I think that's correct. Link buying and selling is still a very popular activity in the SEO sphere, and while the engines continue to fight against it, they're unlikely to ever weed out 100% of the sites and pages that employ this methodology. Link acquisition via this methodology is incredibly attractive to businesses and something the engines have also instilled as a behavior - with PPC ads, you spend more money and get more traffic. It's not unnatural that companies would feel they can apply the same principles to SEO. While I think the engines still have a long way to go on this front, I also believe that, at least at SEOmoz, where our risk tolerance is so low, the smartest way to go is to play by the engines' rules. Why spend a few hundred or few thousand dollars renting links when you could invest that in your site's content, user interface, public relations, social media marketing, etc., and have a long-term return that the engines are far less likely to ever discount?
Rob Laporte

Geo-Targeting Redirects: Cloaking or Better User Experience? - Search Engine Watch (SEW) - 0 views

  • If you have your site set to detect a visitor's location and show content based on that, I would recommend the following: Serve a unique URL for distinct content. For instance, don't show English content to US visitors on mysite.com and French content to French visitors on mysite.com. Instead, redirect English visitors to mysite.com/en and French visitors to mysite.com/fr. That way search engines can index the French content using the mysite.com/fr URL and can index English content using the mysite.com/en URL. Provide links to enable visitors (and search engines) to access other language/country content. For instance, if I'm in Zurich, you might redirect me to the Swiss page, but provide a link to the US version of the page. Or, simply present visitors with a home page that enables them to choose the country. You can always store the selection in a cookie so visitors are redirected automatically after the first time. (A minimal sketch of this redirect pattern follows this entry's notes.)
  • Google's policies aren't as inflexible as you're trying to portray. The same Google page you quote also says that intent ought to be a major consideration (just as when evaluating pages with hidden content). Also, why would Google's guidelines prevent you from using geotargeting without an immediate redirect? Just because you don't immediately redirect search users to a different page doesn't mean you have to ask for their zip code instead of using IP-based geotargeting. Lastly, I don't think using such redirects from SERPs improves user experience at all. If I click on a search result, it's because that's the content I'm interested in. It's very annoying to click on a search result and get a page completely different from the SERP snippet. And what about someone who is on business in a different country? Search engines already provide different language localizations as well as language search options to favor localized pages for a particular region. So if someone goes to the French Google, they will see the French version of localized sites/pages. If they're seeing the U.S. version in their SERP, then it's because you didn't SEO or localize your pages properly, or they deliberately used the U.S. version of Google. Don't second-guess your users. Instead, focus on making sure that users know about your localized pages and can access them easily (by choice, not through force).

    Bill Hunt: Frank, you're spot on as usual. We still keep chasing this issue, and as I wrote on SEW last year in my article on language detection issues (http://searchenginewatch.com/3634625), it is often more the implementation that is the problem than the actual redirect. Yes, it is exactly cloaking (maybe gray hat) when you have a single gateway such as "example.com" and a person coming from Germany sees the site in German while someone whose IP is in New York sees English. Engines typically crawl from a central location like Mountain View or Zurich, so they would only see the English version, since they would not provide signals for any other location. Where you really get into a tricky area is if you set it so that any user agent from a search engine can access any version it asks for, yet a human is restricted - sort of reverse cloaking. If Mr. GoogleBot wants the French home page, let him have it rather than sending him to the US home page. With the growth of CDNs (content delivery networks), I am seeing more and more of these issues crop up to handle load balancing as well as other forms of geographical targeting. I have a long list of global, multinational and enterprise-related challenges that are complicated by many of Google's outdated ways of handling kindergarten-level spam tactics. Sounds like a SES New York session...
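A rough sketch of the first annotation's recommendation (distinct, indexable URLs per language, with a redirect plus visible links between versions) might look like the hypothetical Flask handler below. It keys off the Accept-Language header as a simple stand-in for the IP-based geotargeting a production setup would more likely use, and it is not code from the article.

```python
from flask import Flask, redirect, request

app = Flask(__name__)
SUPPORTED_LANGS = ["en", "fr"]

@app.route("/")
def home():
    # Pick the best supported language from the Accept-Language header
    # (a stand-in for IP-based geotargeting) and redirect to a distinct,
    # indexable URL for that language instead of swapping content on "/".
    lang = request.accept_languages.best_match(SUPPORTED_LANGS) or "en"
    return redirect(f"/{lang}/", code=302)

@app.route("/<lang>/")
def localized_home(lang):
    if lang not in SUPPORTED_LANGS:
        return redirect("/en/", code=302)
    # Each language version links to the others, so both visitors and
    # crawlers can reach /en/ and /fr/ directly.
    other_links = " ".join(
        f'<a href="/{l}/">{l.upper()}</a>' for l in SUPPORTED_LANGS if l != lang
    )
    return f"<h1>Home ({lang})</h1><p>Other versions: {other_links}</p>"
```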
Rob Laporte

Live Search Webmaster Center Blog : SMX East 2008: Unraveling URLs and Demystifying Dom... - 0 views

  • Another interesting statistic from this session is something that Sean Suchter from Yahoo! provided: all other things being equal, a searcher is twice as likely to click a short URL as a long one.
Rob Laporte

Consumers Head Online for Local Business Information - Search Engine Watch (SEW) - 0 views

  • Importance of Ratings and Reviews: From 2008 to 2009, usage of consumer ratings and reviews increased to 25 percent (+3) among IYP searchers and to 27 percent (+5) among general searchers. Additionally, people who use social networking sites for local business information are more likely to use consumer reviews (53 percent). It's interesting that, while overall usage of ratings and reviews is only 24 percent, its importance during the business selection process is 57 percent! Because users of ratings and reviews heavily rely on them to select a company to do business with, they should be a serious component of any marketer's online strategy.
jack_fox

Google Featured Snippets That Extract Definitions - 0 views

  • Google extracted the word "misdemeanor" and defined it within the snippet. It's also interesting to note that Google pulled in the British spelling of the word ("misdemeanour") even though it is spelled the American style in the snippet.
jack_fox

Just How Much is Your Website Worth, Anyhow? An Easy Guide to Valuation - Moz - 0 views

  • Digital agencies are incredibly hard to sell; to do so, you need to have streamlined your process as much as possible
  • [Take the] 12-month net profit average and multiply it by a multiple. Typically, a multiple will range between 20–50x of the 12-month average net profit for healthy, profitable online businesses. As you get closer to 50x, you have to be able to show your business is growing in a BIG way month over month and that your business is truly defensible. (A toy calculation follows this entry's notes.)
  • EBITDA, which stands for earnings before interest, tax, depreciation, and amortization.
  • An example of a critical point of failure could be where all of your website traffic is purely Google-organic
  • a business trending downward is going to get a much worse multiple, likely in the 12–18x range. A business in decline can still be sold, though. There are specific buyers who only want distressed assets because they can get them at deep discounts and often have the skill sets needed to fix the site
  • The two areas that most affect the multiple come down to your actual average net profit and how long the business has been around making money
  • If you can prove the email list is adding value to your business, then your email list CAN improve your overall multiple.
  • if you can wade through all that and prove that your social following and social media promotion are driving real traffic and sales to your business, it will definitely help in increasing your multiple.
  • The harder it is to copy what you’ve built, the higher the multiple you’ll get.
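To make the multiple arithmetic concrete, here is a toy calculation with invented figures (not numbers from the article): the valuation described above is the average monthly net profit over the trailing 12 months, multiplied by a 20–50x multiple that shifts with growth and defensibility.

```python
# Trailing-12-month net profit figures for a hypothetical content site (USD).
monthly_net_profit = [8200, 8500, 9100, 8800, 9400, 9900,
                      10200, 10800, 11000, 11600, 12100, 12400]

avg_monthly_profit = sum(monthly_net_profit) / len(monthly_net_profit)

# Multiples are illustrative: a steady site might sit near the low-to-mid
# range, while a fast-growing, defensible one pushes toward 50x.
for label, multiple in [("conservative", 22), ("typical", 30), ("aggressive", 45)]:
    valuation = avg_monthly_profit * multiple
    print(f"{label:>12}: {multiple}x avg monthly profit "
          f"(${avg_monthly_profit:,.0f}) -> ${valuation:,.0f}")
```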
jack_fox

How to Value a Website or Internet Business in 2019 - 0 views

  • The traffic valuation method can be useful for devising a value for a non-monetized site
  • a proper valuation of a website or internet business requires hard data, some financial analysis and most importantly, human judgement. This is unfortunately where automated website valuation tools cannot compete.
  • Earnings multiples are by and large the most popular valuation approach in small internet business M&A. There are two elements to the method that buyers should become experts in: defining profitability and identifying the factors that should influence the multiple
  • when evaluating the financial statements of an internet business, you should sense-check the broker's SDE calculation and ensure it features only the right add-backs, such as:
    - Owner compensation
    - Depreciation (uncommon but a legitimate add-back)
    - Travel expenses (if unrelated to the business)
    - Office rent (if the business can be run from home)
    (A toy SDE calculation follows this entry's notes.)
  • Typically, website valuations range from 1x to 5x annual net income, with the vast majority of transactions occurring between 2x and 4x.
  • Content and lead generation sites can often be high workload and search-dependent, respectively, so tend to be discounted for these factors.
  • Strategic investors. Buyers with an existing interest in the same or a complementary niche may look at an acquisition target as a bolt-on or merger with the current asset and be able to realize significant cost and revenue synergies
  • research by Centurica suggests that industry-wide, multiples have increased from an average of 2.4x in 2010 to 3.3x in 2017.
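As a companion to the add-back list above, here is a toy seller's discretionary earnings (SDE) calculation with invented figures, showing how the add-backs and an annual multiple in the typical 2x–4x range combine into an asking price.

```python
# All figures are hypothetical (USD, trailing 12 months).
reported_net_income = 60000

add_backs = {
    "owner compensation": 45000,
    "depreciation": 3000,               # uncommon but legitimate
    "travel unrelated to business": 4000,
    "home-office rent": 6000,
}

sde = reported_net_income + sum(add_backs.values())

for multiple in (2.0, 3.0, 4.0):        # typical annual multiples cited above
    print(f"SDE ${sde:,.0f} x {multiple}x = ${sde * multiple:,.0f}")
```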