Group items tagged Keywords

Pedro Gonçalves

Twitter's New Ad Product Could Create Hub Of Aggregated Advertising Data | Fast Company... - 0 views

  • now brands can serve up ads to users based on the content they're actually tweeting.
  • keyword-based advertising
  • advertisers can buy specific keywords to target certain users. "For example: let’s say a user tweets about enjoying the latest album from their favorite band, and it so happens that band is due to play a concert at a local venue," Malhotra explained. "That venue could now run a geotargeted campaign using keywords for that band with a tweet containing a link to buy the tickets. That way, the user who tweeted about the new album may soon see that Promoted Tweet in their timeline letting them know tickets are for sale in their area."
  • To set up a campaign, advertisers can choose the keywords or phrase they want to target. Next, they can set more specifics, such as the location or gender of a user, or what device he or she is using. Twitter says early tests of this new ad product yielded higher engagement rates. GoPro, for example, saw "engagement rates as high as 11%" when using keywords to target users on Twitter.
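The setup steps above (choose keywords or a phrase, then narrow by location, gender, and device) can be modeled as a small configuration object. This is only an illustrative sketch; the field names are assumptions, not Twitter's actual Ads API schema.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Illustrative model of the campaign setup described above: pick the keywords
# to target, then set more specific criteria such as location, gender, and
# device. Field names are assumptions, not Twitter's real Ads API schema.
@dataclass
class KeywordCampaign:
    keywords: List[str]
    location: Optional[str] = None
    gender: Optional[str] = None
    devices: List[str] = field(default_factory=list)

campaign = KeywordCampaign(
    keywords=["favorite band", "new album"],
    location="San Francisco",
    devices=["ios", "android"],
)
print(campaign.keywords)  # → ['favorite band', 'new album']
```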
Pedro Gonçalves

Twitter Is About To Officially Launch Retargeted Ads [Update: Confirmed] | TechCrunch - 0 views

  • Twitter has confirmed our scoop with the announcement of Tailored Audiences - its name for retargeted ads. Available globally via a slew of adtech startup partners, the product lets advertisers target recent visitors to their websites with retargeted Promoted Tweets and Promoted Accounts.
  • Twitter’s users are on mobile. Seventy percent of its ad revenue already comes from the small screens, and it likely follows that a majority of engagement is on mobile, too.
  • retargeting happens like this. You visit a website, say a travel booking site, and look at a page for buying a flight to Hawaii. You chicken out at the last minute, don’t buy, and navigate away, but the site has dropped a cookie for that Hawaii flight page on your browser. Then, when you visit other sites or social networks that run retargeted ads, they detect that cookie, and the travel site can show you an ad saying “It’s cold in SF. Wouldn’t a vacation to Hawaii be nice?” to try to get you to pull the trigger and buy the flight it knows you were already interested in. But without cookies on mobile, you can’t retarget there… …unless you can tie the identity of a mobile user to what they do on the computer. And Twitter can. It’s one of the few hugely popular services that individuals access from multiple types of devices.
  • Essentially, when you log into your account on your full-size computer, Twitter will analyze the cookies in your browser to see where you’ve been on the non-mobile web. Then, when you log in to that same account on mobile, it can still use your web cookies to hit you with retargeted ads.
  • mobile phones don’t have the ability to set cookies so you can’t do retargeting.
  • Facebook only recently began allowing retargeted ads on mobile, and only through a “custom audiences” targeting program separate from FBX.
  • Lucky for Twitter, most of what people do on it is public, so it doesn’t spark the same privacy concerns as Facebook. Twitter also offers an opt-out of retargeting under Promoted Content on its Security And Privacy settings page. Plus it honors Do Not Track for users that enable it in their browsers.
  • It’s also recently opened up keyword targeting so advertisers can reach people who’ve tweeted certain words. Between keyword targeting and cookie retargeting, Twitter is breaking out of the demand generation and into the lucrative demand fulfillment part of the advertising funnel where Google’s search ad business lives. Advertisers are willing to pay top dollar if you can deliver them someone ready to buy their product. And there’s no better sign of someone’s intent to buy than having recently visited a site and almost made the purchase already. Cookies could be very tasty for Twitter.
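The cookie-based retargeting flow described in these annotations can be sketched as a toy simulation. Everything here (the cookie naming scheme, the campaign table) is invented for illustration and is not any real ad platform's API.

```python
# Step 1: the advertiser's site drops a cookie recording the page you viewed.
def drop_cookie(browser_cookies, advertiser, product):
    browser_cookies[f"rt_{advertiser}"] = product

# Step 2: an ad network that later sees that cookie on another site serves
# the matching retargeted ad. The table below is an invented example.
CAMPAIGNS = {
    ("travelsite", "hawaii-flight"):
        "It's cold in SF. Wouldn't a vacation to Hawaii be nice?",
}

def choose_ad(browser_cookies):
    for name, product in browser_cookies.items():
        if name.startswith("rt_"):
            advertiser = name[len("rt_"):]
            ad = CAMPAIGNS.get((advertiser, product))
            if ad:
                return ad
    return None  # no retargeting cookie: fall back to ordinary targeting

cookies = {}
drop_cookie(cookies, "travelsite", "hawaii-flight")  # looked, didn't buy
print(choose_ad(cookies))
```

Twitter's twist, per the article, is doing the cookie matching at login time and carrying the result over to the same account on mobile, where cookies are unavailable.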
Pedro Gonçalves

26 Ways to Create Social Media Engagement With Content Marketing | Social Media Examiner - 0 views

  • you need writers and team members who can think strategically about the content that will resonate most with your audience.
  • Search for people asking questions about your keyword or phrase on Twitter.
  • “Content curation is not about collecting links or being an information packrat, it is more about putting them into a context with organization, annotation and presentation.  Content curators provide a customized, vetted selection of the best and most relevant resources on a very specific topic or theme.”
  • Post curated content 50%, original content 30% and promotional material 20% of the time.
  • The data in a piece of content posted on Google+ is immediately indexed for Google search. On Twitter or Facebook, Google has restricted access to the data and indexing can take a few days. AuthorRank, the digital signature for Google+ users, is also set to affect the ranking order for search results.”
  • Take time to comment
  • Place your keywords: in the Bio or About Us section of all of your social networks
  • Post status updates often (especially every morning).
  • In survey after survey, smartphone users want to know if a close physical location is open. Content for these types of searches should include basic contact information—address, phone number and operating hours—as well as a short description of the location highlighting why a visitor should choose that location.
  • replying to the comments on media can be as important as the creation of the content itself. When someone comments, you must reply.”
Pedro Gonçalves

About Traffic Sources - Analytics Help - 0 views

  • The keywords that visitors searched are usually captured in the case of search engine referrals. This is true for both organic and paid search. If a visitor is signed in to a Google account, however, Keyword will have the value “(not provided)”.
    • Pedro Gonçalves
       
      Why!?
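One common way analysts cope with that placeholder is to bucket “(not provided)” visits separately so they do not distort keyword reports. A minimal sketch, with invented sample rows:

```python
from collections import Counter

# Separate Google's "(not provided)" placeholder from real keywords so the
# report only ranks keywords that were actually captured.
visits = [
    {"source": "google", "keyword": "(not provided)"},
    {"source": "google", "keyword": "content marketing"},
    {"source": "bing", "keyword": "content marketing"},
    {"source": "google", "keyword": "(not provided)"},
]

known = Counter(v["keyword"] for v in visits if v["keyword"] != "(not provided)")
hidden = sum(1 for v in visits if v["keyword"] == "(not provided)")

print(known.most_common(1))  # → [('content marketing', 2)]
print(hidden)                # → 2
```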
Pedro Gonçalves

Google Study: 9 in 10 Consumers Engage in Sequential Device Usage - 0 views

  • As the number of Internet-enabled consumer devices continues to grow, so does the propensity of consumers to sequentially use multiple devices to complete a single online task. In fact, according to a new study from Google, 90 percent of people move among devices to accomplish a goal.
  • Examples of how consumers sequentially use multiple devices for a single task include opening an email on a smartphone and then finishing reading it on a home PC and looking up product specs on a laptop after seeing a TV commercial
  • The other primary way of using multiple devices is simultaneous use, meaning using more than one device at the same time. This includes both multitasking — performing different tasks on different devices — and complementary usage such as looking up a product online while watching a TV commercial.
  • The most popular reasons for sequential device usage include web browsing (81 percent), shopping online (67 percent), managing finances (46 percent) and planning a trip (43 percent). Eighty-one percent of sequential online shopping is spontaneous, which Google credits to the widespread availability of smartphones.
  • 98 percent of sequential screeners move between devices in the same day to complete a task
  • Seventy-seven percent of the time, TV viewers have another device plugged in — with smartphones (49 percent) and PCs/laptops (34 percent) the most popular.
  • The study also found search to be a critical connector between devices used sequentially. Consumers use search to pick up on a second device where they left off on the first 63 percent of the time they are conducting multi-device search, 61 percent of the time they are browsing the Internet using multiple devices, 51 percent of the time they are shopping online via multiple screens, and 43 percent of the time they are using more than one device to watch online video.
  • Google advises digital marketers to allow customers to save their progress between devices, as well as use tactics like keyword parity (maintaining the same keywords across different publishers and the three primary match type silos of broad, phrase and exact) to ensure that they can be found easily via search when that customer moves to the next device.
  • 80 percent of searches that happen on smartphones are spur-of-the-moment, and 44 percent of these spontaneous searches are goal-oriented. And more than half (52 percent) of PC/laptop searches are spontaneous, with 43 percent goal-oriented
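The “keyword parity” tactic mentioned above (the same keyword list mirrored across the broad, phrase, and exact match-type silos) can be sketched as follows; the quoting and bracket syntax assumed here is AdWords-style notation, used purely for illustration:

```python
# Mirror one keyword across the three match-type silos named in the study.
# Broad is the bare term, phrase is quoted, exact is bracketed.
def match_type_variants(keyword):
    return {
        "broad": keyword,
        "phrase": f'"{keyword}"',
        "exact": f"[{keyword}]",
    }

print(match_type_variants("hawaii flights"))
```

Keeping the same terms in all three silos means a consumer resuming a task on a second device finds the same ads regardless of how loosely their query matches.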
Pedro Gonçalves

How Website Speed Actually Impacts Search Ranking - Moz - 0 views

  • in 2010, Google did something very different. Google announced website speed would begin having an impact on search ranking. Now, the speed at which someone could view the content from a search result would be a factor.
  • Google's Matt Cutts announced that slow-performing mobile sites would soon be penalized in search rankings as well.
  • While Google has been intentionally unclear in which particular aspect of page speed impacts search ranking, they have been quite clear in stating that content relevancy remains king.
  • When people say "page load time" for a website, they usually mean one of two measurements: "document complete" time or "fully rendered" time. Think of document complete time as the time it takes a page to load before you can start clicking or entering data. All the content might not be there yet, but you can interact with the page. Think of fully rendered time as the time it takes to download and display all images, advertisements, and analytic trackers. This is all the "background stuff" you see fill in as you're scrolling through a page.
  • Since Google was not clear on what page load time means, we examined the effects of both document complete and fully rendered times on search rankings. However, our biggest surprise came from the lack of correlation of two key metrics! We expected, if anything, these two metrics would clearly have an impact on search ranking. However, our data shows no clear correlation between document complete or fully rendered times and search engine rank, as you can see in the graph below:
  • With no correlation between search ranking and what is traditionally thought of as "page load time", we expanded our search to Time to First Byte (TTFB). This metric captures how long it takes your browser to receive the first byte of a response from a web server when you request a particular URL. In other words, this metric encompasses the network latency of sending your request to the web server, the amount of time the web server spent processing and generating a response, and the amount of time it took to send the first byte of that response back from the server to your browser.
  • The TTFB result was surprising in that a clear correlation was identified between decreasing search rank and increasing time to first byte. Sites that have a lower TTFB respond faster and have higher search result rankings than slower sites with a higher TTFB. Of all the data we captured, the TTFB metric had the strongest correlation effect, implying a high likelihood of some level of influence on search ranking.
  • The surprising result here was with the median size of each web page, in bytes, relative to the search ranking position. By "page size," we mean all of the bytes that were downloaded to fully render the page, including all the images, ads, third-party widgets, and fonts. When we graphed the median page size for each search rank position, we found a counterintuitive correlation of decreasing page size to decreasing page rank, with an anomalous dip in the top 3 ranks.
  • We suspect, though, that over time page rendering time will also factor into rankings, given the strong emphasis Google places on user experience.
  • our data shows there is a correlation between lower time-to-first-byte (TTFB) metrics and higher search engine rankings. Websites with servers and back-end infrastructure that could quickly deliver web content had a higher search ranking than those that were slower. This means that, despite conventional wisdom, it is back-end website performance and not front-end website performance that directly impacts a website's search engine ranking.
  • Our data shows there is no correlation between "page load time" (either document complete or fully rendered) and ranking on Google's search results page. This is true not only for generic searches (one or two keywords) but also for "long tail" searches (four or five keywords). We did not see websites with faster page load times ranking higher than websites with slower page load times in any consistent fashion. If page load time is a factor in search engine rankings, it is being lost in the noise of other factors. We had hoped to see some correlation, especially for generic one- or two-word queries. Our belief was that the high competition for generic searches would make smaller factors like page speed stand out more.
  • TTFB is affected by 3 factors: The network latency between a visitor and the server. How heavily loaded the web server is. How quickly the website's back end can generate the content.
  • Websites can lower network latency by utilizing Content Distribution Networks (CDNs). CDNs can quickly deliver content to all visitors, often regardless of geographic location, in a greatly accelerated manner.
  • Do these websites rank highly because they have better back-end infrastructure than other sites? Or do they need better back-end infrastructure to handle the load of ALREADY being ranked higher? While both are possible, our conclusion is that sites with faster back ends receive a higher rank, and not the other way around.
  • The back-end performance of a website directly impacts search engine ranking. The back end includes the web servers, their network connections, the use of CDNs, and the back-end application and database servers. Website owners should explore ways to improve their TTFB. This includes using CDNs, optimizing your application code, optimizing database queries, and ensuring you have fast and responsive web servers.
  • Fast websites have more visitors, who visit more pages, for longer period of times, who come back more often, and are more likely to purchase products or click ads. In short, faster websites make users happy, and happy users promote your website through linking and sharing. All of these things contribute to improving search engine rankings.
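The TTFB metric discussed throughout this piece is easy to approximate yourself. Here is a rough sketch using only the Python standard library; it times the gap between sending a request and the arrival of the response's status line and headers, which folds together network latency, server processing time, and the return trip, as described above:

```python
import http.client
import time

# Rough TTFB estimate: elapsed time from sending the GET request to the
# moment the response's status line and headers have arrived. This is an
# approximation for illustration, not how a measurement service does it.
def time_to_first_byte(host, path="/", use_tls=True):
    conn_cls = http.client.HTTPSConnection if use_tls else http.client.HTTPConnection
    conn = conn_cls(host, timeout=10)
    try:
        start = time.perf_counter()
        conn.request("GET", path)
        conn.getresponse()  # returns once status line and headers are in
        return time.perf_counter() - start
    finally:
        conn.close()

# e.g. time_to_first_byte("www.example.com") returns seconds as a float
```

Comparing this number across a CDN-fronted URL and an origin URL is a quick way to see the latency reduction the article attributes to CDNs.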
Pedro Gonçalves

The Basics Of Neuromarketing | Fast Company - 0 views

  • Gone are the days when you could stuff your website with low-quality articles packed with the right keywords or link spam exchanges to boost your Google rankings. Today the game is all about quality--content that’s authentic, informative, and, most of all, attractive to your intended audience. In short, we need to stop thinking about SEO as “search engine optimization” and more as “social engagement optimization,” as Greg Henderson at SEO Desk put it.
  • The brain notices how you begin and how you end more than what you’re saying in the middle, so you want to make sure that your site (and your content) has an attention-getting open and a close that really makes your case in dramatic fashion.
  • focus on quick ways to sum up how your product or service can change the customer’s life for the better.
  • What our eyes see connects directly with the unconscious parts of the brain that marketers want to reach; that means you want to make your points (and your website design) as visual as possible. Photos and pictures are a great way to sell concepts quickly and directly in a brain-pleasing way. And, by the way, facial expressions are great to use--our noggins immediately identify with them.
  • Our brains are getting inundated with messages all day long--so they respond well to pitches that are short and sweet. Short impactful statements on the homepage can do the job a whole lot better than huge blocks of copy that overexplain what you’re all about.
  • If you’re too clever or too abstract, our brains are going to want to move on (unless it’s something we really want to figure out, which isn’t usually the case with marketing). Make sure your content is written clearly in language everyone can understand (unless you’re serving a niche audience that expects more technical or sophisticated language).
  • Emotion hits our underground intellect more powerfully than the most effectively worded argument. It makes whatever the message is more memorable as well. Go beyond facts to make your customers feel.
  • By working towards more actual social engagement opportunities with our website visitors, instead of just artificially boosting traffic, we also increase our odds for creating conversations, conversions, and long-term clients.
Pedro Gonçalves

Facebook: $7 To Promote My Status Update??? - 0 views

  • Essentially, what Facebook is doing here is making users pay to improve the EdgeRank score of individual status updates; EdgeRank is the system Facebook uses to determine how high individual stories appear in the news feed.
  • It is really not all that different from how Google has structured its own business, where companies or individuals can pay for keywords to appear as advertisements in search results. The difference is that Google’s search engine is a much more impersonal mechanism than the Facebook news feed.
  • Where Facebook is making a mistake is in crossing the bridge between paid sponsored posts for businesses and applying it to individuals
  • It is one thing to ask a business to pay to increase its visibility, that is the type of thing that businesses budget for. It is another to ask users on their personal pages directly for money. Google has never asked me to spend money to improve the search results for my own name, for instance.
Pedro Gonçalves

For Brands, Being Cool Is As Hot As Sex | Fast Company - 0 views

  • For the study 353 volunteers were asked to submit adjectives they associated with coolness. Surprisingly, the word "friendly" topped the list, followed by "personal competence."
  • This ranking positioned socially skilled, popular, smart, and talented people as being the ultimate in cool; individualist hipsters featured lower on the list. Bar-Ilan concluded: "Coolness has lost so much of its historical origins and meaning." That is: rebels are not hot. Or cool.
  • Another attribute figured prominently in this recent study: physical attractiveness. The prominence of "good looks" in the study echoes the results of work I carried out for my most recent book, Brandwashed. During my $3 million study into the way word-of-mouth works, I asked a family of five to secretly promote brands to a cadre of their friends, family members and colleagues. During this experiment, I learned that the key to the family’s success was neither their extensive network, nor their gift of the gab; it was their good looks.
  • A slew of books published in the last year attest to this. Daniel Hamermesh describes why attractive people are more successful in Beauty Pays, whereas Deborah Rhode’s The Beauty Bias argues for a legal basis to prohibit discrimination against those who are not gifted in the looks department.
Pedro Gonçalves

Content Marketing is More Important than Ever | Experts' Corner | Big Think - 0 views

  • The major takeaways from the Google Panda and Penguin updates (in no particular order) are as follows:
    • Focus on original content – you will get hammered for “stealing” or repurposing on too high a scale (for example, lifting content from Wikipedia).
    • Over-optimization kills – Google can sniff out sites that are designed solely to exploit certain keywords (for example, repeating the same keyword, or variations thereof, to drive traffic).
    • Link to high-quality/authoritative sites – while Panda focused on a more systematic sweep of SEO, Penguin is focused on the processes around linking. Don’t over-link, and when you do create links, link to high-quality sources.
    • Excessive ads are bad – if it looks like you are running too many ads against your content, you will face the consequences.
    • SEO is a “bad word” – the rise of the term “content marketing” effectively means that high-quality content trumps low-quality link bait.
  • Write guest blog posts for authoritative sites. Content marketing does not just refer to content you write for your own site, but also to content you write for other sites.
  • content marketing also improves SEO rankings and traffic.  Link building is a common SEO strategy that is always difficult to grow through a paid channel.  The best way to get organic and quality links is by creating interesting content that drives people to link and share your content.  Whether or not it’s directly related to your line of business, driving free traffic is always a victory.
  • The whole purpose of your own blog is to drive the highest quality, most targeted traffic to your conversion funnel – if you are not doing that, you might as well not have a blog.
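The “over-optimization kills” takeaway above can be approximated with a naive keyword-density check. This is a toy sketch; the 5% threshold and four-letter minimum are arbitrary illustrative choices, not figures Google publishes:

```python
import re
from collections import Counter

# Fraction of the text each word accounts for.
def keyword_density(text):
    words = re.findall(r"[a-z']+", text.lower())
    total = len(words)
    return {w: c / total for w, c in Counter(words).items()}

# Flag text where any single (non-trivial) word dominates, a crude proxy
# for the keyword repetition Panda/Penguin penalize.
def looks_stuffed(text, threshold=0.05, min_len=4):
    return any(share > threshold and len(word) >= min_len
               for word, share in keyword_density(text).items())

print(looks_stuffed("buy shoes buy shoes buy shoes today"))  # → True
```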
Pedro Gonçalves

How Users Read on the Web - 0 views

  • 79 percent of our test users always scanned any new page they came across; only 16 percent read word-by-word. (Update: a newer study found that users read email newsletters even more abruptly than they read websites.)
  • Web pages have to employ scannable text, using:
    • highlighted keywords (hypertext links serve as one form of highlighting; typeface variations and color are others)
    • meaningful sub-headings (not “clever” ones)
    • bulleted lists
    • one idea per paragraph (users will skip over any additional ideas if they are not caught by the first few words in the paragraph)
    • the inverted pyramid style, starting with the conclusion
    • half the word count (or less) of conventional writing
Pedro Gonçalves

Can Artificial Intelligence Like IBM's Watson Do Investigative Journalism? ⚙ ... - 0 views

  • Two years ago, the two greatest Jeopardy champions of all time got obliterated by a computer called Watson. It was a great victory for artificial intelligence--the system racked up more than three times the earnings of its next meat-brained competitor. For IBM’s Watson, the successor to Deep Blue, which famously defeated chess champion Garry Kasparov, becoming a Jeopardy champion was a modest proof of concept. The big challenge for Watson, and the goal for IBM, is to adapt the core question-answering technology to more significant domains, like health care. WatsonPaths, IBM’s medical-domain offshoot announced last month, is able to derive medical diagnoses from a description of symptoms. From this chain of evidence, it’s able to present an interactive visualization to doctors, who can interrogate the data, further question the evidence, and better understand the situation. It’s an essential feedback loop used by diagnosticians to help decide which information is extraneous and which is essential, thus making it possible to home in on a most-likely diagnosis. WatsonPaths scours millions of unstructured texts, like medical textbooks, dictionaries, and clinical guidelines, to develop a set of ranked hypotheses. The doctors’ feedback is added back into the brute-force information retrieval capabilities to help further train the system.
  • For Watson, ingesting all 2.5 million unstructured documents is the easy part. For this, it would extract references to real-world entities, like corporations and people, and start looking for relationships between them, essentially building up context around each entity. This could be connected out to open-entity databases like Freebase, to provide even more context. A journalist might orient the system’s “attention” by indicating which politicians or tax-dodging tycoons might be of most interest. Other texts, like relevant legal codes in the target jurisdiction or news reports mentioning the entities of interest, could also be ingested and parsed. Watson would then draw on its domain-adapted logic to generate evidence, like “IF corporation A is associated with offshore tax-free account B, AND the owner of corporation A is married to an executive of corporation C, THEN add a tiny bit of inference of tax evasion by corporation C.” There would be many of these types of rules, perhaps hundreds, and probably written by the journalists themselves to help the system identify meaningful and newsworthy relationships. Other rules might be garnered from common sense reasoning databases, like MIT’s ConceptNet. At the end of the day (or probably just a few seconds later), Watson would spit out 100 leads for reporters to follow. The first step would be to peer behind those leads to see the relevant evidence, rate its accuracy, and further train the algorithm. Sure, those follow-ups might still take months, but it wouldn’t be hard to beat the 15 months the ICIJ took in its investigation.
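The IF/AND/THEN evidence rule quoted above can be sketched as a tiny rule engine over extracted entity relations. The facts, relation names, and 0.1 weight are all invented for illustration; Watson's actual pipeline is far more involved:

```python
# Invented fact triples of the kind an entity-extraction pass might produce.
facts = {
    ("CorpA", "associated_with", "OffshoreAcctB"),
    ("PersonX", "owns", "CorpA"),
    ("PersonX", "married_to", "PersonY"),
    ("PersonY", "executive_of", "CorpC"),
}

def infer_leads(facts, weight=0.1):
    scores = {}
    # IF corporation A is associated with an offshore account,
    # AND the owner of A is married to an executive of corporation C,
    # THEN add a tiny bit of inference of tax evasion by corporation C.
    for corp_a, rel, _account in facts:
        if rel != "associated_with":
            continue
        owners = [p for (p, r, c) in facts if r == "owns" and c == corp_a]
        for owner in owners:
            spouses = [s for (p, r, s) in facts
                       if r == "married_to" and p == owner]
            for spouse in spouses:
                corps = [c for (p, r, c) in facts
                         if r == "executive_of" and p == spouse]
                for corp_c in corps:
                    scores[corp_c] = scores.get(corp_c, 0.0) + weight
    # ranked leads for reporters to follow
    return sorted(scores.items(), key=lambda kv: -kv[1])

print(infer_leads(facts))  # → [('CorpC', 0.1)]
```

With hundreds of such rules running over millions of documents, the output would be the ranked list of leads the article imagines reporters vetting.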