Feedburner Goes All Permanent on Their URL Redirects - Search Marketing News Blog - Sea...
-
September 30, 2009
If you've ever clicked on a link in your RSS reader and that link belongs to a site that uses Feedburner, you've probably noticed that the URL that first appears in your browser's address bar is related to the feed rather than to the final destination. That's because Feedburner uses that URL to track the click. The redirect used to be a 302, a temporary redirect, but Feedburner is now updating the URLs to permanent 301 redirects. Feedburner, which is owned by Google, says the reason for the change is that some search engines index the feeds, which affects the popularity of a site. If you use Feedburner, you don't have to do anything special; the update is automatic.
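As a quick way to see the change in practice, the sketch below requests a tracked link without following redirects and prints the status code and the redirect target. It assumes the Python requests library, and the feed URL is a hypothetical placeholder, not a real Feedburner link.

```python
# Minimal sketch: inspect the redirect a Feedburner-style tracking link returns.
# The URL below is a hypothetical placeholder, not a real tracked feed link.
import requests

tracked_link = "http://feedproxy.google.com/~r/example-feed/~3/abc123/some-post"

# Don't follow the redirect, so the tracking hop itself stays visible.
response = requests.get(tracked_link, allow_redirects=False, timeout=10)

print(response.status_code)              # 302 before the change, 301 after it
print(response.headers.get("Location"))  # the final article URL
```

A 301 tells crawlers that the tracking URL permanently points at the article, so the final URL, rather than the feed URL, gets credited in the index.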
Google Shares Mobile Search Volumes - Search Engine Watch Forums
-
Posted 1 week ago by AccuraCast (London, UK): After more than two years of requests for more stats on mobile search volumes and click estimates, Google has finally shared this data via a rather inconspicuous feature in its new Keywords Tool. Google shares mobile search volumes - accuracast.com/search-daily-news/accuracast-7471/google-shares-mobile-search-volumes-for-the-first-time/ This is really big for mobile advertisers: finally we can put a number and a value to mobile search advertising and make a stronger business case to prospective clients.
Content Central Blog: Introducing the Google Merchant Center
Bing - Getting the IIS SEO Toolkit up and running - Webmaster Blog - Bing Community
-
We recently published a popular blog post called "Get detailed site analysis to solve problems," which highlights the functions and capabilities of the new, beta search engine optimization (SEO) Toolkit from the Microsoft Internet Information Server (IIS) team. The tool works as an extension to the latest version of IIS, version 7.
Bing - How Microsoft handles bots clicking on ads - Webmaster Blog - Bing Community
-
adCenter uses a variety of techniques to remove bots, including the Interactive Advertising Bureau's (IAB) Spiders and Robots protocol. The IAB provides a list of known bots, and Microsoft's bots are part of that list. As a result, any activity generated by bots will not skew adCenter data, because it is categorized as low quality in adCenter Reports. You can view the Standard Quality and Low Quality data by accessing the adCenter Reports tab. In June 2009, Microsoft received Click Quality Accreditation from the IAB, which holds the industry's highest standards in click measurement. The IAB and independent third-party auditors verified that adCenter meets their requirements for Click Quality Accreditation, which include not billing for our search bot's ad clicks. For more information, visit the adCenter Blog or the IAB site.
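As a rough illustration of the approach described above (not adCenter's actual pipeline), the sketch below tags click-log entries whose user agent matches a list of known bot signatures so they land in a "low quality" bucket instead of being billed. The bot list and log format are hypothetical examples.

```python
# Rough sketch of bot-click filtering: tag clicks whose user agent matches a
# known-bot list so they can be reported as "low quality" rather than billed.
# The signatures and log format are hypothetical, not the actual IAB list or
# adCenter's internal pipeline.
KNOWN_BOT_SIGNATURES = ["bingbot", "msnbot", "adidxbot", "googlebot"]

clicks = [
    {"ad_id": 101, "user_agent": "Mozilla/5.0 (compatible; bingbot/2.0)"},
    {"ad_id": 101, "user_agent": "Mozilla/5.0 (Windows NT 6.1) Chrome/3.0"},
]

def classify(click):
    """Return the quality bucket for a single click-log entry."""
    ua = click["user_agent"].lower()
    if any(sig in ua for sig in KNOWN_BOT_SIGNATURES):
        return "low quality"   # excluded from billing
    return "standard quality"

for click in clicks:
    print(click["ad_id"], classify(click))
```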
Bing - Search Engine Optimization for Bing - Webmaster Blog - Bing Community
-
The type of SEO work and tasks webmasters need to perform to be successful in Bing hasn't changed: all of the legitimate, time-tested SEO skills and knowledge that webmasters have invested in previously apply fully today with Bing.
Top Search Providers for August 2009 - ClickZ
-
By Jack Marshall, ClickZ, Sep 15, 2009. Microsoft's Bing grew its number of queries from U.S. users by over 22 percent month-on-month during August, making it the fastest-growing major search provider, according to data from Nielsen. The engine, which was relaunched in June, now accounts for 10.7 percent of all U.S. searches. Market leader Google managed growth of 2.6 percent in comparison, behind an overall average of 2.9 percent for the sector, but continues its dominance with 65 percent of searches. Yahoo, meanwhile, saw its volume of searches drop by 4.2 percent, but continues to hold second place in terms of overall share with 16 percent. Rounding out the top four, AOL experienced growth of 2.9 percent, accounting for 3.1 percent of total searches.
Microsoft Tests Social Media Monitoring Product - ClickZ
-
By Christopher Heine, ClickZ, Sep 24, 2009. Microsoft has developed a social media analytics tool that's designed, among other things, to improve a marketing organization's ability to adjust to social media phenomena on the fly. Called "Looking Glass," the product is still in prototype and will only be available to a few companies in the near term. It sends e-mail alerts when social media activity picks up considerably. The sentiment (i.e., negative or positive) of that chatter and the influence level of the content creator are reported in the alert. Digital flow charts show which days of the week generate the most activity on Twitter, Facebook, Flickr, YouTube, and other social media sites. But interweaving social media data with reporting from other campaign channels may turn out to be Microsoft's most significant contribution to the already mature field of social media analytics. Feeds from social media sites can be connected to other business elements like customer databases, CRM centers, and sales data within an organization. The data integrate via Microsoft's enterprise platforms like Outlook and SharePoint. A handful of companies will begin testing Looking Glass in the coming weeks.
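The alerting behavior described above boils down to spotting a jump in activity against a recent baseline. Here is a generic sketch of that idea only; it is not Looking Glass itself, and the mention counts and threshold are made up.

```python
# Generic sketch of a spike alert: compare the latest day's mention count with
# the average of the preceding days and flag a large jump. Illustration of the
# general idea only, not Microsoft's Looking Glass product.
from statistics import mean

def spike_alert(daily_mentions, threshold=2.0):
    """Alert if the latest day's mentions exceed threshold x the prior average."""
    *history, today = daily_mentions
    baseline = mean(history)
    if baseline and today / baseline >= threshold:
        return f"ALERT: {today} mentions today vs. ~{baseline:.0f}/day baseline"
    return None

mentions_last_week = [40, 35, 52, 48, 44, 41, 130]  # hypothetical daily counts
message = spike_alert(mentions_last_week)
if message:
    print(message)  # a real tool would send an e-mail with sentiment and influence details
```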
Advertisers Lag Consumers in Mobile Adoption, For Now - ClickZ
-
Only 11 percent of both brands and agencies responding to eMarketer said mobile represented a line item in their 2010 budgets; 19 percent said they were "experimenting but have no future plans at all;" and 36 percent of brands said it was simply not part of their plans. But with the spread of smartphones and devices that facilitate easier Web searching, advertisers will find themselves faced with more options for reaching consumers on their phones, and they are already preparing to take advantage of them. eMarketer projects spending on mobile ads to reach $593 million next year and $830 million in 2011. By 2013, the report says, that number will reach $1.56 billion, or 9.9 percent of total spending on display advertising. "Mobile will grow considerably more quickly than online ad spending as a whole, more in line with emerging online formats such as digital video," Elkin said. The report also noted that widespread experimentation today is making marketers and consumers more comfortable with ads on mobile devices, which will pay off in the coming years. Of course, talking about mobile means talking about many different things: search, display, and SMS texting, to name a few. As for where marketers will put this money, eMarketer predicts the steepest rise will come in money spent on search, from 18 percent of the total in 2008 to 37 percent in 2013. Meanwhile, SMS will see a decline in share as messaging options become more sophisticated, from 60 percent in 2008 to 28 percent in 2013. Display is expected to grow its share, from 22 percent last year to 35 percent in 2013.
Consumer Shopping Engines: An Exercise in Frustration - ClickZ
-
Other frustrations of working with shopping engines include:
- You can cap spend but not truly control it. Consistent delivery cannot be promised; spikes happen out of the blue and might burn through your budget overnight. You can try to regulate that in your insertion order (IO).
- The IO process is archaic. It lasts forever, has little detail, and can go across clients if you're an agency.
- Shopping engines have multiple budget limits, account reps, account preferences, systems, and so on. Managing billing and credit departments is a nightmare, too.
- There's no consistency in the channel and no best practices across engines or feeds. The feeds have not been standardized, as each engine maintains its own secret sauce.
Some solutions for the shopping engines:
- Develop a spend limit based on daily spend, similar to Google AdWords (a rough sketch of the idea follows below). A monthly credit limit that simply shuts down isn't a regulated budget.
- Allow advertisers to have more influence through various methods of optimization.
- Have a third party develop a system similar to Google's My Client Center (MCC) to control multiple shopping engines and multiple clients. Allow budget limits, preferences, billing, systems, and so forth to be managed by this MCC-like interface as well.
- Develop a standard feed format across shopping engines. Unused fields could simply be ignored by specific shopping engines.
If we had a wish list for shopping engines, stronger optimization opportunities would be right at the top. That would make this logical, effective consumer touch point more a part of an integrated plan to maximize revenue and less of a crapshoot.
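The daily-spend-limit suggestion referenced in the first solution item could look roughly like the sketch below: accumulate today's click costs and pause once a cap is hit. The click feed, cap value, and pause action are hypothetical placeholders, not any shopping engine's real API.

```python
# Rough sketch of an AdWords-style daily spend limit for a shopping engine.
# The click data, cap, and "pause" step are hypothetical placeholders.
DAILY_SPEND_LIMIT = 250.00  # dollars; an assumed per-engine daily cap

def should_pause(clicks_so_far_today):
    """Return True once today's accumulated click cost reaches the cap."""
    spend = sum(click["cost"] for click in clicks_so_far_today)
    return spend >= DAILY_SPEND_LIMIT

todays_clicks = [
    {"engine": "engine-a", "cost": 0.85},
    {"engine": "engine-a", "cost": 1.10},
]

if should_pause(todays_clicks):
    print("Pause listings for the rest of the day")  # placeholder for a real pause call
else:
    print("Keep serving; spend so far:", sum(c["cost"] for c in todays_clicks))
```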
Deduping Duplicate Content - ClickZ
-
One interesting thing that came out of SES San Jose's Duplicate Content and Multiple Site Issues session in August was the sheer volume of duplicate content on the Web. Ivan Davtchev, Yahoo's lead product manager for search relevance, said "more than 30 percent of the Web is made up of duplicate content." At first I thought, "Wow! Three out of every 10 pages on the Web consist of duplicate content." My second thought was, "Sheesh, the Web is one tangled mess of equally irrelevant content." Small wonder trust and linkage play such significant roles in determining a domain's overall authority and consequent relevancy in the search engines.
Three Flavors of Bleh
Davtchev went on to explain three basic types of duplicate content:
1. Accidental content duplication: This occurs when Webmasters unintentionally allow content to be replicated through non-canonicalization (define), session IDs, soft 404s (define), and the like.
2. Dodgy content duplication: This primarily consists of replicating content across multiple domains.
3. Abusive content duplication: This includes scraper spammers, weaving or stitching (mixing and matching content to create "new" content), and bulk content replication.
Fortunately, Greg Grothaus from Google's search quality team had already addressed the duplicate content penalty myth, noting that Google "tries hard to index and show pages with distinct information." It's common knowledge that Google uses a checksum-like method for initially filtering out replicated content. For example, most Web sites have a regular and a print version of each article. Google only wants to serve up one copy of the content in its search results, and which copy that is gets determined predominantly by linking prowess. Because most print-ready pages are dead-end URLs sans site navigation, it's relatively simple to work out which page Google prefers to serve up in its search results. In exceptional cases of content duplication that Google perceives as an abusive attempt to manipula
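The checksum-like filtering mentioned above can be illustrated with a minimal sketch: normalize each page's text, hash it, and keep the first URL seen per hash. This is only an illustration of the general idea, not Google's actual implementation.

```python
# Minimal sketch of a checksum-style duplicate filter: normalize each page's
# text, hash it, and keep only the first URL seen for each fingerprint.
# Illustration of the general idea only, not Google's actual implementation.
import hashlib

def content_fingerprint(page_text):
    """Hash a whitespace- and case-normalized version of the page text."""
    normalized = " ".join(page_text.lower().split())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

pages = {
    "/article/print": "Deduping Duplicate Content  -  one tangled mess of content.",
    "/article":       "Deduping Duplicate Content - one tangled mess of content.",
    "/other":         "A completely different article.",
}

seen = {}
for url, text in pages.items():
    fp = content_fingerprint(text)
    if fp in seen:
        print(f"{url} duplicates {seen[fp]}")  # later copies are flagged as duplicates
    else:
        seen[fp] = url
```

Which surviving copy actually gets shown is a separate choice, and as noted above it is driven largely by which version has the stronger links.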
Social Media and Local Search 101 - ClickZ
-
Social media is now a more popular way to communicate online than e-mail, according to these Nielsen stats.