
New Media Ethics 2009 course: Group items tagged Social Media Optimization


Weiye Loh

How Is Twitter Impacting Search and SEO? Here's the (Visual) Proof | MackCollier.com

  • I picked a fairly specific term: “Social Media Crisis Management”.  I checked prior to publishing yesterday’s post, and there were just a shade under 29,000 Google results for that term.  This is important because you need to pick as specific a term as possible; this will result in less competition, and (if you’ve picked the right term for you) it means you will be more likely to get the ‘right’ kind of traffic.
  • Second, I made sure the term was in the title and mentioned a couple of times in the post.  I also put the term “Social Media Crisis Management” at the front of the post title; I originally had the title as “A No-Nonsense Guide to Social Media Crisis Management”, but Amy wisely suggested that I flip it so the term I was targeting was at the front of the title.  (A quick check of this kind of keyword placement is sketched after this list.)
  • When I published the post yesterday at 12:20pm, there were 28,900 Google results for the term “Social Media Crisis Management”.  I tweeted a link to it at that time.  Fifty minutes later at 1:10pm, the post was already showing up on the 3rd page of a Google search for “Social Media Crisis Management”.
  • I tweeted out another link to the post around 2pm, and then at 2:30pm, it moved a bit further up the results on the 3rd page.
  • The Latest results factor in real-time linking behavior, so they pick up all the tweets where my post was being RTed, and as a result, the top half of the Latest results for the term “Social Media Crisis Management” was completely devoted to MY post.
  • That’s a perfect example of how Twitter and Facebook sharing is now impacting Google results.  And it’s also a wonderful illustration of the value of being active on Twitter.  I tweeted a link to that post several times yesterday and this morning, which was a big reason why it moved up the Google results so quickly, and a big reason why it dominated the Latest results for that term.
  • there are two things I want you to take away from this: 1 – This was very basic SEO stuff that any of you can do.  It was simply a case of targeting a specific phrase and inserting it in the post.  Now, my having a large and engaged Twitter network and readership here (thanks guys!) definitely played a big part in the post moving up the results so quickly.  But at a basic level, everything I did from an SEO perspective is what you can do with every post.  And you should.
  • 2 – You can best learn by breaking stuff.  There are a gazillion ‘How to’ and ’10 Steps to…’ articles about using social media, and I have certainly written my fair share of these.  But the best way *I* learn is if you can show me the first 1 or 2 steps, then leave me alone and let me figure out the remaining 8 or 9 steps for myself.  Don’t just blindly follow my social media advice or anyone else’s.  Use the advice as a guide for how you can get started.  But there is no one RIGHT way to use social media.  Never forget that.  I can tell you what works for me and my clients, but you still need to tweak any advice so that it is perfect for you.  SEO geeks will no doubt see a ton of things that I could have done or altered in this experiment to get even better results.  And moving forward, I am going to continue to tweak and ‘break stuff’ in order to better figure out how all the moving parts work together.
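The on-page check described above (target phrase at the front of the title, mentioned a couple of times in the body) can be automated. Below is a minimal Python sketch, not from the original post; the function name and the sample phrase, title and body text are illustrative placeholders only.

    def keyword_placement_report(phrase, title, body):
        """Report how a target phrase is placed in a post (case-insensitive)."""
        p = phrase.lower()
        return {
            "phrase_leads_title": title.lower().startswith(p),
            "mentions_in_body": body.lower().count(p),
        }

    if __name__ == "__main__":
        # Illustrative values only; swap in your own phrase, title and draft text.
        phrase = "Social Media Crisis Management"
        title = "Social Media Crisis Management: A No-Nonsense Guide"
        body = ("Social media crisis management starts with listening. "
                "A social media crisis management plan should name who responds, where, and how fast.")
        print(keyword_placement_report(phrase, title, body))
        # Prints: {'phrase_leads_title': True, 'mentions_in_body': 2}

This only automates the placement check; whether the phrase is the right one to target remains an editorial judgment.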
Weiye Loh

Google Social Search with Twitter integration and more | T3 magazine

  • Google adds more functionality to Social Search: Google has made a few tweaks to Social Search, integrating it with Twitter and Google accounts for personalized and relevant results.
  • Google says it aims to combine the "goodness of Google" with the opinions of people the users care most about. These results could be based on whether your friends publish their information on their blogs/websites, YouTube or Flickr accounts and more.
  • The social results will no longer appear at the bottom of the page, but will be mixed with the regular search results depending on their relevance to the user. These results will be annotated, marking them as social results. They will also include links people have shared on Twitter and other social networking sites. The new search also allows users to privately link their Twitter accounts.
Weiye Loh

Google's War on Nonsense - NYTimes.com

  • As a verbal artifact, farmed content exhibits neither style nor substance.
  • The insultingly vacuous and frankly bizarre prose of the content farms — it seems ripped from Wikipedia and translated from the Romanian — cheapens all online information.
  • These prose-widgets are not hammered out by robots, surprisingly. But they are written by writers who work like robots. As recent accounts of life in these words-are-money mills make clear, some content-farm writers have deadlines as frequently as every 25 minutes. Others are expected to turn around reported pieces, containing interviews with several experts, in an hour. Some compose, edit, format and publish 10 articles in a single shift. Many with decades of experience in journalism work 70-hour weeks for salaries of $40,000 with no vacation time. The content farms have taken journalism hackwork to a whole new level.
  • So who produces all this bulk jive? Business Insider, the business-news site, has provided a forum to a half dozen low-paid content farmers, especially several who work at AOL’s enormous Seed and Patch ventures. They describe exhausting and sometimes exploitative writing conditions. Oliver Miller, a journalist with an MFA in fiction from Sarah Lawrence who once believed he’d write the Great American Novel, told me AOL paid him about $28,000 for writing 300,000 words about television, all based on fragments of shows he’d never seen, filed in half-hour intervals, on a graveyard shift that ran from 11 p.m. to 7 or 8 in the morning.
  • Mr. Miller’s job, as he made clear in an article last week in The Faster Times, an online newspaper, was to cram together words that someone’s research had suggested might be in demand on Google, position these strings as titles and headlines, embellish them with other inoffensive words and make the whole confection vaguely resemble an article. AOL would put “Rick Fox mustache” in a headline, betting that some number of people would put “Rick Fox mustache” into Google, and retrieve Mr. Miller’s article. Readers coming to AOL, expecting information, might discover a subliterate wasteland. But before bouncing out, they might watch a video clip with ads on it. Their visits would also register as page views, which AOL could then sell to advertisers.
  • commodify writing: you pay little or nothing to writers, and make readers pay a lot — in the form of their “eyeballs.” But readers get zero back, no useful content.
  • You can’t mess with Google forever. In February, the corporation concocted what it concocts best: an algorithm. The algorithm, called Panda, affects some 12 percent of searches, and it has — slowly and imperfectly — been improving things. Just a short time ago, the Web seemed ungovernable; bad content was driving out good. But Google asserted itself, and credit is due: Panda represents good cyber-governance. It has allowed Google to send untrustworthy, repetitive and unsatisfying content to the back of the class. No more A’s for cheaters.
  • the goal, according to Amit Singhal and Matt Cutts, who worked on Panda, is to “provide better rankings for high-quality sites — sites with original content and information such as research, in-depth reports, thoughtful analysis and so on.”
  • Google officially rolled out Panda 2.2. Put “Whitey Bulger” into Google, and where you might once have found dozens of content farms, today you get links to useful articles from sites ranging from The Boston Globe, The Los Angeles Times, the F.B.I. and even Mashable, doing original analysis of how federal agents used social media to find Bulger. Last month, Demand Media, once the most notorious of the content farms, announced plans to improve quality by publishing more feature articles by hired writers, and fewer by “users” — code for unpaid freelancers. Amazing. Demand Media is stepping up its game.
  •  
    Content farms, which have flourished on the Web in the past 18 months, are massive news sites that use headlines, keywords and other tricks to lure Web-users into looking at ads. These sites confound and embarrass Google by gaming its ranking system. As a business proposition, they once seemed exciting. Last year, The Economist admiringly described Associated Content and Demand Media as cleverly cynical operations that "aim to produce content at a price so low that even meager advertising revenue can support it."
Weiye Loh

TODAYonline | Commentary | For the info-rich and time-poor, digital curators to the rescue

  • digital "curators" choose and present things related to a specific topic and context. They "curate", as opposed to "aggregate", which implies plain collecting with little or no value add. Viewed in this context, Google search does the latter, not the former. So, who curates? The Huffington Post, or HuffPo, is one high-profile example and, it appears, a highly-valued one too, going by AOL numbers-crunchers who forked out US$315 million (S$396.9 million) to acquire it. Accolades have also come in for Arianna Huffington's team of contributors and more than 3,000 bloggers - from politicians to celebrities to think-tankers. The website was named second among the 25 best blogs of 2009 by Time magazine, and most powerful blog in the world by The Observer.
  • By sifting, sorting and presenting news and views - yes, "curating" - HuffPo makes itself useful in an age of too much information and too many opinions. (Strictly speaking, HuffPo is both a creator and curator.) If what HuffPo is doing seems deja vu, it is hardly surprising. Remember the good old "curated" news of the pre-Internet days when newspapers decided what news was published and what we read? Then, the Editor was the Curator with the capital "C".
  • But with the arrival of the Internet and the uploading of news and views by organisations and netizens, the bits and bytes have turned into a tsunami. Aggregators like Google search threw us some life buoys, using text and popularity to filter the content. But with millions of new articles and videos added to the Internet daily, the "right" content has become that proverbial needle in the haystack. Hence the need for curation.
  •  
    Inundated by the deluge of information, and with little time on our hands, some of us turn to social media networks. Sometimes, postings by friends are useful. But often, the typically self-indulgent musings are not. It's "curators" to the rescue.
Weiye Loh

How Google's +1 Button Affects SEO

  •  
    Google defines the +1 as a feature to help people discover and share relevant content from the people they already know and trust. Users can +1 different types of content, including Google search results, websites, and advertisements. Once users +1 a piece of content, it can be seen on the +1 tab in their Google+ profile, in Google search results, and on websites with a +1 button. The plot thickened last month when Google launched Search plus Your World. Jack Menzel, director of product management for Google Search, explained that now Google+ users would be able to "search across information that is private and only shared to you, not just the public web." According to Ian Lurie from the blog Conversation Marketing, in Search plus Your World, search results that received a lot of +1s tend to show up higher in results.
Weiye Loh

Why a hyper-personalized Web is bad for you - Internet - Insight - ZDNet Asia

  • Invisibly but quickly, the Internet is changing. Sites like Google and Facebook show you what they think you want to see, based on data they've collected about you.
  • The filter bubble is the invisible, personal universe of information that results--a bubble you live in, and you don't even know it. And it means that the world you see online and the world I see may be very different.
  • As consumers, we can vary our information pathways more and use things like incognito browsing to stop some of the tracking that leads to personalization.
  • it's in these companies' hands to do this ethically--to build algorithms that show us what we need to know and what we don't know, not just what we like.
  • why would the Googles and Facebooks of the world change what they're doing (absent government regulation)? My hope is that, like newspapers, they'll move from a pure profit-making posture to one that recognizes that they're keepers of the public trust.
  • most people don't know how Google and Facebook are controlling their information flows. And once they do, most people I've met want to have more control and transparency than these companies currently offer. So it's a way in to that conversation. First people have to know how the Internet is being edited for them.
  • So what's good and bad about the personalization? Tell me some ways that this is not a good thing. Here are a few. 1) It's a distorted view of the world. Hearing your own views and ideas reflected back is comfortable, but it can lead to really bad decisions--you need to see the whole picture to make good decisions; 2) It can limit creativity and innovation, which often come about when two relatively unrelated concepts or ideas are juxtaposed; and 3) It's not great for democracy, because democracy requires a common sense of the big problems that face us and an ability to put ourselves in other people's shoes.
  • Stanford researchers Dean Eckles and Maurits Kaptein, who figured out that not only do people have personal tastes, they have personal "persuasion profiles". So I might respond more to appeals to authority (Barack Obama says buy this book), and you might respond more to scarcity ("only 2 left!"). In theory, if a site like Amazon could identify your persuasion profile, it could sell it to other sites--so that everywhere you go, people are using your psychological weak spots to get you to do stuff. I also really enjoyed talking to the guys behind OKCupid, who take the logic of Google and apply it to dating.
  • Nobody noticed when Google went all-in on personalization, because the filtering is very hard to see.
Weiye Loh

Jonathan Stray » Measuring and improving accuracy in journalism

  • Accuracy is a hard thing to measure because it’s a hard thing to define. There are subjective and objective errors, and no standard way of determining whether a reported fact is true or false
  • The last big study of mainstream reporting accuracy found errors (defined below) in 59% of 4,800 stories across 14 metro newspapers. This level of inaccuracy — where about one in every two articles contains an error — has persisted for as long as news accuracy has been studied, over seven decades now.
  • With the explosion of available information, more than ever it’s time to get serious about accuracy, about knowing which sources can be trusted. Fortunately, there are emerging techniques that might help us to measure media accuracy cheaply, and then increase it.
  • We could continuously sample a news source’s output to produce ongoing accuracy estimates, and build social software to help the audience report and filter errors. Meticulously applied, this approach would give a measure of the accuracy of each information source, and a measure of the efficiency of their corrections process (currently only about 3% of all errors are corrected.)
  • Real world reporting isn’t always clearly “right” or “wrong,” so it will often be hard to decide whether something is an error or not. But we’re not going for ultimate Truth here,  just a general way of measuring some important aspect of the idea we call “accuracy.” In practice it’s important that the error counting method is simple, clear and repeatable, so that you can compare error rates of different times and sources.
  • Subjective errors, though by definition involving judgment, should not be dismissed as merely differences in opinion. Sources found such errors to be about as common as factual errors and often more egregious [as rated by the sources.] But subjective errors are a very complex category
  • One of the major problems with previous news accuracy metrics is the effort and time required to produce them. In short, existing accuracy measurement methods are expensive and slow. I’ve been wondering if we can do better, and a simple idea comes to mind: sampling. The core idea is this: news sources could take an ongoing random sample of their output and check it for accuracy — a fact check spot check
  • Standard statistical theory tells us what the error on that estimate will be for any given number of samples (If I’ve got this right, the relevant formula is the standard error of a population proportion estimate without replacement.) At a sample rate of a few stories per day, daily estimates of error rate won’t be worth much. But weekly and monthly aggregates will start to produce useful accuracy estimates. (A worked sketch of this calculation appears after this entry.)
  • the first step would be admitting how inaccurate journalism has historically been. Then we have to come up with standardized accuracy evaluation procedures, in pursuit of metrics that capture enough of what we mean by “true” to be worth optimizing. Meanwhile, we can ramp up the efficiency of our online corrections processes until we find as many useful, legitimate errors as possible with as little staff time as possible. It might also be possible to do data mining on types of errors and types of stories to figure out if there are patterns in how an organization fails to get facts right.
  • I’d love to live in a world where I could compare the accuracy of information sources, where errors got found and fixed with crowd-sourced ease, and where news organizations weren’t shy about telling me what they did and did not know. Basic factual accuracy is far from the only measure of good journalism, but perhaps it’s an improvement over the current sad state of affairs
  •  
    Professional journalism is supposed to be "factual," "accurate," or just plain true. Is it? Has news accuracy been getting better or worse in the last decade? How does it vary between news organizations, and how do other information sources rate? Is professional journalism more or less accurate than everything else on the internet? These all seem like important questions, so I've been poking around, trying to figure out what we know and don't know about the accuracy of our news sources. Meanwhile, the online news corrections process continues to evolve, which gives us hope that the news will become more accurate in the future.
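To make the sampling arithmetic above concrete, here is a small Python sketch (not from Stray's post) of the estimate he describes: the standard error of a population proportion sampled without replacement, i.e. with the finite population correction. The newsroom numbers in the example are hypothetical.

    import math

    def error_rate_estimate(errors_found, sample_size, stories_published):
        """Point estimate and standard error for the share of stories containing
        an error, spot-checked without replacement from everything published."""
        p_hat = errors_found / sample_size
        se = math.sqrt(p_hat * (1 - p_hat) / sample_size)
        fpc = math.sqrt((stories_published - sample_size) / (stories_published - 1))
        return p_hat, se * fpc

    if __name__ == "__main__":
        # Hypothetical month: 300 stories published, 30 spot-checked, 12 found to contain an error.
        p_hat, se = error_rate_estimate(errors_found=12, sample_size=30, stories_published=300)
        print(f"estimated error rate: {p_hat:.2f} +/- {1.96 * se:.2f} (95% confidence)")
        # Prints roughly: estimated error rate: 0.40 +/- 0.17 (95% confidence)

As the post notes, a handful of checks per day says little on its own; the estimate only becomes useful once samples aggregate over weeks or months.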