
entreprise2.0: Group items tagged knowledge


Christophe Deschamps

Beyond Enterprise 2.0 ROI, evaluation and management of knowledge in the workplace - 0 views

  • It is common knowledge that “what you can’t measure, you can’t manage”. And because knowledge is intangible by nature, it is not measurable and therefore not manageable. This argument rests on a fundamental law of science. Consequently, the only way to move forward is to rematerialise knowledge, which we do by transforming knowledge into information or data.
  • Social computing helps transform tacit knowledge into formal transferable knowledge. This is why social software fundamentally complements existing organisational information architecture, as well as provides a constructive replacement for email, which is often considered a silo because of its overtly individualistic nature.
  • Today, ROI is the iconic, easy-to-grasp shorthand for a much more significant concern: evaluation. ROI is one tiny piece of a much bigger puzzle. It is an indicative ratio commonly used to anticipate the financial impact of decisions, and a simplistic rendering of a very complex set of parameters.
  • ...3 more annotations...
  • New metrics. Because we deal with different stuff, we need to invent metrics that are relevant to what we are trying to follow and drive. For social software, one can start with the usual web and online community metrics. Some new initiatives, such as Me-trics, open doors to more in-depth analytics that are worth considering (with a barrage of ethical considerations, however).
  • In fact, calculating the ROI of social software is so complicated that it is economically unrealistic to do so. Unless it is an a posteriori estimate made after a pilot phase, ROI as it is commonly invoked on the “Enterprise 2.0” scene is pure guesswork and, in most cases, nonsense.
  • Why Balanced Score Cards? For four reasons:
    1. Kaplan & Norton have escaped the conflation of measurement and quantity. Measurement is not necessarily quantitative; that confusion is a common source of inefficiency in numerous areas of human activity (to name a few: reporting and its drive for exhaustiveness, research methodology, education and elite creation via selection on maths). Measurement can be qualitative (see Georgescu-Roegen's work if you are curious). It is no surprise that numerous intellectual-capital initiatives have used Balanced Score Cards.
    2. Balanced Score Cards are notably visual, which quantitative ratios are not. That visual characteristic invites greater meaning and relevance.
    3. Balanced Score Cards are heterogeneous and are therefore a more natural receptacle for (a) qualitative as well as quantitative analytics and (b) a variety of topics. In this regard, one can build official reporting encompassing both physical and knowledge activities.
    4. Balanced Score Cards are aggregative, so reports can be built from various levels in the organisation. Coupled with their heterogeneous nature (previous point), one can build reports for HR, Marketing, Finance and so on in the same format and surface analytics at one or many levels. The result is that knowledge-related metrics can climb the hierarchy up to the summit. (A minimal illustration follows this list.)
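
A minimal Python sketch of how such a heterogeneous, aggregative scorecard could work: qualitative ratings and quantitative community metrics sit side by side and roll up from teams to higher organisational levels. All metric names and figures here are invented for illustration; they are not taken from Kaplan & Norton or from the article.

    from dataclasses import dataclass, field

    @dataclass
    class Metric:
        name: str
        value: float              # a quantitative figure, or a qualitative rating mapped onto a 0-5 scale
        qualitative: bool = False

    @dataclass
    class Scorecard:
        unit: str                                        # a team, a department, or the whole organisation
        metrics: list = field(default_factory=list)      # Metric instances reported by this unit
        children: list = field(default_factory=list)     # Scorecards of the sub-units

        def rollup(self, name):
            """Average a named metric over this unit and every sub-unit that reports it."""
            values = [m.value for m in self.metrics if m.name == name]
            for child in self.children:
                child_value = child.rollup(name)
                if child_value is not None:
                    values.append(child_value)
            return sum(values) / len(values) if values else None

    # Hypothetical figures: knowledge-related metrics climbing from teams to a department.
    team_a = Scorecard("Team A", [Metric("wiki_edits_per_person", 12),
                                  Metric("knowledge_sharing_rating", 4, qualitative=True)])
    team_b = Scorecard("Team B", [Metric("wiki_edits_per_person", 7),
                                  Metric("knowledge_sharing_rating", 3, qualitative=True)])
    rnd = Scorecard("R&D", children=[team_a, team_b])

    print(rnd.rollup("wiki_edits_per_person"))       # 9.5
    print(rnd.rollup("knowledge_sharing_rating"))    # 3.5

The same structure accepts the usual web and online-community metrics alongside qualitative assessments, which is what makes the scorecard a single receptacle for both kinds of analytics.
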
Gregory Culpin

Using Enterprise 2.0 to prepare for recovery (part II) - Whitepaper to download - 0 views

  •  
    In a business world where change is constant, knowledge becomes an essential asset for any organization. Survival and growth require the development of solutions that optimize collaboration and knowledge management. Focusing on this topic, we recently produced our first whitepaper. It analyses the benefits associated with the introduction of Enterprise 2.0 solutions and positions the collaborative management of knowledge as a stable and lasting solution, especially in these times of economic tumult.
Bertrand Duperrin

Management and Virtual Decentralised Networks: The Linux Project - 0 views

  •  
    This paper examines the latest of paradigms - the Virtual Network(ed) Organisation - and whether geographically dispersed knowledge workers can virtually collaborate for a project under no central planning. Co-ordination, management and the role of knowledge arise as the central areas of focus. The Linux Project and its development model are selected as a case of analysis and the critical success factors of this organisational design are identified. The study proceeds to the formulation of a framework that can be applied to all kinds of virtual decentralised work and concludes that value creation is maximized when there is intense interaction and uninhibited sharing of information between the organisation and the surrounding community. Therefore, the potential success or failure of this organisational paradigm depends on the degree of dedication and involvement by the surrounding community.
Christophe Deschamps

How companies are benefiting from web 2.0: McKinsey Global Survey Results - 0 views

  • 69 percent of respondents report that their companies have gained measurable business benefits, including more innovative products and services, more effective marketing, better access to knowledge, lower cost of doing business, and higher revenues. Companies that made greater use of the technologies, the results show, report even greater benefits.
  • We found that successful companies not only tightly integrate Web 2.0 technologies with the work flows of their employees but also create a “networked company,” linking themselves with customers and suppliers through the use of Web 2.0 tools. Despite the current recession, respondents overwhelmingly say that they will continue to invest in Web 2.0.
  • When we asked respondents about the business benefits their companies have gained as a result of using Web 2.0 technologies, they most often report greater ability to share ideas; improved access to knowledge experts; and reduced costs of communications, travel, and operations. Many respondents also say Web 2.0 tools have decreased the time to market for products and have had the effect of improving employee satisfaction.
  • ...15 more annotations...
  • Respondents also say they have been able to burnish their innovation skills, perhaps because their companies and customers jointly shape and cocreate products using Web 2.0 connections.
  • The median level of gains derived from internal Web 2.0 use ranged from a 10 percent improvement in operational costs to a 30 percent increase in the speed at which employees are able to tap outside experts.
  • Web 2.0 delivers benefits by multiplying the opportunities for collaboration and by allowing knowledge to spread more effectively. These benefits can accrue through companies’ use of automatic information feeds such as RSS or microblogs, of which Twitter is the most popular manifestation. Although many companies use a mix of tools, the survey shows that among all respondents deriving benefits, the more heavily used technologies are blogs, wikis, and podcasts—the same tools that are popular among consumers.
  • Similarly, among those capturing benefits in their dealings with suppliers and partners, the tools of choice again are blogs, social networks, and video sharing. While respondents tell us that tapping expert knowledge from outside is their top priority, few report deploying prediction markets to harvest collective insights from these external networks.
  • Comparing respondents’ industries, those at high-technology companies are most likely to report measurable benefits from Web 2.0 across the board, followed by those at companies offering business, legal, and professional services.
  • These survey results indicate that a different type of company may be emerging: one that makes intensive use of interactive technologies. This networked organization is characterized both by the internal integration of Web tools among employees and by the use of these technologies to strengthen company ties with external stakeholders: customers and business partners.
  • As such, companies reporting business benefits also report high levels of Web 2.0 integration into employee workflows. They most often deploy three or more Web tools, and usage is high throughout these organizations.
  • Respondents reporting measurable benefits say their companies, on average, have Web 2.0 interactions with 35 percent of their customers. These companies forged similar Web ties to 48 percent of their suppliers, partners, and outside experts. An organizational structure that’s more porous and networked may make companies more resilient and adaptive, sharpening their ability to access knowledge and thus innovate more effectively.
  • The survey results confirm that successful adoption requires that the use of these tools be integrated into the flow of users’ work (Exhibit 5). Furthermore, encouraging continuing use requires approaches other than the traditional financial or performance incentives deployed as motivational tools.
  • They also say role modeling—active Web use by executives—has been important for encouraging adoption internally.
    • Christophe Deschamps
       
      Cf. the president of Cisco.
  •  
    Enterprise 2.0 is not just a concept, and this survey of 1,700 executives proves it.
Christophe Deschamps

"C" Words - 0 views

  • Generation of KM | Where Knowledge Lives | Type of Knowledge | Implications
    First Generation | Artifacts | Explicit | Create the infrastructure for capturing, collecting, refining, and re-using artifacts
    Second Generation | Individuals | Tacit | Focus on collaborative behaviors and person-to-person knowledge sharing
    Third Generation | The network | Emergent | Provide the conditions for enabling knowledge and action to emerge.
  •  
    Summary of Dave Pollard's talk at the Enterprise 2.0 conference.
Yan Thoinet

» Nine ideas for IT managers considering Enterprise 2.0 | Enterprise Web 2.0 ... - 1 views

  • In addition to Web 2.0 itself, however, we have two more important enterprise software trends: Office 2.0 and Enterprise 2.0, coined by Ismael Ghalimi and Andrew McAfee respectively.  Office 2.0 represents the increasing use of browser-based software in the office, while Enterprise 2.0 is more Web 2.0-ish in that it specifically describes the use of freeform, emergent, social software to conduct collaboration and share knowledge.
  • Specifically, this means that corporate information tends to be non-shared by default, that the easiest productivity tools to use are the ones that have very little collaboration built in, and that the information that does exist is often impossible to find and is often structured in some formal, centrally controlled way.
    • Yan Thoinet
       
      Very true.
  • Certainly, increased transparency, some loss of control over information flow, and outright abuse of low-barrier Intranet publishing tools give enterprise IT and business leaders pause for thought.
  • ...16 more annotations...
  • And while some of it must remain under strict control, particularly in public companies, much of it is unnecessarily, and usually to a fault, hidden, unreused, and unexploited.
    • Yan Thoinet
       
      Unexploited sources. Action: implement a wiki to share this wealth of information (e.g. manuals, meeting agendas, meeting minutes) and keep it up to date. It would act as the memory of the enterprise.
  • Explain the reasoning behind retaining more knowledge, in making it public, searchable, and organizing it via tagging.  Describe the benefits of being able to access much fresher and more up-to-date information elsewhere in the organization because their colleagues are managing more of their projects, tasks, and other work via social tools. 
  • Provide useful templates for common activities and reference material such as projects, tasks, resource management, policies, procedures, standards, and so on. You still have to keep template layouts and template usage simple; excessive structure tends to kill the golden goose of contributions quickly. But a little basic structure goes a long way: it saves contributors from having to figure out how to structure all the white space and provides a simple layer of consistency.
  • The enterprise has not caught up, largely because most enterprise information doesn't allow a hyperlink structure, and links aren't encouraged very much when it does.
  • Setting up blog and wiki directories as well as good enterprise search based on link ranking (which is what Google does to make the right information come up in the first few pages of search results). (A toy sketch of link ranking follows this list.)
  • Provide your own search engine in the tools only if you must.
  • This boils down to having some form of moderation, either human or automated, to ensure that the level of discourse remains at some bare minimum acceptable standard.
  • A high-profile executive sponsor that obviously uses the tools can also help in a big way.
  • Triggering an Enterprise 2.0 ecosystem quickly is likely an early activity driver. This can mean a lot of things, but the link structure of Web tools allows information to quickly flow, circulate, and mesh together. You can leverage this in an almost infinite number of ways to drive user activity and interesting content, create awareness of what the company is "thinking", and more. For example: create a blog for every employee in the company and mail them the link with instructions on how to use it; create a social bookmarking site for the enterprise where everyone can see what is being bookmarked by everyone else that day; create an internal Wikipedia that contains a separate copy of all Intranet content and let users edit away. The possibilities are endless and provide a much greater number of "entry points" where people can get started with these tools.
  • The problems will be with the business culture, not the technology. 
  • The real issue, day in and day out, with getting Enterprise 2.0 to take off is to educate, evangelize, demonstrate, and most importantly, evolve the interface and structure of your tools until you pick the right formula that resonates with your audience.
  • Allowing the output of SQL queries to be inserted into wikis when they load, calling Web services, or using Flash badges that access data resources can turn Enterprise 2.0 tools from pure knowledge management into actual hybrids of software and data. (A brief data-embedding sketch also follows this list.)
  • And the reverse should be true as well: getting data back out into traditional tools, including Office documents, PDFs, and XML, must be easy in order to inspire trust and lower barriers to use.
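
As a toy illustration of the link-ranking idea mentioned above, here is a minimal PageRank-style scorer over an invented intranet link graph. It sketches the general technique the post alludes to, not Google's actual algorithm or any particular product's search engine; the page names and links are hypothetical.

    def link_rank(links, damping=0.85, iterations=50):
        """Score pages by their incoming links, PageRank-style. In this simplified
        toy version, pages with no outgoing links simply drop their share each round."""
        pages = set(links) | {p for targets in links.values() for p in targets}
        rank = {p: 1.0 / len(pages) for p in pages}
        for _ in range(iterations):
            new_rank = {p: (1 - damping) / len(pages) for p in pages}
            for page, targets in links.items():
                if targets:
                    share = damping * rank[page] / len(targets)
                    for target in targets:
                        new_rank[target] += share
            rank = new_rank
        return rank

    # Invented intranet link graph: wiki pages and blog posts linking to one another.
    intranet = {
        "wiki/ProjectAlpha": ["blog/alice/kickoff", "wiki/Standards"],
        "blog/alice/kickoff": ["wiki/ProjectAlpha"],
        "blog/bob/retrospective": ["wiki/ProjectAlpha", "wiki/Standards"],
        "wiki/Standards": [],
    }
    for page, score in sorted(link_rank(intranet).items(), key=lambda kv: -kv[1]):
        print(f"{score:.3f}  {page}")

Pages that many colleagues link to (here wiki/ProjectAlpha) float to the top, which is what makes heavily cross-linked blogs and wikis searchable in a way siloed documents are not.
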
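The point about turning wiki pages into hybrids of software and data can also be sketched briefly. The {{sql: ...}} placeholder syntax below is invented for the example and is not the macro syntax of any real wiki engine; an in-memory SQLite table stands in for an enterprise data source.

    import re
    import sqlite3

    def render_wiki_page(template, connection):
        """Replace each {{sql: ...}} placeholder with the rows returned by its query."""
        def run(match):
            rows = connection.execute(match.group(1)).fetchall()
            return "\n".join(" | ".join(str(value) for value in row) for row in rows)
        return re.sub(r"\{\{sql:(.+?)\}\}", run, template, flags=re.DOTALL)

    # Demo: an in-memory database standing in for an enterprise data source.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE projects (name TEXT, status TEXT)")
    db.executemany("INSERT INTO projects VALUES (?, ?)",
                   [("Alpha", "on track"), ("Beta", "at risk")])

    page = "== Project status ==\n{{sql: SELECT name, status FROM projects }}"
    print(render_wiki_page(page, db))

Rendering the query at page-load time keeps the wiki text stable while the embedded figures stay current, which is the hybrid behaviour the bullet describes.
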
Pascal Bernardon

Competence06.com - Sophia : nouvelle étape pour KMP (Knowledge Management Pla... - 0 views

  • A vast project, as Catherine Thomas of Rodige (CNRS) had pointed out: "we must build a knowledge management solution to foster partnerships between firms and research institutions in the telecommunications field. To do that, we must also manage to capture the partnership know-how of the firms and the labs: the procedures, the goals, the factors taken into account."
Christophe Deschamps

Open-Source Spying - 0 views

  • The spy agencies were saddled with technology that might have seemed cutting edge in 1995.
  • Theoretically, the intelligence world ought to revolve around information sharing. If F.B.I. agents discover that Al Qaeda fund-raising is going on in Brooklyn, C.I.A. agents in Europe ought to be able to know that instantly.
  • Analysts also did not worry about anything other than their corners of the world.
  • ...57 more annotations...
  • When the Orange Revolution erupted in Ukraine in late 2004, Burton went to Technorati, a search engine that scours the “blogosphere,” to find the most authoritative blog postings on the subject. Within minutes, he had found sites with insightful commentary from American expatriates who were talking to locals in Kiev and on-the-fly debates among political analysts over what it meant. Because he and his fellow spies were stuck with outdated technology, they had no comparable way to cooperate — to find colleagues with common interests and brainstorm online.
  • Indeed, throughout the intelligence community, spies are beginning to wonder why their technology has fallen so far behind — and talk among themselves about how to catch up. Some of the country’s most senior intelligence thinkers have joined the discussion, and surprisingly, many of them believe the answer may lie in the interactive tools the world’s teenagers are using to pass around YouTube videos and bicker online about their favorite bands.
  • Perhaps, they argue, it’s time to try something radically different. Could blogs and wikis prevent the next 9/11?
  • during the cold war, threats formed slowly. The Soviet Union was a ponderous bureaucracy that moved at the glacial speed of the five-year plan. Analysts studied the emergence of new tanks and missiles, pieces of hardware that took years to develop.
  • Writing reports was thus a leisurely affair, taking weeks or months; thousands of copies were printed up and distributed via interoffice mail. If an analyst’s report impressed his superiors, they’d pass it on to their superiors, and they to theirs — until, if the analyst was very lucky, it landed eventually in the president’s inner circle.
  • The F.B.I. terminals were connected to one another — but not to the computers at any other agency, and vice versa.
  • If an analyst requested information from another agency, that request traveled through elaborate formal channels. The walls between the agencies were partly a matter of law.
  • Islamist terrorists, as 9/11 proved, behaved utterly unlike the Soviet Union. They were rapid-moving, transnational and cellular.
  • To disrupt these new plots, some intelligence officials concluded, American agents and analysts would need to cooperate just as fluidly — trading tips quickly among agents and agencies. Following the usual chain of command could be fatal. “To fight a network like Al Qaeda, you need to behave like a network,” John Arquilla,
  • This control over the flow of information, as the 9/11 Commission noted in its final report, was a crucial reason American intelligence agencies failed to prevent those attacks. All the clues were there — Al Qaeda associates studying aviation in Arizona, the flight student Zacarias Moussaoui arrested in Minnesota, surveillance of a Qaeda plotting session in Malaysia — but none of the agents knew about the existence of the other evidence. The report concluded that the agencies failed to “connect the dots.”
  • Spies, Andrus theorized, could take advantage of these rapid, self-organizing effects. If analysts and agents were encouraged to post personal blogs and wikis on Intelink — linking to their favorite analyst reports or the news bulletins they considered important — then mob intelligence would take over.
  • Pieces of intel would receive attention merely because other analysts found them interesting. This grass-roots process, Andrus argued, suited the modern intelligence challenge of sifting through thousands of disparate clues: if a fact or observation struck a chord with enough analysts, it would snowball into popularity, no matter what their supervisors thought.
  • What most impressed Andrus was Wikipedia’s self-governing nature. No central editor decreed what subjects would be covered. Individuals simply wrote pages on subjects that interested them — and then like-minded readers would add new facts or fix errors.
  • He pointed out that the best Internet search engines, including Google, all use “link analysis” to measure the authority of documents.
  • Each agency had databases to amass intelligence, but because of the air gap, other agencies could not easily search them. The divisions were partly because of turf battles and partly because of legal restrictions — but they were also technological.
  • This, Burton pointed out, is precisely the problem with Intelink. It has no links, no social information to help sort out which intel is significant and which isn’t. When an analyst’s report is posted online, it does not include links to other reports, even ones it cites.
  • “Analytical puzzles, like terror plots, are often too piecemeal for individual brains to put together. Having our documents aware of each other would be like hooking several brains up in a line, so that each one knows what the others know, making the puzzle much easier to solve.”
  • With Andrus and Burton’s vision in mind, you can almost imagine how 9/11 might have played out differently. In Phoenix, the F.B.I. agent Kenneth Williams might have blogged his memo noting that Al Qaeda members were engaging in flight-training activity. The agents observing a Qaeda planning conference in Malaysia could have mentioned the attendance of a Saudi named Khalid al-Midhar; another agent might have added that he held a multi-entry American visa. The F.B.I. agents who snared Zacarias Moussaoui in Minnesota might have written about their arrest of a flight student with violent tendencies. Other agents and analysts who were regular readers of these blogs would have found the material interesting, linked to it, pointed out connections or perhaps entered snippets of it into a wiki page discussing this new trend of young men from the Middle East enrolling in pilot training.
    • Christophe Deschamps
       
      Perhaps a little idealistic?
  • “The 16 intelligence organizations of the U.S. are without peer. They are the best in the world. The trick is, are they collectively the best?”
  • in a system like this, as Andrus’s theory goes, the dots are inexorably drawn together. “Once the intelligence community has a robust and mature wiki and blog knowledge-sharing Web space,”
  • From now on, Meyerrose said, each agency would have to build new systems using cheaper, off-the-shelf software so they all would be compatible. But bureaucratic obstacles were just a part of the problem Meyerrose faced. He was also up against something deeper in the DNA of the intelligence services. “We’ve had this ‘need to know’ culture for years,” Meyerrose said. “Well, we need to move to a ‘need to share’ philosophy.”
  • In the fall of 2005, they joined forces with C.I.A. wiki experts to build a prototype of something called Intellipedia, a wiki that any intelligence employee with classified clearance could read and contribute to.
  • By the late summer, Fingar decided the Intellipedia experiment was sufficiently successful that he would embark on an even more high-profile project: using Intellipedia to produce a “national intelligence estimate” for Nigeria. An N.I.E. is an authoritative snapshot of what the intelligence community thinks about a particular state — and a guide for foreign and military policy.
  • But it will also, Fingar hopes, attract contributions from other intelligence employees who have expertise Fingar isn’t yet aware of — an analyst who served in the Peace Corps in Nigeria, or a staff member who has recently traveled there.
  • In the traditional method of producing an intelligence estimate, Fingar said, he would call every agency and ask to borrow their Africa expert for a week or two of meetings. “And they’d say: ‘Well, I only got one guy who can spell Nigeria, and he’s traveling. So you lose.’ ” In contrast, a wiki will “change the rules of who can play,” Fingar said, since far-flung analysts and agents around the world could contribute, day or night.
  • Intelink allows any agency to publish a Web page, or put a document or a database online, secure in the knowledge that while other agents and analysts can access it, the outside world cannot.
  • Rasmussen notes that though there is often strong disagreement and debate on Intellipedia, it has not yet succumbed to the sort of vandalism that often plagues Wikipedia pages, including the posting of outright lies. This is partly because, unlike with Wikipedia, Intellipedia contributors are not anonymous. Whatever an analyst writes on Intellipedia can be traced to him. “If you demonstrate you’ve got something to contribute, hey, the expectation is you’re a valued member,” Fingar said. “You demonstrate you’re an idiot, that becomes known, too.”
  • So why hasn’t Intelink given young analysts instant access to all secrets from every agency? Because each agency’s databases, and the messages flowing through their internal pipelines, are not automatically put onto Intelink. Agency supervisors must actively decide what data they will publish on the network — and their levels of openness vary.
  • It would focus on spotting and predicting possible avian-flu outbreaks and function as part of a larger portal on the subject to collect information from hundreds of sources around the world, inside and outside of the intelligence agencies.
  • Operational information — like details of a current covert action — is rarely posted, usually because supervisors fear that a leak could jeopardize a delicate mission.
  • “See, these people would never have been talking before, and we certainly wouldn’t have heard about it if they did,” the assistant said. By September, the site had become so loaded with information and discussion that Rear Adm. Arthur Lawrence, a top official in the health department, told Meyerrose it had become the government’s most crucial resource on avian flu.
  • Intelink has grown to the point that it contains thousands of agency sites and several hundred databases. Analysts at the various agencies generate 50,000 official reports a year, many of which are posted to the network. The volume of material online is such that analysts now face a new problem: data overload. Even if they suspect good information might exist on Intelink, it is often impossible to find it. The system is poorly indexed, and its internal search tools perform like the pre-Google search engines of the ’90s.
  • But Meyerrose insists that the future of spying will be revolutionized as much by these small-bore projects as by billion-dollar high-tech systems. Indeed, he says that overly ambitious projects often result in expensive disasters, the way the F.B.I.’s $170 million attempt to overhaul its case-handling software died in 2005 after the software became so complex that the F.B.I. despaired of ever fixing the bugs and shelved it. In contrast, the blog software took only a day or two to get running. “We need to think big, start small and scale fast,” Meyerrose said.
  • But the agency’s officials trained only small groups of perhaps five analysts a month. After they finished their training, those analysts would go online, excited, and start their blogs. But they’d quickly realize no one else was reading their posts aside from the four other people they’d gone through the training with. They’d get bored and quit blogging, just as the next trainees came online.
  • This presents a secrecy paradox. The Unclassified Intellipedia will have the biggest readership and thus will grow the most rapidly; but if it’s devoid of truly sensitive secrets, will it be of any use?
  • Many in the intelligence agencies suspect not. Indeed, they often refuse to input sensitive intel into their own private, secure databases; they do not trust even their own colleagues, inside their own agencies, to keep their secrets safe.
  • These are legitimate concerns. After the F.B.I. agent Robert Hanssen was arrested for selling the identities of undercover agents to Russia, it turned out he had found their names by trawling through records on the case-support system.
  • “When you have a source, you go to extraordinary lengths to protect their identities,” I. C. Smith, a 25-year veteran of the bureau, told me. “So agents never trusted the system, and rightly so.”
  • What the agencies needed was a way to take the thousands of disparate, unorganized pieces of intel they generate every day and somehow divine which are the most important.
  • A spy blogosphere, even carefully secured against intruders, might be fundamentally incompatible with the goal of keeping secrets. And the converse is also true: blogs and wikis are unlikely to thrive in an environment where people are guarded about sharing information. Social software doesn’t work if people aren’t social.
  • the C.I.A. set up a competition, later taken over by the D.N.I., called the Galileo Awards: any employee at any intelligence agency could submit an essay describing a new idea to improve information sharing, and the best ones would win a prize.
  • The first essay selected was by Calvin Andrus, chief technology officer of the Center for Mission Innovation at the C.I.A. In his essay, “The Wiki and the Blog: Toward a Complex Adaptive Intelligence Community,”
  • For the intelligence agencies to benefit from “social software,” he said, they need to persuade thousands of employees to begin blogging and creating wikis all at once. And that requires a cultural sea change: persuading analysts, who for years have survived by holding their cards tightly to their chests, to begin openly showing their hands online.
    • Christophe Deschamps
       
      An essential point: start small technologically and big on the human side!
  • Indeed, Meyerrose’s office is building three completely separate versions of Intellipedia for each of the three levels of secrecy: Top Secret, Secret and Unclassified. Each will be placed on a data network configured so that only people with the correct level of clearance can see them — and these networks are tightly controlled, so sensitive information typed into the Top Secret Intellipedia cannot accidentally leak into the Unclassified one.
  • The chat room was unencrypted and unsecured, so anyone could drop in and read the postings or mouth off. That way, Meyerrose figured, he’d be more likely to get drop-ins by engineers from small, scrappy start-up software firms who might have brilliant ideas but no other way to get an audience with intelligence chiefs. The chat room provoked howls of outrage. “People were like, ‘Hold it, can’t the Chinese and North Koreans listen in?’ ” Meyerrose told me. “And, sure, they could. But we weren’t going to be discussing state secrets. And the benefits of openness outweigh the risks.”
  • Fingar says that more value can be generated by analysts sharing bits of “open source” information — the nonclassified material in the broad world, like foreign newspapers, newsletters and blogs. It used to be that on-the-ground spies were the only ones who knew what was going on in a foreign country. But now the average citizen sitting in her living room can peer into the debates, news and lives of people in Iran. “If you want to know what the terrorists’ long-term plans are, the best thing is to read their propaganda — the stuff out there on the Internet,”
  • Beat cops in Indiana might be as likely to uncover evidence of a terror plot as undercover C.I.A. agents in Pakistan. Fiery sermons printed on pamphlets in the U.K. might be the most valuable tool in figuring out who’s raising money for a possible future London bombing. The most valuable spy system is one that can quickly assemble disparate pieces that are already lying around — information gathered by doctors, aid workers, police officers or security guards at corporations.
  • The premise of spy-blogging is that a million connected amateurs will always be smarter than a few experts collected in an elite star chamber; that Wikipedia will always move more quickly than the Encyclopaedia Britannica; that the country’s thousand-odd political bloggers will always spot news trends more quickly than slow-moving journalists in the mainstream media.
  • In three meetings a day, the officials assess all the intel that has risen to their attention — and they jointly decide what the nation’s most serious threats are.
  • The grass roots, they’ve found, are good at collecting threats but not necessarily at analyzing them. If a lot of low-level analysts are pointing to the same inaccurate posting, that doesn’t make it any less wrong.
  • Without the knowledge that comes from long experience, he added, a fledgling analyst or spy cannot know what is important or not. The counterterrorism center, he said, should decide which threats warrant attention.
  • Many of the officials at the very top, like Fingar, Meyerrose and their colleagues at the office of the director of national intelligence, are intrigued by the potential of a freewheeling, smart-mobbing intelligence community. The newest, youngest analysts are in favor of it, too. The resistance comes from the “iron majors” — career officers who occupy the enormous middle bureaucracy of the spy agencies. They might find the idea of an empowered grass roots to be foolhardy; they might also worry that it threatens their turf.
  • “The normal case for social software is failure,” Shirky said. And because Intellipedia is now a high-profile experiment with many skeptics, its failure could permanently doom these sorts of collaborative spy endeavors.
  • It might be difficult to measure contributions to a wiki; if a brilliant piece of analysis emerges from the mob, who gets credit for it?
  • “A C.I.A. officer’s career is advanced by producing reports,”
  • Though D.N.I. officials say they have direct procurement authority over technology for all the agencies, there’s no evidence yet that Meyerrose will be able to make a serious impact on the eight spy agencies in the Department of Defense, which has its own annual $38 billion intelligence budget — the lion’s share of all the money the government spends on spying.
  • if the spies do not join the rest of the world, they risk growing to resemble the rigid, unchanging bureaucracy that they once confronted during the cold war.
  •  
    A New York Times article describing the Intellipedia project in detail, with its advantages, drawbacks and so on. Very interesting as a case study of the deployment of a 2.0 project; the risks and pitfalls are not overlooked. All the more useful since it is probably one of the oldest large-scale projects of this kind to date. 10 pages.
Christophe Gauthier

Ten Leading Business Intelligence Software Solutions - 2 views

  •  
    "Ten Leading Business Intelligence Software Solutions By Thor Olavsrud May 5, 2010 The Business Intelligence software market is shaping up as a David vs. Goliath struggle. Behemoths like Microsoft, Oracle and IBM offer feature-rich BI suites along with their many other enterprise software products. Meanwhile, pure-play business intelligence software vendors -- such as MicroStrategy and Tableau -- have avid followers and are known for innovating around new features and quickly adjusting to the shifting marketplace. Why is this important? Because Business Intelligence software is used to extract data from disparate sources -- spreadsheets, databases and other software programs -- inside companies and then analyze that business data to better understand a firm's internal and external strengths and weaknesses. A business relies heavily on this data. Bottom line: Business intelligence software enables managers to better see the relationship between different data for critical decision-making -- particularly opportunities for innovation, cost reduction and optimal resource deployment. The list below includes ten industry-leading BI solutions, from vendors large and not-so-large. If you're looking for a bird's eye view of this rapidly evolving market, the following condensed portraits should help. Business Intelligence Software: Ten Solutions Note: This list is NOT ordered "best to worst." The question of what business intelligence software solution is best for a given company depends on an entire matrix of factors. This list is simply an overview of BI solutions, with the debate about quality left to individual clients. SAP Crystal Reports Crystal Reports is part of SAP's Business Objects portfolio of business intelligence software solutions. It allows users to graphically design interactive reports and connect them to virtually any data source, Microsoft Excel spreadsheets, Oracle databases, Business Objects Enterprise business views and local file system info
Christophe Deschamps

IndustryWeek : Seven Strategies for Implementing a Successful Corporate Wiki - 0 views

  • 1. Integrate the wiki as one of several important tools in an organization's IT collaboration architecture.
    2. Understand the wiki "rules of conduct" and ensure they are monitored and enforced.
    3. Optimize the use of wikis for collaborative knowledge creation across geographically dispersed employees, and for crossing divisional or functional boundaries, in order to gain insights from people not previously connected.
    4. Assign a champion to each wiki and have that champion observe contributions that people make to the wiki; the champion will help foster employees who adopt the important "shaper" role within the wiki.
    5. Recognize that the most difficult barrier to cross in sustaining a wiki is convincing people to edit others' work; organizations should ask their champion and managers to help with this.
    6. Recognize that a significant value of wikis comes from embedding small software programs into the wiki that structure repetitive behavior. Some include organizing meeting minutes, rolling up project status or scheduling meetings. Ask wiki participants to keep watching for repetitive activity to evolve and enhance wiki technology. (A minimal sketch of such a roll-up follows this entry.)
    7. Understand wikis are best used in work cultures that encourage collaboration. Without an appropriate fit with the workplace culture, wiki technology will be of limited value in sharing knowledge, ideas and practices.
  •  
    The title says it all.
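
A minimal sketch of the kind of small embedded program the sixth strategy mentions: rolling project status up from sub-pages into a single summary. The page names, fields and status values are invented for the example and do not follow any specific wiki product's syntax.

    # Hypothetical sub-pages of a "Projects" wiki space, each declaring a Status: line.
    project_pages = {
        "Projects/Alpha": "Owner: Alice\nStatus: green\nNext milestone: beta release",
        "Projects/Beta": "Owner: Bob\nStatus: red\nNext milestone: security review",
        "Projects/Gamma": "Owner: Carol\nStatus: amber\nNext milestone: pilot rollout",
    }

    def rollup_status(pages):
        """Build a summary section from the Status: line of each project page."""
        lines = ["== Project status roll-up =="]
        for name, text in sorted(pages.items()):
            status = "unknown"
            for line in text.splitlines():
                if line.lower().startswith("status:"):
                    status = line.split(":", 1)[1].strip()
            lines.append(f"* {name.split('/')[-1]}: {status}")
        return "\n".join(lines)

    print(rollup_status(project_pages))
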
Christophe Deschamps

Social Media vs. Knowledge Management: A Generational War - 0 views

  •  
    KM is dead