Goodbye, Dilbert: 'The Rise of the Naked Economy' » Knowledge@Wharton - 2 views

-
The old cubicle-based, static company is increasingly being replaced by a more fluid and mobile model: “the constant assembly, disassembly, and reassembly of people, talent, and ideas around a range of challenges and opportunities.”
-
Therefore, the new economy and its “seminomadic workforce” will require “new places to gather, work, live, and interact.”
- ...17 more annotations...
-
The consumer electronics company Plantronics, for example, knowing that on any given day 40% of its workforce will be working elsewhere, designed its corporate campus to only 60% capacity
-
Their joint enterprise, NextSpace, became their first venture into what they call “coworking,” or the creation of “shared collaborative workspaces.”
-
also nurtures what the authors call “managed serendipity” — ad hoc collaboration between people with diverging but complementary skills
-
Coonerty and Neuner found that the most productive collaborations tended to pair highly specialized experts with big-picture thinkers
-
Clients get the specialized help they need at a cost below that of a full-time employee or traditional consulting firm, and specialists are well compensated and rewarded with flexible schedules and a greater degree of choice about which projects to take.
-
This has produced a new market dynamic in which the headhunter of yesteryear has been replaced by “talent brokers” who connect highly specialized talent with companies on a project-by-project basis
-
Matthew Mullenweg doesn’t have much faith in traditional office buildings or corporate campuses: “I would argue that most offices are full of people not working.”
-
On the other hand, Mullenweg is a big believer in face-to-face collaboration and brainstorming, and flies his teams all over the globe to do so.
-
Additionally, a 2010 Kauffman-Rand study worried that employer-based health insurance, by discouraging risk-taking, will be an ongoing drag on entrepreneurship
Uber, Data Darwinism and the future of work - Tech News and Analysis - 0 views
-
also see http://www.baen.com/chapters/W200011/0671319744___1.htm on this theme of absolute accountability
-
something to keep in mind while designing value equations, understanding the impact of absolute accountability, and ensuring we are measuring what matters, since important decisions will be made based on the metric, whether or not the metric is being used properly in context (for example, using Klout scores to choose among candidates for an engineering job)
POWER-CURVE SOCIETY: The Future of Innovation, Opportunity and Social Equity in the Eme... - 1 views
-
how technological innovation is restructuring productivity and the social and economic impact resulting from these changes
-
concern about the technological displacement of jobs, stagnant middle class income, and wealth disparities in an emerging "winner-take-all" economy
-
personal data ecosystems that could potentially unlock a revolutionary wave of individual economic empowerment
- ...70 more annotations...
-
As the technology boom of the 1990s increased productivity, many assumed that the rising water level of the economy was raising all those middle class boats. But a different phenomenon has also occurred. The wealthy have gained substantially over the past two decades while the middle class has remained stagnant in real income, and the poor are simply poorer.
-
America is turning into a power-curve society: one where there are a relative few at the top and a gradually declining curve with a long tail of relatively poorer people.
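One illustrative way to formalize the “power-curve” shape described here (my own gloss; the report itself gives no formula): incomes follow something closer to a Pareto power law than a bell curve, so a large share of the total sits with the few at the head of the distribution.

```latex
% Illustrative Pareto-style income density; alpha and x_min are generic parameters.
% Smaller alpha = heavier tail = more of total income concentrated at the very top.
p(x) = \frac{\alpha - 1}{x_{\min}}\left(\frac{x}{x_{\min}}\right)^{-\alpha},
\qquad x \ge x_{\min},\quad \alpha > 1
```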
-
For the first time since the end of World War II, the middle class is apparently doing worse, not better, than previous generations.
-
as businesses struggle to come to terms with this revolution, a new set of structural innovations is washing over businesses, organizations and government, forcing near-constant adaptation and change. It is no exaggeration to say that the explosion of innovative technologies and their dense interconnections is inventing a new kind of economy.
-
the new technologies are clearly driving economic growth and higher productivity, the distribution of these benefits is skewed in worrisome ways.
-
the networked economy seems to be producing a “power-curve” distribution, sometimes known as a “winner-take-all” economy
-
major component of this new economy, Big Data, and the coming personal data revolution fomenting beneath it that seeks to put individuals, and not companies or governments, at the forefront. Companies in the power-curve economy rely heavily on big databases of personal information to improve their marketing, product design, and corporate strategies. The unanswered question is whether the multiplying reservoirs of personal data will be used to benefit individuals as consumers and citizens, or whether large Internet companies will control and monetize Big Data for their private gain.
-
A special concern is whether information and communications technologies are actually eliminating more jobs than they are creating—and in what countries and occupations.
-
Is it polarizing income and wealth distributions? How is it changing the nature of work and traditional organizations and altering family and personal life?
-
many observers fear a wave of social and political disruption if a society’s basic commitments to fairness, individual opportunity and democratic values cannot be honored
-
what role government should play in balancing these sometimes-conflicting priorities. How might educational policies, research and development, and immigration policies need to be altered?
-
Conventional economics says that progress comes from new infusions of capital, whether financial, physical or human. But those are not necessarily the things that drive innovation
-
economists have developed a number of proxy metrics for innovation, such as research and development expenditures.
-
Atkinson believes that economists both underestimate and overestimate the scale and scope of innovation.
-
Calculating the magnitude of innovation is also difficult because many innovations now require less capital than they did previously.
-
believes that technological innovation follows the path of an “S-curve,” with a gradual increase accelerating to a rapid, steep increase, before it levels out at a higher level. One implication of this pattern, he said, is that “you maximize the ability to improve technology as it becomes more diffused.” This helps explain why it can take several decades to unlock the full productive potential of an innovation.
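A minimal way to write down the S-curve described here, using a generic logistic form (my notation, not Atkinson's):

```latex
% Logistic (S-curve) model of a technology's capability or diffusion over time.
% L = saturation level, k = steepness, t_0 = midpoint of the rapid-growth phase.
S(t) = \frac{L}{1 + e^{-k\,(t - t_0)}}
```

Improvement is slow at first, fastest around t_0, and levels off as the technology saturates, which matches the point that the ability to improve a technology is maximized once it becomes widely diffused.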
-
innovation keeps getting harder. It was pretty easy to invent stuff in your garage back in 1895. But the technical and scientific challenges today are huge.”
-
costs of innovation have plummeted, making it far easier and cheaper for more people to launch their own startup businesses and pursue their unconventional ideas
-
Atkinson conceded such cost-efficiencies, but wonders if “the real question is that problems are getting more complicated more quickly than the solutions that might enable them.
-
we may need to parse the different stages of innovation: “The cost of innovation generally hasn’t dropped,” he argued. “What has become less expensive is the replication and diffusion of innovation.”
-
A lot of barriers to innovation can be found in the lack of financing, organizational support systems, regulation and public policies.
-
there is a serious mismatch between the pace of innovation unleashed by Moore’s Law and our institutional and social capacity to adapt.
-
This raises the question of whether old institutions can adapt—or whether innovation will therefore arise through other channels entirely. “Existing institutions are often run by followers of conventional wisdom,”
-
The best way to identify new sources of innovation, as Arizona State University President Michael Crow has advised, is to “go to the edge and ignore the center.”
-
Paradoxically, one of the most potent barriers to innovation is the accelerating pace of innovation itself.
-
Part of the problem, he continued, is that our economy is based on “push-based models” in which we try to build systems for scalable efficiencies, which in turn demands predictability.
-
The real challenge is how to achieve radical institutional innovations that prepare us to live in periods of constant two- or three-year cycles of change. We have to be able to pick up new ideas all the time.”
-
The App Economy consists of a core company that creates and maintains a platform (such as Blackberry, Facebook or the iPhone), which in turn spawns an ecosystem of big and small companies that produce apps and/or mobile devices for that platform
-
tied this success back to the open, innovative infrastructure and competition in the U.S. for mobile devices
-
small businesses are becoming more comfortable using such systems to improve their marketing and lower their costs; and, vast new pools of personal data are becoming extremely useful in sharpening business strategies and marketing.
-
Another great boost to innovation in some business sectors is the ability to forge ahead without advance permission or regulation,
-
“In bio-fabs, for example, it’s not the cost of innovation that is high, it’s the cost of regulation,”
-
“In Europe and China, the law holds that unless something is explicitly permitted, it is prohibited. But in the U.S., where common law rather than Continental law prevails, it’s the opposite
On the Phenomenon of Bullshit Jobs - STRIKE! - 1 views
-
financial services or telemarketing, or the unprecedented expansion of sectors like corporate law, academic and health administration, human resources, and public relations
-
provide administrative, technical, or security support for these industries, or for that matter the whole host of ancillary industries (dog-washers, all-night pizza deliverymen) that only exist because everyone else is spending so much of their time working in all the other ones
-
It’s as if someone were out there making up pointless jobs just for the sake of keeping us all working. And here, precisely, lies the mystery. In capitalism, this is exactly what is not supposed to happen
- ...23 more annotations...
-
Sure, in the old inefficient socialist states like the Soviet Union, where employment was considered both a right and a sacred duty, the system made up as many jobs as they had to (this is why in Soviet department stores it took three clerks to sell a piece of meat)
-
working 40 or even 50 hour weeks on paper, but effectively working 15 hours just as Keynes predicted, since the rest of their time is spent organising or attending motivational seminars
-
The ruling class has figured out that a happy and productive population with free time on their hands is a mortal danger
-
And, on the other hand, the feeling that work is a moral value in itself, and that anyone not willing to submit themselves to some kind of intense work discipline for most of their waking hours deserves nothing, is extraordinarily convenient for them
-
Hell is a collection of individuals who are spending the bulk of their time working on a task they don’t like and are not especially good at
-
they all become so obsessed with resentment at the thought that some of their co-workers might be spending more time making cabinets
-
It’s not entirely clear how humanity would suffer were all private equity CEOs, lobbyists, PR researchers, actuaries, telemarketers, bailiffs or legal consultants to similarly vanish. (Many suspect it might markedly improve.)
-
plagued with debts and a newborn daughter, ended up, as he put it, “taking the default choice of so many directionless folk: law school
-
Now he’s a corporate lawyer working in a prominent New York firm. He was the first to admit that his job was utterly meaningless, contributed nothing to the world, and, in his own estimation, should not really exist
-
I would not presume to tell someone who is convinced they are making a meaningful contribution to the world that, really, they are not. But what about those people who are themselves convinced their jobs are meaningless?
-
(Answer: if 1% of the population controls most of the disposable wealth, what we call “the market” reflects what they think is useful or important, not anybody else.)
-
should you meet them at parties and admit that you do something that might be considered interesting (an anthropologist, for example), will want to avoid even discussing their line of work entirely
-
There is a profound psychological violence here. How can one even begin to speak of dignity in labour when one secretly feels one’s job should not exist?
-
Yet it is the peculiar genius of our society that its rulers have figured out a way, as in the case of the fish-fryers, to ensure that rage is directed precisely against those who actually do get to do meaningful work
-
in our society, there seems a general rule that, the more obviously one’s work benefits other people, the less one is likely to be paid for it
-
There’s a lot of questions one could ask here, starting with, what does it say about our society that it seems to generate an extremely limited demand for talented poet-musicians, but an apparently infinite demand for specialists in corporate law?
-
You can see it when tabloids whip up resentment against tube workers for paralysing London during contract disputes: the very fact that tube workers can paralyse London shows that their work is actually necessary, but this seems to be precisely what annoys people
-
It’s even clearer in the US, where Republicans have had remarkable success mobilizing resentment against school teachers, or auto workers (and not, significantly, against the school administrators or auto industry managers who actually cause the problems)
-
It’s as if they are being told “but you get to teach children! Or make cars! You get to have real jobs! And on top of that you have the nerve to also expect middle-class pensions and health care?”
-
If someone had designed a work regime perfectly suited to maintaining the power of finance capital, it’s hard to see how they could have done a better job
-
The remainder are divided between a terrorised stratum of the – universally reviled – unemployed and a larger stratum who are basically paid to do nothing, in positions designed to make them identify with the perspectives and sensibilities of the ruling class (managers, administrators, etc) – and particularly its financial avatars – but, at the same time, foster a simmering resentment against anyone whose work has clear and undeniable social value
Shimana Laser Distance Meter (up to 50m) - 1 views

Laser Distancemeter - Leica Geosystems - Leica Geosystems - 1 views

How To Export STL Files From Google Sketchup - 0 views
www.cerebralmeltdown.com/...index.htm
Google Sketchup CNC machine 3D modeling Lee Nelson Daniel Brastaviceanu

Google Apps Script - introduction - 0 views
code.google.com/...guide_events.html
google script apps Programming javascript tool tools infrastructure

-
Google Apps Script provides simple event handlers and installable event handlers, which are easy ways for you to specify functions to run at a particular time or in response to an event.
- ...39 more annotations...
-
Calendar, Mail, and Sites are not anonymous, and the simple event handlers cannot access those services.
-
Installable event handlers are set on the Triggers menu within the Script Editor, and they're called triggers in this document.
-
The spreadsheet containing the script does not have to be open for the event to be triggered and the script to run.
-
You can connect triggers to one or more functions in a script. Any function can have multiple triggers attached. In addition, you can add trigger attributes to a function to further refine how the trigger behaves.
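A minimal sketch of the two kinds of handlers in Apps Script JavaScript (function and sheet names are mine; per the notes above, the installable trigger itself is created from the Triggers menu in the Script Editor):

```javascript
// Simple event handler: runs automatically when the spreadsheet is opened.
// Simple handlers run anonymously, so they cannot use Calendar, Mail, or Sites.
function onOpen(e) {
  SpreadsheetApp.getActiveSpreadsheet().toast('Script is active.');
}

// Target for an installable trigger (e.g. a time-driven trigger set via the
// Triggers menu). Installable triggers run as the user who installed them,
// and they fire even when the spreadsheet is not open.
function hourlyCleanup() {
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Log');
  if (sheet && sheet.getLastRow() > 1000) {
    sheet.deleteRows(2, sheet.getLastRow() - 1000); // trim the oldest rows beyond ~1000
  }
}
```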
-
When a script runs because of a trigger, the script runs using the identity of the person who installed the trigger, not the identity of the user whose action triggered the event. This is for security reasons.
-
An event is passed to every event handler as the argument (e). You can add attributes to the (e) argument that further define how the trigger works or that capture information about how the script was triggered.
-
an example of a function that sends email to a designated individual containing information captured by a Spreadsheet when a form is submitted.
-
With Google Apps, forms have the option to automatically record the submitter's username, and this is available to the script as e.namedValues["Username"]. Note: e.namedValues are only available for Google Apps domains and not for consumer Google accounts.
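A sketch of the handler described in these notes, assuming the script is bound to the form's response spreadsheet and an installable on-form-submit trigger points at it (the recipient address is a placeholder, and "Username" is only auto-recorded on Google Apps domains, as noted above):

```javascript
// Installable "on form submit" handler: e.namedValues maps each form question
// title to an array of the submitted answers for that row.
function notifyOnFormSubmit(e) {
  var recipient = 'owner@example.com';  // placeholder address
  var username = (e.namedValues['Username'] || ['unknown']).join(', ');
  var lines = [];
  for (var question in e.namedValues) {
    lines.push(question + ': ' + e.namedValues[question].join(', '));
  }
  MailApp.sendEmail(recipient,
                    'New form submission from ' + username,
                    lines.join('\n'));
}
```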
Google Apps Script - introduction - 0 views
code.google.com/...guide_writing_scripts.html
google script apps Programming javascript tool tools infrastructure

-
Use the Script Editor to write and run scripts, to set triggers, and to perform other actions such as sharing scripts.
- ...69 more annotations...
-
use the onOpen event handler in more than one script associated with a particular Spreadsheet, all scripts begin to execute when you open the Spreadsheet and the order in which the scripts are executed is indeterminate.
-
A script cannot currently call or create another script and cannot call functions in another script.
-
You can trigger Apps Script events from links that are embedded in a Google Site. For information about how to do this, see Using Apps Script in Your Site.
-
You also designate whether only you can invoke the service or whether all members of your domain can invoke the service.
-
API includes objects that you use to accomplish tasks such as sending email, creating calendar entries
-
Custom functions and formulas in the spreadsheet execute any time the entire Spreadsheet is evaluated or when the data changes in the function or formula's cell.
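For example, a custom function is just an ordinary Apps Script function that returns a value; once the script is saved it can be called from a cell like a built-in formula (the name here is arbitrary):

```javascript
// Usable in a cell as =DOUBLE(A1); it recalculates when A1 changes or when
// the whole spreadsheet is re-evaluated.
function DOUBLE(value) {
  return value * 2;
}
```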
-
The debugger does not work with custom functions, onEdit functions, event triggers, or scripts running as a service.
-
use the debugger to find errors in scripts that are syntactically correct but still do not function correctly.
-
Functions ending in an underscore (_), for example, internalStuff_(), are treated differently from other functions. You do not see these functions in the Run field in the Script Editor and they do not appear in the Script Manager in the Spreadsheet. You can use the underscore to indicate that users should not attempt to run the function and that the function is available only to other functions.
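A tiny illustration of the underscore convention (names are arbitrary): the trailing underscore keeps the helper out of the Run field and the Script Manager, signalling that it is internal.

```javascript
// Public entry point: visible in the Script Editor's Run field.
function generateReport() {
  var total = sumColumn_('Sales', 1);   // column index 1 = second column
  Logger.log('Column total: ' + total);
}

// Private helper: the trailing underscore hides it from the Run field and the
// Script Manager, so it can only be called from other functions.
function sumColumn_(sheetName, columnIndex) {
  var rows = SpreadsheetApp.getActiveSpreadsheet()
      .getSheetByName(sheetName).getDataRange().getValues();
  var total = 0;
  for (var i = 1; i < rows.length; i++) {  // skip the header row
    total += Number(rows[i][columnIndex]) || 0;
  }
  return total;
}
```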
Workflow Service - Google Apps Script Examples - 0 views
Digital Reality | Edge.org - 0 views
edge.org/...il_gershenfeld-digital-reality
*neilgershenfeld fab labs IoT molecular computing nano star trek replicator

-
When you snap the bricks together, you don't need a ruler to play Lego; the geometry comes from the parts
- ...75 more annotations...
-
In a 3D printer today, what you can make is limited by the size of the machine. The geometry is external
-
the Lego tower is more accurate than the child because the constraint of assembling the bricks lets you detect and correct errors
-
detect and correct state to correct errors to get an exponential reduction in error, which gives you an exponential increase in complexity
-
The last one is when you're done with Lego you don't put it in the trash; you take it apart and reuse it because there's state in the materials. In a forest there's no trash; you die and your parts get disassembled and you're made into new stuff. When you make a 3D print or laser cut, when you're done there's recycling attempts but there's no real notion of reusing the parts
-
The metrology coming from the parts, detecting and correcting errors, joining dissimilar materials, disconnecting, reusing the components
-
On the very smallest scale, the most exciting work on digital fabrication is the creation of life from scratch. The cell does everything we're talking about. We've had a great collaboration with the Venter Institute on microfluidic machinery to load designer genomes into cells. One step up from that we're developing tabletop chip fab instead of a billion dollar fab, using discrete assembly of blocks of electronic materials to build things like integrated circuits in a tabletop process
-
There's a series of books by David Gingery on how to make a machine shop starting with charcoal and iron ore.
-
There are twenty amino acids. With those twenty amino acids you make the motors in the molecular muscles in my arm, you make the light sensors in my eye, you make my neural synapses. The way that works is the twenty amino acids don't encode light sensors, or motors. They’re very basic properties like hydrophobic or hydrophilic. With those twenty properties you can make you. In the same sense, digitizing fabrication in the deep sense means that with about twenty building blocks—conducting, insulating, semiconducting, magnetic, dielectric—you can assemble them to create modern technology
-
By discretizing those three parts we can make all those 500,000 resistors, and with a few more parts everything else.
-
Now, there's a casual sense, which means a computer controls something to make something, and then there's the deep sense, which is coding the materials. Intellectually, that difference is everything but now I'm going to explain why it doesn't matter.
-
Then in turn, the next surprise was they weren't there for research, they weren't there for theses, they wanted to make stuff. I taught additive, subtractive, 2D, 3D, form, function, circuits, programming, all of these skills, not to do the research but just using the existing machines today
-
What they were answering was the killer app for digital fabrication is personal fabrication, meaning, not making what you can buy at Walmart, it’s making what you can't buy in Walmart, making things for a market of one person
-
the Altair was life changing for people like me. It was the first computer you could own as an individual. But it was almost useless
-
It was hard to use but it brought the cost from a million dollars to 100,000 and the size from a warehouse down to a room. What that meant is a workgroup could have one. When a workgroup can have one it meant Ken Thompson and Dennis Ritchie at Bell Labs could invent UNIX—which all modern operating systems descend from—because they didn't have to get permission from a whole corporation to do it
-
At the PC stage what happened is graphics, storage, processing, IO, all of the subsystems got put in a box
-
To line that up with fabrication, MIT's 1952 NC Mill is similar to the million-dollar machines in my lab today. These are the mainframes of fab. You need a big organization to have them. The fab labs I'll tell you about are exactly analogous to the cost and complexity of minicomputers. The machines that make machines I'll tell you about are exactly analogous to the cost and complexity of the hobbyist computers. The research we're doing, which is leading up to the Star Trek Replicator, is what leads to the personal fabricator, which is the integrated unit that makes everything
-
The fab lab is 2 tons, a $100,000 investment. It fills a few thousand square feet, 3D scanning and printing, precision machining, you can make circuit boards, molding and casting tooling, computer controlled cutting with a knife, with a laser, large format machining, composite layup, surface mount rework, sensors, actuators, embedded programming— technology to make technology.
-
Ten years you can just plot this doubling. Today, you can send a design to a fab lab and you need ten different machines to turn the data into something. Twenty years from now, all of that will be in one machine that fits in your pocket.
-
We've been living with this notion that making stuff is an illiberal art for commercial gain and it's not part of the means of expression. But, in fact, today, 3D printing, micromachining, and microcontroller programming are as expressive as painting paintings or writing sonnets but they're not means of expression from the Renaissance. We can finally fix that boundary between art and artisans
-
Over the next maybe five years we'll be transitioning from buying machines to using machines to make machines. Self-reproducing machines
-
But they still have consumables like the motors, and they still cut or squirt. Then the interesting transition comes when we go from cutting or printing to assembling and disassembling, to moving to discretely assembled materials
-
because if anybody can make anything anywhere, it challenges everything
-
Now, the biggest surprise for me in this is I thought the research was hard. It's leading to how to make the Star Trek Replicator. The insight now is that's an exercise in embodied computation—computation in materials, programming their construction. Lots of work to come, but we know what to do
-
And that's when you do tabletop chip fab or make airplanes. That's when technical trash goes away because you can disassemble.
-
At something like a Maker Faire, there's hall after hall of repeated reinventions of bad 3D printers and there isn't an easy process to take people from easy to hard
-
We started a project out of desperation because we kept failing to succeed in working with existing schools, called the Fab Academy. Now, to understand how that works, MIT is based on scarcity. You assume books are scarce, so you have to go there for the library; you assume tools are scarce, so you have to go there for the machines; you assume people are scarce, so you have to go there to see them; and geography is scarce. It adds up to we can fit a few thousand people at a time. For those few thousand people it works really well. But the planet is a few billion people. We're off by six orders of magnitude.
-
Next year we're starting a new class with George Church that we've called "How to Grow Almost Anything", which is using fab labs to make bio labs and then teach biotech in it. What we're doing is we're making a new global kind of university
-
Amusingly, I went to my friends at Educause about accrediting the Fab Academy and they said, "We love it. Where are you located?" And I said, "Yes" and they said, "No." Meaning, "We're all over the earth." And they said, "We have no mechanism. We're not allowed to do that. There's no notion of global accreditation."
-
The way the Fab Academy works, in computing terms, it's like the Internet. Students have peers in workgroups, with mentors, surrounded by machines in labs locally. Then we connect them globally by video and content sharing and all of that. It's an educational network. There are these critical masses of groups locally and then we connect them globally
-
You still have Microsoft or IBM now but, with all respect to colleagues there, arguably that's the least interesting part of software
-
To understand the economic and social implications, look at software and look at music to understand what's happening now for fabrication
-
There's a core set of skills a place like MIT can do but it alone doesn't scale to a billion people. This is taking the social engineering—the character of MIT—but now doing it on this global scale.
-
Mainframes didn't go away but what opened up is all these tiers of software development that weren't economically viable
-
If you look at music development, the most interesting stuff in music isn't the big labels, it's all the tiers of music that weren't viable before
-
You can make music for yourself, for one, ten, 100, 1,000, a million. If you look at the tracks on your device, music is now in tiers that weren't economically viable before. In that example it's a string of data and it becomes a sound. Now in digital fab, it's a string of data and it becomes a thing.
-
What is work? For the average person—not the people who write for Edge, but just an average person working—you leave home to go to a place you'd rather not be, doing a repetitive operation you'd rather not do, making something designed by somebody you don't know for somebody you'll never see, to get money to then go home and buy something. But what if you could skip that and just make the thing?
-
It took about ten years for the dot com industry to realize pretty much across the board you don't directly sell the thing. You sell the benefits of the thing
-
2016 it's in Shenzhen because they're pivoting from mass manufacturing to enabling personal fabrication. We've set Shenzhen as the goal in 2016 for Fab Lab 2.0, which is fab labs making fab labs
-
To rewind now, you can send something to Shenzhen and mass manufacture it. There's a more interesting thing you can do, which is you go to market by shipping data and you produce it on demand locally, and so you produce it all around the world.
-
But their point was a lot of printers producing beautiful pages slowly scales if all the pages are different
-
In the same sense it scales to fabricate globally by doing it locally, not by shipping the products but shipping the data.
-
It doesn't replace mass manufacturing but mass manufacturing becomes the least interesting stuff where everybody needs the same thing. Instead, what you open up is all these tiers that weren't viable before
-
There, they consider IKEA the enemy because IKEA defines your taste. Far away they make furniture and flat pack it and send it to a big box store. Great design sense in Barcelona, but 50 percent youth unemployment. A whole generation can't work. Limited jobs. But ships come in from the harbor, you buy stuff in a big box store. And then after a while, trucks go off to a trash dump. They describe it as products in, trash out. Ships come in with products, trash goes out
-
instead of working to get money to buy products made somewhere else, you can make them locally
-
The biggest tool is a ShopBot 4'x8'x1' NC mill, and you can make beautiful furniture with it. That's what furniture shops use
-
it means you can make many of the things you consume directly rather than this very odd remote economic loop
-
the most interesting part of the DIY phone projects is if you're making a do-it-yourself phone, you can also start to make the things that the phones talk to. You can start to build your own telco providers where the users provide the network rather than spending lots of money on AT&T or whoever
-
Traditional manufacturing is exactly replaying the script of the computer companies saying, "That's a toy," and it's shining a light to say this creates entirely new economic activity. The new jobs don't come back to the old factories. The ability to make stuff on demand is creating entirely new jobs
-
To keep playing that forward, when I was in Barcelona for the meeting of all these labs hosted by the city architect and the city, the mayor, Xavier Trias, pushed a button that started a forty-year countdown to self-sufficiency. Not protectionism
-
I need high-torque efficient motors with integrated lead screws at low cost, custom-produced on demand. All sorts of the building blocks that let us do what I'm doing currently rest on a global supply chain including China's manufacturing agility
-
The short-term answer is you can't get rid of them because we need them in the supply chain. But the long-term answer is Shenzhen sees the future isn't mass producing for everybody. That's a transitional stage to producing locally
-
The real thing ultimately that's driving the fab labs ... the vacuum we filled is a technical one. The means to make stuff. Nobody was providing that. But in turn, the spaces become magnets. Everybody talks about innovation or knowledge economy, but then most things that label that strangle it. The labs become vehicles for bright inventive people who don't fit locally. You can think about the culture of MIT but on this global scale
-
My allegiance isn't to any one border, it's to the brainpower of the planet and this is building the infrastructure to scale to that brainpower
-
If you zoom from transistors to microcode to object code to a program, they don't look like each other. But if we take this room and go from city, state, country, it's hierarchical but you preserve geometry
-
The reason that's so important for the digital fabrication piece is once we build molecular assemblers to build arbitrary systems, you don't want to then paste a few lines of code in it. You need to overlay computation with geometry. It's leading to this complete do-over of computer science
-
If you take digital fab, plus the real sense of Internet of Things—not the garbled sense—plus the real future of computing aligning hardware and software, it all adds up to this ability to program reality
-
I run a giant video infrastructure and I have collaborators all over the world that I see more than many of my colleagues at MIT because we're all too busy on campus. The next Silicon Valley is a network, it's not a place. Invention happens in these networks.
-
When Edwin Land was kicked out of Polaroid, he made the Rowland Institute, which was making an ideal research institute with the best facilities and the best people and they could do whatever they want. But almost nothing came from it because there was no turnover of the gene pool, there was no evolutionary pressure.
-
the wrong way to do research, which is to believe there's a privileged set of people that know more than anybody else and to create a barrier that inhibits communication from the inside to the outside
-
you need evolutionary pressure, you need traffic, you need to be forced to deal with people you don't think you need to encounter, and you need to recognize that to be disruptive it helps to know what people know
-
For me the hardest thing isn't the research. That's humming along nicely. It's that we're finding we have to build a completely new kind of social order and that social entrepreneurship—figuring out how you live, learn, work, play—is hard and there's a very small set of people who can do that kind of organizational creation.
The New Normal in Funding University Science | Issues in Science and Technology - 1 views
-
Government funding for academic research will remain limited, and competition for grants will remain high. Broad adjustments will be needed
- ...72 more annotations...
-
systemic problems that arise from the R&D funding system and incentive structure that the federal government put in place after World War II
-
Funding rates in many National Institutes of Health (NIH) and National Science Foundation (NSF) programs are now at historical lows, declining from more than 30% before 2001 to 20% or even less in 2011
-
even the most prominent scientists will find it difficult to maintain funding for their laboratories, and young scientists seeking their first grant may become so overwhelmed that individuals of great promise will be driven from the field
-
The growth of the scientific enterprise on university campuses during the past 60 years is not sustainable and has now reached a tipping point at which old models no longer work
-
Federal funding agencies must work with universities to ensure that new models of funding do not stymie the progress of science in the United States
-
The deeper sources of the problem lie in the incentive structure of the modern research university, the aspirations of scientists trained by those universities, and the aspirations of less research-intensive universities and colleges across the nation
-
if a university wants to attract a significant amount of sponsored research money, it needs doctoral programs in the relevant fields and faculty members who are dedicated to both winning grants and training students
-
Even though not all doctorate recipients become university faculty, the size of the science and engineering faculty at U.S. universities has grown substantially
-
These strategies make sense for any individual university, but will fail collectively unless federal funding for R&D grows robustly enough to keep up with demand.
-
At the very time that universities were enjoying rapidly growing budgets, and creating modes of operation that assumed such largess was the new normal, Price warned that it would all soon come to a halt
-
the human and financial resources invested in science had been increasing much faster than the populations and economies of those regions
-
growth in the scientific enterprise would have to slow down at some point, growing no more than the population or the economy.
-
studies sounded an alarm about the potential decline in U.S. global leadership in science and technology and the grave implications of that decline for economic growth and national security
-
Although we are not opposed to increasing federal funding for research, we are not optimistic that it will happen at anywhere near the rate the Academies seek, nor do we think it will have a large impact on funding rates
-
universities should not expect any radical increases in domestic R&D budgets, and most likely not in defense R&D budgets either, unless the discretionary budgets themselves grow rapidly. Those budgets are under pressure from political groups that want to shrink government spending and from the growth of spending in mandatory programs
-
The basic point is that the growth of the economy will drive increases in federal R&D spending, and any attempt to provide rapid or sustained increases beyond that growth will require taking money from other programs.
-
The demand for research money cannot grow faster than the economy forever and the growth curve for research money flattened out long ago.
-
The goal cannot be to convince the government to invest a higher proportion of its discretionary spending in research
-
Getting more is not in the cards, and some observers think the scientific community will be lucky to keep what it has
-
The potential to take advantage of the infrastructure and talent on university campuses may be a win-win situation for businesses and institutions of higher education.
-
Why should universities and colleges continue to support scientific research, knowing that the financial benefits are diminishing?
-
faculty members are committed to their scholarship and will press on with their research programs even when external dollars are scarce
-
it is critical to have active research laboratories, not only in elite public and private research institutions, but in non-flagship public universities, a diverse set of private universities, and four-year colleges
-
How then do increasingly beleaguered institutions of higher education support the research efforts of the faculty, given the reality that federal grants are going to be few and far between for the majority of faculty members? What are the practical steps institutions can take?
-
change the current model of providing large startup packages when a faculty member is hired and then leaving it up to the faculty member to obtain funding for the remainder of his or her career
-
universities invest less in new faculty members and spread their internal research dollars across faculty members at all stages of their careers, from early to late.
-
national conversation about changes in startup packages and by careful consultations with prospective faculty hires about long-term support of their research efforts
-
Many prospective hires may find smaller startup packages palatable, if they can be convinced that the smaller packages are coupled with an institutional commitment to ongoing research support and more reasonable expectations about winning grants.
-
Smaller startup packages mean that in many situations, new faculty members will not be able to establish a functioning stand-alone laboratory. Thus, space and equipment will need to be shared to a greater extent than has been true in the past.
-
construction of open laboratory spaces and the strategic development of well-equipped research centers capable of efficiently servicing the needs of an array of researchers
-
Collaborative proposals and the assembly of research teams that focus on more complex problems can arise relatively naturally as interactions among researchers are facilitated by proximity and the absence of walls between laboratories.
-
The more likely trajectory of a junior faculty member will evolve from contributing team member to increasing leadership responsibilities to team leader
-
Internal evaluations of contributions and potential will become more important in tenure and promotion decisions.
-
relationships with foundations, donors, state agencies, and private business will become increasingly important in the funding game
-
Further complicating university collaborations with business is that past examples of such partnerships have not always been easy or free of controversy.
-
some faculty members worried about firms dictating the research priorities of the university, pulling graduate students into proprietary research (which could limit what they could publish), and generally tugging the relevant faculty in multiple directions.
-
University faculty and businesspeople often do not understand each other’s cultures, needs, and constraints, and such gaps can lead to more mundane problems in university/industry relations, not least of which are organizational demands and institutional cultures
-
In addition to funding for research, universities can receive indirect benefits from such relationships. High-profile partnerships with businesses will underline the important role that universities can play in the economic development of a region.
-
Universities have to see firms as more than just deep pockets, and firms need to see universities as more than sources of cheap skilled labor.
-
We do not believe that research proposed and supervised by individual principal investigators will disappear anytime soon. It is a research model that has proven to be remarkably successful and enduring
-
However, we believe that the most vibrant scientific communities on university and college campuses, and the ones most likely to thrive in the new reality of funding for the sciences, will be those that encourage the formation of research teams and are nimble with regard to funding sources, even as they leave room for traditional avenues of funding and research.
Why Great Innovations Fail: It's All in the Ecosystem - 0 views
-
Michelin developed a revolutionary new kind of tire with sensors and an internal hard wheel that could run almost perfectly for 125 miles after a puncture.
- ...13 more annotations...
-
Mastery of the ecosystem is the great strength that made Apple the supreme success story of our time,
-
In a world where mobile phone makers sold their devices to operators to sell to consumers, Jobs had such a powerful ecosystem that he could get operators to compete to partner with him: “And here was Apple, offering not just exclusive access to the most talked-about phone in history, but also exclusive access to Apple consumers—the most desirable customer segment imaginable
-
How do you take the measure of the ecosystem that your innovation will need to be part of and rely on? How do you not miss the blind spots that can lurk almost anywhere?
-
There are terrible pitfalls in the usual progression from prototype to pilot to rollout. It relies perilously on getting everything right from the very start. Often a far wiser and safer approach can be what Adner calls a “minimum viable footprint (MVF) rollout followed by a staged expansion.” In other words, start with a complete ecosystem, but a limited one.
Beyond Blockchain: Simple Scalable Cryptocurrencies - The World of Deep Wealth - Medium - 0 views
-
I clarify the core elements of cryptocurrency and outline a different approach to designing such currencies rooted in biomimicry
-
This post outlines a completely different strategy for implementing cryptocurrencies with completely distributed chains
- ...95 more annotations...
-
we are interested in the resilience that comes from building a rich ecosystem of interoperable currencies
-
Holdings are electronic and only exist and operate by virtue of a community’s agreement about how to interpret digital bits according to rules about operation and accounting of the currency.
-
Specifically, access, issuance, transaction accounting, rules & policies, should be collectively visible, known, and held.
-
This cryptographic structure is used to enable a variety of people to host the data without being able to alter it.
-
there must be a way to associate these bits with some kind of account, wallet, owner, or agent who can use them
-
Other things that many take for granted in blockchains may not be core but subject to decisions in design and implementation, so they can vary between implementations
-
does not have to be money. It may be a reputation currency, or data used for identity, or naming, etc
-
Then you must tackle the problem of always tracking which coins exist, and which have been spent. That is one approach — the one blockchain takes.
-
You might optimize for anonymity if you think of cryptocurrency as a tool to escape governments, regulations, and taxes.
-
if you want to establish and manage membership in new kinds of commons, then identity and accountability for actions may turn out to be necessary ingredients instead of anonymity.
-
In the case of the MetaCurrency Project, we are trying to support many use cases by building tools to enable a rich ecosystem of communities and current-sees (many are non-monetary) to enhance collective intelligence at all scales.
-
Managing consensus about a shared reality is a central challenge at the heart of all distributed computing solutions.
-
If we want to democratize money by having cryptocurrencies become a significant and viable means of transacting on a daily basis, I believe we need fundamentally more scalable approaches that don’t require expensive, dedicated hardware just to participate.
-
Blockchain is about managing a consensus about what was “said.” Ceptr is about distributing a consensus about how to “speak.”
-
how nature gets the job done in massively scalable systems which require coordination and consistency
-
Each speaker of a language carries the processes to understand sentences they hear, and generate sentences they need
-
we certainly don’t carry some kind of global ledger of everything that’s ever been said, or require consensus about what has been said
-
there is certainly no global ledger with consensus about the state of trillions of cells. Yet, from a single zygote’s copy of DNA, our cells coordinate in a highly decentralized manner, on scales of trillions, and without the latency or bottlenecks of central control.
-
Imagine something along the lines of a Java Virtual Machine connected to a distributed version of Github
-
Every time this JVM runs a program it confirms the hash of the code it is about to execute with the hash signed into the code repository by its developers
-
This allows each node that intends to be honest to be sure that they’re running the same processes as everyone else. So when two parties want to do a transaction, each can have confidence in their own code and in the results that their code produces
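A minimal sketch of that hash check in JavaScript, using Node's crypto module (the registry object here is a stand-in for whatever signed, shared code repository a real implementation would consult):

```javascript
const crypto = require('crypto');

// Stand-in for the developers' signed registry: application name -> published
// hash of the code. In the architecture described above this would come from
// the distributed, signed code repository.
const publishedHashes = { 'mutual-credit-v1': '<published sha256 hex>' };

// Refuse to execute code whose hash does not match the published one, so two
// transacting nodes know they are running identical rules.
function runIfAuthentic(appName, sourceCode, execute) {
  const actual = crypto.createHash('sha256').update(sourceCode).digest('hex');
  if (actual !== publishedHashes[appName]) {
    throw new Error('Code hash mismatch for ' + appName + '; refusing to run');
  }
  return execute(sourceCode);
}
```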
-
Then you treat it as authoritative and commit it to your local cryptographically self-validating data store
-
Allowing each node to treat itself as a full authority to process transactions (or interactions via shared protocols) is exactly how you empower each node with full agency. Each node runs its copy of the signed program/processes on its own virtual machine, taking the transaction request combined with the transaction chains of the parties to the transaction. Each node can confirm their counterparty’s integrity by replaying their transactions to produce their current state, while confirming signatures and integrity of the chain
-
If both nodes are in an appropriate state which allows the current transaction, then they countersign the transaction and append to their respective chains. When you encounter a corrupted or dishonest node (as evidenced by a breach of integrity of their chain — passing through an invalid state, broken signatures, or broken links), your node can reject the transaction you were starting to process. Countersigning allows consensus at the appropriate scale of the decision (two people transacting in this case) to lock data into a tamper-proof state so it can be stored in as many parallel chains as you need.
-
When your node appends a mutually validated and signed transaction to its chain, it has updated its local state and is able to represent the integrity of its data locally. As long as each transaction (link in the chain) has valid linkages and countersignatures, we can know that it hasn’t been tampered with.
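A sketch of what "valid linkages and countersignatures" might look like in code (JavaScript; the entry layout and the injected verifySig helper are my own simplifications, not Ceptr's actual data model):

```javascript
const crypto = require('crypto');
const sha256 = (data) => crypto.createHash('sha256').update(data).digest('hex');

// Append a mutually signed transaction to this node's local chain.
// `signatures` holds both parties' signatures over the transaction hash.
function appendTransaction(chain, tx, signatures) {
  const txHash = sha256(JSON.stringify(tx));
  const prev = chain.length ? chain[chain.length - 1].hash : '';
  const entry = { prev, tx, sigs: signatures, hash: sha256(prev + txHash) };
  chain.push(entry);
  return entry;
}

// Replay a counterparty's chain: broken linkage, tampered entries, or missing
// countersignatures mean the node is in an invalid state and the transaction
// being negotiated should be refused.
function chainIsValid(chain, verifySig) {
  let prev = '';
  for (const entry of chain) {
    const txHash = sha256(JSON.stringify(entry.tx));
    if (entry.prev !== prev) return false;                  // broken linkage
    if (entry.hash !== sha256(prev + txHash)) return false; // tampered entry
    if (entry.sigs.length < 2) return false;                // not countersigned
    if (!entry.sigs.every((s) => verifySig(txHash, s))) return false;
    prev = entry.hash;
  }
  return true;
}
```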
-
If you can reliably embody the state of the node in the node itself using Intrinsic Data Integrity, then all nodes can interact in parallel, independent of other interactions to maximize scalability and simultaneous processing. Either the node has the credits or it doesn’t. I don’t have to refer to a global ledger to find out, the state of the node is in the countersigned, tamper-proof chain.
-
Just like any meaningful communication, a protocol needs to be established to make sure that a transaction carries all the information needed for each node to run the processes and produce a new signed and chained state. This could be debits or credits to an account which modify the balance, or recoding courses and grades to a transcript which modify a Grade Point Average, or ratings and feedback contributing to a reputation score, and so on.
-
By distributing process at the foundation, and leveraging Intrinsic Data Integrity, our approach results in massive improvements in throughput (from parallel simultaneous independent processing), speed, latency, efficiency, and cost of hardware.
-
Another noteworthy observation about humans, cells, and atoms, is that each has a general “container” that gets configured to a specific use.
-
Likewise, the Receptors we’ve built are a general purpose framework which can load code for different distributed applications. These Receptors are a lightweight processing container for the Ceptr Virtual Machine Host
-
Ceptr enables a developer to focus on the rules and transactions for their use case instead of building a whole framework for distributed applications.
-
Most people think that money is just money, but there are literally hundreds of decisions you can make in designing a currency to target particular needs, niches, communities or patterns of flow.
-
the challenging task of tracking all the coins that exist to ensure there is no counterfeiting or double-spending
-
You wouldn’t need to manage consensus about whether a cryptocoin is spent, if your system created accounts which have normal balances based on summing their transactions.
-
In a mutual credit system, units of currency are issued when a participant extends credit to another user in a standard spending transaction
-
Managing the currency supply in a mutual credit system is about managing credit limits — how far people can spend into a negative balance
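A minimal mutual-credit sketch in JavaScript (the account classes and limits are illustrative, not from the post): a balance is simply the sum of an account's transaction history, and the only supply control is how far each class of account may go negative.

```javascript
// Balance is derived by replaying the account's history; there is no token to
// track, so no global double-spend ledger is needed.
function balanceOf(history) {
  return history.reduce((sum, tx) => sum + tx.amount, 0);
}

// Illustrative credit limits per account class; anonymous accounts get none.
const CREDIT_LIMITS = { anonymous: 0, verified: 500, merchant: 5000 };

// A spend of `amount` is valid only if it would not push the spender below
// -creditLimit. The counterparty runs the same check against the spender's
// chain before countersigning, which is what catches a node that quietly
// raised its own limit.
function canSpend(account, amount) {
  const limit = CREDIT_LIMITS[account.class] || 0;
  return balanceOf(account.history) - amount >= -limit;
}
```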
-
keep in mind there can be different classes of accounts. Easy to create, anonymous accounts may get NO credit limit
-
What if I alter my code to give myself an unlimited credit limit, then spend as much as I want? As soon as you pass the credit limit encoded in the shared agreements, the next person you transact with will discover you’re in an invalid state and refuse the transaction.
-
If two people collude to commit an illegal transaction by both hacking their code to allow a normally invalid state, the same pattern still holds. The next person they try to transact with using untampered code will detect the problem and decline to transact.
-
Hawala is a network of merchants and businessmen, which has been operating since the middle ages, performing money transfers on an honor system and typically settling balances through merchandise instead of transferring money
-
To minimize key management infrastructure, each hawaladar’s public key is their address or identity on the network. To join the network you get a copy of the software from another hawaladar, generate your public and private keys, and complete your personal profile (name, location, contact info, etc.). You call, fax, or email at least 10 hawaladars who know you, and give them your IP address and ask them to vouch for you.
-
Once 10 other hawaladars have vouched for you, you can start doing other transactions because the protocol encoded in every node will reject a transaction chain that doesn’t start with at least 10 vouches
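A sketch of that join rule (JavaScript; the transaction-type and field names are assumptions): every node's copy of the protocol simply refuses to deal with a chain whose opening entries are not at least ten vouches from distinct, known hawaladars.

```javascript
const MIN_VOUCHES = 10;

// `knownHawaladars` is a Set of public keys this node already recognizes;
// `verifySig` checks an entry's signature against a given public key.
function isAdmitted(chain, knownHawaladars, verifySig) {
  const vouchers = new Set();
  for (const entry of chain) {
    if (entry.tx.type !== 'vouch') break;          // vouches must come first
    const voucher = entry.tx.voucherPublicKey;
    if (!knownHawaladars.has(voucher)) continue;   // ignore unknown vouchers
    if (!verifySig(entry, voucher)) return false;  // forged vouch: reject outright
    vouchers.add(voucher);
  }
  return vouchers.size >= MIN_VOUCHES;             // ten distinct vouches required
}
```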
-
As described in the Mutual Credit section, at the time of transaction each party audits the counterparty’s transaction chain.
-
Our hawala crypto-clearinghouse protocol has two categories of transactions: some used for accounting and others for routing. Accounting transactions change balances. Routing transactions maintain network integrity by recording information about hawaladars
-
The final hash of all of the above fields is used as a unique transaction ID and is what each of party signs with their private keys. Signing indicates a party has agreed to the terms of the transaction. Only transactions signed by both parties are considered valid. Nodes can verify signatures by confirming that decryption of the signature using the public key yields a result which matches the transaction ID.
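A sketch of that signing scheme with Node's crypto module (the field list, serialization, and PEM key format are assumptions; a real implementation would fix these in the shared protocol):

```javascript
const crypto = require('crypto');

// The transaction ID is the hash of all agreed fields, serialized in a fixed order.
function transactionId(tx) {
  const serialized = [tx.sender, tx.receiver, tx.amount, tx.currency, tx.timestamp].join('|');
  return crypto.createHash('sha256').update(serialized).digest('hex');
}

// Each party signs the transaction ID with their private key, indicating
// agreement to exactly these terms.
function signTransaction(tx, privateKeyPem) {
  return crypto.createSign('SHA256').update(transactionId(tx)).sign(privateKeyPem, 'hex');
}

// Any node can confirm a signature against the signer's public key: a valid
// signature means that party agreed to this exact transaction ID.
function signatureIsValid(tx, signatureHex, publicKeyPem) {
  return crypto.createVerify('SHA256').update(transactionId(tx)).verify(publicKeyPem, signatureHex, 'hex');
}
```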
-
As with accounting transactions, the hash of the above fields is used as the transaction’s unique key and the basis for the cryptographic signature of both counterparties.
-
Remember, instead of making changes to account balances, routing transactions change a node’s local list of peers for finding each other and processing.
-
It would be possible for someone to hack the code on their node to “forget” their most recent transaction (drop the head of their chain), and go back to their previous version of the chain before that transaction. Then they could append a new transaction, drop it, and append again.
-
After both parties have signed the agreed upon transaction, each party submits the transaction to separate notaries. Notaries are a special class of participant who validate transactions (auditing each chain, ensuring nobody passes through an invalid state), and then they sign an outer envelope which includes the signatures of the two parties. Notaries agree to run high-availability servers which collectively manage a Distributed Hash Table (DHT) servicing requests for transaction information. As their incentive for providing this infrastructure, notaries get a small transaction fee.
-
This approach introduces a few more steps and delays to the transaction process, but because it operates on independent parallel chains, it is still orders of magnitude more efficient and decentralized than reaching consensus on entries in a global ledger
-
millions of simultaneous transactions could be getting processed by other parties and notaries with no bottlenecks.
-
There are other solutions to prevent nodes from dropping the head of their transaction chain, but the approach of having notaries serve out a DHT solves a number of common objections to completely distributed accounting. Having access to reliable lookups in a DHT provides a similar big picture view that you get from a global ledger. For example, you may want a way to look up transactions even when the parties to that transaction are offline, or to be able to see the net system balance at a particular moment in time, or identify patterns of activity in the larger system without having to collect data from everyone individually.
-
By leveraging Intrinsic Data Integrity to run numerous parallel tamper-proof chains you can enable nodes to do various P2P transactions which don’t actually require group consensus. Mutual credit is a great way to implement cryptocurrencies to run in this peered manner. Basic PKI with a DHT is enough additional infrastructure to address the main vulnerabilities. You can optimize your solution architecture by reserving consensus work for tasks which need to guarantee uniqueness or actually involve large-scale agreement by humans or automated contracts.
-
It is not only possible, but far more scalable to build cryptocurrencies without a global ledger consensus approach or cryptographic tokens.
Promoting and Assessing Value Creation in Networks - P2P Foundation - 1 views
p2pfoundation.net/ing_Value_Creation_in_Networks
value creation networks theory model value system contribution accounting paper

- ...7 more annotations...
-
The first level—related to the satisfaction level—is called "immediate value" and it assesses what just happened, for example, in a webinar
-
The second level is called "potential value," and I like to think of this as the new knowledge or understanding that is lying latent but ready to be put to use in the future
-
The third level does this, and it is called "applied value" and this is where the model starts to become interesting to CEOs and others
-
hard metrics like reduced development time, improved efficiencies, or financial returns. The fourth level in the framework provides this, and the level is called "realized value."
-
The fifth level is where the community changes as a result of the activity occurring in the first four levels. At this highest level, the framework examines changes in the community—norms, standards, practices, and thought leadership—that have occurred as a result of activity within the community
The Baffler - 0 views
-
This tendency to view questions of freedom primarily through the lens of economic competition, to focus on the producer and the entrepreneur at the expense of everyone else, shaped O’Reilly’s thinking about technology.
-
the O’Reilly brand essence is ultimately a story about the hacker as hero, the kid who is playing with technology because he loves it, but one day falls into a situation where he or she is called on to go forth and change the world,
-
His true hero is the hacker-cum-entrepreneur, someone who overcomes the insurmountable obstacles erected by giant corporations and lazy bureaucrats in order to fulfill the American Dream 2.0: start a company, disrupt an industry, coin a buzzword.
- ...139 more annotations...
-
making it seem as if the language of economics was, in fact, the only reasonable way to talk about the subject
-
It’s easy to forget this today, but there was no such idea as open source software before 1998; the concept’s seeming contemporary coherence is the result of clever manipulation and marketing.
-
Free Software Foundation, preoccupied with ensuring that users had rights with respect to their computer programs. Those rights weren’t many—users should be able to run the program for any purpose, to study how it works, to redistribute copies of it, and to release their improved version (if there was one) to the public
-
profound critique of the role that patent law had come to play in stifling innovation and creativity.
-
Plenty of developers contributed to “free software” projects for reasons that had nothing to do with politics. Some, like Linus Torvalds, the Finnish creator of the much-celebrated Linux operating system, did so for fun; some because they wanted to build more convenient software; some because they wanted to learn new and much-demanded skills.
-
By early 1998 several business-minded members of the free software community were ready to split from Stallman, so they masterminded a coup, formed their own advocacy outlet—the Open Source Initiative—and brought in O’Reilly to help them rebrand.
-
The label “open source” may have been new, but the ideas behind it had been in the air for some time.
-
This budding movement prided itself on not wanting to talk about the ends it was pursuing; except for improving efficiency and decreasing costs, those were left very much undefined.
-
“open source is not particularly a moral or a legal issue. It’s an engineering issue. I advocate open source, because . . . it leads to better engineering results and better economic results
-
While free software was meant to force developers to lose sleep over ethical dilemmas, open source software was meant to end their insomnia.
-
Stallman the social reformer could wait for decades until his ethical argument for free software prevailed in the public debate
-
O’Reilly the savvy businessman had a much shorter timeline: a quick embrace of open source software by the business community guaranteed steady demand for O’Reilly books and events
-
The coup succeeded. Stallman’s project was marginalized. But O’Reilly and his acolytes didn’t win with better arguments; they won with better PR.
-
A decade after producing a singular vision of the Internet to justify his ideas about the supremacy of the open source paradigm, O’Reilly is close to pulling a similar trick on how we talk about government reform.
-
O’Reilly cared for only one type of freedom: the freedom of developers to distribute software on whatever terms they fancied.
-
is that which protects “my choice as a creator to give, or not to give, the fruits of my work to you, as a ‘user’ of that work, and for you, as a user, to accept or reject the terms I place on that gift.”
-
O’Reilly opposed this agenda: “I completely support the right of Richard [Stallman] or any individual author to make his or her work available under the terms of the GPL; I balk when they say that others who do not do so are doing something wrong.”
-
According to this Randian interpretation of open source, the goal of regulation and public advocacy should be to ensure that absolutely nothing—no laws or petty moral considerations—stood in the way of the open source revolution
-
must be opposed, since it would taint the reputation of open source as technologically and economically superior to proprietary software
-
Many developers did stop thinking about licenses, and, having stopped thinking about licenses, they also stopped thinking about broader moral issues that would have remained central to the debates had “open source” not displaced “free software” as the paradigm du jour.
-
Profiting from the term’s ambiguity, O’Reilly and his collaborators likened the “openness” of open source software to the “openness” of the academic enterprise, markets, and free speech.
-
“For me, ‘open source’ in the broader sense means any system in which open access to code lowers the barriers to entry into the market”).
-
The language of economics was less alienating than Stallman’s language of ethics; “openness” was the kind of multipurpose term that allowed one to look political while advancing an agenda that had very little to do with politics
-
the availability of source code for universal examination soon became the one and only benchmark of openness
-
What the code did was of little importance—the market knows best!—as long as anyone could check it for bugs.
-
The new paradigm was presented as something that went beyond ideology and could attract corporate executives without losing its appeal to the hacker crowd.
-
What Raymond and O’Reilly failed to grasp, or decided to overlook, is that their effort to present open source as non-ideological was underpinned by a powerful ideology of its own—an ideology that worshiped innovation and efficiency at the expense of everything else.
-
What they had in common was disdain for Stallman’s moralizing—barely enough to justify their revolutionary agenda, especially among the hacker crowds who were traditionally suspicious of anyone eager to suck up to the big corporations that aspired to dominate the open source scene.
-
As long as everyone believed that “open source” implied “the Internet” and that “the Internet” implied “open source,” it would be very hard to resist the new paradigm
-
Telling a coherent story about open source required finding some inner logic to the history of the Internet
-
“If you believe me that open source is about Internet-enabled collaboration, rather than just about a particular style of software license,”
-
The way O’Reilly saw it, many of the key developments of Internet culture were already driven by what he called “open source behavior,” even if such behavior was not codified in licenses.
-
No moralizing (let alone legislation) was needed; the Internet already lived and breathed open source
-
Openness as a happenstance of market conditions is a very different beast from openness as a guaranteed product of laws.
-
One of the key consequences of linking the Internet to the world of open source was to establish the primacy of the Internet as the new, reinvented desktop
-
This is where the now-forgotten language of “freedom” made a comeback, since it was important to ensure that O’Reilly’s heroic Randian hacker-entrepreneurs were allowed to roam freely.
-
Soon this “freedom to innovate” morphed into “Internet freedom,” so that what we are trying to preserve is the innovative potential of the platform, regardless of the effects on individual users.
-
Lumping everything under the label of “Internet freedom” did have some advantages for those genuinely interested in promoting rights such as freedom of expression
-
Forced to choose between preserving the freedom of the Internet or that of its users, we were supposed to choose the former—because “the Internet” stood for progress and enlightenment.
-
their value proposition lay in the information they delivered, not in the software function they executed.
-
to argue that the Internet could help humanity augment its “collective intelligence” and that, once again, open source software was crucial to this endeavor.
-
Now it was all about Amazon learning from its customers and Google learning from the sites in its index.
-
in 2004, O’Reilly and his business partner Dale Dougherty hit on the idea of “Web 2.0.” What did “2.0” mean, exactly?
-
The primary goal was to show that the 2001 market crash did not mean the end of the web and that it was time to put the crash behind us and start learning from those who survived.
-
Tactically, “Web 2.0” could also be much bigger than “open source”; it was the kind of sexy umbrella term that could allow O’Reilly to branch out from boring and highly technical subjects to pulse-quickening futurology
-
O’Reilly couldn’t improve on a concept as sexy as “collective intelligence,” so he kept it as the defining feature of this new phenomenon.
-
What set Web 2.0 apart from Web 1.0, O’Reilly claimed, was the simple fact that those firms that didn’t embrace it went bust
-
O’Reilly eventually stuck a 2.0 label on anything that suited his business plan, running events with titles like “Gov 2.0” and “Where 2.0.” Today, as everyone buys into the 2.0 paradigm, O’Reilly is quietly dropping it
-
assumption that, thanks to the coming of Web 2.0, we are living through unique historical circumstances
-
Take O’Reilly’s musings on “Enterprise 2.0.” What is it, exactly? Well, it’s the same old enterprise—for all we know, it might be making widgets—but now it has learned something from Google and Amazon and found a way to harness “collective intelligence.”
-
tendency to redescribe reality in terms of Internet culture, regardless of how spurious and tenuous the connection might be, is a fine example of what I call “Internet-centrism.”
-
“Open source” gave us the “the Internet,” “the Internet” gave us “Web 2.0,” “Web 2.0” gave us “Enterprise 2.0”: in this version of history, Tim O’Reilly is more important than the European Union
-
For Postman, each human activity—religion, law, marriage, commerce—represents a distinct “semantic environment” with its own tone, purpose, and structure. Stupid talk is relatively harmless; it presents no threat to its semantic environment and doesn’t cross into other ones.
-
Crazy talk, in contrast, challenges a semantic environment, as it “establishes different purposes and assumptions from those we normally accept.” To argue, as some Nazis did, that the German soldiers ended up far more traumatized than their victims is crazy talk.
-
For Postman, one of the main tasks of language is to codify and preserve distinctions among different semantic environments.
-
As he put it, “When language becomes undifferentiated, human situations disintegrate: Science becomes indistinguishable from religion, which becomes indistinguishable from commerce, which becomes indistinguishable from law, and so on.
-
Some words—like “law”—are particularly susceptible to crazy talk, as they mean so many different things: from scientific “laws” to moral “laws” to “laws” of the market to administrative “laws,” the same word captures many different social relations. “Open,” “networks,” and “information” function much like “law” in our own Internet discourse today.
-
For Korzybski, the world has a relational structure that is always in flux; like Heraclitus, who argued that everything flows, Korzybski believed that an object A at time x1 is not the same object as object A at time x2
-
Our language could never properly account for the highly fluid and relational structure of our reality—or as he put it in his most famous aphorism, “the map is not the territory.”
-
Korzybski argued that we relate to our environments through the process of “abstracting,” whereby our neurological limitations always produce an incomplete and very selective summary of the world around us.
-
nothing harmful in this per se—Korzybski simply wanted to make people aware of the highly selective nature of abstracting and give us the tools to detect it in our everyday conversations.
-
He also encouraged his followers to start using “etc.” at the end of their statements as a way of making them aware of their inherent inability to say everything about a given subject and to promote what he called the “consciousness of abstraction.”
-
“What are the characteristics of language which lead people into making false evaluations of the world around them?”
-
O’Reilly openly acknowledges his debt to Korzybski, listing Science and Sanity among his favorite books
-
It would be a mistake to think that O’Reilly’s linguistic interventions—from “open source” to “Web 2.0”—are random or spontaneous.
-
There is a philosophy to them: a philosophy of knowledge and language inspired by Korzybski. However, O’Reilly deploys Korzybski in much the same way that the advertising industry deploys the latest findings in neuroscience: the goal is not to increase awareness, but to manipulate.
-
O’Reilly, of course, sees his role differently, claiming that all he wants is to make us aware of what earlier commentators may have overlooked. “A metaphor is just that: a way of framing the issues such that people can see something they might otherwise miss,
-
But Korzybski’s point, if fully absorbed, is that a metaphor is primarily a way of framing issues such that we don’t see something we might otherwise see.
-
In public, O’Reilly modestly presents himself as someone who just happens to excel at detecting the “faint signals” of emerging trends. He does so by monitoring a group of überinnovators that he dubs the “alpha geeks.” “The ‘alpha geeks’ show us where technology wants to go. Smart companies follow and support their ingenuity rather than trying to suppress it,
-
His own function is that of an intermediary—someone who ensures that the alpha geeks are heard by the right executives: “The alpha geeks are often a few years ahead of their time. . . . What we do at O’Reilly is watch these folks, learn from them, and try to spread the word by writing down (
-
The name of his company’s blog—O’Reilly Radar—is meant to position him as an independent intellectual who is simply ahead of his peers in grasping the obvious.
-
As Web 2.0 becomes central to everything, O’Reilly—the world’s biggest exporter of crazy talk—is on a mission to provide the appropriate “context” to every field.
-
The thinker who emerges there is very much at odds with the spirit of objectivity that O’Reilly seeks to cultivate in public
-
meme-engineering lets us organize and shape ideas so that they can be transmitted more effectively, and have the desired effect once they are transmitted
-
O’Reilly meme-engineers a nice euphemism—“meme-engineering”—to describe what has previously been known as “propaganda.”
-
how one can meme-engineer a new meaning for “peer-to-peer” technologies—traditionally associated with piracy—and make them appear friendly and not at all threatening to the entertainment industry.
-
O’Reilly and his acolytes “changed the canonical list of projects that we wanted to hold up as exemplars of the movement,” while also articulating what broader goals the projects on the new list served. He then proceeds to rehash the already familiar narrative: O’Reilly put the Internet at the center of everything, linking some “free software” projects like Apache or Perl to successful Internet start-ups and services. As a result, the movement’s goal was no longer to produce a completely free, independent, and fully functional operating system but to worship at the altar of the Internet gods.
-
His “correspondents” at O’Reilly Radar don’t work beats; they work memes and epistemes, constantly reframing important public issues in accordance with the templates prophesied by O’Reilly.
-
Now, who stands to benefit from “cyberwarfare” being defined more broadly? Could it be those who, like O’Reilly, can’t currently grab a share of the giant pie that is cybersecurity funding?
-
Frank Luntz lists ten rules of effective communication: simplicity, brevity, credibility, consistency, novelty, sound, aspiration, visualization, questioning, and context.
-
Thus, O’Reilly’s meme-engineering efforts usually result in “meme maps,” where the meme to be defined—whether it’s “open source” or “Web 2.0”—is put at the center, while other blob-like terms are drawn as connected to it.
-
The exact nature of these connections is rarely explained in full, but this is all for the better, as the reader might eventually interpret connections with their own agendas in mind. This is why the name of the meme must be as inclusive as possible: you never know who your eventual allies might be. “A big part of meme engineering is giving a name that creates a big tent that a lot of people want to be under, a train that takes a lot of people where they want to go,”
-
There is considerable continuity across O’Reilly’s memes—over time, they tend to morph into one another.
Access control - Wikipedia, the free encyclopedia - 0 views
en.wikipedia.org/...Access_control
blockchainaccess_project wikipedia paper access passport IoPA tech

- ...26 more annotations...
-
Geographical access control may be enforced by personnel (e.g., border guard, bouncer, ticket checker)
-
An alternative to access control in the strict sense (physically controlling access itself) is a system of checking authorized presence; see e.g. Ticket controller (transportation). A variant is exit control, e.g. of a shop (checkout) or a country
-
access control refers to the practice of restricting entrance to a property, a building, or a room to authorized persons
-
can be achieved by a human (a guard, bouncer, or receptionist), through mechanical means such as locks and keys, or through technological means such as access control systems like the mantrap.
-
Historically, this was partially accomplished through keys and locks. When a door is locked, only someone with a key can enter through the door, depending on how the lock is configured. Mechanical locks and keys do not allow restriction of the key holder to specific times or dates. Mechanical locks and keys do not provide records of the key used on any specific door, and the keys can be easily copied or transferred to an unauthorized person. When a mechanical key is lost or the key holder is no longer authorized to use the protected area, the locks must be re-keyed. Electronic access control uses computers to solve the limitations of mechanical locks and keys. A wide range of credentials can be used to replace mechanical keys. The electronic access control system grants access based on the credential presented. When access is granted, the door is unlocked for a predetermined time and the transaction is recorded. When access is refused, the door remains locked and the attempted access is recorded. The system will also monitor the door and alarm if the door is forced open or held open too long after being unlocked
-
When a credential is presented to a reader, the reader sends the credential’s information, usually a number, to a control panel, a highly reliable processor. The control panel compares the credential's number to an access control list, grants or denies the presented request, and sends a transaction log to a database. When access is denied based on the access control list, the door remains locked.
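A hedged sketch of that control-panel flow in Python; the door and credential identifiers, the in-memory access control list, and the list standing in for the panel's database are all illustrative assumptions.

# Hedged sketch: the reader forwards a credential number, the panel checks it
# against an access control list, grants or denies, and logs the transaction.
from datetime import datetime, timezone

ACCESS_CONTROL_LIST = {
    "door-101": {4711, 4712},   # credential numbers allowed through this door
}
TRANSACTION_LOG = []            # stand-in for the panel's database

def present_credential(door, credential_number):
    granted = credential_number in ACCESS_CONTROL_LIST.get(door, set())
    TRANSACTION_LOG.append({
        "time": datetime.now(timezone.utc).isoformat(),
        "door": door,
        "credential": credential_number,
        "granted": granted,     # if False, the door stays locked
    })
    return granted

print(present_credential("door-101", 4711))  # True  -> unlock for a preset time
print(present_credential("door-101", 9999))  # False -> door remains locked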
-
The above description illustrates a single factor transaction. Credentials can be passed around, thus subverting the access control list. For example, Alice has access rights to the server room, but Bob does not. Alice either gives Bob her credential, or Bob takes it; he now has access to the server room. To prevent this, two-factor authentication can be used. In a two factor transaction, the presented credential and a second factor are needed for access to be granted; another factor can be a PIN, a second credential, operator intervention, or a biometric input
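A small follow-on sketch (Python, illustrative names, PIN handling simplified far beyond what a real system would do) showing how the second factor defeats the lent-credential scenario above.

# Hedged sketch: Bob borrowing Alice's credential is not enough once a PIN is
# required as a second factor. Real systems would never store PINs in the clear.
ACL = {"server-room": {1001}}        # Alice's credential 1001 is authorized
PINS = {1001: "2468"}                # PIN registered to that credential

def two_factor_access(room, credential, pin):
    has_credential = credential in ACL.get(room, set())
    knows_pin = PINS.get(credential) == pin
    return has_credential and knows_pin   # both factors must succeed

print(two_factor_access("server-room", 1001, "2468"))  # Alice herself: True
print(two_factor_access("server-room", 1001, "0000"))  # Bob with her card: False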
-
There are three types (factors) of authenticating information:[2] something the user knows, e.g. a password, pass-phrase or PIN; something the user has, such as a smart card or a key fob; and something the user is, such as a fingerprint, verified by biometric measurement
-
Passwords are a common means of verifying a user's identity before access is given to information systems. In addition, a fourth factor of authentication is now recognized: someone you know, whereby another person who knows you can provide a human element of authentication in situations where systems have been set up to allow for such scenarios
-
A credential is a physical/tangible object, a piece of knowledge, or a facet of a person's physical being, that enables an individual access to a given physical facility or computer-based information system. Typically, credentials can be something a person knows (such as a number or PIN), something they have (such as an access badge), something they are (such as a biometric feature) or some combination of these items. This is known as multi-factor authentication. The typical credential is an access card or key-fob, and newer software can also turn users' smartphones into access devices.
-
An access control point, which can be a door, turnstile, parking gate, elevator, or other physical barrier, where granting access can be electronically controlled. Typically, the access point is a door. An electronic access control door can contain several elements. At its most basic, there is a stand-alone electric lock. The lock is unlocked by an operator with a switch. To automate this, operator intervention is replaced by a reader. The reader could be a keypad where a code is entered, it could be a card reader, or it could be a biometric reader. Readers do not usually make an access decision, but send a card number to an access control panel that verifies the number against an access list
-
Generally only entry is controlled, and exit is uncontrolled. In cases where exit is also controlled, a second reader is used on the opposite side of the door. In cases where exit is not controlled, free exit, a device called a request-to-exit (REX) is used. A request-to-exit device can be a push-button or a motion detector. When the button is pushed, or the motion detector detects motion at the door, the door alarm is temporarily ignored while the door is opened. Exiting a door without having to electrically unlock the door is called mechanical free egress. This is an important safety feature. In cases where the lock must be electrically unlocked on exit, the request-to-exit device also unlocks the door
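A rough Python sketch of that request-to-exit behaviour; the grace period and function names are invented for illustration.

# Hedged sketch: a button press or motion at the door suppresses the
# forced-door alarm for a short grace period so people can leave without
# electrically unlocking anything. Timing values are assumptions.
import time

REX_GRACE_SECONDS = 10.0
_last_rex = None   # monotonic timestamp of the most recent REX trigger

def request_to_exit():
    """Called when the push button is pressed or the motion detector fires."""
    global _last_rex
    _last_rex = time.monotonic()

def door_opened():
    """Called by the door contact when the door opens."""
    if _last_rex is not None and time.monotonic() - _last_rex < REX_GRACE_SECONDS:
        return "free egress"          # alarm temporarily ignored
    return "forced-door alarm"        # no valid REX: treat as forced open

request_to_exit()
print(door_opened())   # "free egress"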
-
Access control decisions are made by comparing the credential to an access control list. This look-up can be done by a host or server, by an access control panel, or by a reader. The development of access control systems has seen a steady push of the look-up out from a central host to the edge of the system, or the reader. The predominant topology circa 2009 is hub and spoke with a control panel as the hub, and the readers as the spokes. The look-up and control functions are performed by the control panel. The spokes communicate through a serial connection; usually RS-485. Some manufacturers are pushing the decision making to the edge by placing a controller at the door. The controllers are IP enabled, and connect to a host and database using standard networks
-
Semi-intelligent readers: have all inputs and outputs necessary to control door hardware (lock, door contact, exit button), but do not make any access decisions. When a user presents a card or enters a PIN, the reader sends information to the main controller, and waits for its response. If the connection to the main controller is interrupted, such readers stop working, or function in a degraded mode. Usually semi-intelligent readers are connected to a control panel via an RS-485 bus.
-
Intelligent readers: have all inputs and outputs necessary to control door hardware; they also have memory and processing power necessary to make access decisions independently. Like semi-intelligent readers, they are connected to a control panel via an RS-485 bus. The control panel sends configuration updates, and retrieves events from the readers.
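A speculative Python sketch contrasting the two reader types; the class and method names are invented, but it illustrates why an intelligent reader with a locally cached access list keeps working when the connection to the panel drops.

# Hedged sketch: a semi-intelligent reader defers to the panel and fails closed
# when the RS-485 link is down; an intelligent reader decides from a local cache.
class ControlPanel:
    def __init__(self, acl):
        self.acl = set(acl)
        self.online = True

    def decide(self, credential):
        if not self.online:
            raise ConnectionError("RS-485 link down")
        return credential in self.acl

class SemiIntelligentReader:
    def __init__(self, panel):
        self.panel = panel

    def present(self, credential):
        try:
            return self.panel.decide(credential)   # always defers to the panel
        except ConnectionError:
            return False                           # degraded mode: deny

class IntelligentReader:
    def __init__(self, panel):
        self.panel = panel
        self.cached_acl = set(panel.acl)           # configuration pushed by panel

    def present(self, credential):
        return credential in self.cached_acl       # decides locally, even offline

panel = ControlPanel({4711})
semi, smart = SemiIntelligentReader(panel), IntelligentReader(panel)
panel.online = False                               # simulate a broken connection
print(semi.present(4711))    # False: cannot reach the panel
print(smart.present(4711))   # True: local decision from the cached list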
-
Systems with IP readers usually do not have traditional control panels, and readers communicate directly to a PC that acts as a host
-
Some readers may have additional features such as an LCD and function buttons for data collection purposes (i.e. clock-in/clock-out events for attendance reports), camera/speaker/microphone for intercom, and smart card read/write support
If not Global Capitalism - then What? - 0 views
-
I posit an optimistic view of the potential for Society from the emergence of a new and “Open” form of Capitalism.
- ...162 more annotations...
-
‘Enterprise’ is defined as ‘any entity within which two or more individuals create, accumulate or exchange Value’.
-
In Pirsig’s approach, Capital may be viewed as “Static” Value and Money as “Dynamic” Value. “Transactions” are the “events” at which individuals (Subjects) interact with each other or with Capital (both as Objects) to create forms of Value and at which “Value judgments” are made based upon a “Value Unit”.
-
The result of these Value Events /Transactions is to create subject/object pairings in the form of data ie Who “owns” or has rights of use in What,
-
It, too, may then be defined in a subject/object pairing through the concept of “intellectual property”.
-
“The purpose of money is to facilitate barter by splitting the transaction into two parts, the acceptor of money reserving the power to requisition value from any trader at any time
-
The monetary process is a dynamic one involving the creation and recording of obligations as between individuals and the later fulfilment of these obligations
-
Static Value – which only becomes “Money”/ Dynamic Value when exchanged in the transitory Monetary process.
-
the practice of Lending involves an incomplete exchange in terms of risk and reward: a Lender, as opposed to an Investor, has no interest in the outcome of the Loan, and requires the repayment of Principal no matter the ability of the Borrower to repay.
-
-
"The Lender has no interest in the outcome of the loan", i.e doesn't care what happens in the end. The Lender ins not interested in the economical outcome of the Lender-Loner relation. So in fact there is no real risk sharing. the only risk for the Lender is when the Loner doesn't pay back, which is not really a risk... In fact it is a risk for the small bank, who has to buy money from the central bank, but not for the central bank.
-
-
an “Object” circulating but rather a dynamic process of Value creation and exchange by reference to a “Value Unit”.
-
in relation to Productive Capital relates to the extent of “property rights” which may be held over it thereby allowing individuals to assert “absolute” permanent and exclusive ownership - in particular in relation to Land
-
need for institutions which outlived the lives of the Members led to the development of the Corporate body with a legal existence independent of its Members
-
The key development in the history of Capitalism was the creation of the ‘Joint Stock’ Corporate with liability limited by shares of a ‘Nominal’ or ‘Par’ value
-
over the next 150 years the Limited Liability Corporate evolved into the Public Limited Liability Corporate
-
Such “Closed” Shares of “fixed” value constitute an absolute and permanent claim over the assets and revenues of the Enterprise to the exclusion of all other “stakeholders” such as Suppliers, Customers, Staff, and Debt Financiers.
-
It has the characteristics of what biologists call a ‘semi-permeable membrane’ in the way that it allows Economic Value to be extracted from other stakeholders but not to pass the other way.
-
-
Capital most certainly is and always has been - through the discontinuity (see diagram) between: ‘Fixed’ Capital in the form of shares ie Equity; and ‘Working’ Capital in the form of debt finance, credit from suppliers, pre-payments by customers and obligations to staff and management.
-
Exchange of Economic Value in a Closed Corporate is made difficult and true sharing of Risk and Reward is simply not possible
-
All that is needed is a simple ‘Member Agreement’ – a legal protocol which sets out the Aims, Objectives, Principles of Governance, Revenue Sharing, Dispute Resolution, Transparency and any other matters that Members agree should be included. Amazingly enough, this Agreement need not even be in writing, since in the absence of a written agreement Partnership Law is applied by way of default.
-
The ease of use and total flexibility enables the UK LLP to be utilised in a way never intended – as an ‘Open’ Corporate partnership.
-
it is now possible for any stakeholder to become a Member of a UK LLP simply through signing a suitably drafted Member Agreement
-
may instead become true Partners in the Enterprise with their interests aligned with other stakeholders.
-
-
no profit or loss in an Open Corporate Partnership, merely Value creation and exchange between members in conformance with the Member Agreement.
-
in an Enterprise constitute an infinitely divisible, flexible and scaleable form of Capital capable of distributing or accumulating Value organically as the Enterprise itself grows in Value or chooses to distribute it.
-
Within the OCP Capital and Revenue are continuous: to the extent that an Investee pays Rental in advance of the due date he becomes an Investor.
-
A Co-operative is not an enterprise structure: it is a set of Principles that may be applied to different types of enterprise structure.
-
the crippling factors in practical terms have been, inter alia: the liability to which Member partners are exposed from the actions of their co-partners on their behalf; limited ability to raise capital.
-
they favour the interests of other stakeholders, are relatively restricted in accessing investment; are arguably deficient in incentivising innovation.
-
The ‘new’ LLP was expressly created to solve the former problem by limiting the liability of Member partners to those assets which they choose to place within its protective ‘semi-permeable membrane’
-
However, the ability to configure the LLP as an “Open” Corporate permits a new and superior form of Enterprise.
-
it is possible to re-organise any existing enterprise as either a partnership or as a partnership of partnerships.
-
would be divided among Members in accordance with the LLP Agreement. This means that all Members share a common interest in collaborating/co-operating to maximise the Value generated by the LLP collectively as opposed to competing with other stakeholders to maximise their individual share at the other stakeholders’ expense.
-
The ‘Commercial’ Enterprise LLP – where the object is for a closed group of individuals to maximise the value generated in their partnership. There are already over 7,000 of these.
-
-
the Profit generated in a competitive economy based upon shareholder value and unsustainable growth results from a transfer of risks outwards, and the transfer of reward inwards, leading to a one way transfer of Economic Value.
-
Whether its assets are protected within a corporate entity with limited liability or not, it will always operate co-operatively – for mutual profit.
-
continuity between Capital as Static Value and Money as Dynamic Value which has never before been possible due to the dichotomy between the absolute/infinite and the absolute/finite durations of the competing claims over assets – “Equity” and “Debt”
-
Open Capital Partnership gives rise to a new form of Financial Capital of indeterminate duration. It enables the Capitalisation of assets and the monetisation of revenue streams in an entirely new way.
-
It is possible to envisage a Society within which individuals are members of a portfolio of Enterprises constituted as partnerships, whether limited in liability or otherwise.
-
‘Commercial’ enterprises of all kinds aimed at co-operatively working together to maximise value for the Members.
-
It can only be replaced by another ‘emergent’ phenomenon, which is adopted ‘virally’ because any Enterprise which does not utilise it will be at a disadvantage to an Enterprise which does.
-
The ‘Open’ Corporate Partnership is: capable of linking any individuals anywhere in respect of collective ownership of assets anywhere; extremely cheap and simple to operate; and because one LLP may be a Member of another it is organically flexible and ‘scaleable’. The phenomenon of “Open Capital” – which is already visible in the form of significant commercial transactions - enables an extremely simple and continuous relationship between those who wish to participate indefinitely in an Enterprise and those who wish to participate for a defined period of time.
-
Moreover, the infinitely divisible proportionate “shares” which constitute ‘Open’ Capital allow stakeholder interests to grow flexibly and organically with the growth in Value of the Enterprise. In legal terms, the LLP agreement is essentially consensual and ‘pre-distributive’: it is demonstrably superior to prescriptive complex contractual relationships negotiated adversarially and subject to subsequent re-distributive legal action. Above all, the ‘Open’ Corporate Partnership is a Co-operative phenomenon which is capable, the author believes, of unleashing the “Co-operative Advantage” based upon the absence of a requirement to pay returns to “rentier” Capitalists.