Carl Sagan Day | Center for Inquiry - 2 views
-
Please join us this November as we honor Carl Sagan and celebrate the beauty and wonder of the cosmos he so eloquently described. Carl Sagan was a Professor of Astronomy and Space Science and Director of the Laboratory for Planetary Studies at Cornell University, but most of us know him as a Pulitzer Prize-winning author and the creator of COSMOS. That Emmy and Peabody Award-winning PBS television series transformed educational television when it first aired in 1980, and in the thirty years since it has gone on to touch the hearts and minds of over a billion people in sixty countries. No other scientist has been able to reach and teach so many nonscientists in such a meaningful way, and that is why we celebrate Dr. Sagan, remember his work, and revel in the cosmos he helped us understand.
Why Big Media Is Going Nuclear Against The DMCA | TechCrunch - 0 views
-
When Congress updated copyright law and passed the Digital Millennium Copyright Act (DMCA) in 1998, it ushered in an era of investment, innovation and job creation. In the decade since, companies like Google, YouTube and Twitter have emerged thanks to the Act, but in the process they have disrupted the business models and revenue streams of traditional media companies (TMCs). Today, the TMCs are trying to fast-track a pair of bills, one in the House and one in the Senate, to reverse all of that. Through their lobbyists in Washington, D.C., media companies are trying to rewrite the DMCA through these two new bills. The content industry's lobbyists have forged ahead without any input from the technology industry; the Senate bill is called Protect IP and the House bill is called E-Parasites. The E-Parasites bill would kill the safe harbors of the DMCA and allow traditional media companies to attack emerging technology companies by cutting off their ability to transact and collect revenue, much as happened to WikiLeaks. This would scare VCs away from investing in such tech firms, which in turn would destroy job creation. The technology industry is understandably alarmed by the bills' implications, which include automatic blacklists for any site issued a takedown notice by copyright holders, blacklists that would extend to payment providers and even search engines. What is going on, and how exactly did we get here?
Thatcher, Scientist - 0 views
-
This paper has two halves. First, I piece together what we know about Margaret Thatcher's training and employment as a scientist. She took science subjects at school; she studied chemistry at Oxford, arriving during World War II and coming under the influence (and comment) of two excellent women scientists, Janet Vaughan and Dorothy Hodgkin. She did a fourth-year dissertation on the X-ray crystallography of gramicidin just after the war. She then gathered four years' experience as a working industrial chemist, at British Xylonite Plastics and at Lyons. Second, I argue that, having lived the life of a working research scientist, she had a quite different view of science from that of any other minister responsible for science. This is crucial to understanding her reaction to the proposals, associated with the Rothschild reforms of the early 1970s, to reinterpret aspects of science policy in market terms. Although she was strongly pressured by bodies such as the Royal Society to reaffirm the established place of science as a different kind of entity, one that was, at least at its core, unsuited to marketization, Thatcher took a different line.
Princeton Engineering Anomalies Research - 0 views
-
The Princeton Engineering Anomalies Research (PEAR) program, which flourished for nearly three decades under the aegis of Princeton University's School of Engineering and Applied Science, has completed its experimental agenda: studying the interaction of human consciousness with sensitive physical devices, systems, and processes, and developing complementary theoretical models to enable a better understanding of the role of consciousness in the establishment of physical reality.
Ockham's Razor is Dull « Apperceptual - 0 views
-
For a period of about a decade, extending from my late undergraduate years to my early postdoctoral years, it would be fair to say that I was obsessed with Ockham's razor. I was convinced that it was the key to understanding how we acquire knowledge about the world. I no longer believe in Ockham's razor.
A Philosophical Orientation Toward Solving Our Collective Problems As a Species | Think... - 0 views
-
To know what the most important virtue of our age is, we need at least a basic understanding of our age. Our era is increasingly characterized by uncertainty. Fortunately or unfortunately, more than a cursory elucidation of our situation is beyond the scope of this essay, though there are geopolitical, economic, technological and environmental trends worth mentioning. When the more philosophical portion of this discourse arrives, I will argue that the virtue of wisdom underlies the meaningfulness and efficacy of all other virtues, and that this, in broad strokes, is primarily due to (1) the aforementioned instability in our surroundings; (2) the relationship between deontology and virtue; and (3) the nature of agency itself. Whether uncertainty itself can provide an ethical foundation for us to build on is a separate question. Finally, I speculate on where wisdom leads us in the context of a philosophy that is politically active and not doomed to irrelevance to, and by, the larger population.
Deb Roy: The birth of a word | Video on TED.com - 0 views
-
MIT researcher Deb Roy wanted to understand how his infant son learned language -- so he wired up his house with video cameras to catch every moment (with exceptions) of his son's life, then parsed 90,000 hours of home video to watch "gaaaa" slowly turn into "water." Astonishing, data-rich research with deep implications for how we learn.
Algorithms identify and track the most important privately-held technology companies | ... - 0 views
-
A startup called Quid has developed algorithms that analyze Internet-based data about corporations to make fast-moving technology developments visible, navigable, and understandable. Quid has built a data set combining information about firms that succeeded and firms that sank, patent documents, government grants, help-wanted advertisements, and tweets. Its algorithms use this collection to analyze the prospects of around 35,000 firms and research groups working on new technologies. By extracting words and phrases from the collected documents, Quid constructs a "technology genome" that describes the primary focus of each of those 35,000 entities. A map of the connections between those genomes can be used by investors to find hints about interesting companies or ideas. Most companies cluster around established sectors, but a few sit in the white spaces between the clusters and can represent the seeds of new technology sectors.
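As a rough illustration of how a term-based "genome" and similarity map might be computed, here is a minimal Python sketch. Quid's actual pipeline, data sources, and thresholds are not public; every company name, description, and number below is hypothetical, and the term-frequency/cosine approach is a standard stand-in, not Quid's method.

    import math
    import re
    from collections import Counter
    from itertools import combinations

    # Hypothetical document snippets standing in for patents, grants, and tweets.
    docs = {
        "AcmeBio":  "gene sequencing diagnostics sequencing assay",
        "VoltCell": "battery electrode lithium storage grid",
        "GridWare": "grid storage software battery analytics",
    }

    def genome(text):
        """Term-frequency vector over lowercase word tokens: a toy 'genome'."""
        return Counter(re.findall(r"[a-z]+", text.lower()))

    def cosine(a, b):
        """Cosine similarity between two sparse term-frequency vectors."""
        dot = sum(a[t] * b[t] for t in a.keys() & b.keys())
        norm = (math.sqrt(sum(v * v for v in a.values()))
                * math.sqrt(sum(v * v for v in b.values())))
        return dot / norm if norm else 0.0

    genomes = {name: genome(text) for name, text in docs.items()}

    # Edges of the map: company pairs whose genomes overlap enough.
    THRESHOLD = 0.2  # arbitrary cutoff for this sketch
    edges = []
    for x, y in combinations(genomes, 2):
        s = cosine(genomes[x], genomes[y])
        if s > THRESHOLD:
            edges.append((x, y, round(s, 2)))

    for x, y, s in edges:
        print(f"{x} -- {y}: similarity {s}")

On these toy descriptions, only VoltCell and GridWare share enough terms (battery, storage, grid) to be linked; companies with no edges into an established cluster would sit in the "white spaces" the excerpt describes.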
Public's Knowledge of Science and Technology | Pew Research Center for the People and t... - 3 views
Plastic Surgery | Plastic Surgeons Dr. Scott L. Tucker - 0 views
-
We are a plastic surgery practice specializing in cosmetic and restorative procedures. Our priority is to provide superior care to our patients while maintaining an empathetic understanding of each patient's individual needs. From the first visit to our office, through scheduled surgery, and during ongoing medical care, we strive to make your experience a positive one in which your dignity and confidentiality are maintained.
How do you feel about the term 'citizen science'? | OceanSpaces - 0 views
-
The reason such a plethora of terms has proliferated is that each comes with baggage: preconceived notions and affiliation with a particular program structure (consider how 'citizen science' might sound to an undocumented worker). No one term has yet emerged to describe the wide spectrum of participatory science. Here at OST, we've decided to use the term 'citizen science' for a variety of reasons, most notably that it is one of the easiest to understand and is becoming one of the most popular. But we still have feelings about the term, so we've done a straw poll of staff members to see how they feel.
Public Understanding of Science - 1 views
Rationally Speaking: The very foundations of science - 0 views
-
The first way to think about probability is as a measure of the frequency of an event: if I say that the probability of a coin landing heads up is 50%, I may mean that if I flip the coin, say, 100 times, on average I will get heads 50 times. This is not going to get us out of Hume's problem, because probabilities interpreted as frequencies of events are, again, a form of induction.
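In symbols, this frequentist reading defines the probability of an event A as a limiting relative frequency over repeated trials (a standard textbook formulation, not one spelled out in the post):

    P(A) = \lim_{n \to \infty} \frac{n_A}{n}

where n_A counts the trials in which A occurs out of n trials in total. The inductive step is visible in the formula itself: asserting that the observed ratio converges to a stable limit is precisely the kind of generalization from past cases that Hume questioned.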
-
Secondly, we can think of probabilities as reflecting subjective judgment. If I say that it is probable that the coin will land heads up, I might simply be trying to express my feeling that this will be the case. You might have a different feeling, and respond that you don't think it's probable that the coin will land heads up. This is certainly not a viable solution to the problem of induction, because subjective probabilities are, well, subjective, and hence reflect opinions, not degrees of truth.
-
Lastly, one can adopt what Okasha calls the logical interpretation of probabilities, according to which saying that there is a probability X that an event will occur means that we have objective reasons to believe, to that degree, that the event will occur (for instance, because we understand the physics of the solar system, the mechanics of cars, or the physics of coin flipping). This doesn't mean that we will always be correct, but it does offer a promising way out of Hume's dilemma, since it seems to ground our judgments on a more solid foundation. Indeed, this is the option adopted by many philosophers, and would probably be the one preferred by scientists, if they ever gave this sort of thing a moment's thought.
The Public Values Failures of Climate Science in the US by Ryan Meyer - Minerva, Volume... - 0 views
-
"This paper examines the broad social purpose of US climate science, which has benefited from a public investment of more than $30 billion over the last 20 years. A public values analysis identifies five core public values that underpin the interagency program. Drawing from interviews, meeting observations, and document analysis, I examine the decision processes and institutional structures that lead to the implementation of climate science policy, and identify a variety of public values failures accommodated by this system. In contrast to other cases which find market values frameworks (the "profit as progress" assumption) at the root of public values failures, this case shows how "science values" ("knowledge as progress") may serve as an inadequate or inappropriate basis for achieving broader public values. For both institutions and individual decision makers, the logic linking science to societal benefit is generally incomplete, incoherent, and tends to conflate intrinsic and instrumental values. I argue that to be successful with respect to its motivating public values, the US climate science enterprise must avoid the assumption that any advance in knowledge is inherently good, and offer a clearer account of the kinds of research and knowledge advance likely to generate desirable social outcomes. "
Publicsciencetriumphs stories - io9 - 1 views
Human Computer Interaction (HCI) by John M. Carroll - Interaction-Design.org: HCI, Usab... - 0 views
-
The challenge of personal computing became manifest at an opportune time. The broad project of cognitive science, which incorporated cognitive psychology, artificial intelligence, linguistics, cognitive anthropology, and the philosophy of mind, had formed at the end of the 1970s. Part of the programme of cognitive science was to articulate systematic and scientifically informed applications, to be known as "cognitive engineering". Thus, at just the point when personal computing presented the practical need for HCI, cognitive science presented people, concepts, skills, and a vision for addressing such needs. HCI was one of the first examples of cognitive engineering. Other historically fortuitous developments contributed to the establishment of HCI. Software engineering, mired in unmanageable software complexity in the 1970s, was starting to focus on nonfunctional requirements, including usability and maintainability, and on non-linear software development processes that relied heavily on testing. Computer graphics and information retrieval had emerged in the 1970s, and both fields rapidly came to recognize that interactive systems were the key to progressing beyond their early achievements. All these threads of development in computer science pointed to the same conclusion: the way forward for computing entailed understanding and better empowering users.
-
One of the most significant achievements of HCI is its evolving model of the integration of science and practice. Initially this model was articulated as a reciprocal relation between cognitive science and cognitive engineering. Later, it ambitiously incorporated a diverse science foundation, notably Activity Theory, distributed cognition, and ethnomethodology, along with a culturally embedded conception of human activity, including the activities of design and technology development. Currently, the model is incorporating design practices and research across a broad spectrum. In these developments, HCI provides an unprecedented blueprint for a mutual relation between science and practice.
-
In the late 1980s and early 1990s, HCI assimilated ideas from Activity Theory, distributed cognition, and ethnomethodology. This constituted a fundamental epistemological realignment. For example, the representational theory of mind, a cornerstone of cognitive science, is no longer axiomatic for HCI science. Information processing psychology and laboratory user studies, once the kernel of HCI research, became important but niche areas. The most canonical theory base in HCI is now socio-cultural: Activity Theory. Field studies became typical, and eventually dominant, as an empirical paradigm. Collaborative interactions, that is, groups of people working together through and around computer systems (in contrast to the user-at-PC situation of the early 1980s), have become the default unit of analysis. It is remarkable that such fundamental realignments were so easily assimilated by the HCI community.