


thinkahol *

The American Wikileaks Hacker | Rolling Stone Culture - 0 views

  •  
    On July 29th, returning from a trip to Europe, Jacob Appelbaum, a lanky, unassuming 27-year-old wearing a black T-shirt with the slogan "Be the trouble you want to see in the world," was detained at customs by a posse of federal agents. In an interrogation room at Newark Liberty airport, he was grilled about his role in Wikileaks, the whistle-blower group that has exposed the government's most closely guarded intelligence reports about the war in Afghanistan. The agents photocopied his receipts, seized three of his cellphones - he owns more than a dozen - and confiscated his computer. They informed him that he was under government surveillance. They questioned him about the trove of 91,000 classified military documents that Wikileaks had released the week before, a leak that Vietnam-era activist Daniel Ellsberg called "the largest unauthorized disclosure since the Pentagon Papers." They demanded to know where Julian Assange, the founder of Wikileaks, was hiding. They pressed him on his opinions about the wars in Afghanistan and Iraq. Appelbaum refused to answer. Finally, after three hours, he was released. Appelbaum is the only known American member of Wikileaks and the leading evangelist for the software program that helped make the leak possible. In a sense, he's a bizarro version of Mark Zuckerberg: If Facebook's ambition is to "make the world more open and connected," Appelbaum has dedicated his life to fighting for anonymity and privacy. An anarchist street kid raised by a heroin-addict father, he dropped out of high school, taught himself the intricacies of code and developed a healthy paranoia along the way. "I don't want to live in a world where everyone is watched all the time," he says. "I want to be left alone as much as possible. I don't want a data trail to tell a story that isn't true." We have transferred our most intimate and personal information - our bank accounts, e-mails, photographs…
Todd Suomela

Amateur Science and the Rise of Big Science | Citizen Scientists League - 0 views

  • Several trends came together to increase the professional nature of scientific work. First was the increasing cost of scientific work and its complexity. Scientific equipment became more precise and expensive. Telescopes, like those by Herschel, became bigger and bigger. Also, the amount of knowledge one needed to gain to contribute became increasingly daunting.
  • Second, the universities changed. Pioneered by the German states, which at the beginning of the 19th century were dismissed as a scientific backwater, universities began offering focused majors that trained students in a specific discipline rather than in classical education as a whole. This reform was led by Wilhelm von Humboldt, brother of the famous scientist Alexander von Humboldt, who served as the Prussian Minister of Education.
  • Germany, once united, also provided impetus to two other trends that accelerated its dominance of science and the decline of amateurs. First was the beginning of large-scale state sponsorship of science through grants, first facilitated through the Kaiser Wilhelm Institute (now the Max Planck Institute). This eventually supplanted prizes as the dominant large-scale source of scientific funding. Countries like France that relied on prizes began to fall behind. Second was the intimate cooperation between industrial firms like BASF and universities.
  • ...1 more annotation...
  • The final nail in the coffin was undoubtedly the Second World War. The massive mobilization of scientific resources needed to win and the discovery of war-winning inventions such as the atomic bomb and the self-correcting bomb sight (with help from Norbert Wiener of MIT) convinced the nations of the world that the future was in large-scale funding and support of science as a continuous effort. Vannevar Bush, former president of MIT, and others pioneered the National Science Foundation, and the military also invested heavily in its own research centers. Industrial labs such as those of Bell Labs, GE, Kodak, and others began dominating research as well. Interestingly, the first military investment in semiconductors, coupled with research from Bell Labs, led to what is now known as Silicon Valley.
Todd Suomela

Ockham's Razor is Dull « Apperceptual - 0 views

  •  
    For a period of about a decade, extending from my late undergraduate years to my early postdoctoral years, it would be fair to say that I was obsessed with Ockham's razor. I was convinced that it was the key to understanding how we acquire knowledge about the world. I no longer believe in Ockham's razor.
thinkahol *

Who is Peter Joseph? | Watch Free Documentary Online - 0 views

  •  
    In late 2009, Charles Robinson was able to interview Peter Joseph - the creator of Zeitgeist: The Movie, Zeitgeist: Addendum, Zeitgeist: Moving Forward, several lectures and a presentation; founder of The Zeitgeist Movement; and a friend of Jacque Fresco - in his home. Joseph described himself and his life in detail in what is likely a rare interview. He was kind enough to provide previously unreleased media and video, and in turn Charles did his best to create a documentary (albeit somewhat poor in quality compared to Joseph's own work) that would help express who this person is. Peter Joseph was born in North Carolina to a middle-class family. He has said in interviews that his mother's role as a social worker helped shape his opinion and impressions of American life. He later moved to New York to attend art school. Currently he lives and works in New York City as a freelance film editor/composer/producer for various industries. Due to the controversial content of his films and a desire to keep his day job private, he has not released his full name to the public.
thinkahol *

Citizen Scientist 2.0 - 1 views

  •  
    What does the future of science look like? About a year ago, I was asked this question. My response then was: transdisciplinary collaboration. Researchers from a variety of domains - biology, philosophy, psychology, neuroscience, economics, law - all coming together, using inputs from each specialized area to generate the best comprehensive solutions to society's most persistent problems. Indeed, it appears I was on the right track, as more and more academic research departments, as well as industries, are seeing the value in this type of partnership. Now let's take this a step further. Not only do I think we will be relying on inputs from researchers and experts from multiple domains to solve scientific problems, but I see society itself getting involved on a much more significant level as well. And I don't just mean science awareness. I'm talking about actually participating in the research itself. Essentially, I see a huge boom in the future for Citizen Science.
Todd Suomela

The Technium: The World Without Technology - 0 views

  • Although strictly speaking simple tools are a type of technology made by one person, we tend to think of technology as something much more complicated. But in fact technology is anything designed by a mind. Technology includes not only nuclear reactors and genetically modified crops, but also bows and arrows, hide tanning techniques, fire starters, and domesticated crops. Technology also includes intangible inventions such as calendars, mathematics, software, law, and writing, as these too derive from our heads. But technology also must include birds' nests and beaver dams since these too are the work of brains. All technology, both the chimp's termite fishing spear and the human's fishing spear, the beaver's dam and the human's dam, the warbler's hanging basket and the human's hanging basket, the leafcutter ant's garden and the human's garden, are all fundamentally natural. We tend to isolate human-made technology from nature, even to the point of thinking of it as anti-nature, only because it has grown to rival the impact and power of its home. But in its origins and fundamentals a tool is as natural as our life.
  • The gravity of technology holds us where we are. We accept our attachment. But to really appreciate the effects of technology – both its virtues and costs -- we need to examine the world of humans before technology. What were our lives like without inventions? For that we need to peek back into the Paleolithic era when technology was scarce and humans lived primarily surrounded by things they did not make. We can also examine the remaining contemporary hunter-gatherer tribes still living close to nature to measure what, if anything, they gain from the small amount of technology they use.
  • Then about 50,000 years ago something amazing happened. While the bodies of early humans in Africa remained unchanged, their genes and minds shifted noticeably. For the first time hominins were full of ideas and innovation. These newly vitalized modern humans, which we now call Sapiens, charged into new regions beyond their ancestral homes in eastern Africa. They fanned out from the grasslands and in a relatively brief burst exploded from a few tens of thousands in Africa to an estimated 8 million worldwide just before the dawn of agriculture 10,000 years ago.
  • ...1 more annotation...
  • It should have been clear to the Neanderthals, as it is now clear to us in the 21st century, that something new and big had appeared - a new biological and geological force. A number of scientists (Richard Klein, Ian Tattersall, William Calvin, among many others) think that the "something" that happened 50,000 years ago was the invention of language. Up until this point, humanoids were smart. They could make crude tools in a hit-or-miss way and handle fire - perhaps like an exceedingly smart chimp. The African hominin's brain size and physical stature had leveled off, but evolution continued inside the brain. "What happened 50,000 years ago," says Klein, "was a change in the operating system of humans. Perhaps a point mutation affected the way the brain is wired that allowed language as we understand language today: rapidly produced, articulate speech." Instead of acquiring a larger brain, as the Neanderthals and Erectus did, Sapiens gained a rewired brain. Language altered the Neanderthal-type mind, and allowed Sapiens minds for the first time to invent with purpose and deliberation. Philosopher Daniel Dennett crows in elegant language: "There is no step more uplifting, more momentous in the history of mind design, than the invention of language. When Homo sapiens became the beneficiary of this invention, the species stepped into a slingshot that has launched it far beyond all other earthly species." The creation of language was the first singularity for humans. It changed everything. Life after language was unimaginable to those on the far side before it.
thinkahol *

On "Consciousness: The Black Hole of Neuroscience" aka the "hard" problem | Thinkahol's... - 1 views

  •  
    What had been lacking until relatively recently was an overarching framework or theory through which to grasp the nature of consciousness. The lack of a general theory of consciousness, of how it comes to be that there is something that it is like to be, was really the last rational bastion of opposition to the scientific assertion that consciousness emerges from the brain.
Todd Suomela

Thatcher, Scientist - 0 views

  •  
    This paper has two halves. First, I piece together what we know about Margaret Thatcher's training and employment as a scientist. She took science subjects at school; she studied chemistry at Oxford, arriving during World War II and coming under the influence (and comment) of two excellent women scientists, Janet Vaughan and Dorothy Hodgkin. She did a fourth-year dissertation on X-ray crystallography of gramicidin just after the war. She then gathered four years' experience as a working industrial chemist, at British Xylonite Plastics and at Lyons. Second, my argument is that, having lived the life of a working research scientist, she had a quite different view of science from that of any other minister responsible for science. This is crucial in understanding her reaction to the proposals-associated with the Rothschild reforms of the early 1970s-to reinterpret aspects of science policy in market terms. Although she was strongly pressured by bodies such as the Royal Society to reaffirm the established place of science as a different kind of entity-one, at least at core, that was unsuitable to marketization-Thatcher took a different line.
thinkahol *

YouTube - Think faster, focus better, and remember more: Rewiring our brain to stay younger... - 0 views

  •  
    October 24, 2008 - Google Tech Talks June 16, 2008 ABSTRACT Explore the brain's amazing ability to change throughout a person's life. This phenomenon - called neuroplasticity - is the science behind brain fitness, and it has been called one of the most extraordinary scientific discoveries of the 20th century. PBS recently aired this special, The Brain Fitness Program, which explains the brain's complexities in a way that both scientists and people with no scientific background can appreciate. This is an opportunity to learn more about how our minds work - and to find out more about the latest in cutting-edge brain research - from the founder of Posit Science and creator of the Brain Fitness Program software, Dr. Michael Merzenich. Speaker: Michael M. Merzenich, PhD, Chief Scientific Officer. Dr. Merzenich leads the company's scientific team. For more than three decades, Dr. Merzenich has been a leading pioneer in brain plasticity research. He is the Francis A. Sooy Professor at the Keck Center for Integrative Neurosciences at UCSF. Dr. Merzenich is a member of the National Academy of Sciences. He is the recipient of numerous awards and prizes, including the Ipsen Prize, the Zulch Prize of the Max Planck Institute, the Thomas Alva Edison Patent Award, and the Purkinje Medal. Dr. Merzenich has published more than 200 articles, including many in leading peer-reviewed journals such as Science and Nature. His work is also often covered in the popular press, including the New York Times, the Wall Street Journal, Time, and Newsweek. He has appeared on Sixty Minutes II, CBS Evening News, and Good Morning America. In the late 1980s, Dr. Merzenich was on the team that invented the cochlear implant, now distributed by market leader Advanced Bionics. In 1996, Dr. Merzenich was the founding CEO of Scientific Learning Corporation (Nasdaq: SCIL), which markets and distributes software that applies principles of brain plasticity to assist children with language
Todd Suomela

H. M., an Unforgettable Amnesiac, Dies at 82 - Obituary (Obit) - NYTimes.com - 0 views

  •  
    In 1953, he underwent an experimental brain operation in Hartford to correct a seizure disorder, only to emerge from it fundamentally and irreparably changed. He developed a syndrome neurologists call profound amnesia. He had lost the ability to form new memories. For the next 55 years, each time he met a friend, each time he ate a meal, each time he walked in the woods, it was as if for the first time. And for those five decades, he was recognized as the most important patient in the history of brain science.
thinkahol *

Let's Regulate Facebook! | The Awl - 0 views

  •  
    Apparently, if Facebook wanted to repair its reputation, all it had to do was seem like it was helping to topple an authoritarian regime. Now that the U.S. media is loudly pushing the idea that social media can change Egypt-and next, the world!-it makes Mark Zuckerberg's tendency to monetize every aspect of our online lives seem less important.
Todd Suomela

The Professional and the Scientist in Nineteenth-Century America - JSTOR: Isis, Vol. 10... - 0 views

  •  
    "In nineteenth‐century America, there was no such person as a "professional scientist." There were professionals and there were scientists, but they were very different. Professionals were men of science who engaged in commercial relations with private enterprises and took fees for their services. Scientists were men of science who rejected such commercial work and feared the corrupting influences of cash and capitalism. Professionals portrayed themselves as active and useful members of an entrepreneurial polity, while scientists styled themselves as crusading reformers, promoters of a purer science and a more research‐oriented university. It was this new ideology, embodied in these new institutions, that spurred these reformers to adopt a special name for themselves - "scientists." One object of this essay, then, is to explain the peculiar Gilded Age, American origins of that ubiquitous term. A larger goal is to explore the different social roles of the professional and the scientist. By attending to the particular vocabulary employed at the time, this essay tries to make clear why a "professional scientist" would have been a contradiction in terms for both the professional and the scientist in nineteenth‐century America."
Todd Suomela

Human Computer Interaction (HCI) by John M. Carroll - Interaction-Design.org: HCI, Usab... - 0 views

  • The challenge of personal computing became manifest at an opportune time. The broad project of cognitive science, which incorporated cognitive psychology, artificial intelligence, linguistics, cognitive anthropology, and the philosophy of mind, had formed at the end of the 1970s. Part of the programme of cognitive science was to articulate systematic and scientifically-informed applications to be known as "cognitive engineering". Thus, at just the point when personal computing presented the practical need for HCI, cognitive science presented people, concepts, skills, and a vision for addressing such needs. HCI was one of the first examples of cognitive engineering. Other historically fortuitous developments contributed to the establishment of HCI. Software engineering, mired in unmanageable software complexity in the 1970s, was starting to focus on nonfunctional requirements, including usability and maintainability, and on non-linear software development processes that relied heavily on testing. Computer graphics and information retrieval had emerged in the 1970s, and rapidly came to recognize that interactive systems were the key to progressing beyond early achievements. All these threads of development in computer science pointed to the same conclusion: The way forward for computing entailed understanding and better empowering users.
  • One of the most significant achievements of HCI is its evolving model of the integration of science and practice. Initially this model was articulated as a reciprocal relation between cognitive science and cognitive engineering. Later, it ambitiously incorporated a diverse science foundation, notably Activity Theory, distributed cognition, and ethnomethodology, and a culturally embedded conception of human activity, including the activities of design and technology development. Currently, the model is incorporating design practices and research across a broad spectrum. In these developments, HCI provides a blueprint for a mutual relation between science and practice that is unprecedented.
  • In the latter 1980s and early 1990s, HCI assimilated ideas from Activity Theory, distributed cognition, and ethnomethodology. This comprised a fundamental epistemological realignment. For example, the representational theory of mind, a cornerstone of cognitive science, is no longer axiomatic for HCI science. Information processing psychology and laboratory user studies, once the kernel of HCI research, became important, but niche areas. The most canonical theory-base in HCI now is socio-cultural, Activity Theory. Field studies became typical, and eventually dominant as an empirical paradigm. Collaborative interactions, that is, groups of people working together through and around computer systems (in contrast to the early 1980s user-at-PC situation) have become the default unit of analysis. It is remarkable that such fundamental realignments were so easily assimilated by the HCI community.
thinkahol *

25% of US car accidents due to using gadgets | KurzweilAI - 0 views

  •  
    Driving distractions such as cell phones and other electronic devices cause as much as 25% of all U.S. car accidents, researchers at the Governors Highway Safety Association have found, WinBETA notes. A major finding was that distraction was the cause of 15 to 25% of all accidents, ranging from minor property damage to death. Their findings suggest that distracted driving accidents be reported in accident reports to assist in evaluating distracted driving laws and programs. They propose creating low-cost roadway measures that alert motorists when they are drifting out of their driving lane. They also propose that all cell phones be banned on the road, even hands-free versions. In another report by The New York Times, police in Syracuse and Hartford have handed out nearly 20,000 tickets for illegal use of a phone while driving - either for texting or use of a handheld phone. According to the federal government, these efforts have had the desired effect: distracted driving has fallen sharply. Their research shows that drivers talking on a phone are four times as likely to get into a crash as those not on a phone, and that the risks for motorists who text are at least twice as high. In Syracuse, handheld cellphone use and texting have each fallen by one-third. In Hartford, handheld cellphone use by drivers fell 57 percent while texting fell by 75 percent, the Times reports. Ref.: Vicki Harper et al., Distracted Driving: What Research Shows and What States Can Do, GHSA, 2011
Todd Suomela

Sociology and History: Shapin on the Merton Thesis « Ether Wave Propaganda - 1 views

  • Shapin observed that the link Merton drew between Puritanism and seventeenth-century English science was a matter of happenstance rather than determinism. According to Merton, science requires certain “values” and “sentiments” allowing intellectual individualism, and fostering not only an interest in the transcendent, but also secular improvement. It so happened that these values and sentiments were to be found in Puritan asceticism and sense of social obligation, which thus provided a social context in which science could develop. Definitively, this was not to say that Puritanism provided a unique source of these values and sentiments, or that science did not have other roots. It was obviously possible for science to develop in Catholic contexts as well, despite the less hospitable value system of Catholicism. The confluence of values simply seemed to promise some insight into the growth of science in a particular time and place.
  • Robert K. Merton’s “functionalist” sociology viewed “science” as a kind of Weberian ideal type — a form of thought that is identifiable by its peculiar, philosophically-defined characteristics. Merton’s sociology of science held that this thought could also be identified with social behaviors, characterized by a set of “norms”, which made the thought possible. The Merton Thesis (which slightly predates Merton’s enumeration of science’s norms) holds that the rise of science in early-modern England could be linked to the social behaviors valued by the Puritanism of that milieu. This was the subject of Merton’s PhD thesis and his 1938 book Science, Technology and Society in Seventeenth-Century England.
  •  
    Great link. Thanks
seth kutcher

Fast and Reliable Computer Repair Services - 1 views

One day, I was working on my thesis which was due in three days and then suddenly my computer shut down. I then browsed for companies that offer computer repair services and found Computer Hardwar...

computer repair services

started by seth kutcher on 02 Nov 11 no follow-up yet
David Mills

Quality GPS Tracking Products - 3 views

I am very happy that I purchased Securatrak’s GPS car tracking device because I was able to retrieve my stolen car real fast. I gave all the GPS information of my car's exact whereabouts to t...

GPS tracking car

started by David Mills on 03 Jun 13 no follow-up yet
Todd Suomela

Tom Johnson, 1923–2012 - News Blog - SkyandTelescope.com - 0 views

  •  
    Thomas J. Johnson, the creator of the modern Schmidt-Cassegrain telescope and the founder of Celestron, died early this morning (March 13, 2012), according to Celestron president and CEO Joe Lupica. Johnson was 89. He ranked among the most important figures shaping the last half century of amateur astronomy.
thinkahol *

Carl Sagan Day | Center for Inquiry - 2 views

  •  
    Please join us this November as we honor Carl Sagan and celebrate the beauty and wonder of the cosmos he so eloquently described. Carl Sagan was a Professor of Astronomy and Space Science and Director of the Laboratory for Planetary Studies at Cornell University, but most of us know him as a Pulitzer Prize-winning author and the creator of COSMOS. That Emmy and Peabody award-winning PBS television series transformed educational television when it first aired in 1980, and now, thirty years later, it has gone on to affect the hearts and minds of over a billion people in sixty countries. No other scientist has been able to reach and teach so many nonscientists in such a meaningful way, and that is why we celebrate Dr. Sagan, remember his work, and revel in the cosmos he helped us understand.