Science Technology Society: Group items tagged "together"

thinkahol *

EPFL spinoff turns thousands of 2D photos into 3D images | KurzweilAI - 0 views

  •  
    Researchers in EPFL's Computer Vision Laboratory developed a computer-based modeling service that generates a 3D image from up to thousands of 2D shots, with all the processing done in the cloud. Since April, the EPFL startup Pix4D has been offering the modeling service with a fourth dimension: time. Now, individuals and small businesses looking for fast, cheap, large-scale 3D models can get them without investing in heavy processing, the company states.

    With Pix4D, users upload a series of photos of an object, and within 30 minutes they have a 3D image. The software identifies "points of interest" among the photos: common points of high-contrast pixels. Next, the program pastes the images together seamlessly by matching up the points of interest. Much in the same way our two eyes work together to calculate depth, the software computes the distance and angle between two or more photos and lays the image over the model appropriately, creating a highly accurate 3D model that avoids the time-intensive, "point by point" wireframe method. With Pix4D's 3D models, you can navigate in all directions as well as change the date on a timeline to see what a place looked like at different times of the year.

    The company is collaborating with several drone makers (including another EPFL startup, senseFly) to market their software as a package with senseFly's micro aerial vehicles, or autonomous drones. Pix4D's time element avoids waiting for Google to update its satellite data or for an expensive plane to fly by and take high-resolution photos. Farmers, for example, can now send relatively inexpensive flying drones into the air to take pictures as often as they like, allowing them to survey the evolution of their crops over large distances and long periods of time. And since the calculations are done on a cloud server, the client doesn't need a powerful computer of their own.
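A minimal sketch of the two-photo version of the pipeline described above (detect points of interest, match them, recover the relative distance and angle between shots, triangulate 3D points), using OpenCV rather than Pix4D's actual software; the file names and the camera intrinsic matrix K are illustrative assumptions:

```python
# Sketch of point-of-interest matching and 3D reconstruction with OpenCV (not Pix4D's code).
import cv2
import numpy as np

# Two overlapping photos of the same object (hypothetical file names).
img1 = cv2.imread("shot_001.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("shot_002.jpg", cv2.IMREAD_GRAYSCALE)

# Detect high-contrast "points of interest" and compute descriptors.
orb = cv2.ORB_create(nfeatures=5000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

# Match the points of interest between the two photos.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

# Assumed camera intrinsics (focal length and principal point, in pixels).
K = np.array([[1000.0,    0.0, 640.0],
              [   0.0, 1000.0, 480.0],
              [   0.0,    0.0,   1.0]])

# Recover the relative pose between the two shots (rotation R, translation t).
E, inliers = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
_, R, t, inliers = cv2.recoverPose(E, pts1, pts2, K, mask=inliers)

# Triangulate the matched points into a sparse 3D point cloud.
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([R, t])
pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
pts3d = (pts4d[:3] / pts4d[3]).T  # homogeneous -> Euclidean coordinates
print(f"Recovered {len(pts3d)} 3D points from two photos")
```

A full photogrammetry service repeats this across thousands of photos and refines all poses and points jointly (bundle adjustment), but the core idea is the point-of-interest matching the article describes.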
Adam Fleaming

Initiative Targets Big Data Workloads {Open Hybrid} - Compliance4all - 0 views

  •  
    Hortonworks, IBM and Red Hat today announced they're banding together to build a consistent hybrid computing architecture for big data workloads. Dubbed the Open Hybrid Architecture Initiative, the program pledges simplicity of deployment and freedom of movement for data apps.

    The rapid ascent of cloud computing platforms like AWS, Azure, and Google Cloud has given enterprises abundant new options for storing data and deploying processing-intensive applications, such as deep learning and real-time stream processing. Throw in the progress being made at the edge, with sensors and speedy ARM chips collecting and processing massive amounts of data, and you have the makings of a computing revolution.

    While the computing possibilities in the cloud and on the edge may appear bountiful, the reality is that the underlying architectures for building apps that can span these three modes are just starting to come together. Enterprises today face a dearth of repeatable patterns to guide their developers, administrators, and architects, who are tasked with building, deploying, and maintaining hybrid applications that span not just the cloud and the edge, but traditional on-prem data centers too. Hybrid computing architecture for big data workloads https://goo.gl/GQVXjs
Todd Suomela

Thatcher, Scientist - 0 views

  •  
    This paper has two halves. First, I piece together what we know about Margaret Thatcher's training and employment as a scientist. She took science subjects at school; she studied chemistry at Oxford, arriving during World War II and coming under the influence (and comment) of two excellent women scientists, Janet Vaughan and Dorothy Hodgkin. She did a fourth-year dissertation on X-ray crystallography of gramicidin just after the war. She then gathered four years' experience as a working industrial chemist, at British Xylonite Plastics and at Lyons. Second, my argument is that, having lived the life of a working research scientist, she had a quite different view of science from that of any other minister responsible for science. This is crucial in understanding her reaction to the proposals, associated with the Rothschild reforms of the early 1970s, to reinterpret aspects of science policy in market terms. Although she was strongly pressured by bodies such as the Royal Society to reaffirm the established place of science as a different kind of entity, one that was, at least at its core, unsuitable for marketization, Thatcher took a different line.
thinkahol *

Breakthrough chip technology lights path to exascale computing: Optical signals connect... - 0 views

  •  
    ScienceDaily (Dec. 3, 2010) - IBM scientists have unveiled a new chip technology that integrates electrical and optical devices on the same piece of silicon, enabling computer chips to communicate using pulses of light (instead of electrical signals), resulting in smaller, faster and more power-efficient chips than is possible with conventional technologies.
Todd Suomela

ISHPSSB - 0 views

  •  
    The International Society for History, Philosophy, and Social Studies of Biology (ISHPSSB) brings together scholars from diverse disciplines, including the life sciences as well as history, philosophy, and social studies of science. ISHPSSB summer meetings are known for innovative, transdisciplinary sessions, and for fostering informal, co-operative exchanges and on-going collaborations.
thinkahol *

DNA can discern between two quantum states, research shows - 0 views

  •  
    ScienceDaily (June 4, 2011) - Do the principles of quantum mechanics apply to biological systems? Until now, says Prof. Ron Naaman of the Weizmann Institute's Chemical Physics Department (Faculty of Chemistry), both biologists and physicists have considered quantum systems and biological molecules to be like apples and oranges. But research he conducted together with scientists in Germany, which appeared recently in Science, shows that a biological molecule -- DNA -- can discern between quantum states known as spin.
thinkahol *

Quantum Computers and Parallel Universes - YouTube - 0 views

  •  
    Complete video at: http://fora.tv/2009/05/23/Marcus_Chown_in_Conversation_with_Fred_Watson

    Marcus Chown, author of Quantum Theory Cannot Hurt You: A Guide to the Universe, discusses the mechanics behind quantum computers, explaining that they function by having atoms exist in multiple places at once. He predicts that quantum computers will be produced within 20 years.

    The two towering achievements of modern physics are quantum theory and Einstein's general theory of relativity. Together, they explain virtually everything about the world in which we live. But almost a century after their advent, most people haven't the slightest clue what either is about. Radio astronomer, award-winning writer and broadcaster Marcus Chown talks to fellow stargazer Fred Watson about his book Quantum Theory Cannot Hurt You. - Australian Broadcasting Corporation

    Marcus Chown is an award-winning writer and broadcaster. Formerly a radio astronomer at the California Institute of Technology, he is now cosmology consultant of the weekly science magazine New Scientist. The Magic Furnace, Marcus' second book, was chosen in Japan as one of the Books of the Year by Asahi Shimbun. In the UK, the Daily Mail called it "a dizzy page-turner with all the narrative devices you'd expect to find in Harry Potter". His latest book is called Quantum Theory Cannot Hurt You.
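As a textbook aside (not from the video itself): "existing in multiple states at once" is superposition, and it is the source of a quantum computer's parallelism. A minimal sketch of the standard formalism:

```latex
% One qubit can be in a superposition of both classical states at once:
\[
  \lvert \psi \rangle = \alpha \lvert 0 \rangle + \beta \lvert 1 \rangle,
  \qquad \lvert \alpha \rvert^{2} + \lvert \beta \rvert^{2} = 1
\]
% A register of n qubits carries amplitudes for all 2^n basis states,
% so a single operation acts on all of them simultaneously:
\[
  \lvert \Psi \rangle = \sum_{x=0}^{2^{n}-1} c_{x} \lvert x \rangle
\]
```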
Todd Suomela

The Technium: Chosen, Inevitable, and Contingent - 0 views

  • There are two senses of "inevitable" when used with technology. In the first case, an invention merely has to exist once. In that sense, every technology is inevitable because sooner or later some mad tinkerer will cobble together almost anything that can be cobbled together. Jetpacks, underwater homes, glow-in-the-dark cats, forgetting pills — in the goodness of time every invention will inevitably be conjured up as a prototype or demo. And since simultaneous invention is the rule not the exception, any invention that can be invented will be invented more than once. But few will be widely adopted. Most won't work very well. Or more commonly they will work but be unwanted. So in this trivial sense, all technology is inevitable. Rewind the tape of time and it will be re-invented. The second more substantial sense of "inevitable" demands a level of common acceptance and viability. A technology's use must come to dominate the technium or at least its corner of the technosphere. But more than ubiquity, the inevitable must contain a large-scale momentum, and proceed on its own determination beyond the free choices of several billion humans. It can't be diverted by mere social whims.
thinkahol *

The Robot Revolution Is Upon Us Already | Plus Ultra Technologies/30 steps - 0 views

  •  
    Humans and machines working together in the arts?  This is just the beginning!
thinkahol *

Citizen Scientist 2.0 - 1 views

  •  
    What does the future of science look like? About a year ago, I was asked this question. My response then was: Transdisciplinary collaboration. Researchers from a variety of domains (biology, philosophy, psychology, neuroscience, economics, law) all coming together, using inputs from each specialized area to generate the best comprehensive solutions to society's more persistent problems. Indeed, it appears as if I was on the right track, as more and more academic research departments, as well as industries, are seeing the value in this type of partnership. Now let's take this a step further. Not only do I think we will be relying on inputs from researchers and experts from multiple domains to solve scientific problems, but I see society itself getting involved on a much more significant level as well. And I don't just mean science awareness. I'm talking about actually participating in the research itself. Essentially, I see a huge boom in the future for Citizen Science.
techinstro com

Description and Specification of Graphene - 0 views

  •  
    Graphene is a single-atom-thick sheet of carbon atoms bonded together, sharing electrons in a hexagonal, honeycomb-like structure. It is the first single-atom-thick 2D material ever discovered: a layer of sp2-bonded carbon atoms arranged in a hexagonal lattice, like a chicken-wire mesh. It is roughly 200 times stronger than stainless steel and 100,000 times thinner than a human hair, making it the thinnest and strongest material known. It is remarkable that a material harder than diamond yet lightweight, and stronger than steel yet highly flexible, can be derived from naturally occurring graphite, and is still thin enough to be mistaken for saran wrap. Beyond these physical properties, its other features are impressive as well.

    Properties of Graphene

    These are some of the prominent features that make it a high-tech material:

    Excellent electronic conductor. Graphene is a zero-overlap semimetal, which gives it high electrical conductivity. A carbon atom has 6 electrons: 2 in the inner shell and 4 in the outer shell. Although each of the 4 outer electrons can ordinarily bond to a separate atom, in graphene's two-dimensional lattice each carbon atom bonds to only three neighbors, leaving one electron per atom available for electronic conduction. These free electrons are known as "pi" electrons and are found above and below the sheet.

    Ultimate tensile strength. Mechanical strength is another prominent property. Graphene is considered the strongest material ever discovered, owing to its 0.142 nm-long carbon bonds, and its tensile strength measures 130 gigapascals (130,000,000,000 pascals). Compared to the tensile strength of industr
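A quick back-of-the-envelope check of two numbers quoted above; a minimal sketch, where the stainless-steel strength (~0.6 GPa) is an assumed typical value, not a figure from the original text:

```python
# Sanity-check the arithmetic behind the graphene claims quoted above.

# Free "pi" electrons: each carbon atom has 6 electrons (2 core + 4 valence),
# and 3 of the valence electrons form in-plane bonds to neighboring atoms.
total_electrons = 6
core_electrons = 2
valence_electrons = total_electrons - core_electrons    # 4
in_plane_bonds = 3
free_pi_electrons = valence_electrons - in_plane_bonds  # 1 electron left to conduct
print(f"Free 'pi' electrons per atom: {free_pi_electrons}")

# Tensile strength: 130 GPa is from the text; ~0.6 GPa for stainless steel
# is an assumed typical value (not from the original article).
graphene_gpa = 130.0
steel_gpa = 0.6
print(f"130 GPa = {graphene_gpa * 1e9:.0f} Pa")
print(f"Strength ratio vs. steel: ~{graphene_gpa / steel_gpa:.0f}x")  # ~217x, consistent with the '200x' claim
```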
Todd Suomela

Amateur Science and the Rise of Big Science | Citizen Scientists League - 0 views

  • Several trends came together to increase the professional nature of scientific work. First was the increasing cost of scientific work and its complexity. Scientific equipment became more precise and expensive. Telescopes, like those by Herschel, became bigger and bigger. Also, the amount of knowledge one needed to gain to contribute became increasingly daunting.
  • Second, the universities changed. Pioneered by the German states, which at the beginning of the 19th century were dismissed as a scientific backwater, universities began offering focused majors that trained students in a specific discipline rather than in classical education as a whole. This change was led by Wilhelm von Humboldt, the Prussian Minister of Education and brother of the famous scientist Alexander von Humboldt.
  • Germany, once united, also provided impetus to two other trends that accelerated its dominance of science and the decline of amateurs. First was the beginning of large-scale state sponsorship of science through grants, initially facilitated through the Kaiser Wilhelm Institute (now the Max Planck Institute). This eventually supplanted prizes as the dominant large-scale source of scientific funding, and countries like France that relied on prizes began to fall behind. Second was the intimate cooperation between industrial firms like BASF and universities.
  • The final nail in the coffin was undoubtedly the Second World War. The massive mobilization of scientific resources needed to win, and the discovery of war-winning inventions such as the atomic bomb and the self-correcting bomb sight (with help from Norbert Wiener of MIT), convinced the nations of the world that the future lay in large-scale funding and support of science as a continuous effort. Vannevar Bush, formerly of MIT, and others pioneered the National Science Foundation, and the military also invested heavily in its own research centers. Industrial labs such as those at Bell Labs, GE, Kodak, and others began dominating research as well. Interestingly, the first military investment in semiconductors, coupled with research from Bell Labs, led to what is now known as Silicon Valley.
scott tucker

Scott Tucker: Long Live the New American Revolution - 0 views

  •  
    The stitched-together movement that is overflowing from the Wall Street protest can have a huge impact if it holds firm against a malevolent corporatism and the political hucksters who dangle promises of "hope and change."
Todd Suomela

Human Computer Interaction (HCI) by John M. Carroll - Interaction-Design.org: HCI, Usab... - 0 views

  • The challenge of personal computing became manifest at an opportune time. The broad project of cognitive science, which incorporated cognitive psychology, artificial intelligence, linguistics, cognitive anthropology, and the philosophy of mind, had formed at the end of the 1970s. Part of the programme of cognitive science was to articulate systematic and scientifically-informed applications to be known as "cognitive engineering". Thus, at just the point when personal computing presented the practical need for HCI, cognitive science presented people, concepts, skills, and a vision for addressing such needs. HCI was one of the first examples of cognitive engineering. Other historically fortuitous developments contributed to the establishment of HCI. Software engineering, mired in unmanageable software complexity in the 1970s, was starting to focus on nonfunctional requirements, including usability and maintainability, and on non-linear software development processes that relied heavily on testing. Computer graphics and information retrieval had emerged in the 1970s, and rapidly came to recognize that interactive systems were the key to progressing beyond early achievements. All these threads of development in computer science pointed to the same conclusion: The way forward for computing entailed understanding and better empowering users.
  • One of the most significant achievements of HCI is its evolving model of the integration of science and practice. Initially this model was articulated as a reciprocal relation between cognitive science and cognitive engineering. Later, it ambitiously incorporated a diverse science foundation, notably Activity Theory, distributed cognition, and ethnomethodology, and a culturally embedded conception of human activity, including the activities of design and technology development. Currently, the model is incorporating design practices and research across a broad spectrum. In these developments, HCI provides a blueprint for a mutual relation between science and practice that is unprecedented.
  • In the latter 1980s and early 1990s, HCI assimilated ideas from Activity Theory, distributed cognition, and ethnomethodology. This comprised a fundamental epistemological realignment. For example, the representational theory of mind, a cornerstone of cognitive science, is no longer axiomatic for HCI science. Information processing psychology and laboratory user studies, once the kernel of HCI research, became important but niche areas. The most canonical theory-base in HCI now is socio-cultural: Activity Theory. Field studies became typical, and eventually dominant, as an empirical paradigm. Collaborative interactions, that is, groups of people working together through and around computer systems (in contrast to the early-1980s user-at-PC situation), have become the default unit of analysis. It is remarkable that such fundamental realignments were so easily assimilated by the HCI community.