
GAVNet Collaborative Curation / Group items tagged: development


Steve Bosserman

Specifying AI safety problems in simple environments | DeepMind

  • "As AI systems become more general and more useful in the real world, ensuring they behave safely will become even more important. To date, the majority of technical AI safety research has focused on developing a theoretical understanding about the nature and causes of unsafe behaviour. Our new paper builds on a recent shift towards empirical testing (see Concrete Problems in AI Safety) and introduces a selection of simple reinforcement learning environments designed specifically to measure 'safe behaviours'."
Steve Bosserman

About - Catalog

  • "We are developing next generation technology to store digital information in DNA molecules. Our vision is to fit the information content of entire data centers in the palm of your hand. We have proven our approach to encoding data in DNA and are in the process of scaling up our platform. CATALOG technology will make it economically attractive to use DNA as a medium for long-term archival of data."
Bill Fulkerson

High-speed, Non-deformation Catching with High-speed Vision and Proximity Feedback...

  • "We demonstrated the high-speed, non-deformation catching of a marshmallow. The marshmallow is a very soft object that is difficult to grasp without deforming its surface. For the catching, we developed a 1 ms sensor fusion system combining a high-speed active vision sensor and a high-speed, high-precision proximity sensor. Generally, tactile feedback is used to grasp various kinds of soft objects without deforming them; however, a robot hand relying on tactile feedback alone tends to deform the object surface. Slowing the grasping speed reduces the deformation but lengthens the grasping time. The 1 ms sensor fusion system enabled seamless, highly sensitive sensing from the non-contact to the contact state. The robot hand could control fingertip position dynamically and precisely based on the visual and proximity feedback. With the proximity feedback, contact with the object was detected before its surface deformed, and the grasping motion was stopped. The robot hand could catch the marshmallow even when its position and posture varied. http://www.k2.t.u-tokyo.ac.jp/fusion/..."
Steve Bosserman

Gamification has a dark side

  • Gamification is the application of game elements into nongame spaces. It is the permeation of ideas and values from the sphere of play and leisure to other social spaces. It’s premised on a seductive idea: if you layer elements of games, such as rules, feedback systems, rewards and videogame-like user interfaces over reality, it will make any activity motivating, fair and (potentially) fun. ‘We are starving and games are feeding us,’ writes Jane McGonigal in Reality Is Broken (2011). ‘What if we decided to use everything we know about game design to fix what’s wrong with reality?’
  • But gamification’s trappings of total fun mask that we have very little control over the games we are made to play – and hide the fact that these games are not games at all. Gamified systems are tools, not toys. They can teach complex topics and engage us with otherwise difficult problems. Or they can function as subtle systems of social control.
  • The problem of the gamified workplace goes beyond micromanagement. The business ethicist Tae Wan Kim at Carnegie Mellon University in Pittsburgh warns that gamified systems have the potential to complicate and subvert ethical reasoning. He cites the example of a drowning child. If you save the child, motivated by empathy, sympathy or goodwill – that’s a morally good act. But say you gamify the situation. Say you earn points for saving drowning children. ‘Your gamified act is ethically unworthy,’ he explained to me in an email. Providing extrinsic gamified motivators, even if they work as intended, deprives us of the option to live worthy lives, Kim argues. ‘The workplace is a sacred space where we develop ourselves and help others,’ he notes. ‘Gamified workers have difficulty seeing what contributions they really make.’
  • The 20th-century French philosopher Michel Foucault would have said that these are technologies of power. Today, the interface designer and game scholar Sebastian Deterding says that this kind of gamification expresses a modernist view of a world with top-down managerial control. But the concept is flawed. Gamification promises easy, centralised overviews and control. ‘It’s a comforting illusion because de facto reality is not as predictable as a simulation,’ Deterding says. You can make a model of a city in SimCity that bears little resemblance to a real city. Mistaking games for reality is ultimately mistaking map for territory. No matter how well-designed, a simulation cannot account for the unforeseen.
Steve Bosserman

Tiny Lab-Grown 'Brains' Raise Big Ethical Questions

  • These clusters of living brain cells are popularly known as minibrains, though scientists prefer to call them cerebral organoids. At the moment, they remain extremely rudimentary versions of an actual human brain and are used primarily to study brain development and disorders like autism.
  • But minibrain research is progressing so quickly that scientists need to start thinking about the potential implications now, says Nita Farahany, a professor of law and philosophy at Duke University and the director of Duke Science and Society.
  • "If you're talking about something like schizophrenia or autism, if you want to model those things, it is difficult to do so with animal models and it is ethically impossible in many instances to do so with living humans," She says. But it is possible to grow a minibrain from cells with genetic mutations associated with like autism and watch how it develops.