
Robotics & AI / Group items tagged "computers"


Aasemoon =)

Artificial Intelligence and Robotics: LuminAR to shine a light on the future - 0 views

  • You might think that some devices in the modern age have reached their maximum development level, such as the common desk lamp, but you would be wrong. Natan Linder, a student at the Massachusetts Institute of Technology (MIT), has created a robotic version that can not only light your room but also project internet pages onto your desk. It is an upgrade of the AUR lamp from 2007, which tracked movements around a desk or table and could alter the color, focus, and strength of its light to suit the user's needs. The LuminAR comes with those abilities, and much more. The robotic arm can move about on its own, and combines a vision system with a pico projector, a wireless computer, and a camera. When turned on, the projector will look for a flat space around your room on which to display images. Since it can project more than one internet window, you can check your email and browse another website at the same time.
Aasemoon =)

robots.net - Physics-based Planning - 0 views

  • Later this month, Carnegie Mellon's CMDragons small-size robotic soccer team will be competing again at RoboCup, to be held in Singapore. CMDragons has tended to find its edge in software rather than hardware. The team's latest software advantage is its new "physics-based planning," which uses physics to decide how to move and turn with the ball in order to maintain control. Previous control strategies simply planned where the robot should move to and shoot from, assuming a ball placed at the front center of the dribbler bar would stay there. The goal of RoboCup is to create a humanoid robotic soccer team able to compete against human players by 2050. Manuela Veloso, the professor who leads the Carnegie Mellon robotic soccer lab, "believe[s] that the physics-based planning algorithm is a particularly noteworthy accomplishment" that will take the effort one step closer to that collective goal.
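The idea can be caricatured in a few lines: rather than assume the ball stays put on the dribbler, the planner checks each candidate maneuver against a physical model and discards those the ball could not survive. This is only a toy sketch of that filtering step; the friction model and every parameter are invented for illustration, not CMDragons' actual code.

```python
# Toy sketch of physics-based plan filtering: a turn is feasible only if
# the centripetal acceleration it demands of the ball stays below what
# dribbler friction can supply. All parameters are illustrative.
def ball_stays_on_dribbler(turn_rate, speed, max_lateral_accel=2.0):
    """turn_rate in rad/s, speed in m/s; True if the ball can be
    carried through the turn without slipping off the dribbler."""
    required = abs(turn_rate) * speed  # centripetal accel = omega * v
    return required <= max_lateral_accel

# A naive planner accepts every candidate; a physics-based one filters.
candidates = [(1.0, 2.0), (4.0, 1.0), (0.5, 3.0)]  # (rad/s, m/s)
feasible = [c for c in candidates if ball_stays_on_dribbler(*c)]
```

The point of the filter is the contrast the excerpt draws: the older planners would have accepted all three candidates above.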
Aasemoon =)

Brain-controlled prosthetic limb most advanced yet - 0 views

  • Scientists at the Johns Hopkins University Applied Physics Laboratory (APL) were awarded no less than $34.5 million by the Defense Advanced Research Projects Agency (DARPA) to continue their outstanding work in the field of prosthetic limb testing, which has seen them come up with the most advanced model yet. Their Modular Prosthetic Limb (MPL) system is just about ready to be tested on human subjects, having already proved successful with monkeys. Basically, the prosthetic arm is controlled by the brain through micro-arrays that are (gently) implanted in the head. They record brain signals and send the commands to the computer software that controls the arm. To be honest, it will be interesting to see just how these hair-thin chips are attached to the brain, but APL says clinical tests have shown the devices to be entirely harmless. The monkeys didn't mind them too much, at least.
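The decoding step described above (brain signals in, arm commands out) is often approximated in the BCI literature as a linear map from recorded firing rates to velocity commands. A minimal sketch of that common simplification follows; the weights and rates are made up, and this is not APL's actual MPL software.

```python
# Linear decoder sketch: each output velocity component is a weighted
# sum of recorded firing rates. Weights and rates are invented.
def decode_velocity(rates, weights):
    """rates: firing rates per channel; weights: one row per output axis."""
    return [sum(w * r for w, r in zip(row, rates)) for row in weights]

rates = [10.0, 5.0, 0.0]             # spikes/s from three channels
weights = [[0.1, 0.0, 0.0],          # vx
           [0.0, 0.2, 0.0],          # vy
           [0.0, 0.0, 0.3]]          # vz
velocity = decode_velocity(rates, weights)  # [1.0, 1.0, 0.0] m/s
```

In a real system the weights would be fit from calibration data while the subject imagines movements, rather than written by hand.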
Aasemoon =)

robots.net - Robots: Programmable Matter - 0 views

  • The latest episode of the Robots Podcast looks at the following scenario: imagine being able to throw a handful of smart matter into a tank of liquid and then pull out a ready-to-use wrench once the matter has assembled. This is the vision of this episode's guests, Michael Tolley and Jonas Neubert from the Computational Synthesis Laboratory run by Hod Lipson at Cornell University, NY. Tolley and Neubert give an introduction to programmable matter and then present their research on stochastic assembly of matter in fluid, including both simulation and real-world implementation. Read on or tune in!
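The stochastic-assembly idea can be sketched in a few lines: free-floating modules encounter a growing seed structure at random, and each encounter binds with some probability. This is only a cartoon of the process Tolley and Neubert study, with invented parameters and no fluid dynamics at all.

```python
import random

# Cartoon of stochastic assembly in fluid: free modules randomly
# encounter a seed structure and bind with probability p_bind.
def stochastic_assembly(n_modules, p_bind=0.3, steps=200, seed=1):
    """Return how many modules are attached after `steps` mixing events."""
    rng = random.Random(seed)
    attached, free = 1, n_modules - 1   # start from a single seed module
    for _ in range(steps):
        if free and rng.random() < p_bind:
            attached += 1
            free -= 1
    return attached

final = stochastic_assembly(10, p_bind=1.0, steps=20)  # all 10 attach
```

The interesting research questions (how binding probability depends on geometry, agitation, and bond chemistry) are exactly what this cartoon leaves out.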
Aasemoon =)

DIY Drones - 0 views

  • This is a site for all things related to amateur Unmanned Aerial Vehicles (UAVs). Use the tabs and drop-down menus above to navigate the site. These are our Arduino-based open source autopilot projects: * ArduPilot, a low-cost autopilot system for planes. * ArduCopter, a fully autonomous quadcopter system (heli autopilot coming soon). * BlimpDuino, an autonomous blimp with both infrared and ultrasonic guidance.
Aasemoon =)

HRP-4C Dances Thanks to AIST's Choreonoid Software - 0 views

  • Japan’s National Institute of Advanced Industrial Science and Technology (AIST) has detailed the software used to make their robot dance (see some nice photos over at Pink Tentacle) in a recent press release.  The software, dubbed Choreonoid (Choreography and Humanoid), is similar to conventional computer animation software.  Users create key poses and the software automatically interpolates the motion between them.  What makes the software unique is that it also corrects the poses if they are mechanically unstable, such as modifying the position of the feet and waist, allowing anyone to create motions compatible with the ZMP balancing method.  This is especially important for robots like the HRP-4C, where complicated motions could easily cause it to fall over.
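The key-pose interpolation step the excerpt describes is easy to sketch; the stability (ZMP) correction that makes Choreonoid special is the hard part and is not shown here. Joint names, times, and poses below are invented for illustration, and real tools use smoother splines than this linear version.

```python
# Minimal sketch of keyframe interpolation as Choreonoid-style tools
# perform it: the user supplies key poses and the software fills in
# the frames between them. Linear blend shown; splines are typical.
def interpolate_poses(key_poses, t):
    """key_poses: time-sorted list of (time, {joint: angle}) pairs."""
    if t <= key_poses[0][0]:            # clamp before the first key
        return dict(key_poses[0][1])
    if t >= key_poses[-1][0]:           # clamp after the last key
        return dict(key_poses[-1][1])
    for (t0, p0), (t1, p1) in zip(key_poses, key_poses[1:]):
        if t0 <= t <= t1:
            a = (t - t0) / (t1 - t0)    # blend factor in [0, 1]
            return {j: (1 - a) * p0[j] + a * p1[j] for j in p0}

keys = [(0.0, {"knee": 0.0, "hip": 0.0}),
        (1.0, {"knee": 1.0, "hip": 0.5})]
mid = interpolate_poses(keys, 0.5)   # {"knee": 0.5, "hip": 0.25}
```

Choreonoid's contribution, per the press release, is what happens after this step: adjusting the interpolated feet and waist so the ZMP stays inside the support polygon.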
Aasemoon =)

Simulation Robot Programming with Microsoft Robotics Developer Studio (MRDS) and SPL - ... - 0 views

  • Simulation enables anyone with a personal computer to develop very interesting robots, cars, spaceships, and an enormous range of scientific effects, with time and imagination as the main limiting factors. A novice user with little to no coding experience can use simulation to develop interesting applications in a game-like environment.
Aasemoon =)

Odex I Hexapod Robot From 1984 | BotJunkie - 0 views

  • Commenter Cynox was browsing through the 137 years of Popular Science magazine which are now available online, and he noticed this robot in the September 1984 issue. Called Odex I, it was developed by a (now apparently defunct) company called Odetics. Odex was six and a half feet tall, had six legs, and was fully capable of walking. Although it only weighed 370 pounds, each of its legs could lift 400 pounds. It could dead lift some 2100 pounds, and carry 900 pounds while walking at normal speed (which was about 18 inches per second). Odex used a tripod gait, and the fishbowl thing on top contained sensors that helped it avoid obstacles. It was one of the first robots with an onboard computer that helped coordinate all of its limbs. Since the limbs could articulate themselves in several directions independently, Odex was able to rapidly change its limb configuration to squeeze through tight spaces, move quickly, or lift stuff. It was able to climb into the back of a truck through a combination of automated step behaviors and teleoperation, which was pretty damn good for 1984.
Aasemoon =)

IEEE Spectrum: When Will We Become Cyborgs? - 0 views

  • I remember when, a decade ago, Kevin Warwick, a professor at the University of Reading, in the U.K., implanted a radio chip in his own arm. The feat caused quite a stir. The implant allowed him to operate doors, lights, and computers without touching anything. On a second version of the project he could even control an electric wheelchair and produce artificial sensations in his brain using the implanted chip. Warwick had become, in his own words, a cyborg. The idea of a cyborg -- a human-machine hybrid -- is common in science fiction and although the term dates back to the 1960s it still generates a lot of curiosity. I often hear people asking, When will we become cyborgs? When will humans and machines merge? Although some researchers might have specific time frames in mind, I think a better answer is: It's already happening. When we look back at the history of technology, we tend to see distinct periods -- before the PC and after the PC, before the Internet and after the Internet, and so forth -- but in reality most technological advances unfold slowly and gradually. That's particularly true with the technologies that are allowing us to modify and enhance our bodies.
Aasemoon =)

Gostai - robotics for everyone - 0 views

  • We are entering the robotic age. All over the world, we see research projects and companies working on realistic, market-driven robots, with impressive realizations ranging from intelligent vacuum cleaners to humanoid robots. This is a very exciting time, and some people see in the current situation many parallels with the early days of the computer industry. However, like PCs in the early '80s, today's robots are still incompatible in terms of software. There is as yet no standard way to reuse a component from one robot on another, which is needed for a real software industry to bootstrap. And most attempts have failed to provide tools genuinely adapted to the complex needs of robot programming. Here at Gostai, we believe that the industry needs a powerful robotics software platform, ready to face the challenges of artificial intelligence and autonomous robot programming.
Aasemoon =)

Kinect-enabled robotic telepresence | Computer Vision Central - 0 views

  • Taylor Veltrop used a Kinect to read his arm movements, which were then carried out by a robot. The robot was programmed using Willow Garage's open-source Robot Operating System (ROS). As Kit Eaton suggests, this quick experiment illustrates the path towards robotic avatars.
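The heart of such a pipeline is mapping tracked skeleton points to robot joint angles. The ROS plumbing (subscribing to skeleton frames, publishing joint commands) is omitted below, and the geometry is one plausible mapping invented for illustration, not Veltrop's code.

```python
import math

# Sketch of Kinect-to-robot mapping: derive a shoulder pitch angle from
# two tracked 3-D skeleton points (shoulder and elbow, y axis up).
def shoulder_pitch(shoulder, elbow):
    """Angle of the upper arm below horizontal, in radians."""
    dx = elbow[0] - shoulder[0]
    dy = elbow[1] - shoulder[1]
    dz = elbow[2] - shoulder[2]
    horiz = math.hypot(dx, dz)      # horizontal reach of the upper arm
    return math.atan2(-dy, horiz)   # positive when the elbow is lower

# Arm hanging straight down: pitch is 90 degrees below horizontal.
angle = shoulder_pitch((0.0, 1.4, 0.0), (0.0, 1.1, 0.0))
```

In a live teleoperation node, this function would run on every skeleton frame and its output would be republished as a joint command for the robot's shoulder servo.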
Aasemoon =)

Urus Project - 0 views

  • In this project we want to analyze and test the idea of incorporating a network of robots (robots, intelligent sensors, devices, and communications) in order to improve quality of life in urban areas. The URUS project is focused on designing a network of robots that interact cooperatively with human beings and the environment for tasks of assistance, transportation of goods, and surveillance in urban areas. Specifically, our objective is to design and develop a cognitive network robot architecture that integrates cooperating urban robots, intelligent sensors, intelligent devices, and communications.
Aasemoon =)

Robots with a human touch - A*STAR Research - 0 views

  • In recent years, ‘social’ robots—cleaning robots, nursing-care robots, robot pets and the like—have started to penetrate into people’s everyday lives. Saerbeck and other robotics researchers are now scrambling to develop more sophisticated robotic capabilities that can reduce the ‘strangeness’ of robot interaction. “When robots come to live in a human space, we need to take care of many more things than for manufacturing robots installed on the factory floor,” says Haizhou Li, head of the Human Language Technology Department at the A*STAR Institute for Infocomm Research. “Everything from design to the cognitive process needs to be considered.”
Aasemoon =)

Automaton, Know Thyself: Robots Become Self-Aware: Scientific American - 0 views

  • Robots might one day trace the origin of their consciousness to recent experiments aimed at instilling them with the ability to reflect on their own thinking. Although granting machines self-awareness might seem more like the stuff of science fiction than science, there are solid practical reasons for doing so, explains roboticist Hod Lipson at Cornell University's Computational Synthesis Laboratory.
otakuhacks

Data annotation - 0 views


data-science data annotations annotation machine-learning

started by otakuhacks on 10 Nov 20; no follow-up yet