
Robotics & AI: Group items tagged #Technology



Robocopter Responds To Natural Language Direction | BotJunkie

  • This little helicopter is able to understand you when you tell it what to do. No pushing buttons, no using special commands; you just tell it where you want it to go and (eventually) it goes. Of course, I’m sure it required a bit of work to define where “door” and “elevator” and “window” are, but it’s a much more intuitive way to control a UAV that works when your hands are full, when you’re stressed (think military), or simply when you have no idea how to control a UAV. I don’t have much in the way of other details on this project, besides the fact that it probably comes from the Robust Robotics Group at MIT, and possibly from someone who lives in this dorm. How do I know? Well, one of the research goals of the RRG is “to build social robots that can quickly learn what people want without being annoying or intrusive,” and this video is on the same YouTube channel. ‘Nuff said.

robots.net - Robots: Chaos Control

  • Walking, swallowing, respiration, and many other key functions in humans and other animals are controlled by Central Pattern Generators (CPGs). In essence, CPGs are small, autonomous neural networks that produce rhythmic outputs, usually found in animals' spinal cords rather than their brains. Their relative simplicity and obvious success in biological systems have led to some success in using CPGs in robotics. However, current systems are restricted to very simple CPGs (e.g., restricted to a single walking gait). A recent breakthrough at the BCCN at the University of Göttingen, Germany has now made it possible to generate 11 basic behavioral patterns (various gaits, orienting, taxis, self-protection) from a single CPG, closing in on the 10–20 different basic behavioral patterns found in a typical cockroach. The trick: work with a chaotic, rather than a stable periodic, CPG regime. For more on CPGs, listen to the latest episode of the Robots podcast on Chaos Control, which interviews Poramate Manoonpong, one of the lead researchers in Göttingen, and Alex Pitti from the University of Tokyo, who uses chaos controllers that can synchronize to the dynamics of the body they are controlling.
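To make the CPG idea concrete, here is a minimal sketch of the kind of two-neuron, discrete-time oscillator this line of research builds on (an SO(2)-style circuit). It is a toy illustration only, not the Göttingen chaotic controller; the gain, phase step, and initial state are assumed values.

```python
import math

# Toy two-neuron, discrete-time CPG (SO(2)-style oscillator) -- a sketch, not
# the chaotic controller from the Goettingen work. The weight matrix is a
# rotation scaled by alpha > 1; tanh saturation keeps the growing rotation
# bounded, so the two outputs settle into a stable rhythmic oscillation.
alpha, phi = 1.05, 0.3          # assumed gain and phase advance per step
w11 = w22 = alpha * math.cos(phi)
w12 = alpha * math.sin(phi)
w21 = -w12

o1, o2 = 0.1, 0.0               # a small initial activation kicks off the rhythm
for t in range(60):
    o1, o2 = (math.tanh(w11 * o1 + w12 * o2),
              math.tanh(w21 * o1 + w22 * o2))
    # In a walking machine, o1/o2 would drive antagonistic joints of a leg.
    print(f"t={t:2d}  o1={o1:+.3f}  o2={o2:+.3f}")
```

Driving such a circuit into a chaotic rather than a periodic regime, and then stabilizing different periodic orbits on demand, is what allows a single CPG of this kind to switch between many distinct behaviors.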

IEEE Spectrum: A Robot in the Kitchen

  • Rosie, the robot who kept house for the title family in "The Jetsons," a 1960s animated television show, has at last come alive—sort of. Before you'll see a robot slicing cucumbers in your kitchen, researchers will need to make these mechanical servants smarter. Here's how three teams are tackling this challenge.

Robots with skin enter our touchy-feely world - tech - 19 April 2010 - New Scientist

  • BEAUTY may be only skin deep, but for humanoid robots a fleshy covering is about more than mere aesthetics: it could be essential to making them socially acceptable. A touch-sensitive coating could prevent such machines from accidentally injuring anybody within their reach. In May, a team at the Italian Institute of Technology (IIT) in Genoa will dispatch to labs across Europe the first pieces of touch-sensing skin designed for their nascent humanoid robot, the iCub. The skin IIT and its partners have developed contains flexible pressure sensors that aim to put robots in touch with the world. "Skin has been one of the big missing technologies for humanoid robots," says roboticist Giorgio Metta at IIT. One goal of making robots in a humanoid form is to let them interact closely with people. But that will only be possible if a robot is fully aware of what its powerful motorised limbs are in contact with.
  • Wow, this is cool!

IEEE Spectrum: Hiroshi Ishiguro: The Man Who Made a Copy of Himself

  • Hiroshi Ishiguro, a roboticist at Osaka University, in Japan, has, as you might expect, built many robots. But his latest aren’t run-of-the-mill automatons. Ishiguro’s recent creations look like normal people. One is an android version of a middle-aged family man—himself.
  • Wow! I'm assuming that he is on the left, but can't really tell. Impressive!

Gostai - robotics for everyone

  • We are entering the robotic age. All over the world, we see research projects and companies working on realistic, market-driven robots, with impressive realizations ranging from intelligent vacuum cleaners to humanoid robots. This is a very exciting time, and some people see in the current situation many common points with the early days of the computer industry. However, like PCs in the early 80's, today's robots are still incompatible in terms of software. There is as yet no standard way to reuse a component from one robot on another, which is needed for a real software industry to bootstrap. And most attempts so far have failed to provide tools genuinely adapted to the complex needs of robot programming. Here at Gostai, we believe that the industry needs a powerful robotics software platform, ready to face the challenges of Artificial Intelligence and autonomous robot programming.

ALSOK Security Robot Patrols Gallery

  • ALSOK, a security firm that specializes in robot guards, has sent their drones into shopping malls, office buildings, and museums.  This video shows one of them patrolling an art gallery.  Not surprisingly, even in Japan, the sight of a robot patrolling its beat is more than enough to distract some of the visitors from the actual works of art!

robots.net - Robots: URBI Software Platform


NAMO

  • NAMO (Novel Articulated MObile platform) is a humanoid robot built by the Institute of Field Robotics (FIBO) at King Mongkut's University of Technology Thonburi in Thailand. FIBO is active in the RoboCup scene and has developed a wide range of robot types, including an experimental biped. NAMO was unveiled on March 29th, 2010, serving as FIBO's mascot as part of the university's 50th-anniversary celebrations. NAMO will be used to welcome people to the university and may be deployed at museums. Given its friendly appearance and functionality, it could be used to research human-robot interaction and communication. NAMO is 130 cm (4′3″) tall and has 16 degrees of freedom. It moves on a stable three-wheeled omnidirectional base and is equipped with a Blackfin camera for its vision system. It is capable of simple gesture recognition, visually tracks humans or objects of interest automatically, and can speak a few phrases in a child-like voice (in Thai).
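As a side note on the omnidirectional base: for a generic three-omniwheel layout with wheels mounted 120° apart (the excerpt doesn't give NAMO's actual geometry, so treat the numbers below as assumptions), the usual inverse kinematics that turn a desired body velocity into wheel speeds look like this:

```python
import math

# Generic inverse kinematics for a three-wheel omnidirectional base.
# Wheel positions and mounting radius are assumed; NAMO's real layout
# is not specified in the article excerpt.
R = 0.15                                        # mounting radius in meters (assumed)
WHEEL_ANGLES = [math.radians(a) for a in (90, 210, 330)]

def wheel_speeds(vx, vy, omega):
    """Map a body velocity (vx, vy in m/s, omega in rad/s) to wheel rim speeds."""
    return [-math.sin(a) * vx + math.cos(a) * vy + R * omega
            for a in WHEEL_ANGLES]

# Example: drive at 0.2 m/s along x while turning slowly in place.
print(wheel_speeds(0.2, 0.0, 0.1))
```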

An Open Source Personal Robot On The Horizon?

  • GetRobo has pointed out a new website by Francisco Paz, which focuses on his experience building an open source personal robot called Qbo.  From the few images on the site, Qbo looks remarkably well made and quite similar to NEC's PaPeRo, meaning it might be used to experiment with image processing, speech recognition, speech synthesis, and (assuming it has wheels) obstacle detection and SLAM.  He also mentions in his blog some of the open source software that's out in the wild, such as OpenCV, Festival, and Sphinx, which would allow you to do some of that.
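For a sense of what those libraries buy you, here is a minimal OpenCV face-detection loop of the sort such a hobby platform could run on its camera. It is a generic sketch, not code from the Qbo project; it assumes the opencv-python package and a camera on device 0.

```python
import cv2

# Generic webcam face detection with OpenCV's bundled Haar cascade --
# an illustration of the kind of vision a Qbo-like robot could use,
# not code from the Qbo project itself.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)                       # first attached camera (assumed)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.2, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("face tracking sketch", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):       # press q to quit
        break
cap.release()
cv2.destroyAllWindows()
```

Speech recognition (Sphinx) and synthesis (Festival) would slot in alongside this in a similarly plug-and-play fashion.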

untitled

  • The animal world has been a source of inspiration for many robotic designs as of late, as who better to ask about life-like movements than Mother Nature herself? Up until now, though, these designs have mostly focused on small critters, like cockroaches, and on simulating properties such as adaptability and speed. But what happens when we start looking at bigger and stronger animals? Like, say, an elephant? Well, Festo's Bionic Handling Assistant is what happens. This innovation might seem like just another robotic arm at first glance, but the video demonstrates quite vividly how this design is such a big improvement over previous versions. Modeled after the elephant's mighty trunk, this arm possesses great dexterity, flexibility, and strength; it operates with smooth yet firm motions and can pick up and move any kind of object from one place to another. Its FinGripper fingers give it "an unparalleled mass/payload ratio", and it has no problem twisting, assembling, and disassembling things, such as the experimental toy in the video.

A-pod the Ant-like Hexapod

  • Remember A-pod, the realistic ant-like hexapod from last year?  Well its creator Kare Halvorsen has uploaded a brand new video showcasing its improved capabilities, and it’s a stunner.  His last video, posted around this time last year, went viral due to the robot’s realistic movements. This year, he ups the ante by showing it walking around and manipulating objects. Some of his past robot projects can be seen in brief snippets, and they’re not too shabby either.  Imagine a horde of these guys with sophisticated A.I.!

IEEE Spectrum: Virginia Tech's Humanoid Robot CHARLI Walks Tall

  • Dennis Hong, a professor of mechanical engineering and director of Virginia Tech's Robotics & Mechanisms Laboratory, or RoMeLa, has created robots with the most unusual shapes and sizes -- from strange multi-legged robots to amoeba-like robots with no legs at all. Now he's unveiling a new robot with a more conventional shape: a full-sized humanoid robot called CHARLI, or Cognitive Humanoid Autonomous Robot with Learning Intelligence. The robot is 5 feet (1.52 meters) tall, untethered and autonomous, and capable of walking and gesturing. But its biggest innovation is that it does not use rotational joints. Most humanoid robots -- Asimo, Hubo, Mahru -- use DC motors to rotate various joints (typically at the waist, hips, knees, and ankles). The approach makes sense and, in fact, today's humanoids can walk, run, and climb stairs. However, this approach doesn't correspond to how our own bodies work, with our muscles contracting and relaxing to rotate our various joints.

Motion Capture Suit Makes Teleoperation Easy | BotJunkie

  • One solution to getting robots to perform complex and/or variable tasks is to teleoperate them. Arguably this removes a significant portion of the point of having a robot in the first place, but there will inevitably be tasks that even the most complex and well-programmed robot just won't be prepared for. If you've been reading BotJunkie for the past three years, you may remember Monty, a telepresence humanoid from Anybots. Monty was a bit difficult to control, and at the very least required some training.

untitled

  • Scientists from Columbia University, Arizona State University, the University of Michigan, and the California Institute of Technology (Caltech) have created a robot that's just 4 nanometers wide. And no, it doesn't have flashing lights, video cameras, or wheels. It does, however, have four legs, and the ability to start, move, turn, and stop. Descendants of the molecular nanobot, or "spider," could someday be used to treat diseases such as cancer or diabetes. The team built the spider by starting with a protein called streptavidin, which conveniently has four symmetrically placed binding pockets for a chemical called biotin. The legs were made from four strands of biotin-labeled DNA, which were bound to the pockets. Three of the legs were made from enzymatic DNA, a type that binds to and then dissociates (cuts away) from particular sequences of DNA. The fourth leg was made from what the researchers call a "start strand" of DNA; it keeps the spider tethered to its starting site until it's released.

Kinect-enabled robotic telepresence | Computer Vision Central

  • Taylor Veltrop used a Kinect to read his arm movements, which were then carried out by a robot. The robot was programmed using Willow Garage's open-source Robot Operating System (ROS). As Kit Eaton suggests, this quick experiment provides an illustration of the path towards robotic avatars.
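The ROS side of such an experiment is typically a thin mapping node. The sketch below is hypothetical: the topic names, message choices, and joint mapping are assumptions made for illustration (real Kinect skeleton trackers such as openni_tracker publish TF frames rather than the topic used here), but the rospy calls themselves are standard.

```python
#!/usr/bin/env python
# Hypothetical sketch of a rospy node in the spirit of Veltrop's experiment:
# map a tracked hand position onto a simple arm joint command. Topic names
# and the joint mapping are invented for illustration.
import math
import rospy
from geometry_msgs.msg import PointStamped
from sensor_msgs.msg import JointState

class KinectArmTeleop(object):
    def __init__(self):
        self.pub = rospy.Publisher("/arm/joint_command", JointState, queue_size=1)
        rospy.Subscriber("/skeleton/left_hand", PointStamped, self.on_hand)

    def on_hand(self, msg):
        # Crude mapping: hand height/reach -> two joint angles, clamped to limits.
        shoulder_pitch = math.atan2(msg.point.z, max(msg.point.x, 1e-3))
        elbow_flex = max(0.0, min(2.0, 2.0 - msg.point.x))
        cmd = JointState()
        cmd.header.stamp = rospy.Time.now()
        cmd.name = ["shoulder_pitch", "elbow_flex"]
        cmd.position = [shoulder_pitch, elbow_flex]
        self.pub.publish(cmd)

if __name__ == "__main__":
    rospy.init_node("kinect_arm_teleop")
    KinectArmTeleop()
    rospy.spin()
```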

robots.net - It's Cognitive Robotics, Stupid!

  • If you're a long-time reader, you may remember our mention in 2008 of Emanuel Diamant's provocatively titled paper "I'm sorry to say, but your understanding of image processing fundamentals is absolutely wrong" (PDF). Diamant is back with a presentation created for the 3rd Israeli Conference on Robotics, with the equally provocative title "It's Cognitive Robotics, Stupid" (PDF). In it he laments the lack of agreed-upon definitions for words like intelligence, knowledge, and information that are crucial to the development of robotics.

Urus Project

  • In this project we want to analyze and test the idea of incorporating a network of robots (robots, intelligent sensors, devices, and communications) in order to improve quality of life in urban areas. The URUS project is focused on designing a network of robots that interact in a cooperative way with human beings and the environment for tasks of assistance, transportation of goods, and surveillance in urban areas. Specifically, our objective is to design and develop a cognitive network robot architecture that integrates cooperating urban robots, intelligent sensors, intelligent devices, and communications.

Building a Super Robust Robot Hand - IEEE Spectrum

  • German researchers have built an anthropomorphic robot hand that can endure collisions with hard objects and even strikes from a hammer without breaking into pieces. In designing the new hand system, researchers at the Institute of Robotics and Mechatronics, part of the German Aerospace Center (DLR), focused on robustness. They may have just built the toughest robot hand yet. The DLR hand has the shape and size of a human hand, with five articulated fingers powered by a web of 38 tendons, each connected to an individual motor on the forearm.

Interview: iRobot's AVA Tech Demonstrator | BotJunkie

  • With all of the new competition in the consumer robotics field, it's about time for iRobot to show that they're still capable of innovating new and exciting things. AVA, their technology demonstrator, definitely fits into the new and exciting category. AVA is short for 'Avatar,' although iRobot was careful not to call it a telepresence robot so as not to restrict perceptions of what it's capable of. AVA is capable of fully autonomous navigation, relying on a Kinect-style depth-sensing camera, laser rangefinders, inertial movement sensors, ultrasonic sensors, and (as a last resort) bump sensors. We got a run-down a few days ago at CES; check it out: