Robotics & AI: Group items tagged Machine

Aasemoon =)

Artificial Intelligence and Robotics: Robot fish leader - 0 views

  • Humans have been coming up with innovative ways to plunder the Earth and its resources for as long as we have existed, so perhaps it's time we give back a little. Leading aquatic animals, such as fish, away from underwater power plant turbines seems like a good place to begin, and a researcher at the Polytechnic Institute of New York University has designed a robot that will help with just that. Assistant professor Maurizio Porfiri studied the characteristics of small schools of fish to learn what exactly they look for in a leader, and he designed a palm-sized robot that possesses these traits. By taking command, this leader can be programmed to guide the fish away from danger, but the tricky part is getting the animals to accept the robot as one of their own.
Aasemoon =)

IEEE Spectrum: RoboCup Kicks Off in Singapore This Week - 1 views

  • Humans aren't the only ones playing soccer right now. In just two days, robots from world-renowned universities will compete in Singapore for RoboCup 2010. This is the other World Cup, where players range from 15-centimeter-tall Wall-E-like bots to adult-sized advanced humanoids. RoboCup, now in its 14th edition, is the world's largest robotics and artificial intelligence competition, with more than 400 teams from dozens of countries. The idea is to use the soccer bots to advance research in machine vision, multi-agent collaboration, real-time reasoning, sensor fusion, and other areas of robotics and AI. But its participants also aim to develop autonomous soccer-playing robots that will one day be able to play against humans, a goal spelled out in RoboCup's mission statement.
Aasemoon =)

robots.net - New Model Mimics Human Vision Tasks - 0 views

  • Researchers at MIT's McGovern Institute for Brain Research are working on a new mathematical model that mimics the human brain's ability to identify objects. The model can predict human performance on certain visual-perception tasks, suggesting it is a good indication of what's actually happening in the brain. The researchers hope the new findings will make their way into future object-recognition systems for automation, mobile robotics, and other applications.
Aasemoon =)

Robots Preparing to Defeat Humans in Soccer - 0 views

  • Can a team of soccer-playing robots beat the human World Cup champions by 2050? That's the ultimate goal of RoboCup, an international tournament where teams of soccer robots compete in various categories, from small wheeled boxes to adult-size humanoids. IEEE Spectrum's Harry Goldstein traveled to Singapore to attend RoboCup 2010 -- and check out how the man vs. machine future of soccer is playing out.
Aasemoon =)

Carnegie Mellon's Incredible Robot Snake Climbs a Real Tree | Singularity Hub - 0 views

  • Carnegie Mellon has taught its robotic snake to climb trees, though one hopes it won’t start offering your spouse apples. “Uncle Sam” (presumably named for its red, white, and blue markings) is a snake robot built from modular pieces. The latest in a line of ‘modsnakes’ from Carnegie Mellon’s Biorobotics Lab, Uncle Sam can move in a variety of different ways including rolling, wiggling, and side-winding. It can also wrap itself around a pole and climb vertically, which comes in handy when scaling a tree. You have to watch this thing in action. There is something incredibly life-like, and eerie, about the way it scales the tree outdoors and then looks around with its camera ‘eye’. Projects like Uncle Sam show how life-mimicking machines could revolutionize robotics in the near future.
Aasemoon =)

Autonomous Satellite Chasers Can Use Robotic Vision to Capture Orbiting Satellites | Po... - 0 views

  • ASIROV, the Acoplamiento y Agarre de Satélites mediante Sistemas Robóticos basado en Visión (Docking and Capture of Satellites using Vision-based Robotic Systems), is a prototype robotic satellite chaser from the Universidad Carlos III de Madrid (UC3M) that would use computer vision to autonomously chase down satellites in orbit for repair or removal. Spanish robotics engineers have devised a new weapon in the battle against zombie-sats and space junk: an automated robotics system that employs computer vision technology and algorithmic wizardry to allow unmanned space vehicles to autonomously chase down, capture, and even repair satellites in orbit. The UC3M scientists created the system to allow for the removal of rogue satellites from low Earth orbit or the maintenance of satellites nearing the ends of their lives, prolonging their service (and extending the value of large investments in satellite tech). Through a complex set of algorithms, space vehicles known as "chasers" could be placed into orbit with the mission of policing LEO, chasing down satellites that are damaged or have gone "zombie" and dealing with them appropriately.
Aasemoon =)

Oh, Those Robot Eyes! | h+ Magazine - 0 views

  • Willow Garage is organizing a workshop at the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR) 2010 in San Francisco to discuss the intersection of computer vision with human-robot interaction. Willow Garage is the hardware and open-source software organization behind the Robot Operating System (ROS) and the PR2 robot development platform. Here's a recent video from Willow Garage of work done at the University of Illinois on how robots can be taught to perceive images:
fishead ...*∞º˙

Robots with skin enter our touchy-feely world - tech - 19 April 2010 - New Scientist - 0 views

  • Beauty may be only skin deep, but for humanoid robots a fleshy covering is about more than mere aesthetics: it could be essential to making them socially acceptable. A touch-sensitive coating could prevent such machines from accidentally injuring anybody within their reach. In May, a team at the Italian Institute of Technology (IIT) in Genoa will dispatch to labs across Europe the first pieces of touch-sensing skin designed for their nascent humanoid robot, the iCub. The skin IIT and its partners have developed contains flexible pressure sensors that aim to put robots in touch with the world (a toy contact-check sketch follows this item). "Skin has been one of the big missing technologies for humanoid robots," says roboticist Giorgio Metta at IIT. One goal of making robots in a humanoid form is to let them interact closely with people. But that will only be possible if a robot is fully aware of what its powerful motorised limbs are in contact with.
  •  
    Wow this is cool!
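The safety idea above (stop moving when the skin reports unexpected contact) can be illustrated with a toy control-loop check in Python. This is purely a hypothetical sketch, not IIT's iCub software: the sensor-reading function, patch count, and 5 N threshold are all made-up placeholders.

```python
import random

CONTACT_THRESHOLD_N = 5.0   # assumed safety limit per skin patch, in newtons

def read_skin_patches(num_patches=64):
    """Stand-in for a tactile-skin driver call; returns simulated per-patch forces."""
    return [abs(random.gauss(0.0, 2.0)) for _ in range(num_patches)]

def safe_to_move(patch_forces):
    """True only if no patch reports a force above the contact threshold."""
    return max(patch_forces) < CONTACT_THRESHOLD_N

forces = read_skin_patches()
if safe_to_move(forces):
    print("No contact above threshold - continuing motion")
else:
    print("Unexpected contact detected - halting arm motion")
```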
Aasemoon =)

untitled - 1 views

  • The animal world has been a source of inspiration for many robotic designs as of late, as who better to ask about life-like movements than Mother Nature herself? Up until now, though, these designs had been mostly focused on small critters, like cockroaches, and on simulating properties such as adaptability and speed. But what happens when we start looking at bigger and stronger animals? Like, say, an elephant? Well, Festo's Bionic Handling Assistant is what happens. This innovation might seem like just another robotic arm at first glance, but the video demonstrates quite vividly how this design is a big improvement over previous versions. Modeled after the elephant's mighty trunk, this arm possesses great dexterity, flexibility, and strength, operating with smooth yet firm motions, and it can pick up and move any kind of object from one place to another. Its FinGripper fingers give it "an unparalleled mass/payload ratio", and it has no problem twisting, assembling, and disassembling things, such as the experimental toy in the video.
Aasemoon =)

This Robotic Dragonfly Flew 40 Years Ago | BotJunkie - 0 views

  • In the 1970s the CIA developed a miniature listening device that needed a delivery system, so the agency's scientists looked at building a robotic bumblebee to carry it. They found, however, that the bumblebee was erratic in flight, so the idea was scrapped. An amateur entomologist on the project then suggested a dragonfly, and a prototype was built that became the first flight of an insect-sized machine. A laser beam steered the dragonfly, a watchmaker on the project crafted a miniature oscillating engine to beat the wings, and a fuel bladder carried liquid propellant. Despite such ingenuity, the project team lost control of the dragonfly in even a gentle wind. "You watch them in nature, they'll catch a breeze and ride with it. We, of course, needed it to fly to a target. So they were never deployed operationally, but this is a one-of-a-kind piece."
Aasemoon =)

robots.net - Microbots can now swim back and forth - 0 views

  • Until now you could have big, elaborate robots or very small microbots, but it has been very difficult to combine real capability with tiny size. A blog post from New Scientist (where this video is from) highlights research on microbots: very small machines that can move, navigate, and perform simple tasks. The ability to remotely power a microbot, eliminating the need for an onboard battery or fuel, has already been proven; one method is to apply an AC field to the liquid in which the robot sits. This microbot is essentially a diode, a one-way electric conductor. The different electric charges at its ends force the neighboring ions to move, creating a small thrust that propels the bot. The team of Rachita Sharma and Orlin Velev from North Carolina State University developed a method in which the controlled application of an additional DC field changes the ion distribution around the microbot so that the ion flow creates a torque that rotates it. The DC field is applied until the bot completes a 180-degree turn; then the microbot moves again, now in the opposite direction. It is only 1.3 mm long, and other scientists, such as Vesselin Paunov from the University of Hull, UK, say the arrangement could be scaled down further, which could make it useful for diagnostics and localized drug-delivery applications.
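A toy kinematic sketch of the behavior just described: steady propulsion under the AC field, with an occasional DC pulse reversing the heading. The speed, timestep, and pulse times below are invented for illustration; this models only the resulting back-and-forth motion, not the actual electro-hydrodynamics.

```python
# Toy 1-D model of the microbot's motion. The real propulsion comes from
# ion flow past the diode body under an AC field; here we only reproduce
# the observed behavior: constant thrust, plus 180-degree turns on DC pulses.
SPEED_MM_PER_S = 0.5          # assumed cruise speed, not a measured value
DT = 0.1                      # timestep in seconds
DC_PULSE_TIMES = [3.0, 7.0]   # invented times at which a "turn" pulse fires

position_mm = 0.0
heading = +1                  # +1 or -1 along the channel

for step in range(int(10.0 / DT)):
    t = step * DT
    if any(abs(t - pulse) < 1e-9 for pulse in DC_PULSE_TIMES):
        heading = -heading    # DC field rotates the bot by 180 degrees
    position_mm += heading * SPEED_MM_PER_S * DT

print(f"final position: {position_mm:.2f} mm")
```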
Aasemoon =)

Automaton, Know Thyself: Robots Become Self-Aware: Scientific American - 0 views

  • Robots might one day trace the origin of their consciousness to recent experiments aimed at instilling them with the ability to reflect on their own thinking. Although granting machines self-awareness might seem more like the stuff of science fiction than science, there are solid practical reasons for doing so, explains roboticist Hod Lipson at Cornell University's Computational Synthesis Laboratory.
Aasemoon =)

Graspy PR2 robot learns to read | Computer Vision Central - 0 views

  • Researchers at the University of Pennsylvania are developing algorithms to enable robots to learn to read like a human toddler. Using a Willow Garage PR2 robot (nicknamed Graspy), the researchers demonstrate the ability of a robot to learn to read anything from simple signs to full-length warnings. Graspy recognizes the shapes of letters and associates them with sounds. Part of the computer vision challenge is reading hundreds of different fonts. More information is available in a PhysOrg article and on the ROS website.
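For comparison, here is what a bare-bones "read text from the camera" pipeline looks like with off-the-shelf tools. This is not the Penn team's learning approach, just a minimal sketch using OpenCV and the Tesseract OCR engine; it assumes a webcam at index 0 and that opencv-python, pytesseract, and the tesseract binary are installed.

```python
import cv2
import pytesseract

cap = cv2.VideoCapture(0)            # default webcam
ok, frame = cap.read()               # grab a single frame
cap.release()

if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Otsu thresholding gives Tesseract a cleaner, high-contrast input
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    text = pytesseract.image_to_string(binary)
    print("Recognized text:", text.strip())
else:
    print("Could not read a frame from the camera")
```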
Aasemoon =)

Robotland: Google Android@Home and Cloud Robotic Apps on Wheels - 0 views

  • At its I/O 2011 conference, Google announced that it will supply the Android@Home framework for home automation to developers, giving them the ability to think of "every appliance in your home" as a potential accessory for your phone. The Google team teased ideas like lights turning on and off based on calendar events, applications talking to washing machines, games automatically adjusting mood lighting, and basically little green dudes taking care of all the menial duties in your house. One amazing demo was a concept Android-powered device hub called Tungsten. Using RFID tags embedded in CD cases, the device was able to detect a CD and add it to your library; another touch started playback automatically.
Aasemoon =)

Make Computers See with SimpleCV - The Open Source Framework for Vision - 0 views

  • So after all that you are probably asking, "What is SimpleCV?" It is an open-source computer vision framework that lowers the barrier to entry for people across the globe to learn, develop, and use computer vision. There are currently a few open-source vision libraries in existence, but the downside is that you have to be quite the domain expert, knowledgeable about vision systems, and able to work in relatively cryptic languages like C. Where SimpleCV is different is that it is "simple". It has been designed with a web-browser interface, which is familiar to Internet users everywhere. It will talk to your webcam (which most computers and smartphones have built in) automatically. It works cross-platform (Windows, Mac, Linux, etc.). It uses the programming language Python rather than C, greatly lowering the learning curve. It sacrifices some complexity for simplicity, which is needed for mass adoption of any new technology.
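To give a flavor of the "simple" claim, here is roughly what a minimal SimpleCV session looks like. This is a generic sketch based on the SimpleCV 1.x API (which targets Python 2) and assumes a working default webcam; it is not an excerpt from the article.

```python
from SimpleCV import Camera

cam = Camera()                 # talks to the built-in webcam automatically
img = cam.getImage()           # capture a single frame
blobs = img.findBlobs()        # segment contiguous regions ("blobs")
if blobs:
    blobs.draw(width=2)        # outline the detected blobs on the frame
img.show()                     # display the annotated frame
```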
pawanosplabs

How Healthcare Providers Can Prepare for Artificial Intelligence, Machine Learning - 0 views

  •  
    Learn how leveraging artificial intelligence for hospital operations can help hospital administrators reduce costs, increase revenue, and simplify hospital workflow management.
otakuhacks

Transformers in NLP: Creating a Translator Model from Scratch | Lionbridge AI - 0 views

  •  
    Transformers have now become the de facto standard for NLP tasks. Originally developed for sequence transduction tasks such as speech recognition, translation, and text-to-speech, transformers rely on attention mechanisms rather than the recurrent or convolutional networks of earlier architectures, which makes them much more efficient. And although transformers were developed for NLP, they've also been applied in computer vision and music generation. However, for all their wide and varied uses, transformers are still very difficult to understand, which is why I wrote a detailed post describing how they work at a basic level. It covers the encoder and decoder architecture and the whole dataflow through the different pieces of the neural network. In this post, we'll dig deeper into transformers by implementing our own English-to-German translator.
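At the heart of the encoder and decoder blocks mentioned above is scaled dot-product attention. The following NumPy sketch shows that single operation in isolation (it is not code from the Lionbridge post); the 3-token, 4-dimensional input is an arbitrary toy example.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over the keys
    return weights @ V                                # weighted sum of values

# Toy self-attention: 3 tokens, model dimension 4 (random stand-in embeddings)
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(x, x, x)
print(out.shape)   # (3, 4): one contextualized vector per token
```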
mikhail-miguel

Infinite Drum - Create unique beats with AI-powered tool using everyday sounds (experim... - 0 views

  •  
    Infinite Drum: Create unique beats with AI-powered tool using everyday sounds (experiments.withgoogle.com).