Robotics & AI: Group items tagged AT

mikhail-miguel

Juri Flow - Legal assistance anytime, anywhere with the finest Artificial Intelligence lawyer at your fingertips (juriflow.com)

  • Juri Flow: Legal assistance anytime, anywhere with the finest Artificial Intelligence lawyer at your fingertips! (juriflow.com).
mikhail-miguel

Mails.ai - Unlimited Cold Email Outreach at Scale (mails.ai).

  • Mails.ai: Unlimited Cold Email Outreach at Scale (mails.ai).
mikhail-miguel

AI Gallery - Generated images at speed, with variety (aigallery.app).

  • AI Gallery: Generated images at speed, with variety (aigallery.app).
mikhail-miguel

Luminal - Clean, transform and analyze spreadsheets at lightspeed with Artificial Intelligence (getluminal.com).

  • Luminal: Clean, transform and analyze spreadsheets at lightspeed with Artificial Intelligence; it helps you tame large, messy spreadsheets using GPT-3 (getluminal.com).
mikhail-miguel

Postly - Design and publish your social media marketing campaigns at scale (postly.ai).

  • Postly: Design and publish your social media marketing campaigns at scale (postly.ai).
Aasemoon =)

robots.net - Physics-based Planning

  • Later this month, Carnegie Mellon's CMDragons small-size robotic soccer team will be competing again at RoboCup, to be held in Singapore. CMDragons has tended to find its edge in software rather than hardware. Its latest software advantage will be its new "physics-based planning", which uses physics to decide how to move and turn with the ball in order to maintain control. Previous control strategies simply planned where the robot should move to and shoot from, assuming that a ball placed at the front center of the dribbler bar would stay there. The goal of RoboCup is to create a humanoid robotic soccer team that can compete against human players by 2050. Manuela Veloso, the professor who leads the Carnegie Mellon robotic soccer lab, "believe[s] that the physics-based planning algorithm is a particularly noteworthy accomplishment" that will take the effort one step closer to the collective goal.
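    A minimal sketch of the idea behind physics-based planning, not the CMDragons code: rather than assuming the ball stays glued to the dribbler bar, the planner predicts whether a candidate maneuver would exceed the friction holding the ball in place. The friction coefficient and the simple point-contact model below are illustrative assumptions.

        import math

        MU = 0.3   # assumed ball/dribbler friction coefficient (illustrative)
        G = 9.81   # gravity, m/s^2

        def ball_stays_on_dribbler(speed_mps, turn_radius_m, decel_mps2=0.0):
            """True if friction can supply the acceleration the ball needs to
            follow the robot through a turn of this radius at this speed."""
            centripetal = speed_mps ** 2 / turn_radius_m    # lateral acceleration needed
            required = math.hypot(centripetal, decel_mps2)  # combine with braking
            return required <= MU * G                       # friction budget

        # Planning then searches over (speed, turn radius) pairs that keep this
        # predicate true, instead of planning position and heading alone.
        print(ball_stays_on_dribbler(1.0, 0.5))   # gentle turn: ball likely stays
        print(ball_stays_on_dribbler(2.5, 0.5))   # fast, tight turn: ball likely lost
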
Aasemoon =)

IEEE Spectrum: Engineers Turn Robot Arm into Formula 1 Simulator

  • As Paolo Robuffo Giordano and colleagues at the Max Planck Institute for Biological Cybernetics, in Tübingen, Germany, would have it, scientific research means riding the business end of a giant industrial robot arm while playing video games. But hey -- they produced some serious research on it, which was presented at ICRA 2010. The CyberMotion Simulator is basically a full motion simulator adapted to a racing car game. Players (or subjects, as the researchers prefer to call them) sit in a cabin on a robot arm some 2 meters off the ground and drive a Ferrari F2007 around a projected track with a force-feedback steering wheel and pedals. The aim is to make the experience as realistic as possible without having to buy a real F2007, and to test the simulator with an environment that requires sudden, massive acceleration.
Aasemoon =)

Drive Servo Control Problems

  • Perhaps the most difficult control problem for a drive servo is that of going down a ramp. Any back-drivable drive servo will exhibit a freewheeling velocity on a given ramp: the speed at which the robot will roll down the ramp in an unpowered state. At this speed, the surface drag and internal drag of the servo equal the gravitational force multiplied by the sine of the slope, so the freewheeling speed is load dependent (a numeric sketch of this balance follows below).
  • Great series of articles. Make sure to check out parts 1 and 2.
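    A rough numeric sketch of the freewheeling balance described above, assuming a simple drag model (constant surface drag plus viscous servo drag); the coefficients are made up for illustration, not taken from the article.

        import math

        def freewheel_speed(mass_kg, slope_deg, c_drag_n, b_drag_ns_per_m):
            """Speed where gravity along the ramp equals total drag:
            m*g*sin(slope) = c_drag + b_drag*v, solved for v."""
            gravity_along_ramp = mass_kg * 9.81 * math.sin(math.radians(slope_deg))
            if gravity_along_ramp <= c_drag_n:
                return 0.0  # drag alone holds the robot on the ramp
            return (gravity_along_ramp - c_drag_n) / b_drag_ns_per_m

        # Doubling the load raises the freewheeling speed: it is load dependent.
        print(freewheel_speed(mass_kg=5.0,  slope_deg=10, c_drag_n=2.0, b_drag_ns_per_m=4.0))
        print(freewheel_speed(mass_kg=10.0, slope_deg=10, c_drag_n=2.0, b_drag_ns_per_m=4.0))
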
Aasemoon =)

robots.net - Robots: Distributed Flight Array

  • In its latest episode, the Robots Podcast interviews the lead researcher of the Distributed Flight Array and one of my colleagues at ETH Zurich's IDSC, Raymond Oung. The Distributed Flight Array (DFA) is an aerial modular robot. Each individual module has a single, large propeller and a set of omniwheels to move around. Since a single propeller does not allow stable flight, modules move around to connect to each other. As shown in this video of the DFA, the resulting random shape then takes flight. After a few minutes of hovering, the structure breaks up and the modules fall back to the ground, restarting the cycle. Like most projects at the IDSC, the DFA is grounded in rigorous mathematics and design principles and combines multiple goals: it serves as a real-world testbed for research in distributed estimation and control, it abstracts many of the real-world issues of the next generation of distributed multi-agent systems, and it illustrates otherwise abstract concepts like distributed sensing and control for the general public. For more information on current work, future plans and real-world applications, read on or tune in!
Aasemoon =)

Brain-controlled prosthetic limb most advanced yet

  • Scientists at the Johns Hopkins University Applied Physics Laboratory (APL) were awarded no less than $34.5 million by the Defense Advanced Research Projects Agency (DARPA) to continue their outstanding work in the field of prosthetic limb testing, which has seen them come up with the most advanced model yet. Their Modular Prosthetic Limb (MPL) system is just about ready to be tested on human subjects, as it has proved successful with monkeys. Basically, the prosthetic arm is controlled by the brain through micro-arrays that are implanted (gently) in the head. They record brain signals and send the commands to the computer software that controls the arm. To be honest, it will be interesting to see just how these hair-chips are attached to the brain, but the APL says clinical tests have shown the devices to be entirely harmless. The monkeys didn’t mind them too much, at least.
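    A toy sketch of the signal path described above (recorded neural activity in, arm command out); the linear decoder and every number below are illustrative assumptions, not APL's actual pipeline.

        # Map a vector of firing rates to a 3-D velocity command for the arm.
        WEIGHTS = [
            [0.02, -0.01,  0.00,  0.03],   # x-velocity row (made-up decoder weights)
            [0.00,  0.02, -0.02,  0.01],   # y-velocity row
            [0.01,  0.00,  0.02, -0.01],   # z-velocity row
        ]

        def decode(firing_rates):
            """Linear decoder: arm velocity = W @ firing_rates."""
            return [sum(w * r for w, r in zip(row, firing_rates)) for row in WEIGHTS]

        # One 10 ms window of (made-up) spike counts from four channels:
        print(decode([12, 5, 0, 8]))   # -> [vx, vy, vz] sent to the arm controller
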
Aasemoon =)

robots.net - Robots: Programmable Matter

  • The latest episode of the Robots Podcast looks at the following scenario: Imagine being able to throw a handful of smart matter into a tank full of liquid and then pulling out a ready-to-use wrench once the matter has assembled. This is the vision of this episode's guests Michael Tolley and Jonas Neubert from the Computational Synthesis Laboratory run by Hod Lipson at Cornell University, NY. Tolley and Neubert give an introduction to Programmable Matter and then present their research on stochastic assembly of matter in fluid, including both simulation (see video above) and real-world implementation. Read on or tune in!
Aasemoon =)

Scalable Object Recognition | Willow Garage

  • Marius Muja from the University of British Columbia returned to Willow Garage this summer to continue his work on object recognition. In addition to working on an object detector that can scale to a large number of objects, he has also been designing a general object recognition infrastructure. One problem that many object detectors have is that they get slower as they learn new objects. Ideally we want a robot that goes into an environment and is capable of collecting data and learning new objects by itself. In doing this, however, we don't want the robot to get progressively slower as it learns new objects. Marius worked on an object detector called Binarized Gradient Grid Pyramid (BiGGPy), which uses the gradient information from an image to match it to a set of learned object templates. The templates are organized into a template pyramid. This tree structure has low resolution templates at the root and higher resolution templates at each lower level. During detection, only a fraction of this tree must be explored. This results in big speedups and allows the detector to scale to a large number of objects.
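    A minimal sketch (not the BiGGPy implementation) of why a template pyramid keeps detection fast as the object set grows: cheap low-resolution templates sit at the root, and a subtree of higher-resolution templates is only searched when its parent already matched well. The node layout and the bit-string matching below are assumptions for illustration.

        def match(template, patch):
            # Placeholder similarity in [0, 1]; the real detector compares
            # binarized gradient grids instead of bit strings.
            return sum(a == b for a, b in zip(template, patch)) / len(template)

        def detect(node, patch, threshold=0.7):
            """Depth-first search of the pyramid; prunes whole subtrees on a poor match."""
            if match(node["template"], patch) < threshold:
                return []                       # skip this subtree entirely
            if not node["children"]:
                return [node["label"]]          # leaf = one learned object template
            hits = []
            for child in node["children"]:
                hits.extend(detect(child, patch, threshold))
            return hits

        pyramid = {"template": "1010", "label": None, "children": [
            {"template": "1011", "label": "mug",  "children": []},
            {"template": "0101", "label": "bowl", "children": []},
        ]}
        print(detect(pyramid, "1011"))   # only the matching branch is explored -> ['mug']
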
Aasemoon =)

robots.net - Robots: Swarming Satellites

  • The latest episode of the Robots podcast interviews Dr. Alvar Saenz-Otero from MIT on the SPHERES project. SPHERES (Synchronized Position Hold Engage and Reorient Experimental Satellites) are basketball-sized satellites able to fly in and maintain formation at nanometer precision. In the second part of this episode we continue our quest for a good definition of a robot by looking at a well-known definition dating back to 1979. Read on or tune in!
frank smith

Memristor minds: The future of artificial intelligence - tech - 08 July 2009 - New Scientist

  • EVER had the feeling something is missing? If so, you're in good company. Dmitri Mendeleev did in 1869 when he noticed four gaps in his periodic table. They turned out to be the undiscovered elements scandium, gallium, technetium and germanium. Paul Dirac did in 1929 when he looked deep into the quantum-mechanical equation he had formulated to describe the electron. Besides the electron, he saw something else that looked rather like it, but different. It was only in 1932, when the electron's antimatter sibling, the positron, was sighted in cosmic rays that such a thing was found to exist. In 1971, Leon Chua had that feeling. A young electronics engineer with a penchant for mathematics at the University of California, Berkeley, he was fascinated by the fact that electronics had no rigorous mathematical foundation. So like any diligent scientist, he set about trying to derive one. And he found something missing: a fourth basic circuit element besides the standard trio of resistor, capacitor and inductor. Chua dubbed it the "memristor". The only problem was that as far as Chua or anyone else could see, memristors did not actually exist. Except that they do.
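    For reference (standard textbook relations, not quoted from the article): the four basic circuit variables are charge q, flux φ, voltage v and current i, and Chua's symmetry argument was that the pairwise relations between them leave one element missing.

        dv = R di     (resistor:  relates voltage and current)
        dq = C dv     (capacitor: relates charge and voltage)
        dφ = L di     (inductor:  relates flux and current)
        dφ = M dq     (memristor: relates flux and charge)

    Since dφ = v dt and dq = i dt, the last relation gives v = M(q) i: a resistance whose value depends on the charge that has flowed through the device, which is what lets it "remember".
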
Aasemoon =)

Oh, Those Robot Eyes! | h+ Magazine

  • Willow Garage is organizing a workshop at the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR) 2010 in San Francisco to discuss the intersection of computer vision with human-robot interaction. Willow Garage is the hardware and open source software organization behind the Robot Operating System (ROS) and the PR2 robot development platform. Here’s a recent video from Willow Garage of work done at the University of Illinois on how robots can be taught to perceive images.
Aasemoon =)

IEEE Spectrum: When Will We Become Cyborgs?

  • I remember when, a decade ago, Kevin Warwick, a professor at the University of Reading, in the U.K., implanted a radio chip in his own arm. The feat caused quite a stir. The implant allowed him to operate doors, lights, and computers without touching anything. On a second version of the project he could even control an electric wheelchair and produce artificial sensations in his brain using the implanted chip. Warwick had become, in his own words, a cyborg. The idea of a cyborg -- a human-machine hybrid -- is common in science fiction and although the term dates back to the 1960s it still generates a lot of curiosity. I often hear people asking, When will we become cyborgs? When will humans and machines merge? Although some researchers might have specific time frames in mind, I think a better answer is: It's already happening. When we look back at the history of technology, we tend to see distinct periods -- before the PC and after the PC, before the Internet and after the Internet, and so forth -- but in reality most technological advances unfold slowly and gradually. That's particularly true with the technologies that are allowing us to modify and enhance our bodies.
Aasemoon =)

robots.net - Robots: Chaos Control

  • Walking, swallowing, respiration and many other key functions in humans and other animals are controlled by Central Pattern Generators (CPGs). In essence, CPGs are small, autonomous neural networks that produce rhythmic outputs, usually found in animals' spinal cords rather than their brains. Their relative simplicity and obvious success in biological systems have led to some success in using CPGs in robotics. However, current systems are restricted to very simple CPGs (e.g., restricted to a single walking gait). A recent breakthrough at the BCCN at the University of Göttingen, Germany has now made it possible to achieve 11 basic behavioral patterns (various gaits, orienting, taxis, self-protection) from a single CPG, closing in on the 10–20 different basic behavioral patterns found in a typical cockroach. The trick: work with a chaotic, rather than a stable periodic, CPG regime. For more on CPGs, listen to the latest episode of the Robots podcast on Chaos Control, which interviews Poramate Manoonpong, one of the lead researchers in Göttingen, and Alex Pitti from the University of Tokyo, who uses chaos controllers that can synchronize to the dynamics of the body they are controlling.
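    A minimal sketch of a CPG as two coupled phase oscillators (not the chaotic Göttingen controller): each oscillator advances at its own frequency and is pulled toward a fixed phase offset from its neighbour, producing a single rhythmic pattern such as an alternating gait. The frequency, coupling strength and step size are illustrative assumptions.

        import math

        def step(phases, dt=0.01, freq=2.0, coupling=1.5, offset=math.pi):
            """One Euler step of two Kuramoto-style coupled oscillators."""
            p0, p1 = phases
            dp0 = 2 * math.pi * freq + coupling * math.sin(p1 - p0 - offset)
            dp1 = 2 * math.pi * freq + coupling * math.sin(p0 - p1 + offset)
            return (p0 + dp0 * dt, p1 + dp1 * dt)

        phases = (0.0, 0.3)
        for _ in range(500):
            phases = step(phases)
        print(math.sin(phases[0]), math.sin(phases[1]))   # settles roughly anti-phase

    The chaotic-regime result described above goes further: one such circuit can be steered between many distinct rhythms, instead of being locked to a single pattern as in this sketch.
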
fishead ...*∞º˙

Robots with skin enter our touchy-feely world - tech - 19 April 2010 - New Scientist

  • BEAUTY may be only skin deep, but for humanoid robots a fleshy covering is about more than mere aesthetics: it could be essential to making them socially acceptable. A touch-sensitive coating could prevent such machines from accidentally injuring anybody within their reach (a toy version of such a safety check is sketched after this item). In May, a team at the Italian Institute of Technology (IIT) in Genoa will dispatch to labs across Europe the first pieces of touch-sensing skin designed for their nascent humanoid robot, the iCub. The skin IIT and its partners have developed contains flexible pressure sensors that aim to put robots in touch with the world. "Skin has been one of the big missing technologies for humanoid robots," says roboticist Giorgio Metta at IIT. One goal of making robots in a humanoid form is to let them interact closely with people. But that will only be possible if a robot is fully aware of what its powerful motorised limbs are in contact with.
  • Wow this is cool!
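    A toy sketch of the safety behaviour such a skin enables (not IIT's iCub software): scan the pressure-sensing cells each control cycle and halt the arm as soon as contact exceeds a threshold. The taxel layout, units and stop interface are assumptions.

        CONTACT_THRESHOLD = 0.5   # assumed pressure units

        def check_skin(taxel_readings, stop_arm):
            """Call stop_arm() if any skin cell reports unexpected contact."""
            for i, pressure in enumerate(taxel_readings):
                if pressure > CONTACT_THRESHOLD:
                    stop_arm()
                    return i      # index of the taxel that triggered the stop
            return None

        # Example control-cycle tick with made-up readings:
        print(check_skin([0.0, 0.1, 0.8, 0.0], stop_arm=lambda: print("halting motors")))
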
Aasemoon =)

robots.net - Robots: URBI Software Platform
