
Robotics & AI: Group items tagged Learning


nehasaxena

iOS Chatbots: The Future of Mobile Assistance - The Dehradun Daily - 0 views

  • Discover the future of mobile assistance with iOS chatbots! Learn how this revolutionary technology is transforming user experiences. Get insights on development, implementation, and optimization for maximum impact. #iOSchatbots #chatbot #AI #mobileapps #userexperience #futureoftech
Aasemoon =)

Artificial Intelligence and Robotics: Robot fish leader - 0 views

  • Humans have been coming up with innovative ways to plunder the Earth and its resources for as long as we have existed, so perhaps it's time we give back a little. Leading aquatic animals, such as fish, away from underwater power plant turbines seems like a good place to begin, and a researcher at the Polytechnic Institute of New York University has designed a robot that will help with just that. Assistant professor Maurizio Porfiri studied the characteristics of small schools of fish to learn what exactly they look for in a leader, and he designed a palm-sized robot that possesses these traits. By taking command, this leader can be programmed to guide the fish away from danger, but the tricky part is getting the animals to accept the robot as one of their own.
Aasemoon =)

Add-ons for the RDS Simulator - Microsoft Robotics Blog - Site Home - MSDN Blogs - 0 views

  • The Robotics Developer Studio (RDS) Simulator is a key feature of the package that allows you to get started without buying expensive robots. It is a great tool for use in education. The add-ons outlined below help you to create your own simulation environments and get started on learning about robotics.
Aasemoon =)

ICT Results - Computers to read your body language? - 0 views

  • Can a computer read your body language? A consortium of European researchers thinks so, and has developed a range of innovative solutions from escalator safety to online marketing. The keyboard and mouse are no longer the only means of communicating with computers. Modern consumer devices will respond to the touch of a finger and even the spoken word, but can we go further still? Can a computer learn to make sense of how we walk and stand, to understand our gestures and even to read our facial expressions? The EU-funded MIAUCE project set out to do just that. “The motivation of the project is to put humans in the loop of interaction between the computer and their environment,” explains project coordinator Chaabane Djeraba, of CNRS in Lille. “We would like to have a form of ambient intelligence where computers are completely hidden,” he says. “This means a multimodal interface so people can interact with their environment. The computer sees their behaviour and then extracts information useful for the user.”
Aasemoon =)

TechOnline | Introduction to NI LabVIEW Robotics - 0 views

  • NI LabVIEW Robotics is a software package that provides a complete suite of tools to help you rapidly design sophisticated robotics systems for medical, agricultural, automotive, research, and military applications. The LabVIEW Robotics Software Bundle includes all of the functionality you need, from multicore real-time and FPGA design capabilities to vision, motion, control design, and simulation. Watch an introduction and demonstration of LabVIEW Robotics.
Aasemoon =)

e-nuvo HUMANOID - 0 views

  • The Nippon Institute of Technology, with Harada Vehicle Design, ZMP, and ZNUG Design, has developed a humanoid robot about the size of an elementary school student for educational purposes. The university adopted 35 of ZMP’s e-nuvo WALK robots in 2004 for a 1:1 student-robot ratio. Whereas the e-nuvo WALK (the educational version of NUVO) is quite small, the new robot is tall enough to interact with its environment in a more meaningful way. Students will demonstrate the robot at elementary and junior high schools, as well as care facilities. The goal is to improve student learning by raising awareness of bipedal robot technology and its connection to math and physics, while also giving them hands-on experience with the bot. Additionally, by visiting care facilities the university students will come to understand the real-world needs and applications for robots.
Aasemoon =)

Robocopter Responds To Natural Language Direction | BotJunkie - 0 views

  • This little helicopter is able to understand you when you tell it what to do. No pushing buttons, no using special commands; you just tell it where you want it to go and (eventually) it goes. Of course, I’m sure it required a bit of work to define where “door” and “elevator” and “window” are, but it’s a much more intuitive way to control a UAV that works when your hands are full, when you’re stressed (think military), or simply when you have no idea how to control a UAV. I don’t have much in the way of other details on this project, besides the fact that it probably comes from the Robust Robotics Group at MIT, and possibly from someone who lives in this dorm. How do I know? Well, one of the research goals of the RRG is “to build social robots that can quickly learn what people want without being annoying or intrusive,” and this video is on the same YouTube channel. ‘Nuff said.
Aasemoon =)

IEEE Spectrum: Virginia Tech's Humanoid Robot CHARLI Walks Tall - 0 views

  • Dennis Hong, a professor of mechanical engineering and director of Virginia Tech's Robotics & Mechanisms Laboratory, or RoMeLa, has created robots with the most unusual shapes and sizes -- from strange multi-legged robots to amoeba-like robots with no legs at all. Now he's unveiling a new robot with a more conventional shape: a full-sized humanoid robot called CHARLI, or Cognitive Humanoid Autonomous Robot with Learning Intelligence. The robot is 5 feet tall (1.52 meters), untethered and autonomous, capable of walking and gesturing. But its biggest innovation is that it does not rely on rotary actuators at its joints. Most humanoid robots -- Asimo, Hubo, Mahru -- use DC motors to rotate various joints (typically at the waist, hips, knees, and ankles). The approach makes sense and, in fact, today's humanoids can walk, run, and climb stairs. However, this approach doesn't correspond to how our own bodies work, with our muscles contracting and relaxing to rotate our various joints.
otakuhacks

Transformers in NLP: Creating a Translator Model from Scratch | Lionbridge AI - 0 views

  • Transformers have now become the de facto standard for NLP tasks. Originally developed for sequence transduction tasks such as speech recognition, translation, and text-to-speech, transformers rely on attention mechanisms rather than recurrence or convolution, which makes them much more efficient than previous architectures. And although transformers were developed for NLP, they've also been applied in the fields of computer vision and music generation. For all their wide and varied uses, however, transformers are still very difficult to understand, which is why I wrote a detailed post describing how they work on a basic level. It covers the encoder and decoder architecture, and the whole dataflow through the different pieces of the neural network. In this post, we'll dig deeper into transformers by implementing our own English-to-German translator.
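    As a rough illustration (my own sketch, not code from the linked post), the operation at the heart of each of those encoder and decoder layers is scaled dot-product attention. The minimal NumPy version below uses shapes and variable names chosen only for this example:

        import numpy as np

        def scaled_dot_product_attention(Q, K, V):
            # Q, K, V: arrays of shape (seq_len, d_k); returns (seq_len, d_k).
            d_k = Q.shape[-1]
            scores = Q @ K.T / np.sqrt(d_k)                 # query/key similarity
            scores -= scores.max(axis=-1, keepdims=True)    # shift for numerical stability
            weights = np.exp(scores)
            weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the keys
            return weights @ V                              # weighted sum of the values

        # Toy self-attention over 4 tokens with 8-dimensional embeddings (Q = K = V).
        rng = np.random.default_rng(0)
        x = rng.normal(size=(4, 8))
        print(scaled_dot_product_attention(x, x, x).shape)  # (4, 8)

    In the full architecture, several such attention heads run in parallel over learned linear projections of Q, K, and V inside every encoder and decoder layer.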
mikhail-miguel

Perplexity Artificial Intelligence - Perplexity Artificial Intelligence is a demo Artif... - 0 views

  • Perplexity AI unlocks the power of knowledge with information discovery and sharing: a comprehensive knowledge hub where anyone can explore and learn effortlessly. In pursuit of this vision, we are committed to providing citations with every answer, giving proper attribution to sources of information and allowing for verification. Perplexity Artificial Intelligence is a demo AI search engine inspired by OpenAI WebGPT (perplexity.ai).
mikhail-miguel

Hugging Face Introduction - 0 views

  • We're on a journey to advance and democratize artificial intelligence through open source and open science.
Aasemoon =)

Android Robotics Projects - Android - Mobile - 5 views

  • Android Robotics Projects
  • Setting up a development environment ready for Android robotics code
  • Learning how to program for the AVR microcontroller
  • Connecting servos and sensors
  • Home-brewing your own PCB design, and choosing PCB suppliers
  • Mounting the phone as a robot brain and teaching the robot to obey touch commands
  • Approaching and designing different robot architectures