Assistive Robotics Lab at UCF

YouTube Videos


  • Passive Physical Human-Robot Interaction

    • Description:
      Blind massaging of a mannequin's back (among other interactions), using a first-principles (differential-geometric) formulation for safe Physical Human-Robot Interaction (PHRI).
  • Autonomous Robotic Grasping of Novel Objects in Cluttered Scenes

    • Description:
      Faster-than-real-time surface segmentation of novel objects, using a first-principles (geometry and calculus) formulation with a parallelized implementation on a desktop computer.
  • UCFTV Showcases UCF-MANUS


    • Description:
      The Assistive Robotics team at UCF has developed a prototype that may one day improve the lives of people with disabilities.
  • Orlando Health study examines technology developed by UCF


    • Description:
      The Orlando Health Rehabilitation Institute and the University of Central Florida are evaluating technology developed by the university to operate the robotic arm, in an effort to help design controls that are best suited for patients as they reach forward for a greater quality of life.
  • UCF Manus Arm Goes Stereo


    • Description:
      This video shows different examples of automated object grasping using stereo vision in the hand of a robot. We demonstrate the system with objects of different sizes, objects on high and low shelves as well as on the floor, and both upright and laid-down objects. We also show how the GUI can be commanded using head tracking, speech, touch screen, trackball, etc., allowing users with varying levels of disability to access the system, and how the system provides audio and video feedback to the user. Note that all tasks are performed under natural lighting and in an unstructured environment.
  • Smart Manus Arm Overview


    • Description:
      This is a simple demonstration showing the gross and fine motion phases of the robotic arm after the user indicates an object of interest through speech, trackball, touchscreen, or another input device. Gross motion is the stage in which the robot makes a rough approach to the object of interest. Once the object is centered and zoomed in on, the robot performs fine motion, in which the gripper orients itself to retrieve the object (a minimal sketch of this two-phase sequence appears after this list).
  • Manual versus Automatic Control


    • Description:
      This video shows three different control modes on the robot: (a) manual using a keypad, (b) manual using a graphical user interface (GUI), and (c) automatic using the GUI. In manual keypad control, the user drives the robot in Cartesian mode by giving keypad commands such as UP, LEFT, YAW RIGHT, PITCH DOWN, etc. In the second mode, the user interfaces with the robot through the GUI instead of the keypad, making the Cartesian motions easier to command. The third mode, Auto control, consolidates the 14 different command buttons on the GUI into a smaller and more intuitive subset such as approach, retreat, and go, which simplifies common pick-and-place tasks (see the consolidation sketch after this list).
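
The gross/fine motion sequence described in the Smart Manus Arm Overview above can be pictured as a small two-phase loop. The sketch below only illustrates that sequencing and is not the lab's controller code: the MockArm class, its methods (approach, object_centered, orient_gripper, grasp), and the step counter are assumptions made to keep the example runnable.

```python
from enum import Enum, auto


class Phase(Enum):
    """Motion phases from the video: rough approach, then precise retrieval."""
    GROSS_MOTION = auto()   # rough approach toward the indicated object
    FINE_MOTION = auto()    # gripper orients itself and retrieves the object
    DONE = auto()


class MockArm:
    """Stand-in arm used only to make the sketch runnable; not the real UCF MANUS interface."""

    def __init__(self):
        self.steps_to_center = 3  # pretend a few servoing steps are needed to center the object

    def approach(self, target):
        self.steps_to_center -= 1
        print(f"gross motion: approaching {target}")

    def object_centered(self, target):
        return self.steps_to_center <= 0

    def orient_gripper(self, target):
        print(f"fine motion: orienting gripper toward {target}")

    def grasp(self, target):
        print(f"fine motion: grasping {target}")


def retrieve(arm, target):
    """Run the two-phase (gross -> fine) retrieval loop on an arm-like object."""
    phase = Phase.GROSS_MOTION
    while phase is not Phase.DONE:
        if phase is Phase.GROSS_MOTION:
            arm.approach(target)
            if arm.object_centered(target):  # object centered and zoomed in on
                phase = Phase.FINE_MOTION
        else:  # Phase.FINE_MOTION
            arm.orient_gripper(target)
            arm.grasp(target)
            phase = Phase.DONE


if __name__ == "__main__":
    retrieve(MockArm(), "cup")
```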

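Similarly, the consolidation of the GUI's 14 command buttons into a small task-level set (approach, retreat, go) in Auto control mode can be sketched as a mapping from task-level commands to low-level Cartesian motions. This is a hypothetical illustration: only UP, LEFT, YAW RIGHT, PITCH DOWN, approach, retreat, and go come from the description above; the other command names and the particular expansions are invented for the example, and the real system plans its motions from vision rather than from a fixed table.

```python
from enum import Enum


class ManualCommand(Enum):
    """Low-level Cartesian commands used in the manual (keypad or GUI) modes."""
    UP = "up"
    DOWN = "down"            # assumed counterpart; the video names UP, LEFT, YAW RIGHT, PITCH DOWN
    LEFT = "left"
    RIGHT = "right"
    YAW_RIGHT = "yaw right"
    PITCH_DOWN = "pitch down"


class AutoCommand(Enum):
    """Consolidated, task-level commands exposed in Auto control mode."""
    APPROACH = "approach"
    RETREAT = "retreat"
    GO = "go"


# Purely illustrative expansion: each task-level command stands in for a short
# sequence of Cartesian motions in this toy model.
AUTO_TO_MANUAL = {
    AutoCommand.APPROACH: [ManualCommand.PITCH_DOWN, ManualCommand.LEFT, ManualCommand.UP],
    AutoCommand.RETREAT: [ManualCommand.RIGHT, ManualCommand.DOWN],
    AutoCommand.GO: [ManualCommand.UP],
}


def execute(auto_cmd: AutoCommand) -> None:
    """Print the Cartesian motions this toy model would issue for an auto command."""
    for step in AUTO_TO_MANUAL[auto_cmd]:
        print(f"{auto_cmd.value}: issuing {step.value}")


if __name__ == "__main__":
    execute(AutoCommand.APPROACH)
```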

Downloadable Videos

Intelligent Gripping via Translational & Rotational Slip Detection
Smart Grasping of Objects
Slip Detection
Particle Filter Based Robust Target Tracking
Orlando Health study examines technology developed by UCF
UCF Manus Arm Goes Stereo
Smart Manus Arm Overview
Manual versus Automatic Control

Presentations

2009.08 UCF MANUS ARM - General Overview
2009.09 Focus Group Analysis (Rebekah Hazlett)
2009.09 Lessons Learned from User Trials (Dae-Jin Kim)
2010.04 Applying Human Factors Psychology Principles to Design Graphical User Interface of an Assistive Robot (Melissa Smith)