QTrobot

The QTrobot is a humanoid robot that combines a range of motors, sensors and substantial on-board computing power, providing students with an excellent hardware package for investigating and learning about robotics and human-computer interaction (HCI).

From a hardware perspective, the QTrobot contains several sensors for perceiving the world around it, including:

  • ReSpeaker Mic Array v2.0: provides far-field voice capture for voice-based interaction, making it possible to engage in conversation with the QTrobot.
  • Intel® RealSense™ Depth Camera D455: provides colour and depth image data, giving the QTrobot high-quality visual information about the world (a minimal example of reading these frames is sketched below).
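
On the robot, these sensor streams are exposed to software through ROS topics (the ROS programming interface is described later in this article). As a rough sketch, assuming the RealSense camera publishes colour frames on the standard realsense2_camera topic /camera/color/image_raw (check with rostopic list on your own robot), the images could be read in Python like this:

    #!/usr/bin/env python3
    # Sketch: subscribe to the RealSense colour stream over ROS.
    # The topic name below is the realsense2_camera default and is an assumption.
    import rospy
    from sensor_msgs.msg import Image
    from cv_bridge import CvBridge

    bridge = CvBridge()

    def on_image(msg):
        # Convert the ROS Image message into an OpenCV BGR array for processing.
        frame = bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')
        rospy.loginfo("Received a %dx%d colour frame", msg.width, msg.height)

    if __name__ == '__main__':
        rospy.init_node('camera_listener')
        rospy.Subscriber('/camera/color/image_raw', Image, on_image, queue_size=1)
        rospy.spin()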

In total the QTrobot has 8 motors, which control the following head and arm joints:

  • Head Yaw and Pitch
  • Shoulder Pitch and Roll (Left and Right Arms)
  • Elbow Roll (Left and Right Arms)

Controlling these motors can be achieved using the provided built-in gestures or by commanding joint positions directly, while more advanced motion such as handwriting can be achieved through students learning and applying inverse kinematics on the robot.
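
As a minimal sketch of the first approach, the snippet below plays a built-in gesture and then commands the right-arm joints directly. The topic names (/qt_robot/gesture/play, /qt_robot/right_arm_position/command), the gesture name QT/happy and the joint ordering follow the LuxAI documentation but are assumptions here and should be verified against your own robot:

    #!/usr/bin/env python3
    # Sketch: trigger a built-in gesture, then set right-arm joint angles over ROS.
    import rospy
    from std_msgs.msg import String, Float64MultiArray

    if __name__ == '__main__':
        rospy.init_node('gesture_demo')
        gesture_pub = rospy.Publisher('/qt_robot/gesture/play', String, queue_size=1)
        arm_pub = rospy.Publisher('/qt_robot/right_arm_position/command',
                                  Float64MultiArray, queue_size=1)
        rospy.sleep(1.0)  # give the publishers time to connect

        # Play one of the built-in gestures by name (assumed name).
        gesture_pub.publish(String(data='QT/happy'))
        rospy.sleep(3.0)  # wait for the gesture to finish

        # Command the right-arm joints directly; assumed order and units:
        # [ShoulderPitch, ShoulderRoll, ElbowRoll] in degrees.
        arm_pub.publish(Float64MultiArray(data=[-90.0, -60.0, -30.0]))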

A QT Robot with its right arm raised

QTrobot also has an integrated 800x480 display in the head which is mainly used for showing facial emotions, with over 30 emotions built in. However, the display is not limited to this, and students can use it as a standard screen for displaying any visual content.
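
As a small sketch of driving the display from code, an emotion can be shown by publishing its name over ROS; the topic /qt_robot/emotion/show and the emotion name QT/happy follow the LuxAI examples but are assumptions to verify on your robot:

    #!/usr/bin/env python3
    # Sketch: show one of the built-in facial emotions on the head display.
    import rospy
    from std_msgs.msg import String

    if __name__ == '__main__':
        rospy.init_node('emotion_demo')
        emotion_pub = rospy.Publisher('/qt_robot/emotion/show', String, queue_size=1)
        rospy.sleep(1.0)  # allow the publisher to register with the ROS master
        emotion_pub.publish(String(data='QT/happy'))  # assumed emotion name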

Internally the QTrobot is powered by a Raspberry Pi for general computational tasks, while a powerful NVIDIA AGX Orin developer board provides the capability to run state-of-the-art AI models locally on the device, including LLMs and computer vision models.

Programming the QTrobot can be achieved using the Robot Operating System (ROS) API, which is compatible with the Python, C++ and JavaScript programming languages. You can also leverage popular machine learning frameworks such as TensorFlow and PyTorch to develop and deploy machine learning models that provide advanced integration and perception. As the QTrobot uses ROS, it can communicate not only with other QTrobots but also with other robots and machines that use ROS, opening the world of distributed robotics systems to students.
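
As an illustration of that distributed aspect, the sketch below is a plain ROS publisher: any other node on the network that shares the same ROS master (via ROS_MASTER_URI) could subscribe to it, whether it runs on another QTrobot, a laptop or a different ROS-based robot. The topic name /classroom/status is invented purely for illustration:

    #!/usr/bin/env python3
    # Sketch: a generic ROS publisher that other machines on the network can subscribe to.
    import rospy
    from std_msgs.msg import String

    if __name__ == '__main__':
        rospy.init_node('status_talker')
        pub = rospy.Publisher('/classroom/status', String, queue_size=10)
        rate = rospy.Rate(1)  # publish once per second
        while not rospy.is_shutdown():
            pub.publish(String(data='QTrobot is online'))
            rate.sleep()

A matching subscriber on another machine would simply point its ROS_MASTER_URI at the robot's master and subscribe to the same topic.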

With this hardware and software, the QTrobot can provide students with many learning experiences. For example, it is ideal for learning about human-computer interaction (HCI), where the QTrobot is capable of perceiving its users, their location, identities, gestures and emotional state, as well as for learning about safe interaction.

Getting classroom ready

We have followed similar processes for both the QTrobot and the Clearpath Jackal.

First, we have had to learn the associated hardware and software. These pieces of kit come with specific software bundles that we haven't used before, so we have been learning how everything fits together.

After that, we have been liaising with IT Services to make sure the infrastructure is in place for teaching, so they know what is required (as the robots rely on the wireless network).

Finally, we have been liaising with academics about including it in their modules. Our team will document the setup, create instructions and ask them how they see it integrating with their modules. We will then work with the academics to develop this so we can use the technologies in our teaching.

The QTrobot allows us to expand the types of robots we currently use and is the first humanoid robot we have. Its human-interactive element provides a new experience and way of teaching. We also have the potential to use the kit in other Robotics modules.

It also provides consistency for our students. The software (Robot Operating System) is used in the workplace, so utilising it in our teaching will equip them with the skills they need for the workplace.