This project demonstrates how to control a robotic head using facial emotion recognition. The system uses a client-server architecture where the client detects human emotions via a camera and sends the information to an ESP32-based robotic head, which then reacts by moving servos to express corresponding emotions.
## Features
- Real-Time Emotion Recognition: Recognizes emotions like neutral, happy, sad, angry, and surprise using a webcam.
- ESP32-Based Control: Communicates with an ESP32 microcontroller over a TCP socket.
- Servo Motor Expressions: Controls multiple servo motors to simulate facial expressions.
- Blinking Mechanism: Simulates natural blinking every 10 seconds.
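The blinking feature amounts to a non-blocking elapsed-time check: the ESP32 sketch does the equivalent with `millis()`. A minimal Python sketch of the idea (class name and structure are illustrative, not taken from the project code):

```python
class BlinkTimer:
    """Fire a blink every `interval` seconds without blocking the main loop."""

    def __init__(self, interval: float = 10.0):
        self.interval = interval
        self.last_blink = 0.0  # timestamp of the last blink

    def should_blink(self, now: float) -> bool:
        # Returns True at most once per interval; the caller then
        # closes and reopens the eyelid servos.
        if now - self.last_blink >= self.interval:
            self.last_blink = now
            return True
        return False
```

This keeps the main loop free to keep reading emotion data while the eyelids blink on schedule.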
## Hardware Requirements
- **ESP32 Microcontroller**: The central controller for the robotic head; it receives emotion data and drives the servos.
- **Servo Motors (11 total)**:
  - Eye movement (X and Y axes)
  - Eyelids
  - Eyebrows (left and right)
  - Mouth (left, right, and opening/closing)
- **Power Supply for ESP32 and Servos**:
  - The supply should provide between 6 V and 12 V at a current of at least 5 A.
  - For example, a 7.5 V, 5 A supply can power both the ESP32 and the servos.
- **Camera**: A camera connected to the client machine to capture video for emotion recognition.
- **Wi-Fi Network**: The ESP32 requires a stable Wi-Fi connection to communicate with the client machine.
## Software Requirements

### Client (Python)

- Python 3.10 (this specific version is required)
- Required libraries: `opencv-python`, `fer`

### Server (ESP32)

- Arduino IDE
- ESP32 board package
- Servo library: `ESP32Servo.h`
## Installation

First, clone the repository to your local machine:

```
git clone https://github.com/heyangli0304/RobotFace.git
cd RobotFace
```

### Windows

- Download Python 3.10 from the official Python website.
- During installation, make sure to check the box "Add Python to PATH".
- Verify the installation by running the following command in Command Prompt:

  ```
  python --version
  ```

  It should return `Python 3.10.x`.
- Open a Command Prompt (press `Windows + R`, type `cmd`, and hit Enter).
- Navigate to your project folder using the `cd` command:

  ```
  cd path\to\Emotion-Based-Robotic-Head
  ```

- Create a virtual environment named `RobotFace`:

  ```
  python -m venv RobotFace
  ```

- Activate the virtual environment:

  ```
  RobotFace\Scripts\activate
  ```

  You should now see `(RobotFace)` before the command prompt, indicating the virtual environment is active.

- Install the required libraries in the virtual environment:

  ```
  pip install opencv-python fer
  ```

- Run the client:

  ```
  python client_emotion_recognition.py
  ```

  Make sure to replace `server_ip` in the client code with the ESP32's IP address.
### Linux

- Open a terminal window.
- Install Python 3.10 (if not already installed):

  ```
  sudo apt update
  sudo apt install python3.10 python3.10-venv python3.10-dev
  ```

- Verify the installation by running:

  ```
  python3.10 --version
  ```

  It should return `Python 3.10.x`.

- Navigate to your project folder:

  ```
  cd /path/to/RobotFace
  ```

- Create a virtual environment named `RobotFace`:

  ```
  python3.10 -m venv RobotFace
  ```

- Activate the virtual environment:

  ```
  source RobotFace/bin/activate
  ```

  You should now see `(RobotFace)` before the command prompt, indicating the virtual environment is active.

- Install the required libraries in the virtual environment:

  ```
  pip install opencv-python fer
  ```

- Run the client:

  ```
  python client_emotion_recognition.py
  ```

  Make sure to replace `server_ip` in the client code with the ESP32's IP address.
## ESP32 Setup
- **Download Arduino IDE**:
  - Go to the Release folder in your project directory.
  - Download and run `Arduino IDE.exe` to install the Arduino IDE on your computer.
- **Install ESP32 Board Package**:
  - In the Release folder, download and run `32_package_1.0.6_arduino.cn.exe`. This installs the necessary ESP32 board package for the Arduino IDE.
- **Open Arduino IDE**:
  - Once the installation is complete, open the Arduino IDE. The ESP32 boards should now be available for selection.
- **Select the ESP32 Board**:
  - Go to `Tools > Board > ESP32 Dev Module` (or another appropriate ESP32 board).
  - Ensure the selected board matches the one you're using.
- **Download Face Models**:
  - In the Release folder you will find `FaceModel.zip`, which contains the 3D models and necessary parts for the robot's face. Extract the contents to a folder on your computer.
- **Connect Servos**:
  - The face models in `FaceModel.zip` include parts such as the skin, mouth, and eyes, which are connected to the servos with wires or mechanical links. Connect the servos according to the instructions in the `face_control.ino` code.
- **Test the Face Movements**:
  - After connecting the servos and the face parts, run the `test_servo` code in the `server_robot_face_esp32` folder to verify that the servos move and everything functions correctly.
- Connect your ESP32 to the computer.
- Open `esp32_robot_head.ino` (the server code) in the Arduino IDE.
- Update the Wi-Fi credentials in the code:

  ```cpp
  const char* ssid = "YOUR_WIFI_SSID";
  const char* password = "YOUR_WIFI_PASSWORD";
  ```

- Upload the code to the ESP32.
Connect each servo to the appropriate pin as defined in the `servoPins` array in the code, and make sure the power supply is adequate to drive all servos.
## Running the System

- Power on the ESP32 and ensure it connects to the Wi-Fi network.
- Run the client code on the computer with a connected webcam:

  ```
  python client_emotion_recognition.py
  ```

- The ESP32 will:
  - Move the servos to express the detected emotion.
  - Blink every 10 seconds.
- Press `q` in the client window to terminate the program.
## How It Works

The client code, `client_emotion_recognition.py`, uses OpenCV and the FER library to:
- Capture frames from the webcam.
- Detect faces and recognize emotions.
- Send emotion and face position data to the ESP32 over a TCP connection.
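The sending side of these steps can be sketched in Python with the standard `socket` module. The comma-separated wire format and the port number below are assumptions for illustration; the actual format and port are defined in `client_emotion_recognition.py` and the ESP32 sketch:

```python
import socket


def build_payload(emotion: str, face_x: int, face_y: int) -> bytes:
    """Encode one detection as a newline-terminated message.

    The "emotion,x,y" format is a hypothetical example; check the
    project code for the format the client and server agree on.
    """
    return f"{emotion},{face_x},{face_y}\n".encode("utf-8")


def send_to_robot(server_ip: str, payload: bytes, port: int = 8080) -> None:
    """Open a TCP connection to the ESP32 and send one message."""
    # Port 8080 is an assumed value; use the port the ESP32 server listens on.
    with socket.create_connection((server_ip, port), timeout=2.0) as sock:
        sock.sendall(payload)
```

In the real client, `build_payload` would be called once per processed frame with the emotion label and face centre returned by FER.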
The ESP32 code, `esp32_robot_head.ino`, serves as the server. It:
- Receives emotion and face position data from the client.
- Maps face position to servo angles for X and Y axes.
- Adjusts servos to simulate the detected emotion.
- Implements a blinking mechanism every 10 seconds.
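The face-position-to-servo-angle step is a linear re-mapping, the same idea as Arduino's `map()`. A Python sketch of that mapping follows; the frame size and servo angle ranges here are assumed values, not the ones hard-coded in the ESP32 sketch:

```python
def map_range(value: float, in_min: float, in_max: float,
              out_min: float, out_max: float) -> float:
    """Linear re-mapping, equivalent to Arduino's map() but in floats."""
    return out_min + (value - in_min) * (out_max - out_min) / (in_max - in_min)


def face_to_servo_angles(face_x: float, face_y: float,
                         frame_w: int = 640, frame_h: int = 480,
                         x_range: tuple = (60, 120),
                         y_range: tuple = (70, 110)) -> tuple:
    """Map a face centre in pixels to eye servo angles in degrees.

    Frame size and angle ranges are illustrative assumptions.
    """
    servo_x = map_range(face_x, 0, frame_w, *x_range)
    servo_y = map_range(face_y, 0, frame_h, *y_range)
    return servo_x, servo_y
```

With these assumed ranges, a face in the centre of a 640x480 frame maps to both eye servos sitting at their midpoints.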
## Project Structure

```
RobotFace/
├── README.md                        # Project documentation
├── RobotFace.jpg                    # Demo image (Robot Face)
├── .gitattributes                   # Git configuration file
└── src/
    ├── client/
    │   └── recognize_emotion.py     # Python client code for emotion recognition
    └── server_robot_face_esp32/
        ├── face_control/
        │   └── face_control.ino     # ESP32 code for controlling the robotic face
        ├── test_servo/              # Folder for testing servos (optional)
        └── libraries/               # Servo libraries (for ESP32)
```
## Emotion Mappings

| Emotion | Servo Behavior |
|---|---|
| Neutral | Eyebrows relaxed, mouth closed. |
| Happy | Eyebrows slightly raised, mouth corners lifted. |
| Sad | Eyebrows angled down, mouth corners dropped. |
| Angry | Eyebrows furrowed inward, mouth corners tightened. |
| Surprise | Eyebrows fully raised, mouth opened slightly. |
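The table above amounts to a lookup of per-emotion servo presets. In Python, that lookup might look like the sketch below; the angle values and servo names are illustrative assumptions, the real presets live in the ESP32 code:

```python
# Hypothetical per-emotion servo presets mirroring the table above.
# Angles are in degrees and chosen only for illustration.
EMOTION_PRESETS = {
    "neutral":  {"brow_left": 90,  "brow_right": 90,  "mouth_open": 0},
    "happy":    {"brow_left": 100, "brow_right": 100, "mouth_open": 20},
    "sad":      {"brow_left": 70,  "brow_right": 70,  "mouth_open": 0},
    "angry":    {"brow_left": 60,  "brow_right": 60,  "mouth_open": 5},
    "surprise": {"brow_left": 120, "brow_right": 120, "mouth_open": 40},
}


def preset_for(emotion: str) -> dict:
    """Return the servo preset for an emotion, defaulting to neutral."""
    # Falling back to neutral keeps the head in a safe pose when the
    # classifier emits a label the head has no preset for.
    return EMOTION_PRESETS.get(emotion, EMOTION_PRESETS["neutral"])
```

Defaulting unknown labels to neutral is a design choice: it avoids jerky or undefined poses when the recogniser reports an emotion the head does not model.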
## Troubleshooting

- Ensure proper lighting for the camera.
- Adjust the webcam position for clear visibility of faces.
- Verify that the client and ESP32 are on the same Wi-Fi network.
- Check that the `server_ip` value in the client code matches the ESP32's IP address.
- Check the servo connections and power supply.
- Ensure the servo pins match the `servoPins` array in the ESP32 code.
## Future Improvements

- Add more detailed emotion mappings.
- Use additional sensors for enhanced interactivity.
- Integrate a mobile app for remote control and monitoring.
## License

This project is licensed under the MIT License. Feel free to use, modify, and distribute this code as needed.
## Acknowledgments

- FER library for emotion recognition.
- ESP32 and Arduino communities for providing excellent resources and support.




