This project implements vision-based hand gesture recognition using machine learning and deep learning techniques. It aims to enable human-computer interaction through gesture recognition, which is useful in fields such as robotics, healthcare, automotive systems, and consumer electronics.
Why Did You Decide to Make It?
The project is a team assignment from the Asia Pacific University of Technology & Innovation (APU).
How Does It Work?
The project is divided into tasks assigned to team members, each focusing on a different aspect of gesture recognition:
1. Finger Counting (Aravind)
- Uses OpenCV and NumPy to process the real-time video feed.
- Identifies the hand region, applies image preprocessing, and detects contours and convexity defects to count fingers.
- Implemented in Python using Visual Studio Code (a short code sketch follows).
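The sketch below shows this kind of contour/convexity-defect counting in general form, not the team's exact code; the HSV skin-colour thresholds and the defect-depth cutoff are illustrative values that would need tuning for real lighting conditions.

```python
# Minimal sketch: count fingers from convexity defects of the largest skin-coloured contour.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break

    # Rough hand segmentation with an HSV skin-colour threshold (illustrative values).
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array([0, 30, 60]), np.array([20, 150, 255]))
    mask = cv2.GaussianBlur(mask, (5, 5), 0)

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        hand = max(contours, key=cv2.contourArea)
        hull = cv2.convexHull(hand, returnPoints=False)
        if hull is not None and len(hull) > 3:
            defects = cv2.convexityDefects(hand, hull)
            valleys = 0
            if defects is not None:
                for i in range(defects.shape[0]):
                    s, e, f, d = defects[i, 0]
                    start, end, far = hand[s][0], hand[e][0], hand[f][0]
                    # A deep defect with an acute angle sits in the valley between two fingers.
                    a = np.linalg.norm(end - start)
                    b = np.linalg.norm(far - start)
                    c = np.linalg.norm(end - far)
                    angle = np.arccos((b**2 + c**2 - a**2) / (2 * b * c + 1e-6))
                    if angle < np.pi / 2 and d > 10000:
                        valleys += 1
            # n valleys between fingers correspond to n+1 raised fingers.
            count = valleys + 1 if valleys > 0 else 0
            cv2.putText(frame, f"Fingers: {count}", (10, 40),
                        cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)

    cv2.imshow("Finger counting", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
```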
2. Swipe Gesture Detection
- Tracks hand motion trajectories in real time.
- Detects swiping gestures such as left, right, up, and down.
- Uses OpenCV and background-subtraction techniques (see the sketch below).
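Below is a minimal sketch of how swipe detection can be built on OpenCV's MOG2 background subtractor, assuming a static camera; the trail length, minimum travel distance, and blob-area threshold are illustrative and not taken from the project.

```python
# Minimal sketch: track the moving hand's centroid and classify the dominant motion direction.
import cv2
import numpy as np
from collections import deque

cap = cv2.VideoCapture(0)
backsub = cv2.createBackgroundSubtractorMOG2(history=200, detectShadows=False)
trail = deque(maxlen=15)  # recent hand centroids

def classify_swipe(points, min_dist=120):
    """Compare the first and last tracked centroid to name the dominant direction."""
    if len(points) < 2:
        return None
    dx = points[-1][0] - points[0][0]
    dy = points[-1][1] - points[0][1]
    if max(abs(dx), abs(dy)) < min_dist:
        return None
    if abs(dx) > abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"

while True:
    ok, frame = cap.read()
    if not ok:
        break

    fg = backsub.apply(frame)
    fg = cv2.morphologyEx(fg, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(fg, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    if contours:
        hand = max(contours, key=cv2.contourArea)
        if cv2.contourArea(hand) > 3000:  # ignore small noise blobs
            M = cv2.moments(hand)
            cx, cy = int(M["m10"] / M["m00"]), int(M["m01"] / M["m00"])
            trail.append((cx, cy))

    swipe = classify_swipe(trail)
    if swipe:
        cv2.putText(frame, f"Swipe: {swipe}", (10, 40),
                    cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
        trail.clear()  # reset after reporting a gesture

    cv2.imshow("Swipe detection", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
```

Comparing only the first and last centroid keeps the classifier robust to jitter in individual frames; a more elaborate version could fit the whole trajectory instead.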
3. Hand Orientation Recognition
- Identifies whether the hand is left or right and whether the palm or the back is showing.
- Uses image processing and AI-based classification (a minimal sketch follows).
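The write-up does not name the classifier, so the sketch below assumes a small Keras CNN trained on cropped hand images sorted into four hypothetical class folders (left_palm, left_back, right_palm, right_back); the dataset path and architecture are illustrative only.

```python
# Minimal sketch: a small CNN that classifies hand side (left/right) and face (palm/back).
import tensorflow as tf
from tensorflow.keras import layers, models

IMG_SIZE = (128, 128)

# Hypothetical dataset layout: hand_dataset/train/<left_palm|left_back|right_palm|right_back>/*.jpg
train_ds = tf.keras.utils.image_dataset_from_directory(
    "hand_dataset/train", image_size=IMG_SIZE, batch_size=32)

model = models.Sequential([
    layers.Rescaling(1.0 / 255, input_shape=IMG_SIZE + (3,)),
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(4, activation="softmax"),  # left/right x palm/back = 4 classes
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, epochs=10)
model.save("hand_orientation_classifier.keras")
```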
4. Gesture Classification
- Recognizes four hand gestures: Pinching, Grabbing, Rotation, and Thumbs Up.
- Uses YOLOv8 for real-time object detection and Roboflow for dataset preparation (see the sketch below).
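A minimal sketch of running such a detector with the ultralytics package, assuming YOLOv8 weights fine-tuned on the four gesture classes exported from Roboflow; the weights filename is hypothetical.

```python
# Minimal sketch: real-time gesture detection with a custom-trained YOLOv8 model.
import cv2
from ultralytics import YOLO

model = YOLO("gesture_yolov8n.pt")  # hypothetical custom-trained weights

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break

    # Run detection on the frame and draw the predicted boxes and class labels.
    results = model(frame, verbose=False)
    annotated = results[0].plot()

    cv2.imshow("Gesture detection (YOLOv8)", annotated)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
```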