What hardware and OS does the guide require to run the hand-tracking demos?
A Raspberry Pi 4 Model B, a Raspberry Pi camera (the High Quality Camera, or Camera Module V2 with a suitable lens), a microSD card flashed with Raspberry Pi OS "Buster", plus a power supply, an HDMI cable, and a keyboard/mouse for desktop setup.
Which main libraries and frameworks are used for hand keypoint detection and control?
The project uses MediaPipe (with a TensorFlow Lite delegate for acceleration), OpenCV and CVZone for vision and overlays, GPIO libraries for driving hardware, and the keyboard and subprocess packages for sending VLC hotkeys.
Can the system track more than one hand at once?
Yes — you can raise the maximum number of hands the MediaPipe detector tracks (a single detector setting), but expect reduced frame rates on the Raspberry Pi, since each extra hand adds inference work to the CPU.
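As a minimal sketch of that setting change (assuming the CVZone `HandTrackingModule` wrapper around MediaPipe Hands, not the guide's exact script), raising `maxHands` is a one-line tweak to the detector constructor:

```python
def run_tracker(max_hands=2, detection_con=0.8):
    """Track up to `max_hands` hands from the camera and print finger counts.

    Imports are deferred so the file can be imported without cv2/cvzone
    installed; the camera loop only runs when executed directly.
    """
    import cv2
    from cvzone.HandTrackingModule import HandDetector  # wraps MediaPipe Hands

    cap = cv2.VideoCapture(0)
    detector = HandDetector(detectionCon=detection_con, maxHands=max_hands)
    while True:
        ok, img = cap.read()
        if not ok:
            break
        hands, img = detector.findHands(img)  # detects hands, draws landmarks
        for hand in hands:
            # hand["type"] is "Left"/"Right"; fingersUp returns e.g. [0,1,1,0,0]
            print(hand["type"], detector.fingersUp(hand))
        cv2.imshow("hands", img)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()


if __name__ == "__main__":
    run_tracker(max_hands=2)  # more hands = more inference = lower FPS on the Pi
```

Setting `maxHands=1` is the usual choice on the Pi when frame rate matters more than tracking a second hand.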
How are gestures mapped to actions like LED colours and VLC controls?
Scripts map finger counts to actions: the GlowBit script changes the matrix colour according to the number of fingers held up, while the media script triggers custom VLC hotkeys via the keyboard package (e.g. 4 fingers = play, 3 = pause, 2 = volume up, 1 = mute, no fingers = shutdown).
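That finger-count dispatch can be sketched as a small lookup table. The key combinations below are placeholder assumptions — the guide binds its own custom hotkeys in VLC's preferences — but the count-to-action mapping follows the scheme described above:

```python
# Illustrative mapping of finger counts to VLC hotkeys. The actual key
# combinations are custom bindings configured in VLC, so the strings
# here are placeholder assumptions.
FINGER_ACTIONS = {
    4: "ctrl+alt+p",  # play        (assumed binding)
    3: "ctrl+alt+o",  # pause       (assumed binding)
    2: "ctrl+alt+u",  # volume up   (assumed binding)
    1: "ctrl+alt+m",  # mute        (assumed binding)
    0: "ctrl+alt+q",  # shutdown    (assumed binding)
}


def hotkey_for(finger_count):
    """Return the hotkey string for a finger count, or None if unmapped."""
    return FINGER_ACTIONS.get(finger_count)


def dispatch(finger_count):
    """Send the mapped hotkey via the keyboard package (needs root on Linux)."""
    combo = hotkey_for(finger_count)
    if combo is not None:
        import keyboard  # deferred: only needed when actually sending keys
        keyboard.press_and_release(combo)


if __name__ == "__main__":
    dispatch(4)  # would send the assumed "play" hotkey to VLC
```

Keeping the mapping in one dictionary makes it easy to re-bind gestures (or add a five-finger action) without touching the detection loop.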