Identify and track every joint in the fingers of your hands, live. Then use your hands to send commands to control media software and GPIO-attached hardware. All via a Raspberry Pi single board computer.

Transcript

Hey gang, Tim here at Core Electronics and today is hand recognition and finger identification with a Raspberry Pi single board computer.

Have you ever wanted your Raspberry Pi 4 Model B to be able to identify every single joint or finger in your hand live, or perhaps use hand gestures to send commands, or utilize hand key point detection or hand pose detection? Then this video is exactly where you need to be. Machine and deep learning have never been more accessible, and this video will show just that.

Here on the table is everything you need to start locating hands. Naturally, you're gonna need a camera. You can use any Pi camera module; today I'm using the High Quality Camera with a five millimetre lens attached to it. Bring along with you the most powerful Raspberry Pi you can get your hands on; here I'm using a Raspberry Pi 4 Model B. You're also gonna want a microSD card that has been flashed with Raspberry Pi Buster OS, connected to the internet, updated, and with the camera enabled. We have also installed a number of packages, such as OpenCV and MediaPipe, to the OS on this microSD card. At the time of recording, OpenCV works best with the previous Buster OS. Jump to the article link in the description to see a step-by-step process for all of this.

We will also set up our Raspberry Pi to run as a desktop, so a power supply, HDMI cord, and the normal keyboard and mouse combo are needed too. Once the booting sequence is complete, you're gonna be welcomed by the familiar Raspberry Pi Buster OS.

Now, I've created a couple of different scripts to do hand recognition. You can find and download all of them by jumping to the bottom of the page. I've also included a link to the GitHub repository where you can find all the code and resources used in this video.

So, let's get started. Open up a terminal and navigate to the directory where you downloaded the scripts. Run the command "python3 hand_recognition.py" to start the hand recognition script. This script uses the OpenCV and MediaPipe libraries to detect and track hands in real-time.

Once the script is running, you should see a live video feed from your camera with hand landmarks overlaid on top. You can move your hand around and see how the script tracks your hand movements. You can also try different hand gestures and see how the script recognizes them.

In the script, you'll find a few different options to customize the hand recognition. For example, you can adjust the minimum confidence threshold for hand detection, or enable/disable hand tracking. Feel free to experiment with these settings and see how they affect the hand recognition.

That's it for this video. I hope you found it helpful and that it inspires you to explore the possibilities of hand recognition with your Raspberry Pi. If you have any questions or need further assistance, feel free to leave a comment below or reach out to our support team. Thanks for watching, and happy making!

The downloads are on the article page. Use the link in the description.

With the file downloaded, unzip the contents onto your Raspberry Pi desktop or wherever you deem appropriate. Each script has been filled with comments so you'll be able to know exactly what's going on behind the curtains if you're curious.

The first one to check out is the simplest one, which you saw at the start of the video named simplehandtracker.py. All you need to do is right click it and open it up using Thonny IDE or any Python interpreter that you would like. Then with it open, run the script by pressing the big green run button. And as soon as you do it, it's gonna open up a live stream of what the camera is seeing and initiate hand identification.

The really sweet thing about this machine learning system is the way it predicts where it believes your fingers are gonna be, even if it cannot see your fingers completely. It's quite incredible how accurately it knows where your fingers are or where they're gonna be.

So if we look at the script now, we can see we import those two important packages, MediaPipe and CV2. This is part of the method of getting that hand framework drawn onto the live video. Here we can add confidence values and other kinds of settings.

So for example, right now it's only able to track one hand, but let's stop it just for a sec. Here we're gonna increase the number of hands to two. And when we run the code now, it should naturally be able to deal with two hands. You can increase that number even more, but keep in mind the Raspberry Pi's frame rate will decrease the more computing you have it doing.
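For anyone following along in the article rather than the video, here's a rough sketch of what that core loop looks like. It isn't the exact tutorial script: the camera index, window name, and confidence values are my own assumptions, but max_num_hands is the MediaPipe setting being changed here.

    # Rough sketch of the hand-tracking loop (not the exact tutorial script).
    import cv2
    import mediapipe as mp

    mp_hands = mp.solutions.hands
    mp_draw = mp.solutions.drawing_utils

    # max_num_hands and the confidence values are the settings discussed above.
    hands = mp_hands.Hands(max_num_hands=2,
                           min_detection_confidence=0.7,
                           min_tracking_confidence=0.7)

    cap = cv2.VideoCapture(0)  # assumes the Pi camera shows up as camera 0

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB images, OpenCV delivers BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            for hand in results.multi_hand_landmarks:
                mp_draw.draw_landmarks(frame, hand, mp_hands.HAND_CONNECTIONS)
        cv2.imshow("Hand tracking", frame)
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break

    cap.release()
    cv2.destroyAllWindows()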

You can also easily alter the code to spit out the X, Y location of a particular joint or fingertip to the shell. For instance, if we jump to the script and I uncomment this section, the code is gonna print to the shell the X, Y coordinates of the tip of my index finger, which is referred to in this code as joint eight. And you'll notice that the index finger has been given an X, Y coordinate based on where I put it.
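The uncommented section boils down to something like this (variable names here are illustrative; landmark index 8 is MediaPipe's index fingertip):

    # Inside the loop, once results.multi_hand_landmarks has been found:
    h, w, _ = frame.shape
    for hand in results.multi_hand_landmarks:
        tip = hand.landmark[8]                  # joint 8 = index fingertip
        x, y = int(tip.x * w), int(tip.y * h)   # normalised coords -> pixels
        print("Index fingertip at:", x, y)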

The natural next step is to add some extra functionality to our previous scripts so that it will be able to tell when we have certain fingers up, and also give a total of how many fingers are up and how many fingers are down. I've already added this functionality to another script, and you're gonna be able to find it in the unzipped location, named fingers up or down.py, along with module.py. Both are fully annotated for understanding.

This is gonna be very similar to the previous script, but it will also provide a total finger count up and down and specific details on which finger is up and which finger is down. This statement then gets printed to the shell: five fingers up, zero fingers down, and thumb, index, middle, ring, and pinky tip are all the fingers that it can see. If I give it a rock and roll sign, rotating it this way so I can keep track of what's going on, it knows my pinky finger and my index finger are up, so two of them are up and three fingers are down. Now it knows four fingers are up, one finger is down, and it identifies them all correctly.

The way this script is working is it knows the coordinates of each joint, and if it determines that the fingertip is below the first knuckle, it's gonna state that the finger is down. As it currently stands, the script works for a single hand, but it can be altered to accommodate more. You will notice an increased delay when doing so.
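As a rough sketch, the up/down test can be written like this. It assumes the script compares landmark y-coordinates (image y grows downwards), and it treats the thumb the same way as the other fingers, which is a simplification:

    # MediaPipe landmark indices for the fingertips and the knuckle below each.
    tip_ids = [4, 8, 12, 16, 20]   # thumb, index, middle, ring, pinky tips
    pip_ids = [2, 6, 10, 14, 18]   # the joint below each tip

    def fingers_up(hand):
        """Return a 1/0 flag per finger for a single detected hand."""
        states = []
        for tip, pip in zip(tip_ids, pip_ids):
            # Image y increases downwards, so a raised finger has its tip
            # above (smaller y than) the knuckle below it.
            states.append(1 if hand.landmark[tip].y < hand.landmark[pip].y else 0)
        return states

    # Example: up = fingers_up(results.multi_hand_landmarks[0])
    # sum(up) fingers are up, 5 - sum(up) fingers are down.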

Also, as an Easter egg, there's an added text-to-speech feature. So whenever you press Q on the keyboard as the script is running, the Raspberry Pi will enunciate to you exactly what it sees, letting you know exactly how many fingers are up and how many fingers are down. Allow me to demonstrate.

You have five fingers up and zero fingers down. You have two fingers up and three fingers down.
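The text-to-speech piece could look roughly like the snippet below. Using espeak through subprocess is an assumption on my part; the actual script may use a different speech engine:

    import subprocess

    def speak(up_count, down_count):
        # Assumes the espeak package is installed (sudo apt install espeak).
        message = "You have {} fingers up and {} fingers down".format(up_count, down_count)
        subprocess.call(["espeak", message])

    # Tied to the Q key inside the video loop:
    # if cv2.waitKey(1) & 0xFF == ord('q'):
    #     speak(sum(up), 5 - sum(up))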

Naturally, the next step is to use our hands to control the general purpose input and output pins on our Raspberry Pi. GPIO pins are the doorway to controlling an almost endless number of sensors, motors, actuators, or signals.

So I'm gonna wire up this four by four GlowBit matrix, add a few lines of code to our previous script, and we should have a hand-activated light system. You're gonna be able to find this script under the name glowbitgesturecontrol.py.

So depending on how many fingers I raise in front of the Raspberry Pi camera, when the script is running, it's gonna change the color of this matrix right here. Allow me a second to dive into the script so you can see exactly how I altered the fingers up or down script to add this Glowbit functionality.

The first difference you're gonna see is at the very top of the script, where I've added two new import statements. The first new statement imports all the functionality of the GlowBit library, which makes it easy to light up this four by four matrix in whatever color we want. The second imports the general purpose input and output library to our script; this imported library enables us to send data through specific GPIO pins on our Raspberry Pi. The next line, GPIO.setwarnings(False), just keeps the shell clear of any warning messages that the GPIO library might want to output.

The next line under that is our first line that takes advantage of the added GlowBit functionality. In it, we are determining the general behavior of this four by four matrix. A value here of 255 for brightness means that we will be maxing out the brightness that this little guy can output. And rateLimitFPS equalling 80 means that the refresh rate of this matrix will be 80 frames per second.
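Those opening lines look roughly like this. The keyword names brightness and rateLimitFPS follow the GlowBit Python library, though double-check them against your installed version:

    import glowbit
    import RPi.GPIO as GPIO

    GPIO.setwarnings(False)   # keep the shell clear of GPIO warning messages

    # 4x4 GlowBit matrix at full brightness, refreshing at 80 frames per second.
    matrix = glowbit.matrix4x4(brightness=255, rateLimitFPS=80)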

After that, all the code is exactly the same until we get closer to the bottom. Scrolling down, we're gonna pay attention to the fact that we're inside the while True loop. I have created a number of if statements that have a chance of running with every iteration.

So let's follow through one of these if statements. Say we have one finger up to the camera. When this occurs, this first if statement is gonna trigger, because the count of fingers up will equal one. Inside this if statement, we can now see that it will create a variable named C with the value matrix.purple(). This value comes directly from the GlowBit library, which has several color functions like this that we will take advantage of.

This line, in combination with the next, is gonna make all the pixels in this matrix be assigned the color purple. In the other if statements, which will be triggered for different numbers of fingers up or down, you can see I have chosen different matrix color values for C, which then get told to fill the matrix in the same way.

The final step after all these if statements is the matrix.pixelsShow() statement. This is used to refresh the output of the matrix to the correct color. If wiring up and coding for GlowBit is new to you, I have a link in the description that will give you all the information that you need.
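Put together, the GlowBit side of the loop is along these lines. The particular colour chosen for each finger count is my own pick rather than necessarily the one in the tutorial script:

    # Sketch of the finger-count-to-colour logic inside the while True loop.
    # up_count is assumed to hold the number of raised fingers this frame.
    if up_count == 1:
        c = matrix.purple()
        matrix.pixelsFill(c)
    if up_count == 2:
        c = matrix.blue()
        matrix.pixelsFill(c)
    if up_count == 3:
        c = matrix.green()
        matrix.pixelsFill(c)
    if up_count == 4:
        c = matrix.yellow()
        matrix.pixelsFill(c)
    if up_count == 5:
        c = matrix.red()
        matrix.pixelsFill(c)

    matrix.pixelsShow()   # refresh the matrix so the chosen colour appears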

Now, this kind of gesture control can also be used not only to control hardware, but also software. For instance, let's take our script a step further and create some gesture controls to aid accessibility when watching a video. If you open up computergesturecontrol.py, in it I have coded in the ability to start and pause videos and also control the sound.

So when I lower a finger, it's gonna start the video. When I lower two fingers, it's gonna stop the video. If I lower three fingers, it's gonna turn on the music. And if I lower four fingers, it's gonna stop the sound.

You will notice some delay using this system. Initially, I had play and pause set to the same keystroke, which was spacebar. But this was less than ideal: if you're not fast enough with your hand inputs, you could easily end up with multiple spacebar presses, which stutters the VLC player.

The solution is found by opening up VLC and creating custom hotkeys using the preferences settings. In here, you're gonna be able to configure a hotkey whose sole purpose is to play and another whose sole purpose is to pause. The particular ones I created were Ctrl-Alt-B and Ctrl-Alt-N. Almost all programs will let you create dedicated macros in a similar manner to this.

Diving into the script for this, you can see I added the functionality in exactly the same places as I did for the GlowBit script. You can see a difference here at the top, where I have imported keyboard and subprocess. This adds functionality to the system. The keyboard package will allow us to execute in Python those VLC hotkeys that we created to play and pause our video.

The subprocess module will enable us to directly alter the system sound to a specific percentage via this script.

Scrolling down to the section where, in the previous script, we were controlling the lights depending on how many fingers it could see, you'll find this section has been remixed. There are still if statements that have a chance to be triggered with every iteration of the while True loop.

Looking only at the first if statement that I added, you can see that if four fingers are seen to be up, then it will execute a keyboard function that is a VLC hotkey to start playing our video.

The next if statement is very similar: if three fingers are shown, then it will execute the hotkey that we created to stop the video.
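In sketch form, those two if statements look something like the below. Which custom hotkey plays and which pauses depends on how you assigned them in VLC's preferences, so treat that mapping as an assumption:

    import keyboard   # fires the custom VLC hotkeys from Python

    # Inside the while True loop, once up_count (raised fingers) is known.
    if up_count == 4:
        keyboard.press_and_release('ctrl+alt+b')   # custom hotkey: play
    if up_count == 3:
        keyboard.press_and_release('ctrl+alt+n')   # custom hotkey: pause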

The next if statement will activate when it only sees two fingers, and it's gonna turn the volume up to 100%. It starts by setting up a variable named volume equal to 100. The next variable, command, may look a little complicated, but mainly it is just formatting so the data is understood when it gets passed to the following subprocess function. When it runs, this results in full volume for the Raspberry Pi system.

The final if goes through the same process, but for one finger, and if it sees that one finger, it's gonna turn the volume to zero.
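The volume control lines can be sketched like this. Driving the volume through amixer is an assumption; the exact mixer command in the tutorial script may differ:

    import subprocess

    # Two fingers: full volume. One finger: silence.
    # amixer and the "Master" control are assumptions about the sound setup.
    if up_count == 2:
        volume = 100
        command = ["amixer", "-q", "sset", "Master", "{}%".format(volume)]
        subprocess.call(command)
    if up_count == 1:
        volume = 0
        command = ["amixer", "-q", "sset", "Master", "{}%".format(volume)]
        subprocess.call(command)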

I also have a section here that you can uncomment, which will shut down your Raspberry Pi if it sees zero fingers up. I kept accidentally triggering it, so I've left it commented out here, but it is a great demonstration of how to run a terminal command through Python, so I left it in just for that.
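That commented-out section is essentially a one-liner along these lines (the zero-finger condition and variable name are illustrative):

    import os

    # Left commented out because it is easy to trigger by accident:
    # if up_count == 0:
    #     os.system("sudo shutdown now")   # a terminal command run straight from Python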

If you continue to go down this direction, you could definitely control a YouTube video to play, pause, fast forward, or thumbs up, depending on the finger gestures you provide to the camera.

There is also a potential for true gesture recognition here, but it will require a re-tinkering of my scripts.

Hopefully this has got you excited to try this out for yourself, tinker with some code, and make something really rad in your maker projects.

This hand recognition system is absolutely raring to be expanded upon to take it to some really amazing places.

So until next time, stay cozy.
