
Identify Punch and Flex Hand Gestures Using a Machine Learning Algorithm on Arduino Hardware

This example shows how to use the Simulink® Support Package for Arduino® Hardware to identify punch and flex hand gestures using a machine learning algorithm. The example is deployed on an Arduino Nano 33 IoT hardware board that uses an onboard LSM6DS3 6DOF IMU sensor to identify the hand gestures. The output from the machine learning algorithm, after identifying whether a hand gesture is a punch or a flex, is transmitted to the serial port where 0 represents a punch and 1 represents a flex.

Prerequisites

Required Hardware

  • Arduino Nano 33 IoT board

  • USB cable

Hardware Setup

Connect the Arduino Nano 33 IoT board to the host computer using the USB cable.

Prepare Data Set to Train Machine Learning Algorithm

This example uses a MATLAB code file named capture_training_data to measure the raw data for the flex and punch hand gestures. Open the file and configure these parameters. A sample capture-loop sketch follows the list.

  1. Specify the acceleration threshold accelerationThreshold. In this example, the threshold is set to 2.5.

  2. Create an LSM6DS3 object and specify the number of samples read in a single execution of the read function. In this example, the parameter is set to 119.

  3. Specify the number of frames captured per gesture in a loop. In this example, 100 frames are captured per gesture.
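The following is a minimal sketch of how such a capture loop can be set up using the arduino and lsm6ds3 functions from the MATLAB Support Package for Arduino Hardware. The loop structure and the variable names other than accelerationThreshold are illustrative and may differ from the actual capture_training_data file.

a = arduino;                            % connect to the Arduino Nano 33 IoT
imu = lsm6ds3(a,'SamplesPerRead',119);  % LSM6DS3 object, 119 samples per read
accelerationThreshold = 2.5;            % acceleration threshold in g
numFrames = 100;                        % frames to capture per gesture
gesture = cell(numFrames,1);
frame = 0;
while frame < numFrames
    s = read(imu);                      % one batch of 119 IMU samples
    accelG = s.Acceleration/9.81;       % convert m/s^2 to g
    if any(sum(abs(accelG),2) > accelerationThreshold)
        frame = frame + 1;
        gesture{frame} = [s.Acceleration s.AngularVelocity]; % 119-by-6 frame
        fprintf('Gesture no. %d\n',frame);
    end
end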

To capture the flex hand gestures, run this command in the MATLAB Command Window.

flex = capture_training_data;

Hold the Arduino hardware in the palm of your hand and throw a flex. To create a data set of 100 frames for the flex gesture, throw a flex 100 times. Observe the value of Gesture no. increase in the MATLAB Command Window each time you throw a flex.

The 1-by-100 flex hand gesture data samples read by the LSM6DS3 IMU sensor are available in the Workspace. Right-click flex in the Workspace and save it as flex_100.mat in the working directory of the example.
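You can also save the variable from the command line instead of using the context menu:

save('flex_100.mat','flex')

Use the analogous command, save('punch_100.mat','punch'), after you capture the punch data set below.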

Follow the same procedure to create a data set of 100 frames for a punch gesture.

To capture the punch hand gestures, run this command in the MATLAB Command Window.

punch = capture_training_data;

Hold the Arduino hardware in the palm of your hand and throw a punch. To create a data set of 100 frames for the punch gesture, throw a punch 100 times. Observe the value of Gesture no. increase in the MATLAB Command Window each time you throw a punch.

The 1-by-100 punch hand gesture data samples read by the LSM6DS3 IMU sensor are available in the Workspace. Right-click punch in the Workspace and save it as punch_100.mat in the working directory of the example.

When you create your own data set for the flex and punch hand gestures, use the same MAT-file names in the gr_script MATLAB code file.

The gr_script MATLAB code file is used to preprocess the data set for the flex and punch hand gestures, train the machine learning algorithm with the data set, and evaluate its performance to accurately predict these hand gestures.

To edit the gr_script.m file, run this command in the MATLAB® Command Window.

edit gr_script;

Use this MATLAB code file to prepare the Simulink model in the support package and then deploy the model on the Arduino hardware.

Alternatively, you can load the flex and punch data sets from gr_script, available in MATLAB.

load flex_100
load punch_100

To train and test the machine learning algorithm, 119 data samples are read from the accelerometer and the gyroscope. These 119 samples are grouped into one frame, and each frame represents a hand gesture. Each frame has six values, obtained from the X, Y, and Z axes of the accelerometer and the gyroscope, respectively. A total of 11,900 such observations are stored in the data set for the two different hand gestures, flex and punch.
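If your captured data follows the layout this example assumes, where each gesture variable is a 1-by-100 cell array and each cell holds one 119-by-6 frame, you can sanity-check the dimensions at the command line:

size(flex)     % 1-by-100 cell array, one cell per frame
size(flex{1})  % each frame is 119-by-6: accelerometer and gyroscope X, Y, Z
119*100        % 11,900 observations per gesture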

Extract Features

Features are extracted by taking the mean and the standard deviation of each column in a frame. This results in a 100-by-12 matrix of observations for each gesture.

f1 = cellfun(@mean,flex','UniformOutput',false);
f2 = cellfun(@std,flex','UniformOutput',false);
flexObs = cell2mat([f1,f2]);
flexLabel = ones(100,1);

p1 = cellfun(@mean,punch','UniformOutput',false);
p2 = cellfun(@std,punch','UniformOutput',false);
punchObs = cell2mat([p1,p2]);
punchLabel = 2*ones(100,1);

X = [flexObs;punchObs];
Y = [flexLabel;punchLabel];

Prepare Data

This example uses 90% of the observations to train a model that classifies the two types of hand gestures, and 10% of the observations to validate the trained model. Use cvpartition to specify a 10% holdout for the test data set.

rng('default') % For reproducibility
Partition = cvpartition(Y,'Holdout',0.10);
trainingInds = training(Partition); % Indices for the training set
XTrain = X(trainingInds,:);
YTrain = Y(trainingInds);
testInds = test(Partition); % Indices for the test set
XTest = X(testInds,:);
YTest = Y(testInds);

Train Decision Tree at Command Line

Train a classification model.

treeMdl = fitctree(XTrain,YTrain);

Perform five-fold cross-validation on the trained classification model treeMdl and compute the validation accuracy.

partitionedModel = crossval(treeMdl,'KFold',5);
validationAccuracy = 1-kfoldLoss(partitionedModel)

validationAccuracy = 1

Evaluate Performance on Test Data

Evaluate performance on the test data set.

testAccuracy = 1-loss(treeMdl,XTest,YTest)

testAccuracy = 1

The trained model accurately classifies 100% of the hand gestures on the test data set. This result confirms that the trained model does not overfit the training data set.
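To look beyond the single accuracy number, you can optionally inspect the per-class results with a confusion chart. This step is not part of gr_script; it uses the predict and confusionchart functions from Statistics and Machine Learning Toolbox.

predictedY = predict(treeMdl,XTest); % predicted class labels (1 = flex, 2 = punch)
confusionchart(YTest,predictedY)     % rows: true class, columns: predicted class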

Prepare Simulink Model and Calibrate Parameters

After you have prepared a classification model, use the gr_script.m file as the Simulink model initialization function.
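One way to register gr_script as the model initialization callback is shown below; you can also set it interactively in the Simulink Model Properties dialog box, under Callbacks > InitFcn. The load_system call assumes the model is on the MATLAB path.

load_system('arduino_machinelearning')
set_param('arduino_machinelearning','InitFcn','gr_script');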

To open the Simulink model, run this command in the MATLAB Command Window.

open_system('arduino_machinelearning')

The Arduino Nano 33 IoT board has an onboard LSM6DS3 IMU sensor that measures linear acceleration and angular velocity along the X, Y, and Z axes. Configure these parameters in the Block Parameters dialog box of the LSM6DS3 IMU Sensor block:

  1. Set the I2C address of the sensor to 0x6A to communicate with the accelerometer and gyroscope peripherals of the sensor.

  2. Select the Acceleration (m/s^2) and Angular velocity (rad/s) output ports.

  3. Set the Sample time to 0.01.

The 1-by-3 acceleration and angular velocity vector data is collected from the LSM6DS3 IMU sensor at the sample time you specify in the Block Parameters dialog box. This data is then preprocessed in the Preprocessing subsystem.

To open the subsystem, run this command in the MATLAB Command Window.

open_system('arduino_machinelearning/Preprocessing')

The acceleration data is first converted from m/s^2 to g. The absolute values are then summed, and when the sum for the 119 data values is greater than the 2.5 g threshold, the dataReadEnable parameter in the MATLAB Function block becomes logically true. This acts as a trigger for the Triggered subsystem in the Classification area.
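The following is a minimal sketch of how this thresholding logic might look inside the MATLAB Function block; the function name and signature are illustrative and may differ from the block in the example.

function dataReadEnable = detectGesture(accel)
% accel: 1-by-3 acceleration of the current sample, in g
threshold = 2.5;                              % detection threshold in g
dataReadEnable = sum(abs(accel)) > threshold; % true when motion exceeds the threshold
end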

The angular velocity data is converted from radians to degrees. The acceleration and angular velocity data is multiplexed and given as an input to the Switch block. For a data value greater than 0, the Buffer block stores the valid 119 gesture values corresponding to a punch or a flex. For data values less than zero, which indicate that no hand gesture is detected, a 1-by-6 vector of zeros is sent to the output to match the combined acceleration and angular velocity data.

Configure this parameter in the Block Parameters dialog box of the Buffer block.

  1. Set the Output buffer size parameter to 119.

Features are extracted by calculating the mean and the standard deviation of each column in a frame, which results in a 100-by-12 matrix of observations for each gesture. These extracted features are then passed as an input to the Triggered subsystem in the Classification area.

To open the subsystem, run this command in the MATLAB Command Window.

open_system('arduino_machinelearning/Triggered Subsystem')

The Rate Transition block transfers data from the output of the Preprocessing subsystem, which operates at one rate, to the input of the Triggered subsystem, which operates at a different rate.

The ClassificationTree Predict block is a library block from Statistics and Machine Learning Toolbox™ that classifies the gestures using the extracted features. This block uses the treeMdl machine learning model to identify punches and flexes and outputs the predicted class label. The output is either a 0 or a 1, corresponding to a punch or a flex, respectively.

The parameters of the Serial Transmit block are configured to their default values.

Deploy Simulink Model on Arduino Board

1. On the Hardware tab of the Simulink model, in the Mode section, select Run on board and then click Build, Deploy & Start.

2. To easily analyze the hand gesture data identified by the machine learning algorithm, run the following script in the MATLAB Command Window and read the data on the Arduino serial port.

device = serialport(port,9600);
while(true)
    rxData = read(device,1,"double");
    if rxData==0
        disp('Punch');
    elseif rxData==1
        disp('Flex');
    end
end

Replace the port parameter with the actual COM port of the Arduino board. The gesture detected by the machine learning algorithm is displayed at the Arduino serial port at a baud rate of 9600, where 0 represents a punch and 1 represents a flex.
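To list the serial ports available on the host computer and identify the COM port of the board, you can use the serialportlist function:

serialportlist("available")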

3. Hold the hardware in the palm of your hand and throw a punch or a flex. Observe the output in the MATLAB Command Window.

See Also