Machine learning-based tactile information classification and computational modelling method


Abstract

The invention relates to a machine learning-based method for classifying and computationally modelling tactile information. The method comprises the following steps: acquiring the tactile sequences of the training-set samples; fitting a linear dynamic system (LDS) model to each tactile subsequence to extract its dynamic features; computing the Martin distance between the dynamic features of the subsequences; clustering the resulting Martin distance matrix with the K-medoids algorithm to build a codebook; characterizing each tactile sequence with the codebook to obtain its bag-of-systems ("system packet") model; feeding the bag-of-systems models of the training-set samples, together with the sample labels, into an extreme learning machine to train a classifier; and feeding the bag-of-systems model of a sample to be classified into the classifier to obtain the object-type label. The method meets the practical need for stable, compliant robotic grasping of non-cooperative targets and provides a data foundation for fine manipulation tasks. Its results can also be fused with those of other sensors, so that multi-source deep perception strengthens the description and recognition of different target objects and lays a technical foundation for intelligent manipulation.
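The first half of the pipeline above (fitting an LDS to each tactile subsequence and comparing systems with the Martin distance) can be sketched as follows. This is a minimal illustration, not the patent's implementation: the SVD-based suboptimal LDS fit, the state dimension `n`, and the finite stacking depth `m` used to approximate the extended observability subspace are all assumptions, and the input matrices are random stand-ins for real taxel readings.

```python
import numpy as np
from scipy.linalg import subspace_angles

def fit_lds(Y, n=4):
    """Fit an LDS x_{t+1} = A x_t, y_t = C x_t to a (d, tau) tactile
    subsequence via the standard SVD-based suboptimal learning
    (state dimension n is an assumed hyperparameter)."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    C = U[:, :n]                               # observation matrix (d, n)
    X = np.diag(s[:n]) @ Vt[:n]                # state trajectory (n, tau)
    A = X[:, 1:] @ np.linalg.pinv(X[:, :-1])   # least-squares transition (n, n)
    return A, C

def extended_observability(A, C, m=10):
    """Stack C, CA, ..., CA^{m-1}: a finite proxy for the extended
    observability subspace on which the Martin distance is defined."""
    blocks, M = [], np.eye(A.shape[0])
    for _ in range(m):
        blocks.append(C @ M)
        M = A @ M
    return np.vstack(blocks)

def martin_distance(lds1, lds2, m=10):
    """Martin distance from the principal (subspace) angles between
    the two systems' extended observability subspaces."""
    O1 = extended_observability(*lds1, m)
    O2 = extended_observability(*lds2, m)
    cos2 = np.clip(np.cos(subspace_angles(O1, O2)) ** 2, 1e-12, 1.0)
    return np.sqrt(max(0.0, -np.sum(np.log(cos2))))

# Toy data: two subsequences from a 16-taxel sensor, 50 time steps each.
rng = np.random.default_rng(0)
Y1 = rng.standard_normal((16, 50))
Y2 = rng.standard_normal((16, 50))
d11 = martin_distance(fit_lds(Y1), fit_lds(Y1))  # self-distance, ~0
d12 = martin_distance(fit_lds(Y1), fit_lds(Y2))  # positive for distinct systems
print(d11, d12)
```

Computing `martin_distance` over all pairs of fitted subsequence models yields the symmetric Martin matrix that the next stage clusters.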
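The second half of the pipeline (K-medoids codebook, bag-of-systems histograms, extreme learning machine) can be sketched as below. All names and the toy setup are assumptions: 2-D points with Euclidean distances stand in for fitted LDS models and their Martin distances, the Voronoi-iteration K-medoids is one simple variant of the algorithm, and the ELM is the basic random-hidden-layer, pseudoinverse-readout form.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in: 2 object classes, 10 sequences each, 8 subsequences per
# sequence; each subsequence is a 2-D point and Euclidean distance
# replaces the Martin distance between fitted LDS models.
n_seq, n_sub, k = 20, 8, 4
y = np.repeat([0, 1], n_seq // 2)
centers = np.array([[0.0, 0.0], [5.0, 5.0]])
pts = np.vstack([rng.normal(centers[c], 1.0, size=(n_sub, 2)) for c in y])
D = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)   # (160, 160)

def k_medoids(D, k, iters=50, seed=0):
    """Voronoi-iteration K-medoids on a precomputed distance matrix."""
    medoids = np.random.default_rng(seed).choice(len(D), size=k, replace=False)
    for _ in range(iters):
        labels = np.argmin(D[:, medoids], axis=1)
        new = medoids.copy()
        for j in range(k):
            members = np.flatnonzero(labels == j)
            if members.size:   # pick the member minimizing intra-cluster cost
                new[j] = members[np.argmin(D[np.ix_(members, members)].sum(axis=1))]
        if np.array_equal(new, medoids):
            break
        medoids = new
    return medoids, np.argmin(D[:, medoids], axis=1)

medoids, assign = k_medoids(D, k)   # medoids act as the codebook

# Bag-of-systems representation: per-sequence histogram over codewords.
hist = np.zeros((n_seq, k))
for i in range(n_seq):
    hist[i] = np.bincount(assign[i * n_sub:(i + 1) * n_sub], minlength=k) / n_sub

def elm_train(X, y, hidden=32, seed=1):
    """ELM: fixed random hidden layer, closed-form output weights."""
    r = np.random.default_rng(seed)
    W, b = r.standard_normal((X.shape[1], hidden)), r.standard_normal(hidden)
    H = np.tanh(X @ W + b)
    beta = np.linalg.pinv(H) @ np.eye(y.max() + 1)[y]   # one-hot targets
    return W, b, beta

def elm_predict(model, X):
    W, b, beta = model
    return np.argmax(np.tanh(X @ W + b) @ beta, axis=1)

model = elm_train(hist, y)
acc = (elm_predict(model, hist) == y).mean()
print(acc)
```

With real data, `D` would be the Martin matrix over all training subsequences, and the histogram of a new sequence would be built by assigning its subsequences to the nearest codebook medoid before calling `elm_predict`.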


Patent Citations (0)

Non-Patent Citations (0)

Cited By (0)
