Tendon-driven anthropomorphic robotic hands often lack direct joint angle sensing, as the integration of joint encoders can compromise mechanical compactness and dexterity. This paper presents a computational method for estimating joint positions from measured tendon displacements and tensions. An efficient kinematic modeling framework for anthropomorphic hands is first introduced based on the Denavit-Hartenberg convention. Using a simplified tendon model, a system of nonlinear equations relating tendon states to joint positions is derived and solved via a nonlinear optimization approach. The estimated joint angles are then employed for closed-loop control through a Jacobian-based proportional-integral (PI) controller augmented with a feedforward term, enabling gesture tracking without direct joint sensing. The effectiveness and limitations of the proposed estimation and control framework are demonstrated in the MuJoCo simulation environment using the Anatomically Correct Biomechatronic Hand, featuring five degrees of freedom for each long finger and six degrees of freedom for the thumb.
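The core estimation step described above, solving a system of nonlinear equations that relates tendon displacements to joint angles, can be sketched as follows. This is a minimal illustration under invented assumptions, not the paper's actual model: the moment-arm matrices `R` and `K`, the quadratic excursion model, and the 2-DoF/3-tendon geometry are all hypothetical, and a Gauss-Newton iteration stands in for the paper's unspecified nonlinear optimization.

```python
import numpy as np

# Hypothetical toy tendon model (NOT the paper's): the excursion of
# tendon i is a mildly nonlinear function of the joint angles q,
#   e_i(q) = sum_j R[i,j] * q_j + K[i,j] * q_j**2,
# with made-up moment-arm matrices for a 2-DoF, 3-tendon finger.
R = np.array([[0.010, 0.008],
              [0.007, 0.011],
              [-0.009, -0.006]])   # linear moment arms [m/rad], assumed
K = np.array([[0.002, 0.001],
              [0.001, 0.002],
              [-0.001, -0.001]])   # quadratic corrections, assumed

def excursions(q):
    """Forward tendon model: joint angles -> tendon displacements."""
    return R @ q + K @ (q ** 2)

def jacobian(q):
    """Analytic Jacobian de/dq of the toy model."""
    return R + 2.0 * K * q        # column j scaled by q_j via broadcasting

def estimate_joints(e_meas, q0=None, iters=50, tol=1e-10):
    """Gauss-Newton solve of the nonlinear excursion equations for q."""
    q = np.zeros(R.shape[1]) if q0 is None else np.asarray(q0, float)
    for _ in range(iters):
        r = excursions(q) - e_meas            # residual vs. measurement
        dq, *_ = np.linalg.lstsq(jacobian(q), -r, rcond=None)
        q = q + dq
        if np.linalg.norm(dq) < tol:
            break
    return q

# Round trip: generate excursions from a known pose, then recover it.
q_true = np.array([0.6, 0.9])                 # radians
q_est = estimate_joints(excursions(q_true))
print(np.allclose(q_est, q_true, atol=1e-6))  # → True
```

Because three tendon measurements constrain only two joint angles here, the system is overdetermined and each Gauss-Newton step is a linear least-squares solve; the paper's formulation, which also incorporates tendon tensions, would add further residual terms to the same structure.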