Journal "Software Engineering"
A journal on theoretical and applied science and technology
ISSN 2220-3397
Issue No. 8, 2022
This paper considers an integrated sign language teaching system aimed at a wide range of users. The system uses the Dimskis notation to write sign language and the Unity 3D cross-platform development environment to control an animated 3D character. An approach to describing the 3D character's movements on the basis of a meta-language is proposed. Within the framework of this meta-language, the Dimskis notation has been extended: additional and service characters, control words, and special coordinates were introduced. Special control symbols and rules for synchronizing processes have been developed. They make it possible to move several elements at different times, to create complex movements and return them to the initial state, to cancel the transfer of individual moving elements by means of a "fictitious" movement, and to change only individual coordinates of the avatar's hand. The technique of creating a sign language reference book is considered, and the advantages of this approach are analyzed. Besides this full-fledged toolkit for creating the reference book, the integrated system can also test the user's knowledge. For mobile devices, a mode of gesture demonstration with answer selection is proposed. For stationary devices with a fixed web camera, the user has to show the gesture proposed by the system. The following stages of implementing this mode are considered: image capture, gesture localization, gesture identification, and decision making. Gesture localization is based on double background subtraction and the histogram method. Gesture identification is based on the projection method and is carried out in two stages: at the first stage, the entire hand is analyzed; if it is difficult to make a decision at this stage, a second stage is carried out in which only the fingers are analyzed. For dynamic gestures, the trend of the arm's movement between key frames is additionally determined. The features of each stage are shown, the decision-making algorithm is given, and the results of an experimental investigation are presented.
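The abstract only names the constructs of the extended notation (control words, synchronization rules, "fictitious" movements); it does not define them. The following is a minimal sketch of how a script of per-element movement commands with individual start times could be modeled. The class layout, field names, and the `active_moves` helper are illustrative assumptions, not the authors' meta-language; Python is used here for brevity, although the system itself is built in Unity 3D.

```python
from dataclasses import dataclass

@dataclass
class Move:
    """One movement command of a gesture script (illustrative layout).

    element:    the moving element (e.g. a hand or finger of the avatar)
    target:     target coordinates of the element
    start:      start time relative to the beginning of the gesture
    duration:   how long the transfer takes
    fictitious: a "fictitious" movement cancels the element's transfer,
                leaving its coordinates unchanged
    """
    element: str
    target: tuple
    start: float
    duration: float
    fictitious: bool = False

def active_moves(script, t):
    """Return the commands whose time window covers moment t.

    Because every command carries its own start time, several elements
    can move at different times within one gesture, which is the effect
    the synchronization rules of the extended notation provide.
    """
    return [m for m in script
            if not m.fictitious and m.start <= t < m.start + m.duration]
```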
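The localization step is only named in the abstract. Below is a minimal sketch of one plausible reading of "double background subtraction combined with the histogram method": two difference masks (against a stored background and against the previous frame) are merged, and the hand's bounding box is taken where the row and column projections of the mask exceed a threshold. All function names, parameters, and thresholds are assumptions for illustration, not the paper's algorithm.

```python
import cv2
import numpy as np

def localize_gesture(frame, background, prev_frame,
                     diff_thresh=30, hist_thresh=0.05):
    """Locate the hand region via double background subtraction + histograms."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    bg = cv2.cvtColor(background, cv2.COLOR_BGR2GRAY)
    prev = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)

    # Two difference masks: against the static background and against
    # the previous frame; their union is the foreground estimate.
    mask_bg = cv2.absdiff(gray, bg) > diff_thresh
    mask_motion = cv2.absdiff(gray, prev) > diff_thresh
    mask = (mask_bg | mask_motion).astype(np.uint8)

    # Histogram method: project the mask onto both axes and keep the
    # span where the projections exceed a fraction of their maximum.
    col_hist = mask.sum(axis=0).astype(float)
    row_hist = mask.sum(axis=1).astype(float)
    cols = np.where(col_hist > hist_thresh * col_hist.max())[0]
    rows = np.where(row_hist > hist_thresh * row_hist.max())[0]
    if cols.size == 0 or rows.size == 0:
        return None  # nothing detected in this frame
    x0, x1 = cols[0], cols[-1]
    y0, y1 = rows[0], rows[-1]
    return (x0, y0, x1 - x0 + 1, y1 - y0 + 1)  # bounding box (x, y, w, h)
```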
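The two-stage projection-based identification can likewise be sketched. Assuming the projection method means comparing normalized row and column projections of a binarized hand mask against per-gesture references, a second pass over the finger region only is triggered when the best and second-best matches are too close. The feature layout, the `references` structure, the distance measure, and the `margin` and `finger_fraction` parameters are all illustrative assumptions.

```python
import numpy as np

def projection_features(mask, bins=32):
    """Normalized row and column projections of a binary hand mask."""
    rows = mask.sum(axis=1).astype(float)
    cols = mask.sum(axis=0).astype(float)
    # Resample both projections to a fixed length so that masks of
    # different sizes are comparable, then normalize to unit sum.
    rows = np.interp(np.linspace(0, len(rows) - 1, bins),
                     np.arange(len(rows)), rows)
    cols = np.interp(np.linspace(0, len(cols) - 1, bins),
                     np.arange(len(cols)), cols)
    feat = np.concatenate([rows, cols])
    total = feat.sum()
    return feat / total if total > 0 else feat

def identify(mask, references, margin=0.1, finger_fraction=0.4):
    """Two-stage identification: whole hand first, fingers only if ambiguous.

    `references` maps a gesture label to a pair of reference masks
    (whole hand, finger region) -- a purely illustrative data layout.
    """
    feat = projection_features(mask)
    dists = {label: np.abs(feat - projection_features(ref_full)).sum()
             for label, (ref_full, _) in references.items()}
    ranked = sorted(dists.items(), key=lambda kv: kv[1])
    best = ranked[0]
    second = ranked[1] if len(ranked) > 1 else (None, np.inf)
    if second[1] - best[1] > margin:  # stage 1 is confident enough
        return best[0]
    # Stage 2: analyze only the finger region (assumed to be the top
    # part of the hand mask) against the finger references.
    h = mask.shape[0]
    finger = mask[: int(h * finger_fraction), :]
    feat_f = projection_features(finger)
    dists_f = {label: np.abs(feat_f - projection_features(ref_f)).sum()
               for label, (_, ref_f) in references.items()}
    return min(dists_f, key=dists_f.get)
```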
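For dynamic gestures, the abstract states that the trend of arm movement between key frames is additionally determined, without giving the criterion. A minimal sketch, assuming the trend is taken as the dominant direction of the net displacement of the hand-region center across key frames:

```python
import numpy as np

def movement_trend(centers):
    """Classify the dominant direction of hand movement between key frames.

    `centers` is a sequence of (x, y) hand-region centers taken at key
    frames; the paper's trend estimation may use a different criterion.
    """
    pts = np.asarray(centers, dtype=float)
    dx, dy = pts[-1] - pts[0]          # net displacement over the gesture
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"  # image y-axis points downward
```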