ABSTRACTS OF ARTICLES OF THE JOURNAL "INFORMATION TECHNOLOGIES".
No. 4. Vol. 29. 2023

DOI: 10.17587/it.29.189-196

E. V. Alymova, Cand. of Eng. Sc., Associate Professor,
Russian Customs Academy (Rostov-on-Don branch), O. V. Khachkinaev, First Class Senior Research Officer,
Federal State Autonomous Scientific Establishment "Research Institute "Specialized Computing Protection Devices and Automation", Rostov-on-Don, Russian Federation

Automation of Software and Hardware Systems Acceptance Testing in the Paradigm of Behavior-Driven Development

The paper addresses the problem of acceptance testing of software and hardware systems according to the Agile Testing methodology. The Agile approach is widely used by software developers; developers of combined software and hardware solutions, however, rarely adopt it, doubting its effectiveness. The paper argues that the practice of continuous integration is applicable to software and hardware complexes as well and that, consequently, test interaction with the hardware at the level of physical interfaces should be performed automatically. The authors present Accepta, a system for automating acceptance testing of software and hardware complexes designed explicitly for the Agile Testing methodology in the context of continuous integration. The main component of Accepta is an interface block based on the ST Nucleo-F767ZI development board. New test actions are added to Accepta by extending the command set of the interface block. The software part, which implements test description and execution, is based on the Cucumber framework for automating acceptance testing of software systems. The requirements for the object under test and the scenarios that check those requirements are written in the Gherkin language, which is close to a natural-language description. The steps of a test scenario are implemented programmatically in Ruby. Test actions are executed by sending commands to the interface block over a COM port and analyzing the responses received. As the use of Accepta in real projects has shown, this approach makes it possible to apply the Agile development methodology successfully to software and hardware systems. Because interaction with the interfaces of the device under test is automatic, a high intensity of testing, including regression testing, is maintained throughout development. A consequence of regular testing is fast feedback: as soon as any functionality stops working correctly, the developers learn about it quickly. At the same time, thanks to the test automation tools, the sequence of test actions that led to the detection of a defect is reproducible.
Keywords: software and hardware system, test automation, acceptance testing, Agile Testing methodology, software testing tools
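
To make the Cucumber-based workflow described in the abstract concrete, the sketch below shows what such a requirement check could look like: a Gherkin scenario (reproduced in the leading comment) backed by Ruby step definitions that exchange commands with an interface block over a serial (COM) port. This is a minimal illustration under stated assumptions, not the authors' Accepta code: the port name, baud rate, the "PING"/"OK" command protocol and the use of the 'serialport' gem are hypothetical.

# Hypothetical Gherkin scenario (features/link_check.feature), not taken from the paper:
#
#   Feature: Interface block link check
#     Scenario: The device under test answers a ping command
#       Given the interface block is connected on "COM3"
#       When the "PING" command is sent to the device under test
#       Then the response is "OK"

# features/step_definitions/link_steps.rb
require 'serialport'  # community gem for serial I/O; an assumption, not named in the paper

Given('the interface block is connected on {string}') do |port_name|
  # 115200 baud, 8N1 is an assumed configuration of the interface block.
  @port = SerialPort.new(port_name, 115200, 8, 1, SerialPort::NONE)
  @port.read_timeout = 2000  # milliseconds
end

When('the {string} command is sent to the device under test') do |command|
  # Commands are assumed to be newline-terminated plain text.
  @port.write("#{command}\n")
  @response = @port.readline.strip
end

Then('the response is {string}') do |expected|
  raise "expected #{expected.inspect}, got #{@response.inspect}" unless @response == expected
end

After { @port&.close }  # release the serial port after every scenario

In the workflow described in the abstract, extending the test vocabulary amounts to adding step definitions of this kind together with the corresponding commands of the interface block; running cucumber then replays every scenario against the real hardware.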

