56 C. Patrício et al.
Fig. 11 Test design and implementation process activities
about the status of the test environment. Figure 12 illustrates a graphic overview of the various activities of the test environment set-up and maintenance process. Table 5 gives further details about the tasks to be completed for each activity.
6.3 Test Execution Process
The test execution process is applied to manage the execution of the test procedures created during the test design and implementation process. Test execution usually has to be run several times, since all feasible test procedures cannot be performed in a single iteration. Moreover, whenever a problem is fixed, test execution must be repeated [12]. The activities in this process are test procedure execution, test result comparison, and test execution recording [1]. Figure 13 gives a graphic overview of the test execution process, and Table 6 summarizes its activities and tasks.
6.4 Test Incident Reporting Process
The test incident reporting process is applied to report incidents identified as a consequence of failures, or of unexpected or unusual behaviour observed while performing a test or a retest [9]. The required activities are analysing the test results and creating or updating incident reports [12]. These activities are illustrated in Fig. 14 and described in Table 7.
Table 4 Tasks and activities from the test design and implementation process

Activity Tasks

Feature sets identification (a) Analyse the test basis to identify the test conditions
(b) Combine the features to be tested into feature sets
(c) Prioritize the feature sets for testing
(d) Obtain stakeholder agreement on the composition and prioritization of the feature sets
(e) Record the feature sets in the test design specification
(f) Record the traceability between the test basis and the feature sets

Derive test conditions (a) Determine the test conditions for each feature set
(b) Prioritize the test conditions using risk exposure levels
(c) Record the test conditions in the test design specification
(d) Record the traceability between the test basis, feature sets, and test conditions
(e) Obtain stakeholder approval of the test design specification

Derive test coverage items (a) Derive the test coverage items to be exercised during testing
(b) Prioritize the test coverage items using risk exposure levels
(c) Record the test coverage items in the test design specification
(d) Record the traceability between the test basis, feature sets, test conditions, and test coverage items

Derive test cases (a) Derive one or more test cases by defining pre-conditions and choosing input values and actions to exercise the selected test coverage items
(b) Record the test cases in the test case specification
(c) Record the traceability between the test basis, feature sets, test conditions, test coverage items, and test cases
(d) Obtain stakeholder approval of the test case specification

Assemble test sets (a) Distribute the test cases into one or more test sets according to their execution constraints
(b) Record the test sets in the test procedure specification
(c) Record the traceability between the test basis, feature sets, test conditions, test coverage items, test cases, and test sets

Derive test procedures (a) Derive test procedures by ordering the test cases within each test set
(b) Identify any test data and test conditions not already incorporated in the plan
(c) Prioritize the test procedures using risk exposure levels
(d) Record the test procedures in the test procedure specification
(e) Record the traceability between the test basis, feature sets, test conditions, test coverage items, test cases, test sets, and test procedures
(f) Obtain stakeholder approval of the test procedure specification
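As an illustration of the traceability that Table 4 repeatedly asks for, the following minimal Python sketch records test cases together with their links back to the test basis, feature sets, test conditions, and coverage items. All class, field, and requirement names are hypothetical; the standard prescribes which information to record, not any particular data model.

```python
from dataclasses import dataclass, field

# Hypothetical, minimal data model for the test design artefacts of
# Table 4; names and fields are illustrative only.

@dataclass
class TestCase:
    case_id: str
    coverage_item: str        # the test coverage item this case exercises
    preconditions: list
    inputs: dict
    expected_result: str

@dataclass
class TestDesignSpecification:
    feature_sets: dict = field(default_factory=dict)   # feature set -> conditions
    test_cases: list = field(default_factory=list)
    traceability: list = field(default_factory=list)   # one record per case

    def add_case(self, basis_item, feature_set, condition, case):
        """Record a test case and its traceability link (Table 4, tasks (b)-(c))."""
        self.feature_sets.setdefault(feature_set, []).append(condition)
        self.test_cases.append(case)
        self.traceability.append(
            (basis_item, feature_set, condition, case.coverage_item, case.case_id))

spec = TestDesignSpecification()
spec.add_case(
    basis_item="REQ-12: reject invalid login",          # hypothetical requirement
    feature_set="Authentication",
    condition="Login fails with a wrong password",
    case=TestCase("TC-001", "wrong-password branch", ["user exists"],
                  {"user": "alice", "password": "bad"}, "access denied"),
)
print(len(spec.traceability))  # one traceability record per derived test case
```

A review of such traceability records is what allows stakeholders to approve the test design, test case, and test procedure specifications at each step of Table 4.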
Fig. 12 Test environment set-up and maintenance
Table 5 Tasks and activities from the test environment set-up and maintenance process
Activity Tasks
Establish a test environment (a) Based on the test plan, perform the following:
1. Prepare the set-up of the test environment
2. Plan the test conditions
3. Define the configuration management
4. Implement the test environment
5. Prepare the test data to be used in the testing process
6. Prepare the test tools that support testing
7. Configure the test object
8. Check that the test environment fulfils its specification
9. Ensure that the test environment meets the defined requirements
(b) Record the test environment status and data and communicate them to the relevant stakeholders
(c) Include the details of any disparities between the test and the operational environment

Maintain test environment (a) Maintain the test environment as defined
(b) Communicate updates to the test environment status to the relevant stakeholders
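The checks in steps 8–9 of Table 5 can be sketched as a simple status function that reports which set-up steps are still missing. The step names and the reporting shape below are illustrative assumptions, not taken from the standard.

```python
# Illustrative sketch of checking that the test environment fulfils its
# specification (Table 5, tasks 8-9). The required steps are placeholders
# standing in for whatever the test plan actually specifies.

required_steps = [
    "test environment implemented",
    "configuration management defined",
    "test data prepared",
    "test tools prepared",
    "test object configured",
]

def environment_status(completed):
    """Return (ready, missing) so the status can be reported to stakeholders."""
    missing = [s for s in required_steps if s not in completed]
    return (not missing, missing)

# A partially prepared environment is reported as not ready:
ready, missing = environment_status({"test environment implemented",
                                     "test data prepared"})
print(ready, missing)
```

Communicating the `missing` list directly to stakeholders corresponds to task (b) of the establish activity: recording and reporting the environment status.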
Fig. 13 Graphic overview of the test execution process
Table 6 Tasks and activities from the test execution process
Activity Tasks
Execute test procedure(s) (a) Execute one or more test procedures in the prepared test environment
(b) Observe the actual results for each test case
(c) Record the actual results

Compare test results (a) Compare the actual and expected results for each test case
(b) Determine the test result for each test case executed in the test procedure. A failed comparison must be passed to the test incident reporting process as a new or updated incident report

Record test execution (a) Record the test execution results, as defined in the standard
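The three activities of Table 6 — execute, compare, record — can be sketched as follows. The function name, record fields, and toy test cases are assumptions made for illustration, not part of the standard.

```python
# Minimal sketch of the test execution activities in Table 6: run each
# test case in a procedure, record the actual result, compare it with the
# expected result, and flag failures for the incident reporting process.

def run_test_procedure(test_cases, execute):
    log = []            # test execution log: one record per test case
    incidents = []      # failed comparisons, to feed incident reporting
    for case in test_cases:
        actual = execute(case["inputs"])           # execute and observe result
        verdict = "pass" if actual == case["expected"] else "fail"
        record = {"id": case["id"], "actual": actual, "verdict": verdict}
        log.append(record)                         # record test execution
        if verdict == "fail":
            incidents.append(record)               # needs an incident report
    return log, incidents

cases = [
    {"id": "TC-001", "inputs": (2, 3), "expected": 5},
    {"id": "TC-002", "inputs": (2, 2), "expected": 5},
]
log, incidents = run_test_procedure(cases, execute=lambda xy: xy[0] + xy[1])
print([r["id"] for r in incidents])  # → ['TC-002']
```

The returned `incidents` list is exactly the hand-over point between the test execution process of Sect. 6.3 and the test incident reporting process of Sect. 6.4.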
Fig. 14 Graphic overview of the test incident reporting process
Table 7 Activity and tasks from the test incident reporting process

Activity Tasks

Analyse test results (a) Analyse the test result and update the incident details when the result is associated with a previously reported incident
(b) Analyse the test result when it relates to a newly identified issue. This analysis determines whether it is an incident that needs reporting, an action item that can be resolved without incident reporting, or whether no further action is needed
(c) Assign the action details to an appropriate person for decision

Create/update incident report (a) Identify and record/update the information that needs to be registered about the incident
(b) Communicate the status of new or updated incidents to the relevant stakeholders
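A minimal sketch of the create/update decision in Table 7 follows. The incident identifiers and triage logic are placeholders rather than anything the standard prescribes.

```python
# Hedged sketch of the Table 7 flow: a failure either updates a known
# incident report or creates a new one, and the outcome is recorded so
# it can be communicated to stakeholders.

def report_incident(incidents, failure, known_issue_id=None):
    """Create a new incident report or update an existing one."""
    if known_issue_id is not None and known_issue_id in incidents:
        incidents[known_issue_id]["occurrences"] += 1      # update existing report
        incidents[known_issue_id]["last_seen"] = failure["test_id"]
        return known_issue_id
    new_id = f"INC-{len(incidents) + 1:03d}"               # create new report
    incidents[new_id] = {"occurrences": 1,
                         "last_seen": failure["test_id"],
                         "summary": failure["summary"]}
    return new_id

incidents = {}
first = report_incident(incidents, {"test_id": "TC-002", "summary": "wrong sum"})
again = report_incident(incidents, {"test_id": "TC-009", "summary": "wrong sum"},
                        known_issue_id=first)
print(first, incidents[first]["occurrences"])  # → INC-001 2
```

The distinction between the two branches mirrors tasks (a) and (b) of the analyse activity: a retest of a known incident updates the existing report, while a newly identified issue opens a new one.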
(RQ1) ISO/IEC/IEEE 29119 is a set of internationally agreed guidelines for software testing. It can be used by any software developer or software testing organization at any stage of application development [1].
(RQ2) Implementing the ISO/IEC/IEEE 29119 standard is a demanding process for small and medium-sized companies and scenarios.
(RQ3) The main limitation associated with the implementation of this standard is its architecture. The standard is divided into sixteen outputs, ranging from planning and system requirements to project execution and application reports [10]. Implementing it in a software development scenario therefore brings several limitations regarding complexity and scalability.
(RQ4) Tailoring this standard makes it possible to decrease the number of documents, which promotes its acceptability in small and medium-sized organizations [10]. By incorporating new methods and models, the testing process can be improved, leading to improvements in different sectors of a company, in particular development, analysis, and team management.
(RQ5) This standard allows the testing team to concentrate its effort on the development processes by abstracting from details, which prevents effects such as the discovery of bugs only during the testing phase. Developers can therefore focus on their essential tasks, and the testing and management teams can focus on producing high-quality software instead of losing time fixing bugs [11]. Developers and managers should understand that adopting sound procedures in the testing phase does not depend on the technology used and must be taken into consideration in terms of capability assessment [20–25].
(RQ6) The effort, in terms of time and cost, of adopting this kind of testing standard is significant. However, the final cost can be higher when bugs appear during the product installation phase, especially in enterprise and safety-critical systems.
In sum, software testing is a relevant element of software development that cannot be avoided in any situation. Nowadays, software testing must be carried out in line with all the software development processes in order to design reliable applications.
Software testing is currently particularly critical given the recent advances in AI applications. These smart or intelligent systems will change our daily routines, and therefore sound software testing practices must be adopted to ensure high-quality services and applications. It should be noted that the adoption of a standard test process model is an asset for any organization, since both software and product quality are guaranteed by the established software quality metrics and the test process models followed.
References
1. Matalonga, S., Rodrigues, F., Travassos, G.H.: Matching context aware software testing design techniques to ISO/IEC/IEEE 29119. In: Rout, T., O'Connor, R.V., Dorling, A. (eds.) Software Process Improvement and Capability Determination, pp. 33–44. Springer International Publishing, Cham (2015). https://doi.org/10.1007/978-3-319-19860-6_4
2. Thakur, M.S.: Review on structural software testing coverage approaches. Int. J. Adv. Res. Ideas Innovations Technol. 281–286 (2017)
3. Anwar, N., Kar, S.: Review paper on various software testing techniques & strategies. Glob. J. Comput. Sci. Technol., May 2019. ISSN: 0975-4172. Available at: https://computerresearch.org/index.php/computer/article/view/1873. Accessed 25 Feb 2020
4. Rana, I., Goswami, P., Maheshwari, H.: A review of tools and techniques used in software testing. Int. J. Emerg. Technol. Innovative Res. 6(4), 262–266 (2019). ISSN: 2349-5162. Available at: http://www.jetir.org/papers/JETIR1904Q46.pdf
5. Hrabovská, K., Rossi, B., Pitner, T.: Software testing process models benefits & drawbacks: a systematic literature review. arXiv preprint arXiv:1901.01450 (2019)
6. Sanchez-Gordon, S., Luján-Mora, S.: A method for accessibility testing of web applications in agile environments. In: Proceedings of the 7th World Congress for Software Quality (WCSQ), pp. 13, 15 (2017)
7. Jan, S.R., Shah, S.T.U., Johar, Z.U., Shah, Y., Khan, F.: An innovative approach to investigate various software testing techniques and strategies. Int. J. Sci. Res. Sci. Eng. Technol. (IJSRSET). ISSN: 2395-1990 (2016)
8. Ali, S., Yue, T.: Formalizing the ISO/IEC/IEEE 29119 software testing standard. In: 2015 ACM/IEEE 18th International Conference on Model Driven Engineering Languages and Systems (MODELS), pp. 396–405. IEEE, Ottawa, ON, Canada (2015). https://doi.org/10.1109/MODELS.2015.7338271
9. Jamil, M.A., Arif, M., Abubakar, N.S.A., Ahmad, A.: Software testing techniques: a literature review. In: 2016 6th International Conference on Information and Communication Technology for the Muslim World (ICT4M), pp. 177–182. IEEE (2016)
10. Eira, P., Guimaraes, P., Melo, M., Brito, M.A., Silva, A., Machado, R.J.: Tailoring ISO/IEC/IEEE 29119-3 standard for small and medium-sized enterprises. In: 2018 IEEE International Conference on Software Testing, Verification and Validation Workshops (ICSTW), pp. 380–389. IEEE, Vasteras (2018). https://doi.org/10.1109/ICSTW.2018.00077
11. Pröll, R., Bauer, B.: Toward a consistent and strictly model-based interpretation of the ISO/IEC/IEEE 29119 for early testing activities. In: Proceedings of the 6th International Conference on Model-Driven Engineering and Software Development, pp. 699–706. SCITEPRESS—Science and Technology Publications, Funchal, Madeira, Portugal (2018). https://doi.org/10.5220/0006749606990706
12. Dávila, A., García, C., Cóndor, S.: Análisis exploratorio en la adopción de prácticas de pruebas de software de la ISO/IEC 29119-2 en organizaciones de Lima, Perú [Exploratory analysis of the adoption of ISO/IEC 29119-2 software testing practices in organizations in Lima, Peru]. RISTI 21, 1–17 (2017). https://doi.org/10.17013/risti.21.1-17
13. Park, B.H., Seo, Y.G.: Process improvement for quality increase of weapon system software based on ISO/IEC/IEEE 29119 test method. J. Korean Soc. Comput. Inf. 23, 115–122 (2018). https://doi.org/10.9708/JKSCI.2018.23.12.115
14. Sánchez-Gordón, M.-L., Colomo-Palacios, R.: From certifications to international standards in software testing: mapping from ISQTB to ISO/IEC/IEEE 29119-2. In: Larrucea, X., Santamaria, I., O'Connor, R.V., Messnarz, R. (eds.) Systems, Software and Services Process Improvement, pp. 43–55. Springer International Publishing, Cham (2018). https://doi.org/10.1007/978-3-319-97925-0_4
15. Henderson-Sellers, B., Gonzalez-Perez, C., McBride, T., Low, G.: An ontology for ISO software engineering standards: creating the infrastructure. Comput. Stand. Interfaces 36, 563–576 (2014). https://doi.org/10.1016/j.csi.2013.11.001
16. Condor, S., Garcia, C., Davila, A.: Adoption of ISO/IEC 29119-2 software testing practices: an exploratory analysis in organizations in Lima, Perú. In: 2016 International Conference on Software Process Improvement (CIMPS), pp. 1–8. IEEE, Aguascalientes, Mexico (2016). https://doi.org/10.1109/CIMPS.2016.7802802
17. Munir, H., Runeson, P.: Software testing in open innovation: an exploratory case study of the acceptance test harness for Jenkins. In: Proceedings of the 2015 International Conference on Software and System Process—ICSSP 2015, pp. 187–191. ACM Press, Tallinn, Estonia (2015). https://doi.org/10.1145/2785592.2795365
18. Felderer, M., Wendland, M.-F., Schieferdecker, I.: Risk-based testing. In: Margaria, T., Steffen, B. (eds.) Leveraging Applications of Formal Methods, Verification and Validation. Specialized Techniques and Applications, pp. 274–276. Springer, Berlin (2014). https://doi.org/10.1007/978-3-662-45231-8_19
19. Kawaguchi, S.: Trial of organizing software test strategy via software test perspectives. In: 2014 IEEE Seventh International Conference on Software Testing, Verification and Validation Workshops, pp. 360–360. IEEE, OH, USA (2014). https://doi.org/10.1109/ICSTW.2014.42
20. Ruy, F.B., Falbo, R.A., Barcellos, M.P., Guizzardi, G., Quirino, G.K.S.: An ISO-based software process ontology pattern language and its application for harmonizing standards. SIGAPP Appl. Comput. Rev. 15, 27–40 (2015). https://doi.org/10.1145/2815169.2815172
21. Dussa-Zieger, K., Ekssir-Monfared, M., Schweigert, T., Philipp, M., Blaschke, M.: The current status of the TestSPICE® project. In: Stolfa, J., Stolfa, S., O'Connor, R.V., Messnarz, R. (eds.) Systems, Software and Services Process Improvement, pp. 589–598. Springer International Publishing, Cham (2017). https://doi.org/10.1007/978-3-319-64218-5_49
22. Garcia, C., Dávila, A., Pessoa, M.: Test process models: systematic literature review. In: Mitasiunas, A., Rout, T., O'Connor, R.V., Dorling, A. (eds.) Software Process Improvement and Capability Determination, pp. 84–93. Springer International Publishing, Cham (2014). https://doi.org/10.1007/978-3-319-13036-1_8
23. Siegl, S., Russer, M.: Systematic use case driven environmental modeling for early validation of automated driving functionalities. In: Gühmann, C., Riese, J., von Rüden, K. (eds.) Simulation and Testing for Vehicle Technology, pp. 383–392. Springer International Publishing, Cham (2016). https://doi.org/10.1007/978-3-319-32345-9_26
24. Großmann, J., Seehusen, F.: Combining security risk assessment and security testing based on standards. In: Seehusen, F., Felderer, M., Großmann, J., Wendland, M.-F. (eds.) Risk Assessment and Risk-Driven Testing, pp. 18–33. Springer International Publishing, Cham (2015). https://doi.org/10.1007/978-3-319-26416-5_2
25. Adlemo, A., Tan, H., Tarasov, V.: Test case quality as perceived in Sweden. In: Proceedings of the 5th International Workshop on Requirements Engineering and Testing—RET '18, pp. 9–12. ACM Press, Gothenburg, Sweden (2018). https://doi.org/10.1145/3195538.3195541