
Journal of Software, Vol 6, No 11 (2011), 2271-2281, Nov 2011
doi:10.4304/jsw.6.11.2271-2281

A Synthesis of Software Evaluation Methodologies and the Proposal of a New Practical Approach

Armin Azarian, Ali Siadat

Abstract


A large number of developed, acquired or purchased software tools do not meet the users' requirements and expectations that were often at the origin of the project. This is mainly due to two reasons: first, the users' requirements are not identified or formalized as well as they should be; second, the evaluation of software and tools is not robust enough or does not reach a minimum required quality [1]. We propose a new approach to assess and quantify the quality of the software evaluation process. The theoretical approach is based on building a matrix A_{n,m} of software functionalities versus user scenarios. The norms of the column and row vectors of this matrix can be taken as quality indicators of the process. Performance metrics are then derived from these indicators to summarize the quality of the software evaluation process from the users' point of view. We applied this approach to a case study and derived indicators such as the task effectiveness and efficiency of the software product evaluation. The method supports two stages of the software development cycle: the design phase and the verification and validation phase. The case study shows how software imperfections and insufficiencies can be identified in a practical manner early in the development lifecycle, thereby improving subsequent releases of the software product.
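
The abstract only outlines how the functionality-versus-scenario matrix is turned into indicators. As a reading aid, the following Python sketch shows one hypothetical way such a matrix could be scored: the entry values, the normalization of the row and column norms, and the task-effectiveness figure are illustrative assumptions, not the formulas used by the authors.

```python
import numpy as np

# Hypothetical coverage matrix A of shape (n functionalities x m user scenarios):
# a[i, j] scores how well scenario j exercises functionality i
# (0 = not covered at all, 1 = fully covered). Purely illustrative values.

def evaluation_indicators(a: np.ndarray):
    """Derive simple quality indicators from a functionality-vs-scenario matrix."""
    n, m = a.shape
    # Row norms: how thoroughly each functionality is exercised across all
    # scenarios, scaled to [0, 1] by the norm of an all-ones row.
    functionality_coverage = np.linalg.norm(a, axis=1) / np.sqrt(m)
    # Column norms: how much of the functionality set each scenario touches.
    scenario_coverage = np.linalg.norm(a, axis=0) / np.sqrt(n)
    # One possible aggregate "task effectiveness" figure: the share of
    # functionalities exercised by at least one scenario.
    task_effectiveness = float(np.mean(a.max(axis=1) > 0))
    return functionality_coverage, scenario_coverage, task_effectiveness

# Example: 4 functionalities evaluated against 3 user scenarios.
A = np.array([
    [1.0, 0.5, 0.0],
    [0.0, 1.0, 1.0],
    [0.0, 0.0, 0.0],   # functionality never exercised -> evaluation gap
    [1.0, 1.0, 0.5],
])
f_cov, s_cov, eff = evaluation_indicators(A)
print("per-functionality coverage:", np.round(f_cov, 2))
print("per-scenario coverage:", np.round(s_cov, 2))
print("task effectiveness:", eff)
```

In such a sketch, low row norms flag functionalities that the evaluation scenarios barely touch, which is the kind of gap the case study is meant to surface early in the development lifecycle.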


Keywords


Operational Software Evaluation; Functional Matrix Decomposition; Verification and Validation Process; Process Quality; Indicator and Metrics

References


[1] Azarian A., Brindejonc V., Bruère J.M., Investigation about e-learning systems, SINTES'12, 2005.

[2] Zelesnik G., Introduction to Software Requirements (Requirements Engineering course material), Software Engineering Institute, Carnegie Mellon University, Pittsburgh, PA, 1992.

[3] Yannouth M., CHAOS (Application Project and Failure), Standish Group Study, 1995, pp. 1-4.

[4] Reiterer H., EVADIS II: A new method to evaluate user interfaces, in Proceedings of Human Computer Interaction, Cambridge University Press, 1992, ISBN: 0521445914.

[5] Boehm B.W., Software Engineering Economics, Englewood Cliffs, NJ: Prentice-Hall, 1981, ISBN: 0138221227.

[6] ISO DIN 9241-10, 11, Ergonomic requirements for office work with visual display terminals (VDTs): Guidance on usability, Berlin: Beuth, 1995.

[7] Wottawa, Lehrbuch Evaluation, in Weidemann B., Krapp A.: Pädagogische Psychologie (4th, completely revised ed.), Weinheim: Beltz PVU, 2001, ISBN: 3621275649.

[8] Palanque P.A., Bastide R., Design, Specification and Verification of Interactive Systems, Springer, 1995, ISBN: 3211827390.

[9] Becker C.H., Using eXtreme Programming in a Student Environment, Master's Thesis, GRIN Verlag, ISBN: 9783640720040.

[10] Reiterer H., User Interface Evaluation, in Encyclopedia of Library and Information Science, 1997, ISBN: 0824720601.

[11] Kent A., Williams J.G., Encyclopedia of Computer Science and Technology, Volume 45, 2002, ISBN: 0824722981.

[12] Gediga G., Hamborg K.-C., Düntsch I., Evaluation of Software Systems, in Encyclopedia of Computer Science and Technology, CRC Press, 2002, ISBN: 0824722981.

[13] Docherty P., System design for human development and productivity: participation and beyond, North-Holland, 1987, ISBN: 0444702512.

[14] Nielsen J., Usability Engineering, San Francisco, Academic Press Inc., 1993, pp. 195-198, ISBN: 0125184050.

[15] Lin H.X., Choong Y., Salvendy G., A proposed index of usability: a method for comparing the relative usability of different software systems, Behaviour and Information Technology, 16 (4/5), 1997, pp. 267-278. ISSN: 104492910.

[16] Hemmecke J., Stary C., The tacit dimension of user tasks: elicitation and contextual representation, in Proceedings of the 5th International Conference on Task Models and Diagrams for User Interface Design, Springer, October 2006.

[17] Ammon U., An International Handbook of the Science of Language and Society, Walter de Gruyter, 2005, ISBN: 3110171481.

[18] Leimeister J., Huber M., Bretschneider U., Krcmar H., Leveraging Crowdsourcing: Activation-Supporting Components for IT-Based Ideas Competition, Journal of Management Information Systems (vol. 26 issue 1), 2009, doi:10.2753/MIS0742-1222260108.

[19] Nielsen J., Usability engineering at a discount, in: Salvendy G., Smith M.J. (Eds.), Designing and Using Human-Computer Interfaces and Knowledge Based Systems, Elsevier Science Publishers, Amsterdam, 1989, pp. 394-401, ISBN: 9780444880789.

[20] Carroll J.M., Human-Computer Interaction: Psychology as a science of design, in: Annual Review of Psychology, Vol. 48, pp. 61-83, Palo Alto, CA: Annual Reviews, 1997, ISSN: 00664308.

[21] Jeffries R., Miller J.R., Wharton C., Uyeda K.M., User interface evaluation in the real world: a comparison of four techniques, in Proceedings of the ACM CHI'91 Conference on Human Factors in Computing Systems, pp. 119-124, Association for Computing Machinery, New York, 1991, ISSN: 0736-6906.

[22] Lewis C., Wharton C., Cognitive Walkthroughs, in: Helander M.G., Landauer T.K., Prabhu P. (Eds.), Handbook of Human-Computer Interaction, Amsterdam: Elsevier Science B.V., 1997, ISBN: 0444818626.

[23] Nielsen J., Usability Inspection Methods, New York: Wiley, 1994, ISBN: 0471018775.

[24] Tan D., Wandke H., Process-oriented user support for workflow applications, in Proceedings of the 12th International Conference on Human-Computer Interaction: Applications and Services, Springer, July 2007.

[25] Salvendy G., Smith M., Designing and Using Human-Computer Interfaces and Knowledge Based Systems, Amsterdam: Elsevier, 1989, pp. 394-401, ISBN: 04448807810.

[26] Stowasser S., Methods of Software Evaluation, in The International Encyclopedia of Ergonomics and Human Factors, CRC Press, 2006, p. 3249, ISBN: 04153043010.

[27] Azarian A., Siadat A., A proposal of a practical approach for quantified quality software evaluation during the development cycle, in Proceedings of the 7th WSEAS Int. Conf. on Applied Informatics and Communication (AIC '07), Athens, Greece, August 24-26, 2007, pp. 331-336, ISSN: 1790-51117.

[28] McCabe T.J., A Complexity Measure, IEEE Transactions on Software Engineering, vol. SE-2, no. 4, pp. 308-320, December 1976.

[29] Suh N.P., Axiomatic Design: Advances and Applications, Oxford University Press, 2001.

[30] Youtian Q., Chaonan W., Lili Z., Huilai Z., Hua L., Research for an Intelligent Component-Oriented Software Development Approaches, Journal of Software, Vol 4, No 10, December 2009, doi:10.4304/jsw.4.10.1136-1144.

[31] Lin Y., Wei S., Chi-long Z., Honglei T., On Practice of Big Software Designing, Journal of Software, Vol 5, No 1, pp. 81-88, January 2010, doi:10.4304/jsw.5.1.81-88.

[32] Asikainen T., Männistö T., Soininen T., Kumbang: A domain ontology for modelling variability in software product families, Advanced Engineering Informatics 21, 2007, ISSN: 1474-0346.



