What you see is what you test: a testing methodology for form-based visual programs

http://ir.library.oregonstate.edu/concern/graduate_thesis_or_dissertations/cf95jd68s

Abstract
Visual programming languages employ visual representation to make programming easier and to make programs more reliable and more accessible. Visual program testing becomes increasingly important as more and more visual programming languages and environments come into real use. In this work, we focus on one important class of visual programming languages: form-based visual programming languages. This class includes electronic spreadsheets and a variety of research systems that have had a substantial impact on end-user computing.

Research shows that form-based visual programs often contain faults, but that their creators often have unwarranted confidence in the reliability of their programs. Despite this evidence, we find no discussion in the research literature of techniques for testing or assessing the reliability of form-based visual programs. This lack will hinder the real use of visual programming languages. Our work addresses the lack of testing methodologies for form-based visual programs.

In this document, we first examine differences between the form-based and imperative programming paradigms, discuss the effects these differences have on methodologies for testing form-based programs, and analyze challenges and opportunities for form-based program testing. We then present several criteria for measuring test adequacy for form-based programs, and illustrate their application. We show that an analogue to the traditional "all-uses" dataflow test adequacy criterion is well suited for testing form-based visual programs: it provides important error-detection ability, and can be applied more easily to form-based programs than to imperative programs.

Finally, we present a testing methodology that we have developed for form-based visual programs. To accommodate the evaluation model used with these programs, and the interactive process by which they are created, our methodology is validation-driven and incremental. To accommodate the user base of these languages, we provide an interface to the methodology that does not require an understanding of testing theory. We discuss our implementation of this methodology, its time costs, the mapping from our approach to the user interface, and empirical results achieved in its use.
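The "all-uses" analogue mentioned above can be sketched concretely. In a spreadsheet, each cell's formula acts as a definition, and every other cell that references it is a use; an all-uses-style criterion asks that each such definition-use pair be exercised by a validated output. The sketch below is illustrative only and assumes a toy dependency model (the cell names, formulas, and the coverage bookkeeping are inventions for this example, not the thesis's actual cell relation model):

```python
# Hedged sketch: a toy "all-uses" analogue for spreadsheet cells.
# A du-pair (d, u) means cell u's formula references cell d.
# Coverage = fraction of du-pairs exercised by user-validated values.
# All cell names and formulas below are hypothetical examples.

formulas = {
    "A1": [],            # input cell: references nothing
    "A2": [],            # input cell
    "B1": ["A1", "A2"],  # e.g. =A1+A2
    "C1": ["B1"],        # e.g. =B1*2
}

def du_pairs(formulas):
    """Enumerate definition-use pairs (defining cell, using cell)."""
    return {(ref, cell) for cell, refs in formulas.items() for ref in refs}

def all_uses_coverage(pairs, validated):
    """Fraction of du-pairs covered by validated (d, u) interactions."""
    return len(pairs & validated) / len(pairs)

pairs = du_pairs(formulas)
# Suppose the user validates C1's displayed value, which exercises
# only the B1 -> C1 pair: 1 of 3 du-pairs is covered.
print(all_uses_coverage(pairs, {("B1", "C1")}))
```

One appeal of this criterion in the form-based setting, as the abstract notes, is that the dataflow relationships are explicit in the formulas themselves, so the du-pairs can be extracted directly rather than computed from imperative control flow.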
Digitization Specifications

File scanned at 300 ppi (Monochrome, 256 Grayscale) using Capture Perfect 3.0.82 on a Canon DR-9080C, in PDF format. CVista PdfCompressor 4.0 was used for PDF compression and textual OCR.
