Note
I missed Monday’s lecture, whoopsies
Announcements
Assignment 1
- think about service alerts (alert if a part is not working correctly)
- non-functional requirements:
- have sufficient encryption so card information is not compromised
- ensure the casing of the vending machine remains grounded (power-wise)
- if not grounded, trip a breaker and shut it down
Assignment 2
- will be a junit testing assignment
- will be writing the tests first (“because otherwise it’s pointless”)
Software testing
Note
Starting from chapter 8, software testing, slide 0
Program testing
- Software testing: show that a program does what it is intended to do and discover bugs prior to release
- test using artificial data
- reveals the presence of errors, NOT their absence
- cannot prove a negative
- part of a more general verification and validation process → also includes static validation techniques
- Software inspections: analysis of the static system representation (code) to discover problems
- can check conformance with a spec but not conformance with customer’s real requirements
- cannot check non-functional characteristics such as performance, usability, etc.
- Software testing & software inspections are not opposing verification techniques → both should be used
- Program testing goals:
- demonstrate to the developer and customer that the software meets its requirements
- Custom software: there should be at least one test for every requirement in the requirements doc
- Generic software products: there should be tests for all of the system features and combinations of system features
- leads to validation testing
- discover situations where the behaviour of the software is not as intended
- Defect testing: root out undesirable system behaviour, such as system crashes, unwanted interactions with other systems, incorrect computation, and data corruption
- leads to defect testing
- V&V confidence: establish that the system is fit for purpose → depends on system’s purpose, user expectations, marketing environment
- Software purpose: how critical the software is to an organization
- User expectations: users may have low expectations of certain kinds of software
- Marketing environment: getting a product to market early may be more important than finding defects in the program
- Stages of testing:
- Development testing: system is tested during development to discover bugs and defects
- Release testing: separate testing team tests a complete version before release
- User testing: where users or potential users of a system test the system in their own environment
- live testing is a big no-no
- Automated testing: tests should be automated so that tests are run and checked without manual intervention
- use testing framework (JUnit, pytest, …)
- Automated test components (a minimal JUnit sketch follows this list):
- Setup part: set up the test (inputs, expected outputs)
- Call part: call the object/method to be tested
- Assertion part: compare the actual result with the expected result → if they match, the test passes → else it fails
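A minimal sketch of the three parts using JUnit 5, assuming a hypothetical `Calculator` class invented for illustration (not from the lecture):

```java
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertEquals;

// Hypothetical class under test, used only to show the test structure.
class Calculator {
    int add(int a, int b) {
        return a + b;
    }
}

class CalculatorTest {
    @Test
    void addReturnsSumOfInputs() {
        // Setup part: create the object and define the expected output
        Calculator calc = new Calculator();
        int expected = 5;

        // Call part: invoke the method under test
        int actual = calc.add(2, 3);

        // Assertion part: compare the actual result with the expected result
        assertEquals(expected, actual);
    }
}
```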
- Testing strategies:
- Partition testing: identify groups of inputs that have common characteristics, and should be processed in the same way
- should choose tests from within each group (see the sketch after this list)
- Guideline-based testing: use testing guidelines to choose test cases
- reflects previous experience of the kinds of errors that programmers often make while developing components (e.g. bounds testing on an array)
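A hedged sketch of partition testing, assuming a hypothetical `isValidMonth` method (invented for illustration) whose inputs fall into three partitions: below the valid range, inside it, and above it; one test is chosen from each partition.

```java
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertFalse;
import static org.junit.jupiter.api.Assertions.assertTrue;

// Hypothetical method under test: months 1..12 are valid, everything else is not.
class MonthValidator {
    static boolean isValidMonth(int month) {
        return month >= 1 && month <= 12;
    }
}

class MonthValidatorTest {
    @Test
    void rejectsValuesBelowTheValidRange() {
        assertFalse(MonthValidator.isValidMonth(0));  // partition: < 1
    }

    @Test
    void acceptsValuesInsideTheValidRange() {
        assertTrue(MonthValidator.isValidMonth(6));   // partition: 1..12
    }

    @Test
    void rejectsValuesAboveTheValidRange() {
        assertFalse(MonthValidator.isValidMonth(13)); // partition: > 12
    }
}
```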
- General testing guidelines (a sketch follows this list):
- choose inputs that force the system to generate all error messages (UI design issue)
- design inputs that cause input buffers to overflow
- repeat the same input or series of inputs numerous times
- force invalid outputs to be generated
- force computation results to be too large or too small
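A small sketch of the bounds and "too large" guidelines, assuming a hypothetical `ArraySum.sum` method (invented for illustration) that uses `Math.addExact` so overflow raises an exception instead of silently wrapping:

```java
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertThrows;

// Hypothetical component under test: sums an int array, reporting overflow.
class ArraySum {
    static int sum(int[] values) {
        int total = 0;
        for (int v : values) {
            total = Math.addExact(total, v); // throws ArithmeticException on overflow
        }
        return total;
    }
}

class ArraySumTest {
    @Test
    void emptyArraySumsToZero() {
        // bounds guideline: the smallest possible input
        assertEquals(0, ArraySum.sum(new int[] {}));
    }

    @Test
    void singleElementArraySumsToThatElement() {
        // bounds guideline: an input of length one
        assertEquals(7, ArraySum.sum(new int[] {7}));
    }

    @Test
    void overflowIsReportedRatherThanWrapped() {
        // guideline: force the computation result to be too large
        assertThrows(ArithmeticException.class,
                () -> ArraySum.sum(new int[] {Integer.MAX_VALUE, 1}));
    }
}
```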
Interface testing
- detect faults due to interface errors or invalid assumptions about interfaces
- Interface types:
- Parameter interfaces: data passed from one method or procedure to another
- Shared memory interfaces: a block of memory is shared between procedures or functions
- e.g. having to clear the cache on a browser
- Procedural interfaces: a sub-system encapsulates a set of procedures to be called by another sub-system
- Message passing interfaces: sub-systems request services from other sub-systems
- Interface errors:
- Interface misuse: a calling component calls another component and makes an error in its use of its interface (e.g. parameters in the wrong order)
- Interface misunderstanding: a calling component embeds assumptions about the behaviour of the called component which are incorrect
- Timing errors: the called and calling components operate at different speeds and out-of-date information is accessed
- Interface testing guidelines (a sketch follows this list):
- design tests so that parameters to a called procedure are at the extreme ends of their ranges
- always test pointer parameters with null pointers
- design tests which cause the component to fail
- use stress testing in message passing systems
- in shared memory systems, vary the order in which components are activated
- reveals deadlocks
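A hedged sketch of interface tests for a parameter interface, assuming a hypothetical `MathUtils.findMax` procedure (invented for illustration); it exercises a null pointer parameter and values at the extreme ends of their range, per the guidelines above.

```java
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertThrows;

// Hypothetical called component exposing a simple parameter interface.
class MathUtils {
    static int findMax(int[] values) {
        if (values == null || values.length == 0) {
            throw new IllegalArgumentException("values must be a non-empty array");
        }
        int max = values[0];
        for (int v : values) {
            if (v > max) {
                max = v;
            }
        }
        return max;
    }
}

class MathUtilsInterfaceTest {
    @Test
    void rejectsNullParameter() {
        // guideline: always test pointer parameters with null pointers
        assertThrows(IllegalArgumentException.class, () -> MathUtils.findMax(null));
    }

    @Test
    void handlesParametersAtTheExtremeEndsOfTheirRange() {
        // guideline: parameters at the extreme ends of their ranges
        assertEquals(Integer.MAX_VALUE,
                MathUtils.findMax(new int[] {Integer.MIN_VALUE, Integer.MAX_VALUE}));
    }
}
```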
System and component testing
- during system testing, reusable components that have been separately developed and off-the-shelf systems may be integrated with the newly developed components → complete system is then tested
- components developed by different team members or sub-teams may have been integrated at this stage → system testing is a collective rather than an individual process
- system testing may involve a separate testing team & no involvement from designers or programmers
Use-case testing
- use cases developed to identify system interactions can be used as a basis for system testing
- each use case usually involves several system components, so testing the use case forces these interactions to occur (see the sketch below)
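A minimal sketch of a use-case-based test, assuming a hypothetical "buy an item" use case on a vending machine with invented `Inventory`, `PaymentProcessor`, and `VendingMachine` components (not from the lecture); running the use case forces the interaction between the components to occur.

```java
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertFalse;
import static org.junit.jupiter.api.Assertions.assertTrue;

// Hypothetical components; the "buy an item" use case spans both of them.
class Inventory {
    private int stock = 1;
    boolean inStock() { return stock > 0; }
    void dispense() { stock--; }
}

class PaymentProcessor {
    boolean charge(int cents) { return cents > 0; } // stand-in for real payment logic
}

class VendingMachine {
    private final Inventory inventory;
    private final PaymentProcessor payments;

    VendingMachine(Inventory inventory, PaymentProcessor payments) {
        this.inventory = inventory;
        this.payments = payments;
    }

    // The "buy an item" use case: check stock, take payment, then dispense.
    boolean buyItem(int priceCents) {
        if (!inventory.inStock() || !payments.charge(priceCents)) {
            return false;
        }
        inventory.dispense();
        return true;
    }
}

class BuyItemUseCaseTest {
    @Test
    void buyingAnItemExercisesPaymentAndInventoryTogether() {
        Inventory inventory = new Inventory();
        VendingMachine machine = new VendingMachine(inventory, new PaymentProcessor());

        assertTrue(machine.buyItem(150)); // the use case completes successfully
        assertFalse(inventory.inStock()); // the interaction updated the inventory
    }
}
```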
Note
Ended slide 39