Glossary of Software Testing Terms Provided by Testing Realms

Glossary of Software Testing Terms: F

This glossary of software testing terms is a compilation of knowledge, gathered over time, from many different sources. It is provided “as-is” in good faith, without any warranty as to the accuracy or currency of any definition or other information contained herein. If you have any questions or queries about the contents of this glossary, please contact Testing Realms directly.

  • Failover Testing
  • Failure
  • Fault
  • Feasible Path
  • Feature Testing
  • Firing a Rule
  • Fit for Purpose Testing
  • Full Release
  • Functional Decomposition
  • Functional Requirements
  • Functional Specification
  • Functional Testing

Failover Testing
Failover testing ensures that systems can successfully fail over and recover from a variety of hardware, software, or network malfunctions without undue loss of data or data integrity. Failover testing ensures that, for those systems that must be kept running, when a failover condition occurs, the alternate or backup systems properly "take over" for the failed system without loss of data or transactions.

This type of testing requires help from a system specialist to simulate a failover condition. It also requires an understanding of what failover means for the particular system. Failovers can be transparent or non-transparent.

  • Transparent failovers are such that the user is not aware of the system failover. The user experience is a seamless use of the system
  • Non-transparent failovers are more obvious to the end user. There may be an error message, and when the user tries again, the system works. In other cases, the failover requires a human to complete the system failover

This type of testing verifies that the user experience during the failover is as expected. It also validates the functionality of the system and the integrity of the data after the failover has occurred.
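A transparent failover can be simulated in a test harness with a wrapper that routes a request to a backup when the primary fails. The sketch below is illustrative only; `ServiceUnavailable`, `failing_primary`, and `healthy_backup` are hypothetical names, not part of any particular framework.

```python
class ServiceUnavailable(Exception):
    """Raised when a service cannot handle a request."""

def failing_primary(request):
    # Simulate the failed primary system.
    raise ServiceUnavailable("primary is down")

def healthy_backup(request):
    # Simulate the backup system taking over.
    return f"handled by backup: {request}"

def call_with_failover(request, primary, backup):
    """Try the primary system; on failure, transparently route
    the same request to the backup so the caller never notices."""
    try:
        return primary(request)
    except ServiceUnavailable:
        return backup(request)

result = call_with_failover("order #1", failing_primary, healthy_backup)
```

In a transparent failover test, the assertion is simply that `result` matches what the primary would have returned; the user-visible behaviour is unchanged even though the backup served the request.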

Failure
Non-performance or deviation of the software from its expected delivery or service.

Fault
A manifestation of an error in software. Also known as a bug.

Feasible Path
A path for which there exists a set of input values and execution conditions which causes it to be executed.
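The distinction is easiest to see next to an infeasible path. In this small sketch (illustrative code, not from the glossary), two of the three paths through `classify` are feasible, while the inner branch can never execute:

```python
def classify(x):
    if x > 10:
        if x < 5:          # infeasible path: no x satisfies x > 10 and x < 5
            return "impossible"
        return "large"     # feasible: any x > 10 reaches this
    return "small"         # feasible: any x <= 10 reaches this
```

Structural coverage tools must distinguish the two: 100% path coverage is unattainable here because one path has no satisfying input.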

Feature Testing
A method of testing which concentrates on testing one feature at a time.

Firing a Rule
A rule fires when the “if” part (premise) is proven to be true. If the rule incorporates an “else” component, the rule also fires when the “if” part is proven to be false.
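The premise/else behaviour described above can be sketched as a tiny rule evaluator. This is a hypothetical illustration, not the API of any real rules engine:

```python
def fire_rule(premise, then_action, else_action=None):
    """A rule fires when its premise is true. If the rule has an
    'else' component, it also fires when the premise is false."""
    if premise:
        return then_action()
    if else_action is not None:
        return else_action()
    return None  # premise false and no 'else' part: the rule did not fire

temperature = 80
result = fire_rule(temperature > 75,
                   then_action=lambda: "turn on cooling",
                   else_action=lambda: "turn on heating")
```

With `temperature = 80` the premise holds, so the rule fires its "then" part; with a cooler temperature it would still fire, but via the "else" part.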

Fit For Purpose Testing
Validation carried out to demonstrate that the delivered system can be used to carry out the tasks for which it was designed and acquired.

Full Release
All components of the release unit that are built, tested, distributed and implemented together. See also delta release.

Functional Decomposition
A technique used during planning, analysis and design; creates a functional hierarchy for the software. Functional decomposition broadly relates to the process of resolving a functional relationship into its constituent parts in such a way that the original function can be reconstructed (i.e., recomposed) from those parts by function composition. In general, this process of decomposition is undertaken either for the purpose of gaining insight into the identity of the constituent components (which may reflect individual physical processes of interest, for example), or for the purpose of obtaining a compressed representation of the global function, a task which is feasible only when the constituent processes possess a certain level of modularity (i.e., independence or non-interaction).
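The decompose-then-recompose idea can be shown in miniature: a "normalize name" function is resolved into two constituent parts, then reconstructed by function composition. The `compose` helper and function names are illustrative assumptions, not part of any standard.

```python
def compose(*funcs):
    """Recompose constituent functions by composition:
    compose(f, g)(x) is equivalent to f(g(x))."""
    def composed(x):
        for f in reversed(funcs):  # apply right-to-left, like f(g(x))
            x = f(x)
        return x
    return composed

# Constituent parts of a hypothetical "normalize name" function.
strip_spaces = str.strip
lower_case = str.lower

# The original function, reconstructed from its parts.
normalize = compose(lower_case, strip_spaces)
```

Because the parts are modular (neither depends on the other's internals), each can be analysed, tested, or replaced independently, which is exactly the benefit the definition describes.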

Functional Requirements
Define the internal workings of the software: that is, the calculations, technical details, data manipulation and processing, and other specific functionality that show how the use cases are to be satisfied. They are supported by non-functional requirements, which impose constraints on the design or implementation (such as performance requirements, security, quality standards, or design constraints).

Functional Specification
A document that describes in detail the characteristics of the product with regard to its intended features.

Functional Testing
Functional testing of the target-of-test should focus on any requirements for test that can be traced directly to functional specifications or business rules. The goals of these tests are to verify proper data acceptance, processing, and retrieval. It tests the features and operational behavior of a product to ensure they correspond to its specifications, and incorporates tests that ignore the internal mechanism of a system or component and focus solely on the outputs generated in response to selected inputs and execution conditions. This type of testing is typically based upon black-box techniques, that is, verifying the application (and its internal processes) by interacting with the application via the UI (User Interface) and analyzing the output (results).
  • User Interfaces - This area focuses on ensuring the UI functions based on business rules. It addresses behaviors between fields on the same screen and between screens
    • Field Validations
      • Inter-field Calculations - Verification of calculations based on visible fields, system calculations displayed in the UI, and combinations of the two
      • Interdependent Fields - Verification of action-driven behaviors between one or more fields (e.g., if a leap year is selected from the Year field and February is selected from the Month field, then the Day field displays a list of numbers from 1 through 29)
    • Inter-screen Data flows
      • Happy Path Testing - This is the verification that screen behaviors and data display values behave as expected when executing through a series of screens or business process
      • Data Dependent Branching - This is the verification that data from a prior screen or from another system causes the expected change in the current screen. These behaviors are typically driven by business rules
  • System Interfaces - This is a type of functional testing that verifies that selected systems are invoked based on business rules. It also verifies complex business-rules systems such as a scoring engine
    • Conditional System Interfaces - This is the verification that a system is invoked when a set of conditions are met. Those conditions can be system or business driven. This type could also validate variances in the data passing through the interface that are a result of a set of system or business conditions
    • Business Rules Systems - Some systems are dedicated business rules processors. Testing these systems directly can be more efficient than verifying the results in a display system. This type of testing involves submitting various data sets to invoke certain expected data responses.
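The interdependent-fields example above (the Year and Month selections driving the Day list) can be sketched as a checkable rule using Python's standard `calendar` module. `day_options` is a hypothetical helper name standing in for whatever the application under test actually exposes:

```python
import calendar

def day_options(year, month):
    """Return the list of valid day numbers the Day field should
    offer for the selected Year and Month."""
    _, last_day = calendar.monthrange(year, month)  # (weekday, days in month)
    return list(range(1, last_day + 1))

# February in a leap year should offer days 1..29; a common year only 1..28.
leap_days = day_options(2024, 2)
common_days = day_options(2023, 2)
```

A functional test for the UI would assert that the rendered Day drop-down matches `day_options(year, month)` for representative year/month combinations, including the leap-year boundary.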
