"pdponline" TQM |
||
TQM | TPM | | | Contact | |
Taguchi, Genichi Developed a set of practices, known in the U.S. as Taguchi
Methods, for improving quality while reducing costs. Taguchi Methods focus on
the design of efficient experiments and on increasing signal-to-noise ratios.
Dr. Taguchi also developed the quality loss function. Currently, he is
executive director of the American Supplier Institute and director of the
Japan Industrial Technology Institute.
Tampering Changing a process without first differentiating between common
cause and special cause variation.
Test case.
(IEEE) Documentation specifying inputs, predicted results, and a set of
execution conditions for a test item. Syn: test
case specification.
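A test case specification of this kind can be sketched as a small Python data structure; the field names and the absolute-value item under test are illustrative, not taken from the IEEE standard:

```python
# A minimal sketch of a test case: inputs, a predicted result, and
# (optional) execution conditions, plus a runner that compares actual
# against predicted. All names here are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class TestCase:
    inputs: dict                                     # inputs to the test item
    predicted: object                                # predicted result
    conditions: dict = field(default_factory=dict)   # execution conditions

def run_case(func, case):
    """Execute the item under test and compare against the predicted result."""
    actual = func(**case.inputs)
    return actual == case.predicted

def absolute(x):
    """Hypothetical item under test."""
    return x if x >= 0 else -x

case = TestCase(inputs={"x": -3}, predicted=3)
print(run_case(absolute, case))  # True
```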
Test design. (IEEE) Documentation
specifying the details of the test approach for a software feature or
combination of software features and identifying the associated tests. See:
testing functional; cause effect graphing; boundary value analysis; equivalence
class partitioning; error guessing; testing, structural; branch analysis; path
analysis; statement coverage; condition coverage; decision coverage;
multiple-condition coverage.
Test
documentation. (IEEE) Documentation describing plans for, or results
of the testing of a system or component. Types include test case specification,
test incident report, test log, test plan, test procedure, test
report.
Test driver.
(IEEE) A software module used to invoke a module under test and, often, provide
test inputs, control and monitor execution, and report test results. Syn: test
harness.
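The driver's role (invoke the module under test, supply inputs, monitor execution, report results) can be sketched as follows; the module and the cases are invented for illustration:

```python
# A minimal test-driver sketch: it invokes the module under test with
# supplied inputs, monitors the outcome, and reports pass/fail results.

def module_under_test(a, b):
    """Hypothetical stand-in for the module being tested."""
    return a + b

def driver(func, cases):
    """Invoke func on each (inputs, expected) pair and collect results."""
    results = []
    for inputs, expected in cases:
        actual = func(*inputs)
        results.append((inputs, actual, actual == expected))
    return results

report = driver(module_under_test, [((1, 2), 3), ((0, 0), 0), ((2, 2), 5)])
for inputs, actual, passed in report:
    print(inputs, actual, "PASS" if passed else "FAIL")
```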
Test log. (IEEE)
A chronological record of all relevant details about the execution of a
test.
TEST OF SIGNIFICANCE
A procedure to determine whether a quantity subjected to random variation
differs from a postulated value by an amount greater than that due to random
variation alone.
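As a rough sketch of such a procedure, the following computes a one-sample z-style statistic for a postulated mean, using the sample standard deviation; the data and the 1.96 cutoff (roughly the 5% level for a two-sided test) are illustrative assumptions:

```python
# Hedged sketch of a test of significance: does the sample mean differ
# from a postulated value mu0 by more than random variation alone would
# explain? Uses a z-approximation with the sample standard deviation.
import math
import statistics

def z_statistic(sample, mu0):
    n = len(sample)
    mean = statistics.fmean(sample)
    s = statistics.stdev(sample)
    return (mean - mu0) / (s / math.sqrt(n))

sample = [10.2, 9.8, 10.1, 10.3, 9.9, 10.0, 10.2, 10.1]  # invented data
z = z_statistic(sample, mu0=10.0)
# |z| > 1.96 would reject the postulated value at roughly the 5% level.
print(round(z, 3), abs(z) > 1.96)
```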
Test
phase. (IEEE) The period of time in the software life cycle in which
the components of a software product are evaluated and
integrated, and the
software product is evaluated to determine whether or not requirements have been
satisfied.
Test plan.
(IEEE) Documentation specifying the scope, approach, resources, and schedule of
intended testing activities. It identifies test items, the features to be
tested, the testing tasks, responsibilities, required resources, and any risks
requiring contingency planning. See: test design, validation
protocol.
Test procedure.
(NIST) A formal document developed from a test plan that presents detailed
instructions for the setup, operation, and evaluation of the results for each
defined test. See: test case.
Test
report. (IEEE) A document describing the conduct and results of the
testing carried out for a system or system component.
Test. (IEEE) An activity in which a
system or component is executed under specified conditions, the results are
observed or recorded, and an evaluation is made of some aspect of the system or
component.
Testability.
(IEEE) (1) The degree to which a system or component facilitates the
establishment of test criteria and the performance of tests to determine whether
those criteria have been met. (2) The degree to which a requirement is stated in
terms that permit establishment of test criteria and performance of tests to
determine whether those criteria have been met.
Testing, acceptance. (IEEE) Testing
conducted to determine whether or not a system satisfies its acceptance criteria
and to enable the customer to determine whether or not to accept the system.
Contrast with testing, development; testing, operational. See: testing,
qualification.
Testing, alpha
[a]. (Pressman) Acceptance testing performed by the customer in a controlled
environment at the developer's site. The software is used by the customer in a
setting approximating the target environment with the developer observing and
recording errors and usage problems.
Testing, beta [B]. (1) (Pressman) Acceptance testing performed
by the customer in a live application of the software, at one or more end user
sites, in an environment not controlled by the developer. (2) For medical device
software such use may require an Investigational Device Exemption [IDE] or
Institutional Review Board [IRB] approval.
Testing, compatibility. The process of determining the ability
of two or more systems to exchange information. In a situation where the
developed software replaces an already working program, an investigation should
be conducted to assess possible compatibility problems between the new software
and other programs or systems. See: different software system analysis; testing,
integration; testing, interface.
Testing, design
based functional. (NBS) The application of test data derived through
functional analysis extended to include design functions as well as requirement
functions. See: testing, functional.
Testing, development. (IEEE) Testing conducted during the
development of a system or component, usually in the development environment by
the developer. Contrast with testing, acceptance; testing,
operational.
Testing,
formal. (IEEE) Testing conducted in accordance with test plans and
procedures that have been reviewed and approved by a customer, user, or
designated level of management. Antonym: informal testing.
Testing, functional. (IEEE) (1)
Testing that ignores the internal mechanism or structure of a system or
component and focuses on the outputs generated in response to selected inputs
and execution conditions. (2) Testing conducted to evaluate the compliance of a
system or component with specified functional requirements and corresponding
predicted results. Syn: black-box testing, input/output driven testing. Contrast
with testing, structural.
Testing, interface. (IEEE) Testing conducted to
evaluate whether systems or components pass data and control correctly to one
another. Contrast with testing, unit; testing, system. See: testing,
integration.
Testing,
operational. (IEEE) Testing conducted to evaluate a system or
component in its operational environment. Contrast with testing, development;
testing, acceptance. See: testing, system.
Testing, parallel. (ISO) Testing a new or an alternate data
processing system with the same source data that is used in another system. The
other system is considered as the standard of comparison. Syn: parallel
run.
Testing, path. (NBS)
Testing to satisfy coverage criteria that each logical path through the program
be tested. Often paths through the program are grouped into a finite set of
classes. One path from each class is then tested. Syn: path coverage. Contrast
with testing, branch; testing, statement; branch coverage; condition coverage;
decision coverage.
Testing,
qualification. (IEEE) Formal testing, usually conducted by the
developer for the consumer, to demonstrate that the software meets its specified
requirements. See: testing, acceptance; testing, system.
Testing, regression. (NIST) Rerunning
test cases which a program has previously executed correctly in order to detect
errors spawned by changes or corrections made during software development and
maintenance.
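A regression run can be sketched as re-executing a suite of previously passing cases against the changed code; the discount routine and the cases below are invented for illustration:

```python
# Regression-testing sketch: test cases the program previously passed are
# rerun after a change, to detect errors spawned by the change.

def discount(price, rate):
    """Hypothetical routine under maintenance: apply a percentage discount."""
    return price * (1 - rate)

# Cases the earlier version was known to pass.
regression_suite = [
    ((100.0, 0.10), 90.0),
    ((50.0, 0.00), 50.0),
    ((80.0, 0.25), 60.0),
]

def run_regression(func, suite):
    """Return the cases that now fail, i.e. regressions spawned by a change."""
    return [(args, expected, func(*args))
            for args, expected in suite
            if abs(func(*args) - expected) > 1e-9]

failures = run_regression(discount, regression_suite)
print("regressions:", failures)  # an empty list means none were detected
```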
Testing,
system. (IEEE) The process of testing an integrated hardware and
software system to verify that the system meets its specified requirements. Such
testing may be conducted in both the development environment and the target
environment.
Testing, unit. (1) (NIST) Testing of a module for typographic,
syntactic, and logical errors, for correct implementation of its design, and for
satisfaction of its requirements. (2) (IEEE) Testing conducted to verify the
implementation of the design for one software element; e.g., a unit or module;
or a collection of software elements. Syn: component testing.
Testing, usability. Testing designed to determine whether information is
displayed in an understandable fashion, enabling the operator to correctly
interact with the system.
Testing, volume.
Testing designed to challenge a system's ability to manage the maximum amount of
data over a period of time. This type of testing also evaluates a system's
ability to handle overload situations in an orderly fashion.
Testing, worst case. Testing which
encompasses upper and lower limits, and circumstances which pose the greatest
chance of finding errors. Syn: most appropriate challenge conditions. See:
testing, boundary value; testing, invalid case; testing, special case; testing,
stress; testing, volume.
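For a routine with defined limits, worst-case inputs are the limits themselves and the values just beyond them; the clamp function and its 0–255 range below are assumed for illustration:

```python
# Worst-case testing sketch: exercise a routine at its upper and lower
# limits, where errors are most likely. The limits are invented here.

def clamp(value, low=0, high=255):
    """Hypothetical routine: clamp a value into [low, high]."""
    return max(low, min(high, value))

# Worst-case inputs: the limits and the values just outside them.
worst_cases = {
    -1: 0,      # just below the lower limit
    0: 0,       # lower limit
    255: 255,   # upper limit
    256: 255,   # just above the upper limit
}
for value, expected in worst_cases.items():
    assert clamp(value) == expected
print("all worst-case inputs handled")
```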
Testing, integration. (IEEE) An orderly progression of testing in which
software elements, hardware elements, or both are combined and tested, to
evaluate their interactions, until the entire system has been
integrated.
Testing, performance. (IEEE) Functional testing conducted to evaluate the
compliance of a system or component with specified performance
requirements.
Testing, special case. A testing technique using input values that seem likely to
cause program errors; e.g., "0", "1", NULL, empty string. See: error
guessing.
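Using exactly the special values the definition lists ("0", "1", NULL, empty string), a run against a hypothetical parsing routine might look like this:

```python
# Special-case testing sketch: feed the error-prone values "0", "1",
# NULL (None in Python), and the empty string to a parsing routine.

def parse_flag(text):
    """Hypothetical routine: parse a textual flag; None or empty -> False."""
    if text is None or text == "":
        return False
    return text.strip() == "1"

special_inputs = ["0", "1", None, ""]
print([parse_flag(value) for value in special_inputs])
# [False, True, False, False]
```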
Testing, statement. (NIST) Testing to satisfy the criterion that each
statement in a program be executed at least once during program testing. Syn:
statement coverage. Contrast with testing, branch; testing, path; branch
coverage; condition coverage; decision coverage; multiple condition coverage;
path coverage.
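Statement coverage can be measured with the standard-library trace hook; the three-statement function and the single test input below are invented, and real projects would use a coverage tool rather than this hand-rolled tracer:

```python
# Statement-coverage sketch: record which lines of a function execute
# under a test, using sys.settrace. Simplified for illustration.
import sys

def triage(x):
    if x < 0:
        return "negative"
    return "non-negative"

executed = set()

def tracer(frame, event, arg):
    if event == "line" and frame.f_code is triage.__code__:
        executed.add(frame.f_lineno)
    return tracer

sys.settrace(tracer)
triage(5)          # exercises only the non-negative branch
sys.settrace(None)

first = triage.__code__.co_firstlineno
body_lines = {first + 1, first + 2, first + 3}   # the three body statements
coverage = len(executed & body_lines) / len(body_lines)
print(f"statement coverage: {coverage:.0%}")     # one statement never ran
```

A second call such as `triage(-5)` would execute the remaining statement and bring coverage to 100%.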
Testing, valid case. A testing technique using valid [normal or expected] input
values or conditions. See: equivalence class partitioning.
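With equivalence class partitioning, one representative valid value is drawn from each class instead of testing every input; the shipping tiers below are an assumed example:

```python
# Valid-case / equivalence-class sketch: the valid input domain is split
# into classes expected to behave alike, and one representative value
# per class is tested. The tiers here are invented for illustration.

def shipping_cost(weight_kg):
    """Hypothetical tiered shipping: <=1 kg, <=10 kg, heavier parcels."""
    if weight_kg <= 0:
        raise ValueError("weight must be positive")
    if weight_kg <= 1:
        return 5
    if weight_kg <= 10:
        return 12
    return 30

# One representative valid value per equivalence class.
representatives = {0.5: 5, 4.0: 12, 25.0: 30}
for weight, expected in representatives.items():
    assert shipping_cost(weight) == expected
print("one valid representative per class passed")
```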
Testing. (IEEE) (1) The process of
operating a system or component under specified conditions, observing or
recording the results, and making an evaluation of some aspect of the system or
component. (2) The process of analyzing a software item to detect the
differences between existing and required conditions, i.e., bugs, and to
evaluate the features of the software items. See: dynamic analysis, static
analysis, software engineering.
TGR
Things Gone Right.
TGW Things Gone Wrong.
TOPS Team Oriented Problem Solving
Total Quality Management (TQM) TQM consists of
management and control activities based on the leadership of top management and
on the involvement of all employees and all departments, from planning and
development to sales and service. These activities focus on quality assurance:
the qualities that satisfy the customer are built into products and services
during these processes, which are then offered to consumers.
Total Quality
Management Managing for quality in all aspects of an organization,
focusing on employee participation and customer satisfaction. Often used as a
catch-all phrase for implementing various quality control and improvement
tools.
Total Quality Management/Total
Quality Leadership (TQM/TQL) Both a philosophy and a set of guiding
principles that represent the foundation of the continuously improving
organization. TQM/TQL is the application of quantitative methods and human
resources to improve the material and services supplied to an organization, all
the processes within an organization, and the degree to which the needs of the
customer are met, now and in the future. TQM/TQL integrates fundamental
management techniques, existing improvement efforts and technical tools under a
disciplined approach focused on continuous improvement.
TQM Total Quality Management: A
management approach of an organization centered on quality.
Trace. (IEEE) (1) A record of the
execution of a computer program, showing the sequence of instructions executed,
the names and values of variables, or both. Types include execution trace,
retrospective trace, subroutine trace, symbolic trace, variable trace. (2) To
produce a record as in (1). (3) To establish a relationship between two or more
products of the development process; e.g., to establish the relationship between
a given requirement and the design element that implements that
requirement.
Traceability
analysis. (IEEE) The tracing of (1) Software Requirements
Specifications requirements to system requirements in concept documentation, (2)
software design descriptions to software requirements specifications and
software requirements specifications to software design descriptions, (3) source
code to corresponding design specifications and design specifications to source
code. Analyze identified relationships for correctness, consistency,
completeness, and accuracy. See: traceability, traceability matrix.
Traceability matrix. (IEEE) A matrix
that records the relationship between two or more products; e.g., a matrix that
records the relationship between the requirements and the design of a given
software component. See: traceability, traceability analysis.
Traceability The ability to trace a
product back through the process, and identify all sub-processes, components,
and equipment that were involved in its manufacture.
Traceability. (IEEE) (1) The degree to
which a relationship can be established between two or more products of the
development process, especially products having a predecessor-successor or
master-subordinate relationship to one another; e.g., the degree to which the
requirements and design of a given software component match. See: consistency.
(2) The degree to which each element in a software development product
establishes its reason for existing; e.g., the degree to which each element in a
bubble chart references the requirement that it satisfies. See: traceability
analysis, traceability matrix.
Transition Period Time when an organization is moving away
from an old way of thinking to a new one.
Tree diagram A chart used
to break any task, goal, or category into increasingly detailed levels of
information. Family trees are the classic example of a tree diagram.
TRIZ Theory of Inventive
Problem Solving
Trojan
horse. A method of attacking a computer system, typically by
providing a useful program which contains code intended to compromise a computer
system by secretly providing for unauthorized access, the unauthorized
collection of privileged system or user data, the unauthorized reading or
altering of files, the performance of unintended and unexpected functions, or
the malicious destruction of software and hardware. See: bomb, virus,
worm.
Type I error
Rejecting something that is acceptable. Also known as an alpha
error.
Type II error
Accepting something that should have been rejected. Also known as beta
error.
u chart A control chart showing the count of defects per unit in a
series of random samples.
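The 3-sigma control limits conventionally drawn on a u chart are u-bar ± 3·√(u-bar/n), where u-bar is the average defect count per unit and n is the number of units per sample; the defect data below are invented:

```python
# u-chart sketch: center line and 3-sigma control limits for defects per
# unit, assuming equal sample sizes. The defect counts are invented data.
import math

def u_chart_limits(defect_counts, units_per_sample):
    """Return (LCL, center line, UCL) for a u chart with equal sample sizes."""
    u_bar = sum(defect_counts) / (len(defect_counts) * units_per_sample)
    sigma = math.sqrt(u_bar / units_per_sample)
    # A negative lower limit is conventionally floored at zero.
    return max(0.0, u_bar - 3 * sigma), u_bar, u_bar + 3 * sigma

defects = [4, 6, 3, 5, 7, 5]   # defects found in each sample of 10 units
lcl, center, ucl = u_chart_limits(defects, units_per_sample=10)
print(round(lcl, 3), round(center, 3), round(ucl, 3))
```

Points above the UCL (or below a nonzero LCL) signal special cause variation worth investigating.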
UPPER
CONTROL LIMIT A horizontal line on a control chart (usually dotted)
which represents the upper limits of process capability.
Usability. (IEEE) The ease with which
a user can operate, prepare inputs for, and interpret outputs of a system or
component.
User. (ANSI)
Any person, organization, or functional unit that uses the services of an
information processing system. See: end user.
User's guide. (ISO) Documentation that
describes how to use a functional unit, and that may include description of the
rights and responsibilities of the user, the owner, and the supplier of the
unit. Syn: user manual, operator manual.