Monthly Archives: November 2007

Teleworking Works

I read this in Macworld UK this morning:

Ravi S. Gajendran and David A. Harrison, of the Department of Management and Organisation at Pennsylvania State University, studied data on 12,833 telecommuters and found that working away from the office has more pluses than minuses, both for the people who do it and for the companies that employ them.

They reported their findings (PDF: 154KB) in the Journal of Applied Psychology, published by the American Psychological Association (APA).

“Our results show that telecommuting has an overall beneficial effect because the arrangement provides employees with more control over how they do their work,” said lead author Gajendran.

“We found that telecommuters reported more job satisfaction, less motivation to leave the company, less stress, improved work-family balance, and higher performance ratings by supervisors,” he said.

“Contrary to expectations in both academic and practitioner literatures, telecommuting has no straightforward, damaging effects on the quality of workplace relationships or perceived career prospects.”

——–

Test Planning: Test Data Sheets

When you take on a large testing project, you run into some interesting management issues.

  • You have multiple people doing the testing, and you have to be confident that you know how they are testing:

    • You want to know they are doing the same tests
    • You want to know how much they have tested
    • You want to have a record of what they have done
    • You want to know who did the testing
  • You have multiple items under test, and you want to be confident about exactly what you are testing:

    • You want to know what model was tested
    • You want to know what version was tested
  • It takes significant time to do all of this, and you want to be sure of when the testing was done

A simple test tool that is helpful in addressing these testing issues is the test data sheet. This can be as simple as a spreadsheet or as complicated as an interactive web site.

To address the issues above, you want to have a place on your test data sheet for each of the following:

  • Tester name
  • Test date
  • Model numbers involved in the test
  • Serial numbers involved in the test
  • Hardware/Firmware/Software versions involved in the test
  • An identifier for the test(s) being performed
  • The result of the test (Pass/Fail: Explanation)

It just sounds like paperwork, but a month after four people finish running a dozen tests on three devices in five configurations from each of nine vendors, you will appreciate having that pile of paper (or bits) to refer back to.
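If you keep the data sheet as a spreadsheet, even a small script can keep the columns consistent across testers. Here is a minimal sketch in Python; the field names, file name, and sample values are illustrations, not part of any standard:

    import csv
    import os

    # Columns for the test data sheet, matching the list above.
    FIELDS = [
        "tester", "date", "model", "serial",
        "hw_version", "fw_version", "sw_version",
        "test_id", "result", "explanation",
    ]

    def append_record(path, record):
        """Append one test record, writing the header row if the file is new."""
        new_file = not os.path.exists(path)
        with open(path, "a", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=FIELDS)
            if new_file:
                writer.writeheader()
            writer.writerow(record)

    # One row per test run; the values here are made up.
    append_record("switch_tests.csv", {
        "tester": "J. Smith", "date": "2007-11-15",
        "model": "XS-4800", "serial": "0012345",
        "hw_version": "2.1", "fw_version": "4.0.3", "sw_version": "1.8.2",
        "test_id": "IO1", "result": "Pass", "explanation": "",
    })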


——–

Test Planning: Test Verification Matrix

You can make test planning a little easier if you start with a Test Verification Matrix. This is simply a table that lists each requirement and the method that you have decided to use to verify the requirement.

Suppose the requirements I used as examples in my previous post were the only ones we had. In that case, the verification matrix would look like this:

Test Verification Matrix

  Requirement                                                   Verification Method
  ------------------------------------------------------------  -------------------
  PC2. Port connectors provided on the front panel              Inspection
  AE1. Provides at least 32 10/100/1000BASE-T ports             Inspection
  EN1. Operates in temperatures between 32 and 104°F            Inspection
  CM5. Manageable locally using console access                  Demonstration
  IO1. Passes IPv4 unicast packets at full line rate            Test
  ZZ1. Provides an average jitter of less than 10 milliseconds  Analysis

This can be useful because you can frequently verify requirements that share the same method at the same time. For instance, in your test plan you might group all of your inspection requirements together and perform them as you unpack the item under test.
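If you keep the matrix as data rather than as a document, that grouping falls out automatically. A minimal sketch in Python, using the requirement IDs from the table above:

    # The verification matrix as data: requirement ID -> verification method.
    matrix = {
        "PC2": "Inspection",
        "AE1": "Inspection",
        "EN1": "Inspection",
        "CM5": "Demonstration",
        "IO1": "Test",
        "ZZ1": "Analysis",
    }

    # Group requirements that share a verification method so they can be
    # scheduled together in the test plan.
    by_method = {}
    for req, method in matrix.items():
        by_method.setdefault(method, []).append(req)

    for method, reqs in sorted(by_method.items()):
        print(method + ": " + ", ".join(sorted(reqs)))

This prints one line per method, with all of the inspection requirements collected together.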


——–

Classical Requirements Verification Methods

Back in August, I started introducing Program Management with an introduction to the System Development Life Cycle. I have talked in more detail about some of the early phases of the life cycle, like Operational Concept Development and System Requirements Specification, but right now I would like to skip ahead to the Test phase.

We are at the point in our basic switch replacement process where we have to start planning to test the candidates that have made it through our paper evaluation.

In this part of the process, we need to verify that the devices we are considering actually meet our stated requirements. I will cover the testing process in a future post, but before I do, it is good to have an understanding of the four classical requirements verification methods:

  • Inspection
  • Demonstration
  • Test
  • Analysis

Inspection

Inspection is observation using one or more of the five senses, simple physical manipulation, and mechanical and electrical gauging and measurement to verify that the item conforms to its specified requirements.

For instance, we have this requirement on the physical characteristics of our basic switch:

PC2. Port connectors provided on the front panel

We will verify this requirement through inspection. That is, we will look at the switch and observe where the port connectors are located.

In fact, we will verify many of our requirements through inspection, including all of the requirements indicating port types and counts, such as:

AE1. Provides at least 32 10/100/1000BASE-T ports

In addition, for anything that is beyond our capability to test, we must rely on vendor documentation; we consider this review a form of inspection as well. For instance:

EN1. Operates in temperatures between 32 and 104°F

Some companies would put the item in a temperature chamber and cycle the temperature while conducting performance tests. We do not have that capability, so we will satisfy ourselves by inspecting the vendor documentation for the reported operating temperature range to determine whether the units satisfy this requirement.

Demonstration

Demonstration is the actual operation of an item to provide evidence that it accomplishes the required functions under specific scenarios.

Consider this requirement:

CM5. Manageable locally using console access

We will plug a local console device into the item and demonstrate that we can use it to perform a sampling of management functions.

Test

Test is the application of scientific principles and procedures to determine the properties or functional capabilities of items.

Test is similar to demonstration, but is more exacting, generally requiring specialized test equipment, configuration, data, and procedure in order to verify that the item satisfies the requirement.

Consider this requirement:

IO1. Passes IPv4 unicast packets at full line rate

We will verify this requirement by testing. We will use specialized test equipment (see the SmartBits Data Sheet), connect it to the item in a particular way, configure the control software just so, and run specific data through it according to a repeatable procedure, from which we will get a binary result indicating whether the item did or did not satisfy the requirement.
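To put a number on "full line rate": the worst case for a gigabit port is minimum-size (64-byte) frames, where the fixed per-frame overhead dominates. Here is a back-of-the-envelope sketch in Python; the framing constants are standard Ethernet, but the measured value and the pass tolerance are only illustrations:

    # Per-frame overhead on the wire: 8-byte preamble + 12-byte inter-frame gap.
    OVERHEAD_BYTES = 8 + 12

    def max_frames_per_second(frame_bytes, line_rate_bps=10**9):
        """Theoretical line-rate frame count for one gigabit Ethernet port."""
        bits_per_frame = (frame_bytes + OVERHEAD_BYTES) * 8
        return line_rate_bps / bits_per_frame

    # 64-byte frames are the worst case: about 1,488,095 frames per second.
    expected = max_frames_per_second(64)

    # Pass/fail check against the frame rate the test equipment reported.
    measured = 1488095  # illustrative number, as if read from the tester
    print("Pass" if measured >= expected * 0.999 else "Fail")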

Analysis

Analysis is the use of established technical or mathematical models or simulations, algorithms, or other scientific principles and procedures to provide evidence that the item meets its stated requirements.

As test was like a more involved version of demonstration, so analysis is like testing on steroids. In analysis, many tests may be performed, but the results of any given test do not give a pass or fail indication; rather, all of the results must be taken in concert, and we must perform some further operation in order to determine whether the item satisfies the requirement.

Let me say that I do not believe we have any requirements that require analysis in this case. After all, this is the basic switch. However, if we had a requirement like this:

ZZ1. Provides an average jitter of less than 10 milliseconds.

We would have to verify that requirement using analysis. We would perform many tests, collect the resulting data, measure the variation in the times between output packets, and then calculate the average. We might repeat this many times over a 24-hour period and show whether the average jitter ever went above 10 milliseconds.
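As a sketch of the arithmetic involved, here is one common way to compute average jitter, as the mean deviation of the inter-packet gaps from their average gap; the timestamps are made up, and a real run would use capture data from the test equipment:

    # Arrival times of consecutive output packets, in milliseconds (made up).
    arrivals = [0.0, 10.2, 19.9, 30.4, 40.1, 49.8, 60.3]

    # Inter-packet gaps, and jitter as each gap's deviation from the mean gap.
    gaps = [b - a for a, b in zip(arrivals, arrivals[1:])]
    mean_gap = sum(gaps) / len(gaps)
    deviations = [abs(g - mean_gap) for g in gaps]
    avg_jitter = sum(deviations) / len(deviations)

    # ZZ1 passes only if the average stays under 10 milliseconds.
    verdict = "Pass" if avg_jitter < 10.0 else "Fail"
    print("average jitter = %.2f ms -> %s" % (avg_jitter, verdict))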


——–