Sunday, October 28, 2007

Software in Testing

Even in testing, you have to pay attention to software. It's part of the test development knowledge base. The old paradigm of "Software's free" needs to be left in the dust, along with the attitude of "you design the hardware and plan the test...oh yeah, just do the software." That mindset needs to be left behind so you can move forward into the 90's (yes, the 1990's). If you ignore the software, you'll pay for it in the long run.

I say this because this was the attitude I got from a test manager. He said there were no software problems with the test sets at his site. Everyone had a good laugh, but I think he was serious.

Even in Test Engineering, people need to continuously improve their software skills. If you're a hardware guy who had one software class, you need to work at getting better. If you have a degree in Software Engineering and have been developing software for 15 years, there's still room to learn new software techniques.

If the software is poorly written, the tests will be hard to maintain or may not work properly at all. Without proper software discipline, the overall testing process won't go smoothly. As you move through the test development process, things will be rocky, problems will occur, and it will take too long to fix them. Software is one of the cornerstones of testing.

So don't ignore it or it will come back to bite you.

Saturday, October 27, 2007

What is a "Test" Person

Lately, I've been thinking about what exactly a "Test" person is...or, more to the point, a good test person. Someone who is good at testing needs multiple skills, being good at each but not necessarily an expert in any. The skills needed are Software Engineering, Electrical Engineering, and Systems Engineering. A good test person needs to be good at software because software drives the test. With tools such as TestStand, NI is trying to make software easier, when in actuality they're trying to make people who don't do software think they can. But if you do software poorly, the tests you develop don't work well.

Electrical Engineering skills are needed because that's what is being tested, and the interfaces need to be understood. Also, the Unit Under Test (UUT) has to be understood well enough to know it's being tested thoroughly.

Systems Engineering skills are needed to tie the tests together and to understand how the UUT fits into the rest of the system so it can be tested.

A good tester has qualities of each; most are very good in one area and adequate in the others.

I was pondering this as I was trying to understand a test in TestStand where the developer used a lot of nifty TestStand features that just made it harder to follow what was really going on. I think the person who wrote it understood systems really well, knew the hardware well enough to use it, and knew a lot of software details without being that good at developing software that will need to be maintained by others.
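To make the contrast concrete, here's a minimal sketch of the kind of plain, self-describing code module I'd rather see a TestStand step call, instead of burying the limits and logic in clever sequence-file expressions. The function name, limits, and simulated reading below are all made up for illustration; a real module would talk to the instrument drivers.

    #include <stdio.h>
    #include <stdbool.h>

    /* Hypothetical code module a TestStand step might call.
     * The measurement is simulated here; the point is that the limits
     * and the pass/fail decision live in one clearly named function,
     * so the next engineer can see what's being tested at a glance. */
    static bool Test5VSupply(double *measuredVolts)
    {
        const double lowLimit  = 4.75;   /* made-up limits for illustration */
        const double highLimit = 5.25;

        *measuredVolts = 5.02;           /* stand-in for a real DMM reading */

        return (*measuredVolts >= lowLimit) && (*measuredVolts <= highLimit);
    }

    int main(void)
    {
        double volts = 0.0;
        bool passed = Test5VSupply(&volts);

        printf("+5V supply: %.2f V -> %s\n", volts, passed ? "PASS" : "FAIL");
        return passed ? 0 : 1;
    }

Nothing fancy, but the person who has to maintain the test a year from now can read it without reverse-engineering a pile of sequence tricks.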

Wednesday, October 24, 2007

Testing is Tricky Business

Actually, testing can be tricky if your customer doesn't know what they want. You can't base real requirements on "Test this!" You say OK, you look at all the inputs and outputs and test them, and you test the circuits behind the interfaces as best you can. You even tell them in advance what you're going to test, and they say "that looks good" (you keep all the e-mail transactions).

Then you go to the test set sell-off and you get the conversation: "That's not what we wanted." "Well...what did you want?" "I don't know, but when I see it, I'll know it." That's not quite how it really goes, because after the "That's not what we wanted" you start cussing (in your head) and discussing what they signed up for versus what they say they signed up for.

Again, this is why you need good requirements up front, so that they understand what they will be getting and you understand what they want.

Recent experience has shown that in testing, the conversation above happens more often than the good requirements do.

Good luck getting through the cussing and discussing part. It's never easy.