Tuesday, October 29, 2013

After a recent discussion with another tester about options for learning more about software testing, I started to put together a brief response.  The more I thought about it, the more it grew, so I decided it was worth resurrecting my defunct blog to turn it into a full post.

The content below only skims the surface of the knowledge and opportunities available.  It's like that when you start thinking about context.  As Socrates said, "True knowledge exists in knowing that you know nothing", a natural extension of the realization that the more you know, the more you know that you don't know.

Before I start, I should note that I fall into the context-driven school of software testing.  This means that my suggestions tend to prioritize investigation and self-learning over memorizing terms and taking multiple choice tests.

In rough order of commitment, in terms of both time and money, I offer the following groups of suggestions for beginning, continuing, or attempting to master the study of the art of software testing.

1. Self education: Things you can learn on your own in your spare time, mostly free or low cost options.
  • Follow test industry leaders on Twitter: James Bach, Michael Bolton, Keith Klain, Elisabeth Hendrickson and many more can be found online.
  • Read testing books, blogs and magazines
  • Learn a scripting or programming language
  • Join a local testing group
  • Watch free presentations online from test conferences past and current
A good place to start is a free eBook called '99 Things You Can Do To Become a Better Tester', which includes the things above as well as many more.
Michael Larsen has even taken it upon himself to write a blog post about each of the 99 things!

2. More formal education: This will likely involve some expense, and a more structured time commitment.  Check with your manager about the expense: many companies have a training budget that may offset at least some of the cost of things like conferences and training courses.  You may also need to join an organization to gain access to conferences or courses.  Membership is usually reasonably priced and comes with other perks as well, such as publications and training material.

  • Attend a conference in person - this is a good way to get exposed to a range of different ideas and have discussions with many other testers.  Search the web; there may be a conference coming to your area!
  • Take a course like the online BBST (Black Box Software Testing) series: http://www.associationforsoftwaretesting.org/training/courses/
  • Take the Rapid Software Testing course taught by James Bach or Michael Bolton: http://www.satisfice.com/jamesschedule.shtml
  • Some experienced testers offer direct coaching, usually via Skype.  This may be on their own as time allows, or through an organization like the Association for Software Testing.

3. Formal and most expensive: A Master's program.  I only list this one because I've been asked about it.  All of the 'Masters in Software Testing' programs I have found in a Google search seem to be certificate programs offering a fancy (i.e. expensive) wrapper for ISTQB certification, not a true Master's degree.
The Florida Institute of Technology does offer a well-respected Master's in Software Engineering with a focus on software testing: http://www.fit.edu/programs/8050/  This is a full and comprehensive Master's program requiring a thesis and a background in computer science.

I also found a discussion about the possibilities of establishing more formal training in a 2008 blog post; the post and its replies make interesting reading.

As I fall firmly in the camp of context-driven testing (see context-driven-testing.com), my feeling is that ISTQB or any other certification is a waste of your time and money.  Many others have expressed their objections more eloquently, so I won't belabor the point, except to say that if it's a requirement to get a job and you really have no alternative, do what you have to do.  But don't expect it to make you a 'Master' tester.

As Groucho Marx said, "Those are my principles, and if you don't like them... well, I have others."

In summary, I think options 1 and 2 will take most testers anywhere they wish to go in the field of software testing.  There are plenty of options and content to stimulate your brain, which is the main tool you need to test with anyway!

Update: James Bach followed up shortly after this with a similar post titled 'To the New Tester' : http://www.satisfice.com/blog/archives/958

Friday, June 10, 2011

Testing Forums and Web Sites

There are quite a few forums and web pages devoted to the testing craft in part or in whole, and they seem to be multiplying.
Here are some of the active ones that I am currently aware of and participate in (or at least browse occasionally):

Forum Sites - these sites are primarily for posting and discussing questions in an open user environment.

Dedicated to software testing and QA, with lots of specific forums for different test tools.

A part of the UTest paid online software testing community, the forums also cover test discussions that are not specific to the UTest community.  Registration for the forums is different from the main UTest registration.

News Sites - these sites primarily contain industry articles.  Several of them have discussion groups as well.

Targeted towards the software industry in general and not just testing.  StickyMinds is part of TechWell, and seems to be on track to be consumed by the parent site at some point.

Publishers of Software Test & Quality Assurance Magazine.  In addition to the articles, this site has groups that you can join and participate in to earn recognition awards.

A relatively new site hosted by SmartBear Software.  The content is good and growing, but the discussion groups haven't really taken off yet.  This is also a more general software quality site, but it is broken down by areas like 'Quality', so it's easy to find what you're interested in.

This used to be one of my primary go-to sites for aggregated test-related blog posts, but it has mostly died.  I list it for completeness.

Other
Many well-known testers in the industry have their own blogs, which can usually be found through Internet searches or references in their postings to the sites above.  As they come and go, and appeal to different audiences, I won't attempt to list them here (maybe a subject for another post?).  Most of them, as well as some of the sites above, are active on Twitter, and there are testing communities on Facebook and LinkedIn as well if you are a member of those sites.

I encourage you to browse through the sites above, and consider participating in some of the discussion groups if you find a topic you're interested in.  News is always flowing in the testing and software development world, and keeping your mind active and engaged with the latest ideas will help keep your job interesting, as well as maintain your skills and marketability.
 
I'm sure there are some I missed; please feel free to comment with your favorites.

Tuesday, May 31, 2011

My Journey From Developer To Tester

I've been thinking about this post for a while, and was inspired to complete it after returning from a local tester group meeting about the difficulties of finding good testers.  Hopefully my story will help, at least from one perspective.
I graduated with a computer science degree from Campbell University.  Like most CS graduates, I expected to become a software developer, and my major classes were in programming and theory.  A career as a tester was never discussed, and there were no classes in testing.

Due to a scheduling error, my last semester consisted of only one night class, so I took my first career job doing telephone tech support for a large personal computer company.  I learned a lot about the way people think (or don't, in many cases!) and how to translate technical speak for the average user.  I was also doing a lot of testing.  Forget black box testing; this was troubleshooting a computer I couldn't even see, via a phone conversation with a non-technical user!  It was frequently frustrating, and I looked forward to escaping to a real programming job when I graduated.  Like many such experiences, I didn't appreciate some of the skills I was learning until after I left.

Once I got my degree, I happily took my first job writing code for a large telecom company, back when the industry was booming.  My job was writing software fixes that would be deployed via patches to communications switches.  The actual coding was a relatively small part of the job; most of it involved reproducing the problem, configuring test environments, and testing the fix to ensure the problem was gone.  I enjoyed the coding and the thrill of making something work that was broken before, but the actual testing was often challenging and interesting to me as well.

A few years later I moved to a position working on a similar product, but one where I got to do a few pure development projects.  It was satisfying at times, but difficult, and it didn't really seem like a good fit for me.  I did OK and managed to finish my projects on time, but I was uncomfortable with my job and performance level.  It was about this time that the industry tanked and layoffs finally caught up with me.  Having seen them coming, I had already been taking a good look at my skill set and what I really wanted to do.  I realized that throughout my career, breaking things (and to a lesser extent fixing them) had always been easier and more satisfying for me than creating them from scratch.  I was a better tester than a developer!

Armed with this newfound realization, my job search got a lot easier.  I was able to point to lots of ways in which my previous positions had involved testing and supporting skill sets like reporting and communicating with end users.  I was also able to look outside the telecom industry for jobs, which had been a challenge for development positions, since the languages and platforms I had worked on were very industry-specific.

Through a family connection I found a small company that would take me on to fill a tester position, a job which didn't even exist there at the time.
It was interesting interviewing for the opportunity to create a new role in a completely different company and industry.  While my testing experience on various projects was useful, I emphasized my interest in learning.  Perhaps more importantly from an attitude perspective, I distilled all the customer-facing experiences of my career into a story about my focus on customer quality and my ability to represent the customer's interests.
It also helped that I had taken the time to research the company as much as possible, and to think in advance about how I could work on their product.

I was hired, and it turned out to be a great fit: while learning the product and industry, I had time to find out what was necessary to MAKE the position.  I found and read the testing blogs and periodicals, and got involved with the local testing group for support.  I researched and implemented an automated testing tool the 'right' way, with a data-driven framework that was easy to maintain, reviewed and optimized their defect workflow, and performed a lot of exploratory testing.  As you can imagine, this all took a couple of years to get running smoothly, especially with the learning curve of a new industry and product.

Do I miss the development world?  Not a bit.  Writing and maintaining the automated test suite and supporting scripts satisfies my coding urge and keeps my skills up, and occasionally I use my free time to tinker with other automation tools and languages.  A steady stream of new functionality, plus fielding defects and questions from the customer base, gives me enough challenge to keep my manual testing skills up to snuff, along with spec reviews and critical thinking.
I'm part of the development group, so I can still 'tech talk' with the developers and do a little white box testing now and then.

Would I go back to being a developer?  Not me; I've had enough experience to know that testing is my passion and strength.  I do know others who have gone back and forth between the two positions, but in my experience it's not that common.  I think there is enough of a difference in the 'fix it' versus 'break it' mindset that most people have a preference and aptitude one way or the other.

Whether you're a developer or a tester, or one of the rare folks who enjoy both, the important thing is to know yourself.  Even if you're fresh out of college, think about your experiences with people and technology, in terms of what you enjoyed and what you were good at (be honest).  It may change over time.  But if you find your passion, it's a lot more enjoyable to go to work on Monday morning!



Monday, May 23, 2011

Testing and the Art of Auto Repair

I do a good bit of basic work on my car, from oil changes all the way to intake manifold cleaning and suspension work.  It's an 8-year-old Volkswagen diesel, and an online community helps keep me informed and encouraged about how to work on it.
I'm often struck by the similarities with software testing.  And yes, modern cars involve a lot of software diagnosis too.

For the most recent example, I had just cleaned out my intake manifold, a several-hour job on a Saturday morning (a major software release with significant changes to the source code).  Performance was greatly improved, but there was still a nagging sensation that it wasn't performing as it should (customers seem happy with the new functionality, but occasional non-reproducible bug reports and logs filter in).

After parking the car for a couple of days, I went to start it on Monday morning to go to work, and it wouldn't stay running.  It would idle for 10-15 seconds before quitting, and refused to rev over 1000 rpm (a crippling system failure just as the customers need to use the system).

I had seen very similar symptoms a few months before, and a new fuel filter fixed the problem.  So my wife and I pushed the car out of the way, and I took another car to work (reverted to the earlier release to provide needed functionality during the day).    After I got home, I installed the new fuel filter with high hopes of success, but in spite of priming and many starting attempts the car still would not remain running (previous fix doesn't work for the current problem).

I called the mechanic to give him a heads-up, and called the tow truck to take it to the expert (call the subject matter experts and architects).  But when the tow truck arrived two hours later to pick up the car, after some initial hesitation it started up and ran fine (this never happens with software - just kidding!).  I'm left to assume that some air in the fuel line managed to finally purge itself after the car sat for a while (external influences on the software/server, or varying system loads affecting an installation routine, finally settled down).

A test drive shows that the car is back to its old self, with even more power than before now that it has plenty of both air and fuel.  But since I never conclusively proved the source of the problem, I'll still be crossing my fingers when I start it in the morning.
And that, too, is often just like software.

Monday, May 16, 2011

Large data set testing

One testing scenario that I've been thinking about recently is testing large data sets.  Our software runs large furniture manufacturing plants, and the combinations of all the different options available in some screens can get quite large and cause performance issues.  More importantly, each customer is capable of configuring their data in many different ways.

Take a recent example.  A customer had configured two features with the same name, but one contained alphanumeric data and one had integer data.  It was a poor data setup from our perspective, so we disregarded this possibility when considering a schema change.  When upgrading to a newer software version, which put features in a common pool by name, the SQL upgrade script tried to merge the two data sets as one, and some data was lost because of the unexpected data type difference.  Luckily this was caught in a test upgrade by customer support and no live data was lost, but the point is the same: how do you test for data compatibility issues when each customer is capable of creating data combinations in new and 'interesting' ways?  You may not agree with a data setup, but if it's allowed, there's always the chance that someone somewhere will try it, and you can't always get away with calling it a 'data problem' (not if you want to keep your customers for long).
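
If I were scripting a check for that kind of collision today, it might look something like this minimal sketch: a pre-upgrade audit that flags features sharing a name but holding conflicting data types, so the merge can stop and ask instead of silently losing data.  The table, columns, and sample rows are all invented for illustration, not our actual schema.

    import sqlite3

    # Hypothetical pre-upgrade audit: find features that share a name but
    # hold conflicting data types, so the merge script can flag them for
    # review instead of silently losing data.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE features (id INTEGER, name TEXT, value TEXT)")
    conn.executemany(
        "INSERT INTO features VALUES (?, ?, ?)",
        [(1, "EdgeBand", "42"), (2, "EdgeBand", "Maple-3mm"), (3, "Finish", "Satin")],
    )

    def inferred_type(value):
        """Classify a stored value as integer-like or alphanumeric."""
        return "integer" if value.strip().lstrip("-").isdigit() else "alphanumeric"

    # Group the inferred types by feature name and report names that mix types.
    types_by_name = {}
    for name, value in conn.execute("SELECT name, value FROM features"):
        types_by_name.setdefault(name, set()).add(inferred_type(value))

    for name, types in sorted(types_by_name.items()):
        if len(types) > 1:
            print(f"CONFLICT: feature '{name}' mixes {sorted(types)} - review before merging")

Running a sweep like this against each customer database before the real upgrade script runs would have surfaced the conflict before any data was touched.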

Large data sets like this are a good example of the testing maxim that it isn't possible to test everything.  In any kind of large-scale, data-based solution, there are going to be too many possible ways for customers to create and combine their data.  So how do we get the best test coverage?  Here are a few ways I currently go about it; I'm sure you can think of more.

1) Focus on the most important data scenarios.
This may fall under the heading of 'duh', but it is always worth restating.  It doesn't do your customer any good if you catch a problem with a board being cut 2 mm too long, but the procurement screen for ordering new material is broken.  There is a mantra in aviation that in an emergency situation, 'First, fly the airplane'.  Don't get distracted by details and forget the big stuff.  There is plenty of material already published on techniques for determining what's most important to you and your customers, so I won't restate it here.

2) Restrict possible data entries and combinations.
While this is a development function based on lots of spec discussion and user input, you as the tester also have important input as a representative of the end user.  Sure, we all know to test edge cases and weird data inputs.  But we also need to slow down occasionally and think about whether we can or should allow the end user to enter that data if it doesn't make sense.  Speak up and suggest a restriction if you think one is needed.  If it's shot down (hopefully with an explanation), you've learned something about the data and at least planted the idea.  If it's accepted, and the user later decides they want that functionality back, now it's a discrete feature to test.
Once the restrictions are in place, don't forget to test them.  More importantly, test data migration from versions without the restrictions to versions with them, to catch any data that has already strayed.
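
As a concrete (and entirely hypothetical) illustration of that last point, a pre-migration sweep for strayed data can be as simple as running the new rule over the old rows:

    # Hypothetical example: a new restriction says this feature's values must
    # be numeric. Before shipping the migration, sweep the existing data for
    # values that strayed in before the rule existed. All names are invented.

    def violates_numeric_rule(value: str) -> bool:
        """Return True if a stored value would fail the new numeric-only rule."""
        return not value.strip().lstrip("-").isdigit()

    legacy_rows = [
        ("Width", "600"),
        ("Width", "six hundred"),  # pre-restriction stray the migration must handle
        ("Depth", "450"),
    ]

    strays = [(name, value) for name, value in legacy_rows if violates_numeric_rule(value)]
    print(f"{len(strays)} stray value(s) found: {strays}")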

3) Keep expanding your regression test suite as defects are reported.
This is an easy one; you don't even have to find your own bugs!  As data problems are found and reported, try to add them to your suite of regression tests.  While automated tests are preferable for this, even a list of data combinations to try in manual testing is better than nothing.
The most important thing is to add the necessary data to whatever database you are using for testing, and then use it.  Take the time to do a little exploratory testing around the defect scenario to see if you can catch any other issues revealed by that data combination.
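
If you automate with something like pytest, each reported defect can become one more row in a parametrized test.  Everything named below, including merge_features (a stand-in for the real code under test), is hypothetical:

    import pytest

    def merge_features(name, values):
        """Stand-in for the real merge logic under test; invented for illustration."""
        return {name: list(values)}

    # Each entry is a data combination lifted from a past defect report;
    # the list grows every time a new data problem is found in the field.
    DEFECT_DATA = [
        ("EdgeBand", ["42", "Maple-3mm"]),  # mixed types under one feature name
        ("Finish", ["", "Satin"]),          # empty value alongside a real one
    ]

    @pytest.mark.parametrize("name,values", DEFECT_DATA)
    def test_merge_preserves_every_value(name, values):
        merged = merge_features(name, values)
        for value in values:
            assert value in merged[name]

The nice part of this pattern is that covering a new defect is a one-line addition to the data table, not a new test function.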

4) Exploratory testing.
This one can be difficult, especially if you've been looking at the same program functionality over and over again for years, like I have.
The idea is to break out of the 'normal' success-path testing mindset and exercise the software with different approaches and data combinations.  As I mentioned before, if it's allowed, there's a chance someone will try it, so try to get away from the 'nobody would ever do that!' mindset for at least a few minutes.  Talk to customer support and see if any end users are known for pushing the boundaries of the software, then take a look at their data if you can.  Read industry publications, competitor news, or occasionally even your own corporate website to see what marketing is telling people your software will do.

5) Use customer data.
This one overlaps with the exploratory testing point.  If you can get your hands on some representative customer databases that use a lot of the functionality your software offers, spend some time getting to know their data.  Run the test cases you're familiar with against a different data set, and see if they are handled differently.  Stop and think: "What mindset does this data represent?  Can I apply it back to my familiar data?"
If you have customer representatives who work directly with the customer, they can be a big help here.
Another test to try is upgrading several sets of customer data between software versions.  As seen in the example I opened with, this often finds data scenarios that the person writing the upgrade scripts didn't think of.

6) Pairwise testing
http://www.pairwise.org/
This is a particularly useful tool to have in your back pocket when dealing with large sets of data that can be combined.  Any time I have more than three discrete sets of data being combined into a single scenario, I consider this approach.  However, if the individual data sets are very large, and/or have interdependencies and restrictions, I find that it loses its usefulness.
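
As a quick illustration, here's a minimal sketch using the third-party allpairspy Python package (my choice for this example; any pairwise tool will do), with parameter values invented for a furniture-flavored scenario:

    # Requires: pip install allpairspy
    from allpairspy import AllPairs

    parameters = [
        ["oak", "maple", "pine"],      # material
        ["drawer", "door", "shelf"],   # part type
        ["metric", "imperial"],        # unit system
        ["standard", "rush"],          # order priority
    ]

    # AllPairs yields a reduced set of rows that still covers every pair of
    # values at least once - far fewer than the 36 rows of the full product.
    for i, combo in enumerate(AllPairs(parameters), start=1):
        print(f"case {i}: {list(combo)}")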

7) Fuzzing and semi-random data sets
Don't have enough data?  Make some up!  Free tools such as http://www.generatedata.com/#about can generate data sets within parameters you choose (e.g. addresses, names, integers, strings) for testing your application.  The TestComplete testing tool that I use comes with its own data generator, and if you're handy with scripting you can write your own.
Fuzzing is a similar principle, but it automatically applies large numbers of semi-random inputs to the program.
I must admit, though, that while both of these can be useful for stress tests, I haven't had much luck applying them to create realistic scenarios.
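
If you do roll your own, a semi-random generator only takes a few lines.  In this sketch the fields and ranges are made up, and the fixed seed keeps any failure reproducible:

    import random
    import string

    random.seed(31)  # fixed seed so a failing input can be reproduced later

    def random_name(max_len=12):
        """Build a short semi-random string, including spaces and punctuation."""
        length = random.randint(1, max_len)
        return "".join(random.choices(string.ascii_letters + string.digits + " -_", k=length))

    # One hundred semi-random rows; edge values are included on purpose.
    rows = [
        {
            "feature": random_name(),
            "quantity": random.randint(-1, 10000),    # -1 and 0 are deliberate edges
            "unit": random.choice(["mm", "in", ""]),  # empty string as a hostile input
        }
        for _ in range(100)
    ]
    print(rows[:3])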

Hopefully this has given you a few ideas to start with.  One thing I can guarantee you is that customers will continue to find ways to break your software that you could never have imagined.  Don't feel too bad about it, add each problem to your testing repertoire, and make up your mind that that's one bug that won't slip by you again!

Saturday, May 14, 2011

Welcome to my blog!  I just returned from an inspiring week at the STAREast 2011 conference, which I highly recommend.  One piece of advice I often hear at events like this is to distinguish yourself online via Twitter, blogs, etc.
Since I already have a Twitter account (follow me at @allenjfly), I decided it's finally time to get a 'Round TUIT' (ha) and test my blogging skills.

I have a full time job and plenty to do outside of work, so don't expect too much, but I'll try to keep posting when I can.

I started as a developer with a computer science degree, working on software patches for telecommunications systems.  This involved a lot of testing to reproduce issues and to verify that the fixes worked.  When the telecom market went downhill and I got laid off, I decided that the testing part was really what I was best at, and I have been doing it ever since.

I landed at a small company with no tester through a personal connection, and convinced them they would be better off with some quality control in the development department.  Six years later I'm still there, and still the lone tester.  I've created and maintain an automated test suite, test fixes on every nightly build, oversee the defect tracking process and filter incoming defects, serve as third-level tech support, and perform any other testing task I can think of in my spare time: performance testing, exploratory testing, etc.

Thanks for reading, and I welcome your comments and discussion!