Try again. Fail again. Fail better

What are your best KPIs for QA department?
November 9, 2011, 10:16
Filed under: QA

Well, this is a popular subject these days. Personal development plans have become common in software companies recently, and since it's harder to define KPIs for software quality assurance than for Sales, for example, people tend to mix odd criteria in with good ones. So let's review together some of the most used:

Defects per unit/component – how does that reflect quality assurance performance? A low defect count per unit could mean the development department did a pretty good job, or that the QA department did a poor job finding defects. A high defect ratio means poor development, and not necessarily good QA. In the first case we cannot say with certainty that QA missed issues through poor execution, and in the second, poor development may simply have surfaced trivial problems first.
Customer reported problems – for me this is the best indication of a program's overall quality. Not much to object to here. Maybe just one thing: we're often blind to customer feedback, working in a black box while enjoying a good opinion of our own work.
Delivery deadlines are met – yes, this is a quality indicator, but for the QA process and organization rather than for the product itself. Dependencies also need checking, because in agile development testers are often left with a pile of work in the last days and pressure to deliver. This is an acceptable quality indicator, but it mainly reflects planning.
Cost of Poor Quality – how often do you see this one? More or less invisible, but a good KPI. It is very hard to measure the cost of poor quality in the software industry; it's easier for shoes, for example, where poor quality shows up as returns. Nevertheless, where it can be measured, it is very relevant.
Testing waste – this is my favorite. It's rarely used and mostly unknown. QA engineers usually believe that more testing is always better. That's absolutely wrong: the more useless testing you do, the more expensive testing becomes for the product, and the greater the chance of missing deadlines or doing sloppy work just to get a huge testing effort done regardless of everything else. Waste can be estimated by looking at the number of test reiterations, time spent regressing the same functionality for the n-th time, automation coverage and runtime, and testers' dead time waiting for tasks.
Customer satisfaction – and how are you going to measure that for the QA team specifically? It could be used as a KPI if you ask the questions at the right moment in the product's lifetime, and if the questions map to the relevant aspects you're tracking. Asked in the early development stages it's useless; asked after release it might be biased by the reaction of your customers' own customers, which may reflect a lack of interest in the product's purpose rather than its quality.
Improper bug reporting ratio – yes, this is an indicator, but if you reduce QA evaluation to this alone, you'll just create a bug-reporting anxiety that will hurt quality after all.

To summarize: the QA department doesn't create quality, it checks for quality. KPIs should cover testing efficiency, the quality of the testing itself, customer feedback on the overall product, and the coverage you have on the areas where most of the problems emerged.
There's no standard for this; start where your department's impact can be quantified and define your own set of values. And of course, write me back if you think you have a new KPI ☺


Let’s review Eggplant tool

Yesterday, I published a post about some automation tools for Android applications. In my review I considered Eggplant initially, but due to what I still consider a difficult trial-registration process, I gave up on that lead. I'm not saying it requires special skills to register for a trial, but it demanded more time than I was willing to spend, so I checked the others instead.
One of their support people contacted me yesterday, after the article was published, and offered to help me review Eggplant as well; that's the purpose of this post.
Testplant is offering a trial license to anybody from my blog or the Romanian Testing Conference who wants to test their product against an Android application. The license is valid for 15 days (compared to the regular 7 days of their usual offering). What we should do is take it, play with it and write some feedback. So, do you want to play this game?
To get it, please fill in the form at and use the offer code RTC (as in Romanian Testing Community).

A glance of mobile application testing

Well, just before the holiday, I want to tell you something.
It happened before, about 7 years ago, when testing was… testing with the heart, based on creativity, intuition and other feminine attributes, which actually impacted the male/female distribution among testers. But we don't want to jump into that just yet; maybe some other time.
So, 7 years ago I found testing interesting, somehow the reverse of development, and I was attracted to everything around it. In time I felt the feeling changing, and just as in every relationship, it is now in its mature phase. But as with any relationship, sometimes you need butterflies in your stomach to revive it and feel it's fresh, and I just got that from testing. Again. Mobile testing.
First, I heard we were releasing an application with the only testing done by the mobile app developer. That doesn't just seem wrong; it seems a road to failure, actually. And now that the application is under continuous refactoring, with bug fixing here and there and so on, it shows I was correct. The same application will have its own Android version, and we had to learn from our mistakes. Here is my experience:
Functional testing – quite simple and trivial. It was formerly done by the developers, but we had to enrich it. Working on the flows is as simple as for any software application; it needs no special skills or time. Check the architecture, check the flows, and write test cases that follow them. Don't forget the negative cases.
Non-functional testing – now it gets more interesting, because in the mobile application field multitasking is a virtue that needs to be checked. All the interruptions of normal flows, like an incoming phone call or a depleted battery, might break the application logic or keep resources busy. I tried finding these on my own, consulted my friends in the Romanian Testing Community, and finally got to the Mobile QA Zone website and found a unified Android Testing Criteria. Follow the link if you need typical Android criteria and you won't be disappointed.
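As an illustration of how such interruption testing can be organized, here is a small scenario table pairing each interrupt with a command that simulates it on an emulator. The commands come from the standard Android tooling (emulator console via adb, and dumpsys); the phone number, message and battery level are arbitrary examples, and you should verify each command against your SDK version before relying on it.

```python
# Interrupt scenarios to run against every critical flow of the app.
# Each maps a scenario name to an example command that simulates it on
# an emulator (values such as the phone number are arbitrary).
INTERRUPT_SCENARIOS = {
    "incoming_call": "adb emu gsm call 5551234",
    "incoming_sms":  "adb emu sms send 5551234 hello",
    "battery_low":   "adb shell dumpsys battery set level 5",
    "network_lost":  "adb emu gsm data off",
}

def checklist(flow_name):
    """Pair one application flow with every interrupt scenario,
    producing the matrix of cases to execute manually or from a script."""
    return [(flow_name, scenario) for scenario in INTERRUPT_SCENARIOS]
```

Running `checklist("checkout")`, for instance, yields one (flow, interrupt) pair per scenario, which is exactly the coverage matrix the unified criteria ask for.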
Automation and performance – yes, I want to do that. I need to do it and I want to do it. So I engaged in a search with multiple suspects. First there was Eggplant, and I was kind of disappointed by the so-called trial version. Trust me, guys, if you read this: you won't convince many people with that kind of registration for a trial version. Then there were Monkey (a random UI exerciser), Monkeyrunner (a Jython-based test scripting tool), Robotium (some sort of Selenium for Android), Robolectric (unit testing for Android apps) and Cucumber (scenario-based testing implemented in Ruby). I made my decision, but first let me walk you through the suspects:
Monkey is an application exerciser for the UI. It generates streams of pseudo-random user events such as touches, gestures, key presses and others. It might be useful to check your application under a bit of stress and constant use. I might even try this, but not as the first option: it has no capability for designing regression tests classified into categories that run and report findings restricted to specific things. In other words, this is some sort of hazard testing, good for some aspects but not fit to be our main option. You can get more information from here.
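Since Monkey is driven entirely from the command line, a small helper that builds the invocation is all you need to script it; here is a minimal sketch, where the package name is a hypothetical example. Note the seed option: a fixed seed makes the pseudo-random event stream reproducible, which matters when you want to replay a crash.

```python
import shlex

def monkey_command(package, event_count, seed=None):
    """Build the adb command line that runs the Monkey UI exerciser
    against a given package (the package name used below is made up)."""
    parts = ["adb", "shell", "monkey", "-p", package]
    if seed is not None:
        # Reuse the seed from a crashing run to reproduce the same events.
        parts += ["-s", str(seed)]
    parts += ["-v", str(event_count)]
    return " ".join(shlex.quote(p) for p in parts)

# e.g. monkey_command("com.example.app", 500)
# -> "adb shell monkey -p com.example.app -v 500"
```

You would then pass the resulting string to your shell or CI runner.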
Monkeyrunner is something else. It is an evolution of the previous tool, providing an API that allows controlling a device or Android emulator from outside the Android code itself. It looks close to the solution we adopted, but it lacks some of the complexity and documentation available for our chosen one. I would recommend it to those in love with Python, Jython and such. The application is described in detail here. Please let me know your feedback if you are using it.
Robotium is my favorite. It is well integrated with the Android community, has plenty of documentation, the complexity my project requires, and “It’s like Selenium, but for Android”. What it actually is, is a library that allows us to write tests, much like unit tests, with Android gestures and actions already implemented. The library doesn't cover ALL the things you might want to do, and you may have to develop some of those yourself, but in general it does its job. It can connect to a device/emulator from outside the Android code, and it is robust for automation. You can get it from here. I encourage you to use this one; for my project it did its job successfully. If you are interested in its features, check the full list.
Robolectric has its own fame and glory. They have a great logo. What the tool does is run the unit tests on a virtual machine, so there's no connection to a device/emulator. The advantage is platform-independent test execution, but I see no real benefit as I prefer to use a device. Take it from here.
Cucumber, on the other hand, seems immature. The idea behind it is great: use natural-language phrases and map them to Android activities. I didn't consider it much, but if you are using it, please tell me your impressions.
By now you've noticed there is no performance testing done yet. I'm still thinking about that; I might use a time tracker in the automated tests to measure. What I noticed is that all the tests except the JUnit ones share a common pipeline to connect and run on Android: InstrumentationTestRunner. One other tip is to read up on the general structure of an Android application; it helps a lot. Also bear in mind that, just as in Linux, every application runs in its own sandbox, like different users, and problems may appear when applications interact with each other or with OS activities and resources. A final tip is to test against all the Android versions out there, but make sure you test most on the most used one, currently 2.2.
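The time-tracker idea above can be sketched as a decorator that fails an automated test step when it exceeds a time budget. The step name and the one-second threshold are invented examples; a real suite would wrap actual UI actions driven through the automation tool.

```python
import time
from functools import wraps

def timed(threshold_s):
    """Decorator: fail the wrapped test step if it exceeds its time
    budget, turning a functional test into a crude performance check."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            result = fn(*args, **kwargs)
            elapsed = time.perf_counter() - start
            # Raise loudly so the test run reports the slow step.
            assert elapsed <= threshold_s, (
                f"{fn.__name__} took {elapsed:.3f}s, budget {threshold_s}s")
            return result
        return wrapper
    return decorator

@timed(threshold_s=1.0)
def open_main_screen():
    # Placeholder for a real UI step (e.g. driven via Robotium).
    time.sleep(0.01)
    return "ok"
```

The same decorator can be applied to every step of a flow, giving a per-step timing report for free on each regression run.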
I know this is an oversized article; I apologize for that. And happy vacation to me, 19 days left.

Interviewing a QA
June 28, 2011, 09:11
Filed under: QA

I've recently been trying to put together an interview template for QA positions. The idea is a set of 10-15 questions that touch aspects which don't surface in the current practice of talking through the CV. The current tendency is to talk about the experience visible in the CV, which is fine, but it shouldn't be the most important thing. I think CV questions should be restricted to details about the actual work done in the positions and roles mentioned and the technology used, and definitely not to stories about the actual projects the candidate was working on. Besides the fact that giving inside details from a previous position is unprofessional, it carries no real weight.
I heard a story about a big IT company that hired an individual starting from his hobbies. He was a passionate player of a computer game and he wrote that in his CV. I can't imagine that happening often, since people are reluctant to list such things, thinking it pinpoints a weakness, somewhere close to a drug addiction. The story goes that the interviewer started from the game, asked questions about its content, suggested scenarios and asked for specific decisions in certain situations. The employer was surprised by his solutions and justifications, and concluded that the strategy and the way he supported decisions within the game scenarios actually pointed to good strategic thinking and strong analytical capacity, and suggested a leadership position instead of the senior technical position he was being interviewed for.
Of course, this story is not common, but it reveals an alternative path of evaluation. It uses common ground between interviewer and interviewee, outside the regular office map, and investigates abilities rather than experience.
I'm looking to create an example of this, based on stories people would need to solve, analysis questions where the interviewee is asked to estimate the quantity and quality of something they cannot relate to, graphical puzzles, and so on. I'm looking for logic, intuition, analysis, decision making and imagination; after that we might talk about the CV over a beer after office hours.
Will you help me out on this?

Updates from a friend now updates from two friends

Yes, I know this is not a proper subject, but it deserves praise. I've been busy with something else lately. I didn't put it up here from the start, but now that my friend has posted it on his blog, I'll mention it here as well. His latest post announces our latest initiative:

What is the Romanian Testing Community? Well, if you asked, I would say that at some point we grew too tired and upset of being tagged as the developers' stupid brothers. We want to raise self-awareness within the testing community and raise the level of involvement in this art of a job. Yes, it is more than a job: it involves imagination, investigation, passion and initiative. Often you need some peculiar non-technical background to test a piece of software. We've all been through that.

We're trying to share experiences with each other and to support each other, so we don't stay isolated in silos and can keep up with the trends. We're organizing a conference, building a website, and setting up some very nice surprises that haven't surfaced yet, and I hope QAHeaven won't release any news until we're on our way with those. It might end nicely or it might fail miserably. What can we do to make it happen? Besides your support, our perseverance and a lot of luck, I guess it comes down to what Samuel Beckett wrote: “Ever tried. Ever failed. No matter. Try Again. Fail again. Fail better.”

How IT&C employees' salaries evolved in the first years of the crisis

Last year, 115,800 IT&C specialists were working in Romania, 8,300 fewer than the 124,100 employed in 2008. On the other hand, in 2010 the average gross annual salary was 9,133 euros, up 10.1% from 2009 and up 4.6% compared to 2008.

However, productivity regressed because of the services sectors, where the sharp drop in added value was not accompanied by a corresponding reduction in headcount. In practice, last year an IT&C employee brought his company 27,948 euros over the whole year, 8.6% less than in 2008 and down 1.7% from 2009.

Thus, the best of the last three years in terms of productivity was 2008, the year the global economic crisis began. While in telecommunications and in software and services the gap between rising salaries and falling productivity persisted, in the hardware sector salaries rose but so did productivity.

Average gross annual salary of an IT&C employee, 2008-2010. Figures are in euros.

The sector that maintained the highest salary level in the IT&C industry was telecommunications, with an average gross annual salary of 9,950 euros in 2009 and 11,040 euros in 2010. While the average gross annual salary in telecommunications grew 4.6% in 2010 versus 2008, average productivity fell 10.1% last year compared to 2008.

In software and IT services, 2009 put a stop to the salary escalation the sector had faced before the crisis.

The average gross annual salary was 7,786 euros, 4.8% less than the previous year. Last year, however, it climbed to 8,564 euros, up 4.6% compared to 2008 and 10% more than in 2009.

“In many of the companies facing a revenue decline, salary cuts exceeded 10% and even 20%. Most multinational subsidiaries, where the average gross annual salary exceeded 25,000 euros, also made substantial cuts. The analysis by subsector indicates a more substantial reduction in IT consultancy activities (-11%) and a milder one (-3%) in software development services and software product publishing,” explain the ITC officials.

In most of the larger local companies the average gross salary sits between 13,000 and 16,000 euros; in outsourcing firms and foreign development centers the range is 15,000-20,000 euros; and at most multinational subsidiaries it exceeds 20,000 euros.

While salaries rose last year, productivity fell 7.5% versus 2008 and 11.8% compared to 2009. “It is obvious that the problem of the gap between salary growth and productivity growth, which the sector faced in previous years as well, has not been solved but has worsened under crisis conditions,” add the officials of the Institutul pentru Tehnica de Calcul.

Average annual productivity per IT&C employee, 2008-2010. Figures are in euros.

In the hardware and electronics field, the average salary followed the general trend, with a 5.3% reduction in 2009 versus 2008 and a 5.9% increase in 2010 compared to 2008. But unlike the other two sectors, in hardware the staff reductions were accompanied by productivity gains: average productivity per employee grew 19.6% in 2009 compared to 2008 and 40.2% last year versus 2008.

Number of IT&C employees between 2008 and 2010:

Source: Institutul pentru Tehnica de Calcul /

Is Agile testing the “new black” of testing?
June 6, 2011, 11:13
Filed under: QA

Obviously, Agile is a strong trend. It's like owning an iPhone, iPad or Mac. And just like those, besides its actual and proven advantages, there is a big flock of worshipers creating this big momentum.

But did you ever give it a thought? Is Agile compatible with quality testing? First, take the traditional definition of quality verification, or testing: it requires a finished product that is verified from several perspectives against the specifications it was built for. That holds for any product, not just software; sport shoes, for example. In so-called Agile testing, response to change is valued over a fixed specification, and thus a changing product must be verified against changing circumstances. How is that not good? Well, there are several minuses.

1. Testing effort becomes prolonged, redundant and quickly obsolete. We test blocks of the product as they develop, both alone and in integration. With each block we accumulate a whole lot of redundant integration tests. Of course, many people skip integration tests completely, leaving them to the end; but then they might as well never start testing at all and begin with integration from the first time.

2. Testing in agile is dumb and strategy-less. Yes, you're probably outraged by this, but think a bit. Do you develop plans and strategies for testing? Nope, because you receive blocks of the product to test, and informal/formal specifications from the developer are enough to start testing. Automation? Pretty ineffective in agile testing: you don't have time to develop it between iterations, and even if you do, there's little point in running the automated tests more than once, since you don't test the same block a second time.

There are, of course, advantages to Agile testing as well. I'm not going to go through them all, but I was just wondering whether worshiping Agile hasn't distracted us too much from the original goals of testing: assuring that a ready-to-sell product will satisfy the needs of potential customers and successfully pass any test against its requirements.