During my stint as a Test Practice Lead last year, one of the key strategic initiatives I created and led was to assess and uplift the skills and capabilities of the whole test practice.
The process that I employed to deliver on that initiative was –
- Create a framework for Testers in the practice to assess their skills against
- Gather data (through the framework)
- Analyze the data and identify the top 3 areas of focus for the Test Practice's upskilling
In this post I am going to share my approach to achieving point # 1 above.
There were 3 key elements of my approach –
- Why do the testers need to assess their skills
- What do they need to assess themselves against
- What will be the levels of assessment
1. Why do the Testers need to assess their skills

Starting with the "why" was absolutely critical in my mind, because I did not want the Practice to feel insecure about this being an appraisal or rating exercise. My narrative was built on the plank that this is essentially a data-gathering exercise that will inform where we (as an organisation) should invest in terms of upskilling our Testers.
I addressed the practice with the rationale below, and it was received very well (no push-back about the intent of the exercise).
2. Skill set(s) to be assessed against
The next step was to frame the various skills that the Testers would be rating themselves against. I wanted the practice to self-reflect against (what I believe) is a set of skills that every "contemporary" Tester needs to possess to be effective in cross-functional Teams (working in complex Tech and non-Tech organisations).
I captured the following technical and non-technical skills and tool sets for the practice to rate themselves against. This is by no means an exhaustive or precise list, but my best judgement (based on my awareness and research) of where the Testing industry is at (and heading towards). This list should be taken with a pinch of context, i.e. the skill and tool set will need to be fine-tuned based on the organisation, product, and technical circumstances.
Skill set # 1 –
“Core” (sapient) Testing skills
Skill set # 2 –
Tooling set to augment the core testing skills above
Skill set # 3 –
Programming and scripting skills that enable a Tester to wield tools (skill set # 2) effectively
Skill set # 4 –
Practicing knowledge and experience of methodologies
3. Levels of assessment
This was a tricky one to articulate objectively. I was keen on collecting data on folks who had explored some of the above skill sets as hobbyists (a trait that I respect), while also differentiating that from actual practitioner experience. I also wanted to track the skills that people were keen to learn (but had been deprived of, either because no training was offered and/or because of a lack of client opportunities).
So, ultimately I settled on the below categories for the Testers in the practice to rate themselves against.
The central assessment questions became …
Are you a Starter, a Learner, a Practitioner, or a Champion for this skill set?
Is this a skill set that you want to grow in?
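To make the data-gathering side concrete, here is a minimal sketch (my own illustration, not part of the original framework) of how the self-assessment responses could be modelled and tallied to surface the top areas of focus. All names here (`Level`, `SkillRating`, `top_focus_areas`) and the sample data are hypothetical; the ranking heuristic (testers below Practitioner who want to grow) is just one plausible proxy for training demand.

```python
from collections import Counter
from dataclasses import dataclass
from enum import Enum

# Hypothetical encoding of the four assessment levels described above.
class Level(Enum):
    STARTER = 1
    LEARNER = 2
    PRACTITIONER = 3
    CHAMPION = 4

@dataclass
class SkillRating:
    tester: str
    skill: str           # e.g. "Exploratory testing", "Scripting (Python)"
    level: Level
    wants_to_grow: bool  # "Is this a skill set that you want to grow in?"

def top_focus_areas(ratings, n=3):
    """Rank skills by how many testers are below Practitioner level
    AND want to grow in them -- a rough proxy for training demand."""
    demand = Counter(
        r.skill
        for r in ratings
        if r.level.value < Level.PRACTITIONER.value and r.wants_to_grow
    )
    return [skill for skill, _ in demand.most_common(n)]

# Example with made-up responses:
ratings = [
    SkillRating("A", "Exploratory testing", Level.LEARNER, True),
    SkillRating("B", "Exploratory testing", Level.STARTER, True),
    SkillRating("A", "Scripting (Python)", Level.STARTER, True),
    SkillRating("B", "Scripting (Python)", Level.PRACTITIONER, False),
    SkillRating("C", "Agile methodologies", Level.CHAMPION, False),
]
print(top_focus_areas(ratings))  # ['Exploratory testing', 'Scripting (Python)']
```

In practice the ratings would come from the practice's filled-in assessments rather than a hard-coded list, but the aggregation step stays the same.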
Putting it all together –
Here is a copy of the mind map that collates everything in one place. This is what the whole Practice filled in and assessed themselves against.
I hope this helps other Test Leaders collate skills data for their Teams or Practices, so they can make informed decisions on where the training focus should be.