Blurring the lines
Many years ago, when I worked for Royal Mail, the role of a Performance Tester was quite different from the typical role advertised today.
There we would physically build the hardware, rack it, patch the server into the network, install the OS and install the application. The build instructions for the OS and application were documented by the associated live support teams, and our job was to verify those instructions within the test environment. We would then performance test the application and infrastructure. This broad role gave me a deep understanding of both the infrastructure and the application, which was certainly an advantage when analysing test results.
When we were absorbed into CSC, things changed dramatically. Every employee had to fit into a specialist role. A performance tester purely tested; there were separate teams to rack the server, connect it to the network, install the OS and install the application.
In my experience, testing has very much followed the same pattern of specialisation, with specific individuals for Unit Testing, System Testing, Performance Testing and Operational Acceptance Testing.
In the current testing market there seems to be a blurring of the lines again. Many Performance roles now list system testing, automation and unit testing among their requirements. One could argue that this is simply the big corporate model versus the SME model; however, more and more large corporates seem to be moving in this direction, most likely driven by cost.
In my experience, having individuals highly skilled across testing domains can only benefit testing as a whole. The risk of merging roles, though, is ultimately one of resource bottlenecks, where the work of several individuals now falls to one.
It will be interesting to see whether this practice becomes commonplace, and what effect it will have overall.