“To test properly” – the motto of the Aeroplane & Armament Experimental Establishment at Boscombe Down in Wiltshire, where the Ministry of Defence tests new military aircraft and where the Empire Test Pilots’ School is located.
The BBC Radio 4 afternoon current affairs programme PM has started a new thread this past week, “Will your job be replaced by a robot?”. (Their definition of ‘robot’ is quite wide; what they actually mean is ‘Artificial Intelligence’, which can come in a number of different packages, very few of them – currently – robot-shaped.) Their overall thrust is that within twenty years, very many professional and middle-management jobs will be replaced by AI systems.
We’ve been here before. In the 1960s, the watchword was “automation”. Automated systems were not in any way ‘intelligent’, but they were beginning to replace a number of manual and skilled jobs. Big teams of workers with shovels were replaced by excavators; farm workers were replaced by specialised attachments to tractors; skilled machine workers were replaced by numerically-controlled lathes and drilling machines.
In engineering terms, this was about the advance of miniaturisation, from mechanical or electromechanical automation systems, to transistorisation, then integrated circuits (ICs), large-scale integration (LSI), very large scale integration (VLSI) and then the microprocessor revolution. In each case, the devices got smaller, faster and cheaper; and humans got ever more ingenious at identifying new situations where these devices could be applied to both systems and machines.
At some point in the late 1970s, this effect crossed over from physical activities and started to appear in knowledge and information systems. Libraries began to see the arrival of computer systems, first for complex technical searching, then for cataloguing, and finally the knowledge itself was no longer confined to the pages of books and journals.
Thinkers in what was then called ‘library science’ talked of the ‘information explosion’, and of how librarians would be essential in almost every kind of organisation, helping professionals navigate the jungle of printed information sources and drill down to the information they needed to run their organisations or plan their products.
So to me, a software tester is someone whose primary role is to improve software products before they go out of the door by doing a number of different things:
- Participating in the design process by casting an experienced eye over proposed products, challenging assumptions made about users and their behaviour, and thinking about how the product should be tested;
- Collaborating with developers and product owners throughout the build process, continually challenging what is being built and why;
- Arranging for the quality assurance of the product at various stages in its development by helping to make sure that the application works as expected; and
- Test-driving early prototypes to see if the application does what it is supposed to, to find out the best way of using it, to critically appraise the end result, and to help those who have to write user instructions to understand what the application does, how it works and (sometimes more importantly) why it works.
In this last function, the tester is standing in for the end user, reviewing the product with a view to seeing whether it can be improved in any way. This in turn feeds back into the design process: software goes through successive versions, (hopefully) ‘upgrades’, driven by user feedback, bug fixes and new features. At least with computer software, the old corporate mantra of “we are continually striving to improve our products” is more likely to actually be true…
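To make “works as expected” a little more concrete, here is a minimal sketch of the kind of automated check a tester might arrange for. The function `apply_discount` is entirely hypothetical, standing in for any small piece of application logic; the point is the shape of the checks: the happy path, the edge cases, and deliberately bad input.

```python
import unittest

# Hypothetical function standing in for any small piece of application logic.
def apply_discount(price, percent):
    """Return price reduced by the given percentage."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class DiscountTests(unittest.TestCase):
    # The happy path: does the function do what it is supposed to?
    def test_typical_discount(self):
        self.assertEqual(apply_discount(100.0, 25), 75.0)

    # Challenging assumptions: what happens at the edges?
    def test_zero_and_full_discount(self):
        self.assertEqual(apply_discount(80.0, 0), 80.0)
        self.assertEqual(apply_discount(80.0, 100), 0.0)

    # What should the product do with clearly bad input?
    def test_invalid_percentage_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(50.0, 150)

if __name__ == "__main__":
    unittest.main()
```

Checks like these cover only the mechanical part of the job; the human work of challenging assumptions about users and their behaviour, described above, is what tells you which checks are worth writing in the first place.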
Meanwhile, as the impact of computers grew in the daily life of more and more people, pundits began to debate the issue. There was a joke in the late 1980s which suggested that there was a new sort of party game. When any group of computer scientists or employment experts or economists got together, one of them would name a number – say five million – and the rest would take it in turns to explain why the new generation of microprocessor-driven computers would cause either the loss or creation of that many jobs. Oh how we laughed (or forecast doom, depending on where we were standing at the time).
At the time, of course, it was easier to see how many jobs were being destroyed by any given change – say, in mining or manufacturing. My father left his job designing and implementing signalling schemes on the railways because he was being asked to design schemes that put people out of work.