A new book on Assessment Centres (ACs), titled Assessment Centres: Unlocking potential for growth, is being released in March by KnowRes Publishing. The editors are Sandra Schlebusch and Gert Roodt. Sandra has extensive experience in the practice of ACs, whilst Gert has for some time been involved in AC research in SA.
Assessment Centre technology has the potential to provide a platform for, firstly, diagnosing candidates’ management potential; secondly, developing suitably structured management development plans; and thirdly, training and developing these candidates’ management capabilities according to these plans. Diagnostic ACs can also be used to assess candidates’ development progress.
AC technology has the advantage of simultaneously providing training and development as well as hands-on management experience based on realistic simulations. This is perhaps the only management training approach where one learns the theory of ‘riding a bicycle’ while actually riding the bicycle in practice. ACs are widely underestimated as a vehicle for providing needs-based management training and development.
The book follows the logical structure of an introduction, a body and a conclusion. These three broad content areas are described in more detail below.
The introductory section
The first two chapters provide an introduction to, and an historical overview of, ACs in South Africa respectively. Chapter 1 describes the different types of ACs, the features of ACs, the key stakeholders in ACs and AC applications. The chapter concludes with an overview of the design model superimposed on the guiding principles of program evaluation. Chapter 2 (by Prof Deon Meiring) covers the historical development of ACs in SA, with the ACSG as the focal theme. Important mileposts were the development of the different sets of Guidelines and the challenges faced at the time. This constitutes the first, introductory part of the book.
The second part of the book is the ‘body’. The body is divided into four sections that represent the stages of the design model. The design model is widely used in the training literature and consists of the analysis, design, implementation and evaluation stages. Each stage opens with an introduction and closes with a synthesis. These four stages of the design model are also superimposed on the broad principles of program evaluation. Stated differently, each stage is concluded with a checklist to confirm whether the principles of program evaluation have been adhered to.
With a better understanding of how the book is structured and of the logic underlying this structure, we can now proceed to the different stages of the design model and cover each of them in more depth.
Stage 1: Analysis
This stage consists of three chapters representing the three steps in the analysis stage. In the introduction to this stage, the deliverables of the stage are clearly specified. Chapter 3 deals with the analysis step in the design model. This chapter covers a contextual analysis, specifically referring to the social, legal and ethical contexts, before the focus shifts to a business analysis. Chapter 4 deals with business effectiveness, the construct of management and different management levels, as well as management effectiveness. Lastly, Chapter 5 deals with job analysis and covers different job analysis techniques as well as the intended deliverables of these techniques. This stage is then concluded with a synthesis and a final checklist to determine whether the analysis stage has been conducted effectively – that is, whether it provides profiles for the jobs to be included in the AC.
Stage 2: Design
This stage consists of three chapters that represent the three steps of the design stage. The introduction to this stage specifies its deliverables. Chapter 6 introduces the design of the simulations, based on the operational blueprint derived from the analysis stage. Different design considerations, such as validity, types of simulations, simulation documentation and pre-piloting the simulations, are covered in this chapter. Chapter 7 deals with the design of the centre and covers specific aspects such as the role of the administrator, the observers and AC documentation. Chapter 8 deals with piloting the centre. Important aspects of piloting a centre, such as the selection of participants and considerations before, during and after the pilot, are covered in this chapter. This stage is then concluded with a synthesis and a final checklist to determine whether the design stage has been conducted effectively – that is, whether it provides fully functional simulations for an AC.
Stage 3: Implementation
The three chapters of this stage represent the three steps of the implementation stage. The introduction specifies the deliverables of this stage. Chapter 9 deals with important aspects before the centre, such as the training and selection of observers, the training of administrators and the required paperwork. Chapter 10 deals with a number of aspects during an AC, such as the orientation of the participants, the debriefing of the participants and possible pitfalls to avoid during an AC. A number of practical hints are provided to ensure an effective AC. Chapter 11 deals with a number of post-centre considerations that relate to the participants and to the future and maintenance of the centre. This stage is concluded with a synthesis and a final checklist to determine whether the implementation stage has been conducted effectively – that is, whether it provides a fully operational AC.
Stage 4: Evaluation and validation
The two chapters in this stage represent the two steps of the evaluation and validation stage. Chapter 12 covers the content evaluation of ACs. In this chapter the assessments by different stakeholders (the AC process owners, the participants, the participants’ subordinates, the participants’ managers and the HR specialists) of both the content and the process of the AC are taken into consideration. This procedure ensures systematic coverage of the content and process aspects of the AC. Chapter 13 deals with both the reliability and the validation of ACs. Basic statistical concepts are introduced first, after which the reliability and the content, construct and predictive validity of ACs are systematically covered. This stage is then concluded with a synthesis and a final checklist to ensure that the evaluation and validation stage has been conducted effectively – that is, that it provides reliable and valid ACs.
The last, concluding section of the book consists of three chapters. Chapters 14 and 15 (by the renowned international AC experts, Profs George C Thornton III and Diana E Krause) provide an overview of AC practices in the international context and a comparative overview of practices in Europe and the USA respectively. Chapter 16, the closing chapter, provides a synthesis of the challenges facing AC practitioners in SA across the different stages of the design model and offers suggestions on how to deal with these challenges in order to secure a promising future for ACs in SA.
Features of this book
The book also includes the latest, updated version of the ACSG’s Guidelines for Assessment and Development Centres in South Africa (2007). It further contains a glossary of the most frequently used AC terminology and an index of the terms used in this book.