The Department of Performance Monitoring and Evaluation (DPME) gave a presentation on the Municipal Assessment Tool (MAT), which was introduced as a mechanism for the proper measuring, monitoring and supporting of improved management practices in local government. It was essentially a management information tool that would enable municipal managers to shape their environment and practices to deliver quality services. It was introduced to answer concerns about service delivery and to ensure functionality as well as access to services, and in recognition of the fact that despite years of capacity-building initiatives, institutional performance remained poor. At municipal level, any interventions had tended to be fragmented, as opposed to the provincial level where the Management Performance Assessment Tool (MPAT) had been introduced some years ago. At municipal level, any interventions to date tended to be crisis-driven, but the new tool would provide early warning signals. It was suggested that a return to core mandates was needed and that quality of management was at the heart of many problems. Through the MAT, it was hoped that national and provincial departments could target their support more effectively. It was not to be seen as “yet another report” but would actually develop holistic information. Progressive improvements would be tracked, using key indicators, with input from various departments. A steering committee comprising the DPME, DCOG, SALGA, Cities Network and National Treasury would oversee the programme, with involvement of broader reference groups. The tool should be ready by end September, with a pilot involving eleven municipalities, to run from September 2013 to March 2014, to test the functionality and identify areas needing improvement, and full-scale rollout after April 2014. The steps taken to implement were described and progress to date was set out.
Members asked for an assurance that the MAT would not duplicate other efforts and asked in exactly what respects it would differ from other tools. The DPME was at pains to point out that this was a unique tool, and that its specific interventions would address internal managerial and operational components necessary to produce the outcomes expected of municipalities.
The DPME then briefed the Committee on the progress towards reaching the targets in the delivery agreements for the priority outcomes. On many, there had been good progress. Some of the 2014 targets had been achieved, but others were unlikely to be achieved. This was attributable to external factors, such as the global economic downturn. Some of the targets were identified as very ambitious. The presentation covered only a limited number of key performance targets and so did not give a full picture of performance. The Departments of Education were cited as examples. It was noted that the departments’ annual reports, the mid-term review, the 20 year review and the development indicators provided additional information. A careful balance had to be struck to ensure that the emphasis on certain targets did not lead to neglect of areas for which targets were not set. It was suggested that the Portfolio Committees should be provided with progress reports against delivery agreements, and that the DPME should participate in further engagements. Members appreciated the frank nature of the presentation. They questioned, however, the sources of data and their reliability, whether departments took the recommendations seriously, and raised specific concerns about educational matters. They asked what action would be taken after reports were submitted, reiterated the roles of the various other portfolio committees, and asked how these reports would link to annual plans. The data gathering process, comparisons and future sources were all interrogated further.
Introduction by the Chairperson
The Chairperson prefaced the meeting by providing some background on the presentations to be given by the Department of Performance Monitoring and Evaluation (DPME) and a general introduction to what the assessment tools comprised. He further highlighted that the main challenge with government policies lay in implementation, and challenged the Members of the Committee to think about how the information shared with the Committee in these presentations may be broadly spread to enhance the work of others.
Department of Performance Monitoring and Evaluation: Municipal Assessment Tool briefing
Mr Sean Phillips, Director General, Department of Performance Monitoring and Evaluation, introduced the team who would be briefing the Committee.
Mr Hussen Mohamed, Outcome Facilitator and Project Manager: Municipal Assessment Tool, Department of Performance Monitoring and Evaluation, took the opportunity to introduce the new model of the Municipal Assessment Tool (MAT) and the approach. He noted that his presentation should be read in the context of why the local government sector was not performing as desired. He would illustrate how the model and approach would reinstate good performance on the fundamental aspects that would lead local governments in a more positive direction.
He emphasised the constant concerns about service delivery and said that a new threat had emerged, in ensuring functionality in addition to access to these services. Another key area of weakness within local government was institutional performance, and there were questions asked as to why there had been no results despite many years of capacity building initiatives. It was found that, even through the outcomes approach, the perception of performance was very simplistic and overly optimistic. At the municipal level, interventions in the internal workings of the organisations had been fragmented and ad hoc, as opposed to those at the provincial and national level, where the Management Performance Assessment Tool (MPAT) had first been introduced. The result was that actions in the local government area tended to be crisis-driven as opposed to proactive. He noted that the MAT would provide early warning signals, to ensure that the right management and operational conditions were used for a timely response.
Mr Mohamed said that it was generally acknowledged that the local government sphere was very complex in terms of its socio economic context, with high demands. It was suggested that a return to the core mandate of the institutions was needed, and a concomitant enabling environment must be created. Studies were cited that linked quality of management and the performance of organisations, in terms of both productivity and the quality of performance. He reiterated that the challenge with past initiatives was that they were neither targeted nor coordinated.
The new MAT would identify each municipality’s specific issues, ensuring a response that was appropriate to the requirements of that particular municipality. The effect of such coordinated and targeted interventions should ensure that national and provincial departments who were responsible for supporting municipalities would be able to offer better support. He stressed that the MAT was not to be seen as “yet another report” required of the municipality, and noted that a recent study had noted that municipalities already had to produce between 30 and 42 reports a month, many with duplicated information. MAT, by way of contrast, was an introspective device for management to determine what support was required from national and provincial sectors, as opposed to those other sectors performing assessments from outside. The MAT drew on existing processes, and would develop a holistic picture, instead of duplicating information.
Mr Mohamed confirmed that the tool was not to be seen as a way of introducing section 139 interventions. Various Municipal Performance Areas would be assessed, in the categories of integrated development planning, human resource management, financial management, service delivery, community engagement and governance. The ideal performance would be assessed, using key indicators for each category, and the criteria would be set out that would need to be progressively met when moving to meet the ideal.
Similar to the MPAT, four levels of management performance were to be identified. Each performance standard would be assessed against levels and the scores would be aggregated for each key performance area. The evolution of the programme involved the consultation of key stakeholders, such as the Department of Cooperative Governance (DCOG), National Treasury (NT) and others.
A steering committee would oversee the programme, comprising the DPME, DCOG, South African Local Government Association (SALGA), the Cities Network, and the NT Technical Assistance Unit. There was a broader reference group, which consisted of provincial departments of local government, national sector departments, and the Office of the Auditor-General. This group was in the process of being established, for broader consultation and participation, and should be particularly relevant during the moderation phases. The DPME was currently finalising the content of the Tool. By end September this would be automated into a user-friendly interface for municipalities. The pilot phase would run from September 2013 to March 2014, and the large-scale rollout would happen from April 2014 onwards.
The purpose of the pilot phase was to test the functionality of the tool and identify areas for improvement. The participating municipalities were selected on the basis of the keen interest shown during the formative stages of the tool, particularly in the North West and Gauteng, coupled with considerations of inclusivity and balance. The pilot phase was to consist of organising and communicating steps, self assessment and internal verification, followed by moderation and feedback. The aim would be improvement and monitoring of implementation, with the requisite support.
The DPME was in the process of appointing staff and was headed by an official at level 15. One director had been appointed, and interviews for an additional two had been completed. Four deputy directors were being recruited and two administrative support staff had been appointed. With an initial contingent of over ten people, the pilot phase would provide an indication of additional capacity required for implementation on a national scale.
Mr Mohamed then noted the progress to date. Consultation on the tool for application in the pilot phase had been completed, comments were being incorporated and the final tool would be ready for application during the pilot phase at the end of September 2013. Eleven municipalities were to participate in the pilot phase. After that, there would be training of Municipal Assessment Coordinators of the four cities, on the tool content. In October 2013, Municipal Assessment Coordinators of the seven North West municipalities would be trained on content and technical application of the tool. Self assessments would ensue in mid October in the pilot municipalities. Members of the Standing Committee would receive the tool when it was completed in September 2013.
Mr M Swart (DA) lamented that the problem lay with municipal management and the fact that inadequately skilled or trained people were placed in management positions there. He enquired whether it was possible to add other municipalities to the pilot programme, particularly the municipality of George.
Mr Mohamed stated that the municipalities involved were not confined to Gauteng and North West, but included Cape Town in the Western Cape; Msunduzi (Pietermaritzburg); Buffalo City; and Tshwane. A further seven municipalities in the North West, as well as King Sabata Dalindyebo in Mthatha, and the West Rand were included. The capacity of the programme could only support these municipalities while it was in its development stages. Many more municipalities would be addressed in the following rounds.
Ms R Mashigo (ANC) wondered to what extent the MAT had consolidated existing information and processes. She asked whether the MAT would reassess all areas from scratch, or whether the DPME would build on existing assessments and recommendations. She was concerned that there should be some communication between the assessment tool and the observation of specific departments.
Mr L Ramatlakane (COPE) questioned the implementation period after the pilot phase. He also wanted to know about the relationship between the old reporting system and the new assessment tool. He asked for clarity on the objectives and how these objectives would address the challenges.
Mr Mohamed responded that at rollout stage, in the following financial year, there would be more certainty and stability than in the pilot phase.
Dr S Van Dyk (DA) echoed Mr Ramatlakane’s concerns as to how exactly the MAT would differ from the existing monitoring and assessment tools. He was of the view that the shortcomings, in both quantity and quality of services, were attributable to inadequately skilled management. If this was indeed so, then new assessment tools could forever be proposed to remedy weaknesses in the old tools. He asked for a more precise illustration of how the mechanisms employed by MAT would translate into tangible improvements in service delivery.
Mr Mohamed noted that it was generally acknowledged that skills were a major issue, and so regulations were promulgated to ensure that municipalities met with minimum competencies. Bodies with legislative and constitutional mandates endeavoured to ensure that the key staff recruited met with minimum competencies. MAT was to support this function. It was acknowledged that the basic objectives for municipalities were not novel, but in order for those objectives to be realised, new and dynamic practices were needed in management and administration. MAT was necessary for the identification of practices which would lead to achieving the desired objectives.
Ms L Yengeni (ANC) continued the point raised by her colleagues and summarily asked how MAT would be different from existing assessment tools.
Mr Mohamed reflected more specifically on the key question of what made MAT different from existing assessment tools. He stated that nothing akin to MAT existed at the moment, as specific departments made particular demands on municipalities, and the reports they required were geared towards those demands. MAT’s specific intervention addressed the internal managerial and operational components necessary to produce the outcomes expected from municipalities by departments in general. He added that the support from the various stakeholders was testament to the fact that MAT did not duplicate any existing measures.
Mr G Snell (ANC) required information on the type of linkage MAT would have to the Performance Management and Development System (PMDS), if indeed local government did have a PMDS mechanism.
Mr Mohamed explained that local government did not have as robust a system as the PMDS in national and provincial government. Instead, the planning regime served that purpose at local government level. The regime comprised the Integrated Development Plans (IDP) and the Service Delivery and Budget Implementation Plan (SDBIP), the equivalent of the Annual Performance Plan (APP), whose criteria translated into that manager’s performance. MAT would ensure that the mechanisms for meeting those criteria were present, and ensure proper monitoring as part of regular management activity.
Ms A Mfulo (ANC) enquired whether the Department had ‘teeth’ to enforce compliance. She also wondered if the tool took into account the accountability of staff in the lower ranks, as it seemed primarily focused on management. She asked how MAT would assist governance and management levels in monitoring municipality officials. Politicians often did not have access to information kept by such officials. She also asked who, at the moment, was monitoring the large number of reports produced by municipalities, and suggested that DPME, as a monitoring and evaluation body, perhaps needed to be involved in those reports and in ensuring compliance.
Mr Mohamed conceded that the present MAT did not have the ‘teeth’ to ensure compliance. It was posited that, as one way to obtain compliance, the DPME should formally present the findings of MAT at council meetings. This would also address the issue of political leadership not being aware of the condition of its municipal management and operational environment. It would enable better monitoring of whether the areas identified for improvements were being implemented. Although the tool had limited compliance mechanisms, the steering committee that would oversee the programme had the requisite authority to ensure compliance. The provincial departments of local government were key partners in this regard.
Mr Sean Phillips added that at the completion of the pilot phase, DPME would be suggesting that a specific condition be inserted in the performance agreements with municipal managers and other senior managers, that MAT scores must improve from year to year. Similarly, it was thought that the improvement of MPAT results should be included as a performance criterion for national and provincial heads of departments.
The Chairperson commented that MAT was a response to the Committee’s demand for a mechanism similar to the MPAT at a local government level. It was thus conceded that MAT would be useful. However, this tool could not operate in a vacuum. With the requisite cooperation, the implementation of MAT would lead to desired results. He highlighted the training of Municipal Assessment Coordinators in October, at the seven North West municipalities, and expressed the expectation that such training should not be confined to the North West. He also feared that the capacity of the unit to implement this programme would prove problematic. He suggested that the provinces should be actively solicited to provide support. Finally, tighter time frames after the pilot phase were expected, in the context of a clear programme of action.
DPME report on targets in delivery agreements
Mr Phillips said that this Committee had requested a presentation on the progress towards reaching the targets in the delivery agreements for the priority outcomes. The context was important and in a constitutional democracy the separation of powers, checks and balances were important. The executive had its own monitoring bodies, over and above external monitoring bodies in the legislative branch. National Treasury, DCOG and the Department of Public Service and Administration all had legislative authority for internal monitoring. In terms of the Constitution, the President and Premiers had the roles of coordinating their administrations, and monitoring was necessary for effective coordination. Therefore, the DPME complemented the monitoring work of other bodies. This presentation set out only one aspect of the Department’s monitoring responsibilities.
He added that the delivery agreements were a new concept, introduced in 2009, as part of the Outcomes system. This was to address insufficient coordination and the fact that government was over-extended. In 2010, performance agreements had been signed with Ministers, and there was extensive coverage in those of the content of delivery agreements. At the end of 2010, delivery agreements had been signed for all twelve priority outcomes. The agreements set out the outcome as well as the activities and outputs necessary, along with indicators and targets, and they also described the links between activities and outcomes, or the theory of change. The Department had also tried to ensure that the Annual Performance Plans of departments were aligned to these agreements. Over the years, there had been reviews and revisions to the agreements, based on monitoring and evaluation. The details of the delivery agreements were publicly available on the websites www.thepresidency-dpme.gov.za and www.poa.gov.za.
The Programme of Action website contained information on core indicators of progress against the delivery agreements. It also contained narrative documents such as the mid-term review, links to articles, specific outcomes and coordinating outcome departments.
The information on progress was applicable to the end of the fourth quarter of 2012/2013. The delay was due to the time needed to collect and process the information. In the limited number of key targets addressed, the focus was on the status of the outcomes and a brief analysis of progress, or the lack thereof.
He cited some examples. In the education and skills sector, the Grade Three literacy and numeracy results showed increases from 2011 to 2012: from 35% to 52% for literacy, and from 28% to 41% for numeracy. However, only 36% of learners scored above 50% for numeracy and only 57% scored above 50% for literacy. The 2012 Grade Six language results improved to 36%, up from 28% in 2011, while the numeracy results declined from 30% in 2011 to 27% in 2012. Only 11% of learners scored above 50% for Grade Six maths and only 39% scored above 50% for Grade Six language.

The provision of workbooks and textbooks reflected a 99.4% timeous delivery rate for textbooks and a 98% delivery rate for workbooks to schools. The target was formulated in terms of the percentage of learners who had access to workbooks and textbooks; however, because that data was not yet available, progress was reported in terms of books planned for delivery. The national Department of Basic Education (DBE) needed to introduce an improved monitoring system to measure learners’ access to textbooks. No comparative data was available on curriculum coverage, although it was noted that only 32% of learners in Grade Six met the standard of four maths exercises per week. On access to Early Childhood Development, the indicator showed that 87.8% of Grade One learners in public schools had attended formal Grade R.

The 2014 target for the unemployed in learnerships had been achieved, but the challenge of placing learners in sustainable employment persisted, due to the economic slowdown and weak relations between training institutions and industry. Further Education and Training (FET) college graduate pass rates showed significant progress towards meeting the target, yet remained relatively low, and the quality of training provided by the FET colleges was a challenge. The employment rate in the first quarter of 2013 had increased by 0.1% from the 2009 level, while the share of youth in employment had declined from 57% in 2009 to 53.5% in 2013.
The context was that South Africa had lost a million jobs in 2008-2010, but had regained approximately 700 000 since then. Overall, employment increased in the first quarter of 2013, yet unemployment remained high at 25.2%. This had not improved significantly since 2009, as the economy had not been able to absorb existing and new entrants quickly enough. Poor education outcomes, the lack of skills development and weak growth contributed to high unemployment levels. Recommendations were posited for the priority outcomes addressed in the presentation.
Mr Ramatlakane commended the frank nature of the presentation. He enquired as to the sources of data and the inter-changeability when data measurement units might not be identical. He asked whether such information was shared with legally authoritative monitoring bodies, such as National Treasury. He also wanted to know if the recommendations made by the Department following the analyses for each outcome were taken into account by the corresponding departments. He echoed Mr Phillips’ sentiments that it would be useful to invite specific departments to Committee meetings, where the monitoring information presented could be utilised as a reference point for further engagement.
Ms Mfulo required clarity on the school starting age, and the recommendation that if the target were to be formulated as a percentage it was more likely to be achieved. It was also not clear what was to be done about the problem of excess teachers. She noted that three departments were involved in Early Childhood Development (ECD) but that there was still very little success. She was of the belief that learnerships should be linked with FETs to ensure an impact on placement. The labels of “skilled” and “semi-skilled” were questioned, in light of workplace experience versus academic qualifications and it was posited that the monitoring body interact with the Department of Labour in this regard.
Dr Thulani Masilela, Outcome Facilitator, DPME, confirmed the school starting age as seven years, and clarified that the indicator portrayed how many children had actually attended Grade R. With regards to data triangulation, the DBE was not the single source relied on, but the general household survey was consulted. On the curriculum issue, it was noted that DBE had tried to monitor this area but the DPME had found it unsatisfactory and thus had instructed DBE to rework the monitoring system. The DBE had committed itself to working with DPME to produce a survey which would investigate and produce accurate data on curriculum coverage.
Mr Swart thanked the Director-General for a good presentation. He stressed that DPME was a department doing monitoring and evaluation, and could not be instructed in relation to its findings. This was the function of the Portfolio Committee of each particular department. For this Appropriations Committee the outcomes were significant, as the crux of the matter was the expenditure involved.
The Chairperson reminded Mr Swart that in this case the Committee did act as the Portfolio Committee for the Department of Monitoring and Evaluation.
Mr J Gelderblom (ANC) agreed with his colleagues, and congratulated the DPME on a good presentation. He requested that the document be circulated to the different provinces to counter the belief that nothing was being done by government. The issue was what action would be taken after such a report, having recognised that the onus lay on the Executive and the responsible departments. He also agreed that the Portfolio Committees were responsible for the monitoring of implementation of these recommendations.
Mr Phillips said that a key step forward would be taken if Portfolio Committees responsible for line function areas in Parliament were to use the information as part of their oversight tools. The information was also presented to Cabinet on a quarterly basis. As indicated in the presentation, part of the process required changing the culture in government to a culture of frankness, but this would not happen overnight. The answer on what action was needed now was to have a continuous process of emphasising results and indicators, and being frank repeatedly, until the culture was changed, over time, through Parliament but also in the Executive.
Mr Snell was impressed to see that some of the issues where there had been strong influence at central level had borne results at provincial level. He asked whether allocating responsibility for key activities, as far as possible, to individual departments, so that these could then be included in the APPs, meant that responsibilities could also be allocated across departments, and said it was important to move away from departments operating in silos.
Mr Phillips stated that there should be a negotiated process, within the purview of the Medium Term Strategic Framework (MTSF), for the identification of the key actions to be undertaken by individual departments, pursuant to the National Development Plan (NDP), at provincial level, given the APPs and the concurrency on some functions. At a national level, and as part of the Presidency, the DPME, together with National Treasury, was able to work with national departments, and include some of the actions in the APPs. The national level departments would then in turn need to engage in a negotiation process with their provincial counterparts. For concurrent functions, such as Basic Education and Health, the meeting of an NDP target was dependent on the provincial level. The target in a national department’s APP comprised the sum of implementing targets in the provincial departments. All the key actions would be sourced from the NDP, New Growth Path, Industrial Policy Action Plan, and Infrastructure Plan.
Ms Mashigo asked whether the accreditation of, and lack of public confidence in FETs were taken into account as indicators by the Department of Higher Education, and the concomitant negative effect these had on employment. She asked to what extent the Department had interrogated these issues when presenting its findings.
Dr Masilela said, in relation to learnerships and FET graduate placements, that the DPME had raised with the Department of Higher Education and Training (DHET) the concern that non-placements were caused by employers’ lack of confidence in FET qualifications. This had spurred actions such as the improvement of lecturers’ skills and agreements with industries on workplace experience. Recognition of Prior Learning (RPL) remained an issue, as many candidate artisans failed to obtain their qualifications for lack of training. As a result, DHET had commenced a programme of pre-screening that would alert students to their shortcomings. For access to textbooks, the DBE had attempted to develop its own monitoring system, but that did not meet the standards of DPME. Subsequently, DBE had undertaken to work with the DPME on the development of an adequate monitoring system.
Ms Mashigo questioned the basis on which data was calculated, particularly in regard to employment statistics, the role of informal employment and the small, medium and micro enterprises (SMMEs).
Mr Rudi Dicks, Outcome Facilitator, DPME, provided information pertaining to total employment within the SMME sector, and explained that the indicator looked at the percentage of employment in SMMEs that employed 50 people or fewer. The development of the indicator over time was not shown, but in the first quarter the SMME share of total employment had declined to about 53%. Informal activities played a critical role in employment, and SMMEs were important because of their higher degree of labour absorption; they were generally the drivers of employment.
Ms Yengeni appreciated that it was difficult to obtain an accurate number of children without textbooks, as the Department of Basic Education used estimates. That Department had also said that it was unable to ascertain curriculum coverage; cumulatively, this pointed to the problem that the DPME was reliant on other departments for data, which it would have difficulty in verifying. It was assumed that the Department of Basic Education would have the capacity to monitor the amount of work done on curriculum coverage against quarterly targets, and where no such capacity existed, the DPME should address this by obtaining accurate information. On employment, she commented that it would have been helpful to have the initial numbers as well as the decreases and subsequent increases, for comparative purposes, as the presentation currently gave totals and percentages with no points of reference. This information would prove useful when Members visited their communities. She felt that the mandate of the DPME should not be merely to restate information given to it by other departments; the creation of the DPME was an indication of the government’s intention to independently verify and ascertain data.
Mr Phillips spoke to the sources of the data, saying that the DPME had tried to use the best data available and had a dedicated unit focused on the quality of the data across government. Data forums were held on the outcomes, when DPME met regularly with those responsible for producing the data in each department, and, through the forums, assessments were done on the quality of data. The DPME had, as set out already, voiced concerns, where relevant, about the quality of data. There was an acute awareness of the time consumed by public servants routinely producing masses of reports, but not having enough time to spend on actually working, and this was a substantial problem particularly at municipal level. DPME did not intend to add to the over-reporting problem. To avoid duplication, efforts at DPME were concentrated on ensuring that departments had put good monitoring systems in place. That would be done through the data forums and other engagements to ensure effective data collection systems across all sectors and levels. The DPME, as a central body and custodian of monitoring, would monitor those information systems and provide assistance with the improvement of those systems. It had been emphasised that the DPME had engaged with the Department of Basic Education in relation to improved monitoring systems, to get clarity on curriculum coverage and access to textbooks.
Mr Phillips added that the 25% unemployment figure was obtained from Statistics South Africa’s quarterly labour force survey. He took the point that the DPME should not be a mere “postbox” and accepted that a key duty was to provide an independent view. The quality of the data was assessed, and the Auditor General’s reports and independent sources were used. They were triangulated, to test the reliability of data provided by departments. He would ensure that DPME would report on progress in reliability of data at future meetings.
Mr Dicks added that the DPME would have to respond to the question of the progress made from one quarter to another, in terms of employment. Shortly before the financial crisis of 2008, total employment was just over 14 million, so the recovery of employment figures to 13.6 million now represented significant progress. Analysis from the first quarters of 2012 to 2013 showed the creation of around 200 000 jobs. There had been a slight improvement in the level of unemployment, but this was not significant enough to have altered the percentage, which remained at 25%. The largest growth in employment, surprisingly, was seen in the agricultural, manufacturing and mining sectors. Job losses were experienced in construction, some utilities, as well as wholesale and retail. In the first quarter there was, overall, a net increase in employment due to growth in the sectors mentioned. The Income and Expenditure surveys were used, as well as data from the General Household Survey, in reaching employment, income and poverty figures.
The Chairperson posed the question of unsigned delivery agreements and the actions taken.
Mr Phillips clarified that all delivery agreements were signed by the end of 2010, and that all the Ministers had signed performance agreements with the President. Some of the delivery agreements were perhaps too detailed and ambitious.
The Chairperson concluded the meeting by emphasising the importance of the report. He requested that it be made available to this Committee, from where it would be made available to other respective portfolio committees. The novel nature of the DPME was seen in that this kind of report would normally necessitate direct engagement with other departments, but this report encompassed all Parliamentary work.
The meeting was adjourned.