National Evaluation System: Department of Performance Monitoring & Evaluation update

Mineral Resources and Energy

19 June 2013
Chairperson: Ms F Bikani (ANC) (Acting)

Meeting Summary

The Department of Performance Monitoring and Evaluation (DPME) briefed the Committee on the National Evaluation System and the forthcoming call for departments to be included in the evaluation cycle. It was noted that although portfolio committees would undoubtedly benefit in many ways from evaluations being conducted, it was the departments themselves who must formally request the evaluation, as they must buy into and own the process. The evaluation system was placed into the context of all the work that DPME did, and it was noted that this was a vital tool in helping Parliament to have an in-depth look into how policies and programmes were performing, and into areas that needed to change. It was apparent that only 44% of departments were complying with basic legal prescripts, and only 13% were actually using the evaluations to look at what was or was not working, to inform their policy making and budgeting and therefore to improve. Assessments would help departments achieve more with the same resources, and could, when used properly, be used to motivate for extra budget. The reasons why it was important to evaluate were set out, which also indicated that evaluations assisted managers, decision makers, the public and the whole learning process. Evaluations were possible at a number of stages of a programme, and the different types of evaluations, from pre-design (diagnostic), design, implementation and impact to economic evaluations, were described and explained. When a proposal was submitted, an Evaluation Technical Working Group would do the selection, and the evaluations would be publicised. In this year, the plan was that fifteen departments or significant projects would be evaluated. In order to ensure independence, DPME and the department would be represented on a Steering Committee, and an independent evaluator would be appointed. DPME would contribute R750 000 to each evaluation, but the department selected must find the rest. After the evaluation, the department must draw up, and then implement, an improvement plan, with DPME doing six-monthly reporting. It was suggested that this point was also particularly important for close interaction and follow-up by the portfolio committees, not only of ongoing improvements, but to ensure that certain matters were included in the Annual Performance Plan.

The typical cycle for evaluations was outlined, and the recently completed evaluation on Early Childhood Development was used to illustrate how a report would look and be used. It was noted that it was not only DPME who would do evaluations, as departments could do their own, if they had capacity. However, DPME had identified 130 evaluations done in the past, managed to get reports on only 93, discovered that 10 of these were not evaluations but research, and found 12 of the remainder so seriously flawed that they should not have been used for decision making. DPME, as part of its portfolio, had developed quality assessment tools, was trying to offer training, was encouraging provinces also to do evaluations, and summarised the importance of cross-cutting studies and of an emphasis on departments trying to find an entirely different approach. DPME ideally sought to do evaluations in government priority areas, with large or strategic projects. Interest was rapidly developing, and design evaluations in particular had the potential to make a huge impact. It was stressed that the information must be used and departments must be held accountable.

The Chairperson emphasised the importance of cross-cutting and well-coordinated work, and commented that she thought it would be useful for DPME also to evaluate Parliament itself, as she thought there were many areas where improvements were needed. Members asked what factors DPME would consider when making its selection, whether Cabinet needed any special expertise to approve the plans of DPME, what kind of training it offered, and how independence could be achieved. Members suggested that a design evaluation of the MPRDA Amendment Bill and of the changes to the Mining Charter, an evaluation of the impact of safety stoppages at mines, and an implementation evaluation of the licensing process, particularly for small miners, were all vital issues, but after further debate, the Committee decided to think further and prioritise particular issues, particularly in view of the Mining Charter Workshop in the first week of July. The DPME pointed out that it was possible to stagger the requests, or to isolate some issues that the Department of Mineral Resources could be asked to self-evaluate, and that a closer rather than broader focus was ideal, using stratified samples. Members asked how provincial planning could be encouraged and whether there was any compulsory aspect. The Committee would now formally request the DMR to make the necessary proposal for inclusion, but accepted that this would be an ongoing process.
 

Meeting report

Chairperson’s opening comment
The Acting Chairperson noted that although the State Diamond Trader was due to present this morning, it was unable to attend. However, since one of the matters arising from the meeting with the Chair of Chairs was the issue of evaluations and monitoring, and Members’ role in this process, it was apposite that the Department of Performance Monitoring and Evaluation (DPME) could give a presentation on the National Evaluation System now. The Committee would be working on the Mining Charter and the review of the Minerals and Petroleum Resources Development Act (MPRDA), on which an amendment Bill should shortly be presented, and these might be good areas for DPME to consider in its evaluations process. There were likely to be substantial changes arising from the input at the public hearings on the Mining Charter. This Committee had also met with the top ten mining companies, other stakeholders and the unions to hear their input on possible remedies to the problems in implementation of the Charter. To her mind, the Mining Charter and MPRDA could not be separated. The Department of Mineral Resources (DMR or the Department) would be asked to be part of the process, and the Committee wanted to see this Department improving its evaluation processes and oversight.

Department of Performance Monitoring and Evaluation: Update on National Evaluation System and call for evaluations
Dr Ian Goldman, Head: Evaluation and Research, Department of Performance Monitoring and Evaluation, noted that he would, in this presentation, speak to the whole system of national evaluations. However, he wanted to stress, at the outset, that if this Committee wanted an evaluation to be done on the areas mentioned by the Acting Chairperson, the Committee would have to ask the DMR to request the evaluation, by the end of the following week. The DMR would be required to fill out a form, which should take a matter of hours to complete.

He was very encouraged that the Committee was taking this initiative and using this important source to help it in its oversight process.

Dr Goldman said he would only present on one of the five main tools, namely the evaluations. The others used by the DPME were outcomes monitoring, with quarterly reports to Cabinet; assessment of management practices; frontline service delivery monitoring; and municipal performance monitoring. All of those were potentially available to the Committee. However, he would concentrate on the national evaluation system.

Evaluations provided a very important tool for a portfolio committee to get an in-depth look at how policies and programmes were performing, and on areas that needed to change. It was important, for improving performance, to ensure that the evaluations looked at and attempted to solve the problems. He reiterated that this Portfolio Committee could ask the DMR to undertake the rigorous independent evaluations process, but that the deadline for requests to be included was the end of June. Not all the requests would be selected, but there would be feedback.

Dr Goldman said that he had outlined the Management Performance Assessment Tool (MPAT) previously, which used about 31 indicators. He tabled these, explaining that level 3 meant compliance with what was expected of a department, using management reports to track progress regularly. Level 4 meant that the evaluations were being used to go beyond the basics to get a greater understanding.

44% of departments were complying with legal prescripts, and 44% were sub-standard. Only 13% were using evaluations. That meant that very few were in fact using the opportunity to look, in depth, at what was working, and why, to inform their policy making and budgeting, which meant that they were losing out on the opportunity to improve. In Chile, where there had been a national evaluation system for some time, it had been found that 51% of all programmes were not delivering as effectively as they could, and that implied that the situation would be far worse in South Africa, which was less organised. All the assessments were essentially a tool to help departments do more with the same resources. They could, when used properly, also be used to motivate for more budget.

The question was asked “Why evaluate?” and the answer was, firstly, to improve policy or programme performance, to have more impact on citizens; the core target group here was the programme managers. The second reason to evaluate was to develop accountability, to decide whether the spending was making a difference. Thirdly, decision makers could use evaluations to improve their decision making, and to help them decide, for example, whether certain interventions should be continued, their implementation changed, or their budget allocations increased. Fourthly, evaluation was important for generating knowledge for learning. Government in South Africa tended to have poor institutional knowledge, and the DPME often found itself battling to get information, and tended to get it from service providers rather than from government.

He emphasised that evaluations should not only be done when something was finished. In fact, they could be done at any stage, and the most important evaluation was probably the one done prior to design, to ensure that the design was based on sound evidence, would get to the root of the problem, and considered what options may inform the design process. This was a diagnostic evaluation. Too often, policies and programmes were developed without this and had no solid foundation.

The next type was a design evaluation, to check whether the design would work, rather than waiting for it to run and only then pick up problems. DPME was trying to make sure that all departments had the capacity to do this evaluation. The essential question was whether the design would achieve the right results. This would not be part of the National Evaluations Plan (NEP) but the DPME would be building capacity.

The third stage was the implementation evaluation. Whilst it was accepted that the full impact might not yet be measurable, it was necessary to ask what was happening, whether this matched expectations, and why. This could be a rapid exercise, although for a large project the evaluation would have to be much larger, and it would typically be done a year or so after the project had started.

The fourth stage was the impact evaluation. It measured whether the spending was a good use of money, and, in the case of something such as the Mining Charter, it would look at whether the Charter had had an impact on the ownership and management of small businesses at mines. Impact evaluations were complex and sophisticated. It was necessary to separate external factors, such as the gold price, the rand value and so forth, from what the department was doing. Ideally, if there were new interventions, departments should be asked to call for impact evaluations, even if the process had already started. It was necessary to draw comparisons between one group and another, and assess the real reason for the difference.

Finally, the economic evaluation looked at the cost benefits and cost effectiveness of what was being done, and whether the results merited what was being spent. In most evaluations, this question must be answered at some stage.

He reiterated that it was the departments themselves who must submit proposals for evaluations by the DPME. That was because, at the end of the day, they must buy into it, and they had to own the evaluation and implement the findings. Although National Treasury or Parliament could propose evaluations, departments must normally submit. Importance was attached to cross-government selection, and the Evaluation Technical Working Group would do the selection. It would be meeting on 18 July to select the 2014/15 evaluations.

All evaluations would go public, unless there were security concerns, and that was why some departments were reluctant to submit. They would also go to Cabinet, which would approve the overall plan. In this year, the plan was for 15 departments or significant projects to be evaluated.

To ensure independence, DPME would not do the evaluation itself, but would work with the department, constitute a Steering Committee, and appoint an independent evaluator. Evaluations where several departments were involved were particularly useful. The decisions on evaluations were made by the Steering Committee, representing the partnership, not the department alone, to ensure that expedient and non-political choices were made. In most cases, external service providers would do the evaluation. To try to ensure quality, various systems had been developed: an evaluation panel of 42 service providers, with minimum qualifications; 200 staff trained by DPME in the last year; and various standards, guidelines, a quality assessment tool and joint funding. DPME budgeted R750 000 for each evaluation from its own funding, but evaluations generally cost around R1.5 to R3 million, and the relevant department would co-fund. After the evaluation, an improvement plan must be drawn up, which would be monitored. Dr Goldman noted that the first evaluation, on Early Childhood Development, had now led to an improvement plan, and the first six-monthly report on that was shortly to be delivered.

He outlined the typical annual cycle for evaluations. The first stage, for this year, was the call for evaluations, which set out the process and what had to be submitted, by the set date (July 2013). This was followed by the selection. Once that was done, the DPME would be refining the concept and setting the key questions and core concepts. Cabinet would approve the plan in November or December of 2013. DPME hoped to start on the terms of reference almost as soon as selection happened, and hopefully would be able to commission in January. By the following April it should be on its way. The evaluations would be done, usually taking about six months, with a completion date of September to March of 2014/15. Once the report was received it would have to be approved by the Steering Committee. He said that the Steering Committee was not being asked to comment on the findings, but to approve that the report was technically sound and the process had been correct. The department would be given a month to comment (by way of the management response) on the findings, and this would be publicised after the next stage, the approval by the Cluster and Cabinet system. Even if the department chose not to submit comments, the report would still be publicised. The improvement plan would then have to be drafted, which could take about four months from approval date, and that may involve more than one department.

He summarised that this meant that if this Portfolio Committee were to send a message to the DMR encouraging it to motivate for an evaluation, the results could be expected around one year later; instant results would not be produced. However, one of the questions asked in the evaluation documents was whether the department was about to face any critical developments or factors, around which the evaluation could build. He emphasised the importance of planning ahead. It was possible to get a department to do its own evaluation, in less time, if it had the capacity, but through the DPME system it should be expected that the work would only start about six months after the submission in response to the call.

The DPME had already had two NEPs approved, as set out in a glossy booklet that he showed to the Committee, which was also published on the DPME website. The policy on evaluations had been approved only in November 2011, but there were already 23 evaluations under way. Reports on these should now be expected fairly regularly, every one or two months. DPME was also aware that some departments had done their own evaluations, and had asked for those reports. 130 evaluations were identified, but it was able to get reports on only 93 of them, and discovered that ten were not really evaluations, but research. Of the 83 that were then tested, 12 were found to be so flawed that they should not be used for decision-making.

The DPME had developed a quality assessment tool for all future assessments. Provinces were developing their own plans, and the Western Cape and Gauteng were already doing evaluations this year. It was important that the provinces ask the same kind of questions as the national government. He noted that departments which had already done evaluations were performing noticeably better than others.

Dr Goldman then tabled a copy of the report on Early Childhood Development (ECD), as an example of how a report would look. The first few pages would contain a policy summary, intended for the Ministers and Parliament, which should clearly set out the choices that policy makers had to make. This was followed by an executive summary, with the recommendations, then the main report. Ideally, the policy summary should be one page, the executive summary three pages, and the main report 25 pages, to make it more readable. (He commented that another evaluation study had been done by the Department of Basic Education on Grade R, but it was not written in a readable style and was unlikely to be used.)

Dr Goldman summarised, as an example, the findings of the ECD study, and emphasised that some of them were entirely different from what had been expected in the past. They showed that other departments needed to play a greater role, that certain legislation needed to be changed, and that services and quality of coordination had to be improved. The findings therefore addressed policy, implementation and funding. The next evaluations, which would soon be made public, included a report on Grade R and on Business Process Outsourcing, which would significantly affect the future design of programmes.

He outlined the status of the current ongoing evaluations, which covered the sectors of Basic Education, Rural Development, Health, Human Settlements, and Trade and Industry, with little emphasis on the economic sector, but the plans for 2013/14 were more focused on the economic sector, covering various different fields (outlined in the attached presentation, slides 22 to 24). The DPME was taking one cross-cutting study forward, which looked at the outcomes approach of government and evaluated the government coordination systems, including the cluster and MinMEC systems.

Dr Goldman said that doing 15 evaluations a year was really a drop in the ocean, as every department should be doing about five a year, which the Departments of Trade and Industry (dti) and of Science and Technology (DST) were already doing.

There were already some recommendations for the 2014/15 evaluations, which included a proposal on revitalisation of irrigation schemes, an evaluation of the Funza Lushaka bursaries, and an evaluation of the Ilima Letsema agricultural scheme and small scale farmers. In 2015/16 it was proposed that evaluations be done into land care, the National Rural Youth Service, the new school curriculum and DPME's own evaluation of the impact of implementation of the national evaluation system.

The call for the following year’s proposals emphasised that the DPME was essentially seeking evaluations in areas that were government priorities, notably the 12 Government Outcomes and the National Development Plan (NDP). They should be large projects, preferably worth over R500 million, or affecting 10% of the population, covering a large footprint. However, a project could still be considered if it was not large, but strategic or innovative. If it was politically sensitive, the results of the pilot should go to Cabinet for approval. Another possible factor to take into account was whether this was a “hot topic” with a need to get more information. He reiterated that DPME would make available R750 000 for the budget, and that the evaluation would take a couple of years to complete.

The relevance of evaluations to portfolio committees was, firstly, that the DPME repository would provide 71 evaluations, which could be a source of evidence. Secondly, because management responses were called for within one month, it would be useful if these were also reported back to the relevant portfolio committees, so that the committees would get the chance to interrogate the projects. The departments would be asked to develop improvement plans, and committees could ask for regular updates; the DPME itself would be calling for six-monthly reports. Lastly, the committees could make suggestions to departments regarding priority areas for evaluations, so that the DPME could put the spotlight on those.

Dr Goldman concluded that the system was still new, but was developing fast, and interest was growing. The design evaluations would potentially have a huge impact, as all departments should go through them. Many countries were looking at South Africa to see what it was doing. A challenge may emerge as evaluation reports were being finalised and the focus was shifting to improvement plans, and he emphasised that at this stage the role of the portfolio committees would be particularly important, to ensure that all the findings had been captured in the improvement plans.

He reiterated that the evaluations were an important tool to highlight to portfolio committees how policies and programmes were performing. It was important that the information be used, and that departments be held accountable. If the Mining Charter and MPRDA were so important, then he would strongly suggest that this Committee should ask the DMR to submit proposals. The DPME also would do progress updates to Chairs of Committees, perhaps using the Cluster system. Specific departments with specific evaluations would be reported on separately.

The Acting Chairperson added that there were often cross-cutting issues, where one Cluster might have the National Treasury, Water and Environmental Affairs, and Mining and Energy committees all interlinked through legislation, or where a particular form of action on service delivery was needed. She emphasised that the approach should also be cross-cutting, to avoid duplication of work. The aim of monitoring and evaluation was to demonstrate progress rather than repetition and waste of money, which made it very important to have a strong coordinating system.

Discussion
Mr C Gololo (ANC) said that ratings agencies (such as Moody's) were constantly evaluating South Africa. He asked what kind of factors they were looking at when making their assessments, and how DPME would look at the issues, to ensure that South Africa experienced no downgrading.

Dr Goldman responded that DPME had nothing to do with this; it was more a National Treasury issue. National Treasury and the Department of Economic Development would look into sustainability, but rather than looking to external reports, they needed rather to examine the core issues around sustainability and the economic transformation in the Cluster. They should be asking how the country was dealing with economic matters, and whether an evaluation was needed of the impact, and whether the progress was acceptable.

Mr Gololo thought that the approval of the Plan for DPME’s evaluations would require capacity and expertise. He noted that all evaluation reports went to Cabinet for approval. He questioned whether Cabinet would need any special capacity to be able to approve the plans. He said that there was experience from the past, such as the Pebble Bed Modular Reactor, where the initial plans were approved, a substantial amount of money was expended and then the project was discontinued.

Dr Goldman said that when the DPME approached Cabinet, the latter was not asked to approve the details of the evaluations individually, but rather the process by which the 15 programmes had been selected, and whether, in principle, the evaluations on those 15 should proceed. The Evaluation Report was another issue. Here, the DPME was developing training courses for government departments to be able to read evaluation reports effectively and critically. That training could be useful for Parliament also, so that many more people would know how to compare reports, how to assess reliability and so forth. That was one of the reasons why DPME was also trying to ensure, in its reports, that the findings were clearly set out.

Mr J Lorimer (DA) thought that this was a laudable, if difficult, programme, and it would be good to see all prospective legislation being independently evaluated. He commented that he thought it must be hard to achieve independence, and it was no doubt equally difficult for bureaucracies to examine themselves critically. This made him a little sceptical as to how well the process would work.

Dr Goldman said that there were many measures built into the process to try to ensure independence, and this had the support from Cabinet. He reverted to slide 9, which set out the answers to the question “Why evaluate?” He had thought initially that his Minister (at DPME) would have attached prime importance to the first consideration, but he had not, and had maintained that all four elements that Dr Goldman had outlined earlier were equally important. If there was “uncomfortable” evidence presented, the DPME would have to see how that was taken by departments. At the moment, however, it was getting positive feedback on the evaluations process. The question was, of course, how to take this through effectively.

Mr Lorimer said that he thought a design evaluation of the MPRDA Amendment Bill and of the changes to the Mining Charter would be ideal, and he would also like to see an evaluation being done on the impact of safety stoppages at mines, from a mine stability and safety standpoint.

Adv H Schmidt (DA) agreed that the monitoring and evaluation process was necessary, but suggested that it would have to be tied in with the 2014 five-year deadline for the Mining Charter. 

Dr Goldman said that this would imply that this evaluation was an ideal one to be promoted immediately. Some results could already be apparent fairly early in 2014 if the initial stages were completed by late January. He said that he was enthused by a culture of new learning, as opposed to trying to continue with “business as usual” for fear of making mistakes, and this was an ideal opportunity to do something different. The moment that the DPME started to ask a department what the core questions were represented the start of the learning process.

Adv Schmidt said that the whole implementation of the Mining Charter was “chaotic” at the moment. The interpretations put on it by the DMR and the mining companies were at odds with each other, and he did not believe that the DMR had the ability to monitor, or, if it did, he doubted that it was truly independent. The question of independence of the evaluation was therefore critical for this Committee. He was very pleased to hear of the evaluation, as this was the first time, in his years in Parliament, that this kind of initiative was being taken.

Adv Schmidt noted that there were 1 500 mining operations at the moment and more exploration activities, meaning that hundreds of companies were involved. He thought that it would be an enormous task to deal with this properly. There would have to be a sincere acceptance of the challenges, from both economic viewpoints. The Deputy President, who was now involved, had said that no stone would be left unturned in trying to address the problems in the mining industry, which would imply an investigation into the MPRDA, the Labour Relations Act and all other mining issues. The evaluation would be particularly important.

Dr Goldman noted the number of mining operations, but said that it would be necessary to take a stratified sample, spread across the different types of enterprises, which would also have to be representative of the broader population and of the different activities, including those exploring and those implementing. This was all part of the research process.

Ms L Mjobo (ANC) noted that provinces needed to develop their provincial evaluation plans. If they did this, she wondered what would happen with their funding, and whether this was a mandatory process.

Dr Goldman replied that it was not compulsory, so this raised issues around the accountability process. He was assuming that the provinces would also buy into and take on the improvement plans, which would then also give the provincial legislatures the opportunity to get reports. It was possible to do some very strong naming and shaming. It was also possible to ask that, in the next Annual Performance Plan, certain issues must be specifically addressed, and to link that also to the performance agreement with the Director-General. Those were all ways to implement effectively without creating separate legislative obligations.

The Acting Chairperson said that, over and above DPME looking at the departments, she thought that it should be looking at the way that Parliament was run. There had been many shifts from the way in which Parliament used to operate, to the way it was operating now, the most obvious being the need to step up to technological improvements. Parliament’s approach to oversight could also be improved upon, before even talking about the possibility of policy changes. The way that Parliament did things needed, in her view, to change.

Dr Goldman said that the whole DPME role in relation to Parliament was evolving and something similar to what she had proposed could be piloted as the DPME explored further how its role would be played out. Many of the information products that DPME was producing were indeed very important for Parliament. There were many aspects that would be explored, such as the Clusters, how to report regularly, and how products would be made available. There had been a first interaction between DPME and the Parliamentary Researchers recently, and DPME wanted to do something similar with the Content Advisers. DPME would be able to inform the Committee what was coming up and how the committees could be kept involved.

The Acting Chairperson asked if the DMR wanted to comment.

Mr Andre Andreas, Chief Director, Department of Mineral Resources, said that he did not wish to make any comments at this stage but he would be reporting to the Department in full on the meeting.

Mr Lorimer asked if the Committee should recommend that DMR ask for inclusion in the evaluations.

The Acting Chairperson confirmed that she would write a letter requesting the DMR to initiate the process to be included in the next round of evaluations. In a way, this was a Committee call, and a slightly different way of approaching oversight.

The Acting Chairperson noted the suggestions of Members about the topics that could usefully be included in the evaluation, but added that there was also a problem with licensing in the DMR. Because the work was so widespread and this Committee had such a short time for oversight, the smaller industry players, in particular, were expressing frustration because they could not get their licences. Part of the budget improvements of DMR was intended to enable it to put a new licensing system in place. The Committee needed to understand that system, how it was intended to close the gaps and how it would talk to the very small mining industry players. The turnaround time for the licences had to be shortened, and the question was how this would be done. Capacity within DMR, and whether those in place were working effectively, also had to be examined. More specifics were needed on that.

Mr Lorimer fully agreed with the Acting Chairperson.

Adv Schmidt quipped that he now thought that more than one evaluation was needed. The Mining Charter had originally been a “gentleman’s agreement” but later became a compulsory document that must be complied with in terms of the MPRDA. There was, however, more than one stakeholder, as it also involved the MIGDIT, a coordinating structure with labour, industry and the DMR, and communities. A proper evaluation on whether the Mining Charter was successful would mean reading the Charter against the scorecard and debating the issues with the communities. A multi-faceted approach was needed, and this was likely to deviate from the standard operating procedures used by the DPME for other departments, because that Charter involved agreement by all the stakeholders, and because it was intended to improve the “sustainability” of mining for as long as possible, although this was perhaps a contradiction in terms.

The Acting Chairperson noted that the Committee Researcher was present, and said that it would be useful if he would now start to coordinate all the information, as it was clearly necessary for the DPME, the Committee and DMR all to work together. She thought that another meeting would be needed to set the approach. She agreed that both the Mining Charter and MPRDA would have to be considered together, and that the licensing and mine health and safety issues were all interlinked, but said she would need more time to think about the best approach. Thinking out loud, she suggested that the Committee and DPME should develop a joint plan of action, to ensure that everyone was involved. This was unlikely to be a process that would be completed in a short time. The MPRDA amendments had not yet been presented to the Committee, and there was a need to look carefully into the dates. The workshop on the Mining Charter would soon be held, and this would help the Committee to decide what issues would be put forward.

Adv Schmidt agreed that this would be a major undertaking, and described it as not an event, but a process. In a sense, the two-day Workshop would be “the event to understand the process”, and after that, another meeting would be needed with DPME.

Mr Lorimer pointed out that the problem was that the deadline of end-June was before the Workshop.

Dr Goldman said that it was probable that not everything could be covered simultaneously in one DPME evaluation plan. However, the Committee could request the DMR to do its own evaluation, or ongoing evaluations, or it could also request it to put up another evaluation request to DPME in the following year, making the necessary budgetary allocations. He agreed that a design evaluation of the MPRDA might be useful, and not so expensive, but the design evaluation alone might be too small to be considered for the DPME plan. The licensing aspects seemed to suggest that an implementation evaluation would be needed, to look at how the system was working and the different players. He suggested that, instead of trying to collapse everything into one evaluation, perhaps the Committee should pick an area of initial focus. The purpose of the evaluation must be outlined, then the focal areas, and then specific questions would be asked, which would involve talking to a wide variety of people. It was important to focus on what the Committee wanted to find out, because the research methodology would depend on the questions that were asked. Members had mentioned labour, mine health and safety as aspects, and this suggested that it could be useful, for instance, to have labour representation on the Steering Committee, or representatives from the Royal Bafokeng community. He reiterated that the stakeholders had to own the findings and must all talk to the process.

Mr M Hlengwa (IFP) noted the deadline, and asked where the notices had gone, and whether the DMR had received them.

Dr Goldman said that letters had gone out to all Directors-General on 1 April, although he was aware that these letters might not always reach the right people, so they had been copied to the Head of Monitoring and Evaluation in each department also. The reason for the deadline was to maintain some discipline in the process. Although the deadline date was stated as 30 June it might, because the Workshop was later, be possible to ask for the submission by a slightly later date, but the selection was on 18 July and the DPME must have a consent notice from DMR at the least. After July, plans would have to be prepared by DPME for Cabinet, so the time limits had been chosen to enable the DPME to do its job properly. The deadlines were also important to fit in the budget windows. He reiterated that it was possible for this Committee to ask that DMR must do its own evaluations on certain topics. In relation to the budget, he noted that National Treasury would not give additional money for evaluations, and the money for this would have to be taken out of a specific evaluation budget. Departments should be budgeting for evaluation, as part of their planning processes.

Adv Schmidt suggested that the Committee should then perhaps work from the impact of the Mining Charter, the implementation criteria of the DMR, and then go back one step to assess the goals, objectives and criteria with which the companies must comply, including whether they were over-burdensome.

The Acting Chairperson said that the Committee was now essentially being asked to apply pressure, so that the DMR would attend to the necessary requests. She asked Mr Andreas to follow up on this with the Director-General. She commented that this had been a very informative discussion, and it was necessary for the Committee now also to evaluate its own processes and recognise that it could not achieve everything through its own oversight alone, particularly since reporting was not always effective.

Other Committee business
The Acting Chairperson read out correspondence and noted an invitation to the Committee to discussions on how mines were affected by the Land Act.

She noted that the Mining Charter Workshop would be held in Committee week, the first week of July, and a venue outside Parliament may be arranged, but this had still to be approved by the Chair of Chairs.

Another two weeks of oversight had been planned. However, it would be necessary to leave time for public hearings on the MPRDA amendments. Some public hearings might need to be held in the vicinity of mines, and she called for suggestions. If the opportunity arose, the Committee could then move on to oversight visits, perhaps in the Western Cape.

Adv Schmidt suggested that perhaps Rustenburg, Johannesburg and Cape Town might be appropriate for the public hearings.

The Acting Chairperson added that there were a lot of mines in Cape Town, and marine mining was another matter that had to be taken into consideration. She noted that the visits could be risky, and it was necessary to have a particularly strict programme and to avoid making any empty promises to any people at the mines.

The meeting was adjourned.
 
