Statistics South Africa Budget: briefing


Finance Standing Committee

16 May 2005


Meeting report

FINANCE PORTFOLIO AND SELECT COMMITTEES
17 May 2005
STATISTICS SOUTH AFRICA BUDGET: BRIEFING

Chairpersons:

Mr S Ralane (ANC) [NCOP] and Dr R Davies (ANC) [NA]

Documents handed out:
Presentation on Budget
Statistics South Africa Strategic Plan 2005/6 – 2009/10
Statistics South Africa website

SUMMARY
The Statistician-General provided feedback on the Finance Portfolio Committee's October 2004 recommendations, which had raised concerns about Statistics South Africa's (SSA) qualified audit report, its relations with the Statistics Council, the need for measurable objectives, the Census Replacement Survey and the slow progress in developing statistical capacity. He discussed SSA’s strategic plan and direction. Its priorities were the Community Survey, CPI direct price collections, the Income and Expenditure Survey, coverage of the whole economy, data management and information delivery, improvements to governance and the development of statistical professionals.

The following matters arose during the discussion:
- postal surveys were still used for most surveys, but efforts were being made to gather information via fax, telephonically and electronically;
- SSA had been criticised in the media for not accurately and timeously capturing statistics on the construction industry, but it was obtaining a list of all the civil engineering and construction projects that were underway;
- its quality control measures and methodology were being improved by reviewing its frames, consulting domestic stakeholders and statistical agencies in other countries, conducting repeat visits to individuals, and running parallel surveys to keep outdated information current; very detailed checking systems were in place and each statistical publication was checked by all before release;
- the 2006 community survey would not be replacing the census as the census provided a very fine level of geographic disaggregation;
- the composition of the Statistics Council was being finalised;
- the credibility and perception of its statistics were being improved by providing journalists with training on statistics to improve the understanding of the complexities of the field as well as hosting meetings within the organisation to boost confidence of staff;
- SSA complied with international statistics standards by abiding by the International Monetary Fund (IMF) Report on Standards and Codes (ROSC) and the United Nations statistics division guidelines;
- it planned to deal with the "high-wall areas" in conducting the income and expenditure survey by collecting prices directly from the point of outlet, and some indication was expected by June 2006;
- the cause of death survey was conducted at Statistics South Africa’s own cost and it was currently engaged in discussions with the Department of Health to recover the R10 million it had committed to the project.

MINUTES

Statistics South Africa briefing
The Statistics South Africa delegation consisted of Mr P Lehohla, Statistician-General; Dr R Hirschowitz, Deputy Director-General: Quality and Integration; Dr L Gavin, Deputy Director-General: Statistical Support and Informatics; Mr R Maluleke, Executive Manager: Statistician-General Office Support; and Mr T Oosterwyk, Manager: Communication.

Mr Pali Lehohla, Statistician-General, provided feedback on the measures taken to allay the concerns the Committee had had with SSA's qualified audit report, its relations with the Statistics Council, the need for measurable objectives, the Census Replacement Survey and the slow progress in developing statistical capacity. He also discussed SSA's strategic plan and direction. Its priorities were the Community Survey, CPI direct price collections, Income and Expenditure Survey, its coverage of the whole economy, data management and information delivery, improvements to governance and the development of statistical professionals. He also looked at its funding over the next three years (see document).

Discussion
Mr K Moloto (ANC) [NA] stated that the presentation indicated that, as part of SSA’s quality control measures, it planned to use postal surveys. He asked SSA to indicate the extent to which the information received would be reliable, if only the postal survey were relied on.

Dr Ros Hirschowitz, Deputy Director-General: Quality and Integration, responded that the postal surveys at the moment were not the only way by which SSA collected information. Questionnaires were sent via post to people who had a postal address, but if people in the business surveys preferred the questionnaires to be faxed, SSA had a system that allowed for faxing and emailing of questionnaires to large businesses. It had engaged in a series of personal visits to such businesses to improve response rates and, when it did not receive a response, it conducted telephone follow-ups. In general the response rate was over 80% for business surveys.

SSA placed particular focus on the large businesses that contributed approximately 80% to total turnover in a particular industry. Of those, SSA tried to collect as much as possible from the sample, but for the really large businesses it visited them personally and tried to get as much information from them as possible. The sample was stratified by size, depending on the financial turnover of the businesses. SSA also conducted a whole series of telephone follow-ups.

Efforts were being made to automate the telephone follow-ups so that SSA could capture information at the same time the phone call was made, rather than filling out the form and then capturing that information electronically. SSA was also exploring methods that avoided the recapturing of data received via email; reducing the amount of manual recapture would lower the incidence of mistakes in data capture.

Mr Moloto stated that concerns were raised by the media on SSA’s inability to capture timely and relevant information on the construction industry.

Dr Hirschowitz replied that SSA at the moment received plans for new buildings and completed buildings from municipalities, and this was its main way of measuring building statistics. SSA was however aware that it was underestimating certain aspects of construction, such as dams and other civil engineering projects. Its business register did not accommodate these because they were projects. Attempts were thus being made to obtain, in future, a list of all the civil engineering and construction projects that were underway, and to measure those. SSA had been informed by other statistical agencies that this was a very difficult process and that the whole world was struggling to accurately measure the contribution of the construction industry.

The Statistician-General added that the construction industry and its projects were very mobile by nature, and it could only be managed properly once an economic register of activities existed. SSA was thus largely dependent on the information management systems of others. In the absence of such information, it employed a census. It was thus quite feasible to improve the registers, such as the population and business registers.

Mr Moloto sought reasons for SSA’s decision against collecting data on foreign trade.

Dr Hirschowitz responded that at the moment these statistics were being collected by the South African Revenue Services (SARS) and the South African Reserve Bank (SARB), and not by Statistics South Africa.

The Statistician-General added that SSA must consider whether it could pilot such a project, but he doubted whether they could replace the SARS and SARB. Perhaps a collaborative effort could be established in time to come.

Mr Ralane asked SSA to explain whether it was making multi-payments for procurement of goods.

Secondly, Mr Ralane stated that the presentation indicated that the community survey pilot project would be launched in February 2006; yet this project was already underway in his constituency. The problems were that the people involved were trained for only one day and there were no incentives for them to perform the work. The failure to remunerate them or provide basic services created animosity amongst people, and would negatively affect the implementation in February 2006. He asked SSA to explain its efforts to ensure the project would be a success.

The Statistician-General requested that SSA be granted leave to investigate the matter. He stated that to his knowledge they might not be doing work in the area referred to by Mr Ralane, but if work was in fact being done, it would only have been a pilot project aimed at assessing the methods etc. It was a difficult question. SSA relied on stability when it compiled its statistics, but that did not mean that it did not conduct its work in the absence of stability.

Dr Liz Gavin, Deputy Director-General: Statistical Support and Informatics, added that clearly the quality of the information SSA collected depended critically on the co-operation and inputs of their respondents. She stated that she would appreciate an opportunity to follow up on this matter.

The Chair stated that SSA needed to indicate the clear measurable objectives that would provide a clear picture of improvement in the quality of statistics. The primary problem appeared to be the capturing of the data from which the sample was drawn, and he asked whether the recent cases raised in the media were due largely to inaccurate information gained through the sampling frame.

The Statistician-General responded that problems were caused when the incorrect frames were used for the collection of data. SSA and SARS consulted on the business frame, which had improved from the status in 2000, as the figures were more credible. SSA had received criticism in the media for the changes it was effecting, but the changes were aimed at improving the frames and the credibility of the statistics. The business survey was also being improved and making the registration of businesses mandatory would assist SSA in collecting the data.

Secondly, the address was also very important, but repeat visits became very difficult when SSA did not have the address. They were working with the Department of Communications and the South African Post Office to ensure that they were provided with an address register. This was however a long-term project that would span the next three years. The address register was particularly important for the success of the community survey. The same applied to the population survey, and in this regard SSA was working closely with the Department of Home Affairs.

The Statistician-General replied that he wished to apologise for sounding defensive in answering the questions that related to SSA’s quality control measures, but stated that the sharing of lessons with other jurisdictions would assist the Committee in understanding South Africa’s context. In January 2005, SSA was struggling with manufacturing statistics, and the issue of quality was raised. One of its key consultants recognised the magnitude of the issue, and reported that his counterpart office in Canada was experiencing the same problem. This clearly illustrated that the quality of the statistics produced was an inherent problem for any statistical institution that dealt with masses of data. Furthermore, automation at times created problems as well, and countries all over the world faced this problem from time to time.

Dr Hirschowitz stated that the first issue regarding quality control that she wished to speak about was the business register. SSA compiled it from data received from SARS and the Department of Trade and Industry, which was consolidated into a register of all businesses registered in South Africa. Samples were drawn from this register. SSA had fallen behind because its old address register system for businesses had been badly out of date before 2000, and the new business register had been in development for quite a while. In 1998, SSA requested certain legislative changes to grant it access to data from SARS, and efforts were now being made to work towards an integrated business register for the whole country in which each business had a unique identifier.

The biggest problem with the compilation of the business register was that a business was a legal entity, yet SSA then had to reclassify the business into a statistical entity or entities. SSA felt that it had completed an enormous amount of work on its business register. It was confident that, although the business register did not currently properly classify every single business, with an ongoing business improvement survey the classification of the businesses was as good as it could be at this stage.

When compiling economic statistics, SSA would freeze the business register once a year, and new samples were drawn for each survey. Parallel surveys would be run in which the older and new businesses were contacted, and the two would then be compared. To date this exercise had only been conducted twice, but they planned to conduct it on an annual basis. At times different figures were arrived at, but this was in fact ‘planned change’. Perhaps SSA did not handle publicity regarding these figures as well as it should. The register must be changed continually so that new businesses were given the opportunity to be selected within their own category. As the methodology advanced it should lead to fewer discrepancies.

The next quality improvement measure, a systematic method of assuring quality, was gained from their Swedish counterparts. SSA took each process in the statistical system and described each of the steps in as much detail as possible. This was an ongoing process. This system identified mistakes and improved systems.

Statistics South Africa also had very detailed checking procedures and every statistical release would be checked by everyone and signed off on. Furthermore, in the methods and standards division within the quality and integration directorate, an independent person who was not involved in collecting the statistics would actually check each figure. There were thus several checks in place, but they did not necessarily overcome all the weaknesses completely. One of the main reasons was that statisticians did not necessarily understand the economic implications or the social implications of the statistics. The issue was thus whether their statistics made sense and, if not, why? This was the most difficult issue to address. They had decided to host a staff seminar every Friday in which each division would present to the entire organisation to increase awareness of their activities within the organisation itself.

The Chair asked SSA to explain whether, in view of its proposal to move from the household census to a community survey, the door-to-door census was becoming obsolete. Not every house was visited during the census process as a sampling technique was used and, in fact, the Chair’s house had not been visited during 2001.

Dr Gavin responded that all of SSA's household-based surveys had a publicity component built into them, because it was realised that it was absolutely critical to receive quality information through co-operation from the community. In fact one of SSA’s specific activities was for a component of its Census research group to explore respondent attitudes to enable it to improve response rates in the next Census. This could identify means by which questions could be couched in a specific manner that would make people more comfortable in responding and would elicit accurate responses.

Statistics South Africa was very careful to refer to it as a community survey and no longer as a Census replacement survey. A census of the population in fact involved a visit to 100% of all the households in South Africa, but some households were missed. During the last census SSA had missed approximately 17% of households, and the Chair appeared to fall within that category. It was the challenge of reaching 100% of South African households that made a census logistically complex and expensive on an enormous scale.

A benefit of reaching every single household was that it allowed SSA to gather information at a very fine level of geographic disaggregation; once a sampling process was introduced, it would limit the size of the population for which useful statistics could be provided. At this stage the sample design and size were still being planned and refined in consultation with the Statistics Council, and the aim was to visit approximately 17 000 enumeration areas as defined in Census 2001. Within each of those units SSA would be collecting information from approximately 10 households, making it a survey of about 170 000 households. SSA believed that this would provide certain information at municipal level, but would not provide the level of detail possible through a full census.

The Statistician-General added that in 2002 a particular process had been introduced to deal with the 2001 Census results. The agricultural census was then conducted and statistical methods were introduced there as well. The next project was the area of economic statistics, where those methods were now being applied. In terms of management and leadership, this meant that methods piloted in proven successes such as the census then had to be transferred within SSA itself. SSA was only in a position to check whether those methods were viable when it used them in areas different from those in which they were piloted. Other managers had been introduced into that environment to lead the process. This was a key intervention.

Dr Gavin stated that at this stage the information to be collected was directed by being able to inform the equitable share and all the formulae used by National Treasury in the division of revenue. It also provided a better understanding of South Africa’s demographic variables so that it could understand population dynamics at a lower geographical level.

Ms J Fubbs (ANC) [NA] sought clarity on the structure and composition of the new Statistical Council.

Mr Risenga Maluleke, Executive Manager: Statistician-General’s Office, responded that the Council consisted of 25 members who were nominated by the executive councils in the provinces. The functions of the Council related to the collection, analysis, storage and broad areas of statistics.

Ms Fubbs stated that the Committee had been concerned for a while that SSA was perceived to have lost much of its credibility.

Mr I Davidson (DA) [NA] agreed with the concerns raised by Members regarding the quality and loss of credibility of the statistics produced by Statistics South Africa. This had negative impacts on the re-rating of South Africa by various financial organisations.

The Statistician-General responded to these two questions by stating that he agreed, and that the rebuilding of SSA’s systems, as explained, was needed for it to understand the environment within which it operated. SSA was operating in a very transparent environment but he suggested that, in view of the kind of statistics environment South Africa came from, perhaps "we appear to be beating ourselves too hard oftentimes" and this created a credibility crisis. In fact SSA might have bitten off more than it could chew in setting such high targets for itself in 1999. The type of progress made in parallel jurisdictions must also be borne in mind.

Mr Trevor Oosterwyk, Manager: Communication, added that part of the problem was that reporting on statistics had become problematic, as there were limited numbers of journalists with an intimate enough understanding of statistics to report accurately. SSA was thus currently providing some training to journalists on statistics to improve understanding of its work, and the hope was that this would improve the coverage of statistics. Statistics South Africa had also invited the most senior editors and journalists to discuss the changes in economic statistics, to carefully explain and contextualise the statistics and ensure that the reporting on them was as accurate as possible.

SSA also preferred not to engage in a direct and confrontational manner with attacks in the media, but in fact chose to focus on providing credible statistics and allow the public to decide for itself.

Ms Fubbs asked SSA to explain the compliance protocol referred to with regard to the measurable objectives, because the percentages of compliance indicated in the presentation did not provide a full picture.

The Statistician-General responded that SSA had decided against the publication of its user satisfaction survey. In 2004 it had received 61% approval, and in 2005 the figure stood at 69%. The percentage referred to in the presentation related to its aim to gain two percentage points each year. It had thus aimed for 63% this year but had achieved no less than 69%.

Mr M Robertson (ANC) [Eastern Cape] asked SSA to indicate the financial and other methods it put in place in its budget to ensure the improvement of its IT systems to comply with the PFMA requirements. Furthermore he asked whether these measures were sustainable and whether they would successfully reverse the Auditor-General’s opinion that it could not place any reliance on the IT systems used by Statistics South Africa.

Mr Y Bhamjee (ANC) [NA] stated that he was aware that people used and interpreted the statistics produced by SSA to further their own agendas.

Dr Gavin replied to these two questions by stating that, with regard to their transferal programmes, attention was paid to ensure that passwords were not shared to access their IT systems, which were in fact hosted by the State Information and Technology Agency (SITA). Statistics South Africa had discussed with SITA the possible problem of unauthorised access to its IT systems, and they might be able to obtain the log files of the system on a regular basis so that they could be reviewed.

SSA’s statistical information services component was tasked with looking very closely at all information published, and to propose appropriate software that would minimise any manual interventions in terms of moving from the actual data and into the publications. On a slightly longer time scale, the aim was that the data that would feed the publication would be the data that resided in their data warehouse, rather than being stored in various conditions across the Department.

Mr Robertson stated that although it was understood that SSA operated in a highly skilled and technical environment, the South African government believed in the transfer of skills and the reskilling of individuals, especially in a scarce skills environment. Statistics South Africa had consistently lacked a Deputy Director-General in Economic and Social Statistics with the result that people from abroad were employed. He requested SSA to inform the Committee of the status of the Deputy Director-General post.

The Statistician-General responded that the one DDG was still suspended. The disciplinary process was lengthy, despite the need to finalise it as soon as possible.

Mr A Asiya (ANC) requested SSA to indicate to the Committee the reasons for the problems it was experiencing in its procurement, recruitment and training operations. He asked whether they were using the Preferential Procurement Policy Framework Act (PPPFA).

Secondly, Mr Asiya sought clarity on the relationship between Statistics South Africa and the Statistics Council, and the status of the Council.

The Chair asked SSA to explain the role of the new Council in ensuring that the statistical techniques used were satisfactory, and how the Committee, as non-statisticians, could exercise some form of oversight and ensure that the statistical processes in place produced accurate and reliable results.

Mr Maluleke replied to these three questions by stating that, following the directive issued by the Committee late last year to finalise the relationship between Statistics South Africa and the Council, the two bodies had met and decided to agree amongst other things on broad levels of work including poverty, labour statistics, prices etc. It was however decided that the tools of measurement would be left to SSA, and for that reason the subcommittees of the Council continued to operate as they did. The enabling legislation required the Council to produce an annual report at the end of the financial year which had to be submitted to the Statistician-General and the Minister of Finance, which would indicate issues it had raised with regard to advice given as well as responses indicating why the advice was not considered. The Minister must then table the report in Parliament.

The Council was currently being re-established and Members of the Council were still confirming their availability to serve on the Council, and it should be finalised by the end of the week. The Minister would then publicise the names, in accordance with the usual procedure. A programme would then be decided on to work out the relationship between the Council and Statistics South Africa.

Mr Asiya asked whether the municipal surveys undertaken at local government level included the Integrated Development Plans (IDPs) and issues of economic development at local level.

Dr Gavin responded that SSA was still engaging with users and it had just had a series of user consultation workshops in each province, and only the Western Cape and Mpumalanga provinces needed to still be visited. There was a trade-off between the volume of information that they aimed to collect in a particular survey or Census and the cost associated with it, and perhaps even the quality of the individual responses.

Mr Bhamjee stated that SSA had not followed up on ensuring that it had achieved the targets that it set for itself last year. It was thus not possible for the Committee to assess whether they were setting unrealistic targets.

Mr Robertson stated that SSA indicated its plans to ensure good governance by focusing on compliance with regulatory frameworks such as the PFMA and Treasury regulations, and to improve administration and service delivery to users. He asked them to indicate how it planned to meet all these plans when it had not been successful in the past, and how exactly the new measures differed from what it had done in the past. Furthermore he sought clarity on who would be put in charge of the process.

Mr A Asiya (ANC) [NA] stated that the presentation did not indicate the date by which Statistics South Africa would report that it had improved its operations, so that it could assure the Committee it would receive an unqualified audit report in the following financial year.

Dr Hirschowitz replied that most of the reports were produced on time, on a monthly, quarterly and annual basis. There had been some problems that were due to the timing of the comparative parallel surveys, and this was being reconsidered to smooth over the process to arrive at an ongoing series of statistics. Thus at the beginning of 2005, SSA had delayed the publication of certain statistics because it was trying to harmonise the old and new samples.

Statistics South Africa was still currently experiencing problems with retail trade because it had not included the very small businesses in its previous samples, which referred to those non-VAT paying but income tax paying businesses. They were included for the first time during last year’s survey and provided a different picture for retail trade than before, which caused some confusion in the press. It was however a planned change to include more small businesses to understand the effects.

The Statistician-General added that Statistics South Africa had presented its priority matrix in 2004, which identified price indices, GDP and other statistics as needing urgent attention. The hope was that some of those would have migrated to the top quadrant, and Dr Hirschowitz had explained earlier the steps taken to improve its functions. SSA had methodology in place that supported and critiqued all data produced.

Mr Bhamjee asked SSA to explain the "international standards" referred to in the presentation.

Dr Hirschowitz replied that the United Nations statistics division provided guidelines and books on how to compile certain statistics, the most recent guidelines being on household surveys for developing countries. Statistics South Africa attended the meetings of that division and used its helplines, and also adhered to its standards and definitions to a large extent. The International Monetary Fund (IMF) had two standards against which it assessed Statistics South Africa every few years. The first was the special data dissemination standard, which ensured that everyone received statistics at the same time. SSA was found to be 100% compliant with this.

The Statistician-General added that the second standard was the IMF Report on Standards and Codes (ROSC), which involved SARB and National Treasury as well. Each time the ROSC mission evaluated Statistics South Africa it reported coherence within its operations, and expressed confidence in the stability of the process. SSA accepted that there were errors in its statistics at times, but there was stability and confidence in those statistics. South Africa was the only country on the continent that subscribed to these standards and was one of the few countries in the world that adhered to the timetables. SSA was thus confident in the statistics it produced, and believed that it must take responsibility for errors that occurred, correct them and move forward.

Dr Hirschowitz stated that they were thus adhering to international standards, and had the reports from international bodies to substantiate that. Statistics South Africa currently had three international consultants, one of whom used to head Statistics Canada and two from the Australian Bureau of Statistics. They were involved in assisting them to improve their statistics.

Mr Bhamjee stated that there was a discrepancy between the figures for measuring standards of performance and target outputs in the presentation and the Strategic Plan.

The Statistician-General replied that, save for the delays in the production of the manufacturing statistics, all the deadlines were met. The improvements in the quality of statistics were effected, especially in the manufacturing statistics, and the presence of the frames gained from the parallel surveys referred to by Dr Hirschowitz enabled them to handle the reality of the economy much better than before.

Mr Bhamjee noted that the new Strategic Plan indicated that the Statistics South Africa’s postal collection methodology would be phased out and new methodology would be introduced in all nine provinces. He sought clarity on the new methodology, how reliable it would be and whether it would be subjected to a pilot project before being adopted.

Mr Davidson asked SSA to indicate whether it had arrived at a timetable for the achievement of an appropriate methodology. The presentation indicated, for example, that they would be engaged in a substantive review of the CPI methodology. No timeframe had been provided for the finalisation of this process.

Mr Oosterwyk responded that this was a complex issue. SSA had decided on a vision that expressed the aim to become the preferred provider of quality statistics. Concerns about the credibility of Statistics South Africa’s statistics had been one of the factors bedevilling perceptions about it. The bedrock of a credible identity and image was that the figures must be trusted. If one analysed the number of products Statistics South Africa produced, the amount of work that went into the statistics and the number of times it had admitted to getting things wrong, one would wonder why there was such a great divide between the trust shown for certain statistics and not others.

In order to address this problem, they realised they needed to improve their relationship with critical stakeholders by holding regular meetings. They were currently engaged in a roadshow that would introduce people to the organisation. Furthermore, SSA was building confidence within the organisation regarding the credibility of its statistics.

Mr Davidson questioned the quality of the budgeting process within the organisation. He stated that SSA had reported overspending of 33% on Programme 1 during the previous financial year, yet it had now budgeted for a 49% decrease in expenditure on that programme. There were similar overspends and subsequent swings in the other programmes as well.

The Statistician-General replied that this fluctuation was due to the decision not to run a census. Secondly, SSA had introduced projects throughout the year and some projects would come to an end, and thus the management of these projects was tricky.

Ms D Robinson (DA) [Western Cape] asked SSA to indicate its plans to deal with the problems with the "high-wall areas" in conducting the income and expenditure survey. She suggested that the electronic gathering of information be considered as a more viable option.

Dr Hirschowitz responded that the income and expenditure survey was conducted every five years. It used to rely on a 'recall method' by which people were asked to remember what they had bought over the previous reporting period. This method had been questioned internationally as it was 30 years out of date, and Statistics South Africa had shifted to a diary method. This entailed visiting the same household approximately five times over a month, and respondents would be asked to retain specific information for the survey. This method yielded much more accurate figures, and it would be very difficult to obtain this kind of information through emails. Statistics South Africa had begun consulting the various ratepayers' associations and bodies corporate for a sample. It also planned to approach ward committees and would appreciate any response from them.

This approach would not be followed only for the income and expenditure survey, but it was particularly important for that survey because 60% of national income was consistently being spent by the top 20% of earners. The top 20% would thus have to be covered in this much detail in order to yield a good CPI figure, which was why SSA was so concerned about the "high walls" for the income and expenditure survey. Other countries used telephone interviews, but that was not a viable option for South Africa because SSA had to be democratic in the manner in which it administered its questionnaire and, secondly, it did not necessarily have the necessary telephone numbers.

As far as direct price collections for the CPI were concerned, the postal survey did not capture transaction prices at the point of sale. There might be different prices for the same item in different parts of the country, and the international standard set by the United Nations Statistics Division was to collect prices directly at the outlet. Even in other African countries such as Malawi, direct price collection was done even in the informal sector, and accurate transaction prices were being received. Statistics South Africa had been criticised for using the postal survey because it did not yield accurate transaction prices. The result was that it had to collect product and service prices throughout the year from different types of business outlets in the retail trade. Certain types of information, such as medical aid information, were still gathered via the postal survey, but in general the aim was to collect prices from the point of sale by systematically sampling different businesses that sold different kinds of products.

This was thus a big change in methodology and they expected an indication by June 2006, as direct price collections had been piloted in Mpumalanga, Gauteng and the Western Cape. It would be spread out throughout the country in due course.

The Statistician-General added that SSA had adopted a statistics value chain, which ensured that they began with the user in mind. This process was introduced in 2003 and was beginning to take root within the organisation.

Dr Gavin stated that, with regard to the electronic collection of data, many international agencies were considering the multi-modal collection of information, meaning that data could be collected via different techniques within the same survey, although this raised methodological issues. Both Canada and the United States had found that data collected via the Internet was of a higher quality than data gathered through face-to-face interviews. Statistics South Africa would be investigating this method, but would proceed with caution.

Mr Oosterwyk added that a greater sense of interest in the work done by SSA was being created amongst its demographers and statisticians. A communications unit would go out with field workers to identify methods and a publicity strategy that could be used and developed to access the "high-walled" areas. This would be done in conjunction with the Government Communication and Information System (GCIS) so that the problem could be addressed from as many angles as possible.

Ms Fubbs stated that SSA planned to integrate and improve the quality of registers across government, and especially its administrative records, as they would assist departments and complement their capacity. She requested a timeframe for this process.

Secondly, Ms Fubbs asked Statistics South Africa to indicate whether it was possible to find out from people who used the statistics how helpful they found them.

The Statistician-General replied that a survey run by Statistics South Africa examined exactly who used statistics and which statistics they used. SSA would consider adding a question to ascertain whether users found the statistics useful.

Mr Kabela (ANC) sought clarity on SSA’s skills transfer plan, because the presentation indicated that international professionals were employed. Furthermore he asked them to indicate whether this skills transfer plan would yield any results in the short or medium term.

The Statistician-General responded that the consultants from Australia and Canada worked alongside SSA staff, so that skills were transferred to them.

Mr Van Wyk (NNP) [NA] questioned the effectiveness of access to statistics produced by Statistics South Africa, as many people did not have access to the Internet and were not on their distribution list.

The Statistician-General replied that SSA had a fairly sophisticated electronic process by which information could be accessed. Access to information did, however, have to be mediated by intermediaries from other departments and municipalities, and these intermediaries had to be used more effectively.

Mr Van Wyk noted that only 16% of Statistics South Africa's budget was allocated to statistical support and informatics, and asked them to explain how SMMEs and other stakeholders would access information.

The Statistician-General responded that the census information and labour survey would be useful to the SMME sector, and the Department of Trade and Industry would benefit from such information. The possibility of duplication must be minimised. Those offices needed statisticians to assist in the analysis of Statistics South Africa’s data.

Dr Hirschowitz added that they had a user inquiry service which provided people with information on any statistics provided by Statistics South Africa. They also had a computer centre that could be used by anyone to analyse data.

Mr M Johnson (ANC) [NA] asked whether, in view of the consultant to staff ratio of 3:1, Statistics South Africa relied more on consultants than its core staff.

The Statistician-General replied that the ratio referred to Statistics South Africa’s temporary employees and not its consultants. The number of consultants amounted to less than 10.

Mr Johnson sought clarity on the audit opinion of the Auditor-General.

The Statistician-General responded that the qualified audit opinion emanated from the census, and Statistics South Africa had admitted in this meeting that a census strained the organisation and its budget. There would be spikes in the budget due to the 2006 community survey and the 2011 project.

Mr Johnson asked whether the cause of death survey was not conducted due to lack of funds.

The Statistician-General replied in the affirmative. Funds were instead available for the community survey study. The Department of Health was supposed to have contributed R10 million but had failed to do so. Statistics South Africa nevertheless conducted the survey at a cost of R23 million, and it was currently engaged in discussions with that Department regarding the outstanding contribution.

Ms B Ntuli (ANC) [NA] sought clarity on the perception that the prominence of HIV/AIDS would result in fewer people entering the schooling system, and asked whether other government departments could use such statistics.

The Statistician-General replied that a schools census would be very helpful and should be considered further. In the United Kingdom and Australia, children answered questions about themselves online, and the resulting data was produced and managed online. A tighter working relationship was needed between Statistics South Africa and the Department of Education.

Dr Hirschowitz added that 100 computers had been handed out to schools to complete the schools survey. The databases were made available to those schools not only electronically but also in tabular form. This was a very good starting point.

The meeting was adjourned.