
Assessment of implementation of the health management information system at the district level in southern Malawi

Ansley Kasambara,1,2 Save Kumwenda,3 Khumbo Kalulu,3 Kingsley Lungu,3 Tara Beattie,1 Salule Masangwi,2 Neil Ferguson,1 Tracy Morse1,3

1. Department of Civil and Environmental Engineering, University of Strathclyde, Glasgow, United Kingdom
2. Department of Mathematics and Statistics, The Polytechnic, University of Malawi, Blantyre, Malawi
3. Department of Environmental Health, The Polytechnic, University of Malawi, Blantyre, Malawi


Abstract
Background
Despite Malawi’s introduction of a health management information system (HMIS) in 1999, the country’s health sector still lacks accurate, reliable, complete, consistent and timely health data to inform effective planning and resource management.
Methods
A cross-sectional survey was conducted wherein qualitative and quantitative data were collected through in-depth interviews, document review, and focus group discussions. Study participants comprised 10 HMIS officers and 10 district health managers from 10 districts in the Southern Region of Malawi. The study was conducted from March to April 2012. Quantitative data were analysed using Microsoft Excel and qualitative data were summarised and analysed using thematic analysis.
Results
The study established that, based on the Ministry of Health’s minimum requirements, 1 out of 10 HMIS officers was qualified for the post. The HMIS officers stated that HMIS data collectors from the district hospital, health facilities, and the community included medical assistants, nurse–midwives, statistical clerks, and health surveillance assistants. Challenges with the system included inadequate resources, knowledge gaps, inadequacy of staff, and lack of training and refresher courses, which collectively contribute to unreliable information and therefore poorly informed decision-making, according to the respondents. The HMIS officers further commented that missing values arose from incomplete registers and data gaps. Furthermore, improper comprehension of some terms by health surveillance assistants (HSAs) and statistical clerks led to incorrectly recorded data.
Conclusions
The inadequate qualifications among the diverse group of data collectors, along with the varying availability and utilisation of different data collection tools, contributed to data inaccuracies. Nevertheless, HMIS was useful for the development of District Implementation Plans (DIPs) and planning for other projects. To reduce data inconsistencies, HMIS indicators should be revised and data collection tools should be harmonised.


© 2017 The College of Medicine and the Medical Association of Malawi. This work is licensed under the Creative Commons Attribution 4.0 International License. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)


Introduction
The processes of a health management information system (HMIS) include the structured coordination of health data (input) collection, storage, retrieval, and processing, to facilitate evidence-based decision-making and interventions (output).1 Decision-making broadly includes managerial aspects, such as planning, organising, and control of healthcare facilities at both national and institutional levels, along with clinical aspects, which aim to provide optimal patient care. Reports that are accurate, reliable, complete, consistent, relevant, and up-to-date are required by district health offices (DHOs) and the Ministry of Health (MOH) for monitoring and evaluation of public health indicators, such as population health status; service provision, coverage, and utility; drugs stocks and consumption patterns; equipment status; and availability of finances.1 An accurate report is one that is precise and correct, whereas a reliable report is one that is subsequently reproducible using the same data collection methods as the original report.2,3 Complete reports are those that have all required elements available.4 Consistent reports are those that are comparable in redundant or distributed databases (which is also a good measure of reliability).3,5 Up-to-date reports are those that conform to timeliness, which enables appropriate data use.6 These characteristics of data quality are required for reports to be deemed fit and appropriate for their intended purposes.3,7
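To make these data quality dimensions concrete, the minimal sketch below shows how completeness and timeliness might be checked for a single monthly facility report. The field names, record structure, and deadline are hypothetical illustrations and are not taken from Malawi's HMIS forms.

```python
from datetime import date

# Hypothetical required fields for a monthly facility report (illustration only,
# not the actual HMIS Form 15 field list).
REQUIRED_FIELDS = ["facility", "month", "antenatal_visits", "deliveries", "maternal_deaths"]


def completeness(report: dict) -> float:
    """Return the share of required fields that are present and non-empty."""
    filled = sum(1 for field in REQUIRED_FIELDS if report.get(field) not in (None, ""))
    return filled / len(REQUIRED_FIELDS)


def is_timely(submitted: date, deadline: date) -> bool:
    """A report is up to date if it is submitted on or before the reporting deadline."""
    return submitted <= deadline


# Example: a report missing the maternal_deaths count, submitted two days late.
report = {"facility": "Example Health Centre", "month": "2012-03",
          "antenatal_visits": 120, "deliveries": 34, "maternal_deaths": None}
print(f"Completeness: {completeness(report):.0%}")                 # 80% -> incomplete
print(f"Timely: {is_timely(date(2012, 4, 7), date(2012, 4, 5))}")  # False -> late
```

Accuracy and consistency cannot be judged from a single report in this way; they require comparison against source registers or parallel databases, a point returned to later in the article.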
Malawi lacks quality HMIS data, and this is compounded by inadequate use of available information in the planning and management of health services.8,9 In 1999, Malawi began the process of strengthening the quality of its health data and information systems with an analysis of the strengths and weaknesses of existing information systems and the sharing of its findings with all stakeholders.9 This analysis identified a need to reform various vertical programme-specific information systems into a comprehensive, integrated, decentralised, and action-oriented system. The first step was the conceptualisation of a transition from the old paper system to a more user-friendly digital form.9 In 2002, the new HMIS platform was adopted for use nationwide (Figure 1).8 Despite having been in place for about 15 years, the system still faces data and information challenges.
A number of factors have been documented as contributing to the poor quality and performance of Malawi’s HMIS. One major problem has been the production of reports for tracking progress and identifying shortfalls. Using maternal health as an example: despite the large focus on interventions to improve maternal health in Malawi, the data and reports produced to demonstrate impact are limited.7–9
These reports are also affected by missing and poorly recorded information.14,15 Missing information on maternal mortality from the records compiled by health surveillance assistants (HSAs) has been attributed to a number of factors, such as a lack of knowledge about how to report causes of death and whom to report deaths to. Additionally, some deaths occur under the care of traditional birth attendants (TBAs), and these deaths are either reported only to traditional leaders or not reported at all.15 In 2007, the Malawi government directed that TBAs should no longer conduct deliveries in rural areas.16 However, many TBAs still conduct deliveries, and when there are complications they do not report to government or health centre officials for fear of reprisal.16 It can therefore be assumed that data reported at district hospitals may not reflect the actual situation in rural communities. Thus, the databases of information for Malawi and sub-Saharan Africa potentially contain limited and defective data.12–14
The constant changing of international indicators, through the Millennium Development Goals (MDGs) and now the Sustainable Development Goals (SDGs), means that Malawi’s HMIS programme should be aligned to suit the needs of the Health Sector Strategic Plan (HSSP) and the Essential Health Package (EHP). Unfortunately, HMIS processes are not versatile enough to adapt quickly to changing international indicators,20 which is a drawback for both public institutions and nongovernmental organisations (NGOs) that rely on these indicators for their reports.
A lack of comprehensive information to guide monitoring and evaluation, policy formulation, and resource allocation negatively affects the day-to-day running of health services in Malawi. Insufficient information may also affect support from NGOs and multilateral donors, who may withdraw support from the sector if the impact of their interventions cannot be measured and reported accurately and reliably. A midterm review of Malawi’s HMIS judged it to be among the best in Africa despite the challenges that it faced during implementation.8 However, in the absence of a more recent review, it is not clear if the system is improving or if progress has been made to overcome the challenges highlighted. This study therefore aimed at identifying the current challenges faced during implementation of HMIS, assessing the data usage by policy makers, and assessing the completeness of data and adequacy of indicators in Malawi’s HMIS.

[Figure 1]

[Figure 2]

Methods
Study design
This was a cross-sectional survey, in which qualitative and quantitative data were obtained from HMIS officers and district data users. The study was carried out from March to April 2012 in the Southern Region of Malawi. Information was collected about the qualification of data custodians at district health offices, length of service, data sources, perceived data accuracy and reliability, reports from the system, challenges faced during HMIS implementation, and the objectives of the system as interpreted by HMIS officers. Suggestions on how to improve the system in order to produce better outputs were also documented.
Population and sampling
Of the 13 districts in the Southern Region, 10 districts were conveniently sampled to participate in the study. Resources and logistical challenges excluded 3 districts. Data were collected from 10 HMIS officers and 10 district data users, specifically 7 district environmental health officers (DEHOs) and 3 programme coordinators. The HMIS officers and data users were sampled from the following districts: Blantyre, Mulanje, Chikwawa, Chiradzulu, Mwanza, Thyolo, Balaka, Zomba, Phalombe, and Machinga (Figure 2).
Data collection and analysis
A combination of face-to-face interviews using a semistructured interview guide and self-administered questionnaires was used to collect data from HMIS officers and district data users. The data collection tools were pre-tested. HMIS officers were targeted with face-to-face interviews and data users were given self-administered questionnaires. Copies of HMIS data collection tools were acquired from respondents, and responses to interview questions were summarised. Themes were derived from information within interview transcripts and questionnaire data. Frequency tables were developed for the sources of data and the challenges met by these sources, as perceived by HMIS officers and district data users. Data collectors’ reliability was rated by HMIS officers on a scale from 1 to 10, with 1 meaning “unreliable” and 10 meaning “very reliable”, with the mean of these scores representing overall reliability.
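As a rough sketch of how these quantitative summaries might be produced outside a spreadsheet, the example below averages hypothetical reliability scores per category of data collector and builds a simple frequency table of reported challenges. The scores, category labels, and challenge labels are invented for illustration and do not reproduce the study data.

```python
from collections import Counter
from statistics import mean

# Hypothetical reliability ratings (1 = unreliable, 10 = very reliable) given by
# HMIS officers for each category of data collector; values are illustrative only.
reliability_scores = {
    "nurse-midwives": [10, 9, 10, 9, 10],
    "statistical clerks": [9, 9, 10, 9, 9],
    "health surveillance assistants": [7, 8, 6, 8, 7],
}

# Mean reliability per category, the kind of summary reported in Table 4.
for collector, ratings in reliability_scores.items():
    print(f"{collector}: mean reliability {mean(ratings):.1f}")

# Frequency table of challenges mentioned across interviews (illustrative labels).
challenge_mentions = [
    "lack of training", "inadequate resources", "lack of training",
    "untimely reporting", "lack of training",
]
for challenge, count in Counter(challenge_mentions).most_common():
    print(f"{challenge}: mentioned {count} time(s)")
```

In the study itself these summaries were produced in Microsoft Excel; the sketch simply illustrates the same tabulation.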
Results
Data management
HMIS data were generated by a number of data collectors, as presented in Figure 3.
The challenges faced by the HMIS in terms of data management were outlined as follows:
- The registers used to collect data by health workers (HSAs) had more indicators than those contained in the standard HMIS form (Form 15) used by statistical clerks. As such, the data transferred for submission to HMIS offices included less information than the initially captured data.
- HMIS officers indicated that discrepancies existed in the data, as different data collectors generated data using different tools. This was verified by examining and comparing the data collection tools for HSAs, health facility personnel, and programme coordinators, who were found to define indicators differently.
- Another cause of discrepancies was the inability of the system to consolidate data from the different data collectors to eliminate duplication, reduce data loss, and handle referrals. This was reported as a common problem. For example, records of the same patient submitted by an HSA and by health facility personnel may be duplicated in the HMIS and subsequently interpreted as data originating from 2 different patients.
- Untimely submission of reports, inconsistency of data, and misplaced or fabricated data were reported as common issues encountered when consolidating monthly reports into quarterly reports. Untimely submission was attributed to, for example, data collectors in remote rural areas experiencing logistical difficulties or resource limitations when sending their data to senior HSAs to compile and subsequently relay to the HMIS office.
- Power failures, faulty computers, and delays in maintenance and repairs at the district level.

[Figure 3]

[Table 1]

At the time of data collection for this study, the districts were creating their own databases to collate data, and as such they were not harmonised. This can lead to discrepancies. One HMIS officer stated, “databases are different between the districts and they need to be harmonised”.

[Table 2]

[Table 3]

Inadequate personnel at health facilities led to extra pressure on a few people, including pressure related to the collation of accurate HMIS data. Owing to a lack of statistical clerks at health facilities, clinical staff (nurses and medical assistants) were often responsible for data collection. As clinicians and nurses were already overburdened by large numbers of patients, HMIS work could be seen as secondary to their patient care duties.

HMIS officers further stated that there was usually a delay in data acquisition from the different vertical programmes (the Safe Motherhood Programme, for example), which gave rise to challenges in the production of reports. In terms of safe motherhood, HMIS officers also indicated a concern that figures may be inflated in order for facilities to receive more medication, thereby affecting the planning system and resource management.
Other issues leading to challenges with HMIS objectives were related to human resource capacity within the system and included:
- Lack of training, refresher courses, and review meetings, which was mentioned 14 times by different HMIS officers in relation to the different data collectors. The need for refresher courses was evidenced by unsatisfactory or varying comprehension of indicators among data collectors, leading to inconsistent data and data gaps.
- For HSAs, statistical clerks, and ward clerks, a lack of training (leading to difficulties using the data collection tools) was a problem. HMIS officers reported that this could be due to their level of education, which suggests the need for training supplemented by constant supervision to develop competency.
Human resource capacity within HMIS
Qualification of the HMIS officers
The qualifications of these HMIS officers ranged from Malawi School Certificate of Education to a postgraduate diploma in statistics. Table 3 shows the qualifications of the HMIS Officers at the time of interview.
Eight of the HMIS officers interviewed were qualified statisticians with a minimum of a certificate in statistics, and 2 were found to have only a Malawi School Certificate of Education with no specific statistical training. Competency through experience was not assessed in this study. Overall, 9 of the HMIS officers did not meet the MOH’s minimum requirement of having a bachelor’s degree; however, those who responded had a mean of about 5 years’ experience in the position of HMIS officer.
Understanding of the HMIS objectives by HMIS officers
The objectives of the system are to collect, compile, and analyse data, and to disseminate information while ensuring that the data are complete and consistent.9 The system analyses the data to check health indicators and produce reports at appropriate times to enable decision makers or managers to plan accordingly. This further enables monitoring and evaluation of the health systems’ progress. All HMIS officers were able to state the HMIS objectives (Box 1).

 

[Box 1]

The HMIS officers stated that the HMIS was on average achieving 88% of its objectives (mean of the assessments given by all HMIS officers). The reasons the HMIS was not achieving 100% were delays in receiving data, corrupted files and software, lack of training, and lack of interaction between departments. For example, HMIS officers indicated poor data sharing between their office and the Safe Motherhood Programme. It was felt that programme planning and decisions were made without reference to HMIS data, as programme personnel had collated their own statistics, and the data collected by the 2 sections were also reported as inconsistent.
Data use
Data use according to HMIS officers
HMIS officers indicated that data should be used for a number of purposes, including registration of patients in terms of visits and admissions, which enables district data users to determine use and potential overloads at different health facilities. All HMIS officers stated that the data provide information that should help decision makers plan different activities. Furthermore, the information helps to check indicators through the production of reports, in the form of graphs, for different programme coordinators, for example the community-based maternal and neonatal health coordinator and the safe motherhood coordinator. However, it was mentioned that the HMIS was not used to its full potential: decision-making for programme planning, human resource allocation, and drug distribution did not fully or optimally utilise HMIS data. “As a result some departments overestimate figures for drugs required based on their own estimates,” stated an HMIS officer.
HMIS officers were asked to give their interpretations of the data collectors’ reliability (“the extent to which we can rely on the source of data and therefore the data itself”22,23). Reliability was measured on a scale from 1 to 10 (1 = “not reliable”, 10 = “very reliable”), and a mean reliability was calculated for every category of data collector (Table 4).

[Table 4]

The mean reliability of data collected by nurse–midwives (as reported by the HMIS officers included in this study) was 9.5, followed by data collected by statistical clerks (mean reliability 9.2). Statistical clerks and nurse–midwives were scarce in health facilities, which may have led to shortfalls in data reliability.
Responding to the question of accuracy (the degree of correctness,24 or the degree to which a measure conforms to a standard or true value25) of the data collected by the different professional groups, 8 out of 10 HMIS officers said that the data were, on average, 85% accurate, and the remaining 2 officers responded that the accuracy of the data was neutral. Poor data quality was attributed to a number of factors, including incompleteness of data (missing values from registers or private clinics failing to submit data, for example) and lack of training of data collectors, with one HMIS officer stating that “[a lack of] comprehension of some terms by the statistical clerks or HSA who is compiling the data” contributes to poor data quality. The HMIS officers stated that if they doubt the accuracy of the data, they are supposed to make an attempt to verify the data.
Adequacy of data for making plans and decisions as district data users
According to 3 district data users, the HMIS data were neither comprehensive nor adequate for effective planning and decision-making because of the limited number of indicators for their specific programmes. However, these district data users felt that the data were very useful for general monitoring of service delivery, policy development, and high-level planning. Four out of 10 district data users stated that the information was adequate for their specific programmes because it allowed them to monitor trends related to health service delivery and disease prevalence. Despite the data being considered inadequate, 2 district data users reported that various plans and decisions were being made based on the HMIS data reports, which were circulated quarterly. These included district health strategic plans, district annual implementation plans, departmental plans to guide partners in the district on programme and project planning, immunisation plans, development of Icelandic International Development Agency/District Health Office (ICEIDA/DHO) public health programme documents, and disease outbreak responses.
Discussion
There were multiple data collectors documenting information at different levels and using different registers and forms, which fed into the same system, thereby contributing to discrepancies. Some vertical programmes still fed data into the HMIS despite the MOH agreement, in 1999, with all stakeholders on the need for the reformation of various vertical programme-specific information systems into the current integrated HMIS.9 Furthermore, verification of results at the facility level from the multiple collectors was not being carried out because of human resource constraints, as the position of assistant statistician (responsible for authenticating results before forwarding them to the next level) was not filled at most facilities. The MOH recommended quarterly verification of individual records for completeness and accuracy.10 The discrepancies in data collected resulted from the system’s failure to consolidate data from different data collectors in order to eliminate data duplication and loss. Discrepancies in data collected and reported were also due to the practice of districts creating their own databases.
The MOH recognises the shortfalls in data quality in terms of reliability for programme planning.9 However, among HMIS officers, the HMIS data originating from statistical clerks and nurse–midwives were perceived as more reliable than data from HSAs. This finding is similar to earlier observations from other HMIS surveys conducted in Malawi.10,26 The district data users interviewed in this study perceived the data from the system as neither comprehensive nor adequate for effective planning and decision-making, owing to the limited number of indicators for specific programmes.
In order to improve the quality of data collected from all facilities, the MOH has previously suggested that individual data and monthly data should be verified at least quarterly by district health management teams (DHMTs) for completeness and accuracy before entry into the computer at the district health office (DHO) level. There should also be at least quarterly follow-up and practice-based training for each person involved in data recording and aggregation to ensure the accuracy and completeness of the data. However, staff interviewed in this survey bemoaned the lack of supportive supervision, training, and refresher courses. Demand for training and refresher courses may partly be driven by the personnel’s potential financial gain, so training alone might not be an entirely effective way of addressing the issue of data quality. One factor that may contribute to poor data quality from facilities is a lack of review of compiled reports by facility management before they are sent to the next level.26
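As an illustration of what such verification might involve, the sketch below compares counts recompiled from facility registers with the figures reported to the district, flagging indicators whose values diverge beyond a tolerance. The indicator names, counts, and tolerance are hypothetical and are not drawn from MOH verification guidelines.

```python
# Hypothetical quarterly verification: compare register counts recompiled at the
# facility with the figures reported upward to the district (illustrative values).
register_counts = {"antenatal_visits": 412, "deliveries": 118, "maternal_deaths": 2}
reported_counts = {"antenatal_visits": 398, "deliveries": 118, "maternal_deaths": 1}

TOLERANCE = 0.05  # flag indicators that differ by more than 5% (arbitrary choice)

for indicator, register_value in register_counts.items():
    reported_value = reported_counts.get(indicator, 0)
    if register_value == 0:
        mismatch = reported_value != 0
    else:
        mismatch = abs(register_value - reported_value) / register_value > TOLERANCE
    status = "VERIFY" if mismatch else "ok"
    print(f"{indicator}: register={register_value} reported={reported_value} [{status}]")
```

Flagged indicators would then be traced back to the source registers, which is the kind of facility-level review the MOH recommendation envisages.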
This survey revealed that the system is characterised by untimely submission of reports from the facility to the district level, due to factors such as the remoteness of facilities, logistical constraints, and resource limitations. At the district level, power failures and faulty computers led to challenges in producing quarterly reports. The MOH has acknowledged the untimely data reporting and the lack of support for equipment maintenance at the district level.10 However, this study did not ask how long it takes to compile reports at the district and lower levels after data have been collected.
In 2009, the MOH identified a huge deficit in human resources (in terms of quality and quantity), especially at the district and facility levels.10 The MOH states that the minimum qualification for an HMIS officer is a Bachelor of Science in information technology (IT), information systems (IS), or computer science (CS).10 This study found that 9 of the 10 HMIS officers in the sampled districts did not meet the minimum qualification for this position. This may explain the reported limited data analysis and use of information in the management of health services at facility and district levels.
Most data users reported that HMIS data quality was questionable. Some data users stated that data were accurate because HMIS data were collected at the point of service delivery and there was a system in place for health facility management teams to collect data directly from the registers and compile the data in prepared HMIS data collection forms. However, other users argued that data were not accurate because of differences with data collected through other parallel systems from the same source (disease surveillance data and routine immunisation data, for example).
Conclusions
The MOH has been implementing a comprehensive and decentralised routine HMIS countrywide since 2002. However, the HMIS continues to face a number of problems, including the use of different tools for data collection, missing data, untimely reporting, human resource constraints, and poor infrastructure at the district level. Data management is unsatisfactory in terms of accuracy, completeness, consistency, and timeliness, making the HMIS unreliable for effective programme planning and decision-making. There is a need to review HMIS indicators and to harmonise the data collection tools feeding into the HMIS in order to reduce data inconsistencies.
Recommendations
More HSAs, clinical staff, and statistical clerks need to be trained on HMIS to ensure accurate data capturing and timely reporting. Forms and registers should always be available to HSAs and medical personnel to avoid data gaps. The current forms need to be reviewed to address indicators for emerging and noncommunicable diseases. The MOH needs to employ HMIS officers with bachelor’s degrees, as stated in the job qualification requirements. We recommend further research on the extent of use of HMIS data for planning by district and health centre managers. We suggest that, if the MOH cannot employ HMIS officers with suitable qualifications, they consider upgrading those in place, and regular refresher courses should be organised to increase the competence of staff involved in data management at all levels.
Acknowledgements
The authors wish to acknowledge the district environmental health officers (DEHOs), district health officers (DHOs), and HMIS officers who provided written informed consent and participated in our study.
The study was conducted under the Scotland Chikwawa Health Initiative (SCHI), a collaborative project between the University of Strathclyde, Chikwawa District Health Office, the University of Malawi Polytechnic, and the Malawi Ministry of Health. This initiative was funded by the Scottish Government International Development Fund, with approval from the Malawi Ministry of Health and was conducting joint interventions in maternal and neonatal health in Traditional Authority Chapananga. The Polytechnic Research Committee also funded and approved the project to be extended to interviews across the Southern Region.
Competing interests
All authors declare that they have no competing interests related to this work.
References
1. Bodavala R. Evaluation of Health Management Information System in India Need for Computerized Databases in HMIS. Ranganayakulu Bodalava Takami Fellow Int Health Harv Sch Public Health [Internet]. 1998 [cited 2014 Jun 10];665. Available from: http://folk.uio.no/patrickr/refdoc/teams/Evaluation%20of%20HMIS%20in%20India.pdf
2. English PE. Information Quality Applied: Best Practices for Improving Business Information, Processes and Systems [Internet]. John Wiley & Sons; 2009 [cited 2016 Jun 23]. 840 p. Available from: http://eu.wiley.com/WileyCDA/WileyTitle/productCd-047013447X.html
3. Pierce R. Research Methods in Politics [Internet]. New Edition. SAGE Publications; 2008 [cited 2016 Jun 23]. 352 p. Available from: http://www.barnesandnoble.com/w/research-methods-in-politics-roger-pierce/1100205605?ean=9781412935517
4. Reiter A, Fischer B, Kotting J, Geraedts M, Jackel WH, Barlag H, et al. QUALIFY: Instrument for the Assessment of Quality Indicators. Dusseldorf Bundes Geschafts Stelle Qual Sich [Internet]. 2007 [cited 2016 Jun 23]; Available from: https://www.researchgate.net/profile/Burkhard_Fischer2/publication/267256474_QUALIFY_Instrument_for_the_Assessment_of_Quality_Indicators/links/55c2e65608aebc967defe710.pdf
5. Chisholm M. Is Consistency the Same as Quality in Data Reconciliation? IDQ Newsletter. 2010;6:3.
6. Alshawi S, Missi F, Eldabi T. Healthcare information management: the integration of patients’ data. Logist Inf Manag. 2003 Jun 1;16(3/4):286–95.
7. Juran JM, Godfrey AB, editors. Juran’s quality handbook. 5th ed. New York: McGraw Hill; 1999. 1 p.
8. Chaulagai CN, Moyo CM, Koot J, Moyo HBM, Sambakunsi TC, Khunga FM, et al. Design and implementation of a health management information system in Malawi: issues, innovations and results. Health Policy Plan. 2005 Nov;20(6):375–84.
9. Ministry of Health. Health Information Systems Assessment Report. Lilongwe, Malawi: Malawi Government; 2003.
10. Ministry of Health. Health Information System Assessment. Lilongwe, Malawi: Malawi Government; 2009.
11. Fauveau V. Effect on mortality of community-based maternity-care programme in rural Bangladesh. The Lancet. 1991 Nov;338(8776):1183–6.
12. Ministry of Health. National Reproductive Health Strategy. Lilongwe, Malawi: Malawi Government; 2009.
13. Versalovic J, Lupski JR. Molecular detection and genotyping of pathogens: more accurate and rapid answers. Trends Microbiol. 2002;10(10):s15–s21.
14. Graham WJ, Ahmed S, Stanton C, Abou-Zahr CL, Campbell OMR. Measuring maternal mortality: An overview of opportunities and options for developing countries. BMC Med. 2008 May 26;6(1):12.
15. Masangwi S. Scotland Chikhwawa Health Initiative Safe Motherhood Project, Chapananga, Chikwawa District, Malawi. Chikhwawa, Malawi: The Scottish Government; 2010.
16. C K, T M, S M, P M. Barriers to maternal health service use in Chikhwawa, Southern Malawi. Malawi Med J J Med Assoc Malawi. 2011 Mar;23(1):1–5.
17. Boerma T. The magnitude of the maternal mortality problem in sub-Saharan Africa. Soc Sci Med 1982. 1987;24(6):551–8.
18. Say L, UNICEF, United Nations Population Fund, World Health Organization, Reproductive Health and Research, World Bank. Maternal mortality in 2005: estimates developed by WHO, UNICEF, UNFPA and the World Bank. Geneva: World Health Organization; 2008.
19. World Health Organization, UNICEF, United Nations Fund for Population Activities, World Bank. Trends in maternal mortality: 1990 to 2010 : WHO, UNICEF, UNFPA, and The World Bank estimates [Internet]. 2012 [cited 2014 Jun 10]. Available from: http://www.who.int/reproductivehealth/publications/monitoring/9789241503631/en/
20. Zachariah R, Mwagomba B, Misinde D, Mandere BC, Bemeyani A, Ginindza T, et al. Vital registration in rural Africa: is there a way forward to report on health targets of the Millennium Development Goals? Trans R Soc Trop Med Hyg. 2011 Jun 1;105(6):301–9.
21. Ministry of Health. Malawi Health Management Information System Bulletin. Lilongwe, Malawi: Ministry of Health, Planning Department; 2005.
22. Davis CL, Pierce JR, Henderson W, Spencer CD, Tyler C, Langberg R, et al. Assessment of the reliability of data collected for the Department of Veterans Affairs national surgical quality improvement program. J Am Coll Surg. 2007 Apr;204(4):550–60.
23. Chapman AD. Principles of Data Quality [Internet]. GBIF Secretariat; 2005 [cited 2014 Jun 10]. Available from: http://www.gbif.org/resources/2829
24. Scannapieco M, Catarci T. Data quality under a computer science perspective. Arch Comput. 2002;2:1–15.
25. Managing Information Quality – Increasing the Value of Information in Knowledge-intensive Products [Internet]. [cited 2014 Jun 10]. Available from: http://www.springer.com/business+%26+management/business+information+systems/book/978-3-540-31408-0
26. Moyo C. An assessment of the quality of health management information system data in selected health facilities in Lilongwe district [Internet] [Thesis]. [Blantyre, Malawi]: University of Malawi, College of Medicine; 2005 [cited 2014 May 25]. Available from: www.medcol.mw
