Transforming the Face of Health Professions Through Cultural & Linguistic Competence Education:
The Role of the HRSA Centers of Excellence

Chapter 7: Assessment and Evaluation

Cultural and linguistic competence education is a relatively new and evolving field. Evaluation will determine whether the COEs have achieved their mandated goals, and its results serve as a guidepost for continuous improvement. The COEs are charged with developing innovative methods to teach cultural and linguistic competence more effectively and efficiently. It has been suggested that cultural and linguistic competence education programs that undergo rigorous evaluation are more credible to peers and policymakers; this enhanced credibility can, in turn, improve the programs’ acceptance and replication by other health professions schools.

Health professional education is organized so that students learn in a wide spectrum of settings, including classrooms, laboratories, health care delivery sites such as hospitals, health centers, and clinics, and extracurricular activities such as those in the community. Cultural attitudes and information can be woven into the operations of each of these settings. COEs face the difficult challenge of assessing and supporting cultural and linguistic competence across this educational spectrum.

When undertaking cultural and linguistic competence initiatives, it is critical that COEs make an initial assessment (establish a baseline) and then continuously assess the organization and the educational programming against this baseline. The role of evaluation in a change effort is to increase the likelihood that significant and sustainable change will occur by drawing attention to existing gaps and accomplishments.

To assist the COEs in considering evaluation strategies, this chapter highlights information related to educational and organizational assessment and evaluation, as well as a number of evaluation methods. It includes a discussion of educational assessments and evaluations, three examples of curriculum evaluation, organizational assessments and evaluations, the HRSA domains as a framework for organizational assessment, and integrated and stand-alone evaluation processes.

In particular, the Expert Team believes strongly that organizational assessments and evaluations should be considered core components of all cultural and linguistic competence programming. The organization plays a significant role in the development of students’ cultural and linguistic competence, and is a major component of the implicit curriculum.

Building assessments and evaluations into educational programming will also:

  • Improve the effectiveness of cultural and linguistic competence education for health professionals
  • Allow regular adjustments to the curriculum in response to the dynamic and multifaceted nature of culture
  • Provide a basis for the COE to determine which methods are effective in developing culturally and linguistically competent clinicians
  • Support the COEs in achieving their mandated goals

To accomplish these goals, a variety of evaluations should be conducted, both formative and summative. A formative evaluation is a preliminary assessment, often conducted with a small group of people to test various aspects of instructional materials; a summative evaluation determines whether students learned what they were supposed to learn.

Such evaluations can be used to track the effect of changes made in the explicit (formal) and implicit (hidden) curricula. Pre- and post-training assessments of student learning, using both quantitative and qualitative methods, are strongly recommended, along with tailoring of cross-cultural content to fit individual and group needs and capabilities. As defined earlier, some educators say the explicit curriculum is the formal program of learning, and the implicit curriculum is the “hidden” or unspoken component. (Chapter 10, Resources, Section III, provides a list of evaluations at the individual, organizational, and curricular levels.)
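To make the quantitative side of a pre- and post-training assessment concrete, the following minimal sketch (in Python) compares hypothetical scores from the same group of students before and after a training module. The scores, the group size, and the choice of a paired t-test are illustrative assumptions, not a prescribed COE method; qualitative methods would complement this kind of analysis.

    # Minimal sketch with hypothetical data: quantifying pre-/post-training change
    # on a knowledge assessment. The paired t-test is one common quantitative check;
    # the scores below are invented for illustration.
    from scipy import stats

    # Hypothetical scores for the same ten students before and after training
    pre_scores = [62, 70, 55, 68, 74, 60, 58, 66, 71, 63]
    post_scores = [75, 78, 64, 80, 79, 72, 65, 77, 81, 70]

    mean_change = sum(post - pre for pre, post in zip(pre_scores, post_scores)) / len(pre_scores)
    t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)

    print(f"Mean change: {mean_change:.1f} points")
    print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")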


I. Educational Assessments and Evaluations

In evaluating cultural and linguistic competence education, COEs should analyze four key aspects of educational programming.

A. The content of the program as defined by expert knowledge and standards in the field
B. The effect of the programming on student learning and performance
C. The effect on clinician learning, patient care, and health outcomes
D. The effect of the curriculum as a whole on students, faculty, administrators, and the organization.

A. Content of the Program

When evaluating the content of a program, COEs should ensure that the program is comprehensive. The Expert Team believes that COEs should use all three of the following frameworks for a comprehensive cultural evaluation:

  • The Tool for Assessing Cultural Competence Training (TACCT), developed by the Association of American Medical Colleges and scheduled to be published in 2005, which provides a framework that can be used across the entire curriculum (see reference in Chapter 10, Resources)
  • The Principles and Recommended Standards for Cultural Competence Education (www.calendow.org)
  • The ASKED framework, which is described in Chapter 4

B. The Effect of the Programming on Student Learning and Performance

“Curricular evaluation hinges on measuring whether the goals and objectives of a course have been met by determining whether the desired change in the learner’s attitudes, knowledge, or skills has been achieved.” (Weissman, J. and Betancourt, J.R., N.D.) Therefore, the standard means of evaluating a curriculum is to answer key questions about student performance. COEs must address three critical questions across their entire cultural and linguistic competence educational programming:

1. Are students learning what is taught?
2. Do students use what is taught?
3. How well do students use what is taught?

1. Are students learning what is taught?

Nora et al., using multiple-choice questions, showed that students had greater knowledge of Hispanic health and cultural issues after completing a ‘Spanish Language and Cultural Competence Curriculum.’ These students were also “less ethnocentric and more comfortable with others” (Nora et al., 1994). Another study reported that “family practice residents exposed to a three-year, multi-method cross-cultural curriculum had more cultural knowledge and cross-cultural skills, via self-report and faculty corroboration” (Gonzalez-Lee and Simon, 1987; as cited in Betancourt, 2003). Thus, attitudes, knowledge, and skills were changed.

As discussed, there are a variety of techniques that allow COEs to measure student learning in the dimensions of attitudes, skills, knowledge, encounter, and desire. Combining techniques will allow COEs to determine how much students have learned from their experiences beyond what they knew when entering the health professions school.

2. Do students use what is taught?

Health professions students are often assessed on their interactions with actual and standardized patients. These encounters, when observed and analyzed, can show whether students are able to apply what they have learned. As Betancourt notes, however, it is often difficult to “consistently assess clinical encounters in real time to assure that the behavior exhibited truly reflects the skills demonstrated in a controlled setting” (2003). A critical question is whether a student under time pressure, in a pediatric clinic with 10 families waiting for his or her services, is able to perform the same culturally sensitive history he or she conducted with a standardized patient in a structured setting.

3. How well do students use what is taught?

The question “how well” implies an evaluation of the quality of a clinician’s judgment. Betancourt suggests that qualitative physician and patient interviews can elicit whether cross-cultural skills have been used effectively. The challenge lies in assessing how well these skills are employed in a real clinical setting. For example, trained reviewers can evaluate video- and audio-taped clinical encounters to judge the quality of student actions. The assessment checklist should contain items addressing the behaviors that reflect students’ attitudes.


Like et al. (1996) noted that culturally competent clinicians require a variety of skills in diagnosis (e.g. eliciting the patient’s perspective about health and illness), education (e.g. providing culturally sensitive patient education and counseling), and treatment (e.g. prescribing or negotiating a culturally sensitive treatment plan). In testing students for these skills, COEs and other schools have the opportunity to measure and improve the curriculum itself, as well as train clinicians who will apply its principles more effectively. Testing for skills also has a symbolic effect in that it tells students and faculty that cultural and linguistic competence skills are important to the school.

A useful evaluation tool may be the LEARN mnemonic (Berlin and Fowkes, 1983), which offers a framework for considering how students may learn, practice, and be evaluated on these skills. It is included here simply as an example of how mnemonics can be used in evaluation.
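To illustrate how a mnemonic-based reviewer checklist might be tallied, the following minimal sketch (in Python) scores an observed encounter against the five LEARN steps. The 0-2 rating scale, the item wording, and the example ratings are illustrative assumptions rather than an established COE instrument.

    # Minimal sketch with a hypothetical rubric: tallying a reviewer checklist built
    # around the LEARN mnemonic (Listen, Explain, Acknowledge, Recommend, Negotiate).
    # The 0-2 scale and the example ratings below are invented for illustration.

    LEARN_STEPS = [
        "Listen to the patient's perception of the problem",
        "Explain your perception of the problem",
        "Acknowledge and discuss differences and similarities",
        "Recommend treatment",
        "Negotiate agreement",
    ]

    def score_encounter(ratings):
        """Sum reviewer ratings (0 = not done, 1 = partial, 2 = done well) across the LEARN steps."""
        if len(ratings) != len(LEARN_STEPS):
            raise ValueError("One rating is expected per LEARN step")
        return sum(ratings), 2 * len(LEARN_STEPS)

    # Example: a single videotaped encounter rated by one trained reviewer
    total, maximum = score_encounter([2, 1, 2, 2, 1])
    print(f"LEARN checklist score: {total}/{maximum}")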

C. The Effect on Clinician Learning, Patient Care, and Health Outcomes

Does what is taught affect patient care, and ultimately health outcomes? Because of the three-year cycle of COE operations and the newness of the COE programs, there is not yet enough alumni data available from COEs to answer this question. Furthermore, COEs have not traditionally been asked this question. Even if such data were available, it is not necessarily a question a single COE could address. Answering the question may require a collaboration of multiple COEs and a careful and rigorous evaluation design.

“Connecting the dots,” as Betancourt calls it, presents a set of challenges. Does what is included in a curriculum affect health outcomes? He also notes the difficulty of evaluation, even with skilled, unbiased evaluators: “It is important that we not hold cross-cultural curricula to unfair evaluation standards; detractors have asked for a direct link between curricula and the improvement of hard clinical outcomes.” (2003)

Health professions students graduating from COEs will be practicing clinicians for many years. Their undergraduate and graduate education should serve as a foundation for lifelong learning in cultural and linguistic competence. If the ideal goal is to measure the effect of cultural and linguistic competence education on clinicians’ behavior in patient care settings, and that behavior’s effect on patient outcomes, COEs need to begin collecting quantitative and qualitative data that will lay the foundation for future evaluation of such performance. This form of evaluation becomes ever more challenging over time as students move further from the classroom experience. One possibility may be longitudinal studies of students from varied programs to observe how their practice patterns and patient outcomes vary. The methodological challenges related to intervening variables and comparable patient populations are substantial. Again, such research would likely be beyond the scope of any single COE, but could be an attractive opportunity for a collaborative effort.

D. Evaluating the Curriculum as a Whole

While COEs should evaluate their students’ development as culturally competent clinicians based on the curricula’s effect on student performance, the entire curriculum (explicit, implicit, and null) should be evaluated in an ongoing manner. Student evaluations will determine whether individual students have learned enough “baseline competencies” to proceed or graduate. Such evaluations also will be useful in helping students learn more effectively. However, a formative evaluation of the entire program, in other words the curriculum, can highlight successes and identify opportunities for improvement.

A comprehensive evaluation may also include the curriculum development and implementation processes by attempting to determine if the curriculum is inclusive and culturally competent, and how the faculty creators might evaluate and improve their own cultural and linguistic competence. This formative approach is parallel to the developmental and continuous-improvement approach recommended for student evaluation.


II. Three Examples of Curriculum Evaluation

A. Evaluating Students in Cross-cultural Education

Regardless of the manner in which cultural and linguistic competence is taught or transmitted, the outcomes should have one common theme; as Gilbert notes, “consistent high-level expectations should be obtained.” Evaluation of students’ mastery of cultural and linguistic competence attitudes, skills, knowledge, encounter-based learning, and desire should rely on a variety of techniques, both qualitative and quantitative, including oral and written examination, self-assessment, and, where possible, evaluation of the application of attitudes, knowledge, and skills in the actual practice setting. Given the variety of cultural and linguistic competence training and educational venues and modalities, assessment strategies need to be flexible and adaptable to the training circumstances. In doing this work, the standards for evaluating cultural and linguistic competence learning in Principles and Recommended Standards for Cultural Competence Education of Health Care Professionals (2003), from The California Endowment (www.calendow.org) in Woodland Hills, CA, may be a helpful tool.

COEs may use multiple methods of evaluation to measure changes in students’ attitudes, knowledge, and skills, as shown in the list of evaluation tools below. The specific combination of methods will depend on each COE’s resources and needs.

Evaluation Tools

Written and Fact-Based Examinations

  • Pre-post questionnaires and multiple-choice exams
    Areas evaluated: Awareness, skills, knowledge
    Description/Uses: These could be designed to assess students’ knowledge, attitudes, and skills through incorporation of clinical cases. COEs and others may wish to develop examinations based on Nora LM, et al. Improving cross-cultural skills of medical students through medical school-community partnerships. West J Med. 1994;161:144-7, and Nunez AE. Transforming cultural and linguistic competence into cross-cultural efficacy in women’s health education. Acad Med. 2000;75:1071-80.
  • Cultural Competence Health Practitioner Assessment (National Center for Cultural Competence)
    Areas evaluated: Awareness, skills, knowledge
    Description/Uses: A 20-minute questionnaire with six subscales. Developed for practicing clinicians; may be useful for students as well. Completing the survey online provides assessment results and referral to appropriate resources based on those results. http://www11.georgetown.edu/research/gucchd/nccc/features/CCHPA.html
  • Latino Cultural Competence Self-Assessment (Nilda Chong, Kaiser Permanente)
    Areas evaluated: Knowledge, skills
    Description/Uses: A 20-item, self-administered questionnaire assessing cultural knowledge and patient interaction skills. Developed for practicing clinicians; may be useful for students as well. The Latino Patient: A Cultural Guide for Health Care Providers, p. 85-87.
  • “The Provider’s Guide to Quality & Culture” (Management Sciences for Health; U.S. Department of Health and Human Services, Health Resources and Services Administration, Bureau of Primary Health Care)
    Areas evaluated: Awareness, knowledge
    Description/Uses: A 23-item, self-administered online questionnaire. http://erc.msh.org/mainpage.cfm?file=2.0.htm&module=provider&language=English
  • A Family Physician’s Practical Guide to Culturally Competent Care (http://cccm.thinkculturalhealth.org)
    Areas evaluated: Knowledge, skills, awareness/attitudes
    Description/Uses: An online/DVD course developed by the HHS Office of Minority Health. Includes assessments leading to CME credits. COEs are encouraged to develop their own interactive online or DVD/CD-ROM tools for assessment.

Real and Simulated Clinical Encounters

  • Objective Structured Clinical Examinations (OSCEs) (see Appendix A for a sample)
    Areas evaluated: Knowledge, skills, awareness
    Description/Uses: Students examine standardized patients (actors) from diverse backgrounds presenting cross-cultural issues. It is important to integrate cross-cultural issues seamlessly into the encounter or stations. COEs should develop OSCEs that assess knowledge and skills, as well as the behaviors/attitudes important for cross-cultural communication.
  • Videotaped/audiotaped clinical encounters
    Areas evaluated: Knowledge, skills, awareness
    Description/Uses: Students are recorded examining actual patients as part of their clinical experience. COEs developing and using this method are encouraged to publish their research and tools to advance the field.

Curriculum Assessment

  • Tool for Assessing Cultural Competence Training (TACCT), AAMC
    Areas evaluated: Knowledge, skills, awareness
    Description/Uses: Provides an opportunity to identify and monitor cultural competence education across the basic science and clinical curriculum. COEs are encouraged to assess the overall effect of the curriculum as a whole on students’ knowledge, attitudes, and skills.

B. Assessing Clinical Skills

The National Asian American Pacific Islander Mental Health Association (NAAPIMHA) has developed a curriculum to address the mental health needs of Asian Americans and Pacific Islanders. Using a workforce training grant from the U.S. Department of Health and Human Services, Substance Abuse and Mental Health Services Administration, Center for Mental Health Services, the association sought to help reduce disparities in mental health care for diverse populations by building workforce capacity. As mentioned in Chapter 5, the Growing Our Own curriculum is based on the DSM-IV-TR Outline for Cultural Formulation.

In addition to the curriculum, NAAPIMHA has developed an evaluation design that uses Standardized Patient (SP) protocols to assess the clinical skills of interns. Often used in medical school training, these protocols may be an effective tool for assessing the cultural competency of therapists in training. The SP evaluation protocol uses trained actors and scripted vignettes involving Asian-American patients to assess the effectiveness of the training program. The evaluation of trainees from all sites has been done at the UCSF Clinical Skills Center, which is used primarily to assess UCSF medical students through SP protocols. Each trainee interviews two SPs and then writes a brief DSM-IV-TR Outline for Cultural Formulation. Trainee evaluations are based on the written outline, review of the videotaped interviews, and written feedback from the SP on the quality of the clinician-consumer interaction.

C. Using the CLAS Standards as a Framework for Assessment

The Center for Healthy Families and Cultural Diversity at the University of Medicine and Dentistry of New Jersey-Robert Wood Johnson Medical School has been actively involved in providing training about cultural competency and racial and ethnic health disparities, and employing quality improvement methods to evaluate the impact of practice interventions. The work they have done suggests some potential evaluation strategies that could be adapted by the COEs in assessing their programs. Both qualitative and quantitative assessment approaches were used.

In 2001, the center was awarded a two-year grant from the Aetna Foundation’s Quality Care Research Fund to assess, along with other quality improvement issues, whether integrating a cultural competency training program into ongoing quality improvement activities at two large urban family practices would result in improved physician knowledge, skills, attitudes, and comfort levels relating to the care of patients from diverse backgrounds. Another goal related to cultural competency issues was to learn more from physicians, staff, and patients about the challenges involved in meeting the DHHS Office of Minority Health’s Culturally and Linguistically Appropriate Services (CLAS) Standards.

  • Assessing Gains in Clinical Cultural Competency: An assessment tool, the Clinical Cultural Competency Questionnaire (CCCQ), was administered to 17 faculty physicians both before and after A Cultural Competency for Health Care Providers Training Program was presented to faculty, residents, and medical students. The training program consisted of five 1.5-hour interactive seminars and workshops over an eight-month period. Findings: Pre- and post-training assessments showed that physicians’ self-perceived cultural competence knowledge, skills, and comfort levels increased significantly.
  • Addressing the CLAS Standards: Four in-depth interviews were held with the Medical Directors and Practice Managers at the two study sites. In addition, six focus groups were conducted with physicians, staff, and patients at the two sites. Patients, staff, and physicians, while not initially fully familiar with the CLAS Standards, were highly interested in learning about ways to infuse cultural competency into patient care delivery systems. Significant challenges to implementation were also noted and discussed.

The research suggests that Quality Improvement (QI) teams can positively affect the provision of culturally responsive care in a clinical setting. The project staff found that practice-based evaluation research, while challenging, can be successfully carried out in busy primary care settings if attention is paid to 1) obtaining the support and buy-in of leadership and champions, 2) identifying the appropriate personnel, technological, and financial resources, and 3) carefully planning and executing the study. Quantitative and qualitative tools that can help measure physicians’ self-perceived cultural competence do exist (a sample CCCQ is included in Appendix A). The results of the project also indicated that multi-method assessment strategies are useful in providing a richer and deeper understanding of cultural competence in a practice setting.


III. Organizational Assessments and Evaluations

Organizational assessments and evaluations should be considered core components of all cultural and linguistic competence programming. Typically, an initial assessment involves articulating the desired outcomes or goals and establishing the methods of measurement and evaluation. A cultural and linguistic competence evaluation is a means of charting and measuring change and progress and a means of developing and clarifying organizational self-awareness. In addition, the organization plays a significant role in the development of students’ cultural and linguistic competence and is a major component of the implicit curriculum. As has been demonstrated, the context in which education takes place is equally as important as the content. An organization that does not practice cultural and linguistic competence will have difficulty teaching cultural and linguistic competence. It is therefore necessary that each COE continually assess its organizational cultural awareness in order to teach cultural and linguistic competence (see Section IIIB in the Resources chapter for a listing of organizational assessments).

As the COE begins to address specific issues related to cultural and linguistic competence, it may encounter challenges from those who represent the structures and processes of the university, the health delivery system, or public policy. As a result, those leading the effort to develop such competence will need to adapt and adjust to accommodate these challenges. It is critical that each COE maintain an awareness of its own internal development. For example, a COE seeking to understand and address issues of URM faculty advancement may need to engage in conversations or even negotiations with an individual or group that does not value cultural and linguistic competence. In addressing these challenges, the COE may influence or be influenced by the curriculum, other health professions schools, and public policy.

Cultures, and our understanding of them, are constantly changing, requiring continuous assessments and dynamic program evaluations. The absence of organizational assessments, or evaluations performed against inflexible pre-established goals, risks cultural and linguistic competence education becoming irrelevant, or even stereotypical and harmful.

We propose an approach in which the evaluators are partners with the COE in developing and promoting organizational cultural and linguistic competence. Systemic change is difficult in any environment, particularly in academia. COEs are relatively small, distinct entities within large universities and within larger health care delivery and training networks. The role of the evaluator is to support the COE in developing awareness of its cultural and linguistic competence and in better understanding its own strengths and challenges across the various cultural and linguistic competence dimensions. The initial evaluation helps the COE to understand where it is in comparison with others and with the ideal vision. Program staff and evaluators then work in partnership to design, implement, and evaluate the COE’s cultural and linguistic competence efforts. Evaluators in this context provide real-time information to enable the COE to make informed decisions and give program leaders information they would not otherwise be able to gather. This permits a seamless and more participatory integration of cultural and linguistic competence programming across the entire organization.

COEs seeking to use the CLAS standards can make them applicable by:

  • Replacing the term “Health Care Organizations” with COEs
  • Including “faculty” and “students” when the standards say “staff”
  • Adding “education” and “research” to the patient care element when the standards say “services.”

Here are some specific examples of adapting the CLAS standards for COE use.

[COEs] should implement strategies to recruit, retain, and promote at all levels of the organization a diverse [faculty, student body,] staff, and leadership that are representative of the demographic characteristics of the service area.

Many COEs choose to focus on specific populations while others work across populations. To adapt this CLAS standard, all COEs will need to describe the characteristics of a “diverse faculty, student body, staff, and leadership.” This description is essential to develop and implement the diverse strategies needed to achieve this standard. Doing so will provide a basis for evaluation. For COEs with a focused population, the concept of “service area” does not apply. Therefore, COEs could consider their unique stakeholders’ needs and develop an appropriate definition of COE participants.

[COEs] should ensure that [students]/patients/consumers receive from all staff members effective, understandable, and respectful [education] care that is provided in a manner compatible with their cultural health beliefs and practices and preferred language.

COEs could examine themselves for cultural barriers that make it more difficult for some students to succeed and respond accordingly. Such barriers could involve different learning styles, issues of direct versus indirect communication, and the challenges in leaving behind family support. This standard is complementary with COEs’ mandate to assess and improve the performance of students from underrepresented minorities.

[COEs] must offer and provide language assistance services, including bilingual staff and interpreter services, at no cost to each patient/consumer with limited English proficiency at all points of contact, in a timely manner during all hours of operation.

In addition to teaching students how to work with interpreter services, this standard suggests the need to address patients or consumers with limited English proficiency who interact with the COE and its students. These may include community members, extended families of students, and patients.

Health care organizations should maintain a current demographic, cultural, and epidemiological profile of the community [student, staff, faculty and patient populations] as well as a needs assessment to accurately plan for and implement services that respond to the cultural and linguistic characteristics of the service area.

COEs define their service populations in terms of demographic groups and conditions rather than geographic service areas. COEs will therefore develop and maintain needs assessments and population profiles that reflect the communities they serve.
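As one illustration of how such a profile could feed a baseline assessment, the following minimal sketch (in Python) compares a hypothetical student-body profile against a hypothetical target population profile and reports the gap for each group. The group names and percentages are invented for illustration; an actual assessment would draw on the COE’s own needs assessment and demographic data.

    # Minimal sketch with hypothetical groups and percentages: comparing a student-body
    # profile with a target population profile to establish a diversity baseline.
    target_population = {"Group A": 0.35, "Group B": 0.25, "Group C": 0.30, "Group D": 0.10}
    student_body = {"Group A": 0.20, "Group B": 0.30, "Group C": 0.40, "Group D": 0.10}

    for group, target_share in target_population.items():
        student_share = student_body.get(group, 0.0)
        gap = student_share - target_share  # positive = overrepresented, negative = underrepresented
        print(f"{group}: target {target_share:.0%}, students {student_share:.0%}, gap {gap:+.0%}")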


IV. HRSA Domains as a Framework for Organizational Assessment

While the CLAS standards offer substantial guidance in developing culturally and linguistically competent organizations and programs, the HRSA domains offer specific areas that permit quantitative as well as qualitative analysis.

The following adaptation of the HRSA Organizational Cultural Competence Profile may be used as an organizing framework. (It was developed by Husbands/Stubblefield-Tave in the cultural proficiency assessment of the University of Texas College of Pharmacy).

Communication: This area involves the exchange of information between the college (the faculty and staff) and the students and broader community, and internally among the faculty and staff, in ways that promote cultural and linguistic competency. The areas to address in this realm include:

  • Understanding the communication needs of the students
  • Offering culturally competent communication
  • Communicating within the college

Services: The college’s delivery of educational programming in a culturally competent manner. The areas to address include:

  • Student/faculty/community input into educational activities
  • Assessment and educational planning
  • Educational guidelines and framework that address differences related to culture

Organizational infrastructure: The organizational resources required to deliver or facilitate delivery of culturally competent education, which include:

  • Financial and budgetary infrastructure
  • Faculty and staff development
  • Providing physical facilities that support culturally competent education

Organizational values: The college’s perspective and attitudes with respect to the worth and importance of cultural competency and its commitment to provide culturally competent education.

Governance: The goal-setting, policy-making, and other oversight vehicles the college uses to help ensure the delivery of culturally competent education.

Planning, monitoring, and evaluation: The mechanisms and processes used for long- and short-term policy, programmatic, and operational cultural competency planning that is informed by external and internal consumers, as well as the systems and activities needed to actively track and assess the college’s level of cultural competency.

Faculty and staff development: The college’s efforts to ensure faculty, staff, and other service providers have the requisite attitudes, knowledge, and skills for delivering culturally competent education.

The HRSA domains as a framework for organizational assessment have proven useful at the University of Medicine and Dentistry of New Jersey-New Jersey Medical School. The NJMS-HCOE has partnered with the UMDNJ Bildner Project to translate its experiences and practices into the attainment of cultural competency at the organizational, school, and health care levels throughout the university. Students, faculty, and administrators will benefit from this approach. For two years, the UMDNJ Bildner Project Team conducted interviews and focus groups to identify strategies that members of the university community believed were integral to the successful incorporation of cultural and linguistic competence at all levels. This information has been analyzed and will be used as the framework for the development of cultural and linguistic competency training, curricula, and other educational services and products. Using this framework, the HCOE can leverage university-wide expertise and programs that already exist, thus avoiding duplication and extending its capacity to achieve organizational change.

V. Integrated and Stand-Alone Evaluation Processes

Evaluation of cultural and linguistic competence can be integrated into other evaluation processes, conducted as a stand-alone activity, or both. Making this decision involves evaluating the unique resources and needs present in each COE.

The University of Pennsylvania, for example, has integrated cultural and linguistic competence curriculum evaluation into its campus-wide curriculum evaluation process and supplemented it with evaluation methods that recognize the unusual nature of cultural and linguistic competence education (Jerry Johnson, University of Pennsylvania, comments during HRSA COE focus group, March 2004).

The University of Texas, College of Pharmacy and the University of Colorado School of Pharmacy have employed stand-alone evaluations of their schools’ cultural and linguistic competence. These evaluations were developed and facilitated by an outside consulting group, The Cultural Imperative. The University of Texas, College of Pharmacy used the evaluation report as part of its accreditation process and created an ongoing committee to evaluate and implement findings of the assessment.

Ultimately, evaluation will determine whether the COEs have achieved their mandated goals.

 

   