
SEVERITY OF NEED INDEX (SON)

 

PROCESS

Panel Process

Panelist Responsibilities. The responsibility of each panel was to add structure and specificity to previous conceptual discussions regarding SON. Panelists determined the specific items that could be included in a SON Index. The results of their deliberations and research were compiled into a written report. Members of the HRSA/HAB Workgroup and the Contractor Team were assigned to each panel to provide assistance as needed.

Panelist Tasks
Defining Components
  • Examine the proposed set of variables
  • Define the meaning of each variable and how it influences a component of the SON equation (disease burden, cost of care, or available resources)
  • Add and remove variables, with documented justification

Identifying data elements, their sources, and availability
  • Determine and report on data sources for each variable and whether they are available at the county level
  • If data are available only at the State level, consider whether it is reasonable to use State-level data for county-level estimates
  • Determine how frequently data for each measure are compiled
  • Determine the availability (cost, necessary use agreements) of each data element
  • If measures are limited for a data element, determine whether other measures can be used as proxies; if so, consider the same issues for the proxy measures
  • Determine the uniqueness and importance of each variable. Does the variable measure some unique component of the panel's subject area that relates to resource needs?

Identifying problems
  • Assess the reliability of measures. Are elements measured in a similar way across units of analysis?
  • Assess the validity of measures. Do elements measure the intended underlying concept or phenomenon?
  • Assess the measurement error of each data element. Is measurement error systematic in a way that would bias resource allocations to the benefit of certain areas?
  • If issues of reliability, validity, or systematic measurement error exist, are they resolvable with statistical adjustments?
  • Do the measurement problems of the variable outweigh its information value?
  • Do any other problems preclude the use of this variable?

Creating final variable list
  • List the variables recommended for inclusion
  • Provide justification for each included variable
  • Indicate data sources and measures for each variable

Weighting
  • Recommend relative weights for each measure used to indicate a variable (for example, the different measures used to indicate disease burden)
  • Recommend relative weights for each variable used to indicate a component such as disease burden or availability of health care resources
  • Articulate a rationale for the weights (this two-level weighting structure is illustrated in the sketch following this list)
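
The two-level weighting described under the Weighting task, with measure weights combined into a variable score and variable weights combined into a component score, can be sketched as a simple weighted aggregation. The following Python sketch is illustrative only: the variable names, measure values, and weights are hypothetical placeholders, not the weights the panels recommended.

```python
# Illustrative sketch of the two-level weighting described in the Panelist
# Tasks list: measures are weighted to form a variable score, and variables
# are weighted to form a component score, which would then feed the index.
# All names, values, and weights below are hypothetical placeholders.

def weighted_score(values, weights):
    """Combine scores using relative weights that sum to 1."""
    if abs(sum(weights.values()) - 1.0) > 1e-9:
        raise ValueError("relative weights must sum to 1")
    return sum(values[name] * weight for name, weight in weights.items())

# Level 1: weight the measures that indicate a single variable
# (a hypothetical disease-burden variable for one county, measures normalized 0-1).
disease_burden_variable = weighted_score(
    values={"prevalence_measure": 0.82, "incidence_measure": 0.64},
    weights={"prevalence_measure": 0.6, "incidence_measure": 0.4},
)

# Level 2: weight the variables that indicate a component.
disease_burden_component = weighted_score(
    values={"disease_burden_variable": disease_burden_variable,
            "comorbidity_variable": 0.55},
    weights={"disease_burden_variable": 0.7, "comorbidity_variable": 0.3},
)

# The components (disease burden, cost of care, available resources) would
# then be combined into the Severity of Need Index for each jurisdiction;
# that final combination is not specified here.
print(f"Hypothetical disease burden component score: {disease_burden_component:.3f}")
```

In a sketch like this, requiring the relative weights at each level to sum to 1 makes every aggregation a weighted average, which keeps scores on a comparable scale across areas.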

Panel Operation. Each panel was staffed with members from four areas. Panelists were chosen on the basis of their ability to provide expertise in their particular area and beyond.

Panel Representatives and Their Recommended Roles
  • Subject Matter Experts: introduce the main challenges of measuring SON from an academic and research perspective
  • Ryan White Program Representatives: provide input from an experiential or contextual perspective in the SON index development process
  • HHS and HRSA/HAB Representatives: provide input and act as a resource in SON index development; a HAB representative served as co-chair for each panel
  • Contractor Team (Altarum): receive input from panels and HRSA on SON model(s); facilitate meetings and conference calls; coordinate communications, travel arrangements, reimbursements, and payments of honoraria
  • Contractor Team (RTI): synthesize the content of discussions between meetings and raise clarifying questions to ensure that panel recommendations can be translated usefully into a quantifiable index

Initiating discussion. Each panel began with a discussion of its specific area, as outlined on the corresponding chart. Once the overall conceptualization was addressed, the panel discussed each variable in turn, based on the following questions:

  • How did the IOM Committee define the variable?
  • What is missing in this description?
  • How exactly does this variable influence SON?
  • What would be an ideal measure of this variable?
  • Realistically, what variables are likely to be available?
  • What are the specific measures and what are their sources?
  • For each variable, are data available at the county level? If not, is it reasonable to use State-level data?
  • How frequently are data compiled?
  • Are the selected data sources available for free?
  • If measures are limited for data elements, are there measures that can be used as proxies? (One possible county/State/proxy fallback is sketched after this list.)
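
As a concrete illustration of the availability questions above, the sketch below shows one way a county-level estimate might fall back to State-level data, and then to a proxy measure, when county data are missing. The lookup tables, values, and fallback order are assumptions for illustration, not a documented HRSA procedure.

```python
from typing import Optional

# Hypothetical lookup tables; in practice these would come from the data
# sources identified by the panels. All figures are placeholders.
county_values = {("TX", "Harris"): 0.42}   # county-level measure, where available
state_values = {"TX": 0.38, "CA": 0.35}    # State-level measure
proxy_values = {("NY", "Kings"): 0.29}     # county-level proxy measure

def estimate_measure(state: str, county: str) -> Optional[float]:
    """Return a county estimate, preferring county data, then State-level
    data applied to the county, then a proxy measure. The fallback order
    is an assumption for illustration."""
    if (state, county) in county_values:
        return county_values[(state, county)]
    if state in state_values:
        return state_values[state]               # apply the State value to the county
    return proxy_values.get((state, county))     # may be None if no proxy exists

print(estimate_measure("TX", "Harris"))   # 0.42: county data used directly
print(estimate_measure("CA", "Alameda"))  # 0.35: State-level value applied to the county
print(estimate_measure("NY", "Kings"))    # 0.29: proxy measure used as a last resort
```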

At the meeting’s conclusion, the Contractor Team summarized the session notes and forwarded them to the panel for approval. Panelists had the opportunity to propose changes to the notes. The session notes then served to document the process.

Electing a panel chair. To ensure that all variables were covered, that panelists’ views had been adequately addressed, and that the group progressed as intended, panelists elected a co-chair to work directly with HAB and the Contractor Team. Each panel also had a HRSA/HAB co-chair.

The panel co-chairs were ultimately responsible for ensuring that the panel arrived at recommendations that the Contractor Team could use to generate a Severity of Need Index. At the end of the expert panel deliberations, each panel was responsible for submitting the following to the contractor:

  • A completed data template for each variable included in the model (one plausible shape for such a template is sketched after this list)
  • A document explaining the rationale for the inclusion of each variable on the list.
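
The completed data template itself is not reproduced in this document. The sketch below shows one plausible shape for such a template, with fields drawn from the availability and measurement questions the panels were asked to address; the field names and example values are assumptions, not HRSA's actual form.

```python
from dataclasses import dataclass, field

@dataclass
class VariableDataTemplate:
    """Hypothetical per-variable template capturing the items the panels were
    asked to document; field names are illustrative only."""
    variable_name: str
    component: str                    # disease burden, cost of care, or available resources
    measures: list[str]
    data_sources: list[str]
    geographic_level: str             # "county" or "State"
    update_frequency: str             # how often the data are compiled
    cost_and_use_agreements: str      # availability constraints, if any
    proxy_measures: list[str] = field(default_factory=list)
    reliability_validity_notes: str = ""
    rationale_for_inclusion: str = ""

# Hypothetical example entry for a single variable.
example = VariableDataTemplate(
    variable_name="HIV prevalence",
    component="disease burden",
    measures=["persons living with HIV/AIDS"],
    data_sources=["(panel-identified surveillance source)"],
    geographic_level="county",
    update_frequency="annual",
    cost_and_use_agreements="publicly available",
)
```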

The panel co-chairs were responsible for finalizing their reports and delivering them to the contractor. Although unanimous consensus among panel members on the contents of their recommendations was desirable, it was not required for finalization. The co-chairs strove to reach broad consensus among panelists whenever possible and were encouraged to call for votes among the panelists to resolve difficult issues; the nature of any controversies was documented in the panel's final report.

The specific roles and responsibilities of the panel co-chairs were to:

  • Set the agenda for each conference call
  • Facilitate the division of tasks among panel participants
  • Forward formal requests for technical assistance to the contractor
  • Call for votes to resolve controversial issues
  • Deliver a completed data template and written documentation of the group's decisions to the contractor.

Voting. Panelists were asked to vote on the group's recommendations as a whole, and the work of the panel was not considered complete until it was ratified by a majority of panel members. The panel chair was empowered to call votes to resolve disagreements and move the process forward. Only panel members (excluding the Contractor Team) were empowered to vote to resolve disputes. The Contractor Team, interested members of other panels, and delegates of named panel members could freely offer input on the panel deliberations but were not empowered to vote.

Organizing additional work. To facilitate completion of the project, panel members volunteered to take responsibility for coordinating the responses for individual variables. Contractor Team and HRSA/HAB representatives also assisted panelists.

Reporting results. The end product from each panel was a written report that outlined the information described above for each data element examined. The Contractor Team and HRSA/HAB representatives were available to assist with the creation of these documents.