National Institute for Literacy
 

[ProfessionalDevelopment 1973] Re: Debunking or Endorsing learning styles

Catherine B. King cb.king at verizon.net
Thu Feb 14 12:36:11 EST 2008


Hello Tom and all:

I have read Tom's notes and appreciate having The Learning and Skills
Research Centre at www.lsrc.ac.uk at hand.

Tom's note is instructive on many levels; however, a perusal of the site's
documents and abstracts suggests that a fuller (and more critical) picture
can be gleaned from them. That view
suggests that Tom has done a bit of interpretive "cherry-picking," perhaps
to lend credence to his self-described mantle as (I cringe) "Adult
Literacy's One Man Bunco Squad." (bunko?)

FIRST: Tom cites a study there saying:

"In a table showing their Overall Assessment of the Dunn & Dunn learning
style work, the authors state, quote 'Despite a large and evolving research
programme, forceful claims made for impact are questionable because of
limitations in many of the supporting (sic) are questionable because of
limitations in many of the supporting studies and the lack of independent
research on the model. Concerns raised in our review need to be addressed
before further use is made of the model in the UK.' End quote The same
recommendation should hold for the US."


The above rightly calls for concerns to be addressed and for further,
independent study of this issue. It doesn't call dogmatically for
"debunking" all research on and interest in LS/MI. On the contrary, a
review of the entire site (see a volume of quotes below) gives us a much
more reasoned and general picture: a call for inter-field collaboration,
further study, and integration of thought from all concerned, including
academics and practitioners; for inclusion of quantitative, qualitative, and
action-research models; and for an exceptional emphasis on the fluid
relationship between the student and the research community in all phases,
from question/problem to implementation and integration.

Also, one of the Dunn works in question in Tom's quote was from
1978--roughly at the beginning of a cycle of change that has finally begun
to break with positivist research methods and their unbalanced and
distorting hegemony over research methodologies--a group-think view that (I
argue) has hindered collaboration in and integration of the fields.

Clearly, we (or those in the UK) haven't found the end of the cycle and
balanced or integrated ourselves or our fields; however, a perusal of the
literature on the UK site is heartening, as it suggests a conscious and
self-reflective aim at doing so (see excerpts below), as well as a
well-developed and ongoing dialogue with qualitative, self-reflective, and
other methods of research that include the broader perspectives--unlike many
in Sticht's research community--and that would be a better foundation, one
that (instead of aiming at contemptuous debunking) could include further
qualified research on many issues, including LS/MI. I would like to suggest
that Tom inspect his own foundations and adopt a more dynamic attitude of
his own, so that we can all benefit from his good works.

Furthermore, the report cited above also cautions against "forceful
claims," especially by those who are "invested" (financially or otherwise)
in those claims. But this is a common and professionally delivered caution
that applies to ANY research situation and to any researcher and/or
theoretician, and certainly not only to research on learning styles or MI,
to "matching" teaching styles with learning styles, or to using diverse
methods to reach individual students, etc.

The caution here does not mean we should throw out or "debunk" all research
on learning styles or MI. The difference here is between (a) a reasoned,
exploratory, and critical response and (b) a knee-jerk one.

Also, at some point, researchers and theoreticians must recognize the
development of a critical mass of teacher wisdom, as well as of students who
"vote with their feet" in responding to methods that incorporate even
cursory knowledge about learning styles, etc. This is
evidence-in-the-making, if we pay attention to it.

It should at least raise some questions about methods, and about those in
our field whose automatic/dogmatic rejection of research interest in and
teacher use of such developments speaks more of their own closed-mindedness
and failure to thrive intellectually than it does of those who foster such
interest. The teachers I know who endorse such LS/MI-informed methods are
not invested monetarily; their investments lie wholly in their students'
understanding and achievements, as do mine. I refer here to several recent
posts--even today--from practitioners suggesting the varied use of
differentiated instruction drawing on learning-style/teaching methods,
student choice, etc.

Also, as the report writers cited below already understand, formal
statistical studies are not always the only or the best providers of
evidence here (see excerpts below). Further, the institute's seventh, more
general aim is to "encourage innovation and vision in the development of
policy and practice in learning and skills." This and the other material on
the site (some excerpted below) suggest an openness that does not come
forward in Sticht's notes.

SECOND, the site publishes other documents and a review, from which I have
quoted below what I think is relevant to this discussion for anyone's easy
reference--but a reading of the entire site is instructive, to say the least.

....and I must leave the discussion now for other work.

Catherine King

Everything below is quoted/excerpted from the LSRC site. I encourage those
who have time to visit the site. I suggest that Tom Sticht and his
colleagues read the whole report, and pay attention to its reasoned
inclusion of forms of research other than positivist or statistically based
research as valid/authentic, and to its high regard for listening to adult
students and practitioners.

QUOTED

This Learning and Skills Research Centre (LSRC) review is the first stage of
a three-year research programme addressing the question: what is the nature
of non-formal learning, and how does it contribute to the pursuit of further
learning and employment opportunities?


Research studies will typically benefit from a quantitative preface to the
particular study, whether the study itself uses quantitative or qualitative
methods. P. 8 on the site, and p. 17 in the downloadable copy.




From Idea to Impact: A guide to the Research Process.




The purpose of this publication is to encourage collaboration in research
and development in post-16 education and training through increased
understanding of the diversity of contributions. To achieve this, it
describes the different traditions, provides information and puts forward
ideas.



... aims to encourage collaboration through increased understanding of the
diversity of contributions across post-16 education and training. Greater
collaboration and wider engagement are important factors in increasing the
effectiveness and influence of research (Davies, Nutley and Smith 2000).



... To achieve this, it describes the different traditions, provides
information and puts forward ideas. If different kinds of knowledge are to
be drawn upon in conducting an investigation - knowledge drawn from practice
as well as from theory, for example - then people with different knowledge
backgrounds will need to work together. Practitioner participation is
important in:



identifying relevant research questions

advising on sampling or access to research subjects

contributing knowledge based on practice

interpreting emerging findings

elaborating implications of the findings for policy or practice.



For impact, it is helpful to think through in advance which kinds of people
will need to act on the findings, and to find ways in which they can be
involved at all stages of the research process. Systems and structures that
operate throughout the full research process are needed to facilitate the
interaction of the various players at different stages. These need to work
across the research, development, practice and policy communities.



The full research process calls for a wider range of skills, knowledge and
understanding than a single individual or even institution can normally be
expected to provide. It is important to analyse the roles needed and to plan
the way they interact. The full range of expertise needed to tackle
research from the original ideas through to impact will be located in
different professions and different institutions. Collaboration is called
for between individuals and between organisations.



The management of collaborative effort requires 'hybrid' managers, who speak
the languages of the various parties and are credible to each of them.

Research is an ill-defined word. There is little agreement about what it
embraces and what it is for. Its central role in decision-making and
improvement action is widely discussed, but less widely encountered. In many
areas of public life, the persistence of many social and economic problems
and the limited impact of interventions suggest that a stronger base of
evidence is needed to inform our practices and our strategic decisions.
Motorways designed to ease traffic congestion add to it. Housing planned to
reduce social distress adds to it. Education intended to enhance achievement
adds to social polarisation. Research is looked to for the remedy, but
expectations of it can be too high. Each research method has its
shortcomings and some problems of practice and policy remain intractable,
even after illumination by research. There is more to research than simply
the provision of evidence. It also breaks new ground conceptually, probes
the future and analyses the current state of things.



This publication focuses on research applied to improvement and discusses
this within the context of post-16 education and training, specifically that
of the learning and skills sector. It is descriptive rather than
analytical, and takes an inclusive approach to the many traditions of
research. Its purpose is to help the reader understand his or her own
activity in relation to that of others, with a view to encouraging better
integration of the various processes within research, development and
practical change. Research on evidence-based policy and practice in public
service areas such as health care, social care and criminal justice, as
well as education, suggests that the
development of partnerships between the various users of research enhances
the influence of evidence on practice (Davies, Nutley and Smith 2000).



In education, the question of practitioner involvement in the research
process was brought to prominence by David Hargreaves in a speech to the
Teacher Training Agency (TTA), where he suggested that the results of
research are sometimes not worth disseminating, because of insufficient
involvement of practitioners in setting the research agenda (Hargreaves
1996). In the FE sector, the regular involvement of practitioners in
development activity has led to proposals for a stronger, iterative
relationship between development and research (Stanton and Morris 2000).



Within the learning and skills sector as a whole, there are many pockets of
research and of development activity that could add up to a significant and
influential resource for the sector. They do not yet do so, however, partly
because they are not conceptualised and managed as a unified resource for
the service. In the field of health care, by contrast, a service-led
research and development (R&D) programme has been in existence since 1990,
functioning alongside the work of the biomedical sciences research community
(NHS R&D Programme 2002).



This publication is intended to help people involved in the planning of
research or of development to make sense of the various parts of a complex
system. As a first step, it considers types of research already taking place
in the learning and skills sector, and the uses to which they are put. It
addresses the stages and relationships within a holistic research
process. The expectation is that greater understanding of the motives and
traditions within the learning and skills research community will lead on to
more effective research.



p. 14 or 30:



In the area of guidance and learning support, a growing body of evidence is
available on what approaches work for different client groups (Martinez
2001) and how learners themselves perceive the process (Bloomer and
Hodkinson 1999). Pedagogic practice, an area that has been relatively
neglected hitherto in post-16 learning, has recently become the subject of a
major ESRC research programme (TLRP 2002b).



p. 19 or p. 35 below



A wide range of approaches - involving, for example, interview,
questionnaire surveys, biographical study, statistical analysis,
observation, photographic and diary recording - are all commonly used in the
sector. The merits and drawbacks of particular methods are dealt with
extensively in the literature.

Some useful introductory texts are given in the References section (Bell
1993; Wilkinson 2000).



An issue of particular importance here is the combination of methods.
Quantitative and qualitative approaches are sometimes wrongly counterposed,
when a judicious mix of the two might yield a more rounded view of issues of
practical concern. A salient example is the work of the Centre for Research
on the Wider Benefits of Learning, which combines analysis of the national
birth cohort studies (longitudinal studies of a sample of people born in a
given year) with systematic analysis of in-depth interviews with adults in
the community (WBoL 2002). The experience at this centre, as elsewhere, is
that statistical data is important in demonstrating trends and pointing up
relationships between factors for which a record exists (such as attainment
at school and subsequent success in the labour market); however, it is
limited by its lack of finer explanatory detail. Conversely, data drawn from
the lives of individuals is rich in such detail, but yields generalisable
conclusions less easily. For addressing issues of policy and practice, using
a combination of research methods is often advisable.




below p. 24 & 42



In the interpretation stage, knowledge drawn from practitioners may be
important in the 'transformation' of research findings. This process,
described by Desforges (2000), combines two kinds of knowledge: the
contextual understanding developed by experienced and reflective
practitioners, and the more generalised findings of systematic research.
Understanding drawn from practice, though not necessarily generalisable or
capable of validation, may provide important clues as to how to interpret
findings in ways that will
make sense to other practitioners. This influence may prove helpful in
creating research outcomes that are relevant and applicable.



An example of the process of combining knowledge from practice and research,
drawn from work for the DfES commissioned by the Social Exclusion Unit in
2001, involved investigating ways that public agencies could help young
people who are neither in education, employment nor training.

The work involved practitioners in community-based organisations working
with a research team to identify principles and practices. An outline of
this work is given in the box on page 21.






below p. 24 & 41





The active engagement of users of research in appropriate stages of the
research process contributes to both a sense of ownership among the user
community and enhancement of the quality and applicability of research
outcomes. The sense of ownership may be crucial in determining whether the
research evidence is ultimately used in practice. For research planners
wishing to develop a strategy for impact, it is helpful to think through in
advance which kinds of people will need to act on its findings, and find
ways to involve them at the stages identified above.

Research report Section 3 page 14/15



Below p. 26 & 41




Section 4 Working together

Research report page 16/17

Roles



In the various stages of the full process indicated in Figure 1 on page 9,
contributions are called for from many different kinds of professional.
Those identified with the investigative phase include discipline
specialists - social scientists, management scientists, historians,
anthropologists, for example; and specialists in methods - statistics,
surveying, market
research, qualitative analysis, for example. These may be associated with
higher education institutions, independent research institutes, colleges,
work-based or community-based providers, local LSCs, or charities, or they
may operate independently as freelancers. Other kinds of professional play
key roles in the planning and influencing phases, particularly several
kinds of practitioner. In the post-16 sector, these include teachers,
trainers and
lecturers, guidance and counselling staff, library and information
specialists, marketing and MIS people, as well as those managing teams,
services and institutions.



Likewise, issues relating to policy need to involve policy strategists,
developers and advisers, as well as those who implement it. Their knowledge,
drawn from practical experience, needs to inform the identification of key
priorities for research, the specification of research questions related to
practical problems, and the interpretation of findings. Without this kind
of knowledge, research runs the risk of being directed to less important
areas, of producing less useful outcomes, or of being poorly timed in
relation to the delivery of teaching and learning. In short, it risks being
less likely to influence actual behaviour.



Mediators ...

Communicators ...

Integration ...

Responsibilities ...

Collaboration ...






