
The Ohio State University


Qualitative v. Quantitative Research Reflection

Initially, after learning about, reading about, and researching these two methods of approaching research in the social work field, I found myself immediately drawn towards quantitative research. Numbers make sense to me, and in theory it seems logical and convenient to reduce the human experience to a data set of numbers that I can calculate and compute into a meaningful answer. However, it has become clear to me over these past few weeks, through studying and reading more qualitative studies, that they can be an incredibly valuable resource for actually understanding and sympathizing with the material we are researching. I believe that qualitative research gives both the researcher and the practitioner applying the research's conclusions a good understanding of the human component and the nuances that go into implementing interventions. Often, qualitative research can explore these complexities a little more delicately than quantitative research can, because quantitative data has been synthesized into numbers. Qualitative research does have its downfalls, though. While all forms of research are subject to various biases, qualitative research seems to carry a higher risk because, instead of interpreting numbers and calculations, we must interpret human thoughts, feelings, and experiences, which are much less concrete variables. It may also be harder to reach a definitive, mathematically supported answer to the question being posed. Ultimately, I believe a mixed-methods approach could combine the advantages of both, so that the research covers both the concrete evidence of quantitative research and the complex insight of qualitative research.


Leeds Beckett University

Skills for Learning : Research Skills

Data analysis is an ongoing process that should occur throughout your research project. Suitable data-analysis methods must be selected when you write your research proposal. The nature of your data (i.e. quantitative or qualitative) will be influenced by your research design and purpose. The data will also influence the analysis methods selected.

We run interactive workshops to help you develop skills related to doing research, such as data analysis, writing literature reviews and preparing for dissertations. Find out more on the Skills for Learning Workshops page.

We have online academic skills modules within MyBeckett for all levels of university study. These modules will help your academic development and support your success at LBU. You can work through the modules at your own pace, revisiting them as required. Find out more from our FAQ What academic skills modules are available?

Quantitative data analysis

Broadly speaking, 'statistics' refers to methods, tools and techniques used to collect, organise and interpret data. The goal of statistics is to gain understanding from data. Therefore, you need to know how to:

  • Produce data – for example, by handing out a questionnaire or doing an experiment.
  • Organise, summarise, present and analyse data.
  • Draw valid conclusions from findings.
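The "organise, summarise, present and analyse" step above can be sketched in a few lines of code. This is a minimal illustration only, using Python's standard library; the questionnaire scores are invented for the example.

```python
# Summarise a small set of hypothetical Likert-scale (1-5) questionnaire responses.
from statistics import mean, median, stdev

scores = [4, 5, 3, 4, 2, 5, 4, 3, 4, 5]  # invented example data

summary = {
    "n": len(scores),
    "mean": mean(scores),          # central tendency
    "median": median(scores),      # robust centre
    "sd": round(stdev(scores), 2), # spread (sample standard deviation)
}
print(summary)  # → {'n': 10, 'mean': 3.9, 'median': 4, 'sd': 0.99}
```

Even a tiny summary like this supports the third bullet: conclusions should be drawn from the organised figures, not from raw, unsummarised responses.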

There are a number of statistical methods you can use to analyse data. However, choosing an appropriate statistical method should follow naturally from your research design. Therefore, you should think about data analysis at the early stages of your study design. You may need to consult a statistician for help with this.

Tips for working with statistical data

  • Plan so that the data you get has a good chance of successfully tackling the research problem. This will involve reading literature on your subject, as well as on what makes a good study.
  • To reach useful conclusions, you need to reduce uncertainties or 'noise'. Thus, you will need a sufficiently large data sample. A large sample will improve precision. However, this must be balanced against the 'costs' (time and money) of collection.
  • Consider the logistics. Will there be problems in obtaining sufficient high-quality data? Think about accuracy, trustworthiness and completeness.
  • Many statistical methods assume a random sample. Consider whether your sample will be suited to this sort of analysis. Might there be biases to think about?
  • How will you deal with missing values (any data that is not recorded for some reason)? These can result from gaps in a record or whole records being missed out.
  • When analysing data, start by looking at each variable separately. Conduct initial/exploratory data analysis using graphical displays. Do this before looking at variables in conjunction or anything more complicated. This process can help locate errors in the data and also gives you a 'feel' for the data.
  • Look out for patterns of 'missingness'. They are likely to alert you if there’s a problem. If the 'missingness' is not random, then it will have an impact on the results.
  • Be vigilant and think through what you are doing at all times. Think critically. Statistics are not just mathematical tricks that a computer sorts out. Rather, analysing statistical data is a process that the human mind must interpret!
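The advice above about looking at each variable separately and checking for missing values can be sketched as a first-pass inspection. This is a toy illustration, not a prescribed workflow; the records and field names are invented.

```python
# Inspect each variable on its own: count missing values and check the range
# before doing anything more complicated with the variables in conjunction.
records = [
    {"age": 34, "income": 28000},
    {"age": None, "income": 31000},  # missing age
    {"age": 51, "income": None},     # missing income
    {"age": 29, "income": 45000},
]

for var in ("age", "income"):
    values = [r[var] for r in records if r[var] is not None]
    n_missing = sum(1 for r in records if r[var] is None)
    print(var, "n =", len(values), "missing =", n_missing,
          "min =", min(values), "max =", max(values))
```

A pass like this surfaces data-entry errors (impossible minima or maxima) and gives you the counts needed to judge whether the 'missingness' looks random.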

Top tips! Try inventing or generating the sort of data you might get and see if you can analyse it. Make sure that your process works before gathering actual data. Think what the output of an analytic procedure will look like before doing it for real.

(Note: it is actually difficult to generate realistic data. There are fraud-detection methods in place to identify data that has been fabricated. So, remember to get rid of your practice data before analysing the real stuff!)
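The "top tip" above can be tried out directly: simulate plausible data, run the planned analysis on it, and confirm the pipeline behaves before collecting anything real. A minimal sketch, with an assumed rough score distribution chosen purely for illustration:

```python
# Generate practice data and dry-run the planned analysis on it.
import random

random.seed(42)  # reproducible practice data

# Simulate 100 exam scores, clipped to the 0-100 range
practice_scores = [min(100, max(0, int(random.gauss(65, 12)))) for _ in range(100)]

def analyse(scores):
    """The analysis we plan to run later on the real data."""
    n = len(scores)
    return {"n": n, "mean": round(sum(scores) / n, 1)}

result = analyse(practice_scores)
assert result["n"] == 100            # sanity-check the pipeline itself
assert 0 <= result["mean"] <= 100    # output lands in a sensible range
print(result)
```

The point is not the numbers themselves but that the analysis runs end to end and produces output of the expected shape, so surprises appear before real data collection.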

Statistical software packages

Software packages can be used to analyse and present data. The most widely used ones are SPSS and NVivo.

SPSS is a statistical-analysis and data-management package for quantitative data analysis. Click on 'How do I install SPSS?' to learn how to download SPSS to your personal device. SPSS can perform a wide variety of statistical procedures. Some examples are:

  • Data management (e.g. creating subsets of data or transforming data).
  • Summarising, describing or presenting data (e.g. mean, median and frequency).
  • Looking at the distribution of data (e.g. standard deviation).
  • Comparing groups for significant differences using parametric (e.g. t-test) and non-parametric (e.g. Chi-square) tests.
  • Identifying significant relationships between variables (e.g. correlation).
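To make the last item concrete, here is one of these procedures (Pearson correlation) computed directly in Python rather than in SPSS. This is an illustrative sketch only; the paired measurements are invented.

```python
# Pearson correlation coefficient between two invented paired variables.
from math import sqrt

x = [2, 4, 6, 8, 10]      # e.g. hours of study (hypothetical)
y = [50, 60, 65, 80, 95]  # e.g. exam score (hypothetical)

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

print(round(pearson_r(x, y), 3))  # → 0.984, a strong positive relationship
```

Whichever tool computes it, the interpretation step — deciding whether a strong correlation is meaningful for your research question — remains yours.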

NVivo can be used for qualitative data analysis. It is suitable for use with a wide range of methodologies, including grounded theory, survey data, case studies, focus groups, phenomenology, field research and action research. Click on 'How do I access NVivo?' to learn how to download NVivo to your personal device. With NVivo, you can:

  • Process data such as interview transcripts, literature or media extracts, and historical documents.
  • Code data on screen and explore all coding and documents interactively.
  • Rearrange, restructure, extend and edit text, coding and coding relationships.
  • Search imported text for words, phrases or patterns, and automatically code the results.

Qualitative data analysis

Miles and Huberman (1994) point out that there are diverse approaches to qualitative research and analysis. They suggest, however, that it is possible to identify 'a fairly classic set of analytic moves arranged in sequence'. This involves:

  1. Affixing codes to a set of field notes drawn from observation or interviews.
  2. Noting reflections or other remarks in the margins.
  3. Sorting/sifting through these materials to identify: a) similar phrases, relationships between variables, patterns and themes and b) distinct differences between subgroups and common sequences.
  4. Isolating these patterns/processes and commonalties/differences. Then, taking them out to the field in the next wave of data collection.
  5. Highlighting generalisations and relating them to your original research themes.
  6. Taking the generalisations and analysing them in relation to theoretical perspectives.

        (Miles and Huberman, 1994)

Patterns and generalisations are usually arrived at through a process of analytic induction (see above points 5 and 6). Qualitative analysis rarely involves statistical analysis of relationships between variables. Qualitative analysis aims to gain in-depth understanding of concepts, opinions or experiences.

Presenting information

There are a number of different ways of presenting and communicating information. The particular format you use is dependent upon the type of data generated from the methods you have employed.

Here are some appropriate ways of presenting information for different types of data:

Bar charts: These may be useful for comparing relative sizes. However, they tend to use a large amount of ink to display a relatively small amount of information. Consider a simple line chart as an alternative.

Pie charts: These have the benefit of indicating that the data must add up to 100%. However, they make it difficult for viewers to distinguish relative sizes, especially if two slices have a difference of less than 10%.

Other examples of presenting data in graphical form include line charts and scatter plots.

Qualitative data is more likely to be presented in text form, for example through quotations from interviews or field diaries.
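The choice between a frequency table and a chart can be prototyped cheaply before committing to a format. A toy sketch, printing the same invented category counts as a table with percentages and a crude text bar chart:

```python
# Present invented survey counts as a table plus a rough text "bar chart".
data = {"Agree": 12, "Neutral": 5, "Disagree": 3}

total = sum(data.values())
for label, count in data.items():
    pct = 100 * count / total
    print(f"{label:<9} {count:>3}  {pct:5.1f}%  {'#' * count}")
# Percentages sum to 100, the property a pie chart would convey visually.
```

Seeing both forms side by side makes it easier to judge which one communicates the comparison your reader actually needs.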

  • Plan ahead, thinking carefully about how you will analyse and present your data.
  • Think through possible restrictions to resources you may encounter and plan accordingly.
  • Find out about the different IT packages available for analysing your data and select the most appropriate.
  • If necessary, allow time to attend an introductory course on a particular computer package. You can book SPSS and NVivo workshops via MyHub.
  • Code your data appropriately, assigning conceptual or numerical codes as suitable.
  • Organise your data so it can be analysed and presented easily.
  • Choose the most suitable way of presenting your information, according to the type of data collected. This will allow your information to be understood and interpreted better.

Primary, secondary and tertiary sources

Information sources are sometimes categorised as primary, secondary or tertiary sources depending on whether or not they are 'original' materials or data. For some research projects, you may need to use primary sources as well as secondary or tertiary sources. However, the distinction between primary and secondary sources is not always clear and depends on the context. For example, a newspaper article might usually be categorised as a secondary source. But it could also be regarded as a primary source if it were an article giving a first-hand account of a historical event written close to the time it occurred.

  • Primary sources
  • Secondary sources
  • Tertiary sources
  • Grey literature

Primary sources are original sources of information that provide first-hand accounts of what is being experienced or researched. They enable you to get as close to the actual event or research as possible. They are useful for getting the most contemporary information about a topic.

Examples include diary entries, newspaper articles, census data, journal articles with original reports of research, letters, email or other correspondence, original manuscripts and archives, interviews, research data and reports, statistics, autobiographies, exhibitions, films, and artists' writings.

Some information will be available on an Open Access basis, freely accessible online. However, many academic sources are paywalled, and you may need to log in as a Leeds Beckett student to access them. Where Leeds Beckett does not have access to a source, you can use our Request It! Service.

Secondary sources interpret, evaluate or analyse primary sources. They're useful for providing background information on a topic, or for looking back at an event from a current perspective. The majority of your literature searching will probably be done to find secondary sources on your topic.

Examples include journal articles which review or interpret original findings, popular magazine articles commenting on more serious research, textbooks and biographies.

The term tertiary sources isn't used a great deal. There's overlap between what might be considered a secondary source and a tertiary source. One definition is that a tertiary source brings together secondary sources.

Examples include almanacs, fact books, bibliographies, dictionaries and encyclopaedias, directories, indexes and abstracts. They can be useful for introductory information or an overview of a topic in the early stages of research.

Depending on your subject of study, grey literature may be another source you need to use. Grey literature includes technical or research reports, theses and dissertations, conference papers, government documents, white papers, and so on.

Artificial intelligence tools

Before using any generative artificial intelligence or paraphrasing tools in your assessments, you should check if this is permitted on your course.

If their use is permitted on your course, you must acknowledge any use of generative artificial intelligence tools such as ChatGPT or paraphrasing tools (e.g., Grammarly, Quillbot, etc.), even if you have only used them to generate ideas for your assessments or for proofreading.

  • Academic Integrity Module in MyBeckett
  • Assignment Calculator
  • Building on Feedback
  • Disability Advice
  • Essay X-ray tool
  • International Students' Academic Introduction
  • Manchester Academic Phrasebank
  • Quote, Unquote
  • Skills and Subject Support
  • Turnitin Grammar Checker


  • Research Methods Checklist
  • Sampling Checklist

Skills for Learning FAQs


Data Presentation — Quantitative Data



  • David Bowers


In Chapter 2 we discussed various ways (several graphical and one tabular) of presenting qualitative data. In all the examples we considered, the data arose from a nominal measuring scale. Although nominal (i.e. qualitative) data often occurs in business and economics, quantitative data, arising from the use of ordinal and interval/ratio measuring scales, is more common. In this chapter we will discuss methods of presenting such data in ways which enable a rapid appreciation of its principal features. The methods we discuss include both tabular and graphical descriptions of data, but the emphasis throughout the chapter lies with frequency distributions and associated procedures.


Author information

Authors and affiliations.

Department of Social and Economic Studies, University of Bradford, UK

David Bowers (Lecturer)


Copyright information

© 1991 David Bowers

About this chapter

Bowers, D. (1991). Data Presentation — Quantitative Data. In: Statistics for Economics and Business. Palgrave Macmillan, London. https://doi.org/10.1007/978-1-349-21346-7_3


Publisher Name : Palgrave Macmillan, London

Print ISBN : 978-0-333-56029-7

Online ISBN : 978-1-349-21346-7





67 Types of Quantitative Data Analysis and Presentation Format

If your thesis is quantitative research, you will be conducting various types of analyses (see the following table).

Practicing and Presenting Social Research Copyright © 2022 by Oral Robinson and Alexander Wilson is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License , except where otherwise noted.



Humanities LibreTexts

10.3: Types of Quantitative Data Analysis and Presentation Format


If your thesis is quantitative research, you will be conducting various types of analyses (see the following table).


Am J Pharm Educ. 2016 Feb 25; 80(1)

Qualitative Analysis of Written Reflections during a Teaching Certificate Program

Ashley N. Castleberry, Nalin Payakachat, Sarah Ashby, Amanda Nolen, Martha Carle, Kathryn K. Neill, Amy M. Franks

a University of Arkansas for Medical Sciences College of Pharmacy, Little Rock, Arkansas

b University of Arkansas at Little Rock

c University of Arkansas for Medical Sciences Office of Educational Development, Little Rock, Arkansas

Objective. To evaluate the success of a teaching certificate program by qualitatively evaluating the content and extent of participants’ reflections.

Methods. Two investigators independently identified themes within midpoint and final reflection essays across six program years. Each essay was evaluated to determine the extent of reflection in prompted teaching-related topic areas (strengths, weaknesses, assessment, feedback).

Results. Twenty-eight themes were identified within 132 essays. Common themes encompassed content delivery, student assessment, personal successes, and challenges encountered. Deep reflection was exhibited, with 48% of essays achieving the highest level of critical reflection. Extent of reflection trended higher from midpoint to final essays, with significant increases in the strengths and feedback areas.

Conclusion. The teaching certificate program fostered critical reflection and self-reported positive behavior change in teaching, thus providing a high-quality professional development opportunity. Such programs should strongly consider emphasizing critical reflection through required reflective exercises at multiple points within program curricula.

INTRODUCTION

Reflection is an intentional, dynamic process that allows improvement in one’s actions, abilities, and knowledge by learning from past experiences. 1-4 While this process can be useful in almost all aspects of life, reflection in the workplace can be particularly beneficial. Successful professionals must be able to reflect on their experiences in order to find solutions to complex problems encountered on a daily basis. 1-2 Such reflection is not only necessary for pharmacists and other health care professionals to improve their practice, but also to further hone their expertise as educators. 5 Reflection should be used by pharmacists committed to professional growth as lifelong learners. 6

The practice of reflection during residency programs offers a valuable opportunity to observe and guide residents in this process at the beginning of their careers. Teaching certificate programs within pharmacy residency programs were founded on the idea that having specialized pharmacy knowledge does not necessarily equate with being an excellent teacher. 7 Teaching certificate programs give participants general pedagogical knowledge to combine with their existing content knowledge. Participants complete didactic and experiential teaching activities to develop such teaching skills. A critical component of the teaching certificate program at the University of Arkansas for Medical Sciences (UAMS) College of Pharmacy is the extensive reflection required throughout the year-long program. Because reflection is such an integral part of professional development, evaluation of the content and extent of participants’ reflections is imperative.

Research on the topic of reflection is extensive, but investigation of reflection on teaching by pharmacy faculty members or faculty members in training is not described in the literature. Additionally, methods to assess teaching certificate programs focus on surveys but lack the details offered by more in-depth analysis. Qualitative analysis of reflective essays could provide better understanding of program benefits and participant growth. Our mixed-methods evaluation is the first to examine thematic composition of reflections as well as the extent of reflection evidenced in the written essays of potential future pharmacy faculty members as they participate in teaching certificate programs.

This study was designed as a mixed methods thematic analysis of teaching certificate program participants’ reflective essays. A qualitative approach was chosen because this method allows deep analysis of the text not obtainable from survey-based research. A modified constant comparison method of analysis was employed, and categories and themes were constructed from open and axially coded data. 8 The coding scheme arose from the data as researchers explored them. Resulting themes were evaluated in comparison to the typology derived from teacher reflection theoretical framework. Qualitative methods were quantified to provide comparison of themes and level of reflection. The data evaluation provides a descriptive evaluation of the effectiveness of this teaching certificate program activity and its impact on teaching development.

The school’s teaching certificate program was originally developed in 2005 to enhance the teaching skills of pharmacy residents but quickly expanded to include preceptors because of an increased demand for preceptor development in teaching. The program, described in a previous manuscript, 9 facilitated development as an educator through the following experiences: formulating personal goals for development in teaching, tailoring teaching approaches to learning setting and audience, practicing effective assessment and feedback skills, receiving ongoing feedback from program faculty members, reflecting upon individual teaching experiences, developing a personal teaching philosophy, and documenting experiences through the development of a comprehensive teaching portfolio.

Over the course of the program (July to May), participants attended formal teaching seminars, self-selected teaching activities, and met with a faculty teaching mentor, who monitored their progression in the program. Participants also were required to write two reflective essays describing their teaching development. The midpoint reflection was submitted in December, and the final reflection was submitted in May of the program year. In these global reflections, participants were asked to discuss their teaching development and specifically include commentary on each of four topic areas: teaching-related strengths, teaching-related weaknesses, ability to effectively assess learners, and ability to provide effective learner feedback. Pharmacy residents and preceptors from across the state participated in all aspects of the program. All retrievable essays from participants who submitted written essays in December and May were analyzed. Participants were excluded if they did not complete the teaching certificate program by submitting December and May essays or if the data were not available.

De-identified electronic copies of reflection essays were used to extract themes and the extent of reflection using the four prompted topic areas of strengths, weaknesses, assessment, and feedback. The 4-category coding scheme described by Kember et al 10 was adapted to determine the extent of reflection for each of the four prompted topic areas as well as the highest level obtained overall (Table 1). A fifth level of "0" was added to be assigned when the participant did not discuss a topic area. Because the instructions for the reflective assignment served only as prompts for reflection, participants were not required to write about each area, and therefore could receive a score of "0."

Five-Category Scheme for Assessing Extent of Reflection in Written Essays 10


Two investigators in the team independently identified the themes discussed in the essays along with the extent of reflection of each essay in the four topic areas using NVivo, v10 (QSR International Pty Ltd., Doncaster, Victoria, Australia). The primary coder used a sample of 10 essays to create initial categories of themes and shared this initial set of themes with the second coder. Coders independently analyzed essays for themes discussed and the extent of reflection obtained in each topic area. The coders met weekly to discuss the extent of reflection assigned to each essay and resolved any differences.

By using a constant comparative approach, coders agreed by consensus on every essay to achieve 100% inter-rater reliability. Additional reflective themes also were added based on coder agreement when a new topic was discussed that did not fit into one of the predetermined themes. A third investigator periodically reviewed the coding process and identified themes for verification to enhance agreement between coders. An electronic coding platform made it possible to efficiently code, analyze, and organize this large amount of data. Each essay, depending on length, took about 10-20 minutes to analyze.

Further patterns emerged from the data during qualitative analysis and were explored with additional detail using triangulation methods. Coder agreement on coding of themes was assessed using Cohen’s kappa. Data were analyzed to identify differences in the extent of reflection in the midpoint and final essays using Wilcoxon signed-rank test. These data were further analyzed to identify whether extent of reflection varied according to participant gender and experience (preceptor vs resident) using the Wilcoxon rank sum test with Stata/SE, v12.0 (StatCorp LP, College Station, TX). The thematic analysis was granted exempt status by the UAMS Institutional Review Board.
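The inter-rater agreement statistic mentioned above (Cohen's kappa) corrects raw percent agreement for agreement expected by chance. A sketch of the calculation, not the authors' actual code; the two coders' labels are invented:

```python
# Cohen's kappa between two coders' theme assignments (invented labels).
from collections import Counter

coder1 = ["theme_a", "theme_b", "theme_a", "theme_a", "theme_b", "theme_a"]
coder2 = ["theme_a", "theme_b", "theme_b", "theme_a", "theme_b", "theme_a"]

def cohens_kappa(r1, r2):
    n = len(r1)
    p_obs = sum(a == b for a, b in zip(r1, r2)) / n        # observed agreement
    c1, c2 = Counter(r1), Counter(r2)
    p_exp = sum(c1[k] * c2[k] for k in set(r1) | set(r2)) / n**2  # chance agreement
    return (p_obs - p_exp) / (1 - p_exp)

print(round(cohens_kappa(coder1, coder2), 2))  # → 0.67
```

Values around 0.7, such as the study's kappa of 0.74, are conventionally read as substantial agreement beyond chance.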

All available pairs of essays were evaluated from participants completing the teaching certificate program between 2006 and 2012. One hundred thirty-two essays were analyzed from 66 participants. Of the 66 participants, 53 (80%) were female, and 10 (15%) were preceptors.

The coders identified more than 11 000 references to 28 themes discussed within the 132 essays. Themes covered a broad range of professional development topic areas, including delivering educational content, interacting with students, evaluating success, and encountering challenges. A full list of identified themes is available from the authors. Agreement between the coders on themes discussed in each essay was high (kappa=0.74). Data saturation of the identified themes occurred early in the coding process, indicating that the list of themes was representative of the context of the essays.

Themes were tracked by the number of essays in which the theme was discussed (“mention”) as well as the total number of references to the theme in all essays (“weight”), as themes could be discussed multiple times within a single essay. Table 2 provides the 10 most common discussed themes and weight placed on theme according to total number of references to that theme. The same 10 themes ranked highly according to both mentions and weight, indicating the emphasis placed on these ideas by the participants was consistent between and within essays. (A detailed description of themes and results of analyses are available from the corresponding author.)

Ten Most Common Themes by Mention and Weight


High levels of reflection were exhibited in the participant essays, with 48% of essays achieving the level of critical reflection (level 4). Quotations from teaching certificate program participant essays specific to the area of assessment were extracted to illustrate the extent of reflection. An example of nonreflection (level 1) was: “Assessment questions are aimed at evaluating comprehension of stated objectives.” A statement that was consistent with understanding (level 2) was: “There are various characteristics that students display that allow assessment of their abilities.” Reflection was evident in the level 3 statement: “It is easier to see two students evaluate similar patients than it is to see students present two different cases and discern which is more proficient.” Critical reflection (level 4) was evidenced by the statement: “But then I realized that I didn’t know if I was being an effective teacher without an assessment of some type. So I have started incorporating pre- and post-lecture questions to see what the learner gained from the presentation.”

Figure 1 shows the levels of reflection obtained overall and in the four prompted topic areas of strengths, weaknesses, assessment, and feedback for midpoint and final essays. Mean levels of reflection achieved in combined midpoint and final essays in each topic area were 2.4, 2.9, 2.7, and 2.9, respectively, indicating the greatest extents of reflection in the weakness and feedback topic areas. Each topic area showed increases in the mean level of reflection between midpoint and final essays [strengths, 2.2 (1.1) vs 2.5 (1.1); weaknesses, 2.8 (1.0) vs 2.9 (1.0); assessment, 2.6 (1.0) vs 2.8 (0.9); and feedback, 2.4 (1.0) vs 3.0 (1.0)].


Level of Reflection Achieved in Midpoint and Final Reflection Essays. Number of essays according to the highest level of reflection (Level 0=Absent; Level 1=Non-reflection; Level 2=Understanding; Level 3=Reflection; Level 4=Critical Reflection) attained in the four prompted topic areas for both midpoint and final essays (n=132).

The extent of reflection between midpoint and final essays significantly increased in the prompted topic areas of strengths (p=0.03) and feedback (p=0.0002). In the strengths topic area, 29 (43.9%) participants did not show change in the extent of reflection from midpoint to final reflections, but 25 (37.8%) showed deeper reflection, compared with only 12 (18.2%), who had decreased reflection scores. In the feedback topic area, 28 (42.4%) participants did not show change in the extent of reflection from midpoint to final reflections, but 30 (45.5%) showed deeper reflection, compared with only 8 (12.1%) who had decreased reflection scores. The highest level of reflection achieved seemed to be higher in the final essays [mean value 3.53 (0.53)] when compared with the midpoint essays [mean value 3.36 (0.57)], but this change did not reach significance (p=0.055) despite increases in each individual topic area. There were no differences observed when comparing change in the extent of reflection at midpoint and final according to gender or preceptor/resident status.

Further patterns emerged from the data during qualitative analysis and were explored in additional detail using triangulation methods. Program participants often discussed others’ evaluation of their teaching activities but focused primarily on receiving evaluations from students and faculty members (65% and 49% of essays, respectively). Just 24% of essays mentioned gaining evaluation by peers (despite a teaching certificate program requirement for obtaining a peer teaching evaluation), and only 4% commented on gaining feedback from patients on their teaching. In contrast to their discussion of receiving evaluations, only five (7.6%) participants discussed the process of evaluating others’ performance in teaching. Four of these regarded providing peer evaluations, and one participant commented on giving feedback to faculty members.

A frequent conclusion of the participants was that they did not have the opportunity to complete a specific type of teaching development activity. This theme was reported in 31 (23.5%) essays. As expected, just over 70% of these occurrences were found in midpoint essays, when participants had completed only the first half of the teaching certificate program. This theme often disappeared from the final essays. However, the perceived lack of opportunities to participate in formally grading students and writing test questions (n=4), serving as an experiential education preceptor for students (n=3), giving a formal presentation (n=1), and providing student feedback (n=1) remained in the nine final reflective essays.

Reflection is vital in the life of an educator. 11 Even more, reflection is the key to learning, which occurs when we create meaning from a past event and use this to shape future experiences. 4,12 While other disciplines such as education have been using reflective practices for some time, the health sciences professions have more recently adopted this concept in the training of future health care professionals. Medicine, nursing, and pharmacy are among the disciplines that are adapting these types of reflective processes in curricula to aid in learning as well as improve patient care. 13-21

The extent of medical residents' reflection has been explored 13-15 and showed that physicians' decision-making skills were improved in complex clinical cases if they were able to critically reflect on those experiences. 15 An evaluation of nursing students concluded that interview sessions (individual, paired, and group) on reflective practice were viewed as beneficial for the participating students and encouraged them to practice reflective thinking on their own. 17 As in many other professions over the past few decades, accreditation standards for pharmacy education require student reflection and subsequent assessment of these skills. 18-20

Additionally, the American Society of Health-System Pharmacists Required Competency Areas, Goals, and Objectives, for Postgraduate Year One Pharmacy Residencies address the need for residents to reflect on their personal performance and professional development. 21 Published works on reflection in pharmacy center on pharmacy students, but the literature suggests reflection is necessary in the day-to-day practice of a pharmacist as well. 22,23 However, little is known about the reflective abilities of pharmacy educators and how these skills affect teaching-related self-development.

Assignments in this teaching certificate program served as tools to facilitate and encourage reflective thinking in order to promote growth in teaching abilities, and participants achieved high levels of reflection when completing these assignments. Almost 50% of essays evidenced critical reflection, with participants describing how change occurred in their practice of teaching (behavior change). This high level is rarely achieved by so many. 10 This could be because participants in this program were highly achievement-oriented, as supported by their admission to and completion of pharmacy school as well as pursuit of residency training.

Additionally, preceptors in the program were seeking teaching development and voluntarily participated. The teaching certificate program also emphasized reflection throughout the program. In addition to the reflections reviewed in this evaluation, participants were asked to reflect immediately after each teaching activity; this practice of reflective thinking and writing may have facilitated deeper reflection in the more summative midpoint and final essays. Another factor that could have contributed to the high levels of reflection was the mentorship of program faculty members, who reviewed reflection drafts and further prompted participants to consider their experiences, personal characteristics, and teaching development goals.

When analyzing themes discussed by participants, it was evident that they considered the reflection prompts when writing their essays because strengths, weaknesses, assessment, and feedback all appeared often. A closer look at the references to these categories revealed that participants wrote most frequently about the areas of assessment and feedback. These areas stood out because they were the areas in which participants lacked confidence early in the teaching certificate program, but they were also the areas in which participants recognized the most growth at the conclusion of the program. Collectively, the extent of reflection significantly increased from the midpoint essays to the final essays in the feedback category.

There was no intervention on reflective writing between the midpoint and final essays, so the change evidenced by our evaluation seems to have occurred naturally. Participants gained more practice with these skills over the course of the program year, and they subsequently reflected more and explained their growth in this area. This finding of deeper reflection in the area of feedback validated the design of the UAMS teaching certificate program and its focus on self-reflection as a method of teaching skills development.

The area discussed by all participants in the final essays was their confidence and comfort in teaching. This linked nicely with the significant increase in the extent of reflection seen in the strengths category. Participants seemed to be more deeply aware of their strengths as the program progressed, and they discussed areas of perceived confidence in these essays. This is consistent with our previous findings that demonstrated increased self-perceived teaching abilities during the program. 9

Because the teaching certificate program provides the opportunity for gaining practical experience in teaching, confidence in these abilities is expected to grow. Gaining confidence can allow a deeper awareness of personal strengths. Growth was evidenced as experiences led to increased confidence, and prompted reflection on these experiences led to gains in self-perceived teaching abilities and strengths. Increased confidence through experiences was the aim of the teaching certificate program and explains why such programs are beneficial to participants, future employers, and academic institutions.

Participants most often solicited feedback from students and faculty members. Feedback from peers and patients was reported much less frequently. Because it is a teaching certificate program requirement for participants to receive feedback from faculty members, peers, and students, it is not surprising that these perspectives were discussed in the reflections. However, patient teaching also is encouraged by the program, and the perspective and feedback of patients should be valued. Although the number of participants soliciting feedback from patients was not collected in our evaluation, few discussed these perspectives in their reflections.

Twenty participants (30%) listed patient counseling as a teaching activity, but only three participants discussed receiving teaching evaluations from patients. This could be because this information was not solicited from participants and, therefore, they did not reflect on it specifically, or because the participants perceived patient evaluations to be of lower quality or lesser importance than those from faculty members or peers. Additionally, participants might have viewed the patient as a different type of learner than students and not seen the need for reflection on their evaluations. Curricular emphasis on patients as learners increased in later program years to encourage variation of feedback from learners. To foster the view of patients as learners, programs should consider requiring patient teaching activities along with assessment of patient learning and completion of teaching evaluations by patients.

It is also noteworthy that participants wrote more frequently about receiving feedback from students and faculty members and less frequently about feedback received from peers. This corresponds to the small number of participants who reported giving feedback to faculty members (n=1) and peers (n=4). Giving and receiving feedback from peers may be an area of discomfort for participants despite the teaching certificate program requirement to give and receive this type of feedback. Additionally, only one participant reflected on the program requirement to provide feedback to at least two faculty members after observing their teaching. Overall, reflection on giving feedback to peers and faculty members was lacking, and participants did not reflect on receiving feedback from all sources.

The extent to which feedback was sought is not known, so additional research is needed to clarify this finding. Many factors could contribute to this pattern, including the perceived need to obtain feedback from certain groups, the feasibility and convenience of sampling, the perceived relative importance of feedback from more experienced or more educated groups, or the comfort level of the participants to give feedback to and receive it from certain groups. Teaching certificate programs should consider increased emphasis on 360-degree evaluations of teaching and both giving and receiving feedback to help participants understand the necessity and value of feedback from others. If participants gain comfort in this activity, they could provide higher quality feedback to others and potentially could benefit more from the feedback they receive.

Participants were not prompted to write about opportunities they did not get to experience; therefore, it can be assumed that participants discussing this theme must have been expecting to participate in such activities. By identifying these patterns, program directors can get an idea of skills that some participants might want to gain from the program. As expected, more than 70% of the themes coded for no opportunity were expressed in the midpoint essays. Of the remaining nine items discussed in the final essays, grading students and writing test questions accounted for the most themes discussed. Our program has taken these comments into consideration, incorporating additional opportunities for examination item review sessions to give participants insight on how to develop and evaluate examination items.

Qualitative evaluation of participant reflections can provide feedback on teaching certificate program effectiveness. These reflections provide a detailed view of how participants develop teaching skills throughout the program and impart realizations of the effectiveness of the program. By using reflective essays as quality indicators, program directors can shape program content to better develop teachers. At the same time, the reflection itself can aid participants in the development of a well-informed, highly individualized written statement of teaching philosophy, another potential quality measure of the teaching certificate program. This research did not evaluate participants’ teaching philosophy statements. However, qualitative evaluations of these documents also could provide a proxy of the teaching certificate program’s effectiveness in increasing participants’ awareness of their own teaching style and ideals.

Although our qualitative evaluation was rigorous and followed the guidelines presented by Anderson, 24 care must be taken when generalizing the results to other programs. Our data represent the products of a single teaching certificate program and are specific to the participants and experiences of this program. The characteristics of the residents and preceptors completing the program are varied, but they may reflect these groups at other institutions.

The use of quantitative and qualitative measures to evaluate this teaching certificate program is a strength of the study, and additional analyses are needed to determine if participants’ reflective abilities remain constant over time and predict future teaching performance. Finally, because of the extensive nature of our evaluation and the volume of the qualitative data, the potential exists for coding errors and inconsistencies between coders. Several mechanisms were in place to limit these inaccuracies and preserve the integrity of the data.

Participation in the teaching certificate program appeared to increase confidence and enhance awareness of strengths through participant reflection. Pharmacy residents and preceptors frequently achieved the highest level of reflection (critical reflection) in global self-assessments of teaching experiences. Such deep reflection is indicative of professional development because teaching certificate program participants evidenced change not only in teaching attitudes, but also teaching behaviors, as discussed in written essays. Just as reflective exercises are emphasized for pharmacy students and residents, findings from this analysis suggest that teaching certificate programs should strongly consider emphasizing purposeful critical reflection through required reflective exercises at multiple points within the program curricula. Qualitative evaluation of participant reflections can provide quality indicators to assist program directors in shaping program content.

  • Open access
  • Published: 14 August 2019

Reflections on qualitative data analysis training for PPI partners and its implementation into practice

  • Alison Cowley 1,2,
  • Margaret Kerr 3,
  • Janet Darby 2 &
  • Pip Logan 2,4

Research Involvement and Engagement volume 5, Article number: 22 (2019)

Plain English summary

Service users should be involved in every part of the research process, including analysis of qualitative research data such as interviews and focus groups. To enhance their participation, confidence and contributions, training and support for both the ‘professional’ researcher and lay member of the public is essential. Historically, this has taken a number of forms, from short one-day training sessions through to training spread out over several months. There is currently limited guidance on the quantity and content of such training sessions for Patient and Public Involvement (PPI) partners. This paper discusses and explores the content and delivery of qualitative analysis training held over two sessions of 3 h duration for members of a University PPI group. The training was designed by experienced qualitative researchers and PPI partners based on available literature and research expertise. Training included the theory of qualitative research methods and practical qualitative analysis coding skills. These skills were developed through the use of ‘mock’ interviews, which participants practiced coding in supportive group sessions. Their feedback on the training is provided. One of the PPI partners subsequently went on to code data with a researcher working on a funded research study, and has reflected on both the training sessions and the subsequent analysis of the data. These reflections have been supplemented by reflections of the researcher who worked alongside the PPI partner, revealing that the process challenged perspectives and helped them view data through a service user’s eyes. A positive working relationship was central to this.

Service users should be involved in every part of the research process to ensure that interventions are fit for those whom they are intended to help. Involving service users in analysing qualitative data such as focus groups and interviews has been recognised as particularly valuable. Older people have frequently been less involved in these initiatives. A wide range of training programmes have been proposed, but there is currently limited guidance on the quantity and content of training sessions to support training Patient and Public Involvement (PPI) partners. This paper discusses and explores the content and delivery of qualitative data analysis training to members of a University PPI Group.

Existing literature on PPI in qualitative data analysis was reviewed by the research team and an outline programme was designed. This comprised two three-hour sessions held at an easily accessible venue familiar to members of the PPI group. The course covered the theories behind qualitative research methodology and methods, what coding is, and how to code independently and as part of a research team using Thematic Analysis. A mock research question was generated and two mock interviews were completed, audio recorded and transcribed verbatim. This provided participants with real-life experience of coding data. The session was positively reviewed and said to be interesting and enjoyable, providing a good overview of qualitative analysis. One of the PPI partners subsequently went on to code data with a researcher working on a funded research study, and has reflected on both the training sessions and the subsequent analysis of the data. These reflections have been supplemented by reflections of the researcher who worked alongside the PPI partner, revealing that the process challenged perspectives and helped them view data through a service user’s eyes. A positive working relationship was central to this.

Conclusions

Feedback suggests that the training enabled PPI partners to become active members of the research team in qualitative data analysis. There is a need for further research into the optimal amount of training needed by PPI partners to participate in qualitative analysis.

Peer Review reports

NHS guidance states that service user involvement should exist at every stage of the research process [ 1 ], and there are increasing examples of this involvement throughout the research cycle, particularly in planning, designing and data collection. However, it has been argued that there are fewer examples of Patient and Public Involvement (PPI) in analysing and interpreting qualitative data [ 2 , 3 ]. Involving PPI partners in qualitative analysis has been recognised as particularly valuable as they can draw on their own experiences to make sense of the data [ 4 ], and their involvement has been recognised as a means of improving the quality, robustness and validity of the analysis [ 5 , 6 ]. Although PPI in health and social care research has been on the increase, it has been noted that research in partnership with older people has been slower to develop than with other service user groups, and older people have often been bypassed by these initiatives [ 7 ]. Clough et al. [ 8 ] reported that one of their greatest frustrations was finding older PPI partners who had been involved in other research studies. Three studies have been identified where older PPI partners were involved in qualitative analysis [ 4 , 7 , 8 , 9 , 10 ].

An essential requisite of PPI in any aspect of the research process is the provision of appropriate training to equip PPI partners with the necessary knowledge and skills, without which the ‘research product’ will be compromised [ 11 , 12 ]. Indeed, training has been identified as one of the eight principles of successful PPI in NHS research [ 13 ]. Although training has been recommended for PPI in research, little is known about what training is needed [ 14 ]. A recent mapping exercise was conducted to identify such training initiatives in health and social care, and this revealed just 26 initiatives across England, with 12 of these providing training on analysing and interpreting data. The training across these initiatives was diverse in style, content and length of provision, ranging from a single day through to training spread out over several months [ 11 ]. One course has gone further, providing training to service users over two academic terms on qualitative interviewing and analysis, culminating in a validated university certificate in research methods [ 8 ]. It is therefore unclear what quantity and content of training would be beneficial for PPI partners participating in qualitative analysis.

Another concern raised in the literature is how to strike a balance between increasing PPI partners’ research knowledge and skills, enabling their participation as partners, whilst not professionalising their involvement [ 7 , 9 , 14 ]. There is a need, therefore, to explore appropriate training and education for PPI partners partaking in qualitative analysis that prepares them to work alongside research partners whilst, at the same time, retaining their unique contribution as service users.

Dewar [ 15 ] has called for more opportunities for the sharing of experiences about the process of involving older people in research. This paper therefore describes one example of qualitative data analysis training provided to a group of older PPI partners. It incorporates a reflective account, from the researcher and service user perspective, of implementing this training into research practice. Implementation of research skills into practice has been identified as a key outcome of service user training [ 11 ].

In 2017 the primary author, Alison Cowley, was awarded a Clinical Doctoral Research Fellowship funded by Health Education England and the National Institute for Health Research to explore how healthcare professionals define and apply the concept of ‘rehabilitation potential’ in frail older people in the hospital setting and to develop and test an assessment tool for use in clinical practice. The fellowship included funding to train members of the public to code focus group data alongside members of the research team to provide a differing perspective on analysing research data.

The aim of this article is to describe the development and implementation of the qualitative data analysis training programme and the experiences of the academic researchers and ‘lay’ researchers in data collection and analysis.

Qualitative data analysis training

A small training team, comprising staff from the Division of Rehabilitation, Ageing and Well-being (JD, AC, MC), reviewed existing literature on PPI in qualitative data analysis, and an outline programme was developed to meet the needs of the identified participants. The training team consisted of two experienced qualitative researchers (JD and AC), a clinical nurse (MC), and a Professor experienced in the development of complex rehabilitation interventions (PL). In view of the limited availability of qualitative analysis training for older PPI partners, advice on the content and format of training was sought from TW, who had previously published work on older lay researchers conducting qualitative data analysis following training [ 10 ]. Ideally, a member of the local PPI group would have been involved in the design and content of the course, but as PPI experience of qualitative analysis was limited, and was the motivating factor behind developing the course, this was not feasible. The course was designed specifically to train members of the Dementia, Frail Older People and Palliative Care PPI Advisory Group at the University of Nottingham, which comprises older adults. The group, established in 2012, consists of 20 members of the general public who advise on research priorities, study design, ethical considerations and dissemination of studies at the University of Nottingham and the wider health community. Many of the group are co-applicants for large nationally funded studies. Prior to the training programme, members had no experience of qualitative data analysis but were keen to develop these skills. A formal evaluation of pre- and post-course knowledge and skills was not completed. They attended the group on a voluntary basis.

The course was held over two sessions of 3 h duration in February and March 2018 at an easily accessible venue in the University of Nottingham familiar to members of the PPI group. This was to ensure that they felt at ease during the session, and is in line with other PPI partner training programmes around research [ 11 ]. Seven members of the group attended the first training session, with five participants attending both sessions. Attrition is also common among other PPI partner training programmes [ 11 ].

The first training session included taught sessions on the theory behind the use of qualitative research methodology and methods, what coding is, and how to code as an independent researcher and as part of the wider multi-disciplinary team. This provided participants with a basic introduction to the epistemology associated with the qualitative paradigm and its strengths and weaknesses, enabling them to appreciate why researchers use qualitative methods. Participants were taught the principles behind coding qualitative data and, more specifically, how to code data using Thematic Analysis. This analytic method was selected as it has been recognised as a foundational method for learning qualitative analysis skills [ 16 ], and a method that has been used with a variety of service user groups [ 3 , 4 , 17 ]. A clear step-by-step guide was sought to provide structure to the Thematic Analysis process. Braun and Clarke’s [ 16 ] Thematic Analysis Guide met this requirement. This six-phase guide was used to structure the training sessions.

The training team developed a ‘mock’ research question titled “What factors affect medication adherence in adults living in the community setting?” This topic was chosen because it was felt that potential participants would have experience of either taking medications themselves or supporting others in medication adherence, while being unlikely to be too controversial, providing a ‘safe’ training topic and exercise. Two ‘mock’ semi-structured interviews were completed and audio recorded for this training by the team. Participants on the course were asked to listen to the audio recording of the first interview to familiarise themselves with the data and were then given the transcript (transcribed by the team) to read through and make reflective notes. This was then discussed with the group, followed by time to read and discuss a pre-prepared coded transcript of the interview. The second interview audio recording was played to the group, and participants were given the second transcript to independently code at home in preparation for the second session. Prior to the close of the session, participants were advised that they could contact the team if they required further support whilst coding the transcript at home.

The second training session, held 2 weeks later, provided the participants with the opportunity to share their experiences from the coding exercise and to discuss the codes generated from the two interview transcripts. The training then progressed to look at how codes are sorted into potential themes. The participants were separated into two groups and provided with cards, each card labelled with the code name and its descriptor. The participants then sorted the cards into potential theme piles, and then discussed their constructed themes in relation to the research question. The session concluded with reflections on the course and an opportunity to ask the trainers questions.
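The card-sorting exercise described above amounts to assigning each labelled code to a candidate theme and then reviewing the resulting piles against the research question. A minimal sketch of that grouping step follows; the code names, descriptors and theme assignments are invented for illustration and do not come from the training materials.

```python
from collections import defaultdict

# Each "card" is a code name plus its descriptor, as in the training exercise
codes = [
    ("forgetting doses", "Lapses in remembering scheduled medication"),
    ("pill organisers", "Practical aids used to manage medication"),
    ("side effects", "Unwanted effects influencing willingness to take medication"),
    ("trust in GP", "Confidence in the prescriber's advice"),
]

# Hypothetical assignment of codes to candidate theme piles
assignment = {
    "forgetting doses": "Routine and memory",
    "pill organisers": "Routine and memory",
    "side effects": "Beliefs about medication",
    "trust in GP": "Beliefs about medication",
}

# Group codes into their theme piles
themes = defaultdict(list)
for name, descriptor in codes:
    themes[assignment[name]].append(name)

for theme, members in sorted(themes.items()):
    print(f"{theme}: {', '.join(members)}")
```

In practice the assignment is negotiated in discussion rather than fixed in advance; the value of the exercise lies in debating where each card belongs.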

Reflection on qualitative data analysis training

Five participants on the course provided feedback on the course evaluation form and through email correspondence. Feedback has been anonymised and included in this commentary paper with permission from the participants. Overall, the course was said to be interesting and enjoyable, providing participants with an introduction into qualitative data analysis. One participant stated that:

“Previously I knew nothing of analysing qualitative data and this ignorance had become an increasing concern for me as I have been and am still the lay co-applicant on several studies and attend management meetings where discussion included such topics…. The course was an excellent introduction to the topic. With two sessions and homework it gave me an opportunity to flesh out my understanding with practical exercises.” [Participant One].

The use of ‘mock’ interviews to provide participants with real life examples were positively received, highlighting the challenges that all researchers face when coding data autonomously and as part of a team:

“I see you need to do it [code] several times…and it was surprising that we all produced surprisingly similar results from us all.” [Participant Two].

Despite the condensed nature of the training course, compared with many other research training initiatives, which can span anything from a single day to 28 days [ 11 ], participants described increased confidence when dealing with qualitative research data such as focus groups and interviews:

“I did a mature degree using qualitative research, and whilst was well tutored and the degree was a good one, the level of QD [qualitative] analysis was nowhere near as intricate or sophisticated as your course.” [Participant Three].

Whilst the course was primarily aimed to train PPI partners in coding qualitative research data, it was reported to also increase their confidence in questioning choices made by researchers:

“The questions I ask will be different, more incisive and detailed… Did they build up themes, codes and a suitable QD strategy? Researchers coming in front of PPI panels often tend to concentrate on general procedures, using lay language, ethics, accuracy of paperwork etc., when really the PPI panel should be more concentrated on the background and quality of QD implementation, as this really is the central issue of the research.” [Participant Three]

One of the key benefits of training PPI partners has been identified as increased confidence, motivation and skills to be actively involved in research activities [ 11 ]. The condensed nature of the training course did not appear to diminish this positive outcome.

The participants on the course began to reflect on the additional contributions that PPI partners could bring to the research process, moving from a tokenistic involvement towards true co-production. Training was viewed as a:

“Genuine commitment to enhance our skills as PPI volunteers, replenishing our enthusiasm for being part of the team” [Participant One].

Implementing training into practice

As part of a PhD study, five focus groups consisting of 28 participants were co-facilitated by AC and MK (PPI member). The discussions were audio recorded and the data was transcribed verbatim. Ethical approval for the focus groups was obtained. The Framework Approach, which sits within the broad family of thematic analysis, was used to analyse the data [ 18 , 19 ]. It is recommended for collaborative multi-disciplinary research [ 19 ] and involves seven distinct stages designed to provide a robust and transparent method of qualitative data analysis. Table  1 outlines the seven steps of the Framework Approach and PPI in each stage. Both AC and MK kept reflective diaries during data collection and analysis, and excerpts from these documents are shared to highlight the challenges and opportunities for collaborative data analysis.

PPI reflections

Having volunteered to support Alison with her research project, which would involve using the coding training we took part in, I was curious to see how it worked in practice. When you attend a focus group, either as a member or facilitator, you come away with an opinion on what was said and what main points were raised. Having coded the data from these focus groups, you realise what would be missed if you didn’t look closely at the written transcripts. It takes time to code data; one read-through is not enough, but the main points raised gradually begin to gel as you become more familiar with your codes and what you are reading. I found that I had to be especially aware of any bias slipping in and the positives and negatives of bringing my own experiences to the table. I was surprised how much more relevant detail was uncovered using this technique.

Having coded two transcripts, I met with Alison and was surprised/relieved/amazed to discover that the points I had identified were basically the same as hers. That, and the training, gave me the confidence to complete the remaining focus groups and reassured me that my views were equally valid in the analysis. As a PPI partner I felt my interpretation should come from that perspective. This is an essential part of the research process, and with the right training and support, PPI representatives can positively contribute to the analysis of research data. Professional researchers automatically take into account the constraints of their working environment, whereas 'lay people' are generally unaware of these, or choose to ignore them, and we can therefore look at findings with a more open mind.

Researcher reflections

Involving PPI partners in the co-production of qualitative data analysis has led me to challenge my own perspectives and beliefs. Researchers and clinicians view the world, their practice and data through a 'professional' lens. We frequently recruit patients, carers, families and service users to studies, but their voice can then be lost when data are analysed. We are embedded in the process and our research question, and despite being reflective, we still see the world as clinicians or researchers. Co-facilitating and analysing the data with MK led me to challenge my assumptions, not only around the clinical practice of rehabilitation assessments but also around the notion of good outcomes. During data coding, I found my focus was drawn towards very medical/technical terms and how rehabilitation assessments were operationalised, whereas MK was good at drawing attention towards the patient and carer experience. By discussing these two perspectives we were able to develop a number of cross-cutting themes, such as communication, clinical staff training and expertise, and supporting family members' involvement in rehabilitation, which would not have featured strongly in the final analysis without lay involvement. Other studies have similarly found that lay involvement has identified themes which would otherwise have been missed without their input in the analysis process [6, 17].

In order to achieve these insights, it was essential to develop an open and honest relationship in which MK was supported and encouraged to challenge and question. We conversed frequently via email, but this was supplemented with regular face-to-face meetings in a comfortable environment (MK's house), so we could talk through our findings and develop a joint understanding of the data. I found that I had to clearly articulate the codes and themes and adopt non-jargonistic terminology and language. This reflected concerns that, although they made sense to me as a researcher and clinician, they may not have resonated with members of the public. By providing a clear and unambiguous coding framework and articulating the process of data analysis, greater rigour and transparency were introduced into the study.

When coding data, a plethora of interesting findings may emerge, and I found that both MK and I were guilty of going off topic at times. The study sought to understand what was meant by the clinical term 'rehabilitation potential in older people living with frailty', why this term was used and how it was assessed. The resulting focus groups led to lively discussions around the outcomes of rehabilitation and, whilst interesting, these did not directly address the research question. Through open dialogue and regular meetings we were able to mutually steer our data analysis back towards the research question, parking our extraneous musings for a later date.

In order to ensure that research interventions are designed which embrace and understand everyone's needs, it is essential that members of the public are involved in all stages of the research process, not just study design and dissemination. This reflective account revealed that designing and providing a qualitative data analysis training programme for members of a University PPI Advisory Group was feasible and well received by participants. Feedback revealed that this training increased participants' knowledge and confidence. This is in keeping with studies conducted with PPI partners [7, 10, 11]. The training has since enabled two members of the group to take part in qualitative data analysis as part of the research team, providing differing perspectives and viewpoints, and a voice for those for whom the ultimate intervention is designed. These strengths have been recognised by studies which have evaluated the impact of PPI partners upon research studies [7, 10, 11]. Since the completion of the course, MK (PPI partner) has been involved in a qualitative analysis course for postgraduate students and researchers, where she shared her experiences of conducting data analysis and working as a member of a research team, providing advice and guidance for others aspiring to greater PPI. Her perspective and experiences were positively evaluated by course attendees. Research has shown that training and subsequent involvement in qualitative research can provide a platform for participants to move on to complete other such projects [10, 11].

In order to achieve an effective working relationship, ongoing support and guidance are essential, where all parties respect and value each other's knowledge, opinions and experiences whilst working towards answering a specific research question. Involving PPI partners in analysing qualitative research data provides a differing perspective, challenging researchers' assumptions and positions, and has the potential to enhance the analysis process. Another member of the University PPI Advisory Group has also gone on to conduct qualitative analysis as part of a process evaluation embedded in a large randomised controlled trial.

It should be recognised that this commentary provides a reflective account of a small-scale training programme, and the generalisability of the findings is therefore limited.

Implications for the future

In light of the above limitation, further research is needed to understand the optimal amount and type of training provided for PPI partners to support their active involvement in qualitative data analysis. Future training courses for qualitative data analysis by older adults should be co-designed by those they are intended to support, thus ensuring that course content and delivery are appropriate and fit for purpose. Further research is also recommended to explore the type and amount of training professional researchers require to effectively integrate PPI partners into the research process and fully realise the potential of their contributions.

Availability of data and materials

Data sharing is not applicable to this article as no datasets were generated or analysed during the current study.

Abbreviations

HEE: Health Education England

NIHR: National Institute for Health Research

PPI: Patient and Public Involvement

References

Department of Health. Research Governance Framework for Health and Social Care. Crown Copyright; 2005.

Byrne A, Canavan J, Millar M. Participatory research and the voice-centred relational method of data analysis: is it worth it? Int J Soc Res Methodol. 2009;12(1):67–77.

Nind M. Participatory data analysis: a step too far? Qual Res. 2011;11(4):349–63.

Ward L, Barnes M, Gahagan B. Well-being in old age: findings from participatory research. Hove & Portslade: University of Brighton and Age Concern; 2012. https://www.brighton.ac.uk/_pdf/research/ssparc/wellbeing-in-oldage-executive-summary.pdf .

Staley K. Exploring impact: public involvement in NHS, public health and social care research. INVOLVE; 2009. http://www.invo.org.uk/posttypepublication/exploring-impact-public-involvement-in-nhs-public-health-and-social-care-research/ . Accessed 9 Jan 2019.

Best P, Badham J, Corepal R, O'Neill RF, Tully MA, Kee F, et al. Network methods to support user involvement in qualitative data analyses: an introduction to participatory theme elicitation. Trials. 2017;18(1):559.

Littlechild R, Tanner D, Hall K. Co-research with older people: perspectives on impact. Qual Soc Work. 2015;14(1):18–35.

Clough R, Green B, Hawkes B, Raymond G, Bright L. Older people as researchers: evaluating a participative project. Joseph Rowntree Foundation: York Publishing Services Ltd; 2006. https://www.jrf.org.uk/sites/default/files/jrf/migrated/files/9781859354346.pdf . Accessed 9 Jan 2019.

Reed J, Cook G, Bolter V, Douglas B. Older people 'getting things done': involvement in policy and planning initiatives. York: Joseph Rowntree Foundation; 2006. https://www.jrf.org.uk/report/older-people-getting-things-done-involvement-policy-and-planning-initiatives . Accessed 9 Jan 2019.

Williamson T, Brogden J, Jones E, Ryan J. Impact of public involvement in research on quality of life and society: a case study of research career trajectories. Int J Consum Stud. 2010;34(5):551–7.

Lockey R, Ahmed S, Bennett C, Gillingham T, Millyard J, Parfoot S, et al. Training for service user involvement in health and social care research: a study of training provision and participants' experiences (the TRUE project). Worthing and Southlands; 2004. http://www.invo.org.uk/posttypepublication/training-for-service-user-involvement-in-health-and-social-care-research/ . Accessed 9 Jan 2019.

McLaughlin H. Involving young service users as co-researchers: possibilities, benefits and costs. Brit J Soc Work. 2006;36(8):1395–410.

Telford R, Boote JD, Cooper CL. What does it mean to involve consumers successfully in NHS research? A consensus study. Health Expect. 2004;7(3):209–20.

Dudley L, Gamble C, Allam A, Bell P, Buck D, Goodare H, et al. A little more conversation please? Qualitative study of researchers’ and patients’ interview accounts of training for patient and public involvement in clinical trials. Trials. 2015;16:190.

Dewar BJ. Beyond tokenistic involvement of older people in research - a framework for future development and understanding. J Clin Nurs. 2005;14(Suppl 1):48–53.

Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol. 2006;3(2):77–101.

Gillard S, Simons L, Turner K, Lucock M, Edwards C. Patient and public involvement in the coproduction of knowledge: reflection on the analysis of qualitative data in a mental health study. Qual Health Res. 2012;22(8):1126–37.

Ritchie J, Spencer L. Qualitative data analysis for applied policy research. In: Bryman A, Burgess R, editors. Analyzing Qualitative data. New York: Routledge; 1994. p. 173–94.


Gale NK, Heath G, Cameron E, Rashid S, Redwood S. Using the framework method for the analysis of qualitative data in multi-disciplinary health research. BMC Med Res Methodol. 2013;13:117.


Acknowledgements

Thank you to Marie Cook (MC) for her role in developing and delivering the training.

Thank you to members of the Dementia, Frail Older People and Palliative Care PPI Advisory Group at the University of Nottingham.

Thank you to Professor Tracey Williamson (TW) for her advice and support on planning the training sessions.

This report is independent research arising from a Clinical Doctoral Research Fellowship, Alison Cowley, ICA-CDRF-2016-02-015 supported by the National Institute for Health Research (NIHR) and Health Education England (HEE). The views expressed are those of the authors and not necessarily those of the NHS, the NIHR or the Department of Health.

Author information

Authors and affiliations

Institute of Nursing & Midwifery Care Excellence, Nottingham University Hospitals NHS Trust, Hucknall Road, Nottingham, NG5 1PB, UK

Alison Cowley

Division of Rehabilitation, Ageing and Well-being, School of Medicine, University of Nottingham, Queen’s Medical Centre, Nottingham, NG7 2UH, UK

Alison Cowley, Janet Darby & Pip Logan

Patient and Public Involvement Partner C/O School of Health Sciences, University of Nottingham, Queen’s Medical Centre, Nottingham, NG7 2UH, UK

Margaret Kerr

Nottingham CityCare Partnership, 1 Standard Court, Park Row, Nottingham, NG1 6GN, UK


Contributions

AC, JD and MC reviewed the existing literature on PPI training, and designed and delivered the course. Drafting of the paper was led by AC and JD, with regular review and comments from MK and PL (AC's primary doctoral supervisor). All authors read and approved the final manuscript.

Corresponding author

Correspondence to Alison Cowley .

Ethics declarations

Ethics approval and consent to participate

Ethical approval was not required or sought for the training. The overall rehabilitation potential study had ethical approval for the involvement of MK in coding from the Yorkshire and The Humber – Bradford Leeds Research Ethics Committee (REC reference 17/YH/0356).

Consent for publication

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License ( http://creativecommons.org/licenses/by/4.0/ ), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated.


About this article

Cite this article

Cowley, A., Kerr, M., Darby, J. et al. Reflections on qualitative data analysis training for PPI partners and its implementation into practice. Res Involv Engagem 5, 22 (2019). https://doi.org/10.1186/s40900-019-0156-0


Received : 17 January 2019

Accepted : 31 July 2019

Published : 14 August 2019

DOI : https://doi.org/10.1186/s40900-019-0156-0


Keywords

  • Co-production
  • Qualitative research

Research Involvement and Engagement

ISSN: 2056-7529
