Qualitative Data Analysis Section


 * **EDU8005-8** || ||
 * **Qualitative Research Design** || **3 Qualitative Data Analysis Section** ||


 * **Thanks, Stephen. Please consider my in-text comments as well.**
 * **Data analysis for a qualitative case study can vary. Given your mention of interviewing and document analysis, the expectation is that data analysis would likely entail some type of coding scheme and content analysis. These analytical results can be displayed via tables, charts, and/or narrative explanations. It would have been nice to have seen such results displayed in the paper (see published qualitative study articles for how Results or Findings sections were structured and written).**
 * **More specifically, coding schemes can be helpful in making sense of voluminous data sets. Not only does coding break down large pools of data, but it also guides the analyst in constructing bridges between what once seemed disparate pieces of information into a more sensible storyline concerning an observed occurrence in reality. Qualitative data analysis is to be done with some objectivity as well as sensitivity. How well both these elements are wielded is really up to each analyst. Only with practice comes a more refined ability to deftly sift through the clutter of unfiltered data.**
 * **I encourage you to read many articles concerning completed qualitative studies that used varying designs, including case study, grounded theory, phenomenology, etc. In those articles, give attention to the data collection strategies used, the data analysis approaches, and how the results were presented and discussed. Reading through such articles is a good way to become more familiar with how qualitative study reporting can be structured and organized.**
 * **I look forward to your next completed assignment.**

=Qualitative Data Analysis Section=
A positive relationship has been demonstrated between online participation and learning performance (Huang, Lin, & Huang, 2012; Martinez-Caro, 2011; Pelz, 2010; Ruey, 2010), as well as between learning performance and student satisfaction in online courses (Ali & Ahmad, 2011; Chen & Lien, 2011; Ferguson & DeFelice, 2010; Kozub, 2010; Martinez-Caro, 2011). There is, however, little empirical research regarding adult professional development or appropriate techniques for teaching and engaging non-traditional learners (Donavant, 2009), or on appropriate modes of interaction in learning management systems (So & Bonk, 2010). The specific problem addressed by this study is how to improve the learning experience of online professional development learners by identifying factors that increase learner satisfaction (Watts, 2012).

=Purpose of the Study=
The purpose of this descriptive, qualitative case study was to gain an in-depth understanding of the various factors that increase the learner satisfaction of adult online professional development learners. This case study explored an online, instructor-led professional development environment in two phases. In the first phase, historical data from course evaluations were analyzed to determine factors that students indicated enhanced or detracted from their satisfaction or perceived learning (George & Bennett, 2005; Merriam, 1998). Responses to two open-ended questions on the evaluation ("Would you recommend training to others? Why or why not?" and "Suggest how we could improve your satisfaction with the course.") were collected, analyzed, and categorized to cluster factors of satisfaction and dissatisfaction. In the second phase, 13 learners were interviewed over the phone to corroborate or contradict the factors identified in phase one, and to more fully explore factors that increased their satisfaction in online professional development classes (Watts, 2013).

=Data Collection=
**Phase one.** In the first phase of the study, extant data were selected from course evaluation data obtained from online, instructor-led, synchronous courses offered by a US-based technology company between January 2008 and December 2012. All of the courses were delivered using Cisco WebEx Training Center, an online learning management system. There were 3,147 online classes during the five-year timeframe, which generated 17,537 individual evaluations. The data most relevant from these course evaluations were the responses to two open-ended questions ("Would you recommend training to others? Why or why not?" and "Suggest how we could improve your satisfaction with the course."). For the first question, 4,103 learners (23.4%) explained why or why not, generally in a sentence or two. For the second question, 3,218 learners (18.4%) [|*1] included suggestions for improvement, usually consisting of only a sentence or two, though some were much lengthier. These counts are tabulated below.
**Phase two.** In the second phase of the study, 13 semi-structured interviews were conducted. Each interview averaged 72 minutes and was recorded, with the participant's permission, using a telephone recording adapter attached to the researcher's telephone, and then faithfully transcribed. [|*2] The interviews took place over a period of one month in March 2013. Interviewing stopped at 13 participants because saturation had been reached and no new factors were forthcoming. While the learners selected for interview were chosen at random, [|*3] they represented diverse groupings in terms of age (33 to 62), ethnicity (8 White, 2 Asian, [|*4] 2 African American, and 1 Hispanic), gender (9 male and 4 female), and geography. This interview makeup roughly mirrors the population of online learners for this organization. Because the participants were scattered across the United States, the interviews were conducted by telephone rather than face-to-face. Each interview followed the same set of 17 open-ended questions, in the same order, focused on each participant's preferences and perceptions regarding factors that would improve or impede student satisfaction in online professional development classes, including perceived learning and in-class participation. Additional expansion or clarification questions were asked as appropriate during the interviews. The questions were derived from a review of the pertinent literature and from the factors discovered in phase one.
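For reference, the phase one data set and response counts reported above are tabulated here (percentages are of the 17,537 evaluations):

|| **Phase one data** || **Count** ||
|| Online classes, January 2008 to December 2012 || 3,147 ||
|| Individual course evaluations || 17,537 ||
|| Responses explaining "Would you recommend training to others? Why or why not?" || 4,103 (23.4%) ||
|| Responses to "Suggest how we could improve your satisfaction with the course." || 3,218 (18.4%) ||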

=Data Analysis [|*5]=
**Phase one.** In phase one, historical data were collected and analyzed using content analysis. "The goal of content analysis is to create systematic and objective criteria for transforming written text into highly reliable data that can be analyzed for the symbolic content of the communication" (Simmons, Conlon, Mukhopadhyay, & Yang, 2011, p. 44). Initial coding categories were derived from a literature review on learner satisfaction, and responses were coded using these categories: learner-instructor interactions, learner-learner interactions, reflection, sense of community, real-world applicability, and motivation [|*6] (Abrami, Bernard, Bures, Borokhovski, & Tamim, 2010; Boling et al., 2011; Cacciamani, Cesareni, Martini, Ferrini, & Fujita, 2012; Jackson, Jones, & Rodriguez, 2010; Karge, Phillips, Dodson, & McCabe, 2011; Keengwe & Georgina, 2012; Zemke & Zemke, 1995). Analysis of themes was accomplished using a modified version of Strauss and Corbin's (2008) constant comparative method. As new themes were discovered relating to the research questions, previously coded material was reviewed to ensure that responses corresponding to the new theme had not been missed in earlier coding. This iterative, spiral approach to the data analysis ensured inclusion of all thought units relevant to the research topics (Burian et al., 2010; Creswell, 2009). To ensure that data were not lost during the analysis, the qualitative data analysis software package NVivo (QSR International, 2012) was used to collect, code, organize, arrange, and sort information during both phases of the study. The data entered into NVivo for each case included the code, the specific thought unit, and the direction (positive or negative) of the comment.
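As a concrete illustration only (the field names and sample comments are hypothetical, not the study's data), the per-thought-unit record described above, together with the constant comparative re-review step, might be sketched in Python like this:

```python
from dataclasses import dataclass
from typing import List, Optional

# The six a-priori coding categories drawn from the literature review.
CODES = [
    "learner-instructor interactions", "learner-learner interactions",
    "reflection", "sense of community", "real-world applicability", "motivation",
]

@dataclass
class ThoughtUnit:
    """One coded thought unit, mirroring the fields entered into NVivo."""
    code: str                          # one of the coding categories
    text: str                          # the specific thought unit
    direction: str                     # "positive" or "negative"
    participant: Optional[int] = None  # phase two pseudonym code, if any

def flag_for_rereview(units: List[ThoughtUnit], keywords: List[str]) -> List[ThoughtUnit]:
    """Constant comparative step: when a new theme emerges, surface
    previously coded units that may also belong to it for human re-review."""
    return [u for u in units if any(k in u.text.lower() for k in keywords)]

# Hypothetical coded comments.
units = [
    ThoughtUnit("sense of community", "I felt part of a real class online.", "positive"),
    ThoughtUnit("motivation", "Hands-on labs kept me engaged the whole time.", "positive"),
]
# Suppose a new theme around hands-on practice emerges late in coding:
print(flag_for_rereview(units, ["hands-on", "lab"]))
```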
To ensure trustworthiness and stability of the analysis, the first 1,500 evaluation responses (750 per survey question) were independently coded by the researcher and his immediate supervisor. At that point an interrater reliability coefficient (Cohen's kappa) was calculated. The initial interrater coefficient was .74, which indicated an acceptable level of reliability (Holsti, 1969). To increase interrater agreement, the two coders discussed the differences and came to 100% agreement regarding the appropriate codes. The remaining responses were then coded, and the final interrater reliability for all of phase one was .88.
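Cohen's kappa corrects observed agreement for agreement expected by chance: kappa = (p_o - p_e) / (1 - p_e), where p_o is the proportion of items the raters coded identically and p_e is the chance agreement implied by each rater's marginal code frequencies. A minimal sketch of the calculation, using hypothetical codes rather than the study's data:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    assert len(rater_a) == len(rater_b) and rater_a, "need paired ratings"
    n = len(rater_a)
    # Observed agreement: share of items both raters coded identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal code frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codes two raters might assign to ten evaluation comments.
rater_1 = ["interaction", "community", "interaction", "motivation", "reflection",
           "community", "interaction", "motivation", "interaction", "reflection"]
rater_2 = ["interaction", "community", "motivation", "motivation", "reflection",
           "community", "interaction", "community", "interaction", "reflection"]
print(f"kappa = {cohens_kappa(rater_1, rater_2):.2f}")  # kappa = 0.73
```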
**Phase two.** Phase two provided triangulation with the data from the literature search and from phase one. Triangulation seeks convergence among the data sources within a case study and, when successful, increases the validity of the study (Smith & Hakel, 1979 [|*7]). In the analysis of the transcribed semi-structured, open-ended interviews, the researchers continued to use the analysis methodology of phase one [|*8], with one exception: one additional piece of data was collected for each thought unit in the interviews, a numeric code representing a pseudonym for the interviewee to ensure anonymity throughout the process. The interrater reliability for phase two was .94, indicating a high degree of agreement between raters regarding the meaning and direction of the factors leading to increased or decreased learner satisfaction in an adult online professional development course.
=Discussion [|*9]=
One of the prime advantages of a case study is that no specific data collection method has to be used (Sinclair, 2009) and that multiple sources of data may be utilized for analysis (Boling, Hough, Krinsky, Saleem, & Stevens, 2011; Creswell, 2009; Oncu & Cakir, 2011). By choosing to combine an analysis of extant data (Burian, Rogerson, & Maffei, 2010; Chyung & Vachon, 2005) with semi-structured interviews (Falloon, 2011; Nissen & Tea, 2012; Rhode, 2009), the credibility of the study is enhanced (Scagnoli, Buki, & Johnson, 2009), and more comprehensive insights may be gained into the phenomenon under study (Burian et al., 2010; Hrastinski, 2008; Ruey, 2010; Sinclair, 2009). Each data source can provide complementary explanatory information for the other (Bennett et al., 2012). The collection and analysis of historical data grants the researcher access to much more data than might otherwise be the case (Creswell, 2009). Numerous authors have indicated that the trustworthiness of the analysis is contingent on having multiple raters, checking for interrater reliability at several points in the analysis, and using a constant comparative method for identifying codes, themes, and clusters or categories (Batagiannis, 2011; Creswell, 2006; Ke, 2010; Hrastinski & Jaldemark, 2012; Mancuso, Chlup, & McWhorter, 2010).
The constant comparative method also promotes deep within-case analysis: as new factors are determined, earlier material is reviewed to ensure that nuances regarding the factor were not overlooked (Scagnoli et al., 2009). The use of semi-structured, open-ended questions in interviews is often, but not always, the hallmark of a case study (Creswell, 2008). These types of questions provide a chance to delve deeply into the experience of the participants, encourage open communication regarding their experiences, opinions, and rationale for their answers, and provide the chance to clarify or expand upon those answers (Bennett et al., 2012; Rhode, 2009; Ruey, 2010; Sinclair, 2009). By using the same raters and the same codes, themes, and categories, not only did interrater reliability increase, but a clearer and more general understanding of the factors that lead to increased and decreased satisfaction in an adult online professional development course was also gained.
=Conclusion=
The purpose of this descriptive, qualitative case study was to gain an in-depth understanding of the various factors that increase the learner satisfaction of adult online professional development learners. The case study was conducted in two phases. Phase one consisted of an analysis of historical data from course evaluations and a literature search for factors that contribute to or detract from online learner satisfaction. Phase two consisted of 13 semi-structured phone interviews to corroborate or contradict the factors identified in phase one, and to more fully explore factors that increase learner satisfaction in online professional development classes.

 * [|1] A simple chart or table would help to organize these ‘numbers’ in this data analysis section…
 * [|2] transcribed verbatim?
 * [|3] Why random?...why not a purposeful sample to cover the array of diversification issues: gender, race, age, online experience, etc.? – qual study is typically better addressed by ‘select informants’ who can provide rich information on specific phenomenon…thus, purposeful sampling is preferred over random…
 * [|4] Asian? (term ‘oriental’ may more so be in reference to inanimate objects (oriental rugs, oriental ornaments, etc.)…)
 * [|5] This is also the Results section…and, as such, concrete findings needed to be here…what would be results of analysis from phase one?...from phase two?...for this qual study, you needed to have provided direct quotes from archival data and interview transcripts…
 * [|6] …your explanation here denotes a ‘deductive content analysis’…
 * [|7] More recent citation is needed
 * [|8] So for interview analysis, a deductive approach was used to code, which is fine to do…however, it may have been interesting, and potentially particularly useful, to have noted any ‘unexpected’ trend or theme emerging from phase one analysis that may be sought out in phase two as well…the issue here concerns leaving open the possibility (in qual analysis) for ‘new’ themes to emerge from collected data…thus, the usefulness of using both deductive and inductive coding approaches…
 * [|9] By this section, do you mean Discussion for this specific assignment or the mock study?...to be sure, Discussion for the mock study is assignment 4, thanks

**© 2013 Stephen W. Watts. All Rights Reserved.**
=References=
 * Abrami, P. C., Bernard, R. M., Bures, E. M., Borokhovski, E., & Tamim, R. (2010, July). Interaction in distance education and online learning: Using evidence and theory to improve practice. Symposium conducted at //The Evolution from Distance Education to Distributed Learning//, Memorial Union Biddle Hotel, Bloomington, IN. http://dx.doi.org/10.1007/s12528-011-9043-x
 * Ali, A., & Ahmad, I. (2011). Key factors for determining students’ satisfaction in distance learning courses: A study of Allama Iqbal Open University. //Contemporary Educational Technology, 2//(2), 118-134. Retrieved from http://cedtech.net/
 * Batagiannis, S. C. (2011). Promise and possibility for aspiring principals: An emerging leadership identity through learning to do action research. //Qualitative Report, 16//(5), 1304-1329. Retrieved from ERIC database. (EJ941707)
 * Bennett, S., Bishop, A., Dalgarno, B., Waycott, J., & Kennedy, G. (2012). Implementing Web 2.0 technologies in higher education: A collective case study. //Computers and Education, 59//, 524-534. http://dx.doi.org/10.1016/j.compedu.2011.12.022
 * Boling, E. C., Hough, M., Krinsky, H., Saleem, H., & Stevens, M. (2011). Cutting the distance in distance education: Perspectives on what promotes positive, online learning experiences. //Internet and Higher Education//. http://dx.doi.org/10.1016/j.iheduc.2011.11.006
 * Burian, P. E., Rogerson, L., & Maffei, F. R. III. (2010). The research roadmap: A primer to the approach and process. //Contemporary Issues in Education Research, 3//(8), 43-57. Retrieved from http://journals.cluteonline.com/index.php/CIER
 * Cacciamani, S., Cesareni, D., Martini, F., Ferrini, T., & Fujita, N. (2012). Influence of participation, facilitator styles, and metacognitive reflection on knowledge building in online university courses. //Computers and Education, 58//, 874-884. http://dx.doi.org/10.1016/j.compedu.2011.10.019
 * Chen, L.-C., & Lien, Y.-H. (2011). Using author co-citation analysis to examine the intellectual structure of e-learning: A MIS perspective. //Scientometrics, 89//, 867-886. http://dx.doi.org/10.1007/s11192-011-0458-y
 * Chyung, S. Y., & Vachon, M. (2005). An investigation of the profiles of satisfying and dissatisfying factors in e-learning. //Performance Improvement Quarterly, 18//(2), 97-113.
 * Creswell, J. W. (2006). //Qualitative inquiry and research design: Choosing among five traditions// (2nd ed.). Thousand Oaks, CA: SAGE.
 * Creswell, J. W. (2009). //Research design: Qualitative, quantitative, and mixed methods approaches// (3rd ed.). Thousand Oaks, CA: SAGE.
 * Donavant, B. W. (2009). The new, modern practice of adult education: Online instruction in a continuing professional education setting. //Adult Education Quarterly, 59//(3), 227-245. http://dx.doi.org/10.1177/0741713609331546
 * Ferguson, J. M., & DeFelice, A. E. (2010). Length of online course and student satisfaction, perceived learning, and academic performance. //International Review of Research in Open and Distance Learning, 11//(2), 73-84. Retrieved from http://www.irrodl.org/index.php/irrodl
 * George, A. L., & Bennett, A. (2005). //Case studies and theory development in the social sciences//. Cambridge, MA: MIT Press.
 * Holsti, O. R. (1969). //Content analysis for the social sciences and humanities//. Reading, MA: Addison-Wesley.
 * Hrastinski, S. (2008). The potential of synchronous communication to enhance participation in online discussions: A case study of two e-learning courses. //Information and Management, 45//, 499-506. http://dx.doi.org/10.1016/j.im.2008.07.005
 * Hrastinski, S., & Jaldemark, J. (2012). How and why do students of higher education participate in online seminars? //Education and Information Technologies, 17//, 253-271. http://dx.doi.org/10.1007/s10639-011-9155-y
 * Huang, E. Y., Lin, S. W., & Huang, T. K. (2012). What type of learning style leads to online participation in the mixed-mode e-learning environment? A study of software usage instruction. //Computers and Education, 58//(1), 338-349. http://dx.doi.org/10.1016/j.compedu.2011.08.003
 * Jackson, L. C., Jones, S. J., & Rodriguez, R. C. (2010). Faculty actions that result in student satisfaction in online courses. //Journal of Asynchronous Learning Networks, 14//(4), 78-96. Retrieved from http://jaln.sloanconsortium.org/index.php/jaln
 * Karge, B. D., Phillips, K. M., Dodson, T. J., & McCabe, M. (2011). Effective strategies for engaging adult learners. //Journal of College Teaching and Learning, 8//(12), 53-56. Retrieved from http://journals.cluteonline.com/index.php/TLC/article/view/6621
 * Ke, F. (2010). Examining online teaching, cognitive, and social presence for adult students. //Computers and Education, 55//, 808-820. http://dx.doi.org/10.1016/j.compedu.2010.03.013
 * Keengwe, J., & Georgina, D. (2012). The digital course training workshop for online learning and teaching. //Education and Information Technologies, 17//, 365-379. http://dx.doi.org/10.1007/s10639-011-9164-x
 * Kozub, R. M. (2010). An ANOVA analysis of the relationships between business students' learning styles and effectiveness of web based instruction. //American Journal of Business Education, 3//(3), 89-98. Retrieved from http://journals.cluteonline.com/index.php/AJBE
 * Mancuso, D. S., Chlup, D. T., & McWhorter, R. R. (2010). A study of adult learning in a virtual world. //Advances in Developing Human Resources, 12//, 681-699. http://dx.doi.org/10.1177/1523422310395368
 * Martinez-Caro, E. (2011). Factors affecting effectiveness in e-learning: An analysis in production management courses. //Computer Applications in Engineering Education, 19//(3), 572-581. http://dx.doi.org/10.1002/cae.20337
 * Merriam, S. B. (1998). //Qualitative research and case study application in education//. San Francisco, CA: Jossey-Bass.
 * Pelz, B. (2010). (My) three principles of effective online pedagogy. //Journal of Asynchronous Learning Networks, 14//(1), 103-116. Retrieved from http://sloanconsortium.org/publications/jaln_main
 * QSR International. (2012). NVivo 10. Retrieved from www.qsrinternational.com/products_nvivo.aspx
 * Rhode, J. F. (2009). Interaction equivalency in self-paced online learning environments: An exploration of learner preferences. //The International Review of Research in Open and Distance Learning, 10//(1). Retrieved from http://www.irrodl.org/index.php/irrodl/article/view/603/1178
 * Ruey, S. (2010). A case study of constructivist instructional strategies for adult online learning. //British Journal of Educational Technology, 41//(5), 706-720. http://dx.doi.org/10.1111/j.1467-8535.2009.00965.x
 * Scagnoli, N. I., Buki, L. P., & Johnson, S. D. (2009). The influence of online teaching on face-to-face teaching practices. //Journal of Asynchronous Learning Networks, 13//(2), 115-128. Retrieved from http://sloanconsortium.org/sites/default/files/v13n2_scagnoli_1.pdf
 * Simmons, L. L., Conlon, S., Mukhopadhyay, S., & Yang, J. (2011). A computer aided content analysis of online reviews. //The Journal of Computer Information Systems, 52//(1), 43-55. Retrieved from http://www.iacis.org/jcis/jcis.php
 * Sinclair, A. (2009). Provocative pedagogies in e-learning: Making the invisible visible. //International Journal of Teaching and Learning in Higher Education, 21//(2), 197-209. Retrieved from ERIC Database. (EJ899306)
 * Smith, J. E., & Hakel, M. D. (1979). Convergence among data sources, response bias, and reliability and validity of a structured job analysis questionnaire. //Personnel Psychology, 32//, 677-692. http://dx.doi.org/10.1111/j.1744-6570.1979.tb02340.x
 * So, H.-J., & Bonk, C. J. (2010). Examining the roles of blended learning approaches in computer-supported collaborative learning (CSCL) environments: A Delphi study. //Educational Technology and Society//, //13//(3), 189–200. Retrieved from ERIC Database. (EJ899878)
 * Strauss, A. L., & Corbin, J. (2008). //Basics of qualitative research: Techniques and procedures for developing grounded theory// (3rd ed.). Los Angeles, CA: SAGE.
 * Watts, S. W. (2012). //Technological tools impact on learning in online professional development courses//. Unpublished manuscript, Department of Education, Northcentral University, Prescott Valley, AZ. Retrieved from https://stevesncujourney.wikispaces.com/file/view/WattsSEDU7006-8-8Graded.docx
 * Watts, S. W. (2013). //Proposal 2//. Unpublished manuscript, Department of Education, Northcentral University, Prescott Valley, AZ. Retrieved from https://stevesncujourney.wikispaces.com/file/view/WattsSEDU8005-8-2Graded.docx
 * Yin, R. K. (2009). //Case study research: Design and methods// (4th ed.). Thousand Oaks, CA: SAGE.
 * Zemke, R., & Zemke, S. (1995). Adult learning: What do we know for sure? //Training, 32//, 69-82. Retrieved from ERIC Database. (ED504481)