Constructivism within Synchronous and Asynchronous Online Learning Environments? Absolutely.

Integrated Classroom Student Engagement Technology

I have something exciting going on!  I have been tasked with helping design and develop asynchronous active learning strategies!

Yes! For me, that is indeed exciting.

Paradigm Shift in Higher Education

A recent Department of Education (2017) report revealed that, overall, college and university enrollments are declining. At the same time, research indicates the number of students opting to take online courses at our nation’s colleges and universities is actually increasing. In fact, online course enrollments have grown consistently since 2016, and enrollments in online courses are expected to continue growing by double digits in the future (Ginder, Kelly-Reid, & Mann, 2017).

That’s good news.

It is well documented that online college courses make college more accessible, attainable, and affordable. Increased offerings of online postsecondary courses have the potential to put a college education within the reach of even the most challenged households (Mayadas, Bourne, & Bacsich, 2009).

However, as a social constructivist fully bought into the theory that humans learn best within active, collaborative learning environments (Vygotsky, 1978), I have to stop and ask myself this first question: Is it possible to create collaborative, active, and engaging learning activities for groups of students who may be hundreds, or even thousands, of miles apart?

The Challenges Facing Online College Courses

Research has noted that attrition rates are 10 – 20% higher among students enrolled solely in online courses (Angelino, Williams, & Natvig, 2007).  Further, some MOOCs (Massive Open Online Courses) have reported drop-out rates as high as 90% (Hew & Cheung, 2014) for higher education classes – even when the classes are being given away. This implies that instructional designers tasked with creating synchronous and asynchronous online courses had better have some skills if they want to be successful.

iClickers: An Innovative Student Engagement Learning Technology

But technology is evolving to help address such challenges. For instance, I recently learned how easy it is to embed iClicker technology into online Canvas content. As a refresher, iClicker is an easy-to-use, cloud-based “student response system and classroom engagement tool.” This learning technology was designed to enable instructors to engage their students during online and face-to-face learning activities and to provide almost instantaneous feedback to improve student performance.

Simply put, the iClicker app enables students to use their desktop computers, laptops, tablets, smartphones, or any other electronic device to respond to instructor prompts, polls, exams, and quizzes. The technology is so versatile that instructors can:

  • Use iClickers to take attendance;
  • Poll and quiz students in real time (see the sketch after this list);
  • Create study and collaborative learning groups; and
  • Design engaging synchronous and asynchronous online learning activities (“Student Response Systems & Classroom Engagement Tools,” n.d.).
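To make that “almost instantaneous feedback” concrete, here is a minimal sketch of what a response tally looks like under the hood. This is my own toy illustration in Python, not iClicker’s actual (proprietary) API; the poll question and responses are hypothetical.

```python
from collections import Counter

def tally_responses(responses):
    """Count how many students chose each answer option."""
    return Counter(responses)

def feedback_summary(tally, total):
    """Format the distribution an instructor would see moments after polling."""
    return "\n".join(
        f"{option}: {count} votes ({count / total:.1%})"
        for option, count in sorted(tally.items())
    )

# Hypothetical poll: "Which theorist is associated with social constructivism?"
responses = ["A", "C", "C", "B", "C", "C", "A", "C"]  # one entry per student device
print(feedback_summary(tally_responses(responses), len(responses)))
```

The point is pedagogical, not technical: because the tally is available the moment responses arrive, the instructor can address a misconception before moving on.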

And iClicker technology is user-friendly. Canvas is fast becoming the preferred Learning Management System in higher education, so I was pleasantly surprised to learn how easy it is to integrate iClicker technology into Canvas web content. For instance, once the iClicker tool is selected and a remote ID is assigned, instructional designers need only confirm their registration to begin building out engaging content that students can access anywhere, anytime, 24/7 (“Registering Your iClicker in Canvas: Canvas Training Center,” n.d.).

But do iClickers work? Personally, as a scholar-under-construction, I have become a little hard to convince without evidence.

Evidence-Based Impacts and Results

Well. The evidence is in. It has been documented that instructional design systems with integrated iClicker engagement activities helped 118 college freshman English students improve their learning progress during both synchronous and asynchronous situated learning classes (Yang, 2011).  Further, a researcher measured and documented learning improvements among 24 second-semester college calculus students and attributed the increases to the immediate performance feedback and data provided by integrated iClicker activities (Lucas, 2009).  Finally, a constructivism-inspired iClicker instructional design boosted both the factual knowledge and the conceptual exam performance of 858 diverse Midwestern undergraduates because of the instructor’s ability to solicit and immediately use performance feedback during problem-solving exercises, group discussions, and other activities (Shapiro et al., 2017).

Jennie’s Perspective

I see the integration of iClicker into distance and online course design systems as the employment of an evidence-based strategy to improve student performance and promote the development of metacognitive and critical thinking skills. 

So, to answer my first question: Yes.  I feel instructional designers can create active and engaging activities for groups of students who may be hundreds or thousands of miles apart.

But that raises a second question: Do I have what it takes to design active, engaging, and effective student engagement activities for asynchronous online courses?

Well. What do you think? 

I’m just sayin’ …

References

Ginder, S. A., Kelly-Reid, J. E., & Mann, F. B. (2017). Enrollment and Employees in Postsecondary Institutions, Fall 2016; and Financial Statistics and Academic Libraries, Fiscal Year 2016. First Look (Provisional Data). NCES 2018-002. National Center for Education Statistics.

Hew, K. F., & Cheung, W. S. (2014). Students’ and instructors’ use of massive open online courses (MOOCs): Motivations and challenges. Educational Research Review, 12, 45-58.

Lucas, A. (2009). Using peer instruction and i-clickers to enhance student participation in calculus. PRIMUS, 19(3), 219-231.

Mayadas, A. F., Bourne, J., & Bacsich, P. (2009). Online education today. Science, 323(5910), 85-89.

Registering Your iClicker in Canvas: Canvas Training Center. (n.d.). Retrieved from https://utexas.instructure.com/courses/633028/pages/registering-your-iclicker-in-canvas

Shapiro, A. M., Sims-Knight, J., O’Rielly, G. V., Capaldo, P., Pedlow, T., Gordon, L., & Monteiro, K. (2017). Clickers can promote fact retention but impede conceptual understanding: The effect of the interaction between clicker use and pedagogy on learning. Computers & Education, 111, 44.

Student Response Systems & Classroom Engagement Tools. (n.d.). Retrieved from https://www.iclicker.com/

Vygotsky, L. (1978). Interactions between learning and development. In Mind and Society (pp. 79-91). Retrieved from https://www.docsity.com/en/vygotsky-readings-on-the-development-of-children/2233276/

Yang, Y. F. (2011). Engaging students in an online situated language learning environment. Computer Assisted Language Learning, 24(2), 181-198.

“To be, or not to be?” That really is a good question.

Putting all politics aside, you would have to admit that Ayn Rand (1957) wrote a pretty good book.

The Struggle to Become

As a quick recap, Ayn Rand’s Atlas Shrugged tells the tale of a dystopian society where oppressive government officials and businessmen use regulations and other tricks to shackle the creativity of highly developed intellectuals. Misguided policies are implemented to manage and regulate individual creativity and control the output of the society’s great thinkers. The result is a country separated into “producers,” the creative minds, and “looters,” a powerful class of moochers. The latter exploit the productivity of the former with abandon, claiming collectivism over self-interest for the betterment of all. Moocher rule is enforced through backbiting, double-dealing, under-the-table deal making, and oppressive trade practices backed by stringent commerce regulations. Offenders are destroyed overnight by public opinion and find themselves facing stiff consequences. In the midst of the struggle, however, a shadowy figure emerges. John Galt appears and begins convincing the great thinkers to go on a “strike of the mind.” Soon, the society’s most brilliant and productive individuals begin to disappear, leaving their jobs and employers abandoned.  Chaos ensues as the dependent class begins to crumble, and the economy collapses as its leaders realize that there is no one left to bring home the bread. Rand’s 1,168-page tome ends with the striking producers building a new world dedicated to reason, individualism, and the right to the fruits of one’s thinking (SparkNotes Editors, 2002).

Popular opinion is that Rand wrote Atlas Shrugged as a statement of her objectivist perspective. She was well known for her association with those who promoted the importance of one’s right to benefit from one’s labor and creative self-expression. It should be small wonder, then, that the ethical egoism promoted in Atlas Shrugged led to negative critical reviews upon its release. In fact, the novel’s underpinnings remain a topic of heated debate even today – over 60 years later.

The Slings and Arrows of Cybertheft

While realizing the book’s fictitious dystopia could never happen in America (right?), we must admit that the growing plague of cybercrime is having pretty much the same effect. Web-based crimes are reported to have cost individuals and companies $13 million in 2017 alone (Accenture Security, n.d.).

Gordon and Ford (2006) defined “cybercrime” as a range of criminal activity that spans from data theft to copyright infringement. The outputs of cybercrime were described as fraud, forgery, unauthorized access, child pornography, and cyberstalking. But what is truly disturbing is that Gordon and Ford found these illegal acts represent a continuum of crimes almost exclusively aligned with technological intrusions and purposeful Internet-based security attacks (pp. 14-15).

After studying the phenomenon, Poufinas and Vordonis (2018) wrote that digital theft and cybercrimes are so dangerous because they significantly impact global economies through the direct loss of taxable income, property, and profits (p. 128).  The researchers found that cybercrimes target the income-generating capacity of individuals and companies, inhibiting their ability to earn legitimate revenues from goods and services sold. They also wrote that theft of proprietary intellectual property and trade secrets threatens a nation’s economic system, creating security risks for all contributors to an economy (p. 130).

Of equal concern should be findings by Ablon, Golay, and Libicki (2014), who explain that the demand for stolen trade secrets and intellectual property has created a dark web “Hackers’ Bazaar.” These websites allow like-minded criminals to congregate online to trade security breach techniques, tools, and weapons created exclusively to target and steal from their victims. Such websites appear to be flourishing internationally with little regard, concern, or care for the resultant costs or damage done to individuals, companies, and economies.

To Catch a Thief

The ruthlessness displayed by cyber thieves is leading design and technology engineers toward the study of computer-linguistic analytics as a possible defense. Fedushko and Bardyn (2013) studied promising technologies under development that would use algorithms to profile and track cyber criminals online using web-personality traits.
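Fedushko and Bardyn’s actual algorithm is beyond the scope of this post, but the general idea behind computer-linguistic profiling can be illustrated with a toy stylometric comparison: represent each writing sample as character trigram frequencies and measure how similar two samples are. This is my own simplified sketch of that family of techniques, not the authors’ method, and the samples are invented.

```python
import math
from collections import Counter

def trigram_profile(text):
    """Character-trigram frequency profile of a writing sample."""
    text = text.lower()
    return Counter(text[i:i + 3] for i in range(len(text) - 2))

def cosine_similarity(p, q):
    """Cosine similarity between two profiles (closer to 1.0 = more alike)."""
    dot = sum(p[g] * q[g] for g in set(p) & set(q))
    norm = math.sqrt(sum(v * v for v in p.values())) * math.sqrt(sum(v * v for v in q.values()))
    return dot / norm if norm else 0.0

# Hypothetical writing samples: which suspect does the anonymous post resemble?
suspect_a = "i dont care what u think. the deal is off, pay up or else."
suspect_b = "Per our prior agreement, payment is now overdue. Kindly remit promptly."
anonymous = "u think i care?? deal is done. pay up."

anon = trigram_profile(anonymous)
print("Similarity to suspect A:", round(cosine_similarity(anon, trigram_profile(suspect_a)), 3))
print("Similarity to suspect B:", round(cosine_similarity(anon, trigram_profile(suspect_b)), 3))
```

Real systems use far richer “web-personality” features (vocabulary, punctuation habits, posting times), but the principle is the same: writing style leaves a measurable fingerprint.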

Nevertheless, while such technologies sound exciting, one can only wonder how many intellectual producers and creative thinkers are simply holding onto possible solutions for everything from the common cold to global warming for fear they will not be compensated because of cyber theft.

Jennie’s Perspective

Taken in context, Rand’s Atlas Shrugged could provide the opening discussion for my question: “To be, or not to be?” Should creative producers continue to generate cutting-edge, innovative, and original content, products, and services knowing full well that groups of lazy looters are patiently waiting to steal their ideas or illegally download every single letter they type? My answer is simple and emphatic: Yes.

For instance, one of my final assignments was to create an advanced workforce development instructional design that included all the bells and whistles. But, my dilemma was: Should I release that little jewel I’ve been holding in my pocket for a few years knowing that I could not protect it?

While I faced a moment of hesitation, once I got started, I could not hold back. I quickly opened my mind and really got into it. My professor commented that she found the end result “exemplary,” which meant a lot coming from her. (She’s one of the best in the industry.) That simple word made my early feelings of discomfort vanish into thin air. (Along with the loss of my intellectual property, I’m sure.)

But, I decided, why should I deny myself self-actualization and creative expression because someone else can’t stop stealing? In retrospect, I realize that, no matter how many times some low-down, dirty, despicable, lying looter steals my stuff, I will continue pursuing a degree of excellence. Even if that means I have to carry a few moochers on my coattails. That is, of course, until they get caught. Keep hope alive!

I’m just sayin’ …

References

Ablon, L., Golay, A. A., & Libicki, M. C. (2014). Markets for Cybercrime Tools and Stolen Data: Hackers’ Bazaar. Santa Monica, California: RAND Corporation.

Accenture. (n.d.). Retrieved from https://www.accenture.com/us-en

Bernstein, A. (2000). CliffsNotes on Rand’s Atlas Shrugged. Foster City, CA: Cliffs Notes. Retrieved from http://search.ebscohost.com/login.aspx?direct=true&db=nlebk&AN=67214&scope=site

Fedushko, S., & Bardyn, N. (2013). Algorithm of the cyber criminals identification. Global Journal of Engineering, Design & Technology, 2(4), 56-62.

Gordon, S., & Ford, R. (2006). On the definition and classification of cybercrime. Journal in Computer Virology, 2(1), 13-20.

McQuade, S. C. (2009). Encyclopedia of Cybercrime. Westport, Conn: Greenwood Publishing Group.

Poufinas, T., & Vordonis, N. (2018). Pricing the Cost of Cybercrime—A Financial Protection Approach. iBusiness, 10(3), 128.

Voice of the Customer: Successive Approximation Model Instructional Design

[Figure: Successive Approximation Model instructional design. Source: Johnson, 2019. Unpublished.]

I learned about another instructional design model a few weeks ago: the Successive Approximation Model (SAM).

SAM was the brainchild of Michael Allen (2012), Chief Executive Officer of Allen Interactions, an educational e-learning applications provider. Allen determined that the prolonged A.D.D.I.E. (Analyze, Design, Develop, Implement, and Evaluate) linear “waterfall” design process was too convoluted and inflexible to adjust to the volatile, disruptive, and ever-changing demands of e-learning, gamification, and micro-learning innovations (Allen, 2012).

Successive Approximation Model Instructional Design Process

Allen proposed an alternative to A.D.D.I.E. by collapsing its five-phase linear waterfall process into three cyclical, non-linear phases, summarized below:

Phase I: Analyze. During the analyze phase, instructional designers create a pathway to a “Savvy Start” for the instructional design process. Designers work collaboratively with stakeholders, engaging in active brainstorming to gather and organize information regarding learners and their learning needs.  Ideas from these meetings are assessed and used to determine the requirements for the instructional design based on learning outcomes and objectives.

Phase II: Design. During the design phase, a prototype of the instructional design is created and sent to stakeholders for confined testing.  An iterative process uses feedback from stakeholder evaluations and critiques to make the corrections and adjustments required to refine and improve the prototype until a semblance of the desired product has been reached.

Phase III: Development. During the development phase, the prototype is implemented and sent to stakeholders for testing on a broader scale. The iterative evaluate-correct-adjust-redesign-implement cycle continues across the three phases until an instructional design emerges that stakeholders accept as a proven training solution.
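To make the contrast with A.D.D.I.E.’s one-way waterfall concrete, here is a minimal sketch of SAM’s loop as I understand it. This is my own conceptual model in Python, not Allen’s specification; the function names and the toy “feedback” structure are mine.

```python
def sam_cycle(prototype, gather_feedback, revise, stakeholders_accept, max_iterations=10):
    """Conceptual model of SAM: evaluate, correct, and redesign in repeated
    passes until stakeholders accept the design (or the budget runs out)."""
    for passes in range(1, max_iterations + 1):
        feedback = gather_feedback(prototype)      # stakeholders test the prototype
        if stakeholders_accept(feedback):          # a proven training solution
            return prototype, passes
        prototype = revise(prototype, feedback)    # corrections feed the next pass
    return prototype, max_iterations               # ship the best version so far

# Toy usage: feedback is a list of open issues; each revision resolves one.
design, passes = sam_cycle(
    prototype={"issues": ["unclear objectives", "no practice activity"]},
    gather_feedback=lambda p: list(p["issues"]),
    revise=lambda p, fb: {"issues": fb[1:]},
    stakeholders_accept=lambda fb: not fb,
)
print(f"Accepted after {passes} pass(es)")
```

Note how evaluation sits inside the loop rather than at the end; that placement is the whole argument for SAM over a waterfall.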

Successive Approximation Model in Practice

Question: Is SAM an acceptable alternative to A.D.D.I.E.?

Acknowledged advantages of the SAM instructional design process include that it reflects realistic strategic and tactical product design practice: it incorporates feedback and input from a multitude of sources, proceeds in small steps during development, relies on collaborative teamwork and communication, and allows project development efforts to be balanced so those of greatest importance receive the most focus.

Known disadvantages of the SAM process include that team members can slip into an “I can always find something wrong” mentality, which can bog down and delay the design process, and that corrections made during one iteration can undo corrections made during an earlier iteration, allowing issues to go undetected.

Overall, however, it is generally agreed that SAM’s cyclical and iterative design process allows for the early detection of errors and includes a built-in quality assurance process. SAM also aligns with the agile approach to project management and is adaptable to most agile-support software products, including Scrum (Allen, 2006; Allen, 2012; Czeropski & Pembrook, 2017; Galagan, 2013; Glova, 2018; Allen Interactions, 2017).

Jennie’s Perspective

Back to the question: Is SAM an acceptable alternative to A.D.D.I.E.?

I accept the SAM approach as an alternative. What I like most about the SAM instructional design process is its focus on the voice of the customer.  The need to heed the voice of the customer can often get overlooked during the instructional design process as practitioners strive to create the “perfect” lesson plan.

Heeding the voice of the customer should be especially important in higher education. Numerous researchers are warning academia that the loss of customers’ (students’ and society’s) goodwill and confidence can not only tarnish institutional images but also strain budgets by adding intangible costs (Matorera, 2015; Mulay & Khanna, 2017; Raharjo, Xie, Goh, & Brombacher, 2007).

I like money well spent. Don’t you?

References

Allen, M. (2012). Leaving ADDIE for SAM: An agile model for developing the best learning experiences. American Society for Training and Development.

Allen, W. C. (2006). Overview and evolution of the ADDIE training system. Advances in Developing Human Resources, 8(4), 430-441.

Czeropski, S., & Pembrook, C. (2017). E-Learning Ain’t Performance: Reviving HPT in an Era of Agile and Lean. Performance Improvement, 56(8), 37–45. https://doi.org/10.1002/pfi.21728

Galagan, P. (2013). Greed for Speed. T+D, 67(5), 22.

Glova, S. E. (2018). Toward Effective Facilitation for Adult Learners: An Action Research Study on the Design and Delivery of Workshops for Women Business Owners. ProQuest LLC.

Allen Interactions. (2017). Iterative eLearning Development with SAM. Retrieved from https://www.alleninteractions.com/sam-process

Matorera, D. (2015). A Conceptual Analysis of Quality in Quality Function Deployment-Based Contexts of Higher Education. Journal of Education and Practice, 6(33), 145–156.

Mulay, R., & Khanna, V. T. (2017). A Study on the Relationship between the Voice of Customer with the Cost of Quality in Processes of Professional Higher Education Institutions. South Asian Journal of Management, 24(4), 55-72.

Raharjo, H., Xie, M., Goh, T., & Brombacher, A. (2007). A Methodology to Improve Higher Education Quality using the Quality Function Deployment and Analytic Hierarchy Process. Total Quality Management & Business Excellence, 18(10), 1097–1115. https://doi.org/10.1080/14783360701595078

Zaharie, M., Osoian, C., & Gavrea, C. (2013). Applying Quality Function Deployment to Improve Quality in Higher Education: Employers’ Perspective. Managerial Challenges of the Contemporary Society, (5), 172–176.

Knowing Right from Wrong. How not to get carried away with the bells and whistles.

“Should I or shouldn’t I?”

Question: What is the most important thing to consider while designing online learning materials?

Kali, Levin-Peled, and Dori (2007) sought to “formulate design-principles that translate socio-constructivist learning into general guidelines, design hybrid courses according to these principles, explore the effect of the courses on student learning, refine the principles, and contribute knowledge to a Design Principles Database” (p. 2). (Okay.)

Online Instructional Design Considerations

Basically, the researchers conducted a study that sought to promote higher-order thinking and critical reflection skills, collaborative learning, and product construction among research participants in a hybrid learning environment (p. 1).

Between 2004 and 2007, Kali, Levin-Peled, and Dori oversaw an iterative instructional design research project that involved 624 undergraduate and graduate students learning to design hybrid college courses and online learning assessments.  The three courses created required the integration of collaborative peer learning activities, the development and reuse of student artifacts, and the creation of embedded online learning assessments (p. 2).

Specifically, the student learning objectives included:

  • Collaboratively constructing Wiki pages;
  • Designing and developing a two-week online mini-course; and
  • Creating lesson plans to teach learning assessment topics to their peers.

Kali, Levin-Peled, and Dori reported positive learning outcomes among participants at the end of the study. However, what was of most interest to me was how the researchers chose to introduce and conclude their study.

Traditional v. Recommended Approaches

At that time, Kali, Levin-Peled, and Dori wrote that education and educators had remained “traditional” in their approaches to instructional design. Further, the researchers cited other studies concluding that instructors in higher education may simply be uploading learning materials to websites instead of using instructional design principles that promote “meaningful learning.”  The researchers also wrote that most college courses are not interactive and do not allow for student participation, shared ownership, learning motivation, or assessment (p. 1). (Humph.)

Therefore, Kali, Levin-Peled, and Dori concluded there is a “large gap” within the body of knowledge regarding the need to use iterative course design principles when designing for hybrid learning environments.  The researchers thereby advised academia that educators and administrators should “gain their strength” by becoming part of the Design Principles Database, which was, by the way, developed and maintained by – wait for it – Kali, Levin-Peled, and Dori (p. 7)!  (Got it.)

Intrigued, I decided to “gain” more “strength” by joining the Design Principles Database myself. Unfortunately, I found the website appears not to have been updated since 2008. (Oh, well.)

Jennie’s Perspective

Obviously, times have changed since Kali, Levin-Peled, and Dori made their observations and conclusions. Recent studies indicate that most colleges and universities now embrace blended and hybrid courses, as evidenced by the rapidly increasing enrollment statistics for online college courses.

In addition, based on my studies and conversations with university instructors, administrators, and peers, I have reason to assume that colleges and universities are using instructional design approaches that include interactive and collaborative learning activities.

This is good. However, Warren and Lin (2012) warn that creating interactive and collaborative learning activities should not be considered the most important thing during the instructional design process.

Warren and Lin advise that instructional designers, college professors, and other creators of online learning material should instead recognize that legally afforded accessibility, ethical constructs, and learning objectives should be the primary concerns during the online learning design and development phases.  Otherwise, owners of online, hybrid, or blended learning materials could find themselves facing maximum risks for minimal educational benefits. (No! Thank you.)

In my opinion then, instructional designers should always remember to do no harm while creating interactive activities and learning tasks for online learners. They should also remember that certain vulnerable populations are protected by law. (Can anybody say “IRB?”)

So. Since I was unable to gain strength from that website, I guess I will just have to heed Warren and Lin’s warnings and ask myself one fundamental question while designing that fancy, sparkling, shiny, new, exciting, and engaging learning solution: I know I can do this, but – should I? (Warren & Lin, 2012)

I’m just sayin’ …

References

Kali, Y., Levin-Peled, R., & Dori, Y. J. (2007, October). How Can Hybrid Courses Designed with Socio-Constructivist Design-Principles Promote Learning in Higher Education?. In E-Learn: World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education (pp. 6071-6078). Association for the Advancement of Computing in Education (AACE).

Warren, S. J., & Lin, L. (2012). Ethical considerations for learning game, simulation, and virtual world design and development. In Handbook of Research on Practices and Outcomes in Virtual Worlds and Environments (pp. 1-18). IGI Global.

Qualitative Data Analysis: When you want more than just the numbers.

What is “triangulation” anyway?

It appears qualitative content analysis has come a long way. Formerly, content analysis was attributed only to quantitative research, but qualitative content analysis is gaining respect because of its ability to address some of the weaknesses of statistical analysis and probabilities. Unlike quantitative statistics, qualitative content analysis allows for the subjective interpretation of data using systematic coding classifications to identify themes or patterns (Hsieh & Shannon, 2005, p. 1278) and to analyze data within the context of communication (Mayring, 2000, p. 2).

Question: How do qualitative data collection and analysis methods differ from quantitative methods?

Qualitative Content Analysis and Why

Zhang and Wildemuth (2009) explain that qualitative content analysis allows researchers to condense large amounts of data into categories, draw inferences and interpretations, and then carefully examine the information gathered, applying inductive or deductive reasoning repeatedly until the data make sense.  Further, Zhang and Wildemuth identify methods for coding and categorizing qualitative data that allow researchers to ground their examination of a topic of inquiry in flexible or standardized categories as they attempt to generate valid inferences and interpretive theories about the observed phenomenon (pp. 1-2).

Best Practices for Coding Qualitative Data

Zhang and Wildemuth supported their position by explaining how researchers can condense raw data into categories or themes to facilitate examination and comparison (p. 2).  These researchers advocate that qualitative researchers use eight key steps to prepare, define, analyze, test, retest, and assess raw data. Zhang and Wildemuth recommend the following steps for coding qualitative data (a short sketch follows the list):

Step 1. Prepare the data. Convert the various types of data collected into text for examination.

Step 2. Classify data. Convert text into “chunks” or units of information based on content or context.

Step 3. Coding Scheme.  Chunks of information are organized into categories using context and content then coded according to a scheme.

Step 4. Test Coding Scheme. A sample of the coded data is used to validate consistency and sufficiency. The process is repeated until all doubts about coding accuracy are satisfied, and a coding manual is created.

Step 5. Code Data. The sample coding is applied to all data collected, using the coding manual as a guide.

Step 6. Assess Coding Accuracy. Coding schemes are rechecked for consistency to ensure all mistakes and problems have been addressed or mitigated.

Step 7. Draw Conclusions. Themes and patterns are explored to make sense of the data, using multiple approaches (triangulation), examining relationships between categories, uncovering patterns, and testing categories.

Step 8. Report Methods and Findings. Inferences are drawn from the data and reported along with implications, limitations, and recommendations (Zhang & Wildemuth, 2009, pp. 2-5).
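To ground Steps 4 and 6, here is a minimal sketch of one common consistency check: two coders apply the draft scheme to the same sample of chunks, and simple percent agreement signals whether the scheme and manual need another iteration. This is my own illustration, not a procedure from Zhang and Wildemuth; the codes and the rule of thumb in the comment are hypothetical.

```python
def percent_agreement(coder_a, coder_b):
    """Share of text chunks to which two coders assigned the same category."""
    assert len(coder_a) == len(coder_b), "both coders must code the same sample"
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

# Hypothetical codes assigned to ten interview chunks using a draft scheme.
coder_a = ["engagement", "barrier", "support", "barrier", "engagement",
           "support", "barrier", "engagement", "support", "barrier"]
coder_b = ["engagement", "barrier", "support", "support", "engagement",
           "support", "barrier", "engagement", "barrier", "barrier"]

agreement = percent_agreement(coder_a, coder_b)
print(f"Agreement: {agreement:.0%}")  # low agreement? revise the manual and retest
```

Richer statistics exist (such as Cohen’s kappa, which corrects for chance agreement), but even this crude check makes the iterate-until-satisfied logic of Step 4 tangible.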

But … is it right?

Zhang and Wildemuth warn that researchers should remain transparent throughout the above process by taking thorough field notes and recording their data collection, coding, analysis, and reporting methods (credibility).  Researchers are also advised to generate hypotheses that can be applied in other contexts or settings (transferability). Finally, Zhang and Wildemuth recommend qualitative researchers ensure their internal processes account for changing conditions (dependability) and that their methods can be confirmed by others who read or review their research results (confirmability) (p. 6).

Ravitch and Carl (2015) also noted the importance of credibility, transferability, dependability, and confirmability for qualitative researchers. These authors indicated that qualitative researchers are held to a different standard: they must meet the standard of trustworthiness because of the subjectivity of their interpretations of data. By comparison, quantitative researchers seek validity, reliability, and objectivity for their understanding of statistical probabilities. The former are much more prone to bias and vulnerable to positionality.

Credibility Through Triangulation

Therefore, Ravitch and Carl recommend embedding triangulation processes into data collection and analysis, such as:

  • Using different strategies and methods during data evaluation;
  • Searching for and collecting data from as many data sources as possible using different sampling strategies and individuals at varying times and places;
  • Involving multiple researcher perspectives during interpretation of data, identification of themes and patterns, and reporting findings and conclusions;
  • Framing the theoretical underpinnings of the study and noting how it compares or contrasts with previous works and why; and
  • Intentionally and systematically including a wide range of participants and diverse groups (Ravitch & Carl, 2015, pp. 186-200).

Jennie’s Perspective

While exploring the above advice from Zhang and Wildemuth and from Ravitch and Carl, I assumed qualitative coding of participant interviews would be more than a notion. I was right. My first experience coding interview field notes was quite a challenge. I struggled through it, however, and persevered until I reached an elementary level of coding accuracy sufficient to get a passing grade on the assignment.

Lessons learned? Garbage in. Garbage out.  In other words, qualitative data collection and coding takes practice.  And, I discovered I definitely need more of that.

References

Hsieh, H.-F., & Shannon, S. E. (2005). Three approaches to qualitative content analysis. Qualitative Health Research, 15(9), 1277-1288.

Mayring, P. (2000). Qualitative content analysis. Forum: Qualitative Social Research, 1(2).

Ravitch, S. M., & Carl, N. M. (2015). Qualitative research: Bridging the conceptual, theoretical, and methodological. Sage Publications.

Zhang, Y., & Wildemuth, B. M. (2009). Qualitative analysis of content. Applications of Social Research Methods to Questions in Information and Library Science, 308-319.

The Rules of Conversation: Another Win for Science

I was provided the opportunity to conduct a mock qualitative interview a few weeks ago. So what if my interview participant was a member of my class cohort?  The challenge was the same: capture good data pertinent to the objectives of the study – sufficient data to allow for qualitative coding and analysis later.

Question: What is the best way to capture both individual and social contexts of a participant’s lived experiences during qualitative interviews?


The Science of Qualitative Interviewing

Ravitch and Carl (2015) explain that context is everything. The authors wrote that “every interview is conducted within multiple, intersecting contexts.” The inference is that regardless of who is being interviewed, where the participant is being interviewed, or how the participant is being interviewed, the interviewer should remain cognizant of the contexts that might shape the participant’s opinions and views (p. 148).

After warning that “general questions lead to general responses,” Ravitch and Carl advise that a skilled interviewer understands that interviews happen within the complex ecosystem of a participant’s life and that both individual and social conditions can influence a participant’s responses to interview questions (p. 148).

The Rules of Conversation

Work by Gibson (2003) supports Ravitch and Carl by noting there are two certain things about conversation. First, he explains that conversation is “rule-governed” concerning who speaks and what they say, and that the dynamics of discourse ensure basic levels of order and intelligibility. Absent the rules, Gibson opines, the phenomenon of conversation would not exist, leading to perpetually chaotic encounters among humans (p. 1335). Gibson argues the rules of conversation must be established to ensure that only one person speaks at a time, that the conversation is set at a level that enables all parties to understand what is being said, and that discord is avoided or limited to an extent that keeps an open and honest exchange going between the parties.

The second thing Gibson explains about conversation is that life can deal one or the other party an unfair hand in terms of who gets to talk, when, and under what terms; he refers to this as “differentiation of persons.” Gibson observes that personalities and positions mean conversations can lead to either favorable or unfavorable consequences, which can affect the balance of what is said and heard between the parties involved. Gibson wrote that these two rules create tension during conversations that must be addressed simultaneously for full and open conversation to flow (p. 1336). Calling this a “participation shift,” Gibson advocates that skilled interviewers must learn to manage the turn-by-turn transformation of the interview process using the two stated rules if the full width and breadth of data collection through interviews is to be achieved (p. 1368).

If Gibson’s position is accepted, how can qualitative interviewers design questions that capture both the individualized social processes that lie at the root of a participant’s responses (micro-sociological) and the systemic or social-environmental processes that might influence a participant’s answers?

How to Conduct Quality Qualitative Interviews

Ravitch and Carl wrote that a skilled interviewer distinguishes between individualized and contextualized responses by understanding in advance how the questions to be asked might shape a person’s responses and how such variables will impact the study’s goals (p. 148). The authors infer that the key to making the distinction lies in how the questions are framed.  For instance, Ravitch and Carl wrote that interviewers should write research questions that probe for the:

  • Individualized (micro-social) and environmental/systemic (macro-social) contexts of the participant’s lived experience and their relevance to the research study;
  • Specifics surrounding places, times, positions, and circumstances while capturing the participants’ lived experiences;
  • Assumptions the participant is making regarding what the interviewer or society knows about the phenomena; and
  • Implicit cultural or economic aspects of the participant’s thoughts, opinions, attitudes, and beliefs about the topic and the participant’s views related to the suitability of their feelings.

Ravitch and Carl advise that these foundational principles for writing qualitative research questions can lead to quality data collection during structured, semi-structured, and unstructured qualitative interviews by ensuring sufficient data is captured in the observations and field notes to allow for comprehensive data analysis later (pp. 153-154).

Jennie’s Perspective

The topic of inquiry for my mock qualitative interview was how teachers feel about student engagement, the use of 21st Century problem-based learning techniques, and the systemic/cultural impediments and supports provided to teachers interested in using problem-based learning strategies.

The individualized context was that my cohort member had taught at a local high school that fully supported teachers who used student engagement and problem-based learning to promote experiential learning. The interviewee described several instances within stated timeframes where she had used problem-based learning in her classroom, as well as several experiences where she received support from her principal while using student engagement and problem-based learning strategies. The interviewee also conveyed that she realized not all school districts support these types of innovations and that other districts are failing to provide their teachers with adequate training to implement such initiatives.

The result was that, had the assignment required me to code and analyze the data after the interview for inferences and conclusions, I had captured enough micro-sociological and macro-sociological data to do so.

“Good job, Jennie!”, some of you may be thinking. However, the caveat is that my professor developed the interview questions. Point proved: conducting good qualitative interviews is a science. When properly planned and orchestrated, it will work every time.

References

Gibson, D. R. (2003). Participation Shifts: Order and Differentiation in Group Conversation. Social Forces, 81(4), 1335. https://doi.org/10.1353/sof.2003.0055

Ravitch, S. M., & Carl, N. M. (2015). Qualitative research: Bridging the conceptual, theoretical, and methodological. Sage Publications.

Invisible Bridges for Sale! Real Cheap.

Hey! For 20 of these babies I could ….

Have you ever purchased something that you thought was such a great deal until – you got home, opened the box, and realized you’d been had?

Question: How can you ensure the person you are hiring can get the job done?

Understanding Fraud

While exploring the reasons why crooked salespersons perpetrate fraud against buyers, Darby and Karni (1973) found some people give false information regarding their skills and qualifications to bilk prospective buyers who would otherwise not hire them. Specifically, cozeners knowingly and willingly mislead prospective buyers by claiming skills, qualifications, knowledge, experience, and product or service quality with the sole purpose of separating buyers from their hard-earned money.  Period.  The game is, without such lies and pretenses, the deceivers know no one would pay a dime for their bag of tricks (p. 67).

Darby and Karni identified “credence qualities” fraud as the most egregious. They felt this type of scam is the worst because charlatans use a systematic method to lift hard-earned currency from bank accounts. Credence fraud involves con artists carefully seeking out, grooming, and playing their victims to first gain favor, then stringing the buyers along with lies until the defrauders can escape with their money. Darby and Karni explained that credence tricksters look for buyers of products or services in high demand, then offer the buyers the same items at a reduced or lower-than-market cost.  Once the scammers get a sucker on the hook, the swindlers demand an advance payment to lock the contract down, then meticulously keep the customer waiting and in a positive mood until the grifters can get away with at least one more check (pp. 69-72). (Outrageous!)

While some call for governmental or law enforcement interventions against such cheaters, others rightfully argue that regulations are no substitute for consumer due diligence and contract monitoring (p. 84).  Instead of regulations, Darby and Karni advise that buyers can prevent credence fraud by demanding proof of ability or product quality before purchase (p. 87).

Digital Badging and Micro-Credentialization

The ability to provide proof of what you know or what you can do is the cornerstone of digital badging, or micro-credentialization.  Badging has long been a symbol of achievement and performance. For decades, governments, militaries, and social groups have all used badges to denote class, character, and capabilities.
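The “proof” a digital badge carries is simply structured, verifiable metadata. As a rough sketch, here is the kind of information an Open Badges-style assertion records, expressed as a Python dictionary. The field names are modeled loosely on the Open Badges vocabulary, and every value here is made up for illustration.

```python
# A hypothetical earner's badge: metadata a verifier can check with the issuer.
badge_assertion = {
    "type": "Assertion",
    "recipient": {"type": "email", "identity": "learner@example.com", "hashed": False},
    "badge": {
        "type": "BadgeClass",
        "name": "Qualitative Coding Fundamentals",
        "criteria": "Completed the eight-step coding exercise with assessed accuracy",
        "issuer": "Example Open Learning Provider",
    },
    "issuedOn": "2015-06-01",
    "verification": {"type": "hosted"},  # proof lives with the issuer, not the claimant
}

# The buyer's move: verify the claim against the issuer, not the seller's say-so.
print(badge_assertion["badge"]["name"], "issued on", badge_assertion["issuedOn"])
```

The key design point for fraud prevention is the verification field: the evidence is hosted by the issuing institution, so a hustler cannot simply assert a credential he never earned.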

Law (2015) studied the relatively new phenomenon of digital badging in higher education. After noting that Massive Open Online Courses (MOOCs) led the way by offering individuals a chance to earn proof they learned things by reading threads, reviewing content, or voting on content posted by universities and colleges all over the world, Law examined a newer player: open educational resources (OER). This free online digital learning platform also enables individuals and educational institutions to access free learning and micro-credentialization that helps them prove they gained knowledge and skills through online education provided by tutors and educators (p. 222).

But, does it work? To test the effectiveness of Open University (OU) open badging, Law conducted a large-scale study that administered a multiple-choice survey to 2,448 OU users between April and July 2013 and again between April and July 2014. The results revealed OU served a diverse range of learners interested in furthering their knowledge, skills, and abilities for a variety of reasons.

And the majority of respondents indicated they achieved that purpose.  Of total responses, 80% said OU offered high-quality content, with 58% saying OU studies improved their confidence and abilities.  One of the main benefits respondents noted was the ability to receive micro-credentialization, or digital badges, as proof of their accomplishments and achievements (Law, 2015).

But, what does this have to do with chiselers, hoaxers, hustlers, impostors, impersonators, and snake oil salespeople?  

Jennie’s Perspective

Back to the question: How can you ensure the person you are hiring can get the job done?

Well. The answer appears simple.  If some pretender starts bragging about why you should pay him big bucks to do this or that – shouldn’t he be able to prove it with more than just talk?  Especially considering that, at this point, digital badges are either low-cost or no-cost?

Buyers, beware of the scam. Make them prove it!

I’m just sayin’ ….

References

Ahn, J., Pellicone, A., & Butler, B. S. (2014). Open badges for education: What are the implications at the intersection of open systems and badging? Research in Learning Technology, 22.

Darby, M. R., & Karni, E. (1973). Free competition and the optimal amount of fraud. The Journal of Law and Economics, 16(1), 67-88.

Law, P. (2015). Digital badging at The Open University: Recognition for informal learning. Open Learning: The Journal of Open, Distance and e-Learning, 30(3), 221-234.