“The New Ph.D.” Again

“There are far, far better things ahead than any we leave behind.”
C.S. Lewis

A while ago, I wrote a blog post Re-Envisioning the PhD +13 Years.  While a graduate student, I was associated with the Woodrow Wilson Re-Envisioning the PhD project.  In that blog post, I reviewed my old notes to see if my opinion had changed. I concluded that the problem was not with the PhD degree itself but with people trying to hijack the degree for other uses.

Today I came across an article in the Chronicle of Higher Education, “The New Ph.D.: Momentum grows to rewrite the rules of graduate training.” While I am reluctant to dip back into the topic of changing the PhD, there is a lot going on, and I think we should give the Chronicle article a look.  As Rear Admiral Grace Hopper said, “The most dangerous phrase in the language is: We’ve always done it this way.” (By the way, if you don’t know who Grace Hopper was, shame on you; go educate yourself.)

The article starts with a story about Meg Berkobien, a graduate student in comparative literature.  Her dissertation was on 19th-century Catalan-language periodicals.  Meg was not motivated by her project and eventually decided to leave the program.  In a letter to her department chair, Meg wrote,

“Every time I sit down to write, I’m overwhelmed by a quiet despair — that our world is literally on fire and I’m not doing nearly enough to build a better world,” Berkobien wrote in an email to her department chair. “Pair these concerns with a downright awful job market, and I hope it’s clear why I think my best option is to leave.”

Instead of letting Berkobien leave, the department let her “reimagine her dissertation as a series of essays focused largely on her public-facing work, which included building a translators’ collective that prints books and creating translation workshops for immigrant high schoolers learning English.” Beyond Berkobien’s story, the authors devoted a whole section of the Chronicle article to the dissertation.

One complaint is that the dissertation does not prepare students for jobs outside of academia. Since the bulk of doctoral graduates will work outside of academia, maybe the dissertation should reflect that.  Sidonie Smith argues, “The one-size-fits-all proto-book structure shackles scholarship. It often yields bloated projects that don’t merit such long-form treatment.” Meanwhile, Earl Lewis made a much-discussed suggestion that historians should consider allowing students to pursue co-authored dissertations, which, he says, would enable them to produce better answers to really big scholarly questions.

The Chronicle article lists several programs experimenting with alternative dissertations. It also contains several examples where alternative dissertation formats have been successful. However, the article never talks about the purpose of the dissertation.  Why is the dissertation part of the PhD?  Additionally, the dissertation is not that old.  According to DED: A Brief History of the Doctorate, a university awarded the first doctoral degree in the 12th century.  Universities awarded the first PhDs in the 19th century, and Yale awarded the first US PhD in 1861.  Therefore, in the US, the PhD dissertation is at most 159 years old.

What is the dissertation’s purpose? Why should students write anything? The PhD is predominantly a research degree. If you do a web search asking what a PhD is, somewhere in the description will be a phrase like “original research” or “contribute new knowledge to your field.” Writing a dissertation is how you show that your research answered the original research question.

I think the writers of the Chronicle article are confusing several different problems. Let’s use Meg Berkobien as an example.  Meg was not engaged by her original research into 19th-century Catalan-language periodicals.  As the article said, “What excited her was political organizing and mobilizing her translation expertise outside academe.” The department let her change her research topic to her translation work outside academia. They also changed the format of her dissertation.  Did the department have to do both?  Why couldn’t they have let Meg do a research project about her translation work outside academia while still writing a traditional dissertation?

Over the years, I have met many graduate students who have complained about their research projects.  There was an English lit major who wanted to study a 20th-century science fiction writer; the student’s advisors said no because science fiction wasn’t scholarly enough.  There was a biology student who wished to understand society’s comprehension of science; the student was told that it was not scientific enough.  I know an engineering student who wanted to understand how engineering impacted government policy; their advisor told them the department didn’t care.

In the end, these three students and many others left school.  In these cases, the problem was not with the dissertation but with what was considered “scholarly” research.  However, it seems to me that almost any topic can be a research project, especially if we truly believe that all knowledge is worthwhile.  Do books have to be 100, 200, or 400 years old to be worthy of research?  Isn’t it worthwhile to understand the best way to communicate scientific information?  The dissertation does not have to change to let in new and modern research questions.

The other reason given to change the dissertation is that it does not adequately prepare a student for work outside of academia.  While it is undoubtedly vital to train people so that they can be happy, contributing members of society, we also need to train people for jobs in academia and research.  Part of the problem is over-enrollment in graduate programs, coupled with schools not being transparent about job prospects.  I have had several faculty members tell me the only reason their departments enroll the number of graduate students they do is to fill graduate teaching positions, not because they need them.

While schools should be aware of student futures and provide their prospective students with realistic expectations, instead of changing the dissertation, why not allow students to create additional projects or participate in internships that complement and enhance their graduate experiences?

The last issue, brought up by Dr. Smith and Dr. Lewis, is that the current dissertation model inhibits the types of research and questions that students can ask. These are good reasons to consider changes to the dissertation.  If a change to the structure of the dissertation improves students’ ability to do research or opens up new kinds of research, then we should make that change.

While continuing to do something simply because we have always done it that way is foolish, it is equally foolish to change something because of problems with something else.  It is still worth looking for better ways to do things.  Just because something is not a perfect fit for everything doesn’t mean it should be changed.  After all, there are things for which a PhD is ideal.  As time and society change, schools will undoubtedly have to adapt to provide an educated society. However, as I have said before, perhaps the appropriate switch is to create a new degree, not to edit the old degree out of existence.

Thanks for Listening to My Musings
The Teaching Cyborg

PS. In case you think rose-tinted glasses biased my opinion, I hate my dissertation.  Not just because the company my school used to print and bind the digital files did such a horrible job; the entire document looks like a bad copy produced on a low-quality copy machine.

I suppose what gets me is that while I was worried about writing a document that large, I had a plan and was looking forward to creating the pseudo-book.  I had a story to tell: present the background, which showed where there were holes in our knowledge; then develop the experimental methods to address the gaps; finally, show how my data added to the models and led to new questions for future research.  Instead, my department wanted a catalog of every single experiment I did.  In the end, I felt like “my” dissertation belonged more to my committee than it did to me.

Misconceptions in Cell Biology

“Every living thing is made of cells, and everything a living thing does is done by the cells that make it up.”
L.L. Larison Cudmore

Cells are the building blocks of all biology.  Every living organism is composed of cells.  All cells come from preexisting cells.  If you are a trained biologist, you recognize the last two sentences as The Cell Theory, one of the core theories of modern biology.  A lot of The Cell Theory seems basic given what we know today.  However, remember that cells are smaller than the naked eye can see.  Until the invention of the microscope, we didn’t even know cells existed.  The word cell was first used by Robert Hooke in the 1660s while examining thin slices of cork.  Hooke used the word cell to describe the structures he observed because they reminded him of monks’ rooms.

Additionally, it wasn’t until Louis Pasteur’s famous swan-necked flask experiment in 1859 that the idea of spontaneous generation, life spontaneously arising from organic material, was disproven.  Therefore, every cell must come from a preexisting cell. Given the importance of The Cell Theory, it is not surprising that students spend a lot of time learning about the structure, function, and behavior of cells.  However, because cells are not visible to the naked eye, it is also not surprising that many students have misconceptions about cells.

What is a misconception? Scientific misconceptions “are commonly held beliefs about science that have no basis in actual scientific fact. Scientific misconceptions can also refer to preconceived notions based on religious and/or cultural influences. Many scientific misconceptions occur because of faulty teaching styles and the sometimes-distancing nature of true scientific texts.”  When we teach students biology, how good are we at dealing with misconceptions?  The critical questions are what students’ misconceptions are and how we deal with them.

Musa Dikmenli looked at the misconceptions held by student teachers in his article Misconceptions of cell division held by student teachers in biology: A drawing analysis.  In the study, Dikmenli examined 124 student teachers’ understanding of cell division.  According to the study, these student teachers “had studied cell division in cytology, genetics, and molecular biology, as a school subject during various semesters.”  Therefore, the student teachers had already studied cell division at the college level.

At a basic level, cell division is the process of a single cell dividing to form two cells.  Scientists organize cell division (the cell cycle) into five phases: Interphase, Prophase, Metaphase, Anaphase, and Telophase.  The cell cycle is often depicted as a circle.

Figure of the cell cycle at different levels of detail. Created by PJ Bennett

Instead of answering quiz questions or writing essays, the students were “asked to draw mitosis and meiosis in a cell on a blank piece of A4-sized paper. The participants were informed about the drawing method before this application.” (Dikmenli) The use of drawing as an analysis method has several advantages.  The most important is that it can be used across languages and with students of multiple nationalities.

Analysis of the drawings showed that almost half of the student teachers had misconceptions about cell division.  Some of the most common misconceptions concerned when DNA synthesis occurs during mitosis and the ploidy, the number of chromosome copies, during meiosis.  The results mean that individuals who are going to teach biology at the primary and high school levels are likely to pass their misconceptions along to their students.

So, where does the problem with student misconceptions start?  Students learn misconceptions about cell division from their teachers.  However, those teachers all have biology degrees from colleges, so their college faculty failed to address the misconceptions. Perhaps, though, we are not asking the correct question. Instead of trying to decide who, K-12 or college, is responsible for correcting student misconceptions, we should ask why students get through any level of school with misconceptions at all.

I can hear all the teachers now: obviously, students get through school with misconceptions because misconceptions are difficult to correct. However, we know a lot about teaching to correct misconceptions.  Professor Taylor presents one method, refutational teaching, in the blog post GUEST POST: How to Help Students Overcome Misconceptions.  With a quick Google search, you can find other supported methods.  In all cases, to overcome a misconception, the student must actively acknowledge the misconception while confronting countering facts.

It is unlikely that the problem is that it is hard to teach to misconceptions; let’s be honest, most teachers at any level are willing to use whatever techniques work.  No, I suspect the real problem is that most teachers don’t realize their students have misconceptions. The real question, then, is why instructors don’t realize students have misconceptions.  In this case, I suspect it is the method of assessment.

Most classroom assignments and assessments ask the students to provide the “right” answer.  The right answer is especially prevalent in large lecture classes where multiple-choice questions are common.  However, the fact that a recent review article, A Review of Students’ Common Misconceptions in Science And Their Diagnostic Assessment Tools, covers 111 research articles suggests that identifying misconceptions is not complicated if teachers use the correct methods.  Therefore, incorporating the proper assessment methods alongside teachers’ standard methods will help teachers identify student misconceptions.

However, it is not enough to identify misconceptions. The misconceptions must be identified early enough in the course that the teacher can address them.  Finding misconceptions is a perfect justification for course pretests, either a comprehensive one at the beginning of the course or smaller pretests at the start of units.  In an ideal world, pretests would be a resource that departments or schools build, maintain, and make available to their teachers, ideally as a question bank.  Until schools provide resources to identify misconceptions, think about adding a pretest to determine your students’ misconceptions.  It will help you do a better job in the classroom.

Thanks for Listening to My Musings
The Teaching Cyborg

Virtual Education

“There are as many applications for VR as you can think of, it’s restricted by your imagination.”
John Goddard

Virtual reality is an exciting technology.  For the last several years, numerous articles have discussed Virtual Reality (VR) as an emerging technology.

What makes virtual reality interesting is that, for an emerging technology, VR is quite old.  While the term virtual reality was coined in 1987 by Jaron Lanier, the devices and ideas at the core of the technology can be traced back to 1935 (The Very Real History of Virtual Reality (+A Look Ahead)).  Therefore, VR is more than 80 years old, though the first working example didn’t appear until 1957.

One of the first custom-built educational VR programs I encountered was Boise State University’s Virtual Reality Nursing Simulation with Custom Haptic System for Patient Safety at the 2015 WCET (WICHE Cooperative for Educational Technologies) conference. The system was designed as a supplement to, or replacement for, nurse training with expensive medical manikins.  Additional studies showed that students who used the VR system had pass rates on practical skills tests comparable to those of students who used the manikins.

While we have seen a few of these educational VR programs developed over the last couple of years, a recent Chronicle of Higher Education article, Virtual Reality Comes to the Classroom, presents a different approach (the article is also available here).   Nhora Lucía Serrano added VR to her literature course at Hamilton College.  Professor Serrano’s students designed a virtual environment based on the novels they read.  The students used Unity and Tinkercad to build their virtual worlds.

Unity is a game engine; as Unity says, “A game engine is a software that provides game creators with the necessary set of features to build games quickly and efficiently.”  Unity also has a personal version that is free if you make less than $100K a year on your Unity projects.  Tinkercad is a free 3D modeling program. These two tools give students and faculty the ability to create and modify 3D objects and then build a VR environment.

Professor Serrano’s use of VR in the classroom reminds me of video essays.  Most people are probably familiar with them: the idea behind a video essay is to take the analytical structure of an essay and build a video instead of a written piece.  It is probably only a matter of time before we see someone try to create a VR essay.  However, we do need to be careful that we don’t run to VR simply because we are attracted to the shiny new thing.

As the Chronicle article says, “But what is the pedagogical value of a virtual or enhanced experience? Just because students may like it, does that mean they will learn more than they would through a simple computer program or a textbook and lecture?” Pedagogy is essential; we need to use technology to solve problems.  However, the question “does that mean they will learn more than they would through a simple computer program or a textbook and lecture?” is not really the correct question.

There is nothing wrong with using technology, even if the outcomes are the same as with a “simple computer program or a textbook and lecture.”  If the new technology is more accessible, more engaging, easier to use, or more cost-effective, then there is nothing wrong with using it.  Additionally, even if the outcomes are the same, there is value in using a tool that engages students differently.  There is always something to be said for using different approaches to relieve monotony and potentially engage a broader audience.

While Professor Serrano’s VR project appears to have been engaging and quite successful, it is possible, even likely, that the learning gains came not from VR itself but from students having another way to access and think about the material.  It is quite possible that the pedagogical advantage of VR won’t be, strictly speaking, derived from the virtual world.  It is more likely that the benefits of VR will be the ability to do things that would otherwise be impossible or prohibitively costly.

As an example, it would be impossible to visit all the locations discussed in a course on the history of Western civilization.  Even if it were possible to travel to all the places in the time frame of a single semester, the cost would be prohibitive.  High-fidelity VR recreations would give students the ability to see and explore these sites.  Likewise, it would be impossible for every student in an architecture class to build a multimillion-dollar building in real life.  However, in VR, not only could they build the building, other students and faculty could walk around and explore the structure.  As another example, in an astronomy class, VR would make it possible for students to stand on the surface of the Sun or Mars.

While it is possible we will develop a VR pedagogy, it is also important to remember that sometimes a tool is just a tool.  We don’t talk about the pedagogy of the hammer, yet it is an essential tool in building a set for a theater production or collecting a rock sample in geology.  Whether or not we develop a “pedagogy of VR,” and whether it’s better than existing technologies, there is always a place for tools that let us do the otherwise impossible.

Thanks for Listening to My Musings
The Teaching Cyborg

Much Ado about Lectures

“Some people talk in their sleep. Lecturers talk while other people sleep”
Albert Camus

The point of research is to improve our knowledge and understanding.  An essential part of research is the understanding that what we know today may differ from what we know tomorrow; as research progresses, our knowledge changes.  Conclusions changing over time does not mean the earlier researchers were wrong. After all, the researchers based their conclusions on the best information they had available at the time.  However, future researchers have access to new techniques, equipment, and knowledge, which might lead to different conclusions.  Education is no different. As research progresses and we get new and improved methods, our understanding grows.

Out of all the topics in educational research, the most interesting is the lecture.  No subject seems to generate as much pushback.  A lot of faculty seem to feel the need to be offended on the lecture’s behalf.  Anyone who has trained at a university and received a graduate degree should understand that our understanding changes over time.  Yet no matter how much researchers publish about the limited value of the lecture in education, large numbers of faculty insist the research must be wrong.

I suspect part of the pushback about lectures is that lecturing is what a lot of faculty have done for years.  If they accept that the lecture is not useful, then they have been teaching wrong.  Faculty shouldn’t feel bad about lectures; after all, lecturing is likely what they experienced in school.  I think it is faculty members’ own experience with lectures as students that leads to the problem.  Over the years, multiple faculty have told me some version of the statement, “The classes I had were lectures, and I learned everything, so lectures have to work.”

The belief that you learned from lectures when you were a student is likely faulty.  The reason this belief is defective is that you have probably never actually had a course that was exclusively a lecture course.  I can hear everyone’s response as they read that sentence: “What are you talking about? As a student, most of my classes were lectures.  I went into the classroom, and the teacher stood at the front and lectured the whole period. So, of course, I have had lecture courses.”

Again, I don’t think most people have ever had an exclusively lecture course. Let’s break down a course and see if you really can say you learned from the lecture.  First, did your course have a textbook or other reading assignments?  Just about every course I took had reading assignments.  In most of my classes, I spent more time reading than I spent in class listening to the lecturer.  Most of my courses also had homework assignments and written reports.  Many of the courses also had weekly quizzes and one or two midterms where we could learn from the feedback.

Can you honestly say that in a lecture course you didn’t learn anything from the course readings?  That you didn’t learn anything from the homework assignments and papers?  That you didn’t learn anything by reviewing the graded homework assignments, papers, quizzes, and midterms?  The truth is, even in a traditional lecture course, there are lots of ways for students to learn.  As a student, it is next to impossible to determine how much you learned from any one thing in a course.  So, with all these other ways to learn in a “lecture” course, can you honestly say you learned from the lecture?  In truth, the only way to have a course where you could say you learned from the lecture is a course with only a lecture and a final: no readings, no assignments, no exams with feedback, only a lecture.

However, there is an even deeper issue with the lecture than faculty insisting it works (without any real evidence).  As faculty members, what should our goal as teachers be?  It is quite reasonable to say that anyone teaching at a college, university, or any school should attempt to provide the best learning environment they can.  So, even if we accept the argument that students can learn from, let’s call it, a traditional lecture (I don’t), if the research says there is a better way to teach, shouldn’t we be using it?

If faculty approach teaching based on what is the best way to teach, it does not matter whether students can learn from lectures; if there is a better way to teach, we should use it.  The research says we should be using active learning when we teach because it benefits the students.  A recent article from PNAS, Active learning increases student performance in science, engineering, and mathematics, shows that students in classes that don’t use active learning are 1.5 times more likely to fail the course.  At a time when universities and the government are pushing for higher STEM graduation rates, active learning would make a big difference.

So how much of a problem is the lecture?  I know a lot of faculty who say they use active learning in their classrooms.  In a recent newsletter from the Chronicle of Higher Education, Can the Lecture Be Saved?, Beth McMurtrie states, “Most professors don’t pontificate from the moment class starts to the minute it ends, but lecturing is often portrayed that way.”

However, a recent paper in the journal Science, Anatomy of STEM teaching in North American universities, might refute this statement.  The Science paper shows, at least in the STEM disciplines, that when classroom teaching methods are observed rather than reported by survey, 55% of all the courses observed are traditional lectures.  Only 18% of the courses are student-centered active learning environments.  The rest have some amount of active learning.

Regardless of whether you think the lecture works, it is long past time to change.  There is no reason to feel ashamed or think poorly of faculty who used lectures in the past.  After all, for a lot of reasons, lectures were believed to work.  However, we are also long past the time when anyone should be offended on the lecture’s behalf.  We need to use the best teaching methods currently available.  The best methods are the techniques called active learning, because with them students measurably learn better than in a traditional lecture.

Thanks for Listening to My Musings
The Teaching Cyborg

In Education What Does It Mean to Be Competent?

“What you know is more important than where or how you learned it.”
Tina Goodyear

While competency-based education (CBE) has been part of US education for 40 or 50 years, interest has been increasing over the last couple of years.  A faculty member I was working with once described a problem he was having at his school.  He worked with a system of schools that used a common course system across all their campuses.  A common course system can solve a lot of issues. It allows students to transfer between schools smoothly.  It also lets the system office negotiate guaranteed transfer agreements with other universities for all the schools in the system, rather than each school having to negotiate individual transfer agreements.

However, how his system maintained its common course system was causing problems.  The system office maintained a central list of the learning outcomes for the common courses.  When a school taught a class, it only needed to teach 80% of the outcomes on the common list.  If the common list had 26 learning outcomes, a school only needed to teach 21 (80% of 26 is 20.8). Faculty could skip five of the learning outcomes on the common course list.

To pass a common course, the student must earn at least a C (70% of the learning outcomes taught).  That means a student can pass while learning only 15 (70% of 20.8 is 14.56) learning outcomes.  Therefore, a student can pass without knowing 11 of the 26 learning outcomes on the system’s core list. Taken to the extreme, two students from different schools who both earned a C and transferred to the same school might have only four learning outcomes in common.
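The arithmetic above is easy to check with a few lines of Python; this is just a sketch of the worst case, using the blog’s own figures (26 outcomes, 80% taught, 70% learned) and inclusion-exclusion for the minimum possible overlap:

```python
# Worst-case overlap of learning outcomes under the common-course rules
# described above. Numbers come from the example in the text.
COMMON_OUTCOMES = 26   # outcomes on the system's central list
TAUGHT_FRACTION = 0.80 # a school must teach 80% of the list
PASS_FRACTION = 0.70   # a C means mastering 70% of what was taught

taught = round(COMMON_OUTCOMES * TAUGHT_FRACTION)                    # 21 outcomes taught
learned = round(COMMON_OUTCOMES * TAUGHT_FRACTION * PASS_FRACTION)   # 15 outcomes learned

# Two schools may skip a *different* five outcomes, and each student may
# miss a different subset of what was taught. By inclusion-exclusion, the
# smallest possible intersection of two students' learned outcomes is:
worst_case_overlap = max(0, 2 * learned - COMMON_OUTCOMES)

print(taught, learned, worst_case_overlap)  # 21 15 4
```

That last number is the “four learning outcomes in common” from the extreme case: two C students can share as few as 2 × 15 − 26 = 4 outcomes.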

The committee my friend was working with suggested implementing competency-based education as a solution to the problems with their current common course system.  I asked how they were planning to implement CBE.  He answered, “Well, we already have learning goals; all we need to do is turn them into competencies, then modify our assessments a little, and we will be doing CBE.”

I remember asking, “Are you changing how you assign grades?  If you’re not changing the grades, are you going to change your transcripts?”  If you don’t make changes like “grading” differently or listing mastered competencies on your transcripts, you will still have the same problem. A lot of people who are trying to jump on the CBE bandwagon are just rephrasing their learning goals into “competencies.”

Implementing CBE requires changes to the whole system.  One of the core ideas behind competency-based education is that, given enough time, most people can master any concept.

“Supporters of mastery learning have suggested that the vast majority of students (at least 90%) have the ability to attain mastery of learning tasks (Bloom, 1968; Carroll, 1963). The key variables, rather, are the amount of time required to master the task and the methods and materials used to support the learning process.” (How did we get here? A brief history of competency‐based higher education in the United States)

This one idea turns the current educational system on its head.  In most schools, students’ progress is measured by the number of credits earned.  Students earn credits by passing a class.  If a student passes the class, they get the credits whether they earn an A or a C. Institutions assign the number of credits to a course based on the number of hours the course meets.  This system, the Carnegie Unit, was established over a century ago by the Carnegie Foundation.  Therefore, students earn credits based on time.

However, the Carnegie Unit, or Credit Hour, was initially created as part of a program to determine eligibility for the Carnegie pension plan (known today as TIAA-CREF).

“To qualify for participation in the Carnegie pension system, higher education institutions were required to adopt a set of basic standards around courses of instruction, facilities, staffing, and admissions criteria. The Carnegie Unit, also known as the credit hour, became the basic unit of measurement both for determining students’ readiness for college and their progress through an acceptable program of study.” (The Carnegie Unit: A Century-Old Standard in a Changing Education Landscape)

While the Carnegie Unit brought standardization to a nascent US educational system, it is possible, if not likely, that we have become too focused on what is easily measured, like the Carnegie Unit.  In the CBE system, students earn credits based on mastery of concepts. Therefore, students take as much or as little time as they need to master concepts and move forward at the pace that best suits them.  CBE makes the information learned the central component used to earn credits, not the length of time spent in a course.

Beyond restructuring the educational experience to focus on mastery, there are questions about assessments.  It is not merely a matter of rewording learning goals into competencies.  Course designers build competencies around what students should be able to do.  The assessments must be carefully thought out to match the desired outcome and then ascertain whether the student has mastered the competency.  While the process of assessment creation is involved, the fact that schools like Western Governors University and the University of Wisconsin’s Flexible Option program are using CBE can provide examples and a knowledge pool for developing new programs.

I don’t know if most of the educational system will adopt CBE.  The changes needed to the standard system are enormous. After all, if students can learn at their own pace, semesters and time-to-degree will have to be rethought. However, the thought of competency-based education shifting the focus back to learning over sorting is appealing.  The CBE system could also help alleviate student frustration over a course moving too slowly or too fast, leading to higher completion rates.  In the long run, I suspect the degree to which CBE is adopted will depend mostly on the success of the institutions currently leading the way.  Regardless of the success or failure of CBE, it will be fun to follow the developments in CBE over the next several years.

Thanks for Listening to My Musings
The Teaching Cyborg