Interview with Bodong Chen of University of Minnesota
Professor Bodong Chen teaches courses in technology and ethics, as well as learning analytics, at the University of Minnesota. He has great insight into how data analytics and the Yellowdig platform can be used together to capture and measure students’ ability to critique and learn from one another through virtual conversations and posts. This is a discussion at the forefront of education, data analytics, linguistics, and psychology.
On how he uses Yellowdig in his course:
I used the Canvas discussion forum and other discussion boards before, but I found that the discussion relied on me too much. Every time, students would expect me to post a question, and then everybody would reply to me instead of replying to others. I think Yellowdig has been tremendously helpful in transforming the discussion to be more student-centric. [The students] are in a community and they are talking with each other without too much reliance on me, so I think it’s a powerful tool for discussion.
Every week when I review their discussions, there are different threads that are very vibrant. There is a lot of participation.
Have your approaches to Yellowdig changed over the semester?
My class was really structured on a weekly basis. Every week there was a discussion, so there were no changes that I noticed across different weeks.
There are definitely some exceptional students who posted more, and I used the Yellowdig points, which are very motivational for students. And there are some students who would always exceed expectations, posting more and also posting content with a lot of cognitive presence. So not just replying to my reflective questions, but going beyond and talking about things they thought about.
And going beyond that with data analytics, we are also going to analyze their constructive criticism when replying to each other. I want them to be more critical with each other, but also constructive. And that’s something I noticed when reading some of these replies: there are some students who can do that. But it’s really hard because this is a purely online course. They don’t meet with each other. They don’t meet with me. There’s less trust that the discussion can draw on, and that’s something I am relying on Yellowdig for: to build a community of trust, which I expect to support more in-depth discussions.
What would be a good way to measure constructive criticism? Are there key buzz words you are looking for in a post?
There are some established coding schemes that researchers have been developing, which focus on e-listening and e-speaking--so [the students] need to listen to each other and also speak with each other. And in the speaking category, there are certain coding schemes that focus on constructive criticism. Earlier, when I read student replies, they always said, “I agree with you,” and there’s less discussion after that. So, like you said, there are some words that I am looking for, like “I agree with you but this is my opinion…” These kinds of linguistic features are something that we will depend on when trying to analyze the data.
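As a rough illustration of the kind of linguistic features described here, one could flag replies that pair agreement with an extension or counterpoint. This is only a sketch; the marker phrases below are assumptions for illustration, not an established coding scheme:

```python
import re

# Hypothetical marker phrases: agreement followed by a qualifier
# suggests a constructive reply rather than bare agreement.
AGREEMENT = re.compile(r"\bI agree\b", re.IGNORECASE)
EXTENSION = re.compile(
    r"\b(but|however|although|on the other hand|my opinion|have you considered)\b",
    re.IGNORECASE,
)

def classify_reply(text: str) -> str:
    """Label a reply as 'constructive', 'bare agreement', or 'other'."""
    if AGREEMENT.search(text):
        return "constructive" if EXTENSION.search(text) else "bare agreement"
    return "other"

print(classify_reply("I agree with you, but this is my opinion..."))  # constructive
print(classify_reply("I agree with you!"))                            # bare agreement
```

A pattern list like this would only be a starting point; the coding schemes mentioned in the interview are far richer than a few regular expressions.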
So you’re trying to create a data system that’s more linguistically focused and will parse the language--the words used in each post and in the comments?
That might be a long-term goal. For this course that I taught, we may use human coders--I have graduate assistants and we will code it. The next step would be to train machine learning algorithms based on that coding. If the algorithm can pick up constructive criticism in posts, that would be great. That’s something we’re going to explore, and maybe even, long term, have a research assistant work with Yellowdig so that the system can pick up those posts automatically.
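The workflow described here--human coding first, then training a classifier on the coded posts--might be sketched in miniature with a bag-of-words Naive Bayes classifier. The example posts and labels below are invented for illustration, not real coding data:

```python
from collections import Counter, defaultdict
import math

def tokenize(text):
    return text.lower().split()

def train(examples):
    """examples: list of (text, label) pairs from human coders.
    Returns per-label word counts, label counts, and the vocabulary."""
    word_counts = defaultdict(Counter)
    label_counts = Counter()
    for text, label in examples:
        label_counts[label] += 1
        word_counts[label].update(tokenize(text))
    vocab = {w for c in word_counts.values() for w in c}
    return word_counts, label_counts, vocab

def predict(text, word_counts, label_counts, vocab):
    """Pick the label with the highest log posterior, using add-one smoothing."""
    total = sum(label_counts.values())
    best, best_score = None, float("-inf")
    for label in label_counts:
        score = math.log(label_counts[label] / total)
        denom = sum(word_counts[label].values()) + len(vocab)
        for w in tokenize(text):
            score += math.log((word_counts[label][w] + 1) / denom)
        if score > best_score:
            best, best_score = label, score
    return best

# Invented, hand-coded training posts
coded = [
    ("i agree but have you considered the evidence", "constructive"),
    ("i agree however this claim needs support", "constructive"),
    ("i agree with you", "agreement"),
    ("i agree great post", "agreement"),
]
wc, lc, vocab = train(coded)
print(predict("i agree but the evidence is weak", wc, lc, vocab))  # constructive
```

In practice this would need far more coded data than four posts; the point is only the shape of the pipeline: code by hand, train, then let the model flag candidate posts.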
Machine learning--from my understanding--primarily involves storing large databases’ worth of previous data and then picking up patterns in that data to predict, and “learn,” the future. Do you think we have enough data at the moment to accurately create a system using machine learning?
I think you are on point about machine learning--and with learning analytics, a lot of people in education will think more about the learning side: when using data analytics, do you think about student agency? Do you help students to see [the trends]? And if they do see them, they can make better decisions by themselves. So it’s about empowering students to take action instead of using big data sets and algorithms to predict [trends] for students. There are a lot of debates in this community about the power relationship between students, teachers, algorithms, administrators--many different players in the arena. My interest, especially for a system like Yellowdig, which is very community driven and encourages students to post and talk with each other, is in the social dimension of learning, and also in using smaller data sets that will help students make decisions in the discussion context.
On the future of Yellowdig:
What I would like to see is more of the social dimension in Yellowdig: a simple sociogram or social network visualization of student discussions, and identifying students who have not posted at all in the past two weeks. Or the figure could become even richer--maybe I could have a slider to customize the search or visualization. For example, if I increase the threshold to two posts in the past two weeks, then maybe the view expands to three students who were not that active--who posted one [thing] but didn’t really participate closely. Something like that would be very helpful.
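The inactivity filter with an adjustable threshold described here could be sketched as follows, assuming per-student post timestamps are available (the data layout is an assumption, not Yellowdig’s actual API):

```python
from datetime import datetime, timedelta

def inactive_students(posts_by_student, now, min_posts=1, window_days=14):
    """Return students with fewer than `min_posts` posts in the last `window_days`."""
    cutoff = now - timedelta(days=window_days)
    flagged = []
    for student, timestamps in posts_by_student.items():
        recent = sum(1 for t in timestamps if t >= cutoff)
        if recent < min_posts:
            flagged.append(student)
    return sorted(flagged)

now = datetime(2020, 3, 15)
posts = {
    "ana":  [datetime(2020, 3, 10), datetime(2020, 3, 12)],
    "ben":  [datetime(2020, 2, 1)],    # nothing in the window
    "cruz": [datetime(2020, 3, 5)],    # one post in the window
}
print(inactive_students(posts, now))               # ['ben']
print(inactive_students(posts, now, min_posts=2))  # ['ben', 'cruz']
```

Raising `min_posts` plays the role of the slider: at a threshold of one post the filter flags only the silent student, while at two posts it also catches the marginally active one.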
So I guess the first step is identifying the students who are struggling in class, but then the second step is intervening for those students. If they cannot push themselves in class, then maybe Yellowdig can do something, or maybe Yellowdig can give the professor the data to do something. What are your thoughts on that?
I think we need to be very careful about that next step. At some point, we will have to make a decision about which actor will take the next step. It could be a teacher like me, or an assistant that sends an email alert to the student. And then you do some pilots and collect data on how students respond to those interventions. In my research, actually, these kinds of interventions function differently for [students]--so if they regulate themselves mostly with a prevention focus, they are less likely to respond to a suggestion that’s framed around promotion. So there are a lot of factors involved in the next steps.
In a theoretical world, I was thinking that maybe Yellowdig can figure out a way to see the rate at which a student is able to read and comprehend an article. Some students work slower than other students and it’s usually the faster students who are constantly posting but some students get left behind because they are slower at processing the information. So maybe have feeds tailored to learning ability?
Maybe allowing some space for negotiation between the student and the system, and the student and the teacher--that would be great. So treating it like a conversation instead of tagging a student based on their specific learning characteristics.
Some perspective on the ideal data analytics for a learning platform:
I wish to see more reading data or listening data. How do students listen to each other? And we don’t want to focus on just the end product. Like you said, some students work slowly in terms of reading, but maybe they are trying to read as much as possible. Currently, this data is not captured.
Also, there’s a lot of literature we can draw on about how to support discussion and how to encourage learning communities. And the design of the [professor] dashboard: I wish it were not only about data, but more informed by educational theory, design principles, or frameworks for educational discussion. I think that is how Yellowdig data analytics could distinguish itself from other systems. If Yellowdig can jump ahead [of Canvas] and do something more that’s grounded in the literature and design principles, that would be fantastic.