Buzzwords, Data and the Adaptive Learning Summit


“Advancing Innovations in Adaptive Learning” – that tagline alone explains why I attended the National Education Initiative’s Adaptive Learning Summit in Washington DC a few weeks ago. As a subject matter expert for McGraw-Hill, I have been involved with adaptive learning for a long time, and my company has had a number of contracts to build learning resources for adaptive learning platforms. But I wanted to know not only how others are utilizing and developing adaptive learning technologies, but also where these platforms may be heading in the next few years. This conference provided insight into both.

Here are a few of my major takeaways from a day full of information:

Terminology is Getting in the Way

Is a technology adaptive, individualized, differentiated or personalized? Does it really matter? This was one of the key points of the keynote address by Richard Culatta of the Office of Educational Technology at the Department of Education (for a great review, see this article by Jeff Young at the Chronicle). And the answer is – not really.

I have witnessed the same issue when presenting to faculty – the buzzwords are getting in the way. In fact, in a second article, Jeff Young presents survey data suggesting that the proliferation of buzzwords may actually be increasing both skepticism and confusion among faculty. That is the last thing we need if we are going to advance educational reform.

Basically, we need to drop the jargon, or at least reduce the comparisons between adaptive and differentiated, personalized and competency, and so on, in presentations. At one time, learning was identified as visual, auditory or kinesthetic. We now know that the use of these terms did not help learners directly, since very few of them knew how to assess themselves or apply the terminology to a learning environment. Students tell me all the time, “I am a visual learner,” by which they mean, “I like to watch videos.”

The same thing seems to be happening now with the buzzwords. Richard Culatta provided some nice definitions as a working model for the meeting. What was really interesting was how quickly the meeting stopped focusing exclusively on adaptive learning (as a term) and began to talk about applying these models as ways of developing the learning ecosystem, one that places the student at the center of that environment.

Make Data Serve the Users

As an author and educator, I have been using data from an adaptive learning platform to revise my textbooks and to more effectively flip my classes. As a scientist, I do not think that a single decision should be made about learning, from the level of the student to the highest offices of an institution, without first collecting and analyzing data. Data is our insight into the minds of the learners. At a recent online symposium where I presented, almost all of the presenters demonstrated how the use of data has transformed not only their classrooms, but their teaching philosophies. What was interesting from that meeting was that each of these instructors, without any real formal guidance, had been developing metrics to examine the problems facing them in the class (preparedness, mastery, etc.). The data was the catalyst for change.

The problem, as I see it, is that we want to place this data in a box and analyze it using the same tired educational assessment processes. Often, the presentations I attend offer a model in which, if student X does activity Y using a certain platform, then the instructor can get result Z in the class. While this may solve an immediate problem, it is like buying a Porsche to go to the grocery store – it may work, but you are wasting a lot of potential.

Instead of thinking outside the box, we need to pretend that there is no box at all. Stop looking at how students perform in a single class; start using the data to understand how each student is performing in the entire learning ecosystem.

Let me give you an example. I teach an online course in human genetics. There is a prerequisite of chemistry for this class. The academic enrollment systems are very good at allowing only students who have completed chemistry to enter the class. I can even look up the grade they received in that course. That information is completely meaningless. What would be useful is if students, as they start to engage with the content in the genetics class, were reminded that they struggled with similar content in the chemistry class, and then delivered learning resources to help them with this knowledge deficiency at that specific point in time – not after they have failed an exam.

We need to expand this by showing students how to assess the data coming from the platforms. But before we do, we need to address another core concept that came from the meeting: students are not instructional designers or learning engineers. Sure, this may be the digital generation, but they usually do not understand what the platform is supposed to do, or how to interpret what they have done beyond the immediate satisfaction of an assignment grade. Why? Because they have never been shown how to assess themselves. Can this be done? Yes, if it is an integral part of the learning ecosystem and part of the learning experience from the beginning. Imagine a series of freshman seminars that allow students to understand themselves as learners.

This type of formative assessment, directly tied to learning resources, is the key to the future of ed tech. Does it matter whether that technology is adaptive, personalized or differentiated? Who cares? If the end result is that a student can identify what they need to do to reach their desired level of success in a course, then you can call it anything you want.

The Future and Challenges

There were also some discussions of where this is all heading. One of the more futuristic visions came out of MIT, where Nish Sonwalker’s group is researching a wearable device, in this case a headband, that can read brainwaves (even while you sleep) and then deliver the necessary learning resources. On the more immediate horizon, there was general recognition among the panelists that the learning environment (be it adaptive, individualized or differentiated) needs to progress toward the concept of a learning ecosystem, one that consists not only of students and teachers, but of peer groups and entire classrooms. The technologies that power these new learning ecosystems need to be not only adaptive (in that they respond to what the user needs), but also nimble, intelligent and, most importantly, able to empower student-driven learning.

A few current challenges were also brought to the table. At several points in the meeting, the question of how to get instructor buy-in was raised, along with the general reluctance of faculty to migrate toward a data-driven adaptive model. I have felt that frustration many times, and frequently wondered if I belonged to a relatively rare subset of the educational community. I left this meeting feeling invigorated and hopeful that this is not the case. The intellect, ideas and, most importantly, energy of the individuals at the Summit made me recognize that we are on the right path. It may take some time to get there, and there is a lot of work to be done, but the end result will be worth it.