Tag Archives: education strategy

One New York Times Sunday Review

When I opened the September 13th NY Times Sunday Review recently, I found not only a print version of Annie Paul’s blog post featured in my last post, Is There a Lecture Learning Gap?, but three other articles on higher education as well. While Paul suggested we replace the lecture, a Western cultural form that favors privileged white males, with activity-based learning that benefits everyone, the others took an unquestioning view of American college culture.

In What the Privileged Poor Can Teach Us?, sociologist Anthony Abraham Jack talks about his research on black students in elite colleges. He compared the success of low-income black students who attended private schools (the privileged poor) with low-income black students who hadn’t. He found that the privileged poor had a great advantage because they were comfortable in the dominant American culture that permeates our elite colleges. For example, while the privileged poor will ask for, even demand, extra help when they are confused or behind, non-privileged black students are too embarrassed or uncomfortable to reach out to instructors, so they continue on a downward spiral.

Nicholas Kristof’s piece, From Somaliland to Harvard, is in sync with Jack’s observations. Kristof highlights the journey of Abdisamad Adan, a poor young man from Somaliland, to Harvard this year. Abdisamad attended a private boarding school in Somaliland run by an American, Jonathan Starr. Forty-five students from that high school have already graduated from top US colleges. Kristof’s point in the article is that access to schools like Starr’s is the key to success for children like Abdisamad.

This “if you can’t beat them, join them” attitude toward the prevailing culture in American colleges is assumed by Frank Bruni in his article, Measuring a College’s Value. He looked at data from the Gallup-Purdue Index, which surveyed 30,000 college graduates. According to Bruni, the research shows that

….graduates fared better if, during college, they did any of these: developed a relationship with a mentor; took on a project that lasted a semester or more; did a job or internship directly connected to their chosen field; or became deeply involved in a campus organization or activity….

Developing a relationship with a mentor? Applying for an internship? Plunging into an extracurricular activity? It sounds like you would have to have the cultural confidence of the privileged poor to take advantage of those opportunities.

Bruni concludes that “what college gives you hinges almost entirely on what you give it.” “What you give it” isn’t neutral. It depends on who you are, the circumstances of your childhood, and what culture is familiar to you. So is the student to blame, or should we take a look at American college culture?

Before we give more support to elite schools that mimic a biased, dominant Western perspective, and before we try to make everyone conform to a culture that hasn’t always served everyone well, shouldn’t we question what exists and find solutions for change that support and encourage all students, including those who bring diversity to the college campus?

Formative Assessment in a World of Learning Outcomes

Consider this scenario: You’re teaching language arts to a middle school special ed class. The learning objective is to write a story about making something. While you go through the provided writing sample about children building a clubhouse, your students get more excited about the clubhouse than writing a story. They ask to build a clubhouse. Do you make them write the story or do you let them build a clubhouse first?

If you go with the clubhouse, you’re delaying writing the story and you may not have time to fulfill all the learning objectives embedded in your curriculum. On the other hand, if you decide, as I did, to build your lesson on your students’ spontaneous enthusiasm, you are choosing to write in additional learning objectives involving commitment, collaboration, and problem-solving before writing the story. And, you must alter your teaching plans to achieve them.

My decision was based on formative assessment, or assessment for learning. Paul Black and Dylan Wiliam wrote the classic definition of formative assessment in 1998:

….the term ‘assessment’ refers to all those activities undertaken by teachers, and by their students in assessing themselves, which provide information to be used as feedback to modify the teaching and learning activities in which they are engaged. Such assessment becomes ‘formative assessment’ when the evidence is actually used to adapt the teaching work to meet the needs.

This definition holds true for higher education even though Wiliam’s continuing work is with teachers in K-12. He emphasizes that many strategies can be successful as long as we remember “the big idea is to use evidence-based learning to adapt instruction to meet student needs.”  I encourage you to watch his exceptional talk, Assessment for Learning, below:

Education technology offers us valuable tools for assessment. Evidence-based programs can quickly adapt instruction based on feedback from student learning. These programs also help instructors alter their class instruction because aggregate data is available in real time (see my earlier blog, What’s a Seventeen-Year-Old to Do?).

But there is a downside. Like all effective formative assessment, adaptive learning programs tie instruction and feedback to learning outcomes, but in adaptive programs those outcomes are predetermined. Formative assessment means changing student learning pathways: more material for a struggling student, less for an excelling student. But all pathways lead to the same goal.

The movement for student competencies and consistency in higher education also rests on predetermined learning outcomes. While these trends have merit, we need to be cautious and not let them entrench us in rigid practices that deter instructors from going “off-script” and tapping into students’ enthusiasm and innovative ideas, which are also worthwhile in the learning environment. (When you look back, isn’t it the off-script instructors who influenced you the most?)

As we develop and use technology to get more precise evidence-based snapshots of student progress, we need to build in flexibility so that formative assessment based on student feedback can modify learning outcomes as well as learning pathways.

If You Give a Student a Cell Phone…

Originally posted by Rochelle Diogenes on Acrobatiq.

With the increase in digital distractions, interest in how we pay attention has grown. Although researchers continue to delineate definitions, most agree with the early psychologist William James:

Everyone knows what attention is. It is the taking possession by the mind, in clear and vivid form, of one out of what seem several simultaneously possible objects or trains of thought. Focalization, concentration of consciousness are of its essence. It implies withdrawal from some things in order to deal effectively with others.

Attention is really selective attention.  We consciously or automatically choose which things to ignore and which to focus on. You are more likely to pay attention to something that affects you, interests you, or has deep meaning.

What we pay attention to is contextual and subjective. At a play, we think it’s important to focus on what’s happening on the stage without distraction. If an 8-year-old points out that there’s a man behaving oddly in the next row, he will probably get shushed. But these days, if he makes the same observation as his mother rushes him to catch a train or plane, Mom will probably pay attention and report it to security personnel.

Attention is the gateway to learning, to remembering and processing information. Instructors competing for student attention isn’t new. Remember when we thought all students were taking notes, but many were doodling, or writing love letters, or passing notes to other students? Remember when daydreaming was a common class distraction?

Cell phones may just be a more efficient way of channeling wandering attention. Researchers have shown that students who texted or posted on their cell phones while watching a video lecture tested more than a grade level below their phoneless counterparts. They suggested that instructors discuss cell phone use policies with their students. That’s a start, but it doesn’t get at the contextual factors that may contribute to cell phone distractions.

If, as the Pew Research Center reports, 93% of 18- to 29-year-old smartphone owners use their phones to avoid being bored, maybe we should consider that having students listen to long lectures is not the best way to hold their attention. Even I’ve been known to check my cell phone during the most inspirational TED Talk.

Distraction can work in the opposite way as well. A student who tunes out biology to check Instagram may also avoid the boredom of waiting in line at Chipotle by accessing their course online. And, with Acrobatiq, the professor standing behind them can evaluate their students’ progress.

While helping students think about how and when they use cell phones, educators need to expand opportunities for students to use mobile devices to accomplish a wide variety of goals, from communication to graduation. Formats such as blended or hybrid classes using digital learning platforms can lessen student distraction. Some instructors are already incorporating education apps into class time as part of the curriculum.

Mobile device programs will not replace all forms of teaching. They are meant as an active way to promote student learning by using the technology around us.

What’s a Seventeen-Year-Old to Do?

Originally posted on Acrobatiq and cross-posted with permission.

When co-founder Larry Page recently announced Google’s reorganization, he referred to the company as “still a teenager.” Incorporated in 1998, Google is 17 years old, making Page and co-founder Sergey Brin 17-year-old businessmen.

Like 17-year-olds, they are tired of routine and want the flexibility to do things they like doing.  Restructuring Google frees them from the mundane tasks of running a large company. As Page writes, it allows them “to do things other people think are crazy but we are super excited about.”

While the Google founders are getting what they want, most 17-year-olds getting ready for college will not. According to Remediation—Higher Education’s Bridge to Nowhere, Complete College America’s study of public institutions in 33 states including Florida, Illinois, Massachusetts, and Texas, over 50% of students starting college or community college test into remedial courses.  In California, 68% of students entering the state college system test into remediation.

These courses rehash high school material. To adapt a University of Chicago motto, this is where excitement and freedom come to die. The proof is in how many people don’t go on to graduate. According to the study, about 40% of community college students don’t even complete their remedial courses. Less than 10% of community college students who finish remedial classes graduate within 3 years. Only about 35% of those who start in remedial courses at four-year colleges obtain a degree within 6 years.

Factor in the cost of remedial courses, courses that do not count towards a degree, and you have a formula for failure. To make things even worse, placement in remediation is often based on one test that researchers say is a poor indicator of student potential.

To address this situation, educators are calling for reforms. Some states are giving remedial students the choice of taking remedial courses or enrolling directly in regular college courses. Others are eliminating remedial courses as prerequisites for regular courses, allowing students to take them as co-requisites instead. Technology offers other ways to support students with weak skills in regular college courses.

Digital adaptive learning programs can be adopted for all students in college courses. They address individual student abilities at every level, in effect containing what might be called “embedded co-requisites,” academic support where it is needed.

As students learn, these self-paced digital programs assess their abilities, tailoring the material so that they get the most out of their time in the course. For example, if a student shows non-mastery in identifying evidence for a historical argument after reading a text section, the program might give them more explanation, activities, or video on that subject. On the other hand, if a student shows mastery after one text section, they can immediately move forward.
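To make that branching concrete, here is a minimal sketch in Python of the kind of mastery check such a program might run. The 0.8 threshold, the skill name, and the remediation resources are illustrative assumptions on my part, not any particular product’s algorithm.

```python
# A minimal, hypothetical sketch of mastery-based branching in an adaptive course.
# The 0.8 threshold, skill name, and remediation resources are illustrative
# assumptions, not any specific vendor's logic.

MASTERY_THRESHOLD = 0.8  # assumed cutoff for treating a skill as mastered

REMEDIATION = {
    "identify-historical-evidence": [
        "extra explanation: primary vs. secondary sources",
        "activity: tag the evidence in a sample essay",
        "video: building an argument from documents",
    ],
}

def next_step(skill, score):
    """Return the material a student sees next for one skill."""
    if score >= MASTERY_THRESHOLD:
        return ["advance to the next section"]  # mastery: move forward
    # non-mastery: serve more explanation, activities, or video on that skill
    return REMEDIATION.get(skill, ["review the section and retry the check"])

print(next_step("identify-historical-evidence", 0.55))  # remediation path
print(next_step("identify-historical-evidence", 0.90))  # moves forward
```

The point of the sketch is simply that the pathway varies while the target skill stays fixed, which is what lets a struggling student and an excelling student spend their time differently in the same course.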

Adaptive programs also give instructors data on how each student is progressing in real time. Based on this data, instructors can reach out to students while they are learning instead of after they have failed a test.
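The instructor-facing side can be just as simple in outline: roll up each student’s live checkpoint scores and flag who to contact while the unit is still underway. The sample data, layout, and 0.6 flag threshold below are assumptions for illustration, not a description of any specific dashboard.

```python
# Hypothetical sketch of the instructor-facing idea: aggregate live checkpoint
# scores and flag students to reach out to before, not after, a failed test.
# The data, layout, and 0.6 flag threshold are illustrative assumptions.

from statistics import mean

progress = {  # student -> scores on this unit's checkpoints so far
    "Ana":   [0.90, 0.85, 0.80],
    "Ben":   [0.50, 0.45],
    "Chloe": [0.70, 0.75, 0.90],
}

def students_to_contact(scores_by_student, threshold=0.6):
    """Return students whose running average falls below the flag threshold."""
    return [name for name, scores in scores_by_student.items()
            if scores and mean(scores) < threshold]

print(students_to_contact(progress))  # -> ['Ben'], while the unit is underway
```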

As students succeed at their own pace in courses that count, they will have greater motivation to complete their degrees. They might even enjoy the experience.

Flipping the Curriculum

When I realized I was a technology immigrant trying to learn Technology As A Second Language℠ (see previous blog TSL℠: Do You Speak It?), I hadn’t yet read Marc Prensky’s two amazing articles,  Digital Natives, Digital Immigrants and Do They Really Think Differently? Part II.

Turns out we are on the same wavelength. In 2001, Prensky had already defined digital natives as those who have grown up with technology and digital immigrants as those who have not. While my focus has been on how to educate digital immigrants, his focus has been on how to educate digital natives.

According to Prensky, what makes this digital immigrants/digital natives situation unique is that while traditional immigrants learn the prevailing culture from natives, in education the situation is reversed. Digital immigrants are trying to teach digital natives how to succeed in an increasingly digital world, one that students have a better grasp of than teachers.  Prensky believes that this is “the single biggest problem facing education today.”

There is only a slim, slow chance of solving this problem if digital immigrants continue to obsess about “negative” effects of technology.  Steven Pinker describes the hysteria:

Media critics write as if the brain takes on the qualities of whatever it consumes, the informational equivalent of “you are what you eat.” As with primitive peoples who believe that eating fierce animals will make them fierce, they assume that watching quick cuts in rock videos turns your mental life into quick cuts or that reading bullet points and Twitter postings turns your thoughts into bullet points and Twitter postings.

Even if this were true, we would still have to deal with it. If we are going to help the next generations, we need to stop wasting time lamenting basic truths:

  1. Technology has changed everything and will continue to do so.
  2. Using technology rewires the brain and changes the way people think just as driving cars and working in offices changed the brains and thinking of people who rode in buggies and farmed.
  3. Technology means we don’t have to remember as much information as we used to.

Prensky accepts these premises. In his 2014 article, The World Needs a New Curriculum, he explains that the core subjects (math, science, language arts, and history) are just “proxies” for teaching what most agree is needed to succeed, such as informed thinking, acting, and communicating. We expect students to learn the latter even though they are rarely taught directly.

According to Prensky, modern times require flipping the curriculum to teach what we value most up front, using subject matter where it fits. He proposes four basic subjects: Effective Thinking, such as critical thinking, mathematical thinking, design thinking, and problem solving; Effective Actions, such as mindset, grit, and entrepreneurship; Effective Relationships, such as communication, collaboration, ethics, and politics; and Effective Accomplishment, which requires students to work on real-world projects. Technology is included the way digital natives include it: integrated into whatever is happening.

Prensky doesn’t have all the answers, but his proposals shift the education reform conversation from how to use technology to teach to how to teach the meaningful thinking that will help students navigate the challenges of a technological world. I think he’s on the right track.