Tag Archives: digital

Cell Phones, a New Form of Birth Control?

In the informative Coursera MOOC on social marketing that I’m presently taking, a number of stats were thrown at us. My favorite: “More people own a mobile device than a toothbrush.”

If you’ve been reading my blog, you know that I’m not going to accept this kind of statement at face value (see previous blog, Or Would You Rather Be a Fish?). So I had to dig deeper to find out if it’s true.

Nicole Hall of the 60 Second Marketer traced the origins and accuracy of this statistic in 2011. She looked at statistics on cell phone service subscriptions and deduced the number of mobile phones owned worldwide.

Then Hall unpacked the statistic from Oral-B that yearly global toothbrush sales were about $5 billion. She concluded that, yes, “more people own a mobile phone on the planet than own a toothbrush.”
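Her estimate is easy to reproduce as back-of-envelope arithmetic. The sketch below uses illustrative numbers; the average price, replacement rate, and subscription figures are my assumptions for the sake of the example, not Hall's exact inputs:

```python
# Back-of-envelope version of Hall's comparison.
# All figures below are illustrative assumptions.
toothbrush_sales_usd = 5e9    # ~$5 billion in yearly global sales (per Oral-B)
avg_toothbrush_price = 2.0    # assumed average retail price in USD
replacements_per_year = 2     # assumed brushes bought per owner per year

toothbrush_owners = toothbrush_sales_usd / avg_toothbrush_price / replacements_per_year

mobile_subscriptions = 5.6e9  # assumed worldwide subscriptions circa 2011
subs_per_owner = 1.3          # assumed: some people hold multiple SIMs/plans

phone_owners = mobile_subscriptions / subs_per_owner

print(f"Estimated toothbrush owners: {toothbrush_owners:.2e}")
print(f"Estimated phone owners:      {phone_owners:.2e}")
```

Even with generous assumptions on the toothbrush side, the phone estimate comes out billions larger, which is why the comparison survives rough inputs.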

Jumping on this conclusion, I was all ready to discuss the oral hygiene of mobile phone users without toothbrushes: poor oral hygiene leads to rotting teeth, which leads to fewer romantic relationships, less sex, and fewer children. Could mobile phones be a new form of birth control? (Is that so much more far-fetched than many of the dire predictions people are making about the impact of mobile phones?)

But my friend Lindsay reminded me that not everyone uses a plastic toothbrush to ward off tooth decay. As a Peace Corps volunteer in Ethiopia, she observed Ethiopians using roots or twigs to clean their teeth. Apparently, I was thinking about this in a culturally biased way. Shame on me.

So I had to find out how the rest of the world takes care of their oral health. Not only do Ethiopians and their West African neighbors brush with and chew twigs to clean their teeth and freshen their breath, but according to researchers these twigs are very effective at fighting bacteria.

Protection from gum disease and other ailments is associated with the use of chewing sticks from the Neem tree in India and twigs called Miswaks that have been used by Muslims for centuries. (You can buy Neem toothpaste and Miswaks on Amazon.) In other words, most people in the world are brushing their teeth.

Now that data shapes so much of our everyday lives online, we need to be very careful about how we interpret it. Comparing unrelated variables is particularly tricky. Despite Hall's excellent research, the statement should be: more people own a mobile device than a Western-style toothbrush. Putting it that way makes it less intriguing and decidedly more trivial.

For those of you who are thinking about how mobile phones are changing our lives and the potential impact of social marketing, here are a couple of examples of more relevant comparison data:

More than 50% of US adults aged 18-44 have cellphones rather than telephone landlines in their homes.

Americans now spend more time accessing digital media on mobile devices than they do on desktop or laptop computers.

Now these stats are worth thinking about next time you brush your teeth.

What’s a Seventeen-Year-Old to Do?

Originally posted on Acrobatiq and cross-posted with permission.

When co-founder Larry Page recently announced Google’s reorganization, he referred to the company as “still a teenager.” Incorporated in 1998, Google is 17 years old, making Page and co-founder Sergey Brin 17-year-old businessmen.

Like 17-year-olds, they are tired of routine and want the flexibility to do things they like doing.  Restructuring Google frees them from the mundane tasks of running a large company. As Page writes, it allows them “to do things other people think are crazy but we are super excited about.”

While the Google founders are getting what they want, most 17-year-olds getting ready for college will not. According to Remediation—Higher Education’s Bridge to Nowhere, Complete College America’s study of public institutions in 33 states including Florida, Illinois, Massachusetts, and Texas, more than 50% of students entering two- and four-year colleges test into remedial courses. In California, 68% of students entering the state college system test into remediation.

These courses rehash high school material. To adapt a University of Chicago motto, this is where excitement and freedom come to die. The proof is in how many people don’t go on to graduate. According to the study, about 40% of community college students don’t even complete their remedial courses. Less than 10% of community college students who finish remedial classes graduate within three years. Only about 35% who start in remedial courses at four-year colleges obtain a degree within six years.

Factor in the cost of remedial courses, courses that do not count towards a degree, and you have a formula for failure. To make things even worse, placement in remediation is often based on one test that researchers say is a poor indicator of student potential.

To address this situation, educators are calling for reforms. Some states are giving remedial students a choice of whether to take remedial courses or directly enroll in regular college courses. Others are eliminating remedial education as prerequisites for regular courses, allowing students to take them as co-requisites. Technology offers other ways to support students with weak skills in regular college courses.

Digital adaptive learning programs can be adopted for all students in college courses. They address individual student abilities at every level, in effect, containing what might be called “embedded co-requisites,” academic support where needed.

As students learn, these self-paced digital programs assess their abilities, tailoring the material so that they get the most out of their time in the course. For example, if a student shows non-mastery in identifying evidence for a historical argument after reading a text section, the program might give them more explanation, activities, or video on that subject. On the other hand, if a student shows mastery after one text section, they can immediately move forward.
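To make this concrete, here is a minimal sketch of the branching logic such a program might use at each checkpoint. The skill name, mastery threshold, and activity labels are hypothetical placeholders, not any real courseware product's API:

```python
MASTERY_THRESHOLD = 0.8  # assumed fraction of checkpoint questions correct

def next_activity(skill: str, correct: int, attempted: int) -> str:
    """Decide what the student sees after a checkpoint on one skill."""
    if attempted == 0:
        return f"diagnostic:{skill}"   # no data yet: probe the skill first
    score = correct / attempted
    if score >= MASTERY_THRESHOLD:
        return "advance"               # mastery shown: move forward
    # Non-mastery: supply an "embedded co-requisite" for this skill,
    # e.g. more explanation, practice activities, or video.
    return f"remediate:{skill}"

# A student who misses most questions on identifying evidence is remediated;
# one who answers nearly all of them correctly moves straight ahead.
print(next_activity("identify-evidence", correct=2, attempted=6))
print(next_activity("identify-evidence", correct=5, attempted=6))
```

Real adaptive engines use far richer student models than a single score, but the core decision at every checkpoint looks like this: advance on mastery, remediate otherwise.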

Adaptive programs also give instructors data on how each student is progressing in real time. Based on this data, instructors can reach out to students while they are learning instead of after they have failed a test.

As students succeed at their own pace in courses that count, they will have greater motivation to complete their degrees. They might even enjoy the experience.

Or Would You Rather Be a Fish?*

In a recent Faculty Focus blog, The Power of Mindfulness, Jennifer Lorenzetti points out that the average attention span of humans is estimated at 8 seconds, down from 12 seconds in 2000, while that of goldfish is 9 seconds. She follows this with: “Wouldn’t it be wonderful if everyone in your class could manage to be mentally present for the entire class?” Should she also have asked “Wouldn’t it be great if everyone in your class were a goldfish?”

More questions: How do you measure the attention span of a goldfish? What does that have to do with human attention span? What do goldfish use their attention spans for? How can I have any self-esteem if I have a shorter attention span than a fish?

Perhaps the fact that I got stuck on the goldfish intro and didn’t go on to grasp the rest of the article proves that I have a short attention span. Nevertheless, rather than write about mindfulness, I decided to browse for information on the attention span of goldfish.

Almost immediately I discovered that I am not the first to do this. Among those who have, Ray Adams is very skeptical of both Google searches and attention span research; he could not confirm the actual attention span of goldfish (or of people, for that matter). Ken McCall did an even more thorough search. He traced the statistic back to Statistic Brain, but they don’t explain its source either. They define attention span as “the amount of concentrated time on a task without becoming distracted.”

A 2014 Ministry of Truth blog also can’t find a source for attention span in goldfish, likening it to another widely touted goldfish claim: that goldfish have a 3-second memory span (how long something is remembered). However, that assertion was debunked by scientists in two studies showing that goldfish memories can last for months.

Lorenzetti may have gotten her information from a recent, much-publicized report by Microsoft Canada, Attention Spans, on human attention span in the digital age. It used the 9-second goldfish statistic.

The researchers found that Canadian attention spans are decreasing, but people are able “to do more with less,” making decisions based on little information. The Microsoft study’s goal was to advise advertisers on digital messaging. Their advice was to be concise, novel, and interactive where appropriate.

The Microsoft researchers didn’t study goldfish or give advice on how to get their attention. Should we just be amazed that we function as well as we do with such short attention spans?

Research will continue on human attention span in the digital age because it affects how we learn and communicate. But is attention span the same no matter what we are involved in? While my attention span for Lorenzetti’s article was short, it was quite substantial for researching goldfish and writing this blog.

So, I’m not convinced that we are losing out to goldfish. Now, if we could measure goldfish attention span while they’re surfing the Internet or playing Grand Theft Auto….

*From Swinging on a Star by Johnny Burke and James Van Heusen

Flipping the Curriculum

When I realized I was a technology immigrant trying to learn Technology As A Second Language℠ (see previous blog TSL℠: Do You Speak It?), I hadn’t yet read Marc Prensky’s two amazing articles,  Digital Natives, Digital Immigrants and Do They Really Think Differently? Part II.

Turns out we are on the same wavelength. In 2001, Prensky had already defined digital natives as those who have grown up with technology and digital immigrants as those who did not. While my focus has been on how to educate digital immigrants, his focus has been on how to educate digital natives.

According to Prensky, what makes this digital immigrants/digital natives situation unique is that while traditional immigrants learn the prevailing culture from natives, in education the situation is reversed. Digital immigrants are trying to teach digital natives how to succeed in an increasingly digital world, one that students have a better grasp of than teachers.  Prensky believes that this is “the single biggest problem facing education today.”

There is only a slim, slow chance of solving this problem if digital immigrants continue to obsess about “negative” effects of technology.  Steven Pinker describes the hysteria:

Media critics write as if the brain takes on the qualities of whatever it consumes, the informational equivalent of “you are what you eat.” As with primitive peoples who believe that eating fierce animals will make them fierce, they assume that watching quick cuts in rock videos turns your mental life into quick cuts or that reading bullet points and Twitter postings turns your thoughts into bullet points and Twitter postings.

Even if this were true, we would still have to deal with it. If we are going to help next generations, we need to stop wasting time lamenting basic truths:

  1. Technology has changed everything and will continue to do so.
  2. Using technology rewires the brain and changes the way people think just as driving cars and working in offices changed the brains and thinking of people who rode in buggies and farmed.
  3. Technology means we don’t have to remember as much information as we used to.

Prensky accepts these premises. In his 2014 article, The World Needs a New Curriculum, he explains that the core subjects (math, science, language arts, and history) are just “proxies” for teaching what most agree is needed to succeed, such as informed thinking, acting, and communicating. We expect students to learn the latter even though they are rarely taught directly.

According to Prensky, modern times require flipping the curriculum to teach what we value most upfront, using subject matter where it fits. He proposes four basic subjects:

  1. Effective Thinking, such as critical thinking, mathematical thinking, design thinking, and problem solving.
  2. Effective Actions, such as mindset, grit, and entrepreneurship.
  3. Effective Relationships, such as communication, collaboration, ethics, and politics.
  4. Effective Accomplishment, which requires students to work on real-world projects.

Technology is included the way digital natives include it: integrated into whatever is happening.

Prensky doesn’t have all the answers, but his proposals change the focus of the reforming education conversation from how to use technology to teach to how to teach the meaningful thinking that will help students navigate the challenges of a technology world. I think he’s on the right track.

Is Multitasking the End?

There’s an epidemic that’s hard to miss: People texting, watching videos, working, checking their messages, listening to music, walking, driving, talking on cell phones all in the same timeframe. They’re multitasking, doing different things simultaneously or moving back and forth between activities that overlap or interrupt each other.

The most dangerous form of technology multitasking is talking or texting on your cell phone while driving; cell phone activity slows reaction times more than alcohol does. Next is multitasking while walking: forty percent of more than a thousand teens surveyed by Safe Kids reported they were hit or almost hit by a moving vehicle while walking and listening to music, talking, or texting on their phones.

Still, the epidemic continues to spread. We can try to contain it, but it may be more useful to find ways to save us from ourselves. People are working on solutions such as inventing cars that can stop on their own or drive themselves so distractions such as texting won’t cause accidents.

For walking and texting, or “wexting,” one city is wrapping soft materials around light poles so that people who walk into them are hurt less. Another interrupts texting with a pop-up alert as you approach an intersection, prompting you to pay attention. To raise awareness and change behavior, Utah Valley University designed a staircase with lanes marked walk, run, and text.

Still, the outlook is grim. Articles appear daily with headlines such as “Why the Modern World is Bad for Your Brain,” “The High Cost of Multitasking,” and “Multitasking Damages Your Brain And Career, Studies Show.” These articles highlight research showing that technology multitasking can make you less productive, lower your IQ, and damage your brain.

These are all important issues, but how do we continue to adapt to the future with this kind of ominous news?

Looking for good news, I found two small studies showing that the effects of technology multitasking aren’t all bad. One study of children who were instant messaging (IMing) while reading a passage found that although they took longer to read the passage, they tested as well as those who just focused on reading. In another study, Dr. Zheng Wang found that college students who used media while studying were happier than those who did not, even though they completed less of their schoolwork.

Then I thought I found real optimism in the New Yorker article “Multitask Masters.” When psychologist David Strayer found that, in general, the more you multitask, the worse your performance, he also found one person who was a highly effective multitasker. Instead of treating her as a statistical irrelevancy, he decided to learn from her and other “supertaskers.” But it turns out supertaskers comprise only about 2% of the population, and their ability is basically genetic: their brains are wired differently.

How do we keep hope alive? I have to think that in the thousands of years of human life, our brains have adapted even though we have no MRIs or psychology studies to prove it. Take the telephone, invented in the 1870s, for example. The sound of a phone ring is so familiar to us now that coma patients will answer phones, speak into them, and then return to their comas! That’s gotta count for something.

Let’s continue to do the studies, analyze the problems, and work on solutions. And, let’s hope that we don’t become idiots or accidentally kill each other before experience with technology turns humanity into better multitaskers.