Playing Head Games: Learning (and Un-Learning) in the Future

Someday not so far away, scientists, astronauts, physicians, pilots, and surgeons will not have to get a formal education.  In fact, no one, not even children, will have to go to school.

OK, I may be overstating that just a bit.

Let me restate:  In the future, the definition and process of learning are shaping up to be quite different from how we know them today.  The changes we may see could have big implications for many areas of our lives — education and medicine, to name just a couple.

In the Future, We’re All More Intelligent

I’ve spent a good deal of my little share of the blogosphere discussing how the Millennial and Homelander generations are different from Gen-Xers and Boomers.  Here’s another difference:  Millennials, Homelanders, and generations beyond will spend less time learning, yet likely know more, than their parents’ generations.

When older generations were growing up, learning involved committing to memory (and hopefully using in practice) various facts about history, language, mathematics, and so on.  Let’s face it:  some of that information turned out to be not so useful in our adult lives.  For example, we may remember facts here and there about the American Civil War – and may be nasty Trivial Pursuit competitors as a result – but most of us use that information very little in our day-to-day lives.

It’s almost inevitable that younger generations will view their parents’ education process as quaint or even wasteful.  To their way of thinking, why bother learning more than the basics of how to communicate and manipulate numbers?  They already know that the Internet holds more collective knowledge, available at their fingertips 24X7, than they can possibly learn on their own.  If they need to know something, they just Google it.  Some people consider this ability to instantly access new information in the cloud as a bump in the collective intelligence.

I can almost hear the masses of educators and parents bemoaning how today’s lazy and unmotivated kids are going to ruin the world.  In fact, there is a great suspicion of the web in many educational circles.  Many educators even limit the types of websites a student can reference in their projects, presuming that students can’t assess the difference between reliable information and misinformation.  This seems a bit short-sighted to me:  I don’t recall a teacher instructing our class not to use a particular book or newspaper for our projects.  Even if, generally speaking, the information we used to be exposed to was more “reliable” because it was filtered by a publisher of some sort, it’s well known that even the books and newspapers of those bygone days contained misinformation, or at least biased information.  Better to teach kids how to detect misinformation, I’d say.

I’m slightly off topic… again.

Really, though, is declining to commit to memory a bunch of information you don’t need right now such a bad thing?  Why not find and use information when you actually need it rather than studying something just to pass an exam?

Perceptual Learning

Well, that just might happen.  In the future, we might use new perceptual learning techniques to learn something when it’s useful, as opposed to learning something for an exam that may or may not be useful to us later.

We all already use some degree of perceptual learning today:  a child uses perceptual learning to detect patterns in number sequences or a series of shapes, for example.  The basic concept will remain the same.  What might change is that we may train our brains to enable more dynamic and instant perceptual learning.  You know, just like Neo, who sits back in a dentist’s chair, plugs his brain into the Matrix and, upon waking up, has this profound observation:  “I know Judo.”

Well, sort of.  There are different systems in the brain.  Broadly speaking, there are the cerebrum, the cerebellum, the limbic system, and the brain stem.  The cerebrum is the system involved in higher thought and action, and it’s this part of the brain that I’m mostly concerned with today.  To learn Judo, I’d also need to discuss other parts of the brain that control movement, and hey, I’m not writing a term paper here (You can read this article to see if brain training could be used to learn Judo.  Of course, you’ll read it after you read my post, right?).  Also, I don’t foresee – or ever hope to see – an era in which parents will consent to having large outlets installed into the backs of their children’s heads.

It used to be thought that the brain was essentially static and somewhat untrainable after 12 months of age.  But, in 1994, an Israeli neurobiologist, Dov Sagi, demonstrated that rigorous, extended training in certain visual tasks could improve how well people performed those tasks.  Sagi’s study (and others that followed) showed that the brain is highly plastic and even capable of re-wiring itself well past the age of 12 months.

Neuroscientist Takeo Watanabe recently conducted experiments to see if the brain’s visual system (in the cerebrum) could be used for more instantaneous perceptual learning.  First, using an fMRI, Watanabe recorded brain activity patterns while subjects observed diagonal lines on a computer screen.  Later, the same subjects were wired up again and then shown a computer image of a small disc.  Now here’s where it gets weird:  they were asked to use their mental powers to increase the size of the disc.  They were not told how to use their mental powers, just to try to mentally enlarge the disc.

Now, even weirder, many of the subjects – or, rather, the subjects’ brains – actually could make the disc bigger.  The connection between the subjects’ brains and the computer had been set up such that if the subjects could reproduce the brain activity patterns previously recorded, then the disc would grow.  Turns out that when faced with a visual challenge, the brain will automatically replay old perception patterns to find a match and “solve” the problem.  In effect, the subjects displayed the capability to replicate brain activity patterns and work with “information” through their brain patterns.
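
For my fellow nerds, here’s a rough sketch of the feedback loop as I understand it.  To be clear, this is just an illustration: the real experiment used machine-learning analysis of fMRI voxel data, and every number and name below is invented.

```python
import numpy as np

def disc_radius(current_activity, target_pattern, base_radius=10.0, max_radius=100.0):
    """Map how closely the live brain-activity pattern matches the
    recorded target pattern onto the size of the on-screen disc."""
    # Pearson correlation between the live voxel readings and the
    # pattern recorded in the first session (ranges from -1 to 1).
    similarity = np.corrcoef(current_activity, target_pattern)[0, 1]
    # Only reward positive matches: the closer the match, the bigger the disc.
    match = max(similarity, 0.0)
    return base_radius + match * (max_radius - base_radius)

# Hypothetical session: the "target" is the activity recorded while the
# subject viewed diagonal lines; each new fMRI frame updates the disc.
rng = np.random.default_rng(42)
target = rng.normal(size=500)                          # 500 made-up voxels
live_frame = target + rng.normal(scale=0.5, size=500)  # a partial match
print(f"disc radius: {disc_radius(live_frame, target):.1f} pixels")
```

The real version replaces my simple correlation with a trained decoder, but the incentive structure is the same: match the pattern, grow the disc.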

So, if this technique could be developed for use in education, students may not be educated in the traditional sense, but rather trained to call up certain kinds of brain activity.  Once the students’ baseline is established, they would activate the relevant brain pattern while observing material that they need to know.  The hope is that the material is then automatically stored in the brain and can be called up as needed to complete a particular task.

I’m guessing here, but it seems that this technique could solve a whole host of educational issues that we struggle with today.  We already know that the “one size fits all” or “teaching to the middle” methods aren’t working well.  Perhaps this technique will enable us to deliver personalized educational programs for each student?  Presumably all students will learn much faster, yet each will learn at their own rate.  Maybe learning problems, like dyslexia or ADHD, will be irrelevant?  And, maybe, just maybe, all learning will happen at home and kids won’t have to go to a formal school environment any more!  (I’m not going to mention this to my kids just yet, but I, for one, will be glad to see the early morning drop-off eradicated from my schedule.)

There’s more.  This type of perceptual learning may be something that can happen while the students are asleep just as often as it happens when they are awake.  Can you imagine…?

“Now students, your homework assignment tonight is to go home, wire your head to your home learning unit, look at a picture of the material for 10 minutes and then take a 2-hour nap.  Feel free to stay up all night partying after that.  Just be back here tomorrow morning for the exam.”

How many of us would have loved to hear that after calculus or biochemistry class?

Un-learning … and Stanley Kubrick

Have you seen the movie, Eternal Sunshine of the Spotless Mind?   It’s definitely one of my top five movies, though I can’t say for sure whether it ranks better than Kick-Ass, Galaxy Quest, Napoleon Dynamite, Bedazzled, or Tropic Thunder … yes, I am that person.

Anyway…

Eternal Sunshine is a great movie (and relevant, which I’ll get to in a minute) because it examined a fascinating and terrifying concept:  if the brain could be manipulated to forget painful memories, would it make us happier?

Let’s return to Watanabe’s research.  Watanabe also hypothesizes that we could use perceptual learning techniques — but in reverse — as a cure for things like depression, kind of along the same lines as cognitive psychology.  For example, a depressed person could go through un-learning therapy (my term), where a doctor would expose the depressed person to images of kittens and babies, record their brain activity during the session, and then have the patient access the happy brain activity as the need arises.  Alternatively, doctors may use this technique to erase a certain painful time period from a patient’s memory.  (Would they also get to wear cool, semi-retro suits and get an instant memory-zapper tool like Tommy Lee Jones and Will Smith in Men in Black?)

Amazing as it sounds, I’m not so sure that un-learning or erasing memories would be a good or desirable “cure”.  Watanabe suggests erasing a whole time period, as opposed to selective events, from a person’s memory.  Would we really want that?  Maybe for people with certain kinds of mental issues, like soldiers who suffer from PTSD, erasing a whole year might make sense.  But for the man on the street, both good and bad things happen through the course of a year.  Surely painful experiences have lessons to teach us.  And what is life without both joy and pain?  Boring.

Certainly there’s also the potential for chaos and confusion.  Wouldn’t your co-workers look at you a little funny if you can’t remember what you discussed in the last team meeting?  I’m thinking that erasing your memories might be a career-limiting move.  (I’m also thinking that I’ve worked with a few people who may have already had this procedure.)

I’ll concede that there would be something about un-learning that could help with problems caused by poor habits and deficient thought patterns.  For example, if un-learning could curb Americans’ cravings for processed, high-fat, sugary foods (except, of course, chocolate) and increase our desire to exercise, maybe we could solve the obesity crisis.  Ditto for smoking.  Also, repeat criminal offenders have been shown to fall victim to poor reasoning.  Maybe un-learning could help them rehabilitate?  Along those same lines, could teenagers become people that their parents can live with???

Even so, I see ominous shades of A Clockwork Orange in all of this (Moloko Plus, anyone?) — and I hate to be simpatico with anything Stanley Kubrick has had a hand in.

But, Will It Work?

If un-learning comes to pass, I’m kinda hoping it turns out the same way it did in Eternal Sunshine:  (SPOILER ALERT) ultimately, the characters who go through the memory-erasing procedure are inevitably drawn back to the person they wanted to forget.  The moral:  Not even amazing technology can usurp Fate.


Nothing is Certain But Death and Taxes… Well, Taxes, at Least

When is a person or animal dead?  A simple question, right?

Wrong.

Turns out, it’s not such an easy question to answer.  And, with advances in medical technology and the influence of religious and cultural norms, it’s becoming even more difficult to be certain that someone is truly dead.

Now, I know what you’re thinking:  This woman watches way too many sci-fi movies.  But, wait, just hear me out.

The Death Debate Begins:  Jesus and Premature Burial

The diagnosis of death has been in question since, well, probably since Jesus flouted all the known laws of the universe and walked out of his tomb 2,012-ish Easter Sundays ago. (My dates may be a little off on that.  I vaguely recall something about three days passing and then Easter happened.  Also, I’m fuzzy on whether A.D. started exactly on that day or how the recently-defunct Mayan calendar may come into play.  Not sure.  It’s been a while since I’ve been to church.)

Anyway…

Jesus may be the first human / deity in recorded history who experienced “accidental” premature burial (and no, I do not count Connor MacLeod of Highlander).  The fear of premature burial has long weighed on people’s minds and reached a fever pitch in the mid-18th century, when people began to notice that doctors weren’t so great at knowing when someone was really dead.  By some accounts, somewhere between an estimated 800 and 2,700 English and Welsh people who had been mistakenly diagnosed as dead miraculously awakened days later.  Some awoke just as they were about to be embalmed, others while lying in their coffins (either above or below ground, and presumably not embalmed, or it really would be a miracle).

How does one explain the situation to a loved one who wakes up as the guest of honor at their own funeral?  “Really, we were told you were dead!!  Honest!”  Just doesn’t seem to cut it, does it?  And, it isn’t hard to imagine how unsettling it would be to find your muddy, well-dressed and highly annoyed loved one glaring at you in the kitchen a few days after their lovely funeral.

Diagnosing Death

Possibly foreseeing the threat of rampant malpractice suits, doctors suggested various procedures that I will call “tests of death” to determine if the patient was really and truly dead.  The following techniques were catalogued by Jan Bondeson in his book Buried Alive:

  • Pouring vinegar and pepper into the corpse’s mouth
  • Pinching the nipples with clamps (courtesy of a French physician)
  • Applying red hot pokers to the feet or into “the rear entry” (another French technique)
  • Administering tobacco enemas (what was going on with the French??)
  • Jamming needles under toenails
  • Cutting the feet with razors
  • Screaming “hideous shrieks and excessive Noises” or playing bugle fanfares close to the ears
  • Pouring wax on the forehead and urine in the mouth (this one from the Spaniards)
  • Putting an insect in the ear (brought to you by the Swedes)

My, my, my.  Maybe if you explained to your loved one that they passed the “tests of death” they might forgive the premature burial – but I don’t think they would as readily  forgive you for allowing the tests to be carried out in the first place.  I’m guessing that more than a few relationships ended as a result of all this drama.

Actually, maybe “tests of death” is not the right term here.   Perhaps we should refer to these humiliating and painful procedures as “You might be faking, so I’m going to do something that will surely kill you to be sure you are truly dead so that your relatives won’t come after me with pitchforks and torches”.  That hot poker in the rectum thing?  Surely that doesn’t heal well.  Any poor soul who had the misfortune of waking up after its application surely died days later of intestinal obstruction.  Rectifying the whole situation quite nicely, really.

I could spend all day on premature burial and other such things, but it’s not really the main topic here.  Let’s move on.

The Definition(s) of Death

The ethical-medical-cultural debate about the definition of death has raged on since those heady times.  Some think that death occurs when the heart stops.  Others say it’s when the brain stops functioning (the brain can continue to function to some degree for a considerable period after the heart stops).  Still others claim that it is when the soul leaves the body.  Finally, in the 1990s, some very smart people came up with a definition of death which is still in play today:  Information-theoretic death.

Information-theoretic death is the destruction of the information within a human brain (or any cognitive structure capable of constituting a person) to such an extent that recovery of the original person is theoretically impossible by any physical means.  The concept of information-theoretic death arose in response to the problem that as medical technology advances, conditions previously considered to be death, such as cardiac arrest, become reversible and are no longer considered to be death.

“Information-theoretic death” is intended to mean death that is absolutely irreversible by any technology, as distinct from clinical death and legal death, which denote limitations to contextually-available medical care rather than the true theoretical limits of survival. 

Following this logic and taking into consideration the astounding possibilities of medicine, it will be more difficult, not less, to declare someone absolutely, positively, truly dead.  The dead person of today will be tomorrow’s semi-dead person and maybe even a completely previously-dead-but-now-very-alive person the day after that.

You see?  As far as certainties in life go, apparently death no longer qualifies.

The Future of Death

Let’s fast forward to today, or rather, yesterday.  Yesterday, I watched “The Body” segment of an outstanding Discovery Channel documentary called 2057.  During one of the early scenes of the segment, the main character (who was kind of a jerk, truth be told) has a horrible, highly lethal accident by 2013 standards.  But, lo and behold, a flying ambulance shows up moments later and one of the EMTs suggests “reversible death” to sustain the patient long enough so that they can get to the hospital.  That got my attention.  “What is reversible death??”  I had to know.

So, I looked it up.  Essentially — and this is my own summary because I couldn’t find a good one online, so be kind — medical professionals will semi-freeze your body by allowing your blood to leak out (hopefully catching it in something because you’ll need it later) and replacing some or all of it with saline and/or cold plasma.  As the body cools to something like 50-ish degrees Fahrenheit, the normal biological functions that cause organs to fail and shut down are suspended.  Thus, you would be in a state of “reverse death”.  Once you’re reversed dead, you have time to be transported to the hospital without suffering further damage to your body and you may remain in that state if the medical team decides to perform surgery  – or administer evil “tests of death” upon you (just kidding about that last one).

After they are done fixing you up, you get your blood back and, Voila!, there you are, a reversed-reversed-dead person.  (Remember, a double negative is a positive.)  Wouldn’t you almost expect your doctor to say “Ta Da!” when you woke up?  I would be disappointed if she didn’t.

Too far-fetched?  Will never happen, you say?  Well, it kinda is happening already.

  • I found an amazing (hopefully factual) tidbit on the Wikipedia Death page:  People found unconscious under icy water may survive if their faces are kept continuously cold until they arrive at an emergency room.   Who knew that literally keeping your face on ice could help you to survive drowning in icy water???  Certainly not I!
  • According to Singularity Hub, Dr. Peter Rhee received approval in 2011 from the Food and Drug Administration to go into clinical trials for the use of “suspended animation” during surgery.  The procedure sounds essentially the same as reverse death.  The US Army is very interested in all this, for obvious reasons, and funded some of Rhee’s research.  By the way, Rhee is one impressive man:  he’s the guy who saved Gabrielle Giffords.
  • Also on Singularity Hub:  Mark Roth takes a metabolic approach to [suspended animation], decreasing the need for oxygen by introducing a molecule that binds oxygen receptors and thus prevents oxygen from doing so.  Dr. Roth’s team has successfully put yeast, roundworms, fruit flies, frogs, and zebrafish into a state of suspended animation for up to 24 hours.  No, I don’t know why those animals were chosen, but what a menagerie!  The metabolism-slowing molecule is now in Phase II clinical trials.

So, that covers short-term reverse death – it’s almost here!

Double Death, Or, Maybe, Eternal Life

But what about people who are looking at the bigger picture?  What if you’re willing to wait for medical technology to catch up with your incurable disease or if you really really really want to see the world 500 years from now?

Enter cryonics.  Cryonics is the low-temperature preservation of humans and animals who cannot be sustained by contemporary medicine, with the hope that healing and resuscitation may be possible in the future. (That sounds familiar!  Someday, I really have to look into whether these are the same guys who came up with the Information-Theoretic definition of death.)  Many cryonics advocates further believe that in the future medicine should be able to restore youth and health, possibly even allowing people to live for hundreds or thousands of years.

Most of us have heard of people who have been cryonically preserved.  If not, here are a few (spoiler alert – despite urban myth, apparently Walt Disney is not in cryostasis):  Ted Williams, Dick Clair Jones, James Bedford, and FM-2030.  I have to make a special call-out to FM-2030, a man after my own heart (from the future-looking perspective, not the cryonics perspective).  Here is a summary of FM-2030’s journey from mental_floss:

FM-2030 – Yeah, that was his real name. He was born Fereidoun M. Esfandiary, but changed his name to reflect his goal of living to be 100 (2030 would have been his 100th birthday). He also predicted that 2030 would be “a magical time. In 2030 we will be ageless and everyone will have an excellent chance to live forever. 2030 is a dream and a goal.”

Back at ya, FM-2030.

According to Wikipedia, there are only about 250 people in cryopreservation as of 2012.  But, cryonics is catching on.  One facility, The Cryonics Institute (CI), claims 1,040 members, with 112 humans and 91 pets currently in cryopreservation.  In some cases, entire families have signed up.  And, for a “cool” $28-35K, you and your family can do it, too!  Oh, but you also need $88-95K for someone to be on standby as soon as your heart stops beating and for transport to the facility.  So, at up to $130K per person (or pet), totally affordable.

One more thing I should mention about these guys.  Cryonicists have also tried their hand at tackling the ethical-cultural-religious debate about death.  On their site, CI provides a handy FAQ for those struggling with the decision to freeze themselves for future resuscitation (They hate the word “freezing”, by the way.  Those in the know call it vitrification.  So, don’t say “freezing” at a cryonicists’ party or you’ll be shown the door).  I found the following response to the concern about whether cryonics is in conflict with religious beliefs to be quite thought-provoking:

Some have said that to refuse new life extension technologies could be a sin comparable to suicide.

Hm.  So, the natural, if no longer inevitable, act of dying may actually be considered a sin.  Never thought of it that way.  I wonder what the new Pope will think?

Interesting fact:  CI is licensed in Michigan as a cemetery, although they refer to their residents as patients who are simply waiting for medical technology to catch up with them.  So, are they dead or not?  See???  We can’t say for sure!

I’m wondering…

  • Are people who opt for cryonic preservation required to undergo a thorough mental evaluation?  I’m not being smarmy or judgmental here.  The world will be a radically different place when they wake up.  Patients will be heavily reliant on others to help them understand how to function.  Spoken and written language is likely to be very different by then; perhaps the patient won’t understand a word anyone is saying.  Aren’t they concerned about suffering a nervous breakdown or becoming suicidal?  (The CI FAQ has a not-wholly-convincing answer for this risk, by the way.)
  • What about national healthcare programs and cryonics?  Where do we draw the line between who should/shouldn’t be preserved and who should pay for it?
  • Will future Presidents and world leaders be required to be cryopreserved?  What would happen if they were revived and they expected to run the world again?
  • If you are in cryostasis for, let’s say, 300 years, do you have to pay taxes for those 300 years when you wake up?

The Heroism of Death

All of this research into suspending people and animals so that they can be revived someday sort of implies that death is to be avoided at all costs (or for just $130K at CI).  Really, though, should we be trying so hard to cheat death?

Not that I’m obsessed with death or anything, but back in 2004 I read an excellent book by Mary Roach, Stiff:  The Curious Lives of Human Cadavers.  (Truly, it’s fantastic and you should read it.  Don’t be a wimp.)  Roach lays out many of the reasons why one should donate their body to science – when one is done with said body, of course – and explains the many fascinating things that cadavers help scientists to study.

One of the best parts of this book is the Introduction, so don’t skip it.  In the Introduction, Roach compares traditional death and burial — that is, dying and then being a body that is not-prematurely-buried — to being on a cruise ship.  You can choose a cruise ship on which you lie around all day, doing nothing other than slowly decomposing.  Or, you can choose to cruise Ms. Roach-style.  Here’s her perspective:

If I were to take a cruise, I would prefer that it be one of those research cruises, where the passengers, while still spending much of the day lying on their backs with blank minds, also get to help out with a scientist’s research project.  These cruises take their passengers to unknown, unimagined places. They give them the chance to do things they would not otherwise get to do.

I guess I feel the same way about being a corpse. Why lie around on your back when you can do something interesting and new, something useful? For every surgical procedure developed, from heart transplants to gender reassignment surgery, cadavers have been there alongside the surgeons, making history in their own quiet, sundered way.

So, while we debate who’s dead and who’s not and as we develop incredible technologies to pluck from the jaws of death those who can be saved, let’s not forget that the act of dying can be generous, brave, and heroic.  We must be grateful to the dead who have offered up their bodies to science:  They’ve given us incredible gifts.

Now, about those taxes…


The Vice Premium: Insurance Companies and Big Brother

The other day I was getting a spiffy new haircut when I noticed the semi-Steam-Punkish bracelet that my hairdresser was wearing.  Because I’m a curious (nosey) person, and, hey, she’s my hairdresser so I’ve already told her my life story, I asked her what it was.  It turned out to be a Jawbone UP, a bracelet that monitors your sleep patterns and activity levels (or, more to the point, inactivity levels) throughout the day.  If you also track your food and drink intake (accurately!  no skipping the donut you had at the breakfast meeting!), it helps you to correlate your overall well-being and see where that extra slice of cake or glass of beer may not have been the best idea.

The fine men and women of Jawbone UP are taking advantage of one of the most recent trends in self-obsession:  the Quantified Self.  According to QuantifiedSelf.com, self-quantification is “self knowledge through numbers”.  Self-quantifiers (or is it quantified-selfers??) seek out “new ways to learn about yourself through the data you collect”, craft spreadsheets that track everything that they do and eat, and have meet-ups worldwide where they discuss their self-stats and how to learn even more about themselves using more gadgets and even more spreadsheets.  (I should mention that the person who publishes QuantifiedSelf.com is a devotee of FitBit, a product similar to Jawbone UP.)
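
If you’re wondering what self-quantifiers actually do with all those numbers, the heart of it is simple correlation-hunting.  Here’s a minimal sketch in Python; the data and column names are invented, and a real Jawbone UP or FitBit export is far richer than this:

```python
import pandas as pd

# An invented week of self-stats (a real device export would have
# minute-level data, not daily summaries).
log = pd.DataFrame({
    "steps":        [4200, 11800, 6500, 9900, 3100, 12500, 7800],
    "sleep_hours":  [5.5, 7.8, 6.2, 7.1, 5.0, 8.2, 6.9],
    "beers":        [3, 0, 2, 1, 4, 0, 1],
    "mood_1_to_10": [4, 8, 6, 7, 3, 9, 6],
})

# Which habits track with mood?  Pearson correlation against each column.
print(log.corr()["mood_1_to_10"].sort_values(ascending=False))
```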

Though it’s not for me (24X7 monitoring and the constant need to update my diet log would cause undue stress, leading my body to produce more of that nasty cortisol and epinephrine, and I’m sure I’d gain a few pounds), I do see the benefits of self-quantification:  in short, better health, better living.

Turns out, insurance companies think this is a good idea, too, and, in the foreseeable future, monitoring yourself 24X7 won’t be optional.

My husband, who is responsible for seeking out technology partnerships to take advantage of big data processing and analysis, told me about a discussion that he had with an insurance executive at a recent health and technology conference.  They had just left a session where a scientist had discussed the benefits of big data analysis to support personalized medicine to treat cancer.  My husband was surprised when the insurance executive had one word for personalized medicine:  “Pshaw!”

Insurance companies, the exec said, are much more interested in technologies and applications that would help to prevent disease, not cure it.  In fact, many insurance companies are pressuring their large enterprise clients to have their employees wear… what else… Jawbone UPs and FitBits!

But, it doesn’t stop there.  Soon after the bracelets gain broader acceptance, insurers will want access to all those self-stats employees start collecting about themselves.  Why would you agree to this?  Because the insurance companies will promise lower premiums, perhaps even adjusted on a daily basis for people with healthy habits.  So, if you have a donut, be ready to pay a Vice Premium (my term, not theirs).  Oh, and if you don’t want to be monitored, you won’t be insured and the “health care” you receive will be atrocious.  If you have an accident, God forbid, they’ll find you a dirty bed in a crowded ward, give you a box of band-aids, stick you with a saline drip and hope for the best.
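
To make the idea concrete, here’s a toy sketch of how a daily-adjusted Vice Premium might be computed.  Remember, “Vice Premium” is my term, and the rates and rules below are pure invention; no insurer that I know of prices this way (yet):

```python
BASE_DAILY_PREMIUM = 5.00   # invented base rate, in dollars

# Invented per-vice surcharges and per-virtue credits, in dollars.
SURCHARGES = {"donut": 0.75, "beer": 0.50, "cigarette": 2.00}
CREDITS = {"km_walked": -0.10, "gym_session": -1.00}

def daily_premium(day_log):
    """Adjust the base premium using one day's worth of self-stats."""
    adjustment = 0.0
    for item, count in day_log.items():
        rate = SURCHARGES.get(item, 0.0) + CREDITS.get(item, 0.0)
        adjustment += rate * count
    # Credits can lower the bill, but never below zero.
    return max(BASE_DAILY_PREMIUM + adjustment, 0.0)

# A donut at the breakfast meeting, two beers, but a decent walk:
print(f"${daily_premium({'donut': 1, 'beer': 2, 'km_walked': 6}):.2f}")  # $6.15
```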

Think of the impacts!  Fast food companies will go out of business (yeah, right).  Donut manufacturers will have to downsize and switch to vegan, gluten-free products (ha!).  Couples will get divorced when the Vice Premiums from Daddy’s weekend with the guys in Vegas end up eating up the funds meant for the family vacation!  Since we’re talking about Vegas:  nothing will “stay in Vegas” anymore – they’ll have to develop a new ad campaign!  Maybe “Try to Have Fun in Vegas”?  Or, “Please come to Vegas!  We’ll pay the Vice Premiums for You!”?  Even if you are insured, if you don’t have sufficiently healthy habits, maybe you won’t even be able to get married… your date will ask to see your self-stats before even having coffee with you, and that will be that.  Which means fewer children.  Maybe this is the answer to overpopulation???

I joke, but this is serious business…

So, you’ll be OK with allowing the insurance company to analyze you 24X7, and you will think a lot about the Vice Premium when you are faced with that donut.  Instead of having a few beers, you’ll allow your friends to treat you to a mineral water with a twist and then go for a run.

Following shortly on all that, though, it won’t matter if you want to opt out of self-quantification, Vice Premiums or no.  Self-quantification, or rather, insurance-risk-quantification will happen whether you want it to or not…  Scientists such as Danilo DeRossi and Mohammed Mabrook have been working for years on “intelligent textiles” that incorporate wires and sensors that will monitor your vital signs.  By the time these clothes are featured on your favorite online store, it’s likely that insurance companies will already have access to the personal data that these clothes will produce.

Oh, woe, George Orwell, you were only too right.

Sure, there are upsides:  Your clothes will also automatically alert emergency services if you have an accident or are about to have a heart attack, so they can respond more quickly and you’ll have a greater chance of survival.  Epidemics caused by all the new drug-resistant superbugs can be more easily identified and contained.  Maybe your clothes will even be capable of basic first aid, like compressing your body to reduce bleeding.

One thing we should all take note of:  So far, the clothes that I’ve seen which can monitor the body are all skin-tight suits.  It seems those outfits that all of the characters wear in sci-fi movies are no longer a distant (and, let’s face it, ridiculous) threat.  Forget about exercise programs so that you can Look Good Naked!  If you want to look good not naked, you’d better start that diet and exercise program now.  Heck, why not get a Jawbone UP or FitBit to get you started?

All of this Big Brother stuff is making my head swim.  I think I’ll have a donut.  I can start my diet tomorrow, right?


No More Bad Drivers in the Future (If Ford Motor Co Gets Its Way)

Tonight, I had the rare opportunity of listening to an entire radio broadcast of an event at the Computer History Museum in Silicon Valley.  (This was a rare opportunity for me because I was alone in my car and there were no kids to scream over the program or to demand to listen to “that Lady Gaga song”.)   In the program, Bill Ford, of said automobile company and related ancestry, discussed how Ford Motor Co. is opening a technology lab in Silicon Valley to get closer to innovative technology companies, especially software giants such as Google, Microsoft and, of course, the belle of the ball, Apple.  By collaborating more closely with technology companies, Ford (the man) says that he hopes that Ford (the company) will capture the leadership position in auto-related technologies and safety.  As Paul Mascarenes, CTO and VP of Ford Research and Advanced Engineering (who was also on stage), put it, “We want Silicon Valley to view Ford as a platform that is open, accessible and ready for their innovative ideas and technologies.”

In terms of new technologies, Ford discussed how cars now (or will soon) integrate voice recognition, Internet connectivity, and iPod support, and, of late, offer green / ecologically-friendly alternatives to traditional gas-burning engines.

Quite a new tune coming from Detroit these days…

But the technologies that aim to improve safety were, for me, the most interesting part of the discussion.  One of the audience members asked a very interesting question:  “If Ford realizes its vision, will bad drivers be a thing of the past?”  Mr. Mascarenes called up several technologies which, he claimed, might do just that.  He cited developments such as driver monitoring, blind-spot monitoring, MyKey (which protects teens from themselves by setting limits on things like speed and radio volume), and radio communications between vehicles and the grid that would allow the car to take over to avoid collisions and also help to alleviate traffic congestion.
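
Mr. Mascarenes didn’t get into the math, but at the heart of any collision-avoidance system is some version of a time-to-collision calculation: divide the gap to the car ahead by your closing speed, and if the result falls under a threshold, the car brakes for you.  Here’s a back-of-the-envelope sketch, with a threshold and numbers that are mine, not Ford’s:

```python
def should_auto_brake(gap_m, my_speed_mps, lead_speed_mps, threshold_s=2.0):
    """Brake if the time-to-collision with the car ahead drops below
    the threshold.  All values are invented for illustration."""
    closing_speed = my_speed_mps - lead_speed_mps
    if closing_speed <= 0:
        return False  # not closing in on the lead car; nothing to do
    time_to_collision = gap_m / closing_speed
    return time_to_collision < threshold_s

# 30 m behind a car doing 20 m/s while I do 30 m/s: TTC = 3 s, no brake yet.
print(should_auto_brake(gap_m=30, my_speed_mps=30, lead_speed_mps=20))  # False
# Same closing speed but only 15 m of gap: TTC = 1.5 s, brake.
print(should_auto_brake(gap_m=15, my_speed_mps=30, lead_speed_mps=20))  # True
```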

Leaving my multitude of questions aside about who — or what — is really driving for a moment, Mr. Ford said something that took me by surprise.  That last idea, about using vehicle-to-grid radio communications to alleviate congestion, is, for Mr. Ford, a civil rights issue.  A few people in the audience giggled.  I was merely confused, but then he pointed out that in China in 2010, there had been an 11-day, 74.5-mile traffic jam.  (I couldn’t believe it, either, but I looked it up.  It happened!)  To give you an idea what that looked like, here are photos of part of the traffic jam over the duration:

Day 1:  A minor annoyance

Day 4:  The kids are starting to get hungry

Day 7:  We’ve gnawed the leather off the backseat

Day 10:  OK, I’m starting to get really annoyed now…

Mr. Ford explained his point:  if we don’t do something to control traffic (he pointedly didn’t mention improving access to, ahem, public transportation), then multi-day traffic jams could become a global phenomenon, thus preventing access to timely healthcare, education, food, water, and other basic needs.  (As it turns out, it’s likely that Mr. Ford lifted the idea from civil rights activists like the Leadership Conference on Civil and Human Rights, who are pushing for more public transportation, but that is a topic for another time.)

Now, back to the question of who is really driving.  Will the drivers of the future be ready and willing to drive a car that they are not really driving?  There are mixed opinions among auto manufacturers, organizations that track the real-world performance of safety features, and — the men and women who really matter the most — insurance companies.  As our friends at the Highway Loss Data Institute and the Insurance Institute for Highway Safety reported:

Forward collision avoidance systems, particularly those that can brake autonomously, along with adaptive headlights, which shift direction as the driver steers, show the biggest crash reductions. Lane departure warning appears to hurt, rather than help, though it’s not clear why, and other systems, such as blind spot detection and park assist, aren’t showing clear effects on crash patterns yet.

So, it seems, the proof, as they say, will be in the as-yet-to-be-tasted “future pudding”.

What do you think?  Will bad driving disappear?  Are you a bad driver, ready for your car to take over when your nasty habits expose themselves?  Bad driver or not, are you ready to let your car do the driving for you?


2030: Will It All Be Fun and Games?

If you’re paying any attention to the business world these days, you are sure to have come across the concept of “gamification”.  There is talk of “gamifying” everything from how we purchase groceries to how we get our work done.

So, what is gamification and does it live up to the hype?

The CEO of Gamification Co., Gabe Zichermann, says “Gamification is the process of using game thinking and game mechanics to engage audiences and solve problems.  Put another way, it’s about taking the best ideas from games, loyalty programs, and behavior economics and using them in new and exciting ways.”  Still not sure what it means?  In short, everyone from your doctor’s office to your car service center to your boss is trying to find ways of using “games” to make your interactions with them more exciting and rewarding, thereby improving your relationship with and — hopefully — your loyalty to them.

But, gamification does not necessarily show up as a game, at least not one that you would recognize as a game.  If you’ve purchased or sold anything recently, it’s likely you’ve run into gamification without knowing it.

  • Do you have a loyalty card at your favorite department store?  My Nordstrom card says I can brag to my friends that I’m a Silver shopper.  My Silver status also means I get extras like early access to sales, free shipping and other perks.
  • Are you on Twitter?  How many followers do you have?  When social networks display your level of “connectedness” (which, presumably, translates into how influential you are), they are using a form of gamification.
  • Similarly, if you are active on eBay, you are awarded a certain degree of status by earning stars of various colors as you buy or sell more items through the online marketplace service.  Your status is enhanced by the overall ratings from the people with whom you conduct your “e-business”.  These ratings are meant to give potential trading partners a degree of comfort in doing business with you.  (A toy sketch of these point-and-tier mechanics follows this list.)
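
The mechanics behind all three examples are the same: award points for the behaviors you want to encourage, then map the running total onto a status tier.  Here’s a toy sketch; the point values, tier names, and thresholds are my inventions, not Nordstrom’s or eBay’s actual rules:

```python
# Invented point values for behaviors a program might want to encourage.
POINTS = {"purchase": 10, "review": 5, "referral": 25}

# Invented tiers, checked from the highest threshold down.
TIERS = [(500, "Gold"), (200, "Silver"), (0, "Bronze")]

def status(activity):
    """Total a member's points and return (points, tier name)."""
    total = sum(POINTS.get(action, 0) * n for action, n in activity.items())
    for threshold, tier in TIERS:
        if total >= threshold:
            return total, tier

print(status({"purchase": 18, "review": 4, "referral": 1}))  # (225, 'Silver')
```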

As it turns out, status-seeking, even just for bragging rights and a pink star by your name, is sufficient motivation for many people to enthusiastically and, more importantly, repeatedly engage online with various organizations.  According to research by the Outdoor Industry Association, 57% of consumers are motivated to interact with online product-related games to get discounts, 50% for social action that benefits children, 50% for games that reward loyalty points, 44% for social action that benefits the environment, 42% for social action that benefits the community, and 26% to gain status in the online community.

Does all that social philanthropy as a motivation surprise you?  It did me, until I remembered that the growth and momentum of the gaming AND social networking AND mobility revolution is heavily fueled by women — and not just women, but moms… those people who naturally spend a great deal of their time worrying about how to improve the lives of children, the environment, and the communities they live in.  Anyone in the retail or gamification world would do well to take note and ensure that their gamification strategies are suited to this audience.

But, how to motivate people at the office?  Sometimes, work is just work and you just have to slog through it, right?  Well, your boss has noticed that you feel that way and she’s concerned.  Employee engagement is a major concern for many organizations, and companies such as Bunchball and Badgeville are trying to answer that question.  The applicability of gamification to talent development is obvious, but how to make your day-to-day job more interesting and, yes, even fun?  How to use games to encourage the kind of enthusiastic collaboration that leads to innovation and employee loyalty?

I don’t know the answers to those questions, but I think that gamification visionary Jane McGonigal is onto something.  Back in 2010, Ms. McGonigal made an impassioned speech at TED claiming that “Gaming Can Make a Better World”.   McGonigal pointed out that the best games challenge a group of people to achieve epic goals (my emphasis), promote collaboration, and build trust.  McGonigal has created alternate reality games that “challenge players to tackle real-world problems at a planetary-scale: hunger, poverty, climate change, or global peace.”  No shrinking violet, her!

But, back to epic goals.  For my money, therein lies the missing piece for worker engagement and why the retail world has a leg up on the business world when it comes to gamification.  Remember that three of the top five reasons many people play product-related games involve social action resulting from playing the game.  I’ve worked in a number of professional environments and I can vouch for the fact that very little of one’s life at the office concerns itself with a goal, epic or otherwise, that has any relation to “social action” or “real-world problems of planetary concern”.  If workplace gamification strategies can help to reduce the noise of and time wasted on office politics and help workers to understand how their work is an important contribution to the state of the world — or even just the state of the company — then the gamification companies will have achieved an epic goal, indeed.


Investing Virtual Dollars For Digital Business?


This recent McKinsey study, Minding Your Digital Business, is fascinating in that it points out that many senior-level executives have significant expectations of the business benefits of “new” digital technologies, but have not yet allocated significant budget or other resources toward those efforts.  How do your marketing organization’s digital marketing practices stack up against other companies’?  Has your company put enough focus on these topics to achieve business results?


Are You Experienced?

There is a lot of talk in the marketing world these days about “The Experience”.  The Experience refers to a philosophy, event, or communication crafted by a particular company or brand to engage customers on a deeper level — to prompt a unique feeling or personal connection with the company or brand.  Hopefully, that unique feeling or connection creates customer loyalty, so that customers will not only come back, but also influence their friends, family, business connections, etc. to conduct business with the brand.

Like many things in marketing, the idea behind The Experience is not new — really, it’s just a new term that refers to the ongoing effort to align various functions like branding, packaging, communications, customer service, and so on within an organization.  The reason that The Experience is capturing so much of the marketers’ attention is that there are new channels and tactics that many are just beginning to understand how to use to their advantage:  things like digital media, social networking, gamification, and mobile technologies.

This is all good, but many marketers usually just copy tactics they’ve seen used by other companies.  If Brand #1 crafts an online game for their target audience, it’s usually not long until Brands #2, 3, and 4 do the very same thing.  Doesn’t this defeat the original objective of crafting a feeling or connection unique to the brand?  The answer is “yes”, but marketers are often so concerned about damaging the corporate image in some way or not generating enough leads for sales that they become highly risk-averse.  In short, marketers don’t like to try new things, even if The Experience they could create would be truly innovative enough to capture the highly fickle attention of the marketplace.

This brings me to my primary topic (yes, I do have a point here!).  One of the emerging technologies which marketers often overlook — or even actively avoid — is augmented reality.   Yesterday, I was invited to attend an augmented reality conference, Life 3.0 – Funders and Founders, in San Francisco.   Based on what I saw there, I have a message for my marketing buddies:  If you truly want to create a unique Experience, ignore augmented reality at your peril.  Today, augmented reality is (1) readily accessible to businesses, (2) unmatched in its ability to create a captivating and interactive engagement with customers, and (3) almost 100% guaranteed to have customers raving about your Experience to their extended networks.


Me and Roman Hasenbeck of Metaio

I’ve been poking around the augmented reality world for several months now, led on my journey by one of the leaders in the space, Metaio.  Since it was started in 2003, Metaio has done several highly impressive projects, with lots of potential for the B2C and B2B worlds alike.

Other notable companies at the event (P.S. – I was really happy to see a number of women-owned / women-managed AR companies there!):

A little girl’s dream!

Zugara – This was one of my favorite companies, as it appealed to my avoidance of department stores to try on new clothes.  CEO Matthew Szymczyk walked me through a demo of their LazyLazy.com application, in which online shoppers use a gesture-based interface and a webcam to “try on” different clothes.  While the clothes don’t yet align to the shape of shoppers’ bodies, users can assess color and style to see if they might want to order the item or even venture into the brick-and-mortar store to try the real thing on.  A related, more fun application was Barbie’s Dream Closet for Mattel, which allowed attendees at Mercedes Fashion Week in New York to walk into Barbie’s candy pink closet and try on her amazing gowns.

3D Bin – This company offers 3D views of various retail products that shop-at-home customers can use to examine potential purchases from all angles.

ComeOut&Play / Uwar – Francesco Ferrazzino has crafted a social network and “war game” which can be played in any forum, indoors or out.  Each player has a unique graphic code (similar to a QR code but highly simplified for long-range targetability) that they wear on their shirt, and players “shoot” each other using their mobile phones.  Player points and rankings are tracked on the social network.  While this wouldn’t be appropriate for most corporate experiences, I could see the X-Games crowd or other companies that target Gen Y and Z getting into this.

I, for one, am really looking forward to seeing how marketers will embrace augmented reality for their Experiences.  All signs indicate that the shopping experience in 2030 will be vastly different — and vastly improved — from today’s experience.

Can you think of augmented reality experiences that you could deploy for your company or that you’d like to see from your favorite brands?


Does Your Job Make You Look Fat?

Does your job make you look fat?

If you’re a “white collar” worker, it probably does.  In addition, your job is probably  increasing your risk of disease.

This is exactly the topic that a team of researchers at the University of Queensland, Australia explored in a recent study.  More specifically, the researchers wanted to see how white collar workers might adjust their activity levels and workstyles when given adjustable office furniture that allowed standing or sitting, along with accelerometers and inclinometers that measured physical activity and inactivity, respectively, throughout the workday.  The team also monitored participants’ levels of glucose, insulin, and triglycerides to understand how seemingly minor adjustments in daily physical activity may affect these blood markers for cardiovascular disease.

Their findings:  people need to not only exercise, but also break up the periods of sitting around during the day, too. “Ideally, having a good mix of standing, like shuffling about and walking in the working day, and avoiding too much prolonged sitting, is what business owners, managers, and workers should be aiming for,” says study coauthor Neville Owen, PhD, director of Australia’s Cancer Prevention Research Centre at the University of Queensland, in Brisbane.

Turns out, many office workers had already figured out that prolonged periods of sitting in commute traffic and then in a cubicle with free, high-calorie snacks and sodas at arm’s length was causing them to suffer not only obesity, but also a range of other issues such as back pain, type 2 diabetes, inability to concentrate, and even writer’s block.

So, there is an emerging and growing subculture of office workers who demand adjustable desks that allow them to choose to stand while working.  The key word there is “choose”.  Many companies will provide standing desks to employees who cite an ergonomic need, but truly adjustable desks — desks that easily switch between standing and sitting positions — are often considered specialty items and generally don’t find their way onto the company’s approved list of options for office equipment.

Office furniture designers and manufacturers are paying attention, though, and anticipate that more enlightened choices for office equipment will prevail.  My Google search of “adjustable standing office desk” returned “about 1,450,000 results”.  My favorite is the GeekDesk, which sports “an electric motor that (allows shifting) from chair height to person height at the push of a button“.  (Anyone from GeekDesk reading this?  I would love a sample, hint hint!)

In the meantime, we may observe more office trends from colleagues who are leading the way to a more active life-… er… work-style.

Do you practice an active workstyle?  What additional tips do you have?


Where Do You Microwork?

In a previous post, I explored the decline of “Good Jobs”.  If Good Jobs are disappearing, what will replace them?

Instead of hiring even temporary workers, many companies are experimenting with models that access required skills on an as-needed basis rather than “own” (read: hire) employees.  An access-vs-hire trend to watch:  microwork.  Microwork, sometimes also called crowd-sourced work, provides even more flexibility to employers than temporary contract workers do.  Companies that use this model break up large projects into much smaller tasks and then access labor – both skilled and unskilled – through the Internet to get the work done.  The tasks are typically so small that people engaged in microwork generally receive a few cents in compensation for completing each task.

So far, microwork has been used primarily for tasks that are necessary but tedious and don’t require specialized knowledge to complete, such as simple data entry or cataloging web pages.  Another reason companies consider microwork is that they can access hundreds or even thousands of people to complete individual tasks, so the overall project can be completed more quickly and cheaply than if the company created a small, temporary team for the project.
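
The core mechanic really is that simple: slice one big job into thousands of bite-sized tasks, price each task in cents, and fan the tasks out to whoever is online.  Here’s a generic sketch; this is not any real platform’s API, since Mechanical Turk and its competitors each have their own:

```python
from dataclasses import dataclass

@dataclass
class MicroTask:
    task_id: int
    payload: str        # e.g., one web page URL to catalog
    reward_cents: int   # typical microwork pay: a few cents per task

def split_project(page_urls, reward_cents=3):
    """Turn one big cataloging project into one task per page."""
    return [MicroTask(i, url, reward_cents) for i, url in enumerate(page_urls)]

# A 100,000-page inventory becomes 100,000 three-cent tasks: $3,000 of
# labor, finished as fast as the crowd can click.
tasks = split_project(f"http://example.com/page{i}" for i in range(100_000))
print(len(tasks), "tasks,", sum(t.reward_cents for t in tasks) / 100, "dollars")
```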

While microwork was new to me until a few days ago, a Google search on the term returned over 80,000 results, and infoDev estimates that there are around 100 organizations, such as Jobsfor10.com and Microworkers.com, that would be happy to help you sign up for or set up a microwork project right away.  Even Amazon is getting into the game with their Mechanical Turk service for “Artificial Artificial Intelligence” (I gotta wonder how the marketers came up with that name.  I do have to give the Mechanical Turk marketers a few kudos, though:  they at least raised the profile of microwork by dubbing the tasks Human Intelligence Tasks, or HITs.)

One does not have to look far to find examples of microwork projects.  One of the most heavily publicized examples of late was a project undertaken by AOL.  AOL needed to inventory all videos published on thousands of web pages.  AOL used Mechanical Turk to farm out microprojects, and voila, the project was done in a matter of a few months.   As one might predict, the microwork model is popular in China, and some lucky microworkers there get to play World of Warcraft all day (though this work may be better categorized as crowd-sourced work).   The workers earn WoW gold that their employer can then turn around and sell to regular players who don’t have the time to earn the WoW gold themselves.

If employers figure out how to expand the microwork model to more complex jobs, it seems likely that many of us will have microwork, rather than Good Jobs, to pay the bills in 2030.

What do you think?  Would you take on a microwork project?


Is the “Good Job” an Endangered Species?

Will any of us have a Good Job in 2030?

If the employment practices that organizations are experimenting with today are any indication, it’s likely that there will be fewer Good Jobs – at least as we know them in 2012 – in existence in 2030.

First, to define the Good Job.  It’s an almost universal rite of passage into adulthood for a young person to find full-time, gainful employment with a stable company.  Upon securing the Good Job, said young person is generally expected to leave the comfort of the family home for fully-fledged, independent adulthood with complete responsibility for their own finances and overall well being.

But, today’s young people have come of age when the global Good Job market has crumbled, and they are particularly singled out as victims.  In 2011, the OECD found that 12.5% of people aged 15-24 globally were NEETs (not in employment, education, or training).  In May of the same year, 60% of university graduates in the U.S. had enormous challenges finding even a minimum-wage job or an unpaid internship in their field of choice, let alone a full-time professional role, such that 9.4% of U.S. graduates aged 24 and under were jobless.

More recently, there are indications that the international economy is improving, though almost imperceptibly, and that full-time jobs are steadily, if slowly, being created.  If you are looking for a job in the United States, though, you can take little comfort in this news:  according to a Wall Street Journal analysis, thirty-five large U.S.-based multinational companies (>50,000 employees) added jobs much faster than other U.S. employers in the past two years, but only one-fourth (113,000) of those jobs were created in the U.S., while nearly three-fourths (333,000) were created in their foreign operations.  Interestingly, economists posit that this overseas expansion was not fueled by a search for cheaper labor, but rather by the companies’ desire to capture revenues from new or expanding regional markets.

One would think that with the impending global talent gap, companies would seek out the best people in the larger candidate pool and invest in them.  It turns out that plans to address the talent gap are not being crafted, let alone implemented, at most companies because the business market has been so volatile that plans would be obsolete almost upon completion.  This means that employment models that enable agility and responsiveness in sourcing skills – instead of stockpiling and investing in a workforce – are garnering a lot of executives’ attention.

I’ll save those employment models for future posts.  For now, it seems clear that the Good Job as we know it is in for some big changes in the years to come.  What do you think?  Have you seen evidence of these changes in your own organization?
