The Mythology of Poverty

Whoever said “There’s honour among thieves” obviously hadn’t met many thieves.  This is one of those modern-day truisms that simply isn’t true.  Thieves steal things; that’s their job.  When there’s no one else about, they will steal from each other.  Haven’t you ever seen The Sting?  Our world is chock-full of these pseudo-aphorisms — all widely accepted and all utter crap.  For the most part, they’re harmless, even cute.  But lately they’ve been creeping into our fundamental thinking, causing trouble, distorting our ability to handle problems.

For example, “Honour among thieves” suggests that there’s some kind of a Rogue’s Code out there that governs the little bastard who stole your iPhone™.  There isn’t!  He doesn’t belong to a fleet-footed fraternity of contemporary Robin Hoods, dedicated to redistributing technology to the less fortunate.  The only creed he lives by is economics – straight up and down.  He stole your phone for money: that’s it!  We attribute a modicum of honour to his profession because most of us simply can’t fathom an ordinary person following a moral compass that has no dial.  The reality, however, is that the gentleman thief is a fiction, created by Sir Arthur Conan Doyle’s brother-in-law to amuse his Victorian friends.  Unfortunately, it has somehow become stuck in our psyche, with disastrous results.  And it’s not the only one.  There are others way more serious.

There is a general misunderstanding that poor people have a moral leg up on the rest of us.  It is widely believed that if you are struggling to make ends meet, you’re absolutely bursting with integrity.  Not only that, but if, for whatever reason, you jump off the moral balance beam, the assumption is that you were forced into it by an unforgiving society.  Let me set the record straight.  People who take the early bus to menial, minimum-wage (or below) jobs do not necessarily have either honesty or empathy hardwired into their DNA.  Yes, they are working hard and, quite probably, getting the shaft on a daily basis, but I doubt very much that moral intrepidity depends on an unfavourable income tax bracket.  The “Poor but honest” stories we all grew up on are wonderful tales for children.  However, unless you’re seriously into economic profiling, there’s no reason to believe that poor people are any less corruptible than your average middle-class, 80K-a-year systems technician.  (No offence, systems technicians!)

And while we’re at it, the other prevailing myth about poor people is that they all want to live together.  There is an unshakable belief among NGOs, city planners and politicians that the cure for homelessness, slumlords and squalor is social housing (sometimes euphemized as affordable housing).  Surprisingly, this legend is still with us, even after half a century of building gigantic, high- and low-rise concrete bunkers to warehouse the poor.  These urban battle zones are low-rent Mogadishus and probably contribute as much to our low-income social problems as cheap, hardcore drugs.  The real head-scratcher, however, is that the biggest proponents of social housing all live in tidy little neighbourhoods with painted fences, dogs on leashes and manicured lawns.  Either that, or they’re in gabled condo communities with assigned underground parking and more security than the Green Zone in Baghdad.  Is it just me, or is the disconnect here so wide you could sail the USS Abe Lincoln through it?

These are just two examples of truisms about poverty that simply aren’t true.  There are piles more.  Think about it.  Poverty is not one homogeneous entity.  It covers a huge area of land and has millions of people in it.  It’s also a relative term.  Poor in Detroit is quite a bit different from poor in Seattle.  The below-average family in Biloxi has more in common with their wealthier neighbours than they do with a statistically similar family in Newark.  Yet, we continue to think, talk and act as though poverty were a one-size-fits-all affliction you throw money at.

Furthermore, some of those most willing to perpetuate these myths are the socially and politically active people who are walking examples of exactly what I’m talking about.  Ever since Bob Geldof couldn’t figure out what to do with Tuesday, wealthy activists have been making a part-time profession out of poverty management.  Sometimes they’re celebrities, but mostly they’re just people with money and time on their hands.  Unfortunately, extended amounts of leisure do not qualify anybody to dabble in economics, education, or social or urban planning.  Their opinions are no more valid than the local dry cleaner’s.  In fact, the very success that gave them this free time is actually a detriment to their thinking.  For the most part, they are isolated from the real world, and some have become so cocooned they wouldn’t know how to cope with reality (poverty-stricken or otherwise) if it bit them on the bum.  I’m sure these people truly care, but that doesn’t mean they know what they’re talking about.  Sympathetic does not equal smart.  That’s just another truism that isn’t.

Our society has some serious problems, and most of us sincerely want to fix them.  Unfortunately, we’re never going to come close to solving any of them as long as we keep taking mythology as our starting point.

How Good TV Goes Bad!

Apparently, the Fox Network is going to cancel House.  I have never seen the show.  No, I’m not a television snob who only watches PBS, nor do I have a philosophical disagreement with scripted TV.  I just didn’t watch it in the beginning, couldn’t figure it out in the middle and wasn’t willing to give it any time after it had passed its prime.  Over the years, literally thousands of TV shows have slipped past me this way.  By the time my friends convince me that the drama is riveting or the comedy hilarious, the program is two or three seasons deep and already going stale.  I usually tune in just in time to catch nothing more than saggy dialogue, lame insults and baggy clichés.  Sometimes, I go back and find a program’s broadcast youth in hit-and-miss syndication, but mostly I don’t, and I doubt if I will with House.  Grumpy medical people haven’t intrigued me since Doctor Gillespie.  Anyway, House was born, lived and is now going to die without us ever becoming friends…oh, well!  It had a good life.

Actually, House is an exception: most television programs don’t have a good life.  If they are bad, they die young.  If they’re good and nobody watches them, they die young.  If they are bad and tons of people watch them, they’re still bad and become a running joke (à la Gilligan’s Island).  Plus, everybody from the executive producer down to the teenage viewer spends the rest of their lives trying to live down their association with that piece of trash.  However, the worst thing that can ever happen to a television show is that it’s good and tons of people watch it.  Only the very best programs can survive that kind of success, and most of them don’t.

Aside from a few excellent aberrations, really good TV is based on character and writing.  All you have to do is look at the CSI franchise to figure that out, and while Miami Vice kinda needed Miami, it could just as easily have been Malibu or New Orleans.  This is the way it’s always been, since the dawn of television.  Even way back in black-and-white days, 77 Sunset Strip and Hawaiian Eye weren’t that much different, and everybody knows that Star Trek was just Wagon Train with short skirts and phasers.  Good characters make good TV, and good writing makes good characters.  However, this is also exactly what makes good TV go so horribly bad.

In the world of television, professional writers pour miles of work (and paper!) into creating characters.  They put them into storylines that let them shine and give them clever things to say.  The sole purpose of this is to make these characters interesting enough that we, the audience, come back next week to see them again.  It’s a hit-and-miss proposition, but when it works, a television show becomes successful.  The characters become our television friends — witty, sexy, smart, comical, caring or just plain cool — in short, everything we wish our real friends were but never are.  After all, who would you rather have a drink with: Lucy, the smart chick from Alcatraz, or your idiot sister-in-law?  No contest!

Unfortunately, this is also the problem: once these imaginary people become our friends, nobody wants to get rid of them.  The producers, directors and technical crowd — right down to the guy who pours the orange juice — have a good gig going.  They’re not going to kill the goose that’s laying the golden eggs.  Furthermore, the advertisers don’t care if we’re watching dancing Bavarian mud monkeys — as long as the audience numbers are up.  And the writers will sell their own mothers before they start the whole process over again.  After all, it probably took them ten years to sell this idea.  So the characters keep hanging around, long after the professional writers (who mostly suffer from acute, undiagnosed ADD, anyway) have run out of imagination.  The stories go flat and repetitive.  (How many ways can everybody love Raymond, for God’s sake?)  They generally outlast themselves by two, three or five years and keep staggering along, like wheezing pensioners looking for the Rest Home.  Either that, or the writers, sensing imminent unemployment, go nuts and call in the aliens or reinvent someone’s parent as a gratuitous celebrity to eke out another season or two.  And that’s how most good TV shows die: shadows of their former selves, alone and abandoned by everyone (often even the original cast), with only the most loyal fans remaining.  As old friends will, we sometimes come back for the last episode, like hangers-on at a funeral, but mostly we’ve gone on to other things, enthralled by our new friends who are young and exciting.

Now that I think about it, maybe it’s too bad I missed House completely.  From the looks of things, it was probably an intelligent, interesting program.  After all, the producers were smart enough to retire the old boy before he was literally on his last legs.

Talkin’ ’bout the i-Generation!

If you’re reading this, chances are good you were born in the 20th century.  If you weren’t, put this down, you precocious little beast, and go out and play.  For the rest of us, the 20th century was the cradle, the nursery, and probably most of the education of our existence.  As Herman Raucher once said, “It is our time, and we’ll never leave it.”  To us, it’s our life.  It’s not history; it’s memory.  The great events we witnessed are coupled with our birthdays, divorces, new cars and houses.  However, in a couple of hundred years (or maybe a thousand) when people look at our time, they’re going to draw a sharp line between the 20th and 21st centuries.  They’re going to separate us like exhibits in a museum.  Right now, we exist simultaneously in both centuries, like two pages of a book — totally different, yet intimately touching at every point and completely useless without the other.  In the future, however, we are going to be one thing and those precocious little beasts poking away at their iPads are going to be another.

As much as most people would like to deny it, we are the cumulative result of history.  There is a direct line from you and me back to the dim reaches of time, when the epic human struggle was merely to stand on our own two feet.  For example, if I wanted to be an intellectual smart-ass, I could trace the birth of our world back to shoddy obstetrics in the Imperial court of Prussia in 1859.  Or I could research (plagiarize is such a hard word) Paul Johnson and go back even further to a Frenchman’s hemorrhoids at the dawn of the 19th century.   My point, of course, is that there is no start to history — only final judgements passed on the results.

The 21st century’s i-Generation is a perfect example.  These are the kids who are adapting our world to Facebook and Google, one app at a time.  They’ve changed Wikipedia from a slightly tawdry secret to a tolerated research tool.  They are intent on sharing not personal experience but data with the world.  In the future, they will be judged in isolation.  Nobody will bother looking at the seeds and shallow roots provided for them.  Their obsession with consumption rather than creation will be seen as a character flaw – an aberration which was always destined to kill or cure our fat, wheezing planet.  Yet the i-Generation didn’t just appear one day like Athena springing from the head of Zeus, nor is it even fully grown yet.  It still depends on Generation Y for its existence.

Generation Y! — those ’80s babies who can’t seem to decide if they are a stand-alone product of the Baby Boom or only just an echo.  Constantly harassed about the dangers awaiting them, these are the folks whose abilities have always outshone their underdeveloped egos.  They risk little and expect much.  Their literature is the graphic novel; their art, the expressive font; and their technological advances are made on the playing fields of virtual reality.  Generation Y stands alone — its devices connected to the planet, but its members utterly isolated from it.  Not since the Dark Ages have humans been so devoid of contact with the outside world.  Generation Y lives and works in a series of separate technological villages, timidly toiling like Coupland’s microserfs, afraid to venture beyond their firewalls.  But they, too, did not arrive fully formed like a Botticelli Venus rising from the ocean’s foam.  They are the children of Generation X.

Generation X, the first generation of the Age of Entertainment.  They showed up just in time to see America leave the Moon, never to return, and George Lucas unleash his Jedi to battle the Death Star.  Raised on Sesame Street and Cocoa Puffs cartoons, Generation X has never understood why the world doesn’t play nice like its television friends do.  Completely overshadowed, Gen X was forgotten and left to fend for itself.  As it saw the brave old world bending under the weight of its uber-ego parents, it could only step back in fear of the imminent collapse of power, oil and profit and seek salvation in Spielberg and Scorsese.  Still, we must remember Generation X was never abandoned like the solitary child of the goddess Hera, twice tossed from heaven.  It was born into the rarefied air of the Baby Boom.  It cut its teeth on impending disaster, with only discredited institutions and disassembled Gods to comfort it.

And so it goes, back and back, each generation shaped by the wants and fears of the ones preceding it.  If history judges the early 21st century harshly, it will be because the i-Generation believes that clicking Like on Facebook can change the world.  Yet it is we, from the 20th century, who taught them that.  The i-Generation is the product of two generations of constraint and constriction.  The Xs and Ys watched the world of their parents and grandparents falter, assailed on all sides by everything from financial ruin to pandemic disease.  So they taught their children to be wary, to keep their distance and to “Stay strong.”

John F. Kennedy spoke of his generation (what we call “The Greatest Generation”) as “born in this century, tempered by war, disciplined by a hard and bitter peace.”  I don’t have many regrets, but I do regret not being able to hear what the i-Generation will eventually say about itself.