Tuesday, December 27, 2011


Though some of my colleagues might cringe to hear it, non-architects--those who lacked either the formal schooling or the license to legally use the title “architect”--have had a huge impact on American architecture over the past century.  If they weren’t architects in the legal sense, they more than lived up to the title’s original meaning of “master builder”.

Why not start at the top?  Frank Lloyd Wright’s only formal training consisted of a year of engineering classes at the University of Wisconsin.  Thoroughly bored, he dropped out in 1888 and headed for Chicago to find a job.  He quickly found one, first apprenticing with the Chicago architect J. Lyman Silsbee, and later and more famously with his “lieber Meister”, Louis Sullivan.  In 1893, after a falling out with Sullivan over taking outside work, Wright left the firm and opened his own office, where he was able to use the title “architect” only because his practice predated the Illinois licensure requirements by four years.  
Wright nurtured a lifelong disdain for traditional architectural training, which eventually led him to found the Taliesin Fellowship, a unique school in which apprentice architects learned largely by doing.

But Wright is only the best-known example of brilliant architects with unconventional or even nonexistent educations.  In another vein entirely is Addison Mizner, the California-born, Guatemala-raised, Florida-polished raconteur who improbably rose to become the top society architect of Palm Beach during the Roaring Twenties.  Mizner despised school, and accordingly his only architectural training was a three-year apprenticeship with the San Francisco architect Willis Polk.  The happy result was a personal style that drew more from his childhood knowledge of Spanish Colonial Guatemala than from the copybooks so beloved by his contemporaries.  

Nevertheless, Mizner’s romantic antiquarian villas were considered vulgar set pieces by his academically trained colleagues.  It probably didn’t help that he also ran a business manufacturing mock-antique furniture and building materials, which he used liberally in his own work.  Mizner’s career was spectacular but brief; he died in 1933.  Today, his surviving Palm Beach work ranks among the finest Spanish Revival architecture in the nation.

On the opposite coast, Cliff May, the San Diego designer widely considered the father of the California Rancher, started his career building Monterey-style furniture.  When he began designing Spanish Colonial-style houses for speculative builders in the early 1930s, academic architects dismissed him as a purveyor of kitsch.  Yet over time, May’s rambling, site-sensitive designs metamorphosed into the rustic and low-slung homes that Americans came to love so well.  All told, May built his Ranchers in forty U.S. states, and their spiritual heirs went on to become the dominant style of the postwar era.  Genuine May-designed Ranchers, not to mention his earlier Spanish Revival designs, are now celebrated and studied by architectural connoisseurs.  

Despite these formidable accomplishments, May received only late and grudging acceptance from his licensed colleagues--or as he rather poignantly put it,  “It took real architects a long time to let me into the club.”   

Next time, we’ll look at a few more outsiders who changed the course of architecture, and see what they all had in common.

Monday, December 19, 2011


During the past few weeks, every time I’ve had to use yet another badly-designed appliance, or had to sit idling at yet another ineptly-timed traffic light, or had to decipher yet another garbled set of instructions, I’ve thought of one man: Steven Jobs. And I wish there could’ve been a hundred more like him.

There’s no doubt that, with Jobs’s passing, the world has lost one of the most important visionaries of the last hundred years. But for me, the loss has less to do with his putting a computer for the rest of us on a million desktops, or with his uncanny knack for creating things that people didn’t even know they needed. Granted, these accomplishments are vastly important to Jobs’s legacy. But to my mind, his ultimate triumph was his singular skill at persuading a largely indifferent public that excellent design really matters. He wanted us all to be as passionate about beauty and simplicity as he himself was. And to the extent that Apple’s famously intuitive and user-friendly products are now more popular than ever, he seems finally to have succeeded.

The fact is that the average American consumer has been amazingly tolerant of third-rate product design. Consequently--and understandably--any company that knows it can make perfectly good money selling clumsy, overcomplicated, or unintuitive products has no incentive whatever to improve them. And so most don’t. 

In Jobs, however, we had the unique case of a businessman on a near-religious crusade to educate his own market, relentlessly challenging us to demand more than the run-of-the-mill crap we’re typically offered. 

It’s interesting to note that the Apple cofounder, despite being a pioneer in one of the most technically complex fields yet known to man, was not an engineer but rather a laid-back college dropout with a mystical streak.  To add yet another layer of paradox to this singular mind, he was notoriously--some would say tyrannically--demanding of the people who worked for him.  But if this is what it took to engender the phenomenally beautiful and beautifully functional objects Apple has created over the years, then it was all worth it.

As you’ve probably guessed, I write on a Macintosh, and have done since I bought the very first model through an Apple engineer pal back in 1984. So yes, kids--I’ve been a true believer since long before the iPod, iPad, or iPhone even existed. And for many of those years, I tried in vain to convince doubters why there was nothing like using a Mac--in short, why good design really mattered. Thankfully, with the wild success of those assorted i-Things, Jobs was finally able to make that case for me. 

Steve Jobs had already revolutionized the fields of computing, film, music, and telephonics. I wish he’d been given the time for even more far-flung conquests, because I have no doubt that the world would have been a better place for it.

We could have used a hundred more like him, but alas, there was only one.

Monday, December 5, 2011


Not long ago, I handed a young architectural intern a preliminary sketch to be drafted up on the computer. It was a site plan for an agricultural research facility comprising 130 acres, about eighty acres of which were supposed to be reserved  for farmland.

A week later, as promised, I received the computer drawing. But lo and behold, the great swath of undeveloped acreage shown in the original plan had been completely filled up with a meandering web of plazas and pedestrian malls in a galaxy of arbitrary shapes--pinwheels, checkerboards, crescents, what have you. Setting aside the fact that these busy forms would only have made sense from the air, they would also have made for some rather difficult farming.

When I asked the intern why she’d added all those features unbidden, she replied:  “The plan looked so empty, I thought the client would want to see more things in it.” 

This is a problem that afflicts all creative people, so much so that we even have a Latin name for it: horror vacui, or fear of emptiness.  Herbert Muschamp, the late architecture critic of The New York Times, called it “the driving force in contemporary American taste...(and) the major factor now shaping attitudes toward public spaces, urban spaces, and even suburban sprawl."

As Muschamp rightly perceived, the horror vacui is especially pronounced among architects.  Many, like my young drafter, think that if they don’t fill up every space with an avalanche of ideas and images, however unrelated to the program, they’ve somehow fallen short of their creative charge.

In fact, just the opposite is true.  Architecture is a process of reduction, not just compilation.  Ideally, the architect distills a complex set of requirements into the simplest form that will both satisfy the client’s needs and offer some measure of personal artistic grace.  The avalanche of ideas has its place early in the process, but as things progress, design features that aren’t essential--whether for function or effect--fall away, leaving the final polished kernel of a solution. When carried out with skill, this process doesn’t preclude fanciful ideas, but it does preclude dysfunctional and clumsy ones. 

Of course, today’s designers aren’t the only ones afflicted with horror vacui-- it’s a tendency that waxes and wanes over decades. Victorian architects, for instance, couldn’t bear to see an unadorned surface.  The dawning twentieth century brought a counterreaction to this compulsive decoration; it began with the Mission Revival and Craftsman styles and reached its zenith with International Style Modernism, whose practitioners turned architectural reduction into an art form.  

Ironically, it’s precisely this Modernist austerity that’s sent us hurtling back toward the frenetic gimcrackery so evident in contemporary design. And while architecture without complexity is dull, architecture that’s layer upon layer of complexity is simply meaningless.  

As in so many other things, the answer lies in striking a balance. Some of our era’s most idealized domestic architecture--rural French farmhouses, say, or those much-admired vernacular hillside towns in Italy or Spain--are about as spare and simple as could be while still suiting their purpose. Against such a clean sharp background, a single flowerpot or bit of filigreed ironwork fairly bursts with ornamental power.  

Alas, like my young intern, many architects still grow fidgety at the sight of a plain white wall, much less an empty plot of land. That’s too bad because, more often than we’d like to think, the best designing we can do is none at all. 

Monday, November 28, 2011


We all know that nothing looks more dated than last year’s red-hot style.  What’s not so obvious is why consumer styles-- whether clothes, curtains, or cars--come and go with such cyclical certainty.  More often than not, the seeds of new design trends are carefully nurtured by their respective industries to spur sales, and then disseminated via design magazines, television shows, and the like.  Clever marketing encourages consumers to believe that they’re the ones driving these trends, when in fact it’s more often the other way around.  

Once a hot trend inevitably runs its course, another comes along to replace it.  Those who literally bought into the previous fashion cycle are left with outmoded items that once again beg to be replaced with more current ones, thereby starting the cycle anew.  

The American auto industry brilliantly exploited this marketing ploy during the postwar era.  Back then, Detroit’s enormous, chrome-laden cars were heavily restyled each and every year, ensuring that the driver of last year’s model would be acutely aware that his near-new car was already out of date.  While most people are now wise to the role of planned obsolescence in selling cars, not so many are aware that the makers of domestic products play the same marketing game.  

Take kitchen appliances, for example.  Since a washing machine or refrigerator will ordinarily last decades, the simplest way to coax consumers into buying a new one is to make them embarrassed at how dated the old one looks.  Accordingly, over the years, we’ve seen a whole succession of color and finish fads come and go, each by turns energetically touted as the ultimate in chic.  They’ve ranged from the basic sanitary-white appliances of the late 1940s through Turquoise, Coppertone, Avocado, Harvest Gold, Almond, Black, and eventually back to white again.  

Of course, merely ending up right where you started wouldn’t carry much urgency as a fashion statement, so appliance makers found a new sales angle: Why, this wasn’t just plain old white--it was White on White.    

Given that any fashionable item is doomed to look uniquely dated in a very short time, one wonders why people continue to be so easily swayed by the artificial dictates of fashion, rather than recognizing it for the finely-tuned sham that it is.  

At the root of this susceptibility lies, I think, an unfounded lack of confidence in our ability to judge for ourselves.  Dig even deeper, and we may find a reluctance to trust one of our most important design tools:  our own intuition.  For instance, when clients bring me a range of color choices for, say, countertops, they’ll dutifully run through the ones they perceive to be in step with current design trends.  But at some point, they’ll show me the one color that really makes their eyes light up, which they’ll resignedly dismiss with some comment such as, “I absolutely LOVE this color, but I know it’s way out of fashion.”

I couldn’t think of a better reason to choose it.

Monday, November 21, 2011


I often get calls from nice folks who’ve drawn up their own plans and want me to check them for problems.  Some of these designs are wonderfully creative, yet virtually all of them are sabotaged by the same basic shortcomings:  People never allow enough space for hallways, staircases, kitchens, or baths. 

Stairs are undoubtedly the biggest booby trap for neophyte planners.  Even a relatively steep, straight stair climbing your basic nine-foot-high story requires a bare minimum floor area of three by ten feet--and this doesn’t include the top and bottom landings or the thickness of the enclosing walls.  L- or U-shaped stairs need even more room. Yet people routinely show me designs for second-story additions in which the entire staircase is miraculously packed into a linen closet.  They’re usually crestfallen to learn that, in fact, the new second-floor bedroom they thought they were adding will only be replacing the one wiped out by the stairs. 
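For readers who like to check the arithmetic themselves, here’s a rough sketch of the stair math in Python.  The riser and tread limits used (an 8-inch maximum riser and 9-inch tread for a relatively steep stair) are my own assumptions for illustration; actual limits vary by building code and era.

```python
import math

def stair_footprint(floor_to_floor_in, max_riser_in=8.0, min_tread_in=9.0):
    """Return (riser count, actual riser height, total run in inches)
    for a straight stair.  Limits are illustrative, not code citations."""
    risers = math.ceil(floor_to_floor_in / max_riser_in)
    riser_height = floor_to_floor_in / risers
    treads = risers - 1            # a stair always has one fewer tread than risers
    run = treads * min_tread_in
    return risers, riser_height, run

# Your basic nine-foot story:
risers, riser, run = stair_footprint(9 * 12)
print(risers, round(riser, 2), run / 12)   # 14 risers, 7.71" each, 9.75 ft of run
```

Add the top and bottom landings and the enclosing walls, and you can see why even this steep stair eats up the three-by-ten-foot footprint mentioned above--and why it will never fit in a linen closet.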

Kitchens are typically overcrowded as well.  The absolute minimum aisle width between facing countertops--even those on islands--is four feet.  Although this may seem excessive on paper, it won’t be once you’ve got doors, drawers, and dishwasher racks projecting into the aisle, not to mention a few bystanders “helping” you cook.  Nor should sinks and cooktops have less than eighteen inches of counter space on either side--and again, this includes islands.  

Even when they know there really isn’t enough room to accommodate everything they want, amateur planners will often try to cheat their way out of the problem by cannibalizing other spaces.  Clothes closets are a common victim:  Although they need to be at least two feet deep, people are always trying to whittle a few inches off them to buy space somewhere else.  Forget it--jacket sleeves cannot be fooled by this strategy.    

Other immutable rock-bottom minimums:

•  Foyers need to be at least six by six feet.

•  Hallways, like stairs, can be no less than three feet wide.

•  Walk-in closets need to be at least five feet wide for a single-sided arrangement, and seven feet wide for a double-sided one.  

•  Double lavatory sinks require a countertop at least six feet wide.  Never mind those dinky five-foot examples you find at the big-box store--that’s just wishful thinking. 

•  Toilets should occupy a space at least 30 inches wide when between a wall and a counter, and at least 36 inches wide when between two walls.

•  Stall showers require a space no less than three feet square; tubs and tub/showers need at least 2 feet 8 inches by 5 feet.

•  Garages must be at least 19 feet deep inside.  And don’t dream of trying to squeeze a furnace, water heater, or washer and dryer into that minimum, either.

When space is tight, both architects and amateurs can be tempted to fudge minimum dimensions by a few inches here or there.  Don’t.  In fact, it’s good practice to allow a few inches more than you need, since finishes, trim, and unexpected errors or obstructions often conspire to nibble away precious room from a space that’s already squeezed.  If you can’t accommodate the above minimums, you may need to rethink your wish list.  Better to throw a few things overboard than to sink the whole ship.
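If you’d rather let the computer nag you than discover a shortfall later, the minimums above can be restated as a simple checklist.  The sketch below encodes them in Python; the names and values merely restate this article’s rules of thumb in feet, not any official code requirement.

```python
# Rock-bottom minimums from the article, in feet (rules of thumb, not code).
MINIMUMS = {
    "hallway_width": 3.0,
    "stair_width": 3.0,
    "foyer_side": 6.0,
    "kitchen_aisle": 4.0,
    "clothes_closet_depth": 2.0,
    "walkin_closet_single": 5.0,
    "walkin_closet_double": 7.0,
    "double_lav_counter": 6.0,
    "garage_depth": 19.0,
}

def check_plan(plan):
    """Return a list of (item, planned, required) shortfalls."""
    return [(item, plan[item], MINIMUMS[item])
            for item in plan
            if item in MINIMUMS and plan[item] < MINIMUMS[item]]

# A plan that tries to cheat the closet and the kitchen aisle:
print(check_plan({"clothes_closet_depth": 1.75, "kitchen_aisle": 3.5}))
```

Anything the checker flags is a candidate for throwing overboard, not for shaving.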

Monday, November 14, 2011


Years ago, when I was a punk architect in my twenties, I asked a well-known local contractor what he considered the most important factor in a good remodel.  I suppose I was fishing for an answer along the lines of, “Excellent design”, or at the very least, “A decent set of plans”.  

His one-word reply:  “Painting.”  

He went on to explain that he had a sort of fetish for excellent painting.  He maintained that the quality of the paint job was what really set apart a top-notch project, because when all was said and done, the paint was the surface that everyone saw.

At the time, having just recently emerged from Berkeley’s incomparably touchy-feely school of architecture, I remember thinking to myself, “Now, this is one shallow cat.”  But over the years, I’ve come to realize that he was absolutely right.  Not that good design isn’t important--obviously, I think it is, or I’d become a hot dog vendor on the Berkeley Pier quicker than you could say Mies van der Rohe.

But the fact is that even the best design and the finest workmanship can be instantly reduced to a tawdry mess by the sort of slapdash painting that’s all too common these days.

Skilled painters are accorded far too little respect, partly because there aren’t that many of them.  Instead, the field is swamped by low-balling incompetents who think the ability to wield a dribbling roller qualifies them to use the title “painter”. Most find work solely because they’re cheap.  Top-quality painters are further cursed by the fact that the painting phase occurs toward the end of the project, just when overextended owners are most likely to start tightening their purse-strings.  

Alas, whether you’re building from scratch or remodeling, cutting corners on painting is likely to cost you dearly.  My not-so-shallow contractor friend was exactly right:  Painted surfaces are ultimately what most people notice.  Hence, if your house looks like it was painted by Mr. Magoo on crack, all of your earlier efforts will have been in vain.  Here, then, are some bare minimum standards to expect from a paint job:

•  The coverage should be uniform, without a watery, skim-milk appearance.  In addition to proper application, using top quality paint makes a big difference here.  A good painter will automatically insist on using the very best paint.  If you find your painter using econo-buy paint, plan to be disappointed with the results.

•  Borders between different colors should be sharply cut in without wavering.  

•  Painted wood windows should have a sharp, clean line where wood meets glass, not a raggedy edge.  

•  There shouldn’t be a speck--and I mean not a speck--of paint or overspray on stone, brick, glass, tile, unpainted metal, or any other finished surfaces.  Nor should there be overspray on shrubbery, walkways, natural wood structures, or roofing if it’s an exterior paint job.  Don’t buy the frequent excuse that the resulting mess can be cleaned up later--nine times out of ten, it won’t happen.  Instead, insist that all surfaces are properly protected in the first place.  Door lock hardware, for example, should be removed--a procedure that takes a few minutes per door--not painted around as is common with cut-rate practitioners.  

The standard for neatness is simple:  Paint what’s meant to be painted, and don’t mess up the rest.

Monday, November 7, 2011


“The physician can bury his mistakes,” Frank Lloyd Wright told the New York Times in 1953,  “but the architect can only advise his client to plant vines.”  

Such wisecracking aside, Wright probably knew better than most architects the value of integrating nature into his work, and not just as a remedy for aesthetic failure.  

A visit to Taliesin, his home in central Wisconsin, makes this amply clear:  The house is wrapped around the crest of a hill on three sides--”not on the hill, but of the hill”, as Wright liked to say--and the erstwhile farmboy’s love for nature informs every nook and cranny of the place.   

Whether cottage or mansion, a truly livable house should, like Taliesin, seem inseparable from its site.  Sometimes, the simple passage of time and the attendant growth of planting are enough to create this effect, as many an overgrown bungalow will testify.  If you can’t wait around fifty years, however, there are also a number of design strategies that can help weave a new home or addition into its site right from the start.

•   Build decks or terraces as close as possible to the interior floor level, rather than having a back-porch-like stair leading down to them.  Since a house that’s markedly above the outside ground level can feel cut off from the outdoors, creating outdoor space that’s flush with the ground floor will both visually expand the interior space and help integrate it with the surroundings.  

If the vertical distance to the outside grade is more than a couple of feet, consider having several levels of decks or terraces that gradually step down to the ground.  Use level changes of two or at most three steps, each no more than six inches high, and avoid using single steps, as they can create a tripping hazard.  Integrate planters or beds for trees and shrubs into the layout to help visually smooth the transition from indoors to out.

•  Except where there are doors leading outside, don’t install paving or other ground-level hardscaping right up to your home’s exterior walls.  A house with bare paving meeting bare walls has about as much connection to its setting as the hotel on a Monopoly board.  A better approach is to leave a planting bed at least three feet wide between the  foundation and any paving.  Make sure you provide drainage so this area doesn’t become a swamp during the rainy season.  

•  Extend architectural details such as walls, colonnades, or porches from the house into the surrounding landscape.  One of Wright’s favorite techniques was to have low walls radiating root-like from the building, visually tying it to its site.  Often, these walls also formed integral planters that helped form a transition to the natural landscape.  

Traditional architects could be equally adept at this technique: Spanish Revival homes, for example, often featured an arcade or a pergola extending from the house into the garden, or a covered veranda that formed a space halfway between indoors and out.

•  Lastly, always think of your house as an integral part of its site, rather than being an object placed on top of it.  Plan the garden as a series of outdoor rooms that are an extension of the indoor ones, and make the ones nearest the house serve as transition points between inside and out.  
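The step-down guideline above reduces to simple arithmetic: break the drop into level changes of two or three steps, each step no more than six inches high.  Here’s a hedged sketch of that calculation; the default figures simply restate the guideline, not any code requirement.

```python
import math

def terrace_levels(drop_in, steps_per_change=3, max_step_in=6.0):
    """How many intermediate deck/terrace levels are needed to step a house
    down to grade, given a vertical drop in inches.  Defaults restate the
    article's rule of thumb (three 6-inch steps per level change)."""
    change_in = steps_per_change * max_step_in     # height absorbed per level change
    changes = math.ceil(drop_in / change_in)       # level changes needed
    return changes - 1                             # platforms between floor and grade

# A floor 30 inches above grade: two three-step changes, one terrace between.
print(terrace_levels(30))
```

A drop of two feet or less needs no intermediate terrace at all, which is why the guideline only kicks in above that height.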

Monday, October 31, 2011


Perhaps the most singular trait of American homes is the hollow, cardboardy thud of our gypsum-board walls.  No one else has anything quite like them.  Mind you, if it weren’t for World War II, our walls might not sound quite so hollow.  

Before the war, American homes were routinely plastered inside--a painstaking process that first required nailing thousands of feet of wooden strips known as lath to the ceiling and walls of every room.  

The lath was covered with a coarse layer of plaster called the “scratch coat”.  The wet plaster squeezed through the gaps in the lath, locking it to the walls and ceiling.  Days later, when the scratch coat was dry, a second “brown coat” was applied to make the surfaces roughly flat.  This, too, had to dry for several days.  Last came the “skim coat”, a thin layer of pure white plaster that produced a smooth finished surface, much as the cream cheese topping does on a cheesecake.  

Depending on the weather, this process could take days or weeks, during which no other trade could work inside the house.  This was how plasterwork had been done for centuries, and there seemed no reason to change.  

Then came World War II, and with it an urgent need for military structures ranging from barracks to whole bases.  Faced with shortages of both labor and material, Uncle Sam was desperate to find faster and cheaper ways to build.  And since beauty was not much of an issue, eliminating plaster was an obvious starting point.

Enter the United States Gypsum Company, which way back in 1916 had invented a building board made of gypsum sandwiched between sheets of tough paper.  After more than two decades, the product they called Sheetrock still hadn’t really caught on.  Even its successful use in most of the buildings at the Chicago World’s Fair of 1933-34 didn’t do much for sales.  But the urgencies of wartime construction changed all that.  

As the government soon came to appreciate, Sheetrock did away with the need for wood lath, multiple plaster coats, and days and days of drying time (hence its generic name, “drywall”).  Installation was simple:  After the 4x8 sheets were nailed up, the nail holes were filled, paper tape was used to cover the joints, and a textured coating was troweled on to help disguise the defects.

Of course, all this was only meant as a stopgap replacement for plaster, but as you’ve probably guessed, it didn’t turn out that way.  By the war’s end, many builders who’d gotten used to slapping up drywall were suddenly reluctant to go back to the trouble and expense of plastering. 

What’s more, Sheetrock’s arrival coincided with the rise of modern architecture, which preferred plain, flat surfaces to the fussy moldings and reveals of prewar styles.  To Modernist tastes, the fact that Sheetrock couldn’t be molded the way wet plaster could was hardly a drawback.  People seemed more dismayed by the flimsy cardboardish sound of the walls in their postwar homes, but they soon got used to it.

Flimsy or not, there’s no doubt that Sheetrock proved a huge boon to the postwar housing industry.  Prior to the war, the typical American developer built about four houses a year.  By the late Forties, a developer like the legendary Bill Levitt was able to churn out 17,000 tract homes at Long Island’s Levittown, sell them for $7,990, and still make a thousand dollars profit on each.  Mass production was the key to the postwar housing boom, and Sheetrock helped make it happen.  

Just something to bear in mind next time your kids smash a doorknob through the bedroom wall.

Monday, October 24, 2011


A few years back, at the height of the dot-com boom, I came across a bronze plaque outside the headquarters of one of those instant internet giants.  In consummate public-relations prose, its text declared the company’s absolute commitment to quality and excellence at every level, invoking all the usual corporate buzzwords of the era.  What really fixed the plaque in my memory, though, was that one of its most mundane phrases was mispunctuated, reading “it’s ideals” instead of “its ideals”.  

Given the firm’s purported obsession with quality, you’d think they’d have given their mantra a quick proofread or two before committing it to bronze. 

This incident reminded me that a commitment to quality demands tangible final results, not just a lot of high-flying babble.  It requires vigilance down to the very last detail--even to a lowly apostrophe.

Quality relates to architecture and construction in much the same way:  The last little details can make the difference.  Hence, a project that’s going along swimmingly can still become shark bait in the last few days, because that’s when many of the parts you really notice are completed. The trouble is, this is just about the time the owner, the contractor, and yes, even the architect are tired, impatient, and rushing to get things buttoned up. 

Too often, this means that the most conspicuous details get the least effort and attention.
Here are some notorious quality killers that can sabotage a project at the last minute:

•  Moldings such as baseboard, door trim, and ceiling cove are often treated as last-minute frou-frou by harried contractors, even though they’re among the most obvious finish items.  Quality killers include inaccurate or open miters, ragged or splintered cuts, and gaps between moldings and floors, walls, or ceilings.  All standing moldings (such as door trim) should be installed plumb and square.  Running moldings (such as baseboard) should align properly and have clean, tight miters, or in the case of internal corners, coped butt cuts.  Gaps should be neatly caulked.  The last step, mind you, is seldom carried out but is a must for any quality installation.  

•  Indifferent painting is the surest way to destroy a quality job.  Ironically, although paint is the predominant finish on most houses, the painting phase is often cursed by being carried out late in the project, when money and patience are at low ebb.  Hence, workmanship suffers either because the job is rushed or because incompetent painters are hired in a misguided attempt to save money.  The quality killers:  Excessively thick or thin application, drips and runs, ragged or wavy brushwork along edges, and paint on fixtures, finish hardware, masonry, or glass.  None of these shortcomings should be tolerated.

•  Highly conspicuous finish hardware items such as door locksets, cabinet pulls, towel bars, grilles, and the like usually get hasty treatment because they’re among the very last items installed.  The quality killers include mismatched finishes (polished brass mixed with satin brass, for instance), off-plumb or misaligned pulls or trim plates, crooked towel bars, and locks and catches that don’t engage properly.  Insist that such items are neatly installed and are placed perfectly plumb, level, or square, as appropriate.

And in case you think fussing over such details is obsessive, one last remark about that would-be internet giant with the big bronze plaque:  “its” gone out of business.

Monday, October 17, 2011


Flying isn’t what it used to be. The fact that air travel has become overly familiar, even routine, is one reason. The more recent equation of airplanes with doom and destruction is another. Yet there’s a more concrete reason that flying has lost much of its romance:  The modern urban airport just isn’t the sort of place we’d like to spend time in.  

The mechanics of travel weren’t always something merely to be endured.  During the heyday of the passenger railroads, arriving, departing, or even just hanging around in one of the great terminals--whether Portland, Cincinnati, or Washington DC--was an experience to remember.  A first-time visitor couldn’t help but feel thrilled in such a temple of travel. 

Approaching an unfamiliar airport, on the other hand, more often elicits a rising sense of dread. Even the most architecturally celebrated of them are maddeningly difficult to navigate. For example, after an eternity of construction bedlam, San Francisco’s airport finally boasts a magnificent new International Terminal. Yet reaching it from either the highway or from public transportation remains a nightmare for any first-time visitor.  

Most of us navigate airports by one of three methods, the only reliable one of which involves already knowing the way. Failing that, we walk around slack-jawed, trying to figure out directional signs that ought to be obvious, or else we simply follow the crowd and eventually stumble onto our objective.

With all this confusion within, don’t even ask about what airports look like from the outside. What with changing technologies and endless reconstruction, architects long ago gave up trying to give airport exteriors a unified appearance.

Of course, there was a time when airports, like railroad terminals, were designed to look all-of-a-piece.  Among the few that survive more-or-less intact is the modest but remarkable Spanish Revival gem in Santa Barbara, California. 

When Modernism hit town, though, it became fashionable for airports to be inspired by the objects they served:  aircraft.  This was a refreshing concept back in the early 1960s, when Eero Saarinen completed his famously swoopy TWA terminal at New York’s Kennedy (then Idlewild) Airport.  Alas, architects have drunk from the same well countless times since--albeit without Saarinen’s audacity--thereby turning the concept into a well-worn cliche.

In the ensuing decades, it’s become acceptable for airports to be disjointed aesthetic jumbles so long as they vaguely resemble airplanes, with lots of shiny metal, curvy plastic panels, and carpeting on the walls.  Never mind that there’s no intrinsic reason why an airport lounge should look like the cabin of a 747, any more than your garage should look like the inside of a Toyota.

Today, with the growing despair over security, overcrowding of terminals and airplanes, and the shaky financial shape of the airline industry, airport architecture seems likely to remain stuck in the plastic-and-stainless steel rut it has occupied for decades. 

Rail travel never did regain its cachet after World War II, and the palatial terminals of railroading’s golden age sadly gave way to mundane structures that could barely compete with the local Greyhound station.  Likewise, perhaps, the airport’s day as a romantic portal to other worlds has been doomed by the very ordinary thing that air travel has become.  Short of rocket rides to the moon, I wonder what can replace it.  

Tuesday, October 11, 2011


Next time you head for the bathroom in the middle of the night, consider what the casual act of lighting your way would’ve entailed just over a century ago: If you were lucky enough to have a house with piped-in gas, you could strike a match to the nearest gas mantle to get a blinding white flame. Otherwise, you’d have to stumble your way to the john by the light of a guttering candle. No wonder so many Victorian houses burned to the ground.

Although nowadays it’s hard to imagine a world without electric lighting, it's been with us for a relative wink of an eye. Thomas Edison perfected his incandescent bulb in 1879, after trying out hundreds of filament materials ranging from bamboo to hair to paper (he finally settled on carbonized bamboo; tungsten filaments came decades later). Not so well known is that Edison also had to invent a way to evacuate the air from the bulbs--no mean task using Victorian technology.  

Even so, it took another twenty years or so before electric lights had largely replaced gas mantles in American homes. As late as the early 1900s, older houses with gaslight were still being retrofitted for electricity. These transitional houses are easy to spot: the wires leading to the electric fixtures were often run inside the old gas pipes. 

In the early days of electric lighting, fixtures intentionally flaunted naked bulbs so that no one could possibly mistake them for gas.  It was a way for people to advertise their modernity, much as hipsters of the 1990s sported conspicuous cell phone antennas on their cars.

Since that time, there have been surprisingly few fundamental changes in residential lighting.  Switches and wiring were eventually hidden inside of walls instead of being mounted on top of them, but other than that, most houses continued to have lighting fixtures in the center of ceilings, much as they had in the days of gaslight. The Revivalist home styles of the 1920s brought a craze for wall sconces--another gaslight derivative--but the fashion had largely died out by the end of that decade.

The first really new development in lighting since Edison’s light bulb was neon tubing, which made a big splash during the late 1920s and early 1930s. It made its American debut in 1923, in a sign for a Los Angeles Packard showroom, and was soon all the rage as signage in movie theaters and other commercial buildings. However, with its otherworldly glow, it found little use in residential design.  

Fluorescent lighting (not to be confused with neon) was introduced not long afterward.  Being diffuse and hence glare-free, and also producing much more light for a given amount of power, it quickly became the standard for commercial buildings.  Still, no matter how hard architects tried to push its use in luminous ceilings and other Modernist lighting concepts, the sickly blue-green quality of its light did not endear it to homeowners. It took another forty years of improvement, as well as laws mandating its use, before fluorescent lighting was grudgingly accepted into American homes.

In the interim, a number of other high-efficiency lighting types have been developed, including mercury vapor, sodium vapor, and metal halide, but the unnatural spectrum of light they produce has also precluded their use in domestic work. 

By contrast, halogen residential lighting, introduced during the 80s, was an instant hit with the public. Why? Halogen’s warm, yellow-white light is very close to the spectrum of sunlight. Accordingly, engineers are currently working hard to make the next big development in high efficiency lighting--light-emitting diodes, or LEDs--as warm and friendly as incandescent and halogen lamps.

Because the sun, after all, is still everyone’s favorite lighting fixture.  

Monday, October 3, 2011


A few years back, just before the real estate bubble burst, a housing tract inspired by the mass-market artist Thomas Kinkade’s bucolic townscapes and happy-happy cottages opened near Vallejo, California. Many people found this idea amusing, if not horrifying. But while there’s much that can be said about Kinkade’s trademark painting style--none of which I’ll say here--there’s nothing new about architecture being influenced by art.  It’s been happening for centuries.  

During the 1600s, for instance, the dynamic forms, layering of space, and dramatic use of light found in Baroque painting enormously influenced concurrent Baroque architecture.  In the middle of the next century, the Italian G. B. Piranesi’s engravings of ancient Rome foreshadowed the rise of Romantic Classicism, an architectural style whose austere, sharply-drawn classical forms went on to dominate the 1800s.  

Piranesi’s Carceri, a volume of engravings containing eerily atmospheric depictions of imaginary prisons, was especially influential to a branch of Romantic Classicism known as the Sublime.  Set in motion in the late 1700s by the otherworldly designs of the Frenchmen C.-N. Ledoux and E.-L. Boullee--many never built, some perhaps not even buildable--the Sublime school used stark geometric forms raised to colossal scale to evoke feelings of awe bordering on apprehension.  

Meanwhile, a romantic style of landscape painting gave rise to architecture’s Picturesque movement, whose work aimed to capture the rustic charm of naturalistic art in three dimensions.  An early Picturesque landmark of 1744, the English garden of Stourhead, was in fact literally based on a landscape painting by Claude Lorrain done a century earlier.  Later Picturesque works in England, such as the thatch-roofed peasant cottages conjured up by royal architect John Nash in 1811, continued to exploit the romance of Picturesque art--perhaps the closest historical parallel to those tract homes based on Kinkade’s work.

While representational art might seem more likely to inspire architects, abstract art has had, if anything, a more powerful influence.  During the Teens, the work of the Futurists--an art movement that deified technology to an almost nauseating degree--was soon reflected in the architecture of the Russian Constructivists, whose startling mechanistic projects of the Twenties might have been widely influential had they not lost favor with Joseph Stalin soon afterward. 

The Modernist architects Le Corbusier and Walter Gropius had close links with both Expressionism and with the Dutch movement known as de Stijl (Corbusier himself was a painter early in his career).  The rectilinear geometries of de Stijl artists such as Piet Mondrian and Theo van Doesburg profoundly influenced Modernist floor plans and elevations, many of which resembled abstract art in themselves.  

Unlike most of the foregoing examples, of course, Kinkade hardly represents the artistic vanguard of his era.  Still, the fact that many laypersons--not to speak of critics--consider his work banal doesn’t mean Kinkade’s influence can be dismissed.  Norman Rockwell’s paintings were long considered to be sentimental dreck;  critics pointedly referred to Rockwell as an “illustrator”, refusing to dignify his work with the label of art.  Today, in the more generous light of retrospect, Rockwell is widely considered an American original.  

Whether you love it or hate it, Kinkade’s work seems to have the same sort of mainstream appeal that Rockwell’s art once did, and his status may someday be equally enhanced by time.  Whether this bodes a coming generation of candyland cottages, their windows all aglow, we can only imagine. 

Monday, September 26, 2011


When it comes to identifying home styles, most people know generic terms such as Victorian, Bungalow, and Spanish.  Really pegging the thing is a little tougher, though.  Although more precise terms like Tudor, Mission, and Craftsman are often casually thrown about--especially by real estate agents, who ought to know better--they’re used wrongly more often than not.  Herewith are some of the most common points of confusion.   

For starters, calling a house “Victorian” is like calling a car “postwar”--it  only describes what era the thing was built in.  Luckily, the four major styles of Victorians are easy to tell apart:  If the house has horizontal siding, false cornerstones, and windows with segmental arches, it’s an Italianate.  If it looks like an Italianate but also has a steep mansard roof, it’s a Mansard.  If it has a square bay window, skinny proportions, and a porch with lots of linear wooden gingerbread, it’s a Stick (also called Eastlake).  If it has windows with colored glass borders, a few curved walls or a turret, and a porch with lots of decorative spindles, you can bet it’s a Queen Anne.  Next category, please.

Bungalow is a generic term describing any home that’s built close to the ground and has a low-pitched roof.  More precisely, if a bungalow has wood siding or shingles (often with stone or clinker brick trim), it’s a Craftsman Bungalow.   If it has stucco on the outside, it’s a California Bungalow.

The gaggle of labels hung on Spanish-style homes--Mission, Spanish Colonial, Churrigueresque, Moorish, Mediterranean--is another endless source of confusion.  Strictly speaking, Mission refers only to architecture modeled on the West’s Spanish Colonial missions, and would suggest a rather plain house with thick stucco walls, an Alamo-like scrolled gable, and a few decorative barrel tiles, if not a whole roof full of them (for practical purposes, the term Spanish Colonial is essentially synonymous with Mission).  

On the other hand, tile-roofed houses with more ornate features such as spiral columns and elaborate door and window surrounds are called Churrigueresque, after the Spanish Baroque architect Jose Churriguera.  Pointed or parabolic arches, ceramic tile accents, and perhaps castle-like crenellation would be clues that you were looking at a Moorish-style home.  Of course, when in doubt, you’re always safe using the term Mediterranean, which has come to include pretty much anything with red tile on the roof.  

The terms Tudor, Elizabethan, or Half-Timbered are often used interchangeably to describe English-inspired homes, but these terms don’t mean the same thing.  A Tudor-style house usually has brickwork combined with restrained half-timbering, steep gables, a massive and prominent chimney, and relatively small windows sometimes topped by a pointed Tudor arch.  By contrast, an Elizabethan-style home would have large areas of leaded windows divided into grids or into the familiar “Olde English” diamond pattern, along with lots of florid half-timbering in repeating motifs. 

While both of the above examples might also be called “Half-Timbered”, that term more properly refers to a building technique and not a style.

If you’re wondering why I haven’t mentioned any postwar home styles, it’s because it takes quite a bit of time for style names to stabilize.  Case in point:  During the Sixties, California Ranchers and split levels were routinely called “Contemporaries”, as if they were going to stay in fashion forever.  Today that term is all but forgotten.  

Likewise, today’s gewgaw-laden tract houses are often referred to as “neo-traditionals”, but that term is so vague that it’s unlikely to survive.  Hence, it’ll be a while before we know what posterity decides to call them. 

Monday, September 19, 2011


Most architectural writing deals with what you might call “legitimate” styles: mass-produced, popular and relatively buttoned-down stuff.  But some of the most fascinating architecture of the twentieth century came neither from architects nor from builders, and can’t be fit into any stylistic cubbyhole. 

Such works, sometimes classed as “naive” or “visionary” design, are the product of singular personalities refreshingly free of academic influences.  Here is a sampling: 

•    In 1921 Simon Rodia, an uneducated Italian immigrant laborer, began building the first of a group of towers around his house in Los Angeles’ Watts district.  Fashioned out of cement-covered steel bars and encrusted with fantastic arrays of shells, bottles, and bits of tile and glass, the tallest of the structures eventually soared nearly a hundred feet.  After laboring on the towers for thirty-three years Rodia, then 79, laid down his tools, deeded the property to his neighbor for nothing, and disappeared.  Of the now-famous Watts Towers he said simply,  “I had in mind to do something big and I did.”

•   In the mid-50s, “Grandma” Tressa Prisbrey found that her collection of 2000 pencils had outgrown her house trailer in Santa Susana, California.  So she began building a small structure to display them, using a material that was cheap and plentiful--discarded bottles.  Over the next twenty years, this humble beginning evolved into the Bottle Village, a 40-by-300 foot compound of 13 buildings and nine other structures, all built out of some one million bottles laid up in cement.  

Prisbrey, who liked to sport a floppy sun hat ringed with old television vacuum tubes, also made daily trips to the dump, where she collected bits of broken tile, old headlights, and a cavalcade of other discards.  These she lovingly inlaid into every square inch of paving between the structures, as well as into numerous free-form planters which she built on the site.  Prisbrey filled these planters with cactus, explaining:

“I don’t care much for cactus myself, but I don’t have a green thumb and if I forget to water the cactus they just grow anyhow. . .they remind me of myself.  They are independent, prickly, and ask nothing from anybody.”

•  And of course, no account of wacky architecture would be complete without mention of Sarah Winchester, diminutive heiress to the Winchester arms fortune. Supposedly plagued by the spirits of the untold men who had died at the business end of Winchester rifles, Sarah consulted a fortune teller and learned that as long as she kept adding onto her modest San Jose farmhouse, she would not only escape their wrath, but would never die to boot.  

Psychics having a good deal more credibility in the late-19th century, she immediately embarked on the remodel to end all remodels--a project that would last several decades and ultimately yield a spectacularly rambling Victorian/Edwardian house with 160 rooms. Among its idiosyncrasies:  a seance room, a bell tower for summoning the spirits, and the repeated use of design motifs with 13 elements.  Tour-guide puffery aside, the Winchester House remains a fine place to view the transition of architectural style from the late nineteenth century to the twentieth--a wacky enough subject in itself.

Tuesday, September 6, 2011


The other day I was driving down a local street lined with carefully inoffensive white, beige, or tan bungalows when something remarkable caught my peripheral vision: Jumping out from among the oatmealy shades was an electric blue cottage with lavender trim. While no doubt a few of the neighbors were dismayed by this violation of Waspish color preferences, the effect was both unexpected and charming. 

Colors are mysterious things. We all see them a little differently, and when you get right down to it, they exist as much in the mind as in the objects we perceive. Few reasonable people would argue that one color is better than another. Still, there are always folks out there who think they know best which colors are “tasteful” and which aren’t, and are anxious to let people know about it.  

In fact, color preferences are an intensely individual choice that varies from person to person and from culture to culture. Consequently, it’s nobody’s business but our own to decide which colors we like best.

A glance at the previous century’s changing color fashions shows both the human craving for variation and the relentlessly cyclical nature of taste, which has swung from reticent colors to vibrant ones and back again.  

In the United States, the opening of the twentieth century gave rise to the Craftsman era, a reaction to the kaleidoscopic palette of Victorian architecture.  Artifice was out, and natural simplicity was in. In keeping with these naturalistic aspirations, pristine whites once again returned to architecture, set off by deep, muted browns, greens, and golds.  

By the late 1920s, however, the arrival of Art Deco, with its electrifying jags-and-curves motifs, brought with it an equally dramatic shift in color tastes. Art Deco designers daringly allied black with celadon greens, icy blues, and a whole range of red and yellow ochres--a trend that lasted until the eve of World War II.  

The drab, camouflage-like colors of the early postwar era--gray-greens, gray-blues, or ruddy browns--were surely inspired by the inescapable military imagery of the war years. A rebuke to this trend arrived in the 1950s, when light, airy pastels in pink, blue, yellow and turquoise dominated residential design. This gradual return to strong, clear colors lasted well into the 60s, culminating in the vivid psychedelic palette of the decade’s end.  

The pendulum of taste began its reversal during the Seventies, when the ecology movement helped foster a trend toward “earth tones”--a muted, naturalistic palette of beiges, tans, and browns. Despite a brief Postmodernist digression into happy Neapolitan ice cream shades in the early 80s, the trend away from strong colors continued, culminating in the late-century fixation on whites, grays, and gunmetal blues. 

When the history of the new millennium’s first decade is written, poisonous greens, bilious yellows, and muddy browns will likely come to represent its taste in architectural colors--no doubt a sort of rebellion against the resolutely bland palette of the 80s and 90s. Personally, colors with such insistently unpleasant associations aren’t my cup of tea. But would I dream of telling my neighbors that their color choices weren’t “tasteful”--whatever that means?

If the guy in the electric blue house can’t make me do it, neither can they.

Monday, August 29, 2011


Channel surfing a while back, I happened across an old Joan Crawford movie called Mildred Pierce.  I won’t summarize the plot here--I couldn’t do it in the length of this blog anyway--but suffice it to say there were adequate histrionics to win Crawford an Oscar for best actress in 1945. What really caught my attention, though, was a scene in which her social-climbing character is about to buy a spectacular though long-empty half-timbered mansion.  As she surveys the ornate interior, she sighs resignedly and declares: “It’s not so bad, really...just tear down some of this gingerbread--”.

I puzzled over this line for a moment before realizing that, from the vantage point of 1945, the home’s design was supposed to be revolting. 

How far we’ve come--or rather, how far we’ve come around. Like everything else in history, architectural styles are cyclical:  every half-century or so, our idea of what constitutes good taste does a flip-flop. In Mildred Pierce’s time,  “gingerbread” was practically an epithet, and people tore it down if they had it. Today, people put up gingerbread if they haven’t got any, and it’s Modernism that’s down for the count.

The lesson is that, in architecture as in art, there are no hard and fast rules, no right answers, and ultimately, no such thing as good taste. I’m always amused at the astonished reactions I get when I make this statement. Some people bristle as if they’ve been personally insulted.  All of us think we know what good taste is, and--surprise surprise--it’s usually pretty close to our own. But like beauty, good taste is in the eye of the beholder. What passes for exquisite refinement in Dallas would draw yawns in Bombay or Manila. Moreover, there’s no reason to assume that our own ideas of good taste are any more valid than those of other cultures--they’re just more familiar, that’s all.  

What’s more, even within a particular culture, good taste is a prisoner of its own time. In 1889, the French engineer Gustave Eiffel constructed an enormous, riveted wrought-iron tower to serve as the centerpiece of the Paris Exhibition. Many Parisians considered it an abomination and demanded its prompt demolition after the fair closed. Rather than being destroyed, of course, the Eiffel Tower eventually became the very symbol of Paris.  

Likewise, at the dawn of the twentieth century, residents of the tony Chicago suburb of Oak Park were repeatedly outraged by the construction of a series of new homes which most of them considered monstrous. They were referring to Frank Lloyd Wright’s epoch-making Prairie houses. 

Some might argue that, apart from the temporal biases most of us are constrained by, there are still some absolutes of good taste that remain valid in any era or setting--rules based on classical proportions, color theory, respect for context, and the like.  But even this notion doesn’t hold water. Over the centuries, dozens of architects have changed the course of design history by flouting accepted “rules” of good taste, not the least of them Michelangelo, Bernini, Richardson, Wright, and Venturi.

All this leads to a rather unsettling question. If there are no absolutes of taste--or, to put it more precisely, if our ideas of good taste are always prisoners of our own zeitgeist--how do we decide what our buildings should look like?  

Why, we rely on the infallible judgment of our local design review board, of course.

Just kidding. 

Tuesday, August 23, 2011


Halfway up one of the brick walls of my office, part of an old factory building dating from 1907, there’s a single brick that’s twisted slightly out of position.  Beneath it, a solidified ribbon of mortar hangs frozen in a drooping arc, attesting to the fact that the brick was bumped within a few minutes of the time it was placed, while the mortar was still wet.  

All told, there are about six thousand exposed bricks in the walls of my office and some half-million in the building altogether, most of them laid with ordinary accuracy.  That single brick, however, stands out both literally and figuratively.  

Why?  Because it gives an almost eerily direct temporal connection to the moment in 1907 when a mason, now long dead, placed--and then accidentally displaced--that single brick.  Perhaps he nudged it with his foot as he moved along the scaffold;  perhaps he had a few nips of whiskey with his lunch;  or perhaps it was just close to quitting time, and he was tired.  The possibilities are as vast as the likelihood of ever really knowing is small.  The brick can’t tell the story; it can only record the outcome of that moment over a century ago.

It may seem odd that imperfections are often the very things we find intriguing in our surroundings, but so it is.  Imperfections, which are the inevitable traces of human effort, are what put a premium on handcrafted objects over machine-made ones.  They tell us that someone--perhaps someone much like us--put heart and soul into making them.  

For this reason, architects have long admired brick, stone, carved wood, wrought iron, and other building materials that provide an obvious record of human effort.  If flaws seem like a strange thing to admire, the alternative is much worse.  Pursuing visual perfection, as some architects are wont to do, is a sure ticket to failure.  This is the inevitable flaw in the sort of frigid Minimalist work that appears ad nauseam in chic design magazines.  While such projects always look smashing in glossy photo spreads, the real test comes later, when time has inevitably begun to affect those “perfect” details and they start showing wear or simply fall to pieces.

For a time following the Industrial Revolution, machine-made objects were regarded as superior to handmade ones.  Yet eventually, social critics such as England’s John Ruskin managed to reawaken the public to the beauty of items fashioned by hand, whose innate sense of life no machine could ever match. 

The resulting counterreaction ushered in the Arts and Crafts movement in England, as well as its American counterpart, the Craftsman style. Craftsman architecture showcased coarse materials such as rough stone, clinker brick, and carved wood that were pointedly worked by hand, directly refuting the Victorian machine aesthetic. Later on in the early 20th century, Spanish, Tudor, and other period revival styles provided an even bigger canvas for hand craftsmanship.

“Every time a man puts his hand down to cut or carve or chisel or build a house,” wrote the architect William R. Yelland during the period revival era, “he must express his own self.”  It is this self-expression, a record of human passing forever condensed out of evanescent time, that is architecture’s greatest gift.  

Monday, August 15, 2011

AFFORDABLE HOUSING: The Invisible Answer, Part Three

Believe it or not, prior to the late 1930s, people who lived in travel trailers full-time were hailed as adventurous, modern-day nomads, and were widely admired by the public. By the tail end of the Depression, however, vast numbers of impoverished families had resorted to living in broken-down homemade trailers, and the public perception of trailer dwellers completely reversed. Cities and towns passed laws barring them from entering city limits, or else imposed heavy fees to discourage them from staying overnight.  

Today, this sad legacy persists in the unkind treatment of mobile home dwellers as second-class citizens--people whom zoning laws still relegate to living beside tank farms or beneath runway approaches. Little wonder that even the most mortgage-enslaved Americans still recoil at the thought of dwelling in such places. 

Yet if and when America ever develops a true mass-produced form of housing--one that does for the cost of homes what the Model T did for the cost of cars--it will most likely be an outgrowth of the mobile home. For decades, and without the fanfare accompanying the many “affordable” housing solutions proposed by architects and visionaries, mobile homes (or, as the industry prefers to call them, “manufactured homes”) have been providing decent, mass-produced lodging for a fraction of the cost of site-built houses.  

The main reason for this difference is simple. While conventional homes use a few factory-built components such as roof trusses, doors, windows, and cabinets, the lion’s share of the structure remains entirely hand-built. By contrast, the manufactured home industry literally grew up with mass production, thanks to its prewar origins in building travel trailers.  From a modest start--few early trailers exceeded 160 square feet or so--the industry inexorably progressed to larger and more sophisticated units. By the late Sixties, huge, factory-built “doublewides” routinely enclosed areas of around a thousand square feet, which is about the size of an average bungalow home of the 1920s. Along the way, manufactured home builders quietly acquired the sort of mass production techniques that the site-built housing industry still considers revolutionary.

Why all the fuss about mass production? What’s wrong with the way we build traditional houses? The answer is that, of America’s innumerable consumer products, homes are among the last that are predominantly handmade. This implies the same thing for houses that it does for any other handmade product: High cost. It’s one of several admittedly complex reasons that fewer and fewer middle-class Americans--let alone the poor--can achieve the dream of home ownership these days.  

Still, despite the thrashing we’ve gotten from five years of the Great Recession, many Americans still believe that a “real” house, whether affordable or otherwise, should be built onsite and not in a factory--a perception heartily supported by the building industry, whose livelihood depends upon it. Hence, it’s doubtful that manufactured homes will be accepted by mainstream home buyers until they can unflinchingly compete with site-built homes in appearance, construction quality, amenities, and safety.  

Up to now, the manufactured home industry hasn’t been up to this challenge. For the most part, it remains satisfied with often-haphazard planning and a dubious, two-dimensional aesthetic. Yet an industry that’s ridden out wildly changing fortunes, regulatory discrimination, and decades of public ridicule might still be counted on to provide a few surprises.

Monday, August 8, 2011

AFFORDABLE HOUSING: The Invisible Answer, Part Two

Architects love to start from a clean slate.  It’s inherent in our training, and often, it’s for the best--after all, clean-slate thinking has given us Fallingwater, Ronchamp, and countless other architectural triumphs. 

Yet sometimes, incremental improvements on a humble concept are more useful than the grandest plans made from scratch.  This is the case with affordable housing. Consider what architects have done to make homes more affordable during the past eighty years--in practical terms, next to nothing--and compare this with the erstwhile trailer industry, that paragon of gauche design, which has stumbled along unceremoniously only to arrive at affordable housing that really works.   

The trailer story begins in the late Teens, when Americans first piled into their flivvers to go “autocamping” along the nation’s scenic new roads.  At first, campers simply carried tents, but by the early Twenties, many were towing tiny trailers that cleverly unfolded into roomy canvas cabins.  Meanwhile, towns throughout the country opened auto camps--later known as trailer parks--to attract tourist dollars.  

In 1929, a Michigan man named Arthur Sherman got tired of wrestling with his tent trailer and built himself a solid-walled Masonite version that didn’t need setting up.  The idea caught on, and Sherman wound up in the trailer business, with hundreds of others soon following.  By the mid-Thirties, trailering and trailer parks were such a huge phenomenon that one expert foresaw half of all Americans living in trailers by 1955.  

Yet by 1937 the trailer boom had collapsed, the victim of a saturated market and its own overheated rhetoric.  Meanwhile, broken-down trailers became the only homes many Depression-bound Americans could afford, changing the public’s original perception of trailer dwellers as wholesome, fun-living nomads to the more familiar stereotype presuming shiftlessness and poverty.  

World War II  briefly redeemed the trailer’s image.  Faced with an urgent need to house defense workers, the government ordered some one hundred thousand trailers during the course of the war, and in the process helped demonstrate the lowly trailer’s value as a year-round dwelling.

The postwar housing shortage brought many novel ideas for affordable, mass-produced housing, from the all-steel Lustron home to Buckminster Fuller’s aircraft-based Wichita House.  Once again, however, the clean-slate approach created spiraling costs that preempted any chance of affordability.

The trailer industry, on the other hand, simply picked up where it left off, adding homey touches and increasing size, until by the early 1950s some models were over 25 feet long.  These units were now clearly designed for year-round living, though in light of the trailer dweller’s shady reputation, the industry remained loath to concede this.  

Only in 1954, when a Wisconsin firm introduced a trailer so large it required a special permit to transport, did the industry finally begin to acknowledge that year-round trailer dwellers were its real market.  Twelve-foot-wide, fourteen-foot-wide, and double-twelve-foot-wide trailers eventually followed, at prices that nevertheless remained a fraction of those of conventional site-built homes.

Today, the travel trailer’s descendants--now known as manufactured homes--have quietly fulfilled the whole gamut of affordable housing requirements, and have done so through evolution and not revolution.  They are mass-produced and hence affordable; they can be easily customized and rapidly deployed, and they provide the familiar domestic imagery so many homeowners take comfort in.

Yet despite these attributes, manufactured homes remain largely invisible to the architectural profession.  Hence, the question is not whether such homes can provide an affordable housing solution--they already have, and for decades.  The real question is why architects, and much of the public, still seem to wish they hadn’t.