Tuesday, January 31, 2012


If you’ve ever seen one of the old Buck Rogers movie serials, with their packing-crate robots and Art Deco rockets shooting sparks, you can appreciate how quaint another era’s vision of the future can be--and how difficult it is to get it right.  Yet speculating on things to come, whether in writing, in images, or in three dimensions, is something humans find irresistible.

Architects are no exception.  The Futurist movement of the early 20th century, for instance, saw technology as man’s savior, and liked to wax poetic over things like turbines and high-voltage towers.  Yet to many modern eyes, their stark, mechanistic cities of tomorrow are not so much redemptive as sinister. 

During the 1920s, the Russian Constructivists saw architecture in equally edgy terms.  Thanks to Stalin’s growing distaste for their work, their most ambitious ideas, like those of the Futurists, were never built.  This fact has ironically worked in their favor, since speculating on the future is a good deal safer than actually trying to build it in three dimensions.  Paper predictions remain snugly encased in the context of their own time, while real structures must actually occupy--however uncomfortably--the future they were meant to predict. 

Disneyland’s 1957 House of the Future, an all-plastic home designed by MIT and sponsored by the chemical giant Monsanto, is a classic example of this phenomenon.  With its plastic furniture, plastic dishes, and molded plastic walls, it turned out to be an almost comically inept predictor of housing’s future.  While plastics did find acceptance in many kinds of building materials, from drain pipes to windows, the wholesale plastics revolution augured by the House of the Future never materialized.  

Indeed, the actual building trends of the early twenty-first century show a steady retreat from man-made polymers and controlled environments, back toward organic materials and more environmentally sensitive design.   

Theme parks and expositions in general have been a steady source of futuristic centerpieces, from the Trylon and Perisphere of the 1939 New York World’s Fair, to the globe-like, 140-foot-tall Unisphere at the 1964 fair held on the same site, to the more recent Spaceship Earth, the Florida EPCOT Center’s eighteen-story geodesic sphere of 1982.  

Overshadowing all of these is the 605-foot-tall Space Needle, centerpiece of the 1962 Seattle World’s Fair. With its concave pylons and flying-saucer superstructure, the Space Needle evoked the sort of future in which people would have robot housekeepers and fly around in jet-powered backpacks--that is, when they weren’t out driving their atomic cars.  This space-age optimism even permeates the color names used in the tower’s paint scheme:  Astronaut White, Orbital Olive, Re-entry Red, and Galaxy Gold.

As a now charmingly retro hallmark for Seattle, the Space Needle has been an unqualified success--even today, it remains the city’s biggest tourist attraction. As a predictor of future architectural trends, though, the Needle missed the mark.  

The fact that the Space Needle and its futuristic brethren already seemed quaintly outdated within a decade of their completion shows just how risky building a vision of the future can be.  Our own “House of Tomorrow” predictions about computer-orchestrated homes--the sort of scenario in which your toaster automatically goes online to buy more Eggos--are just as likely to come to naught.   

Still, architects will no doubt keep offering you their ideas of what’s to come.  Whether those predictions pan out or not--well, the future will be here soon enough.

Monday, January 23, 2012


What do movie palaces have to do with how you light your home?

Plenty. After electric lighting replaced gaslight at the end of the 19th century, most electric lighting was “specular”, a fancy way of saying that it came from a point source such as the white-hot filament of a standard light bulb.  That situation changed during the 1920s with the arrival of indirect lighting (“indirect” meaning that the light source is hidden).   

Indirect lighting took a while to catch on because, at first, electric fixtures were used just like gas mantles.  No one thought of hiding them, since doing so would have been foolhardy with gas.  Moreover, exposed light bulbs were initially seen as an emblem of modernity.  

If you’ve ever tried to read by the light of an unshaded light bulb, though, you know that the glare it produces can be a real problem.  Indirect lighting provided a dramatic solution:  By concealing the light source, it diffused the light and, unlike an ordinary shade, completely eliminated specular glare.  

Movie theaters were among the first to adopt indirect lighting.  Auditoriums needed subdued lighting for safety even during the show, and of course having a lot of glary specular lamps wouldn’t do.  Since live theaters had long used concealed footlights along the front edge of the stage--the famous “limelight”--it wasn’t much of a stretch to use indirect lighting in other parts of the building.  

Perhaps the most dramatic new form of indirect lighting in theaters was soffit lighting.  Typically, it consisted of a ceiling that stepped up from a low level at the perimeter (the “soffit”) to a higher one in the center.  Lighting fixtures were hidden in a continuous horizontal recess separating the two levels, so that a diffuse, glare-free light would bounce off the upper ceiling into the space below.   

The futuristic hovering effect this technique produced soon became a favorite with Art Deco commercial architects, who used it in countless clever ways.  Naturally, it wasn’t long before these ideas were showing up in the latest homes as well.

But don’t think indirect lighting is all just theatrical razzle dazzle.  It can be practical as well.  For example, if you mount miniature fixtures under your kitchen’s wall cabinets and conceal them with a shallow skirt or “valance”, they’ll light the countertop beautifully, but won’t shine in your eyes.

What’s more, indirect lighting can be remarkably cheap.  Since you don’t see the light source, you can use ordinary fixtures costing a few dollars--instead of overpriced boutique fixtures costing hundreds--and still get very sophisticated results.  Depending on the space available, ordinary porcelain sockets, light ropes, or even strands of miniature Christmas lights will do the job.  Nor does the structure that conceals the lamps have to be expensive:  Most soffit lighting, for example, consists of little more than a simple lumber framework finished with drywall.  

Regardless of how you design your indirect lighting, though, remember that the lamps will need replacement now and then.  Since you can’t always see into the recess that hides the fixture, make sure you can change the lamps by feel alone.  And for heaven’s sake, turn off the juice first.

Monday, January 16, 2012


A few blocks from my office, there’s a dreary, ten-year-old strip mall fronted by literally acres of unrelieved parking lot.  Though it has no fewer than five separate entrances for cars, God help anyone who dares to approach the place on foot.  To reach its quarter-mile-long phalanx of storefronts, you can either negotiate the single paltry thread of sidewalk the developers saw fit to provide, or else try to cross a vast sea of dirty asphalt on foot, with cars flashing carelessly past on all sides and bearing down behind you unseen.

One of the many exasperating tenets of postwar planning was the assumption that nobody would ever want to walk anywhere, anytime.  Shopping centers, not to speak of downtown streets, were laid out mainly to suit automobiles and not people.  Seemingly, the only time a human was expected to walk outdoors was en route to the driver’s seat.  

Yet many people do walk, and hopefully many more will do so in coming years.  What with traffic snarls, interminable waits at signals, and the inevitable battle for parking, it’s often quite literally faster to walk three or four blocks than it is to drive that far.  And mind you, I say this as a lifelong motorhead. 

Given all the bad things we’ve found out about designing cities around cars instead of people, modern planners are doing their best to bring pedestrians into this creaky old equation. It’s a fine idea in theory, but in practice, wherever cars and pedestrians mix, the cars invariably win out.  The reason is obvious:  Since a car weighs twenty to thirty times what a person does, any contest between the two will not end up in the pedestrian’s favor.  Hence, we’re psychologically conditioned from childhood to subordinate ourselves to those big bad cars.  

Less obvious, but just as problematic, a car also takes up about thirty times as much space as a person on foot, resulting in vast areas of our cities that have no function whatever but to store our four-wheeled friends.  All told, we pave over about forty percent of our cities solely to accommodate motor vehicles (in Los Angeles, the figure is said to be closer to sixty percent).  This autocentric environment extends right into our own homes, one-quarter of which we happily devote to garage space. 

Unfortunately, despite the rhetoric of New Urbanist planning, which promises to reverse these twisted priorities, little has changed on the ground.  I recently stopped in at yet another shopping complex not far from my office, this one barely two years old.  Unlike the stupefying strip mall mentioned earlier, this “retail village” employs many of the latest New Urbanist planning ideas--varied building facades, happy little plazas, pretty paving, and the like.  

For hapless shoppers, alas, these potentially lovely surroundings are completely co-opted by the constant stream of cars that go barreling right through the heart of the place. That’s right:  For some inexplicable reason, automobiles weren’t barred from what might have been a charming little shopping lane.  

A smattering of New Urbanist rules, it seems, hasn’t been enough to change the game.  Those big bad cars are still winning it.

Monday, January 9, 2012


In the past, an architect was just what his Greek title suggested--a “master builder”.  Practical experience was the most important schooling such a person could have, and architects thus trained gave us the Great Pyramid of Cheops, the Parthenon, and all the cathedrals of the Middle Ages. 

Only during the past hundred years or so has the right to use the title “architect” been determined by academic degrees and testing rather than by practice.  In 1897, Illinois became the first state to require that architects be licensed. California followed suit in the early years of the new century.  

The National Council of Architectural Registration Boards was founded in 1919 and held its first annual meeting two years later.  Given the ever-increasing complexity of building technology, the remaining states instituted requirements for licensure over the next thirty years, with the last two holdouts, Vermont and Wyoming, doing so only in 1951.  

Today, no one may use the title “architect” in the United States without fulfilling a seven-and-a-half-year-long course of education and office internship, including an exhaustive series of examinations.  Despite the rigors of this procedure, mere possession of an architectural license has never been a guarantee of talent.  Or, as my old boss used to put it, “You can have a fishing license, but it doesn’t mean you’re gonna catch any fish.”

Conversely, a lack of formal education and licensure hasn’t always ruled out extraordinary ability.  The last two columns in this series recounted the careers of six non-architects--Frank Lloyd Wright, Addison Mizner, Cliff May, Carr Jones, Buckminster Fuller, and Craig Ellwood--who changed the course of architecture and, just as important, made the world a more interesting and beautiful place.

None of the six had formal training or licenses (in Wright’s case, his practice predated licensure requirements). Wright and Mizner gained their entire architectural educations through apprenticeship--Wright with Louis Sullivan, and Mizner with Willis Polk.  May, Jones, Fuller, and Ellwood had no formal architectural training whatever.  

None of this is meant to suggest that no schooling is better than bad schooling, or that licensure is unimportant.  But it does suggest that there are alternatives to the usual way we teach architecture and building, and how we judge architectural skill.  

It’s no accident that each of the gifted non-architects cited above learned his craft mainly through practical experience, not through academics.   Today, a handful of schools still struggle to include such hands-on training--Wright’s Taliesin and Paolo Soleri’s Arcosanti among them.  Yet for the most part, the architectural establishment remains firmly entrenched in the belief that formal schooling and office internship are the only legitimate basis for competence and licensure.  

Today, few would deny the contributions of geniuses like Wright and Fuller, romantics like Jones, Mizner and May, and even consummate front men like Ellwood.  Yet the current process of education and licensure, overwhelmingly weighted as it is toward academic and office training, holds little room for such mavericks in the future.  That’s a pity, because in many ways, the practically trained architect follows most closely in the footsteps of the “master builder”.    

Monday, January 2, 2012


Last time, we looked at the careers of Frank Lloyd Wright, Addison Mizner, and Cliff May, all renowned architects who were never formally trained or licensed.  Today we’ll touch on a few more architects who made an undeniable contribution to the profession, despite their lack of formal credentials.

Carr Jones, a designer-builder who practiced in the San Francisco Bay Area for almost half a century beginning in the late teens, was a pioneer in green architecture if ever there was one. Jones fashioned lyrically beautiful homes out of used brick, salvaged timber, and castoff pieces of tile, slate, and iron, often wrapping his dramatically vaulted rooms around a landscaped central court.  Perhaps because he was trained as a mechanical engineer and never traveled abroad, Jones was all but innocent of architectural pretension. Instead, he built on unvarying principles of comfort, conservation, and craftsmanship.  And unlike many trained architects whose style changes with every faddish breeze that blows, Jones’s convictions remained uncompromised right down to his death in 1966.

R. Buckminster Fuller had no architectural training either, and indeed was expelled from Harvard during his freshman year for “irresponsibility and lack of interest”.  His first job was working as an apprentice machine fitter.  Yet over the course of his long and wide-ranging career, Fuller’s architectural innovations included not only the geodesic dome--his best-known invention--but also the gleaming, steel-sheathed Dymaxion House, a dwelling meant to be mass produced in a factory and installed on the site as you might bolt down a lamppost.    
In the context of today’s fussy, retrograde home designs, Fuller’s visionary proposals for the geodesic dome and the futuristic Dymaxion House may draw smiles, but this reflects more on the glacial pace of architectural progress than any flaw in Fuller’s thinking.

Not surprisingly, Fuller dismissed conventional architects, saying: “They work under a system that hasn’t changed since the Pharaohs.” During his lifetime, the onetime Harvard dropout received exactly 47 honorary doctorates from universities the world over, and today is deservedly included in practically any general survey of twentieth-century architecture.  

One highly influential non-architect had creative skills of another kind.  Craig Ellwood was the celebrated Southern California modernist whom one critic called “the very best young architect to emerge from the West Coast in the years following World War II.”  A brilliant self-promoter, Ellwood (who was born Johnny Burke and took his tonier surname from a local liquor store) parlayed some minor development experience into a career that reached the highest echelon of modern architecture.  So skilled was Ellwood at presenting himself that despite being barely educated--his entire formal training consisted of night classes at UCLA--he was twice considered for the deanship at Mies van der Rohe’s Illinois Institute of Technology.  

Understandably, Ellwood took pains to hide the fact that he was unlicensed from his elite clientele, and he relied heavily on a gifted staff to carry out his basic concepts.  That he was able to enrapture critics, editors, and clients alike despite his lack of education can only increase one’s admiration for his skill.  And in the final analysis, nothing can detract from the breathtakingly elegant steel-and-glass creations that are the legacy of the Ellwood office.

Next week:  The common thread among great architects and great non-architects alike.