Monday, September 14, 2020

INDIRECT LIGHTING: From The Stage To Your Living Room

 

Early predecessor of indirect lighting:
Limelight spotlight, used to illuminate
the front stage area of theaters
until the end of the nineteenth century.

What do movie palaces have to do with how you light your home?

Plenty. After electric lighting replaced gaslight at the end of the 19th century, most electric lighting was “specular”, a fancy way of saying it came from a point source like the white-hot filament of a standard light bulb. That situation changed during the 1920s with the arrival of indirect lighting (“indirect” meaning that the light source is hidden).   

Indirect lighting took a while to catch on because, at first, electric fixtures were used just like gas mantles. No one thought of hiding them, since doing so would have been foolhardy with gas.  Moreover, exposed light bulbs were initially seen as an emblem of modernity.  

If you’ve ever tried to read by the light of an unshaded light bulb, though, you know that the glare it produces can be a real problem. Indirect lighting provided a dramatic solution: by concealing the light source, it diffused the light and, unlike an ordinary shade, completely eliminated specular glare.
Spectacular use of soffit lighting in the
auditorium of the Wiltern Theater,
Los Angeles, c. 1931 (Architects: Stiles O. 
Clements and G. Albert Lansburgh)




Movie theaters were among the first to adopt indirect lighting. Auditoriums needed subdued lighting for safety even during the show, and of course having a lot of glary specular lamps wouldn’t do. Since live theaters had long used concealed footlights along the front edge of the stage—the storied “limelight” of the gaslight era—it wasn’t much of a stretch to use indirect lighting in other parts of the building.

Perhaps the most dramatic new form of indirect lighting in theaters was soffit lighting. Typically, it consisted of a ceiling that stepped up from a low level at the perimeter (the “soffit”) to a higher one in the center. Lighting fixtures were hidden in a continuous horizontal recess separating the two levels, so that a diffuse, glare-free light would bounce off the upper ceiling into the space below.

Indirect under cabinet lighting
provides the most even and
glare-free lighting for
kitchen work surfaces.
The futuristic hovering effect this technique produced soon became a favorite with Art Deco commercial architects, who used it in countless clever ways.  Naturally, it wasn’t long before these ideas were showing up in the latest homes as well.

But don’t think indirect lighting is all just theatrical razzle dazzle. It can be practical as well. For example, if you mount miniature fixtures under your kitchen’s wall cabinets and conceal them with a shallow skirt or “valance”, they’ll light the countertop beautifully, but won’t shine in your eyes. 

What’s more, indirect lighting can be remarkably cheap. Since you don’t see the light source, you can use ordinary fixtures costing a few dollars—instead of overpriced boutique fixtures costing hundreds—and still get very sophisticated results. Today, LEDs have vastly expanded the opportunities for indirect lighting. LED lighting strip is available in widths as narrow as 3/8", allowing it to be hidden practically anywhere.

LED lighting strip has made it possible
to install indirect lighting in places it
couldn't go before.

However, indirect lighting can be low-tech as well; depending on the space available, ordinary porcelain sockets, light ropes, or even strands of miniature Christmas lights will do the job. Nor does the structure that conceals the lamps have to be expensive: most soffit lighting, for example, consists of little more than an ordinary lumber framework finished with drywall.

Regardless of how you design your indirect lighting, though, remember that the lamps—yes, even LEDs—will need replacement now and then. Make sure that you have reasonable access, especially in tight locations like ceiling coves. And for heaven’s sake, turn off the juice first.

Tuesday, September 8, 2020

IN AMERICA, ALAS, THE CAR IS STILL KING

Heaven help the pedestrian in shopping centers like this one—
which unfortunately are typical across the nation.
A few blocks from my office, there’s a dreary, ten-year-old strip mall fronted by literally acres of unrelieved parking lot.  Though it has no fewer than five separate entrances for cars, God help anyone who dares to approach the place on foot. To reach its quarter-mile-long phalanx of storefronts, you can either negotiate the single paltry thread of sidewalk the developers saw fit to provide, or else try to cross a vast sea of dirty asphalt on foot, with cars flashing carelessly past on all sides and bearing down behind you unseen.

One of the many exasperating tenets of postwar planning was the assumption that nobody would ever want to walk anywhere, anytime. Shopping centers, not to speak of downtown streets, were laid out mainly to suit automobiles and not people. Seemingly, the only time a human was expected to walk outdoors was en route to the driver’s seat.

In an environment designed for
and dominated by cars,
pedestrians are just in the way.
Yet many people do walk, and hopefully many more will do so in coming years. What with traffic snarls, interminable waits at signals, and the inevitable battle for parking, it’s often quite literally faster to walk three or four blocks than it is to drive that far. And mind you, I say this as a lifelong motorhead. 

Given all the bad things we’ve found out about designing cities around cars instead of people, modern planners are doing their best to bring pedestrians into this creaky old equation. It’s a fine idea in theory, but in practice, wherever cars and pedestrians mix, the cars invariably win out. The reason is obvious:  Since a car weighs twenty to thirty times what a person does, any contest between the two will not end up in the pedestrian’s favor. Hence, we’re psychologically conditioned from childhood to subordinate ourselves to those big bad cars.  
Self-driving cars are not going to change scenes like this—
they may even make them worse. The problem is
in the cars, not in who's driving them.
(Image: Bill O'Leary, The Washington Post)

Less obvious, but just as problematic, a car also takes up about thirty times as much space as a person on foot, resulting in vast areas of our cities that have no function whatever but to store our four-wheeled friends. All told, we pave over about forty percent of our cities solely to accommodate motor vehicles (in Los Angeles, the figure is said to be closer to sixty percent).  This autocentric environment extends right into our own homes, one-quarter of which we happily devote to garage space. 

For decades, the rhetoric of New Urbanist planning has promised to reverse these twisted priorities. More recently we've heard utopian predictions about the benefits of self-driving cars, but these, too, will not address the root problem—driverless or not, they are still cars, and will still dominate public roads at the expense of those who'd rather walk. 

In my town, you'll find this lovely shopping street—
but rather than making it a pedestrian mall, traffic engineers
decided to let cars go barreling down the middle of it.
Much has been predicted, but little has actually changed on the ground. I recently stopped in at yet another shopping complex not far from my office, this one barely two years old. Unlike the stupefying strip mall mentioned earlier, this “retail village” employs many of the latest New Urbanist planning ideas—varied building facades, happy little plazas, pretty paving, and the like.
For hapless shoppers, alas, these potentially lovely surroundings are completely co-opted by the constant stream of cars that go barreling right through the heart of the place. That’s right:  For some inexplicable reason, automobiles weren’t barred from what might have been a charming little shopping lane.  

So far, neither New Urbanist rules nor Silicon Valley tech have been enough to change the game.  Those big bad cars are still winning it.  





Tuesday, September 1, 2020

THE ARCHITECTURAL OUTSIDERS Part Three of Three Parts

The Great Pyramid of Giza, built 2580 to 2560 BC.
Architect/master builder: Hemiunu
In the past, an architect was just what the word’s Greek roots suggest—a “master builder”. Practical experience was the most important schooling such a person could have, and architects thus trained gave us the Great Pyramid of Giza, the Parthenon, and all the cathedrals of the Middle Ages.

Only during the past hundred years or so has the right to use the title “architect” been determined by academic degrees and testing rather than by practice. In 1897, Illinois became the first state to require that architects be licensed. California followed suit in the early years of the new century.  

 The National Council of Architectural Registration Boards was founded in 1919 and held its first annual meeting two years later. Given the ever-increasing complexity of building technology, the remaining states instituted requirements for licensure over the next thirty years, with the last two holdouts, Vermont and Wyoming, doing so only in 1951.  

Today, no one may use the title “architect” in the United States without completing a seven-and-a-half-year course of education and office internship, including an exhaustive series of examinations. Despite the rigors of this procedure, mere possession of an architectural license has never been a guarantee of talent. Or, as my old boss used to put it, “You can have a fishing license, but it doesn’t mean you’re gonna catch any fish.”

The boardroom at Frank Lloyd Wright's Taliesin West, one of the
handful of schools that still emphasize hands-on training.
Most of the facility was built by its students.
Conversely, a lack of formal education and licensure hasn’t always ruled out extraordinary ability. The last two installments in this series profiled six non-architects—Frank Lloyd Wright, Addison Mizner, Cliff May, Carr Jones, Buckminster Fuller, and Craig Ellwood—who changed the course of architecture and, just as important, made the world a more interesting and beautiful place.

None of the six had formal training or licenses (in Wright’s case, his practice predated licensure requirements). Wright and Mizner gained their entire architectural educations through apprenticeship—Wright with Louis Sullivan, and Mizner with Willis Polk. May, Jones, Fuller, and Ellwood had no formal architectural training whatever.  

None of this is meant to suggest that no schooling is better than bad schooling, or that licensure is unimportant. But it does suggest that there are alternatives to the usual way we teach architecture and building, and how we judge architectural skill.  

Buckminster Fuller, non-architect, but one of the
most creative thinkers and builders of the
twentieth century.
It’s no accident that each of the gifted non-architects cited above learned his craft mainly through practical experience, not through academics.  Today, a handful of schools still struggle to include such hands-on training—Wright’s Taliesin and Paolo Soleri’s Arcosanti among them. Yet for the most part, the architectural establishment remains firmly entrenched in the belief that formal schooling and office internship are the only legitimate basis for competence and licensure.  

Today, few would deny the contributions of geniuses like Wright and Fuller, romantics like Jones, Mizner and May, and even consummate front men like Ellwood. Yet the current process of education and licensure, overwhelmingly weighted as it is toward academic and office training, holds little room for such mavericks in the future. That’s a pity, because in many ways, the practically trained architect follows most closely in the footsteps of the “master builder”.

Monday, August 24, 2020

ARCHITECTURAL OUTSIDERS Part Two of Three Parts

Architect Carr Jones managed to conjure lyrical homes out of
castoff materials—practicing green architecture long before
that term was invented.
Last time, we looked at the careers of Frank Lloyd Wright, Addison Mizner, and Cliff May, all renowned architects who were never formally trained or licensed. Today we’ll touch on a few more architects who made an undeniable contribution to the profession, despite their lack of formal credentials.

Carr Jones, a designer-builder who practiced in the San Francisco Bay Area for almost half a century beginning in the late teens, was a pioneer in green architecture if ever there was one. Jones fashioned lyrically beautiful homes out of used brick, salvaged timber, and castoff pieces of tile, slate, and iron, often wrapping his dramatically-vaulted rooms around a landscaped central court. 


Some of Jones's interiors are startling in their modernity;
this living room of a Carr Jones home in Piedmont,
California dates from 1932.
Perhaps because he was trained as a mechanical engineer and never traveled abroad, Jones was all but innocent of architectural pretension. Instead, he built on unvarying principles of comfort, conservation, and craftsmanship. And unlike many trained architects whose style changes with every faddish breeze that blows, Jones’s convictions remained uncompromised right down to his death in 1966.

R. Buckminster Fuller had no architectural training either, and indeed was expelled from Harvard during his freshman year for "irresponsibility and lack of interest". His first job was working as an apprentice machine fitter. Yet over the course of his long and wide-ranging career, Fuller’s architectural innovations included not only the geodesic dome—his best-known invention—but also the gleaming, steel-sheathed Dymaxion House, a dwelling meant to be mass produced in a factory and installed on the site as you might bolt down a lamppost.    
Buckminster Fuller posing with an early model of his
Dymaxion house, circa 1927. 

In the context of today’s fussy, retrograde home designs, Fuller’s visionary proposals for the geodesic dome and the futuristic Dymaxion House may draw smiles, but this reflects more on the glacial pace of architectural progress than any flaw in Fuller’s thinking.

A later Dymaxion house in Rose Hill, Kansas, designed
to be built using postwar-idled aircraft plants,
and built between 1948 and 1958.
Not surprisingly, Fuller dismissed conventional architects, saying: “They work under a system that hasn't changed since the Pharaohs.” During his lifetime, the onetime Harvard dropout received exactly 47 honorary doctorates from universities the world over, and today is deservedly included in practically any general survey of twentieth-century architecture.  

One highly influential non-architect had creative skills of another kind. Craig Ellwood was the celebrated Southern California modernist whom one critic called “the very best young architect to emerge from the West Coast in the years following World War II.”  A brilliant self-promoter, Ellwood (who was born Johnny Burke and took his tonier surname from a local liquor store) parlayed some minor development experience into a career that reached the highest echelon of modern architecture. So skilled was Ellwood at presenting himself that despite being barely educated—his entire formal training consisted of night classes at UCLA—he was twice considered for the deanship at Mies van der Rohe’s Illinois Institute of Technology.  

A typically elegant Craig Ellwood design in the Brentwood
area of Los Angeles, circa 1958.
Understandably, Ellwood took pains to hide the fact that he was unlicensed from his elite clientele, and he relied heavily on a gifted staff to carry out his basic concepts. That he was able to enrapture critics, editors, and clients alike despite his lack of education can only increase one’s admiration for his skill. And in the final analysis, nothing can detract from the breathtakingly elegant steel-and-glass creations that are the legacy of the Ellwood office.

Next week:  The common thread among great architects and great non-architects alike.







Monday, August 17, 2020

ARCHITECTURAL OUTSIDERS: Part One of Three Parts

One of Frank Lloyd Wright's earliest and
least-known "commissions"—this curious
windmill tower built for his family's
Spring Green, Wisconsin, farm in 1897.
Though some of my colleagues might cringe to hear it, non-architects—those who lacked either the formal schooling or the license to legally use the title “architect”—have had a huge impact on American architecture over the past century.  If they weren’t architects in the legal sense, they more than lived up to the title’s original meaning of “master builder”.

Why not start at the top? Frank Lloyd Wright’s only formal training consisted of a year of engineering classes at the University of Wisconsin. Thoroughly bored, he dropped out in 1888 and headed for Chicago to find a job. He quickly found one, first apprenticing with the Chicago architect J. Lyman Silsbee, and later and more famously with his “lieber Meister”, Louis Sullivan.


Addison Mizner rose to become one of the "must-have"
society architects of Palm Beach, despite his lack of
formal credentials. Among his most enchanting works
is this Palm Beach shopping court, now named for him.
In 1893, after a falling out with Sullivan over taking outside work, Wright left the firm and opened his own office, where he was able to use the title “architect” only because his practice predated the Illinois licensure requirements by four years. Wright nurtured a lifelong disdain for traditional architectural training, which eventually led him to found the Taliesin Fellowship, a unique school in which apprentice architects learned largely by doing.

Addison Mizner, whose work
was seldom taken seriously
by the architectural profession
despite his great success.
But Wright is only the best-known example of brilliant architects with unconventional or even nonexistent educations. In another vein entirely is Addison Mizner, the California-born, Guatemala-raised, Florida-polished raconteur who improbably rose to become the top society architect of Palm Beach during the Roaring Twenties. Mizner despised school, and accordingly his only architectural training was a three-year apprenticeship with the San Francisco architect Willis Polk. The happy result was a personal style that drew more from his childhood knowledge of Spanish Colonial Guatemala than from the copybooks so beloved by his contemporaries.  

One of Cliff May's early Spanish Revival homes in
San Diego's Talmadge Park, designed in 1932, when
May was just 23 years old.
(Photo: Sande Lollis, San Diego Union-Tribune)
Nevertheless, Mizner’s romantic antiquarian villas were considered vulgar setpieces by his academically-trained colleagues. It probably didn’t help that he also ran a business manufacturing mock-antique furniture and building materials, which he used liberally in his own work. Mizner’s career was spectacular but brief; he died in 1933. Today, his surviving Palm Beach work ranks among the finest Spanish Revival architecture in the nation.

On the opposite coast, Cliff May, the San Diego architect widely considered the father of the California Rancher, started his career building Monterey-style furniture. When he began designing Spanish Colonial-style houses for speculative builders in the early 1930s, academic architects dismissed him as a purveyor of kitsch. Yet over time, May’s rambling, site-sensitive designs metamorphosed into the rustic and low-slung homes that Americans came to love so well. All told, May built his Ranchers in forty U.S. states, and their spiritual heirs went on to become the dominant style of the postwar era. Genuine May-designed Ranchers, not to mention his earlier Spanish Revival designs, are now celebrated and studied by architectural connoisseurs.  
May's later designs hewed to a more Mid-Century vibe,
such as this Long Beach "pool house" of 1953.
Cliff May: Despite his skill,
"real" architects didn't want him
in the club.

Despite these formidable accomplishments, May received only late and grudging acceptance from his licensed colleagues—or as he rather poignantly put it, “It took real architects a long time to let me into the club.”

Next time, we’ll look at a few more outsiders who changed the course of architecture, and see what they all had in common.




Tuesday, August 11, 2020

TO THE DETRIMENT OF DESIGN, THERE WAS ONLY ONE STEVE JOBS

The iPhone screen, with its idiot-proof icons,
builds on the long evolution of Apple's
graphic user interface.
For close to a decade now, every time I’ve had to use yet another badly designed appliance, or had to sit idling at yet another ineptly timed traffic light, or had to decipher yet another garbled set of instructions, I’ve thought of one man: Steven Jobs. And I wish he were still with us, or barring that, that at least there could’ve been a hundred more like him.

There’s no doubt that, with Jobs’s passing, the world lost one of the most important visionaries of the last hundred years. But for me, the loss has less to do with his putting a computer for the rest of us on a million desktops, or with his uncanny knack for creating things that people didn’t even know they needed. Granted, these accomplishments are vastly important to Jobs’s legacy. But to my mind, his ultimate triumph was his singular skill at persuading a largely indifferent public that excellent design really matters. He wanted us all to be as passionate about beauty and simplicity as he himself was. And to the extent that Apple’s famously intuitive and user-friendly products are now more popular than ever, he seems finally to have succeeded.
A young Jobs poses with the original Macintosh,
circa 1984.

The fact is that the average American consumer has been amazingly tolerant of third-rate product design. Consequently—and understandably—any company that knows it can make perfectly good money selling clumsy, overcomplicated, or unintuitive products has no incentive whatever to improve them. And so most don’t. 

In Jobs, however, we had the unique case of a businessman on a near-religious crusade to educate his own market, relentlessly challenging us to demand more than the run-of-the-mill crap we’re typically offered. 
Apple's logo, circa the 1980s.

It’s interesting to note that the Apple cofounder, despite being a pioneer in one of the most technically complex fields yet known to man, was not an engineer but rather a laid-back college dropout with a mystical streak. To add yet another layer of paradox to this singular mind, he was notoriously—some would say tyrannically—demanding of the people who worked for him. But if this is what it took to engender the phenomenally beautiful and beautifully functional objects Apple has turned out over the years, then it was all worth it.
The iPhone 11: Would Jobs have approved? Hmm....

As you’ve probably guessed, I write on a Macintosh, and have done so since I bought the very first model through an Apple engineer pal back in 1984. So yes, kids—I’ve been a true believer since long before the iPod, iPad, or iPhone even existed. In fact, I was a believer back when Steve Jobs still had a full head of hair. And for many of those years, I tried in vain to convince doubters why there was nothing like using a Mac—in short, why good design really mattered. Thankfully, with the wild success of those assorted iThings, Jobs was finally able to make that case beyond any doubt.


Whether Apple has been able to maintain its "insanely great" design in the near ten-year absence of Steve Jobs is debatable, as one look at the plug-ugly iPhone 11 makes clear. It's almost certain that, with all due acknowledgment of its technical brilliance, Jobs would not have tolerated the inelegance of its design.

Jobs had already revolutionized the fields of computing, film, music, and telephonics. I wish he’d been given the time for even more far-flung conquests. The world could have used a hundred more like him, but alas, there was only one.



Tuesday, August 4, 2020

HORROR VACUI: Enough Design, Already

The Fall of Babylon, an etching by
Jean Duvet, c. 1555, is often cited
as an example of horror vacui in art.
Not long ago, I handed a young architectural intern a preliminary sketch to be drafted up on the computer. It was a site plan for an agricultural research facility comprising 130 acres, about eighty acres of which were supposed to be reserved for farmland.

A week later, as promised, I received the computer drawing. But lo and behold, the great swath of undeveloped acreage shown in the original plan had been completely filled up with a meandering web of plazas and pedestrian malls in a galaxy of arbitrary shapes—pinwheels, checkerboards, crescents, what have you. Setting aside the fact that these busy forms would only have made sense from the air, they would also have made for some rather difficult farming.

When I asked the intern why she’d added all those features unbidden, she replied:  “The plan looked so empty, I thought the client would want to see more things in it.” 

Victorian architecture is famous for its tendency to decorate
every available surface—a trait that fomented the chaste
counter-reaction known as Arts and Crafts, and later on,
the asceticism of Modernist architecture.
This is a problem that afflicts all creative people, so much so that we even have a Latin name for it: horror vacui, or fear of emptiness. Herbert Muschamp, the late architecture critic of The New York Times, called it “the driving force in contemporary American taste...(and) the major factor now shaping attitudes toward public spaces, urban spaces, and even suburban sprawl."

As Muschamp rightly perceived, horror vacui is especially pronounced among architects. Many, like my young drafter, think that if they don’t fill every space with an avalanche of ideas and images, however unrelated to the program, they’ve somehow fallen short of their creative charge.

The pendulum swung wildly to the opposite extreme with the
advent of Modernist architecture—and once again,
too much (or in this case, too little)
brought on today's distinct lean toward horror vacui.
In fact, just the opposite is true. Architecture is a process of reduction, not just compilation. Ideally, the architect distills a complex set of requirements into the simplest form that will both satisfy the client’s needs and offer some measure of personal artistic grace. The avalanche of ideas has its place early in the process, but as things progress, design features that aren’t essential—whether for function or effect—fall away, leaving the final polished kernel of a solution. When carried out with skill, this process doesn’t preclude fanciful ideas, but it does preclude dysfunctional and clumsy ones. 

Of course, today’s designers aren’t the only ones afflicted with horror vacui—it’s a tendency that waxes and wanes over decades. Victorian architects, for instance, couldn’t bear to see an unadorned surface. The dawning twentieth century brought a counter-reaction to this compulsive decoration; it began with the Mission Revival and Craftsman styles and reached its zenith with International Style Modernism, whose practitioners turned architectural reduction into an art form.  
There is a middle ground, of course—in this Spanish Revival
home dating from the 1920s, for example, the plain wall
surfaces serve to intensify the effect of the other elements.

Ironically, it’s precisely this Modernist austerity that’s sent us hurtling back toward the frenetic gimcrackery so evident in contemporary design. And while architecture without complexity is dull, architecture that’s layer upon layer of complexity is simply meaningless.  

A house on the Greek island of
Mykonos: No fear of plain surfaces here.
As in so many other things, the answer lies in striking a balance. Some of our era’s most idealized domestic architecture—rural French farmhouses, say, or those much-admired vernacular hillside towns of Italy and Greece—is about as spare and simple as could be while still suiting its purpose. Against such a clean sharp background, a single flowerpot or bit of filigreed ironwork fairly bursts with ornamental power.

Alas, like my young intern, many architects still grow fidgety at the sight of a plain white wall, much less an empty plot of land. That’s too bad because, more often than we’d like to think, the best designing we can do is none at all.