Can’t wait to see this movie!
The genius of “Lincoln” lies in its vision of politics as a noble, sometimes clumsy dialectic of the exalted and the mundane.
Today marks the 150th anniversary of the Battle of Antietam, in western Maryland, where some 23,000 men were killed, wounded or missing after twelve hours of savage combat on September 17, 1862 — the worst day of fighting in American history. It gave President Lincoln a (costly) victory with which, five days later, to issue the Emancipation Proclamation, and also saw the first battlefield photographs ever recorded.
Living in Northern Virginia, I find exploring these historical treasures a rewarding pastime. The major confrontation occurred in “the cornfield,” which, as the photos below show, looks a bit different today. The fighting lasted from sunup to sundown — it took another five days just to bury the dead.
At the DisCo Project, we naturally focus on the current, dynamic technology marketplace and the disruption it is continuing to cause to brick-and-mortar and other “legacy” industries. But disruptive innovation is not new and not unique to high-tech. It’s been around for hundreds of years and serves as a key driver of both economic growth and social evolution.
Let’s start with the poster child of disruption, buggy whip manufacturers. In the late 19th century there were some 13,000 companies involved in the horse-drawn carriage (buggy) industry. Most failed to recognize that the era of raw horsepower was giving way to that of internal combustion engines and the automobile. Buggy whips, once a proud, artisan craft, essentially became relegated to S&M purveyors. Read Theodore Levitt’s influential 1960 Harvard Business Review essay “Marketing Myopia” for a more detailed look.
Not everyone was rendered obsolete by Henry Ford. Timken & Co., which had developed roller bearings for buggies to smooth the ride of wooden wheels, prospered into the industrial age by making the transition to a market characterized as “personal transportation” rather than buggies. Likewise carriage interior manufacturers, who successfully supplied customized leather-clad seats and accessories to Detroit.
One might suspect this industrial myopia has been confined to small markets with few dominant players. Hardly. One of the more famous series of patent cases in history was the battle between Western Union and Alexander Graham Bell in the 1870s, in which the telegraph giant (along with scores of others) vainly tried to contest Bell’s U.S. patents on the telephone. Ironically, the telephone was initially rejected by Western Union, the leading telecommunications company of the 1800s, because it could carry a signal only three miles. The Bell telephone therefore took root as a local communications service simple enough to be used by everyday people. Little by little, the telephone’s range improved until it supplanted Western Union and its telegraph operators altogether.
Apart from scurrilous character assassination suggesting Bell had bribed U.S. Patent Office clerks to stamp his patent application first, the patent cases are best remembered for their eventual 1879 settlement. Western Union assigned all telephone rights to the nascent Bell System with the caveat that Bell would not compete in the lucrative telegraphy market. After all, Western Union surmised, no one wanted to have their peaceful homes invaded by ringing monsters from the stressful outside world. Check out this verbatim 1876 internal memo from Western Union:
Messrs. Hubbard and Bell want to install one of their “telephone devices” in every city. The idea is idiotic on the face of it. Furthermore, why would any person want to use this ungainly and impractical device when he can send a messenger to the telegraph office and have a clear written message sent to any large city in the United States?
Epically wrong! But that, of course, is the challenge of disruptive innovation. It forces market participants to rethink their premises and reimagine the business they are in. Those who get it wrong will be lost in the dustbin (or buggy whip rack) of history. Those who get it right typically enjoy a window of success until the next inflection point arrives. Were barbers out of business when, some 200 years ago, doctors began to curtail the practice of bleeding patients, eventually usurping barbers as providers of health care? No, because barbershops moved from medicine to personal grooming.
Disruptive technologies create major new growth in the industries they penetrate — even when they cause traditionally entrenched firms to fail — by allowing less-skilled and less-affluent people to do things previously done only by expensive specialists in centralized, inconvenient locations. In effect, they offer consumers products and services that are cheaper, better, and more convenient than ever before. Disruption, a core microeconomic driver of macroeconomic growth, has played a fundamental role as the American economy has become more efficient and productive.
There are hundreds or thousands more examples we can discuss. Polaroid and Kodak, both innovators in their own right, have faced bankruptcy and virtual irrelevance over the past few years because they could not cope with rapid disintermediation of their photography businesses by digital technologies. Walgreens, CVS and camera shops, meanwhile, have retained a solid photography revenue stream by supporting photo printing from SD cards and even Facebook photo collections.
Some businesses get it and some do not. Disruptive competition drives out those whose world view tries quixotically to preserve the past or to protect economic and social customs from technology-driven change. Disruption is of course not a panacea for all social ills; New Yorkers, for instance, complained as much about the filth and stench of cobblestoned city streets filled with horse droppings in the 19th century as they did about the filth and stench of paved streets filled with cars and CO2 fumes in the 20th century. As an economic and competitive matter, however, disruption is a process of continually “out with the old and in with the new.” And it’s been that way for as long as anyone can remember.
Nearly 10 years ago, days after the 9/11 terrorist attacks, I drove home to the Washington, DC suburbs from Santa Fe, New Mexico. It was a long, long trip, some 28 hours of driving over two and a half days, but an experience like no other. There was a special sense of community, of shared loss, of egalitarianism and of fraternity that pervaded the highways. Flags and signs hung from overpasses. Everyone listened to the same news alerts. People made eye contact at rest stops and restaurants, nodding knowingly at the inner rage and determination gripping the United States. In many ways, it was a highly spiritual experience and a unique time in this country.
Sunday’s special ops killing in Pakistan of Al Qaeda leader Osama bin Laden — mastermind, symbol and financial underwriter of the Al Qaeda network — produced many of the same feelings. Twitter and social media were overwhelmed. Young people, who have never known a United States without its current national security state apparatus, celebrated in front of the White House. CNN and the other television news networks served as a place of gathering for Americans of all races, backgrounds and socio-economic statuses.
Bin Laden’s theory was that Western democracies are weak and thus that direct terrorist attacks would splinter the citizenry and end Western involvement in the Middle East. He got it entirely backwards. The reality is that 9/11 united the United States. We debate and fight about tactics, long-term strategy and effectiveness, but since that day no American can look at the massive hole at ground zero in Manhattan’s financial district, or the new granite walls of the Pentagon, without recalling where they were and how they felt on 9/11. That’s a legacy that has already outlasted bin Laden.
There’s another way in which bin Laden’s death has once again transformed this country from a nation of strangers to a shared community. This president, whose policies on healthcare, deficit reduction and the like are attacked from all sides, risked everything to get America’s most well-known terrorist enemy. If the operation had failed Obama would have been a crippled leader, like Jimmy Carter after the 1980 Iranian hostage rescue operation faltered in the desert sands, with re-election impossible. His was a balls-out call. For a Democrat, especially, to maintain secret, unilateral “black” intelligence operations in foreign countries has been all but anathema. Obama acted more like Ronald Reagan than either W. or Bush 41 ever did.
John Ullyot, a former Marine intelligence officer who served as a Republican spokesman on the Senate Armed Services Committee, said the operation was “a gutsy call because so much could have gone wrong. The fact that Obama approved this mission instead of the safer option of bombing the compound was the right call militarily, but also a real roll of the dice politically because of how quickly it could have unraveled.”
No one is criticizing the decision to assassinate bin Laden. That in itself is simply amazing, another sign of the feelings of community pervading this country. They will not last, of course. But today we are once again all Americans.
One difference is that although worldwide support for America spiked after 9/11, it seems even Arabs and other Muslims have now largely abandoned the anti-Western jihad mentality that bin Laden fostered. The revolutions in Egypt, Tunisia, Bahrain and Libya are being driven not by radical imams but by middle-class tech executives and students. This year’s Arab Spring movement is secular and largely non-violent. American flags are not being burned, and our government — massively out of character historically, and at long last — actually stood on the side of the protesters and against entrenched, repressive Arab governments. That’s another nail in Al Qaeda’s coffin, and another way in which, in the instantly connected global community of today’s Earth, we really are all Americans.
Bin Laden was adept at convincing smaller, regional terrorist groups that allying with Al Qaeda and focusing on America were the best ways to topple corrupt regimes at home. But many of his supporters grew increasingly distressed by Al Qaeda’s attacks in the last few years — which have killed mostly Muslims — and came to realize that bin Laden had no long-term political program aside from nihilism and death.
The Arab Spring, during which ordinary people in countries like Tunisia and Egypt overthrew their governments, proved that contrary to Al Qaeda’s narrative, hated rulers could be toppled peacefully without attacking America. Indeed, protesters in many cases saw Washington supporting their efforts, further undermining Al Qaeda’s claims.
Thirty years is a long time. I remember the night of John Lennon’s murder very well, as I was living just blocks away from his NYC apartment at The Dakota at the time. A sad and oh so poignant moment.
I had no idea until hearing it on the radio this morning that the American tradition of having our president throw the first pitch on baseball’s opening day in the spring originated with William Howard Taft — pictured below — in 1910.
Thankfully presidents in the late 20th and early 21st centuries are a bit more buff than large Mr. Taft!
The Führer’s still dead, or is he? Tests on Skull Fragment Cast Doubt on Adolf Hitler Suicide Story [The Observer]. Sixty-four years later, the world remains in the dark about what really happened in Adolf Hitler’s bunker on April 30, 1945. And it appears that, just as with the JFK assassination, the Soviets are in the middle of it, as they hid what are claimed to be Hitler’s bones, supposedly found burned outside his underground Berlin safe house, from the Allies until 2000. But the most interesting part is that DNA tests show the skull is from a female. If it’s not Eva Braun, then maybe Hitler was really a woman?
Today was not the end of the financial carnage in equity markets. While 24/7 news cycles, collars and index options have all increased volatility, what we are dealing with now is simple panic. Asian Markets Plunge After Huge Wall Street Losses [International Herald Tribune]. In the 19th century, economic recessions were called “panics” because they usually led to deflation, bank runs and a flight to safety in the currency, i.e., gold.
Crop failures, drops in cotton prices, reckless railroad speculation and sudden plunges in the stock market all came together at various times to send the growing American economy into chaos. The effects were often brutal, with millions of Americans losing jobs, farmers being forced off their land and railroads, banks and other businesses going under for good.
And they recurred with cyclical regularity: 1819, 1837, 1857, 1873, 1893. But the last major event to be called a “panic” was the Panic of 1907. The Federal Reserve system and the New Deal’s regulations, coupled with the end of the gold standard, were supposed to end all of that.
Well, in my view the past 10 days demonstrate that despite our technological and financial advances, the 21st century looks a lot like the 1800s. Even if the deflationary consequences — except in housing, of course — have not hit like they did 150 years ago, we are dealing with the same sort of psychological panic from which modern capitalism was supposed to be immune. It obviously ain’t.
Oh, poor New York Times, editorializing in a piece titled So Far Over the Line that the McCain-Obama race has become mired in “gone negative” attacks.
This year’s presidential campaign has already been marked by far too much negative advertising, with coded racial images and sophomoric insults. It was outrageous when Mr. McCain’s campaign juxtaposed Mr. Obama with Paris Hilton and Britney Spears as part of its effort to denigrate him as a person, rather than debating him on this country’s huge problems. … Madonna’s video is immeasurably worse. If she thought she was helping Mr. Obama by juxtaposing his image with that of Gandhi and Bono, she was wrong.
This stuff has been going on for hundreds of years. Watch the excellent series The Presidents on the History Channel to get a sense of how earlier campaigns treated Tyler, Lincoln, Cleveland, Truman and others. More importantly, what the Times fails to realize is that neither of these candidates can run on their records — run AWAY is more like it — and both are transitioning (i.e., flip-flopping) from primary positions to general election moderation. The race is negative because attacks work, and because the puffery about respect and elevating American political rhetoric was just that … a bunch of hot air.
They spent two years laboriously reconstructing a copy of the Wright Brothers’ first powered airplane, but today — the 100th anniversary of that first flight — our modern 21st century engineers couldn’t get off the ground and flopped in the mud at rain-soaked Kitty Hawk (now Kill Devil Hills), North Carolina [Reuters].
Shows how ingenious Orville and Wilbur really were. The world has changed a lot as a result of their invention, mostly for the good, but it’s still a place where 90% hard work isn’t always enough to compensate for the lack of 10% inspiration. Apologies to Thomas Edison for butchering his aphorism.