
IoT + EU = ?

People have been talking, and pontificating, about a coming “Internet of Things” since 1999. The idea is that the many sensors, actuators and digital data recorders in the environment around us — like the electronic control units (ECUs) in modern automobiles — will be uniquely identified and connected via IP to each other and to the world. This would allow instantaneous supply chain fulfillment, green initiatives like demand-side management and smart refrigerators, as well as simply cool stuff that puts remotely programming one’s DVR from a smartphone app to shame. As McKinsey & Co. noted in 2010:

The physical world itself is becoming a type of information system… When objects can both sense the environment and communicate, they become tools for understanding complexity and responding to it swiftly. What’s revolutionary in all this is that these physical information systems … work largely without human intervention.

So what’s going on? Two things. The obvious one is that more than a decade later (and despite the fact that by 2008 there were already more “things” connected to the Internet than people in the world) we are still not “there” yet. Refrigerators cannot detect when the last soda can is used, let alone order more autonomously; HVAC systems largely cannot interact with electricity generators in real time to consume more energy when rates are lower, and vice versa; and suitcases cannot communicate with airport luggage systems to tell the machines onto which flight they should be loaded (except via barcode readers). Partly, that’s because technologists frequently overstate adoption projections for new networks by 10 years or more. Less obvious is that there’s been a quiet push in the European Union (EU) to regulate the IoT even before it is fully gestated and born.

A European Commission “consultation” on the Internet of Things was launched in 2008. By 2009 the EU had already issued an Action Plan for Europe for the IoT, which concluded:

Although IoT will help to address certain problems, it will usher in its own set of challenges, some directly affecting individuals. For example, some applications may be closely interlinked with critical infrastructures such as the power supply while others will handle information related to an individual’s whereabouts. Simply leaving the development of IoT to the private sector, and possibly to other world regions, is not a sensible option in view of the deep societal changes that IoT will bring about.

As a result, libertarian business groups such as the European-American Business Council and TechAmerica Europe have this summer come out in opposition to the EU’s approach, pressing for industry-led standards and the application of existing measures, like current EU data protection rules (which already exceed the United States’ by a wide margin), “in lieu of a new regulatory structure.”

This is a scary prospect. That the EU would even consider crafting a regulatory scheme now for a technology revolution that realistically remains years away, requires immense levels of cooperation among industries, and holds the potential to transform business and life as we know it, is remarkable. Remarkable because such a philosophy is so alien to American economic values and to the spirit of innovation and entrepreneurship that launched the commercial Internet and Web 2.0 revolutions.

This article is not the place to debate the conflicts, trade-offs and differing views of government animating current technology policy issues like net neutrality, privacy and cybersecurity, copyright and the like. The reality, though, is that issues such as those are generally being assessed within a spectrum of solutions, worldwide, which reflect known risks and benefits, some proposals of course being more interventionist than others. But that is far different from allowing a single bureaucratic monolith to dictate the shape of an industry and technology that remains embryonic. How is it even possible to develop fair rules for the IoT when no one has any real idea what or when it will be?

More than 15 years ago, this writer worked for one of his corporate clients on a legislative amendment offered by Rep. Anna Eshoo (D-Cal.) to the Telecommunications Act of 1996. The so-called “Eshoo Amendment,” designed to limit the role of the Federal Communications Commission in mandating standards for emerging, competitive digital technologies like home automation, passed. The irony, of course, is that at the time Congresswoman Eshoo analogized home automation to a future world like that of The Jetsons. Now 16 years down the road, we are barely closer to George, Elroy and their flying cars, robotic maids and the like than we were then.

But that ’96 effort illustrated a fundamental difference between the United States and the European Union about the proper role of government with respect to innovation. The EU subsidizes research, sets agendas and looks to intervene in the marketplace in order to establish rules of the road even before new industries are launched. The US sits back, lets the private sector innovate, and generally intervenes only when there has been a “market failure.” That’s a philosophy largely embraced by both major American parties regardless of the increasingly polarized political landscape in Washington, DC.

This basic difference in world views between the home of the Internet and European regulators — as true today as in 1996, if not more so — could doom the Internet of Things. So if you are a fan of future shock, it’s clear you should not react to the EU’s efforts to shape the IoT with a vive la différence attitude. The difference is dangerous to innovation and especially dangerous to disruptive innovation. It’s no wonder that few real digital innovations have come from Europe. Don’t expect many in the future unless the EU finds a way to decentralize and privatize its bureaucratic tendency toward aggrandizing government in the face of what IoT experts anticipate will be “a small avalanche of disruptive innovations.”

Note:  Originally prepared for and reposted with permission of the Disruptive Competition Project.


Of Buggy Whips, Telephones & Disruption

At the DisCo Project, we naturally focus on the current, dynamic technology marketplace and the disruption it is continuing to cause to brick-and-mortar and other “legacy” industries. But disruptive innovation is not new and not unique to high-tech. It’s been around for hundreds of years and serves as a key driver of both economic growth and social evolution.

Let’s start with the poster child of disruption, buggy whip manufacturers. In the late 19th century there were some 13,000 companies involved in the horse-drawn carriage (buggy) industry. Most failed to recognize that the era of raw horsepower was giving way to that of internal combustion engines and the automobile. Buggy whips, once a proud artisan craft, were essentially relegated to S&M purveyors. Read Theodore Levitt’s influential 1960 Harvard Business Review article Marketing Myopia for a more detailed look.

Not everyone was obsoleted by Henry Ford. Timken & Co., which had developed roller bearings for buggies to smooth the ride of wooden wheels, prospered into the industrial age by making the transition to a market characterized as “personal transportation” rather than buggies. Likewise, carriage interior manufacturers survived by supplying customized leather-clad seats and accessories to Detroit.

One might suspect this industrial myopia has been confined to small markets with few dominant players. Hardly. One of the more famous series of patent cases in history was the set of battles between Western Union and Alexander Graham Bell in the 1870s, in which the telegraph giant (along with scores of others) vainly tried to contest Bell’s U.S. patents on the telephone. Ironically, the telephone was initially rejected by Western Union, the leading telecommunications company of the 1800s, because it could carry a signal only three miles. The Bell telephone therefore took root as a local communications service simple enough to be used by everyday people. Little by little, the telephone’s range improved until it supplanted Western Union and its telegraph operators altogether.

Apart from scurrilous character assassination suggesting Bell had bribed U.S. Patent and Trademark Office clerks to stamp his patent application first, the telephone patent cases are best remembered for their eventual 1879 settlement. Western Union assigned all telephone rights to the nascent Bell System with the caveat that Bell would not compete in the lucrative telegraphy market. After all, Western Union surmised, no one wanted to have their peaceful homes invaded by ringing monsters from the stressful outside world. Check out this verbatim 1876 internal memo from Western Union:

Messrs. Hubbard and Bell want to install one of their “telephone devices” in every city. The idea is idiotic on the face of it. Furthermore, why would any person want to use this ungainly and impractical device when he can send a messenger to the telegraph office and have a clear written message sent to any large city in the United States?

Epically wrong! But that, of course, is the challenge of disruptive innovation. It forces market participants to rethink their premises and reimagine the business they are in. Those who get it wrong will be lost in the dustbin (or buggy whip rack) of history. Those who get it right typically enjoy a window of success until the next inflection point arrives. Were barbers out of business when, some 200 years ago, doctors began to curtail the practice of bleeding patients, eventually usurping barbers as providers of health care? No, because barbershops moved from medicine to personal grooming.

Disruptive technologies create major new growth in the industries they penetrate — even when they cause traditionally entrenched firms to fail — by allowing less-skilled and less-affluent people to do things previously done only by expensive specialists in centralized, inconvenient locations. In effect, they offer consumers products and services that are cheaper, better, and more convenient than ever before. Disruption, a core microeconomic driver of macroeconomic growth, has played a fundamental role as the American economy has become more efficient and productive.

Clayton Christensen, Thomas Craig and Stuart Hart, The Great Disruption

There are hundreds, if not thousands, of other examples. Polaroid and Kodak, both innovators in their own right, have faced bankruptcy and virtual irrelevance over the past few years because they could not cope with the rapid disintermediation of their photography businesses by digital technologies. Walgreens, CVS and camera shops, meanwhile, have retained a solid photography revenue stream by supporting photo printing from SD cards and even Facebook photo collections.

Some businesses get it and some do not. Disruptive competition drives out those whose world view tries quixotically to preserve the past or to protect economic and social customs from technology-driven change. Disruption is of course not a panacea for all social ills; New Yorkers, for instance, complained as much about the filth and stench of cobblestoned city streets filled with horse droppings in the 19th century as they did about the filth and stench of paved streets filled with cars and CO2 fumes in the 20th century. As an economic and competitive matter, however, disruption is a process of continually “out with the old and in with the new.” And it’s been that way for as long as anyone can remember.

Courtesy of Disco Project | Of Buggy Whips, Telephones and Disruption.

 

Cover of the Rolling Stone (Not)

My photo from Sunday’s Washington Redskins’ game made the cover of Flipboard. Wow, you say? Not really. See, Flipboard is an iPad app — created by a team led by former Tellme CEO and client Mike McCue — that dynamically creates a magazine layout of all your social media connections and posts.

Still, it’s rare that any specific user’s Tweets appear on the cover page. So this is not quite like that corny 1970s song by Dr. Hook & The Medicine Show chosen as the title here. But NOT bad, not bad at all!

Posted via email from glenn’s posterous

No Longer a Golden Ticket

A J.D. degree is not worth what it once was as the legal industry wrestles with unprecedented business changes.

Posted via web from glenn’s posterous

The VC Industry is At an Inflection Point

Many are speculating that 2009 represents a fundamental turning point for the venture capital industry. Some are arguing that the industry is in dire straits after years of poor performance. Others have argued that the math simply does not work for the industry’s current size. What Is Really Happening to the Venture Capital Industry?

It is indeed quite likely that the venture industry is in the process of a very substantial reduction in size, perhaps the first in the history of the industry. However, the specific catalyst for this reduction is not directly related to the issues just mentioned. In order to fully understand what is happening, one must look upstream from the venture capitalists to the source of funds, for that is where the wheels of change are in motion.

The issue, explains Bill Gurley of abovethecrowd.com, is that a lack of liquidity and fairly ordinary returns from VC funds have driven institutional capital from the venture space. If so, it’s not the IPO drought and absence of exit events for start-ups that’s hammering Sand Hill Road, but rather a perfect storm of fiscal crisis and shrinking capital sources. Add to that the fact that VCs travel in packs and tend to jump on the bandwagon AFTER the innovation train has already left the station.

Blogged with the Flock Browser

Been There, Done That

Alright, so more than 80% of Silicon Valley's 150 largest publicly traded companies have employees holding underwater options. Everybody's Underwater In Silicon Valley [AlleyInsider.com]. It's happened before, just eight short years ago (2000-03), and that did not deter innovation or investment. Markets are cyclical, and sometimes even good companies, especially before positive EBITDA, cannot maintain market valuations. Nothing unusual about the same thing happening to public companies when panic sets into the equity markets. Stick it out, guys; you've come a long way already.

Ballmer Says Linux is “Cancer”

Microsoft’s Steve Ballmer, once again on an anti-open source crusade, now says that Linux is a “cancer” but that the new Windows Server 2003 product can compete with free software because it is “innovative.”

Innovation is not something that is easy to do in the kind of distributed environment that the open-source/Linux world works in. I would argue that our customers have seen a lot more innovation from us than they have seen from that community. . . . Linux itself is a clone of an operating system that is 20-plus years old. That’s what it is. That is what you can get today, a clone of a 20-year-old system. I’m not saying that it doesn’t have some place for some customers, but that is not an innovative proposition.

All this from the company that brought us a desktop GUI in 2000 that Apple made available in 1987, that specializes in buying technology developed elsewhere (DOS, PowerPoint, IE, etc.) and that still cannot figure out how to put a laptop computer to sleep. Eat your Cheerios, Steve, you’re going to need them. All you have is monopoly power; in the long run, that’s not enough to save the company.

WiFi No Shows

Starbucks and T-Mobile have made a big push to establish 802.11b “hot spots” around the country, but so far no one is buying. That’s perhaps not unexpected, since Wi-Fi is still somewhat of an early-adopter technology, but the take rate — just 25,000 out of 22 million Starbucks customers — is strikingly low. As usual in the New Economy, making a viable business model out of a new technology is proving harder than the initial plans suggested.