Conference Welcome and Opening Keynote: Nick Carr
* Nick Carr, Author of The Big Switch: Rewiring the World, from Edison to Google
Nick is giving the opening keynote; here are my notes:
We're at a major turning point in the history of computing, one with a whole set of ramifications. Computer systems and software algorithms are increasingly at the center of the way we do business, the way we find media, and the way we communicate. The changes are not fundamentally technological; rather, they're economic changes set off by the new technology. This is the "Big Switch" of his title.
If you look at your own behavior over the last 5 years, or at the behavior of college students, you can see that we approach computers differently now. A while ago, if you wanted your computer to do something new, you'd buy software at a store. Increasingly, that's becoming foreign to us: most of us today fire up a web browser, go to the Internet, and get the application there, bypassing the store entirely. This is web 2.0: delivering software and content online. Businesses, though, have kept themselves at a distance from the web 2.0 phenomenon. They look at it as something people do at home, something for geeks -- not for businesses. They tried to keep the PC out of the corporate world too, but the PC proved useful and valuable for all of us; once people started using it constantly, businesses had to adapt to the technology and think differently about their approach to computing. Corporate IT is now built around the PC.
But the change we're entering today is more fundamental than the PC revolution, and you have to go further back in time to understand its underpinnings. Years ago in Troy, Henry Burden built a giant water wheel to generate power for his ironworks. Back then, every company had to generate its own power on its own. Power generation wasn't the business itself, but doing it well gave a competitive advantage.
If you went back to that site in the early 1900s, the wheel had been abandoned and left to collapse. What happened? Burden Ironworks didn't go out of business; rather, companies suddenly didn't have to generate their own power anymore. Technology had changed: with alternating-current electric grids, people could plug into a shared network, get power cheaply and efficiently, and abandon all of those private power sources. Put yourself in the shoes of the factory owners of the time, and it must have seemed an incredible leap of faith. Generating your own power was simply assumed to be part of running a business -- power was central to all of these businesses. Over time, as industrial companies grew and expanded, the sophistication and complexity of their power distribution operations grew with them.
He shows a picture of very complex power-generation machinery from around 1900. This is the equivalent of enterprise systems today: incredibly complex, prone to failure, etc. But you had to have them. In 1910, only 40% of the power in the US was generated by utilities, but 20 years later, 80% of it was -- and private generation had dwindled accordingly. As soon as there was an economical, efficient means of supplying power centrally, it became a revolution: the electrification of industry and society. The big news is that as soon as you move to central supply, you drive down the cost of energy dramatically, and there was an explosion of innovation at the socket. Business people started to realize that cheap power would let them change the way they manufactured their own products. Henry Ford used cheap electricity to build the assembly line, for example, and cut the cost of the automobile: the car became a democratic product that almost anybody could buy rather than a luxury for a relative few. Cheap power also brought the industrial revolution into the home.
Ultimately, it also affected media, entertainment, news, and everything about the way we communicate and entertain ourselves. It led to the rise of mass media and mass broadcasting; people could buy hi-fi sets and radios, and electrification made suburban living possible. The computer is the next great critical technology to go through a similar change: from a private-supply model (we each have our own PCs, servers, etc.) to a central utility-supply model (the cloud computing model), in which we plug into a shared grid to get the processing power we need. Obviously, if you look at them at a technological level, electricity and IT are different things. But at an economic level -- which is where people make decisions -- they're similar. Both are general-purpose technologies, usable for many different purposes. With an electric grid, as with a computing grid, your options for innovation are limited only by your imagination, because it's all about how you want to apply the technology.
Information technology and electricity are rarer still: they are the only two general-purpose technologies that can be supplied over a network, over a grid -- and that drives down their cost. We're beginning to see this today with hardware and software. The history of computing follows a pattern similar to the history of power generation, a pattern established in the very early years of the last century. Just as with power generation, as new machines and technologies advanced, the applications of computers grew enormously, and in the 50s and 60s came the mainframe. The mainframe had an advantage: it was incredibly efficient, operating at 80-90% of its capacity, and it was centralized. But it made for an impersonal mode of computing, suited to big institutional jobs; the average worker could not tap the power of the mainframe for his individual work. Still, it became the hub of corporate life.
Eventually, the mainframe was no longer the core. In its place came the server room and the data center -- a testimony to the incredible complexity and expense, and to the labor requirements, of managing a client-server, PC-based system. The current generation of computing (the client-server generation) is the opposite of the mainframe era. It has made computing personal in one way, but at the same time it has made computing incredibly inefficient, because every company has to build one of these rooms stocked with similar machinery and software. It's the only way we've known how to deploy computing across companies, and it's incredibly inefficient and fragmented. Hewlett-Packard found that 80% of server capacity goes to waste and only 20% is applied. Storage shows a similar picture (35% is used; the rest is wasted). Labor is similar too: 70-80% of IT labor goes to just keeping the machines running. That isn't a business advantage; it's just a cost of doing business. Looking broadly at the economy, we're paying a big tax for this inefficient mode of computing. Back in the 60s, 5-10% of corporate capital budgets went to IT; that share has since exploded to about 45%. The average company now invests almost as much in its computing equipment as in all its other equipment combined. This tells us that if there were a more efficient way of deploying systems (shared systems rather than private ones), we could free up that capital for more beneficial purposes: companies could invest in their main business instead of in functions peripheral to it.
Many companies are building data centers, such as Google (he shows us a picture of a Google data center in Oregon). There's a massive buildout of the computing grid for the distribution of computing, just the way power companies built grids for the distribution of power. Utility computing is possible and inevitable. There are two laws behind this:
One law is Moore's Law: the power of computers at a given price doubles roughly every 18 months. This underpins the explosion of the PC, and computing power has now gotten cheap enough that we can move from having physical machines to replicating those machines in software -- virtualization, the translation of hardware into software. You can buy cheap generic servers and use virtualization to deploy them as a grid.
You also need an efficient means of distributing that computing power to users: you need the computing grid. Andy Grove came up with his own law, Grove's law: the capacity of network communications doubles only every century. It explains a fundamental fact about computing -- because the network lagged so far behind, to tap the huge, efficient computing power dictated by Moore's law you needed to run your applications locally. With broadband Internet in the last few years, the capacity of the network has suddenly begun to catch up with the power of the computer. Grove's law has been repealed, and as network capacity catches up to computing power, you can deploy sophisticated services over this rich new grid.
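To see why Grove's law mattered so much, here is a minimal sketch (mine, not from the talk) comparing the two doubling rates quoted above -- compute doubling every 18 months versus network capacity doubling every century:

```python
# Illustrative sketch: the growth-rate gap between Moore's law and
# Carr's tongue-in-cheek "Grove's law". Assumptions (from the figures
# above): compute doubles every 1.5 years, network capacity every 100.

def growth(years, doubling_period):
    """Relative capacity after `years`, doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

# Over a single decade under these assumptions:
compute = growth(10, 1.5)    # roughly 100x
network = growth(10, 100.0)  # barely 1.07x
print(f"compute grows {compute:.0f}x, network only {network:.2f}x")
```

With that gap, applications had to run next to the processor; only once broadband bent the network curve upward did it make sense to move them onto the grid.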
Eric Schmidt predicted this back in 1993 when he was the CTO of Sun Microsystems: "When the network becomes as fast as the processor, the computer hollows out and spreads across the network." The network is now becoming the computer and the data center in the way Schmidt so eloquently predicted 15 years ago. We've seen this in the buildout of Web 2.0: the deployment of a centralized grid for consumer computing. We're also seeing massive investments among IT providers to businesses -- new startups supplying enterprise applications to individual businesses as services. People are moving into the cloud, and the IT industry is acknowledging that a massive, fundamental change is afoot: a move from a component business to a service-based business in which you operate utilities.
What does this mean? Just as the electrical socket brought a wave of innovation, we'll see similar effects come out of the utility model of computing. It dramatically reduces the cost and increases the availability of computing; we'll be able to tap into these resources anywhere, with any product or type of service. It's a new wave of innovation that affects all areas of commerce, society, and culture.
First, we'll be rethinking corporate IT. Corporations are going to move their data centers into the cloud thanks to software like AppLogic, a fascinating application that gives you a WYSIWYG operating system for putting grids of computers to your own purposes. It uses virtualization to assemble applications; suddenly, physical machinery becomes software and moves out onto the utility grid. At the same time, companies -- and anyone who thinks about how software applications are deployed -- need to rethink the interfaces of computing. The assumption now is that applications and data want to be shared; before, the assumption was to keep applications separate and unshared. That's a big problem: building in isolation is understandable from a development perspective, but the business of computing is about sharing information.
He shows a Facebook stream illustrating how you can control the flow of information. For many people, this is now the natural way to think about how computing should be delivered: it should be customizable, allow sharing, and put the user in control. This, however, is a radical departure from the corporate systems of today -- one that I think is going to flow in from the consumer space into corporations.
Finally and most importantly, every company in business is going to be challenged to figure out innovative new ways to harness the cheap computing power and data storage capacity now available over the Internet. The WWW we're used to is turning into the World Wide Computer: a shared information-processing machine that all of us can tap into for our own purposes. Harnessing the power of this shared machine is going to be the big challenge of this century. One of the most obvious effects is a blurring of the line between the consumer software business and the media business. They're becoming one business, with software taking on the characteristics of media and media taking on the characteristics of software. One example is Mint, a financial-management application that looks at your financial habits and gives you back more financial information. It's provided for free, so it has to be supported indirectly -- with ads, or (in Mint's case) by taking a cut whenever a user shifts money into a new bank account. The success of consumer software is no longer measured by the number of units sold; it's measured by the audience you're able to attract.
In turn, the media business is taking on the characteristics of software. Success in online media increasingly depends on software prowess -- on algorithms -- and media companies are looking more and more like sets of software services. Behind the scenes, it's about using algorithms to deploy advertisements as a monetization strategy. This presents both opportunities and challenges: tech companies can move into media, while traditional media companies must build up their software skills and adapt to the new model of doing business.
There's also a continued consolidation of control, particularly from a business perspective, and it has been happening for a number of years. We have the perception that the web democratizes media, and there's some truth to that, but because of the economic nature of the business, there's also a powerful incentive to consolidate control. I think this trend will continue, pushed by the economics of aggregating traffic and generating economies of scale. If you apply this to search-engine advertising and PPC -- using the Internet as a medium or marketing channel -- you see a similar consolidation. On the scarier side, as soon as you begin to turn businesses into software, you get the phenomenon of the workerless company: you can serve millions of customers with very few employees, because they're running on cheap infrastructure with highly scalable software.
He illustrates: Skype had 200 employees and millions of users; YouTube had 60 employees when it was purchased; Craigslist had 20 employees; PlentyOfFish had 1. This testifies to the ability to scale software with almost no labor component. It works because of increasing returns to scale, radical automation through cheap software, global reach, and finally, user-generated content -- a central characteristic of all of these companies, whose content comes free from their users. An entrepreneur who can tap into this can make a whole lot of money very quickly and easily. If you think broadly about the economic effects, you can see a hollowing out of employment as processes are automated.
We'll also see greater personalization -- and polarization -- of information. Personalization is a great thing, but what happens when we receive highly customized information tuned to our preexisting patterns of behavior and prejudices? That's a bit disturbing: look at the political blogosphere of the USA, where there's a lot of prejudice between the different political sides and their blogs don't link to each other. There's an assumption that the net will bring greater social harmony, and in some ways it does, but this kind of fragmentation happens as well.
The final point Carr makes concerns the implications for privacy, and even for people's free will. A few years ago, AOL released its search logs for a 3-month period in 2006, having "carefully" anonymized all the data to help academics. But people's entire lives are laid out in the search terms they use (one user, Thelma Arnold, was tracked down by the NY Times). "My goodness, it's my whole personal life. I had no idea someone was looking over my shoulder." Everywhere we go on the 'net, there are companies looking over our shoulders, and this phenomenon will only increase in the future. This presents an important ethical element that people need to think about as they go about their business. Computers are giving power to the individual, but they're also controlling the individual. We're going to see a tug of war between the liberating side and the controlling side, and it will be encapsulated in the way companies go about marketing.
As we think about this divide, here's a question: which side are you on?