Monday, September 28, 2009

Publishing at the Speed of Real Time

Obsessed with Speed
We are obsessed with speed and the desire to make and do things faster. Simple examples abound: faster humans (Usain Bolt at roughly 27 mph), faster machines (the 257 mph SSC Ultimate Aero), faster networks. Speed is a primary measure of success. Faster means better.

In publishing, speed can have many meanings. I use it to refer to three aspects of publishing: how quickly I can move content from author to consumer (e.g., a single journal article X), how many copies of that content (e.g., 1,000 copies of journal article X) I can distribute over a period of time, and how many different authors' content I can move simultaneously (e.g., 1,000 copies each of journal articles X, Y, and Z).

If we look at the classic printing press, even someone who is not an expert in the field can see where we will run into problems. There are physical limits to how many times we can print the same page. If we need to print more copies, we need more time. And we can forget about printing a different article at the same time; we would need to add another printing press. Even if we could print that paper faster, to really increase the speed of publishing we would need to speed up the entire process, including moving the printed paper to the consumer. The physical limits are fairly obvious.

Now take a common example of digital publishing: the blog. Using technology that is easily available, I can make those problems go away. When I publish a blog post it is available within seconds to anyone who wants to read it. I can distribute my content to anyone with a network connection. I can reach millions of people with a single click of the publish button. If I have multiple pieces I want to publish at the same time, I am limited only by the number of times I can click the publish button. As quickly as I can write, I can publish. We are pushing against the ultimate speed of publishing: real time. As fast as an author can produce content, it can be available to consumers.

So what happens when we push ourselves toward real-time publishing? What happens when consumers are pulling on us to publish in real time? What happens when speed becomes the key measure of success?

What Happens When We Speed Things Up?
Things Break! 
The first thing that happens when we speed up is that things break! If I push my printing press faster than it can mechanically work, it will physically break. A more subtle form of breaking can be seen on the consumer side. If I expect content like news, journals, magazines, and books to be available on all of my digital devices, but I have to go to a bookstore to purchase a bound paper copy of a book I want, the process looks broken to me.

The pressure created when we speed things up, whether coming from the producer or the consumer, creates opportunities for new mechanisms, processes, and participants. New participants have ushered in popular new models for publishing like blogs, Twitter, and Facebook, all of which support real-time publishing. This puts pressure on existing processes and mechanisms. It has changed what we need to do on the production side while also shifting the expectations of the consumer.

When old mechanisms break it doesn’t mean they go away. It just means they can’t support the demands of new ways of doing things. Printing presses are not going to disappear, but if we look at the volume of published material in the world they will play a smaller role as compared to PDFs, blogs, and Twitter.

And keep in mind that the new models can and will break; it's just a matter of time. Blogs are about 10 years old. Twitter is only 3 years old. The new tools that will break these models probably already exist. The question is not what to do if the model breaks again but what to do when it breaks again.

Everything is Closer
When we move faster, we shrink the distance between two points. A fundamental shift in distance changes our accessibility patterns, which in turn impacts the world around us. In Japan the first high-speed rail line opened in 1964, and it is estimated to save 400 million travel hours annually. One city, Kakegawa, opened a station on the line in 1988; over the following six years its index of industrial employment jumped from 88.8 to 106.9. While less easily measured, it is believed that the people of Kakegawa enjoy a better life because they have access to broader cultural resources. And while not as widely discussed, one can imagine that other areas that were once prosperous due to proximity to a traditional rail line suffered as focus shifted to the high-speed line.

When the speed of publishing increases, the distance between the producer and the consumer shrinks.  The obvious result is that there is less time between when content is created and when it can be consumed. When the folks at Engadget are following the latest Apple event they are publishing stories directly from the event. A more subtle aspect of this is that in order to publish in real-time we don’t just use new technology; we change our process.

In a simplified publishing model there are discrete steps and people involved in the publishing process: author, editorial, design, printing, distribution, retail. With real-time publishing it is not just the technology enabling real-time distribution; it is a shift in the process. The emphasis is on creating the content and delivering it expediently. The author is the editor, design is a pre-developed template, printing is simply sending bits, and distribution and retail are URLs. We gain immediacy but lose the attention to detail that individuals in the process provided.
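To make the collapse of those steps concrete, here is a toy sketch in Python (hypothetical names and URL, not any real platform's API) in which authoring, design, printing, and distribution are reduced to a single function call:

```python
import datetime
import re

# "Design" is a pre-developed template, applied automatically.
TEMPLATE = "<html><body><h1>{title}</h1><article>{body}</article></body></html>"

def slugify(title):
    """Turn a title into a URL path segment."""
    return re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")

def publish(title, body):
    """Author-as-editor: no review queue, no print run, no retailer.
    'Printing' is rendering bits; 'distribution and retail' is a URL."""
    html = TEMPLATE.format(title=title, body=body)
    url = "https://example-blog.example/" + slugify(title)
    published_at = datetime.datetime.now(datetime.timezone.utc)
    return url, published_at, html

url, ts, page = publish("Publishing at the Speed of Real Time",
                        "Draft to live in one call.")
print(url)
```

The point of the sketch is what is missing: there is no editorial step between the author's text and the public URL, which is exactly the trade of immediacy for attention to detail described above.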

Another effect of reducing the distance between author and consumer is that the original content is only half the story. The other half is provided by feedback from the consumer. The response from a reader might be more important than the original content. What once might have been a private exchange between author and reader over the course of weeks is now a public exchange that happens in minutes. Publishing is moving from a one way statement to an ongoing group conversation.

The author and consumer are so close that the line between them is blurred. Where does a publisher draw the content line?

Losing Control
As speed increases we lose control. I grew up with the Wide World of Sports every Sunday. I will never forget the image from their opening segment of a skier losing control and literally flying over a judging stand. Another example, although more fantastical, is Mickey Mouse as the sorcerer's apprentice. Thinking he can speed up his chores a bit, he loses control over the very brooms he created to help him. This always reminds me of the viral power of a network. The problem is that we don't have a master sorcerer to set things straight when they get out of control, as they did in Fantasia.

There are two areas in particular that underscore and enable this loss of control. The first is access to publishing tools. The tools that define real-time publishing are available free of charge to anyone; an individual can become writer, editor, and distributor in the time it takes to sign up for a web service. The second is control over content. Individuals can share digital content with the click of a button, and although there are new efforts to manage content once it is placed in the digital wild, there are no effective tools to control it once it is public. This is similar to the advent of desktop publishing, but of a far greater magnitude.

Real-time publishing changes the nature of publishing content. How does a publisher manage control in a world where anyone can publish and sharing content is a matter of a few mouse clicks?

Now What?
Real time publishing is changing the nature of publishing. Our existing authoring and distribution models are cracking under the pressure, failing to meet the demands of publishers and consumers. The definition of content and publishing is changing from a statement to an ongoing conversation. The content that is published online is free for the taking.

What do we do when the publishing tools and processes we rely on are no longer effective? What do we do when the definition of content shifts under our feet? How do we control content in an environment where we inherently have no control?

While there is a lot of warranted fear and uncertainty, there are publishers that are embracing these changes and rolling along with the waves. Next I'll highlight examples that I believe are showing us how to survive these changes.

Monday, September 7, 2009

Is the Cloud Timeshares All Over Again?

I recently heard a much-heralded analyst discussing the cloud. He described how we had seen it all before and how it would never amount to anything, a fad of sorts.

The primary argument was that the cloud is simply a rehash of timeshare systems from the sixties and is therefore not innovative, useful, or relevant. I wasn't alive then, but I was stuck with the legacy through college. It is not the same. There are surface-level similarities, to be sure: both rely on a client-server relationship, and both rely on shared resources. But that's like saying a Mustang from the sixties is the same as a Tesla from today because they both have four wheels and go fast. The engine may still power the wheels, but the driving and maintenance experience is radically different. The cloud is different because it is low cost and seamlessly elastic. The cost of getting started with a cloud service? About the same as a cup of coffee. The steps to expand your cloud footprint? A few clicks, 5 minutes of your time, and another cup of coffee.

He also argued that people need to know where their data physically resides. Without that critical information he believes that public cloud computing is useless for real business.

To kick off this second round of argument he stated that SaaS is not cloud because the users know where their data is, a significant exemption when arguing that the public-facing cloud is not for real business. He went on to note that, in the case of one SaaS vendor, the data was in a building off of Highway 101 in California. I'm not sure that knowing my data exists in a particular building off a particular highway gives me any sense of comfort. Do they offer visiting hours so that I can take a walk around the data center with my data? Even the administrators would be hard pressed to find the exact disk drive that any particular customer's data lives on. And they certainly aren't going to let me do anything with it. God forbid anything happened while I was visiting my data. "Fire! Quick, grab your data!" Forget the data; I would be busy running and thinking, I hope they've replaced their Halon fire suppression systems, which have the side effect of killing anything that requires oxygen. No, if I needed a local copy of my data I would probably just batch-download it from my SaaS vendor.

He furthers this argument with the notion that people really, really, really want to know where their data is. They want to see the server it is sitting on; they want to see the little green light that says it is on. This is a non-technical person, mind you, looking at a green LED, feeling comforted. Quaint. I'm not sure my boss even knows what room to look in to find the server with his data. I don't know what room to look in, and I use some of those servers. I hope someone does. Oh, that's right, our critical corporate sales and financial data is hosted by a SaaS vendor. What building on 101 is that again?

I do appreciate his perspective but I think his line of reasoning is flawed. I believe the desire to just have things work far outweighs the desire to be able to physically touch and feel the server that data is sitting on. For the hobbyist the transition from that sixties Mustang is going to be tough. They want to pop the hood, swap the spark plugs, change the oil, maybe adjust the throttle. The rest of us just want to drive fast, the less maintenance the better. That translates into less time and less money that I need to invest into my driving experience.

This is what the cloud and SaaS offer: tools that just work. Without any installation, without any cables, without any front-loaded expenses, without waiting. It is comparatively less expensive than hosting your own servers. It grows and shrinks with your need, and it does so in minutes. Software that just works, is an elastic resource, and that I pay for only as I need it? That is a new concept.
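The "grows and shrinks in minutes, pay only for what you need" idea can be sketched with a toy model in Python (the class, rate, and numbers are illustrative assumptions, not any real provider's pricing or API):

```python
class ElasticPool:
    """Toy model of cloud elasticity: capacity follows demand hour by
    hour, and the bill reflects only the instance-hours actually used."""

    def __init__(self, rate_per_instance_hour=0.10):
        self.rate = rate_per_instance_hour  # hypothetical price
        self.bill = 0.0

    def run_hour(self, demand):
        # Scale to demand immediately -- no provisioning lead time,
        # no idle fleet sized for the worst-case spike.
        instances = max(1, demand)
        self.bill += instances * self.rate
        return instances

pool = ElasticPool()
for demand in [1, 1, 50, 50, 2]:   # a traffic spike, then back down
    pool.run_hour(demand)
print(f"${pool.bill:.2f}")         # you pay for the spike only while it lasts
```

Contrast this with the timeshare-era model, where capacity was a fixed machine sized (and paid for) up front: there, the 50-instance spike would have dictated the cost of every hour, spike or not.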