Pizzi: Ponder This Post-Broadcast Paradigm
In the previous issue we discussed the disaggregation process taking place in the media world, as the Internet’s impact is felt across all traditional media sectors.
This time let’s assume that such disaggregation has become the widespread norm — as it appears to be destined to do — and consider how radio broadcasting could remain relevant and successful within that context.
One thing that will have to change is how the impact of any content provider is measured.
No one’s numbers will be very big, and there will likely be a lot of providers with roughly equivalent, low values — as gauged by whatever tool is used for such counting.
This implies that the real value will be in qualitative delivery of eyes and ears, not their simple quantities. Advertisers will look for a service’s affinity to their product, along with other highly targeted micro-demographics it delivers (assuming it even accepts advertising).
The old adage that “Half my advertising money is wasted, I just don’t know which half” will be replaced by far more fractional analyses, and correspondingly higher expectations of return.
Ratings companies — and media sales forces — will necessarily have to become much more sophisticated to remain viable.
Another traditional process that may need to change is the dissemination of news.
Today’s wholesale/retail model of networks and wire services as primary content sources for affiliated outlets will be replaced by a far more complex web of content originators and distributors.
One practical issue is how pool reporting will be handled in such an environment. Certain news-making events will never warrant or allow direct coverage by large numbers of news outlets, so some centralized reportage will still be required. But with such a Balkanized “last mile” to the news consumer, how will such rarefied content be fairly and feasibly shared at the “first mile”?
Of course, this distribution model has already changed a lot in the last decade or two (see diagram), and we’ve managed to keep up with it. Will increased complexity provide diminishing returns, or will it simply be the next incremental step in media maturity?
We’ve lived with radio formats seemingly forever, but this was not always the case.
Prior to the coming of television, radio stations had far more varied and less uniform content across their dayparts. Television came along and co-opted this successful model, so radio came up with formats to take its place. (Later, with the explosion of content channels experienced when cable TV emerged as a force in the 1980s, radio’s format model was also taken up by some new services like MTV and CNN.)
Now the near-infinite environment of Internet radio has made the concept of radio formats as we know them seem dated and insufficient. Without scarcity, the lowest-common-denominator criteria used in radio formatting make little sense.
So what will radio use to guide its content choices in this next reinvention? Will it fight fire with fire and move to selected microformats, or come full circle and return to a variety schedule? Or could some form of hyperlocalism find its way back into broadcasters’ DNA, emerging from an almost forgotten chromosome?
Another question: If those old, pre-TV days of variety radio were truly its “golden age,” will today’s formatted radio be remembered as its “platinum era” — or its “plastic period”? The answer will have some bearing on what comes next: Are today’s station formats a good foundation to build upon, or should they be rejected in favor of a more radical departure?
At the moment, everyone acknowledges these questions, but no one has many answers.
One response is to embrace change, and much experimentation may follow. That could provide a badly needed invigorating effect, even though failure remains the most likely outcome of such speculative efforts. At the other extreme is denial, and we all know where that reaction ultimately leads.
Isaac Newton said, “If I have seen further, it is by standing on the shoulders of giants.”
Who are those who can give such a horizon-extending boost to today’s would-be visionaries?
Buckminster Fuller’s shoulders are one good vantage point, given his penchant for interdisciplinary crossover, his belief that so much of the essence of success was “invisible,” and his predilection to always do “more with less.”
Fuller’s self-description as a “comprehensive anticipatory design scientist” made him the intellectual equivalent of Wayne Gretzky, whose well-known dictum of skating not to where the puck is but to where it will be had a lot to do with his being ultimately dubbed “The Great One” within his athletic milieu. We could all use that sort of predictive ability in our various worlds right now.
Another perennially good target is Arthur C. Clarke. His oft-cited assertion that “Any sufficiently advanced technology is indistinguishable from magic” could again provide helpful guidance. Wouldn’t a pocket-sized communications and rich-media terminal that could instantaneously access nearly anyone or any content on the planet at a whim have seemed like a near-magical device, say, 20 years ago?
Finally, the current economic climate strongly reminds us of commerce’s cyclical nature, yet technology seems to only proceed in a single, forward direction. The nuanced interface of these entrepreneurial polyrhythms is likely another key to the next breakthrough. Timing is everything, it seems.
One more related quote for these troubled times is variously ascribed to several sources, but most notably of late to White House Chief of Staff Rahm Emanuel. It advises us that “a crisis is a terrible thing to waste,” offering the insight that this difficult economy may bring rare opportunity for more dramatic change than would be possible in bountiful times. Carpe diem.
And so broadcasters’ search for appropriate next steps continues. If you’re looking for clues and possible inspiration, consider adding the above-referenced gentlemen to your summer reading list.