By now, my four or five regular readers (hi guys!) know that I'm a bit of a Netflix fiend. I don't have cable and I almost never watch network TV. I am surely one of thousands for whom a simple cost-benefit analysis demonstrated that subscribing to Netflix, providing as it does a wealth of readily available viewing goodness (and shittiness when the mood strikes), without commercials, is an easy choice versus a usurious cable subscription, mandating as it does tuning in at a particular time and putting up with advertising that you basically paid to see.
And I also cannot be the only person who's been a Netflix subscriber for enough years that I've developed an allergy to commercials so acute that I can no longer listen to the radio, and I do my best to tune out the trailers in front of movies on the (increasingly rare) occasions that I go to the theater.
I began to wonder just how commonplace I am in the above ways while I was watching the series premiere of Marvel's Agents of S.H.I.E.L.D. last night. The show itself? Eh, kind of promising. Much of it felt…routine, unexpectedly, but the casting's good overall and the occasional Whedonish touches were noticeable. I do have my doubts about the degree to which fanboy enthusiasm for the Coulson character in Marvel films will translate to small-screen viability over an extended period.
Which brings us to the thing that really fired my imagination as S.H.I.E.L.D. ended and I switched off the opening moments of The Goldbergs. It wasn't wondering about Lola's background, or Mike's destiny, or whether the techie characters with the heavy accents would make it past the pilot. Instead, motivated by this rare hour of commercial exposure, I imagined what the future might bring for TV—and the adaptable little mammal to its overspecialized dinosaur, the Internet.
I pictured the TV world about twenty years from now—that is, what you would see if you turned on your TV and didn't connect it to the Internet—as a bleak wasteland of public-access programming, automated weather feeds, possibly one or two of those channels that rerun shows from the '50s, and (in certain fortunate markets only) local news shows. Every potentially valuable bit of programming has migrated to the Internet—a migration that would not have been profitable twenty years earlier. Luckily for the IP owners, old and vulnerable tools like the web browser will no longer be usable to access TV shows or movies: big media and Congress worked together (in oh let's say 2023) to develop secure sub-Internets using proprietary protocols, some of which users pay for access to, others of which are "free" to users with the proper certificates, and through which only movies and TV shows are piped. Well, that and commercials, of course—fewer for the paid "channels," but still plenty for all.
By this time, Netflix has turned to shit in the eyes of those old enough to remember its infancy, and there have been dozens of video Jammie Thomases—individuals who tried to circumvent the secure channels and were punished in highly publicized trials. (Too sleazy and obvious? Don't be so sure.) Which means if you've still got a commercial allergy by 2033, your choices are
- consume no video content that's remotely new, and become some weird coffee-house book-reader type;
- risk prison time with unreliable piracy tools in the age of the panopticon; or
- suffer through the advertising and hope exposure therapy works in this context.
First world problems, to be sure, but hardly what any media consumers today would call a bright utopian TV-tomorrow. Not so the content providers. The above may be their very best-case scenario.
However, it does assume that the Viacoms and Disneys (to say nothing of Congress) adapt more or less rationally to the technological changes of today and tomorrow. I'm no industry insider, but I've seen plenty of signs that suggest rational adaptation is not to be assumed of these folks. The big HD push from a few years ago was, in retrospect, a desperate dying gasp from the traditional TV world, just as the 3D fad is a desperate dying gasp from the traditional movie world. Nielsen ratings, never terribly reliable, grow more and more irrelevant with every new Internet-driven viewing option—I still get their paper surveys in the mail, and as a (mildly) tech-savvy media consumer, I barely know how to fill them out meaningfully anymore.
So what's a more plausible model? One that assumes less competence on the part of big media than the "nightmare scenario" above? We could guess "Netflix continues to pwn through 2033," but in the spirit of plausibility, we'd have to account for Netflix's own demonstrated blips of incompetence.
Maybe the ever-changing chaos of limbo that characterizes today's tech-mediated viewing options will continue to roil and confuse, with mechanisms dying just as their users get accustomed to them and giving way to barely-imaginable successors. In time, as the Gen-Y-ers and Millennials enter the stodgier phases of their lives, perhaps their patience for such continual adaptation will drain enough that they're willing to tolerate the more traditional options, keeping them viable for a couple more generations at least.
I definitely expect more erosion of the borders between film and television as forms of art and marketing, as S.H.I.E.L.D. presages. The impulse to synergize, and the capacity of each medium to compensate for the limitations of the other, make such multimedia franchises a worthwhile risk. (Coming Fall 2015: J. R. R. Tolkien's Middle-Earth Young Action Doctor Squad.)
Some will succeed, and enough successes may mean the concepts of "TV show," "episodic series," and "film franchise" morph into some amalgam of each other in a few decades. Some will fail, and if a big one fails catastrophically enough, it'll induce a reactionary jerking of knees among the demonstrably conservative content producers.
But I'd guess the synergy gamble will continue to be worthwhile—I anticipate no shortage of reimaginable properties from decades past, nor of franchisable new young-adult literature. If it becomes common enough, the multimedia franchise model may reshape the delivery mechanisms themselves—perhaps letting the network model survive well past the lifetime futurists might have anticipated for it. (TV news, I suspect, won't survive in its current form past about thirty years from now, when the viewers most accustomed to it will be dying off. But that's a peripheral point, and the related demise of all journalism is a tangent for another day.)
The movie theater experience will likely change too. Spielberg or Lucas (or maybe both, I can't recall) predicts theater ticket prices going up to like $50+, but the experience becoming high-end enough to make it worthwhile occasionally. If they're right, the features that don't merit such an experience (your quiet dramas, your fart comedies) will go directly to premium channels—be they the secure sub-Internets, futuristic cousins of HBO, or just plain HBO. Which calls into question the economic incentive to fund all but the most event-y features in the first place—unless they're part of a cross-medium synergy project as described above.
Whatever the case, voting with our wallets is going to be increasingly irrelevant. When deprived of the ability to consume media in the fashion we prefer, because that fashion is less profitable for the content providers[1], our only choice will be to consume it the way they want us to, or find something else to distract us from our daily lives. The content's quality matters much less than its accessibility; our complaints can apply to either the shitty product or the expense of accessing pretty good product (or both), but those complaints won't induce any more change than the occasional emergence of an easily-co-opted hot tech alternative. And no matter how intensely we bitch and moan, we won't revolt en masse from the concept of audiovisual entertainments altogether.
Of course, if the Oculus Rift means we'll get holodeck technology way sooner than we expected? All bets are off.
[1] If it sounds like I'm subscribing wholly to the Digital Imprimatur here, it's because I see little reason not to assume it's accurate, in broad strokes at least.