Graham is thinking of tweaking reality…
Which includes near-reality, too. And not the fun kind, either.
It all stems, at least in this version, from this article in the New Yorker, apparently about advances in film-making …
…As technology dissolved the boundaries of conventional narrative, it could also interfere with essential elements of good storytelling. What was suspense, for example, if not a deliberate attempt to withhold agency from audience members—people at the edge of their seats, screaming, “Don’t go in there!,” enjoying their role as helpless observers? At the same time, why did the mechanisms of filmmaking have to remain static? Cautiously, he embraced the idea that interactivity could enable a newly pliant idea of cinematic narrative—“one that is opposed to most popular movies, which are built on suspense, which make you want to get to the resolution, and focus you on one track, one ending.”
But now the strangeness starts to creep in. Because soon the technology will be able to drive what happens next, based on where your attention is in the scene (and possibly even vital signs). And even that is less simple than it sounds. Does the technology give you more of a character, allowing you to see things more from their viewpoint, or less, in a Hitchcock-style suspense-building exercise? Or even take you into a Rashomon-style review of the previous events, from different aspects, undermining any idea of a consistent, authorial narrative?
So far it’s all good technophile, movie-geek fun. So much so that they raised, wait, three million dollars on the first meeting? Forty million so far?
The [first-meeting] Sequoia investors recognized a business that could not only earn revenue by licensing the technology but also harvest data on viewer preferences and support new advertising models
And here’s where it starts to get disturbing. Viewer-preference tracking is not just the basis of the ads that follow you all over the internet; its role in opinion formation is arguably a significant reason we have the president we now do. Here we go further still: the tracking covers not just the movies you like but the elements within those movies that hold your attention, and the preferences inferred from that.
Explicit interactivity is going to yield to implicit interactivity, where the movie is watching you, and viewing is customized to a degree that is hard to imagine. Suppose that the movie knows that you’re a man, and a male walks in and you show signs of attraction. The plot could change to make him gay.
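The implicit-interactivity loop described above — attention signals in, plot branch out — can be sketched in a few lines. Everything here is invented for illustration (the class, the thresholds, the branch names); no real product exposes an API like this, but the shape of the decision is roughly what the piece imagines:

```python
# Hypothetical sketch of the "movie watching you" loop. All names and
# thresholds are invented for illustration, not drawn from any real system.

from dataclasses import dataclass


@dataclass
class ViewerSignals:
    """Aggregated attention data for one scene (eye tracking + vitals)."""
    gaze_seconds: dict          # seconds of gaze per on-screen character
    heart_rate_delta: float     # change vs. the viewer's baseline, in bpm


def choose_branch(signals: ViewerSignals) -> str:
    """Pick the next scene variant from where attention actually went."""
    if not signals.gaze_seconds:
        return "default_cut"
    # Which character held the viewer's eye the longest?
    focus = max(signals.gaze_seconds, key=signals.gaze_seconds.get)
    engaged = signals.heart_rate_delta > 10.0  # crude "engagement" proxy
    if engaged:
        # Lean in: give the watched character more screen time and viewpoint.
        return f"expand_{focus}"
    # The Hitchcock option: withhold the watched character to build suspense.
    return f"withhold_{focus}"


signals = ViewerSignals(
    gaze_seconds={"stranger": 14.2, "detective": 3.1},
    heart_rate_delta=12.5,
)
print(choose_branch(signals))  # -> expand_stranger
```

Note that the same inputs can drive opposite creative choices — giving you more of a character or deliberately less — which is exactly the ambiguity the paragraph above points at.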
…The cinematic use of eye tracking [is] technology that is not yet commercially widespread but will likely soon be.
Which three-letter-agency (FBI? CIA? NSA?) will be the first to subpoena those records and hook them up with its own biometrics, identifying what an individual saw, what they paid attention to, who they were with and where they were?
Fortunately, though, you have nothing to worry about. Not that you know of, anyway. Not yet. In 1992, Michael (“Westworld”) Crichton wrote a thriller (Rising Sun) whose solution hinged on digital records of a crime scene. In a proposal far-fetched for its time, Crichton suggested the fiendish Japanese (yes, it’s that kind of book, sorry) might be able to hack and recreate such footage to falsify its details. And now here we go, from the New Yorker piece again:
The team had written software that made it possible to manipulate objects in the film—pick them up and move them. Bloch thought the software had immediate commercial potential. Walking back to his office, he expressed his excitement about what it would mean to permanently alter a scene: to tamper with evidence in a crime drama, say, and know that the set would stay that way. “It makes the world’s existence more coherent,” he said.
Choosing your own ending is one thing. Having someone else pick it for you, and be able to demonstrate it was that way all along — that’s another thing altogether.
Minority Report anybody?
Almost turning it on its head, in fact. In Minority Report, the system predicted crimes not yet committed. Here the system could create a crime not committed at all — at least, not by that individual — and furnish the “evidence” to prove it.