Integral World: Exploring Theories of Everything
An independent forum for a critical discussion of the integral philosophy of Ken Wilber
David Christopher Lane, Ph.D., Professor of Philosophy, Mt. San Antonio College; Lecturer in Religious Studies, California State University, Long Beach. Author of Exposing Cults: When the Skeptical Mind Confronts the Mystical (New York and London: Garland Publishers, 1994) and The Radhasoami Tradition: A Critical History of Guru Succession (New York and London: Garland Publishers, 1992).


The Frictionless Future

When "Extended" Reality Becomes Seamlessly
Integrated into our Day to Day Lives

David Lane

We don't need to indulge in Kurzweilian hype to acknowledge that the impossible is becoming possible quicker than expected.

Humans can be a lazy species at times. This is particularly true when it comes to adopting new technologies that require concerted effort to understand how they operate. As video game designers know all too well, if a game takes too long to load, most of us won't bother playing it again. Thus, making a product easier to work with gives it a much better chance of mass adoption, especially if it provides value not found elsewhere.

A good illustration of this point is the 2007 introduction of the first Apple iPhone. While the touch screen was certainly magical, with its pinch, expand, and swipe capabilities, the real plus was how easy the phone was to use because it had a dedicated “home” button. This one feature was simple to understand and provided a safety net.

David Christopher Lane and his youngest son, Kelly Joseph Lane

John Carmack, in his recent keynote talk at the Oculus Connect 6 conference held in San Jose, California, lucidly explained that virtual and augmented reality will only take off if they become more or less “frictionless.” It is inconvenient to don a VR headset that has to be tethered to a high-end (and overly expensive) computer with sensors mounted around the room, as with the first iteration of the HTC Vive. It is for this reason that Oculus decided to create a standalone headset with built-in sensors that didn't require any external computational connections. Facebook's first offering in this direction was the Oculus Go, which, though limited to a single handheld controller and a constrained FOV (field of view), was relatively painless to use. However, because it lacked two fully functioning hand controllers and other key features, it couldn't exploit all the wonderful capabilities that virtual reality has to offer.

A year later, Facebook introduced the first truly integrated and fully mobile VR headset, the Oculus Quest, which included hand tracking and four headset-mounted sensors that allow for much larger spaces in which to explore all things virtual.

Digitaltrends.com explains the differences between the two headsets[1]:

The Oculus Go sports a single 2,560 x 1,440 LCD panel offering 1,280 x 1,440 pixels per eye and operates at a standard 60Hz refresh rate, though some developers can unlock extra performance for a 72Hz refresh rate if the app isn't too taxing otherwise. Powering that display is a Qualcomm Snapdragon 821 processor with its own on board graphics.
In comparison, the Oculus Quest has twin OLED displays which have richer colors and deeper blacks than the Oculus Go's, but do suffer more from ghosting in high-contrast scenes. They each have a resolution of 1,440 x 1,600, which works out to a total resolution of 2,880 x 1,600 — a slight improvement that smooths out the screen door effect a little more, but is more noticeable an enhancement over the Oculus Rift. It also operates at a stock (and locked) refresh rate of 72Hz, for smoother visuals and what should be a slightly more comfortable viewing experience.
Powering those displays is a newer and more capable Qualcomm Snapdragon 835. It's still a far cry from what PC-tethered headsets are capable of, but it should give a little more graphical grunt to the Quest over the Go, leading to better visuals once developers get to grips with the hardware.…
The starkest difference between the Oculus Quest and Oculus Go is in their tracking capabilities and input options. While neither require external trackers like the Oculus Rift, the Oculus Go utilizes on board sensors like a gyroscope, accelerometer, and magnetometer to deliver only three-degrees-of-freedom. That is tilt and rotation tracking. It cannot do positional tracking so won't notice if you move side to side, up or down, or forward and backwards. Its controller is also quite limited in that it's a small, wireless remote control with a few button inputs — though some users have reportedly found success connecting third-party gamepads for greater input options.
In comparison, the Oculus Quest offers much more nuanced and expansive tracking and input options. It uses four corner-mounted sensors on the headset itself to track the wearer in 3D space, giving six-degrees-of-freedom that allows for roomscale experiences that are typically only associated with tethered headset solutions. It also offers boundary and object tracking to make sure users don't walk into chairs or walls.
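
To make the three- versus six-degrees-of-freedom distinction concrete, here is a minimal, purely illustrative Python sketch; the class and field names are my own and do not come from any Oculus SDK. A 3DoF headset like the Go reports only orientation, while a 6DoF headset like the Quest also reports position, which is what makes roomscale movement and boundary tracking possible.

from dataclasses import dataclass

@dataclass
class Pose3DoF:
    # Orientation only: a gyroscope, accelerometer, and magnetometer can
    # report which way the wearer is looking, but not where the head is.
    yaw: float    # turning left/right, in degrees
    pitch: float  # looking up/down
    roll: float   # tilting the head side to side

@dataclass
class Pose6DoF(Pose3DoF):
    # Orientation plus position: headset-mounted cameras also track
    # translation, so leaning, crouching, and walking are captured.
    x: float  # meters side to side
    y: float  # meters up and down
    z: float  # meters forward and backward

# A 3DoF device cannot tell these two situations apart; a 6DoF device can.
standing = Pose6DoF(yaw=0.0, pitch=0.0, roll=0.0, x=0.0, y=1.7, z=0.0)
leaning  = Pose6DoF(yaw=0.0, pitch=0.0, roll=0.0, x=0.3, y=1.6, z=0.2)
print(standing, leaning)

In this picture, a boundary system amounts to checking whether the positional coordinates stray outside a user-drawn play area, which is something a 3DoF headset simply cannot do.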

While VR has been slow in coming, and AR even more so, it doesn't take Nostradamus to see that the new digital wave is starting to grow—so much so, in fact, that it won't be long until smart phones are replaced with smart glasses, smart rings, and eventually smart neural implants.

Lest we forget, it was only five years ago that Palmer Luckey sold Oculus to Facebook for nearly 3 billion dollars, and just three years ago that their first commercial headset was made available. Given the wild success of the Oculus Quest (Mark Zuckerberg claimed in his recent keynote address that they are selling out as quickly as they make them) and an assortment of related AR products, we are seeing the artificial and the real world melding together in ways previously unimaginable. Indeed, we are “extending” reality in so many ways and in so many directions that we now spend nearly half our day or more absorbed in data-driven simulations.

It is not a stretch to imagine that in short order those very simulations will become operative overlays on all that we experience moment to moment. Netflix's somewhat creepy, but nevertheless prophetic, movie Anon tells of a future where everyone is cybernetically augmented. The technological implants are essentially seamless, and whenever one encounters another person or place, a full read-out display provides a biographical or historical overview.

We are already there in a very real sense, since almost everyone (even some in the poorest of countries) has a smartphone. These amazingly powerful hand-held computers run an array of applications, including ones that can augment any given surrounding with massive amounts of information.

That we are living in cyberspace is no longer an envisioned fiction of 1984's Neuromancer but the 2019 world of today. William Gibson's future has already arrived, even if “it's not very evenly distributed.”

The attendant hassles of “extended” reality are getting resolved much quicker than anticipated. I remember, just four years ago, getting into a heated discussion with my then fourteen-year-old son, Shaun, who was well versed in the latest tech trends, arguing that VR had to become portable and untethered from any external device before it would become popular. He countered that it would take at least another decade before that happened. It took less than four years. Though Moore's rule—it's not really a law—is reaching its end point, there are a number of workarounds by which computers and other devices can keep becoming more powerful and cheaper each year.
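
To put the Moore's-rule point in rough numbers: if chip density doubles every two years, a decade yields roughly a 32-fold gain, and stretching the doubling period shrinks that gain sharply, which is why the workarounds just mentioned (specialized mobile processors, better software, offloading work to the cloud) matter so much. A back-of-the-envelope sketch, assuming the classic doubling model:

# Back-of-the-envelope Moore's-law arithmetic (illustrative only).
def relative_density(years, doubling_period=2.0):
    # How many times denser a chip becomes after `years`,
    # assuming one doubling every `doubling_period` years.
    return 2 ** (years / doubling_period)

print(relative_density(10))       # ~32x over a decade at the classic two-year pace
print(relative_density(10, 3.0))  # ~10x if the doubling period stretches to three years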

We don't need to indulge in Kurzweilian hype to acknowledge that the impossible is becoming possible quicker than expected, and in so doing is ushering in a whole new social landscape without borders. We are not merely being deluged with data; we are swimming in oceans of it and adding to it each minute with our own cloud output.

Elon Musk suggests that we are already cyborgs because we have attached ourselves to smartphones and watches that are always within arm's reach. The next step on the ladder is to absorb into our own bodies the ability to be Net-connected. I was somewhat surprised to learn at the Oculus Connect 6 conference that Facebook engineers are already beta testing a “brain-machine interface from CTRL-labs [which] reads electrical signals through your arm and decodes the data into commands that can, in theory, control electronic devices.” We even saw a mini demonstration of how it could potentially work in video games. As Andrew Bosworth explained to PC Magazine,

"We know there are more natural, intuitive ways to interact with devices. And we want to build them. It's why we've agreed to acquire CTRL-labs."

According to the article, “Facebook has been developing mind-reading tech of its own. Its prototype involves a 'non-invasive' headset that uses infrared light to read brain activity. The goal is to decode words imagined in your mind . . . . The tech could be used to power virtual reality products such as Oculus Rift, which functions with the help of hand controllers. A brain-machine interface could streamline the experience by seamlessly translating thoughts into computer commands.”
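
Neither Facebook nor CTRL-labs has published the details of how such a decoder works, so the following Python sketch is only a schematic of the general pipeline described above: sample an electrical (EMG) signal, extract a simple feature, and map it to a small vocabulary of device commands. Every name here (sample_emg, decode_command, the thresholds) is hypothetical.

import random

def sample_emg(n=64):
    # Hypothetical stand-in for n samples of muscle (EMG) activity read
    # from a wristband sensor; real hardware would supply these values.
    return [random.gauss(0.0, 1.0) for _ in range(n)]

def extract_feature(window):
    # A crude feature: the average absolute amplitude of the signal window.
    return sum(abs(v) for v in window) / len(window)

def decode_command(feature):
    # Map the feature onto a tiny command vocabulary. A real interface
    # would use a trained classifier rather than fixed thresholds.
    if feature > 1.2:
        return "select"
    if feature > 0.9:
        return "scroll"
    return "idle"

print(decode_command(extract_feature(sample_emg())))

A production system would replace the fixed thresholds with a classifier trained on each user's signals, but the sample, feature, command structure captures the basic idea of translating bodily signals into computer commands.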

In looking back at our own evolutionary trajectory, we can see that our species is driven to alter its environment to better suit its needs. This has involved exerting tremendous amounts of energy to rearrange matter—from shaping rocks into weapons to landscaping plots of land for agriculture. The information age, however, allows for exponential change, since it moves “bytes” of data at the speed of electrons. Computers have ushered in transformative ways of getting things done quicker, cheaper, and more efficiently, even in how we arrange and architect matter at the atomic level.

Over the last 12 years, smartphones have helped dematerialize our consumption of countless other products. (Wired)

Andrew McAfee, who earlier taught at Harvard University (where he received his doctorate) and is now a principal research scientist at M.I.T.'s Sloan School of Management, makes a provocative argument in his recently released book, More from Less: The Surprising Story of How We Learned to Prosper Using Fewer Resources—and What Happens Next: intelligent devices, such as Apple's iPhone, are helping “save the planet”—contrarian as that may sound at first, given how much electricity they consume in a year.

But, as McAfee points out in a widely circulated opinion piece published by Wired magazine[2],

More from Less, Andrew McAfee (book cover)

“The US remains an industrial powerhouse, responsible for about 25 percent of the global economy. So what's going on? How is it that the country changed course and learned to tread more lightly on the earth over time?
Steve Cichon, a 'writer, historian, and retired radio newsman,' got a clue in 2014 when he paid $3 for a stack of old Buffalo News newspapers. On the back page of the February 16, 1991, issue was an ad from the electronics retailer Radio Shack. Cichon noticed something striking about the ad: 'There are 15 electronic gizmo type items on this page … 13 of the 15 you now always have in your pocket.'
The “gizmo type” items that had vanished into the iPhone that Cichon kept in his pocket included a calculator, camcorder, clock radio, mobile telephone, and tape recorder. And while the ad didn't include a compass, camera, barometer, altimeter, accelerometer, or GPS device, these too have vanished into the iPhone and other smartphones.
Cichon's find shows us that when thinking about their overall impact on the planet, it's not helpful to think in isolation about producing 2 billion iPhones. Instead, we should think about a counterfactual: What would have been produced over the past 12 years in a smartphone-free world? The answer, clearly, is a lot more: a lot more gear, and a lot more media.
Sales of point-and-shoot cameras, camcorders, film, and videotapes have plummeted in recent years, but that's not because we stopped caring about pictures and videos. Instead, it's because a device called the smartphone came along that let us dematerialize our consumption of these things. Dematerialization is an idea that goes back at least to the 1920s (with R. Buckminster Fuller's concept of 'ephemeralization'), and evidence from the US and other high-income countries shows that it's an idea whose time has finally come.
Why now? There are two causes, the first of which is technological progress. The ever-more powerful and popular iPhone is the poster child for this progress, but the technologies of dematerialization are all over the place. Computer-aided design enables thinner aluminum cans, lighter buildings, and more fuel-efficient engines. Sensors and machine learning allow energy-intensive facilities to be run more efficiently. Precision agriculture lets farmers increase crop tonnage while using less land, water, and fertilizer. These technologies do require a lot of electricity in aggregate, but they also save a lot of energy throughout the economy. This is why US electricity consumption is flatlining and total energy use is scarcely higher than it was before the recession began.”

But forget the iPhone and all the wondrous things it can do; imagine instead a VR or AR headset and the possibilities it holds. We can create work spaces, entertainment complexes, social arenas, and fantastic environments, and we can interact and have “presence” with anyone around the globe—all while staying within the confines of our own living room. While it is certainly true that the iPhone reduced many gizmos to one all-purpose device, VR and AR are truly tech's equivalent of a digital Swiss Army knife on steroids, with capabilities yet to be fully explored or harnessed.

Yes, to be sure, there is no ultimate technological utopia, but things are indeed getting better in the aggregate, even though there are still horrid pockets of existence in dire need of improvement. Steven Pinker, the Johnstone Professor of Psychology at Harvard, in his two most recent books, The Better Angels of Our Nature and Enlightenment Now, paints a much brighter future for humanity than we might at first suspect. He clearly shows that science and technology are improving lives, despite many of the horrific hazards they have unleashed in the process. As I underlined in the Integral World essay, "108 Billion: Living the Pinkerian Thesis,"

“One of the more dramatic improvements over the last 150 years is how many people died from famine in the latter part of the 1800s—with the peak being 142.6 per 100,000 in the 1870s—contrasted with the early 2000s where it is now less than 5 per 100,000. With the progress of modern medicine, many diseases have either been greatly reduced or almost altogether eliminated. Alexander Fleming's discovery of the antibiotic effect of penicillin, for instance, is estimated to have saved the lives of eighty to two hundred million individuals across the globe.”

Such optimism may seem misplaced when we still see so much pain and suffering in the world, but all one has to do is reflect back and think of a time when smallpox was rampant, when tuberculosis was untreatable, when polio was the scourge of the young, and when advanced medical treatment was “bloodletting.” Perhaps Andrew McAfee's positive outlook isn't so misplaced when he writes,

“But we don't need to worry that the iPhone and its digital kin are going to gobble up the planet, or even put a big dent in it. In fact, they're doing the opposite. They're bringing us into a second Enlightenment—a physical one this time, rather than an intellectual one. During the 21st century, I predict that this Enlightenment will spread from the US and other rich countries to low-income parts of the world, and we'll finally enter a stable and healthy relationship with the whole Earth.”

NOTES

[1] Jon Martindale, "Oculus Quest vs. Oculus Go", digitaltrends.com, October 2, 2018.

[2] Andrew McAfee, "How the iPhone Helped Save the Planet", wired.com, June 29, 2019.



