Rubix by Chris Kelly
This conceptual technology by architecture graduate Chris Kelly would allow individuals to project digital imagery over their perception of reality and then manipulate it like the layers of a Rubik's Cube (+ movie).
Chris Kelly developed the concept for his graduation project at the University of Greenwich, exploring how flaws in human perception can cause contradictions with reality and how virtual environments can be used to reveal more about a person's surroundings.
"Our understanding of space is not always a direct function of the sensory input but a perceptual undertaking in the brain where we are constantly making subconscious judgements that accept or reject possibilities supplied to us from our sensory receptors," he says. "This process can lead to illusions or manipulations of space that the brain perceives to be reality."
The idea draws on the science of perception: the senses gather multiple streams of data every second, which the brain then selects or rejects. Kelly proposes a digital device that would compile all of this information and relay it back to the individual within the limits of their physical space.
"The redirection techniques and the use of overlapping architecture allow the same physical space to hold a much larger virtual space," he told Dezeen.
Referencing emerging augmented reality technologies such as bionic contact lenses and the voice-controlled Google Glass headset, Kelly explains that the technology could be used in endless scenarios.
"One of the more obvious uses is in the gaming industry. Another possible use is in the architectural design process, where rather than creating fly throughs or models that can be viewed on a screen it would be possible to actually move through a virtual mock up of a design or even work from inside a virtual model whilst editing it in real time," he says.
Chris Kelly completed the project for Unit 15 of the architecture diploma course at the University of Greenwich, now led by the Bartlett School of Architecture's former Vice Dean Neil Spiller. The unit is a reincarnation of the Bartlett's successful film and animation module, which boasts Kibwe Tavares' award-winning Robots of Brixton project as one of its products.
See more of this year's graduation projects, including a series of towering seaside structures and a shape-shifting ballet school.
Here's a short description from Chris Kelly:
Rubix
The project was conceived as a complementary exercise to the written architectural thesis Time and Relative Dimensions in Space: The Possibilities of Utilising Virtual[ly Impossible] Environments in Architecture, which explores how virtual environments could be deployed within the physical world to expand or compress space. The thesis examined existing research in neuroscience, psychology and philosophy, supplemented by empirical primary tests, to identify gaps in our perception that lead to contradictions between perception and reality. It found that when we move with natural locomotion, such as walking through a physical space, our perception of distance and orientation is remarkably malleable and can be manipulated by replacing the visual sense with a virtual stimulus that differs from what we would experience in reality. This manipulation can take the form of redirection techniques, such as rotation and translation gains and overlapping architecture, which stretch or compress distances in the virtual environment we see whilst moving through a physical space. The effect creates a TARDIS space, allowing vast expanses of virtual worlds to be explored within a small physical space without ever reaching its limits.
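To give a sense of how rotation and translation gains work in practice, the sketch below shows the core of a typical redirected-walking update loop: tracked physical motion is scaled before it is applied to the virtual camera, so the virtual world can feel larger than the room it is experienced in. The gain values, the VirtualCamera class and its update method are illustrative assumptions for this sketch, not anything specified in Kelly's project.

```python
import math

# Illustrative redirected-walking gains (assumed values, not from the project).
ROTATION_GAIN = 1.3     # the virtual scene turns 1.3x as far as the user's body does
TRANSLATION_GAIN = 1.5  # each physical metre walked maps to 1.5 virtual metres

class VirtualCamera:
    def __init__(self):
        self.heading = 0.0          # virtual heading in radians
        self.position = [0.0, 0.0]  # virtual x, y position in metres

    def update(self, real_turn, real_step):
        """Map one frame of tracked head motion into the virtual scene.

        real_turn: change in physical heading this frame (radians)
        real_step: physical distance walked this frame (metres)
        """
        # Amplify rotation so the virtual world rotates further than the body.
        self.heading += real_turn * ROTATION_GAIN
        # Amplify translation so virtual distances stretch beyond the room.
        virtual_step = real_step * TRANSLATION_GAIN
        self.position[0] += virtual_step * math.cos(self.heading)
        self.position[1] += virtual_step * math.sin(self.heading)


# Example: walking 4 m and turning 90 degrees in a small room is perceived
# as a 6 m traverse and a 117-degree turn in the virtual environment.
cam = VirtualCamera()
cam.update(real_turn=math.radians(90), real_step=4.0)
print(cam.heading, cam.position)
```

In redirected-walking research such gains are usually kept small enough that the mismatch between physical and virtual motion goes unnoticed, which is the subtle deployment Kelly describes below.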
The aim of the Rubix project was to develop an animation describing a conceptual tool for deploying these malleable virtual environments, one that could be used by their creators to shift space around us. The Rubix concept stemmed from the need for an algorithmic formula to control the use of redirection techniques: it allows for many different spatial combinations whilst a level of control is constantly maintained. In the animation, the initial Escher-esque space is a representation of our perceptual system, where huge amounts of information arrive in the brain from multiple streams. The process of perception involves the brain selecting and rejecting contradictory pieces of information, leading to a perception of reality that only gives us glimpses into the world we are in.
The animation represents a journey through the site explored during an earlier project, a stretch of the Docklands Light Railway between Beckton and East India stations. The virtual journey is compressed into five minutes using transitional spaces that enclose the explorer whilst the environment shifts around them. The redirection techniques deployed in the film have been exaggerated in places to make them more identifiable, but as explored in the thesis it is also possible to deploy them so subtly that the shifts in the environment would not be perceived. The development of products such as Google Glass and the bionic contact lenses at the University of Washington means it is becoming increasingly possible to overlay virtual information on the physical world. In future this information could be overlaid so subtly and convincingly that distance and space become increasingly malleable, and cavernous virtual spaces could exist within a small physical space, with Doctor Who's TARDIS becoming a perceived reality.