This wearable design concept helps people with epilepsy manage symptoms, predict potential seizures and alert passersby or loved ones when a seizure occurs.
The Dialog device, developed by American technology company Artefact, would use a wearable sensor and an iPhone app to help monitor patients’ vital signs and keep a log of conditions leading up to, during, and after a seizure.
“There are currently three million epilepsy sufferers in America, and it is the third most common neurological disorder after Alzheimer’s and stroke,” said Matthew Jordan, the project leader.
Current solutions, according to Artefact, only focus on detection, alert or journaling and don’t address the whole experience of living with the condition.
The Dialog would address this by creating a digital network that connects the person living with epilepsy to caregivers, doctors and members of the public who have installed the Dialog app, giving them data and instructions on how to assist.
The user attaches a nodule to the skin, which can be done either using transparent adhesive paper or by wearing it in a bracket that looks like a watch.
Using a series of sensors that monitors hydration, temperature, and heart rate, it gathers information on the wearer and stores the data on a smartphone.
Additionally, the sensor would prompt the wearer to take medication and to record their mood through its touchscreen, and would log information about local climate conditions that could increase the likelihood of a seizure.
In the event of a fit, the wearer simply grasps the sensor, which alerts a caregiver and anyone within close proximity of the sufferer who has downloaded the app.
“It helps possible first responders be notified that a patient who is nearby is having a sustained seizure, directs the bystander to the patient, gives instructions on how to help the patient through the emergency, and affords a direct line of communication to the family caregiver,” said Jordan.
When the seizure ends, information about its length, along with other contextual information, is displayed on the user’s smartphone to help the wearer reorient themselves.
With the information generated by wearing the sensor, the app would then be able to learn which conditions or vital signs indicate that a seizure is imminent and alert all parties, giving the wearer time to take preventative action.
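In very simplified form, the kind of rule the app might learn could be sketched as a comparison of current vitals against the wearer’s personal baseline. Everything below — field names, thresholds, the two-out-of-three rule — is an illustrative assumption, not a detail of the actual Dialog concept.

```python
# Hypothetical sketch: flag a possible seizure risk when several vital
# signs drift from the wearer's personal baseline at the same time.

def seizure_risk(vitals, baseline, drift=0.15):
    """Count how many vital signs deviate more than `drift` (15%)
    from the wearer's personal baseline."""
    flags = [
        abs(vitals[k] - baseline[k]) / baseline[k] > drift
        for k in ("hydration", "temperature", "heart_rate")
    ]
    return sum(flags)

def should_alert(vitals, baseline):
    # Alert caregivers only when two or more indicators drift together,
    # to avoid firing on a single noisy reading.
    return seizure_risk(vitals, baseline) >= 2

baseline = {"hydration": 0.60, "temperature": 36.8, "heart_rate": 72}
normal = {"hydration": 0.59, "temperature": 36.9, "heart_rate": 75}
elevated = {"hydration": 0.45, "temperature": 37.9, "heart_rate": 110}
```

A real system would learn per-patient thresholds from logged data rather than use fixed percentages, but the shape of the decision — baseline, drift, combined alert — is the same.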
A doctor would be able to access all of the data generated by the app and make changes to medication or offer insights into causes and symptoms.
“At this point, the device is a concept, but we designed it with technologies and components in mind that are currently in development or being tested in labs and research centres,” said Emilia Palaveeva, another member of the Dialog team.
Dezeen and MINI Frontiers: wearable technology will revolutionise healthcare for doctors and patients alike, says the director of design studio Vitamins in our final movie from December’s Wearable Futures conference.
“In the future there’s no doubt that wearable technologies are going to be part of our everyday lives,” says Duncan Fitzsimons of Vitamins. Increased usage of personal health-monitoring devices will be one example of this, he says, making “the doctor-patient relationship change [for the] better”.
Fitzsimons explains how the current constraints on an appointment between patient and doctor – lack of time and lack of information – can be mitigated by personal monitoring devices that collect patient data over a long period of time.
“When we are ill at the moment we only see the doctor for a very small amount of time. This is just a snapshot in the progress of your illness,” he says.
“If [a doctor] has access to a wider amount of data, they’ll be able to see how your illness has started, progressed and perhaps is tailing off,” he continues. “That will enable them to have a lot more information to diagnose you better and also enable you to have a more transparent window into your health so that you can understand it better as well.”
For these benefits to be realised, Fitzsimons says the technology to record this data needs to be attractive and easy to use, citing two examples of products by healthcare company Qardio: the QardioArm, which measures blood pressure, and the QardioCore, a wearable ECG (electrocardiogram) monitoring device, commonly used to detect abnormal heart rhythms. Both are designed, says Fitzsimons, to look unlike medical devices and use a smartphone as the interface with the patient.
[The above paragraph was amended on 27 February 2014. Previously, it was stated that Vitamins would be launching the QardioArm and QardioCore products.]
Fitzsimons is the co-founder of Vitamins, the design studio which last year won the transport category at the Design Museum Designs of the Year 2013 awards for its Folding Wheel project.
This is the fifth and final movie from the two-day Wearable Futures conference that explored how smart materials and new technologies are helping to make wearable technology one of the most talked-about topics in the fields of design and technology.
Alchemist Lauren Bowker applied heat-sensitive ink to a sculptural leather garment and used fire to alter its colour during a presentation for her company The Unseen.
Coinciding with London Fashion Week earlier this month, Bowker’s design house The Unseen debuted a series of garments embedded with her colour-changing ink at an event in the Dead House – a series of vaulted passages beneath Somerset House where her studio is located.
She created a giant black headdress made from overlapping layers of hand-stitched leather that engulfed the wearer like a shell, completely covering the head and extending down past the hips.
During the presentation, a figure wearing this headdress was led down a tunnel and positioned beneath a spotlight. Large flames erupted around the garment as wicks that protruded from the body were lit in unison.
As the heat from the fire lapped the material, peacock-tail colours began to emerge and disperse across the surface. When the flames died down, the green and purple tones remained on the material as the model was led back into the depths of the underground vaults.
The collection also included garments worn over the torso that react to the movement of air, changing colour as environmental conditions shift in varying climates and when people come close or walk past.
“Seasonally each piece exhibits different tones of colour,” Bowker told Dezeen. “The summer environment will create a brightly coloured jacket that will dull in the wind to become black again, whereas in the winter the pieces are black until the wind hits them then revealing the colour shift.”
Made in a similar layered style to the larger heat-responsive piece, these designs were displayed on models in alcoves along the subterranean tunnels.
“The fins in each jacket are shaped and designed to create turbulence trips within the wind – triggering the colour-change response,” said Bowker.
“We see a lot of exciting [wearable technology] projects, a lot of design prototyping going on,” says Pauline van Dongen, who was speaking at the Wearable Futures conference held in December at Ravensbourne. “It’s really amazing how quickly things are evolving.”
Despite this, van Dongen says that unless the resulting products are comfortable and visually appealing fashion pieces in their own right, they won’t take off.
“It’s very important to stress the wearability,” she says. “I think it’s the only way to connect to the market, to connect to people and to transcend the realm of gadgets.”
Van Dongen launched her womenswear label, which specialises in combining fashion and technology, in 2010. Her Wearable Solar range consists of a dress that incorporates 72 flexible solar panels as well as a coat that has 48 rigid crystalline solar cells.
“Both prototypes have a modular element where you can reveal the solar panels when the sun shines but you can also hide them and wear them close to your body,” she explains. “When you wear them in full sun for one hour they can generate enough energy to charge your typical smartphone 50 percent.”
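That charging claim can be checked with a quick back-of-envelope calculation. The battery capacity below is an illustrative assumption (a typical smartphone of the period), not a figure from the designer.

```python
# Rough sanity check: one hour of full sun charging a typical
# smartphone to 50%, spread across the dress's 72 flexible cells.

battery_wh = 11.0              # assumed: ~3000 mAh at 3.7 V, a typical smartphone battery
target_wh = 0.5 * battery_wh   # half a charge
hours_of_sun = 1.0

required_watts = target_wh / hours_of_sun    # power the garment must deliver
per_cell_mw = required_watts / 72 * 1000     # averaged over the 72 cells

print(round(required_watts, 1))  # 5.5 W total
print(round(per_cell_mw, 1))     # ~76.4 mW per cell
```

Under those assumptions each flexible cell would need to deliver only tens of milliwatts, which is a plausible order of magnitude for small solar cells in direct sunlight.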
Van Dongen is aware that there will be significant production challenges to overcome before products like hers become commercially viable.
“It’s important to think how all these new designs can be integrated into the production chain,” she says. “An important next step to take wearable technology to another level is to look at the commercialisation of it.”
This is the fourth movie from the two-day Wearable Futures conference.
Dezeen and MINI Frontiers: scientists are combining non-living chemicals to create materials with the properties of living organisms, says the creator of a self-repairing shoe made from protocells.
Protocells, as the chemical cocktails are known, are made by mixing basic non-living molecules in lab conditions. These then combine to create substances that exhibit some of the characteristics of living cells: the ability to metabolise food, to move and to reproduce.
In this movie Dezeen filmed at the Wearable Futures conference in December, designer and materials researcher Shamees Aden explains how “scientists are now mixing together groups of chemicals [to make] them behave like living cells. They are able to reconfigure, they are able to adapt to light, pressure and heat.”
The synthetic production of living materials is so far limited to basic applications – modifying the behaviour of oil droplets in a water solution, for example – but Aden has developed a proposal that uses protocells to make self-regenerating soles for a pair of running shoes.
The Amoeba running shoes designed by Aden would use protocells’ ability to respond to pressure, inflating or deflating according to the texture of the ground the wearer is running on to provide more or less cushioning.
Protocells, which have a limited life span, would be replenished after each run, explains Aden. “Your shoe box would be a vessel which would hold the [protocell] liquid inside. You could buy your protocell liquid and it would be dyed any colour you like and you would pour that in and as the shoe is rejuvenated the colours would emerge.”
The speculative project is the result of a collaboration with chemist Dr Martin Hanczyc of the Institute of Physics and Chemistry and the Center for Fundamental Living Technology (FLinT) in Denmark, who has worked extensively on protocells.
“At this point it is a speculative design project but it is grounded in real science and it could be in production by 2050,” says Aden.
This is the third movie from the two-day Wearable Futures conference.
In the first movie, designer of Dita von Teese’s 3D-printed gown Francis Bitonti explained how advances in design software mean “materials are becoming media”. In the second, Suzanne Lee explained how she makes clothes “grown using bacteria.”
Dezeen and MINI Frontiers: Suzanne Lee of BioCouture explains how she makes clothes that are “grown using bacteria” in this movie filmed at the Wearable Futures conference in London in December.
“There’s a whole spectrum of organisms that can grow material,” says Lee, who founded BioCouture to explore how organisms like bacteria, yeast, fungi and algae could be harnessed to produce fabrics.
Lee showed the Wearable Futures audience a range of jackets and shoes made from bacterial cellulose – a material with properties similar to leather, produced by bacteria growing in a vat of liquid.
“The recipe that I’ve been exploring to grow a piece of clothing is using a symbiotic mix of yeast and bacteria,” she said. “It’s a fermentation method that grows you bacterial cellulose. It’s kind of like a vegetable leather if you like.”
She adds: “What attracts me to it is that it’s compostable. It’s not just biodegradable, it’s compostable. So you could throw it away like you would your vegetable peelings.”
BioCouture is a London-based design consultancy that is pioneering the use of bio-materials for the fashion, sportswear and luxury sectors.
Lee is a former senior research fellow at the School of Fashion & Textiles at Central Saint Martins College of Art & Design, and author of the 2007 book Fashioning the Future: Tomorrow’s Wardrobe, which was the first publication to explore how technology could transform fashion.
“Through an engagement with biology I’m really excited about how we can think about organisms like microbes as the factories of the future,” says Lee. “What most people know BioCouture for is a series of garments that were grown using bacteria. So the fibres, the material itself and the formation of the garment has been done by a microbe rather than a plant.”
In future, Lee believes that clothing materials themselves could be living organisms that could work symbiotically with the body to nourish it and even monitor it for signs of disease.
“What we have right now are living organisms making us materials, but then the organism is killed and the material just exists like any other,” she says.
“But I can imagine that we will eventually move towards the material itself being living while it’s on you, and having a direct relationship to your whole body in this happy micro-biome environment and perhaps diagnosing and treating, nourishing in some way the body surface so becoming part of your wellbeing.”
The two-day Wearable Futures conference explored how smart materials and new technologies are helping to make wearable technology one of the most talked-about topics in the fields of design and technology.
“I think augmented reality and virtual reality will essentially converge into the same thing”, says Millns.
The co-founder of Inition explains that the next generation of appliances will blur the once-clear distinction between augmented technology devices like Google Glass and virtual reality devices like the Oculus Rift headset.
“There’s two strains of headsets: the Google Glass-type which only gives you a small image in the corner of your field of view,” says Millns, referring to Google’s augmented reality spectacles which can overlay digital information like maps and internet searches into the user’s field of vision.
“The other strain is the Oculus Rift type, which is designed to replace the entire world and give you a high resolution and the biggest picture possible,” says Millns, referring to the strap-on motion-responsive virtual reality goggles from Oculus VR.
“Eventually those two things will converge [into] some sort of contact lens which goes in your eye and does both of those things. It will give you a huge image at high resolution but also the ability to see through and mix images with the real world”, says Millns.
Millns also predicts that the integration between displays and humans will become tighter and tighter, leading to what he calls a “cyborg situation where you have something embedded inside your brain that has a direct interface to your visual cortex.”
Interview: when designer Isabelle Olsson joined the secret Google X lab in 2011, Google Glass looked like a cross between a scuba mask and a cellphone. In this exclusive interview, Olsson tells Dezeen how she turned the clunky prototype into something “beautiful and comfortable”.
“When I first joined I had no idea what I was going to work on,” she said, speaking via a Google Hangout video link from New York. “Then I walked into a room full of engineers wearing a prototype of the glasses. These were very crude 3D-printed frames with a cellphone battery strapped to the legs. They weighed about 200 grams.”
She was given her first brief, which was “to make this beautiful and comfortable”.
“My initial goal was: how do we make this incredibly light? I set up three design principles; if you have something that is very complex you need to stick to some principles. The first was lightness, the second was simplicity and the third scalability”.
“We would first start by sketching by hand,” she said. “Then we would draw in Illustrator or a 2D programme. Then we would laser-cut these shapes in paper.”
“After many iterations the team would start to make models in a harder material, like plastic. And then we got into laser-cutting metals. So it was an intricate, long, back-and-forth process.”
This painstaking, craft-led approach was essential when designing something that will be worn on the face, Olsson believes.
“A 0.2mm height difference makes a complete difference to the way they look on your face,” she said. “What looks good on the computer doesn’t necessarily translate, especially with something that goes on your face. So as soon as you have an idea you need to prototype it. The next stage is about trying it on a couple of people too because something like this needs to fit a wide range of people.”
She now leads a team of fewer than ten designers at Google X, including “graphic designers, space and interior designers, design strategists and industrial designers but also people who work in the fashion industry”.
She says: “The funny thing is almost nobody on the design team has a technology background, which is very unusual for a tech company. But the great thing about that is that it keeps us grounded and keeps us thinking about it from a lifestyle product standpoint.”
With Glass, she was keen to ensure the product was as adaptable and accessible as possible, to ensure it could reach a wide range of potential users. “From the very beginning we designed Glass to be modular and to evolve over time,” she said.
“We’re finally at the beginning point of letting people wear what they want to wear,” Olsson said. “The frames are accessories so you detach the really expensive and complex technology from the style part: you can have a couple of different frames and you don’t need to get another Glass device.”
Images are courtesy of Google.
Here’s an edited transcript of the interview:
James Pallister: Can you start by telling me a little bit about how you started designing Google Glass?
Isabelle Olsson: Two and a half years ago I had a very simple, concise brief, and it was to make this [prototype of Google Glass] beautiful and comfortable. When I first joined I had no idea what I was going to work on. I just knew I was joining Google X and working on something new and exciting.
Then I walked into a room full of engineers wearing a prototype of the glasses. These were [very crude] 3D-printed frames with a cell-phone battery strapped to the legs. They weighed about 200 grams.
James Pallister: What were your initial design intentions?
Isabelle Olsson: My initial goal was: “how do we make this incredibly light?”. I set up three design principles; if you have something that is very complex you need to stick to some principles. The first was lightness, the second was simplicity and the third scalability.
The first thing that made me nervous was not how are we going to make this technology work but how are we going to be able to make this work for people; how are we going to make people want to wear the glasses? The first thing that came to mind is that when you walk into a glasses store you see hundreds of styles.
From the very beginning we designed this to be modular and be able to evolve over time. So in this version that you have probably seen already, there is this tiny little screw here and that is actually meant to be screwed off and then you can remove this frame and attach different kinds of frames.
James Pallister: You’re launching new prescription frames and sunglasses which fit the Google Glass you launched in 2013?
Isabelle Olsson: Yes. What is really exciting is that this is our first collection of new frames. The frames are accessories so you detach the really expensive and complex technology from the style part: you can have a couple of different frames and you don’t need to get another glass device. So we’re finally at the beginning point of letting people wear what they want to wear.
James Pallister: How many people were on the team who refined the clunky prototype into what we see today?
Isabelle Olsson: The team started off very, very small: it was like a little science project. As we started to transition it into something that you could actually wear we have grown the team. Our design team is still really small. So in the design team I can count them on my 10 fingers.
James Pallister: What kind of people do you have on your team?
Isabelle Olsson: I really believe in having a mixed team: graphic designers, space and interior designers, design strategists and industrial designers but also people who work in the fashion industry. The funny thing is almost nobody on the design team has a technology background, which is very unusual for a tech company. But the great thing about that is that it keeps us grounded and keeps us thinking about it from a lifestyle product standpoint.
James Pallister: Is that one of the strengths of the team, that you are not too obsessed with the technology?
Isabelle Olsson: There’s often the view that designers and engineers have to fight; that there should always be a constant battle. I don’t believe that. I think that view belongs in the 1990s.
James Pallister: Are the glasses manufactured by Google?
Isabelle Olsson: They are made in Japan. They are made out of beautiful titanium that is extremely lightweight and durable.
James Pallister: With the spectacles and sunglasses, how did you choose which styles to develop?
Isabelle Olsson: There actually aren’t that many styles out there, so we looked at the most popular styles and condensed them into these really iconic, simplified versions of them. Bold for example is great for people that would normally prefer kind of a chunky, square style. Curve, which I’m wearing, is perhaps a little more fashion-forward. And Split is for those who like almost rimless glasses or ones which are lighter on your face. Then Thin is this very classic traditional simple style that doesn’t really stand out.
James Pallister: Had you ever designed glasses before?
Isabelle Olsson: I have designed glasses and jewellery. So it wasn’t completely new but we did spend a long time refining these. We wanted the shape to be absolutely perfect. A 0.2mm height difference makes a complete difference to the way it looks on your face. Prototyping was absolutely crucial. We also cut paper and used laser cutting and used 3D printing.
James Pallister: Could you explain the design process?
Isabelle Olsson: We would first start with sketching by hand. And then Illustrator or a 2D programme, then we would laser-cut these shapes in paper and do many iterations. Then we would go into a harder material, like a plastic.
Once we have the icons, then we got it into 3D. And then 3D print that. Then we got into laser-cutting metals. So it is a long, intricate, back-and-forth process.
James Pallister: So it was quite a manual process? It wasn’t so much using models and computers?
Isabelle Olsson: Yes. What looks good on the computer doesn’t necessarily translate, especially with something that goes on your face. So as soon as you have an idea, you need to prototype it to see what is broken about it. You can then see what looks weird. It can be completely off – too big or too nerdy and you look crazy! It can be a case of a couple of millimetres.
The next stage is about trying it on a couple of people too, because something like this needs to fit a wide range of people. What I think is most exciting is that everyone on our team uses Glass. We gave them prototypes early on. It was interesting to get feedback from them and it was also valuable for me to see people walking around with them every day.
James Pallister: What do people pay to get the device?
Isabelle Olsson: So the Explorer edition [the version of Glass released last year] is now $1,500, and the new prescription glasses accessory is going to be $225.
James Pallister: Did you have to build different software to cope with the curvature of the lens?
Isabelle Olsson: No, it just works for the regular device. What’s great about it is that our existing Explorers can buy the accessory, which is just the frame part, and then attach it to their device.
James Pallister: How long do you think it will be before wearing Google Glass becomes a normal, everyday thing? Five years? Ten years?
Isabelle Olsson: Much sooner than 10 years I would say. The technology keeps on evolving. That’s the critical part about the Explorer programme [the early adopters who have been given access to Glass], to get people out in the world using Glass in their daily lives. Once more people have it, people are going to get used to it faster.
Even with the original edition or the base frame, after half an hour people say that they forget they are wearing it. When you put it on, it is so lightweight; you can personally forget that you are wearing it. Then it is about other people around you getting used to it. It takes maybe three times that amount for that to happen.
James Pallister: Have you heard of any unexpected uses of Glass?
Isabelle Olsson: I mean personally I was hoping for these cases so when anything comes up I am more excited than surprised. The artistic use of it appeals to me as a designer, when people use it to make cool stop-motion videos or in other arts projects. But also there is this firefighter who developed this special app so he can see the floorplan of a building, so it could help save lives. The more people I see using it, the more exciting it gets and the more diverse it becomes.
James Pallister: Some people are predicting that wearable technology is just a stepping stone towards cyborg technology, where the information is fed directly into the brain. What do you think of that notion?
Isabelle Olsson: I think the team and myself are more interested in what we can do today and in the next couple of years, because that is going to have an impact and be really amazing. You can speculate about the future but somehow it never ends up being what you thought it would be anyway. When you see old futuristic movies, it is kind of laughable.
James Pallister: It seems that we are getting closer and closer to a situation where we can record every situation. Does that ever worry you from a privacy viewpoint?
Isabelle Olsson: I think with any new technology you need to develop an etiquette to using it. When phones started having cameras on them people freaked out about it.
Part of the Explorer programme is that we want to hear how Glass is working, when it is useful and in what instances people use it. We are also interested in the social side: how people react when you are wearing it, and what people’s concerns, fears, issues and hopes for it are.
We hope that Glass will help people to interact with the world around them, really quickly process information and move on to the conversation they were having.
James Pallister: What do you think is the next stage for Glass?
Isabelle Olsson: Right now we are definitely focused on slowly growing the Explorer programme, making sure that people get these frames in their hands – or on their faces should we say. We are really excited about that and obviously we are working on prioritising feedback and also creating next generation products that I can’t talk about!
James Pallister: Are there any types of technology that you think Glass will feed into in the future?
Isabelle Olsson: I think a lot of things. It is hard for us to speculate without revealing things, but the focus is to make technology a more natural part of you, and I think Glass is going to feed into any type of service that does that.
News: scientists at the Google[x] research facility in California are working on contact lenses containing tiny electronics that could constantly monitor glucose levels in the tears of people with diabetes.
“We’re now testing a smart contact lens that’s built to measure glucose levels in tears using a tiny wireless chip and miniaturised glucose sensor that are embedded between two layers of soft contact lens material,” said Google in a post published on its official blog.
The contact lenses would be able to generate a reading every second, making it possible to instantly identify potentially dangerous changes in the patient’s blood sugar levels.
“We’re also investigating the potential for this to serve as an early warning for the wearer, so we’re exploring integrating tiny LED lights that could light up to indicate that glucose levels have crossed above or below certain thresholds,” the company explained.
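The warning behaviour Google describes amounts to simple threshold logic: light an LED when a reading crosses above or below set bounds. A minimal sketch follows; the bounds are common clinical reference values assumed for illustration, since the device’s actual thresholds are not public.

```python
# Hypothetical threshold logic for the smart contact lens's warning LEDs.
# Threshold values are illustrative assumptions, not device specifications.

LOW_MG_DL = 70    # below this: low blood sugar warning
HIGH_MG_DL = 180  # above this: high blood sugar warning

def led_state(glucose_mg_dl):
    """Map one glucose reading (mg/dL) to an LED indication."""
    if glucose_mg_dl < LOW_MG_DL:
        return "low"    # level has dropped below the lower threshold
    if glucose_mg_dl > HIGH_MG_DL:
        return "high"   # level has risen above the upper threshold
    return "off"        # in range: LED stays dark
```

With a reading every second, a real device would likely also smooth the signal and require several consecutive out-of-range readings before warning, to avoid flicker on sensor noise.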
As well as minuscule chips and sensors, the lenses could also incorporate an antenna thinner than a human hair that would communicate with apps so patients or doctors could view the measurements on a smartphone, tablet or computer.
Diabetes patients are currently required to test their blood sugar levels at regular intervals throughout the day by pricking their finger to draw a tiny amount of blood that can be analysed. The process is painful and time-consuming and can discourage people with diabetes from checking their blood glucose as frequently as they should.
“The one thing I’m excited about is that this is a device that people wear daily – the contact lens,” project co-founder Brian Otis told the BBC. “For us to be able to take that platform that exists currently, that people wear, and add intelligence and functionality to it, is really exciting.”
Google stressed that the technology is at a fledgling stage in its development but added that it will be seeking out potential partners who could help it refine the hardware and software required to turn the concept into reality.
“It’s still early days for this technology, but we’ve completed multiple clinical research studies which are helping to refine our prototype,” Google claimed. “We hope this could someday lead to a new way for people with diabetes to manage their disease.”
News: surgically implanted chips that feed digital information directly into the brain will supersede wearable technology, according to the co-founder of a leading 3D imaging studio.
WiFi-enabled chips mounted inside the skull will be more effective than today’s devices such as virtual reality headsets and Google Glass, according to Andy Millns, co-founder of London studio Inition.
“A much more successful way of doing this would be to bypass the eye altogether and directly interface with the brain,” Millns said in an interview with Dezeen. “We’re already seeing things like this with cochlear implants [electronic hearing implants] on the hearing side.”
Millns foresees a “cyborg scenario,” whereby the human brain is enhanced with digital implants. “The next step would be to have a WiFi or Bluetooth-type interface to augment the processing capacity of your brain.”
Existing virtual reality technology relies on the user wearing a headset, which displays an alternative digital world. These headsets will increasingly become so realistic that people will no longer be able to tell the difference between real and fictional landscapes, Millns said.
“The inevitable future of these things is the ability to have tighter and tighter integration between the display and the human till you end up with a cyborg scenario where you have something embedded inside your brain that has a direct interface to your visual cortex,” he said.
A cyborg, or cybernetic organism, is a living being with both organic and artificial parts. In an interview with Dezeen last year Neil Harbisson, the first officially recognised human cyborg, predicted that humans will “stop using technology as a tool and … start using technology as part of the body.” Harbisson, who has a chip at the back of his skull that allows him to perceive colours, said: “I think this will be much more common in the next few years.”
While such technology is some way away, Millns believes that augmented reality headsets will soon get so sophisticated that wearers won’t be able to tell if they’re looking at real or digital imagery.
“We’re going to get very close this year to a headset where it’s starting to get very difficult to distinguish if you’re actually wearing a headset or not,” he said. “When we start to get very high resolution headsets, with the type of display technology that we’re seeing on the market now, it’s going to blur that line between the virtual and the real.”
The forthcoming high-definition version of the Oculus Rift headset (pictured above), which was premiered at the Consumer Electronics Show in Las Vegas last week, will represent a giant leap forward in virtual reality technology, Millns said.
The Oculus Rift headset features a stereoscopic screen that creates the illusion of depth, perspective and scale. Sensors mounted on the outside of the headset track the user’s movement and move the digital imagery accordingly, allowing the user to explore virtual worlds.
Millns believes the technology will soon allow convincing “telepresence” whereby people feel they are at an event or in a location remote from where they actually are. “Virtual reality is so versatile,” said Millns. “You can create a universe from scratch, it can be useful to immerse someone in whatever world you want.”
Coupled with advances in 360-degree video cameras – which record in all directions simultaneously – the headsets could allow people remotely to attend events happening elsewhere, such as fashion shows.
“We can actually put thousands of people in a seat by the side of a catwalk and they can actually experience what it’s like to be there,” Millns said. “You can put someone in any position in the show and allow them to look around as if they were there.”
Last year Inition developed an “augmented 3D printing” service for architects that allows them to visualise the inside of models of buildings, show the services and structure and show how the building will appear at different times of the day and night.
A video of the interview with Millns will be published on Dezeen soon.
Photography is by Inition, unless otherwise stated.