Despite a conservative fashion industry, rapid changes in technology will transform the clothes we wear, says Benjamin Males, of London-based fashion and technology company Studio XO.
“We believe fashion is quite antiquated,” he says. “While everything around us becomes intelligent, becomes more computational, our clothes are still very old-fashioned”.
This will not be the case for long, says Males, who believes that advances in micro-robotics and transformable textiles will soon make their way into everyday clothing, helping create clothes that can change shape using small motors.
“We believe in the next decade we’re going to see some pretty amazing things happen around transformable textiles and mechanical movement in our clothes: we are looking at introducing that in the next five years,” he says.
He points to the ubiquitous use of smartphones as evidence that people are becoming increasingly comfortable with having sophisticated technology on or very close to their bodies.
Moving up and down a clothes size may soon be possible without having to buy new clothes, predicts Males.
“We [will soon be able to] change the fit of our clothes at the push of a button, or our clothes could form new architectures around us,” he says.
Males is one of the founding partners of Studio XO, whose work includes dresses for Lady Gaga: Volantis, a flying dress powered by 12 electric motor-driven rotors, and the bubble-blowing dress Anemone, which is documented in this movie.
Males describes Studio XO’s Anemone as a provocation and a commentary on the future of textiles.
Anemone is a dress that blows large and small bubbles, the small ones creating a foam structure around the wearer and the large bubbles flying away.
Males calls the mechanisms that create this effect bubble factories. These are small, 3D-printed jaw mechanisms. When they open, a fan blows out large or small bubbles depending on the size of the mechanism’s aperture.
The dress was unveiled in 2013, when Lady Gaga wore it to the iTunes festival. It is the second so-called bubble dress that Lady Gaga has worn, the first being a nude leotard with transparent plastic globes attached to it.
Rapidly developing flight technology will make personal flying vehicles commercially viable in the near future according to Benjamin Males, co-founder of London-based fashion and technology company Studio XO, who developed the Volantis for Lady Gaga.
“Volantis might seem very science fiction,” says Males, “but if you consider the developments in vehicle design, if you look at the trends toward space travel and jet pack design, actually the idea of having a personal aerial vehicle that has to have style doesn’t seem that crazy”.
“Who knows, in ten years’ time we may all be flying round in Volantises,” he adds.
Volantis is remote controlled and flies using 12 battery-powered propellers. Flown by a trained pilot who specialises in unmanned vehicles, it was unveiled with Lady Gaga at a warehouse in Brooklyn, New York City, in November 2013.
Speaking to Dezeen at Studio XO’s London headquarters, Males explains how the aircraft is powered by 12 rotor blades and borrows technology commonly used in the manufacture of drones.
“It’s known as a hex 12. It has six arms and 12 rotors. Each arm has two rotors which provide the thrust to lift [it] off the ground,” he says.
The truss section at the centre of the aircraft, to which Gaga was fastened by a belt, is made of titanium. The rotors and her custom-made bodice are made of carbon fibre.
The passenger stands inside a white bodice that is connected to the truss. “Although the machine had to be strong, we also wanted it to have the affordances of fashion. So we made a very beautiful front casing which completed the dress,” says Males.
White cylinders surround the rotors in hexagonal formation and connect in the centre above the dress, which rests on the ground using a circular stand when not in flight.
Studio XO has also worked with other high-profile artists including the Black Eyed Peas and Azealia Banks, to create hybrid stage costumes that combine fashion and technology.
“We bring these subjects together, in this space – in this quite unique environment,” says Males, who is now working on the launch of a new ready-to-wear brand developing some of the ideas from the company’s stage work.
A wearable 3D-printed eyepiece that monitors breathing and pupil size to measure what people find interesting online has been developed by students at the Royal College of Art and Imperial College (+ movie).
Developed by Sanya Rai, Carine Collé and Florian Peuch, students of the RCA and Imperial College’s joint Innovation Design Engineering course, the Amoeba is equipped with sensors designed to monitor three instinctive responses that indicate a person’s interest in what they see.
This sensory data is collated to create an intuitive alternative to bookmarking and other systems for keeping track of digital content.
“We believe that with the advent of wearable technologies, where devices will be constantly mapping every moment of our lives, organising our personal data will be a monumental task,” explained the team.
“Amoeba can help ease this process by bringing only the most interesting stuff to the forefront, making sure we never miss out on the important stuff and saving us a lot of time and effort.”
Designed in CAD and manufactured on a 3D printer, the design is, according to Sanya Rai, “a statement piece to let the world know that the wearer is immersed in research.”
The Amoeba records breathing rates using heat-sensitive receptors near the wearer’s mouth. It has a camera embedded into the lens to measure pupil size and sensors on an arm that measure the electrical conductance of the skin, which varies with moisture levels generated by sweat.
These three elements combined create a snapshot of data about the emotional response of the wearer when they look at content.
The data is then converted into a digital signal which creates a visual map that can be viewed with Google’s Chrome browser.
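As a rough illustration of how three biosignals might be blended into a single measure of interest, here is a minimal sketch. This is not the Amoeba team's actual software: the function names, baselines, spreads and weights are all invented assumptions for the example.

```python
# Hypothetical sketch (not Amoeba's real code): combining the three sensor
# readings described above -- breathing rate, pupil size and skin
# conductance -- into one "interest" score for a piece of content.
# All baselines, spreads and weights here are illustrative assumptions.

def normalise(value, baseline, spread):
    """Express a raw reading as a deviation from the wearer's baseline."""
    return (value - baseline) / spread

def interest_score(breath_rate, pupil_diameter, skin_conductance,
                   baselines, weights=(0.3, 0.4, 0.3)):
    """Weighted blend of the three normalised signals (0 is roughly neutral)."""
    signals = (
        normalise(breath_rate, *baselines["breath"]),
        normalise(pupil_diameter, *baselines["pupil"]),
        normalise(skin_conductance, *baselines["skin"]),
    )
    return sum(w * s for w, s in zip(weights, signals))

# Example: all three signals one "spread" above baseline -> score of 1.0
baselines = {"breath": (14.0, 4.0), "pupil": (4.0, 1.0), "skin": (5.0, 2.0)}
score = interest_score(18.0, 5.0, 7.0, baselines)
```

A real system would also need per-wearer calibration and smoothing over time, but the core idea of normalising each signal against a personal baseline before combining them is the same.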
According to the development team, the Amoeba has several applications including measuring the impact advertising has on potential customers.
“Amoeba reveals the true underlying changes in a user’s bio-data in order to get an honest and unbiased feedback to product developers and the industry.”
Another potential area of use is in measuring student engagement in online education. “The drop out rate from online courses is over 90 percent,” the team said. “Amoeba will help to tailor learning platforms according to the subconscious reactions of the user and thus keep him motivated and engaged on the learning platform.”
The students are currently developing the Amoeba to be able to measure interest in all digital content such as music and film, not just websites.
“Our final vision would be to have Amoeba as an embedded feature in all wearable devices so that it can help streamline all content for the user, bringing to the forefront only the most interesting stuff rather than the entire daily log of data,” said Rai.
Dezeen and MINI Frontiers: in this exclusive video interview, musician Imogen Heap demonstrates the electronic gloves that allow people to interact with their computer remotely via hand gestures.
The interview was filmed at Heap’s home studio outside London, shortly before she launched her Kickstarter campaign to produce a limited production run of the open-source Mi.Mu gloves.
“These beautiful gloves help me gesturally interact with my computer,” says Heap, explaining how the wearable technology allows her to perform without having to interact with keyboards or control panels.
Pushing buttons and twiddling dials “is not very exciting for me or the audience,” she says. “[Now] I can make music on the move, in the flow and more humanly, [and] more naturally engage with my computer software and technology.”
Working with a team of developers and musicians, Heap has mapped movements made with the gloves to musical functions such as drum sounds or bass notes, changes of pitch, arpeggios and filters.
“What this glove enables me to do is access mappings inside my computer so that I don’t have to go to a keyboard or a fader or a button,” she says.
For example, instead of using a finger to push a fader on a mixing desk, Heap can raise her arm to achieve the same effect. By raising her hand, she can move through a scale of notes, or by pinching together her thumb, middle finger and forefinger and rotating them, she can apply filters to the sound.
Each gesture-control glove contains a wifi-enabled x-IMU board developed by x-IO Technologies containing an accelerometer, a magnetometer and a gyroscope.
These work together with a series of motion sensors incorporated into the fingers of each glove that track the degree of bend and the spread of the fingers. The gloves can also understand postures such as an open palm, a finger-point or a closed fist.
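To make the posture recognition concrete, here is a toy sketch of how per-finger bend readings could be classified into the postures mentioned above. This is not the Mi.Mu gloves' actual code: the thresholds and the 0.0 (straight) to 1.0 (fully bent) scale are assumptions made for the example.

```python
# Illustrative sketch only (not the real Mi.Mu software): classifying hand
# postures -- open palm, finger-point, closed fist -- from five bend-sensor
# readings, thumb to little finger, on an assumed 0.0-1.0 scale.

def classify_posture(bends):
    """Return a posture name for five finger-bend readings."""
    bent = [b > 0.6 for b in bends]       # which fingers count as curled
    if not any(bent):
        return "open palm"
    if all(bent):
        return "closed fist"
    # index finger straight while the other fingers curl reads as a point
    if not bent[1] and all(bent[2:]):
        return "finger point"
    return "unknown"

print(classify_posture([0.1, 0.1, 0.1, 0.1, 0.1]))  # open palm
```

In practice the gloves fuse these readings with the accelerometer, magnetometer and gyroscope data, so posture detection would be one small stage in a larger pipeline.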
The latest version of the gloves features e-textile technology, where sensors and wiring are integrated into fabric. Heap is now exploring how to make further use of electronically conducting textiles, to reduce the number of hard components in the gloves.
Heap says they will not just change performance, but the production of music too: “We really feel that they are going to change the way we make music.”
Heap’s Kickstarter campaign aims to raise £200,000 to develop and produce a limited production run of Mi.Mu gloves. If successful, she will make both the hardware and software open source, allowing people to develop their own uses for the technology. “It’s really exciting to see what people might do by hacking them,” said Heap. The Kickstarter campaign closes on 3 May 2014.
The music featured in this movie is Me, the Machine, a track that Heap wrote specifically to be performed using the gloves.
“This is really a new communication platform,” said Zuckerberg. “By feeling truly present, you can share unbounded spaces and experiences with the people in your life.”
First launched on crowd-funding website Kickstarter in 2012, Oculus Rift creates an immersive computer-generated environment in front of the wearer.
The technology is already set to change the way video games are played and Facebook plans to see this realised. “Immersive gaming will be the first, and Oculus already has big plans here that won’t be changing and we hope to accelerate,” said Zuckerberg.
“Oculus Rift has been sort of the poster child for virtual reality,” Millns said. “What you’ve got essentially is a seven-inch mobile phone-type screen and two lenses. It’s that simple.”
News: tech giant Google and eyewear company Luxottica have announced a partnership to develop Google Glass wearable headsets into consumer-friendly products.
“We have come to a point where we now have both a technology push and a consumer pull for wearable technology products and applications,” said Luxottica CEO Andrea Guerra.
Google Glass lets users send and receive messages, take pictures and search the web hands-free – this collaboration will put this technology in the hands of designers at Luxottica, which produces eyewear for brands including Ray-Ban and Oakley.
“We live in a world where technological innovation has dramatically changed the way in which we communicate and interact in everything that we do,” said Guerra.
News: musician Imogen Heap is to put an experimental electronic glove into production, creating a tool that will allow anyone to interact with their computer remotely via hand gestures (+ interview + movie).
Heap has launched a Kickstarter campaign to raise £200,000 to develop and produce a limited production run of open-source Mi.Mu gloves, with a wider production planned for the future.
“Funding this campaign will enable us to make a really important developmental leap to finalise the gloves’ design so they’re ready to go into production,” Heap said in a video accompanying her Kickstarter campaign.
Each gesture-control glove contains a range of sensors that track the position, direction and velocity of the wearer’s hand, the degree of bend in their fingers and the distance between their fingers. It can also understand “postures” such as an open palm, a finger-point or a closed fist.
The resulting data is sent wirelessly to a computer, keyboard and other electronic music equipment, allowing musicians to create music by moving their hands rather than by playing a keyboard or pressing buttons.
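Later in the interview Heap mentions that the gloves send this data as Open Sound Control (OSC) rather than MIDI. As a sketch of what that wire format looks like, here is a minimal, hand-rolled OSC message encoder using only the standard library. Real projects would use a dedicated library, and the address `/glove/wrist/bend` is an invented example, not a real Mi.Mu address.

```python
# Minimal sketch of packing an Open Sound Control message by hand.
# The address "/glove/wrist/bend" is a hypothetical example.
import struct

def osc_string(s):
    """OSC strings are null-terminated and padded to a 4-byte boundary."""
    data = s.encode("ascii") + b"\x00"
    return data + b"\x00" * (-len(data) % 4)

def osc_message(address, *floats):
    """Build an OSC message carrying float32 arguments: address,
    then a type-tag string (',' plus one 'f' per float), then the values."""
    tags = "," + "f" * len(floats)
    payload = b"".join(struct.pack(">f", f) for f in floats)
    return osc_string(address) + osc_string(tags) + payload

msg = osc_message("/glove/wrist/bend", 0.75)
```

Each message could then be sent to the music software over UDP, which is how OSC is typically transported on a wifi link like the gloves use.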
“Fifty percent of a performance is racing around between various instruments and bits of technology on stage,” Heap told Dezeen in an exclusive interview ahead of the Kickstarter launch. “I wanted to create something where I could manipulate my computer on the move wirelessly so that music becomes more like a dance rather than a robotic act like pressing a button or moving a fader.”
The latest version of the gloves was developed as part of Heap’s ongoing The Gloves Project, which began four years ago. In 2012 she performed with an early version of the gloves at the Wired 2012 conference.
While developed for musicians, Heap said the gloves could be “hacked” for other uses.
“I’m not claiming they’re going to be the answer to every interaction with the computer but there’s a lot of applications where it just feels wrong to use a mouse and a keyboard,” she said. “You might want to be able to make something in some architecture software where you could stretch a building or draw little windows and quickly move them around like play-dough and maybe we’ll get to the point where people will start to develop software like that.”
She added: “It’s essentially a remote control and anything that you could potentially do with your hands, you could do with your gloves.”
The key piece of technology in each glove is an x-IMU board developed by x-IO Technologies, which is mounted on the back of the hand and contains an accelerometer, magnetometer, gyroscope and wifi transmitter. The latest version of the gloves features e-textile technology, where movement sensors are integrated into the fabric.
Below is an edited transcript of the interview with Heap:
Marcus Fairs: Tell us about the gesture-controlled gloves you’ve been working on.
Imogen Heap: I’m a musician but more recently I’ve been developing some gloves with an amazing team of people to help me make music on the move gesturally, enabling me to interact more naturally with my music software, to more freely create music on the move, in the flow of things.
Marcus Fairs: So they’re to allow you to make music without having to be tied to keyboards or other physical instruments?
Imogen Heap: Fifty percent of a performance is racing around between various instruments and bits of technology on stage. For instance, pressing a record button doesn’t look or feel very expressive but actually that moment of recording something is a real creative act; it’s a musical act.
But these actions have always been hidden from the audience and they disengage me in my performance, so I wanted to find a way to do that and integrate it into the performance. It’s the unseen that I’m interested in bringing out of hiding.
There are so many types of sounds or effects that don’t have a physical existence. They are software, they are hidden inside the computer. A bass-line might sound sculpted; it might have this blobby, stretchy sound. For me it doesn’t feel natural to play a sound like that on a keyboard because a keyboard is very restrictive and very linear and you only have two hands. I can play a melody but if I wanted to manipulate any kind of parameter of that sound, my other hand is completely used up. It’s quite restrictive.
I wanted to find a way to be really expressive in using these software instruments and effects that feel like how I feel they should be played and how I feel that represents the sound that’s coming out of the speakers.
So in order to free myself up on the stage from my various bits of technology and to bridge the gap between what’s going on on stage and the audience, I wanted to create something where I could manipulate my computer on the move wirelessly so that music becomes more like a dance rather than a robotic act like pressing a button or moving a fader.
Marcus Fairs: How do the gloves work?
Imogen Heap: They have bend sensors in the fingers, they have lights for feedback, they have buzzers integrated in the side so I can sense where I am if I want to get a haptic feedback. They also contain a microprocessor unit that has an accelerometer, a magnetometer and a gyroscope in it.
We’ve been developing them for about four years and they’ve come a long way. We started with fibre-optic bend sensors in the fingers but we quickly realised that we needed positional data, accelerometer data, gyroscope data so that we could really be inside the music. Because actually just having the bend sensors in the beginning was almost like just pressing buttons. It felt very unnatural.
It began with little lapel microphones, which are made by Sennheiser. Seven years ago I began to stick them onto my wrists so that I could make sound with wine glasses or I could play my mbira on stage. I would be able to avoid putting microphones on stage for festivals or touring so it would cut down on the weight and the transport costs, which is also a reason for the gloves.
In the early versions of the gloves it all connects to a hub that I wear on my upper body. It’s quite complicated but it basically communicates with the computer wirelessly so I can use it to manipulate music software, enabling me to unchain myself from the computer, to humanise the missing bits of how I interact with technology in music.
I use these gloves with a Kinect so that I can have an extra dimension on top of local gestural movements; I can use the stage as a playground like different zones for whole different presets. I could map the centre of the stage for a certain key, and if I go over to the right and I combine it with a gesture so that I don’t accidentally go in there, then I can have a whole different key or a whole different set of sounds to play with. I could unmute and mute different instruments that are inside the music software.
There’s really nothing out there on the market like this, that enables me to be this expressive with music on the move in the studio and on the stage. It’s very exciting. When you see me play, not maybe every time because maybe it goes wrong or I go wrong, but when it works, when it’s effortless and when your movement is part of the music, it’s almost like a dance. It’s so natural that the tech disappears.
Marcus Fairs: Tell us about the latest version of the gloves.
Imogen Heap: It’s very exciting because it’s so much simpler and it needs less gear, less setting up. As you can see it’s compact and doesn’t need so many extra wires and the main reason for that is this: it has an x-IMU board by Seb Madgwick of x-IO Technologies containing an accelerometer, a gyroscope, a magnetometer but the main difference is it now has wifi built into the glove. So it doesn’t need an extra unit to send information to the computer.
That is incredible because it’s sending Open Sound Control data instead of MIDI serial data. There are two bend sensors in the wrists and we’ve still got the bend sensors in the fingers and “forchettes” between them telling us how closed or open my hands are and equally how much my hands are bending. We’re finding that the bend sensors so far are the simplest solution but really we want to get to the point where it’s all e-tech style. So that we can separate the hard tech from the soft tech.
Marcus Fairs: What’s e-tech?
Imogen Heap: E-tech is electronic textiles. So information is passing through fabric by using conductive threads or materials. This is where we are and it’s beautiful.
But at the moment it’s really simple, it just sees this exoskeleton as a device and then it comes up on your computer as a wifi device and you’re ready to go. It’s super simple and it’s great.
Marcus Fairs: Could the gloves be used for other creative uses besides music?
Imogen Heap: A lot of people have been in touch. For instance a guy suggested that he could take all the international sign language which you only need one hand for and translate that sonically, so that each posture for a word or gesture for a word could be mapped and generate a word. You could hack a little speaker onto the system so it could actually speak for you as well. So that’s one idea.
And in the video for Me, The Machine, which is a song that I wrote with the gloves and for the gloves, you see me manipulating visuals with them. Just drawing lines onto a screen that’s in front of me so you can see me drawing in real time.
It’s great fun to do. I can draw little arrows and houses and people. It’s not like using a pencil; it’s incredible to be able to create these grand shapes, to be able to shift everything, painting out of nothing and spinning it around and stopping it and moving it over here. So I imagine a few people might start to use them with visuals.
Marcus Fairs: What about non-creative uses? Could these gloves be used by surgeons for example, or pilots or bus drivers?
Imogen Heap: I think there’s a lot of applications; it doesn’t have to be like you’re painting or making music with them. For our Kickstarter campaign, we’ve been starting to think about funny things we could pretend to do with them. So I suppose as long as you can access your computer inside your car, there’s no reason why you couldn’t just sit in the back of your car and indicate right or left. It’s a remote control. It feels like an expressive musical instrument sometimes but it’s essentially a remote control and anything that you could potentially do with your hands, you could do with your gloves.
Marcus Fairs: Do you plan to manufacture and sell them as a product?
Imogen Heap: We would love the gloves to be as affordable as something like a MIDI keyboard in time. Imagine if this was something that people would just go to use as one of those expressive things that they feel can’t be done with certain types of more rigid technology, because what is exciting about them is that they’re totally customisable.
You can even hack them, so you might want a screen or maybe you’ll want a push button thing, but something that gives off a smell when you move your hand. It’s really exciting to see what people might do with hacking them. The software is going to be open source and so is the hardware. We can’t wait to see what people do with them. It’s early stages.
Marcus Fairs: There’s a lot of talk about how wearable technology could remove the need to interact with computers. How do your gloves fit into that trend?
Imogen Heap: I’m not claiming they’re going to be the answer to every interaction with the computer but there’s a lot of applications where it just feels wrong to use a mouse and a keyboard. You might want to make something in some architecture software where you could stretch a building or draw little windows and quickly move them around like play-dough and maybe we’ll get to the point where people will start to develop software like that. That would be amazing.
News: Google has unveiled an operating system designed specifically for wearable devices called Android Wear, plus details of the first smartwatches to incorporate the technology.
In a series of YouTube movies released today, Google previewed the Android Wear operating system that will extend apps currently available on Android devices to present contextual information designed to be viewed at-a-glance on wearable devices.
“With a wearable device you can be going about the rest of your day, just glance down at your wrist and the information you need is there right away without even having to ask for it,” said Android’s director of engineering David Singleton.
Using the existing Google Now service, the new user interface will prioritise information specific to the user’s context to allow a more passive experience, without the need to retrieve the information from multiple applications.
For example, in the morning it could show local weather reports, the time of the wearer’s first meeting and travel time to get there based on current traffic conditions.
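The kind of contextual prioritisation described above can be sketched as a simple scoring-and-sorting step. To be clear, this is a toy illustration, not Google's actual ranking logic: the card kinds, scores and rules are all invented for the example.

```python
# Toy illustration (not Google's real algorithm) of surfacing the most
# contextually relevant "card" first. Card kinds and scoring rules are
# invented assumptions.

def rank_cards(cards, now_hour, minutes_to_next_meeting):
    def score(card):
        s = card.get("base_priority", 0)
        if card["kind"] == "meeting" and minutes_to_next_meeting < 60:
            s += 10          # an imminent meeting trumps everything else
        if card["kind"] == "weather" and now_hour < 9:
            s += 5           # weather matters most in the morning
        return s
    return sorted(cards, key=score, reverse=True)

cards = [
    {"kind": "weather", "base_priority": 1},
    {"kind": "meeting", "base_priority": 2},
    {"kind": "music",   "base_priority": 0},
]
# At 8am with a meeting in 30 minutes, the meeting card ranks first
ordered = rank_cards(cards, now_hour=8, minutes_to_next_meeting=30)
```

The point of the sketch is only that "passive" surfacing boils down to scoring each item against the user's current context rather than waiting for a query.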
“Watches are good at telling time, but imagine having useful actionable information there precisely when you need it, automatically,” said Singleton.
The launch movie also shows a user receiving an alert for jellyfish when about to go surfing, and immediately swiping to a screen showing other beaches in the area to head towards.
“Think about the times you need information most urgently,” said Android designer Alex Faaborg. “The stuff you care about moves with you from place to place so you never miss out on the important stuff.”
Android Wear incorporates voice control and, like Google Glass, will be activated by saying “Okay Google”.
“We put a lot of thought into how simple this has to be,” he continued. “It has to be incredibly fast, incredibly glanceable. There’re really only two components: the information that’s most relevant to you and the ability to be able to speak to it and give it a command.”
The system will also facilitate receiving and responding to text messages and calls, and listening to music. It could also incorporate health or activity-monitoring functions to rival devices like Nike FuelBand and Fitbit.
The first device announced to use the system, the LG G Watch, will be launched by South Korean firm LG later this year. The two devices shown in the Android Wear launch movies with either a round or square case are reported to be variations of the forthcoming Moto 360 from Motorola.
“To bring this vision to life, we’re working with consumer electronics manufacturers, chip makers and fashion brands who are committed to fostering an ecosystem of watches in a variety of styles, shapes and sizes,” said Singleton.
Google today made a preview software development kit available, so that developers can begin to extend Android applications to work with the new system.
News: miracle material graphene has been used to develop infrared sensors, which could be inserted into contact lenses and allow the wearer to see in the dark.
Engineers at the University of Michigan have used graphene – a material formed from a single layer of carbon atoms – to create sensors that can detect the full spectrum of light, including infrared.
The sensors detect light by measuring the behaviour of electrons and changes in current between two layers of the material, separated by an insulator.
Infrared sensors such as those found in night-vision goggles usually require bulky cooling equipment to prevent the devices from overheating, which increases their size.
The graphene sensors, however, do not require cooling, so they can be produced at the size of a fingernail and developed to be smaller still.
Once small enough, the sensors could be embedded into contact lenses or mobile phone camera lenses and used to create imagery in completely dark environments.
In this movie, filmed in Miami, Daniel Widrig says that designers can break down boundaries between disciplines by borrowing technologies and tools traditionally associated with one industry and using them in others, in unexpected ways.
“A lot of technology we use was originally developed for use in other disciplines such as special effects or the movie industries,” says Widrig. “One could say that boundaries are blurring between industries.”
His architectural background feeds into his ongoing research into using 3D-printing for clothing and jewellery, says Widrig.
“We work with the body in quite an architectural way: we investigated certain body parts and then we applied design processes to populate body parts with architectural microstructures,” he says.
For Widrig, it is often the experimental, low-budget projects that yield the most new ideas.
“The most interesting projects for me are the self-initiated projects where you set yourself a goal and an agenda and you work with sometimes really small budgets, but you have the freedom to explore,” he explained.
These then feed into more commercial projects, from experimental furniture to sculpture, computer game design and movie sets.
The music featured in the movie is a track by Simplex. You can listen to his music on Dezeen Music Project.
Dezeen and MINI Frontiers is a year-long collaboration with MINI exploring how design and technology are coming together to shape the future.