News: Adidas has become the latest brand to launch a smartwatch with the release of a device for runners that monitors performance and gives coaching tips (as well as telling the time).
The miCoach Smart Run is Adidas' colour touch-screen running watch, which integrates both performance tracking and personal training into one device.
“By combining the latest innovations, design and ease-of-use navigation, we consciously broke the template that defines other running watches,” said Simon Drabble, director of miCoach at Adidas.
While training, users can track runs with GPS mapping, monitor heart rate, play music and get real-time coaching.
A built-in accelerometer counts steps to track stride rate, while GPS monitors speed and distance and maps the route travelled.
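Adidas has not published how the Smart Run counts steps, but a common pedometer heuristic is to count upward crossings of an acceleration-magnitude threshold. The function below is a minimal illustrative sketch of that idea (the threshold value and sample format are assumptions, not the watch's actual firmware):

```python
import math

def count_steps(samples, threshold=11.0):
    """Count steps as upward crossings of an acceleration-magnitude
    threshold -- a simple pedometer heuristic; real firmware adds
    filtering and cadence checks.

    samples: list of (x, y, z) accelerations in m/s^2.
    """
    steps, above = 0, False
    for x, y, z in samples:
        mag = math.sqrt(x * x + y * y + z * z)
        if mag > threshold and not above:
            steps += 1        # rising edge: one foot strike
            above = True
        elif mag <= threshold:
            above = False     # re-arm for the next peak
    return steps
```

Each peak above the threshold registers once; the `above` flag prevents a single long spike from counting as several steps.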
An integrated Bluetooth MP3 player can transmit music to wireless earpieces to eliminate headphone cables, which can become tangled while running.
The watch also features a personal training application that uses colours and vibrations to tell the user to speed up, slow down or pause. Coaching can be delivered by a voice via a Bluetooth headset, plus animations demonstrating exercises and workout routines can be played on screen.
“The visual and audible guidance for interval training is a leap forward from any other watch available,” said Terrence Mahon, lead endurance coach for the UK Athletics Federation.
The Smart Run watch will be available on the Adidas website from 1 November.
People are dunked in a pool of wax to create sculptural dresses in an installation by Dutch fashion designer Bart Hess, part of a futuristic city on show at this year’s Lisbon Architecture Triennale, which kicked off yesterday in the Portuguese capital.
Bart Hess created the installation to explore ways in which humans can augment and extend the shapes of their bodies, creating a kind of prosthetic that is unique each time.
To create the garment, individuals are strapped to a robotic harness then lowered into a pool of water and wax. As the wax moves in the water it begins to set, bonding itself around the body.
The person is then lifted out of the water, encased inside a cocoon of wax that can then be cut or broken.
Speaking to Dezeen, Bart Hess explained that temperature affects the end result. “More complex shapes require hotter temperatures, so you need to build up a tolerance to the heat,” he explained. “But it only hurts on the surface for a few seconds.”
The Garment District is one of five zones in the Future Perfect exhibition, which was conceived by curator Liam Young as an exploration into how technology will shape future cities.
“Telling stories about the future is a way of thinking about ideas,” said Young. “It’s about opening up a discourse of what a city could be. Architects need to be operating beyond the now, developing strategies and tactics that will connect people with the future.”
Here’s a short project description from the exhibition organisers:
The Garment District
Our bodies are endlessly photographed, monitored and laser scanned with millimetre precision. From this context of surveillance, facial recognition, avatars and virtual ghosts, we imagine a near future where digital static, distortions and glitches become a new form of ornament.
For the youth tribes of Future Perfect the body is a site for adaption, augmentation and experimentation. They celebrate the corruption of the body data by moulding within their costumery all the imperfections of a decaying scan file. Shimmering in the exhibition landscape is a network of geometric reflective pools of molten wax. Their mirrored surface is broken by a body, suspended from a robotic harness, plunging into the liquid. A crust of wax crystallises around its curves and folds, growing architectural forms, layer by layer, like a 3d printer drawing directly onto the skin. Slowly the body emerges, encased in a dripping wet readymade prosthetic. It is a physical glitch, a manifestation of corrupt data in motion, a digital artefact. They hang from hooks like a collection of strange beasts and frozen avatars. Body prints, imperfect and distorted and always utterly unique.
Automotive brand Nissan has unveiled the first smartwatch concept to communicate with a car, providing drivers with real-time information such as average speed and heart rate (+ movie).
The Nismo Watch will connect with Nissan's Nismo vehicle range, enabling drivers to keep track of their speed and fuel consumption to help improve efficiency.
Racing drivers will be able to use the watch to access performance statistics while on track. Data from the car and wearer will be transmitted to a smartphone app via Bluetooth, where it will be stored.
The watch will also track and rate the user’s social media activity across Facebook, Twitter, Pinterest and Instagram via Nissan’s Social Speed software.
“Wearable technology is fast becoming the next big thing and we want to take advantage of this innovative technology to make our Nismo brand more accessible,” said Gareth Dunsmore, the brand’s Marketing Communications General Manager.
The interface will be controlled by two buttons on the smooth band, which will secure to the wrist with a snap-fit mechanism and come in black, white, or a combination of black and red.
A lithium battery will have a life of over seven days and charge by micro-USB. Packaging for the watch will be made using rubber from racetrack tyres.
Nissan has also set up a mobile laboratory to develop and test additional features for its wearable technology.
It is hoped these new features will be able to monitor biometrics including heart rhythm intervals to identify when drivers are becoming tired, brainwaves to track concentration levels and emotions, plus skin temperature to record core body temperature and hydration levels.
The concept was unveiled at the Frankfurt Motor Show earlier this week, shortly after electronics brand Samsung's announcement of its own smartwatch.
Here's the press release from Nissan:
Nissan enters wearable technology space with the unveiling of the Nismo Watch concept
Nissan launches first smartwatch to connect car and driver performance.
First smartwatch concept to connect driver and car
Nismo Watch showcases Nissan’s intent to deliver biometric data that enhances driver performance and efficiency
Nissan is investigating heart and brain monitoring technologies for future wearable tech
Nissan will become the first car manufacturer to create a smartwatch designed specifically for drivers of Nissan Nismo cars. The Nissan Nismo Concept Watch will be the first smartwatch to connect a driver to the car and will provide drivers with real-time biometric data.
The watch, unveiled today ahead of its display at the Frankfurt Motor Show (10-22 September 2013), is Nissan’s first step into wearable technology and epitomises its “fan first” approach to performance.
The Nissan Nismo Watch will allow drivers to:
– monitor the efficiency of their vehicle with average speed and fuel consumption readings;
– access vehicle telematics and performance data while on track;
– capture biometric data via a heart rate monitor;
– connect to the car using a smartphone app via Bluetooth Low Energy;
– receive tailored car messages from Nissan.
Gareth Dunsmore, Marketing Communications General Manager, Nissan in Europe, commented: “Wearable technology is fast becoming the next big thing and we want to take advantage of this innovative technology to make our Nismo Brand more accessible. On track, Nissan uses the latest biometric training technologies to improve the performance of our Nissan Nismo Athletes and it is this technology we want to bring to our fans to enhance their driving experience and Nismo ownership.”
Earlier this year, Nissan launched the Nismo Lab – a bespoke, mobile laboratory that features the latest, advanced biometric training tools such as brainwave technology and JukeRide – a cutting-edge performance analysis tool, that captures live biometric and telematics data from the race cars and Nissan Nismo Athletes during races. Nismo’s vision is to take these digital age technologies and make them available to athletes from other disciplines and to Nissan owners through future wearable tech.
Three key technologies have already been identified for future development:
– ECG (Electrocardiogram) – to measure the intervals of the R-R rhythm of the heart and identify early fatigue;
– EEG (Electroencephalogram) brainwave monitoring – to monitor drivers’ levels of concentration and emotions, and help athletes get ‘In The Zone’;
– Skin temperature – to record core body temperature and hydration levels.
The Nismo Watch will also track and rate the user’s social performance across Facebook, Twitter, Pinterest and Instagram via Nissan’s proprietary Social Speed software.
The sleek Nismo Watch will be available in three colours – black, white and the flagship black and red. The watch design was inspired by the Nismo ring and is reflected in a seamless, futuristic and ergonomic design. The simple user interface can be controlled by two buttons and is secured onto the driver’s wrist via a simple snap-fit mechanism.
Even the packaging will have a unique Nismo identity. Dunsmore concluded: “We have brought the Nismo experience to life in every aspect of the watch, including its packaging, which will be made using tyres and rubber from the racetrack. As Nismo is the performance arm of Nissan, we wanted a way of integrating Nismo’s heritage in racing into this futuristic innovation.”
The Nismo Watch will use a lithium battery and will be charged by micro-USB, with a battery life of over seven days under normal usage conditions.
Researchers in Canada have designed a family of prosthetic musical instruments, including an external spine and a touch-sensitive rib cage, that create music in response to body gestures (+ interview + slideshow).
The instruments developed are a bending spine extension, a curved rib cage that fits around the waist and a visor headset with touch and motion sensors.
Each instrument can be played in a traditional hand-held way, but can also be attached to the body, freeing a dancer to twist, spin and move to create sound. All three are lit from within using LEDs.
“The goal of the project was to develop instruments that are visually striking, utilise advanced sensing technologies, and are rugged enough for extensive use in performance,” explained researchers Joseph Malloch and Ian Hattwick.
The researchers said that they wanted to create objects that are beautiful, functional and believable as instruments. “We wanted to move away from something that looked made by a person, because then it becomes less believable as a mysterious extension to the body,” Hattwick told Dezeen.
“The interesting thing would be either that it looks organic or that it was made by some sort of imaginary futuristic machine. Or somewhere in between,” he added.
The Rib and Visor are constructed from layers of laser-cut transparent acrylic and polycarbonate. “One of the layers uses a transparent conductive plastic film, patterned with the laser cutter to form touch-sensitive pads,” said Hattwick.
The pads are connected to electronics via a thin wire that runs through the acrylic. Touch and motion sensors pick up body movements and radio transmitters are used to transmit the data to a computer that translates it into sound.
The Spine is made from laser-cut transparent acrylic vertebrae, threaded onto a transparent PVC hose in a truss-like structure. A thin and flexible length of PETG plastic slides through the vertebrae, allowing the entire structure to bend and twist. The rod is fixed at both ends of the instrument using custom-made 3D-printed components.
“We used 3D printing for a variety of purposes,” Hattwick told Dezeen. “One of the primary uses was for solving mechanical problems. All of the instruments use a custom-designed 3D-printed mounting system, allowing the dancers to smoothly slot the instruments into their costumes.”
Speaking about the future of wearable technology, Hattwick told Dezeen: “Technological devices should be made to accommodate the human body, not the other way around.”
“Just as we’ve seen an explosion of DIY musical instruments and interactive art based on open-source electronics, perhaps we will see an explosion of DIY mechanical devices which create new ideas of how we use our body to interact with technology.”
Here’s a 15-minute documentary about the Instrumented Bodies project that features the instruments in action:
The team are now working to develop entirely 3D printed instruments and to radically re-imagine the forms that instruments can take.
Photographs are by Vanessa Yaremchuck, courtesy of IDMIL.
Here’s the full interview with PhD researchers Joseph Malloch and Ian Hattwick:
Kate Andrews: Why did you embark on this project? What was the motivation?
Ian Hattwick: This project began as a collaboration between members of our group in the IDMIL (specifically Joseph Malloch, Ian Hattwick, and Marlon Schumacher, supervised by Marcelo Wanderley), a composer (Sean Ferguson, also at McGill), and a choreographer (Isabelle Van Grimde).
In 2008 we worked with the same collaborators on a short piece for ‘cello and dancer’ which made use of a digital musical instrument we had already developed called the T-Stick. We decided to apply for a grant to support a longer collaboration for which we would develop instruments specifically for dancers but based loosely on the T-Stick.
During the planning stages we decided to explore ideas of instrument as prosthesis, and to design instruments that could be played both as objects and as part of the body. We started by sketching and building rough prototypes out of foam and corrugated plastic, and attaching them to the dancers to see what sort of movement would be possible – and natural – while wearing the prostheses.
After settling on three basic types of object (Spine, Rib, and Visor) we started working on developing the sensing, exploring different materials and refining the design.
Kate Andrews: What materials are the spine, rib and visor made from?
Ian Hattwick: Each of the Ribs and the Visors is constructed from a solvent-welded sandwich of laser-cut transparent acrylic and polycarbonate. One of the layers uses a transparent conductive plastic film, patterned with the laser cutter to form touch-sensitive pads.
The pads are connected to the electronics in the base of the object using very thin wire, run through laser-etched grooves in the acrylic. The electronics in the base include a 3-axis accelerometer, a ZigBee radio transceiver, circuitry for capacitive touch sensing, and drivers for the embedded LEDs. Li-Ion batteries are used for power.
Each of the Spines is constructed from laser-cut transparent acrylic vertebrae threaded onto transparent PVC hose in a truss-like structure. One of the rails in the truss is a thin, very flexible length of PETg plastic that can slide through the holes in the vertebrae, allowing the entire structure to bend and twist. The PETg rod is fixed at both ends of the instrument using custom 3D-printed attachments.
For sensing, the Spines use inertial measurement units (IMUs) located at each end of the instrument – each a circuit-board including a 3-axis accelerometer, a 3-axis rate gyroscope, a 3-axis magnetometer, and a micro-controller running custom firmware to fuse the sensor data into a stable estimate of orientation using a complementary filter.
In this way we know the orientation of each end of the instrument (represented as quaternions), and we can interpolate between them to track or visualise the shape of the entire instrument (a video explaining the sensing can be watched on YouTube). Like the Ribs and Visors, the Spine uses a ZigBee radio transceiver for data communications and LiPoly batteries for power.
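The complementary filter Hattwick mentions blends two imperfect orientation sources: integrated gyroscope rates, which are smooth but drift, and accelerometer-derived angles, which are drift-free but noisy. A minimal one-axis sketch of the principle (the weighting and sample period are illustrative, not the IMUL firmware's actual values):

```python
def complementary_filter(pitch, gyro_rate, accel_pitch, dt, alpha=0.98):
    """Fuse a gyroscope rate (deg/s) with an accelerometer-derived
    angle (deg) into one stable pitch estimate (deg).

    The fast gyro path dominates short-term (weight alpha), while the
    slow accelerometer path pulls the estimate back toward the
    drift-free reference (weight 1 - alpha).
    """
    return alpha * (pitch + gyro_rate * dt) + (1 - alpha) * accel_pitch

# Run the filter over a stream of (gyro_rate, accel_pitch) samples:
pitch = 0.0
for gyro_rate, accel_pitch in [(0.0, 10.0)] * 100:
    pitch = complementary_filter(pitch, gyro_rate, accel_pitch, dt=0.01)
```

With a stationary sensor reading 10 degrees, the estimate converges toward 10 without the step noise a raw accelerometer angle would show. The Spines apply the same idea in three dimensions, fusing all nine sensor axes into a quaternion.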
All of the instruments use a custom-designed 3D-printed mounting system allowing the dancers to smoothly slot the instruments into their costumes.
A computer equipped with another ZigBee radio transceiver communicates with all of the active instruments and collects their sensor data. This data is processed further and then made available on the network for use in controlling media synthesis. We use an open-source, cross-platform software library called libmapper (a long-term project of the IDMIL’s – more info at www.libmapper.org) to make all of the sensor data discoverable by other applications and to support the task of “mapping” the sensor, instrument and gesture data to the parameters of media synthesisers.
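At its core, the “mapping” step rescales a sensed gesture range onto a synthesiser parameter range. This is not libmapper's actual API; it is a minimal sketch of the idea, with hypothetical ranges for a bend sensor driving a filter cutoff:

```python
def make_mapping(src_min, src_max, dst_min, dst_max):
    """Return a function mapping a sensor range onto a synthesiser
    parameter range -- the core operation of a mapping layer."""
    span = src_max - src_min

    def mapping(x):
        t = (x - src_min) / span
        t = max(0.0, min(1.0, t))  # clamp out-of-range readings
        return dst_min + t * (dst_max - dst_min)

    return mapping

# Hypothetical example: Spine bend angle 0-180 degrees
# controls a filter cutoff of 200-2000 Hz.
bend_to_cutoff = make_mapping(0.0, 180.0, 200.0, 2000.0)
```

Keeping the mapping as data rather than hard-coding it is what lets performers swap which gesture drives which sound parameter without touching the instrument firmware.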
The use of digital fabrication technologies allowed us to quickly iterate through variations of the prototypes. To start out, we used laser-cutters at the McGill University School of Architecture and a 3D printer located at the Centre for Interdisciplinary Research in Music Media and Technology (CIRMMT). As we moved to production we outsourced some of the laser-cutting to a commercial company.
Kate Andrews: How did collaboration across disciplines of design, music and technology change and shape the project?
Ian Hattwick: From the very beginning of the project, the three artistic teams worked together to shape the final creations. In the first workshop, we brought non-functional prototypes of the instruments, and the dancers worked with them to find compelling gestures, while we tried a variety of shapes and forms and the composers thought about the kind of music the interaction of dancers and instruments suggested.
Later in the project, as we tried a variety of materials in the construction of the instruments, each new iteration would suggest new movements to the dancers and choreographer. Particularly, as we moved to clear acrylic for the basic material of the ribs, the instruments grew larger in order to have a greater visual impact, which suggested to the dancers the possibility of working with gestures both within and without the curve of the ribs.
These new gestures in turn required the ribs to have a specific size and curvature. Over time, the dancers gained a knowledge of the forms of the instruments which gave them the confidence to perform as if the instruments were actual extensions of their bodies.
Kate Andrews: How was 3D printing used during the project – and why?
Ian Hattwick: We used 3D printing for a variety of purposes in this project. One of the primary uses was for solving mechanical problems – such as designing the mounting system for the instruments.
We tried to find prefabricated solutions for attaching the instruments to the costumes, but were unable to find anything that suited our purposes, so we designed and prototyped a series of clips and mounts to find the shapes that would be easy for the dancers to use, that would be durable, and that would fit our space constraints.
In addition, 3D printing quickly became a tool which we used any time we had a need for a custom-shaped mechanical part. Some examples are a threaded, removable collar for mounting the PET-G rod to the spine, and mounting collars and caps for the lighting in the spine.
[A document detailing the use of 3D printing in the project can be downloaded here].
Kate Andrews: Where do you see this technology being used now?
Ian Hattwick: 3D printing, or additive manufacturing as it is known in industry, is increasingly commonplace. In the research community, we’ve seen applications everywhere from micro-fluidic devices to creating variable acoustic spaces. One of my favourite applications is the creation of new homes for hermit crabs.
Kate Andrews: Can we expect to see other live performances using the instruments?
Ian Hattwick: We are currently working with the instruments ourselves to create new mappings and synthesis techniques, and in October we will be bringing them to Greece to take part in a 10-day experimental artist residency focusing on improvisation. We’ve also been talking with a variety of other collaborators in both dance and music, so we expect to have quite a few different performances in the next year.
Kate Andrews: What do you think is the future for interactive and wearable technology?
Ian Hattwick: I’m really excited about the coming generations of constantly worn health monitors, which is the first widespread adoption of the ideas of the “quantified self” movement. I expect in a relatively short time it will be normal for people to maintain logs of more than just their activity, heart rate, or sleep patterns, but also the effect of their mood and environment on their body. I’m also excited about e-textiles, clothing which can change its shape or visual appearance.
One of the ways in which I see the prosthetic instruments making a real contribution is the idea that technological devices should be made to accommodate the human body, and not the other way around. Particularly, you see musical instruments created so as to be easy to mass-manufacture, rather than seeking to identify and support natural physical expressions during musical performance. At the same time, by creating technologies which are invisible to the performer we take away the physical interaction with an instrument which is so much a part of how we think about performance, both individually and in ensembles.
Kate Andrews: Does this present a new future for music? For dance?
Joseph Malloch: There is no one future for music or dance, but we can always count on new technologies being adapted for art, no matter their intended purpose.
Ian Hattwick: In interactive dance, the paradigm has always been capturing the unencumbered motion of the dancer; in music, there tends to be a fetishisation of the instrument. So in a sense, the idea of prosthetic instruments challenges the existing norms of those art forms. Certainly, using the prosthetic instruments requires a different conceptualisation of how we can perform dance and music at the same time.
The challenges of working with prosthetic instruments can be strongly appealing, however, and the level of mechanical sophistication which is provided by new generations of digital manufacturing will create opportunities for artistic exploration.
Just as we’ve seen an explosion of DIY musical instruments and interactive art based on open-source electronics, perhaps we will see an explosion of DIY mechanical devices which create new ideas of how we use our body to interact with technology.
Kate Andrews: What are you working on now?
Ian Hattwick: Documentation: We work in academia, and publication of in-depth documentation of our motivations, design choices, and insights gained throughout the process of development is an important part of the work. We are part of a much larger community of researchers exploring artistic uses for new technologies, and it is important that we share our experiences and results.
Mapping: The programmable connections between the gestures sensed by the instruments and the resulting sound/media really define the experiences of the performers and the audience. We are busy finding new voices and modes of performance for the prostheses.
Improvements to hardware and software: In particular, sensing technology advances very quickly, with price, quality, and miniaturisation constantly improving. There are already some new tools available now that we couldn’t use three months ago.
3D printing musical instruments: We are talking with a 3D printer manufacturer about developing acoustic instruments which are entirely 3D printed, and which take advantage of the ability to manipulate an object’s internal structure as well as radically re-imagining the forms which musical instruments can take.
News: the UK government wants to ban drivers from using Google’s augmented reality eyewear ahead of its 2014 release, amid safety concerns.
According to a report published by Stuff magazine, the UK’s Department for Transport (DfT) is concerned that wearing Google’s Glass headset whilst driving would be a dangerous distraction.
The government department responsible for the British transport system told the gadget magazine that it has taken pre-emptive steps to ban drivers from using the device.
“We are aware of the impending rollout of Google Glass and are in discussion with the Police to ensure that individuals do not use this technology while driving,” a DfT spokesperson told the magazine.
Should the law be approved, drivers caught using the glasses – which allow users to send and receive messages, take pictures and search the web hands-free – could incur a £60 fixed penalty notice and three points on their driving licence, the same as for using a mobile phone.
“It is important that drivers give their full attention to the road when they are behind the wheel,” the Department for Transport spokesperson said. “A range of offences and penalties already exist to tackle those drivers who do not pay proper attention to the road including careless driving, which will become a fixed penalty offence later this year.”
The UK government banned drivers from using hand-held mobile phones in 2003, and has convicted millions of people since the ban was introduced.
A group of students from the Royal College of Art in London has developed headsets that allow the wearer to adjust their sight and hearing in the same way they’d control the settings on a TV or radio (+ movie).
The Eidos equipment was developed to enhance sensory perception by tuning in to specific sounds or images amongst a barrage of sonic and visual information, then applying effects to enhance the important ones.
“We’ve found that while we experience the world as many overlapping signals, we can use technology to first isolate and then amplify the one we want,” say the designers.
The first device is a mask that fits over the mouth and ears to let the wearer hear speech more selectively. A directional microphone captures the audio, which is processed by software to neutralise background noise.
It’s then transmitted to the listener through headphones and a central mouthpiece, which passes the isolated sound directly to the inner ear via bone vibrations. “This creates the unique sensation of hearing someone talk right inside your head,” they say.
The second device fits over the eyes and applies special effects – like those seen in long-exposure photography – to what the wearer is seeing in real-time. A head-mounted camera captures the imagery and sends it to a computer, where it’s processed by custom software to detect and overlay movement.
It’s then played to the wearer inside the headset, allowing them to see patterns and traces of movement that would normally be undetectable.
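The RCA team's custom software is not public, but one common way to produce long-exposure-style motion trails in real time is to blend each new frame into an exponentially decaying accumulator. The sketch below is an assumption about the general technique, not the Eidos implementation, using NumPy arrays in place of live camera frames:

```python
import numpy as np

def motion_trails(frames, decay=0.9):
    """Blend each frame into a decaying accumulator so moving elements
    leave fading trails, like a long-exposure photograph rendered live.

    frames: iterable of same-shaped uint8 greyscale images.
    """
    acc = np.zeros_like(frames[0], dtype=float)
    out = []
    for frame in frames:
        # Old bright pixels fade by `decay` each step; new ones bleed in.
        acc = decay * acc + (1 - decay) * frame
        # Keep the crisp current frame on top of its own trail.
        out.append(np.maximum(frame, acc).astype(frame.dtype))
    return out
```

A bright point moving across the frame leaves a dimming streak along its previous positions, which is exactly the kind of normally invisible movement pattern the headset makes visible.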
Possible applications could include sports, allowing teams to visualise and improve technique in real time, and the performing arts, where effects normally limited to video could be applied to live performance.
The audio equipment could enable concert-goers to enhance specific elements of a band or orchestra. The designers also suggest that filtering out distracting background noise could improve focus in the classroom for children with ADHD and assist elderly people as their natural hearing ability deteriorates.
Two prototypes styled with faceted surfaces and graduated perforations were presented at the Work in Progress exhibition at the Royal College of Art earlier this year. “Our final objects convey the mixing of digital technology with the organic human body,” explain the team.
News: the highly anticipated Apple television is set to launch later this year and will be operated with a digital “iRing” worn on the finger, according to an industry insider.
After meeting with Apple suppliers in China and Taiwan, analyst Brian White told AppleInsider magazine that he expects a product promising to “revolutionise the TV experience forever” to be officially launched later this year.
The ring-shaped accessory is thought to operate as a “navigation pointer”, taking on some of the functionality of a standard remote control and featuring integrated motion detection. White also thinks it will be accompanied by a small iPad-sized screen, which will combine security, telephone and video-conferencing functions.
Opinion: in this week’s column, Dezeen editor-in-chief Marcus Fairs discusses how wearable technology will “transform our understanding of ourselves”.
I’m being watched. My steps are being counted; my location is being tracked. My sleep is being monitored and my calories logged.
The person who’s watching me is… me. I’ve put myself under auto-surveillance and I’m having a data-driven out-of-body experience. I don’t keep a diary; instead, I have a graph.
I’ve been wearing a Nike+ FuelBand on my right wrist since last summer. This device measures my footsteps, estimates my calorific burn-rate and rewards me with “Nike Fuel” – an arbitrary and essentially useless currency that I can’t spend or trade.
Yet Fuel is addictively motivational. I go out of my way to achieve my daily goal of 3,000 Fuel points. I walk, run, cycle and exercise a lot more than I used to (and swim less, since the band isn’t waterproof) and actively seek manual chores that will earn me Fuel. I take pathetic pleasure in the lightshow on the band that marks the reaching of my day’s target and enjoy checking how my own “little data” fares against the accumulated “big data” of all the other FuelBand wearers on the Nike+ website.
My FuelBand was recently joined by a Jawbone UP wristband, which captures even more data about my lifestyle, including my sleep patterns and the food types I’ve consumed (although I have to enter that information manually). The accompanying smartphone app displays my life as a series of infographics and bar graphs of a sophistication that, until recently, was only available to elite athletes.
Jawbone says I’m not alone in performing better under surveillance: the firm cites research conducted at Stanford University that found people are 26% more active when they’re being monitored. Big Brother is good for you.
Having all this information at my fingertips changes the way I perceive myself. I’m forced to correlate my internal emotional narrative with the irrefutable datastream, and the former is often exposed as an unreliable fantasist. Days where I think I’ve been impressively active turn out to be days when I’ve been abnormally lazy; nights when I feel I’ve hardly slept turn out to have been more than adequate.
In his fascinating book Thinking, Fast and Slow, psychologist Daniel Kahneman explains that human beings are hopeless intuitive statisticians; we are unable to accurately interpret experience as data. Instead, we rely on assumptions, prejudices and intuition, all of which have a high chance of being wrong.
So, for example, if you wake up feeling exceptionally tired, you will assume you didn’t get enough sleep, whereas it may instead be that you woke up during a period of deep sleep, which leaves you feeling groggy. The UP band offers a function to overcome this, with an alarm feature that wakes you only during light sleep. Even if this means waking you earlier, you’ll feel more rested for it.
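Jawbone's algorithm is proprietary, but the smart-alarm idea described above reduces to a simple search: within a window before the target alarm time, wake the sleeper at the first minute classified as light sleep, falling back to the alarm itself if no light sleep occurs. A minimal sketch over an illustrative per-minute stage list (the 'light'/'deep' labels and window length are assumptions):

```python
def smart_wake_time(stages, alarm_idx, window):
    """Return the minute index at which to sound the alarm.

    stages:    per-minute sleep-stage labels, e.g. 'light' or 'deep'
    alarm_idx: index of the user's chosen alarm time
    window:    how many minutes early the alarm is allowed to fire
    """
    start = max(0, alarm_idx - window)
    for i in range(start, alarm_idx + 1):
        if stages[i] == 'light':
            return i          # earliest light-sleep moment in the window
    return alarm_idx          # no light sleep found: wake at alarm time
```

Waking a few minutes early during light sleep trades a little sleep duration for avoiding the grogginess of being pulled out of deep sleep.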
Thus devices like FuelBand and UP, plus other wearable activity-tracking gadgets like Fitbit, serve as a relentless reality check for your unreliable brain. The next generation of technology that sits directly on the body – like digital tattoos – or inside it – such as implants or pills – will burrow deeper into us to extract further “quantified self” datasets, which will provide more evidence of the irrationality of human experience.
Take a visit to the doctor: an everyday interaction that involves multiple potential failure points. You may misinterpret the symptoms you are experiencing; you may miscommunicate these to the doctor; the doctor may misunderstand you; the doctor may misdiagnose your illness. The chances that the consultation is a waste of time – or worse – are high.
Wearable technology that detects illness could remove this potential for error. I recently had a conversation with a senior healthcare designer who told me that medical services could soon be made far more efficient by fitting people with monitors that would alert hospitals at the first sign of congenital illness.
“Then the hospital would contact you and ask you to come for an appointment?” I asked naively. “No,” he replied; as a human you couldn’t be trusted to respond in the correct way. “You would most likely ignore the message or put off the appointment. Instead the hospital would contact your partner or your mother.”
For designers working in the area of wearable computing, the quest is to make both device and user interface “disappear”. “I think the general idea is that the phone as an object kind of disappears,” said Google’s John Hanke in an interview with Dezeen last year, in which he talked about Google’s Glass project, which features a computer embedded in a pair of spectacles.
Speaking at the Design Indaba conference in Cape Town earlier this month, Alex Chen of Google Creative Lab echoed Hanke, saying: “From my personal need I hope technology disappears more and more from my life so you forget you’re using it all the time, instead of feeling that you’re burdened and conscious of it.”
Travis Bogard, vice president of product management and strategy at Jawbone, told me the objective was to make the UP band “as small as possible, something that gets out the way and disappears.”
In my case, the UP band disappeared so successfully that I forgot I was wearing it, neglected to charge it and have consequently accumulated zero data over the past week.
As for my FuelBand, I’ve figured out how to cheat it. It uses an accelerometer to track my movement but has no idea of the effort involved. Waving my arms around while sitting on the sofa earns almost as many fuel points as jogging; drying my hands vigorously and cleaning my teeth with exaggerated movements are as effective as a workout. Simply jiggling the band in my hand earns Fuel, as does giving it to the kids to run around with.
Wearable technology promises to transform our understanding of ourselves and consequently our sense of who we really are. It has the possibility to help us compensate for our inherent flaws and make us better, healthier people. The challenge for the designers of these devices is to figure out how to account for human stupidity and deviousness.
Materials scientist John Rogers and his firm MC10 have developed flexible electronic circuits that stick directly to the skin like temporary tattoos and monitor the wearer’s health.
The Biostamp is a thin electronic mesh that stretches with the skin and monitors temperature, hydration and strain.
Rogers suggests that his “epidermal electronics” could be developed for use in healthcare to monitor patients without tethering them to large machines. Not only would this be more convenient, but the results could be more accurate if patients were examined in their normal environment doing usual activities rather than on the hospital ward.
Other applications could include a patch that lets an athlete know when and how much to hydrate for peak performance, or one that tells you when to apply more suncream.
MC10 overcame the rigidity of normal electronic components made from brittle silicon-based wafers by printing them in very small pieces, arranged in wavy patterns.
Earlier versions were applied on an elastomer backing patch, but the latest prototype is applied directly to the skin using a rubber stamp. It can be covered with spray-on bandage available from pharmacies to make it more durable and waterproof enough to withstand sweating or washing with soapy water. It lasts up to two weeks before the skin’s natural exfoliation causes it to come away.
The team are now working on the integration of wireless power sources and communication systems to relay the information gathered to a smartphone.