3D-printed noses for accident victims “within a year”

3D-printed prostheses by Fripp Design and Research

News: 3D-printed nose and ear replacements for accident victims and people with facial disfigurements could be just a year away, according to a design firm working on a new generation of prosthetics (+ interview).

Patients could get a customised nose or ear printed within 48 hours, rather than the ten weeks it takes to make a hand-made prosthesis, Fripp Design & Research believes.

“It’s time saving and cost saving,” the company’s founder Tom Fripp told Dezeen. “Particularly, the time-saving is great for the patient. Traditionally to have one made you’re waiting for about ten weeks for a hand-made prosthesis. From start to finish we would scan, design and print within 48 hours.”

Fripp said that the technology could be ready this time next year, although getting the health services to embrace it was the biggest challenge. “I think to actually get anywhere from now to [having an] available service you’re talking about a year,” he said. “It requires some sort of acceptance into the health services. That’s the biggest barrier to it.”

The project is being exhibited as part of the 3D Printshow Hospital at the 3D Printshow in London. The exhibition, which explores how 3D printing is transforming healthcare, also features a bio-printer capable of printing human cells, which could eliminate the need for animal testing of new drugs.

Fripp is also working on 3D-printed eyes, which could be produced for less than £100, compared to the current price of up to £4,000 for existing ocular prosthetics.

UK-based Fripp uses colour 3D printing to create soft-tissue prostheses for patients who are missing sections of their face. Each custom prosthesis, printed in bio-compatible starch and silicone, matches the wearer’s skin colour and takes less than two days to produce.

“We reproduce the colour, which is an exact match for the skin tone,” Fripp told Dezeen. “Following that, we have to colour code it for the printer because if you send any colour to any standard printer, you get a totally different colour.”

The current process is lengthy and costly: it involves taking an impression of the area to create a mould for the prosthesis, which then has to be hand-painted and modified during fitting.

To speed this up, Fripp Design & Research are collaborating with researchers at the University of Sheffield to map the shape of the patient’s trauma area and capture skin colour data in an instant using a setup of multiple digital cameras.

The prosthetics are then designed using previous scans of the patient, if available, by mapping features from the patient’s relatives, or simply by using stock files of parts such as noses or ears.

“[We use] a graphic clay that we can carve away and morph to the trauma area,” said Fripp, “so we make sure we have a dead accurate fit.”

The shape is then printed with the precise colour profile using a Z Corp Z510 colour 3D printer. This will cost around the same as a handmade prosthetic, but once created the file can be used to generate multiple copies for replacements at a significantly lower cost.

3D-printed prostheses by Fripp Design and Research

Fripp admits his products are less realistic than the current models: “They’re not as high quality as a hand-made one which really are beautiful, but a patient can have this as an interim until their handmade one is actually produced.”

He says they have tested and fitted a prosthetic for a patient but that the project is awaiting medical accreditation. He believes that the people who are going to benefit the most from this process will be “individuals currently in the developing world who go without because they don’t have the money to pay for a skilled technician to build one.”

Fripp’s company is also working with Manchester Metropolitan University to produce stock batches of prosthetic eyes that patients could buy for just £30, which they also hope to be selling in a year’s time.

He also claims that his company has developed the first machine to 3D-print entirely in silicone, which will help remove the white lines that form around the edges of the prostheses where the silicone reacts with the starch.

For our one-off 3D-printing magazine Print Shift, we reported that the technology is making strides towards medical applications such as printing organs. Scientists have also printed a bionic ear that can hear radio frequencies beyond a human’s normal range.

Here’s the full interview we conducted with Tom Fripp:


Dan Howarth: How do you go about printing a nose or an ear?

Tom Fripp: It starts off with a data capture. Half of it is that, because we deal with patients who are sometimes very nervous, sometimes very agitated, we have to use a structured light system; it’s an instant capture. People who are nervous tend to move around and fidget, and lasers take too long because the patients don’t stay still. So we use a colour photogrammetry system. It’s an array of cameras mounted in pods that are calibrated to know where each pod is sat. They all take a picture at the same time, then they can work out the physical geometry and at the same time capture the colour.
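For readers curious how a calibrated camera array recovers geometry, here is a minimal sketch of the standard linear triangulation step that photogrammetry systems of this kind rely on. The camera matrices and pixel coordinates below are invented for illustration; they are not Fripp’s actual setup.

```python
# Conceptual sketch of multi-camera triangulation, the principle behind
# calibrated photogrammetry rigs. All numbers here are hypothetical.
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Recover a 3D point from its pixel coordinates in two calibrated cameras.

    P1, P2   : 3x4 camera projection matrices (known from calibration).
    uv1, uv2 : (u, v) pixel coordinates of the same feature in each image.
    """
    u1, v1 = uv1
    u2, v2 = uv2
    # Standard linear (DLT) triangulation: each view contributes two rows.
    A = np.vstack([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    # The homogeneous 3D point is the right singular vector with the smallest
    # singular value; dividing by the last component gives metric coordinates.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Hypothetical calibrated cameras: one at the origin, one shifted along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
point = triangulate(P1, P2, uv1=(0.25, 0.10), uv2=(0.05, 0.10))
print(point)  # estimated 3D position of the matched feature
```

With many cameras, each extra view simply adds two more rows to the same least-squares system, which is why the pods can fire simultaneously and still agree on the geometry.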

That gives us a mesh of the area of the trauma. What that doesn’t include, obviously, is what you’ve got to produce to replace the trauma area, which might be due to surgery or disease. The next thing to do is to create that geometry: we can use stock prosthetics that we have as CAD files, or we can image a friend or family member and adjust it all to fit in 3D CAD. Or we could use CT or MRI data if that’s available.

There are quite a lot of ways that we can reproduce, let’s say, a nose to make sure that it fits. We use a voxel modelling system, so it’s pixels rather than surfaces or solid modelling; it uses a graphic clay that we can carve away and morph to the trauma area, so we make sure we have a dead accurate fit. Then we have to make sure we get the colour right, and we do this by taking a special photometer reading from the patient, all captured at the same time. Then we reproduce the colour, which is an exact match for the skin tone. Following that, we have to colour code it for the printer, because if you send any colour to any standard printer, you get a totally different colour. The final stage is that we produce it: we 3D print the full-colour part in starch, because it’s a stable, lightweight and porous material. The processing involves forcing medical-grade silicone into the starch, which brings out its final qualities, and then the prosthesis is ready to go to the fitter to be adjusted and fitted to the patient.
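As an aside on the “colour coding” step Fripp mentions, the sketch below illustrates the general idea of device colour correction: you only learn what a printer actually does with a given colour by measuring printed test patches, and you then send the input values that came closest. The calibration table, skin-tone reading and nearest-neighbour lookup are all invented for the example; a real workflow would use an ICC profile or a much denser, interpolated patch set.

```python
# Hedged sketch of mapping a measured skin tone to the values that make a
# particular colour printer reproduce that tone. All numbers are hypothetical.
import numpy as np

# Each entry pairs the colour we *want* (measured from a printed test patch)
# with the RGB values we had to *send* to the printer to get it.
CALIBRATION = [
    # (target sRGB, printer RGB) - invented measurements
    ((224, 172, 105), (238, 160,  92)),
    ((198, 134,  66), (214, 120,  55)),
    ((141,  85,  36), (160,  70,  28)),
]

def to_printer_rgb(target_rgb):
    """Return the printer-space RGB closest to reproducing target_rgb.

    Nearest-neighbour lookup in the calibration table; a production system
    would interpolate between many more patches instead.
    """
    target = np.array(target_rgb, dtype=float)
    best = min(CALIBRATION,
               key=lambda pair: np.linalg.norm(np.array(pair[0]) - target))
    return best[1]

skin_tone = (205, 140, 75)          # hypothetical photometer reading
print(to_printer_rgb(skin_tone))    # values sent to the colour printer
```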

Dan Howarth: What are the benefits compared to the current methods of creating prostheses?

Tom Fripp: It’s time saving and cost saving. Particularly, the time saving is great for the patient. Traditionally, to have one made you’re waiting for about ten weeks for a hand-made prosthesis. From start to finish we would scan, design and print within 48 hours. They’re not as high quality as a hand-made one, which really are beautiful, but a patient can have this as an interim until their hand-made one is actually produced.

The other benefit is that it is much more cost effective. The first one we make would cost about the same as a hand-made prosthesis, which is between £1,500 and £3,000 depending on where you are in the country, because of the design side. But for a repeat hand-made one you’re talking up to a thousand pounds, whereas for our one it comes down to about £130, because we’ve just got a CAD file; we just press print again.

Dan Howarth: Has this been tested and used on patients yet?

Tom Fripp: No, we have fitted it to a patient to see what their response is to it, but it’s not actually been provided out there as a prosthesis yet. The main reason is that it’s difficult for products to get into the medical profession. We are an industrial design company, and we’re finding an awful lot of resistance to it because, traditionally, things come from surgeons and clinicians having an idea and developing it, rather than an external design company doing the same.

Dan Howarth: How long do you think it will be until it’s taken up?

Tom Fripp: I think to actually get anywhere from now to an available service, you’re talking about a year. It requires some sort of acceptance into the health services. That’s the biggest barrier to it.

Dan Howarth: What sort of printers do you use to print out the files?

Tom Fripp: We use Z Corp Z510s deliberately, because it’s much more of an open system and we can play about with the materials; the more recent ones are more cartridge-based.

Dan Howarth: How does the prosthesis then attach to the face?

Tom Fripp: There’s a variety of ways. A lot of patients will already have an implant placed on the good tissue. So any bone underneath the trauma area that can be used will have a steel implant drilled into it, and we can capture its orientation and location in our scanning process. Then we would produce the prosthesis with magnets actually inside it, which would just clip onto the implants. But the prosthesis is also made with a fine fitted edge, which means that you can place a medical-grade adhesive around this edge that reactivates when you clean it. So you can actually take the prosthesis off overnight and allow air to get to any scar tissue, clean it, and then clip it back onto the implant with the medical adhesive; with a little bit of make-up round the edge, it hides it.

Dan Howarth: Who is going to benefit the most from this?

Tom Fripp: The people who are going to benefit the most from this are the individuals currently in the developing world who go without because they don’t have the money to pay for a skilled technician to build one. There are areas where technicians aren’t actually available and they would have to wait for up to a year or so to visit a more developed country where you get academics going over and starting up small clinics. It happens very regularly but you still have to wait a long time, and in most cases some still can’t afford it.

Dan Howarth: What’s next after it gains medical accreditation? Could you then develop it to create other body parts?

Tom Fripp: Yes. We are currently constrained in the physical parts that we can produce, so for example limbs are a bit troublesome because of their physical size. The starch material is very delicate when it comes out of the printer, so a large limb might collapse when you actually try to process it. We have looked at other parts, things like replacing breasts; they are particularly difficult to produce because the physical size of the moulds required to make them makes them incredibly heavy to process. The process is straightforward, but there’s quite a lot of work to do on the material side before we can produce something that large.

Dan Howarth: Have you got anything else in the pipeline?

Tom Fripp: For the last year and a half to two years, we’ve also been developing ocular prosthetics, replacing eyes for people. You have a similar situation with the handmade prosthetics: we’ve developed a way of full-colour 3D printing them so that, instead of costing about £3,000-£4,000, we can produce them for less than £100.

Dan Howarth: That works in the same way as the noses?

Tom Fripp: Kind of. With the ocular prosthetics, we’re actually producing them as stock parts so they’re a standardised set of 3D printed parts. At the moment, all of the ocular prosthetics are handmade and very expensive to produce whereas ours are far quicker and far cheaper. So ours will be about £30 and we can make approximately 150 in three hours on our system.

Dan Howarth: Is this project in the same stage as the noses?

Tom Fripp: The product is more refined, actually, and the process is pretty much complete. The materials are standard; there’s no issue with the materials. We’re currently working with Manchester Metropolitan University on that project, and we’re starting to scale up the process for production. There’s an awful lot of interest in the product, particularly from India.

Dan Howarth: How long do you think until that might be put into mass production?

Tom Fripp: I would imagine within 12 months we should be producing this product and it should be going out to India.

I should mention, one of the problems with the soft-tissue prosthetics is that starch and silicone don’t get on too well. So when you over-stress the prosthesis, you get a small white grazing line on it, which isn’t too much of a problem if you’ve got a temporary prosthesis. The only way to get around that is to eliminate the starch from the process, so for the last six months or so, Fripp Design as a company has developed its own new type of 3D printer which actually prints directly in silicone. That’s a complete game changer, because nobody else is actually able to print in silicone and we’ve discovered a way. We have a test rig up and running at the moment, we’re producing samples and we filed the patent about two weeks ago.


Instrumented Bodies by Joseph Malloch and Ian Hattwick

Researchers in Canada have designed a family of prosthetic musical instruments, including an external spine and a touch-sensitive rib cage, that create music in response to body gestures (+ interview + slideshow).

Joseph Malloch and Ian Hattwick, two PhD researchers at McGill University’s Input Devices and Music Interaction Lab (IDMIL), worked with a team of dancers, musicians, composers and choreographers to develop wearable digital instruments for a live music and dance performance, called Les Gestes.

The instruments developed are a bending spine extension, a curved rib cage that fits around the waist and a visor headset with touch and motion sensors.

Instrumented Bodies  - digital prostheses for music and dance
Spine – attached to the back

Each instrument can be played in a traditional hand-held way, but can also be attached to the body, freeing a dancer to twist, spin and move to create sound. All three are lit from within using LEDs.

“The goal of the project was to develop instruments that are visually striking, utilise advanced sensing technologies, and are rugged enough for extensive use in performance,” explained Malloch and Hattwick.

Instrumented Bodies  - digital prostheses for music and dance

The researchers said that they wanted to create objects that are beautiful, functional and believable as instruments. “We wanted to move away from something that looked made by a person, because then it becomes less believable as a mysterious extension to the body,” Hattwick told Dezeen.

“The interesting thing would be either that it looks organic or that it was made by some sort of imaginary futuristic machine. Or somewhere in between,” he added.

Instrumented Bodies  - digital prostheses for music and dance
Visor – worn on the head

The Rib and Visor are constructed from layers of laser-cut transparent acrylic and polycarbonate. “One of the layers uses a transparent conductive plastic film, patterned with the laser cutter to form touch-sensitive pads,” said Hattwick.

The pads are connected to electronics via a thin wire that runs through the acrylic. Touch and motion sensors pick up body movements and radio transmitters are used to transmit the data to a computer that translates it into sound.

Instrumented Bodies  - digital prostheses for music and dance
Rib – fitted around the waist

The Spine is made from laser-cut transparent acrylic vertebrae, threaded onto a transparent PVC hose in a truss-like structure. A thin and flexible length of PETG plastic slides through the vertebrae, allowing the entire structure to bend and twist. The rod is fixed at both ends of the instrument using custom-made 3D-printed components.

Instrumented Bodies  - digital prostheses for music and dance

“We used 3D printing for a variety of purposes,” Hattwick told Dezeen. “One of the primary uses was for solving mechanical problems. All of the instruments use a custom-designed 3D-printed mounting system, allowing the dancers to smoothly slot the instruments into their costumes.”

Instrumented Bodies - digital prostheses for music and dance

Speaking about the future of wearable technology, Hattwick told Dezeen: “Technological devices should be made to accommodate the human body, not the other way around.”

“Just as we’ve seen an explosion of DIY musical instruments and interactive art based on open-source electronics, perhaps we will see an explosion of DIY mechanical devices which create new ideas of how we use our body to interact with technology.”

Instrumented Bodies  - digital prostheses for music and dance

Here’s a 15-minute documentary about the Instrumented Bodies project that features the instruments in action:

The team are now working to develop entirely 3D printed instruments and to radically re-imagine the forms that instruments can take.

Instrumented Bodies  - digital prostheses for music and dance

Fetishistic suits of armour, orthopaedic braces and wearable tusks all featured in an exhibition of prosthetics at the SHOWcabinet space in London earlier this year, and a 3D-printed prosthetic hand has been designed to help children born without fingers.

We’ve also featured a number of wearable gadgets on Dezeen, including the UP activity-tracking wristband and electronic skin tattoos. See more wearable technology »

Photographs are by Vanessa Yaremchuck, courtesy of IDMIL.

Here’s the full interview with PhD researchers Joseph Malloch and Ian Hattwick:


Kate Andrews: Why did you embark on this project? What was the motivation?

Ian Hattwick: This project began as a collaboration between members of our group in the IDMIL (specifically Joseph Malloch, Ian Hattwick, and Marlon Schumacher, supervised by Marcelo Wanderley), a composer (Sean Ferguson, also at McGill), and a choreographer (Isabelle Van Grimde).

In 2008 we worked with the same collaborators on a short piece for ‘cello and dancer’ which made use of a digital musical instrument we had already developed called the T-Stick. We decided to apply for a grant to support a longer collaboration for which we would develop instruments specifically for dancers but based loosely on the T-Stick.

Instrumented Bodies  - digital prostheses for music and dance
Instrumented Bodies – digital prosthetics sketches

During the planning stages we decided to explore ideas of instrument as prosthesis, and to design instruments that could be played both as objects and as part of the body. We started by sketching and building rough prototypes out of foam and corrugated plastic, and attaching them to the dancers to see what sort of movement would be possible – and natural – while wearing the prostheses.

After settling on three basic types of object (Spine, Rib, and Visor) we started working on developing the sensing, exploring different materials and refining the design.

Kate Andrews: What materials are the spine, rib and visor made from?

Ian Hattwick: Each of the Ribs and the Visors is constructed from a solvent-welded sandwich of laser-cut transparent acrylic and polycarbonate. One of the layers uses a transparent conductive plastic film, patterned with the laser cutter to form touch-sensitive pads.

The pads are connected to the electronics in the base of the object using very thin wire, run through laser-etched grooves in the acrylic. The electronics in the base include a 3-axis accelerometer, a ZigBee radio transceiver, circuitry for capacitive touch sensing, and drivers for the embedded LEDs. Li-Ion batteries are used for power.
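To make the data path concrete, here is a purely hypothetical sketch of the kind of fixed-size sensor frame an instrument could transmit over its radio link on each update; the field layout is invented and is not the IDMIL firmware’s actual packet format.

```python
# Hypothetical sensor frame for one instrument update. The field layout is
# an assumption made for illustration, not the project's real protocol.
import struct

FRAME_FMT = "<B3f8HB"   # instrument id, 3-axis accel, 8 touch pads, LED level

def pack_frame(instrument_id, accel, touch_pads, led_level):
    """Serialise one sensor reading into bytes for transmission."""
    return struct.pack(FRAME_FMT, instrument_id, *accel, *touch_pads, led_level)

def unpack_frame(data):
    """Decode a received frame back into named fields on the host computer."""
    values = struct.unpack(FRAME_FMT, data)
    return {
        "instrument": values[0],
        "accel": values[1:4],
        "touch": values[4:12],
        "led": values[12],
    }

frame = pack_frame(2, accel=(0.1, -0.3, 9.7), touch_pads=[0] * 8, led_level=128)
print(unpack_frame(frame))
```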

Each of the Spines is constructed from laser-cut transparent acrylic vertebrae threaded onto transparent PVC hose in a truss-like structure. One of the rails in the truss is a thin, very flexible length of PETg plastic that can slide through the holes in the vertebrae, allowing the entire structure to bend and twist. The PETg rod is fixed at both ends of the instrument using custom 3D-printed attachments.

For sensing, the Spines use inertial measurement units (IMUs) located at each end of the instrument – each a circuit-board including a 3-axis accelerometer, a 3-axis rate gyroscope, a 3-axis magnetometer, and a micro-controller running custom firmware to fuse the sensor data into a stable estimate of orientation using a complementary filter.
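A complementary filter of the kind Hattwick describes blends a fast but drifting gyroscope estimate with a slow but absolute gravity reference. The sketch below reduces the idea to a single tilt angle for clarity; the real firmware fuses full 9-axis data into a quaternion, and the constants and samples here are invented.

```python
# Minimal sketch of the complementary-filter idea, reduced to one tilt angle.
# The blend constant and the sample values are assumptions for illustration.
import math

ALPHA = 0.98  # trust the gyro for fast motion, the accelerometer for drift

def complementary_step(angle, gyro_rate, accel_x, accel_z, dt):
    """One filter update for a tilt angle about a single axis (radians).

    gyro_rate        : angular velocity from the rate gyroscope (rad/s)
    accel_x, accel_z : accelerometer components used to estimate tilt
    """
    gyro_angle = angle + gyro_rate * dt          # integrate the gyro
    accel_angle = math.atan2(accel_x, accel_z)   # gravity-based estimate
    # Blend: high-pass the gyro path, low-pass the accelerometer path.
    return ALPHA * gyro_angle + (1.0 - ALPHA) * accel_angle

angle = 0.0
for gyro_rate, ax, az in [(0.10, 0.05, 0.99), (0.12, 0.09, 0.98)]:  # fake samples
    angle = complementary_step(angle, gyro_rate, ax, az, dt=0.01)
print(angle)
```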

In this way we know the orientation of each end of the instrument (represented as quaternions), and we can interpolate between them to track or visualise the shape of the entire instrument (a video explaining the sensing can be watched on YouTube). Like the Ribs and Visors, the Spine uses a ZigBee radio transceiver for data communications and LiPoly batteries for power.
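Interpolating between two end orientations is typically done with spherical linear interpolation (slerp) on the quaternions. Below is a minimal sketch of that step; the example quaternions are invented, and the article does not specify exactly how the IDMIL software traces the curve.

```python
# Sketch of interpolating between the Spine's two end-point orientations
# with slerp. Quaternions are stored as (w, x, y, z); values are invented.
import numpy as np

def slerp(q0, q1, t):
    """Spherical linear interpolation between unit quaternions q0 and q1."""
    q0 = np.asarray(q0, dtype=float)
    q1 = np.asarray(q1, dtype=float)
    dot = float(np.dot(q0, q1))
    if dot < 0.0:            # take the shorter arc
        q1, dot = -q1, -dot
    if dot > 0.9995:         # nearly parallel: fall back to a linear blend
        q = q0 + t * (q1 - q0)
        return q / np.linalg.norm(q)
    theta = np.arccos(dot)
    s0 = np.sin((1.0 - t) * theta) / np.sin(theta)
    s1 = np.sin(t * theta) / np.sin(theta)
    return s0 * q0 + s1 * q1

# Hypothetical orientations of the IMUs at the two ends of the Spine.
base = np.array([1.0, 0.0, 0.0, 0.0])                              # identity
tip = np.array([np.cos(np.pi / 8), 0.0, np.sin(np.pi / 8), 0.0])   # 45 deg about y

# Sample intermediate orientations to visualise the curve of the instrument.
for t in np.linspace(0.0, 1.0, 5):
    print(np.round(slerp(base, tip, t), 3))
```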

Instrumented Bodies  - digital prostheses for music and dance

All of the instruments use a custom-designed 3D-printed mounting system allowing the dancers to smoothly slot the instruments into their costumes.

A computer equipped with another ZigBee radio transceiver communicates with all of the active instruments and collects their sensor data. This data is processed further and then made available on the network for use in controlling media synthesis. We use an open-source, cross-platform software library called libmapper (a long-term project of the IDMIL’s – more info at www.libmapper.org) to make all of the sensor data discoverable by other applications and to support the task of “mapping” the sensor, instrument and gesture data to the parameters of media synthesisers.
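As a conceptual illustration only (this is not the libmapper API; see www.libmapper.org for the real library), the sketch below shows what the “mapping” layer amounts to: scaling named sensor signals onto synthesiser parameter ranges. The signal names, ranges and destination parameters are invented.

```python
# Conceptual illustration of a sensor-to-synth mapping layer. The names,
# ranges and destinations are assumptions for the sketch, not project data.

def linear_map(value, in_lo, in_hi, out_lo, out_hi):
    """Scale value from the sensor range onto the synth parameter range."""
    t = (value - in_lo) / (in_hi - in_lo)
    t = max(0.0, min(1.0, t))            # clamp to the declared range
    return out_lo + t * (out_hi - out_lo)

# Hypothetical table: sensor signal -> (sensor range, synth param, param range)
MAPPINGS = {
    "spine/bend":     ((0.0, 1.0),  "granular/density", (1.0, 60.0)),
    "rib/touch_pad3": ((0.0, 1.0),  "filter/cutoff_hz", (200.0, 8000.0)),
    "visor/accel_x":  ((-9.8, 9.8), "reverb/mix",       (0.0, 1.0)),
}

def route(signal_name, value):
    """Return the synth parameter and scaled value for one sensor reading."""
    (in_lo, in_hi), param, (out_lo, out_hi) = MAPPINGS[signal_name]
    return param, linear_map(value, in_lo, in_hi, out_lo, out_hi)

print(route("spine/bend", 0.4))   # -> ('granular/density', 24.6)
```

The point of keeping this layer programmable, as the interview notes, is that the same gesture data can be re-mapped to entirely different sounds without touching the instruments themselves.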

The use of digital fabrication technologies allowed us to quickly iterate through variations of the prototypes. To start out, we used laser-cutters at the McGill University School of Architecture and a 3D printer located at the Centre for Interdisciplinary Research in Music Media and Technology (CIRMMT). As we moved to production we outsourced some of the laser-cutting to a commercial company.

Kate Andrews: How did collaboration across disciplines of design, music and technology change and shape the project?

Ian Hattwick: From the very beginning of the project, the three artistic teams worked together to shape the final creations. In the first workshop, we brought non-functional prototypes of the instruments, and the dancers worked with them to find compelling gestures, while we tried a variety of shapes and forms and the composers thought about the kind of music the interaction of dancers and instruments suggested.

Later in the project, as we tried a variety of materials in the construction of the instruments, each new iteration would suggest new movements to the dancers and choreographer. Particularly, as we moved to clear acrylic for the basic material of the ribs, the instruments grew larger in order to have a greater visual impact, which suggested to the dancers the possibility of working with gestures both within and without the curve of the ribs.

These new gestures in turn required the ribs to have a specific size and curvature. Over time, the dancers gained a knowledge of the forms of the instruments which gave them the confidence to perform as if the instruments were actual extensions of their bodies.

Instrumented Bodies  - digital prostheses for music and dance
Component tests

Kate Andrews: How was 3D printing used during the project – and why?

Ian Hattwick: We used 3D printing for a variety of purposes in this project. One of the primary uses was for solving mechanical problems – such as designing the mounting system for the instruments.

We tried to find prefabricated solutions for attaching the instruments to the costumes, but were unable to find anything that suited our purposes, so we designed and prototyped a series of clips and mounts to find the shapes that would be easy for the dancers to use, that would be durable, and that would fit our space constraints.

In addition, 3D printing quickly became a tool which we used any time we had a need for a custom-shaped mechanical part. Some examples are a threaded, removable collar for mounting the PET-G rod to the spine, and mounting collars and caps for the lighting in the spine.

[A document detailing the use of 3D printing in the project can be downloaded here].

Instrumented Bodies  - digital prostheses for music and dance
Instrumented Bodies – digital prosthetics sketches

Kate Andrews: Where do you see this technology being used now?

Ian Hattwick: 3D printing, or additive manufacturing as it is known in industry, is increasingly commonplace. In the research community, we’ve seen applications everywhere from micro-fluidic devices to creating variable acoustic spaces. One of my favourite applications is the creation of new homes for hermit crabs.

Kate Andrews: Can we expect to see other live performances using the instruments?

Ian Hattwick: We are currently working with the instruments ourselves to create new mappings and synthesis techniques, and in October we will be bringing them to Greece to take part in a 10-day experimental artist residency focusing on improvisation. We’ve also been talking with a variety of other collaborators in both dance and music, so we expect to have quite a few different performances in the next year.

Kate Andrews: What do you think is the future for interactive and wearable technology?

Ian Hattwick: I’m really excited about the coming generations of constantly worn health monitors, which are the first widespread adoption of the ideas of the “quantified self” movement. I expect in a relatively short time it will be normal for people to maintain logs of more than just their activity, heart rate or sleep patterns, but also the effect of their mood and environment on their body. I’m also excited about e-textiles: clothing which can change its shape or visual appearance.

One of the ways in which I see the prosthetic instruments making a real contribution is the idea that technological devices should be made to accommodate the human body, and not the other way around. Particularly, you see musical instruments created so as to be easy to mass-manufacture, rather than seeking to identify and support natural physical expressions during musical performance. At the same time, by creating technologies which are invisible to the performer we take away the physical interaction with an instrument which is so much a part of how we think about performance, both individually and in ensembles.

Kate Andrews: Does this present a new future for music? For dance?

Joseph Malloch: There is no one future for music or dance, but we can always count on new technologies being adapted for art, no matter their intended purpose.

Ian Hattwick: In interactive dance, the paradigm has always been capturing the unencumbered motion of the dancer; in music, there tends to be a fetishisation of the instrument. So in a sense, the idea of prosthetic instruments challenges the existing norms of those art forms. Certainly, using the prosthetic instruments requires a different conceptualisation of how we can perform dance and music at the same time.

The challenges of working with prosthetic instruments can be strongly appealing, however, and the level of mechanical sophistication which is provided by new generations of digital manufacturing will create opportunities for artistic exploration.

Just as we’ve seen an explosion of DIY musical instruments and interactive art based on open-source electronics, perhaps we will see an explosion of DIY mechanical devices which create new ideas of how we use our body to interact with technology.

Instrumented Bodies - digital prostheses for music and dance

Kate Andrews: What are you working on now?

Ian Hattwick: Documentation: We work in academia, and publication of in-depth documentation of our motivations, design choices, and insights gained throughout the process of development is an important part of the work. We are part of a much larger community of researchers exploring artistic uses for new technologies, and it is important that we share our experiences and results.

Mapping: The programmable connections between the gestures sensed by the instruments and the resulting sound/media really define the experiences of the performers and the audience. We are busy finding new voices and modes of performance for the prostheses.

Improvements to hardware and software: In particular, sensing technology advances very quickly, with price, quality, and miniaturisation constantly improving. There are already some new tools available now that we couldn’t use three months ago.

3D printing musical instruments: We are talking with a 3D printer manufacturer about developing acoustic instruments which are entirely 3D printed, and which take advantage of the ability to manipulate an object’s internal structure, as well as radically re-imagining the forms which musical instruments can take.
