Already behind "Rolling Words", the smokable book created for Snoop Dogg, creative director Paulo Coelho has now designed this brochure and levitating book for Brazilian aircraft manufacturer Embraer. The book floats above a base adapted for the occasion, thanks to a magnet placed on the back cover.
Apparently 95% of dreams are forgotten shortly after waking if they aren’t recorded. That means roughly one-third of our lives is lost within the subconscious. With SHADOW, co-founders Hunter Lee Soik and Jason Carvalho are attempting to do…
The only way to break your iPhone addiction is to lean towards gadgets that don’t disappoint! For example, the Plumage Concept Phone is a good way to wean you off the apple and enrich your mindscape with something more innovative. Love the ideation and the sleek design, and of course the built-in protective cover / keyboard! Slick!
by Jorge Abellas. Bang & Olufsen (B&O) has always produced exquisitely designed sound systems not intended for use in a multi-brand entertainment setup. A few months back, the brand made a very deliberate effort to expand their…
Advertorial content: Anybody who has bought a SIM card in Europe can attest to the ease and freedom of the “pop it in, top up and go” kind of model found there—and in many countries around the world. Signing a…
Five little robots travel along lines drawn in felt-tip pen and turn coloured scribbles into music in this installation by Japanese designer Yuri Suzuki (+ movie).
The Looks Like Music project by sound artist and designer Yuri Suzuki features robots that are programmed to follow a black line drawn on white paper. They each respond with specific sounds as they pass over coloured marks laid down across the track by visitors.
“The public is invited to actively contribute to the development of the installation in the exhibition space by extending the circuit drawn on paper,” said Suzuki. “Visitors thus participate in the creation of a large-scale artwork and enrich a collectively composed sound piece.”
Called Colour Chasers, the devices each have a different shape and translate the colours they encounter into sounds including drums, deep bass, chords and melody.
The robots are produced by London technology firm Dentaku, which Suzuki co-founded with sound programmer Mark McKeague this year, and are a development of Suzuki’s earlier project focussing on dyslexia.
“I am dyslexic and I cannot read musical scores,” Suzuki told Dezeen. “However, I have a passion to play and create new music and I always dream to create new notation of music.”
“In this installation people can interact with robots and discover the new method to create music,” he added.
It’s not often I’m sitting in front of the computer with my mouth hanging open, but the video below is literally jaw-dropping. Researchers Tao Chen, Zhe Zhu, Ariel Shamir, Shi-Min Hu and Daniel Cohen-Or have developed software called “3-Sweep,” an insanely cool way to extract editable 3D data from a 2D image. Not a bunch of 2D images—we all know the technology where you walk around an object and fire off a dozen shots—just one image. Which means you no longer have to be there with a camera, but could conceivably pull a (relatively well-shot) 2D image from anywhere, and quickly create a model of it.
“Our approach combines the cognitive abilities of humans with the computational accuracy of the machine to solve this problem,” writes the team. What you basically do is use your mouse to “sweep” lines and/or ellipses across the image, quickly teaching the software where the axes are. Look at how freaking easy this looks, and watch what they do with the telescope and lamppost:
We dig that they showed not only the successes, but also the failures of the software, to give you a realistic idea of what is possible. And yes, it seems to lend itself best to geometric objects with some degree of rotational symmetry, but this is still an extraordinary breakthrough.
For those of you in Hong Kong, a 3-Sweep demo is scheduled for the upcoming SIGGRAPH Asia.
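3-Sweep’s core modelling primitive is the generalized cylinder: the profile the user sweeps is lofted along the inferred axis to produce a 3D surface. The lofting step can be roughly illustrated in code. This is a minimal sketch of the idea only, not the authors’ implementation, and every name in it is made up:

```python
import numpy as np

def sweep_circle(axis_points, radii, segments=16):
    """Loft circular cross-sections along a polyline axis.

    axis_points: (N, 3) points on the sweep axis.
    radii: length-N sequence, cross-section radius at each point.
    Returns an (N * segments, 3) array of surface vertices.
    """
    axis_points = np.asarray(axis_points, dtype=float)
    verts = []
    for i, (p, r) in enumerate(zip(axis_points, radii)):
        # Tangent of the axis at this station (forward/backward difference).
        j = min(i + 1, len(axis_points) - 1)
        k = max(i - 1, 0)
        t = axis_points[j] - axis_points[k]
        t /= np.linalg.norm(t)
        # Two vectors spanning the cross-section plane, orthogonal to t.
        helper = np.array([0.0, 0.0, 1.0])
        if abs(np.dot(helper, t)) > 0.9:   # avoid a near-parallel helper
            helper = np.array([0.0, 1.0, 0.0])
        u = np.cross(t, helper)
        u /= np.linalg.norm(u)
        v = np.cross(t, u)
        # Circle of radius r around p in the (u, v) plane.
        for a in np.linspace(0.0, 2 * np.pi, segments, endpoint=False):
            verts.append(p + r * (np.cos(a) * u + np.sin(a) * v))
    return np.array(verts)

# A straight vertical axis with a tapering radius, lamppost-style.
axis = [(0.0, 0.0, z) for z in np.linspace(0.0, 1.0, 5)]
mesh = sweep_circle(axis, radii=[0.10, 0.09, 0.08, 0.07, 0.06])
print(mesh.shape)  # (80, 3)
```

The interesting part of 3-Sweep is everything this sketch leaves out: snapping the swept profile to image edges so the axis and radii come from the photograph rather than from hand-typed numbers.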
Researchers at Massachusetts Institute of Technology have built a flying robot that can guide people around complex urban environments or aid search-and-rescue missions, in an attempt to show that drones can perform useful tasks as well as sinister ones (+ movie).
The SkyCall quadcopter, designed by research group Senseable City Lab at Massachusetts Institute of Technology, acts like an electronic flying guide dog, hovering just ahead of the user and guiding them to their destination.
Yaniv Jacob Turgeman, research and development lead on the project, said SkyCall was designed to counter the sinister reputation of drones, and show they can be useful. “Our imaginations of flying sentient vehicles are filled with dystopian notions of surveillance and control, but this technology should be tasked with optimism,” he told Dezeen.
“The urban UAV (unmanned aerial vehicle) will guide us in disorienting situations, support search and rescue efforts, track environmental problems, and even act as digital insects re-introducing natural biodiversity to our man-made environments,” he added. “As a networked intelligence with a physical form, the urban UAV offers an alternative interface to the digital layers of the city.”
A prototype of the SkyCall quadcopter has already been used on test missions to guide students around the MIT campus in Cambridge, USA.
Students and visitors call for a SkyCall tour guide via a customised mobile app. When the user presses the ‘call’ button, the nearest vehicle locates the caller via the phone’s GPS and sets off to meet them.
The vehicle arrives in front of the user and awaits instructions. The visitor then types in a simple code telling the drone where on campus they wish to go.
The drone travels at walking speed, hovering around two metres in front of the visitor, who can press “pause” to get the drone to hold a stationary position. The drone provides information about locations it passes by “speaking” to the user via their smartphone.
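Hovering two metres ahead of a walking user reduces to projecting a lead point from the user’s position toward the next waypoint. A toy sketch of that geometry, assuming a flat local coordinate frame in metres (this is illustrative only, not the MIT controller):

```python
import math

def lead_point(user, waypoint, lead=2.0):
    """Position `lead` metres ahead of the user, toward the waypoint.

    user, waypoint: (x, y) positions in a local metric frame.
    """
    dx, dy = waypoint[0] - user[0], waypoint[1] - user[1]
    dist = math.hypot(dx, dy)
    if dist <= lead:        # closer than the lead distance: park at the goal
        return waypoint
    return (user[0] + lead * dx / dist, user[1] + lead * dy / dist)

print(lead_point((0.0, 0.0), (10.0, 0.0)))  # (2.0, 0.0)
```

Re-evaluating this every control tick as the user moves produces the “following” behaviour: the drone’s target slides along the route, always the same distance ahead.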
“UAV technology holds huge disruptive potential,” project lead Chris Green told Dezeen. “We want to harness this and specifically explore its value to the city and its inhabitants.”
He added: “Rather than the visitor diverting their attention to a map, the autonomous guide provides an intuitive navigational system of simply ‘following’.”
Photographs are courtesy of MIT SENSEable City Lab.
Here’s a project description from the project team:
SkyCall by MIT SENSEable City Lab
How can we re-imagine UAV technology, to help us navigate challenging situations and complex environments? This is the premise for SkyCall – an autonomous flying quadcopter and personal tour guide – operating in one of mankind’s most difficult and disorientating labyrinths: MIT campus. We tested this technology on someone you would typically expect to be lost within MIT.
Development
Our Lab is exploring two distinct development paths of UAV technology: a quadcopter’s capacity to autonomously sense and perceive its environment, and its ability to interface and interact with people. These parallel aims steered the development of SkyCall’s tour-guide system, resulting in a platform that can efficiently locate, communicate with, and guide visitors around MIT campus, specifically along predetermined routes or towards user-determined destinations.
A custom SkyCall app was developed for human/UAV interface, enabling the visitor to make specific requests, and the UAV to both locate and wirelessly communicate with them. When the user presses the ‘call’ button, SkyCall instantaneously accesses the GPS location of the visitor’s phone and relays spatial coordinates to the nearest available UAV.
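Dispatching the nearest available UAV amounts to a nearest-neighbour search over great-circle distances between the caller’s GPS fix and each idle vehicle. A minimal sketch of that step, with a hypothetical fleet structure (the coordinates, field names and functions here are illustrative assumptions, not SkyCall’s actual code):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    r = 6_371_000  # mean Earth radius, metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_available(fleet, caller):
    """Pick the closest idle UAV to the caller's (lat, lon)."""
    idle = [u for u in fleet if u["status"] == "idle"]
    return min(idle, key=lambda u: haversine_m(*u["pos"], *caller))

# Hypothetical fleet around the MIT campus (~42.36 N, 71.09 W).
fleet = [
    {"id": "uav-1", "pos": (42.3601, -71.0942), "status": "idle"},
    {"id": "uav-2", "pos": (42.3585, -71.0920), "status": "busy"},
    {"id": "uav-3", "pos": (42.3620, -71.0870), "status": "idle"},
]
caller = (42.3598, -71.0935)
print(nearest_available(fleet, caller)["id"])  # uav-1
```

At campus scale a flat-earth approximation would also do, but the haversine form stays correct for the city-wide deployments the lab describes.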
The quadcopter itself utilises onboard autopilot and GPS navigation systems with sonar sensors and WiFi connectivity (via a ground station), enabling it to fly autonomously and communicate with the user via the SkyCall app. The UAV also integrates an onboard camera, both as an information-gathering system (relaying images to a ‘base’ location upon encountering the user) and as a manually controlled camera, accessible to the visitor-cum-tourist again via the SkyCall app.
Future
SkyCall is Phase I of a larger development program that is currently underway at Senseable City Lab, with the broader aim of exploring novel, positive uses of UAV technology in the urban context. This project offers a case study within our ongoing research initiative, and suggests promising new infrastructure potentials.
News: technology giant Apple has lost its vision and reached creative saturation, according to Hartmut Esslinger, the industrial designer hired by Steve Jobs to help transform the brand in the 1980s.
Speaking to Quartz magazine this week, the founder of product design studio Frog said that Apple is operating like Sony was in the 1980s when he worked there, where the “visionary founder has been replaced by leaders who aren’t thinking beyond refinement and increasing profit.”
“As soon as you can copy something [like the iPhone], it’s not smart enough anymore,” he told the magazine. “I think Apple has reached in a certain way a saturation.”
Esslinger designed over 100 products for Sony before joining Apple in 1982, where he worked with Apple co-founder Steve Jobs, who died in 2011, on the early design language for Macintosh computers.
He recounted how Jobs was open to experimenting with new ideas and took risks that led to innovation, a quality Esslinger feels is lacking at Apple today. “Steve Jobs was a man who didn’t care for any rational argument why something should not be tried,” said Esslinger. “He said a lot of ‘no,’ but he also said a lot of ‘yes’ to things and he stubbornly insisted on trying new things.”
He claimed Jobs conceived a “book-like computer” as early as 1982. “That vision eventually led to the Apple Newton, a tablet that failed, and the iPhone and iPad, which made history. That kind of vision is now lacking at Apple.”
The designer suggested that Apple is being left behind by radical thinking from young designers in places like China, where Esslinger currently leads the Detao Master Class for Strategic Design at Shanghai Institute of Visual Art (SIVA).
He said that the next generation of innovators are moving beyond flat-screen technology, developing ideas for three-dimensional interfaces. “I think flat screens have reached a level of saturation,” said Esslinger. “Screens don’t have to be all right angles – the cheapest way is not always the best way. What’s happening in China right now is a paradigm shift where they realise they have to innovate and can’t just make cheap products.”
Here’s a bizarre technological development just waiting for the right application to come along: The Disney Research outpost in Pittsburgh, co-located at Carnegie Mellon, has developed a way to transmit sound from one human body to another. The kicker is that no one else can hear it, and the two bodies must be physically touching. More specifically, the “speaker” person must use their finger to touch the “listener” person’s earlobe. It’s like a more communicative version of the Wet Willy.
They’re calling the technology Ishin-Denshin, named for the Japanese concept of tacit, unspoken understanding between two people. The way it works is that the speaker records something into a special microphone. The microphone itself then transmits that recording directly into the body of the speaker, through the very hand they’re using to hold the microphone. When they then touch their other hand to the listener’s earlobe, the sound travels into the listener’s ear. And you can even daisy-chain the sound through multiple bodies, as you’ll see here:
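Disney’s hardware does this by turning the recorded loop into a high-voltage, low-current modulated electrostatic field that the body carries safely, which is well beyond a code sketch. The underlying modulation idea, though, can be illustrated with plain amplitude modulation. All the numbers below are illustrative, not the parameters of the Ishin-Denshin system:

```python
import numpy as np

fs = 48_000                       # sample rate in Hz, one second of signal
t = np.arange(fs) / fs
audio = np.sin(2 * np.pi * 440 * t)      # stand-in for the recorded loop
carrier = np.sin(2 * np.pi * 8_000 * t)  # stand-in carrier waveform

# Amplitude modulation: the audio envelope rides on the carrier.
modulated = (1.0 + 0.5 * audio) * carrier

# Crude rectified envelope; a real detector would low-pass-filter this.
envelope = np.abs(modulated)
print(modulated.shape)  # (48000,)
```

In the installation, the touched earlobe effectively plays the detector role: the modulated field makes it vibrate, and only the person being touched hears the result.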