London designer Dominic Wilcox created this stylus that straps over his nose for using his touch-screen phone in the bath.
Called Finger-nose Stylus, the device is made from a handheld stylus embedded in fibrous plaster.
It allows the user to securely hold the phone with one hand and operate it with their nose.
Here are some more details from Dominic Wilcox:
I sometimes use my touchphone in the bath. I know it’s stupid. One problem I encounter is that when I put my left hand in the water without thinking, it gets wet and unusable for touchscreen navigation. It is too risky to try to hold and navigate with one hand. I found that I could use my nose to scroll but I couldn’t see where my nose was touching precisely. It was at that point that I came up with this idea of a nose extension ‘finger’ that would allow navigation while my phone is firmly held by one hand.
I did send a tweet from the bath last night, typed as ‘Hello I am tweeting with my nose’. Unfortunately, due to the phone’s auto-correct, it was sent as… ‘Hello I am meeting with my nose’.
I lost 2 followers.
It’s also handy when out and about multi-tasking. I imagine it would be a great accessory for iPad users.
I bought a handheld stylus that I embedded in the plaster nose. The plaster comes mixed with fibres that make it look hairy.
Although this is handy for me in the bath, it touches on possible uses for people without the use of a hand. The design could be made more ‘subtle’ for everyday use, perhaps extending from around the neck.
News: a fat-wheeled electric concept vehicle that you ride standing up like a child’s scooter has achieved its funding target on Kickstarter, allowing final development to go ahead (+ slideshow).
The Scrooser, developed by a German company of the same name, beat its $120,000 target on the crowd-funding website. The firm will now finalise the design of its “impulse drive” motor, which sits within the hub of the rear wheel and delivers a burst of power each time the rider uses their foot to propel the vehicle forward.
Scrooser founder Jens Thieme described the product as “a completely new vehicle category.” He added: “We are very happy with the success of Scrooser on Kickstarter. With the fresh capital, we can now complete the final development of our innovative Impulse Drive and get a lot closer to our goal.”
The motor automatically kicks in to boost the rider’s propulsion at speeds of over 2 miles per hour, but disengages when the rider uses the brake. The vehicle has a top speed of 15 miles per hour.
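For the technically minded, the behaviour described above reduces to a simple control rule. Here is a minimal Python sketch of that rule, assuming a per-kick assist signal; the names, the power taper and the kick detection are illustrative guesses based on the figures quoted, not Scrooser’s actual firmware.

```python
ENGAGE_SPEED_MPH = 2.0   # assistance only engages above this speed
TOP_SPEED_MPH = 15.0     # assistance is cut entirely at the top speed

def assist_power(speed_mph: float, rider_kicked: bool, braking: bool) -> float:
    """Return motor power as a fraction of the 1000 W maximum."""
    if braking:
        return 0.0   # using the brake always disengages the motor
    if not rider_kicked:
        return 0.0   # power comes as a burst per kick, not continuously
    if speed_mph < ENGAGE_SPEED_MPH or speed_mph >= TOP_SPEED_MPH:
        return 0.0
    # Taper assistance toward the top speed (an assumption: the real
    # power curve is not published).
    return 1.0 - (speed_mph - ENGAGE_SPEED_MPH) / (TOP_SPEED_MPH - ENGAGE_SPEED_MPH)

print(assist_power(6.0, rider_kicked=True, braking=False))  # ~0.69 of full power
```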
“A perfect pace for maneuvering through the city among pedestrian-filled sidewalks is around 6 mph,” the company’s website explains, “but feel free to race cyclists on bike paths at a maximum speed of up to 15 mph.”
With wide wheels and a low centre of gravity, the Scrooser remains upright when the rider dismounts. The 1000W motor provides direct power to the rear wheel, without the need for gears or chains.
A rechargeable lithium-ion battery located beneath the footboard can provide power for up to 25 days, and takes around 3 hours to recharge via a standard power outlet.
Measuring 1.75 metres and weighing 28kg, the scooter features a frame made of aluminium tube formed by a process called “freeform 3D bending”. The Scrooser also features a low seat, integrated lock and LED lights for riding at night.
Researchers in Canada have designed a family of prosthetic musical instruments, including an external spine and a touch-sensitive rib cage, that create music in response to body gestures (+ interview + slideshow).
The instruments developed are a bending spine extension, a curved rib cage that fits around the waist and a visor headset with touch and motion sensors.
Spine – attached to the back
Each instrument can be played in a traditional hand-held way, but can also be attached to the body, freeing a dancer to twist, spin and move to create sound. All three are lit from within using LEDs.
“The goal of the project was to develop instruments that are visually striking, utilise advanced sensing technologies, and are rugged enough for extensive use in performance,” explained researchers Joseph Malloch and Ian Hattwick.
The researchers said that they wanted to create objects that are beautiful, functional and believable as instruments. “We wanted to move away from something that looked made by a person, because then it becomes less believable as a mysterious extension to the body,” Hattwick told Dezeen.
“The interesting thing would be either that it looks organic or that it was made by some sort of imaginary futuristic machine. Or somewhere in between,” he added.
Visor – worn on the head
The Rib and Visor are constructed from layers of laser-cut transparent acrylic and polycarbonate. “One of the layers uses a transparent conductive plastic film, patterned with the laser cutter to form touch-sensitive pads,” said Hattwick.
The pads are connected to electronics via a thin wire that runs through the acrylic. Touch and motion sensors pick up body movements and radio transmitters are used to transmit the data to a computer that translates it into sound.
Rib – fitted around the waist
The Spine is made from laser-cut transparent acrylic vertebrae, threaded onto a transparent PVC hose in a truss-like structure. A thin and flexible length of PETG plastic slides through the vertebrae, allowing the entire structure to bend and twist. The rod is fixed at both ends of the instrument using custom-made 3D-printed components.
“We used 3D printing for a variety of purposes,” Hattwick told Dezeen. “One of the primary uses was for solving mechanical problems. All of the instruments use a custom-designed 3D-printed mounting system, allowing the dancers to smoothly slot the instruments into their costumes.”
Speaking about the future of wearable technology, Hattwick told Dezeen: “Technological devices should be made to accommodate the human body, not the other way around.”
“Just as we’ve seen an explosion of DIY musical instruments and interactive art based on open-source electronics, perhaps we will see an explosion of DIY mechanical devices which create new ideas of how we use our body to interact with technology.”
Here’s a 15-minute documentary about the Instrumented Bodies project that features the instruments in action:
The team are now working to develop entirely 3D printed instruments and to radically re-imagine the forms that instruments can take.
Photographs are by Vanessa Yaremchuck, courtesy of IDMIL.
Here’s the full interview with PhD researchers Joseph Malloch and Ian Hattwick:
Kate Andrews: Why did you embark on this project? What was the motivation?
Ian Hattwick: This project began as a collaboration between members of our group in the IDMIL (specifically Joseph Malloch, Ian Hattwick, and Marlon Schumacher, supervised by Marcelo Wanderley), a composer (Sean Ferguson, also at McGill), and a choreographer (Isabelle Van Grimde).
In 2008 we worked with the same collaborators on a short piece for ‘cello and dancer’ which made use of a digital musical instrument we had already developed called the T-Stick. We decided to apply for a grant to support a longer collaboration for which we would develop instruments specifically for dancers but based loosely on the T-Stick.
Instrumented Bodies – digital prosthetics sketches
During the planning stages we decided to explore ideas of instrument as prosthesis, and to design instruments that could be played both as objects and as part of the body. We started by sketching and building rough prototypes out of foam and corrugated plastic, and attaching them to the dancers to see what sort of movement would be possible – and natural – while wearing the prostheses.
After settling on three basic types of object (Spine, Rib, and Visor) we started working on developing the sensing, exploring different materials and refining the design.
Kate Andrews: What materials are the spine, rib and visor made from?
Ian Hattwick: Each of the Ribs and the Visors is constructed from a solvent-welded sandwich of laser-cut transparent acrylic and polycarbonate. One of the layers uses a transparent conductive plastic film, patterned with the laser cutter to form touch-sensitive pads.
The pads are connected to the electronics in the base of the object using very thin wire, run through laser-etched grooves in the acrylic. The electronics in the base include a 3-axis accelerometer, a ZigBee radio transceiver, circuitry for capacitive touch sensing, and drivers for the embedded LEDs. Li-Ion batteries are used for power.
Each of the Spines is constructed from laser-cut transparent acrylic vertebrae threaded onto transparent PVC hose in a truss-like structure. One of the rails in the truss is a thin, very flexible length of PETg plastic that can slide through the holes in the vertebrae, allowing the entire structure to bend and twist. The PETg rod is fixed at both ends of the instrument using custom 3D-printed attachments.
For sensing, the Spines use inertial measurement units (IMUs) located at each end of the instrument – each a circuit-board including a 3-axis accelerometer, a 3-axis rate gyroscope, a 3-axis magnetometer, and a micro-controller running custom firmware to fuse the sensor data into a stable estimate of orientation using a complementary filter.
In this way we know the orientation of each end of the instrument (represented as quaternions), and we can interpolate between them to track or visualise the shape of the entire instrument (a video explaining the sensing can be watched on YouTube). Like the Ribs and Visors, the Spine uses a ZigBee radio transceiver for data communications and LiPoly batteries for power.
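To make the sensor-fusion idea concrete, here is a hedged Python sketch: a one-dimensional complementary filter of the kind Hattwick describes (the real firmware fuses full 9-axis data into quaternions), plus a standard quaternion slerp for interpolating between the orientations of the two end IMUs. All names and the blend constant are illustrative assumptions, not the IDMIL’s code.

```python
import math

def complementary_update(angle, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """One step of a 1-D complementary filter: trust the integrated gyro
    in the short term and the accelerometer's gravity angle in the long term."""
    gyro_angle = angle + gyro_rate * dt           # fast but drifts over time
    accel_angle = math.atan2(accel_x, accel_z)    # noisy but drift-free
    return alpha * gyro_angle + (1.0 - alpha) * accel_angle

def slerp(q0, q1, t):
    """Spherical linear interpolation between unit quaternions (w, x, y, z),
    e.g. to estimate orientations along the Spine between its two IMUs."""
    dot = sum(a * b for a, b in zip(q0, q1))
    if dot < 0.0:                                 # take the shorter arc
        q1, dot = [-c for c in q1], -dot
    if dot > 0.9995:                              # nearly parallel: linear fallback
        out = [a + t * (b - a) for a, b in zip(q0, q1)]
        norm = math.sqrt(sum(c * c for c in out))
        return [c / norm for c in out]
    theta = math.acos(dot)
    s0 = math.sin((1.0 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return [s0 * a + s1 * b for a, b in zip(q0, q1)]

# Midpoint between identity and a 90-degree rotation about x:
print(slerp((1.0, 0.0, 0.0, 0.0), (0.7071, 0.7071, 0.0, 0.0), 0.5))
```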
All of the instruments use a custom-designed 3D-printed mounting system allowing the dancers to smoothly slot the instruments into their costumes.
A computer equipped with another ZigBee radio transceiver communicates with all of the active instruments and collects their sensor data. This data is processed further and then made available on the network for use in controlling media synthesis. We use an open-source, cross-platform software library called libmapper (a long-term project of the IDMIL’s – more info at www.libmapper.org) to make all of the sensor data discoverable by other applications and to support the task of “mapping” the sensor, instrument and gesture data to the parameters of media synthesisers.
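As a flavour of what such mapping involves, here is a toy Python sketch: named sensor signals are rescaled into synthesiser parameter ranges and routed through a lookup table. It illustrates the concept only and deliberately avoids imitating libmapper’s actual API; every signal name and range below is invented for illustration.

```python
def linear_map(value, src_lo, src_hi, dst_lo, dst_hi):
    """Rescale a sensor value into a synthesiser parameter range, clamped."""
    t = (value - src_lo) / (src_hi - src_lo)
    return dst_lo + max(0.0, min(1.0, t)) * (dst_hi - dst_lo)

# Each route: source signal -> (destination parameter, source range, dest range)
mappings = {
    "spine/bend":  ("synth/filter_cutoff", (0.0, 1.0), (200.0, 8000.0)),
    "rib/touch_3": ("synth/grain_density", (0.0, 1.0), (1.0, 60.0)),
}

def dispatch(signal_name, value, send):
    """Forward one incoming sensor value to its mapped synth parameter."""
    if signal_name in mappings:
        dest, src_range, dst_range = mappings[signal_name]
        send(dest, linear_map(value, *src_range, *dst_range))

# Example: print instead of sending over the network
dispatch("spine/bend", 0.5, lambda dest, v: print(dest, round(v, 1)))
```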
The use of digital fabrication technologies allowed us to quickly iterate through variations of the prototypes. To start out, we used laser-cutters at the McGill University School of Architecture and a 3D printer located at the Centre for Interdisciplinary Research in Music Media and Technology (CIRMMT). As we moved to production we outsourced some of the laser-cutting to a commercial company.
Kate Andrews: How did collaboration across disciplines of design, music and technology change and shape the project?
Ian Hattwick: From the very beginning of the project, the three artistic teams worked together to shape the final creations. In the first workshop, we brought non-functional prototypes of the instruments, and the dancers worked with them to find compelling gestures, while we tried a variety of shapes and forms and the composers thought about the kind of music the interaction of dancers and instruments suggested.
Later in the project, as we tried a variety of materials in the construction of the instruments, each new iteration would suggest new movements to the dancers and choreographer. Particularly, as we moved to clear acrylic for the basic material of the ribs, the instruments grew larger in order to have a greater visual impact, which suggested to the dancers the possibility of working with gestures both within and without the curve of the ribs.
These new gestures in turn required the ribs to have a specific size and curvature. Over time, the dancers gained a knowledge of the forms of the instruments which gave them the confidence to perform as if the instruments were actual extensions of their bodies.
Component tests
Kate Andrews: How was 3D printing used during the project – and why?
Ian Hattwick: We used 3D printing for a variety of purposes in this project. One of the primary uses was for solving mechanical problems – such as designing the mounting system for the instruments.
We tried to find prefabricated solutions for attaching the instruments to the costumes, but were unable to find anything that suited our purposes, so we designed and prototyped a series of clips and mounts to find the shapes that would be easy for the dancers to use, that would be durable, and that would fit our space constraints.
In addition, 3D printing quickly became a tool we used any time we needed a custom-shaped mechanical part. Examples include a threaded, removable collar for mounting the PETg rod to the Spine, and mounting collars and caps for the lighting in the Spine.
Instrumented Bodies – digital prosthetics sketches
Kate Andrews: Where do you see this technology being used now?
Ian Hattwick: 3D printing, or additive manufacturing as it is known in industry, is increasingly commonplace. In the research community, we’ve seen applications everywhere from micro-fluidic devices to creating variable acoustic spaces. One of my favourite applications is the creation of new homes for hermit crabs.
Kate Andrews: Can we expect to see other live performances using the instruments?
Ian Hattwick: We are currently working with the instruments ourselves to create new mappings and synthesis techniques, and in October we will be bringing them to Greece to take part in a ten-day experimental artist residency focusing on improvisation. We’ve also been talking with a variety of other collaborators in both dance and music, so we expect to have quite a few different performances in the next year.
Kate Andrews: What do you think is the future for interactive and wearable technology?
Ian Hattwick: I’m really excited about the coming generations of constantly worn health monitors, which represent the first widespread adoption of the ideas of the “quantified self” movement. I expect that in a relatively short time it will be normal for people to maintain logs of more than just their activity, heart rate, or sleep patterns, but also the effect of their mood and environment on their body. I’m also excited about e-textiles, clothing which can change its shape or visual appearance.
One of the ways in which I see the prosthetic instruments making a real contribution is the idea that technological devices should be made to accommodate the human body, and not the other way around. Particularly, you see musical instruments created so as to be easy to mass-manufacture, rather than seeking to identify and support natural physical expressions during musical performance. At the same time, by creating technologies which are invisible to the performer we take away the physical interaction with an instrument which is so much a part of how we think about performance, both individually and in ensembles.
Kate Andrews: Does this present a new future for music? For dance?
Joseph Malloch: There is no one future for music or dance, but we can always count on new technologies being adapted for art, no matter their intended purpose.
Ian Hattwick: In interactive dance, the paradigm has always been capturing the unencumbered motion of the dancer; in music, there tends to be a fetishisation of the instrument. So in a sense, the idea of prosthetic instruments challenges the existing norms of those art forms. Certainly, using the prosthetic instruments requires a different conceptualisation of how we can perform dance and music at the same time.
The challenges of working with prosthetic instruments can be strongly appealing, however, and the level of mechanical sophistication which is provided by new generations of digital manufacturing will create opportunities for artistic exploration.
Just as we’ve seen an explosion of DIY musical instruments and interactive art based on open-source electronics, perhaps we will see an explosion of DIY mechanical devices which create new ideas of how we use our body to interact with technology.
Kate Andrews: What are you working on now?
Ian Hattwick: Documentation: We work in academia, and publication of in-depth documentation of our motivations, design choices, and insights gained throughout the process of development is an important part of the work. We are part of a much larger community of researchers exploring artistic uses for new technologies, and it is important that we share our experiences and results.
Mapping: The programmable connections between the gestures sensed by the instruments and the resulting sound/media really define the experiences of the performers and the audience. We are busy finding new voices and modes of performance for the prostheses.
Improvements to hardware and software: In particular, sensing technology advances very quickly, with price, quality, and miniaturisation constantly improving. There are already some new tools available now that we couldn’t use three months ago.
3D printing musical instruments: We are talking with a 3D printer manufacturer about developing acoustic instruments which are entirely 3D printed, and which take advantage of the ability to manipulate an object’s internal structure as well as radically re-imagining the forms which musical instruments can take.
Graduate designer Mugi Yamamoto has designed an inkjet printer that sits on top of a stack of paper and eats its way down through the pile (+ slideshow).
The compact Stack printer is simply placed on top of a pile of A4 paper, rather than requiring paper to be loaded into the device in batches. The sheets are fed through rollers underneath the machine and exit on top.
Yamamoto told Dezeen that his intention was to reduce the space taken up by a printer. “Thanks to this new way of printing it is possible to remove the paper tray, the bulkiest element in common printers,” said Yamamoto. “This concept allows a very light appearance and avoids frequent reloading.”
The designer looked at commercial printers and modified existing mechanisms to create the working prototype.
The printed paper creates a new pile on top of the machine. “It’s not endless – it might go up to maybe 200 sheets of paper,” Yamamoto told Dezeen.
Yamamoto completed the project while studying industrial and product design at Ecole Cantonale d’Art de Lausanne (ECAL) in Switzerland. He was also selected as one of ten young designers to exhibit at this year’s Design Parade 8 at Villa Noailles in Hyères, France.
The designer was born in Tokyo and is currently undertaking a design internship in Nürnberg, Germany.
These magnetic headphone jacks by New York designer Jon Patterson split in two when tugged to prevent damaging devices when wires get snagged (+ movie).
“I always break my headphones from cord snagging and sometimes I break my device completely,” Jon Patterson said.
His Pogo connector comprises two parts joined by magnets – one with a jack that fits into the headphone socket on the device, and a second longer piece that accommodates the jack from the headphones.
The signal is transferred between the two parts via four pins, but once the cord is yanked away they disconnect and the music stops until the sections are reconnected. “The magnet is strong enough to hold the device but will break upon force,” says Patterson in the video demonstration.
It can be used as a straight connection or at a ninety-degree angle, where it can fully rotate.
Jack sections can be left in devices and a receiver piece can be kept on the headphones, so swapping between different equipment is simple.
Apple’s launch of a cut-price iPhone last week – complete with blanket media coverage and the requisite 5am queuing by obsessives – was a reminder of what an insular world the tech industry is. With a starting price of £469, even the budget version of the iPhone is well beyond the means of most people on the planet. This fact hit home a few days later when I went to hear Indian entrepreneur Suneet Singh Tuli speak at the Victoria & Albert Museum in London. Tuli is the man behind the Aakash tablet computer. The Aakash 4 launches soon and, though it has greater processing power than an iPad, it is ten times cheaper with a price tag of just £40.
Given Silicon Valley’s self-professed faith in the socially transformative power of technology, why does it show so little interest in trying to reach those who are most socially disadvantaged? The obvious answer is because the socially disadvantaged have no money. Yet, if you imagine reaching a market of a billion people who may be able to muster £40 for a tablet that will connect them to the internet – “the most powerful medium society has ever seen,” as Tuli puts it – you’d think there would be enough of a financial, let alone social, incentive.
Tuli, the Punjab-born and Canadian-educated CEO of Datawind, headquartered in London’s North Acton, can see the potential. He has his sights on the three billion people who have cell phones but no access to the internet. The barrier to entry, as he sees it, is not network coverage but price. Smartphones and tablet computers are out of their league. And yet, even in the US, personal computers only became commonplace once their price had dropped to roughly one week’s salary, which happened in the 1990s. That fact made Tuli realise that in order to reach the billion people living on less than £150 a month, he would need to create a tablet that retailed for about £30.
The way Datawind approached that goal was by embracing the concept of making something “good enough”. “Inexpensive and good beats expensive and great,” says Tuli. If that sounds like he’s damning his own product with faint praise, let’s remind ourselves of just how much we have all bought into the concept of “good enough”. We abandoned CDs for MP3 files, we watch pixellated videos on YouTube, we snap away with our phones even though we have digital cameras and we arrange Skype meetings knowing full well that the phrase “I’ve lost you” will feature prominently. In short, we favour convenience and instant gratification over high fidelity.
So, having briefly handled an Aakash 4 – or an Ubislate as it’s known in western markets – I can tell you that its shell is not as finely wrought as an iPad’s and its interface not as graceful. It does, however, have a 1.5 GHz processor that is more powerful than the latest iPad’s. Tuli abandoned some common tablet features, like an HDMI port, “because my customers don’t need to be able to hook up to a big plasma screen, so there’s no point spending an extra 11 cents on that port,” he says. Big deal.
The question you’re probably asking yourself is: why does India’s largely rural population need one of these things? Tuli’s answer is education. Of the 360 million children in India, only 219 million are in education. That’s twice the population of the UK not receiving any schooling, and many millions more are being taught to a substandard level. India has a shortage of qualified teachers, and the qualified ones are not desperate to work in rural villages.
I’ll confess that I was sceptical at first. I do not believe that a tablet computer replaces a teacher. Connect a child to the internet and you offer her a wonderful support system, but who’s to say what that child is actually doing online? “We need to connect them to the power of the MOOC [massive open online course],” says Tuli, not altogether convincingly. However, when he pointed out that the Indian government can supply Aakash tablets for less than it costs to print the necessary schoolbooks, I started to get the message. Indeed, Tuli claims the government is working on plans to distribute 220 million tablets – one for every student in the country.
But is the Aakash just another false promise? Yves Behar’s One Laptop Per Child programme seemed to offer the same potential, was feted by a wide-eyed media and scooped up awards, but ultimately failed to live up to expectations. Part of the problem was that it never actually reached its targeted $100 price tag, but there were also frankly discouraging tales of Cambodian villagers using the OLPC as a lamp. “It turns out the killer app was light,” says Tuli, with no little schadenfreude. Nevertheless, he may well end up collaborating with OLPC on the educational programme.
So what makes the Aakash different? Is Tuli just another techno-determinist who’s imbibed too much of the Silicon Valley Kool-Aid? Worse, is the social agenda a convenient cover for what is ultimately an entrepreneurial venture? Now that I come to think of it, how does he make these tablets so cheap in the first place? The Kindle Fire sells at £129, which is £30 less than it costs to manufacture – money Amazon can afford to lose because what it’s really selling is not hardware but content. Yes, Tuli cut out the unnecessary ports and features, and he negotiated a good deal on the touchscreens (the most expensive part of any tablet), but the Aakash still seems to do most of what an iPad can do, so there is presumably some very cheap labour involved that he has failed to mention.
Let’s put that aside for now, along with any qualms about the environmental impact of a billion tablets, which Tuli calls “a necessary evil” in comparison to battling illiteracy and ignorance (which I think he may be right about). Looking at the big picture, we see a massive emerging market for devices that will connect people to the knowledge resource that is the internet. India, where 800 million people use cell phones but can’t go online, is such a market. In 2011 Indians bought 250,000 tablets (mainly Apple and Samsung). The following year it was more than 3 million (mainly Aakash). In fact, Datawind fell far short of being able to keep up with demand.
Apple and Samsung may not have time for this market but they should be worried by it, because Indians are not the only ones interested in a £40 tablet. In fact, Tuli was swamped after his lecture. It’s customary at these things for a few keen audience members to mill around with an extra-time question, but this was fully half the lecture theatre. People were crowding round for a glimpse of this gadget. It was not their social consciences that drove them forward but pure consumer instinct. The air was heavy with musk.
Soon, Canadians will be able to buy an Ubislate for 37 Canadian dollars. If it’s “good enough” for them, then companies like Apple and Samsung will have to change their game rather fast. It will also suggest that India is now the place to look for disruptive innovation. The warning signs are already here. Last week Microsoft bought back £24 billion of its own shares. Earlier this year, Apple bought back £62 billion of shares. Instead of investing their cash in research, they’re giving it away to their shareholders. That, according to business thinkers like Clay Christensen, is the beginning of the end. As he said on the BBC‘s Newsnight programme last week, “Nokia is essentially gone, Blackberry is essentially gone and now Apple is next.”
For once, those catering to the so-called “other 90%” stand to gain. “Three billion users should be a big enough market but the big companies don’t want to go near it,” says Tuli. “That’s why disruption happens.”
News: researchers at Bristol University in the UK have developed a way for users to get tactile feedback from touchscreens while controlling them with gestures in mid-air.
The UltraHaptics setup transmits ultrasound impulses through the screen to exert a force on a specific point above it that’s strong enough for the user to feel with their hands.
“The use of ultrasonic vibrations is a new technique for delivering tactile sensations to the user,” explained the team. “A series of ultrasonic transducers emit very high frequency sound waves. When all of the sound waves meet at the same location at the same time, they create sensations on a human’s skin.”
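The focusing step can be illustrated with a little geometry: each transducer’s emission is delayed so that its wavefront arrives at the focal point at the same instant as all the others. The Python sketch below shows the standard phased-array delay calculation; the array layout and names are assumptions for illustration, not the UltraHaptics hardware.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air

def focus_delays(transducers, focal_point):
    """Per-transducer firing delays (seconds) so wavefronts converge at focal_point."""
    dists = [math.dist(t, focal_point) for t in transducers]
    farthest = max(dists)
    # The farthest transducer fires first (delay 0); nearer ones wait.
    return [(farthest - d) / SPEED_OF_SOUND for d in dists]

# Example: a 4-element strip along x, focusing 20 cm above its centre
array = [(x * 0.01, 0.0, 0.0) for x in range(4)]
print(focus_delays(array, (0.015, 0.0, 0.2)))
```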
The Bristol University team wanted to take the intuitive, hands-on nature of touchscreens and add the haptic feedback associated with analogue controls like buttons and switches, which they felt was lacking from flat glass interfaces.
“Current systems with integrated interactive surfaces allow users to walk up and use them with bare hands,” said Tom Carter, a PhD student working on the project. “Our goal was to integrate haptic feedback into these systems without sacrificing their simplicity and accessibility.”
Varying the frequency of vibrations targeted to specific points makes them feel different to each other, adding an extra layer of information over the screen.
One application of this, demonstrated in the movie, shows a map where the user can feel the air above the screen to determine the population density in different areas of a city – a higher density is represented by stronger vibrations.
In addition to gaining haptic feedback, the user’s hand gestures can be tracked using a Leap Motion sensor to control what’s on the screen, rather than touching the screen itself.
The example given in the movie shows the controls of a music player operated by tapping the air above the play button and pinching the air above the volume slider. The ultrasound waves directed at these invisible control points in the air pulse when operated to let the user feel they are engaging with them, so they can operate the system without looking.
Design Academy Eindhoven graduate Bernal has created a toy car that only moves when its user reaches and sustains a high level of concentration. To drive the vehicle, users wear an electroencephalography (EEG) headset that measures electrical activity within the neurones of the brain and converts these fluctuations into signals that control the toy car. “As you try to focus, the increased light intensity of the vehicle indicates the level of attention you have reached,” explained Bernal. “Once the maximum level is achieved and retained for seven seconds, the vehicle starts moving forward.”
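That behaviour amounts to a small piece of gating logic. Here is a minimal Python sketch of it, assuming a NeuroSky-style attention score of 0-100; the function names and the exact scale are illustrative guesses rather than Bernal’s own code.

```python
MAX_ATTENTION = 100      # assumed NeuroSky-style attention scale of 0-100
HOLD_SECONDS = 7.0       # the sustained peak required before driving

def update(attention, held_time, dt):
    """One control step: returns (light_brightness 0-1, drive_motor, held_time)."""
    brightness = attention / MAX_ATTENTION   # light intensity mirrors focus level
    if attention >= MAX_ATTENTION:
        held_time += dt                      # keep timing the sustained maximum
    else:
        held_time = 0.0                      # any lapse in focus resets the timer
    return brightness, held_time >= HOLD_SECONDS, held_time
```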
Bernal developed his project to help users train themselves in overcoming concentration problems associated with attention deficit disorders. “This project helps users to develop deeper, longer concentration by exercising the brain,” the designer told Dezeen. “It is possible for people to train or treat their minds through their own effort, and not necessarily by using strong medicines such as Ritalin.”
His design uses the fluctuating light levels to visualise the level of attention a user achieves in real time, and rewards above-average concentration by setting the car in motion. “I call this an empiric neuro-feedback exercise that people can do at home,” he says. “The user can’t feel anything tactile, but he will be able to visualise the behaviour of the brain.”
As part of his research for the project, Bernal visited the Dutch Neurofeedback Institute, where EEG is already used for the treatment of attention disorders, and found that “they tend to use software and digital interfaces as feedback, even though ADHD patients are the most likely individuals to develop addictions to TV, video games and computers.”
“My project is basically a new way of employing EEG technology in an analogue way because, from my personal experience, that’s more relevant for the people who can actually benefit from this technology,” he added.
The working prototype comprises a commercially available headset developed by American firm Neurosky, which has one dry electrode on the forehead and a ground on the earlobe, and the toy car that he developed and designed himself.
“The headsets are available to the public for €100 and I find the accessibility very positive, but at the moment the only way to work with them is by using a computer and performing a digital task or game,” he said.
The toy car itself is made of aluminium with a body in semi-transparent acrylic so the lights show through from the inside. “The shape is inspired by a brain synapse,” said Bernal. “I wanted to achieve a fragile-looking toy, something you have to take care of that’s complex but understandable. At the end of the day it’s not a toy but a tool to train your brain.”
Bernal has just graduated from the Man and Leisure department at Design Academy Eindhoven and showed his project at the graduation show as part of Dutch Design Week this month.
News: Google has announced it is poised to acquire domestic technology firm Nest for $3.2 billion, a move that will increase the tech giant’s presence in the home.
Google made the announcement yesterday that it is to pay $3.2 billion (£2 billion) for Nest, which was founded in 2010 by former Apple executives Tony Fadell and Matt Rogers. The company produces network-enabled home infrastructure such as thermostats and smoke alarms that can be controlled from smartphones.
The acquisition is the second largest in Google’s history, after its takeover of Motorola Mobility in 2011. The move suggests Google is rushing to achieve the creation of a connected home, where objects and appliances monitor residents’ behaviour and communicate with each other to adjust the domestic environment.
Integrating such a system in homes before rivals like Apple would force subsequent products to rely on Google’s technology and protocols.
Despite Google’s ownership of the Android operating system, Nest assured users in a statement on the company’s website that its technology will remain compatible with Apple’s iOS software and other web browsers.
After the acquisition, Nest will continue to be led by Fadell under the same name and branding. The closing of the deal is subject to conditions and approvals, but is expected in the next few months.
Nest Protect smoke alarm
The Nest Thermostat (main image) is designed to learn what temperatures a resident likes their home to be and builds a personalised schedule by picking up on routines. The thermostat can be adjusted using a smartphone app, allowing home owners to control their energy usage remotely.
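As a rough illustration of this kind of schedule learning, the Python sketch below logs manual adjustments by time slot and adopts the recurring choice as the scheduled setpoint once a routine emerges. It is purely a toy under those stated assumptions; Nest’s actual algorithm is proprietary and far more sophisticated.

```python
from collections import defaultdict
from statistics import median

adjustments = defaultdict(list)   # (weekday, hour) -> temperatures chosen

def record_adjustment(weekday, hour, temp_c):
    """Remember that the resident set temp_c at this time slot."""
    adjustments[(weekday, hour)].append(temp_c)

def scheduled_setpoint(weekday, hour, default=18.0):
    """Use the median of past choices once a routine emerges."""
    history = adjustments[(weekday, hour)]
    return median(history) if len(history) >= 3 else default

# Example: three Monday-evening adjustments establish a routine
for temp in (21.0, 21.5, 21.0):
    record_adjustment("Mon", 18, temp)
print(scheduled_setpoint("Mon", 18))  # 21.0
```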
The firm’s recently launched smoke alarm, Nest Protect, can also detect carbon monoxide, gives an early warning using lights and speech, and can be silenced with the wave of a hand. It too connects with smart devices to alert the user when it is activated.
“[Nest is] already delivering amazing products you can buy right now – thermostats that save energy and smoke alarms that can help keep your family safe,” said Google CEO Larry Page. “We are excited to bring great experiences to more homes in more countries and fulfil their dreams.”
“With [Google’s] support, Nest will be even better placed to build simple, thoughtful devices that make life easier at home, and that have a positive impact on the world,” said Fadell.
A team in Tokyo has created a head-mounted camera that monitors brain waves and automatically starts recording when the wearer becomes interested in something (+ movie).
Developed by Tokyo company Neurowear, the Neurocam headset monitors electrical activity in the brain. When the user sees something that causes a spike in brain activity, it automatically triggers a smartphone camera mounted on the side to start recording a five-second clip of what the user is looking at.
Users download an app on their iPhone, which is then slotted into a harness on the side of the headset. A prism then directs the camera’s lens to look forward at whatever the wearer sees.
The algorithm that powers Neurocam was developed by Professor Mitsukura of Keio University. Everything the wearer sees, and the subsequent reaction in the brain, is quantified on a scale of zero to 100. When the user sees something to which the algorithm allocates a score above 60, the headset begins recording. The clips can then be shared on social networks such as Facebook, or viewed at a later date.
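In outline, that trigger logic is very simple. The Python sketch below captures it, with the threshold and clip length taken from the figures above; the camera stub and function names are hypothetical, not Neurocam’s software.

```python
class Camera:
    """Stub standing in for the phone camera mounted on the headset."""
    recording = False
    def record_clip(self, duration):
        print(f"recording {duration}s clip")

RECORD_THRESHOLD = 60   # interest scores run 0-100, per the article
CLIP_SECONDS = 5        # length of each automatically captured clip

def on_interest_score(score, camera):
    """Trigger a clip when the quantified interest crosses the threshold."""
    if score > RECORD_THRESHOLD and not camera.recording:
        camera.record_clip(duration=CLIP_SECONDS)

on_interest_score(72, Camera())
```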
“The Neurocam is an extraordinary experiment that challenges the way future cameras can evolve and how humans may interact with such devices,” the team said. “The Neurocam allows human emotions to become integrated with devices, and we see this as a totally new experience.”
The team is considering adding extra software features to enhance the user experience. Manual Mode would add emotional tags to the scenes the Neurocam records in the same way it adds GPS and location data. Effect Mode, meanwhile, would automatically overlay filters and visual effects based on how the user was feeling at the time.
They are also exploring how to make the headset more wearable. “In the future, we aim to make the device smaller, comfortable and fashionable to wear,” they said.
While still a prototype at present, the project is being backed by Japanese ad agency Dentsu in a joint venture called Dentsu ScienceJam. They believe the Neurocam has a number of applications relevant to advertising and marketing, including helping to determine which products people are interested in within a retail environment.
Another possible use for the headset could be to aid urban planning, since the information about interest levels can be overlaid with mapping and GPS data.