Some of you may have noticed that words like rhythm, texture, and pattern can describe both fabrics and sound. Focused on building an interface as a whole, using mostly textiles, OCHO TONOS invites the user to interact through touch and experience sound in a multi-sensorial way. Ocho Tonos is an interactive installation by the EJTech duo (Esteban de la Torre and Judit Eszter Kárpáti), whom I met last July during the e-textile summer camp while they were working on this experimental textile interface for tactile/sonic interaction by means of tangibles.
Archive for the ‘ADK’ Category
At Google I/O, Google unveiled the ADK2012, its new Accessory Development Kit for Android mobile phones and tablets.
The ADK2012 is based on the upcoming Arduino Due platform and the new Arduino IDE, which supports programming ARM targets.
Currently the IDE works only with the Google ADK board released at Google I/O, while the official launch of the Arduino Due is due later in the year.
In his blog, Charalampos describes his experience with SeeedStudio’s Grove Ear-clip Heart Rate sensor and the Cosm (formerly Pachube) cloud service. The sensor is quite cheap and detects heart pulses at the ear lobe by measuring the infrared light reflected by the tissue and checking for intensity variations.
By connecting this sensor to an ADK board and, in turn, to an Android smartphone, Charalampos implemented a portable heart-rate tracker that sends the recorded data to the Cosm cloud service.
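The arithmetic behind such a tracker is simple: the ear-clip sensor raises a digital pin once per detected heartbeat, so the rate is 60,000 ms divided by the average inter-beat interval. Here is a minimal sketch of that logic in Python (the real project runs as an Arduino sketch plus an Android app; the function name and averaging window here are my own assumptions):

```python
def bpm_from_beat_times(beat_times_ms, window=10):
    """Estimate heart rate from timestamps (in ms) of detected beats.

    An Arduino sketch would timestamp each rising edge from the
    ear-clip sensor and apply the same arithmetic.
    """
    if len(beat_times_ms) < 2:
        return None  # not enough beats to estimate a rate
    recent = beat_times_ms[-(window + 1):]          # last `window` intervals at most
    intervals = [b - a for a, b in zip(recent, recent[1:])]
    avg_interval = sum(intervals) / len(intervals)  # ms per beat
    return 60000.0 / avg_interval                   # beats per minute

# Beats exactly 800 ms apart correspond to 75 BPM
print(bpm_from_beat_times([0, 800, 1600, 2400, 3200]))  # 75.0
```

Averaging over the last few intervals smooths out the jitter you inevitably get from an optical sensor clipped to an ear lobe.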
For more information and sample code, see here.
[Via: Building Internet of Things]
The next time you organize a barcamp, you can display this cool hack for two great reasons: the obvious one being to look cool, the other being, well, that there are Arduino and beer involved, and both are cool.
Aptly named KegDroid, this beautiful beer dispenser, built by Google employee Paul Carff, uses an NFC reader to identify users and dispense beer.
It’s not just the beer dispensing that sounds awesome to us, oh no. KegDroid is packed to the brim with all sorts of clever tech, mostly from Google’s labs. The giant robot has a Motorola XOOM tablet running Android 4.0 Ice Cream Sandwich jammed into its belly, running a custom app designed by Paul.
An NFC reader sits between the two drip-catchers and interfaces with specially made NFC tokens, sending a message to an Arduino board inside the case, which then controls the beer pouring from the taps situated in the arms of the Android robot.
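The token-to-tap flow is a simple lookup plus a timed valve command. KegDroid’s actual app and message format aren’t public, so the token UIDs, tap mapping, and valve timing below are entirely invented; this is just a sketch of the kind of dispatch the tablet app might do before messaging the Arduino:

```python
# Hypothetical values: real KegDroid internals are not published.
TOKEN_TO_TAP = {"04a1b2": 1, "04c3d4": 2}  # NFC token UID -> tap number
POUR_MS_PER_ML = 12                         # assumed valve-open time per ml

def pour_command(token_uid, volume_ml=330):
    """Build the command an Android app might send to the Arduino
    that drives the tap valves (one message per pour)."""
    tap = TOKEN_TO_TAP.get(token_uid)
    if tap is None:
        return None  # unknown token: no beer for you
    return {"tap": tap, "open_ms": volume_ml * POUR_MS_PER_ML}

print(pour_command("04a1b2"))  # {'tap': 1, 'open_ms': 3960}
```

The Arduino side then only has to open the requested valve for `open_ms` milliseconds, which keeps the smarts on the tablet and the firmware dumb and reliable.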
So could the next step be a humanoid beer robot that fetches beer to our table when we tweet @RestaurantName with #tablenumber?
If you envision an open future, sustainable development through community contribution seems to be the way forward. With open platforms across both hardware and software freely interacting with each other, car makers are finding a viable way to improve the already developing intelligence of cars.
Ford (and other automakers) envisions future cars with high-tech infotainment systems galore, where car dashboards could have downloadable apps just like today’s smartphones and tablets. With the OpenXC platform, Ford is creating a channel for open collaboration with third-party application developers, allowing them to use cars like the Ford Focus to prototype their gizmos.
The OpenXC platform is an open source hardware and software stack that allows third parties to connect self-created gadgets to an OpenXC-compliant car.
If “your car is as easy to program as your smartphone,” it stands to reason that future cars could generate as much innovation and excitement as today’s smartphones do.
The company announced last week that it was making the OpenXC source code available, in beta form, to developers and universities around the world. Ford demonstrated a sample third-party mobile app created with the OpenXC toolkit at the NASSCOM India Leadership Summit, held last week in Mumbai, India.
The OpenXC platform is being developed in collaboration with Bug Labs, a New York-based developer of small computer hardware building blocks meant to help organizations build the “Internet of Things.” The concept looks toward a day when all objects have embedded computing, with ubiquitous connections to the Internet to share data and information, enabling large-scale applications to be built on the data coming from all the connected gizmos.
The documentation on the OpenXC Platform website describes installing a small hardware module and attaching it to the OBD-II port so the module can read CAN bus messages. The hardware module bridges the OBD-II/CAN bus to the more common USB interface and sends data from the car to the software running on the OpenXC software platform. The software part of the OpenXC platform runs on Arduino or Android, and provides measurements of vehicle operation such as brake pedal status, engine speed, latitude and longitude, steering wheel angle, and vehicle speed. The documentation does not provide methods for a software application to send commands to the car, only to receive data from it.
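On the wire, OpenXC streams its measurements as simple newline-delimited JSON objects carrying a signal name and a value (e.g. `{"name": "engine_speed", "value": 1200.0}`); the exact set of signals varies by vehicle and firmware. A minimal sketch of consuming such a stream, keeping the latest reading per signal:

```python
import json

def parse_openxc_stream(raw):
    """Parse newline-delimited OpenXC-style measurement messages
    into a dict of the latest value seen for each signal name."""
    readings = {}
    for line in raw.splitlines():
        line = line.strip()
        if not line:
            continue
        msg = json.loads(line)
        readings[msg["name"]] = msg["value"]  # keep only the newest value
    return readings

stream = ('{"name": "engine_speed", "value": 1200.0}\n'
          '{"name": "steering_wheel_angle", "value": -12.5}\n'
          '{"name": "engine_speed", "value": 1350.0}\n')
print(parse_openxc_stream(stream))
# {'engine_speed': 1350.0, 'steering_wheel_angle': -12.5}
```

Since the platform is read-only, an app built on this can display or log vehicle state but never actuate anything, which matches the documentation described above.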
This is unlikely to result in consumer applications right away, if only because interfacing with the OBD-II port is not exactly a user-friendly experience. Ford is positioning this as outreach to application developers. Ford asks us to ponder questions like these: What if “user-facing hardware and software” (such as the dashboard) were based on open software stacks, where car owners could purchase and install add-ons as easily as they buy smartphone apps today? What if the infotainment systems were easily user-upgradeable? What if you could transfer a high-tech gizmo easily from car to car?
This impressive piece of automotive hacking makes a FIAT move according to the number of “likes” on the Guaraná Antarctica Facebook fan page. The advertising idea is simple: let the social audience support this São Paulo-to-Salvador trip to reach the Carnival by commenting on or “liking” the page. The onboard Arduino ADK (connected to a tablet and the Internet) lets the car advance a certain number of meters (apparently one “like” is worth 10 meters, while each comment moves the car ahead 20 meters).
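The like-to-distance conversion is just a linear tally; a quick sketch of it, using the 10 m and 20 m figures quoted above (the function name is mine, not the campaign's code):

```python
# Conversion quoted for the campaign: one "like" = 10 m, one comment = 20 m.
METERS_PER_LIKE = 10
METERS_PER_COMMENT = 20

def distance_earned(likes, comments):
    """Meters the car is allowed to advance for a given amount
    of social activity on the fan page."""
    return likes * METERS_PER_LIKE + comments * METERS_PER_COMMENT

# e.g. 150 likes and 30 comments buy 2.1 km of road
print(distance_earned(150, 30))  # 2100
```

The tablet presumably polls the page for fresh totals and tells the Arduino how far it may drive before stopping again.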
Stahl Stenslie, a Norwegian artist, PhD, and professor at the Oslo School of Architecture, has been working with the guys at 1scale1 for about a year on a wearable device. The prototype’s latest version uses an Arduino ADK together with an Android phone. It is a location-based sculpture + soundscape: when you reach a certain place, it starts playing binaural sounds, and an array of motors embedded in the garment, a coat in this case, gives you physical feedback.
The project can be named Sense Memory […] the binaural sounds and haptic patterns were made as a response to the 22/7 terror.
However, while trying out the coat at Oslo’s main square, they were approached by the police:
Stahl was trying out his piece together with the director of the center curating and commissioning it. In the pictures you can see Stahl (2 m tall) adjusting the coat on his partner, the latter going for a walk, and the moment when the police come in and stop them from testing the prototype.
[They] held us in custody on the spot for half an hour, positioning themselves around and behind us as if we were expected to do something violent towards them. It was all a pretty violent experience for us. […] Interesting they would accuse us for being terrorists and potential suicide bombers. Our prime minister called for a more open society, a society with more democracy and creativity. But the police seem determined to go ahead with their own agenda.
I have not been contacted by the police for an interrogation yet. But they said we had been reported – which again means they somehow want to punish us.
The main newspaper in Norway had announced the event in an article the day before. However, that wasn’t enough to convince the police of their non-violent intentions.
Our show was therefore announced publically as well. Not that it matters for the police, but, hey, ask before you shoot…
Stahl closed the call with a statement about the current state of the arts and the fear generated by terrorism and violence in any form all around the world:
Concerning the use of new, experimental media: if you want to use your handheld device, use a different design of your clothing like our haptic bodysuit or otherwise behave differently than ‘normal’ people, well, you’re most likely a terrorist. That’s the impression you get.
I seriously hope the court will dismiss any charges against Stahl and his partner regarding this matter. 1scale1 worked really hard to put together a meaningful and enjoyable experience for the audience. It is really sad it turned out this way.
Check out all the pictures we got from this case by clicking here.
What better option than Android and Arduino when we think of hacking phones and interfacing them with sensors?
As a member of illutron, Mads Høbye, a MEDEA PhD student in interaction design, was asked by Sony Ericsson to challenge the more conventional usage of mobile technology by exploring alternative usage scenarios. He called in a combination of artists, geeks, and tinkerers for a four-day workshop.
The Android platform proved to be a great stepping stone in that direction. During the workshop we managed to use the phones in multiple ways, taking advantage of embedded technologies like GPS, compass, Wi-Fi, GSM/3G, accelerometers, and the touch screen, and connecting them to the Arduino platform.

The compressed format of the workshop proved to be fruitful for revealing new openings and possibilities, pushing the boundaries of the normal perception of what constitutes a phone and how it should be used. From a research-through-design perspective, the resulting prototypes work as conversation pieces around what constitutes material media and how we can design position-aware devices that are constantly connected to each other.