7/07/2016

ESP32 Article

ESP32 in Make: Magazin Germany

I wrote an article for the German edition of Make: Magazin (so yes, it's only available in German) about the upcoming ESP32. It covers the differences to its "little brother", the ESP8266. I also explain how to install and work with both the Arduino IDE and Eclipse and give some basic programming examples.

Basic electronic setup of ESP32

According to Espressif the ESP32 will be available in August/September. It looks like we might have to wait a few more months for developer boards, though.

6/02/2016

ESP32 programming examples

Some code examples for the ESP32 (ESP31B) beta developer board from Espressif


ESP31B developer board
I'm currently evaluating the ESP31B developer board (ESP-WROOM-03) I got from Espressif in more depth and have begun to write some example code for the new ESP32 chip, which is due in August/September according to the chip manufacturer.

You can find the code in my github account here. It's a work in progress and I will add more code over time.
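
To give an idea of what such examples look like, here is the classic first test for a fresh board: a blink sketch compiled with the Arduino core. The LED pin is only an assumption for illustration; the actual examples live in the repository linked above.

```cpp
// Minimal blink example for the ESP31B beta board compiled with the Arduino core.
// GPIO 5 is only an assumption -- use whichever pin your LED is actually wired to.
const int LED_PIN = 5;

void setup() {
  pinMode(LED_PIN, OUTPUT);       // configure the LED pin as an output
}

void loop() {
  digitalWrite(LED_PIN, HIGH);    // LED on
  delay(500);
  digitalWrite(LED_PIN, LOW);     // LED off
  delay(500);
}
```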

I posted some high-resolution photos in this blog entry.

5/26/2016

9 reasons why "Big Data", "IoT" and "Smart Home" might be so dangerous...

...or at least delicate


Not only buzzword paradise (for the big companies) - but big money

Data streams

Having worked for years on making "things" communicate with "machines" and thereby embedding them in our augmented lives, there is always a certain discomfort about the consequences of one's actions.

I won't dive into any horror scenario resulting from this or serve as a template for another (American) dystopian movie.
Let us instead look at hard facts in the form of ONE single data stream, available only as a graph, so that you can judge for yourself what a realistic outcome of the "Internet of Things" could be - a world in which billions and billions of devices are already connected.

Day 18 since seed
As you can read in my blog, I'm currently tinkering with the new IKEA Krydda indoor gardening system. Shouldn't nature be allowed to run its course? Sure, but adding electronics is always fun and often gives much more insight into things you didn't see before (did you ever take a thermal picture of something? See examples here). I'm putting the data into the "Internet of Things" here, so you can see 24/7/365 what's happening in my gardening paradise at home.


Day 18 since seed
So let's see what we can tell from one and only one graph ("data stream"). For a start, we assume we don't even have absolute numbers - unlike what YOU deliver every day, 24/7/365, with your smartphone, PC, car etc. sending your life ("numbers") to the big data centers in Silicon Valley and everywhere else in the world, analyzed by the smartest algorithm programmers available and the fastest electronic brains in existence.

ONE graph


1. Flatline

Nothing happening and therefore not interesting, since there is nothing to see in the graph - a "flatline"? Wrong! Nothing happening could mean you are sleeping (now "they" can tell your sleeping habits if there are more time periods to sample) or that the apartment is not occupied (welcome, burglars!). I don't even need to know the type of data (like temperature or electric power consumption).

2. Controlled curve

Something ramps up for a certain amount of time in a very constant way. First guess: probably an (electronic) device, not a human action. Maybe the heating system? Without knowing the data type (8) it's hard to tell. But if I learn just once what type of device produces this kind of ramp-up, I can draw the same conclusion from every other data stream I get with the same pattern.
In this case it's the wake-up mode of a Philips Hue lamp over a time frame of thirty minutes (7). Linked to number (6) you might even tell the lamp type or color. Hackers are welcome if they know which devices are in your home! And the next ads in your browser/app might be related to the lamps. Do you also need a light strip from Philips?

3. Elevated flatline

The previous phase is completed. Is something going to happen next? Compare 1, 2 and 3 over more periods and you already get very interesting details on sleeping and living habits - and hence patterns.


 

4. Digital

Something rises very quickly, like a 0-to-1 transition. Knowing the type of data (8), it's not hard to judge what happened. Linked to (7), time and date, this gives even more insight into your living habits - from just one (mechanical) action. Now imagine all the mechanical switches being replaced with "intelligent" ones.

5. Analog

Something "irregular" is happening. Typical analog data of very fine granularity and therefore very helpful. Put the data in the big electronic brains and compare it to typical analog data (e.g. weather data like temperature, wind, cloudiness) and there will be a meaningful outcome - for sure. If you have the data type (6, "light intensity"), (7, "time"), (8, "light") you do a favor to the wallets of the big data center owners because they only need to buy little computing time. Now "they" not only know your sleeping habits, living pattern, your electronic devices but also where you live.
No they did not need your GPS signal you are always sending with your smartphone or the WiFi data you are allowing them to track.

6., 7. Goody: Scaling/ reference

We are still not talking about data streams with exact (absolute) numbers, only a single graph acquired over time. Add goodies like a scale or a reference and it gets even easier. But rest assured: clever algorithms are almost bored if you hand them that kind of cheat sheet.

8. Data type

Oh "they" also know what kind of data you are delivering - thank you for willingly helping to get the graph-only data even more interpretable.

9. Big data

You are adding more linked data like temperature, humidity or a data stream with actual numbers? You are already toast - this only adds to the finer granularity of your profile. And always remember: you still have not told anyone a single (unencrypted) word, unlike what you do every day when putting your most intimate details on messengers like "WhatsApp" or sending your data into the "cloud".

Conclusion


We only examined ONE very simple data stream in the form of a graph. From the pattern of one simple, single data source alone you can tell a lot - much more than mentioned above; we only scratched the surface of interpreting data. That's why the training of people as "Data Scientists" has exploded in recent years.

Now imagine ALL the data YOU (and of course I!) are sending every second into the enormous data cloud! Judge for yourself what could happen and how we should deal with data - aka the "Internet of Things" - in the future. For the moment, OTHER people are getting YOUR data and earning a lot of money with it, or worse.

What should we do, and how should we behave? Proposals are very welcome! For the time being, enjoy this "paradise" of "Big Data" - and did you just switch your light on?

5/23/2016

Web dashboard for IKEA Krydda

Firmware sketch for ESP8266 with web dashboard on adafruit.io

Adafruit.io dashboard
Dashboard for my growth system
To monitor your cultivation unit locally or from anywhere, I wrote a sketch that sends the current light intensity values both to the serial port and to adafruit.io, a nice web service for displaying data on the Internet (of Things).

Currently only the light value is displayed; more sensors will be added in the future.
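
A minimal sketch of this approach might look like the following, using the Adafruit MQTT client library. The WiFi credentials, the Adafruit IO key, the feed name "light" and the readLightSensor() helper are placeholders for illustration, not the exact code of my sketch.

```cpp
#include <ESP8266WiFi.h>
#include "Adafruit_MQTT.h"
#include "Adafruit_MQTT_Client.h"

// Placeholders -- replace with your own WiFi credentials and Adafruit IO account data
#define WLAN_SSID      "your-ssid"
#define WLAN_PASS      "your-password"
#define AIO_SERVER     "io.adafruit.com"
#define AIO_SERVERPORT 1883
#define AIO_USERNAME   "your-aio-username"
#define AIO_KEY        "your-aio-key"

WiFiClient client;
Adafruit_MQTT_Client mqtt(&client, AIO_SERVER, AIO_SERVERPORT, AIO_USERNAME, AIO_KEY);
// Feed name "light" is an assumption -- use the feed you created on adafruit.io
Adafruit_MQTT_Publish lightFeed(&mqtt, AIO_USERNAME "/feeds/light");

uint16_t readLightSensor() {
  // Placeholder -- in the real sketch this value comes from the BH1750FVI light sensor
  return 0;
}

void setup() {
  Serial.begin(115200);
  WiFi.begin(WLAN_SSID, WLAN_PASS);
  while (WiFi.status() != WL_CONNECTED) {
    delay(500);                      // wait until the ESP8266 has joined the WiFi
  }
}

void loop() {
  if (!mqtt.connected()) {
    mqtt.connect();                  // (re)connect to Adafruit IO via MQTT
  }
  uint16_t lux = readLightSensor();
  Serial.println(lux);               // local output on the serial port
  lightFeed.publish((uint32_t)lux);  // push the same value to the dashboard feed
  delay(10000);                      // one data point every 10 seconds
}
```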

You can see the actual values of my Krydda system here.
 
More information on how to set up your own dashboard on adafruit.io can be found here.

UPDATE 23-5-2016: Added temperature and humidity with the Sensirion SHT21 sensor.

5/22/2016

IKEA hydroponics - First sensor for ambient light

Sensor measures light intensity for the plants

Light sensor BH1750FVI connected to an ESP8266
The first sensor for my IKEA hydroponics project is an ambient light sensor (BH1750FVI), which will measure the light intensity on the plant bed in both the seed and the cultivation unit. It is currently connected to an ESP8266 - a DIY-friendly and hacker-famous microcontroller with WiFi on board. I wrote a library for the light sensor which works with both Arduinos and the ESP8266. You can find it here.
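
For illustration, here is a minimal sketch that reads the BH1750FVI directly over I2C with the Wire library. The I2C address and the mode command are taken from the sensor's datasheet; the library linked above wraps this up more conveniently.

```cpp
#include <Wire.h>

const uint8_t BH1750_ADDR          = 0x23;  // I2C address with the ADDR pin pulled low
const uint8_t BH1750_CONT_HIGH_RES = 0x10;  // continuous high-resolution measurement mode

void setup() {
  Serial.begin(115200);
  Wire.begin();                              // default I2C pins of the ESP8266 Arduino core
  Wire.beginTransmission(BH1750_ADDR);
  Wire.write(BH1750_CONT_HIGH_RES);          // start continuous measurements
  Wire.endTransmission();
}

void loop() {
  Wire.requestFrom(BH1750_ADDR, (uint8_t)2); // each result is two bytes, MSB first
  if (Wire.available() >= 2) {
    uint16_t raw = Wire.read();
    raw = (raw << 8) | Wire.read();
    float lux = raw / 1.2;                   // conversion factor from the datasheet
    Serial.print("Ambient light: ");
    Serial.print(lux);
    Serial.println(" lx");
  }
  delay(1000);                               // one reading per second
}
```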

Germinated seeds in cultivation unit
The example sketch in the repository currently measures only the current value of the ambient light. In future versions this sensor will be responsible for deciding if and for how long the IKEA LEDs have to be switched on, depending on the overall amount of light that has fallen on the plants. There will also be an uplink to a web service (like Thingspeak or adafruit.io) where all the values are monitored in real time on a nice-looking dashboard and are accessible from everywhere.
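
A rough sketch of that planned control logic could look like this. The relay pin, the daily light target and the readLux() helper are made-up placeholders, not the final implementation.

```cpp
// Rough sketch of the planned logic: integrate the measured light over the day and
// keep the IKEA LEDs on (via a relay on a GPIO) only while the plants have not yet
// received their daily "light dose". All values below are placeholders.
const uint8_t RELAY_PIN              = 12;       // GPIO driving the LED relay (assumption)
const float   DAILY_TARGET_LUX_HOURS = 50000.0;  // made-up daily light target

float accumulatedLuxHours = 0;       // light received so far (daily reset omitted here)
unsigned long lastSampleMs = 0;

float readLux() {
  // Placeholder -- in the real sketch this comes from the BH1750FVI library
  return 0.0;
}

void setup() {
  pinMode(RELAY_PIN, OUTPUT);
  lastSampleMs = millis();
}

void loop() {
  unsigned long now = millis();
  float hoursSinceLastSample = (now - lastSampleMs) / 3600000.0;  // elapsed time in hours
  lastSampleMs = now;

  accumulatedLuxHours += readLux() * hoursSinceLastSample;        // integrate the light dose

  // Switch the grow light on only while the daily target has not been reached yet
  digitalWrite(RELAY_PIN, accumulatedLuxHours < DAILY_TARGET_LUX_HOURS ? HIGH : LOW);

  delay(60000);                                                   // sample once per minute
}
```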

In the meantime the seeds that germinated changed their habitat to the cultivation unit.

 

5/12/2016

Project to control IKEA hydroponics Krydda/ Växer

I'm starting a new project with the brand new indoor gardening system from IKEA which is named "Krydda"/ "Växer".

This is a picture of the initial seed:

Initial seed day 0
The plan is to control and optimize growth with a microcontroller/embedded system and sensors (light, humidity, temperature, CO2, water level etc.), adding automated lighting and water refill to the new IKEA Krydda/Växer system.

You can follow the project here and at my github repo.

4/18/2016

Update: ESP8266 Breadboard Adapter Board


I designed a single-sided ESP-12/ESP-07 breadboard adapter PCB which is easy to etch and solder for anyone who, like me, loves to play with the ESP8266 on a breadboard.

There are different designs of the breadboard adapter.
There is also a nice 3D-printed socket from Moritz in his github account to easily program (lots of) ESPs without the need to solder them at all.

Features are:

  • Fits ESP-12 and ESP-07 module
  • Single-sided self-etchable design
  • Few, cheap parts in SMD
  • Breadboard-style - one row on each side accessible
  • Vin >4.8V (max. 12V) input possible with onboard 3V3 voltage regulator (plus two 10µF capacitors)
  • RST, CH_PD, GPIO0 with 4k7 pull-up resistors on board (resistors can be omitted if remote access of those GPIOs is needed)
  • GPIO15 with 4k7 pull-down (see above)
  • Tactile switch connected to GPIO0 to get into flash mode
  • Reset switch
  • Solder bridges for DTR, RTS lines to enable automatic flashing without having to press buttons

Parts needed:

  • ESP8266-12/ -07 module
  • 1x Voltage regulator (e.g. AMS1117-3.3V, 800mA)
  • 2x 10µF SMD ceramic capacitors
  • 4x 4.7kΩ SMD resistors
  • 2x 1kΩ SMD resistors
  • 2x 4x4mm SMD tactile switches
  • 2x 1x8 pin headers (pitch 2.54mm)
  • 12MHz crystal
  • 2x 22pF capacitors
  • Micro USB connector
  • 100nF capacitor
  • CH340G USB to serial IC
Remember to put three jumper wires on the bottom side, as the board is single-sided. See the Eagle files for the connections.

4/09/2016

Raspberry Pi 3 "Echo"

Building an Amazon Echo similar device out of a Raspberry Pi 3


I recently worked as an electronics hardware developer on a new smarthome system which is designed to have speech recognition as a way of controlling devices.


While researching soft- and hardware for this purpose in Silicon Valley, I also tested and reverse engineered the "Amazon Echo" - an electronically very well designed device and a huge success among the in-house manufactured devices of the e-commerce and cloud computing company.

The latter also lays the groundwork for the Amazon Echo and the speech recognition service called "Alexa" used in the tower-like gadget. With a price tag of $180 and - more importantly - not yet available to customers outside the US, I was quite happy back in Europe to find a github repo that allows implementing an Amazon Echo-like device, and especially its speech recognition, on cheap hardware like a Raspberry Pi.

I bought the quite new Raspberry Pi 3 - even though the github repo uses a Pi 2 - expecting some minor issues, which turned out to be true. Browsing the "issues" of the repo was a big help.

In short, I skipped installing a new JDK because one already comes with the new Raspbian Jessie image. I put on the newest version of Node.js, used the WiFi that is on board the RasPi 3 and tested different microphones, because the one suggested in the github repo has some bad reviews. That's basically all I deviated from the original installation instructions, which worked out very well.

After only two hours or so everything was set up without problems. In the video below and for the first tests I used a webcam with an integrated microphone, a Logitech QuickCam Orbit AF, which I had lying around while the dedicated USB microphone I had ordered had not yet arrived.



Identifying the microphone chipset

The problems began when I got the new USB microphone, a "Lerox USB microphone" ordered - of course - from Amazon. In the beginning I had barely any success getting "Alexa" to recognize my commands. I got pulsing sounds (which I hadn't had before) and the speech recognition stopped before I could even finish the command. The microphone identifies itself as a "C-Media Electronics device" with a CM108 chipset.

A few adjustments led me to success:
Microphone configuration with "alsamixer"

1. I adjusted the recording settings of the microphone with "alsamixer". It turned out that (at least for the microphone used) the highest "green" level available is a good setting.

2. I changed the USB power supply of the Raspberry Pi 3. This is where the clicking sound while recording the commands came from. It might be more a flaw of the microphone than of the power supply, as the first PSU I used was a high-quality one.

Editing settings for the microphone
3. This might be the most important setting when fiddling with microphone problems: I adjusted the values in the Java source code (../samples/javaclient/src/main/java/com/amazon/alexa/avs/AVSApp.java) for "ENDPOINT_THRESHOLD" (the minimum audio level below which the input is considered silence) and "ENDPOINT_SECONDS" (the amount of silence before endpointing). The defaults were 5 and 2 respectively, which I changed to 7 and 4. After running "mvn install" to do a new build and then "mvn exec:exec", it now works almost like the original Amazon Echo.

Audio device settings

4. You might have to set your microphone as the default input source. You can do this by choosing "Menu -> Settings -> Audio Device Settings", selecting your sound card (microphone), adding the elements and making the microphone the default. This is also where you can set the gain or additional elements like automatic gain control (AGC) when provided by the sound card/microphone. As far as I understand, choosing and setting the microphone with "alsamixer" does the same, but I'm not sure about it.

The next thing I will implement is invocation with a spoken wake word like the Amazon Echo, where you can choose between "Alexa" and "Amazon". As far as I could reverse engineer it, Amazon solves this with a bunch of Texas Instruments TLV320ADC3101 92dB SNR low-power stereo ADCs, which have an integrated miniDSP - I guess this is where they put the algorithms (aka the "magic") for recognizing the invocation command, before streaming the rest to their cloud servers. You can find a lot of technical details on the Amazon Echo in this awesome iFixit Amazon Echo teardown.

EDIT 4-10-2016: Added the instructions from Elton "Eddie" Hartman's fork to the installation on my Raspberry Pi 3, and it's now possible to start voice commands either by clicking the button in the Java GUI or by pressing a switch connected to the GPIOs of the Raspberry Pi.

EDIT 4-25-2016: If you want to use Bluetooth speakers, follow this awesome tutorial from David Roberts. Unfortunately I haven't yet been able to connect the microphone embedded in my Bluetooth speaker, a BoomStar BT NFC X.