I have been working on some physical interaction methods for my animations. This is the final version of the pushbutton board for a new project. It has 6 pushbuttons permanently connected to Gnd; the outputs default to Gnd and are fed back to the microcontroller. When a button is pushed, 5V is fed to the output but doesn't short out the feed. I use a diode and resistor in series on each button to achieve this. The result is zero bounce when a button is pushed, combined with a digital read method that does a quick sweep of all inputs.
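The read side of that can be sketched in plain C++; `digitalReadStub`, the pin numbering and the bitmask are stand-ins for the real Arduino calls and wiring, not the actual firmware:

```cpp
#include <cassert>
#include <cstdint>

// Stand-ins for the Arduino pins and digitalRead(); on the real board each
// of the 6 button outputs feeds back to one of these inputs.
const int NUM_BUTTONS = 6;
int fakePinState[NUM_BUTTONS] = {0, 0, 0, 0, 0, 0}; // 0 = Gnd (idle), 1 = 5V (pushed)

int digitalReadStub(int pin) { return fakePinState[pin]; }

// Sweep all inputs in one quick pass and pack the result into a bitmask.
// Because the hardware is bounce-free, a single sweep is enough; no
// per-button debounce timers are needed.
uint8_t sweepButtons() {
    uint8_t mask = 0;
    for (int pin = 0; pin < NUM_BUTTONS; ++pin) {
        if (digitalReadStub(pin) == 1) { // HIGH: button is pushed
            mask |= (1u << pin);
        }
    }
    return mask;
}
```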
Received a huge batch of ESP8266-based NodeMCU modules today from AliExpress. Some of these will be used for a special installation project at Faux Mo for next year's MOFO Festival in Hobart, Tasmania. The rest are for a new batch of LED Masks and some spares. These are great little devices that can run at 160 MHz and can drive both WS2812B and APA102 LED strips.
Shot a new video at 60FPS to show the gorgeous animations on this model
I have had long thought sessions about how to make my LED creations interactive. Toying with sensors, buttons, sliders etc. has often yielded “meh” results. In the end, a web/app-based approach will probably be best. But how do I make it un-boring? The idea of the idle render, what is rendered when no interactivity is around, is very important. I had an idea last night about a structured render algorithm that could be made to create really interesting patterns and be structurally interactive.
Most of my render algorithms are very similar, varying slightly in certain ways:
- Create a color object using an index value running from 0 to 6 × the maximum value of each LED channel
- Create an X and Y coordinate using either a random number bounded to the LED map, or a set pattern
- Apply the color to a pixel, line, box or circle at the X, Y coordinates
- Add a delay of d milliseconds
- Apply a canvas shift function of some sort
- Apply a canvas fade function with a fade value f
The above creates some amazingly diverse animations. I plan to write up a simple web interface that allows you to control each step and then send the updates to the MCU via WiFi.
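Those steps can be sketched roughly like this in plain C++, with an in-memory canvas standing in for the LED map; the sizes, names and fade math are illustrative, not my actual render code:

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Illustrative canvas dimensions, not tied to any particular build.
const int W = 28, H = 13;

struct Color { uint8_t r, g, b; };

// Step 1: walk a color wheel. The index runs 0 .. 6*255, i.e. six segments
// of 255 steps each, cycling through the RGB transitions.
Color colorFromIndex(int i) {
    int seg = (i / 255) % 6, v = i % 255;
    switch (seg) {
        case 0: return {255, (uint8_t)v, 0};            // red -> yellow
        case 1: return {(uint8_t)(255 - v), 255, 0};    // yellow -> green
        case 2: return {0, 255, (uint8_t)v};            // green -> cyan
        case 3: return {0, (uint8_t)(255 - v), 255};    // cyan -> blue
        case 4: return {(uint8_t)v, 0, 255};            // blue -> magenta
        default: return {255, 0, (uint8_t)(255 - v)};   // magenta -> red
    }
}

std::vector<Color> canvas(W * H, Color{0, 0, 0});

// Step 3: apply the color to a single pixel at (x, y).
void setPixel(int x, int y, Color c) { canvas[y * W + x] = c; }

// Step 6: canvas-wide fade with fade value f (subtract, clamped at 0).
void fadeCanvas(uint8_t f) {
    for (Color &c : canvas) {
        c.r = c.r > f ? c.r - f : 0;
        c.g = c.g > f ? c.g - f : 0;
        c.b = c.b > f ? c.b - f : 0;
    }
}
```

Each animation then becomes a loop of pick-color, pick-coordinates, draw, delay, shift, fade, with the web interface only having to tweak the parameters of each step.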
Here is a rough draft of a web app:
This sends a POST data packet at each change in the interface that looks like this:
Here they are naked; I'll grab some more video at Seed of the finished product:
With thanks to Aerie from Kamp Kraken, who put together the creatures' bodies, I was able to make these awesome-looking bases:
No, get your mind out of the gutter :) it's not what you think:
They will be decorated and hung on a big hoop. I really hope everyone at Burning Seed likes these!
I started work on a new version of the LED mask. I recently found a less creepy plastic face mask, and decided to work my magic on it:
First thing to do is to map out the LED arrangements. This is key to how each strip is connected and how the software drives the LEDs. You can see the strip placement markers and the direction of data below.
This is the new mask I found. It has a kinder, more neutral appearance…
Next I cut up the LED strips and prep each one for placement on the mask. I usually mark out the required length of connector wires on the mask, cut them up, and prep them with solder and markers.
This is the general layout of the mask in 2 dimensions.
Finally I connect each strip together in the order assigned in step 1, and then stick the strips to the mask:
And here is a test animation of the new mask in action:
A new #ledmask ! This is V4, utilizing a #nodemcu #esp8266 module and my #led software in combination with #apa102 #rgbled #ledstrips . This will shortly be covered by cotton lycra to diffuse the light. I wanted to display the guts of this amazing hardware. This mask will be off to #burningseed #burningmanaustralia to wow the crowd!
This version uses the NodeMCU V1 module. Great little piece of hardware!
First off we have the lovely pink lady from many a Coco Poco Loco party:
This is in fact a very silly-looking fiberglass model of a naked lady. It has a base with a normal bulb inside. The outside is covered in small glass pieces and the inside is covered in jagged fiberglass that cut my arm up pretty badly.
I was asked to work my magic and here it is:
I put a strip in the head, onto the bust, all the way up the spine and along the lower parts of the front-facing legs. The whole thing uses about 68 LEDs from a WS2812B strip. I used a NodeMCU for this, just for shits and giggles and to see if I could get it to drive WS2812B strips. I used THIS modified NeoPixel driver with the NodeMCU module running at 160 MHz. Note that you must use the UART-driven library on Pin 4. Here is another example:
The Pink Lady is getting sexier for the #cocopocoloco #burningmanaustralia #burningseed #artcar IF they like it. The #ledstrips inside are #apa102 from @tyrialight and they are being driven by a #nodemcu A video posted by Elec Dash Tron Dot Org (@wow_elec_tron) on
Next up is Jord's Disco Shoulder Pads:
I used an Arduino Nano for this alongside a WS2812B LED strip. The result is hilarious:
My best piece to date in my opinion:
Here are some #discopads please enjoy them #burningmanaustralia #burningseed #ledwearables #arduino #apa102 #ledstrip A video posted by Elec Dash Tron Dot Org (@wow_elec_tron) on
I have had this idea in my head ever since i saw these cheap lighting fixtures at Ikea:
These comprise two outdoor lights. The bottoms are hollowed out and fixed together using some screws.
#woo I really like this one. All up the sphere has 364 #apa102 #rgbleds, it runs on a small 5v Mobile power bank which sits inside. Endless possibilities for #burningseed #burningmanaustralia #ledinstalation Awesome backing track #soultice by #monikakruse #piganddan remixed by #albertoruiz A video posted by Elec Dash Tron Dot Org (@wow_elec_tron) on
Each half has 182 LEDs arranged as 14 strips of 13 pixels. Put the two halves together and you end up with a 28 x 13 pixel grid to work with, and for the magic to begin.
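Working on a grid like that comes down to mapping (x, y) coordinates onto a position in the LED chain. A sketch of that mapping, assuming the usual serpentine wiring where alternate strips run in opposite directions (the function name is mine, not from the actual code):

```cpp
#include <cassert>

// Grid size matching the sphere: 28 columns of 13 pixels each.
const int COLS = 28, ROWS = 13;

// Map an (x, y) grid coordinate to an index along the LED chain, assuming
// serpentine wiring: even-numbered strips run top-to-bottom, odd-numbered
// strips run bottom-to-top.
int xyToIndex(int x, int y) {
    return (x % 2 == 0) ? x * ROWS + y
                        : x * ROWS + (ROWS - 1 - y);
}
```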
The sphere is controlled using a NodeMcu clone that runs an ESP8266 core. It easily accepts Arduino code via the latest Arduino IDE, so porting my code was relatively simple. The only change required was the FastLED library, which isn't yet supported on the ESP8266 core. The APA102 LED strips can be driven using standard SPI, so rewriting the driver was pretty easy. The SPI library for the NodeMcu is even better than the Arduino one, as it has frame-based transmission where you can shoot out a huge buffer of bytes in one go, instead of byte by byte. The performance increase is massive.
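For a sense of why one big buffer works so well: the APA102 protocol is just a start frame of zeros, four bytes per LED, and an end frame, so the whole thing can be assembled up front and sent in a single SPI transfer. A rough sketch in plain C++ (not my actual driver; names are illustrative):

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

struct Pixel { uint8_t r, g, b; };

// Build one APA102 transmit buffer: a 4-byte start frame of zeros, then per
// LED a brightness byte (0xE0 | 5-bit global level) followed by B, G, R,
// then an end frame of 1-bits (a few extra bytes for longer strips). The
// whole buffer can then go out in one SPI transfer instead of byte by byte.
std::vector<uint8_t> buildAPA102Frame(const std::vector<Pixel> &px, uint8_t level5) {
    std::vector<uint8_t> buf = {0x00, 0x00, 0x00, 0x00}; // start frame
    for (const Pixel &p : px) {
        buf.push_back(0xE0 | (level5 & 0x1F)); // brightness byte
        buf.push_back(p.b);
        buf.push_back(p.g);
        buf.push_back(p.r);
    }
    for (int i = 0; i < 4 + (int)px.size() / 16; ++i)
        buf.push_back(0xFF); // end frame
    return buf;
}
```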
Last night I began testing a PIR sensor and some code that Rob gave me. I can now sequence the animations depending on how busy the motion in front of the PIR sensor is. I have set up a bunch of soft-polling systems inside each animation sequence that let me switch between modes. So far so good! There will also be a button for crazy mode!
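The soft-polling idea can be sketched like this; the trigger counting and the thresholds are illustrative guesses, not Rob's code:

```cpp
#include <cassert>

// Count PIR triggers over a rolling window and pick an animation mode from
// how busy the motion is. On the real hardware onPirTrigger() would be
// called from the PIR input; thresholds here are made up.
int triggersThisWindow = 0;

void onPirTrigger() { ++triggersThisWindow; }

// Called from inside each animation loop (the "soft poll"): map the trigger
// count to a mode, then reset the window for the next poll.
int pickMode() {
    int mode = (triggersThisWindow == 0) ? 0      // idle render
             : (triggersThisWindow < 5)  ? 1      // gentle
                                         : 2;     // busy
    triggersThisWindow = 0;
    return mode;
}
```

Each animation checks `pickMode()` at a convenient point in its loop, so switching never has to interrupt a frame mid-draw.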