Soft Robotics – Bio-Inspiration: The Axolotl


Axolotls are adorable amphibians that never transition to land because they are neotenic, meaning they reach adulthood without undergoing metamorphosis. They are themselves a species of salamander (Ambystoma mexicanum), closely related to the tiger salamander. They are indigenous to two lakes near Mexico City, Lake Xochimilco and Lake Chalco, of which only the former remains.

The axolotl was revered as a sacred being by the Aztecs, who built Tenochtitlan (present-day Mexico City), the largest city in the pre-Columbian Americas, on an island in Lake Texcoco. Like many plant and animal species, the axolotl’s decline has been directly driven by colonialism: after the Spanish conquest in 1521, the lakes were drained and cities were built in their place. More recently, non-native fish introduced to Lake Xochimilco have been eating axolotl young.

Axolotls have amazing regenerative abilities: without any scarring, they can regrow up to a third of the heart, the front of the brain, the spine, limbs, and testes. A transplanted leg placed next to an arm will simply be adopted by the body. When a body part is severed, the cells in that area lose their individual characteristics, act as stem cells, and regenerate the missing tissue, seemingly indefinitely. They are reported to be around 1,000 times more resistant to cancer than mammals and are often described as the only terrestrial animal capable of this degree of regeneration. Because of this unique and powerful anatomy, they are widely studied in labs.


How its features could be used in technology: regenerative treatment for burn victims, amputees, and cancer patients. Scientists are trying to figure out exactly how axolotl regeneration works, and some believe it could one day be applied to human regeneration.

When speaking of stem cells, or of any technology in general, there is always a looming question of ethics and of the directions it could go in. We are beings of light, and light makes shadow; anything we create will embody this duality and the infinite choices of what to do within that framework. A technology like this could be used for healing, or taken in a more “unnatural” or disturbing direction.

In the 1960s, a scientist amputated an axolotl head and transplanted it onto the back of another axolotl; both heads continued to grow for 65 weeks, until they died. In a far-out but not implausible sense, if the axolotl’s regenerative abilities were understood to the extent that they could be put to use, one could create creatures with multiple heads or arms, or possibly combined species.





Meditation #3 – T-eye the Knot

For the third meditation I worked with Kim Lin to create a matrimonial divination firm, “T-eye the Knot,” that provides services based on your interaction with live-streaming jellyfish from the Monterey Bay Aquarium. One feature is predicting your wedding song: one of the eerie tracks NASA has made from each planet’s radio emissions.

The participant wears Pupil Labs eye-tracking glasses while watching the live stream. Based on where their eyes dwell the most, one of the planet songs is selected.
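We haven’t built the data pipeline yet, but the selection logic we have in mind is simple. Here’s a rough sketch in plain JavaScript; the region grid, the track file names, and the onGazeSample() entry point are all placeholders for whatever the Pupil Labs stream actually gives us.

// Hypothetical sketch: tally gaze dwell time per screen region, then pick the planet track
// for the most-watched region. Assumes gaze samples arrive as normalized (x, y) points.
const planetTracks = [
  'mercury.mp3', 'venus.mp3', 'earth.mp3', 'mars.mp3',       // placeholder file names for the
  'jupiter.mp3', 'saturn.mp3', 'uranus.mp3', 'neptune.mp3'   // NASA planet recordings
];

// The live stream is split into a 4 x 2 grid, one dwell counter per planet track.
const cols = 4, rows = 2;
const dwellCounts = new Array(planetTracks.length).fill(0);

// Called once per gaze sample, with x and y normalized to 0..1.
function onGazeSample(x, y) {
  const col = Math.min(cols - 1, Math.floor(x * cols));
  const row = Math.min(rows - 1, Math.floor(y * rows));
  dwellCounts[row * cols + col] += 1;
}

// After the viewing session, the most-watched region decides the wedding song.
function pickWeddingSong() {
  const winner = dwellCounts.indexOf(Math.max(...dwellCounts));
  return planetTracks[winner];
}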

We decided to go with Pupil Labs instead of doing eye tracking in p5 or another program because of the physicality of it, but we haven’t figured out how to work with the data yet. In the meantime, here is the concept video:

Fungus Among Us – Final Presentation Idea

I’m interested in creating a wearable mycelium sound trigger, as a mask or a wearable for another body part. I like the aesthetic of Eric Klarenbeek’s 3D-printed Mycelium Chair and would like to design mine using a printer if possible.


I have a MIDI Sprout, which measures the micro-electrical currents of a plant and converts them into MIDI signals, and I’m interested in exploring whether this could be done with a mycelium piece. I haven’t heard of anyone using it with fungus, and I was hesitant to try it with the electrode pads because I thought they might tear the skin of the mushroom, but I’m going to investigate how to do it with less damage.
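As a first test once a piece is wired up, I could just listen for whatever notes come out of the MIDI Sprout. Here’s a rough sketch using the browser’s Web MIDI API; it assumes the Sprout reaches the laptop through a standard USB-MIDI interface and shows up as an input device.

// Rough sketch: log note-on messages from the MIDI Sprout in the browser via Web MIDI.
// Assumes the Sprout is connected through a USB-MIDI interface and appears as an input.
navigator.requestMIDIAccess().then((midi) => {
  for (const input of midi.inputs.values()) {
    console.log('MIDI input found:', input.name);
    input.onmidimessage = (msg) => {
      const [status, note, velocity] = msg.data;
      if ((status & 0xf0) === 0x90 && velocity > 0) {      // 0x90 = note-on
        console.log('note ' + note + ' velocity ' + velocity);
        // This is where a mycelium reading would eventually drive sound or visuals.
      }
    };
  }
}).catch((err) => console.error('Web MIDI not available:', err));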

I would allow the mushrooms to fruit, because I think it looks beautiful, it’s what the mycelium wants to do naturally, and I assume there is more water content (and therefore better conductivity for the sensor) in the “fruit.”



I found someone doing experiments with Makey Makey that were successful:


3D mask/wearable inspo:




I like the idea that the piece will continually change based on the fruit’s growth, given that it sprouts.




Kimberly Lin and I have been developing a deck that interprets each card with a movement. “Soma” comes from somatic therapy, the idea that our experiences, emotions, and trauma are stored not only in our mind but throughout our body. This week we added our interpretations for the six cards we have recorded so far. We referenced these two books (will add author names and titles here). We recorded the audio through a vocal processor I have, to give it a touch of a cyborg, swimming-in-the-cosmos feeling.

We wanted the videos to loop but the audio to play only once when a card was selected, while maintaining the random selection. This was executed in p5 with help from Leon.
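Roughly, the logic boils down to the minimal p5 sketch below; the file names and the click-to-draw-a-card interaction are placeholders, but the loop-the-video / play-the-audio-once split is the part Leon helped us get right.

// Minimal sketch of the card logic: pick a random card, loop its video,
// but play its audio only once per selection. File names are placeholders.
let cards = [];
let sounds = [];
let current = null;

function preload() {
  for (let i = 1; i <= 6; i++) {
    sounds.push(loadSound('card' + i + '.mp3'));
  }
}

function setup() {
  createCanvas(640, 480);
  for (let i = 1; i <= 6; i++) {
    const vid = createVideo('card' + i + '.mp4');
    vid.hide();                                  // draw the frames onto the canvas instead
    cards.push({ video: vid, audio: sounds[i - 1] });
  }
}

function mousePressed() {
  if (current) {                // stop whatever card was showing before
    current.video.stop();
    current.audio.stop();
  }
  current = random(cards);      // keep the random selection
  current.video.loop();         // the video loops continuously...
  current.audio.play();         // ...but the audio plays through just once
}

function draw() {
  background(0);
  if (current) image(current.video, 0, 0, width, height);
}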

We are still developing the concept of how we want people to engage with it, and we are thinking of it more as an installation with life-size videos and a physical component, as opposed to a website.



LIPP Week #3

I experimented with jit.matrix, chroma keying, and the random piano patch. I tried to combine them and have the random number outputs from the piano trigger the video controls, but was unsuccessful.






There-Liminal Network Final Blog






The final test was a success!

Everything went smoothly, but I made two changes for the final playtest.

I used candles instead of lamps as the light source for the photosensors; they give a more focused light emanation and add to the overall sensory experience.

I tested four waterproof speakers and none of them sounded good enough, so I’m using speakers outside the box. Visually, I couldn’t make it look good with the speaker submerged this time around.

Schematic & Diagram:

Configuration & Construction:

I made a light theremin with an Arduino, two photosensors, and a switch. The analog readings of the sensors are serially communicated to p5.js via the p5 serial port app. I loaded whale sounds and video into p5, along with pitch and delay effects and red, green, and blue color channels to overlay on the video. All of these components are controlled by the photosensors: pitch on one side, delay on the other, red on the right, green on the left, and blue on both.
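Stripped down, the sensor-to-sound side looks roughly like the sketch below. It assumes the p5.serialport library (with the serial control app) and p5.sound, and that the Arduino prints the two readings as a comma-separated line; the port name, file name, and 200–800 sensor range are placeholders.

// Rough sketch of the serial + sound side of the theremin (p5.js + p5.serialport + p5.sound).
// Assumes the Arduino does Serial.println(String(s1) + "," + String(s2)) each loop.
let serial, whale, delayFx;
let s1 = 0, s2 = 0;

function preload() {
  whale = loadSound('whale.mp3');            // placeholder file name
}

function setup() {
  createCanvas(640, 480);

  serial = new p5.SerialPort();
  serial.open('/dev/cu.usbmodem1411');       // placeholder port name
  serial.on('data', gotData);

  delayFx = new p5.Delay();
  delayFx.process(whale, 0.12, 0.5, 2300);   // source, delay time, feedback, filter freq
  whale.loop();
}

function gotData() {
  const line = serial.readStringUntil('\r\n');
  if (!line) return;
  const parts = line.trim().split(',');
  if (parts.length === 2) {
    s1 = Number(parts[0]);                   // photosensor 1
    s2 = Number(parts[1]);                   // photosensor 2
  }
}

function draw() {
  background(0);
  // Photosensor 1 bends the pitch of the whale sound, photosensor 2 stretches the delay.
  whale.rate(map(constrain(s1, 200, 800), 200, 800, 0.5, 1.5));
  delayFx.delayTime(map(constrain(s2, 200, 800), 200, 800, 0.05, 0.6));
}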

The video was run through an overhead projector, and I used Maptastic with p5, which allowed me to warp the shape of the video so it fit squarely into the water basin.
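The Maptastic setup itself is tiny. As far as I understand the library, you include maptastic.js on the page, give the p5 canvas a parent element, and hand that element’s id to Maptastic; the “stage” id below is a placeholder, and the call pattern follows my reading of the Maptastic docs.

// Sketch of the projection-mapping setup, assuming maptastic.js is loaded via a <script> tag
// and the page contains a <div id="stage"> that wraps the p5 canvas.
function setup() {
  const cnv = createCanvas(640, 480);
  cnv.parent('stage');   // put the canvas inside the element that Maptastic will warp

  // Maptastic warps the whole element; its corner-dragging edit mode is what lets me
  // squeeze the projected video into the footprint of the water basin.
  Maptastic('stage');
}

function draw() {
  background(0);
  // ...whale video and color overlays drawn here as usual...
}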

The circuitry all sits underneath, in the white box. The two sensors sit on the first bubbles on the right and left of the box.

The audio runs out to a Bluetooth speaker.

I bought an acrylic box at Canal Plastics and lined the seams with a clear waterproof sealant to make sure it could hold water.

Whale Theremin Update (still drafting post) 12.10.17

My original idea has been transformed a few times over the past few weeks, but it has retained the core concept: working with organic material, sound, and triggers that don’t require direct touch to be activated.

I worked on the circuitry and code for the Whale Theremin to get the sound and effects triggering with two photosensors and a button. Once the basic code and circuitry were working, I shifted my focus to getting the Adafruit Capacitive Touch Shield working so that conductive organic materials, such as a fruit or a plant, could act as triggers.


For some reason this proved to be very buggy, and I spent too much time trying to make it work; it would only sporadically read the sensors. I caught it below during a moment when it briefly worked.

I then decided that continuing to fight with it wouldn’t be a wise use of my time, and I ordered the Adafruit capacitive touch breakout instead, which was relatively smooth to set up and get readings from.


I decided, however, that I really wanted to work with plants and MIDI, so that it would be more a reading of the state of the plant’s micro-electrical currents than just an on/off trigger.

I tried to model the galvanometer from the MIDI Sprout’s circuitry. They have a more complex system that includes a potentiometer affecting specific MIDI parameters, plus corresponding LED lights, which I wasn’t interested in at this point; my main objective was to show how plants react electrically/energetically to touch and to the environment around them, and to have that translated into MIDI and then sound.

At Tom Igoe’s advice, I attempted to replicate just the galvanometer aspect of the circuit and to set up a more straightforward MIDI communication. I spent several days on this and was unable to get the galvanometer working properly. This is something I definitely want to revisit in the future and get working.

I then chose to refocus my attention on the whale theremin coding and fabrication, and to table both of the touch sensors until the theremin felt complete. There were a couple of places where I had problems with my code, and Jim, Chino, and Mathura (*insert last names*) were very helpful in explaining certain hiccups and/or offering help on how to execute certain functions successfully. I expanded on the visuals in p5, including mapping the three color channels to the two light sensors: red is mapped to photosensor 1, green to photosensor 2, and blue to both of them. Leon (*insert last name*) helped me execute this properly.
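The color mapping itself boils down to a single tint() call. Here’s a simplified version, with the 200–800 range standing in for whatever sensor values a given room actually produces:

// Simplified version of the color overlay: each photosensor reading (already pulled in over
// serial elsewhere in the sketch) is mapped into one channel of a tint() on the whale video.
function drawWhaleVideo(whaleVideo, sensor1, sensor2) {
  const r = map(constrain(sensor1, 200, 800), 200, 800, 0, 255);  // red follows photosensor 1
  const g = map(constrain(sensor2, 200, 800), 200, 800, 0, 255);  // green follows photosensor 2
  const b = (r + g) / 2;                                          // blue follows both

  tint(r, g, b);
  image(whaleVideo, 0, 0, width, height);
  noTint();
}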


There have been some challenges with the construction of the piece, mainly because I chose to project on top of the water. Water is very heavy and obviously damaging around electronics. I envision this project at a larger scale, projecting onto a larger body of water, but for now I’m using a 12 x 12 x 12 acrylic box. I’ve lined the seams with DAP Flexible Clear Sealant and have ordered a project box that will sit underneath and house the Arduino. The photosensors will be on either side of the box.

I found that the best remedy for using photosensors alongside a projection that needs a darker room is to use two flashlights as dedicated light sources, which also makes the photosensor readings more accurate.

I might switch to distance sensors, as recommended by Tom, but my priority now is getting the fabrication and execution running as smoothly as possible. I haven’t had much difficulty with the photosensors, thanks to the constrain() function in the p5 code and the relative ease of calibrating to the light level in a particular room.


I’ve done a lot of research on waterproof speakers and have ordered three so far; the smallest one works best but is not powerful enough for this amount of water. The most powerful one cuts out when it’s underwater for too long, so I have a fourth on its way that is supposed to be very powerful and is aesthetically in line with the overall piece.

I figured out how to output to multiple speakers from a MacBook, and I think I’d like to have one audio output inside the water and another outside it for increased sonic immersion.


I had the plant present during user testing to gauge feedback with it included, and everyone suggested saving the plant for another project. I’m still interested in triggering the plant through underwater sounds and vibrations, but I probably will not use whale sounds or imagery for that.

I was attempting to use my own projector for the playtesting but was unable to, and the internet kept going out. For my ICM class in the afternoon, I was able to successfully project onto the water surface of a fish tank and have the theremin trigger the sounds and colors; unfortunately, I don’t have documentation because I was busy making sure everything was running smoothly.

The main feedback I got was about how to make the speaker look good without interfering with the whale projection. Fanchi suggested netting so it would look more aquatic and wouldn’t muffle the sound. I purchased a white speaker, since I’ll be making the bottom of my box white for optimal projection, so it can blend in as much as possible; it should arrive tonight. Another idea is to put a layer of white cotton material on top so the sound can pass through while the projection still has a smooth white surface. The speaker also needs to be weighted down so it stays at the bottom of the tank; I think I will use rocks for that.


  • wait for the final fabrication materials to come in and complete the final construction: Arduino box, lights for the photosensors, alternate waterproof speaker, netting for the speaker
  • include schematics
  • update blog further with work process
  • test out new speaker and multi speaker configuration
  • test water in box, make sure nothing is leaking
  • apply Maptastic to my p5 sketch so that it projects to the exact dimensions of the water surface
  • test projection in new container with Maptastic applied

Alternate water holder if the acrylic box doesn’t work:

This would need to be set up near the sink in the kitchen.