Computational Xmas Tree
This repo contains the code for the computational illumination of a Christmas Tree!
It is based on the work by Matt Parker from @standupmaths, and his video "I wired my tree with 500 LED lights and calculated their 3D coordinates".
This version contains an original 3D LED calibration routine developed by @range-et based on Matt's explanations in the video, Grasshopper + C# scripts to generate animated light sequences developed by @garciadelcastillo, and Python scripts for the Raspberry Pi to load and run them on the tree.
The main difference is that Matt's approach uses custom Python scripts with sequences run directly on the RPi, whereas this workflow "bakes" the animations into CSV files that are then played back from the RPi. The advantage of this approach is that the CSV sequences can be parsed in any graphics environment and displayed/simulated without needing to run Python code. The downside is that the coordinates of the LEDs need to be known before generating the CSV files, so the files become tree-specific.
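For reference, here is a minimal sketch (not the repo's code) of how such a baked CSV could be parsed, assuming each row encodes one frame as a flat list of R,G,B values per LED; the real files may carry a header or frame index, so treat this as illustrative only.

```python
# Minimal sketch of reading a baked sequence. Assumes each CSV row is one
# frame containing an R,G,B triplet per LED; adjust to the actual column
# layout of the files in this repo.
import csv

def load_sequence(path):
    """Return a list of frames; each frame is a list of (r, g, b) tuples."""
    frames = []
    with open(path, newline="") as f:
        for row in csv.reader(f):
            try:
                values = [int(float(v)) for v in row]
            except ValueError:
                continue  # skip a header row, if present
            # group the flat values into (r, g, b) triplets, one per LED
            frames.append([tuple(values[i:i + 3]) for i in range(0, len(values), 3)])
    return frames

frames = load_sequence("light-sequence.csv")
print(f"{len(frames)} frames, {len(frames[0])} LEDs per frame")
```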
Jose Luis did a video series on ParametricCamp explaining (most of) this workflow: https://youtu.be/ma4JF6b4Dqk
Usage
Calibration
If you set up a physical Christmas tree with lights, you can estimate their 3D positions with the following steps:
- Run `imageCapture.py` from a Raspberry Pi that has a webcam and is connected to the tree. The script will take one image per LED for angle 0: `$ sudo python3 imageCapture.py`
- Repeat this process by physically rotating the tree to 45, 90, 135, 180, 225, 270 & 315 degrees, updating the `MASTERDIR` variable in the script accordingly. A sample dataset is provided.
- Run `mapper.ipynb` to process all the images and generate the coordinates of the tree (the core idea is sketched below). This may take some time!
- Run `adjuster.ipynb` to correct some of the outliers.
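For intuition, here is a rough sketch of the mapping idea, not the code in `mapper.ipynb`: since only one LED is lit per photo, its 2D position can be taken as the brightest spot in the image, and two views taken 90 degrees apart supply the two horizontal coordinates plus a height. The file names, OpenCV approach and two-view simplification below are assumptions; the notebook combines all eight angles.

```python
# Illustrative only: estimate a lit LED's 2D position as the brightest pixel,
# then combine two views taken 90 degrees apart into a rough (x, y, z).
# File names like "angle_000/led_0042.jpg" are hypothetical.
import cv2

def brightest_point(image_path):
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    blurred = cv2.GaussianBlur(gray, (41, 41), 0)   # suppress hot pixels / noise
    _, _, _, max_loc = cv2.minMaxLoc(blurred)       # (col, row) of brightest pixel
    return max_loc

# Front view gives (x, height); the 90-degree view gives (y, height).
u0, v0 = brightest_point("angle_000/led_0042.jpg")
u90, v90 = brightest_point("angle_090/led_0042.jpg")
# Note: image rows grow downward, so the height usually needs flipping later.
x, y, z = u0, u90, (v0 + v90) / 2.0
print(x, y, z)
```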
A version of the final outcome from our test tree is provided in the file `coords_adjusted.txt`. If you don't have a tree yourself, feel free to use it as a starting point.
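The exact layout of `coords_adjusted.txt` is not described here; assuming one LED per line with comma-separated x, y, z values (possibly wrapped in brackets), a loader could look like this:

```python
# Possible loader for coords_adjusted.txt. The per-line format (comma-separated
# x, y, z, optionally bracketed) is an assumption; check the file itself.
def load_coords(path="coords_adjusted.txt"):
    coords = []
    with open(path) as f:
        for line in f:
            line = line.strip().strip("[]")
            if not line:
                continue
            x, y, z = (float(v) for v in line.split(","))
            coords.append((x, y, z))
    return coords

coords = load_coords()
print(f"Loaded {len(coords)} LED coordinates")
```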
Questions, suggestions and PRs can be directed to @range-et!
Sequencing
Once 3D coordinates are available, the following C# scripts can be used as inspiration to generate sequences:
- (optional) The Grasshopper files use a version of the tree coordinates where X and Y are normalized between `-1` and `1`, and Z is proportionally scaled starting at `0`. Additionally, the coordinates are in `CSV` format. Use the `xmas_tree_coord_normalizer.gh` script to translate between the two conventions (a sketch of the transform follows this list).
- `xmas_tree_sequences_3d.gh` and `xmas_tree_sequences_no_3d.gh` contain scripts to generate a few animated sequences, spatially-aware (tree-specific) and not, respectively.
- (optional) If you only want to simulate already-baked sequences, use `xmas_tree_simulation.gh`.
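The normalizer itself is a Grasshopper/C# definition; purely to make the transform concrete, here is a rough Python equivalent of the normalization described above. The exact centering and scaling convention used in `xmas_tree_coord_normalizer.gh` is an assumption.

```python
# Sketch of the normalization convention: X and Y mapped to [-1, 1] using the
# largest horizontal extent, Z shifted to start at 0 and scaled by the same
# factor so the tree's proportions are preserved.
def normalize(coords):
    xs, ys, zs = zip(*coords)
    cx, cy = (max(xs) + min(xs)) / 2, (max(ys) + min(ys)) / 2
    half = max(max(xs) - min(xs), max(ys) - min(ys)) / 2  # half the widest horizontal span
    return [((x - cx) / half, (y - cy) / half, (z - min(zs)) / half)
            for x, y, z in coords]

normalized = normalize([(10.0, -5.0, 2.0), (-10.0, 5.0, 50.0)])
print(normalized)
```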
Execution
A few scripts, based on Matt's original code, are provided to load the CSV sequences and run them on the tree.
- Copy the contents of the `execution` folder to the RPi, as well as a few of the CSV sequences.
- To run a sequence: `$ sudo python3 run.py light-sequence.csv`
- To run all the sequences in a folder: `$ sudo python3 run-folder.py seq-folder/ 3 30`. The optional `3` indicates that each sequence will be looped 3 times before moving on to the next, and the optional `30` indicates that consecutive sequences will be blended over 30 frames (a sketch of the blending idea follows this list).
- If you stop a sequence and want to turn off the tree, run `$ sudo python3 flush.py`
- If you need to fine-tune the tree's calibration, you can turn on specific lights by ID by running `$ sudo python3 turnon.py 0 99 199 299 399 499`.
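For context, the blending mentioned above can be thought of as a linear cross-fade between the last frame of one sequence and the first frame of the next. The sketch below only illustrates that idea; it is not the code in `run-folder.py`, and the frame layout (a list of R,G,B tuples) is an assumption consistent with the parsing sketch earlier.

```python
# Sketch of a cross-fade between two frames: linearly interpolate each LED's
# colour over N transition frames. Illustration only, not the repo's code.
def blend_frames(frame_a, frame_b, t):
    """Linear interpolation between two frames, t in [0, 1]."""
    return [tuple(int(a + (b - a) * t) for a, b in zip(rgb_a, rgb_b))
            for rgb_a, rgb_b in zip(frame_a, frame_b)]

def transition(last_frame, first_frame, steps=30):
    return [blend_frames(last_frame, first_frame, (i + 1) / steps)
            for i in range(steps)]

# Example: fade 3 LEDs from red to blue over 30 frames.
fade = transition([(255, 0, 0)] * 3, [(0, 0, 255)] * 3, steps=30)
print(fade[0], fade[-1])
```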
Contributions
This repo is just a snapshot of the work we did in the Fall 2021 edition of the GSD-6338 course. Unfortunately, we do not have the resources to maintain, manage or extend it beyond what is available. If you want to ask questions, discuss standards, or contribute new code, features and other sexy goodness, please refer to Matt's repo or start your own fork. Crediting is always welcome, thank you!
Credits
Thanks to the superstar roster of TAs Indrajeet, Amelia and Jessica for putting together the tree and writing the foundational calibration code.
Project led by Jose Luis Garcia del Castillo y Lopez as part of our work at GSD-6338: Introduction to Computational Design at the Harvard University Graduate School of Design.
Original idea by Matt Parker from @standupmaths.