<h1 id="digital-marbling">Digital Marbling</h1>
<p><em>Amanda Ghassaei &middot; October 25, 2022</em></p>
<p>I’ve been working on a physics-based marbling simulation to explore the ways that the traditional craft of paper marbling can be augmented digitally. Paper marbling is a centuries-old craft that uses the movement of water to create swirling patterns of inks and paints on paper (and sometimes fabric). The earliest accounts of marbling date back to the 12th century in Japan, where it is known as “suminagashi.” Inks were floated on top of water and manipulated into delicate, flowing shapes using breath, fans, and other utensils. Later, paper marbling traditions emerged in the Middle East and Europe, making use of more <a href="https://www.youtube.com/watch?v=Vyga8VMWXKg">viscous media and fine-toothed combs</a> to create repeating patterns with greater regularity and control.</p>
<figure class="image">
<img src="uw_examples.jpg" width="" height="" alt="Examples of combed marbling from the University of Washington Digital Collections. Marbled paper was traditionally used in bookbinding as a decorative endpaper." /><figcaption>Examples of combed marbling from the <a href="https://digitalcollections.lib.washington.edu/digital/collection/dp/search">University of Washington Digital Collections</a>. Marbled paper was traditionally used in bookbinding as a decorative endpaper.</figcaption></figure>
<p>I built an early prototype of a marbling-inspired fluid simulation back in 2017, when I was first learning to write physics simulations with WebGL; interactive demo below:</p>
<div style="padding:56.25% 0 0 0;position:relative;"><iframe src="https://apps.amandaghassaei.com/marbling-experiment/" style="position:absolute;top:0;left:0;width:100%;height:100%;background-color:#cccccc;" frameborder="0"></iframe></div>
<p>Over the past year I’ve been revisiting this work and exploring computational methods to more accurately simulate the marbling process. Recently, I even had the opportunity to collaborate with Jessica and Jesse at <a href="https://n-e-r-v-o-u-s.com/">Nervous System</a> (whose work I am a huge admirer of) on a series of <a href="https://n-e-r-v-o-u-s.com/shop/search_tags.php?search=marbling">marbling infinity puzzles</a>:</p>
<div style="padding:56.25% 0 0 0;position:relative;"><iframe src="https://player.vimeo.com/video/762296510?h=4bdd9f5885&title=0&byline=0&portrait=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" frameborder="0" allow="autoplay; fullscreen; picture-in-picture" allowfullscreen=""></iframe></div>
<script src="https://player.vimeo.com/api/player.js"></script>
<p>I’ve learned a lot in the process of building this simulation, and I wanted to use this post to compile some of the main ideas in one place. You can find additional info about how the marbling puzzles were designed in Nervous System’s <a href="https://n-e-r-v-o-u-s.com/blog/?p=9225">blog post</a>. This collaboration was also featured in the <a href="https://www.nytimes.com/2022/12/09/science/puzzles-jigsaw-math.html">New York Times</a>.</p>
<h2 id="marbling-simulation">Marbling Simulation</h2>
<p>In order to create a marbling simulation, you need to model the motion of fluids. Typically, this is achieved by running a physics-based fluid solver and iteratively moving virtual inks along the flow. Combing actions and other manipulations are modeled as forces applied to the fluid’s velocity field. Due to the high viscosity of the fluid media (called “size”), the colored inks of a marbling pattern tend to warp and stretch rather than diffuse and mix with each other in a turbulent manner; this behavior helps to differentiate a marbling simulation from other types of fluid simulations (e.g. smoke simulations). Also, because the underlying fluids are “incompressible”, the inks move in ways that preserve their surface area.</p>
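Concretely, each frame the solver advects the ink field through the current velocity field (with a pressure projection keeping the velocity divergence-free, not shown here). My simulations run in WebGL; purely as an illustration, here is a minimal NumPy sketch of semi-Lagrangian advection on a periodic grid (all names are mine, not taken from the app's code):

```python
import numpy as np

def advect(field, vx, vy, dt):
    """Semi-Lagrangian advection: trace each grid cell backwards along
    the velocity field and sample the ink found at the source point."""
    h, w = field.shape
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    # Backtrace, wrapping coordinates so the domain tiles periodically.
    src_x = np.round(xs - dt * vx).astype(int) % w
    src_y = np.round(ys - dt * vy).astype(int) % h
    # Nearest-neighbor sampling keeps the sketch short; a real solver
    # would interpolate (bilinearly or better) at the backtraced point.
    return field[src_y, src_x]
```

A uniform rightward velocity simply shifts the ink one cell per unit time, which makes the backtracing easy to sanity-check.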
<p>To start I read up on work by <a href="https://people.csail.mit.edu/jaffer/Marbling/">Aubrey Jaffer</a>, including the paper <a href="https://www.cad.zju.edu.cn/home/jin/cga2012/mmarbling.pdf">Mathematical Marbling</a> (Lu, Jaffer <em>et al.</em> 2012). The main idea behind Mathematical Marbling is that the combing actions in paper marbling can be modeled by closed-form mathematical transformations; in other words, they argue that you don’t actually <em>need</em> to run a full-blown fluid solver to accurately simulate a marbled pattern. They show some nice results in their paper, which they’re able to compute extremely quickly because their method is direct rather than iterative.</p>
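For example, the paper models a single comb tooth ("tine") dragged in a straight line as a closed-form displacement: every point moves parallel to the tine's path, by an amount that decays with perpendicular distance from it. Any such motion is a pure shear, so it preserves area, as marbling requires. A rough NumPy sketch (parameter names and the exact falloff below are illustrative; see the paper for the precise functions used):

```python
import numpy as np

def tine_line(points, anchor, direction, alpha=1.0, lam=4.0):
    """Closed-form tine stroke: shear all points parallel to a line
    through `anchor` along `direction`. The displacement magnitude
    alpha * lam / (d + lam) decays with perpendicular distance d, so
    points on the line move by alpha and far points barely move.
    Because the motion is a shear, it preserves area exactly."""
    m = np.asarray(direction, float)
    m = m / np.linalg.norm(m)            # unit vector along the tine
    n = np.array([-m[1], m[0]])          # unit normal to the tine line
    d = np.abs((points - anchor) @ n)    # perpendicular distance
    return points + np.outer(alpha * lam / (d + lam), m)
```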
<p>After implementing Mathematical Marbling in WebGL, I decided to go with a hybrid approach that uses some of the Mathematical Marbling transformations, but ultimately lets a fluid solver kick in to create more dynamic and life-like results. This allowed me to experiment with freeform patterns that were made using a combination of combing and <a href="https://www.cs.ubc.ca/~rbridson/docs/bridson-siggraph2007-curlnoise.pdf">curl-noise</a> (Bridson <em>et al.</em> 2007), giving the appearance of turbulent air blowing across the surface of the pattern:</p>
<div style="padding:73.96% 0 0 0;position:relative;"><iframe src="https://player.vimeo.com/video/763259068?h=d6a6272841&loop=1&title=0&byline=0&portrait=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" frameborder="0" allow="autoplay; fullscreen; picture-in-picture" allowfullscreen=""></iframe></div>
<script src="https://player.vimeo.com/api/player.js"></script>
<div style="padding:73.96% 0 0 0;position:relative;"><iframe src="https://player.vimeo.com/video/766735673?h=8abfe2a736&title=0&byline=0&portrait=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" frameborder="0" allow="autoplay; fullscreen; picture-in-picture" allowfullscreen=""></iframe></div>
<script src="https://player.vimeo.com/api/player.js"></script>
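Curl noise, used above to mimic turbulent air, derives its velocity field by taking the curl of a scalar noise potential ψ: in 2D, v = (∂ψ/∂y, −∂ψ/∂x), which is divergence-free (incompressible) by construction. A hedged NumPy sketch; Bridson et al. build ψ from smooth Perlin noise, but any potential demonstrates the incompressibility property:

```python
import numpy as np

def curl_noise(psi):
    """2D curl noise: v = (d(psi)/dy, -d(psi)/dx) for a scalar
    potential psi, so div(v) = 0 by construction. Central
    differences on a periodic grid, spacing 1."""
    dpsi_dy = (np.roll(psi, -1, axis=0) - np.roll(psi, 1, axis=0)) / 2.0
    dpsi_dx = (np.roll(psi, -1, axis=1) - np.roll(psi, 1, axis=1)) / 2.0
    return dpsi_dy, -dpsi_dx

# Real curl noise uses smooth Perlin/simplex noise octaves for psi;
# random values suffice to check the divergence-free property.
psi = np.random.default_rng(0).standard_normal((64, 64))
vx, vy = curl_noise(psi)
div = (np.roll(vx, -1, axis=1) - np.roll(vx, 1, axis=1)) / 2.0 \
    + (np.roll(vy, -1, axis=0) - np.roll(vy, 1, axis=0)) / 2.0
```

Because the discrete x- and y-difference operators commute, the measured divergence is zero to floating-point precision.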
<figure class="image">
<img src="marbling10.jpg" width="" height="" alt="Noisy bouquet pattern in shades of blue. By adding noise to the underlying fluid simulation, it's possible to create irregular patterns that have undergone some distortion due to fluid motions." /><figcaption>Noisy bouquet pattern in shades of blue. By adding noise to the underlying fluid simulation, it's possible to create irregular patterns that have undergone some distortion due to fluid motions.</figcaption></figure>
<p>I like the playfulness that interacting directly with the underlying physics simulation allows. It’s fun to watch patterns warp and stretch as the dynamics of the fluid take over:</p>
<div style="padding:75% 0 0 0;position:relative;"><iframe src="https://player.vimeo.com/video/763460226?h=010ae1ed8d&loop=1&title=0&byline=0&portrait=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" frameborder="0" allow="autoplay; fullscreen; picture-in-picture" allowfullscreen=""></iframe></div>
<script src="https://player.vimeo.com/api/player.js"></script>
<h2 id="pattern-exploration">Pattern Exploration</h2>
<!-- <figure class="image">
<img src="medium_multicolor_noise_spiked.jpg"
width=""
height=""
alt=""
></figure> -->
<p>Even though marbling techniques have been around for centuries, there is still plenty to explore when it comes to pattern generation. I’ve really only just started to scratch the surface here. My marbling simulation allows me to line up combs very precisely and apply multiple combs at once, so it’s possible to experiment with patterns that would be impractical to recreate in real life. Several of the combing patterns we used for the infinity puzzles are brand new (as far as I know) variations on traditional marbling techniques, such as <a href="http://marbleart.us/Peacock-Bouquet.htm">bouquet</a>, <a href="http://marbleart.us/BirdWing.htm">birdwing</a>, and <a href="http://marbleart.us/Thistle.htm">thistle</a>. The pattern I’m most excited about at the moment is a “two-way” variation on the popular bouquet pattern:</p>
<div style="padding:75% 0 0 0;position:relative;"><iframe src="https://player.vimeo.com/video/763463120?h=8b8b98114e&loop=1&title=0&byline=0&portrait=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" frameborder="0" allow="autoplay; fullscreen; picture-in-picture" allowfullscreen=""></iframe></div>
<script src="https://player.vimeo.com/api/player.js"></script>
<figure class="image">
<img src="bouquet2way.jpg" width="" height="" alt="The two-way bouquet pattern uses opposing sets of combs to form a bi-directional variation on the classic bouquet pattern. Unlike classic bouquet, there is no clear &quot;up&quot; direction to this pattern." /><figcaption>The two-way bouquet pattern uses opposing sets of combs to form a bi-directional variation on the classic bouquet pattern. Unlike classic bouquet, there is no clear "up" direction to this pattern.</figcaption></figure>
<p>Similarly, the two-way birdwing pattern contains scalloped edges in both directions. The version below contains additional vertical combing to sharpen the peaks of the larger-scale features in the design:</p>
<figure class="image">
<img src="large_teal_pink_birdwing_scalloped.jpg" width="" height="" alt="" /></figure>
<p>The dovetail pattern is a simple variation on bouquet that uses vertical combs to enhance alternating features of the pattern. The resulting shape reminds me of Islamic architecture:</p>
<div style="padding:73.96% 0 0 0;position:relative;"><iframe src="https://player.vimeo.com/video/763303590?h=80acb25271&loop=1&title=0&byline=0&portrait=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" frameborder="0" allow="autoplay; fullscreen; picture-in-picture" allowfullscreen=""></iframe></div>
<script src="https://player.vimeo.com/api/player.js"></script>
<!-- <figure class="image">
<img src="large_jeweltone_dovetail.jpg"
width=""
height=""
alt="A simple variation on bouquet that uses soft vertical combing to "
><figcaption>A simple variation on bouquet that uses soft vertical combing to </figcaption></figure> -->
<p>A unique property of digital marbling that we leveraged in the design of the infinity puzzles is the ability to create patterns that tile perfectly in space. The resulting puzzles have no edges, and the pieces fit together in a multitude of configurations. It would be interesting to merge marbling with other ideas around tilings (e.g. <a href="https://en.wikipedia.org/wiki/Truchet_tiles">Truchet tilings</a>, <a href="https://en.wikipedia.org/wiki/Uniform_tilings_in_hyperbolic_plane">hyperbolic tilings</a>) to generate novel combing patterns as well.
<!-- Additionally, most (all?) of the traditional marbling patterns are based on orthogonal combing in x and y, but there may be interesting patterns created by triaxial combing. --></p>
<figure class="image">
<img src="small_greens_blues_noise_spiked.jpg" width="" height="" alt="A 2x2 tiling of a spiked marbling pattern." /><figcaption>A 2x2 tiling of a spiked marbling pattern.</figcaption></figure>
<!-- <figure class="image">
<img src="small_spring_dovetail_2way.jpg"
width=""
height=""
alt="A 2x2 tiling of a bouquet two-way variation."
><figcaption>A 2x2 tiling of a bouquet two-way variation.</figcaption></figure> -->
<!-- <figure class="image">
<img src="large_greens_blues_noise_scalloped.jpg"
width=""
height=""
alt=""
></figure> -->
<h2 id="ink-drop-simulation">Ink Drop Simulation</h2>
<p>I also explored the behavior of individual drops of ink. Jaffer has <a href="https://people.csail.mit.edu/jaffer/Marbling/Dropping-Paint">an excellent page</a> detailing the math behind dropping paint/ink on the surface of water, and the resulting transformation that a single drop has on the rest of a marbled pattern. Again, the trick here is that each ink drop should induce an area-preserving transformation on the surrounding regions. So if many drops are used to create a ringed pattern, the rings will slowly increase in radius and decrease in thickness as they move outward. I used this technique as the basis for a suminagashi simulation:</p>
<div style="padding:100% 0 0 0;position:relative;"><iframe src="https://player.vimeo.com/video/763243459?h=80acb25271&title=0&byline=0&portrait=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" frameborder="0" allow="autoplay; fullscreen; picture-in-picture" allowfullscreen=""></iframe></div>
<script src="https://player.vimeo.com/api/player.js"></script>
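The drop transform itself is compact enough to state: a drop of radius r placed at C pushes every existing point P radially outward to distance sqrt(|P−C|² + r²) from C. The annulus between radii a and b maps to radii sqrt(a² + r²) and sqrt(b² + r²), so its area π(b² − a²) is unchanged, which is exactly the area-preservation property described above. In code (my naming; the formula is from Jaffer's page):

```python
import math

def drop_transform(p, c, r):
    """Marbling drop transform: a new drop of radius r at c pushes an
    existing point p radially outward to distance sqrt(d^2 + r^2)
    from c, where d = |p - c|. Outside the drop this preserves the
    area of every region."""
    dx, dy = p[0] - c[0], p[1] - c[1]
    d = math.hypot(dx, dy)
    s = math.sqrt(d * d + r * r) / d
    return (c[0] + dx * s, c[1] + dy * s)
```

For example, with r = 4 a point at distance 3 lands at distance 5 (a 3-4-5 triangle), and the annulus between radii 3 and 6 keeps its area.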
<figure class="image">
<img src="sumi_array.jpg" width="" height="" alt="Suminagashi reference prints I made over the course of this project. Suminagashi is a fun and easy way to get started with paper marbling using inexpensive materials you probably already have on hand. These prints were made with newsprint, black ink, and dish soap using a plastic storage bin filled with plain water." /><figcaption>Suminagashi reference prints I made over the course of this project. Suminagashi is a fun and easy way to get started with paper marbling using inexpensive materials you probably already have on hand. These prints were made with newsprint, black ink, and dish soap using a plastic storage bin filled with plain water.</figcaption></figure>
<p>By dropping multiple drops at once, you can simulate a <a href="http://marbleart.us/StoneMarble.htm">“stone” pattern</a>, which is typically used as a starting point for combed patterns. You’ll notice that the following simulation violates incompressibility a bit, as there are regions between the drops that appear to shrink. Jaffer’s area-preserving technique works great in an infinite domain where the inks can simply flow off the edge of the screen, but the simulation below is meant to tile in space, so I had to come up with some way to loop the simulation in x and y while still preserving the correct local behavior of the drops. Due to the area expansion occurring at each drop’s center, there has to be contraction elsewhere in the pattern:</p>
<div style="padding:100% 0 0 0;position:relative;"><iframe src="https://player.vimeo.com/video/763261287?h=b557490fee&title=0&byline=0&portrait=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" frameborder="0" allow="autoplay; fullscreen; picture-in-picture" allowfullscreen=""></iframe></div>
<script src="https://player.vimeo.com/api/player.js"></script>
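The post doesn't spell out the looping scheme, so the following is only a guess at one simple approach: when applying a drop's transform on a tiling domain, measure displacements from the drop center's nearest periodic image, so the deformation field is continuous across tile seams (names and method here are hypothetical, not necessarily what the author did):

```python
def nearest_image(p, c, size):
    """Return the periodic copy of drop center c that is closest to
    point p in a size-by-size tiling domain. Hypothetical helper: the
    post does not specify how the drop transforms were looped."""
    cx = c[0] + size * round((p[0] - c[0]) / size)
    cy = c[1] + size * round((p[1] - c[1]) / size)
    return (cx, cy)
```

A point near the right seam of a unit tile then sees a drop near the left seam as sitting just past the right edge, rather than a full tile away.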
<p>This contraction behavior is actually somewhat realistic. In a finite-sized tray you will eventually notice ink patterns begin to shrink and become more saturated as additional drops of ink are added. You’ll also notice that the more ink drops you add, the more slowly each new drop will expand across the surface of the water. It seems that the thickness of the ink layer on top of the water increases with each extra drop of ink added, which causes the entire pattern to gradually contract. This change in thickness is not explicitly modeled in my or Jaffer’s methods, but can easily be added by scaling the effect of the ink drops.</p>
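As a purely illustrative toy of that scaling idea (the functional form and constant below are my invention, not a physical model and not the author's code), the effective radius of each new drop could be damped by the ink already on the surface:

```python
def effective_radius(r, total_ink, k=0.05):
    """Damp the effective radius of a new drop's transform as ink
    accumulates on the surface. The 1 / (1 + k * total_ink) falloff
    is an illustrative guess, not a derived law."""
    return r / (1.0 + k * total_ink)
```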
<h2 id="challenges">Challenges</h2>
<p>One of the trickiest technical aspects of creating this marbling simulation was preserving crisp boundaries between colors in the combed pattern—an important feature of paper marbling. I had been using the <a href="https://www.dgp.toronto.edu/public_user/stam/reality/Research/pdf/ns.pdf">Stable Fluids</a> (Stam, 1999) method, which tends to create excessive blurring the longer the simulation runs, so I wrote a new fluid solver based on <a href="https://www.seas.upenn.edu/~ziyinq/static/files/bimocq.pdf">BiMocq2: Efficient and Conservative Fluids Using Bidirectional Mapping</a> (Qu <em>et al.</em> 2019), which uses some tricks to significantly reduce this blurring with surprisingly little computational overhead.</p>
<div style="padding:52.22% 0 0 0;position:relative;"><iframe src="https://player.vimeo.com/video/763271987?h=4c66c7a3de&title=0&byline=0&portrait=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" frameborder="0" allow="autoplay; fullscreen; picture-in-picture" allowfullscreen=""></iframe></div>
<script src="https://player.vimeo.com/api/player.js"></script>
<p>The simulation above shows a comparison of Stable Fluids (left) and BiMocq2 (right) solving a system exhibiting <a href="https://en.wikipedia.org/wiki/Rayleigh%E2%80%93Taylor_instability">Rayleigh–Taylor instability</a>, both implemented in WebGL using the same time step, grid resolution, and initial conditions. The difference between the methods is staggering. Not only does BiMocq2 preserve cleaner boundaries between the white and black fluids, but it also reduces damping and preserves details in the underlying velocity field, as evidenced by the numerous fractal-like vortices that emerge in the flow. These velocity field details are not so important in the context of traditional marbling, where we’re generally concerned with laminar flows of more viscous fluids, but could be interesting to explore in future work.</p>
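BiMocq2's long-lived bidirectional mappings are beyond a short snippet, but the error-compensation idea behind this family of methods can be shown in 1D with a simpler, related scheme, BFECC (back and forth error compensation and correction): advect forward, advect backward, and use the round-trip error to cancel the scheme's own diffusion. This is an illustrative sketch, not the post's solver:

```python
import numpy as np

def advect1d(phi, v, dt):
    """1-D semi-Lagrangian advection (linear interpolation, periodic
    grid with spacing 1)."""
    n = len(phi)
    x = (np.arange(n) - v * dt) % n          # backtraced positions
    i0 = np.floor(x).astype(int)
    f = x - i0
    return (1 - f) * phi[i0 % n] + f * phi[(i0 + 1) % n]

def bfecc_step(phi, v, dt):
    """BFECC: a forward-then-backward round trip should return phi
    unchanged; the measured discrepancy estimates the scheme's
    diffusion error, half of which is subtracted before the final
    forward advection."""
    fwd = advect1d(phi, v, dt)
    back = advect1d(fwd, -v, dt)
    return advect1d(phi + 0.5 * (phi - back), v, dt)
```

Advecting a Gaussian bump through many fractional-cell steps, BFECC stays visibly sharper than plain semi-Lagrangian advection, which is the same qualitative gap as in the video above.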
<h2 id="future-work">Future Work</h2>
<p>Looking forward, I’m interested in how digital marbling could unlock new aesthetics that make use of turbulent and ephemeral behaviors of fluids, which would normally be very difficult to transfer to paper:</p>
<figure class="image">
<img src="turbulent.jpg" width="" height="" alt="" /></figure>
<figure class="image">
<img src="turbulent2.jpg" width="" height="" alt="Patterns reminiscent of the swirling clouds of Jupiter, generated via a very high resolution BiMocq2 simulation." /><figcaption>Patterns reminiscent of the swirling clouds of Jupiter, generated via a very high resolution BiMocq2 simulation.</figcaption></figure>
<figure class="image">
<img src="vortices.jpg" width="" height="" alt="Early explorations combining digital marbling with von Kármán vortex streets." /><figcaption>Early explorations combining digital marbling with <a href="https://en.wikipedia.org/wiki/K%C3%A1rm%C3%A1n_vortex_street">von Kármán vortex streets</a>.</figcaption></figure>
<div style="padding:52.03% 0 0 0;position:relative;"><iframe src="https://player.vimeo.com/video/680566776?h=543eaa97a2&title=0&byline=0&portrait=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" frameborder="0" allow="autoplay; fullscreen; picture-in-picture" allowfullscreen=""></iframe></div>
<script src="https://player.vimeo.com/api/player.js"></script>
<p>Throughout the development process, it’s been interesting to explore how random bugs in the code change the behavior of the system in unexpected ways:</p>
<div style="padding:52.03% 0 0 0;position:relative;"><iframe src="https://player.vimeo.com/video/688625186?h=103dae69e6&title=0&byline=0&portrait=0&loop=1" style="position:absolute;top:0;left:0;width:100%;height:100%;" frameborder="0" allow="autoplay; fullscreen; picture-in-picture" allowfullscreen=""></iframe></div>
<script src="https://player.vimeo.com/api/player.js"></script>
<figure class="image">
<img src="tile7.jpg" width="" height="" alt="" /></figure>
<p>I loved the way this unintended sloshing motion generated surprising shapes:</p>
<div style="padding:49.11% 0 0 0;position:relative;"><iframe src="https://player.vimeo.com/video/763605852?h=8e13378fe4&loop=1&title=0&byline=0&portrait=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" frameborder="0" allow="autoplay; fullscreen; picture-in-picture" allowfullscreen=""></iframe></div>
<script src="https://player.vimeo.com/api/player.js"></script>
<p>The current iteration of the marbling simulation app is still a bit too much of a hacky research prototype to be released publicly, but I’m hoping to find some time to wrap up this work and release something in the near-ish future. Mainly, I want to add some optimizations to the simulator to create better real-time performance for larger patterns (the app crashed my computer on several occasions!). Though the current prototype has some “undo” capabilities, it would be extremely helpful to also be able to edit previous combing operations while viewing their effect on the final marbled pattern—this would allow you to more easily control the amount of color mixing in the final design.</p>
<figure class="image">
<img src="ui.jpg" width="" height="" alt="Current UI of the marbling simulator research prototype." /><figcaption>Current UI of the marbling simulator research prototype.</figcaption></figure>
<p>In the process I also created a simple file format for importing marbling combing patterns into my app; this allowed for easier interop with other programs I was using. It might also be interesting to generate velocity fields to apply directly to the fluid simulation; there is no reason you have to stick with combs when you’re working digitally.</p>
<figure class="image">
<img src="simplecombdesign.jpg" width="" height="" alt="Input combing pattern designed in a vector format." /><figcaption>Input combing pattern designed in a vector format.</figcaption></figure>
<p>This vector format uses stroke-width to roughly map to the diameter of the comb’s teeth (also known as “tines”). Taken to the extreme, you can create patterns using combs with lots of tiny tines, which essentially behave like bristles on a brush:</p>
<figure class="image">
<img src="brushed1.jpg" width="" height="" alt="Variation on a bouquet marbling pattern in a brushed style." /><figcaption>Variation on a bouquet marbling pattern in a brushed style.</figcaption></figure>
<h2 id="further-reading">Further Reading</h2>
<ul>
<li>
<p><strong><a href="https://amzn.to/3f5GmTI">The Art of Fluid Animation (2004)</a></strong> by Jos Stam – An introduction to the concepts behind fluid animation, building up to an implementation of the <a href="https://www.dgp.toronto.edu/public_user/stam/reality/Research/pdf/ns.pdf">Stable Fluids</a> paper.</p>
</li>
<li>
<p><strong><a href="https://jamie-wong.com/2016/08/05/webgl-fluid-simulation/">Fluid Simulation (with WebGL demo)</a></strong> by Jamie Wong – The most accessible explanation of the Stable Fluids method that I’ve found. Includes many interactive widgets to help you understand each component of the method.</p>
</li>
<li>
<p><strong><a href="https://people.csail.mit.edu/jaffer/Marbling/How-To">pst-marble</a></strong> by Aubrey Jaffer, Jürgen Gilg, and Manuel Luque – I have not had a chance to try this, but it appears to be a library for creating marbling patterns in LaTeX, based on <a href="https://www.cad.zju.edu.cn/home/jin/cga2012/mmarbling.pdf">Mathematical Marbling</a>.</p>
</li>
<li>
<p><strong><a href="https://n-e-r-v-o-u-s.com/blog/?p=9225">Marbling Infinity Puzzles blog post</a></strong> by Nervous System – Additional information about the design of the Marbling Infinity Puzzles.</p>
</li>
<li>
<p><strong><a href="https://github.com/amandaghassaei/gpu-io">gpu-io</a></strong> – A WebGL GPU computing library for running physics simulations that I’ve just recently released (soon to be announced). I’ve been building all of my digital marbling simulations (and many other things) on top of this library.</p>
</li>
<li>
<p><strong><a href="http://marbled-paper.glitch.me/">Marbled Paper App</a></strong> by Jonas Luebbers – A WebGL implementation of Mathematical Marbling.</p>
</li>
</ul>

<h1 id="the-recursive-universe">The Recursive Universe</h1>
<p><em>Amanda Ghassaei &middot; May 1, 2020</em></p>
<p>A few years ago I came across <a href="https://www.youtube.com/watch?v=xP5-iIeKXE8">this video</a>, showing a complex machine built entirely in Conway’s Game of Life:</p>
<figure class="image">
<img src="metapixel.gif" width="" height="" alt="" /></figure>
<p>The purpose of the machine is to emulate a single Life pixel. With a big enough matrix of these “metapixels”, you can simulate a meta-version of Life on a massive scale. From there you could create a meta-metapixel out of metapixels and so on….</p>
<p>At the time I was reading the book <a href="https://www.goodreads.com/book/show/301563.The_Recursive_Universe">The Recursive Universe</a> by William Poundstone, which gives a detailed breakdown of John Conway’s 1982 proof of self-replicating objects in Life. Since its inception in 1970, Conway’s Game of Life has developed a cult following of researchers, engineers, and hobbyists, pushing each other to construct increasingly elaborate “machines” from pixels on a screen. These investigations have resulted in a <a href="https://bitstorm.org/gameoflife/lexicon/">lengthy taxonomy</a> of motifs, reactions, and mechanisms, as well as engineering principles and design abstractions. These days you can spend a long time on YouTube exploring the <a href="https://youtu.be/C2vgICfQawE?t=71">seemingly impossible things</a> people are designing in the “simple” Life universe.</p>
<p>This post is (mostly) some notes I took back in 2015 while trying to understand how this metapixel was designed. Unfortunately, John Conway <a href="https://www.nytimes.com/2020/04/15/technology/john-horton-conway-dead-coronavirus.html">passed away recently</a>, which got me thinking about Life again (not to sound too sappy). Now that I find myself with a lot of time on my hands, I figured I’d clean up these notes and finally kick this blog off (inaugural post!).</p>
<p>Here we go….</p>
<h2 id="game-of-life">Game of Life</h2>
<p><a href="http://en.wikipedia.org/wiki/Conway%27s_Game_of_Life">Conway’s Game of Life</a> is a 2D <a href="http://en.wikipedia.org/wiki/Cellular_automaton">cellular automaton</a> - a simulated world set on a grid of pixels (cells). Though the rules that govern Life are very simple, the results can become quite complex. In the Game of Life, a cell’s behavior is dictated by its current state (alive or dead) and the state of its eight nearest neighbors. The rules of Life are loosely based on population dynamics (copied from wikipedia below):</p>
<ul>
<li>Any live cell with fewer than two live neighbors dies, as if caused by under-population.</li>
<li>Any live cell with two or three live neighbors lives on to the next generation.</li>
<li>Any live cell with more than three live neighbors dies, as if by overcrowding.</li>
<li>Any dead cell with exactly three live neighbors becomes a live cell, as if by reproduction.</li>
</ul>
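The four rules reduce to a single table lookup on a neighbor count, which makes a compact implementation possible. A NumPy version on a periodic grid (my code, shown only to make the rules concrete):

```python
import numpy as np

def life_step(grid):
    """One generation of Conway's Game of Life on a periodic boolean
    grid: birth on exactly 3 live neighbors, survival on 2 or 3."""
    neighbors = sum(
        np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
        for dy in (-1, 0, 1) for dx in (-1, 0, 1)
        if (dy, dx) != (0, 0)
    )
    return (neighbors == 3) | (grid & (neighbors == 2))
```

Running it on a "blinker" (three cells in a row, described below) rotates the row to a column and back, a period-2 oscillator.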
<p>One of the most interesting things about Life is the variety of patterns that can be constructed within it. Some patterns are readily observed in the wild – spontaneously emerging from a <a href="http://apps.amandaghassaei.com/ConwayShader/">random soup of living and dead cells</a>. Other patterns were meticulously designed by people, often built from smaller subunits with well-characterized behavior. In general, patterns in Life can be broken into a few broad categories, described below. (image source <a href="http://en.wikipedia.org/wiki/Conway%27s_Game_of_Life">wikipedia</a>)</p>
<p>Stable configurations, called “still lifes”, do not change over time:</p>
<figure class="image">
<img src="block.svg" width="" height="" alt="block" /><figcaption>block</figcaption></figure>
<figure class="image">
<img src="loaf.svg" width="" height="" alt="loaf" /><figcaption>loaf</figcaption></figure>
<figure class="image">
<img src="boat.svg" width="" height="" alt="boat" /><figcaption>boat</figcaption></figure>
<figure class="image">
<img src="beehive.svg" width="" height="" alt="beehive" /><figcaption>beehive</figcaption></figure>
<p>“Oscillators” are dynamic patterns that repeat themselves after a certain number of time steps:</p>
<figure class="image">
<img src="blinker.gif" width="" height="" alt="blinker (period 2)" /><figcaption>blinker (period 2)</figcaption></figure>
<figure class="image">
<img src="beacon.gif" width="" height="" alt="beacon (period 2)" /><figcaption>beacon (period 2)</figcaption></figure>
<figure class="image">
<img src="pulsar.gif" width="" height="" alt="pulsar (period 3)" /><figcaption>pulsar (period 3)</figcaption></figure>
<figure class="image">
<img src="pentadecathlon.gif" width="" height="" alt="pentadecathlon (period 15)" /><figcaption>pentadecathlon (period 15)</figcaption></figure>
<p>A “spaceship” is a type of oscillator that moves across space as it oscillates. The simplest and most common spaceship (“common” here means it will often arise in the wild) is called a “glider”:</p>
<figure class="image">
<img src="glider.gif" width="" height="" alt="glider (period 4)" /><figcaption>glider (period 4)</figcaption></figure>
<p>Other common spaceships are the light-weight spaceship, and its siblings, the middle-weight and heavy-weight varieties:</p>
<figure class="image">
<img src="lwss.gif" width="" height="" alt="light-weight spaceship &quot;LWSS&quot; (period 4)" /><figcaption>light-weight spaceship "LWSS" (period 4)</figcaption></figure>
<figure class="image">
<img src="mwss.gif" width="" height="" alt="middle-weight spaceship &quot;MWSS&quot; (period 4)" /><figcaption>middle-weight spaceship "MWSS" (period 4)</figcaption></figure>
<figure class="image">
<img src="hwss.gif" width="" height="" alt="heavy-weight spaceship &quot;HWSS&quot; (period 4)" /><figcaption>heavy-weight spaceship "HWSS" (period 4)</figcaption></figure>
<p>“Guns” are structures that produce a stream of spaceships; Gosper’s glider gun was the first gun ever discovered:</p>
<figure class="image">
<img src="gospergun.gif" width="" height="" alt="Gosper's glider gun" /><figcaption>Gosper's glider gun</figcaption></figure>
<p>“Reactions” are collisions of Life objects that produce useful outcomes. <a href="https://www.conwaylife.com/wiki/Glider_synthesis">Glider Synthesis</a> is a process by which Life patterns are created solely through collisions of gliders. Glider synthesis forms a large subset of known and actively studied Life reactions, and it was a critical piece of Conway’s existence proof of a self-replicating machine in Life. Here is a 3 glider synthesis of a middle-weight spaceship:</p>
<figure class="image">
<img src="spaceship_synthesis.gif" width="" height="" alt="3 glider synthesis of MWSS" /><figcaption>3 glider synthesis of MWSS</figcaption></figure>
<p>Using combinations of spaceships, still lifes, oscillators, guns, and reactions between them, it’s possible to construct incredibly complex machines in Life. Here’s an example of a <a href="http://www.conwaylife.com/wiki/P416_60P5H2V0_gun">period 416 gun that constructs 60P5H2V0 spaceships</a> using a series of timed glider collisions:</p>
<iframe width="600" height="450" src="//player.vimeo.com/video/5428232?title=0&byline=0&portrait=0" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen=""></iframe>
<p>Other notable engineering accomplishments within Life include <a href="https://www.youtube.com/watch?v=A8B5MbHPlH0">Gemini</a> (a spaceship encoded by a long glider tape), the <a href="https://www.conwaylife.com/wiki/Linear_propagator">Linear Propagator</a> (a self-replicating machine), a <a href="http://rendell-attic.org/gol/utm/index.htm">Turing Machine</a>, the <a href="http://www.igblan.free-online.co.uk/igblan/ca/">Minsky Register Machine</a> (a finite universal computer), and the <a href="http://otcametapixel.blogspot.com/2006/05/how-does-it-work.html">OTCA Metapixel</a> (a structure that behaves like a large-scale Life pixel, and the subject of the rest of this article).</p>
<h2 id="otca-metapixel">OTCA Metapixel</h2>
<figure class="image">
<img src="metapixel_galaxy.jpg" width="" height="" alt="Nine time steps of a 15x15 array of OTCA metapixels depicting a period 8 oscillator called &quot;Kok’s Galaxy&quot;. Each time-step represents 35,328 generations of Life; the entire 8 step sequence takes 282,624 generations to complete. Each metapixel occupies 2058x2058 Life cells; the complete 15x15 metapixel array totals 30,800x30,800 Life cells (accounting for a 5 cell overlap between adjacent metapixels). Look closely and you will see that each metapixel is obeying Conway's rules." /><figcaption>Nine time steps of a 15x15 array of OTCA metapixels depicting a period 8 oscillator called "Kok’s Galaxy". Each time-step represents 35,328 generations of Life; the entire 8 step sequence takes 282,624 generations to complete. Each metapixel occupies 2058x2058 Life cells; the complete 15x15 metapixel array totals 30,800x30,800 Life cells (accounting for a 5 cell overlap between adjacent metapixels). Look closely and you will see that each metapixel is obeying Conway's rules.</figcaption></figure>
<p>In 2006 Brice Due published the <a href="http://www.conwaylife.com/wiki/OTCA_metapixel">OTCA Metapixel</a>, a 2058x2058 cell structure that emulates the behavior of a single Life cell. The OTCA Metapixel isn’t the first, smallest, or fastest metapixel designed in Life (also check out the <a href="http://www.conwaylife.com/wiki/P5760_unit_Life_cell">P5760 unit Life cell</a> and the <a href="http://www.conwaylife.com/wiki/Deep_cell">Deep cell</a>), but it has the interesting property of looking like a single Life cell when zoomed out. It also has registers that allow you to program its behavior according to any <a href="http://en.wikipedia.org/wiki/Life-like_cellular_automaton">Life-like cellular automaton</a> ruleset.</p>
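<p>A quick sanity check of the numbers quoted in the caption above (the side length, overlap, and generations-per-step figures come from the caption; the code below is just my arithmetic, not part of the original design):</p>

```python
# Dimensions of the 15x15 metapixel array shown above.
cell = 2058       # side length of one OTCA metapixel, in Life cells
overlap = 5       # cells shared between adjacent metapixels
n = 15            # metapixels per side of the array
side = n * cell - (n - 1) * overlap
print(side)       # 30800

# Eight metapixel time steps at 35,328 Life generations each:
print(8 * 35328)  # 282624
```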
<p>In the remainder of this article I’ll describe the inner workings of the metapixel. I got all my info from the <a href="http://www.conwaylife.com/wiki/OTCA_metapixel">Life Wiki</a>, the <a href="http://otcametapixel.blogspot.com/2006/05/how-does-it-work.html">OTCA site</a>, the <a href="http://www.bitstorm.org/gameoflife/lexicon/">Life Lexicon</a>, and (mostly) by downloading <a href="http://golly.sourceforge.net/">Golly</a> and watching the OTCA Metapixel run.</p>
<h2 id="clock">Clock</h2>
<p>Like most computer processors, the OTCA metapixel is driven by a central clock, which regulates the timing of its events. The clock is powered by a <a href="http://www.conwaylife.com/wiki/Tractor_beam">tractor beam</a>, a stream of <a href="http://www.conwaylife.com/wiki/Spaceships">spaceships</a> that slowly pulls an object toward its source (a simple example is the <a href="https://www.youtube.com/watch?v=xDVArWLXUIU">loaf tractor beam</a> – where a <a href="https://conwaylife.com/wiki/Loaf">loaf</a> is gradually pulled toward an incoming stream of <a href="http://www.conwaylife.com/wiki/LWSS">light-weight spaceships</a>).</p>
<iframe width="560" height="315" src="https://www.youtube.com/embed/fxxh27wYgew" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen=""></iframe>
<p>In the video above, a tractor beam pulls a <a href="http://www.conwaylife.com/wiki/Block">block</a> down by 8 cells in each collision, releasing a <a href="http://en.wikipedia.org/wiki/Glider_%28Conway%27s_Life%29">glider</a> to the right each time. A fence to the side of the tractor beam made of <a href="http://www.conwaylife.com/wiki/Eater_1">Eater 1</a>s destroys the gliders immediately, but there are a few holes in the fence that allow gliders to pass through and interact with other structures in the metapixel at precisely timed intervals.</p>
<p>At about <a href="https://youtu.be/fxxh27wYgew?t=57">0:57</a> in the video you can see the tractor beam firing three gliders past the fence into a structure called the “LWSS packet gun”; this releases three sets of three light-weight spaceships (LWSS), one set for each incoming glider. Later, at about <a href="https://youtu.be/fxxh27wYgew?t=141">2:21</a>, you can see four more gliders fired from the tractor beam into the metapixel. The first glider initiates a neighbor count decode sequence, the next two gliders read the results from the decoder, and the last glider initiates a logic cascade that eventually decides whether to turn the metapixel on or off in the next clock cycle (all described in more detail <a href="#determining-the-next-state">later</a>). The role of the 7 gliders is summed up in <a href="http://photos1.blogger.com/blogger/5525/3027/1600/Slide-22-linearclock.0.png">this image</a>.</p>
<p>Finally, the block is destroyed at the source of the tractor beam and restored in its starting position, beginning the cycle again.</p>
<h2 id="encoding-the-rules">Encoding the Rules</h2>
<p>The rules of the metapixel are encoded in two columns (like <a href="http://en.wikipedia.org/wiki/Hardware_register">hardware registers</a>) as shown in the images below:</p>
<figure class="image">
<img src="registers1.jpg" width="" height="" alt="" /></figure>
<p>Life-like automata rules are written in the form B#/S#, described <a href="http://en.wikipedia.org/wiki/Life-like_cellular_automaton#Notation_for_rules">here</a>. Conway’s rules are defined as B3/S23, so the corresponding bits of the registers of the OTCA metapixel are populated with one <a href="http://www.conwaylife.com/wiki/Eater_1">Eater-1</a> each:</p>
<figure class="image">
<img src="registers2.jpg" width="" height="" alt="" /></figure>
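<p>For readers who want to play with Life-like rules directly, here’s a minimal Python sketch (mine, not from the metapixel itself) that parses a B#/S# rulestring into birth and survival sets – the software analogue of the two registers – and advances one generation:</p>

```python
from collections import Counter

def parse_rule(rule):
    """Split a rulestring like 'B3/S23' into birth and survival neighbor counts."""
    birth, survival = rule.split("/")
    return {int(c) for c in birth[1:]}, {int(c) for c in survival[1:]}

def step(cells, rule="B3/S23"):
    """cells is a set of live (x, y) coordinates; returns the next generation."""
    birth, survival = parse_rule(rule)
    # Count the live Moore neighbors of every candidate cell.
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in cells
        for dx in (-1, 0, 1) for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    return {
        c for c, n in counts.items()
        if (n in survival if c in cells else n in birth)
    }

# A blinker oscillates with period 2 under Conway's B3/S23:
blinker = {(0, 0), (1, 0), (2, 0)}
assert step(step(blinker)) == blinker
```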
<h2 id="counting-neighbors">Counting Neighbors</h2>
<p>On each clock cycle three sets of three LWSSes (9 total) leave the LWSS packet gun (triggered by the clock, explained above) and complete a loop around the metapixel, passing by each of the metapixel’s <a href="http://en.wikipedia.org/wiki/Moore_neighborhood">Moore neighbors</a> on the way. As the LWSSes pass by each neighbor, the first ship in the train collides with a <a href="http://www.conwaylife.com/wiki/Pond">pond</a> in its path if that neighbor is currently in an “alive” state; this collision destroys the ship and decreases the total number of ships in the train by one. If the neighbor is not alive, then no pond is present and the ships go by unharmed. Since the train starts with 9 LWSSes, if a metapixel has 4 living neighbors then only 5 LWSSes will return after completing a pass around the perimeter. A collision of a train of LWSSes with a pond is shown below at about <a href="https://youtu.be/8Ec5vRRPMwI?t=26">0:26</a> (I’ll explain how the ponds get there <a href="#accessing-neighbor-state">later</a>):</p>
<iframe width="560" height="315" src="https://www.youtube.com/embed/8Ec5vRRPMwI" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen=""></iframe>
<p>This video also shows the shepherding of the train of LWSSes around the metapixel by <a href="http://www.conwaylife.com/wiki/Twin_bees_shuttle">twin bees shuttles</a> and other <a href="http://www.bitstorm.org/gameoflife/lexicon/#wk">reflectors</a>, and an example of “color adjustment” of the ships (changing their <a href="http://www.bitstorm.org/gameoflife/lexicon/#qr">phase</a> slightly) by a pair of glider guns. The path of the LWSSes and the location of color adjusters and possible ponds is shown in <a href="http://photos1.blogger.com/blogger/5525/3027/1600/Slide-24-lwsstrack.png">this image</a>.</p>
<p>The full path of the LWSS train is shown below. Notice how one ship is removed from the front of the train as it passes each living neighbor.</p>
<iframe width="560" height="315" src="https://www.youtube.com/embed/YlRjz0j_cmY" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen=""></iframe>
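<p>The counting scheme boils down to simple attrition, which can be sketched in a couple of lines (a toy model of the mechanism described above, not Life itself; the function names are mine):</p>

```python
TRAIN_LENGTH = 9  # three sets of three LWSSes leave the packet gun

def surviving_ships(neighbor_states):
    """Each living Moore neighbor's pond destroys the lead ship of the train."""
    return TRAIN_LENGTH - sum(neighbor_states)

def neighbor_count(surviving):
    """Invert the encoding: living neighbors = ships destroyed."""
    return TRAIN_LENGTH - surviving

# 4 of the 8 Moore neighbors alive -> 5 ships return, as in the text:
assert surviving_ships([1, 0, 1, 1, 0, 0, 1, 0]) == 5
assert neighbor_count(5) == 4
```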
<h2 id="comparing-neighbor-count-with-rules">Comparing Neighbor Count with Rules</h2>
<p>When the train of LWSSes completes a loop, they are channeled into a mechanism (sync buffer and p46 to p40 converter) where they are given the same <a href="http://www.bitstorm.org/gameoflife/lexicon/#qr">phase</a>. Then they are sent into the rules register, where they collide with another LWSS coming from the opposite direction (originating from one of the gliders shot out of the clock) to decode the train of LWSSes into the number of living neighbors (shown in <a href="http://photos1.blogger.com/blogger/5525/3027/1600/Slide-25-decode.png">this diagram</a>).</p>
<iframe width="560" height="315" src="https://www.youtube.com/embed/ltC2h2Yx-R8" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen=""></iframe>
<p>The location of the collision between the LWSS train and the LWSS originating from the clock depends on how many LWSSes were destroyed from the front of the train during the loop around the metapixel. The mechanism is set up so that the collision shoots out two gliders in opposite directions, aimed towards the register that corresponds to the number of living neighbors the metapixel has (again, see <a href="http://photos1.blogger.com/blogger/5525/3027/1600/Slide-25-decode.png">this pic</a> as a reference).</p>
<p>Normally, the gliders will each collide with a <a href="http://www.conwaylife.com/wiki/Beehive">beehive</a> to produce a <a href="http://www.conwaylife.com/wiki/Block">block</a> and a <a href="http://www.conwaylife.com/wiki/Pond">pond</a>; this is called a <a href="http://www.conwaylife.com/wiki/Honeybit">honeybit reaction</a>. If you watch from about <a href="https://youtu.be/ltC2h2Yx-R8?t=360">6:00</a> into the video above, you’ll see the collision of the antiparallel LWSSes, sending two gliders towards the 4th slot (counting up from the bottom, starting at 0) in both the birth and survival registers, and at about <a href="https://youtu.be/ltC2h2Yx-R8?t=390">6:30</a> two ponds are formed in the registers. The remaining spaceships in the LWSS train collide with an Eater-1 and are destroyed. The video below shows a closer look across many generations:</p>
<iframe width="560" height="315" src="https://www.youtube.com/embed/aFtNn8rquLs" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen=""></iframe>
<p>If the slot in the register is occupied by an Eater-1 (as are the 3rd slot in the birth register, and the 2nd and 3rd slots in the survival register, again counting up from the bottom starting at 0), then the incoming glider is destroyed before it has a chance to initiate a honeybit reaction with the beehive. This produces no pond in the register. In this mechanism a register has a pond if the number of living neighbors doesn’t satisfy the conditions for life in the next generation, and no pond if it does. Remember, this information is split between two registers (birth and survival), and in a later step some logic will look at the current state of the metapixel to determine which register to read from (described <a href="#determining-the-next-state">later</a>). This mechanism is what makes the OTCA Metapixel programmable for any Life-like ruleset – pretty cool.</p>
<p>Next, the clock shoots out a pair of LWSSes to read these registers (one LWSS for each register), following the trajectories shown in <a href="http://photos1.blogger.com/blogger/5525/3027/1600/Slide-26-readbs.png">this diagram</a>. If a pond is present in a register, the LWSS collides with the pond and is destroyed. This completes the honeybit reaction and restores the <a href="http://www.conwaylife.com/wiki/Beehive">beehive</a> back to its original state. If the LWSS has no pond in its path, it is allowed to continue into the next logic bank. In the video above, see if you can predict the next state of the metapixel by reading from the registers.</p>
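<p>Putting the decode and read steps together, the two registers behave like small lookup tables. Here is a toy boolean model (mine, under the interpretation above: a pond is set in a slot unless an Eater-1 rule bit blocks it, and the reading LWSS survives only if no pond is in its path):</p>

```python
# Conway's rules, B3/S23, as they would be programmed into the registers.
BIRTH, SURVIVAL = {3}, {2, 3}

def decode(count):
    """Returns (pond_in_birth, pond_in_survival) for a given neighbor count.
    An Eater-1 in a slot (count in the rule set) blocks the pond-forming glider."""
    return count not in BIRTH, count not in SURVIVAL

def read(pond):
    """The reading LWSS survives (1) only if no pond blocks its path."""
    return 0 if pond else 1

pb, ps = decode(3)
assert (read(pb), read(ps)) == (1, 1)  # 3 neighbors: birth and survival both satisfied
pb, ps = decode(2)
assert (read(pb), read(ps)) == (0, 1)  # 2 neighbors: survival only
```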
<h2 id="accessing-neighbor-state">Accessing Neighbor State</h2>
<p>Before I move to the final logic of the system, let’s look at how the neighbor states are tallied up. <a href="http://photos1.blogger.com/blogger/5525/3027/1600/Slide-21-programmables.0.png">This diagram</a> shows the 8 input/output channels of each metapixel, each corresponding to one of the eight neighbors of a single metapixel. By taking a closer look at one of them, you’ll see another honeybit reaction happening:</p>
<iframe width="560" height="315" src="https://www.youtube.com/embed/XsXqvPOCAOI" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen=""></iframe>
<p>The honeybit reaction on the left corresponds to the state of the metapixel on the right, and the reaction on the right corresponds to the state of the metapixel on the left. The LWSS train moves clockwise around the metapixel, so the train moving down the screen belongs to the pixel on the left and the train moving up the screen belongs to the pixel on the right.</p>
<p>These honeybit reactions are set by a <a href="http://www.conwaylife.com/wiki/MWSS">middleweight spaceship</a> (MWSS) that moves in a counter-clockwise path around the pixel each clock cycle. <a href="http://photos1.blogger.com/blogger/5525/3027/1600/Slide-23-mwsstrack.png">Here</a> is a description of its path and the paths of all the gliders it generates for 8 honeybit reactions around the metapixel. The MWSS is created only if the metapixel is currently alive, and it is destroyed at the end of its loop around the pixel.</p>
<h2 id="determining-the-next-state">Determining the Next State</h2>
<p>The final state of the system is read from a series of logic gates. As has already been demonstrated in some of the previous sections, logic in Life is typically achieved by sending spaceships (usually gliders) through a path where they may or may not collide with other objects and be annihilated. At the end of their journey, a reading mechanism tests for their presence (1) or absence (0). <a href="https://www.youtube.com/watch?v=vGWGeund3eA">Here’s a video</a> that shows some concrete examples of various boolean operations. Long “glider tapes” can even be used as persistent memory to store large amounts of information, demonstrated in <a href="https://www.youtube.com/watch?v=A8B5MbHPlH0">Gemini</a> and <a href="https://vimeo.com/162959120">this programmable text generator</a> (the “Golly ticker”).</p>
<p>The final logic of the metapixel is summarized in <a href="http://photos1.blogger.com/blogger/5525/3027/1600/Slide-27-boatlogic.png">this diagram</a>.</p>
<p>First some definitions:</p>
<ul>
<li><p>C is the current state of the cell, 1 or 0</p></li>
<li><p>B is the state of the birth register, 1 for satisfied birth conditions, 0 for not-satisfied</p></li>
<li><p>S is the state of the survival register, 1 for satisfied, 0 for not-satisfied</p></li>
</ul>
<p>The states of C, B, and S are stored in the metapixel by the presence (“1”) or absence (“0”) of three <a href="http://www.conwaylife.com/wiki/Boat">boats</a>. Boats have the interesting property that when they are hit with a glider they <a href="http://www.bitstorm.org/gameoflife/lexicon/#wk">reflect</a> a new glider with a trajectory perpendicular to the path of the incoming glider. This collision destroys the boat, so boats are known as <a href="http://www.conwaylife.com/wiki/One-time_reflector">one-time reflectors</a>. The B and S boats (shown in <a href="http://photos1.blogger.com/blogger/5525/3027/1600/Slide-27-boatlogic.png">this diagram</a>) are set using the LWSSes that are able to pass through the birth and survival registers during the decoding step described <a href="#comparing-neighbor-count-with-rules">above</a>. In the video below, watch how boats are placed at B and S by a pair of LWSSes starting at <a href="https://youtu.be/FQdg329APN8?t=227">3:47</a>; the moment they are set comes at <a href="https://youtu.be/FQdg329APN8?t=292">4:52</a>:</p>
<iframe width="560" height="315" src="https://www.youtube.com/embed/FQdg329APN8" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen=""></iframe>
<p>A glider is used to read the states of C, B, and S and set a pond at G or H (shown in <a href="http://photos1.blogger.com/blogger/5525/3027/1600/Slide-27-boatlogic.png">this diagram</a>). The logic for G and H pond formation (“1” means a pond is formed, “0” means no pond) is given by the following logic:</p>
<p>G = C & !S (G equals 1 when C is 1 and S is 0, else G is 0)</p>
<p>H = !C & B (H equals 1 when C is 0 and B is 1, else H is 0)</p>
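<p>These two rules are just boolean functions of the three boat bits, so they can be written down directly (a sketch of the logic stated above, not code from the metapixel):</p>

```python
def G(C, B, S):
    return C and not S       # pond at G: currently alive, survival not satisfied

def H(C, B, S):
    return (not C) and B     # pond at H: currently dead, birth satisfied

assert G(True, False, False) is True    # alive cell that fails survival
assert G(True, True, True) is False     # alive cell that survives
assert H(False, True, False) is True    # dead cell that satisfies birth
```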
<p>The final glider released by the clock (described at the <a href="#clock">beginning</a>) sets off a logic cascade that computes these relationships. The glider is released through the small hole in the fence next to the clock, a little less than halfway up the left side of the video above (<a href="https://youtu.be/FQdg329APN8?t=945">15:45</a>), triggering the formation of a LWSS (<a href="https://youtu.be/FQdg329APN8?t=948">15:48</a>) which travels toward the right side of the video, releasing two gliders on its way (<a href="https://youtu.be/FQdg329APN8?t=971">16:11</a>). The first glider bounces around and eventually turns into a LWSS that reads the results of G and H, more on that in the next paragraph. The second glider heads towards C; if C is present it reflects and starts heading toward S. If S is present it reflects into an eater and dies; if S is not present it continues to a beehive at G and does a honeybit reaction, forming a pond. If C is not present, the second glider heads toward B instead. If B is present, the glider reflects off B and does a honeybit reaction at H; otherwise it runs into an eater and dies. Starting at <a href="https://youtu.be/FQdg329APN8?t=979">16:20</a> in the video above, we see the glider being created, reflecting off C, passing through an absent S, and finally forming a pond at G.</p>
<p>At <a href="https://youtu.be/FQdg329APN8?t=997">16:37</a> you can see a pond set at G and a LWSS coming in from the left to read G and H. If either G or H has a pond set, the LWSS is destroyed, as shown at <a href="https://youtu.be/FQdg329APN8?t=999">16:39</a>. If neither are set (see <a href="https://youtu.be/FQdg329APN8?t=647">10:47</a>) the LWSS continues on and forms a boat at !T (shown as T with a line on top on <a href="http://photos1.blogger.com/blogger/5525/3027/1600/Slide-27-boatlogic.png">this diagram</a>). This can be summarized with the logic:</p>
<p>!T = !(G | H) (!T equals 1 when G and H are both 0, else !T is 0)</p>
<p>A third glider released at <a href="https://youtu.be/FQdg329APN8?t=656">10:56</a> by the original LWSS trigger heads toward !T. If the boat at !T is present the glider reflects off the boat and hits an eater; if the boat is not present it continues to T (again, <a href="http://photos1.blogger.com/blogger/5525/3027/1600/Slide-27-boatlogic.png">this diagram</a>). The glider is destroyed by an eater at <a href="https://youtu.be/FQdg329APN8?t=677">11:17</a>, but you can see it turn into a LWSS and head to T during a previous iteration starting at <a href="https://youtu.be/FQdg329APN8?t=333">5:33</a>.</p>
<p>A fourth glider, released by the original LWSS, turns into a LWSS that cleans up any boats left at B or S after the logical operations have completed. An example starts at <a href="https://youtu.be/FQdg329APN8?t=323">5:23</a>. This LWSS has no other function; it is destroyed either by a boat it cleans up, or by an eater. Its trajectory is shown <a href="http://photos1.blogger.com/blogger/5525/3027/1600/Slide-28-cleanstart.png">here</a>.</p>
<p>After all this we’re left with the following state:</p>
<p>T = presence of a LWSS at T, the formal logic for this state is given by:</p>
<p>T = !(!T) = G | H = (C & !S) | (!C & B)</p>
<p>which you can read as “T is 1 if the pixel is currently alive and it doesn’t satisfy the requirements for survival, or if the pixel is currently dead and does satisfy the requirements for birth, otherwise it is 0” – basically a LWSS at T means the metapixel should toggle its current state.</p>
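<p>This toggle reading can be checked exhaustively: the next state of the cell is S when it is alive and B when it is dead, and T comes out as exactly the XOR of the current and next states (a quick verification sketch of the logic above, not part of the original post):</p>

```python
from itertools import product

for C, B, S in product((0, 1), repeat=3):
    next_state = S if C else B            # survival governs live cells, birth governs dead ones
    T = (C and not S) or (not C and B)    # the formal logic for T, from above
    assert int(T) == C ^ next_state       # T is 1 exactly when the state should flip
```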
<p>(just a bit longer, sorry this is kind of a slog)</p>
<p>The LWSS at T follows the path shown <a href="http://photos1.blogger.com/blogger/5525/3027/1600/Slide-29-toggledist.png">here</a>. At <a href="https://youtu.be/FQdg329APN8?t=388">6:28</a> you can see the LWSS transform into a glider and back again, while navigating through a series of twists and turns. At <a href="https://youtu.be/FQdg329APN8?t=399">6:39</a> it shoots a glider angled upward which turns into an LWSS that heads to the sync buffer and eventually the cell state boat bit (different than C, more on this in the next paragraph). The original LWSS continues to the left, shooting another two gliders off at <a href="https://youtu.be/FQdg329APN8?t=402">6:42</a> and <a href="https://youtu.be/FQdg329APN8?t=408">6:48</a>; each of these gliders turns into a LWSS and one heads to the bottom right corner and the other to the top left corner of the metapixel to toggle the state of the output display (more on that in the <a href="#output-display">next section</a>).</p>
<p>The cell state boat bit is what actually stores the current state of the cell (C is updated periodically based on the state of the boat bit). If the boat bit is present, indicating that the current state of the metapixel is “off”, the incoming LWSS will form a glider that collides with it and removes it (starts at <a href="https://youtu.be/FQdg329APN8?t=423">7:03</a>). If the cell state boat bit is not present, indicating that the current state of the metapixel is “on”, the incoming LWSS will form a glider that creates a new boat bit (starts at <a href="https://youtu.be/FQdg329APN8?t=1138">18:58</a>).</p>
<p>Finally, let’s follow the path of the original LWSS that triggered this whole series of events. It follows the outer path shown <a href="http://photos1.blogger.com/blogger/5525/3027/1600/Slide-28-cleanstart.png">here</a>, gets converted into a MWSS and heads towards the cell state boat bit. If the boat bit is present it collides with it and dies, though it does preserve the boat bit during this death (<a href="https://youtu.be/FQdg329APN8?t=198">3:18</a>). If the boat bit is not present it passes by unharmed and keeps heading to the right (<a href="https://youtu.be/FQdg329APN8?t=434">7:14</a>). This is the same MWSS that does a counterclockwise loop around the cell, triggering honeybit reactions on all the neighbors’ inputs. This MWSS is only allowed to do the loop when the cell state boat bit is not present, when the metapixel is “on”. Just before it leaves to do the loop it shoots off a glider that bounces around and eventually sets a boat at C (<a href="https://youtu.be/FQdg329APN8?t=438">7:18</a>). At <a href="https://youtu.be/FQdg329APN8?t=473">7:53</a> you can see it setting off the first of 8 honeybit reactions around the metapixel, one for each Moore neighbor.</p>
<h2 id="output-display">Output Display</h2>
<p>The last piece of the metapixel is the output display, which uses a ton of spaceships to fill in a big square area so it looks more or less white. Two synchronized LWSSes toggle the output display; they come from the logic mechanism described in the previous paragraphs, shown <a href="http://photos1.blogger.com/blogger/5525/3027/1600/Slide-29-toggledist.png">here</a>. Their timing is synced up through a series of <a href="http://photos1.blogger.com/blogger/5525/3027/1600/Slide-30-hwsscontrol.png">bends in their path and glider transformations</a>; then they are used to toggle a HWSS gun that triggers a series of LWSS “out of the blue” reactions. The streams of LWSSes fill the square space and mutually annihilate each other in the center of the metapixel, shown <a href="http://photos1.blogger.com/blogger/5525/3027/1600/Slide-31-displayon.png">here</a>. The mechanisms for both latches are similar:</p>
<iframe width="560" height="315" src="https://www.youtube.com/embed/v0D8jud_cow" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen=""></iframe>
<iframe width="560" height="315" src="https://www.youtube.com/embed/mYkjaaYn9y8" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen=""></iframe>
<h2 id="further-reading">Further Reading</h2>
<p>If you’re interested in learning more, here are some related books and papers to check out:</p>
<ul>
<li>
<p><strong><a href="https://amzn.to/3eN7zFQ">The Recursive Universe (1985)</a></strong> by William Poundstone – Consider this blog post basically just an extended plea for you to read this book. Starts with an intro of Life and builds up to a breakdown of Conway’s proof of the existence of self-replicating patterns in Life.</p>
</li>
<li>
<p><strong><a href="https://amzn.to/3cAPNDU">Winning Ways for your Mathematical Plays (1982)</a></strong> by Conway, Berlekamp, and Guy – Includes the original proof of the existence of self-replicating patterns in Life by John Conway. Though this proof does not lay out an exact plan for such a machine, it proves that all the necessary components exist. Several decades later the <a href="https://www.conwaylife.com/forums/viewtopic.php?p=9901#p9901">first implementations</a> of these replicators began emerging on online Life forums.</p>
</li>
<li>
<p><strong><a href="https://arxiv.org/pdf/1111.1567.pdf">SmoothLife (2011)</a></strong> by Stephan Rafler and <strong><a href="https://arxiv.org/pdf/1812.05433.pdf">Lenia (2019)</a></strong> by Bert Chan – 2D cellular automata built in a continuous domain, where cells can take on a continuum of states rather than the binary “alive” and “dead” states of Life.</p>
</li>
</ul>
<iframe width="560" height="315" src="https://www.youtube.com/embed/KJe9H6qS82I" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen=""></iframe>
<iframe width="560" height="315" src="https://www.youtube.com/embed/iE46jKYcI4Y" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen=""></iframe>
<ul>
<li>
<p><strong><a href="https://distill.pub/2020/growing-ca/">Growing Neural Cellular Automata (2020)</a></strong> by Mordvintsev, Randazzo, Niklasson, and Levin – Using machine learning to generate cellular automata that grow from a simple seed to a desired form and repair themselves when damaged. These continuous CAs also include many hidden states. Differentiability is key to making this work – a major obstacle in applying these types of techniques to the non-smooth Life world.</p>
</li>
<li>
<p><strong><a href="https://amzn.to/2KEFi6K">A New Kind of Science (2002)</a></strong> by Stephen Wolfram – Though I think Wolfram overstates the “newness” of the experiments conducted in A New Kind of Science, this book gives a really thorough analysis of cellular automata systems and their relationship to the natural world.</p>
</li>
<li>
<p><strong><a href="https://amzn.to/2VtnNfy">Kinematic Self-Replicating Machines (2004)</a></strong> – An overview of research problems / projects in self-replication. This book has more of a focus on physical implementations rather than simulations, but there are some clear connections to cellular automata here as well.</p>
</li>
</ul>
<p>Also, I made the animated gif at the top of the article using a <a href="meta-life_animation.py">script</a> originally written by <a href="https://www.youtube.com/watch?v=xP5-iIeKXE8">Phillip Bradbury</a> (I made some slight modifications to the script so that it would perfectly loop). The script runs in <a href="http://golly.sourceforge.net/">Golly</a> with <a href="metacell.mc">this file</a>.</p>