A camera pointed at a white wall sends what it sees to projectors, which cast their image onto that same wall for the camera to see. Out of this simple recursive loop complex self-similar structures emerge and morph from one fractal pattern to another as the projectors move.
This project was in part sponsored by HEC.
To tap into the mesmerizing and thought-provoking properties of video feedback loops and their emergent fractal patterns, I built a contraption in which multiple projectors point at the same screen as a camera, and the camera feeds all of them with what it sees. The projectors can move on three axes as they throw the camera's view onto the screen. The images overlap and influence each other, and when all the settings are just right, fractal-like patterns quickly emerge, stabilize, and morph from one shape to another. Stepping in front of the setup, you can influence the phenomenon, disturb the fractals by casting shadows, or become part of the projection screen and see yourself propagating down into infinity.
Between the camera and the projectors is an HDMI splitter that duplicates the camera's signal, but no other image processing is performed. I wanted to keep the phenomenon as pure and “analog” as possible; in theory, it should also occur with an analog camera feeding analog projectors, or even purely optically, although I'm not sure that is feasible. From the HDMI splitter, the signal goes directly into the projectors, which are mounted on servos that can pan them left to right and tilt them up and down. Each servo-projector assembly sits in a ring that a continuous servo motor can slowly roll around its axis. The servos get their signals from a microcontroller with a servo shield, whose program commands them to move randomly, within defined limits, into ever-changing constellations that produce new fractal structures.
I began my experiments with video feedback loops using various displays and cameras.
I created loops with a large TV screen and a camera. This setup allowed me to get close up to just a part of the screen, revealing oscillating patterns and glitch-like effects.
I used a webcam and my laptop screen to experiment in a variety of ways: for example, slowing down the frame rate, inverting the pixel values of the output signal so that each iteration of the loop switched its colors and brightness, or thresholding the image so that only pure white and black remained.
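The manipulations themselves are simple; a minimal sketch of the invert/threshold/slow-down idea (written here with OpenCV as an example framework, not necessarily the tools I used) could look like this:

```cpp
// Sketch of a frame-inverting / thresholding feedback experiment (OpenCV).
// The webcam films the fullscreen window, which closes the loop.
#include <opencv2/opencv.hpp>

int main() {
    cv::VideoCapture cam(0);                    // webcam pointed at the screen
    if (!cam.isOpened()) return 1;

    cv::namedWindow("loop", cv::WINDOW_NORMAL);
    cv::setWindowProperty("loop", cv::WND_PROP_FULLSCREEN, cv::WINDOW_FULLSCREEN);

    bool invert = false, thresh = false;
    cv::Mat frame, out, gray;
    while (true) {
        cam >> frame;
        if (frame.empty()) break;
        out = frame;
        if (invert) cv::bitwise_not(frame, out);             // flip colors each pass
        if (thresh) {                                         // reduce to pure black/white
            cv::cvtColor(out, gray, cv::COLOR_BGR2GRAY);
            cv::threshold(gray, out, 128, 255, cv::THRESH_BINARY);
        }
        cv::imshow("loop", out);
        int key = cv::waitKey(100);   // a long wait here slows the effective frame rate
        if (key == 'i') invert = !invert;
        if (key == 't') thresh = !thresh;
        if (key == 27) break;         // Esc quits
    }
    return 0;
}
```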
Using the screencasting app OBS Studio, I made a virtual camera that captured the entire screen and converted it into a webcam signal. This setup produced very detailed loops, largely because no external “noise” entered the signal path.
When experimenting in OBS Studio, I duplicated the source, placed the copies next to each other, and used a webcam to film this setup. Instead of the multiple infinity tunnels I first expected, a complex fractal suddenly appeared. Twisting the camera made the fractal curl into itself, and the whole phenomenon seemed rather stable and self-reinforcing.
To study this kind of feedback loop further, I wrote a program that duplicates the screen like in OBS, but lets me rotate, scale, and move each copy individually and overlap them additively, as projections would. I could either point a camera at the screen or use OBS to record the display and make a virtual camera. The problem was that, without a webcam, there was initially no light source such as ambient light to seed the loop. To solve this, I programmed the screen to flash in a color when I pressed a button, and later added the ability to display other images and throw them into the empty loop.
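At its core, one iteration of that program is just a few transform-and-add steps. A rough sketch (again with OpenCV as an example; the per-copy transforms are placeholder values) could look like this:

```cpp
// Sketch: duplicate the captured view, transform each copy, and add the copies
// like overlapping projections. Pressing 's' flashes a white "light seed".
#include <opencv2/opencv.hpp>
#include <vector>

int main() {
    cv::VideoCapture cam(0);                  // OBS virtual camera or a real webcam
    if (!cam.isOpened()) return 1;
    cv::namedWindow("loop", cv::WINDOW_NORMAL);
    cv::setWindowProperty("loop", cv::WND_PROP_FULLSCREEN, cv::WINDOW_FULLSCREEN);

    // Per-copy transforms: rotation (deg), scale, and x/y offset (illustrative values)
    struct Copy { double angle, scale, dx, dy; };
    std::vector<Copy> copies = { {10, 0.7, -200, 0}, {-10, 0.7, 200, 0}, {0, 0.8, 0, 150} };

    cv::Mat frame, canvas, warped;
    while (true) {
        cam >> frame;
        if (frame.empty()) break;
        canvas = cv::Mat::zeros(frame.size(), frame.type());
        cv::Point2f center(frame.cols / 2.f, frame.rows / 2.f);
        for (const Copy& c : copies) {
            cv::Mat M = cv::getRotationMatrix2D(center, c.angle, c.scale);
            M.at<double>(0, 2) += c.dx;       // shift the copy
            M.at<double>(1, 2) += c.dy;
            cv::warpAffine(frame, warped, M, frame.size());
            cv::add(canvas, warped, canvas);  // additive overlap, like projected light
        }
        int key = cv::waitKey(1);
        if (key == 27) break;                                 // Esc quits
        if (key == 's') canvas.setTo(cv::Scalar::all(255));   // flash a white light seed
        cv::imshow("loop", canvas);
    }
    return 0;
}
```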
With this setup I could create detailed fractal patterns, but it felt like I was just generating fractal images with my computer. Fractals are often associated with computer-generated imagery, which I think makes them feel more like an abstract concept and less tangible. I wanted to see what this phenomenon would look like when produced purely with a real camera and real projectors.
I borrowed some projectors and connected them through an HDMI splitter to my camera. Setting them up on tripods allowed me to try out many different combinations and plenty of different kinds of patterns emerged. I was also able to move into the structure and become part of it by throwing shadows and acting as the screen.
One aspect of this setup I found especially interesting was the transition from one pattern to another. When I moved one projector to point somewhere else, the pattern morphed, with the change propagating one frame at a time further down into itself.
However, I couldn't move all the projectors at the same time and I wanted to see a smooth transition between different states. So I started to think about how I could make a machine that could do that for me.
The most important movement is the "roll" around the axis of the projector pointing to the screen. But I also wanted to give them the ability to adjust their "pitch" and "yaw".
Since the projectors I had borrowed first were all quite large, I initially thought it would be best to have them point upward, use mirrors for the pitch and yaw movement, and rotate the projector underneath to roll the image.
After researching different types of projectors online, I came across an interesting kind of which only a few models were ever produced: laser beam scanning projectors, which use MEMS (microelectromechanical systems) technology.
Almost all projectors use a white light source that is focused through a lens onto a screen. Between the light source and the lens, different methods are used to modulate the light with the image information.
LCD projectors use small LCD panels that let more or less of the light beam through, depending on the brightness of each pixel.
DLP projectors use a digital micromirror device (DMD), which either reflects the light through the lens or diverts it away, depending on the image's pixel information.
For color, the light beam can be split and passed through color filters before going through three LCDs or DMDs, or it passes through a spinning color wheel and a single DMD chip, or the light source itself produces the primary colors (multi-color LEDs).
Laser scanning projectors, on the other hand, use primary-color lasers to draw the image line by line, pixel by pixel, with the help of a fast-moving mirror, much like a cathode-ray tube.
This means that laser scanning projectors need no lens and no focusing: the laser beam always remains a point, so the image is sharp at any distance.
They also generate less heat and therefore don't need a large fan, which makes them quieter, more compact, and lighter.
With these small projectors, it would be easier to move the whole projector instead of building a rotating setup with moving mirrors. They are no longer produced, but I found some pre-owned ones online.
For the machine, I wanted only the projectors to move and the camera to stand still, focused on the middle of the screen. The rotation should be the most precise and slowest movement, while the up-down and sideways movements could be quicker. For all motions I wanted to use standard-size servos, both positional and continuous, all controlled by an Arduino microcontroller board.
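The two servo types behave differently even though they are driven the same way: a positional servo interprets the command as a target angle, while a continuous servo interprets it as speed and direction around its 90° stop point. A minimal Arduino sketch illustrating the difference (pin numbers are placeholders):

```cpp
// Positional vs. continuous servo, both driven with the standard Servo library.
#include <Servo.h>

Servo tiltServo;   // positional: write() sets an angle
Servo ringServo;   // continuous: write() sets speed/direction around 90

void setup() {
  tiltServo.attach(9);
  ringServo.attach(10);
}

void loop() {
  tiltServo.write(60);    // move to 60 degrees and hold
  ringServo.write(100);   // rotate slowly in one direction (90 = stop)
  delay(2000);

  tiltServo.write(120);   // move to 120 degrees
  ringServo.write(80);    // rotate slowly the other way
  delay(2000);

  ringServo.write(90);    // stop the ring
  delay(1000);
}
```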
I made a 3D model in Autodesk Fusion 360 to animate certain movements and test how everything could fit together. To have each projector rotate roughly around its own center, I wanted to put the pan-and-tilt setup inside a tube that can rotate.
The projectors are small enough that a smartphone tripod bracket can be used to mount them, but the clamp couldn't be too wide, so as not to interfere with the buttons and ports on the side.
To make this whole setup rotate I found an aluminum bearing swivel that is usually used for Lazy Susan turntables, but also works vertically. I found this to be a compact solution that allowed me to place the projector close to the axis of rotation.
To rotate the bearing, I used a continuous servo motor attached to the outer ring. This motor turns the inner ring, to which I glued a toothed belt. The belt is gripped by a pulley on the servo's shaft.
The continuous servo would be tightly bolted to the outer ring on one side of the bearing, while the servos with the projector would be attached to the inner ring on the other side.
The projector rings could be made modular, each standing on its own tripod in a room. This would be interesting with a large number of modules, but since I only had enough for three, I figured it would be better to build a more compact system with the camera and projectors fixed close together. That limits the variability, but also lets me optimize the program for exactly this constellation.
The aluminum bearings consist of two rings held together by around 135 small balls running in a rail. To separate them, the balls have to be removed through a hole in the outer ring. I cleaned the two parts and polished their rails to make them run more smoothly.
Before I put the rings back together, I drilled additional holes and countersinks in the rings to prevent chips and other debris from getting into the gap between the rings later. Then I put some lubricating oil in the rails and added the balls back through the hole in the outer ring again.
The bearings are 10 mm thick but have a small beveled edge on each side, which is why I got a 9 mm wide timing belt that fit just right. After cleaning the inner ring, I attached the timing belt bit by bit with instant glue, taking care to trim the ends so that they meet seamlessly.
To attach the servos to the bearings and the bearing to the camera, I had to make some brackets. After experimenting with cardboard cutouts and the 3D model, I made sketches and bought differently shaped 2 mm thick aluminum profiles from the hardware store.
I wanted the rings to stand upright without using a vise. To allow for more freedom during experimentation, I decided to attach each ring to a tripod. From an L-profile, I made an adapter for standard tripod quick-release plates: two holes to attach it to the outer ring, one hole for the tripod screw, and a smaller one for the safety pin.
I used an angle grinder to cut a U-profile for the ring servo, making sure it fit well inside and kept the correct distance from the bearing.
To attach this bracket to the outer ring, I drilled an elongated hole so I could tension the servo against the timing belt, or loosen it, without removing the entire screw.
The pulley attached to the servo shaft was 10 mm wide and fit exactly over the ring with one flange sticking out to the other side.
Since I wanted to attach the other servos to the other side of the inner ring and didn't want to use spacers, I removed the flange at the end of the pulley.
To attach the pan and tilt servos to the bearing I used an L-profile, cut out a piece for the servo, and drilled four holes to attach the servo with nuts and bolts.
Initially, I tried using just one hole to attach the setup with a toothed lock washer to the inner ring. However, for added safety, I decided to drill another hole and use two screws to secure the setup. I also shortened both ends of the profile to reduce excess material.
To realize the panning and tilting of the projector, I got aluminum brackets for standard-size servos. I also got a metal horn with screws, because the plastic horns that came with the servos seemed a bit weak. I assembled the servos and brackets with bolts and nuts and began testing the servos with an Arduino.
To mount the projector to the U-shaped servo bracket arm, I used an aluminum phone clamp for tripods. The first one was a bit too wide, so I had to add spacers cut from the wall of an old rubber tube in order to still reach the buttons. But I got another one that is lighter and narrower, and also has a screw to hold the projector in place, so it doesn't rely solely on the spring mechanism when it's upside down.
The cables couldn't simply be plugged into the side of the projectors: they would coil up around the projector as it rotates, and they would also stick out sideways and could touch the ring when the projector turns.
To route them all backward behind the projector, I built a cable duct out of a square aluminum tube. With an angle grinder I removed three sides of the tube, leaving only a thin square frame at the end for the cables to pass through. Then I drilled one hole for the servo cables and one to bolt the duct underneath the projector.
To keep the HDMI cable from sticking out to the side, I used a 90° HDMI mini connector with a flat flexible cable running underneath the projector to an HDMI port glued to the end of the aluminum duct.
That's where I plug in longer flexible HDMI spiral cables, which can be twisted more easily than straight ones. They all go into the HDMI splitter, which duplicates the signal from the camera. The projectors have an internal battery, but for longer runtimes they have to be plugged in and powered over a micro-USB cable. To keep those cables from sticking out as well, I use an extension that runs sideways from the micro-USB port and through the cable duct.
The servos are each powered and controlled through three wires. I thought about using slip rings so the rings could rotate indefinitely, but since the projectors also require several cables, the number of contacts would have been too large and slip rings were not feasible.
Instead, I used twisted extension cables for the servos attached to the rotating part of the bearing. They all plug into a servo shield on an Arduino Uno.
To test the setup, I used two analog sticks to control the servos: one for the speed and direction of the continuous servo, the other for the projector's pan and tilt orientation. By pressing the button on one of the sticks, I can switch between the projector segments.
To prevent the servos from breaking their mounts, I defined minimum and maximum ranges that can't be exceeded, and installed them so that they all use the same starting orientation, so that a mixed-up connection wouldn't damage them either.
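A simplified version of this manual-control logic for a single segment could look like the sketch below (plain Servo pins stand in for the servo shield, and the pin numbers and limits are placeholders):

```cpp
// Simplified manual-control sketch: one stick drives the continuous ring servo,
// the other drives pan/tilt, and its button would cycle through the segments.
#include <Servo.h>

const int PAN_MIN = 40,  PAN_MAX = 140;   // hard limits so the mounts can't be overdriven
const int TILT_MIN = 50, TILT_MAX = 130;

Servo ringServo, panServo, tiltServo;

void setup() {
  ringServo.attach(9);
  panServo.attach(10);
  tiltServo.attach(11);
  pinMode(2, INPUT_PULLUP);               // stick push button for segment selection
}

void loop() {
  // Stick 1 (A0): speed and direction of the ring; centered stick = stop (90).
  int ringSpeed = map(analogRead(A0), 0, 1023, 80, 100);
  ringServo.write(ringSpeed);

  // Stick 2 (A1/A2): pan and tilt angles, clamped to the allowed range.
  int pan  = constrain(map(analogRead(A1), 0, 1023, 0, 180), PAN_MIN,  PAN_MAX);
  int tilt = constrain(map(analogRead(A2), 0, 1023, 0, 180), TILT_MIN, TILT_MAX);
  panServo.write(pan);
  tiltServo.write(tilt);

  // The button on pin 2 would switch which segment the stick commands go to
  // (omitted here, since only one segment's servos are attached).
  delay(20);
}
```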
At first I made a cardboard box to hold all the electronics and to attach the analog sticks to. The cables have to keep a certain distance from the projectors so they don't get tangled up.
But later I designed a frame to connect the camera, the projector rings, and the electronics and put it all on a single tripod.
Using an angle grinder, I cut a piece of 150 mm × 150 mm square aluminum tube to use as the frame to which I would attach the rings, camera, and electronics. I cut out three sides to make it lighter and easier to access the camera, but left 1 cm wide bars to connect the front and back and to serve later as a cable channel.
Next, I made a box for the electronic parts. I used the HDMI splitter’s size as the measure for the box’s length and the internal dimensions of the aluminum tube for the height and width of the front and back plate. I cut a rectangular hole in the middle of the two panels and filed it down until the HDMI splitter fit through. Then I cut two pieces from an aluminum profile to use as a bracket to attach the HDMI splitter to one of the plates.
To attach the other panel, I cut a square aluminum rod into four pieces, drilled a threaded hole in each end of each piece, and screwed them to each corner of the panels.
Next, I wanted to install an AC converter board with four 5 V DC USB ports to power the projectors. I cut out four holes for the USB ports and filed them down until the ports fit. Then I made a support bracket out of an aluminum profile and glued the board to it with silicone.
On the side of the USB board, I attached a board with a single USB port that points the opposite way through a hole in the other plate, to power the HDMI splitter. To the other side of the bracket, I attached another AC converter board that powers both the Arduino and the camera with 8.4 V. I used cut pieces of plastic tube as spacers between the metal bracket and the solder joints on the boards.
To supply the boards with power I installed a C14 socket equipped with a fuse by cutting out a socket-shaped hole in the plate and soldering cables to it. I used two WAGO connectors to distribute the power to all AC converter boards. I also attached a grounding cable to one of the HDMI bracket screws to connect it to the casing.
I placed the Arduino with its servo shield upside down over the HDMI splitter and secured it with screws to a bracket. I drilled three holes for the cables going to the three ring segments and added rubber rings in each hole to keep the cables from scraping on the metal.
I added a thick plate to the frame so I could have both a threaded hole to screw it to a tripod and a counterbored hole next to it to mount the camera. I had to sand the plate down a few millimeters so it would fit. To prevent the camera from moving I put a thin sheet of rubber with a hole for the screw between it and the metal plate. I drilled holes in the front of the frame to attach the ring segments to it with the same bracket I used to attach them to the tripods before.
Since I wanted one panel of the electronics box to be easily removable, I cut slits into the top and side edges for the servo cables and the camera power cable instead of drilling holes for them. I also made a cable duct for the servo cables, glued to the inside edges of the frame, leading directly to the ring segments at the front. I shortened the servo cables and crimped on the necessary plugs, and labeled the plugs, cables, and pins with stickers to make reassembly easier.
Since I didn't want the analog sticks to be permanently attached, I installed network-port breakout boards, which have enough pins for both sticks. On an aluminum plate, I bolted the two sticks and a network port. Then I made a square hole in the plate near the Arduino for a network port to fit through; to attach its circuit board, I had to make a suitable bracket and bolt it tight. Now I can read out the sticks with the Arduino whenever I connect them with an Ethernet cable.
To measure the current orientation of each projector, I added an accelerometer to each of them. This way I can keep track of how far they have rolled and prevent the cables from winding up; it also makes more precise moves to target positions possible. Because the Arduino didn't have enough analog input pins, I added an analog/digital multiplexer module. The sensor cables run through the same holes as the servo cables.
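Reading the sensors through the multiplexer and turning the readings into a roll angle could look roughly like this (the exact sensor type, channel layout, and calibration values are assumptions):

```cpp
// Sketch: reading analog accelerometer axes through a 16-channel analog
// multiplexer (CD74HC4067-style), then estimating a roll angle from X and Y.
#include <Arduino.h>

const int MUX_SIG = A3;                    // multiplexer output -> one analog pin
const int MUX_SEL[4] = {4, 5, 6, 7};       // S0..S3 channel-select pins
const float ZERO_G = 338.0;                // approximate 0 g ADC reading; calibrate per sensor

int readMuxChannel(int channel) {
  for (int i = 0; i < 4; i++) {
    digitalWrite(MUX_SEL[i], (channel >> i) & 1);
  }
  delayMicroseconds(10);                   // let the signal settle
  return analogRead(MUX_SIG);
}

// Roll of one projector, estimated from its accelerometer's X and Y channels.
float readRollDegrees(int xChannel, int yChannel) {
  float x = readMuxChannel(xChannel) - ZERO_G;
  float y = readMuxChannel(yChannel) - ZERO_G;
  return atan2(y, x) * 180.0 / PI;
}

void setup() {
  for (int i = 0; i < 4; i++) pinMode(MUX_SEL[i], OUTPUT);
  Serial.begin(9600);
}

void loop() {
  // Channels 0/1 = projector 1 X/Y, 2/3 = projector 2, 4/5 = projector 3 (assumed layout).
  Serial.println(readRollDegrees(0, 1));
  delay(100);
}
```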
I programmed the projectors to move randomly, but within specific guidelines. The main motion is the slow rotation of the ball bearings, because this motion leads to the most interesting changes in the emerging fractal structures; the rings can turn clockwise or counter-clockwise at different speeds, or stand still. The small movements of the tilt and pan servos always occur simultaneously and less frequently. After a random timer expires, there can be a pause period in which no servo moves, so the morphing fractals can come to rest for a while. Before turning off the power, I can have the rings return to their original position by pressing both sticks at the same time.
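A stripped-down sketch of this motion logic for a single segment, without the accelerometer feedback and the return-to-home function, might look like this (timings and limits are placeholders):

```cpp
// Stripped-down motion logic for one projector segment: slow random ring
// rotation, occasional simultaneous pan/tilt nudges, and random rest periods.
#include <Servo.h>

Servo ringServo, panServo, tiltServo;

const int PAN_MIN = 40,  PAN_MAX = 140;
const int TILT_MIN = 50, TILT_MAX = 130;

unsigned long nextRingChange = 0, nextPanTiltChange = 0, pauseUntil = 0;

void setup() {
  ringServo.attach(9);
  panServo.attach(10);
  tiltServo.attach(11);
  randomSeed(analogRead(A5));              // unconnected pin as a noise source
}

void loop() {
  unsigned long now = millis();

  if (now < pauseUntil) {                  // rest phase: everything stands still
    ringServo.write(90);
    return;
  }

  if (now > nextRingChange) {
    // New ring behaviour: stand still, or rotate slowly in either direction.
    int choice = random(3);
    if (choice == 0) ringServo.write(90);
    else ringServo.write(90 + random(3, 8) * (choice == 1 ? 1 : -1));
    nextRingChange = now + random(10000, 40000);        // keep it for 10-40 s
  }

  if (now > nextPanTiltChange) {
    // Pan and tilt always move together, but less often than the ring changes.
    panServo.write(random(PAN_MIN, PAN_MAX + 1));
    tiltServo.write(random(TILT_MIN, TILT_MAX + 1));
    nextPanTiltChange = now + random(30000, 90000);

    if (random(4) == 0) pauseUntil = now + random(20000, 60000);  // occasional rest
  }
}
```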
I experimented with multiple projection screens. I used a semi-transparent screen for rear projection, so one could get close to the image without interfering with it, but it didn't work well with ambient light. I also tried a silver screen that reflects light back toward its source. This worked better in ambient light, since the camera sits close to the projectors and could see the image well, but from farther away or from the side the image appeared dimmer. The screens also weren't big enough, and part of the projected image sometimes moved past their borders onto the wall. In the end, I refreshed the white paint on the wall and used it as the projection surface, giving the fractals the space they needed.
The tripod was great because it was flexible, easy to disassemble and transport, and allowed for experimenting with different setups. But the one I had was a bit shaky and I was concerned someone might trip over it or over a cable hanging down from it.
So I decided to build a sturdier base out of aluminum.
I used a square aluminum tube the same width as the frame and cut a square aluminum sheet plate for the foot. To connect them I used angle brackets in which I cut threaded holes. To attach them to the frame, I used a plate that fits into the square tube, attached two angle brackets to it, and used the same threaded hole and screw as for the tripod.
Near the bottom of the base, I cut a hole for a power socket with an on/off switch. I attached a cable to it and fed it through a slot at the top just below the frame. From there I plugged it into the electronics box using two 90° angled adapters to keep the cable close to the frame.
I disassembled everything and had the aluminum parts sandblasted to get a uniform, clean surface, and then anodized to protect it.
For the exhibition, I reassembled the parts and moved the machine into the middle of a five-by-five-meter room that could be easily darkened. I taped the power cable to the floor and turned it on. At first I was a bit worried that the cables would get tangled or that the servos would break because of an overlooked programming or hardware error, but the machine endured ten hours of near-continuous operation, twice.
One thing I wanted to experiment with was introducing an external image into the existing loop. To do this, I used a fourth projector connected to a computer. I projected simple shapes and lines, text, and photos; the image was quickly picked up by the camera and propagated down the loop, and its dominant color persisted, changing the color of the loop even after the image disappeared. I also wrote a program that uses the output of the main camera, or of a second camera pointed at the same wall, to determine the average brightness. With that value it can project an image as a “light seed” into the loop to restart it if the previous loop has faded away, which can happen when the ambient light is too low for the camera to pick it up and the projectors move into a constellation where the fractals fade. With that program running, I can even underexpose the camera's view and make the fractals fade again as soon as the external projector stops supplying the light seed.
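The brightness-watchdog part of that program can be sketched in a few lines (here with OpenCV; the threshold and the seed image are placeholders):

```cpp
// Sketch of the "light seed" watchdog: measure the average brightness of the
// camera's view and, when the loop has faded below a threshold, project a seed
// image from the extra projector until the loop picks it up again.
#include <opencv2/opencv.hpp>

int main() {
    cv::VideoCapture cam(0);                        // main or secondary camera
    if (!cam.isOpened()) return 1;

    cv::Mat seed = cv::imread("seed.png");          // placeholder seed image
    if (seed.empty()) return 1;
    cv::Mat black = cv::Mat::zeros(seed.size(), seed.type());

    cv::namedWindow("extra projector", cv::WINDOW_NORMAL);
    cv::setWindowProperty("extra projector", cv::WND_PROP_FULLSCREEN, cv::WINDOW_FULLSCREEN);

    const double THRESHOLD = 15.0;                  // tune to the room's ambient light
    cv::Mat frame, gray;
    while (true) {
        cam >> frame;
        if (frame.empty()) break;
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        double brightness = cv::mean(gray)[0];      // average brightness, 0..255

        // Show the seed only while the loop is dark; otherwise stay black so the
        // extra projector doesn't interfere with the running loop.
        cv::imshow("extra projector", brightness < THRESHOLD ? seed : black);
        if (cv::waitKey(30) == 27) break;
    }
    return 0;
}
```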
I also wrote a version of the program that uses the camera signal not only as a brightness sensor but also to periodically take stills and save them in a folder. Whenever the loop needs a light seed, the program draws on this memory and “dreams” of what came before. I also alter these dream images in brightness, hue, and orientation.
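A rough sketch of this “dream” variant (again OpenCV; intervals, thresholds, and the memory folder are placeholders, and the folder must already exist):

```cpp
// Sketch of the "dream" variant: periodically save stills of the running loop,
// and when a seed is needed, replay a random one with shifted hue, brightness,
// and orientation.
#include <opencv2/opencv.hpp>
#include <string>
#include <vector>

cv::Mat alterMemory(const cv::Mat& src) {
    cv::Mat hsv;
    cv::cvtColor(src, hsv, cv::COLOR_BGR2HSV);
    std::vector<cv::Mat> ch;
    cv::split(hsv, ch);
    int shift = cv::theRNG().uniform(0, 180);
    ch[0].forEach<uchar>([shift](uchar& h, const int*) { h = (h + shift) % 180; });  // hue shift with wrap-around
    ch[2] = ch[2] * cv::theRNG().uniform(0.6, 1.2);                                  // scale brightness
    cv::merge(ch, hsv);
    cv::Mat bgr, turned;
    cv::cvtColor(hsv, bgr, cv::COLOR_HSV2BGR);
    cv::rotate(bgr, turned, cv::theRNG().uniform(0, 3));                             // random 90/180/270 degree turn
    return turned;
}

int main() {
    cv::VideoCapture cam(0);
    if (!cam.isOpened()) return 1;
    std::vector<std::string> memory;                  // paths of the saved stills
    cv::Mat frame, gray;
    int counter = 0;
    while (true) {
        cam >> frame;
        if (frame.empty()) break;
        if (++counter % 300 == 0) {                   // take a still every ~300 frames
            std::string path = "memory/" + std::to_string(counter) + ".png";
            cv::imwrite(path, frame);
            memory.push_back(path);
        }
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        if (cv::mean(gray)[0] < 15.0 && !memory.empty()) {
            // Loop has faded: "dream" of a random earlier state as the new seed.
            cv::Mat still = cv::imread(memory[cv::theRNG().uniform(0, (int)memory.size())]);
            if (!still.empty()) cv::imshow("extra projector", alterMemory(still));
        }
        if (cv::waitKey(30) == 27) break;
    }
    return 0;
}
```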
I also used the extra projector to project an altered fourth duplicate of the camera's view onto the loop. Unaltered, this would only add a little more complexity, because the bright areas would mostly overlap and merge into a solid region; but by slightly shifting the hue, the complexity became visible and vividly colorful fractals appeared. Each new pass through the camera and projectors shifted the color of the copies further, until all colors of the spectrum appeared.
To visualize the movements of the evolving structures I tried to take long exposure photos with a camera on a tripod. With this motion blur some of the dynamic changes of the evolving fractals can be visualized. However, too long an exposure will result in a blurry image with little information.
To get a better idea of how they might look as structures in time, I recorded videos of the fractals and wrote a program that stacks the individual frames in 3D space, keying out the black parts of the images so only the evolving fractals remain visible.
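One way to do this stacking is to write the surviving pixels of every frame as a colored point cloud, with the frame index as the depth axis; a sketch of that approach (file names and thresholds are placeholders):

```cpp
// Sketch of the frame-stacking idea: read a recorded video, key out the dark
// pixels, and write the remaining pixels as a colored point cloud (PLY) with
// the frame index as the Z coordinate. A point-cloud viewer then shows the
// fractals as structures in time.
#include <opencv2/opencv.hpp>
#include <fstream>
#include <vector>

struct CloudPoint { int x, y, z; uchar b, g, r; };

int main() {
    cv::VideoCapture video("fractals.mp4");            // recorded loop footage
    if (!video.isOpened()) return 1;

    std::vector<CloudPoint> cloud;
    cv::Mat frame, small, gray;
    int z = 0;
    while (video.read(frame)) {
        cv::resize(frame, small, cv::Size(), 0.25, 0.25);   // keep the cloud manageable
        cv::cvtColor(small, gray, cv::COLOR_BGR2GRAY);
        for (int y = 0; y < small.rows; y++) {
            for (int x = 0; x < small.cols; x++) {
                if (gray.at<uchar>(y, x) < 40) continue;     // key out the black parts
                cv::Vec3b c = small.at<cv::Vec3b>(y, x);
                cloud.push_back({x, y, z, c[0], c[1], c[2]});
            }
        }
        z++;                                                 // next frame = next slice
    }

    std::ofstream ply("stack.ply");
    ply << "ply\nformat ascii 1.0\nelement vertex " << cloud.size()
        << "\nproperty float x\nproperty float y\nproperty float z\n"
           "property uchar red\nproperty uchar green\nproperty uchar blue\nend_header\n";
    for (const CloudPoint& p : cloud)
        ply << p.x << ' ' << p.y << ' ' << p.z << ' '
            << (int)p.r << ' ' << (int)p.g << ' ' << (int)p.b << '\n';
    return 0;
}
```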
Another improvement I would like to see is a setup with higher-resolution projectors, allowing for more detailed fractals and more visible recursions. However, there are currently no 4K laser scanning projectors on the market, and 4K DLP or LCD projectors are too big and heavy for the current setup.
One alternative would be to have a larger projector sit on a turntable pointing upward, with the lens on the center axis, and use pan- and tiltable mirrors to reflect the image onto a wall. Another would be a moving head from an intelligent lighting system that is big enough to hold a 4K projector and move it freely.
Another idea that would be interesting to see is a modular setup: each movable projector would stand on its own tripod and have its own camera. Many of these modules would stand in a room with white walls (or a curved projection screen) and project what they see into each other's field of view.
A step further would be a completely optical setup, where the light going through a camera lens isn’t digitized but goes directly back onto a screen. This might be realized through optical fibers or mirrors and could produce much more detailed patterns with virtually no delay.