Printed from www.flong.com/texts/lists/slit_scan/
Contents © 2013 Golan Levin and Collaborators
An Informal Catalogue of Slit-Scan Video Artworks and Research
Compiled by Golan Levin. Begun: 1 March 2005. Last edit: 17 July 2010.
Image by Andrew Davidhazy
Slit-scan imaging techniques are used to create static images of time-based phenomena. In traditional film photography, slit-scan images are created by exposing film as it slides past a slit-shaped aperture. In the digital realm, thin slices are extracted from a sequence of video frames and concatenated into a new image.
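The digital version of the technique can be stated in a few lines. The following Python sketch (an illustration of the general idea, not code from any project in this catalogue) builds a slit-scan image by copying one fixed pixel column out of every frame, so that the horizontal axis of the result indexes time rather than space:

```python
def slit_scan(frames, slit_x):
    """Build a slit-scan image: column t of the output is column
    slit_x of frame t, so x in the result indexes time."""
    height = len(frames[0])
    return [[frames[t][y][slit_x] for t in range(len(frames))]
            for y in range(height)]

# Tiny synthetic demo: 4 frames of a 3x4 image whose pixel values
# encode (frame, y, x), so the provenance of each output pixel is visible.
frames = [[[(t, y, x) for x in range(4)] for y in range(3)]
          for t in range(4)]
image = slit_scan(frames, slit_x=2)
```

A stationary subject smears into horizontal streaks under this transformation, while anything crossing the slit is "unrolled" into a recognizable shape, which is the signature look of the works below.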
Recently I've seen many new-media projects based on slit-scan techniques. They range from student projects, to Java demonstrations on the Processing.org site, to works by recognized pioneers of video and interactive art. My inclination to make lists is irresistible, and so I've put together this catalogue as an aid to researchers and students. My aim is to be as inclusive as possible, rather than attempt to winnow the projects down to just a few ideal exemplars or the most significant historic precursors. Thus not all of the examples are even computational: some of the projects described below use motion-picture film, still photography, or analog video techniques. Please note that this page is not self-promotional; I have not produced any slit-scan based projects myself.
Eddie Elliott, one of the earliest researchers of digital slit-scan imaging, keeps a related list which is more oriented towards photography, early cinema and flipbooks. There is now a Flickr tag for slitscan images, and many of the latest and informal productions can be seen there.
SLIT-SCAN SOURCE CODE and DEMO PROJECTS:
Processing (Java) source code for slit-scanning:
(These examples were written for Processing v.123, and are known to work with v.1.0.1. Unlike the tiny code above, these examples are also extensively commented. They are revised from Neil Jenkins' slit-scanning code for Processing v.68.)
(Reposted from Clayton Partridge, OnePixelOff.com)
(Reposted from John Dalton's tutorial article on the Studio Artist User Forum)
(Reposted from Memo Akten's Timestretching in Quartz Composer page)
Artists, researchers and authors listed in this document:
- Susanne Jaschko (writings)
- Andrew Davidhazy (still photography)
- Jacques-Henri Lartigue (still photography, 1913)
- George Silk (still photography, 1960)
- William Larson (still photography, 1967)
- Douglas Trumbull (1968)
- Derek Burnett (1983)
- R/Greenberg Associates (1983)
- Jean-Michel Jarre (1984)
- Pipilotti Rist (1986)
- Zbig Rybczynski (1988)
- Bill Spinhoven (1988)
- Eddie Elliott (1992-1994)
- Toshio Iwai (1993)
- Joachim Sauter & Dirk Lüsebrink (1995)
- Ansen Seale (1996-)
- Tamás Waliczky & Anna Szepesi (1997)
- Björn Barnekow (1997)
- Romy Achituv & Michael Naimark (1997)
- Romy Achituv (1998)
- Paul Harter (1998, 2009)
- Christian Kessler (1998)
- Martin Reinhart & Virgil Widrich (1998)
- Sid Fels, Kenji Mase & Eric Lee (1999)
- Daniel Crooks (1999-)
- Bryan Mumford (2000's)
- Camille Utterback (2000)
- Christian Hossner (2000)
- Tania Ruiz Gutierrez (2000-2003)
- Greg Ercolano (2001)
- Mindfukc / Datadouche (2001)
- Jussi Ängeslevä & Ross Cooper (2001)
- David Tinapple (2001-2005)
- Egbert Mittelstädt (2001-2002)
- Steina Vasulka (1997-2002)
- Dietmar Offenhuber (2002)
- Michael Cohen et al. (2002-2003)
- Daniel Sauter & Osman Khan (2003)
- Osman Khan (2003)
- Kurt Ralske (2003-2004)
- Neil Jenkins (2004)
- Robert Seidel (2004)
- Fabian Thommen (2004)
- Brendan Dawes (2004)
- Stephan Schulz (2004)
- Sascha Pohflepp (2004)
- HC Gilje et al. (2004)
- Daniel Rozin (2004)
- Paul de Marinis (2004)
- Guy Hoffman (2004)
- Michael Terry et al. (2004)
- Alvaro Cassinelli & Masatoshi Ishikawa (2005)
- Miska Knapek (2005)
- Ji-Hoon Byun & E.J. Gone (2005)
- Mateusz Herczka (2005)
- Michael Aschauer (2005)
- Mark Hauenstein (2005)
- Martin Hilpoltsteiner (2005, 2007)
- Toshio Iwai + NHK (2005)
- Glen Murphy (2005)
- Dan Kaminsky (2005)
- Scott Owsley (2005)
- Scott Carver (2005)
- Mogens Jacobsen (2005)
- Andy Polaine (2006)
- James Seo (2006)
- Angus Leadley Brown (2006)
- Roman Haefeli (2006)
- Roy Tanck (1996-2006)
- Adam Magyar (2007)
- Kevin Atkinson (2007)
- Keith Lam (2007)
- Christian Rohner and Claude Hidber (2007)
- Geert Mul (2008)
- Juanjo Fernández Rivero (2008)
- Don Whitaker (2008)
- Nicolas Horne (2008)
- Alexei Shulgin and Aristarkh Chernyshev (2008)
- Mitchell Whitelaw (2008)
- Joe Baldwin (2008)
- Bradford Bohonus (2009)
- NYX (2009)
- He-Lin Luo (2009)
- Masayuki Akamatsu (2009)
Dr. Jaschko has written an article surveying the history of slit-scan imaging in the computational domain:
Space-Time Correlations Focused in Film Objects and Interactive Video. Published in: ISEA Papers, Nagoya, Japan (2002). Also published in Future Cinema: The Cinematic Imaginary after Film. Edited by Peter Weibel. MIT Press (2003).
Basics of Strip Photography
Camera for Conical Peripheral and Panoramic Photography
Improvised Scanning Digital Camera
Image by Andrew Davidhazy.
In several articles, artist/professor Andrew Davidhazy presents an excellent overview of predominantly pre-computational slit-scan photographic techniques and practitioners. A more complete list of Davidhazy's articles, treating advanced photographic techniques (e.g. infrared, ultraviolet, high speed, synchroballistic, panoramic, peripheral, schlieren, and photofinish photography) can be found here.
Car Trip, Papa at 80 Kilometers an Hour (1913)
This 1913 still photograph by Jacques-Henri Lartigue demonstrates an unintended form of slit-scanning, which results when a fast subject is captured by a camera with a slow vertical shutter. According to Derek Baird, who explains the effect in this blog post, the racing fans are even more distorted because Lartigue was panning to follow the car. The use of "rolling shutters" to create slit-scanned images is not limited to antique cameras; any modern digital camera with a CMOS sensor can produce the same effect, as seen in this 2009 photo by Soren Ragsdale -- taken with an Apple iPhone.
Hammer thrower, U.S. track team Olympic tryouts. © Time Inc. (1960)
A broad public got an early view of slit-scan imagery in a photoessay about the Olympics printed in Life Magazine in 1960. According to a New Zealand exhibition catalogue, George Silk, the photographer, had a portable slit-scanning camera made, "using a phonograph motor to drive the film past the slit which replaced a conventional shutter. The image produced by the slit camera turned the hammer thrower at the U.S. tryouts into a cartoon strongman, but also conveyed the intensely private moment of the athlete straining in his endeavour to win. The slit camera pictures were quite abstract — Silk said: 'I was thrilled when the prints showed strength, speed, design — originality.' For the tryouts story in Life, 18 July 1960, Managing Editor, Edward K. Thompson ran the slit-camera images as large illustrations alongside straight shots of the winners."
Figure in Motion Series (1967-1970)
According to the George Eastman House Still Photograph Archive, "these gelatin silver strip photographs were made with a modified 2 1/4" square camera, creating distorted nudes stretched across the picture."
2001: A Space Odyssey, Stargate Sequence (1968)
Douglas Trumbull created breathtaking abstract slit-scan sequences for the Stanley Kubrick film, "2001: A Space Odyssey" (1968). Trumbull states, "We were struggling with the Star Gate. Nobody knew what a Star Gate was; but, I came up with some ideas that I didn't even know at the time were based on some things I was learning as a young guy about street photography and weird photographic techniques...". The slit-scan effects can be seen in the embedded video starting at 1'06".
An explanation of Trumbull's technical methods can be found here: The Underview on 2001 SLITSCAN by Martin Kelly. See also Greg Ercolano's unique de-slit-scanning project, which computationally recovered Trumbull's original source images from the movie's slit-scan sequences.
R/Greenberg Associates (R/GA)
Renault Encore car commercial (1983)
This early commercial by R/GA used slit scanning to apply an "elastic effect" to a moving vehicle. Conceived and developed in 1981 by R/GA's Eugene Mamut, the effect was used in several other R/GA commercials for clients such as the US Army, Citrus Hill Orange Juice, and AT&T. Mamut has since worked it into movies including "Ladyhawke" (1985), "Predator" (1987), "The Abyss" (1989), and "Ghost Dad" (1990). For more information on Mamut's work for R/GA, see: "Elastic Effect: New Optical Wrinkle". American Cinematographer, October 1988, p. 97. (pdf)
Photographer Derek Burnett used slit-scan images for several record covers in the 1980s.
Zoolookologie music video (1984)
In a music video which is otherwise perhaps best forgotten, new-age/synthpop pioneer Jean-Michel Jarre applied some form of slit-scanning technique to brief video clips of his face and body. Jarre is well-known as an early adopter of digital technologies, and the 1984 Zoolookologie video features a number of signature early-80's visual effects. Information about the video's director is elusive, but it's my guess that the video was assembled on a Quantel Paintbox.
I'm Not the Girl Who Misses Much (1986)
Video artist Pipilotti Rist appears to have used some form of analog slit-scanning technique as an incidental visual effect in this video concerned with the representation of women in music video. The video's distributor, Electronic Arts Intermix, states: "Footage of the artist chanting the piece's title (a line adapted from The Beatles' song Happiness is a Warm Gun) is replayed at high and low speeds, with obscuring video effects, blurring into an almost painterly procession of images. Rist's manipulation renders her voice into a parody of female hysteria and her body into a grotesquely dancing doll." The slit-scan effect occurs between 2'15" and 2'45" in the five-minute video, and may be viewable on YouTube.
The Fourth Dimension (1988)
Rybczynski produced a 27-minute, 35mm color film exploring the choreographic and narrative potential of slit-scan techniques. The artist writes: "I shot the actors in their sets with all the movements, then in the printing phase I visualized the image in 480 lines and reproduced the images delaying, for example, each frame by one line. Thus, the last line ended up being 480 images later in respect to the higher one, so that when the head of a character is rotated, his feet are still in their original position."
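Rybczynski's per-line delay can be paraphrased computationally. The Python sketch below assumes the description above (scanline y of an output frame drawn from an input frame offset by y); Rybczynski's actual process was optical printing, not software:

```python
def time_shear(frames, t):
    """One output frame with Rybczynski-style line delay: scanline y is
    taken from input frame t + y (clamped at the end of the clip), so
    the bottom of the picture lags the top by height - 1 frames."""
    height = len(frames[0])
    last = len(frames) - 1
    return [frames[min(t + y, last)][y] for y in range(height)]

# Synthetic clip: frames[t][y] is a scanline tagged with (t, y).
frames = [[("line", t, y) for y in range(3)] for t in range(6)]
sheared = time_shear(frames, t=1)
```

With 480 scanlines, as in the film, the bottom line indeed falls 479 frames after the top one, producing the effect the artist describes: a rotating head whose feet have not yet moved.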
It's About Time / The Time Stretcher (1988-1994)
Documentation about this project is scarce, but Dutch artist Bill Spinhoven appears to have made an interactive slit-scan-based museum installation sometime in 1988, which he continued to develop in a variety of versions through the mid-1990s.
An undated review in German by Iris Dressler (here in rough translation) reports: "The prototype of the interactive installation 'It's About Time' was already developed in 1988. The installation is based on a closed-circuit video camera and monitor, which are manipulated, however, by an 'invisible' computer that connects them. The computer shows the picture of the viewer 'live', but stretched out in time. Through one's own movement, the viewer seems to be rebuilt on the screen, permanently atomised and displaced. One dissolves into spirals, pixels and loops, and can, depending on one's course of motion, disappear completely for a short time, as if one could 'beam oneself' from place to place. Inevitably one enters into this game of disappearance and return, equally dismayed and pleased at vanishing and reemerging. In the context of the exhibition, the installation is produced as a video projection."
Another description of the work, from a museum exhibition about mirrors, reports: "In Bill Spinhoven's installation, a video camera captures visitors' images and movements, and, through computer processing, projects distorted manipulations of the same scene: wiggled, twisted and contorted." From these descriptions it seems likely that the graphic interaction in Spinhoven's installation was similar to that later employed by Kessler (1998), Reinhart (1998) or Vasulka (2002).
Eddie Elliott appears to be one of the first people to have researched how slit-scan techniques could be applied to digital video. As early as 1992, he describes a variety of both utilitarian and playful uses of digital slit-scans, which he called "Video Streamers". Elliott principally used Streamers as part of a larger visual interface system for editing and manipulating video; later, however, he developed an educational/artistic exhibit (shown at the San Francisco Exploratorium) which computed Streamers from live participant video. Elliott also created playful transformations of Streamers, such as the folding paper box template shown above. Elliott's work is extensively documented in his 1994 PhD Thesis, which he produced in the Interactive Cinema Group of the MIT Media Laboratory.
Another Time, Another Space (1993)
According to the Leonardo journal: "Another Time, Another Space consists of multiple video cameras capturing live video images of visitors; these images are then manipulated in different ways by multiple computers. For example, the images may be manipulated through the scanning of each line in a given number of video frames and altering the time reference, creating time-lapse delays, slow-motion effects and time compression, or through scanning individual horizontal pixel lines within frame stacks and combining these as output. All of these processes could, theoretically, be achieved using traditional film-editing techniques, but not in real time. Another Time, Another Space exploits the possibilities afforded by computer-manipulated real-time video technology. This live sculpting generates strange and beautiful distortions of time and spatial dimensions displayed upon a rig of viewing monitors."
In a lecture at Doors of Perception, Iwai stated: "This is an installation which I exhibited in Antwerp Central Station for the EC Japan Feest for cultural exchange between Japan and Belgium (1993). The installation featured 15 video cameras, 30 computers, 30 video monitors, and a videodisk recorder. The comings and goings of people through the station were filmed by the cameras and manipulated in real time by the computers, deforming shape and time reference and showing a different time-space environment on each monitor. With tens of thousands of people passing through this public space every day, I wanted to make a piece with a highly participatory aspect, with which anyone could easily interact and perform. A great many people, regardless of age or sex, stood before the installation and played with it, far exceeding my expectations. It was very rewarding."
Of the NHK version (1994), Iwai continued: "This February, I tried to stage the same kind of event in Tokyo. Instead of making an installation, I used a huge monitor on the wall of a building at Shinjuku station. From the building, a video camera captured people walking around the station, and a computer transformed the image. Usually this huge monitor shows commercials and music clips all day long, and nobody pays attention. But after I started this event, everybody stopped and began to enjoy seeing themselves. I think this is one of the major powers of interactivity. The event was broadcast all over Japan by the national television network NHK."
See also Toshio Iwai's Morphovision project, which presents a unique form of three-dimensional slit-scanning.
Joachim Sauter & Dirk Lüsebrink
The Invisible Shape of Things Past (1995)
The artists write: "The project enables users to transform film sequences into interactive, virtual objects. This transformation is based on the camera parameters relevant to a particular film sequence on screen: (movement, perspective, focal length). The individual frames of the film are lined up along the path of the camera. The angle of the individual frames relative to the virtual camera path depends on the view from the actual camera, whilst the size of the individual frames depends on the focal length used. The rows of pixels at the frames’ edges define the outer membrane of the film object. A spatial / temporal concept was developed for the organisation and navigation of the film objects: In the case of Berlin, for example, all the urban development phases from 1900 onwards in the vicinity of the Museum Island and Potsdamer Platz were modelled and the film objects positioned according to their virtual location, as determined by when and where they were shot. Users are able to move about within time and space, interacting with the film objects. The final stage involved building an interactive installation and a material architectural model based on individual film objects."
Temporal Forms (1996-)
Seale has been working with high-resolution digital slit-scan photography since 1996.
Tamás Waliczky & Anna Szepesi
Sculptures (Time Crystals) (1997)
Waliczky and Szepesi derived sculptural 3D forms by treating the silhouettes of human performers, captured over time, as slices of (generalized) cylinders. These 'sculptures' were then displayed on screens in the form of virtual computer-graphic constructions. This project was commissioned and co-produced by the ZKM, Karlsruhe.
TimeMirror was a project created by Björn Barnekow while a student at the HdK Berlin. Barnekow writes: "The timemirror is a project to explore what happens if you look through a mirror in four dimensions. In a mathematical sense a mirror not only reflects light; it reflects an axis of geometry. If you place the mirror at the special angle of 45 degrees, you exchange two axes and can see the object from the side. But we have a fourth dimension: time. If you place the mirror 45 degrees diagonally in space and time, you exchange one of the space axes with time. Under normal conditions you can see the whole object, but only at one moment in time; you perceive motion by comparing the current picture with the previous one. But through the timemirror you can see all the time and motion of an object, but only one layer of space. This creates a completely different view, with the fourth dimension included. You cannot see the complete object; as in tomography, you have to move the visible layer through the object/time to recover the third space dimension."
Romy Achituv with Michael Naimark et al.
Be Now Here [Interactive] (1997)
Romy Achituv developed an interactive, slit-scan-based browser for panoramic footage originally shot by Michael Naimark for Naimark's prior Be Now Here project. Naimark's original footage consisted of 360-degree views of various World Heritage Sites (Timbuktu, Angkor Wat, Dubrovnik) shot with a stationary but slowly rotating camera.
Achituv writes: "The Be Now Here Interactive application is an experiment in imbedding a moving video image within a larger static visual context. It is also a prototype for an alternative structure for non-linear cinematic narrative. As the active video window moves across the screen it leaves a visual trail, which is a trace of the time and space of the cinematic path. The user can maneuver back and forth through each scene creating and erasing moments in time, laying down a panoramic still that also represents a captured slice of time, then returning to view the scene unfold and come alive. The application demonstrates the possibility for separating and independently controlling spatial and temporal cinematic elements within one narrative space. This application was created with footage shot by Michael Naimark and Interval Research Corp. for Naimark’s Be Now Here installation (1996)."
Pixel Present (1998)
Achituv writes: "In Pixel Present a pixel-wide video segment is stretched horizontally across a screen. With each consecutive video frame, the image advances one pixel to the right, leaving behind a trace of the previous recorded segment. This simple capture technique creates images in which the conventions of representing Motion and Stasis are reversed. Whereas motionless elements will appear as a series of streaks across the screen, moving images - crossing the camera’s visual scope - are reconstructed as a “scanned” imprint, appearing as discrete and discernable objects. Representational space in these images is nullified and replaced by Time, mapped onto a spatial axis. Moving the camera at varying speeds creates images with various densities of space. When the camera pans the space, the expressive image created by the scan reflects as much the videotaped surroundings as the character and dynamic of the user’s gesture."
Swap (1998, 2009)
The artist created an interactive slit-scan installation which was first exhibited in 1998 and then updated and shown again in 2009. From his abstract, he writes, "Einstein's maths teacher first proposed the idea of space-time, a unified four dimensional model in which space and time were not separate elements but were made of the same stuff. SWAP swaps one spatial dimension with that of time, cutting through the space-time continuum like a knife through a cabbage, revealing the intricate internal structure."
Christian Keßler writes: "Transverser is an interactive computer installation. The installation is based on the principle of chronophotography: movements are recorded in such a way that their course in time becomes visible in space. The movement of the visitor in front of a camera is translated into a surprising video projection in which the body's movement becomes the condition for the legibility of the image." Transverser was produced at the Kunsthochschule für Medien, Cologne, and was presented at DEAF 2000.
One of the installation's unique features is a beam (actually, a thin vertical plane) of light, produced by a static slide projector, which cuts the volume of the room. Visitors who step into this beam are both illuminated by it as well as captured by the computer's camera. In this way the beam literally illustrates the "slit" of the slit scanner.
Martin Reinhart & Virgil Widrich
Reinhart has worked with digital slit-scanning techniques on a variety of time-based and interactive projects, including a film co-produced with director Virgil Widrich.
Reinhart writes: "tx-transform is a film technique invented by Martin Reinhart in which the time and space axes are transposed....Martin Reinhart has been working on this technique and refining it since 1992. tx-transform was presented to the public for the first time at the Ars Electronica in Linz in September of 1998."
Elsewhere on his web site, he continues: "For a couple of years now, a new film technique developed by Martin Reinhart has been astonishing audiences at international film and multi media festivals: tx-transform. No matter in which way one experiences the tx-technique - either by seeing the short film titled after the process, or in the form of the interactive tx-transformator - tx turns the familiar perception of time and space upside down. It opens up a universe of so far unimagined pictures. In addition to its artistic character, tx-transform also offers a commercial aspect. It represents a capable and professional tool for generating spectacular effects in high definition for feature films as well as commercials."
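The core of the tx-transform idea, swapping the time axis with one image axis, can be expressed compactly. This Python sketch is a paraphrase of the published description, not Reinhart's software: output frame f collects, from every input frame t, the single pixel column x = f, placing it at horizontal position t.

```python
def tx_transform(frames):
    """Transpose the time and horizontal-space axes of a clip:
    in output frame f, the pixel at (t, y) is the pixel at (f, y)
    of input frame t. A W-pixel-wide, T-frame clip thus becomes
    a T-pixel-wide, W-frame clip."""
    T = len(frames)
    H = len(frames[0])
    W = len(frames[0][0])
    return [[[frames[t][y][f] for t in range(T)]
             for y in range(H)]
            for f in range(W)]

# Demo clip tagged with (t, y, x) so the axis swap is checkable.
frames = [[[(t, y, x) for x in range(3)] for y in range(2)]
          for t in range(4)]
out = tx_transform(frames)
```

Playing the transformed clip moves the "slit" across the original picture over time, which is why tx-transformed films look like slit-scans set in motion.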
Reinhart has sought to claim patent protection for the slit-scanning technique. He writes: "The software for enabling the method is copyrighted by law. tx-transform method: European patent application EP0967572 A2 and a corresponding US-patent application."
Sidney Fels, Kenji Mase & Eric Lee
Video Cubism (1999)
Fels et al. allow a volume of video, represented by an XYT video "cube", to be sliced by an arbitrary viewing plane. As an alternative, their work also permits the use of an unusual viewing sphere, whose surface indexes through the cubic video volume in a curved manner. These cutting surfaces can be moved and manipulated interactively, in real time.
The artists write: "Viewing video data along the X-T axis and Y-T axis has appeared in several forms in the literature. The main distinctions this work has is that the cut plane (or cut sphere) used to view the video data can be moved to any angle and position in real-time. This provides an opportunity to interactively explore the video cube from many different angles to get both aesthetically interesting static images as well as motion effects. Currently, a single cut plane or a cut sphere is supported. With the cut plane, investigation is like being able to move a window around the video cube to see all sides as well as inside the video data; hence the name video cubism. With the cut sphere, unusual images are seen as a curvature cuts through time and space."
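The cut-plane idea can be sketched as follows. This is a simplified paraphrase of the description above: the plane here is parameterized by slopes dt/dx and dt/dy rather than by the interactive controls Fels et al. describe, and the cut sphere is omitted.

```python
def plane_slice(frames, t0, dtdx, dtdy):
    """Sample the XYT video cube along a tilted plane: the output pixel
    at (x, y) is read from input frame t0 + dtdx*x + dtdy*y (clamped),
    at the same spatial position. dtdx = dtdy = 0 reproduces an
    ordinary frame; a nonzero slope shears time across the picture."""
    T, H, W = len(frames), len(frames[0]), len(frames[0][0])
    out = []
    for y in range(H):
        row = []
        for x in range(W):
            t = min(max(int(round(t0 + dtdx * x + dtdy * y)), 0), T - 1)
            row.append(frames[t][y][x])
        out.append(row)
    return out

# Demo cube tagged with (t, y, x) provenance values.
frames = [[[(t, y, x) for x in range(3)] for y in range(2)]
          for t in range(5)]
slice45 = plane_slice(frames, t0=0, dtdx=1, dtdy=0)
```

Sweeping t0 while holding the slopes fixed animates the plane through the cube, which is the real-time exploration the authors describe.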
Static No. 12 (1999-)
Melbourne-based artist Daniel Crooks divides his time between art-making and his work as motion graphics designer at ACMI (Australian Centre for the Moving Image). According to his site, Crooks "began his Time Slice project in 1999, exploring alternative models of spatio-temporal representation through the moving image. One of the main threads of this investigation is the formal treatment of time as a spatial dimension, as a tangible and malleable material."
Streak Photography (2000's)
Bryan Mumford is an expert in high-speed and exotic still-photography techniques. Mumford created the images shown here through a process he calls "streak photography", in which a slit-scan-based image is constructed from multiple still images of an object on a slowly-rotating turntable. He writes that these images "...are composite images built up from a single streak (or strip) out of many different images. The images I show here were created using the 'Time Machine' [Mumford's precision camera-timing device, which he sells], a programmable rotary table, and an Olympus E-10 digital camera....The image shown here is what I started with: an iris in a bottle. The bottle is sitting on a rotary table. The rotary table is computerized and can be instructed to move by specific amounts. The Time Machine's flash output is used to trigger motion in the rotary table. The procedure is as follows: 1) take a picture of the iris 2) rotate the flower 1.8 degrees 3) take another picture. This process is repeated over and over until 500 to 800 pictures have been taken from all different angles."
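The turntable procedure reduces to a simple loop. In this Python sketch, capture(angle) is a hypothetical stand-in for the camera and rotary table (it is not Mumford's Time Machine interface); one pixel strip is kept from each exposure and the strips are laid side by side:

```python
def streak_photograph(capture, steps, slit_x):
    """Capture, rotate, repeat: keep the column under the slit from
    each exposure, then concatenate the columns so the horizontal
    axis of the composite sweeps around the subject."""
    columns = []
    for i in range(steps):
        frame = capture(i * 360.0 / steps)           # exposure at this angle
        columns.append([row[slit_x] for row in frame])  # keep the slit only
    height = len(columns[0])
    return [[columns[i][y] for i in range(steps)]
            for y in range(height)]

# Stand-in camera: returns a 2x3 image tagged with the turntable angle.
capture = lambda angle: [[(angle, y, x) for x in range(3)] for y in range(2)]
composite = streak_photograph(capture, steps=4, slit_x=1)
```

At 1.8 degrees per step a full revolution takes 200 exposures; Mumford's 500-800 pictures imply more than one revolution, stretching the subject further around the composite.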
Liquid Time (2000)
Utterback writes: "In the Liquid Time Series installation, a participant's physical motion in the installation space fragments time in a pre-recorded video clip. As the participant moves closer to the projection screen they push deeper into time—but only in the area of the screen directly in front of them. Beautiful and startling disruptions are created as people move through the installation space. As viewers move away, the fragmented image heals in their wake—like a pond returning to stillness. The interface of one's body—which can only exist in one place, at one time—becomes the means to create a space in which multiple times and perspectives coexist. The resulting imagery can be described as video cubism. To create this imagery Utterback's software deconstructs the video frame as the unit of playback. This piece destabilizes a basic premise of time based media—that the unit of recording is also the unit of playback."
This 35mm film was produced by Christian Hossner, a student at the Academy of Media Arts (KHM) in Cologne.
Tania Ruiz Gutierrez
Ms. Gutierrez's 2004 doctoral thesis from Paris University is a comprehensive treatment of spatiotemporal imaging, and includes both an historical overview of relevant precedents and documentation of several of the artist's own computational projects. Gutierrez's thesis, entitled Études sur le temps et l'espace dans l'image en mouvement: Tissage vidéo, objets spatio-temporels, images prédictives et cinéma infini [36Mb pdf], translates roughly to "Studies on time and space in the moving image: video weaving, spatio-temporal objects, predictive images and infinite cinema".
About the projects shown above, Gutierrez states, "These tri-dimensional objects, obtained through the volumetric interpretation of the time captured in a cinematographic recording, introduce the idea of the spatialization of time. They are situated at the intersection of the still image, cinema and sculpture and constitute the starting-point of many theoretical and artistic researches; of both virtual and actual objects."
2001 A Space Odyssey: Unwrapping the Slit Scan Sequences (2001)
This project is unique insofar as it entailed the inverse of the technique used in all of the other projects discussed here: the computational recovery of the original source imagery used in the construction of someone else's slit-scan video sequence -- in this case, the well-known sequences created by Douglas Trumbull for Kubrick's 2001. Artist-engineer Greg Ercolano calls his technique "de-scanning".
Ercolano writes: "While watching Kubrick's '2001: A Space Odyssey', I thought it would be fun to write some software to unravel the slit scan artwork in the psychedelic sequences, to see what they were. The results of this experiment are below. The technique used to unravel the sequences involved using an SGI's real-time video hardware, with a hacked version of 'videoin.c' (from the SGI example programs) to accumulate scanlines from the DVD and concatenate them back into the original artwork. So as the film played, the program ran, unrolling the scanlines in realtime."
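The principle is easy to demonstrate in miniature. The Python sketch below is a schematic round trip, not Ercolano's SGI code: a forward model in which the camera tracks across a large artwork (so frame t is a window starting at column t), and a de-scan step that re-concatenates one column per frame to recover the source.

```python
def sweep_scan(artwork, width):
    """Forward model: the camera tracks across the artwork, so frame t
    is a width-wide window starting at column t -- roughly how
    2001-style slit scanning exposes successive strips of one large
    piece of art."""
    return [[row[t:t + width] for row in artwork]
            for t in range(len(artwork[0]) - width + 1)]

def descan(frames, slit_x):
    """Recover the artwork by concatenating, from every frame, the one
    column that sat under the slit."""
    height = len(frames[0])
    return [[frames[t][y][slit_x] for t in range(len(frames))]
            for y in range(height)]

# Round trip on a tiny synthetic artwork.
artwork = [[y * 10 + x for x in range(8)] for y in range(3)]
frames = sweep_scan(artwork, width=3)
recovered = descan(frames, slit_x=0)
```

With slit_x = 0 and a full-length sweep, the recovered image matches the swept portion of the original exactly, which is why Ercolano could pull Trumbull's artwork back out of the finished film.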
Mindfukc / Datadouche
The Mindfukc collective present a surprising spatiotemporal transformation of video in this online project. A scene (of several people sitting and standing) is captured by a camera moving around them, along 180 degrees of a circular track. In their slit-scan derived movie (at left), time is substituted for space: each vertical pixel column in the output movie represents a complete time-frame's worth of information as the image is spatially scanned from left to right. The creators provide their NATO 0.55 source code.
Jussi Ängeslevä & Ross Cooper
Last Clock (2001)
More Last Clock examples
The artists write: "Last is a clock that is a record of its own history. Like a familiar analogue clock, it has a second hand, a minute hand and an hour hand. The hands are arranged in concentric circles, the outermost circle being seconds, the middle circle is minutes, and the innermost circle hours. Each of the hands of Last are made from a slice of live video feed. As the hands rotate around the face of the clock they leave a trace of what has been happening in front of the camera. Once Last has been running for 12 hours, you end up with an easy-to-read mandala of archived time."
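As a data structure, each hand of such a clock is just a ring buffer indexed by the hand's position. This Python sketch is a reduction of the description above (only the second hand, with rendering of the slices as wedges of an annulus omitted):

```python
def update_ring(ring, second, column):
    """One tick of a Last-Clock-style second hand: the slice of live
    video captured at this second overwrites slot second % 60, so the
    ring always holds the most recent revolution's worth of history."""
    ring[second % 60] = column
    return ring

# Simulate 75 seconds of ticks: one full lap plus a quarter.
ring = [None] * 60
for s in range(75):
    ring = update_ring(ring, s, ("slice", s))
```

The minute and hour rings work identically at slower rates, which is how the face fills into the 12-hour mandala the artists describe.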
Tinapple constructed a large custom scanner, in which an electromechanical rig slowly moves a high-resolution still camera along a six-foot vertical or horizontal path. The artist then computes slit-scan images from the image sequences captured in this way. Tinapple's portraits, created using this scanner, have an unusually flattened 'medieval space' with a multitude of layered perspectival planes.
Mittelstädt has created several short video works based on slit-scan transformations. Presently, the artist tours with the Norwegian group Biosphere, for whom he produces visuals.
The artist writes: "A photo camera with a special photographic technology exposes film material continuously (without a shutter). This photo camera captures the movements of the model and saves them, like a scanner, on the film material. The resulting photograph is distorted. The images of a video film, recorded at the same moment as the photograph, are blended onto this photograph. The way in which video images and photography merge shows how the photograph was derived from the real scene, as captured in the initial video images."
Bent Scans (2002)
Electronic-art pioneer Steina Vasulka has been researching artistic uses of the video medium since the mid-1960s. Some of her recent installations involve slit-scan imaging in real-time contexts.
About Bent Scans, Steina writes: "From early analog video days I have always had a fascination for signal/system interplay in image and sound processing. Digital video offers whole new vistas, especially through storing and retrieving of moving images in warped time. The installation uses four computers resulting in four different image projections. Though all four computers have the same camera input, a different program on each creates a very different video image on each projection. By stepping into the camera view, the visitor will experience a different view of him or herself in an immediate past time."
Steina has apparently been using slit-scanning since at least 1997. According to a personal communication from HC Gilje, a software project called "Image/ine" made real-time slit-scanning possible on consumer Macintosh computers in 1997. Image/ine was initially developed in December 1996 at STEIM by Tom Demeyer (in collaboration with Steina).
Loop City (2004)
Offenhuber has computed extremely long slit-scan panoramas from video captured through the windows of moving automobiles. These image strips, moreover, are texture-mapped onto 3D paths derived (with the aid of digital maps and GPS) from the actual trajectories of the vehicles. These paths can then be browsed and navigated interactively. In Offenhuber's Wegzeit, the video is captured from a side-facing window of a car; in Loop City, the video is captured using a special conical mirror, through a front-facing window, in order to capture the panoramic trace of a moving 360-degree (up-down-left-right) view.
Offenhuber writes: "Wegzeit explores how non-isotropic space— that is space that is structured by relative units — can be used in VR and architecture. It offers a dynamic view of Los Angeles’ structure that is radically different from conventional architectural representations. We usually consider space as being structured by absolute units. A meter is considered to have a constant length regardless of its position in space. However, in our daily life we often use units that are relative in nature: we measure space in minutes, costs or memories. Wegzeit is also a project about Los Angeles and how it is transformed when brought to relative space. Asking someone in L.A. about the distance between two locations usually prompts a response in minutes. It seems paradoxical that in a city with such a regular, Cartesian layout, people rely on subjective parameters for their spatial decisions. But especially here, perhaps, where the influence of real space is leveled by this regularity, the impact of relative spaces becomes more strongly visible. The project consists of six dynamic virtual environments that propose models of how to visualize three-dimensional relative spaces. They deal with certain properties and effects caused by the nature of relative space such as the asymmetry of temporal distances."
Michael Cohen et al.
Stylized Video Cubes (2002)
From the authors' abstract: We present a new set of non-photorealistic rendering (NPR) tools for processing video. Our approach is to treat the video as a spacetime volume of image data. Previous tools to process video for an impressionist effect have painted collections of two-dimensional strokes on each successive frame of video. In contrast, we create a set of “rendering solids.” Each rendering solid is a function defined over an interval of time; when evaluated at a particular time within that interval, it provides parameters necessary for rendering an NPR primitive. Rendering solids can be rendered interactively, giving immediate feedback to an artist along with the ability to modify styles in real time. Benefits of our approach include: a more unified treatment of the video volume’s spatial and temporal dimensions; interactive, aesthetic flexibility and control; and the extension of stylized rendering techniques for video beyond the impressionist styles previously explored. We show example styles inspired by impressionist, cubist, and abstract art of the past century.
Image Stacks (2003)
From the authors' abstract: We present a simple but powerful Image Stack process for creating an enhanced image from a stack of registered images. This paradigm combines pixels using multi-image operations on a set of images of the same subject matter. We demonstrate how Image Stacks can help create group photographs, enhance high dynamic range images, combine images captured under different lighting conditions, remove unwanted objects from images, and combine images captured at different times and with different focal lengths.
Daniel Sauter & Osman Khan
We interrupt your regularly scheduled program (2003)
A slit-scan treatment of live television in an attractive installation format. The artists write: "The installation setup is as follows: A television is placed facing the wall, its flickering glow reflecting off the wall and its sound echoing in the space. Its broadcast signal is simultaneously sent to a computer, where customized software processes the broadcast in real time by collapsing every frame of the television image into a one pixel-wide slice. These slices are horizontally arranged in sequence and projected back onto the wall next to the television set, showing an abstracted history of the broadcast signal. Cinematic cuts are transformed into clear vertical sections. Zooms become visualized as curves. Commercials and music videos are seen as vibrant vertical patterns and hectic splashes of color, while News programs are calming studies of horizontal smears. Visitors are encouraged to switch channels with the remote control and explore the relationship between the broadcast, its sound and the projection."
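The process Sauter & Khan describe is the canonical digital slit-scan, and it reduces to a few lines of array code. The sketch below is purely illustrative (Python/NumPy; the frame sizes and the synthetic "broadcast" are assumptions, not the artists' software): each frame is collapsed to its centre column, and the columns are laid out left to right, so a hard cut shows up as a clean vertical edge, exactly as the artists note.

```python
import numpy as np

HEIGHT, WIDTH, N_FRAMES = 120, 160, 90

def slit_scan(frames):
    """Concatenate the centre column of each frame into one image."""
    return np.stack([f[:, f.shape[1] // 2] for f in frames], axis=1)

# fake broadcast: a sudden brightness change at frame 45 stands in for a cut
frames = [np.full((HEIGHT, WIDTH), 50 if t < 45 else 200, dtype=np.uint8)
          for t in range(N_FRAMES)]

scan = slit_scan(frames)   # one column per frame: shape (HEIGHT, N_FRAMES)
```

In the resulting image, columns 0-44 hold the first "programme" and columns 45 onward the second, with a crisp vertical section between them.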
Sur la Table (2003)
Various colored objects are positioned on a table, and can be introduced, moved and removed by museum visitors. Real-time video of the objects is used to generate a slit-scan image which is then projected onto the table, creating the impression that the colors are running off the objects.
Khan writes: "Sur la Table revisits the domestic situation of the table. Events that normally occur on/over a table (the placing of objects, hand gestures, etc) are amplified through projection and become the basis for interactivity, ultimately changing the visitor's relation to the table. Using a camera as input, events occurring on/over the table are projected back onto the table so that a historic timeline of events is visualized as a continuous flow of images down the table."
Ralske's Elektropoesia (2004) [pictured], created in collaboration with Norwegian composer Trond Lossius, involved 2-channel video (projected 8 meters wide) and 16 speakers for 16-channel audio, and was presented at the Electrohype exhibit at Malmö Konsthall, Sweden. In this project, video material of the aftermath of the first Gulf War is atemporally re-processed into a slit-scan image of fire and burning earth. In Ralske's earlier Amstel (2003), a ten-minute video shot from a boat in Amsterdam is atemporally processed into a slit-scan panorama.
Ralske relates that he had two principal intuitions which led to this work: "In 2000, I realized that the standard way of playing video frames ("display 30 frames per second -- start at the first frame and end at the last") is really only an arbitrary convention. By default, a one-minute video clip is something intended to be experienced in one minute. But why not consider the one-minute clip as a collection of 900 still images?
"[In 2003], I realized that not only is the sequence of frames arbitrary; the frame itself is an entirely arbitrary convention. One minute of video can also be considered to be a collection of 300 million numbers.... Now the possibilities multiply infinitely. The question is no longer, "Which frame to display next?", but "How to create a frame out of this arbitrary mass of data?" I decided to consider the data as a 3-dimensional space. A frame is created by defining a surface in the 3-D space.
"The point of these techniques is that they allow research on temporality, on the nature of time. In standard video playback, time is identical to time as we live it. In the techniques [used here], something is revealed about time: we gain a new perspective on events, wherein beginning, middle, and end are seen simultaneously. A single event can have any of an infinite number of manifestations depending on the time-angle it is viewed from. Collectively, these time-perspectives can be called 'the atemporal'. We cannot see the atemporal in life because we are bound by time. These techniques provide views of atemporality."
The Processing environment is rapidly developing and changing. Free source code for slit-scanning, updated for the more recent Processing version 123, is available here.
Robert Seidel created the video "_grau" using various panoramic slit-scan methods to stylistically invoke the feelings he experienced after being in a car crash. Seidel states, "... _grau is a personal reflection on memories coming up during a car accident, where past events emerge, fuse, erode and finally vanish ethereally… various real sources were distorted, filtered and fitted into a sculptural structure to create not a plain abstract, but a very private snapshot of a whole life within its last seconds…."
Thommen creates slit-scan landscapes of specific venues in Germany.
Don't Look Now (2004)
About Don't Look Now, Dawes writes: "This is a graphic interpretation of the 1973 film "Don't Look Now" directed by Nicolas Roeg. Using specially written software made with the Processing Java environment, every single frame of the movie is rendered 1 pixel wide by 300 pixels high.... Even though the resulting output is twisted into distorted shapes, you can still make out parts of the actors and scenes. Tracking shots reveal themselves as stretched images while quick edits appear as staccato bars of colour."
Schulz constructed extremely long panoramic images using a camera mounted on a cleverly and economically customized dolly. Schulz writes: "For the filming process I built a dolly with which I filmed people on the street and at the same time recorded the position of the dolly on the street. This allows me now to project each video clip in the position where I filmed it. I used an older, mechanical computer mouse and attached it to the dolly's wheel, which scrolls when the dolly moves."
Pohflepp presents an interactive Shockwave experiment in which the user can, through a mouse gesture, define a rectangular region of temporal displacement in a repeating video loop.
Norwegian video artist HC Gilje, in collaboration with various theater companies, has incorporated live slit-scanning into several performances. For example, of the piece "Elevator", the artist writes, "...the raw material for the video is the motion of the dancers captured by two live cameras. The resulting video is backprojected on three 3 x 1.2 m black screens."
Time Scan Mirror (2004)
Rozin presents a real-time interactive slit-scan mirror similar to those of Spinhoven, Kessler, Vasulka and (later) Polaine. The artist writes that the Time Scan Mirror is the first of his "Software Mirrors to deal with issues of time. This series of software mirrors examines notions of time, scanning, motion and stagnation. In Time Scan Mirror only one vertical line of pixels is scanned and is continuously copied sideways; the result is a 'log' of about 30 seconds of whatever crossed in front of the center of the mirror. This also yields a multi-angled representation of people's faces."
Paul de Marinis
Tongues of Fire (2004)
Paul de Marinis produced slitscan images of a gas flame, visualizing the way in which this flame had been modulated over time by fluctuations in air pressure from a nearby sound source. The artist writes: "These traces are made by vibrating a flame by placing a speaking tube in proximity to the gas supply. I followed the descriptions from John Tyndall's Sound in constructing the manometric capsule and, following the example of Koenig's manometric flame and of Dayton Clarence Miller later in the 19th century, adapted an old bellows-camera into a slit scan recording device to inscribe the flame variations on 120 roll Ektachrome film in realtime. These images are the precursors of the oscilloscope traces that followed, and of the graphical displays now seen in our audio editing software."
Video Time Warping (2004)
Hoffman writes: "Time warping is a video effect that remaps time over the space of the frame. It was inspired by Zbig Rybczynski's 1988 short film, 'The Fourth Dimension'. Following a conversation with one of the poor souls who slaved on the painstakingly analog 'Fourth Dimension', I thought that it might be cool to experiment with a digital version of this technique."
Michael Terry et al.
Time Maps (2004)
Michael Terry and colleagues, working as a student team in Diane Gromala's Experimental Media class at Georgia Tech, used slit-scan techniques to produce a wide range of time-lapse investigations. These were subsequently described in a SIGGRAPH 2004 Technical Sketch, "Making Space for Time in Time-lapse Photography."
The team's "Time Maps" are images and/or videos in which slit-scans are compiled from long-term time-lapse imagery. The source strips may be compiled horizontally (as in the top example above), or along any other axis; one unusual contribution is their so-called "tree ring" or radial time-maps, in which time-displacement is mapped to radial distance from an arbitrary location in the image. Another contribution is their "tiled views" or calendric time-maps, which, they write, "are particularly well suited for revealing long-term trends and periodic events." Terry's team typically uses video from outdoor web-cams as source material, thus effectively visualizing the progress of day/night cycles, weather patterns and the change of seasons.
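A hedged sketch of the "tree ring" idea follows; this is an illustration of the principle, not the team's own code, and the sizes, the centre point and the synthetic frame stack are all assumptions. Each output pixel samples the source frame whose index is proportional to that pixel's distance from a chosen centre, so time grows outward in rings.

```python
import numpy as np

H, W, T = 100, 100, 50
# synthetic time-lapse: every pixel of frame t has value t, for easy checking
frames = np.stack([np.full((H, W), t, dtype=np.uint8) for t in range(T)])

def tree_ring(frames, cx, cy):
    """Radial time-map: frame index grows with distance from (cx, cy)."""
    t_n, h, w = frames.shape
    ys, xs = np.mgrid[0:h, 0:w]
    r = np.hypot(xs - cx, ys - cy)
    t = np.clip((r / r.max() * (t_n - 1)).astype(int), 0, t_n - 1)
    return frames[t, ys, xs]

ring_map = tree_ring(frames, cx=50, cy=50)   # time radiates from the centre
```

With web-cam footage as input, the centre of the image would show the most recent moment and the rim the oldest, giving the concentric day/night bands the team describes; a horizontal ramp in place of the radius recovers the ordinary left-to-right time map.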
Terry et al.'s circular "tree-ring" pattern is logically orthogonal to the spatiotemporal organization employed by Ängeslevä & Cooper in their "Last Clock".
Alvaro Cassinelli & Masatoshi Ishikawa
Khronos Projector Main Site (2005)
Khronos Projector Simplified Versions (Processing, 2006)
Cassinelli and Ishikawa's Khronos Projector is a multi-faceted project exploring space-time representations from video. One manifestation is an interactive spatial browser for time-lapse sequences. The artists write: "Time lapse photographic sequences are formed by taking a snapshot every minute, hour or day, from a fixed camera shooting at a natural or artificial landscape. A 'Time-Punch' brings the night as a dark eye in the middle of the sky. More simply (and classically), a spatio-temporal gradient can be formed on the image by selecting a plane temporal filter."
Miska Knapek captured 24-hour time-lapse photographs of the sea between Denmark and Sweden, with a fixed camera, and then produced a series of images in which these time series are slit-scanned from left-to-right or top-to-bottom. The images provide a record of a landscape of great personal significance to the artist, as well as a more analytic record of the time-based atmospheric effects in this locale.
Knapek writes: "I set up my trusty digital camera, and captured images for 24 hours. Midnight to midnight. Then I've applied a magic spell to show all the images of a given 24-hour period in one image. The temporal progression in the images begins at the left, or the top, and continues to the other end of the image. And, voilà, one has an image consisting of images of the temporal period of the given images."
Knapek continues to make elegant, vertically moving horizontal slit-scan images of various European localities at different times of day. Other projects by Knapek include "longlapses: mannerheimintie south, sliced horizontally" (2009) and "kiasma cafe north, cut horizontally" (2009).
Ji-Hoon Byun & E.J. Gone
Timescape is a straightforward slit-scan visualizer for live video, principally treating urban pedestrian traffic as subject matter. The artists write: "Timescape is the image representation of sequentially stacked 1-pixel-wide video lines, each extracted from the camera input. Timescape represents the apparition of the time-flow of the space, whereas photography is the visual presentation of the moment of space and movies express the space changed by time."
Herczka used a cross-country train as the operative mechanism of a slit-scan imager. The resulting video is displayed in a wide-screen installation. Because of the specific choices of landscape and transform, the video reveals a strange inverted parallax where objects near the camera stand still while faraway objects move: the farther away the faster.
The artist writes: "44\13 is based on a single video take which I made during a train trip between Lelystad and Almere in the Netherlands. The footage documents a traveller's perspective through an artificial and completely flat landscape, between two cities designed to be neither old-fashioned nor radical. I transformed the video with an analytical algorithm similar to slitscanning, which outputs a very wide moving image showing an analysis of what was seen, time progressing to the right. The result is a combination of high speed and extreme slowness, on the edge between monumental landscape painting and visualisation science. 44\13 is based on very specific choices of subject, process and technology - its perplexing and poetic qualities emerge from interaction between this particular piece of land and specific iterations of the computer process. To make the 12-meter-wide image possible, the video is rendered and projected in a special high-resolution format with custom software."
Danube Panorama Project (2005)
This project, developed at the Vienna University of Applied Arts, is a comprehensive slit-scan panorama of the entire Danube river coastline. It is remarkable both for the scale of its ambition, as well as for its unusual use of GPS-based locative technologies. Every strip in the panorama is indexed by its latitude and longitude, derived from GPS, thus permitting the images to be situated and browsed with an unusual quality and degree of geographic precision.
Aschauer writes: "The Danube Panorama Project is an experimental approach to photographic mapping and cartography. Its vision is to produce a full panorama of the Danube's river sides by digitally slit-scanning its coastlines, resulting in a unique 'cross section' of Europe. The Danube - 'Europe's River of Destiny' - which like nothing else connects Western, Middle and Eastern Europe, and which like nothing else reflects the inconsistent relationships of its peoples, cultures and religions through all its historic and current functions, will serve as the symbolic red line of this photographic survey."
Pancam was produced by Mark Hauenstein for his Master's project in Interaction design at the RCA, London. The artist produces long panoramic images by slit-scanning video shot from a moving camera. There are congruencies between his approach and those of Achituv, Offenhuber, Schulz, and Herczka.
Hauenstein writes: "Pancam is a way of taking very long panoramic images. A sequence of images is taken from a moving vehicle. A central stripe of each consecutive frame is then cut out, and the stripes are joined together into one long image. The whole process can be automated with common digital devices. A standard DV camera handles the recording of the image sequence, and a PC application takes the video feed and turns it into a static image."
An interesting if perhaps unintended feature of Hauenstein's technique is a peculiar "glow" in the sky around buildings and trees. This appears to be due to an "autogain" feature on his camera, which attempts to compensate for changes in overall image brightness.
"Recreating Movement" is a diploma thesis presented in July 2005 by Martin Hilpoltsteiner at the University of Applied Sciences, Wuerzburg, Germany, in the field of communication arts. The artist writes: "Recreating Movement is an experimental tool approach for analysing film sequences. Single frames of a film are extracted and arranged behind each other in a three-dimensional space. The frames -- which are normally visible for only a split second during a film -- are rendered into a tube-like frame set that "freezes" a particular time span in a film. Various filters and settings can be applied to the film sequence with a displayable menu bar."
Hilpoltsteiner's project offers several unusual transformations of slit-scan (XYT) video volumes. These include visualizations which expose the cutting (editing) structure of the examined video; volumes which show the presence of a selected color over time; and visualizations which distort the cross-sectional area of the volume according to the amplitude of its accompanying audio track. The artist also uses transparency very cleverly; in the example pictured above, Hilpoltsteiner achieves an Edgerton-like visualization of a tennis serve by removing the background from the video frames before constructing the XYT volumes.
Hilpoltsteiner has continued to explore stacked slit-scan visualizations and created the website MoFrames, which is, as he states, "a growing archive of movements displayed in a three-dimensional space [where] single frames of a pre-keyed film sequence are arranged one behind the other in a three-dimensional space."
Toshio Iwai with NHK Science & Technical Research Laboratories
Morphovision: Distorted House (2005)
Japanese media artist Toshio Iwai has systematically explored the potential of the computer to create interactive and physical extensions of cinema. His astonishing project "Morphovision" may be regarded as a three-dimensional analogue to two-dimensional slit scanning. I am indebted to Alvaro Cassinelli for this insight. [This installation is a rare treat: see it by any means possible. -- Golan.]
Conventional slit-scan images assemble a two-dimensional composite image by assembling or concatenating many thin slices of two-dimensional source images. Generally these source images are captured across a span of time, as with frames of video or film.
In Iwai's Morphovision, a (real, not virtual!) three-dimensional composite form is assembled from many thin slit-scans of a rapidly rotating, three-dimensional house sculpture. The slices are created by passing a narrow beam of light over the rotating house; they are fused into a single three-dimensional form by the mind and eye, through the perceptual phenomenon known as persistence of vision.
By rapidly moving the beam of light back and forth across the rotating sculpture, Iwai is able to image the entire 3D structure, and not just a thin cross-section of it. By changing the shape of the beam's trace from a straight line to a curvilinear path, Iwai selectively reveals the object at slightly different points in time. In this way he is able to create the remarkable illusion of a distorted and wriggling 3D sculpture.
Robot Mirror (2005)
The artist presents an interactive video mirror in which the brightness of the camera's real-time image, on a per-pixel basis, is used as a time-displacement map into the camera's own recent input history. The technique represents a generalization of slit-scanning, insofar as the time-displaced regions need not lie along a linear slit.
Murphy writes: "If we show a video on screen, we are displaying the passage of time. If we mount a camera above the screen, we can measure local brightness (light) in the form of a black and white image. As the camera is pointed outwards, the viewers provide the brightness. We then map brightness to time, where if the light is bright, we show the most recent part of the video, and if it is dark, we show earlier video. This is computed on a per-pixel basis."
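Murphy's brightness-to-time mapping can be sketched as a per-pixel lookup into a ring buffer of recent frames. The following is an illustrative reconstruction, not the artist's code; the buffer depth and the synthetic half-dark "camera" frame are assumptions.

```python
import numpy as np

H, W, DEPTH = 60, 80, 8
# history buffer: older frames are flat grey levels, for easy checking
history = [np.full((H, W), 10 * t, dtype=np.uint8) for t in range(DEPTH - 1)]
newest = np.zeros((H, W), dtype=np.uint8)
newest[:, W // 2:] = 255          # right half of the scene is brightly lit
history.append(newest)            # history[-1] is the most recent frame

def brightness_mirror(history):
    """Per pixel: bright -> small delay (recent past), dark -> large delay."""
    latest = history[-1].astype(float)
    delay = ((1.0 - latest / 255.0) * (len(history) - 1)).astype(int)
    ys, xs = np.mgrid[0:latest.shape[0], 0:latest.shape[1]]
    stack = np.stack(history)     # XYT buffer, shape (DEPTH, H, W)
    return stack[len(history) - 1 - delay, ys, xs]

out = brightness_mirror(history)
```

Here the brightly lit half of the frame shows the present while the dark half reaches back to the oldest frame in the buffer, which is the generalization of slit-scanning the entry describes: the "slit" has become an arbitrary per-pixel displacement map.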
Dan Kaminsky (DoxPara)
Volumetric Video (2005)
Kaminsky created virtual three-dimensional XYT volumes from video, in synthetic computer graphics, and then cut through the volumes with a variety of clipping planes. He writes: "The basic idea is simple: Video is composed of a large number of individual frames, each with X and Y dimensions. Just stack each frame on top of the next and you've got a Z dimension to place into a volume renderer."
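Kaminsky's volumetric framing also makes clear why slit-scanning is just one member of a family: stack the frames into an XYT volume, and any cutting surface through it yields an image. In the sketch below (illustrative Python/NumPy; the sizes and the synthetic clip are assumptions), a plane tilted across the x axis reproduces the classic slit-scan.

```python
import numpy as np

H, W, T = 40, 50, 25
# synthetic clip: every pixel of frame t has value t, so slices are checkable
volume = np.stack([np.full((H, W), t, dtype=np.uint8) for t in range(T)])

def slice_plane(volume, slope, offset=0.0):
    """Sample the XYT volume on the tilted plane t = slope * x + offset."""
    t_n, h, w = volume.shape
    ys, xs = np.mgrid[0:h, 0:w]
    t = np.clip(np.rint(slope * xs + offset).astype(int), 0, t_n - 1)
    return volume[t, ys, xs]

# with this slope, column x shows time x * (T-1)/(W-1): the classic slit-scan
scan = slice_plane(volume, slope=(T - 1) / (W - 1))
```

A slope of zero recovers an ordinary frame, a plane tilted across y gives a vertical scan, and curved surfaces in place of the plane lead to effects like the Khronos Projector's.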
Slitscan Images (2005)
Using 35mm equipment, Owsley creates high-resolution slitscan panoramas. He then digitizes these and makes them available as large-scale prints. Owsley captures the scans using a variety of techniques, including rotating cameras, vehicular movement, "still lifes" of moving scenes, and capture from live television.
Carver uses slit-scans to create "a fluid, mercurial image that spans multiple points in time, and multiple perspectives". The artist states, "Scan is a piece for processed video and sound, a meditation on the gestures and experiences associated with urban travel, and movement around an urban space."
Jacobsen "[imagines] video as a 3D object". In his Film Dynamics series, he experiments with slit-scanning techniques to create projects in which "a video sequence becomes a tunnel of images. When watching the sequence, you are moving through movie-space at a speed of 25 frames per second from end to end in the tunnel."
Time Sketches (Time Smear and Time Slicer) (2006)
The artist presents a video installation in which the user can interact with a real-time, slit-scanned image of himself/herself. The work was installed at the Powerhouse Museum in Sydney, Australia, where a blurb on the museum's web site reports: "Time Smear and Time Slicer by artist Andy Polaine are part of a series of live video works called Time Sketches that experiment with interactivity and the viewer’s image. Using video processing technologies these works play with time, chopping it up into fleeting moments and stretching it out across space. The result is a digital hall of mirrors, where you can see warped versions of yourself."
Polaine writes: "Essentially I’m interested, at least in these pieces, in the moment of interaction more than the images that get produced. I am perfectly aware that the whole slit-scan thing has been done in different ways, but I’ve never experienced one working live like this and wanted to try it out myself. There is another work in there called Time Slicer which, much like an 80s video mixer I suppose, freezes frames in a sequence and keeps doing so over and over. I noticed when I was playing with this and showing others that people really started to have fun and try out different ways of chopping up their bodies and faces with the camera frame." The author explains his technique further in this ACM Demo Paper.
James (Jung-Hoon) Seo
Asynchrony (Slide, Smudge, Split and Splotch) (2006)
James Seo presents a series of computational explorations, created in Processing, in which an interactive user can "paint" a temporal displacement map directly onto a video playback screen. The temporal displacement is authored in real-time and shifts the video backwards or forwards in time on a per-pixel basis. The system allows the user to interactively create temporal displacements with a variety of spatial configurations, ranging from crisp rectangular regions to amorphous areas that assign a different time-displacement to each pixel. The results are related to those of Cassinelli & Ishikawa's Khronos Projector and to Cohen et al.'s Stylized Video Cubes.
The artist writes: "Four interactive sketches for the simultaneous visualization of multiple points in time within video. Each sketch has its own method of defining regions within the rectangular frame of the video, and each region has its own shift in the displayed time point and its own rate of time flow. Combined with time-lapse or looped video clips as source material, each sketch generates a crudely synthesized image of different time points within the shared visual space." Seo describes four different modes:
- "Slide: You can draw boxes to create rectangular regions, each with its own rate of time flow. The boxes can be static or animated in different directions and speeds.
- "Smudge: You can make time ripple and shift forward in different areas using your mouse. Time is shifted within a radius; the amount of time shifted depends on the distance from the mouse location.
- "Split: You can divide the frame into multiple rectangular regions as you create new quadrants, rows or columns. You can also use shortcuts that immediately set up a uniform NxN grid.
- "Splotch: You can paint the desired region onto the frame. Each region can have one of three modes for its video: a strobe-like multi-image effect; faster playback; reverse playback."
Angus Leadley Brown
Serrated Image (2006)
The artist creates "synchroballistic photographs" of skateboarders in action. These are obtained by performing a slit-scan procedure on video recorded while the camera tracks the motion of the skateboarder. Brown sells high-quality prints of these images from his site.
In this diploma project, the artist uses customized displacement maps to influence temporal dislocation in video playback, on a per-pixel basis.
Tanck presents a Flash applet which permits the user to view slit-scanned treatments of clips from several popular films. The applet allows various axes to be swapped with the time axis.
Urban Flow (2007)
Photographer Adam Magyar uses a self-developed camera and software to produce detailed, high-resolution slit-scan photographs for his "Urban Flow" series. He writes, "It is the passing of time itself that turns into space by moving forward in time from the right side towards the left in each image. [While documenting,] the people seen in the right-hand side of the image had grown several minutes older by the time the people seen in the left side passed my camera."
Face Time (2007)
Canadian computer vision programmer and self-described "vfx junkie" Kevin Atkinson documents himself using an interactive slit-scan system in which his movements are playfully augmented by wiggles in the image plane. The effect is as if his image were projected onto the surface of a liquid wherein ripples are coupled to his every motion. He accomplishes the effect in the video by "carving into the image stack with a gaussian surface, whose center is tracked to face-position." Atkinson favors the video stills seen here, which are reminiscent of the paintings of Lucian Freud and Francis Bacon.
Mobile Brush (2007)
Christian Rohner and Claude Hidber
the ZEITTER (2007)
Rohner and Hidber state, "With a better grasp of time than we mere humans, the ZEITTER is able to perceive time in a manner we can only begin to visualize with its help. A normal camera transforms the three dimensions of space into the two dimensions of a photographic print. The ZEITTER extends this to encompass the fourth dimension of time."
Using a selection of the Museum Boijmans van Beuningen's collection of classic paintings, artist Geert Mul created this interactive project, which aligns the paintings' horizons in an interactive, panoramic, digital slit-scan video stream projected on a wall. "The Museum Boijmans van Beuningen invited the artist Geert Mul (b. 1965) to create an interactive installation. ‘Horizons’ is an intervention within the museum's classic collection. Geert Mul selected works of art from the Museum’s collection that feature a horizon and stored them within a database. Specially developed software fuses the horizons and processes them into projections on the wall. Visitors activate the work by moving through the space. As they approach the projected image, the landscapes fragment, thus establishing a direct link between body and image. Representations of landscapes from the entirety of art history pass by on the horizon."
Juanjo Fernández Rivero
Various Slit Scan Videos (2008)
Barcelona-based Juanjo Fernández creates slit-scan videos from footage recorded during his travels, manipulating them in live performances with Max/MSP/Jitter.
Wave Slice (2008)
Don Whitaker experiments with horizontally-moving slit-scan techniques, using his own nature videos as source files. Whitaker's application of slit-scans to ocean waves produces an arresting image of a turbulent alien sea. Whitaker writes, "I'm intrigued by slitscan photography and other visual/temporal shenanigans. [I] love the abstract, moving shapes and colors - especially the way the ocean waves are transformed, yet still recognizable."
Processed Slow-Dance Jump (2008)
Using a Processing sketch written by Don Whitaker, Horne creates this slit-scan image of a ballet dancer performing a 720° jump. He writes, "Each frame of the processed video is a horizontal stack of 1-pixel vertical slices of a video image. The left of the frame is the beginning of the video clip, the right side is the end. The animation effect is created by moving the location of the vertical slice from the left of the original video frame to the right. In effect, swapping time and space."
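The time-and-space swap Horne describes can be stated compactly: output frame k takes column k from every input frame, stacked left to right so that the output's x-axis is time. A minimal Python/NumPy sketch of that construction follows; the function name, the single-channel (frames, height, width) layout, and the one-pixel slice width are assumptions of this sketch, not Whitaker's actual Processing code.

```python
import numpy as np

def slitscan_frames(video):
    """Build the sequence of slit-scan frames described above.

    `video` is a (num_frames, height, width) array. Output frame k
    is column k of every input frame, stacked left-to-right: the
    left edge of each output frame is the start of the clip, the
    right edge is the end. Animating k from 0 to width-1 sweeps the
    sampling slice across the original image, swapping time and space.
    """
    num_frames, height, width = video.shape
    out = []
    for col in range(width):
        # One output frame: column `col` sampled from every input
        # frame t, with t running along the horizontal axis.
        frame = np.stack([video[t, :, col] for t in range(num_frames)], axis=1)
        out.append(frame)
    return out
```

Each output frame is therefore height × num_frames pixels, regardless of the input frame width.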
Alexei Shulgin and Aristarkh Chernyshev
Artistic and business partners Alexei Shulgin and Aristarkh Chernyshev of Electroboutique, a company/gallery/artist-collective, released this six-part project incorporating some classic slit-scan techniques. They write, "The Timeline mirror explores the notions of time, continuity, rupture, motion, memory and transformation. Six real-time processing algorithms are applied to video captured by camera to alter a viewer's reflection in motion. Combining or juxtaposing a current frame with the previous ones stored in a memory buffer, the algorithms create stunning surrealistic distortions and offer variety of modes of time/space interrelations."
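One classic way of "combining a current frame with the previous ones stored in a memory buffer" is the time-delay mirror, in which each row of the live output is drawn from a progressively older frame. The Python sketch below shows that generic variant under stated assumptions (class and parameter names are invented here); it is an illustration of the family of effects, not Electroboutique's actual algorithm.

```python
import numpy as np
from collections import deque

class TimeDelayMirror:
    """Live slit-scan mirror: the top row of the output comes from the
    newest frame, the bottom row from the oldest frame in a ring
    buffer, so a viewer's reflection smears through time."""

    def __init__(self, height, width, depth):
        self.height, self.width = height, width
        self.buf = deque(maxlen=depth)  # ring buffer of recent frames

    def process(self, frame):
        self.buf.append(frame.copy())
        n = len(self.buf)
        out = np.empty_like(frame)
        for y in range(self.height):
            # Integer arithmetic maps row y to an age in [0, n-1]:
            # row 0 -> newest frame, bottom row -> oldest available.
            age = y * (n - 1) // max(self.height - 1, 1)
            out[y] = self.buf[n - 1 - age][y]
        return out
```

Called once per camera frame, this produces a mirror in which motion of the head and the feet arrive on screen at different moments.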
Mitchell Whitelaw is an academic, writer and artist who created these two slit-scan based data visualizations of the sky and street in Canberra, Australia. Included in his "Watching the Sky" page is an essay he wrote for the UK journal "Photographies", which looks at how we interpret pattern and change in the environment, and the role of data in that process. The circular image above uses slit-scanning to map the brightness of the sky throughout the cycle of a day, while the lower image compiles ten thousand images of a Canberra streetfront, one image per minute for over a week, into a navigable interface.
In the article he states: "I'm influenced here by the work of Lisa Jevbratt, an artist whose data visualisations have focused on the digital networks, but whose approach works against any simple notion of information... [Her] resulting images are startling and completely abstract, but not at all unstructured. Jevbratt describes the visualisations as "abstract reals", and "objects for interpretation, not interpretations." Instead of demonstrating the already known, or the answer to a preconceived question (information), Jevbratt's data works provoke, and perhaps answer, new questions; in the artist's words "hints, suggestions, and openings." [...] Although the data source in Watching the Sky is as tangible and unmysterious as possible, surprising hints and suggestions continue to appear. In one of the earliest sketches I found small but distinct variations in the "horizon" over the course of a day, and recurring on successive days. I eventually realised this was caused by the afternoon breeze, shifting foliage by a few pixels within the frame. The dataset here is a trace of a complex material field that in a sense visualises its own internal structure: the passage of a shadow across the ground appears as a recurring pattern, an enfolded or multiplexed representation of another set of material interactions."
Time-Stacked Slit Scan Panoramas (2009)
Using his Panoscan MK3 camera, Bradford Bohonus of Bohonus Virtual Reality Photography creates these layered composites of panoramic slit-scans by stacking about 50 high-resolution panoramas of a Seattle location shot at several different times.
The artist conducts experiments with slit-scan self-portraiture using Processing.
He-Lin Luo (an MFA candidate at Taipei National University of the Arts) presents "Maelstrom", an installation consisting of an empty "trick" well that projects a slit-scan image of a virtual underwater interior onto a nearby screen. Participants can "dip" their hands -- or anything roughly hand-sized or smaller -- into the well and watch their abstracted hand swirl on the screen. The artist followed soon after with "Panopticon", a project with a similar concept but one that allows participants to, in a sense, walk into the well and view a slit-scan abstraction of their entire body.
Timetracks - Slit-Scan Camera App for iPhone (2009)
This slit-scan iPhone app offers optional settings for different types of slit-scan movements, and includes the ability to change the scan-line width and the scan time interval. It also lets you save to the camera roll and disable your device's sleep mode.
This work is licensed under a Creative Commons Attribution-ShareAlike 2.5 License. The texts and images contained herein are compiled from the pages of their authors, and copyrights for these materials reside with them. This compilation, however, is the work of Golan Levin and, if referenced, should be cited as:
Levin, Golan. An Informal Catalogue of Slit-Scan Video Artworks, 2005. World Wide Web: http://www.flong.com/texts/lists/slit_scan.
This catalog is a work in progress. If you know of any other related slit-scan projects with good web documentation, please send me an email and I'll add it to this list; I can be reached at golan at this domain name. Thanks to the many individual artists who have contacted me about their work, and to Eddie Elliott, Dr. Susanne Jaschko, Dr. Tania Ruiz Gutierrez, and Professors Joachim Sauter and Andrew Davidhazy for their pointers to large constellations of additional projects. Thanks also to Amisha Gadani for her assistance in researching, compiling and preparing the texts and images on this page.
Please click here for other informal catalogs by Golan Levin.
("Procrastination as a form of Scholarship")
Keywords: slit scan, slitscan, slit-scan, split scan, splitscan, split-scan, split-scanning, slit-scanning, scanline, slit scan video, slit-scan photography, scan-line analysis, strip photography, streak photography, rolling shutter, spatiotemporal imaging, spacio-temporal imaging, space-time correlation, space-time representation, space-time volume, XYT volume, segmentation and reassembly of digital video, chronophotography, synchroballistic photography, epipolar diagram, multiple-center-of-projection image, multiperspective panorama, video cube, videogram, temporal displacement map, moving image, interactive art, interactive video, new-media art, computational art, digital art, video art.