# stone throw
This is the code repository for stone throw, a collaborative online sonic artwork. You can read more about the participants here.
Documented in this README are the specifics that make the conceptual parts of stone throw function. There is little to no documentation of the serving of files, routes, etc. - these topics are documented far better elsewhere.
This code is only really available to peruse if you are interested. It is by no means an example of good code; quite the opposite, in fact. Enjoy the bad practices, shifting code style, terrible use of asynchrony, and much much more. This project was an opportunity to learn, and to experience the challenges of deploying a project to be consumed by the public.
## Server
The server code comprises three sections: the composition tracker, the streaming service, and the web site logic. A NodeJS server running Express and Socket.io serves the front end and manages the interactions between clients and the composition tracker. The streaming service receives the composition from the tracker, which is subsequently emitted on the client. A rough sketch of this wiring follows the technology list below.
To make this all work we are using the following technologies:
- NodeJS
- Express
- Socket.io
- net module
- Handlebars
- PngJs
- HTML5 Canvas
- PureData
- pdogg extension
- Icecast
- PM2
- NGINX
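A minimal sketch of how these pieces might be wired together. The event name `scan`, the ports, and the `public` directory are assumptions for illustration, not the project's actual values:

```js
const express = require('express');
const http = require('http');
const net = require('net');

const app = express();
const server = http.createServer(app);
const io = require('socket.io')(server);

app.use(express.static('public')); // serve the front end

// Connection to the Pure Data composition tracker (port is assumed)
const pd = net.createConnection({ host: '127.0.0.1', port: 3000 });
pd.on('error', (err) => console.error('pd unreachable:', err.message));

io.on('connection', (socket) => {
  // 'scan' is a hypothetical event name for a client requesting a scan
  socket.on('scan', () => {
    // ...erode the file, rebuild the total size, notify pd...
  });
});

server.listen(8080);
```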
## Scannable Files
One of the two central parts of this project is that the documents served to the user degrade on each viewing. As the life of the website progresses, the scannable files become more eroded as more people scan them, to the point that they are less and less legible. At a certain point their individual file size drops below a preset threshold, and they are deleted from the server.
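The threshold check could look something like this sketch; the threshold value, directory, and function name are assumptions:

```js
const fs = require('fs');
const path = require('path');

const SIZE_THRESHOLD = 10 * 1024; // 10 KB - an assumed value

function cullIfTooSmall(file) {
  const filePath = path.join(__dirname, 'assets', file);
  const { size } = fs.statSync(filePath);
  if (size < SIZE_THRESHOLD) {
    fs.unlinkSync(filePath); // the document is too eroded to keep
    return true;
  }
  return false;
}
```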
### Scan Composition
When a scan is requested from the client, a composition of a document and a random stone is generated on the server. This is done using a NodeJS implementation of the Canvas API backed by the Cairo graphics library. The base64-encoded image is then sent directly to the client requesting the scan. This has two nice properties: the scan preview is an image that can be downloaded, and every scan is different.
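A rough sketch of the composition step, assuming the node-canvas package and hypothetical file paths; the real layout logic is more involved:

```js
const { createCanvas, loadImage } = require('canvas');

async function composeScan(documentPath, stonePath) {
  const [doc, stone] = await Promise.all([
    loadImage(documentPath),
    loadImage(stonePath),
  ]);
  const canvas = createCanvas(doc.width, doc.height);
  const ctx = canvas.getContext('2d');
  ctx.drawImage(doc, 0, 0);
  // place the stone at a random position over the document
  ctx.drawImage(
    stone,
    Math.random() * (doc.width - stone.width),
    Math.random() * (doc.height - stone.height)
  );
  return canvas.toDataURL('image/png'); // base64, sent straight to the client
}
```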
### Erosion
Many attempts at eroding the files at a low level were made and ultimately abandoned - because if the Guardian had trouble doing it, then yours truly definitely cannot.
In the end, images are edited on a per-pixel level using pngjs, where pixel values are replaced with transparent black pixels. Two "erosions" were devised: one a random walker inspired by Daniel Shiffman, the other a random cut and paste within the image. A simplified sketch of the walker follows.
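This sketch uses pngjs as the text describes, but the step count and walk behaviour are illustrative, not the project's actual values:

```js
const fs = require('fs');
const { PNG } = require('pngjs');

function erode(filePath, steps = 5000) {
  const png = PNG.sync.read(fs.readFileSync(filePath));
  let x = Math.floor(Math.random() * png.width);
  let y = Math.floor(Math.random() * png.height);
  for (let i = 0; i < steps; i++) {
    const idx = (png.width * y + x) << 2; // 4 bytes per pixel (RGBA)
    png.data[idx] = 0;     // R
    png.data[idx + 1] = 0; // G
    png.data[idx + 2] = 0; // B
    png.data[idx + 3] = 0; // A - transparent black
    // step randomly to a neighbouring pixel, clamped to the image bounds
    x = Math.min(png.width - 1, Math.max(0, x + Math.floor(Math.random() * 3) - 1));
    y = Math.min(png.height - 1, Math.max(0, y + Math.floor(Math.random() * 3) - 1));
  }
  fs.writeFileSync(filePath, PNG.sync.write(png));
}
```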
Both are prone to errors, as two scans of the same image can happen simultaneously. In the event of an image being corrupted, it is removed from the file system in the same way as files that have fallen below the preset threshold. In hindsight, manipulating the images in Canvas might have been easier than introducing another library.
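One way to guard against simultaneous erosions of the same file is a simple in-memory lock. This is a sketch of that idea, not what the project actually does:

```js
const fs = require('fs');

const locks = new Set();

async function erodeSafely(file, erodeFn) {
  if (locks.has(file)) return; // another scan is already eroding this file
  locks.add(file);
  try {
    await erodeFn(file);
  } catch (err) {
    // a corrupted image is removed, like files below the size threshold
    fs.unlinkSync(file);
  } finally {
    locks.delete(file);
  }
}
```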
### File Tracking
The files are tracked in a JSON file within the assets folder. This was done in an attempt to avoid keeping a JavaScript object in memory throughout the life of the server. It is unclear whether the implementation actually achieves this, but the practice of recording data in files was interesting.
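A sketch of JSON-backed tracking; the file location and record shape are assumptions, not the project's actual schema:

```js
const fs = require('fs');

const TRACKER = './assets/tracker.json'; // assumed location
// assumed shape: { "files": { "doc1.png": { "scans": 12 } } }

function recordScan(fileName) {
  const tracker = JSON.parse(fs.readFileSync(TRACKER, 'utf8'));
  tracker.files[fileName].scans += 1;
  fs.writeFileSync(TRACKER, JSON.stringify(tracker, null, 2));
}
```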
In addition to this, the total file size of the scannable documents is tracked. Tracking this within the same JSON file was considered, but since it is updated constantly, the overhead of continually reading and writing a file seemed unnecessary. This value is held in memory and regenerated every time someone makes a scan. What is nice about this is that the degradation doesn't always make the file smaller. It is not clear why this is the case, but I have some unfounded hypotheses...
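Regenerating the in-memory total might look like this sketch; the directory name and variable are assumptions:

```js
const fs = require('fs');
const path = require('path');

let totalSize = 0; // held in memory, never written to the tracker file

function recalculateTotal(dir = './assets/scans') {
  totalSize = fs.readdirSync(dir)
    .reduce((sum, f) => sum + fs.statSync(path.join(dir, f)).size, 0);
  return totalSize;
}
```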
## Sonic Composition
The sonic portion of this site conceptually works in reverse to the document scanning: as the files degrade, the composition progresses. A challenge here was deciding whether to generate and serve the audio from the client or the server. Because the interaction between scanning and progressing the composition was so constant, it seemed more logical to keep this all server side. There will be no mention here of the Icecast setup, as it is not special in any way.
### Composition Tracker
This is developed in Pure Data (pd) using the pdogg extension to stream to the Icecast server, and the netreceive object to receive the 'progress' of the scanning. On every scan, the total size of the scannable documents is sent to pd in bytes. Pd then compares this against the original total file size already in the patch to determine the current point in the composition. The composition is made up of around 10 audio files that are loaded and mixed in pd, then streamed to the Icecast server.
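On the Node side, sending the total to netreceive might look like this sketch. Pd's netreceive speaks the FUDI protocol (atoms terminated by a semicolon) over TCP by default; the port number here is an assumption:

```js
const net = require('net');

function sendTotalToPd(totalBytes) {
  const socket = net.createConnection({ host: '127.0.0.1', port: 3000 }, () => {
    socket.write(`${totalBytes};\n`); // netreceive parses up to the ';'
    socket.end();
  });
  socket.on('error', (err) => console.error('pd unreachable:', err.message));
}
```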