A friend and I built this my senior year of high school. It’s based on OpenMosix. The grand total spent on this project was $25 (for a 25-port Ethernet switch). We got all of the computers for free from either our school or my dad’s office, since both were upgrading their machines.

This was more of a proof of concept than anything else, because most of the computers had processors around 800 MHz. We also had two old servers in the cluster, though, each with two dual-core 2.4 GHz Xeon processors. So when the head node tried to balance the process load, nearly everything went to the two most powerful nodes, and the others didn’t get used much. Still, we did manage to get them all networked and talking to each other. There isn’t a lot of documentation on OpenMosix, and none of the forums that dealt with it are active anymore. Here are the instructions I wrote detailing how to set it up.

Once we had it working, we used it to contribute to an @home distributed computing project for a while. However, the @home client isn’t written to take advantage of a cluster, so it wasn’t terribly effective. Optimally, we would have liked to leave the cluster up and running 24/7, adding nodes whenever we could, and basically use it as a render farm. Back then I didn’t know how to write multi-threaded applications, but now that I do, I kind of wish it was still around for me to play with.

In the end, though, the noise, heat, and electricity it used didn’t exactly thrill my family. It also took up a lot of room. With no other place to put it, we retired it and stored it in my closet for future experimentation. That never really happened, since we both went off to college and each computer ended up getting used for another project. Here’s what it looked like in its prime:
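For anyone curious why the @home client didn’t spread across the cluster: OpenMosix migrates whole processes, not threads, so the easiest way to actually exercise the load balancer is to fork a handful of independent, CPU-bound processes and let the kernel scatter them. This is just a sketch of that idea, not from our original setup notes; the `NODES` count is an assumption you’d match to your own cluster.

```shell
#!/bin/sh
# Fork several independent CPU-bound workers. On a working OpenMosix
# cluster, each one is a separate process with no shared memory, so
# the kernel is free to migrate it to a less-loaded node.
NODES=4   # assumption: set this to your node count

i=0
while [ "$i" -lt "$NODES" ]; do
    # Pure-CPU awk loop with no shared state, so it can migrate freely
    awk 'BEGIN { for (n = 0; n < 5000000; n++) x += n }' &
    i=$((i + 1))
done

wait   # block until every worker exits
echo "all $NODES workers finished"
```

On a real cluster you could then watch the load spread with a monitor like `mosmon` from the OpenMosix userland tools. A single multi-threaded process, by contrast, stays pinned to one node, which is why thread-based @home clients gained nothing from the setup.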
There are 14 computers in this picture. I think the most nodes we ever had running at once was around 12.
I shared this room with my brother. You can see why he wasn’t a fan of the cluster.