STMX is a high-performance Common Lisp library for composable Transactional Memory (TM), a concurrency control mechanism that aims to make concurrent code easier to write and understand. Instead of traditional lock-based programming, one programs with atomic memory transactions: if a memory transaction returns normally, it is committed; if it signals an error, it is rolled back. Transactions can safely run in parallel in different threads, are re-executed from the beginning in case of conflicts or when consistent reads cannot be guaranteed, and their effects are not visible to other threads until they commit. This gives freedom from deadlocks and automatic rollback on failure, and aims to resolve the tension between lock granularity and concurrency.
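The commit-on-return, rollback-on-error semantics can be illustrated with a minimal single-threaded sketch. This is not STMX's API (STMX is a Common Lisp library; its operators, conflict detection, and re-execution are omitted here): a transaction works on a private write buffer that is applied only if the transaction function returns normally.

```python
class TVar:
    """A transactional memory cell (illustrative, not STMX's tvar)."""
    def __init__(self, value):
        self.value = value

def atomic(txn, *tvars):
    """Run txn over a private buffer of the given TVars.

    If txn returns normally, every buffered write is committed at once.
    If txn raises, the buffer is simply discarded: rollback.
    """
    buf = {v: v.value for v in tvars}
    result = txn(buf)              # an exception propagates; buffer is discarded
    for v, val in buf.items():     # normal return: commit all writes
        v.value = val
    return result

acct = TVar(100)

def deposit(buf):
    buf[acct] += 50

atomic(deposit, acct)              # returns normally -> committed
assert acct.value == 150

def bad_withdraw(buf):
    buf[acct] -= 500
    raise ValueError("insufficient funds")

try:
    atomic(bad_withdraw, acct)
except ValueError:
    pass
assert acct.value == 150           # the failed transaction left no trace
```

A real TM implementation additionally detects conflicting reads and writes between threads and transparently re-runs the transaction, which is what STMX provides on top of these basic semantics.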
HandBrake is a multi-threaded DVD to MPEG-4 ripper and converter. It can encode directly from DVDs (even encrypted ones) or from VIDEO_TS folders. AC3, LPCM, and MPEG audio tracks are supported. The output can be stored in MP4, AVI, or OGM containers, with audio encoded as AAC, MP3, or Vorbis. Other features include 2-pass encoding, encoding of two audio tracks, a bitrate calculator, and deinterlacing, cropping, and scaling of the original picture.
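The bitrate calculator mentioned above does a standard piece of arithmetic: given a target file size and the source duration, work backwards to the video bitrate that fits. HandBrake's exact formula and overhead figures are not given here, so the following is a generic sketch with an assumed container-overhead fraction.

```python
def video_bitrate_kbps(target_mb, duration_s, audio_kbps=128, overhead=0.02):
    """Estimate the video bitrate (kbit/s) that fits a target file size.

    target_mb  - desired output size in megabytes (assuming 1 MB = 1024*1024 bytes)
    duration_s - length of the source in seconds
    audio_kbps - combined bitrate of the audio track(s)
    overhead   - fraction reserved for container overhead (assumed value)
    """
    total_kbits = target_mb * 1024 * 1024 * 8 / 1000  # target size in kbits
    usable_kbits = total_kbits * (1 - overhead)       # minus container overhead
    return usable_kbits / duration_s - audio_kbps     # what is left for video

# A 700 MB target for a 2-hour (7200 s) film with one 128 kbit/s audio track:
rate = video_bitrate_kbps(700, 7200, audio_kbps=128)
print(round(rate))  # 671
```

With two audio tracks, as HandBrake supports, the combined `audio_kbps` is simply doubled, which is why a second track noticeably lowers the achievable video quality at a fixed file size.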
SubnetMapper is a quick tool for admins to map and keep track of managed IP networks, find free ranges available for assignment, and document the network setup. It is designed for managing and documenting small to medium networks, and it prints and displays freely colorable maps of your network.
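The core bookkeeping such a tool automates, tracking which addresses in a subnet are still free to assign, can be sketched with Python's standard `ipaddress` module (this is an illustration of the task, not SubnetMapper's own code):

```python
import ipaddress

def free_addresses(subnet, assigned):
    """Return the host addresses in `subnet` that are not yet assigned."""
    net = ipaddress.ip_network(subnet)
    used = {ipaddress.ip_address(a) for a in assigned}
    # net.hosts() excludes the network and broadcast addresses
    return [h for h in net.hosts() if h not in used]

free = free_addresses("192.168.1.0/29", ["192.168.1.1", "192.168.1.3"])
print([str(a) for a in free])
# ['192.168.1.2', '192.168.1.4', '192.168.1.5', '192.168.1.6']
```

A graphical mapper like SubnetMapper adds the visual layer on top: coloring assigned versus free ranges so gaps can be spotted at a glance.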
Hadoop Studio is a map-reduce integrated development environment (IDE) based on NetBeans. It makes it easy to create, understand, and debug map-reduce applications based on Hadoop, without requiring development-time access to a map-reduce cluster. The studio provides a workflow view of a map-reduce job that displays the individual inputs, outputs, and interactions between the job's phases, and updates in real time as the developer changes the code. The studio then generates Java sources and compiles them into a binary JAR file, which can be run on a normal Hadoop cluster.
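The phases the workflow view visualizes, map, the shuffle/group step between phases, and reduce, can be shown with a minimal word-count sketch. Hadoop Studio itself generates Java sources; this Python version is only a compact illustration of the data flow between phases:

```python
from collections import defaultdict

def map_phase(records):
    # Map: emit a (word, 1) pair for every word in every input record.
    for line in records:
        for word in line.split():
            yield word, 1

def shuffle(pairs):
    # Shuffle: group values by key, as Hadoop does between map and reduce.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups.items()

def reduce_phase(groups):
    # Reduce: sum the counts collected for each word.
    for key, values in groups:
        yield key, sum(values)

lines = ["to be or not to be"]
print(dict(reduce_phase(shuffle(map_phase(lines)))))
# {'to': 2, 'be': 2, 'or': 1, 'not': 1}
```

Each arrow in the IDE's workflow view corresponds to one of these hand-offs: input records into map, keyed pairs into the shuffle, grouped values into reduce, and the reduced pairs into the job's output.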