SCMS is an MVC-based secure content management system. It is designed from the ground up to withstand common Web application vulnerabilities such as SQL injection, XSS, CSRF, session fixation/hijacking, and many others. It targets PHP 5.0-5.2.x and MySQL 4.1+, with optional support for PostgreSQL as a database backend.
Innomatic is a mature and easy-to-use distributed container for PHP 5 Web applications. It is particularly oriented towards business and administrative applications such as CRM, CMS, and Web-based frontends for legacy applications. A single container can host multiple applications and multiple customers or sites, and it introduces a new way to distribute and manage Web applications through AppCentral.
PHP Menu Builder is a class that can be used to display nested menus of HTML links. It takes an array of menu items, optionally with nested arrays that define sub-menus, and generates HTML lists with links for each menu item. CSS classes can be used to configure the presentation styles of the menus, regular items, and the current page item.
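The nested-array-to-HTML-list pattern described above can be sketched as follows. This is a hypothetical illustration of the general technique, not PHP Menu Builder's actual API; the function name, array keys (`label`, `url`, `children`), and CSS class names are all assumptions.

```php
<?php
// Hypothetical sketch: recursively render a nested menu array as
// HTML lists, marking the current page item with a CSS class.
function renderMenu(array $items, $currentUrl, $depth = 0)
{
    $html = '<ul class="' . ($depth === 0 ? 'menu' : 'submenu') . '">';
    foreach ($items as $item) {
        $class = ($item['url'] === $currentUrl) ? 'current' : 'item';
        $html .= '<li class="' . $class . '">'
               . '<a href="' . htmlspecialchars($item['url']) . '">'
               . htmlspecialchars($item['label']) . '</a>';
        if (!empty($item['children'])) {
            // A nested array defines a sub-menu: recurse one level deeper.
            $html .= renderMenu($item['children'], $currentUrl, $depth + 1);
        }
        $html .= '</li>';
    }
    return $html . '</ul>';
}

$menu = array(
    array('label' => 'Home', 'url' => '/'),
    array('label' => 'Docs', 'url' => '/docs', 'children' => array(
        array('label' => 'Install', 'url' => '/docs/install'),
    )),
);
echo renderMenu($menu, '/docs/install');
```

Escaping labels and URLs with `htmlspecialchars()` keeps the generated markup safe when menu data comes from user input.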
SabreDAV allows you to easily integrate your existing Web application with WebDAV. It supports most of the common clients, including the Mac OS X Finder, the Windows XP/Vista Explorer, DavFS2, Cadaver, NetDrive, and WebDrive. It can act as a class 1, 2, or 3 WebDAV server, implementing RFC 2518 and the revisions from RFC 4918, as well as RFC 2617 (Basic/Digest authentication).
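A minimal server bootstrap might look like the sketch below. The namespaced class names follow the later sabre/dav API and may differ from the PHP 5-era release described above; the directory and lock-file paths are placeholders.

```php
<?php
// Sketch of a minimal SabreDAV bootstrap (assumes sabre/dav via Composer;
// class names follow the later namespaced API and may differ per version).
require 'vendor/autoload.php';

// Expose the local "public" directory over WebDAV.
$rootDirectory = new Sabre\DAV\FS\Directory('public');
$server = new Sabre\DAV\Server($rootDirectory);

// Locking support is what upgrades the server from class 1 to class 2.
$lockBackend = new Sabre\DAV\Locks\Backend\File('locks.db');
$server->addPlugin(new Sabre\DAV\Locks\Plugin($lockBackend));

$server->exec();
```

With this in place, clients such as the Finder or Cadaver can mount the share directly against the script's URL.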
The ApPHP Calendar (ApPHP CAL) is a powerful PHP calendar script that can be easily integrated into various PHP projects, such as schedulers, event processors, etc. The calendar is simple to install, implement, and use, with a user-friendly interface and navigation. It can be viewed in various ways: yearly, monthly, weekly, and daily.
bot_recognizer is a PHP class that can be used to recognize Web robots and handle them specially. It checks the IP address of the computer or the user agent of the browser currently accessing the Web server to determine whether it falls within a range of addresses known to belong to Web robots, such as search engine crawlers or even malicious crawlers. The class can call different callback functions depending on the type of crawler that was identified. It can also run in debug mode, accepting a given IP address or user agent string in place of the values sent by the accessing client. The Web robots information is stored in a database, which the class can load from a text data file.
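The load-match-callback flow described above can be sketched roughly as follows. This is a hypothetical illustration: the class name, method names, and pipe-delimited data format are assumptions, not bot_recognizer's real API.

```php
<?php
// Hypothetical sketch of a robot recognizer: load known-robot records
// from a text file, match by IP prefix or user-agent substring, and
// dispatch a callback registered for the matching robot type.
class BotRecognizer
{
    private $bots = array();

    // Assumed data format, one robot per line: type|ip_prefix|ua_substring
    public function loadDatabase($file)
    {
        $lines = file($file, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
        foreach ($lines as $line) {
            list($type, $ip, $ua) = explode('|', $line);
            $this->bots[] = array('type' => $type, 'ip' => $ip, 'ua' => $ua);
        }
    }

    // Returns the matched robot type, or null for ordinary visitors.
    public function handle($ip, $userAgent, array $callbacks)
    {
        foreach ($this->bots as $bot) {
            $ipMatch = $bot['ip'] !== '' && strpos($ip, $bot['ip']) === 0;
            $uaMatch = $bot['ua'] !== '' && stripos($userAgent, $bot['ua']) !== false;
            if ($ipMatch || $uaMatch) {
                if (isset($callbacks[$bot['type']])) {
                    // Per-type handling: e.g. log crawlers, block malicious ones.
                    call_user_func($callbacks[$bot['type']], $bot);
                }
                return $bot['type'];
            }
        }
        return null;
    }
}
```

Debug mode, as described above, amounts to passing a chosen IP or user agent string to `handle()` instead of `$_SERVER['REMOTE_ADDR']` and `$_SERVER['HTTP_USER_AGENT']`.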