Data Koncepts

SuperScan v2.02 - Website Security


SuperScan

Quickly Detect and Report Hacked Files via CRON/PHP

SuperScan, previously HashScan, is a set of PHP scripts that warns of changed (edited, added or deleted) files to detect a hacker's nefarious work on your website(s). HashScan simply created a hash of every file within a directory and compared it to the previous hash values for those files; not very sophisticated.
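
As a rough illustration (not HashScan's actual code; the file names and the JSON storage are assumptions for the sketch), that original approach amounts to something like this:

    <?php
    // Hash every file in one directory and compare against the hashes
    // saved by the previous run (illustrative sketch only).
    $dir   = '/home/account/public_html';          // hypothetical scan directory
    $saved = is_file('hashes.json')
           ? json_decode(file_get_contents('hashes.json'), true)
           : array();
    $current = array();
    foreach (scandir($dir) as $name) {
        $path = "$dir/$name";
        if (is_file($path)) {
            $current[$name] = sha1_file($path);    // 40-character SHA-1 hash
        }
    }
    // Report anything added, changed or deleted since the last run.
    foreach ($current as $name => $hash) {
        if (!isset($saved[$name]))       echo "Added:   $name\n";
        elseif ($saved[$name] !== $hash) echo "Changed: $name\n";
    }
    foreach (array_diff_key($saved, $current) as $name => $hash) {
        echo "Deleted: $name\n";
    }
    file_put_contents('hashes.json', json_encode($current)); // new baseline
    ?>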

SuperScan was the result of comments received and code offered by Han Wechgelaer of the Netherlands to extend its usability with more file comparisons (e.g., last modified date and time) and better summary reporting. Unfortunately, my interpretation of PHP's RecursiveDirectoryIterator and RecursiveIteratorIterator failed in my attempt to filter directories (eliminate them from a scan).

SuperScan v2 is a major update for which I consider Jan Bakke of Norway responsible. Jan found and tested my "obvious" iterator error, suggested and coded the additional headers for e-mail, and performed many tests to validate and optimize the code. His suggestions, testing, coding and comments were invaluable. MANY THANKS to Jan for his efforts, which made v2 and its many improvements possible.

SuperScan v2 corrects the errors of SuperScan by inserting lemats' MyRecursiveFilterIterator to handle directory exclusion, and it also cleans up the code as well as the output (HTML for testing, text for e-mail in production use); a sketch of such a filter appears below.
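
For reference, a filter in the spirit of the php.net example looks roughly like this (the excluded directory names are placeholders, not SuperScan's defaults):

    <?php
    // Skip the named directories (and everything beneath them) during the
    // recursive scan; accept() is called for every item the iterator visits.
    class MyRecursiveFilterIterator extends RecursiveFilterIterator
    {
        public static $FILTERS = array('cache', 'logs');   // placeholder names

        public function accept()
        {
            return !in_array($this->current()->getFilename(), self::$FILTERS, true);
        }
    }

    $dirIt = new RecursiveDirectoryIterator('/home/account/public_html',
                                            RecursiveDirectoryIterator::SKIP_DOTS);
    $files = new RecursiveIteratorIterator(new MyRecursiveFilterIterator($dirIt));
    foreach ($files as $file) {
        if (is_file($file)) {              // plain files only (see item 13 below)
            echo $file->getPathname(), "\n";
        }
    }
    ?>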

Current Version

SuperScan v2.02 corrects an error in the configure script's foreach loops that ensure lowercase file extensions: the $ext variable, passed by reference in those loops, unintentionally allowed the scanner script to modify the last element of the array (another great pick-up by Jan Bakke!). A sketch of the pitfall follows.
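
This is a classic PHP gotcha; a minimal sketch (the array contents are illustrative, not SuperScan's defaults):

    <?php
    // After a by-reference foreach, $ext still points at the LAST element
    // of the array, so any later write to $ext silently overwrites it.
    $scan_exts = array('PHP', 'HTM', 'js');
    foreach ($scan_exts as &$ext) {
        $ext = strtolower($ext);   // ensure lowercase extensions
    }
    unset($ext);                   // the v2.02 fix: break the lingering reference
    // Without the unset(), a later `foreach ($scan_exts as $ext)` would
    // rewrite $scan_exts[2] on every iteration, corrupting the array.
    ?>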

Earlier Versions

SuperScan v2.01, SuperScan v2.0 and HashScan

SuperScan v2.01

SuperScan v2.01 is a minor update which:

  1. Cleaned up the e-mail variables $to, $cc, $bcc, $from and $reply, with examples in commented code.

  2. Included the account name in the e-mail subject for webmasters monitoring multiple accounts (scanner and reporter reports as well as database error reports).

SuperScan v2 UPDATES:

The comments received about directories not being excluded prompted a massive set of improvements:

  1. Incorporated lemats' MyRecursiveFilterIterator to correctly prevent scanning of specified directories (php.net/manual/en/class.recursivefilteriterator.php).

  2. Added the commented code set_time_limit(120); to prevent early script termination for large file sets (a 3,600+ file account scans in under 10½ seconds, and over 100,000 "files" scan in about 1½ minutes). Uncomment/change the 120-second limit only if you are scanning a massive number of files or large audio/visual files (hashing ~10 GB of mp3 files took nearly 3 minutes).

  3. Jan Bakke discovered that the baseline table's `file_path` field, set to 200 characters, caused a problem with artificially long path-to-file strings. I increased the field's limit to 255 characters (the traditional VARCHAR maximum) but, if you exceed this limit, UNcomment lines 90, 102-105 and 202-208, which will count and display the path-to-file strings exceeding 255 characters (then ignore those files because they will cause database errors). If you need longer strings, you may want to try changing the `file_path` field type to TEXT (NOT recommended because Jan's preliminary testing showed "it is a performance killer!!"). Existing tables can be updated using phpMyAdmin's Import function with UpdateTablesFor2.0.sql. READ the WARNINGs at the top of the configure script!

  4. Deleted the $report_out switch because it only duplicated the $testing switch.

  5. Deleted the $extensionless switch (because you SHOULD scan extensionless files for changes as they are often executed as PHP scripts).

  6. Removed the ambiguity of localhost testing (using 127.0.0.1) by adding a $localtesting switch.

  7. Allowed the option to add FROM and REPLY-TO headers to e-mail sent by both the scanner and reporter scripts (see the sketch after this list).

  8. Replaced 'h' (12 hour format) with 'H' (24 hour format) in 8 locations in the scanner script.

  9. Added reporting of database errors.

  10. Created the $die switch in configure.php to tell the scanner to abort the scan if a database error is found (such an error will generally be repeated MANY times); database connection errors WILL terminate the scripts.

  11. Used the PHP magic constant __LINE__ to identify the line number of mysqli_error() messages, adjusting it to point at the previous line (the one with the mysqli_query); see the sketch after this list.

  12. Counted and reported the number of RecursiveIteratorIterator (`file_path`) iterations made during the scan, as well as the microtime for execution of the scan and the file-difference handling.

  13. Replaced "if (not directory and not '.' and not '..')" with "if (is_file())".

  14. Because all files are liberally commented so you can identify the variables and follow the logic, I have added scanner and reporter scripts without comments or testing output in a production subdirectory of the ZIP file. Simply upload these two files to replace the commented scanner and reporter scripts (all configuration is done in the configure script).

  15. Added the scan.php script in `online` to run SuperScan from a browser when the files are NOT located in public_html (they should NEVER be located within public_html but, if you don't have access to a higher level directory, be sure to password protect their directory!).

  16. Moved the database connection for the scanner and reporter scripts to the end of the configure script.

  17. Other than extending the time limit for execution (for very large scans) and eliminating 'negative reports' (which are meant to provide a "warm, fuzzy feeling that all is okay") at line 287 (" && 0 < $count_changes"), I have included all the necessary configuration in the configure script. It should NOT be necessary to change either the scanner or reporter scripts except for future updates.

  18. I nearly deleted the `acct` fields from the database records (and database queries) but left them in case someone is using the same database with multiple accounts (but NOT scanning the same files - which would violate the UNIQUE key of the baseline table). Feel free to make this modification on your own.

  19. I was so unhappy with the format of the reporter script (and so impressed by Jan Bakke's testing) that I added fields to the scanned table to record the iterations, files captured (by the scan) and elapsed time; these are provided for each scan in the daily report. UpdateTablesFor2.0.sql (from item 3 above) will also update your tables with these new fields.

  20. As part of the database clean-up, I also specified the `hash_org` and `hash_new` fields as CHAR(40) for a marginal speed improvement (because they use a fixed hash length).
  21. There are many configurable items (all of which are in the configure.php script). You should not have to modify the scanner.php script (unless there is a PHP max_execution_time ERROR) OR the reporter.php script.
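
Two of the items above, sketched (items 7 and 11); the variable names are assumptions, not necessarily those used in configure.php:

    <?php
    // Item 7: optional FROM and REPLY-TO headers on the report e-mail.
    $to      = 'webmaster@example.com';
    $subject = '[account] SuperScan report';
    $message = 'Scan complete.';
    $headers = "From: superscan@example.com\r\n"
             . "Reply-To: webmaster@example.com\r\n";
    mail($to, $subject, $message, $headers);

    // Item 11: __LINE__ evaluates where it is written, so subtract one
    // to point the error message at the preceding mysqli_query() line.
    $link   = mysqli_connect('localhost', 'scan_user', 'password', 'superscan');
    $result = mysqli_query($link, "SELECT COUNT(*) FROM baseline")
        or die('MySQL error near line ' . (__LINE__ - 1) . ': ' . mysqli_error($link));
    ?>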

SuperScan/HashScan Coding Logic

The logic is simple: "Build a database of hashed values for vulnerable files (those which hackers modify to execute code on your server) and compare those values to the actual hashes on a regular basis and report added, changed and deleted files."

Obviously, the code to traverse a server's directory structure and provide hash values is far more complex than the statement above. I have commented the code so I'll avoid the long-winded explanation of the code here.

Database Setup

For security, use a separate database for this which does not share access credentials with any other database. Create the new database and a new user with a strong password (I recommend a 16-character password generated by strongpasswordgenerator.com). Then create two tables, baseline and tested, to hold the filenames and hash values and the datetime of each scan. The file CreateTables.sql (updated for SuperScan v2) has the SQL code to create these tables for you.
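
A rough sketch of the table creation (the authoritative schema ships in CreateTables.sql; the column list here is partly an assumption based on the fields mentioned earlier):

    <?php
    // Connect with the dedicated, single-purpose credentials described above.
    $link = mysqli_connect('localhost', 'scan_user', 'a-strong-16-char-pw', 'superscan');

    // `baseline` holds one row per scanned file: account, path and hash.
    mysqli_query($link, "
        CREATE TABLE baseline (
            acct      VARCHAR(20)  NOT NULL,
            file_path VARCHAR(255) NOT NULL,
            hash_org  CHAR(40)     NOT NULL,      -- fixed-length SHA-1 hash
            UNIQUE KEY (acct, file_path)
        )");

    // `tested` records the datetime of each scan for the account
    // (the column name here is an assumption).
    mysqli_query($link, "
        CREATE TABLE tested (
            acct      VARCHAR(20) NOT NULL,
            scan_date DATETIME    NOT NULL
        )");
    ?>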

Configure

PATH is the physical path to the start of your scan, which is usually the DocumentRoot. Just remember not to use Windows' backslashes because both Apache and PHP will be looking for forward slashes.

Note that you can select the file extensions to scan and directories to exclude.
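
A hedged sketch of those settings as they might appear in configure.php (the variable names are illustrative assumptions):

    <?php
    $path         = '/home/account/public_html/';       // forward slashes, never backslashes
    $scan_exts    = array('php', 'htm', 'html', 'js');  // extensions to scan
    $exclude_dirs = array('cache', 'logs', 'tmp');      // directories to skip
    ?>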

PHP's RecursiveIteratorIterator class is used to iterate through the PATH directory ($dir).

The script identifies the changed files, i.e., those added, changed or deleted, which are used to create arrays that facilitate reporting via echo statements or e-mail.
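
Illustratively (not SuperScan's actual code), with both arrays keyed file_path => hash:

    <?php
    $baseline = array('index.php' => 'a94a8fe5...', 'old.php' => 'de9f2c7f...');
    $current  = array('index.php' => 'b858cb28...', 'new.php' => '77de68da...');

    $added   = array_diff_key($current, $baseline);   // in the scan, not the baseline
    $deleted = array_diff_key($baseline, $current);   // in the baseline, not the scan
    $changed = array();
    foreach (array_intersect_key($current, $baseline) as $file => $hash) {
        if ($hash !== $baseline[$file]) {
            $changed[$file] = $hash;                  // hash mismatch
        }
    }
    ?>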

On the first pass, there will be nothing in the database's baseline table and ALL files will display as Added, so don't be alarmed.

Once you've tested SuperScan (or HashScan), don't even consider placing this code in your webspace (under the DocumentRoot) as that would allow anyone to access your files and delete the saved information, invalidating your hash scans. For simplicity, put it in the directory of your account which holds the public_html (or similar) directory.

Activate

Now that you have the code, you need to have it activated on a regular basis. That's where the server's CRON function excels! Simply create a new CRON job and set the time in the middle of the night when your server should be nearly idle (you don't want to interfere with or delay visitors' activities, which also means you should limit yourself to a single scan per day). The CRON.txt file has the CRON code for you to use.
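
A hypothetical crontab entry (CRON.txt contains the author's version); this one runs the scanner at 03:17 every night:

    17 3 * * * /usr/bin/php /home/account/superscan/scanner.php >/dev/null 2>&1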

Wrap-Up

You have created a new database with two tables, one to hold the scan dates and one to hold the baseline hashes. You have set up every scan by identifying the file types (by extension) that you need to track and the starting point (DocumentRoot) for your scan. You've scanned the files, avoiding the unwanted directories, and compared the hashes against the baseline in the database. Closing the process, you've updated the database tables and either displayed (on a test server) or e-mailed (from the production server) the results. Your CRON job will then activate your hash scan on a regular basis.

This is but one part of securing your website, though, as it will only inform you of changes to the types of files you've specified. Before you get this far, you must ensure that your files are malware free, ensure that no one but you can upload via FTP (by using VERY strong passwords) and keep "canned apps" up to date (because their patches close vulnerabilities found and exploited by hackers and their legions of "script kiddies").

In summary, BE PARANOID! There may be no one out to get you but there are those "out for kicks" who are looking for easy prey. Your objective is to avoid that classification.

    "There's no rest for the wary ... but they don't (often) have to rebuild a server!


 