Auto-Timestamping Every File That Hits Your Folder

April 24, 2026 · python · automation · blockchain · tutorial

I needed a way to timestamp files the second they hit my machine. Not when I remember to do it. Not when I get around to it. The moment the file exists.

Built a Python script that watches a folder and automatically hashes + timestamps anything new that drops in. Uses the watchdog library for file monitoring and the verify-proof package I shipped with ProofAnchor. It's been running for weeks now, catching everything from draft posts to random screenshots.

Here's how it works and the code to build your own.

The Problem: Manual Timestamping Doesn't Scale

You create files faster than you timestamp them. A photographer imports 200 shots from a wedding. A writer saves 15 draft versions of an article. A developer pushes artifacts to a staging folder.

By the time you remember to timestamp anything, you've already lost the exact creation moment. The file exists. The work is done. But the proof window closed.

I wanted something that runs in the background and handles this automatically. Drop a file in the watched folder, get a blockchain timestamp without thinking about it.

Building the File Watcher

The script uses Python's watchdog library to monitor a directory for new files. When it detects a new file, it calculates the SHA-256 hash using verify-proof and stores the hash with a timestamp.

import os
import time
from watchdog.observers import Observer
from watchdog.events import FileSystemEventHandler
from verify_proof import hash_file
import json
from datetime import datetime

class TimestampHandler(FileSystemEventHandler):
    def __init__(self, output_dir):
        self.output_dir = output_dir
        
    def on_created(self, event):
        if not event.is_directory:
            # Wait a moment for file to be fully written
            time.sleep(1)
            self.timestamp_file(event.src_path)
    
    def timestamp_file(self, file_path):
        try:
            # Get the SHA-256 hash
            file_hash = hash_file(file_path)
            
            # Create timestamp record
            timestamp_data = {
                'file_path': file_path,
                'file_name': os.path.basename(file_path),
                'sha256_hash': file_hash,
                'timestamped_at': datetime.now().isoformat(),
                'file_size': os.path.getsize(file_path)
            }
            
            # Save the hash record
            hash_filename = f"{os.path.basename(file_path)}_{file_hash[:8]}.json"
            hash_path = os.path.join(self.output_dir, hash_filename)
            
            with open(hash_path, 'w') as f:
                json.dump(timestamp_data, f, indent=2)
            
            print(f"Timestamped: {os.path.basename(file_path)} -> {file_hash}")
            
        except Exception as e:
            print(f"Error timestamping {file_path}: {e}")

def start_watching(watch_dir, output_dir):
    # Create output directory if it doesn't exist
    os.makedirs(output_dir, exist_ok=True)
    
    event_handler = TimestampHandler(output_dir)
    observer = Observer()
    observer.schedule(event_handler, watch_dir, recursive=False)
    
    observer.start()
    print(f"Watching {watch_dir} for new files...")
    
    try:
        while True:
            time.sleep(1)
    except KeyboardInterrupt:
        observer.stop()
    
    observer.join()

if __name__ == "__main__":
    watch_directory = "./watched_files"
    output_directory = "./timestamps"
    
    start_watching(watch_directory, output_directory)

This script monitors a folder and creates a JSON record for each new file with its SHA-256 hash and metadata. The hash gives you cryptographic proof of the file's exact contents at that moment.

Adding Verification

The timestamp records are useful, but you also want to verify files later. Here's an extended version that can verify a file against its stored hash:

import argparse
import json
from verify_proof import hash_file

def verify_file_integrity(file_path, timestamp_file):
    """Verify a file matches its stored hash"""
    try:
        # Load the timestamp record
        with open(timestamp_file, 'r') as f:
            timestamp_data = json.load(f)
        
        # Calculate current hash
        current_hash = hash_file(file_path)
        stored_hash = timestamp_data['sha256_hash']
        
        if current_hash == stored_hash:
            print("✓ File integrity verified")
            print(f"  Hash: {current_hash}")
            print(f"  Timestamped: {timestamp_data['timestamped_at']}")
            return True
        else:
            print("✗ File has been modified")
            print(f"  Original hash: {stored_hash}")
            print(f"  Current hash:  {current_hash}")
            return False
            
    except Exception as e:
        print(f"Error verifying file: {e}")
        return False

def main():
    parser = argparse.ArgumentParser(description='File timestamp watcher')
    parser.add_argument('action', choices=['watch', 'verify'])
    parser.add_argument('--watch-dir', default='./watched_files')
    parser.add_argument('--output-dir', default='./timestamps')
    parser.add_argument('--file', help='File to verify')
    parser.add_argument('--timestamp', help='Timestamp record to verify against')
    
    args = parser.parse_args()
    
    if args.action == 'watch':
        start_watching(args.watch_dir, args.output_dir)
    elif args.action == 'verify':
        if not args.file or not args.timestamp:
            print("Verify action requires --file and --timestamp arguments")
            return
        verify_file_integrity(args.file, args.timestamp)

if __name__ == "__main__":
    main()

Now you can run python watcher.py verify --file document.pdf --timestamp timestamps/document.pdf_a1b2c3d4.json to check if a file still matches its original hash.
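Verifying one file at a time gets tedious once the records pile up. A loop over the timestamps directory can audit everything in one pass. Here's a rough sketch; it uses hashlib directly instead of verify-proof's hash_file so it runs standalone, and assumes the JSON record layout shown above:

```python
import hashlib
import json
import os

def verify_all(timestamp_dir):
    """Audit every timestamp record: re-hash the file each record
    points to and report whether it's intact, changed, or gone."""
    for name in sorted(os.listdir(timestamp_dir)):
        if not name.endswith('.json'):
            continue
        with open(os.path.join(timestamp_dir, name)) as f:
            record = json.load(f)
        path = record['file_path']
        if not os.path.exists(path):
            status = 'missing'
        else:
            with open(path, 'rb') as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            status = 'ok' if digest == record['sha256_hash'] else 'modified'
        yield name, status
```

I run something like this after moving folders around or restoring from backup, to spot anything that drifted since it was first timestamped.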

Real-World Usage Patterns

I run this on three different folders:

Draft folder: Everything I write goes here first. Blog posts, documentation, random notes. The script catches every save and creates a hash record. If I ever need to prove when I wrote something, I have the timestamp.

Photo imports: Connected to my camera's import folder. Every photo gets hashed the moment it hits the disk. Useful for photography work where timing matters.

Project artifacts: Build outputs, releases, configuration files. Anything that gets generated and might need provenance later.

The script handles large files fine since verify-proof streams the file for hashing rather than loading it all into memory. I've tested it with multi-gigabyte video files without issues.
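If you'd rather not guess at what the package does internally, the streaming technique itself is plain stdlib: feed the hash object one chunk at a time so memory use stays flat no matter how big the file is. A minimal version looks like this:

```python
import hashlib

def sha256_streaming(path, chunk_size=1024 * 1024):
    """Hash a file in 1 MB chunks. Memory stays constant even for
    multi-gigabyte files, since only one chunk is in RAM at a time."""
    h = hashlib.sha256()
    with open(path, 'rb') as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            h.update(chunk)
    return h.hexdigest()
```

The digest comes out identical to hashing the whole file in one shot, so chunked and unchunked hashes are always comparable.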

One gotcha: some applications write files in chunks or create temporary files first. The one-second delay in the script helps avoid hashing incomplete files, but you might need to adjust this for your workflow.
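A fixed sleep is a blunt instrument. A more robust option is to poll until the file's size stops changing before hashing it. This is a hypothetical helper, not part of the script above; drop it into timestamp_file in place of the sleep and tune the interval and timeout to your workflow:

```python
import os
import time

def wait_until_stable(path, interval=0.5, timeout=30):
    """Return True once the file's size is unchanged between two
    consecutive polls; False if it never settles within the timeout."""
    last_size = -1
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            size = os.path.getsize(path)
        except OSError:
            # File vanished, e.g. a temp file that got renamed away.
            return False
        if size == last_size:
            return True
        last_size = size
        time.sleep(interval)
    return False
```

This still isn't bulletproof (an app can pause mid-write longer than your interval), but it handles the common write-in-chunks case far better than a flat one-second delay.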

What You Get

Each file that hits your watched folder gets a JSON record like this:

{
  "file_path": "./watched_files/contract_draft.pdf",
  "file_name": "contract_draft.pdf", 
  "sha256_hash": "a1b2c3d4e5f6789...",
  "timestamped_at": "2026-04-23T14:30:22.123456",
  "file_size": 245760
}

The SHA-256 hash is your cryptographic fingerprint. If even one byte changes in the file, you get a completely different hash. That's how you prove the file hasn't been modified since the timestamp.
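You can see that avalanche effect for yourself with Python's hashlib, the same SHA-256 the script computes. Two inputs differing by a single trailing byte produce digests with nothing recognizably in common:

```python
import hashlib

# Identical strings except for the final period.
a = hashlib.sha256(b'The quick brown fox jumps over the lazy dog').hexdigest()
b = hashlib.sha256(b'The quick brown fox jumps over the lazy dog.').hexdigest()

print(a)
print(b)

# Count how many of the 64 hex positions happen to agree by chance.
matching = sum(1 for x, y in zip(a, b) if x == y)
print(f"{matching} of 64 hex characters match")
```

That's why a stored hash is such strong evidence: there's no way to nudge a file slightly and land anywhere near the original digest.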

You can take this further by anchoring the hash to a blockchain using ProofAnchor, but even just the local timestamp + hash gives you strong evidence of when you had the file in that exact state.

The beauty is it runs silently in the background. Drop files in the folder, get automatic timestamps. No manual steps, no missed files, no gaps in your proof timeline.