Paul Han

Objective: Continuity Fish (Human-AI Augmentation)

Paul Han built AI training clusters at Meta and designed nuclear weapons at LLNL before realizing that combining AI with health wearables could do more than track vitals: it could capture states of being. He wants to erase the hard swap between "2024 Paul" and "2027 Paul" by recording physiology, motion, and context 24/7 and replaying the optimal state on demand, expanding what human experience can hold onto.

The core problem: current wearables quantify single metrics (steps, HRV) and hand the raw numbers to a cloud app; they never reconstruct the multi-sensor fingerprint of a given mindset. Paul's patch will build his digital twin from continuous biometric monitoring and steer the mind back toward a recorded state through haptic or audio cues, all on-device: no cloud, no latency.
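The closed loop described above can be sketched in a few lines. This is a minimal illustration, not Paul's implementation: the state names, feature layout, and threshold are all assumptions, standing in for whatever the patch actually records.

```python
# Hypothetical sketch of the on-device state-matching loop: compare a live
# multi-sensor feature vector against recorded "state fingerprints" and fire
# a haptic cue when the wearer drifts too far from the target state.
# All names and numbers here are illustrative, not from the actual device.

import math

# Recorded fingerprints: state name -> feature vector
# (assumed layout: [heart rate from PPG, HRV from ECG, motion energy])
FINGERPRINTS = {
    "presentation_mode": [72.0, 55.0, 0.12],
    "deep_focus": [64.0, 70.0, 0.05],
}

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def nearest_state(live):
    """Return the recorded state closest to the live reading, with distance."""
    return min(((name, distance(live, fp)) for name, fp in FINGERPRINTS.items()),
               key=lambda t: t[1])

def nudge(live, target, threshold=10.0):
    """Fire a haptic cue if the live reading has drifted from `target`."""
    if distance(live, FINGERPRINTS[target]) > threshold:
        return ("haptic_cue", target)
    return ("hold", target)

# A live reading that has drifted from presentation_mode triggers a cue:
print(nudge([85.0, 40.0, 0.30], "presentation_mode"))
# -> ('haptic_cue', 'presentation_mode')
```

In practice the matching would run against many more channels and a learned model rather than raw Euclidean distance, but the shape of the loop, record, compare, cue, is the same.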

Paul joined Calculus because traditional labs spend more time filing IRBs and booking equipment than shipping hardware; the hacker-house cadence lets him iterate boards at breakfast and run field tests before dinner.

By September he will ship: a 50-subject dataset linking multi-location PPG/ECG patterns to self-reported cognitive states; a production run of 20 FPGA-enabled patches with 48-hour adhesive wear; and a live demo where the device nudges users into their recorded “presentation mode” in real time.
