Looking at the bench and the readings, he concluded that the previous night’s firmware update had introduced a timing mismatch. The wires hadn’t burnt out, but the clock that told them when to fire had been off by a microsecond, so the expected voltage response never lined up. He suspected half the channels had dropped out, even though the hardware itself wasn’t damaged. Fifteen minutes and a simple firmware rollback later, everything worked perfectly.
Now, Lyre and I swapped the saline for neuron cultures to check whether the wires could stimulate and record real biological activity. While we ran those checks, Aux fine-tuned his AI encoder and processed April’s data.
We were finally ready to test the integrated system without yet risking its insertion into April’s brain. We built something we only half-jokingly called a “phantom cortex,” a benchtop stand-in: a synthetic cortical sheet of cultured neurons on a chip, designed to act as April’s visual cortex. On one side, we connected a lab-grown retinal implant that carried live sensory input. On the other, Aux’s playback device pushed reconstructed memories. The phantom cortex’s visual field was rendered on a lab monitor so that we could assess the pattern projections. The rig buzzed faintly in the background, its gelled neuron sheets twitching under the microscope with each ripple of charge.









