
Recent research demonstrates that brain organoids can indeed “learn” and perform tasks, thanks to AI-driven training techniques inspired by neuroscience and machine learning. AI technologies are essential here, as they decode complex neural data from the organoids, allowing scientists to observe how the organoids adjust their cellular networks in response to stimuli. These AI algorithms also control the feedback signals, creating a biofeedback loop that allows the organoids to adapt and even demonstrate short-term memory (Bai et al. 2024).
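To make the idea of an AI-controlled biofeedback loop concrete, here is a minimal sketch in Python. The stimulation and recording functions (`apply_stimulus`, `read_spike_counts`) are hypothetical stand-ins for whatever electrode-array interface a real setup exposes, and the linear readout is only illustrative; the point is the cycle of stimulate, record, decode, and feed back.

```python
# Minimal sketch of a closed-loop biofeedback cycle (hypothetical interface).
# `read_spike_counts` and `apply_stimulus` stand in for a real MEA hardware
# API; they are assumptions for illustration, not a published interface.
import numpy as np

def read_spike_counts(n_electrodes: int) -> np.ndarray:
    """Placeholder: return per-electrode spike counts from the last window."""
    return np.random.poisson(lam=5.0, size=n_electrodes)

def apply_stimulus(pattern: np.ndarray) -> None:
    """Placeholder: deliver an electrical stimulation pattern to the organoid."""
    pass

def decode(activity: np.ndarray, weights: np.ndarray) -> int:
    """Linear readout: map recorded activity to a predicted class (0 or 1)."""
    return int(activity @ weights > 0)

n_electrodes, lr = 16, 0.01
weights = np.zeros(n_electrodes)

for trial in range(100):
    target = trial % 2                                # alternating task target
    apply_stimulus(np.eye(n_electrodes)[target])      # encode the input as a stimulus pattern
    activity = read_spike_counts(n_electrodes)        # record the organoid's response
    prediction = decode(activity, weights)
    error = target - prediction
    weights += lr * error * activity                  # update the AI readout
    # "Predictable" feedback on success, noisy feedback on failure closes the loop.
    feedback = np.ones(n_electrodes) if error == 0 else np.random.rand(n_electrodes)
    apply_stimulus(feedback)
```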

One technique central to AI-integrated organoid computing is reservoir computing, a model traditionally implemented in silicon-based hardware. In an open-loop setup, the organoid serves as the “reservoir,” processing input signals and dynamically adjusting its responses, while AI algorithms interpret the recorded activity. By classifying and predicting these responses, researchers can study how organoids adapt to specific inputs, suggesting the potential for simple computational processing within a biological substrate (Kagan et al. 2023; Aaser et al. n.d.).
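The following sketch illustrates the reservoir-computing idea itself, with a small random recurrent network standing in for the organoid: the reservoir is never trained, and only a linear readout is fit to its recorded responses. All sizes and parameters are illustrative assumptions, not values from the cited studies.

```python
# Open-loop reservoir computing: a fixed, untrained "reservoir" (here a random
# recurrent network, playing the role of the organoid) transforms inputs, and
# only a linear readout is trained on the recorded states.
import numpy as np

rng = np.random.default_rng(0)
n_inputs, n_reservoir, n_steps = 1, 100, 500

W_in = rng.normal(scale=0.5, size=(n_reservoir, n_inputs))
W = rng.normal(scale=1.0, size=(n_reservoir, n_reservoir))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))    # keep the dynamics stable

u = rng.uniform(-1, 1, size=(n_steps, n_inputs))   # input signal
y = (u[:, 0] > 0).astype(float)                    # toy classification target

# Collect reservoir states (in a biological setup these would be recorded
# electrode responses rather than simulated states).
x = np.zeros(n_reservoir)
states = np.zeros((n_steps, n_reservoir))
for t in range(n_steps):
    x = np.tanh(W_in @ u[t] + W @ x)
    states[t] = x

# Train only the linear readout, with ridge regression.
ridge = 1e-3
W_out = np.linalg.solve(states.T @ states + ridge * np.eye(n_reservoir),
                        states.T @ y)
accuracy = np.mean((states @ W_out > 0.5) == y)
print(f"readout accuracy: {accuracy:.2f}")
```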

Our body isn’t just human—it’s home to trillions of microorganisms found in or on us. In fact, there are more microbes in our gut than there are stars in the Milky Way. These microbes are essential for human health, but scientists are still figuring out exactly what they do and how they help.

In a new study, published in Nature Microbiology, my colleagues and I explored how our gut bacteria can protect us from a potentially harmful group known as the Enterobacteriaceae. These bacteria include species such as Escherichia coli (E. coli), which is normally harmless in small amounts but can cause infections and other health problems if it grows too much.

We found that our gut environment—shaped by things like diet—plays a big role in keeping potentially harmful bacteria in check.

Cis-trans photoisomerization underlies many processes in biology and materials science, but only careful and time-consuming quantum chemistry methods can describe such reactions in detail. Here, a predictive tool is presented that requires only a few affordable calculations and evaluates the efficiency of paradigmatic and modified photoswitches.

A groundbreaking discovery by researchers at the University of California, Los Angeles (UCLA) has challenged a long-standing rule in organic chemistry known as Bredt’s Rule. Established nearly a century ago, this rule held that certain types of organic molecules could not be synthesized due to their instability. The UCLA team’s findings open the door to molecular structures that were previously deemed unattainable, potentially revolutionizing fields such as pharmaceutical research.

To grasp the significance of this breakthrough, it’s helpful to first understand some basics of organic chemistry. Organic chemistry primarily deals with molecules made of carbon, such as those found in living organisms. Among these, certain molecules known as olefins or alkenes feature double bonds between two carbon atoms. These double bonds create a specific geometry: the atoms and atom groups attached to them are generally in the same plane, making these structures fairly rigid.

In 1924, German chemist Julius Bredt formulated a rule regarding certain molecular structures called bridged bicyclic molecules. These molecules have a complex structure with multiple rings sharing common atoms, akin to two intertwined bracelet loops. Bredt’s Rule dictates that these molecules cannot have a double bond at a position known as the bridgehead, where the two rings meet. The rule is based on geometric reasons: a double bond at the bridgehead would create such significant structural strain that the molecule would become unstable or even impossible to synthesize.

Treating hair loss may be as simple as developing therapies to flip a molecular “switch,” according to a new study by researchers from Penn State; the University of California, Irvine; and National Taiwan University.

The researchers reviewed the biological and social evolution of human scalp hair. Based on their analysis, they proposed a novel theory that points to a molecular basis underlying the ability to grow long scalp hair.

In short, human ancestors may have always had the ability to grow long scalp hair, but the trait remained dormant until certain environmental and biological conditions — like walking upright on two legs — turned on the molecular program. The team published their findings, which they said could serve as the basis for future experimental work, in the British Journal of Dermatology.

Artificially engineering biological processes such as perception remains an elusive target for organic electronics experts, because human senses rely on an adaptive network of sensory neurons that communicate by firing in response to environmental stimuli.

A new collaboration between Northwestern University and Georgia Tech has unlocked new potential for the field by creating a novel high-performance organic electrochemical neuron (OECN) that responds within the frequency range of human neurons. The team also built a complete perception system by designing other organic materials and integrating their engineered neurons with artificial touch receptors and synapses, which enabled real-time tactile signal sensing and processing.
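As a rough software analogue of what “responding within the frequency range of human neurons” means, the sketch below simulates a leaky integrate-and-fire neuron whose firing rate falls in the roughly 1–100 Hz range typical of biological neurons. It is not a model of the organic electrochemical circuit described in the paper; all parameters are illustrative assumptions.

```python
# Software analogue only: a leaky integrate-and-fire neuron tuned to spike in
# the ~1-100 Hz range typical of biological neurons. This illustrates the
# target firing regime, not the organic electrochemical neuron itself.
import numpy as np

dt = 1e-4          # 0.1 ms time step
tau = 0.02         # 20 ms membrane time constant
v_rest, v_thresh, v_reset = 0.0, 1.0, 0.0

def firing_rate(input_current: float, duration: float = 1.0) -> float:
    """Simulate the neuron for `duration` seconds and return spikes per second."""
    v, spikes = v_rest, 0
    for _ in range(int(duration / dt)):
        v += dt / tau * (v_rest - v + input_current)   # leaky integration
        if v >= v_thresh:                              # threshold crossing -> spike
            spikes += 1
            v = v_reset
    return spikes / duration

for current in (1.1, 1.5, 2.0):
    print(f"input {current:.1f} -> {firing_rate(current):.0f} Hz")
```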

The research, described in a paper in Proceedings of the National Academy of Sciences, could move the needle on intelligent robots and other systems currently stymied by sensing systems that are less powerful than those of a human.

Dr. Simon Stringer obtained his PhD in mathematical state space control theory and has been a Senior Research Fellow at Oxford University for over 27 years. Simon is the director of the Oxford Centre for Theoretical Neuroscience and Artificial Intelligence, which is based within the Oxford University Department of Experimental Psychology. His department covers vision, spatial processing, motor function, language and consciousness, and in particular how the primate visual system learns to make sense of complex natural scenes. Dr. Stringer’s laboratory houses a team of theoreticians who are developing computer models of a range of different aspects of brain function, and is investigating the neural and synaptic dynamics that underpin brain function. An important question here is the feature-binding problem, which concerns how the visual system represents the hierarchical relationships between features: the visual system must represent hierarchical binding relations across the entire visual field, at every spatial scale and at every level in the hierarchy of visual primitives.

We discuss the self-organised behaviour, complex information processing, invariant sensory representations and hierarchical feature binding that emerge when you build biologically plausible neural networks with temporal spiking dynamics.
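As a point of reference for the spike-timing-dependent plasticity (STDP) discussed around the 30-minute mark, here is a toy pair-based STDP rule in Python; the amplitudes and time constants are conventional illustrative values, not parameters from Dr. Stringer’s models.

```python
# Toy pair-based STDP rule: the sign and size of the weight change depend on
# the relative timing of pre- and postsynaptic spikes. Constants are
# conventional illustrative values only.
import numpy as np

A_plus, A_minus = 0.01, 0.012        # potentiation / depression amplitudes
tau_plus, tau_minus = 0.020, 0.020   # time constants (seconds)

def stdp_dw(t_pre: float, t_post: float) -> float:
    """Weight change for a single pre/post spike pair (times in seconds)."""
    dt = t_post - t_pre
    if dt > 0:      # pre before post: potentiate
        return A_plus * np.exp(-dt / tau_plus)
    else:           # post before pre: depress
        return -A_minus * np.exp(dt / tau_minus)

for dt_ms in (-40, -10, 10, 40):
    print(f"dt = {dt_ms:+d} ms -> dw = {stdp_dw(0.0, dt_ms / 1000):+.4f}")
```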

00:00:00 Tim Intro.
00:09:31 Show kickoff.
00:14:37 Hierarchical Feature binding and timing of action potentials.
00:30:16 Hebb to Spike-timing-dependent plasticity (STDP)
00:35:27 Encoding of shape primitives.
00:38:50 Is imagination working in the same place in the brain.
00:41:12 Compare to supervised CNNs.
00:45:59 Speech recognition, motor system, learning mazes.
00:49:28 How practical are these spiking NNs.
00:50:19 Why simulate the human brain.
00:52:46 How much computational power do you gain from differential timings.
00:55:08 Adversarial inputs.
00:59:41 Generative / causal component needed?
01:01:46 Modalities of processing i.e. language.
01:03:42 Understanding.
01:04:37 Human hardware.
01:06:19 Roadmap of NNs?
01:10:36 Interpretability methods for these new models.
01:13:03 Won’t GPT just scale and do this anyway?
01:15:51 What about trace learning and transformation learning.
01:18:50 Categories of invariance.
01:19:47 Biological plausibility.

Pod version: https://anchor.fm/machinelearningstre

“A new approach to solving the feature-binding problem in primate vision” by James B. Isbister, Akihiro Eguchi, Nasir Ahmad, Juan M. Galeazzi, Mark J. Buckley and Simon Stringer: https://royalsocietypublishing.org/do…

Simon’s department is looking for funding, please do get in touch with him if you can facilitate this. #machinelearning #neuroscience

https://www.neuroscience.ox.ac.uk/res
https://en.wikipedia.org/wiki/Simon_S
/ simon-stringer-a3b239b4
