## Archive for the ‘biological’ category

In this podcast, I have invited Daniel Jue, one of the youngest entrepreneurs in the field of AGI. Daniel is an independent Artificial General Intelligence researcher at Cognami in the US. He has worked in support of the US Department of Defense, including data fusion and analytic development for DARPA, the Defense Advanced Research Projects Agency, whose mission is to prevent technological surprise by potential adversaries. He has also worked with scientists and engineers at IronNet CyberSecurity, a startup with DARPA and NSA heritage that recently went public. In March 2021, Daniel took up full-time AGI research, drawing on the fields of computer science, neuroscience, philosophy, and psychology. Among his major influences are Jacques Pitrat’s CAIA (an artificial AI scientist) project, Jean Piaget’s theories of childhood development, and spiking neural networks. He sees a generalizable substrate as the basis for AGI, in which engineers design the “physics” from which intelligent behavior can emerge.

SUBSCRIBE to our YouTube Channel and “ring the bell” for all the latest videos from ‘The SCI-AI Podcast’ at https://bit.ly/3y6ISwL
- Listen to us on Buzzsprout: https://feeds.buzzsprout.com/1816580.rss
- FOLLOW us on Instagram: https://www.instagram.com/brightvik/
- SUBSCRIBE to our channel on Apple Podcast: https://apple.co/3gllCVL
- SUBSCRIBE to our channel on Spotify Podcast: https://spoti.fi/2WfCTZx

Hey it’s Han from WrySci HX talking about a really interesting concept — the world’s first biological gaming console that uses nanopore technology to detect molecules, and turn these readouts into games! It’s called the Demonpore 64! More below ↓↓↓

Subscribe! =]

The results showed that, when learning to potty-train, calves performed at a level comparable to children, and did better than very young children.

If you can potty-train a child, you can potty-train a cow. At least, that was the theory a group of researchers in Germany decided to test, in a bid to find a solution to the environmental damage caused by livestock waste.

“It’s usually assumed that cattle are not capable of controlling defecation or urination,” said Jan Langbein, co-author of a study published Monday in the journal Current Biology.

By Harold Katcher.

The book The Illusion of Knowledge, by Harold Katcher, was launched on September 4, 2021 at Book Passage, Ferry Building, San Francisco, CA. The book was published by NTZ, a publisher specializing in the rejuvenation field.

SEOUL, Sept 9 (Reuters) — South Korean researchers say they have developed an artificial skin-like material, inspired by natural biology, that can quickly adjust its hues like a chameleon to match its surroundings.

The team, led by Ko Seung-hwan, a mechanical engineering professor at Seoul National University, created the “skin” with a special ink that changes colour based on temperature and is controlled by tiny, flexible heaters.

“If you wear woodland camouflage uniforms in desert, you can be easily exposed,” Ko told Reuters. “Changing colours and patterns actively in accordance with surroundings is key to the camouflage technology that we created.”

Should we be searching for post-biological aliens?

A more general definition of entropy was proposed by Boltzmann (1877) as S = k ln W, where k is Boltzmann’s constant and W is the number of possible states of a system, in units of J⋅K⁻¹, tying entropy to statistical mechanics. Szilard (1929) suggested that entropy is fundamentally a measure of the information content of a system. Shannon (1948) defined informational entropy as $$S=-\sum_{i} p_i \log_b p_i$$ where $p_i$ is the probability of finding message number i in the defined message space, and b is the base of the logarithm used (typically 2, giving units of bits). Landauer (1961) proposed that informational entropy is interconvertible with thermodynamic entropy, such that for a computational operation in which 1 bit of information is erased, the amount of thermodynamic entropy generated is at least k ln 2. This prediction has recently been experimentally verified in several independent studies (Bérut et al. 2012; Jun et al. 2014; Hong et al. 2016; Gaudenzi et al. 2018).
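The two quantities above can be checked numerically. A minimal sketch in Python (the probability distributions, the temperature, and the helper name are illustrative choices, not taken from the text):

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy S = -sum_i p_i * log_b(p_i); base 2 gives bits."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin carries 1 bit of entropy; a biased coin carries less.
fair = shannon_entropy([0.5, 0.5])    # 1.0 bit
biased = shannon_entropy([0.9, 0.1])  # ~0.469 bits

# Landauer's bound: erasing 1 bit generates at least k*ln(2) of
# thermodynamic entropy, i.e. dissipates at least k*T*ln(2) joules as heat.
k_B = 1.380649e-23   # Boltzmann constant, J/K (exact in the 2019 SI)
T = 300.0            # room temperature, K
landauer_heat = k_B * T * math.log(2)  # ~2.87e-21 J per erased bit
```

At room temperature the bound works out to roughly 3 × 10⁻²¹ J per erased bit, which gives a sense of why the experimental verifications cited above required such delicate single-particle measurements.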

The equivalency of thermodynamic and informational entropy suggests that critical points of instability and subsequent self-organization observed in thermodynamic systems may be observable in computational systems as well. Indeed, this agrees with observations in cellular automata (e.g., Langton 1986; 1990) and neural networks (e.g., Wang et al. 1990; Inoue and Kashima 1994), which self-organize to maximize informational entropy production (e.g., Solé and Miramontes 1995). The source of additional information used for self-organization has been identified as bifurcation and deterministic chaos (Langton 1990; Inoue and Kashima 1994; Solé and Miramontes 1995; Bahi et al. 2012) as defined by Devaney (1986). This may provide an explanation for the phenomenon termed emergence, known since classical antiquity (Aristotle, c. 330 BCE) but lacking a satisfactory explanation (refer to Appendix A for discussion on deterministic chaos, and Appendix B for discussion on emergence). It is also in full agreement with extensive observations of deterministic chaos in chemical (e.g., Nicolis 1990; Györgyi and Field 1992), physical (e.g., Maurer and Libchaber 1979; Mandelbrot 1983; Shaw 1984; Barnsley et al. 1988) and biological (e.g., May 1975; Chay et al. 1995; Jia et al. 2012) dissipative structures and systems.
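The deterministic chaos invoked here as a source of information for self-organization can be seen in even the simplest nonlinear system. A minimal sketch using the logistic map (the parameter values and function name are illustrative; the works cited above study other systems):

```python
# Logistic map x_{n+1} = r * x * (1 - x): below r ~ 3.57 orbits settle
# into periodic cycles; at higher r, nearby initial conditions diverge
# exponentially (deterministic chaos in Devaney's sense).

def orbit(r, x0=0.2, warmup=500, n=8):
    """Iterate the logistic map, discard the transient, return n values."""
    x = x0
    for _ in range(warmup):        # let the orbit reach its attractor
        x = r * x * (1 - x)
    out = []
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(round(x, 6))    # round so cycles compare exactly
    return out

periodic = orbit(3.2)   # settles onto a period-2 cycle
chaotic = orbit(3.9)    # aperiodic, sensitive to the initial condition
```

The periodic orbit collapses to two repeating values, while the chaotic orbit keeps producing new ones: the same deterministic rule generates unbounded novelty, which is the sense in which chaos supplies information for self-organization.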

This theoretical framework establishes a deep fundamental connection between cybernetic and biological systems, and implicitly predicts that as more work is put into cybernetic systems composed of hierarchical dissipative structures, their complexity increases, allowing for more possibilities of coupled feedback and emergence at increasingly higher levels. Such high-level self-organization is routinely exploited in machine learning, where artificial neural networks (ANNs) self-organize in response to inputs from the environment similarly to neurons in the brain (e.g., Lake et al. 2017; Fong et al. 2018). The recent development of a highly organized (low entropy) immutable information carrier, in conjunction with ANN-based artificial intelligence (AI) and distributed computing systems, presents new possibilities for self-organization and emergence.

Quantum sensing is being used to outpace modern sensing processes by applying quantum mechanics to design and engineering. These optimized processes will help beat the current limits in processes like studying magnetic materials or studying biological samples. In short, quantum is the next frontier in sensing technology.

As recently as 2019, spin defects known as qubits were discovered in 2D materials (hexagonal boron nitride), with the potential to advance the field of ultrathin quantum sensing. But the discovery hit a snag that has set off a scientific race to resolve it: the qubits’ sensitivity was limited by their low brightness and the low contrast of their magnetic resonance signal. As recently as two weeks ago, on August 9, 2021, Nature Physics published an article titled “Quantum sensors go flat,” highlighting the benefits, and outlining the current shortfalls, of this new and exciting means of sensing with qubits in 2D materials.

A team of researchers at Purdue took on the challenge of overcoming these qubit signal shortcomings in their work to develop ultrathin quantum sensors from 2D materials. Their paper, published today, September 2, 2021, in Nano Letters, resolves some of the critical issues and reports much better experimental results.

Computational neuroscientists taught an artificial neural network to imitate a biological neuron. The result offers a new way to think about the complexity of single brain cells.

Deep beneath the seabed, teensy bacteria “exhale” electricity through long, skinny snorkels, and now, scientists have discovered how to switch these microbes’ electric breath on and off.
