Archive for the ‘computing’ category: Page 210

Jul 5, 2023

Can Sponges “Think” Using Light?

Posted in categories: computing, education

Sponges might not look like particularly complex animals, but they’ve had billions of years to evolve their own special systems. And one of those systems might involve sending messages through their body in the form of light.

Hosted by: Rose Bear Don’t Walk (She/Her)


Jul 4, 2023

Quantum Computing On A Commodore 64 In 200 Lines Of BASIC

Posted in categories: computing, education, quantum physics

The term ‘quantum computer’ is usually tossed around in the context of hyper-advanced, state-of-the-art computing devices. But much as a 19th-century mechanical computer, a discrete computer built from individual transistors, and a human being are all computers, what really matters is how fast and accurately the system performs a given task, whether the computation is classical or quantum. This is demonstrated succinctly by [Davide ‘dakk’ Gessa] with 200 lines of BASIC code on a Commodore 64 (GitHub), implementing a range of quantum gates.

Much like the transistor in classical computing, the qubit forms the core of quantum computing, and we have known for a long time that a qubit can be simulated, even on something as mundane as an 8-bit MPU. Hence [Davide]’s simulations of various quantum gates on a C64: Pauli-X, Pauli-Y, Pauli-Z, Hadamard, CNOT, and SWAP, all operating on a two-qubit system running on hardware that first saw the light of day in the early 1980s.
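The original runs in Commodore BASIC, but the underlying math is tiny in any language. Here is a rough Python sketch of the same idea, assuming the usual state-vector representation of two qubits (the helper names are ours, not from [Davide]’s code):

```python
# Illustrative two-qubit state-vector simulator (not the original BASIC).
# Gate names follow the post: Pauli-X/Y/Z, Hadamard, CNOT, SWAP.
import numpy as np

# Single-qubit gates as 2x2 matrices
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# Two-qubit gates as 4x4 matrices; state vector ordered as |q1 q0>, index = 2*q1 + q0
CNOT = np.array([[1, 0, 0, 0],   # control = qubit 1, target = qubit 0
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
SWAP = np.array([[1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]], dtype=complex)

def apply_single(gate, qubit, state):
    """Apply a single-qubit gate to qubit 0 or 1 of a two-qubit state vector."""
    full = np.kron(I, gate) if qubit == 0 else np.kron(gate, I)
    return full @ state

# Start in |00>, apply H to qubit 1 then CNOT to prepare a Bell state
state = np.array([1, 0, 0, 0], dtype=complex)
state = apply_single(H, 1, state)
state = CNOT @ state
print(np.round(state, 3))     # (|00> + |11>)/sqrt(2)
print(np.abs(state) ** 2)     # measurement probabilities
```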

Naturally, the practical use of simulating a two-qubit system on a general-purpose MPU running at a blistering ~1 MHz is quite limited, but as a teaching tool it’s incredibly accessible and a fun way to introduce people to the world of quantum computing.

Jul 4, 2023

QEDMA Quantum Computing: Shaping the Future of Quantum Operating Systems

Posted in categories: computing, quantum physics

Quantum computing has long been heralded as the next frontier in computing. However, despite their immense potential, quantum computers today still make too many errors to be useful.

While it may become possible to correct these errors in the future, there is still a long way to go to reach fault tolerance. For now, the best strategy is to minimize errors and mitigate their impact on quantum computations by devising methods that can work with the existing quantum hardware.
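QEDMA’s own techniques aren’t detailed here, but one widely used error-mitigation idea, zero-noise extrapolation, gives a feel for the approach: deliberately run the circuit at amplified noise levels, then extrapolate the measured expectation value back toward the zero-noise limit. A toy numerical sketch follows; the noise model and numbers are invented purely for illustration and this is not QEDMA’s method.

```python
# Toy zero-noise extrapolation: extrapolate noisy expectation values
# back to an estimated zero-noise limit. All numbers are illustrative.
import numpy as np

# Suppose the ideal (unknown) expectation value is 1.0 and depolarizing-style
# noise damps it exponentially with a noise-scaling factor.
def noisy_expectation(noise_scale, ideal=1.0, error_rate=0.05):
    return ideal * np.exp(-error_rate * noise_scale)

# "Measure" at amplified noise levels (on hardware, e.g., via gate folding)
scales = np.array([1.0, 2.0, 3.0])
values = np.array([noisy_expectation(s) for s in scales])

# Fit a low-order polynomial in the noise scale and evaluate it at zero noise
coeffs = np.polyfit(scales, values, deg=2)
zero_noise_estimate = np.polyval(coeffs, 0.0)

print(values)               # raw noisy estimates
print(zero_noise_estimate)  # much closer to the ideal value of 1.0
```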


Jul 4, 2023

Dimitar Sasselov — What is the Far Future of Intelligence in the Universe?

Posted in categories: biological, computing, space


Our universe has been developing for about 14 billion years, but human-level intelligence, at least on Earth, has emerged in a remarkably short period of time, measured in tens or hundreds of thousands of years. What then is the future of intelligence? With the exponential growth of computing, will non-biological intelligence dominate?


Jul 4, 2023

Microsoft’s light-based computer marks ‘the unravelling of Moore’s Law’

Posted in categories: computing, finance

Presenting its findings as “Unlocking the future of computing,” Microsoft is edging ever closer to photonic computing technology with the Analog Iterative Machine (AIM). Right now, the light-based machine is being licensed for use in financial institutions to help navigate the endlessly complex data flowing through them.

According to the Microsoft Research Blog, “Microsoft researchers have been developing a new kind of analog optical computer that uses photons and electrons to process continuous value data, unlike today’s digital computers that use transistors to crunch through binary data” (via Hardware Info).
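As the name suggests, AIM iterates over continuous-valued data rather than bits, and the Microsoft Research work targets optimization problems. As a purely digital toy analogy of that kind of workload, and not a description of how the optical hardware actually operates, a gradient-style iteration minimizing a quadratic objective looks like this:

```python
# Purely digital toy analogy of iterative optimization over continuous values.
# This is NOT how AIM works internally; it only illustrates the kind of
# workload (minimizing a quadratic objective) that such machines target.
import numpy as np

rng = np.random.default_rng(0)
n = 8
A = rng.normal(size=(n, n))
Q = A.T @ A + np.eye(n)        # symmetric positive definite, so the toy problem is well posed
c = rng.normal(size=n)

# Minimize f(x) = x^T Q x + c^T x by repeated small corrections
x = np.zeros(n)
step = 1.0 / np.linalg.eigvalsh(2 * Q).max()   # conservative step size
for _ in range(500):
    gradient = 2 * Q @ x + c
    x = x - step * gradient

print(x)                              # iterative estimate
print(np.linalg.solve(2 * Q, -c))     # closed-form minimizer, for comparison
```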

Jul 4, 2023

AMD Ryzen 5 7500F Allegedly Coming Soon, No iGPU Support

Posted in category: computing

The product itself is an interesting one, and seems built to hit a sweet price/performance spot for anyone who plans on using a discrete GPU. Of course, the absence of an integrated GPU does limit users’ flexibility; I can’t count the number of times I used an integrated GPU to pinpoint issues with my systems (and graphics cards). But the fact remains that more consumer choice is a good thing: users can decide for themselves whether that’s worth the extra $10 or not.

Most of this information comes courtesy of Harukaze (via Twitter), as well as a benchmark on PugetBench, where the Ryzen 5 7500F was paired with an X670E motherboard and 32 GB of DDR5-4800 memory.

Jul 3, 2023

A user-friendly platform for virtual exploration of chemical reactions

Posted in categories: chemistry, computing

A new online platform to explore computationally calculated chemical reaction pathways has been released, allowing for in-depth understanding and design of chemical reactions.

Advances in computational chemistry have led to the discovery of new reaction pathways for the synthesis of high-value compounds. Computational chemistry generates a great deal of data, and organizing and visualizing that data is vital to making it usable for future research.

A team of researchers from Hokkaido University, led by Professor Keisuke Takahashi at the Faculty of Chemistry and Professor Satoshi Maeda at the Institute for Chemical Reaction Design and Discovery (WPI-ICReDD), have developed a centralized, interactive, and user-friendly platform, Searching Chemical Action and Network (SCAN), to explore reaction pathways generated by computational chemistry. Their research was published in the journal Digital Discovery.
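The post doesn’t describe SCAN’s internal data model, but reaction-path networks of this kind are naturally organized as graphs, with species or intermediates as nodes and elementary steps as edges. A minimal sketch using networkx, with made-up species and barrier values (summing barriers along a path is a crude proxy, used here only to show the structure):

```python
# Minimal sketch of a reaction-path network as a directed graph.
# Species names and barrier energies are invented for illustration;
# this is not SCAN's actual data model.
import networkx as nx

network = nx.DiGraph()
# Edges: (reactant, product, activation barrier in kJ/mol)
steps = [
    ("A", "B", 85.0),
    ("B", "C", 40.0),
    ("A", "D", 120.0),
    ("D", "C", 30.0),
]
for reactant, product, barrier in steps:
    network.add_edge(reactant, product, barrier=barrier)

# Pathway from A to C with the lowest summed barrier
path = nx.shortest_path(network, "A", "C", weight="barrier")
print(path)  # ['A', 'B', 'C']
```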

Jul 3, 2023

Unraveling a Quantum Enigma: How Tantalum Enhances Qubit Performance

Posted in categories: chemistry, computing, nanotechnology, quantum physics

Whether it’s baking a cake, constructing a building, or creating a quantum device, the caliber of the finished product is greatly influenced by the components or fundamental materials used. In their pursuit to enhance the performance of superconducting qubits, which form the bedrock of quantum computers, scientists have been probing different foundational materials aiming to extend the coherent lifetimes of these qubits.

Coherence time measures how long a qubit can preserve quantum data, making it a key performance indicator. Researchers recently showed that using tantalum in superconducting qubits enhances their functionality. However, the underlying reasons remained unknown – until now.

Scientists from the Center for Functional Nanomaterials (CFN), the National Synchrotron Light Source II (NSLS-II), the Co-design Center for Quantum Advantage (C2QA), and Princeton University investigated the fundamental reasons that these qubits perform better by decoding the chemical profile of tantalum.
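For context on the metric itself, and as generic practice rather than the specific analysis in this study: relaxation times such as T1 are usually extracted by fitting an exponential decay to repeated measurements of the qubit’s excited-state population at increasing delays. A quick sketch with synthetic data:

```python
# Generic illustration of extracting a T1 coherence time by fitting
# exponential decay to synthetic qubit measurement data.
import numpy as np
from scipy.optimize import curve_fit

def decay(t, t1):
    return np.exp(-t / t1)

# Synthetic experiment: true T1 of 300 microseconds plus measurement noise
true_t1 = 300.0
delays = np.linspace(0, 1000, 40)          # microseconds
rng = np.random.default_rng(1)
populations = decay(delays, true_t1) + rng.normal(0, 0.02, delays.size)

(t1_fit,), _ = curve_fit(decay, delays, populations, p0=[100.0])
print(f"fitted T1 ≈ {t1_fit:.0f} µs")
```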

Jul 3, 2023

How to stop quantum computers from breaking the internet’s encryption

Posted in categories: computing, encryption, information science, internet, quantum physics

Today’s encryption schemes will be vulnerable to future quantum computers, but new algorithms and a quantum internet could help.

Jul 3, 2023

AI and Humanity’s Future

Posted in categories: augmented reality, automation, big data, computing, disruptive technology, evolution, futurism, innovation, internet, machine learning, robotics/AI, singularity, supercomputing, transhumanism

The concept of a computational consciousness and the potential impact it may have on humanity is a topic of ongoing debate and speculation. While Artificial Intelligence (AI) has made significant advancements in recent years, we have not yet achieved a true computational consciousness that can replicate the complexities of the human mind.

It is true that AI technologies are becoming more sophisticated and capable of performing tasks that were previously exclusive to human intelligence. However, there are fundamental differences between Artificial Intelligence and human consciousness. Human consciousness is not solely based on computation; it encompasses emotions, subjective experiences, self-awareness, and other aspects that are not yet fully understood or replicated in machines.

The arrival of advanced AI systems could certainly have transformative effects on society and our understanding of humanity. It may reshape various aspects of our lives, from how we work and communicate to how we approach healthcare and scientific discoveries. AI can enhance our capabilities and provide valuable tools for solving complex problems.

However, it is important to consider the ethical implications and potential risks associated with the development of AI. Ensuring that AI systems are developed and deployed responsibly, with a focus on fairness, transparency, and accountability, is crucial.
