
Archive for the ‘computing’ category: Page 286

Mar 19, 2022

Clockwork DevTerm R-01 Takes RISC-V Out For A Spin

Posted by in categories: computing, education

If you’re anything like us you’ve been keeping a close eye on the development of RISC-V: an open standard instruction set architecture (ISA) that’s been threatening to change the computing status quo for what seems like forever. From its humble beginnings as a teaching tool in Berkeley’s Parallel Computing Lab in 2010, it’s popped up in various development boards and gadgets from time to time. It even showed up in the 2019 Hackaday Supercon badge, albeit in FPGA form. But getting your hands on an actual RISC-V computer has been another story entirely. Until now, that is.
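As a small illustration of what an open ISA means in practice: the RV32I instruction formats are published in the freely available RISC-V specification, so anyone can encode instructions by hand. The sketch below is our own illustration (not anything from Clockwork), hand-assembling a single `addi` instruction following the documented I-type field layout.

```python
# Minimal illustration: hand-encoding a RISC-V RV32I I-type instruction.
# The field layout (imm[11:0] | rs1 | funct3 | rd | opcode) comes from the
# publicly available RISC-V ISA specification.

def encode_addi(rd, rs1, imm):
    """Encode 'addi rd, rs1, imm' as a 32-bit RV32I instruction word."""
    assert -2048 <= imm <= 2047, "I-type immediate is 12 bits, sign-extended"
    opcode = 0b0010011   # OP-IMM major opcode
    funct3 = 0b000       # ADDI
    return ((imm & 0xFFF) << 20) | (rs1 << 15) | (funct3 << 12) | (rd << 7) | opcode

# addi a0, a0, 1  (a0 is register x10) -> 0x00150513
print(hex(encode_addi(rd=10, rs1=10, imm=1)))
```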

Clockwork has recently announced the availability of the DevTerm R-01, a variant of their existing portable computer that’s powered by a RISC-V module rather than the ARM chips featured in the earlier A04 and A06 models. Interestingly, the newest member of the family is actually the cheapest at $239 USD, though it’s worth noting that this new model includes only 1 GB of RAM, and the product page makes it clear that the RISC-V version is intended for experienced penguin wranglers who aren’t afraid of the occasional bug.

Beyond the RISC-V CPU and slimmed-down main memory, this is the same DevTerm that our very own [Donald Papp] reviewed earlier this month. Thanks to the modular nature of the portable machine, this sort of component swapping is a breeze, though frankly we’re impressed that the Clockwork team is willing to go out on such a limb this early in the product’s life. In our first look at the device we figured at best they would release an updated CPU board to accommodate the Raspberry Pi 4 Compute Module, but supporting a whole new architecture is a considerably bolder move. One wonders what other plans they may have for the retro-futuristic machine. Perhaps a low-power x86 chip isn’t out of the question?

Mar 18, 2022

Human brain organoids grown in cheap 3D-printed bioreactor

Posted by in categories: computing, neuroscience

Circa 2021


It is now possible to grow and culture human brain tissue in a device that costs little more than a cup of coffee. With a $5 washable and reusable microchip, scientists can watch self-organising brain samples, known as brain organoids, growing in real time under a microscope.

The device, dubbed a “microfluidic bioreactor”, is a 4-by-6-centimetre chip that includes small wells in which the brain organoids grow. Each is filled with nutrient-rich fluid that is pumped in and out automatically, like the fluids that flush through the human brain.

Continue reading “Human brain organoids grown in cheap 3D-printed bioreactor” »

Mar 18, 2022

Future evolution: from looks to brains and personality, how will humans change in the next 10,000 years?

Posted by in categories: biotech/medical, computing, food, genetics, information science, mobile phones, neuroscience

And going forward, we’ll do this with far more knowledge of what we’re doing, and more control over the genes of our progeny. We can already screen ourselves and embryos for genetic diseases. We could potentially choose embryos for desirable genes, as we do with crops. Direct editing of the DNA of a human embryo has been proven to be possible — but seems morally abhorrent, effectively turning children into subjects of medical experimentation. And yet, if such technologies were proven safe, I could imagine a future where you’d be a bad parent not to give your children the best genes possible.

Computers also provide an entirely new selective pressure. As more and more matches are made on smartphones, we are delegating decisions about what the next generation looks like to computer algorithms, which recommend our potential matches. Digital code now helps choose what genetic code is passed on to future generations, just as it shapes what you stream or buy online. This might sound like dark science fiction, but it’s already happening. Our genes are being curated by computer, just like our playlists. It’s hard to know where this leads, but I wonder if it’s entirely wise to turn over the future of our species to iPhones, the internet and the companies behind them.

Discussions of human evolution are usually backward looking, as if the greatest triumphs and challenges were in the distant past. But as technology and culture enter a period of accelerating change, our genes will too. Arguably, the most interesting parts of evolution aren’t life’s origins, dinosaurs, or Neanderthals, but what’s happening right now, our present – and our future.

Mar 18, 2022

How Graphene will Save Moore’s Law

Posted by in categories: computing, materials

While many say that Moore’s Law is dead, scientists are hard at work discovering new semiconductor materials that could keep increasing CPU and GPU performance well into the 2030s, right on track with Moore’s Law’s exponential trend. Companies such as TSMC and Intel could use graphene to make the smallest possible transistors and greatly improve their efficiency as electricity prices skyrocket. 2 nm or 1 nm processors might soon follow.
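To make that exponential trend concrete: Moore’s observation is often stated as transistor counts doubling roughly every two years. The back-of-the-envelope sketch below is our own illustration; the starting count and doubling period are assumptions, not figures from the video.

```python
# Back-of-the-envelope Moore's Law projection.
# Assumptions (illustrative only): ~50 billion transistors on a large chip
# in 2022, and a doubling period of 2 years.

start_year = 2022
start_transistors = 50e9
doubling_period_years = 2

for year in range(2022, 2041, 4):
    doublings = (year - start_year) / doubling_period_years
    count = start_transistors * 2 ** doublings
    print(f"{year}: ~{count / 1e9:,.0f} billion transistors")
```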

TIMESTAMPS:
00:00 The Revival of Moore’s Law.
01:15 Smallest Transistor ever made.
03:54 What actually are transistors?
05:49 Moore’s Law Is Dead?
07:55 Last Words.

#cpu #mooreslaw #graphene

Mar 18, 2022

This Diamond Transistor is Still Raw, But Its Future Looks Bright

Posted by in categories: computing, cosmology, quantum physics

Researchers in Japan have developed a diamond FET with high hole mobility.


In the 1970s, Stephen Hawking found that an isolated black hole would emit radiation, but only when quantum mechanics is taken into account. This is known as black hole evaporation, because the black hole shrinks as it radiates. However, this result led to the black hole information paradox.

If a black hole evaporates entirely, the physical information that fell into it would permanently disappear. That violates a core precept of quantum physics: information cannot vanish from the Universe.

Continue reading “This Diamond Transistor is Still Raw, But Its Future Looks Bright” »

Mar 18, 2022

The coming decade of digital brain research — A vision for neuroscience at the intersection of technology and computing

Posted by in categories: biotech/medical, computing, neuroscience

Brain research has in recent years indisputably entered a new epoch, driven by substantial methodological advances and digitally enabled data integration and modeling at multiple scales – from molecules to the whole system. Major advances are emerging at the intersection of neuroscience with technology and computing. This new science of the brain integrates high-quality basic research, systematic data integration across multiple scales, a new culture of large-scale collaboration and translation into applications. A systematic approach, as pioneered in Europe’s Human Brain Project (HBP), will be essential in meeting the pressing medical and technological challenges of the coming decade.

Mar 17, 2022

Quantum Computing Breakthrough: Scientists Sent the First ‘Landline’ Message

Posted by in categories: computing, quantum physics

Mar 17, 2022

Exocortex: Thought this might be of some interest

Posted by in categories: biological, computing, neuroscience, transhumanism

An exocortex is an external information processing system that augments the brain’s biological high-level cognitive processes.

An individual’s exocortex would be composed of external memory modules, processors, I/O devices and software systems that would interact with, and augment, a person’s biological brain. Typically this interaction is described as being conducted through a direct brain-computer interface, making these extensions functionally part of the individual’s mind.

Individuals with significant exocortices can be classified as transhuman beings.
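As a purely conceptual sketch of the “external memory module” idea above: one can picture it as a key-value store that the rest of the system queries through an interface layer. Everything below is hypothetical; no real brain-computer interface exposes an API like this.

```python
# Conceptual toy model of an exocortex "external memory module".
# Entirely hypothetical illustration of the store/recall idea in the text,
# not an implementation of any real brain-computer interface.

class ExternalMemoryModule:
    def __init__(self):
        self._store = {}

    def store(self, cue, content):
        """Associate a retrieval cue with some content."""
        self._store[cue] = content

    def recall(self, cue):
        """Return the content associated with a cue, if any."""
        return self._store.get(cue)

# Usage: the "interface" here is just ordinary function calls.
memory = ExternalMemoryModule()
memory.store("meeting", "Project review at 14:00 on Thursday")
print(memory.recall("meeting"))
```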

Mar 17, 2022

An indium oxide-based transistor created using atomic layer deposition

Posted by in categories: computing, mobile phones, solar power, sustainability

Over the past decades, engineers have created increasingly advanced and high-performing integrated circuits (ICs). The rising performance of these circuits has in turn increased the speed and efficiency of the technology we use every day, including computers, smartphones and other smart devices.

To continue to improve the performance of integrated circuits in the future, engineers will need to create thinner transistors with shorter channels. Down-scaling existing silicon-based devices or creating smaller devices using alternative semiconducting materials that are compatible with existing fabrication processes, however, has proved to be challenging.

Researchers at Purdue University have recently developed new transistors based on indium oxide, a semiconductor that is often used to create touch screens, flatscreen TVs and solar panels. These transistors, introduced in a paper published in Nature Electronics, were fabricated using atomic layer deposition, a process that is often employed by transistor and electronics manufacturers.
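Atomic layer deposition builds films one self-limiting cycle at a time, so film thickness scales almost linearly with cycle count. The arithmetic sketch below uses an illustrative growth-per-cycle figure of the typical order of magnitude for ALD oxides, not the value reported in the Purdue paper.

```python
# Illustrative ALD thickness estimate: thickness ~= cycles * growth_per_cycle.
# The growth-per-cycle value below is an assumed, typical order of magnitude
# for ALD oxides, not the figure from the Nature Electronics paper.

growth_per_cycle_nm = 0.1   # assumed ~1 angstrom deposited per cycle
for cycles in (10, 20, 50):
    print(f"{cycles} cycles -> ~{cycles * growth_per_cycle_nm:.1f} nm film")
```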

Mar 17, 2022

Bringing practical applications of quantum computing closer

Posted by in categories: computing, quantum physics

Two Amazon papers at #QIP2022 could have near-term applications: Mario Berta and colleagues propose a new approach to statistical phase estimation that could en…


New phase estimation technique reduces qubit count, while learning framework enables characterization of noisy quantum systems.
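The announcement does not spell out the new technique, but the general idea of statistical phase estimation can be illustrated with a textbook toy model: in a Hadamard-test-style experiment, the probability of measuring 0 is (1 + cos φ)/2, so repeated shots let you estimate φ. The sketch below is a classical simulation of that generic idea, not the method from the Amazon papers.

```python
# Toy classical simulation of statistical phase estimation (textbook version,
# not the approach in the QIP 2022 papers): sample Hadamard-test outcomes
# whose probability of measuring 0 is (1 + cos(phi)) / 2, then invert to
# estimate phi from the observed frequency.

import math
import random

def estimate_phase(true_phi, shots=10_000):
    p0 = (1 + math.cos(true_phi)) / 2
    zeros = sum(1 for _ in range(shots) if random.random() < p0)
    p0_hat = zeros / shots
    # Invert p0 = (1 + cos(phi)) / 2; clamp to guard against sampling noise.
    return math.acos(max(-1.0, min(1.0, 2 * p0_hat - 1)))

random.seed(0)
print(estimate_phase(true_phi=0.7))   # should land close to 0.7
```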