AI-powered misinformation detectors—artificial intelligence tools that identify false or inaccurate online content—have emerged as a potential intervention for helping internet users understand the veracity of the content they view. However, the algorithms used to create these detectors are experimental and largely untested at the scale necessary to be effective on a social media platform.
If it were up to Larry Ellison, the exorbitantly rich cofounder of software outfit Oracle, all of us would soon be smiling for the camera — constantly. Not for a cheery photograph, but to appease our super-invasive, if not totally omnipresent, algorithmic overseers.
As Business Insider reports, the tech centibillionaire glibly predicts that the wonders of AI will bring about a new paradigm of supercharged surveillance, guaranteeing that the proles — excuse us, “citizens” — all behave and stay in line.
“We’re going to have supervision,” Ellison said this week at an Oracle financial analysts meeting, per BI. “Every police officer is going to be supervised at all times, and if there’s a problem, AI will report that problem and report it to the appropriate person.”
Decoding top quarks with precision: Experiment at Large Hadron Collider reveals how pairs of top quarks are produced
Posted in information science, particle physics
The second ATLAS study, presented recently at the 17th International Workshop on Top Quark Physics, broke new ground by providing the first dedicated ATLAS measurement of how often top-quark pairs are produced along with jets originating from charm quarks (c-jets).
ATLAS physicists analyzed events with one or two leptons (electrons and muons), using a custom flavor-tagging algorithm developed specifically for this study to distinguish c-jets from b-jets and other jets. This algorithm was essential because c-jets are even more challenging to identify than b-jets, as they have shorter lifetimes and produce less distinct signatures in the ATLAS detector.
The study found that most theoretical models provided reasonable agreement with the data, though they generally underpredicted the production rates of c-jets. These results, which for the first time separately determined the cross-sections for single and multiple charm-quark production in top-quark-pair events, highlight the need for refined simulations of these processes to improve future measurements.
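The idea behind flavor tagging can be caricatured with a toy classifier. The sketch below is purely illustrative: the thresholds and observables are invented for this example and bear no relation to the multivariate algorithm ATLAS actually used; it only shows the physical intuition that b-hadrons live longer and are heavier than c-hadrons, so their jets leave longer, more massive displaced vertices.

```python
# Toy flavor tagger: classify jets by displaced-vertex observables.
# Illustrative only -- thresholds are invented, not ATLAS's actual
# algorithm, which relies on multivariate machine-learning methods.

def tag_jet(decay_length_mm, vertex_mass_gev):
    """Crudely classify a jet from two secondary-vertex observables.

    b-hadrons live longer and are heavier than c-hadrons, so b-jets
    tend to have longer, more massive vertices; c-jets sit in between
    b-jets and light-quark jets, which is why they are hard to tag.
    """
    if decay_length_mm > 4.0 and vertex_mass_gev > 2.0:
        return "b-jet"
    if decay_length_mm > 1.0:
        return "c-jet"
    return "light-jet"
```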
Compact ‘Gene Scissors’ Enable Effective Genome Editing, May Offer Future Treatment of High-Cholesterol Gene Defect
Posted in bioengineering, biotech/medical, genetics, information science, robotics/AI
CRISPR-Cas is used broadly in research and medicine to edit, insert, delete or regulate genes in organisms. TnpB is an ancestor of this well-known “gene scissors” but is much smaller and thus easier to transport into cells.
Using protein engineering and AI algorithms, University of Zurich researchers have now enhanced TnpB’s capabilities to make DNA editing more efficient and versatile, paving the way toward treating a genetic defect that causes high cholesterol. The work has been published in Nature Methods.
CRISPR-Cas systems, which consist of protein and RNA components, originated as a natural defense mechanism that bacteria use to fend off invading viruses. Over the last decade, re-engineering these so-called “gene scissors” has revolutionized genetic engineering in science and medicine.
The platform-agnostic algorithm ignores information related to noise in quantum computers to prevent compounding errors.
All proteins are composed of chains of amino acids, which generally fold up into compact globules with specific shapes. The folding process is governed by interactions between the different amino acids—for example, some of them carry electrical charges—so the sequence determines the structure. Because the structure in turn defines a protein’s function, deducing a protein’s structure is vital for understanding many processes in molecular biology, as well as for identifying drug molecules that might bind to and alter a protein’s activity.
Protein structures have traditionally been determined by experimental methods such as x-ray crystallography and electron microscopy. But researchers have long wished to predict a protein’s structure purely from its sequence — in other words, to understand and predict the process of protein folding.
For many years, computational methods such as molecular dynamics simulations struggled with the complexity of that problem. But AlphaFold bypassed the need to simulate the folding process. Instead, the algorithm could be trained to recognize correlations between sequence and structure in known protein structures and then to generalize those relationships to predict unknown structures.
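The core shift described above — learning from known (sequence, structure) pairs and generalizing to new sequences — can be caricatured with a nearest-neighbor lookup. The sketch below is a deliberately crude stand-in: the sequence database and fold labels are invented, and AlphaFold itself uses deep neural networks and evolutionary information, not this similarity rule.

```python
# Toy illustration of "learn from known structures, generalize to new
# sequences": a nearest-neighbor predictor over a tiny invented
# database. AlphaFold uses deep networks and evolutionary couplings;
# this only conveys the sequence-similarity intuition.

KNOWN = {  # invented (sequence, fold-class) pairs, all length 10
    "MKTAYIAKQR": "alpha-helix",
    "GSSGSSGSSG": "disordered",
    "VTVTVTVTVT": "beta-sheet",
}

def similarity(a, b):
    """Fraction of positions at which two equal-length sequences match."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def predict_fold(seq):
    """Predict the fold class from the most similar known sequence."""
    best = max(KNOWN, key=lambda known: similarity(seq, known))
    return KNOWN[best]
```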
Now in Quantum: by Antonio deMarti iOlius, Patricio Fuentes, Román Orús, Pedro M. Crespo, and Josu Etxezarreta Martinez https://doi.org/10.22331/q-2024-10-10-1498
Antonio deMarti iOlius1, Patricio Fuentes2, Román Orús3,4,5, Pedro M. Crespo1, and Josu Etxezarreta Martinez1
1 Department of Basic Sciences, Tecnun — University of Navarra, 20018 San Sebastián, Spain. 2 Photonic Inc., Vancouver, British Columbia, Canada. 3 Multiverse Computing, Pio Baroja 37, 20008 San Sebastián, Spain. 4 Donostia International Physics Center, Paseo Manuel de Lardizabal 4, 20018 San Sebastián, Spain. 5 IKERBASQUE, Basque Foundation for Science, Plaza Euskadi 5, 48009 Bilbao, Spain.
A team of engineers at AI inference technology company BitEnergy AI reports a method to reduce the energy needs of AI applications by 95%. The group has published a paper describing their new technique on the arXiv preprint server.
As AI applications have gone mainstream, their use has risen dramatically, leading to a notable rise in energy needs and costs. LLMs such as ChatGPT require a lot of computing power, which in turn means a lot of electricity is needed to run them.
As just one example, ChatGPT now requires roughly 564 MWh daily, or enough to power 18,000 American homes. As the science continues to advance and such apps become more popular, critics have suggested that AI applications might be using around 100 TWh annually in just a few years, on par with Bitcoin mining operations.
Researchers have developed a new algorithm called L-Mul that could reduce AI energy consumption by up to 95%.
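The paper's details are not reproduced here, but the general family of tricks L-Mul belongs to — replacing floating-point multiplication with cheap integer addition — can be sketched. The snippet below implements the classic Mitchell-style approximation (add the raw IEEE-754 bit patterns, subtract the bit pattern of 1.0); it is not the exact L-Mul algorithm, only a minimal illustration of why addition-based multiplication saves energy.

```python
import struct

# Sketch of addition-based approximate multiplication: for positive
# normal IEEE-754 floats, adding the raw bit patterns and subtracting
# the bit pattern of 1.0 approximates a*b (Mitchell's approximation,
# worst-case relative error about 11%). L-Mul refines this family of
# ideas; the published algorithm differs from this minimal sketch.

ONE_BITS = 0x3F800000  # bit pattern of 1.0 as float32

def bits(x: float) -> int:
    """Reinterpret a float32 as its 32-bit integer pattern."""
    return struct.unpack("<I", struct.pack("<f", x))[0]

def from_bits(b: int) -> float:
    """Reinterpret a 32-bit integer pattern as a float32."""
    return struct.unpack("<f", struct.pack("<I", b & 0xFFFFFFFF))[0]

def approx_mul(a: float, b: float) -> float:
    """Approximate a*b for positive normal floats via one integer add."""
    return from_bits(bits(a) + bits(b) - ONE_BITS)
```

In hardware, an integer adder consumes far less energy than a floating-point multiplier, which is where the claimed savings for the tensor multiplications dominating LLM inference would come from.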
Overcoming ‘catastrophic forgetting’: Algorithm inspired by brain allows neural networks to retain knowledge
Posted in biological, information science, robotics/AI, transportation
Neural networks have a remarkable ability to learn specific tasks, such as identifying handwritten digits. However, these models often experience “catastrophic forgetting” when taught additional tasks: They can successfully learn the new assignments, but “forget” how to complete the original. For many artificial neural networks, like those that guide self-driving cars, learning additional tasks thus requires retraining from scratch.
Biological brains, on the other hand, are remarkably flexible. Humans and animals can easily learn how to play a new game, for instance, without having to re-learn how to walk and talk.
Inspired by the flexibility of human and animal brains, Caltech researchers have now developed a new type of algorithm that enables neural networks to learn continuously from new data without having to start from scratch. The algorithm, called the functionally invariant path (FIP) algorithm, has wide-ranging applications, from improving recommendations on online stores to fine-tuning self-driving cars.
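The FIP algorithm itself is not spelled out in this summary, but the underlying goal — updating a network for a new task while leaving its behavior on old tasks untouched — can be illustrated with a much simpler, related technique: gradient projection for a linear model. The sketch below is an assumption-laden stand-in, not the Caltech method; it projects a weight update onto the null space of the old task's inputs so the old outputs do not move.

```python
import numpy as np

# Related-but-simpler illustration of continual learning: for a linear
# model y = W @ x, project a new-task weight update onto the null
# space of the old task's inputs, so outputs on old inputs are
# preserved exactly. This is NOT the FIP algorithm, which instead
# traces functionally invariant paths through weight space for deep
# networks; it only demonstrates the shared goal.

rng = np.random.default_rng(0)

X_old = rng.normal(size=(3, 8))   # old-task inputs (3 samples, 8 features)
W = rng.normal(size=(2, 8))       # linear model weights: y = W @ x

# Projector onto the subspace orthogonal to the old inputs' span:
# any row of dW @ P is orthogonal to every old input vector.
P = np.eye(8) - X_old.T @ np.linalg.pinv(X_old @ X_old.T) @ X_old

dW = rng.normal(size=(2, 8))      # stand-in for a new-task gradient step
W_new = W + dW @ P                # project the update before applying it

# Old-task outputs are unchanged up to floating-point error.
drift = np.abs(W_new @ X_old.T - W @ X_old.T).max()
```

Because the projected update still has free directions (the null space has dimension 8 − 3 = 5 here), the model can keep learning the new task while the old one stays intact.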