
OK. In scientific terms, it is only a ‘hypothesis’ — the reverse of the ‘Disposable Soma’ theory of ageing. Here is how it goes.

For the past several decades, the Disposable Soma theory of ageing has enjoyed good publicity and lively interest from academics and the public alike. It stands up to scientific scrutiny, makes conceptual sense and fits well within an evolutionary framework of ageing. The theory suggests that, due to energy resource constraints, there is a trade-off between somatic cell and germ cell repair. As a result, germ cells are repaired effectively, assuring the survival of the species, at the cost of individual somatic (bodily) ageing and death. To put it very simply, we are disposable: we age and die because all the effective repair mechanisms have been diverted to our germ cell DNA in order to guarantee the survival of our species.

The theory accounts for many repair pathways and mechanisms converging upon the germ cell, and for many of those mechanisms being driven away from somatic cell repair just to ensure germ cell survival. In the past two or three years, however, it has increasingly been realised that this process is not unidirectional (from soma to germ) but bi-directional: under certain circumstances, somatic cells may initiate damage that affects germ cells, and germ cells may initiate repairs that benefit somatic cells!

I can’t even begin to describe how important this bi-directionality is. Taking this in a wider and more speculative sense, it is, in fact, the basis for the cure of ageing. The discovery that germ cells can (or are forced to) relinquish their repair priorities, and that resources can then be re-allocated for somatic repairs instead, means that we may be able to avoid age-related damage (because this would be repaired with greater fidelity) and, at the same time, avoid overpopulation (as our now damaged genetic material would be unsuitable for reproduction).

Ermolaeva et al. raised the further possibility that DNA damage in germ cells may protect somatic cells. They suggested that DNA injury in germ cells upregulates stress resistance pathways in somatic cells and improves their response to heat or oxidative stress. This is profoundly important because it shows that, in principle, when germ cells are damaged, they produce agents which can then protect somatic cells against systemic stress.

This mechanism may reflect an innate tendency to reverse the trade-offs between germ cell and somatic cell repair: when the germ cells are compromised, there is a delay in offspring production matched by increased repair of somatic cells. In Nature’s ‘eyes’, if the species cannot survive, at least the individual bodies should.

In addition, it was shown that neuronal stress induces apoptosis (orderly cell death) in the germ line. This process is mediated by the IRE-1 factor, an endoplasmic reticulum stress response sensor, which then activates p53 and initiates the apoptotic cascade in the germ line. Therefore, germ cells may die due to a stress response originating from the distantly located neurons.

If this mechanism exists, it is likely that other similar mechanisms also exist, waiting to be described. The consequence could be that neuronal positive stress (i.e. exposure to meaningful information that entices us to act) can affect our longevity by downgrading the importance of germ cell repair in favour of somatic tissue repair. In other words, the disposable soma theory can be seen in reverse: the soma (body) is not necessarily disposable but can survive longer if it becomes indispensable, if it is ‘useful’ to the whole. This, as we claimed last week, can happen through mechanisms which are independent of any artificial biotechnological interventions.

We know that certain events which downgrade reproduction may also cause lifespan extension. Ablation of germ cells in the C. elegans worm leads to an increased lifespan, which shows that signals from the germ line have a direct impact upon somatic cell survival; this may be due to an increased resistance of somatic cells to stress. Somatic intracellular clearance systems are also up-regulated following signals from the germ line.

In addition, protein homoeostasis in somatic cells is well-maintained when germ cells are damaged, and it is significantly downgraded when germ cell function increases. All of the above suggest that when the germ cells are healthy, somatic repair decreases, and when they are not, somatic repair improves as a counter-effect.

In an intriguing paper published last month, Lin et al. showed that under certain circumstances, somatic cells may adopt germ-like characteristics, which may suggest that these somatic cells can also be subjected to germ line protection mechanisms after their transformation. A few days ago Bazley et al. published a paper elucidating the mechanisms of how germ cells may induce somatic cell reprogramming and somatic stem cell pluripotency. This is an additional piece of evidence of the cross-talk mechanisms between soma and germ line, underscoring the fact that the health of somatic tissues depends upon signals from the germ line.

In all, there is sufficient initial evidence to suggest that my line of thinking is quite possibly correct: the disposable soma theory is not unidirectional, and the body may not, after all, always be ‘disposable’. Under certain evolutionary pressures we could experience increased somatic maintenance at the expense of germ cell repairs, and thus reach a situation where the body actually lives longer. I have already discussed that some of these evolutionary pressures could depend upon how well individuals make themselves ‘indispensable’ to the adaptability of the Homo sapiens species within a global techno-cultural environment.

“It’s much easier to replicate experiments and catch fraud if you have access to the original data. Some journals currently reward researchers for sharing the data that they used in an experiment. In the highest level of this new framework, data sharing would not only become compulsory, but independent analysts would conduct the same tests on it as those reported by the researchers, to see whether they get the same results.” Read more


“If goodwill and curiosity aren’t motivating researchers to work with open-source data on their own, there is still something that probably will: human limitation. ‘We have tiny little brains. We can’t understand the big stuff anymore,’ said Paul Cohen, a DARPA program manager in the Information and Innovation Office. ‘Machines will read the literature, machines will build complicated models, because frankly we can’t.’ When all you have to do is let your algorithms loose on a trove of publicly available data, there won’t be any reason not to pull in everything that’s out there.” Read more

In 2014, I submitted my paper “A Universal Approach to Forces” to the journal Foundations of Physics. The 1999 Nobel Laureate, Prof. Gerardus ‘t Hooft, editor of this journal, suggested that I submit this paper to the journal Physics Essays.

My previous 2009 submission to Physics Essays, “Gravitational acceleration without mass and noninertia fields”, had taken 1.5 years to be reviewed and accepted. Therefore, I decided against Prof. Gerardus ‘t Hooft’s recommendation, as I estimated that all six papers (now published as Super Physics for Super Technologies) would take up to 10 years and/or $20,000 to publish in peer-reviewed journals.

Prof. Gerardus ‘t Hooft had brought up something interesting in his 2008 paper “A locally finite model for gravity” that “… absence of matter now no longer guarantees local flatness…” meaning that accelerations can be present in spacetime without the presence of mass. Wow! Isn’t this a precursor to propulsion physics, or the ability to modify spacetime without the use of mass?

As far as I could determine, he did not pursue this from the perspective of propulsion physics. A year earlier, in 2007, I had discovered the massless formula for gravitational acceleration, g=τc^2, published in the Physics Essays paper referred to above. In effect, g=τc^2 was the mathematical solution to Prof. Gerardus ‘t Hooft’s “… absence of matter now no longer guarantees local flatness…”

Prof. Gerardus ‘t Hooft used string theory to arrive at his inference. Could he empirically prove it? No, not with strings. It took a different approach, numerical modeling within the context of Einstein’s Special Theory of Relativity (STR), to derive a mathematical solution to Prof. Gerardus ‘t Hooft’s inference.

In 2013, I attended Dr. Brian Greene’s Gamow Memorial Lecture, held at the University of Colorado Boulder. If I heard him correctly, the number of strings or string states being discovered has been increasing and was now in the 10^500 range.

I find these two encounters telling. While not rigorously proved, I infer that (i) string theories are unable to take us down a path that can be empirically proven, and (ii) they are open-ended, i.e. they can be used to propose any specific set of outcomes based on any specific set of inputs. The problem with this is that you now have to find a theory for why that specific set of inputs. I would have thought that this would be heartbreaking for theoretical physicists.

In 2013, I presented the paper “Empirical Evidence Suggest A Need For A Different Gravitational Theory” at the American Physical Society’s April conference held in Denver, CO. There I met some young physicists and asked them about working on gravity modification. One of them summarized it very well: “Do you want me to commit career suicide?” This explains why many of our young physicists continue to seek employment in the field of string theories where, unfortunately, the hope of empirically testable findings, i.e. winning the Nobel Prize, is next to nothing.

I think string theories are wrong.

Two transformations or contractions are present with motion, Lorentz-FitzGerald Transformation (LFT) in linear motion and Newtonian Gravitational Transformations (NGT) in gravitational fields.
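Of these two, only the Lorentz-FitzGerald Transformation is standard textbook physics, so it is the one I can sketch numerically here (the NGT form is specific to my own work and is not reproduced in this illustration):

```python
import math

c = 2.99792458e8  # speed of light, m/s

def lft_contracted_length(rest_length_m, v):
    """Length measured along the direction of motion at speed v,
    per the Lorentz-FitzGerald Transformation (LFT):
    L = L0 * sqrt(1 - v^2/c^2)."""
    return rest_length_m * math.sqrt(1.0 - (v / c) ** 2)

# A 1 m rod moving at half the speed of light contracts to ~0.866 m:
print(lft_contracted_length(1.0, 0.5 * c))
```

The contraction factor is dimensionless and applies only along the direction of motion, which is why the discussion below concerns what happens to particle shapes and inter-atomic distances along the fall line.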

The fundamental assumption or axiom of strings is that they expand when their energy (velocity) increases. This axiom (let’s name it the Tidal Axiom) appears to have its origins in tidal gravity, attributed to Prof. Roger Penrose: macro bodies elongate as they fall into a gravitational field. To be consistent with NGT, the atoms and elementary particles would contract in the direction of this fall. However, to be consistent with tidal gravity’s elongation, the distances between atoms in this macro body would increase at a rate consistent with the accelerations and velocities experienced by the various parts of the body. That is, as the atoms get flatter, the distances between them get longer. Therefore, for a string to be consistent with LFT and NGT it would have to contract, not expand. One suspects that this Tidal Axiom’s inconsistency with LFT and NGT has led to an explosion of string theories, each trying to explain Nature with no joy. See my peer-reviewed 2013 paper “New Evidence, Conditions, Instruments & Experiments for Gravitational Theories”, published in the Journal of Modern Physics, for more.

The vindication of this contraction is the discovery of the massless formula for gravitational acceleration g=τc^2 using Newtonian Gravitational Transformations (NGT) to contract an elementary particle in a gravitational field. Neither quantum nor string theories have been able to achieve this, as quantum theories require point-like inelastic particles, while strings expand.

What worries me is that it takes about 70 to 100 years for a theory to evolve into commercially viable consumer products. Lasers are a good example. So, if we are tying up our brightest scientific minds with theories that cannot lead to empirical validation, can we be the primary technological superpower 100 years from now?

The massless formula for gravitational acceleration, g=τc^2, shows us that new theories on gravity and force fields will be similar to General Relativity, which is only a gravity theory. The mass source in these new theories will be replaced by field and particle motions, not mass or momentum exchange. See my Journal of Modern Physics paper referred to above on how to approach this, and Super Physics for Super Technologies on how to accomplish this.

Therefore, given that the primary axiom of string theories, the Tidal Axiom, is incorrect, it is vital that we recognize that any mathematical work derived from string theories is invalidated. And given that string theories are particle-based theories, this mathematical work is not transferable to the new relativity-type force field theories.

I forecast that both string and quantum gravity theories will be dead by 2017.

When I was seeking funding for my work, I looked at the Broad Agency Announcements (BAAs) for a category that includes gravity modification or interstellar propulsion. To my surprise, I could not find this category in any of our research organizations, including DARPA, NASA, National Science Foundation (NSF), Air Force Research Lab, Naval Research Lab, Sandia National Lab or the Missile Defense Agency.

So what are we going to do when our young graduates do not want to or cannot be employed in string theory disciplines?

(Originally published in the Huffington Post)

Gravity modification, the scientific term for antigravity, is the ability to modify the gravitational field without the use of mass. Thus legacy physics, the RSQ (Relativity, String & Quantum) theories, cannot deliver either the physics or the technology, as these theories require mass as their field origin.

Ron Kita, who recently received the first US patent (8901943) related to gravity modification in recent history, introduced me to Dr. Takaaki Musha some years ago. Dr. Musha has a distinguished history of researching the Biefeld-Brown effect in Japan, going back to the late 1980s, and has worked for the Ministry of Defense and Honda R&D.

Dr. Musha is currently editing New Frontiers in Space Propulsion (Nova Publishers) expected later this year. He is one of the founders of the International Society for Space Science whose aim is to develop new propulsion systems for interstellar travel.

Wait. What? Honda? Yes. For us Americans, it is unthinkable for General Motors to investigate gravity modification, and yet here was Honda, back in the 1990s, researching this topic.

In recent years Biefeld-Brown has gained some notoriety as an ionic wind effect. I, too, was of this opinion until I read Dr. Musha’s 2008 paper “Explanation of Dynamical Biefeld-Brown Effect from the Standpoint of ZPF field.” Reading this paper I realized how thorough, detailed and meticulous Dr. Musha was. Quoting selected portions from Dr. Musha’s paper:

In 1956, T.T. Brown presented a discovery known as the Biefeld-Brown effect (abbreviated B-B effect) that a sufficiently charged capacitor with dielectrics exhibited unidirectional thrust in the direction of the positive plate.

From the 1st of February until the 1st of March in 1996, the research group of the HONDA R&D Institute conducted experiments to verify the B-B effect with an improved experimental device which rejected the influence of corona discharges and electric wind around the capacitor by setting the capacitor in the insulator oil contained within a metallic vessel … The experimental results measured by the Honda research group are shown …

V. Putz and K. Svozil,

… predicted that the electron experiences an increase in its rest mass under an intense electromagnetic field …

and the equivalent

… formula with respect to the mass shift of the electron under intense electromagnetic field was discovered by P. Milonni …

Dr. Musha concludes his paper with,

… The theoretical analysis result suggests that the impulsive electric field applied to the dielectric material may produce a sufficient artificial gravity to attain velocities comparable to chemical rockets.

Given Honda R&D’s experimental research findings, this is a major step forward for the Biefeld-Brown effect, and Biefeld-Brown is back on the table as a potential propulsion technology.

We learn two lessons.

First, any theoretical analysis of an experimental result is advanced or handicapped by contemporary physics. While the experimental results remain valid, at the time of publication zero point fluctuation (ZPF) was the appropriate theory. However, per Prof. Robert Nemiroff’s stunning 2012 discovery that quantum foam, and thus ZPF, does not exist, the theoretical explanation for the Biefeld-Brown effect needs to be reinvestigated in light of Putz, Svozil and Milonni’s research findings. This is not an easy task, as that part of the foundational legacy physics is now void.

Second, it took decades of Dr. Musha’s own research to correctly advise Honda R&D on how to conduct this type of experimental research with great care and attention to detail. I would advise anyone seriously considering Biefeld-Brown experiments to talk to Dr. Musha first.

Another example of similar lessons relates to the Finnish/Russian Dr. Podkletnov’s gravity shielding spinning superconducting ceramic disc, i.e. an object placed above this spinning disc would lose weight.

I spent years reading and rereading Dr. Podkletnov’s two papers (the 1992 “A Possibility of Gravitational Force Shielding by Bulk YBa2Cu3O7-x Superconductor” and the 1997 “Weak gravitational shielding properties of composite bulk YBa2Cu3O7-x superconductor below 70K under e.m. field”) before I fully understood all the salient observations.

Any theory on Dr. Podkletnov’s experiments must explain four observations: stationary disc weight loss, spinning disc weight loss, weight loss increase along a radial distance, and weight increase. Other than my own work, I haven’t seen anyone else attempt to explain all four observations within the context of the same theoretical analysis. The most likely inference is that legacy physics does not have the tools to explore Podkletnov’s experiments.

But it gets worse.

Interest in Dr. Podkletnov’s work was destroyed by two papers claiming null results: first, Woods et al. (the 2001 “Gravity Modification by High-Temperature Superconductors”), and second, Hathaway et al. (the 2002 “Gravity Modification Experiments Using a Rotating Superconducting Disk and Radio Frequency Fields”). Reading through these papers, it was very clear to me that neither team was able to faithfully reproduce Dr. Podkletnov’s work.

My analysis of Dr. Podkletnov’s papers shows that the disc is electrified and bi-layered: the top side is superconducting and the bottom non-superconducting. Therefore, to get gravity modifying effects, the key to experimental success is that the bottom side needs to be much thicker than the top. Without getting into too much detail, this would introduce asymmetrical field structures, and gravity modifying effects.

The necessary dialog between theoretical explanation and experimental insight is vital to any scientific study. Without this dialog, confounding obstructions arise: the theoretically impossible that nonetheless works in experiments, or the theoretically possible that experiments cannot reproduce. With respect to Biefeld-Brown, Dr. Musha has completed the first iteration of this dialog.

Above all, we cannot be sure what we have discovered is correct until we have tested these discoveries under different circumstances. This is especially true for future propulsion technologies where we cannot depend on legacy physics for guidance, and essentially don’t understand what we are looking for.

In the current RSQ (pronounced risk) theory climate, propulsion physics is not a safe career path to select. I do hope that serious researchers reopen the case for both Biefeld-Brown and Podkletnov experiments, and the National Science Foundation (NSF) leads the way by providing funding to do so.

(Originally published in the Huffington Post)

I first met Dr. Young Bae, NIAC Fellow, at the Defense Advanced Research Projects Agency (DARPA) sponsored 2011 100 Year Starship Study (100YSS) in Orlando, Fla. Many of us who were there had responded to the NASA/DARPA Tactical Technology Office’s RFP to set up an organization “… to develop a viable and sustainable non-governmental organization for persistent, long-term, private-sector investment into the myriad of disciplines needed to make long-distance space travel viable …”

Yes, both DARPA and NASA are at some level interested in interstellar propulsion. Mine was one of approximately 35 (rumored number) teams from around the world vying for this DARPA grant, and Dr. Bae was with a competing team. I presented the paper “Non-Gaussian Photon Probability Distributions”, and Dr. Bae presented “A Sustainable Developmental Pathway of Photon Propulsion towards Interstellar Flight”. These were early days, the ground zero of interstellar propulsion, if you would.

Dr. Bae has been researching Photon Laser Thrust (PLT) for many years. A video of his latest experiment is available at the NASA website or on YouTube. PLT uses light photons to move an object by colliding with (i.e. transferring momentum to) the object. The expectation is that this technology will eventually be used to propel spacecraft. His most recent experiments demonstrate the horizontal movement of a 1-pound weight. This is impressive. I expect to see much more progress in the coming years.
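To get a feel for the magnitudes involved, a single photon’s momentum is p = hf/c = E/c, so a beam of power P pushes with P/c newtons on an absorber and 2P/c on a mirror. The figures below are my own illustrative numbers, not Dr. Bae’s; as I understand it, his PLT recycles each photon over many mirror round trips precisely because a single pass delivers so little force:

```python
h = 6.62607015e-34   # Planck constant, J s
c = 2.99792458e8     # speed of light, m/s

def photon_momentum(frequency_hz):
    """Momentum of one photon, kg m/s: p = h f / c."""
    return h * frequency_hz / c

def beam_thrust(power_watts, reflective=True):
    """Radiation-pressure force from a beam: P/c absorbed, 2P/c reflected."""
    return (2.0 if reflective else 1.0) * power_watts / c

# A 1 kW laser reflecting off a mirror yields only ~6.7 micronewtons:
print(beam_thrust(1000.0))
```

Lifting a 1-pound (~4.45 N) weight with a single pass would therefore need hundreds of megawatts of light, which is why momentum recycling, or the horizontal low-friction arrangement used in the experiments, matters so much.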

At one level, Dr. Bae’s experiments are confirmation that Bill Nye’s Light Sail (which very unfortunately lost communications with Earth) will work.

At another level, one wonders why or how the photon, a particle without mass, has momentum that is proportional to the photon’s frequency or energy, a momentum that is observable in Dr. Bae’s and other experiments. This is not a question that contemporary physics asks, even though Einstein was that meticulous when he derived the Lorentz-FitzGerald Transformations (LFT) from first principles for his Special Theory of Relativity (STR). Therefore, if you think about it, and if we dare to ask the sacrilegious question: does this mean that momentum is an elementary property of particles that appears to be related to mass? What would we discover if we could answer why momentum exists in both massive and massless particles? Sure, the shortcut ‘don’t bother me’ answer is mass-energy equivalence. But why?

At the other end of photon momentum based research is the EmDrive, invented by Roger Shawyer. He clearly states that the EmDrive is due to momentum exchange and not due to “quantum vacuum plasma effects”. To vindicate his claims, Boeing has received all of his EmDrive designs and test data. This is not something that Boeing does lightly.

In this 2014 video a member of NASA’s Eagleworks explains that the EmDrive (renamed q-thruster) pushes against the quantum vacuum, the froth of particle and antiparticle pairs in a vacuum. This raises the question: how can you push against one type and not the other? In 2011, using NASA’s Fermi Gamma-ray Space Telescope photographs, Prof. Robert Nemiroff of Michigan Technological University made the stunning discovery that this quantum foam of particle and antiparticle pairs in a vacuum does not exist. Unfortunately, this means that the NASA Eagleworks explanation clearly cannot be correct.

So how does the EmDrive work?

In my 2012 book An Introduction to Gravity Modification, I explained the importance of asymmetrical fields and designs for creating propellantless engines. For example, given a particle in a gravitational field, and with respect to this field’s planetary mass source, the particle will observe an asymmetrical gravitational field: its near side will experience a stronger field than its far side, and thus the particle moves towards the planetary mass. Granted, this difference is tiny, but it is not zero. This was how I was able to determine the massless formula for gravitational acceleration, g=τc^2, where tau (τ) is the change in the time dilation transformation (the dimensionless LFT) divided by that distance. The error in the modeled gravitational acceleration is less than 6 parts per million, thus validating the asymmetrical approach.
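As a numerical sanity check of the formula (my own illustrative reconstruction, using the standard weak-field time dilation factor to generate τ; the full derivation is in the book): taking τ as the change in the dimensionless dilation factor per metre at the Earth’s surface and multiplying by c² reproduces the familiar Newtonian 9.8 m/s².

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24         # Earth's mass, kg (used only to generate the dilation field)
c = 2.99792458e8     # speed of light, m/s
r = 6.371e6          # Earth's surface radius, m

def dilation(radius):
    """Weak-field (Schwarzschild) time dilation factor at a given radius."""
    return math.sqrt(1.0 - 2.0 * G * M / (radius * c ** 2))

def tau(radius, dr=1.0):
    """Change in the dimensionless dilation factor per metre of radial distance.
    The raw difference is ~1e-16, at the edge of double precision, so it is
    computed in a cancellation-safe form:
    sqrt(a) - sqrt(b) = (a - b) / (sqrt(a) + sqrt(b))."""
    a_minus_b = (2.0 * G * M / c ** 2) * (1.0 / radius - 1.0 / (radius + dr))
    return (a_minus_b / (dilation(radius + dr) + dilation(radius))) / dr

g_tau = tau(r) * c ** 2      # g = tau * c^2
g_newton = G * M / r ** 2    # Newtonian benchmark

print(g_tau, g_newton)       # both ~9.82 m/s^2
```

The agreement with GM/r² at this level is expected, since in the weak field the dilation gradient reduces to GM/(r²c²); the interesting claim in the text is that the formula needs no mass term once τ is known.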

In very basic terms, Shawyer’s New Scientist paper suggests that it is the conical shape of the EmDrive that causes microwave photons to exhibit asymmetrical momentum exchange. The side of the conical structure with the larger cross section has more momentum exchange than the side with the smaller cross section. The difference in this momentum exchange is evidenced as a force.

However, as Dr. Bae points out, from the perspective of legacy physics, conservation of momentum is broken. If not broken, then there are no net forces; if broken, then one observes a net force. Dr. Beckwith (Prof., Chongqing University, China) confirms that Dr. Bae is correct, but the question that needs to be addressed is: could there be any additional effects which would lead to momentum conservation being violated, or apparently violated?

To be meticulous, since energy can be transmuted into many different forms, we can ask another sacrilegious question: can momentum be converted into something else, a wave function attribute for example, in a reversible manner? After all, the massless photon’s momentum is directly proportional to its frequency. We don’t know. We have neither the theoretical nor the experimental basis for answering this question in the positive or the negative. Note that this is not the same as a perpetual motion machine, as conservation laws still hold.

Shawyer’s work could be confirmation of these additional effects, asymmetrical properties and momentum-wave-function-attribute interchangeability. If so, the future of propulsion technologies lies in photon based propulsion.

Given that Shawyer’s video demonstrates a moving EmDrive, the really interesting question is, can we apply this model to light photons? Or for that matter, any other type of photons, radio, infrared, light, ultraviolet and X-Rays?

(Originally published in the Huffington Post)


“The evidence is incontrovertible that recent extinction rates are unprecedented in human history and highly unusual in Earth’s history. Our analysis emphasizes that our global society has started to destroy species of other organisms at an accelerating rate, initiating a mass extinction episode unparalleled for 65 million years. If the currently elevated extinction pace is allowed to continue, humans will soon (in as little as three human lifetimes) be deprived of many biodiversity benefits.”

Read more