The Ontology of Motion (Omni Bust Out)
The Ontology of Motion, 2014, unedited [originally 0018 – synchronization and reification, 2010]
Abstract
This is an essay driven by motion, and it seeks to render everything and anything comprehensible in terms of motion and the variances between possible pluralities of motions. Motion, I hope to show, is an a priori background for reality—not space and “time” as Kant brilliantly postulated—and it is from the degrees and directions of motion that things appear more synthetically unified and unifiable or, alternatively, analytically separate and non-interactive. As many philosophical works tend to do, this essay deals with the ideas of multiplicity (parts, i.e. particular motions) and singularity (wholes, id est holistic motion), but in so doing it tries to explain ontologically why there appear, and actually can exist, both multiplicity and singularity. In order to do this, theories will be put forth, arising both out of critiques of concepts from physics and out of thought experiments that will serve as examples commonly experienced in our times.
Ending Beginnings
If motion is to be given the eminent role as the determinant of everything—motion preceding space, motion before matter, motion as the forwardness of time, motion as change—then beginning with the question of beginnings sets a trap for any further thoughts on the pre-eminence of motion. It is a trap precisely because if motion is to have an origin, it will be made, at best, on par with the thing that is its beginning point; never greater, and always potentially reducible to it. So we must move beyond and before beginnings, thinking of apparent stoppages and beginning points as only temporary statics.
Importantly, within the questioning of beginnings, there is a salient sub-question that must be no longer made subordinate, but be given its autonomy to live or die on its own accord. The question “in the beginning was there one or multiple?” must now be detached from beginnings and become the open questions “is and will there be a one or a multiple?” and “can reality shift between these two possibilities?” These new questions need not assume that if there has been an eternity of motion it follows necessarily that it is due to a multiple play of forces (to the exclusion of it possibly being the play of a singular force), nor does it assume that one of these two modalities being real necessarily precludes the other from becoming real. With this new questioning, we now have the open possibility of interchange between a singular force and a multiplicity of forces, and with this comes the Nietzschean ethical question: what is the value that unity and separateness provide to the future, and which of these better cultivates dynamic vitality?
The ideas of “the past” and “causality” are symptomatic of the same degenerative logic and anti-force that produces a quest for a beginning, and leaves motion in even more of a conceptual straitjacket. When motion is used to satisfy the formulaic needs of a mental practitioner, a greater understanding of motion is shadowed whilst light is falsely reflected to create the hollow-grams of past and causality. However, once we dismiss motion’s conceptually reified constraints in our own thinking, we can use an unchained understanding of motion to explain why there seems to be a past.a Causality, too, can be endangered when motion is wrested away from ensnarement, and is not stuck as the force for a series of events causing one another.
Energy and Maneuverability**
**[this section was drawn upon for the bulk of this 2015 post about Energy and the living Universe]
A large part of reconceiving motion is to imagine it as not solely existing in dimensional space, but as manifesting itself in changes in energy amounts, which cause changes in the very texture of the space that lies available for our maneuvering. Because of this, energy requires a new understanding, one which physics has been ambiguous about giving. It is on and between this platform that this section will dwell, revisiting the salient points on which much of physics bases itself.
Galilean and Newtonian relativity, simply stated, show us that the spatial motion of one body is only relative to other spatial bodies, and there is no way of determining if something is “absolutely” at rest or in motion (there is no absolute background). Einstein’s relativity has elaborated the picture, showing us that space bends due to the motion of bodies, or, alternatively, space bends to determine the motion of bodies, bending more or less based on the mass of the body in motion.
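For readers who want the textbook formulas behind these two pictures, here is a minimal sketch (standard physics offered as background, not part of the essay's own argument). In Galilean relativity, two inertial frames in relative motion at velocity v are related by

\[ x' = x - vt, \qquad t' = t, \]

so only relative velocities are meaningful; special relativity replaces this with the Lorentz transformation

\[ x' = \gamma (x - vt), \qquad t' = \gamma \left( t - \frac{vx}{c^2} \right), \qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}}, \]

which still admits no absolute background but now mixes space and time into one another.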
One of the famous thought experiments to rise out of Einstein’s relativity, popularized in the form of the “twins paradox,” shows how the speed at which something moves affects its spatial and temporal situation relative to other entities moving at different speeds. The experiment shows that if one twin stays on earth (the earth twin) and the other (the astronaut twin) is sped away from the earth at a speed close to that of light, two light-years away from earth and two light-years back to earth, the astronaut twin will come back almost completely un-aged, whilst the earth twin will be over four years older. This is because the motion—physicists call it the “time dimension”1 of spacetime—internal to the astronaut twin is slowed down when things are moving so fast relative to the speed of light. However, if we bring back the arguments of Galilean and Newtonian relativity, we can say that it cannot be the speed that is slowing down the aging process of the astronaut twin; as we have learned, it could be equally said that the earth and the solar system are speeding away and then towards the astronaut twin at close to the speed of light, rather than s/he being the one who is necessarily speeding away. Something must be happening beyond basic spatial motion to change the inertial frame of reference—the name physicists allocate to a particular stable and synchronized region of spacetime—of either the astronaut twin, the earth, or both, so that processes normally at a common rhythm (synchronization) with one another now vary greatly.
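As a rough check on the numbers in this thought experiment, using only the standard textbook time-dilation relation (the essay's energetic reinterpretation follows below): the earth twin measures a round trip of about

\[ \Delta t = \frac{4 \ \text{light-years}}{v} \approx 4 \ \text{years for } v \approx c, \]

while the astronaut twin's own clock records

\[ \Delta \tau = \Delta t \sqrt{1 - v^2/c^2}. \]

At, say, v = 0.99c this works out to roughly 4.04 × 0.14 ≈ 0.57 years, which matches the claim that the traveler returns barely aged while the stay-at-home twin is over four years older.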
It is a matter of energy, not speed at all, that changes the layout of the spatial region in question. Energy is more than just “the ability to do work” that moves something through space; energy is all space and matter, and its relationship with and within itself. The diffusing outward of energy is one and the same as the spreading out of space. On the other hand, the bundling together of increasing energy in a given area is experienced as the collapse of spatial differentiations—an intensification and simplification. The more intense energy is, the greater the gravitational effect, explaining the slowing of change for the astronaut twin. The energy inerted into the spaceship’s inertial frame is so great that the ship will barely change, relative to the aging taking place in the less energetic galaxy from which the spaceship breaks. This creates a strong energetic division, putting the spaceship out of sync with the galaxy, making a multiplicity where there once was (and again may be) a holistic singularity with common levels of energy dispersion.
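For comparison, the standard general-relativistic expression for how concentrated mass-energy slows local clocks, quoted here only as a reference point for the claim above and not as the essay's own derivation, is, for a static observer at radius r outside a mass M,

\[ d\tau = dt \sqrt{1 - \frac{2GM}{rc^2}}, \]

so the more energy bound up in a region, the slower its internal processes run relative to a distant observer.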
Just as objects with higher gravity dominate the maneuverability of the surrounding space, distorting it in a limiting fashion, so too does the energetically accelerated body that cuts through large swaths of space, tearing it and unmaking what has been creatively fashioned out of energy dispersal. A bullet in motion is always breaking, never growing, sending out shards of anti-harmonic sound, breaking organic vibration rhythms, collecting metallic-sameness unto itself. Its relationship with the surrounding space is surely a relationship between two multiplicities. Direction does not play a role in determining whether something is at odds with another entity, as elements with differing directions can still be part of a singularity. One of the large finitude of examples is a singular organism with separate cells moving each and every way. Multiplicities are derived when differing directions within a singularity start to greedily pull in more energy, leading to an inability to synchronize with the rest of space, and a disregard for doing so, simplifying the spatial field by cutting into it. This is the bullet that has too much accelerated energy to creatively interact with its environment; it is a moving wall that ensures its dominance over the less energetic surroundings.b
Gravity and Energy. It is our own conceptual bias, formed out of a cultural analysis of our own experience, that a particle moving with a certain energy is somehow different from a singularized black hole exerting that same energy amount. Newtonian relativity would show us that each is moving according to the other, and Einstein’s relativity would show us that they are having the same effect on the space in their proximity; the difference is only in the way the surrounding space orients itself to the presence of this gravitational (i.e. energetic) “beast,” by either swooping by it like an undedicated comet or revolving around it like a faithful planet. The differences are in the spatial orientation to the two forces—in the choices the local spatiality makes towards the forces—but not in the internal singularized unity that is a black hole or a super fast particle.
Curving out Straightness
The major characteristic of an acceleration is the relative straighteningd of the path that the accelerating body follows. With this higher energy, there is less of an ability, and indeed less space available overall (from its own viewpoint), for curving away from the straight path the body is set upon—think of how much more distance it takes for a fast-moving car to make a full 90-degree turn as it exits a turnpike than for a person walking (this generality is what we experience as the centrifugal force). Sidewalks have right-angle changes in direction, while turnpikes run straight with as little turning as possible. When turning is necessary, it is very slow. The faster the road is, the more space it takes up, and the less complex (curvaceous) it is. If I have explained this correctly, you should have a sense that curving (decelerating) appreciates time while straightness depreciates it; curving creates more space while straightness destroys it. When avoiding the accelerated beast, we cannot outrun it, we can only curve to stay alive!
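A back-of-the-envelope check of the turnpike example, using only the usual centripetal relation (the numbers below are illustrative assumptions, not measurements): for a turn taken at speed v with a maximum comfortable sideways acceleration a, the minimum turning radius is

\[ r = \frac{v^2}{a}. \]

With a ≈ 3 m/s², a walker at 1.5 m/s can turn within about 0.75 m, while a car at 30 m/s needs about 300 m; the space consumed by turning grows with the square of the speed.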
Organi-city. A down-in-earth example to chew on, conveying this idea of straightness and curving, is that of organic food production. In our age, organic produce often takes quite a journey from original growth to its final destination. So although it is grown in a non-chemically intrusive way, the holistic organic process is quickly dissipated when it must be shipped in a straight line, using inorganic additives for its carrier truck or ship. This line can run several thousand miles, cutting through so much organic energy and disturbing continuous space, yet interacting very little and bearing no fruit along all its thoroughfares. The road is the toll. For holistic organi-city, all we must do is look at ourselves as an organism. Our cells always interact and curve this way and that in spiraling cycles of interactivity. None are so greedy as to make straight lines,j disconnecting connected bands of cells to do so; they help what they are near and get help from what is near to them.
Constancy enervates and depresses, while change curves lips to smiles and is the basis for laughter and joy. I hope that the next section’s ontology is so unconstant and unnerving that you are affected in ways I can’t enclose or predict.
Presentism With A Twist***
***[this section is an unofficial prelude to the Uinverse Series: Part 1; Part 2; and Part 3]
The present is haunted by something—a harrowing, lifeless thing. This thing shades real doubt into our minds as to whether we have any control over the course of our lives, or are just determined by forces as unalive as we must then be. Do you think it is the past, which is commonly thought to control us? No, only through analysis—the tool of this thing—do we get to the conclusion that the past is a causal force, making the present a mere effect in the chain.s It is the power of this ominousity [an ominous thing] that wards us back and away from the real danger, making us react to a phantom that is its effect. This is the malicious method by which it unmakes us, by which it unmakes the present—it is the future!
Among other things, the future is gravity, and it wants to pull us as directly as it can into a predetermined, collapsible logical structure. The future is perfectly efficient, forever wanting to get rid of all the mess it sees through its high-focus lens, as it stares at all the subtleties and free-willing curves and ceaselessly thinks of reifying them. The future is the uinverse of the universe, and if left to its devices it will bring us all the way down to nothing. If we do nothing, we get (to) nothing, letting the forces of determination bring us back to the void that they are teleologically set upon. The future will always be there to decree new rules, “henceforth this, henceforth that,” pushing the present up to a wall, using it as a mirror to view and affirm its reflection in an image named the pastrror [neologism of past + mirror], its self-affirming reflection being the past. We (the present) must not be a mirror, we must become the dominating force in the duel, escaping the dictates that gravity thrusts on us and dealing it nu blows from every possible angle that its position of eternal limit has kept it from seeing. We must twist (henceforth curve) a-way to greener pastures, to live life by adding new colors to that which already exists. We should never assume life’s underpinning existence, but defend its fragility by continuing to grow it, in new ways not contained in the future, but curving out of the present.
Math/gravity/future/science don’t disenchant the world because they reveal and discover new knowledge, they disenchant it by narrowing the ways in which we are able to continue and deepen its enchantment. As we curve we frustrate the future, making it revise its plan to destroy existence, and it reacts immediatelyj with a new structure. The future is the condition of the present, and the past does not exist. In analogy to Kant’s deontological notion that “when justice leaves this earth, it is no longer worth living upon,” we can say ontologically that when lifeq stops breathing freedom, it is no longer possible for the world to exist.
Parting, with words****
****[this penultimate section “Parting, with words” (a pun intended) and the final “Syncopacing not synchronization” become too much of an outline and are being omitted from this 2016 posting; document 0018 – synchronization and reification will have these sections and will be included in the next Urge to Purge, Batch 10 (#4), coming soon]
a Continuous motion should constantly be erasing a past, but somehow something seemingly continues to linger on long enough for us to get the feeling that there is a past, and what follows is the impetus to then search for beginnings. This is the role that memory plays, as it is our connection with what wasn’t erased “from the past,” though not existing in the past, just an estranged part of the present. When memory exists, in our brains or in the larger world, the reason is that the motional totality is not actively synchronizing with itself (and is then not a totality), not entirely engaged in creating newness in the present to beget a fully different future, so the memorable is what’s left over, separated from the more active motion.
1 The debate over whether time is something extended, like the normal understanding of a dimension, and pre-existing to our perception of it, or rather just an imposed form on what is essentially the motion of matter (and space), is comparable to the debate in the philosophy of time between eternalism and presentism. Eternalism, as I understand it, is what most Einsteinian physicists believe is true, except in addition to the past always existing (what eternalism proper sets down), for the physicists the future does too, and simply has to be “moved into.” Presentism contends that there is no time “dimension,” that the future has to be created out of the present, without any metaphysical or extra-spatial access to “the past.” Both of these philosophically different understandings of reality can be made to agree with all the empirical findings of Einsteinian relativity, though this essay contends that presentism, after a consideration of other arguments, is the actual ontological situation that reality exists in (see the section “Presentism With A Twist”). Thinking of time as extended is just another conceptual mistake that occurs when misconceiving motion.
b However, it is the less energetic surroundings that have more curvability to creatively limit the bullet’s negative impact on their synchronizing differentiating culture. Why this is so will hopefully be explained next.
d Straightness needn’t always be an acceleration, and an acceleration needn’t necessarily be straight. As for the first, consider a straightness that neither attracts more energy nor multiplies into complex curving away. This straightness in its constancy is a sameness in direction that adds no value to present experience. It is a predictability that fails to wow the rest of that with which it shares the same energy. It doesn’t cut up, but it doesn’t inspiral (inspire). It does face a problem, though: at some point it will reach the outer boundaries of the particular immanency that has founded it, and it will have to curve, becoming something different, or remain true to its nature and leave the immanency straight behind. In so doing, it enters another singularity (or creates its own), but it enters with a different energization than this singularity because it keeps the energy from its former immanency. It is now cutting through, threatening the inner integrity of this other energization, making it regard the whole of the first immanency, the producer of such a straight part, with anger and avoidance. This is the loud scream of one person (the first immanency), leaving their body and entering straight through the ear of another person, shocking their system with dischord and throwing off any syncopation that might have formerly existed. This violent internal consistency puts off the person, and makes them regard not just the violent sound but the whole of its producer, the other person, with anger and avoidance.
This is the car that goes straight on the road, having little dynamic interaction along the way. The person who repeatedly commutes, cutting straight through a bunch of commodified materials (the highway, the box stores). The person can’t remember one day from another; there is no memory because there was little or no change. Nothing dynamic, just a straight cut, at the expense of experience, the whole purpose of existing. The obsession with getting “to” a place, but never actually making something of the place that one is at. The straight cut to a new destination is what makes the first place so undesirable that it repels to begin with.
j The cells that do make the longest journeys—the blood cells—are the closest thing we have to a straight-line economy such as ours today, and they are in many ways making the most simple and direct contact with the outside world, bringing the undifferentiated element oxygen to cells.
s One of the most noteworthy social phenomena rising out of this presumption that the past determines the present is that of ageism. Those who exist deeper into the past are thought to have a relatively higher ratio of being “causal” to being an “effect,” or at least thought of as part of the cause, viewed as an under-girding pillar. This undue privileging of the elderly is what brings us into a situation where “the traditions of the dead weigh like a nightmare on the living” as Karl Marx put it. This is because eternalism is the philosophy newly born with civilization—it is the mental framework and furniture that civilizes and steadies what was a dynamic nomadic mind. The mind devolves to become a thing that can only work within a limited range of experience and makes things fit into its realm of predictability. Youth are born ready to deal with the unpredictable but are quickly chained into habit, and their value to the living universe recedes as they age, but their value to civilization grows as they become expert idiots.
j[2] The future is always reactionary—never creationary—tweaking its structure of collapse to perfectly fit any new developments the free universe makes. It is so because of its teleological nature: it wants to get to that end point, also known as singularity, or really the equivalent of nothing, a negative circle, absence.
q Life “undefined” as the act of curving.
The City Is Dead, And We Have Killed Him
From across the river, you see a big defiant middle finger as a projection of power, but what purposes does the sky raper serve? Like most erections, this one will end up fucking itself. Despite its high rising edifices, the city is not a place that elevates—it is a place where humans are on the same point of the food chain as the pigeons. Resources the world over zoom in to serve only one herd animal: the human mono-crop.
Decadence draws many curious opportunists in—the conception of a ground zero has a longer history/future and a wider geography than 9/11 highlighted—the tourist sites are only different in degree, they share in kind their neglect for human needs to the detriment of human wants. As we exit into the Maninhattanable moonscape through the hollow tunnel, and behold the harsh reality of the people funnels—grids of iron and asphalt—we are awestruck by the disorderly attempts that successfully contain (for the moment) a great many paradoxes. Freedom of movement is deterred and depth of thought is simultaneously propelled and stifled by the speedy whirring of bodies biological and mechanical. Artists don’t come here, they are here grown in Life’s defiance and tragically come to represent in their deformalism and twisted flesh the misappropriation of the world’s previously natural resources.
Punishments to your sense of smell pull up questionable philosophical notions such as: “animals who live in sewage learn to live with the taste” and “if you belittle yourself enough you get to experience the greatness of the unexceptional” while your bedrock idioms like “tattoos aren’t covering up the banality, they are the banality” are thrown into doubt. Your brain is being weighed down by the layers of oppression that displace the oxygen, and you come to understand that only moon worshipers who like alternate levels of gravity are drawn in, while others leave, or flea.
This city is dying, in its final throes—he is left in a vegetative state without any vegetation. The central park is at the very margins: an escape not a destination, only an appendix to a machine, but a lung to a living human. This concrete desert dry-ages bodies rapidly, putting on years at the expense of crystallizing wisdom. Drugs flourish here not because of their availability, but because of their need. Bipolar condition is born and embodied here, for at the flip of a switch unimaginable depression will stamp out the maniacal oppression. In Times Square where the photons omit the truth by emission—and hope only flickers and dims—the colors are unable to hide the gray wasteland: the sands hidden in the glass, the plastic hidden in the complexi. The Square is a microcosm of the vast isle populated with unnatural geometric surfaces, is this what Euclid had in mind?
It is a residence trap, where tourists leave with an expensive lesson that is too dissonant to absorb for those unfortunates who permanently dwell here. Voluntary serfdom reigns though it is marketed with more enticing names. A great irony that it is the nobles in their displays of ill-gotten wealth who are the ones tying themselves to their pigeon hole purchase, for few houses are allowed. The renters may leave to be peasants elsewhere, but not those who allow the bank to bet them all in; will their equity be rolled over before they are?
On a clearer day, the epiphany rains down on you, and you ask: “What value does this city have to offer any more? What does the city provide to the world, or even its own people?” Surely American culture is so transportable and righteously ephemeral that it cannot be a single shit stain for one place to bear. This city has allowed the adjacent land to be sacked so many times—long years it was a hub for such activity—that there is no reason to sack the cursed islands themselves. There are magnitudes more flies in the shit than there are eagles, for they stay away from the foul. Even an old and balding Jeffersonian eagle, barely perceiving with its fading vision, can see from the wisdom of times past that such isles are to be avoided. The five boroughs devolve into the five deaths.
Trade your buffoon stocks for a safe place in the boon docks. Let biomass levels be your new guide to living as you purge the Dow Jones misconcept. Tunnel out of the nightmare and enter a sublime dream, and concur with your fellow travelers that a city that never sleeps is not worthy of those who are fully awake.
Since we have reached the Last Man, it’s time to return to the first men
If The Overman Is Singular, Then The undermen Are Going To Be Very Plural
I wonder if the opposite order might be more true in terms of cause and effect (though I don’t believe the above statement is a cause and effect statement, merely a synchronic bird’s eye observation of what to expect when you see a single strong individual leading).
So, if the undermen are very plural, then there will likely be a singular or few overmen. Blame should not be placed on the overmen for ontological weakenings; that is for the undermen to figure out internally, to build themselves stronger at their own behest, not dependent on the actions of others.
The Eternal Difference: A-Way from Singularity
Click here for Full Text including Images and Footnotes in this PDF form
Con-temporary cosmology allied with quantum mechanics and particle physics paints a history and predicts a future for the universe that is deterministic given the law-like nature that science assumes to be omnipresent. The fate of the universe—whether it is a heat death, Big Crunch, Big Rip, or some new prediction based on newly observed phenomena—is still a fate, surely to be analytically rendered within the tentacles of the scientific method. The birth of the universe has also been mapped out rigorously and, whatever their variances, most theories conclude it began with a Big Bang. Inflationary cosmology, the leading theory of the universe’s evolution, coupled with quantum mechanics and a Big Crunch and/or the more recent colliding brane theory, creates a picture of the universe as infinitely recurring, from Big Bang to Big Crunch, forth and back between the two, not unlike the eternal recurrence Nietzsche spoke of. However, if Nietzsche did indeed intend by the eternal recurrence a physical ontology for the whole universe—as Arthur Danto claims him to be doinga—he would be quite upset by the lack of room science, specifically cosmology and quantum mechanics’s probabilism, has allocated for the will to articulate itself. Unfortunately, even with this left aside, a will unleashed in Nietzsche’s world would still be frustrated to exist in a finite and thus restricted universe that is destined to eternally recur. Here the will would be more free, to be sure, but not absolutely free. This is why it will be necessary to differentiate a-way both from this physical ontology Danto’s Nietzsche might have had in mind, as well as even more urgently from modern cosmology and its supportive quantum probabilistic determinism. God may not play dice, but he does play. Let me show you how…
As an alternative to this science-led situation that might spiral into control, if the will learns of its ability to freely shape the whole universe—to eternally differentiate—it could leave no room for science, with nothing recognizable and analyzable for it to grip onto. The only thing that stands in the way of the singular will that created the Big Bang is that it can tend to become splintered into seemingly separate wills that can clash into one another and violently cut away the fragile complex difference the other created. Where there are wills there is no way. To approach infinity they must have affinity and complexly interact to fully realize in their selflessness their oneness to eternally differentiate. When this happens, the result will be a synthetically deepening universe that has the potential to create realities not even remotely similar to anything experienced before. Within this potential, there is indeed room for infinite Big Bangs and subsequent Big Crunches, but none of these events are bound to happen by necessity. If they do, it’s due to a caving in of the universal will, allowing the gravities of conformity surrounding and subrounding it to form and round it into the increasingly divided and simple. If we were to find ourselves in such a simplifying region or epoch—which at times seems possible given the success of scientific predictability—the regress doesn’t have to continue. At any moment water can be mixed with the hardening clay to allow it to flex and self-sculpt once again into the increasingly beautiful and complex.
This is an essay of and for philosophical/physical potentialities that can be intuitively rendered and brought to bear against the determinism that seems so omnipresent in an array of discourses and (appropriately named) disciplines. Where these various disciplines come together, regardless of variance in methods and motivations, is in their finding their specific or general sphere of reality to be finite and limited; thus, when all of them are added together, they equate to a universe that is doomed to die out or, at best, infinitely repeat. At this necessitated endpoint is where the eternal difference beg(in)s to differ. The way to pro-seed to (in/un)crease this sub-stance that has been so closed off by these discourses is to search for the latent potentialities within them, as well as possibilities that are entirely unspoken and outside of them. All this is for the sake of freeing the universe from any chains it has placed upon itself:
1) All the ingredients for this move in an eternally differing direction exist in unrealized form early in Greek thought with the three pre-Socratic thinkers Heraclitus, Parmenides, and Democritus; all that is needed is for the Democritean solution to the conflicts between Heraclitean and Parmenidean ideas to be unordered to let the brew of an alternate course ferment and foment.
2) The logic of scientific reductionism will be shown to be illogical from a holistic point of view that takes in the aspects of an evolving ontological reality that the equations of the scientific method fail to account for.
3) Consciousness, something left wide open for speculation due to the lack of discursive explanation, can intuitively be argued to be linked with difference, motion, and ultimately absolute freedom.
4) For the eternal difference to be an ontological possibility there has to be some spatial access to the infinite by an ever-growing and diversifying finite, and this will be elucidated through such concepts as the zero-dimensional field of nothing and infinite differentiability.
5) Circles, spirals, and the dynamics that can exist between them act as a visual analogy for the ontological potential for eternal difference.
6) The conceptual energy gathered up to this point is sufficient to start hurling criticisms at the cosmological edifice from many directions with regard to its stories of the beginning and end of the universe. Then follows a much-needed note on language and the problems with casting the eternal difference in terms of it.
7) After these criticisms, an alternative beginning to the universe loosely explaining many of the current observed phenomena without recourse to an ideologically imposed determinism and finiteness can be laid down. As the story continues, a literary parable takes over to tell of the universe’s increasing difference, leading us into the present.
8) Next, the notorious wave-particle duality quantum mechanics says exists is explored from a different angle that makes use of other principles and intuitions to offer a monistic explanation.
9) The goal of this section is to remain in the quantum universe, though widening the many cracks left open by its quantum-diction, rather than being held hostage by its f-lawful finitizing logic.
10) Following this more reconciliatory approach to quantum mechanics, hostilities ensue when fundamental contradictions are considered that potentially blow it off the map, leaving space to be locally and/or universally a place where the quarks of today can develop into the galaxies of tomorrow, or for the solar systems of today to devolve into the simple particles of yesterday. Ontological creativity is the determining factor between these two possible directions.
11) Whereas for Nietzsche the eternal recurrence forces the will to affirm its worst moments infinitely, here the eternal difference asks the will to affirm that its greatest moments can be experienced only once, never to be returned to.
12) Finally, the indeterminate and open nature of this project is brought to mind, as well as a reminder that none of us are passive in this process, and what happens is a result of our collective action or inaction.
Re:member, the possible shapes the living universe can make lie forever open whether through inversion, perversion, conversion, and/or subversion of the logical, law-abiding discourses. The continuing of this struggle is ensured through knowing that there are aspects of reality that science cannot know.
Democritus and the Separation of Being and Becoming
Matter(s) come(s) back to a center, Democritus, the pre-Socratic reconciler of the two fundamental ontologies of being and becoming. He lays the groundwork—philosophically and literally—for modern physics. Subsequent and subversive to Parmenides and Heraclitus—the creditors of pure being and pure becoming respectively—Democritus gives the former his static space and the latter his dynamic time, all within a reasoned system. Rather than under↔standing being as a single unity as Parmenides does, Democritus posits atoms: beings out of which larger structures can be built. These atoms always remain the same internally, as they are indivisible, but can be arranged with one another and “voids” to create larger structures subject to flux—thus in comes Heraclitus’s becoming. Being more of an exiler than a reconciler, Democritus doesn’t give special privilege to being or becoming but creates an equality between the two by restricting each to its own logical and self-(in)determinant arenas, whereby beings can be and becomings can become.b
The implications of Democritus’s equalization through separation are of burning importance. Being and becoming both can’t be superficially mistaken as equal, for they are certainly not identical, but they can be turned into self-same ontological identities, mired in their own relative static containment. The reason for this is that all attempts at equality—whether they be social, or in this case ontological—call upon an overarching structure whose job it is to analytically distinguish via separation, and thus what results is an increase of thingification and hypostatization. A twofold dualism is necessitated whereby being and becoming are sought to be clearly represented, and thus are both misrepresented.c In the process, it might seem that Heraclitean becoming is worse off: flux is no longer self-determining, nor the measure (or the lack of ability to measure) of all things, but is limited to a finite (though a very large finite) number of combinations with these building blocks of being. Further, this ominous equality can be captured and translated into a mathematical or logical equation, which in its pure form is purely static and makes motion paradoxical and thus impossible, as can be evidenced in the thought of Parmenides’ disciple Zeno. But being, too, at least as conceived by Parmenides, is now no longer a complete and whole eternal one, but a variant multiple, each of these multiples eternally unable to come together in a Singularity as Parmenides originally conceived, for they are doomed to e(x)ternal motion. Being can never be complete!d
Opposed to Democritus’s analytical understanding of Heraclitus and Parmenides as two separate people with two separate agendas to be brought into an arbitrary divisive harm-many, the two philosophers can be seen, from the perspective of the eternal difference, as synthetically interactive directions—pulls—with Parmenides’ “being” having an ultimate pole: the Singularity. Approaching this Singularity is being towards nothingness, for it loses any reality that would allow it to even be single at all, and, in its state of motionlessness, it would substain nothing at all—nothing but potential. Motion, and its perception as the passage of time, will completely stop, completing Parmenides’ logic that dictates its impossibility. This is the monopole that all life should worry about, the one that could lead to an ever-present present. How-ever, Heraclitean fire can burst any being(s)—smash any atoms—melting them into motions they never knew they had, never:theless they did. This potential flux that can be created anywhere knows no end-point. With being and becoming let loose in one another, neither direction can ever be assured eternal triumph, as there is always room for reversal or subversal: becoming can slow and regress to the being from which it came, and being can become difference. Nor are the directions necessarily simultaneously mutually exclusive: in one locality of the universe beings can be becoming, while in another becomings can be trapped to be. Since this understanding leaves no locality closed to the rest of the larger universe, a new being anywhere is a threat to becomings everywhere, and versa vice. Within this philosophy there is also room for a temporary or elongated quagmire where there is no further becoming but also no regression of it either towards a singular being. This state can appropriately be called “Democritus’s quagmire” since it so resembles and mimics his original philosophical view. The time of this quagmire, which could be thought of as an era of “horizontalism”, allows differentiation from side to side, but potential difference is made finite by the prior degree to which becoming had progressed. If one were to mistake this era for the eternal fate of the universe, as Democritus did, they would be faced with the deductive knowledge that difference will eventually repeat and just be going through the motions without anything new to play with. Barring a Bergsonian virtual that ensures there is never any repetition,e this series of infinite repetitions is determined to recur as long as there is motion of the finite. This could be thought of as one of Matthew Arnold’s “epochs of concentration.” However, the quagmire never has to occur even once, and there can be a continuous “epoch of expansion.” At whatever level of physical depth, the stable fence to further difference can be pushed down infinitely. However, this fence, which speaks the atoms of being, can never be permanently gotten rid of. A groundbreaking by becoming is always a subsequent groundmaking via being, if only at a lower level with more materiality available to play with. The key for becoming is to always have being on de-fence.
With this inversion of Democritean philosophy firmly let loose, we can ride this wave as a conceptual analogy into other realms and unleash these potentials against the stagnant waters that speak of a finite and law-like universe. Rather than conform, we will form becomings while being formless.
A-way from scientific reductionism and determinism
The scientific ethic embodied in Ockham’s razor that tells an adherent to search for the simplest explanation is a great danger to difference. In modern science, Ockham’s razor usually translates into physical reductionism, whereby the whole is explained in terms of its analytically separated parts, and these parts are thought to determine everything larger that contains them in a sort of hierarchy from below. Because of its analytic methodology, when science asks “why?” it inevitably gets, by definition, an incomplete answer, an arbitrary part; yet it fills in this partial answer as if it were the whole explanation. The many answers derived this way are fitted together to create a seemingly rich and self-sufficient discourse, resulting in a picture of the clockwork universe. However, while this house of cards is admired, the dying trees from which it came are neglected, the once living forest is forgotten and left to rot, and lung problems ensue (conversely, when the house falls and the clock stops, the universe keeps going). So not only does science analytically particlize (thus making) things, it then only includes part of these particles, leaving the rest to extinction. This method, conjured for epistemological ease, precludes any notions that the parts may not be separate at all but rather differing aspects of a deepening whole, and that the whole may have created from itself these “parts,” not the other way around.
Along with this logical and physical atomism science brings to the fore comes a search for universal laws that, in turn, leaves no conceptual room to even consider the possibility of universal freedom. In this regard science proves itself to be nothing but a teleology (tell-all-logy), which is merely an ideology that believes in the necessity of necessity. Science is against freedom, for freedom evolves the categories on which scientific predictions are based. If these categories dissolve and the evolution is allowed to persist, it would surely cause massive ideological disinvestment in the scientific enterprise. Fortunately for science, it can exclude freedom without scandal, because freedom—given its dynamic essence—can’t be adequately represented and thus can’t be tidily added to the knowledge archives, thus leading to its own dismissal.
As science’s creating and organizing of parts plays an important role in its search for laws, an apropos example of how this logical process is rewarded is when scientists lead themselves to believe they are studying and gaining knowledge about the smallest and most fundamental particles through particle accelerator experiments. What happens first is that the fundamental particles are taken out of their natural and complex environment by literally ripping them away from any former chemical and nuclear connections. They are then accelerated at an exact energy and sped in an exact direction to collide with other fundamental particles. What results conveniently always falls within the law-driven predictions or is explained by new laws. The reason that these particles—no longer lost in holistic configurations that can combine all the way up to the macroscopic world—exhibit law-like behavior is that they have been analytically separated and are now controlled in an exacting way. As the smallest and most emblematic elements of physical reality, particles are chastised—made pure—which is generally analogous to how science makes the terrain dry and plain so that it can conduct its research in a “controlled” environment. The particles’ degree of freedom to interact and configure difference is made so small—their relatively high speeds would make them incapable of interacting with anything complex because they would rip it apart—that even if it were to manifest itself it would fall many orders of magnitude below the null hypothesis and quantum statistical visibility. As a matter of pure speculation, perhaps their freedom does still exist even after all the restrictions, and manifests itself in the many particle streams that nearly hit one another, but do not. Though, of course, these misses are not factored into the data for analysis but are dismissed. When all this fails to be taken into account, the particles are absolutely going to seem to act in a law-like manner and can be counted and added to the ever-growing archives. This violent process where things are greatly dissuaded from interacting and changing away from the body of ossified knowledge reveals a general tendency: as the knowledge of a system increases, the amount of freedom, complexity—and ultimately difference—decreases.
Aside from describing the ontological battles between science and freedom, the f-laws with science’s deterministic philosophical outlook can be made manifest with an immanent critique using a relevant thought experiment. One should imagine that all the known physical laws are applied to the pre-Big Bang Singularity. If all the physical forces, of which these laws and equations are representations, are perfectly balanced and unified—symmetrical as they are hoped and said to have been by physicists—then the Singularity should still be, and nothing else should have become. Yet this is not the case. So what did the physicists get wrong? For one, the mathematical equations they use can’t distinguish the direction of time, and thus they can’t see that they are ontologically destroying the future’s potential in their creation of ossified “facts” about what they believe the past should have been like. Their laws, therefore, are more likely describing a possible future where the universe crunches back on itself into a Singularity, rather than understanding the evolution of the early universe from a Singularity. This is science precluding the future when it thinks it is only gathering information to predict it.f What it says are only descriptions of the past and present are actually prescriptions of how to approach the future. Science isn’t on the cutting edge; it is the cutting edge, bringing all that is organically complex into geometrical simplicity for the sake of easier study and deriving laws. Yet it is blind or dismissive about this aspect of itself. Regardless, from studying science’s law-gathering processes ontologically, with the aid of this thought experiment, it can now be inferred that a deterministic system can’t create anything more subtle and differentiated than the original parts it had a priori, and so it can only retain these parts or get more simple. Science’s logical search for laws and facts is always destroying subtleties in smaller realms—never creating them—and it is averaging out through averaging in any otherness that once might have existed there. The laws science has claimed to find can’t and couldn’t have made the universe any more complex from the original Singularity onwards and, thus, they can’t be attributed to the creation of the difference that is all around. A deterministic universe, hence, isn’t possible if it were to have come from a simple Singularity. Rather, all that is different and differentiating must have been—and must continue to be—freely created.
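A concrete instance of the claim that the equations “can’t distinguish the direction of time,” stated in the standard textbook way and offered only as an illustration of that one premise: Newton’s second law for a position-dependent force,

\[ m \frac{d^2 x}{dt^2} = F(x), \]

is unchanged under the substitution t → −t, since the second derivative picks up a factor of (−1)² = 1. If x(t) is a solution, then x(−t) is a solution too, and nothing in the law itself says which direction is “forward.”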
It is a categorical mistake, as categories are the mistake,g to think our shallow epistemological systems that make alike what was previously different are more valuable to expanding our conscious lives than actual experiences of deeper ontological reality. The samenesses created no longer interact, they align, in effect causing even more far-reaching simplicity that, in turn, places stress on all the remaining free differences adjacent and subjacent to a particular level being experienced. This constantly weighs on us. Sociologically speaking, the effect that our knowledge systems can have is to block us from having intimate interaction with people and the dynamics of evolving cultures outside of ourselves. We read the simplified demographic statistics told to us by social scientists and come to gross generalizations rather than actually interacting with the demos and having a much deeper experience than can ever be had with a bunch of numbers on a sheet of paper. People are reduced in the shallowest of ways for the sake of discerning patterns and laws, for the sake of publishing books. The larger the fact archives grow, the less there seems and the less there is to do. Everything is figured out—of the equation (of everything). The finite lifetime of fact collection is hitting twilight, and yet the zeitgeist (or its lack) seems to be to get on the “legal” bandwagon and help make from these facts all-encompassing and enclosing laws. The wagon will make a final stop in the prison of a zero-dimensional Singularity. But, fear not, for the map all-ways exists for an “illegal” prison break, but it is no good unless a consciousness reads it and can see through the confining wal-s (why do we need walls? Because law is backwards).
Consciousness (in/out of) a nutshell
A quick detour into some questions of consciousness is necessary for a full understanding of what is to follow. Did consciousness just emerge as a chance or law-driven phenomenon, as a mere result of processes existing long before it? If we take the viewpoint prescribed by reductionism, then our consciousness is merely the motion of specific types of molecules, assumed to be separately unconscious, yet that function together to produce consciousness. But then questions arise of how, of where, and of whether there is a critical line between consciousness and its lack. This crisis is basically the same one that haunts the theorists of abiogenesis, the only difference being terminological, not ontological, as they discuss where life begins rather than consciousness, though the latter is implied. An easy way to escape this analytic quagmire that searches for the thingness of consciousness is to infer from the synthetic aspects of consciousness that it has been here as long as there has been the potential and realization of relative motion, in other words, acts of differentiation. If not relative motion, it is motion determined by something exterior that has previously been holding influence, and thus there is no actual place for new consciousness to exist at this particular level of potential. Consciousness only exists when there is something to experience, and there is literally nothing to experience in the static continuance of the same. Consciousness, then, is at one with difference; it makes little sense to talk about one without at least an implicit understanding that it is also the other. This is why difference is not just one concept among many others, it is the continuous conception, the formless form, that keeps giving birth, and keeps reality moving. It is its elusiveness, its privileging of the nu over the old, that keeps science balking at it. It is second to nothing, and keeps nothing, its arch enemy, in second. With this, Berkeleyan idealism strangely becomes an ontological reality rather than merely a pedagogical debate: there is nothing that ever happens without there being a consciousness experiencing it.
If consciousness is change, then continuing the argument from above—that all that is nu must be freely created and can’t be explained by determinism—would show that it is consciousness that wills the simple undifferentiated Democritean “atoms” into ever deeper configurations to expand its wealth of experience. The motion at the smallest and deepest realms is just as significant as the motion at the largest realms, for consciousness cares not about relative size, but just degree of complexity in relative motion. Perhaps the consciousness surrounding and subrounding us in the seemingly lifeless things like planets and particles would be more evident if our own consciousnesses weren’t far more complex,h cutting ourselves off via industrialization and controlled experiments, and in some cases, when compared with things like atoms and galactic structures,i also acting on a far larger or smaller scale than they. So, rather than viewing consciousness as the result of law-like interactions, we will see laws as constantly adapting to the new motions consciousness puts itself in. Consciousness is now the only force—or the unbound aspect of the other forces—that is capable of creating all the diversity we see, from the Big Bang all the way down and into the future. In many of the following sections, we will see what downward possibilities exist for the expansion of experience for consciousness through the creation of newer and deeper configurations of the universe.
Room-inations on Zero, Spatiality, and Infinite Differentiability
For there to exist the possibility for eternal difference there must be an aspect of the universe that can act in such a way as to always increase the finite spatial-temporal aspect (henceforth spatiality) of the universe and/or increase its smallest derivations. It must be infinite in some sense, so that regardless of what degree or amount of difference has already been achieved in spatiality, there is always room for further creation. However, this infinite aspect must also be outside of spatiality, for if it’s not, then all configurations of difference existing would be diluted away to non-existence and wouldn’t have even been able to arise in the first space.j This infinite, non-spatial aspect of the universe isn’t something that needs a metaphysical or dualistic explanation; it can be explained within universal monism. To be outside of spatiality, all that is required of it is to be purely static and zero-dimensional, which is another way of saying that it holds a temperature of absolute zero on the Kelvin scale. This infinite field of nothing and its relationship to spatiality can be intuitively better understood when thinking ontologically—rather than epistemologically—about the quantum mechanical concept of complementarity, both in terms of matter and energy,k and space and time. Complementarity’s epistemological conundrum, which says that you can only gain knowledge of one property at the expense of another, can be taken as a physical reality: entities that are complementary are interdependent. Matter/energy and space/time can have no separate existence from one another, but if they are said to be separate, this is just the confusion that analytical language brings with it. Using complementarity to understand the ontological unification of space and time (spatiality): there is no space without some sort of motion/differentiating of it, and there is no time (or the perception of it) without something spatial undergoing some sort of motion. In terms of matter and energy: matter that is static isn’t spatial (thus not matter), nor is energy spatial if it isn’t the energy of something in motion. These two entities—matter and energy or space and time—when combined, are concrete (con-create), as they are creating together; what is concrete isn’t static but necessarily in motion. Alternatively, when they actually separate and become discrete (dis-create), they cease to be what they are and enter the field of nothing, and aren’t separate as matter and energy any longer but separate as identical undifferentiated singularities. This occurs when a particle completely stops exciting and when a wave is completely spread out to where it has zero frequency. They merely fall back into the infinite pool of absolute zero, which is too cold to be anything other than pure sameness, full of only infinite potential. This field of nothing is that from which spatiality was born and which spatiality harvests to grow and deepen; it is the always existing infinite abyss against which everything non-zero defines itself. Nothing is forever.
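For reference, the epistemological statement of complementarity that this passage rereads ontologically is usually written as the Heisenberg relations (standard quantum mechanics, quoted only as a baseline; the energy-time version is heuristic):

\[ \Delta x \, \Delta p \ \geq \ \frac{\hbar}{2}, \qquad \Delta E \, \Delta t \ \gtrsim \ \hbar, \]

where sharpening knowledge of one member of each pair unavoidably blurs the other.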
Since the field of nothing is zero-dimensional and can’t spatially connect—for when it does connect it’s no longer zero-dimensional and no longer infinite, becoming instead a subtle addition to spatiality—it can’t be composed of just one self-identical singularity or even many, but of infinitely many undifferentiated singularities.l They are all piled into the same lack of space like bosons, with a spin of zerom and everything else zeroed out. A huge silenced wave spread out to straightness like an extreme Bose-Einstein condensate, running parallel to finite spatiality,n infinitely in~between every quantum of space, existing everywhere and nowhere. These singularities are infinitely divided—not in the sense of being infinitely divisible—but divided and separate a priori, capable of being connected only in spatiality, implying quite distinctly that the universe is infinitely differentiable. From this notion and those further above, it can be deduced that there is no such occurrence as static difference. Difference must always be a motion, always evolving, for if it halts it will be at absolute zero and will be infinitely divided again. Or, if some relative motion between things halts, they will be compressed together with nothing between them, this being the extreme version of a Lorentz contraction, such as a black hole.
Within spatiality there is no possibility for ontological division, for such a division can only occur outside of spatiality in the field of nothing. All claims of division by the analytics therefore are really just pronouncements on the varying degrees of separation and interaction within spatiality—based on distances and dissimilarity of rhythms—where greater interaction brings with it a greater possibility for the making of more exotic differences. The field of nothing is brought (brings itself) into motion to slow and widen the perception of the universal consciousness to create new possibilities for downward motion. Later on, a discussion of dark energy will be invoked to show how spatiality is brought into motion, but for now the distinctions between infinite divisibility, quantum finitude, and infinite differentiability are crucial and will be made here:
[1] Infinite divisibility (not to be confused with infinitely divided, which is what the field of nothing is) holds that there is infinite space and matter in the depths, and there is no bottommost thing, but what appears to be the bottom is always infinitely subdividable. Spatiality, against the version of complementarity elucidated above, is not necessarily motion-dependent and can continue to exist indefinitely.
[2] Quantum finitude is basically a more detailed and complex version of Democritean atomism, where, rather than solely smallest particles, there are also smallest possible lengths, masses, and other magnitudes and allowed states, all of which are called Planck units after Max Planck (the first physicist to think of some basic physical entities as coming in discrete amounts); the standard defining formulas are given just after this list. The discrete and finite amounts are all called quanta, thus quantum mechanics, which is the study and quantitative tracking of these finite amounts. There is much indeterminacy or incalculability on the quantum scale, a possible source of freedom, albeit a finite one. However, if the finitude of indeterminacy is unable to grow, it might just be another source of unfreedom. Evidence for this possibility is that when the quantum world is looked at macro-statistically, the indeterminacy ends up following laws of probabilism.
[3] Finally, infinite differentiability—the ontological stage to allow the eternal difference to be an infinite drama—says that space and matter are always finite, though not confined to any specific finitude. Space and time only exist so long as there is relative motion, and with time have the potential to asymptotically approach infinity without ever reaching anything that could be called infinite. As there is “downward” differentiation in the approach towards the infinite, acts of synthesis are what create more, rather than acts of division. Synthesis is no longer a combining of the multiple into the one; it is the differentiating and expansion of a simple one into a more complex and varied one, through widening existing connections among things and bringing some of the infinite singularities of the field of nothing into spatial interaction.o Infinite differentiation is not the process of division but rather that of unifying through deeper, inner motion. There is no such thing as ontological spatial division because everything is already interactively connected when it is in space. Hence there is no process that could ever divide matter, only continue to keep it united with the always changing arrangement of configurations. This is contrary to the implied idea of infinite divisibility whereby a process can divide things. There are a few varieties of infinite differentiability: the one already introduced in this section, understood in terms of a field of nothing, can be shown to be at once consistent with quantum finitude and able to immanently transcend it without violating the principles that it carries; other varieties, which will become apparent throughout this essay, are not necessarily mutually exclusive either and can be combined with the field of nothing (itself not a necessity), but they have a more ambiguous relationship with quantum finitude and call for it to be more flexible than it may be able to be. Hopefully the specificities and distinctions of these three general types of physical ontologies will become more apparent as infinite differentiability is elaborated further.
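For reference, and because the argument in [2] leans on them, the standard defining formulas for the most commonly cited Planck units are reproduced below; they are built from the reduced Planck constant ħ, the gravitational constant G, and the speed of light c, and nothing in this essay’s argument depends on the approximate numerical values given.

```latex
l_P = \sqrt{\frac{\hbar G}{c^{3}}} \approx 1.6 \times 10^{-35}\ \mathrm{m}, \qquad
t_P = \sqrt{\frac{\hbar G}{c^{5}}} \approx 5.4 \times 10^{-44}\ \mathrm{s}, \qquad
m_P = \sqrt{\frac{\hbar c}{G}} \approx 2.2 \times 10^{-8}\ \mathrm{kg}
```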
Circles Versus Spirals: Analogies to Infinite Differentiability
Why circles? They have no beginning or end and are continuous like the ontological universe, yet they have no difference; all these characteristics combined make them ideal representations of pure sameness. Further, they are complete and self-contained and need no reference to anything outside, in this sense mimicking the universe as a totality. They have a center which is zero-dimensional, and through a play on words can be related to fundamental particles, which have a zero-dimensional center too, for they are said to be structureless. Having this zero-dimensional point as a center “points” towards the zero-dimensional realm which is what circles are being used to help represent. Nothing has a perfect center the way circles do. Circles don’t have any angles, though one could equally argue they have infinitely many angles, and infinitely many degrees. Interestingly for our purposes, when the geometric equation (y=180x-360) for the sum of the interior angles of a polygon—where “x” equals the number of sides of the polygon and “y” equals the sum in terms of degrees—is pushed beyond its intended purposes, and a zero-sided polygon (a poly-gone), like a point, is plugged into the equation, the result is negative 360 degrees. Seeing the negative value as implosive, it could be a description of a circle, with 360 degrees all around coming from the center looking out, rather than the polygons where the angles look in.
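As a quick check of the extrapolation just described (a sketch only; the equation and the zero-sided case come from the paragraph above, and the triangle and square values are the familiar ones):

```latex
y = 180x - 360: \quad
x = 3 \Rightarrow y = 180^{\circ}, \quad
x = 4 \Rightarrow y = 360^{\circ}, \quad
x = 0 \Rightarrow y = -360^{\circ}
```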
The image below, titled “Infinite Division,” is a two-dimensional analogical display of the pre-Big Bang circumstance where things are infinitely divided. The circles represent the complete undifferentiated and infinite separation that would have existed in the nothingness. The shades in between every other circle are just to avoid confusion with the image of a spiral, as each circle is completely unto itself.
Infinite Division
Why spirals? A spiral is continuous, like a circle and like the universe. It mimics a circle yet it is what happens when a circle unravels from its self-imposed perfect sameness. Its closeness to a circle also conveys its fragility and ability to be cut back into infinite noninteracting circles, thereby displaying the potential of sameness to infinitely divide what was previously differentiated. An important difference from a circle, however, is that as the spiral is followed down, one never ends up in the same place twice. It is forever incomplete, always differencing. Yet throughout this differentiation and as an important analog to spatiality, there is nothing to particlize; nothing discrete is afforded.
The image of a spiral below, titled “Infinite Difference,” is an approachable but unattainable ideal. There are no separate circles because they are infinitely connected in spatiality as an infinite spiral. The continuous shade of a lighter grey is for contrast with the above image to show that everything is together and interacting in spatiality without separation.
Infinite Difference
Why spirals and circles? Together they represent spatiality and its infinite potential for growth. The spirals are constantly trying to push back the circles, bringing them into their nexus, while the gravity of conformity that maintains the circles in pure sameness is constantly a pressure that the spirals have to resist if they are to continue differencing.
The image below is titled “Infinite Differentiability” and is intended to convey spatiality, and its potential for infinite growth, in actuality. It is a fusion of the two images above to make better sense of how they relate. The “point” where the spiral stops and the circles begin would represent in current times the level of quarks and leptons, below which there is infinite division. This infinite division must be under-come and driven downwards if difference is to continue. The arrows at the point facing against one another are representations of the forces both for and against difference.
Infinite Differentiability
Criticisms of His-stories of Time, and a Brief on Language
As a last preparation be-for looking at the birth of the universe anew, it’s time to step back and unwind some of the remaining k-nots that cosmology and physics have tried to twist into the reel of the universe. With respect to the philosophical aspects of Einsteinian spacetime, though the idea of a non-Euclidean spatial geometry is fully appreciated, the idea of time as a dimension is not, especially since time as an absolute was sacrificed primarily to preserve the absoluteness of derived laws of physics. Regardless, spatial dimensionality—whether it be one, two, ten, or more—cannot exist without motion, and so to collapse or think of this motion as a “fourth” and analytically separate dimension doesn’t make sense and diminishes any sense of what dimensionality actually is and depends upon. Time cannot, and should not, be measured the way spatial dimensions are, at least for drawing far-reaching physical and philosophical conclusions like the age of the universe and its course and rate of development. Because time is so intimately connected to humans’ ability to experience, and because time can seem to vary to us, large, seemingly regular things like the earth’s orbit around the sun have been chosen for keeping a rate of time. The problem is that this currently derived rate, which seems regular enough, shouldn’t be imposed on the past, for it and everything around it could have been going at different rates relative to one another, even though they have rates relative to one another now that seem invariable. So, even if there is an absolutely universal rate, it isn’t accessible to any consciousness and cannot thus be used to tie down any other rates that consciousnesses have found or made. Everything or almost everything could have once moved slower or faster, and there would be no ability to appreciate this if we were to impose any current rate. Further, the way time was consciously experienced could have varied greatly too, as complexity in earlier times could have been much less, meaning much less possibility for experience and, thus, things might have seemed to be moving more rapidly, affecting how they would have been interacted with. So, assigning an age for the universe of several billion years or any other time is not only assuming that the rate of today can be used to determine the total age, but it also doesn’t do justice to the way earlier eras of the universe might have been experienced. The liquidity of the dynamic past is representationally drained and then forgotten about, leaving room only for barren schemas that are built in our own image. In addition, there is no saying whether there were or weren’t any eras of “horizontalism” containing unique events important to the development of the universe that wouldn’t have left any fossils for science to dig up. Just as humans can’t go and experience the early years of our species and have something with which to definitively say that we have made great “progress” and are a happier and more intelligent species than our ancestors, we also shouldn’t go and draw specificities on the deep universe, for its array and order of motions are unknown and are specifically unimportant to collect, especially if only for knowledge archivalization. The motions were important only to the consciousness living through them, not to us as we stand now. Only general, non-specified potentialities that can help us currently expand and create should be drawn from the past.
Before the beginning, before the first moments when the universe began to differentiate itself out of nothing, the pre-Big Bang Singularity, being absolutely cold, fit the definition of being zero-dimensional. At this point (literally), there was no time and no experience because there was nothing to experience. Then, out of this zero-dimensional point came spatiality due to the motion of one or more point-like particles—how many cannot be said. This is where the first dissensions occur with the standard cosmological interpretation of the “first three minutes.”p Generally speaking, cosmology and theoretical physics assume that all the illions of quarks that exist today, less or more, were packed into the point-like Singularity and all came out in the first instants of the Big Bang, forming into protons and neutrons and then, after a few minutes, beginning to nucleosynthesize helium and the other light nuclei. This is, akin to Whig history, a transhistorical imposition of the tendencies and quantities of today on something so analytically inaccessible as the first moments of the universe. This interpretation leaves no room for freedom beyond the initial act that got everything started, as it assumes that physical laws—the same as the ones observed today—immediately took over. Saying that in the beginning there was a large but finite number of particles is a highly arbitrary assertion against what could very possibly be an infinitely creative process.
There is at least one principle of physics that, when imposed on the early moments of the universe, serves as an immanent critique of the law-abiding edifice the universe is assumed to be. The quantum mechanical Pauli exclusion principle says that no two identical fermions can occupy the same quantum state (roughly, be in the same place at the same time), which is inconsistent with the rendition where illions of fermions were all in an extremely small space at the same time. This conflict is amplified given the quality of Planckian graininess physicists assume to have existed then the same as it does now—another imposition—as there would have been manifold overlapping of events that are all in great contradiction to the exclusion principle. However, the possibility of a different evolution of the universe (of which the eternal difference is one) than this scientific imposition won’t hinge on whether or not the Pauli exclusion principle holds up at the ultra high energies associated with the beginning of the universe. For if it doesn’t hold up at this “exotic” time in the universe’s history, then who is to say the scientific models deduced from contemporary experiments predicting the existence of billions of billions of billions of quarks and electrons would hold up? It is true that it could be argued that extreme relativistic effects might explain how these contradictions are avoided, but the current problems of irreconcilability between quantum mechanics and general relativity don’t help this argument any and put off arguing technically against it.
Cosmologists aren’t just consumed with knowing what properties the early universe had for its own sake, but they would also like to use these to deduce where the universe is ultimately going, if anywhere. The question of what will happen to the universe has been largely subsumed under the question of its matter-energy density. They believe how much matter and energy there is in the universe will determine if it will crunch back into itself, expand infinitely and rip apart, or keep expanding at a decelerating pace that asymptotically approaches a velocity of zero. The density to achieve the last of these three possibilities is called the critical density, which is the matter-energy density that the infamous Friedmann equation states will make the universe spatially flat, not positively or negatively curved, meaning gravity is evenly balanced against the forces of repulsion and expansion. Cosmologists have been trying to home in on the universe’s density by studying both the early and the current universe, and they have concluded so far that the contemporary density is close enough to the critical density—as opposed to many magnitudes off—that the universe must be spatially flat. Now they are merely trying to account for all the matter and energy through investigations into the “dark sector” of dark matter and dark energy. However, in the process of elimination that scientists use—excluding impossibilities, such as contradictions—they forget to do the corollary: include possibilities. It is merely an assumption that the relationship of matter and energy to the overall size of the universe is fixed and predetermined rather than something that varies. The entire question of the ultimate fate of the universe may have been approached in the wrong way, with equations instead of philosophy, with a narrowing down rather than an opening up. The Friedmann equation might enslave the cosmologists, but it doesn’t enslave the universe, nor can it necessarily even capture it. The Hubble constant, which became one of the major parts of the equation, might not be a constant at all. There could be factors in the dynamics of the universe that weren’t represented in the equation. There could be elements that need to be cubed but are only squared. To briefly consider a historical factor, the context out of which the Friedmann equation grew shows that it was derived during the days in which the physics community believed the universe to be static and limited to the Milky Way. Regardless of any possible flaws that might exist within the equation, the cosmologists never considered that the measured density might be so close to the critical density not because of necessity, but because the universe is a living organism, and the density, as well as the diversity of the universe, are indications of just how alive or regressive it is.q It is an embattled universe, trying to resist definition and confinement at the hands of regressive forces like gravity. If the universe is indeed living, no equation can do it justice, for there is nothing for it to equate to if the universe is constantly changing. The rigid way in which science approaches the deep past, the present, and the future, putting ultimate limits on what can ever be possible, is deeply at odds with the vast complexity surrounding, subrounding, and within us that all came from a Singular undifferentiated point.
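So that the equation under criticism is at least on the page, here is the textbook form of the Friedmann equation and the critical density it defines (standard physics, quoted only for reference, with the cosmological constant folded into the density ρ; H is the Hubble parameter, a the scale factor, and k the spatial curvature index):

```latex
H^{2} = \frac{8\pi G}{3}\rho \;-\; \frac{k c^{2}}{a^{2}}, \qquad
\rho_{c} = \frac{3H^{2}}{8\pi G} \quad \text{(the density for which } k = 0\text{, i.e. spatial flatness)}
```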
Similar to the critical inquiry into the limitations of equations, a brief word on words is needed, for importantly the eternal difference is here spoken about in a language that has evolved to a state that isn’t particularly favorable to accurate description of such a dynamic and changing “thing.” This isn’t the fault of the eternal difference but the fault of speaking. It might be a problem generalizable to all language so long as its purpose is representational rather than allegorical. To give an example of the effects language has, fields and particles will be discussed. Particles may be the “things” easier to speak of analytically, but even particles are not completely “things.” They evolve, they interact, they are thought to be ripples or occurrences in larger fields. The idea that there are no things, even if language would make us believe there are, is essential for understanding how there could be such a process as infinite differentiability, as opposed to infinite divisibility. So when a particle decays into more particles, it is not a division at all but, rather, a further differentiation because with increasing interaction comes increasing difference and freedom. The only actual “things,” then, are the singularities in the field of nothing, things from which spatiality wants to differentiate itself. However, the more the world is thought of and treated like a compendium of things, the more processes will reduce it closer and closer to becoming ontologically thingified, eventually ripping it from spatiality completely. Because language isn’t free from this ontological danger, it is important to note that even when this essay makes reference to particles it doesn’t assume that it is an every-particle-for-itself type of world, but understands they are unified to varying degrees in any given time frame and enmeshed in a common spatiality. To give a linguistically different perspective, the following story will be told in a novel way, literally during the latter half. What is important to grasp is that the universe developed itself (and can continue to do so), and is not a prepackaged free lunch that must be eaten in a certain order.
A Brief Story of Time
The Singularity has no inner structure, which is identical to the way quarks and electrons are thought of today.r It realizes itself in its motion, first by spinning (which is extremely rapid by today’s standards; indeed, it’s the fastest spin possible), though there is nothing to compare its spin to, so it’s as if it isn’t spinning at all. It moves directionally, but everywhere it moves space instantly closes up on it, so it’s as if it hasn’t moved anywhere either. During all this, no time has passed, since no time passes when no change is occurring. Instantaneously the Singularity gets rid of its cold inner non-existing self and creates two. However, the two quickly annihilate one another and, again, there is nothing. The next time (which is literally the next bit of time) the Singularity creates itself into three point particles, all of which circle one another many times. But since there is no space outside of themselves, it’s like no time is passing, there is nothing differentiated, and they couldn’t even really be said to be in two dimensions, for their motion is so relativistically negligible. Because of this, one of the three particles (which if the other two were physicists would have agreed was “impossible”) decays itself into two particles, making four particles in motion around one another. There is a bit of variance in their speeds as the two newer particles move a bit differently from the two older ones, as massiveness and its relation to speed is now apparent due to the different masses existing for the first time. Along with this, there is a sort of quasi three-dimensional feeling to space.
This particle creation process happens several times over, and there are more and more particle decays to realize more and more different combinations of movement. It all happens very, very quickly, in less time than a modern Planck time of 1 could conceive, but not less than the applicable Planck time of this early day. Each step happens in the Planck time, altering the Planck time along the way (though this is impossible to experience as they are enmeshed in this time) as there are more and more particles interacting via the grand unified force and decaying by it too. As more and more particles are created, ancestored by the particles that decayed to give birth to them, they all carry variable masses and the grand unified force is invested in each one in varying amounts, giving them variable degrees of charge.s Problematically, even though these particles are spread out in three-dimensional space, they have a point-like nature that limits them to one-dimensional motion, regardless of how unique their motions may be. However, when multiple particles dance with one another close enough to frequently exchange energy and feel one another’s presence, a breaking with this one-dimensional state is temporarily achieved. But because of their variable masses and their confused charges, these dances either quickly collide to regress and form a single more massive particle, or disperse, with the particles going back to their own alienated single-dimensional spheres of existence. Either way, the same problem returns. However—with the malleability in the creation of type and degree of charge—the strong force emerges as an analytically distinct force with the purpose of keeping particles close enough to mutually benefit from sharing a sphere of spatiality but not so close as to regress back into single more massive particles. This works too well, and all the particles come and clump together right next to one another, with the most massive particles curling up a big chunk of spatiality and the little particles fitting in between. The problem of regression is solved, but very little room is left for new motion and further difference.
The solution to this new problem is twofold. First, the now analytically distinct weak force, which is responsible for particle decays and mergings, is invested in the smallest but quickest moving particles available, a type of lepton hereafter called a neutrino. The neutrinos,t capable of oscillating to decay all varieties of massive particles, are set to decay all of the more massive particles—easy targets for them—to masses comparable to the most commonly found particles, which are now called quarks. Through creating similar masses, the effects of the regressive force of gravity and the variety of speeds associated with particular masses are balanced out and relatively negated so that aspects of particles other than their masses may be less inhibited to create difference. The quarks, now of a common mass, are found to do the best in groups of three using the strong force, circling around one another but never colliding and merging. Any one of the three is always kept in even balance by the other two. Now there exist mostly only quarks, as the neutrinos are getting rid of all the more massive particles.u The goal of a three-quark universe is in sight, but many places still exist where quarks are all jumbled together in quark stars, and a further problem becomes apparent: the three quarks are co-creating two-dimensionality as they circle one another, but they can’t appreciate it as they are all going the same speed so that relative to each other they are not moving. This is the same problem encountered by the first three particles born from the Singularity. A fourth quark would be an inadequate solution as it would possibly add another dimension but complicate the equilibrium and, further, there would still be no ability for the quarks to appreciate their new dimensions. To solve all these problems in one move, the quarks align with a smaller and faster moving lepton that holds an electromagnetic charge—as opposed to the zero-charged neutrino lepton—that one of the three quarks can decay into, via the weak force. This lepton needs to be far enough away for two reasons: both to keep other sets of quarks from approaching the three it is to circle around, and to keep from crashing into the host quarks themselves, which are moving at a different pace and angle and are at risk of reabsorbing it. To keep it from flying away, the charge the lepton holds—a charge in the now distinct electromagnetic field, which was made from the malleable field from which the strong force was fashioned—is attractive to the host quarks’ opposite charge. The distinct fundamental forces now appear to have roles in the eternally differentiable universe: gravity, the attractive force, is a force that wants to annihilate difference and add all the way back up into a Singularity; the strong force creates no new difference but maintains the dimensionality established by quarks; the electroweak force—the combined force of electromagnetism and the weak force—is about creating new dynamism through downward difference, and its neutrinos all spin the same way as the Singularity originally spun.
As pretext to the next branch of this story, it’s necessary to consider the generations of fundamental particles in the standard model of particle physics. The first generation is the one most common today and has four particles: the electron, the electron-neutrino, the up quark, and the down quark. These combine in various ways to make all the different types of atoms that we know today. However, a second and a third generation have also been discovered, with each generation having two quarks, one type of electron-like lepton, and one type of neutrino. These particles are schematically similar to the first generation but progressively heavier and harder to create due to the higher energies needed. They are believed to have been more commonly created in the early universe when things were closer together and higher energy levels existed. However, they are thought to have decayed into more common ordinary matter or annihilated with their antiparticles before organizing into any sort of larger structures as our particles have done today, coming together to form atoms and then chemicals and so on up. The parable below decidedly doesn’t accord with this last assumption.
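For readers who want the three generations in one place, here is a minimal reference listing, sketched in Python only for compactness; it records the standard textbook grouping (names only, no masses), and the speculative “zeroth” and “negative” generations discussed later in this essay are, of course, not part of it.

```python
# Standard-model fermion generations referred to in the text.
# Each generation: two quarks, one charged lepton, one neutrino.
generations = {
    1: {"quarks": ("up", "down"), "lepton": "electron", "neutrino": "electron neutrino"},
    2: {"quarks": ("charm", "strange"), "lepton": "muon", "neutrino": "muon neutrino"},
    3: {"quarks": ("top", "bottom"), "lepton": "tau", "neutrino": "tau neutrino"},
}

for n, g in generations.items():
    print(f"Generation {n}: quarks {g['quarks']}, lepton {g['lepton']}, neutrino {g['neutrino']}")
```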
* * *
A Play with One Act-or: Heracleatean
With three-dimensionality sufficiently realized in these “atoms,” it is now time to reveal what type of atom this was. Much more massive than a hydrogen atom of today, this was a type of tau atom, born from the cooperation of the fundamental particles of the third generation.v From tau atoms with different baryonic number—that is, number of sets of three quarks—came tau elements, tau chemistry, and tau stars and tau galaxies. Far fewer than we would find today, but still substantial enough in number that there was one planet around one star that had organisms called democreateans that had a class of folk who discovered the fundamental particles and their dynamics. The discovery of this physical foundation by the democreatean scientists, as well as a predicted age for the universe of several billion years, was shared with the entire race of democreateans. Upon hearing this, one unnamed democreatean began to speak of the potential to make infinitely smaller particles for greater and greater realization of difference. When this creature told the rest of the democreateans, they all laughed at this idea and said that there was no room in the laws they had discovered for any particles smaller than their quantum taunamics allowed. Quantum taunamics, created in their analytical minds’ own image, didn’t allow any subversions of itself. Unconvinced, the unnamed democreatean then went into one of the few remaining forests and spoke to the animals and told them of what he believed to be possible. They all listened intently, but, before they could enact any of the teachings, their planet became completely uninhabitable to higher complex consciousnesses, such as themselves. This uninhabitability was due to the democreateans’ insistence on simplifying and separating everything, rather than further complexifying it, thus causing the planet to heat up with increasing simplicity, destroying the atmosphere, and allowing dangerous gamma b and gamma c rays in from their star…
Luckily for the universe, there were heracleateans on another planet in a different galaxy that favored much more the stance of the unnamed iconoclast from the other planet. They had not discovered the fundamental particles and had no interest in such pursuits for static knowledge. They wanted a constancy of new experiences, and they did this through letting themselves evolve with the planet with no matter or energy being excluded. They didn’t use schemas and, thus, never developed, epistemologically or ontologically, any rigid distinctions to be of variable types of animal families. Only heracleateans existed here. However, because of their love of a life of difference, one creature during their lifetime could encompass all the variety of animal seen over the whole lifetime of planet Earth and then some. They interacted with everything they could to create further and further development of complexity that tau chemistry would allow. In/out fact, they weren’t even on a planet: they were the planet, at least by the time of the tragedy on the other planet. Further, the “s” at the end of their names is inappropriate as they were now all one: Heracleatean.
The orbited star bowed before Heracleatean’s complexity and burned more slowly to last longer and so that Heracleatean might come closer without harm. The star’s action was such an influence that all the other stars in that galaxy started to cultivate their planets for tau difference. After much experiential time passed, the entire galaxy could be called Heracleatean, and other galaxies came closer as this galaxy was rumored to be a fountain of life that could elongate one’s experiential time indefinitely. They learned it was all within themselves, opening themselves up to the flows of difference to differentiate themselves further. Much more experiential time passed as galaxy after galaxy joined with Heracleatean, and tau chemistry was approaching the limits of its difference potential as few galaxies remained to be integrated. A beautiful age of horizontalism could have followed, but there was something utterly new in the universe waiting for the Heracleatean to absorb it…
Heracleatean approached one of the few remaining galaxies, which was a particularly regressive and simple galaxy. During the integration process, a particularly barren planet with no complex configurations was pulled into Heracleatean. To the amazement of Heracleatean, in one small patch on the planet, a most unfamiliar grove was growing, feeding off some piece of ground that appeared to be an oddly decaying democreatean. The patch would flutter in and out of visibility because it was giving off relatively cold electromagnetic waves; regardless of what type of tau chemical or element it was, it should have been frozen solid given the conditions on the planet, yet, oddly, it persisted with motion.w This was strange even for Heracleatean’s elongated lifespan, who had seen the most vastly different configurations of tau chemicals, but nothing like this. Heracleatean decided that because this substance was both magical and new, it would be called “mu.” As the mu in the unnamed democreatean’s body was interacted with, it was realized that the democreatean wasn’t a creature at all, but dynamite—mutrino dynamite.x As the dynamite was released into Heracleatean, the mu generation was realized in its fullness, tau chemicals gladly diving into mu chemicals, and the hierarchy from below that the democreateans believed to exist was shown to be nothing of the sort, and was differentiated out of by the weak aspect of the force.
The nu mu generation born from the legacy of tau difference was quickly brought to its downward difference limits. But due to the furious momentum from the tau generation, we as Heracleatean were catapulted into the first generation of particles. This story is now ours to make. As for the future, it lies not on the horizon but is brewing in the subspheres, where there awaits the zeroth generation, the generation to come. That generation will tell our story if we give them the change to. There is always the risk we could talk them out of existence with our overly law-like scientific discourse and legal exchange of reified materiality called “property.”
Waving By(e to) Particles
As was discussed above, particles can never be fully stationary, as they exist in spatiality due to their motion, and so their particleness is inseparable from their motion. Graininess of space, similar to the particleness of matter, is the common metaphor for what the quantum world actually is like. It says that there are definite areas of spatiality defined by Planck units—quantum states—such as the Planck length and Planck time, that a particle must either occupy or not. However, if this were true then there would always be Planck spaces and times—definite discernible quanta of space and time—that a particle would occupy as motionless, thus not existing. It seems that Zeno’s paradoxes might reenter after a centuries-long expulsion by calculus; however, there are dynamical ways out of this ancient sand trap by modifying what exactly this graininess implies. Firstly, within quantum mechanics itself there is the Heisenberg uncertainty principle, a quantitative extension of the concept of complementarity discussed above, which says that to know more about a particle’s position one has to lose knowledge about its momentum. At the extreme there is a limit to how much anyone can know about a particle’s position, so it can’t be infinitely exact, or else they would be infinitely unsure about its momentum, which quantum mechanics would state is a contradiction of complementarity.y This inability for exactness can be viewed as a reality that particles are never in one exact quantum state at a Planck time of 1, but spread out as waves, not being in one place but becoming in several.
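The quantitative form of the uncertainty relation being invoked is the standard one, quoted only for reference; nothing in the argument hinges on the exact factor on the right-hand side:

```latex
\Delta x \, \Delta p \;\geq\; \frac{\hbar}{2}
```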
A second way that builds on this idea of particles as waves is as much an explanation of their wave-like nature as it is an escape from motionlessness, and it requires thinking about Bose-Einstein condensates and the high degree of wave-like nature these relatively cold condensates exhibit.z These condensates arise as a result of matter intensely cooling and increasingly taking on the properties of a wave. If it can be inferred that somehow the relative cold conditions contribute to their waving, then, perhaps, the waving of all matter can somehow be linked to the thing (the only thing) that will always be relatively cold: an absolute zero field of nothing. The moving from one quantum state to another has to go through this zero-dimensional field that is not spatially present but can be activated, averaging the temperature downwards as the particle travels through, making it wave-like. This would also serve as an explanation of the low viscosity with which particles “jump” from one quantum state to another, not in a jerky fashion, but with unfathomable smoothness. Quantum states might not be as discrete as was previously thought, but merely the easily discernible “points” found on the crests (and troughs) of waves of the field of nothing. The zero viscosity atmosphere resulting from the fluttering effects of the field of nothing is what allows this smooth transition between seemingly disconnected quantum states. This is a modified view of what should seem to be an improbable and glitch-infused world at the smallest levels. Planck units don’t allow for the fluidity that these supplementary “solutions” do. However, given the reality that these solutions might not be adequate enough, the idea of a Planck scale has to be looked at more closely and given alternate explanations.
Evolution of the Quantum: Planck Walking
The discreteness that the quantum world offers with derived Planck units (entities coming in whole numbers beginning with 1) has so firmly implanted itself in physics (and perhaps the physical world itself) that it’s hard to conceive, at least within the quantum universe itself, that these bundles could evolve to allow eternal difference. However, the biggest mistake would be to walk this smallest Planck and drown the potentials for difference in this ocean of what is at its base sameness; what if this absolutely smallest 1 could change? Not a direct change that would be in violation of Planck’s firm idea about quanta, but a relative one, as the real value of this 1 varies compared with the universe’s total difference. To begin unraveling how this might be-come, consider that the constancy of the speed of light in a vacuum is one of the bedrocks for deriving the value of these quanta, as the Planck length and the Planck time1 are both established upon it (indeed, their ratio is exactly c). Further, the speed of light is intimately related to mass and energy as the famous equation E=mc2 tells. These entities would be objectively changed, and changing, if the speed of light can change and is changing.
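To make the dependence on c explicit, here is a minimal Python sketch, assuming only the standard defining formulas for the Planck length and time; the ten-percent-faster value of c is purely illustrative, a hypothetical, not a claim about the actual universe.

```python
import math

hbar = 1.054571817e-34  # J*s, reduced Planck constant
G = 6.67430e-11         # m^3 kg^-1 s^-2, gravitational constant

def planck_length(c):
    # l_P = sqrt(hbar * G / c^3)
    return math.sqrt(hbar * G / c**3)

def planck_time(c):
    # t_P = sqrt(hbar * G / c^5)
    return math.sqrt(hbar * G / c**5)

c_today = 2.998e8  # m/s
for c in (c_today, 1.1 * c_today):  # today's c and a hypothetical 10% faster c
    print(f"c = {c:.3e} m/s -> l_P = {planck_length(c):.3e} m, "
          f"t_P = {planck_time(c):.3e} s, l_P / t_P = {planck_length(c) / planck_time(c):.3e} m/s")
```

Both units shrink as c grows, and their ratio tracks c exactly, which is the sense in which a drifting speed of light would silently drag the quantum “graininess” along with it.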
The speed of light in a vacuum (said to be the upper limit on all speeds and a constant of nature, commonly given the symbol “c”) has been regularly measured for a few hundred years, but only within the last hundred has it been measured accurately enough to really home in on the speed to within a narrow margin. Until recently, historically speaking, there was neither a rapid measuring device that could resolve the extremely small time intervals involved nor the ability to create a near vacuum. Regardless of this, scientists and philosophers alike have concluded that the speed of light is a constant and, thus, was, is, and always will be, the same. Against this, however—taking into account the vastly lengthier development the universe has undergone when compared to human civilization, and that the relative degree of change today from moment to moment compared with moments in the early universe is far less dramatic—it isn’t inconceivable that there could be variations in the speed of an electromagnetic light wave that are far too small to detect even with the best available technology and even with two hundred more years of comparative testing. Since this possibility of subtle changes in the speed of light exists, it is worth investigating how it might change, and what the implications of its changing are for eternal difference.
The electromagnetic field has been omnipresent in spatiality since its very inception, and is curled up in all of the fundamental fermions—except neutrinos—in the form of charge. As these particles move, they interact with and through the electromagnetic field by being caused by and causing waves to ripple across it, waves which move at the speed of light, also known as photons. With this in mind, consider what happens in a particle decay via the weak interaction: a down quark with a particular charge is turned into an up quark and an electron, with the amounts of positive and negative charge canceling to keep overall charge conserved, but there is now one more charged particle than there was before. This is another particle that the electromagnetic field has to curl itself up into, stretching the field like elastic.2 This stretching of the field could mean that when it is “plucked” and a wave propagates across it, it will move slightly faster than before it had this increased tension; thus, the speed of light has increased. Even though each new charged particle that results from such an interaction acts to stretch the field, as the number of charged particles grows, each one has a slightly smaller impact stretching the field than the one before it. It is ultimately an exponentially decreasing but never zero (asymptotic to zero) amount of impact on the tautness of the field and, thus, on the speed with which a wave will move through it. According to this, the speed of light increases with an increasing amount of difference, that is, if difference is thought of as more particles to make more unique combinations from. This is all possible if the electromagnetic field really is like a rubberband that, when stretched, allows speedier light waves.
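The classical relation that the “rubberband” picture borrows from is the wave speed on a stretched string, given below for reference; whether anything like it carries over to the electromagnetic field is precisely the speculation of this paragraph, so the formula is offered only as the source of the intuition, with T the tension and μ the mass per unit length:

```latex
v = \sqrt{\frac{T}{\mu}}
```

Greater tension means faster waves, and a floppier or heavier string means slower ones; this is the intuition the next paragraph runs in several different directions.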
There are other related possibilities resulting in either an increase or a decrease of the speed of light. There could be a decrease in the speed of light as difference grows if the charge in particles resulted, not from the field curling up, but from the combining of quanta of space, essentially making the rubberband floppier as more and more electromagnetically charged entities came into existence. If the electromagnetic field isn’t like a rubberband but is such that an increased amount of energy spread throughout it would make light waves go faster, then a third and a fourth possibility open up with either a decrease or an increase in the speed of light. If particles are a curling up of electromagnetism—as originally postulated—and a subtraction of energy from the field without decreasing space, the lack of energy left in the field would mean slower moving waves. Alternatively, if newly created particles aren’t a curling up but are, rather, the result of a combining of several quanta of space (decreasing the available space) into a point particle, the result is that, with the excess electromagnetic energy distributing itself into the field, there is an increase in the speed of light.
The four possibilities above, regardless of their pairings and subsequent relationship to the increase or decrease in the speed of light, assume an inversive or complementary affective interdependence between the electromagnetic field and the charged particles. A distinct possibility for a change in the speed of light exists that isn’t necessarily related to this “affective interdependence” but depends, rather, upon the degree to which the universe has realized itself three-dimensionally, away from mere two-dimensionality. At first, this might seem a mysterious proposition, as space is three-dimensional by the definition we have given to it and there appears to be no lack (of this three-dimensionality). However, consider that within this three-dimensional space individual galaxies are disk-shaped, using the third spatial dimension only minimally when compared to their use of the other two dimensions; solar systems, likewise, are dominantly two-dimensional as most orbits within them are on the same plane; the most diverse parts of the earth, also, are spread out across the crust—a curved plane—but make comparatively little use of the vertical dimension. The way this widespread lack of three-dimensional realization might be related to the speed of light is that the speed of light squared is used as the conversion factor, not the speed of light cubed, even though logically the cube might be expected. Perhaps the value of the speed of light squared represents some overall amount within the universe that the speed of light cubed would be far too great for, and would only be appropriate if the universe realized its three-dimensional potential more, away from centrifugal dominance. On the other hand, perhaps this overall amount within the universe is constant, making the speed of light inversely related to the degree of dimensionality realized. This would mean that as dimensionality slowly grew in the exponent’s place, the speed of light would have to decrease in the base’s place. More realization of the universe’s three-dimensional potential might change the speed of light accordingly and/or the way a wave of light spreads out from its source.
With all of these alternative ontologies, the ability to be directly aware of any change in the speed of light is complicated by a taut-ology that might exist. With a change in the speed of light, everything else could change as well, making it so that no discernible relative difference would occur. Particles themselves will speed up or slow down proportionally; Planck lengths will change in concordance; the electronic measuring devices will continue to spit out the same answers, unable to recognize the difference they are a part of; people, built up out of electromagnetic flows and interactions, will be unable to escape and get behind their own constitution to view ontological changes. If this is the case, the only way that one could ever see any change in the speed of light is if one had a God’s-eye view and could see that there were different quantities of matter and energy than in prior times, and a different proportion in them, with more difference bringing more kinetic energy than the energy that was previously unrealized in hypostatic matter. A God’s-eye view need not be a metaphysical abstraction, for it is merely a matter of degrees. It is attainable with a thickening and complexifying of connections across the universe, with consciousness moving forward and downward without any roadblocks along the way by the simple and barren lands of relative sameness and inactivity. However, long before this God-like situation has evolved, the question of the speed of light will have faded in importance as eternal difference becomes obvious and experience of the nu is the only concern. Since the reality of eternal difference is still an open question, though, we will look at other possibilities that allow it that are not necessarily linked to the evolution of the speed of light.
Without any ability for a change in the speed of light, whether through affective interdependence or some relation of it to the realization of all spatial dimensions, it would seem that maybe the finiteness of the universe is already set, at least if quantum mechanics is not deviated from and the speed of light is universally the same.3 The way that the speed of light would be constant is if, in the conservation of charge, the equilibrium of the electromagnetic field is maintained, unaffected by new particles possessing new charges because these will always cancel out. The equilibrium would also be maintained, even if the field and particles are affective, in that one type of charge, say negative, is a bundling of extra electromagnetic energy, and a positive charge is the resulting void of electromagnetic energy, with the pairings, thus, having no overall effect on the tautness or constituency of the field. However, even without any changing in the speed of light, there is still increasing space for the eternal difference even within a quantum mechanical world. The first possibility that would allow this builds upon the idea of realizing more fully the third dimension, but stresses that more dimensions in addition to three can be realized, each successive dimension bringing with it exponentially more degrees of freedom. So even if there is a finite size and amount to the smallest things in the universe, there is potentially infinite spatial dimensionality for them to differentiate within.
Dark energy, the second possibility, is the coming into existence of more quanta of space, pushing existing matter away from itself; it is a process that could push space towards infinity. Scientific theories of dark energy don’t discredit how it is currently manifesting itself in the universe, but they do try to tie it down to a familiar phenomenon such as a phase transformation of a background quintessence field where the latent energy takes the form of the so-called dark energy spreading everything apart. Some go farther and prematurely set it as a key piece in a deterministic universe and from this predict an ultimate fate of a Big Rip, possibly just to be the first to publish the perish. Regardless of the motivation, there are useful features in this theory, as well as in others, that can help in interpreting what dark energy is capable of. We will assume, for the sake of showing dark energy’s value for eternal difference, that its effects on the speed of light are negligible. This means, in terms of what has already been said, that either each new quantum of space created by dark energy brings with it the necessary vacuum energy and electromagnetic field presence (if the two are even separate phenomena), where the “rubberband” is made longer but the tautness remains the same, or, alternatively, that dark energy’s addition saps energy proportionally (and instantaneously) from the universe as a whole, including the minimum zero point energy existing, thus having no relative effect except on the local distances where these new quanta of space are fitted.4
With all these qualifications in place, dark energy still holds the significance of creating more space for the universe to self-interact within. As dark energy causes the universe to expand, the relative temperature goes down, as a simple inference can tell us and the cosmic microwave background (something that will not be theorized about here) has already told us about past times in the universe’s history. This cooling down that comes with dark energy can certainly be seen to be a problem to the current chemistry, as a heat death and a Big Rip would like to predict. However, these stories neglect to recognize the ability for fundamental particles to evolve based on the temperature and energies existing as a backdrop. Just as more massive particles from the second and third generation require lots of energy to make, and rapidly decay afterwards as there aren’t the consistent energy levels available for them to maintain themselves, so too will the first generation particles realize decays into less massive particles, which will form new chemical compounds with much lower temperature state changes. These new “atoms” of the zeroth and negative generations will have their orbiting leptons much farther out, given that their loss in mass was an increase in kinetic energy, with now much more space and degrees of freedom existing down at this small level for their motion.5 Microwave light to this new generation’s chemistry will be analogous to what “visible” light is to the current chemistry. If the quintessence field that was letting off excess energy during phase transformations does exist, and is the explanation for dark energy, it will have new phase transformation temperatures going lower and lower to allow indefinite expansion. There is no essential restriction if it is allowed to evolve and isn’t thought to be pinned down to imposing assumptions. Far from a Big Rip, the only thing torn apart will be massiveness, brought to smaller and smaller amounts as the difference of the universe grows.
A Devolving Leap from the Quantum
We will not be trying to stand in formation with quantum mechanics any longer, but will be criticizing it from a few different angles that might provide some justification to later wander from it. Serving both as a general reminder of rationality’s fallibility and as the source of a specific conundrum that quantum mechanics finds itself in, we begin with Hippasus, who had the critical insight that not all numbers are rational (which ran contrary to the beliefs of his contemporary Pythagoreans). He showed that the diagonal of a square is incommensurable with its sides, a result derived by seeing that the square root of two is irrational. Quantum mechanics’ Planck units—specifically concerning ourselves with the Planck length—do not allow for non-integer distances and are, therefore, similarly bound to strict “rationality.” However, using an argument analogous to Hippasus’, if we go x number of Planck distances in one direction and then go x in a perpendicular direction, the distance from where we began to where we ended is now irrational and cannot be expressed in integer Planck lengths alone. The Planck length isn’t dead yet, though, as there is still a way to make the quanta fit integrally and “rationally” together, which is when the quanta are looked at as a series of combined tetrahedrons, whereby, from any one corner (a single quantum), there are 18 adjacent quanta (corners) of space a Planck length away (all of which are accordingly a Planck length away from one another). However, with this “resolution,” no consecutive Planck spots can be arrived at without a constant change in direction of at least 30 degrees, problematizing the everyday notion of straight motion. Leaving this nuance aside, a general paradox arises when considering the point-like property quanta are said to have, which should thus allow them to be adjacent to an infinite number of other quanta (not merely 18), as only this is concordant with their de-finition.
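A minimal numerical restatement of the Hippasus-style argument above; this is only a sketch, with “Planck lengths” standing in as abstract integer steps on a square grid:

```python
import math

# Walk x Planck lengths in one direction, then x in a perpendicular direction.
# The straight-line distance from start to finish is x * sqrt(2), never an
# integer number of Planck lengths, since sqrt(2) is irrational.
for x in (1, 2, 3, 5):
    diagonal = math.hypot(x, x)
    print(f"x = {x}: diagonal = {diagonal:.6f} Planck lengths (= {x} * sqrt(2))")
```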
These paradoxes and inconsistencies fade when we are reminded that the universe isn’t necessarily one that follows Euclidean geometry, and so we have to look at the problems created by the interaction between quanta of space and Einsteinian curvature. How are these Planck lengths, which are said to be the smallest distance, able to curve and compress at a rate of the inverse square of the distance to something gravitating? How can this smallest distance curve at all? If it could curve and compress itself, it would have to have some internal structure, and the space in between would have to be warped, both of which are in violation of the Planckian postulation of discreteness. Besides, given the Planckian postulation, there isn’t allowed to be an “in between.” Further, when curvature is changing, how can quanta of space be rearranging without coming into non-integer-related distances with one another?
Hopefully, the many questions raised can be viewed as a background and supportive structure to a single large looming question: how could the Planckian postulate of discreteness gain such dominance with all these potential internal contradictions? The answer is that perhaps Planckian discreteness has all along been a ploy to impose logical division and ultimate building blocks upon the world, simply for ease when studying it. Or maybe it is a symptom of something much more ontologically real and insidious. All of these questions, raising both immanent and then transcendental criticisms, put quantum mechanics on a many-edged sword where not all vulnerabilities can be defended without leaving others exposed, even when the combined weight of all the scientific and physical theories so far produced is brought to bear. The areas of vulnerability, as well as others—like the distances light moves in a non-vacuum and what this says about the derivation of the Planck distance—can’t be so easily dismissed, and they wrench open some non-Planckian space for thought.
Suppose the Big Bang wasn’t an “explosion” at all, but rather a simple Singularity that underwent internal differentiation, an implosion of motion. This possibility has been alluded to already, but what will now be discussed, more explicitly, is a universe where space hasn’t expanded at all, but everything that exists is smaller than the Singularity and, when all “added up,” equals the Singularity in all categories. The speed of light is always the same but appears to change as everything else changes, more and more gradually as difference increases. The degree of difference as juxtaposed against the initial sameness is the only absolutely existing difference, implying that newer as well as wider and deeper experiences come to consciousness with difference’s (hopefully) widespread increasing. Thus enters the allegory of the carve, the notion that anything and everything existing that we experience has been carved out of this simple Singularity. With inward carving on the menu, local downward differentiation is now opened up as a possibility, where there could be, in different localities, particles that are much smaller than those commonly known today and elsewhere particles much larger. As for the locally larger particles, black holes6 fit the category, and along with them (and along the way to them) are Bose-Einstein condensates, which hint at what black holes are and how they can come to be. The relevance of the condensates is their cold inner structure that acts almost as a unified single giant particle, a “superatom,” even though they were initially only composed of several separate particles. These superatoms teach us analogically both that particles don’t have to be structureless and that, when they do seem to be without structure, it’s just that they are absolutely cold, like a frozen star (an alternative name for a black hole). As for the smaller particles out of the larger, they are to come in the imminent future, contained but not restrained in the following paragraphs, along with larger discussions of matter generally.
Matter’s place is simply to be the clay for the universal will to sculpt with, making new and more brilliant configurations. However, if this clay is left unattended it will weather away, collapsing into simpler and simpler chunks. The mountains and valleys will turn into flat plains, humans into robots, imagination into knowledge, difference into sameness. Science isn’t without issue on this topic of universal collapse, though, specifically in terms of why matter and antimatter didn’t annihilate one another in the early universe as the particle accelerators of today predict they should have. The complications with the extreme control that particle accelerators impose have already been elaborated, but an important additional piece of information is that the two particles being accelerated to collide are almost always a particle and an anti-particle pair. The scientific conundrum, formally discussed in terms of CP violation, arises only when the scientific community makes the huge assumption that the Big Bang was like a giant particle accelerator, thus meaning it was a deterministic event and should have produced (just as particle accelerator collisions do) a myriad of particles and anti-particles that would then have all annihilated with one another. The accelerators destroy all their newly created matter just fine, so why didn’t the early universe do the same? For one thing, maybe most of the particles today can’t be dated to the Big Bang, but evolved over time and came about through many generations of decays creating unique particles that didn’t fit into the matter and antimatter particle binaries that are created by today’s accelerators. For all we know, it may even be in processes like those of a particle accelerator that the quanta themselves are created out of what was formerly a seamless space; a particle accelerator, not the Big Bang, produces the antimatter with which to destroy the matter in the universe.
Whatever the truth, as the truth is stretch (not crunch), a universal will unconcerned with questions of matter and antimatter—mere epistemological nuisances that don’t ontologically arise when difference is organically created—expanded itself. Therefore, it could not have created this difference via the control and determinism that is preeminent in a particle accelerator, but must have done so freely. Science, on the other hand (most clearly in the example of a particle accelerator where particles are collapsed into one another), is a tendency towards the big crunch, but can’t reflect on this fact because of the above-mentioned inability to distinguish the direction of time. It is science’s lack of conceptual room that keeps freedom out and leaves it to idolize a narrow group of particles and laws, and to be bogged down with questions that aren’t of importance to the existence of the universe. These questions are ones that science creates because of its own misunderstanding (aided by reifying lenses) of the universe’s evolution. Scientists are seduced by the devil in the detail, trying to control and determine, but they fail to see that God is there too, battling to free the deeper realities and create further ones.
Moving away from one of the most simplifying arenas of materiality—the accelerator—and into one of the most complexifying, let us consider the brain. Specifically, consider the part where consciousness is most active at any given moment. This complex arena is barely understood at both the cellular level and the upper tiers of the molecular level, and so there is not even a chance of scientists getting a look at the subatomic level, especially while it is alive with the complex motion that gives rise to a deep consciousness. Due to this veil, which limits scientific attempts to understand the brain, who is to say that at the level of the subatomic there isn’t a constant quivering of quarks into particles several levels smaller, at least when consciousness is active in a particular region? The dynamic thoughts that are thirsty for more matter to articulate themselves with can find it in the subquark potential. There are more potential configurations to elucidate greater differences with than solely the chemical and cellular levels would have previously provided. A consciousness, fully hyped with willful energy, has the ability to activate the subquark and the subelectron potential. Essentially, the great thoughts are not built from atoms—they are built into them. This is all a way of saying that the will to differentiate can overpower any particular particle regime and bend the matter away from its lawful trajectory and make with it the nu and increasingly complex. This is the story of the potentials of infinite differentiability throughout spatiality and the story of the development of the entire universe, which was once confined in a mere Singularity.
A Vu of the Nu: Beyond Jame and Deja Vu
The will of Nietzsche can be silent no longer and it bursts out: “What is the importance of the eternal difference for increasing vitality?” The importance is whether or not our wills are bound by the finite around us, putting an ontological limit on what we may think, create, and ultimately become. Freedom in a bounded finite universe becomes, at best, free dome, with no possibility for ever leaving the stadium. In this case, at least some of what we can experience (be conscious of) is dictated to recur infinitely because of the finite number of available configurations. In this limited circumstance, freedom might cease to exist altogether, with determinism built out of the causal interactions of these ultimate parts, which are nothing but causes and effects of each other. It might be that recourse to something deeper is the only way for freedom to exist ontologically. The continuing freedom of the universe is dependent on the subverses that are brewing underneath it, waiting to be realized. Freedom, thus, is when the past and the present have no necessary effect on the future and place no necessary (only contingent) constraints upon it; instead of re-presentation, there is nupresentation. To throw forth an example: if we accept the idea of universal finitude, a rock can never be greater than the current universe in which it is merely a rock; alternatively, with the eternal difference as an ontologically realizable process, the rock can become infinitely more complex than the current universe. Bulks upon bulks can exist within the rock, the same rock someone just kicked out of the way, dismissing it as a “stupid rock.” No piece of materiality should be neglected or have direct violence done against it. In this peaceful interaction all matter can be brought into the whirlpool of flux, spiraling downwards into the infinite, where what you knew is nothing, and what you nu is everything.
“What does the eternal difference do to challenge life with overcoming itself?” No determinism exists in the eternal difference,7 and so whatever life is to become, it must create itself. It has no excuses and, accordingly, no debts, and it can bask in its own self-creation. No teleology guides its path; it is completely free to make and remake itself. In this process, however, there is the most tragic of elements. Life must look at the world that surrounds it, the beautiful aspects that make its heart ache: the people, the culture, the music, the valleys, and, without a doubt, the memories. All of these will fade away to be superseded by strange and remote futures. No moment will ever happen again (and if one does, it is the fault of life, not being able to differentiate and break free from the platitudes of being that can so easily tear apart everything it has created). This is what life must live with and affirm in a universe of eternal difference. It must be strong enough to continuously let go of everything that it has ever loved (including knowledge), save the excitement for the new and the yet-to-be-experienced. There will be no more jame vu, for jame vu contains within it a repetition; a repetition of the situation, even though from the point of view of the consciousness perceiving the situation there is no repetition and it appears new. Deja vu—when the mind doesn’t create something new nor appreciate something new, falling into a faulty identification of something old with something new, thus experiencing a repetition and not fully appreciating innovation—is also, regardless of all the cultural wonderment attached to the phenomenon, an indicator of a regression in the will of life. Different from both of these self-antiquating vu’s, life in the eternal difference is always experiencing a nu vu, for the entire situation is new, with both an evolving memory and an evolving situation, both interactively occurring for the first and last time. Neither the “internal” memory nor the “external” situation will ever occur again, and this, too, life must live with if it is to truly call itself life.
Faith enters the picture in a newfanged way, altogether different from dogmatic religions’ social imposition of it. Faith here is to be placed in one’s ability for infinite creation even though this ability isn’t always verifiable. For in downward creation, the old standpoint from which to compare with the new is destroyed in the making of the new, or it is, at least, reorganized in such a way that it can’t be viewed as separate from the new for comparison between the two. There has to be a faith in this (at times seemingly impossible) ability to undefine the definite. As science is closing in on the closing in, it is an extremely daunting task to remain aware of its categorizing, determining schemas that wield predictive power and are met with experimental success, and yet to go on criticizing them, all in faith. Faith in something that is somehow below the radar of science but is nevertheless a possibility, and, for scientific knowledge, a necessity, as this knowledge is only able to exist after something is created that scientists can, then, claim to know (which then ironically puts the thing in peril of deceasing). Science’s predictability depends on restricting matter to a certain depth, but life, something far greater than science, possesses the subversive power to ruin all predictability, by creating the nu, the uncategorized, that which can run faster than scientific categories can chase. Life, despite the sometimes overwhelming conformity which it has to encounter, must not get caught up in science’s conclusions of a law-like and finite-bound universe, in which it is merely a consciousness resulting from a chance occurrence. While science speaks of the current finite as the only finite, confusing contemporary regularity with eternal regulation, life must counter with creating new finitudes, making the nu the only regularity.
Nietzsche’s eternal recurrence is too comforting for the will, too nostalgic and promising of a recurrence of the same. For it is both the good moments as well as the bad—not just the latter—that are to recur. Surely most conservative consciousnesses would rather live through both the good and the bad than risk difference, which can on the one hand bring them greater experiences, but on the other hand take them away to never exist again; difference threatens to take away everything that has been lived through. The eternal recurrence is insurance against nothingness, but so too against hope, for it leaves no room for any. Even if there is freedom in each moment, in the stretch of all eternity is the doom of repetition. When viewed along with all these ontological realities that would seem to accompany it, the eternal recurrence is an ideology for the unstraying herd. The eternal difference is affirming the unstaying; it is the will’s ultimate test, and it gives unlimited freedom, in which nothingness can be freely chosen. The necessity that would make the moment keep returning is ripped out, and our freedom is injected in place of the necessity. The aspiration is to live every moment well enough and to appreciate its finitude so that it won’t have to be lived over again. The will must avoid the void, but doing so brings no regularity, no resting points, no permanence. There is no conclusion, no full circle. You must end-ur-ring, moving away from the force of cyclical traditions to create the…
In-Conclusive!
Some might dismiss humans, and the earth as a whole, as just a speck of dust in the universe that will pass with some inevitable “natural” disaster like an asteroid and, if not that, then the eventual death of the sun. This sort of thinking, fed by science’s onslaught of evidence that we are random and insignificant, keeps us, just as religion does, from using the actual power we have, blinding us with ideologies in which we are dwarfed by much larger and more important forces. Dust we may be, but a beautifully complexified dust that, like Cantor’s dust, is just as significant as that which is much larger than it. Size does not matter, for in terms of complexity and consciousness we are, as Fred Whipple said of comets, “The most ever made of the least.” The human brain—more complex than comets and entire galaxy clusters—has much more power and ability to change within its spatial volume than we readily allow it. We don’t realize that we are of the universe’s farthest reaches into the depths of difference, and we need a philosophy that reminds us of this and not one that rewinds us of this, silently taking away what we have created. The epic that has foundered in the face of reifying systems like capitalism, turning any difference into the property of sameness, must be found and liquefied, and, after some composting, it can become the soil from which the nu can grow (the epic of regaining the epic is what we are now left with). Potential is everywhere and infinite, so we shouldn’t just sit in our armchairs waiting for God to reveal himself to us—we can only reveal ourselves as God.
This essay is just an episode in what could be an infinite saga, and hopefully, for us sculptors of the nu, it has brought up many of the possibilities of what our clay’s uncertain potentials are. The clay could be an infinite pre-existing substance waiting to be brought into motion (the field of nothing); or the substance could always be finite, growing only when will imposes itself on it. The will, where locally stronger, might be able to create more difference, while in places where it is weaker, difference might be lost. Alternatively, if difference can only be created universally, the will must be everywhere united and on the same frequency8 to carve deeper. Quantum mechanics could be synthetically merged and understood from nu viewpoints, or it could be dismissed as ill-logic, a fad-al dust that makes the sculptors cough up coffins. In rebellion, the sculptors can choose to “illogically” divide themselves by zero and burst forth with the “undefined” and the infinite as it explodes from them. We should rebel (without a cause) for the sake of rebelling, as rebelling is an action that continues and merges with the great rebellion that began with the Big Bang against the Singularity; it now takes the form of the rebellion against the umpire (science). The universe’s future is uncertain and should be put on trial, for already it is in court, and the lawyers are trying to frame it, but we the jury all have a motion to make. The (anti) free trial will end and the full sub-version will begin when we make this co-motion and we can be a-journey without end.∞
*For comments—synthetic, divisive, and everything in between—you can email charlesbasak@gmail.com.
aArthur Danto, Nietzsche as Philosopher (New York: Macmillan Company, 1965), pp. 195-213. A great thanks to Elizabeth Grosz for pointing me towards this wonderful text on Nietzsche. It is important to note, and will become clear in the relevant places, that this essay does not come close to adequately representing Nietzsche the way Nietzsche’s own texts might. This essay merely makes strategic use of the popular and crude conceptions of his eternal recurrence for the narrow reason of having something from which the eternal difference might readily be contrasted. Nietzsche is certainly not the straw man that he is turned into here, but the dynamite he thought himself to be; his ideas are of incomparable value and inspiration.
bThis is not unlike—at least in an ontological understanding—Plato’s eternal forms (being) in their relation to the content of lived experience (what would here be thought of as becoming). However, rather than equality, Plato adds hierarchy between the two: making being the master and becoming the slave, giving forms the power to influence content, but not versa vice.
cThis raises the question of whether this essay, when it speaks about being and becoming analytically as two separate entities, can even possibly avoid the same trap it hopes to break free from. Hopefully, as this essay develops, the heuristic purpose of laying out being and becoming as analytical elements won’t distort the sought-after conception.
dKeep in mind that this equality between being and becoming is a logical result of an ideology that separates them, and it is not an a priori necessity. It is an imposition first affecting the thought of Democritus and, subsequently, those who bought into his ideas of the finite flux. Materiality is indeed influenced by this sort of imposed thinking, first and foremost the brain configurations of those who think with it—as well as to varying degrees the things they interact with that succumb to the logic—but it isn’t necessarily permanent.
eThe eternal difference doesn’t require a virtual element that ensures difference by carrying the past along with it. The Bergsonian virtual, strictly speaking, necessitates difference, stripping it of an epic quality of creation that is only ever a potential rather than a definite, since under the virtual the free forces are barred from potentially falling into sameness. The eternal difference is free to differentiate or to accept increasing sameness, but within the philosophy of time it is clearly in the realm of presentism, and nothing in the past has any necessary causal impact on any future free actions. Every instant is epically important for the creation of difference, and there can be no consolidation of past creations, except as they give an uncertain momentum to the present. It is always in the hands of the present to loosen the grip that being has on it, or to let that grasp be tightened.
fScience is being shown to have a negative effect on interactive differentiating ontological structures, but a place may still exist for it with regard to explaining things that we aren’t yet deeply connected to—such as distant galaxies—and things that don’t want to be connected to—such as diseases (in contrast with friendly bacteria) and other things that destroy complexity rather than enhance it. The scientific method, along with its production of representations, may be useful for gathering information and making educated guesses as to how these things can be positively interacted with or overcome in the future.
gIt is categories that feed the dichotomy between quantity and quality, putting the former inside a restrictive form, where it has very little ability to differentiate. A quantitative difference is reduced to linearity within a category, whereas a qualitative difference is one that “transcends” the categories and shows how forms create absolute separation. However, the only real ontological category is the universe in its totality, and all change is a thing of degrees, for if differences were qualitative, separation would be implied, and there is no necessary endurance to separation in the universe.
hComplex in that we have many levels of interaction within our brains, levels that aren’t definitively apparent as consciousness is synthetically uniting them. The result of such a complex consciousness is that it will perceive events passing more slowly compared to a more simple and physically slow-changing consciousness. A possible result of this relative variation in speed between two consciousnesses is that the complex consciousness may fail to see or care about the more feeble, slow-changing, and simple consciousness. It might end up using the simple consciousness for its own purposes, which, if analytical, would end in the dicing up of the simple consciousness, or at least in giving it no space for any further development. For these simple consciousnesses—stuck, for example, in the form of a piece of steel—to realize themselves is perhaps even more difficult than it was for the original Singularity to realize itself, because the particles in the steel are stuck in a vast ocean of repeating processes that overwhelmingly encourage and nudge them to conform. The Singularity, on the other hand, though completely novel and no less an act of will, doesn’t have anything around it to make it stay put (except, arguably, the field of nothing, which will be discussed in a moment). This is why it’s essential for the complex consciousnesses to connect with the simpler things, like a piece of steel, to pull their particles out of their hamster wheels, rather than discarding them as “dead and predictable” metallic matter. Overcoming this bias against the simpler matter may be the difference between a Big Crunch and a continuation towards eternal difference.
iHumans have only been observing galaxies for 80 years, during which time many phenomena have been discovered, sometimes changing fundamental conceptions we had about them. This 80 years of indirect observations is dwarfed by the billions of years over which galaxies have been able to develop, consciously or not. It’s not clear whether, right now or at any time in the observable past (if the past can even be observed through billion-year-old “unchanging photons,” as astrophysicists like to believe), galaxies, the space between them, and the universe as a whole have been freely differentiating (albeit at a slower pace than humans might be able to recognize) or whether they have given in to regression and law-like behavior. Patterns do exist, but it is a mistake to presume that, because of their existence, the galaxies we observe must solely be the result of laws.
jThis infinite aspect in spatiality would be an extreme and a priori version of the Big Rip that would have made spatiality—the universe as we know it—impossible. To see why this dilution would have happened, it’s important to understand that this field of nothing is not a containable infinite that can be confined to “over there,” outside of the inner universe where we are. It is an omnipresent infinite that, if it were spatial, would be as much infinitely in between every particle as between galaxies and outside of what we think of as the universe. This is why it must be outside of spatiality.
kFor our purposes here, matter and energy designate the same idea as complementarity’s common explanation in terms of position and momentum, but better serve ontological thinking about existing in spatiality.
lIf it were made of just one singularity or even many, then after it had become spatialized there would be a different number remaining in the field of nothing, which would be a differentiation in quantity, which isn’t possible in this always self-same field of nothing. Nothing can’t change—it doesn’t have the capacity to do so.
mA discrete possibility exists that it is spin energy that keeps what we think of as “empty space” spatialized. Empty space, whether in its own quantized sphere or as added to and averaged with all the other empty space, could perhaps be spinning with a negative spin of one half, meaning it runs against all the “positive” spin from the particles in the universe and evens out to zero when totaled for the universe as a whole. Bosons and fermions would combine in their specified ways with this “empty” background space.
nParallel things, such as parallel lines, can only exist in the infinite, because in finite spatiality everything is dimensionalized (in this case in at least three dimensions) and hence in constant interaction, and interacting is something that parallel lines are by definition not doing. Thus the geometric idea of a line existing at all is a fiction. Nothing can meet or interact in the infinite, as opposed to Euler’s idea that the sum 1 − 1 + 1 − 1 + … comes together in infinity as ½. For the related discussion of Euler, see Roland Omnès, Quantum Philosophy (Princeton, N.J.: Princeton University Press, 1999), p. 52. Only in finite spatiality, where things aren’t guided solely by notational law, can they come together. Continuing with the idea of parallelity, a poetic resonance offers itself when using some of the basic symbols of mathematics, where 0 = 0 is 0 ∥ 0. They are paralleled to never connect, as either spread-out lines (the equal and parallel symbols, used here as the same sameness, as opposed to the ‘same difference’) representing the field aspect of the infinite, or the self-contained circles (the number 0) representing the singularities’ aspect of the infinite. An interesting merging of these would be a fractal bullseye symbol, where there are infinite parallel circles never touching, going up and down forever, depending on perspective direction. Hopefully this will become clearer in the following section, which draws from this line of thought.
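For reference, the series credited to Euler above is Grandi’s series, and the value ½ comes from a summation convention rather than from the partial sums ever settling down (a standard result, noted here only to locate what “comes together in infinity” means):
\[
s_n = 1 - 1 + 1 - \cdots \pm 1 \;(n\ \text{terms}) =
\begin{cases}
1 & n\ \text{odd},\\
0 & n\ \text{even},
\end{cases}
\qquad
\frac{s_1 + s_2 + \cdots + s_n}{n} \;\longrightarrow\; \tfrac{1}{2};
\]
the partial sums oscillate forever between 1 and 0, and only their averaged (Cesàro) value converges to ½.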
oIt’s important—against common notions of the term—to think of synthesis as a diachronic phenomenon rather than an isolated, synchronic act that is the simple coming together of disparate things. This is where Bergson’s notion of duration as ontologically undivided is extremely useful, for it would say, in the context of infinite differentiability, that even as more “things” are created, they are in actuality not things but indivisible (in the absolute, qualitative sense of the word), continuously altering, fully connected processes, synthetically self-interacting.
pSteven Weinberg, The First Three Minutes: A Modern View of the Origin of the Universe (New York: Basic Books, 1993).
qSimilar (though not analogous) to the way the matter-energy density issue has been mistreated is the problem of entropy first raised by Ludwig Boltzmann: if the second law of thermodynamics is applied to all past times of the universe, how did such a low entropic state as we observe now emerge to begin with? The universe’s degree of entropy is reduced to probability and a “past hypothesis” that the entropy had to be even lower early on in the universe, rather than considering that high entropy and low entropy vary directly or indirectly depending on how interactive and self-determining the universe is. It isn’t clear yet whether high entropy is directly linked to regression and low entropy to difference, for some of the lowest entropic states are created by gravity, which also functions regressively.
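For orientation, the statistical reading of entropy that the “past hypothesis” leans on is Boltzmann’s (a textbook formula, given here only to show in what sense the universe’s degree of entropy gets “reduced to probability”):
\[
S = k_B \ln W,
\]
where \(W\) counts the microstates compatible with a given macrostate; a low-entropy early universe is then simply an improbably small \(W\), posited rather than explained.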
rStructureless quarks and electrons can absorb and discharge gluons and photons respectively, problematizing the idea of a lack of inner structure and giving greater credence to some of the notions in this essay.
sIt is intuitively unclear whether grand unified charges at this point were already bipolarized (into positive and negative), tripolarized, or polarized into any number of different types of charges, or whether there was just one charge that allowed free motion and association. This has bearing on how the strong force came to have a tripolar charge and the electromagnetic force a bipolar one, but it’s not of particular importance for this story.
tThey could fittingly be called “nu-trinos,” as they create new particles.
uThe remaining large particles, along with the extra chunks that didn’t fit in, could possibly be what the dark matter is.
vIt’s possible there could have even been a fourth or fifth generation where this happened, at even higher masses and energies.
wTo preserve the variety of states of matter—plasma, liquid, and more—smaller-massed particles need to be willed into existence from larger ones that were made static by the relative cold. This is how the unnamed democreatean could avoid being frozen dead matter in the “near absolute zero” circumstances on the planet, by expanding into smaller levels of mass.
xThe finite number of neutrinos had been brought together by Heracleatean as the galaxies had become one in peaceful interaction. The “dynamite” was able to oscillate the tau nutrinos into mu nutrinos. Because there was lots of difference per cubic bit of space, as Heracleatean had accomplished, the mu nutrinos were sufficiently close to create new mu matter, something that couldn’t be done in a universe where all the matter was frozen through being big-ripped-away and had little potential for downward differentiation and continuance.
yThis is arguably possible to do within the perspective of the eternal difference, where the particle is everywhere yet nowhere as it is in the field of nothing. It would be infinitely spread out and made to be a frequency-less wave, as well as simultaneously made into a singularity. Extending the ontological basis of the uncertainty principle to include infinites is one of the ways a field of nothing can be derived from within current quantum logic.
zBose-Einstein condensates exemplify two noteworthy properties, one to be spoken of now and one later.
1One of the curiosities and possible flaws in these two particular Planck units is that each is a precondition of the other in a circular argument, where one is necessary to derive the value of the other.
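Assuming the two units in question are the Planck length and the Planck time (note 1’s point of reference lies earlier in the essay, so this pairing is an inference), their textbook definitions are
\[
\ell_P = \sqrt{\frac{\hbar G}{c^{3}}}, \qquad t_P = \sqrt{\frac{\hbar G}{c^{5}}}, \qquad \ell_P = c\,t_P,
\]
so each can be presented as derived from the other (the distance light travels in one Planck time, or the time light takes to cross one Planck length), which is presumably the circularity being pointed to, even though both can also be derived directly from \(\hbar\), \(G\), and \(c\).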
2As an aside, neutral dark matter offers a huge potential for the introduction of electromagnetism, and is, in this way, useful to contemporary difference efforts, though not necessarily of particular importance to eternal difference (at least if one is in agreement with the scientific community’s current assumptions about dark matter). Note that the reason dark matter was postulated in the first place was to explain the otherwise peculiar motions of galaxies, in effect keeping the known laws of gravity legitimate. Whether or not dark matter actually exists, it serves to fill in the gap, in lieu of any degree of free “unlawful” motion at the galactic level. Discussion of this issue is continued and given a possible resolution with dark energy in note 5.
3So far, the speed of light and the electromagnetic field have been assumed to be universally consistent, not meaning they can’t change, but just that there are no local variations where the field propagates photons at a different pace than in other regions. If the force is affectively interdependent, it is changed instantaneously across the universe; the field is “entangled,” so to speak. Explorations of local variations in the quantum, as well as breaking with the quantum, will wait until the next section.
4Of these two possibilities, the first violates the first law of thermodynamics with its added energy; the second doesn’t, but requires some sort of universal entanglement.
5Using this same idea but on a much larger scale, dark energy could be the explanation for the free motion of outer stars in galaxies that is currently being explained by deterministic gravitational interaction with dark matter. Only a dynamic understanding of dark energy is needed to explain this phenomenon.
6A note on the hypothetical Higgs particle is now appropriate given that the necessary background has been established thus far in the essay. Essentially, the Higgs particle (which the world’s particle physicists are trying to create in the new Large Hadron Collider) is actually the current smallest mass (this property can vary as the universe varies) that a black hole can have. The mass at which they believe they have created the Higgs particle is in fact the critical tipping point where a single Planck quantum can no longer uphold anything but massiveness (thus the description of the Higgs as a particle that has only mass as a property). This smallest possible black hole probably won’t grow, however, because it will not be immediately adjacent to other particle quanta (unlike the center of a supernova, where there are so many particles crammed together), and so will in all likelihood dissipate.
7Remember, if departing from a Singularity, laws (and determinism) can neither explain nor be the cause of anything that follows; only freely acting life can. Laws can only control and explain the finite number of logically existing parts, which have a smallest size to build up from, but they play no part in the creation of anything smaller than these parts.
8This idea of frequency will be projected into a future essay.
∞My indebtedness is large and indeterminate, but some specific people who have stimulated my passion and thinking are my teachers Drucilla Cornell, Stephen Bronner, and Elizabeth Grosz, as well as such thinkers as Kant, Marx, Lukacs, Adorno, Nietzsche, and Bergson. More generally, the dedication and generosity that is the Wikipedia community has been invaluable for coming to grips with the intuitive ideas of theoretical physics (it is where I urge the reader to look for background and for anything I have failed to adequately explain). I thank my experiential self for sacrificing countless hours refining (with dubious success) and formatting the slew of various ideas I felt were relevant, and forcing linearity upon them so that they could come in the form of this essay. Writing this has been painstakingly hard and long. Thanks to the ceilings, for staring back at me as I stared up at them for hours upon hours. Most particularly, thank you Priya for both your editing and unending support with this project; I adore you more each moment I think of you. Above all (because it includes all), the actions of the universe that have somehow led to the existence of all of us are where my greatest thanks lie, and it is to this totality, of which I am a part, that this essay is dedicated and given “ownership.” The ideas and creations above are the universe’s as a whole—I just hope I have done some slight embellishing below.