Relativistic pedantry

I must say first off that I teach math and computer science, and was never qualified to teach physics. But I am interested in physics, and got drawn into a physics discussion about how time does not stretch or compress in the visible world, and how this is why, in most of science, time is always the independent variable, stuck for most practical purposes on the x axis.

In the macroscopic world, time and mass are pretty reliable, and classical mechanics agrees so closely with Einstein’s formulas (those associated with the Special and General Theories of Relativity) that we prefer to stick to the simpler classical equations, since they are great approximations so long as things move well below the speed of light.

I am not sure (is anyone?) about how time is influenced by things like gravity and velocity (in particular, the formulas stating how time is a dependent variable with respect to these things), but I remember an equation for relative mass, which doesn’t use time, that provides some insight into relativity:

\displaystyle{m(v) = \frac{m_0}{\sqrt{1 - \frac{v^2}{c^2}}}, \qquad \lim_{v \to c^-} m(v) = \infty}

Here, the independent variable is velocity, and it is evident that even for bodies that appear to move fast (on the scale of 10,000 to 20,000 km/h), the effect on this equation is negligible. Rest mass and relative mass are essentially the same, and a body would have to move at nearly the speed of light for its mass to change significantly. Indeed, as the velocity v gets closer to the speed of light c, the mass shoots up to infinity. I understand that Einstein stated that nothing can move faster than light, and this is supported by the above equation, since exceeding c would make the quantity under the radical negative.
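
To put a number on how little ordinary speeds matter, here is a quick back-of-the-envelope calculation (using rounded values of v and c) for a body moving at 20,000 km/h, or roughly 5,600 m/s:

\displaystyle{\frac{v^2}{c^2} = \frac{(5.6 \times 10^3)^2}{(3.0 \times 10^8)^2} \approx 3.5 \times 10^{-10}, \qquad m(v) = \frac{m_0}{\sqrt{1 - 3.5 \times 10^{-10}}} \approx m_0 \left(1 + 1.7 \times 10^{-10}\right)}

The relative mass differs from the rest mass by less than one part in a billion.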

It does not escape my notice that velocity is supposed to depend on time, making the function m(v(t)), but time itself warps under things like high velocity (as well as high gravity), so that time depends on … ? This is where I tell people to “go ask your physics prof” about anything more involved.

Satellites like those used for GPS move within the range of 10,000 to 20,000 km/h, roughly 20,000 kilometres above the Earth’s surface. My assertion is that there is not much change here in relativity terms. But the effect is still large enough to keep the makers of GPS systems up at night, since failing to account for Einstein’s equations in the time calculations would cause GPS to register errors in a person’s position on the globe on the order of several kilometres, rendering the GPS functions on cell phones essentially useless.
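
For what it’s worth, the figures commonly quoted in the GPS literature (I am taking these on faith rather than deriving them) are that the satellite clocks run about 45 microseconds per day fast because of the weaker gravity at their altitude, and about 7 microseconds per day slow because of their orbital speed, for a net drift of roughly 38 microseconds per day. Since positions are computed from signal travel times, leaving that drift uncorrected accumulates a position error on the order of

\displaystyle{38 \times 10^{-6}\ \mathrm{s/day} \times 3 \times 10^{8}\ \mathrm{m/s} \approx 11\ \mathrm{km/day},}

which is consistent with the “several kilometres” figure above.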

My companion was trying to make the latter point, whereas I was thinking much more generally. We stick to classical mechanics, not because the equations are necessarily the correct ones, but because they are simple and lend a great deal of predictive power to the macroscopic world around us.

Compiling the linux kernel docs

In the last article, I said that compiling and installing source versions of software was akin to “going rogue”. I must confess that I have compiled from source and installed software that wasn’t in my distribution, most recently TexStudio, one of the larger projects, requiring tons of other libraries and whatnot to also be installed (or, quite often, compiled from source on the side), since it wasn’t a part of the Linux distro I was using at the time. It also wasn’t a part of Cygwin, and I compiled it for that too. It was a great way to kill an afternoon.

But there was a time that I compiled the kernel from source. It was necessary for me, since speed was an issue and I had slow hardware at the time. What I also had was a mixture of hardware pulled from different computers at different times. I researched specs on sound cards, network cards, video cards and motherboard chipsets, and knew which options to tweak in the kernel compilation dialogs, so I could get the kernel to do the right thing: which is to be fast and to recognize all my hardware. I was doing this before the days of modules, with the version 1.x kernels. It worked, and it was noticeably faster than the stock kernels. X-Windows on my 80486 PC ran quite well with these compiled kernels, but was sluggish to the point of being unusable with a stock kernel. Every few versions of the kernel, I would re-compile a new kernel for my PC, and pretty soon, with the tcl/tk dialogs they had added, things were pretty easy, and I could answer all the questions from memory.

But then that all ended with version 2. Yes, I compiled a version 2 kernel from source, and yes, it ran OK. But it also had modules. The precompiled kernels were now stripped down and lean, and the modules would only be added as needed when the kernel auto-detected the presence of the appropriate hardware. After compiling a few times, I no longer saw the point from a performance standpoint, and today we are well into kernel version 5.3, and I haven’t compiled my own kernel for a very long time.

For the heck of it, I downloaded the 5.3 kernel, which uncompressed into nearly 1 gigabyte of source code. I studied the config options and the Makefile options, and saw that I could just run “make” to create only the documentation. So that’s what I did.
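
For anyone wanting to reproduce this, the documentation has its own make targets in the 5.x build system (quoted from memory, so treat the exact invocations as a sketch rather than gospel):

# from the top of the kernel source tree
make htmldocs     # Sphinx-generated HTML, written under Documentation/output
make pdfdocs      # the LaTeX/PDF build, which needs a full TeX installation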

It created over 8,500 pages of documentation across dozens of PDF files. And 24 of them are zero-length PDFs, which presumably didn’t compile properly; otherwise the pagecount would have easily tipped the scales at 10,000. The pages were generated quickly: the 8,500 or more of them, errors and all, took about 3 minutes. The errors seemed to manifest themselves as the associated PDFs not showing up under the Documentation directory. I have a fast-ish processor, an Intel 4770K (a 4th-generation i7), which I have never overclocked, running on what is now a fast-ish gaming motherboard (an ASUS Maximus VI Hero) with 32 gigs of fast-ish RAM. The compilation, even though it was only documentation, seemed to go screamingly fast on this computer, much faster than I was accustomed to (although I guess if I am using 80486s and early Pentiums as a comparison …). The output to standard error from the LaTeX compilation was a veritable blur of underfull hbox’es and page numbers.

For the record, the pagecount was generated using the following code:

#! /bin/bash
tot=0
for i in *.pdf ; do
        # skip the zero-length PDFs that failed to build
        if [ -s "${i}" ] ; then
                # pull the page count out of the pdfinfo output
                j=$(pdfinfo "${i}" | awk '/^Pages:/ {print $2}')
                # give a pagecount/filename/running total
                echo "${j}    ${i}    ${tot}"
                # tally up the total so far
                tot=$(( tot + j ))
        fi
done

echo "Total page count: ${tot}"

The next step for Linux development

As you might know, there are nearly 300 Linux distributions (currently 289, which is low in historical terms), and this is a testament to how successful the Linux kernel has become on the PC, as well as on other devices, especially in relation to previously-existing *NIX systems, which have either fallen by the wayside or barely exist in comparison. A distant second among the *NIX systems might be BSD UNIX.

Just earlier today, I observed that for the installation of TexStudio, for instance, there are two installation images for MS-Windows (all versions from Windows 7 on up), the only distinction being between 32 and 64-bit. On the other hand, there were a plethora of Linux images, all depending on which distro of Linux you used. My distro is Ubuntu Studio, and I use GNOME as the desktop environment. The only Ubuntu-based images were for Xubuntu (which uses Xfce as its desktop environment).

It also seems to be necessary to compile a separate image each time a Linux distro is upgraded. The 19 images I counted for Xubuntu covered versions 14 through 19. Now, I understand that separate images need to be compiled for different processors, but most of these are for PCs running 32 or 64-bit processors. The same was true for each upgrade of Debian, Fedora, or OpenSuse. And even then, they needed separate binaries from each other. There are easily more than 50 Linux-based installation images you can choose from at the moment.

The “package” system that is now near-universal in the Linux environment provides a convenient way for sysops to assure themselves that installations can happen without causing problems on the system. Before that, one compiled most new software from source, tweaking system variables and modifying config files to conform to whatever was on your system. This has since become automated with the “./configure” scripts or “make config” targets that most source trees have these days. In other words, modernization of Linux seems to mean increasing levels of abstraction, and increasing levels of trust that the judgement of a configure script trumps human judgement of what needs configuring for the source to compile. On a larger scale, trusting a package manager over our own common sense works “most of the time”, so when an installation fails due to a package conflict, there is a temptation to be lazy and just find something else to install instead of our first choice. Installing software by compiling from source, once seen as the rite of passage of any sensible Linux geek, is now seen as “going rogue”, since it subverts the package manager and, in a sense, takes the law into your own hands.
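
For readers who never lived through it, the ritual being displaced looked roughly like this (a sketch only; the package name and install prefix here are invented):

# the classic three-step build, from a downloaded source tarball
tar -xzf some-package-1.0.tar.gz
cd some-package-1.0
./configure --prefix=/usr/local    # probe the system and write the config files
make                               # compile
sudo make install                  # copy binaries, libraries and docs into place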

Of course, Linux installations still exist for the latter kind of Linux user. The foremost, in my opinion, is Slackware (if you screw up, at least something will run), and a close second is Arch Linux. It is my understanding that Arch Linux requires much more knowledge of your own hardware in order to even boot the system, whereas Slackware will likely at least boot if your knowledge of the hardware is not quite so keen (but still keen). My experience with Slackware is in the distant past, so I am not sure what the norm is these days, although I understand they still use tarballs. I remember that this let me play with a package by un-compressing it into a directory tree not intended for the installation, to see what was inside before committing to installing it. The tarballs are compressed nowadays with “xz” compression, giving the files a “.txz” extension.
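
That trick still works, since a .txz is just an ordinary tarball compressed with xz. You can list or unpack one into a scratch directory before committing to an install (the file name here is invented for illustration):

# list the contents without installing anything
tar -tvf some-package-1.0-x86_64-1.txz
# or unpack into a scratch directory to poke around
mkdir -p /tmp/inspect
tar -xvf some-package-1.0-x86_64-1.txz -C /tmp/inspect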

But I digress. Getting back to installation images, it should not be too difficult for the people who manage these Linux distros to make it less necessary to have so many different images for the same distribution. In the MS-Windows example, only one version of TexStudio was needed across three or four different Windows versions. I am running Windows 7 with software that didn’t exist in the days of Windows 7, and with other software that originated on Windows 2000. All of it still runs, and runs quite well. Fixing this problem is hopefully do-able in the near future.

Loving your enemies

Over the past 4 years of living in my apartment high above the city, I have had to share my balcony with a number of freeloading pigeons. That may not be so bad, except they poop profusely, and leave me with the mess to clean up. I have grown to be less than loving of my fellow feathered creatures, being woken up by their cooing and their congregating on my balcony.

When looking up pigeons on Wikipedia, I learned that pigeons are not native to North America; they sailed from Europe on the boats with the settlers. Biologists would say, therefore, that they are an “invasive species” on our continent, and don’t have any natural predators here to keep their populations in check. And they are, like mice, rats, and cockroaches, animals which follow human habitation everywhere, meaning that they are found everywhere in North America humans live: on the coasts, in the prairies, in valleys and the craggy ledges of mountains. And on my 14th-floor concrete balcony. It feels almost like home to them.

They are so plentiful and aggressive that it does no good to harm or kill them. You can cover the surfaces where they like to go with a kind of spiky bird repellent (I haven’t tried that idea yet), but that’s about it.

For these much-loathed birds living in close quarters with me, it would seem that I must decry their serious drop in number over the decades, and advocate for their continued survival. It would seem.

An article in Science magazine published 3 days ago, with the dry-sounding title “Decline of the North American Avifauna”, has decreed that, according to its close look at the situation, bird populations have declined by 29% since 1970, amounting to a decrease of over 3 billion birds. The species whose populations have faced the steepest declines are those that are common in North American cities, such as sparrows, blackbirds, and starlings. There are others that are actually increasing, such as ducks and geese. The Canada goose, while a beautiful large bird and graceful in flight, has also become a nuisance in many places, having taken advantage of our most slovenly methods of garbage disposal, with many not even bothering to fly south for all the garbage we give them to feed on.

The common feral pigeon, according to supplementary data which Science magazine keeps behind the paywall, is experiencing a relatively slight increase. About 3.6 million Columbidae (the family that includes pigeons and doves) have been added to the North American population between 1970 and 2017. Held up against the rest of the numbers in the table, that seems small compared with birds like the vireos, a family of small, greenish, insect-eating birds, which had the largest increase of all at just under 90 million.

Generally, birds are important, keeping animal and insect populations in check. Since my nemesis the pigeon is not in decline, I do not feel I am ready to be so much of a pro-pigeon advocate.

Time to get out the bird spikes.

Paraskevidekatriaphobia

Paraskevidekatriaphobia, the fear of Friday the 13th, is where people avoid travelling, marrying, or even working that day. As for the latter, you have to be pretty irrational not to see that not showing up for work for superstitious reasons is much more likely to have “unlucky” consequences.

Judas, who betrayed Jesus, was the 13th guest at the Last Supper. The Norse god Loki was the 13th attendant at a dinner party, creating chaos for the 12 other gods in attendance.

13 is considered lucky in parts of south and southeast Asia, such as India and Thailand. In both cultures, they primarily use the base-10 numbering system as we do.

In Christianity, 13 often seems to be a play on the duodecimal (base-12) system. There are some nice mathematical qualities to 12, in that it has a lot of divisors: it can be divided into sixths, quarters, thirds, and halves and still give integer values. Adding 1 to 12 makes 13, which is prime, and not much fun for integer division. While that could introduce a kind of chaos, you also get a prime if you subtract 1. So why isn’t 11 unlucky? Seen another way, a prime number can be regarded as lucky in itself, and is seen as symbolic of wholeness in some cultures.
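
For the record, the divisor counts are the whole story here: 12 = 2^2 \times 3, with divisors \{1, 2, 3, 4, 6, 12\}; both 11 and 13 are prime, with no divisors other than 1 and themselves.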

Statistical factoids which are actually true

I have often discussed with my students the hilarity that ensues when one conflates correlation with causation. The lesson, of course, is that they should never be confused; otherwise you can conclude things like “World hunger is caused by a lack of television sets.” Get it? Our first world has lots of TV sets and not much hunger. The third world lacks TV sets and has a great deal of hunger. So, the solution is that we can ship our TV sets to the third world, and world hunger will be eradicated. While I don’t have the numbers, I would assume the correlation exists. Whether causation exists is a whole other matter, and is much more difficult to prove.

So, to explain the title of this blog post, I mean “actually true” in the sense that the correlations are for real; but not necessarily the causation.

I am proposing here to make a few (not too many) posts in honor of a website called Spurious Correlations.

Currently, its front page reassures us that while it is true that government spending on science and technology correlates positively, and strongly, with deaths by suicide, we ought to stop short of curbing spending on science research, since some of that science might be about how to reduce the incidence of suicide.

Further down that page, I am not sure what to make of the high correlation shown between per capita cheese consumption and the number of people who died after being tangled in their bedsheets. Or of the relationship between the number of people who fell out of their fishing boat and drowned, and the marriage rate in Kentucky.

But one of the perks of the website is its ability to conjure up statistics based on user choices. Here was the result of some playing around I did on their website:

They appear to prefer to show all of their graphs as time series, which still shows the two data sets more or less rising and falling together, but linear correlations are nicer. They offered the data that was used to plot this graph, and I was able to use that data to make my own scatterplot, relating the data sets to each other rather than against time, showing that the data has the same r value:

Now I can feel confident in saying that if there are, say, a total of 84,000 deaths due to cancer on the 52 Thursdays of any given year, there will also be 15,650 lawyers practising in Tennessee that year. I have also worked out that if you got rid of all of the Tennessee lawyers, we would save the lives of just over 20,000 cancer patients per year. Isn’t statistics great?

When you take the square root of the coefficient of determination (about 0.94342), you get 0.971299, which agrees with the r value offered on their website. The data, according to the website, originates from the Centers for Disease Control and the American Bar Association.
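
For the record, the same r value can be recomputed straight from the downloaded data with a short awk pass. This is only a sketch: the file name pairs.dat and its layout (two whitespace-separated columns, one pair per year) are my assumptions, not anything the website specifies.

#! /bin/bash
# compute Pearson's r for two whitespace-separated columns in pairs.dat
awk '{
        n++
        sx += $1; sy += $2
        sxx += $1 * $1; syy += $2 * $2; sxy += $1 * $2
}
END {
        r = (n * sxy - sx * sy) / sqrt((n * sxx - sx * sx) * (n * syy - sy * sy))
        printf "r = %.6f\n", r
}' pairs.dat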

The greatest advance in computer technology

HP TX-2 Bought in 2007, it sported 4 USB ports, a DVD drive, two expansion ports, a VGA port, and an IR remote control.

The three HP laptops I have serve as latter-day museum pieces of how technology has progressed. I am not trying to slag Hewlett-Packard; I like their printers, and despite their reputation, I also like their laptops. Today, I am mentioning them as a microcosm of the industry: what can be said about HP can be said across the board, and HP is nothing special in this regard. These are all full laptops with attached keyboards. They all have rotating displays with a webcam, onboard stereo mikes, stereo speakers, a touchscreen and a touchpad. It is also fair to say that all of these laptops were purchased used (saving nearly a thousand dollars apiece off the prices when new), but all have been fully functional from the first day, and are still functional.

As you read the captions on each successive illustration going from top to bottom, what I don’t mention is that, of course, the video hardware is more advanced; and the last laptop, the Elitebook, is, in my experience, the first to offer an internal SSD out of the box. The Elitebook also has nowhere near the heat problems suffered by my TX-2.

HP TM-2 Bought in 2010, it removed one USB port and one of the expansion ports. It also removed the DVD drive and the IR remote, and replaced the VGA port with HDMI.

But these advances are small compared to the greatest advance this progression of laptops shows: the elimination of major features, and the marketing effort on the part of computer companies to convince us that this is a “good thing”. By the time we get to the Elitebook, we no longer have a DVD drive, and half of the USB ports have been eliminated. Neither of the two USB ports that remain is USB 3, either. Not mentioned in the captions is the elimination of the spare headphone jack and the microphone jack. The combination mike/headphone jack on the Elitebook won’t support actual microphones, supporting instead, perhaps, mikes built into a headset. My headset uses a USB connection, so it doesn’t need an eighth-inch jack. But microphone support is otherwise terrible, making the built-in mikes your only good option.

HP EliteBook Revolve 810 Bought in 2015, it lacks everything the TM-2 was already lacking, and in addition has one less USB port and no touch pen. It has neither VGA nor HDMI, but it does have something touted as a “dual display port”, which fits nothing on any equipment I have but can be converted to HDMI with the right adapter cable. It is whisper-quiet, partly because the speakers are not that great.

One thing (out of many) that motivated me not to get rid of my two older laptops is the one reason anyone would buy a convertible tablet in the first place: apart from using the screen for direct Windows navigation, you can also write documents in your own handwriting, or make freehand drawings on the tablet screen. I do make use of this feature, and found to my horror that the Elitebook has really terrible support for freehand writing and drawing. The other two actually have pretty good support, and it was a great disappointment to see this feature lacking in the Elitebook despite its faster CPU and graphics processor. Apart from its not shipping with a stylus, the craggy way it renders straight lines when you do use a stylus – even with a ruler – has been well documented in many other blogs and video reviews.

But even beyond the HP EliteBook, Apple and Google have gone further off the deep end with the elimination of features, with consumers willing to pay more for equipment that can do less. It is a marketer’s wet dream, made manifest in reality. Who needs a keyboard at all, or any external connectors? Use Bluetooth for all your peripherals (nowadays, the keyboard is a peripheral), and “the cloud” as your external hard drive. And still, these pieces of crippled hardware are so popular, they almost sell themselves. Having only Bluetooth restricts flexibility, since a peripheral that doesn’t use Bluetooth, such as a USB drive, is no longer an option for owners of these devices. For storage, I would only have “the cloud”, and I would have to hope for free internet access everywhere I go in order to reach my data. It is quite possible that users who rely on cloud storage are paying monthly for their internet connection, and paying monthly again for “cloud” storage. Of course Apple, Google and Microsoft are happy to provide cloud services so you can store as much data as possible, and to autosave your documents in the cloud to maximize your use of their cloud services.

A short problem

This math problem had me going for a bit. From a distance, it looked like one thing; when I had the occasion to sit down and hash it out, it was quite another.

A student submitted a project describing a game of hers played with just two dice. If you roll a 2 or a 12 you win; if you roll any sum from 5 to 9 you lose; and if you roll a 3, a 4, a 10, or an 11, you survive to another round. You are limited to a maximum of three turns to roll a winning total.

It was not the normal Bernoulli trial, since it doesn’t just have the two states of success and failure; it introduces a third state, which we will call “survival”. While you don’t “win” if you survive, you can still play again, but you can’t go past 3 turns. Three survivals in a row gets you nothing. You need to roll a 2 or a 12 in three turns or less.

P(2 or 12) = \frac{1}{18} is the probability of winning on your first try. Even though there are three states, we can still discuss wait time; in this context, the wait time can be 0, 1, or 2 turns. With a wait time of 1, P(2 or 12) = P(3, 4, 10, or 11)\times (\frac{1}{18})=(\frac{5}{18})\times(\frac{1}{18}) \approx 0.015432.

It means that to even make it to the second turn you can only get there with a 3, 4, 10, or 11. If you got 2 or 12 on the first try, there would be no need for a second turn. Similarly, to get to the third turn, you needed to survive twice and win the third time: (\frac{5}{18})^2\times(\frac{1}{18}) \approx 0.00428669.

It made sense, but I still wasn’t sure about this. What about the probability of losing completely? Those are the sums from 5 to 9, which together have a probability of \frac{2}{3} on any single roll. Why wasn’t I making use of that information?

I don’t think it was necessary in computing the probabilities I did, since the winning conditions preclude rolling any sum between 5 and 9. But it can come in handy as a check. A good indicator that I am on the right track is to see whether the expectations of winning and losing over 1000 games add up to 1000. For winning, I need to add up the probabilities for all 3 wait times: \frac{1}{18} + \frac{5}{324}+\frac{25}{5832} = \frac{439}{5832}\approx 0.0752743. For 1000 games, the expected number of wins is about 75.2743.

The expectation of losing is similarly calculated, based on a single-turn probability of 2/3: \frac{2}{3} + \frac{2}{3}\times \frac{5}{18} + \frac{2}{3}\times \left(\frac{5}{18}\right)^2 = \frac{439}{486}\approx 0.9033. This means you will lose, on average, 903 out of every 1000 games, an expectation of 903.3. Notice that for 1000 games, the winning and losing conditions don’t add up to 1000. What we didn’t add was survival through all three turns: \left(\frac{5}{18}\right)^3 = \frac{125}{5832}\approx 0.02143. This means that you will “survive” (but not win) 21.43 out of every 1000 games. With enough decimal precision, we do indeed get 1000 games when we add all these numbers up, or at the very least when we add the expectations using the fractions: 1000 \times \left(\frac{439}{5832} + \frac{439}{486} + \frac{125}{5832}\right) = 1000 \times \left(\frac{439}{5832} + \frac{5268}{5832} + \frac{125}{5832}\right) = 1000.
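
As a final sanity check, the game is easy to simulate with nothing fancier than bash and $RANDOM (a sketch; $RANDOM % 6 is very slightly biased, but not enough to matter at this scale):

#! /bin/bash
# simulate the three-turn dice game many times and compare the observed
# proportions with 439/5832 (win), 439/486 (lose) and 125/5832 (survive)
trials=100000
wins=0 ; losses=0 ; survived=0
for (( t = 0; t < trials; t++ )) ; do
        result=survived
        for (( turn = 0; turn < 3; turn++ )) ; do
                roll=$(( RANDOM % 6 + RANDOM % 6 + 2 ))    # sum of two dice
                if [ ${roll} -eq 2 ] || [ ${roll} -eq 12 ] ; then
                        result=win ; break
                elif [ ${roll} -ge 5 ] && [ ${roll} -le 9 ] ; then
                        result=lose ; break
                fi
                # a 3, 4, 10 or 11 falls through: survive to the next turn
        done
        case ${result} in
                win)  wins=$(( wins + 1 )) ;;
                lose) losses=$(( losses + 1 )) ;;
                *)    survived=$(( survived + 1 )) ;;
        esac
done
echo "wins: ${wins}   losses: ${losses}   survived all three turns: ${survived}"

Out of 100,000 simulated games, the counts should land near 7,530 wins, 90,330 losses, and 2,140 three-turn survivals.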

What is “universal” about USB?

I am one of those hopeless romantics who believe that words must mean something. “Universal” is quite a strong word, and its all-encompassing reach implies that it is good for … well, everything. As in the whole universe which contains that thing.

USB.org lists at least 18 connectors according to “device class”, few of which you would consider interchangeable with one another. I have seen, for example, radically differently-shaped portable hard drive connectors over the years (at least 3 kinds) that all say they are USB, all illustrated in the photo montage provided here. They would never be considered interchangeable.

Perhaps by “universal”, USB.org (home page of the USB Implementers Forum) just means that this is another attempt to apply industry standards to an understandably chaotic computer industry. “Universal” invites mental images of “one connector fits all”, and we can see that can’t be the case; it is pretty much impossible given the data needs of different devices. It appears to be an attempt to eliminate or reduce proprietary connectors, which are often made by one manufacturer for one device, and never seen again in the next model year, by any manufacturer. It is a way for a consortium of manufacturers to agree: “OK, if we want to advertise USB on our products, we have to pick from this or that set of connectors to sport the USB logo on our package.”

I notice that among the many predictable companies represented in the consortium (Intel, HP, and a plethora of small corporations and manufacturers numbering in the thousands), Apple is also on the board of directors. Apple, the current reigning king of consumer lock-in, has allowed their proprietary connectors to be made by anyone. I bought one at a gas station — it works surprisingly well. It consists of a main USB cord ending in a micro USB connector, over which I can fit an (Apple) Lightning connector attachment and have it both ways. I can charge and transfer data to and from my iPad with it.

Again, it is romantic old me talking here, but if I lose or damage a USB cable, I should be able to find a replacement at any electronics shop. In reality, I don’t expect stores to sell all 18 or so different kinds of cable. But I also should not be forced to send off to the manufacturer of my device for one, often at exorbitant cost, which is what I think the consortium was trying to avoid.

Technology in Mathematics: Chalk

A variety of chalk, missing the requisite, and very necessary, white chalk.

First off, May the Fourth be with you! (Had to get that in there…)

Chalk still stands out as the technology of choice of mathematicians everywhere. Those of us forced to use whiteboards realize the wisdom of the ancients once we have to do without chalkboards and chalk.

Whiteboard markers are much more expensive than chalk; it is hard to know when markers are running out of ink (whereas there is no question when you are running out of chalk); and after many uses, they leave a residue on the whiteboard which is hard to clean off. Whiteboard markers also have many components, which makes them difficult to recycle and means more waste when they go dry. The markers dry out easily, so one has to be always in the habit of ensuring the cap is on tightly when not in use.

Chalk, on the other hand, just runs out. It can produce dust. And though I have had a dust (house dust) allergy since I was a child, I have taught at a blackboard for 15 or more years, and have not experienced breathing problems with chalk. Chalk stands out on a dark background. A chalkdust-clogged blackboard can simply be cleaned off with a wet sponge and a squeegee.

This is in contrast to using the alcohol-based cleaner for whiteboards. While the sponge can be rinsed after use, a cloth or paper towel soon becomes clogged with the ink wiped from the whiteboard and is soon disposed of. This just contributes to a landfill problem.

For educators concerned with the vertical classroom, students can work on these non-permanent black vertical surfaces just like they do with whiteboards.