Using the source, Luke: Compiling Firefox

My PC is one of those dual-cores from about 5 years ago, with about 4 gigs of RAM and enough hard drive space that I don’t have to worry about running out of room any time soon. I installed the latest Ubuntu as a second operating system (XP is running on another partition), but became dissatisfied with its overly simple interface and its hard-to-find terminal. In fact, anything that could be called a menu system seemed impossible to configure. So, I ripped it out and installed Debian. Yeah, the software is always a version or two behind, but at least I can count on it working.

Except for one thing. Debian didn’t seem to come with a web browser I liked. I wanted Firefox, since it would give me a chance of syncing the menus with my other machines. The binary downloaded from the Firefox website wouldn’t start: it couldn’t find a library that was indeed on my system, just not in the place Firefox was looking. So, to me that only meant one thing …

If you want software to do anything right, you need to compile the source code yourself and build your own binary on the machine it will run on. That is one of the powerful advantages of open source software, and it is the whole reason the source code is freely available. It is an exercise in patience, but once it works correctly, it is always worth the extra trouble. In the source code for Firefox, I needed to run ./configure several times. Each time it reported an error of some kind, it was due to a missing package, which I installed using apt-get install <package-name>; there were several missing. I suppose the reason so many were missing was that my Linux installation was not configured for “development” purposes (an oversight on my part). But once ./configure ran without error, I was able to do a “make”. And now the wait.

I recall back in the mid-90s, when I first used Linux on a 486 and was compiling the kernel, that it took, on an 80486 DX2-100 processor with maybe 4 megs of RAM, about 5-6 hours to build a proper kernel. At the time, that was the biggest compilation job commonly done on Linux systems, and it was a rite of passage for any computer nerd; it probably still is. Things are easier nowadays with kernel modules in the version 2 and 3 kernels, since compiling a kernel is rendered almost unnecessary. Well, about an hour into the compilation of Firefox, it is still chugging away at the source code — sometimes C/C++, sometimes Python, and at times, it seems, assembler (I’m not sure). With 4 gigs of RAM and a dual-core processor, one would think that I wouldn’t have time to write this article (which is exactly what I am doing while I wait), or at least not this much verbiage over so long a period. But so far, the good news is that there have been no serious compilation errors that would cause the make to bail out.

Of course, I have better things to do than to compile Firefox and write about it on my blog. I am also roasting a turkey. The turkey will be ready to eat in about two hours from now. It will be interesting to see which will finish first: the turkey or my Firefox compilation. I’ll keep you posted. Right now, it is about time for its basting. The turkey, that is.

Rewriting history: Margaret Thatcher

Margaret Thatcher is younger and prettier in this photo than I remember her being back in 1981... Hmm...

Today, news about the release of “secret” documents relating to the Margaret Thatcher government circa 1981 has been making the rounds, and a CBC webpage carried a report claiming to rip the lid off these secret documents.

Well, nothing of importance seems to have come out of this, if you read all the way to the bottom. Although it crosses my mind: while the riots were happening, weren’t we all awash in the media sentiment surrounding the marriage of Prince Charles and Lady Diana? Aw, maybe I’m just being curmudgeonly.

We are told that the release of these secret documents shows that Thatcher never ceases to “fascinate” us, although anyone who lived through the rise of neo-conservatism in the early 80s found little of this surprising or even worthy of reporting. Yes, the Canadian (Mulroney), American (Reagan), and British (Thatcher) governments moved heaven and earth to make sure that their policies were rammed through their respective houses of government, firing, reassigning, or rebuffing anyone who stood in their way, while ignoring the ensuing rise in unemployment, debt, and every other social ill that resulted.

The “secret” documents, though inconsequential, seemed well-timed to the release of a new film about Thatcher premiering in London, starring Meryl Streep. The photo posted on the CBC website was not of Thatcher, but of Streep playing Thatcher. The caption did not remark on this, but instead carried on the discussion about Thatcher as if the photo really were of her. Maybe it was CBC incompetence, but no. The photo and article were both from the AP newswire, offered without editing or change by the CBC. The CBC did, however, have the good sense to offer a video news report that showed a mix of Streep and Thatcher ’81 in the same report, with at least some clarity as to which was which.

More on primes

Yes, I’m still obsessed with prime numbers. This time I wrote a C program that can list the prime numbers in a certain range, count them, or do both. The program is pretty fast, though for very large numbers it still takes a while to find the primes.

I use something related to Euclid’s algorithm for finding factors, except I turn things inside-out: instead of finding factors, the program rejects any number that has one, leaving the primes. The program is quite short, about 60 or so lines, not counting comments. It was a great way for me to freshen up my dusty C programming skills (lately I am more into Perl than anything).

The hardest part of writing this program seemed to be getting it to accept numbers from the command line, because I needed to review all that business about strings, pointers, casting, string conversion, and how argv[] and argc are structured.
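
For the curious, here is a minimal sketch of the kind of program I mean. It is not my original listing; it assumes plain trial division, and the names and the -c flag are my own invention for this example. It shows both the argv[]/argc handling and the factor test turned inside-out:

    /* primes.c -- a rough sketch, assuming plain trial division.
     * Usage: ./primes <low> <high> [-c]   (-c counts without listing) */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* A number is prime if no divisor up to its square root divides it. */
    static int is_prime(unsigned long long n)
    {
        if (n < 2)      return 0;
        if (n < 4)      return 1;        /* 2 and 3 are prime */
        if (n % 2 == 0) return 0;
        for (unsigned long long d = 3; d * d <= n; d += 2)
            if (n % d == 0)
                return 0;                /* found a factor: reject */
        return 1;                        /* no factors: prime */
    }

    int main(int argc, char *argv[])
    {
        if (argc < 3) {
            fprintf(stderr, "usage: %s <low> <high> [-c]\n", argv[0]);
            return 1;
        }
        /* strtoull() does the string-to-integer conversion that sent
         * me back to the books on argv[] and argc. */
        unsigned long long low  = strtoull(argv[1], NULL, 10);
        unsigned long long high = strtoull(argv[2], NULL, 10);
        int count_only = (argc > 3 && strcmp(argv[3], "-c") == 0);

        unsigned long long count = 0;
        for (unsigned long long n = low; n <= high; n++)
            if (is_prime(n)) {
                count++;
                if (!count_only)
                    printf("%llu\n", n);
            }
        printf("%llu primes in [%llu, %llu]\n", count, low, high);
        return 0;
    }

One caveat worth noting: an unsigned long long tops out just past 1.8⋅10^19, so ranges much bigger than that would need a different representation.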

Once it was written, however, I could use it to find primes among numbers nearly as big as I wanted. It couldn’t take scientific notation, which would have been a bit more work to parse from the command line; I was happy just using raw integers, even though many of them were on the order of 10^12 and beyond.

One of the great advantages of having a small C program is that I could log on to several different UNIX machines (Linux mostly) and run an instance of the program on a different range of numbers on each. This was very helpful when the numbers were in the trillions: I was aware that I was on a timesharing system on each machine, that looking for primes would push up the load average quite noticeably for stretches of as much as 3 hours, and that hogging several processes on a single machine shared by other users just wouldn’t do.

I used the UNIX time command to find out how long my program takes to find primes in real time. It was intended for my own information, but I wish I had recorded the data: if I ever find out about an improved algorithm, or make improvements myself, it would be nice to have times to compare against.

The largest numbers input were on the order of 10^19, so big that I needed to shrink the range of numbers searched down to 1000. Even for a search of only 1000 integers, the program still took about 20 minutes for a range such as 5⋅10^19 to 5⋅10^19 + 1000. I found this law of diminishing returns quite unsatisfying. The higher the number, the shorter the range you had to give it, unless you wanted the program to take forever.
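
That slowdown is just what you would expect if trial division is doing the work (as in the sketch above): confirming a single prime n costs about sqrt(n)/2 divisions, which near n = 5⋅10^19 is roughly 3.5⋅10^9 trial divisions per prime found. Every time n grows a hundredfold, each test gets about ten times slower.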

Dealing with pain

Ever since I had oral surgery, I have had to deal with constant pain in my lower jaw where my wisdom tooth used to be; it had to be dug out from inside my gum and pulled out of the bone. The pain didn’t start until hours after the anaesthetic wore off.

Of course, I was told to take strong headache medication, such as extra-strength Aspirin, and I am also taking codeine (OTC) for bedtime.

The benefit of all this discomfort, and a roughly $300.00 expenditure, so I tell myself, is that my jaw will work better. I have had a problem with chewing, and my jaw had been making various crunching noises for many years. The noise is noticeably fainter now.

I still have to deal with oral infections, for which I also need to take antibiotics. I still bleed, ten days after the surgery, every time I brush.

To cheer myself up, I re-strung my guitar with strings I could actually play. (I am in the habit of using electric guitar strings on my acoustic, since I don’t practice frequently enough, for lack of time, to build the finger strength or the requisite calluses needed for playing folk guitar.) I think I will save up for an electric when I get the chance.

One thing I noticed: about five days after the surgery, I ran 5 km and was fine. Yesterday, I ran 10 km, and the pain actually subsided during and after the run. Probably the endorphins.

Spammers R Us

Here is the latest in the desperation and silliness that spammers have been displaying in my inbox these days:

  • Stop the Ageing Clock
  • Raise Your Courage and Confidence in Public
  • Andrea sent you an invitation to http://menstrualmigrainesurvey.mobi
  • Fantastic Offer On A Mercedes C180 CGI
  • Have You Ever Been Treated Like A Slime Ball Sales Person?
  • hello
  • Hello
  • hello my friend help me
  • Hello Sir.
  • Hello,
  • HI
  • hi
  • hiro
  • I AM WRITTING TO CONFIRM THE FACT IF YOU ARE (DEAD) OR (ALIVE)???
  • LOAN OFFER AT 3% INTEREST RATE
  • Register and be entered to win cars, trips, homes and discounts … Good Luck
  • Remain bless in the Lord.
  • This house has a huge garage for you (try 20 cars)
  • THIS IS MY LAST WISH.
  • Trade your city
  • Burdens are Pumping Up the Volume!
  • $100k for MERCILESSLY KILLING FBI, CIA and NSA AGENTS like FUCKING PIGS
  • 7 Reasons to Keep Jesus out of His Church
  • Another future on the side
  • Are you looking for additional income online?
  • Are you serious about making money?
  • Are you tense?
  • Are you a music producer?
  • bitchkitty
  • Bring funk to your lineup
  • Caffeine Extract removes Cellulite
  • Can the church save Occupy Wall Street?
  • Dangerous Abuse of Bible Translations
  • Do you need any kind of loan?
  • Final week for The Hit Play God of Carnage
  • I am Mrs Bolten, diagnosed of cancer,I DONATE USD$10 Million to YOU
  • I thought you loved me – and you?
  • I write this email to you on my sick bed
  • Important Dawnload
  • Is Your Eye Care Practice Missing Something?
  • extremely urgent. we cannot change the deadline
  • we have restricted access to your account
  • We received your tax return
  • Where is my man?
  • Will Obama stop this assault on God’s creation?
  • How would you like to have your ad on 2 Million Websites?

Not knowing what you don’t know

This topic has occasionally fascinated me. Not knowing that you don’t know something is how incompetence hides incompetence (it amounts to being too incompetent to know you are incompetent). Donald Rumsfeld used the idea about a decade ago to incriminate Saddam (remember all that about the “known knowns”, the “known unknowns” and the “unknown unknowns”?). It was a sham to hide the fact that, even though he had no evidence of WMDs, “absence of evidence is not evidence of absence”. Guilty until proven innocent. Let’s spend billions on an invasion.

But, yes, the underlying idea is a real one that goes back to theories of learning and of knowing. And ironically, I have it on some authority that it has roots in Arab culture. There is an old Arab saw that goes:

"He that knows not,
    and knows not that he knows not
        is a fool.
            Shun him
He that knows not,
    and knows  that he knows not
        is a pupil.
            Teach him.
He that knows,
    and knows not that he knows
        is asleep
            Wake him.
He that knows,
    and knows that he knows
        is a teacher.
            Follow him."

Our particular interest is in the first stanza. Shunning someone who doesn’t notice his or her own incompetence might be a bit harsh, because really, not knowing that you are incompetent at something is a normal part of the human condition.

We are all biased, we are all unaware of whole worlds of knowledge lying outside our doorstep; yet, we work, play, pay our taxes, make decisions, go to school, get promoted, have relationships, resolve our conflicts despite not knowing everything, and not being all that conscious about what we don’t know. Our brains are wired to make sense of the world with limited knowledge, and to make judgements and to act. Normally things work out, and that’s all that matters to us. We don’t need the data gathering skills of NASA’s Mission Control to resolve even relatively difficult dilemmas. Some of us make judgements on truly degenerate knowledge of the world around us, and life goes on.

But what if we worried about the things we do not know that we do not know? It might be marginally helpful, but I suspect that most of the time we would be paralyzed with indecision until we had gathered all of the facts. In some situations that is simply impossible, which would paralyze us forever. You can’t know everything in interpersonal situations; you are not expected to have the wisdom of Confucius, the moral reasoning of Martin Luther King, or the clairvoyance of Kreskin, and what would it have helped anyway? You still need to work through the reasoning of the other person, who has the same degenerate knowledge of the world as everyone else.

If you think that’s the other person’s problem, then remember that the tragedy of the crucifixion happened because Jesus suffered from a mortal flaw: he was flawed because he was perfect (he alone knew that he knew), and the crowd judged him because they didn’t know what they didn’t know. As Jesus died on the cross, he prayed for forgiveness of this essential flaw in their humanity. In other words, being perfect is no guarantee that you will live your life in any kind of perfect happiness, since you will live in a world where everyone around you does not know what they don’t know. And to try to attain that kind of perfection with the purpose of becoming “more knowing” or more effective in your environment is grossly unnecessary nearly all of the time. We already have within our imperfect selves the power to change our lives in a positive way.

Knowing more is, of course, always better. But trying to know everything is an invitation to personal, moral, and professional paralysis. We would normally accuse such people of being unable to deal with ambiguous situations, which, let’s face it, covers just about all situations.

In Pursuit of Trends in Primes

INTRODUCTION

A prime number is divisible only by itself and 1. This means that every prime has exactly two factors (7 is prime, since its only factors are 1 and 7), and for that reason “1”, having only one factor, is not prime.

OBSERVATIONS

The first 100000 integers seem to have the greatest density of prime numbers: 9592 primes were found there, meaning that on average, nearly one in 10 of the first 100000 integers is prime.

Dissecting this interval further: the first 1000 contiguous integers contain 168 primes.

 INTERVAL    # OF PRIMES  FRACTION
  1-100          25        0.25
101-200          21        0.21
201-300          16        0.16
301-400          16        0.16
401-500          17        0.17
501-600          14        0.14
601-700          16        0.16
701-800          14        0.14
801-900          15        0.15
901-1000         14        0.14
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
TOTAL           168        0.168

We see we’re already in trouble. While the first 200 integers clearly dominate, with the first 100 integers having the most primes of all, the remaining intervals fall into some kind of pseudo-random torpor, some higher and some lower: 16 primes occur in three of the intervals, and 14 occur in another three.

In this interval, it is difficult to figure out if the numbers are meandering up or down. Let’s look at the second thousand:

1001-1100    16    0.16
1101-1200    12    0.12
1201-1300    15    0.15
1301-1400    11    0.11
1401-1500    17    0.17
1501-1600    12    0.12
1601-1700    15    0.15
1701-1800    12    0.12
1801-1900    12    0.12
1901-2000    13    0.13
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
TOTAL:      135    0.135

Is this a downward trend? Let’s observe the first 10 thousands:

   1-1000    168    0.168
1001-2000    135    0.135
2001-3000    127    0.127
3001-4000    120    0.120
4001-5000    119    0.119
5001-6000    114    0.114
6001-7000    117    0.117
7001-8000    107    0.107
8001-9000    100    0.100
9001-10000   112    0.112
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
TOTAL:      1219    0.1219

A major jump upward is observed in the last interval, but it does not reach the level of the earlier intervals. This behaviour is noticed nearly universally, and it is reproduced here: if you look at the number of primes in regularly-spaced intervals of contiguous integers, each interval the same size, you tend to see a general declining trend, followed by a seemingly random meandering of the counts up and down. The observation appears to be independent of interval size.

In the interval 99001 to 100000, the last 1000 numbers up to 100000, we observe 86 primes. Clearly, this is below what we have just observed in the first 10 intervals of 1000; but in the interval 97001-98000 we observe only 82 primes, where we had expected slightly more than 86. The number of primes is once again settling into the same pseudo-random torpor observed earlier.

I define “pseudo” randomness as frequencies which go on a downward trend in an unpredictably meandering way. True randomness would have frequencies going all over the place.

So, if intervals of size “p” were used, the first p integers would contain the most primes, with a downward trend from p+1 to 2p, 2p+1 to 3p, and so on. By some middle interval, kp+1 to (k+1)p, the counts would settle into a relatively low, pseudo-random torpor. By the time we arrived at the interval np+1 to (n+1)p, the frequency of primes would be noticeably lower. I am afraid I am not yet in a position to define the word “noticeably”. You just have to notice it.

But the word “noticeably” might imply an upper limit which the frequencies no longer attain. That is, if “e” is the upper limit predicted for the interval np+1 to (n+1)p, we need to set the number so as to guarantee that the observed frequency does not rise above it. As a first approximation, when 2 divides n, I am willing to set e to the frequency observed for the interval (n/2)p+1 to ((n/2)+1)p, the interval half-way back. You can obviously be more restrictive than that, but at least you can see from my numbers that this works out.

Contiguous Intervals of 100000

Taking the first 100000 positive integers as one contiguous interval, we find 9592 primes, far and away more than any other such interval of 100000 observed. The second interval, from 100001 to 200000, has 8392 primes. Starting from 1 billion, an interval of 100000 has 4832 primes. While it is not clear whether we are returning to torpor, we can at least see that the number of primes is decreasing.

The strongest evidence I have observed of a decline settling into a chaos of up-and-down counts was that these observations stayed consistent when I counted the primes in 100,000 contiguous integers starting at 10^0, then at 10^50, then at 10^100, and so on. I kept this up under Maple 9.5 up to 10^2000. The last intervals, from 10^1700 on, took the longest to count (well, _you_ try checking 100000 integers which are each 1700 digits long, and see how long it takes you!). It is still cranking away, with 4 intervals left. In all, it will probably take 9 hours on a dual-core processor, so on the optimistic side I have about 3 hours remaining.

The results so far have been more than enough to indicate a trend:

100000 FROM   # OF PRIMES     (*) = EQUAL TO OR GREATER THAN PREVIOUS ROW
10^0            9592
10^50            895
10^100           407
10^150           274
10^200           216
10^250           165
10^300           143
10^350           129
10^400           115
10^450            96
10^500            81
10^550            71
10^600            78                   *
10^650            65
10^700            68                   *
10^750            59
10^800            52
10^850            57                   *
10^900            57                   *
10^950            45
10^1000           39
10^1050           38
10^1100           43                   *
10^1150           33
10^1200           38                   *
10^1250           43                   *
10^1300           21
10^1350           32                   *
10^1400           28
10^1450           28                   *
10^1500           26
10^1550           35                   *
10^1600           32
10^1650           28
10^1700           28                   *
10^1750           29                   *
10^1800           27
10^1850           23
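
For comparison, using the well-known approximation that the density of primes near n is about 1/ln(n): a window of 100000 integers starting at 10^50 should contain roughly 100000/ln(10^50) ≈ 869 primes, and a window starting at 10^100 roughly 434. The observed counts of 895 and 407 track those estimates closely, so the decline in the table is at least the expected one.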

As other writers have stated, prime numbers appear to have a pattern, but the pattern has eluded us. The pattern of decreasing numbers of primes as the integers increase by many orders of magnitude seems elusive indeed, appearing pseudo-random but with a general downward trend. One might want to regress the number of primes per 100,000 integers onto a curve, but I am not sure that would tell us anything meaningful by way of a pattern.

Why TSP rocks

I discovered textured soy protein (TSP, also called textured vegetable protein, or TVP) at a local bulk foods store, sold in bins. TSP is a protein extract of the soybean, an already versatile plant used for many different purposes. TSP can be used as a meat analog, similar to the way surimi can be used as an analog for shellfish.

Flavoured with the appropriate broth, it can be used as a replacement for, or a supplement to, chicken or beef. Unflavoured, it can be used to add a meat-like texture to spaghetti sauces, chili, lasagna, you name it. If you are a vegetarian, you can now take advantage of a plant-based food which, pound for pound, has the same protein value as meat with a fraction of the calories. You can now serve Hamburger Helper without any actual hamburger.

A recent experiment at Paul King’s Labs (my kitchen) used Hamburger Helper; the leftover remains of a Spanish onion, fried separately in a skillet greased with margarine until somewhat brown (grease poured away afterward); a quarter pound of lean ground beef, fried until all brown; and one full cup of TVP, to which nearly a cup of boiling water was added. To the concoction were added the noodles and the Hamburger Helper mix (Lasagna flavour), another cup of water, and finally a small tin of tomato paste (mostly because I just love to kick that tomato flavour up a few notches).

TVP does not need hot water to hydrate; cold water will do. Nor does it strictly need cooking, but I add it just after the hamburger turns brown because I think it needs to absorb the surrounding juices and flavours the way the meat does. Broth isn’t necessary for something like Hamburger Helper, so that additional preparation can be skipped.


Algebra Tiles

I once thought algebra tiles were stupid. These days, I believe they are essential for students up to Grade 10 to understand how a factored quadratic is expanded, and to actually “get” the FOIL method when it is introduced.

It is especially revealing when you understand that the root of the word “quadratic” is “quadrat”, a device used to measure land area. Imagine buying a tract of land (x – 3) meters long by (x + 4) meters wide. Such a property must be rectangular, and for consistency’s sake, so should the arrangement of your algebra tiles be. The placement of the tiles should reflect the dimensions of the rectangle; that is, an observer should be able to make out the factors of the quadratic in the finished product.

I don’t actually own any algebra tiles, so I just drew a picture conveying the general idea. The square is the square of x; the lines each represent an x, making 8x; and each crossing of two lines represents one more unit added to the constant term, so 15 crossings make the number 15. This looks like a great way to teach how binomials are multiplied to kids who haven’t seen it before. The graphic was scrawled out in MS Paint.
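
Spelled out, that first picture encodes the expansion (x + 3)(x + 5) = x^2 + 3x + 5x + 15 = x^2 + 8x + 15: one square, 3 + 5 = 8 lines, and 3⋅5 = 15 crossings.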

If you are like me, you probably don’t own these things, but to teach expansion of binomial factors to kids, you can certainly draw squares and lines.

Multiplying a pair of binomials will generate a quadratic upon expansion, of the form Ax^2 + Bx + C.
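
In symbols, the simplest case (A = 1, which is all my drawings show) is (x + a)(x + b) = x^2 + (a + b)x + ab: the a + b lines supply B, and the a⋅b crossings supply C.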

Along the horizontal of the first illustration, we can suss out a measurement of x + 5 units in length; along the vertical, a dimension of x + 3 units. Draw the lines so that they continue past the square and cross all of the lines going in the other orientation. Counting the crossings gives the constant term (C); counting the total number of lines gives the coefficient of the x term (B); and counting the squares gives the coefficient of the x^2 term. Knowing that MS Paint does an ugly job when using the same colour, I used a different colour for the horizontal lines. The different colours come in handy when the binomial contains a subtraction. For example:

This is an improved graphic, and it does illustrate how the black lines represent subtraction.

This second illustration shows how -2x may be represented by algebra tiles. We see the middle term go to 3x - 2x = x. But what about the crossings? We need a rule whereby if the crossing lines are of different signs (that is, different colours), they count toward a negative number (in this case, six crossings make -6). Conversely, if they are of the same sign (both negative or both positive), they count toward a positive number.
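
Spelled out, the second illustration encodes (x + 3)(x - 2) = x^2 - 2x + 3x - 6 = x^2 + x - 6: the 2 negative lines cancel 2 of the 3 positive ones, and the 3⋅2 = 6 mixed-sign crossings count to -6.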

My Discovery of Biodegradable Plastic

Biodegradable plastic was at one time the kind of plastic that would indeed degrade … after 10,000 years. It wasn’t much cause for fanfare back in the ’90s (although there was plenty of euphoria at the time anyway). Well, I was cleaning my room one day, and now I am among the believers in biodegradable plastic.

My room, which is my home office, is, to be quite honest, a hopeless mess at the best of times. But whenever I can see my way clear of my busy schedule, I do put in a bit of effort to clean things up. Ironically, I used to be a lot neater, and I still have a few good habits, such as keeping a few extra plastic shopping bags on hand to replace the bags of garbage I throw out from my room.

So, one day, I got irritated enough at my own mess to start vacuuming and tossing out my swelling bags of garbage (I have two wastebaskets, usually full to overflowing). I grabbed one of my spare empty garbage bags and discovered, to my horror, that it shattered into a dozen pieces in my hand. As I attempted to pick up some of the larger pieces, each piece shattered into easily another dozen pieces. I grumbled as I turned on the vacuum again to suck up the decayed remains of the shopping bag I had been about to use.

So, the good news for the environment is that, yes, biodegradeable shopping bags do in fact exist in a meaningful way, and no, I did not go shopping prior to the last ice age.

Biodegradable plastics last about 50 days under good aerobic disposal conditions. I would imagine degradation takes longer under anaerobic conditions, which are common in landfill sites. Oxo-biodegradable plastic cannot degrade under anaerobic conditions; hydro-biodegradable plastic can, releasing methane. Microorganisms are generally responsible for the decay of hydro-biodegradables, while oxo-biodegradables can degrade anywhere there is oxygen, even under water. I imagine my shopping bag was of the oxo-biodegradable kind, since it was empty, and lying around for some months.

More on the distinction.