DERB'S FEBRUARY DIARY: New Year In Chinatown; Black Math, Psychologist Math, SJW Math, And Real Math; ETC. [11 ITEMS]
03/01/2019


Nightmare on Main Street.     February 5th was Lunar New Year on the Chinese system. Out with the dog, in with the pig.

It was also of course Pax 10th on the Mayan calendar. This year, however, we thought we'd forgo the human sacrifice and just have a dim sum lunch and do some pre-festival shopping in Chinatown the weekend before.

Parking-wise, Chinatown—this is the Flushing Chinatown in New York City's Queens borough, not Manhattan Chinatown—is a nightmare even on the average weekend. The weekend before Lunar New Year things are far worse.

Tooling down Flushing's Main Street, I had the glum foreboding that we would spend most of the day looking for a parking spot. I even suggested to my lady that we back off to a friendly non-Chinatown subway station and ride the rest of the way. No, she said, we'd have too many bags to carry so far.

In the event we lucked out. After just two circuits of the restaurant parking lot, a space opened up as we approached. Out of car, into restaurant. Piece of cake—bean-curd, whatever.

The dim sum was great, though I made the mistake I always make with dim sum: taking everything the first couple of trolleys have to offer, leaving no room in my digestive tract for later trolleys, which of course have stuff that looks even more appetizing. There must be an art to pacing yourself through this, but in fifty years of patronizing dim sum parlors on three continents, I've never mastered it.

The restaurant was clean and efficient for such a big place. There is a chaotic, sharp-elbowed, Malthusian quality to Chinese social life that in certain moods I rather like; but when they want to do brisk efficiency, they sure can. Lake Pavilion Restaurant runs like clockwork.

Then, the nightmare. Back out in the parking lot I said I'd stay by the car while Mrs walked down to the supermarket for food shopping.

Off she went. There I stood. In the street just across the low wall from our car was a food stand selling snacks. It broadcast its wares with a loud ten-second message that just kept repeating: Tianjin Taohua chao xiao chi! Chao huasheng, chao guazi—xian chao! xian mai! … ("[Business name] fried snacks! Fried peanuts, fried melon seeds—frying now! buy now! …") And repeating … and repeating …

I was stuck there waiting by the car. I didn't want Mrs to come back loaded with bags and me not be there. I didn't know what store she'd gone to, so couldn't follow her.

I got into the car and sat there with the windows closed. No good: the message was too loud. I got out, walked off across the lot, keeping the car in sight. No good: the damn thing followed me. Chao huasheng, chao guazi—xian chao! xian mai! … I wished I could smoke a cigarette to calm my nerves, but I've quit.

Chao huasheng, chao guazi—xian chao! xian mai! … how on earth did the guy running the food stand keep his sanity? The expression Chinese water torture was making a lot of sense, vivid sense.

Tianjin Taohua chao xiao chi! Chao huasheng, chao guazi—xian chao! xian mai!

Tianjin Taohua chao xiao chi! Chao huasheng, chao guazi—xian chao! xian mai!

Tianjin Taohua chao xiao chi! Chao huasheng, chao guazi—xian chao! xian mai!

Tianjin Taohua chao xiao chi! Chao huasheng, chao guazi—xian chao! xian mai! …

Mrs Derbyshire showed up just as I was trembling on the edge of homicide. "What's the matter? You look stressed out."

"Nothing, honey. I'll take the bags, get in the car."

The rest of the New Year holiday went well.

[Permalink]


Museum of the Dog.    

The Museum of the Dog, from the American Kennel Club, opened Friday a couple of blocks south of Grand Central, after relocating from St. Louis, Missouri. Its 11,000-square-foot gallery is filled with pooch-centric paintings, sculptures, photos and artifacts, some going back centuries. [Museum of the Dog is a fun, fitting tribute to man's best friend by Eric Hegedus; New York Post, February 8 2019.]

On the dog-loving spectrum, Mrs. Derbyshire is way over at the extreme right end. When she saw that story in the New York Post she near swooned. "We must go see that!"

So off we went. Yes, the place is fun. The walls are hung with dog paintings—originals, not reproductions—some from surprisingly far back in the 19th century, the dogs looking perfectly up-to-date. The only one of the artists' names that rang a bell with me was Landseer.

I wondered if perhaps we might donate Elizabeth Cockey's portrait of our late beloved Toby to the museum, but the Mrs doggedly refuses to let it go.

There are dog-movie posters, too: Lassie, Old Yeller, Beethoven, … And sculptures, and bronzes, and porcelain dogs; and some cute hi-tech "installations."

The installation we liked best was one that takes your picture, listens to you bark, then tells you what breed of dog you most resemble. I am either a French Bulldog ("Adaptable, Playful, Smart") or a Boston Terrier ("Friendly, Bright, Amusing"), depending on whether or not I am smiling.

The Museum includes a library of dog books—I never knew there were so many. They are all properly Deweyed, so you can easily find anything you're looking for. Literature is well-represented: Jack London of course, Richard Adams, and that curious little biography of Elizabeth Barrett Browning's dog Flush by Virginia Woof … sorry, Woolf.

Some curiosities, too. There's a book of dog poems titled, of course, Doggerel; a dog's history of the world, subtitled "Canines and the Domestication of Humans"; and books of dog jokes. How many dogs does it take to change a light bulb, by breed?

Lab: Me, me, oh let me do it! Please, pleeeeaase, let me! Here, I'm here! Let me do it!
Border collie: Just one, and I'll replace any wiring that's not up to code.

Etc., etc. And yes, some of the older books are regrettably dog-eared.

[Permalink]


Book ends.    "Of making many books there is no end," said the preacher. I wouldn't be too sure, pal.

Walking back to Penn Station from the dog museum, we took a detour to New York Public Library, which often puts on small, interesting exhibitions good for killing a half hour in midtown Manhattan.

On February 17th it only had a show celebrating homosexuality, which neither of us is interested in. We mooched around the magnificent building anyway.

The Mrs wanted a picture taken in the big main reading room, so we went there. Standing by the endless shelves of reference materials while she manipulated her smartphone camera, I got to thinking how redundant these books are. Collier's, Britannica, Larousse, … aren't they all online now? Does anyone actually pull one of these volumes down and open it?

Nobody did while we were there, and the reading desks were full of readers … well-nigh every one with a laptop or iPad in front of him. Nor did anyone go up to the gallery that runs round the room.

There is something melancholy, something forlorn about libraries now. I get that feeling even in my own much-less-grand provincial library, which I used to spend hours in but now rarely visit. How long will libraries still be around? Are school guidance counsellors warning youngsters not to embark on a career in librarianship?

I guess it's geezerish and futile to think like that. Probably some Sumerian Derb around 3000 b.c. was bemoaning the decline of good solid baked-clay tablets and scoffing at the new-fangled papyrus—so perishable! so flammable!

It didn't help that I am a fan of junky sci-fi-apocalypse movies like Beneath the Planet of the Apes and The Day After Tomorrow, in which a scene in the shattered ruins of New York Public Library seems to be compulsory.

Is our civilization on its last legs? Well …

[Permalink]


Life cycle of civilizations.    If you enjoy contemplating the collapse of civilization, envy the scholars at Cambridge University's Centre for the Study of Existential Risk, who ponder the civilizational End Times all day long and get paid a salary for it.

Luke Kemp is one of those scholars. He got some headlines of the lesser sort this month with an essay titled "Are We on the Road to Civilizational Collapse?" [BBC, February 19, 2019] His answer: Could be.

The world is worsening in areas that have contributed to the collapse of previous societies. The climate is changing, the gap between the rich and poor is widening, the world is becoming increasingly complex, and our demands on the environment are outstripping planetary carrying capacity …

Good flesh-creeping stuff. Kemp includes a chart showing the lifespan of 86 civilizations. The average lifespan, he says, is 336 years.

This quantitative approach to civilizational rise and fall has a long pedigree. A big name here in the 20th century was Arnold Toynbee, whom Kemp refers to in passing. Toynbee's humongous 12-volume Study of History was known to every educated person in the generation before mine—to most of my schoolmasters, for example. (I don't say they all read it. It was known to them, that's all.)

Toynbee is not much read nowadays, but more recent public intellectuals have worked the theme: guns-germs-and-steel guy Jared Diamond, Tiger Mom Amy Chua, polymath David Goldman, Toynbee epigone Stephen Blaha, and others.

For big-idea intellectuals, it's a hard theme to resist. It sure looks as though civilizations age the way individual human beings do. When young, they have a vitality, a can-do spirit, that lets them accomplish wonders. In middle age they get fat and complacent. Then comes senility—peevish, quarrelsome decline. Si jeunesse savait, si vieillesse pouvait ("if youth only knew, if age only could"). Analytically-minded people naturally want to try quantifying that.

Where is the U.S.A. on this timeline? It depends where you start from. If we take Luke Kemp's average lifespan of 336 and start from the adoption of the Constitution in 1789, we're middle-aged—fat and complacent—but we should be good until 2125.

I dunno, though. From where I'm sitting "peevish, quarrelsome decline" looks like a better fit.

[Permalink]


Quantified history bites back.     You might think that China is a problem for these civilizational theorists. Hasn't China been a continuous civilization since deep antiquity—well back in the second millennium B.C.? How does that square with Kemp's average of 336 years for the lifetime of a civilization? Is not China, as sinologist Lucian Pye famously remarked, "a civilization pretending to be a state"?

Toynbee, who favored civilizational cycles up to a thousand years long, got round the problem by grouping successive Chinese dynasties together.

Kemp just treats each important Chinese dynasty as a separate civilization; although since the main civilizational characteristics—language, literature, philosophy, social and political arrangements—have indeed changed only little and slowly across millennia, and with Lucian Pye in mind, I'd put quotes around the word "civilization" there.

The Chinese themselves take Kemp's point of view, with those quotes. They have done so for a very long time, and even had a go at quantifying "civilizational"—that is, dynastic—rise and fall two thousand years ago.

A great-uncle of Kuei Hung's contemporary Lu Wen-shu had, in fact, through astrological calculations, arrived at the conclusion that the dynasty would last for three times seventy years, and since its beginning had been fixed at 206 b.c., it was, accordingly, to end in a.d. 4.

That's from Rudi Thomsen's biography of Wang Mang, a brilliant but very unscrupulous man who took over the Han dynasty as it sputtered out in a sad trail of emperors who were debauched, under-aged, sickly, and/or short-lived without direct issue from 48 b.c. to a.d. 6.

That calculation of a 210-year dynastic span was one of the propaganda points Wang used to bolster his claim to rule. Time was up for the Han dynasty! he argued. Heaven said so!

What followed should be an object lesson for would-be quantifiers of history. Wang Mang's new dynasty—it was actually called Xin, which means "new"—was one of the shortest on record, fourteen years and nine months. There were natural catastrophes and rebellions. Wang Mang was dethroned and chopped up:

The head was sent as a trophy to the Keng-shih Emperor, who displayed it in the market-place in his provisional capital, Yüan. Here the people hurled things at it, and, according to Pan Ku, someone even cut out Wang Mang's tongue and ate it. [Ibid.]

An outlier member of the Han ruling family was put on the throne and the Han dynasty resumed as the Later or Eastern Han, a.d. 25-220.

Human affairs on the historical scale don't yield easily to numerical analysis.

[Permalink]


Code archeology.     OK, let's swing technical for the rest of the diary: Computer Science and Math.

First, a nostalgia trip for old code jockeys.

After Radio Derb ran a segment advising listeners not to learn to code, I got queries about my own coding history. What languages have I coded in?

That's not easy to answer. For one thing, what counts as a language? I've spent a lot of time coding SQL, for example. It's a standard set of instructions for communicating with databases. Is it a coding language? It says it is: SQL stands for "Structured Query Language"; but we code grunts never considered it to be in the same category of thing as COBOL, LISP, C, or PERL. SQL commands were usually just embedded in real code.

Similarly with MVS-JCL. MVS was (is?) an operating system for the big old IBM mainframes; JCL was its Job Control Language for managing batch input and output, job scheduling, and some low-level data manipulations like file sorting. Again, JCL says it's a language, but none of us believed it. It was just stuff you had to know to get your job run.

You can throw in HTML—HyperText Markup Language, which I'm using to write this diary. It's mighty handy for putting web pages together, but a language? Nah.

What about scripting languages, like the one that comes with the text editor I'm using to input my HTML (the scripting language is called KEXX)? I don't think so. After twenty years of using KEXX, my biggest macro is only 334 lines. That's just clearing your throat in a real language.

And then, do I count languages I've just played with for amusement, or because everyone was talking about them? C and C++ fall into that category. I never coded them for money, nor taught them for money. I'm going to take a strict approach and include only languages I've earned a living coding or teaching.

With those limitations, here's the tally. Chronological order.

  • While a trainee programmer at the U.K. Post Office, I learned ALGOL from a manual and coded up some tables for my boss.
  • Intercode, the assembly language for the Leo 326 machines, basic second-generation mainframe workhorses for the Post Office (which in the late 1960s also ran the U.K. telephone system).
  • CLEO, a COBOL-type high-level language for the Leo. We used it for telephone billing.
  • Usercode, the assembly language for ICL System 4 machines. The System 4 was a clone of the IBM 360, so Usercode was just 360 BAL in a dress. I did not know this at the time; but later, when I had to code 360 BAL, I was pleasantly surprised—no learning curve! I'll count Usercode and BAL as one language; and throw in their macro language, which as a systems programmer I worked with some.
  • Dartmouth BASIC was a big draw for colleges in the early 1970s. I worked with Jack Harwell's firm developing a BASIC interpreter for System 4 machines, testing out the math functions.
  • I lectured in Computing at a trade school in Hong Kong. We taught from approved IBM texts. We started the students with RPG, then advanced to …
  • COBOL, then to …
  • FORTRAN. I didn't know RPG or COBOL before teaching them. I learned them from the teaching texts, racing to keep ahead of the students. (No mean feat: these were Chinese students.) From knowing CLEO I quickly got the hang of COBOL. I knew FORTRAN from having played with it at a former installation—code monkeys like to play—but had never before made money from knowing it.
  • REXX. Having learned COBOL, which everyone in the business world used, I coasted on it through the later 1970s and early 1980s. In the later eighties I switched from MVS to VM (the hipper IBM mainframe operating system, MVS being blue-collar). REXX was a very neat interpreted language for VM. KEXX is one of its descendants.
  • PC Assembly Language. When PCs came in for office work, I learned the Assembly language from Peter Norton's books; had fun around the office taking a break from COBOL and REXX to code up little PC routines for the users.
  • Apple IIGS Assembly Language. At some point in the late 1980s I did a job on the side for my boss, who wanted it done on an Apple IIGS. I read up the durn thing's assembly language (Merlin? Or was that just the name of the IDE? The memory is badly faded — mine, not the machine's), coded the job in that, and got paid for it.
  • Visual Basic. Assembly language is a chore after a while, like brewing your own beer. I graduated to VB for PC work, and stayed with it until I quit coding for a living in 2001. I taught it for a semester at Baruch College in New York City.

That's a neat dozen languages I've coded for money. I have a longstanding vague idea to bring my coding skills into the 21st century, if only to keep cognitive decline at bay. I've actually bought Zed Shaw's book to learn Python from, but … have not yet cracked the spine …

[Permalink]


No justice, no math!     I generally reserve my mathematical inclinations for the last segment of the diary. This month, however, math has been in the news HBD-wise, thanks to the efforts of New York Times reporter Amy Harmon.

As I noted in the February 22nd Radio Derb:

Ms Harmon recently contributed two pieces for the Times on the shortage of black mathematicians. February 18th she profiled a black mathematician who feels he don't get no respect from his peers. February 20th she expanded to a general theme about how "bigotry," "racial exclusion," "unconscious bias," and so on, keep blacks out of math.

The black mathematician in that first article was Edray Goins of Pomona College. My Radio Derb commentary passed over Prof. Goins in silence and focused on the second of Amy Harmon's articles.

Is Prof. Goins the real thing—i.e. a decently productive mid-level math academic—or just an affirmative-action token? I didn't pass an opinion for the excellent reason that I'm not qualified to judge.

There's a list of Prof. Goins' talks and refereed papers here. To answer the question in the previous paragraph you'd need some feel for:

(a)  Quantity-wise, is that a good steady level of productivity for a tenured math professor? and
(b)  Quality-wise, are those topics appropriate for same, and does Prof. Goins treat them in appropriate depth?

I don't know enough to pass judgment on either point. I could take a fair stab at (b) by actually reading the papers and talk transcripts, but … life is short. I therefore stand agnostic on the main question.

There are suggestive hints in some of the comments on the Times article, and even in the article itself:

Dr. Goins's colleagues at Purdue said his receipt of tenure and subsequent promotion to full professor signaled the university's willingness to overlook a sparse research portfolio in light of his extraordinary work with undergraduates, as well as the summer programs he organized for minority students.

[My italics.] For sure there are affirmative-action token hires in college math departments. Several readers emailed in to tell me I was much too kind to Assistant Professor Piper Harron at the University of Hawaii in my May 2017 diary. The mathematician who blogs as PostTenureTourettes was less restrained towards Prof. Harron, and took another swing at her this month while commenting on Amy Harmon's pieces.

[Permalink]


Blackety-blackety black scholarship.     The following may be relevant. I've heard it down the years from friends in many of the hard sciences. To paraphrase them:

You go to an academic conference on, oh, say, microbiology. A white guy steps up and reads a paper on the role of histone acetylation in the control of gene expression. Applause, questions, discussion.

An East Asian lady steps up and reads a paper on phospholipid biosynthesis in mammalian cells. Applause, questions, discussion.

A South Asian guy steps up and reads a paper on the regulation of sphingomyelinases. Applause, questions, discussion.

A black guy steps up and reads a paper titled: "How can we get more underprivileged youth interested in microbiology?" …

[Permalink]


The SJW-ification of math.     For sure the SJW-ification of math proceeds apace. A great host of academic administrators, and a few actual academics—with the eager assistance of fake journalists like Amy Harmon—are striving to drag this most rational, most profound, most demanding, and purest of intellectual disciplines down to the infantile subjective feelgood level of Raza Studies or Queer Legal Theory.

(I have just noticed that Ms. Harmon apparently believes that "pure math" belongs in mockery quotes while the applied variety does not. I guess she's telling us that math is, like any other style of intellection, au fond just a manifestation of the oppressive power relations in capitalist, white supremacist, patriarchal, heteronormative society. Nothing pure about that!)

For a random data point on that SJW-ification, see the editorial by Jacqueline Jensen-Vallin in the current (Feb/Mar 2019) issue of Focus, "Newsmagazine of the Mathematical Association of America." It's online here—scroll down past the cover page—but you may need to be a subscriber, so here's the content:

From the Editor [Shouldn't that be "Editrix"? … never mind—JD]

I'm very excited about this issue of MAA FOCUS. Preparing for this issue, I realized that I had a number of submissions about issues of inclusivity in our community. This is one of MAA's core values—advocating for and celebrating diversity by promoting mathematics for all and broadening access through initiatives to engage diverse audiences. I hope that you will see these themes throughout this issue.

While compiling this issue, I wanted to give voice to groups of people who have been historically silenced or heavily edited. To this end, I have tried to only lightly edit most of the articles appearing in this issue. My goal was for people to be heard as they intended. Therefore, the opinions in the articles belong to the authors and not necessarily to the MAA as an organization. I have not managed to hear all voices, but my goal is to continue to listen and give access to groups historically under-represented in mathematics. We hope that sharing these opinions leads to open communication and a conversation about ideas.

Thanks to all of you who have contributed to this issue. I am grateful for those who lent their voices, thoughts, opinions, and ideas.

Yecchh! I don't suppose being the editrix of a math newsletter pays worth a damn, but plainly Ms Jensen-Vallin is positioning herself for higher things: HR Director at Google or Twitter, perhaps.

And I can't help but admire the deft way she has absolved herself of the need to do much actual editing.

[Permalink]


Immatheracy in the human sciences.     Last year I gave a passing mention to Eric Turkheimer in my opinion piece on Charles Murray. Turkheimer is a field commander in the long rearguard action to defend the no-such-thing-as-race position against the steady advance of behavioral genetics.

I had always thought that Turkheimer was one of the sharpest knives in that particular drawer—a brilliant tactician, to flip back to the military metaphor—with a good understanding of all the main points (including, I mean, the ones he disagrees with) and formidable skill in the manipulation of data.

Well, there's another bubble burst. It turns out Turkheimer can't do middle-school math. On February 11th he tweeted:

Dumb (but real and research-related) question:

Y1 = aX² + bX + c
Y2 = dX² + eX + f

What is Y1 in terms of Y2?

r/t if you know a good high school math teacher

(I've sub- and super-scripted to make the tweet more readable.)

For Heaven's sake, man: Solve the second equation for X in terms of Y2, d, e, and f, using the standard formula for solution of a quadratic equation. Then substitute those values of X (there are two, of course—it's a QUADRATIC EQUATION) in the first equation. Next!
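
For readers who would rather watch a machine grind through that manipulation, here is a minimal sketch in Python using sympy; the symbol names just mirror the tweet, and it is only an illustration of the substitution described above, nothing Turkheimer himself wrote:

    import sympy as sp

    X, Y2, a, b, c, d, e, f = sp.symbols('X Y2 a b c d e f')

    # Solve the second equation for X in terms of Y2 (two roots, of course: it's a quadratic)
    roots = sp.solve(sp.Eq(Y2, d*X**2 + e*X + f), X)

    # Substitute each root into the first equation to get Y1 in terms of Y2
    for r in roots:
        print(sp.simplify(a*r**2 + b*r + c))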

Turkheimer is a Professor of Psychology. Can you really be a professor of anything without knowing basic algebraic manipulations? All right, anything in the sciences: but that should certainly include soft sciences like psychology and sociology. All the results out of the soft sciences that strike me as interesting involve heavy-duty statistics, i.e. math.

The responses to Turkheimer's query are even more depressing. There is massive math-ignorance—"innumeracy" doesn't quite catch it; this is algebra, not arithmetic; "immatheracy"?—in the comment threads of even the best human-science blogs, like Greg Cochran's.

Intersecting parabolas—and in one case hyperbolas—seem to be particularly popular; I cannot fathom why. So is confusion between coefficients and variables. So is the notion that since the two equations are really the same, just with different names for the coefficients, Y1 and Y2 must be equal. Ye gods!

Is elementary math really so remote from the understanding of educated people?

Greg's post on Turkheimer's math problem is titled "An extra sense." One of the commenters tells us that this comes from a Darwin quote: Darwin felt that people who were good at math had an extra sense, which he envied. Perhaps the old boy was on to something.

Silver lining: I'm feeling much better right now about my own Class III degree.

[Permalink]


Math Corner.     All right, enough of the stupid girly social stuff. Let's have some actual math.

Here's a nifty number:

f1 = 2.920050977316…

It's irrational (see the last sentence below) so the decimal form never recurs. The ellipsis at the right-hand end means "go ahead and compute a few more—or a few dozen more, or a few trillion more—decimal places, if you feel so inclined and have the resources."

So what's nifty about it? It contains within itself all the prime numbers, that's what.

Why is that nifty? Lotsa numbers contain all the primes: Euler's product formula for the zeta function gives you an infinity of such numbers, one for every value of s that is greater than 1. Heck: 0.235711131719232931… contains all the primes.

Sure; but try extracting the primes from the decimal expression of such numbers. You can work up algorithms, but they are way cumbersome and knotty.

The nifty thing about f1 is that the primes drop out one by one, via some very simple arithmetic. f1 packages up all the primes, but in a way that makes it exceptionally easy to unwrap the package. Thus:

First, break f1 into its integer part and its decimal part. I'll call them int(f1) and dec(f1).

Plainly int(f1) = 2 and dec(f1) = 0.920050977316…

Now carry out the following extremely simple arithmetic algorithm, taking k of course to be 1 in this instance. I shall call the algorithm Al (no, it's not one of his, I just like looking at the old boy):

  • Add 1 to dec(fk). Then
  • Multiply the answer by int(fk).

Well, that's not hard. Adding 1 to dec(f1) gets you 1.920050977316… If you now multiply that by int(f1), which is to say by 2, you get 3.840101954632…

Let's call that number f2. Then let's apply Al to f2.

int(f2) is of course 3; dec(f2) is 0.840101954632… Adding 1 to the latter and multiplying by 3 gets you 5.520305863896… I'm going to call that—guess what? — f3.

Applying Al to f3 means multiplying 1.520305863896… by 5. That gets you 7.601529319480…, which of course I shall call f4.

Applying Al to f4 means multiplying 1.601529319480… by 7. Answer: f5 = 11.210705236360…

See what's happening? The integer parts of f1, f2, f3, f4, … are just the prime numbers p1, p2, p3, p4, … in order.

This goes on being the case for as long as you keep going, although of course you will need more and more decimal places of f1 to keep it on the rails. As I said, all the prime numbers are contained within that one starting number, f1, and can be extracted via simple arithmetic. Nifty, or what?
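
If you want to watch Al at work, here is a minimal sketch in Python's Decimal arithmetic, seeded with the twelve decimal places of f1 given above. The rounding error gets multiplied by each prime extracted, so with only that much precision I stop after eight primes, which is safely within range:

    from decimal import Decimal, getcontext

    getcontext().prec = 30
    f = Decimal("2.920050977316")   # f1, to the digits given above
    primes = []
    for _ in range(8):              # eight primes is all twelve decimal places can safely give
        n = int(f)                  # int(f_k) is the next prime
        primes.append(n)
        f = n * (f - n + 1)         # Al: add 1 to dec(f_k), then multiply by int(f_k)
    print(primes)                   # [2, 3, 5, 7, 11, 13, 17, 19]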

While indubitably nifty, the result is not very surprising when you know how f1 is defined. It's defined as an infinite sum a1 + a2 + a3 + a4 + …, where each of the a's is a fraction. The numerator of an is pn − 1, where pn is the n-th prime number. The denominator is the primorial of pn-1.

(Which is to say, the product of all the primes up to the (n − 1)-th. The primorial of 2 is 2; the primorial of 3 is 2×3, the primorial of 5 is 2×3×5; the primorial of 7 is 2×3×5×7, and so on. For convenience, although 1 is never included among the primes, the zero-th primorial is defined to be 1.)

So:

a1 = (2 − 1) / 1, which is 1.
a2 = (3 − 1) / 2, which is also 1.
a3 = (5 − 1) / (2×3), which is 4/6, i.e. 2/3.
a4 = (7 − 1) / (2×3×5), which is 1/5.
a5 = (11 − 1) / (2×3×5×7), which is 1/21.
a6 = (13 − 1) / (2×3×5×7×11), which is 2/385.
 … …

And so on. If you add those up you get 2.919480…, so the sum of the a's is already closing in fast on f1.
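
Here is a minimal sketch, again in Python, of that partial sum computed with exact fractions; the hard-coded prime list is just for illustration. Because the primorial denominators grow so fast, fifteen terms already pin f1 down to more decimal places than a double can hold:

    from fractions import Fraction

    primes = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37, 41, 43, 47]

    total = Fraction(0)
    primorial = 1                             # the zero-th primorial is defined to be 1
    for p in primes:
        total += Fraction(p - 1, primorial)   # a_n = (p_n - 1) / primorial of the previous prime
        primorial *= p

    print(float(total))                       # ≈ 2.920050977316…, closing in on f1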

Since this definition of f1 involves all the primes, it's not astonishing that you can squeeze all the primes back out of f1. What's nifty is that you can do it by such a simple arithmetic manipulation. Thanks, Al!

(I lifted this from a paper in the January 2019 issue of The American Mathematical Monthly, which can be found in any college library. The paper is "A Prime-Representing Constant" by Dylan Fridman, Juli Garbulsky, Bruno Glecer, James Grime, and Massi Tron Florentin. It includes some interesting generalizations, and a proof that f1 is irrational.)

[Permalink]
