Now that I am retired, I give talks at conferences on the work I've been doing for a while and on open-source code I've written.
One of the questions I am constantly asked is whether what I am going to present is new.
This attestation is often required in order for the conference to cover the travel and/or lodging expenses involved. That sum is small compared with the time, at least a month, that goes into preparing each workshop or conference talk, and it does not begin to cover the unpaid time and effort spent developing the software or doing the research in the first place.
From my standpoint, remuneration in any amount is not only helpful (I am not super rich); it reflects more than a grudging interest on the conference's part in the material offered, and it signals at least some support for the vast amount of volunteer work I put into these projects.
The problem with this attestation of newness is that it immediately puts me on the defensive, shifting the focus away from the material and whether it is useful, interesting, and worthy. Newness is subjective, and it is hard to convince a doubter that anything is truly "new".
The words I will use in the talk are not new. Maybe even some sentences and sentiments are not new. But okay, is the idea "new"?
Well, even some of the ideas, such as giving context, are not new. If trees fall in the forest and no one hears them, is hearing one for the first time something new? Or maybe you've heard a tree fall before but never realized that that was the sound of a tree falling?
It is always possible that someone, somewhere, has already uttered an idea similar to something I'll say.
Maybe what is meant is whether the topic is new as a presentation at this kind of venue. Well, I will typically do a dry run of the talk for a smaller local group before the conference, so even by that measure the material has usually been aired somewhere already.
In the end, what is really being asked is for me to identify and describe, ahead of time, the things I might say that largely anonymous reviewers will consider new. I dislike participating in that kind of guessing game.
"Newness" is subjective; there is a spectrum of "newness". If someone wants to give you a hard time, there are a lot of opportunities.
Here is an example from my real-life experience.
Back in 1805, Carl Friedrich Gauss developed a method for evaluating a polynomial using nested multiplication. It was published only posthumously in 1876.
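To spell out what "nested multiplication" means (these days the technique is usually called Horner's rule): a cubic a*y^3 + b*y^2 + c*y + d can be rewritten as ((a*y + b)*y + c)*y + d, which takes only three multiplications to evaluate. The relevance to this story is that an integer multiplier written in binary is just such a polynomial with y = 2. For example, 13 is 1101 in binary, so 13x = (((1*2 + 1)*2 + 0)*2 + 1)*x; evaluated in nested form, that is doublings (shifts) interleaved with additions of x.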
Then, in 1969, Donald Knuth mentioned this in The Art of Computer Programming, Volume 2: Seminumerical Algorithms. I know very few people who have read that book. However, someone I worked with at IBM Research undoubtedly had, or at least said as much, and he was also a fan of Gauss. Around 1982 he used this method in the compiler for a pioneering RISC project.
That RISC CPU didn't have a general 32-bit by 32-bit multiply instruction. Instead, a full multiply had to be built from a sequence of 32-bit by 2-bit "multiply step" instructions, much the way long multiplication is taught in schools.
To speed up execution, the researcher used Gauss' method to produce short, fast, custom sequences of "add", "subtract", and "shift" instructions whenever the multiplier was an integer constant. For example, to multiply by 8, instead of a series of multiply-step operations, all you need is a single shift instruction, which the CPU could execute in one clock cycle. The CPU also had a "shift by an arbitrary amount and add" instruction that likewise took one clock cycle.
The innovation in this context was to search for multipliers with easy-to-compute factors: powers of two, and powers of two plus or minus one. For example, 1089x can be computed in two instructions as 33 * (33x), since 33 = 2^5 + 1; and 961x as 31 * (31x), since 31 = 2^5 - 1.
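To make that concrete, here is a small illustration in present-day Python (my own sketch for this post, nothing like the original compiler code; the helper names are made up). Each helper corresponds to what that hardware could do in a single fused shift-and-add or shift-and-subtract instruction:

```python
def times_33(x):
    return (x << 5) + x              # 33*x = (x * 2**5) + x: one instruction

def times_1089(x):
    return times_33(times_33(x))     # 1089*x = 33*(33*x): two instructions

def times_961(x):
    t = (x << 5) - x                 # 31*x = (x * 2**5) - x: one instruction
    return (t << 5) - t              # 961*x = 31*(31*x): two instructions

assert times_1089(7) == 1089 * 7 and times_961(7) == 961 * 7
```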
When I joined IBM Research, this researcher was leaving the compiler group, and so I was tasked with understanding what he had done (no one else in the group really did) and then revising the code to handle a new CPU type.
The researcher, who was lazy, suggested that I write up a paper on this and add him as a coauthor. My contribution to the code and to the search idea was to frame it as an alpha-beta search, which made a more exhaustive search over factors feasible. For example, factors of the form a power of two plus two could have been added to the search, although we didn't do that.
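For readers who want the flavor of the search, here is a toy sketch of the recursion over "easy" decompositions, with results cached. This is hypothetical Python written for this post, not the compiler's code or the algorithm as published, and I have left out the pruning of hopeless branches (the part I only later recognized as alpha-beta-like) to keep it short. It also produces only an instruction count, an upper bound on the true minimum, rather than the instruction sequence itself:

```python
from functools import lru_cache

def strip(m):
    """Divide out trailing factors of two; they get folded into a shift amount."""
    while m % 2 == 0:
        m //= 2
    return m

@lru_cache(maxsize=None)             # caching sub-results is essential here
def cost(n):
    """Instructions this decomposition finds for n*x, assuming one-cycle fused
    shift-and-add / shift-and-subtract instructions."""
    if n == 1:
        return 0                     # x itself costs nothing
    if n % 2 == 0:
        return cost(strip(n)) + 1    # n*x = (m*x) << k: a single shift
    best = min(cost(strip(n - 1)) + 1,   # n*x = ((m*x) << k) + x
               cost(strip(n + 1)) + 1)   # n*x = ((m*x) << k) - x
    k = 2
    while (1 << k) - 1 < n:          # try "easy" factors 2**k - 1 and 2**k + 1
        for f in ((1 << k) - 1, (1 << k) + 1):
            if f < n and n % f == 0:
                best = min(best, cost(n // f) + 1)  # n*x = ((m*x) << k) +/- m*x
        k += 1
    return best

assert cost(8) == 1 and cost(1089) == 2 and cost(961) == 2
```

With the cutoff added back, branches whose running instruction count already exceeds the best complete answer found so far are abandoned early, which is what keeps an otherwise exponential search tolerable.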
The idea is described in section 8-4 of the late Henry Warren's Hacker's Delight (ISBN-13: 978-0321842688). Sadly, the framing as an alpha-beta search is not well elucidated in the book either, and the connection was not made explicit in the paper that eventually appeared.
But as I've outlined here, a case can be made that this isn't so much new as a progression of ideas applied to a new domain. Gauss clearly didn't have in mind the smallest sequence of shifts, adds, and subtracts for multiplying by a specific integer constant, let alone the idea of performing an alpha-beta search to reduce the exponential search space. And Knuth doesn't mention this either.
When I submitted the initial paper, I was informed that this idea was not new. Did the reviewer know about Knuth or Gauss? If he or she did, neither was cited.
What was cited instead was an early Pascal P-code compiler. In it, someone had noticed that a multiplication by a power of two minus one can be done with a shift instruction followed by a subtract instruction, and that a multiplication by a power of two can be replaced with a single shift instruction. This kind of thing is called a peephole optimization. That code replacement considered only those two specific cases. Multiplication by a power of two plus one was not, I think, handled. Multiplication by two "short" factors was definitely not performed; there was no search procedure in place, let alone alpha-beta searching, which requires caching values.
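For contrast with the search described above, the P-code rewrite amounted to something like the following (again hypothetical Python of my own; the actual compiler long predates Python, and I am going only by the description):

```python
def peephole_multiply(x, c):
    """The two special cases reportedly handled; everything else falls through
    to an ordinary multiply."""
    if c > 0 and c & (c - 1) == 0:                    # c is a power of two
        return x << (c.bit_length() - 1)              # a single shift
    if c > 0 and (c + 1) & c == 0:                    # c is a power of two minus one
        return (x << ((c + 1).bit_length() - 1)) - x  # shift, then subtract
    return x * c

assert peephole_multiply(7, 8) == 7 * 8 and peephole_multiply(7, 15) == 7 * 15
```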
More important than "newness", from the reviewer's point of view, was my failure to cite this work. I was informed by that reviewer that if the paper were resubmitted, I would definitely have to cite the technical report for the Pascal P-code compiler.
As I said, this work wasn't really a paper on the topic but rather one section in a technical report on the implementation of a Pascal compiler. The final published paper omitted mention of the peephole optimization, probably for reasons of length and because it was not central to the compiler's implementation.
The other thing that irks me about this is that while I had to credit this source, which is obliquely related at best, I can't credit or attribute the faulty thinking to anyone, because the review was anonymous.
(At IBM, when an idea was truly dumb, one way to cover that up was to mark the idea IBM Confidential, or even better, IBM Super-confidential.)
There wasn't an opportunity to discuss this, even anonymously. You know: everyone is too busy, reviewing is volunteer time, and there were so many other papers to consider.
But the anonymous reviewer made sure to let me, the editor, and the other reviewers know that he or she was an expert in this field, so that, on the strength of this self-assessed certification of excellence, the (flawed) pronouncement of non-novelty and non-worthiness would carry great weight.
Thank you, anonymous asshole.
I rewrote the paper and made sure to cite the largely irrelevant peephole-optimization work. By then, my IBM Research colleague was totally uninterested in assisting, and on the advice of my other colleagues at IBM, his name was dropped from the resubmitted paper. Publication was delayed by about two years due to all the silliness in the initial review, the lag in the review process, and the backlog of papers waiting to be published.
By that time, I had left the compiler group, moved on to other topics, and forgotten about this.
But unbeknownst to me, someone else read the paper and then wrote a paper on how to improve the search. Assuming it went through the same rigorous procedure that was applied to my submission, that paper must have been considered "new".
The problem was that this person called it "Bernstein's" method because, you know, although the idea wasn't "new," it was useful to have a name to refer to it by, and I suppose that person wasn't aware of Gauss's, Knuth's, or my IBM colleague's prior work.
As a result, my paper became more popular, and with the method now being called "Bernstein's algorithm," all of this came to the attention of my IBM colleague. It rightfully irked him. (In truth, though, he had never mentioned to me that the idea probably came from Gauss via Knuth, or I would have been sure to say so. I learned that fact only later, from Hank Warren.)
So, many years later, I was contacted by the editor of the journal where the paper was published (not by my former IBM colleague, who by then had left IBM Research). The editor let me know that he had been contacted by that colleague, who wanted to retroactively change the authorship of the paper. I said that I was willing to write a new paper with my former colleague, one that included the ideas and discoveries made since the original was written.
For example, as I mentioned, I hadn't realized at the time that what I was doing was alpha-beta search. And one might consider the resulting integer sequence, the minimal number of instructions needed for each multiplier, for inclusion in Sloane's OEIS (the On-Line Encyclopedia of Integer Sequences). That sequence has the property that consecutive terms can increase or decrease by at most one, since an addition or subtraction always costs exactly one instruction. Also, the length of the instruction sequence seems to grow with the log of the multiplier being computed.
And as part of that, the history, the name "Bernstein's algorithm," and the paper's authorship could all be cleared up.
Alas, this never happened. My IBM colleague showed no interest, just as he had abandoned interest after explaining the code to me, something that was more or less required of him as part of his employment. The Dr. Seuss story "Horton Hatches the Egg" comes to mind.
So let this serve to set the record straight, as well as to clarify my thoughts on the subjectivity of "newness".


