Sun, 09 Sep 2018

I very recently suggested a mathematical operation that does this: $$\begin{align} \left((\sqrt\bullet) \cdot x + \left(\frac1\bullet\right) \cdot 1 \right) ⊛ (9x+4) & = \sqrt9 x^2 + \sqrt4 x + \frac19 x + \frac14 \\ & = 3x^2 + \frac{19}{9} x + \frac 14 \end{align}$$

Here the left-hand argument is like a polynomial, except that the coefficients are functions. The right-hand argument is an ordinary polynomial.

It occurs to me that the APL programming language (invented around 1966) actually has something almost like this, in its generalized matrix product. In APL, if you write !!a +.× b!!, you get the ordinary matrix product, with $$c_{ij} = a_{i1} \cdot b_{1j} + a_{i2} \cdot b_{2j} + \cdots + a_{in} \cdot b_{nj}.$$ The APL !!+.×!! is one instance of a general !!f.g!! operation, in which !!g!! combines the paired entries, in place of multiplication, and !!f!! folds the combined results together, in place of addition. With this feature, the ⊛ operator I proposed above would be something
like !!+.∘!!, with function composition in place of multiplication.

APL does have a dot product: if !!a!! is a !!1×n!! matrix and !!b!! is !!n×1!!, then !!a +.× b!! is the !!1×1!! matrix whose single entry is $$c_{11} = a_{11} \cdot b_{11} + a_{12} \cdot b_{21} + \cdots + a_{1n} \cdot b_{n1},$$ and similarly if !!a!! is !!n×1!! and !!b!! is !!1×m!! then !!a +.× b!! is the outer product, the !!n×m!! matrix whose !!c_{ij} = a_i × b_j!!. But I think APL doesn't distinguish between a !!1×n!! matrix and a vector, and always considers them to be vectors, so that in such cases !!a +.× b!! always gets you the dot product if !!a!! and !!b!! are the same length, and an error otherwise. If you want the outer product of two vectors you use !!a ∘.× b!! instead.

I applied for an APL job once; I went to a job fair (late 1980s maybe?) and some Delaware bank was looking for APL programmers to help maintain their legacy APL software. I was quite excited at the idea of programming APL professionally, but I had no professional APL experience, so they passed me over. I think they made a mistake, because there are not that many people with professional APL experience anyway, and how many twenty-year-olds are there who know APL and come knocking on your door looking for a job? But whatever, it's probably better that I didn't take that route.

I was pleased to find out that Iverson had designed a successor language, J, and then quickly disappointed when I saw how little it added. For example, it has an implicit “hook” construction, a special piece of syntax for handling one particular pattern of function composition. In Haskell it would be:
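    -- J's “hook”: (f g) y means y f (g y); a rough Haskell rendering
    hook :: (a -> b -> c) -> (a -> b) -> a -> c
    hook f g y = f y (g y)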
but in J the construction is implicit: you just write the two functions next to each other. [ Addendum 20180910: The explanation. ]

Meanwhile the awful APL notation has gotten much more awful in J, and you get little in return. You even lose all the fun of the little squiggles. Haskell is a much better J than J ever was. Haskell's notation can be pretty awful too.

I thought I'd see about implementing APL's generalized matrix product in Haskell. For a regular matrix product, !!C = AB!! means that !!c_{ij}!! is the dot product of the !!i!!th row of !!A!! and the !!j!!th column of !!B!!, so I implemented a dot product function:
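    -- a sketch; the foldr and its base case 0 will matter later
    dot_product :: Num a => [a] -> [a] -> a
    dot_product xs ys = foldr (+) 0 (zipWith (*) xs ys)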
OK, that was straightforward. The rows of !!A!! are right there, but we also need the columns from !!B!!, so here's a function to get those:
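    -- the columns of a matrix are the rows of its transpose
    transpose :: [[a]] -> [[a]]
    transpose ([]:_) = []
    transpose xss    = map head xss : transpose (map tail xss)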
Also straightforward. After that I toiled for a very long time over the matrix product itself. My first idea was to turn !!A!! into a list of functions, each of which would dot-product one of the rows of !!A!! by a given vector. Then I would map each of these functions over the columns of !!B!!. Turning !!A!! into a list of functions was easy:
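    map dot_product a      -- one row-dotting function per row of A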
and getting the columns of !!B!! I had already done:
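    transpose b            -- the columns of B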
and now I just need to apply each row of functions in the first part to each column in the second part and collect the results:
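    -- the plan, schematically; ??? is the combining step I didn't have yet
    -- matrix_product a b = ??? (map dot_product a) (transpose b)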
I don't know why this turned out to be so damn hard. This is the sort of thing that ought to be really, really easy in Haskell. But I had many difficulties. First I wasted a bunch of time trying to get
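    -- the shape I wanted, each function's results kept in their own
    -- list, written with a hypothetical operator `apply`:
    --     [(+1), (*2)] `apply` [10, 20, 30]
    --       == [[11, 21, 31], [20, 40, 60]]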
whereas
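    -- the standard list applicative gives back a single flat list:
    --     [(+1), (*2)] <*> [10, 20, 30]
    --       == [11, 21, 31, 20, 40, 60]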
and I needed to keep that extra structure. I tried all sorts of tinkering with the applicative combinators, but never found the thing I wanted.

Another part of the problem was I didn't know any primitive for “map a list of functions over a single argument”. Although it's not hard to write, I had some trouble thinking about it after I wrote it:
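    -- map a list of functions over a single argument; it's “map”
    -- backwards, so say pam (an assumed name)
    pam :: [a -> b] -> a -> [b]
    pam fs x = map ($ x) fs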
Then the “map each function over each list of arguments” is
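    map . pam      -- :: [a -> b] -> [a] -> [[b]]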
and this almost works, except it produces the columns of the results instead of the rows. There is an easy fix and a better fix. The easy fix is to just transpose the final result. I never did find the better fix; I thought I'd be able to replace the `map . pam` with some similar composition that produced the rows directly, but I never found one that worked. Anyway this did work:
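    -- works, at the cost of transposing the final result
    matrix_product a b =
      transpose (map (pam (map dot_product a)) (transpose b))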
but that `transpose` on the front kept bothering me. So then I went down a rabbit hole and wrote nine more versions of `matrix_product` before I finally settled on one.
The result was okay, but it took me so long to get there. Now I have the !!+.×!! version; to get the general !!f.g!! product I should be able to just use:
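    -- generalizing: g in place of (*), f in place of (+)
    general_dot_product f g xs ys = foldr f 0 (zipWith g xs ys)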
Except uh oh, that `0`. It was fine when the fold was addition, but an arbitrary operation has no base element I can hand to `foldr`. I tinkered a bit with requiring a Monoid instance for the matrix entries, which seemed interesting at least, but to do that I would need to switch monoids in the middle of the computation and I didn't want to think about how to do that. So instead I wrote a version of `foldr` with no base case:
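    -- a fold with no base case (the Prelude spells this foldr1)
    foldr' f [x]    = x
    foldr' f (x:xs) = f x (foldr' f xs)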
This fails on empty lists, which is just fine, since I wasn't planning on multiplying any empty matrices. Then I have the final answer:
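    -- a sketch of the final version: APL's  a f.g b
    general_matrix_product f g a b =
        transpose (map (pam (map dot a)) (transpose b))
      where dot r c = foldr' f (zipWith g r c)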
It's nice and short, but on the other hand it has that mysterious `transpose` in it. As for the shortness, let's see what it looks like in a more conventional language:
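    # a rough Python counterpart (a sketch)
    def matrix_product(f, g, a, b):
        rows, cols, inner = len(a), len(b[0]), len(b)
        c = []
        for i in range(rows):
            c.append([])                  # extend the result, row by row
            for j in range(cols):
                acc = g(a[i][0], b[0][j])
                for k in range(1, inner):
                    acc = f(acc, g(a[i][k], b[k][j]))
                c[i].append(acc)
        return c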
Wow, that was amazingly easy.
Okay, that was kind of a mess. The explicit index bookkeeping is nothing to be proud of. I think the standard Python answer to this is that you don't need the higher-order-function apparatus, because you have list comprehensions instead.
I don't know how I feel about that argument in general but in this case the result was lovely. I have no complaints.

While I was writing the Python program I got a weird bug that turned out to be related to mutability: I had initialized the result matrix with something like `c = [[0] * num_cols] * num_rows`.
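In miniature, the trap looks like this:

    >>> c = [[0] * 3] * 2      # two references to one row, not two rows
    >>> c[0][0] = 99
    >>> c
    [[99, 0, 0], [99, 0, 0]]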
But this makes the rows of `c` all the same list object, so an assignment to one entry shows up in every row. A lot of the mess in the code is because Python is so obstinate about extending lists when you need them extended; you have to say pretty please every time. Maybe I can get rid of that by using more list comprehensions?
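    from functools import reduce

    # the same thing with list comprehensions (a sketch)
    def matrix_product(f, g, a, b):
        return [ [ reduce(f, [g(x, y) for x, y in zip(row, col)])
                   for col in zip(*b) ]   # zip(*b) yields the columns of b
                 for row in a ]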
Python's list comprehensions usually make me long for Haskell's, which are so much nicer, but this time they were fine. Python totally wins here. No wait, that's not fair: maybe I should have been using list comprehensions in Haskell also?
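    -- the same thing with a list comprehension
    general_matrix_product f g a b =
      [ [ foldr' f (zipWith g row col) | col <- transpose b ]
        | row <- a ]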
Yeah, okay. All that `pam` machinery and the final `transpose` just evaporate.
Well, lesson learned. I really wish I could write Haskell faster. In the mid-1990s I wrote thousands of lines of SML code, and despite (or perhaps because of) SML's limitations I was usually able to get my programs to do what I wanted. But when I try to write programs in Haskell it takes me a really long time to get anywhere.

Apropos of nothing, today is the 77th birthday of Dennis M. Ritchie.

[ Addendum: It took me until now to realize that, after all that, the operation I wanted for polynomials is not matrix multiplication. Not at all! It is actually a convolution: $$ c_k = \sum_{i+j=k} a_ib_j $$ or, for my weird functional version, replace the multiplication !!a_ib_j!! with function composition !!a_i ∘ b_j!!. I may implement this later, for practice. And it's also tempting to try to do it in APL, even though that would most likely be a terrible waste of time… ]

[ Addendum 20180909: Vaibhav Sagar points out that my … ]