2016-06-03T12:43:15Z ecraven: hm.. APL would be *much* easier to understand if every symbol didn't mean at least three different things, depending on context
2016-06-03T12:46:29Z LeoNerd: English too
2016-06-03T12:46:40Z ecraven: LeoNerd: to me at least APL seems much worse :)
2016-06-03T12:46:58Z LeoNerd: Well, you've probably been using English longer than APL
2016-06-03T12:47:03Z ecraven: definitely :)
2016-06-03T12:49:16Z mokuso: J looked nice. But APL looks pretty weird for my taste..
2016-06-03T12:49:39Z ecraven: mokuso: right now, I feel the opposite, but I have looked at APL for two days now, and J only for a few minutes
2016-06-03T12:50:25Z mokuso: yeah, that's how it goes, more or less, with languages or any activity :)
2016-06-03T12:50:48Z ecraven: ah, I can even write things like (⍳10),[0.5]×\ ⍳10 now and not feel completely lost :D
2016-06-03T12:51:45Z mokuso: Heh, it looks very mathematical, with all those symbols, and maybe it is, since Iverson was a mathematician. Haven't bothered playing around with APL yet. Maybe sometime :)
2016-06-03T12:51:48Z ecraven: strangely APL feels like what R *should* be :)
2016-06-03T12:51:56Z mokuso: oh :>
2016-06-03T12:51:59Z ecraven: mokuso: it's impressive for array/vector stuff
2016-06-03T12:52:17Z ecraven: opens up a whole lot of new perspective if you've never used something like it before
2016-06-03T12:52:22Z mokuso: would you recommend it then for numerics and/or large dataset manipulations?
2016-06-03T12:52:25Z ecraven: like forth, smalltalk, scheme, haskell
2016-06-03T12:53:01Z ecraven: mokuso: I'm not good enough with it to actually comment, but I think it could be a very good tool for the things I'm trying to use R for now... R never made much sense to me
2016-06-03T12:53:08Z mokuso: does it need as much RAM as R, relative to the size of your data files?
2016-06-03T12:53:16Z DGASAU: mokuso: there're rumors that it was used in that field for a long time.
2016-06-03T12:53:20Z ecraven: but the ultimate goal would be to extend Scheme (probably only by library) to support array/vector based stuff
2016-06-03T12:53:56Z mokuso: interesting... thanks guys, I'll try APL when I find some time running some R algorithms
2016-06-03T12:54:30Z ecraven: I'd add APL (or J, or one of the others) to the list of "you should look at this language at least once in your life to learn new things"
2016-06-03T12:56:14Z mokuso: I haven't touched J for 3 years or so, not sure if I will now... It felt nice, though, and I liked that Iverson had written some math textbooks with J
2016-06-03T12:56:40Z ecraven: well, I really don't know too well, but J should be very similar to APL in spirit, if not in actual syntax
2016-06-03T12:56:46Z ecraven: maybe a bit like CL and Scheme?
2016-06-03T12:56:53Z ecraven: same thing, but different :p
2016-06-03T12:57:31Z mokuso: it doesn't have as many mathematical symbols/operators as APL, so it doesn't look like an arcane script. But it is processed from right to left xD
2016-06-03T12:58:01Z ecraven: yea, but I think the basic concepts are very similar in J and APL
2016-06-03T12:58:12Z ecraven: just the actual symbols used differ, and some details too
2016-06-03T12:58:16Z mokuso: I suppose they are. It seems J is like an APL for the public
2016-06-03T13:11:35Z lloda: I wrote a J-ish library for Guile https://notabug.org/lloda/guile-ploy. Does function rank, rank conjunction, etc. but it's unfinished and a bit of a mess. I'm very interested in Remora!
2016-06-03T13:33:30Z ecraven: lloda: thanks, I'll have a look later!
2016-06-03T13:53:48Z ecraven: ah, you have to love sentences like these: "It is obviously an inner product. (X-1⌽X) +.× (Y+1⌽Y)÷2"
2016-06-03T14:42:45Z ecraven: naming your language with one letter (J in this case) *really* makes it hard to find packages for it, for example
2016-06-03T15:29:00Z lloda: "J language" or "J software" usually finds links
2016-06-03T15:29:12Z lloda: Jsoftware's mailing list is very active
2016-06-03T18:02:02Z jcowan: ho hey
2016-06-03T18:08:16Z ecraven: hey jcowan :)
2016-06-03T18:12:03Z ecraven: jcowan: a few things that came to mind right away when looking at http://trac.sacrideo.us/wg/wiki/ArraysCowan: Is it really necessary to have lower and upper bounds? shouldn't the lower bound always be 0 (or am I misunderstanding something)? I think I'm also missing rotation functions (⌽⍉⊖ in APL).
2016-06-03T18:12:11Z ecraven: have you started on an implementation yet?
2016-06-03T18:13:32Z ecraven: jcowan: also, have you thought about adding a function/syntax to define new "scalar dyadic pervasive" functions, so that normal binary functions like +, -, /, *, etc. can easily be used on arrays?
2016-06-03T18:21:06Z jcowan: ecraven: I haven't started any implementation
2016-06-03T18:21:32Z jcowan: scalar dyadic (or n-adic) pervasive (if that means element-wise) functions are just wrappings of array-map.
2016-06-03T18:21:45Z jcowan: Rotation is in the todo
2016-06-03T18:22:03Z jcowan: because it's not clear if it needs to exist separate from rearrange
2016-06-03T18:22:47Z jcowan: and people do seem to want lower bounds other than all-0, at least in the Fortran world
2016-06-03T18:27:18Z ecraven: hm.. to me, they seem like an unnecessary complication that could be added on top of the basic layer if needed
2016-06-03T18:27:37Z ecraven: but as I have written 0 lines of productive array code, that might not count for much
2016-06-03T18:29:40Z jcowan too
2016-06-03T18:29:50Z jcowan: multi-dimensional arrays are a rather special-purpose data structure
2016-06-03T18:29:52Z ecraven: jcowan: does array-map work on the "leaves" of the array, or on each direct element?
2016-06-03T18:29:58Z jcowan: Just so
2016-06-03T18:30:20Z jcowan: Wait, I'm not sure what distinction you are making.
2016-06-03T18:31:07Z ecraven: I think it only makes a difference if you have what APL calls "nested arrays"
2016-06-03T18:31:23Z ecraven: so not just an n-dimensional array of numbers, but some of the elements are arrays too
2016-06-03T18:32:12Z jcowan: Oh. There's only a little support for that: recursive-ref and collapse/explode
2016-06-03T18:32:30Z jcowan: of course, as this is Scheme there is no reason why array elements must all be of the same type either.
2016-06-03T18:32:48Z LeoNerd: "homogenous"
2016-06-03T18:33:07Z jcowan: So array-map works on each element, not caring if it is an array or not (but the mapping procedure may of course care)
2016-06-03T18:34:51Z Riastradh: I think you should have covariant and contravariant arrays.
2016-06-03T18:34:52Z ecraven: I just tried, in APL that is what "pervasive" means, the primitive function is applied only when reaching a scalar
2016-06-03T18:35:19Z ecraven: so + works on any sort of nesting, as long as the nesting is the same for each parameter
2016-06-03T18:35:28Z Riastradh: Also array contraction and outer products.
2016-06-03T18:35:36Z ecraven: Riastradh: outer products is there :D
2016-06-03T18:36:10Z ecraven: Riastradh: is there a way to get MIT/GNU Scheme to accept utf-8 for function names?
2016-06-03T18:36:30Z Riastradh: ecraven: Maybe `-*- coding: utf-8 -*-' but I'm not sure.
2016-06-03T18:36:52Z Riastradh: ecraven: I should have clarified: nested covariant and contravariant arrays.
2016-06-03T18:37:07Z Riastradh: Like scmutils' up tuples and down tuples.
2016-06-03T18:37:51Z ecraven: Riastradh: Illegal character: #\U+8c (when trying with (define (⌽ a) a)). ah well, not that much of a problem anyway :) thanks!
2016-06-03T18:38:29Z Riastradh: Multiplying down by up is inner product; multiplying up by down is outer product; rules apply recursively to nested up/down tuples.
2016-06-03T18:39:35Z jcowan: AFAICT array contraction is entirely an implementation optimization
2016-06-03T18:40:44Z Riastradh: Not what I mean, then.
2016-06-03T18:41:18Z jcowan reads the scmutils docs
2016-06-03T18:41:28Z ecraven: hehe, me too :)
2016-06-03T18:44:10Z jcowan: Trying to understand it leads me to dual vectors, which leads me off in the wikiweeds. WP mathematical articles just blatantly violate the principles of accessibility that apply to other WP articles.
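A rough sketch of the "pervasive" behaviour ecraven describes above, using plain nested Scheme vectors rather than ArraysCowan or any other array proposal; the names pervasive and v+ are invented here purely for illustration.

;; Lift a binary scalar procedure so that it recurses into nested
;; vectors and applies the procedure only when it reaches scalars,
;; as ecraven describes for APL's pervasive functions.
;; Both arguments are assumed to have the same nesting structure.
(define (pervasive f)
  (define (go a b)
    (if (vector? a)
        (vector-map go a b)   ; recurse into matching elements
        (f a b)))             ; scalar case: apply the primitive
  go)

(define v+ (pervasive +))
;; (v+ #(1 2 #(3 4)) #(10 20 #(30 40)))  =>  #(11 22 #(33 44))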
2016-06-03T18:44:17Z Riastradh: Inner product is an example of array contraction, usually presented as `row vector * column vector'.
2016-06-03T18:44:44Z Riastradh: (`Tensor contraction' is the mathy term, except usually it requires a more homogeneous structure than is always convenient.)
2016-06-03T18:46:18Z Riastradh: The initial idea is that a `column vector' means a point in R^n, and a `row vector' means a linear functional from R^n to R represented by its coefficients of the coordinates of the input.
2016-06-03T18:46:46Z ecraven: Riastradh: are co- and contravariant arrays other words for up and down tuples?
2016-06-03T18:46:53Z Riastradh: Then if you `apply' (multiply) row(42, 87, -4) to (by) column(1, 2, 3), you get 42*1 + 87*2 - 4*3.
2016-06-03T18:47:19Z Riastradh: That is the simplest case of contraction.
2016-06-03T18:47:28Z jcowan: Right. APL inner product (which is what I'm providing) does that: its signature is proc1 proc2 array1 array2
2016-06-03T18:47:33Z ecraven: hm. I must have SICM somewhere around here
2016-06-03T18:48:27Z Riastradh: A matrix can be seen either as a row of column vectors, or as a column of row vectors -- either as a transformation of the column vector space or as a transformation of the row vector space, depending on which side you do the multiplication on.
2016-06-03T18:49:04Z jcowan nods
2016-06-03T18:49:10Z Riastradh: Specifically, if A is an n-by-n matrix, and phi is a row vector in R^n, and x is a column vector in R^n, then phi A is also a row vector in R^n, and A x is also a column vector in R^n.
2016-06-03T18:49:57Z Riastradh: Both phi A and A x are kinds of contractions as well, because one of the inputs (A) is larger in some dimension than the output.
2016-06-03T18:50:29Z Riastradh: And if you do phi A x, whether via (phi A) x or via phi (A x), you get back a real number -- a further contraction.
2016-06-03T18:51:31Z ecraven: page 494 of my SICM explains up/down tuples in more depth
2016-06-03T18:53:31Z Riastradh: Suppose you have a function f: R ---> R, e.g. f(x) = x^2. The derivative of f at a point x is a number you multiply an increment in x by to get an increment in f(x): f(x + delta) ~= f(x) + f'(x) * delta.
2016-06-03T18:54:15Z Riastradh: Another way to look at the codomain of f is not as the space of real numbers, but as the space of linear maps of real numbers.
2016-06-03T18:55:29Z Riastradh: E.g., for a real number y, the map z |---> y z is a linear map. The space of linear maps on R, Hom(R), is a vector space in itself, and an obvious way to endow it with coordinates is by its value at the real number 1 -- thus the one coordinate for the map z |---> y z is y.
2016-06-03T18:57:32Z Riastradh: Hence you could also say that the derivative of f at a point x, written Df(x) instead of f'(x) to emphasize this conceptual distinction, is a *linear map* from an increment in x to an increment in f(x): f(x + delta) ~= f(x) + Df(x)(delta), where Df(x): R ---> R is a linear map, given by delta |---> f'(x) * delta.
2016-06-03T18:59:15Z Riastradh: Now consider g: R ---> R^2, e.g. g(t) = (cos t, sin t). The derivative of g at a point t is a linear map from an increment in t to an increment in g(t). No longer does a single real number suffice to represent this derivative, so g'(t) has to be an element of R^2: if g(t + delta) ~= g(t) + g'(t) * delta is to hold, then since g(t) is an element of R^2 and delta is a real number, so must g'(t) be an element of R^2.
2016-06-03T19:00:16Z Riastradh: Correspondingly, instead of g'(t) \in R^2, we can write it as Dg(t): R ---> R^2 is a linear map, or Dg(t) \in Hom(R, R^2).
2016-06-03T19:01:40Z Riastradh: Now if we have h(x, y) = x^2 + y^2 - 1, so that h: R^2 ---> R, we could also say that h'(x, y) is an element of R^2, being the vector of partial derivatives of h at x and y, but that fails to capture the distinction that Dh(x, y) \in Hom(R^2, R), as opposed to Dg(t) \in Hom(R, R^2).
2016-06-03T19:02:12Z Riastradh: So simply using a single kind of vector/array for both g' and h' is unsatisfactory.
2016-06-03T19:02:25Z Riastradh: It gets worse!
2016-06-03T19:02:39Z jcowan: So in essence up vs. down is a type system?
2016-06-03T19:04:07Z Riastradh: In classical mechanics, one often considers a function of two parameters: position and velocity. A velocity is itself a derivative. E.g., L(x, v) = 1/2 k x^2 + 1/2 m v^2, for a mass m on a spring with spring constant k.
2016-06-03T19:04:17Z Riastradh: jcowan: Yes, of a sort.
2016-06-03T19:06:00Z Riastradh: Describing this function as L: R x R ---> R is easy enough. We're often concerned with its partial derivatives, d_0 L and d_1 L (written in traditional physics as d/dq L and d/dq. L, sometimes with or without L(q, q.), for maximal confusion), which are easy enough to work with when we're concerned only with a single dimension.
2016-06-03T19:07:08Z Riastradh: But what if we want three dimensions? Now x is a point in R^3 -- and v is a *derivative* of a path in R^3, i.e. a linear map from R to R^3. So we have L: R^3 x Hom(R, R^3) ---> R.
2016-06-03T19:07:59Z Riastradh: Now what is the derivative of L at a point? It will be a linear map from an increment in an (x, v) pair, say (delta, eta), to an increment in the value of L(x, v), say L(x + delta, v + eta).
2016-06-03T19:08:33Z Riastradh: So the derivative of L at a point (x, v) has to be an element of Hom(R^3 x Hom(R, R^3), R).
2016-06-03T19:09:04Z Riastradh: When you apply it to (or multiply it by) an increment in R^3 x Hom(R, R^3), it has to give back an R.
2016-06-03T19:09:12Z Riastradh: This is a more complex array contraction!
2016-06-03T19:10:02Z Riastradh: `That's OK,' you say. `Since Hom(R, R^3) is isomorphic to R^3, it follows that R^3 x Hom(R, R^3) ~= R^3 x R^3 ~= R^6, so we can just represent this by a 6-dimensional vector and use a standard R^n inner product to compute it.'
2016-06-03T19:11:14Z Riastradh: But now suppose instead of the path of one particle, we have a continuum of paths of many particles, e.g. the deformation of a surface over time. Then while our positions are still points in R^3, we don't just have one path through them -- we have a parametrized family of paths through them.
2016-06-03T19:12:44Z Riastradh: Maybe it'll just be the deformation of a string, not a surface, over time.
2016-06-03T19:12:59Z Riastradh: Now a path is not merely a function w: R ---> R^3, giving the coordinates w(t) of a single particle at a single time t. Now a path is a function w: R^2 ---> R^3, giving the coordinates w(t, p) of a particle p along the string at time t.
2016-06-03T19:14:13Z Riastradh: Here we have a linear map Dw(t, p): R^2 ---> R^3, or Dw(t, p) \in Hom(R^2, R^3), which sends an increment in time and space along the string to an increment in the position taken by a particle on the string. That's a velocity!
2016-06-03T19:14:37Z Riastradh: Now our L(x, v) function has signature L: R^3 x Hom(R^2, R^3) ---> R.
2016-06-03T19:15:37Z Riastradh: At this point, attempting to simply flatten the coordinates out into an element of R^3 x R^6 ~= R^9 falls apart, because the rules for contracting a representation of Hom(R^2, R^3) are different from the rules for contracting R^6.
2016-06-03T19:18:28Z Riastradh: In scmutils, elements of R^n are represented by up tuples of real numbers, and elements of Hom(R^n, R) by down tuples of real numbers. Multiplying a down tuple (Hom(R^n, R)) by an up tuple (R^n) applies the linear map to a point and returns a real number.
2016-06-03T19:20:08Z Riastradh: But the rules apply recursively to nested up/down tuples -- the equation (* (down x y) (up a b)) = (+ (* x a) (* y b)) applies recursively when x or a or y or b is not a real number but a tuple itself.
2016-06-03T19:21:23Z ecraven: Riastradh: so in fact this is a problem for *, but in order to properly deal with it, the underlying array storage has to somehow support the up/down distinction?
2016-06-03T19:23:47Z Riastradh: So for one-dimensional paths (i.e., paths of a single particle), a position can be (up x y z) and a velocity, (up u v w); multiply velocity by a change in time delta_t and you get (up delta_x delta_y delta_z) = (up (* u delta_t) (* v delta_t) (* w delta_t)).
2016-06-03T19:24:35Z Riastradh: For two-dimensional paths (i.e., paths of a continuum of particles along a string), a position is still (up x y z), but a velocity is now (down (up ut vt wt) (up up vp wp)). Multiply it by an increment in time and string-position (up delta_t delta_p), and you get (up delta_x delta_y delta_z) = (up (+ (* ut delta_t) (* up delta_p)) ...).
2016-06-03T19:25:20Z Riastradh: (Heh. `up' was perhaps not a great name for the partial velocity with respect to p. Should've written u_t, v_t, w_t, u_p, v_p, w_p.)
2016-06-03T19:25:36Z ecraven: hehe, well, context suffices to clear things up
2016-06-03T19:25:38Z Riastradh: ecraven: Correct. In scmutils it's just two kinds of flat vectors.
2016-06-03T19:26:19Z ecraven: Riastradh: does anything but scmutils support this in this way?
2016-06-03T19:26:50Z Riastradh: Now what's an example derivative of L: R^3 x Hom(R^2, R^3) ---> R? Well, it's a linear map from an increment in space, (up delta_x delta_y delta_z), and an increment in velocity, (down (up delta_ut delta_vt delta_wt) (up delta_up delta_vp delta_wp)), and spits out a real number.
2016-06-03T19:27:56Z Riastradh: At this point I can't just do it in my head, so this is probably not right, but I encourage the dedicated reader to fix it: I think it'll be (down (down dx dy dz) (up (down dut dvt dwt) (down dup dvp dwp))).
2016-06-03T19:28:08Z Riastradh: ecraven: I'm not aware of any other system that does this.
2016-06-03T19:29:09Z Riastradh: Note that there is no straightforward representation of this as simply a large n-by-m matrix.
2016-06-03T19:29:22Z ecraven: Riastradh: do I understand correctly that the distinction between up and down is not something that "the system" can do automatically, but "the user" has to make this distinction when entering data?
2016-06-03T19:29:42Z Riastradh: Similar problems arise when you try to do matrix calculus, e.g. for the derivatives of Gaussian process covariance kernels.
2016-06-03T19:30:05Z Riastradh: (Nothing related to celestial mechanics for that application, just to emphasize that this is not a problem confronted solely by celestial mechanics like Gerry Sussman.)
2016-06-03T19:30:28Z Riastradh: ecraven: Yes -- just as you have to keep your rows and columns straight when entering matrices.
2016-06-03T19:33:08Z ecraven: so for "normal" applications, everything would be up tuples?
2016-06-03T19:33:13Z Riastradh: ecraven: However, you don't have to write down the structure of, e.g., derivatives of L yourself -- automatic differentiation takes care of that.
2016-06-03T19:33:39Z ecraven: normal is the wrong word, maybe "simple"?
2016-06-03T19:33:47Z Riastradh: ecraven: When you're writing down, e.g., positions and velocities of single particle paths, yes.
2016-06-03T19:34:05Z ecraven: also things like financial maths or statistics
2016-06-03T19:34:54Z Riastradh: This was the story for Lagrangians. The story for Hamiltonians is different, because they deal in momenta, not in velocities -- and while a velocity of a path R ---> R^3 is also an element of R^3 (an increment in R^3), a momentum is the reverse, a linear map from a velocity to a real number, so Hom(R^3, R).
2016-06-03T19:35:50Z Riastradh: Hence for a Hamiltonian H(q, p) [for obscure hysterical raisins, one usually writes positions as q and momenta as p], we have H: R^3 x Hom(R^3, R) ---> R, and hence a derivative of H at a point (q, p) has a different structure from the derivative of L at a point (x, v).
2016-06-03T19:36:53Z Riastradh: ecraven: Depends on what kind of financial math or statistics. For example, Gaussian processes appear in geostatistics and other fields of empirical inference (for which I prefer to avoid the needlessly reductive word `statistics').
2016-06-03T19:39:08Z Riastradh: To make it a little more concrete, consider first the pdf of a univariate Gaussian distribution -- the bog-standard normal curve: rho(x | mu, sigma) = (1/sqrt(2 pi sigma)) e^{-(x - mu)^2 / 2 sigma^2}.
2016-06-03T19:39:13Z ecraven: Riastradh: might I ask what you meant by co- and contravariance? I know these from typed arrays, but am not sure how they would apply in the context of Scheme?
2016-06-03T19:39:41Z Riastradh: Oh, co- and contra-variance are just fancy words for R^n vs Hom(R^n, R) (or the other way around, I can never keep them straight).
2016-06-03T19:40:00Z ecraven: so co- and contra-variance is the same context as up/down tuples?
2016-06-03T19:40:26Z Riastradh: More or less, yes.
2016-06-03T19:40:32Z ecraven: ok, thanks :)
2016-06-03T19:41:16Z Riastradh: Certain inference techniques, such as Hamiltonian Monte Carlo (OK, Hamiltonians got involved again, but I promise you this *isn't* about celestial mechanics), use the derivative of this density with respect to the parameters mu and sigma.
2016-06-03T19:42:34Z Riastradh: Specifically, they involve, for fixed x_0, the map f: (mu, sigma) |---> rho(x_0 | mu, sigma). For the univariate Gaussian, we have f: R x R ---> R.
2016-06-03T19:43:02Z Riastradh: Oops, I miswrote the density -- forgot a ^2 in the denominator: rho(x | mu, sigma) = (1/sqrt(2 pi sigma^2)) e^{-(x - mu)^2 / 2 sigma^2}.
2016-06-03T19:43:19Z ecraven: unfortunately, I hadn't noticed :-/
2016-06-03T19:43:26Z jcowan: Nor I.
2016-06-03T19:43:40Z jcowan: the absent-minded professor: writes a, says b, means c, but d is correct
2016-06-03T19:44:02Z ecraven should read so many more books on all kinds of topics ;-/
2016-06-03T19:44:04Z Riastradh: The derivative of this function f is easy enough, just like the derivative of (x, y) |---> x^2 + y^2 - 1 was.
2016-06-03T19:44:26Z Riastradh: But what if we now want to deal with a multivariate normal?
2016-06-03T19:45:05Z Riastradh: Now instead of a real number mean mu and a real number variance sigma^2, we have a real vector Mu \in R^n, and a real covariance *square matrix* Sigma.
2016-06-03T19:45:15Z jcowan: ecraven: Trust me, I *have* read so many more books on all kinds of topics (I'm 58 and have had the Speed-Reading Superpower since childhood) and right now it barely helps
2016-06-03T19:45:48Z Riastradh: So in this case, for fixed X_0, the density F: (Mu, Sigma) |---> rho(X_0 | Mu, Sigma) is now a map F: R^n x R^(n^2) ---> R.
2016-06-03T19:46:33Z ecraven: I kind of know all these things, but obviously not well enough to actually understand everything :-/
2016-06-03T19:46:57Z Riastradh: What do we do with the derivative of F? Yeesh! It's gotta be a linear map from an increment in Mu, Delta_Mu, and an increment in Sigma, Delta_Sigma, to an increment in density, delta_rho: Hom(R^n x R^(n^2), R).
2016-06-03T19:47:45Z groovy3shoes: discrete math ftw! :p
2016-06-03T19:47:48Z jcowan: One has to be very young indeed to understand everything.
2016-06-03T19:48:16Z groovy3shoes: you and your continuum can go hold your own party!
2016-06-03T19:48:38Z ecraven: Riastradh: may I summarise this (probably totally incorrectly) as: a general array library should maybe provide up tuples as the default (for "simple" applications), but also provide support for down tuples (for more "advanced" applications)?
2016-06-03T19:48:38Z groovy3shoes mumbles something about universal algebra and model theory
2016-06-03T19:49:01Z Riastradh: Although this space is isomorphic to R^(n + n^2), so that there exists some matrix representation of it if you were to assemble the increments in Mu and Sigma into a giant column vector, there's no *nice* matrix representation of this compound structure.
2016-06-03T19:50:05Z Riastradh: (Disclosure: I'm right across the street from Gerry Sussman as we speak, and I've recently been paid a nontrivial sum of money to write code dealing with Gaussian processes and their derivatives and empirical inference based on it. So don't feel bad if any of this is new or confusing to you! It was to me too not too long ago.)
2016-06-03T19:50:10Z groovy3shoes isn't worried because he already has a nice representation for terms :p
2016-06-03T19:51:03Z ecraven: though I'm a bit at a loss at understanding how to actually apply the up/down distinction without even providing e.g. a * procedure that works on arrays :-/
2016-06-03T19:51:17Z ecraven: Riastradh: in Scheme?
2016-06-03T19:51:20Z Riastradh: ecraven: `should' may be too strong a word there. Mostly I was just needling jcowan about things that would be nice to have for applications I run into, even though I don't actually have the time to participate in the process for real.
2016-06-03T19:51:52Z Riastradh: Not in Scheme, no. The GP stuff is mostly in Python and a probabilistic programming language called Venture.
2016-06-03T19:51:55Z groovy3shoes: Riastradh: what other libraries provide such functionality? numpy? BLAS?
2016-06-03T19:52:07Z Riastradh: groovy3shoes: None that I'm aware of.
2016-06-03T19:52:27Z ecraven: well, it certainly doesn't hurt to learn about things that would be good to have :)
2016-06-03T19:52:30Z groovy3shoes: it seems like such an obscure feature that it wouldn't even belong in large, imo
2016-06-03T19:52:32Z Riastradh: Wish I had up/down tuples in numpy when I wrote the gradient-of-multivariate-normal-density code.
2016-06-03T19:52:53Z Riastradh: numpy has a somewhat more conventional tensor contraction concept.
2016-06-03T19:53:11Z Riastradh: But that doesn't help for heterogeneous objects like R^3 x Hom(R^2, R^3)!
2016-06-03T19:54:13Z Riastradh: Of course, highly optimized Fortran code like numpy relies on doesn't like heterogeneity -- that's the antithesis of vectorization and hence anathema!
2016-06-03T19:56:17Z Riastradh: Up/down tuples are obscure partly because you don't *really* need them -- as I noted earlier, you always *can* find a matrix representation of any finite-dimensional real linear map you want. But such representations don't compose very well, except insofar as you have automated logic to compose them, which is exactly what up/down tuples are.
2016-06-03T19:56:47Z Riastradh: ecraven: As for the * operation: jcowan's proposal seems to include generic `inner product' and `outer product' operations which take as parameters the * and + operations you want to use.
2016-06-03T19:57:24Z Riastradh: ecraven: It could conceivably include a generic `up/down-combine' operation that does the same, provided someone invented a better name for it than `up/down-combine'.
2016-06-03T19:57:27Z jcowan: Right, so from my point of view it seems the up/down distinction allows you to overload the two, because it performs inner products on arguments of the same type and outer products on arguments of different types.
2016-06-03T19:59:01Z Riastradh: jcowan: Actually, it's the other way around. up*down and down*up are inner products; up*up and down*down are outer products.
2016-06-03T19:59:09Z Riastradh: And it works recursively on the structure.
2016-06-03T19:59:26Z jcowan: Right
2016-06-03T19:59:36Z jcowan: says a, means b, etc.
2016-06-03T20:00:44Z ecraven: Riastradh: Thanks for taking the time to explain all this in such detail!
2016-06-03T20:01:17Z groovy3shoes: interesting...
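A toy sketch of the recursive contraction rule Riastradh quotes, (* (down x y) (up a b)) = (+ (* x a) (* y b)), using tagged lists. This is not the scmutils API; up, down, tuple? and g* are names invented here, and the outer-product cases (up*up, down*down) are not handled.

;; Illustration only: contract a down tuple against an up tuple,
;; recursing when corresponding components are themselves tuples.
(define (up . xs)   (cons 'up xs))
(define (down . xs) (cons 'down xs))
(define (tuple? x)  (and (pair? x) (memq (car x) '(up down))))

(define (g* a b)
  (if (and (tuple? a) (tuple? b))
      (apply + (map g* (cdr a) (cdr b)))   ; recursive contraction
      (* a b)))                            ; scalar case

;; (g* (down 42 87 -4) (up 1 2 3))            => 204, the row*column example above
;; (g* (down (down 1 2) 3) (up (up 4 5) 6))   => 32, the rule applied to nested tuples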
2016-06-03T20:01:27Z groovy3shoes turns back to his studies of term rewriting systems...
2016-06-03T20:03:30Z jcowan: Riastradh: do you have views about the utility of arbitrary lower bounds?
2016-06-03T20:05:42Z pierpa: without lower bounds, I'll continue to use srfi-25. thank you very much.
2016-06-03T20:06:43Z lloda: arbitrary lower bounds are useless
2016-06-03T20:07:02Z jcowan: Sounds like we should have them.
2016-06-03T20:07:16Z jcowan: After all, R7RS-large is useless, R7RS is useless, and Scheme is useless.
2016-06-03T20:07:17Z lloda: the J people said that in APL it was more 'a feature guarded against than used'
2016-06-03T20:07:28Z lloda: the ability to change the lower bound I mean
2016-06-03T20:08:07Z jcowan: That's because it's a system parameter and not a property of individual arrays, which is what I am proposing.
2016-06-03T20:08:15Z jcowan: a la Fortran (and Algol 60)
2016-06-03T20:08:24Z jcowan bows to our Ancestors of the Naur Totem
2016-06-03T20:08:40Z lloda: Guile has them too, and they are a waste
2016-06-03T20:09:25Z lloda: e.g. (make-array 0 '(1 2) '(3 4))
2016-06-03T20:09:32Z lloda: #2@1@3((0 0) (0 0))
2016-06-03T20:09:46Z lloda: lower bound for first axis is 1, lower bound for second axis is 3
2016-06-03T20:11:36Z lloda: do you think they are useful? why?
2016-06-03T20:12:47Z ecraven: I guess the next logical step would be to have non-contiguous indices? :p
2016-06-03T20:13:14Z lloda: I can list a few problems with non-zero lower bounds
2016-06-03T20:13:19Z jcowan: I thought about that: it amounts to using non-integer strides.
2016-06-03T20:13:43Z jcowan: But still exact.
2016-06-03T20:19:29Z groovy3shoes: I can't possibly imagine a good use for arbitrary lower bounds
2016-06-03T20:20:33Z groovy3shoes: and they're rare enough to be excluded from the large language, imo, not to mention how trivial it is to write a wrapper procedure or macro that emulates them
2016-06-03T20:22:44Z pierpa: So, Ada is a small language
2016-06-03T20:25:46Z groovy3shoes: lol
2016-06-03T20:29:47Z jcowan: Well, array-transform allows you to specify an arbitrary affine transformation on an index, so they have to exist in some sense, even if you can't specify them at array creation time
2016-06-03T20:29:57Z jcowan: (called share-array in SRFI 25)
2016-06-03T20:30:37Z jcowan: Old VB supported them, VB.NET does not, for what that's worth (it's one of the conversion pain points)
2016-06-03T20:32:52Z Riastradh: Sorry, had to run for a bit.
2016-06-03T20:33:25Z Riastradh: jcowan: The only lower bound of interest to me is 0, inclusive.
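A sketch of the kind of wrapper groovy3shoes mentions: emulating an arbitrary lower bound on top of an ordinary zero-based container. One-dimensional only, using a plain Scheme vector; the bounded-vector names are invented here for illustration.

;; Store the lower bound alongside a zero-based vector and subtract
;; it on every access, so indices run from lo (inclusive) to hi (exclusive).
(define (make-bounded-vector lo hi fill)
  (cons lo (make-vector (- hi lo) fill)))

(define (bounded-vector-ref bv i)
  (vector-ref (cdr bv) (- i (car bv))))

(define (bounded-vector-set! bv i x)
  (vector-set! (cdr bv) (- i (car bv)) x))

;; (define bv (make-bounded-vector 1 4 0))   ; valid indices: 1, 2, 3
;; (bounded-vector-set! bv 3 'x)
;; (bounded-vector-ref bv 3)                 ; => x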
2016-06-03T22:05:01Z n_blownapart: hello, the addition operator on line 6 of this sicp problem is in the outer nest... is this a quick way to see if a program is linear recursion? thanks https://www.refheap.com/119814
2016-06-03T22:06:37Z n_blownapart: I have some other questions about this if anyone has the time and inclination...
2016-06-03T22:09:34Z pierpa: n_blownapart: no. reread that section.
2016-06-03T22:11:55Z n_blownapart: pierpa: thank you. still, the section near the bottom is describing, beginning with "More precisely", the golden ratio method. I was asking about the code at top.. unless I'm quite confused.
2016-06-03T22:14:21Z n_blownapart: So i.e. is this top method neither linear nor tail recursion, pierpa? I can't make it out from the text.
2016-06-03T22:14:27Z pierpa: you are asking about linear vs tree recursion?
2016-06-03T22:15:23Z n_blownapart: I have a better understanding of linear vs. tail, but in this example the text seems vague to me.
2016-06-03T22:15:56Z pierpa: what is the question?
2016-06-03T22:16:57Z n_blownapart: on line 6, if you see the operator in the left outer nest position, if that indicates linear, where you get ( +( +( + or similar ..?
2016-06-03T22:17:38Z pierpa: I mean, the question of the exercise. What they ask to do with that fib definition
2016-06-03T22:17:50Z n_blownapart: oh sorry hold on please
2016-06-03T22:18:34Z pierpa: the + operator in the left outer nest position can't indicate linear, since that function is not linear.
2016-06-03T22:20:06Z n_blownapart: It's an example, beginning page 44, search for fib: http://web.mit.edu/alexmv/6.037/sicp.pdf
2016-06-03T22:21:15Z n_blownapart: ok, I was thinking that there are quick telltale ways for a beginner to recognize various sorts of recursion. pierpa
2016-06-03T22:22:02Z pierpa: so, they give that example, and immediately after the code they say it's an example of tree recursion
2016-06-03T22:23:26Z pierpa: I think you can safely skip the explanation about the golden ratio, for now
2016-06-03T22:24:43Z n_blownapart: thanks, I thought tree recursion had elements of linear or both linear/tail perhaps.... but I have another question if you have time. this inefficient method finds the array index of the fibonacci number, not the subsequent fibonacci number itself. F(+n 1) finds the subsequent index. Why, in a pedagogical sense, would they teach this first? (skip the golden ratio? hold on to that please)
2016-06-03T22:25:34Z pierpa: where are you seeing arrays? There are no arrays in sicp, ISTR?
2016-06-03T22:25:41Z n_blownapart: I spent hours trying to see what this program did, writing out the solutions by hand.
2016-06-03T22:25:48Z n_blownapart: Arrays..?...
2016-06-03T22:26:08Z pierpa: you mentioned arrays
2016-06-03T22:26:11Z n_blownapart: I meant the index as in a, say, Ruby array, since I studied a bit of Ruby.
2016-06-03T22:26:50Z n_blownapart: forget that, I just meant the index number 0 1 2 3 4
2016-06-03T22:28:05Z n_blownapart: after struggling, I realized they were computing the index, not the subsequent number. that was maddening and also a fascinating realization for me.
2016-06-03T22:28:16Z pierpa: what index
2016-06-03T22:28:49Z pierpa: that is a function which, given n, computes the nth fibonacci number
2016-06-03T22:29:17Z pierpa: and is what is universally intended as the fib function
2016-06-03T22:29:20Z n_blownapart: that program returns (fib 5) => 5, which is 5's index, not 8's index.
2016-06-03T22:29:30Z pierpa: are you sure?
2016-06-03T22:31:00Z n_blownapart: yeah, it drove me crazy. (fib 8) returns 21, not 13. pierpa
2016-06-03T22:32:04Z pierpa: hmmm
2016-06-03T22:32:42Z pierpa: why are you expecting 13
2016-06-03T22:33:04Z n_blownapart: I was expecting 13, the subsequent number. The interesting thing is the leaf formation count of the tree for (fib 5): if you count both the '1' outputs and '0' outputs, it amounts to the subsequent fib number, 8, which is (fib 5) => 8
2016-06-03T22:33:43Z pierpa: sorry, English is not my mother tongue. I don't understand what is a subsequent number
2016-06-03T22:34:17Z n_blownapart: A redundant recursion, (fib 3) * 2, brings the tally up to 8 "leaves"
2016-06-03T22:34:32Z pierpa: uh?
2016-06-03T22:35:15Z n_blownapart: if you count just the '1' leaves, you get 5, which is the actual output, or the next index place. see diagram 1.5
2016-06-03T22:36:36Z n_blownapart: 1+1+1+1+1+0+0+0 = 5, or 8 leaves.
2016-06-03T22:36:55Z pierpa: so?
2016-06-03T22:38:03Z n_blownapart: I was wondering if that is the lesson in that problem, showing that both the index and the next fib number can be determined by looking at the leaves.
2016-06-03T22:38:36Z n_blownapart: sorry if that is inaccurate, though.
2016-06-03T22:38:43Z pierpa: the number of leaves is a measure of the computational effort that that algorithm requires
2016-06-03T22:39:09Z pierpa: I see no other (obvious) meaning in that number
2016-06-03T22:40:29Z n_blownapart: really, hmm. I tried some other small numbers and got the same result: that the leaves indicated both the index and the subsequent fib num. thanks kindly pierpa, will work on it..
2016-06-03T22:41:55Z pierpa: you mean, that the number of leaves in (fib n) is the value of (fib (+ n 1))?
2016-06-03T22:45:11Z pierpa: ok. it's true.
2016-06-03T22:45:39Z n_blownapart: yeah, is it true? ...I was having a hard time..
2016-06-03T22:45:46Z n_blownapart: because
2016-06-03T22:45:47Z pierpa: it is
2016-06-03T22:47:52Z n_blownapart: while using 'trace' in DrRacket I was trying to cheat and not write out the results (all leaves) by hand. the initial (fib 5) I did by hand and couldn't believe that the program was returning the index alone. obviously my math needs salvaging.
2016-06-03T22:49:27Z pierpa: you could modify the function in such a way that it returns both the fibonacci number and the number of leaves it traverses
2016-06-03T22:49:55Z n_blownapart: but the writers are brilliant because I think they wanted me to go that route to see one way of doing it. yeah, would love to see that if you have time to paste it.
2016-06-03T22:52:15Z n_blownapart: pierpa: also, any pointers on using analysis methods in the IDE such as trace or display in a way where I could follow that output. I can use them to a degree...
2016-06-03T22:52:31Z n_blownapart: or others
2016-06-03T22:53:31Z pierpa: https://bpaste.net/show/da23a5c981aa
2016-06-03T22:54:07Z n_blownapart: excellent, thanks kindly pierpa ... I tend to
2016-06-03T22:54:11Z pierpa: I don't use the Racket debugger much. Prefer simpler methods.
2016-06-03T22:54:35Z n_blownapart: pierpa: please explain simpler ways..
2016-06-03T22:56:39Z pierpa: I instrument the code in such a way as to collect the information I'm interested in. For example, if I was interested in counting the leaves visited by that fib function, I'd instrument the code as I just did.
2016-06-03T22:58:06Z n_blownapart: ok, thanks a lot, I need to study that paste. still, as well, I don't see why a function "To compute (fib 5)" would be written to return the index, and not 8, without explicitly telling us that it is written to return the index. see my beef?
2016-06-03T22:58:49Z pierpa: because (fib 5) is 5
2016-06-03T22:59:02Z pierpa: while (fib 6) is 8
2016-06-03T22:59:57Z n_blownapart: right, and the input is an index, not a fibonacci number, hence my confusion. excellent book and very kind of you to help me. thanks
2016-06-03T23:01:13Z n_blownapart: since the 6 in (fib 6) is not a fibonacci number at all, I was thinking it was some sort of rational number or something. late night madness...
2016-06-03T23:02:14Z pierpa: yes, the input is the index of the fib number it computes
2016-06-03T23:02:52Z n_blownapart: everyone says how cool chapter 4 is. can't wait! more drama later, pax
2016-06-03T23:03:45Z pierpa: :)
2016-06-03T23:49:11Z groovy3shoes: ¬_¬
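pierpa's bpaste link above is no longer available; the following is only a guess at the kind of instrumentation he describes, a tree-recursive fib that also returns how many leaves it visits, which makes the observation above (the leaf count of (fib n) equals (fib (+ n 1))) easy to check. The name fib/leaves is invented here.

;; Not pierpa's actual paste -- just a sketch of the instrumentation
;; he describes: return the Fibonacci number together with the number
;; of leaves the tree recursion visits.
(define (fib/leaves n)                  ; => (fib-number . leaf-count)
  (if (< n 2)
      (cons n 1)                        ; the base cases are the leaves
      (let ((a (fib/leaves (- n 1)))
            (b (fib/leaves (- n 2))))
        (cons (+ (car a) (car b))       ; Fibonacci value
              (+ (cdr a) (cdr b))))))   ; total leaves visited

;; (fib/leaves 5)  =>  (5 . 8)
;; i.e. the leaf count of (fib n) is (fib (+ n 1)), as discussed above.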