"Tweet id","Tweet permalink","Tweet text","time","impressions","engagements","engagement rate","retweets","replies","likes","user profile clicks","url clicks","hashtag clicks","detail expands","permalink clicks","app opens","app installs","follows","email tweet","dial phone","media views","media engagements","promoted impressions","promoted engagements","promoted engagement rate","promoted retweets","promoted replies","promoted likes","promoted user profile clicks","promoted url clicks","promoted hashtag clicks","promoted detail expands","promoted permalink clicks","promoted app opens","promoted app installs","promoted follows","promoted email tweet","promoted dial phone","promoted media views","promoted media engagements" "1101255878278299648","https://twitter.com/gwern/status/1101255878278299648","@Q__Tweet @welcomet0nature The zookeepers might've put catnip extract on it. Todd 1962 tested a whole bunch of feline species, and 4/4 of the snow leopards responded to catnip.","2019-02-28 23:00 +0000","175.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1101232552554217472","https://twitter.com/gwern/status/1101232552554217472","@nakamotostudies Where did you get this supposed original e-cash.pdf? 
You guys know there are fakes floating around which were based on my page, right?","2019-02-28 21:27 +0000","175.0","1.0","0.005714285714285714","0.0","0.0","0.0","1.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1101218352326983680","https://twitter.com/gwern/status/1101218352326983680","4chan's /ic/ is also trying their hand at art based on the TWDNE faces: https://t.co/bZ5mFSozfL","2019-02-28 20:31 +0000","6763.0","101.0","0.014934200798462222","0.0","0.0","5.0","2.0","77.0","0.0","17.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1101218152057327616","https://twitter.com/gwern/status/1101218152057327616","Added 30k new faces from latest model generated with psi=0.6. Also adding 30k text snippets generated 2-step-wise (GPT-2-anime https://t.co/Zp3cQWB2b7 fed into GPT-2-small as a long prompt). Phew!","2019-02-28 20:30 +0000","7532.0","29.0","0.0038502389803505045","0.0","0.0","5.0","0.0","16.0","0.0","8.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1101202001407500288","https://twitter.com/gwern/status/1101202001407500288","@ESYudkowsky I dunno, I've seen some pretty freaky sentences coming out of GPT-2-small... With enough selection (and better sampling like beam search), much is possible. And puns can be easily generated just with word embeddings: https://t.co/Xml394SeAr https://t.co/eWk9FApGWV","2019-02-28 19:26 +0000","2212.0","54.0","0.024412296564195298","1.0","1.0","1.0","0.0","47.0","0.0","4.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1101196260965564416","https://twitter.com/gwern/status/1101196260965564416","@justheretostay1 @whyvert If you start young, IQ-test-only is going to miss a lot of kids due to low reliability. 
And he doesn't say one *shouldn't* take into account IQ test scores or past grades.","2019-02-28 19:03 +0000","289.0","5.0","0.01730103806228374","0.0","0.0","2.0","0.0","0.0","0.0","3.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1101195653613568001","https://twitter.com/gwern/status/1101195653613568001","@JanelleCShane They might not be cats at all. Unless heroic efforts are made, some non-cats will slip into LSUN Cats's 2m or whatever images, and AFAIK, none was made. (Even ImageNet & MNIST have mistakes.)","2019-02-28 19:01 +0000","1142.0","4.0","0.0035026269702276708","0.0","0.0","4.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1101168421729980417","https://twitter.com/gwern/status/1101168421729980417","@lordcataplanga @Macronicus Just use some procedural thing like n-bodies. No need for a GAN and you don't want mere 2D images anyway since they don't give you a 3D model you can manipulate or fly through etc.","2019-02-28 17:13 +0000","208.0","3.0","0.014423076923076924","0.0","1.0","1.0","0.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1101162226851946496","https://twitter.com/gwern/status/1101162226851946496","@Macronicus No. Who cares about galaxies? Most people have no idea what they look like and couldn't be impressed.","2019-02-28 16:48 +0000","212.0","9.0","0.04245283018867924","0.0","1.0","0.0","2.0","0.0","0.0","6.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1101162005912870912","https://twitter.com/gwern/status/1101162005912870912","@sunkworld @GelsameI @misaki_cradle At some point, can't get blood from a stone. 
Need to obtain more data by hook or crook, or start experimenting with architectures specifically designed to work with small n, like few-shot learning GANs.","2019-02-28 16:47 +0000","1804.0","1.0","5.543237250554324E-4","0.0","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1101129768752291840","https://twitter.com/gwern/status/1101129768752291840","@GelsameI @sunkworld @misaki_cradle I assume the usual way: Sunk generated a .tfrecords from the Misaki picture collection at 512px, and simply resumed training with the new dataset.","2019-02-28 14:39 +0000","1859.0","2.0","0.0010758472296933835","0.0","1.0","0.0","1.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1101128814040662017","https://twitter.com/gwern/status/1101128814040662017","""In a quiet nursing home, not all is as it seems—at night, ruthless and illegal foot-races are held, where old men dash down corridors & around corners to reach the bathroom in time!—""M-m-multi-toilet dripping!""—Tune in next week for the world-premiere of... ??????? ?!""","2019-02-28 14:35 +0000","6003.0","22.0","0.003664834249541896","0.0","1.0","4.0","7.0","0.0","0.0","10.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100967883172122624","https://twitter.com/gwern/status/1100967883172122624","@sunkworld @misaki_cradle Nice. The extra margin seems to help. But what is this Misaki dataset? I'm not familiar with it.","2019-02-28 03:56 +0000","2820.0","18.0","0.006382978723404255","0.0","1.0","4.0","6.0","0.0","0.0","7.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100942035186601985","https://twitter.com/gwern/status/1100942035186601985","@makoConstruct The City of Letters is a lonely place. 
Often rainy (which is not good for the covers). A tough place to be. But I know that the truth is out there. And the truth... (•_•) ( •_•)>⌐■-■ (⌐■_■) ...will set you free.","2019-02-28 02:13 +0000","202.0","1.0","0.0049504950495049506","0.0","0.0","0.0","0.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100940835296854016","https://twitter.com/gwern/status/1100940835296854016","@robinhanson Looking at the Nat Geog ref, that overstates the speculation in it. There's a single group of early baboons (as well as several other species) which have more trauma than baboon skeletons excavated later. And since they're buried as a group same place/time, it's effectively n=1.","2019-02-28 02:08 +0000","2378.0","25.0","0.010513036164844407","0.0","1.0","11.0","2.0","0.0","0.0","11.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100932221811245057","https://twitter.com/gwern/status/1100932221811245057","@Go321D @JayMan471 @V4Analysis You're looking for the unique somatic mutations from the father and thus couldn't've been inherited from his twin. But what if a particular mutation doesn't get passed on because that region came from the mother? And I don't see why not. GenPred has the tech all set up already.","2019-02-28 01:34 +0000","328.0","3.0","0.009146341463414634","0.0","1.0","2.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100920571200712706","https://twitter.com/gwern/status/1100920571200712706","@djahandarie We tried enlarging the quote marks, but while Obormot liked it, I thought it looked weird especially if you had a lot of quotes in close conjunction, so we settled for moving them outside and making them somewhat gray. 
Current demos on /Bakewell & /TWDNE","2019-02-28 00:48 +0000","227.0","2.0","0.00881057268722467","0.0","0.0","1.0","1.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100918442809528321","https://twitter.com/gwern/status/1100918442809528321","@sonyaellenmann Well sure, but that assumes they are (a) still training it and (b) will update the images.","2019-02-28 00:39 +0000","827.0","9.0","0.010882708585247884","0.0","0.0","2.0","1.0","0.0","0.0","6.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100880533972758529","https://twitter.com/gwern/status/1100880533972758529","@TomChivers @ESYudkowsky","2019-02-27 22:09 +0000","2884.0","14.0","0.0048543689320388345","0.0","1.0","0.0","9.0","0.0","0.0","2.0","2.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100879787235647489","https://twitter.com/gwern/status/1100879787235647489","@backus @ctbeiser I did get some attention at the time. WMF people like Gardner read it. It... just didn't do any good, in the end.","2019-02-27 22:06 +0000","1011.0","28.0","0.027695351137487636","0.0","0.0","8.0","12.0","0.0","0.0","7.0","0.0","0","0","1","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100873113515053057","https://twitter.com/gwern/status/1100873113515053057","Another cat one: why do they find earwax so interesting to smell/lick? Of like 7 or 8 cats I've tried, all were fascinated by my hearing aids or their own earwax. Many cats seem to have deep ear itches they can't get at themselves. 
(One turned out to have undiagnosed ear mites.)","2019-02-27 21:39 +0000","6324.0","29.0","0.004585705249841872","1.0","0.0","6.0","4.0","0.0","0.0","18.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100870492217577472","https://twitter.com/gwern/status/1100870492217577472","@sonyaellenmann (They're not very good, though. They don't even look converged, quite aside from the low res.)","2019-02-27 21:29 +0000","1097.0","11.0","0.010027347310847767","0.0","0.0","4.0","0.0","0.0","0.0","7.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100870225917022225","https://twitter.com/gwern/status/1100870225917022225","@jeremyphoward @tunguz @OpenAI @GuggerSylvain That seems a lot less impressive, then. Part of the amazing thing about even GPT-2-small is how it'll generate everything from Wikipedia articles to wire reports to Japanese song lyrics (quite a lot of those pop up on TWDNE's GPT-2-small samples, incidentally).","2019-02-27 21:28 +0000","3955.0","13.0","0.0032869785082174463","0.0","1.0","1.0","2.0","0.0","0.0","9.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100849555065716736","https://twitter.com/gwern/status/1100849555065716736","@DavidSHolz @ctbeiser Oh, that's already the case. The big knowledge graphs often draw heavily on WikiData. 
WP itself is a major source: play around with OA's new GPT-2-small model and you'll see a *lot* of Wikipedia has been learned by it.","2019-02-27 20:06 +0000","466.0","4.0","0.008583690987124463","0.0","1.0","3.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100842920440225795","https://twitter.com/gwern/status/1100842920440225795","@DavidSHolz @ctbeiser My impression is that in absolute terms, things have been stagnant, but considered in terms of 'online Anglophone per capita', say, things have gotten much worse. Many articles seem trapped in amber from 2009 - good luck learning anything about the current golden age of genetics!","2019-02-27 19:39 +0000","2657.0","40.0","0.015054572826496047","1.0","2.0","8.0","8.0","0.0","0.0","20.0","1.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100815114729603074","https://twitter.com/gwern/status/1100815114729603074","@eugenewei site issue I and another noticed while reading https://t.co/8FQwi1rYlV - can you disable the odd 'Escape key tries to log in' thing? Keeps screwing me up every time I C-f for something.","2019-02-27 17:49 +0000","648.0","10.0","0.015432098765432098","0.0","1.0","1.0","1.0","7.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100771769760268289","https://twitter.com/gwern/status/1100771769760268289","@apdemetriou @SteveStuWill Given global obesity rates, I'd say it's more like an anti-Flynn effect.","2019-02-27 14:56 +0000","8841.0","11.0","0.0012442031444406742","0.0","0.0","3.0","4.0","0.0","0.0","4.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100769383025176577","https://twitter.com/gwern/status/1100769383025176577","@JayMan471 @Go321D @V4Analysis Might be harder, though. 
Half as much to work with, and forensic traces seem like they might incorporate cells from multiple kinds of tissues, leading to a bigger 'fingerprint' of somatic mutations?","2019-02-27 14:47 +0000","434.0","2.0","0.004608294930875576","0.0","1.0","1.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100595908318842882","https://twitter.com/gwern/status/1100595908318842882","@mannerminded @crschmidt But the causal influence is split. I was checking Github daily for StyleGAN and started training my face StyleGAN a week before 'This Person' was put up, as part of my 3-year-long effort testing GANs for generating anime. TP merely inspired me to make TWDNE showing off my samples","2019-02-27 03:18 +0000","323.0","1.0","0.0030959752321981426","0.0","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100527312993427456","https://twitter.com/gwern/status/1100527312993427456","@al_pal_22 The double-lining is just for the abstract. If gray is removed from all blockquotes, that would be very confusing, I think.","2019-02-26 22:45 +0000","489.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100526185149329411","https://twitter.com/gwern/status/1100526185149329411","@SPC_Bitcoin @KimDotcom Aw, cheer up. 
I've already uploaded another 1200 waifus made with my latest StyleGAN, and GPT-2-anime+GPT-2-small text snippets.","2019-02-26 22:41 +0000","356.0","2.0","0.0056179775280898875","0.0","0.0","1.0","0.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100520109066588167","https://twitter.com/gwern/status/1100520109066588167","12MB of GPT-2-anime plot-synopsis unconditional samples which I'm feeding into GPT-2-small for additional TWDNE text samples: https://t.co/DEnRLrlRyw","2019-02-26 22:16 +0000","6623.0","39.0","0.00588857013438019","1.0","0.0","1.0","3.0","29.0","0.0","5.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100514077644853248","https://twitter.com/gwern/status/1100514077644853248","GPT-2-anime model finetuned on Kaggle anime plot synopses (https://t.co/hVjy0IEQDh): https://t.co/SxDDSEHz63 If you're using the OA codebase, unpack into the 117M subdir, then you can sample conditionally/unconditionally as before. Note: short synopses means prompting is hard.","2019-02-26 21:52 +0000","7955.0","59.0","0.007416719044626021","1.0","1.0","7.0","1.0","39.0","0.0","10.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100509956082749441","https://twitter.com/gwern/status/1100509956082749441","@realmforge @jakkdl Stealing secrets from another process is a problem for everything security-related, not just cryptocurrencies or even cryptography-related at all.","2019-02-26 21:36 +0000","235.0","6.0","0.02553191489361702","0.0","1.0","1.0","0.0","0.0","0.0","4.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100509472299143174","https://twitter.com/gwern/status/1100509472299143174","@djahandarie Part of the problem is how indiscriminate it is. 
My screenshot actually demonstrates this: there's one real quote from Cobb, but everything else is a proverb, paraphrase, or obsolete technical term. Cobb should be highlighted, but does everything else need to be? Probably not.","2019-02-26 21:34 +0000","335.0","2.0","0.005970149253731343","0.0","1.0","0.0","1.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100504199794737154","https://twitter.com/gwern/status/1100504199794737154","Thanks for voting. After thinking about the problems it's caused, I'm simplifying by disabling <q> by default, removing the JS, and I'll be doing highlighting on specific quotes with manual <q></q>s in the future.","2019-02-26 21:13 +0000","6926.0","12.0","0.0017326017903551833","0.0","1.0","3.0","0.0","0.0","0.0","8.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100453467716534286","https://twitter.com/gwern/status/1100453467716534286","_An investigation into the relation between intelligence and inheritance_, Lawrence 1931: https://t.co/zTIlPzmxjG Had to jailbreak this because OMG how do you invest *this* much effort into https://t.co/IVrkUwrOTb https://t.co/HFaxGHMdAn only to be *far worse* than a PDF URL?","2019-02-26 17:52 +0000","6729.0","55.0","0.008173577054540051","1.0","1.0","0.0","1.0","42.0","0.0","10.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100430472289665024","https://twitter.com/gwern/status/1100430472289665024","@whyvert @AmirSariaslan Still a lot better than most studies in this vein.","2019-02-26 16:20 +0000","1624.0","8.0","0.0049261083743842365","0.0","1.0","1.0","4.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100407999657472002","https://twitter.com/gwern/status/1100407999657472002","@whyvert Unfortunately, they don't do a 
sibling comparison although they have the sibling data, so family factors are only partially controlled; and look at how the adjustments they do do in Figure 1 shrink the correlates substantially.","2019-02-26 14:51 +0000","2524.0","13.0","0.005150554675118859","1.0","1.0","8.0","0.0","0.0","0.0","3.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100241388518572032","https://twitter.com/gwern/status/1100241388518572032","@crschmidt @ddukes @therealfitz Minor correction: it's not CC-by-NC because that's just the StyleGAN source code & pretrained model license, which does not apply to models or outputs thereof trained from scratch like my face model, any more than Microsoft owns everything you write in MS Word. It'd be PD or CC-0","2019-02-26 03:49 +0000","533.0","6.0","0.01125703564727955","0.0","1.0","0.0","4.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100233233906708481","https://twitter.com/gwern/status/1100233233906708481","@crschmidt @ddukes @therealfitz (Nobody tell them about the anime faces, or the videos.)","2019-02-26 03:16 +0000","278.0","2.0","0.007194244604316547","0.0","0.0","1.0","1.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100213804800069633","https://twitter.com/gwern/status/1100213804800069633","_Great Mambo Chicken and the Transhuman Condition: Science Slightly over the Edge_, Regis 1990: https://t.co/upn0BQp9BR https://t.co/r9u9A8dgTt https://t.co/nWpvejsuco https://t.co/Sqhio5c4sF @ESYudkowsky","2019-02-26 01:59 +0000","6299.0","7.0","0.001111287505953326","0.0","0.0","1.0","0.0","4.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100202421412941824","https://twitter.com/gwern/status/1100202421412941824","@_sharpobject I have no idea what 
that is, but as long as the actual site is working for you, then I guess it's fine. (Not like it's a high-security site that really needs HTTPS, anyway.)","2019-02-26 01:14 +0000","465.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100201912601903106","https://twitter.com/gwern/status/1100201912601903106","@_sharpobject I shouldn't be serving anything over HTTP on TWDNE; the hrefs are all protocol-agnostic (not a single 'http://' in index.html), and I thought I set CloudFlare to redirect HTTP->HTTPS.","2019-02-26 01:12 +0000","489.0","3.0","0.006134969325153374","0.0","1.0","0.0","0.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100201222374604801","https://twitter.com/gwern/status/1100201222374604801","@WiringTheBrain But you don't. What fraction of the London population, even, are licensed London cabbies with The Knowledge? <0.01%? Trivially accounted for by the non-shared environment variance component.","2019-02-26 01:09 +0000","243.0","3.0","0.012345679012345678","0.0","0.0","0.0","1.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100195337577394176","https://twitter.com/gwern/status/1100195337577394176","@eukaryote314 Loss curves would be more informative, but there's something weird about your timings. At 512px, I only need 86 sec/kimg and a tick is only 258 sec/tick. I have 2 GPUs, sure, so I should be a bit faster, but why are your ticks so much longer than your kimgs?","2019-02-26 00:46 +0000","304.0","2.0","0.006578947368421052","0.0","1.0","0.0","1.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100188190416297984","https://twitter.com/gwern/status/1100188190416297984","@eukaryote314 Hm. 
What effective resolution is it at? Are you using the latest repo?","2019-02-26 00:17 +0000","267.0","1.0","0.003745318352059925","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100177774256537600","https://twitter.com/gwern/status/1100177774256537600","@eukaryote314 This is the same model you screenshotted before? It looks far worse. Did it diverge? How do the loss curves look? You may need to reset to an earlier snapshot and balance G/D more (relative learning rate is a good one to tweak).","2019-02-25 23:36 +0000","286.0","1.0","0.0034965034965034965","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100166469747384322","https://twitter.com/gwern/status/1100166469747384322","(What if Cthulhu was actually the hero all along because Cthulhu is a reincarnation of Artoria—the sleeping king myth!—and sinking Ryloth was the only thing that could stop the Grail???)","2019-02-25 22:51 +0000","9846.0","21.0","0.0021328458257160268","2.0","0.0","7.0","2.0","0.0","0.0","10.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100153922159808512","https://twitter.com/gwern/status/1100153922159808512","@RojasGorky I use Chromium for most of my web dev to avoid messing with my Firefox session. 
Chrome/Chromium doesn't support decent justification, believe it or not (see my other tweets about that), so yeah, left/ragged.","2019-02-25 22:01 +0000","395.0","2.0","0.005063291139240506","0.0","0.0","0.0","1.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100153542529163264","https://twitter.com/gwern/status/1100153542529163264","I love the _Gunslinger Girl_ sample so much: https://t.co/APZotjBHSM","2019-02-25 22:00 +0000","12840.0","232.0","0.01806853582554517","1.0","2.0","10.0","1.0","203.0","0.0","15.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100144635526201344","https://twitter.com/gwern/status/1100144635526201344","@robinhanson Hestenes's work on physics students retaining Aristotelean 'folk physics' intuitions even when they can crank through the correct Newtonian calculations comes to mind: https://t.co/J7T1O2OKbp https://t.co/OAXuzNWq2i https://t.co/Z3W8Hx9pns","2019-02-25 21:24 +0000","1902.0","47.0","0.024710830704521555","0.0","0.0","6.0","1.0","37.0","0.0","3.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100140776078696451","https://twitter.com/gwern/status/1100140776078696451","@cowtung Don't let Matthew Butterick catch you phrasing double-spacing! (But also a moot point in HTML, AFAIK, since you have to go out of your way to stop double-spaces simply turning into single spaces due to HTML space-insensitivity.)","2019-02-25 21:09 +0000","420.0","3.0","0.007142857142857143","0.0","0.0","2.0","1.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100140433169096706","https://twitter.com/gwern/status/1100140433169096706","@apas After experimenting with JS-injection of smallcaps for H1/H2/H1+H2 words, doesn't look like it quite works with https://t.co/LC5JQL86wv. 
I think I'm going to try rolling it out for the introductory word or phrase on articles here on out, though.","2019-02-25 21:08 +0000","426.0","3.0","0.007042253521126761","0.0","0.0","2.0","0.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100136090361610242","https://twitter.com/gwern/status/1100136090361610242","@apas My use of footnotes is a little pathological, and not like the usual sidenotes examples of a footnote once in a while with a sentence or two (mine are more like 10 in a paragraph and each one is a mini-essay with blockquotes), so not sure it'd work as-is.","2019-02-25 20:50 +0000","491.0","4.0","0.008146639511201629","0.0","2.0","1.0","0.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100120077918240768","https://twitter.com/gwern/status/1100120077918240768","@apas Yes, I like the initial capital letter thing, but apparently it's too much in conjunction with current indents/headers. One future https://t.co/LC5JQL86wv feature I'm excited about is using JS to implement Tufte-style 'sidenotes' on sufficiently wide browsers. (Typo: 'mem ories')","2019-02-25 19:47 +0000","773.0","10.0","0.0129366106080207","0.0","2.0","2.0","0.0","3.0","0.0","3.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100117915871989760","https://twitter.com/gwern/status/1100117915871989760","@apas Yeah, you don't see it much at all, much less as this use-case, because it has a lot of puzzling drawbacks. Like, they don't copy-paste! 
The """"s in anything you copy-paste from https://t.co/LC5JQL86wv are there because some JS manually adds them.","2019-02-25 19:38 +0000","695.0","3.0","0.004316546762589928","0.0","1.0","1.0","1.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100112122653143047","https://twitter.com/gwern/status/1100112122653143047","@odomojuli Hm, too bad one has to whitelist it repeatedly in NoScript & uBlock to get anything to load.","2019-02-25 19:15 +0000","396.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100110010078687232","https://twitter.com/gwern/status/1100110010078687232","@sunkworld @roadrunning01 We're not sure. The wrinkles pop up much quicker on small datasets than large, and on my main face StyleGAN, lowering the LR & balancing G/D better (by decreasing D LR much more than G) *seems* to reduce wrinkling.","2019-02-25 19:07 +0000","270.0","2.0","0.007407407407407408","0.0","1.0","1.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100109286452150273","https://twitter.com/gwern/status/1100109286452150273","Screenshot for those not sure what I'm talking about (example from https://t.co/PoJtWfHqfX ): https://t.co/LY56NITIBs","2019-02-25 19:04 +0000","11310.0","313.0","0.027674624226348365","0.0","2.0","3.0","0.0","45.0","0.0","13.0","0.0","0","0","0","0","0","250","250","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100109285483266050","https://twitter.com/gwern/status/1100109285483266050","Site design poll: one of the more unusual (and fragile, and complicated) features is using '<q>' tags to syntax-highlight full quotes (double-quotes), similar to standard source-code highlighting. 
But does anyone besides me actually like it?","2019-02-25 19:04 +0000","8104.0","50.0","0.006169792694965449","1.0","3.0","3.0","3.0","0.0","0.0","9.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100068954154979328","https://twitter.com/gwern/status/1100068954154979328","@GelsameI Yeah. There's two blue-green background gaps which should be grey-white, which is a tell-tale but otherwise, who would guess this was machine-generated?","2019-02-25 16:24 +0000","166.0","1.0","0.006024096385542169","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100064708223217665","https://twitter.com/gwern/status/1100064708223217665","@technillogue Dunno. Maybe look around on Quantified Self sites.","2019-02-25 16:07 +0000","133.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100059858299174913","https://twitter.com/gwern/status/1100059858299174913","@technillogue Sorry, but I really shouldn't take on any more projects at this point.","2019-02-25 15:48 +0000","149.0","1.0","0.006711409395973154","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100051130380414977","https://twitter.com/gwern/status/1100051130380414977","@jonathanfly No. 
I was hoping to have the training code packaged, and add a new batch of snippets to TWDNE first (using GPT-2-anime short synopses as a prompt to GPT-2-small, which works well) to release as a whole.","2019-02-25 15:13 +0000","833.0","4.0","0.004801920768307323","0.0","1.0","3.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100045024643948544","https://twitter.com/gwern/status/1100045024643948544","@halcy You could try using a more powerful, more nonlinear classifier to allow more complicated movements in the latent space? Like an SVM or random forests classifier instead of a linear logistic.","2019-02-25 14:49 +0000","299.0","1.0","0.0033444816053511705","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100044844355989505","https://twitter.com/gwern/status/1100044844355989505","@halcy It wouldn't surprise me if they really do covary like that in the original data. Character designers & artists take eye/hair color into consideration as a major feature. eg the classic 'blue eye/red-blonde hair' vs 'blue hair/red-eye' dichotomy.","2019-02-25 14:48 +0000","402.0","1.0","0.0024875621890547263","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100043873429217280","https://twitter.com/gwern/status/1100043873429217280","@__vba We can't stop here. All this was set in motion long before you or I were born.","2019-02-25 14:44 +0000","1754.0","23.0","0.013112884834663626","0.0","1.0","9.0","5.0","0.0","0.0","6.0","2.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100043482767507457","https://twitter.com/gwern/status/1100043482767507457","@GelsameI It's almost watercolor-esque isn't it? As I keep saying, the range of styles is amazing. 
It's certainly not limited to your classic flat cel-like shading.","2019-02-25 14:42 +0000","193.0","2.0","0.010362694300518135","0.0","1.0","1.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100043013357735937","https://twitter.com/gwern/status/1100043013357735937","@halcy Oh, nice. So you just used DeepDanbooru on a few hundred images to train the latent classifier?","2019-02-25 14:41 +0000","389.0","1.0","0.002570694087403599","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100042831845122049","https://twitter.com/gwern/status/1100042831845122049","@pretendsmarts @crschmidt You literally switch one config field to point to a dataset and just keep training the model! It'll figure it out.","2019-02-25 14:40 +0000","240.0","3.0","0.0125","0.0","0.0","1.0","2.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1100041381412827137","https://twitter.com/gwern/status/1100041381412827137","""It is so sad to say that this manga has never been seen by any anime fans in the real world...Please make anime movies about me. Please make anime about me. Please make anime about your beautiful cat. Please make anime movies about me. Please make anime about your cute cat"" #283 https://t.co/lJ6QGIJYEe","2019-02-25 14:34 +0000","21943.0","864.0","0.03937474365401267","17.0","3.0","63.0","23.0","164.0","0.0","123.0","2.0","0","0","0","0","0","469","469","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099881045678321664","https://twitter.com/gwern/status/1099881045678321664","So we trained GPT-2-anime on plot synopses from MAL/Kaggle and, er, it does too good a job of learning the format & ignores prompts, simply spitting out synopses. But! 
We *can* feed synopses into the original GPT-2-small as a prompt! https://t.co/qd6dVeV8kS Quite coherent.","2019-02-25 03:57 +0000","30061.0","296.0","0.009846645154851801","6.0","2.0","18.0","20.0","228.0","0.0","20.0","2.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099827210125496325","https://twitter.com/gwern/status/1099827210125496325","@DefaultMCYoutub ""The thing I fear is that these kind of generators could produce enough high quality images that they could flood boorus if unchecked and overshadowing real artists who put time into their art."" I'd say that's a little overly-optimistic, but, well...","2019-02-25 00:23 +0000","672.0","4.0","0.005952380952380952","0.0","1.0","1.0","0.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099812706109059073","https://twitter.com/gwern/status/1099812706109059073","@fengchuiyulin I assume so. Haven't seen nshepperd's code yet, but at least after the first few hours of training, it's emitting amusing anime plot summaries without the need for a prompt, so finetuning training seems to be working.","2019-02-24 23:25 +0000","848.0","4.0","0.0047169811320754715","0.0","0.0","3.0","0.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099795586402930689","https://twitter.com/gwern/status/1099795586402930689","@gokstudio Without any spatial structure, that doesn't make any sense. Imagine starting with a seed like an eye in the corner of the image, if it's really spatial-structure-less like a bagnet. 
And how does that get you consistent interpolations or editing or object-modules?","2019-02-24 22:17 +0000","166.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099790195971538948","https://twitter.com/gwern/status/1099790195971538948","@gokstudio I have difficulty seeing how things like the photo editors, GAN Dissection, or just the interpolations can be done solely by memorizing texture patches without spatial information, in addition to the theoretical point that generation is far more difficult than mere discrimination","2019-02-24 21:56 +0000","210.0","1.0","0.004761904761904762","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099781034584010768","https://twitter.com/gwern/status/1099781034584010768","@Miles_Brundage I've been calling it GPT-2-small. Then it's easy to talk about GPT-2-large, GPT-2-anime...","2019-02-24 21:20 +0000","1548.0","18.0","0.011627906976744186","0.0","0.0","7.0","5.0","0.0","0.0","6.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099742427202707456","https://twitter.com/gwern/status/1099742427202707456","@naval I was looking at https://t.co/lRsKw877JL which is pretty funny. It's very unfortunate that WordPress manages to make so much unarchivable (AJAX links to link to some static photos!). Given the circumstances of his departure from the Internet, probably can't ask him either.","2019-02-24 18:46 +0000","1275.0","50.0","0.0392156862745098","0.0","0.0","0.0","2.0","39.0","0.0","9.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099734119523799040","https://twitter.com/gwern/status/1099734119523799040","@realmforge Please don't play dumb. 
Life is too short for that.","2019-02-24 18:13 +0000","468.0","4.0","0.008547008547008548","0.0","2.0","0.0","0.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099729089697599489","https://twitter.com/gwern/status/1099729089697599489","@coldxman The important thing is, is the Munchkins box wired up or not?","2019-02-24 17:53 +0000","2934.0","13.0","0.004430811179277437","0.0","0.0","5.0","3.0","0.0","0.0","5.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099722181179817985","https://twitter.com/gwern/status/1099722181179817985","@leepavelich A much better example, since there is a consensus that there is a whole bunch of complex entities which make up what used to be a fairly simple-looking set of particles. Whether this is only a temporary phase in the long scheme of things will be a good example to posterity.","2019-02-24 17:26 +0000","627.0","4.0","0.006379585326953748","0.0","0.0","2.0","0.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099716081638363136","https://twitter.com/gwern/status/1099716081638363136","@realmforge Huh?","2019-02-24 17:02 +0000","675.0","2.0","0.002962962962962963","0.0","1.0","1.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099697478159515648","https://twitter.com/gwern/status/1099697478159515648","@leepavelich An interesting example, but I would feel uncomfortable lumping it in with the examples already given. 
How many people actually bought into Leibnizian monads besides Leibniz?","2019-02-24 15:48 +0000","546.0","2.0","0.003663003663003663","0.0","1.0","1.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099696888562044929","https://twitter.com/gwern/status/1099696888562044929","@crschmidt What would be really good is if you could figure out how to use the StyleGAN data-preprocessor to include class-conditioning to train a single model. Ultimately, this transfer learning from closely related single-class datasets is inferior to & a poor-man's conditional GAN.","2019-02-24 15:45 +0000","506.0","1.0","0.001976284584980237","0.0","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099695625099243521","https://twitter.com/gwern/status/1099695625099243521","@BasilMarte That's a hopeful analogy. After all, Pygmalion didn't, unlike so many Greek stories, come to a sticky end.","2019-02-24 15:40 +0000","268.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099695131010191365","https://twitter.com/gwern/status/1099695131010191365","@realmforge You may not be interested in what's outside the box, but what's outside the box is interested in you, because all abstractions leak, as Spectre so spectacularly reminded us software guys recently.","2019-02-24 15:38 +0000","707.0","3.0","0.004243281471004243","0.0","1.0","1.0","1.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099669383142621186","https://twitter.com/gwern/status/1099669383142621186","HQ 30min-long video version of ""These Waifus Do Not Exist"": https://t.co/gLDVAuOYO2","2019-02-24 13:56 
+0000","7631.0","90.0","0.011793998165378064","0.0","0.0","5.0","7.0","33.0","0.0","24.0","1.0","1","0","0","0","0","19","19","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099665220425920512","https://twitter.com/gwern/status/1099665220425920512","@eukaryote314 Might want to start prepping your datasets now. ? Remember, dataset cleanliness is next to GPUness.","2019-02-24 13:39 +0000","282.0","1.0","0.0035460992907801418","0.0","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099664907522457601","https://twitter.com/gwern/status/1099664907522457601","Good news! We've gotten GPT-2-small finetuning training working. So we should be able to finetune GPT-2-small on some anime plot corpus and get much better snippets. Potentially worth redoing all the snippets for.","2019-02-24 13:38 +0000","20471.0","77.0","0.0037614185921547557","0.0","6.0","21.0","11.0","0.0","0.0","39.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099529261071908869","https://twitter.com/gwern/status/1099529261071908869","@StimulusRespon1 Yes, one of the very few examples: ellipses are (slightly) more complex than circles. I like geocentrism vs heliocentrism as a case in philosophy of science just because it was so - hard to think of a better word here - unlucky for human thinkers.","2019-02-24 04:39 +0000","798.0","7.0","0.008771929824561403","0.0","1.0","3.0","3.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099528583876411393","https://twitter.com/gwern/status/1099528583876411393","@Meaningness What? When was biology *ever* 'simple tiny meat bits all the way down'? 
Certainly not Aristotle, or preformationism, or anything.","2019-02-24 04:36 +0000","1690.0","11.0","0.00650887573964497","0.0","1.0","0.0","1.0","0.0","0.0","9.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099525374520762368","https://twitter.com/gwern/status/1099525374520762368","In intellectual history, it's easy to name many cases where people were sure something was made of relatively few ontologically-basic complex entities but which turned out to be made of simpler more numerous entities; but can you name any examples of the *reverse* mistake?","2019-02-24 04:24 +0000","11604.0","103.0","0.008876249569114099","1.0","10.0","26.0","14.0","1.0","0.0","50.0","1.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099506416472203265","https://twitter.com/gwern/status/1099506416472203265","@amasad No API, just a very simple static site: https://t.co/Ua5Uq5fqxL You can just grab a random image at '/example-{0-69999}.jpg' for a bot.","2019-02-24 03:08 +0000","742.0","6.0","0.008086253369272238","0.0","0.0","1.0","0.0","3.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099505499492569088","https://twitter.com/gwern/status/1099505499492569088","@bj2rn Yes. As I keep telling people, the GPU-months quoted by the paper/repo are only necessary if you want to squash all the little artifacts. It's a very Pareto thing. Like 90% of the visual quality you get in the first day or three. Lots of great faces if you're willing to filter.","2019-02-24 03:05 +0000","172.0","3.0","0.01744186046511628","0.0","1.0","2.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099504672975568897","https://twitter.com/gwern/status/1099504672975568897","@Ambisinister_ One of these days I am going to go too far. 
But as Blake says, ""You never know what is enough unless you know what is more than enough.""","2019-02-24 03:01 +0000","359.0","2.0","0.005571030640668524","0.0","0.0","2.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099504377482674176","https://twitter.com/gwern/status/1099504377482674176","@bj2rn I used the public GPT-2 for the text, and a StyleGAN I trained from scratch for the faces.","2019-02-24 03:00 +0000","224.0","3.0","0.013392857142857142","0.0","1.0","1.0","1.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099504172448317440","https://twitter.com/gwern/status/1099504172448317440","@GelsameI CC-0.","2019-02-24 02:59 +0000","396.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099504024167084033","https://twitter.com/gwern/status/1099504024167084033","@amasad I'd guess your prompt's way too short. I found longer the prompt, the better. Gives it more suggestions & ideas to work with which are on-topic, you could anthropomorphize it as? Remember, has a 1024-bytepair window/memory, so ~70 words to fill up. eg https://t.co/R0pOnQ2nJm","2019-02-24 02:59 +0000","717.0","12.0","0.016736401673640166","0.0","2.0","1.0","0.0","9.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099500927042682880","https://twitter.com/gwern/status/1099500927042682880","Comment from thread where previous commenter simply used a TWDNE image as an avatar without mentioning TWDNE or anything. 
Between the samples on TWDNE and the models I've released, I wonder if I'm going to be seeing a lot of my StyleGAN-chans looking back at me from my browser?","2019-02-24 02:47 +0000","9365.0","73.0","0.007794981313400961","0.0","4.0","6.0","11.0","0.0","0.0","49.0","3.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099499841963610113","https://twitter.com/gwern/status/1099499841963610113","(He's right about one thing, though. StyleGAN's hair ?? amazing.)","2019-02-24 02:42 +0000","9560.0","52.0","0.005439330543933054","0.0","1.0","6.0","1.0","0.0","0.0","44.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099499830517342209","https://twitter.com/gwern/status/1099499830517342209","4chan: ""Anon, please tell me the artist of that picture. The way they draw that hair is amazing...What is this? What? They're not just cropped photos by various artists? Does the 'An image archive is available for download.' link at least contain the pictures like I assume?""","2019-02-24 02:42 +0000","5381.0","37.0","0.006876045344731463","0.0","2.0","7.0","1.0","0.0","0.0","26.0","1.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099487221785264128","https://twitter.com/gwern/status/1099487221785264128","A kind of 'worse is better', perhaps? Some very generalized reluctance to embrace any paradigm requiring a *lot* of simple units to give rise to complex entities, an insistence that complex things be made of complex things.","2019-02-24 01:52 +0000","13818.0","57.0","0.004125054277029961","1.0","3.0","31.0","9.0","0.0","0.0","13.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099486730401603584","https://twitter.com/gwern/status/1099486730401603584","In many areas, there is a 'brute force' approach. It's often pretty unpopular. 
Doesn't mean it's wrong, though... [genetics/statistics/AI/economics/politics-ethics/philosophy/science] :: [bigger GWASes/Monte Carlo/more GPUs/capitalism/economic growth/atomism/reductionism].","2019-02-24 01:50 +0000","29933.0","231.0","0.007717235158520696","14.0","2.0","82.0","59.0","0.0","0.0","72.0","2.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099486053675778048","https://twitter.com/gwern/status/1099486053675778048","@GelsameI @coldacid @dvoshart @dfa1979 Something like SPIRAL?","2019-02-24 01:47 +0000","283.0","1.0","0.0035335689045936395","0.0","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099479476013068289","https://twitter.com/gwern/status/1099479476013068289","@coldacid @dvoshart @dfa1979 Eh. Some of the StyleGAN FFHQ faces are simply not possible to distinguish. Many of the BigGAN samples appear perfect. Scale scale scale...","2019-02-24 01:21 +0000","320.0","2.0","0.00625","0.0","1.0","1.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099478919235936257","https://twitter.com/gwern/status/1099478919235936257","@gokstudio But the bag-nets paper also showed that the better CNNs aren't just bagnets, much as the same people showed that resnets are perfectly capable of learning shapes rather than textures...","2019-02-24 01:19 +0000","514.0","1.0","0.0019455252918287938","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099463066343956480","https://twitter.com/gwern/status/1099463066343956480","Site writeup (videos, model download, stats, etc): https://t.co/Ua5Uq5fqxL","2019-02-24 00:16 
+0000","9870.0","103.0","0.010435663627152989","2.0","1.0","9.0","2.0","71.0","0.0","18.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099458559702720512","https://twitter.com/gwern/status/1099458559702720512","@coldacid @dvoshart @dfa1979 GAN feedback seems so impoverished, doesn't it? It's hard to believe it works at all. I've speculated about how to provide losses on a more granular level, down to the pixel: https://t.co/8M7QcA73ci","2019-02-23 23:58 +0000","329.0","4.0","0.0121580547112462","0.0","1.0","2.0","0.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099430988680171521","https://twitter.com/gwern/status/1099430988680171521","@dvoshart @coldacid @dfa1979 I think it's adorable. Like kindergarten drawings. One should treasure these stages; you know what they say about NNs, it's like they grow up overnight.","2019-02-23 22:09 +0000","397.0","5.0","0.012594458438287154","0.0","1.0","3.0","0.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099424605746417670","https://twitter.com/gwern/status/1099424605746417670","@coldacid @dvoshart @dfa1979 Speaking of SCP, did you know GPT-2 can do SCPs? https://t.co/xqGlfkOpyA https://t.co/Fv0cJXBFQC","2019-02-23 21:43 +0000","964.0","66.0","0.06846473029045644","0.0","1.0","3.0","1.0","59.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099419857282568197","https://twitter.com/gwern/status/1099419857282568197","@akarlin88 @KirkegaardEmil No, fertility is not an exception. 
See the Tropf paper showing SNP heritability of zero when aggregated *across* cohorts, but a much more typical amount when calculated *within* cohorts, because the fertility PGS have r_g=0 with each other: they are just that place & time bound.","2019-02-23 21:24 +0000","545.0","11.0","0.02018348623853211","1.0","0.0","3.0","1.0","0.0","0.0","6.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099419482357932037","https://twitter.com/gwern/status/1099419482357932037","@SandvikLeth @akarlin88 @KirkegaardEmil ""There is a great deal of ruin in a nation.""","2019-02-23 21:23 +0000","276.0","4.0","0.014492753623188406","0.0","1.0","0.0","0.0","0.0","0.0","3.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099408908152725504","https://twitter.com/gwern/status/1099408908152725504","""These Waifus Do Not Exist"" is particularly hypnotic: https://t.co/UhyMIgNlFc","2019-02-23 20:41 +0000","7820.0","294.0","0.037595907928388746","0.0","0.0","8.0","0.0","1.0","0.0","24.0","0.0","0","0","0","0","0","864","261","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099387141321314304","https://twitter.com/gwern/status/1099387141321314304","@chenoehart You should retry with StyleGAN sometime. It handles sharp edges/lines much better than all the previous GAN architectures I've tried.","2019-02-23 19:14 +0000","327.0","5.0","0.01529051987767584","0.0","1.0","2.0","1.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099349580443996165","https://twitter.com/gwern/status/1099349580443996165","@AndrewCutler13 @halvorz @KirkegaardEmil Yes, but actually drawing on how plants work. Card's version adds little to your basic mammalian setup. 
Children would need to not resemble their parents at all, other people would randomly sprout off your arm, you could clone yourself by cutting off a finger, etc.","2019-02-23 16:45 +0000","246.0","4.0","0.016260162601626018","0.0","0.0","1.0","1.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099348552772395008","https://twitter.com/gwern/status/1099348552772395008","@halvorz @AndrewCutler13 @KirkegaardEmil Plant genetics half the time leave me shaking my head, 'what the heck?' We would live in such a strange world if humans (or mammals in general) worked the way, say, apple trees do. (Great SF premise there if anyone wants it!)","2019-02-23 16:41 +0000","302.0","5.0","0.016556291390728478","0.0","1.0","3.0","0.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099347943704936448","https://twitter.com/gwern/status/1099347943704936448","@KirkegaardEmil @AndrewCutler13 That's been mentioned in things I'm sure you've read, but it's hardly the only factor. I think a lot of it may be just that they are under enormously more selection and the simple additivity is gone. Hard to imagine stuff like, say, phages' ""survival of the flattest"" in humans.","2019-02-23 16:39 +0000","1297.0","5.0","0.0038550501156515036","0.0","0.0","3.0","0.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099347352207458306","https://twitter.com/gwern/status/1099347352207458306","@hbdchick Don't worry. Thanks to the Ashkenazi culture and Jewish emphasis on learning, the adopted children gained enormously in intelligence and better life outcomes. 
It's for the best, really.","2019-02-23 16:36 +0000","1318.0","21.0","0.015933232169954476","0.0","2.0","8.0","3.0","0.0","0.0","8.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099342183294005250","https://twitter.com/gwern/status/1099342183294005250","@KirkegaardEmil Need to be careful about defining expertise, though. I keep seeing these microbe/plant geneticists denying basic behav gen because they assume that humans must work like their wacky organisms with 60 decaploid chromosomes where everything is 5-way interactions and env-specific.","2019-02-23 16:16 +0000","1472.0","30.0","0.020380434782608696","0.0","1.0","13.0","7.0","0.0","0.0","9.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099341401400193025","https://twitter.com/gwern/status/1099341401400193025","_Intelligence, Second Edition_, Brody 1992: https://t.co/O1s5Dl5x0c https://t.co/zfLuwi3BPX https://t.co/4WoyhsP2Qi https://t.co/jqP3dgrk5G","2019-02-23 16:13 +0000","13404.0","49.0","0.003655625186511489","0.0","2.0","3.0","3.0","28.0","0.0","13.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099334809351045123","https://twitter.com/gwern/status/1099334809351045123","@KirkegaardEmil Worse than that. 
If that were remotely true (as it actually is for fertility!), IQ/EDU PGSes wouldn't work out of sample, wouldn't be usable or GWASable across so many cohorts/decades, genetic correlations would be zero, biological enrichments would make no sense, and so on.","2019-02-23 15:46 +0000","3673.0","42.0","0.011434794445956983","2.0","2.0","16.0","5.0","0.0","0.0","16.0","1.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099326602025025536","https://twitter.com/gwern/status/1099326602025025536","@DefaultMCYoutub Actually, come to think of it, the Danbooru community might not be happy about StyleGAN-generated images showing up... I've posted a comment bringing this point up to try to see what they think: https://t.co/1drbzaYBBI","2019-02-23 15:14 +0000","771.0","18.0","0.023346303501945526","0.0","1.0","1.0","0.0","15.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099162649873604608","https://twitter.com/gwern/status/1099162649873604608","@Zergfriend @eukaryote314 That's just speculation, though. BigGAN's metrics, for example, are great.","2019-02-23 04:22 +0000","293.0","3.0","0.010238907849829351","0.0","1.0","0.0","2.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099156783988461568","https://twitter.com/gwern/status/1099156783988461568","@Zergfriend @eukaryote314 We don't know if the progressive aspect does anything in terms of reducing mode collapse except in the very obvious sense that archs which require progressive growing to work at all are in a way suffering the worst mode collapse there is. 
Does BigGAN seem mode-collapsy?","2019-02-23 03:59 +0000","471.0","1.0","0.0021231422505307855","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099147793829191681","https://twitter.com/gwern/status/1099147793829191681","@DefaultMCYoutub Sure. Upload them to Danbooru. :) You may want to learn the community guidelines before you try to participate, though. Like Wikipedia or other crowdsourcing projects, just spamming uploads is not helpful.","2019-02-23 03:23 +0000","747.0","4.0","0.00535475234270415","0.0","2.0","1.0","1.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099116530460250117","https://twitter.com/gwern/status/1099116530460250117","@Don_Rubiel @hardmaru I tried to fix that now and broke the DNS, so oh well. People will just have to use the right subdomain.","2019-02-23 01:19 +0000","202.0","2.0","0.009900990099009901","0.0","0.0","0.0","1.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099103909249986560","https://twitter.com/gwern/status/1099103909249986560","@eukaryote314 Is progressive growing necessary, or just an optimization, in StyleGAN? 
I (accidentally) trained a 512px anime-face StyleGAN starting at 512px (no growing) for 1.5h, and while this should be complete failure if growing required, it was still making progress towards faces: https://t.co/xp1Ii6VrZ7","2019-02-23 00:29 +0000","832.0","11.0","0.013221153846153846","0.0","1.0","0.0","1.0","2.0","0.0","2.0","0.0","0","0","0","0","0","5","5","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099083295608975367","https://twitter.com/gwern/status/1099083295608975367","@jackclarkSF @Miles_Brundage (Its name is a pun, that's how you know it's a good idea.)","2019-02-22 23:07 +0000","670.0","2.0","0.0029850746268656717","0.0","0.0","0.0","0.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099078746483671040","https://twitter.com/gwern/status/1099078746483671040","@Miles_Brundage Free GPT-2 publicity idea. ""NaNoClickLo: writers compete to write the best stories with the fewest clicks using solely the top-k completions from GPT-2, initially top-40 but with k decreasing through to the finals. (Top-1 clicks are free.)""","2019-02-22 22:49 +0000","1473.0","17.0","0.011541072640868975","1.0","1.0","9.0","2.0","0.0","0.0","4.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099077187557302274","https://twitter.com/gwern/status/1099077187557302274","@ajmooch /shakes fist angrily Yeah, well, just you wait until I get StyleGAN trained on Danbooru2018! 
Then we'll see if your BigGAN is still so impressive!","2019-02-22 22:43 +0000","1073.0","7.0","0.0065237651444548","0.0","0.0","3.0","2.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099066567697072130","https://twitter.com/gwern/status/1099066567697072130","@zooko That was one of the inspirations for my own https://t.co/myX81SNEyv","2019-02-22 22:01 +0000","987.0","23.0","0.02330293819655522","0.0","0.0","0.0","2.0","21.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099027765741977600","https://twitter.com/gwern/status/1099027765741977600","@brungl_ We already did it, give it a try (force-refresh if you don't see the pause button).","2019-02-22 19:26 +0000","434.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099016022588379137","https://twitter.com/gwern/status/1099016022588379137","@brungl_ I don't know how to implement that.","2019-02-22 18:40 +0000","478.0","3.0","0.006276150627615063","0.0","0.0","1.0","0.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099015229676101632","https://twitter.com/gwern/status/1099015229676101632","@crschmidt What if beds are just flat couches? ?","2019-02-22 18:37 +0000","387.0","2.0","0.00516795865633075","0.0","0.0","2.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099014751554809856","https://twitter.com/gwern/status/1099014751554809856","@ESYudkowsky @nnotm Hm. Could be stale browser cache if you opened earlier versions in that. 
What it *should* look like is this: https://t.co/y5O3wxo7r8","2019-02-22 18:35 +0000","2376.0","149.0","0.06271043771043772","0.0","0.0","0.0","3.0","10.0","0.0","9.0","1.0","0","0","0","0","0","126","126","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099006866582048768","https://twitter.com/gwern/status/1099006866582048768","@iajrz Safari browser (& Apple?) users: basically the battered-wives of web dev.","2019-02-22 18:03 +0000","266.0","2.0","0.007518796992481203","0.0","0.0","0.0","0.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1099005857969422336","https://twitter.com/gwern/status/1099005857969422336","@nnotm @ESYudkowsky We've revised the CSS to give the text snippets more emphasis, and with a wide enough monitor, they'll just appear side-by-side for maximum viewing convenience.","2019-02-22 17:59 +0000","1260.0","6.0","0.004761904761904762","0.0","1.0","1.0","0.0","0.0","0.0","4.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098991071382720513","https://twitter.com/gwern/status/1098991071382720513","@crschmidt Hey, it's not like that's not what I'm already running my stuff on! 
And if anyone wants to steal my data/models, they're welcome to it...","2019-02-22 17:01 +0000","217.0","1.0","0.004608294930875576","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098973817970614272","https://twitter.com/gwern/status/1098973817970614272","@EricTopol @suilee I keep thinking about that 'infer Uighur facial appearance from DNA' paper: https://t.co/82PPw6lpS4","2019-02-22 15:52 +0000","1744.0","47.0","0.02694954128440367","3.0","0.0","3.0","2.0","33.0","0.0","6.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098965362903826432","https://twitter.com/gwern/status/1098965362903826432","@antonioregalado Wow, I'm in pretty good company here.","2019-02-22 15:18 +0000","395.0","13.0","0.03291139240506329","1.0","0.0","4.0","0.0","0.0","0.0","8.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098947055949619201","https://twitter.com/gwern/status/1098947055949619201","@GelsameI Dunno. Anime<->real hasn't worked too well with CycleGAN in the past.","2019-02-22 14:06 +0000","264.0","1.0","0.003787878787878788","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098934632555401216","https://twitter.com/gwern/status/1098934632555401216","@GelsameI @random_eddie I mean the Python code running the model. The model has no way to 'decide' to stop if the Python chooses to ignore it when it generates '<endoftext>'.","2019-02-22 13:16 +0000","664.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098934311158509569","https://twitter.com/gwern/status/1098934311158509569","@xSandorRight @Fate_Pony So bright I gotta wear shades. 
?","2019-02-22 13:15 +0000","170.0","2.0","0.011764705882352941","0.0","0.0","2.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098933954709782534","https://twitter.com/gwern/status/1098933954709782534","@ExtraFluffyTet @Mobius_One6 Swap out the .tfrecords and keep on training.","2019-02-22 13:14 +0000","522.0","1.0","0.0019157088122605363","0.0","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098933626249625602","https://twitter.com/gwern/status/1098933626249625602","@GelsameI Public small one.","2019-02-22 13:12 +0000","449.0","1.0","0.0022271714922048997","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098932998081314816","https://twitter.com/gwern/status/1098932998081314816","@GelsameI @random_eddie I don't think the sampling code actually has any support for ending when the '<endoftext>' string gets generated, so it just keeps going and forcing more characters to be generated. I've started simply stripping them out with sed.","2019-02-22 13:10 +0000","702.0","1.0","0.0014245014245014246","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098805275862188034","https://twitter.com/gwern/status/1098805275862188034","@random_eddie No, I used a pretty long one to try to stuff a lot of keywords in for GPT-2 to use. Playing around, it seemed like the more keywords I added, the more interesting the outputs tended to be. 
So I wound up with this: https://t.co/1Y2LbjQCYQ","2019-02-22 04:42 +0000","924.0","69.0","0.07467532467532467","0.0","1.0","3.0","1.0","61.0","0.0","3.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098799703905484800","https://twitter.com/gwern/status/1098799703905484800","@shapr @imagemage I'm only responsible for the waifu one, don't look at me.","2019-02-22 04:20 +0000","248.0","4.0","0.016129032258064516","0.0","1.0","2.0","0.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098796995320446976","https://twitter.com/gwern/status/1098796995320446976","@GelsameI I can't be help responsible for what the model randomly generates!","2019-02-22 04:09 +0000","707.0","2.0","0.002828854314002829","0.0","1.0","0.0","0.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098787416704499712","https://twitter.com/gwern/status/1098787416704499712","GPT-2 anime plot summaries added.","2019-02-22 03:31 +0000","24454.0","118.0","0.004825386439846242","5.0","5.0","26.0","20.0","0.0","0.0","62.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098766143349813249","https://twitter.com/gwern/status/1098766143349813249","@StubbornLights Heck, I'd read this LN series: https://t.co/KUEbSWNVsn #GPT2 https://t.co/k4E1rNw414","2019-02-22 02:07 +0000","4692.0","47.0","0.010017050298380221","0.0","1.0","2.0","1.0","7.0","0.0","3.0","0.0","0","0","0","0","0","33","33","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098758024532054016","https://twitter.com/gwern/status/1098758024532054016","@GelsameI My status update after waking up too early this morning.","2019-02-22 01:35 +0000","241.0","2.0","0.008298755186721992","0.0","0.0","2.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098755344061394944","https://twitter.com/gwern/status/1098755344061394944","@GelsameI I also mean in just terms of diversity. Like you'll open up a random sample and it'll be... like... a crayon drawing: https://t.co/kwy8D9OM1g And the next will be an oil painting, and the next pixel-perfect cel shading. MGM will never do anything like that.","2019-02-22 01:24 +0000","563.0","40.0","0.07104795737122557","0.0","1.0","0.0","2.0","37.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098751743872196615","https://twitter.com/gwern/status/1098751743872196615","@cinnamon_carter I rather doubt that. Someone estimated the electricity for this was like... $10-20. Machine learning is cheap!","2019-02-22 01:10 +0000","539.0","3.0","0.0055658627087198514","0.0","1.0","1.0","0.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098751361464836097","https://twitter.com/gwern/status/1098751361464836097","@GelsameI I did, yes. 0.7 is playing it a little bit safe, but I was worried about people seeing a bad face and dismissing it out of hand. I mean, have you see all the people dismissing it as 'oh, they all look the same as https://t.co/vUX55sclkd'? Grrr.","2019-02-22 01:08 +0000","568.0","20.0","0.035211267605633804","0.0","1.0","1.0","0.0","18.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098744805952290816","https://twitter.com/gwern/status/1098744805952290816","Since people seem to particularly enjoy the wackier faces, I generated another batch of 10k with psi=1.0 rather than 0.7. 
(I really should call it quits here, the site is already up to 1.96TB of bandwidth used.)","2019-02-22 00:42 +0000","21788.0","107.0","0.004910960161556821","3.0","4.0","15.0","5.0","0.0","0.0","79.0","1.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098735663539732481","https://twitter.com/gwern/status/1098735663539732481","@ipreferpi Talk to Roadrunner: https://t.co/S8tCJLkpC3 I've got bigger faces to fry.","2019-02-22 00:06 +0000","846.0","12.0","0.014184397163120567","0.0","0.0","3.0","2.0","3.0","0.0","1.0","3.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098729959462440960","https://twitter.com/gwern/status/1098729959462440960","Tumblr discovers TWDNE: https://t.co/jQnmIM9ioD https://t.co/FGZrVyzjFT https://t.co/i0JkTbPTll","2019-02-21 23:43 +0000","8194.0","194.0","0.023675860385648034","0.0","0.0","5.0","0.0","170.0","0.0","19.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098729594511859713","https://twitter.com/gwern/status/1098729594511859713","@eukaryote314 No. (There's a 512px but that's full images so you definitely don't want to crop from that!) The 220k I'm using right now is ad hoc & not clean so I'm not sure if I want to put it up. It's only 18GB, but still... Might look at the Moeimouto face dataset: https://t.co/TsuqDtnd4b","2019-02-21 23:42 +0000","383.0","3.0","0.007832898172323759","0.0","1.0","0.0","0.0","1.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098700905124413440","https://twitter.com/gwern/status/1098700905124413440","@bechhof @karlbykarlsmith @hamandcheese I've long agreed with that. 
Current genome editing techniques are not that interesting: https://t.co/O8xCWseGok","2019-02-21 21:48 +0000","345.0","6.0","0.017391304347826087","0.0","0.0","2.0","0.0","1.0","0.0","3.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098698333479206925","https://twitter.com/gwern/status/1098698333479206925","@bechhof @karlbykarlsmith @hamandcheese On what, the genome editing or the 'genius is synesthesia'?","2019-02-21 21:37 +0000","464.0","1.0","0.0021551724137931034","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098684339288752129","https://twitter.com/gwern/status/1098684339288752129","@EMostaque @slatestarcodex I guess his past performance... didn't predict future returns.","2019-02-21 20:42 +0000","1081.0","4.0","0.0037002775208140612","0.0","0.0","1.0","1.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098682593254486016","https://twitter.com/gwern/status/1098682593254486016","@slatestarcodex Nominative determinism: Bowser is new king of Nintendo America. https://t.co/cXf53NH9RC","2019-02-21 20:35 +0000","3144.0","42.0","0.013358778625954198","1.0","0.0","25.0","9.0","1.0","0.0","5.0","1.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098682141024677889","https://twitter.com/gwern/status/1098682141024677889","@High_magik Eh. The JS for that is a bit more than I could easily do, and I kind of like the 'rogue-like' or 'Borges's Library' aspect of it. (Doesn't it cheapen the experience if you know you can just rewind? Feel dat ???? ?? 
?????!)","2019-02-21 20:33 +0000","3006.0","2.0","6.653359946773121E-4","0.0","1.0","1.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098678130737381377","https://twitter.com/gwern/status/1098678130737381377","@Russwarne I found it a little odd that the genetics of musical ability had been studied relatively so much so early on but appears to have more or less been completely abandoned for the past half-century. Research trends can move like that, I guess.","2019-02-21 20:17 +0000","955.0","3.0","0.0031413612565445027","0.0","1.0","0.0","1.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098677815094980610","https://twitter.com/gwern/status/1098677815094980610","@Russwarne Ah, Seashore & Stanton! I remember them from Shuter's genetics section in https://t.co/EG3VQPWTHJ After reading through Shuter & the few behavioral genetics studies on music, I was left a little puzzled. (It's probably just another highly polygenic additive trait, though.)","2019-02-21 20:16 +0000","928.0","3.0","0.003232758620689655","0.0","1.0","1.0","0.0","1.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098668201175642112","https://twitter.com/gwern/status/1098668201175642112","@alfcnz @roadrunning01 I guess? I've never used Gaussian filters before, much less Numpy's specific implementation.","2019-02-21 19:38 +0000","1131.0","1.0","8.841732979664014E-4","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098667955284557825","https://twitter.com/gwern/status/1098667955284557825","@Russwarne @Jintelligence1 @UVU @Melen_Dez @JaZBur @MDPIOpenAccess I once got curious about Goddard's immigrant IQ testing. 
I eventually tracked down his paper in Google Books: https://t.co/daw3Lof5ts Literally the second sentence: ""The study makes no determination of the actual percentage, even of these groups, who are feeble-minded."" ?","2019-02-21 19:37 +0000","1645.0","10.0","0.0060790273556231","0.0","1.0","0.0","2.0","4.0","0.0","3.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098665462395736065","https://twitter.com/gwern/status/1098665462395736065","@alfcnz @roadrunning01 There is, but it's a Gaussian filter on a matrix of pre-generated latents which makes them more similar to each other. It's not like 'generate a random z-vector then multiply each element by a random percentage for consecutive frame'. (Seems like an unnecessary optimization TBH).","2019-02-21 19:27 +0000","1489.0","3.0","0.0020147750167897917","0.0","1.0","0.0","1.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098658630121984000","https://twitter.com/gwern/status/1098658630121984000","@antonioregalado @pknoepfler I don't know about standard SNP arrays but CCR5 is a protein-coding gene and shouldn't that be picked up in any exome study? So either way.","2019-02-21 19:00 +0000","2593.0","10.0","0.0038565368299267257","0.0","1.0","3.0","2.0","0.0","0.0","4.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098638323717943296","https://twitter.com/gwern/status/1098638323717943296","@antonioregalado @pknoepfler Did you look at the human part? It's a candidate-gene hit (p=0.01 or 0.03!) in n=446 in a half-Ashkenazi cohort where the Ashkenazi have more CCR (hmmm.....) with a whole bunch of endpoints tested. 
And as usual, why didn't the SNP or exome IQ GWASes find it before if it's real?","2019-02-21 17:39 +0000","2627.0","59.0","0.022459078797106968","2.0","1.0","14.0","6.0","0.0","0.0","34.0","0.0","0","0","2","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098625469132537856","https://twitter.com/gwern/status/1098625469132537856","@alfcnz @roadrunning01 Hm... It's not clear how 'step size' is done in the video-generating code because it generates a large matrix of latents and then applies a Gaussian filter. Increasing 'smoothing' (filter width?) makes it very blurry. Tried increasing FPS to 69 & decreasing RNG SD to 0.5 in 30s. https://t.co/MtP2jm7WkR","2019-02-21 16:48 +0000","2192.0","72.0","0.032846715328467155","1.0","1.0","2.0","1.0","0.0","0.0","15.0","0.0","0","0","0","0","0","202","52","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098625193306718209","https://twitter.com/gwern/status/1098625193306718209","@halcy There's https://t.co/V2faUhBuvt but it's in a weirdo framework & he hasn't released the model yet AFAIK.","2019-02-21 16:47 +0000","1452.0","19.0","0.013085399449035813","0.0","2.0","1.0","2.0","14.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098622704427716608","https://twitter.com/gwern/status/1098622704427716608","@halcy I remain convinced that VGG16 is not a good idea for anything illustration-related (and probably not a good idea for anything these days, period).","2019-02-21 16:37 +0000","1806.0","5.0","0.0027685492801771874","0.0","1.0","0.0","0.0","0.0","0.0","4.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098621785338273794","https://twitter.com/gwern/status/1098621785338273794","@_vivalapanda @CuteMutePrude The 4chan & SA forums seemed to have no trouble coming up with emotional connections or narratives eg https://t.co/bU9g80eeHQ 
'Replication, variation, selection'...","2019-02-21 16:33 +0000","931.0","5.0","0.0053705692803437165","0.0","1.0","1.0","0.0","3.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098615035398668290","https://twitter.com/gwern/status/1098615035398668290","@Saigarich @realmforge @DrazHD Increasing image scale provides more global structure to work with and also a lot more images period.","2019-02-21 16:06 +0000","1871.0","7.0","0.0037413148049171567","0.0","0.0","0.0","3.0","0.0","0.0","4.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098611818161946624","https://twitter.com/gwern/status/1098611818161946624","@eukaryote314 Just using the default multi-GPU support which is... 'data parallelism', I think. And yeah, 512px is easy. StyleGAN seems to scale very well & is stable. Higher res aren't much slower than low res. (I'm not sure the progressive growing is required at all!)","2019-02-21 15:54 +0000","1498.0","3.0","0.0020026702269692926","0.0","3.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098596342279143431","https://twitter.com/gwern/status/1098596342279143431","@hardmaru If you want funny/interesting samples, you should be looking at the psi=1.2 samples: https://t.co/xQ6o3gXk1T","2019-02-21 14:52 +0000","3541.0","15.0","0.004236091499576391","0.0","0.0","2.0","6.0","2.0","0.0","4.0","1.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098595912950136832","https://twitter.com/gwern/status/1098595912950136832","@realmforge @Saigarich @DrazHD I don't think that's necessary. 
Just increase the model size & resolution, and throw a few GPU-months at it.","2019-02-21 14:50 +0000","2354.0","3.0","0.001274426508071368","0.0","1.0","1.0","1.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098595736806215681","https://twitter.com/gwern/status/1098595736806215681","@eukaryote314 TWDNE uses pi GPU-weeks (based on 2x1080ti). But you get pretty good 512px faces after just 2 GPU-days or so. (The shoulders & backgrounds are damnably difficult and take forever to slowly improve.)","2019-02-21 14:50 +0000","1309.0","4.0","0.0030557677616501145","0.0","1.0","1.0","0.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098595128590192641","https://twitter.com/gwern/status/1098595128590192641","@davidabann @irishinneburg @mendel_random I think that would be a relatively short list and limited to a few minor scenarios, since my belief is that we vastly underinvest in RCTs compared to correlational... 
Some examples from the sequential trial literature, I suppose, like the infant steroid Cochrane Collab example.","2019-02-21 14:47 +0000","314.0","3.0","0.009554140127388535","0.0","0.0","1.0","0.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098594604146966529","https://twitter.com/gwern/status/1098594604146966529","@FatherSlate My rule of thumb so far is 1/3rd.","2019-02-21 14:45 +0000","185.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098422055727448064","https://twitter.com/gwern/status/1098422055727448064","@guillefix https://t.co/yXswDjbaj6","2019-02-21 03:20 +0000","680.0","4.0","0.0058823529411764705","0.0","1.0","3.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098420649893261312","https://twitter.com/gwern/status/1098420649893261312","@claymeuns ""Actually, you're thinking of Frankenstein's waifu.""","2019-02-21 03:14 +0000","751.0","8.0","0.010652463382157125","0.0","0.0","4.0","3.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098412392084656128","https://twitter.com/gwern/status/1098412392084656128","@coldacid Absolutely. eg https://t.co/V4TOMdGOS9 or https://t.co/J2FR4lfO7Z or https://t.co/fzrWBkSiwv or https://t.co/hcHDok2pRl","2019-02-21 02:41 +0000","1529.0","73.0","0.04774362328319163","0.0","2.0","2.0","3.0","60.0","0.0","2.0","4.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098408390613745664","https://twitter.com/gwern/status/1098408390613745664","In China, the Baidu forums in particular seem to be having a blast with TWDNE and These Waifu Do Not Exist. Forum avatars and everything. 
Here's one user maximizing their use of the latter even as they train their own StyleGAN: https://t.co/JharIS2yhd ? https://t.co/AfbgX33rW6","2019-02-21 02:25 +0000","20241.0","751.0","0.03710290993527988","5.0","4.0","43.0","9.0","304.0","0.0","90.0","0.0","0","0","0","0","0","296","296","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098405289257324545","https://twitter.com/gwern/status/1098405289257324545","@stormtroper1721 Yeah, that's definitely the same UI! I wonder what else he had to do to get it working?","2019-02-21 02:13 +0000","793.0","1.0","0.0012610340479192938","0.0","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098398137968934912","https://twitter.com/gwern/status/1098398137968934912","@brathalu @YahooFinance 1. The faces are generated by the neural net, that's the point. If you mean where did the real faces it was trained on come from? They're all CC-licensed Flickr photos. 2. to amuse, and inform people about the possibilities right now.","2019-02-21 01:44 +0000","554.0","6.0","0.010830324909747292","0.0","1.0","0.0","1.0","0.0","0.0","4.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098397471238209536","https://twitter.com/gwern/status/1098397471238209536","@GelsameI Yep, that's the dream. Full 1024px Danbooru2018 images generated by a StyleGAN conditioned on a text embedding of tags+metadata... Step by step we're getting there.","2019-02-21 01:42 +0000","894.0","6.0","0.006711409395973154","0.0","0.0","3.0","1.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098395385922551808","https://twitter.com/gwern/status/1098395385922551808","@GelsameI Sure. Conditional text GANs work fantastically. Look at StackGAN/StackGAN++ (or BigGAN to some degree). 
Not quite a trivial coding project though...","2019-02-21 01:34 +0000","839.0","2.0","0.0023837902264600714","0.0","1.0","1.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098394997202788353","https://twitter.com/gwern/status/1098394997202788353","@1hplovecraft So? It is unlikely ""Face Editor"" would work with zero reconfiguration with StyleGANs and even if it did, you'd have to redo all the binary classification training to make it work with my specific StyleGAN. Whereas the guy I linked appears to have a fully implemented system.","2019-02-21 01:32 +0000","789.0","2.0","0.0025348542458808617","0.0","1.0","0.0","1.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098394213413257216","https://twitter.com/gwern/status/1098394213413257216","@1hplovecraft Oh, I know how it works. But I want source code and whatnot, not just a preview video.","2019-02-21 01:29 +0000","769.0","1.0","0.0013003901170351106","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098384515234516997","https://twitter.com/gwern/status/1098384515234516997","@GelsameI Yeah, either he hasn't updated the button labels or the classification isn't too good, but if the latter, can probably be fixed by using a better more powerful classifier+more labeled images. 
And any control would make a lot of people happy, of course.","2019-02-21 00:50 +0000","870.0","3.0","0.0034482758620689655","0.0","1.0","0.0","0.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098383697026473986","https://twitter.com/gwern/status/1098383697026473986","@GelsameI I think so; see my comment.","2019-02-21 00:47 +0000","872.0","1.0","0.0011467889908256881","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098382771767128064","https://twitter.com/gwern/status/1098382771767128064","@rubenarslan @razibkhan That would depend very strongly on age, wouldn't it? That was kind of my whole point in https://t.co/yoMuULpJ6j : high-IQ elementary schools fail because phenotypic-only testing for extreme outliers is poorly correlated with adult IQ/outcomes w/o better predictors (eg genes).","2019-02-21 00:43 +0000","535.0","25.0","0.04672897196261682","1.0","1.0","5.0","0.0","9.0","0.0","9.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098381065134227456","https://twitter.com/gwern/status/1098381065134227456","Looks like someone's hooked up StyleGAN to latent-variable classifiers, allowing interactive control of the visual attributes of each face: https://t.co/odBUWO5yWB OP's not answering any questions, though. Anyone know where this comes from? 
Not finding anything in Google.","2019-02-21 00:37 +0000","19028.0","465.0","0.024437670800924953","4.0","3.0","32.0","8.0","384.0","0.0","34.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098362630123778050","https://twitter.com/gwern/status/1098362630123778050","@SKULLYEAHBROTH1 Someone sounds mad!","2019-02-20 23:23 +0000","282.0","6.0","0.02127659574468085","0.0","1.0","0.0","2.0","0.0","0.0","3.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098326792362954762","https://twitter.com/gwern/status/1098326792362954762","And, uh, fixed on mobile browsers. (You'd think after 460k unique visitors, someone would've mentioned that to me.)","2019-02-20 21:01 +0000","24898.0","83.0","0.0033336010924572253","0.0","4.0","11.0","17.0","0.0","0.0","51.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098304934641127424","https://twitter.com/gwern/status/1098304934641127424","'""Watch out!"" the Bayesmancer shouted to the party after rolling an informative prior of ¦Â(19,20). ""Borge's Library is almost certainly haunted by an infinite number of¡ª???? ????????!""' (I may be spending too much time on typography.) https://t.co/CnvNvptO1n","2019-02-20 19:34 +0000","16105.0","558.0","0.034647624961192176","0.0","1.0","15.0","3.0","80.0","0.0","22.0","0.0","0","0","0","0","0","437","437","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098299114624901122","https://twitter.com/gwern/status/1098299114624901122","@dresdencodak I have bad news for you: https://t.co/Eao84K4nmg appears to've been hacked to do pharmacloaking. 
https://t.co/rbsgX0DzLa","2019-02-20 19:11 +0000","429.0","7.0","0.016317016317016316","0.0","1.0","0.0","2.0","0.0","0.0","0.0","0.0","0","0","0","0","0","4","4","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098296887382343680","https://twitter.com/gwern/status/1098296887382343680","@razibkhan In a way, they've been a self-fulfilling prophecy. The experience curve effect + cumulative genotype-phenotype datasets. Amara's law interacts nicely with things that scale the way genetics does.","2019-02-20 19:02 +0000","1101.0","1.0","9.082652134423251E-4","0.0","0.0","0.0","0.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098287915392737284","https://twitter.com/gwern/status/1098287915392737284","@Kayoshio Someone made a Mega dump: https://t.co/ochBvkERqt But as they are all random samples, it seems to me that just about any way of displaying them is inherently random...","2019-02-20 18:26 +0000","567.0","9.0","0.015873015873015872","0.0","1.0","1.0","0.0","7.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098272598297849857","https://twitter.com/gwern/status/1098272598297849857","@mpmpwgmpd Reading through the blog, they take a similar position as many lawyers in the USA, but do not cite any actual precedents or court cases establishing it. So...","2019-02-20 17:26 +0000","2972.0","10.0","0.0033647375504710633","0.0","0.0","0.0","0.0","0.0","0.0","10.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098268137978585091","https://twitter.com/gwern/status/1098268137978585091","@mpmpwgmpd No one knows! It's public domain... maybe? 
https://t.co/ukyLtJM75V","2019-02-20 17:08 +0000","4174.0","17.0","0.004072831816003833","2.0","1.0","4.0","1.0","2.0","0.0","7.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098263189710606336","https://twitter.com/gwern/status/1098263189710606336","Video of site & demo made by Scavin: https://t.co/ZJErTAv5HG","2019-02-20 16:48 +0000","12952.0","121.0","0.009342186534898085","2.0","1.0","1.0","5.0","40.0","0.0","46.0","0.0","2","0","0","0","0","24","24","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098262819613622272","https://twitter.com/gwern/status/1098262819613622272","Upscaled faces to 1024px with waifu2x. Added another 20k faces (60k total). Added some more explanatory text & links. Site stats so far: 2,700 people on site; 430,932 unique visitors; 10.5 faces per session; >375GB bandwidth.","2019-02-20 16:47 +0000","24705.0","118.0","0.004776361060514066","1.0","2.0","31.0","16.0","0.0","0.0","68.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098251320727879680","https://twitter.com/gwern/status/1098251320727879680","@_vivalapanda ""Aah, so it's like that, huh. 
I understand everything now.""","2019-02-20 16:01 +0000","501.0","2.0","0.003992015968063872","0.0","0.0","0.0","0.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098246067836145664","https://twitter.com/gwern/status/1098246067836145664","@_vivalapanda Don't you start.","2019-02-20 15:40 +0000","527.0","2.0","0.003795066413662239","0.0","1.0","0.0","0.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098232891241349125","https://twitter.com/gwern/status/1098232891241349125","@VelleitySoft ""Quantity is a quality all its own."" Something people really aren't appreciating in some contexts, such as GPT-2...","2019-02-20 14:48 +0000","188.0","1.0","0.005319148936170213","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098230240910942209","https://twitter.com/gwern/status/1098230240910942209","@halcy https://t.co/mHn0AHb7fe + https://t.co/21DtraGy1N (And of course I shared the model! What fun would it be if I was the only one who could play with it? 
I can never understand the mindset of people who *don't* share their trained models.)","2019-02-20 14:37 +0000","1249.0","17.0","0.013610888710968775","0.0","1.0","2.0","2.0","10.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098229834294194176","https://twitter.com/gwern/status/1098229834294194176","@VelleitySoft Hah, and here I was generating another 20k images so people don't get bored.","2019-02-20 14:36 +0000","277.0","1.0","0.0036101083032490976","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098225384670269440","https://twitter.com/gwern/status/1098225384670269440","@JoakimRi Pre-generated images. (Nothing is done to make the RNG seed for each generated image unique, but the latent embedding is so large that collisions will never happen.)","2019-02-20 14:18 +0000","614.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098224007705489408","https://twitter.com/gwern/status/1098224007705489408","@halcy Neat, I'll have to steal that.","2019-02-20 14:13 +0000","1475.0","2.0","0.0013559322033898306","0.0","2.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098222105177534465","https://twitter.com/gwern/status/1098222105177534465","@halcy Is that the 'fine' interpolation video?","2019-02-20 14:05 +0000","1503.0","7.0","0.004657351962741184","0.0","2.0","0.0","1.0","0.0","0.0","4.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098221189577748480","https://twitter.com/gwern/status/1098221189577748480","@VelleitySoft Oh Anon! 
Is it not written: ""The Moving Finger writes; and, having writ, Moves on""","2019-02-20 14:01 +0000","312.0","4.0","0.01282051282051282","0.0","3.0","1.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098220747170947072","https://twitter.com/gwern/status/1098220747170947072","@disputed_proof My hypothesis is that it's a mix of some global incoherency leading to heterochromia (*the* anime GAN failure mode) and attempts at shading/lighting making darker/lighter eyes. In other GANs, dropping in a self-attention layer at 64px reduces heterochromia massively.","2019-02-20 14:00 +0000","839.0","6.0","0.007151370679380214","0.0","0.0","2.0","0.0","0.0","0.0","4.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098220211814174720","https://twitter.com/gwern/status/1098220211814174720","@MarkusRamikin Which is curious because there's only 12.6k Holos in the 220k. There's maybe 5k Asukas, so you'd expect to see a lot of Asuka if that's enough to make her overrepresented, and yet, you don't see Asuka nearly as much.","2019-02-20 13:57 +0000","707.0","1.0","0.0014144271570014145","0.0","0.0","0.0","0.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098219811132321793","https://twitter.com/gwern/status/1098219811132321793","@GelsameI Probably, yes. Lines are hard for CNNs (or maybe just GANs in general?), it seems. Sharpness always comes late in training.","2019-02-20 13:56 +0000","198.0","1.0","0.005050505050505051","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098218201652043776","https://twitter.com/gwern/status/1098218201652043776","@VelleitySoft Guys! It's just randomly generated faces! 
If you want them, just download the model and generate as many as you wish, that's what it's for: https://t.co/faZHaaP59C","2019-02-20 13:49 +0000","335.0","5.0","0.014925373134328358","0.0","1.0","0.0","0.0","0.0","0.0","4.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098217932251906048","https://twitter.com/gwern/status/1098217932251906048","Suddenly very glad this morning that I decided to stay up late last night and install the optional CloudFlare caching. https://t.co/WLDcBZPXEU","2019-02-20 13:48 +0000","11065.0","352.0","0.03181201988251243","1.0","5.0","27.0","5.0","28.0","0.0","59.0","0.0","0","0","0","0","0","227","227","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098116159646453761","https://twitter.com/gwern/status/1098116159646453761","@AdamDemirel @ggreer @draughtens I actually did look at the time/bounce/total statistics as we made improvements but unfortunately way too much noise & temporal variation, and I'm still running the second ad A/B test.","2019-02-20 07:04 +0000","175.0","3.0","0.017142857142857144","0.0","1.0","0.0","0.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098112791641288704","https://twitter.com/gwern/status/1098112791641288704","@crschmidt You might also want to look at https://t.co/wuVi8ZUYBE. 
I haven't yet run anything on them (still uploading a dataset to my server) but their prices are very tempting.","2019-02-20 06:51 +0000","467.0","11.0","0.023554603854389723","0.0","1.0","2.0","0.0","6.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098111801781309441","https://twitter.com/gwern/status/1098111801781309441","@madeofmistak3 Just as soon as OpenAI releases the full GPT-2 and we can finetune it on anime & chat dialogue corpuses...","2019-02-20 06:47 +0000","741.0","18.0","0.024291497975708502","0.0","0.0","6.0","3.0","0.0","0.0","9.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098110717604126721","https://twitter.com/gwern/status/1098110717604126721","@crschmidt It's also addictive watching transfer learning! You see it reuse the old primitives, gradually warping and distorting them to fit: ""wait. stop. no. that's not how it works. aw geez."" Like the zombie faces from my anime<->real face experiments.","2019-02-20 06:42 +0000","417.0","1.0","0.002398081534772182","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098110106145820672","https://twitter.com/gwern/status/1098110106145820672","Or 5326, for that matter. ""Well, that escalated quickly.""","2019-02-20 06:40 +0000","15028.0","59.0","0.003926004791056694","0.0","3.0","15.0","9.0","0.0","0.0","32.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098107604327387136","https://twitter.com/gwern/status/1098107604327387136","@ESYudkowsky These question seem increasingly above my paygrade. 
We may need to ask Tegmark.","2019-02-20 06:30 +0000","4898.0","54.0","0.011024908125765618","1.0","1.0","27.0","10.0","0.0","0.0","15.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098107410785452032","https://twitter.com/gwern/status/1098107410785452032","This has really taken off in China. I've never seen 1700 simultaneous readers in my realtime Google Analytics before, and people are looking at 11.5 faces on average. Not bad.","2019-02-20 06:29 +0000","12342.0","116.0","0.00939880084265111","0.0","2.0","24.0","13.0","0.0","0.0","76.0","1.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098100287838085120","https://twitter.com/gwern/status/1098100287838085120","@irishinneburg @davidabann @mendel_random I keep a messy list of studies I've found directly comparing correlational & RCT results at https://t.co/wZfrrgmLLX","2019-02-20 06:01 +0000","2120.0","67.0","0.03160377358490566","2.0","2.0","3.0","8.0","35.0","0.0","17.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098078033506910213","https://twitter.com/gwern/status/1098078033506910213","@sudogene / ""What! did the Hand then of the Potter shake!"" /","2019-02-20 04:32 +0000","593.0","1.0","0.0016863406408094434","0.0","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098071441759789056","https://twitter.com/gwern/status/1098071441759789056","Is clicking to request a new waifu too boring for you? Do you need even more faces to maintain that dopamine drop? 
Boy do I have a website for you: https://t.co/7PUk4VmGQZ","2019-02-20 04:06 +0000","36676.0","803.0","0.02189442687315956","7.0","11.0","31.0","19.0","614.0","0.0","121.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098056344664530944","https://twitter.com/gwern/status/1098056344664530944","@missamyjie A person of taste, I see.","2019-02-20 03:06 +0000","827.0","3.0","0.0036275695284159614","1.0","0.0","1.0","0.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098053759928860672","https://twitter.com/gwern/status/1098053759928860672","@Zergfriend While making it, someone told me, 'but gwern, don't these waifu now exist now that your StyleGAN has made them?' ME: 'Ah, but you see, that's the deeper level of my little joke: ?? waifu exist in the first place!' THEM: ?","2019-02-20 02:56 +0000","2646.0","22.0","0.008314436885865457","0.0","0.0","10.0","2.0","0.0","0.0","10.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098046560125026305","https://twitter.com/gwern/status/1098046560125026305","@AndrewCutler13 @KirkegaardEmil (oh come on)","2019-02-20 02:27 +0000","639.0","5.0","0.00782472613458529","0.0","2.0","1.0","1.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098043486648045569","https://twitter.com/gwern/status/1098043486648045569","@EileenOrmsby Man, I saw that in _Ghost in the Shell_ like, decades ago.","2019-02-20 02:15 +0000","802.0","10.0","0.012468827930174564","2.0","0.0","6.0","2.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098043172314365952","https://twitter.com/gwern/status/1098043172314365952","@jjvie Hm, well, I can reduce the JPEG quality... 
Normally I would slap Cloudflare in front of it to save bandwidth but I'm too worried about screwing it up. Bit too late at night for even more DNS shenanigans.","2019-02-20 02:14 +0000","849.0","2.0","0.002355712603062426","0.0","0.0","0.0","1.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098041714420473857","https://twitter.com/gwern/status/1098041714420473857","I suppose it was inevitable: https://t.co/yDcN0LslfL What is this odd sensation I feel in my cardiac compartment?","2019-02-20 02:08 +0000","33313.0","1068.0","0.03205955632936091","2.0","5.0","36.0","13.0","899.0","0.0","111.0","2.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098023394161975299","https://twitter.com/gwern/status/1098023394161975299","@EmperorArilando @elonmusk He was retweeting someone using my StyleGAN model to generate an interplation video, to be specific.","2019-02-20 00:55 +0000","1593.0","4.0","0.0025109855618330196","0.0","0.0","1.0","1.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098023044126253056","https://twitter.com/gwern/status/1098023044126253056","@davidabann @mendel_random We already know these sorts of analyses do not recover the causal estimates from RCTs. 
NICE in the UK sponsored several of the systematic reviews demonstrating as much!","2019-02-20 00:54 +0000","525.0","10.0","0.01904761904761905","0.0","1.0","0.0","4.0","0.0","0.0","5.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098017881684668416","https://twitter.com/gwern/status/1098017881684668416","@GelsameI @StubbornLights I think https://t.co/vUX55sclkd already has some work on generating/controlling very small 3D meshes/models, actually.","2019-02-20 00:33 +0000","2637.0","36.0","0.013651877133105802","0.0","0.0","3.0","2.0","31.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098008372702527488","https://twitter.com/gwern/status/1098008372702527488","@StubbornLights 'One tweet, one kill.' /Nekomiya Hinata","2019-02-19 23:56 +0000","2631.0","5.0","0.0019004180919802356","0.0","1.0","2.0","1.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1098001196219027463","https://twitter.com/gwern/status/1098001196219027463","@StubbornLights Haha, come on now that's just silly. Why, you'd have to download the GPT-2 model and Colab script locally and come up with a bunch of prompts and generate 40k samples and figure out how to use JS to randomly transclude one... hm...","2019-02-19 23:27 +0000","6871.0","17.0","0.0024741667879493523","0.0","2.0","12.0","0.0","0.0","0.0","3.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097993155738329089","https://twitter.com/gwern/status/1097993155738329089","@SilverVVulpes Whoa there, projection much? 
That's not my fetish.","2019-02-19 22:55 +0000","3622.0","21.0","0.005797901711761457","0.0","0.0","11.0","4.0","0.0","0.0","6.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097989106355851265","https://twitter.com/gwern/status/1097989106355851265","@srchvrs @mark_riedl You wouldn't download a child, would you? Similar question: how can you tell the difference between a DQN and a cat?","2019-02-19 22:39 +0000","1787.0","10.0","0.005595970900951315","1.0","1.0","4.0","2.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097987390407405575","https://twitter.com/gwern/status/1097987390407405575","@KOHiNARU Our ?? for this AI is pure, don't lewd her! (But probably sometime this year. Decentish results at 512px whole anime images with ~1.5 GPU-weeks so far: https://t.co/J2FR4lfO7Z )","2019-02-19 22:32 +0000","983.0","77.0","0.07833163784333673","0.0","1.0","5.0","0.0","64.0","0.0","7.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097985149726351360","https://twitter.com/gwern/status/1097985149726351360","@xsteenbrugge Interesting. I wonder what's going on with those artefacts? Trying to amplify the noise from staticy-abstractions on aerial photographs? Aerial photographs might not be 'smooth' like photos of more macro scale stuff is.","2019-02-19 22:23 +0000","153.0","4.0","0.026143790849673203","0.0","0.0","1.0","0.0","0.0","0.0","3.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097983408733650944","https://twitter.com/gwern/status/1097983408733650944","@ETDEUMPURITAS @oldmancalvin Some day we will make anime real. 
The Great Work continues.","2019-02-19 22:16 +0000","3942.0","4.0","0.0010147133434804667","0.0","0.0","4.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097982692262641665","https://twitter.com/gwern/status/1097982692262641665","If you don't want to browse through the dumps, you can see random face examples at https://t.co/dJEv46HxY8 https://t.co/T8JVSvPDqG","2019-02-19 22:14 +0000","26389.0","153.0","0.005797870324756527","2.0","1.0","9.0","6.0","117.0","0.0","12.0","6.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097981240572039168","https://twitter.com/gwern/status/1097981240572039168","@DrazHD Is there a kanji for that? Truly, magical moon runes.","2019-02-19 22:08 +0000","2605.0","3.0","0.0011516314779270633","0.0","0.0","3.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097981058648297472","https://twitter.com/gwern/status/1097981058648297472","@JohnJevereux https://t.co/Rgw46wyX9L","2019-02-19 22:07 +0000","218.0","5.0","0.022935779816513763","0.0","0.0","2.0","1.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097980700702269443","https://twitter.com/gwern/status/1097980700702269443","@ETDEUMPURITAS @oldmancalvin I hate to break it to you, but... 
Holo isn't real.","2019-02-19 22:06 +0000","4083.0","3.0","7.347538574577516E-4","0.0","0.0","2.0","1.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097980425711030272","https://twitter.com/gwern/status/1097980425711030272","@DrazHD Kanjis for what?","2019-02-19 22:05 +0000","2582.0","1.0","3.872966692486445E-4","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097980271960432640","https://twitter.com/gwern/status/1097980271960432640","@GelsameI I believe 'https://t.co/F3dfPFxas2' is doing a random lookup but also generating fresh ones on its GPU every few minutes. Me? I just generated 40k faces and if that's not enough for someone, they can go generate their own. ? Not spending $100/month on a GPU server for a joke!","2019-02-19 22:04 +0000","3682.0","66.0","0.017925040738728953","1.0","0.0","20.0","6.0","38.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097975349235855361","https://twitter.com/gwern/status/1097975349235855361","@GelsameI Thank you! Thank you, Danbooru. Good bye, StyleGAN. And to all the waifus—Congratulations!","2019-02-19 21:44 +0000","2137.0","20.0","0.009358914365933552","1.0","1.0","13.0","4.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097973862392844290","https://twitter.com/gwern/status/1097973862392844290","This Waifu Does Not Exist: https://t.co/dJEv46HxY8 (u mad?) 
https://t.co/slt7dsYgXT","2019-02-19 21:39 +0000","222603.0","6730.0","0.03023319541964843","127.0","41.0","384.0","290.0","2592.0","0.0","655.0","2.0","0","0","6","0","0","2633","2633","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097970606274953216","https://twitter.com/gwern/status/1097970606274953216","@lucidrains @crschmidt @beast_pixels (In particular, the bare domain name https://t.co/WRz6pMR9Uc is currently in a quantum superposition of either returning the index.html like it's supposed to or returning an S3 bucket listing *somehow*, depending on your geographic location.)","2019-02-19 21:26 +0000","292.0","5.0","0.017123287671232876","0.0","1.0","1.0","0.0","3.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097969323837140997","https://twitter.com/gwern/status/1097969323837140997","@lucidrains @crschmidt @beast_pixels The .net wasn't taken so I went with that: https://t.co/dJEv46HxY8 As I thought, writing & uploading it wasn't the problem, it's the DNS issues which drove me nuts.","2019-02-19 21:21 +0000","428.0","10.0","0.02336448598130841","0.0","1.0","2.0","0.0","7.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097938365767397377","https://twitter.com/gwern/status/1097938365767397377","@lucidrains @crschmidt @beast_pixels And now that I check, 'https://t.co/yeyXfjd1iZ' is already bought.","2019-02-19 19:17 +0000","321.0","5.0","0.01557632398753894","0.0","1.0","2.0","0.0","1.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097931463624982528","https://twitter.com/gwern/status/1097931463624982528","@lucidrains @crschmidt @beast_pixels I thought about it, you're not the first to suggest it, and it's simple to implement: it's just some JS to load '<img href=""./$RANDOMINT.jpg"">'. 
But then I was all like 'buying a domain name & setting this all up with S3 is a pain, meh, is it really *that* funny?'","2019-02-19 18:50 +0000","356.0","8.0","0.02247191011235955","0.0","1.0","2.0","0.0","0.0","0.0","5.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097925286157565952","https://twitter.com/gwern/status/1097925286157565952","@crschmidt @beast_pixels @lucidrains Cool. I'm hoping my writeup will enable even more people to mess around with StyleGAN too. Sometimes one just needs a good step by step guide to feel like something is possible & get started...","2019-02-19 18:26 +0000","390.0","5.0","0.01282051282051282","0.0","1.0","3.0","1.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097924963615625218","https://twitter.com/gwern/status/1097924963615625218","@artetxem From what I've read of those, I think their technical sophistication is a lot lower & reliant on far less exotic skillsets, and in any case, the cost of those infrastructures are a sunk cost.","2019-02-19 18:24 +0000","1402.0","1.0","7.132667617689016E-4","0.0","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097923991828873218","https://twitter.com/gwern/status/1097923991828873218","@JanelleCShane @CapyTannin @hpsacredtext It's obviously Harry's dead father. 
Given that, who can blame Harry for not wanting either the snake or Snape?","2019-02-19 18:20 +0000","397.0","4.0","0.010075566750629723","0.0","0.0","0.0","3.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097910716022038530","https://twitter.com/gwern/status/1097910716022038530","@artetxem In reality, I suspect that the fully-loaded cost of writing the training code and getting it running at a sufficient scale to eat up $43k of compute would be a lot more than the actual compute. People good at ML are not cheap. Even getting something like StyleGAN running is hard.","2019-02-19 17:28 +0000","2394.0","19.0","0.007936507936507936","0.0","1.0","8.0","6.0","0.0","0.0","4.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097910139686932488","https://twitter.com/gwern/status/1097910139686932488","@roadrunning01 It might help a little bit simply because it's a better model, but I don't expect a big difference. Hard to overcome a lack of data given the current architecture.","2019-02-19 17:25 +0000","652.0","3.0","0.004601226993865031","0.0","1.0","1.0","1.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097909860711235585","https://twitter.com/gwern/status/1097909860711235585","@crschmidt @beast_pixels @lucidrains Also watch out for file format. If you trained it on JPG, have to always use JPG, and likewise PNG.","2019-02-19 17:24 +0000","2214.0","2.0","9.033423667570009E-4","0.0","1.0","1.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097909515465474049","https://twitter.com/gwern/status/1097909515465474049","@crschmidt @beast_pixels @lucidrains Yes, you just drop it in. 
The key trick however is *set kimg* forward to 7000 or so to make it start at 512px training (I assume you're using 512px). If it starts earlier, seems to delete the trained layers (the fade-in zeroes them out, maybe).","2019-02-19 17:23 +0000","2345.0","5.0","0.0021321961620469083","0.0","1.0","3.0","1.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097902876402298883","https://twitter.com/gwern/status/1097902876402298883","@roadrunning01 FWIW, the wavy-line artifacts are I think a sign of StyleGAN overfitting. You might get better interpolations from an earlier checkpoint.","2019-02-19 16:56 +0000","1229.0","6.0","0.004882017900732303","0.0","1.0","1.0","3.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097901850131644417","https://twitter.com/gwern/status/1097901850131644417","@crschmidt @beast_pixels @lucidrains Are you doing transfer learning from rooms->kitchens? I think that'd be faster than training from scratch. Lots of similar shapes and colors.","2019-02-19 16:52 +0000","2300.0","3.0","0.0013043478260869566","0.0","1.0","2.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097869845285220353","https://twitter.com/gwern/status/1097869845285220353","@toad_spotted @macatbook @sonyaellenmann @vgr Amnesia implies anyone remembered it in the first place! 
It just didn't get passed on, I think, and a new generation grew up and reinvented it.","2019-02-19 14:45 +0000","600.0","10.0","0.016666666666666666","0.0","1.0","4.0","2.0","0.0","0.0","3.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097868884693184514","https://twitter.com/gwern/status/1097868884693184514","@toad_spotted @macatbook @sonyaellenmann @vgr It definitely looks suspiciously like publication bias. Some of it is striking: the whole movement of 'social impact bonds' is *literally reinventing* 'performance contracting' but I seem to be the first person to ever notice that.","2019-02-19 14:41 +0000","544.0","11.0","0.02022058823529412","0.0","1.0","8.0","0.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097866784743571460","https://twitter.com/gwern/status/1097866784743571460","@macatbook @toad_spotted @sonyaellenmann @vgr Sociology/polisci/history stuff doesn't get a lot of citations, though...","2019-02-19 14:33 +0000","1046.0","9.0","0.008604206500956023","0.0","1.0","4.0","1.0","0.0","0.0","3.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097866429750280193","https://twitter.com/gwern/status/1097866429750280193","@AdamDemirel @ggreer @draughtens When the net present value of visitors per hour spent on CSS/JS/HTML improvements exceeds the net present value of visitors per hour writing. 
:)","2019-02-19 14:32 +0000","194.0","1.0","0.005154639175257732","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097715774368727042","https://twitter.com/gwern/status/1097715774368727042","@pm The Moon is, of course, an absurd liberal myth: https://t.co/wWWqkzq1YF (Also good: https://t.co/2HRxTY5pKZ )","2019-02-19 04:33 +0000","1642.0","96.0","0.058465286236297195","0.0","0.0","5.0","3.0","83.0","0.0","5.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097705358276128769","https://twitter.com/gwern/status/1097705358276128769","@toad_spotted @sonyaellenmann @vgr I don't envy anyone who tries to do justice to a topic like this, though. Any one of those could be (and in some cases has been) a book all itself.","2019-02-19 03:52 +0000","1583.0","24.0","0.015161086544535692","0.0","1.0","9.0","13.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097705176029429762","https://twitter.com/gwern/status/1097705176029429762","@toad_spotted @sonyaellenmann @vgr It does seem like an under-served niche, doesn't it? 
You'd think there's be some good book on it but AFAIK there is no single book where you can read about Higher Horizons, the OEO performance contracting experiment, Head Start, MTO, Kansas City Desegregation etc & why mistaken.","2019-02-19 03:51 +0000","1557.0","28.0","0.01798330122029544","0.0","1.0","14.0","5.0","0.0","0.0","8.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097694271195566081","https://twitter.com/gwern/status/1097694271195566081","@jackclarkSF Well-chosen images are rarely a bad idea, assuming you have the time for it.","2019-02-19 03:08 +0000","1367.0","9.0","0.006583760058522311","0.0","1.0","1.0","5.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097664236967395328","https://twitter.com/gwern/status/1097664236967395328","@michael_nielsen ""100. We will never run out of things to program as long as there is a single program around."" —Alan Perlis, ""Epigrams in Programming"" (https://t.co/5Z1855TxGX)","2019-02-19 01:08 +0000","1494.0","17.0","0.011378848728246318","0.0","0.0","4.0","5.0","7.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097663308755357697","https://twitter.com/gwern/status/1097663308755357697","@hardmaru @roadrunning01's gonna have to download the Danbooru2018 SFW subset images+metadata & learn how to use nagadomi's face cropper if he wants to go beyond potato, though. From the sound of it, the Kaggle version has too few male faces if a character like Lelouch has only n~50...","2019-02-19 01:05 +0000","2791.0","7.0","0.0025080616266571123","0.0","1.0","4.0","0.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097662380165787648","https://twitter.com/gwern/status/1097662380165787648","@ZenSaiyuki @EricVBailey Look on the bright side. 
My Optimus laptop burned out while running char-RNN and corrupted the hard drive at what turned out to be a naked moment in my backups, costing me weeks of data.","2019-02-19 01:01 +0000","152.0","5.0","0.03289473684210526","0.0","1.0","2.0","2.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097658491559387137","https://twitter.com/gwern/status/1097658491559387137","@ggreer @draughtens @AdamDemirel I'll change the logo from a PNG to an SVG, which should help.","2019-02-19 00:45 +0000","698.0","2.0","0.0028653295128939827","0.0","0.0","1.0","0.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097656188693553152","https://twitter.com/gwern/status/1097656188693553152","@ZenSaiyuki @EricVBailey It is definitely not fun, but it's a lot easier on Ubuntu than other distros, and it's something you have to do if you want to run any DL so it's not just a StyleGAN/ProGAN barrier.","2019-02-19 00:36 +0000","186.0","3.0","0.016129032258064516","0.0","1.0","0.0","1.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097655762216722434","https://twitter.com/gwern/status/1097655762216722434","@sherjilozair @ryan_t_lowe @OpenAI @ChrSzegedy @goodfellow_ian Seems like a pretty major way your cute little analogy fails... 
That is precisely the whole point of 'responsible disclosure' rather than, you know, simply 'disclosure'.","2019-02-19 00:35 +0000","656.0","9.0","0.013719512195121951","0.0","0.0","0.0","8.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097644653648797697","https://twitter.com/gwern/status/1097644653648797697","@rseymour (And given the vast differences between then and now in computer & display technology and the general context-specificness of good design, I am extremely concerned by your assertion that 'the studies from the 70s/80s seem to keep being replicated'.)","2019-02-18 23:50 +0000","688.0","6.0","0.00872093023255814","0.0","1.0","1.0","1.0","0.0","0.0","3.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097644193101635584","https://twitter.com/gwern/status/1097644193101635584","@rseymour I recall one analysis concluding the percent of publications which are replications in HCI even generously construed is ~3%, implying little replication / vast filedrawer. Psychology's Reproducibility Crisis demonstrates, I hope, that some degree of 'replication' means little...","2019-02-18 23:49 +0000","693.0","3.0","0.004329004329004329","0.0","1.0","1.0","0.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097642036331724800","https://twitter.com/gwern/status/1097642036331724800","@Coinosphere @Rassah @LN_Master_Hub @BitcoinTina @YanivMeoded @WhalePanda The deleted/edited blog posts didn't come from either Craig or the source... 
I dug them out of the IA and Google Reader the hard way.","2019-02-18 23:40 +0000","408.0","5.0","0.012254901960784314","0.0","2.0","1.0","2.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097641673964818432","https://twitter.com/gwern/status/1097641673964818432","@ZenSaiyuki @EricVBailey ProGAN's source has been out since like May 2018 (and I started using it then), and StyleGAN is no easier to use. I would say its actual big advantage is that it's >10x faster & converges to higher quality. That's why the recent flurry. <1/10th is way more doable for hobbyists.","2019-02-18 23:39 +0000","194.0","4.0","0.020618556701030927","0.0","1.0","2.0","0.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097640327526207493","https://twitter.com/gwern/status/1097640327526207493","@rseymour https://t.co/LpLCdin91Y","2019-02-18 23:33 +0000","765.0","1.0","0.00130718954248366","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097639362161008641","https://twitter.com/gwern/status/1097639362161008641","@sherjilozair @ryan_t_lowe @OpenAI Spectre/Meltdown could be partially patched by editing microcode/VMs, and those patches distributed to users to install & reduce attack surfaces before its disclosure. 
What patch can I download to protect myself from SOTA LM abuse, and who should've been notified to write it?","2019-02-18 23:29 +0000","805.0","9.0","0.011180124223602485","0.0","1.0","0.0","7.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097632310864871425","https://twitter.com/gwern/status/1097632310864871425","@david_doswell (""Avoiding success at all costs"", eh?)","2019-02-18 23:01 +0000","597.0","2.0","0.0033500837520938024","0.0","0.0","1.0","1.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097629980148805632","https://twitter.com/gwern/status/1097629980148805632","They have, however, found the time to implement other things... https://t.co/QDzFXuPWzY","2019-02-18 22:52 +0000","20402.0","288.0","0.014116263111459661","2.0","2.0","24.0","8.0","10.0","0.0","53.0","0.0","0","0","0","0","0","189","189","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097607286732869632","https://twitter.com/gwern/status/1097607286732869632","@Bitcoin101 @EileenOrmsby We don't know! I think literally the last update anyone had about him whatsoever is that he got married a few years ago, supposedly. We don't know if he's alive or dead, where he is, what he's sentenced to, why his case is still open, etc.","2019-02-18 21:22 +0000","269.0","2.0","0.007434944237918215","0.0","0.0","2.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097605712178495489","https://twitter.com/gwern/status/1097605712178495489","After extensive HTML/CSS revisions, I'm pleased to note that support for lynx/elinks on https://t.co/LC5JQL86wv has considerably improved! (And it looks a little better on the toys that kids use these days too.) 
https://t.co/nfRbbBXqB6","2019-02-18 21:16 +0000","11851.0","342.0","0.028858324192051303","4.0","0.0","20.0","6.0","95.0","0.0","21.0","0.0","0","0","0","0","0","196","196","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097604525802835968","https://twitter.com/gwern/status/1097604525802835968","@EileenOrmsby Although Defcon would make a good appendix or something for the sheer '???' factor, and arguably as far as we know, he did 'get away with it'.","2019-02-18 21:11 +0000","272.0","4.0","0.014705882352941176","0.0","1.0","2.0","1.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097567534721703936","https://twitter.com/gwern/status/1097567534721703936","@KirkegaardEmil What's weird about it? It's a weak attempt at humor, although a few lines like 'his position softened' weren't too bad.","2019-02-18 18:44 +0000","432.0","10.0","0.023148148148148147","0.0","0.0","0.0","1.0","0.0","0.0","9.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097552098135289856","https://twitter.com/gwern/status/1097552098135289856","@KirkegaardEmil ""The Porter’s Log is a student-run satirical website which publishes cartoons and articles about life at Cambridge University.""","2019-02-18 17:43 +0000","987.0","8.0","0.008105369807497468","0.0","1.0","2.0","1.0","0.0","0.0","4.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097537429274271744","https://twitter.com/gwern/status/1097537429274271744","@xsteenbrugge @NvidiaAI (Could probably tweak StyleGAN itself to do the conversion automatically if it finds the model to be too small. 
After all, that's all progressive growing is: repeated net2net expansion.)","2019-02-18 16:44 +0000","1674.0","3.0","0.0017921146953405018","0.0","0.0","1.0","1.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097537197920661504","https://twitter.com/gwern/status/1097537197920661504","@xsteenbrugge @NvidiaAI It's all photographic imagery isn't it? Just has to be better than simple upscaling. & I meant for training from scratch. But code to convert between sizes shouldn't be hard. 512px->1024px = slap on some res layers initialized to zero; 1024px->512px, lop off some layers. Done.","2019-02-18 16:43 +0000","1639.0","2.0","0.0012202562538133007","0.0","1.0","0.0","1.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097523302363512834","https://twitter.com/gwern/status/1097523302363512834","@DrazHD @roadrunning01 My latest all-anime-faces StyleGAN model checkpoint: https://t.co/q94r3CIaMd","2019-02-18 15:48 +0000","708.0","26.0","0.03672316384180791","0.0","0.0","4.0","5.0","15.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097519997746847745","https://twitter.com/gwern/status/1097519997746847745","@roadrunning01 @hardmaru @tkasasagi (I wouldn't call that great, but it's probably better than it has any right to be at just n=50 given StyleGAN is not even intended for zero/few-shot learning...)","2019-02-18 15:35 +0000","2871.0","11.0","0.0038314176245210726","0.0","1.0","3.0","5.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097519038387941376","https://twitter.com/gwern/status/1097519038387941376","@DrazHD @roadrunning01 As I keep saying, a day on a 1080 should be more than enough for anime-face specialization (assuming you don't screw up the transfer 
learning). I don't think Colab is any faster, but it'd probably work since they give you enough hours to do specialization.","2019-02-18 15:31 +0000","408.0","2.0","0.004901960784313725","0.0","1.0","1.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097512800581271552","https://twitter.com/gwern/status/1097512800581271552","@roadrunning01 @DrazHD If he's loading a pretrained model, fakes00000.png should never show up because that means it's starting at the beginning: 8px, 0kimg, lod=5, and so on. If transfer was working, fakes00000.png would look like faces, not noise. Which is why I think something's wrong & erasing.","2019-02-18 15:06 +0000","444.0","5.0","0.01126126126126126","0.0","1.0","0.0","2.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097510594612248576","https://twitter.com/gwern/status/1097510594612248576","@DrazHD @roadrunning01 Are you starting at 512px (eg by setting resume_kimg=7000 in training_loop.py)? StyleGAN seems to erase all resolutions higher than the starting point for unknown reasons. Obviously, a very big problem for transfer learning.","2019-02-18 14:58 +0000","1384.0","4.0","0.002890173410404624","0.0","0.0","0.0","2.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097498929497862144","https://twitter.com/gwern/status/1097498929497862144","@xsteenbrugge @NvidiaAI For 1024px, you could use a super-resolution GAN like ESRGAN to upscale? 
Alternately, you could change the image progression budget to spend most of your time at 512px and then at the tail end try 1024px.","2019-02-18 14:11 +0000","3004.0","15.0","0.004993342210386152","0.0","1.0","6.0","8.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097493658444709888","https://twitter.com/gwern/status/1097493658444709888","@CoyneLloyd @TxdoHawk @sonyaellenmann @Tipsycaek I can believe that. Farm cats seem to avoid the biggest downsides of outdoorness in urban/suburban environments: cars and rival cats.","2019-02-18 13:50 +0000","157.0","6.0","0.03821656050955414","0.0","0.0","3.0","0.0","0.0","0.0","3.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097493163000909824","https://twitter.com/gwern/status/1097493163000909824","@GelsameI @highqualitysh1t @DeepHomage There's a Japanese guy who says you can run StyleGAN on AMDs but doesn't know if training works. I know someone who just bought an AMD GPU intending to try it on StyleGAN (I told him not to but he was seduced by the 16GB VRAM), but then he went on a _Factorio_ binge so dunno...","2019-02-18 13:48 +0000","1670.0","1.0","5.988023952095808E-4","0.0","0.0","0.0","1.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097331934622072832","https://twitter.com/gwern/status/1097331934622072832","@highqualitysh1t @DeepHomage Yes. Use 512px and you'll have good quality faces in a few days on 1-2 1080tis. Transfer learn from my anime-face model & it'll be more like a day or 2. Attached: 17 GPU-hours transfer-learning on 512px FFHQ. It's the last 10% of diversity/background detail which eats up weeks. 
https://t.co/mUbL9x9Whc","2019-02-18 03:08 +0000","2369.0","105.0","0.04432249894470241","0.0","1.0","6.0","4.0","8.0","0.0","4.0","0.0","0","0","0","0","0","82","82","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097320829183029248","https://twitter.com/gwern/status/1097320829183029248","@learn_learning3 What dataset are you training this StyleGAN on?","2019-02-18 02:24 +0000","3559.0","28.0","0.007867378477100308","0.0","1.0","1.0","20.0","0.0","0.0","6.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097318445321015296","https://twitter.com/gwern/status/1097318445321015296","@iurimatias @VitalikButerin @robinhanson @avsa No, just a funny training sample from the StyleGAN I'm running on full 512px Danbooru2018 images. Naturally, lots of figures/bodies.","2019-02-18 02:14 +0000","604.0","4.0","0.006622516556291391","0.0","0.0","1.0","0.0","0.0","0.0","3.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097317382924709888","https://twitter.com/gwern/status/1097317382924709888","@VitalikButerin @robinhanson @avsa StyleGAN-chan is concerned about this entire thread. https://t.co/KW7ov4BQGs","2019-02-18 02:10 +0000","2092.0","112.0","0.05353728489483748","0.0","1.0","6.0","8.0","3.0","0.0","19.0","1.0","0","0","0","0","0","74","74","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097284570532188160","https://twitter.com/gwern/status/1097284570532188160","@SereneBiologist @NatsuKazeOtoko @_Ryobot The important thing is that it is better now than it was before. 
Anyway, I'm working on a writeup of my own, so no big deal if some early coverage isn't perfect.","2019-02-18 00:00 +0000","353.0","3.0","0.0084985835694051","0.0","0.0","0.0","3.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097280334394245121","https://twitter.com/gwern/status/1097280334394245121","@jackclarkSF (And that's only the small one, imagine the big one! Or don't, if that's your preference.)","2019-02-17 23:43 +0000","1733.0","7.0","0.004039238315060588","0.0","0.0","3.0","2.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097277421735366657","https://twitter.com/gwern/status/1097277421735366657","@jackclarkSF @catherineols I hope you guys will consider giving access to antinegationism so they can more thoroughly explore the possibilities of GPT-2 for revolutionizing the writing of smut: https://t.co/MV3Ngom4Ri https://t.co/NAs4H1Ul5a","2019-02-17 23:31 +0000","1442.0","91.0","0.06310679611650485","0.0","1.0","8.0","2.0","76.0","0.0","4.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097276488016150529","https://twitter.com/gwern/status/1097276488016150529","Thanks.","2019-02-17 23:27 +0000","11560.0","45.0","0.0038927335640138406","0.0","0.0","7.0","2.0","0.0","0.0","36.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097269866724294657","https://twitter.com/gwern/status/1097269866724294657","@avgupta_me The amusing part - isn't it much much worse when everyone defaults to closed? 'Oh no, we're not worried about abuse or anything. We just don't want to bother sharing. Go pound salt.' 
The phrase 'Copenhagen ethics' (https://t.co/oHaiPAZJMR) comes to mind...","2019-02-17 23:01 +0000","802.0","14.0","0.017456359102244388","0.0","0.0","2.0","1.0","11.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097264422370852864","https://twitter.com/gwern/status/1097264422370852864","@avsa @FrancescoRenziA @KaiBakker ""Grayscale SVGs... a more elegant visual iconography, for a more civilized age.""","2019-02-17 22:39 +0000","280.0","1.0","0.0035714285714285713","0.0","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097249011256475649","https://twitter.com/gwern/status/1097249011256475649","@avsa @FrancescoRenziA @KaiBakker If you want icons, use the FIGR-8 icon dataset! https://t.co/INslWts1wW https://t.co/UqOAbytbUm","2019-02-17 21:38 +0000","329.0","9.0","0.02735562310030395","0.0","1.0","0.0","0.0","8.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097248698218790912","https://twitter.com/gwern/status/1097248698218790912","@roadrunning01 @hardmaru @tkasasagi That seems low quality. If this was finetuning, did you use my final face model and *start* at 512px res? 
I think I said that StyleGAN seems to wipe out higher layers if you start at any lower resolution when retraining a model to specialize it.","2019-02-17 21:37 +0000","3187.0","17.0","0.005334170065892689","0.0","2.0","2.0","5.0","0.0","0.0","8.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097142002683166722","https://twitter.com/gwern/status/1097142002683166722","""Proper tea is theft"" because development of tea cultivars, local growing knowledge, embodied processing skill, and reliable global trade networks for transport to end-consumers requires secure property rights & profit-motive for smallholders & traders to provide good tea.","2019-02-17 14:33 +0000","11814.0","28.0","0.0023700694091755544","0.0","2.0","8.0","3.0","0.0","0.0","15.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097139074811465728","https://twitter.com/gwern/status/1097139074811465728","@Saigarich @DrazHD You might think based on these samples that 'faces' is too far away from 'full anime images' to be useful and the faces are totally wiped out by retraining. 
But actually, it seems like they 'shrink' and move around to tops of bodies: https://t.co/8nzSOvQWY5","2019-02-17 14:21 +0000","16159.0","266.0","0.01646141469150319","0.0","1.0","9.0","4.0","8.0","0.0","14.0","0.0","0","0","0","0","0","230","230","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1097130852218257408","https://twitter.com/gwern/status/1097130852218257408","@makoConstruct But remember, even a private PI's ('Principal Investigator') gotta eat.","2019-02-17 13:49 +0000","198.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096994376998359040","https://twitter.com/gwern/status/1096994376998359040","@oofowieouch If you want a specific character, you need to take a bunch of images and retrain it. If you're fine with random uncontrolled characters and just want long interpolation videos, obviously you don't need to do anything, just use the model as is.","2019-02-17 04:46 +0000","283.0","4.0","0.014134275618374558","0.0","0.0","2.0","0.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096992933654470657","https://twitter.com/gwern/status/1096992933654470657","@oofowieouch Ah. Right now, it looks like you can get decent finetuning as long as you have >500 good-quality cropped face-shots of a single character or at least very similar-looking characters. 
See the Holo, Asuka, Zuikaku, Saber, & Louise interpolation videos demoing specialization.","2019-02-17 04:41 +0000","599.0","5.0","0.008347245409015025","0.0","1.0","2.0","1.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096992231859343360","https://twitter.com/gwern/status/1096992231859343360","@Saigarich @DrazHD The visual effects on some of these dataset transitions are amazing, like horror movie zombies or supernatural dolls. Here's the anime+FFHQ faces StyleGAN starting to train on Danbooru2018 512px SFW images: https://t.co/3t2wXOjXEM","2019-02-17 04:38 +0000","16940.0","254.0","0.01499409681227863","0.0","1.0","7.0","5.0","29.0","0.0","16.0","1.0","0","0","0","0","0","195","195","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096990018558660608","https://twitter.com/gwern/status/1096990018558660608","@oofowieouch Finetuning only works for faces. Any particular character's faces is already a face the face-StyleGAN knows how to generate reasonably well, so you're simply specializing it to specific kind of face. But knowing how to draw a face doesn't help much with, say, landscape drawings.","2019-02-17 04:29 +0000","629.0","4.0","0.006359300476947536","0.0","1.0","2.0","1.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096972332806144000","https://twitter.com/gwern/status/1096972332806144000","@privatepresh @jessesingal I've never written for Unz. Maybe you're thinking of someone else.","2019-02-17 03:19 +0000","269.0","5.0","0.01858736059479554","0.0","0.0","1.0","3.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096971946854699008","https://twitter.com/gwern/status/1096971946854699008","@__Sheik_ @razibkhan He's testing us. 
Don't fall for it.","2019-02-17 03:17 +0000","287.0","5.0","0.017421602787456445","0.0","0.0","1.0","3.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096965140598194178","https://twitter.com/gwern/status/1096965140598194178","@PellaCheyne @_Ryobot More finetuning from him (https://t.co/RC7wURD7He): - Saber (_Fate/Stay Night_): https://t.co/6srjiZxD96 - Louise (_Zero no Tsukaima_): https://t.co/MKmVs8f2QF","2019-02-17 02:50 +0000","992.0","38.0","0.038306451612903226","0.0","0.0","1.0","3.0","29.0","0.0","1.0","0.0","0","0","0","0","0","4","4","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096938380036849665","https://twitter.com/gwern/status/1096938380036849665","@david_doswell Writeup: https://t.co/UfYMj4m3x3","2019-02-17 01:04 +0000","180.0","4.0","0.022222222222222223","0.0","0.0","1.0","0.0","3.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096928144471351296","https://twitter.com/gwern/status/1096928144471351296","@david_doswell Bayesianly, seeing such an event would cause an update but that is more than canceled out by all of the previous & subsequent updates on *not* observing such events (as well as updating on more informative data like population statistics/base rates). It's like Hempel's raven.","2019-02-17 00:23 +0000","227.0","2.0","0.00881057268722467","0.0","1.0","0.0","1.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096923948678295554","https://twitter.com/gwern/status/1096923948678295554","@david_doswell People are always commanding, praying, wishing, and bargaining with tumors to go away. 
Oddly, doesn't usually work.","2019-02-17 00:07 +0000","219.0","5.0","0.0228310502283105","0.0","1.0","0.0","0.0","0.0","0.0","4.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096905043809390597","https://twitter.com/gwern/status/1096905043809390597","@david_doswell (Every once in a while while researching something, I realize Freeman Dyson is still alive and writing things, and I am shocked again.)","2019-02-16 22:51 +0000","365.0","2.0","0.005479452054794521","0.0","0.0","2.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096890662497206273","https://twitter.com/gwern/status/1096890662497206273","@david_doswell There's nothing 'reality-breaching' about a tumor being liquidated by the immune system or any of the other things that can happen to them.","2019-02-16 21:54 +0000","276.0","2.0","0.007246376811594203","0.0","1.0","0.0","0.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096884636897198080","https://twitter.com/gwern/status/1096884636897198080","@david_doswell The problem of course is that religious people are always going on about 'miracles' or 'messages' of the sort of 'and then the brain tumor vanished' or 'and just as I was praying for financial help, my sister called and offered a loan' or 'the Bible verse gave me winning tickets'","2019-02-16 21:30 +0000","311.0","1.0","0.003215434083601286","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096875260459302912","https://twitter.com/gwern/status/1096875260459302912","@david_doswell Eventually. 
But I'm pretty sure now that Freeman Dyson actually coined the law, and he was either misremembering & improving on what Littlewood actually wrote, or conflating Littlewood with ""Methods for Studying Coincidences"", Diaconis & Mosteller 1989 https://t.co/NbPMuio9kB","2019-02-16 20:53 +0000","596.0","7.0","0.01174496644295302","0.0","2.0","1.0","0.0","4.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096868692112101376","https://twitter.com/gwern/status/1096868692112101376","@sir_deenicus @metaviv @SuMastodon @ericjang11 You want to talk cost? A single GPU is like $0.2/h. Generating a sequence with GPT-2 is what, half a second? At how many models per GPU? Plus GPU? And how many iterations is that per dollar? Iterations which can be reused? Give me a break.","2019-02-16 20:27 +0000","240.0","2.0","0.008333333333333333","0.0","1.0","0.0","0.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096867249330233344","https://twitter.com/gwern/status/1096867249330233344","""It is a dark & cold day in the City of Letters. One lone gumshoe, Gwern Noir, private PI, pounds the pavement between the spires of Google Hall and Libgen Library, nursing a darker & colder question: did Littlewood invent Littlewood's Law of Miracles or... was he ???????""","2019-02-16 20:21 +0000","14786.0","60.0","0.004057892601109158","0.0","4.0","30.0","19.0","0.0","0.0","7.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096843775698182147","https://twitter.com/gwern/status/1096843775698182147","@sir_deenicus @metaviv @SuMastodon @ericjang11 Uh, the hit rate is poor because you're playing with the public model. And Colab demonstrates how easy it is for anyone to run. It's not black magic. It's some Python libraries on a VM. 
So easy & cheap Google can give it away for free, 'throttled' or not. Goalpost moving.","2019-02-16 18:48 +0000","290.0","3.0","0.010344827586206896","0.0","1.0","1.0","0.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096831481392041987","https://twitter.com/gwern/status/1096831481392041987","@sir_deenicus @metaviv @SuMastodon @ericjang11 *3 clicks*. *Now*. 'Attacks only get better.' And yes, generating words is a limit. Why do you think troll farms or 50 Cent brigades have to hire so much warm young flesh?","2019-02-16 17:59 +0000","537.0","1.0","0.00186219739292365","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096829038633648128","https://twitter.com/gwern/status/1096829038633648128","@matthew_d_green Works for ZPAQ!","2019-02-16 17:49 +0000","1517.0","6.0","0.003955174686882004","0.0","0.0","1.0","5.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096805323141193728","https://twitter.com/gwern/status/1096805323141193728","@pejmanjohn Not much I can do about Disqus. They're their own thing.","2019-02-16 16:15 +0000","322.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096802513913815040","https://twitter.com/gwern/status/1096802513913815040","@pejmanjohn Mm. Seems fine to me, but I also always seem to prefer longer lines than most people.","2019-02-16 16:04 +0000","322.0","1.0","0.003105590062111801","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096787410590019584","https://twitter.com/gwern/status/1096787410590019584","@anderssandberg ""But look - NAN-2 speaks like a human. 
Look at the samples I put on Github!"" NAN-2 [TPU pod with glowing red eye]: ""Nii-san? Nii-san? Nii-san? Nii-san? Nii-san? Nii-san? Nii-san? Nii-san? Nii-san? Nii-san? Nii-san? Nii-san? Nii-san?"" [Eddie pushes the power button] ""-an? Nii""","2019-02-16 15:04 +0000","730.0","10.0","0.0136986301369863","0.0","0.0","2.0","2.0","0.0","0.0","6.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096787237017137152","https://twitter.com/gwern/status/1096787237017137152","@anderssandberg ""Hah. I should have known. The most efficient computing hardware on this world ... it's the human brain, ???'? ??? TUCKER!!!"" ""You're just a grad student, Eddie. You don't understand. Google Brain demands results every year. On Arxiv. TPUs aren't free. I had to, heh.""","2019-02-16 15:03 +0000","743.0","12.0","0.016150740242261104","0.0","1.0","2.0","1.0","0.0","0.0","8.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096787018850410496","https://twitter.com/gwern/status/1096787018850410496","@anderssandberg ""To bring back your body, I'll even become a Google AIchemist. We'll look for the Philosopher's TPU... a card that lets you train models without cost, no matter their size, without a GPU cluster. It's not a legend! We can pass the Thermodynamic Gate, with reversible computing!""","2019-02-16 15:02 +0000","497.0","9.0","0.018108651911468814","0.0","1.0","5.0","1.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096786692051271680","https://twitter.com/gwern/status/1096786692051271680","@anderssandberg ""The Law of Equivalent Lunch: 'no performance without priors or data'. In those days... my brother & I believed that to be AIchemy's one and only truth."" --- ""We shouldn't have trained a WBE of Mother, brother! 
It's a forbidden algorithm!""","2019-02-16 15:01 +0000","392.0","5.0","0.012755102040816327","0.0","1.0","3.0","0.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096785867778199552","https://twitter.com/gwern/status/1096785867778199552","@the8472 If you have a trained tagger you can reverse-engineer the latents for better controllability, yeah. Some paper did that recently for other GANs.","2019-02-16 14:58 +0000","308.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096780736630321152","https://twitter.com/gwern/status/1096780736630321152","@the8472 Not sure what you mean. I wouldn't necessarily expect any human-made tag to correspond to each of the latent variables StyleGAN comes up with.","2019-02-16 14:37 +0000","402.0","2.0","0.004975124378109453","0.0","1.0","1.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096773971272519680","https://twitter.com/gwern/status/1096773971272519680","@realmforge @_Ryobot Well, it's learning a weird thing: training on just real faces, after having learned anime faces. So partway through training, it's an uneven hybrid of both.","2019-02-16 14:11 +0000","239.0","3.0","0.012552301255230125","0.0","0.0","1.0","0.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096773627444449281","https://twitter.com/gwern/status/1096773627444449281","@pejmanjohn I already did reduce the line length by restricting the body width. 
(Also added width-dependent line-heights.)","2019-02-16 14:09 +0000","349.0","2.0","0.0057306590257879654","0.0","1.0","0.0","0.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096773431352348675","https://twitter.com/gwern/status/1096773431352348675","@AnimeshKarnewar @elonmusk Yes.","2019-02-16 14:08 +0000","2039.0","3.0","0.0014713094654242277","0.0","0.0","0.0","2.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096602858936590336","https://twitter.com/gwern/status/1096602858936590336","@_Ryobot @lerner_adams Sequence-generation with GANs is still a big open problem AFAIK. Might not be necessary since self-supervised prediction works so well, as GPT-2 strikingly reminded us yesterday. (I also have a theory that prediction loss + RL finetuning like https://t.co/p4RD2UdsTp would work.)","2019-02-16 02:51 +0000","21606.0","54.0","0.002499305748403221","0.0","1.0","2.0","16.0","20.0","0.0","15.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096586827732058115","https://twitter.com/gwern/status/1096586827732058115","@LucreSnooker I think it looks nicer, and it's definitely associated with good typography. So it makes sense for me to use for the same reason I employ a Baskerville font variant, small caps, and old-style numerals.","2019-02-16 01:47 +0000","397.0","3.0","0.007556675062972292","0.0","0.0","0.0","0.0","0.0","0.0","3.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096581381549903874","https://twitter.com/gwern/status/1096581381549903874","@ntaylor963 I enjoy some of the reactions to having a distinct site design. One reader said it clicked for him one day when he visited a page and went - ""All this black and white! This austere design! This page length & refs! 
Wait - this is that site again, isn't it? Who is this guy?""","2019-02-16 01:25 +0000","546.0","7.0","0.01282051282051282","0.0","0.0","4.0","1.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096580422245457920","https://twitter.com/gwern/status/1096580422245457920","@_Ryobot Thanks. Machine translation from English to Japanese is too low-quality for me to try to use. But seriously, I should have initial results tomorrow from dual anime+real training - look forward to it!","2019-02-16 01:22 +0000","12933.0","72.0","0.0055671537926235215","2.0","1.0","10.0","44.0","0.0","0.0","15.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096579116667686913","https://twitter.com/gwern/status/1096579116667686913","@ntaylor963 It wouldn't make too much sense to spend a lot of time on site design years ago with a lot less site traffic, but I have a decent readership now and an experienced competent web designer helping for free, so - strike while the iron's hot!","2019-02-16 01:16 +0000","519.0","3.0","0.005780346820809248","0.0","1.0","2.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096578912992325632","https://twitter.com/gwern/status/1096578912992325632","@ntaylor963 Of course. 
I, ahem, justify all my CSS/JS work on https://t.co/LC5JQL86wv lately by saying it'll be amortized over millions of page views over the whole site over the next decade, as opposed to writing a single essay or analysis which only a fraction of readers will ever see.","2019-02-16 01:16 +0000","524.0","7.0","0.013358778625954198","0.0","1.0","0.0","0.0","5.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096576960208297984","https://twitter.com/gwern/status/1096576960208297984","@ntaylor963 I'm not sure I buy those bald assertions. HCI studies tend to be very low quality and when I have attempted to replicate them (for example, the NYT font experiment), I have not done so, nor have other people ('cognitive disfluency' failed recently IIRC).","2019-02-16 01:08 +0000","533.0","21.0","0.039399624765478425","0.0","1.0","2.0","0.0","0.0","0.0","18.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096575112164372481","https://twitter.com/gwern/status/1096575112164372481","@_Ryobot FWIW, I don't think my results there are very good. The psi & interpolations don't work for transforming faces. I'm working on training simultaneous anime+real, and that I think will make it learn anime<->real.","2019-02-16 01:00 +0000","14480.0","107.0","0.00738950276243094","3.0","2.0","20.0","45.0","0.0","0.0","37.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096573174848937985","https://twitter.com/gwern/status/1096573174848937985","MFW when @elonmusk-senpai finally notices me & my anime GANs but he actually noticed some other guy's tweets. 
https://t.co/B7iFbvbDIS","2019-02-16 00:53 +0000","30763.0","491.0","0.015960732048239768","13.0","3.0","110.0","166.0","4.0","0.0","168.0","21.0","0","0","1","0","0","5","5","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096572208460296193","https://twitter.com/gwern/status/1096572208460296193","@NatsuKazeOtoko Hey, could you correct your article https://t.co/GgtOahcWde ? @_Ryobot didn't make any of it. I trained the StyleGAN and made the videos. He merely retweeted it. (It's also spelled 'GAN', not 'GAD'.)","2019-02-16 00:49 +0000","4387.0","152.0","0.03464782311374515","0.0","2.0","22.0","23.0","92.0","0.0","13.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096570408919355394","https://twitter.com/gwern/status/1096570408919355394","Long story short, if you're upset that https://t.co/LC5JQL86wv now has nice justified text on Firefox but flush left-ragged right text on desktop Chrome/Chromium, go complain to the Big G about implementing the standard. I'm not adding a whole JS lib to do hyphenation manually.","2019-02-16 00:42 +0000","19150.0","173.0","0.009033942558746736","0.0","4.0","19.0","4.0","111.0","0.0","35.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096568507108679681","https://twitter.com/gwern/status/1096568507108679681","@AdamDemirel One day at a time...","2019-02-16 00:34 +0000","301.0","3.0","0.009966777408637873","0.0","2.0","0.0","0.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096566413341405184","https://twitter.com/gwern/status/1096566413341405184","@charlenedraws Don't worry. There'll always be a job for you in custom furry porn illustrations. 
?","2019-02-16 00:26 +0000","43616.0","8.0","1.8341892883345562E-4","0.0","0.0","3.0","4.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096566081265848320","https://twitter.com/gwern/status/1096566081265848320","Browsers are also problems. Why doesn't justification work? Chrome's misses simple text layout feature since 2010; fixed 2016 & never shipped because 'lack of way to distribute dictionary' https://t.co/o4S49SndYV ! Called 'packages', guys. Same way you distribute rest of Chrome.","2019-02-16 00:25 +0000","15393.0","44.0","0.0028584421490287794","0.0","2.0","7.0","0.0","15.0","0.0","20.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096564922400587776","https://twitter.com/gwern/status/1096564922400587776","@charlenedraws Yes. Random anime/manga/illustration-style art, I should say, cropped down automatically to faces. See https://t.co/xTAGQFTJTO or https://t.co/KbhAGMTqVL for representative samples of the original images.","2019-02-16 00:20 +0000","43635.0","172.0","0.003941789847599404","0.0","1.0","1.0","1.0","164.0","0.0","5.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096560260268113921","https://twitter.com/gwern/status/1096560260268113921","@vsync /blinking site-under-construction GIF","2019-02-16 00:01 +0000","155.0","2.0","0.012903225806451613","0.0","1.0","0.0","0.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096534954580852738","https://twitter.com/gwern/status/1096534954580852738","@corasundae Oh sure. GANs and other approaches have been doing frame interpolation for ages. That's easy supervised learning. IIRC, there are tons of approaches to frame interpolation, it doesn't need NNs to be a product. 
But you'd have to ask animators why or why not they use them.","2019-02-15 22:21 +0000","1977.0","1.0","5.058168942842691E-4","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096534512803160064","https://twitter.com/gwern/status/1096534512803160064","@Saigarich @DrazHD Yeah, that's one side-effect I'm hoping for from the combined dataset. I hope the anime faces will get better backgrounds, but that there will be 'style transfer' on real faces too. (A pity the dataset creation script is so slow!)","2019-02-15 22:19 +0000","269.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096532994460971008","https://twitter.com/gwern/status/1096532994460971008","@roadrunning01 @Saigarich @DrazHD You'd have to ask @SkyLi0n for the bodies, but whole-Danbooru2018 is still running at https://t.co/i2g8ZHTw8T IMO it is making progress but at 1 old GPU it's going to take months.","2019-02-15 22:13 +0000","2622.0","14.0","0.005339435545385202","0.0","0.0","0.0","6.0","7.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096532676528533507","https://twitter.com/gwern/status/1096532676528533507","@Catachresis11 @DrazHD As GPT-2's recipes instructed us yesterday, for the most delicious food to sup on: ""add a pinch of Sea"".","2019-02-15 22:12 +0000","1173.0","1.0","8.525149190110827E-4","0.0","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096530406676090880","https://twitter.com/gwern/status/1096530406676090880","@Saigarich @DrazHD They're definitely... different. There's a really creepy Western-doll-like effect halfway between real & anime. 
But then as it learns real faces better and forgets anime, much less interesting. The combined anime+real face dataset should give more interesting results. Soon... https://t.co/srf4UiPg3m","2019-02-15 22:03 +0000","19460.0","246.0","0.01264131551901336","0.0","4.0","6.0","4.0","14.0","0.0","14.0","0.0","0","0","0","0","0","203","204","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096528947565744128","https://twitter.com/gwern/status/1096528947565744128","@tawnkramer @gdb The paper includes some n-gram similarity metrics. (As for how much 'regurgitation' is too much, well, think about how many slogans and catchphrases we use, and how easily the next word could be predicted in so much real human writing...)","2019-02-15 21:57 +0000","5256.0","29.0","0.005517503805175038","0.0","1.0","7.0","18.0","0.0","0.0","3.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096528613237755904","https://twitter.com/gwern/status/1096528613237755904","@LFGeomatics @hbdchick @V4Analysis Somatic mutations. 
Very rare (~1000 per individual IIRC?), hence the need for extravagant levels of WGS to detect & distinguish them with legally-applicable probability.","2019-02-15 21:56 +0000","914.0","6.0","0.006564551422319475","0.0","0.0","5.0","1.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096525893286182912","https://twitter.com/gwern/status/1096525893286182912","@lamoustache @EileenOrmsby Because the others were throwaway troll accounts while NSWGreat was an actual Evo employee, seller, and official Reddit account.","2019-02-15 21:45 +0000","888.0","5.0","0.00563063063063063","0.0","1.0","1.0","1.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096510737017720833","https://twitter.com/gwern/status/1096510737017720833","@NeerajKA Cliff Stoll eventually did admit he was badly wrong: https://t.co/019sa3s14c (Note: original comment apparently was on Boing Boing but no matter how I try I can't find where in the page it is.)","2019-02-15 20:45 +0000","1333.0","26.0","0.019504876219054765","0.0","0.0","6.0","5.0","15.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096492031503974403","https://twitter.com/gwern/status/1096492031503974403","@DrazHD This interpolation video made me feel slightly ill, which is why I want to share it with you all. https://t.co/ZqcxR9RBsj","2019-02-15 19:30 +0000","41416.0","1235.0","0.029819393471122273","29.0","6.0","116.0","14.0","8.0","0.0","199.0","2.0","0","0","1","0","0","5253","860","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096483918960881665","https://twitter.com/gwern/status/1096483918960881665","@DrazHD I have made a terrible mistake 
https://t.co/DgBLqph65Z","2019-02-15 18:58 +0000","23212.0","1078.0","0.046441495778045835","6.0","1.0","35.0","5.0","88.0","0.0","66.0","0.0","0","0","0","0","0","877","877","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096474738229215233","https://twitter.com/gwern/status/1096474738229215233","@Zergfriend The Total Library of Borges has a lot of smut in it, but you do need to be looking for it.","2019-02-15 18:22 +0000","453.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096438792792223744","https://twitter.com/gwern/status/1096438792792223744","@Palfy37 @dirtybiology No, because now the faces won't be adjacent. It's like asking if you can shuffle all the frames in a movie and get a 'morphing effect'. Each frame is an independent image; it merely *looks* like a smooth video when similar frames are next to each other.","2019-02-15 15:59 +0000","49474.0","4.0","8.08505477624611E-5","0.0","0.0","2.0","0.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096408125022916608","https://twitter.com/gwern/status/1096408125022916608","@V4Analysis One hopes the Italian police are aware that you can distinguish identical twins with like 100x whole-genome sequencing and that's how other identical-twin crime pairs have been busted using DNA evidence.","2019-02-15 13:57 +0000","3119.0","37.0","0.011862776530939404","3.0","3.0","16.0","3.0","0.0","0.0","12.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096403244677586944","https://twitter.com/gwern/status/1096403244677586944","@GelsameI Mm, not necessarily that crazy. Any carefully-structured formal content description database will lend itself to ML. 
It's no more surprising than, say, WikiData or Semantic Web versions of Wikipedia being so useful for knowledge-graph work.","2019-02-15 13:37 +0000","224.0","1.0","0.004464285714285714","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096402773162246144","https://twitter.com/gwern/status/1096402773162246144","@EileenOrmsby He was the target, remember. One of the accounts was his, and the other accounts were included because they claimed to know about NSWGreat et al and to have possibly told me in PMs: https://t.co/oWpV5zyjIk","2019-02-15 13:36 +0000","1040.0","42.0","0.04038461538461539","0.0","1.0","1.0","4.0","34.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096402385516314626","https://twitter.com/gwern/status/1096402385516314626","@Zergfriend But is the lewdness in the NN which generated them or the NN which selected just those out? ?","2019-02-15 13:34 +0000","510.0","5.0","0.00980392156862745","0.0","1.0","0.0","2.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096402009647968256","https://twitter.com/gwern/status/1096402009647968256","@GelsameI Not much point. The boorus already scrape Pixiv heavily and add a lot of quality control & metadata.","2019-02-15 13:33 +0000","202.0","1.0","0.0049504950495049506","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096401516834025473","https://twitter.com/gwern/status/1096401516834025473","@Palfy37 @dirtybiology The former. It merely looks like they are 'morphed' because each face is generated from a very similar starting point. But there's no actual 'morphing'. 
If the GAN had done a bad job of learning faces, you'd see the morphing would 'jump' eg the failure in https://t.co/1P4RGtum8T","2019-02-15 13:31 +0000","49729.0","258.0","0.005188119608276861","0.0","1.0","4.0","3.0","246.0","0.0","4.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096400805748527104","https://twitter.com/gwern/status/1096400805748527104","@ReciteSocial @verge If I had known how viral that tweet would go, I would've come up with something wittier than 'solid'. I regret so many things in my life. Not the faces, though.","2019-02-15 13:28 +0000","25326.0","10.0","3.9485114111979786E-4","0.0","0.0","5.0","4.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096400005118783489","https://twitter.com/gwern/status/1096400005118783489","@ancap60539851 @EileenOrmsby @abcnews No. Reddit was subpoenaed in 2015, as part of the Evo investigation. But he was only just arrested now, apparently, 4 years later.","2019-02-15 13:25 +0000","186.0","2.0","0.010752688172043012","0.0","0.0","0.0","2.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096259665594339328","https://twitter.com/gwern/status/1096259665594339328","@citostyle @lexfridman No, I dumped them into a Reddit comment years ago, be hard to refind it. 
But they're not hard to find in Google Scholar, the obvious search terms about derivative copyright and database copyrights work.","2019-02-15 04:07 +0000","427.0","6.0","0.01405152224824356","0.0","1.0","1.0","1.0","0.0","0.0","3.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096259348412669952","https://twitter.com/gwern/status/1096259348412669952","@EileenOrmsby I won't lie, I feel a little schadenfreude here since NSWGreat got my Reddit account subpoenaed way back when: https://t.co/NFzQrzgTUs","2019-02-15 04:06 +0000","2346.0","180.0","0.07672634271099744","2.0","2.0","6.0","18.0","101.0","0.0","51.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096245403408887808","https://twitter.com/gwern/status/1096245403408887808","@sir_deenicus @SuMastodon @ericjang11 Anyone can run the software if someone packages it up. The Colab notebook is literally 3 clicks. (One click to enable 'Playground', 1 click to download the model, 1 click to run the interactive script.) And even if 'most people don't', those who *do* can create millions of words.","2019-02-15 03:10 +0000","1290.0","9.0","0.0069767441860465115","0.0","1.0","4.0","3.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096244045687459840","https://twitter.com/gwern/status/1096244045687459840","@rseymour @rajivpoc @gdb Ouch. 
But in any case you should've been suspicious: a Reddit self-post isn't an 'outbound link with >2 karma'.","2019-02-15 03:05 +0000","931.0","13.0","0.013963480128893663","0.0","1.0","3.0","2.0","0.0","0.0","7.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096233137733074945","https://twitter.com/gwern/status/1096233137733074945","@GelsameI @DrazHD Oh, I knew several people like @SkyLi0n had given it a try with CycleGAN and other variants, didn't know MGM had tried. The results were not good, though, which is why I'm optimistic about this StyleGAN approach. Fundamentally different from CycleGAN.","2019-02-15 02:22 +0000","736.0","7.0","0.009510869565217392","0.0","2.0","0.0","0.0","0.0","0.0","5.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096232540778782720","https://twitter.com/gwern/status/1096232540778782720","@rseymour @rajivpoc @gdb It is? I just googled it and the sole hit is from 4 hours ago, implying it was copied from the OA post: https://t.co/kM3xRXj8SE","2019-02-15 02:19 +0000","27531.0","332.0","0.01205913334059787","0.0","1.0","10.0","8.0","303.0","0.0","10.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096231992285376512","https://twitter.com/gwern/status/1096231992285376512","@KeepForeverr @ArthurB @ESYudkowsky Photographic flesh textures are easy mode for GANs. I bet his dataset would work just fine if he put some real compute in (1 K80 on a VM for a day or two?!) and used a modern GAN arch. (BEGAN? PGGAN???) 
Almost makes me want to show him how to do it right...","2019-02-15 02:17 +0000","260.0","2.0","0.007692307692307693","0.0","0.0","0.0","0.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096230242920861696","https://twitter.com/gwern/status/1096230242920861696","@GelsameI @DrazHD I don't recall anything like that. After MGM, I recall them doing very small 3D models, I think.","2019-02-15 02:10 +0000","715.0","2.0","0.002797202797202797","0.0","1.0","0.0","1.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096204285682036736","https://twitter.com/gwern/status/1096204285682036736","@_jameshatfield_ @kulpability @elonmusk But Elon didn't tweet Holo - he tweeted Kagerou! Of course, I can do a Kagerou finetuning if he wants...","2019-02-15 00:27 +0000","200.0","4.0","0.02","0.0","0.0","2.0","0.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096194803149217798","https://twitter.com/gwern/status/1096194803149217798","@katyanna_q A few paragraphs is amazing, though. With the char-RNNs, you were lucky if it wasn't semantic hash after 2 sentences. And this is with just a Transformer window of 1024 byte-pairs (a bit larger than individual characters) - which immediately raises the question of Transformer-XL.","2019-02-14 23:49 +0000","315.0","7.0","0.022222222222222223","0.0","0.0","2.0","1.0","0.0","0.0","4.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096174946139684866","https://twitter.com/gwern/status/1096174946139684866","@ValARed Just call it 'public outreach' or 'service hours'! 
?","2019-02-14 22:30 +0000","586.0","3.0","0.005119453924914676","0.0","0.0","1.0","1.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096162037628092416","https://twitter.com/gwern/status/1096162037628092416","@james_ough @ArtirKel Google seems to spider Reddit thoroughly. When I upload stuff like papers, sometimes I submit it to Reddit just to get it crawled quickly.","2019-02-14 21:39 +0000","844.0","12.0","0.014218009478672985","0.0","0.0","3.0","0.0","0.0","0.0","9.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096158439934513155","https://twitter.com/gwern/status/1096158439934513155","@ArtirKel Yeah. I fired up the Colab notebook very excited after reading the random samples text file and... those extra parameters make a big difference in bringing the quality up to the Uncanny Valley. A small absolute difference, but a big qualitative one...","2019-02-14 21:25 +0000","811.0","10.0","0.012330456226880395","0.0","1.0","5.0","1.0","0.0","0.0","3.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096154425507414017","https://twitter.com/gwern/status/1096154425507414017","@rishinair8 @GruPiotr @lexfridman Of course there is. The hard drive runs itself with a ton of firmware.","2019-02-14 21:09 +0000","411.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096144309278126081","https://twitter.com/gwern/status/1096144309278126081","More 'Bad Ganime'! 
Psi=1.2 on the latest face model, 1000 samples + interpolation videos: https://t.co/YdBQxLAbMq https://t.co/Nj5Jj65m2v","2019-02-14 20:29 +0000","35426.0","964.0","0.027211652458646193","0.0","1.0","11.0","5.0","222.0","0.0","61.0","2.0","0","0","0","0","0","662","662","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096131547126816768","https://twitter.com/gwern/status/1096131547126816768","@roadrunning01 @EldinhoC So what does another 5 days of training at 512px resolution buy? The worst latent points have gotten much better, shoulders/backgrounds markedly improved, and it's begun experimenting with fine details like adding glasses. https://t.co/zxCULLIUFD","2019-02-14 19:38 +0000","3800.0","258.0","0.06789473684210526","1.0","0.0","9.0","31.0","82.0","0.0","68.0","0.0","0","0","0","0","0","67","67","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096108474575003649","https://twitter.com/gwern/status/1096108474575003649","@DrazHD Wait until the face StyleGAN finishes training! (ETA: 5 days.) Current next plan is to redo Asuka/Holo, then drop a 512px version of FFHQ into the anime faces for further training. With any luck, StyleGAN will learn how to do real<->anime...","2019-02-14 18:06 +0000","22216.0","52.0","0.002340655383507382","2.0","2.0","12.0","18.0","0.0","0.0","18.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096108023582547968","https://twitter.com/gwern/status/1096108023582547968","@rishinair8 @GruPiotr @lexfridman Nope, doesn't work. 
You don't own the copyright to the text editor, OS, networking, hard drive, or any of the other things involved either.","2019-02-14 18:04 +0000","496.0","4.0","0.008064516129032258","0.0","1.0","1.0","0.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096101498210172928","https://twitter.com/gwern/status/1096101498210172928","@jeremyphoward @lemonodor @lexfridman Ah. Well, that would do it. But even if one agrees to transfer the copyright of any models trained on the data provided under the contract, the general principle probably doesn't hold (and Google and all tech companies should hope it doesn't).","2019-02-14 17:38 +0000","2304.0","5.0","0.002170138888888889","0.0","0.0","0.0","1.0","0.0","0.0","4.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096101128398356480","https://twitter.com/gwern/status/1096101128398356480","@rishinair8 @GruPiotr @lexfridman Does the manufacturer of your camera own the images you take with it? Does the writer of the text editor you wrote your blog post about those photographs own the copyright to your blog post? What about the writers of the OS and networking code and hard drive code used to send it?","2019-02-14 17:37 +0000","555.0","3.0","0.005405405405405406","0.0","1.0","0.0","0.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096100821257895936","https://twitter.com/gwern/status/1096100821257895936","@cowtung That gets you into the questions of revealed preferences and whether 'addiction' really exists and hyperbolic discounting etc. 
Regardless, video gaming addicts surely make up only a small percentage of all video-game hours at this point.","2019-02-14 17:36 +0000","588.0","4.0","0.006802721088435374","0.0","1.0","1.0","0.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096100271720140801","https://twitter.com/gwern/status/1096100271720140801","@gdb I assume by '25 tries' that means this is the best out of 25 random samples?","2019-02-14 17:34 +0000","46820.0","108.0","0.0023067065356685177","0.0","1.0","45.0","46.0","0.0","0.0","15.0","1.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096086628177723394","https://twitter.com/gwern/status/1096086628177723394","@golan @lemonodor @lexfridman (If the Maps API doesn't include any specified financial penalties for breach of contract, why should that matter? They have a remedy: cancel the contract and thereby revoke future access.)","2019-02-14 16:39 +0000","376.0","4.0","0.010638297872340425","0.0","1.0","0.0","1.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096086268566532096","https://twitter.com/gwern/status/1096086268566532096","@cowtung Can entertainment even be considered as 'QALYs lost'?","2019-02-14 16:38 +0000","618.0","4.0","0.006472491909385114","0.0","1.0","1.0","0.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096074916389109761","https://twitter.com/gwern/status/1096074916389109761","@lemonodor @lexfridman One can only assume that Google would never dare take that argument to trial as such a precedent would backfire spectacularly on them.","2019-02-14 15:53 +0000","2577.0","11.0","0.004268529297632906","0.0","1.0","4.0","1.0","0.0","0.0","5.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096062185309171712","https://twitter.com/gwern/status/1096062185309171712","I also sometimes think about this question: ""At what point in time have more man-years been spent playing WWII video games than were actually spent in combat in WWII?"" https://t.co/Wtdct8sVen","2019-02-14 15:02 +0000","15475.0","327.0","0.02113085621970921","8.0","4.0","59.0","9.0","216.0","0.0","31.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096059576842113027","https://twitter.com/gwern/status/1096059576842113027","@GruPiotr @lexfridman Based on cases like Naruto (https://t.co/CoXscLmItK) the reasoning might be that the owner is whoever is most proximate to causally initiating the generation+selection process.","2019-02-14 14:52 +0000","1335.0","21.0","0.015730337078651686","0.0","1.0","4.0","1.0","11.0","0.0","4.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096058942457856000","https://twitter.com/gwern/status/1096058942457856000","First Man, a movie about the Apollo program, cost >$60m; Apollo 13 cost >$85m; there are many movies. The Apollo cost $120b (https://t.co/HJRKleSlx3). Given the steady increase in real Hollywood movie budgets, at what point do Apollo movies cost more than Apollo?","2019-02-14 14:49 +0000","20425.0","143.0","0.007001223990208079","15.0","3.0","81.0","8.0","17.0","0.0","18.0","1.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096054997517770752","https://twitter.com/gwern/status/1096054997517770752","Q. Why can Paypal freeze accounts without recourse or explanation? A. 
Because long ago they decided users must accept their Terms of Service and the clause of Paypal infallibility.","2019-02-14 14:34 +0000","17361.0","83.0","0.0047808305973158225","4.0","1.0","31.0","8.0","0.0","0.0","39.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096053689473740801","https://twitter.com/gwern/status/1096053689473740801","@lexfridman I looked into this a while ago and apparently there's never been a clear precedent and the lawyers are still debating it in the journals. The de facto interpretation everyone seems to use is that the copyright is owned by the person who generated & selected the image.","2019-02-14 14:28 +0000","5901.0","102.0","0.01728520589730554","4.0","3.0","54.0","29.0","0.0","0.0","12.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096052765078507521","https://twitter.com/gwern/status/1096052765078507521","@Phantrosity /shrug. Random steps is how everyone does linear interpolations, and you can justify it handwavily by arguing that the GAN should be disentangling factors to provide a reasonably linear manifold. Seems to work...?","2019-02-14 14:25 +0000","386.0","2.0","0.0051813471502590676","0.0","1.0","0.0","0.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1096050976933453824","https://twitter.com/gwern/status/1096050976933453824","@ESYudkowsky '? / Arise, young boy / and become a legend! / ?'","2019-02-14 14:18 +0000","2281.0","19.0","0.008329679964927663","0.0","0.0","11.0","3.0","0.0","0.0","5.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095902969462177792","https://twitter.com/gwern/status/1095902969462177792","@crude2refined @soumithchintala 'intrinsic performance ratings' using chess engines? 
Yes, Ken Regan has a lot of this sort of analysis of objective chess-playing quality.","2019-02-14 04:30 +0000","697.0","3.0","0.00430416068866571","0.0","0.0","0.0","0.0","0.0","0.0","3.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095901662395412480","https://twitter.com/gwern/status/1095901662395412480","@xywaz It is surprising how well it works. And with such an odd architecture, in some ways. When I read the StyleGAN paper I was baffled by the decision to use *8* 512x fully-connected layers. But... it works? And yes, as it trains, even earrings are showing up!","2019-02-14 04:24 +0000","16895.0","37.0","0.00218999704054454","0.0","0.0","2.0","22.0","0.0","0.0","13.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095887086694080514","https://twitter.com/gwern/status/1095887086694080514","@xywaz It can be trained to draw men: https://t.co/kM4fSh8PQV","2019-02-14 03:26 +0000","24227.0","86.0","0.003549758533867173","0.0","1.0","3.0","29.0","20.0","0.0","32.0","1.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095880386457071617","https://twitter.com/gwern/status/1095880386457071617","@soumithchintala During AG, some asked what the best games ever were. Go people pointed at the marathon tournaments of the https://t.co/rA3d8y2F0r . 
I wonder if that accounts for any of the rise in the 1800s and then the eventual plateau & fall ~1900 (a delayed effect from the Meiji Restoration)?","2019-02-14 03:00 +0000","1822.0","34.0","0.018660812294182216","0.0","1.0","1.0","2.0","19.0","0.0","11.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095871405319749633","https://twitter.com/gwern/status/1095871405319749633","@TooBig2BeTrue @roadrunning01 (In general, you can say that very few people would write or design the ProGAN/StyleGAN codebase the way the original researchers did...)","2019-02-14 02:24 +0000","19272.0","5.0","2.5944375259443755E-4","0.0","0.0","0.0","5.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095871197118648321","https://twitter.com/gwern/status/1095871197118648321","@TooBig2BeTrue @roadrunning01 Most GANs are easy to run with commandline options. ProGAN/StyleGAN are highly unusual in that they have no CLI interface. You have to edit training/training_loop.py to change (most) hyperparameters although some are reset in https://t.co/spiKGnBWDy so you have to edit it too.","2019-02-14 02:23 +0000","19408.0","16.0","8.244023083264633E-4","0.0","1.0","0.0","1.0","12.0","0.0","1.0","1.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095851800115191810","https://twitter.com/gwern/status/1095851800115191810","@Jon_Christian ""...Its social accountability seems sort of like that of designers of military weapons: unculpable right up until they get a little too good at their job."" --DFW, ""E unibus pluram: television and U.S. 
fiction""","2019-02-14 01:06 +0000","634.0","5.0","0.007886435331230283","0.0","0.0","2.0","0.0","0.0","0.0","3.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095840842919620608","https://twitter.com/gwern/status/1095840842919620608","@deeppomf @EvaMonkey Nope, still no idea. I am hoping it is related to incomplete training of the all-face StyleGAN being retrained and that once it finishes training and I redo Asuka & Holo, the horrible noise artifacts will have simply vanished.","2019-02-14 00:23 +0000","637.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095830401766051844","https://twitter.com/gwern/status/1095830401766051844","Holo pretrained StyleGAN model: https://t.co/CnswcEKerx","2019-02-13 23:41 +0000","20862.0","190.0","0.009107468123861567","0.0","1.0","10.0","6.0","135.0","0.0","37.0","1.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095816711851397120","https://twitter.com/gwern/status/1095816711851397120","@roadrunning01 The Asuka StyleGAN pretrained model: https://t.co/yHrxc3A64W","2019-02-13 22:47 +0000","1677.0","14.0","0.008348240906380441","0.0","1.0","3.0","3.0","6.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095811993905160192","https://twitter.com/gwern/status/1095811993905160192","I say they're bad but many are legitimately cool or interesting. Setting psi=1.2 just makes the quality range huge... eg these 4 are striking. (BTW @EvaMonkey have you been looking at any of these Asuka samples? So lulzy.) 
https://t.co/BH8MPe9Sn7","2019-02-13 22:28 +0000","39934.0","1314.0","0.032904292081935196","0.0","2.0","26.0","26.0","206.0","0.0","73.0","2.0","0","0","0","0","0","979","979","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095809330052562944","https://twitter.com/gwern/status/1095809330052562944","@safari_eyes I think they're metaphors for themselves.","2019-02-13 22:17 +0000","629.0","1.0","0.001589825119236884","0.0","0.0","0.0","0.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095806059124535296","https://twitter.com/gwern/status/1095806059124535296","Someone asked for bizarre & bad samples, so here's ~1000 Asukas with the truncation cranked up to psi=1.2: https://t.co/7Fjx0u8fKa Enjoy, I guess?","2019-02-13 22:04 +0000","37956.0","398.0","0.010485825692907578","1.0","2.0","5.0","1.0","307.0","0.0","82.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095783484906528768","https://twitter.com/gwern/status/1095783484906528768","@DrazHD The scripts hardcode the random seeds for reproducibility. 
You can set the seeds randomly to get random results each time, eg in 'pretrained_example.py': replace 'rnd = np.random.RandomState(5)' with 'rnd = np.random.RandomState(None) # seeds from system randomness'","2019-02-13 20:35 +0000","21731.0","20.0","9.203442087340665E-4","0.0","1.0","4.0","8.0","0.0","0.0","7.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095782665784053765","https://twitter.com/gwern/status/1095782665784053765","@ping0x89 @SteveBellovin And fittingly, Newport's essay opens with Knuth.","2019-02-13 20:32 +0000","502.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095758972890562568","https://twitter.com/gwern/status/1095758972890562568","@pjie2 @WiringTheBrain @HankGreelyLSJU There is no inconsistency there. Polygenic selection for intelligence is already doable, and the additional technologies coming down the pipeline deliver far larger and more important gains. But thanks for admitting I was right about standing variation and my analysis as well.","2019-02-13 18:57 +0000","188.0","4.0","0.02127659574468085","0.0","0.0","1.0","0.0","0.0","0.0","3.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095757748921339904","https://twitter.com/gwern/status/1095757748921339904","People keep asking about reversing StyleGAN to control/generate variants on an existing natural image. 
This right now seems to be the best thing available: https://t.co/Z6vDrJOrb3","2019-02-13 18:53 +0000","30080.0","213.0","0.0070811170212765956","0.0","0.0","11.0","6.0","174.0","0.0","22.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095733669216616454","https://twitter.com/gwern/status/1095733669216616454","@pjie2 @WiringTheBrain @HankGreelyLSJU So we've gone from 'impossible no matter how many' to 'oh yeah it's possible if we treat humans like spiders' and we've gone from 'I'll be astonished if it ever happens' to 'oh yeah there's lots of ways, it's just hard and there are problems'. I see.","2019-02-13 17:17 +0000","323.0","5.0","0.015479876160990712","0.0","1.0","1.0","3.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095709551536558081","https://twitter.com/gwern/status/1095709551536558081","@pjie2 @WiringTheBrain @HankGreelyLSJU Gametogenesis is already done in the lab for rats/mice. Eggs can be harvested by biopsy, yielding hundreds or thousands of eggs without the limits of standard oocyte harvesting. The cattle people are already planning to do iterated embryo selection after decades of speculation.","2019-02-13 15:41 +0000","363.0","4.0","0.011019283746556474","0.0","1.0","1.0","2.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095704291938091011","https://twitter.com/gwern/status/1095704291938091011","@pjie2 @WiringTheBrain @HankGreelyLSJU You said 'no matter how many'. 
You have just correctly explained why you were wrong.","2019-02-13 15:20 +0000","429.0","4.0","0.009324009324009324","0.0","1.0","2.0","0.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095695351565168640","https://twitter.com/gwern/status/1095695351565168640","@kame_123456789 I don't have 8 V100 GPUs either, but my 2x1080ti GPUs are enough for great anime faces. You are overestimating it.","2019-02-13 14:45 +0000","137.0","2.0","0.014598540145985401","0.0","0.0","0.0","0.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095688887039475712","https://twitter.com/gwern/status/1095688887039475712","@pjie2 @WiringTheBrain @HankGreelyLSJU No. The standing variation within you and your wife is enough to do that. Think about how many thousands of height-affecting variants there are, and their population frequencies, and what happens if you maximize by selection rather than average out. 'paradox of polygenicity'.","2019-02-13 14:19 +0000","364.0","6.0","0.016483516483516484","0.0","1.0","1.0","2.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095685832030277634","https://twitter.com/gwern/status/1095685832030277634","An experimental run of StyleGAN on the full Danbooru2018 SFW 512px image corpus is pretty trippy-looking so far. Wonder if the initialization from the faces StyleGAN is why it looks so Boschian... 
https://t.co/o0BNTceakz","2019-02-13 14:07 +0000","33597.0","808.0","0.02404976634818585","12.0","1.0","30.0","11.0","241.0","0.0","71.0","2.0","0","0","0","0","0","440","440","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095497587426291713","https://twitter.com/gwern/status/1095497587426291713","@hardmaru @tkasasagi ('1boy solo' so you can crop out the face with nagadomi's script while being sure it's a guy, and then 50k << 220k but if you start with my all-faces StyleGAN pretrained model, it's *probably* enough, since a lot of anime faces are kinda androgynous anyway.)","2019-02-13 01:39 +0000","4821.0","15.0","0.0031113876789047915","0.0","1.0","4.0","6.0","0.0","0.0","4.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095497025616003072","https://twitter.com/gwern/status/1095497025616003072","@antonioregalado I dastn't. I'm sure dozens of people have done that one before and better.","2019-02-13 01:36 +0000","459.0","4.0","0.008714596949891068","0.0","0.0","1.0","0.0","0.0","0.0","3.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095496443698339843","https://twitter.com/gwern/status/1095496443698339843","@hardmaru @tkasasagi It's doable. Despite the dataset bias towards females, there should still be something like 50k '1boy solo' tagged images you could use for males. That should be enough. Be better if you could train an anime-face gender classifier and then just filter through all faces, though.","2019-02-13 01:34 +0000","5188.0","104.0","0.020046260601387818","2.0","1.0","3.0","16.0","0.0","0.0","82.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095493592737607680","https://twitter.com/gwern/status/1095493592737607680","??????? ??? 
??????: two unemployed geometers, unsure where to go, struggle to find their place and meaning in life, but wind up returning to where they started at the end of the play.","2019-02-13 01:23 +0000","14212.0","36.0","0.0025330706445257528","1.0","2.0","18.0","0.0","0.0","0.0","15.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095465999858573312","https://twitter.com/gwern/status/1095465999858573312","@wnjhwng @QuantNoVa @_Ryobot @mrkrabs173 https://t.co/SEIRxerM4S","2019-02-12 23:33 +0000","1344.0","20.0","0.01488095238095238","0.0","0.0","0.0","5.0","14.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095465552506695680","https://twitter.com/gwern/status/1095465552506695680","@ovchinnikov @pnin1957 Whups: https://t.co/CZd0Hl3xBL","2019-02-12 23:31 +0000","2110.0","81.0","0.03838862559241706","0.0","1.0","5.0","0.0","74.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095464936959082496","https://twitter.com/gwern/status/1095464936959082496","Hi everyone thanks for reading my tweet! Be sure to check out my Soundcloud and buy the t-shirt: https://t.co/QuZ0Bp9AlT","2019-02-12 23:29 +0000","163838.0","480.0","0.002929723263223428","2.0","0.0","15.0","66.0","366.0","0.0","31.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095446114722041856","https://twitter.com/gwern/status/1095446114722041856","@hbdchick Note the rhetorical trick with their neologism of 'embryo profiling' - it already has well-known and accepted names ('PGD' or 'embryo selection'). 
But 'profiling' has nasty connotations to the public, so they're trying to push it instead and also hide how common PGD already is.","2019-02-12 22:14 +0000","3826.0","54.0","0.014113957135389441","8.0","0.0","25.0","10.0","0.0","0.0","10.0","1.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095437379329028096","https://twitter.com/gwern/status/1095437379329028096","@fche @pnin1957 On the contrary, 'borrowing strength' is all about boosting signal and correctly estimating what would be spuriously extreme point-estimates. It makes all estimates better and more precise.","2019-02-12 21:39 +0000","330.0","6.0","0.01818181818181818","0.0","0.0","1.0","4.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095425021554642951","https://twitter.com/gwern/status/1095425021554642951","@fche @pnin1957 No. You just shrink them back based on the standard error and group variance (which given the obvious homogeneity of the results, will shrink them a lot). In this case, they didn't, but they should've. Not much point in generating meta-analytic estimates if you don't use them.","2019-02-12 20:50 +0000","787.0","6.0","0.007623888182973317","0.0","1.0","1.0","1.0","0.0","0.0","3.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095419336397856779","https://twitter.com/gwern/status/1095419336397856779","@fche @pnin1957 The mean forms a prior to which individual imprecisely-estimated datapoints must be shrunk back. 
Since the global prior strongly indicates near-zero, when you select the highest estimates, with wide error bars they are probably just outliers and will regress to the mean.","2019-02-12 20:28 +0000","752.0","1.0","0.0013297872340425532","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095410303876116480","https://twitter.com/gwern/status/1095410303876116480","@fche @pnin1957 You mean those datapoints which will shrink way back to the global mean of ~0 on replication?","2019-02-12 19:52 +0000","1038.0","9.0","0.008670520231213872","0.0","1.0","2.0","2.0","0.0","0.0","4.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095405935919534081","https://twitter.com/gwern/status/1095405935919534081","@pnin1957 Another example of the metallic laws: https://t.co/sTa9yxfR8b","2019-02-12 19:35 +0000","2805.0","86.0","0.030659536541889482","0.0","1.0","3.0","5.0","74.0","0.0","3.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095405315821047809","https://twitter.com/gwern/status/1095405315821047809","@fche @pnin1957 How do you figure that?","2019-02-12 19:32 +0000","1154.0","2.0","0.0017331022530329288","0.0","1.0","0.0","1.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095396501348667394","https://twitter.com/gwern/status/1095396501348667394","@PellaCheyne @_Ryobot On the bright side, if you can scrape up ~5k faces of any specific character (possibly less, haven't tried), you can easily retrain the full StyleGAN to a single character within a few hours. Worked well for Holo & Asuka. 
So if you really want a Zero Two StyleGAN, it's doable.","2019-02-12 18:57 +0000","17564.0","22.0","0.0012525620587565474","0.0","2.0","5.0","9.0","0.0","0.0","6.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095394861564551168","https://twitter.com/gwern/status/1095394861564551168","@QuantNoVa @_Ryobot @mrkrabs173 Train it yourself :) I've provided the Danbooru2018 dataset and the trained model for download already...","2019-02-12 18:51 +0000","1821.0","7.0","0.003844041735310269","0.0","1.0","0.0","6.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095394528473874436","https://twitter.com/gwern/status/1095394528473874436","@moruganribieru @_Ryobot The faces were made using the Danbooru2017 version of https://t.co/sgxDdGvfML","2019-02-12 18:49 +0000","16863.0","155.0","0.009191721520488643","0.0","0.0","8.0","14.0","119.0","0.0","14.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095394269836337153","https://twitter.com/gwern/status/1095394269836337153","@PellaCheyne @_Ryobot This was trained on Danbooru2017. As the name indicates, it contains images from 2017 and earlier. 
_Darling_ was a 2018 anime, so...","2019-02-12 18:48 +0000","18002.0","21.0","0.001166537051438729","0.0","1.0","4.0","10.0","0.0","0.0","6.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095392606194987009","https://twitter.com/gwern/status/1095392606194987009","@DanFrederiksen2 ~220k","2019-02-12 18:42 +0000","3955.0","5.0","0.0012642225031605564","0.0","0.0","0.0","4.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095380600142336004","https://twitter.com/gwern/status/1095380600142336004","@ArtirKel Remember to ignore any study which looks even a little bit like a candidate-gene study. They're particularly bad in the drug response area.","2019-02-12 17:54 +0000","1156.0","7.0","0.006055363321799308","0.0","1.0","4.0","2.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095369892952502273","https://twitter.com/gwern/status/1095369892952502273","@SelfReflective The https://t.co/Zp0iyGIZt6 in the StyleGAN repo implements several interpolations. 
The above interps were done using similar code in https://t.co/MQw7SlcAZp","2019-02-12 17:11 +0000","92844.0","228.0","0.0024557321959415793","1.0","2.0","6.0","7.0","210.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095364831295799296","https://twitter.com/gwern/status/1095364831295799296","@Ali_Design_1 https://t.co/MQw7SlcAZp","2019-02-12 16:51 +0000","58947.0","208.0","0.003528593482280693","1.0","0.0","7.0","1.0","198.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095348397731704832","https://twitter.com/gwern/status/1095348397731704832","@Fpga18Gansn1230 https://t.co/sgxDdGvfML","2019-02-12 15:46 +0000","497.0","10.0","0.02012072434607646","0.0","1.0","1.0","4.0","4.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095343362389884928","https://twitter.com/gwern/status/1095343362389884928","@SelfReflective Yes. The GAN learns to use a random number as a 'description' of the image (a latent embedding). 
Then to generate the interpolations, you start with a random number, feed it in, tweak the random number a little bit, feed it in...","2019-02-12 15:26 +0000","93449.0","12.0","1.2841228905606267E-4","0.0","1.0","4.0","7.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095335185921392641","https://twitter.com/gwern/status/1095335185921392641","@KittJohnson_ There were many very cool samples; unfortunately, I had to delete most of them because there was an artifact or the background just wasn't convincing and they might be noticed by an /r/evangelion Redditor.","2019-02-12 14:53 +0000","920.0","10.0","0.010869565217391304","0.0","1.0","2.0","4.0","0.0","0.0","3.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095331623736758273","https://twitter.com/gwern/status/1095331623736758273","~100 of my best, hand-selected Holo StyleGAN samples: https://t.co/WStllQyzeA Also, ~130 of my best Asuka Evangelion samples: https://t.co/SUHwtFJPKV","2019-02-12 14:39 +0000","40726.0","1023.0","0.025119088542945537","9.0","3.0","42.0","17.0","854.0","0.0","98.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095320579786395649","https://twitter.com/gwern/status/1095320579786395649","@william_woof Eh. The faces are so high-quality right now that I'm not sure heroic efforts in data-cleaning are necessary. What I want to try next is scaling to full Danbooru2017 images, not just faces.","2019-02-12 13:55 +0000","3641.0","5.0","0.0013732491073880802","0.0","0.0","2.0","1.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095318350337589248","https://twitter.com/gwern/status/1095318350337589248","@Saigarich Right. 
And IMO, their results, while excellent within that very small domain, were unfortunately highly restricted. On the other hand, my StyleGAN samples are bonkers varied. You really have to go through a few thousand samples by hand to appreciate how varied it is.","2019-02-12 13:46 +0000","3100.0","3.0","9.67741935483871E-4","0.0","0.0","2.0","0.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095317957931134981","https://twitter.com/gwern/status/1095317957931134981","I have a confession. When I submitted my best Asuka & Holo samples to Reddit (https://t.co/SUHwtFseTn https://t.co/WStllQgYn2), I may not have been entirely candid about their nature: - https://t.co/TvNKHI1KB0 - https://t.co/GaUfXGZbVv https://t.co/90ksgE1kJK ?( ?¡ã ?? ?¡ã)?","2019-02-12 13:45 +0000","40689.0","741.0","0.0182113101821131","4.0","0.0","14.0","6.0","635.0","0.0","81.0","1.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095316740177907712","https://twitter.com/gwern/status/1095316740177907712","@Rationalgaze2 https://t.co/Po709g4TTa","2019-02-12 13:40 +0000","50599.0","31.0","6.126603292555189E-4","0.0","0.0","0.0","22.0","3.0","0.0","0.0","6.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095316399826890752","https://twitter.com/gwern/status/1095316399826890752","@Zhuan_Xia @_Ryobot Yes. 
roadrunner01's using my model, so I'm pretty familiar with how the interpolation looks.","2019-02-12 13:39 +0000","26934.0","56.0","0.0020791564565233534","0.0","0.0","7.0","42.0","0.0","0.0","7.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095316223196311552","https://twitter.com/gwern/status/1095316223196311552","@DanielSMatthews May be some errors in the GAN (no self-attention) but heterochromia really is common in anime characters: https://t.co/OQxobic8EI","2019-02-12 13:38 +0000","10154.0","56.0","0.005515067953515856","0.0","0.0","2.0","9.0","45.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095315979318505472","https://twitter.com/gwern/status/1095315979318505472","@WMiggg 1. ~220k 2. ~8 GPU-days (but diminishing returns - results are great after just ~4)","2019-02-12 13:37 +0000","19041.0","32.0","0.001680584002941022","2.0","2.0","5.0","4.0","0.0","0.0","19.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095315534160293893","https://twitter.com/gwern/status/1095315534160293893","@william_woof https://t.co/21DtraGy1N IMO, I dislike how it cuts off so much of the top of the head. 
If I were taking a week to do all the cropping again, I'd fix that part.","2019-02-12 13:35 +0000","3678.0","14.0","0.0038064165307232192","0.0","1.0","0.0","0.0","13.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095315088456798209","https://twitter.com/gwern/status/1095315088456798209","@T_ahura54 @ali_fzl95 Behold in awe you obsolete monkey the majesty that is StyleGAN: https://t.co/eUn2cQ1w6Y https://t.co/lu2a9Wj2hJ https://t.co/SEIRxerM4S","2019-02-12 13:34 +0000","4327.0","89.0","0.020568523226253757","0.0","1.0","5.0","1.0","78.0","0.0","4.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095314624755523584","https://twitter.com/gwern/status/1095314624755523584","@Saigarich The problem is a specific artist doesn't draw nearly enough for a standard GAN. When I tried that with ProGAN, it couldn't do any better than memorize the Holo faces. Too few. And I didn't have the compute to train ProGAN on all 220k faces. StyleGAN is so much faster that I can.","2019-02-12 13:32 +0000","3104.0","1.0","3.2216494845360824E-4","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095174308220030977","https://twitter.com/gwern/status/1095174308220030977","@rhyolight @alanyttian @goodfellow_ian Considering the face cropping script and errors and noise and that a lot of these had to be upscaled by waifu2x to 512px... 
Actually, it may not be all that far off.","2019-02-12 04:14 +0000","2734.0","10.0","0.0036576444769568397","0.0","0.0","3.0","4.0","0.0","0.0","3.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095170531651915776","https://twitter.com/gwern/status/1095170531651915776","Alright ya filthy animals, have some more videos (Holo/_Spice & Wolf_): - https://t.co/KOsDUWXg5z - https://t.co/R8jpNh0fcH - https://t.co/NpG91iaAec https://t.co/ooPiXTAxrQ","2019-02-12 03:59 +0000","206640.0","6279.0","0.03038617886178862","123.0","7.0","288.0","128.0","663.0","0.0","275.0","15.0","0","0","0","0","0","27583","4780","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095153869259239424","https://twitter.com/gwern/status/1095153869259239424","@roadrunning01 Here's a more recent model: https://t.co/FQccSKNqRl You can use it in the Colab notebook by changing the URL to 'https://t.co/qC3BmZy1AF' (Google Drive direct download links are tricky to figure out...)","2019-02-12 02:53 +0000","168561.0","1121.0","0.006650411423757572","16.0","1.0","42.0","58.0","955.0","0.0","47.0","0.0","0","0","2","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095149265163292673","https://twitter.com/gwern/status/1095149265163292673","@fabricatedmath It uses many noise vectors at multiple levels as the 'style' vectors. I couldn't tell you what kind of high-dimensional geometry it would be, or what speed the interpolations move around in. 
You'd have to check the script for the latter one, I guess.","2019-02-12 02:35 +0000","1195.0","2.0","0.0016736401673640166","0.0","0.0","0.0","0.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095147387683782656","https://twitter.com/gwern/status/1095147387683782656","@fabricatedmath ?","2019-02-12 02:27 +0000","1174.0","3.0","0.002555366269165247","0.0","1.0","0.0","0.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095146865975312384","https://twitter.com/gwern/status/1095146865975312384","@fabricatedmath This comes up regularly, but heterochromia is not necessarily wrong, and strictly speaking lighting/angles means all characters should be a little differently eye-colored.","2019-02-12 02:25 +0000","22414.0","12.0","5.353796734183992E-4","0.0","0.0","2.0","4.0","0.0","0.0","6.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095145633189638144","https://twitter.com/gwern/status/1095145633189638144","@FrancisVeeGee NNs are fundamentally asymmetric like that. Weeks or months to train, but then it usually takes <100ms to run them a single time. So here there's 4 frames/faces being combined, maybe 30 frames a second, 10s, so 4 * 30 * 10 * 0.1 = 120s ballpark.","2019-02-12 02:20 +0000","134630.0","42.0","3.1196612939166606E-4","0.0","0.0","19.0","9.0","0.0","0.0","14.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095145010268422145","https://twitter.com/gwern/status/1095145010268422145","@michael_nielsen I remember this used to be a big thing in Internet punditry. We were all learning to think on a higher level by thinking in abstraction and leaving the details to lookups/searches... 
Fine for manuals where you already understand the core ideas, but for how many other things?","2019-02-12 02:18 +0000","2665.0","46.0","0.01726078799249531","0.0","1.0","14.0","8.0","0.0","0.0","23.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095142294796361728","https://twitter.com/gwern/status/1095142294796361728","@Linzolle Present day. Present time. (Ahahaha!)","2019-02-12 02:07 +0000","98156.0","135.0","0.0013753616691796731","7.0","2.0","68.0","31.0","0.0","0.0","27.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095141344295743489","https://twitter.com/gwern/status/1095141344295743489","@FrancisVeeGee I did it on Colab. It was more than a minute, less than 10 minutes, IIRC.","2019-02-12 02:03 +0000","135349.0","36.0","2.659790615372112E-4","0.0","1.0","8.0","16.0","0.0","0.0","11.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095133164001996800","https://twitter.com/gwern/status/1095133164001996800","@nmgrm @kikko_fr 's https://t.co/MQw7SlcAZp (A model eh? Would you call that... 
'a hun in the oven'?)","2019-02-12 01:31 +0000","9392.0","76.0","0.008091993185689948","0.0","0.0","1.0","4.0","59.0","0.0","12.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095131651246575616","https://twitter.com/gwern/status/1095131651246575616","The StyleGAN anime face interpolations are solid: - https://t.co/JtzWaQ1rLT - https://t.co/dl5AqXYqlc - https://t.co/FaldMdAbyQ https://t.co/7T7hoCDXMg","2019-02-12 01:25 +0000","392205.0","52379.0","0.13355005673053633","901.0","33.0","1839.0","644.0","4674.0","0.0","5072.0","28.0","0","0","64","0","0","201090","39124","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095122579453632512","https://twitter.com/gwern/status/1095122579453632512","It's worth remembering the point of reading a manual: it is not to remember everything that is in it for later but to later remember *that* something is in it.","2019-02-12 00:49 +0000","89563.0","732.0","0.008173017875685272","61.0","6.0","380.0","149.0","0.0","0.0","133.0","2.0","0","0","1","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095069344130916359","https://twitter.com/gwern/status/1095069344130916359","@ESRogs @robinhanson I assume it's about HEXACO, which splits OCEAN more finely by adding another factor. It's a hierarchy. You can go down to facets, sub-facets, or even to the 'Small 100' with enough items.","2019-02-11 21:17 +0000","956.0","11.0","0.011506276150627616","1.0","0.0","6.0","3.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095055891668369413","https://twitter.com/gwern/status/1095055891668369413","@jonathanfly @fabricatedmath (Unless you're finetuning the model, I suppose. 
I get good finetuning in 2-4h which is 4-8 GPU-hours, so a few factors slow down is still a reasonable amount of time.)","2019-02-11 20:24 +0000","340.0","6.0","0.01764705882352941","0.0","0.0","0.0","3.0","0.0","0.0","3.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095051517374197761","https://twitter.com/gwern/status/1095051517374197761","@jonathanfly @fabricatedmath But considering how slow this is already, you might as well just rent some cloud. I'm looking at https://t.co/wuVi8ZUYBE and they have 8x1080ti or 10x1080tis at $1.2/hr, which is ridiculous.","2019-02-11 20:06 +0000","426.0","5.0","0.011737089201877934","0.0","1.0","0.0","0.0","4.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095051316169183232","https://twitter.com/gwern/status/1095051316169183232","@jonathanfly @fabricatedmath As long as you can fit minibatch=1, you can use gradient accumulation to fake larger minibatches at the linear slowdown. Assuming there's nothing dependent on 'real' minibatch count (which there probably is), it is theoretically the same exact thing but slower.","2019-02-11 20:05 +0000","414.0","4.0","0.00966183574879227","0.0","1.0","1.0","2.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095035088457383936","https://twitter.com/gwern/status/1095035088457383936","@NotJustPolitics Huh. 
I didn't realize noise made any difference at psi=0 because they are constant in the sample, but I guess the noise is just constant within each row and they all share the same global mean.","2019-02-11 19:01 +0000","184.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1095017173918064641","https://twitter.com/gwern/status/1095017173918064641","@NotJustPolitics https://t.co/qvNL2BfASj","2019-02-11 17:50 +0000","1079.0","1.0","9.267840593141798E-4","0.0","0.0","0.0","0.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094802169910431744","https://twitter.com/gwern/status/1094802169910431744","@lcmgcd The Nvidia StyleGAN repo is on GitHub.","2019-02-11 03:35 +0000","3205.0","1.0","3.1201248049921997E-4","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094771832157294592","https://twitter.com/gwern/status/1094771832157294592","@razibkhan @StuartJRitchie https://t.co/7HFkGoC9JK","2019-02-11 01:35 +0000","1431.0","59.0","0.04122990915443746","0.0","0.0","2.0","1.0","8.0","0.0","8.0","0.0","0","0","0","0","0","40","40","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094761460041949184","https://twitter.com/gwern/status/1094761460041949184","@Zergfriend What can I say? It was never an issue with earlier GANs where I was happy if the faces were even recognizable as such...","2019-02-11 00:54 +0000","711.0","4.0","0.005625879043600563","0.0","0.0","1.0","3.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094758270181535745","https://twitter.com/gwern/status/1094758270181535745","@Zergfriend Should be ~9%, yes. Why throw away so much data? 
Isn't clear how much one needs.","2019-02-11 00:41 +0000","761.0","6.0","0.00788436268068331","0.0","2.0","1.0","1.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094733864587747330","https://twitter.com/gwern/status/1094733864587747330","@metasemantic Training montage: https://t.co/5IxkjxWFvQ Definitely interesting to see that somehow the faces manage to survive the retraining and how quickly body-like shapes emerge. /checks Amazon spot prices: $7.34/hour for 8 V100s or $180 a day or ~$1200 per week (the 1024px HQ model). Hm.","2019-02-10 23:04 +0000","457.0","8.0","0.0175054704595186","0.0","0.0","0.0","0.0","7.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094731174134595584","https://twitter.com/gwern/status/1094731174134595584","@YouNeedToGoBack (And if it wasn't sacrilicious I wouldn't be doing it! All for the lulz.)","2019-02-10 22:53 +0000","2078.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094731041766629378","https://twitter.com/gwern/status/1094731041766629378","@YouNeedToGoBack I'm not sure that Holo faces are any more likely to be obscured than the anime faces in general (remember, these Holo faces are a subset of the Danbooru2017 anime faces, I'm just specializing here). So if the original faces StyleGAN didn't have this problem...","2019-02-10 22:53 +0000","2130.0","1.0","4.6948356807511736E-4","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094729238266167297","https://twitter.com/gwern/status/1094729238266167297","@sudogene My theory for that one is that it's a bad attempt to imitate this one pic where Holo is ripping apart a steak. 
Which might indicate memorization except the rest of the image doesn't look much like this, so it's just the one part being generalized badly.","2019-02-10 22:46 +0000","3463.0","5.0","0.0014438348252959862","0.0","0.0","5.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094728538396205057","https://twitter.com/gwern/status/1094728538396205057","If 1-2h is enough to finetune on a character, what happens if you run for 7+ hours? Overfitting? No - it gets better *and* worse: eg it's picked up more global structure like eating apples (!) but at the cost of horrible noise artifacts (???). I have no explanation for this. https://t.co/L5MtxQqcRV","2019-02-10 22:43 +0000","43520.0","1052.0","0.02417279411764706","5.0","7.0","45.0","19.0","138.0","0.0","203.0","1.0","0","0","0","0","0","634","634","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094694956525080577","https://twitter.com/gwern/status/1094694956525080577","@metasemantic Some early face->Danbooru2017 transfer learning StyleGAN results: https://t.co/S0AOaGD72b Interesting how the faces are just barely persisting on tops of body-like blobs.","2019-02-10 20:29 +0000","517.0","15.0","0.029013539651837523","0.0","1.0","1.0","0.0","10.0","0.0","3.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094675137671188481","https://twitter.com/gwern/status/1094675137671188481","@roadrunning01 It uses the latest pkl in each run folder by default. 
You specify which run to resume from by editing 'resume_run_id' with the appropriate subdir of results/ in 'training_loop.py'.","2019-02-10 19:11 +0000","2250.0","1.0","4.4444444444444447E-4","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094628928394792960","https://twitter.com/gwern/status/1094628928394792960","@ArtirKel _Spice & WAIfu_","2019-02-10 16:07 +0000","2146.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094627764760690690","https://twitter.com/gwern/status/1094627764760690690","@enkiv2 'After humanity spent thousands of years improving our anime drawings, computers tell us that humans are completely wrong... Perhaps not a single human has touched the edge of the truth of moe.'","2019-02-10 16:02 +0000","2269.0","17.0","0.00749228735125606","1.0","1.0","6.0","0.0","0.0","0.0","9.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094626477633617922","https://twitter.com/gwern/status/1094626477633617922","I'm really enjoying these finetuning runs: taking the full face-corpus StyleGAN model and retraining it on just a specific character's corpus. Here's 2.5h (~5 epochs) of retraining on just Holo faces. https://t.co/ubuVDgWd7D","2019-02-10 15:57 +0000","48774.0","959.0","0.019662115061303154","9.0","8.0","45.0","20.0","119.0","0.0","110.0","0.0","0","0","0","0","0","648","648","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094623837558370304","https://twitter.com/gwern/status/1094623837558370304","@s14joshi No, I mean, instead of learning noise->8px then 8px->16px etc, learn 256px->512px then 128px->256px then 64px->128px then 32px->64px etc. Both seem like they should work, but I haven't seen anyone try the other way. Maybe it's a lot better! 
How would one know otherwise?","2019-02-10 15:47 +0000","418.0","3.0","0.007177033492822967","0.0","0.0","1.0","0.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094600509980327937","https://twitter.com/gwern/status/1094600509980327937","So, in all progressive-growing GANs, they start from low-res (like 8px) and train against down-sampled real images, adding layers progressively, learning to upscale noise into full images. But has anyone tried the other direction of shrinking learned layers instead?","2019-02-10 14:14 +0000","12019.0","30.0","0.002496047924120143","0.0","2.0","4.0","8.0","0.0","0.0","16.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094599433059266561","https://twitter.com/gwern/status/1094599433059266561","@matthew_d_green @FredericJacobs @instagram It's a good demonstration of the importance of color coordination and the right amount of contrast. Imagine if the bezel were blue or white!","2019-02-10 14:10 +0000","768.0","8.0","0.010416666666666666","0.0","0.0","3.0","1.0","0.0","0.0","4.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094589676634234880","https://twitter.com/gwern/status/1094589676634234880","@quasimondo I figured I might have to write a script to dump images and then convert them to video with ffmpeg (which was what I was doing with ProGAN since the default video settings are enormous), but if you could post yours at some point, that'd be great, thanks.","2019-02-10 13:31 +0000","730.0","4.0","0.005479452054794521","0.0","1.0","0.0","2.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094588110451757057","https://twitter.com/gwern/status/1094588110451757057","@quasimondo Ah. 
Speaking of which, are you writing your own code to do video interpolations? I wanted to do one for the faces but (unlike ProGAN) the StyleGAN repo doesn't seem to ship with a video script.","2019-02-10 13:25 +0000","600.0","3.0","0.005","0.0","1.0","0.0","0.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094586521460662273","https://twitter.com/gwern/status/1094586521460662273","@fabricatedmath @roadrunning01 There's also 'wAIfu' and 'husbANNdo'.","2019-02-10 13:18 +0000","426.0","1.0","0.002347417840375587","0.0","0.0","0.0","1.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094586338475675648","https://twitter.com/gwern/status/1094586338475675648","@jjvie A lot but it's not too bad: https://t.co/qvNL2BfASj Another guy is running my last checkpoint on the 512px SFW Danbooru2017 version w/o any other preprocessing, so we'll see how it scales.","2019-02-10 13:18 +0000","2874.0","24.0","0.008350730688935281","0.0","0.0","1.0","2.0","21.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094585878100488193","https://twitter.com/gwern/status/1094585878100488193","@quasimondo I haven't paid much attention to the sample generation time, so dunno.","2019-02-10 13:16 +0000","569.0","1.0","0.0017574692442882249","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094452913445965824","https://twitter.com/gwern/status/1094452913445965824","@roadrunning01 FWIW, I think psi=0.5-0.6 yields probably the best balance of samples.","2019-02-10 04:28 +0000","1644.0","2.0","0.0012165450121654502","0.0","0.0","2.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" 
"1094445227799121920","https://twitter.com/gwern/status/1094445227799121920","Current StyleGAN model if anyone wants to use a good-quality but unconverged anime-face StyleGAN: https://t.co/1doxVGOm2l","2019-02-10 03:57 +0000","52024.0","500.0","0.009610948792864832","2.0","4.0","32.0","25.0","348.0","0.0","87.0","0.0","0","0","2","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094425813678727168","https://twitter.com/gwern/status/1094425813678727168","@metasemantic I wouldn't call 4GB VRAM 'mid-range' in 2019, but what I meant was you could use StyleGAN's gradient accumulation support to go all the way down to minibatch=1 if necessary.","2019-02-10 02:40 +0000","263.0","2.0","0.0076045627376425855","0.0","2.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094368265219072000","https://twitter.com/gwern/status/1094368265219072000","@leo_tdg That one was because I loved the whole chibi style it has going on. It's nice that the StyleGAN is able to, you know, do so many styles! It is genuine IMO from watching the fixed set of training samples evolve, but I don't want to do a dump now. Lots of training left to do.","2019-02-09 22:51 +0000","884.0","1.0","0.0011312217194570137","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094358580281724928","https://twitter.com/gwern/status/1094358580281724928","@leo_tdg Slowly getting better... 
Here's some of the nicer recent samples: https://t.co/UbF4G7iy0o I've also begun to wonder if it would make more sense to try transfer learning Danbooru2017->faces: if it's learned whole scenes/bodies first, 'scaling down' to faces might be easier.","2019-02-09 22:13 +0000","883.0","15.0","0.01698754246885617","0.0","1.0","1.0","2.0","11.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094352807254212608","https://twitter.com/gwern/status/1094352807254212608","@metasemantic So far I know of: one person on Twitter, 1024px Gothic cathedrals; 1 person in my DMs, 'custom dataset'; 1 person on IRC, SFW subset of Danbooru2017. If you have even a mid-range Nvidia GPU, you can train a new StyleGAN in a week or two after fiddling with the minibatch settings","2019-02-09 21:50 +0000","670.0","7.0","0.010447761194029851","0.0","2.0","3.0","2.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094328781597261824","https://twitter.com/gwern/status/1094328781597261824","@parand Hm... I'm not sure. I don't think the StyleGAN repo has code for that. Is there anywhere in the paper or video where they demo the output at various layers? I vaguely recall they might, to show how the noise starts off determining global structure then fine-scale noise.","2019-02-09 20:14 +0000","1212.0","4.0","0.0033003300330033004","0.0","1.0","0.0","3.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094266109488963585","https://twitter.com/gwern/status/1094266109488963585","@vsync As always, I intend to keep my own esthetic and preferences. I think the many improvements over the past few months have made https://t.co/LC5JQL86wv much better-looking by my lights, and more functional. 
(Also found 2 Pandoc bugs while I was at it.)","2019-02-09 16:05 +0000","212.0","6.0","0.02830188679245283","1.0","1.0","0.0","1.0","3.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094265620806418436","https://twitter.com/gwern/status/1094265620806418436","@TyphonBaalAmmon That's what I meant by 'a pain to navigate', and was doing until I complained and someone blew my mind by telling me about the sidebars.","2019-02-09 16:03 +0000","711.0","3.0","0.004219409282700422","0.0","0.0","1.0","0.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094261351776698368","https://twitter.com/gwern/status/1094261351776698368","Well, I'm being unfair. It's not completely invisible. There is a teeny-tiny white rectangle which looks like a background on the edge of the page as well, which might serve as a hint the navigation exists. (If you were a schizophrenic lunatic who sees hidden powers everywhere.)","2019-02-09 15:46 +0000","19346.0","55.0","0.002842964953995658","0.0","4.0","9.0","2.0","0.0","0.0","38.0","2.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094259418127101954","https://twitter.com/gwern/status/1094259418127101954","""How was I supposed to know ?????"" you ask. Well, he mentions it, very briefly, at the start. Hope you were reading the entire book & have a photographic memory, or are an old-school adventure gamer who mouses over every pixel on the screen looking for secrets...","2019-02-09 15:39 +0000","15299.0","33.0","0.002157003725733708","0.0","1.0","8.0","0.0","0.0","0.0","24.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094259417120419841","https://twitter.com/gwern/status/1094259417120419841","Man, some websites. 
Like Matthew Butterick's 'Practical Typography': eg https://t.co/H2JkbAouRi You might think it's a pain to navigate but no, it follows what I call 'the adventure game school of design' - you click on the *invisible* navigation sidebars to go left/right!","2019-02-09 15:39 +0000","15156.0","457.0","0.03015307468989179","1.0","3.0","19.0","3.0","412.0","0.0","19.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094245056008146944","https://twitter.com/gwern/status/1094245056008146944","@mbwheats Dunno. Generating single heads is not that useful, and productizing/control is a lot of hard work. Look at the effort which goes into making https://t.co/e3ZBk7IWbB into a useful tool, and that's just colorizing. Might be a fun infinite-gravatar-generation tool, though.","2019-02-09 14:42 +0000","536.0","16.0","0.029850746268656716","0.0","1.0","0.0","3.0","12.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094240492424974336","https://twitter.com/gwern/status/1094240492424974336","@AlbertBrown_sq @michelnivard Yes. No one's going to abandon a novel approach so easy & cheap to test for such a severe & intractable problem because some (still debated & lacking validation) genomics approach doesn't turn in a statistically-significant result... Plus, MR what? Are there genus-level GWASes?","2019-02-09 14:23 +0000","208.0","1.0","0.004807692307692308","0.0","0.0","0.0","0.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094073729011847168","https://twitter.com/gwern/status/1094073729011847168","Expanding psi to make the transitions clearer: psi=[1, 0.7, 0.5, 0.4, 0.2, 0.1, 0, -0.1, -0.2, -0.4, -0.5, -0.7, -1]. (Head pose reversal/negation is particularly obvious with this version.) 
https://t.co/8ymUpp8DW9","2019-02-09 03:21 +0000","31472.0","910.0","0.02891459074733096","3.0","4.0","51.0","13.0","139.0","0.0","74.0","1.0","0","0","0","0","0","625","625","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094067654720135168","https://twitter.com/gwern/status/1094067654720135168","@joshu I've limited the page width some more and added increased line-heights based on page-width (more for wider pages).","2019-02-09 02:57 +0000","243.0","2.0","0.00823045267489712","0.0","1.0","1.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094031502055227392","https://twitter.com/gwern/status/1094031502055227392","@sonyaellenmann @BradyDale @liberapay @michael_nielsen Invite-only, 'founding membership' limit thing, intended for subscriptions only (which may be how I prefer to use Patreon but most Patreons are more transactional or release-based), were the big ones.","2019-02-09 00:33 +0000","621.0","3.0","0.004830917874396135","0.0","0.0","1.0","0.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094006863757299713","https://twitter.com/gwern/status/1094006863757299713","@fabricatedmath That was for 1024px and publication-quality near-photoperfect results, though. lulzy 512px is much easier. 
Also, I think they set their learning rates too low so I increase mine.","2019-02-08 22:55 +0000","697.0","3.0","0.00430416068866571","0.0","0.0","1.0","1.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094004850482909184","https://twitter.com/gwern/status/1094004850482909184","@fabricatedmath 2x1080ti, personal box.","2019-02-08 22:47 +0000","1196.0","7.0","0.005852842809364548","0.0","2.0","1.0","0.0","0.0","0.0","4.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1094003308371628037","https://twitter.com/gwern/status/1094003308371628037","@BradyDale @sonyaellenmann @liberapay @michael_nielsen Their drip thing is *very* different and not useful for most Patreon users. (I didn't understand the usecase when they launched it, and haven't looked into it since.)","2019-02-08 22:41 +0000","505.0","7.0","0.013861386138613862","0.0","1.0","1.0","2.0","0.0","0.0","3.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1093998075813740546","https://twitter.com/gwern/status/1093998075813740546","@privatepresh See figure 8 in https://t.co/UvaO5JOtvT . Each row is a latent space point with psi = {1 / 0.7 / 0.5 / 0 / -0.5 / -0.7 / -1}.","2019-02-08 22:20 +0000","1150.0","36.0","0.03130434782608696","0.0","0.0","1.0","1.0","34.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1093996229359534080","https://twitter.com/gwern/status/1093996229359534080","Another interesting thing: the figure 8 psi interpolation. 0/middle=global mean - a brown-eyed brown-haired girl. Fair enough. 1 vs -1 are opposites. What's the opposite of gray eyes/short brown hair? red eyes/long purple hair, apparently. red eyes/long red? yellow/short blue. 
https://t.co/9HStf2Jmoy","2019-02-08 22:13 +0000","44716.0","1529.0","0.034193577243045","19.0","7.0","59.0","27.0","110.0","0.0","107.0","1.0","0","0","0","0","0","1199","1199","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1093989832613875718","https://twitter.com/gwern/status/1093989832613875718","@parand Cropped faces using nagadomi's script from Danbooru2017; hand-pruned and used an earlier GAN to delete the bottom 20% by quality; upscaled to 512px by waifu2x; ~220k faces after data augmentation; StyleGAN, from scratch ~3.5 days 2x1080ti.","2019-02-08 21:47 +0000","2297.0","21.0","0.00914235959947758","0.0","1.0","9.0","3.0","0.0","0.0","8.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1093983431824949248","https://twitter.com/gwern/status/1093983431824949248","@ArkaSadhu29 @roadrunning01 Right now, it's ~3.5 days of training. I've lost time to segfaults along the way. Generating a few hundred images takes like a minute or two (concurrent with the ongoing training, could optimize it down to a few seconds if it had the GPUs to itself).","2019-02-08 21:22 +0000","3706.0","12.0","0.0032379924446842958","0.0","0.0","4.0","8.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1093982514610388992","https://twitter.com/gwern/status/1093982514610388992","@Ambisinister_ - heterochromia is valid. Some characters really are heterochromatic, and in any case, lighting often makes them distinctly different. (*Too* much heterochromia early on in training is a major diagnostic of GAN failure, though.) - yep! - nope. 
But bare shoulders can still be SFW.","2019-02-08 21:18 +0000","2083.0","10.0","0.004800768122899664","0.0","0.0","3.0","3.0","0.0","0.0","4.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1093979199910699010","https://twitter.com/gwern/status/1093979199910699010","More samples from today: https://t.co/O0JD4Gsh6s IRC reaction: 'holy shit that is insane. seriously this is fucking insane, when did it get so good' Me: 'there is no fire alarm' ( £ş ?? £ş)","2019-02-08 21:05 +0000","48549.0","1410.0","0.029042822715194958","13.0","8.0","62.0","59.0","1105.0","0.0","162.0","0.0","0","0","1","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1093902196301852672","https://twitter.com/gwern/status/1093902196301852672","""Keener sorrow than the cherry blossoms of spring is today's snowfall - weeping to the ground, leaving not even a flake behind.""","2019-02-08 15:59 +0000","7539.0","12.0","0.0015917230401910067","0.0","0.0","6.0","1.0","0.0","0.0","5.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1093874655235133440","https://twitter.com/gwern/status/1093874655235133440","@JohnSpeakman4 @WonkaWasRight @whsource @Cell_Metabolism https://t.co/BAA2a7iq2R https://t.co/FQPjfJr1lK https://t.co/0v3Lzm4s7V https://t.co/mNbW90oTFW https://t.co/jHnqwaUONV all show non-neutrality of BMI, no?","2019-02-08 14:10 +0000","225.0","21.0","0.09333333333333334","1.0","1.0","3.0","0.0","15.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1093733298302537729","https://twitter.com/gwern/status/1093733298302537729","@roadrunning01 I might start on full Danbooru2017 images once faces is done. Let's see how far the architecture can be pushed - that's the major limitation of the StyleGAN paper at the moment, no good indication of how far it can be pushed. 
If BigGAN can do ImageNet, can StyleGAN? etc.","2019-02-08 04:48 +0000","4032.0","4.0","9.92063492063492E-4","0.0","1.0","1.0","1.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1093712218078801922","https://twitter.com/gwern/status/1093712218078801922","@refikanadol @quasimondo @Zoot_Allures Nice. The results on cats/cathedrals/faces definitely makes me wonder how StyleGAN would handle ImageNet or all-Danbooru2017 anime (not just face) images.","2019-02-08 03:24 +0000","551.0","8.0","0.014519056261343012","0.0","0.0","3.0","3.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1093710911959900160","https://twitter.com/gwern/status/1093710911959900160","@realmforge @frostinmay Yes, both ProGAN and StyleGAN struggle with the edges and background because they're obscured and highly-varied. The faces are always in clear focus in the foreground, ofc. I imagine shoulders should get better toward the end - not even at 512px yet!","2019-02-08 03:19 +0000","18480.0","10.0","5.411255411255411E-4","0.0","0.0","4.0","5.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1093710455284162560","https://twitter.com/gwern/status/1093710455284162560","@quasimondo @Zoot_Allures eg here's someone doing Gothic cathedrals with good results already: https://t.co/I24XydiFg3","2019-02-08 03:17 +0000","719.0","16.0","0.022253129346314324","0.0","1.0","2.0","9.0","1.0","0.0","0.0","3.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1093709848859029504","https://twitter.com/gwern/status/1093709848859029504","@quasimondo @Zoot_Allures On the other hand, unlike BigGAN, the compute is far more reasonable. 
I'm already getting hilarious (SOTA, even) anime faces with ~6 GPU-days.","2019-02-08 03:15 +0000","711.0","5.0","0.007032348804500703","0.0","1.0","4.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1093703960958066688","https://twitter.com/gwern/status/1093703960958066688","@frostinmay Oh, if you want to look at the rest: https://t.co/YF6pXfeNgu","2019-02-08 02:51 +0000","18567.0","117.0","0.006301502666020358","0.0","1.0","10.0","8.0","93.0","0.0","5.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1093702506629206016","https://twitter.com/gwern/status/1093702506629206016","@frostinmay Aren't they? I literally burst out laughing when I finally got `generate_figures.py` working and could look at the interpolations.","2019-02-08 02:46 +0000","18484.0","12.0","6.492101276779918E-4","0.0","1.0","5.0","0.0","0.0","0.0","6.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1093701790971953152","https://twitter.com/gwern/status/1093701790971953152","MFW when I look up from my GAN at the latest psych paper but then realize it's a Registered Report. https://t.co/7jJw17Ekqk","2019-02-08 02:43 +0000","345960.0","14798.0","0.04277373106717539","68.0","18.0","308.0","715.0","837.0","0.0","1696.0","6.0","0","0","28","0","0","11122","11122","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1093662890761048064","https://twitter.com/gwern/status/1093662890761048064","@WonkaWasRight @whsource @JohnSpeakman4 There's a whole bunch. The tricky thing is that many of them are looking at *net* selection but over different time intervals. So it can be true that there is net selection for EDU/IQ from 50kya-now and net selection against during 1900-2000. 
Have to read them closely.","2019-02-08 00:08 +0000","265.0","4.0","0.01509433962264151","1.0","1.0","2.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1093662170125090818","https://twitter.com/gwern/status/1093662170125090818","@michael_nielsen Someone on IRC made that exact comparison when it was linked. It helps that Bezos has hulked out over the past decade...","2019-02-08 00:05 +0000","5053.0","34.0","0.006728676034039185","0.0","1.0","12.0","9.0","0.0","0.0","12.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1093637831921152000","https://twitter.com/gwern/status/1093637831921152000","@AndrewCutler13 @drmichaellevin @NoamJStein Yeah, I remember that. But as far as I know, no flesh was ever put on that inspiration and now people prefer dropout interpretations about ensembling or approximating a Bayesian posterior distribution of models.","2019-02-07 22:29 +0000","210.0","1.0","0.004761904761904762","0.0","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1093637371524993026","https://twitter.com/gwern/status/1093637371524993026","@katiecorker Oh, tons of stuff never even got written up for submission because they assumed it'd be a waste of time even trying to get them published with null or weak results: https://t.co/oRSvg7pBqU / https://t.co/5TCUUUlTb1","2019-02-07 22:27 +0000","212.0","7.0","0.0330188679245283","0.0","1.0","1.0","0.0","3.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1093636038633971715","https://twitter.com/gwern/status/1093636038633971715","@katiecorker There's a lot of meta-analytic literature which tracks the flow of papers from things like conference abstracts/trial registrations to final (non)publication. 
eg Franco's TESS: because authors must ask TESS to carry out survey/experiments for them, *all* started studies tracked.","2019-02-07 22:22 +0000","826.0","4.0","0.004842615012106538","0.0","1.0","2.0","1.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1093634997809020929","https://twitter.com/gwern/status/1093634997809020929","@bryan_caplan ""The true story of Russia's weakness [stifled by bad planning, bureaucratic inefficiency, and lack of any real incentive]"", Nutter 1957 (_U.S. News & World Report_, March 1, 1957): https://t.co/un5Uy0VGfg","2019-02-07 22:17 +0000","750.0","9.0","0.012","0.0","0.0","2.0","2.0","5.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1093574984688168960","https://twitter.com/gwern/status/1093574984688168960","@roadrunning01 @EldinhoC Up to 256px (I think), ~2.5d training. Big quality increase - these StyleGAN samples are ?. https://t.co/p1yJQzw7BK","2019-02-07 18:19 +0000","2721.0","84.0","0.030871003307607496","0.0","1.0","7.0","9.0","13.0","0.0","14.0","0.0","0","0","0","0","0","40","40","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1093569593304641542","https://twitter.com/gwern/status/1093569593304641542","@StatModeling Acne self-experiments: https://t.co/XsUNY1qe2F ? 
:)","2019-02-07 17:58 +0000","677.0","6.0","0.008862629246676515","0.0","0.0","2.0","0.0","3.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1093569388459057154","https://twitter.com/gwern/status/1093569388459057154","@roadrunning01 @EldinhoC StyleGAN samples on Pokemon sprites (overfitting?): https://t.co/eiRzyefTYd","2019-02-07 17:57 +0000","565.0","30.0","0.05309734513274336","0.0","0.0","0.0","0.0","30.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1093540611527970817","https://twitter.com/gwern/status/1093540611527970817","@devonzuegel Olin Shivers remains a classic Acknowledgements: https://t.co/b7pO1bReDo (It's humorous... I think. Considering the CMU stories about Shivers, it's hard to tell.)","2019-02-07 16:02 +0000","6615.0","128.0","0.019349962207105064","2.0","2.0","19.0","4.0","83.0","0.0","18.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1093509411811872769","https://twitter.com/gwern/status/1093509411811872769","@xsteenbrugge @genekogan @quasimondo I had the impression he was searching the z-vector directly like a greedy / evolutionary search, and not backpropping through the model itself to get gradients, which I wouldn't expect to work as well. But I could be wrong. QM?","2019-02-07 13:58 +0000","157.0","2.0","0.012738853503184714","0.0","0.0","0.0","2.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1093299440625045504","https://twitter.com/gwern/status/1093299440625045504","@SteveStuWill @WiringTheBrain 'biggest'... 
as long as we ignore measurement error, handwaving it away as 'surely measurement error is not so big', completely ignoring that yeah, it can be quite substantial - twins ascertained for schizophrenia in their 30s, 'IQ tests' with test-retest reliabilities of .5...","2019-02-07 00:04 +0000","2530.0","35.0","0.01383399209486166","0.0","1.0","17.0","6.0","0.0","0.0","10.0","1.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1093279724258889728","https://twitter.com/gwern/status/1093279724258889728","@nabeelqu @robinhanson @Davie_Michael Hedge funds particularly. A book I'd like to read one day would be all about the crazy things & data feeds traders & hedge funds have used over the years, starting with steganographic exploits of French semaphores and moving on to satellites etc.","2019-02-06 22:46 +0000","275.0","2.0","0.007272727272727273","0.0","0.0","1.0","1.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1093279129712111617","https://twitter.com/gwern/status/1093279129712111617","@genekogan But isn't that true of classic style transfer as well, even the original iterative gradient descent approach? You have to pick the right levels as your Gram matrix if you want to transfer the textures/'style' rather than overwrite the content by using more semantic layers.","2019-02-06 22:43 +0000","8615.0","6.0","6.964596633778294E-4","0.0","1.0","1.0","4.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1093226993146122245","https://twitter.com/gwern/status/1093226993146122245","@genekogan It's reversible in that it's differentiable! Remember that backprop works in multiple directions. You want to modify a real image? 
Drop in the model + image, hold them constant, and then do gradient descent on the noise vectors to get the closest approximation...","2019-02-06 19:16 +0000","1405.0","19.0","0.013523131672597865","0.0","3.0","1.0","3.0","0.0","0.0","12.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1093226496347512840","https://twitter.com/gwern/status/1093226496347512840","@genekogan (Considering the architecture and their demos of controllability, isn't it more than 'sort of'?)","2019-02-06 19:14 +0000","9871.0","13.0","0.0013169891601661433","0.0","1.0","1.0","6.0","0.0","0.0","5.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1093216285243129861","https://twitter.com/gwern/status/1093216285243129861","@anderssandberg https://t.co/5ijib9lHKe https://t.co/jLAQOpuX2K https://t.co/sUHVgZyDew / https://t.co/u0Rwf5w8NI","2019-02-06 18:34 +0000","370.0","8.0","0.021621621621621623","0.0","0.0","3.0","2.0","2.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1093184224008523779","https://twitter.com/gwern/status/1093184224008523779","@robinhanson @Davie_Michael He doesn't even do that, and the point is that this is an industry of data labeling, which contradicts any attempt to treat it as a widespread or important generalization. But yeah, let's keep moving those goalposts...","2019-02-06 16:26 +0000","1127.0","16.0","0.01419698314108252","0.0","1.0","1.0","1.0","0.0","0.0","13.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1093181823973244928","https://twitter.com/gwern/status/1093181823973244928","@robinhanson @Davie_Michael Of course they are. 
They are showing these are SOP with large specialized contractors worldwide in a well-developed industry devoted to precisely the thing he claims without evidence, and no one else names names either. Unless we're going to keep moving goalposts on 'common'...","2019-02-06 16:17 +0000","642.0","8.0","0.012461059190031152","1.0","1.0","2.0","0.0","0.0","0.0","4.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1093180533050392576","https://twitter.com/gwern/status/1093180533050392576","@roadrunning01 @EldinhoC Current StyleGAN results on Danbooru2017 anime faces, ~23h training overnight, ~128px samples. https://t.co/nZgllglLsl","2019-02-06 16:12 +0000","3293.0","71.0","0.021560886729426056","0.0","2.0","6.0","4.0","10.0","0.0","9.0","1.0","0","0","0","0","0","39","39","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1093171149343784960","https://twitter.com/gwern/status/1093171149343784960","@Davie_Michael @robinhanson Why should I believe his contentless exampleless factless assertions supposedly based on secret knowledge when I can read the news & Arxiv like anyone else and come up with contradictory examples with ease?","2019-02-06 15:34 +0000","777.0","13.0","0.01673101673101673","0.0","1.0","2.0","1.0","0.0","0.0","9.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1093024023431270401","https://twitter.com/gwern/status/1093024023431270401","@parand Almost certainly you could. It's trained progressively, after all, so that's how it literally is for part of training. But would need to be a better Python/TF programmer than me to do transfer learning like that. 
(I'll learn it some day, I swear!)","2019-02-06 05:50 +0000","145.0","4.0","0.027586206896551724","0.0","0.0","0.0","1.0","0.0","0.0","3.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1093016864593186816","https://twitter.com/gwern/status/1093016864593186816","@parand I'm not sure how to resolve the finetuning problem since they only released 1024px faces. I could upscale my anime faces 2x - again - but then the quality will continue to suffer badly. I could try anime images in general, tons of 512px+ images that'd 2x fine.","2019-02-06 05:21 +0000","140.0","1.0","0.007142857142857143","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092991750648541185","https://twitter.com/gwern/status/1092991750648541185","@parand I'm doing 512px anime faces myself at the moment. Too bad decent 1024px anime faces are so hard to come by, or I'd be trying finetuning the 1024px Flickr pretrained model instead...","2019-02-06 03:41 +0000","154.0","1.0","0.006493506493506494","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092980609696366592","https://twitter.com/gwern/status/1092980609696366592","@JanelleCShane Interesting technical note: ProGAN also trained on the LSUN cats, but the meme captions look like Cyrillic. Why? Because they kept the default mirror/flip data augmentation. But people noticed; so for StyleGAN, they disabled the mirroring so now it looks like Latin/English.","2019-02-06 02:57 +0000","875.0","8.0","0.009142857142857144","0.0","0.0","5.0","0.0","0.0","0.0","3.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092979783879794688","https://twitter.com/gwern/status/1092979783879794688","@parand Strictly speaking. 
you only need 11GB if you want to train 1024px, which you probably don't, and even then, you can probably still train slower by reducing minibatch size & using gradient accumulation. (ProGAN had that built in, haven't checked whether StyleGAN kept that.)","2019-02-06 02:54 +0000","794.0","8.0","0.010075566750629723","0.0","1.0","1.0","5.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092954269840560128","https://twitter.com/gwern/status/1092954269840560128","@AodhBC @danwaterfield Whups, first link was supposed to be https://t.co/MFpf0wHQpQ","2019-02-06 01:12 +0000","4084.0","59.0","0.01444662095984329","0.0","0.0","4.0","2.0","42.0","0.0","11.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092942599638446081","https://twitter.com/gwern/status/1092942599638446081","@KaiLashArul @aravind7694 Likewise. I liked it enough to make a cleaner version of the paper from the book: https://t.co/bVdYJiPstw","2019-02-06 00:26 +0000","1891.0","65.0","0.03437334743521946","2.0","0.0","7.0","4.0","48.0","0.0","4.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092942126604800002","https://twitter.com/gwern/status/1092942126604800002","@EldinhoC More Pokemon samples from PokeGAN: https://t.co/dryRhGgxgn","2019-02-06 00:24 +0000","454.0","12.0","0.02643171806167401","0.0","1.0","0.0","0.0","11.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092919622519656449","https://twitter.com/gwern/status/1092919622519656449","@AodhBC @danwaterfield Stalin was always a tough editor: https://t.co/A4MAZtZoga https://t.co/A4MAZtZoga","2019-02-05 22:55 +0000","4947.0","100.0","0.02021427127552052","0.0","1.0","19.0","7.0","61.0","0.0","12.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092882164184690688","https://twitter.com/gwern/status/1092882164184690688","@nalim1 @sonyaellenmann That might depend on the fraud and content. Consider all the furry artists.","2019-02-05 20:26 +0000","1207.0","8.0","0.006628003314001657","0.0","0.0","2.0","1.0","0.0","0.0","5.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092868922641186817","https://twitter.com/gwern/status/1092868922641186817","@robinhanson I was tracing back one of my epigraphs to its source and I loled when I saw Huxley's context because it reminded me of you: https://t.co/Ok2JVxOUw8","2019-02-05 19:33 +0000","1175.0","48.0","0.04085106382978723","0.0","0.0","4.0","4.0","28.0","0.0","12.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092859352317419521","https://twitter.com/gwern/status/1092859352317419521","@sonyaellenmann This has worried me from the start when I began using Patreon. I'm building a house on VC sand. What happens when their money starts to run out? But the donations on Patreon are just so much larger than anywhere else, that what choice does one have?","2019-02-05 18:55 +0000","4277.0","56.0","0.01309328968903437","2.0","3.0","19.0","8.0","0.0","0.0","24.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092858613356474369","https://twitter.com/gwern/status/1092858613356474369","@sonyaellenmann @michael_nielsen And fraud. 
From what I understand, all companies like this, from Gratipay on, spend staggering amounts of effort dealing with CC fraud, chargebacks, etc.","2019-02-05 18:52 +0000","2394.0","27.0","0.011278195488721804","0.0","2.0","14.0","3.0","0.0","0.0","8.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092846012249587713","https://twitter.com/gwern/status/1092846012249587713","@quirkyllama Don't you worry about Tesla, they spend plenty on that: https://t.co/CMMl54E3Pj ... (And RL is an entirely different kind of problem anyway.)","2019-02-05 18:02 +0000","667.0","14.0","0.020989505247376312","0.0","0.0","0.0","2.0","6.0","0.0","6.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092840120393322499","https://twitter.com/gwern/status/1092840120393322499","@robinhanson Here's an example of the quite elaborate & expensive large-scale active learning (https://t.co/ICQHCHBpTc) used to improve data quality by Google: https://t.co/K7mBJoyklB","2019-02-05 17:39 +0000","1451.0","52.0","0.03583735354927636","1.0","0.0","6.0","0.0","35.0","0.0","10.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092838724956835841","https://twitter.com/gwern/status/1092838724956835841","@Chris_PK_Smith @robinhanson He seems to be aiming at FANG and similar Chinese/Japanese companies. Not many entities are filling data centers up with millions of dollars of GPUs/TPUs just for AI. 
He also conflates the 'AI research community' with the companies (as if you could just download JFT-300M...).","2019-02-05 17:33 +0000","522.0","6.0","0.011494252873563218","0.0","0.0","3.0","2.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092837853531394048","https://twitter.com/gwern/status/1092837853531394048","@robinhanson eg China https://t.co/wgHkbA4I7t or Africa https://t.co/xSpUOkTHiy? Scale, which I've never heard of, is 10k people all on its own: https://t.co/96OGpWSZ2G","2019-02-05 17:30 +0000","1724.0","14.0","0.008120649651972157","0.0","1.0","3.0","0.0","10.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092832877048487936","https://twitter.com/gwern/status/1092832877048487936","@robinhanson I don't think this is true, and he gives no evidence for it. Companies like Amazon or Google or Tesla employ thousands or tens of thousands of contractors solely to label & annotate datasets like JFT-300million and that's after using active learning for label cleaning.","2019-02-05 17:10 +0000","3593.0","68.0","0.018925688839409965","1.0","5.0","37.0","8.0","0.0","0.0","16.0","1.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092822967132868609","https://twitter.com/gwern/status/1092822967132868609","@dr_appie But of course we already knew that from the candidate-gene literature...","2019-02-05 16:31 +0000","703.0","6.0","0.008534850640113799","0.0","0.0","5.0","1.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092819643507576832","https://twitter.com/gwern/status/1092819643507576832","@KloudStrife @quasimondo On the bright side, once I wait 3 hours for the dataset to be made, pip install a lib, edit the nvcc call, edit the https://t.co/spiKGnBWDy 8->2 GPUs, wait half an 
hour for it to copy the dataset unnecessarily, and download a model for FID, it *does* run: https://t.co/0Fvj95DTpv","2019-02-05 16:18 +0000","441.0","25.0","0.05668934240362812","0.0","0.0","5.0","1.0","19.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092803521680945152","https://twitter.com/gwern/status/1092803521680945152","@quasimondo I'm still figuring out how to run it. It has an absurdly long (since it literally copies the entire dataset into the temp folder) setup, which makes fixing issues slow... It also still has the nccl TF bug from ProGAN: https://t.co/ymYTRmh8Nq","2019-02-05 15:13 +0000","498.0","12.0","0.024096385542168676","0.0","1.0","1.0","0.0","8.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092595083461971968","https://twitter.com/gwern/status/1092595083461971968","@roadrunning01 @EldinhoC The StyleGAN repo is finally up! At https://t.co/SEIRxerM4S and... it's empty. (The photo dataset has been released though.) 
Oh Nvidia how exquisite your torments.","2019-02-05 01:25 +0000","331.0","3.0","0.00906344410876133","0.0","0.0","1.0","0.0","2.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092568306094088193","https://twitter.com/gwern/status/1092568306094088193","@kopseng @highqualitysh1t @hardmaru At least with ProGAN (which the StyleGAN paper says was the starting codebase), it doesn't matter because the .tfrecords format you have to encode into is like >10x the size of either PNG/JPEG.","2019-02-04 23:39 +0000","2588.0","6.0","0.00231839258114374","0.0","0.0","2.0","3.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092551270005649408","https://twitter.com/gwern/status/1092551270005649408","@draughtens @AdamDemirel I don't think my site design needs a total overhaul *that* badly.","2019-02-04 22:31 +0000","1171.0","3.0","0.0025619128949615714","0.0","1.0","1.0","0.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092531500879433728","https://twitter.com/gwern/status/1092531500879433728","_Pain: The Gift Nobody Wants_, Brand & Yancey 1993: https://t.co/Fc7ZZ36x4J https://t.co/puVnFeBoFj https://t.co/6VYBKAWbx0 https://t.co/xkXFNv6oqX @slatestarcodex","2019-02-04 21:13 +0000","8231.0","43.0","0.005224152593852509","0.0","0.0","5.0","5.0","25.0","0.0","8.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092493333719592961","https://twitter.com/gwern/status/1092493333719592961","@brownparkere Given French politics/religions over the centuries, there's an article idea there for anyone historical-minded! 
(Be almost too easy in the USA, what with things like the Indians & Alcatraz...)","2019-02-04 18:41 +0000","176.0","4.0","0.022727272727272728","0.0","0.0","1.0","0.0","0.0","0.0","3.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092450918346432513","https://twitter.com/gwern/status/1092450918346432513","January links: https://t.co/S3VZQigVLo","2019-02-04 15:52 +0000","8383.0","152.0","0.01813193367529524","3.0","1.0","5.0","1.0","130.0","0.0","12.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092432116623777792","https://twitter.com/gwern/status/1092432116623777792","@AdamDemirel I'll take a look at that guide, but I'm afraid I'm not nearly rich enough to be able to pay for a real web developer. :)","2019-02-04 14:38 +0000","1493.0","8.0","0.0053583389149363695","0.0","2.0","0.0","2.0","0.0","0.0","4.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092290437996572672","https://twitter.com/gwern/status/1092290437996572672","@androgynandre @legalinspire @sonyaellenmann @eigenrobot And, just like the original, humble pie often tastes nasty.","2019-02-04 05:15 +0000","450.0","7.0","0.015555555555555555","0.0","0.0","2.0","2.0","0.0","0.0","3.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092264281842827264","https://twitter.com/gwern/status/1092264281842827264","@androgynandre @legalinspire @eigenrobot @sonyaellenmann I have no idea what you guys are talking about but that phrase really makes the old cogitators spin: if signaling is about providing info about latent traits and humility is negation of signaling, then humbleness is about minimizing information gain & can be quantified in bits...","2019-02-04 03:31 +0000","655.0","19.0","0.02900763358778626","0.0","1.0","9.0","2.0","0.0","0.0","7.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092244260882210816","https://twitter.com/gwern/status/1092244260882210816","@PuriSid @debarghya_das Such is the topsy-turvy brave new world of web design, where often the most expensive and complex websites are the least readable, beautiful, loadable, or even functional.","2019-02-04 02:11 +0000","1387.0","17.0","0.012256669069935111","2.0","1.0","9.0","0.0","0.0","0.0","5.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092234736905404418","https://twitter.com/gwern/status/1092234736905404418","@joshu how much money are we talking here","2019-02-04 01:33 +0000","1493.0","16.0","0.010716677829872739","0.0","1.0","9.0","3.0","0.0","0.0","3.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092227967961493504","https://twitter.com/gwern/status/1092227967961493504","@vsync I do too, but maybe it could be better.","2019-02-04 01:06 +0000","1650.0","7.0","0.004242424242424243","0.0","1.0","3.0","3.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092221945427517440","https://twitter.com/gwern/status/1092221945427517440","What are the most useful or beautiful monochrome or grayscale websites you know? 
I'm looking for CSS/JS ideas to steal for https://t.co/LC5JQL86wv.","2019-02-04 00:42 +0000","112923.0","849.0","0.007518397492096384","2.0","30.0","114.0","102.0","272.0","0.0","325.0","4.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092186164910583808","https://twitter.com/gwern/status/1092186164910583808","@webdevMason Wittgenstein would approve of your use of 3 rhetorical questions in a row.","2019-02-03 22:20 +0000","6549.0","81.0","0.012368300503893724","1.0","0.0","51.0","18.0","0.0","0.0","10.0","1.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092177902593064961","https://twitter.com/gwern/status/1092177902593064961","@LTF_01 Yes. It'll be scanned whenever I read it.","2019-02-03 21:47 +0000","198.0","1.0","0.005050505050505051","0.0","0.0","1.0","0.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092148805108199424","https://twitter.com/gwern/status/1092148805108199424","@karpathy @michael_nielsen @pfau (I find this paper exciting because it suggests that the really dismal results to date for cartoon/anime tagging & generation can be fixed by first training a tagger to get a good Gram style matrix, and then doing style transfer data augmentation in all the other tasks.)","2019-02-03 19:52 +0000","1861.0","7.0","0.0037614185921547557","0.0","0.0","2.0","1.0","0.0","0.0","4.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092145395650711553","https://twitter.com/gwern/status/1092145395650711553","@karpathy @michael_nielsen @pfau I think I've mentioned https://t.co/wfQS6H3a5V before, which uses style transfer to reduce texture cheating.","2019-02-03 19:38 +0000","2333.0","50.0","0.021431633090441493","0.0","1.0","11.0","1.0","34.0","0.0","3.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092125960600776704","https://twitter.com/gwern/status/1092125960600776704","_The Funding of Scientific Racism: Wickliffe Draper and the Pioneer Fund_, Tucker 2002: https://t.co/NXzCb4pLtk https://t.co/0TLmvWPiq6 https://t.co/MExKvzXmon","2019-02-03 18:21 +0000","14028.0","43.0","0.003065297975477616","0.0","2.0","4.0","3.0","16.0","0.0","18.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092106939402805250","https://twitter.com/gwern/status/1092106939402805250","@cryptodavidw Impagliazzo's possible worlds might be relevant? https://t.co/K4x9Otra42","2019-02-03 17:05 +0000","706.0","10.0","0.014164305949008499","0.0","0.0","1.0","0.0","9.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092083755819433985","https://twitter.com/gwern/status/1092083755819433985","@TimBeiko @patrick_oshag @slatestarcodex It's sort of a short version of https://t.co/dhsBk2qjeP","2019-02-03 15:33 +0000","545.0","42.0","0.07706422018348624","0.0","1.0","1.0","2.0","27.0","0.0","11.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092082918476992512","https://twitter.com/gwern/status/1092082918476992512","@pnin1957 @KirkegaardEmil I think in practice this is offset because the sibling cohorts are often much better tested than the mass samples like UKBB, so it's swings and roundabouts.","2019-02-03 15:30 +0000","885.0","10.0","0.011299435028248588","0.0","1.0","0.0","1.0","0.0","0.0","8.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1092080957614686208","https://twitter.com/gwern/status/1092080957614686208","@23andMe","2019-02-03 15:22 +0000","7118.0","79.0","0.011098623208766508","0.0","0.0","0.0","9.0","0.0","0.0","69.0","1.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1091888706171092992","https://twitter.com/gwern/status/1091888706171092992","@EdgeEmpress @ATabarrok But think about how much work those yachts and airplanes provide for poor deserving glaziers.","2019-02-03 02:38 +0000","312.0","4.0","0.01282051282051282","0.0","0.0","1.0","0.0","0.0","0.0","3.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1091844981852131329","https://twitter.com/gwern/status/1091844981852131329","@daniel_a_blank @Ethan_Heilman Some thoughts of my own why WP cannibalizes other wikis (when it permits itself): https://t.co/epfPl5Sw2m My early attempt to sound the alarm on the editor retention crisis: https://t.co/QofXv5UJaG And my WP editing background (back to ~2004): https://t.co/eOcbUsywPC","2019-02-02 23:45 +0000","416.0","10.0","0.02403846153846154","0.0","0.0","2.0","1.0","4.0","0.0","3.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1091843944462340096","https://twitter.com/gwern/status/1091843944462340096","@daniel_a_blank @Ethan_Heilman Not sure Wikipedia is a case-study of that. WP was never supposed to exist, it was just drafts for Nupedia. & in the only major strategic decision they (esp Wales) faced (the editor-retention crisis), they denied it for years in favor of prestige projects like 'Wikipedia on DVD'.","2019-02-02 23:40 +0000","427.0","4.0","0.00936768149882904","0.0","1.0","2.0","0.0","0.0","0.0","1.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1091703860731760641","https://twitter.com/gwern/status/1091703860731760641","@ATabarrok 'None'? 
One billionaire famously just dropped ~$0.5b on a single fake painting, and he's barely halfway through his life expectancy.","2019-02-02 14:24 +0000","2504.0","50.0","0.019968051118210862","1.0","4.0","14.0","7.0","0.0","0.0","24.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1091525269725372416","https://twitter.com/gwern/status/1091525269725372416","@BTC_kahir @elisehamdon @VitalikButerin @zooko I'm waiting for the 'distracted boyfriend' meme version.","2019-02-02 02:34 +0000","6538.0","32.0","0.004894463138574488","1.0","1.0","4.0","7.0","0.0","0.0","19.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1091515788538851328","https://twitter.com/gwern/status/1091515788538851328","@artonymousart Didn't I see a preview for this game on Steam?","2019-02-02 01:56 +0000","688.0","8.0","0.011627906976744186","0.0","0.0","2.0","2.0","0.0","0.0","4.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1091460642643603458","https://twitter.com/gwern/status/1091460642643603458","@backus I was reading this slideshow and it mentions SomethingAwful's ""Portal of Evil"" which sounds basically like a catalogue/summary of subcultures. Might be worth looking through if the forum archives are still around. https://t.co/EgAc5EYeBX","2019-02-01 22:17 +0000","508.0","1.0","0.001968503937007874","0.0","0.0","0.0","1.0","0.0","0.0","0.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1091390572517036033","https://twitter.com/gwern/status/1091390572517036033","@cjsotomatic Sounds about right. Individual differences & behavioral genetics (https://t.co/Te7Owi6Idy) have always replicated much better than certain other psychology fields. 
Or as Bouchard put it: https://t.co/eHel8gcR6F 'See, see, what happens when psychologists study *real* variables?'","2019-02-01 17:39 +0000","9900.0","143.0","0.014444444444444444","7.0","1.0","25.0","27.0","63.0","0.0","11.0","9.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-" "1091173097317851136","https://twitter.com/gwern/status/1091173097317851136","@pixmaven @JanelleCShane Is that a beer bottle on top?","2019-02-01 03:15 +0000","240.0","5.0","0.020833333333333332","0.0","1.0","0.0","2.0","0.0","0.0","2.0","0.0","0","0","0","0","0","0","0","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-","-"