I do not understand but I am a fan
We just published our first feature story on the Australia Bushfires. Check it out here:
Happy weird duck time!
This bushfire season was always predicted to be ferocious, but with months of hot weather to go, it has already left a scar on the nation.
Oh my god
I love this step-by-step guide on how to achieve Inbox Zero in Gmail and start the year fresh.
In 2007, Noah Kalina posted a time-lapse video showing a picture of himself every day for six years. Pop culture swallowed it up. There was even a Simpsons parody with Homer. Six years later, he posted a follow-up covering twelve years’ worth of photos. Kalina has kept his everyday project going, and the time-lapse above now spans two decades.
This brings back graduate school memories for me as I argued for personal data collection as a diary instead of just for quantified self. I often led with Kalina’s project as a primary example. He ages, his background changes, and his camera improves, but the angle stays the same.
It’s a very tiny window into his life, played out over time, but I bet for Kalina it means a bit more. [via kottke]
Visitors to Parliament House were forced to wear face masks after smoke from bushfires blanketed Canberra in a haze on January 5.
The oceans are deep. But how deep and what’s down there? Neal Agarwal provides this piece, The Deep Sea, that scales the depths of the ocean to your browser window. Scroll, scroll, and then scroll some more to see what sea life (and other things) reside at various depths.
Agarwal’s Size of Space piece from last month takes the same approach to the scale of the universe. It’s equally fun.
This is the internet I signed up for.
This is old news now but I am still disturbed we have just created a cow-matrix
Moscow-area farmers strapped modified VR headsets to cows to see if it improved their mood – and, of course, their milk production. The project subjected cattle to a simulated summer field with colors tuned for the animals’ eyes, giving them a decidedly more pleasing landscape than a plain, confining farm. And yes, the headsets were adapted to the “structural features” of cows’ heads so that they could see properly.
It appears to have worked, at least on a basic level. The first test reduced the cows’ anxiety and boosted their overall emotional state. While it’s not certain whether this affects the quality or volume of milk, there are plans for a more “comprehensive” study to answer that question.
This blog would have been 10 years old today (it’s still retired).
10 years covering the subject of art and technology, yet the blog post that has had the biggest impact is about designing a dildo …
This is a month late, but still great
In my opinion, one of the best applications of neural networks is for generating Halloween costumes. Thanks to a dataset of over 7,100 costumes crowdsourced from readers of this blog, I’ve been able to generate Halloween costumes with progressively more powerful neural networks. In 2017, I used char-rnn, which learned to generate costumes starting from no knowledge of English (Statue of Pizza, the Fail Witch, Spartan Gandalf, and Professor Panda were some of its inventions). In 2018, I used textgenrnn, also training from scratch, and teamed up with the New York Times to illustrate the costumes (some of my favorites were Sexy Wizard and Ruth Bader Hat Guy).
Now, as of 2019, there are much more powerful text-generating neural nets around. One of these is GPT-2, trained by OpenAI on a huge dataset of text from the internet. Using the connections it’s gleaned from this huge general dataset, GPT-2 can generate recognizable (if often weird) lists, mushrooms, British snacks, crochet patterns, and even a to-do list for a horrible goose.
So, I trained the 355M size of GPT-2 (the largest I can currently finetune for free via Max Woolf’s Colab notebook) on the costume dataset.
GPT-2 is good at costumes. Many of its inventions could easily have come from the training data. In fact, the neural net did tend to memorize the training data and repeat it back to me - technically this is what I asked for when I asked it to predict the training data. (The neural net is trying to give me exactly what I ask for, which isn’t necessarily exactly what I want.) I was using a handy script to filter out duplicates (thanks to John Tebbutt), and even so I had to check several of these to make sure they weren’t near copies of the training data. My previous Halloween costume generators would not have been smart enough to come up with things like “jackalope” or “Carl Sagan”, but GPT-2 has seen these words used online in similar contexts to things that ARE in the training data, and it makes the connection.
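A duplicate filter like the one mentioned can be sketched in a few lines of Python. This is not John Tebbutt’s actual script, just a minimal stand-in: it drops generated lines whose normalized text already appears in the training data.

```python
def normalize(line):
    """Lowercase and strip punctuation so near-identical lines compare equal."""
    cleaned = "".join(ch.lower() for ch in line if ch.isalnum() or ch.isspace())
    return tuple(cleaned.split())

def filter_memorized(generated, training):
    """Keep only generated lines that do not match any training line."""
    seen = {normalize(t) for t in training}
    return [g for g in generated if normalize(g) not in seen]
```

Exact normalized matching only catches verbatim memorization; near-copies with a word swapped would still need the manual checking described above.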
Eye of Sauron
a raised eyebrow
A space squirrel
Frizzle the witch
Cleopatra on vacation
Sexy Lego Batman skeleton
Oh yes, the sexy characters. The neural net definitely picked that up from the training data, and innovated admirably, bringing in words that it knew from the internet (barnacle, groundhog, and bunsen burner were not in the list of Halloween costumes), and adding a sexy twist. This is impressive (if somewhat horrifying) work. None of these were in its training data, but I wouldn’t be surprised if some of them exist.
Sweet Potato Burlesque
Sexy Rubber Duck
Sexy Bunsen burner
Sexy gingerbread man
Sexy Flying Dutchman
Sexy Flames Of War
Sexy English Tea Party
And the neural net was pretty good at designing identifiable characters, even if they are a bit on the weird side.
A spangled Auroch manatee
M. Bison the Clown Prince of Darkness
Gingerbread Man guinea pig
fast food bald eagle
Fairy root vegetable
Ghost in a packet of potato chips
Kelpie the mage
Slytherin AI priest
A skunk in a moose suit
Semi-molten Kool Aid Man
Time Lord Power Ranger
The Power Dinosaur
Deadly Snow Monkey
An evil cupcake
basic plumber’s equine
Spooky mother hen
The Bozo the Destroyer
Eight Ball of Wrath
Ursula, Queen of the Fart Science
A poker player in possession of an onion
There are hints, though, that this is the work of an AI rather than the work of someone who understands what costumes are and how they work. These, for example, take somewhat ordinary costume concepts and then make them unnecessarily difficult.
Batman on egg
Vampire in hot tub
A Hidden Jesus Statue
Zombie ice cream cone
penguin as a Newt
A wizard encased in a icicle
Zombie fisherman on a quest
Computer generated horse(?)
telephone that accepts up to 4 numbers
Third Eye Blind Photographed By Dorothy
Zombie fisherman w/ lady diegrove tied around foot
And the following costumes are clearly the product of a glitchy AI:
Eyeballed Balloon Men
Green beans in bun
The Oatmeal Tree
102 SNOWBALLS in a basket
Pie and Jell-O
List of leg parts
world´s nicest fart
Pineapple wrapped sasquatch
Is it a Snake, a Watermelon, or a Bush?
Putting Turtles on Decor
Fish tank ‘n chair
pajamas made of wood and spiders
Ssssssssssexy SSSssssssstinky Ssssssssssssexy ssssssssssssssssexy
setup 9 × 11 party trick
Commentary couldn’t be heard over the squawking of clocks
Poltergeist might be entertaining, but he’s harder to read in Hungarian
Cereal Implanting Device
blueberry sipping fizzy pop with eyes of ice
blueberry sipping fizzy pop with fake blood on it
A sarcastic, racist noble using progressively tinier body parts as a human shield
Bonus content! The above costumes are all from temperature 1.2; I also tried a higher temperature setting, but the generated costumes were at an expert level of chaos (I would like to see someone attempt to go as “hypnopotamus embroidered death”). Enter your email here to get them!
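The temperature knob behind that chaos is simple to state: logits are divided by the temperature before the softmax, so values above 1 flatten the distribution and make unlikely tokens (and weirder costumes) more probable. Here is a minimal, self-contained sketch of temperature sampling; the function is illustrative, not OpenAI’s code.

```python
import math
import random

def sample_with_temperature(logits, temperature=1.0, rng=None):
    """Sample an index from a list of logits after temperature scaling.

    temperature < 1 sharpens the distribution toward the argmax;
    temperature > 1 (like the 1.2 used above) flattens it, letting odder picks through.
    """
    rng = rng or random.Random(0)
    scaled = [l / temperature for l in logits]
    peak = max(scaled)                       # subtract the max for numeric stability
    exps = [math.exp(s - peak) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    r = rng.random()
    cumulative = 0.0
    for i, p in enumerate(probs):            # inverse-CDF sampling over the softmax
        cumulative += p
        if r <= cumulative:
            return i
    return len(probs) - 1
```

At temperature 0.01 this almost always returns the argmax; push it well above 1 and the choice drifts toward uniform, which is where the “expert level of chaos” comes from.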
You can order my book You Look Like a Thing and I Love You! It’s out November 5, 2019.
A CCTV camera marketed on its ability to determine ethnicity, particularly to differentiate between Han Chinese and the Muslim Uyghur minority.
Europe is a lot newer than I thought!
Dwellings by most common period of construction.
I don’t understand but I love it
Not since the heady days of Pardon Me has any video spawned so many videos sampling a video sampling a video sampling a…
Well, this is fascinating.
For the Tampa Bay Times, Tracey McManus and Eli Murray delve into the Church of Scientology’s purchase of properties in Clearwater, Florida:
The Church of Scientology and companies run by its members spent $103 million over the past three years buying up vast sections of downtown Clearwater.
They now own most commercial property on every block within walking distance of the waterfront, putting the secretive church firmly in control of the area’s future.
Most of the sales have not previously been reported. The Tampa Bay Times discovered them by reviewing more than 1,000 deeds and business records, then interviewed more than 90 people to reconstruct the circumstances surrounding the transactions.
The lead-in scrollytelling through Clearwater is quite effective in laying the foundations of the story.
Reissue: “Vague - The Magazine” — I’m reading this tonight, probably
A Vinco Original based on Vogue Magazine (Photo by Andrea Yurko from Pexels)
As discussed previously, the “impeach this” map has some issues. Mainly, it equates land area to votes, which makes for a lot of visual attention to counties that are big even though not many people live in them. So, Karim Douïeb used a clever transition to change the bivariate map to a cartogram. Now you can have a dual view.
An interesting long read on the political and social dimensions around Duolingo’s decision to branch out into smaller languages like Irish and Hawaiian. Excerpt:
Of course, there’s a big difference between picking up a few words in Irish or Welsh to make you feel as if you’re connecting with your ancestors, and actually learning a language — particularly an endangered one that needs all the speakers it can get.
This is a tension that Duolingo has struggled with when it comes to its two endangered language courses, Navajo and Hawaiian. Those tongues are listed as vulnerable and critically endangered, respectively, by UNESCO.
Both languages were added to Duolingo this year to coincide with the United Nations International Year of the Indigenous Language. But they raised questions that weren’t necessarily an issue for courses such as French or Spanish, which aren’t expected to be used by native speakers of those languages.
“Who’s the audience for the Hawaiian course? Is it going to be tourists? Mostly? Because that would affect the content,” said Awodey. “Or is it going to be primarily built by and for indigenous speakers and people reconnecting with the language?”
In Hawaii, the team partnered with Kamehameha Schools, a network of private schools dedicated to teaching students of native Hawaiian heritage with a particular focus on preserving the Hawaiian language.
Despite this, however, the Duolingo Hawaiian course can sometimes risk speaking down to native Hawaiians, few of whom need teaching, for example, what a “lei” is.
“Everyone was super excited about it, but it’s totally tapered off because it’s not for natives, it’s too baby, it’s too simple,” said Kū Kahakalau, executive director of Hawaiian language and culture NGO Kū-A-Kanaka.
Linguistic politics are often fraught with regard to majority tongues, let alone for endangered languages that have a long history of colonialism and disrespect. Scrutiny of such courses is always going to be tighter, and invisible red lines easier to cross.
"When you’re dealing with a heritage language, it does come with a bundle of stuff that we don’t have when teaching English,” said Duolingo learning scientist Hope Wilson. “There are lots of tricky issues to get into, very often there are divides within the communities where people don’t agree on, you know, issues of spelling or that kind of thing.”
In 2008, the World Wildlife Fund ran a campaign that used pixelation to represent the number of animals left for endangered species. One pixel represents an animal, so an image appears more pixelated when there are fewer animals left. Imgur user JJSmooth44 recently used more recent numbers to show the images for 22 species (sourced from the Animal Planet endangered species list).
The image above is the Bengal tiger, rendered with 2,500 pixels. In contrast, the black rhino gets 5,000 pixels:
Or, here’s the black-footed ferret with 300:
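The arithmetic behind the campaign is straightforward: pick a pixel grid whose area matches the animal count, downscale the photo to that grid, then upscale it back so each surviving animal becomes one visible block. A minimal sketch of the grid calculation (the function name and aspect-ratio parameter are mine, not WWF’s):

```python
import math

def pixel_grid(animal_count, aspect_ratio=1.5):
    """Return (cols, rows) whose product approximates animal_count,
    with cols/rows near the photo's width/height ratio."""
    rows = max(1, round(math.sqrt(animal_count / aspect_ratio)))
    cols = max(1, round(animal_count / rows))
    return cols, rows
```

For a square crop of the tiger photo, `pixel_grid(2500, 1.0)` gives a 50 × 50 grid; the 300 black-footed ferrets fit in roughly 21 × 14. With an imaging library such as Pillow, resizing down to `(cols, rows)` and back up with nearest-neighbor resampling produces the pixelated effect.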
New official video for The Beatles’ “Here Comes the Sun” for 50th Anniversary of their Abbey Road album
👏🏿 Capitalism 👏🏿 wants 👏🏿 to 👏🏿 fuck 👏🏿 you 👏🏿 all 👏🏿 the 👏🏿 time. 👏🏿
Four reasons to care about Beaked Whales, AKA the best whales.