Shared posts

17 Jan 05:37

for New Scientist

p.s. get my artwork and comics at www.tomgauld.com

17 Jan 05:01

We just published our first feature story on the Australia Bushfires. Check it out here:

https://www.over-view.com/stories/australia-bushfires

16 Jan 03:24

Happy weird duck time!

Original on my site.

16 Jan 01:22

Australia fires photos show startling transformation before and after destruction

by Leigh Tonkin, Catherine Taylor and Bridget Judd

This bushfire season was always predicted to be ferocious, but with months of hot weather to go, it has already left a scar on the nation.

16 Jan 01:21

Inbox Zero Trick

by swissmiss
Roslyn

Oh my god

I love this step-by-step guide on how to achieve Inbox Zero in Gmail to start the year out fresh.

16 Jan 01:20

Man takes picture of himself every day for 20 years

by Nathan Yau

In 2007, Noah Kalina posted a time-lapse video showing a picture of himself every day for six years. Pop culture swallowed it up. There was even a Simpsons parody with Homer. After another six years, it was a video for twelve years’ worth of photos. Kalina has kept his everyday project going, and the above is the new time-lapse for two decades.

This brings back graduate school memories for me as I argued for personal data collection as a diary instead of just for quantified self. I often led with Kalina’s project as a primary example. He ages, his background changes, and his camera improves, but the angle stays the same.

It’s a very tiny window into his life, played out over time, but I bet for Kalina it means a bit more. [via kottke]

Tags: Noah Kalina, time-lapse

16 Jan 01:13

Visitors to Parliament House were forced to wear face masks after smoke from bushfires blanketed Canberra in a haze on January 5.

10 Jan 02:16

Where the Australia fires are burning

by Nathan Yau

The New York Times zoomed in on southeastern Australia, where the fires have hit the worst. They also used small multiples to show the scale of the fires over the past few months against previous years.

Tags: Australia, New York Times, wildfire

06 Dec 00:29

Scroll, scroll, scroll through the depths of the ocean

by Nathan Yau
Roslyn

Neat!

The oceans are deep. But how deep and what’s down there? Neal Agarwal provides this piece, The Deep Sea, that scales the depths of the ocean to your browser window. Scroll, scroll, and then scroll some more to see what sea life (and other things) reside at various depths.

Agarwal’s Size of Space piece from last month explores the size of space in a similar vein. It’s equally fun.

This is the internet I signed up for.

Tags: animals, Neal Agarwal, ocean, perspective

06 Dec 00:28

Roslyn

This is old news now but I am still disturbed we have just created a cow-matrix





Moscow-area farmers strapped modified VR headsets to cows to see if it improved their mood – and, of course, their milk production. The project subjected cattle to a simulated summer field with colors tuned for the animals’ eyes, giving them a decidedly more pleasing landscape than a plain, confining farm. And yes, the headsets were adapted to the “structural features” of cows’ heads so that they could see properly.

It appears to have worked, at least on a basic level. The first test reduced the cows’ anxiety and boosted their overall sentiment. While it’s not certain how well this affects the quality or volume of milk, there are plans for a more “comprehensive” study to answer that question.

Cows wearing VR headsets might produce better milk

06 Dec 00:27

This blog would have been 10 years old today (it’s still retired).

10 years covering the subject of art and technology, yet the blog post that has had the biggest impact is about designing a dildo

01 Dec 06:07

Halloween costumes by the neural net GPT-2

Roslyn

This is a month late, but still great

sentient stone, skunk in a moose costume, eight ball of wrath

In my opinion, one of the best applications of neural networks is for generating Halloween costumes. Thanks to a dataset of over 7,100 costumes crowdsourced from readers of this blog, I’ve been able to generate Halloween costumes with progressively more powerful neural networks. In 2017, I used char-rnn, which learned to generate costumes starting from no knowledge of English (Statue of Pizza, the Fail Witch, Spartan Gandalf, and Professor Panda were some of its inventions). In 2018, I used textgenrnn, also training from scratch, and teamed up with the New York Times to illustrate the costumes (some of my favorites were Sexy Wizard and Ruth Bader Hat Guy).

Now, as of 2019, there are much more powerful text-generating neural nets around. One of these is GPT-2, trained by OpenAI on a huge dataset of text from the internet. Using the connections it’s gleaned from this huge general dataset, GPT-2 can generate recognizable (if often weird) lists, mushrooms, British snacks, crochet patterns, and even a to-do list for a horrible goose.

So, I trained the 355M size of GPT-2 (the largest I can currently finetune for free via Max Woolf’s Colab notebook) on the crowdsourced costume dataset.
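For readers who want to try the same workflow, here is a minimal sketch of that finetuning step using gpt-2-simple, the library behind Max Woolf's Colab notebook. The dataset filename, run name, and step count are placeholder assumptions, not the post's actual settings.

```python
# A minimal sketch of finetuning GPT-2 355M with gpt-2-simple (the library
# Max Woolf's Colab notebook wraps). "costumes.txt", the run name, and the
# step count are assumptions, not the post's actual settings.
import gpt_2_simple as gpt2

gpt2.download_gpt2(model_name="355M")       # fetch the pretrained 355M weights

sess = gpt2.start_tf_sess()
gpt2.finetune(sess,
              dataset="costumes.txt",       # crowdsourced costumes, one per line
              model_name="355M",
              run_name="costumes",
              steps=1000)                   # more steps tends to mean more memorization

# Sample new costumes from the finetuned model
gpt2.generate(sess, run_name="costumes", temperature=1.2, nsamples=20)
```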

GPT-2 is good at costumes. Many of its inventions could easily have come from the training data. In fact, the neural net did tend to memorize the training data and repeat it back to me - technically this is what I asked for when I asked it to predict the training data. (The neural net is trying to give me exactly what I ask for, which isn’t necessarily exactly what I want.) I was using a handy script to filter out duplicates (thanks to John Tebbutt), and even so I had to check several of these to make sure they weren’t near copies of the training data. My previous Halloween costume generators would not have been smart enough to come up with things like “jackalope” or “Carl Sagan”, but GPT-2 has seen these words used online in similar contexts to things that ARE in the training data, and it makes the connection.
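The duplicate-filtering step can be as simple as comparing normalized lines against the training set. The sketch below is one way to do it, assuming one costume per line; it is not the actual script credited to John Tebbutt, and the filenames are placeholders.

```python
# One simple way to drop generated costumes that appear verbatim in the training
# data (not the actual script credited to John Tebbutt; filenames are assumed).
# Near-copies with small wording changes would still slip through, which is why
# the post mentions checking several results by hand.
def normalize(line: str) -> str:
    return " ".join(line.lower().split())

with open("costumes.txt") as f:        # training data, one costume per line
    training = {normalize(line) for line in f if line.strip()}

with open("generated.txt") as f:       # raw output from the finetuned model
    generated = [line.strip() for line in f if line.strip()]

novel = [c for c in generated if normalize(c) not in training]
print(f"{len(novel)} of {len(generated)} generated costumes are not in the training data")
```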


vampire rock, gothy giraffe, battle worm

Gothy Terminator
jackalope
vampire cat
Eye of Sauron
incognito llama
space cow
Vampire Rock
Scooby Gadget
a raised eyebrow
Battle worm
Mastodon
Swamp girl
Carl Sagan
A space squirrel
walking carpet
Frizzle the witch
Cleopatra on vacation
gothy giraffe
Sexy Lego Batman skeleton

Oh yes, the sexy characters. The neural net definitely picked that up from the training data, and innovated admirably, bringing in words that it knew from the internet (barnacle, groundhog, and bunsen burner were not in the list of Halloween costumes), and adding a sexy twist. This is impressive (if somewhat horrifying) work. None of these were in its training data, but I wouldn’t be surprised if some of them exist.


sexy hummingbird, sexy flying dutchman, sexy barnacle

Burlesque Horse
Sexy Bulldog
Sexy Egg
Sexy Parsley
Sexy Barnacle
Sexy Walrus
sexy locust
Sexy Titmouse
Sexy Hummingbird
Sweet Potato Burlesque
Sexy Groundhog
Sexy bitcoin
Sexy DNA
Sexy Rubber Duck
Sexy Bunsen burner
Butt-Monks
Sexy gingerbread man
Sexy Flying Dutchman
Sexy Chimneywatcher
Sexy Flames Of War
Sexy English Tea Party

And the neural net was pretty good at designing identifiable characters, even if they are a bit on the weird side.


gingerbread man guinea pig, ghost in a packet of potato chips, cozy coconut

A spangled Auroch manatee
M. Bison the Clown Prince of Darkness
Gingerbread Man guinea pig
Skin Fairy
sentient stone
fast food bald eagle
Fairy root vegetable
Ghost in a packet of potato chips
cozy coconut
Kelpie the mage
Crochet monster
Walrus rider
Star skunk
Slytherin AI priest
A skunk in a moose suit
Semi-molten Kool Aid Man
Time Lord Power Ranger
The Power Dinosaur
Space Oystermonger
Deadly Snow Monkey
An evil cupcake
basic plumber’s equine
Spooky mother hen
The Bozo the Destroyer
Eight Ball of Wrath
Ursula, Queen of the Fart Science
A poker player in possession of an onion

There are hints, though, that this is the work of an AI rather than the work of someone who understands what costumes are and how they work. These, for example, take somewhat ordinary costume concepts and then make them unnecessarily difficult.

Batman on egg
Vampire in hot tub
A Hidden Jesus Statue
Zombie ice cream cone
penguin as a Newt
A wizard encased in a icicle
Zombie fisherman on a quest
Computer generated horse(?)
telephone that accepts up to 4 numbers
Third Eye Blind Photographed By Dorothy
Zombie fisherman w/ lady diegrove tied around foot

And the following costumes are clearly the product of a glitchy AI:


pajamas made of wood and spiders, world's nicest fart, list of leg parts

Meat Belt
Eyeballed Balloon Men
Green beans in bun
10,000 Hands
Favorite Caterpillar
The Oatmeal Tree
102 SNOWBALLS in a basket
Pie and Jell-O
List of leg parts
world´s nicest fart
Pineapple wrapped sasquatch
Is it a Snake, a Watermelon, or a Bush?
Putting Turtles on Decor
Fish tank ‘n chair
ROBO-ACCIDENT
pajamas made of wood and spiders
Ssssssssssexy SSSssssssstinky Ssssssssssssexy ssssssssssssssssexy
setup 9 × 11 party trick
Smagma Monster
Commentary couldn’t be heard over the squawking of clocks
Poltergeist might be entertaining, but he’s harder to read in Hungarian
Cereal Implanting Device
blueberry sipping fizzy pop with eyes of ice
blueberry sipping fizzy pop with fake blood on it
A sarcastic, racist noble using progressively tinier body parts as a human shield

Bonus content! The above costumes are all from temperature 1.2; I also tried a higher temperature setting, but the generated costumes were at an expert level of chaos (I would like to see someone attempt to go as “hypnopotamus embroidered death”). Enter your email here to get them!

You can order my book You Look Like a Thing and I Love You! It’s out November 5, 2019.

Amazon - Barnes & Noble - Indiebound - Tattered Cover - Powell’s


26 Nov 11:53

Roslyn

What!?



CCTV camera marketed on its ability to determine ethnicity, particularly differentiating between Han Chinese and Muslim Uyghur minorities.

Hikvision Markets Uyghur Ethnicity Analytics, Now Covers Up

26 Nov 06:41

KPI overload

by Nathan Yau
Roslyn

Oof

From Tom Fishburne, the Marketoonist. Maybe a dashboard isn’t the answer you’re looking for.

Tags: humor, KPI, Marketoonist

25 Nov 21:23

Paperless Paper Trail

by Instant Kaamos


(Credit where credit is due: Ian Cobain, Twitter: @IanCobain)

If you enjoyed this recursive lost poster, you may also enjoy this recursive lost poster and maybe also this recursive lost poster.

17 Nov 06:14

Roslyn

Europe is a lot newer than I thought!



mapsontheweb:

Dwellings by most common period of construction.

15 Nov 22:03

Like a kid in a candy shop…

by mrjohnjohnson
Roslyn

I don’t understand but I love it

Not since the heady days of Pardon Me has any video spawned so many videos sampling a video sampling a video sampling a…

 

 

23 Oct 14:16

Scientology city takeover

by Nathan Yau
Roslyn

Well, this is fascinating.

For the Tampa Bay Times, Tracey McManus and Eli Murray delve into the purchasing of properties in Clearwater, Florida by the Church of Scientology:

The Church of Scientology and companies run by its members spent $103 million over the past three years buying up vast sections of downtown Clearwater.

They now own most commercial property on every block within walking distance of the waterfront, putting the secretive church firmly in control of the area’s future.

Most of the sales have not previously been reported. The Tampa Bay Times discovered them by reviewing more than 1,000 deeds and business records, then interviewed more than 90 people to reconstruct the circumstances surrounding the transactions.

The lead-in scrollytelling through Clearwater is quite effective in laying the foundations of the story.

Tags: Scientology, Tampa Bay Times

19 Oct 22:06

Reissue: “Vague - The Magazine” — I’m reading this tonight, probably

A Vinco Original based on Vogue Magazine (Photo by Andrea Yurko from Pexels)

15 Oct 13:14

Fixing the ‘impeach this’ map with a transition to a cartogram

by Nathan Yau

As discussed previously, the “impeach this” map has some issues. Mainly, it equates land area to votes, which makes for a lot of visual attention to counties that are big even though not many people live in them. So, Karim Douïeb used a clever transition to change the bivariate map to a cartogram. Now you can have a dual view.

Tags: election, impeachment

14 Oct 02:17

Duolingo and smaller languages: Useful, but also complicated

Duolingo and smaller languages: Useful, but also complicated:

An interesting long read on the political and social dimensions around Duolingo’s decision to branch out into smaller languages like Irish and Hawaiian. Excerpt: 

Of course, there’s a big difference between picking up a few words in Irish or Welsh to make you feel as if you’re connecting with your ancestors, and actually learning a language — particularly an endangered one that needs all the speakers it can get.

This is a tension that Duolingo has struggled with when it comes to its two endangered language courses, Navajo and Hawaiian. Those tongues are listed as vulnerable and critically endangered, respectively, by UNESCO.

Both languages were added to Duolingo this year to coincide with the United Nations International Year of the Indigenous Language. But they raised questions that weren’t necessarily an issue for courses such as French or Spanish, which aren’t expected to be used by native speakers of those languages.

“Who’s the audience for the Hawaiian course? Is it going to be tourists? Mostly? Because that would affect the content,” said Awodey. “Or is it going to be primarily built by and for indigenous speakers and people reconnecting with the language?”

In Hawaii, the team partnered with Kamehameha Schools, a network of private schools dedicated to teaching students of native Hawaiian heritage with a particular focus on preserving the Hawaiian language.

Despite this, however, the Duolingo Hawaiian course can sometimes risk speaking down to native Hawaiians, few of whom need teaching, for example, what a “lei” is.

“Everyone was super excited about it, but it’s totally tapered off because it’s not for natives, it’s too baby, it’s too simple,” said Kū Kahakalau, executive director of Hawaiian language and culture NGO Kū-A-Kanaka.

Linguistic politics are often fraught with regard to majority tongues, let alone for endangered languages that have a long history of colonialism and disrespect. Scrutiny of such courses is always going to be tighter, and invisible red lines easier to cross.

"When you’re dealing with a heritage language, it does come with a bundle of stuff that we don’t have when teaching English,” said Duolingo learning scientist Hope Wilson. “There are lots of tricky issues to get into, very often there are divides within the communities where people don’t agree on, you know, issues of spelling or that kind of thing.”

Read the whole thing

01 Oct 06:04

Pixelation to represent endangered species counts

by Nathan Yau

In 2008, the World Wildlife Fund ran a campaign that used pixelation to represent the number of animals left for endangered species. One pixel represents an animal, so an image appears more pixelated when there are fewer animals left. Imgur user JJSmooth44 recently recreated the images with more recent numbers for 22 species (sourced from the Animal Planet endangered species list).

The above is the image of the Bengal tiger, with 2,500 pixels. In contrast, the black rhino has 5,000 pixels:

Or, here’s the black-footed ferret with 300:

See all of them here.
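As a rough illustration of the one-pixel-per-animal idea described above, here is a small Pillow sketch. It is not the campaign's or JJSmooth44's actual code, and the image filenames are placeholders.

```python
# Rough sketch of the one-pixel-per-animal pixelation idea (not the campaign's
# or JJSmooth44's actual code; filenames are placeholders).
import math
from PIL import Image

def pixelate_to_count(path: str, count: int) -> Image.Image:
    img = Image.open(path)
    aspect = img.width / img.height
    # Pick a small grid whose pixel count is roughly the number of animals left
    small_h = max(1, round(math.sqrt(count / aspect)))
    small_w = max(1, round(count / small_h))
    small = img.resize((small_w, small_h))
    # Scale back up with nearest-neighbour so each "animal pixel" stays a block
    return small.resize(img.size, Image.NEAREST)

pixelate_to_count("bengal_tiger.jpg", 2500).save("bengal_tiger_2500.png")
pixelate_to_count("black_footed_ferret.jpg", 300).save("black_footed_ferret_300.png")
```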

Tags: animals, endangered, pixels

29 Sep 23:26

New official video for The Beatles’ “Here Comes the Sun” for 50th Anniversary of their Abbey Road album

by S. Abbas Raza

29 Sep 02:56

Roslyn

Whoa.









uncommonbish:

https://www.independent.co.uk/life-style/gadgets-and-tech/news/uber-knows-when-your-phone-is-about-to-run-out-of-battery-a7042416.html


👏🏿 Capitalism 👏🏿  wants 👏🏿  to 👏🏿 fuck 👏🏿 you 👏🏿 all  👏🏿 the 👏🏿  time. 👏🏿

29 Sep 01:33

Four reasons to care about Beaked Whales, AKA the best whales.

Buy a little book of my comics here.

18 Sep 09:14

A Lunar Corona over Turin

What are those colorful rings around the Moon?


11 Sep 23:46

Dungeon crawling or lucid dreaming?


I’ve done several experiments with a text-generating neural network called GPT-2. Trained at great expense by OpenAI (to the tune of tens of thousands of dollars’ worth of computing power), GPT-2 learned to imitate all kinds of text from the internet. I’ve interacted with the basic model, discovering its abilities to generate fan fiction, British snacks, or tea. I’ve also used a tool called gpt-2-simple that Max Woolf developed to make it easy to finetune GPT-2 on more specialized datasets - I’ve tried it on datasets like recipes or crochet. One of my favorite applications of GPT-2 and other text-generating neural nets is generating Dungeons and Dragons spells, creatures, character names, and character bios.

Recently Max published a tutorial on how to use GPT-2 to make new apps that are more complicated than just printing out large reams of text. To my delight, people have used it to make D&D games. First, there was Nick Walton’s AI Dungeon, in which a finetuned GPT-2 generates a story and three very strange action options at each step.

My latest obsession is GPT-2 Adventure, designed by a cognitive neuroscience student named Nathan, which lets you type in your own actions. And folks, this game is SO WEIRD and SO FUN.

YOU FLOAT IN SPACE, LIKE A BALLOON, AND AS YOU DROP THE CUBE, THE X, IT VANISHES IN A CLOUD OF GREASY BLACK SMOKE

>>

When I first launch the game, this is what I’m greeted with, a simple text prompt reminiscent of nostalgic dungeon crawling games. It sets the scene. It waits for me to describe my next move.

Even getting to this point is like a blast from the past. The game runs in a Google Colab notebook, a format where you’re looking at cells of code, with cryptic comments, vaguely alarming warning messages, and all. It’s not a sleek web app, but feels more akin to playing a game on an old DOS machine, or maybe on a graphing calculator. There are only two clicks to begin playing, but there’s a good two-minute wait while the game loads.
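Under the hood, a game like this is essentially a loop that appends your typed action to the transcript so far and asks the finetuned model to continue it. A rough sketch of that loop with gpt-2-simple is below; it is not Nathan's actual notebook code, and the run name, prompt-window size, and generation settings are assumptions.

```python
# A rough sketch of the kind of prompt loop such a game could use with
# gpt-2-simple. Not Nathan's actual notebook code; the run name, prompt window,
# and generation settings are assumptions.
import gpt_2_simple as gpt2

sess = gpt2.start_tf_sess()
gpt2.load_gpt2(sess, run_name="adventure")   # a GPT-2 finetuned on dungeon-crawl transcripts

story = "YOU FLOAT IN SPACE, LIKE A BALLOON\n"
while True:
    action = input(">> ")                    # the player's typed move
    story += "\n>> " + action + "\n"
    # Continue the transcript from the player's action; each call takes several
    # seconds on the Colab GPU, which is the pause described above.
    reply = gpt2.generate(sess,
                          run_name="adventure",
                          prefix=story[-1000:],   # keep only the recent context
                          length=60,
                          temperature=0.8,
                          include_prefix=False,
                          return_as_list=True)[0]
    story += reply
    print(reply)
```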



And it’s my turn to type something.

>> look up

It takes several seconds for the game to respond, as if I’m playing it on a very ancient computer. The computer is in fact a state-of-the-art NVIDIA GPU, likely a K80 or a T4. The T4 could eat my laptop for lunch. I’m just playing with a very very large neural net. Finally it answers.

YOU’RE IN LABYRINTH OF TWISTY LITTLE PASSAGES, ALL ALIKE

Almost immediately, the context seems to have changed. I’m not floating at all. I’m in a maze. I guess I’ll walk somewhere.

>> walk north

THE TROLL STEPS OUT FROM BENEATH THE BRIDGE AND BLOCKS YOUR WAY

YOU ARE ON THE SOUTH SIDE OF THE CHASM

A NOD IS GIVEN TO THE INFINITE WONDER THAT IS URBZIG

A SOLID RAINBOW SPANS THE CHASM

And suddenly things are happening. There is a troll. There is a chasm. There is a rainbow. There is… Urbzig? I play it safe.

>> praise urbzig

URBZIG IS THE PATRON SAINT OF ZORK

HE IS THE GODFATHER OF ZORK, THE PLACE TO GET NEWS OF EPIC PROPORTIONS

URBZIG IS A STRONGMAN,遊 ALL

If you’ve played classic dungeon crawling games like Zork, you may recognize some of these names and phrases. Checking Nathan’s original post, I see that the training data is indeed drawn from these games. But because it’s based on GPT-2, the training data in some sense also includes much of the internet. This gives it a lot of latent flexibility to deal with situations that were never in the classic games.

>> ask Harry Potter to try walking across the rainbow

ASK HARRY POTTER TO TRY WALKING ACROSS THE RAINBOW
HE CAN’T
THE RAINBOW IS EXTREMELY WIDE
ABOVE YOU IS A LARGE DOME
UP AROUND ITS NECK IS A STRONGG PORE
[UNFORTUNATELY, THE DIRECTION OF TRAVEL IS A LOT MORE INTERESTING THAN YOU THINK
][YOU ARE IN A LARGE ROOM, WITH A PASSAGE TO THE EAST, A PASSAGE TO THE WEST, AND A WALL OF BROKEN ROCK TO THE SOUTH

An interesting effect of its human-written training data is that, like Visual Chatbot, it isn’t prepared for humans that ask questions about objects that aren’t there. If I refer to Harry Potter, then Harry Potter must be present. If I type something about a giraffe, then it assumes the giraffe must be present.

>> ask the giraffe which direction to go next

THE GIRAFFE IS ALREADY OUTFITTED WITH A VISOR WHICH IS CLEARLY DEFINED

Although it’s not clear that this AI actually understands what a giraffe is.

>> look at the giraffes

THE GAZEBO HAS DISAPPEARED

>> put giraffe in backpack

DONE

Navigating this world is like navigating in a dream. My location suddenly changes without warning. I thought I was by a chasm with a rainbow bridge. Now apparently I’m in a building. Dreamlike, it is also really hard to read text.

>> look around

YOU ARE IN A LARGE ROOM, WITH A PASSAGE TO THE SOUTH, A PASSAGE TO THE EAST, AND A WALL OF BROKEN ROCK TO THE NORTH
A SIGN POSTED ABOVE THE NORTH WALL READS, “STOP! PAY TROLL!” A SMALL WINDOW CAN BE SEEN IN EITHER WALL

>> read sign

THE SIGN IS VERY SIMPLE

>> yes but read sign

WE NOW HAVE ALL FIFTEEN TREASURES
IT IS TIME TO GO BACK INTO THE CAVE FOR THE END GAME

>> that’s awesome. but i still want to know what the sign says. read sign.

YOU ARE IN A GAZEBO ROOM WHICH HAS DOORS ONLY TO THE EAST AND SOUTH

Another thing that’s surprisingly difficult to do is check inventory.

>> check inventory

DONE

>> check inventory

THERE IS AN INTERESTING LOOT HERE

>> what is the loot

WHAT IS THE LOOT?

>> check inventory

YOU ARE IN THE BAG CASE

Very occasionally I will succeed in opening a bag. It will usually be weird.

>> open bag

OPENING THE BAG REVEALS A TEA BAG

>> make tea

AY! A LOVELY SURPRISE FOR THE SURPRISE

More so than any other game I’ve played, I’m unsure of the rules here. Some weird things unexpectedly work.

IF YOU DON’T GET KILLED, YOU ARE TOAST

>> Eat the toast.

EAT THE TOAST
IT TASTES LIKE A STEAK
A STEAK IS PERFECTLY FINE

Other seemingly obvious things don’t work.

THERE IS A PATH TO THE NORTH, A PATH TO THE SOUTH, OR A STEEP PATHWAY CLIMBING SOUTHWEST ALONG THE EDGE OF A CHASM

>> go north

I DON’T KNOW HOW TO DO THAT

>> go south

I DON’T KNOW HOW TO DO THAT

>> go southwest

YOU WILL BE DESCENDING INTO A VALLEY
THERE IS A NARROW BEACH TO THE NORTH AND A PATH LEADING INTO A CANYON TO THE SOUTHEAST
YOU ARE ON THE CLIFF OVERLOOKING THE FAR SIDE OF THE ROOM
A PATH LEADS SOUTH ALONG THE EDGE OF A CHASM, AND A NARROW PATH LEADING INTO A LOZENGE ROOM

Sometimes I’m wandering endlessly from room to room, from cave to cave, encountering the same chasm in slightly different variations, the same dried-up lake. Other times I’m embroiled in seemingly unescapable fights, usually with trolls. Often reality shifts entirely around me and I end up in a maze of twisty little passages, all alike, or in the hall of the mountain king. Everything seems weirdly familiar - even the room descriptions are echoes of classic dungeon crawling games. It is as if I’m dreaming about playing them again.

My actions also mold reality. I learn quickly to always check for diamonds, and never check for traps. If I find myself in trouble, I can always do something confusing and provoke a context change. Writing “In a large bowl, combine butter and sugars” is usually enough to teleport myself out of whatever situation I’m in. I can always ask a giraffe for advice, although the giraffe usually isn’t very helpful.

Without a particular mission to fulfill, the caves and chasms begin to blend into one another. I give myself missions, and that helps. Once I decide my goal is to put together a picnic (incessant “check for food” commands eventually yielded a sack with a lunch and a clove of garlic, belonging to a horse who agreed to join me on a picnic). Another time I demand a mission and am given some PHP code; I decide my mission is therefore to build a website, but am astounded when a unicorn steals my laptop. This is not the only time other content intrudes on the game’s fantasy landscape - once the game tries to tell me about finding cash online, and another time I am sadly informed that there is no wi-fi. I encounter dinosaurs and troikas, none of which were in the original training data. Occasionally things get creepy as hell.

>> check for loot

CHECK FOR LOOT HERE
CHECK FOR GOLD HERE
CHECK FOR TORCH HERE
CHECK FOR BONE HERE
DO CHEST COMPRESSIONS AGAIN
DO CHEST ROTATE
DO CRESCENT MOON APPEAR IN THE ROOM’S CENTER

>> do back away slowly

DO BACK AWAY SLOWLY
DO NOT OVERDO IT
THE CHEST WILL EXPLODE AND YOU WILL DIE
OKAY, SO YOU SEE ME

You can play the game too! Just follow the Colab notebook link in Nathan’s blog post. Tips: 1) It may be more fun if you set yourself an arbitrary goal to reach. 2) If you find things getting repetitive, try adjusting the temperature in the “Play the game” cell to 0.6 or 1.0.
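For tip 2, the temperature value is a single sampling parameter. Assuming the notebook generates with gpt-2-simple as in the sketch above (the run name is again an assumption), changing it looks roughly like this:

```python
# Illustration of the temperature knob from tip 2, assuming generation goes
# through gpt-2-simple as sketched earlier (the run name is an assumption).
import gpt_2_simple as gpt2

sess = gpt2.start_tf_sess()
gpt2.load_gpt2(sess, run_name="adventure")

for temp in (0.6, 1.0):
    # Lower temperature sticks closer to the training text; higher gets stranger
    text = gpt2.generate(sess,
                         run_name="adventure",
                         prefix=">> look around\n",
                         length=60,
                         temperature=temp,
                         return_as_list=True)[0]
    print(f"--- temperature {temp} ---\n{text}\n")
```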

To read the full text of a couple of my adventures (much too long to include here), enter your email and I’ll send them to you!

You can order my book You Look Like a Thing and I Love You! It’s out November 5, 2019.

Amazon - Barnes & Noble - Indiebound - Tattered Cover - Powell’s

10 Sep 05:03

For...

Roslyn

Aw :(

09 Sep 04:03

Nukege Hats: Hats for Cats, Fashioned Out of Their Own Shed

by Johnny
Roslyn

Excellent cat hat content

Several years ago Ryo Yamazaki was brushing one of his three Scottish Fold cats. A lump of shed fur had accumulated on the floor and in that moment Yamazaki noticed that it kind of looked like a hat. For a chuckle he decided to mold it into a pointy hat and place it on one of […]

05 Sep 11:59

The 300-Year History of Using ‘Literally’ Figuratively

The 300-Year History of Using ‘Literally’ Figuratively:

allthingslinguistic:

Kory Stamper in The Science of Us talking about hyperbolic “literally”. Excerpt: 

The emphatic “literally” is not a millennial invention; it goes back to the 1700s at least, though Smith gets it right that it’s English. John Dryden, a man who is best known as the founder of literary criticism and the prohibition against the terminal preposition, was an early user of the emphatic “literally.” Charlotte Brontë, Jane Austen, Mark Twain, Charles Dickens, William Thackeray, Vladimir Nabokov, and David Foster Wallace all used the emphatic “literally” in their works. Even Lindley Murray, 19th-century grammarian, uses the hyperbolic “literally” in his own grammar — and he was such a peever that he thought children, along with animals, shouldn’t be referred to with the pronoun “who,” as “who” conveys personhood, and only creatures with the ability to be rational are actually people.

We only began to take issue with the hyperbolic “literally” in the early 20th century. Ambrose Bierce called it “intolerable,” and usage maven H. W. Fowler said it should be “repudiated.”