Shared posts

26 Jul 13:57

Phoozy Smartphone Thermal Capsule

by Erin Carstens

Phoozy thermal capsules protect smartphones from physically destructive temperature swings similar to how Silent Pocket privacy cases protect them from personally destructive cellular, Bluetooth, WiFi, GPS, RFID, and NFC signals. Phoozy blocks out heat and cold; Silent Pockets block out skimmers and trackers. Both aim to keep your phone safer from the demons that hunt it.

Now all you have to worry about is all the demons that hunt all the rest of your stuff. Alexa! Order me a Buffybot!

In Phoozy's case, becoming a thermal capsule for your smartphone required some help from the astronauts. Its composition is an adaptation of spacesuits, made from the same materials that protect NASA's finest from extreme extraterrestrial environments. A combination of insulating, reflective, and impact-resistant layers both shields the phones it covers from the sun, high heat, and temperatures as cold as -20 degrees F, and provides military-grade protection from drops and water encounters. In fact, Phoozy compares its Impactor Core Layer to a life jacket - if you drop your Phoozy-cased phone into the water it will become a phloaty.

Additional Phoozy layers include a Chromium Shell that reflects 90+% of solar radiation, a SpaceTech Penetration Layer for toughness and insulation, and a Soft Touch Protective Liner that covers phone screens to guard against nicks and scratches.

Phoozy thermal capsules are available for iPhone 5 and up, Google Pixel and Pixel XL, Samsung Galaxy S, S8, and S8+, and LG, Motorola, Kyocera, HTC, and Microsoft smartphones.

25 Jul 16:40

What Are GitHub Pull Requests?

by Ed Wassermann
 

If you want to work with other developers, you'll need to understand Git—the most widely used system for version control and sharing code. 

In a series of quick Coffee Break Courses on Envato Tuts+, I explained some of the most important concepts in Git, such as branches, states, merging and rebasing, and more.

In this video from the series, you'll learn about GitHub pull requests. By the end of this animated lesson, you'll understand the workflow for forking, cloning, branching, and then sending a pull request to a project on GitHub.
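The lesson itself is a video, but the workflow it walks through boils down to a handful of commands. Here's a rough command-line sketch; the account, repo, and branch names (your-username, some-project, fix-typo) are placeholders, not anything from the course:

# 1. Fork the project on GitHub (the "Fork" button), then clone your fork:
git clone https://github.com/your-username/some-project.git
cd some-project

# 2. Create a branch for your change:
git checkout -b fix-typo

# 3. Commit the work and push the branch to your fork:
git commit -am "Fix typo in README"
git push origin fix-typo

# 4. Back on GitHub, open a pull request from your-username:fix-typo against
#    the original project's default branch, then discuss and iterate there.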

GitHub pull requests

Watch More Git Courses

Now that you've learned about GitHub pull requests, why not learn more about Git? For the basics, start with our Introduction to Git and GitHub, and then try the other Coffee Break Courses in this series:

25 Jul 15:09

Igloo Party Bar Cooler with LiddUp Lighting

by Erin Carstens

The Igloo Party Bar Cooler keeps the party lit up all night long courtesy of LiddUp technology. About 4 years ago, startup LiddUp sought crowdfunding for its own illuminated cooler. The campaign didn't quite hit its goal, but the company decided to proceed with their lid-up-lights-on idea anyway. Eventually, they scored an appearance on Shark Tank, and then partnered up with Igloo.

The Party Bar is built to Igloo cooler standards, and enjoys the brand's name recognition, but differentiates itself from other coolers for a crowd with LiddUp's through-ice LED lighting system. Adding nighttime visibility during drink selection and a moody-smooth glow through the cold, LiddUp takes this Igloo from mere ice chest to full-on Party Bar.

Use the Party Bar with its lid on to get the most out of the fully insulated body's ice retention abilities, or snap it off to serve your beverages open-trough style. The LED lights are embedded in the cooler's liner, so they illuminate the booty buried in the ice either way.

The Party Bar also has removable drink dividers in case you want to separate the beer from the sodie, or the good stuff from the crap you'll give freeloaders who showed up empty-handed. A separate cooler caddy can add a couple of wine bottles to the chill.

Rounding out the festive nature of the Party Bar are its integrated bottle opener with cap catch bin, and a threaded, leak-resistant drain plug that's also garden hose compatible.

25 Jul 15:09

Helping journalists experiment with 360 content

by Matt Cooke

There’s already a huge amount of innovation in virtual reality and immersive storytelling—with many newsrooms experimenting and succeeding in the field—but for some, the ability to create 360 content can still be limited.

Perhaps predicting the rise of 360 technology, in 2014 Australian creative agency Grumpy Sailor worked with Google’s Creative Lab in Sydney on an experiment called Story Spheres, which stitches together photos and audio. It allows journalists, documentary makers and educators to tell powerful stories if they don’t have access to video.

Working with the same team behind the first prototype, the Google News Lab is now supporting the next iteration of the project. Today new features will help publishers—from individual journalists to large newsrooms—create and brand their immersive audio experiences. A new website will help journalists brand their creations with their own logos, help them credit their work and embed it on their own website. It’s now even simpler to upload a 360 image, edit the imagery, add an audio layer and navigate from one experience to another.

In the UK, Trinity Mirror has already experimented with the new tool: The Liverpool Echo took their readers through the famous dockyards of the city, and the Manchester Evening News provided a snapshot of the flowers and balloons placed in St Ann’s Square as a tribute to the victims of the Manchester terrorist attack. In Norway, Nettavisen has been experimenting with the tool by giving readers a glimpse at the best podcasts for the summer.

Emily McCartney, a coder and “techxplorer” at Grumpy Sailor, says the improved tool will help users, too: "There's so much news to consume, and people want to be able to jump between stories without losing any time, and Story Spheres help you do that."

Discover the tool for yourself, made by Grumpy Sailor with the support of Creative Lab in Sydney and the Google News Lab.

25 Jul 14:35

Google Pixel 2 to reportedly be the first to run the Snapdragon 836

by Dima Aryeh

It’s being reported that the first device to use the upcoming Snapdragon 836 will be Google’s Pixel 2. Though both the chipset and the device are yet to be announced, this news doesn’t come as a surprise.

The first device using a specific chipset isn’t a big deal in practice, but the title often comes with a bit of prestige. Google securing the Snapdragon 836 first would be a good selling point for a line that has sold relatively poorly.

Of course, the Snapdragon 836 won’t be exclusive to Google. Many devices from other manufacturers, such as Xiaomi, will use it in the future. Hopefully the small amount of hype around the chip will help Google’s sales now that the Pixel name is out there.

24 Jul 18:57

A timeline map of the massive increase in human-caused earthquakes in Oklahoma

by Jason Kottke

In just the past 10 years, the number of earthquakes in the central US (and particularly Oklahoma) has risen dramatically. In the 7-year period ending in 2016, there were more than three times as many magnitude 3.0+ earthquakes as in the previous 36 years. Above is a video timeline of Oklahoma earthquakes from 2004-2016. At around the midpoint of the video, you’ll probably say, “wow, that’s crazy”. Keep watching.

These earthquakes are induced earthquakes, i.e. they are caused by humans. Fracking can cause induced earthquakes but the primary cause is pumping wastewater back into the ground. From the United States Geological Survey’s page on induced earthquake myths & misconceptions (a summarized version of this paper):

Wastewater disposal wells typically operate for longer durations and inject much more fluid than hydraulic fracturing, making them more likely to induce earthquakes. Enhanced oil recovery injects fluid into rock layers where oil and gas have already been extracted, while wastewater injection often occurs in never-before-touched rocks. Therefore, wastewater injection can raise pressure levels more than enhanced oil recovery, and thus increases the likelihood of induced earthquakes.

Of course, this wastewater is a byproduct of any oil & gas production, including fracking. But specifically in Oklahoma’s case, the induced earthquakes have relatively little to do with fracking:

In contrast, in Oklahoma spent hydraulic fracturing fluid represents 10% or less of the fluids disposed of in salt-water disposal wells in Oklahoma (Murray, 2013). The vast majority of the fluid that is disposed of in disposal wells in Oklahoma is produced water. Produced water is the salty brine from ancient oceans that was entrapped in the rocks when the sediments were deposited. This water is trapped in the same pore space as oil and gas, and as oil and gas is extracted, the produced water is extracted with it. Produced water often must be disposed in injection wells because it is frequently laden with dissolved salts, minerals, and occasionally other materials that make it unsuitable for other uses.

Tags: earthquakes   maps   oil   timelines   video   water
24 Jul 18:57

While discussing the rising cost of comic fandom...

by MRTIM
Get the 400 page COMPLETE Our Valued Customers 
24 Jul 13:09

Physics Confession

"You know lightning, right? When electric charge builds up in a cloud and then discharges in a giant spark? Ask me why that happens." "Why does tha--" "No clue. We think it's related to the hair thing."
24 Jul 13:09

Bugs.

by Chris Grabowski
24 Jul 11:44

Texts From Superheroes

Facebook | Twitter | Patreon

Dan Jones

I'm not sure I get this. What's wrong with Cap's feet? Is he Achilles?




23 Jul 02:18

Comic for 2017.07.22

23 Jul 02:18

Zombies Debate Whether or Not to Eat American Brains

by Dave Weasel

MARY GRACE LAMBERT LITTLE ROCK, AR – With another GOP health care measure stuck in Bipartisan purgatory-limbo-hell, zombies are now worried about the safety of consumption of American brains. “I mean, both physically and emotionally it seems that this particular breed of homo sapien is not safe to chomp on,” grunted head of the Zombie […]

The post Zombies Debate Whether or Not to Eat American Brains appeared first on The Valley Report.

21 Jul 21:15

#1571 – Pipsqueak

by Chris

#1571 – Pipsqueak

21 Jul 21:11

How to Pun

by Dan Jones

Yep, puns are excellent.

21 Jul 19:17

An Idea

by Reza

21 Jul 14:03

Google launches Motion Stills for Android

by Dima Aryeh

Motion Stills was previously an iOS-exclusive app that let you capture Live Photos, which the app then stabilized and turned into loops. You could then share them as GIFs. It’s no secret that Google’s internal app teams love iOS, but now Android is finally getting some of that love.

The Motion Stills app has been redesigned for Android, further improving it. Each frame of the video is processed as it is being shot, allowing for a stabilized video to be viewable immediately after being shot with no loading times. You can also fast forward your videos up to 8x, turning up to a full minute of video into a short clip.

If you’re interested in giving the app a shot, hit the source link. It’s available for Android devices running 5.1 and later. You’ll also be helping Google, as the feedback from the app furthers their foray into video technology. Let us know what you think!

21 Jul 14:03

Google adds Street View of the International Space Station

by Dima Aryeh

Google Street View hasn’t been limited to streets for quite a while. You can even explore the Great Barrier Reef using Street View. The app is pushing exploration beyond just your own neighborhood.

The next step is a big one, because for the first time, Street View has left the Earth. You can now explore the International Space Station thanks to an astronaut and flight engineer named Thomas Pesquet taking photos during his six-month stint on the station.

If you’re interested in checking it out, hit the source link. Google’s blog entry describes the process of getting the photos as well as Thomas’ time on the station. There is also some behind the scenes content available. It’s definitely a fascinating look into what it’s like floating 250 miles above Earth.

Source: Google Street View: International Space Station, Google Blog

21 Jul 14:03

Robert Anton Wilson

"It only takes 20 years for a liberal to become a conservative without changing a single idea."

20 Jul 20:18

Around the world with #teampixel

by Crystal Akins

With vacation mode in full swing, #teampixel members are steadily trekking into the far corners of the globe. This week’s picks range from a peaceful afternoon in a Beijing temple to the windy roads of the Great St. Bernard Pass. Take a look at our summer faves in this week’s #pixelperfect slideshow, and don’t forget to pack the sunscreen.

@zu.fuss - Great St. Bernard Pass in Switzerland
Left: @chxtagram - Natural History Museum in London, England. Right: @jjthehoser - A bubble on the move in Canada
@johnsocharlotte - Lakeside sunset in Chicog, Wisconsin
Left: @doppiosogno - Colorful confetti at a Coldplay concert. Right: @rht_3 - Brooklyn Bridge in NYC
@mrtroiano - Jardin du Luxembourg in Paris, France
Left: @vasuagarwal - Cube House in Rotterdam, Netherlands. Right: @suluhpandu - An airport tunnel in Indonesia
Left: @audiphotography - A shadow-cast seat in Haryana, India. Right: @bloganotherfeckintravel - Temple of Heaven in Beijing, China

Have a Pixel? Tag your photos with #teampixel, and you might get featured on the ‘gram.

20 Jul 20:18

Live TV coverage of Apollo 11 landing and moon walk

by Jason Kottke

Apollo 11 TV Coverage

48 years ago today, the lunar module from the Apollo 11 mission landed on the Moon. Later that same day, Neil Armstrong and Buzz Aldrin stepped out of the module, set foot on the surface, and went for a walk. And the entire world watched them do it. Live.

For the 40th anniversary of the landing in 2009, I put together a page where you can watch the original CBS News coverage of Walter Cronkite reporting on the Moon landing and the first Moon walk, synced to the present-day time. Just open this page in your browser and the coverage will start playing at the proper time. Here’s the schedule (all times EDT):

4:10:30 pm: Moon landing broadcast starts
4:17:40 pm: Lunar module lands on the Moon

4:20:15 pm: Break in coverage

10:51:27 pm: Moon walk broadcast starts
10:56:15 pm: First step on Moon
11:51:30 pm: Nixon speaks to the Eagle crew
12:00:30 am: Broadcast end (on July 21)
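The page's actual player code isn't reproduced here, but the sync idea is simple: map the current wall-clock time onto an offset into the archival footage and seek to it. A minimal JavaScript sketch, assuming the viewer's clock is set to EDT (the player object mentioned in the comments is hypothetical):

// Broadcast start from the schedule above: 4:10:30 pm EDT, in seconds since midnight.
const broadcastStart = (16 * 3600) + (10 * 60) + 30;

const now = new Date(); // assumes the local clock is on EDT
const secondsToday = (now.getHours() * 3600) + (now.getMinutes() * 60) + now.getSeconds();

const offset = secondsToday - broadcastStart;
if (offset >= 0) {
  // With the YouTube IFrame Player API this would be something like
  // player.seekTo(offset, true); `player` here is a hypothetical player object.
  console.log(`Seek ${offset} seconds into the landing broadcast`);
}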

Here’s what I wrote when I launched the project:

If you’ve never seen this coverage, I urge you to watch at least the landing segment (~10 min.) and the first 10-20 minutes of the Moon walk. I hope that with the old time TV display and poor YouTube quality, you get a small sense of how someone 40 years ago might have experienced it. I’ve watched the whole thing a couple of times while putting this together and I’m struck by two things: 1) how it’s almost more amazing that hundreds of millions of people watched the first Moon walk *live* on TV than it is that they got to the Moon in the first place, and 2) that pretty much the sole purpose of the Apollo 11 Moon walk was to photograph it and broadcast it live back to Earth.

This is one of my favorite projects I’ve ever done, and it almost didn’t happen this year. I woke up this morning assuming it was just going to work again, just like it had the previous 8 years, but a bit of testing revealed that YouTube had discontinued the API I was using to display and time the videos. I wasn’t sure I had the JavaScript chops to fix it in time for the big show this afternoon. Luckily, I was able to solicit some help on Twitter and as the internet continues to be absolutely amazing, Geoff Stearns fixed the problem. As he said in his tweet, Stearns works for Google and wrote the YouTube API that had been discontinued, which is a bit like Marshall McLuhan popping out from behind a poster in Annie Hall, but instead of saying “you know nothing of my work”, he says “I’m gonna fix this up real quick”. Reader, it took him 14 minutes from saying “I’ll help” to posting the solution, and I’d bet half of that time was spent running to the fridge for a LaCroix and selecting the proper coding playlist on Spotify. So big thanks to Geoff for making this happen today! And thanks also to Brian Seward, who landed a solution in my inbox a bit after Geoff’s.

Oh, and no more Flash! So it’ll work on any modern browser with no plugins. But I tested it on my phone and it still doesn’t seem to work properly there…the video loads but doesn’t autoplay. Something to improve for next year!

Tags: Apollo   Apollo 11   Moon   space   TV   Walter Cronkite
20 Jul 19:48

Fictional names for British towns generated by a neural net

by Jason Kottke
Dan Jones

Pretty sure Batchington Crunnerton is that guy from Sherlock.

Dan Hon recently trained a neural net to generate a list of fictional British placenames. The process is fairly simple…you train a program on a real list of placenames and it “brainstorms” new names based on patterns it found in the training list. As Hon says, “the results were predictable”…and often hilarious. Here are some of my favorites from his list:

Heaton on Westom
Brumlington
Stoke of Inch
Batchington Crunnerton
Salt, Earth
Wallow Manworth
Crisklethe’s Chorn
Ponkham Bark
Buchlingtomptop
Broad Romble
Fuckley

See also auto-generated maps of fantasy worlds.

Update: Tom Taylor did a similar thing last year using TensorFlow. Here are a few of his fictional names:

Allers Bottom
Hendrelds Hill
St Ninhope
Up Maling
Firley Dinch

There’s also an associated Twitter bot. (via @philgyford)

Also, Dan Connolly had a look at the etymology of the names on Hon’s list.

Buncestergans. At first glance this doesn't look a lot like a place name but let's break it down. We've got Bun which is definitely from Ireland (see Bunratty, Bunclody, Bundoran) meaning bottom of the river, and I believe we're talking bottom as in the mouth rather than the riverbed (or there are a whole lot of magical lady-of-the-lake towns in Ireland, I'm happy believing either). Cester is our Roman fort, then we have -gans.

I don't think gans has any meaning in British place names. My guess is the net got this from Irish surnames like Fagans, Hagans, Duggans, that sort of thing. My Gaelic's not so great (my mother, grandmother, and several aunts and uncles would all be better suited to this question!) but I think the -gan ending in Gaelic is a diminutive, so Buncestergans could be the Small Fort at the Bottom of the River. I quite like that. It's a weird Gaelic-Latin hybrid but why the hell not!

Tags: artificial intelligence   Dan Connolly   Dan Hon   language   Tom Taylor
20 Jul 19:15

#1570 – Nutrition

by Chris

#1570 – Nutrition

20 Jul 18:49

Google Play Protect aims to block malicious apps

by Evan Selleck

Apps and games are ridiculously popular, but unfortunately they can be used as a means to get malicious software onto a smartphone or tablet. Google has been working to keep that kind of software away from users for some time, but today they are taking yet another step towards the ultimate goal of keeping Android devices safe.

Google has launched Google Play Protect, and it’s rolling out beginning today. The new feature will be present on all Android devices running Google Mobile Services 11 or newer. Users can find it on their device by going into Settings –> Google –> Security –> Google Play Protect. If it’s there, then it will begin automatically scanning apps downloaded from the Play Store. If an app gets pinged, it will be listed in this section of the security settings, and recently scanned apps will be shown there, too.

On the Play Store itself, there will soon be a badge for apps that says “Verified by Google Play Protect,” which should make at-a-glance download decisions easier in the future. This is rolling out now, but it may take some time before users start seeing the badge show up on app splash pages.

It’s just one more step in keeping the end user safe, and a good one. What do you think of Google’s latest security measure?

20 Jul 15:42

Playing with Shadow DOM

by Chris Coyier

About a year ago, Twitter announced it would start displaying embedded tweets with the shadow DOM rather than an <iframe>, if the browser supports shadow DOM.

Why? Well, speed is one reason.

They say:

Much lower memory utilization in the browser, and much faster render times. Tweets will appear faster and pages will scroll more smoothly, even when displaying multiple Tweets on the same page.

Why the choice? Why is it necessary to use either iframes or shadow DOM? Why not just inject the content onto the page?

It's a totally understandable need for control. An embedded Tweet should look and behave just exactly like an embedded Tweet. They don't want to worry about the styles of the page bleeding in and messing that up.

An <iframe> makes style scoping very easy. Point the src of the iframe at a URL that displays what you want an embedded tweet to look like, and you're good. The only styles used will be those you include in that document.

Twitter does this iframe-injection in a progressive enhancement and syndication-friendly way. They provide a <blockquote> with the Tweet and a <script>. The script does the iframe-injection. If the script doesn't run, no matter, a happy blockquote. If the script does run, a fully functional embedded Tweet.

That script is the key here. Scripts can do just about anything, and they host it, so they can change it up anytime. That's what they use to detect shadow DOM support and go that route instead. And as we covered, shadow DOM is faster to render and has lower memory needs. Shadow DOM can also help with the style scoping thing, which we'll look at in a moment.

Height flexibility

There's another thing too, that happens to be near and dear to my heart. An <iframe> doesn't adjust in height to fit its contents like you expect other elements to do. You set a height and that's that. It will have scrollbars, if you allow it and the content needs it. Back in the Wufoo days, we had to jump through quite a few hoops to get embedded forms (in frames) to be as tall as they needed to be. Today, at CodePen, our Embedded Pens have adjustable heights, but there isn't any option for just "as tall as they need to be". (I'm not exactly sure if that makes sense for CodePen Embeds or not, but anyway, you can't do it right now.)

An element with a shadow DOM is just like any other element in that it will expand to fit its content naturally. I'm sure that is appealing to Twitter as well. If they calculate the height wrong, they run the risk of cutting off content or at least having the embedded Tweet look busted.

Most Basic Usage

Here's the bare minimum of how you establish a shadow DOM and put stuff in it:

See the Pen Most basic shadow DOM by Chris Coyier (@chriscoyier) on CodePen.
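The embedded Pen doesn't carry over in this format, so here's a minimal sketch along the same lines (the .my-element host div and the page-level sans-serif styling are assumptions, not the Pen's exact code):

// Assumed markup on the page: a regular <p> plus <div class="my-element"></div>,
// with something like body { font-family: sans-serif; } in the page styles.

let host = document.querySelector('.my-element');
let shadow = host.attachShadow({ mode: 'open' }); // "open" lets JS reach back inside

shadow.innerHTML = `
  <style>
    :host { all: initial; } /* reset inherited styles, e.g. the page's font */
    p { color: red; }       /* scoped: the paragraph outside stays unstyled */
  </style>
  <p>Element with Shadow DOM</p>
`;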

Notice how the styling within the Shadow DOM doesn't leak out to the regular paragraph element? Both paragraphs would be red if they did.

And notice how the paragraph inside the shadow DOM isn't sans-serif like the one outside? Well, normally that would be. Inherited styles still inherit through the shadow DOM (so in that way it's not quite as strong a barrier as an iframe). But, we're forcing it back to the initial state on purpose like this:

:host {
  all: initial;
}

Handling that fallback

My first thought for dealing with a fallback for browsers that don't support shadow DOM was that you could chuck the same exact content you were stuffing into the shadow DOM into an iframe with srcdoc, like...

<iframe srcdoc="the same content">

Or more likely this is something you're doing in JavaScript, so you'd test for support first, then either do the shadow DOM stuff or dynamically create the iframe:

See the Pen Shadow DOM Basic by Chris Coyier (@chriscoyier) on CodePen.

Turns out srcdoc isn't the best choice (alone) for a fallback as there is no IE or Edge support for it. But it's also not too big of a deal to just use a data URL for the regular src. Here's a fork by Šime Vidas where he fixes that up:

let content = `
  <style>
    body { /* for fallback iframe */
      margin: 0;
    }
    p { 
      border: 1px solid #ccc;
      padding: 1rem;
      color: red;
      font-family: sans-serif;
    }
  </style>

  <p>Element with Shadow DOM</p>
`;

let el = document.querySelector('.my-element');

if (document.body.attachShadow) {
  
  let shadow = el.attachShadow({ mode: 'open' }); // Allows JS access inside
  shadow.innerHTML = content;
  
} else {
  
  let newiframe = document.createElement('iframe');
  'srcdoc' in newiframe ?
    newiframe.srcdoc = content :
    newiframe.src = 'data:text/html;charset=UTF-8,' + content;
  
  let parent = el.parentNode;
  parent.replaceChild(newiframe, el);

}

TL;DR

  • Shadow DOM is pretty cool.
  • It's comparable to an iframe in many ways, including style encapsulation. Embedded third-party content is a pretty good use case.
  • It's possible to use it while falling back to an iframe pretty easily.
  • It's a part of the larger world of web components, but you don't have to go all-in on all that if you don't want to.

Here's another simple demo (this one using a custom element), but instead of rolling our own fallback support, it's polyfilled.

Playing with Shadow DOM is a post from CSS-Tricks

20 Jul 12:56

Texts From Superheroes

Facebook | Twitter | Patreon

Dan Jones

The "Rooftops" issues a few months back with Batman and Catwoman getting together was awesome!




19 Jul 21:30

ORII Voice-Powered Bone Conduction Smartphone Ring

by Erin Carstens
Dan Jones

That's an ugly ring, but cool concept.

The ORII Smartphone Ring combines the best of voice activation, bone conduction, James Bond, and pretending your hand is a telephone like you did as a kid. Except there's no pretending with this Kickstarter project from Origami Labs. The ORII is an actual voice-powered ring that uses bone conduction to turn your body into a smartphone.

Presuming it works, wearers will be able to make or take calls just by touching their ORII finger to their ear. The ring's bone conduction technology will send sound along the first appendage into the deep caverns of the second - just like it does in bone conduction headphones.

Paired with Bluetooth, ORII responds to voice commands for calling, sending messages, setting reminders, and interacting with smartphone assistants in the same way. If you want to talk to Siri or OK Google, use the ORII CapSense button to wake them up, assume the secret hand signal, and tell the voice assistant what you want.

With dual noise-cancelling microphones, ORII conversations and commands remain private, and block out background noise. And since bone conduction sends sound directly into your inner ear, you should be able to hear through the ring/your finger even in loud places.

The ORII Ring seeks crowdfunding through August 17, 2017, with first shipments anticipated to go out in February 2018.

19 Jul 21:02

Implementing Webmentions

by Chris Coyier

We get a decent number of comments on blog posts right here on CSS-Tricks (thanks!), but I'd also say the heyday for that is over. These days, if someone writes some sort of reaction to a blog post, it could be on their own blog, or more likely, on some social media site. It makes sense. That's their home base and it's more useful to them to keep their words there.

It's a shame, though. While this fragmented conversation is slightly more useful for each individual person, it's less useful as a whole. There is no canonical conversation thread. That's what Webmentions are all about, an official spec! In a sense, they allow the conversation to be dispersed but brought all together in a canonical conversation thread on the main post.
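The spec itself is lightweight. A rough sketch of the sending side: the site publishing the reply discovers the target's Webmention endpoint (via a Link header or a <link rel="webmention"> tag, skipped here) and POSTs two form-encoded URLs to it. All three URLs below are placeholders:

const source = 'https://example.com/my-reply/';          // the page doing the mentioning
const target = 'https://example.com/the-original-post/'; // the page being mentioned
const endpoint = 'https://example.com/webmention';       // discovered endpoint (placeholder)

fetch(endpoint, {
  method: 'POST',
  headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
  body: new URLSearchParams({ source, target }).toString(),
});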

Webmentions don't need to be an alternative to comments, although when you pop over to Drew McLellan's post you'll see he's using them that way. They can be in addition to "regular" comments. Surely the idea of turning off regular comments is appealing from a community perspective (fewer asshats likely when you need to link to your published words) and a technical debt perspective.

Rachel Andrew also recently implemented them, and this is classic Jeremy Keith stuff.

Direct Link to Article | Permalink

Implementing Webmentions is a post from CSS-Tricks

19 Jul 20:30

A bird magically floats because of a camera frame rate trick

by Jason Kottke

You know when you’re watching a fan or a wheel or something else quickly spinning and it seems to stop spinning and even looks like it’s spinning backwards? And you blink your eyes and remind yourself you’re not on drugs and haven’t been drinking heavily but it’s still somehow simultaneously spinning and not? This optical illusion occurs most commonly with video cameras (but can also occur looking through your normal eyeballs) when the frame rate of the camera matches some multiple of the rate of the thing being filmed, as with this magically levitating helicopter.

Since each frame has to ensure the blade is in the same position as the last, it therefore needs to be in sync with the rpm of the rotor blades. Shutter speed then needs to be fast enough to freeze the blade without too much motion blur within each frame.

Here the rotor has five blades; now let's say the rpm of the rotor is 300. That means, per rotation, a blade is in a specific spot on five counts. That gives us an effective rpm of 1500. 1500 rpm / 60 secs = 25.

Therefore shooting at 25fps will ensure the rotor blades are shot in the same position every frame. Each frame then has to be shot at a fast enough shutter speed to freeze the blade for minimal motion blur.
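That arithmetic is easy to sanity-check. A quick sketch with the same assumed numbers (5 blades, 300 rpm), nothing more than the quote's math written out:

// Same assumed numbers as the quote: a 5-blade rotor spinning at 300 rpm.
const blades = 5;
const rpm = 300;

// A blade occupies any given position blades * rpm times per minute...
const effectiveRpm = blades * rpm;   // 1500
// ...which works out to this many times per second:
const perSecond = effectiveRpm / 60; // 25

// Filming at 25fps (or an integer divisor of it, like 12.5 or 5) catches a
// blade in the same spot every frame, so the rotor looks frozen.
console.log(perSecond); // 25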

In the video above, a home security camera catches a bird flying with a wing speed matching the frame rate of the camera, which makes it look like the bird is just magically hanging in the air, like some sort of avian wizard.

Tags: photography   video
19 Jul 16:06

Photo