Shared posts

14 Sep 02:08

Why You Should Keep Your Domain Name Separate From Your Web Host

by Michael Keshen

If you’re new to the domain realm, you may take the standard route of using your web hosting service as your domain provider. However, you could be missing out on a few things by keeping your services under one roof. Here’s why you should maintain your domain name with a different provider than the one that maintains your website.

Potential Lock-In with Your Current Web Host

If your domain name is tied to your web host, you could lose your domain name if you ever decide to switch to another host. That’s because your domain name can be locked with your current web hosting service through what’s called registrar-locking. This can happen if fewer than 60 days have passed between when you register the domain and when you decide to go elsewhere. If your domain name is registered independently, you can point it to whatever new web host you choose, even within those 60 days.

More TLD (Top-Level Domain) Options

Web hosts will typically only offer a few TLDs to choose from, which can leave you in the dust if you’re searching for a unique domain name based on a niche market or location. A dedicated domain provider, however, will typically offer many more TLDs. For example, Hover offers more than 400 TLDs to choose from. That means if you want to register a domain that is specific to your city, or one that ends with anything from .photography to .horse, you’ll have a better chance of finding and securing the domain name you want with a domain name provider than with a web hosting service provider.

Avoid Domain Name Loss

Data breaches are very real, and they can happen through nothing more than access to your web hosting account. If your web hosting service lacks multi-step authentication, it may be easier for hackers to access your web content. What’s worse, if your domain is also parked with your web hosting service, you can lose your domain name should the hacker transfer it elsewhere. That can mean tons of lost hours and money spent trying to prove you own your domain name.

You can also easily lose your domain name if your web hosting service and domain name expire on the same date. For example, suppose you have your domain name registered with your web hosting provider and forget to renew your web hosting service, or you decide you no longer want to use the hosting service but haven’t decided on a replacement; in either case, the domain can lapse along with the hosting account. Your domain is also set up for failure if you have an expired credit card on file and your web hosting provider does not alert you to update it before renewal. Many aftermarket domain name companies wait for exactly these opportunities to grab forgotten domains, and the situation is even more critical when you lack a trademark for the domain name. Registering your domain name with a separate provider prevents these issues from arising in the first place.

Final Thoughts

Your domain name should always be kept separate from your web host to avoid pitfalls such as domain name loss. If you decide to move to another site builder or web host, a domain registered elsewhere ensures your name will not be locked with your current web hosting provider. Keeping these two functions with different providers spares you those pitfalls from the start.

14 Sep 02:08

What Cyclists Should Do if a Driver Crashes into Them

by Average Joe Cyclist

To avoid falls, stay AWAY from the gutters.

Guest post with advice about what cyclists should do if a driver crashes into them. Includes infographic about common cycling collisions, and advice to avoid collisions for drivers and cyclists.

The post What Cyclists Should Do if a Driver Crashes into Them appeared first on Average Joe Cyclist.

14 Sep 02:08

iPhone 7 reviewed: a lot of catch-up, a little leapfrog


We, the people, may complain about how exhausting it is to keep up with the annual flood of new smartphones from Apple (AAPL), Samsung, and so on. But look at the bright side: At least you don’t have to create the annual set of new features. That’s their problem.

Or at least a brutal challenge. Not just because it’s increasingly difficult to think of new features, but also because the phone makers have pretty much run out of room for new components inside.

That, says Apple, is why it removed the headphone jack from the iPhone 7 and 7 Plus, which go on sale Friday. The headphone jack may not seem very big — but on the inside of the phone, the corresponding receptacle occupies an unnerving amount of nonnegotiable space.

The iPhone 7 Plus, in glistening Jet Black.

The feature Apple took out

So how are you supposed to listen to music without a headphone jack? Apple offers three ways. First, in the box, Apple includes a two-inch adapter cord that connects any headphones to the phone’s Lightning jack.

You can use any existing headphones or earbuds by popping on the little adapter cable.

Second, the phone also comes with new white earbuds that connect to the Lightning jack. Or you can use any Bluetooth wireless earbuds, including Apple’s own, super-impressive AirPods.

AirPods: Like standard Apple earbuds, but without the tangle.

In theory, those three approaches should pretty much cover you whenever you want to listen. In practice, though, you’ll still get zapped by the occasional inconvenience bug. You’ll be on a flight, for example, listening to your laptop with headphones — and when you want to switch to the phone, you’ll realize that your adapter dongle is in the overhead bin. (Actual example.)

But this kind of hassle is the new reality. Motorola and LeEco (in China) have already ditched the headphone jack, and other phone makers will follow suit.

All right, Apple removed the headphone jack so that it could dedicate interior space to new features. What are they?

Turns out that just about every element of a smartphone that can be improved, Apple has upgraded: the case, the battery, camera, screen, speakers, processor, storage, the Home button, and the software.

The case

The back of the iPhone 7 is aluminum, whose gracefully rounded edges blend smoothly into the glass of the screen. No change there.

But in addition to the standard metal colors (matte black, silver, gold, pink gold), there’s a new finish available called Jet Black. It’s glistening, shiny, deep piano black. It’s gorgeous and sleek and smooth and you want to rub it like it’s a worry stone. It’s also slippery and fingerprinty.

Jet black: So shiny, Apple wants you to encase it.

And get this: Apple warns that “its high shine may show fine micro-abrasions with use,” so it suggests that “you use one of the many cases available to protect your iPhone.”

OK what? Why would you choose a phone for its finish and then immediately bury it in a case? What am I missing?

There is, however, one big new case feature: The iPhone is, at long last, water resistant. It can handle up to 30 minutes under a meter of water. Which means that rain and falls into the toilet can’t hurt it. (I gave my test unit four drops into a mixing bowl of water, as you can see in the video above. It never even stopped playing music, and still works perfectly.)

Apple’s late to this ball game, but it’s a really good ball game.

The battery

The iPhone 7 battery is 14% larger than the previous model’s — two hours more life per charge, says Apple — and you notice it. My iPhone 6 is usually gasping along with 9% charge by bedtime; the iPhone 7 usually has around 40% left at day’s end. (The improvement in the larger Plus model is more modest: one extra hour per charge.)

Battery-life improvement may not have the dazzle of, say, a built-in laser or thought-recognition software, but it’s one of the most important enhancements Apple could have made. If you forgot to charge your phone last night, no biggie — you’ll have until midday to find a charge.

Apple is too modest to point out another advantage of the iPhone 7’s battery, too: As far as we know, it doesn’t catch fire, and flight attendants don’t make announcements that ban your phone model in flight (*cough* Samsung Note 7 *cough*).

The camera

Apple makes a big deal of the iPhone 7’s new camera. It’s got more megapixels (12, up from 8), and the front camera has been goosed to 7 megapixels. Megapixels don’t really mean very much, though; they have no effect on picture quality.

Apple also raves about the camera’s f/1.8 aperture (lets in a lot of light). But you know what? When the light is good, the shots look exactly the same as they did on the last couple of iPhone models. (In some photos, you do see slightly richer colors, but only when you view those photos on the iPhone 7’s enhanced screen, as described below.)

The new camera shows its value primarily in low light. The stabilized lens helps a lot — an internal shock absorber that counteracts the typical tiny hand jiggles that often introduce blur into low-light photos. (This feature, which also does a great job of stabilizing videos, used to be only in the Plus-sized phones; the smaller iPhone 7 has room for it, Apple says, only because of the removal of the headphone jack.)

All of this makes a huge difference in low-light videos. The color is clearer, and the graininess much less pronounced. Low-light stills are enhanced to a lesser degree.

The iPhone 7’s stabilizer and improved light sensitivity help in low light.

The flash on the back is now made up of four LEDs instead of two, resulting in flashes (and flashlights) that are 50% brighter than before. OK, good.

On the iPhone 7 Plus, though, the camera enhancement is much bigger: Apple has installed two lenses. One is wide-angle, one is telephoto. With a tap on the screen, you zoom in 2X. This is true optical zoom, not the cruddy digital zoom on most previous phones (which just blows up the image, degrading the quality).

The iPhone 7 Plus has a breakout feature–two lenses.

You can also dial up any amount of zoom between 1X and 2X; the iPhone performs that stunt by seamlessly combining the zoom lens’s image (in the center of the photo) with a margin provided by the wide lens.

2X zoom isn’t a huge amount, but it’s 2X as much as any other thin smartphone can handle. And it’s a triumphant first step toward eliminating a key drawback of phone cameras: They can’t actually zoom. (The LG G5 tried a similar stunt, but the second lens had only half the resolution of the first, and you couldn’t do that intermediate zooming thing.)

You can even zoom right in the middle of shooting a video, which is very cool. Occasionally, the two lenses produce different color tones for the same scene; you can see it in the video above, and in the grass in this still photo.

The iPhone 7 Plus has true, real, actual 2X zoom (and digital up to 10X).

Even on the Plus, by the way, you can continue to use the digital zoom beyond the 2X, all the way up to a somewhat blotchy 10X (or 6X for video).

In a software upgrade this fall, Apple says that the 7 Plus will gain the ability to create the gorgeously soft-focused background that’s common in professional photography. It’s not real shallow depth of field; it’s a special effect, a filter.

In October, the iPhone 7 Plus will gain a filter that simulates a blurry background.

When Samsung tried this a couple of years ago, the result was a disaster; the blurriness could spill horribly onto the subject’s face like some kind of reverse acid bath. But on the 7 Plus, the dual cameras are supposed to let the software perfectly pick the subject apart from the background, creating a defocused background that’s indistinguishable from the one you get from “real” cameras.

The screen

Apple makes much of the iPhone 7’s new screen with its “expanded color gamut,” meaning that it can display more colors than previous screens, and its “25% brighter” display.

In truth, the difference is very subtle. You can identify the brighter screen only when it’s side-by-side with last year’s model and both are at full brightness.

To test the expanded color palette, I took a series of photos with the iPhone 7 and copied them to an iPhone 6s. In side-by-side taste tests, my test panelists usually identified slightly richer colors when those photos appeared on the iPhone 7’s screen.

The bottom line: Don’t expect some jaw-dropping image improvement in screen quality.

The speakers

The iPhone now has stereo speakers! They’re at the top and bottom of the phone, so you don’t get the stereo effect unless the phone is sitting sideways. Even then, there’s very little left/right channel separation.

But never mind that: The iPhone 7’s audio system overall is definitely better than before. It may not be twice as loud, as Apple claims, but you’d definitely say that the 7 sounds fuller and stronger than previous models.

The processor

This year’s iPhone processor has four cores (brains), two of which are dedicated to lower-importance tasks (and consume less power—one of the reasons the phone gets better battery life).

The storage

The pathetically small 16-gigabyte iPhone has finally gone to the great junk drawer in the sky. Now, the three iPhone storage capacities are 32, 128, and 256 gigabytes (for $650, $750, and $850; installment and rental plans are available). For the larger 7 Plus model, the prices are $770, $870, and $970.

The Home button

The Home button, central to so many iPhone features — waking the phone, switching apps, commanding Siri, and so on — is no longer a moving, mechanical part. Now, when you press it, you feel a click, but it’s actually a sonic fake-out, a sharp internal vibration.

The advantage of this setup: You can adjust how clicky the button is. There’s no gap for water to get in. And this Home button is pressure-sensitive — it knows when you’re pressing harder — which could someday permit some cool new features nobody’s even thought of yet.

Just how clicky do you want your Home button to be?

The disadvantage of this setup: When your phone is locked up, you can no longer hold down the Sleep + Home buttons to force-restart it. Instead, you’re now supposed to use Sleep + Volume Down, just as on many Android phones.

The software

The iPhone 7 comes with iOS 10, a new version of the iPhone’s operating software — and, as its usual gift to its fans, Apple offers iOS 10 as a free upgrade to anyone with a recent iPhone model. You can download it today.

There are a lot of changes in iOS 10; you can read my review of it tomorrow. Most of them represent delightful advances in efficiency and common sense; for example, you no longer have to swipe horizontally across the screen to unlock it. Instead, you press the Home button, where your thumb is already sitting (because you used the fingerprint reader). Brilliant.

The bottom line

In recent years, Apple hasn't always been the technology leader in phones. This year, once again, some of the best new features are just catching up to rival phones: water resistance, image stabilization, stereo speakers. Believe it or not, Apple isn't even the first company to take out the headphone jack.

But catch-up has value of its own, and every company plays it. (And that camera-zoom thing on the 7 Plus is fantastic.)

Now that Apple’s phone phone is every bit as advanced as any of its rivals, and more advanced in some areas, its engineers can finally get a well-deserved break. But only for a weekend. On Monday, it’ll be time to start dreaming up new features for next year’s iPhone.

David Pogue is the founder of Yahoo Tech; here’s how to get his columns by email. On the Web, he’s davidpogue.com. On Twitter, he’s @pogue. On email, he’s poguester@yahoo.com. He welcomes non-toxic comments in the Comments below.

14 Sep 02:07

Oceans absorbing heat

by Nathan Yau


It keeps getting hotter on this planet, and the oceans are absorbing most of the heat. Tim Wallace for the New York Times shows several decades of changes.

Tags: environment, New York Times, temperature, Tim Wallace

13 Sep 23:29

How to use “Writing Without Bullshit” (infographic)

by Josh Bernoff

Eighteen months ago, I started working on a book about writing. Today you can buy it, in bookstores or online. Please consider getting a copy, or if you already have one, reviewing it on Amazon or elsewhere. I designed “Writing Without Bullshit” so you can get a lot out of it. Whether you write emails, … Continued

The post How to use “Writing Without Bullshit” (infographic) appeared first on without bullshit.

13 Sep 23:29

iOS 10: The MacStories Review

by Federico Viticci
Sometimes, change is unexpected. More often than not, change sneaks in until it feels grand and inevitable. Gradually, and then suddenly. iOS users have lived through numerous tides of such changes over the past three years.

iOS 7, introduced in 2013 as a profound redesign, was a statement from a company ready to let go of its best-selling OS' legacy. It was time to move on. With iOS 8 a year later, Apple proved that it could open up to developers and trust them to extend core parts of iOS. In the process, a new programming language was born. And with last year's iOS 9, Apple put the capstone on iOS 7's design ethos with a typeface crafted in-house, and gave the iPad the attention it deserved.

You wouldn't have expected it from a device that barely accounted for 10% of the company's revenues, but iOS 9 was, first and foremost, an iPad update. After years of neglect, Apple stood by its belief in the iPad as the future of computing and revitalized it with a good dose of multitasking. Gone was the long-held dogma of the iPad as a one-app-at-a-time deal; Slide Over and Split View – products of the patient work that went into size classes – brought a higher level of efficiency. Video, too, ended its tenure as a full-screen-only feature. Even external keyboards, once first-party accessories and then seemingly forgotten in the attic of the iPad's broken promises, made a comeback.

iOS 9 melded foundational, anticipated improvements with breakthrough feature additions. The obvious advent of Apple's own typeface in contrast to radical iPad updates; the next logical step for web views and the surprising embrace of content-blocking Safari extensions. The message was clear: iOS is in constant evolution. It's a machine sustained by change – however that may happen.

It would have been reasonable to expect the tenth iteration of iOS to bring a dramatic refresh to the interface or a full Home screen makeover. It happened with another version 10 before – twice. And considering last year's iPad reboot, it would have been fair to imagine a continuation of that work in iOS 10, taking the iPad further than Split View.

There's very little of either in iOS 10, which is an iPhone release focused on people – consumers and their iPhone lifestyles; developers and a deeper trust bestowed on their apps. Like its predecessors, iOS 10 treads the line of surprising new features – some of which may appear unforeseen and reactionary – and improvements to existing functionalities.

Even without a clean slate, and with a release cycle that may begin to split across platforms, iOS 10 packs deep changes and hundreds of subtle refinements. The final product is a major leap forward from iOS 9 – at least for iPhone users.

At the same time, iOS 10 is more than a collection of new features. It's the epitome of Apple's approach to web services and AI, messaging as a platform, virtual assistants, and the connected home. And as a cornucopia of big themes rather than trivial app updates, iOS 10 shows another side of Apple's strategy:

Sometimes, change is necessary.


eBook Version & Exclusive Making Of

An eBook version of this review is available exclusively for Club MacStories members. Club MacStories offers access to weekly MacStories extras – including workflows, app recommendations, and interviews – and it starts at $5/month.


The eBook version contains all the media (screenshots and videos) of the web version, including eBook-specific layout optimizations.

The eBook can be downloaded from the member Downloads area (to download files on iOS, see here).

In addition to the eBook, we’ll publish an exclusive Making Of newsletter for Club MacStories members later this week. In the Making Of, you’ll be able to read more about my writing process, interesting review stats, the image workflows we used, and how this special web layout was put together.

Get exclusive extras and support MacStories by signing up for Club MacStories today.

Supported Devices

As more features have been added to iOS over the years, its first-run setup flow has become bloated, if not downright unintuitive.

iOS 10 doesn't take any meaningful steps to simplify the setup of a new iOS device, which is mostly unchanged from iOS 9. The only notable difference is the action required to begin the setup process, which is now "press Home to open". As I'll explore later, there's a reason for this.

Where iOS 10 does break away from the old is in the system requirements needed to install the OS. Most devices from 2011 and 2012 aren't compatible with iOS 10, including:

  • iPhone 4S
  • iPad 2
  • iPad (3rd generation)
  • iPad mini
  • iPod touch (5th generation)

Devices supported by iOS 10.

Progress, of course, marches on, but there are other notable points in this move.

The iPad 2 – perhaps the most popular iPad model to date – supported iOS 9 (in a highly constrained fashion) despite developers clamoring for its demise. After five years of service, Apple is cutting ties with it in iOS 10. Now that the A5 and A5X CPUs have been left behind, developers are free to create more computationally intensive iPad apps without worrying about the iPad 2's lack of a Retina display and the third-generation iPad's performance issues holding them back.

Look closer, and you'll also notice that Apple is dropping support for all devices with the legacy 30-pin dock connector. If a device can run iOS 10, it is equipped with a Lightning port.

In addition to Lightning, every iOS 10-eligible iPad has a Retina display, but not every device comes with a Touch ID sensor yet, let alone a 64-bit processor, Apple Pay, or background 'Hey Siri' support.

It's going to be a while until Apple can achieve its vision of 64-bit and one-tap payments across the board, but it's good to see them moving in that direction by phasing out hardware that no longer fits what iOS has grown into. iOS 10 is starting this transition today.

The Lock Screen

One of the first interactions with iOS 10 is likely going to be an accidental swipe.

For the first time since the original iPhone, Apple is changing the "Slide to Unlock" behavior of the iOS Lock screen. iOS 10 gets rid of the popular gesture altogether, bringing tighter integration with Touch ID and an overhauled Lock screen experience.

Press to Unlock

Let's back up a bit and revisit Steve Jobs' famous unveiling of the iPhone and Slide to Unlock.

At a packed Macworld in January 2007, Jobs wowed an audience of consumers and journalists by demonstrating how natural unlocking an iPhone was going to be. Apple devised an unlocking gesture that combined the security of an intentional command with the spontaneity of multitouch. In Jobs' words:

And to unlock my phone I just take my finger and slide it across.

We wanted something you couldn't do by accident in your pocket. Just slide it across...and boom.

As the iPhone evolved to accommodate stronger passcodes, a fingerprint sensor, and a UI redesign, its unlocking mechanism stayed consistent. The passcode number pad remained on the left side of the Lock screen; even on the iPad's bigger display, the architecture of the Lock screen was no different from the iPhone.

With the iPhone 6s, it became apparent that Slide to Unlock was drifting away from its original purpose. Thanks to substantial speed and accuracy improvements, the second-generation Touch ID sensor obviated the need to slide and type a passcode. However, because users were accustomed to waking an iPhone by pressing the Home button, Touch ID would register that initial click as a successful fingerprint read. The iPhone 6s' Touch ID often caused the first Home button click to unlock an iPhone, blowing past the Lock screen with no time to check notifications.

Ironically, the convenience of Touch ID became too good for the Lock screen. As I wrote in my story on the iPhone 6s Plus:

The problem, at least for my habits, is that there is useful information to be lost by unlocking an iPhone too quickly. Since Apple's move to a moderately bigger iPhone with the iPhone 5 and especially after the much taller iPhone 6 Plus, I tweaked my grip to click the Home button not only to unlock the device, but to view Lock screen notifications as well. While annoying, the aforementioned slowness of previous Touch ID sensors wasn't a deal-breaker: a failed Touch ID scan meant I could at least view notifications. When I wanted to explicitly wake my locked iPhone's screen to view notifications, I knew I could click the Home button because Touch ID wouldn't be able to register a quick (and possibly oblique) click anyway.

That's not the case with the iPhone 6s Plus, which posed a peculiar conundrum in the first days of usage. Do I prefer the ability to reliably unlock my iPhone with Touch ID in a fraction of a second, or am I bothered too much by the speed of the process as it now prevents me from viewing notifications on the Lock screen?

Apple is making two changes to the unlocking process in iOS 10 – a structural one, with a redesign of the Lock screen and its interactivity; and a behavioral one to rethink how unlocking works.

Apple hopes that you'll no longer need to click any button to wake an iPhone. iOS 10 introduces Raise to Wake, a feature that, like the Apple Watch, turns on the iPhone's display as soon as it's picked up.

Raise to Wake is based on a framework that uses sensors – such as the motion coprocessor, accelerometer, and gyroscope – to understand if a phone has been taken out of a pocket, but also if it's been picked up from a desk or if it was already in the user's hands and its elevation changed. Due to ergonomics and hardware requirements, Raise to Wake is only available on the iPhone 6s/7 generations and it's not supported on the iPad.

Apple has learned from the first iterations of watchOS: Raise to Wake on the iPhone 6s and iOS 10 is more accurate than the similar Watch feature that shipped in 2015. In my tests, Raise to Wake has worked well when taking the iPhone out of my pocket or picking it up from a flat surface; it occasionally struggled when the iPhone was already in my hands and it was tricky for the system to determine if it was being raised enough. In most everyday scenarios, Raise to Wake should wake an iPhone without having to click the Home or sleep buttons.

Raise to Wake is only one half of the new unlocking behavior in iOS 10: you'll still need to authenticate and unlock a device to leave the Lock screen. This is where the iPhone's original unlocking process is changing.

To unlock a device running iOS 10, you need to click the Home button. If the display is already on and you place your finger on the Touch ID sensor without clicking it – as you used to do in iOS 9 – that won't unlock the device. By default, iOS 10 wants you to physically press the Home button.

Bye, slide to unlock.

This alteration stems from the unbundling of fingerprint recognition and Home button click, which are now two distinct steps. Placing a finger on Touch ID authenticates without unlocking; pressing the Home button unlocks.

In Apple's view, while Raise to Wake turns on the display, authentication may be required to interact with features on the Lock screen – such as actionable notifications, widgets, or Spotlight results. With iOS 10, users can pick up an iPhone, view what's new on the Lock screen, and authenticate (if necessary) without the risk of unlocking it.

From a design standpoint, this change is reflected in the icons and messages displayed to the user on the Lock screen. When the display turns on with Raise to Wake, a padlock icon in the status bar indicates that the user has not yet authenticated with Touch ID. At the bottom, a 'Press home to unlock' message replaces the old 'slide to unlock' one.

Locked.

With the display on and after Touch ID authentication, 'Press home to unlock' becomes 'Press home to open' and the status bar lock switches to an 'Unlocked' message.

Unlocked.

Under the hood, clicking the Home button and placing a finger on Touch ID are two separate actions. However, the wording of 'Press home to unlock' feels like Apple wants you to think of them as one. The entire message is an illusion – pressing the Home button by itself doesn't actually unlock a device – but Raise to Wake combined with the second-generation Touch ID will make you believe in it.

Touch ID Nuances

If a device doesn’t have Touch ID (or if Touch ID can’t read a fingerprint), pressing the Home button will bring up the number pad or keyboard. And, if an iPhone’s display is activated by clicking the Home button (try this with a non-Touch ID finger) and then the finger is lifted off the button, placing it on Touch ID again (without clicking it) will unlock the device.

On an iPhone 6s, one click on the Home button is all that's needed to exit the Lock screen – at least most of the time. If the iPhone's display is off because Raise to Wake didn't work (or because you manually locked it while holding it), the experience is similar to iOS 9. Clicking the Home button with a Touch ID-enabled finger will wake up the display and bypass the Lock screen.

You can revert to a pre-iOS 10 unlocking experience if you don't like the new one. First, Raise to Wake can be disabled in Settings > Display & Brightness, and your iPhone will no longer turn on when picked up. Additionally, tucked away in Settings > Accessibility > Home Button, you'll find an option called 'Rest Finger to Open'. When enabled, your iPhone will unlock through Touch ID alone, without having to press the Home button.

It takes some time to get used to the new unlocking behavior of iOS 10. The apparent unification of Home button click and Touch ID makes less sense on devices without the second-generation sensor, where one click is rarely enough and tends to bring up the passcode view for a second attempt. And, nostalgically speaking, I miss the old 'slide to unlock' message, although for reasons that are merely emotional and not related to function.

After three months, Raise to Wake and Press to Unlock have made the overall unlocking experience faster and more intuitive. I now expect my iPhone to know when it's time to wake up and show me the Lock screen, and I don't miss the old unlocking process. Raise to Wake eliminates the need to click a button to wake an iPhone; having to press the Home button to unlock removes the risk of accidentally leaving the Lock screen.

But it all goes back to that accidental swipe. Picture this: you've just upgraded to iOS 10, or you've bought a new iPhone with iOS 10 pre-installed, and, instinctively, you slide to unlock. What you're going to see isn't an error message, or the Lock screen bouncing back, telling you that you need to press the Home button instead. You're going to see the biggest change to the Lock screen – potentially, a better way of interacting with apps without unlocking a device at all.

Slide to unlock, and you'll meet the new Lock screen widgets.

Lock Screen Widgets

Technically, Lock screen widgets predate iOS 10. On both the iOS 8 and iOS 9 Lock screens, users could swipe down to reveal Notification Center and its Today view. However, iOS 10 adds an entirely new dimension to the Lock screen, as well as a refreshed design for widgets throughout the system.

The Lock screen's renovation in iOS 10 starts with three pages: widgets and search on the left, the Lock screen (with notifications and media controls) in the middle, and the Camera on the right. You can swipe to move across pages, as suggested by pagination controls at the bottom of the Lock screen.

The Lock screen's new horizontal hierarchy, with widgets on the left.

The leftmost page, called the Search screen, isn't completely new either. Apple took the functionality of Spotlight search and Proactive of iOS 9, mixed it up with widgets, and made it a standalone page on the iOS 10 Lock screen (and Home screen, too).

From left to right: Lock screen widgets on the Search screen; Notification Center; widgets in Notification Center.

Notably absent from iOS 10's Lock screen is the Camera launcher button. By getting rid of the tiny shortcut in the bottom right corner, Apple has made the Camera easier to launch: swiping anywhere to move between Lock screen and Camera is easier than carefully grabbing an icon from a corner. I've been taking more spontaneous, spur-of-the-moment pictures and videos thanks to iOS 10's faster Camera activation on the Lock screen.

Apple's sloppy swiping for Lock screen navigation has one caveat. If notifications are shown, swiping horizontally can either conflict with actionable buttons (swipe to the left) or open the app that sent a notification (swipe right). You'll have to remember to swipe either on the clock/date at the top or from the edge of the display; such is the trade-off of using the same gestures for page navigation and notification actions.

Where to swipe when notifications fill the Lock screen. (Tap for full size)

Three changes stand out when swiping right to open the Search screen:

  • There's a search field at the top, shown by default;
  • The clock stays pinned to the right;
  • Widgets have a new design that favors richer, bigger content areas.

Unlike their predecessors, widgets in iOS 10 don't blend in with the dark background of Notification Center. This time, Apple opted for standalone units enclosed in light cells with an extensive use of custom interfaces, buttons, images, and dark text.

Widgets in Notification Center on iOS 9 and iOS 10.

There's a common thread between widgets and notifications (also redesigned in iOS 10): they're self-contained boxes of information, they sit on top of the wallpaper rather than meshing with it, and they display an app's icon and name in a top bar.

Notifications and widgets. Spot the trend.

The new design is more than an aesthetic preference: the makeover has also brought functional changes that will encourage users and developers to rethink the role of widgets.

A widget in iOS 10 supports two modes: collapsed and expanded. The system loads all widgets in collapsed mode by default, which is about the height of two table rows (about 110 points). All widgets compiled for iOS 10 must support collapsed mode and consider the possibility that some users will never switch to the expanded version. Apps cannot activate expanded mode on the user's behalf; switching from compact to expanded is only possible by tapping on a 'Show More' button in the top right corner of a widget.

Compact and expanded widgets.
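
For developers, opting into the new behavior is a small amount of Today-extension code built on the NotificationCenter framework. Here's a minimal sketch of what that could look like; the 300-point expanded height is an arbitrary placeholder, not a value Apple prescribes:

```swift
import UIKit
import NotificationCenter

class TodayViewController: UIViewController, NCWidgetProviding {

    override func viewDidLoad() {
        super.viewDidLoad()
        // Declaring support for the expanded mode is what makes the system
        // show the 'Show More' button in the widget's header.
        extensionContext?.widgetLargestAvailableDisplayMode = .expanded
    }

    // Called when the user taps 'Show More' or 'Show Less'.
    func widgetActiveDisplayModeDidChange(_ activeDisplayMode: NCWidgetDisplayMode,
                                          withMaximumSize maxSize: CGSize) {
        if activeDisplayMode == .compact {
            // Compact mode has a fixed height (roughly two table rows); use it as given.
            preferredContentSize = maxSize
        } else {
            // Ask for the height the expanded content needs; 300 points is a placeholder.
            preferredContentSize = CGSize(width: maxSize.width, height: 300)
        }
    }

    func widgetPerformUpdate(completionHandler: @escaping (NCUpdateResult) -> Void) {
        // Refresh the widget's content here and report whether anything changed.
        completionHandler(.newData)
    }
}
```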

This is no small modification, as it poses a problem for apps that have offered widgets since iOS 8. Under the new rules, apps updated for iOS 10 can't show a widget that takes up half of the display as soon as it's installed. Any widget that wants to use more vertical space for content – such as a todo list, a calendar, or even a list of workflows – will have to account for the default compact mode.

For some developers, this will mean going back to the drawing board and creating two separate widget designs, as they'll no longer be able to enforce a single one. Others will have to explain the difference to their users. Workflow, which used to offer a widget that could dynamically expand and collapse, is updating the widget for iOS 10 with a label to request expansion upon running a workflow that needs more space.

Workflow's new iOS 10 widget.

There's one exception: legacy iOS 9 apps that haven't been updated for iOS 10. In that case, the system won't impose compact mode and it won't cut off old widgets (which keep a darker background), but there's a strong possibility that they won't look nice next to native iOS 10 ones.

The same widget in iOS 9 legacy mode and with native iOS 10 support.

I don't see how Apple could have handled this transition differently. Design updates aside, there's an argument to be made about some developers abusing Notification Center with needlessly tall and wasteful widgets in the past. Compact mode is about giving control to the users and letting them choose how they prefer to glance at information. Want to install a widget, but don't need its full UI? Use it in compact mode. Need to get more out of it? Switch to expanded.

Apple's decision to adopt compact and expanded modes in iOS 10 is a nod to developers who shipped well-designed widgets in the past, and it provides a more stable foundation going forward.

I've been able to test a few third-party iOS 10 widgets that illustrate the advantages of these changes.

PCalc, James Thomson's popular iOS calculator, has a new widget that displays a mini calculator in compact mode with numbers and basic operations split in two rows.

Despite the small touch targets, the compact interface is usable. If you want bigger buttons and a more familiar layout, you can switch to expanded mode, which looks like a small version of PCalc living inside a widget – edge-to-edge design included.

Launcher doesn't modify its widget's interface when toggling between compact and expanded, but the constraints of the smaller layout force you to prioritize actions that are most important to you.

Using compact mode for summary-style UIs will be a common trend in iOS 10. CARROT Weather is a good example: it shows a summary of current conditions when the widget is compact, but it adds forecasts for the day and week ahead when expanded.

CARROT's widget can be customized with two styles.

Even better, slots in the compact layout can be customized in the app, and you can choose to use the widget in light or dark mode.

Drafts has an innovative implementation of compact and expanded layouts, too. In compact mode, the widget features four buttons to create a note or start dictation. When expanded, it grows taller with a list of items from the app's inbox, which can be tapped to resume editing.

In the past, developer Greg Pierce would have had to ask users to customize the widget or make it big by default; in iOS 10, users can switch between modes as needed.

Widgets' ubiquitous placement pushes them to a more visible stage; as soon as more developers adapt, iOS 10 has the potential to take widgets to the next level.

I believe the new design will play an essential role in this.

The Design of Widgets

Apple advertises legibility and consistency as core tenets of widgets in iOS 10, and I agree: widget content and labels are easier to read than in iOS 9. Standalone light cells separate widgets with further precision; I haven't found translucency with the Lock screen wallpaper to be an issue.

In addition, the light design brings deeper consistency between apps and widgets. Most iOS apps have light backgrounds and they employ color to outline content and indicate interactivity. In iOS 10, widgets are built the same way: the combination of light backgrounds, buttons, and custom interfaces is often consistent with the look of the containing app.

In this regard, widgets feel more like mini-apps available anywhere rather than smaller, less capable extras. The line between widget and full app UIs is more blurred than ever in iOS 10.

Apple's new Notes and Calendar widgets showcase this newfound cohesiveness. The Notes widget displays the same snippets of the list in the Notes app. Buttons to create new notes and checklists are also the same. The widget looks and feels like a small version of Notes available anywhere on iOS.

From app to widget.

The Calendar widget is even more indicative. Glancing at events and recognizing their associated calendar wasn't easy in iOS 9, as they only had a thin stripe of color for the calendar to which they belonged.

The Calendar widget is more contextual on iOS 10.

In iOS 10, forgoing a dark background has allowed Apple to show Calendar events as tinted blocks matching the look of the app. Discerning events and the calendars they belong to is easier and familiar.

Consistency of apps and widgets.

I wouldn't expect every app to adopt a widget design that exactly mirrors the interface users already know, but it can be done. Switching to a light design has given Apple a chance to reimagine widgets for consistency with apps and lively combinations of color, text, and icons. They are, overall, a step up from iOS 9 in both appearance and function.

The new direction also opens up a future opportunity: what is light can be more easily converted to dark. I could see a system dark mode working well for widgets.

The iPad Lock Screen

The iPad's Lock screen doesn't break any new ground, but there are some differences from the iPhone.

On the iPad, notifications are displayed on the left side of the screen when in landscape. They're aligned with the system clock, and they leave room for media controls to be displayed concurrently on the right. Dealing with notifications while controlling music playback is a task well suited for the iPad's larger display.

Unfortunately, Apple doesn't think portrait orientation should warrant the same perks. If a notification comes in while album artwork is displayed on the Lock screen, the artwork will be hidden. Apple decided against using a two-column layout in portrait, which I don't understand: they're already doing it for widgets on the iPad.

No artwork for you, Mr. Portrait.

Furthermore, if no music is playing on an iPad in landscape, having notifications aligned to the left for no apparent reason looks odd and seems...unnecessary.

The right side seems cozy.

Widgets fare a little better. Apple has kept the two-column design first introduced in the Today view of iOS 9; you can still scroll the two lists of widgets independently.

I would have appreciated the ability to further control the resizing and placement of widgets on the iPad, and the Lock screen design seems uninspired. We'll have to make the most of this bare minimum work for now.

Apple's Widgets

iOS 10 sports an increased modularity of widgets. Apple has done away with grouping multiple types of content under Siri Suggestions – most Apple apps/features have their own widget, which can be disabled from a revamped configuration screen.

Widgets' new configuration screen.

Here's an overview of what's changed.

Activity

Your Activity rings from the Apple Watch, with a summary of Move, Exercise, and Stand statistics.

Calendar

A mini calendar interface. Events are displayed as colored blocks matching the calendar they belong to. You can tap on an event to open it, and expand the widget to reveal more events.

Favorites

Shortcuts to your favorite contacts with different ways to get in touch with them. New in iOS 10, you can assign iMessage as well as third-party communication apps (messaging and VoIP) to contact entries in Favorites, which will be displayed in the widget.

Mail

...yeah.

The Mail widget is the weakest of the bunch: it only displays shortcuts for VIP contacts. I would have preferred to see a preview of the unified inbox, or perhaps an option to show flagged messages.

Maps

Maps has three widgets: destinations, nearby, and transit. While the latter isn't available for my area (Rome, Italy), the other two have worked inconsistently. I've never seen a nearby recommendation in the widget, despite being around places rich in POIs. The Destinations widget usually tells me how much time it'll take me to drive home, but it doesn't proactively suggest other locations I frequently visit.

Music

The Music widget is an odd one. It displays a grid of what appears to be either recently played music or your all-time most-listened-to albums. The widget doesn't clarify whether it's showcasing albums or individual songs; it uses album artwork with no text labels, and it plays either the most played song from an album, or an entire album starting from the first song.

A nice perk: music starts playing after tapping the widget without opening Apple Music. But it always feels like a lottery.

News

Top Stories from Apple News (shown even if you mute the channel). The widget uses image thumbnails and custom typography matching the bold font of Apple News for headlines.

The best change from iOS 9: news can be disabled by removing the widget.

Notes

A preview of your most recent notes. In compact mode, the widget only shows the last modified note. In expanded mode, you get more notes, plus buttons to create a new note or checklist, snap a picture, or create a drawing.

Photos

A collection of Memories created by the new Photos app in iOS 10. Each one can be tapped to view the associated memory in Photos.

Siri App Suggestions

iOS 9's proactive Siri Suggestions are now smaller in scope and they're called Siri App Suggestions. The widget displays 4 app shortcuts (8 in expanded mode), and it doesn't suggest other types of content.

Like News, it can also be removed and be placed anywhere on the Search screen.

Tips

You'd think that the Tips widget is useless – everyone likes to make fun of Tips – but hear me out. In compact mode, the widget shows a tip's snippet; you can tap it and open the Tips app. Switch to expanded mode, though, and you'll be presented with a custom interface with an explanation of the tip and a large animation at the top to show you the tip in action.

The Tips widget looks great, and it's the most technically impressive one on iOS 10.

Up Next

The old Today Summary widget has been renamed Up Next. It displays a smaller version of your next event without the full UI of the Calendar widget. Alas, the Tomorrow Summary widget is gone from iOS 10.

Weather

Perhaps the best example of how widgets can use compact and expanded modes, Apple's Weather widget shows weather conditions for the current location when compact, and a forecast of the next six hours when expanded.

Weather is the widget I've used the most in the past three months to look up forecasts from the Lock screen in just a couple of seconds.

Slide to Glance

The move to apps as atomic units scattered across the system is everywhere in iOS 10, with widgets being the foremost example.

Noticeably absent from iOS 10's widgets is a push for more proactive recommendations. As we'll see later, Apple has shifted its Proactive initiative to run through the OS and inside apps rather than distilling it into widgets.

3D Touch is another illustrious no-show. While notifications have been overhauled to make good use of 3D Touch, pressing on a widget will result in a disappointing lack of feedback. 3D Touch would be a perfect fit for widgets – imagine previewing a full note or reading the first paragraphs of a news story from the Lock screen.

The new widget design and Search screen placement make an iPhone more useful without having to unlock it. Apple has done a good job with their built-in widgets; it's up to developers now to rethink how their apps can take advantage of them. I'm optimistic that everything will turn out better than two years ago.

I unlock my iPhone less thanks to iOS 10's more capable Lock screen. Raise to Wake, Press to Open, widgets, search, and rich notifications make the entire Lock screen experience drastically superior to iOS 9.

Easier to navigate, better structured, less prone to unwanted unlocks. I wouldn't be able to go back to the old Lock screen.

Notifications

iOS 10's rethinking of apps as granular interactions doesn't stop at widgets. With a new framework that can turn incoming notifications into rich, actionable interfaces, Apple wants users to spend less time jumping between apps.

Notifications in iOS 9 and iOS 10.

Notifications in iOS 10 share the same design principles of widgets. Rather than being grouped in a list of items on top of a dark background, notifications are discrete light cells that can be pressed (with 3D Touch), pulled down (for incoming banners), or swiped and expanded into a floating card preview.

The anatomy of an expanded notification – whether an app has been updated for iOS 10 or not – has fixed elements that developers can't control. There's a header bar at the top with the icon and name of the app, and a close button on the right to dismiss the notification. Tapping the icon on the left side will open the app that sent the notification.

The standard look of a notification in iOS 10.

This is true for both iPhones with 3D Touch and devices without it; to expand a notification on an iPad or an older iPhone (or if you don't want to use 3D Touch), you can pull down an incoming notification banner or swipe a notification to the left in Notification Center and tap 'View'.

New APIs allow developers to take different actions for notifications that have been sent to the user – including ones that have been cleared. First, notifications can be dismissed with a Clear action by swiping on them. Apps can monitor the dismiss action and stop delivering the same notification on other devices.

Additionally, developers can remove, update, and promote notifications that have already been sent. Apple's goal was to prevent Notification Center from being cluttered with old notifications that aren't relevant anymore. If developers implement this API, updating a notification with fresh content should help users see what's changed. Imagine sports scores or live-streaming apps and how they could update notifications. I'm curious to see which services will convert to this behavior instead of spamming users with multiple alerts.
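
Both behaviors hang off the new UserNotifications framework. A rough sketch of how an app could use them, with a made-up "live-score" identifier, category, and payload:

```swift
import UserNotifications

// Re-using a request identifier replaces an already-delivered notification in place,
// so a score change updates the existing alert instead of stacking a new one.
func postScore(_ score: String) {
    let content = UNMutableNotificationContent()
    content.title = "Live score"            // hypothetical example content
    content.body = score
    content.categoryIdentifier = "score"
    let request = UNNotificationRequest(identifier: "live-score", content: content, trigger: nil)
    UNUserNotificationCenter.current().add(request, withCompletionHandler: nil)
}

// A category that opts into the custom dismiss action, so the app hears about
// the user clearing the notification. Register it once at launch with
// UNUserNotificationCenter.current().setNotificationCategories([scoreCategory]).
let scoreCategory = UNNotificationCategory(identifier: "score",
                                           actions: [],
                                           intentIdentifiers: [],
                                           options: [.customDismissAction])

// Assign an instance of this class to UNUserNotificationCenter.current().delegate.
class NotificationDelegate: NSObject, UNUserNotificationCenterDelegate {
    func userNotificationCenter(_ center: UNUserNotificationCenter,
                                didReceive response: UNNotificationResponse,
                                withCompletionHandler completionHandler: @escaping () -> Void) {
        if response.actionIdentifier == UNNotificationDismissActionIdentifier {
            // The user explicitly cleared the notification; here an app could tell its
            // server to stop delivering the same alert to the user's other devices.
        }
        completionHandler()
    }
}
```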

Underneath the header of an expanded notification is the content developers can control, and where the most important changes to notifications are happening.

In iOS 10, notifications can have a title and a subtitle. The title is displayed in a bold font, which helps identify the subject of a notification. In a Reminders notification, the name of a reminder will be the bold title at the top, with its note displayed as text content below it.

The default look of a notification in iOS 10. Expansion is relative to a notification's placement on screen.

Below the title and subtitle, iOS 10 shows a notification's body text content (same as iOS 9) and actionable buttons. In a welcome change from the past, developers can define more than two notification actions, displayed in a list under the notification's card. If an app requires a quick reply upon expanding a notification, the input field will sit above the keyboard – it's not attached to the notification like in iOS 9.
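
In UserNotifications terms, that means a content object with the new title and subtitle fields plus a category carrying several actions, one of them a text-input action for the quick reply. A hedged sketch; every identifier and string here is a hypothetical example:

```swift
import UserNotifications

// Registers a category whose actions appear in the list under the expanded card.
func registerInviteCategory() {
    // iOS 10 allows more than two actions per notification,
    // plus a text-input action that brings up the quick-reply field.
    let accept = UNNotificationAction(identifier: "accept", title: "Accept", options: [])
    let decline = UNNotificationAction(identifier: "decline", title: "Decline", options: [.destructive])
    let later = UNNotificationAction(identifier: "later", title: "Remind Me Later", options: [])
    let reply = UNTextInputNotificationAction(identifier: "reply",
                                              title: "Reply",
                                              options: [],
                                              textInputButtonTitle: "Send",
                                              textInputPlaceholder: "Message")
    let invite = UNNotificationCategory(identifier: "invite",
                                        actions: [accept, decline, later, reply],
                                        intentIdentifiers: [],
                                        options: [])
    UNUserNotificationCenter.current().setNotificationCategories([invite])
}

// Builds a notification payload that uses the new title and subtitle fields.
func makeInviteContent() -> UNNotificationContent {
    let content = UNMutableNotificationContent()
    content.title = "Team lunch"            // bold line at the top of the card
    content.subtitle = "Tomorrow, 1:00 PM"  // secondary line below the title
    content.body = "Vote on a restaurant before tonight."
    content.categoryIdentifier = "invite"   // ties the content to the actions above
    return content
}
```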

Quick replies in iOS 9 and iOS 10.

Design changes alone, though, wouldn't have sufficed to modernize notifications. To reinvent their feel and capabilities, Apple has created two new extension points for developers in iOS 10: Notification Service and Notification Content.

The Notification Service extension doesn't have an interface and runs in the background. Upon triggering a notification but just before delivering it to the user, an app can call the Notification Service extension to augment or replace its payload. This extension is meant to have a short execution time and it's not designed for long tasks. Possible use cases for Notification Service extensions could be downloading an image or media file from a URL before showing a notification, or decrypting an encrypted payload locally for messaging apps that rely on end-to-end encryption.

The Notification Service extension should come in handy given iOS 10's ability to include a media attachment (images, audio, videos, and even GIFs) in both the notification banner and the expanded notification. If they adopt it, apps like WhatsApp and Telegram could omit the "[Contact] sent you an image" standard notification and display a thumbnail in the notification banner (like iMessage does) and a full image preview in the expanded notification.
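
Concretely, a Service extension is a small subclass with one main override. This is a rough sketch of how an app could attach a downloaded image, assuming a hypothetical "image-url" key in the push payload and a URL that ends in a recognized image extension:

```swift
import UserNotifications

class NotificationService: UNNotificationServiceExtension {

    var contentHandler: ((UNNotificationContent) -> Void)?
    var bestAttemptContent: UNMutableNotificationContent?

    override func didReceive(_ request: UNNotificationRequest,
                             withContentHandler contentHandler: @escaping (UNNotificationContent) -> Void) {
        self.contentHandler = contentHandler
        bestAttemptContent = request.content.mutableCopy() as? UNMutableNotificationContent

        guard let content = bestAttemptContent,
              let urlString = content.userInfo["image-url"] as? String,  // hypothetical payload key
              let url = URL(string: urlString) else {
            contentHandler(request.content)
            return
        }

        // Download the media, write it to a temporary file, and attach it so the
        // banner and expanded card show a real preview instead of plain text.
        URLSession.shared.downloadTask(with: url) { location, _, _ in
            if let location = location {
                let destination = URL(fileURLWithPath: NSTemporaryDirectory())
                    .appendingPathComponent(UUID().uuidString + "-" + url.lastPathComponent)
                try? FileManager.default.moveItem(at: location, to: destination)
                if let attachment = try? UNNotificationAttachment(identifier: "preview",
                                                                  url: destination,
                                                                  options: nil) {
                    content.attachments = [attachment]
                }
            }
            contentHandler(content)
        }.resume()
    }

    override func serviceExtensionTimeWillExpire() {
        // The extension gets limited time; deliver the best attempt so far.
        if let contentHandler = contentHandler, let content = bestAttemptContent {
            contentHandler(content)
        }
    }
}
```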

Notification Content extensions are what users are going to see the most in daily usage, and they motivate iOS 10's notification card design.

A notification in iOS 10 can show a custom view between the header and default text content. Custom views can be anything – an embedded map, a message conversation, media, a calendar view, etc. – and they're managed by the Notification Content extension. Custom views are non-interactive: they can't receive touch events, but they can be updated in-place in response to a task performed from a notification action. Apps can hide the default content of a notification if the custom view is informative enough.
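
The extension itself is an ordinary view controller adopting one protocol. A minimal sketch, assuming a hypothetical label wired up in the extension's storyboard and a category declared under the UNNotificationExtensionCategory key in its Info.plist:

```swift
import UIKit
import UserNotifications
import UserNotificationsUI

// Supplies the custom view shown between the notification's header and its
// default text. The view can't receive touches, but it can redraw itself
// when an action is handled.
class NotificationViewController: UIViewController, UNNotificationContentExtension {

    @IBOutlet weak var summaryLabel: UILabel!   // hypothetical label in the extension's storyboard

    func didReceive(_ notification: UNNotification) {
        // Populate the custom view from the notification's payload.
        summaryLabel.text = notification.request.content.body
    }

    // Optional: update the view in place after the user taps an action,
    // instead of dismissing the notification immediately.
    func didReceive(_ response: UNNotificationResponse,
                    completionHandler completion: @escaping (UNNotificationContentExtensionResponseOption) -> Void) {
        summaryLabel.text = "Handled: \(response.actionIdentifier)"
        completion(.doNotDismiss)
    }
}
```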

Service and Content extensions, combined with the expanded design, have turned notifications in iOS 10 into a completely new experience. Notifications are no longer just text: they are custom app UIs delivered to you with rich previews and interactions that can live on longer than a couple of seconds. Notifications in iOS 10 are mini apps in and of themselves.

When you receive an iMessage that contains a photo, the incoming notification can be expanded, either with 3D Touch or a swipe. You'll be treated to a full iMessage conversation UI, living inside the notification, with the same transcript, read receipts, and typing indicators you'd see in the Messages app.

To expand a notification, you can pull it down or press on it.

Not only can you send a reply – you can keep the iMessage interface open as you keep a conversation going from the notification. It's a fantastic way to check into a conversation without the constraints of a quick reply.

Scroll up in the transcript to view older messages.

When you're done, swipe down to dismiss the notification, and you'll be back to whatever you were doing.

Calendar notifications follow the same concept. If an event with a location attached is coming up, the expanded notification will display the default text content at the bottom, but also a preview of the address with a Maps view at the top.

Thanks to actionable buttons, you can open directions in Maps without launching Calendar. If an upcoming event doesn't have a location, you'll see a preview of your agenda inside the notification.

I tested a version of Workflow optimized for iOS 10, which brings improved notification support with the ability to customize the content displayed in a notification card. In addition to a title, you'll be able to embed pictures, videos, GIFs, and even Maps views into a Workflow notification.

Rich notifications created with Workflow.

Pictures are displayed as thumbnails in a notification banner before expanding it; videos can be played inline within the card itself.

And if you often receive messages containing GIFs, iOS 10 will let you preview them directly from a notification.

CARROT Weather has a clever take on rich notifications in iOS 10. The daily digest and severe weather/precipitation alerts can be expanded into dynamic preview cards.

Through a Notification Content extension, the app can embed a custom interface, sounds, and even animations inside the notification card. As a result, viewing CARROT's notifications feels more like using the app rather than reading a plain text summary.


With a new framework and the flexibility granted by extensions, we're going to see a rise of interaction methods fueled primarily by notifications. Of all the places where an app can advertise its functionality on iOS (widgets, keyboards, extensions), a notification is the most direct, contextual way to reach users at an appropriate time.

A notification carries interest and, in many cases, a sense of urgency. iOS 10 transforms notifications from a passive delivery system into an active experience where users engage with an app through UIs, actions, and feedback they're already familiar with. It's a win-win for developers, who can make their apps more useful through richer notifications, and for users, who no longer have to open apps to benefit from their services.

iOS 10's notifications are a new layer on top of apps. They're going to change how we deal with them every day.

The Home Screen

The iPhone 6s brought the first significant adjustment to the iOS Home screen in years – 3D Touch quick actions. With iOS 10, Apple is cautiously expanding the Home screen beyond app shortcuts, but in ways you might not expect.

Search from the Home screen: pull down (left) or swipe right to open the new Search screen.

As in iOS 9, Spotlight search can be accessed from two locations: the Search screen on the left side of the Home screen and by pulling down on app icons. The Search screen on the left mirrors its Lock screen counterpart.

A Pale Page Dot

The Search screen (both on the Lock screen and Home screen) doesn’t have a special page indicator at the bottom – it has a standard page dot. Apple may be hinting that the page can do more than search alone, but, visually, nothing distinguishes it from a regular page. A different icon would have been better.

Notification Center has gone through some deeper changes. The segmented control to switch between notifications and widgets at the top is gone, replaced by another set of page indicators. Every time you open Notification Center, iOS 10 will default to showing you notifications in chronological order under a new 'Recent' header – it doesn't remember your position in the two pages. Unfortunately, the option to group notifications by app has also been removed.

Whether by laziness or deliberate design, there's an abundance of ways to activate Spotlight search in iOS 10. Let's round them up:

  • Search from the Lock screen (above widgets);
  • Open the Search screen (left side of the Home screen) and pull down or tap the search field;
  • Pull down on icons on the Home screen;
  • Swipe down to open Notification Center and tap Search above notifications;
  • Swipe right on Notification Center to open widgets and find Search at the top;
  • Use Command-Space on an iPad with an external keyboard and Spotlight will open modally on top of whatever app you're using without going back to the Home screen;
  • Last, and perhaps more perplexingly, there's a hidden way to open Spotlight modally when inside apps on the iPhone 6s. When using an app, swipe down slowly from the status bar until you feel a first haptic feedback, then let go. Instead of opening notifications, the text cursor will focus in the search field. If you don't let go after the first vibration but keep swiping down, you'll open Notification Center. This method doesn't work on the Home screen – only in apps. It's also supported on older devices, albeit without haptic feedback.

That's seven ways to open Spotlight search on iOS 10.

Six shades of Spotlight search on iPhone.

Being able to access search from everywhere – be it on the Home screen, the Lock screen, or when using an app – is convenient. It makes Spotlight pervasive. As Apple continues to grow their search efforts across native apps, web partnerships, and Proactive suggestions, Spotlight's omnipresence will become a valuable strategic asset.

Apple continues to be a steadfast supporter of the Home screen as a grid of icons. In a potential disappointment for those who hoped to see a major Home screen refresh this year, the biggest new feature is an extension of 3D Touch quick actions and widgets, rolled into one.

Quick actions and widgets on the Home screen.

Apps that offer a compact widget in iOS 10 can display it alongside quick actions when a user presses the app's icon. The widget is the same used in the Search screen – in fact, there's a button to install it directly from the Home screen.
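
Under the hood, this is the app's regular Today extension doing double duty. Below is a minimal sketch of an iOS 10 widget that supports both the compact and expanded display modes; the label and its text are placeholders, not a real app's widget.

```swift
import UIKit
import NotificationCenter

// Minimal Today-widget sketch (NCWidgetProviding). The same extension backs
// the Search screen and the widget shown next to Home screen quick actions;
// iOS 10 shows the compact mode by default.
class TodayViewController: UIViewController, NCWidgetProviding {

    private let label = UILabel()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Opt into the larger mode so the widget gets a "Show More" toggle.
        extensionContext?.widgetLargestAvailableDisplayMode = .expanded
        label.frame = view.bounds
        label.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        label.textAlignment = .center
        view.addSubview(label)
    }

    func widgetPerformUpdate(completionHandler: @escaping (NCUpdateResult) -> Void) {
        label.text = "Hello from the widget"   // hypothetical content
        completionHandler(.newData)
    }

    // Resize when the user switches between compact and expanded.
    func widgetActiveDisplayModeDidChange(_ activeDisplayMode: NCWidgetDisplayMode,
                                          withMaximumSize maxSize: CGSize) {
        preferredContentSize = (activeDisplayMode == .compact)
            ? maxSize
            : CGSize(width: maxSize.width, height: 200)
    }
}
```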

iPhone Plus models can display quick actions and widgets on the landscape Home screen as well.

I'm not sure I buy into Apple's reasoning for combining widgets and quick actions – at least not yet. The glanceability of widgets finds its raison d'être on the Lock screen and inside apps; on the other hand, I associate going back to the Home screen and pressing an icon with launching, not glancing. Years of iOS usage trained me to see the Home screen as a launchpad for apps, not an information dashboard.

In three months of iOS 10 – and with plenty of glanceable/actionable widgets to test – I've only remembered to use a widget on the Home screen once (it was PCalc). It's not that having widgets alongside quick actions is bad; it's just forgettable. It's the equivalent of two neighbors being forced to live together under the same roof. Having company can be nice sometimes, but everyone would be better off at their own place.

There are other smaller 3D Touch additions to the Home screen in iOS 10. You can press on folders to bring up a Rename action, and apps inside folders that have unread badges will be listed in the folder's quick action menu.

Folders have also received a visual refresh, with a nicer background blur that shows the grid of icons in the current Home screen page.

On the iPad, Apple didn't bring any improvements to the Home screen in iOS 10, but I'm sure you'll be relieved to know that closing an iPad app no longer adjusts the icon's corner radius on the Home screen.

This relates to a deeper change happening to Home screen animations. Apple has rebuilt the entire SpringBoard animation stack with faster, interruptible animations. Along with a shorter animation curve for launching apps (addressing one of the most criticized aspects of iOS 7), you can click the Home button right after tapping an app's icon and the animation will stop, taking you back to the Home screen in an instant.

You can try the same with a folder: tapping outside of it will cancel the animation instantly in mid-flight. The difference with iOS 9's Home screen animations is staggering.

They're not a "feature", but new animations are the best Home screen change in iOS 10.


It's fair to wonder if Apple will ever desecrate the sanctity of the Home screen and allow users to mix icons and widgets.

Anyone who's ever looked at Android will spot obvious similarities between widgets for Google's platform and what Apple has done with widgets in iOS 10. Apple still believes in the separation of icons and app content; they only added widgets to 3D Touch quick actions and they didn't even allow the iPad Pro's large Home screen to go beyond icons. But for how long?

The iOS Home screen has served us well for years, but as screens keep getting bigger, it's time to do more than a grid of icons with quick actions. The other side of the fence is closer than ever; a final leap wouldn't be too absurd.

Control Center

Since its introduction in 2013, Control Center has become a staple of iOS, providing users with a panel of commonly accessed shortcuts. iOS 10's Control Center is a radical shift from its origins, and a harbinger of how iOS is changing.

Control Center's design has evolved over the years, from the wireframe-like look of iOS 7 to the friendlier, rounder buttons of iOS 9.

Control Center, however, was led astray by the expansion of iOS, to the point where cramming more functionality into it turned into a balancing act of prioritizing important controls without sacrificing their purpose.

It was clear that Control Center's original vision couldn't scale to the growing nature of iOS. And so with iOS 10, Apple has torn down Control Center and started from scratch. The single-page mosaic of tiny buttons is no more. The new Control Center breaks up system shortcuts and audio controls in two separate pages, with the addition of a third page for HomeKit (if available). Everything's bigger, spacious, and colorful.

The three pages of Control Center in iOS 10.

You still open Control Center with a swipe from the bottom of the display. In iOS 10, swiping pulls up a card with paginated controls underneath it. The design is familiar, yet unmistakably new. Margins across each side convey the card metaphor; controls are bigger and buttons have more padding; there's more color in every card.

After three years of Control Center, the new version in iOS 10 feels lively and friendly; perhaps even more fun. On the other hand, pagination and bigger controls raise a question: has simplicity come at the expense of efficiency in Control Center?

System Controls

A useful exercise to understand Control Center in iOS 10 is to take stock of how much Apple is leaving behind. Let's compare iOS 9's Control Center to the same screen in iOS 10:

The first page of Control Center in iOS 10 has lost audio playback. Initially, that may feel like a downgrade. But let's swipe left and consider what Control Center has gained by separating system and audio controls:

The difference is striking. Giving audio playback its own space lets Control Center present more information for the media being played. It's also more accessible thanks to bigger text labels, buttons that don't need to be carefully tapped, and hardware controls embedded in the same page.

This won't be easy to accept for iOS power users who cherish dense UIs: Control Center buys into a trend followed by many (but not all) parts of iOS 10. Big, bold controls, neatly laid out, spread over multiple views.

The first beneficiary of such clarity is the system controls page. The first row of toggles at the top has kept iOS 9's iconography and arrangement, but each button is color-matched to the setting it activates when toggled.9

Control Center is bleeding...four colors?

I found colored toggles extravagant at first; now, I like that I can glance at those buttons and know which setting is engaged.

Don't forget about landscape mode.

The brightness slider and the AirPlay, AirDrop, and Night Shift buttons have been enlarged and simplified as well. For one, the slider's puck is more comfortable to grab. The buttons reveal another tendency in iOS 10's semi-refreshed design language: they're actual buttons with rounded borders and they use color to indicate status.

In a change that's reminiscent of Sam Beckett's fantastic concept, you can press on the bottom row of shortcuts to show a list of 3D Touch quick actions. These include three intensity levels for the flashlight, timer options, a shortcut to copy the last Calculator result, and different Camera modes.

As I elaborated before, Control Center was an ideal candidate for 3D Touch actions. However, Apple's implementation in iOS 10 is limited to the bottom row of apps; you can't press on the Bluetooth icon to connect to previously paired devices, nor can you press on the Wi-Fi toggle to connect to a different network. The addition of 3D Touch to the lower end of Control Center shows that Apple recognizes the utility of quick actions for system-wide shortcuts, but they're not fully committed to the idea yet.

Despite some missing features and growing pains to be expected with a redesign, iOS 10's first Control Center page is an improvement. With a sensible reliance on color, a more legible layout, and the first steps toward full 3D Touch support, Control Center's system card is easier to parse, nimble, and intuitive.

"It Also Looks Great on the iPad"

Control Center's design direction has been taken to the extreme on the iPad. Only one page can be used at a time; the AirDrop, AirPlay, and Night Shift buttons are needlessly wide. It doesn't take a design expert to figure that Apple just wanted to ensure basic compatibility with an iPhone feature instead of designing Control Center around the iPad.

Look at it this way: if Control Center didn't exist on the iPhone and Apple decided to introduce it on the iPad today, would it look like this?

The lack of an iPad-first approach was passable in the old Control Center because of its compact design. But with iOS 10, following the iPhone's model has a detrimental effect. Buttons are too big and little care went into optimizing the UI for the iPad's screen. Apple should reconsider what they're doing with Control Center on the iPad instead of upscaling their iPhone designs.

Music Controls

In iOS 10, managing music and audio playback from Control Center is a richer experience, visually and functionally superior to iOS 9.

The page is split in three areas: audio information and, for the first time, artwork at the top; progress, playback controls, and volume in the middle; hardware accessories at the bottom. This is true for Apple Music and Podcasts as well as third-party apps, which don't need to optimize for iOS 10 to show album artwork.

I was skeptical when I saw that Apple moved audio controls to a separate card. The ubiquitous presence of an audio widget was my favorite aspect of Control Center; adding an extra step to reach it didn't seem a good idea. After adjusting to Control Center's audio page in the first month of iOS 10, I went back to iOS 9 and controlling music felt limited and bland.

There are two aspects to Apple's design worth noting. First, Control Center remembers the page you were using before dismissing it. If you swipe up, swipe left to open music playback, then close Control Center, the next time you open it, you'll get the Now Playing card instead of being taken back to the first page. Thanks to this, having audio controls on a separate page hasn't been a problem in my experience, but I wonder if Apple should allow reordering pages as an option.

Second, the purpose of the redesign. With artwork and comfortable UI elements, the page feels like a miniaturized music app rather than a cumbersome mishmash of buttons and sliders. It's almost as if Control Center was reimagined for how normal people like to know what's playing.

From an interaction standpoint, artwork creates a bigger touch target that you can tap to be taken into the app playing audio10; in iOS 9, you had to precisely tap on a song's small title in Control Center. There's a deeper sense of context, too. Previously, it always took me a few seconds to read through a song's information. With iOS 10, I can swipe up and glance at the artwork to see what I'm listening to.

There's a subtle touch I want to mention. When music is playing, artwork is big, it has a drop shadow, and Control Center says 'Now Playing on...' at the bottom with an icon for the device where audio output is happening. Hit pause, and the artwork shrinks, losing the drop shadow, as the 'Now Playing...' message disappears. Tap play again, and the artwork grows bigger with a delightful transition.

Control Center's audio page has two functional problems Apple should address. Song details (title, artist, and album) have been turned into lines of text that don't scroll and get cut off. Try to listen to songs with long titles – say, I've Got a Dark Alley and a Bad Idea That Says You Should Shut Your Mouth (Summer Song) – and you'll be surprised Apple designers didn't consider the issue.

That Says...?

In addition, the ability to "love" songs to train Apple Music has been removed from Control Center (and the Lock screen). I don't understand the decision, as having a dedicated page provides even more room for music controls.

Despite the merits of artwork and more intuitive controls, I don't think Apple added a standalone audio card to Control Center for those reasons alone. To me, the most convincing explanation comes from the hardware menu:

Picking audio accessories in Control Center.

With just a few taps, you can connect to Bluetooth headphones or wireless speakers from anywhere on iOS without opening Settings. There's an obvious subtext: for a device without a headphone jack, an easier way to switch between wireless audio accessories isn't just a nicety – it's a necessity.

Audio playback is the clear winner of the new Control Center in iOS 10. Apple freed themselves from the constraints of iOS 9's tiny audio controls, and, after three years, music is claiming the prime spot it deserves in Control Center. The new audio page brings a more engaging, integrated listening experience that paves the road for what's to come.

HomeKit Controls

You can't use the third page of Control Center unless you've configured at least one HomeKit device. I don't own a lot of HomeKit accessories (I have three Hue lights and a few Elgato sensors), but the new Home page has grown so much on me, I'm no longer using any third-party HomeKit widgets.

Besides being available to users with HomeKit devices, Control Center's Home card only displays accessories and scenes that have been marked as favorites in the new Home app. The page doesn't list every HomeKit accessory, nor does it work with third-party home automation devices that don't support HomeKit.

If you meet these requirements, you'll be able to swipe over the Music card to reveal the Favorite Accessories view.

Accessory buttons carry a name and icon assigned in the Home app, and, if supported, a percentage label for intensity (lights have it, for example). A button in the top right lets you switch between accessories and scenes. To turn them on and off, you just tap a button once.

Buttons can be long-tapped to open a detail screen with more options.11 For my Hue lights, holding a button for a fraction of a second reveals a vertical slider for intensity, which can be adjusted without lifting a finger off the screen.

A second layer of navigation is nested into the detail view. With multicolor lights, you can tap on a Colors button below the intensity slider to modify presets and open a color wheel to pick a different shade. The wheel even has a segmented control to switch between color and temperature – a surprisingly deep level of hierarchy for a Control Center page.
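
Control Center talks to these accessories through HomeKit directly, and a third-party app can do the same. Here's a hedged sketch of roughly what the brightness slider boils down to – finding a lightbulb service and writing to its brightness characteristic. The 40% value and the single-light assumption are mine, for illustration only.

```swift
import HomeKit

// Sketch: dim the first lightbulb in the primary home to 40% by writing to
// its HomeKit brightness characteristic. Accessory setup and the 40% value
// are hypothetical.
class LightDimmer: NSObject, HMHomeManagerDelegate {

    private let homeManager = HMHomeManager()

    override init() {
        super.init()
        homeManager.delegate = self
    }

    // HomeKit loads its database asynchronously; wait for this callback.
    func homeManagerDidUpdateHomes(_ manager: HMHomeManager) {
        guard let home = manager.primaryHome else { return }

        // Find a lightbulb service and its brightness characteristic.
        for accessory in home.accessories {
            for service in accessory.services where service.serviceType == HMServiceTypeLightbulb {
                if let brightness = service.characteristics.first(where: {
                    $0.characteristicType == HMCharacteristicTypeBrightness
                }) {
                    brightness.writeValue(40) { error in
                        if let error = error {
                            print("Failed to dim light: \(error)")
                        }
                    }
                    return
                }
            }
        }
    }
}
```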

Adjusting colors and temperature for lights inside Control Center.

Unfortunately, accessories that only report basic status messages don't have a useful detail view.

In spite of my limited testing environment, Control Center has become my favorite way to manage HomeKit lights and scenes. It's a testament to Apple's penchant for native integrations: lights turn on immediately because commands don't go through a third-party server, and the entire flow is faster than asking Siri to activate an accessory. I was a heavy user of third-party HomeKit widgets and apps before; on iOS 10, I have no reason to do that anymore thanks to Control Center.

If Apple didn't have big plans for the connected home, they wouldn't have given HomeKit its own section in Control Center. With HomeKit expanding to new accessory lines, I think it's going to be my second most used card after music.

Extended Control

After three years, Control Center is growing up. To make the pendulum swing back towards simplicity, Apple has traded some convenience of the original design for three standalone pages. By unbundling functionality in discrete units, Control Center is more legible, usable, and flexible.

There are missteps. The lack of any kind of user customization is inexcusable in 2016. The bottom row of shortcuts, down to four icons again, still can't be modified to accommodate user-selected apps. And you won't be able to swap toggles at the top for settings you access on a frequent basis.

Half-baked integration with 3D Touch feels like a timid attempt to take Control Center further. The addition of quick actions for apps in the first page is laudable, but why isn't the same true for toggles at the top as well? And if HomeKit accessories can show nested detail views, why can't Apple Music display a lyrics screen, too?

I want to believe that iOS 10's Control Center is foreshadowing the ability for developers to provide their own "app pages" and for users to swap default shortcuts with their favorite ones. More than ever before, Control Center is ripe for extensibility and personalization. Like widgets, I can see a future where we interact with some types of apps primarily through mini interfaces in Control Center.

I wouldn't have expected pagination to be what I wanted, but Apple was right in rethinking Control Center as a collection of pages rather than a complex unified dashboard. The majority of iOS users won't be affected by Apple's design trade-offs; they'll appreciate a screen that doesn't need a manual.

The new Control Center experience isn't a regression; it's a much needed reassessment of its role in the modern iOS.

More 3D Touch

As it's evident by now, Apple has increased the presence of 3D Touch in iOS 10. On top of notifications, Control Center, and the Home screen, 3D Touch actions have been brought to more apps and system features.

Notification Center

Like on the Apple Watch, you can press on the Clear button in Notification Center to clear all notifications in one fell swoop. Finally.

Siri App Suggestions

Apps suggested by Siri support 3D Touch to show the same quick actions available on the Home screen.

Apple Music

Among many changes, Apple Music has been given the extended 3D Touch treatment with a contextual menu for selected items and playback controls. Pressing a song or the bottom player brings up a list of options that include adding a song to a library, liking it, saving it to a playlist, or opening lyrics.

Manage Downloads

When downloading apps from the App Store or restoring a device from an iCloud backup, you can press on an in-progress download to pause it, cancel it, or prioritize it over others.

Share Apps

iOS 10 automatically adds a Share button to an app's quick action menu on the Home screen to share its link with friends. Presumably, this is meant to bolster app discovery and sharing among users.

Beta Feedback

Pressing on the icon of a TestFlight beta app shows a shortcut to send feedback to the developer via Mail.


The pervasive use of 3D Touch in iOS 10 proves Apple wants it to be an essential iOS feature. After using iOS 10, going back to iOS 9 feels like missing several layers of interaction.

This creates an even stronger tension between 3D Touch-capable iPhones and devices without it. Right now, Apple is resorting to swipes and long-taps to simulate 3D Touch on iPads and older iPhones; will they always be able to maintain backwards compatibility without making more features exclusive to 3D Touch?

Messages

iMessage is a textbook example of how a feature can turn into a liability over time.

When it was introduced five years ago, iMessage promised to bring a grand unification of SMS and free, unlimited texting with media attachments. iMessage turned Apple's Messages app into a single-stop solution for conversations between iOS users and those who would later be known as green-bubble friends. It was the right move at the time12, and it allowed Apple to have a communication service as a feature of iOS.

Over the last five years, messaging has outgrown texting. Meanwhile, iMessage (the service) and Messages (the app) have remained stuck in their ways.

Services like Facebook Messenger, WhatsApp, LINE, and WeChat haven't only reached (or surpassed) iMessage in terms of users; as mobile-first messaging apps without SMS' technical (and conceptual) debt, they have been able to relentlessly iterate on design, novel messaging concepts, notifications, and app integrations.

These companies, free of past constraints, have envisioned new ways to communicate. They've grown messaging apps into platforms, enabling others to extend them. And maybe some of the current messaging trends will turn out to be fads, but it's hard to argue against Apple's competitors with their numbers, cultural influence, and progressive lock-in. They're no joke, and Apple knows it.

But I wouldn't ascribe iMessage's slow pace of evolution to its SMS legacy alone. Because of its end-to-end encryption and Apple's strict policy on not storing sensitive user information, iMessage is by nature trickier to extend. Apple's efforts in this area are commendable, particularly when you consider how the aforementioned services diminish in functionality once you add encryption.

However, security hurdles shouldn't be an excuse for iMessage's glaring shortcomings. As laudable as Apple's stance is, most users aren't willing to put up with an app that feels old. They want to liven up conversations with rich graphics and apps. They want messaging to be personal. Technologists won't like this, but, ultimately, people just want a modern messaging app that works.

From a user's perspective, it's fair to say that Apple has been too complacent with iMessage. The service is by no means a failure – it serves hundreds of millions of users every day. But those metrics don't matter when stasis yields something worse than numbers alone: cultural irrelevancy. That iMessage, as many see it, "is just for simple texting".

The time has come for iMessage to take the next step. With a willingness to welcome developers into its most important app, and without giving up on its security ideals, Apple is reshaping how users can communicate, express themselves, and share. With iMessage in iOS 10, Apple is ready to embrace change.

App Changes

Before delving into the bigger enhancements to Messages, I want to touch upon changes to the app's interface and some minor features.

The conversation's title bar has been redesigned to embed the recipient's profile picture. Having a photo above a conversation helps identify the other person; the increase in title bar height is a trade-off worth accepting.

There's new artwork for contacts without a profile picture, too.

The profile picture can be tapped to open a person's contact card, or pressed to bring up a 3D Touch menu – the same one available in Contacts and Phone, with a list of shortcuts to get in touch with that person.

iOS 10 brings a new layout for the bottom conversation drawer. By default, a conversation opens with a narrow text field and three icons next to it – the camera, Digital Touch, and the iMessage app launcher. As you tap into the text field to reply to a message, the three icons collapse into a chevron that can be expanded without dismissing the keyboard.

Apple has also redesigned how you can share pictures and videos. The new media picker consists of three parts: a live camera view to quickly take a picture; a scrollable grid of recent items from your library; and buttons to open the full camera interface or the photo library, accessed by swiping right.

The assumption is that, on iMessage, people tend to share their most recent pictures or take one just before sharing it. The live camera view can be used to snap a photo in a second (you don't even have to tap on the shutter button to take it). Moving the camera and library buttons to the side (hiding them by default) has freed up space for recent pictures: you can see more of them thanks to a compact grid UI.

Some won't like the extra swipe required to open the camera or library, but the live camera view makes it easier to take a picture and send it.

After picking or taking a picture, you can tap on the thumbnail in the compose field to preview it in full screen. You can also tap and hold a picture in the grid to enter the preview screen more quickly.13

Markup inside Messages.

Here, you have two options: you can edit a picture with the same tools of the Photos app (albeit without third-party app extensions) or use Markup to annotate it. You can tap on the Live Photo indicator to send a picture without the Live part, or press on it to preview the Live Photo.

Speaking of photos, iMessage now lets you send images at lower quality, likely to save on cellular usage. You can enable Low Quality Image Mode in Settings -> Messages.

One of the oldest entries on my iOS wish lists is also being addressed in iOS 10: you can choose to enable read receipts on a per-conversation basis.

If you, like me, always keep read receipts turned off but would like to enable them for important threads, you can do so by tapping the 'i' button at the top of a conversation and then 'Send Read Receipts'. The toggle matches the default you have in Settings and it can be overridden in each conversation.

Richer Conversations

While Messages may not look much different from iOS 9 on the surface, the core of the app – its conversation view – has been refreshed and expanded. iMessage conversations have received a host of new features in iOS 10, with a focus on rich previews and whimsical, fun interactions.

Links

In its modernization of iMessage, Apple started from web links. After years of plain, tappable URLs, Messages is adopting rich link previews, which are inspired by iOS 9's link snippets in Notes, but also more flexible and capable.

Rich links aren't a special setting of the app: the first time you receive a link in an iMessage conversation in iOS 10, it'll appear as a 'Tap for Preview' button in the conversation. This is a one-time confirmation that you want to load links as rich previews instead of plain URLs, which also look different from iOS 9.

Loading a rich link for the first time in iOS 10.

Like in Notes (and other services such as Slack and Facebook), rich previews use Open Graph meta tags to determine a link's title, featured image, audio and video file, or description. A web crawler has been built into Messages: as soon as you send a link, the message's bubble will show a spinner, and, depending on the speed of your Internet connection, it'll expand into a rich message bubble after a second, within the conversation.
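
Apple's crawler is private, but the underlying technique is simple enough to sketch: fetch the page and scrape its `og:` meta tags. The naive regular expression below is an assumption for illustration only – it's not how Messages does it, and it wouldn't survive messy real-world HTML.

```swift
import Foundation

// Sketch of the general Open Graph technique: fetch a page and pull out
// <meta property="og:..."> tags with a deliberately simple regular expression.
func fetchOpenGraphTags(for url: URL,
                        completion: @escaping ([String: String]) -> Void) {
    URLSession.shared.dataTask(with: url) { data, _, _ in
        guard let data = data, let html = String(data: data, encoding: .utf8) else {
            completion([:])
            return
        }

        var tags: [String: String] = [:]
        // Assumes property="..." appears before content="..." in each tag.
        let pattern = "<meta[^>]+property=\"(og:[^\"]+)\"[^>]+content=\"([^\"]*)\""
        if let regex = try? NSRegularExpression(pattern: pattern, options: .caseInsensitive) {
            let range = NSRange(html.startIndex..., in: html)
            for match in regex.matches(in: html, options: [], range: range) {
                if let keyRange = Range(match.range(at: 1), in: html),
                   let valueRange = Range(match.range(at: 2), in: html) {
                    tags[String(html[keyRange])] = String(html[valueRange])
                }
            }
        }
        completion(tags)   // e.g. ["og:title": "...", "og:image": "..."]
    }.resume()
}
```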

Paste, fetch, expand into rich link.

Rich link previews in Messages use the same technology Apple brought to Notes last year, but they've been designed differently. They're message bubbles with a title and domain subtitle; the upper section, where the featured image of a link is, can grow taller than link snippets in Notes. Web articles tend to have rectangular image thumbnails; podcast episodes shared from overcast.fm are square; and links to iPhone apps shared from the App Store show a vertical screenshot.

Multiple types of shared links in Messages.

Furthermore, the behavior of sharing links differs between Notes and Messages. Allow me to get a bit technical here.

In Notes, only links captured from the share extension are expanded into rich previews; pasting text that contains a link into a note doesn't turn the link into a rich preview.

13 Sep 23:25

Sightline: The Zoned Capacity Myth in Seattle

by pricetags

From the Sightline Institute.  This will sound familiar in Vancouver too, where Elizabeth Murphy in The Sun argues that there is plenty of zoned capacity in this city.

BY Dan Bertolet

[Residents] might also ask why the city insists on ever-taller buildings and doubling or tripling density in single-family zones with accessory dwelling units, even though planners say current zoning has plenty of capacity.

So declared the Seattle Times editorial board, parroting one of the most persistent and ubiquitous arguments made during zoning debates far and wide, a rallying cry in neighborhood preservation circles not only in Seattle but in cities across Cascadia and beyond. “We have plenty of zoned capacity” is repeated credulously and earnestly by citizen activists and homeowners at city council meetings and community forums and in online debates (see here, here, here, here, here, here, here, here, here, here).

The zoned capacity argument is also false. It’s a myth. That’s what anyone who carefully looks into the numbers on zoned capacity and the methodology for estimating them can’t help but conclude.

What makes the myth plausible, though, is that zoned capacity—how many new homes could theoretically be built under zoning rules—is a real data point that Seattle and other cities estimate and publish. (You can find it here for Seattle and here for Portland, for example.) And these estimates invariably indicate that there is plenty of zoned capacity to accommodate projected population growth, leading people to wrongly assume that their city is making enough room for newcomers. Reaching that conclusion may be an understandable layperson mistake. For knowledgeable advocates, however, it’s more like malpractice. …

But first, a more important point—in fact, the most important thing to know about zoned capacity: in every city, zoned capacity is a side show to the main event. The main event is housing prices.

Housing prices are the crux of the matter. They reveal if people have enough housing choices. If vacancy rates are low and rents and housing prices are rising, then a city needs more homes. Period. The city needs to remove zoning-code barriers to more housing, so that builders can construct more homes. …

… here’s a summary of the key reasons Seattle’s zoned capacity estimate is misleading and does not justify halting upzones:

  • The assessment overestimates zoned capacity. It ignores many real-world obstacles to housing development.
  • Most of Seattle’s zoned capacity is in dense commercial areas, which are less family-friendly and more likely to expose residents to air pollution and automobile hazards.
  • Seattle’s zoned capacity is shrinking as construction booms. The best building sites are already gone, and others are going fast. The faster a city grows, the sooner it makes sense to upzone and keep plenty of buildable sites available.
  • Delaying upzones has the paradoxical effect of reducing future zoned capacity. Every building erected to four stories rather than eight, because zoning is too restrictive, represents four floors of potential homes denied to the city for as much as a century.
  • Seattle’s population is growing much faster than projected. …

The numbers are wrong: Why Seattle’s zoned capacity estimate is too high

1. It ignores how buildings actually get built

2. It ignores that Seattle is using up its “lowest hanging fruit” zoned capacity and neglecting future capacity needs.

3. It relies on a too-low population projection.

The values are wrong: Seattle’s zoned capacity does not promote equity

1. The capacity is not zoned as a shared responsibility across neighborhoods.

2. The capacity is not zoned as family-friendly or environmentally just.

… estimating zoned capacity is a theoretical exercise that ignores many of the factors that determine whether or not housing gets built: timeline, fluctuating costs and prices, unwilling sellers and the challenges of parcel assemblage, mismatches between zoning and demand, neighborhood opposition that delays or shrinks or scares away new housing, concentration of zoned capacity in dense districts with worse air and traffic and that are outside of the most desirable neighborhoods.

Zoned capacity also gets consumed by new construction itself, especially when zoning causes smaller buildings with fewer homes to be built, which locks in housing shortages in desired locations for decades. And lastly, in Seattle’s case, the “plenty of zoning capacity” myth implodes not only because the estimate itself is so flawed, but also because the city’s population is growing twice as fast as planners projected.

But as I noted at the outset, zoned capacity is a sideshow, a technical curiosity. Invoking zoned capacity as an argument against more homes for people in Seattle or any other growing city reveals nothing so much as the misunderstanding of those making the argument.

Full article here.


13 Sep 23:25

You Will Know Me

Megan Abbott’s Dare Me is a mystery about frightening young female athletes who have no fear of adults, and don’t even think much about them. You Will Know Me turns this around: it’s a mystery about the parents of acrobats – especially of one girl who, in a few years, may have a shot at the Olympics. All the parents love their kids fiercely, and all work hard not to neglect their other children despite the long hours and financial requirements that the sport demands. Though the parents become focused on and consumed by the demands of the sport, they cannot fathom the concentration, the dedication, the passion required to perform these acrobatics. They love the kids, but they cannot imagine who their children have become.

13 Sep 23:24

Doonesbury Truthers

Saturday morning, I saw a clever Twitter posting cross my “politics” timeline. I tweeted it to my followers, so they’d see the image.

Actual Doonesbury cartoon from 1999.

It seems to have been the right thing at the right time, as it’s been liked and retweeted thousands and thousands of times.

What really surprised me, though, was the brigade of right-wing Doonesbury Truthers who descended on me to prove that this was obviously a forgery, concocted by evil liberals. The wrong panel is signed! The colors are wrong! The web site’s been hacked! All the web sites have been hacked! Before I blocked them, I was having to explain to these folks (some of whom, to be fair, probably live outside Moscow in their mother’s basement and so don’t remember reading the comics in 1999) that millions and millions of people know these aren’t forgeries because they remember reading newspapers.

I pointed to the Amazon listing for the collected Doonesbury; they didn’t believe it. I argued that they could go down to their public library and read the newspaper themselves, forgetting that these people were likely living somewhere in Eastern Europe where American newspaper archives are a little bit less accessible than in, say, Peoria.

Still, my guess is that they weren’t all Russian, which means we have a lot of wingnuts who are so eager to find the next Dan Rather that they won’t even believe that stuff they can see in their bookstore actually exists.

13 Sep 23:24

Ohrn Image — Public Art, Public Safety

by Ken Ohrn

Ocean Cement plant on Granville Island — with murals, including truck.

Speaking of cement trucks and cement companies — here’s a great example of a responsible corporate citizen helping keep people safe through education. Way to go, Lafarge.

Vancouver, British Columbia-June 7, 2016 – More than 11,500 bike riders who participated in the HUB’s Bike to Work week May 30-June 5th in Metro Vancouver benefited from Lafarge Canada Inc.’s event sponsorship and safety education.

Lafarge operates over 100 ready-mix concrete trucks which travel throughout Vancouver delivering to construction sites. “We are highly conscious of the risk introduced when these large trucks travel in close proximity to cyclists. Our drivers receive specific training to equip them with cautious driving behavior on urban streets. Bike to Work week was an ideal opportunity for us to reach out to cyclists and build their awareness of our trucks and their blind spots,” commented David Beaupré, VP, Ready-Mix for the Greater Vancouver Area.

At three of the Celebration Stations during the event, Lafarge had a concrete mixer truck with cones highlighting the large blind spots cyclists should avoid when riding.  In addition, Lafarge volunteers shared other useful tips such as the importance of planning routes, use of hand signals, and riding in a straight line.

“We were delighted to partner with Lafarge during Bike to Work Week to help spread important safety messaging,” said Laura Jane, the Director of Corporate Engagement and Events at HUB Cycling. “We also had record-breaking Bike to Work Week participation this spring and couldn’t have done it without the support of businesses like Lafarge.”


Lafarge’s support for Bike to Work Week included this team in HUB’s spring 2016 “Share the Road Challenge”, a travel contest between different modes.


13 Sep 23:24

The View from Here

by pricetags

Here’s what a friend sees from his window in the Olympic Village: a perfect view of Habitat Island and the Seawall, with a bit of Hinge Park.

Just the sort of thing to show off to delegates at the Pro Walk, Pro Bike, Pro Place conference here in Vancouver this week.

By Monday, the early fall light was even better for photographing the city.  So if you are taking shots to illustrate what this city has to offer, share them with your friends at Price Tags.


13 Sep 23:24

Infinite Scroll

by Chelsea G. Summers

“Don’t you love the Oxford Dictionary?” David Bowie asked in a BowieNet Live Chat on April 27, 1999. “When I first read it, I thought it was a really, really long poem about everything.”

Bowie’s first dictionary was likely much like my first dictionary — an eight-inch-thick tome. It was a big, heavy bitch with thin-skinned pages that stuck to your fingers, covered with a stew of spider-fine print. I’ve grown up, and the dictionary has shrunk. The big, heavy book first became a couple of featherweight CD-ROMs; then the CD-ROMs thinned into the weightless internet. This disappearing act is a paradoxical one, for as the dictionary has journeyed from analog to electronic to digital, it both holds more and is easier to use.

Let me put it this way: When the Oxford English Dictionary began its transition to CD-ROMs in 1984, it held about 400,000 definitions comprising about 6 million words in 13 volumes. By the time the OED, arguably the gold standard of dictionaries since its first publication in 1888, moved online in 1998, its second edition, released in 1989, had expanded to 600,000 definitions and 20 volumes. The third edition will be digital only — no print at all — and no one knows how many definitions or words it will be. What people do know is that it’ll be late. Originally to be released in 2005, the third edition of the OED is now slated for 2037.


It took the first modern English dictionary writer, Samuel Johnson, just under a decade to write his two-volume Dictionary of the English Language. Begun in 1746 at the behest of London booksellers, the work was one Johnson thought he could finish in three years, but it wasn’t published until 1755. While there were about 150 English dictionaries before Johnson’s, these weren’t “modern” dictionaries; primarily lists, these works were arranged by subject, not alphabetical order. In its 42,733 definitions, Johnson’s dictionary set the template for the prescriptive dictionary, one that tells you what a word means and shows you how writers have used it over time. (The other major kind of dictionary is descriptive, which mostly forgoes examples of usage and provides a definition.)

Before becoming a lexicographer, Johnson had made his literary mark as an essayist, a translator and a poet. Johnson limned his dictionary with precise concerns about codifying English spelling and pronunciation, and he buttressed his definitions with usages from English authors, leaning heavily on Shakespeare, Milton and Dryden. Many literary scholars claim Johnson’s dictionary as one of their own, but as much as it’s a literary work, Johnson’s dictionary is a political one. Eighteenth-century French lexicographers were outstripping their cousins across the Channel in dictionary quality and quantity; Johnson felt the pinch to put English words on an equal footing.

In his dictionary’s preface, Johnson calls the lexicographer “the slave of science, the pioneer of literature, doomed only to remove rubbish and clear obstructions from the paths of Learning and Genius.” This is bullshit at best and disingenuous at worst. Johnson’s dictionary wasn’t just defining English against the French; it was also defining English for the British. By the mid-18th century, the British Empire included colonies in Canada, America, and the West Indies; trading routes across India; naval bases in the Mediterranean; and, of course, a robust slave trade from Africa into the Americas. Johnson’s uniting English speakers through common spelling and pronunciation created a common consciousness in a singular sweep of authority.

The limits of your language are the limits of your world

So successful was Johnson that his dictionary stood unchallenged for more than a century. In 1857, the Philological Society of London surveyed the lexicographical landscape, pronounced it woeful and embarked on the solution that would lead to the Oxford English Dictionary. It took 75 years, three editors (give or take — it’s a little hard to count), about 2,000 unpaid contributors, and 13 volumes before the complete first edition of the OED hit the shelves with a resounding thud in 1933. Using Johnson’s prescriptive dictionary as a template, the OED set itself up as the “definitive record of the English language,” as it calls itself in its hysterically tautological tagline. Tracing usage of English words through British time and British space, the first edition of the OED drew heavily from Shakespeare, but it also included quotes from Chaucer, George Eliot, anonymous Medieval works in English and many, many others.

Mindfully presenting itself as the final authority in English, the OED shared Johnson’s insincerity about a lexicographer’s task — so too did Noah Webster in his 1828 American Dictionary of the English Language, the first modern dictionary of American English. All of the writers of these dictionaries have one major aim, and it’s not to tell you the difference between “appraise” and “apprise.” It’s to codify language, and — make no mistake — this is a nationalistic enterprise. The limits of your language are the limits of your world, to paraphrase Ludwig Wittgenstein, and these three pillars of lexicography understood that to create a nationalist identity you had to make sure your people were quite literally speaking the same language.


As Johnson’s Dictionary, the strong voice of a single man, gave way to the OED, a group effort overseen by an editor, another change took place: the dictionary’s authority became obfuscated. It’s not that the OED was without authority — backed by the Oxford Press, the OED was all about authority — it’s that the group hid the power center. An iconic print reference book, the OED’s weight has long been both literal and figurative, but by divesting itself of a strong central, named voice, the OED effectively hid the men behind the curtain. And I do mean “men”; for decades, dictionaries were entirely the work of men who reflexively used sexist word choices (racism, unquestionably present, seems yet to be documented or questioned).

Traditional reference books rely on writing that’s vetted and edited with a strong philosophical aim; however, when you move from capital-A Authority that is a person’s name to small-a authority that is the anonymity of a group, you open the door to democratization, and that is precisely what the web has capitalized upon. Sites that crowd-source information — whether Wikipedia, Genius, Wiktionary, Rotten Tomatoes or Quora — rip out the center of traditional authority, replacing a singular vision with multiplicity, but that groundwork was laid with the OED, a work that was too large for a single human to make. To give you some perspective, three editors died in the making of the first edition of OED, and that was before it ever went to the printers.

Oxford’s Word of the Year was the “Face with Tears of Joy” emoji. Lexicographers are gaining on the programmers

It’s no shock that a dictionary that took more than 70 years to publish its first edition would be fashionably late in getting on the web. The OED got online in 1999, but Dictionary.com, the first online dictionary, appeared in 1995. All these years later, Dictionary.com, a site fueled by the Random House Dictionary and supported by content from other dictionaries, remains the world leader, boasting a global Alexa ranking of 590 and standing as the top dictionary site in the U.S. with a ranking of 307. Just below sit Urban Dictionary and Merriam-Webster online, which appeared online about the same time as the OED. All of these sites are thriving, regardless of whether they’re free, like Dictionary.com or Urban Dictionary, or subscription, like the OED (which also has a descriptive free site, Oxford Dictionaries).

So comfy are dictionaries on the web that they pwn writers on Twitter. “No one cares how you feel,” Merriam-Webster tweeted at Slate editor Gabriel Roth, who had chastised the dictionary for acting like the “chill parent who lets your friends come over and get high.” Social media may be giving dictionaries like Merriam-Webster a social conscience — it’s the first dictionary to define “they” as a third-person singular pronoun and to add “genderqueer” to its lexicon — but dictionaries have long been in the business of not caring how you feel. Roth seemed surprised that a dictionary “can indeed act like a dick in public,” but dictionaries have always been written by people, and people are dicks.

Social media spats aside, Urban Dictionary is the sign of the current, crowd-sourced times. It’s so far afield of Johnson’s Dictionary that when you visit, you can almost hear Samuel Johnson wailing from his grave. Created by Aaron Peckham as a parody response to Dictionary.com in 1999, Urban Dictionary crowd-sources slang definitions from its users (Wiktionary uses the same method for a fuller range of words). There is no editor. There is no authority but the crowd, who up-votes or down-votes definitions, making them more or less visible. For example, there are 558 Urban Dictionary definitions for the term “fuck boy,” and the one thing that they agree on is that no one can agree on what “fuck boy” means.

The first definition the Urban Dictionary offers for “fuck boy” is a “person who is a weak ass pussy that ain’t bout shit,” and while this definition may or may not fit “fuck boy” (I’m inclined to down-vote), it certainly fits “scrub.” TLC begins their 1999 anti-anthem “No Scrubs” with a fairly perfect descriptive definition: “A scrub is a guy that think he’s fly and is/Also known as a bust-down/Always talkin’ about what he wants/And just sits on his broke ass.” Here’s a fun fact: Green’s Online Dictionary dates “scrub” to 1698, and the OED dates it to 1598. While disagreeing on scrub’s inaugural voyage into print, the dictionaries converge on its meaning. Green’s defines “scrub” as a “lout, a failure” and the OED as a “mean insignificant fellow.” TLC would probably agree; neither Green nor the OED quotes them.

David Bowie said we can read the dictionary as a really, really long poem, but now that poem is bigger, and wider, and lengthier

What Urban Dictionary’s crowd-sourced information gains in jouissance and scope, it lacks in factual accuracy — without editors doing the gritty work of fact-checking, you have to take crowdsourcing dictionaries with liberal side-eye (which Urban Dictionary tells us is “A facial expression expressing one’s criticism, disapproval, animosity, or scorn,” or “To give someone a knowing look,” or “When a member of the opposite sex looks from the corner of their [sic] at you”). The irony is that Urban Dictionary isn’t doing anything new in lexicography. Famously, James Murray, the OED’s first editor in the late 19th century, sent out calls for amateur philologists to send quotations for word usages, essentially crowd-sourcing his sources — and ended up with William Chester Minor, an American army surgeon committed to Broadmoor Criminal Lunatic Asylum for murder.

“Politicians, ugly buildings, and whores all get respectable if they last long enough,” Noah Cross says in the 1974 movie Chinatown. You can add websites like Urban Dictionary to this list. Urban Dictionary, for all its democratic chaos, has been embraced as an authority, appearing as evidence in multiple trials across the U.S. Seventeen years after it first appeared on the web, Urban Dictionary has earned its place in the world, even with those of us who take issue with it.


There’s this two-episode arc in season four of the new Doctor Who about a dead planet called simply “the Library.” The greatest collection of books and information in the universe, the Library is an entire planet housing everything ever written. It is entirely unpeopled, but for a computer interface in the form of slick, human-shaped, CGI-faced information kiosks. In its visuals of echoing spaces and menacing shadows, the two Doctor Who episodes capture the wonder and the faintly rotting scent of near-endless stacks of books, as well as the slick chill of unpeopled information. There’s this creeping feeling that the Library will crumble under its own weight, topple and crash into a great blue void.

The internet Dictionary is not yet the Library, but it could be. As dictionaries grow larger, so too grows the question of whether they will some day get so big as to become unmanageable. Programmers must stay one step ahead of lexicographers in perpetuity, staving off that collapse with the ticka-ticka-tick of their fingers. I imagine an interminable breathy race of introverts, arcane languages, and heated keyboards. The OED’s Word of the Year for 2015 was the “Face with Tears of Joy” emoji, suggesting that lexicographers are gaining on the programmers, encroaching on computer speak, accepting it as their own. How limitless the language, how exhausting the work of racing to contain it.

Dictionary writing is an endless task, and the web makes real the possibility that dictionaries themselves will be infinite. The expanding universe has nothing on the big, endless bang of lexicographers — lexicographers like those anonymous folks at the OED, gathering quotes from “Henry Miller, D. H. Lawrence and Mobb Deep,” as Virginia Heffernan notes in her saudade-laden look at the print OED’s end — or lexicographers like you and me, adding our own gloss to the words that inflame our imaginations. Bowie was not wrong that you can read the dictionary as a really, really long poem, but now that poem is bigger, and wider, and lengthier. That poem can be written by you, by me, by anyone with an internet connection, a love for words, and the ego to care. Everything awaits, right between “everyplace” and “everyway.”

13 Sep 23:22

This Emotion Detector Wants to Cheer Up Your Web Browsing Experience

by DJ Pangburn for The Creators Project

Image courtesy the artist

When news broke two years ago that Facebook had been manipulating moods and emotions with a news feed algorithm, people were understandably concerned. But this “emotional contagion” experiment ended up inspiring media artist Alexander Taylor to create Blissify, a “satirical pseudo-startup” exploring filter bubbles, “solutionism,” and the quantified self.

After running a quick calibration to get to know a user’s “neutral browsing expression” (NBE), Blissify hijacks the user’s webcam to monitor their facial expressions in real-time as they browse the internet. When rapid increases in negative emotions are detected, Blissify hides the page, then adds it to the user’s personal block list.

“[Facebook’s experiment] got me thinking about how little I actually take notice of the link between my mood and the content I'm passively consuming, and the extent to which this is potentially already being exploited,” Taylor tells The Creators Project. “This linked to a past research project I did into the marketing material provided by startups specializing in 'big data,' some of which directly touched on the notion of monetizing the emotional states of your customers.”

Beyond exploring filter bubbles, Taylor says that Blissify is meant to be a bit of a comment on echo chambers. In fact, he likes to call the app an “echo chamber accelerator.”

“But even within our own chosen corners of the internet it's common to come across a specific type of content designed to draw strong negative reactions from the viewer and generate activity through those means,” he says. “Blissify is meant to be a comment on this, too.”

Taylor wrote the backend entirely in JavaScript. He explains that the most important component of Blissify is clmtrackr, an open-source JavaScript library used for emotion detection, especially expressions signalling anger and disgust. “I found 'sadness' itself was triggered much less frequently than these two during testing,” Taylor notes.

There weren’t any direct app or extension influences on Blissify, but Taylor says he did spend a lot of time plumbing the depths of the Chrome web store’s Productivity category to get an idea of “how to present it as a 'realistic' self-help app.”

Click here to check out more of Alexander Taylor’s work.

Related:

This Dapper Robot Is an Art Critic

In ‘AMYGDALA,’ Twitter Emotions Bathe Viewers in Sounds and Light

Public Art Installation Uses Twitter Analysis To Create A Minneapolis Mood Ring

13 Sep 23:22

[Premiere] Mars Comes to Miami as an Immersive Interplanetary Experience

by Monica Uszerowicz for The Creators Project

Courtesy of FUSION

Imagine suiting up like an astronaut, your arms and legs enclosed in soft warmth and your head in a helmet. You’re suddenly in a ship headed for Mars, blasting through space and landing on its burnt sienna surface. Sound doesn’t travel the same way here—your steps along the rough, rocky surface are muted—but your journey is soundtracked by beautiful, symphonic crescendos, traveling with you. You can see and touch peaks, valleys, ice caps. If you get tired of exploring on foot, step into your rover, observing the landscape through the windows. Your body might not have gone to space, but for now, your senses indicate that you’re on the first expedition to the Red Planet.

Courtesy of FUSION

From October 7th through the 9th, Miami’s III Points Music, Art, & Technology Festival will premiere Mars 2030, an immersive virtual reality production designed by FUSION. Participants will slip on a spacesuit and helmet and effectively fly to Mars, where they can explore a topographically accurate landscape. Though the experience feels magical (you’re on Mars!), it’s functionally realistic, both a simulation and an adventure. Mars 2030 is filled with details from real missions to Mars, thanks to FUSION’s Space Act Partnership with NASA. This partnership allowed FUSION to explore NASA’s research labs (including their Hybrid Reality Lab) and create an ultimately symbiotic project: as project designer and FUSION virtual reality developer Julian Reyes explains, “We gave NASA all of our information so they could convert it and use it as a training simulation for their astronauts.”

Courtesy of FUSION

Sound is one of Mars 2030’s most important components. Eight-time Grammy Award winner Julio Reyes has created Hope, an original score performed by the London Symphony Orchestra. The sounds of Mars 2030 will be overlaid upon these soundscapes. “All the audio and music is ambisonic and overlaid on top,” says Julian. “You’re walking on Mars, but you feel as if you’re at the center of the orchestra.” The music is gaze-dependent, as interactive as the rest of the experience: “It depends on where you’re looking. In one scene, if you’re looking at your other crew members, we’ve created a piece that aligns with the feeling of camaraderie, of being part of four people about to spend a year on Mars. But if you look outside, the music conveys a sense of fear and excitement about whether or not you’ll make it.”

Mars 2030 at iii Points from III Points on Vimeo.

Spoiler alert: you make it. In Planets, Stars, and Space, a 1962 science book by Joseph Miles Chamberlain and Thomas D. Nicholson, the section on Mars reads: “Since the dawn of civilization, people on earth have looked into the night sky and have speculated about the strange reddish-hued world that occasionally reaches such brightness that it outshines all the stars.” In Gustav Holst’s suite, The Planets, the Mars movement is majestic, like the soundtrack to a victorious battle. Mars 2030 indulges that sense of fantastical, adventurous speculation, and does so educationally, too. “We’re aiming to be very informative,” says Julian. “It partially comes down to consolidating all of NASA’s research into an experience that’s fun.”

Courtesy of FUSION

“We are thrilled to showcase FUSION’s Mars 2030 at III Points, enabling people to interact with this innovative experience at a festival that has been breaking boundaries since its inception,” says Daniel Eilemberg, Co-President and Chief Content Officer for FUSION. “We have been eager to find a way to work with the III Points team since our launch nearly three years ago, and what better way to collaborate than transporting audiences to another planet?”

III Points Festival is from October 7th-9th in Miami, FL. Go to iiipoints.com for full details on the festival, including its music lineup, and to purchase a ticket. Follow III Points on social media @iiipoints.

Related:

Field Trip to Mars in a Real-Life Magic School Bus

How to Fly to Mars on a DIY Spaceship

Fly into Space on This Virtual Reality Swing Set

13 Sep 23:22

Here's Why Blockbuster Superhero Movies All Sound the Same

by Beckett Mufson for The Creators Project

Screencap via

The beginning of prolific film essayist Tony Zhou's latest video, The Marvel Symphonic Universe, feels like a roast. After asking strangers on the street to recite iconic film scores like those from Star Wars and James Bond, Zhou asks about the music from Marvel films—and gets crickets. But in classic Every Frame a Painting style, he expands on this idea to educate his viewers about how the cinema sausage is made.

In this episode, the true stars of the show are temporary scores, a necessary evil of big-budget filmmaking that legendary composer Danny Elfman calls "the bane of my existence." Zhou sheds light on how filmmakers often rely on a temporary soundtrack to build the tempo and rhythm of a cut, then ask the composer to simply imitate those cues. When they use scores from other iconic films, the result is that many blockbuster movies, including most of the Marvel Universe, wind up sounding the same.

Zhou points out other elements that lead to Marvel's forgettable sounds, such as the use of narration and the tendency to use music that says the same thing as the visuals. These are not the result of bad filmmaking, he says, but merely safe filmmaking. That safety is part of what has made Marvel the highest-grossing franchise in film history, but it also leads to the blank stares in the opening minutes of the video below. Check it out for some truly ear-opening examples of how and why so many big movies sound alike, as well as an inspired rendition of the one Marvel song one of Zhou's interviewees could recite by heart.

See more from the Every Frame a Painting film essay series here.

Related:

The Challenge of Explaining How to Edit Movies

Here's the Secret to the Coen Brothers' Masterful Dialogue

Why Buster Keaton's the OG of Physical Comedy

13 Sep 23:18

Recommended on Medium: What Drives Men to Rape?

I have thought about this a lot.

Continue reading on Medium »

13 Sep 23:17

Making Music and Embedding Sounds in Jupyter Notebooks

by Tony Hirst

It’s looking as if the new level 1 courses won’t be making use of Jupyter notebooks (unless I can find a way of sneaking them in via the single unit I’ve put together!;-) but I still think they’re worth spending time exploring for course material production as well as presentation.

So to this end, as I read through the materials being drafted by others for the course, I’ll be looking for opportunities to do the quickest of quick demos, flagging things that might be worth exploring more in future.

So here’s a quick example. One of the nice design features of TM112, the second of the two new first level courses, is that it incorporates some mini-project activities for students to work on across the course. One of the project themes relates to music, so I wondered what doing something musical in a Jupyter notebook might look like.

The first thing I tried was taking the outlines of one of the activities – generating an audio file using python and MIDI – to see how the embedding might work in a notebook context, without the faff of having to generate an audio file from python and then find a means of playing it:

[screenshot: midimusic]

Yep – that seems to work… Poking around music related libraries, it seems we can also generate musical notation…

[screenshot: midimusic2]

In fact, we can also generate musical notation from a MIDI file too…

[screenshot: midimusic3]

(I assume the mappings are correct…)

So there may be opportunities there for creating simple audio files, along with the corresponding score, within the notebooks. Then any changes required to the audio file, as well as the score, can be effected in tandem.
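The post doesn’t say which libraries sit behind these screenshots, but one way to get both an embedded MIDI player and rendered notation inside a notebook is music21 — offered here purely as a guess at the approach, not the notebook’s actual code (rendering a score with show() additionally needs MuseScore or Lilypond installed):

```python
# A sketch using music21 (an assumption -- the original notebooks may use
# different libraries). show('midi') embeds a player in the notebook; show()
# renders notation and requires MuseScore or Lilypond to be configured.
from music21 import converter, note, stream

s = stream.Stream()
for pitch in ["C4", "E4", "G4", "C5"]:
    s.append(note.Note(pitch, quarterLength=0.5))

s.show('midi')                  # playable MIDI widget in the notebook
s.show()                        # the corresponding score
s.write('midi', fp='demo.mid')  # save the stream as a MIDI file

# Notation can also be generated from an existing MIDI file:
score = converter.parse('demo.mid')
score.show()
```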

I also had a quick go at generating audio files “from scratch” and then embedding the playable audio file:

[screenshot: audio]

That seems to work too…
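For reference, a minimal “from scratch” example along these lines — a sketch of the approach rather than the notebook’s actual code — only needs numpy and IPython’s Audio display object:

```python
import numpy as np
from IPython.display import Audio

rate = 44100                                # samples per second
t = np.linspace(0, 2, 2 * rate, endpoint=False)
tone = 0.5 * np.sin(2 * np.pi * 440 * t)    # two seconds of A440

# Displaying the Audio object embeds a playable widget in the notebook
Audio(tone, rate=rate)
```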

We can also plot the waveform:

[screenshot: audio2]

This might be handy for a physics or electronics course?
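Plotting the waveform of the same signal is then a couple of lines of matplotlib (again a sketch, not the original notebook code):

```python
import numpy as np
import matplotlib.pyplot as plt

rate = 44100
t = np.linspace(0, 2, 2 * rate, endpoint=False)
tone = 0.5 * np.sin(2 * np.pi * 440 * t)

plt.plot(t[:500], tone[:500])   # zoom in on the first few hundred samples
plt.xlabel("Time (s)")
plt.ylabel("Amplitude")
plt.show()
```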

As well as providing an environment for creating “media-ful” teaching resources, the code could also provide the basis of interactive student explorations. I don’t have a demo of any widget powered examples to hand in a musical context (maybe later!), but for now, if you do want to play with the notebooks that generated the above, you can do so on mybinder – http://mybinder.org/repo/psychemedia/ou-tm11n – in the midiMusic.ipynb and Audio.ipynb notebooks. The original notebooks are here: https://github.com/psychemedia/OU-TM11N


13 Sep 21:47

How to filter inappropriate comments in Instagram

by Ian Hardy

Kevin Systrom, CEO and co-founder of Instagram, stated in a blog post that one of the reasons for creating the photo and video app was to build a “community where people could share their lives.” The company is now owned by Facebook and has over 500 million users.

With a community of this size, many people sharing their life on Instagram are often met with hurtful, unwanted comments.

In an effort to “promote a culture where everyone feels safe to be themselves without criticism or harassment,” Instagram rolled out a keyword moderation tool that enables all users to filter words they consider offensive or inappropriate. Comments containing these words are then hidden from posts.
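As an illustration of the general idea only — this is not Instagram’s implementation, and the word list and matching rule below are made up — a keyword filter reduces to a simple membership test:

```python
# Toy sketch of keyword-based comment hiding; purely illustrative.
def should_hide(comment: str, blocked_keywords: set) -> bool:
    words = (word.strip(".,!?").lower() for word in comment.split())
    return any(word in blocked_keywords for word in words)

blocked = {"ugly", "stupid"}
print(should_hide("That photo is stupid!", blocked))  # True  -> comment hidden
print(should_hide("Lovely shot", blocked))            # False -> comment shown
```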

This tool is viewed by many as a positive step forward and will operate alongside the ability to report inappropriate comments and block specific user accounts.

“We know tools aren’t the only solution for this complex problem, but together, we can work towards keeping Instagram a safe place for self-expression. My commitment to you is that we will keep building features that safeguard the community and maintain what makes Instagram a positive and creative place for everyone,” said Systrom.

[screenshot: instagram-how-to]

If you’re interested in using the new keyword moderation tool, here’s the step-by-step process:

  1. Tap on the settings icon
  2. Scroll down to the Comments section, which is below Cellular Data Use
  3. Tap Comments
  4. Turn on option to Hide Inappropriate Comments
  5. Add in your custom keywords
  6. Tap Done.
13 Sep 21:47

Note 7 Software update to limit phone’s battery life to 60 percent in South Korea

by Patrick O'Rourke

Starting September 30th, Samsung is releasing a software update that will limit the current Note 7’s maximum battery charge to just 60 percent, according to a report from the Associated Press.

This means that all non-recalled Note 7s will only be able to charge to 60 percent capacity, a move the company likely hopes will encourage holdouts to return their phones and prevent the devices from catching fire.

The AP’s report cites a Samsung notice posted in the Korean publication Seoul Shinmun. The update currently appears to be confined to South Korea, though it will likely make its way to other regions soon, given that most Note 7-related issues have occurred in the United States. The latest numbers indicate that over 70 faulty phones have been reported.

In Canada specifically, Transport Canada has issued a warning telling Canadians not to take the Note 7 on flights. Health Canada is also working with Samsung to ensure the company’s recall goes as smoothly as possible, and noted in a joint release with Samsung that only one case of overheating has been reported to them so far.

Samsung has sold approximately 2.5 million Note 7s since the phone’s launch last month. Replacement Note 7s will be marked with a blue S on their box.

Related: Samsung Canada is working with Health Canada to complete Note 7 recall

13 Sep 21:47

Apple launches Swift Playgrounds on the App Store to make coding easy for everyone

by Jessica Vomiero

Alongside the release of iOS 10, Apple’s recently announced educational coding app Swift Playgrounds is now available in the App Store.

Apple first introduced Swift Playgrounds during the company’s WWDC event this past June. The development of the app was discussed again during the launch event for the iPhone 7 and iPhone 7 Plus, with Apple claiming it would be available for download soon.

Swift Playgrounds is intended for students and young people who want to learn how to code. The app brings coding concepts to life by demonstrating the cause and effect of certain symbols in a window on the side of the screen. The app includes Apple-developed programming lessons, challenges and puzzles to guide three characters through an animated world.

According to a statement sent to MobileSyrup, Swift Playgrounds employs the same programming language used by developers to create apps for Apple’s various platforms.

The company has described it as simple enough for students with no coding experience to understand, but at the same time, powerful enough for experienced programmers to get creative with Apple’s platform.

“Everyone should have the opportunity to learn coding, and we are excited to bring Swift Playgrounds to the next generation of programmers looking for a fun and easy way to explore key coding concepts using real code,” said Craig Federighi, Apple’s senior vice president of software engineering, in a statement.

“More than 100 schools around the world have already committed to adding Swift Playgrounds to their fall curriculum, and we can’t wait to see what students create with it,” he continued.

Swift Playgrounds comes with several features, including built-in templates that users can modify for their own use and the ability to share creations with friends via Mail or Messages, or by posting them to the web.

Furthermore, Swift Playgrounds uses Swift code, meaning that finished products can be exported to Xcode to create fully functional iOS and macOS apps.

Swift Playgrounds is now available on the App Store for free and is compatible with iPad Air and iPad Pro models, as well as iPad mini 2 and later running iOS 10.

To download Swift Playgrounds from the App Store, click here.

Related: Apple Swift Playgrounds has the potential to teach anyone to code

13 Sep 21:47

iOS 10 is now available to download, features dedicated iMessage store and deeper Siri integration

by Rose Behar

Apple’s latest operating system iteration, iOS 10, is now available for download, bringing with it a dedicated iMessages app store and Siri integration for third-party apps.

The iMessages revamp is perhaps one of the most compelling aspects of iOS 10, and certainly one of the most reported on. In addition to a dedicated app store that includes sticker packs, games and third-party apps like Yelp, it also includes animations, an in-app camera feature and the ability to send doodles.

In the context of Apple ever so slightly opening the gates of its walled garden, Siri integration is also a large step forward. It means that voice commands can be used to control third-party apps like Uber directly.

Another significant change is the removal of ‘slide to unlock’ in favour of ‘press to unlock.’ Sliding to the right, however, does bring you to the new ‘Today View’ screen, which has been completely revitalized with widgets that can easily be shuffled and removed. Meanwhile, sliding to the left opens the camera.

Those are some of the most notable changes in iOS 10, but users who download the new version will find many other tweaks and changes to explore, including a moderate UI redesign.

The new OS version is available to all users with the iPhone 5 or higher, as well as the iPad Mini 2, iPad 4, iPad Air and later generations.

Related: 10 of the biggest changes coming in iOS 10

13 Sep 20:43

Kluckner has a Tale from the West End – Sep 20

by pricetags

Tales From the West End

“Tales From the West End” is an evening to explore and experience our community through stories about our common past.

This month writer, artist and tour guide Michael Kluckner is back as our featured storyteller. Michael has more tales to tell from his research on West End resident Mrs. Henshaw.

 

JJBean Coffee Shop, 1209 Bidwell St., (Bidwell & Davie)

Tuesday, September 20

5:45-7:30, storytelling from 6:00-7:00

Admission: Free, Complimentary coffee and tea thanks to JJBean

 

 


13 Sep 20:36

Sony Made the Right Decision

You can argue all you’d like about whether mirrorless or DSLR is the right choice, whether Sony’s adding full frame to the mirrorless lineup was the right choice, and all the other camera-related things that have happened since Sony took over the KonicaMinolta camera group a decade ago. …

13 Sep 20:36

iOS 10 updates are reportedly sending iPhones into recovery mode [Update, Apple says it’s fixed]

by Jessica Vomiero

Recent reports indicate that the iOS 10 update is causing some users’ phones to enter recovery mode.

Several people have tweeted that upon trying to install iOS 10, their phones have either entered recovery mode or frozen. While many users found success when updating via iTunes, several others noted that their devices were ‘bricked’ when attempting to install iOS 10.

While the details around the situation are currently slim, users are flooding Twitter with reports of disabled and frozen devices.

If your device has entered this mode and you cannot restore it through normal methods, there is another recovery option to try: DFU mode. It should be your last resort because it will wipe all the data from your device. To perform a DFU mode restore, plug your iPhone into a computer that has iTunes installed, then hold down the power button on your iPhone until the device has powered off.

After your phone is powered off, hold the home button and power button down for 10 seconds, then release the power button while continuing to hold the home button. You should get a popup on your computer saying your phone is in restore mode; click Update. This will take you through the installation process of iOS 10. It is a clean install, meaning it removes everything previously on your phone.

Update: Apple has responded to the issue of iOS 10 bricking devices by stating it has been resolved and users can go forth and download.

“We experienced a brief issue with the software update process, affecting a small number of users during the first hour of availability. The problem was quickly resolved and we apologize to those customers. Anyone who was affected should connect to iTunes to complete the update or contact AppleCare for help.”

Zachary Gilbert also contributed to this post.

Related reading: 10 of the biggest changes in iOS 10

13 Sep 20:36

iOS 10 Refines the CarPlay Experience

by John Voorhees

CarPlay is a window into iOS – an alternate UI for your iPhone designed to limit distractions as you drive. As such, most of the changes to CarPlay are simply a reflection of iOS 10. Nonetheless, iOS 10 brings a handful of refinements that are unique to the CarPlay interface along with iOS 10 compatibility.

Removing Spotify from CarPlay.

Chief among those refinements to CarPlay is the ability to rearrange and even remove certain apps from your CarPlay display. The settings to do so are under General > CarPlay on your iPhone. Picking your car from that setting shows a virtual version of your CarPlay display. Third-party apps and Apple’s Podcasts and Audiobooks apps can be removed from CarPlay by tapping the minus badge in the corner of an app's icon. Once removed, the app sits below the virtual CarPlay screen on your phone and can be re-added any time by tapping it. To rearrange the apps on your CarPlay screen, long press and drag them to the location you want.

SiriKit expands messaging on CarPlay beyond Apple's own Messages app.

iOS 10 opens Siri to developers of certain types of apps. Among those are VoIP and messaging apps. That's great news because one of the first things I wished for after having Siri read me my texts in the car was the ability to access messages from other services too. Soon you should be able to message your friends and family with apps like WhatsApp and Slack and make calls from VoIP apps like Skype. Unfortunately, no apps that take advantage of Siri via CarPlay have been released yet, so I have not been able to test how well they work.

Mapping received an overhaul for iOS 10 that is carried over to CarPlay. If your car supports it, Apple has added the ability to display turn-by-turn directions in the center instrument cluster on your car so you can follow directions without turning your head. Traffic awareness is also built in. When I drove into Chicago recently, Siri warned me that there was a traffic accident ahead and then reassured me that I was still on the fastest route to my destination.1 If a better alternate route is available though, Siri will reroute you.

Turn by turn directions in the instrument panel.

CarPlay remembers where your car is parked. This feature is also available if your iPhone is connected to your car via Bluetooth, but with CarPlay, a Bluetooth connection is unnecessary. After you park, a notification pops up on your lock screen showing you where your car is. Tap it and a map opens showing you the location.

iOS 10 and CarPlay My Library views.

The Music app in CarPlay benefits from the same organizational and interface changes that have been made on iOS devices. I especially like that changes I make on my iPhone to the order of the categories in My Library are reflected in CarPlay. That way, my most frequently used categories are always within easy reach along with Recently Added, which is always at the top of the list.

CarPlay is inherently constrained by the fact that it has to work within the confines of automobile manufacturers’ hardware. The refinements to the CarPlay interface and addition of key iOS 10 features are useful and welcome. Despite its limitations, I love that CarPlay updates automatically as my iPhone does and doesn't require updates to my car's system software. That said, I feel like there is still a lot that could be done to expand CarPlay. Perhaps with the gradual opening of Siri, more third-party developers will be able to bring their apps to the car.


  1. Important to know when you’ve got a car full of hungry people looking forward to a little Gus’s World Famous Fried Chicken↩︎

Like MacStories? Become a Member.

Club MacStories offers exclusive access to extra MacStories content, delivered every week; it’s also a way to support us directly.

Club MacStories will help you discover the best apps for your devices and get the most out of your iPhone, iPad, and Mac. Plus, it’s made in Italy.

Join Now
13 Sep 20:36

tvOS Adds Dark Mode, Siri Enhancements, and More

by John Voorhees

The fourth generation Apple TV was introduced almost exactly one year ago. Since then, Apple has continued to incrementally improve tvOS, the iOS-based operating system that controls the Apple TV. With the release of tvOS 10 today, Apple has extended the capabilities of features like Siri, added functionality to apps like Photos and Music that bring those apps in line with iOS 10, and improved the user experience with new features like a dark mode and single sign-on. Together, these features make for a more polished tvOS experience that should enhance customers’ enjoyment of the Apple TV without introducing radical changes to how the device works.

Dark Mode

Dark mode looks great.

If you’ve dimmed the lights to watch a movie on your Apple TV, navigating through its bright interface can be a little jarring. tvOS 10 adds a dark mode that is accessible as a toggle under Settings > General > Appearance.

Dark mode is applied to both the interface surrounding Apple TV apps and apps themselves to the extent they use standard user interface elements. Apps with custom elements, like many games, will require a little more work by developers to implement dark mode. Having used dark mode during the tvOS beta period, I can’t see going back to the lighter interface.

Apps that use standard controls benefit from dark mode too.

A couple screenshots of the Apple TV's new dark mode don't do it justice, so I've included a gallery of additional screenshots highlighting dark mode throughout tvOS' interface at the end of the article.

Siri

Siri has gained additional functionality with tvOS 10. Siri now understands topical searches like 'Show me teenage comedies from the 1980s' and multi-part searches like 'Show me Batman movies' followed by 'Just the ones with Christian Bale.'

Siri understands multi-part searches.

Siri is implemented a little differently for YouTube. Siri works with multiple content providers like Netflix, Hulu, and others and presumably, Apple wanted to avoid overwhelming search results with YouTube results. To invoke a YouTube search, you say something like, 'Show me cat videos on YouTube' from the home screen or, if you're already in the YouTube app, 'Show me cat videos.'

Game Controllers May be Required

When the 4th generation of the Apple TV was introduced there were inconsistencies in Apple's documentation that made it unclear whether game developers could require a third-party game controller. Ultimately though, Apple required game developers to support the Siri Remote despite its limited utility as a game controller. With tvOS 10, Apple has reversed course. Game developers can now require a game controller. In addition, tvOS 10 supports connections to up to four controllers at once.

Game developers also have access to the ReplayKit APIs with tvOS 10. Games that implement ReplayKit can hook into gameplay streaming services or save gameplay to the Apple TV for sharing later.

Music

tvOS 10 brings iOS' updated Music app to the Apple TV.

The update to Music on tvOS shares the look and feel of the changes to the app on iOS. From Music you can access your Library, the new ’For You’ and ‘Browse’ sections found on iOS, Radio, and Search. 'For You' includes the new ‘My New Music Mix’ and ‘My Favorites Mix’ as well as updates to cover art for certain playlists like the A-List series and Radio stations.

Photos

tvOS 10 includes Memories.

Photos on tvOS adds Memories, which is also a new feature on iOS and macOS Sierra. Memories creates a collage of photos around a theme along with a slideshow set to music. Memories can be anything from your best photos from the past three weeks, to a trip to the beach, or a collection of photos from a certain number of years ago. In my tests, Memories occasionally inserted a photo that didn't fit with the specific theme, but by and large, Memories worked well and is a welcome addition to tvOS.

Single Sign-On

This is a particularly American problem. Most TV network apps in the US require a cable subscription to view their shows through an app. In the past that has typically meant typing in your cable TV login credentials app-by-app on the network's website to get a passcode that will unlock an Apple TV app's content. This is even more maddening because the Apple TV has a history of forgetting logins and making you go through the process all over again.

Apple promised to put an end to this tortuous dance with single sign-on via the Apple TV's Settings app. Once logged in with your cable provider credentials, you will be spared re-entering them each time you download an app that requires your cable login. At WWDC, Eddy Cue said that there will also be special pages on the Apple TV App Store that list all of the apps you have access to based on your cable provider. Unfortunately, single sign-on was a no-show in the update released today. The feature requires the cooperation of cable providers and the developers of tvOS apps, so let's hope Apple can get both camps on board and add the feature soon.

HomeKit

HomeKit on tvOS serves two main purposes. First, you can control any HomeKit connected devices like lights or thermostats using the Siri Remote. Second, if, like most Apple TVs, yours is always at home and on or in sleep mode, it acts as a hub that your other devices can connect to remotely even when you are away from home and not on your local network. Combined with things like location awareness, HomeKit on tvOS opens up some interesting automation possibilities.

Automatic App Downloading

App discovery on tvOS has been a problem since the launch of the 4th generation Apple TV. There is no way to link to apps in the Apple TV App Store, making it difficult to discover new apps. tvOS 10 improves the situation a little for tvOS apps that have an iOS counterpart. Now when you download an iOS app to an iOS device, if there is an associated tvOS app, it will be automatically downloaded to your Apple TV. Unfortunately, it is still not possible to purchase tvOS-only apps outside the Apple TV App Store, so this change will not benefit those apps.


tvOS 10 is available on the 4th generation Apple TV only as a free update.

Dark Mode Gallery

App Store banner.

App Store app detail.

Music App Browse tab.

Featured genre playlists.

A Photos app Memories movie.


13 Sep 20:33

tvOS 10 is now available, adds enhancements to 4th Generation Apple TV’s remote app

by Patrick O'Rourke

Along with the release of watchOS 3 and iOS 10, tvOS 10 is available for download today on all 4th Generation Apple TV set-top boxes.

The new OS update adds new iPhone and iPad remote app features but, unexpectedly, doesn’t include the unified credential system for entering cable subscription information that Apple has shown off in the past.

The feature’s release is delayed until later this year, though it’s unclear how much of this functionality will actually be available in Canada. tvOS 10 also features the ability to search YouTube and control HomeKit devices with Siri, as well as a new dark mode.

It seems Apple’s servers are overloaded right now, so expect the download to take some time.

Related: watchOS 3 is here to make your Apple Watch faster

13 Sep 20:33

Punting on the Bunting

by russell davies

interesting

I was just looking for an old Interesting picture, saw this and realised - I've not got any bunting. It's all in storage somewhere. Too far away for Thursday anyway. Ah well.

Perhaps it's as well. It's time to move on. Bunting is so 2007.

13 Sep 20:30

Facebook Versus the Media

by Ben Thompson

Facebook found itself in the middle of another media controversy last week. Here’s the New York Times:

The image is iconic: A naked, 9-year-old girl fleeing napalm bombs during the Vietnam War, tears streaming down her face. The picture from 1972, which went on to win the Pulitzer Prize for spot news photography, has since been used countless times to illustrate the horrors of modern warfare.

But for Facebook, the image of the girl, Phan Thi Kim Phuc, was one that violated its standards about nudity on the social network. So after a Norwegian author posted images about the terror of war with the photo to Facebook, the company removed it.

The move triggered a backlash over how Facebook was censoring images. When a Norwegian newspaper, Aftenposten, cried foul over the takedown of the picture, thousands of people globally responded on Friday with an act of virtual civil disobedience by posting the image of Ms. Phuc on their Facebook pages and, in some cases, daring the company to act. Hours after the pushback, Facebook reinstated the photo across its site.

This, like many of Facebook’s recent run-ins with the media, has been like watching an old couple fight: they are nominally talking about the same episode, but in reality both are so wrapped up in their own issues and grievances that they are talking past each other.

Facebook Owns Facebook.com

Start with the media. Aftenposten Editor-in-chief Espen Egil Hansen wrote an open-letter to Facebook CEO Mark Zuckerberg that was, well, pretty amazing, and I’m not sure that’s a compliment:

Facebook has become a world-leading platform for spreading information, for debate and for social contact between persons. You have gained this position because you deserve it. But, dear Mark, you are the world’s most powerful editor. Even for a major player like Aftenposten, Facebook is hard to avoid. In fact we don’t really wish to avoid you, because you are offering us a great channel for distributing our content. We want to reach out with our journalism.

However, even though I am editor-in-chief of Norway’s largest newspaper, I have to realize that you are restricting my room for exercising my editorial responsibility. This is what you and your subordinates are doing in this case.

Actually, no, that is not what is happening at all. Aftenposten is not Facebook, and Facebook is not “Norway’s largest newspaper”. Accordingly, Facebook — and certainly not Mark Zuckerberg — did not take the photo down from Aftenposten.no. They did not block the print edition. They did not edit dear Espen. Rather, Facebook removed a post on Facebook.com, which Aftenposten does not own, and which Hansen admits in his own open letter is something freely offered to the newspaper, one that they take because it is “a great channel for distributing our content.”

Let me foreshadow what I will say later: Facebook screwed this up. But that doesn’t change the fact that Facebook.com is a private site, and while Aftenposten is more than happy to leverage Facebook for its own benefit that by no means suggests Aftenposten has a single iota of ownership over its page or anyone else’s.

The Freedom of the Internet

Unfortunately, Hansen’s letter gets worse:

The media have a responsibility to consider publication in every single case. This may be a heavy responsibility. Each editor must weigh the pros and cons. This right and duty, which all editors in the world have, should not be undermined by algorithms encoded in your office in California…

The least Facebook should do in order to be in harmony with its time is introduce geographically differentiated guidelines and rules for publication. Furthermore, Facebook should distinguish between editors and other Facebook-users. Editors cannot live with you, Mark, as a master editor.

I’ll be honest, this made me mad. Hansen oh-so-blithely presumes that he, simply by virtue of his job title, is entitled to special privileges on Facebook. But why, precisely, should that be the case? The entire premise of Facebook, indeed, the underpinning of the company’s success, is that it is a platform that can be used by every single person on earth. There are no gatekeepers, and certainly no outside editors. Demanding special treatment from Facebook because one controls a printing press is not only nonsensical it is downright antithetical to not just the premise of Facebook but the radical liberty afforded by the Internet. Hansen can write his open letter on aftenposten.no and I can say he’s being ridiculous on stratechery.com and there is not a damn thing anyone, including Mark Zuckerberg, can do about it.1

Make no mistake, I recognize the threats Facebook poses to discourse and politics; I’ve written about them explicitly. There are very real concerns that people are not being exposed to news that makes them uncomfortable, and Hansen is right that the photo in question is an example of exactly why making people feel uncomfortable is so important.

But it should also not be forgotten that the prison of engagement-driving news that people are locking themselves in is one of their own making: no one is forced to rely on Facebook for news, just as Aftenposten isn’t required to post its news on Facebook. And on the flipside, the freedom and reach afforded by the Internet remain so significant that the editor-in-chief of a newspaper I had never previously read can force the CEO of one of the most valuable companies in the world to accede to his demands by rousing worldwide outrage.

These two realities are inescapably intertwined, and as a writer who almost certainly would have never been given an inch of space in Aftenposten, I’ll stick with the Internet.

Facebook is Not a Media Company

One more rant, while I’m on a roll: journalists everywhere are using this episode to again make the case that Facebook is a media company. This piece by Peter Kafka was written before this photo controversy but is an excellent case-in-point (and, sigh, it is another open letter):

Dear Mark, We get it. We understand why you don’t want to call Facebook a media company. Your investors don’t want to invest in a media company, they want to invest in a technology company. Your best-and-brightest engineers? They don’t want to work at a media company. And we’re not even going to mention Trending Topicgate here, because that would be rude.

But here’s the deal. When you gather people’s attention, and sell that attention to advertisers, guess what? You’re a media company. And you’re really good at it. Really, really good. Billions of dollars a quarter good.

Let’s be clear: Facebook could call themselves a selfie-stick company and their valuation wouldn’t change an iota. As Kafka notes later in the article, Facebook gets all their content for free, which is a pretty big deal.

Indeed, I think one of the (many) reasons the media is so flummoxed with Facebook is that the company has stolen their business model and hugely improved on it. Remember, the entire reason why the media was so successful was because they made massive fixed cost investments in things like printing presses, delivery trucks, wireless spectrum, etc. that gave them monopolies or at worst oligopolies on local attention and thus advertising. The only fly in the ointment was that actual content had to be created continuously, and that’s expensive.

Facebook, like all Internet companies, takes the leverage of fixed costs to an exponentially greater level and marries that with free content creation that is far more interesting to far more people than old media ever was, which naturally attracts advertisers. To put it in academic terms, the Internet has allowed Facebook to expand the efficient frontier of attention gathering and monetization, ruining most media companies’ business model.

In other words, had Kafka insisted that Facebook is an advertising company, just like media companies, I would nod in agreement. That advertising, though, doesn’t just run against journalism: it runs against baby pictures, small businesses, cooking videos and everything in between. Facebook may be everything to the media, but the media is one of many types of content on Facebook.

In short, as long as Facebook doesn’t create content I think it’s a pretty big stretch to say they are a media company; it simply muddies the debate unnecessarily, and this dispute with Aftenposten is a perfect example of why being clear about the differences between a platform and a media company is important.

The Facebook-Media Disconnect

The disconnect in this debate reminds me of this picture:

[photo: an audience wearing VR headsets]

Ignore the fact that Facebook owns a VR company; the point is this: Facebook is, for better or worse, running a product that is predicated on showing people exactly what they want to see, all the way down to the individual. And while there is absolutely editorial bias in any algorithm, the challenge is indeed a technical one being worked out at a scale few can fully comprehend.

That Norwegian editor-in-chief, meanwhile, is still living in a world in which he and other self-appointed gatekeepers controlled the projector for the front of the room, and the facts of this particular case aside, it is awfully hard to avoid the conclusion that he and the rest of the media feel entitled to individuals’ headsets.

Facebook’s Mistake

Still, the facts of this case do matter: first off, quite obviously this photo should have never been censored, even if the initial flagging was understandable. What is really concerning, though, was the way Facebook refused to back down, not only continuing to censor the photo but actually barring the journalist who originally posted it from the platform for three days. Yes, this was some random Facebook staffer in Hamburg, but that’s the exact problem! No one at Facebook’s headquarters seems to care about this stuff unless it turns into a crisis, which means crises are only going to continue with potentially unwanted effects.

The truth is that Facebook may not be a media company, but users do read a lot of news there; by extension, the company may not have a monopoly in news distribution, but the impact of so many self-selecting Facebook as their primary news source has significant effects on society. And, as I’ve noted repeatedly, society and its representatives may very well strike back; this sort of stupidity via apathy will only hasten the reckoning.2

  1. It should be noted that this is exactly why the Peter Thiel-Gawker episode was so concerning.
  2. And, I’d add, this is exactly why I think Facebook should have distanced itself from Thiel
13 Sep 19:57

AirPods

by Neil Cybart

AirPods will turn out to be one of the more strategically important hardware products Apple has released this decade. However, you would never know it judging from the way Apple unveiled the device last week. I suspect that was intentional. While the press remains focused on the short-term debate surrounding the iPhone's lack of a 3.5mm headphone jack, few have realized that Apple just unveiled its second wearables platform. 

AirPods 1.0

Apple introduced AirPods as a $159 solution to a problem that many iPhone users never thought they had: wired headphones. By accelerating the transition away from wired headphones, Apple is convinced that the user experience found with mobile devices will be improved. While AirPods are designed to handle most of the tasks currently given to wired EarPods, a look inside shows that Apple aims to do much more with the device. Apple's new W1 chip, the company's first wireless chip, addresses traditional shortcomings associated with wireless headphones. However, when the W1 chip is combined with additional sensors, including voice accelerometers, AirPods become Apple's latest product that capitalizes on Siri.

After spending some time with AirPods in the demo room at the Apple keynote, there were four items that stood out to me:

  1. Easy to use. The AirPods setup was so incredibly simple (just open the AirPods charging case), I figured I must have been missing a step or two. It is clear that Apple spent much time addressing the known shortcomings found with many of the current wireless headphones available in the market.  
  2. Designed for Siri. AirPods are designed just as much for voice capture as they are for delivering sound. Even in the boisterous demo room, AirPods were able to capture my Siri command and then quickly provide the response from the nearby iPhone.
  3. The touch interface. A double tap on the outside of an AirPod activates Siri. It is easy to see how Apple will expand this touch interface in future hardware versions to activate or control additional actions. 
  4. Siri in my ear is more intuitive than in my hand or on my wrist. Using AirPods to query Siri and then quickly receive a response is incredibly intuitive, more so than my typical use case of looking at my iPhone or Apple Watch display to see Siri's written response. In addition, by having my primary interface with Siri be a small wireless device in my ear, I also gain an increased level of privacy. We will eventually get to the point at which I will be able to whisper or even mumble and AirPods will capture my command and deliver a Siri response. This will make the behavior of talking across the room to my iPhone or Amazon Echo seem downright archaic. 

There is also much intrigue found not just with the AirPods themselves, but also with the charging case. The AirPods case contains enough battery life for 24 hours of listening time, while AirPods provide up to five hours of listening time on one charge. This means that I will be able to use the case for up to four to five quick recharges. (A 15-minute recharge will be enough for 3 hours of listening time.) Apple appears to be telling us that AirPods are designed to be charged while not in use but, if in a bind, an emergency recharge is possible. We see this same thought process when looking at the design given to the Apple Pencil and Magic Mouse 2.

A New Wearables Platform

AirPods are not just a pair of bluetooth headphones or an iPhone accessory. Instead, AirPods represent Apple's second wearables platform. When thinking of AirPods in this way, it becomes much easier to envision where Apple may bring the product category over time. Not only will Apple expand the functionality found with AirPods, which is obvious, but there is opportunity for Apple to introduce a range of AirPods models that share a design language. 

Consider how far Apple has pushed its first wearables platform, Apple Watch, in just 16 months. The Apple Watch is already a $10 billion business. Apple now has five distinct Apple Watch models ranging in price from $269 to $1,499 and dozens of SKUs. While each model has the same design language (rectangular watch face and interchangeable bands), there are also key differences when it comes to features and functionality. In addition, Apple has been aggressive in building out the Watch ecosystem by releasing various Watch bands and other accessories.  

Apple's First Wearables Platform (September 2016 - 16 Months after Launch)

I expect AirPods to follow a similar pattern as Apple Watch. Additional models will eventually be introduced to address a wider portion of the wireless headphone market. While there will be some commonality between models, such as containing basic health and fitness monitoring capabilities, there will also be models that will be able to handle more differentiated use cases for certain environments such as schools and the workplace or for specific activities like running. Apple included its W1 chip in a few models of wireless Beats by Dre headphones. This move, while unusual for a company like Apple, does suggest that Apple has the intention of eventually expanding the AirPods line to include a range of models.

While Apple's two wearables platforms are inherently different from each other based on how they are designed for different sensory inputs, the two are in fact complementary to each other. The Apple Watch is designed to take advantage of the wrist's superior line of sight. This explains the device's rectangular display, designed to show as much text and other consumable information as quickly and efficiently as possible. Meanwhile, AirPods are designed to capitalize on the very powerful notification capabilities found with the human ear.

Items that are currently given to Apple Watch, such as tap notifications, may end up making much more sense for a device like AirPods, while Siri responses such as location or sports scores may sometimes make more sense shown on an Apple Watch display instead of simply delivered through voice in the ear. Apple's two wearables platforms may end up working hand-in-hand, or maybe I should say wrist-in-ear, to provide a seamless user experience based on the most personal tech gadgets that Apple has ever sold.

Strategy

The product strategy behind AirPods is based on what I coined the "Apple Experience" era. (My article introducing the term can be read here.) Apple will move beyond the iPhone by offering users the ability to create custom Apple experiences involving various form factors and software platforms. Apple services will help to connect everything together. 

There is a very straight-forward premise underlying the Apple Experience era: The iPhone will not be the hub of everyone's digital lives. This may seem counterintuitive considering that the iPhone has become the most valuable computer for hundreds of millions of users. However, it is this greater dependency on iPhone that opens the door for new, more personal products to flourish.

Just as our iPhone has become more powerful and capable over the years, the percentage of our daily tasks and responsibilities that we give to iPhone has been on the rise. The ongoing debate as to whether an iPad can handle all of the tasks given to a Mac ignores the fact that many have already positioned an iPhone as being able to handle many Mac tasks. We saw a few examples last week during Apple's keynote of how this trend is only going to intensify going forward. For example, the iPhone 7 Plus has a dual-camera system capable of capturing depth of field. The possibilities associated with that kind of technology could very well represent the next wave of smartphone innovation. 

While this increased functionality will increase the iPhone's value to hundreds of millions of users, it sets in motion the scenario in which room is created for new personal technology devices to begin to handle some of the simpler tasks currently given to iPhone. For example, instead of looking at our iPhone to see who sent that incoming email, we can quickly glance at the notification on our wrist saying we received a new email. Instead of looking at our iPhone to see if we are at the right location for lunch, we get a small notice from Siri in our ear that we need to walk another two blocks for lunch. It's not that the iPhone will become less valuable in these scenarios. Rather, the value found with more personal gadgets will increase.

By allowing consumers to pick and choose which products will handle their technology needs, we see the Apple Experience product strategy beginning to come to life. For some people, the iPhone will remain the primary hardware in their lives while others will find that Apple Watches and AirPods make much more sense for their lifestyle. We already see this evolutionary phenomenon materializing with the rise of wrist wearables. AirPods will usher in a new group of wearables that also begin to handle tasks formerly given to iPhones. 

Writing Is on the Wall

Apple is officially positioning AirPods as the beginning of the end of wired headphones. I would go much further. AirPods are the latest clue that the post-iPhone era is approaching. The writing is on the wall. A pair of AirPods (or even just one AirPod in an ear) and an Apple Watch with cellular connectivity will eventually be able to handle many of the most popular tasks currently given to an iPhone.

   

It will begin with simply leaving the iPhone at home while taking Apple Watch and AirPods on a run. Then it will expand to being able to leave the iPhone at home when running a quick errand. Soon, the iPhone will become the dedicated device for tasks like watching video and writing emails. Eventually, the iPhone will begin to be treated like an iPad or Mac, serving as the device we turn to for those times we need a more powerful device. All the while, more and more tasks are given to Apple Watch and AirPods. 

Throughout this process, Apple services such as Siri, iMessage, and Apple Maps will play a big role in making this transition away from iPhone possible as the very nature of computing tasks are simplified. As third-party developers embrace Apple services in new ways, the way we interact with these services will also change. 

Apple is learning from lessons experienced with the Apple Watch to approach AirPods in a much more modest way as seen with the way management discussed the product on stage last week. AirPods are being given a very simple directive today. It may be difficult to believe, but AirPods contain the potential to eventually become a more important product for Apple than even Apple Watch. Many possibilities are created by having Siri in our ear. AirPods are a very big deal. 

Receive my analysis and perspective on Apple throughout the week via exclusive daily updates (2-3 stories a day, 10-12 stories a week). To sign up, visit the membership page