Shared posts

23 Feb 06:42

More Hum; Less Varoom

by Ken Ohrn

The Economist reviews the speed of change in motor vehicles.  The internal combustion engine’s replacement by the electric motor is happening faster than expected. It often seems that technical progress exceeds expectations, while we humans change slowly.

Image via Wikimedia Commons

The implications are clear: more electrical energy usage and less fossil fuel usage, leading to falling demand for oil. That means greater impetus to “leave it in the ground”, and the growth of stranded assets as an issue for utilities, fossil-fuel corporations and the governments that subsidize and enable them. If, that is, we can overcome resistance to change from fossil-fuel companies, their captured governments, and their well-funded propaganda agents. And if we can continue to take advantage of the dropping costs of renewable energy sources like wind and solar.

The demand-reduction scenario is especially concerning for Alberta’s tar sands industry, whose costs are the highest of all.

THE high-pitched whirr of an electric car may not stir the soul like the bellow and growl of an internal combustion engine (ICE). But to compensate, electric motors give even the humblest cars explosive acceleration. Electric cars are similarly set for rapid forward thrust. Improving technology and tightening regulations on emissions from ICEs is about to propel electric vehicles (EVs) from a niche to the mainstream. After more than a century of reliance on fossil fuels, however, the route from petrol power to volts will be a tough one for carmakers to navigate. . . .

Ford’s boss is bolder still. In January Mark Fields announced that the “era of the electric vehicle is dawning”, and he reckons that the number of models of EVs will exceed pure ICE-powered cars within 15 years. Ford has promised 13 new electrified cars in the next five years. Others are making bigger commitments. Volkswagen, the world’s biggest carmaker, said last year that it would begin a product blitz in 2020 and launch 30 new battery-powered models by 2025, when EVs will account for up to a quarter of its sales. Daimler, a German rival, also recently set an ambitious target of up to a fifth of sales by the same date.


23 Feb 06:42

Growth without growth

by Paul Jarvis
If you don’t want to have employees or assistants, or to scale up or grow a company that’s bigger than you, there can still be growth involved. It’s just a different kind.
23 Feb 06:42

NeXT, Steve Jobs and the History of the Dynamic Web

by Martin

Reading stock exchange IPO prospectuses is probably not the first thing that comes to your mind when you want to learn something about computing history. This article by Hansen Hsu, recently published on the blog of the Computer History Museum, will convince you otherwise! In his article, Hansen discusses how NeXT, the company founded by Steve Jobs after he was ousted from Apple in the mid-1980s, helped to shape the transition from a static to a dynamic web. His research is based on a draft S-1 filing for a potential NeXT IPO which was recently donated to the museum.

IPO prospectuses have a number of interesting properties for historians. A major one is that companies have to describe their products, the products of their competitors, the current technological and financial competitive landscape, risks and many other things. The second thing that makes such documents interesting is that they have to be accurate, must not leave out important facts, and must stick as close to the truth as possible; otherwise the company can be sued if the stock price does not develop favorably and investors want their money back.

The NeXT draft IPO has 108 pages and reveals some interesting insights into the state of the Internet in 1996. In the document, NeXT states that most web pages were still static, i.e. there was no database and code running on web servers that assembled web pages on the fly. Yes, that’s how I remember the web in 1996. At the time I built my first web site, which was hosted on a university Sun server, and I created every page by hand.

At the time, NeXT came out with ‘WebObjects’, an object-oriented API to create web pages on the fly from information contained in a database. It seems they had quite some success with this, as the prospectus lists a number of customers using it despite it having launched only earlier in 1996. WebObjects faded away over time because NeXT was bought by Apple, not for WebObjects but for its operating system, which would form the basis of Mac OS X.
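
WebObjects itself is long gone, but the underlying pattern it sold — assembling the page from a database at request time instead of serving a file from disk — is easy to sketch. Here is a deliberately minimal Python version of that general idea (toy data and a toy server; nothing to do with WebObjects’ actual API):

    import sqlite3
    from http.server import BaseHTTPRequestHandler, HTTPServer

    # An in-memory database standing in for whatever backs the site (toy data).
    db = sqlite3.connect(":memory:", check_same_thread=False)
    db.execute("CREATE TABLE articles (title TEXT, year INTEGER)")
    db.executemany("INSERT INTO articles VALUES (?, ?)",
                   [("Hello, dynamic web", 1996), ("Another story", 1997)])

    class DynamicPage(BaseHTTPRequestHandler):
        def do_GET(self):
            # The page is assembled per request from the database,
            # not read from a pre-written HTML file on disk.
            rows = db.execute("SELECT title, year FROM articles").fetchall()
            items = "".join(f"<li>{title} ({year})</li>" for title, year in rows)
            body = f"<html><body><ul>{items}</ul></body></html>".encode()
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("localhost", 8000), DynamicPage).serve_forever()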

For more details, have a look at Hansen’s summary article and, if you want to know even more (e.g. how much Steve Jobs and other executives earned at NeXT), at the PDF of the 108-page draft IPO document, which is available on the Computer History Museum’s website as well.

23 Feb 06:42

Strait and Narrows – Episode 2: Finding Home

by pricetags

Everyone has their own reasons for landing in Metro Vancouver, but something special links their experiences. Three individuals share their stories of finding home, and trying to make their place in the world.


23 Feb 06:41

fault lines — a cultural heritage of misaligned expectations


Last week I had the pleasure of speaking at MuseumNext Melbourne. I did a talk that I've been threatening to do for a few years now, so I was grateful for the chance to work through and better articulate my argument. A number of people said they enjoyed the talk, which is always nice, and I heard stories of people debating and discussing the talk on their own afterwards, which is even better.

Mostly for my own benefit of remembering I've included the talk proposal below:

Have museums, and in particular art museums, become just a little bit too cozy with artists, their estates and their representative groups? Has it always been this way, and is the only change that the increasing presence of in-house digital technologies in the museum context now serves to highlight and reinforce this dynamic? What does it mean for a museum to try and cultivate a meaningful digital practice for their collections and exhibitions when the usage rights for their core assets are held by a third party?

How does the nearly ubiquitous presence of so-called “bring-your-own” digital technologies in a museum context affect the issue? How do the abilities and expectations these technologies confer on visitors affect any middle ground that may have previously been established between the rights and demands of artists and the access and preservation goals of a museum?

What does it mean when artists themselves employ these same digital technologies and exploit audience expectations to create their own bespoke museums?

This presentation will plunge into the topic, look around with a critical eye and endeavour to propose a practical and conceptual framework for where the museum sector goes from here.

This is what I actually said:

Hi, my name is Aaron. The story of my relationship with the cultural heritage sector is fiddly at best and boring at worst, so I will just say that while I don't presently work at a museum I still use the first-person plural. I self-identify as we.

I'd like to start with a little bit of audience participation.

Raise your hand if you've ever been to a museum. Keep your hand raised if you've ever taken a photograph of a wall label to remember something you've seen while visiting a museum. Keep your hand raised, still, if you think that's working out well for you.

Between 2012 and 2015 I was part of the team at the Cooper Hewitt that built the Pen. I used to ask those same three questions during the time we were making the Pen, as a way to explain why we were making the Pen. The punch-line was always: Imagine if you never had to do that again? Imagine if you could come to a museum with the confidence that remembering your visit would be easy and unintrusive and not require silly workarounds like photographing wall labels. Imagine if you could take for granted that your museum visit wouldn't be defined by all the things you had to do to remember your visit in the first place.

These days I am working on maps again. Specifically, I am working on a project to build a gazetteer of places with an open data license and global coverage spanning continents all the way down to neighbourhoods and venues. Every place in the gazetteer has a stable and permanent ID and by extension a stable and permanent URL.

When people ask why, I sometimes like to say that it's mostly so that we can stop arguing about how to spell place names. What if, instead, we could take for granted that each place had a stable referent off of which we might hang all the names and all the variations in spelling, in all the languages? That might be useful, if only because it would allow us to focus on other more important things.

I mention these stories because I want to start by laying my cards on the table for what will follow in this talk. That is: I fundamentally believe that the distinction between museums and libraries and archives, in the minds of people outside the cultural heritage sector, is collapsing. Assuming they ever thought those distinctions existed in the first place.

There are some people inside the sector who share my opinion but as often as not it is an idea that is met with outright hostility. These are fighting words, to many.

I have sometimes been accused of hyperbole or of not sufficiently understanding (or at least appreciating) the differing roles and responsibilities and historical contexts in which each practice has evolved. Both are fair criticisms, but the problem I have with either is that, whether or not they are true, they don't really address the actual argument I am advancing.

What if all the accusations on both sides of the argument are correct?

Perhaps what I am seeing are shadows at dusk — ill-defined and lacking clarity — but that doesn't mean they aren't still there. So what exactly do I mean when I say that the distinction between these three practices has, or is, collapsing?

I mean to suggest that the functions, the external expectations of competencies, of any one professional class are blurring with the others in people's minds.

Why shouldn't a museum have a robust and well-structured body of searchable metadata not just for its collection but also for all the ten-thousand-word essays that have been written about it? Why shouldn't libraries be able to accept self-deposits, both as an intellectual and an operational prerogative? Why shouldn't an archive offer interpretive guidance on the materials they house?

It is a challenge to explain to people outside the sector what actually distinguishes a museum from an archive when the former has storage facilities full of stuff that they can't, or won't, show to people because those objects haven't been catalogued properly. It is doubly challenging when you remember that libraries manage to make sense from a similar, often greater, chaos. It's not as though librarians actually read all the books they keep on hand, but still they do enough to foster and cultivate a culture of curiosity and to promote learning and discovery in their patrons.

I think one of the reasons my argument is met with such hostility is that it is predicated on some still nascent changes that contemporary life has afforded us and the practice of preservation has not usually been in the business of nascent technologies.

By changes I mean the Internet, as a whole, and more specifically the permanent (or at least durable) and asynchronous network of documents we call the World Wide Web. This network exists in contrast to the mono-directional and now increasingly weaponized television and broadcast culture that many hoped would be relegated to the ashes of the 20th century but which has seemed to return with a vengeance.

A couple of years ago Jason Scott, whom many of you will have seen speak this morning, and I attended a different conference together and he and I were discussing his work, as part of both the Internet Archive and Archive Team, to pre-emptively save things you never knew you were going to miss on the Internet. I am 100% in support not just of these efforts but also their approach. Whatever missteps the Internet Archive and Archive Team make along the way those who follow in our footsteps will benefit from the willingness of Jason and his peers to bet on the future regardless of how dimly their contemporaries may have looked upon the present.

There is however a weak link in Jason's work.

It is a weak link whose consequences are potentially so catastrophic that unless things have gotten really really really bad we will probably never let them come to pass. It is worth recognizing that all of Jason's work is built on the foundational layer we've come to know as the electrical grid. All of Jason's work vanishes when the power goes out.

I mention this because when you consider preservationists as a professional class, and when you think about their work in an historical context, you start to realize that all they have ever known, on average, is war and pillaging and looting. As such it is again important to recognize that although some mistakes have been made over the years they have otherwise done a remarkable job of keeping stuff safe under genuinely extraordinary circumstances.

So it's not crazy to imagine that for preservationists — again as an abstract professional class in an even more abstract historical timeline — the jury might still be out on whether electricity is anything we can depend on yet. I have yet to meet a preservationist who has expressed any kind of existential doubt about the electrical grid but, even just as an exercise, it is a possibility worth contemplating.

It seems only prudent.

I share this story because the larger argument I am trying to make in this talk rests on an even more recent and potentially less certain foundation. My argument depends not just on the electrical grid, and not just on a globally linked network of documents, but also on a network of globally linked databases and a layer of applications that we are building on top of that.

The scale and the speed with which computerized and networked databases have shaped contemporary life often lend them an air and a weight of inevitability that is as comfortable as it is misleading.

I want to acknowledge that at least one potential flaw in my argument is a misplaced confidence in our shared effort to ensure that we will be able to take for granted these layers of communal infrastructure. We treat these things as natural laws, rather than the shared and concerted efforts they are, at our own peril.

Again, even simply as an exercise, we would do well to imagine a world without an Internet. Or even just an Internet changed in nature beyond recognition. But if those possibilities are too terrible to ever let happen, then we also need to think about what they make possible. We need to think about how those possibilities change what people expect as commonplace and, by now you might be starting to see a theme emerge, what they might take for granted.

I also want to mention that there is an entirely other talk about the roles and responsibilities that the cultural heritage sector should assume in not simply preserving this network infrastructure but actively running and maintaining it in the service of cultural heritage itself.

But this is not that talk.

Instead this is a talk about the operationalization of recall. This is a talk about how recall — the core of what we champion and celebrate as so-called memory institutions — is being normalized by the network.

More than that, even, what I see is an increasing de-fetishization of recall. More and more we take recall for granted in that way we charge the most important aspects of our life with being mundane and unseen. Recall has joined the list of things that are only noticed in their absence.

I think this is a good thing but I worry that the cultural heritage sector, and in particular art museums, is structurally unprepared to adapt to it.

Here again we enter the territory of shadows at dusk. I do not want to suggest that either the problems I am describing, or their remedies, are universal. Like most things in the cultural heritage sector there will be many shades of grey that do not lend themselves easily to an eight-point strategy document. I do mean to suggest that there is something going bump in the night and it is worth our investigating.

I want to call attention to the assumption of a shared communal network infrastructure as a public good that is central to my argument. It is an assumption that, given the politics of the late 20-teens, no longer seems self-evident. And I also want to recognize that mine is an argument that presumes conditions which may be antithetical on both material and practical levels to the very practice of preservation itself. Maybe.

In the meantime it is hard to deny that we live in a world where reaching out and touching the sky is as much about touching the past as it is the present. And that we take this practice for granted.

For every Amazon order or dispatching of a car service or status update there is a Wikipedia query or someone consulting an email archive or a clown on social media being fact-checked. The list goes on. This is not simply about greater access to an ever-growing pool of information. More than that, it is the ability to take for granted that the past is proximate, in a manner that is genuinely unprecedented.

We use the network to circumvent the present and the moment and that's a curious environment for museums to operate in since being present in the moment is largely how we've come to see and to define ourselves.

So, in 2017 what is a museum (or a library or an archive) that does not usefully exist beyond the borders of the moment, beyond the borders of its walls? More importantly what is a museum, in 2017, that can not exist beyond its physical walls because it lacks permission to do so?

What follows is not a comprehensive catalog of ways this happens in 2017, nor are they examples specific to the network. Indeed we, in this room, could spend the rest of the afternoon documenting ways in which museums are prevented from doing their work. These examples serve only to illustrate the problem:

  • A prohibition on photography in the galleries or an overly heavy-handed response to visitors publishing photos online. A million years ago I worked at Flickr and we were constantly fielding take-down notices from the various French artists' rights associations about tourist-quality photographs taken in the galleries.
  • Perpetuating the madness around print-quality images in a universe where the minimum required image size for a museum's iPad app exceeds the minimum required image size for a print publication.
  • Limiting or restricting catalog records from being published online because they are not perfect or because they might upset someone.
  • Loan objects never being mentioned or included in a catalog or, worse, being removed after an exhibition comes down. This one is especially galling to me since we tell people that it's very important that they come see these objects and then pretend as though it never happened.
  • Generally limiting curatorial authority or decision making, whether it's a traveling exhibition or a retrospective of a contemporary artist's work. It's a practice that is fine if you're David Hockney and you want to hold a major retrospective of your own work at the Royal Academy which is basically a private members club. But that exhibition, in 2012, perhaps more than any other in recent memory highlighted the need for and the purpose of curatorial discretion.

What these examples point to is an imbalance in the relationship between museums and their dancing partners. That imbalance is further reflected in the inability to meaningfully interact on or with the network.

It reflects an inability for museums to engage with the network because of an overzealous regime of permissioning, or to use the network as a tool by which, to borrow Elaine Gurian's phrase, they might promote nuance and a more complex understanding of their collections.

That is also some pretty fancy talk for a pretty simple idea: that it is time for the cultural heritage sector, as a whole, to pick a fight.

It is time for the sector to pick a fight with artists, and artists' estates, and even your donors. It is time for the sector to pick a fight with anyone that is preventing you from being allowed to have a greater — and I want to stress greater, not total — license of interpretation over the works which you are charged with nurturing and caring for.

The following passage did not make it into the talk, which was only 20 minutes long, but since we have the luxury of time and space here: This has been a curious position to arrive at. If I studied anything it was painting and studio arts, so I am broadly sympathetic with the demands of artists to maintain control and ownership of their work. I also closely followed and supported the efforts in the comix world leading up to the Creator's Bill of Rights, in an industry pretty much defined by how poorly publishers have treated artists and writers. There are good reasons why we've ended up here, but I also feel as though we've gone from one extreme to the other, which counts as dubious progress in my book.

It is time to pick a fight because, at least on bad days, I might even suggest that the sector has been played. We all want to outlast the present, and this is especially true of artists. Museums and libraries and archives are a pretty good bet if that's your goal.

Consider the number of works sent to MoMA, in the 1970s, after one of their registrars let it be known he would accession into the collection anything that arrived on his desk by mail. Consider the well-known tactic of self-depositing any book with an ISBN to the Library of Congress. If you do, they are required to care for and feed that book for the rest of forever. Consider how happy people were to learn that the Library of Congress had acquired all the Twitter messages, and with them their tiny contribution, for precisely that reason.

Consider the growing number of artists and private companies who are creating their own museums in the services of... themselves.

It remains to be seen how well those last institutions will fare over time. Indeed, being in it for the long run, and being wired for the long run both operationally and intellectually, is what distinguishes the cultural heritage sector from other endeavours, and the public is made better by those efforts.

My argument though is that taking on the burden of championing and preserving cultural heritage is a two-way street. My argument is that the sector has become prone to an unhealthy deference, to being little more than caretakers in the service of someone else's enterprise that is legitimized by our labours. There exists an imbalance in our relationship with those whose works we shepherd that, at best, hampers our ability to actively participate with the network and for our own work to square with people's expectations of what is possible and what can be taken for granted.

Ask yourselves: Can I actually do anything with my museum's collection sitting here in this or any other random conference room, right now? I suspect that for most, probably all, of you the answer is resolutely: No, or at least not well. Those few ways in which it is possible for our collections to participate with the network are still surface-level, at most. Now ask yourselves why that is.

In closing I would like to leave you with an open question, perhaps even a provocation: Does the cultural heritage sector's current fascination with crafting experiences in fact betray our anxieties about the network? Or put another way: Is the vogue for fashioning these all-consuming, in-gallery shock-and-awe immersions actually a way for us to distract our visitors from the reality that there is often no recall of any consequence of that experience the minute they walk out the door?

Thank you.

23 Feb 06:39

Overcast 3.0: iOS 10 Features, UI Changes, Easy Queuing, and an Interview with Marco Arment

by Federico Viticci

Overcast, Marco Arment's popular podcast app for iOS, is defined by an interesting dualism: its essence has remained remarkably consistent with the original version released three years ago; at the same time, Arment has periodically revisited Overcast's design, features, and business model to build a superior listening environment for a larger audience.

The same judicious iteration permeates Overcast 3.0, launching today on the App Store. With improvements to episode management, visual changes aimed at modernizing the interface, and an evolution of the existing subscription-based model, Overcast 3.0 is another thoughtful combination of new ideas and old tropes, which converge in a refreshed yet instinctively familiar listening experience.

Overcast 3.0

While Overcast 3.0 may not look drastically different from version 2.5 (launched in March), there are several visual tweaks meant to transition the app's design language from iOS 7 to iOS 10.

It starts with the app icon, which has been redesigned to host a smaller circular shape. The change is subtle, but it's in line with other modern iOS apps.1

Overcast's old icon (top) vs. the redesign in 3.0.

More importantly, Overcast's flow has been rebuilt around the concept of cards in lieu of nested screens. Following Apple's implementation of stacked views in Apple Music and the iMessage App Store for iOS 10, Arment has adopted a similar approach for every non-list page in Overcast 3.0: the Now Playing screen, settings, downloads, and recommendations now appear as cards that slide up from the bottom of the UI while the previously viewed screen recedes into the background and is detached from the status bar.

The new Overcast (right) uses a card-like UI inspired by Apple Music.

As I wrote in my iOS 10 review, I'm a fan of this card-like appearance, and I was hoping more developers would use it in their apps. In many ways, Arment's take on stacked cards is preferable to Apple's: while still not interruptible, the animation has a shorter execution time and the card can be dismissed with a downward swipe or by pulling from the left edge of the screen. As a result, Apple Music and Overcast may share a similar Now Playing view, but Arment's version feels snappier and more comfortable to use.2

Two design styles: Apple Music and Overcast (left, right) vs. Apple's Podcasts.

Other functional improvements are poised to make Overcast a better iOS 10 citizen, too. Overcast 3.0 has a widget, which allows you to see what's playing, resume playback, and keep an eye on upcoming episodes from your Up Next queue (more on this below). The widget's design isn't too imaginative, but it gets the job done; it's nice to have more controls on the Home screen.

When scrolling a list of episodes, you can now peek and pop an episode's description and show notes with 3D Touch. I've long wished for an easier way to quickly preview the contents of an episode, and this is it. Plus, by swiping the peek card vertically, you'll access shortcuts to play an episode or delete it.

There's also support for rich notifications in iOS 10: when a new episode from your subscriptions comes in, you can expand its notification to reveal the show's artwork, episode information, and actionable buttons to play the episode or add it to your queue for later.

Rich notifications in Overcast 3.0.

I interact with Overcast primarily by browsing the app; I dislike enabling notifications for podcasts, but I can see why others will appreciate the increased interactivity of episode alerts.

The most notable visual and practical change in Overcast 3.0, however, is the new episode action tray. Inspired by Tweetbot's tap-to-reveal action drawer in the timeline, Overcast's action bar groups and simplifies actions that were previously hidden behind swipe gestures or other menus nested in an episode's page. Upon tapping an episode in a list, Overcast will display buttons to share, star, play, add to queue, and delete the selected item.

The tray's animation doesn't have the same polished 3D effect of Tweetbot, but it's just as convenient as a gateway to oft-used actions. By switching the tap behavior on an episode's cell to unfold a contextual menu, Arment has also eliminated much of Overcast's old UI anxiety: accidentally tapping an episode no longer interrupts what you're listening to without a chance to cancel. This two-stage interface adds another step to Overcast's interaction scheme, but it also better signals the user's intention and removes a lot of complexity from the app.

Directly tied to the action tray are Overcast's new queuing and playlist management features. In Overcast 3.0, you can add an episode to Up Next (which will immediately play after the current one) or save it to the bottom of the queue. Both Up Next and Queue are options available in the playlist button of the action bar; the Queue playlist is automatically created by Overcast upon queuing an episode for the first time.

Overcast's new queuing features don't act as the defining characteristic of this update; unlike Castro 2, which revolves around a novel concept of triaging and queuing, Queue and Up Next in Overcast are just options integrated with the app's more traditional approach to browsing and managing episodes. However, I've enjoyed Overcast's unsurprising but functional implementation of queuing (particularly thanks to actions in rich notifications), and I feel like I'll spend most of my time listening from the app's Queue playlist going forward.

Speaking of which, reordering episodes in a playlist is easier in Overcast 3.0 thanks to always-visible drag handles next to each episode. The ability to quickly reorder episodes in a playlist is especially useful in conjunction with Overcast's faster Watch app.

Overcast has been rewritten for watchOS 3 (it starts up in a second on my Apple Watch Series 2) and there's a button to view upcoming episodes from the currently active playlist (whether it's the queue, a playlist, or a show's individual episode list).

Finally, Overcast 3.0 features a redesigned Now Playing view with pagination in the top half of the screen. Vertical scrolling in the Now Playing page is gone; now, you swipe right to tweak audio settings and swipe left to view episode descriptions, show notes, and links to embedded chapters.

The new paginated Now Playing screen in Overcast 3.0.

I see the refreshed Now Playing screen as an overall improvement – it splits up controls and text content in dedicated areas and it allows a sloppy vertical swipe to always dismiss the Now Playing card, which feels nice.

Page indicators are hidden if an episode includes chapters.

I have one reservation, though: while episodes without chapters hint at pagination through page indicators below the show's artwork, these indicators aren't displayed if an episode contains chapters. As chapters continue to gain popularity among podcast producers, I believe some users will be confused by the design change as they won't be able to discover the Now Playing pages at all. Page indicators should always be displayed for consistency and discoverability.

Overcast 3.0 may not come with the same number of major additions as version 2.0 (which brought support for chapters and streaming), but it's a better iOS 10 app that fixes longstanding shortcomings and makes everything easier and speedier. If you didn't like Overcast before, version 3.0 likely won't change your mind. But as someone who's used the app every day since it first launched, Overcast 3.0 offers solid improvements that continue to make it my favorite podcast app for iOS.

Overcast 3.0 is available on the App Store.


In writing this story, I had a chance to ask Marco Arment a few questions about the evolution of Overcast's business model, ads, and integration with iOS 10. You can find the interview below.


A Brief Interview with Marco Arment

Federico Viticci: We discussed the switch to a patronage model when you launched Overcast 2.0. Then in September 2016, you shared some numbers about the performance of the patronage model and explained why you decided to try ads. How has the change gone so far? What have you learned since implementing ads in Overcast?

Marco Arment: Ads with a subscription to remove them has been Overcast's most successful business model. I finally have what I've been looking for: everyone getting all of the best features, while providing healthy, sustainable, and growing income over time.

Overcast's customers are also happier with this model than my previous ones. People who don't mind ads are fine with them, and people who don't want to see them are happy to pay for the subscription instead.

FV: With Overcast 3.0, you've removed Google ads in favor of your own solution. What drove this decision, and what kind of challenges do you see ahead?

MA: In order to make much money from the big mobile ad platforms, you need to be willing to embed multiple closed-source ad libraries in your app, you need to permit lots of questionable ads, and you need to design the app such that users are seeing and tapping on the ads a lot.

My Google ads in Overcast 2.6 were successful primarily because they drove people to subscribe — the ads themselves made very little money, in part because of poor placement and in part because I kept disallowing high-paying ad categories that I thought were inappropriate.

Finally, with the recent worrisome shifts in U.S. political and surveillance areas, I decided that I was no longer comfortable embedding anyone else's closed-source ad library into Overcast. For my customers' safety and privacy, I want knowledge and control of every line of non-Apple code that my app is running.

Since Google's ads weren't making much for me anyway and subscriptions made up the bulk of the revenue, I realized I could run pretty much any ads in the app, so I made my own and am trying to sell them directly. I don't have to sell many of them to match the previous ad revenue, they're completely under my control, and only the bare minimum data for accounting is collected (number of clicks per ad, etc.).

It's more work this way, but I believe it'll be worth it.

FV: One of your original goals with moving Overcast 2.0 to a free model was to make the app easier to use in its full feature set in an increasingly competitive landscape. Did the move to free (and thus to a larger audience) bring any downsides you were not expecting? Is it necessary to offer free apps with ads if you want to achieve any meaningful scale and revenue today?

MA: It depends on your app's market. For mass-market entertainment apps, especially when Apple includes a competitor for free, it's important to remove as many barriers to trying your app as possible. Free-up-front with some sort of in-app monetization is the best balance I've found between marketshare and revenue.

For Overcast, I haven't found any major downsides to a large, mostly-free audience. I've built the business with marketshare in mind, so I've kept costs low and engineered the server-side components to scale inexpensively. It achieves neither explosive growth (which needs funding and large expenditures) nor extreme profitability (which probably needs a paid-up-front app, at the expense of marketshare and competitive vulnerability). I'm satisfied with the middle ground.

FV: Finally, iOS 10. The latest version of iOS brought, among various consumer changes, support for rich notifications and an iMessage SDK. You've added rich notification support to Overcast, but haven't shipped an iMessage component for the app. Do you find iMessage apps to have a limited appeal? And what else would you like to see in the future of iOS for audio apps like Overcast?

MA: Overcast's top sharing destination is Messages, but there hasn't been much demand for an Overcast Messages app. I think iOS users mostly think to share from the source app using the iOS share sheet, which I already offer, rather than finding a corresponding app in Messages and trying to share directly from there.

There are plenty of interesting uses for Messages apps, but I haven't come up with a compelling one for Overcast yet.

My biggest platform feature request for Overcast is Siri support. Adding a Siri intent for audio services isn't easy, which is probably why it didn't make it into the initial Siri API in iOS 10 – in addition to basic playback commands, it would probably need a system to index each app's library of available content. Whenever we get more Siri API intents, probably with iOS 11, I hope that's one of them.


  1. Some examples: TwIM, Record Bird, Google. It takes some time getting used to the different shape on the Home screen (particularly compared to the guidelines followed by Apple in their apps), but I like Overcast's icon better now. It feels fresh. ↩︎
  2. I wish a light thud played through the Taptic Engine when the Now Playing screen is swiped to the bottom, though. I also would have liked to see the chevron in the top left corner (in the episode's title bar) dynamically change its shape as the card is pulled down, like Apple Music does with its pulling indicator in the Now Playing card. ↩︎

23 Feb 06:39

things I’ve recently learned about legislative redistricting

Interesting things are afoot in legislative redistricting! Over the past ten years, Republicans have enacted partisan gerrymanders in a number of state houses in order to establish and maintain control of U.S. politics despite their unpopular policies. I’ve been learning what I can about redistricting and I’m curious if there’s something useful I could offer as a geospatial open data person.

This post is a summary of things I’ve been learning. If any of this is wrong or incomplete, please say so in the comments below. Also, here’s an interactive map of the three overlapping districts you’re probably in right now:

Three exciting things are happening now.

First, Wisconsin is in court trying to defend its legislative plan, and not doing well. It’s a rare case of a district plan being challenged on explicitly partisan grounds; in the past we’ve seen racial and other measures used in laws like the Voting Rights Act, but partisan outcomes have not typically been considered grounds for action. It might be headed to the Supreme Court.

Second, a new measure of partisan gerrymandering, the Efficiency Gap, is giving courts a workable way to quantify partisan effects. Defined by two scholars, Nicholas Stephanopoulos and Eric McGhee, it counts two kinds of wasted votes: “lost votes” cast in favor of a defeated candidate, and “surplus votes” cast in favor of a winning candidate beyond what was actually necessary for the candidate’s victory. Stephanopoulos sums it up as “the difference between the parties’ respective wasted votes in an election, divided by the total number of votes cast.”
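
To make the arithmetic concrete, here is a toy Python sketch of the efficiency gap computed from district-level two-party vote totals. The districts and vote counts are made up, and the sign convention (positive means more Democratic votes wasted) is just the one chosen here:

    def efficiency_gap(districts):
        """Efficiency gap for a plan given as (dem_votes, rep_votes) per district.

        Wasted votes are "lost votes" for the losing candidate plus "surplus
        votes" for the winner beyond the bare majority needed to win. The gap
        is the difference in the two parties' wasted votes over all votes cast.
        """
        wasted_dem = wasted_rep = total = 0
        for dem, rep in districts:
            votes = dem + rep
            needed = votes // 2 + 1          # bare majority in a two-party race
            if dem > rep:
                wasted_dem += dem - needed   # winner's surplus votes
                wasted_rep += rep            # loser's lost votes
            else:
                wasted_rep += rep - needed
                wasted_dem += dem
            total += votes
        return (wasted_dem - wasted_rep) / total

    # A made-up five-district plan: Democratic voters packed into two districts.
    plan = [(70, 30), (70, 30), (35, 65), (35, 65), (35, 65)]
    print(round(efficiency_gap(plan), 3))   # 0.082, i.e. roughly 8% favoring Republicans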

Wisconsin happens to be one of the biggest bullies on this particular block:

This New Republic article provides a friendly explanation.

Third, former U.S. Attorney General Eric Holder has created the National Democratic Redistricting Committee (NDRC), a “targeted, state-by-state strategy that ensures Democrats can fight back and produce fairer maps in the 2021 redistricting process.” Right now, I’m hearing that NDRC is in early fundraising mode.

So that’s a lot.

I sent some fan mail to Eric McGhee and he graciously helped me understand a bunch of the basic concepts over coffee.

One thing I learned is the significance and use of political geography. As Marco Rogers has pointed out, liberals and Democrats clump together in urban areas: “Look at the electoral maps. Drill into the states. What we see is singular blue counties, clustered around cities, in an endless sea of red.” At Code for America, we worked with a number of cities that fell into this pattern, and frequently they were looking to CfA for help dealing with blue town vs. red county issues.

Jowei Chen, an associate professor of political science at the University of Michigan, has an extensive bibliography of writing about legislative districts. In his 2013 paper Unintentional Gerrymandering, Chen demonstrates how a sampling of possible redistricting proposals can maintain partisan bias:

In contemporary Florida and several other urbanized states, voters are arranged in geographic space in such a way that traditional districting principles of contiguity and compactness will generate substantial electoral bias in favor of the Republican Party.

Geometry is a red herring. Over the years I’ve encountered a few geometry optimizations for proposed districts, including this one from Brian Olson, written up in the Washington Post:

Olson’s proposed district plans

While compactness is desirable in a district, Olson’s method prioritizes visual aesthetics above political geography, and he notes some of the factors he ignores on his site, such as travel time: “it might be the right kind of thing to measure, but it would take too long.” Olson’s method selects aesthetically pleasing shapes that are fast to calculate on his home computer. I think that’s a terrible basis for a redistricting plan, but the goofy shapes that exist in many current plans are a popular butt of jokes:

Gerrymandering T-Shirts by BorderlineStyle

Chen particularly calls out how cartographic concerns can be a dead-end:

Our simulations suggest that reducing the partisan bias observed in such states would require reformers to give up on what Dixon (1968) referred to as the “myth of non-partisan cartography,” focusing not on the intentions of mapmakers, but instead on an empirical standard that assesses whether a districting plan is likely to treat both parties equally (e.g., King et al., 2006; Hirsch, 2009).

However, geography is not insurmountable.

In a later 2015 paper, Cutting Through the Thicket, Chen argues through statistical simulations that legislative outcomes can be predicted for a given redistricting plan, and plots the potential results of many plans to show that a given outcome can be intentionally selected:

A straightforward redistricting algorithm can be used to generate a benchmark against which to contrast a plan that has been called into constitutional question, thus laying bare any partisan advantage that cannot be attributed to legitimate legislative objectives.

Here’s Florida’s controversial 2012 plan shown as a dotted line to the right of 1,000 simulated plans, demonstrating a “clearer sense of how this extreme partisan advantage was created:”

A graph from Chen’s 2015 paper showing simulated partisan outcomes for Florida district plans

Chen concludes that the position of the dotted line relative to the modal outcomes shows partisan intent, if you agree that such an outcome is unlikely to be random.
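
As a toy illustration of that benchmark idea (my own sketch, not Chen’s code: the precinct results are randomly made up, the “challenged” plan is itself just another random draw, and contiguity, compactness and equal population — which his simulations respect — are ignored):

    import random

    random.seed(0)
    # Made-up precinct-level two-party results: (dem_votes, rep_votes).
    precincts = [(random.randint(200, 800), random.randint(200, 800))
                 for _ in range(120)]

    def dem_seats(plan):
        """Count districts won by Democrats; a plan is a list of groups of precinct indexes."""
        won = 0
        for district in plan:
            dem = sum(precincts[i][0] for i in district)
            rep = sum(precincts[i][1] for i in district)
            if dem > rep:
                won += 1
        return won

    def random_plan(n_districts=12):
        """Shuffle precincts into equal-sized districts (ignores contiguity and population)."""
        order = list(range(len(precincts)))
        random.shuffle(order)
        size = len(order) // n_districts
        return [order[i * size:(i + 1) * size] for i in range(n_districts)]

    benchmark = [dem_seats(random_plan()) for _ in range(1000)]
    challenged = dem_seats(random_plan())  # stand-in for the enacted plan under review

    # If the challenged plan's seat count sits far out in one tail of the
    # benchmark distribution, that is the kind of evidence Chen points to.
    print(min(benchmark), max(benchmark), challenged)

The point is only the shape of the comparison: score a large pile of neutrally drawn plans, then ask where the challenged plan falls in that distribution.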

In 2010, Republicans systematically generated skewed partisan outcomes in numerous state houses, as documented in this NPR interview with the author of Ratf**ked:

There was a huge Republican wave election in 2010, and that is an important piece of this. But the other important piece of Redmap is what they did to lock in those lines the following year. And it's the mapping efforts that were made and the precise strategies that were launched in 2011 to sustain those gains, even in Democratic years, which is what makes RedMap so effective and successful.

“RedMap” was a GOP program led by Republican strategist Chris Jankowski to turn the map red by targeting state legislative races:

The idea was that you could take a state like Ohio, for example. In 2008, the Democrats held a majority in the statehouse of 53-46. What RedMap does is they identify and target six specific statehouse seats. They spend $1 million on these races, which is an unheard of amount of money coming into a statehouse race. Republicans win five of these. They take control of the Statehouse in Ohio - also, the state Senate that year. And it gives them, essentially, a veto-proof run of the entire re-districting in the state.

Holder’s NDRC is a counter-effort to RedMap. They’re planning electoral, ballot, and legal initiatives to undo the damage of RedMap. Chen’s simulation method could allow a legislature to overcome geographic determinism and decide on an outcome that better represents the distribution of voters. Chen again:

We do not envision that a plaintiff would use our approach in isolation. On the contrary, it would be most effective in combination with evidence of partisan asymmetry and perhaps more traditional evidence including direct testimony about intent and critiques of individual districts. As with Justice Stevens’ description of partisan symmetry, we view it as a “helpful (though certainly not talismanic) tool.”

So, back to the efficiency gap.

McGhee and Stephanopoulos’s measure counts actual votes in real elections. That’s helpful to courts trying to determine whether a given plan is fair, because it does not rely on guessing about possible outcomes from public opinion. Chen’s approach provides a statistical expectation for what a normal plan could do, as well as ways to adjust plans based on desired outcomes. Calculating the efficiency gap for a proposed district plan is complicated, because you need to account for cases where simple red/blue data is missing, such as uncontested races. You have to impute the potential vote in each proposed new district.
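
As a sketch of what that imputation might look like (a deliberately naive version of my own; the published analyses estimate the missing vote from presidential results and other covariates rather than a fixed share):

    def impute_uncontested(districts, assumed_share=0.7):
        """Fill in missing vote counts for uncontested districts.

        districts: list of (dem_votes, rep_votes), with None for the absent party.
        assumed_share: assumed two-party share for the unopposed winner; real
        analyses estimate this from presidential results or prior elections.
        """
        imputed = []
        for dem, rep in districts:
            if dem is None:                 # Republican ran unopposed
                dem = int(rep * (1 - assumed_share) / assumed_share)
            elif rep is None:               # Democrat ran unopposed
                rep = int(dem * (1 - assumed_share) / assumed_share)
            imputed.append((dem, rep))
        return imputed

    # An unopposed Republican with 60,000 votes gets a notional Democratic
    # opponent before the plan is scored with the efficiency gap.
    print(impute_uncontested([(None, 60000), (42000, 38000)]))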

To do this, you need precinct-level election data. Jon Schleuss, Joe Fox, and others working with Ben Welsh at the LA Times Data Desk recently created the most detailed election result map ever made for California. In other states, the data is often not available online, and must be specially requested from sometimes-unhelpful officials. Eric McGhee told me that many experts working on redistricting use a dataset maintained by DailyKos, an independent liberal news website.

LA Times maps of California’s 2016 election results

There’s a big opportunity here for a carefully-vetted online tool that could calculate measures like the efficiency gap for a variety of districting plans. For my part, I’m getting started understanding the sources and types of data that might help pull district plans in a fairer direction. If you’re curious about your own district, find yourself on this map:

23 Feb 06:38

My Podcasting Setup

I’ve gotten a number of inquiries over the last 2 years about my podcasting setup and I’ve been meaning to write about it but….

But here it is! I actually wanted to write this because I felt like there wasn’t a ton of good information about this on the Internet aimed at more “casual” podcasters rather than people who want to do it professionally. So here’s what I’ve got.

There are roughly two types of podcasts: the kind you record with everyone in the same room and the kind you record with everyone in different rooms. They require slightly different setups, so I’ll talk about both. For me, Elizabeth Matsui and I record The Effort Report locally because we’re both at Johns Hopkins. But Hilary Parker and I record Not So Standard Deviations remotely because she’s on the other side of the country most of the time.

Recording Equipment

When Hilary and I first started we just used the microphone attached to the headphones you get with your iPhone or whatever. That’s okay but the sound feels very “narrow” to me. That said, it’s a good way to get started and it likely costs you nothing.

The next level up for many people is the Blue Yeti USB Microphone, which is a perfectly fine microphone and not too expensive. Also, it uses USB (as opposed to the more professional XLR) so it connects to any computer, which is nice. However, it typically retails for $120, which isn’t nothing, and there are probably cheaper microphones that are just as good. For example, Jason Snell recommends the Audio Technica ATR2100, which is only about $70.

If you’re willing to shell out a little more money, I’d highly recommend the Zoom H4n portable recorder. This is actually two things: a microphone and a recorder. It has a nice stereo microphone built into the top along with two XLR inputs on the bottom that allow you to record from external mics. It records to SD cards so it’s great for a portable setup where you don’t want to carry a computer around with you. It retails for about $200 so it’s not cheap, but in my opinion it is worth every penny. I’ve been using my H4n for years now.

Because we do a lot of recording for our online courses here, we’ve actually got a bit more equipment in the office. So for in-person podcasts I sometimes record using a Sennheiser MKH416-P48US attached to an Auray MS-5230T microphone stand, which is decidedly not cheap but is a great piece of hardware.

By the way, a microphone stand is great to have, if you can get one, so you don’t have to set the microphone on your desk or table. That way if you bump the table by accident or generally like to bang the table, it won’t get picked up on the microphone. It’s not something to get right away, but maybe later when you make the big time.

Recording Software

If you’re recording by yourself, you can just hook up your microphone to your computer and record to any old software that records sound (on the Mac you can use Quicktime). If you have multiple people, you can either

  1. Speak into the same mic and have both your voices recorded on the same audio file
  2. Use separate mics (and separate computers) and record separately onto separate audio files. This requires syncing the audio files in an editor, but that’s not too big a deal if you only have 2-3 people.

For local podcasts, I actually just use the H4n and record directly to the SD card. This creates separate WAV files for each microphone that are already synced so you can just plop them in the editor.

For remote podcasts, you’ll need some communication software. Hilary and I use Zencastr which has its own VoIP system that allows you to talk to anyone by just sending your guests a link. So I create a session in Zencastr, send Hilary the link for the session, she logs in (without needing any credentials) and we just start talking. The web site records the audio directly off of your microphone and then uploads the audio files (one for each guest) to Dropbox. The service is really nice and there are now a few just like it. Zencastr costs $20 a month right now but there is a limited free tier.

The other approach is to use something like Skype and then something like ecamm call-recorder to record the conversation. The downside with this approach is that if you have any network trouble that messes up the audio, then you will also record that. However, Zencastr (and related services) do not work on iOS devices and other devices that use WebKit-based browsers. So if you have someone calling in on a mobile device via Skype or something, then you’ll have to use this approach. Otherwise, I prefer the Zencastr approach and can’t really see any downside except for the cost.

Editing Software

There isn’t a lot of software that’s specifically designed for editing podcasts. I actually started off editing podcasts in Final Cut Pro X (a nonlinear video editor) because that’s what I was familiar with. But now I use Logic Pro X, which is not really designed for podcasts either, but it is a real digital audio workstation and has nice features (like strip silence). But I think something like Audacity would be fine for basic editing.

The main thing I need to do with editing is merge the different audio tracks together and cut off any extraneous material at the beginning or the end. I don’t usually do a lot of editing in the middle unless there’s a major mishap like a siren goes by or a cat jumps on the computer. Once the editing is done I bounce to an AAC or MP3 file for uploading.
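
If you want to script that basic edit instead of doing it in a DAW, here is a rough sketch of the same steps using the pydub Python library (file names are made up, and the MP3 export needs ffmpeg installed; this is not what Logic or Audacity does internally):

    from pydub import AudioSegment   # pip install pydub

    # Load the two per-speaker tracks (already in sync, e.g. off the H4n's SD card).
    host = AudioSegment.from_wav("host.wav")      # made-up file names
    guest = AudioSegment.from_wav("guest.wav")

    # Merge the tracks by overlaying one on the other.
    episode = host.overlay(guest)

    # Clip extraneous material: 30 seconds of setup chatter at the start and
    # 10 seconds after the sign-off at the end (pydub slices in milliseconds).
    episode = episode[30 * 1000:len(episode) - 10 * 1000]

    # Bounce to MP3 for upload to the host (Libsyn, SoundCloud, ...).
    episode.export("episode.mp3", format="mp3", bitrate="128k")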

Hosting

You’ll need a service for hosting your audio files if you don’t have your own server. You can technically host your audio files anywhere, but specific services have niceties like auto-generating the RSS feed. For Not So Standard Deviations I use SoundCloud and for The Effort Report I use Libsyn.

Of the two services, I think I prefer Libsyn, because it’s specifically designed for podcasting and has somewhat better analytics. The web site feels a little bit like it was designed in 2003, but otherwise it works great. Libsyn also has features for things like advertising and subscriptions, but I don’t use any of those. SoundCloud is fine but wasn’t really designed for podcasting and sometimes feels a little unnatural.

Summary

If you’re interested in getting started in podcasting, here’s my bottom line:

  1. Get a partner. It’s more fun that way!
  2. If you and your partner are remote, use Zencastr or something similar.
  3. Splurge for the Zoom H4n if you can; otherwise get a reasonably cheap microphone like the Audio Technica or the Yeti.
  4. Don’t focus too much on editing. Just clip off the beginning and the end.
  5. Host on Libsyn.
23 Feb 06:38

Recommended on Medium: I’ll never bring my phone on an international flight again. Neither should you.

A public domain image of a US Customs and Border Patrol agent in an airport.

A few months ago I wrote about how you can encrypt your entire life in less than an hour. Well, all the security in the world can’t save you if someone has physical possession of your phone or laptop, and can intimidate you into giving up your password.

Image credit: XKCD

And a few weeks ago, that’s precisely what happened to a US citizen returning home from abroad.

On January 30th, Sidd Bikkannavar, a US-born scientist at NASA’s Jet Propulsion Laboratory, flew back to Houston, Texas from Santiago, Chile.

On his way through the airport, Customs and Border Patrol agents pulled him aside. They searched him, then detained him in a room with a bunch of other people sleeping in cots. They eventually returned and said they’d release him if he told them the password to unlock his phone.

Sidd Bikkannavar’s hobbies include racing solar-powered cars. Photo by The Verge

Bikkannavar explained that the phone belonged to NASA and had sensitive information on it, but his pleas fell on deaf ears. He eventually yielded and unlocked his phone. The agents left with his phone. Half an hour later, they returned, handed him his phone, and released him.

We’re going to discuss the legality of all of this, and what likely happened during those 30 minutes when Bikkannavar’s phone was unlocked and out of his possession.

But before we do, take a moment to think about all the apps you have on your phone. Email? Facebook? Dropbox? Your browser? Signal? The history of everything you’ve ever done — everything you’ve ever searched, and everything you’ve ever said to anyone — is right there in those apps.

“We should treat personal electronic data with the same care and respect as weapons-grade plutonium — it is dangerous, long-lasting and once it has leaked there’s no getting it back.” — Cory Doctorow

How many potentially incriminating things do you have lying around your home? If you’re like most people, the answer is probably zero. And yet police would need to go before a judge and establish probable cause before they could get a warrant to search your home.

What we’re seeing now is that anyone can be grabbed on their way through customs and forced to hand over the full contents of their digital life.

Companies like Elcomsoft make “forensic software” that can suck down all your photos, contacts — even passwords for your email and social media accounts — in a matter of minutes. Their customers include the police forces of various countries, militaries, and private security forces. They can use these tools to permanently archive everything there is to know about you. All they need is your unlocked phone.

“If one would give me six lines written by the hand of the most honest man, I would find something in them to have him hanged.” — Cardinal Richelieu in 1641

What’s the worst thing that could happen if the Customs and Border Patrol succeed in getting ahold of your unlocked phone? Well…

  • Think of all of the people you’ve ever called or emailed, and all the people you’re connected with on Facebook and LinkedIn. What are the chances that one of them has committed a serious crime, or will do so in the future?
  • Have you ever taken a photo at a protest, bought a controversial book on Amazon, or vented about an encounter with a police officer to a loved one? That information is now part of your permanent record, and could be dragged out as evidence against you if you ever end up in court.
  • There’s a movement within government to make all data from all departments available to all staff at a local, state, and federal level. The more places your data ends up, the larger a hacker’s “attack surface” is — that is, the more vulnerable your data is. A security breach in a single police station in the middle of nowhere could result in your data ending up in the hands of hackers — and potentially used against you from the shadows — for the rest of your life.

Wait a second. What about my fourth and fifth amendment rights? Isn’t this illegal?

The fourth amendment protects you against unreasonable search and seizure. The fifth amendment protects you against self-incrimination.

If a police officer were to stop you on the street of America and ask you to unlock your phone and give it to them, these amendments would give you strong legal ground for refusing to do so.

But unfortunately, the US border isn’t technically the US, and you don’t have either of these rights at the border.

It’s totally legal for a US Customs and Border Patrol officer to ask you to unlock your phone and hand it over to them. And they can detain you indefinitely if you don’t. Even if you’re an American citizen.

The border is technically outside of US jurisdiction, in a sort of legal no-man’s-land. You have very few rights there. Barring the use of “excessive force,” agents can do whatever they want to you.

So my advice is to just do whatever they tell you, and get through customs and into the US as quickly as you can.

The US isn’t the only country that does this.

It’s only a matter of time before downloading the contents of people’s phones becomes a standard procedure for entering every country. This already happens in Canada. And you can bet that countries like China and Russia aren’t far behind.

“Never say anything in an electronic message that you wouldn’t want appearing, and attributed to you, in tomorrow morning’s front-page headline in the New York Times.” — Colonel David Russell, former head of DARPA’s Information Processing Techniques Office

Since it’s illegal in most countries to profile individual travelers, customs officers will soon require everyone to do this.

The companies who make the software that downloads data from your phones are about to get a huge infusion of money from governments. Their software will get much faster — maybe requiring only a few seconds to download all of your most pertinent data from your phone.

If we do nothing to resist, pretty soon everyone will have to unlock their phone and hand it over to a customs agent while they’re getting their passport swiped.

Over time, this unparalleled intrusion into your personal privacy may come to feel as routine as taking off your shoes and putting them on a conveyor belt.

And with this single new procedure, all the hard work that Apple and Google have invested in encrypting the data on your phone — and fighting for your privacy in court — will be a completely moot point.

Governments will have succeeded in utterly circumventing decades of innovation in security and privacy protection. All by demanding you hand them the skeleton key to your life — your unlocked phone.

You can’t hand over a device that you don’t have.

When you travel internationally, you should leave your mobile phone and laptop at home. At most international airports, you can rent phones that come with data plans.

If you have family overseas, you can buy a second phone and laptop and leave them there at their home.

If you’re an employer, you can create a policy that your employees are not to bring devices with them during international travel. You can then issue them “loaner” laptops and phones once they enter the country.

Since most of our private data is stored in the cloud — and not on individual devices — you could also reset your phone to its factory settings before boarding an international flight. This process will also delete the keys necessary to decrypt any residual data on your phone (iOS and Android fully encrypt your data).

This way, you could bring your physical phone with you, then reinstall apps and re-authenticate with them once you’ve arrived. If you’re asked to hand over your unlocked phone at the border, there won’t be any personal data on it. All your data will be safe behind the world-class security that Facebook, Google, Apple, Signal, and all these other companies use.

Is all this inconvenient? Absolutely. But it’s the only sane course of action when you consider the gravity of your data falling into the wrong hands.

If you bother locking your doors at night, you should bother securing your phone’s data during international travel.

This may upset Customs and Border Patrol agents, who are probably smart enough to realize that 85% of Americans now have smart phones, and probably 100% of the Americans who travel internationally have smart phones. They may choose to detain you anyway, and force you to give them passwords to various accounts manually. But there’s no easy way for them to know which services you use and which services you don’t use, or whether you have multiple accounts.

We live in an era of mass surveillance, where governments around the world are passing terrifying new anti-privacy laws every year.

“Those who are willing to surrender their freedom for security have always demanded that if they give up their full freedom it should also be taken from those not prepared to do so.” — Friedrich Hayek

With a lot of hard work on our part, enlightenment will triumph. Privacy will be restored. And we will beat back the current climate of fear that’s confusing people into unnecessarily giving up their rights.

In the meantime, follow the Boy Scouts of America Motto: always be prepared. The next time you plan to cross a border, leave your phone at home.

Thank you for taking the time to read this.

23 Feb 06:35

Instapaper Liked: The true story of how Teen Vogue got mad, got woke, and began terrifying men like Donald Trump

“A woman’s preaching is like a dog’s walking on his hind legs,” Samuel Johnson said back in 1791. “It is not done well; but you are surprised to find it done at…
23 Feb 06:35

Recommended on Medium: Masjid Toronto

I wrote this for WelcomeHomeTO’s blog last year — seems as good a time as any to repost.

Continue reading on Medium »

23 Feb 06:35

Twitter Favorites: [codecatTO] @sillygwailo @TheHTML500 yay!! My mentoring buddy for the day!!

Codecat @codecatTO
@sillygwailo @TheHTML500 yay!! My mentoring buddy for the day!!
23 Feb 06:35

When Good Intentions Backfire

by zephoria

… And Why We Need a Hacker Mindset


I am surrounded by people who are driven by good intentions. Educators who want to inform students, who passionately believe that people can be empowered through knowledge. Activists who have committed their lives to addressing inequities, who believe that they have a moral responsibility to shine a spotlight on injustice. Journalists who believe their mission is to inform the public, who believe that objectivity is the cornerstone of their profession. I am in awe of their passion and commitment, their dedication and persistence.

Yet, I’m existentially struggling as I watch them fight for what is right. I have learned that people who view themselves through the lens of good intentions cannot imagine that they could be a pawn in someone else’s game. They cannot imagine that the values and frames that they’ve dedicated their lives towards — free speech, media literacy, truth — could be manipulated or repurposed by others in ways that undermine their good intentions.

I find it frustrating to bear witness to good intentions getting manipulated, but it’s even harder to watch how those who are wedded to good intentions are often unwilling to acknowledge this, let alone start imagining how to develop the appropriate antibodies. Too many folks that I love dearly just want to double down on the approaches they’ve taken and the commitments they’ve made. On one hand, I get it — folks’ life-work and identities are caught up in these issues.

But this is where I think we’re going to get ourselves into loads of trouble.

The world is full of people with all sorts of intentions. Their practices and values, ideologies and belief systems collide in all sorts of complex ways. Sometimes, the fight is about combating horrible intentions, but often it is not. In college, my roommate used to pound a mantra into my head whenever I would get spun up about something: “Do not attribute to maliciousness what you can attribute to stupidity.” I return to this statement a lot when I think about how to build resilience and challenge injustices, especially when things look so corrupt and horribly intended — or when people who should be allies see each other as combatants. But as I think about how we should resist manipulation and fight prejudice, I also think that it’s imperative to move away from simply relying on “good intentions.”

I don’t want to undermine those with good intentions, but I also don’t want good intentions to be a tool that can be used against people. So I want to think about how good intentions get embedded in various practices and the implications of how we view the different actors involved.

The Good Intentions of Media Literacy

When I penned my essay “Did Media Literacy Backfire?”, I wanted to ask those who were committed to media literacy to think about how their good intentions — situated in a broader cultural context — might not play out as they would like. Folks who critiqued my essay on media literacy pushed back in all sorts of ways, both online and off. Many made me think, but some also reminded me that my way of writing was off-putting. I was accused of using the question “Did media literacy backfire?” to stoke clicks. Some snarkily challenged my suggestion that media literacy was even meaningfully in existence, asked me to be specific about which instantiations I meant (because I used the phrase “standard implementations”), and otherwise pushed for the need to double down on “good” or “high quality” media literacy. The reality is that I’m a huge proponent of their good intentions — and have long shared them, but I wrote this piece because I’m worried that good intentions can backfire.

While I was researching youth culture, I never set out to understand what curricula teachers used in the classroom. I wasn’t there to assess the quality of the teachers or the efficacy of their formal educational approaches. I simply wanted to understand what students heard and how they incorporated the lessons they received into their lives. Although the teens that I met had a lot of choice words to offer about their teachers, I’ve always assumed that most teachers entered the profession with the best of intentions, even if their students couldn’t see that. But I spent my days listening to students’ frustrations and misperceptions of the messages teachers offered.

I’ve never met an educator who thinks that the process of educating is easy or formulaic. (Heck, this is why most educators roll their eyes when they hear talk of computerized systems that can educate better than teachers.) So why do we assume that well-intended classroom lessons — or even well-designed curricula — will play out as we imagine? This isn’t simply about the efficacy of the lesson or the skill of the teacher, but the cultural context in which these conversations occur.

In many communities in which I’ve done research, the authority of teachers is often questioned. Nowhere is this more painfully visible than when well-intended highly educated (often white) teachers come to teach in poorer communities of color. Yet, how often are pedagogical interventions designed by researchers really taking into account the doubt that students and their parents have of these teachers? And how do we as educators and scholars grapple with how we might have made mistakes?

I’m not asking “Did Media Literacy Backfire?” to be a pain in the toosh, but to genuinely highlight how the ripple effects of good intentions may not play out as imagined on the ground for all sorts of reasons.

The Good Intentions of Engineers

From the outside, companies like Facebook and Google seem pretty evil to many people. They’re situated in a capitalist logic that many advocates and progressives despise. They’re opaque and they don’t engage the public in their decision-making processes, even when those decisions have huge implications for what people read and think. They’re extremely powerful and they’ve made a lot of people rich in an environment where financial inequality and instability is front and center. Primarily located in one small part of the country, they also seem like a monolithic beast.

As a result, it’s not surprising to me that many people assume that engineers and product designers have evil (or at least financially motivated) intentions. There’s an irony here because my experience is the opposite. Most product teams have painfully good intentions, shaped by utopic visions of how the ideal person would interact with the ideal system. Nothing is more painful than sitting through a product design session with design personae that have been plucked from a collection of clichés.

I’ve seen a lot of terribly naive product plans, with user experience mockups that lack any sense of how or why people might interact with a system in unexpected ways. I spent years tracking how people did unintended things with social media, such as the rise of “Fakesters,” or of teenagers who gamed Facebook’s system by inserting brand names into their posts, realizing that this would make their posts rise higher in the social network’s news feed. It has always boggled my mind how difficult it is for engineers and product designers to imagine how their systems would get gamed. I actually genuinely loved product work because I couldn’t help but think about how to break a system through unexpected social practices.

Most products and features that get released start with good intentions, but they too get munged by the system, framed by marketing plans, and manipulated by users. And then there’s the dance of chaos as companies seek to clean up PR messes (which often involves non-technical actors telling insane fictions about the product), patch bugs to prevent abuse, and throw bandaids on parts of the code that didn’t play out as intended. There’s a reason that no one can tell you exactly how Google’s search engine or Facebook’s news feed works. Sure, the PR folks will tell you that it’s proprietary code. But the ugly truth is that the code has been patched to smithereens to address countless types of manipulation and gamification (e.g., SEO to bots). It’s quaint to read the original “page rank” paper that Brin and Page wrote when they envisioned how a search engine could ideally work. That’s so not how the system works today.

The good intentions of engineers and product people, especially those embedded in large companies, are often doubted as sheen for a capitalist agenda. Yet, like many other well-intended actors, I often find that makers feel misunderstood and maligned, assumed to have evil thoughts. And I often think that when non-tech people start by assuming that they’re evil, we lose a significant opportunity to address problems.

The Good Intentions of Journalists

I’ve been harsh on journalists lately, mostly because I find it so infuriating that a profession that is dedicated to being a check to power could be so ill-equipped to be self-reflexive about its own practices.

Yet, I know that I’m being unfair. Their codes of conduct and idealistic visions of their profession help journalists and editors and publishers stay strong in an environment where they are accustomed to being attacked. It just kills me that the culture of journalism makes those who have an important role to play unable to see how they can be manipulated at scale.

Sure, plenty of top-notch journalists are used to negotiating deception and avoidance. You gotta love a profession that persistently bangs its head against a wall of “no comment.” But journalism has grown up as an individual sport; a competition for leads and attention that can get fugly in the best of configurations. Time is rarely on a journalist’s side, just as nuance is rarely valued by editors. Trying to find “balance” in this ecosystem has always been a pipe dream, but objectivity is a shared hallucination that keeps well-intended journalists going.

Powerful actors have always tried to manipulate the news media, especially State actors. This is why the fourth estate is seen as so important in the American context. Yet, the game has changed, in part because of the distributed power of the masses. Social media marketers quickly figured out that manufacturing outrage and spectacle would give them a pathway to attention, attracting news media like bees to honey. Most folks rolled their eyes, watching as monied people played the same games as State actors. But what about the long tail? How do we grapple with the long tail? How should journalists respond to those who are hacking the attention economy?

I am genuinely struggling to figure out how journalists, editors, and news media should respond in an environment in which they are getting gamed. What I do know from 12-steps is that the first step is to admit that you have a problem. And we aren’t there yet. And sadly, that means that good intentions are getting gamed.

Developing the Hacker Mindset

I’m in awe of how many of the folks I vehemently disagree with are willing to align themselves with others they vehemently disagree with when they have a shared interest in the next step. Some conservative and hate groups are willing to be odd bedfellows because they’re willing to share tactics, even if they don’t share end goals. Many progressives can’t even imagine coming together with folks who have a slightly different vision, let alone a different end goal, to even imagine various tactics. Why is that?

My goal in writing these essays is not because I know the solutions to some of the most complex problems that we face — I don’t — but because I think that we need to start thinking about these puzzles sideways, upside down, and from non-Euclidean spaces. In short, I keep thinking that we need more well-intended folks to start thinking like hackers.

Think just as much about how you build an ideal system as how it might be corrupted, destroyed, manipulated, or gamed. Think about unintended consequences, not simply to stop a bad idea but to build resilience into the model.

As a developer, I always loved the notion of “extensibility” because it was an ideal of building a system that could take unimagined future development into consideration. Part of why I love the notion is that it’s bloody impossible to implement. Sure, I (poorly) comment my code and build object-oriented structures that would allow for some level of technical flexibility. But, at the end of the day, I’d always end up kicking myself for not imagining a particular use case in my original design and, as a result, doing a lot more band-aiding than I’d like to admit. The masters of software engineering extensibility are inspiring because they don’t just hold onto the task at hand, but have a vision for all sorts of different future directions that may never come to fruition. That thinking is so key to building anything, whether it be software or a campaign or a policy. And yet, it’s not a muscle that we train people to develop.
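
To make “extensibility” a bit more concrete, here is a minimal, purely illustrative sketch (not from the original essay; every name in it is invented) of the kind of structure developers reach for when they try to leave room for uses they haven’t imagined yet: a small registry that lets future handlers be added without rewriting the core loop.

    # A tiny, hypothetical illustration of designing for extensibility:
    # the dispatcher knows nothing about specific handlers, so new
    # behaviors can be registered later without touching the core loop.

    HANDLERS = {}

    def register(event_type):
        """Register a handler function for a given event type."""
        def wrap(func):
            HANDLERS[event_type] = func
            return func
        return wrap

    def dispatch(event_type, payload):
        """Route an event to whichever handler was registered for it."""
        handler = HANDLERS.get(event_type)
        if handler is None:
            # The case the original designer never imagined: fail loudly
            # instead of quietly band-aiding it later.
            raise ValueError(f"no handler registered for {event_type!r}")
        return handler(payload)

    @register("comment")
    def handle_comment(payload):
        return f"comment saved: {payload}"

    # A handler added long after the dispatcher was written:
    @register("reaction")
    def handle_reaction(payload):
        return f"reaction recorded: {payload}"

    print(dispatch("reaction", "+1"))

Even a toy like this shows the limit described above: the registry absorbs some unanticipated futures, but anything that doesn’t fit the event-and-payload shape still ends up in the band-aid branch.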

If we want to address some of the major challenges in civil society, we need the types of people who think 10 steps ahead in chess, imagine innovative ways of breaking things, and think with extensibility at their core. More importantly, we all need to develop that sensibility in ourselves. This is the hacker mindset.

This post was originally posted on Points. It builds off of a series of essays on topics affecting the public sphere written by folks at Data & Society. As expected, my earlier posts ruffled some feathers, and I’ve been trying to think about how to respond in a productive manner. This is my attempt.

Flickr Image: CC BY 2.0-licensed image by DaveBleasdale.

23 Feb 06:32

Robots are more logical than The NY Times Editorial Board, and that’s the problem

by Stowe Boyd

Sooner or later we will have to actively oppose the onslaught of automation and AI, or else there will be no work left, at all, for any of…

Continue reading on Work Futures »

23 Feb 06:32

Scaling your Product Team: When and How to Start

by Heather McCloskey

So your organization is growing, your products are becoming more complex, and you’ve got more customers demanding your attention and new functionalities. Congratulations! These are all good things, but they can stress product teams that haven’t kept up with the rest of their organization’s growth.

It’s important to recognize when it’s time to scale up and think big, which is not always easy for product management leaders who have been running lean and mean since the early days of proof-of-concepts and MVPs. When is the right time to grow the group? What are you trying to achieve by scaling your product team? And how exactly do you do it?

These are all good questions for the aspiring product leader looking to evolve with the times. Let’s start by looking at the key moments when growing the team is no longer optional.

When to Scale your Product Team

When you’re the problem

This one is obvious. Product management is working well when the engineers have plenty of meticulously planned and researched user stories, epics, and requirements piling up in their queue. Product management is NOT working well when the technical team is waiting around for product management, or you’re letting things fall through the cracks as you try to keep up with the many demands on your plate.

If there’s simply too much for you to do, it’s time to scale up. Don’t be the bottleneck, build a wider bottle.

When you’re adopting a platform approach

Shifting from a single product to a platform is also an excellent opportunity to scale your product management organization to meet the new challenges that shift presents and the new skillsets it requires.

“You need to make an explicit delineation between customer facing product managers (AKA “Solutions Product managers” in SAAS companies) and platform product managers,” says Wyatt Jenkins of Hired, “Platform PMs are in charge of creating functionality that spans multiple product lines. These PM’s need to understand the large strategic vision well enough to make short and long term trade-offs across different products.”

When you’re feeding the beast at growth-oriented companies

At consumer tech companies, there is often an obsession with growth. That’s not necessarily a bad thing, as growth typically increases revenue and/or valuation. But… rapid growth has two major impacts on product management:

  • If it’s successful, you’ve got a ton of scaling issues to worry about that have very little to do with defining new functionality and everything to do with making sure your systems can run through the night without a babysitter.
  • Your growth engine is actually a product of its own, with tentacles reaching into web sites, social media, CRM and more.

That’s why many organizations assign product management to growth, or assign growth to product management. For example, at organizations like Uber and Facebook, under their VP of Growth they’ve got product teams (including product managers) for functions like Signups, Notifications and Onboarding. Meanwhile, at companies such as Twitter and Dropbox, they actually place the growth team under the VP of Product, again with individual product managers assigned to core growth functions.

Both of these scenarios represent an additional layer of challenges for scaling a product team. If you’re embedding product managers inside of a growth team that has nothing to do with the core product experience, your organization risks each PM and their respective team creating their own processes and habits that might not play well with others down the line. On the other side of the coin, if a PM executive is responsible for non-PM functions, they need to serve two masters.

“Pramod Sokke is the Head of Products for all clients on all BitTorrent platforms. He’s responsible for BitTorrent’s growth metrics and non-growth metrics. Meaning Pramod must develop his product roadmaps for growth and non-growth KPIs because he’s responsible for both sets of KPIs,” reports Andrew McInnes. “People not working on growth initiatives at BitTorrent are confident that he’s prioritizing growth initiatives in a balanced way to ensure the success of everyone in the long run.”

And, of course, we all know no matter how great the growth engine may be, it ultimately comes down to product-market fit and user experience. “What is the point in acquiring all those users, if they leave once they see the product?” says Uber’s Andrew Chen. “Growth is an after-effect of strong product market fit and great distribution.”

How to Scale your Product Team

So you know you need to grow, but how do you do it? And how do you communicate the goals of this growth to everyone involved? You’re not simply adding an additional assembly line to the product management factory, you’re building a team of high-performing individuals combining to create something better than before. Here’s a few things to remember:

More product managers ≠ more features

It’s easy to equate an increased product management function with an increased output of new functionality and features, but there are a few problems with that assumption.

First of all, a new product manager can create specs all day long, but if there isn’t an additional dev team to build them, you’re not going to see an increase in features hitting the market.

More importantly, more features shouldn’t be the goal of scaling your product management team. You might even begin requesting fewer new features as you grow your team, but the items you are asking for will be better researched, more clearly defined and more closely tracked and aligned with your company’s key goals and metrics.

You’re not cloning yourself

While you have plenty to teach your new product managers, the goal of scaling is not to literally replicate yourself. Instead, you’re trying to build a team. And just like a great basketball team doesn’t put five point guards on the floor at the same time, you need to assemble a diverse combination of skillsets, experiences, and outlooks as you grow.

Working together as a team

“You have to stop being the product manager or pattern manager, and start being the builder of a team,” says Microsoft veteran Steven Sinofsky, “As a product leader, it’s not necessarily your responsibility to build the product, but to be the creator of a framework for how decisions are going to get made. In doing this, you are allowing people to discover the patterns on their own. Because these might not be patterns you thought of originally, and a new pattern might be on the verge of being created.”

This means you need to accept different styles, embrace diversity, and give them a long leash. Sure, they’ll make mistakes that you would have avoided, but that’s how they’ll learn what they need to be successful in the long run. What is essential, however, is establishing a tone and framework that ensures this diverse set of individuals is all working toward the same thing.

“In order to grow and scale our product teams, people need a set of values to help them make good decisions that align with what we believe,” says Paul Adams of Intercom.

Build or buy

Since no one gets a degree in product management, there’s always the question of whether you pull folks from within your own company into product management or recruit from outside the firm.

Many companies often start by converting technical personnel into product managers, under the assumption that their knowledge of the code, systems and players will be advantageous and let them hit the ground running. While this is true, it also means you’re importing their biases and institutional memory into the product management organization while skimping on business chops that an external candidate offers. That’s why some firms like to take a little of both.

“We decided to employ a mixed strategy. We hired 5 PMs from the outside, and transferred in 4 from other internal functions,” says Scott Williamson of SendGrid, “We like this combination, as it provides us with a rich mix of PM, email, and SendGrid experience. The ‘SendGrid-grown PMs’ can learn PM best practices from the more experienced team members, and the external PMs can learn from the SendGrid veterans about the customers and technology behind SendGrid.”

And just as your product management team may hail from different places, their individual job responsibilities and functions should also be tweaked and tailored to fit their individual skills.

“Not everyone has the same idea of what a product manager does, so we tried to be specific about what tasks PMs could own while sharing a vision for what the role could accomplish,” says Isaac Souweine of Frank & Oak. “By taking an iterative approach, we were able to customize the PM role to the specific organizational and team context at Frank & Oak.”

Laying the groundwork for expanding the team

While product management seems like a no-brainer for product managers, a lot of people in the organization don’t always understand why it’s so important to grow product management at the same rate as the rest of the company. That’s why you need to win over the hearts and minds of decision makers and sell them on the merits of expanding the team.

“Given the overall organizational skepticism around the value of PM, I knew I had some bridge building to do,” says SendGrid’s Williamson, “One of my first moves was to set up 1:1s with key players across the company, to listen and understand their issues with PM to that point, and to open lines of communication that didn’t previously exist. Over time, trust developed and with it came the internal support to grow the team.”

Who belongs on the product team

This internal campaigning must not only raise the importance of product management throughout the company, it should also focus on getting the budget to hire quality and not just quantity. Top talent with experience will demand higher salaries than a fresh grad or junior engineer looking to see how the other half lives.

When it comes to how big your new team needs to be, there’s no perfect answer, but plenty of opinions on the ideal ratio.

“Tech companies should have a product management team that vaguely scales with Engineering,” says product management guru Rich Mironov, “50 Engineering folks might suggest 3 product managers; 200 Engineering folks may need 7-10 product managers wrestling with requirements/priorities/markets/interrupts.”

Whether you’re basing your new org chart on how many engineers you have, the size of your product suite or which verticals you’re targeting, your rationale should be consistent and defendable when challenged by management.

An ever-evolving organizational approach

Just like there’s no single, perfect product organization, there’s also no product organization that doesn’t need to adapt over time. As your company grows and matures, pivots and expands, refocuses and repositions, your team should also mirror the new directions and goals of the organization.

Take Buffer, for example. They’re on their fourth iteration of how they organize product management, beginning with a one-stop-shop approach before breaking up the team into holistic teams, then fluid task forces and now operating under a “goal-focused squads” and “chapters” model.

They assign a collection of skillsets to specific squads focused on particular aspects of the product suite (such as “Android” or “Onboarding”). These squads include engineers, designers, product managers, customer development and sometimes growth analysts.

Buffer also assigned everyone to chapters, where everyone with the same job role gets together (i.e. all the product managers, all the designers). It’s at the chapter level where they maintain the standards for product management, exchange ideas about best practices, etc.

“Not only does this arrangement help us create specialists with great ownership of the challenges they’re working on,” says Buffer’s Courtney Seiter. “It also shows us exactly where we need to grow the team in order to be at our most effective.”

Buffer’s approach is similar to those used at Spotify and Hudl. And while the squad/chapter approach creates a lot of high-functioning and purpose-driven teams within the organization, it does come with its own set of challenges.

“They rely heavily on squads and team members themselves to communicate openly, challenge themselves by staying uncomfortable, and share knowledge with other parts of the company when it’s needed,” says Hudl’s Jordan Degner. “As easy as that sounds, it can be even easier to stay in the comfort of your own expertise and keep that expertise to yourself.”

Of course, not every organization is a great fit for a bunch of independent teams, but the principle of assigning discrete ownership of specific areas is a common thread in successfully scaled product organizations. You have a product executive at the top, and individual product managers take ownership of measurable and contained things.

“It should be obvious and apparent what area each product owner runs, what metrics they are responsible for, and how it impacts the business,” says Barron Ernst of ShowMax. “If you have a PM working on special projects that don’t advance your startup, it’s time to question the purpose of the role.”

That means product teams should have a defined area of need and future ownership for any new product management hire. Unlike in engineering (where an extra hand is always appreciated and can be swapped from one area to the next for extra bandwidth), bringing on a new product manager with no specific purpose is unlikely to improve the situation. It’s also the reason product leaders need to create a framework for success when they add members to their team.

“Individual product managers are rarely able to define their jobs, or push back on groups that dump random work in their direction,” says Mironov. “Without someone to establish job boundaries, they end up doing a little of everything and not enough of their real value-add.”

This brings us back to the most important thing to remember as you scale your team. You’re no longer just “the product person,” you are a team leader. It’s not about what you’re doing, it’s about what your team is doing, and how they perform and interact with the rest of the organization is a direct reflection on you. So scale wisely…

23 Feb 06:31

Manifestos and Monopolies

by Ben Thompson

It is certainly possible that, as per recent speculation, Facebook CEO Mark Zuckerberg is preparing to run for President. It is also possible that Facebook is on the verge of failing “just like MySpace”. And while I’m here, it’s possible that UFOs exist. I doubt it, though.

The reality is that Facebook is one of the most powerful companies the tech industry — and arguably, the world — has ever seen. True, everything posted on Facebook is put there for free, either by individuals or professional content creators;[1] and true, Facebook isn’t really irreplaceable when it comes to the generation of economic value;[2] and it is also true that there are all kinds of alternatives when it comes to communication. However, to take these truths as evidence that Facebook is fragile requires a view of the world that is increasingly archaic.

Start with production: there certainly was a point in human history when economic power was derived through the control of resources and the production of scarce goods:

However, for most products this has not been the case for well over a century; first the industrial revolution and then the advent of the assembly-line method of manufacturing resulted in an abundance of products. The new source of economic power became distribution: the ability to get those mass-produced products in front of customers who were inclined to buy them:

Today the fundamental impact of the Internet is to make distribution itself a cheap commodity — or in the case of digital content, completely free. And that, by extension, is why I have long argued that the Internet Revolution is as momentous as the Industrial Revolution: it is transforming how and where economic value is generated, and thus where power resides:

In this brave new world, power comes not from production, not from distribution, but from controlling consumption: all markets will be demand-driven; the extent to which they already are is a function of how digitized they have become.

This is why most Facebook-fail-fundamentalists so badly miss the point: that the company pays nothing for its content is not a weakness, it is a reflection of the fundamental reality that the supply of content (and increasingly goods) is infinite, and thus worthless; that the company is not essential to the distribution of products is not a measure of its economic importance, or lack thereof, but a reflection that distribution is no longer a differentiator. And last of all, the fact that communication is possible on other platforms is to ignore the fact that communication will always be easiest on Facebook, because they own the social graph. Combine that with the fact that controlling consumption is about controlling billions of individual consumers, all of whom will, all things being equal, choose the easy option, and you start to appreciate just how dominant Facebook is.

Given this reality, why would Zuckerberg want to be President? He is not only the CEO of Facebook, he is the dominant shareholder as well, answerable to no one. His power and ability to influence are greater than those of any President subject to political reality and checks and balances, and besides, as Zuckerberg made clear last week, his concern is not a mere country but rather the entire world.

Facebook Unease

The argument that Facebook is more powerful than most realize is not a new one on Stratechery; in 2015 I wrote The Facebook Epoch that made similar points about just how underrated Facebook was, particularly in Silicon Valley. In my role as an analyst I can’t help but be impressed: I have probably written more positive pieces about Facebook than just about any other company, and frankly, still will.

And yet, if you were to take a military-type approach to analysis — evaluating Facebook based on capabilities, not intent — the company is, for the exact same reasons, rather terrifying. Last year in The Voters Decide I wrote:

Given their power over what users see Facebook could, if it chose, be the most potent political force in the world. Until, of course, said meddling was uncovered, at which point the service, having so significantly betrayed trust, would lose a substantial number of users and thus its lucrative and privileged place in advertising, leading to a plunge in market value. In short, there are no incentives for Facebook to explicitly favor any type of content beyond that which drives deeper engagement; all evidence suggests that is exactly what the service does.

The furor last May over Facebook’s alleged tampering with the Trending Topics box — and Facebook’s overwrought reaction to even the suggestion of explicit bias — seemed to confirm that Facebook’s incentives were such that the company would never become overtly political. To be sure, algorithms are written by humans, which means they will always have implicit bias, and the focus on engagement has its own harms, particularly the creation of filter bubbles and fake news, but I have long viewed Facebook’s use for explicit political ends to be the greatest danger of all.

This is why I read Zuckerberg’s manifesto, Building a Global Community, with such alarm. Zuckerberg not only gives his perspective on how the world is changing — and, at least in passing, some small admission that Facebook’s focus on engagement may have driven things like filter bubbles and fake news — but for the first time explicitly commits Facebook to playing a central role in effecting that change in a manner that aligns with Zuckerberg’s personal views on the world. Zuckerberg writes:

This is a time when many of us around the world are reflecting on how we can have the most positive impact. I am reminded of my favorite saying about technology: “We always overestimate what we can do in two years, and we underestimate what we can do in ten years.” We may not have the power to create the world we want immediately, but we can all start working on the long term today. In times like these, the most important thing we at Facebook can do is develop the social infrastructure to give people the power to build a global community that works for all of us.

For the past decade, Facebook has focused on connecting friends and families. With that foundation, our next focus will be developing the social infrastructure for community — for supporting us, for keeping us safe, for informing us, for civic engagement, and for inclusion of all.

It all sounds so benign, and given Zuckerberg’s framing of the disintegration of institutions that held society together, helpful, even. And one can even argue that just as the industrial revolution shifted political power from localized fiefdoms and cities to centralized nation-states, the Internet revolution will, perhaps, require a shift in political power to global entities. That seems to be Zuckerberg’s position:

Our greatest opportunities are now global — like spreading prosperity and freedom, promoting peace and understanding, lifting people out of poverty, and accelerating science. Our greatest challenges also need global responses — like ending terrorism, fighting climate change, and preventing pandemics. Progress now requires humanity coming together not just as cities or nations, but also as a global community.

There’s just one problem: first, Zuckerberg may be wrong; it’s just as plausible to argue that the ultimate end-state of the Internet Revolution is a devolution of power to smaller more responsive self-selected entities. And, even if Zuckerberg is right, is there anyone who believes that a private company run by an unaccountable all-powerful person that tracks your every move for the purpose of selling advertising is the best possible form said global governance should take?

The Cost of Monopoly

My deep-rooted suspicion of Zuckerberg’s manifesto has nothing to do with Facebook or Zuckerberg; I suspect that we agree on more political goals than not. Rather, my discomfort arises from my strong belief that centralized power is both inefficient and dangerous: no one person, or company, can figure out optimal solutions for everyone on their own, and history is riddled with examples of central planners ostensibly acting with the best of intentions — at least in their own minds — resulting in the most horrific of consequences; those consequences sometimes take the form of overt costs, both economic and humanitarian, and sometimes those costs are foregone opportunities and innovations. Usually it’s both.

Facebook is already problematic for society when it comes to opportunity costs. While the Internet — specifically, the removal of distribution as a bottleneck — is the cause of journalism’s woes, it is Facebook that has gobbled up all of the profits in publishing. Twitter, a service I believe is both unique and essential, was squashed by Facebook; I suspect the company’s struggles for viability are at the root of the service’s inability to evolve or deal with abuse. Even Snapchat, led by the most visionary product person tech has seen in years, has serious questions about its long-term viability. Facebook is too dominant: its network effects are too strong, and its data on every user on the Internet too compelling to the advertisers other consumer-serving businesses need to be viable entities.[3]

I don’t necessarily begrudge Facebook this dominance; as I alluded to above I myself have benefited from chronicling it. Zuckerberg identified a market opportunity, ruthlessly exploited it with superior execution, had the humility to buy when necessary and the audacity to copy well, and has deservedly profited in the face of continual skepticism. And further, as I noted, as long as Facebook was governed by the profit-maximization incentive, I was willing to tolerate the company’s unintended consequences: whatever steps would be necessary to undo the company’s dominance, particularly if initiated by governments, would have their own unintended consequences. And besides, as we saw with IBM and Windows, markets are far more effective than governments at tearing down the ecosystem-based monopolies they enable — in part because the pursuit of profit-maximizing strategies is a key ingredient of disruption.

That, though, is why for me this manifesto crosses the line: contra Spider-Man, Facebook’s great power does not entail great responsibility; said power ought to entail the refusal to apply it, no matter how altruistic the aims, and barring that, it is on the rest of us to act in opposition.

Limiting Facebook

Of course it is one thing to point out the problems with Facebook’s dominance, but it’s quite another to come up with a strategy for dealing with it; too many of the solutions — including demands that Zuckerberg use Facebook for political ends — are less concerned with the abuse of power and more with securing said power for the “right” causes. And, from the opposite side, it’s not clear that a traditional antitrust approach is even possible for companies governed by Aggregation Theory, as I explained last year in Antitrust and Aggregation:

To briefly recap, Aggregation Theory is about how business works in a world with zero distribution costs and zero transaction costs; consumers are attracted to an aggregator through the delivery of a superior experience, which attracts modular suppliers, which improves the experience and thus attracts more consumers, and thus more suppliers in the aforementioned virtuous cycle…

The first key antitrust implication of Aggregation Theory is that, thanks to these virtuous cycles, the big get bigger; indeed, all things being equal the equilibrium state in a market covered by Aggregation Theory is monopoly: one aggregator that has captured all of the consumers and all of the suppliers.

This monopoly, though, is a lot different than the monopolies of yesteryear: aggregators aren’t limiting consumer choice by controlling supply (like oil) or distribution (like railroads) or infrastructure (like telephone wires); rather, consumers are self-selecting onto the Aggregator’s platform because it’s a better experience.

Facebook is a particularly thorny case, because the company has multiple lock-ins: on one hand, as per Aggregation Theory, Facebook has completely modularized and commoditized content suppliers desperate to reach Facebook’s massive user base; it’s a two-sided market in which suppliers are completely powerless. But so are users, thanks to Facebook’s network effects: the number one feature of any social network is whether or not your friends or family are using it, and everyone uses Facebook (even if they also use another social network as well).

To that end, Facebook should not be allowed to buy another network-based app; I would go further and make it prima facie anticompetitive for one social network to buy another. Network effects are just too powerful to allow them to be combined. For example, the current environment would look a lot different if Facebook didn’t own Instagram or WhatsApp (and, should Facebook ever lose an antitrust lawsuit, the remedy would almost certainly be spinning off Instagram and WhatsApp).

Secondly, all social networks should be required to enable social graph portability — the ability to export your lists of friends from one network to another. Again Instagram is the perfect example: the one-time photo-filtering app launched its network off the back of Twitter by enabling the wholesale import of your Twitter social graph. And, after it was acquired by Facebook, Instagram has only accelerated its growth by continually importing your Facebook network. Today all social networks have long since made this impossible, making it that much more difficult for competitors to arise.
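
As a purely illustrative aside (nothing like this exists as a standard today, and every name and field below is invented), “social graph portability” could be as mundane as a machine-readable export that a competing network is allowed to import:

    # Hypothetical sketch of social graph portability: a user exports
    # their connections from one service and a rival imports them.
    # The data shape and function names are invented for illustration.

    import json

    def export_graph(user_id, connections):
        """Serialize a user's connections into a portable JSON document."""
        return json.dumps({
            "user": user_id,
            "connections": [
                {"handle": c["handle"], "display_name": c["display_name"]}
                for c in connections
            ],
        }, indent=2)

    def import_graph(document, lookup):
        """Match exported handles against the importing network's users."""
        data = json.loads(document)
        matches = []
        for c in data["connections"]:
            local_user = lookup(c["handle"])
            if local_user is not None:
                matches.append(local_user)
        return matches

    exported = export_graph("alice", [
        {"handle": "@bob", "display_name": "Bob"},
        {"handle": "@carol", "display_name": "Carol"},
    ])
    print(import_graph(exported, {"@bob": 42}.get))  # -> [42]

The point of the sketch is only that the lock-in described above is a matter of policy, not of any technical barrier: the data is trivially easy to move if the incumbent is required to let it move.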

Third, serious attention should be given to Facebook’s data collection on individuals. As a rule I don’t have any problem with advertising, or even data collection, but Facebook is so pervasive that it is all but impossible for individuals to opt-out in any meaningful way, which further solidifies Facebook’s growing dominance of digital advertising.[4]


Anyone who has read Stratechery for any length of time knows I have great reservations about regulation; the benefits are easy to measure, but the opportunity costs are both invisible and often far greater. That, though, is why I am also concerned about Facebook’s dominance: there are significant opportunity costs to the social network’s dominance. Even then, my trepidation about any sort of intervention is vast, and that leads me back to Zuckerberg’s manifesto: it’s bad enough for Facebook to have so much power, but the very suggestion that Zuckerberg might utilize it for political ends raises the costs of inaction from not just opportunity costs to overt ones.

Moreover, my proposals are in line with Zuckerberg’s proclaimed goals: if the Facebook CEO truly wants to foster new kinds of communities, then he ought to unleash the force that can best build the tools those disparate communities might need. That, of course, is the market, and Facebook’s social graph is the key. That Zuckerberg believes Facebook can do it alone is evidence enough that for Zuckerberg, saving the world is at best a close second to saving Facebook; the last thing we need are unaccountable leaders who put their personal interests above those they purport to govern.

  1. Plus, of course, the content Facebook pays for to seed initiatives like live video and dedicated content for the new video tab
  2. To be clear, economic value is generated on Facebook, but the role Facebook plays, whether that be advertising, small business sites, buy-and-sell groups, etc., could be done by alternatives
  3. Social networks must be free
  4. Google is a separate topic
23 Feb 06:27

Rule by Nobody

by Adam Clair

The compensation for a death sentence is knowledge of the exact hour when one is to die.
—Cincinnatus C., Invitation to a Beheading (Vladimir Nabokov, 1935)

 

Decision-making algorithms are everywhere, sorting us, judging us, and making critical decisions about us without our having much direct influence in the process. Political campaigns use them to decide where (and where not) to campaign. Social media platforms and search engines use them to figure out which posts and links to show us and in what order, and to target ads. Retailers use them to price items dynamically and recommend items they think you’ll be more likely to consume. News sites use them to sort content. The finance industry — from your credit score to the bots that high-frequency traders use to capitalize on news stories and tweets — is dominated by algorithms. Even dating is increasingly algorithmic, enacting a kind of de facto eugenics program for the cohort that relies on such services.

For all their ubiquity, these algorithms are paradoxical at their heart. They are designed to improve on human decision-making by supposedly removing its biases and limitations, but the inevitably reductive analytical protocols they implement are often just as vulnerable to misuse. Decision-making algorithms replace humans with simplified models of human thought processes that can reify rather than mitigate the biases the programmers who build them are working from in conceptualizing the algorithm’s intent.

Cathy O’Neil, in her recent book Weapons of Math Destruction, defines algorithms as “opinions formalized in code.” This deceptively simple appraisal radically undercuts the common view of algorithms as neutral and objective. And even if programmers were capable of correcting against their own biases, the machine-learning components of many algorithms make their workings mysterious, sometimes even to programmers themselves, as Frank Pasquale describes in another recent book, The Black Box Society.
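
To make O’Neil’s phrase concrete, here is a deliberately contrived, hypothetical scoring function (invented for this post, not drawn from her book): every feature, weight, and cutoff in it is a judgment call someone made, even though the output arrives looking like a neutral number.

    # A contrived "risk score" showing how opinions get formalized in
    # code. All features, weights, and thresholds here are invented.

    def risk_score(applicant):
        score = 0.0
        # Opinion: living in certain neighborhoods is treated as riskier.
        if applicant["zip_code"] in {"ZIP_A", "ZIP_B"}:
            score += 0.4
        # Opinion: job tenure matters, and two years is where the line sits.
        if applicant["years_at_job"] < 2:
            score += 0.3
        # Opinion: any prior police contact is predictive, regardless of
        # outcome or context.
        score += 0.1 * applicant["police_contacts"]
        return score

    def decision(applicant, cutoff=0.5):
        # Opinion: 0.5 is where "acceptable" ends.
        return "flag" if risk_score(applicant) >= cutoff else "pass"

    print(decision({"zip_code": "ZIP_A", "years_at_job": 1,
                    "police_contacts": 0}))  # -> flag

Each number is an editorial choice; the veneer of objectivity comes only from the fact that the choices are hidden inside the function.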

In the complexity of their code and the size of the data troves they can process, these kinds of algorithms can seem unprecedented, constituting an entirely new kind of social threat. But the aims they are designed to meet are not new. The logic of how these algorithms have been applied follows from the longstanding ideals of bureaucracies generally: that is, they are presumed to concentrate power in well-ordered and consistent structures. In theory, anyway. In practice, bureaucracies tend toward inscrutable unaccountability, much as algorithms do. By framing algorithms as an extension of familiar bureaucratic principles, we can draw from the history of the critique of bureaucracy to help further unpack algorithms’ dangers. Like formalized bureaucracy, algorithms may make overtures toward transparency, but tend toward an opacity that reinforces extant social injustices.

In the early 20th century, sociologist Max Weber outlined the essence of pure bureaucracies. Like algorithms, bureaucratic processes are built on the assumption that individual human judgment is too limited, subjective, and unreliable, deficiencies that lead to nepotism, prejudice, and inefficiency. To combat that, an ideal bureaucracy, according to Weber, has a clear purpose, explicit written rules of conduct, and a merit-based hierarchy of career employees. This structure places power in the apparatus and allows bureaucracies to function consistently regardless of who occupies different roles, but this same impersonality makes them controllable by anyone who can seize their higher offices. Also, because the apparatus itself generates the power, bureaucrats have incentive to serve that apparatus and preserve it even when it veers from its original intended function. This creates a strong tendency within bureaucracies to entrench themselves regardless of who directs them.

The way algorithms are implemented can mimic these bureaucratic tendencies. Google’s search algorithm, for example, appears to have a clear, limited purpose — to return the most relevant search results and most lucrative ads — and operates within a growing but defined space. As the company’s engineers come and go, ascend through the company hierarchy or leave it entirely, the algorithm itself persists and evolves. The intent of the algorithm was once to organize the world’s information, but as it has become a commonplace way of finding information, information has been reshaped in the algorithm’s image, as is most obvious with search-engine optimization. This effectively entrenches the algorithm at the expense of the world’s diversity of information.

Both bureaucracies and algorithms are ostensibly committed to transparency but become progressively more obscure in the name of guarding their functionality. That is, the systematicity of both makes them susceptible to being “gamed”; Google and Facebook justify the secrecy of their sorting algorithms as necessary to thwarting subversive actors. Weber notes that bureaucracies too tend to become increasingly complex over time while simultaneously becoming increasingly opaque. Each trend makes the other more intractable. “Once fully established, bureaucracy is among those social structures which are hardest to destroy,” Weber warns. In bureaucracies, over time, only those “in the know” can effectively navigate the encrusted processes to their own benefit. “The superiority of the professional insider every bureaucracy seeks further to increase through the means of keeping secret its knowledge and intentions,” he writes. “Bureaucratic administration always tends to exclude the public, to hide its knowledge and action from criticism as well as it can.” This makes bureaucracies appear impervious to outside criticism and amendment.

But as O’Neil argues about algorithms, “You don’t need to understand all the details of a system to know that it has failed.” The problem with both algorithms and bureaucracies is that they try to set themselves up to be failure-proof. Bad algorithms and bureaucracies have a built-in defense mechanism in their incomprehensible structure. Engineers are often the only people who can understand or even see the code; career bureaucrats are the only people who understand the inner workings of the system. Since no one else can identify the specific reasons for problems, any failure can be interpreted as a sign that the system needs to be given more power to produce better outcomes. What constitutes a better outcome remains in the control of those implementing the algorithms, and is defined in terms of what the algorithms can process.

As Weber wrote, “The consequences of bureaucracy depend upon the direction which the powers using the apparatus give it. Very frequently a crypto-plutocratic distribution of power has been the result.” Likewise with algorithms: If a company’s algorithm increases its bottom line, for example, its social ramifications may become irrelevant externalities. If a recidivism model’s goal is to lower crime, the fairness or appropriateness of the prison sentences it produces don’t matter as long as the crime rate declines. If a social media platform’s goal is to maximize “engagement,” then it can be considered successful regardless of the veracity of the news stories or intensity of the harassment that takes place there, so long as users continue clicking and commenting.

Though automated systems purport to avert discrimination, Pasquale writes, “software engineers construct the datasets mined by scoring systems; they define the parameters of data-mining analyses; they create the clusters, links, and decision trees applied; they generate the predictive models applied. Human biases and values are embedded into each and every step of development. Computerization may simply drive discrimination upstream.” O’Neil offers a similar argument: “Models are constructed not just from data but from choices we make about which data to pay attention to — and which to leave out. Those choices are not just about logistics, profits, and efficiency. They are fundamentally moral. If we back away from them and treat mathematical models as a neutral and inevitable force, like the weather or the tides, we abdicate our responsibility.”

For bad algorithms and bureaucracies, any failure can be interpreted as a sign that the system needs more power to produce better outcomes

Far from an unintended consequence, however, that abdication becomes the whole point, even if algorithms and bureaucracies are frequently born with benevolent aims in mind. For the proprietors of these algorithms, this abdication is translated into a fervor for objective purity, as if neutrality in and of itself is always an indisputable aim. The intent of algorithms is presented as always self-evident (be neutral and thus fair) rather than a matter of negotiation and implementation. The means and ends become disconnected; objectivity becomes a front, a way of certifying outcomes regardless of whether or not they constitute social improvements. Thus the focus on combatting human bias leads directly to means for cloaking and dissipating human responsibility, merely making human bias harder to detect. Efforts to be more fair end up being a temptation or justification for opacity, greasing the tracks for an uneven allocation of rewards and penalties, exacerbating existing inequalities at every turn.

In On Violence, Hannah Arendt characterizes bureaucracy as “the rule of an intricate system of bureaus in which no men, neither one nor the best, neither the few nor the many, can be held responsible, and which could be properly called rule by Nobody.” Left unchecked, bureaucracy enables an unwitting conspiracy to carry out deeds that no individual would endorse but in which all are ultimately complicit. Corporations can pursue profit without consideration for effects on the environment or human lives. Violence becomes easier at the state level. And anti-state violence, without specific targets to aim for, shifts from strategic, logical action to incomprehensible, more terroristic expressions of rage. “The greater the bureaucratization of public life, the greater will be the attraction of violence,” Arendt argues. “In a fully developed bureaucracy there is nobody left with whom one could argue, to whom one could present grievances, on whom the pressures of power could be exerted.” It would, of course, be difficult to “attack” an algorithm, to make it feel shame or guilt, to persuade it that it is wrong.


In a capitalist society, the desire to remove human biases from decision-making processes is part of the overarching pursuit of efficiency and optimization, the rationalization Weber described as an “iron cage.” Algorithms may be sold as reducing bias, but their chief aim is to afford profit, power, and control. Fairness is the alibi for the way algorithmic systems reduce human subjects to only the attributes expressible as data, which makes us easier to monitor, manipulate, sell to, and exploit. They transfer risk from their operators to those caught up within their gears. So even when algorithms are working well, they are not working at all for us.

It’s obvious that algorithms with inaccurate data can be harmful to someone trying to get a job, a loan, or an apartment, and Pasquale and O’Neil trace out the many ramifications of this. Even if you can figure out when data brokers have inaccurate data about you, it is very difficult to get them to change it, and by the time they do, the bad data may have been passed along to countless different brokers, cascading exponentially through an interlocking system of algorithmic governance. Many algorithmic systems also use questionable proxies in place of traits that are impossible to quantify or illegal to track or sort by. Some, for instance, use ZIP codes as a proxy for race.

As with bureaucracies, algorithms purport to gain fairness by measuring only what can be measured fairly, leaving out anything prone to judgment calls, but in actuality this leaves a lot of leeway for those who have inside information or connections that can help them navigate the byzantine processes, and massage their data.

More precise and accurate data can’t fix a bad system. Even though the data may be accurate, the systems may lack the proper context for that data that situates its systemic implications. Pasquale summarizes how this occurs in lending: “Subtle but persistent racism, arising out of implicit bias or other factors, may have influenced past terms of credit, and it’s much harder to keep up on a loan at 15 percent interest than one at five percent. Late payments will be more likely, and then will be fed into present credit scoring models as neutral, objective, non-racial indicia of reliability and creditworthiness.”

Often these systems create feedback loops that worsen what they purport to measure objectively. Consider a credit rating that factors in your ZIP code. If your neighbors are bad about paying their bills, your score will go down. Your interest rates go up, making it harder to pay back loans and increasing the likelihood that you miss a payment or default. That lowers your score further, along with those of your neighbors. And so on. The algorithm is prescriptive, though the banks issuing loans view it as merely predictive.
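
To make the feedback loop concrete, here is a minimal, purely illustrative sketch in Python of how a ZIP-code-based score can spiral downward. The scoring formula, thresholds, and numbers are all invented for this example and do not reflect any real credit model.

```python
# Illustrative only: a toy model of a ZIP-code-driven credit feedback loop.
# All numbers and formulas here are invented; no real scoring model is this simple.

def update_scores(scores, missed_payment_prob):
    """One round: a lower neighborhood average raises everyone's interest rate,
    which makes missed payments costlier, which lowers every score in the ZIP code."""
    neighborhood_avg = sum(scores) / len(scores)
    # Rates climb as the neighborhood average falls below a nominal 700 benchmark.
    rate = 0.05 + max(0.0, 700 - neighborhood_avg) * 0.0005
    penalty = missed_payment_prob * rate * 400
    return [score - penalty for score in scores]

scores = [680, 650, 700, 620]  # hypothetical neighbors sharing a ZIP code
for year in range(5):
    scores = update_scores(scores, missed_payment_prob=0.2)
    print(year, [round(s) for s in scores])
# Scores drift down together: the model's prediction helps cause the outcome it predicts.
```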

No matter how much good data you have, there will always be additional context, in the form of additional data, that could improve it. There is no threshold that confers objectivity, no point at which results are no longer subject to interpretation. Algorithms can never have “enough.”

The need to optimize yourself for a network of opaque algorithms induces a sort of existential torture. In The Utopia of Rules: On Technology, Stupidity, and the Secret Joys of Bureaucracy, anthropologist David Graeber suggests a fundamental law of power dynamics: “Those on the bottom of the heap have to spend a great deal of imaginative energy trying to understand the social dynamics that surround them — including having to imagine the perspectives of those on top — while the latter can wander about largely oblivious to much of what is going on around them. That is, the powerless not only end up doing most of the actual, physical labor required to keep society running, they also do most of the interpretive labor as well.” This dynamic, Graeber argues, is built into all bureaucratic structures. He describes bureaucracies as “ways of organizing stupidity” — that is, of managing and reproducing these “extremely unequal structures of imagination” in which the powerful can disregard the perspectives of those beneath them in various social and economic hierarchies. Employees need to anticipate the needs of bosses; bosses need not reciprocate. People of color are forced to learn to accommodate and anticipate the ignorance and hostility of white people. Women need to be acutely aware of men’s intentions and feelings. And so on. Even benevolent-seeming bureaucracies, in Graeber’s view, have the effect of reinforcing “the highly schematized, minimal, blinkered perspectives typical of the powerful” and their privileges of ignorance and indifference toward those positioned as below them.

Fairness is the alibi for reducing human subjects to attributes only expressible as data, which makes us easier to exploit. Algorithms transfer risk from their operators to those caught up within their gears

This helps explain why bureaucrats and software engineers have little incentive to understand the people governed by their systems, while the governed must expend precious intellectual capital trying to reverse-engineer these systems to survive within them. It’s a losing battle, of course: Navigating the world effectively may require more and more awareness and interpretation of algorithmic systems, but in many cases the more we know, the more likely our knowledge is to become obsolete. The institutions that run these systems tend to treat our reverse-engineering them as inappropriately learning how to game them, and they can change them unilaterally. As Goodhart’s law states, when a measure becomes a target, it ceases to be a useful measure. The moment that more than a few people understand how an algorithm works, its engineers will modify it, lest it lose its power.

So we must simultaneously understand how these systems work in a general sense and behave the way they want us to, but also stop short of any behavior that could be seen as gaming them. We know our actions are recorded, but not necessarily by whom. We know we are judged, but not how. Our lives and opportunities are altered accordingly but invisibly. We are forced to figure out not only how to adapt to the best of our abilities but what it is that even happened to us.

Unfortunately, there’s not much an individual can do. It’s undeniable that individuals have been harmed by algorithms, yet it is nearly impossible for any of those victims to prove it on an individual basis and demonstrate legal standing. O’Neil and Pasquale both note that the problems with algorithms are too extensive for any silver-bullet solution, offering instead a laundry list of approaches drawing from precedents in U.S. policy (e.g. the Fair Credit Reporting Act and the Health Insurance Portability and Accountability Act) and European legal codes. But regulatory means of reining in algorithms — even assuming the significant hurdles of regulatory capture (the government’s understanding of these instruments is informed mostly by their beneficiaries) could be surmounted — would still require labyrinthine bureaucracies to implement them. If the problem with algorithms lies in how they mimic the ways bureaucracies function, trying to fix them with different bureaucracies merely reproduces the situation.

Algorithms are probably not going anywhere. Technology and bureaucracy both tend toward expansion as they mature. But while getting rid of algorithms seems unlikely, they can be modified toward greater social utility. This would require evaluating them not in terms of how objective they seem, but on ethical, unapologetically subjective grounds. O’Neil argues that algorithms should be judged by the ethical orientation their programmers and users give to them. “Mathematical models can sift through data to locate people who are likely to face great challenges, whether from crime, poverty, or education,” she writes. “It’s up to society whether to use that intelligence to reject and punish them — or to reach out and help them with resources they need.” O’Neil writes of even more promising applications, like an algorithm that scans troves of data for signs of forced labor in international supply chains and another that identifies children at greatest risk for abuse. Crucially, they rely on humans at both ends of the process to make key decisions.

In this paradigm, the problem with “customized” rankings is not their lack of universality but the fact they could be even more customized to suit specific users’ goals. If a platform wishes to be truly neutral, its algorithms must be amenable to the unique objectives of each user. Pasquale suggests that when Google or Yelp or Siri makes a restaurant recommendation, a user could decide whether and how heavily to take into account not just the type of food and the distance to get there, but whether the company provides its workers with health benefits or maternity leave.
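
As a rough sketch of what such user-tunable ranking could look like, the toy Python example below ranks hypothetical restaurants by weights each user sets for themselves. The data fields, weight names, and numbers are all assumptions made for illustration, not any platform’s actual API.

```python
# Hypothetical example of user-weighted ranking; the data and weight names are invented.

restaurants = [
    {"name": "Cafe A", "rating": 4.5, "distance_km": 1.2, "health_benefits": True},
    {"name": "Cafe B", "rating": 4.8, "distance_km": 0.4, "health_benefits": False},
    {"name": "Cafe C", "rating": 4.2, "distance_km": 2.5, "health_benefits": True},
]

def rank(places, weights):
    """Score each place by the user's own priorities instead of one fixed formula."""
    def score(p):
        return (weights["rating"] * p["rating"]
                - weights["distance"] * p["distance_km"]
                + weights["labor"] * (1.0 if p["health_benefits"] else 0.0))
    return sorted(places, key=score, reverse=True)

# A user who cares most about labor practices weights that criterion heavily.
for place in rank(restaurants, {"rating": 1.0, "distance": 0.5, "labor": 3.0}):
    print(place["name"])
```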

Opaque algorithms that rely on Big Data create issues that are commonly brushed aside as collateral damage when they are recognized at all. But those issues are avoidable. By acknowledging and accepting the human bias endemic to these systems, those same forces could be repurposed for good. We need not be trapped in their iron cages.

23 Feb 06:25

Ohrn Image — Donair Dude With Bicycle

by Ken Ohrn

Davie St near Bute.


23 Feb 06:24

1940, Housing, And Why this Matters

by Sandy James Planner

0329 heritagedemo.jpg

As Chris Brown reports on the CBC, there has been a major brouhaha regarding the City of Vancouver’s 12,000 homes that were built before 1940. In a city where almost a thousand demolition permits were taken out in 2016 (the majority in Dunbar-Southlands), the past is getting, well, lost. Of the houses demolished, two-thirds were built before 1940.

In response, the City has created a “Character Home zoning review” proposing to discourage the demolition of this older housing stock by permitting only sizably smaller replacement houses. This has not gone over well: many homeowners, developers, pro-density groups and even key heritage advocates are pushing back hard against the “preservationist” plan now under discussion.

Arguments against the designation include stifling architectural design and freezing much-needed locations for townhouses and family-focused higher density. The City of Vancouver’s Director of Planning Gil Kelley notes, “The younger generation is feeling squeezed. So opening up new options for affordability and different living option choices for them is really critical — even as people here who are older are trying to hang on to what they already know.”

There have been some issues raised regarding the character home designation: how will property owners be compensated for reduced returns on the property? And if a character home is deemed to be beyond rebuilding (and there will need to be guidelines to define that), can those single-family lots be filled with more family-friendly and affordable higher-density housing forms? And in the end, can we create a new way of looking at density in this Character Home zoning review, one that can move the large single-family areas of the city into something denser and more attainable for newly formed families? Our future depends on that.

heritage


23 Feb 06:24

The way we work doesn’t work, anymore

by Stowe Boyd

Lessons from the Gallup State of the American Workplace 2017

Continue reading on Work Futures »

23 Feb 06:24

"Mr. Trump’s candidacy is a message from the voters. He is the empty gin bottle they have chosen to..."

“Mr. Trump’s candidacy is a message from the voters. He is the empty gin bottle they have chosen to toss through the window.”

- David Gelernter, Trump and the Emasculated Voter
23 Feb 02:25

Tier 3 failure shows bike-rental ain’t that easy

by John Artman

Chinese tech media is aflutter this morning after news last night that Kala, an O2O bike-rental company, went out of business after only 19 days of operation. In those 19 days, the company, in cooperation with the government of Putian (a Tier 4 city in Fujian province), was only able to recover 157 of the 667 bikes it had put around the city for use. Now, the company says that, due to an agreement with its investor, it is not able to refund any outstanding deposits.

Founded in October 2016, Kala (卡拉单车) planned to operate in Tier 3 and Tier 4 cities (in Chinese) to avoid head-on competition with bike-rental leaders Mobike and Ofo. However, it took the company two months to find investment, after being turned down by 30 different investors.

Once it secured funding, Kala planned to expand with 5,000 bikes to other Tier 3 and Tier 4 cities, but after the startling loss of its only real asset, its investor (not disclosed) invoked a clause in their value adjustment mechanism (VAM) agreement. This allowed the investor to take all deposits as recompense for an initial over-valuation of the company.

According to the company, it has returned as many deposits as it can, but it has already run out of money from its initial bank loan.

The news that an investor was able to walk away with all the deposits has raised many eyebrows (in Chinese) around the country. Both Mobike and Ofo have stated many times that all the deposits are kept in separate accounts from their operating budget; users can conceivably get their deposit back whenever they choose.

This spectacular failure, however, does not seem to have dampened investors’ enthusiasm for O2O bike rental. Mobike announced yesterday that it has received post-Series D funding from Temasek, a Singapore-based investment company. This puts the total amount of money Mobike has raised this year at over US$300 million, according to the company.

23 Feb 01:49

Source: Getty Images Top Hat Raises $22.5 Million to Go After Pearson, McGraw-Hill

files/images/Top_Hat.jpg


Gerrit De Vynck, Bloomberg, Feb 23, 2017


Bloomberg offers coverage of this Canadian company, which has set up an online bookstore for textbooks. Many of the offerings come from OpenStax (Rice University's former Connexions service) and are offered for free, while the rest appear to be authored using Top Hat's own authoring tool and sell for various prices. Presumably the company has something else going for it, or they're just a really swell bunch of guys, to account for $40 million in venture capital funding. Top Hat CEO Mike Silagadze "started by selling software tools to professors that help them engage their students, such as smartphone apps that let them tell lecturers if they understand new concepts in real-time."

[Link] [Comment]
23 Feb 01:49

Trust Score (Beta) for Ethereum

files/images/joker.JPG


Nantium OÜ, Decentralize Today, Feb 23, 2017


I agree with Nantium OÜ that "decentralization will lead to a more fair society where monopolies lose their stranglehold over some of our key economic sectors (and possibly even government sectors)." I'm less convinced that trust is a key part of this, but I'm willing to listen. In any case, what Nantium has done is create a (beta) trust mechanism for Ethereum. Basically, it uses the same mechanism for trust as it does for payment: "you can file a complaint through an Ethereum contract that will ultimately penalize the other party's score." This mechanism has already been suggested for credentials, such as academic achievement or badges. I'm more inclined to think that trust (and achievement) will be derived by AIs mining publicly accessible data. But we'll see.
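
As a rough, purely conceptual sketch of the complaint-based mechanism described above (not Nantium OÜ's actual contract, and written in Python rather than as an on-chain Solidity contract), the toy model below keeps a per-address score that a filed complaint permanently lowers. The class name, starting score, and penalty size are all assumptions made for illustration.

```python
# Toy, off-chain model of a complaint-driven trust score; not the real Ethereum contract.
# The class name, starting score, and penalty size are invented for illustration.

class TrustLedger:
    def __init__(self, starting_score=100):
        self.scores = {}  # address -> current trust score
        self.starting_score = starting_score

    def get_score(self, address):
        return self.scores.setdefault(address, self.starting_score)

    def file_complaint(self, complainant, accused, penalty=5):
        """A complaint from one party permanently lowers the other party's score."""
        self.get_score(complainant)  # register the complainant if unseen
        self.scores[accused] = self.get_score(accused) - penalty
        return self.scores[accused]

ledger = TrustLedger()
print(ledger.file_complaint("0xAlice", "0xBob"))   # 95
print(ledger.file_complaint("0xCarol", "0xBob"))   # 90
```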

[Link] [Comment]
23 Feb 01:44

4chan: The Skeleton Key to the Rise of Trump

files/images/pepe.jpeg


Dale Beran, Medium, Feb 24, 2017


Celebrating the fail is the new win. This is the core value being embraced by 4chan members, alt-right supporters and Trump voters. That's the thesis of this insightful and well-argued essay by almost-loser Dale Beran; it's a long but engaging read. Those who hold to the (often empty) promise higher education offers should consider this perspective. It forms part of the narrative of failure that defines a substantial body of young men, the same men who constitute things like Anonymous and Gamergate. I am not sympathetic with the 4chan perspective, but I can understand it, having lived through the same broken promises, the same periods of extended unemployment, the same challenges and the same frustrations. But instead of embracing failure I embraced diversity and equality, and found myself a cause to fight for.

[Link] [Comment]
22 Feb 18:20

Leaked Samsung Galaxy S8 Photos Show On-Screen Navigation Buttons and Always On Display

by Rajesh Pandey
The Samsung Galaxy S8 has leaked multiple times in recent weeks, but so far, the handset’s on-screen navigation bar has been camera shy. That changes today, with the leaked images giving us a clear look at the Galaxy S8 from all angles. Continue reading →
20 Feb 15:48

Sharing the Domains Documentation Love

by Reverend

I wanted to make sure Adam Croom gets the appropriate love for helping several schools get up and running with Domains documentation over the last several months. More than a few folks have inquired if we had any recommendations for solid documentation, and it is hard to compete with the new and improved OU Create docs.* Once upon a time, before the Smallest Federated Wiki whisked Mike Caulfield away, we had an idea for federated documentation using DokuWiki, but that never came to be. There was some early momentum to prevent folks from reproducing the documentation wheel—but time, energy, focus, and squirrels got in the way.  

More recently a few schools asked us if they could reproduce Oklahoma’s documentation and then customize it for their school, and it turned out that Adam had already done this for Middlebury’s Middcreate. So, he was kind enough to not only help out a few schools by porting over OU Create’s documentation site whole hog, but even wrote up a tutorial on how to do it.

Re-visiting how we re-use and remix documentation resources across schools running Domain of One’s Own will most definitely be on the Domains 17 conference agenda; we need to make this much less arduous. Thanks again Adam, you rule.


*David Morgen did some amazing work with Emory’s documentation early on, and that was the inspiration for trying to figure out how to easily share work between schools that wanted to share and re-use Domains documentation.

20 Feb 15:48

Good Discussions At Bad Times

by Richard Millington

Don’t go off-topic too early.

Off-topic discussions can be useful to build strong levels of self-disclosure between members. This facilitates relationships which keep people participating and establishes a strong group identity.

But don’t do this too early.

A reader recently emailed to ask why the off-topic discussions within his Nextdoor group were falling flat. The discussions were standard, but the group was only a few weeks old. The few previous discussions had related to civic issues (local litter, town planning, etc.).

Jumping from a discussion about an upcoming development to ‘what are you going to do this weekend?’ will always sound forced. Don’t do it.

Begin with topic-related questions first – ideally the most important topics. Give members time to become familiar with each other via discussing the topic. Wait for members to go off-topic first before you initiate and encourage similar discussions. Then gradually increase the number.

20 Feb 15:48

Anyone in the United States can now order Snapchat Spectacles online

by Ian Hardy

Late last year, Snap stated it planned “to significantly broaden the distribution” of Spectacles in 2017.

Today, the company came one step closer to Canadians being able to purchase the sought-after smart glasses. While not officially available to those located north of the U.S. border, Snap has opened up online orders for Spectacles across the United States. So, if you’re interested in wearing a pair of these gems, now is the time to reconnect with a friend to the south.

The cost is $129 USD (including a charging case and cable), and the wearable manufacturer says delivery is set for two to four weeks. In addition, Snap states there is a limit of six Spectacles per household.

This is the closest Spectacles has come to Canada; the company has been placing its vending machines at various locations across the United States and recently opened a full store in New York City.

spectacles

Spectacles’ main feature is a “circular video format” that captures 10-second clips with a 115-degree field of view and plays them back full screen — in portrait or landscape — on a smartphone display.

Source: Snap

The post Anyone in the United States can now order Snapchat Spectacles online appeared first on MobileSyrup.

20 Feb 07:14

A few weeks with the ZTE Axon 7 Mini

by John

I’ve been using the latest handset from ZTE, the Axon 7 Mini for a few weeks now. It’s a mid-tier Android handset with a comfortable aluminum shell, a great 16 megapixel camera and true stereo surround sound from dual front facing speakers powered by Dolby Atmos.

This is the first handset from ZTE I’ve used; I had only limited awareness of the company through a few friends who have its devices. Apparently ZTE is the #4 handset maker in North America and has been quietly making inroads in Canada.

The Axon 7 Mini is the little brother to the company’s flagship Axon 7. It sports a 5.2″ Super AMOLED screen with Gorilla Glass 4, 3GB of RAM and 32GB of storage, and its SIM card tray includes a microSD slot for up to 128GB of additional memory.

The finish of the Axon 7 Mini is not unlike a MacBook’s: slightly grippy brushed aluminum that feels great in the hand, with no sharp edges. Unlike many other Android handsets in this class, the Mini doesn’t feel cheap or plasticky; it feels like a premium device from one of the other manufacturers. One nice thing ZTE saw fit to include in the box is a clear protective TPU case; the thinking is that the device may not get the same range of case options in stores, and it’s definitely a welcome addition. It adds very little bulk to the handset, and a small lip around the front of the screen seems like it would help in case of a drop.

The Mini has an 8MP front-facing camera and a 16MP rear camera with an f/1.9 aperture and dual LED flash. ZTE ships its own camera app in place of the default Android app, and it’s pretty decent, with lots of manual options and auto modes. It adds some modes you’d normally need a separate app for, including a great time-lapse mode, a super night mode for ultra-low-light images, and a Magic Exposure mode that lets you capture long exposures like star trails, water flow and car trails.

The super night mode was particularly impressive but you’ll need a tripod or something to hold the device rock-steady.

Below the camera on the rear is a fingerprint scanner that works lightning fast to unlock the device.

The front facing camera has the ability to take selfies via the onscreen shutter button or by tapping the fingerprint scanner.

Here are some images I took with the Axon 7 Mini’s 16MP camera (click through for full size):

The Mini has a USB-C charging port that supports quick charging (version 2.0), taking the battery to 50% in about 30 minutes and to a full charge in about 90. Included in the box are a USB-C to USB 3.0 cable and a quick-charge USB power block.

Another thing ZTE offers is the Axon Passport program. Owners get a two-year warranty included with their phone that covers free shipping both ways for any issues and unlimited out-of-warranty repairs. I can’t think of another company that includes a protection plan this inclusive, or this generous, with the handset.

I’ve been very impressed with the Mini’s performance and especially its camera. It offers a lot of great features in a clean Android (6.0) experience. I didn’t notice any lag while navigating around the Mini, and all the apps I threw at it worked as expected. It’s always a nice surprise to see above-average performance on a mid-range device.

The Axon 7 Mini is available now from Virgin Mobile in Canada for $0 on a two-year term or $399 outright. Hopefully it will become available from more carriers, or directly and unlocked to consumers, like its bigger brother, the Axon 7.

The post A few weeks with the ZTE Axon 7 Mini appeared first on johnbiehler.com.