Shared posts

09 Aug 23:56

The Problem With Tracking Just One Metric

by Richard Millington

I once hired a copywriter to increase a client’s newcomer-to-registered-member conversion rate. It was an (expensive) test. The copy he produced was stunning. Big promises, free resources, every psychological and inbound marketing technique in existence was incorporated.

The conversion rate would have shot up, but the participation rate would have plummeted. All the free offers in the world wouldn’t get people to participate.

This is the problem with measuring (or being measured by) a single metric. You’re motivated to subvert all the resources available to you to make that metric go up. But these are complex systems we’re working with. Changes in one area affect other areas in unpredictable ways.

If you’re only being measured by registered members, hiring a copywriter to write this sort of copy, even though it decreases the total level of activity, is the logical course of action.

If you’re only being measured by the total number of posts, it makes sense to say controversial stuff, start fights, initiate wordplay games, and share gossip.

If you’re only being measured by value, it makes sense to spam members with discounts until they buy.

Perhaps we’re all smart enough to stay away from the extremes above, but the logic is still there. In any situation where there is a choice, the choice will favour what’s being measured.

Communities aren’t unique here. If a CEO is measured by the increase in share price each year, they’ll subvert every resource available to them to make that price go up each year (with predictable consequences).

This works for every metric you can think of. Imagine your metric is monthly active users. It makes sense to spend most of your time making as many unique contacts with as many members as possible to encourage them to visit once. You can build up entire volunteer teams to help you.

The problem is these things work on a curve. Eventually sustained growth requires more than just a numbers game of contacts / members or big promises to newcomers. It requires word of mouth, depth of discussion, taking time to build a strong sense of community etc…

Part of the problem is simplicity. A single metric keeps everything simple for you and your boss. But it’s also just as likely to keep your community from reaching its potential and lead to its own demise.

If you’re managing a community team (or just managing your own community), have the discussion with your boss, colleagues, or even just yourself and set at least four targets.

We’ve found these work well:

1) A growth-related target. This is a target related to the number of new members each month. Use any proxy figure you like (unique new visitors, newly registered members, first-time participants etc…).

2) An activity-related target. This is the total number of posts, posts per active members, average time on site, page views per member or anything that indicates whether members are actively participating.

3) A sentiment-related target. This is a target related to the sense of community, how members feel about the community, or general satisfaction with the community. Polls and surveys work well here.

4) A value-based target. This is a target related to the value the organisation gets from the community. Call deflection, increase in retention rate, time saved etc, any return-based metric.

Setting four targets prevents anyone from subverting the other three to achieve a single aim.

If you prefer to track a single target, feel free; just insist it must be achieved without any major declines in any of the other three.
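That “no major declines” rule is easy to make mechanical. Here is a minimal sketch; the metric names and the 10% decline threshold are illustrative assumptions, not prescriptions from the article:

```python
def primary_target_met(current, baseline, primary, target, max_decline=0.10):
    """Check a primary metric against its target, while guarding the rest.

    current/baseline map metric names (e.g. growth, activity, sentiment,
    value) to numbers.  The primary metric must reach its target, and no
    other tracked metric may fall more than max_decline below baseline.
    """
    if current[primary] < target:
        return False
    return all(current[k] >= (1 - max_decline) * baseline[k]
               for k in current if k != primary)
```

For example, a growth target counts as met only if activity, sentiment, and value each stay within 10% of their baselines; a copywriter who doubles registrations while halving participation fails the check.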

09 Aug 23:56

Message to self

by Volker Weber


I don't keep a todo list. More precisely, my todo list lives inside my brain. There is no paper, there is no app. But there are things that are easily forgotten, like YouTube URLs and things I posted and have since deleted.

To keep track of these things, I send a message to myself. Not an email, but just a message on Slack. This space also has bots that collect stuff from all over the net for me. I can look at this stream from anywhere, from every device I have or any other connected screen.

There you have it. This is how I keep the unimportant stuff out of my brain.

09 Aug 23:55

This German woman in India had the perfect lesson for her 14-yr-old harasser

files/images/german-woman-main.jpg


The Indian Express, Aug 12, 2016


This is a great story that has already gone viral in India. Ulrike Reinhard is like the Euler of education, a wanderer, explorer and connector. I've been following her for many years. She is currently developing a school in India called Janwaar Castle based on skateboarding and community participation. India, as has often been observed, has a problem with male dominance, and one day about two weeks ago a young man simply reached out and started touching her "because I am a man." So she took him to the police. End of story, right? No. Reinhard gave the boy an option: instead of being charged, he could help develop a learning program on "how to treat women right." There was an intense meeting with the family. Eventually they were won over. The workshop is still in planning, but she has been inundated by well-wishers from across India. Teaching through example is always powerful, and I've learned a lot here.

[Link] [Comment]
09 Aug 23:55

Sense HAT emulator

by David Honess

Over the last few months, we’ve been working with US-based startup Trinket to develop a web-based emulator for the Sense HAT, the multipurpose add-on board for the Raspberry Pi which is also the core component of the Astro Pi units on the International Space Station. We wanted to provide a unique, free learning resource that brings the excitement of programming our space-qualified hardware to students, teachers, and others all over the world.

We’re delighted to announce its release today, and you can try it for yourself right now. Click the Run ▻ button below and see what happens!

trinket-logo

The emulator will allow more people to participate in future Astro Pi competitions – you’ll be able to join in without needing to own a Raspberry Pi computer or a Sense HAT.

British ESA Astronaut Tim Peake with an Astro Pi unit on the International Space Station

British ESA Astronaut Tim Peake with the Astro Pi. Image credit ESA

The new emulator builds on Trinket’s existing Python-in-browser platform, and provides the following features:

  • Virtual Sense HAT with environmental controls and joystick input
  • Full Python syntax highlighting
  • Contextual auto-complete
  • Intuitive error reporting and highlighting
  • Image upload
  • HTML page embedding
  • Social media integration
  • Project sharing via direct URL
  • Project download as zip (for moving to Raspberry Pi)
  • All major browsers supported

sense_hat_emu

The Sense HAT has temperature, pressure and humidity sensors, and can change its behaviour according to the values they report. The Sense HAT emulator has sliders you can move to change these values, so you can test how your code responds to environmental variables.
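The response logic itself is ordinary Python, so it runs unchanged against the emulator or real hardware. A small sketch of the kind of behaviour the sliders let you test (the function name, thresholds, and colours are illustrative, not part of the Sense HAT API):

```python
def reading_colour(temperature_c, humidity_pct):
    """Map environment readings to an RGB colour for the 8x8 LED matrix.

    Hypothetical thresholds: red when hot or humid, blue when cold,
    green otherwise.
    """
    if temperature_c >= 30.0 or humidity_pct >= 70.0:
        return (255, 0, 0)   # red: hot or humid
    if temperature_c <= 10.0:
        return (0, 0, 255)   # blue: cold
    return (0, 255, 0)       # green: comfortable
```

On a Raspberry Pi or in the emulator you would feed this from the Sense HAT API (e.g. `sense.get_temperature()` and `sense.get_humidity()`) and show the result with `sense.clear(colour)`; dragging the emulator's sliders changes what those calls return.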

Part of a screenshot of the Astro Pi emulator, showing three sliders with buttons that can be dragged to change the temperature, pressure and humidity that the virtual Sense HAT's sensors are reporting

You can move the sliders to change what the sensors are reporting

Code written in this emulator is directly portable to a physical Raspberry Pi with a Sense HAT without modification. This means any code you write can be run by the Astro Pi units on board the ISS! It is our hope that, within the next 12 months, code that has been written in the emulator will run in space. Look out for news on this, coming soon on the Astro Pi site!

We owe huge thanks to Trinket, who have been wonderful partners in this project. The development work has been completed in just over two months, and has been a huge collaborative effort from the start. The software relies heavily on open-source technology and a global community of developers who are committed to making the power of code more accessible to students.

A closed group of beta testers, made up of previous Astro Pi participants and Code Club champions, has been putting the emulator through its paces over recent weeks. We’re proud to say that we’ve just had a bug-free open beta over the weekend, and now we’re looking forward to seeing it used as widely as possible.

So, where do you start? If you’re new to the Sense HAT, you can just copy and paste a lot of the code examples from our educational resources like this one. You can also check out our e-book Sense HAT Essentials. For a complete list of all the functions you can use, have a look at the Sense HAT API reference here; please note that the IMU (movement-sensing) functions will be supported in a future update. Head over to the main Sense HAT emulator site to see loads of other cool examples of what’s possible. Flappy LED, anyone?

Don’t forget to share your projects!

The post Sense HAT emulator appeared first on Raspberry Pi.

09 Aug 23:51

Android security – Swiss cheese pt. III

by windsorr


Security is just another reason to close Android down. 

  • Another vulnerability has emerged that exposes over 900m Android devices to the possibility of granting root access to hackers, potentially making the device part of an illicit botnet and allowing the theft of all data on the device.
  • Checkpoint has released details of four vulnerabilities (Quadrooter) that affect Android devices powered by Qualcomm chipsets but the good news is that there is no evidence to date that the exploits have been used in the wild.
  • Checkpoint provided Qualcomm with the details of these vulnerabilities 90 days ago, and Qualcomm in turn released patches for its vulnerable drivers, but that is where the orderly process stops.
  • Qualcomm has made the patches available to any customer using its chipset, but whether the end devices themselves have been patched is a matter of great doubt.
  • In the normal scheme of things, a vulnerability is found, communicated to system owners who then create a patch and make it widely available.
  • In iOS and Windows, these updates are rapidly distributed to all users who then update their systems and within a few weeks the issue has been put to bed.
  • However, with Android this is not the case as there are two issues that prevent devices from being updated.
    • First: Most Android devices are not updatable.
    • Android is a commoditised, brutally competitive market meaning that in the mid-range every cent of cost matters.
    • Making a device updateable means that extra resources have to be added to the device which are never reflected in the price.
    • Consequently, the vast majority of Android devices are not updateable to later versions of Android as there is no incentive for the device maker to add this capability.
    • Second: Google has no control over the update process for any of the devices that run its services.
    • It can update Google Mobile Services (GMS) from Google Play but lower level system updates (Android) are controlled by either the maker of the device or the mobile operator.
    • The two exceptions are Xiaomi and Cyanogen both of whom have retained the ability to update devices running their software.
    • This is provided that the devices themselves are updateable as per the first issue above.
  • These issues are so acute that even Google will not have fixed all four of the vulnerabilities in its devices until the September update is distributed and installed.
  • I suspect that for many handset and tablet makers, this vulnerability will continue to exist for a long period of time.
  • While the Quadrooter has not caused any damage per se, this issue clearly demonstrates that, should a major hack occur, it will take the Android community many months to fix it, if it fixes it at all.
  • This issue combined with the endemic fragmentation of Android is a major reason why I think Android devices generate much less traffic than iOS devices and why Android users demonstrate much lower loyalty.
  • One only has to look at the defection of users from Android to iOS when the iPhone 6 was launched for evidence of how vulnerable Android is to market share loss.
  • This is bad news for Alphabet as RFM research finds that its long-term growth is dependent on traffic generated by Android devices meaning that these issues are hampering its growth potential.
  • I continue to believe that Alphabet will solve this problem by taking Android completely proprietary, effectively removing it from open source (see here).
  • This will solve all of these problems in one go but comes with the problem of convincing the community that going proprietary is in everyone’s best interest.
  • Here, Google will have the advantage of being able to point the finger at Oracle (see here) as the architect of the problems that have forced it to close Android down which I think will make things much easier for Alphabet.
  • I think that this will begin in earnest in 2017 probably at its developer conference (i/o) in May.
  • I continue to prefer Baidu, Microsoft and Samsung over Alphabet for the immediate term with Tencent and Facebook on my watch list for the right signals to enter.

09 Aug 23:50

Fast and Accurate Document Detection for Scanning

by Ying Xiong

A few weeks ago, Dropbox launched a set of new productivity tools including document scanning on iOS. This new feature allows users to scan documents with their smartphone camera and store those scans directly in their Dropbox. The feature automatically detects the document in the frame, extracts it from the background, fits it to a rectangular shape, removes shadows and adjusts the contrast, and finally saves it to a PDF file. For Dropbox Business users, we also run Optical Character Recognition (OCR) to recognize the text in the document for search and copy-pasting.

Beginning today, we will present a series of technical blog posts describing the computer vision and machine learning technologies that make Dropbox’s document scanning possible. In this post, we’ll focus on the first part of the pipeline: document detection.

Goal

The goal of document detection is to find the corners and edges of a document in the image, so that it can be cropped out from the background. Ideally, detection should happen in real time, so that the user can interactively move the camera to capture the best image possible. This requires the detector to run really fast (100ms per frame or less) on a tight CPU and memory budget.

Possible approaches

A common approach to solving problems like this is to train a deep neural network (DNN). DNNs are algorithms that take a large amount of labeled data and automatically learn to predict labels for new inputs. These have proved to be tremendously successful for a variety of computer vision applications, including image classification, image captioning, and face detection. However, DNNs are quite expensive, both in terms of computation time and memory usage. Therefore, they are usually difficult to deploy on mobile devices.

Another potential solution is to use Apple’s rectangle detection SDK, which provides an easy-to-use API that can identify rectangles in still images or video sequences in near-realtime. The algorithm works very well in simple scenes with a single prominent rectangle in a clean background, but is less accurate in more complicated scenes, such as capturing small receipts or business cards in cluttered backgrounds, which are essential use-cases for our scanning feature.

Our approach

We decided to develop a customized computer vision algorithm that relies on a series of well-studied fundamental components, rather than the “black box” of machine learning algorithms such as DNNs. The advantages of this approach are that it is easier to understand and debug, needs much less labeled training data, runs very fast and uses less memory at run time. It is also more accurate than Apple’s SDK for the kinds of usage scenarios we care about; in an A/B test evaluation, the detections found by our algorithm are 60% less likely to be manually corrected by users than those found by Apple’s API.

Our first observation is that documents are usually rectangular-shaped in physical space, and turn into convex quadrilaterals when projected onto 2D images. Therefore, our goal turns into finding the “best” quadrilateral from the image, and using that as our proxy for the document boundary. In order to find the quadrilateral, we need to find straight lines and their intersections. Finally, to find straight lines, we need to detect strong edges in the image. This gives us the outline of our detection algorithm, as shown below. We will discuss each component in more detail next.

Document detection pipeline
Document detection pipeline

Edge detection

Finding edges in an image is a classic problem in image processing and computer vision, with decades of history and early successes already in the ’80s. One of the best-known methods is the Canny edge detector, named after its inventor, John Canny. It dates back to 1986 but is still widely used today.
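At the heart of Canny-style detectors is a gradient-magnitude computation. A toy pure-Python version using 3x3 Sobel kernels (an illustration of the general technique, not Dropbox's implementation):

```python
def sobel_magnitude(img):
    """Gradient magnitude via 3x3 Sobel kernels.

    img is a 2D list of grayscale values; the output is the same size,
    with a one-pixel border left at zero.  Strong edges show up as
    large magnitudes.
    """
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # Horizontal and vertical Sobel responses.
            gx = (img[y-1][x+1] + 2*img[y][x+1] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y][x-1] - img[y+1][x-1])
            gy = (img[y+1][x-1] + 2*img[y+1][x] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y-1][x] - img[y-1][x+1])
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out
```

A full Canny detector adds Gaussian smoothing beforehand and non-maximum suppression plus hysteresis thresholding afterwards.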

We applied the Canny Detector to our input image, as shown below, but the results were not very promising. The main problem is that the sections of text inside the document are strongly amplified, whereas the document edges—what we’re interested in—show up very weakly.

original-small canny-200x200-small

Left: the input image. Right: the output of the Canny edge detector.

To overcome these shortcomings, we used a modern machine learning-based algorithm. The algorithm is trained on images where humans annotate the most significant edges and object boundaries. Given this labeled dataset, a machine learning model is trained to predict the probability of each pixel in an image belonging to an object boundary.

The result of this learning-based edge detector is shown below. It’s much better at focusing on the document edges that we care about.

original-small random-forest-200x200-small

Left: the input image. Right: the output of the machine learning-based edge detector.

Hough transform

Once we have an accurate edge map, we’d like to find straight lines in it. For this, we use the venerable Hough transform, a technique that lets individual data points “vote” for likely solutions to a set of equations. In our case, each detected edge pixel votes for all lines passing through that point; the hope is that by adding up the votes across all edge pixels, the true document boundaries will emerge with the most votes.

More formally, here’s how it works: The slope-intercept form of a line is y = mx + b. If we detect an edge pixel at a particular (x,y) point, we want to vote for all lines that pass through the point. This corresponds to all slopes m and intercepts b that satisfy the line equation for that point. So we set up a “Hough Space” with m and b axes. Here, a single point (m,b) corresponds to a line in the original image; conversely, a point in the original image space corresponds to a line in the Hough Space. (This is called a duality in mathematics.) For every edge pixel in the original image, we increment a count for all corresponding points in the Hough Space. Finally, we simply look for the points with most votes in the Hough Space, and convert those back into lines in the original space.

In the figure below, you can see the detected edge pixels on the left and the corresponding Hough Space in the middle. We’ve circled the points with the most votes in the Hough Space, and then converted them back into lines (overlaid onto the original image) on the right. Note that although we described the Hough Transform above in terms of the slope-intercept form of a line, in practice we use a polar parameterization, r=x·sinθ+y·cosθ, that is more robust and easier to work with.

random-forest-200x200-small hough-space original-space

Left: detected edges. Middle: the Hough Transform of the edges, with local maxima marked in red. Right: the lines corresponding to the local maxima overlaid onto the original image.
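The voting scheme described above can be sketched in a few lines of Python, using the polar form r = x·sinθ + y·cosθ from the text (a toy illustration; the bin sizes and angle resolution are arbitrary choices, not Dropbox's):

```python
import math
from collections import Counter

def hough_votes(edge_pixels, n_theta=180):
    """Accumulate Hough votes for a list of (x, y) edge pixels.

    Each pixel votes for every discretised line through it, where a line
    is parameterised as r = x*sin(theta) + y*cos(theta).  Bins are keyed
    by (rounded r, theta index); the document's boundary lines should
    emerge as the bins with the most votes.
    """
    votes = Counter()
    for x, y in edge_pixels:
        for t in range(n_theta):
            theta = math.pi * t / n_theta
            r = x * math.sin(theta) + y * math.cos(theta)
            votes[(round(r), t)] += 1   # quantise r to 1-pixel bins
    return votes
```

In practice you would then pick local maxima in the accumulator and convert each (r, θ) bin back into an image-space line.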

Computing intersections and scoring quadrilaterals

After finding straight lines, the rest of the work is relatively simple. We compute the intersections between the lines as potential document corners, with some simple geometric constraints. For example, intersections with very acute angles are unlikely to be document corners. We next iterate through potential document corners, and enumerate all possible quadrilaterals, each of which is scored by adding up the probability predicted by the edge detector over pixels along its perimeter. The quadrilateral with the highest score is output as the detected document.

intersection quad2-1 quad3-1

Left: intersections of detected lines are potential document corners, although the red ones are filtered out by using geometric constraints. Middle: one possible quadrilateral formed by the potential corners. Right: the quadrilateral with the highest score, which is the output of our algorithm.
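The intersection-with-angle-filter step can be sketched as follows (toy code; the 30-degree threshold is an illustrative assumption, not Dropbox's value):

```python
import math

def line_intersection(l1, l2, min_angle_deg=30.0):
    """Intersect two lines given in polar form (r, theta), where
    r = x*sin(theta) + y*cos(theta).

    Near-parallel pairs (very acute intersection angles) are rejected,
    since they are unlikely to be document corners.  Returns (x, y) or
    None if the pair is rejected.
    """
    (r1, t1), (r2, t2) = l1, l2
    diff = abs(t1 - t2) % math.pi
    angle = min(diff, math.pi - diff)       # angle between the lines
    if angle < math.radians(min_angle_deg):
        return None
    det = math.sin(t1 - t2)                 # determinant of the 2x2 system
    x = (r1 * math.cos(t2) - r2 * math.cos(t1)) / det
    y = (r2 * math.sin(t1) - r1 * math.sin(t2)) / det
    return (x, y)
```

For instance, the horizontal line y = 3 (r = 3, θ = 0) and the vertical line x = 5 (r = 5, θ = π/2) intersect at (5, 3), while two parallel lines are filtered out.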

Putting it all together

Finally, we show a video below demonstrating each step of the pipeline. The video is generated with a standalone iOS app we built to develop, visualize and debug our algorithm. The full pipeline runs near realtime at about 8–10 frames per second.

video

Visualization of all steps in the detection algorithm.

Try it out

Try out the Dropbox doc scanner today, and stay tuned for our next blog post, where we’ll describe how we turn the detected document outline into an enhanced rectangular image.

09 Aug 23:50

The boldness of security experts’ Trump critique

by Josh Bernoff

Yesterday, 50 former national security officials who had served in Republican administrations published a letter criticizing Donald Trump. This document is bullshit-free: it’s written in the first person with active voice, direct, declarative sentences and a limited number of qualifiers. For a political document, this is remarkable. An effective document is short, features a descriptive title, and explains its … Continue reading The boldness of security experts’ Trump critique →

The post The boldness of security experts’ Trump critique appeared first on without bullshit.

09 Aug 21:02

Ohrn Image — Public Art, Public Space

by Ken Ohrn

The Woodward’s atrium is an odd space.  Very big, flanked by a drug store and a grocery store, and surrounded by offices and parts of SFU, but not often busy. There are nifty things in it, such as the huge Gastown riots photo (50′ x 30′) by Stan Douglas, and a basketball hoop. But it just isn’t “sticky”.

And on this day, a lovely piano on wheels, apparently delivered by bike. The music echoes in a pleasing way for the few who are there.

Woodwards.Atrium

“Abbott & Cordova, 7 August 1971”, by Stan Douglas; with piano accompaniment

Thanks to the Tyee: Entitled “Abbott & Cordova, 7 August 1971,” the gigantic image by Vancouver artist Stan Douglas is a representation of a little known but crucial moment in Vancouver and the Downtown Eastside’s history. On that date, Vancouver police, in full riot gear, violently broke up a Smoke-In, a peaceful marijuana protest, inciting mayhem and destruction on Gastown’s streets. This event was the climax to heightened tensions between local government, hippies squatting in empty industrial buildings throughout Gastown and the predominantly blue-collar families that had populated the neighbourhood for over a century.


09 Aug 21:02

Twitter will jump on the Stories bandwagon when it opens up Moments to everyone

by Jessica Vomiero

Last week, Instagram shocked the social-verse by launching a Stories feature that almost exactly mimicked Snapchat.

Just one week later, Twitter has developed a game plan to keep up with its competition. According to The Next Web, Moments will eventually be opened up to all Twitter users, allowing them to create and add their own Moments, instead of just viewing the ones created by the company’s editorial team.

“To date, Moments have been created by our curation team and a select group of publishing partners but it’s always been our goal to open up this creative canvas to more people. Today, we’re excited to announce that a broader group of creators will be making Moments, including influencers, partners, brands… and in the coming months, everyone,” said the company in a blog post.

Twitter will reportedly make Moments available to a select few companies and influencers over the next few weeks, including civil rights activist DeRay McKesson and VaynerMedia.

“Moments allows us to collect and curate Tweets to tell stories in new ways. It’s a powerful Twitter feature, continuing to help build community as we engage specific topics and events,” said McKesson in a statement sent to MobileSyrup.

Moments curated by certain groups or companies can be found via a dedicated button on their profile, which links to the Moments they’ve created. Furthermore, Moments might be a play on ‘stories,’ but the two features aren’t exactly the same.

While both Instagram and Snapchat versions of Stories include short clips that flow consecutively into other clips, Moments allows users to linger on a tweet or video or even scroll back. The main similarity between the two features is the potential for narrative, or in effect, storytelling.

While Twitter users haven’t received an exact timeline for when they’ll get to curate their own Twitter Moments, Twitter has forecasted it will start rolling out the feature in the next few months.

09 Aug 21:01

Google Maps Gains ‘Wi-Fi Only’ Mode and Ability to Save Offline Data to SD Card

by Rajesh Pandey
Google today announced that it has officially started rolling out the “WiFi only” mode for Google Maps on Android. The feature was previously in limited beta testing by the company.  Continue reading →

09 Aug 21:01

The Suzuki Foundation supports bikes lanes on Bloor

by dandy

Gideon Suzuki Foundation

Story by Tammy Thorne originally published in our Summer 2016 issue.

Photo by Claire McFarlane

Gideon Forman, the climate change and transportation policy analyst at the David Suzuki Foundation says it didn’t take a lot of convincing to bring Canada’s best known environmental agency on board to battle for bike lanes on Bloor. One of the foundation’s priorities is getting people out of cars and on to bikes to reduce greenhouse gas emissions. We asked Forman why supporting bike lanes on Bloor was so important to the Suzuki foundation.

“The organization is completely committed to active transportation. So it didn’t take a lot of convincing. In my own [policy] work I wanted a mix of concrete projects along with more abstract work like carbon pricing [DSF has done a lot of work in Ontario on cap and trade] which is very nerdy and very important stuff but not quite as tangible. I wanted something you could show people, that you can see and feel. And the great thing about the bike lane is that it contributes to what I call “climate optimism.” The bike lane is a concrete display of action that you are making a difference – that there is something we can do right here in Toronto and we can do it right now. Bloor Street is our main street – not to be too downtown-centric – but in some ways Bloor is our main street. To win this [bike lane] on our main street in Toronto is crucial. Symbolically it’s important but it’s also important in the real world. We see this as a starting point for building out the whole network. So we thought it was important in terms of climate optimism and also that it would facilitate more cycling. We thought if we could get this bike lane on Toronto’s main street we could show that we could do it in any city.”

Screen Shot 2016-08-09 at 11.01.30 AM

Our new issue of dandyhorse has arrived! dandyhorse is available for FREE at Urbane Cyclist, Bikes on Wheels, Cycle Couture, Sweet Pete's, Hoopdriver, Batemans, Velofix, and Steamwhistle. Our new issue of dandyhorse includes cover art by Kent Monkman, interviews with Catherine McKenna and the women behind Toronto's first feminist bike zine, lots of news and views on Bloor - including this story above - and much, much more! Get dandy at your door or at better bike and book shops in Toronto.

Screen shot 2016-08-09 at 12.25.51 PM

Related on the dandyBLOG:
Point/Counterpoint: On-Street Parking
The First Week of the Bloor Bike Lane in Photos
Cycling in Toronto: A personal reflection


09 Aug 20:59

The Sensible Code Company is our new name

by Francis Irving

For a few years now, people have said “but you don’t just do scraping, and you’re not a wiki, why are you called that?”

We’re pleased to announce that we have finally renamed our company.

We’re now called The Sensible Code Company.

The Sensible Code Company logo

Or just Sensible Code, if you’re friends!

We design and sell products that turn messy information into valuable data.

As we announced a couple of weeks ago the ScraperWiki product is now called QuickCode.

Our other main product converts PDFs into spreadsheets, it’s called PDFTables. You can try it out for free.

We’re also working on a third product – more about that when the time is right.

You’ll see our company name change on social media, on websites and in email addresses over the next day or two.

It’s been great being ScraperWiki for the last 6 years. We’ve had an amazing time, and we hope you have too. We’re sticking to the same vision, to make it so that everyone can make full use of data easily.

We’re looking forward to working with you as The Sensible Code Company!

09 Aug 20:50

PDX vs YVR: Same symptoms, different sources

by pricetags

From CityLab:

PDX 1

The dream of the ’50s is alive in Portland

The question now is whether the idea of Portlandia can survive its own city. Between 2015 and 2016, the cost of buying a home in Portland jumped from $369,500 to $412,000—the fastest-rising housing market in the U.S.

“Yes, people still want the dream,” says Alyssa Isenstein Krueger, a broker with Living Room Realty and a member of the preservation group Stop Demolishing Portland. “They want it more than ever now, because there’s this huge fear that if they don’t buy now they’ll never buy.”

But the more people who want it, the fewer who are able to get it. One of Isenstein Krueger’s client families moved from Los Angeles to Portland for its bicycle-friendly way of life, but after they received a 90-day notice from their landlord, they turned into quick buyers. They wanted the same Portland lifestyle they were renting, on a $300,000 budget. They found it, eventually, 113 blocks east of downtown.

“It’s a much longer bike commute than what they’ve had,” she says. “But that was their compromise—we need to at least live within transit and bike lanes. They are finding their own new Portlandia.”

One obvious solution is to build more affordable multifamily housing in neighborhoods where people want to live, says Mary Kyle McCurdy of 1,000 Friends of Oregon, a nonprofit that advocates for sustainable neighborhoods. But Portland’s current zoning laws are stuck two generations in the past.

Almost half of Portland—45 percent—is zoned exclusively for single-family dwellings, she says, while only 10 percent is zoned for multi-family dwellings. It’s a stale reflection of the post-World War II world in which Portland’s zoning rules were drawn up.

“In the 1950s, two-thirds of our households were families. Today, two-thirds of our households [consist of] one and two people,” McCurdy says. “We’re aging and getting younger at both ends; we come from different backgrounds and cultures. We need to catch up our zoning with our families today and for the future.”

McCurdy is working with an organization called Portland For Everyone that advocates for changing some of Portland’s zoning laws to allow for more multi-family dwellings in single-family neighborhoods. If builders are allowed to build duplexes, triplexes, quads, courtyard apartments and more mother-in-law units in Portland’s most in-demand neighborhoods, she says, then families like the one from L.A. might not have to move 100 blocks east—as long as they’re willing to trade in their dream of a mid-century bungalow.

But simply allowing for more density won’t necessarily lead to more affordable housing, Isenstein Krueger says. In fact, Portland For Everyone will only lead to a Portland For Even Fewer as developers buy the homes families want and then raze them. Even if multi-family housing goes up in these neighborhoods, it won’t be priced so most people can afford it.

In her experience, it’s already happening. One couple she worked with recently bid $375,000 on a home that was listed for $320,000 in Portland’s Eastmoreland neighborhood. A developer paid $420,000 for the house, she says, and now has a permit to demolish it.

“This whole idea that anybody is going to build affordable housing to replace the demolished housing is a load of crock,” she says. “Nobody is going to build affordable housing out of the goodness of their heart. They have never done it and they never will.”

Signs of the city to come

… Not all of that growth is going to be close to downtown, Isenstein Krueger says; some will land in outer suburbs like Hillsboro and Beaverton. For Portland to truly be for everyone, she says, the city should prioritize making those outer corridors more livable, rather than change the face of neighborhoods that lack the infrastructure to take on any more people.

“One concern is, even if we are creating a sustainability utopia, it will only be for wealthy and largely white people.”

“Why do we have to destroy what we have, and what we’ve had for well over a century,” she says, “to make room for these mythical people that may or may not come?”

But signs of this future, denser Portland are already visible in some of the city’s most popular neighborhoods. Brendon Haggerty is on the board of the Richmond Neighborhood Association, which includes the popular tourist corridors Southeast Division Street and Hawthorne Boulevard. Sitting in his quiet backyard near Hawthorne, you can’t hear the tourist traffic over the sound of spotted chickens toddling nearby.

It is the dream of Portland at its most intense, but also one Haggerty realizes may soon evolve. In the new Comprehensive Plan, the lot two doors down from his house will be rezoned for townhomes. No one in the neighborhood, he says, should be afraid of change.

“They’re NIMBYs,” he says. “The dream of Portland is not compatible with an approach to land use that protects the privilege of incumbent property owners. … To me it’s unquestionably a social justice issue. Neighborhoods like this provide access to a lot of opportunity for healthful and prosperous lifestyles, that you can’t get in other neighborhoods, and we need to be making that available to as many people as possible.”

 

And here’s PDX’s version of our foreign purchasers:

The Californians are coming

… “Keep Portland Weird” is a tired aphorism by now, but the truth is Portland has always been something weird: a city that could interpose between West Coast giants without being touched by them. …

Oregon has been the top moving destination in the U.S. for three years in a row, according to United Van Lines. More telling, however, is the number of people (more than 30,000 in 2014 alone) sliding up from California, many for high-paying tech jobs. The trend is particularly hard to miss in Portland, where about 1,000 new people arrive every month.

 


09 Aug 20:43

iPad Pro Plays a Role in the Stranger Things Poster Art

by John Voorhees

Adario Strange of Mashable takes a look at how artist Kyle Lambert created the 80s-style poster art for Netflix’s original series Stranger Things using an iPad Pro, Apple Pencil, Procreate, a Mac, and Photoshop.

I used the iPad Pro to do the preliminary composition ideas and the sketch that became the final Stranger Things poster. I chose to use the iPad Pro for the drawing stage of the poster because I find that I am able to sketch in a very natural way on the device using the Apple Pencil. The device in general is nice to hold for long periods of time, it is really portable and Procreate, the app that I used, has some really great Pencil brushes for drawing with.

Lambert exported the image to Photoshop on a Mac to do detail work using a Wacom Intuos tablet, which involved editing hundreds of layers. In the final stage, Lambert exported a flattened version of the art back to Procreate on an iPad Pro to add a “more fluid sketch style” to parts of the final product.

Image: Kyle Lambert


The depth and detail of Lambert’s process and artwork are fascinating and demonstrate just how powerful Procreate combined with an iPad Pro and Apple Pencil can be in the hands of a talented artist. Check out the full article on Mashable to learn more about how Kyle Lambert created the Stranger Things poster.

→ Source: mashable.com

09 Aug 20:42

Turning a smartphone into a laptop, again

by Rui Carmo

This is the year’s most annoying Kickstarter so far, if only because I can’t quite make up my mind about it.

I keep telling myself I have absolutely no need for something that (still) does less than a Chromebook, and the concept has been tackled before without runaway success, but years of tinkering with Linux chroot environments inside Android phones keep calling to me.

Their having reached the 1080p display target means it fulfills my current minimum standard for screen resolution, and it might be useful beyond its intended purpose given that it’s essentially a DisplayLink screen with a keyboard and trackpad attached, but… I have another week to think about it.

09 Aug 06:08

To PT Commenters: More signal, less noise, please

by pricetags

Price Tags commenters are a good bunch: they generally add content and insight.

But occasionally a topic comes along that unleashes the inner troll. Then we see a lot of back-and-forth, highly repetitive and increasingly personal posts that eventually drive others away. It just gets too negative.

So, folks, make your point; just don’t make it too often.

Consolidate your thoughts, and add some original content.

Respect your fellow PTagger.

 


09 Aug 00:56

Google trolls 16GB iPhone users with new Photos ad

by Patrick O'Rourke

While it’s possible Apple could finally kill its 16GB iPhone storage tier with the launch of the iPhone 7 this September, storage-strapped iPhone users currently run into the error featured in this clever Google ad all too often.

Google has released an amusing new commercial that showcases Google Photos’ “free up space” functionality. The ad features multiple missed photo opportunities, including graduations, birthdays, selfies and even Bigfoot sightings, all because the iPhone in question runs out of storage each time. The error is displayed in Apple’s well-known iPhone notification style.

It is weird, however, that when the camera pulls back, it’s actually a Nexus 6P featured in the ad, despite the obvious dig at Apple.

With Google Photos’ new free up space feature, once a photo or video has been uploaded to the cloud, users are given the option to delete the local copy, freeing up storage on their smartphone for other purposes.

This feature becomes even more useful when paired with Photos’ lower-quality unlimited storage option.

To be fair, the iPhone is also capable of backing up memories to iCloud in order to clear up built-in storage, though most users who use their camera even somewhat regularly will likely need to pay for more storage.

Source: YouTube
09 Aug 00:56

Auerbach: “Achieving Affordable Housing in Vancouver”

by pricetags

Herb Auerbach is a long-standing real-estate development consultant in Vancouver – and the instructor (with Michael Mortensen) for the SFU City Program course Real Estate Development from the Inside Out.

 

ACHIEVING AFFORDABLE HOUSING IN VANCOUVER

One cannot open a newspaper in Vancouver without seeing an article about the housing market and the lack of affordable housing, including complaints about the influx of foreign money, unoccupied apartments, profiteering by ruthless developers, unethical behavior on the part of brokers, and the lack of government action.

Many of these are real issues in Vancouver, and they are doubtless affecting the real estate market, often negatively. Addressing these complaints, however, will not produce “affordable housing”, and very little is written about why we need affordable housing in Vancouver and how we can provide it.

There is no magic bullet.

If affordable housing is not available in the marketplace, then it must be provided through government subsidy, either by subsidizing the buyer or by subsidizing the seller. All the legislation aimed at regulating the real estate industry and taxing foreign ownership and vacant apartments won’t produce affordable housing, but it does help produce money for the government’s coffers, and those monies could, if earmarked to do so, contribute some of the funds needed for the production of affordable housing.

Why do we need affordable housing in the city?

Some have argued that those who cannot afford to live in the city should seek housing elsewhere: in the suburbs, or even in other cities. But encouraging more housing in the suburbs increases sprawl and requires governments to spend more on expensive infrastructure, transportation and services. Besides, cleansing the city of the less affluent will not render it more secure, or more socially or culturally viable.

The need for affordable housing in the city has been understood since the earliest city builders and placemakers. Alexander the Great said of Alexandria that it would “nourish men of all races” and that it would thrive by virtue of its diversity, not in spite of it. His view, like that of Emperor Augustus when building Rome and of Napoleon III when re-building Paris, was that men and women of all stations and all ethnicities are needed to create and maintain a vibrant urban environment. In its own way, Vancouver is attempting to emulate these great placemakers of history by assuring the vitality, diversity and quality of its urban life. To achieve this goal Vancouver needs not only high-end condos but affordable housing.

What is Affordable Housing?

The CMHC (Canada Mortgage and Housing Corporation, which insures mortgages) defines affordable housing as housing (rented or purchased) that costs no more per month than 30% of a family’s gross monthly income before tax. In Vancouver, apart from the 150,000 or so who are seeking housing, there are approximately 150,000 households paying well in excess of 30% of gross income for housing.

Market Rental and Affordable Rental Housing

Sufficient affordable rental housing is indeed the real issue and the real challenge. Private real estate developers will not build rental housing if it is significantly less profitable than building and selling condominiums. Rental properties have been less profitable because they require more equity and are riskier: you cannot pre-rent apartments, whereas as a developer you can pre-sell condos. So how can a city like Vancouver encourage the development and construction of affordable rental housing?

The Affordable Housing Solution

The provision of affordable housing in the city must be the responsibility of the City, not the Federal Government or the Province, and the City must be given the authority, the funds (or the ability to secure them), and the scope to develop its own administrative and organizational capacity to deliver affordable housing stock. The solution requires organizing a variety of funding sources and mechanisms, applying a number of legal tools and tax incentives, and bringing professional and entrepreneurial skills to bear, with the “power” to act aggressively.

In addition to requesting tax reform as an incentive to developers, the City should take the following steps:

1. Funds realized through extraordinary fees on real estate, including but not limited to the foreign ownership tax, land transfer tax, vacant unit tax, and CCAs, should be earmarked for affordable housing;
2. Land made available by the City to developers for non-market rental should never be sold, but leased for $1 or at affordable rates;
3. All new condos should include a percentage (up to 15%) of affordable rental;
4. Request the federal government to increase the depreciation rate for affordable housing projects, and return to making soft costs for non market rental housing tax deductible;
5. Request the federal government to provide the Cities with new taxing power and the ability to tap new sources of revenues, including tax free municipal bonds.
6. Demonstrate long term profitability to induce and encourage pension funds and privately owned REITs to invest in market and non-market rental;
7. Establish an effective, robust housing department under strong leadership (a Housing Tsar) with the human and financial resources to meet housing demand and set high goals, i.e. to deliver 10,000–20,000 units per year, through both public and private mechanisms.

 

  • This course provides insight into the industry for people who deal with real estate developers or want to learn how the development industry is structured, how it functions, who the players are, what motivates them, and how they interact. You’ll benefit from a variety of personal stories from developers with a wealth of experience.

 


09 Aug 00:56

MWoS 2015: Let’s Encrypt Automation Tooling

by J.C. Jones

The Mozilla Winter of Security 2015 has ended, and the participating teams of students are completing their projects.

The Certificate Automation tooling for Let’s Encrypt project wrapped up this month, having produced an experimental proof-of-concept patch for the Nginx webserver to tightly integrate the ACME automated certificate management protocol into the server operation.

The MWoS team, my co-mentor Richard Barnes, and I would like to thank Klaus Krapfenbauer, his advisor Martin Schmiedecker, and the Technical University of Vienna for all the excellent research and work on this project.

Below is Klaus’ end-of-project presentation on AirMozilla, as well as further details on the project.

MWoS Let’s Encrypt Certificate Automation presentation on AirMozilla

Developing an ACME Module for Nginx

Author: Klaus Krapfenbauer

Note: The module is an incomplete proof-of-concept, available at https://github.com/mozilla/mwos-letsencrypt-2015

The 2015-2016 Mozilla Winter of Security included a project to implement an ACME client within a well-known web server, to show the value of automated HTTPS configuration when used with Let’s Encrypt. Projects like Caddy Server showed the tremendous ease-of-use that could be attained, so for this project we sought to patch-in such automation to a mainstay web server: Nginx.

The goal of the project is to build a module for a web server to make securing your web site even easier. Instead of configuring the web server, getting the certificate (e.g. with the Let’s Encrypt Certbot) and installing the certificate on the web server, you just need to configure your web server. The rest of the work is done by the built-in ACME module in the web server.

Nginx

This project didn’t specify which particular web server we should develop on. We evaluated several, including Apache, Nginx, and Stunnel. Since the goal is to help as many people as possible in securing their web sites we narrowed to the two most widely-used: Nginx and Apache. Ultimately, we decided to work with Nginx since it has a younger code base to develop with.

Nginx has a module system with different types of modules for different purposes. There are load-balancer modules, which pass traffic to multiple backend servers; filter modules, which transform a website’s data (for example encrypting it, as the SSL/TLS module does); and handler modules, which produce the content of a web request (e.g. the HTTP handler loads an HTML file from disk and serves it). Beyond their purpose, the module types also differ in how they hook into the server core, which makes the choice crucial when you start to implement a Nginx module. In our case none of the types was suitable, which introduced some difficulties, discussed later.

The ACME module

The Nginx module is intended as a replacement for the traditional workflow involving the ACME Certbot, so its features should resemble those of the Certbot. These include:

  • Generate and store a key-pair
  • Register an account on an ACME server
  • Create an authorization for a domain
  • Solve the HTTP challenge for the domain authorization
    • At a later date, support the other challenge types
  • Retrieve the certificate from the ACME server
  • Renew a certificate
  • Configure the Nginx SSL/TLS module to use the certificate

To provide the necessary information for all the steps in the ACME protocol, we introduced new Nginx configuration directives:

  • A directive to activate the module
  • Directive(s) for the recovery contact of the ACME account (optional)
    • An array of URIs like “mailto:” or “tel:”

Everything else is gathered from the default Nginx directives. For example, the domain for which the certificate is issued is taken from the Nginx configuration directive “server_name”.

An architecture diagram showing the different resources available to Nginx, and their relationships with the ACME module developed, as well as the ACME server: Let's Encrypt.

Architecture of the ACME Module for Nginx

As the ACME module is an extension of the Nginx server itself, it’s a part of the software and therefore uses the Nginx config file for its own configuration and stores the certificates in the Nginx config directory. The ACME module communicates with the ACME server (e.g. Let’s Encrypt, but it could be any other server speaking the ACME protocol) for gathering the certificate, then configures the SSL/TLS module to use this certificate. The SSL/TLS module then does the encryption work for the website’s communication to the end user’s browser.

Let’s look at the workflow of setting up a secure website. In a world without ACME, anyone who wanted to setup an encrypted website had to:

  1. Create a CSR (certificate signing request) with all the information needed
  2. Send the CSR over to a CA (certificate authority)
  3. Pay the CA for getting a signed certificate
  4. Wait for their reply containing the certificate (this could take hours)
  5. Download the certificate and put it in the right place on the server
  6. Configure the server to use the certificate

With the ACME protocol and the Let’s Encrypt CA you just have to:

  1. Install an ACME client
  2. Use the ACME client to:
    1. Enter all your information for the certificate
    2. Get a certificate
    3. Automatically configure the server

That’s already a huge improvement, but with the ACME module for Nginx it’s even simpler. You just have to:

  1. Activate the ACME module in the server’s configuration

Pretty much everything else is handled by the ACME module. So it does all the steps the Let’s Encrypt client does, but fully automated during the server startup. This is how easy it can and should be to encourage website admins to secure their sites.

The minimal configuration work for the ACME module is to just add the “acme” directive to the server context in the Nginx config for which you would like to activate it. For example:

…
http {
…
  server {
    listen 443 ssl;
    server_name example.com;
    acme;
    …
    <recommended SSL hardening config directives>
    …
    location / {
      …
    }
  }
}
…

Experienced challenges

Designing and developing the ACME module was quite challenging.

As mentioned earlier, there are different types of modules which enhance different portions of the Nginx core server. The default Nginx module types are: handler modules (which create content on their own), filter modules (which convert website data – like the SSL/TLS module does) and load-balancer modules (which route requests to backend servers). Unfortunately, the ACME module and its inherent workflow does not fit any of these types. Our module breaks these conventions: it has its own configuration directives, and requires hooks into both the core and other modules. Nginx’s module system was not designed to accommodate our module’s needs, therefore we had a very limited choice on when we could perform the ACME protocol communication.

The ACME module serves to configure the existing SSL/TLS module, which performs the actual encryption of the website. Our module needs to control the SSL/TLS module to some degree in order to provide the ACME-retrieved encryption certificates. Unfortunately, the SSL/TLS module does a check for the existence and the validity of the certificates during the Nginx configuration parsing phase while the server is starting. This means the ACME module must complete its tasks before the configuration is parsed. Our decision, due to those limitations, was to handle all the certificate gathering at the time when the “acme” configuration directive is parsed in the configuration during server startup. After getting the certificates, the ACME module then updates the in-memory configuration of the SSL/TLS module to use those new certificates.

Another architectural problem arose when implementing the ACME HTTP challenge-response. To authorize a domain using the ACME HTTP challenge, the server needs to respond with a particular token at a well-known URL path on its domain; basically, it must publish this token just as a web server publishes any other resource. Unfortunately, at the time the ACME module is processing, Nginx has not yet started: there is no web server yet. If the ACME module exits, permitting web server functions to begin (and keeping in mind the SSL/TLS module certificate limitations from before), there is no simple mechanism to resume the ACME functions later. Architecturally this makes sense for Nginx, but it is inconvenient for this project. Faced with this dilemma, for the purposes of this proof-of-concept we decided to launch an independent, tiny web server to serve the ACME challenge before Nginx itself properly starts.

Conclusion

As discussed, the limitations of a Nginx module prompted some suboptimal architectural design decisions. As in many software projects, the problem is that we want something from the underlying framework which it wasn’t designed to do. The current architectural design of the ACME module should be considered a proof-of-concept.

There are potential changes that would improve the architecture of the module and the communication between the Nginx core, the SSL/TLS module and the ACME module. These changes, of course, have pros and cons which merit discussion.

One change would be deferring the retrieval of the certificate to a time after the configuration is parsed. This would require spoofing the SSL/TLS module with a temporary certificate until the newly retrieved certificate is ready. This is a corner-case issue that arises just for the first server start when there is no previously retrieved certificate already stored.

Another change is the challenge-response: A web server inside a web server (whether with a third party library or not) is not clean. Therefore perhaps the TLS-SNI or another challenge type in the ACME protocol could be more suitable, or perhaps there is some method to start Nginx while still permitting the module to continue work.

Finally, the communication to the SSL/TLS module is very hacky.

Current status of the project & future plans

The current status of the module can be roughly described as a proof-of-concept in a late development stage. The module creates an ephemeral key-pair, registers with the ACME server, requests the authentication challenge for the domain and starts to answer the challenge. As the proof of concept isn’t finished yet, we intend to carry on with the project.

Many thanks

This project was an exciting opportunity to help realize the vision of securing the whole web. Personally, I’d like to give special thanks to J.C. Jones and Richard Barnes from the Mozilla Security Engineering Team who accompanied and supported me during the project. Also special thanks to Martin Schmiedecker, my professor and mentor at SBA Research at the Vienna University of Technology, Austria. Of course I also want to thank the whole Mozilla organization for holding the Mozilla Winter of Security and enabling students around the world to participate in some great IT projects. Last but not least, many thanks to the Let’s Encrypt project for allowing me to participate and play a tiny part in such a great security project.

09 Aug 00:56

Android 7.0 Home button could provide a direct route to Google Assistant

by Rob Attrell

The latest Android 7.0 Nougat news indicates we could get a final release of the updated OS as early as next month, but only for Nexus phones. While many of the OS’ more significant features have been revealed since Nougat was first announced earlier this year, some details may not be totally clear until the final version actually gets pushed out.

One small feature that hasn’t been officially detailed is a minor but meaningful change to Android’s focal point, the Home button. In previews of Android 7.0, the button features extra colour, as well as new animations, when used with new Google Now on Tap features.


While nothing has been confirmed, it’s likely that holding down the Home button in Android 7.0 triggers Google Assistant and that the animations (seen above) provide visual feedback showing requests being processed.

It’s quite possible, however, that this could all change before Nougat’s final release. That being said, seeing Google Assistant being directly integrated into the Home button is certainly going to help the average user make use of the feature.

09 Aug 00:55

Motordom in Melbourne

by pricetags

Stephen B picks up some detritus from the Twittersphere – a view reportedly in Melbourne:

Melbourne


09 Aug 00:55

Kin Lane: Working To Avoid The Drowning Effects Of Real Time

by dnorman

A million times, this:

You hear a lot of talk about information overload, but I don’t feel the amount of information is the problem. For me, the problem comes in with the emotional investment demanded by real-time, and the ultimate toll it can take on your productivity, or just general happiness and well-being. You can see this play out in everything from expectations that you should respond to emails, all the way to social network memes getting your attention when it comes to the election, or for me personally, the concerns around security and privacy using technology.

Source: Working To Avoid The Drowning Effects Of Real Time

I’ve definitely been feeling this fatigue more lately. Describing it as a “real-time toll” is a good way to look at it. It’s definitely not information overload – it’s sensory and emotional overload as a result of a flood of realtime demands on attention and connection.

I am actively reducing the number of real-time platforms/connections/whatevers that I pay attention to, and am actively trying to do as much as I can on my own schedule. RSS is on my schedule. Checking and responding to email is on my schedule. Twitter/Facebook/etc are more real-time environments, so I’m reducing the amount of time I spend there.

Update: the Related Posts feature pulled up this post from 2008. “Real-time toll” is a perfect way to describe what I was getting at back then:

Every time I read an update by someone that I care about, I think about that person – if only for a second – and my sense of connection is strengthened.

09 Aug 00:55

Twitter Favorites: [DenimAndSteel] Procedural tools could become a new mainstay in architectural process, but only with major changes in pedagogy. https://t.co/oenrWyzl8D

Denim & Steel @DenimAndSteel
Procedural tools could become a new mainstay in architectural process, but only with major changes in pedagogy. killscreen.com/articles/can-a…
09 Aug 00:55

Pokémon Go installed on 100 million Android devices, makes $200 million in its first month

by Patrick O'Rourke

It seems that, despite predictions Pokémon Go will soon wane in popularity, the augmented reality Pocket Monster catching game is still going strong, at least in the regions where it has recently launched.

With each new launch market, which most recently includes South America, Asia and Oceania, Pokémon Go experiences a dramatic spike in users. Sensor Tower, an app data collecting firm, says that Pokémon Go has now been installed on over 100 million Android devices, amounting to $200 million in earnings in only its first month on the market.

To put this amount in perspective, Pokémon Go is making Niantic approximately $10 million per day, though it’s believed Nintendo, as well as Google, also receive a portion of the game’s revenue given their stakes in the company. It’s worth noting that this number doesn’t include users who side-loaded the app or iOS App Store download numbers, though Apple has stated that Pokémon Go has been downloaded more than any other iOS app in its first week on the App Store.

It’s also worth keeping in mind that significant markets still remain untapped, including India, Africa, Russia and China. If the game continues growing at this rate, it’s possible that despite controversy surrounding the game’s glitches and the removal of its “Nearby” Pokémon tracing system, Niantic’s title is on pace to earn a billion dollars before the end of the year.

09 Aug 00:54

Counting [U.S.] Expatriation with R (a.k.a. a Decade of Desertion)

by hrbrmstr
2016-08-13 UPDATE: Fortune has a story on this and it does seem to be tax-related vs ideology. @thosjleeper suggested something similar as well about a week ago.

If you’re even remotely following the super insane U.S. 2016 POTUS circus election you’ve no doubt seen a resurgence of “if X gets elected, I’m moving to Y” claims by folks who are “anti” one candidate or another. The Washington Examiner did a story on last quarter’s U.S. expatriation numbers. I didn’t realize we had a department in charge of tracking and posting that data, but we do thanks to inane bureaucratic compliance laws.

I should have put “posting that data” in quotes as it’s collected quarterly and posted ~2 months later in non-uniform HTML and PDF form across individual posts in a unique/custom Federal Register publishing system. How’s that hope and change in “open government data” working out for y’all?

The data is organized enough that we can take a look at the history of expatriation with some help from R. Along the way we’ll:

  • see how to make parameterized web requests a bit cleaner with httr
  • get even more practice using the purrr package
  • perhaps learn a new “trick” when using the stringi package
  • show how we can “make do” living in a non-XPath 2 world (it’s actually pretty much at XPath 3 now, too #sigh)

A manual hunt on that system will eventually reveal a search URL that you can use in a read.csv() (to grab a list of URLs with the data, not the data itself #ugh). Those URLs are gnarly (you’ll see what I mean if you do the hunt), but we can take advantage of the standardized URL query parameters used in those egregiously long URLs in a far more readable fashion if we use httr::GET() directly, especially since httr::content() will auto-convert the resultant CSV to a tibble for us, as the site sets the response MIME type appropriately.

Unfortunately, when using the 6039G search parameter (the expatriate tracking form ID) we do need to filter out non-quarterly report documents since the bureaucrats must have their ancillary TPS reports.

library(dplyr)
library(httr)
library(rvest)
library(purrr)
library(lubridate)
library(ggplot2) # devtools::install_github("hadley/ggplot2")
library(hrbrmisc) # devtools::install_github("hrbrmstr/hrbrmisc")
library(ggalt)
library(grid)
library(scales)
library(magrittr)
library(stringi)

GET("https://www.federalregister.gov/articles/search.csv",
    query=list(`conditions[agency_ids][]`=254,
               `conditions[publication_date][gte]`="01/01/2006",
               `conditions[publication_date][lte]`="7/29/2016",
               `conditions[term]`="6039G",
               `conditions[type][]`="NOTICE")) %>%
  content("parsed") %>%
  filter(grepl("^Quarterly", title)) -> register

glimpse(register)
## Observations: 44
## Variables: 8
## $ citation         <chr> "81 FR 50058", "81 FR 27198", "81 FR 65...
## $ document_number  <chr> "2016-18029", "2016-10578", "2016-02312...
## $ title            <chr> "Quarterly Publication of Individuals, ...
## $ publication_date <chr> "07/29/2016", "05/05/2016", "02/08/2016...
## $ type             <chr> "Notice", "Notice", "Notice", "Notice",...
## $ agency_names     <chr> "Treasury Department; Internal Revenue ...
## $ html_url         <chr> "https://www.federalregister.gov/articl...
## $ page_length      <int> 9, 17, 16, 20, 8, 20, 16, 12, 9, 15, 8,...

Now, we grab the content at each of the html_urls and save them off to be kind to bandwidth and/or folks with slow connections (so you don’t have to re-grab the HTML):

docs <- map(register$html_url, read_html)
saveRDS(docs, file="deserters.rds")

That generates a list of parsed HTML documents.
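
On a later run you can skip all ~44 HTTP requests and just restore the cache (a small sketch; the filename is the one used in saveRDS() above):

```r
# Restore the cached, parsed documents if we've been here before;
# otherwise fall back to an empty list (and re-run the map() above).
docs <- if (file.exists("deserters.rds")) readRDS("deserters.rds") else list()
```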

The reporting dates aren’t 100% consistent (i.e. not always “n” weeks from the collection date), but the data collection dates embedded textually in the report are (mostly…some vary in the use of upper/lower case). So, we use the fact that these are boring legal documents that use the same language for various phrases and extract the “quarter ending” dates so we know what year/quarter the data is relevant for:

register %<>%
  mutate(qtr=map_chr(docs, ~stri_match_all_regex(html_text(.), "quarter ending ([[:alnum:], ]+)\\.",
                                                     opts_regex=stri_opts_regex(case_insensitive=TRUE))[[1]][,2]),
         qtr=mdy(qtr))

I don’t often use that particular magrittr pipe, but it “feels right” in this case and is handy in a pinch.
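
If you want to see that capture group in action without the full document corpus, here’s a base-R approximation (regexec() standing in for stri_match_all_regex(); the sample sentence is made up but mirrors the boilerplate in the notices):

```r
# Base-R stand-in for the stringi capture: grab the date following
# "quarter ending" (case-insensitively) up to the terminating period.
txt <- "...with respect to the quarter ending June 30, 2016."
m   <- regexec("quarter ending ([[:alnum:], ]+)\\.", txt, ignore.case = TRUE)
regmatches(txt, m)[[1]][2]
# [1] "June 30, 2016"
```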

If you visit some of the URLs directly, you’ll see that there are tables and/or lists of the expats’ names. However, the column naming & formatting conventions for these tables are woefully inconsistent and (as I noted earlier) there’s no XPath 2 support in R. Therefore, we have to write a slightly more verbose XPath query that accounts for each of the column-name variants in the tables we’re targeting.

NOTE: Older HTML pages may not have HTML tables at all and some only reference PDFs, so don’t rely on this code working beyond these particular dates (at least consistently).

We’ll also tidy up the data into a neat tibble for plotting.

map(docs, ~html_nodes(., xpath=".//table[contains(., 'First name') or
                                         contains(., 'FIRST NAME') or
                                         contains(., 'FNAME')]")) %>%
  map(~html_table(.)[[1]]) -> tabs

data_frame(date=register$qtr, count=map_int(tabs, nrow)) %>%
  filter(format(as.Date(date), "%Y") >= 2006) -> left
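
As an aside, XPath 1.0 can fake a case-insensitive match with translate(), which would collapse the 'First name'/'FIRST NAME' alternatives in the contains() chain above into a single test (the 'FNAME' tables would still need their own clause). A minimal demonstration with xml2, using a toy document:

```r
library(xml2)

# translate() down-cases the table's text before the contains() test,
# so one clause handles both "First name" and "FIRST NAME" headers.
doc <- read_xml("<div><table><tr><th>FIRST NAME</th></tr></table></div>")
xp  <- paste0(
  ".//table[contains(translate(., ",
  "'ABCDEFGHIJKLMNOPQRSTUVWXYZ', 'abcdefghijklmnopqrstuvwxyz'), ",
  "'first name')]"
)
length(xml_find_all(doc, xp))
# [1] 1
```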

With the data wrangling work out of the way, we can tally up the throngs of folks desperate for greener pastures. First, by quarter:

gg <- ggplot(left, aes(date, count))
gg <- gg + geom_lollipop()
gg <- gg + geom_label(data=data.frame(),
                      aes(x=min(left$date), y=1500, label="# individuals"),
                      family="Arial Narrow", fontface="italic", size=3, label.size=0, hjust=0)
gg <- gg + scale_x_date(expand=c(0,14), limits=range(left$date))
gg <- gg + scale_y_continuous(expand=c(0,0), label=comma, limits=c(0,1520))
gg <- gg + labs(x=NULL, y=NULL,
                title="A Decade of Desertion",
                subtitle="Quarterly counts of U.S. individuals who have chosen to expatriate (2006-2016)",
                caption="Source: https://www.federalregister.gov/")
gg <- gg + theme_hrbrmstr_an(grid="Y")
gg

[Figure: quarterly counts of U.S. expatriations, 2006–2016 (lollipop chart)]

and, then annually:

left %>%
  mutate(year=format(date, "%Y")) %>%
  count(year, wt=count) %>%
  ggplot(aes(year, n)) -> gg

gg <- gg + geom_bar(stat="identity", width=0.6)
gg <- gg + geom_label(data=data.frame(), aes(x=0, y=5000, label="# individuals"),
                      family="Arial Narrow", fontface="italic", size=3, label.size=0, hjust=0)
gg <- gg + scale_y_continuous(expand=c(0,0), label=comma, limits=c(0,5100))
gg <- gg + labs(x=NULL, y=NULL,
                title="A Decade of Desertion",
                subtitle="Annual counts of U.S. individuals who have chosen to expatriate (2006-2016)",
                caption="Source: https://www.federalregister.gov/")
gg <- gg + theme_hrbrmstr_an(grid="Y")
gg

[Figure: annual counts of U.S. expatriations, 2006–2016 (bar chart)]

The exodus isn’t massive, but it’s more than I expected. It’d be interesting to track various U.S. tax code laws, the enactment of other compliance regulations and general news events to see if there are underlying reasons for the overall annual increases, and also for the dips in some quarters (which could just be data collection hiccups by the Feds…after all, this is government work). If you want to do all the math for correcting survey errors, it’d also be interesting to normalize this by population, track the data back to 1996 (when HIPAA mandated the creation & publication of this quarterly list) and then see if you can predict where we’ll be at the end of this year (though I suspect political events are a motivator for at least a decent fraction of some of the quarters).

08 Aug 21:08

Google Photos

by Volker Weber

Gruber writes:

This commercial is going to turn millions of people with 16 GB iPhones into Google Photos users.

I have already converted lots of users to Google Photos. It's exactly what everybody needs: unlimited storage, freeing up storage on your iPhone, making all your photos and videos available on all devices, and an Assistant that creates stories out of your photos. Google Photos lets you share with friends, and it is improving rapidly. Check out some of the features: shared albums, smart albums, or search by emoji. What? Search by emoji. Yes, search by effing emoji.

Apple is only catching up to some of those features with iOS 10. But Apple needs to make a big change if it wants to stay in the game: instead of storing everything on the device, it needs to store everything in the cloud and only cache on the device.

08 Aug 21:08

Kin Lane: Working To Avoid The Drowning Effects Of Real Time

D'Arcy Norman, D’Arcy Norman Dot Net, Aug 11, 2016


I posted a really outstanding slide deck on Saudi Arabian youth yesterday (Sunday) and today it's gone, deleted from the internet. I had email in my inbox this morning asking for it. Sorry you missed it; it's gone. Welcome to the angst created by working in the world of real time. Meanwhile my Google Music feed is between selections and I'm feeling the weight of silence while the internet beckons me to make the next choice. All this while reading an academic paper, watching a video by Patrick Watson, and following Facebook. In a post that came out hours ago, D'Arcy Norman writes (citing Kin Lane), "I am actively reducing the number of real-time platforms/connections/whatevers that I pay attention to, and am actively trying to do as much as I can on my own schedule." What kind of way is that to be productive?

[Link] [Comment]
08 Aug 21:07

Recommended on Medium: Why working x hours a week is nonsense

I just read this article claiming that Marissa Mayer said that working 130 hours a week is what made Google successful in the early days…

Continue reading on Medium »

08 Aug 21:07

Apple will reportedly release two new Watch models later this year

by Igor Bonifacic

While the annual iPhone rumour mill has been pumping out speculation about Apple’s next smartphone for the past couple of months, very little has come out about the next Apple Watch over that same period. But a new report from KGI analyst Ming-Chi Kuo, obtained by AppleInsider and MacRumors, suggests Apple plans to release two new Watch models later this year.

The first of the two planned models will be a minor update to the current Apple Watch. It will include a faster processor manufactured by TSMC, one of the firms that co-produced the iPhone 6’s A8 processor. This model will also feature improved waterproofing, but will otherwise look like the same device we have now. Kuo, however, expects Apple to sell it for less than the current Apple Watch.

The second, more substantial model, said to be called the Apple Watch 2, will reportedly feature a new, smaller 16nm processor, a GPS radio and a barometer. To power the new hardware, the smartwatch will have a larger battery, meaning it’s unlikely to come in a slimmer case.

Kuo believes Apple will wait until 2018 to release an Apple Watch model that features a new design. In the meantime, the company plans to add an LTE model to the mix by 2017, he says.

Kuo has something of a mixed record. Unlike someone like Evan Blass, who is able to get his hands on devices before they’re released, Kuo sources his information from Apple’s supply chain. As such, his reports are more educated guesses than anything.

If Apple does announce new Watch models, it’s likely the company will do so at its annual September iPhone event.

08 Aug 18:54

Brooklyn Bridge getting Burrard volumes: Expansion proposed

by pricetags

From the New York Times:

Now after escalating complaints, New York City transportation officials said on Monday that something would finally be done to solve the riddle of what they call “Times Square in the Sky.” That something — if the elderly crossing can take it — could be building a new path to alleviate congestion so bad that some people avoid the Brooklyn Bridge entirely. …

Beginning this month, a seven-month engineering study by a consulting firm, AECOM, will assess how much weight the bridge can carry, and consider options for expansion, including widening the existing promenade by building decks on top of the girders that run directly above the car lanes, according to city officials. …

Across New York City, foot traffic has increased across many bridges as the city’s population of 8.5 million is larger than ever and tourism keeps growing. The Brooklyn Bridge has seen its largest crowds on weekends, according to counts the city takes every May. An average of 1,917 people an hour crossed at peak times in 2015, or more than triple the 511 people in 2008. Weeknight traffic increased to an average of 1,057 people an hour at peak times from 880 people in 2008.

Bike traffic has also grown on the Brooklyn Bridge, reaching an average of 358 cyclists an hour at peak weeknight hours in 2015 compared with 185 cyclists in 2008.

On a single weekday, 10,000 pedestrians and 3,500 cyclists typically cross.

(Burrard Bridge had about 3,500 cycling trips by about 4 pm last Sunday.)

A rendering of one possible option for expanding the pedestrian and bike lanes on the bridge.