Shared posts

02 May 21:55

Overcast 2.5.2 with Quicksync

by Federico Viticci

Marco Arment on the latest update to Overcast:

In the last few Overcast releases, I’ve been optimizing the sync protocol and decreasing the burden of each sync to both sides (my servers and your iPhones). In 2.5.2, we’ll reap some of the benefits with the first version of what I’ve been informally calling “quicksync”.

In short, syncing Overcast between multiple devices — say, an iPhone and an iPad — is now much faster and more accurate, making multi-device usage much more practical and compelling.

"I've been testing this for a few weeks...", the saying goes, but it's true. In my tests during the beta, quicksync made switching between podcast episodes on two devices faster and less annoying than before.

Quicksync worked well in my typical use case: I'm washing dishes and Overcast is playing through the iPad Pro's speakers, which are louder; then, I have to go out and connect my iPhone to my car's audio, resuming Overcast to the same episode. With quicksync, I no longer have to skip ahead to catch up with the iPad's progress. Marco did good work here and I hope the servers hold up well.

→ Source: marco.org

02 May 19:31

Try Redis, Get a Limited Edition T-Shirt

by Thom Crowe

We love Redis. It's a great in-memory database which lets you keep an entire dataset available for blazingly fast access. Or it can be your super-quick cache for your applications to call on. Or it can start aggregating your data for faster analysis later with HyperLogLogs. Redis is a superbly versatile tool to have in your application stack; it's part of our production stack and we think it should be part of yours. There's never been a better time to give Redis a try on Compose.

If you’re already running another database on Compose, spin up a Redis instance, get your data into it and see just how quickly your data can be accessed. It works in tandem with your other current database to give you super-fast access to your data - think of it as the high-speed glue for your applications. To see how it works, check out this article on Redis and the power of incrementing. If you haven't tried Redis yet, easily spin up a MongoDB and Redis deployment to give it a try. And, as an added bonus and for a short time, we’re giving everyone who tries Redis a limited edition Compose Redis shirt.
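For a concrete flavour of those access patterns, here is a minimal sketch using the redis-py client (the host, port and key names are placeholders rather than Compose connection details):

import redis  # pip install redis

# Connect to a Redis deployment (placeholder host/port; use your own connection details).
r = redis.StrictRedis(host='localhost', port=6379, db=0)

# Cache a value in front of a slower primary database, expiring after five minutes.
r.setex('user:42:profile', 300, '{"name": "Ada"}')
profile = r.get('user:42:profile')

# Count events cheaply with INCR...
r.incr('pageviews:/home')

# ...or estimate distinct visitors with a HyperLogLog.
r.pfadd('visitors:2016-05-02', 'user:42', 'user:77')
print(r.pfcount('visitors:2016-05-02'))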

Get started.

If you’re already up and running with Compose, just login, click deployments and spin up a new Redis instance.

If you haven’t tried Compose, today’s a good day to get started. Head over and sign up and you can try out Redis, as well as all of our other offerings, free for 30 days.

Cool, but how do I get my shirt?

Easy, when you sign up, we’ll send you an email with a link to a form. Fill out the form with all of your info we need to send you your shirt (name, address, phone number, etc), play with your new database while you wait, check your mailbox in a few days and bam, you’ve got a shirt. Pretty simple.

While you’re at it, if you’d like to write about your journey with Redis, you can submit an idea to our Write Stuff program and get paid cash and Compose credit.

02 May 19:57

Facebook Messenger could become more like Snapchat with rumoured self-destructing messages

by Rose Behar

Back in 2013 when Facebook offered to acquire Snapchat for $3 billion USD, it seemed to many a misguided act of hubris from founder Evan Spiegel to turn the offer down.

Critics assumed Snapchat was a flash in the pan and would soon go bankrupt. Three years on, Snapchat is still going strong, with 10 billion “Stories” viewed per day and a $16 billion USD valuation as of its last raise this March.

Meanwhile Facebook, the company that saw Snapchat’s potential early on, now appears to be testing a feature that mimics arguably the most crucial element of Snapchat’s success. The new function appears in screenshots posted to Twitter, showing options for users to set their messages to delete after one minute, 15 minutes, an hour, four hours or a day.

This comes a mere week after the Wall Street Journal‘s Deepa Seetharaman reported that Facebook was planning to launch a stand-alone camera app that opens directly on the camera, allowing users to quickly live-stream video.

There have also been recent rumours that Facebook will allow users to turn on end-to-end encryption for select conversations.

With all these features echoing aspects of Snapchat, it appears that Facebook is now determined to beat the more youthful app at its own game.

Related reading: Facebook may be launching a stand-alone camera app to rival Snapchat

02 May 19:46

Google’s May 2016 Android Security Bulletin Outlines 25 Vulnerabilities

by Evan Selleck
Another month, and another security bulletin for Android posted by Google. This month’s update has just been released.
02 May 20:31

24-200mm Focal Length Range is Handy

by Bill Booz
Shot at 12mm focal length (35mm-equivalent of 24mm)

I recently wrote about being able to carry three lenses with my tiny Panasonic GM-5 camera in a nice, small bag. One lens is always on the GM-5, and that is usually my Olympus 25mm prime lens. The other two, then, are the Panasonic 35-100mm f/4-5.6 and the Panasonic 12-32mm f/3.5-5.6 zoom lenses.

Shot at 100mm focal length (35mm-equivalent of 200mm)

Since the Panasonic GM-5 is a micro four thirds camera, it has a 2x crop factor. That means that you multiply the stated focal length of a given lens by two. So, in effect, in this little Mountainsmith bag, I’m carrying around lens power that will give me a 35mm-equivalent focal length range of 24mm through 200mm. I am including an image shot at the widest, 24mm, focal length and the longest focal length of 200mm. In addition, I have included a collage that shows the extremes of the zoom lens along with a shot taken with my great Olympus 25mm, f/1.8 lens.

Shots from all lens extremes plus the 25mm prime.
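If you want to sanity-check that arithmetic, it is a single multiplication; here is a tiny illustrative Python sketch (the 2x factor below is the Micro Four Thirds crop factor, as on the GM-5):

CROP_FACTOR = 2.0  # Micro Four Thirds crop factor

def equivalent_focal_length(actual_mm, crop_factor=CROP_FACTOR):
    """Return the 35mm-equivalent focal length for a given actual focal length."""
    return actual_mm * crop_factor

# The 12-32mm and 35-100mm zooms together cover a 24-200mm equivalent range.
for f in (12, 25, 32, 35, 100):
    print("%dmm lens ~ %dmm equivalent" % (f, equivalent_focal_length(f)))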

If you would be interested in virtual tutoring sessions using either Skype or Google Hangouts, please drop me an e-mail or leave me a private message on Facebook or Google+.

Please leave me questions or comments on my Facebook page (https://www.facebook.com/billbooz) or on Google+ (http://plus.google.com/+BillBooz). I look forward to interacting with you!

Please follow my Photography Learning Center videos on my YouTube Channel!

Check out my free eBook on Mastering Your Camera’s Priority Modes: http://www.infotor.com/ebooks. Download and enjoy!

I am available to do one-on-one tutoring or small group lessons designed to meet YOUR needs and what you want to learn in the area of photography, using flashes, or the use of Apple products and software. Give yourself the gift of learning: http://www.infotor.com/photoclasses! And check out my tutoring bundle: http://www.infotor.com/photoclasses/tutoringbundle.php as well. A PDF of my brochure is at http://www.infotor.com/photoclasses/brochures/WHBI_2015-16Brochure.pdf.

02 May 00:00

Fifty shades of open


Jeffrey Pomerantz, Robin Peek, First Monday, May 05, 2016


This could have been much more appropriately titled, but the content of the piece is spot on. Specifically:

Open means rights
Open means access
Open means use
Open means transparent
Open means participatory
Open means enabling openness
Open means philosophically aligned with open principles

[Link] [Comment]
02 May 07:59

BlueGriffon officially recommended by the French Government

by BlueGriffon

TL;DR: BlueGriffon is now officially recommended as the HTML editor for the French administration in its effort to rely on and promote Free Software!

I am very pleased to report that BlueGriffon, my cross-platform Wysiwyg web editor, is officially recommended by the Socle Interministériel de Logiciels Libres for 2016! You will find the official list of recommended software here (PDF document).

02 May 07:01

Smartphones – The second derivative.

by windsorr

Gap between ecosystem and hardware to increase this year. 

  • As the slowdown in the smartphone market is proving more severe than even I had expected, it is Xiaomi that looks to be in real trouble.

Smartphone and ecosystem

  • Q1 16A smartphone shipments look like they were flat or declined by as much as 3% to around 340m units, compared to 344m units in Q1 15A.
  • This is below RFM’s forecast of 1% growth and substantially below that which I believe most commentators and the technology industry were expecting.
  • The problem with the flattening of the market is that handset makers will have to fight even harder to find growth resulting in even greater pricing pressure.
  • This means that in revenue terms the handset market could decline by 5-10% this year.
  • This is great for consumers and for the ecosystem companies that want smartphones in the hands of as many users as possible, but for the hardware makers it is disastrous.
  • All handset makers with the exception of Samsung and Apple are barely breaking even and this added pressure could push more of them into loss making territory.
  • Consequently, I expect that this year will see an acceleration of the shakeout as the smaller companies realise that they have no hope of ever making a decent return by making commodity Android handsets.
  • This further increases my preference for the ecosystem companies as their addressable markets will keep growing despite the stagnation in the handset market.
  • The addressable market for an ecosystem is smartphone users which RFM forecasts will grow by 14% this year to 2.82bn users from 2.46bn at the end of 2015.
  • This is how the likes of Google, Facebook, Baidu, Tencent and so on will be able to post good growth this year despite the hardships being endured by hardware.

Xiaomi

  • The two exceptions to this are Apple and Xiaomi, both of which have decided to monetise their ecosystems by selling hardware.
  • However, it is there that the similarity ends: despite its growth issues, Apple is still fantastically profitable.
  • Xiaomi, on the other hand, is not, and this is the third quarter in a row in which it has lost market share.
  • To add insult to injury, it is also no longer number 1 in its home market of China, having been overtaken by both a resurgent Huawei and Oppo.
  • This leads me to believe that Xiaomi has no money to invest in its ecosystem, which will result in it falling further behind Baidu, Tencent and Alibaba, and even China Mobile.
  • For Xiaomi, 2015 was a year in which it grew, but not as much as it had promised, while 2016 is looking like one in which revenues could decline by as much as 10%.
  • For a company that last raised money at $46bn on the promise of very rapid growth, this is a dreadful outcome, as Xiaomi badly needs to invest in its ecosystem, has no money to do so and will have great difficulty in raising any more.
  • To compound its problems it also appears that usage of its ecosystem is waning (see here) which means that the loyalty of its users to its devices may also be in decline.
  • This will further hamper profitability making the outlook for Xiaomi very difficult indeed.
  • I continue to believe that any investor that can offload his shares in Xiaomi at a valuation of $27bn will be doing very well indeed.
02 May 10:18

When Documents Become Databases – Tabulizer R Wrapper for Tabula PDF Table Extractor

by Tony Hirst

Although not necessarily the best way of publishing data, data tables in PDF documents can often be extracted quite easily, particularly if the tables are regular and the cell contents are reasonably spaced.

For example, official timing sheets for F1 races are published by the FIA as event and timing information in a set of PDF documents containing tabulated timing data:

R_-_Best_Sector_Times_pdf__1_page_

In the past, I’ve written a variety of hand-crafted scrapers to extract data from the timing sheets, but the regular way in which the data is presented in the documents means that they are quite amenable to scraping using a PDF table extractor such as Tabula. Tabula exists both as a server application, accessed via a web browser, and as a service using the tabula extractor Java application.

I don’t recall how I came across it, but the tabulizer R package provides a wrapper for tabula extractor (bundled within the package) that lets you access the service via its command line calls. (One dependency you do need to take care of is to have Java installed; adding Java into an RStudio docker container would be one way of taking care of this.)

Running the default extractor command on the above PDF pulls out the data of the inner table:

extract_tables('Best Sector Times.pdf')

fia_pdf_sector_extract

Where the data is spread across multiple pages, you get a data frame per page.

R_-_Lap_Analysis_pdf__page_3_of_8_

Note that the headings for the distinct tables are omitted. Tabula’s “table guesser” identifies the body of the table, but not the spanning column headers.

The default settings are such that tabula will try to scrape data from every page in the document.

fia_pdf_scrape2

Individual pages, or sets of pages, can be selected using the pages parameter. For example:

  • extract_tables('Lap Analysis.pdf', pages=1)
  • extract_tables('Lap Analysis.pdf', pages=c(2,3))

Specified areas for scraping can also be specified using the area parameter:

extract_tables('Lap Analysis.pdf', pages=8, guess=F, area=list(c(178, 10, 230, 500)))

The area parameter originally appeared to take co-ordinates in the form top, left, width, height, but it has since been fixed to take co-ordinates in the same form as those produced by the Tabula app's debug output: top, left, bottom, right.

You can find the necessary co-ordinates using the tabula app: if you select an area and preview the data, the selected co-ordinates are viewable in the browser developer tools console area.

Select_Tables___Tabula_concole

The Tabula console output gives co-ordinates in the form top, left, bottom, right; with older versions of the package you may need to do some sums to convert these numbers into the arguments that the tabulizer area parameter wants.

fia_pdf_head_scrape

Using a combination of “guess” to find the dominant table, and specified areas, we can extract the data we need from the PDF and combine it to provide a structured and clearly labeled dataframe.

On my to do list: add this data source recipe to the Wrangling F1 Data With R book…


02 May 09:26

Lower socioeconomic status linked to lower education attainment

by Nathan Yau

Money, race, and success

The Upshot highlights research from the Stanford Center for Education Policy Analysis that looks into the relationship between a child’s parents’ socioeconomic status and their educational attainment. Researchers focused on test scores per school district in the United States.

Children in the school districts with the highest concentrations of poverty score an average of more than four grade levels below children in the richest districts.

Even more sobering, the analysis shows that the largest gaps between white children and their minority classmates emerge in some of the wealthiest communities, such as Berkeley, Calif.; Chapel Hill, N.C.; and Evanston, Ill. (Reliable estimates were not available for Asian-Americans.)

Be sure to browse down to the chart that shows points for race within the same school districts. Color represents race, and connecting lines between dots show the magnitude of the differences between white, Hispanic, and black students.

If you’re interested in the data itself, you can download it from the Stanford Education Data Archive.

See also the education spending map from NPR, which suddenly takes on a new dimension.

Tags: education, race, Upshot

02 May 09:28

Handling context in "outside-in"

by Dries

In a recent post we talked about how introducing outside-in experiences could improve the Drupal site-building experience by letting you immediately edit simple configuration without leaving the page. In a follow-up blog post, we provided concrete examples of how we can apply outside-in to Drupal.

The feedback was overwhelmingly positive. However, there were also some really important questions raised. The most common concern was the idea that the mockups ignored "context".

When we showed how to place a block "outside-in", we placed it on a single page. However, in Drupal a block can also be made visible for specific pages, types, roles, languages, or any number of other contexts. The flexibility this provides is one place where Drupal shines.

Why context matters

For the sake of simplicity and focus we intentionally did not address how to handle context in outside-in in the last post. However, incorporating context into "outside-in" thinking is fundamentally important for at least two reasons:

  1. Managing context is essential to site building. Site builders commonly want to place a block or menu item that will be visible on not just one but several pages or to not all but some users. A key principle of outside-in is previewing as you edit. The challenge is that you want to preview what site visitors will see, not what you see as a site builder or site administrator.
  2. Managing context is a big usability problem on its own. Even without outside-in patterns, making context simple and usable is an unsolved problem. Modules like Context and Panels have added lots of useful functionality, but all of it happens away from the rendered page.

The ingredients: user groups and page groups

To begin to incorporate context into outside-in, Kevin Oleary, with input from yoroy, Bojhan, Angie Byron, Gábor Hojtsy and others, has iterated on the block placement examples that we presented in the last post, to incorporate some ideas for how we can make context outside-in. We're excited to share our ideas and we'd love your feedback so we can keep iterating.

To solve the problem, we recommend introducing 3 new concepts:

  1. Page groups: re-usable collections of URLs, wildcards, content types, etc.
  2. User groups: reusable collections of roles, user languages, or other user attributes.
  3. Impersonation: the ability to view the page as a user group.

Page groups

Most sites have some concept of a "section" or "type" of page that may or may not equate to a content type. A commerce store, for example, may have a "kids" section with several product types that share navigation or other blocks. Page groups adapt to this by creating reusable "bundles" of content consisting either of a certain type (e.g. all research reports), of manually curated lists of pages (e.g. a group that includes /home, /contact us, and /about us), or of a combination of the two (similar to the Context module, though Context never provided an in-place UI).

User groups

User groups would combine multiple user contexts like role, language, location, etc. Example user groups could be "Authenticated users logged in from the United States", or "Anonymous users that signed up to our newsletter". The goal is to combine the massive number of potential contexts into understandable "bundles" that can be used for context and impersonation.

Impersonation

As mentioned earlier, a challenge is that you want to preview what site visitors will see, not what you see as a site builder or site administrator. Impersonation allows site builders to switch between different user groups. Switching between different user groups allows a page to be previewed as that type of user.

Using page groups, user groups and impersonation

Let's take a look at how we use these 3 ingredients in an example. For the purpose of this blog post, we want to focus on two use cases:

  1. I'm a site builder working on a life sciences journal with a paywall and I want to place a block called "Download report" next to all entities of type "Research summary" (content type), but only to users with the role "Subscriber" (user role).
  2. I want to place a block called "Access reports" on the main page, the "About us" page, and the "Contact us" page (URL based), and all research summary pages, but only to users who are anonymous users.

Things can get more complex but these two use cases are a good starting point and realistic examples of what people do with Drupal.

Step #1: place a block for anonymous users

Let's assume the user is a content editor, and the user groups "Anonymous" and "Subscriber" as well as the page groups "Subscriber pages" and "Public pages" have already been created for her by a site builder. Her first task is to place the "Access reports" block and make it visible only for anonymous users.


Place a block for anonymous users

First the editor changes the impersonation to "Anonymous" then she places the block. She is informed about the impact of the change.

Step #2: place a block for subscribers

Our editor's next task is to place the "Download reports" block and make it visible only for subscribers. To do that she is going to want to view the page as a subscriber. Here it's important that this interaction happens smoothly, and with animation, so that changes that occur on the page are not missed.


Place a block for subscribers

The editor changes the impersonation to "Subscribers". When she does, the "Access reports" block is hidden, as it is not visible to subscribers. When she places the "Download report" block and chooses the "Subscriber pages" page group, she is notified about the impact of the change.

Step #3: see if you did it right

Once our editor has finished steps one and two she will want to go back and make sure that step two did not undo or complicate what was done in step one, for example by making the "Download report" block visible for Anonymous users or vice versa. This is where impersonation comes in.


Confirm you did it right

The anonymous users need to see the "Access reports" block and subscribers need to see the "Download report" block. Impersonation lets you see what that looks like for each user group.

Summary

The idea of combining a number of contexts into a single object is not new; both Context and Panels do this. What is new here is that when you bring this to the front-end with impersonation, you can make a change that has broad impact while seeing it exactly as your user will.

02 May 11:39

From the Horse’s Mouth: Councillor Joe Mihevc goes all out for the minimum grid

by dandy

horsesmouthweb

Illustration by Ian Sullivan

In our latest contribution, councillor Joe Mihevc (Ward 21, St. Paul's West) declares his support for bike lanes and the minimum grid. He also hints at Bike Share news, touts infrastructure expansion and discusses collaboration with city staff to build successful cycling projects in the city.

What are the top priorities for bike infrastructure in your ward this year?

Bike Share. The city is expanding the stations this year and there will be a few arriving for Ward 21. An announcement will be made by the city in June, so I will need to leave it at that for now, but expanding the program up the escarpment is a discussion in which I have engaged city staff over the past few years. The potential the expansion has to increase the number of people choosing to make trips by bike is exciting.

Another piece of infrastructure that I am happy to see is that our Ward 21 cyclists' request for a lane along Winona Avenue has made it into the Cycling Division's 10 Year Plan, which will go to the May 16th public works and infrastructure committee (PWIC) meeting and then to city council for approval.

Would you encourage PWIC, the executive committee and council to support the minimum grid?

Absolutely. I am among councillors who signed on to Cycle Toronto's minimum grid goal. I am working to assist in the effort to have the city meet those bike lane targets. I personally met with city cycling staff to discuss the successful motion by council colleagues at the March PWIC meeting, requesting staff include $20 million and $25 million annual options for a budget increase to make the minimum grid happen. That report is part of the 10 Year Plan being considered at the May 16th PWIC meeting - and I will certainly defend the minimum grid all the way to City Council.

Related on the dandyBLOG

From the Horse’s Mouth: Gord Perks on pushing for more from city hall

Bike lanes on Bloor now one council meeting away from becoming reality

Back in the saddle: Hoopdriver re-opens in new location

Celebrating community and building bike infrastructure: Ontario Bike Summit roundup

From the Horse’s Mouth: Councillor Janet Davis on improving cycling in Ward 31

Protecting vulnerable road users protects all of us

Scarborough cyclists get a spring boost with two new bike hubs

From the Horse’s Mouth: Michael Black on Building Better Bike Lanes

From the Horse’s Mouth: Councillor Mike Layton on building community support for bike lanes

From the Horse’s Mouth: Cycling forecast for 2016

From the Horse’s Mouth: Nancy Smith Lea, TCAT

From the Horse’s Mouth: Councillor Joe Cressy on bike projects in Ward 20

From the Horse’s Mouth: Jennifer Keesmaat on the best city projects of 2015, and a look at the year ahead

02 May 11:48

Ontario is investing $20 million in public EV charging stations in 2017

by Rob Attrell

The Ontario government has been ahead of the curve in trying to adapt to the probable revolution looming in car transportation, with initiatives allowing self-driving vehicles to be tested in the province, and a $325 million Green Investment Fund working to reduce the effects of climate change through meaningful, environmentally-friendly initiatives.

This week, the Ontario Ministry of Transportation announced that a $20 million grant program has been created from the Green Investment Fund to build 500 electric vehicle charging stations across Ontario next year. The Electric Vehicle Chargers Ontario program will provide 27 public and private sector groups with funds to create a network of charging stations in places where they’re needed most.

Transportation is the single largest source of greenhouse gas emissions in Ontario, with cars accounting for more atmospheric pollution than iron, steel, cement and chemical industries combined. Currently, there are 6,400 electric vehicles on Ontario’s roads, but with the industry slowly moving toward hybrid and electric models, that number will be growing quickly in the next few years.

The province has set a goal to reduce greenhouse gas emission levels to just 20 percent of 1990 pollution levels by 2050. Making recharging stations more accessible across the province will mean owning an electric vehicle as opposed to a gas-powered one is that much easier.

Source: Ontario
02 May 13:01

BCE to acquire MTS for $3.9 billion, plans to divest one-third of the postpaid subscribers to Telus

by Ian Hardy

Wireless competition in Canada may have just been reduced.

BCE, parent company to Bell’s wireless business, has entered into an agreement to purchase Manitoba-based MTS (Manitoba Telecom Services Inc.) for $3.9 billion.

The transaction is expected to close in late 2016 or early 2017, pending regulatory approvals. As for the terms, BCE has agreed to pay MTS shareholders $40.00 per share, which is reportedly above the current share price by a staggering 23.2 percent and “values MTS at approximately 10.1 times 2016 estimated EBITDA.” The total transaction value is about $3.9 billion: BCE will purchase all shares for $3.1 billion and assume approximately $800 million of MTS debt. The shareholders and boards of directors of both BCE and MTS have agreed to the terms of the deal.

Bell has also agreed to open a western Canadian headquarters in Manitoba, which will employ 6,900 people (MTS currently has 2,700 employees). To show its commitment to the province, Bell will also build out its network in the region and invest $1 billion over the next five years, which will specifically see its Gigabit Fibe Internet be available 12 months after the transaction closes, expand its LTE network and release Fibe TV.

MTS currently has 565,000 subscribers, and the terms also state that Bell will divest a third “of MTS dealer locations in Manitoba to Telus.” This represents approximately 140,000 subscribers going to Telus. Unfortunately, there is no indication as to the dollar amount Bell will be selling the MTS subscribers to Telus for.

When the transaction closes, Bell and MTS have agreed to sell their services under the “Bell MTS” brand name.

Finally, fine print in the release reveals that if the “arrangement agreement is terminated in certain circumstances, including if MTS enters into a definitive agreement with respect to a superior proposal, BCE is entitled to a break-fee payment of $120 million.”

George Cope, president and CEO of BCE, stated, “BCE looks forward to being part of Manitoba’s strong growth prospects, building on the tremendous MTS legacy of technological innovation, customer service and competitive success by delivering the best broadband, wireless, internet and TV services to the people of Manitoba in communities large and small. As the headquarters for the Western operations of BCE, Bell MTS will focus on delivering the benefits of new broadband communications infrastructure, ongoing technology development and enhanced community investment to Manitobans everywhere.”

“This transaction recognizes the intrinsic value of MTS and will deliver immediate and meaningful value to MTS shareholders, while offering strong benefits to MTS customers and employees, and to the Province of Manitoba,” said Jay Forbes, President & CEO, MTS. “We are proud of our history and what we have achieved as an independent company. We believe the proposed transaction we are announcing today with BCE will allow MTS to build on our successful past and achieve even more in the future.”

Source: BCE
02 May 12:00

Understanding Bluetooth Pairing Problems

by Kevin Purdy


We’ve received some complaints from our readers about the Bluetooth devices we recommend acting up, working intermittently, or otherwise failing, especially when multiple devices are involved. The problems include failing to pair, audio hiccups, and recurring dropped connections. The situation usually involves a few Bluetooth devices—say, a phone, a smartwatch, and a car stereo—trying to get along. Sometimes the conflict is between a phone, a fitness tracker, and Bluetooth headphones. Occasionally, the issue is simply a keyboard that’s confused about different iPads. And though we haven’t heard about all three-way troubles, we have some ideas about what’s going on.

02 May 14:44

What Happened to Google Maps?

by Federico Viticci

Fascinating study by Justin O'Beirne on how Google Maps changed from 2010 to 2016 – fewer cities, more roads, and not a lot of balance between them on a map at the same zoom level.

He writes:

Unfortunately, these "optimizations" only served to exacerbate the longstanding imbalances already in the maps. As is often the case with cartography: less isn't more. Less is just less. And that's certainly the case here.

As O'Beirne also notes, the changes were likely made to provide a more pleasant viewing experience on mobile devices.

I understand his point of view – the included examples really make a solid case – but I can also see why Google may consider the average user (looking up points of interest nearby, starting navigation on their phone) and think that most users don't want that kind of cartographic detail anymore.

It'd be interesting to see the same comparisons between Apple and Google, as well as between old Apple Maps and Apple Maps today.

→ Source: justinobeirne.com

02 May 15:58

Change in Metro

by pricetags

Change and development along the rapid-transit lines as part of “Skywalking through Burnaby” on Sunday:

A big hole at the Brentwood Town Centre redevelopment. Is this the biggest single complex in the municipality’s history?


At the Commercial-Broadway station, the new east platform for westbound trains is visible:


Already the station handles more passengers in a day than YVR airport.  With the opening of the Evergreen line later, the crush would be unmanageable without a platform expansion.


02 May 04:00

Tailoring Pants for Square

by Eric Ayers

This week, the Pants project announced a 1.0 release of the open source Pants Build System. The 1.0 release of Pants indicates that the tool is ready for more widespread adoption.

Square is a proud contributor to Pants. Developers at Square have been using and contributing to Pants since 2014 to develop our Java services. When we first joined the project, we found a tool that required lots of customization and insider knowledge to install and operate. Today the tool has a streamlined installation process, extensive documentation, a clean extensible design, a vibrant community and a history of stable weekly releases.

With Pants we get:

  • Reliable, reproducible builds from the current view of the code repository
  • A streamlined development workflow
  • Easy IDE setup
  • Strong integration with third party artifact repositories
  • Consistent results when switching branches
  • A distributed build cache that can be shared with CI builders and developer laptops
  • Lots of built-in tooling to help us analyze our large build graph
  • The ability to define fine grained dependencies between code modules
  • An extensible tool that can grow with our needs

./pants compile service

To understand why Square uses a tool like Pants, it helps to understand our software lifecycle. We use a monolithic codebase (monorepo) for many of the same reasons that Google does.

We build and release services from HEAD of master. Our Java codebase is housed almost entirely in a single repo consisting of over 900 projects and 45,000 source files. In this style of development, we prefer keeping all code at HEAD consistent using global refactoring and rigorous automated testing instead of maintaining strict API backwards compatibility and long deprecation cycles within the codebase.

We also have a strong commitment to using open source libraries. We extensively rely on artifacts published through the Maven Central Repository. When we upgrade a library, we can do so in one place and update the code dependencies for all services.
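As a rough, hypothetical sketch of what this looks like in practice (made-up paths, target names and version, not Square's actual configuration), a Pants BUILD file lets each module declare fine-grained dependencies, while a third-party artifact is pinned in a single place for the whole repo:

# 3rdparty/BUILD -- pin the third-party artifact version once for the entire repo
jar_library(
  name='guava',
  jars=[jar(org='com.google.guava', name='guava', rev='19.0')],
)

# service/payments/BUILD -- an in-repo module declaring its dependencies
java_library(
  name='payments-lib',
  sources=globs('*.java'),
  dependencies=[
    '3rdparty:guava',             # shared, centrally pinned third-party dependency
    'service/common:common-lib',  # fine-grained dependency on another in-repo module
  ],
)

Upgrading the library then means bumping the rev in 3rdparty/BUILD, and every dependent target picks up the change on the next build.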

./pants idea service::

With such a large codebase, it becomes impractical to load the entire repo into the IDE. At Square, we primarily use IntelliJ IDEA to develop code. We use Pants to configure and launch it. Probably the most valuable feature for day to day development is simply having the ability to quickly bring up any portion of the repo in IntelliJ. With a single command, Pants configures IntelliJ to edit a module and all of its dependencies defined in the repo.

Making it easy to configure any project in the IDE means that developers can easily contribute to any project in the codebase. Being able to easily switch between branches encourages developers to collaborate. Now it is convenient to check out each other’s work locally when performing code reviews. It is easier to confine change requests to small, manageable chunks and switch between them while waiting on code reviews to complete.

./pants --help

We came to the Pants project looking for a tool to help solve problems in our build environment. Previously, we used Apache Maven to compile and package binaries. Maven is a powerful and popular tool with a modular design that makes it easy to extend with excellent support from third party tools and libraries. We had a significant investment in Maven, including many custom plugins for running code generation and supporting a distributed cache for artifacts in our continuous integration (CI) build system.

Using Maven with our “build everything from HEAD” policy strains the Maven model. Maven is designed to support editing a few modules at a time while relying on binary artifacts for most dependencies. To support building the entire repo from HEAD, we set every Maven module in the repo to a SNAPSHOT version.

Using Maven in this way works, but has drawbacks. Running a recursive compile of all dependent modules incurs a lot of overhead. We had wrapper scripts to help us try to be productive in this environment, say to just run code generation or only run a subset of tests. Still, developers would get into trouble in some situations, often having to deal with inconsistencies between stale binary artifacts and the source on disk. For example, after using mvn install, pulling in new changes from the repo or switching back to an older branch could leave them compiling against stale code. When developers routinely question the integrity of their development environment, they waste a lot of time cleaning and rebuilding the codebase.

./pants test service:test

Our first priority was to allow developers to quickly configure their workspace in the IDE. Next, we migrated to using Pants as the tool to test and deploy artifacts in our CI builder. As of this writing, we have replaced all of our use of Maven in this repo using Pants, including:

  • Developing on developer workstations and laptops
  • Compiling and testing code in our continuous integration environments
  • Publishing artifacts to a Maven style repository
  • Integrating with third party tools like findbugs and kloc

Replacing all of our uses of Maven was not easy. We were able to do this by generating the Pants configuration using the Maven pom.xml files as a source of truth. During an interim phase we supported both tools. Through collaboration with the Pants open source community, we were able to modify Pants through hundreds of open source contributions.

./pants staging-build --deploy

Pants comes out of the box ready to edit, compile, test, and package code. Beyond that, we were able to leverage Pants' extensible module based system. A current favorite is a small custom task to deploy directly to staging environments over our internal deployment system. Along the way, other custom modules run custom code generators, gather meta information about our build process for our security infrastructure, and package the output of annotation processors into yaml files for our deployment system. Today we have about two dozen internal Pants plugins that do all those things, plus additional tools to audit our codebase, integrate with our CI system, and customize our IDE support.

Ship it!

At Square, Pants has helped us realize the promised benefits of monorepo style development at scale. Making sure the development process is consistent and reliable increases our developer velocity. Being able to quickly and reliably edit, compile, and test allows developers to concentrate on reviewing and writing code, not struggling with configuring tools. We believe that Pants is ready for more widespread adoption and encourage you to give it a try.

02 May 13:31

Managing feedback challenges writers (Survey data)

by Josh Bernoff

My survey of 547 business writers found one big pain point: managing editorial feedback. Only half of business writers get the feedback they need, and only one in three feels that their process for managing feedback works well. Business writers have little love for feedback processes. In this table, the first data column summarizes all the responses; the other columns …


02 May 16:05

Should we end the single-family home in Vancouver?

by pricetags

An item from Ian, found in Vancity Buzz.


Greg Mitchell asks the question:

… let’s rethink the narrative. We know Vancouver is an expensive city in which to live – it seems to be all Vancouverites think about currently (yes, I’m guilty too). And we are obviously limited in terms of our land on which we can develop (mountains, ocean – enough said). So our only option is to densify – but the question is HOW to densify.

He provides, in detail, an alternative:

(Image: four townhouses on a 66-foot lot.)

Those details here.

 


02 May 16:08

Samsung’s IoT SmartThings platform suffers from serious security issues, says report

by Patrick O'Rourke

With the smartphone industry seemingly hitting a plateau when it comes to innovation and, perhaps more importantly, excitement, internet of things (IoT) gadgets have become one of the fastest-expanding and most interesting areas of the tech industry.

In Samsung’s case, however, a report from University of Michigan computer science researchers indicates the South Korean smartphone manufacturer’s IoT SmartThings platform suffers from security issues that could potentially allow malicious apps to operate smart locks, change access codes and set off Wi-Fi-enabled smoke detectors, as well as carry out a variety of other attacks on Samsung’s smart device line.

A malicious SmartThings app, with access to more permissions than necessary, downloaded directly from Samsung’s SmartThings store, is the source of the security issues according to the research. The problem also stems from apps being given permissions that aren’t actually required. For example, a smart lock only needs the ability to lock remotely, but SmartThings’ API links this command with a variety of others.

After installation, SmartThings apps also request additional permissions, allowing them to be linked to different apps installed on the smartphone, a move the researchers say isn’t necessary because it gives the app more access than is required.

Researchers demonstrated their discovery through an app that monitors the battery life of a variety of Samsung SmartThings products. After the user installs the malicious but normal-looking app and grants it permissions on the smartphone, it not only monitors battery life, but also has the ability to manipulate the lock’s functionality. It does this by automatically sending an SMS to the app’s developer each time the user reprograms the smart lock’s PIN code.

A second demonstration showed off an exploit allowing the researchers to program their own PIN code into a door-locking app through a browser. The research revealed that, of the 499 apps that were part of the study, 42 percent have more privileges than are necessary, giving malicious developers ample opportunity to create exploits.

Following this discovery, the University of Michigan researchers say they have reached out to Samsung’s SmartThings team with their findings.

While these exploits do require user interaction, many people swiftly move through the permission section of installing an app without actually realizing what they’re giving the software access to. Researchers say that of 22 SmartThings users they surveyed, 91 percent said they would allow a battery monitoring app to check their smart lock and give the app whatever permissions it requested. However, only 14 percent said they would allow the battery app to send door access codes to a remote server.

In an email to The Verge, a SmartThings representative said the following about the study:

“The potential vulnerabilities disclosed in the report are primarily dependent on two scenarios – the installation of a malicious SmartApp or the failure of third party developers to follow SmartThings guidelines on how to keep their code secure. Following this report, we have updated our documentation to provide even better security guidance to developers.”

“Smart home devices and their associated programming platforms will continue to proliferate and will remain attractive to consumers because they provide powerful functionality. However, the findings in this paper suggest that caution is warranted as well – on the part of early adopters, and on the part of framework designers. The risks are significant, and they are unlikely to be easily addressed via simple security patches.”

 

 

02 May 16:45

The biggest single project in Burnaby history?

by pricetags

I asked below whether Brentwood Town Centre was the largest single project ever seen in Burnaby. Should have checked my e-mail to see that this just came in, via Vancity Buzz:

 


A master-planned community called Concord Brentwood is the latest development from Concord Pacific Developments Inc., renowned for its skyline-defining communities on Vancouver’s False Creek and Toronto’s lakefront.

Concord Brentwood will create a bustling community according to Concord Pacific senior vice president Matt Meehan. “Our next project in Burnaby, Concord Brentwood, will see 26 acres in the Brentwood neighbourhood transform into a beautiful and diverse mixed-use park-side community that completes the exciting revitalization of the Brentwood Town Centre neighbourhood.” …

Designed by award-winning architect James K.M. Cheng of Vancouver, Concord Brentwood will consist of 10 towers, most between 40 and 45 storeys tall. Tower 1 of Phase 1 will consist of 426 units on 45 storeys.

I don’t know if this is a rendering of the massing for the proposal or the final product. But if the latter, the architecture looks pretty blah. I still have no explanation for why in this region there is such a reluctance to use colour, why the palette seems so constrained – off-white or gray, beige and green glass.


02 May 17:03

Harvesting Searched for Tweets Using Python

by Tony Hirst

Via Tanya Elias/eliast05, a query regarding tools for harvesting historical tweets. I haven’t been keeping track of Twitter related tools over the last few years, so my first thought is often “could Martin Hawksey’s TAGSexplorer do it?“!

But I’ve also had the twecoll Python/command line package on my ‘to play with’ list for a while, so I thought I’d give it a spin. Note that the code requires Python to be installed (which it will be, by default, on a Mac).

On the command line, something like the following should be enough to get you up and running if you’re on a Mac (run the commands in a Terminal, available from the Utilities folder in the Applications folder). If wget is not available, download the twecoll file to the twitterstuff directory, and save it as twecoll (no suffix).

#Change directory to your home directory
$ cd

#Create a new directory - twitterstuff - in you home directory
$ mkdir twitterstuff

#Change directory into that directory
$ cd twitterstuff

#Fetch the twecoll code
$ wget https://raw.githubusercontent.com/jdevoo/twecoll/master/twecoll
--2016-05-02 14:51:23--  https://raw.githubusercontent.com/jdevoo/twecoll/master/twecoll
Resolving raw.githubusercontent.com... 23.235.43.133
Connecting to raw.githubusercontent.com|23.235.43.133|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 31445 (31K) [text/plain]
Saving to: 'twecoll'
 
twecoll                    100%[=========================================>]  30.71K  --.-KB/s   in 0.05s  
 
2016-05-02 14:51:24 (564 KB/s) - 'twecoll' saved [31445/31445]

#If you don't have wget installed, download the file from:
#https://raw.githubusercontent.com/jdevoo/twecoll/master/twecoll
#and save it in the twitterstuff directory as twecoll (no suffix)

#Show the directory listing to make sure the file is there
$ ls
twecoll

#Change the permissions on the file to 'user - executable'
$ chmod u+x twecoll

#Run the command file - the ./ reads as: 'in the current directory'
$ ./twecoll tweets -q "#lak16"

Running the code the first time prompts you for some Twitter API credentials (follow the guidance on the twecoll homepage), but this only needs doing once.

Testing the app, it works – tweets are saved as a text file in the current directory with an appropriate filename and suffix .twt – BUT the search doesn’t go back very far in time. (Is the Twitter search API crippled then…?)

Looking around for an alternative, I turned to the GetOldTweets python script, which again can be run from the command line; download the zip file from Github, move it into the twitterstuff directory, and unzip it. On the command line (if you’re still in the twitterstuff directory), run:

ls

to check the name of the folder (something like GetOldTweets-python-master) and then cd into it:

cd GetOldTweets-python-master/

to move into the unzipped folder.

Note that I found I had to install pyquery to get the script to run; on the command line, run: easy_install pyquery.

This script does not require credentials – instead it scrapes the Twitter web search. Data limits for the search can be set explicitly.

python Exporter.py --querysearch '#lak15' --since 2015-03-10 --until 2015-09-12 --maxtweets 500

Tweets are saved into the file output_got.csv and are semicolon delimited.

A couple of things I noticed with this script: it’s slow (because it “scrolls” through pages and pages of Twitter search results, which only have a small number of results on each) and on occasion it seems to hang (I’m not sure if it gets stuck in an infinite loop; on a couple of occasions I used ctrl-z to break out). In such a case, it doesn’t currently save results as you go along, so you have nothing; reduce the --maxtweets value and try again. On occasion, when running the script under the default Mac Python 2.7, I noticed that there may be encoding issues in tweets which break the output, so again the file doesn’t get written.

Both packages run from the command line, or can be scripted from a Python programme (though I didn’t try that). If the GetOldTweets-python package can be tightened up a bit (eg in respect of UTF-8/tweet encoding issues, which are often a bugbear in Python 2.7), it looks like it could be a handy little tool. And for collecting stuff via the API (which requires authentication), rather than by scraping web results from advanced search queries, twecoll looks as if it could be quite handy too.
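As a starting point for that kind of scripting, here is a rough, untested sketch that simply shells out to Exporter.py with the same example arguments used above and then reads back the semicolon-delimited output_got.csv:

import csv
import subprocess

# Run the GetOldTweets exporter exactly as in the command-line example above.
subprocess.check_call([
    'python', 'Exporter.py',
    '--querysearch', '#lak15',
    '--since', '2015-03-10',
    '--until', '2015-09-12',
    '--maxtweets', '500',
])

# The exporter writes its results to output_got.csv, semicolon delimited.
with open('output_got.csv') as f:
    tweets = list(csv.DictReader(f, delimiter=';'))

print(len(tweets), 'tweets harvested')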


02 May 17:14

A Vanishing Vancouver – documented

by pricetags

The latest Sean Ruthen book review in Spacing:


With affordable housing in the Metro Vancouver region now a daily topic of discussion, Vancouver Vanishes – Narratives of Demolition and Revival by local novelist Caroline Adderson is a formidable addition to the conversation. Combining several photos of demolished Vancouver homes, the book’s main events are the stories and essay contributions from some of the heritage community’s most outspoken proponents, including Michael Kluckner—who provides an update since he published his similarly titled Vanishing Vancouver a few years back. The editor of the book provides an alarming snapshot of the predicament we presently find ourselves in, while Kluckner gives us a brief history on how we came to be in this predicament.

More here.


02 May 17:02

My Tablet Has Stickers

by Federico Viticci

Great piece by Steven Sinofsky, who has replaced his laptop with an iPad Pro. There are several quotable passages, but I particularly liked this one:

Most problems are solved by not doing it the old way. The most important thing to keep in mind is that when you switch to a new way of doing things, there will be a lot of flows that can be accomplished but are remarkably difficult or seem like you’re fighting the system the whole time. If that is the case, the best thing to do is step back and realize that maybe you don’t need to do that anymore or even better you don’t need a special way of doing that. When the web came along, a lot of programmers worked very hard to turn “screens” (client-server front-ends) into web pages. People wanted PF-function keys and client-side field validation added to forms. It was crazy and those web sites were horrible because the whole of the metaphor was different (and better). The best way to adapt to change is to avoid trying to turn the old thing into the new things.

This paragraph encapsulates what I went through for the past two years since I switched to the iPad as my primary computer. To this day, I still get comments from a few people who think "I'm fighting the system". And we don't have to look too far back in our past to find the opinions of those who thought the iPad Pro was a platform for people who "jump through more hoops than a circus elephant".

I've been enjoying the wave of iPad enthusiasm that the iPad Pro caused, and I still believe we're just getting started.

→ Source: medium.com

02 May 11:00

The ‘Technique’ of Blackface

by Jason A. Smith


Outrage over the Bob Marley Snapchat filter was swift following its brief appearance on the mobile application’s platform on April 20 (The 420 pot smoking holiday). The idea of mimicking Bob Marley in appreciation of a day dedicated to smoking marijuana enabled users to don the hat, dreads, and…blackface!? News outlets that day covered the issue pretty quickly. CNN.money and The Verge noted the negative reactions voiced on social media in regard to the filter. Tech publisher Wired released a brief article condemning it, calling it racially tone-deaf.

The racial implications of the Bob Marley filter are multifaceted, yet I would like to focus on the larger cultural logic occurring both above and behind the scenes at an organization like Snapchat. The creation of a filter that tapped into blackface iconography demonstrates the complexity of our relationship to various forms of technology – as well as how we choose to represent ourselves through those technologies. French sociologist Jacques Ellul wrote in The Technological Society of ‘technique’ as an encompassing train of thought or practice based on rationality that achieves its desired end. Ellul spoke of technique in relation to advances in technology and human affairs in the aftermath of World War II, yet his emphasis was not on the technology itself, but rather the social processes that informed the technology. This means that in relation to a mobile application like Snapchat we bring our social baggage with us when we use it, and so do developers when they decide to design a new filter. Jessie Daniels addresses racial technique in her current projects regarding colorblind racism and the internet – in which the default for tech insiders is a desire to not see race. This theoretically rich work pulls us out of the notion that technology is neutral within a society that has embedded racial meanings flowing through various actors and institutions, and where those who develop the technology we use on a daily basis are unprepared to acknowledge the racial disparities which persist, and the racial prejudice that can—and does—permeate their designs.

This understanding of technique, when combined with critical race theory, allows us to ask if the presence of blackface in technology is any big surprise in a presumably “post-racial” world. I am positive that any critical race scholar would, without hesitation, answer, “No, it’s not.”  And that’s because we are definitively not post-racial. The intentions behind the filter might have been innocent or playful by developers, but the use of blackface within society has a long and complex history – particularly in regard to its use as a tool to perpetuate systemic racial inequalities in the dehumanizing and “othering” of African Americans in the United States. Hollywood has traditionally been the long time perpetrator of promoting blackface, and variations of it, through utilizing stereotypes that adapt to a given historical moment in society. Yet the racial implications of blackface extend beyond the screens in which we view film. Over the past couple of years tensions brought up over racialized costumes during Halloween and college parties demonstrate the reach and continuation of blackface. With such a contemporary example that has generated conflict within the general public, it seems as if the tech innovators at Snapchat would have known better. I guess that is just wishful thinking. This movement and use of blackface from film, to parties, to the mobile app demonstrates what Ellul meant in regard to technique. The continuation of blackface in our society presently is not necessarily linked to the technologies that produce them, but through the ways in which individuals develop and utilize those technologies. The presumed innocence of using blackface to ‘celebrate’ an individual within a logic of providing ‘daily-new’ filters for consumer use reflects a gross oversight in what blackface means within the larger cultural sphere of public life.

The continued existence of racism in society is undertaken through multiple shifts and debates, in which no actor or institution stands in isolation. This case of the Bob Marley filter only highlights the ways that historical racist images are allowed to perpetuate themselves in the present – becoming not-so-historical in the process as they reincarnate through new mediums. I have no doubt that some cases might be found of individuals using the filter, or commenting on it, in overtly racist ways. Yet, as mentioned above, voices also sprang up to condemn the filter as racially insensitive in various social media and news sites. The technique of blackface is malleable in that it lingers on through practices that are uncritically carried out by tech developers, but those practices are also challenged through other means across various technologies. Unraveling this technique requires disrupting the structural racism that upholds it. Brushing off the filter as a misstep by Snapchat or condemning the developers as socially out of touch, is antithetical to the critical race project, a project that is less interested in identifying those who fail at race relations and more interested in identifying, and subverting, the social conditions that allow racism to persist.

 

Jason A. Smith is a doctoral candidate in Public Sociology at George Mason University whose research centers on the areas of race and the media. His dissertation will look at the Federal Communications Commission and policy decisions regarding diversity in the media for minorities and women. Along with Bhoomi K. Thakore, he is a co-editor of the forthcoming volume Race and Contention in Twenty-first Century US Media (Routledge, May 2016). He is on twitter occasionally.

 

Headline pic: Source (CC licensed and edited by the author)

02 May 17:22

Samsung Galaxy S5 starts receiving Marshmallow update in Canada

by Ian Hardy

Telus’ software update schedule was recently updated and revealed that the S5 would see the long-awaited Marshmallow update today. Well, it’s now available to download.

We’ve received several tips from Canadians across the country informing us that Android 6.0.1 Marshmallow is now available to Telus customers. The update comes in at just over 819MB in size and brings Google Now on Tap, an upgraded app permissions manager, native support for fingerprint sensors and Marshmallow’s new Doze battery-saving capabilities.


Check your S5 by hitting Menu > Settings > About Device > Software Update > Update.

No indication yet from other Canadian carriers.

Update: Bell users are now reporting the update is available to download.

(Thanks to everyone who sent this in!)

02 May 17:21

The Best Android Tablet

by Chris Heinonen

After spending more than 110 hours over two years researching and testing tablets, we think the best Android tablet for most people is the Nvidia Shield Tablet K1. Other Android tablets are thinner or less expensive, but the K1 offers the best combination of speed, display, size, and features. Even if you don’t use its unique gaming features (more on those below), it’s a better value than the competition. It’s also more likely to be usable in a year or two due to its CPU, storage capacity, and stock Android OS, which gets fast updates and doesn’t slow the tablet down with extra overlays or vendor-specific UIs. But if you’re not already invested in Android, an iPad is a better tablet in general.

02 May 17:41

Next-Generation Transportation Webinar Series – May 16

by pricetags

The SFU City Program is starting the Next-Generation Transportation Webinar Series on May 16! These engaging and insightful presenters will inspire you to consider new transportation strategies to improve your communities. More details below!

Next-Generation Transportation Webinar Series

Congestion and Challenge: How cities can effectively respond to transportation demands

$20, May 16, 2016

12–1:30 pm PDT

Speaker: Daniel Firth, Sustainable Urban Mobility Strategy, City of Stockholm

.

There aren’t many cities with full-scale congestion pricing. Singapore, London, and Stockholm are among the best known. What conditions allowed that to happen in a city like Stockholm? What motivated it?

This webinar takes a look at what happened, with someone who was there when it happened. Daniel Firth is project manager for Stockholm’s Sustainable Urban Mobility Strategy. He’s responsible for the implementation of Stockholm’s policies on congestion pricing and strategic parking management. Before Stockholm, Daniel spent five years at Transport for London, working on the implementation and operation of the central London congestion charging scheme.

Daniel will discuss the impediments and the responses—and provide an up-to-date analysis of the results…along with suggestions on what they’d do today if they started over.

Registration/Details


02 May 17:04

How to Raise the Next Mark Zuckerberg

by Alex

Investors and tech geeks love to talk about “unicorns”: tech startups that are worth $1 billion or more. Incredibly, many of these dazzling success stories, like Facebook and Snapchat, have been launched by people in their early twenties. Other multi-million-dollar startups are regularly created by high school and college students.

So why not help your kids achieve that kind of success for themselves? As I write in today’s Wall Street Journal, the same skills that prepare your kids for the possibility of a big startup win will also prepare them for other forms of academic or professional success. For example, you can teach your kids to be problem solvers:

Most great startups address an unsolved problem—but half the art lies in recognizing that there’s a problem to solve. Teach your children to spot those opportunities by treating every complaint as a learning opportunity: Whenever your child complains about a game, site or app (or a real-world toy or experience), ask them how they would make it better. Yes, it’s annoying that you can’t play this videogame with a friend. What features would you have to add to make it a great collaborative game? I agree, this restaurant isn’t great for young people. What would your dream restaurant look like?

Read the full story in today’s Wall Street Journal.