Shared posts

31 Mar 13:59

When it comes to science - it's the economy, stupid.

by Jeff Leek

I read a lot of articles about what is going wrong with science.

These articles always point to the "incentives" in science and how they don't align with how we'd like scientists to work. These discussions often frustrate me because they almost always boil down to asking scientists (especially junior scientists) to make some kind of change for the public good without any guarantee that they are going to be ok. I've seen an accelerating accumulation of people focusing on these issues, I think largely because it is now possible to make a very nice career out of pointing out how other people are doing science wrong.

The issue I have is that the people who propose unilateral moves seem to overlook that science is both (a) a calling and (b) a career for most people. I do science because I love it. I do science because I want to discover new things about the world. It is a direct extension of the wonder and excitement I had about the world when I was a little kid. But science is also a career for me. It matters whether I get my next grant and my next paper. Why? Because I want to be able to support myself and my family.

The issue with incentives is that talking about them costs nothing, but actually changing them is expensive. Right now our system, broadly defined, rewards (a) productivity - lots of papers, (b) cleverness - coming up with an idea first, and (c) measures of prestige - journal titles, job titles, etc. This is because there are tons of people competing for a relatively small amount of grant money. More importantly, that money is allocated by processes that are both peer reviewed and political.

Suppose that you wanted to change those incentives to something else. Here is a small list of things I would like:

  • People can have stable careers and live in a variety of places without massive two-body problems
  • Scientists shouldn't have to move 2-3 times in the first few years of their careers
  • We should distribute our money among the largest number of scientists possible
  • We should incentivize long-term thinking
  • We should incentivize objective peer review
  • We should incentivize openness and sharing
The key problem isn't publishing, or code, or reproducibility, or even data analysis.
The key problem is that the fundamental model by which we fund science is completely broken. 
The model now is that you have to come up with an idea every couple of years and then "sell" it to funders, your peers, etc. This is the source of the following problems:
  • An incentive to publish only positive results so your ideas look good
  • An incentive to be closed so people don't discover flaws in your analysis
  • An incentive to publish in specific "big name" journals, which skews the results (again, mostly in the positive direction)
  • Pressure to publish quickly, which leads to cutting corners
  • Pressure to stay in a single area and make incremental changes so you know things will work.
If we really want to have any measurable impact on science we need to fix the funding model. The solution is actually pretty simple. We need to give out 20+ year grants to people who meet minimum qualifications. These grants would cover the scientist's own salary plus one or two people and the minimum necessary equipment.
The criteria for getting or renewing these grants should not be things like Nature papers or citation counts. They have to be designed to incentivize the things we actually want (mine are listed above). So if I were to define the criteria for meeting the standards, people would have to be:
  • Working on a scientific problem and trained as a scientist
  • Publishing all results immediately online as preprints/free code
  • Responding to queries about their data/code
  • Agreeing to peer review a number of papers per year

More importantly, these grants should be given out for a very long term (20+ years) and not be tied to a specific institution. This would allow people to have flexible careers and to target bigger-picture problems. We saw the benefits of people working on problems they weren't originally funded to work on with the research response to the Zika virus.

These grants would need to be awarded using a rigorous peer review system, just like the ones NIH, HHMI, and other organizations use, to ensure we are identifying scientists with potential early in their careers and letting them flourish. But they'd be given out in a different manner. I'm very confident in the ability of peer review to detect the difference between pseudoscience and real science, or between complete hype and realistic improvement. But I'm much less confident in the ability of peer review to accurately distinguish "important" from "not important" research. So I think we should seriously consider a lottery for these grants.

Each year, all eligible scientists who meet some minimum entry requirements would submit proposals for what they'd like to do scientifically. Each year those proposals would be reviewed to make sure they meet the very minimum bar (are they scientific? does the applicant have relevant training at all?). Among the (very large) class of people who pass that bar, we would hold a lottery. We take the number of research dollars and divide it up to give the maximum number of these grants possible. These grants might be pretty small - just enough to fund the person's salary and maybe one or two students/postdocs. To make this work for labs that require equipment, there would have to be cooperative arrangements between multiple independent individuals to fund and sustain the equipment they need. Renewal of these grants would happen as long as you were posting your code/data online, meeting your peer review requirements, and responding to inquiries about your work.
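
To make the arithmetic concrete, here is a toy sketch of the lottery mechanics (all numbers and names invented purely for illustration):

import random

budget = 2_000_000_000        # hypothetical pool of research dollars
grant_size = 250_000          # salary plus one or two students/postdocs, per year
eligible = [f"proposal_{i}" for i in range(30_000)]  # proposals past the minimum bar

# Fund the maximum number of grants the budget allows, chosen by lottery.
n_grants = budget // grant_size
winners = random.sample(eligible, min(n_grants, len(eligible)))
print(f"Funding {len(winners)} of {len(eligible)} eligible proposals")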

One thing we'd do to fund this model is eliminate or reduce large-scale projects and super-well-funded labs. Instead of having 30 postdocs in a well-funded lab, you'd have some fraction of those people funded as independent investigators right from the get-go. If we wanted to run a massive large-scale program, it would come out of a very specific pot of money that would have to be saved up and spent, completely outside of the pot of money for investigator-initiated grants. That would reduce the hierarchy in the system, reduce the pressure that leads to bad incentives, and give us the best chance to fund creative, long-term-thinking science.

Regardless of whether you like my proposal or not, I hope that people will start focusing on how to change the incentives, even when that means doing something big or potentially costly.


31 Mar 13:59

Non-tidy data

During the discussion that followed the ggplot2 posts from David and me last week, we started talking about tidy data. The man himself noted that matrices are often useful instead of “tidy data”, and I mentioned there might be other data that are usefully “non-tidy”. Here I will be using tidy/non-tidy according to Hadley’s definition. So tidy data have:

  • One variable per column
  • One observation per row
  • Each type of observational unit forms a table

I push this approach in my guide to data sharing and in a lot of my personal work. But note that non-tidy data can definitely be already processed, cleaned, organized and ready to use.

This led to a very specific blog request.

So I thought I’d talk about a couple of reasons why data are usefully non-tidy. The basic reason is that I usually take a problem-first, not solution-backward, approach to my scientific research. In other words, the goal is to solve a particular problem, and the format I choose is the one that makes it most direct/easy to solve that problem, rather than one that is theoretically optimal. To illustrate these points I’ll use an example from my area.

Example data

Often you want data in a matrix format. One good example is gene expression data or data from another high-dimensional experiment. David talks about one such example in his post here. He makes the (valid) point that for students who aren’t going to do genomics professionally, it may be more useful to learn an abstract tool such as tidy data/dplyr. But if you work in genomics, this can mean doing unnecessary work in the name of theory/abstraction.

He analyzes the data in that post by first tidying the data.

library(dplyr)
library(tidyr)
library(stringr)
library(readr)
library(broom)
 
cleaned_data <- original_data %>%
  separate(NAME, c("name", "BP", "MF", "systematic_name", "number"), sep = "\\|\\|") %>%
  mutate_each(funs(trimws), name:systematic_name) %>%
  select(-number, -GID, -YORF, -GWEIGHT) %>%
  gather(sample, expression, G0.05:U0.3) %>%
  separate(sample, c("nutrient", "rate"), sep = 1, convert = TRUE)

It isn’t 100% tidy, as data of different types are in the same data frame (gene expression and metadata/phenotype data belong in different tables). But it’s close enough for our purposes. Now suppose that you wanted to fit a model and test for association between the “rate” variable and gene expression for each gene. You can do this with David’s tidy data set, dplyr, and the broom package like so:

rate_coeffs = cleaned_data %>% group_by(name) %>%
     do(fit = lm(expression ~ rate + nutrient, data = .)) %>%
     tidy(fit) %>% 
     dplyr::filter(term=="rate")

On my computer we get something like:

system.time( cleaned_data %>% group_by(name) %>%
+               do(fit = lm(expression ~ rate + nutrient, data = .)) %>%
+                tidy(fit) %>% 
+                dplyr::filter(term=="rate"))
|==========================================================|100% ~0 s remaining 
user  system elapsed 
 12.431   0.258  12.364

Let’s now try that analysis a little bit differently. As a first step, let’s store the data in two separate tables: a table of “phenotype information” and a matrix of “expression levels”. This is the more common format used for this type of data. Here is the code to do that:

# pull out the expression columns (the ones with digits in their names)
expr = original_data %>% 
  select(grep("[0-9]",names(original_data)))
 
# use the trimmed systematic gene names as row names
rownames(expr) = original_data %>%
  separate(NAME, c("name", "BP", "MF", "systematic_name", "number"), sep = "\\|\\|") %>%
  select(systematic_name) %>% mutate_each(funs(trimws),systematic_name) %>% as.matrix()
 
# build the phenotype table by splitting sample names into nutrient and rate
vals = data.frame(vals=names(expr))
pdata = separate(vals,vals,c("nutrient", "rate"), sep = 1, convert = TRUE)
 
expr = as.matrix(expr)

If we leave the data in this format we can get the model fits (and, with a bit more algebra, the p-values) using some simple linear algebra:

expr = as.matrix(expr)
 
mod = model.matrix(~ rate +  as.factor(nutrient),data=pdata)
rate_betas = expr %*% mod %*% solve(t(mod) %*% mod)
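
Why that one-liner works: for a single gene with expression vector $y_g$ (length $n$) and model matrix $X$ ($n \times p$), the least squares estimate is the usual normal-equations solution, and stacking the genes as the rows of $Y$ gives exactly the matrix product computed above:

\hat{\beta}_g = (X^\top X)^{-1} X^\top y_g
\qquad\Longrightarrow\qquad
\hat{B} = Y X (X^\top X)^{-1}

(since $(X^\top X)^{-1}$ is symmetric, row $g$ of $\hat{B}$ is $\hat{\beta}_g^\top$).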

This gives the same answer after re-ordering (ind is the index that re-orders the tidy results to match the matrix's row order; its construction isn't shown):

all(abs(rate_betas[,2] - rate_coeffs$estimate[ind]) < 1e-5, na.rm=TRUE)
[1] TRUE

But this approach is much faster.

 system.time(expr %*% mod %*% solve(t(mod) %*% mod))
   user  system elapsed 
  0.015   0.000   0.015

This requires some knowledge of linear algebra and isn’t pretty. But it brings us to the first general point: you might not use tidy data because some computations are more efficient if the data are in a different format.

Many examples, from graphical models to genomics to neuroimaging to the social sciences, rely on linear algebra computations (matrix multiplication, singular value decompositions, eigendecompositions, etc.), which are all optimized to work on matrices, not tidy data frames. There are ways to improve performance with tidy data for sure, but they would require an equal amount of custom code to take advantage of, say, C or vectorization properties in R.

OK, the linear regressions here are all treated independently, but it is very well known that you get much better performance, in terms of the false positive/true positive tradeoff, if you use an empirical Bayes approach that pools variances across genes.
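
Concretely, the pooling used below (limma's moderated t-statistic, following Smyth) shrinks each gene's residual variance $s_g^2$ toward a prior value $s_0^2$ estimated from all the genes:

\tilde{s}_g^2 = \frac{d_0 s_0^2 + d_g s_g^2}{d_0 + d_g},
\qquad
\tilde{t}_g = \frac{\hat{\beta}_g}{\tilde{s}_g \sqrt{v_g}}

where $d_g$ is the gene's residual degrees of freedom, $d_0$ the prior degrees of freedom, and $v_g$ the unscaled variance of $\hat{\beta}_g$ from the design.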

If the data are in this matrix format, you can do this in R like so:

library(limma)
fit_limma = lmFit(expr,mod)      # one linear model per gene, fit jointly
ebayes_limma = eBayes(fit_limma) # empirical Bayes shrinkage of the variances
topTable(ebayes_limma)           # moderated statistics for the top genes

This approach is again very fast, optimized for the calculations being performed, and performs much better than the one-by-one regression approach. But it requires the data in matrix or expression set format. Which brings us to the second general point: you might not use tidy data because many functions require a different, also very clean and useful, data format, and you don’t want to have to constantly switch back and forth. Again, this requires you to be more specific to your application, but the potential payoffs can be really big, as in the case of limma.

I’m showing an example here with expression sets and matrices, but in NLP the data are often input in the form of lists, in graphical analyses as matrices, in genomic analyses as GRanges lists, and so on. One option would be to rewrite all the infrastructure in your area of interest to accept tidy data formats, but that would go against the conventions of a community and would ultimately cost you a lot of work that has already been done for you.

The final point, which I won’t discuss here, is that data are often usefully represented in a non-tidy way. Examples include the aforementioned GRanges list, which consists of (potentially) ragged arrays of intervals and quantitative measurements about them. You could force these data to be tidy by the definition above, but again most of the infrastructure is built around a different format that is much more intuitive for that type of data. Similarly, data from other applications may be better suited to application-specific formats.

In summary, tidy data is a useful conceptual idea and is often the right way to go for general, small data sets, but may not be appropriate for all problems. Here are some examples of data formats (biased toward my area, but there are others) that have been widely adopted, have a ton of useful software, but don’t meet the tidy data definition above. I will define these as “processed data” as opposed to “tidy data”.

I’m sure there are a ton more I’m missing and would be happy to get some suggestions on Twitter too.

 

27 Feb 07:06

QFMFT

by AG

Jeremy Rifkin’s Zero Marginal Cost Society is a book that’s come up a few times in discussions here, and while I may have mentioned that I have multiple problems with it — its transparent assembly by interns, the guileless portrayal it offers of the Internet of Things, and particularly some of the lazy methods of argumentation Rifkin occasionally indulges in — it gets one thing so thunderingly right that it is worth quoting at some length.

The following is the best short description of the neoliberal evisceration of the public sphere between 1979 and the present I have ever come across. It resonates with my experience in every particular — and I’ve lived through this, seen it unfold on both sides of the Atlantic. If you were born anytime after, oh, 1988 or so, it will be very useful in helping you understand just what has been done to your world, and to you.

I’ll be honest with you: Sometimes I want to weep for what we’ve lost. Just the enumeration in the very first paragraph is almost overwhelming.

The Reagan/Thatcher-led economic movement to privatize public goods and services by selling off telecommunications networks, radio frequencies, electricity generation and transmission grids, public transport, government-sponsored scientific research, postal services, rail lines, public lands, prospecting rights, water and sewage services, and dozens of other activities that had long been considered public trusts, administered by government bodies, marked the final surrender of public responsibility for overseeing the general welfare of society.

Deregulation and privatization spread quickly to other countries. The magnitude of the capitulation was breathtaking in scope and scale. Governments were hollowed out overnight, becoming empty shells, while vast power over the affairs of society shifted to the private sector. The public, at large, was stripped of its “collective” power as citizens and reduced to millions of autonomous agents forced to fend for themselves in a marketplace increasingly controlled by several hundred global corporations. The disempowerment came with lightning speed, leaving little time for public reaction and even less time for public engagement in the process. There was virtually no widespread debate at the time, despite the breadth of the shift in power from the government to the private sector, leaving the public largely unaware and uninvolved, although deeply affected by the consequences.

For the most part, free-market economists, business leaders, neoliberal intellectuals, and progressive politicians — like President Bill Clinton of the United States and Prime Minister Tony Blair of the United Kingdom — were able to prevail by portraying the market as the sole key to economic progress and castigating critics as old fashioned and out of touch or, worse, as Soviet-style apologists for big government. The collapse of the Soviet empire, with its widespread corruption, inefficiencies, and stagnant economic performance was trotted out at every occasion as a whipping boy and proof positive that the well-being of society would be better assured by placing all the economic marbles in the hands of the market and letting government shrivel to the most rudimentary of public functions.

Large segments of the public acquiesced, in part because they shared a sense of frustration and disappointment with government management of goods and services — although much of the ill feeling was contrived by a business community anxious to penetrate and mine a lucrative economic largesse that had long remained under government auspices and beyond the reach of the market. After all, in most industrialized countries, publicly administered goods and services enjoyed an enviable track record. The trains ran on time, the postal service was dependable, government broadcasting was of a high quality, the electricity networks kept the lights on, the telephone networks were reliable, the public schools were adequate, and so forth.

In the end, free-market ideology prevailed.

After this rather brutal, unremitting account, it is true that Rifkin points us at the global Commons he perceives aborning as a legitimate source of hope. Let us, in turn, hope that he’s onto something. To quote someone I hold in the deepest contempt, there really is no alternative.


27 Feb 07:06

How Quartz Onboards New Users

Quartz just has the best onboarding and you should totally check it out.

27 Feb 07:06

Priority Eight

by Freewheel

The Priority Eight is a belt-driven 8-speed. Photo courtesy of http://www.prioritybicycles.com

The Priority Eight has a "pre-sale" price of $800. Priority anticipates that the bikes will be available by April 1.

According to Priority, the purpose of their bikes is "to make cycling more simple." Their first models were belt-driven "Classic" 3-speeds offered in a diamond or step-through aluminum frame. Priority marketed these models as "low-maintenance" bikes available in black or white.

Since launching those classic models, Priority says it has received overwhelming demand for a bike with more gears, hence the Priority Eight. The Eight, like the Classics, has a Gates carbon belt and an internal gear hub for a clean, low-maintenance ride. It also comes with Tektro hydraulic disc brakes.





There has been some hype about the release of the Priority Eight. You can read about it at Gizmodo and Bike Rumor.

The Priority Eight is available in three sizes: 17", 19" and 21.5". Here are the specs:

  • Frame: Ultralite 6061 T6 Aluminum
  • Fork: Ultralite 6061 T6 Aluminum
  • Rims: Double Wall Alloy, 36h
  • Front Hub: Priority Alloy Sealed
  • Rear Hub: Shimano Nexus 8 Disc
  • Spokes: Black Stainless Steel
  • Shifter: Shimano Nexus 8 Twist
  • Brakes: Tektro HD-M330 Hydraulic Disc Dual Piston
  • Crankset: Gates by FSA
  • Front Chainring: 50t Gates CDN
  • Rear Cog: 24t Gates CDN
  • Pedals: Alloy
  • Bottom Bracket: Sealed Cartridge
  • Seat Post: 27.2 Alloy Microadjust 350mm
  • Belt: Gates Carbon Drive, CDN
  • Saddle: Velo dual density
  • Handlebar Stem: Alloy
  • Handlebar: Alloy, 31.8, 6 degree rise, 630mm
  • Headset: Neco Alloy
  • Tires: Continental Puncture Resistant 700x32
  • Fenders: Alloy Paint Matched
  • Color: Matte Charcoal Grey
  • Sizes: 17", 19" and 21.5"
  • Weight: Approx. 26 lbs
27 Feb 07:05

olloclip Studio iPhone Case/Accessories

by Jeb Brilliant

olloclip released their Studio product today; it is their take on an integrated mobile photography solution. At its simplest level it is a case with clip-on accessories that allow it to mount on a tripod, clip on lights/flashes, and, with one of its main add-ons, use the finger mount.

olloclip is no stranger to photography or specialized cases. They are best known for their clip-on lenses for smartphones, which are considered the best available to the mass market. The Studio takes this to the next level, giving its users the ability to take pictures and video with professional gear attached to their iPhone 6, 6s, 6 Plus, or 6s Plus.

The demonstration given during CES gives a lot of hope that the unit will change mobile videography. I’ve played with the case by itself and it feels good in the hand in landscape position using the finger clip. The accessories are easy to attach and, more importantly, feel secure. They even added the littlest thing, which you don’t see on LifeProof or OtterBox cases: a spot to loop on a wrist strap (included in the box).

As more and more people start to use the Studio, I expect to hear some positive reports, and in all honesty I think we’re going to see a ton of skateboarding videos coming from Studio users. This isn’t for the casual photographer but for someone who takes pictures and video as a hobby or professionally, especially in action situations using the finger clip. On a personal level, this is great for anybody wanting to attach their iPhone to a tripod/monopod without having to change cases. This case could easily be someone’s daily user.

Will you be checking out the Studio by olloclip?

27 Feb 07:05

Why you shouldn’t step on the scale

by Eric Karjaluoto

We all want to see numbers that show our progress. The Quantified Self movement has made this even more common (and can make you a little “measurement-crazy”). Such behavior brings with it some notable problems.

Why even bother?

Let’s say you’re on a diet. You avoid the foods you love, and follow all the rules of this regimen. In the morning, before coffee, you weigh in to check your progress. One of two things is bound to happen.

The first is that you witness a surprising drop in your weight (perhaps a pound in one day). This sets up an expectation that you’ll lose the same amount within the next 24 hours. Imagine stepping on that scale the next day. You’re full of excitement—only to find that you’re up two pounds.

This shouldn’t surprise you. Small data-sets tend to provide wonky numbers. Could be that you ate something dense. Maybe, you’re better hydrated. Perhaps one of your weigh-ins was pre/post-poop. In any event, this number shouldn’t be enough to shake your resolve—but it often does.

The second, and more probable, result is that you see hardly any change from the day prior. This is particularly frustrating. Having that unflinching number stare back at you is an insult—a mockery. The scale is telling you that you skipped that glass of wine for nothing. After a couple of weeks of the same, you’ll probably give up and eat an entire bag of Cool Ranch Doritos in one sitting.

Give us a little time

This same thing happens in business, all the time. You start a new blog, and write your best stuff—but daily visits remain the same. You redesign your brand identity, but no one seems to notice. You make a bunch of sales calls, but no one wants to buy. Again, it’s easy to quit—but if you do that, you’ll never get anywhere.

Remember: just because you decided to make a big change, doesn’t mean the universe is ready for you, yet. So, you need to keep doing the work, and give the rest of us a little time to catch up.

Part of this involves putting on your blinders. Truth is, the amount you lost/gained in the past 24 hours doesn’t matter. What does, is that you form a habit you can maintain over the long term. You can’t let initial data, undue comparisons, or a lack of faith get in your way.

Today’s numbers aren’t that important

So, I suggest you do something different from what most in business suggest. Instead of measuring everything, I say you start by measuring nothing. Now, determine what you wish to achieve and outline a sensible approach. Then create a habit that supports this strategy.

Some will read this and argue that I’m too scared to face real numbers. I say the opposite is the case. It’s easy to follow the same “measure and react” dictum that everyone else does. The brave move is to ignore what the sheep do—and instead follow your own direction. In doing so, you stand to make remarkable change.

If you want to lose some weight, start by going for a half-hour walk at lunch. Or, swap your snack-time cookie for an apple. Your change needn’t be monumental—in fact, smaller ones are often more manageable. You need to do something you can keep doing, every day.

If you want to grow your user-base, work on making one small improvement a day to your app. If you want to increase your average sale, have more conversations with those who buy. (This will help you learn what they want and what you’re missing.) If you want to generate more web traffic for your blog, skip your favorite TV program. Take the time you’ve saved and dedicate it to writing.

Your new normal

At first, this’ll feel weird. Most changes to what you’re accustomed to, do. This sensation will pass, though. In fact, in not so long, you won’t even notice this change. Soon enough, it’ll simply be what you do.

Then do your weigh-in, check your analytics, or whatever. Odds are, you won’t need to, though, because you’ll already know. Your pants will feel looser. Customers will talk about what you do. Or, readers will share your posts.

Few actions have immediate rewards. In the rare instance that there are such gains, these are almost never lasting. Expecting something different for yourself is childish—and will only leave you disappointed.

Habits are different. They amount to little immediate return, but offer substantial results when repeated. So, avoid measuring your progress at the outset. Instead, keep practicing your habits. It might not amount to much, today. Give it a year, though, and this decision might change everything for you.

27 Feb 07:05

Deconstructing Surrey’s LRT survey

by daka_x
27 Feb 07:05

Grist for the Mill

by Tim Ottinger

Let's say that I ask you to calculate all the happy prime numbers between Planck's constant and the speed of light expressed in meters per minute.

Did you immediately start reciting numbers to me? Odds are that you did not.

How accurately can you estimate the time it will take you to find those numbers?

There are a number of problems encountered that slow progress and prevent estimation:

  • Do you really understand what I have asked you to do?
  • Do you know the two constants I mentioned?
  • Do you know how to find prime numbers in that range?
  • Do you understand what a happy number is vs. a non-happy number? (A sketch of these two checks follows this list.)
  • More importantly, what problem are we trying to solve here? What are the happy primes for?
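
For the curious, a minimal Python sketch (purely illustrative) of the "operations" the task assumes; the upper bound, the speed of light in meters per minute, is roughly 1.8e10, far too large to scan naively, which is part of why estimation is hard:

import math

def is_prime(n):
    # Trial division: fine for small n, hopeless near the 1.8e10 upper bound.
    if n < 2:
        return False
    for i in range(2, math.isqrt(n) + 1):
        if n % i == 0:
            return False
    return True

def is_happy(n):
    # Repeatedly sum the squares of the digits; happy numbers reach 1,
    # unhappy numbers fall into a cycle (detected with the seen set).
    seen = set()
    while n != 1 and n not in seen:
        seen.add(n)
        n = sum(int(d) ** 2 for d in str(n))
    return n == 1

# Planck's constant (~6.6e-34) makes the lower bound effectively 2.
print([n for n in range(2, 100) if is_prime(n) and is_happy(n)])
# [7, 13, 19, 23, 31, 79, 97]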

I like to refer to the need for background information as "grist for the mill." It is the raw material of knowledge work.

The brain needs a certain fact base, a set of operations and transformations, and some worthy target so that it can grind away productively.

So we can categorize the above-listed problems as being problems with goals, problems with facts, and problems with operations.

We are given goals (possibly as stories or features) and we are expected to use the facts and operations we have at hand to achieve those goals.

When we have the facts and operations handy, this becomes a more-or-less trivial event. More trivial if the risk of breaking things is low, or less trivial if the risk of unwanted side-effects is high.

The Goal Hole

When we don't clearly understand what we should build then neither teamwork, nor focus, nor all the best data repositories and training videos in the world can help us.

Without a clear, singular, immediate goal in mind, we will see teams thrash and muddle and there will be little "real work" getting done.

This is why domain expertise is so highly valued among software professionals (especially analysts and testing experts). A domain expert can grasp the intention and ramifications of a new software feature in ways that an inexperienced team member cannot.

Having access to the minds, experience, and knowledge base of our colleagues helps us understand goals and learn skills more quickly.

Ultimately, though, such goal-setting relies on understanding the customers we serve and the environment in which they do their daily work. This is why XP teams invite actual customers into the team and work with them every day. Scrum teams tend to proxy the customers with a Product Owner who is empowered to make choices and prioritize work on behalf of true customers.

Clear goals give us focus to
accelerate our learning.

The Skill Hole

Even if we understand the goal, we often find ourselves missing background information (facts) or skills (operations) that are applicable to meeting our goal.

If we are lucky we find our team members have the know-how we need and are available to work with us so that we can simultaneously learn and complete the job.

Sometimes knowledge is not available in our team, and so we dive into Google, StackOverflow, MSDN, or whatever fact-and-operation repositories we have open to us.

One way or another we can usually acquire the skills we need -- but it takes a bit longer, or involvement of more people to achieve it.

The learning time is generally not included in estimates, because we don't know how many skills we need to acquire or how long it will take to be "good enough" to get the job done.

Some people work in a system of individual work assignments, pressure, and tight (sometimes artificial) deadlines. Such a system prevents learning from each other and also discourages taking the time for learning.

Sometimes, lacking facts and operations and also lacking the time to acquire them, developers intentionally or accidentally revise the goal to one that is within reach of their current knowledge set.

After all, if we're right, we will save a lot of time (the "good intention" which paves many diabolical paths).

If we're wrong, then it turns into a requirements interpretation argument between development and testing, or between the development org and its customers.

We can either learn what we need to learn,
or we can muddle through with what we have.

Learning is Essential

Development team members are often asked to do work that they don't fully understand using skills that they do not yet possess.

Projects often slip their dates and code quality often declines when policies and reward structures allow for neither pooled nor solo acquisition of knowledge.

Everyone is left "muddling through" the best they can.

We find this radically sub-optimal for knowledge work.

If we want our teams to do great work consistently, we need to get past the idea that "you should already know everything" and "learning is what you do before you get hired" and move to an atmosphere of continuous learning.

A team that learns is inevitably going to outperform teams that do not. It's a matter of time.

Teams learn faster when they work together instead of working solo on different parts of the same job.

The Smallness of Me, the Enormity of Us

The circumference of a human skull is only about 22 inches, and it contains the knowledge acquired during a single human lifetime (so far).

Outside that same human skull, there are many billion humans all gathering and inventing new ideas, facts, procedures, and goals.

They're working at all times of day and night, and they've been at it for millennia.

Millions of them are now building software. We are constantly finding articles touting "the 15 languages you should learn this year" and "the crucial 45 Java libraries every developer should know."

"Knowing everything you might need to know" has become a human impossibility.

While we value knowledge and experience in new hires, companies tend to place them in structures where their experience cannot be shared to benefit the organization. We are not building the broad and deep pools of competence and insight that we need.

It's hard for large and storied organizations to give up the idea of "getting maximum output per hour worked," but perhaps keeping individual noses to individual grindstones has been preventing the evolution of better working models in our industry.

We take huge risks
by doing all of our learning and thinking in solo "silos."

Pair programming and mob programming make sense because they pool knowledge and understanding. Everyone contributes information, reasoning, skill, and perspective. The job gets done, and the team grows capability for similar jobs.

This leads us to the desirable future where we don't work harder and longer, but instead accomplish our goals more quickly and easily.

Pooling abilities is so crucial in our field that the Agile Manifesto included it as a key principle:

The best architectures, requirements, and designs emerge from self-organizing teams.

Understanding and supporting the ways that people think and learn together may be the competitive advantage that our industry has been (mostly) lacking for decades, as well as the reason that so many agile transitions fail.

That is a little grist for your mill.

Some Questions

  • When your team seems slow, do you consider what it is that you need to learn in order to complete work more quickly?
  • Is your team struggling with a lack of information, a shortage of safe software techniques, or unclear goals and prioritization?
  • Are you ever so busy working around knowledge gaps that you don't have time "left over" for learning?
  • What can you do to encourage necessary learning and sharing?
  • What skill did the team lack that caused you to use work-arounds last month? Are you pursuing that skill?
  • What would happen if you were caught experimenting with a new technique, instead of working directly in production code?
  • When your team seems to be questioning goals, do you sometimes assume they're stalling?
  • Have you noticed a difference in your cognitive function when you have a clear and immediate goal?
  • Have you ever reduced the time to deliver a feature because you acquired some new information, tool, or library?
  • How could your team accomplish its work "more easily" instead of just "working harder"?

The post Grist for the Mill appeared first on Industrial Logic.

27 Feb 07:04

Tagging Twitter Users From Their Twitter Bios Using OpenCalais and Alchemy API

by Tony Hirst

Prompted by a request at the end of last week for some Twitter data I no longer have to hand, I revisited an old notebook script to try to tidy up some of my Twitter data grabbin’n’mungin’ scripts and have a quick play with some new toys, such as the pyLDAvis [demo] tool for analysing topic models in a notebook setting. (I generated my test models using gensimPy3, a Python 3 port of gensim, which all seemed to work okay… More on that in a later post, maybe…)

I also plugged in some entity-extraction API calls for IBM’s Alchemy API and Thomson Reuters’ OpenCalais API. Both of these services provide a JSON response – and both are rate limited – so I’m caching responses for a while in a couple of MongoDB collections (one collection per service).

Here’s an example of the sort of response we can get from a call to Open Calais:

[{'_id': 213808950,
  'description': 'Daily Telegraph sub-editor, Newcastle United follower and future England cricketer',
  'reuters_entities': [{'text': 'follower and future England cricketer',
    'type': 'Position'},
   {'text': 'Daily Telegraph', 'type': 'Company'},
   {'text': 'sub-editor', 'type': 'Position'},
   {'text': 'Daily Telegraph', 'type': 'PublishedMedium'}],
  'screen_name': 'laurieallsopp'}]

Looking at that data, which I retrieved from my MongoDB (the reuters_entities are the bits I got back from the OpenCalais API), I wondered how I could query the database to pull back just the Position info, or just bios associated with a PublishedMedium.

It turns out that the $elemMatch operator was the one I needed to allow what is essentially a wildcarded search into the path of the array of subdocuments (it can be nested if you need to search deeper…):

load_from_mongo('twitter','calaisdata',
                criteria={'reuters_entities':{'$elemMatch':{'type':'PublishedMedium'}}},
                projection={'screen_name':1, 'description':1,'reuters_entities.text':1,
                            'reuters_entities':{'$elemMatch':{'type':'PublishedMedium'}}})


In that example, the criteria definition limits returned records to those containing an entity of type PublishedMedium, and the projection is used to return the first such matching element.
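
(The load_from_mongo helper itself isn't defined in this post; a minimal sketch of what it might look like, assuming pandas and a MongoDB instance on the default localhost port, is:)

import pandas as pd
from pymongo import MongoClient

def load_from_mongo(db_name, collection_name, criteria=None, projection=None):
    # Hypothetical reconstruction of the helper used above: run a find()
    # against MongoDB and return the matching documents as a DataFrame.
    client = MongoClient()  # assumes mongod on localhost:27017
    cursor = client[db_name][collection_name].find(criteria, projection)
    return pd.DataFrame(list(cursor))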

I can also run queries on job titles, or roles, as this example using grabbed AlchemyAPI data shows:

load_from_mongo('twitter','alchemydata',
                criteria={'ibm_entities':{'$elemMatch':{'type':'JobTitle'}}},
                projection={'screen_name':1, 'description':1,'ibm_entities.text':1,
                            'ibm_entities':{'$elemMatch':{'type':'JobTitle'}}})


And so it follows that I could try to search for folk tagged as an editor (or a variant thereof: editorial director, for example), modifying the query to additionally perform a case-insensitive search (I’m using pymongo to query the database):

criteria={'ibm_entities':{'$elemMatch':{'type':'JobTitle',
                                        'text':{ '$regex' : 'editor', '$options':'i' }}}}

For a case insensitive but otherwise exact search, use an expression of the form "^editor$" to force the search on the tag to match at the start (^) and end ($) of the string…
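
That is, the criteria from above becomes (same shape, just an anchored pattern):

criteria={'ibm_entities':{'$elemMatch':{'type':'JobTitle',
                                        'text':{ '$regex' : '^editor$', '$options':'i' }}}}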

I’m not sure if such use of the entity data complies with the license terms though!

And of course, it would probably be much easier to just lookup whether the description contains a particular word or phrase!


27 Feb 07:04

565 Great Northern Way

by ChangingCity

This office development is just completing on the new campus of Emily Carr University, on the south side of the False Creek Flats. Designed by Perkins + Will for PCI, it has around 150,000 sq ft of office space in 7 storeys, with a small single-storey retail building alongside. It was only proposed in early 2016, and the tenants are already fitting out, showing that a project that ticks all the right boxes can get approved and built relatively fast.

The tenants are an interesting mix, including an animation studio, Finning (the Caterpillar equipment dealer, who donated the land to the University consortium many years ago), Samsung, and a shared workspace leasing company.

We’ll look in more detail at the retail kiosk alongside the office later, as it’s an unexpected design, and not yet finished.

27 Feb 07:03

New Features Allow You to Manage Feedback at Scale

by Bonnie Pecevich

At UserVoice, not only do we listen to our customers’ feedback, we also continuously think of better ways to manage it. This, in turn, helps our customers better manage their own feedback. Our newest set of features comes from both customer requests and our own experiences using UserVoice. We’re enthusiastic to introduce a new set of features to help you better manage feedback, and new metrics to provide more context to help you make better product decisions.

Bulk Actions

In the main feedback section, you’re now able to edit multiple suggestions at once (up to 5000!). Simply select which ideas you want to edit and you’ll see a new menu appear in blue that will allow you to:

  • Mark/unmark as spam
  • Delete/restore
  • Move between forums and categories


With this new set of features, we’ve made it easier than ever to organize suggestions and clear spam in one quick sweep. These features are engineered to save you time, especially when you have hundreds, if not thousands, of customer requests pouring in.

Two important notes:

  • Any of these changes can be easily, and instantly, reverted. Simply hit the undo button next to the confirmation message that appears at the top of your screen.
  • Deleting is never permanent so you can always restore suggestions.

We’ll be rolling out these new features to customers later this month. Check out our Knowledge Base to learn more. We love hearing from you so please let us know what you think!

The post New Features Allow You to Manage Feedback at Scale appeared first on UserVoice Blog.

27 Feb 07:03

The Secrets Behind Pinterest’s Outrageous Growth: An Interview

by Sara Aboulafia

Perhaps it’s needless to say, but we’ll say it anyway: product is not only the Product Manager’s responsibility. In fact, involving other team members in strategic product decisions is fundamental to growth. One person who lives this truth? John Egan, Pinterest’s Engineering Manager.

John Egan is a product-oriented Engineering Manager for the Engagement Team at Pinterest, everyone’s favorite visual bookmarking tool. The company is growing at an alarming rate: it crossed 100 million monthly active users in September 2015, 100x the number of monthly active users it had in 2011 (yes, you read that right).

Egan’s team at Pinterest is part of the Growth department, which – after some consideration – was slotted as part of the Product team. The implication of this move is clear: product and growth go hand-in-hand.

After talking with him about Pinterest’s growth strategies, we came away with four solid takeaways that many growth-hopeful companies can use at any stage:

1. Users Over Metrics = Long-Term Dividends

Says Egan, “A lot of organizations say they put users first, but I feel like at Pinterest the company really tries to deliver on that promise. A great example of this is we recently overhauled the designs across all our emails. We decided to make a change to our Send A Pin email to include the Pin someone sent you right in the email, instead of making you click on the email to see the Pin. This change did drop metrics, but we decided to ship the change anyway since we felt it was a much better user experience. This mentality of putting users first is a long term bet that having a good brand that users trust will pay dividends over time.”

2. Growth Teams Need to Be Full-Stack


Says Egan, “Growth teams tend to move at a very different pace than most other product or engineering teams. To be most effective, Growth teams need to be full-stack teams capable of implementing any feature they want themselves, without having to get support from another team that has its own set of projects and priorities. The Growth teams that are full-stack are the ones I’ve seen be most effective.”


“Growth teams that are full-stack are…most effective.” – John Egan, Pinterest


3. Growth Depends on Effective Leadership


Egan says, “A lot of it comes down to having effective leadership that can set growth goals the whole company can get behind, and holding teams accountable for making progress towards those goals. For instance, my first year at Pinterest there was a lot of discussion about how to get more men on the site. The leadership made it a priority for all teams to figure out what they could do to make the product more suitable for males. Some of the biggest wins ended up not coming out of Growth, but from the Search & Discovery team, by personalizing the content we served up based on gender. As a result of these efforts, males are now one of the fastest growing demographics on Pinterest.”

4. On That Note…Cross-Collaboration is Key


Egan says that at Pinterest, “most of our Growth is product-driven growth, rather than marketing-driven growth. This means we drive growth by making changes to the product, such as adding features, tweaking existing pages, etc. As a result, most of our Growth features are the result of a PM, designer, and developer working together to deliver the feature. While each person has their own area of core expertise, you’ll often see engineers making design suggestions or PMs helping an engineer think through a technical obstacle. This sort of cross-collaboration is really ingrained in Pinterest’s culture, and we think it leads to better results than having everyone just complete their part of the project and throw it over the wall to the next person.”


“Cross-collaboration is really ingrained in Pinterest’s culture.” – John Egan


The TL;DR:

Pinterest’s growth seems like an enviable fantasy. But their strategy suggests that when a team responsible for growth is a) product-driven, b) listens to users – even, occasionally, at the expense of metrics – and c) is led by people who can identify relevant teams and people across the company and engage them in strategic goal-setting, impressive growth is really, truly possible.

The post The Secrets Behind Pinterest’s Outrageous Growth: An Interview appeared first on UserVoice Blog.

26 Feb 22:25

Android – 1984.

by windsorr


An Orwellian future beckons for Android.

  • The news that Samsung is finally upgrading the Galaxy S6 and S6 EDGE to Marshmallow (Android 6.0) will come as little comfort to Google, which remains unable to distribute its innovations.
  • Marshmallow and iOS9 became available at roughly the same time but Marshmallow is present on just 1.2% of Google’s devices while iOS9 is on 87% of Apple devices.
  • To make matters worse the previous version of Android (Lollipop 5.0) is present on just 34% of Google’s devices despite having been available for nearly 18 months.
  • 60% of Google’s devices currently run Android 4.x meaning that most of Google’s users do not have access to any of the innovations that Google has made in Android for the last 3 years.
  • To date this has not been a real issue, as Google has still been able to update and distribute its services through Google Play, but as services become more sophisticated, problems begin to arise.
  • Now on Tap is a great example of this.
  • Now on Tap is context based search from anywhere on the device.
  • This can be quite useful for users but also has upside for Google as it will understand what users are doing regardless of whether or not it owns the service.
  • I have long believed that this will enable Google to monetise data collected from services it does not own for the first time.
  • Consequently, this represents meaningful upside in revenues.
  • Unfortunately, this requires Marshmallow, which renders this innovation effectively useless for several years to come.
  • This is why I think Google will end up taking complete control of Android by moving the entire OS into its services layer known as Google Mobile Services (GMS).
  • This is the only way that Google can end the endemic fragmentation that continues to plague its devices as well as take back control of software distribution.
  • The net result is that a Google device will become much like an iOS or a Windows 10 device with absolutely no options for handset makers to make any changes.
  • I suspect that this could also go hand in hand with Google becoming far more prescriptive in terms of hardware to ensure that its software runs properly.
  • While this would be good for Google in terms of improving its user experience and its ability to monetise, it will be yet another nail in the coffin of the long suffering handset makers.
  • Android handset makers have already been reduced to virtual commodities with only Samsung being able to earn more than 2-4% margins on a sustainable basis.
  • I suspect that a number of the Android ecosystems like Xiaomi, Baidu, Tencent, Alibaba, Cyanogen and so on will react to this by creating their own version of Android following in Amazon’s footsteps.
  • The result will be rising costs, as the alternatives to Google will end up having to write more and more of the code themselves.
  • The result will be a series of ecosystems based on a proprietary version of Android making the attainment of scale more important than ever.
  • With life getting harder for those involved in hardware, I would still consider only being involved with either an ecosystem or a value-added technology supplier.
26 Feb 22:21

Exerting Influence, Establishing Social Norms, And Driving Value

by Richard Millington

It’s naive to believe that an active community will automatically benefit the host organisation.

It’s more likely the organisation picks up the tab while members reap the rewards.

If those rewards (belonging, support, influence, and exploration) overlap with the organisation’s goals (advocacy, retention, innovation, reduced costs), the organisation benefits. But direct overlaps are rare and there are usually many hops between an organisation’s goals and that of its members.

It’s no different to hosting an open bar: you might generate some goodwill, gain a few good leads, and have a slightly more influential voice in the discussion – but there isn’t usually a direct connection to value.

Communities only create value if they establish or change the social norms of their members.

There is a BIG gap between building a community and creating social norms. To cross that gap you have to do what so many people are reluctant to do, exert influence.

Reluctance To Exert Influence

If you don’t exert influence, you’re probably not going to get much value.

You’re already exerting influence too. You have already established what behavior isn’t allowed. We just want to nudge that further along the road.

If you leave the community solely to its own devices, you’re resting your odds of success upon serendipitous luck. And if the community doesn’t generate clear value, it’s usually shut down.

We need to exert influence to establish social norms that benefit the organisation.

How Social Norms Emerge

Social norms emerge from three things:

1. What individual members believe (individual beliefs). This affects what people are likely to do when they initially join the group and how they intend to participate.

2. What members see others doing (observable behavior). This suggests what behavior is appropriate within the group.

3. How others respond to what we do (learned behavior). We pick up on implicit and explicit cues about socially desirable behavior within the group. This helps shape and normalize behavior.

These are your three tools for exerting influence.

Your ability to exert influence comes down to recruiting members whose beliefs most closely align with the social norm you’re attempting to create, showcasing the desired behavior to others, and nurturing the social norms you do see emerging.

3 Approaches To Developing Social Norms

These give you three possible approaches to developing social norms.

Your use of each depends upon the stage of the community lifecycle. We can put this under three broad approaches, (1) nurturing, (2) foment and foster, (3) establish and enforce.


In a little more depth.

1. Establish and Enforce A Social Norm

I used to work at a private club which explicitly advised against suits and cell-phone use. This was a big draw for a particular group. We were attracted by this norm, embraced this norm and even helped reinforce it. Newcomers would see how other people dressed, would react if they made calls in the venue, and over time embraced the norm as their own.

Setting a social norm gives you the most control. This works best for new groups; it’s hard to force a new social norm from the top down upon an existing group. Most stories of failed organisational change are stories of failed top-down efforts to change behavior. Setting the social norm works best when you are creating a new group and you have either 1) a large amount of pre-existing credibility with the group or 2) a deep understanding of what would resonate with the group. For the latter, you need a decade of experience or a lot of in-depth research.

Setting a social norm can work well to splinter fringe members from existing groups to your cause.

2. Foment And Foster Behavior That Spreads

This is where you work with a few close associates (friends, colleagues, or believers in the social norm) to exhibit the new behavior. This only works if they like you and believe in the behavior.

This usually means they have a meaningful input into what this social norm should be and it’s developed by consensus. Your level of control is thus less. You can’t dictate the norm.

A good example would be things like purchases, book reviews, and participation. If on the day you announce an event you have 23 people primed to tweet or share that they plan to attend, this can quickly become a social norm for the group. But you have to get buy-in from those 23 people first. And the best way to get buy-in is to ask them what should happen at the event.

They then positively respond to one another and to others who also plan to attend the event.

This works in online communities as much as offline. If you want colleagues to share more information, it’s a good idea to establish the quality of information you want with a few close colleagues first, and plan out a month of responding positively to one another’s contributions that meet that standard, until it becomes an established social norm.

3. Nurture existing behavior.

Here you look for pre-existing activity which most closely resembles the behavior you want to see and give it more attention. That attention means featuring it, showcasing it, mentioning it to others, writing about it, etc. The Heath brothers called this looking for the bright spots.

This works best in large groups with a high volume of activity. You sift through the activity to find the behavior you can nurture. Once you find the behavior you like, you can turn it into a sticky-thread, have senior people respond to it, and link to it from other articles.

You repeat this process as the behavior gradually drifts closer to the social norm you want.

Two key takeaways.

1. If you don’t exert influence, you’re unlikely to extract value. This is fine if you’re hosting the community as a charitable endeavour. But don’t be surprised if your budget (or role) gets cut when money is needed elsewhere. The more you can demonstrate value, the more resources you have to further develop the community.

2. Diagnose your community size and control before exerting influence. Your ability to foster social norms depends largely upon the size and maturity of the community. The more established the group, the less influence you have. If you’re not having much success right now, you’re probably using the wrong strategy.

Good luck.

p.s. We invite you to learn more about Advanced Engagement Methods. This is our 12-week intensive online program which will equip you with the skills, knowledge, and resources you need to drive quality engagement.

Registration closes on Feb 29, learn more here: http://www.feverbee.com/aem

19 Feb 08:30

Why I’m not lonely

by Josh Bernoff

For 33 years I worked in an office. For the last year I’ve worked on my own. I never worried about money, but did worry that I’d miss the social side of work. I don’t. I used to be part of something bigger, working with an agenda set, mostly, by my senior managers. Now I have a bigger … Continue reading Why I’m not lonely →

The post Why I’m not lonely appeared first on without bullshit.

19 Feb 08:28

And The Weakest Link In Mobile Security Is…

by Martin

… the PIN.

Today I read in the news that a smartphone manufacturer has received a “friendly” invitation letter from US law enforcement asking them to help decrypt a terrorist’s encrypted phone. The encryption key itself is protected by the PIN the user has to type in, plus software that keeps increasing the delay between guessing attempts. I am sure the company sympathizes with the general idea of decrypting a terrorist’s device but sees itself unable to comply with the request, as this would also significantly weaken security and privacy for the rest of us. If they let someone do it with that phone, it can be done with others, too. Once in the wild…

While most of the media is discussing the pros and cons of the move, there is a deeper issue here that nobody seems to think about: a simple 4- or 6-digit PIN and a bit of software should not protect the ciphering key in the first place.

The mere fact that software could perhaps be written to get around the increasing delays after incorrect PIN entries, and even to let a program guess the PIN much faster than guessing attempts via the touch screen, clearly shows that a PIN, even a 6-digit one, does not really protect encrypted data on a device. Other three-letter agencies can make similar requests with a gag order, so the story would never even have come out in the first place.
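To put rough numbers on this, here is a back-of-the-envelope sketch in Python. The guess rate is an illustrative assumption, not a measurement of any particular device:

# Back-of-the-envelope: how long a brute-force search takes once the
# retry delays are bypassed. The 10 guesses/second rate is an assumed,
# illustrative figure, not a measurement of any real device.

def worst_case_days(keyspace: int, guesses_per_second: float) -> float:
    """Worst-case time (in days) to try every possible candidate."""
    return keyspace / guesses_per_second / 86_400  # seconds per day

cases = {
    "4-digit PIN": 10 ** 4,
    "6-digit PIN": 10 ** 6,
    "8-char lowercase password": 26 ** 8,
    "12-char alphanumeric password": 62 ** 12,
}

for name, keyspace in cases.items():
    print(f"{name:30s} ~{worst_case_days(keyspace, 10.0):,.2f} days")

At that rate a 6-digit PIN falls in little more than a day in the worst case, while even a modest passphrase pushes the search into centuries.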

The only real protection is a longer password, entered when the device first boots, that is independent of the weaker PIN the user types in when unlocking the device during normal operation. If the user then selects a really long and strong “power-up” password, his data is much safer than it is now, where the only thing between safety and vulnerability is a company’s current policy of not compromising its users’ safety and privacy.
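To illustrate why the power-up password helps: a deliberately slow key-derivation function can stretch a long passphrase into the actual disk-encryption key, so every guess carries a real cost. Here is a minimal sketch, assuming (hypothetically) PBKDF2; real devices use hardware-backed, vendor-specific schemes:

import hashlib
import os

# Hypothetical illustration only: derive a disk key from a long
# power-up password using PBKDF2. Real devices use hardware-backed,
# vendor-specific key-derivation schemes.

def derive_disk_key(power_up_password: str, salt: bytes) -> bytes:
    """Stretch a long power-up password into a 256-bit disk key.
    The high iteration count makes each brute-force guess expensive."""
    return hashlib.pbkdf2_hmac(
        "sha256",
        power_up_password.encode("utf-8"),
        salt,
        200_000,  # deliberately slow; tune to the device's CPU
        dklen=32,
    )

salt = os.urandom(16)  # stored alongside the encrypted data
key = derive_disk_key("a genuinely long and strong power-up passphrase", salt)
print(key.hex())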

Fortunately, I discovered quite some time ago that Android does indeed offer this split: a separate password for power-up and a PIN / swipe gesture / etc. for device unlock. A much better approach!

19 Feb 08:26

Boys play with dolls, and girls play with spaceships. Someone tell the toy makers.



Rebecca Hains, Washington Post, Feb 19, 2016


I think of this as an education issue, not a toy issue. Games and toys are among the most important educational devices we build. And so I wonder why it is so important for toy builders in the United States to create completely separate lines of toys for boys and girls. They may as well be taught in separate schools, not interacting with each other at all! It is arguable that this separation is responsible in part for the skew toward boys in the sciences, and for some of the blatant misogyny we see in the technical community. Because, after all, education does have an effect on behaviour. I agree with Dan Nessell, who has lobbied to end the boy and girl awards for toy manufacturers. We should end 'boy and girl' toys altogether. When we address these issues in our own society, we will be on firmer ground when we address them in others.

19 Feb 08:26

The Pathology of Platforms



Matt Ratto, Medium, Feb 19, 2016


This dense and needlessly obscure prose hides a line of thought that is of some interest. The concept of 'making' has become trendy in recent years. But what is the purpose of 'making' something? Matt Ratto argues that it should not be defined in purely utilitarian terms, nor from the perspective of exhibition. These both deflect our attention from what is being expressed by the act of making. Such a perspective makes it difficult to reflect critically on 'making'. For example, what does one say about the 3D printing of a working gun? The suggestion is that it is an attempt to 'naturalize' the printed gun - "to turn them into 'matters of fact' rather than 'matters for concern.'" But the mere creation of technological artifacts does not give the related ideas any special voice. That's nothing but technological determinism. So what does it mean? That's where the idea of the 'platform' comes in. It's like a shared environment, like a workbench, where we can try things out. We don't have to agree on what it means, but we can use it to ask what is normative (what ought to be the case) and what is problematic. And the printing of a 3D gun is an "attempt to redefine what is normal and what is pathological in society." It's a move in a conversation, not the creation of a new fact.

19 Feb 08:26

Personal and Personalized Learning

It may be preferable to embrace an alternative to personalized learning, which might be called personal learning. In the case of personal learning, the role of the educational system is not to provide learning; it is to support learning. Meanwhile, the decisions about what to learn, how to learn, and where to learn are made outside the educational system, and principally by the individual learners themselves.

Feb 17, 2016
19 Feb 08:26

Encrypt

files/images/Privacy.JPG


Various authors, Mozilla, Feb 20, 2016


"Choosing what's public and what's private allows you to be you." That's the tagline from this short video from Mozilla. It's part of an effort to make internet users more aware of personal privacy and security, and to support privacy-preserving mechanisms in web browsers and websites. Mark Surman writes, by email, "We expect there are a number of significant political battles on surveillance and encryption coming down the pipe. We’ re going to need to ask people like you and your friends to take a stand. Before we do this, we want to provide a solid grounding on why encryption matters." (Note that I've removed the 'utm' tracking information from the link that Mozilla sent me.)

19 Feb 08:24

Interview with Leslie Van Duzer

by jamesavbligh

For my week as guest editor of Price Tags, I intend to view Vancouver from an architectural perspective. To that end, I will be releasing an interview with an architect, planner, or academic each day. Each person has been selected for his/her unique and timely perspectives on the city. Our discussions will highlight each person’s practice along with their notions of city building and form in Vancouver.

I am sitting in Vancouver’s only honourable bagel bakery, Siegel’s Bagels, on Cornwall and Cypress, with Leslie Van Duzer. Leslie recently completed her five-year term as Director of the School of Architecture and Landscape Architecture (SALA) at UBC. During her current administrative leave, she helped launch the new Urbanarium and is continuing to work with colleagues on a series of monographs showcasing Vancouver’s most at-risk West Coast Modern architecture. Leslie graciously offered the time to join me for a chat about Vancouver.

JB: SFU has positioned itself well to study cities with their multiple downtown campuses and lecture series, such as City Conversations. What do you see as UBC’s role when it comes to advocating for urban design in Vancouver?

LVD: Indeed, SFU has done an extremely good job on this front, but UBC also does a great deal to advocate for a more sustainable, affordable and liveable city.

SALA makes a significant, ongoing contribution to the conversation. For example, every semester it sponsors a public lecture series, and its Advisory Board provided the seed for the new Urbanarium, an independent non-profit organization launched on January 20th. It is the Urbanarium’s aspiration to have a physical space for public lectures, debates and exhibitions on urban issues. Surprisingly, unlike most other major cities, Vancouver currently has no such dedicated space.

SALA faculty and students also do a lot of research and design work that has been, and can continue to be, of real value to the community. By way of example, there was a timely recent thesis by Mahbod Biazi who conducted comprehensive research on housing in the West End and designed thoughtful proposals for infill housing. This thesis should be part of any conversation about development in that neighbourhood.

JB: You recently published your first monograph on West Coast Modern architecture, entitled House Shumiatcher. This monograph will be the first in a series dedicated to the documentation of endangered West Coast Modern homes. How did this series come about and how has the reception to this work been?

LVD: When I first moved to Vancouver, the Shumiatchers were among the first people to welcome me to the city and into their home. I frequently visited their wonderful house for afternoon teas and dinners. When it came time for them to sell the house, they anticipated it might be destroyed due to escalating land values. So we set out to document the house, to be sure there was a record of its existence.


House Shumiatcher – Photograph by Michael Perlmutter

We found a publisher in San Francisco who was interested in publishing a whole series of monographs on endangered houses. I identified two SALA colleagues, Sherry McKay and Chris Macdonald, willing to join me in curating and editing the series. Swedish-American photographer Michael Perlmutter was hired to photograph all the houses and Canadian-Argentinian graphic designer Pablo Mandel will design all the books. I want to tip my hat to architect Joe Wai who encouraged the school to do more to document and help preserve this special architectural heritage. The Downs House book by Chris Macdonald will be the second in the series; it is being printed now. There has been a very positive reception from the professional and heritage community and that will only grow as more books are published. We have a series of ten planned, for starters!

JB: You are working on a project entitled “The Village Model,” a documentation of the aging-in-place movement. What insights can you draw in Vancouver between this movement, our tax deferral system, NIMBYism, and densification?

LVD: First, I need to explain what the Village Model, or better, The Village Movement, is. Few in Canada know about it, but it is a phenomenon that is sweeping across the United States.

Fourteen years ago, the first Village was established in the Beacon Hill neighborhood of Boston. Six years ago, there were 50 villages open and 50 under development. Today, there are 190 villages open and another 185 under development. That is a total of 375! The basic purpose of a Village is to allow people to age at home as long as possible. Not only is this desirable from the point of view of many seniors, it is also a cost effective model for addressing the mounting needs of an aging population and an extremely effective strategy for more general community building. Villages are membership organizations, generally established through grassroots efforts of a group of neighbors. While this is the prevailing story, as the model catches on, some villages have been founded by religious institutions, municipalities, even health care companies.

If you are a member of a village you have access to “one-stop shopping” for vetted professional services, from home health care providers to dog walkers. This is invaluable for anyone who is new to a community or simply finds themselves needing services they never required before. More importantly, members participate in social, cultural, educational and wellness programs together, so they gain peer-to-peer support, build new friendships, and have improved health and levels of engagement. There is a wealth of volunteer opportunities, so members have not only the opportunity to be served, but also to serve. Volunteers are also often recruited from the general community to allow for productive inter-generational exchange. Sorry to go on so long, but I cannot say enough about this model as one solution to the temporary swell in our aging population.

It is important that we don’t overbuild for this temporary swell in the number of seniors, and the infrastructure we do build for them should be designed flexibly (I like to say illegibly) to accommodate new programs in the future. We need to start thinking more like this so we are not stuck with obsolete buildings just 20 or 30 years later.

Back to your question, we need to address densification, affordability, and an aging population by thinking more creatively about long-term solutions that address multiple issues simultaneously. And regarding NIMBYs, well, the NIMBY chair at the working table needs to be removed.

JB: You are working on another project entitled “The Art of Deception,” an exploration of the parallels between magic and architecture. Can you talk about that work and any magic moments you have found in Vancouver?

LVD: Well, this interest dates way back to my teenage years as a magician’s assistant. My brother has been a magician for half a century, since he was a little boy. His book, “The Seven Basic Secrets of Illusion Design,” got me thinking about the secrets of magic as they relate to architecture. My brother and I subsequently taught three architecture design studios together using the tricks of the magician in the design of buildings. We became interested in more than the mechanical tricks and started exploring blind spots, misdirection, the role of memory and expectations, and many other fascinating topics. In one studio, the students had to pick a UNESCO World Heritage site and redesign the tourist infrastructure using the magician’s secrets. These were surprisingly practical lessons for the students.

The intention of my project is to illustrate the positive value of misperception. I am interested in the most subtle possible application of magic in architecture. The built world is so formulaic. If you are gently disruptive with your designs, you can wake people up by giving them a moment of wonder.

To answer your question, the most magical moments I have experienced in Vancouver have been at home, living in a tower. Imagine a reflection of a building behind my view creating a phantom building in my view, or the morning sun bouncing off the tower across the street and back into my west facing apartment. Imagine a seagull flying between the setting sun and my window casting a giant flying shadow, like a Pterodactyl, in my space. And how strange that my corner unit in the Electra Building appears from within to cantilever dramatically over Burrard Street. I could go on and on, but these have been the most magical architectural encounters I have had in Vancouver.

JB: You are also proposing a collection of essays entitled “Philanthropy and City Building.” How has the role of philanthropy in Vancouver changed between the booming years of MacMillan Bloedel and today?

LVD: I cannot speak to earlier times in Vancouver, at least not yet. I arrived in Vancouver five years ago and have just taken up this topic out of concern for the current dearth of philanthropy for public city building initiatives. I moved here from Minneapolis where there is an extremely healthy culture of philanthropy, one that has allowed the city to realize many important new cultural buildings in recent years.

When I stepped down as Director of SALA, a patron gave the school a major donation to allow me to continue my outreach efforts. I am using the first installment of that donation for the City Debates, a collaboration with the new Urbanarium. Next spring, I will use the second installment to explore models of philanthropy from other cities that could inspire Vancouver. Among other aspects, I am interested in understanding how to develop a culture of philanthropy among younger people, how Vancouver might become a corporate headquarter town, and how to create the sense of engagement in the community that leads to targeted collective giving. I hope the series of talks, and the edited volume to follow, will have a positive impact locally and beyond.

JB: Do you have any particular soap box that frustrates you when it comes to architecture and urbanism in Vancouver?

LVD: I am frustrated by the lack of a rigorous, informed and unemotional debate. My problem with public hearings is that they generally focus on particular developments and the people that come out are those most personally affected by the projects. This is of course understandable, but as our big issues cannot be resolved site by site, neighbourhood by neighbourhood, even city by city, our big challenge is to figure out how we can engage the public sincerely while dealing with extremely complex regional problems that challenge even the most expert.

 I am also angered by the censorship here. I grew up in Berkeley, California in the 1960s and 1970s and am the poster-child for free speech and academic freedom. When I was the Director of SALA, I was asked more than once by city officials to silence a few of my outspoken colleagues. To my mind, this is outrageous and threatens the basic tenets of democracy.

It troubles me a lot to hear experts maligned by the public. If you are planning something as complex as a city or a region, don’t you want experts at the helm? We should remember that our politicians, planners and other experts are also citizens who care deeply about the future of this city and region.

JB: If you could make any intervention into the fabric of Vancouver, what would you add, subtract, or modify?

LVD: I would love to work with the Vancouver Board of Parks and Recreation to organize a competition for young designers for beautiful new park infrastructure. There are so few architectural gems in our public realm, and given the opportunity, I have no doubt properly paid designers would pour their hearts into these projects. The work of Bill Pechet is one very good example of a designer committed to improving the public realm with delightful interventions. Think of our waterfront parks and flash to other cities like Barcelona; we have a long way to go but no shortage of great designers to take us there.


Bench, Bill Pechet – Pier Furnishings and Lighting

You can learn more about Leslie and her work from her UBC School of Architecture and Landscape Architecture bio: www.sala.ubc.ca/people/faculty/leslie-van-duzer


19 Feb 08:08

Who speaks for those who are not already here?

by jamesavbligh

I recently attended the Vancouver City Planning Commission event, A Chronology of Vancouver’s Planning and Development, at the Museum of Vancouver. The event was moderated by Stephen Quinn, and panellists included Frances Bula, Noha Sedky, Jennifer Marshall, and Bob Rennie. An incendiary Q&A period followed, mostly directed towards frustration over where and how to introduce density in Vancouver.

From my experience in these kinds of settings, the vast majority of questions that get raised are from implicated parties who are defending their own interests. Further, it is rare for someone under 40 years old to stand up and voice their opinion.

For me, it was a breath of fresh air to see a young student stand up and talk about biased demographics and city-building. Andrew Martin is a Master’s student in Community and Regional Planning at SCARP, UBC. A transcript of his question follows. I hope you find it as lucid as I did:

We live in a region and country in transition; I suspect most of today’s Canadian-born land owners grew up in single-detached family homes, an experience that is less common today as Canada becomes an increasingly dense, urban nation.

As a result, we’re experiencing growing pains, as every great city of the world has also experienced.  In my lifetime, Metro Vancouver will likely grow to be a region of 5 million inhabitants, and it won’t stop there.  The majority of those 5 million aren’t here right now to speak for themselves, but we’re making key decisions that will profoundly affect their lives.

Are our decisions serving their best interests?  Who speaks for them?  

The challenge here is to accept the reality of growth, and undertake this transition in the most just and sustainable way possible, with attention to creating quality living environments conducive to human well being.  Darwin said: “It is not the strongest of the species that survive, nor the most intelligent, but the one most responsive to change.” As a region, we have done this well in the past.  Can the residents of metro Vancouver now embrace this next evolution with rationality, optimism and careful consideration for all residents, now and to come? 

-Andrew Martin

For older generations, the notion of single-family home dwelling is considered the norm. For mine and Martin’s, it has become less so. Will it become significantly easier to give up on preserving our pool of single-family homes in Vancouver as new generations take over, and what prospective future does that suggest for our dwellings?


19 Feb 08:08

Local Lettering

by jamesavbligh

I had the pleasure of meeting typographer Alanna Munro during one of Mark Busse’s Creative Mornings. Alanna is working on an outstanding project where she is slowly digitizing the hand-made and unique fonts that have appeared on some of Vancouver’s oldest and most at-risk architecture. The signs, by nature of their crafting, have come to represent an embedded and identifiable shared memory in our urban fabric. Alanna describes her work below:


There are small pockets of Vancouver that are home to golden letters that live proudly on the door of the building they name. I started a project called Local Lettering in order to study and catalog these letterforms before they aged, disappeared, or got knocked down. 

One of the really interesting things that this project highlights is the relationship between a building and its name. The modern buildings going up around the city always have a name for marketing purposes, but when the building is done this name becomes more of an afterthought, sliding into the background, sometimes never making it onto the building facade. Eventually, people will forget what that building was called, just as we have forgotten the names of the apartment buildings which have lost their golden letters.

For the buildings that have lost their letters, some have been replaced with cheap vinyl letters which no longer speak the same language as the architecture. As a work of design, the building has lost some of its consistency and character. I think this idea leaves a big opportunity for architects to work with typographers to make sure every detail of their building is speaking the same language, and nothing is added as an afterthought.

-Alanna Munro


19 Feb 06:48

The Best Electric Vehicle

by Eric Evarts

After putting in nearly three months of research, interviewing three leading experts on electric vehicles, and driving every EV currently available for sale in the US, we can say that the Kia Soul EV, which costs about $34,000 (or $26,500 after a federal tax incentive), is currently the best electric car for drivers who want to go completely gas-free. Of all the electric cars we researched, drove, charged, and drove again, the Soul EV has one of the longest practical driving ranges at 93 miles (exceeded only by the Tesla models and nominally by the Nissan LEAF) and costs only about as much as a well-equipped compact car in the end. It also comes with the best collection of high-tech features to help maximize its range. The Soul EV is a comfortable, usable everyday vehicle thanks to the fact that it’s based on an already good car (we picked the gas-powered version as our favorite subcompact crossover SUV) designed from the get-go with electrification in mind. Quiet and zippy, the Soul EV drives better than most others, and it has access to the largest public fast-charging network in the country (though that network is mostly concentrated around cities). Lastly, Kia offers the best warranty among all affordable-EV makers. Yes, the Soul’s design is polarizing, and this electric version is sold in only 10 states, but those drawbacks are not enough to bring it down from the top of the mountain.

19 Feb 06:46

The Freedom 251 is the world’s cheapest Android smartphone at $3.7

by Rajesh Pandey
Smartphone prices have been falling with every passing month, but a company — Ringing Bells — has taken things to a completely different level with its Freedom 251 smartphone, which was unveiled in India today. The handset costs a mere Rs. 251 (roughly $3.70) and runs on Android 5.1 Lollipop, making it the cheapest Android phone ever launched. Continue reading →
19 Feb 06:44

This is what Samsung thinks the future of tech will look like

by Patrick O'Rourke

Samsung predicts humans will eventually colonize the moon.

The company’s recent SmartThings Future Living Report, authored by a team of leading academics, has an interesting view of what the future of our world will look like and gives us a subtle glimpse at Samsung’s forward-thinking tech plans for the future.

Just to highlight a few sci-fi-esque elements from the report, Samsung envisions a world filled with super skyscrapers constructed of carbon nanotubes and diamond nanothreads, though it’s unclear how these materials will actually become reality. Earth-scrapers, tall buildings that burrow 25 or more stories into the earth, are also mentioned in the report. Not stopping there, the Future Living Report also discusses underwater cities and details how personal flying drones will replace standard vehicles.

In terms of other far-reaching predictions, Samsung’s report says that future houses will be 3D printed and that walls will be flexible, constructed of giant LED panels designed to shift at the homeowner’s will.

More grounded predictions include virtual meetings, of course conducted via hologram, and pods that automatically diagnose users’ illnesses.

And finally, the report says humans will colonize the moon, Mars, and then the rest of the galaxy.


Space scientist Maggie Aderin-Pocock, who co-authored the document, said, “Our lives today are almost unrecognizable from those a century ago. The internet has revolutionised the way we communicate, learn and control our lives. Just 25 years ago, technology like SmartThings would have been inconceivable, yet today, developments like this let us monitor, control and secure our living spaces with the touch of a smartphone.”

While it sounds like The Jetsons and like the type of technology you’d typically find in a sci-fi film, Samsung’s SmartThings report is a fascinating look at the future of technology, though many of the company’s predictions seem to be hundreds of years from coming to fruition.

Source: Samsung
19 Feb 06:43

Uber’s CEO talks risk, regulation and women in tech

by Jonathon Narvey

In downtown Vancouver’s Launch Academy hub for a “fireside chat” moderated by SheTalks’ Barinder Rasode and Launch Academy’s Ray Walia, Uber CEO Travis Kalanick shared his thoughts on how his tech company is adapting to the needs of its customers, facing off against competition, and dealing with regulation as its sharing-economy vision becomes reality worldwide.

A select audience of more than 50 entrepreneurs, technology experts, founders and funders listened intently and got a chance to ask the witty and enthusiastic CEO some questions of their own.

On obstacles faced along the entrepreneur’s journey

“I’ve seen what might be considered failure far more often than success,” Kalanick said. “With Scour (an earlier startup dealing with file sharing) we got sued for $250 billion by the world’s biggest entertainment companies (that’s the GDP of Sweden, I keep telling people). The rules around distribution of files — we just didn’t have a way to defend ourselves (though iTunes did it right later on).”

At one point, after taking zero salary as a CEO, Kalanick had to move back into his parents’ house. “You have to be open to hard times,” he said. “What we think in general of risk is not what an entrepreneur sees at risk. I’ve run out of money more times than you can imagine. At the end of the day, I had parents who would always let me come back and friends who would be there for me.”

Kalanick argued that the entrepreneur mentality is very different from that of an employee. “At Uber, we’re big and now we have funds to hire people. People still say it’s a risk to come over. We say, ‘you’re smart, we can pay the bills – what kind of risk are you talking about?’ We need to redefine risk. If an entrepreneur is smart and just keeps getting back up after they get knocked down, they’re going to succeed.”

On regulation, and being in a city where Uber is outlawed

“Vancouver is a really good place to talk about this,” Kalanick said. “I usually ask, who came here in an Uber? That question doesn’t work out so well here.”

“The old rules we have today were once new and controversial rules. Every old rule has the right to become a new rule – one that hopefully bends towards progress. It’s about making sure that in systems we develop, people and progress comes first. We ended up in 68 countries because that did happen. I believe Vancouver will eventually make that leap.”

“Fuck you taxi mafia!” one attendee spontaneously erupted, as Kalanick explained the local regulation around a $75 floor price for calling a car to go even one block in the city.

“I’ll let the audience speak for itself,” Kalanick replied with a chuckle, before going on to the reasons many cities had already decided Uber’s benefits outweighed possible problems for entrenched interests. “If you can do this for the lowest possible price, no one needs to buy cars. Thirty percent of our cities are parking for vehicles that we’re only sitting in for 4 percent of the day. There’s just hunks of cement storing hunks of steel instead of being allocated to schools, daycares, and other things we want in our cities. We’re creating jobs.”

“If you don’t like pollution, parking lots, and traffic congestion, you like Uber,” he added. “If you like only expensive black car service that ordinary people can’t afford, Uber is bad for you.”

On maintaining the pace of innovation through data

Kalanick said that the Uber business model depends on data. “You have to look at things at the city level. For transportation, we’re looking at what kinds of cars are in supply. What can the economy support? What are the pain points for transportation? What’s the right price point? It’s so much about getting people and cars around the city as fast as possible. There are tweaks you do elsewhere.”

“We use data to predict demand. But then we ask, how do we use that data to provide more information about supply or set the right price point? We’re not an advertising company, so we don’t have the same concerns as a company like Google.”

Kalanick also said that dealing with more data requires more and more specialized talent. “What you see is more and more science, scientists and machine learning experts. That work is becoming mainstream, at least in tech companies. Everything from how to price, market to customers, serve them, is becoming a machine learning problem. We thought we were so revolutionary to have a neuroscientist on a logistics team, but in the Valley, that’s old hat.”

As they learn more about how to make use of the data, private companies will be in a position to partner with public institutions, Kalanick argued.

“There are a lot of ways cities can be planning better and benefiting from this kind of data. The burden is on us to see how we can partner with cities. This will be an offshoot, and something I expect to see a lot more of in 2016.”

On ending the gender imbalance in tech

Uber and its CEO have faced criticism and accusations of sexism over the past several years, perhaps stemming from (one might very generously suppose) a tone-deaf sense of humor at the top. That said, Kalanick came out strongly in favor of correcting the current gender imbalance in tech.

“Once we succeed in the city, we look at opportunities for ways to give back. We see a vacuum of female talent in engineering specifically, where we’re missing out on half the great minds that exist out there,” Kalanick said. “We have to find ways for younger women and girls to know that coding is not a geeky thing, but a healthy, productive thing to do. There are female role models who can show the way.”

While progress is already evident, there’s still a way to go, Kalanick noted. “You talk (about these social paradigms) to a high school girl who is already coding and they’re like ‘what are you talking about?’ So we need more of that, going to the source, providing a space for more creative minds to participate and make the world a better place.”

On Uber, the sharing economy, and safety

Certainly, one of the criticisms from taxi drivers and riders is whether Uber’s model can provide real safety for customers, which Kalanick addressed head on.

“The first part is we have to be able to say that Uber is the safest way to get from point A to B,” he said. “Safety happens before, during and after the ride. The first part comes with filtering bad drivers out of it. In the middle, it’s about monitoring the driver (and making sure the driver knows they’re being monitored – and there is monitoring for every foot of that trip). And of course during the trip you could have telematics that could tell us in real time if the driver was driving safely; you have after-trip feedback that can tell us further how to weed out bad drivers.”

Kalanick asked if taxi companies provide these tools right now, before acknowledging that no system is going to be perfect (indeed, Uber was just ordered to pay $28.5 million for false safety advertising and fees based upon claims of driver background checks). Still, Kalanick seemed confident of improving perception around what they could offer. “There is crime in every city. How can we be a part of making every city safer? Every day, we just keep getting better at it.”

On adapting to the future: the self-driving car

Uber is already on the record as supporting the move to self-driving cars – even if that move could radically shake up its business model. Kalanick said that it comes from asking the question: “do we want to be part of the future or resist it, the way the taxi companies do? We want to be part of the future. The world of self-driving cars is going to happen.”

The more interesting question is how Uber will be part of that change. Kalanick indicated that Uber would be willing to partner to make that happen.

“Since 2007, Google’s been working on autonomous vehicles because millions of people die every year in vehicle accidents,” Kalanick said. “Also, there’s pollution, traffic – we understand why they’ve been investing and trying to push that forward. Can there be a partnership for the transition period? That transition period is going to last quite a long time. We can partner with cities through that transition to help make it happen in a human way.”

On dealing with regulators and critics who remain skeptical of Uber

“I talked earlier about the city at large and all the good things that happen because of Uber – so who’s against that?” Kalanick asked. “That segment becomes very narrow. Institutions that prevent progress have to change.”

Kalanick said that taxi drivers are not bad people, “they’re just treated badly. In monopolistic systems like in NYC, taxi licenses are tradeable licenses… A taxi driver has to spend $40,000 a year to be impoverished. That’s the system you support when you support a monopolistic system like the taxi industry. It screws over choice for the consumer as well as for the driver. There’s a very narrow segment of people that would think that’s a good system.”

At least, according to Kalanick. It’s only fair to point out that in the recent past, Uber has been accused of using cutthroat business practices, including sending fake travel requests to keep its competitor Lyft’s drivers off the road. Uber is also currently involved in a legal dispute as to whether its drivers should be considered independent contractors or employees. In New York specifically, Uber drivers have penned an open letter to Uber criticizing a culture of “expendable drivers” (via Pando, and a time-unlocked article which should be read quickly). The segment that objects to Uber may be a bit larger than Kalanick is willing to publicly acknowledge.

On advice for entrepreneurs who want to raise funding

Looking back on his mixed record of success with startups in his past, Kalanick noted that “when you have no money and people think your idea is terrible, you get very good at raising money.”

Kalanick cautioned against what he called “irrational funding going on” (Uber has raised just under $9 billion to date). “You can get your butt kicked by others who are raising more money and buying market share,” he said. “So you have to find a way to contain the irrational when it’s happening.”

Having just raised another $200 million to compete in emerging markets, perhaps the starkest example can be seen in one of Uber’s largest international marketplaces. “We’re profitable in the USA, but we’re losing over $1 billion a year in China. We have a fierce competitor that’s unprofitable in every city they exist in, but they’re buying up market share. I wish the world wasn’t that way. I prefer building rather than fundraising. But if I don’t participate in the fundraising bonanza, I’ll get squeezed out by others buying market share.”

Feature image courtesy Barinder Rasode.

This story was originally published on Betakit

19 Feb 06:42

Instagram begins rolling out two-factor authentication

by Ian Hardy

Instagram has doubled down on user security by rolling out two-factor authentication.

According to TechCrunch, Instagram has been testing two-factor authentication with select users and plans to launch the feature globally soon. Two-factor authentication with Instagram works in a similar fashion to other apps such as Twitter, Facebook, Slack and Pinterest; owners verify the account with a phone number.

With two-factor authentication, if someone attempts to access the account, they’ll be denied because an authentication code is required. This code is sent via text message to the number provided when setting up the authentication.
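For a sense of the mechanics, here is a minimal server-side sketch of SMS-style code verification in Python. The function names, the six-digit format, and the 5-minute expiry are illustrative assumptions; Instagram's actual implementation is not public:

import hmac
import secrets
import time

# Toy sketch of SMS-style two-factor verification. All names and the
# 5-minute expiry are assumptions for illustration, not Instagram's design.

def issue_code() -> tuple[str, float]:
    """Generate a random six-digit code and record when it was issued."""
    code = f"{secrets.randbelow(10 ** 6):06d}"
    return code, time.time()

def verify_code(submitted: str, issued_code: str, issued_at: float,
                ttl_seconds: float = 300.0) -> bool:
    """Accept the code only if it matches and has not expired.
    compare_digest avoids leaking information through timing."""
    fresh = (time.time() - issued_at) <= ttl_seconds
    return fresh and hmac.compare_digest(submitted, issued_code)

code, issued_at = issue_code()  # the server would text `code` to the user
print(verify_code(code, code, issued_at))  # True while the code is fresh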


Facebook-owned Instagram recently updated its app with account switching functionality, giving its 400 million users the ability to operate multiple Instagram accounts without logging out to switch between them.

19 Feb 06:41

Telus says Marshmallow update coming to the Galaxy Note 5 in March, Galaxy S6 in April

by Ian Hardy

Finally, there is a date.

Telus has updated its software update schedule with a few notable devices set to receive upgrades to Android Marshmallow.

Over the next couple of months, Telus says that the Samsung Galaxy S6 edge+ and Galaxy Note 5 will receive Android 6.0 on March 9th, the same day the Nexus 6P receives its monthly security update.

In addition, the Galaxy S6 edge and LG G3 will receive Marshmallow a week later on March 16th. If you’re a Galaxy S6 owner on Telus, the date you are most interested in is April 13th, as this is the date that has been tentatively scheduled for your device to be updated to the latest version of Android.


Of course, Telus says in the update listing’s fine print that “all dates are approximate and subject to change.”

Google’s latest OS distribution numbers revealed that Marshmallow is now powering 1.2 percent of Android devices.