Many industries across the U.S. have expressed alarm over the proposed waiver, which was put forth by a coalition of over 100 countries, led by India and South Africa, and would waive intellectual property rules in order to boost production of vaccines, medical products, and research toward ending the Covid-19 pandemic.
This might seem irrelevant to Hollywood, major publishing companies, and the music industry, but recently released disclosures show that these sectors have mobilized lobbyists to raise concerns with the waiver proposal.
The Motion Picture Association, which represents major movie and television studios, deployed five lobbyists to influence Congress and the White House over the waiver. The Association of American Publishers as well as Universal Music have similarly revealed that they are actively lobbying against it.
I think about the "star system" backlash a lot.
It's been so long, I can't remember... what did Google Reader have that The Old Reader doesn't?
There was an old campaign to make Wikipedia a heritage site!
This is my favorite way to learn about the constitution.
"No State shall, without the Consent of Congress . . . enter into any Agreement or Compact with another State, or with a foreign Power, or engage in War, unless actually invaded, or in such imminent Danger as will not admit of delay."
Two groups of states (all with Democratic Governors) have reached agreements amongst themselves to coordinate their plans to reopen their economies when appropriate. Presumably they are doing this because they do not trust the Federal Government (in other words, the Trump Administration) to give them coherent advice. But can these states make these plans without the consent of Congress?
One way to think about this issue is by parsing the text that I just quoted. What is an agreement or compact exactly? Are certain formalities required or is any understanding between or among states (however informal) enough? Does the qualification about invasion or imminent danger apply only to fighting wars, or also to the earlier part about an interstate agreement or compact? If the qualification does apply, are decisions about how to reopen state economies a matter of imminent danger?
Another way to look at this problem is that nobody has standing to challenge these interstate arrangements. There could be circumstances where an individual or a state would be harmed by an interstate agreement reached without congressional consent, but I'm hard-pressed to see who would be in this instance.
The final, and most likely, possibility is that nobody will pay attention to the issue at all because a state of emergency exists.
Not a fan of the Open COVID License they threw into the mix, but I'm glad they are putting the intention out there. It would be great if Gates or CZI were on board!
I'm surprised; this is a corner where I didn't expect to see a willingness to cede so much on privacy. In a country where we're struggling to do basics like timely testing, this whole position is uncharacteristically techno-utopian.
I am a privacy activist who has been riding a variety of high horses about the dangers of permanent, ubiquitous data collection since 2012.
But warning people about these dangers today is like being concerned about black mold growing in the basement when the house is on fire. Yes, in the long run the elevated humidity poses a structural risk that may make the house uninhabitable, or at least a place no one wants to live. But right now, the house is on fire. We need to pour water on it.
In our case, the fire is the global pandemic and the severe economic crisis it has precipitated. Once the initial shock wears off, we can expect this to be followed by a political crisis, in which our society will fracture along pre-existing lines of contention.
But for the moment, we are united by fear and have some latitude to act.
Doctors tell us that if we do nothing, the coronavirus will infect a large fraction of humanity over the next few months. As it spreads, the small proportion of severe cases will overwhelm the medical system, a process we are seeing play out right now in places like Lombardy and New York City. It is imperative that we slow this process down (the famous 'flattening the curve') so that the peak of infections never exceeds our capacity to treat the severely ill. In the short term this can only be done by shutting down large sections of the economy, an unprecedented move.
But once the initial outbreak is contained, we will face a dilemma. Do we hurt people by allowing the economy to collapse entirely, or do we hurt people by letting the virus spread again? How do we reconcile the two?
One way out of the dilemma would be some kind of medical advance—a vaccine, or an effective antiviral treatment that lowered the burden on hospitals. But it is not clear how long the research programs searching for these breakthroughs will take, or whether they will succeed at all.
Without these medical advances, we know the virus will resume its spread as soon as the harsh controls are lifted.
Doctors and epidemiologists caution us that the only way to go back to some semblance of normality after the initial outbreak has been brought under control will be to move from population-wide measures (like closing schools and making everyone stay home) to an aggressive case-by-case approach that involves a combination of extensive testing, rapid response, and containing clusters of infection as soon as they are found, before they have a chance to spread.
That kind of case tracking has traditionally been very labor intensive. But we could automate large parts of it with the technical infrastructure of the surveillance economy. It would not take a great deal to turn the ubiquitous tracking tools that follow us around online into a sophisticated public health alert system.
Every one of us now carries a mobile tracking device that leaves a permanent trail of location data. This data is individually identifiable, precise to within a few meters, and is harvested by a remarkable variety of devices and corporations, including the large tech companies, internet service providers, handset manufacturers, mobile companies, retail stores, and in one infamous case, public trash cans on a London street.
Anyone who has this data can retroactively reconstruct the movements of a person of interest, and track who they have been in proximity to over the past several days. Such a data set, combined with aggressive testing, offers the potential to trace entire chains of transmission in real time, and give early warning to those at highest risk.
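To make the mechanics concrete, here is a minimal sketch of what that kind of retroactive proximity matching might look like. The data shape, thresholds, and function name are all my own assumptions for illustration; a real system would work over billions of messy records, not a toy list of pings:

```python
from math import hypot

def find_contacts(pings, patient, max_dist=5.0, max_dt=600):
    """Return the set of people whose pings put them near the patient.

    pings: list of (person, timestamp_seconds, x_meters, y_meters).
    A "contact" is anyone other than the patient with a ping within
    max_dist meters and max_dt seconds of one of the patient's pings.
    """
    patient_pings = [(t, x, y) for p, t, x, y in pings if p == patient]
    contacts = set()
    for person, t, x, y in pings:
        if person == patient:
            continue
        for pt, px, py in patient_pings:
            # Close enough in both time and space counts as a contact.
            if abs(t - pt) <= max_dt and hypot(x - px, y - py) <= max_dist:
                contacts.add(person)
                break
    return contacts
```

The hard part in practice is not this loop but the data quality: GPS jitter, indoor positioning, and the sheer volume of records.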
This surveillance sounds like dystopian fantasy, but it exists today, ready for use. All of the necessary data is being collected and stored already. The only thing missing is a collective effort to pool it and make it available to public health authorities, along with a mechanism to bypass the few Federal privacy laws that prevent the government from looking at the kind of data the private sector can collect without restraint.
We've already seen such an ad-hoc redeployment of surveillance networks in Israel, where an existing domestic intelligence network was used to notify people that they had possibly been infected, and should self-quarantine, a message that was delivered by text message with no prior warning that such a system even existed.
We could make similar quick changes to the surveillance infrastructure in the United States (hopefully with a little more public awareness that such a system was coming online). When people are found to be sick, their location and contact history could then be walked back to create a list of those they were in touch with during the period of infectiousness. Those people would then be notified of the need to self-quarantine (or hunted with blowguns and tranquilizer darts, sent to FEMA labor camps, or whatever the effective intervention turns out to be.)
This tracking infrastructure could also be used to enforce self-quarantine, using the same location-aware devices. The possibilities of such a system are many, even before you start writing custom apps for it, and there would be no shortage of tech volunteers to make it a reality.
The aggregate data set this surveillance project would generate would have enormous value in its own right. It would give public health authorities a way to identify hot spots, run experiments, and find interventions that offered the maximum benefit at the lowest social cost. They could use real-time data and projections to allocate scarce resources to hospitals, and give advance warnings of larger outbreaks to state and Federal authorities in time to inform policy decisions.
Of course, all of this would come at an enormous cost to our privacy. This is usually the point in an essay where I’d break out the old Ben Franklin quote: “those who would give up essential liberty to purchase a little temporary safety deserve neither.”
But this proposal doesn’t require us to give up any liberty that we didn't already sacrifice long ago, on the altar of convenience. The terrifying surveillance infrastructure this project requires exists and is maintained in good working order in the hands of private industry, where it is entirely unregulated and is currently being used to try to sell people skin cream. Why not use it to save lives?
The most troubling change this project entails is giving a government agency access to sensitive location data for the entire population. Of course that is scary, especially given the track record of the Trump administration. The data collection would also need to be coercive (that is, no one should be able to opt out of it, short of refusing to carry a cell phone). As with any government surveillance program, there would be the danger of a ratchet effect, where what is intended as an emergency measure becomes the permanent state of affairs, as happened in the United States in the wake of the 2001 terrorist attacks.
But the public health potential of commandeering surveillance advertising is so great that we can’t dismiss it out of hand. I am a privacy activist, typing this through gritted teeth, but I am also a human being like you, watching a global calamity unfold around us. What is the point of building this surveillance architecture if we can't use it to save lives in a scary emergency like this one?
One existing effort we could look to as a model for navigating this situation is the public/private partnership we have set up to monitor child sexual abuse material (CSAM) on the Internet.
Large image sharing sites like Facebook, Google, and Snapchat use a technology called PhotoDNA to fingerprint and identify images of known abuse material. They do this voluntarily, but if they find something, they are required by law to report it to the National Center for Missing and Exploited Children, a nongovernmental entity that makes referrals as appropriate to the FBI.
The system is not perfect, and right now is being used as a political football in a Trump administration attempt to curtail end-to-end encryption. But it shows the kind of public-private partnership you can duct tape together when the stakes are high and every party involved feels the moral imperative to act.
In this spirit, I believe the major players in the online tracking space should team up with the CDC, FEMA, or some other Federal agency that has a narrow remit around public health, and build a national tracking database that will operate for some fixed amount of time, with the sole purpose of containing the coronavirus epidemic. It will be necessary to pass legislation to loosen medical privacy laws and indemnify participating companies from privacy lawsuits, as well as override California's privacy law, to collect this data. I don’t believe the legal obstacles are insuperable, but I welcome correction on this point by people who know the relevant law.
This enabling legislation, however, should come at a price. We have an opportunity to lay a foundation for the world we want to live in after the crisis is over. One reason we tolerate the fire department knocking down our door when there is an emergency is that we have strong protections against such intrusions, whether by government agencies or private persons, in more normal times. Those protections don't exist right now for online privacy. One reason this proposal is so easy to float is that private companies have enjoyed an outrageous freedom to track every aspect of our lives, keeping the data in perpetuity, and have made full use of it, turning the online economy into an extractive industry. That has to end.
Including privacy guarantees in the enabling legislation for public health surveillance will also help ensure that emergency measures don't become the new normal. If we use this capability deftly, we could come out of this crisis with a relatively intact economy, a low cumulative death toll, and a much healthier online sphere.
Of course, the worst people are in power right now, and the chances of them putting such a program through in any acceptable form are low. But it’s 2020. Weirder things have happened. The alternative is to keep this surveillance infrastructure in place to sell soap and political ads, but refuse to bring it to bear in a situation where it can save millions of lives. That would be a shameful, disgraceful legacy indeed.
I continue to believe that living in a surveillance society is incompatible in the long term with liberty. But a prerequisite of liberty is physical safety. If temporarily conscripting surveillance capitalism as a public health measure offers us a way out of this crisis, then we should take it, and make full use of it. At the same time, we should reflect on why such a powerful surveillance tool was instantly at hand in this crisis, and what its continuing existence means for our long-term future as a free people.
Thank you for the opportunity to speak with you today here in Wellington. It really is a privilege to be here. For the next few minutes I’m going to be talking about the efforts of the Documenting the Now project to build a community of practice around social media archiving. For Lave & Wenger (1991) a community of practice is a way of understanding how groups of people learn, by sharing skills and tools in relationship with each other, and in their daily lived experience. I think the best of what IIPC has to offer is as a community of practice.
But before I tell you about DocNow I need to do a little framing. Please forgive me if this seems redundant or obvious. I’ve been interested in web archiving for close to 10 years, but this is my first time here at the Web Archiving Conference.
A significant part of our work in Documenting the Now has been appreciating, and coming to terms with, the scale of the web. As you know the work of archiving social media is intrinsically bound up with archiving the web. But archiving the web is a project whose scale is paradoxically both much bigger, and much, much smaller, than the IIPC and its member institutions.
So I said it is a privilege to be here, and I meant it. According to the Carbon Footprint Calculator my return trip from Washington DC amounts to 2.19 metric tons of CO₂. To put this in perspective the Environmental Defense Fund estimates that the average American car emits about 7 tons of CO₂ in a year. How many cars do you think our conference amounts to? Hopefully you’ll see why this is important in a moment.
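As a back-of-the-envelope check on that question (the 2.19-ton figure and the 7-ton car estimate are from the sources above; the attendee count is a made-up assumption of mine):

```python
# Rough carbon arithmetic for the conference-travel question above.
flight_tons = 2.19        # my DC–Wellington round trip, per the Carbon Footprint Calculator
car_tons_per_year = 7.0   # EDF estimate for an average American car, per year
attendees = 100           # hypothetical head count, purely for illustration

conference_tons = flight_tons * attendees
car_years = conference_tons / car_tons_per_year
print(round(car_years, 1))  # → 31.3
```

Even with a modest head count and assuming everyone's trip is like mine, the conference works out to decades of car driving.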
Perhaps you saw this story in the Guardian about the projected energy consumption of the Internet? According to the piece, researchers estimate that by 2020 running the Internet will generate 3.5% of all global emissions, which will exceed that of the entire aviation and shipping industries. That’s only a year away.
By 2025 the Internet and all its connected devices could use up to 20% of the world’s electricity. By 2040 the Internet could account for up to 14% of carbon emissions, roughly the share of the entire United States today. Even if 100% of future data centers use renewable energy sources, the additional demand for electricity would significantly eat into any savings that would otherwise have been made in overall energy use.
We have a tsunami of data approaching. Everything which can be is being digitalised. It is a perfect storm.
Arguably the storm is here already. As the article goes on to point out it’s the cost of keeping increasing amounts of data online, ready to be streamed, that accounts for these observed and forecasted increases.
Just as a thought experiment try to imagine archiving all this data. Imagine some new archiving technology that made it possible to archive it all. Some new decentralized digital vellum that’s orchestrated with smart contracts and runs on cryptocurrency. What impact would archiving all this data have on our environment, and on our planet…to say nothing of our budgets?
This is not to say we don’t need better tech for web archiving–we do. But better web archiving technology won’t save us. We need to be able to think and talk about what we are saving and why. Because we can’t archive it all. In fact, we must not archive it all because that would seriously jeopardize our ability to survive as a species.
The Internet Archive’s mission is to archive the web and to provide universal access to all knowledge (Kahle, 2007). How much of the web is in the Internet Archive? Of course it’s hard, if not impossible, to say. But let’s not let that stop us.
In 2016 Google announced that it had 130 trillion URLs in its index. In a blog post also written in 2016 the Internet Archive shared that they had saved 510 billion web captures. So we could estimate that in 2016 the world’s largest public archive of web content had collected 0.39% of the web. What 0.39% of the web do you think they archived?
Of course this is mixing apples and oranges, because on the one hand we are talking about web captures and on the other URLs. But comparing distinct URLs in the Internet Archive would only make this percentage smaller. Even worse, Google themselves don’t really know how big the web is: they just know how many distinct URLs their bots have wandered into.
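The arithmetic behind that 0.39% figure, for anyone who wants to check it, is just a ratio of the two 2016 numbers:

```python
# 2016 figures from Google's and the Internet Archive's own announcements.
google_urls = 130e12   # URLs in Google's index
ia_captures = 510e9    # web captures in the Internet Archive

pct = ia_captures / google_urls * 100
print(round(pct, 2))  # → 0.39
```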
But this low percentage probably doesn’t come as any surprise to the archivists in the room, who know that between 1 and 5 percent of all institutional records will survive as archives (Cook, 2011). As Harris (2007) would perhaps say, the Wayback Machine and all our web archives will always be “a sliver of a sliver of a sliver of a window into process.” This is why they are so important to us, and why an environmental analysis is so fundamental to digital preservation work (Goldman, 2018).
We need to decide what to collect. How we decide is what’s important. We need to talk about how we decide. This is what you do here.
Ok, so the web is a lot bigger than us, and trying to archive it all might kill us all, if we were able to do it. But how is web archiving smaller than the IIPC and its individual member institutions?
In August of 2014, my friend and collaborator Bergis Jules and I sat in a bar with a small group of friends at the Society of American Archivists meeting in Washington DC. We were talking about what it would take to collect the tweets about the protests in Ferguson, Missouri that were happening as we spoke.
These protests were in direct response to the murder of African American teenager Mike Brown by police officer Darren Wilson. We were thinking about Zeynep Tufekci’s new piece What Happens in #Ferguson Affects Ferguson about the disparity in what she was seeing in Facebook versus Twitter. We weren’t exactly sure what the right way to do this collection was, but we did know why we wanted to do it. We knew that Ferguson was a significant political and cultural moment that scholars would some day want to study. Forget about the future, we knew researchers would want to study it right now.
So we used a nascent utility called twarc to collect 13 million tweets that contained the word “Ferguson” from August 9 through August 27, 2014. We started writing in a blog about how we were doing the work, and how to use the tweet metadata that had been assembled. We continued doing data collection around the protests following the deaths of Walter Scott, Freddie Gray, Sandra Bland, Samuel DuBose, Alton Sterling, Philando Castile, and Korryn Gaines, and around the BlackLivesMatter movement that catalyzed heightened awareness of police violence against people of color and structural racism. We were emboldened to hear that others wanted to help out, and to do this work too.
Four years and a Mellon grant later we’ve had the opportunity to work together as a team to improve that twarc utility we started with in 2014, and to create a few more tools along the way to help in doing some of this work. These probably don’t seem like your typical web archiving tools, but I’d like you to think about why.
- twarc - a command line tool for collecting tweets from Twitter’s search and streaming APIs; it can also collect threaded conversations and user profile information. It comes with a kitchen sink of utilities contributed by members of the community.
- Catalog - a clearinghouse of Twitter identifier datasets that live in institutional repositories around the web. These have been collected by folks like the University of North Texas, George Washington University, UC Riverside, University of Maryland, York University, Society of Catalan Archivists, University of Virginia, the University of Puerto Rico, North Carolina State University, University of Alberta, Library and Archives Canada, and more.
- Hydrator - A desktop utility for turning tweet identifier datasets (from the Catalog) back into structured JSON and CSV for analysis. It was designed to be able to run for weeks on your laptop, to slowly reassemble a tweet dataset, while respecting Twitter’s Terms of Service and users’ right to be forgotten.
- unshrtn - A microservice that makes it possible to bulk normalize and extract metadata from a large number of URLs.
- DiffEngine - a utility that tracks changes on a website using its RSS feed, and publishes these changes to Twitter and Mastodon. As an example see whitehouse_diff which announces changes to the Executive orders made on the White House blog.
- DocNow - An application (still under development) that allows archivists to observe Twitter activity, do data collection, analyze referenced web content, and optionally send it off to the Internet Archive to be archived.
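To give a flavor of what a tool like the Hydrator has to do under the hood: Twitter’s lookup endpoint accepts at most 100 tweet IDs per request, so a dataset of identifiers has to be chunked before anything goes over the wire. This sketch is mine, not code from the Hydrator itself, and it stops short of the actual API calls:

```python
def chunk_ids(tweet_ids, batch_size=100):
    """Yield lists of at most batch_size tweet identifiers.

    Twitter's statuses/lookup endpoint takes up to 100 IDs per call,
    so hydration tools batch requests this way. The network calls,
    rate-limit sleeps, and deleted-tweet handling are omitted here.
    """
    for i in range(0, len(tweet_ids), batch_size):
        yield tweet_ids[i:i + batch_size]

batches = list(chunk_ids([str(n) for n in range(250)]))
print([len(b) for b in batches])  # → [100, 100, 50]
```

The slow, weeks-long part of real hydration is the rate limiting between those batches, not the chunking itself.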
These tools emerged as part of doing work with social media archives. Rather than building one tool that attempts to solve some of the many problems of archiving social media, we wanted to create small tools that fit particular problems, and could be composed into other people’s projects and workflows. We wanted to thoughtfully intervene into a scholarly communications landscape where researchers were using social media, but not always sharing their methods, and their data sources.
The Ferguson Principles
Truth be told, these tools are actually just a sideshow for what we’ve really been trying to do. Over the past four years we’ve been able to work with an emerging community of archivists, researchers, and activists who already see the value of social media and web archiving, but are interested in developing practices that speak to the ethical concerns that arise when doing this work.
Documenting the Now is a distributed team, with porous boundaries, so having an open Slack channel was useful for coordinating. But the real work happened in several face-to-face meetings in St Louis and Ferguson, Missouri where we heard from activists about how they wanted their social justice work to be remembered. This was only made possible by facilitating direct conversations between archivists, technologists, researchers and activists about how to remember the protests.
These conversations also took place at the Ethics and Archiving the Web conference hosted by Rhizome earlier this year. Drawing on our reading of existing guidelines from the Society of American Archivists and the International Association of Internet Researchers, we developed a set of recommendations, described in a white paper, that we have come to informally refer to as the Ferguson Principles. We can discuss them in more detail during the Q&A or during a workshop we are holding Friday.
Archivists must engage and work with the communities they wish to document on the web. Archives are often powerful institutions. Attention to the positionality of the archive vis-à-vis content creators, particularly in the case of protest, is a primary consideration that can guide efforts at preservation and access.
Documentation efforts must go beyond what can be collected without permission from the web and social media. Social media collected with the consent of content creators can form a part of richer documentation efforts that include the collection of oral histories, photographs, correspondence, and more. Simply telling the story of what happens in social media is not enough, but it can be a useful start.
Archivists should follow social media platforms’ terms of service only where they are congruent with the values of the communities they are attempting to document. What is legal is not always ethical, and what is ethical is not always legal. Context, agency and (again) positionality matter.
When possible, archivists should apply traditional archival practices such as appraisal, collection development, and donor relations to social media and web materials. It is hard work adapting these concepts to the collection of social media content, but they matter now, more than ever.
These aren’t meant to be global principles to be applied in every situation when you are archiving social media. They are meant to be touchstones for conversations to have, particularly when you are doing web archiving work in the context of social justice.
We recently announced that we received a new round of funding to continue this work, which you can read more about in a post about phase 2. But in a nutshell this funding will allow us to do three interrelated things.
The first is to continue to support and develop the tools we’ve worked on so far. If you’d like to be part of these conversations we will be opening up our technical development calls shortly.
We will also be developing a series of workshops, to help build digital community-based archives in direct partnership with social justice activist organizations. Look for information about how to apply to be part of this in the new year.
And thirdly we will be working with a new project partner, Meredith Clark at the University of Virginia, to develop an openly licensed, college-level curriculum that gives students meaningful experience with, and frameworks for, the ethical use of social media in their research.
So returning to where we started, I’ve been excited to see the IIPC developing its own virtual community in Slack, and that there are efforts such as the Online Hours Supporting Open Source (OH-SOS) and the Training Working Group that allow engagement to grow outside of the select few who are able to attend these yearly face-to-face meetings.
Since I don’t work at an IIPC member institution I’m not totally up on what is going on. But I do think there may be opportunities for the IIPC to adopt some approaches to web archiving tools and practice that get outside of the institutional walls we so often find ourselves operating in.
I’m specifically thinking of the decades of work by the Australian school on the Records Continuum model developed by Sue McKemmish, Frank Upward and others (McKemmish, Upward, & Reed, 2010), which takes an expansive and integrative view of what counts as records, and the contexts in which they are produced. Also, strategies drawing on community archiving, where records continue to live in the environments that produced them, could be very generative for moving efforts in this direction (Caswell, Migoni, Geraci, & Cifor, 2016; Flinn, 2007; Punzalan, 2009). What if we also thought of web archiving work as getting out into the world to help people sustain their websites, rather than only taking their content and putting it in a Wayback instance as what Brügger & Finnemann (2013) call “reborn digital material”?
So if any of this sounds useful and interesting to you please get in touch with me or Bergis here at IIPC. I believe that Bergis will be around next week at the National Digital Forum as well. As I mentioned we are also hosting a workshop Friday morning in which we are going to be exploring some of the Ferguson Principles and more in a group setting. And finally please consider joining our Slack, asking questions, and joining the conversation there.
Brügger, N., & Finnemann, N. O. (2013). The web and digital humanities: Theoretical and methodological concerns. Journal of Broadcasting & Electronic Media, 57(1), 66–80.
Caswell, M., Migoni, A. A., Geraci, N., & Cifor, M. (2016). “To be able to imagine otherwise”: Community archives and the importance of representation. Archives and Records. Retrieved from http://www.tandfonline.com/doi/full/10.1080/23257962.2016.1260445
Cook, T. (2011). We are what we keep; we keep what we are: Archival appraisal past, present and future. Journal of the Society of Archivists, 32(2), 173–189.
Flinn, A. (2007). Community histories, community archives: Some opportunities and challenges. Journal of the Society of Archivists, 28(2), 151–176.
Goldman, B. (2018). It’s not easy being green(e): Digital preservation in the age of climate change. In Archival values: Essays in honor of Mark Greene. Society of American Archivists. Retrieved from https://scholarsphere.psu.edu/concern/generic_works/bvq27zn11p
Harris, V. (2007). Archives and justice: A South African perspective (pp. 101–106). Society of American Archivists.
Kahle, B. (2007). Universal access to all knowledge. The American Archivist, 70(1), 23–31.
Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. Cambridge University Press.
McKemmish, S., Upward, F., & Reed, B. (2010). Records continuum model. In M. Bates & M. N. Maack (Eds.), Encyclopedia of library and information sciences. Taylor & Francis.
Punzalan, R. L. (2009). ’All the things we cannot articulate’: Colonial leprosy archives and community commemoration. In J. A. Bastian & B. Alexander (Eds.), Community archives: The shaping of memory. Facet Publishing.
ahumblecabbage:
I can trust my love for The Old Reader because I very much wanted to write them an angry email about their dark mode CSS.
I don’t trust reddit because it’s the only social network whose users don’t hate the website they’re using. twitter users hate twitter. snapchat users hate snapchat. but reddit users will get a neck tattoo of the goofy little alien friend and name their first born son r/gaming
ARL, Wikimedia, and Linked Open Data: Draft White Paper Open for Comments through November 30
Ideas for Wikicite.
I think this is basically everything great and everything wrong with Twitter's relationship with the news, too.
Thanks to Ken, Michael, and Barbara from patreon for helping make this more clear! If anything is confusing, it's their fault.
Ben's ancestral food, doesn't sound too bad
You drove me to make my most disappointing edit to Wikipedia: https://en.wikipedia.org/w/index.php?title=Uraba_lugens&diff=857035250&oldid=856669664
Glad he survived. (BTW, that's probably a 4 on the Schmidt sting pain index.)
The trailer is very similar to Journey. Slightly less lonesome, maybe. Nothing wrong with borrowing for inspiration, but that's treated oddly in the interview.
I was hoping we'd have a new "government sponsored" disclaimer on knowyourmeme, but fortunately (unfortunately?) these paragraphs aren't in the real indictment.
“If I gave you a million dollars, would you…?” (insert possibly life-altering risk, humiliation, or soul-selling crime here). What about ten million? 100 million? One BILLION dollars? Put another way, in the terms social scientists use these days, how much money is enough to make you happy?
If you’re Montgomery Burns, it’s at least a billion dollars, lest you be forced to suffer the torments of the Millionaire’s Camp. (“Just kill me now!”) As it tends to do, The Simpsons’ dark humor nails the insatiable greed that seems the scourge of our time, when the richest 1 percent take 82 percent of the world’s wealth, and the poorest 50 percent get nothing at all.
Hypothetical windfalls aside, the question of how much is enough is an urgent one for many people: as in, how much to feed a family, supply life’s necessities, purchase just enough leisure for some small degree of personal fulfillment?
As the misery of Monty Burns demonstrates, we have a sense of the 1% as eternally unfulfilled. He’s the wicked heir to more serious tragic figures like Charles Foster Kane and Jay Gatsby. But satire is one thing, and desire, that linchpin of the economy, is another.
“What we see on TV and what advertisers tell us we need would indicate there is no ceiling when it comes to how much money is needed for happiness,” says Purdue University psychologist Andrew T. Jebb, “but we now see there are some thresholds.” In short: money is a good thing, but there is such a thing as too much of it.
Jebb and his colleagues from Purdue and the University of Virginia addressed questions in their study “Happiness, income satiation and turning points around the world” like, “Does happiness rise indefinitely with income, or is there a point at which higher incomes no longer lead to greater wellbeing?” What they found in data from an international Gallup World Poll survey of over 1.7 million people in 164 countries varies widely across the world.
People in wealthier areas seem to require more income for happiness (or “Subjective Well Being” in the social science terminology). In many parts of the world, higher incomes, “beyond satiation”—a metric that measures how much is enough—“are associated with lower life evaluations.” The authors also note that "a recent study at the country level found a slight but significant decline in life evaluation" among very high earners "in the richest countries."
You can see the wide variance in happiness worldwide in the “Happiness” study. As Dan Kopf notes at Quartz, these research findings are consistent with those of other researchers of happiness and income, though they go into much more detail. Problems with the methodology of these studies—primarily their reliance on self-reported data—make them vulnerable to several critiques.
But, assuming they demonstrate real quantities, what, on average, do they tell us? “We found that the ideal income point," averaged out in U.S. dollars, "is $95,000 for [overall life satisfaction],” says Jebb, “and $60,000 to $75,000 for emotional well-being,” a measure of day-to-day happiness. These are, mind you, individual incomes and “would likely be higher for families,” he says.
Peter Dockrill at Science Alert summarizes some other interesting findings: “Globally, it’s cheaper for men to be satisfied with their lives ($90,000) than women ($100,000), and for people of low ($70,000) or moderate education ($85,000) than people with higher education ($115,000).”
Yes, the study, like those before it, shows that after the “satiation point,” happiness decreases, though perhaps not to Monty Burns levels of dissatisfaction. But where does this leave most of us in the new Gilded Age? Given that "satiation" in the U.S. is around $105K, with day-to-day happiness around $85K, the majority of Americans fall well below the happiness line. The median salary for U.S. workers at the end of 2017 was $44,564, according to the Bureau of Labor Statistics. Managers and professionals averaged $64,220 and service workers around $28,000. (As you might imagine, income inequality diverged sharply along racial lines.)
And while the middle class saw a slight bump in income in the last couple years, median household income was still only $59,039 in 2016. However we measure it, the "middle class... has been declining for four decades,” admits Business Insider—“identifying with the middle class is, in part, a state of mind” rather than a state of debt-to-income ratios. (One study shows that Millennials make 20% less than Baby Boomers did at the same age.) Meanwhile, as wealth increases at the top, “the country’s bottom 20% of earners became worse off.”
This may all sound like bad news for the happiness quotient of the majority, if happiness (or Subjective Well Being) requires a certain amount of material security. Maybe one positive takeaway is that it doesn’t require nearly the amount of vast private wealth that has accumulated in the hands of a very few people. According to this research, significantly redistributing that wealth might actually make the wealthy a little happier, and less Mr. Burns-like, even as it raised happiness standards a great deal for millions of others.
Not only are higher incomes "usually accompanied by higher demands," as Jebb and his colleagues conclude—on one's time, and perhaps on one's conscience—but "additional factors" may also play a role in decreasing happiness as incomes rise, including "an increase in materialistic values, additional material aspirations that may go unfulfilled, increased social comparisons," etc. The longstanding truism about money not buying love—or fulfillment, meaning, peace of mind, what-have-you—may well just be true.
You can dig further into Andrew T. Jebb's study here: “Happiness, income satiation and turning points around the world.”
How Much Money Do You Need to Be Happy? A New Study Gives Us Some Exact Figures is a post from: Open Culture.
Last week, the San Francisco Bicycle Coalition (SFBC) launched a letter-writing drive to make sure protected bike lanes are part of SFMTA’s Upper Market Street Safety Project from Octavia to the start of the Duboce bicycle path. And, Streetsblog learned this afternoon, it looks as if the campaign worked.
First, from the article published last week on the SFBC web page, announcing the letter-writing drive:
We’ve heard time and time again about the chronic double-parking in bike lanes, a well-known issue for Upper Market. Your support for protected bike lanes will use smart design to directly address this issue, but we know more needs to be done. If you want to see protected bike lanes extend further up Market, we encourage you to write a letter to new District 8 Supervisor Jeff Sheehy and the SF Municipal Transportation Agency (SFMTA). [Note: his district encompasses Upper Market]
Some more background from the SFMTA web page about the project:
Upper Market Street is one of San Francisco’s most popular corridors, but it also has some of the city’s highest collision rates. In the five-year period from 2007 to 2012, there were 27 collisions involving vehicles and pedestrians, 32 vehicle-bicycle collisions and 102 vehicle-vehicle collisions on Market Street between Octavia Boulevard and Castro Street.
Given those stats, it’s astounding that installing protected bike lanes was ever considered an optional part of the project. And after conversations between SFBC leadership and Sheehy, it was unclear if there were even enough votes on the SFMTA Board of Directors to assure protected bike lanes remained in the plan. (Note that Sheehy was appointed by Mayor Lee to replace Scott Wiener.)
But late this afternoon Streetsblog received the following statement from Sheehy, texted via one of his staffers: “I support the work that the community, businesses, street safety advocates, and the SFMTA have done to develop the Upper Market street safety project, including the protected bike lanes [emphasis added]. I look forward to the implementation of this critical safety improvement as soon as possible.”
Good news for safe-streets advocates. And kudos to SFBC on a short but apparently effective campaign.
Way to go!
Wendy Alfsen, long-time Executive Director of California Walks, is retiring. Sort of. She has taken a new title of Senior Director, and former Deputy Director Tony Dang has moved up to take the reins of the advocacy organization.
The change is coming “at a particularly interesting time for transportation,” as Dang put it. Last year’s legislative session ended without an agreement on transportation funding, despite it being a top priority for Governor Jerry Brown. Now the new session has begun, but new proposals are the same as the old ones.
The national policy landscape is also full of uncertainties. “There are going to be a lot of major issues that take up a lot of air in the room,” said Dang, “and transportation will likely have to take a little bit of a back seat. But that doesn’t mean that [California Walks] or our partners will step back or stop working on these issues.”
“The beauty of this work,” said Dang, “is it’s so simple: if we create communities where people of all ages and incomes and abilities can walk safely, we will create a better society. But if the transportation system doesn’t work for folks, we can’t relieve a lot of the societal problems that we are faced with nationally.”
Dang says he is excited to be taking over as Executive Director, and Alfsen is enthusiastic as well. “Tony is doing a great job,” she said.
Dang has been working for California Walks for three years, taking on increasingly more responsibility over time. Now he will also be in charge of strategic planning for the organization, which works on two fronts to advance walkable communities. At the state level, California Walks weighs in on policymaking decisions, ensuring that pedestrian concerns are included in early stages of proposed laws as well as planning and regulations at state agencies. At the local level, the organization provides training and support for communities looking to improve local walking conditions.
California Walks also hosts a biannual summit called “Peds Count!” to share knowledge and information among advocates and agencies. Its staff conducts walk audits to understand the pedestrian experience in specific places, and trains local advocates to conduct their own.
Alfsen helped found California Walks in 2003 as an informal statewide network of local advocacy groups like Walk and Roll Berkeley, where she was working at the time. Several years later she helped it become a formal organization, providing a state-level voice for local groups.
“When I first started doing this work, there were two statewide committees at Caltrans that I could work with,” said Alfsen. “Now, just within Caltrans alone, there are ten or fifteen different advisory committees and working groups working on pedestrian and related issues. And then there are other state agencies like the Air Resources Board and the California Transportation Commission.”
“The strategy of being at the table is important in creating the change we want: to move the transportation system to be more sustainable,” said Alfsen. “But not everyone has been at the table; historically it’s been restricted to the rich and powerful, and ordinary communities and citizens have been not included until recently.”
When asked what her favorite projects have been during her time at California Walks, Alfsen had a hard time choosing. “I’m a personal fan of both policy and personal empowerment. Whenever those can be brought together, I’m excited,” she said. One favorite project was a recent success: a youth-led community engagement project that resulted in Active Transportation Program funding for a sidewalk in an unincorporated area in the Greenfield neighborhood in Bakersfield.
California Walks had been training and mentoring youth leaders, who took charge of the project. “California Walks didn’t create the idea, or apply for the grant,” she said. “It was not our baby. The youth leaders did all the work of organizing and publicizing meetings, gathering signatures, making it happen. We provided some information about how to find technical expertise, and how to access resources.”
“Of course, California Walks had worked to help create the Active Transportation Program and to make sure pedestrian projects were included in it,” she added.
She was just as excited about state policy work she’d been involved in, including an effort to remove barriers from the vehicle code so that “senior zones”—low-speed areas near senior housing, similar to school zones—could be created more easily. She’s working now with Caltrans staff to create a signage standard to bring more attention to these zones, like the bright green signs and flashing lights used near some schools.
“All of this change is incremental, but it makes a big difference on the ground,” she says about her policy work. “As we’re developing more multifamily housing along transit corridors—and recognize that those transit corridors are also high-speed arterials—the question becomes one of how people can get across the street safely.”
It’s also a question of funding priorities. The U.S. has spent most transportation money making it easier for people to travel in cars. Now the country is faced with climate change resulting in part from the free and easy use of private gasoline-powered vehicles, as well as irremediable traffic congestion—and judging from the latest proposals out of the state capitol, funding priorities don’t seem to be changing.
Alfsen is enthusiastic about the potential for California Walks’ work to contribute to solving climate change. “And at a low cost,” she says. “More than 25 percent of all trips we make in California—for shopping, errands, school, and work, every trip we make—are one mile or less, and over forty percent are five miles or less. In fact, only twenty percent of ALL trips are over ten miles—yet we invest all of our resources on those trips.”
“Unfortunately, currently about half of the shorter trips are still made in single-occupant vehicles. If that could be reduced to ten or fifteen percent, we could make quite a bit of progress.”
To do that, communities need to transform themselves into places where people will choose to walk those short trips. That means taking into account many factors, which, according to California Walks’ website, include:
. . . elements of the built environment such as availability of continuous, level sidewalks and pathways, safe, accessible crossings, lighting, vehicle speed, number of lanes and street width. Other factors that influence walking and walkability include real and perceived safety from crime, gang activity and aggressive dogs, graffiti, maintenance of trees and greenery, safe access to desired destinations (park, school, grocery, library, post office, etc.), public amenities like benches, drinking fountains, public art, restrooms, and trash cans, among many others.
That covers a pretty wide range of things that need addressing. At the same time, advocating for pedestrians can be hard for some people to take seriously. “Like breathing or drinking water, we learned how to walk when we were small children, and we take it for granted. Even the word pedestrian has come to mean everyday, monotonous, boring,” said Alfsen.
And in transportation decision making, no one was taking notice of walking. “We have devoted our resources to funding ‘limited access’ travel for cars,” said Alfsen, “but the only access that’s been limited is pedestrian access.”
“We have squandered our resources.”
The irony is that, as Dang pointed out, “everybody experiences walking basically every single day. Even people who only walk from a parking garage to an office—everyone has had experiences of unsafe crossings.”
As the new Executive Director, Dang plans to continue the work of building a coalition among like-minded organizations. High on his priority list is to ensure that proposals for transportation funding incorporate equitable approaches to who gets funding, and to make sure that transportation investments reflect state climate change goals, rather than merely continue to encourage current unsustainable means of travel.
California Walks will also be focusing on raising awareness among newly elected lawmakers of the importance of these issues.
Join us at the Internet Archive this Saturday January 7 for a government data hackathon! We are hosting an informal hackathon working with White House social media data, government web data, and data from election-related collections. We will provide more gov data than you can shake a script at! If you are interested in attending, please register using this form. The event will take place at our 300 Funston Avenue headquarters from 10am-5pm.
We have been working with the White House on their admirable project to provide public access to eight years of White House social media data for research and creative reuse. Read more on their efforts at this blog post. Copies of this data will be publicly accessible at archive.org. We have also been furiously archiving the federal government web as part of our collaborative End of Term Web Archive and have also collected a voluminous amount of media and web data as part of the 2016 election cycle. Data from these projects — and others — will be made publicly accessible for folks to analyze, study, and do fun, interesting things with.
At Saturday’s hackathon, we will give an overview of the datasets available, have short talks from affiliated projects and services, and point to tools and methods for analyzing the hackathon’s data. We plan for a loose, informal event. Some datasets that will be available for the event and publicly accessible online:
- Obama Administration White House social media from 2009-current, including Twitter, Tumblr, Vine, Facebook, and (possibly) YouTube
- Comprehensive web archive data of current White House websites: whitehouse.gov, petitions.whitehouse.gov, letsmove.gov and other .gov websites
- The End of Term Web Archives, a large-scale collaborative effort to preserve the federal government web ( .gov/.mil) at presidential transitions, including web data from 2008, 2012, and our current 2016 project
- Special sub-collections of government data, such as every powerpoint in the Internet Archive’s web archive from the .mil web domain
- Extensive archives of social media data related to the 2016 election, including data from candidates, pundits, and media
- Full text transcripts of Trump candidate speeches
- Python notebooks, cluster computing tools, and pointers to methods for playing with data at scale.
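Much of the web archive data above is indexed in the plain-text CDX format, which is a natural starting point for the kind of analysis the hackathon invites. As a minimal sketch, the Python below parses the seven-field default CDX layout (urlkey, timestamp, original URL, MIME type, HTTP status, digest, length); the sample index lines are fabricated for illustration, not taken from a real index.

```python
# Sketch: parsing records in the space-separated CDX index format
# commonly produced by web archive crawls.
from collections import namedtuple

CdxRecord = namedtuple(
    "CdxRecord",
    ["urlkey", "timestamp", "original", "mimetype", "statuscode", "digest", "length"],
)

def parse_cdx_line(line):
    """Split one CDX line into a named record, validating the field count."""
    fields = line.split()
    if len(fields) != 7:
        raise ValueError(f"expected 7 CDX fields, got {len(fields)}")
    return CdxRecord(*fields)

# Hypothetical sample index lines for whitehouse.gov captures.
sample = [
    "gov,whitehouse)/ 20090120120000 http://whitehouse.gov/ text/html 200 ABC123 5120",
    "gov,whitehouse)/ 20170119235959 http://whitehouse.gov/ text/html 200 DEF456 6031",
]

records = [parse_cdx_line(line) for line in sample]
# Filter by MIME type, e.g. to pull out only HTML captures
# (the same approach would find every PowerPoint in a .mil index).
html_captures = [r for r in records if r.mimetype == "text/html"]
print(len(html_captures))  # 2
```

The same filtering idea scales up in the cluster-computing tools mentioned above: the per-line parse is embarrassingly parallel, so it maps directly onto a notebook or a distributed job over the full index.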
Much of this data was collected in partnership with other libraries and with the support of external funders. We thank, foremost, the current White House Office of Digital Strategy staff for their advocacy for open access and working with us and others to make their social media open to the public. We also thank our End of Term Web Archive partners and related community efforts helping preserve the .gov web, as well as the funders that have supported many of the collecting and engineering efforts that make all this data publicly accessible, including the Institute of Museum and Library Services, Altiscale, the Knight Foundation, the Democracy Fund, the Kahle-Austin Foundation, and others.
The Web dwells in a never-ending present. It is—elementally—ethereal, ephemeral, unstable, and unreliable.
–Jill Lepore, from “The Cobweb: Can the Internet be Archived?” in the New Yorker, January 26, 2015
For twenty years, here at the Internet Archive, we’ve been trying to capture lightning in a bottle. How do you archive the “ethereal, ephemeral, unstable, and unreliable” Web? Since 1996, that has been part of our daily work. We crawl the Web, preserve it, try to make it play back, as if you were back in 1999 on your own GeoCities page, delighting in that animated Under Construction GIF you just posted.
On October 26, 2016, we will be celebrating our 20th Anniversary, and we hope you will join us. We’ve been grappling with how to convey the enormity of our task. How do you visualize the universe of the Web—the audio, images, Web pages, and software that we’ve been archiving for the last 20 years? When you come to our celebration, we’ll be presenting the work of media innovators, each trying to capture the ephemeral Web:
One view from Cyberscape, Owen Cornec and Vinay Goel’s visualization of the top 800,000 Web sites
- Cyberscape—Data visualization engineer, Owen Cornec and Internet Archive Data Scientist, Vinay Goel team up to create an interactive exploration of the top sites on the Web, as captured by the Wayback Machine as early as 1996.
- Deleted Cities—Artist Richard Vijgen’s interactive visualization of GeoCities, once the Web’s largest online community. When Yahoo decided GeoCities was obsolete in 2009, the Internet Archive and Archive Team rushed to preserve tens of millions of GeoCities “homestead” pages before they were erased. Vijgen’s work takes you back to the neighborhoods and virtual cities where a vibrant society once lived online.
Paul D. Miller aka DJ Spooky will perform a newly commissioned piece on October 26.
- DJ Spooky aka Paul D. Miller & media innovator, Greg Niemeyer join forces to create an audio and video composition, drawn completely from media preserved in the Internet Archive. DJ Spooky’s work ranges from producing 14 albums to the DVD anthology, “Pioneers of African American Cinema,” about which the New York Times wrote “there has never been a more significant video release.”
- How Media & Messaging are Shaping the 2016 Election—journalist and former Managing Editor of the Sunlight Foundation, Kathy Kiely, explains how short snippets—of debates, political ads, cable news—are altering the Presidential landscape. This analysis is made possible in part by the Internet Archive’s Political Ad Archive, preserving key ads and debates and monitoring how they are used in swing states.
- Defining Memes & Memories—perhaps the world’s only Free Range Archivist, Jason Scott, takes you on a wild ride through 20 years of memes that captured the global imagination. From the original keyboard cat to Three Wolf Moon, Scott explores the Archive items and collections that rocked the world.
And to round out the evening, Internet Archive Founder, Brewster Kahle, will reflect upon his lifelong obsession—backing up the Web, making it more reliable and secure. Our work is just beginning, but if we are successful, new generations of learners will be able to access the amazing universe of the Web, learn from it, and build societies that are even better.
GET YOUR FREE TICKET TO “How to Build an Archive—20 Years in the Making.” Wednesday, October 26, 2016 from 5-9:30 p.m. at the Internet Archive, 300 Funston Avenue, San Francisco.
“Protected bike lanes–that’s all I care about”
Kristin Smith moderated a member Q&A with Bike Coalition director Brian Wiedenmeier. Photo: Liam Brooks.
Yesterday evening some 200 San Francisco Bicycle Coalition (SFBC) members came to the Brick and Mortar Music Hall at Duboce and Mission to meet SFBC’s Executive Director, Brian Wiedenmeier. They had bánh mì sandwiches, a few beers, and good conversation before a formal Q&A hosted by Kristin Smith, currently with SFMTA and former SFBC staffer.
The questions were varied, but key in the discussion was Mayor Edwin Lee’s recent Executive Directive, instructing the police, the parks department, and SFMTA to accelerate Vision Zero projects and specific bicycle infrastructure plans in response to last month’s deaths of two cyclists, Heather Miller and Kate Slattery.
By six, the venue was already crowded, with a hundred or so bikes parked in the valet stand outside. Participants were of varied ages and professions, but all had one thing in common–a passion for cycling and safe streets. “I’m interested in seeing what Brian has to offer,” said Matt Dove, an SFBC member since 2004. Dove is Director of Bicycle Programs for the Presidio Community YMCA. “I’m curious to see what he’ll do about inclusivity,” said Eliza Barrios, a tech support staffer with the Wikimedia Foundation. She was especially interested in efforts to expand bike infrastructure in under-served and outlying areas of San Francisco, such as the Bayview and the Tenderloin.
If there was a common theme in concerns from participants, it was skepticism about whether the Mayor’s new directive will lead to tangible change on the ground. “We get too stuck in the minutia of projects in San Francisco… we need leadership to stop being re-active and start being pro-active,” said Dove.
Delicious sandwiches, beer and a whole lot of frustration with SF bike infrastructure was shared prior to the formal Q&A. Photo: Liam Brooks.
“I’m essentially encouraged [by the Mayor’s directive],” said Adam Long, SFBC member and a bicycle advocate who has been featured in Streetsblog before. “But I’m discouraged by some of the vague deadlines.”
“I don’t think the city has my back,” said David Bach, an office administrator for a San Francisco tech company, about the Mayor’s directive. “I want to see the pedal to the metal [in installing bike infra], not more smoke!” Bach, who commutes by bike every day from near the San Francisco Zoo to Howard and 7th, also asked why the city can manage to ban–and enforce the ban on–plastic bags and public nudity, but not texting and driving.
The perceived lack of speed in installing infrastructure was almost universally shared, although some seemed equally concerned about the quality of the infrastructure that has gone in to date. Steve Laporte, a San Francisco attorney, said portions of raised bike lane on Market Street are lacking because cars still park there. “Then I have to get off of it, go into traffic, and then I can’t get back on because it’s raised,” he said. He wants to see physical protection on the lane. “I see piecemeal fixes, but a bike commute isn’t piecemeal and everybody is only as safe as they are in the most dangerous section they have to ride on,” he said. “Protected bike lanes–that’s all I care about,” said another participant in the event.
Wiedenmeier himself, meanwhile, was happily socializing and hearing from members informally before the Q&A began. “People are frustrated by the rate of project delivery and double parking,” he told Streetsblog. “I love hearing from people about spot improvements… tweets etc. Our organization really relies on them.”
The formal portion of the event started at 7 p.m. Smith read questions from the SFBC members. The first question was on efforts to increase diversity. “Our membership is strong and steadily growing, but we could do better to have it reflect the composition of San Francisco in terms of race, gender, ethnicity and geographic distribution,” Wiedenmeier said. “As a gay white man in my 30s, my experience is very different from a black person, a brown person, a woman of any race, or an older person. I won’t make assumptions about the experiences of others.” He added that the organization has to develop Spanish and Cantonese materials and on-staff fluency. They also talked about double parking and cars blocking the bike lanes. “I think MTA and SFPD can do a better job of enforcement,” he said. “But SFPD could ticket on Friday night, and on Saturday the same problem comes up.” Instead, he opined, the answer is protected bike lanes that make it physically impossible to block the bike lane in the first place.
But will those lanes ever get installed? Wiedenmeier is hopeful that the Mayor’s Executive Directive can cut through what he calls the “culture of delay.”
“The Masonic improvements were approved and funded four years ago and broke ground last week! That’s an unacceptable amount of time,” he said. “But we have leverage now. We can go to the Mayor and say ‘you directed the city to get this done in nine months’ and we will call you out every time it doesn’t happen.”
While some in the audience remained skeptical of the Mayor’s ability to follow through, Wiedenmeier expressed confidence. “It’s a big deal. 1,700 people contacted the Mayor’s office. When that many people write in, change can happen.”
This cake was outside the store in a garbage can and had a price tag of $40 on it.