A staggering 88% of companies listed in the 1955 Fortune 500 are nowhere to be found in the same list today. They have gone bankrupt, merged, or simply shrunk off the list. Half a century ago, the life expectancy of a firm in the Fortune 500 was around 75 years. Now it’s less than 15 years.
Box 1: Manage the present at peak efficiency and profitability.
Box 2: Escape the traps of the past by identifying and divesting businesses and abandoning practices, ideas, and attitudes that have lost relevance in a changed environment.
Box 3: Generate breakthrough ideas and convert them into new products and businesses.
To endure, companies must excel at all three boxes, or their success could be very short-lived. In our work, many leaders tell us that Box 2—destroy the obsolete—is the most challenging; they find it hard to let go of the past. And yet without Box 2, organizations don’t truly transform; they persist in limiting ways of operating.
So, how do you build Box 2 muscle? How do you build a company that’s able to routinely toss what no longer works? One effective way is to work on culture. Box 2 muscle requires a culture where honest Box 2 questions are encouraged. Here are three things leaders can do to create a culture that is good at escaping the traps of the past:
Create a collective narrative that helps people understand why shedding the past (Box 2) is part of doing business and how to do it.
On the first day of his new job as CEO of Microsoft, Satya Nadella sent a powerful message to his employees. The message was intended to inspire new energy in the company at a pivotal time:
“While we have seen great success, we are hungry to do more. Our industry does not respect tradition—it only respects innovation.”
Nadella was sending a clear message to look past the “sacred cows” of the organization and pursue an agenda of innovation over orthodoxy. Later that year he reinforced the message:
“We must each have the courage to transform as individuals. We must ask ourselves, what idea can I bring to life? What insight can I illuminate? What individual life could I change? What customer can I delight? What new skill could I learn? What team could I help build? What orthodoxy should I question?”
In these two emails, Nadella introduced a new Box 2 narrative into Microsoft’s culture by openly calling the status quo into question.
Looking at Nadella’s communications, ask yourself: what new narratives could you introduce to create a Box 2 culture in your company?
Role model Box 2 decisions, implicitly giving permission for others to do the same.
The CEO of a prominent Silicon Valley company, whom we’ll call Peter, unexpectedly got the chance to model Box 2 decision making at a recent off-site. In the middle of passionately advocating the development of a line extension for a current product, which we’ll call Product X, Peter was interrupted by a colleague: “Given that we’re phasing out Product X and launching a much better product, Product Y, does it make sense for us to now develop Product X version 2?” The tension in the room was palpable.
Peter paused and looked down at the table. The room quieted even more. A moment later Peter looked up and smiled. “Yes, you’re right, that doesn’t make any sense,” he said. “Let’s not launch Product X version 2 and instead focus all our efforts on Product Y.”
There was a collective sigh of relief with the realization that no further effort would be directed towards a product that was seen as obsolete. After the meeting, one colleague pointed to Peter’s decision as a key turning point: “Peter’s willingness to change his mind in public was huge progress for us in establishing a strong Box 2 culture.”
What decisions could you make that set a Box 2 tone in your organization?
Make symbolic bets to remind people that the Box 2 culture is here to stay.
Some innovation leaders make Box 2 symbolic bets that send powerful ripples throughout their organizations. Symbolic bets are highly meaningful and visible actions that touch people’s hearts, signal that the new way of doing business is here to stay, and have real business impact.
Former GE boss Jack Welch was a master at using symbolic bets to shape GE’s culture in service of his strategic goal of being #1 or #2 in every market the company competed in. He recounts one such example in his autobiography Jack: Straight from the Gut:
“In those days I was… trying to blow up traditions and rituals that I felt held us back. In the fall of 1981, [I challenged] the Elfun Society, an internal management club at GE….It was a networking group for white-collar types.”
In his characteristic style, Welch blasted the Elfun Society at their leadership conference. He saw the club as a symbol for “superficial congeniality.” In his address he described them as an “institution pursuing yesterday’s agenda” and told them “he could never identify with their recent activities.” His Box 2 speech had an impact within Elfun and its turnaround became a symbol of transformation for GE as a whole:
“…Today Elfun has more than 42,000 members, including retirees. They volunteer their time and energy in communities where GE has plants and offices. They have mentoring programs for high school students….Elfun’s self-engineered turnaround became a very important symbol [of reducing bureaucracy]. It was just what I was looking for.”
As humans, we remember symbols, and Jack Welch was a master at creating symbols that inspired many to think and be bigger. Imagine Jack looking at your organization. What symbolic bets would he make to remove unproductive vestiges of the past and create a powerful Box 2 culture?
It’s important to remember that Box 2 is part of a road that is endless. Innovation leaders are fascinated with that road. We can call it a hero’s journey: mindfully maintaining what serves, courageously letting go of what doesn’t, and purposefully creating what will.
When you’re looking for a job or exploring a new career path, it’s smart to go out on informational interviews. But what should you say when you’re actually in one? Which questions will help you gain the most information? Are there any topics you should avoid? And how should you ask for more help if you need it?
What the Experts Say “Informational interviews are essential to helping you find out more about the type of industry, company, or role you’re interested in,” says Dorie Clark, author of Stand Out Networking. “You may think you already know all about a certain position, but speaking to someone directly gives you the opportunity to test your assumptions.” John Lees, a UK-based career strategist and author of The Success Code, agrees. Informational interviews “give you exposure — a way to get yourself known in the hidden job market,” he says. “The visibility may put you straight onto a short list, even if a job isn’t advertised.” They can also be a great boost to your self-esteem. “You get to wear smart business clothes and visit places of work, which maintains your confidence levels in a job search,” he explains. So whether you’re actively trying to change roles or just exploring different professional paths, here are some tips on how to make the most of an informational interview.
Prepare and practice Informational interviews are, according to Clark, “a safe environment to ask questions.” But that doesn’t mean you should go in cold. After all, your goal is to come across in a way that inspires others to help you. So do your homework. Study up on industry lingo. Learn who the biggest players are. Be able to talk about the most important trends. You don’t want to waste your expert’s time asking Google-able questions. “You will come across as a more serious candidate if you are familiar with the jargon and vocabulary,” says Clark. Lees concurs. “Showing that you’ve done your background research plants the idea of credibility in the other person’s mind,” he says. Work on your listening and conversation skills too. Lees suggests that you practice “asking great questions and conveying memorable energy” with “people who are easy to talk to, such as your family, your friends, and friends of friends.”
Keep your introduction short “What frustrates busy people is when they agree to an informational interview, and then the person seeking advice spends 15 minutes talking about himself and his job search” instead of learning from them, says Lees. It’s not a venue to practice your elevator pitch; it’s a place to “absorb information and find stuff out.” Clark suggests preparing a “brief, succinct explanation about yourself” that you can recite in three minutes max: “Here’s my background, here’s what I’m thinking, and I’d like your feedback.” People can’t help you unless they understand what you’re looking for, but this part of the conversation should be brief.
Set the tone “You want to leave people with a positive impression and enough information to recommend you to others,” says Lees. At the beginning of the interview, establish your relationship by revisiting how you were connected in the first place. “Ideally, this person has been warmly introduced to you” — perhaps you have a friend or colleague in common or you share an alma mater — so remind them, he says. It’s also a good idea to state at the outset that “you’re interested in talking to 10 or 15 industry experts” during your information-gathering phase. “That way, the person will start to process the fact that you are looking for additional sources early on. If you wait until the end to ask for other referrals, she might be caught off guard.” Ask about time constraints up front too, says Clark. “If, at the end of the time allotted, you’re having a good conversation, say, ‘I want to respect your time. I would love to keep talking, but if you need to go, I understand.’ Prove you’re a person of your word.”
Think like a journalist Prepare a list of informed, intelligent questions ahead of time, says Clark. “You don’t necessarily need to stick to the script, but if you’re unfocused and you haven’t planned, you risk offending the person.” Lees recommends approaching your interview like “an investigative journalist would.” You’re not cross-examining your expert, and you certainly don’t want to come across as “pushy or difficult,” but you should “gently probe through curiosity, then listen.” He suggests a framework of five questions along the lines of Daniel Porot’s “Pie Method”:
How did you get into this line of work?
What do you enjoy about it?
What’s not so great about it?
What’s changing in the sector?
What kinds of people do well in this industry?
You can adapt these questions to your purposes; the idea is to help you “spot the roles and fields that match your skills and experience and give you an understanding of how top performers are described.”
Deliberately test your hypotheses Your mission is to grasp the reality of the industry and the job so you can begin to decide if it’s right for you. So don’t shy away from sensitive topics. “You want to hear about the underbelly,” says Clark. She suggests questions “designed to elicit the worst information,” such as:
What are the worst parts of your job?
What didn’t you know before you got into this industry that you wish someone had told you?
Some topics, such as money, may seem taboo but can be broached delicately. “Don’t ask, ‘How much money do you make?’ Instead, say something like, ‘I’ve done some research online, and it seems that the typical salary range is this,’ so you’re just asking for confirmation of public information,” says Clark.
It’s also okay to ask for advice on “how to position yourself” for a job in the industry by making your experience and skills sound relevant. She recommends saying something like, “Based on what you know about my background, what do you see as my weaknesses? And what would I need to do to allay the concerns of a potential hiring manager?” If the feedback is negative, consider it valuable information but get second and third opinions. “One person’s word is not gospel,” she says. “You may not be qualified, but you also may have spoken to a stick-in-the-mud who discourages everyone. Don’t let him limit your career options.”
Follow up with gratitude, not demands While thanking the person for their time via email is a must, Lees recommends also sending a handwritten note to express gratitude right after you meet. “It will help you be remembered,” he says. Your thank-you letter needn’t be flowery or overly effusive; instead, it should describe how the person was helpful to you and, ideally, that her guidance led to “a concrete outcome” in your job search.
Whatever you do, don’t immediately ask for a favor, adds Clark. Not only is it “considered bad manners,” but it’s also practically “an ambush because you barely know the person.” That said, “If, a couple of weeks later, a job opens up at the person’s company, you can tell the person you’re applying for it and ask if she has any quick thoughts on professional experiences you should play up in your cover letter.” If she takes the ball and runs with it and offers to put in a good word for you, that’s great. But do not ask for it.
Play the long game The real purpose of informational interviews is to build relationships and “develop future allies, supporters, and champions,” says Lees. So don’t think of them as one-off meetings in which “someone gives you 15 minutes of his time.” Take the long view and think about ways to cultivate your new professional connection. Forward him a link to a relevant magazine article, for instance, or invite her to an upcoming conference or networking event. In other words, be helpful. “You want to be seen as giving, not constantly taking,” Lees says. Clark notes that it can be a tricky proposition when there’s a wide age or professional gap between you, but if you focus on keeping the person “apprised of your progress” — perhaps writing him a note saying you read the book he suggested or that you joined the professional association he recommended — “it shows you listened and that his advice mattered.”
Principles to Remember
Do:
– Do your homework. Research enough before going in that you sound like a credible candidate who’s committed to moving into a new sector.
– Prepare a succinct explanation about your background and what you’re looking for.
– Send a handwritten thank-you note. It’s good manners and makes you memorable.
Don’t:
– Go in cold. Practice doing informational interviews with friends and family so you get used to asking great questions and listening.
– Let one negative informational interview sour you on a job, company, or career path. Solicit other opinions.
– Ask for favors — it’s unseemly. Instead, ask for advice on how to position yourself in the job market.
Case Study #1: Prepare and be gracious Two years ago, Matt McConnell, who lives in southern California, wanted to move from finance to marketing. He wasn’t entirely sure of his direction, so he began using informational interviews to learn about other people’s careers in the hopes of narrowing his focus. “I was also using the interviews to learn more about other organizations to see whether they might be places I’d want to work,” he says.
His first informational interview didn’t go very well, and Matt takes full responsibility. “I didn’t prepare,” he recalls. “He could tell, and he told me that I was wasting his time.”
Matt learned an important lesson. “I’ve never made that mistake again. I now always overprepare,” he says.
To get ready, he reads people’s LinkedIn profiles, does a Google search on their careers, and checks out their company’s website. He tends to ask the same questions, usually in the realm of how the person got started and how they ended up in their current role. “But I also make notes about particular questions I want to ask so that I have something to reference if the conversation stalls,” he says.
Matt also has a post-meeting routine. “I ask for a business card and immediately send a handwritten thank-you note. The thank you is typically three lines long, and I always mention one specific thing from our meeting that resonated with me so they know I was listening and found their time valuable,” he says.
“Early on in my career I worried that I didn’t have anything to offer anyone in return. [But] I learned that people enjoyed sharing their experiences and offering advice, so I make sure to communicate my sincere gratitude.”
Matt eventually had an informational interview with a marketing head of a quick-service restaurant group that yielded results. “After our meeting, the person called me and said her company was hiring for a role she thought I’d be perfect for,” he says. “She’d given my name to the HR department, and they were planning on calling me within the next 30 minutes to do a phone interview. That phone interview led to in-person interviews and eventually a job offer at that company.”
He worked at the company for a few years before moving on. He’s now the marketing manager for Astrophysics, a company that designs X-ray scanners for security screenings.
Case Study #2: Be respectful and don’t let negative feedback discourage you A few months ago, Susan Peppercorn, a career coach and founder of Boston-based Positive Workplace Partners, decided she wanted to write a book about work satisfaction. Trouble was, she had no experience in the publishing industry beyond blogging. To educate herself, she has been doing a lot of informational interviews.
“Some are with writers, others editors, and others published authors,” she says. “In each case, I think in advance about each person’s expertise and focus my questions on the areas where I think they might have the most valuable advice.”
Susan makes sure she is respectful of the other person’s time, never asking for more than 30 minutes and always meeting at the person’s convenience, not hers. Before each interview, Susan also considers how she might help the person with whom she’s meeting: she might have a contact she could introduce, for instance, or she could offer to look over a resume or cover letter.
One of her recent interviews was with a potential editor. Susan was excited, and she prepared by thinking about what this particular person would look for in taking on a client. She began the conversation with a two-minute description of her book idea. But during the discussion, it became apparent that the editor’s goals and hers were quite different. “He told me in a very nice way that I had virtually no chance of having a publisher accept my book proposal. My balloon was burst quickly.”
Still, after the initial disappointment, she found value in his advice. “I learned about the importance of having a platform before approaching a publisher, since they want to know in advance that your book will sell well,” she says. “That saved me a lot of time and effort trying to pitch to publishers and helped me look at the viability of self-publishing. It also made me realize that I had more work to do with regard to clarifying and communicating the value of my book.”
The experience also helped her hone her approach for subsequent informational interviews. Now she shares a brief outline of her book in advance, with a short paragraph on her motivation for wanting to write it.
One of her most recent meetings, with a published author, was extremely helpful. “He explained the concept of a platform and helped me brainstorm potential ones for my work,” she explains.
The biggest obstacle to our productivity is often ourselves. One way to motivate yourself is to think of the time you get each day as a “magic bank account” that refills every morning: if you don’t use what’s in it, you lose it.
According to a Gallup Poll taken in 2013, only 21 percent of people characterized business executives as having “high” ethical standards—a little above lawyers (19 percent), but below bankers (28 percent) and journalists (28 percent). Whether that’s deserved or not, it’s nevertheless true that executives set the ethical tone at their companies. But employees have the power to improve it.
Employees engage in unethical behavior from time to time: an employee takes home company supplies; a manager submits personal expenses for reimbursement by the company; a member of top management pressures accountants to manipulate the financial results. These are just a few of the acts that violate virtually all company codes of ethics. The question is: why does it happen, and can anything be done to prevent it?
Ethical behavior must be modeled by the leader of an organization. For me, the issue is not so much a lack of business ethics as a lack of ethical leadership by management. Underlying all leadership characteristics is the need for a strong sense of ethics – of right and wrong – to guide those in the organization who look for moral direction when difficult issues arise or workplace conflicts occur.
Leadership and management go hand in hand. They are not the same thing. But they are necessarily linked, and complementary. Any effort to separate the two is likely to cause more problems than it solves.
Still, much organizational-development writing has been devoted to delineating the differences. The manager’s job is to plan, organize, and coordinate. The leader’s job is to inspire and motivate. In his 1989 book “On Becoming a Leader,” Warren Bennis composed a list of the differences:
– The manager administers; the leader innovates.
– The manager focuses on systems and structure; the leader focuses on people.
– The manager relies on control; the leader inspires trust.
– The manager has a short-range view; the leader has a long-range perspective.
– The manager asks how and when; the leader asks what and why.
– The manager has his or her eye always on the bottom line; the leader’s eye is on the horizon.
– The manager is the classic good soldier; the leader is his or her own person.
– The manager does things right; the leader does the right thing.
In the new economy, where value comes increasingly from the knowledge of people, and where workers are no longer undifferentiated parts of an industrial machine, management and leadership are not easily separated. People look to their managers, not just to assign them a task, but to define for them a purpose. And managers must organize workers, not just to maximize efficiency, but to nurture skills, develop talent and inspire results.
The late management guru Peter Drucker was one of the first to recognize this truth, as he was to recognize so many other management truths. He identified the emergence of the “knowledge worker,” and the profound differences that would cause in the way business was organized.
With the rise of the knowledge worker, “one does not ‘manage’ people,” Drucker wrote. “The task is to lead people. And the goal is to make productive the specific strengths and knowledge of every individual.”
For businesses to become more ethical and inspire a new generation of leaders, managers must share a vision of what it means to be successful in business. In the end, success is not bottom-line profit, which is fleeting at best. It is the way managers interact with people: how they treat their employees; how they deal with customers and suppliers; and whether they respect the accounting process, making decisions based on accurate and complete financial statements rather than statements manipulated for short-term gain or self-interest.
Managers and leaders must be committed to ethical relationships and ethical decision making, and that commitment must be ingrained in the DNA of the organization.
It has been said that character is revealed over time and under pressure. Nothing demonstrates this better than watching how an organization’s management handles a crisis: whether it stresses underlying core ethical values (honesty, integrity, respect, and responsibility) rather than unethical ones (profits, individual wealth, power, and influence).
Blog posted by Dr. Steven Mintz, aka Ethics Sage, on October 6, 2015. Professor Mintz is on the faculty of the Orfalea College of Business at Cal Poly San Luis Obispo. He also blogs at: www.workplaceethicsadvice.com.
The end of capitalism has often been imagined as a crisis of epic proportions. Perhaps a financial crisis will occur that is so vast not even government finances can rescue the system. Maybe the rising anger of exploited individuals will gradually congeal into a political movement, leading to revolution. Might some single ecological disaster bring the system to a halt? Most optimistically, capitalism might be so innovative that it will eventually produce its own superior successor, through technological invention.
But in the years that have followed the demise of state socialism in the early 1990s, a more lackluster possibility has arisen. What if the greatest threat to capitalism, at least in the liberal West, is simply lack of enthusiasm and activity? What if, rather than inciting violence or explicit refusal, contemporary capitalism is just met with a yawn? From a political point of view, this would be somewhat disappointing. Yet it is no less of an obstacle for the longer-term viability of capitalism. Without a certain level of commitment on the part of employees, businesses run into some very tangible problems, which soon show up in their profits.
This fear has gripped the imaginations of managers and policymakers in recent years, and not without reason. Various studies of employee engagement have highlighted the economic costs of allowing workers to become mentally withdrawn from their jobs. Gallup conducts frequent and wide-ranging studies in this area and has found that only 13 percent of the global workforce is properly “engaged,” while around 20 percent of employees in North America and Europe are “actively disengaged.” They estimate that active disengagement costs the U.S. economy as much as $550 billion a year. Disengagement is believed to manifest itself in absenteeism, sickness and—sometimes more problematically—presenteeism, in which employees come into the office purely to be physically present. A Canadian study suggests over a quarter of workplace absence is due to general burnout, rather than sickness.
Few private-sector managers are required to negotiate with unions any longer, but nearly all of them confront a much trickier challenge: dealing with employees who are regularly absent, unmotivated, or suffering from persistent, low-level mental-health problems. Resistance to work no longer manifests itself in organized voice or outright refusal, but in diffuse forms of apathy and chronic health problems. The border separating general ennui from clinical mental-health problems is especially challenging for managers in 21st-century workplaces, since it requires them to ask personal questions on matters that they are largely unqualified to deal with.
Lack of engagement from the workforce also registers as a problem for governments, inasmuch as it bites into economic output, and in doing so hits tax receipts. In societies with socialized health insurance and unemployment insurance, the problem is far more serious. There is a growing economic problem of individuals dropping out of work due to some often ill-defined personal and intangible problem, then gradually sinking into a more generalized inactivity. These people may show up at the doctor’s office on a regular basis, making complaints about undiagnosable pains and problems. This is often because they have nobody else to speak to and are lonely. Unemployment undermines their sense of self-worth, and inactivity brings various other psychosomatic problems with it. A general deflation of psychological and physical capacity is the end result, which in many societies produces costs for the state to pick up.
The causes of mental-health problems are obviously complex and do not lie simply in the economy any more than they do in brain chemistry. But it is the way in which these problems manifest themselves in the workplace, threatening productivity as they do so, that has placed them among the greatest problems confronting capitalism today. It is the principal reason that the World Economic Forum is now so concerned about our health and happiness. The murky gray area separating workplace disaffection from a clinical disorder has required managers, and the human-resources profession especially, to equip themselves with various new ways of intervening in the minds, bodies, and behaviors of their workforce. The term most commonly used to describe the goal of these new interventions is “well-being,” which encompasses the happiness and health experienced by employees.
There is a clear economic incentive for managers to consider the positive attitude of employees. Endless studies have shown that workers are more productive when they feel happy, possibly by as much as an additional 12 percent of output. And in workplaces where they feel respected, listened to, consulted, and involved, they are more likely to work harder, and less likely to take sick leave. Where employees have no say in how their work is organized, this is known to generate some of the psychological problems that now concern businesses, up to and including mental health problems. By emphasizing well-being, managers hope to turn a vicious circle of disengagement and ill-health into a virtuous one of active, fulfilling commitment.
It is tempting to be cynical about some of this: The manager is after all still attempting to extract effort from the worker. But why not also recognize the opportunity contained in this current business anxiety? If capitalism is being ground down by the chronic, unspecifiable alienation of those it depends on, then surely solving that problem may also open up possibilities for political reform? The hard economic costs that ennui now places upon employers and governments mean that human misery has shown up as a chronic problem that elites cannot simply shove aside. The question of what type of work, and what type of workplace organization, might generate a real sense of commitment and enthusiasm on the part of workers should not be abandoned altogether.
The difficulty is that the enthusiasm managers are seeking to promote is no less slippery than the psychosomatic problems they are seeking to avoid. A report commissioned by the U.K. government on the importance of employee engagement found it impossible to say exactly what this gaseous entity consists of. Expert insights that “you sort of smell it” and “know it when you see it” confirmed a shortage of objectivity on this particular issue. Managers and policymakers yearn for a hard science of workplace happiness. But it is with that sort of hard science that many of our problems begin.
* * *
Confronted by other people’s problems which are both ambiguous and personal, senior decision-makers have a tried and tested coping method: bring in the external contractors and consultants. There is copious political and market demand for experts willing to pronounce and act upon the well-being of others, on the basis of some presumed scientific authority. These sit on a spectrum between qualified medical practitioner and ill-informed bully. When handling painful issues of other people’s health and happiness, outsiders have the great advantage of being able to duck full moral accountability and, if necessary, withdraw from the job altogether. Jeremy Bentham’s vision of a “National Charity Company,” a corporation established by the state to put people to work, foreshadowed today’s murky world of workfare that lies in the unaccountable gaps between market and state.
In its bid to push people off reliance on the welfare state and into the labor market, the U.K. government appointed the public service outsourcing company Atos to conduct “work capability assessments” of individuals. As this agenda was ramped up by the Conservative-led government from 2010 onwards, it led to a number of tragedies and acts of cruelty. These included the suicide of a 53-year-old blind and agoraphobic man, Tim Salter, only weeks after his benefits were stopped in 2013, following an assessment by Atos that he was able to work. Atos also found individuals suffering brain damage and terminal cancer to be “fit for work.” In 2011, Britain’s General Medical Council investigated twelve doctors working for Atos as disability assessors, due to allegations that they were not performing their duty of care towards patients.
When it comes to motivating people to seek work, the government once again stands back, letting its contractors perform the most controversial psychological interventions. Those being forced to seek work are assessed in terms of their attitude and optimism, and then have their motivation reactivated. The companies that carry out this task in the British context are A4e and Ingeus, which hold contracts with the government to get unemployed people into jobs. In the eyes of these contractors, unemployment is really a “symptom” of some broader personal malaise, which manifests itself in inactivity. The solution consists of a range of coaching programs, combined with “behavioral activation” courses, aimed at restoring the unemployed individual’s self-belief and optimism with ruthless efficiency. As one participant in an A4e course reported, they were shouted at by a self-help guru to “talk, breathe, eat, shit belief in yourself” and told that “you are the product—you either believe it or you don’t.”
To close down every route for the avoidance of work, doctors have had to be conscripted into this policy agenda too. A U.K. government report published in 2008 complained that “the fallacy persists that illness is incompatible with being at work,” a fallacy that doctors were guilty of peddling. A government campaign was launched to disabuse doctors of this notion, and their official “sick notes” (which were once signed by doctors to declare that an individual shouldn’t work) were replaced by “fit notes,” requiring doctors to describe the remaining ways in which an individual could still be employed, despite any illnesses or disabilities. Doctors were encouraged to sign a draft statement scripted by the state, agreeing that work is good for people.
At the opposite end of the labor market, things look a lot sunnier, but somehow no less brutal. While Atos, A4e, and Ingeus grapple with the apparent sluggishness and pessimism of the poor, high-end wellness consultants make large sums of money by teaching corporate elites how to maintain themselves in a state of optimal psychosomatic fitness. Classes such as Jim Loehr’s “Corporate Athlete Course” ($4,900 for two and a half days) introduce executives to elite “energy investment” strategies, which will enable them to achieve a high performance level of physical and mental wellness. The American productivity guru Tim Ferriss sells advice on how senior managers should best employ their own brains over the course of the working day, following an earlier career selling dubious brain-enhancing nutritional supplements.
This consultancy circuit moves seamlessly among various apparently separate domains of expertise. The psychology of motivation blends into the physiology of health, drawing occasionally on insights from sports coaches and nutritionists, to which is added a cocktail of neuroscientific rumors and Buddhist meditation practices. Various notions of fitness, happiness, positivity, and success bleed into one another, with little explanation of how or why. The idea which accompanies all of this is that there is one ideal form of human existence: hardworking, happy, healthy and, above all, rich. A science of elite perfectibility is built on the back of this heroic capitalist vision. The flip side of this, and the real driving force behind many executive wellness programs, is a set of well-researched risks run by highly competitive businessmen, colloquially known as “burn-out,” which includes higher chances of heart attacks, strokes, and nervous breakdowns.
Of course, the majority of adults living in capitalist societies lie somewhere between the purview of Atos et al. and that of the executive wellness gurus. Is there no scope for a less individualized vision of well-being across the middle swath of the labor market? Possibly there is. But here, too, the injunctions offered to managers worrying about worker disengagement and its impact upon productivity are brutally competitive.
One of America’s leading workplace happiness gurus, entrepreneur Tony Hsieh, argues that the most successful businesses are those which deliberately and strategically nurture happiness throughout their organizations. Businesses should employ chief happiness officers to ensure that nobody escapes workplace happiness. But if this sounds like the recipe for inclusive community, it isn’t. Hsieh advises businesses to identify the 10 percent of employees who are least enthusiastic towards the happiness agenda, and then lay them off. Once this is done, the remaining 90 percent will apparently become “super-engaged,” a finding which is open to more than one psychological interpretation.
As the science of happiness has moved closer to the front line of profit-maximizing business, something curious has happened to it. For Bentham, happiness was something which resulted from certain activities and choices. But in the context of business consultancy and individual coaching, happiness looks altogether different. Suddenly, it is represented as an input to certain strategies and projects, a resource to be drawn upon, which will yield more money in return. The premise that money yields a proportionate quantity of happiness is spun on its head, suggesting instead that a quantity of happiness will yield a certain amount of money.
One of a new generation of positive psychology management gurus, Shawn Achor, outlines a range of data in his book, The Happiness Advantage, suggesting that happier people achieve more in their careers. They get promoted more, sell more (if they work in marketing) and enjoy better health. Happiness becomes a form of capital on which they can fall back amidst the turbulence of an uncertain economy. It is, as the title of his book suggests, a source of advantage in the battle to succeed. If this was the limit of his wisdom, Achor might sound like a fatalist: Optimists are just luckier in all regards than pessimists.
The crucial supplement to the data is that we are all, supposedly, capable of influencing our own happiness levels. Happiness, Achor tells us, is a choice. We can either choose to be happy (and consequently successful) or choose to be unhappy (and suffer the consequences). Neuroscientist Paul Zak, another leading speaker and consultant on these issues, suggests that we view our happiness like a “muscle,” which needs exercising regularly in order to keep it in full working order, for when we need it. Lurking within this highly individualized agenda is the capacity to blame people for their own misery and failure, both of which are matters that they have evidently failed to act upon adequately.
What does “happiness” even mean, once it is being conceived of in this way? It seems to imply a source of energy and resilience, but always directed towards goals other than being happy, such as status, power, employment, and money. In the face of workplace ennui and psychological stagnation, the motivational gurus simply demand more willpower. By this account, the activities that might result in happiness, such as socializing or relaxing, are only valuable to the extent that they might restore brain and body to a level of fitness, from which they can then be propelled forwards to the next business challenge. This particular version of utilitarianism means expanding corporate rationality further into everyday life, such that there is now even an “optimal” way of taking a break from work, and simply going for a walk can be viewed as a calculated act of productivity management. What is going on? The misery of working people is a serious political issue. How did it become captured in this way?
* * *
In 1928, a researcher from Harvard Business School sat down with a young woman working in a telephone production plant in Cicero, Illinois, and asked her an unusual question: “If given three wishes, what would they be?” The woman paused to reflect before listing her answers. “Health, to take a trip home at Christmas time, and to take a wedding trip to Norway next spring.”
The reason the question was unusual was that the researcher was not, ultimately, interested in the woman’s life or wish fulfillment. He was interested in her productivity. Enthusiasm for the theories of Frederick Winslow Taylor, ostensibly the world’s first management consultant, had waned considerably since its heyday in the years prior to World War I, but Taylor’s basic scientific ambitions were still largely unquestioned among management theorists. Harvard Business School had established a Fatigue Laboratory only in 1927, containing rooms of various temperatures and state-of-the-art instruments to study the reactions of the human body to different types of work and recuperation. In an economy still dominated by manufacturing and physical labor, physiology and infrastructure seemed to hold the key to unleashing better workplace performance. Managers did not consider the Christmas or travel plans of their employees to be any of their business.
The man asking the questions in that telephone production plant was Elton Mayo, an Australian polymath of somewhat dubious scholarly provenance. He had dabbled in philosophy, medicine, and psychoanalysis, and was seduced by many of the doom-laden cultural critiques published in the years following World War I, such as Oswald Spengler’s Decline of the West. Mayo was convinced that civilization was heading for a fall, and that industrial conflict would be its trigger. Trade unions and socialists were thus a threat, not only to management and capital, but to world peace.
In some of Mayo’s more outlandish theories, socialism was a symptom of physical fatigue and psychiatric illness. “To any working psychologist,” he asserted, “It is at once evident that the general theories of Socialism, Guild Socialism, Anarchism and the like are very largely the phantasy constructions of the neurotic.” He believed that the only solution lay in corporations coming to provide forms of psychoanalytic therapy to their employees, which would soothe them, bringing them closer into the arms of their employers. Employees who resisted the authority of their managers were in need of treatment.
Mayo's psychosomatic theories assumed that psychiatric problems in the workplace would show up not only in low productivity and industrial unrest, but also in high blood pressure. Between 1923 and 1925, he toured manufacturing plants in the Boston area in the company of a nurse and a blood-pressure gauge, attempting to prove this link between the mental, the economic, and the physical, which he was convinced existed quite regardless of the evidence.
The psychological study of work was an emerging field during the 1920s, led by some of the same scholars who had previously pioneered the psychological study of advertising a few years earlier. But Mayo had some much more far-reaching theories regarding the ways in which the insights of psychology might fundamentally reform and rescue capitalism. By focusing on the entire person in the workplace, including all of their personal concerns and mental well-being, work might provide the laborer with their deepest source of meaning, and offset the risk of industrial upheaval once and for all. In 1926, Mayo was hired by the Harvard Business School.
The research in Cicero, Illinois, known as the Hawthorne Studies, after the name of the manufacturing plant where they were carried out, quickly became a landmark of management science. Mayo was one of the founders of the Fatigue Laboratory, but the impact of his work was to divert attention away from the working body and towards the mental happiness of employees. According to the mythology that now surrounds the Hawthorne Studies, Mayo’s main discovery was accidental. The working women who were chosen to be observed and interviewed were taken off the regular shop floor and into a test room, where they were able to relax and interact in a more informal and convivial atmosphere. This seemed to correlate with improved performance, and Mayo had an inkling of why: The study itself, including the interview process, was what resulted in the productivity increases, because the women had developed a higher sense of group identity with one another. Their enthusiasm for work had grown, as their ability to form relationships with one another increased. The general phenomenon, whereby research subjects respond to being studied, is now known as the “Hawthorne Effect” for this reason.
The lesson that Mayo drew from his repeated visits to the Hawthorne plant was that managers had to learn how to talk to their employees if they wanted to extract greater productivity from them. An unhappy worker was also an unproductive worker, and the unhappiness stemmed from a deep-seated feeling of isolation. Managers also had to understand the unique psychological properties of social groups, which were not simply reducible to individual incentives, as Taylorism and neo-classical economics had supposed. A thriving and collaborative group identity could do far more for an employee’s happiness, and hence for the manager’s bottom line, than a pay rise.
There is some reason to doubt whether Mayo was really reporting on data acquired at Hawthorne or simply repackaging theories that he had long held about the future of capitalism. In fact, the women’s productivity gains coincided with a pay increase in 1929, but Mayo was absent at the time and chose to ignore this in his analysis. Regardless of the scientific validity of his work, however, Mayo’s impact on management thinking was profound and long-lasting. Whenever we now hear that managers must focus on the “whole person,” and not just the “employee,” or that employee happiness is critical to the bottom line, or that we must “love what we do” or bring an “authentic” version of ourselves to work, we are witnessing Mayo’s influence. When managers strive for more laughter in the workplace, as some consultants now insist they must, or seek to transform its smell so as to optimize our subjective feelings, they are practicing what Mayo first preached.
* * *
Within the longer history of happiness expertise, what is interesting about Mayo’s intervention is that he downplayed the more obvious material ways of tweaking the pleasures and pains of the mind. Neither money nor the physical body were deemed adequate for understanding or influencing levels of happiness, once the workplace came to be understood in terms of group psychology. Instead, talking to workers and facilitating their relations with one another became the main instruments for gauging and improving their happiness. Management, which originated as a technique for controlling slaves on plantations, and developed as a means of running heavy industrial corporations, had become a “soft” social and psychological skill.
While Mayo did not conceive of things in quite this way, this was a form of psychosomatic intervention, like a placebo. The aim of management in the 1930s was still to increase the output of physical goods. But now, rather than focusing on the physical and physiological work process, managers would focus on the social and psychological elements, in the expectation that this would yield behavioral, physical, and economic improvements.
The term “psychotherapy” today refers to a range of treatments, from long-term psychoanalytic relationships to quick fixes such as Cognitive Behavioral Therapy (CBT) that are more akin to training or coaching. But the first known uses of the term referred to the “talking cures” offered by medical doctors in the late 19th century, who came to recognize that their patients often responded as much to the manner in which they were spoken to as they did to the medicinal treatment they received.
What Mayo was recommending was the industrial parallel to this. An open, conversational relationship could be conducted in such a way as to bring about a change in the worker’s mentality and a consequent change in their physical performance. Speech was instrumentalized, to make people feel better and, as a result, behave better. Once dialogue and co-operation become viewed as an essential element of economic production, one sees the glimmer of a more transformative economic democracy.
Yet the analogy to psychosomatic medical treatments would gradually become more telling as the post-war period progressed, for a couple of coincidental reasons. Firstly, the nature of work in the West became progressively less physical over the second half of the 20th century. By the 1980s, an employee’s customer care, service ethic, and enthusiasm were not simply mental resources that existed to help churn out more products: They were the product. The importance of employee happiness and psychological engagement becomes all the greater once corporations are in the business of selling ideas, experiences, and services. Businesses speak of “intangible assets” and “human capital” in the hope of capturing this amorphous workplace ethos, but in practice it resembles neither an asset nor capital. Some other way of conceiving of work is required.
Secondly, the concept of health started to undergo some profound changes. In 1948, the newly founded World Health Organization redefined health as “a state of complete physical, mental, and social well-being”—an almost utopian proposition that few of us ever attain for very long. Intangible aspects of health and illness came to the fore. This is partly a symptom of medical progress: As medicine becomes better at preventing death, so attention turns to the question of how well it is able to support life.
What does any of this have to do with management or work? The problem confronting managers and policymakers over the second half of the 20th century was that everything seemed to be evaporating into thin air at the same time. Work was becoming intangible as manufacturing went into decline. Illness was becoming intangible as mental and behavioral problems increased. Money itself was becoming intangible as the financial system globalized from the late 1960s onwards. Problems of activity and enthusiasm moved elusively between the domains of medicine, psychiatry, workplace management, and economics. The challenges of health care and those of business were becoming harder to disentangle, with the issue of mental health at the interface between the two. The job of management increasingly came to resemble psychotherapy in that original sense of “talking cure,” of propping up the well-being of individuals, in order to keep their enthusiasm for service-based jobs as high as possible.
And as the nature of work and management changes, so too does the nature of resistance. Opposition to management typically takes a form other than that preferred by the manager himself. The classical mode of opposition is for the worker to speak back or strike via a trade union. The manager, having ignored the feelings or desires of the worker, is told that they cannot do so any longer. As Mayo’s style of therapeutic management expanded over the post-war period, opposition to it began to take the opposite form. Gradually, as post-industrial workers were encouraged to be “themselves,” speaking openly and honestly to their manager, the sole remaining form of opposition was to return to the physical body once more. The only escape from a manager who wants to be your friend is to become physically ill. With the list of available diagnoses growing, and complete health becoming idealized, sickness became one of the dominant ways in which refusal to work manifested itself, especially from the 1970s onwards. Evidently, management could not focus solely on relationships and subjective feelings, any more than it could focus solely on the productive body. What it needed, if it was to ensnare employees thoroughly, was a truly psychosomatic science that could treat the mind and the body as an integrated part of a single system to be optimized. This brings us to a final character in the story of psychosomatic management.
* * *
The history of utilitarianism is littered with dashed hopes that there might be a single measure of human optimization which could serve as the instrument through which all public and private decisions might be taken. This ideal rests on the hope that the ambiguity and plurality of human culture might be overcome through knowledge of a single quantifiable entity. Whether it is via the idea of utility, energy, value, or emotion, the project of monism always involves this form of simplification.
By the mid-20th century, just such a monistic theory of general wellness was emerging. Until the 1940s, the term “stress” was used principally in reference to metals and was virtually unknown outside the worlds of engineering and physics. An iron bar becomes stressed when it is unable to cope with the demands that are placed on it. Hans Selye, an Austrian doctor, recognized that what engineers saw as “wear and tear” in, say, a bridge, was the same problem as what he had termed “General Adaptation Syndrome” in the human body. General Adaptation Syndrome was effectively an indicator of the “rate of wear and tear in the body.” In the aftermath of World War II, he re-christened the syndrome as “stress.” By the 1950s, this was a distinctive new field of medical and biological research.
Like Mayo, Selye never saw himself as merely an academic: He was on a mission. According to his holistic understanding of illness, entire societies and cultures could become sick if they lost the capacity to cope with external stimuli and demands. Equally, they could slump into passive inactivity if they were never stimulated sufficiently. As he grew older, Selye developed this idea into something approaching an ethical philosophy, though a frighteningly egocentric one. A healthy society, he argued, is built around “egoistic altruism,” in which every individual sets about doing his utmost to win the adoration of others. This produces a form of natural equilibrium, in which the egotist becomes integral to his own social system.
Beyond Selye’s own biological research and his macho libertarian politics, the non-specific nature of stress represented an opportunity which would eventually permeate the world of management. Stress, as Selye had argued, is simply a particular type of reaction to any excessive demand. This made it equally amenable to psychological or organizational forms of exploration. In fact, without using the term stress, the U.S. military had become aware of the same syndrome during World War II, in the common forms of psychological collapse experienced by soldiers who had spent too long in battle. The stressful demands placed on a human being are not merely physical, but social and psychological too. What went on between the demand and the response was open to a range of different scientific explanations beyond merely biological ones. The study of stress became an expressly interdisciplinary field.
As the study of how humans cope with physical and mental demands, it also lent itself perfectly to the study of work. By definition, stress is something we encounter without having chosen to, but cannot avoid. It often occurs when we are trapped in a certain situation, simply forced to react to it. The field of occupational health emerged during the 1960s to understand precisely how work impacts upon us, physically and mentally. Studying how different types of job demands produce different hormonal and emotional responses yielded a number of potentially transformative findings. It wasn’t simply that excessive demands were bad for people; insufficient workplace demands—or boredom—could also be unhealthy, as Selye had recognized. Our current concern with unemployment as a potential health risk is one manifestation of the latter anxiety.
Just as Mayo’s emphasis on dialogue created an opening for a more thoroughly egalitarian critique of business hierarchy, the study of stress in the workplace achieved something similar for a while. Work carried out by the psychologist Robert Kahn and his colleagues at the University of Michigan during the early 1960s highlighted the various ways in which power structures and work design impact upon the health of employees. Badly designed jobs and lack of proper recognition in the workplace were clear contributors to physical and mental ill-health. Lack of any influence over where and when one carries out a task is a stress factor, which takes its toll on both mind and body. A number of clear routes, between the injustices of hierarchical business and the vulnerabilities of the human body, were becoming apparent. One of the most important of these was the discovery that stress leads to the cortisol hormone being released into the bloodstream, hardening the arteries and increasing the risk of heart attack. Despite the high-profile obsession with executive burn-out, this form of stress is far more common for those lacking power or status at work.
By the 1980s, the non-specific syndrome that Selye had first identified in his lecture hall in 1925 had become one of the most pressing problems confronting managers in the Western world. Workers were now exhibiting a generalized deflation of activity, a form of psychosomatic collapse that we have come to identify with the concept of stress. In the U.K., stress overtook repetitive strain injury in 2012 as the leading cause of absence from work. It is not easily classified as either a physical illness or a mental illness. What prompts it may include work but may equally include other types of social, psychological, or physical demands that the individual simply can’t cope with.
The science of stress was of the utmost importance for managers worrying about the depletion of their workforce. It became one of the main preoccupations of the human resources profession, which sought out rudimentary wisdom on a wide panoply of “bio-psycho-social” complaints. The sheer breadth of contributory factors to stress—some tangible, others intangible—made it extremely difficult to achieve any control over it. This is in addition to the graver psychosomatic risks faced by those in precarious jobs, who move in and out of work, without even managers to support them from one month to the next. One conclusion to draw from this would be, as per the occupational health studies of the 1960s, that the fundamental politics of work had grown dysfunctional and needed wholesale transformation, not simply piecemeal medical treatment. But would this be the lesson that was learnt?
* * *
In 2001, Unilever’s senior management demanded a program to help them manage their own energy levels, fearing the consequences of executive working lifestyles. Being in the industry it was in, Unilever had ample expertise on hand to help design it. The result was the “Lamplighter” health and well-being program, tailor-made to help senior management keep up their performance levels and offset the risk of stress. The business benefits of Lamplighter quickly became clear, with evaluations suggesting that every £1 spent on the program yielded £3.73 in return. It was quickly rolled out across dozens of Unilever offices around the world before being extended to cover the rest of the workforce.
Programs such as Lamplighter are becoming more and more common. Employers use them to identify a wide range of health and well-being risks in the workforce, including employees’ sporting activities and their “mental resilience.” Lamplighter requires Unilever employees to be formally (albeit confidentially) assessed on a range of behaviors relating to nutrition, smoking and drinking, exercise, and personal stress. The state-of-the-art workplace of today has taken on features of the doctor’s surgery, just as the doctor has been required to take on the skills of the motivational manager. So-called “Health 2.0” technologies for the digital monitoring of well-being, such as the iPhone’s Health app, are often indistinguishable from productivity enhancements.
Many best practice employers now offer free gym membership to their most valued staff, and even free counseling. As the physical and the psychological character of work—and of illness—start to blend into each other, notions of health, happiness, and productivity become ever harder to distinguish from each other. Employers end up treating all three things as a single entity, to be maximized via a range of stimuli and instruments. This is the monistic philosophy of the 21st-century manager: Each worker can become better, in body, mind, and output.
The political hope that perhaps the human benefits of dialogue and workplace empowerment might be more thoroughly recognized turns into disappointment, as performance management and health care are fused into a science of well-being optimization. And yet there are radical political economists for whom the de-materialization of contemporary work represents an opportunity for a whole new industrial model. The shift towards a “knowledge-based” economy, in which ideas and relationships are key sources of business value, could be the basis of entirely new workplace structures in which power is decentralized and decisions taken collaboratively.
There are good reasons to suspect that such models might produce fewer psychosomatic stresses; in that sense, they may be more efficient than the status quo. If dialogue in the workplace is a necessary factor for productivity, why not grant it some real influence over how decisions get made, right up to the highest level? Rather than ironic management speak, which twists words to manipulate emotions in the expectation that this will yield greater output, a more honest reflection on the problems of occupational ill-health would question the hoarding of status and reward by a small number of senior managers. Instead, traditional forms of management and hierarchy are rescued by the new ubiquity of digital surveillance, which allows informal behavior and communication to be tracked, analyzed and managed.
Rather than the rise of alternative corporate forms, we are now witnessing the discreet return of the scientific management style, only now with even greater scientific scrutiny of bodies, movement, and performance. The front line in worker performance evaluation has shifted into bodily-monitoring devices, heart-rate monitoring, and sharing of real-time health data, for analysis of stress risks. Strange to say, the notion of what represents a good worker has gone full circle since the 1870s, from the origins of ergonomic fatigue studies, through psychology, psychosomatic medicine and back to the body once more. Perhaps the managerial cult of optimization just needs something tangible to cling onto.
* * *

As Piketty points out, there’s a historic disparity between making money from investments and making money from your day job that greatly favors the former. While the U.S. government will likely engage in political battles to move the needle for day-jobbers through initiatives like minimum wage hikes or tax breaks for the middle class, there’s renewed interest in making the other side of the equation, investing, more accessible to the 99 percent.
That’s where Aspiration, a financial services company that debuted today, comes in. Founded by Democrat wunderkind turned federal fraud prosecutor and business consultant Andrei Cherny, and advised by billionaire internet entrepreneur Jeff Skoll, Aspiration is already making waves with a business model that not only allows clients to choose their own fee (or none at all) for the firm’s services but also donates 10 percent of its revenue to microloans for struggling Americans.
“Millennials are used to wanting to do business with companies that aren't just about profit, but are about a larger purpose as well,” said Cherny. “That is the way they shop, that is the way people exercise their decisions in all other aspects of their lives, and that hasn't been true in investment options.” While he acknowledges that socially responsible mutual funds have existed within major financial institutions for a long time now, he also points out that “nobody has really built a retail financial company with those kinds of values.”
At the heart of Aspiration’s online-based service is its commitment to making complex investment portfolio options more accessible to the masses. The first product, Aspiration’s flagship fund, relies on liquid alternatives. These funds, designed to hedge against the stock market by investing in nontraditional assets using sophisticated trading strategies, are all the rage on Wall Street right now, but, according to CNN Money, the average minimum investment is $6,000. Aspiration’s is $500.
To hear it from Cherny, Wall Street never made these types of tools available to the middle-class investor because it simply didn’t have to. “It's easier to find one person with $100 million than a lot of people with $1,000 or $10,000, so just from a business standpoint, they haven't had the need to do so,” he said.
But other investors and finance professionals consulted on this matter said there are also questions of risk and regulation to consider. While Aspiration’s flagship fund was created to be less volatile than the stock market on average, the Securities and Exchange Commission considers liquid alternatives to be part of its “heightened risk” examination priorities this year. Hedge funds, closely related to liquid alternatives, may seem reserved for the elite as a matter of smart business strategy, but they are actually required by law to do business only with “accredited investors”—meaning a minimum net worth of $1 million or a minimum annual income of $200,000 for each of the past two years. This regulation came about after the Depression not so much as a way to protect the wealthy from sharing the secrets to their financial success, but to keep regular investors, the type Aspiration hopes to attract, from losing their shirts.
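The individual accredited-investor test described above boils down to a simple either/or check. As a rough sketch (the function name and inputs are hypothetical; the thresholds are the ones stated above):

```python
def is_accredited(net_worth: float, incomes_last_two_years: list[float]) -> bool:
    """Rough individual accredited-investor test: net worth of at least
    $1 million, OR annual income of at least $200,000 in each of the
    past two years."""
    if net_worth >= 1_000_000:
        return True
    return len(incomes_last_two_years) >= 2 and all(
        income >= 200_000 for income in incomes_last_two_years
    )

# A typical middle-class saver fails both prongs of the test:
print(is_accredited(50_000, [60_000, 62_000]))  # False
```

By either prong, the everyday investors Aspiration is courting are shut out of hedge funds, which is why a liquid-alternatives fund with a $500 minimum is notable.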
Aspiration addresses this risk by capping the initial investment for their debut fund at $100,000. But the company also aims to empower investors to protect themselves via an educational component on the site.
“So much of what you see from the financial industry is charts and graphs and white papers,” said Cherny. “For some people, that’s great and that’s how they learn, but, for a lot of people, that is not only not how they learn but [it] is counterproductive and feels like math homework. People run in terror.” Perhaps that’s why a 2012 SEC report found that “studies show consistently that American investors lack basic financial literacy,” and that the problem is even worse among women and minorities, two groups also susceptible to poverty at greater rates. Another study showed that most Americans turn to their family for financial knowledge, giving those who come from strong investing backgrounds a significant leg-up when they decide to enter the market.
Aspiration aims not only to provide that necessary financial education but to make it more open and engaging. While most investment firms offer dry reports and often perplexing diagrams tracking an investment’s progress, Aspiration explains the basic premise behind its investment strategy in cute animated videos and Choose Your Own Adventure-type games. While this approach may induce groaning in seasoned investors, the vast majority of Americans who don’t understand basic investing principles (like compound interest and inflation) might rejoice.
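Those two principles interact in a way worth spelling out: money compounds at the nominal return, but inflation compounds against it at the same time. A minimal sketch (the 6% return and 2% inflation figures are illustrative assumptions, not Aspiration’s numbers):

```python
def real_value(principal: float, annual_return: float,
               inflation: float, years: int) -> float:
    """Grow principal at a compound annual return, then discount by
    inflation to express the result in today's dollars."""
    nominal = principal * (1 + annual_return) ** years
    return nominal / (1 + inflation) ** years

# $500 (Aspiration's stated minimum) at a hypothetical 6% return
# with 2% inflation, held for 20 years:
print(round(real_value(500, 0.06, 0.02, 20), 2))
```

The point the educational videos are trying to make survives the arithmetic: even a modest sum roughly doubles in purchasing power over two decades, while the same $500 left uninvested quietly loses ground to inflation.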
“I've been trading from a pretty young age, in college,” said Aspiration’s vice president of product, Matthew Lee. “Seeing both friends and family around me, a lot of them are just financially illiterate. They have the desire and motivation to learn, but it was difficult for them because the financial language is cloaked in a lot of acronyms and hard-to-understand terms.” Aspiration seeks to create a friendlier approach to financial literacy based on narrative storytelling and gamification as a way to recruit a customer base from everyday Americans put off by traditional financial literature.
Cherny said that the educational components are “a big part of what we do and will always be a through-line in our approach.” To ensure their videos and games make sense and are engaging, Aspiration relies on extensive user feedback, including that of their lead designer Jess Brown, who is the first to admit that he has little background in the financial world. “For me, it was teaching myself through this material as well and feeling that I have a grasp enough to make these types of investments for myself,” he said of his design process. He helped come up with analogies like thinking of a diverse investment portfolio as a well-balanced diet. Now, Lee claims that even Brown’s young children can sit through the animated cartoons available on the site.
While explanations of financial mechanisms simple enough for a child to understand have no bearing on a fund’s returns, they may help spark what Aspiration calls “a revolution” in wary, underserved potential investors, like young professionals, people from economically disadvantaged backgrounds, and women. It also serves Aspiration’s mission to create trust among its clients. “We were really trying to make it feel that this is not just the latest startup coming out of Silicon Valley,” said Brown. “These are real people that are really conscious of wanting to help better everybody's financial endeavors.”
For Cherny, a policy wonk at heart, this all goes back to creating a solution for the economic inequality so startlingly defined by Piketty. “What I didn't necessarily like about…what Piketty was arguing was that because of this mismatch of some people making their money off of investments and others off of labor, that's always going to be the case. What we're saying is that if you take that as being true, as opposed to being pessimistic and saying there's nothing we can do about it, the answer is give more investment opportunities to middle-class people so they can enjoy that rise up the income ladder, and wealth ladder as well.”
As you probably know, Google tracks quite a bit of what you do online. Much of this is to improve your search results or to assist with Google Now. Google's updated Account History page now allows you to "pause" or delete Google's historical information about you.