The media cover it, but John Stossel says they shouldn't. It's a scam.
Subscribe to our YouTube channel: http://youtube.com/reasontv
Like us on Facebook: https://www.facebook.com/Reason.Magazine/
Follow us on Twitter: https://twitter.com/reason
Subscribe to our podcast at iTunes: https://goo.gl/az3a7a
Reason is the planet's leading source of news, politics, and culture from a libertarian perspective. Go to reason.com for a point of view you won't get from legacy media and old left-right opinion magazines.
The Southern Poverty Law Center lists Ayaan Hirsi Ali — who grew up Muslim in Somalia and suffered female genital mutilation — as an "anti-Muslim extremist," simply because she now speaks out against radical Islam.
They also list the conservative Family Research Council as a "hate group."
That listing led a man to go to the Council's office to try to gun down their workers. The shooter later told law enforcement that he picked the group because he saw they were on the Southern Poverty Law Center's hate map and he wanted to fight bigots.
Stossel disagrees with the Family Research Council on many issues. But he says they don't deserve to be called haters. The group's Executive Vice President, Jerry Boykin, tells him: "I don't hate gay people, and I know gay people, and I have worked with gay people."
Another group that the Southern Poverty Law Center smears is the Ruth Institute. The group argues that gays shouldn't have the same rights to adopt. But does that make them haters? No, says founder Jennifer Morse: "I have no problem with gay people. That's not the issue."
Other reporters, such as Megan McArdle at Bloomberg, have also pointed out that the group is an odd fit for a "hate" list.
There are many non-hateful groups on the Southern Poverty Law Center's hate list. But Antifa, which clearly is a hate group, is not on the list.
The Southern Poverty Law Center wouldn't talk to Stossel about their listings. Stossel says screaming "hate!" brings in money.
Morris Dees, the Center's founder, pays himself nearly half a million dollars a year. Although Dees once promised that when the Center's endowment reached $50 million he'd stop fundraising, he didn't stop. Now the Center has $320 million stashed away, much of it in the Cayman Islands. It's all in their tax returns.
Stossel notes that they still con people into giving them even more money. Apple gave them $1 million last year.
He says the Southern Poverty Law Center has become a hate group itself. It is now a left-wing, money-grabbing slander machine.
Produced by Maxim Lott. Edited by Joshua Swain.
With the idea of class so central to Marxian theory, libertarians might be tempted to ignore class as a category. But there is in fact such a thing as libertarian class theory, because in libertarian theory there are distinct groups of exploiters and exploited. Gary Chartier joins me to discuss the history and development of libertarian class theory.
About the Guest
Gary Chartier is Associate Dean and Distinguished Professor of Law and Business Ethics at the Tom and Vi Zapara School of Business at La Sierra University, and holds his PhD and LLD from the University of Cambridge.
Skillshare is an online learning community with over 16,000 classes in design, business, and more. Get three months of Skillshare for 99 cents by going to skillshare.com/woods99.
Social Class and State Power: Exploring an Alternative Radical Tradition, eds. David M. Hart, Gary Chartier, Ross Kenyon, and Roderick T. Long
In Restraint of Trade: The Business Campaign Against Competition, 1918-1938, by Butler Shaffer
Other Books by the Guest
Ep. 90 Was Marx Right? (Peter Klein)
Ep. 387 What Rights Are, Where They Come From, and Whether Animals Have Them
Ep. 292 Anarchy and the Law (Gary Chartier)
Ep. 148 Libertarian: Thick or Thin? (Gary Chartier)
Ep. 129 Righting Rawls (Gary Chartier)
Ep. 25 Do We Need the State? (Gary Chartier)
2) Free publicity for your blog. As a special thanks if you get your hosting through one of my affiliate links (this one for Bluehost, or this one for WP Engine), I’ll boost your blog. Click here for details.
3) Free History Course: The U.S. Presidents — Politically Incorrect Edition. Get access to this 22-lesson course: 22 videos, 22 mp3 files for listening on the go, and a bibliography of reliable books on the presidents. Get it at FreeHistoryCourse.com.
4) $160 in Free Bonuses. Free signed copy of my New York Times bestseller The Politically Incorrect Guide to American History, plus a free 10-lesson bonus course on the foundations of liberty, plus a free year’s subscription to LibertyClassroom.com, when you subscribe to the Ron Paul Curriculum site through RonPaulHomeschool.com.
5) Free Books. Boost your intellectual ammunition with my free libertarian eBooks, including 14 Hard Questions for Libertarians — Answered, Bernie Sanders Is Wrong, and Education Without the State. Find them at TomsFreeBooks.com.
Audio by Chris Williams Audio.
This post will stay up another day. -PG
Takes A Hit In 2017
During 2017, 485 scientific papers have been published that cast doubt on the position that anthropogenic CO2 emissions function as the climate’s fundamental control knob…or that otherwise question the efficacy of climate models or the related “consensus” positions commonly endorsed by policymakers and mainstream media.
These 485 new papers affirm the position that there are significant limitations and uncertainties inherent in our understanding of climate and climate changes, emphasizing that climate science is not settled.
More specifically, the papers in this compilation support these four main skeptical positions — categorized here as N(1) – N(4) — which question climate alarm.
N(1) Natural mechanisms play well more than the negligible role claimed by the IPCC in the net changes of the climate system (which include temperature variations, precipitation patterns, weather events, etc.), and the influence of increased CO2 concentrations on climatic change is less pronounced than currently imagined.
N(2) The warming/sea levels/glacier and sea ice retreat/hurricane and drought intensities…experienced during the modern era are neither unprecedented nor remarkable, nor do they fall outside the range of natural variability, as clearly shown in the first 150 graphs (from 2017) on this list.
N(3) The computer climate models are not reliable or consistently accurate, and projections of future climate states are little more than speculation as the uncertainty and error ranges are enormous in a non-linear climate system.
N(4) Current emissions-mitigation policies, especially related to the advocacy for renewables, are often ineffective and even harmful to the environment, whereas elevated CO2 and a warmer climate provide unheralded benefits to the biosphere (i.e., a greener planet and enhanced crop yields).
In sharp contrast to the above, the corresponding “consensus” positions that these papers do not support are:
A(1) Close to or over 100% (110%) of the warming since 1950 has been caused by increases in anthropogenic CO2 emissions, leaving natural attribution at something close to 0%.
RealClimate.org: “The best estimate of the warming due to anthropogenic forcings (ANT) is the orange bar (noting the 1𝛔 uncertainties). Reading off the graph, it is 0.7±0.2ºC (5-95%) with the observed warming 0.65±0.06 (5-95%). The attribution then follows as having a mean of ~110%, with a 5-95% range of 80–130%. This easily justifies the IPCC claims of having a mean near 100%, and a very low likelihood of the attribution being less than 50% (p < 0.0001!).”
A(2) Modern warming, glacier and sea ice recession, sea level rise, drought and hurricane intensities…are all occurring at unprecedentedly high and rapid rates, and the effects are globally synchronous (not just regional)…and thus dangerous consequences to the global biosphere and human civilizations loom in the near future as a consequence of anthropogenic influences.
A(3) The climate models are reliable and accurate, and the scientific understanding of the effects of both natural forcing factors (solar activity, clouds, water vapor, etc.) and CO2 concentration changes on climate is "settled enough," which means that "the time for debate has ended."
A(4) The proposed solutions to mitigate the dangerous consequences described in A(2) – namely, wind and solar expansion – are safe, effective, and environmentally friendly.
To reiterate, the 485 papers compiled in 2017 support the N(1)-N(4) positions and undermine, or at least do not support, the "consensus" A(1)-A(4) positions. The papers do not do more than that. Expecting these papers to do more than support skeptical positions and undermine "consensus" positions in order to "count" is unreasonable in this context.
Below are the two links to the list of 485 papers as well as the guideline for the lists’ categorization.
(Parts 1 and 2 are on the same page).
Solar Influence On Climate (121)
ENSO, NAO, AMO, PDO Climate Influence (44)
Modern Climate In Phase With Natural Variability (13)
Cloud/Aerosol Climate Influence (9)
Volcanic/Tectonic Climate Influence (6)
The CO2 Greenhouse Effect – Climate Driver? (14)
Climate Model Unreliability/Biases/Errors and the Pause (28)
Failing Renewable Energy, Climate Policies (12)
Wind Power Harming The Environment, Biosphere (8)
Elevated CO2 Greens Planet, Produces Higher Crop Yields (13)
Warming Beneficial, Does Not Harm Humans, Wildlife (5)
Warming, Acidification Not Harming Oceanic Biosphere (17)
Decreases In Extreme, Unstable Weather With Warming (3)
Urban Heat Island: Raising Surface Temperatures Artificially (5)
No Increasing Trends In Intense Hurricanes (4)
No Increasing Trends In Drought/Flood Frequency, Severity (3)
Natural CO2, Methane Sources Out-Emit Human Source (4)
Increasing Snow Cover Since The 1950s (2)
Lack Of Anthropogenic/CO2 Signal In Sea Level Rise (38)
No Net Warming During 20th (21st) Century (12)
A Warmer Past: Non-Hockey Stick Reconstructions (60)
Abrupt, Degrees-Per-Decade Natural Global Warming (7)
A Model-Defying Cryosphere, Polar Ice (32)
Antarctic Ice Melting In High Geothermal Heat Flux Areas (4)
Recent Cooling In The North Atlantic, Southern Ocean (10)
New Paper Spurns Anthropogenic CO2 Warming,
Unveils Natural Explanation For Climate Change
University of California (Santa Cruz) Professor W. Jackson Davis (Ph.D.), President of the Environmental Studies Institute, has published a new paper with colleagues in the journal Climate that thoroughly undermines the conceptualization of a dominant role for anthropogenic CO2 in the global warming since 1850.
Davis points out that CO2 and global temperature have been “decoupled” throughout much of geological history, and that the amplification of CO2 concentrations yields increasingly smaller radiative effects, meaning that the higher the CO2 concentration rises, the weaker its influence.
He even suggests that the anthropogenic global warming (AGW) hypothesis (it has not reached theoretical status) has been popularized because there are reputed to be no convincing alternative explanations.
But Davis and two other University of California (SC) scientists have proposed a newly-termed alternative explanation for the 0.8°C global temperature change since 1850. The Antarctic Centennial Oscillation (ACO) has been identified as varying in sync with solar cycles (orbital), and correlates with glacial-interglacial transitions, the 1,500-year abrupt, global-scale temperature changes (Dansgaard-Oeschger cycles), and, as the name suggests, century-scale fluctuations in global temperature.
Consequently, as the authors conclude, properties of the ACO “can explain the current global warming signal”.
[T]he contemporary global warming increase of ~0.8 °C recorded since 1850 has been attributed widely to anthropogenic emissions of carbon dioxide (CO2) into the atmosphere. Recent research has shown, however, that the concentration of CO2 in the atmosphere has been decoupled from global temperature for the last 425 million years [Davis, 2017] owing to well-established diminishing returns in marginal radiative forcing (ΔRF) as atmospheric CO2 concentration increases. Marginal forcing of temperature from increasing CO2 emissions declined by half from 1850 to 1980, and by nearly two-thirds from 1850 to 1999 [Davis, 2017]. Changes in atmospheric CO2 therefore affect global temperature weakly at most.
The anthropogenic global warming (AGW) hypothesis has been embraced partly because “…there is no convincing alternative explanation…” [USGCRP, 2017] (p. 12).
The ACO provides a possible alternative explanation in the form of a natural climate cycle that arises in Antarctica, propagates northward to influence global temperature, and peaks on a predictable centennial timetable.
We report a previously-unexplored natural temperature cycle recorded in ice cores from Antarctica—the Antarctic Centennial Oscillation (ACO)—that has oscillated for at least the last 226 millennia. Here we document the properties of the ACO and provide an initial assessment of its role in global climate. We analyzed open-source databases of stable isotopes of oxygen and hydrogen as proxies for paleo-temperatures. We find that centennial-scale spectral peaks from temperature-proxy records at Vostok over the last 10,000 years occur at the same frequencies (±2.4%) in three other paleoclimate records from drill sites distributed widely across the East Antarctic Plateau (EAP), and >98% of individual ACOs evaluated at Vostok match 1:1 with homologous cycles at the other three EAP drill sites and conversely.
The period and amplitude of ACOs oscillate in phase with glacial cycles and related surface insolation associated with planetary orbital forces. We conclude that the ACO: encompasses at least the EAP; is the proximate source of D-O oscillations in the Northern Hemisphere; therefore affects global temperature; propagates with increased velocity as temperature increases; doubled in intensity over geologic time; is modulated by global temperature variations associated with planetary orbital cycles; and is the probable paleoclimate precursor of the contemporary Antarctic Oscillation (AAO). Properties of the ACO/AAO are capable of explaining the current global warming signal.
Radiohead is reportedly suing Lana Del Rey because her song "Get Free" sounds a lot like Radiohead's old hit "Creep." As Reason's Ed Krayewski noted earlier this week, Radiohead itself was successfully sued a while back because "Creep" sounds a lot like the Hollies' "The Air I Breathe." I don't think anyone has ever sued the Hollies for "The Air I Breathe," though I wouldn't be surprised to learn that there's a Roy Orbison song somewhere that sounds almost exactly like it.
The courts have been getting more strict about this kind of thing recently, with Marvin Gaye's estate winning a case against Pharrell Williams and Robin Thicke over a song that copied the "feel" but not the actual notes of Gaye's "Got to Give It Up." In a better world, the law would be growing more tolerant of this sort of imitation, not more restrictive. Much of the evolution of music is driven by people making tiny tweaks as they slavishly copy each other. A pop world without any plagiarism would be barren indeed.
This is most obvious when it comes to musical patterns that have been around too long for anyone to hold a copyright on them. (If someone actually owned the I-IV-V blues progression, he could buy Bill Gates with enough left over to make a down payment on the Moon.) When Jay Miller wrote and Kitty Wells recorded "It Wasn't God Who Made Honky Tonk Angels," for example, they didn't hide the fact that they were using the exact same melody as Hank Thompson's hit "The Wild Side of Life"; their song, after all, was a direct response to and comment on Thompson's record. But the tune was a lot older than "The Wild Side of Life"—that same series of notes had also been used in "Great Speckled Bird," "I'm Thinking Tonight of My Blue Eyes," and other old country songs. Indeed, the melody goes back to England. A lawsuit would have disappeared into a never-ending search for the original composer.
But not every pilfered melody comes from the public domain. Listen to "Express Yourself," a top 5 R&B hit for Charles Wright and the Watts 103rd Street Rhythm Band in 1970. Then listen to Jean Knight singing "Mr. Big Stuff," the #1 soul single of 1971. Seriously: Click the links and listen. They're the same goddamn song. "Mr. Big Stuff" came out less than a year after "Express Yourself," with a completely different set of songwriters credited; and yet nobody sued. Who knows: Maybe there's some older ur-funk record that Knight and Wright were both swiping. If so, Knight kept on swiping it: She recorded several barely-masked rewrites of "Mr. Big Stuff," because why mess with success?
Song-clones like that aren't especially unusual, and a good DJ can spend hours seguing from one of them to another. But I'll give you just one more example—probably my favorite one. Here's Lyn Collins, a protégé of James Brown, singing a song called "Me and My Baby":
You know that old South Park joke that if you want to write a Christian pop song, you should just take a love song and change every "baby" to "Jesus"? Here's country star Tom T. Hall seeming to prove the point:
He doesn't quite prove the point, because I think he wrote his song first. Both records came out in 1972, but "Me and Jesus" grazed the bottom of the Billboard charts in May; "Me and My Baby" didn't show up in Billboard til the fall. So this is probably a case of someone changing "Jesus" to "baby," not the other way around. The intro to Collins' record certainly sounds like she had church on her mind.
But I'm not going to complain about Collins ripping off Hall. You know why? Because Collins' song is better. Tom T. Hall is one of my favorite songwriters, but "Me and Jesus" isn't one of his better efforts. The lyrics are kind of rote, and the melody is gospel-by-numbers. (An exercise for the reader: Extend the chain of ripoffs backward by finding some older gospel records that sound like Hall's tune.) If Collins stole his song, she made it sharper and funkier in the process.
Maybe Collins and company should've given Hall a songwriting credit. Maybe not. I'm just glad they had the freedom to borrow it and improve it. I hate the thought that a fear of Grammy-chasing attorneys might dissuade today's pop artisans from doing the same.
(For past editions of the Friday A/V Club, go here.)
Over the past years, we’ve spent a lot of time researching topics relevant to Golem development. That effort was documented on the wiki in our GitHub repository, but it began as a collection of private notes in May 2016, when the project was not yet open source. Since it was not initially written for the public, it can be a bit messy at times: some of what’s there may be outdated or incomplete, and some threads were abandoned halfway once it became clear they were dead ends.
And most importantly: not everything there is relevant to Golem. As often happens in research, some paths were blind alleys, and sometimes what we tried turned out to be of no use for Golem.
Regardless of the above, the repository documents knowledge we acquired and described, and we now feel it is right to share it with the community. We are opening it to leave a mark of 2017 and to show that the work is recorded not only in the developer repository. Being able to browse it should be useful for anyone who wants to deepen their knowledge of the problems of decentralized technologies.
You can find the R&D repository here.
We are also opening the use-case repository, which contains some early proofs of concept for potential future Golem integrations and use cases. You can find this repository here.
TL;DR: We are working to address the verification problem on the Golem network. We will do this by checking small parts of computed tasks for accuracy. We will publish more of our research results over time.
Ok, so here’s the thing: one of the main issues with building apps on Golem is the verification problem. Since we don’t know what the answer should be for our task (otherwise, we wouldn’t need Golem to perform it…), there is no simple method of verifying if the answer provided by the network is correct. It wouldn’t be a problem in a perfect world, where everyone is honest — but here, it is quite a big one.
It’s not only that there could be someone who wants to spoil the game and cheat just for fun. In our Golem world, there is an explicit economic incentive to work less (possibly doing more tasks in the same amount of time). But, because we have the Golem world under our control, we can change the rules of the game to suit us better.
Let’s look at the very simple example of a spoiled version of Tom Sawyer, who is given the task of painting Aunt Polly’s fence. We can imagine a conversation going like this:
- Tom, I would like you to paint this very long fence.
- Ehh, ok. But first I need some paint.
- Oh, I don’t have time to go to the shop. But, if you buy it yourself, and paint the fence honestly — I will check it — I’ll give you back the money and we can go eat some ice cream. Since I’ve painted this fence so many times, I know exactly how much paint you will need.
So, in this situation, Spoiled Tom sees an opportunity to earn some cash. If he paints only two-thirds of the fence and auntie doesn’t notice, then not only does he finish earlier, he also pockets one-third of the paint cost.
We can imagine the fence being very long, so checking every single slat of it would be a very tedious task. There is no way auntie would do that.
So, what should she do?
Spoiler alert: the surprising answer is that she only needs to check 2 slats, no matter how long the fence is.
The (somewhat informal) proof goes like this:
Let’s say the fence is 100 slats long and needs 100 units of paint, one unit per slat.
What happens if she only checks one single slat?
Well, let’s imagine ourselves in the situation of Tom. Obviously, he has to paint at least one slat honestly (since at least one will be checked).
So he painted the first slat honestly. What’s next?
If he cheats from now on, there is a 99/100 chance that he will be caught (and, in consequence, lose the 1/100 of the paint cost he has already invested) and a 1/100 chance that he will not be caught, in which case he earns 99/100 of the cost.
So, the expected reward in that case is 0.
We can see that in the other cases the calculation is the same. If he paints k slats honestly, his expected reward is: (k/100) · ((100 − k)/100) − ((100 − k)/100) · (k/100) = 0.
So, in the case of checking only one slat, the incentive to cheat exactly equals the incentive to be honest. Now we may add a really tiny weight to the honest side and break that balance. As you’ve probably already guessed, this small weight is checking an additional, second slat. Because it slightly increases the probability of being caught if Tom cheats (and correspondingly decreases the probability of getting away with it), it makes the expected value negative for every case where only a subset of the fence is painted honestly.
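The one-slat and two-slat claims above can be sanity-checked numerically. The sketch below uses the payoff model from the story (passing inspection earns the unspent paint money, getting caught forfeits the paint already invested); the function name and the exact normalization of payoffs are our own illustrative choices, not anything from Golem's implementation:

```python
from fractions import Fraction
from math import comb

def expected_cheat_reward(n, k, c):
    """Expected reward, as a fraction of the total paint cost, for painting
    k of n slats honestly when c randomly chosen slats are inspected.
    Passing inspection earns the unspent (n - k)/n of the paint money;
    getting caught forfeits the k/n already invested."""
    # Probability that all c inspected slats happen to be painted ones.
    p_pass = Fraction(comb(k, c), comb(n, c))
    gain = Fraction(n - k, n)   # paint money pocketed if not caught
    loss = Fraction(k, n)       # investment lost if caught
    return p_pass * gain - (1 - p_pass) * loss

n = 100
# One checked slat: the expected reward is exactly 0 for every partial job.
assert all(expected_cheat_reward(n, k, 1) == 0 for k in range(1, n))
# Two checked slats: strictly negative whenever any slat is skipped.
assert all(expected_cheat_reward(n, k, 2) < 0 for k in range(1, n))
```

With one inspection the expectation cancels to zero for every k, matching the formula in the text; adding the second inspection tips every partial-painting strategy into negative expected value, which is the whole point of the two-slat rule.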
Still, this was only some intuition, and if you are not convinced, a paper with the full proof will be available soon.¹
So, back to the story — what’s the moral? First, every task in Golem is divided into many subtasks, to be processed by the distributed network. One verification method we can employ is to compute a small number of these subtasks ourselves and compare the results with the output of the network (i.e., in the case of rendering, we render a small sample of pixels from the image and compare them with the network output)². The question we had to answer was: how many of these subtasks have to be verified? If the answer were something like "half of them," there would be little reason to even send the task to the network, because we would still need to compute a large part of it ourselves.
What we see here is that, with some assumptions about the "rationality" of the provider (of course, Spoiled Tom doesn’t want to lose his money!) and some other technical assumptions explicitly stated in the proof, the number of checked subtasks is not only constant with respect to the size of the task; the constant itself is very small. So we can dispatch a task with 10⁶ subtasks and check only 2 of them³.
If you have any comments or doubts, send them to us, to firstname.lastname@example.org
¹ It will be one of many research resources we are currently making publicly available, to make the company and the process of developing Golem even more transparent. More about it here.
² There is more to it: since rendering is partly a non-deterministic process, comparing two sets of pixels takes more effort than just asking whether they are exactly the same, but that is beyond the scope of this article.
³ Things get more complicated when we send a task to many providers. But it appears that this is not only not a problem but an even better situation. One solution is to use a so-called lottery mechanism, but, as before, that is a whole different topic.
Interesting research on the prevalence of adblock blockers: "Measuring and Disrupting Anti-Adblockers Using Differential Execution Analysis":
Abstract: Millions of people use adblockers to remove intrusive and malicious ads as well as protect themselves against tracking and pervasive surveillance. Online publishers consider adblockers a major threat to the ad-powered "free" Web. They have started to retaliate against adblockers by employing anti-adblockers which can detect and stop adblock users. To counter this retaliation, adblockers in turn try to detect and filter anti-adblocking scripts. This back and forth has prompted an escalating arms race between adblockers and anti-adblockers.
This is a quick post.
I once wrote that James Comey’s legacy was the destruction of the justice system and the creation of a two-tiered legal system: one for the Clintons and Democrats, and one for the rest of us. Bob Mueller continues that legacy.
The fates of people associated with the Trump and Clinton campaigns who were involved in alleged misdeeds could not be more starkly different, and the contrast makes painfully clear how Democrat fingers are leaning heavily on one side of the scales of justice.
Former TRUMP adviser Michael Flynn lies to the FBI; he is charged with making false statements.
HILLARY’s IT guy Paul Combetta, who destroyed 30,000 emails that were already under subpoena, lied to the FBI. He gets immunity.
TRUMP campaign adviser Paul Manafort failed to register as a foreign agent and then did so retroactively; he is charged under FARA.
Tony Podesta, CLINTON pal and brother of Hillary’s campaign manager John Podesta, also failed to register as a foreign agent and then registered retroactively (just like Manafort); he has not been charged with a thing.
These are just two examples. There are considerably more.
It is nothing less than obscene and it proves beyond any shadow of a doubt that both “investigations” are nothing but political in nature and justice is being doled out on that basis and that basis alone.
This has been shared around a lot but I was very impressed with dolphins following strategies of deferred gratification that some humans I know would be challenged by.
At the Institute for Marine Mammal Studies in Mississippi, Kelly the dolphin has built up quite a reputation. All the dolphins at the institute are trained to hold onto any litter that falls into their pools until they see a trainer, when they can trade the litter for fish. In this way, the dolphins help to keep their pools clean.
Kelly has taken this task one step further. When people drop paper into the water she hides it under a rock at the bottom of the pool. The next time a trainer passes, she goes down to the rock and tears off a piece of paper to give to the trainer. After a fish reward, she goes back down, tears off another piece of paper, gets another fish, and so on. This behaviour is interesting because it shows that Kelly has a sense of the future and delays gratification. She has realised that a big piece of paper gets the same reward as a small piece and so delivers only small pieces to keep the extra food coming. She has, in effect, trained the humans.
Her cunning has not stopped there. One day, when a gull flew into her pool, she grabbed it, waited for the trainers and then gave it to them. It was a large bird and so the trainers gave her lots of fish. This seemed to give Kelly a new idea. The next time she was fed, instead of eating the last fish, she took it to the bottom of the pool and hid it under the rock where she had been hiding the paper. When no trainers were present, she brought the fish to the surface and used it to lure the gulls, which she would catch to get even more fish. After mastering this lucrative strategy, she taught her calf, who taught other calves, and so gull-baiting has become a hot game among the dolphins.
No surprise. This is why I removed myself from Twitter several years ago.
zappl.com and steepshot.io are censorship-resistant alternatives that make use of the Steem blockchain. They also distribute cryptocurrency to reward users for contributing and evaluating content.
nChain’s Chief Scientist, Dr. Craig Wright, is known for his strong advocacy for Bitcoin (and in particular, Bitcoin Cash, which he believes represents the true spirit of Bitcoin). However, he doesn’t often speak about his own belief system and perspectives.
In this podcast with Bitcoin Will Come, Dr. Wright sits down with George Samuels to dive deeper into his early involvement with cryptocurrency and his outlook for 2018. Watch the video below.
Google managers kept blacklists of conservative employees — and one manager even considered holding 'trials', alleges a new lawsuit (GOOG, GOOGL)
- Employees at Google who express "conservative viewpoints in politically-charged debates" may find themselves blacklisted by managers at the company, alleges an explosive new lawsuit.
- And by blacklisted, that means their names may appear on actual lists, the suit contends.
- Google employees who identify as conservative say they have complained to HR and senior management about the blacklists.
- These allegations are part of a lawsuit filed on behalf of fired Google engineer James Damore that seeks to represent white males and conservatives who feel like they've been the target of discrimination.
A well-known Republican San Francisco lawyer has filed a lawsuit against Google seeking to represent white, male or conservative employees who believe the company has discriminated against them.
The lawyer is Harmeet Dhillon, a partner with the Dhillon Law Group in San Francisco and the former chairwoman of the Republican Party in San Francisco.
She has been on the hunt for such victims since she took on fired Google engineer James Damore as a client in August. And on Monday she presented the first fruits of her research in a 161-page complaint that's chock full of allegations and screenshots.
The most jaw-dropping allegation is that "Google publicly endorsed blacklists" of conservatives. The lawsuit claims that several hiring managers publicly vowed not to hire people categorized as "hostile voices" aka conservatives.
For instance, one manager wrote on one internal forum, "I will never, ever hire/transfer you onto my team. Ever."
Another manager wrote in another, "I keep a written blacklist of people whom I will never allow on or near my team, based on how they view and treat their coworkers. That blacklist got a little longer today."
The lawsuit cites another post from another hiring manager that said, "If you express a dunderheaded opinion about religion, about politics, or about ‘social justice’, it turns out I am allowed to think you’re a halfwit... I’m perfectly within my rights to mentally categorize you in my [d*ckhead] box... Yes, I maintain (mentally, and not (yet) publicly)."
Interestingly, the lawsuit doesn't show the statements that provoked such strong reactions from these managers. It only characterizes them as "tactfully expressed conservative viewpoints in politically-charged debates."
Whether expressing anti-diversity sentiments at a workplace is a protected "conservative viewpoint" or, rather, a form of bigotry that actually creates a hostile environment is at the heart of the case — and it reflects a broader debate gripping the country under the divisive presidency of Donald Trump.
"Something resembling a trial"
The lawsuit shows that in at least one case, a manager (a white woman) was contemplating keeping an actual, public list of employee names.
The manager wrote in an internal post, "I am thinking of something like a google doc that accepts comments, and which calls out those googlers that are unsupportive of diversity."
She wondered, in the post, whether special "trials" should be held for employees nominated for the list, to determine whether they belonged on it or not.
The lawsuit includes a screenshot of her post as evidence.
The lawsuit names other instances, too. It says that conservative employees reported such lists, as well as other attempts to block them and their comments on Google's internal social media sites, to HR, which told them that employees have the right to block others or to make statements about the type of employees they'd like to work with.
The lawsuit says that on two occasions in the fall of 2017, conservative employees also brought the matter of such lists to Paul Manwell, Google CEO Sundar Pichai’s chief of staff, and to senior lawyer Kent Walker.
This lawsuit was filed on behalf of Damore, the engineer who created a firestorm last summer with his memo about women in tech and his comments about how Google treats conservatives. It seeks class-action status to represent other white, male, or conservative employees who believe they faced discrimination at Google.
A Google spokesperson says the company is ready to fight, telling us: "We look forward to defending against Mr. Damore's lawsuit in court."
America’s top five inbound vs. top five outbound states: How do they compare on a variety of economic, business climate and political measures?
North American Moving Services just released its annual US Migration Report for 2017 based on household moves from one US state to another last year. In 2017, the top five inbound US states were: Arizona, No. 1 with 67% inbound moves vs. 33% outbound, followed by Idaho (63%-37% in-out), North Carolina (62-38%), South Carolina (62-38%) and Tennessee (58-42%). The top five outbound US states last year were No. 1 Illinois (32% inbound moves vs. 68% outbound), followed by Connecticut (38%-62% in-out), New Jersey (38-62%), California (40-60%), and Michigan (41-59%).
North American Moving Services’ interactive map includes annual state migration data back to 2011 and reveals some interesting historical patterns:
Illinois has been among the top five outbound states in each year from 2011 to 2017 and was No. 1 or No. 2 in each of those years except 2011 when it was No. 3. New Jersey has been in the top five outbound states in each of the last seven years from 2011 to 2017. Connecticut has been in the top five outbound states every year since 2011 except for 2012, and Michigan every year except 2013. Last year was the first time that California was among the top five outbound states.
For the top five inbound states, Arizona has been included every year except 2011, and it’s been No. 1 or No. 2 in each of the last four years. South Carolina has been in the top five inbound states in each of the last seven years, and North Carolina every year except two (2011 and 2015). Florida has been among the top five inbound states in most years but wasn’t in 2017; it was replaced in the top five by Tennessee, which made its debut last year.
Economic Questions: What significant differences, if any, are there between the top five inbound and top five outbound states when they are compared on a variety of measures of economic performance, business climate, business and individual taxes, fiscal health, labor market dynamism, etc.? Assuming that many Americans “move/vote with their feet” when they relocate from one state to another, is there any empirical evidence that Americans are moving away from states that are relatively more economically stagnant, with higher taxes, more regulation, and fewer economic and job opportunities, toward states that are relatively more economically vibrant, dynamic, and business-friendly, with lower tax and regulatory burdens and more economic and job opportunities?
The table above summarizes a comparison between the two groups of US states (top five inbound and outbound) on nine different measures of economic performance, labor market dynamism, business climate, tax climate and fiscal health for those ten states. And on each of those nine measures, it does appear that the top five inbound states are on average out-performing the top five outbound states, suggesting that migration patterns in the US do reflect Americans “voting/moving with their feet” from high-tax, business-unfriendly, economically stagnant states to lower-tax, business-friendly, economically vibrant states. Let’s review those nine measures, one-by-one:
1. Right-to-Work. All of the top five inbound states are Right-to-Work (RTW) states, and all of the top five outbound states except Michigan are forced-unionism states. According to many studies, like this one by my AEI colleague Jeff Eisenach (emphasis mine):
There is a large body of rigorous economic research on the effects of RTW laws on economic performance. Overall, that research suggests that RTW laws have a positive impact on economic growth, employment, investment, and innovation, both directly and indirectly.
Therefore, it would make sense that Americans are leaving forced unionism states for greater job opportunities in RTW states.
2. Taxes. The average top individual income tax rate in the top five inbound states is 5.9% compared to 7.7% in the top outbound states. Likewise, the average top corporate tax rate in the top five inbound states is 5.4% compared to 8.1% in the top five outbound states. It’s an ironclad law of economics that if you tax something you get less of it, and it’s therefore no surprise that Americans and businesses are leaving relatively high tax states for relatively low tax states.
3. Forbes Best States for Business. Based on its most recent annual state ranking, which measures six business categories – costs, labor supply, regulatory environment, current economic climate, growth prospects, and quality of life – Forbes rated North Carolina the best US state for business last year. Three of the other four top inbound states (Tennessee, South Carolina, and Idaho) ranked in the top half of the best US states for business, while Arizona ranked No. 33. All five top outbound states ranked in the bottom half, and four of them (Illinois, Connecticut, New Jersey and Michigan) ranked in the lowest one-third of US states for business.
4. Business Tax Climate Rankings. Every year The Tax Foundation creates its State Business Tax Climate Index based on each US state’s corporate income taxes, individual income taxes, sales taxes, property taxes and unemployment insurance taxes. For the most recent Tax Foundation rankings, all of the top five inbound states except North Carolina ranked in the top half (best) of states, and all of the top five outbound states ranked in the bottom (worst) half of the states except Michigan. New Jersey ranked as the worst US state (No. 50), California ranked No. 48 and Connecticut ranked No. 44.
Based on the last two categories (Forbes Best States for Business and The Tax Foundation’s Business Tax Climate Ranking), it’s perfectly understandable that low-tax, business-friendly states like North and South Carolina are experiencing net inflows of Americans and businesses, while high-tax, business-unfriendly states like New Jersey and California are losing populations.
5. State Fiscal Rankings. In an annual study, The Mercatus Center ranks each US state’s financial health based on short- and long-term debt and other key fiscal obligations, such as unfunded pensions and healthcare benefits. According to its most recent report, “The fiscal health of America’s states affects all its citizens. Indicators of fiscal health come in a variety of forms—from a state’s ability to attract businesses and how much it taxes to what services it provides and how well it keeps its promises to public-sector employees.” In its most recent 2017 report, The Mercatus Center ranked all of the top five inbound states except Arizona in the above average (Tennessee, Idaho and North Carolina) and average categories (South Carolina). In contrast, all five of the top outbound states ranked below average and in the bottom one-third of US states, and Illinois (No. 49) and New Jersey (No. 50) ranked at the very bottom.
6. Economic Performance. The last three categories above show economic performance measures for each of the ten states for: a) state GDP growth rate in the first half of 2017 (most recent data available), b) the state jobless rate in November 2017 and c) employment growth over the most recent one-year period through November 2017. For the top five inbound states, the average GDP growth rate is 1.6%, the average jobless rate is 3.7%, and the average job growth rate is 1.8%. In contrast, the figures for the five outbound states are 1.0%, 4.8%, and 0.8%. In other words, compared to the outbound states, output growth is about 50% higher in the inbound states on average (1.6% vs. 1.0%), the average jobless rate is more than one percentage point lower (3.7% vs. 4.8%) and employment growth is one percentage point higher (1.8% vs. 0.8%).
Those three important economic indicators suggest that the inbound states on average are stronger economically than the outbound states and have more robust labor markets with lower jobless rates and greater job creation.
Finally, although it’s not shown in the table above, it’s worth mentioning that all five of the inbound states have Republican-controlled state legislatures and all of the outbound states have Democratic-controlled state legislatures except for Michigan. Since the top inbound states are relatively low-tax, business-friendly, fiscally healthy, and high-growth states and the top outbound states are relatively high-tax, business-unfriendly, fiscally unhealthy and low-growth states, the difference in party control of the state legislatures is exactly what one might expect.
Bottom Line: The migration patterns of US households last year followed predictable patterns based on differences among states in economic growth, vitality and dynamism, labor market robustness, fiscal health, and party control of state legislatures. To answer the questions posed above, there are significant differences between the top five inbound and top five outbound states when they are compared on a variety of measures of economic performance, business climate, tax burdens for businesses and individuals, fiscal health, and labor market dynamism. There is empirical evidence that Americans do “vote with their feet” when they relocate from one state to another, and the evidence suggests that Americans are moving from states that are relatively more economically stagnant, Democratic-controlled fiscally unhealthy states with higher taxes, more regulations and with fewer economic and job opportunities to Republican-controlled, fiscally sound states that are relatively more economically vibrant, dynamic and business-friendly, with lower tax and regulatory burdens and more economic and job opportunities.
Who’d a-Thunk It? Americans vote with their feet because they value jobs, economic freedom and prosperity, entrepreneurship, lower taxes, and less government over the opposite?
We are in the midst of a revolution.
But it’s not a revolution in the sense most people think of. It’s not a war fought with guns and bombs.
It’s a battle of ideas.
Today’s revolution is a revolution in thought.
John Adams, founding father and second president of the United States, described the American revolution in much the same way. In his 1818 letter to Hezekiah Niles, he wrote:
“But what do we mean by the American Revolution? Do we mean the American war? The Revolution was effected before the war commenced. The Revolution was in the minds and hearts of the people; a change in their religious sentiments of their duties and obligations. … This radical change in the principles, opinions, sentiments, and affections of the people, was the real American Revolution.”
The Tenth Amendment Center was founded in 2006. At that time, nullification was a maligned idea relegated to the dustbin of history. In our early years, we counted the mere introduction of a non-binding resolution in support of the Tenth Amendment as a major success.
From those small seeds, a formidable nullification movement has grown up over the past 11-plus years. Last year alone, more than 450 bills to limit federal power in some way were introduced in state legislatures across the country.
From Small Things…
2017 marked the 250th anniversary of some of the most important essays in American history. Written by John Dickinson, the “Penman of the Revolution,” these 12 essays known as “Letters from a Farmer in Pennsylvania” were in response to the Townshend Acts.
In the first essay, Dickinson discussed the New York Restraining Act, which was the last of these British Acts. It was intended to punish the Assembly of New York, suspending its legislative powers for failing to fully comply with orders from the crown.
“If the parliament may lawfully deprive New York of any of her rights, it may deprive any, or all the other colonies of their rights; and nothing can possibly so much encourage such attempts, as a mutual inattention to the interests of each other. To divide, and thus to destroy, is the first political maxim in attacking those, who are powerful by their union.”
He continued on to say that, in essence, the rightful response at that moment would have been for other assemblies to have passed non-binding resolutions informing Parliament that the act was a violation of rights and that it should be repealed.
Why? His answer came through clearly in his signature, where he wrote the Latin phrase, Concordia res parvae crescunt.
“Small things grow great by concord.”
From its inauspicious beginnings, today’s nullification movement is growing into a revolutionary force.
TURNING RADICAL IDEALISM INTO WORKABLE STRATEGY
It’s one thing to have lofty goals. Achieving them takes more than strong rhetoric. You have to have a solid, actionable strategy.
In response to the unconstitutional Alien and Sedition Acts, Thomas Jefferson used the Kentucky and Virginia Resolutions of 1798 to lay out the principles of nullification. But the resolutions themselves did not nullify the Alien and Sedition Acts. Instead, Jefferson and Madison first created a framework for future action.
On November 17, 1798, one week after passage of the Kentucky Resolutions, Thomas Jefferson sent a draft to James Madison, along with a letter. He wrote:
I inclose you a copy of the draught of the Kentucky resolves. I think we should distinctly affirm all the important principles they contain, so as to hold to that ground in the future, and leave the matter in such a train as that we may not be committed absolutely to push the matter to extremities, & yet may be free to push as far as events will render prudent.
Jefferson and Madison stated their principles, justified their actions, and then left the door open to proceed with a practical strategy they could adapt as circumstances evolved.
At the TAC, we try to follow this blueprint. We always keep the ultimate goal in front of us, but we act strategically when and how specific situations allow. It’s a balancing act – a give and take – always keeping in mind that you don’t achieve radical change by abandoning radical principles.
William Lloyd Garrison took a similar tack in his battle against slavery in the U.S.
Garrison ranks as one of the greatest abolitionists in American history, and he understood this. He steadfastly stuck by his call for absolute and immediate emancipation of all slaves.
While it seems absurd to our 21st century sensibilities, total abolition of slavery was an idealistic, radical, extremist position in the mid-1800s. Principled abolitionists were generally reviled, even in the North. The broader abolitionist movement was dominated by pragmatists content with modest policy changes here and there. A lot of them were merely jockeying for political power. Garrison would have none of this. He believed slavery should end immediately, and he constantly said so. He wasn’t concerned about winning a popularity contest or convincing people he was properly mainstream. He unapologetically wore a badge of radicalism. He unwaveringly pursued the ideal.
But Garrison wasn’t just running around like a proverbial bull in a china shop. He had pragmatic reasons for maintaining his hard-core stance. He recognized that by pushing for the ultimate goal he was more likely to reach it.
“Urge immediate abolition as earnestly as we may, it will, alas! be gradual abolition in the end. We have never said that slavery would be overthrown by a single blow; that it ought to be, we shall always contend.”
Garrison understood that if he started by seeking half-measures, he would never end up with anything more than half-measures. He warned, “Gradualism in theory is perpetuity in practice.”
Economist and political philosopher Murray Rothbard put it this way in A Case for Radical Idealism:
“William Lloyd Garrison was not being ‘unrealistic’ when in the 1830s he first raised the glorious standard of immediate emancipation of the slaves. His goal was the morally proper one, and his strategic realism came in the fact that he did not expect his goal to be quickly reached …
Gradualism in theory indeed undercuts the goal itself by conceding that it must take second or third place to other non- or antilibertarian considerations. For a preference for gradualism implies that these other considerations are more important than liberty.”
At the TAC, we always keep the Constitution and liberty as our core objective. But we also recognize that it will take a series of small victories to reach our ultimate goal.
We’ll never abandon our radical idealism. But we will always work strategically, step-by-step, to achieve our objectives.
The 2017 State of the Nullification Movement Report tells the current story of our efforts. Read it in full here.
NOTE: Michael Boldin contributed to this post.
Here’s a full video of a 1979 appearance of Milton Friedman on Donahue.
Today, Friday, December 29, 2017, is the last federal workday of the year.
This presents an opportunity to round up all the rules and regulations produced by agencies in the calendar-year 2017 Federal Register.
The calendar year concluded with 61,950 pages in the Federal Register this morning. Here’s the cover.
This is the lowest count since 1993’s 61,166 pages. That was Bill Clinton’s first year, and his own lowest-ever count.
A year ago, Obama set the all-time Federal Register page record with 95,894 pages.
Trump’s Federal Register is a 35 percent drop from Obama’s record, set last year.
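The 35 percent figure follows directly from the two page counts cited above; a quick check:

```python
# Percent drop in Federal Register pages, 2017 vs. the 2016 record.
trump_2017 = 61_950   # pages in the 2017 Federal Register
obama_2016 = 95_894   # Obama's all-time record, set in 2016

drop = (obama_2016 - trump_2017) / obama_2016
print(f"{drop:.0%}")  # prints "35%"
```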
I'm not necessarily convinced. I'll wait for the market to sort things out. But there's no doubt that high transaction fees are a serious problem for Bitcoin (segwit) adoption.
With Christmas and other winter celebrations wrapping up 2017, merchants are seeing the best holiday shopping season in years. In particular, holiday e-commerce spending is surging, estimated to break through $100 billion in 2017 in the United States alone, a 14% increase from last year. As more consumer buying moves online, merchants naturally want fast, reliable, low-fee payment systems. Today, e-commerce relies heavily on card payment networks like VISA and Mastercard. But as virtual currency becomes the future of money, merchants should want a cryptocurrency that offers bigger blocks for more transactions, faster speeds, and lower transaction fees. What merchants really want for Christmas – and every day of the year – should be Bitcoin Cash.
The Bitcoin Scaling Debate
2017 was a breakout year for Bitcoin, the decentralized cryptocurrency which has soared in price over the past twelve months. But while the “legacy” Bitcoin (traded under the ticker symbol BTC) captured much media attention this year, a newer kid on the blockchain – Bitcoin Cash (ticker symbol BCH) – emerged in August 2017 via a “hard fork” of the original chain. Since then, it has quietly gained traction and I predict 2018 will be the year of Bitcoin Cash.
What’s the difference between these two coins and why should merchants care? Bitcoin was born through a 2008 white paper published under the pseudonym Satoshi Nakamoto. The paper’s title neatly summarizes the vision: Bitcoin: A Peer-to-Peer Electronic Cash System. Bitcoin was designed to be used as daily “electronic cash” by people worldwide. Its transformative concept was giving you the ability to send money or make payments directly to anyone in the world, without intermediary banks, payment processors or other trusted third parties in the middle. The transactions are recorded on a distributed ledger known as the blockchain, which is maintained by decentralized mining nodes rather than by a central authority. To achieve this vision, transactions on the Bitcoin blockchain need to be fast, low-fee, and instantly confirmed.
But due to internal debates within the Bitcoin community, the “legacy” Bitcoin (BTC) chain moved away from that direction. Its blocks remain capped at only 1MB in size. A new block is added to the blockchain every 10 minutes; with a 1MB data limit every 10 minutes, the legacy chain can process on average approximately 3 transactions per second (and at its maximum, 7 transactions per second). By comparison, the global VISA network averages 2,000 transactions per second, and can reach 56,000 transactions per second during peak periods (such as holiday shopping juggernauts Black Friday and Cyber Monday).
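The per-second figures above follow from simple arithmetic on block size and transaction size. A minimal sketch, assuming a typical transaction of roughly 500 bytes and a minimal one of roughly 250 bytes (sizes the article does not specify):

```python
# Back-of-the-envelope BTC throughput under the 1MB block cap.
BLOCK_BYTES = 1_000_000   # 1MB block size limit
BLOCK_INTERVAL_S = 600    # a new block roughly every 10 minutes

bytes_per_second = BLOCK_BYTES / BLOCK_INTERVAL_S   # ~1,667 bytes/s

# Assumed transaction sizes (not stated in the article):
# ~500 bytes is a common average; ~250 bytes is near the minimum
# for a simple payment.
avg_tps = bytes_per_second / 500   # ~3.3 tx/s, matching "approximately 3"
max_tps = bytes_per_second / 250   # ~6.7 tx/s, close to the quoted 7

print(round(avg_tps, 1), round(max_tps, 1))
```

Scaling linearly, an 8MB block under the same assumptions would support roughly eight times this throughput, in the same ballpark as the 24 transactions per second cited later for Bitcoin Cash.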
To increase the legacy Bitcoin network’s throughput capacity, an easy solution would be to simply increase the block size. There have been efforts to raise the block size to 2MB, but that modest increase hardly solves the problem. Other proposed solutions are controversial – such as a technical change called “Segregated Witness,” or SegWit, which splits signature data from transaction data to save space. (In fact, once SegWit was activated in August 2017, the legacy Bitcoin became more appropriately called the “SegWit” chain, and some cryptocurrency media sites – such as Bitcoin.com and CoinGeek.com – have begun to call it by that name.)
At the core of this debate is a philosophical divide on what the Bitcoin network should be. Should the Bitcoin blockchain be just a “settlement layer” – relying on “off chain” or “side chain” solutions to do the fast payment processing on different layers, and periodically settle payment transactions to the blockchain? Or should it fulfill Satoshi Nakamoto’s vision of Bitcoin as a single “electronic cash” system where all transactions are processed and which is used by people efficiently around the world? The Bitcoin community thus split into two camps: “small blockers” and “big blockers.” While their disagreements continued, the legacy network did not scale bigger and faster; it slowed down with backlogs of unconfirmed transactions which could take hours to confirm and cost high fees to process. What consumer would use Bitcoin to pay for a $5.00 Starbucks coffee if it cost $10.00 or even as high as $28.00 just to process the transaction? And why would merchants want to use such a slow and costly payment system?
The Rise of Bitcoin Cash
Weary of waiting for the legacy Bitcoin chain to scale bigger and faster, on August 1, 2017, proponents of big blocks and the original Satoshi vision forked off onto a new chain called Bitcoin Cash (BCH). Its name emphasizes the goal of Bitcoin acting as electronic cash which is used, sent and spent on a daily basis – a future both consumers and merchants should welcome.
To achieve that vision, Bitcoin Cash (BCH) embraces bigger blocks. Its chain began with an initial default block size of 8MB – far larger than the SegWit Bitcoin (BTC) chain’s current 1MB or 2MB size – which allows an average of about 24 transactions per second. That of course is not enough to rival VISA or Mastercard speed, so the Bitcoin Cash (BCH) network plans to scale to much bigger blocks and ultimately millions of transactions per second. Its developer groups have already announced their goal to increase the block size to 32MB and higher in the future.
Through a joint Gigablock Testnet Initiative by Bitcoin Unlimited and our company nChain, blocks of up to 1 gigabyte in size (1,000 megabytes!) have been successfully tested. That would enable throughput capacity of over 10,000 transactions per second, or 864 million transactions per day.
And if 1GB was not enough, 1 terabyte (1 million megabytes!) blocks could be viable. Joannes Vermorel, founder of Lokad (a quantitative supply chain technology provider in France), recently published “Terabyte Blocks for Bitcoin Cash,” explaining how 1 TB blocks are viable on the Bitcoin Cash (BCH) chain. A single terabyte block (added every 10 minutes) can contain about 4 billion Bitcoin transactions and provide capacity of 7 million transactions per second. Vermorel writes: “Assuming a worldwide population of 10 billion humans, terabyte blocks offer about 50 transactions per human per day (57 actually, but the extra numerical precision is not significant)” and “50 transactions per day per human appears sufficient to cover all human-driven activities.” The scale of such a network is mind-boggling; it would enable Bitcoin Cash (BCH) to power not just monetary transactions but machine-to-machine data transactions of many types. Such a network could power smart contracts, the Internet of Things, and even artificial intelligence. And as Vermorel explains, this is only possible on the Bitcoin Cash (BCH) network, not the SegWit Bitcoin (BTC) chain with its tiny 1MB blocks.
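Vermorel's terabyte figures can be checked with quick arithmetic. His 4 billion transactions per block implies an average transaction size of about 250 bytes (1 TB / 4 billion), an assumption made explicit in this sketch:

```python
# Sanity-checking the terabyte-block figures attributed to Vermorel.
BLOCK_INTERVAL_S = 600   # one block every 10 minutes
AVG_TX_BYTES = 250       # implied average transaction size (1 TB / 4 billion tx)
TERABYTE = 10**12        # 1 TB = 1 million megabytes

txs_per_block = TERABYTE // AVG_TX_BYTES       # 4,000,000,000 tx per block
tps = txs_per_block / BLOCK_INTERVAL_S         # ~6.7 million tx/s, i.e. "7 million"
tx_per_human_per_day = tps * 86_400 / 10**10   # ~57.6 for 10 billion humans

print(txs_per_block, round(tps), round(tx_per_human_per_day, 1))
```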
While the dream of a 1TB block remains in the future, there is no doubt that Bitcoin Cash (BCH) is headed to far bigger blocks. Its network will have faster speed, so that transactions confirm almost instantly rather than waiting in a backlog queue for hours. This yields low transaction fees, costing just cents per transaction. Moreover, the multiple developer groups for Bitcoin Cash (BCH) are coordinating on numerous other features for future protocol upgrades. This will ensure orderly growth of the network to become a global peer-to-peer electronic cash system.
Why Merchants Should Want Bitcoin Cash
The growth of Bitcoin Cash (BCH) comes just at the right time. Not that long ago, merchant adoption of Bitcoin was on the rise. But a JP Morgan report in 2017 found that retailer acceptance of Bitcoin has recently been declining. No doubt that was due to rising fees on the legacy chain. Just in December 2017, online gaming distributor Steam dropped Segwit Bitcoin (BTC) as a payment method due to increased transaction fees going from what was once $.20 to as high as $20.
Let’s examine the financial impact on leading e-commerce retailer Amazon. Over this past Black Friday shopping weekend in the U.S. (November 23 and 24, 2017), Amazon processed 12.78 million online transactions. On those dates, SegWit Bitcoin (BTC) transaction fees averaged $6, while Bitcoin Cash (BCH) transaction fees averaged only about $0.20. If Amazon accepted SegWit Bitcoin (BTC) for its 12.78 million transactions, fees would have totaled $76 million; with Bitcoin Cash (BCH), transaction fees would have been only $2.5 million. However, most Bitcoin wallets are not yet configured for the cheaper fee structure the Bitcoin Cash (BCH) chain offers; by default, they use the same cost amount – expressed in satoshis (one hundred-millionth of a Bitcoin, the smallest unit) per data byte – that would be necessary on the SegWit Bitcoin (BTC) chain to process a transaction. Once wallets are properly re-configured for the Bitcoin Cash (BCH) chain, transaction fees could be as low as $0.02 per transaction, which would reduce Amazon’s total fees for those 12.78 million transactions to only about $250,000!
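The fee comparison is straightforward multiplication of the article's transaction count by each per-transaction fee (the exact products come out slightly above the rounded totals quoted in the text):

```python
# Fee totals for 12.78 million Black Friday transactions at the
# per-transaction fees cited in the article.
transactions = 12_780_000

btc_total = transactions * 6.00   # SegWit BTC at ~$6/tx   -> $76,680,000
bch_total = transactions * 0.20   # BCH at ~$0.20/tx       -> $2,556,000
bch_tuned = transactions * 0.02   # BCH, tuned wallets     -> $255,600

print(btc_total, bch_total, bch_tuned)
```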
Given this sharp discrepancy in fee impact to merchants, it’s no coincidence that Bitpay, the leading Bitcoin payment solutions provider for merchants, recently decided to accept Bitcoin Cash (BCH) payments. In making this announcement, Bitpay explained that Bitcoin Cash (BCH) “allows for payments with significantly faster network confirmations and significantly lower miner fee costs.”
In a Bitcoin Cash (BCH) world with a massively-scaled network, merchants will benefit tremendously:
- Merchants will pay far lower transaction fees than on today’s payment card systems.
- Foreign currency exchange rates and costs will be minimized.
- With instant payment confirmations on the blockchain, chargebacks will create significantly less financial exposure to merchants.
- Auto-renewing payments can be automated on the blockchain with smart contracts. (Our company even has a patent-pending invention about that).
Those are the keys to moving eCommerce toward bCommerce (Bitcoin commerce).
That’s a future we at nChain Group – along with many others around the globe – build towards with our research, development, intellectual property, and investments. Indeed, there is much work to do before Bitcoin Cash (BCH) fulfills the Satoshi Nakamoto vision of a global peer-to-peer electronic cash system. We need a massively scaled blockchain, more user-friendly Bitcoin wallets, better merchant point-of-sale applications for accepting Bitcoin, improved security solutions, and reasonable regulatory frameworks. But when the building blocks come together, Bitcoin Cash (BCH) has the power to make the global money system far more efficient and change how the world operates. That is why merchants should want Bitcoin Cash (which, one day, I believe will be known just as Bitcoin) – for Christmas, and every other day of the year.
More reasons why Steem is awesome:
- Steem is an open source blockchain protocol that any developer can leverage to build social applications that reward users in cryptocurrency
- Reward splitting enables developers to immediately generate revenue by redirecting a percentage of the STEEM distributed through their application to themselves
- Real-time, zero-fee token transfers enable microtransactions, frictionless interaction, as well as lower cost products and services
- Steem is a distributed database capable of storing large quantities of content
- Steem leverages the wisdom-of-the-crowd to distribute cryptocurrency rewards in a fair and transparent manner
- High levels of usability and security through a hierarchical private key structure
- The ability for multiple people to control individual accounts (multi-sig authorities)
- A rewards pool filled by algorithmically generated tokens that incentivizes content creators without introducing friction or cognitive load
- User-chosen account names replace the difficult to remember (and hard to type) wallet addresses used in other blockchain protocols
- Built-in escrow payment support enables the exchange of goods and services with a trusted third party intermediary
Learn more

More information about the Steem blockchain and all of its features can be found in the [Steem bluepaper](https://steem.io/steem-bluepaper.pdf).
Taking Steem to the Next Level: Smart Media Tokens (SMTs)

In early 2018, the Steem blockchain will launch “Smart Media Tokens” (SMTs), which will allow the developers of content platforms and communities to launch their own customized Proof-of-Brain tokens, with real-time zero-fee transfers, on the Steem blockchain. Along with built-in smart contracts that enable fundraising opportunities such as ICOs and Founders Tokens, SMTs will allow anyone to instantly tokenize their platform and bootstrap the value of their coin by leveraging all the lessons learned from Steem and [steemit.com](https://www.steemit.com/). More details can be found in the [SMT Whitepaper](https://smt.steem.io/smt-whitepaper.pdf).
Where we are heading

The revolutionary proof-of-brain rewards system offered by the Steem blockchain and token is designed to make Steem the ultimate on-ramp into cryptocurrency for mainstream users. The performance and scalability of the blockchain are designed with widespread mass adoption of the currency and platform in mind. Combined with lightning-fast processing times and fee-less transactions, Steem is positioned to become one of the leading blockchain technologies used by people around the world.
Thank You

Achieving this milestone would never have been possible without the amazing Steem community, including but not limited to the witnesses, third-party developers, our amazing content creators, and users. Thank you for making this the most heavily utilized blockchain protocol in the world!

_Team Steemit_
Current Status

Significant progress is being made with [hardfork 20](https://steemit.com/steemit/@steemitblog/proposing-hardfork-0-20-0-velocity), including some additional changes that will make the Steem blockchain even better. There is no official release date yet, but a release candidate is expected to be ready by the end of the year.
Improving Curation Incentives
Changes to the 30-Minute Curation Window

As of now, Steem account holders (including bot accounts) are disincentivized by the Steem blockchain from voting on a post within the first 30 minutes. The earlier a vote is cast within that initial window, the smaller the curation reward the voting account receives. This rule was originally introduced to level the playing field between human curators and bots. It was successful, but with the rise of short-form content on the platform (content that can be read or viewed in under 30 minutes), the community and the witnesses have reached a consensus that the 30-minute rule takes curation rewards away from human voters who are actively consuming content and voting on material they like. For this reason, HF20 will reduce this window from 30 to 15 minutes.
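The early-voting penalty described above can be sketched as a simple linear ramp. This is an illustration, not the chain's actual C++ implementation: the 15-minute window and linear scaling are from the post, while the function name and the exact shape of the curve are my assumptions.

```python
def curation_multiplier(minutes_since_post: float, window_minutes: float = 15.0) -> float:
    """Fraction of its curation reward a vote keeps, assumed to grow
    linearly from 0 at post time to 1.0 at the end of the window
    (reduced from 30 to 15 minutes in HF20)."""
    if minutes_since_post >= window_minutes:
        return 1.0
    return minutes_since_post / window_minutes

# A vote cast 5 minutes in keeps one third of its curation reward:
assert curation_multiplier(5.0) == 5.0 / 15.0
# A vote cast after the window closes keeps the full reward:
assert curation_multiplier(20.0) == 1.0
```

Halving the window halves the time a human curator must wait before voting at full strength on short-form content.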
Eliminating self-voting rewards through curation

While the change to 15 minutes will level the playing field against bots, it doesn’t address the advantage self-voting gives accounts with respect to curation rewards. If authors vote for themselves right away, they receive their author rewards, 100% of the curation rewards from their own vote, plus a portion of the curation rewards from everyone who votes on the post after them. Any other curator voting at the same moment as the author would receive 0% of the curation rewards. This gives the author an unfair advantage over other curators, because the author can earn additional curation rewards through self-voting. To eliminate this advantage, the unused portion of the curation rewards will be returned to the rewards pool instead of being awarded to the author, thereby increasing the overall percentage of rewards paid to curators. This better serves the original mission of the curation rewards budget: to ensure that the Steem blockchain distributes rewards to the most valuable content.
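The redirection of the forfeited portion can be sketched as follows. The helper name and the linear forfeiture curve are my own illustrative assumptions; what the post states is only the destination change: pre-HF20 the unused portion went to the author, post-HF20 it returns to the rewards pool.

```python
def split_curation_reward(reward: float, minutes_since_post: float,
                          window_minutes: float = 15.0,
                          hf20: bool = True):
    """Split one vote's curation reward into the part the curator keeps
    and the forfeited remainder, and name where the remainder goes.
    Hypothetical helper illustrating the HF20 change."""
    frac = min(minutes_since_post / window_minutes, 1.0)
    kept = reward * frac
    forfeited = reward - kept
    destination = "rewards_pool" if hf20 else "author"
    return kept, forfeited, destination

# An instant self-vote forfeits its whole curation reward to the pool under HF20:
assert split_curation_reward(10.0, 0.0) == (0.0, 10.0, "rewards_pool")
# The same vote before HF20 handed that forfeited portion back to the author:
assert split_curation_reward(10.0, 0.0, hf20=False)[2] == "author"
```

Because the forfeited share no longer flows back to the author, instant self-voting stops being a way to capture extra curation rewards.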
“Dust” vote changes
Removal of vote dust threshold

The "vote dust threshold" is a rule that prevents extremely weak votes. Currently, an account must hold about 1 SP for a 100% Voting Power vote to be successfully posted to the blockchain; any vote below the required threshold is rejected. This can create a bad user experience for new users, as their votes can fail for seemingly no reason. In hardfork 20, this “vote dust threshold” will be removed. After this change, users with any amount of SP will be able to cast votes as long as they have sufficient bandwidth. Votes below the threshold will be posted to the blockchain but will have no impact on rewards. This gives users a better experience on all Steem-based applications by letting them vote whenever they want (as long as they don’t exceed their generous bandwidth allocation), without adding to the computational load on the blockchain by requiring it to calculate the impact of effectively powerless votes on the rewards pool. It also mitigates an attack vector: if a malicious actor wanted to overburden the Steem blockchain, making countless small votes would no longer be an effective strategy.
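The before/after behavior can be sketched in a few lines. The threshold constant below is illustrative only (the post says "about 1 SP", not an exact rshares value), and the function is a hypothetical model, not the chain's API.

```python
DUST_THRESHOLD_RSHARES = 50_000_000  # illustrative constant, not the real chain value

def process_vote(rshares: int, hf20: bool = True):
    """Model of the dust-threshold change: pre-HF20, sub-threshold votes
    were rejected outright; post-HF20 they are accepted and recorded,
    but contribute nothing to reward calculations."""
    if rshares >= DUST_THRESHOLD_RSHARES:
        return ("accepted", rshares)   # counted toward rewards as before
    if hf20:
        return ("accepted", 0)         # recorded on chain, zero reward impact
    return ("rejected", 0)             # old behavior: the transaction fails

# A tiny vote no longer fails, it just carries no reward weight:
assert process_vote(10, hf20=False) == ("rejected", 0)
assert process_vote(10, hf20=True) == ("accepted", 0)
```

Zeroing the reward impact (rather than computing it) is what keeps the dust votes from adding computational load to the rewards pool.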
Application of shift to all votes

The switch to linear rewards in hardfork 19 has had a very positive effect on user experience. HF19 ensured that the impact of each user’s vote on the rewards pool is directly proportional to their Steem Power (i.e. their stake in the platform). Users now feel more empowered, and they can see the direct correlation between the amount of Steem Power they hold and the strength of their vote. However, the switch to a linear rewards curve also weakened the disincentive against casting lots of inconsequential votes (“vote spamming”). While it is important for users to be able to earn rewards whenever stakeholders (even small ones) find value in their content, it is also important to disincentivize rewarding content in which no other stakeholder sees value. After discussion with the witnesses, it was decided to apply the “vote dust” shift to all votes equally. Every vote cast will be shifted down by about 1.219 SP. This effectively establishes a “baseline” voting strength that applies to everyone, while still maintaining a linear rewards curve for votes above the baseline. This way, even large Steem Power holders won’t be able to profit from casting countless inconsequential votes.
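The shifted-linear curve can be sketched as follows. The ~1.219 SP shift is from the post; expressing vote strength directly in SP (rather than the chain's internal rshares units) and the function name are my simplifying assumptions.

```python
def effective_vote_strength(vote_sp: float, shift_sp: float = 1.219) -> float:
    """HF20's shifted-linear curve: every vote's strength is reduced by
    ~1.219 SP, floored at zero, so votes below the baseline count for
    nothing while votes above it remain on a linear curve."""
    return max(vote_sp - shift_sp, 0.0)

# Ten 1-SP votes contribute nothing to the rewards pool...
assert sum(effective_vote_strength(1.0) for _ in range(10)) == 0.0
# ...while a single 10-SP vote keeps its strength above the baseline:
assert effective_vote_strength(10.0) == 10.0 - 1.219
```

This is why the shift deters vote spamming: spreading a large stake across many tiny votes pays the 1.219 SP baseline cost on each one, while a few strong votes pay it only once each.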
Proof-of-Work account mining via softfork

The changes required to support PoW mining for discounted accounts will be included in hardfork 20, but the actual PoW mining will be added later as a softfork on top of HF20.
Removal of Power Down Restriction

To prevent faucet abuse, accounts previously could not power down unless they held at least 10 times the account creation fee in Steem Power. Because account creation fees will be burned starting with HF20, there will be less financial incentive to abuse faucets. After HF20, Steem account holders will have the freedom to power down their SP regardless of their account balance.
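The rule change above amounts to dropping one balance check. A minimal sketch, where the 10× multiplier is from the post and the helper name and parameters are hypothetical:

```python
def can_power_down(sp_balance: float, account_creation_fee: float,
                   hf20: bool = True) -> bool:
    """Pre-HF20, an account needed at least 10x the account creation fee
    in Steem Power before powering down; HF20 removes that floor."""
    if hf20:
        return True
    return sp_balance >= 10 * account_creation_fee

# A small account (1 SP, with a 3 STEEM creation fee) could not power down before HF20:
assert not can_power_down(1.0, 3.0, hf20=False)
assert can_power_down(1.0, 3.0, hf20=True)
```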
Update to witness price feed format

This change only affects the witnesses. A small change will require the base to be SBD and the quote to be STEEM. Currently, both orderings are allowed, which can lead to undefined behavior in the price feed. Most witnesses already supply their price feeds in this format.
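The required ordering can be expressed as a simple check. This validator is hypothetical (the asset-string format shown is an assumption about how feeds are typically published, not a quote from the HF20 spec); only the base = SBD, quote = STEEM rule comes from the post.

```python
def is_valid_feed_order(base: str, quote: str) -> bool:
    """HF20 requires witness price feeds with SBD as the base and STEEM
    as the quote, e.g. base='1.000 SBD', quote='2.500 STEEM'.
    Hypothetical validator for illustration."""
    return base.endswith(" SBD") and quote.endswith(" STEEM")

# The required ordering passes; the reversed ordering does not:
assert is_valid_feed_order("1.000 SBD", "0.400 STEEM")
assert not is_valid_feed_order("0.400 STEEM", "1.000 SBD")
```

Fixing one canonical ordering means consumers of the feed never have to guess which asset is the base, which is what eliminates the undefined behavior.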
Additional Details

The full list of changes, as well as additional details not covered here, can be found in the [Steem 0.20.0](https://github.com/steemit/steem/projects/3) GitHub project. More details can also be found in the original @steemitblog proposal: [Proposing Hardfork 0.20.0 “Velocity”](https://steemit.com/steemit/@steemitblog/proposing-hardfork-0-20-0-velocity).
Conclusion

The changes coming in the Velocity hardfork are very exciting because they establish the foundation that will enable Steemit.com and apps built on top of Steem to onboard millions of new users while remaining economically scalable. With these changes and the additional improvements described above, the Steem platform will be better than ever!

Steem on,

_Team Steemit_